US20210229686A1 - Automated Performance Checks For Autonomous Vehicles - Google Patents

Info

Publication number
US20210229686A1
Authority
US
United States
Prior art keywords
vehicle
map data
data
computing device
traffic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/186,249
Inventor
Colin Braley
Volker Grabe
Current Assignee
Waymo LLC
Original Assignee
Waymo LLC
Priority date
Filing date
Publication date
Application filed by Waymo LLC filed Critical Waymo LLC
Priority to US17/186,249 priority Critical patent/US20210229686A1/en
Assigned to WAYMO LLC reassignment WAYMO LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BRALEY, COLIN, GRABE, Volker
Publication of US20210229686A1 publication Critical patent/US20210229686A1/en
Abandoned legal-status Critical Current

Classifications

    • B60W60/00186 — Planning or execution of driving tasks specially adapted for safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions related to the vehicle
    • B60W40/02 — Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W30/10 — Path keeping
    • B60W30/14 — Adaptive cruise control
    • B60W50/0205 — Diagnosing or detecting failures; failure detection models
    • B60W50/029 — Adapting to failures or working around with other constraints, e.g. circumvention by avoiding use of failed parts
    • G01C21/3407 — Route searching; route guidance specially adapted for specific applications
    • G01M17/06 — Testing of vehicles: steering behaviour; rolling behaviour
    • G05D1/0061 — Control with safety arrangements for transition from automatic pilot to manual pilot and vice versa
    • G05D1/0077 — Control with safety arrangements using redundant signals or controls
    • B60W2050/0026 — Lookup tables or parameter maps
    • B60W2050/021 — Means for detecting failure or malfunction
    • B60W2050/0215 — Sensor drifts or sensor failures
    • B60W2050/0297 — Control giving priority to different actuators or systems
    • B60W2420/00 — Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2554/20 — Input parameters relating to objects: static objects
    • B60W2556/10 — Input parameters relating to data: historical data
    • B60W2556/40 — Input parameters relating to data: high definition maps

Definitions

  • Autonomous vehicles, such as vehicles which do not require a human driver when operating in an autonomous driving mode, may be used to aid in the transport of passengers or items from one location to another.
  • An important component of an autonomous vehicle is the perception system, which allows the vehicle to perceive and interpret its surroundings using cameras, radar, sensors, and other similar devices.
  • The perception system executes numerous tasks while the autonomous vehicle is in motion, which ultimately lead to decisions, such as speeding up, slowing down, stopping, turning, etc.
  • The perception system may include a plurality of detection systems, such as cameras, sensors, and global positioning devices, which gather and interpret images and sensor data about the vehicle's surrounding environment, e.g., parked cars, trees, buildings, etc.
  • Aspects of the disclosure provide for a system comprising one or more computing devices configured to identify a plurality of performance checks including a first check for a detection system of a plurality of detection systems of the vehicle and a second check for map data; select a plurality of road segments based on a location of the vehicle and the plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks; determine a test route for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments; control the vehicle along the test route in an autonomous driving mode; while controlling the vehicle, receive sensor data from the plurality of detection systems of the vehicle; perform the plurality of performance checks based on the received sensor data; select an operation mode from a plurality of operation modes for the vehicle based on results of the plurality of performance checks; and operate the vehicle in the selected operation mode.
  • the plurality of road segments may include a first road segment, wherein one or more of the plurality of performance checks may be performed using one or more traffic features or stationary objects that are detectable along the first road segment.
  • the plurality of road segments may include a second road segment on which a maneuver required for one or more of the plurality of performance checks can be performed.
  • the first check may include comparing characteristics of a detected traffic feature with previously detected characteristics of the traffic feature.
  • the second check may include comparing a location of a detected traffic feature with a location of the detected traffic feature in the map data.
  • the plurality of performance checks may further include a third check for a component of the vehicle, the third check may include comparing one or more measurements related to the component of the vehicle with a threshold measurement.
  • the operation mode may be selected based on the results satisfying a threshold number of the plurality of performance checks.
  • the operation mode may be selected based on the results satisfying one or more set of performance checks of the plurality of performance checks.
  • the one or more computing devices may be further configured to determine one or more corrections to at least one of the detection systems based on the results of the plurality of performance checks. Operating in the selected operation mode may include using the one or more corrections.
  • the one or more computing devices may be further configured to update the map data based on the results of the plurality of performance checks. Operating in the selected operation mode may include using the updated map data.
  • the selected operation mode may be an inactive mode.
  • the system may further comprise the vehicle.
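The claimed sequence above (identify checks, select road segments for them, connect the segments into a test route starting from the vehicle's location, then run the checks) can be sketched as follows. This is a minimal illustration, not the patent's implementation; all class, function, and segment names are invented:

```python
from dataclasses import dataclass

@dataclass
class PerformanceCheck:
    name: str            # e.g. "lidar_accuracy" or "map_consistency" (hypothetical)
    segment_needed: str  # kind of road segment on which this check can run

def build_test_route(vehicle_location, checks, segments):
    """Keep only segments some check needs, then connect them into a
    route that begins at the vehicle's current location."""
    needed = {c.segment_needed for c in checks}
    chosen = [s for s in segments if s in needed]
    return [vehicle_location] + chosen

# Hypothetical example: two checks, three candidate segments.
checks = [PerformanceCheck("lidar_accuracy", "segment_with_stop_sign"),
          PerformanceCheck("map_consistency", "segment_with_crosswalk")]
segments = ["segment_with_stop_sign", "segment_with_crosswalk", "highway_ramp"]
route = build_test_route("current_location", checks, segments)
# The highway ramp is skipped: no performance check requires it.
```

A real planner would also order the segments to minimize driving distance; this sketch only shows the selection-and-connection idea.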
  • The disclosure further provides for identifying, by one or more computing devices, a plurality of performance checks including a first check for a detection system of a plurality of detection systems of the vehicle and a second check for map data; selecting, by the one or more computing devices, a plurality of road segments based on a location of the vehicle and the plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks; determining, by the one or more computing devices, a test route for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments; controlling, by the one or more computing devices, the vehicle along the test route in an autonomous driving mode; while controlling the vehicle, receiving, by the one or more computing devices, sensor data from the plurality of detection systems of the vehicle; performing, by the one or more computing devices, the plurality of performance checks based on the received sensor data; selecting, by the one or more computing devices, an operation mode from a plurality of operation modes for the vehicle based on results of the plurality of performance checks; and operating, by the one or more computing devices, the vehicle in the selected operation mode.
  • the plurality of road segments may include a first road segment, wherein one or more of the plurality of performance checks may be performed using one or more traffic features or stationary objects that are detectable along the first road segment.
  • the plurality of road segments may include a second road segment on which a maneuver required for one or more of the plurality of performance checks can be performed.
  • the method may further comprise determining, by the one or more computing devices, one or more corrections to at least one of the detection systems based on the results of the plurality of performance checks, wherein operating in the selected operation mode may include using the one or more corrections.
  • the method may further comprise updating, by the one or more computing devices, the map data based on the results of the plurality of performance checks, wherein operating in the selected operation mode may include using the updated map data.
  • the plurality of performance checks may be performed at a regular interval.
  • FIG. 1 is a functional diagram of an example vehicle in accordance with aspects of the disclosure.
  • FIG. 2 is an example representation of map data in accordance with aspects of the disclosure.
  • FIG. 3 is an example external view of a vehicle in accordance with aspects of the disclosure.
  • FIG. 4 is an example pictorial diagram of a system in accordance with aspects of the disclosure.
  • FIG. 5 is an example functional diagram of a system in accordance with aspects of the disclosure.
  • FIG. 6 is an example situation in accordance with aspects of the disclosure.
  • FIG. 7 shows examples of collected sensor data in accordance with aspects of the disclosure.
  • FIG. 8 shows examples of collected component data in accordance with aspects of the disclosure.
  • FIG. 9 shows another example situation in accordance with aspects of the disclosure.
  • FIG. 10 is an example flow diagram in accordance with aspects of the disclosure.
  • the technology relates to performance checks for a vehicle to be performed after a full calibration, prior to operation, or at regular intervals.
  • a human driver may check various systems and components of the vehicle, such as making sure that the mirrors are adjusted, that the GPS system is functioning, and that components such as the steering wheel, brakes, and signal lights are responsive.
  • various systems of an autonomous vehicle also need to be checked before operation, particularly when the vehicle is to be operated in an autonomous mode, where a human driver may not be present to notice problems with the vehicle's systems. For instance, even if the vehicle had been fully calibrated in the past, a sensor in a perception system of the vehicle might have been moved during previous operation, such as by another road user or a cleaner, or might have been damaged by environmental factors such as temperature, humidity, etc.
  • a plurality of performance checks may be performed on the vehicle including, for example, a sensor check, a map check, and/or a component check.
  • the sensor check may include determining a level of function of a given sensor or detection system, such as detection accuracy, detection resolution, field of view, etc.
  • the map check may include determining an accuracy of the map data in relation to a given geographic area.
  • the component check may include determining a level of function of a given component, such as tire pressure, tire alignment, etc.
  • the results of the plurality of performance checks may be used to determine what functions of the vehicle are within set guidelines, such as for safety and comfort. The results may also be used to designate or clear the vehicle for particular modes of operation.
  • one or more computing devices may determine a test route based on the location of the vehicle, the map data, and the plurality of performance checks for the plurality of systems of the vehicle.
  • the test route need not include a designated depot or testing center, or be a closed route.
  • the vehicle's computing devices may navigate the vehicle along the test route using the one or more components and collect data using the plurality of detection systems. Collecting the data may include using a detection system of the plurality of detection systems to detect one or more traffic features or stationary objects along the test route. In addition, collecting the data may include detecting one or more measurements related to a component of the vehicle.
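Collecting both detection-system observations and component measurements along the route might be organized as in the following sketch; the record structure, sensor names, and readings are assumptions made for illustration:

```python
def collect_test_data(detections, component_readings):
    """Group raw observations by source so that each performance check
    can later pull only the data it needs (illustrative structure)."""
    data = {"detections": [], "components": {}}
    for sensor, feature in detections:
        data["detections"].append({"sensor": sensor, "feature": feature})
    data["components"].update(component_readings)
    return data

# Hypothetical observations gathered while driving the test route.
record = collect_test_data(
    detections=[("camera_front", "stop_sign"), ("lidar_top", "crosswalk")],
    component_readings={"tire_pressure_psi": 34.5},
)
```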
  • the vehicle's computing devices may perform the plurality of performance checks by analyzing collected data.
  • characteristics of a detected traffic feature, such as location, orientation, shape, color, reflectivity, etc., may be compared with previously detected characteristics of that traffic feature.
  • a location or orientation of a detected traffic feature may be compared with the location or orientation of a previously detected or stored traffic feature in map data of the vehicle.
  • the one or more measurements related to a component of the vehicle may be compared with a threshold measurement.
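The three comparisons described above — detected characteristics versus previously detected ones (sensor check), a detected location versus the map location (map check), and a component measurement versus a threshold (component check) — can be sketched as simple predicates. The tolerances and units below are assumptions, not values from the patent:

```python
import math

def sensor_check(detected_color, stored_color):
    # Sensor check: a feature previously detected as red should still read as red.
    return detected_color == stored_color

def map_check(detected_xy, map_xy, tolerance_m=0.5):
    # Map check: the detected feature should lie near its mapped position.
    return math.dist(detected_xy, map_xy) <= tolerance_m

def component_check(measurement, threshold, higher_is_ok=True):
    # Component check: e.g. tire pressure at or above an assumed minimum psi.
    return measurement >= threshold if higher_is_ok else measurement <= threshold

results = {
    "sensor": sensor_check("red", "red"),
    "map": map_check((10.0, 5.2), (10.0, 5.0)),   # 0.2 m offset, within tolerance
    "component": component_check(34.5, 30.0),
}
```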
  • the vehicle's computing devices may select an operation mode for operating the vehicle.
  • Operation modes may include, for example, task designations (passenger or non-passenger tasks), or limits on speeds, distance, or geographic area. Operation modes may also include an inactive mode, for example if the vehicle is not cleared for any other mode. In some implementations, modes may be selected for a plurality of vehicles by a remote system, such as a fleet management system. The vehicle may then be operated by the vehicle's computing devices in a particular mode based on the plurality of performance checks.
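Selecting an operation mode from the check results, as described above, could be as simple as thresholding the number of passed checks. The mode names echo the examples in this disclosure, but the cutoffs are invented for illustration:

```python
def select_operation_mode(results):
    """Map performance-check results to an operation mode: a vehicle
    passing every check is cleared for passenger tasks, a near-pass gets
    a restricted mode, and anything worse leaves the vehicle inactive."""
    passed = sum(1 for ok in results.values() if ok)
    total = len(results)
    if passed == total:
        return "passenger_tasks"
    if passed >= total - 1:
        return "non_passenger_tasks"   # e.g. limited speed, distance, or area
    return "inactive"

# Hypothetical result set: the component check failed.
mode = select_operation_mode({"sensor": True, "map": True, "component": False})
```

A fleet management system could apply the same rule across many vehicles, as the disclosure suggests, possibly with different thresholds per mode.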
  • the features described above may allow autonomous vehicles to be quickly and properly prepared for operation. Quicker preparation means vehicles may be sent to users in a more timely fashion, even as demand fluctuates. As a result, users of autonomous vehicles may be able to be picked up in a timely manner. In addition, fewer resources, such as fuel, need be used in the preparation of the autonomous vehicle for service, which may reduce overall costs.
  • the features also allow for management of an entire fleet of autonomous vehicles designated for a plurality of modes that may service users more efficiently and safely.
  • a vehicle 100 in accordance with one aspect of the disclosure includes various components. While certain aspects of the disclosure are particularly useful in connection with specific types of vehicles, the vehicle may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, busses, recreational vehicles, etc.
  • the vehicle may have one or more computing devices, such as computing device 110 containing one or more processors 120 , memory 130 and other components typically present in general purpose computing devices.
  • the memory 130 stores information accessible by the one or more processors 120 , including instructions 132 and data 134 that may be executed or otherwise used by the processor 120 .
  • the memory 130 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories.
  • Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
  • the instructions 132 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor.
  • the instructions may be stored as computing device code on the computing device-readable medium.
  • the terms “instructions” and “programs” may be used interchangeably herein.
  • the instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
  • the data 134 may be retrieved, stored or modified by processor 120 in accordance with the instructions 132 .
  • data 134 of memory 130 may store predefined scenarios.
  • a given scenario may identify a set of scenario requirements including a type of object, a range of locations of the object relative to the vehicle, as well as other factors such as whether the autonomous vehicle is able to maneuver around the object, whether the object is using a turn signal, the condition of a traffic light relevant to the current location of the object, whether the object is approaching a stop sign, etc.
  • the requirements may include discrete values, such as “right turn signal is on” or “in a right turn only lane”, or ranges of values such as “having a heading that is oriented at an angle that is 30 to 60 degrees offset from a current path of vehicle 100 .”
  • the predetermined scenarios may include similar information for multiple objects.
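The scenario requirements described above mix discrete values ("right turn signal is on") with ranges ("heading offset 30 to 60 degrees"). A matcher for such requirements might look like this sketch, where the requirement encoding (plain value vs. `(low, high)` tuple) is an assumption:

```python
def scenario_matches(observed, requirements):
    """Check an observed object against scenario requirements, where each
    requirement is either a discrete value or a (low, high) range."""
    for key, required in requirements.items():
        value = observed.get(key)
        if isinstance(required, tuple):           # range requirement
            low, high = required
            if value is None or not (low <= value <= high):
                return False
        elif value != required:                   # discrete requirement
            return False
    return True

# Hypothetical scenario: turning vehicle with signal on, heading 30-60 deg off.
scenario = {"right_turn_signal": True, "heading_offset_deg": (30, 60)}
observed = {"right_turn_signal": True, "heading_offset_deg": 45}
```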
  • the one or more processors 120 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, memory, and other elements of computing device 110 as being within the same block, it will be understood by those of ordinary skill in the art that the processor, computing device, or memory may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing.
  • internal electronic display 152 may be controlled by a dedicated computing device having its own processor or central processing unit (CPU), memory, etc. which may interface with the computing device 110 via a high-bandwidth or other network connection.
  • this computing device may be a user interface computing device which can communicate with a user's client device.
  • the memory may be a hard drive or other storage media located in a housing different from that of computing device 110 . Accordingly, references to a processor or computing device will be understood to include references to a collection of processors or computing devices or memories that may or may not operate in parallel.
  • Computing device 110 may have all of the components normally used in connection with a computing device such as the processor and memory described above as well as a user input 150 (e.g., a mouse, keyboard, touch screen and/or microphone) and various electronic displays (e.g., a monitor having a screen or any other electrical device that is operable to display information).
  • the vehicle includes an internal electronic display 152 as well as one or more speakers 154 to provide information or audio visual experiences.
  • internal electronic display 152 may be located within a cabin of vehicle 100 and may be used by computing device 110 to provide information to passengers within the vehicle 100 .
  • the one or more speakers 154 may include external speakers that are arranged at various locations on the vehicle in order to provide audible notifications to objects external to the vehicle 100 .
  • computing device 110 may be an autonomous driving computing system incorporated into vehicle 100 .
  • the autonomous driving computing system may be capable of communicating with various components of the vehicle.
  • computing device 110 may be in communication with various systems of vehicle 100 , such as deceleration system 160 (for controlling braking of the vehicle), acceleration system 162 (for controlling acceleration of the vehicle), steering system 164 (for controlling the orientation of the wheels and direction of the vehicle), signaling system 166 (for controlling turn signals), navigation system 168 (for navigating the vehicle to a location or around objects), positioning system 170 (for determining the position of the vehicle), perception system 172 (for detecting objects in the vehicle's environment), and power system 174 (for example, a battery and/or gas or diesel powered engine) in order to control the movement, speed, etc.
  • the computing device 110 may control the direction and speed of the vehicle by controlling various components.
  • computing device 110 may navigate the vehicle to a drop-off location completely autonomously using data from the map data and navigation system 168 .
  • Computing devices 110 may use the positioning system 170 to determine the vehicle's location and perception system 172 to detect and respond to objects when needed to reach the location safely.
  • computing devices 110 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 162 ), decelerate (e.g., by decreasing the fuel supplied to the engine, changing gears, and/or by applying brakes by deceleration system 160 ), change direction (e.g., by turning the front or rear wheels of vehicle 100 by steering system 164 ), and signal such changes (e.g., by lighting turn signals of signaling system 166 ).
  • the acceleration system 162 and deceleration system 160 may be a part of a drivetrain that includes various components between an engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, computing devices 110 may also control the drivetrain of the vehicle in order to maneuver the vehicle autonomously.
  • computing device 110 may interact with deceleration system 160 and acceleration system 162 in order to control the speed of the vehicle.
  • steering system 164 may be used by computing device 110 in order to control the direction of vehicle 100 .
  • the steering system may include components to control the angle of wheels to turn the vehicle.
  • Signaling system 166 may be used by computing device 110 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.
  • Navigation system 168 may be used by computing device 110 in order to determine and follow a route to a location.
  • the navigation system 168 and/or data 134 may store map data, e.g., highly detailed maps that computing devices 110 can use to navigate or control the vehicle.
  • these maps may identify the shape and elevation of roadways, lane markers, intersections, crosswalks, speed limits, traffic signal lights, buildings, signs, real time or historical traffic information, vegetation, or other such objects and information.
  • the lane markers may include features such as solid or broken double or single lane lines, solid or broken lane lines, reflectors, etc.
  • a given lane may be associated with left and right lane lines or other lane markers that define the boundary of the lane.
  • the map data may store known traffic or congestion information and/or transit schedules (train, bus, etc.) from a particular pickup location at similar times in the past. This information may even be updated in real time by information received by the computing devices 110 .
  • FIG. 2 is an example of map data 200 .
  • the map data 200 includes the shape, location, and other characteristics of road 210 , road 220 , road 230 , road 240 , and road 250 .
  • Map data 200 may include lane markers or lane lines, such as lane line 211 for road 210 .
  • the lane lines may also define various lanes, for example lane line 211 defines lanes 212 , 214 of road 210 .
  • lanes may also be inferred by the width of a road, such as for roads 220 , 230 , 240 , 250 .
  • the map data 200 may also include information that identifies the direction of traffic and speed limits for each lane as well as information that allows the computing devices 110 to determine whether the vehicle has the right of way to complete a particular type of maneuver (i.e. complete a turn, cross a lane of traffic or intersection, etc.).
  • Map data 200 may also include relationship information between roads 210 , 220 , 230 , 240 , and 250 .
  • map data 200 may indicate that road 210 intersects road 220 at intersection 219 , that road 220 intersects road 230 at intersection 229 , that roads 230 , 240 , and 250 intersect at intersection 239 , and that road 250 intersects road 210 at intersection 259 .
  • Map data 200 may further include signs and markings on the roads with various characteristics and different semantic meanings. As shown, map data 200 includes traffic light 216 for road 210 and pedestrian crossing 218 across road 210 . Map data 200 also includes stop sign 260 . The map data 200 may additionally include other features such as curbs, waterways, vegetation, etc.
  • map data 200 may include various buildings or structures (such as points of interests) and the type of these buildings or structures. As shown, map data 200 depicts building 270 on road 210 .
  • map data 200 may include that the type of the building 270 is an airport, train station, stadium, school, church, hospital, apartment building, house, etc.
  • the type of the building 270 may be collected from administrative records, such as county records, or manually labeled by a human operator after reviewing aerial images.
  • Map data 200 may include additional information on building 270 , such as the locations of entrances and/or exits.
  • Map data 200 may also store predetermined stopping areas, such as a parking lot 280 . In this regard, such areas may be hand-selected by a human operator or learned by a computing device over time. Map data 200 may include additional information about the stopping areas, such as the location of entrance 282 and exit 284 of parking lot 280 , and that entrance 282 connects to road 240 , while exit 284 connects to roads 230 and 250 .
  • map data 200 may further include zoning information.
  • the zoning information may be obtained from administrative records, such as county records.
  • information on the roads may include an indication that a given road is within a residential zone, a school zone, a commercial zone, etc.
  • the map data may further include location coordinates (examples of which are shown in FIG. 7 ), such as GPS coordinates of the roads 210 , 220 , 230 , 240 , and 250 , intersections 219 , 229 , 239 , and 259 , lane line 211 , lanes 212 and 214 , traffic light 216 , pedestrian crossing 218 , stop sign 260 , building 270 and its entrance 272 , parking lot 280 and its entrance 282 and exit 284 .
  • the map data need not be entirely image based (for example, raster).
  • the detailed map data may include one or more roadgraphs or graph networks of information such as roads, lanes, intersections, and the connections between these features.
  • Each feature may be stored as graph data and may be associated with information such as a geographic location and whether or not it is linked to other related features, for example, a stop sign may be linked to a road and an intersection, etc.
  • the associated data may include grid-based indices of a roadgraph to allow for efficient lookup of certain roadgraph features.
  • the perception system 172 also includes one or more components for detecting objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc.
  • the perception system 172 may include one or more LIDAR sensor(s) 180 , camera sensor(s) 182 , and RADAR sensor(s) 184 .
  • the perception system 172 may include other sensors, such as SONAR device(s), gyroscope(s), accelerometer(s), and/or any other detection devices that record data which may be processed by computing devices 110 .
  • the sensors of the perception system may detect objects and their characteristics such as location, orientation, size, shape, type (for instance, vehicle, pedestrian, bicyclist, etc.), heading, and speed of movement, etc.
  • the raw data from the sensors and/or the aforementioned characteristics can be quantified or arranged into a descriptive function, vector, and/or bounding box and sent for further processing to the computing devices 110 periodically and continuously as it is generated by the perception system 172 .
  • FIG. 3 is an example external view of vehicle 100 .
  • roof-top housing 310 and dome housing 312 may include a LIDAR sensor as well as various cameras and RADAR units.
  • housing 320 located at the front end of vehicle 100 and housings 330 , 332 on the driver's and passenger's sides of the vehicle may each store a LIDAR sensor.
  • housing 330 is located in front of driver door 350 .
  • Vehicle 100 also includes housings 340 , 342 for RADAR units and/or cameras also located on the roof of vehicle 100 . Additional RADAR units and cameras (not shown) may be located at the front and rear ends of vehicle 100 and/or on other positions along the roof or roof-top housing 310 .
  • Vehicle 100 also includes many features of a typical passenger vehicle such as doors 350 , 352 , wheels 360 , 362 , etc.
  • computing devices 110 and/or perception system 172 may determine the object's type, for example, a traffic cone, pedestrian, a vehicle (such as a passenger car, truck, bus, etc.), bicycle, etc.
  • Objects may be identified by various models which may consider various characteristics of the detected objects, such as the size of an object, the speed of the object (bicycles do not tend to go faster than 40 miles per hour or slower than 0.1 miles per hour), the heat coming from the bicycle (bicycles tend to have riders who emit heat from their bodies), etc.
  • the object may be classified based on specific attributes of the object, such as information contained on a license plate, bumper sticker, or logos that appear on the vehicle.
  • sensor data (examples of which are shown in FIG. 7 ) collected by one or more sensors of the perception system 172 may be stored in data of computing device 110 of vehicle 100 .
  • vehicle 100 may have driven past stop sign 260 in the past, and have stored the detected values of stop sign 260 by LIDAR sensor(s) 180 in data 134 of memory 130 .
  • the detected values may include, for example, that when vehicle 100 is at location [x_b, y_b] (which may, for example, correspond to driving on road 230 towards intersection 239 , 10 m away from reaching intersection 239 ), the stop sign 260 was detected to be at location [x 4 , y 4 ] and at a 30° angle from the front of vehicle 100 (which may, for example, correspond to when vehicle 100 is 8 m away from stop sign 260 on road 230 driving towards intersection 239 ).
  • these stored sensor data may be used for performance checks on the various systems of the vehicle 100 .
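The stored-versus-fresh comparison described above can be sketched as a simple tolerance check. The following Python sketch is illustrative only; the record fields, tolerance values, and function name are assumptions, not part of the patent:

```python
import math

# Hypothetical stored detection record: where the feature was previously
# detected relative to the map frame, plus its bearing from the vehicle.
stored = {"feature": "stop_sign_260", "vehicle_xy": (0.0, 0.0),
          "feature_xy": (6.9, 4.0), "angle_deg": 30.0}

def sensor_check(new_feature_xy, new_angle_deg, stored,
                 pos_tol_m=0.5, angle_tol_deg=2.0):
    """Compare a fresh detection against a stored one; flag a possible
    sensor fault if position or bearing drifts past tolerance."""
    dx = new_feature_xy[0] - stored["feature_xy"][0]
    dy = new_feature_xy[1] - stored["feature_xy"][1]
    pos_err = math.hypot(dx, dy)
    ang_err = abs(new_angle_deg - stored["angle_deg"])
    return pos_err <= pos_tol_m and ang_err <= angle_tol_deg

print(sensor_check((7.0, 4.1), 30.5, stored))   # small drift -> passes
print(sensor_check((9.0, 4.0), 30.0, stored))   # 2.1 m offset -> fails
```

A map check could reuse the same comparison with the expected pose drawn from map data 200 instead of a stored detection.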
  • sensor data collected by the perception system of a reference vehicle may be stored in computing device 110 of vehicle 100 .
  • such sensor data may be stored remotely on a server or a storage system.
  • Computing device 110 may further store threshold values (some of which are shown in FIG. 8 ) for various components of vehicle 100 .
  • computing device 110 may store a threshold minimum tire pressure for tires of vehicle 100 .
  • computing device 110 may store threshold alignment angles for tires of vehicle 100 .
  • computing device 110 may store a threshold stopping distance at a particular speed for a brake of vehicle 100 .
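The threshold values above lend themselves to a straightforward component check. A minimal sketch in Python; the threshold numbers and field names are hypothetical, chosen only for illustration:

```python
# Hypothetical threshold table for component checks (illustrative values).
THRESHOLDS = {
    "min_tire_pressure_psi": 32.0,
    "max_alignment_angle_deg": 1.5,
    "max_stop_distance_m_at_50kph": 28.0,
}

def component_check(measured):
    """Return the list of components whose measurements violate thresholds."""
    failures = []
    if measured["tire_pressure_psi"] < THRESHOLDS["min_tire_pressure_psi"]:
        failures.append("tire_pressure")
    if abs(measured["alignment_angle_deg"]) > THRESHOLDS["max_alignment_angle_deg"]:
        failures.append("wheel_alignment")
    if measured["stop_distance_m"] > THRESHOLDS["max_stop_distance_m_at_50kph"]:
        failures.append("brakes")
    return failures

print(component_check({"tire_pressure_psi": 30.5,
                       "alignment_angle_deg": 0.4,
                       "stop_distance_m": 26.0}))  # ['tire_pressure']
```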
  • the one or more computing devices 110 of vehicle 100 may also receive or transfer information to and from other computing devices, for instance using wireless network connections 156 .
  • the wireless network connections may include, for instance, BLUETOOTH®, Bluetooth LE, LTE, cellular, near field communications, etc. and various combinations of the foregoing.
  • FIGS. 4 and 5 are pictorial and functional diagrams, respectively, of an example system 400 that includes a plurality of computing devices 410 , 420 , 430 , 440 and a storage system 450 connected via a network 460 .
  • System 400 also includes vehicle 100 , and vehicle 100 A which may be configured similarly to vehicle 100 . Although only a few vehicles and computing devices are depicted for simplicity, a typical system may include significantly more.
  • each of computing devices 410 , 420 , 430 , 440 may include one or more processors, memory, data and instructions. Such processors, memories, data and instructions may be configured similarly to one or more processors 120 , memory 130 , data 134 , and instructions 132 of computing device 110 .
  • the network 460 may include various configurations and protocols including short range communication protocols such as BLUETOOTH®, Bluetooth LE, the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing.
  • Such communication may be facilitated by any device capable of transmitting data to and from other computing devices, such as modems and wireless interfaces.
  • one or more computing devices 410 may include a server having a plurality of computing devices, e.g., a load balanced server farm, that exchange information with different nodes of a network for the purpose of receiving, processing and transmitting the data to and from other computing devices.
  • one or more computing devices 410 may include one or more server computing devices that are capable of communicating with one or more computing devices 110 of vehicle 100 or a similar computing device of vehicle 100 A as well as client computing devices 420 , 430 , 440 via the network 460 .
  • vehicles 100 and 100 A may be a part of a fleet of vehicles that can be dispatched by server computing devices to various locations. In this regard, the vehicles of the fleet may periodically send the server computing devices location information provided by the vehicle's respective positioning systems and the one or more server computing devices may track the locations of the vehicles.
  • sensor data may additionally or alternatively be stored on server computing device 410 .
  • threshold values for components of vehicle 100 may likewise be stored on server computing device 410 .
  • server computing devices 410 may use network 460 to transmit and present information to a user, such as user 422 , 432 , 442 on a display, such as displays 424 , 434 , 444 of computing devices 420 , 430 , 440 .
  • computing devices 420 , 430 , 440 may be considered client computing devices.
  • each client computing device 420 , 430 , 440 may be a personal computing device intended for use by a user 422 , 432 , 442 , and have all of the components normally used in connection with a personal computing device including one or more processors (e.g., a central processing unit (CPU)), memory (e.g., RAM and internal hard drives) storing data and instructions, a display such as displays 424 , 434 , 444 (e.g., a monitor having a screen, a touch-screen, a projector, a television, or other device that is operable to display information), and user input devices 426 , 436 , 446 (e.g., a mouse, keyboard, touchscreen or microphone).
  • a user such as user 422 , 432 , 442 , may send information, such as pickup or drop-off requests, to server computing devices 410 , using user input devices 426 , 436 , 446 of computing devices 420 , 430 , 440 .
  • the client computing devices may also include a camera for recording video streams, speakers, a network interface device, and all of the components used for connecting these elements to one another.
  • although client computing devices 420 , 430 , and 440 may each comprise a full-sized personal computing device, they may alternatively comprise mobile computing devices capable of wirelessly exchanging data with a server over a network such as the Internet.
  • client computing device 420 may be a mobile phone or a device such as a wireless-enabled PDA, a tablet PC, a wearable computing device or system, or a netbook that is capable of obtaining information via the Internet or other networks.
  • client computing device 430 may be a wearable computing system, shown as a wrist watch in FIG. 4 .
  • the user may input information using a small keyboard, a keypad, microphone, using visual signals with a camera, or a touch screen.
  • client computing device 440 may be remote operator work station used by an administrator to provide remote operator services to users such as users 422 and 432 .
  • a remote operator 442 may use the remote operator work station 440 to communicate via a telephone call or audio connection with users through their respective client computing devices and/or vehicles 100 or 100 A in order to ensure the safe operation of vehicles 100 and 100 A and the safety of the users as described in further detail below.
  • Although only a single remote operator work station 440 is shown in FIGS. 4 and 5 , any number of such work stations may be included in a typical system.
  • Storage system 450 may store various types of information as described in more detail below. This information may be retrieved or otherwise accessed by a server computing device, such as one or more server computing devices 410 , in order to perform some or all of the features described herein.
  • the information may include user account information such as credentials (e.g., a username and password as in the case of a traditional single-factor authentication as well as other types of credentials typically used in multi-factor authentications such as random identifiers, biometrics, etc.) that can be used to identify a user to the one or more server computing devices.
  • the user account information may also include personal information such as the user's name, contact information, identifying information of the user's client computing device (or devices if multiple devices are used with the same user account), as well as age information, health information, and user history information about how long it has taken the user to enter or exit vehicles in the past as discussed below.
  • the storage system 450 may also store routing data for generating and evaluating routes between locations.
  • the routing information may be used to estimate how long it would take a vehicle at a first location to reach a second location.
  • the routing information may include map data, not necessarily as particular as the detailed map data 200 described above, but including roads, as well as information about those roads, such as direction (one way, two way, etc.), orientation (North, South, etc.), speed limits, and traffic information identifying expected traffic conditions, etc.
  • sensor data may additionally or alternatively be stored on storage system 450 .
  • threshold values for components of vehicle 100 may likewise be stored on storage system 450 .
  • the storage system 450 may also store information which can be provided to client computing devices for display to a user. For instance, the storage system 450 may store predetermined distance information for determining an area at which a vehicle is likely to stop for a given pickup or drop-off location. The storage system 450 may also store graphics, icons, and other items which may be displayed to a user as discussed below.
  • storage system 450 can be of any type of computerized storage capable of storing information accessible by the server computing devices 410 , such as a hard-drive, memory card, ROM, RAM, DVD, CD-ROM, write-capable, and read-only memories.
  • storage system 450 may include a distributed storage system where data is stored on a plurality of different storage devices which may be physically located at the same or different geographic locations.
  • Storage system 450 may be connected to the computing devices via the network 460 as shown in FIG. 4 and/or may be directly connected to or incorporated into any of the computing devices 110 , 410 , 420 , 430 , 440 , etc.
  • FIG. 6 illustrates an example situation 600 for performing a plurality of performance checks on vehicle 100 .
  • Various features in FIG. 6 may generally correspond to the shape, location, and other characteristics of features shown in map data 200 of FIG. 2 , and labeled as such. Additional features in FIG. 6 , including various road users and other objects, are described in detail below. Although these examples are useful for demonstration purposes, they should not be considered limiting.
  • vehicle 100 is currently parked curbside in lane 212 of road 210 . To ensure safe operation on the road, vehicle 100 may need to perform performance checks on its systems.
  • vehicle 100 may be scheduled to perform the performance checks on its systems on a regular basis, such as every day or week, every predetermined number of kilometers traveled or numbers of trips completed, or some frequency mandated by law. For example, during a previous day, vehicle 100 might have completed a number of trips, and upon completing these trips, vehicle 100 has parked roadside by the curb in lane 212 .
  • vehicle 100 may first perform a plurality of performance checks on its various systems. In one instance, some types of performance checks may be performed at a higher frequency than other performance checks. In another instance, the frequency of the performance checks may depend on the type of vehicle.
  • a test route may be determined.
  • computing device 110 may determine the test route based on a location of the vehicle, map data, and the types of performance checks to be performed. For instance, computing device 110 may determine that vehicle 100 is currently parked by the curb in lane 212 near intersection 219 , and determine, based on map data 200 , a test route nearby so that vehicle 100 does not need to drive to a designated depot or testing center just to perform these tests. This ensures that the performance checks are performed as soon as possible, instead of risking operating the vehicle 100 on a long drive to the designated testing center, and ensures a more efficient use of resources, including fuel.
  • Computing device 110 may determine the test route further based on the types of performance checks that need to be performed in order to complete the plurality of performance checks. For instance, a list of required performance checks, including the type of each required performance check and frequency for each required performance check, may be stored on computing device 110 . Additionally or alternatively, the list of required performance checks may be stored on server computing device 410 and/or storage system 450 accessible by computing device 110 . For example, the list of required performance checks may include items such as “perform a sensor check using a stored traffic light detection at least once per 24 hours,” “perform a map check using a stop sign stored in map data at least once per month,” “perform a component check on all four tires at least once per week,” etc.
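A stored list of required checks with frequencies, as described above, could be consulted as follows. This is a sketch under assumed field names and illustrative intervals, not the patent's implementation:

```python
# Hypothetical stored list of required performance checks: how often each
# must run (in hours) and when it last ran (hours since a fixed epoch).
REQUIRED_CHECKS = [
    {"name": "sensor_check_traffic_light", "every_h": 24,  "last_run_h": 30},
    {"name": "map_check_stop_sign",        "every_h": 720, "last_run_h": 100},
    {"name": "component_check_tires",      "every_h": 168, "last_run_h": 20},
]

def due_checks(now_h, checks):
    """A check is due when at least its required interval has elapsed."""
    return [c["name"] for c in checks if now_h - c["last_run_h"] >= c["every_h"]]

print(due_checks(200, REQUIRED_CHECKS))
```

The resulting list of due checks would then drive which road segments the test route must include.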
  • computing device 110 may select a plurality of segments, such as a plurality of road segments, using map data and stored sensor data, where each of the plurality of road segments is selected for performing one or more of the required performance checks.
  • Computing device 110 may then connect the plurality of road segments, and connect the location of the vehicle to one of the plurality of road segments to determine a test route in order to allow the vehicle to perform the plurality of performance checks.
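The segment-selection-and-connection step might be sketched as a greedy cover of the required checks. The segment IDs echo FIG. 6 , but the data structure and helper function are assumptions:

```python
# Each candidate road segment is paired with the checks it can support
# (hypothetical mapping, loosely following FIG. 6).
SEGMENTS = [
    ("segment_610", {"sensor_check", "map_check", "right_turn_signal"}),
    ("segment_620", {"left_turn_signal", "wheel_alignment"}),
    ("segment_640", {"map_check"}),
    ("segment_650", {"reverse_signal"}),
]

def build_test_route(start, required):
    """Greedily pick segments until all required checks are covered,
    chaining them after the vehicle's current location."""
    route, remaining = [start], set(required)
    for seg_id, covers in SEGMENTS:
        if covers & remaining:
            route.append(seg_id)
            remaining -= covers
        if not remaining:
            break
    return route, remaining  # non-empty remaining -> no full coverage

route, missing = build_test_route("current_location",
                                  {"sensor_check", "left_turn_signal",
                                   "reverse_signal"})
print(route)    # the selected segments in order
print(missing)  # set() when every required check is covered
```

A real planner would also insert connector segments (such as segment 630 in the example below) between non-adjacent selections.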
  • computing device 110 may select a segment for a test route so that sensor data can be collected to compare with stored values of previous detections of traffic features or stationary objects. For example, as shown in FIG. 6 , computing device 110 may determine that traffic light 216 and pedestrian crossing 218 nearby vehicle 100 are stored as previously detected traffic features near vehicle 100 . As such, a sensor check on one or more detection systems may be performed by comparing newly detected values to these stored values. Therefore, computing device 110 may determine that segment 610 beginning at the current location of vehicle 100 and a right turn from lane 212 to road 220 may be used for performing the sensor check.
  • computing device 110 may select a segment for a test route so that sensor data can be collected to compare with stored locations and/or orientations of traffic features or stationary objects in map data. For example, as shown in FIG. 6 , computing device 110 may determine that stop sign 260 is stored in map data 200 as a traffic feature. As such, a map check on the map data 200 stored in navigation system may be performed by comparing newly detected location and orientation of the stop sign 260 with the stored location and/or orientation of the stop sign 260 in map data 200 . Therefore, computing device 110 may determine that segment 640 including a portion of road 230 near stop sign 260 may be used for performing the map check.
  • computing device 110 may select a segment for a test route where a particular vehicle maneuver may be performed. For example, as shown in FIG. 6 , computing device 110 may determine that a left turn may be performed at intersection 229 . As such, a component check on the brake, wheel alignment, and left turn signal may be performed at intersection 229 while vehicle 100 performs the left turn. Therefore, computing device 110 may determine that segment 620 including a portion of road 220 and a left turn at intersection 229 to road 230 can be used for performing the component check.
  • computing device 110 may determine that more than one segment is needed for performing a particular type of performance check.
  • a human operator may manually create a list of items for a test route.
  • a list of items required for a test route may be stored on computing devices 110 , and/or stored on server computing device 410 and/or storage system 450 accessible by computing device 110 .
  • the list of items required for a test route may include items such as “five or more traffic lights on the test route,” “one multipoint turn on the test route,” etc.
  • computing device 110 may determine that a component check for the reverse signal requires maneuvers such as back-in or parallel parking or multi-point turns. As such, computing device 110 may determine that segment 650 including parking lot 280 may be used for performing the component check on the reverse signal.
  • segment 610 may also be used for map check, since the location and/or orientation of traffic features such as traffic light 216 and pedestrian crossing 218 are stored in map data 200 .
  • segment 610 may also be used for a component check on the brake, wheel alignment, and right turn signal.
  • Computing device 110 may select additional segments of test route connecting the various segments selected for the particular types of performance checks. For example, computing device 110 may determine that segment 630 may be needed to connect segment 620 and segment 640 . As such, an example test route may include segments 610 , 620 , 630 , 640 , and 650 .
  • computing device 110 may store the corresponding or associated performance checks to be performed using sensor data from each segment of the test route. For example, computing device 110 may associate segment 610 with a sensor check, a map check using traffic light 216 , pedestrian crossing 218 , and a component check on wheel alignment and right-turn signal. For another example, computing device 110 may associate segment 620 with a component check on left-turn signal and wheel alignment. For still another example, computing device 110 may associate segment 640 with a sensor check and a map check using stop sign 260 . For yet another example, computing device 110 may not associate any check with segment 630 .
  • Computing device 110 may determine the test route further based on additional requirements for test routes.
  • One example requirement may be that one or more segments of the test route must have below a threshold traffic volume.
  • computing device 110 may receive historical or real-time traffic data from a database.
  • Another example requirement may be that one or more segments of the route must have a speed limit below or above a threshold speed limit.
  • computing device 110 may determine speed limits of various roads based on map data 200 .
  • Yet another example requirement may be that one or more segments of the route must not be performed in certain areas, such as a school zone.
  • computing device 110 may determine zoning information based on map data 200 .
  • Still another example requirement may be that particular types of maneuvers must be performed in a parking lot. For example, as shown in FIG. 6 , parking lot 280 may be chosen based on a requirement that multi-point turns be made in parking lots. Additional example requirements may be that the test route must include multiple distinct traffic lights, one or more cul-de-sacs for performing multipoint turns, and a stored traffic feature in a designated depot or testing center for the vehicle 100 .
  • the test route need not be a closed loop.
  • computing device 110 may determine to go on a next segment at the end of segment 650 , instead of returning to the beginning of the test route.
  • the test route may be a closed loop, for example, computing device 110 may determine an additional segment connecting the end of segment 650 back to the beginning of segment 610 .
  • the plurality of performance checks may be repeated in order to collect more sets of sensor data, which for example may be averaged to obtain more accurate results.
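Averaging repeated detections, as suggested above, could look like this (the values and helper are illustrative only):

```python
def average_detections(detections):
    """Average repeated (x, y) detections of the same feature collected
    over several laps of the test route, reducing measurement noise."""
    n = len(detections)
    return (sum(x for x, _ in detections) / n,
            sum(y for _, y in detections) / n)

laps = [(6.9, 4.1), (7.1, 3.9), (7.0, 4.0)]
print(average_detections(laps))
```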
  • the test route may be stored so that the test route can be used again by the vehicle at a later time to perform the aforementioned checks.
  • the test route described above may be stored, and if vehicle 100 happens to be around the area when the plurality of performance checks need to be performed again, computing device 110 may simply use the stored test route, instead of determining a new test route. For another instance, based on which performance checks need to be performed again, computing device 110 may use some but not all of the segments of the stored test route.
  • computing device 110 may control the vehicle 100 to drive along the test route. While doing so, the perception system 172 and/or computing devices 110 may collect data while on the test route including sensor data and component data in order to perform the aforementioned performance checks.
  • FIG. 7 shows example sensor data 700 collected by various sensors in the perception system 172 while vehicle 100 drives along the test route shown in FIG. 6 .
  • FIG. 8 shows example components data 800 collected from various components while vehicle 100 drives along the test route shown in FIG. 6 .
  • the collected sensor data may include data on permanent traffic features or stationary objects, such as traffic light 216 , pedestrian crossing 218 , building 270 , stop sign 260 , and parking lot 280 .
  • the sensor data may include information such as the detected location and orientation of each traffic feature or object, as well as the location of the vehicle 100 when the sensor data on that feature or object is taken.
  • LIDAR sensor(s) 180 of vehicle 100 may detect traffic light 216 at location [x 1 , y 1 ] and at an angle 25° from vehicle 100 , and building 270 at location [x 3 , y 3 ] and at an angle 10° from vehicle 100 .
  • LIDAR sensor(s) 180 of vehicle 100 may detect stop sign 260 at location [x 4 , y 4 ] and at an angle 25° from vehicle 100 , and parking lot 280 at location [x 5 , y 5 ] and at an angle 25° from vehicle 100 .
  • Locations of vehicle 100 during the test route may be determined by navigation system 168 .
  • the LIDAR data may further include details such as the size and shape of these features or objects.
  • the stored sensor data may include information such as the previously detected location and orientation of each traffic feature or object.
  • the stored sensor data for a traffic feature or object may include previous detections of the traffic feature or object by sensors of the vehicle 100 in the past.
  • the stored sensor data for a traffic feature or object may additionally or alternatively include previous detections of the traffic feature or object made by sensors of other vehicles.
  • the stored sensor data may include the location of the vehicle taking the sensor data when the traffic feature or object was detected.
  • the stored sensor data may be stored on computing device 110 . Additionally or alternatively, the stored sensor data may be stored on server computing device 410 and/or storage system 450 accessible by computing device 110 .
  • stored sensor data and collected sensor data may include the same type of sensor data taken by multiple sensors, such as by different LIDAR sensors mounted at different locations in or on vehicle 100 .
  • stored sensor data and collected sensor data may include different types of sensor data, such as camera data.
  • Each type of sensor data may include information similar to the LIDAR data, such as the detected location and orientation of each traffic feature or object, and the location of the vehicle 100 when the sensor data on that feature or object is taken.
  • each type of sensor data may include further details such as the size, shape, and color of these features or objects.
  • the collected sensor data may further include data on temporary or moving traffic features and/or objects, such as vehicle 100 A, vehicle 100 B, traffic cone 670 and pedestrian 680 .
  • LIDAR sensor(s) 180 of vehicle 100 may detect vehicle 100 A at location [x 6 , y 6 ] at a 15° angle from the front of the vehicle 100 .
  • camera sensor(s) 182 and RADAR sensor 184 may also each detect vehicle 100 A at location [x 6 , y 6 ] at a 15° angle.
  • the camera data may further include the color of vehicle 100 A
  • RADAR data may further include speed of vehicle 100 A.
  • component data may be collected on various components of vehicle 100 .
  • tire pressures may be collected for all four tires of vehicle 100 .
  • wheel alignment data may be collected on all four wheels of vehicle 100 .
  • the wheel alignment data may include camber angle, caster angle, and toe angle for each wheel.
  • data on the brakes of vehicle 100 may be collected. For instance, the stopping distance at a specific speed, such as 100 km/hr, may be measured.
  • responsiveness of various lights, such as the turn and reverse signals as well as the headlight, may be checked by turning them on and off.
  • the plurality of performance checks may be performed by analyzing the collected data.
  • Computing devices 110 may perform the plurality of performance checks by analyzing the collected data in real time while vehicle 100 navigates through the test route, or store the collected data in memory 130 so that computing device 110 may perform the checks after completing the test route.
  • the collected data may be uploaded to server computing device 410 or storage system 450 so that server computing device 410 may perform the plurality of performance checks. Having computing device 110 perform the checks may provide greater efficiency, since uploading collected data to server computing device 410 or storage system 450 may be time consuming.
  • detected characteristics of a traffic feature or object collected during the test route may be compared with previously detected or stored characteristics of the traffic feature or object.
  • a sensor may satisfy the sensor check when the characteristics collected during the test route match the previously detected or stored characteristics by the same sensor, and not satisfy the sensor check when the characteristics collected during the test route do not match the previously detected or stored characteristics. For instance, referring to FIG. 7 , collected LIDAR data from LIDAR sensor(s) 180 may be compared to stored LIDAR values from a previous detection by LIDAR sensor(s) 180 .
  • computing device 110 may determine that the detected location for each of traffic light 216 , building 270 , stop sign 260 and parking lot 280 is identical to the stored LIDAR values, but that the detected orientation of each is offset by a 5° angle. As such, computing device 110 may determine that LIDAR sensor(s) 180 does not satisfy the sensor check.
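The comparison above can be sketched as a minimal, hypothetical illustration: feature names, coordinates, and angles below are stand-ins for the stored and collected LIDAR values of FIG. 7, not the patent's actual data format.

```python
# Sketch of an exact-match sensor check, assuming detections are keyed by
# feature and carry a location and an orientation (all values hypothetical).

STORED_LIDAR = {
    "traffic_light_216": {"loc": (1.0, 2.0), "angle_deg": 30.0},
    "building_270":      {"loc": (3.0, 3.0), "angle_deg": 15.0},
    "stop_sign_260":     {"loc": (4.0, 4.0), "angle_deg": 30.0},
}

# Collected on the test route: locations identical, but every orientation
# is offset by 5 degrees, as in the example above.
COLLECTED_LIDAR = {
    "traffic_light_216": {"loc": (1.0, 2.0), "angle_deg": 25.0},
    "building_270":      {"loc": (3.0, 3.0), "angle_deg": 10.0},
    "stop_sign_260":     {"loc": (4.0, 4.0), "angle_deg": 25.0},
}

def passes_sensor_check(stored, collected):
    """True only if every stored detection is reproduced exactly."""
    for name, prev in stored.items():
        cur = collected.get(name)
        if cur is None:
            return False
        if cur["loc"] != prev["loc"] or cur["angle_deg"] != prev["angle_deg"]:
            return False
    return True

print(passes_sensor_check(STORED_LIDAR, COLLECTED_LIDAR))  # False
```

With the 5° orientation offset present on every feature, the exact-match rule fails the sensor, which motivates the tolerance and correction logic discussed below.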
  • FIG. 9 shows an example situation 900 illustrating an example sensor check.
  • Various features in FIG. 9 may generally correspond to the shape, location, and other characteristics of features shown in map data 200 of FIG. 2 , and labeled as such. Additional features in FIG. 9 , including various road users and other objects, are described in detail below. Although these examples are useful for demonstration purposes, they should not be considered limiting.
  • LIDAR sensor(s) 180 detects stop sign 260 at a location [x 4 , y 4 ] and orientation of 25° angle with respect to a front right corner of vehicle 100 .
  • the stored LIDAR values for the stop sign 260 include location [x 4 , y 4 ] and orientation of 30° angle with respect to a front right corner of vehicle 100 . This may be due to a movement of the LIDAR sensor(s) 180 from its previous position when the stored LIDAR values were taken. For example, a pedestrian might have accidentally touched the LIDAR sensor(s) 180 when passing by vehicle 100 while vehicle 100 was parked curbside in lane 212 . As such, this rotation causes a −5° angle offset for all detections made by LIDAR sensor(s) 180 .
  • computing device 110 may compare the location and/or orientation of traffic features and/or objects detected in the collected sensor data with the location and/or orientation of traffic features and/or objects stored in map data 200 .
  • computing device 110 may compare location [x 1 , y 1 ] for traffic light 216 detected in collected LIDAR data with location [x 1 , y 1 ] stored in map data 200 , compare location [x 3 , y 3 ] for building 270 detected in collected LIDAR data with location [x 3 , y 3 ] stored in map data 200 , compare location [x 4 , y 4 ] for stop sign 260 detected in collected LIDAR data with location [x 4 , y 4 ] stored in map data 200 , compare location [x 5 , y 5 ] for parking lot 280 detected in collected LIDAR data with location [x 5 , y 5 ] stored in map data 200 , and conclude that LIDAR sensor(s) 180 pass the sensor check.
  • computing device 110 may determine that a sensor may still pass a sensor test if the differences between the stored and collected sensor data are within a predetermined range. For instance, computing device 110 may determine that LIDAR sensor(s) 180 may still pass the sensor test if the difference in stored and detected orientation for a detected object is within a 10° range.
  • Computing device 110 may determine one or more corrections for one or more sensors that fail the sensor test. For example, for LIDAR sensor(s) 180 , computing device 110 may determine a +5° correction for all orientation values detected by LIDAR sensor(s) 180 . For instance, computing device 110 may add 5° to the detected 25° angle for stop sign 260 .
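The tolerance-and-correction logic might be sketched as follows, assuming the correction is derived as the mean stored-minus-collected orientation offset; the 10° tolerance, feature names, and angles are the illustrative figures from the text, not normative values.

```python
# Sketch: derive a single angular correction from consistent offsets, or
# fail the sensor outright when any offset exceeds the tolerance.

def orientation_correction(stored, collected, tolerance_deg=10.0):
    """Return the mean stored-minus-collected angle offset in degrees,
    or None if any single offset exceeds the tolerance."""
    offsets = [stored[name] - collected[name] for name in stored]
    if any(abs(o) > tolerance_deg for o in offsets):
        return None
    return sum(offsets) / len(offsets)

stored_angles = {"traffic_light_216": 30.0, "stop_sign_260": 30.0}
collected_angles = {"traffic_light_216": 25.0, "stop_sign_260": 25.0}

correction = orientation_correction(stored_angles, collected_angles)
print(correction)                                      # 5.0
print(collected_angles["stop_sign_260"] + correction)  # 30.0, corrected
```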
  • Another sensor check may include comparing collected sensor data from various sensors of a same type for a detected object. For instance, if LIDAR sensor(s) 180 include multiple sensors with overlapping fields of view, computing device 110 may compare the LIDAR point cloud for traffic light 216 collected by a first sensor with the LIDAR point cloud for traffic light 216 collected by a second sensor. Computing device 110 may determine that, if the two LIDAR point clouds match substantially, such as by 90% or some other threshold, then both the first and second sensors pass the sensor check. A mismatch may indicate an error in the second sensor, which may be caused by any of a number of factors, such as damage by another road user, or environmental factors such as extreme temperature or humidity.
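One way the point-cloud comparison might be sketched: snap points to a coarse voxel grid so that nearly coincident points count as matching, then compare the overlap against the 90% threshold mentioned above. The voxel size and the point sets are assumptions for illustration.

```python
# Sketch: compare two point clouds from same-type sensors with overlapping
# fields of view by voxelizing and measuring set overlap.

def clouds_match(cloud_a, cloud_b, threshold=0.9, voxel=0.1):
    """True if the voxelized overlap between the clouds meets the threshold."""
    def voxelize(cloud):
        return {(round(x / voxel), round(y / voxel), round(z / voxel))
                for x, y, z in cloud}
    va, vb = voxelize(cloud_a), voxelize(cloud_b)
    if not va and not vb:
        return True
    overlap = len(va & vb) / len(va | vb)
    return overlap >= threshold

# Two sensors seeing the same object with centimeter-scale disagreement:
cloud_1 = [(1.00, 2.00, 0.50), (1.10, 2.00, 0.50), (1.20, 2.00, 0.50)]
cloud_2 = [(1.01, 2.01, 0.51), (1.11, 2.01, 0.51), (1.21, 2.01, 0.51)]
print(clouds_match(cloud_1, cloud_2))  # True
```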
  • Still another sensor check may include determining a resolution or field of view captured by a sensor. For example, if collected LIDAR data for LIDAR sensor(s) 180 has a smaller field of view than the stored LIDAR data, computing device 110 may further determine that LIDAR sensor(s) 180 has failed the sensor check. In some instances, computing device 110 may determine that LIDAR sensor(s) 180 may still pass the sensor test if the difference between the field of view of the LIDAR data collected during the test route and the field of view of the stored LIDAR data is within a predetermined threshold difference. For another example, if collected camera data for camera sensor(s) 182 has a lower resolution than the stored camera data, computing device 110 may further determine that camera sensor(s) 182 has failed the sensor check.
  • computing device 110 may determine that camera sensor(s) 182 may still pass the sensor test if the difference between the resolution of the camera data collected during the test route and the resolution of the stored camera data is within a predetermined threshold difference.
  • Such changes in resolution or field of view may be caused by any of a number of factors, such as damage by another road user, or due to environmental factors such as extreme temperature or humidity.
  • Yet another sensor check may include determining whether a sensor produces unreasonable sensor data. For example, computing device 110 may determine that camera data produced by camera sensor(s) 182 are all green, and conclude that camera sensor(s) 182 fail the sensor check. For another example, computing device 110 may determine that LIDAR sensor(s) 180 produces empty point clouds, and conclude that LIDAR sensor(s) 180 fail the sensor check. Such unreasonable sensor data may likewise be caused by any of a number of factors, such as damage by another road user, or environmental factors such as extreme temperature or humidity.
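The field-of-view and "unreasonable data" checks might be sketched as below, under simplifying assumptions: a field of view is represented as a single angle, a camera frame as a list of RGB tuples, and a point cloud as a list of points. None of these are the patent's actual data formats, and the 5° loss threshold is invented for illustration.

```python
# Sketches of a field-of-view check and two sanity ("unreasonable data") checks.

def fov_within_threshold(collected_fov_deg, stored_fov_deg, max_loss_deg=5.0):
    """Pass if the collected field of view has not shrunk by more than the
    assumed threshold relative to the stored field of view."""
    return (stored_fov_deg - collected_fov_deg) <= max_loss_deg

def camera_data_reasonable(frame):
    """Fail if every pixel is the same color, e.g. an all-green frame."""
    return len({tuple(px) for px in frame}) > 1

def lidar_data_reasonable(point_cloud):
    """Fail on empty point clouds."""
    return len(point_cloud) > 0

all_green = [(0, 255, 0)] * 4
print(camera_data_reasonable(all_green))   # False: all-green frame fails
print(lidar_data_reasonable([]))           # False: empty cloud fails
print(fov_within_threshold(358.0, 360.0))  # True: loss within threshold
```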
  • a location or orientation of a detected traffic feature may be compared with the location and/or orientation of a previously detected or stored traffic feature stored in map data of the vehicle.
  • the map data may satisfy the map check when the location and/or orientation of the traffic features detected during the test route match the location and/or orientation of the corresponding traffic features stored in the map data.
  • locations of traffic features detected by LIDAR sensor(s) 180 may be compared to locations stored in map data 200 .
  • computing device 110 may determine that the location detected by LIDAR sensor(s) 180 for each of traffic light 216 , building 270 , stop sign 260 and parking lot 280 is identical to the location stored in map data 200 , but that pedestrian crossing 218 is not detected by LIDAR sensor(s) 180 .
  • computing device 110 may further determine whether the difference was due to an error in the map data 200 or an error in the collected sensor data. For example, computing device 110 may determine that, since pedestrian crossing 218 is not a 3D structure and the field of view of LIDAR sensor(s) 180 does not include ground level, LIDAR sensor(s) 180 cannot detect pedestrian crossing 218 , and therefore the difference does not indicate an error in map data 200 . In such cases, computing device 110 may further confirm by comparing the location stored in map data 200 with collected sensor data from another sensor, such as camera sensor(s) 182 . For example, computing device 110 may determine that the location for pedestrian crossing 218 in map data 200 matches the location detected by camera sensor(s) 182 .
  • computing device 110 may determine that an update needs to be made to map data 200 .
  • FIG. 9 shows the example situation 900 further illustrating an example map check.
  • LIDAR sensor(s) 180 of vehicle 100 detects a no-enter sign 910 near exit 284 of parking lot 280 .
  • map data 200 does not include data on a no-enter sign at this location.
  • computing device 110 may determine to update map data 200 with the detected location of no-enter sign 910 .
  • computing device 110 may determine that, even if some error exists, the map data may still pass a map test if a threshold number or percentage of traffic features stored in the map data have locations matching the detected locations from the collected sensor data. For instance, computing device 110 may determine that map data 200 may still pass the map test if at least five or at least 80% of the stored features have locations matching the collected sensor data. For example, since the locations for traffic light 216 , pedestrian crossing 218 , building 270 , stop sign 260 , and parking lot 280 match those of the collected LIDAR data, even though the location for no-enter sign 910 was missing from map data 200 , computing device 110 may still determine that map data 200 passes the map test.
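The pass rule above ("at least five or at least 80% of stored features match") can be sketched as follows; the feature names and coordinates are hypothetical, and a detected feature absent from the map (like the no-enter sign) does not count against the map in this simple rule.

```python
# Sketch of a threshold-based map check over stored features.

def map_check(stored_locations, detected_locations, min_count=5, min_fraction=0.8):
    """Pass if enough stored features have a matching detected location."""
    matches = sum(1 for name, loc in stored_locations.items()
                  if detected_locations.get(name) == loc)
    return matches >= min_count or matches / len(stored_locations) >= min_fraction

MAP_FEATURES = {
    "traffic_light_216":       (1.0, 1.0),
    "pedestrian_crossing_218": (2.0, 2.0),
    "building_270":            (3.0, 3.0),
    "stop_sign_260":           (4.0, 4.0),
    "parking_lot_280":         (5.0, 5.0),
}

# Detections include an extra no-enter sign absent from the map; all five
# stored features still match, so the map passes.
DETECTED = dict(MAP_FEATURES, no_enter_sign_910=(6.0, 6.0))
print(map_check(MAP_FEATURES, DETECTED))  # True
```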
  • the one or more measurements related to a component of the vehicle may be compared with predetermined requirements.
  • the component may satisfy the component check when the one or more measurements satisfy predetermined requirements.
  • the predetermined requirements may be stored in computing device 110 , or alternatively or additionally stored on server computing device 410 and/or storage system 450 .
  • a component may satisfy a component check if a measurement meets a predetermined threshold value.
  • For example, referring to FIG. 8 , a predetermined minimum threshold of 35 psi may be stored for the tires of vehicle 100 . As shown, since the front left tire, the rear left tire, and the rear right tire each meets the predetermined minimum threshold, these tires satisfy the component check. However, since the front right tire has a pressure of only 20 psi, the front right tire fails the component check.
  • a component may satisfy a component check if a measurement is within a predetermined range of values.
  • predetermined ranges of alignment angles may be set for the tires of vehicle 100 , including camber, caster, and toe angles. Since each of the tires of vehicle 100 has alignment angles within these predetermined ranges, each of the tires of vehicle 100 passes the component check.
  • a component may satisfy a component check if a measurement indicates that the component has a predetermined level of responsiveness.
  • the predetermined level of responsiveness may be set as a binary (responsive or not) for each of the left turn, right turn, reverse, and brake signal lights, as well as the headlight.
  • since the left turn, right turn, reverse, and brake signal lights are each responsive, computing device 110 may determine that they each pass the component check.
  • however, since the headlight is unresponsive, computing device 110 may determine that the headlight fails the component check.
  • the predetermined level of responsiveness may be set as a predetermined level of delay.
  • a predetermined stopping distance at a specific speed such as 100 km/hr, may be set for vehicle 100 .
  • computing device 110 may determine that the brake passes the component check.
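The three kinds of component checks described above (a minimum threshold, a value range, and binary responsiveness) might be sketched as below. The 35 psi and 20 psi figures reuse the examples from the text; the 36 psi reading and the alignment range are invented for illustration.

```python
# Sketches of the three component-check shapes: minimum threshold,
# value range, and binary responsiveness.

def meets_minimum(value, minimum):
    """E.g. tire pressure must meet a minimum psi threshold."""
    return value >= minimum

def within_range(value, low, high):
    """E.g. a camber, caster, or toe angle must fall inside a spec range."""
    return low <= value <= high

def is_responsive(responded):
    """E.g. a signal light either responds to on/off commands or does not."""
    return bool(responded)

print(meets_minimum(36.0, 35.0))    # True: tire above the 35 psi minimum
print(meets_minimum(20.0, 35.0))    # False: front right tire at 20 psi fails
print(within_range(0.5, -1.0, 1.0)) # True: angle inside the assumed range
print(is_responsive(False))         # False: unresponsive headlight fails
```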
  • computing devices 110 may select an operation mode for vehicle 100 .
  • Modes for operation may include, for example, task designations (passenger or non-passenger tasks).
  • Modes of operation may further include various limits, such as limits on speeds, distance, geographic area, or environmental conditions (such as weather, day/night).
  • Modes for operation may also include an inactive mode where the vehicle is pulled over or parked after completing the plurality of performance checks.
  • Computing device 110 may determine an operation mode based on results from the plurality of performance checks. For example, an operation mode may only be selected if a threshold number or percentage of performance checks are passed. For another example, an operation mode may only be selected if a specific set of performance checks are passed, such as a set of performance checks specific to driving at night or during poor visibility, which may include performance checks such as the sensor checks described above, and component checks involving the signal lights and headlight, etc.
  • Computing device 110 may determine that one or more operation modes cannot be selected based on specific failures. For example, computing device 110 may determine that, if the stopping distance at 100 km/hr for vehicle 100 is above 20 m, modes of operation involving driving at a speed of 100 km/hr or greater cannot be selected. For another example, computing device 110 may determine that, if less than 80% of the sensors in the perception system 172 pass the sensor test, operation modes involving driving at night or in certain weather conditions cannot be selected. For still another example, computing device 110 may determine that, if one or more tires has a tire pressure below 35 psi, modes of operation involving passenger tasks cannot be selected.
  • Computing device 110 may select an operation mode further based on other factors, such as traffic law requirements and the type of vehicle. For example, traffic law may require a vehicle to have operating turn signals. As such, computing device 110 may select the inactive mode if any of the turn signals is unresponsive. For another example, computing device 110 may select an operation mode with a limit on distance only for compact vehicles with below normal tire pressures, and select an inactive operation mode for trucks with below normal tire pressures.
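Mode selection from check results could be sketched as below, assuming each mode declares the set of checks it requires and the inactive mode serves as the fallback; the mode names and requirement sets are invented for illustration.

```python
# Sketch: pick the first (most capable) mode whose required checks all
# passed, falling back to the inactive mode.

MODE_REQUIREMENTS = {
    # ordered from most to least capable
    "passenger_night": ["lidar", "camera", "signal_lights", "headlight", "tires"],
    "passenger_day":   ["lidar", "camera", "signal_lights", "tires"],
    "non_passenger":   ["lidar", "signal_lights"],
}

def select_mode(results, requirements=MODE_REQUIREMENTS):
    """Return the first mode whose required checks all passed, else inactive."""
    for mode, checks in requirements.items():
        if all(results.get(check, False) for check in checks):
            return mode
    return "inactive"

results = {"lidar": True, "camera": True, "signal_lights": True,
           "headlight": False, "tires": True}
print(select_mode(results))  # "passenger_day": night mode blocked by headlight
```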
  • computing device 110 may operate vehicle 100 in the selected operation mode.
  • operating in the selected operation mode may include operating according to limits of the mode of operation, such as limits on speed, distance, geographic area, environmental condition.
  • operating in the selected mode may include determining whether to accept passenger or non-passenger tasks.
  • Operating in the selected operation mode may further include using the determined corrections for one or more sensors. For example, as described with respect to FIGS. 7 and 9 , when operating vehicle 100 , computing device 110 may apply a correction of +5° to sensor data detected by LIDAR sensor(s) 180 .
  • Operating in the selected operation mode may further include using the updated map data.
  • computing device 110 may use updated map data 200 including the no-enter sign 910 .
  • Operation modes may also be selected for a plurality of vehicles by a remote system, such as a fleet management system.
  • server computing device 410 may manage a fleet of vehicles including vehicle 100 , 100 A, 100 B.
  • sensor data and component data collected by various vehicles in the fleet, such as vehicle 100 , 100 A, 100 B may be uploaded to server computing device 410 .
  • Server computing device 410 may compare the collected sensor data from each vehicle to stored sensor values from previous detections.
  • Server computing device 410 may also compare the collected components data with stored predetermined requirements.
  • the plurality of performance checks may be performed by the computing device of each vehicle, and only the results (pass/fail) are uploaded to server computing device 410 .
  • Server computing device 410 may then designate modes of operation for subsets of vehicles of the plurality of vehicles based on the plurality of performance checks as described above, such as based on passing a threshold number or percentage of performance checks, particular sets of performance checks, or other factors such as the type of vehicle or traffic law. For another example, server computing device 410 may designate modes of operation further based on a planned distribution of or demand for the vehicles in the fleet.
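Server-side fleet designation might be sketched as follows, assuming each vehicle uploads only pass/fail results and the server applies a fraction-passed threshold; the 80% figure, vehicle names, and check names are illustrative assumptions.

```python
# Sketch: a fleet server assigns a mode per vehicle from uploaded
# pass/fail results using a fraction-passed threshold.

def designate_fleet(fleet_results, min_fraction=0.8):
    """Map each vehicle to 'active' or 'inactive' by fraction of checks passed."""
    modes = {}
    for vehicle, results in fleet_results.items():
        passed = sum(1 for ok in results.values() if ok)
        fraction = passed / len(results) if results else 0.0
        modes[vehicle] = "active" if fraction >= min_fraction else "inactive"
    return modes

fleet = {
    "vehicle_100":  {"lidar": True, "camera": True, "tires": True, "lights": True},
    "vehicle_100A": {"lidar": True, "camera": False, "tires": False, "lights": True},
}
print(designate_fleet(fleet))  # vehicle_100 active, vehicle_100A inactive
```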
  • FIG. 10 shows an example flow diagram 1000 of an example method for performing a plurality of performance checks.
  • the example method may be performed by one or more processors, such as one or more processors 120 of computing device 110 .
  • processors 120 of computing device 110 may receive data and make various determinations as shown in flow diagram 1000 , and control the vehicle 100 based on these determinations.
  • a plurality of performance checks are identified, including a first check for a detection system of a plurality of detection systems of the vehicle and a second check for map data.
  • a plurality of road segments are selected based on a location of the vehicle and the plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks.
  • a test route is determined for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments. For example, a plurality of road segments and a test route may be determined as described in relation to FIG. 6 .
  • the vehicle is controlled along the test route in an autonomous driving mode.
  • sensor data are received from the plurality of detection systems of a vehicle. For example, sensor data collected on a test route may be received by computing device 110 as described in relation to FIG. 7 .
  • the plurality of performance checks are performed based on the received sensor data. For example, one or more sensor checks may be performed by comparing the collected sensor data with stored previous sensor data. For another example, one or more map checks may be performed by comparing the collected sensor data with map data.
  • an operation mode is selected from a plurality of operation modes for the vehicle based on results of the plurality of performance checks. For example, the driving mode may be selected based on the results meeting a threshold number or percentage of performance checks.
  • the vehicle is operated in the selected operation mode. For example, operating in the selected operation mode may include using corrections to sensor data or updates to map data.
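The blocks of flow diagram 1000 can be wired together as a sketch pipeline with every stage stubbed out; all stage implementations below are placeholders standing in for the vehicle's actual systems, not real interfaces.

```python
# Sketch: the end-to-end flow of diagram 1000 as a pipeline of stages.

def run_check_cycle(location, checks,
                    select_segments, build_route, drive_and_collect,
                    perform_checks, select_mode):
    segments = select_segments(location, checks)   # select road segments
    route = build_route(location, segments)        # determine the test route
    sensor_data = drive_and_collect(route)         # drive route, collect data
    results = perform_checks(checks, sensor_data)  # perform the checks
    return select_mode(results)                    # select an operation mode

# Toy stubs wiring the pipeline together:
mode = run_check_cycle(
    location=(0.0, 0.0),
    checks=["sensor", "map"],
    select_segments=lambda loc, cks: ["segment_a", "segment_b"],
    build_route=lambda loc, segs: [loc] + segs,
    drive_and_collect=lambda route: {"stop_sign_260": (4.0, 4.0)},
    perform_checks=lambda cks, data: {c: True for c in cks},
    select_mode=lambda res: "active" if all(res.values()) else "inactive",
)
print(mode)  # "active"
```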

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Electromagnetism (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

Aspects of the disclosure provide for a method for performing checks for a vehicle. In this regard, a plurality of performance checks may be identified including a first check for a detection system of a plurality of detection systems of the vehicle and a second check for map data. A test route for the vehicle may be determined based on a location of the vehicle and the plurality of performance checks. The vehicle may be controlled along the test route in an autonomous driving mode, while sensor data may be received from the plurality of detection systems of the vehicle. An operation mode may be selected based on results of the plurality of performance checks, and the vehicle may be operated in the selected operation mode.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 16/219,386, filed Dec. 13, 2018, the disclosure of which is hereby incorporated herein by reference.
  • BACKGROUND
  • Autonomous vehicles, such as vehicles which do not require a human driver when operating in an autonomous driving mode, may be used to aid in the transport of passengers or items from one location to another. An important component of an autonomous vehicle is the perception system, which allows the vehicle to perceive and interpret its surroundings using cameras, radar, sensors, and other similar devices. The perception system executes numerous tasks while the autonomous vehicle is in motion, which ultimately leads to decisions, such as speeding up, slowing down, stopping, turning, etc. The perception system may include a plurality of detection systems, such as cameras, sensors, and global positioning devices, which gathers and interprets images and sensor data about its surrounding environment, e.g., parked cars, trees, buildings, etc.
  • SUMMARY
  • Aspects of the disclosure provide for a system comprising one or more computing devices configured to identify a plurality of performance checks including a first check for a detection system of a plurality of detection systems of the vehicle and a second check for map data; select a plurality of road segments based on a location of the vehicle and the plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks; determine a test route for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments; control the vehicle along the test route in an autonomous driving mode; while controlling the vehicle, receive sensor data from the plurality of detection systems of the vehicle; perform the plurality of performance checks based on the received sensor data; select an operation mode from a plurality of operation modes for the vehicle based on results of the plurality of performance checks; and operate the vehicle in the selected operation mode.
  • The plurality of road segments may include a first road segment, wherein one or more of the plurality of performance checks may be performed using one or more traffic features or stationary objects that are detectable along the first road segment. The plurality of road segments may include a second road segment on which a maneuver required for one or more of the plurality of performance checks can be performed.
  • The first check may include comparing characteristics of a detected traffic feature with previously detected characteristics of the traffic feature. The second check may include comparing a location of a detected traffic feature with a location of the detected traffic feature in the map data.
  • The plurality of performance checks may further include a third check for a component of the vehicle, the third check may include comparing one or more measurements related to the component of the vehicle with a threshold measurement.
  • The operation mode may be selected based on the results satisfying a threshold number of the plurality of performance checks. The operation mode may be selected based on the results satisfying one or more set of performance checks of the plurality of performance checks.
  • The one or more computing devices may be further configured to determine one or more corrections to at least one of the detection systems based on the results of the plurality of performance checks. Operating in the selected operation mode may include using the one or more corrections.
  • The one or more computing devices may be further configured to update the map data based on the results of the plurality of performance checks. Operating in the selected operation mode may include using the updated map data.
  • The selected operation mode may be an inactive mode.
  • The system may further comprise the vehicle.
  • The disclosure further provides for identifying, by one or more computing devices, a plurality of performance checks including a first check for a detection system of a plurality of detection systems of the vehicle and a second check for map data; selecting, by the one or more computing devices, a plurality of road segments based on a location of the vehicle and the plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks; determining, by the one or more computing devices, a test route for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments; controlling, by the one or more computing devices, the vehicle along the test route in an autonomous driving mode; while controlling the vehicle, receiving, by the one or more computing devices, sensor data from the plurality of detection systems of the vehicle; performing, by the one or more computing devices, the plurality of performance checks based on the received sensor data; selecting, by the one or more computing devices, an operation mode from a plurality of operation modes for the vehicle based on results of the plurality of performance checks; and operating, by the one or more computing devices, the vehicle in the selected operation mode.
  • The plurality of road segments may include a first road segment, wherein one or more of the plurality of performance checks may be performed using one or more traffic features or stationary objects that are detectable along the first road segment. The plurality of road segments may include a second road segment on which a maneuver required for one or more of the plurality of performance checks can be performed.
  • The method may further comprise determining, by the one or more computing devices, one or more corrections to at least one of the detection systems based on the results of the plurality of performance checks, wherein operating in the selected operation mode may include using the one or more corrections. The method may further comprise updating, by the one or more computing devices, the map data based on the results of the plurality of performance checks, wherein operating in the selected operation mode may include using the updated map data.
  • The plurality of performance checks may be performed at a regular interval.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a functional diagram of an example vehicle in accordance with aspects of the disclosure.
  • FIG. 2 is an example representation of map data in accordance with aspects of the disclosure.
  • FIG. 3 is an example external view of a vehicle in accordance with aspects of the disclosure.
  • FIG. 4 is an example pictorial diagram of a system in accordance with aspects of the disclosure.
  • FIG. 5 is an example functional diagram of a system in accordance with aspects of the disclosure.
  • FIG. 6 is an example situation in accordance with aspects of the disclosure.
  • FIG. 7 shows examples of collected sensor data in accordance with aspects of the disclosure.
  • FIG. 8 shows examples of collected component data in accordance with aspects of the disclosure.
  • FIG. 9 shows another example situation in accordance with aspects of the disclosure.
  • FIG. 10 is an example flow diagram in accordance with aspects of the disclosure.
  • DETAILED DESCRIPTION Overview
  • The technology relates to performance checks for a vehicle to be performed after a full calibration, prior to operation, or at regular intervals. Before operating a vehicle on the road, a human driver may check various systems and components of the vehicle, such as making sure that the mirrors are adjusted, that the GPS system is functioning, and components such as steering wheel, brake, and signal lights, are responsive. Likewise, various systems of an autonomous vehicle also need to be checked before operation, particularly when the vehicle is to be operated in an autonomous mode, where a human driver may not be present to notice problems with the vehicle's systems. For instance, even if the vehicle had been fully calibrated in the past, a sensor in a perception system of the vehicle might have been moved during previous operation such as by another road user or a cleaner, or have been damaged by environmental factors such as temperature, humidity, etc.
  • As such, a plurality of performance checks may be performed on the vehicle including, for example, a sensor check, a map check, and/or a component check. The sensor check may include determining a level of function of a given sensor or detection system, such as detection accuracy, detection resolution, field of view, etc. The map check may include determining an accuracy of the map data in relation to a given geographic area. The component check may include determining a level of function of a given component, such as tire pressure, tire alignment, etc. The results of the plurality of performance checks may be used to determine what functions of the vehicle are within set guidelines, such as for safety and comfort. The results may also be used to designate or clear the vehicle for particular modes of operation.
  • To perform the plurality of performance checks, one or more computing devices may determine a test route based on the location of the vehicle, the map data, and the plurality of performance checks for the plurality of systems of the vehicle. The test route need not include a designated depot or testing center, or be a closed route.
  • The vehicle's computing devices may navigate the vehicle along the test route using the one or more components and collect data using the plurality of detection systems. Collecting the data may include using a detection system of the plurality of detection system to detect one or more traffic features or stationary objects along the test route. In addition, collecting the data may include detecting one or more measurements related to a component of the vehicle.
  • During the test route or after the vehicle completes the test route, the vehicle's computing devices may perform the plurality of performance checks by analyzing collected data. For a sensor check, characteristics of a detected traffic feature (such as location, orientation, shape, color, reflectivity, etc.) may be compared with previously detected or stored characteristics of the traffic feature. For a map check, a location or orientation of a detected traffic feature may be compared with the location or orientation of a previously detected or stored traffic feature in map data of the vehicle. For a component check, the one or more measurements related to a component of the vehicle may be compared with a threshold measurement.
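  • The comparisons described above can be sketched in code. The following is a minimal illustrative sketch, not the disclosed implementation: the tolerance values, dictionary layout, and function name are assumptions chosen for demonstration, and real checks would use sensor-specific and feature-specific criteria.

```python
import math

# Hypothetical tolerances; actual values would be set per sensor and per check.
POSITION_TOL_M = 0.5       # allowed position error, in meters
ORIENTATION_TOL_DEG = 2.0  # allowed orientation error, in degrees

def sensor_check(detected, stored):
    """Compare a freshly detected traffic feature against a stored reference
    detection taken from the same vehicle pose (e.g., a known stop sign)."""
    dx = detected["x"] - stored["x"]
    dy = detected["y"] - stored["y"]
    position_ok = math.hypot(dx, dy) <= POSITION_TOL_M
    orientation_ok = (abs(detected["angle_deg"] - stored["angle_deg"])
                      <= ORIENTATION_TOL_DEG)
    return position_ok and orientation_ok

# Stored reference detection and a fresh detection (values illustrative only).
stored = {"x": 12.0, "y": 7.0, "angle_deg": 30.0}
fresh = {"x": 12.2, "y": 7.1, "angle_deg": 30.8}
print(sensor_check(fresh, stored))  # a small offset within tolerance passes
```

A map check could reuse the same comparison with the stored values drawn from map data rather than from a prior detection.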
  • Based on results from the plurality of performance checks, such as based on which performance checks have been satisfied, the vehicle's computing devices may select an operation mode for operating the vehicle. Operation modes may include, for example, task designations (passenger or non-passenger tasks), or limits on speeds, distance, or geographic area. Operation modes may also include an inactive mode, for example if the vehicle is not cleared for any other mode. In some implementations, modes may be selected for a plurality of vehicles by a remote system, such as a fleet management system. The vehicle may then be operated by the vehicle's computing devices in a particular mode based on the plurality of performance checks.
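  • As a rough sketch of how check results might map to operation modes, consider the following. The mode names, the check names, and the decision rules are illustrative assumptions; the disclosure does not prescribe a specific mapping.

```python
# Illustrative mapping from performance-check results to an operation mode.
# Mode and check names are assumptions, not taken from the disclosure.
def select_operation_mode(results):
    """results: dict mapping check name -> bool (True if the check passed)."""
    if all(results.values()):
        return "passenger"      # cleared for passenger tasks
    if results.get("sensor") and results.get("map"):
        return "non_passenger"  # e.g., non-passenger tasks only
    if results.get("component"):
        return "limited"        # e.g., limits on speed, distance, or area
    return "inactive"           # not cleared for any other mode

print(select_operation_mode({"sensor": True, "map": True, "component": True}))
```

A remote fleet management system could apply the same kind of mapping across the results reported by each vehicle in a fleet.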
  • The features described above may allow autonomous vehicles to be quickly and properly prepared for operation. Quicker preparation means vehicles may be sent to users in a more timely fashion, even as demand fluctuates. As a result, users of autonomous vehicles may be able to be picked up in a timely manner. In addition, fewer resources, such as fuel, need be used in the preparation of the autonomous vehicle for service, which may reduce overall costs. The features also allow for management of an entire fleet of autonomous vehicles designated for a plurality of modes that may service users more efficiently and safely.
  • EXAMPLE SYSTEMS
  • As shown in FIG. 1, a vehicle 100 in accordance with one aspect of the disclosure includes various components. While certain aspects of the disclosure are particularly useful in connection with specific types of vehicles, the vehicle may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, buses, recreational vehicles, etc. The vehicle may have one or more computing devices, such as computing device 110 containing one or more processors 120, memory 130 and other components typically present in general purpose computing devices.
  • The memory 130 stores information accessible by the one or more processors 120, including instructions 132 and data 134 that may be executed or otherwise used by the processor 120. The memory 130 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
  • The instructions 132 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
  • The data 134 may be retrieved, stored or modified by processor 120 in accordance with the instructions 132. As an example, data 134 of memory 130 may store predefined scenarios. A given scenario may identify a set of scenario requirements including a type of object, a range of locations of the object relative to the vehicle, as well as other factors such as whether the autonomous vehicle is able to maneuver around the object, whether the object is using a turn signal, the condition of a traffic light relevant to the current location of the object, whether the object is approaching a stop sign, etc. The requirements may include discrete values, such as “right turn signal is on” or “in a right turn only lane”, or ranges of values such as “having a heading that is oriented at an angle that is 30 to 60 degrees offset from a current path of vehicle 100.” In some examples, the predetermined scenarios may include similar information for multiple objects.
  • The one or more processors 120 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor. Although FIG. 1 functionally illustrates the processor, memory, and other elements of computing device 110 as being within the same block, it will be understood by those of ordinary skill in the art that the processor, computing device, or memory may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing. As an example, internal electronic display 152 may be controlled by a dedicated computing device having its own processor or central processing unit (CPU), memory, etc. which may interface with the computing device 110 via a high-bandwidth or other network connection. In some examples, this computing device may be a user interface computing device which can communicate with a user's client device. Similarly, the memory may be a hard drive or other storage media located in a housing different from that of computing device 110. Accordingly, references to a processor or computing device will be understood to include references to a collection of processors or computing devices or memories that may or may not operate in parallel.
  • Computing device 110 may have all of the components normally used in connection with a computing device such as the processor and memory described above as well as a user input 150 (e.g., a mouse, keyboard, touch screen and/or microphone) and various electronic displays (e.g., a monitor having a screen or any other electrical device that is operable to display information). In this example, the vehicle includes an internal electronic display 152 as well as one or more speakers 154 to provide information or audio visual experiences. In this regard, internal electronic display 152 may be located within a cabin of vehicle 100 and may be used by computing device 110 to provide information to passengers within the vehicle 100. In addition to internal speakers, the one or more speakers 154 may include external speakers that are arranged at various locations on the vehicle in order to provide audible notifications to objects external to the vehicle 100.
  • In one example, computing device 110 may be an autonomous driving computing system incorporated into vehicle 100. The autonomous driving computing system may be capable of communicating with various components of the vehicle. For example, computing device 110 may be in communication with various systems of vehicle 100, such as deceleration system 160 (for controlling braking of the vehicle), acceleration system 162 (for controlling acceleration of the vehicle), steering system 164 (for controlling the orientation of the wheels and direction of the vehicle), signaling system 166 (for controlling turn signals), navigation system 168 (for navigating the vehicle to a location or around objects), positioning system 170 (for determining the position of the vehicle), perception system 172 (for detecting objects in the vehicle's environment), and power system 174 (for example, a battery and/or gas or diesel powered engine) in order to control the movement, speed, etc. of vehicle 100 in accordance with the instructions 132 of memory 130 in an autonomous driving mode which does not require continuous or periodic input from a passenger of the vehicle. Again, although these systems are shown as external to computing device 110, in actuality, these systems may also be incorporated into computing device 110, again as an autonomous driving computing system for controlling vehicle 100.
  • The computing device 110 may control the direction and speed of the vehicle by controlling various components. By way of example, computing device 110 may navigate the vehicle to a drop-off location completely autonomously using data from the map data and navigation system 168. Computing devices 110 may use the positioning system 170 to determine the vehicle's location and perception system 172 to detect and respond to objects when needed to reach the location safely. In order to do so, computing devices 110 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 162), decelerate (e.g., by decreasing the fuel supplied to the engine, changing gears, and/or by applying brakes by deceleration system 160), change direction (e.g., by turning the front or rear wheels of vehicle 100 by steering system 164), and signal such changes (e.g., by lighting turn signals of signaling system 166). Thus, the acceleration system 162 and deceleration system 160 may be a part of a drivetrain that includes various components between an engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, computing devices 110 may also control the drivetrain of the vehicle in order to maneuver the vehicle autonomously.
  • As an example, computing device 110 may interact with deceleration system 160 and acceleration system 162 in order to control the speed of the vehicle. Similarly, steering system 164 may be used by computing device 110 in order to control the direction of vehicle 100. For example, if vehicle 100 is configured for use on a road, such as a car or truck, the steering system may include components to control the angle of wheels to turn the vehicle. Signaling system 166 may be used by computing device 110 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.
  • Navigation system 168 may be used by computing device 110 in order to determine and follow a route to a location. In this regard, the navigation system 168 and/or data 134 may store map data, e.g., highly detailed maps that computing devices 110 can use to navigate or control the vehicle. As an example, these maps may identify the shape and elevation of roadways, lane markers, intersections, crosswalks, speed limits, traffic signal lights, buildings, signs, real time or historical traffic information, vegetation, or other such objects and information. The lane markers may include features such as solid or broken double or single lane lines, solid or broken lane lines, reflectors, etc. A given lane may be associated with left and right lane lines or other lane markers that define the boundary of the lane. Thus, most lanes may be bounded by a left edge of one lane line and a right edge of another lane line. As noted above, the map data may store known traffic or congestion information and/or transit schedules (train, bus, etc.) from a particular pickup location at similar times in the past. This information may even be updated in real time by information received by the computing devices 110.
  • FIG. 2 is an example of map data 200. As shown, the map data 200 includes the shape, location, and other characteristics of road 210, road 220, road 230, road 240, and road 250. Map data 200 may include lane markers or lane lines, such as lane line 211 for road 210. The lane lines may also define various lanes, for example lane line 211 defines lanes 212, 214 of road 210. As an alternative to lane lines or markers, lanes may also be inferred by the width of a road, such as for roads 220, 230, 240, 250. The map data 200 may also include information that identifies the direction of traffic and speed limits for each lane as well as information that allows the computing devices 110 to determine whether the vehicle has the right of way to complete a particular type of maneuver (i.e. complete a turn, cross a lane of traffic or intersection, etc.).
  • Map data 200 may also include relationship information between roads 210, 220, 230, 240, and 250. For example, map data 200 may indicate that road 210 intersects road 220 at intersection 219, that road 220 intersects road 230 at intersection 229, that roads 230, 240, and 250 intersect at intersection 239, and that road 250 intersects road 210 at intersection 259.
  • Map data 200 may further include signs and markings on the roads with various characteristics and different semantic meanings. As shown, map data 200 includes traffic light 216 for road 210 and pedestrian crossing 218 across road 210. Map data 200 also includes stop sign 260. The map data 200 may additionally include other features such as curbs, waterways, vegetation, etc.
  • In addition, map data 200 may include various buildings or structures (such as points of interest) and the type of these buildings or structures. As shown, map data 200 depicts building 270 on road 210. For example, map data 200 may indicate that the type of building 270 is an airport, train station, stadium, school, church, hospital, apartment building, house, etc. In this regard, the type of the building 270 may be collected from administrative records, such as county records, or manually labeled by a human operator after reviewing aerial images. Map data 200 may include additional information on building 270, such as the locations of entrances and/or exits.
  • Map data 200 may also store predetermined stopping areas, such as a parking lot 280. In this regard, such areas may be hand-selected by a human operator or learned by a computing device over time. Map data 200 may include additional information about the stopping areas, such as the location of entrance 282 and exit 284 of parking lot 280, and that entrance 282 connects to road 240, while exit 284 connects to roads 230 and 250.
  • In some examples, map data 200 may further include zoning information. For instance, the zoning information may be obtained from administrative records, such as county records. As such, information on the roads may include indication that it is within a residential zone, a school zone, a commercial zone, etc.
  • The map data may further include location coordinates (examples of which are shown in FIG. 7), such as GPS coordinates of the roads 210, 220, 230, 240, and 250, intersections 219, 229, 239, and 259, lane line 211, lanes 212 and 214, traffic light 216, pedestrian crossing 218, stop sign 260, building 270 and its entrance 272, parking lot 280 and its entrance 282 and exit 284.
  • Although the detailed map data is depicted herein as an image-based map, the map data need not be entirely image based (for example, raster). For example, the detailed map data may include one or more roadgraphs or graph networks of information such as roads, lanes, intersections, and the connections between these features. Each feature may be stored as graph data and may be associated with information such as a geographic location and whether or not it is linked to other related features, for example, a stop sign may be linked to a road and an intersection, etc. In some examples, the associated data may include grid-based indices of a roadgraph to allow for efficient lookup of certain roadgraph features.
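  • The roadgraph organization described above can be sketched as a small graph data structure. This is an illustrative sketch only: the class name, feature identifiers, grid cell size, and coordinate scheme are assumptions for demonstration, not details of the disclosed map format.

```python
# A minimal roadgraph sketch: features stored as graph nodes with links to
# related features, plus a coarse grid index for efficient spatial lookup.
class Roadgraph:
    def __init__(self, cell_size=100.0):
        self.features = {}   # feature id -> {"type", "location", "links"}
        self.cell_size = cell_size
        self.grid = {}       # (col, row) -> set of feature ids in that cell

    def add(self, fid, ftype, location, links=()):
        self.features[fid] = {"type": ftype, "location": location,
                              "links": set(links)}
        x, y = location
        cell = (int(x // self.cell_size), int(y // self.cell_size))
        self.grid.setdefault(cell, set()).add(fid)

    def lookup(self, location):
        """Return ids of features indexed in the grid cell containing location."""
        x, y = location
        cell = (int(x // self.cell_size), int(y // self.cell_size))
        return self.grid.get(cell, set())

rg = Roadgraph()
rg.add("road_230", "road", (150.0, 40.0))
rg.add("intersection_239", "intersection", (180.0, 60.0), links=["road_230"])
# A stop sign linked to both the road and the intersection, as in the text:
rg.add("stop_sign_260", "stop_sign", (178.0, 55.0),
       links=["road_230", "intersection_239"])
print(rg.lookup((178.0, 55.0)))
```

The grid index lets the vehicle retrieve nearby map features without scanning the entire graph.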
  • The perception system 172 also includes one or more components for detecting objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc. For example, the perception system 172 may include one or more LIDAR sensor(s) 180, camera sensor(s) 182, and RADAR sensor(s) 184. The perception system 172 may include other sensors, such as SONAR device(s), gyroscope(s), accelerometer(s), and/or any other detection devices that record data which may be processed by computing devices 110. The sensors of the perception system may detect objects and their characteristics such as location, orientation, size, shape, type (for instance, vehicle, pedestrian, bicyclist, etc.), heading, and speed of movement, etc. The raw data from the sensors and/or the aforementioned characteristics can be quantified or arranged into a descriptive function, vector, and/or bounding box and sent for further processing to the computing devices 110 periodically or continuously as it is generated by the perception system 172. As discussed in further detail below, computing devices 110 may use the positioning system 170 to determine the vehicle's location and perception system 172 to detect and respond to objects when needed to reach the location safely.
  • For instance, FIG. 3 is an example external view of vehicle 100. In this example, roof-top housing 310 and dome housing 312 may include a LIDAR sensor as well as various cameras and RADAR units. In addition, housing 320 located at the front end of vehicle 100 and housings 330, 332 on the driver's and passenger's sides of the vehicle may each store a LIDAR sensor. For example, housing 330 is located in front of driver door 350. Vehicle 100 also includes housings 340, 342 for RADAR units and/or cameras also located on the roof of vehicle 100. Additional RADAR units and cameras (not shown) may be located at the front and rear ends of vehicle 100 and/or on other positions along the roof or roof-top housing 310. Vehicle 100 also includes many features of a typical passenger vehicle such as doors 350, 352, wheels 360, 362, etc.
  • Once a nearby object is detected, computing devices 110 and/or perception system 172 may determine the object's type, for example, a traffic cone, pedestrian, a vehicle (such as a passenger car, truck, bus, etc.), bicycle, etc. Objects may be identified by various models which may consider various characteristics of the detected objects, such as the size of an object, the speed of the object (bicycles do not tend to go faster than 40 miles per hour or slower than 0.1 miles per hour), the heat coming from the bicycle (bicycles tend to have riders who emit heat from their bodies), etc. In addition, the object may be classified based on specific attributes of the object, such as information contained on a license plate, bumper sticker, or logos that appear on the vehicle.
  • For instance, sensor data (examples of which are shown in FIG. 7) collected by one or more sensors of the perception system 172 may be stored in data of computing device 110 of vehicle 100. Referring to FIG. 2, vehicle 100 may have driven past stop sign 260 in the past, and have stored the values of stop sign 260 detected by LIDAR sensor(s) 180 in data 134 of memory 130. In this example, the detected values may include, for example, that when vehicle 100 was at location [x_b, y_b] (which may for example correspond to driving on road 230 towards intersection 239, 10 m before reaching intersection 239), the stop sign 260 was detected to be at location [x4, y4] and at a 30° angle from a front of vehicle 100 (which may for example correspond to when vehicle 100 is 8 m away from stop sign 260 on road 230 driving towards intersection 239). As described in detail below with respect to the example methods, this stored sensor data may be used for performance checks on the various systems of the vehicle 100. In other examples, sensor data collected by the perception system of a reference vehicle may be stored in computing device 110 of vehicle 100. In other examples, such sensor data may be stored remotely on a server or a storage system.
  • Computing device 110 may further store threshold values (some of which are shown in FIG. 8) for various components of vehicle 100. For example, computing device 110 may store a threshold minimum tire pressure for tires of vehicle 100. For another example, computing device 110 may store threshold alignment angles for tires of vehicle 100. For yet another example, computing device 110 may store a threshold stopping distance at a particular speed for a brake of vehicle 100.
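  • A component check against such stored thresholds can be sketched as follows. The threshold values, measurement names, and failure labels here are illustrative assumptions; actual thresholds would be set per vehicle and per component.

```python
# Illustrative stored thresholds for components of the vehicle.
THRESHOLDS = {
    "min_tire_pressure_psi": 32.0,
    "max_alignment_angle_deg": 1.5,
    "max_stopping_distance_m": 20.0,  # at a particular test speed
}

def component_check(measurements):
    """Compare measured component values against stored thresholds and
    return the list of components that fail (empty list means all pass)."""
    failures = []
    if measurements["tire_pressure_psi"] < THRESHOLDS["min_tire_pressure_psi"]:
        failures.append("tire_pressure")
    if abs(measurements["alignment_angle_deg"]) > THRESHOLDS["max_alignment_angle_deg"]:
        failures.append("alignment")
    if measurements["stopping_distance_m"] > THRESHOLDS["max_stopping_distance_m"]:
        failures.append("brakes")
    return failures

print(component_check({"tire_pressure_psi": 34.0,
                       "alignment_angle_deg": 0.5,
                       "stopping_distance_m": 15.0}))
```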
  • The one or more computing devices 110 of vehicle 100 may also receive or transfer information to and from other computing devices, for instance using wireless network connections 156. The wireless network connections may include, for instance, BLUETOOTH®, Bluetooth LE, LTE, cellular, near field communications, etc. and various combinations of the foregoing. FIGS. 4 and 5 are pictorial and functional diagrams, respectively, of an example system 400 that includes a plurality of computing devices 410, 420, 430, 440 and a storage system 450 connected via a network 460. System 400 also includes vehicle 100, and vehicle 100A which may be configured similarly to vehicle 100. Although only a few vehicles and computing devices are depicted for simplicity, a typical system may include significantly more.
  • As shown in FIG. 4, each of computing devices 410, 420, 430, 440 may include one or more processors, memory, data and instructions. Such processors, memories, data and instructions may be configured similarly to one or more processors 120, memory 130, data 134, and instructions 132 of computing device 110.
  • The network 460, and intervening nodes, may include various configurations and protocols including short range communication protocols such as BLUETOOTH®, Bluetooth LE, the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. Such communication may be facilitated by any device capable of transmitting data to and from other computing devices, such as modems and wireless interfaces.
  • In one example, one or more computing devices 410 may include a server having a plurality of computing devices, e.g., a load balanced server farm, that exchange information with different nodes of a network for the purpose of receiving, processing and transmitting the data to and from other computing devices. For instance, one or more computing devices 410 may include one or more server computing devices that are capable of communicating with one or more computing devices 110 of vehicle 100 or a similar computing device of vehicle 100A as well as client computing devices 420, 430, 440 via the network 460. For example, vehicles 100 and 100A may be a part of a fleet of vehicles that can be dispatched by server computing devices to various locations. In this regard, the vehicles of the fleet may periodically send the server computing devices location information provided by the vehicle's respective positioning systems and the one or more server computing devices may track the locations of the vehicles.
  • As mentioned above, rather than saving sensor data detecting various traffic features on computing device 110, such sensor data may additionally or alternatively be stored on server computing device 410. Similarly, threshold values for components of vehicle 100 may be stored on server computing device 410.
  • In addition, server computing devices 410 may use network 460 to transmit and present information to a user, such as user 422, 432, 442 on a display, such as displays 424, 434, 444 of computing devices 420, 430, 440. In this regard, computing devices 420, 430, 440 may be considered client computing devices.
  • As shown in FIG. 5, each client computing device 420, 430, 440 may be a personal computing device intended for use by a user 422, 432, 442, and have all of the components normally used in connection with a personal computing device including one or more processors (e.g., a central processing unit (CPU)), memory (e.g., RAM and internal hard drives) storing data and instructions, a display such as displays 424, 434, 444 (e.g., a monitor having a screen, a touch-screen, a projector, a television, or other device that is operable to display information), and user input devices 426, 436, 446 (e.g., a mouse, keyboard, touchscreen or microphone). A user, such as user 422, 432, 442, may send information, such as pickup or drop-off requests, to server computing devices 410, using user input devices 426, 436, 446 of computing devices 420, 430, 440. The client computing devices may also include a camera for recording video streams, speakers, a network interface device, and all of the components used for connecting these elements to one another.
  • Although the client computing devices 420, 430, and 440 may each comprise a full-sized personal computing device, they may alternatively comprise mobile computing devices capable of wirelessly exchanging data with a server over a network such as the Internet. By way of example only, client computing device 420 may be a mobile phone or a device such as a wireless-enabled PDA, a tablet PC, a wearable computing device or system, or a netbook that is capable of obtaining information via the Internet or other networks. In another example, client computing device 430 may be a wearable computing system, shown as a wrist watch in FIG. 4. As an example the user may input information using a small keyboard, a keypad, microphone, using visual signals with a camera, or a touch screen.
  • In some examples, client computing device 440 may be a remote operator work station used by an administrator to provide remote operator services to users such as users 422 and 432. For example, a remote operator 442 may use the remote operator work station 440 to communicate via a telephone call or audio connection with users through their respective client computing devices and/or vehicles 100 or 100A in order to ensure the safe operation of vehicles 100 and 100A and the safety of the users as described in further detail below. Although only a single remote operator work station 440 is shown in FIGS. 4 and 5, any number of such work stations may be included in a typical system.
  • Storage system 450 may store various types of information as described in more detail below. This information may be retrieved or otherwise accessed by a server computing device, such as one or more server computing devices 410, in order to perform some or all of the features described herein. For example, the information may include user account information such as credentials (e.g., a username and password as in the case of a traditional single-factor authentication as well as other types of credentials typically used in multi-factor authentications such as random identifiers, biometrics, etc.) that can be used to identify a user to the one or more server computing devices. The user account information may also include personal information such as the user's name, contact information, identifying information of the user's client computing device (or devices if multiple devices are used with the same user account), as well as age information, health information, and user history information about how long it has taken the user to enter or exit vehicles in the past as discussed below.
  • The storage system 450 may also store routing data for generating and evaluating routes between locations. For example, the routing information may be used to estimate how long it would take a vehicle at a first location to reach a second location. In this regard, the routing information may include map data, not necessarily as particular as the detailed map data 200 described above, but including roads, as well as information about those roads, such as direction (one way, two way, etc.), orientation (North, South, etc.), speed limits, as well as traffic information identifying expected traffic conditions, etc.
  • As mentioned above, rather than saving sensor data detecting various traffic features on computing device 110 or server computing device 410, such sensor data may additionally or alternatively be stored on storage system 450. Similarly, threshold values for components of vehicle 100 may be stored on storage system 450.
  • The storage system 450 may also store information which can be provided to client computing devices for display to a user. For instance, the storage system 450 may store predetermined distance information for determining an area at which a vehicle is likely to stop for a given pickup or drop-off location. The storage system 450 may also store graphics, icons, and other items which may be displayed to a user as discussed below.
  • As with memory 130, storage system 450 can be of any type of computerized storage capable of storing information accessible by the server computing devices 410, such as a hard-drive, memory card, ROM, RAM, DVD, CD-ROM, write-capable, and read-only memories. In addition, storage system 450 may include a distributed storage system where data is stored on a plurality of different storage devices which may be physically located at the same or different geographic locations. Storage system 450 may be connected to the computing devices via the network 460 as shown in FIG. 4 and/or may be directly connected to or incorporated into any of the computing devices 110, 410, 420, 430, 440, etc.
  • EXAMPLE METHODS
  • In addition to the systems described above and illustrated in the figures, various operations will now be described. It should be understood that the following operations do not have to be performed in the precise order described below. Rather, various steps can be handled in a different order or simultaneously, and steps may also be added or omitted.
  • FIG. 6 illustrates an example situation 600 for performing a plurality of performance checks on vehicle 100. Various features in FIG. 6 may generally correspond to the shape, location, and other characteristics of features shown in map data 200 of FIG. 2, and labeled as such. Additional features in FIG. 6, including various road users and other objects, are described in detail below. Although these examples are useful for demonstration purposes, they should not be considered limiting.
  • As shown in FIG. 6, vehicle 100 is currently parked curbside in lane 212 of road 210. To ensure safe operation on the road, vehicle 100 may need to perform performance checks on its systems. In this regard, vehicle 100 may be scheduled to perform the performance checks on its systems on a regular basis, such as every day or week, every predetermined number of kilometers traveled or number of trips completed, or at some frequency mandated by law. For example, during a previous day, vehicle 100 might have completed a number of trips, and upon completing these trips, vehicle 100 has parked roadside by the curb in lane 212. On the current day, before going on more trips, vehicle 100 may first perform a plurality of performance checks on its various systems. In one instance, some types of performance checks may be performed at higher frequency than other performance checks. In another instance, the frequency of the performance checks may depend on the type of vehicle.
  • In order to perform the plurality of performance checks on the vehicle, a test route may be determined. In this regard, computing device 110 may determine the test route based on a location of the vehicle, map data, and the types of performance checks to be performed. For instance, computing device 110 may determine that vehicle 100 is currently parked by the curb in lane 212 near intersection 219, and determine, based on map data 200, a test route nearby so that vehicle 100 does not need to drive to a designated depot or testing center just to perform these tests. This ensures that the performance checks are performed as soon as possible, instead of risking operating the vehicle 100 on a long drive to the designated testing center, and ensures a more efficient use of resources, including fuel.
  • Computing device 110 may determine the test route further based on the types of performance checks that need to be performed in order to complete the plurality of performance checks. For instance, a list of required performance checks, including the type of each required performance check and frequency for each required performance check, may be stored on computing device 110. Additionally or alternatively, the list of required performance checks may be stored on server computing device 410 and/or storage system 450 accessible by computing device 110. For example, the list of required performance checks may include items such as “perform a sensor check using a stored traffic light detection at least once per 24 hours,” “perform a map check using a stop sign stored in map data at least once per month,” “perform a component check on all four tires at least once per week,” etc. In one aspect, computing device 110 may select a plurality of segments, such as a plurality of road segments, using map data and stored sensor data, where each of the plurality of road segments is selected for performing one or more of the required performance checks. Computing device 110 may then connect the plurality of road segments, and connect the location of the vehicle to one of the plurality of road segments to determine a test route in order to allow the vehicle to perform the plurality of performance checks.
  • For instance, for a sensor check, computing device 110 may select a segment for a test route so that sensor data can be collected to compare with stored values of previous detections of traffic features or stationary objects. For example, as shown in FIG. 6, computing device 110 may determine that traffic light 216 and pedestrian crossing 218 nearby vehicle 100 are stored as previously detected traffic features near vehicle 100. As such, a sensor check on one or more detection systems may be performed by comparing newly detected values to these stored values. Therefore, computing device 110 may determine that segment 610 beginning at the current location of vehicle 100 and a right turn from lane 212 to road 220 may be used for performing the sensor check.
  • For a map check, computing device 110 may select a segment for a test route so that sensor data can be collected to compare with stored locations and/or orientations of traffic features or stationary objects in map data. For example, as shown in FIG. 6, computing device 110 may determine that stop sign 260 is stored in map data 200 as a traffic feature. As such, a map check on the map data 200 stored in the navigation system may be performed by comparing the newly detected location and orientation of the stop sign 260 with the stored location and/or orientation of the stop sign 260 in map data 200. Therefore, computing device 110 may determine that segment 640 including a portion of road 230 near stop sign 260 may be used for performing the map check.
  • For a component check, computing device 110 may select a segment for a test route where a particular vehicle maneuver may be performed. For example, as shown in FIG. 6, computing device 110 may determine that a left turn may be performed at intersection 229. As such, a component check on the brake, wheel alignment, and left turn signal may be performed at intersection 229 while vehicle 100 performs the left turn. Therefore, computing device 110 may determine that segment 620 including a portion of road 220 and a left turn at intersection 229 to road 230 can be used for performing the component check.
  • In some instances, computing device 110 may determine that more than one segment is needed for performing a particular type of performance check. In this regard, a human operator may manually create a list of items for a test route. Alternatively or additionally, a list of items required for a test route may be stored on computing device 110, and/or stored on server computing device 410 and/or storage system 450 accessible by computing device 110. For example, the list of items required for a test route may include items such as "five or more traffic lights on the test route," "one multipoint turn on the test route," etc. For example, as shown in FIG. 6, computing device 110 may determine that a component check for the reverse signal requires maneuvers such as back-in or parallel parking or multi-point turns. As such, computing device 110 may determine that segment 650 including parking lot 280 may be used for performing the component check on the reverse signal.
  • In other instances, the segments of test routes selected for each type of performance check may be the same or have overlapping portions, or in other words, a given segment may be used for performing multiple types of performance checks. For example, in addition to the sensor check, segment 610 may also be used for a map check, since the locations and/or orientations of traffic features such as traffic light 216 and pedestrian crossing 218 are stored in map data 200. For another example, segment 610 may also be used for a component check on the brake, wheel alignment, and right turn signal.
  • Computing device 110 may select additional segments of test route connecting the various segments selected for the particular types of performance checks. For example, computing device 110 may determine that segment 630 may be needed to connect segment 620 and segment 640. As such, an example test route may include segments 610, 620, 630, 640, and 650.
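The segment selection and connection steps described above can be sketched in code. This is a minimal illustration under stated assumptions, not the actual routing logic: the function name, segment identifiers, and the catalog structure mapping each segment to the checks it supports are all hypothetical.

```python
# Hypothetical sketch of test-route assembly: pick one road segment per
# required performance check, reuse segments that cover multiple checks,
# then prepend a connector from the vehicle's current location.

def build_test_route(vehicle_location, required_checks, segment_catalog):
    """segment_catalog: segment id -> set of supported check types
    (e.g. derived from map data and stored sensor data)."""
    route = []
    covered = set()
    for check in required_checks:
        if check in covered:
            continue  # an earlier segment already supports this check
        for segment_id, supported in segment_catalog.items():
            if check in supported and segment_id not in route:
                route.append(segment_id)
                covered.update(supported & set(required_checks))
                break
    if not route:
        return []
    # Connector ids are illustrative placeholders for the additional
    # connecting segments described in the text.
    return ["connector:%s->%s" % (vehicle_location, route[0])] + route
```

A segment supporting several checks (as segment 610 does in the example above) is selected only once, mirroring the overlap described in the text.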
  • Further, where applicable, computing device 110 may store the corresponding or associated performance checks to be performed using sensor data from each segment of the test route. For example, computing device 110 may associate segment 610 with a sensor check and a map check using traffic light 216 and pedestrian crossing 218, and a component check on wheel alignment and the right-turn signal. For another example, computing device 110 may associate segment 620 with a component check on the left-turn signal and wheel alignment. For still another example, computing device 110 may associate segment 640 with a sensor check and a map check using stop sign 260. For yet another example, computing device 110 may not associate any check with segment 630.
  • Computing device 110 may determine the test route further based on additional requirements for test routes. One example requirement may be that one or more segments of the test route must have below a threshold traffic volume. In this regard, computing device 110 may receive historical or real-time traffic data from a database. Another example requirement may be that one or more segments of the route must have a speed limit below or above a threshold speed limit. In this regard, computing device 110 may determine speed limits of various roads based on map data 200. Yet another example requirement may be that one or more segments of the route must not pass through certain areas, such as a school zone. In this regard, computing device 110 may determine zoning information based on map data 200. Still another example requirement may be that particular types of maneuvers must be performed in a parking lot. For example, as shown in FIG. 6, although multi-point turn maneuvers may also be performed on road 230, parking lot 280 may be chosen based on a requirement that multi-point turns be made in parking lots. Additional example requirements may be that the test route must include multiple distinct traffic lights, one or more cul-de-sacs for performing multipoint turns, or a stored traffic feature in a designated depot or testing center for the vehicle 100.
  • The test route need not be a closed loop. For example, as shown in FIG. 6, computing device 110 may determine to go on a next segment at the end of segment 650, instead of returning to the beginning of the test route. In other examples, the test route may be a closed loop, for example, computing device 110 may determine an additional segment connecting the end of segment 650 back to the beginning of segment 610. In examples where the test route is a closed loop, the plurality of performance checks may be repeated in order to collect more sets of sensor data, which for example may be averaged to obtain more accurate results.
  • The test route may be stored so that the test route can be used again by the vehicle at a later time to perform the aforementioned checks. For instance, the test route described above may be stored, and if vehicle 100 happens to be around the area when the plurality of performance checks need to be performed again, computing device 110 may simply use the stored test route, instead of determining a new test route. For another instance, based on which performance checks need to be performed again, computing device 110 may use some but not all of the segments of the stored test route.
  • Once the test route is determined, computing device 110 may control the vehicle 100 to drive along the test route. While doing so, the perception system 172 and/or computing devices 110 may collect data on the test route, including sensor data and component data, in order to perform the aforementioned performance checks. For example, FIG. 7 shows example sensor data 700 collected by various sensors in the perception system 172 while vehicle 100 drives along the test route shown in FIG. 6. For another example, FIG. 8 shows example component data 800 collected from various components while vehicle 100 drives along the test route shown in FIG. 6.
  • Referring to FIGS. 6 and 7, the collected sensor data may include data on permanent traffic features or stationary objects, such as traffic light 216, pedestrian crossing 218, building 270, stop sign 260, and parking lot 280. As shown in FIG. 7, the sensor data may include information such as the detected location and orientation of each traffic feature or object, as well as the location of the vehicle 100 when the sensor data on that feature or object is taken. For example, while on the test route, when vehicle 100 is at location [x_a, y_a], LIDAR sensor(s) 180 of vehicle 100 may detect traffic light 216 at location [x1, y1] and at an angle 25° from vehicle 100, and building 270 at location [x3, y3] and at an angle 10° from vehicle 100. For another example, while still on the test route, when vehicle 100 is at location [x_b, y_b], LIDAR sensor(s) 180 of vehicle 100 may detect stop sign 260 at location [x4, y4] and at an angle 25° from vehicle 100, and parking lot 280 at location [x5, y5] and at an angle 25° from vehicle 100. Locations of vehicle 100 during the test route may be determined by navigation system 168. Although not shown, the LIDAR data may further include details such as the size and shape of these features or objects.
  • The stored sensor data may include information such as the previously detected location and orientation of each traffic feature or object. In this regard, the stored sensor data for a traffic feature or object may include previous detections of the traffic feature or object by sensors of the vehicle 100 in the past. The stored sensor data for a traffic feature or object may additionally or alternatively include previous detections of the traffic feature or object made by sensors of other vehicles. Further, the stored sensor data may include the location of the vehicle taking the sensor data when the traffic feature or object was detected. The stored sensor data may be stored on computing device 110. Additionally or alternatively, the stored sensor data may be stored on server computing device 410 and/or storage system 450 accessible by computing device 110.
  • Although not shown, stored sensor data and collected sensor data may include the same type of sensor data taken by multiple sensors, such as by different LIDAR sensors mounted at different locations in or on vehicle 100. Further, stored sensor data and collected sensor data may include different types of sensor data, such as camera data. Each type of sensor data may include similar information as LIDAR data, such as the detected location and orientation of each traffic feature or object, and the location of the vehicle 100 when the sensor data on that feature or object is taken. In addition, each type of sensor data may include further details such as the size, shape, and color of these features or objects.
  • Although not shown, the collected sensor data may further include data on temporary or moving traffic features and/or objects, such as vehicle 100A, vehicle 100B, traffic cone 670 and pedestrian 680. For example, while at location [x_c, y_c] of the test route, LIDAR sensor(s) 180 of vehicle 100 may detect vehicle 100A at location [x6, y6] at a 15° angle from the front of the vehicle 100. At or around the same time, camera sensor(s) 182 and RADAR sensor 184 may also each detect vehicle 100A at location [x6, y6] at a 15° angle. For example, the camera data may further include the color of vehicle 100A, and RADAR data may further include speed of vehicle 100A.
  • Referring to FIGS. 6 and 8, component data may be collected on various components of vehicle 100. For example, as shown in FIG. 8, tire pressures may be collected for all four tires of vehicle 100. For another example, wheel alignment data may be collected on all four wheels of vehicle 100. The wheel alignment data may include camber angle, caster angle, and toe angle for each wheel. For still another example, data on the brakes of vehicle 100 may be collected. For instance, the stopping distance at a specific speed, such as 100 km/hr, may be measured. For yet another example, the responsiveness of various lights, such as the turn and reverse signals as well as the headlight, may be measured by turning them on and off.
  • During or after the vehicle completes the test route, the plurality of performance checks may be performed by analyzing the collected data. Computing devices 110 may perform the plurality of performance checks by analyzing the collected data in real time while vehicle 100 navigates through the test route, or store the collected data in memory 130 so that computing device 110 may perform the checks after completing the test route. Additionally or alternatively, the collected data may be uploaded to server computing device 410 or storage system 450 so that server computing device 410 may perform the plurality of performance checks. Having computing device 110 perform the checks may provide greater efficiency, since uploading collected data to server computing device 410 or storage system 450 may be time consuming.
  • For a sensor check, detected characteristics of a traffic feature or object collected during the test route may be compared with previously detected or stored characteristics of the traffic feature or object. A sensor may satisfy the sensor check when the characteristics collected during the test route match the previously detected or stored characteristics from the same sensor, and not satisfy the sensor check when they do not match. For instance, referring to FIG. 7, collected LIDAR data from LIDAR sensor(s) 180 may be compared to stored LIDAR values from a previous detection by LIDAR sensor(s) 180. For example, computing device 110 may determine that the detected location for each of traffic light 216, building 270, stop sign 260, and parking lot 280 is identical to the stored LIDAR values, but that the detected orientation of each is offset by a 5° angle. As such, computing device 110 may determine that LIDAR sensor(s) 180 do not satisfy the sensor check.
  • FIG. 9 shows an example situation 900 illustrating an example sensor check. Various features in FIG. 9 may generally correspond to the shape, location, and other characteristics of features shown in map data 200 of FIG. 2, and labeled as such. Additional features in FIG. 9, including various road users and other objects, are described in detail below. Although these examples are useful for demonstration purposes, they should not be considered limiting.
  • As shown in FIG. 9, while vehicle 100 is at location [x_b, y_b], LIDAR sensor(s) 180 detect stop sign 260 at location [x4, y4] and an orientation of 25° with respect to the front right corner of vehicle 100. However, the stored LIDAR values for stop sign 260 include location [x4, y4] and an orientation of 30° with respect to the front right corner of vehicle 100. This may be due to a movement of the LIDAR sensor(s) 180 from their previous position when the stored LIDAR values were taken. For example, a pedestrian might have accidentally touched the LIDAR sensor(s) 180 when passing by vehicle 100 while vehicle 100 was parked curbside in lane 212. As such, this rotation causes a −5° angle offset for all detections made by LIDAR sensor(s) 180.
  • Additionally or alternatively, computing device 110 may compare the location and/or orientation of traffic features and/or objects detected in the collected sensor data with the location and/or orientation of traffic features and/or objects stored in map data 200. For example as shown in FIG. 7, computing device 110 may compare location [x1, y1] for traffic light 216 detected in collected LIDAR data with location [x1, y1] stored in map data 200, compare location [x3, y3] for building 270 detected in collected LIDAR data with location [x3, y3] stored in map data 200, compare location [x4, y4] for stop sign 260 detected in collected LIDAR data with location [x4, y4] stored in map data 200, compare location [x5, y5] for parking lot 280 detected in collected LIDAR data with location [x5, y5] stored in map data 200, and conclude that LIDAR sensor(s) 180 pass the sensor check. In this regard, computing device 110 may compare collected sensor data with map data 200 for some or all traffic features and/or objects detected during the test route.
  • In some instances, computing device 110 may determine that a sensor may still pass a sensor test if the differences between the stored and collected sensor data are within a predetermined range. For instance, computing device 110 may determine that LIDAR sensor(s) 180 may still pass the sensor test if the difference in stored and detected orientation for a detected object is within a 10° range.
  • Computing device 110 may determine one or more corrections for one or more sensors that fail the sensor test. For example, for LIDAR sensor(s) 180, computing device 110 may determine a +5° correction for all orientation values detected by LIDAR sensor(s) 180. For instance, computing device 110 may add 5° to the detected 25° angle for stop sign 260.
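The comparison, tolerance, and correction steps above can be sketched as a single function. This is a hedged illustration: the 10° tolerance and the 25°/30° orientations come from the examples in the text, while the function name and signature are assumptions for demonstration.

```python
# Illustrative sensor check on one feature's orientation: pass when the
# detected value is within a tolerance of the stored value, and report
# the correction (the angle to add to future detections) either way.

def sensor_check(detected_deg, stored_deg, tolerance_deg=10.0):
    """Return (passed, correction) for a single detected feature."""
    correction = stored_deg - detected_deg  # e.g. 30 - 25 = +5 degrees
    passed = abs(correction) <= tolerance_deg
    return passed, correction
```

With the stop sign example, a 25° detection against a stored 30° yields a +5° correction and still passes under the 10° tolerance variant described above.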
  • Another sensor check may include comparing collected sensor data from various sensors of the same type for a detected object. For instance, if LIDAR sensor(s) 180 include multiple sensors having overlapping fields of view, computing device 110 may compare the LIDAR point cloud for traffic light 216 collected by a first sensor with the LIDAR point cloud for traffic light 216 collected by a second sensor. Computing device 110 may determine that, if the two LIDAR point clouds match substantially, such as by 90% or some other threshold, then both the first and second sensors pass the sensor check. A mismatch between the two sensors may be caused by any of a number of factors, such as damage by another road user, or environmental factors such as extreme temperature or humidity.
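As a rough illustration of this cross-sensor check, point clouds can be represented as sets of coarsely quantized points and compared by overlap ratio. The 90% threshold follows the example above; the set representation and function name are assumptions, since the actual point-cloud matching is not specified.

```python
# Sketch of a cross-sensor consistency check: two sensors with overlapping
# fields of view should report (nearly) the same quantized points for the
# same object. Pass when the overlap ratio meets the threshold.

def clouds_match(cloud_a, cloud_b, threshold=0.9):
    """cloud_a, cloud_b: sets of quantized (x, y, z) tuples."""
    if not cloud_a or not cloud_b:
        return False  # an empty cloud cannot substantiate a match
    overlap = len(cloud_a & cloud_b)
    return overlap / max(len(cloud_a), len(cloud_b)) >= threshold
```

Dividing by the larger cloud keeps the ratio symmetric and conservative: a sensor seeing far fewer points than its neighbor cannot pass simply because all its points happen to overlap.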
  • Still another sensor check may include determining a resolution or field of view captured by a sensor. For example, if collected LIDAR data for LIDAR sensor(s) 180 has a smaller field of view than the stored LIDAR data, computing device 110 may further determine that LIDAR sensor(s) 180 has failed the sensor check. In some instances, computing device 110 may determine that LIDAR sensor(s) 180 may still pass the sensor test if the difference between field of view of the collected LIDAR data during test route and field of view of the stored LIDAR data is within a predetermined threshold difference. For another example, if collected camera data for camera sensor(s) 182 has a lower resolution than the stored camera data, computing device 110 may further determine that camera sensor(s) 182 has failed the sensor check. In some instances, computing device 110 may determine that camera sensor(s) 182 may still pass the sensor test if the difference between resolution of the collected camera data during test route and resolution of stored camera data is within a predetermined threshold difference. Such changes in resolution or field of view may be caused by any of a number of factors, such as damage by another road user, or due to environmental factors such as extreme temperature or humidity.
  • Yet another sensor check may include determining whether a sensor produces unreasonable sensor data. For example, computing device 110 may determine that camera data produced by camera sensor(s) 182 are all green, and conclude that camera sensor(s) 182 fail the sensor check. For another example, computing device 110 may determine that LIDAR sensor(s) 180 produce empty point clouds, and conclude that LIDAR sensor(s) 180 fail the sensor check. Such unreasonable sensor data may be caused by any of a number of factors, such as damage by another road user, or environmental factors such as extreme temperature or humidity.
  • For another instance, for a map check, the location and/or orientation of a detected traffic feature may be compared with the location and/or orientation of that traffic feature stored in the map data of the vehicle. The map data may satisfy the map check when the locations and/or orientations of the traffic features detected during the test route match the locations and/or orientations of the traffic features stored in the map data. For instance, referring to FIG. 7, locations of traffic features detected by LIDAR sensor(s) 180 may be compared to locations stored in map data 200. For example, computing device 110 may determine that the location detected by LIDAR sensor(s) 180 for each of traffic light 216, building 270, stop sign 260, and parking lot 280 is identical to the location stored in map data 200, but that pedestrian crossing 218 is not detected by LIDAR sensor(s) 180.
  • When a difference between the map data 200 and the collected sensor data on a traffic feature is detected, computing device 110 may further determine whether the difference was due to an error in the map data 200 or an error in the collected sensor data. For example, computing device 110 may determine that, since a pedestrian crossing is not a 3D structure and the field of view of LIDAR sensor(s) 180 does not include ground level, LIDAR sensor(s) 180 cannot detect pedestrian crossing 218, and therefore the difference does not indicate an error in map data 200. In such cases, computing device 110 may further confirm by comparing the location stored in map data 200 with collected sensor data from another sensor, such as camera sensor(s) 182. For example, computing device 110 may determine that the location for pedestrian crossing 218 in map data 200 matches the location detected by camera sensor(s) 182.
  • In some instances, computing device 110 may determine that an update needs to be made to map data 200. FIG. 9 shows the example situation 900 further illustrating an example map check. As shown, while at location [x_b, y_b], LIDAR sensor(s) 180 of vehicle 100 detect a no-enter sign 910 near exit 284 of parking lot 280. However, map data 200 does not include data on a no-enter sign at this location. As such, computing device 110 may determine to update map data 200 with the detected location of no-enter sign 910.
  • Additionally or alternatively, computing device 110 may determine that, even if some error exists, the map data may still pass a map test if a threshold number or percentage of traffic features stored in the map data have locations matching the detected locations from the collected sensor data. For instance, computing device 110 may determine that map data 200 may still pass the map test if at least five of the stored features, or at least 80% of them, have locations matching the collected sensor data. For example, since the locations for traffic light 216, pedestrian crossing 218, building 270, stop sign 260, and parking lot 280 match those of the collected LIDAR data, even though the no-enter sign 910 was missing from map data 200, computing device 110 may still determine that map data 200 passes the map test.
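The thresholded map-check decision above can be sketched as follows. The thresholds (at least five features, or at least 80%) mirror the example in the text; the dictionary representation and feature names are illustrative assumptions.

```python
# Sketch of a tolerant map check: the map data passes when a minimum
# count or fraction of its stored features match the locations detected
# during the test route.

def map_check(stored_locations, detected_locations,
              min_matches=5, min_fraction=0.8):
    """Both arguments map a feature name to a stored/detected (x, y)."""
    if not stored_locations:
        return False
    matches = sum(1 for name, loc in stored_locations.items()
                  if detected_locations.get(name) == loc)
    fraction = matches / len(stored_locations)
    return matches >= min_matches or fraction >= min_fraction
```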
  • For another instance, for a component check, one or more measurements related to a component of the vehicle may be compared with predetermined requirements. The component may satisfy the component check when the one or more measurements satisfy the predetermined requirements. For example, the predetermined requirements may be stored on computing device 110, or additionally or alternatively on server computing device 410 and/or storage system 450.
  • In some instances, a component may satisfy a component check if a measurement meets a predetermined threshold value. For example, referring to FIG. 8, a predetermined minimum threshold of 35 psi may be stored for the tires of vehicle 100. As shown, since the front left tire, the rear left tire, and the rear right tire each meets the predetermined minimum threshold, these tires satisfy the component check. However, since the front right tire has a pressure of only 20 psi, the front right tire fails the component check.
  • Additionally or alternatively, a component may satisfy a component check if a measurement is within a predetermined range of values. For example, referring to FIG. 8, predetermined ranges of alignment angles may be set for the tires of vehicle 100, which include camber, caster, and toe angles. Since each of the tires of vehicle 100 has alignment angles within these predetermined ranges, each of the tires of vehicle 100 passes the component check.
  • Additionally or alternatively, a component may satisfy a component check if a measurement indicates that the component has a predetermined level of responsiveness. For example, referring to FIG. 8, the predetermined level of responsiveness may be set as a binary value (responsive or not) for each of the left turn, right turn, reverse, and brake signal lights, as well as the headlight. As shown, since the left turn, right turn, reverse, and brake signal lights are all responsive, computing device 110 may determine that they each pass the component check. However, since the headlight is unresponsive, computing device 110 may determine that the headlight fails the component check.
  • For another example, again referring to FIG. 8, the predetermined level of responsiveness may be set as a predetermined level of delay. As shown, for the brakes, a predetermined stopping distance at a specific speed, such as 100 km/hr, may be set for vehicle 100. As such, since the measured stopping distance for vehicle 100 is 19 m, which is below the predetermined stopping distance of 20 m, computing device 110 may determine that the brakes pass the component check.
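The comparison styles described above for component checks (minimum threshold, allowed range, binary responsiveness, and a delay limit such as stopping distance) can be sketched as simple predicates. All limit values mirror the examples in the text and are illustrative, not real specifications; the function names are assumptions.

```python
# Illustrative predicates for the component-check styles in the text.

def check_tire_pressure(psi, minimum=35.0):
    # Style 1: a measurement must meet a predetermined minimum threshold.
    return psi >= minimum

def check_alignment(angle_deg, low, high):
    # Style 2: a measurement (camber, caster, or toe) must fall in a range.
    return low <= angle_deg <= high

def check_light(responsive):
    # Style 3: binary responsiveness for a signal light or headlight.
    return bool(responsive)

def check_braking(stopping_distance_m, limit_m=20.0):
    # Style 4: e.g. stopping distance from 100 km/hr must stay under a limit.
    return stopping_distance_m < limit_m
```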
  • Once the plurality of performance checks are completed, computing devices 110 may select an operation mode for vehicle 100. Modes of operation may include, for example, task designations (passenger or non-passenger tasks). Modes of operation may further include various limits, such as limits on speed, distance, geographic area, or environmental conditions (such as weather, or day/night). Modes of operation may also include an inactive mode where the vehicle is pulled over or parked after completing the plurality of performance checks.
  • Computing device 110 may determine an operation mode based on results from the plurality of performance checks. For example, an operation mode may only be selected if a threshold number or percentage of performance checks are passed. For another example, an operation mode may only be selected if a specific set of performance checks are passed, such as a set of performance checks specific to driving at night or during poor visibility, which may include performance checks such as the sensor checks described above, and component checks involving the signal lights and headlight, etc.
  • Computing device 110 may determine that one or more operation modes cannot be selected based on specific failures. For example, computing device 110 may determine that, if the stopping distance at 100 km/hr for vehicle 100 is above 20 m, modes of operation involving driving at a speed of 100 km/hr or greater cannot be selected. For another example, computing device 110 may determine that, if less than 80% of the sensors in the perception system 172 pass the sensor test, operation modes involving driving at night or in certain weather conditions cannot be selected. For still another example, computing device 110 may determine that, if one or more tires has a tire pressure below 35 psi, modes of operation involving passenger tasks cannot be selected.
  • Computing device 110 may select an operation mode further based on other factors, such as traffic law requirements and the type of vehicle. For example, traffic law may require a vehicle to have operating turn signals. As such, computing device 110 may select the inactive mode if any of the turn signals is unresponsive. For another example, computing device 110 may select an operation mode with a limit on distance only for compact vehicles with below normal tire pressures, and select an inactive operation mode for trucks with below normal tire pressures.
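Mode selection driven by specific failures, as described in the preceding paragraphs, can be sketched as follows. The mode names, check names, exclusion rules, and preference order are all illustrative assumptions in the spirit of the examples (turn signals required by traffic law, brake failures excluding high-speed modes, sensor failures excluding night driving, low tire pressure excluding passenger tasks).

```python
# Hypothetical sketch: specific check failures rule out specific modes,
# and the most capable remaining mode is selected.

def select_mode(results):
    """results: check name -> bool (True = passed). Returns a mode name."""
    if not results.get("turn_signal_check", False):
        return "inactive"  # e.g. traffic law requires working turn signals
    excluded = set()
    if not results.get("brake_check", False):
        excluded.add("high_speed")      # stopping distance too long
    if not results.get("sensor_check", False):
        excluded.add("night_driving")   # poor-visibility operation barred
    if not results.get("tire_check", False):
        excluded.add("passenger_tasks")
    # Preference order from most to least capable (illustrative).
    for mode in ("passenger_tasks", "night_driving", "high_speed",
                 "non_passenger_tasks"):
        if mode not in excluded:
            return mode
    return "inactive"  # pull over or park if nothing remains
```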
  • Once an operation mode is selected, computing device 110 may operate vehicle 100 in the selected operation mode. For example, operating in the selected operation mode may include operating according to the limits of the mode of operation, such as limits on speed, distance, geographic area, or environmental conditions. For another example, operating in the selected mode may include determining whether to accept passenger or non-passenger tasks.
  • Operating in the selected operation mode may further include using the determined corrections for one or more sensors. For example, as described with respect to FIGS. 7 and 9, when operating vehicle 100, computing device 110 may apply a correction of +5° to sensor data detected by LIDAR sensor(s) 180.
  • Operating in the selected operation mode may further include using the updated map data. For example, as described with respect to FIGS. 7 and 9, when operating vehicle 100, computing device 110 may use updated map data 200 including the no-enter sign 910.
  • Operation modes may also be selected for a plurality of vehicles by a remote system, such as a fleet management system. For example, server computing device 410 may manage a fleet of vehicles including vehicles 100, 100A, and 100B. In this regard, sensor data and component data collected by various vehicles in the fleet, such as vehicles 100, 100A, and 100B, may be uploaded to server computing device 410. Server computing device 410 may compare the collected sensor data from each vehicle to stored sensor values from previous detections. Server computing device 410 may also compare the collected component data with stored predetermined requirements. In some instances, the plurality of performance checks may be performed by the computing device of each vehicle, and only the results (pass/fail) are uploaded to server computing device 410. Server computing device 410 may then designate modes of operation for subsets of the plurality of vehicles based on the plurality of performance checks as described above, such as based on passing a threshold number or percentage of performance checks, particular sets of performance checks, other factors such as the type of vehicle or traffic law, etc. For another example, server computing device 410 may designate modes of operation further based on a planned distribution of or demand for the vehicles in the fleet.
  • FIG. 10 shows an example flow diagram 1000 of an example method for performing a plurality of performance checks. The example method may be performed by one or more processors, such as one or more processors 120 of computing device 110. For example, processors 120 of computing device 110 may receive data and make various determinations as shown in flow diagram 1000, and control the vehicle 100 based on these determinations.
  • Referring to FIG. 10, in block 1010, a plurality of performance checks are identified, including a first check for a detection system of a plurality of detection systems of the vehicle and a second check for map data. In block 1020, a plurality of road segments are selected based on a location of the vehicle and the plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks. In block 1030, a test route is determined for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments. For example, a plurality of road segments and a test route may be determined as described in relation to FIG. 6. In block 1040, the vehicle is controlled along the test route in an autonomous driving mode. In block 1050, while controlling the vehicle, sensor data are received from the plurality of detection systems of the vehicle. For example, sensor data collected on a test route may be received by computing device 110 as described in relation to FIG. 7.
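The route construction of blocks 1020-1030 can be illustrated with a minimal sketch that chains the vehicle's location to the nearest unvisited segment, then each segment to the next. A production planner would route over the actual road graph; the greedy ordering, segment representation, and names below are assumptions for illustration only.

```python
def plan_test_route(vehicle_pos, segments):
    """Order road segments into a test route by greedily visiting the
    nearest unvisited segment, starting from the vehicle's location.

    Each segment is (name, (x, y)) where (x, y) is its start point
    (a hypothetical simplification of real road-segment geometry).
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    route, pos = [], vehicle_pos
    remaining = list(segments)
    while remaining:
        # Connect the current position to the closest remaining segment.
        nearest = min(remaining, key=lambda s: dist(pos, s[1]))
        remaining.remove(nearest)
        route.append(nearest[0])
        pos = nearest[1]
    return route
```

For a vehicle at (0, 0) with segments starting at (1, 0), (5, 0), and (2, 0), the sketch visits them in the order nearest-first rather than as listed.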
  • In block 1060, the plurality of performance checks are performed based on the received sensor data. For example, one or more sensor checks may be performed by comparing the collected sensor data with stored previous sensor data. For another example, one or more map checks may be performed by comparing the collected sensor data with map data. In block 1070, an operation mode is selected from a plurality of operation modes for the vehicle based on results of the plurality of performance checks. For example, the operation mode may be selected based on the results meeting a threshold number or percentage of performance checks. In block 1080, the vehicle is operated in the selected operation mode. For example, operating in the selected operation mode may include using corrections to sensor data or updates to map data.
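The map check of block 1060 can be sketched as matching detected features against mapped features and passing when a threshold fraction matches, consistent with the threshold language of the claims. The distance tolerance, feature schema, and 80% threshold are illustrative assumptions.

```python
def map_check_passes(map_features, detected_features, tol_m=1.0, threshold=0.8):
    """Pass the map check when a threshold fraction of mapped features is
    matched by a detection of the same type within a distance tolerance.

    Features are dicts with "type", "x", and "y" keys (hypothetical schema).
    """
    if not map_features:
        return True  # nothing mapped here to verify
    matched = 0
    for mf in map_features:
        for df in detected_features:
            dx, dy = mf["x"] - df["x"], mf["y"] - df["y"]
            if mf["type"] == df["type"] and (dx * dx + dy * dy) ** 0.5 <= tol_m:
                matched += 1
                break  # count each mapped feature at most once
    return matched / len(map_features) >= threshold
```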
  • Unless otherwise stated, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of the examples described herein, as well as clauses phrased as “such as,” “including” and the like, should not be interpreted as limiting the subject matter of the claims to the specific examples; rather, the examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings can identify the same or similar elements.

Claims (20)

1. A system in a vehicle, the system comprising:
memory configured to store map data;
a plurality of sensors configured to detect traffic features and objects in an environment of the vehicle and generate sensor data; and
a computing device including one or more processors configured to perform a plurality of performance checks on the vehicle, the plurality of performance checks including a map check on the map data stored in the memory, the one or more processors being further configured to:
receive sensor data from at least one of the plurality of sensors, the received sensor data being associated with a stationary object or a traffic feature;
determine that the stationary object or the traffic feature is included in the map data stored in the memory;
perform the map check by comparing the received sensor data to the map data stored in the memory;
select an operation mode from a plurality of operation modes for the vehicle based on a result of the map check; and
operate the vehicle in the selected operation mode.
2. The system of claim 1, wherein the one or more processors are further configured to:
detect a difference between the map data and the received sensor data; and
determine whether the difference was due to an error in the map data.
3. The system of claim 2, wherein the one or more processors determine whether the difference was due to an error in the map data by determining which particular sensor of the plurality of sensors generated the sensor data, and then determining whether sensor data collected by a different one of the plurality of sensors should be compared to the map data.
4. The system of claim 1, wherein the one or more processors are further configured to:
determine whether differences exist in the comparison of the received sensor data to the map data; and
determine that the map check is passed when a threshold number or percentage of traffic features or objects of the map data have locations matching locations of traffic features or objects detected by the plurality of sensors.
5. The system of claim 1, wherein the stationary object is a traffic sign.
6. The system of claim 1, wherein the sensor data generated by the plurality of sensors indicates at least one of location or orientation of the stationary object.
7. The system of claim 1, wherein the traffic features and objects include a traffic light.
8. The system of claim 1, wherein the traffic features and objects include a stop sign.
9. The system of claim 1, wherein the traffic features and objects include a crosswalk.
10. The system of claim 1, wherein the one or more processors are further configured to:
select a plurality of road segments based on a location of the vehicle and the plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks;
determine a test route for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments; and
control the vehicle along the test route in an autonomous driving mode.
11. A method for performing a map check for a vehicle, the method comprising:
storing, by a memory, map data;
detecting, by a plurality of sensors, traffic features and objects in an environment of the vehicle;
generating, by the plurality of sensors, sensor data; and
performing, by one or more processors of a computing device, a map check on the map data stored in the memory by:
receiving, by the one or more processors, sensor data from at least one of the plurality of sensors, the received sensor data being associated with a stationary object or a traffic feature;
determining, by the one or more processors, that the stationary object or the traffic feature is included in the map data stored in the memory; and
performing, by the one or more processors, the map check by comparing the received sensor data to the map data stored in the memory,
wherein the one or more processors are configured to select an operation mode from a plurality of operation modes for the vehicle based on a result of the map check, and operate the vehicle in the selected operation mode.
12. The method of claim 11, further comprising:
detecting, by the one or more processors, a difference between the map data and the received sensor data; and
determining, by the one or more processors, whether the difference was due to an error in the map data.
13. The method of claim 12, wherein determining whether the difference was due to an error in the map data includes determining which particular sensor of the plurality of sensors generated the sensor data, and then determining whether sensor data collected by a different one of the plurality of sensors should be compared to the map data.
14. The method of claim 11, further comprising:
determining, by the one or more processors, whether differences exist in the comparison of the received sensor data to the map data; and
determining, by the one or more processors, that the map check is passed when a threshold number or percentage of traffic features or objects of the map data have locations matching locations of traffic features or objects detected by the plurality of sensors.
15. The method of claim 11, wherein the stationary object is a traffic sign.
16. The method of claim 11, wherein the sensor data generated by the plurality of sensors indicates at least one of location or orientation of the stationary object.
17. The method of claim 11, wherein the traffic features and objects include a traffic light.
18. The method of claim 11, wherein the traffic features and objects include a stop sign.
19. The method of claim 11, wherein the traffic features and objects include a crosswalk.
20. The method of claim 11, further comprising:
selecting, by the one or more processors, a plurality of road segments based on a location of the vehicle and a plurality of performance checks, wherein each of the plurality of road segments is selected for performing one or more of the plurality of performance checks;
determining, by the one or more processors, a test route for the vehicle by connecting the plurality of road segments and by connecting the location of the vehicle to one of the plurality of road segments; and
controlling, by the one or more processors, the vehicle along the test route in an autonomous driving mode.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/186,249 US20210229686A1 (en) 2018-12-13 2021-02-26 Automated Performance Checks For Autonomous Vehicles

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/219,386 US10960894B2 (en) 2018-12-13 2018-12-13 Automated performance checks for autonomous vehicles
US17/186,249 US20210229686A1 (en) 2018-12-13 2021-02-26 Automated Performance Checks For Autonomous Vehicles

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/219,386 Continuation US10960894B2 (en) 2018-12-13 2018-12-13 Automated performance checks for autonomous vehicles

Publications (1)

Publication Number Publication Date
US20210229686A1 true US20210229686A1 (en) 2021-07-29

Family

ID=71073310

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/219,386 Active 2039-07-10 US10960894B2 (en) 2018-12-13 2018-12-13 Automated performance checks for autonomous vehicles
US17/186,249 Abandoned US20210229686A1 (en) 2018-12-13 2021-02-26 Automated Performance Checks For Autonomous Vehicles


Country Status (6)

Country Link
US (2) US10960894B2 (en)
EP (1) EP3877229A4 (en)
JP (1) JP7271669B2 (en)
KR (1) KR102503572B1 (en)
CN (1) CN113272198A (en)
WO (1) WO2020123195A1 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8718861B1 (en) * 2012-04-11 2014-05-06 Google Inc. Determining when to drive autonomously

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4119900A (en) 1973-12-21 1978-10-10 Ito Patent-Ag Method and system for the automatic orientation and control of a robot
JPS5842251A (en) * 1981-09-07 1983-03-11 Toshiba Corp Manufacture of semiconductor device
JP4600795B2 (en) 2001-03-26 2010-12-15 マツダ株式会社 Vehicle remote diagnosis method, vehicle remote diagnosis system, vehicle control device, vehicle remote diagnosis device, and computer program
DE102006005709B4 (en) * 2006-02-08 2016-08-04 Robert Bosch Gmbh Pressure measuring device and method for parameterizing a pressure measuring device
US9043073B2 (en) * 2011-11-16 2015-05-26 Flextronics Ap, Llc On board vehicle diagnostic module
US9719801B1 (en) 2013-07-23 2017-08-01 Waymo Llc Methods and systems for calibrating sensors using road map data
EP2922033B1 (en) * 2014-03-18 2018-11-21 Volvo Car Corporation A vehicle sensor diagnosis system and method and a vehicle comprising such a system
JP6220711B2 (en) 2014-03-26 2017-10-25 株式会社日本総合研究所 Vehicle state diagnostic system and vehicle state diagnostic method for moving body
US9996986B2 (en) * 2014-08-28 2018-06-12 GM Global Technology Operations LLC Sensor offset calibration using map information
JP6398501B2 (en) * 2014-09-10 2018-10-03 株式会社デンソー In-vehicle camera diagnostic device
DE102015218041A1 (en) * 2015-09-21 2017-03-23 Bayerische Motoren Werke Ag Method and device for providing data for a geometry map for autonomous or automated driving of a vehicle
US9916703B2 (en) 2015-11-04 2018-03-13 Zoox, Inc. Calibration for autonomous vehicle operation
CN106096192B (en) * 2016-06-27 2019-05-28 百度在线网络技术(北京)有限公司 A kind of construction method and device of the test scene of automatic driving vehicle
JP2018008652A (en) 2016-07-15 2018-01-18 株式会社ジェイテクト Steering device
CN106198049B (en) * 2016-07-15 2019-03-12 百度在线网络技术(北京)有限公司 Real vehicles are in ring test system and method
EP3309721A1 (en) * 2016-09-23 2018-04-18 KPIT Technologies Ltd. Autonomous system validation
DE102016218815B4 (en) * 2016-09-29 2020-07-30 Audi Ag Procedure for selecting a route for an emission test
JP2018096715A (en) 2016-12-08 2018-06-21 トヨタ自動車株式会社 On-vehicle sensor calibration system
WO2018126215A1 (en) 2016-12-30 2018-07-05 DeepMap Inc. High definition map updates
CN108267322A (en) * 2017-01-03 2018-07-10 北京百度网讯科技有限公司 The method and system tested automatic Pilot performance
US10268195B2 (en) * 2017-01-06 2019-04-23 Qualcomm Incorporated Managing vehicle driving control entity transitions of an autonomous vehicle based on an evaluation of performance criteria
JP6355780B1 (en) * 2017-03-08 2018-07-11 三菱電機株式会社 Vehicle evacuation device and vehicle evacuation method
US10846947B2 (en) * 2017-03-31 2020-11-24 Honeywell International Inc. System and method for analyzing vehicle systems during vehicle travel
US10268203B2 (en) 2017-04-20 2019-04-23 GM Global Technology Operations LLC Calibration validation for autonomous vehicle operations
JP6822309B2 (en) 2017-05-16 2021-01-27 株式会社デンソー Autonomous driving support device and automatic driving support method
CN107727411B (en) * 2017-10-30 2019-09-27 青岛慧拓智能机器有限公司 A kind of automatic driving vehicle assessment scene generation system and method
US11235778B2 (en) * 2018-01-24 2022-02-01 Clearpath Robotics Inc. Systems and methods for maintaining vehicle state information
US10755007B2 (en) * 2018-05-17 2020-08-25 Toyota Jidosha Kabushiki Kaisha Mixed reality simulation system for testing vehicle control system designs


Also Published As

Publication number Publication date
US20200189608A1 (en) 2020-06-18
WO2020123195A1 (en) 2020-06-18
US10960894B2 (en) 2021-03-30
JP2022510959A (en) 2022-01-28
CN113272198A (en) 2021-08-17
JP7271669B2 (en) 2023-05-11
KR102503572B1 (en) 2023-02-24
EP3877229A4 (en) 2022-07-27
KR20210090285A (en) 2021-07-19
EP3877229A1 (en) 2021-09-15


Legal Events

Date Code Title Description
AS Assignment

Owner name: WAYMO LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRALEY, COLIN;GRABE, VOLKER;REEL/FRAME:055439/0595

Effective date: 20181214

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION