US20190163176A1 - Method for transferring control of an autonomous vehicle to a remote operator - Google Patents

Info

Publication number
US20190163176A1
US20190163176A1 (application US 16/206,477)
Authority
US
United States
Prior art keywords
remote operator
autonomous
autonomous vehicle
remote
road segment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/206,477
Inventor
Tao Wang
Wei Song
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Drive AI Inc
Original Assignee
Drive AI Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Drive AI Inc
Priority to US 16/206,477
Assigned to Drive.ai (assignment of assignors interest; assignors: Song, Wei; Wang, Tao)
Publication of US20190163176A1
Priority to US 16/943,969 (published as US11131990B1)
Priority to US 17/327,362 (published as US11797001B1)

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0055: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements
    • G05D 1/0061: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements for transition from automatic pilot to manual pilot and vice versa
    • G05D 1/0011: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D 1/0027: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
    • G05D 1/0038: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • G05D 1/0088: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D 2201/0213

Definitions

  • This invention relates generally to the field of autonomous vehicles and more specifically to a new and useful method for transferring control of an autonomous vehicle to a remote operator in the field of autonomous vehicles.
  • FIG. 1 is a flowchart representation of a method.
  • FIG. 2 is a flowchart representation of one variation of the method.
  • FIGS. 3A, 3B, and 3C are flowchart representations of variations of the method.
  • FIG. 4 is a flowchart representation of one variation of the method.
  • a method for transferring control of an autonomous vehicle to a remote operator includes, at a computer system: accessing a corpus of driving records of a fleet of autonomous vehicles operating within a geographic region in Block S110; identifying a road segment, within the geographic region, associated with a frequency of transitions, from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet, that exceeds a threshold frequency based on the corpus of driving records in Block S120; and associating a location of the road segment, represented in a navigation map, with a remote operator trigger in Block S130.
  • the method also includes, at the autonomous vehicle operating within the geographic region: autonomously navigating along a route in Block S140; transmitting a request for manual assistance to the remote operator in Block S150 in response to approaching the location associated with the remote operator trigger; transmitting sensor data to a remote operator portal associated with the remote operator in Block S152; executing a navigational command received from the remote operator via the remote operator portal in Block S154; and resuming autonomous navigation along the route after passing the location in Block S160.
  • One variation of the method shown in FIG. 3C further includes, at the remote computer system: accessing a corpus of historical traffic accident data of human-operated vehicles involved in traffic accidents within a geographic region in Block S110; identifying a road segment, within the geographic region, associated with a frequency of traffic accidents that exceeds a threshold frequency based on the corpus of historical traffic accident data in Block S120; and associating a location of the road segment, represented in a navigation map, with a remote operator trigger in Block S130.
  • Another variation of the method shown in FIGS. 3A, 3B, and 3C further includes, at the remote computer system: accessing a specification for triggering manual control of autonomous vehicles in Block S110; identifying a road segment, within a geographic region, exhibiting characteristics defined by the specification in Block S120; and associating a location of the road segment, represented in a navigation map, with a remote operator trigger in Block S130.
  • Blocks of the method can be executed by a computer system (e.g., a computer network, a remote server) to preemptively annotate a navigation map with locations of remote operator triggers based on various existing data, such as: human-supervised autonomous vehicle test data; operating data recorded by autonomous vehicles while operating autonomously; accident data from human-operated vehicles; and/or characteristics of roads or intersections flagged for manual control.
  • an autonomous vehicle can execute other Blocks of the method to: automatically request remote operator assistance as the autonomous vehicle approaches a location of a remote operator trigger indicated in the navigation map; automatically cede decision-making or full operational control of the autonomous vehicle to a remote human operator; execute navigational commands received from the remote human operator to navigate through this location; and then resume autonomous operation upon passing this location or upon confirmation from the remote human operator to resume autonomous operation.
  • the remote computer system can access various historical data within a geographic region, such as: locations over which local human operators occupying autonomous vehicles took manual control of their autonomous vehicles (e.g., during autonomous vehicle testing); locations at which autonomous vehicles, operating autonomously, unexpectedly disengaged (e.g., due to an autonomous operation failure or inability to verify a next navigational action); and/or locations (and severity, cost) of accidents involving human-operated vehicles.
  • the remote computer system can isolate discrete locations, intersections, lanes, and/or other road segments at which an autonomous vehicle may be at greater risk for collision with other vehicles, may be delayed in executing a next navigational action, or may execute a next navigational action with reduced confidence.
  • the remote computer system can then populate a navigation map (or a localization map, a table, or other container) with remote operator triggers and related trigger parameters at geospatial locations of these flagged road segments.
  • the remote computer system can: generate a heatmap of frequencies of manual control selections, autonomous vehicle disengagements, and/or traffic accidents per instance of traversal by a vehicle throughout the geographic region over a period of time; identify discrete geospatial locations or small geospatial areas within the heatmap exhibiting greatest frequencies of manual control selections, autonomous vehicle disengagements, and/or traffic accidents per instance of traversal by a vehicle; write a remote operator flag to the navigation map at each of these discrete geospatial locations or small geospatial areas; and push this navigation map (or a navigation map update) to each autonomous vehicle deployed to this geographic region.
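The heatmap-and-flagging step above can be sketched compactly. The sketch below is illustrative only: the per-traversal ratio, the 30% threshold, and all names (`flag_road_segments`, `seg_5`) are assumptions for this example, not the patent's implementation.

```python
from collections import defaultdict

def flag_road_segments(events, traversals, threshold=0.30):
    """Flag road segments whose per-traversal rate of manual-control selections,
    disengagements, and/or traffic accidents exceeds a threshold (a simple
    stand-in for the heatmap described above).

    events: list of road-segment ids, one per recorded event
    traversals: dict mapping segment id -> number of vehicle traversals
    """
    counts = defaultdict(int)
    for segment in events:
        counts[segment] += 1
    triggers = {}
    for segment, n in counts.items():
        total = traversals.get(segment, 0)
        if total and n / total > threshold:
            # write a remote operator flag for this segment in the navigation map
            triggers[segment] = {"type": "remote_operator", "frequency": n / total}
    return triggers

navigation_map_triggers = flag_road_segments(
    events=["seg_5", "seg_5", "seg_5", "seg_9"],
    traversals={"seg_5": 4, "seg_9": 100},
)
# seg_5 (events on 75% of traversals) is flagged; seg_9 (1%) is not
```

A flagged-segment dictionary like this could then be merged into the navigation map update pushed to each deployed autonomous vehicle.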
  • the remote computer system can also derive correlations between local conditions and these instances of manual control selections, autonomous vehicle disengagements, and/or traffic accidents—such as: time of day; local weather conditions; and an autonomous vehicle entering uncommon (e.g., five-way) intersections, entering railroad crossings, facing into the Sun, entering a school zone, nearing a large crowd of pedestrians, or approaching an unprotected left turn; etc.
  • the remote computer system can then write these conditions to corresponding remote operator triggers in the navigation map (or localization map, table, or other container) in the form of trigger parameters.
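A remote operator trigger with attached trigger parameters can be represented as a small record annotated onto a map location. The field names and encodings below are assumptions chosen for this sketch, not the patent's schema.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class RemoteOperatorTrigger:
    """Illustrative container for a trigger written into a navigation map,
    with optional condition fields ("trigger parameters") as described above."""
    latitude: float
    longitude: float
    time_window: Optional[Tuple[int, int]] = None  # (start_hour, end_hour), local time
    weather: Optional[str] = None                  # e.g., "rain"
    solar_offset_window: Optional[Tuple[float, float]] = None  # degrees, vehicle axis to Sun
    min_pedestrians: Optional[int] = None

    def conditions(self):
        # only the non-empty trigger parameters constrain when the trigger fires
        return {k: v for k, v in self.__dict__.items()
                if k not in ("latitude", "longitude") and v is not None}

trigger = RemoteOperatorTrigger(37.77, -122.42, time_window=(7, 9), min_pedestrians=10)
```

An unconditioned trigger (all optional fields left as `None`) would fire for every approach to the flagged location.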
  • an autonomous vehicle can: reference a localization map to determine its geospatial location; and reference the navigation map to elect and then execute navigational actions, such as accelerating, braking, turning, changing lanes, etc. along a planned route toward a specified destination.
  • the autonomous vehicle can automatically transmit a request for manual assistance to a remote operator (or to a remote operator manager more generally).
  • the autonomous vehicle can transition from autonomous navigation to remote manual control by the remote operator and can transmit (or “stream”) video, LIDAR, and/or other sensor data to the remote operator portal associated with the remote operator in real-time.
  • the remote operator can view these sensor data through her remote operator portal and elect to: delay a navigational action (e.g., in the autonomous vehicle's queue); confirm a navigational action; select from a predefined set of navigational actions; or manually adjust brake, accelerator, and/or steering positions accordingly.
  • the autonomous vehicle can then transition back to full autonomous operation and resume full autonomous navigation along the planned route, such as: once the autonomous vehicle has moved past the location (or intersection, lane, and/or other road segment) linked to this remote operator trigger; or once the remote operator has confirmed—via the remote operator portal—transition back to autonomous operation.
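The vehicle-side handoff flow above (request assistance on approach, execute remote commands through the flagged location, then resume autonomy) can be sketched as a simple mode selector. The 150-meter request distance and all names here are assumptions for illustration.

```python
REQUEST_DISTANCE_M = 150.0  # assumed look-ahead distance for requesting assistance

def control_mode(distance_to_trigger_m, past_trigger, operator_confirmed_resume):
    """Return the vehicle's control mode for the current planning cycle."""
    if past_trigger or operator_confirmed_resume:
        return "autonomous"        # resume full autonomous navigation along the route
    if distance_to_trigger_m <= REQUEST_DISTANCE_M:
        return "remote_manual"     # stream sensor data; execute remote operator commands
    return "autonomous"

# far from the trigger -> autonomous; inside the request window -> remote manual;
# past the flagged location -> autonomous again
modes = [control_mode(d, past, confirmed) for d, past, confirmed in
         [(500.0, False, False), (100.0, False, False), (0.0, True, False)]]
```

In a real system the "past trigger" and "operator confirmed" signals would come from the localization pipeline and the remote operator portal, respectively.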
  • emergency scenario or accident data for training an autonomous vehicle solution may not be immediately available without involving autonomous vehicles (or vehicles outfitted with similar sensor suites) in a variety of different accidents while collecting sensor data from these autonomous vehicles. Therefore, an autonomous vehicle solution may not be trained to detect and respond to possible emergency scenarios or to detect and respond to emergency scenarios in which it is directly involved, such as: occupying a railroad crossing as a train approaches; navigating past a vehicle that has crossed into oncoming traffic near the autonomous vehicle; or approaching a large animal crossing a road ahead of the autonomous vehicle.
  • the remote computer system can: identify discrete locations, intersections, lanes, or other road segments at which emergency scenarios are particularly likely to occur (e.g., locations associated with transition to manual control by local human operators while occupying these autonomous vehicles, locations associated with accident frequencies that substantially exceed a threshold, average, or baseline value); and then annotate a navigation map or other container with remote operator triggers at corresponding locations.
  • An autonomous vehicle approaching a location associated with a remote operator trigger can automatically and preemptively request assistance from a remote operator and serve sensor data to this remote operator prior to (e.g., ten seconds before) the autonomous vehicle's arrival at this flagged location, thereby enabling the remote operator to quickly perceive the scene around the autonomous vehicle and reliably assume manual control of the autonomous vehicle prior to the autonomous vehicle executing a higher-risk navigational action or disengaging due to a failure at the flagged location.
  • a remote operator manager can also dynamically and predictively allocate remote human operators to assist autonomous vehicles approaching locations of remote operator triggers indicated in the navigation map as these autonomous vehicles operate (e.g., execute routes) within a geographic region.
  • the remote computer system, remote operator portal, and fleet of autonomous vehicles can cooperate to annotate a navigation map with locations of remote operator triggers and to implement this navigation map in order to reduce risk to autonomous vehicles entering known higher-risk scenarios and in order to maintain high operating efficiency for these autonomous vehicles.
  • the remote computer system can preemptively identify higher-risk road segments, road segments in which autonomous vehicles may be unable to detect and avoid risk, or road segments in which autonomous vehicles may be unable to confidently elect a next navigational action and to label a navigational map (or other container) with remote operator triggers at corresponding locations.
  • An autonomous vehicle (or the remote computer system) can then automatically trigger a remote operator to assume control of the autonomous vehicle and to assist navigation of the autonomous vehicle as the autonomous vehicle approaches a road segment linked to a remote operator trigger in the navigation map in order to: reduce risk of collision with other vehicles or obstacles nearby; and/or maintain a high operating efficiency of the autonomous vehicle.
  • Block S110 of the method recites, during a scan cycle, recording multi-dimensional sensor images at multi-dimensional sensors arranged on the vehicle.
  • an autonomous vehicle accesses sensor data from various sensors arranged on or integrated in the autonomous vehicle—such as distance scans from multiple LIDAR sensors and/or color 2D images from multiple color cameras—recorded approximately concurrently by sensors defining fields of view exhibiting some overlap over a distance range from the autonomous vehicle.
  • the autonomous vehicle includes: a suite of sensors configured to collect information about the autonomous vehicle's environment; local memory that stores a navigation map defining lane connections and nominal vehicle paths for a road area and a localization map that the autonomous vehicle implements to determine its location in real space; and a controller that governs actuators within the autonomous vehicle to execute various functions based on the navigation map, the localization map, and outputs of these sensors.
  • the autonomous vehicle includes a set of 360° LIDAR sensors arranged on the autonomous vehicle, such as one LIDAR sensor mounted at each corner of the autonomous vehicle or a set of LIDAR sensors integrated into a roof rack mounted to the roof of the autonomous vehicle.
  • Each LIDAR sensor can output one three-dimensional distance scan—such as in the form of a 3D point cloud representing distances between the LIDAR sensor and external surfaces within the field of view of the LIDAR sensor—per rotation of the LIDAR sensor (i.e., once per scan cycle).
  • the autonomous vehicle can also be outfitted (or retrofit) with additional sensors, such as: color cameras; 3D color cameras; a uni-dimensional or multi-dimensional (e.g., scanning) RADAR or infrared distance sensor; etc.
  • the autonomous vehicle can implement similar methods and techniques to read data from these sensors.
  • the autonomous vehicle can then: identify (or “perceive”) mutable objects nearby from these sensor data; regularly compare these data to features represented in a localization map in order to determine its location and orientation in real space; and identify a lane occupied by the autonomous vehicle, a local speed limit, a next navigational action, and/or proximity of a remote operator trigger location, etc. based on the autonomous vehicle's location and orientation and data stored in a navigation map.
  • the autonomous vehicle can autonomously navigate toward a destination location in Block S140.
  • Block S110 of the method recites accessing a corpus of driving records of a fleet of autonomous vehicles operating within a geographic region;
  • Block S120 of the method recites identifying a road segment, within the geographic region, associated with a frequency of transitions, from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet, that exceeds a threshold frequency based on the corpus of driving records;
  • Block S130 of the method recites associating a location of the road segment, represented in a navigation map, with a remote operator trigger.
  • the remote computer system can: access operational data collected from autonomous vehicles occupied by local human operators (e.g., “test drivers”) during autonomous vehicle test periods on public roads; extract manual operator trends—such as location and/or characteristics of adjacent road segments at times of manually-triggered transition from autonomous operation to manual operation—from these operational data; and then define remote operator triggers at road segments associated with higher frequencies of manually-triggered transition (and at locations exhibiting similarities to road segments associated with higher frequencies of manually-triggered transition), as shown in FIGS. 1 and 3A.
  • An autonomous vehicle solution may be tested on public roads, such as over hundreds, thousands, or millions of miles.
  • a human operator occupying an autonomous vehicle during a test period may manually transition the autonomous vehicle from autonomous operation (e.g., an “autonomous mode”) to manual operation (e.g., a “manual mode”), such as when the autonomous vehicle approaches a difficult intersection or in the presence of an unexpected obstacle (e.g., a vehicle, a pedestrian, an animal) near or in the path of the autonomous vehicle.
  • the autonomous vehicle (and/or the remote computer system) can record characteristics of such instances of human-triggered transitions to manual control, including: locations; times of day; local traffic conditions; constellations of detected obstacles nearby; lanes occupied by autonomous vehicles; road characteristics (e.g., road surface quality, wetness, color, reflectivity); weather conditions; and/or position of the autonomous vehicle relative to the Sun, Sun intensity, or sensor obscuration due to sunlight; etc., at (and slightly before) the moments when local human operators triggered these autonomous-to-manual-operation transitions.
  • the remote computer system can then aggregate these data in a remote database over time.
  • the remote computer system can then analyze these autonomous-to-manual-operation transitions and related data to isolate road segments and local conditions likely to necessitate remote manual control.
  • a (significant) proportion of these autonomous-to-manual-operation transitions may be arbitrary (e.g., anomalous, haphazard).
  • locations, times of day, local traffic conditions, and/or other conditions of some of these autonomous-to-manual-operation transitions may repeat with relatively high frequency over time.
  • the remote computer system can therefore: aggregate locations of these autonomous-to-manual-operation transitions occurring during road test periods throughout a geographic region over time; and identify road segments over which local human operators commonly transition their autonomous vehicles from autonomous operation to manual control in Block S120, such as with greater absolute frequency, greater frequency per instance the road segment is traversed, or greater frequency per unit time.
  • the remote computer system can access geospatial locations of instances of transition from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in a fleet of autonomous vehicles over time (e.g., during test periods within a geographic region prior to deployment of this fleet of autonomous vehicles for full autonomous operation within this geographic region).
  • the remote computer system can then aggregate instances of transition from autonomous operation to manual operation across this fleet of autonomous vehicles over time into a set of groups based on geospatial proximity of these transitions.
  • the remote computer system can: calculate a frequency of autonomous-to-manual-operation transitions along a particular road segment—containing geospatial locations of autonomous-to-manual-operation transitions in this group—such as based on a ratio of the total quantity of transitions in the group to the quantity of instances of autonomous vehicles in the fleet traversing this road segment; and then flag this road segment if this frequency of transitions exceeds a threshold frequency (e.g., 30%) in Block S120.
  • the remote computer system can then write a remote operator trigger to each of these flagged road segments in a navigation map for this geographic region.
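The grouping-by-proximity and ratio-threshold steps above can be sketched with a coarse grid cell standing in for a road segment. The ~100 m cell size, the 30% threshold, and the coordinates are all assumptions for this illustration.

```python
from collections import defaultdict

CELL_DEG = 0.001  # ~100 m grid cell: an illustrative proxy for a road segment

def group_transitions(points):
    """Group transition locations (lat, lon) by geospatial proximity (grid cell)."""
    groups = defaultdict(list)
    for lat, lon in points:
        groups[(round(lat / CELL_DEG), round(lon / CELL_DEG))].append((lat, lon))
    return groups

def flagged_cells(points, traversals_per_cell, threshold=0.30):
    """Flag cells whose transitions-per-traversal ratio exceeds the threshold."""
    flagged = []
    for cell, pts in group_transitions(points).items():
        total = traversals_per_cell.get(cell, 0)
        if total and len(pts) / total > threshold:
            flagged.append(cell)
    return flagged

# three nearby takeover locations over five traversals of the same cell (60% > 30%)
cells = flagged_cells(
    [(37.12341, -122.00012), (37.12339, -122.00008), (37.12342, -122.00011)],
    {(37123, -122000): 5},
)
```

A production system would group against actual road-segment geometry from the navigation map rather than a latitude/longitude grid; the grid keeps the sketch self-contained.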
  • the remote computer system can: access times of these instances of transition from autonomous operation to manual operation; and aggregate these autonomous-to-manual-operation transitions into groups further based on temporal proximity (e.g., occurring during the same day of the week and/or during similar times of day). For each group in this set, the remote computer system can: flag a road segment—containing geospatial locations of autonomous-to-manual-operation transitions in this group—if the frequency of transitions along this road segment within a time window represented in this group exceeds a threshold frequency in Block S120; and then write a remote operator trigger with a constraint of this time window to this flagged road segment in the navigation map in Block S130.
  • the remote computer system can thus limit a remote operator trigger for a road segment flagged in the navigation map according to a time window; and the autonomous vehicle can transmit a request for manual assistance to a remote operator only upon approaching this road segment during the time window defined in this remote operator trigger.
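The time-window constraint can be sketched as a simple check the vehicle runs before transmitting a request for manual assistance. The (weekday, hour-range) encoding is an assumption chosen for this illustration.

```python
from datetime import datetime

def trigger_active(trigger_window, now):
    """trigger_window: (weekday, start_hour, end_hour); a weekday of None matches any day."""
    weekday, start_hour, end_hour = trigger_window
    if weekday is not None and now.weekday() != weekday:
        return False
    return start_hour <= now.hour < end_hour

# a trigger limited to Monday mornings (weekday 0, 07:00-09:00 local time)
window = (0, 7, 9)
monday_morning = datetime(2019, 11, 4, 8, 15)   # 2019-11-04 was a Monday
tuesday_morning = datetime(2019, 11, 5, 8, 15)  # outside the window's weekday
```

The vehicle would request remote assistance on approach only when `trigger_active` returns `True`; otherwise it proceeds autonomously through the flagged segment.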
  • the remote computer system accesses both: geospatial locations of autonomous-to-manual-operation transitions triggered by local operators occupying autonomous vehicles in the fleet; and scene characteristics (e.g., local traffic conditions, constellations of obstacles nearby, road surface quality, road wetness, road color, road reflectivity, local weather conditions) proximal autonomous vehicles during these autonomous-to-manual-operation transitions in Block S 110 .
  • the remote computer system then aggregates autonomous-to-manual-operation transitions into a set of groups based on both geospatial proximity and similarity of scene characteristics proximal autonomous vehicles during these transitions.
  • the remote computer system can: flag a road segment—containing geospatial locations of autonomous-to-manual-operation transitions in this group—if the frequency of transitions occurring along this road segment concurrently with a particular scene characteristic representative of this group exceeds a threshold frequency in Block S120; and then write a remote operator trigger with a constraint of this particular scene characteristic to this flagged road segment in the navigation map in Block S130.
  • the remote computer system can thus limit a remote operator trigger for a road segment flagged in the navigation map according to a scene characteristic (or a constellation of scene characteristics); and the autonomous vehicle can transmit a request for manual assistance to a remote operator only upon detecting this scene characteristic (or a constellation of scene characteristics) when approaching this road segment.
  • the remote computer system can: access offsets between anteroposterior axes of autonomous vehicles and the Sun during autonomous-to-manual-operation transitions in Block S110; identify a group of autonomous-to-manual-operation transitions occurring at geospatial locations along a road segment concurrent with solar offsets—between anteroposterior axes of autonomous vehicles and the Sun—that fall within a solar offset window in Block S120; write a remote operator trigger to this road segment in the navigation map if this frequency of autonomous-to-manual-operation transitions in this group exceeds a threshold frequency in Block S130; and then limit this remote operator trigger according to this solar offset window in Block S130.
  • the remote computer system can similarly calculate this solar offset window based on positions of autonomous vehicles relative to the Sun when solar radiation overwhelmed sensors (e.g., color cameras, LIDAR sensors) in these autonomous vehicles, such as along this road segment, and associate a remote operator trigger and solar offset window with this road segment accordingly. Later, as an autonomous vehicle approaches this road segment, the autonomous vehicle can transmit a request for manual assistance to a remote operator if the offset between the anteroposterior axis of the autonomous vehicle and the Sun falls within this solar offset window.
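The solar-offset check can be sketched as the angular difference between the vehicle's heading and the Sun's azimuth. In practice the Sun's azimuth would come from an ephemeris given time and location; here it is passed in directly, and the 0-20° window is an assumed example.

```python
def angular_offset_deg(vehicle_heading_deg, sun_azimuth_deg):
    """Smallest absolute angle between the vehicle's anteroposterior axis and the Sun."""
    diff = abs(vehicle_heading_deg - sun_azimuth_deg) % 360.0
    return min(diff, 360.0 - diff)

def solar_trigger_active(vehicle_heading_deg, sun_azimuth_deg, window_deg=(0.0, 20.0)):
    """True when the offset falls inside the trigger's solar offset window."""
    lo, hi = window_deg
    return lo <= angular_offset_deg(vehicle_heading_deg, sun_azimuth_deg) <= hi

# driving nearly into a low Sun: heading 265 deg, Sun azimuth 255 deg -> 10 deg offset
into_sun = solar_trigger_active(265.0, 255.0)
sun_abeam = solar_trigger_active(90.0, 255.0)  # Sun well off-axis
```

The modulo wrap keeps the check correct across the 0°/360° boundary (e.g., heading 5° versus azimuth 355° is a 10° offset, not 350°).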
  • the remote computer system can: detect presence (e.g., quantities) of pedestrians proximal autonomous vehicles during autonomous-to-manual-operation transitions in Block S110; identify a group of autonomous-to-manual-operation transitions occurring at geospatial locations along a road segment concurrent with presence of a minimum quantity (or a range) of pedestrians in Block S120; write a remote operator trigger to this road segment in the navigation map if this frequency of autonomous-to-manual-operation transitions in this group exceeds a threshold frequency in Block S130; and then limit this remote operator trigger according to this minimum quantity (or a range) of pedestrians in Block S130. Later, as an autonomous vehicle approaches this road segment, the autonomous vehicle can transmit a request for manual assistance to a remote operator if the autonomous vehicle has detected at least the minimum quantity of pedestrians in its vicinity.
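The pedestrian-count condition reduces to counting labeled detections in the vehicle's perception output. The detection format (a list of labeled dictionaries) is an assumption for this sketch.

```python
def pedestrian_trigger_active(detections, min_pedestrians):
    """True when perception reports at least min_pedestrians pedestrian detections."""
    count = sum(1 for d in detections if d.get("label") == "pedestrian")
    return count >= min_pedestrians

# an assumed perception snapshot: three pedestrians and one vehicle nearby
scene = [{"label": "pedestrian"}, {"label": "pedestrian"},
         {"label": "vehicle"}, {"label": "pedestrian"}]
crowded = pedestrian_trigger_active(scene, 3)
sparse = pedestrian_trigger_active(scene, 5)
```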
  • the remote computer system can generate a heatmap of autonomous-to-manual-operation transitions during autonomous vehicle test periods throughout a geographic region, such as with groups of transitions weighted by spatial density and by ratio of number of transitions to total autonomous vehicle traversals across road segments in this geographic region.
  • the remote computer system can then rank road segments in this geographic region by intensity in the heatmap.
  • the remote computer system (or autonomous vehicles) can dynamically set and clear remote operator triggers at road segments within this geographic region based on rank of these road segments and availability of remote operators to handle remote operator requests from these autonomous vehicles.
  • the remote computer system can implement load balancing techniques to activate remote operator triggers for highest-ranking road segments and to selectively activate remote operator triggers for lower-ranking road segments responsive to increased availability of remote operators to respond to remote operator requests from these autonomous vehicles.
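A minimal version of this load-balancing idea keeps triggers active for the top-ranked road segments and extends down the ranking as operator availability grows. The one-operator-per-segment capacity model and the segment names are assumptions for illustration.

```python
def active_triggers(ranked_segments, available_operators):
    """ranked_segments: segment ids ordered from highest to lowest heatmap rank.
    Returns the set of segments whose remote operator triggers stay active."""
    return set(ranked_segments[:max(0, available_operators)])

# assumed ranking from the heatmap, highest-risk segment first
ranking = ["seg_intersection_a", "seg_rail_crossing", "seg_school_zone", "seg_left_turn"]
with_two_operators = active_triggers(ranking, 2)
with_none = active_triggers(ranking, 0)
```

A real dispatcher would also weigh predicted arrival times of vehicles at each flagged segment, not just a static ranking.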
  • the remote computer system can then selectively annotate a navigation map with remote operator triggers in Block S130.
  • the remote computer system can annotate the navigation map with remote operator triggers at discrete locations, intersections, lanes, and/or segments of roadway at which local human operators in these autonomous vehicles frequently elected manual control.
  • the remote computer system can annotate the navigation map with a remote operator trigger for each discrete road segment and vehicle direction: over which local operators elected manual control of their autonomous vehicles more than a threshold number of times per instance that an autonomous vehicle traversed this road segment; or over which local operators elected manual control of their autonomous vehicles more than a threshold number of times per unit time; etc.
  • the remote computer system can implement any other methods or techniques to extract manual control locations from historical autonomous vehicle test data and to automatically annotate the navigation map with remote operator triggers.
  • the remote computer system defines remote operator triggers based on historical autonomous vehicle disengagements—that is, instances in which autonomous vehicles in the fleet automatically ceased autonomous operation, such as due to failure of an autonomous technology, inability to perceive their surroundings with sufficient confidence, or inability to verify next navigational actions.
  • the remote computer system can: identify a road segment associated with a frequency of autonomous-to-manual-operation transitions—triggered by autonomous vehicles, rather than by local human operators occupying these autonomous vehicles—that exceeds a threshold frequency based on the corpus of driving records accessed in Block S 110 ; and then associate a location of this road segment, represented in the navigation map, with a remote operator trigger in Block S 130 .
  • the remote computer system can also implement methods and techniques described above to associate this remote operator trigger with additional conditions.
  • Block S 110 of the method includes accessing historical accident data of human-operated vehicles involved in road accidents within a geographic region; and Block S 120 of the method includes identifying a road segment, within the geographic region, associated with a frequency of accidents exceeding a threshold accident frequency.
  • the remote computer system can: access road vehicle accident data, such as from an accident database for human-operated vehicles, in Block S 110 ; and then extract trends from these data to identify locations (and local conditions) for which greater risk of accidents or collisions exist in Block S 120 .
  • the remote computer system can then define remote operator flags for these locations (and conditions) and write these remote operator flags to the navigation map (or other container) accordingly in Block S 130 , as shown in FIGS. 1 and 3C .
  • the remote computer system extracts, from available accident data: geospatial locations (e.g., latitudes and longitudes); lane identifiers; and directions of motion of vehicles involved in recorded accidents.
  • the remote computer system can also extract, from these accident data: navigational actions; days; times of day; weather conditions; numbers and types of vehicles involved (e.g., cars, trucks, cyclists); numbers of pedestrians involved; accident severities (e.g., minor impact, vehicle totaled); types of accidents (e.g., rear-end collisions, side-impact collisions, sideswipe collisions, vehicle rollover, head-on collisions, or multi-vehicle pile-ups); etc. of recorded accidents and vehicles involved in these accidents.
  • the remote computer system can then compile these accident data into a geospatial heatmap of accidents. For example, the remote computer system can weight each incidence of a recorded accident by: how recently the accident occurred; a number of vehicles involved in the accident; and/or a severity of the accident (e.g., as a function of total cost of vehicle damage and human injuries).
  • the remote computer system can then flag discrete geospatial locations, specific intersections, or specific road segments (e.g., 100-meter lengths of road) over which weighted rates of accidents per vehicle passing this location or per unit time exceed a threshold rate. (In this example, the remote computer system can adjust the threshold rate as a function of availability of remote operators.)
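The accident-weighting and flagging steps above can be sketched as follows. The exponential recency decay, the five-year half-life, and the severity cost scale are assumed parameters, not values from the specification.

```python
def accident_weight(years_ago, vehicles_involved, severity_cost,
                    half_life_years=5.0, cost_scale=10_000.0):
    """Weight one recorded accident by recency, vehicle count, and severity
    (approximated here by total damage cost in dollars)."""
    recency = 0.5 ** (years_ago / half_life_years)  # exponential decay
    return recency * vehicles_involved * (1.0 + severity_cost / cost_scale)


def flag_segments(accidents_by_segment, traffic_by_segment, threshold):
    """Flag segments whose weighted accident rate per passing vehicle
    exceeds the (operator-availability-adjusted) threshold rate."""
    flagged = set()
    for seg, accidents in accidents_by_segment.items():
        traffic = traffic_by_segment.get(seg, 0)
        if not traffic:
            continue
        total = sum(accident_weight(*a) for a in accidents)
        if total / traffic > threshold:
            flagged.add(seg)
    return flagged
```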
  • the remote computer system can write remote operator triggers to these locations, intersections, and/or road segments in the navigation map in Block S 130 , as described above.
  • the remote computer system can also write a weight (or “priority”) value to each of these remote operator triggers; and the autonomous vehicle and the remote computer system can cooperate to selectively engage a remote operator to assist the autonomous vehicle in passing a location of a remote operator trigger based on a weight assigned to this remote operator trigger and current resource load at the remote operator manager (i.e., based on current availability of remote operators), as described below.
  • the remote computer system can: access a corpus of historical traffic accident data of manually-operated vehicles involved in traffic accidents within a geographic region in Block S 110 ; identify a road segment—within this geographic region—associated with a frequency of traffic accidents that exceeds a threshold frequency in Block S 120 based on this corpus of historical traffic accident data; and associate a location of this road segment with a remote operator trigger accordingly in Block S 130 .
  • Block S 110 of the method recites accessing a specification for triggering manual control of autonomous vehicles; and Block S 120 of the method recites identifying a road segment, within a geographic region, exhibiting characteristics defined by the specification.
  • the remote computer system can: access a remote operator trigger specification defined by a human or generate a remote operator trigger specification based on historical autonomous vehicle operation and remote operator data in Block S 110 ; then scan a navigational map, autonomous vehicle data, and/or existing traffic data, etc. for a geographic region for locations associated with characteristics that match the remote operator trigger specification; and flag these locations for assignment of remote operator triggers.
  • the remote computer system accesses a manual list of characteristics of locations of remote operator triggers or automatically characterizes these locations based on available test period and/or accident data in Block S 110 ; and then scans a navigation map for discrete locations, intersections, and/or road segments that exhibit similar characteristics in Block S 120 .
  • the remote computer system can automatically populate the navigation map with remote operator triggers based on characteristics of roadways represented in the navigation map rather than specifically as a function of past manual control and/or accident locations.
  • the remote computer system further processes manual control data for autonomous vehicle test periods—described above—and extracts additional trends from these data, such as: autonomous vehicle direction; lane occupied by an autonomous vehicle; navigational action (e.g., turning, lane change, merging) performed before, during, and/or after an autonomous-to-manual-operation transition; times of day; local traffic conditions (e.g., vehicle traffic density and speed); lengths of road segments traversed during autonomous-to-manual-operation transitions; types and proximity of obstacles near an autonomous vehicle during an autonomous-to-manual-operation transition; etc. Based on these trends, the remote computer system can correlate various parameters—such as navigational action, intersection type, road segment type, etc.—to elected manual control of autonomous vehicles.
  • the remote computer system can implement pattern recognition, regression, or other techniques to correlate local operator manual control of autonomous vehicles to certain characteristics of intersections or road segments.
  • the remote computer system can identify discrete lane segments and navigational actions over which local human operators are likely to elect manual control of autonomous vehicles, such as: right turns exceeding 110°; navigating through railroad crossings; navigating through road construction; unprotected left turns; etc.
  • the remote computer system can then: scan the navigation map for road segments or intersections, etc. that exhibit substantially similar characteristics in Block S 120 ; and annotate the navigation map with remote operator triggers at these locations accordingly in Block S 130 , as described above.
  • the remote computer system can implement similar methods and techniques to correlate accidents with certain characteristics of intersections or road segments in Block S 110 and then scan and annotate the navigation map accordingly in Block S 120 and S 130 .
  • the remote computer system correlates unprotected left turns with above-average rates of manual control by local operators and/or above-average rates of accidents in Block S 120 . Accordingly, the remote computer system identifies all unprotected left turns represented in the navigational map and labels the corresponding locations as remote operator triggers in Block S 130 . The autonomous vehicle thus submits a request for manual assistance in Block S 150 upon approaching an unprotected left turn.
  • the remote computer system can also identify a correlation between unprotected left turns and manual control by local operators and/or above-average rates of accidents during high-traffic periods, when local traffic is moving at high speed, or during particular times of day.
  • the remote computer system can then annotate the navigation map with remote operator triggers—including temporal, traffic, and/or other local condition parameters—at locations of known unprotected left turns represented in the navigation map in Block S 130 .
  • the autonomous vehicle can submit a request for manual assistance in Block S 150 and then automatically transition to manual control by the remote operator, such as upon entering the corresponding unprotected left turn lane, in Block S 154 .
  • the autonomous vehicle can transition back to autonomous navigation.
  • if the autonomous vehicle determines that conditions specified by the remote operator trigger have not been met—based on data collected by the autonomous vehicle in real-time as it approaches this unprotected left turn—the autonomous vehicle can autonomously navigate through the unprotected left turn without remote operator assistance.
  • the remote computer system annotates the navigation map with remote operator triggers at locations of all known railroad crossings.
  • the remote computer system can also write a conditional traffic-related statement to remote operator triggers for these known railroad crossings, such as confirmation to request remote operator assistance if another vehicle is stopped in the autonomous vehicle's lane, on the other side of the railroad crossing, and within a threshold distance of the railroad crossing (e.g., three car lengths or twenty meters).
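The conditional railroad-crossing statement above reduces to a simple predicate over the autonomous vehicle's perception outputs. This sketch assumes boolean scene flags and a distance estimate are already available from the perception stack; the argument names are hypothetical.

```python
def should_request_assistance(stopped_vehicle_ahead, same_lane,
                              beyond_crossing, distance_to_crossing_m,
                              threshold_m=20.0):
    """Fire the railroad-crossing trigger only if another vehicle is
    stopped in the autonomous vehicle's lane, on the far side of the
    crossing, and within the threshold distance of the crossing."""
    return (stopped_vehicle_ahead and same_lane and beyond_crossing
            and distance_to_crossing_m <= threshold_m)
```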
  • the remote computer system can: derive a set of characteristics of the road segment; scan the navigation map—of the geographic region containing this road segment—for a second road segment exhibiting similar characteristics; associate a second location of the second road segment with a second remote operator trigger in Block S 130 ; and write this remote operator trigger to the navigation map (or other container).
  • the remote computer system can therefore automatically identify additional road segments—that may obligate remote manual operation over autonomous operation for deployed autonomous vehicles—in a geographic region, even if historical data for autonomous vehicle operation through these road segments is unavailable or limited, based on similarities between these additional road segments and road segments previously associated with remote operator triggers.
  • the remote computer system can derive a constellation of location-agnostic scene characteristics and/or autonomous vehicle characteristics exhibiting high correlation with autonomous-to-manual-operation transitions, autonomous vehicle disengagements, traffic accidents, etc. from historical data described above, such as by implementing regression or deep learning techniques.
  • the remote computer system can then define remote operator triggers for deployed autonomous vehicles based on this constellation of location-agnostic scenes and/or autonomous vehicle characteristics.
  • the remote computer system can define a constellation of location-agnostic scene characteristics and/or autonomous vehicle characteristics including: damp-to-wet road condition; solar offset window (e.g., within the range of +/ ⁇ 15° zenith and +/ ⁇ 20° azimuthal to the Sun); and pedestrian present.
  • for example, when an autonomous vehicle operating in an autonomous mode detects a pedestrian and falls within this solar offset window during or after rainfall, the autonomous vehicle can transmit a request for manual assistance to a remote operator.
  • Block S 140 of the method recites, at an autonomous vehicle, autonomously navigating along a route; and Block S 150 recites, at the autonomous vehicle, transmitting a request for manual assistance to the remote operator in response to approaching the location associated with the remote operator trigger.
  • the autonomous vehicle can implement autonomous navigation techniques to autonomously navigate from a start location (e.g., a pickup location specified by a rideshare user), along a route, toward a destination location (e.g., a dropoff location specified by the rideshare user). While navigating along this route, the autonomous vehicle can monitor its location and/or characteristics of a scene around the autonomous vehicle for conditions specified in a remote operator trigger, such as defined in a navigation map stored locally on the autonomous vehicle.
  • the autonomous vehicle can transmit a request for human assistance to a remote operator.
  • the autonomous vehicle can cede operational controls to a remote operator in Block S 154 until the autonomous vehicle passes the road segment or until autonomous control is returned to the autonomous vehicle by the remote operator.
  • the autonomous vehicle can identify a set of conditions (e.g., autonomous vehicle location and orientation, local conditions) that fulfill a remote operator trigger and, accordingly, automatically return a request for manual human assistance to a remote operator manager (or to a remote operator directly).
  • while autonomously navigating along a route that intersects a location of a remote operator trigger defined in the navigation map (or other container), the autonomous vehicle: estimates its time of arrival at this location; and then transmits a request for manual assistance to the remote operator manager in response to this time of arrival falling below a threshold duration (e.g., ten seconds; or five seconds when the autonomous vehicle is travelling at ten miles per hour, ten seconds at thirty miles per hour, and fifteen seconds at sixty miles per hour).
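The speed-dependent lead time described above can be sketched by interpolating between the example points (five seconds at ten miles per hour, ten at thirty, fifteen at sixty); the linear interpolation between table entries is an assumption, as the specification gives only discrete examples.

```python
def request_threshold_s(speed_mph, table=((10, 5), (30, 10), (60, 15))):
    """Lead time before arrival at a trigger location at which the
    request should fire; faster vehicles request assistance earlier."""
    speeds = [s for s, _ in table]
    leads = [t for _, t in table]
    if speed_mph <= speeds[0]:
        return leads[0]
    if speed_mph >= speeds[-1]:
        return leads[-1]
    # Linear interpolation between tabulated example points.
    for (s0, t0), (s1, t1) in zip(table, table[1:]):
        if s0 <= speed_mph <= s1:
            return t0 + (t1 - t0) * (speed_mph - s0) / (s1 - s0)


def should_request(eta_s, speed_mph):
    """Transmit the request once the estimated time of arrival falls
    below the speed-dependent threshold."""
    return eta_s < request_threshold_s(speed_mph)
```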
  • the remote computer system can define a remote operator trigger along a length of road segment in Block S 130 ; and the autonomous vehicle can automatically transmit a request for manual assistance to the remote operator manager in Block S 150 in response to entering this road segment.
  • the remote computer system can define a georeferenced boundary around a cluster of autonomous-to-manual-operation transitions and offset outwardly from a perimeter of this cluster by a trigger distance (e.g., 30 meters) and link this georeferenced boundary to a remote operator trigger for a road segment in Block S 130 .
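A lightweight stand-in for the georeferenced boundary described above is to test whether the vehicle lies within the trigger distance of any transition in the cluster, which approximates buffering the cluster's perimeter outward; a production system might instead buffer a convex hull with a geometry library. The haversine formula below is standard; the rest is an illustrative sketch.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def inside_trigger_boundary(vehicle, cluster, trigger_distance_m=30.0):
    """True if the vehicle is within the trigger distance of any
    recorded autonomous-to-manual transition in the cluster."""
    return any(haversine_m(*vehicle, *point) <= trigger_distance_m
               for point in cluster)
```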
  • the remote operator manager can: select a particular remote operator from a set of available remote operators; and then route sensor data—received from the autonomous vehicle in Block S 152 described below—to a remote operator portal associated with the remote operator, such as via a computer network. For example, upon receipt of a request for manual assistance from the autonomous vehicle responsive to a remote operator trigger, the remote operator manager can selectively reject the autonomous vehicle's request or connect the autonomous vehicle to an available remote operator based on a weight or priority associated with this remote operator trigger and based on current resource load (i.e., availability of remote operators).
  • the remote operator manager can implement resource allocation techniques to assign autonomous vehicles approaching locations of highest-priority remote operator triggers to available remote operators first, then assign autonomous vehicles approaching locations of lower-priority remote operator triggers to available remote operators up until a target resource load is met (e.g., 90% of remote operators are currently assisting autonomous vehicles).
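The priority-first resource allocation described above can be sketched with a max-heap over pending requests, admitting vehicles in descending trigger priority until the target resource load is reached. The request tuple shape and 90% default load target mirror the example; everything else is an assumption.

```python
import heapq


def assign_requests(requests, operators_total, operators_busy, target_load=0.9):
    """Assign pending requests to operators in descending trigger
    priority, stopping once the target resource load is reached.
    Each request is a (priority, vehicle_id) pair."""
    assigned, rejected = [], []
    # heapq is a min-heap, so negate priorities for max-first popping.
    heap = [(-priority, vehicle) for priority, vehicle in requests]
    heapq.heapify(heap)
    busy = operators_busy
    while heap:
        _, vehicle = heapq.heappop(heap)
        if busy < operators_total * target_load:
            assigned.append(vehicle)
            busy += 1
        else:
            rejected.append(vehicle)
    return assigned, rejected
```

Rejected vehicles would fall back to default behavior at the trigger location, such as slowing or selecting a conservative navigational action.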
  • the autonomous vehicle can serve a request for manual assistance directly to a remote operator.
  • a remote operator can be assigned a preselected set of autonomous vehicles currently in operation within a geographic region, and the remote operator can monitor low-resolution sensor data—streamed from these autonomous vehicles when operating within the geographic region—through her remote operator portal.
  • the autonomous vehicle can return a request for remote operator control and return high-resolution sensor data (e.g., lower compression, larger sized, and/or greater frame rate color camera data) directly to the remote operator's portal in Block S 150 .
  • the remote operator portal can surface a sensor feed from the autonomous vehicle, enable remote controls for the autonomous vehicle, and prompt the remote operator to remotely engage the autonomous vehicle.
  • the remote computer system can automatically: track an autonomous vehicle; generate a request for manual assistance for the autonomous vehicle when conditions of a remote operator trigger are met at the autonomous vehicle; and serve this request to a remote operator.
  • the autonomous vehicle can stream low-resolution sensor, perception, and/or telemetry data to the remote computer system throughout operation.
  • the remote computer system can then automatically queue a remote operator to assume manual control of the autonomous vehicle when telemetry data received from the autonomous vehicle indicates that the autonomous vehicle is approaching a location assigned a remote operator trigger and when low-resolution perception data (e.g., types and locations of objects detected in camera and/or LIDAR data recorded by the autonomous vehicle) received from the autonomous vehicle indicates that conditions of this remote operator trigger are met.
  • the autonomous vehicle, the remote operator manager, and/or the remote computer system can implement any other method or technique to selectively connect the autonomous vehicle to a remote operator responsive to a request for manual assistance based on a remote operator trigger.
  • the method further includes: Block S 152 , which recites transmitting sensor data to a remote operator portal associated with the remote operator; Block S 154 , which recites executing a navigational command received from the remote operator via the remote operator portal; and Block S 160 , which recites, in response to passing the location, resuming autonomous navigation along the planned route.
  • the autonomous vehicle can serve data—such as raw sensor, perception, and/or telemetry data sufficient for enabling a remote operator to efficiently and reliably trigger a navigational action or assume manual control of the autonomous vehicle—to a remote operator.
  • the autonomous vehicle can then: execute commands received from the remote operator in Block S 154 in order to navigate through or past the road segment linked to the remote operator trigger; and transition back to autonomous operation in Block S 160 upon exiting this road segment and/or upon confirmation from the remote operator to resume autonomous navigation, as shown in FIGS. 2 and 4 .
  • the autonomous vehicle can stream raw sensor data, perception data (e.g., perception of a scene around the autonomous vehicle derived from raw sensor data recorded through sensors in the autonomous vehicle), and/or telemetry data to the remote operator portal in real-time over a wireless computer network following the initial remote operation time.
  • the autonomous vehicle can concurrently transition control of some or all actuators in the autonomous vehicle to the remote operator portal.
  • the autonomous vehicle (or the remote operator manager) enables a binary control function of the autonomous vehicle at the remote operator's portal, such as including: a confirm function to trigger the autonomous vehicle to execute a preselected navigational action (e.g., enter an intersection or execute a left turn through the road segment associated with the remote operator trigger); and a delay function to delay execution of this preselected navigational action.
  • the remote computer system writes a remote operator trigger to an unprotected left turn in the navigation map; and assigns a binary control function—including navigational action confirmation and delay options—to this remote operator trigger in Block S 130 .
  • the autonomous vehicle can query the remote operator manager for remote manual control according to these binary control functions in Block S 150 .
  • the autonomous vehicle can stream sensor data to the remote operator manager in Block S 152 , such as: color camera feeds from forward-, left-, and right-facing cameras on the autonomous vehicle; composite point clouds containing concurrent data output by multiple LIDAR sensors on the autonomous vehicle; telemetry data; and/or vehicle speed, braking position, and accelerator position data; etc.
  • the remote operator manager can then distribute these sensor feeds to the operator portal associated with this remote operator.
  • the autonomous vehicle can autonomously slow to a stop just ahead of this intersection while awaiting a command from the remote operator.
  • the remote operator portal can render these sensor data for the remote operator in (near) real-time and enable binary controls for the autonomous vehicle.
  • the remote operator can submit confirmation to execute the planned left turn action; upon receipt of confirmation to execute the planned left turn action, the autonomous vehicle can resume autonomous execution of its planned route, including entering the intersection ahead and autonomously executing the left turn, in Blocks S 154 and S 160 .
  • the autonomous vehicle can: slow to a stop in response to approaching a location associated with a remote operator trigger; transmit a request for manual confirmation to resume autonomous navigation along the route as the autonomous vehicle slows upon approach to this location; and then resume autonomous navigation along this route—past the location specified by the remote operator trigger—in response to receipt of binary, manual confirmation from the remote operator.
  • the remote computer system can assign multiple possible navigational actions—such as “delay,” “sharp left,” “sweeping left,” “slow left,” and/or “fast left”—to a remote operator trigger in Block S 130 .
  • the autonomous vehicle can transmit a request to the remote operator manager for manual assistance; and the remote operator manager can assign the autonomous vehicle to a remote operator and enable selection of navigational actions specified by this remote operator trigger at this remote operator's portal.
  • the autonomous vehicle can then execute navigational actions selected by the remote operator via the remote operator portal in Block S 154 .
  • the autonomous vehicle can return to full autonomous operation in Block S 160 .
  • the remote computer system assigns full manual control of an autonomous vehicle—such as including control of brake, accelerator, and steering actuators in the autonomous vehicle—to a remote operator trigger in Block S 130 .
  • when an autonomous vehicle approaches the location specified in this remote operator trigger while autonomously navigating along a planned route, the autonomous vehicle can request assistance from a remote operator in Block S 150 .
  • once the remote operator manager assigns the autonomous vehicle to a remote operator, the autonomous vehicle, the remote computer system, and the remote operator's portal can cooperate to transition real-time drive-by-wire controls of brake, accelerator, and steering positions in the autonomous vehicle to the remote operator portal.
  • the autonomous vehicle can stream sensor data to the remote operator manager for distribution to the remote operator portal in Block S 152 ; and the autonomous vehicle and the computer system can cooperate to transition from 100% autonomous/0% manual control of actuators in the autonomous vehicle to 0% autonomous/100% manual control of these actuators over a period of time (e.g., four seconds).
  • the remote operator can thus assume full manual control of the autonomous vehicle—such as via a joystick or other interface connected to the remote operator portal—and remotely navigate the autonomous vehicle through the location or road segment associated with this remote operator trigger.
  • the autonomous vehicle and the remote computer system can cooperate to transition from 0% autonomous/100% manual control back to 100% autonomous/0% manual control, such as instantaneously or over a period of time (e.g., two seconds) in Block S 160 .
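The gradual authority handoff described in the bullets above can be sketched as a linear blend of autonomous and remote-operator actuator commands over the ramp duration; the linear ramp shape is an assumption, since the specification states only the endpoints and example durations.

```python
def control_blend(t_s, ramp_s=4.0, to_manual=True):
    """Fraction of actuator authority held by the remote operator at
    time t_s into a handoff ramp: 0 -> 1 over ramp_s seconds when
    handing control to the operator, 1 -> 0 when handing it back."""
    fraction = min(max(t_s / ramp_s, 0.0), 1.0)
    return fraction if to_manual else 1.0 - fraction


def blended_command(autonomous_cmd, manual_cmd, manual_fraction):
    """Mix autonomous and remote-operator commands for one actuator
    (e.g., normalized brake, accelerator, or steering position)."""
    return (1.0 - manual_fraction) * autonomous_cmd + manual_fraction * manual_cmd
```

With `ramp_s=4.0` for the handoff and `ramp_s=2.0` with `to_manual=False` for the return, this reproduces the four-second and two-second transitions in the example.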
  • in response to confirmation of manual assistance from the remote operator, the autonomous vehicle can transfer braking, acceleration, and steering controls of the autonomous vehicle to the remote operator portal; and then execute braking, acceleration, and/or steering commands received from the remote operator portal in Block S 154 .
  • the autonomous vehicle can then: cease transmission of sensor data to the remote operator portal; and resume autonomous navigation along its assigned route after passing the location and/or in response to receipt of confirmation from the remote operator to resume autonomous navigation in Block S 160 .
  • the remote computer system (or the remote operator portal) identifies multiple autonomous vehicles scheduled or anticipated to approach a location of a remote operator trigger within a short period of time and then assigns a single remote operator to manually assist each of these autonomous vehicles as they sequentially traverse this location.
  • the remote computer system can group a string of five autonomous vehicles (out of eight total vehicles) in-line at an unprotected left turn and assign manual control of these vehicles to a single remote operator; the remote operator can then manually confirm execution of a left turn action for each autonomous vehicle in the group individually or for the group of autonomous vehicles as a whole.
  • the remote computer system can reduce cognitive load on the remote operator by continuing to assign autonomous vehicles approaching this location or road segment to this same remote operator, such as within a short, contiguous duration of time.
  • the remote computer system can assign the same remote operator to multiple autonomous vehicles passing through a particular remote operator trigger location within a limited period of time in order to enable the remote operator to make rapid, higher-accuracy navigational decisions for these autonomous vehicles and with less cognitive load.
  • the remote computer system can transition the remote operator to assist other autonomous vehicles passing or approaching other locations or road segments in the geographic region associated with remote operator triggers.
  • the remote computer system can dedicate a particular remote operator trigger to a single remote operator, such as over the full duration of this remote operator's shift. Therefore, in this variation, the remote computer system can assign this particular remote operator to assist each autonomous vehicle that approaches the location specified by this remote operator trigger over this period of time.
  • the remote computer system can cooperate with an autonomous vehicle in any other way to selectively and intermittently enable manual control of the autonomous vehicle by a remote operator as the autonomous vehicle approaches and navigates past a remote operator trigger location defined in a navigation map.
  • the remote computer system (or autonomous vehicles in the fleet) can aggregate: sensor data (e.g., camera, LIDAR, and telemetry data) recorded by autonomous vehicles when approaching, entering, and passing locations or road segments specified by remote operator triggers; remote operator commands returned to these autonomous vehicles while responding to autonomous vehicle requests for manual assistance; and results of execution of these commands by these autonomous vehicles (e.g., whether an autonomous vehicle collided with another object, proximity of other objects to the autonomous vehicle during execution of navigational commands received from a remote operator).
  • the remote computer system can then implement deep learning, artificial intelligence, regression, and/or other methods and techniques to refine an autonomous navigation model based on these data.
  • the remote computer system can implement deep learning, artificial intelligence, or similar techniques to retrain an autonomous navigation (or “path planning”) model based on sensor data, remote operator commands, and navigation results recorded near locations of remote operator triggers.
  • the remote computer system can retrain the autonomous navigation model to elect a navigational action, an autonomous vehicle trajectory, and/or autonomous vehicle actuator positions more rapidly and/or more accurately at these remote operator trigger locations based on scene and autonomous vehicle characteristics near these locations and navigational commands issued by remote operators when guiding these autonomous vehicles through these remote operator trigger locations.
  • the remote computer system can then push this retrained (or “updated,” “revised”) autonomous navigation model to deployed autonomous vehicles, which can then implement this autonomous navigation model when operating autonomously, thereby reducing need for remote operators to manually assist these autonomous vehicles near remote operator trigger locations.
  • the remote computer system can transition remote operator triggers from full remote manual control to binary control—as described above—given that autonomous vehicles executing updated autonomous navigation models may be increasingly better suited to quickly and accurately select next navigational actions when approaching these remote operator trigger locations.
  • the remote computer system can reduce involvement and resource load of remote operators tasked with remotely assisting these autonomous vehicles over time.
  • the remote computer system can: record a corpus of sensor data received from an autonomous vehicle following a request for manual assistance as the autonomous vehicle approaches a remote operator trigger location; record a navigational command entered by a remote operator assigned to this autonomous vehicle and served to the autonomous vehicle responsive to this request for manual assistance; generate a revised autonomous navigation model based on this corpus of sensor data and the navigational command; and load this revised autonomous navigation model onto the autonomous vehicle—and other autonomous vehicles in the fleet.
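The data-collection step above amounts to pairing each logged sensor frame with the operator command issued at that step, filtered by outcome. This is a deliberately simplified sketch; the record fields, the collision-based filter, and the frame-command pairing are illustrative assumptions about how such a retraining corpus might be assembled.

```python
from dataclasses import dataclass


@dataclass
class AssistanceRecord:
    """One remote-assistance episode logged for model retraining."""
    trigger_location: tuple
    sensor_frames: list
    operator_commands: list
    collision: bool


def build_training_set(records):
    """Pair sensor frames with the operator command issued at each step,
    keeping only episodes that ended without a collision, so the revised
    navigation model learns from successful remote interventions."""
    examples = []
    for record in records:
        if record.collision:
            continue
        examples.extend(zip(record.sensor_frames, record.operator_commands))
    return examples
```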
  • the remote computer systems and methods described herein can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a human annotator computer or mobile device, wristband, smartphone, or any suitable combination thereof.
  • Other systems and methods of the embodiment can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions can be executed by computer-executable components integrated with apparatuses and networks of the type described above.
  • the computer-readable instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any other suitable device.
  • the computer-executable component can be a processor but any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
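The collect, retrain, and push loop outlined in the bullets above can be caricatured as follows. This is a minimal sketch only: the "model" is a trivial lookup table standing in for the autonomous navigation model, and every name and data structure here is illustrative rather than drawn from the method itself.

```python
# Hedged sketch of the collect-retrain-push loop described above. The
# model, training routine, and fleet interface are placeholders; a real
# system would retrain the fleet's autonomous navigation model.

def collect_example(sensor_corpus, operator_command):
    """Pair sensor data preceding a request with the command served."""
    return {"inputs": sensor_corpus, "label": operator_command}

def retrain(model, examples):
    """Toy 'retraining': memorize the operator command for each scene key."""
    for ex in examples:
        model[ex["inputs"]] = ex["label"]
    return model

def push_to_fleet(fleet, model):
    """Load a copy of the revised model onto each deployed vehicle."""
    for vehicle in fleet:
        vehicle["model"] = dict(model)

model, fleet = {}, [{"id": "av_1"}, {"id": "av_2"}]
examples = [collect_example("five_way_scene", "yield_then_proceed")]
push_to_fleet(fleet, retrain(model, examples))
print(fleet[0]["model"]["five_way_scene"])  # yield_then_proceed
```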

Abstract

One variation of a method for transferring control of an autonomous vehicle to a remote operator includes: accessing a specification for triggering manual control of autonomous vehicles; identifying a road segment, within a geographic region, exhibiting characteristics defined by the specification; and associating a location of the road segment, represented in a navigation map, with a remote operator trigger. The method also includes, at the autonomous vehicle operating within the geographic region: autonomously navigating along a route; transmitting a request for manual assistance to the remote operator in response to approaching the location associated with the remote operator trigger; transmitting sensor data to a remote operator portal associated with the remote operator; executing a navigational command received from the remote operator via the remote operator portal; and resuming autonomous navigation along the route after passing the location.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This Application claims the benefit of U.S. Provisional Application No. 62/592,806, filed on 30 Nov. 2017, which is incorporated in its entirety by this reference.
  • TECHNICAL FIELD
  • This invention relates generally to the field of autonomous vehicles and more specifically to a new and useful method for transferring control of an autonomous vehicle to a remote operator in the field of autonomous vehicles.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a flowchart representation of a method;
  • FIG. 2 is a flowchart representation of one variation of the method;
  • FIGS. 3A, 3B, and 3C are flowchart representations of variations of the method; and
  • FIG. 4 is a flowchart representation of one variation of the method.
  • DESCRIPTION OF THE EMBODIMENTS
  • The following description of embodiments of the invention is not intended to limit the invention to these embodiments but rather to enable a person skilled in the art to make and use this invention. Variations, configurations, implementations, example implementations, and examples described herein are optional and are not exclusive to the variations, configurations, implementations, example implementations, and examples they describe. The invention described herein can include any and all permutations of these variations, configurations, implementations, example implementations, and examples.
  • 1. METHOD
  • As shown in FIGS. 1, 2, and 3A, a method for transferring control of an autonomous vehicle to a remote operator includes, at a computer system: accessing a corpus of driving records of a fleet of autonomous vehicles operating within a geographic region in Block S110; identifying a road segment, within the geographic region, associated with a frequency of transitions, from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet, that exceeds a threshold frequency based on the corpus of driving records in Block S120; and associating a location of the road segment, represented in a navigation map, with a remote operator trigger in Block S130. The method also includes, at the autonomous vehicle operating within the geographic region: autonomously navigating along a route in Block S140; transmitting a request for manual assistance to the remote operator in Block S150 in response to approaching the location associated with the remote operator trigger; transmitting sensor data to a remote operator portal associated with the remote operator in Block S152; executing a navigational command received from the remote operator via the remote operator portal in Block S154; and resuming autonomous navigation along the route after passing the location in Block S160.
  • One variation of the method shown in FIG. 3C further includes, at the remote computer system: accessing a corpus of historical traffic accident data of human-operated vehicles involved in traffic accidents within a geographic region in Block S110; identifying a road segment, within the geographic region, associated with a frequency of traffic accidents that exceeds a threshold frequency based on the corpus of historical traffic accident data in Block S120; and associating a location of the road segment, represented in a navigation map, with a remote operator trigger in Block S130.
  • Another variation of the method shown in FIGS. 3A, 3B, and 3C further includes, at the remote computer system: accessing a specification for triggering manual control of autonomous vehicles in Block S110; identifying a road segment, within a geographic region, exhibiting characteristics defined by the specification in Block S120; and associating a location of the road segment, represented in a navigation map, with a remote operator trigger in Block S130.
  • 2. APPLICATIONS
  • Generally, Blocks of the method can be executed by a computer system (e.g., a computer network, a remote server) to preemptively annotate a navigation map with locations of remote operator triggers based on various existing data, such as: human-supervised autonomous vehicle test data; operating data recorded by autonomous vehicles while operating autonomously; accident data from human-operated vehicles; and/or characteristics of roads or intersections flagged for manual control. While autonomously navigating a planned route, an autonomous vehicle can execute other Blocks of the method to: automatically request remote operator assistance as the autonomous vehicle approaches a location of a remote operator trigger indicated in the navigation map; automatically cede decision-making or full operational control of the autonomous vehicle to a remote human operator; execute navigational commands received from the remote human operator to navigate through this location; and then resume autonomous operation upon passing this location or upon confirmation from the remote human operator to resume autonomous operation.
  • In particular, the remote computer system can access various historical data, such as: locations over which local human operators occupying autonomous vehicles took manual control of their autonomous vehicles (e.g., during autonomous vehicle testing); locations at which autonomous vehicles, operating autonomously, unexpectedly disengaged (e.g., due to an autonomous operation failure or inability to verify a next navigational action); and/or locations (and severity, cost) of accidents involving human-operated vehicles; etc. within a geographic region. Based on these historical data, the remote computer system can isolate discrete locations, intersections, lanes, and/or other road segments at which an autonomous vehicle may be at greater risk for collision with other vehicles, may be delayed in executing a next navigational action, or may execute a next navigational action with reduced confidence. The remote computer system can then populate a navigation map (or a localization map, a table, or other container) with remote operator triggers and related trigger parameters at geospatial locations of these flagged road segments.
  • For example, the remote computer system can: generate a heatmap of frequencies of manual control selections, autonomous vehicle disengagements, and/or traffic accidents per instance of traversal by a vehicle throughout the geographic region over a period of time; identify discrete geospatial locations or small geospatial areas within the heatmap exhibiting greatest frequencies of manual control selections, autonomous vehicle disengagements, and/or traffic accidents per instance of traversal by a vehicle; write a remote operator flag to the navigation map at each of these discrete geospatial locations or small geospatial areas; and push this navigation map (or a navigation map update) to each autonomous vehicle deployed to this geographic region. In this example, the remote computer system can also derive correlations between local conditions and these instances of manual control selections, autonomous vehicle disengagements, and/or traffic accidents—such as: time of day; local weather conditions; and an autonomous vehicle entering uncommon (e.g., five-way) intersections, entering railroad crossings, facing into the Sun, entering a school zone, nearing a large crowd of pedestrians, or approaching an unprotected left turn; etc. The remote computer system can then write these conditions to corresponding remote operator triggers in the navigation map (or localization map, table, or other container) in the form of trigger parameters.
  • During autonomous operation, an autonomous vehicle can: reference a localization map to determine its geospatial location; and reference the navigation map to elect and then execute navigational actions, such as accelerating, braking, turning, changing lanes, etc. along a planned route toward a specified destination. As the autonomous vehicle approaches a location of a remote operator trigger indicated by the navigation map (or in the localization map, table, or other container), the autonomous vehicle can automatically transmit a request for manual assistance to a remote operator (or to a remote operator manager more generally). Once a remote operator is assigned to assist the autonomous vehicle in navigating through this location, the autonomous vehicle can transition from autonomous navigation to remote manual control by the remote operator and can transmit (or “stream”) video, LIDAR, and/or other sensor data to the remote operator portal associated with the remote operator in real-time. The remote operator can view these sensor data through her remote operator portal and elect to: delay a navigational action (e.g., in the autonomous vehicle's queue); confirm a navigational action; select from a predefined set of navigational actions; or manually adjust brake, accelerator, and/or steering positions accordingly. The autonomous vehicle can then transition back to full autonomous operation and resume full autonomous navigation along the planned route, such as: once the autonomous vehicle has moved past the location (or intersection, lane, and/or other road segment) linked to this remote operator trigger; or once the remote operator has confirmed—via the remote operator portal—transition back to autonomous operation.
  • For example, emergency scenario or accident data for training an autonomous vehicle solution may not be immediately available without involving autonomous vehicles (or vehicles outfitted with similar sensor suites) in a variety of different accidents while collecting sensor data from these autonomous vehicles. Therefore, an autonomous vehicle solution may not be trained to detect and respond to possible emergency scenarios or to detect and respond to emergency scenarios in which it is directly involved, such as: occupying a railroad crossing as a train approaches; navigating past a vehicle that has crossed into oncoming traffic near the autonomous vehicle; or approaching a large animal crossing a road ahead of the autonomous vehicle. In order to preemptively handle the possibility of such emergency scenarios throughout a geographic region, the remote computer system can: identify discrete locations, intersections, lanes, or other road segments at which emergency scenarios are particularly likely to occur (e.g., locations associated with transition to manual control by local human operators while occupying these autonomous vehicles, locations associated with accident frequencies that substantially exceed a threshold, average, or baseline value); and then annotate a navigation map or other container with remote operator triggers at corresponding locations. An autonomous vehicle approaching a location associated with a remote operator trigger can automatically and preemptively request assistance from a remote operator and serve sensor data to this remote operator prior to (e.g., ten seconds before) the autonomous vehicle's arrival at this flagged location, thereby enabling the remote operator to quickly perceive the scene around the autonomous vehicle and reliably assume manual control of the autonomous vehicle prior to the autonomous vehicle executing a higher-risk navigational action or disengaging due to a failure at the flagged location. 
A remote operator manager can also dynamically and predictively allocate remote human operators to assist autonomous vehicles approaching locations of remote operator triggers indicated in the navigation map as these autonomous vehicles operate (e.g., execute routes) within a geographic region. Altogether, the remote computer system, remote operator portal, and fleet of autonomous vehicles can cooperate to annotate a navigation map with locations of remote operator triggers and to implement this navigation map in order to reduce risk to autonomous vehicles entering known higher-risk scenarios and in order to maintain high operating efficiency for these autonomous vehicles.
  • In particular, the remote computer system can preemptively identify higher-risk road segments, road segments in which autonomous vehicles may be unable to detect and avoid risk, or road segments in which autonomous vehicles may be unable to confidently elect a next navigational action and to label a navigational map (or other container) with remote operator triggers at corresponding locations. An autonomous vehicle (or the remote computer system) can then automatically trigger a remote operator to assume control of the autonomous vehicle and to assist navigation of the autonomous vehicle as the autonomous vehicle approaches a road segment linked to a remote operator trigger in the navigation map in order to: reduce risk of collision with other vehicles or obstacles nearby; and/or maintain a high operating efficiency of the autonomous vehicle.
  • 3. AUTONOMOUS VEHICLE AND SENSOR SUITE
  • Block S110 of the method recites, during a scan cycle, recording multi-dimensional sensor images at multi-dimensional sensors arranged on the vehicle. Generally, in Block S110, an autonomous vehicle accesses sensor data from various sensors arranged on or integrated in the autonomous vehicle—such as distance scans from multiple LIDAR sensors and/or color 2D images from multiple color cameras—recorded approximately concurrently by sensors defining fields of view exhibiting some overlap over a distance range from the autonomous vehicle.
  • In one implementation, the autonomous vehicle includes: a suite of sensors configured to collect information about the autonomous vehicle's environment; local memory that stores a navigation map defining lane connections and nominal vehicle paths for a road area and a localization map that the autonomous vehicle implements to determine its location in real space; and a controller that governs actuators within the autonomous vehicle to execute various functions based on the navigation map, the localization map, and outputs of these sensors. In one implementation, the autonomous vehicle includes a set of 360° LIDAR sensors arranged on the autonomous vehicle, such as one LIDAR sensor mounted at each corner of the autonomous vehicle or a set of LIDAR sensors integrated into a roof rack mounted to the roof of the autonomous vehicle. Each LIDAR sensor can output one three-dimensional distance scan—such as in the form of a 3D point cloud representing distances between the LIDAR sensor and external surfaces within the field of view of the LIDAR sensor—per rotation of the LIDAR sensor (i.e., once per scan cycle).
  • The autonomous vehicle can also be outfitted (or retrofit) with additional sensors, such as: color cameras; 3D color cameras; a uni-dimensional or multi-dimensional (e.g., scanning) RADAR or infrared distance sensor; etc. The autonomous vehicle can implement similar methods and techniques to read data from these sensors.
  • The autonomous vehicle can then: identify (or “perceive”) mutable objects nearby from these sensor data; regularly compare these data to features represented in a localization map in order to determine its location and orientation in real space; and identify a lane occupied by the autonomous vehicle, a local speed limit, a next navigational action, and/or proximity of a remote operator trigger location, etc. based on the autonomous vehicle's location and orientation and data stored in a navigation map. By regularly implementing these methods and techniques in conjunction with a planned route, the autonomous vehicle can autonomously navigate toward a destination location in Block S140.
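The navigation-map lookup described above, by which the autonomous vehicle checks its proximity to a remote operator trigger location while navigating, can be sketched as below. This is a minimal illustration under stated assumptions: a planar (lat, lon) pose, a hypothetical trigger annotation format, and a fixed lookahead distance, none of which are specified by the method itself.

```python
import math

# Hypothetical sketch: check whether the vehicle's pose falls within a
# lookahead distance of any remote operator trigger location annotated
# in the navigation map. Names and structures are illustrative only.

def distance_m(a, b):
    # Planar approximation: meters per degree of latitude/longitude,
    # valid for short distances over a small area.
    dlat = (a[0] - b[0]) * 111_320.0
    dlon = (a[1] - b[1]) * 111_320.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def approaching_trigger(pose, triggers, lookahead_m=150.0):
    """Return the first trigger within lookahead_m of the pose, else None."""
    for trig in triggers:
        if distance_m(pose, trig["location"]) <= lookahead_m:
            return trig
    return None

triggers = [{"location": (37.4220, -122.0841), "id": "five_way_intersection"}]
print(approaching_trigger((37.4221, -122.0841), triggers))  # within ~11 m
```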
  • 4. REMOTE OPERATOR TRIGGER LOCATIONS BY AUTONOMOUS VEHICLE TEST DATA
  • Block S110 of the method recites accessing a corpus of driving records of a fleet of autonomous vehicles operating within a geographic region; Block S120 of the method recites identifying a road segment, within the geographic region, associated with a frequency of transitions, from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet, that exceeds a threshold frequency based on the corpus of driving records; and Block S130 of the method recites associating a location of the road segment, represented in a navigation map, with a remote operator trigger. Generally, in Blocks S110, S120, and S130, the remote computer system can: access operational data collected from autonomous vehicles occupied by local human operators (e.g., "test drivers") during autonomous vehicle test periods on public roads; extract manual operator trends—such as location and/or characteristics of adjacent road segments at times of manually-triggered transition from autonomous operation to manual operation—from these operational data; and then define remote operator triggers at road segments associated with higher frequencies of manually-triggered transition (and at locations exhibiting similarities to road segments associated with higher frequencies of manually-triggered transition), as shown in FIGS. 1 and 3A.
  • An autonomous vehicle solution may be tested on public roads, such as over hundreds, thousands, or millions of miles. A human operator occupying an autonomous vehicle during a test period may manually transition the autonomous vehicle from autonomous operation (e.g., an “autonomous mode”) to manual operation (e.g., a “manual mode”), such as when the autonomous vehicle approaches a difficult intersection or in the presence of an unexpected obstacle (e.g., a vehicle, a pedestrian, an animal) near or in the path of the autonomous vehicle. The autonomous vehicle (and/or the remote computer system) can record characteristics of such instances of human-triggered transitions to manual control, such as including: locations; times of day; local traffic conditions; constellations of detected obstacles nearby; lanes occupied by autonomous vehicles; road characteristics (e.g., road surface quality, wetness, color, reflectivity); weather conditions; and/or position of the autonomous vehicle relative to the Sun, Sun intensity, or sensor obscuration due to sunlight; etc. at (and slightly before) local human operators triggered these autonomous-to-manual-operation transitions. The remote computer system can then aggregate these data in a remote database over time.
  • The remote computer system can then analyze these autonomous-to-manual-operation transitions and related data to isolate road segments and local conditions likely to necessitate remote manual control. In particular, a (significant) proportion of these autonomous-to-manual-operation transitions may be arbitrary (e.g., anomalous, haphazard). However, locations, times of day, local traffic conditions, and/or other conditions of some of these autonomous-to-manual-operation transitions may repeat with relatively high frequency over time. The remote computer system can therefore: aggregate locations of these autonomous-to-manual-operation transitions occurring during road test periods throughout a geographic region over time; and identify road segments over which local human operators commonly transition their autonomous vehicles from autonomous operation to manual control in Block S120, such as with greater absolute frequency, greater frequency per instance the road segment is traversed, or greater frequency per unit time.
  • 4.1 EXAMPLE: GEOSPATIAL PROXIMITY
  • For example, in Block S110, the remote computer system can access geospatial locations of instances of transition from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in a fleet of autonomous vehicles over time (e.g., during test periods within a geographic region prior to deployment of this fleet of autonomous vehicles for full autonomous operation within this geographic region). The remote computer system can then aggregate instances of transition from autonomous operation to manual operation across this fleet of autonomous vehicles over time into a set of groups based on geospatial proximity of these transitions. For each group in this set of groups, the remote computer system can: calculate a frequency of autonomous-to-manual-operation transitions along a particular road segment—containing geospatial locations of autonomous-to-manual-operation transitions in this group—such as based on a ratio of total quantity of transitions in this group to quantity of instances of autonomous vehicles in the fleet traversing this road segment; and then flag this road segment if this frequency of transitions exceeds a threshold frequency (e.g., 30%) in Block S120. The remote computer system can then write a remote operator trigger to each of these flagged road segments in a navigation map for this geographic region.
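The grouping-and-thresholding step above can be sketched as follows. As an illustrative stand-in for grouping by geospatial proximity, transition locations are quantized into coarse grid cells; the grid pitch, data shapes, and threshold value are assumptions, not elements drawn from the method.

```python
from collections import defaultdict

# Illustrative sketch only: cluster transition locations into ~100 m grid
# cells as a stand-in for "geospatial proximity," then flag any cell whose
# transitions-per-traversal ratio exceeds the threshold frequency.

GRID = 0.001  # grid pitch in degrees (roughly 100 m of latitude)

def bucket(loc):
    return (round(loc[0] / GRID), round(loc[1] / GRID))

def flag_road_segments(transition_locs, traversal_counts, threshold=0.30):
    """Return grid cells whose transition frequency exceeds threshold.

    transition_locs: (lat, lon) of each autonomous-to-manual transition
    traversal_counts: grid cell -> number of fleet traversals of that cell
    """
    groups = defaultdict(int)
    for loc in transition_locs:
        groups[bucket(loc)] += 1
    return {cell for cell, n in groups.items()
            if traversal_counts.get(cell, 0) and n / traversal_counts[cell] > threshold}

cell = bucket((37.4220, -122.0841))
locs = [(37.4220, -122.0841)] * 4            # four transitions in one cell
print(flag_road_segments(locs, {cell: 10}))  # 4/10 = 0.4 > 0.3: flagged
```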
  • 4.2 EXAMPLE: TEMPORAL PROXIMITY
  • Furthermore, in the foregoing example, the remote computer system can: access times of these instances of transition from autonomous operation to manual operation; and aggregate these autonomous-to-manual-operation transitions into groups further based on temporal proximity (e.g., occurring during the same day of the week and/or during similar times of day). For each group in this set, the remote computer system can: flag a road segment—containing geospatial locations of autonomous-to-manual-operation transitions in this group—if the frequency of transitions along this road segment within a time window represented in this group exceeds a threshold frequency in Block S120; and then write a remote operator trigger with a constraint of this time window to this flagged road segment in the navigation map in Block S130. In this example, the remote computer system can thus limit a remote operator trigger for a road segment flagged in the navigation map according to a time window; and the autonomous vehicle can transmit a request for manual assistance to a remote operator only upon approaching this road segment during the time window defined in this remote operator trigger.
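A time-window constraint of this kind might be checked as below; the trigger schema and the (start, end) local-time window representation are assumptions for illustration only.

```python
from datetime import time

# Sketch of a time-window-constrained trigger check, assuming each
# trigger carries an optional (start, end) local-time window.

def trigger_active(trigger, now):
    """True if the trigger has no time window or `now` falls inside it."""
    window = trigger.get("time_window")
    if window is None:
        return True
    start, end = window
    return start <= now <= end

# A trigger limited to weekday-morning congestion at a flagged road segment:
rush_hour = {"location": (37.4220, -122.0841),
             "time_window": (time(7, 0), time(9, 30))}
print(trigger_active(rush_hour, time(8, 15)))  # True: request manual assistance
print(trigger_active(rush_hour, time(13, 0)))  # False: remain autonomous
```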
  • 4.3 EXAMPLE: SCENE CHARACTERISTICS
  • In a similar example shown in FIG. 3A, the remote computer system accesses both: geospatial locations of autonomous-to-manual-operation transitions triggered by local operators occupying autonomous vehicles in the fleet; and scene characteristics (e.g., local traffic conditions, constellations of obstacles nearby, road surface quality, road wetness, road color, road reflectivity, local weather conditions) proximal autonomous vehicles during these autonomous-to-manual-operation transitions in Block S110. The remote computer system then aggregates autonomous-to-manual-operation transitions into a set of groups based on both geospatial proximity and similarity of scene characteristics proximal autonomous vehicles during these transitions. For each group in this set, the remote computer system can: flag a road segment—containing geospatial locations of autonomous-to-manual-operation transitions in this group—if the frequency of transitions occurring along this road segment concurrently with a particular scene characteristic representative of this group exceeds a threshold frequency in Block S120; and then write a remote operator trigger with a constraint of this particular scene characteristic to this flagged road segment in the navigation map in Block S130. In this example, the remote computer system can thus limit a remote operator trigger for a road segment flagged in the navigation map according to a scene characteristic (or a constellation of scene characteristics); and the autonomous vehicle can transmit a request for manual assistance to a remote operator only upon detecting this scene characteristic (or a constellation of scene characteristics) when approaching this road segment.
  • 4.4 EXAMPLE: OBFUSCATION BY SOLAR RADIATION
  • In a similar example, the remote computer system can: access offsets between anteroposterior axes of autonomous vehicles and the Sun during autonomous-to-manual-operation transitions in Block S110; identify a group of autonomous-to-manual-operation transitions occurring at geospatial locations along a road segment concurrent with solar offsets—between anteroposterior axes of autonomous vehicles and the Sun—that fall within a solar offset window in Block S120; write a remote operator trigger to this road segment in the navigation map if the frequency of autonomous-to-manual-operation transitions in this group exceeds a threshold frequency in Block S130; and then limit this remote operator trigger according to this solar offset window in Block S130. The remote computer system can similarly calculate this solar offset window based on positions of autonomous vehicles relative to the Sun when solar radiation overwhelmed sensors (e.g., color cameras, LIDAR sensors) in these autonomous vehicles, such as along this road segment, and associate a remote operator trigger and solar offset window with this road segment accordingly. Later, as an autonomous vehicle approaches this road segment, the autonomous vehicle can transmit a request for manual assistance to a remote operator if the offset between the anteroposterior axis of the autonomous vehicle and the Sun falls within this solar offset window.
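The solar offset check could be implemented as below. In practice the Sun's azimuth would come from an ephemeris for the vehicle's location and time; here it is passed in directly, and the offset window bounds are illustrative values rather than figures from the method.

```python
# Illustrative check of the solar offset constraint: a request is sent
# only if the angle between the vehicle's anteroposterior axis (its
# heading) and the Sun's azimuth falls inside the flagged offset window.

def solar_offset_deg(heading_deg, sun_azimuth_deg):
    """Smallest absolute angle between heading and Sun azimuth (0-180)."""
    diff = abs(heading_deg - sun_azimuth_deg) % 360.0
    return min(diff, 360.0 - diff)

def solar_trigger_active(heading_deg, sun_azimuth_deg, window=(0.0, 25.0)):
    """True when the Sun sits nearly ahead of the vehicle (within window)."""
    lo, hi = window
    return lo <= solar_offset_deg(heading_deg, sun_azimuth_deg) <= hi

print(solar_trigger_active(90.0, 100.0))  # Sun 10 degrees off the nose: True
print(solar_trigger_active(90.0, 270.0))  # Sun directly behind: False
```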
  • 4.5 EXAMPLE: PEDESTRIANS
  • In a similar example, the remote computer system can: detect presence (e.g., quantities) of pedestrians proximal autonomous vehicles during autonomous-to-manual-operation transitions in Block S110; identify a group of autonomous-to-manual-operation transitions occurring at geospatial locations along a road segment concurrent with presence of a minimum quantity (or a range) of pedestrians in Block S120; write a remote operator trigger to this road segment in the navigation map if this frequency of autonomous-to-manual-operation transitions in this group exceeds a threshold frequency in Block S130; and then limit this remote operator trigger according to this minimum quantity (or a range) of pedestrians in Block S130. Later, as an autonomous vehicle approaches this road segment, the autonomous vehicle can transmit a request for manual assistance to a remote operator if the autonomous vehicle has detected at least the minimum quantity of pedestrians in its vicinity.
  • 4.6 EXAMPLE: HEATMAP AND DYNAMIC REMOTE OPERATOR TRIGGERS
  • In the foregoing examples, the remote computer system can generate a heatmap of autonomous-to-manual-operation transitions during autonomous vehicle test periods throughout a geographic region, such as with groups of transitions weighted by spatial density and by ratio of number of transitions to total autonomous vehicle traversals across road segments in this geographic region. The remote computer system can then rank road segments in this geographic region by intensity in the heatmap. When a fleet of autonomous vehicles is deployed to operate autonomously in the geographic region, the remote computer system (or autonomous vehicles) can dynamically set and clear remote operator triggers at road segments within this geographic region based on rank of these road segments and availability of remote operators to handle remote operator requests from these autonomous vehicles. In particular, the remote computer system can implement load balancing techniques to activate remote operator triggers for highest-ranking road segments and to selectively activate remote operator triggers for lower-ranking road segments responsive to increased availability of remote operators to respond to remote operator requests from these autonomous vehicles.
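One way to sketch this ranking and load-balancing step, assuming each flagged road segment carries a scalar heatmap intensity and that operator capacity is an estimate of concurrent assistance requests that can be handled (both assumptions for illustration):

```python
# Sketch: activate remote operator triggers for the highest-intensity
# road segments first, until the expected request load would exceed
# available remote operator capacity. Lower-ranked triggers stay cleared
# until more operators become available.

def activate_triggers(segment_intensity, operator_capacity,
                      requests_per_trigger=1.0):
    """Return segments whose triggers are activated, highest rank first."""
    ranked = sorted(segment_intensity, key=segment_intensity.get, reverse=True)
    active, load = [], 0.0
    for seg in ranked:
        if load + requests_per_trigger > operator_capacity:
            break
        active.append(seg)
        load += requests_per_trigger
    return active

intensity = {"five_way": 0.9, "rail_crossing": 0.7, "school_zone": 0.4}
print(activate_triggers(intensity, operator_capacity=2))
# The two highest-ranked segments stay active; the rest are cleared.
```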
  • 4.7 REMOTE OPERATOR TRIGGER GENERATION
  • The remote computer system can then selectively annotate a navigation map with remote operator triggers in Block S130. For example, the remote computer system can annotate the navigation map with remote operator triggers at discrete locations, intersections, lanes, and/or segments of roadway at which local human operators in these autonomous vehicles frequently elected manual control, such as a remote operator trigger for each discrete road segment and vehicle direction: over which local operators elected manual control of their autonomous vehicles more than a threshold number of times per instance that an autonomous vehicle traversed this road segment; or over which local operators elected manual control of their autonomous vehicles more than a threshold number of times per unit time; etc.
  • However, the remote computer system can implement any other methods or techniques to extract manual control locations from historical autonomous vehicle test data and to automatically annotate the navigation map with remote operator triggers.
  • 5. REMOTE OPERATOR TRIGGERS BY AUTONOMOUS VEHICLE DISENGAGEMENTS
  • In one variation shown in FIG. 3B, the remote computer system defines remote operator triggers based on historical autonomous vehicle disengagements—that is, instances in which autonomous vehicles in the fleet automatically ceased autonomous operation, such as due to failure of an autonomous technology, inability to perceive their surroundings with sufficient confidence, or inability to verify next navigational actions.
  • In this variation, the remote computer system can: identify a road segment associated with a frequency of autonomous-to-manual-operation transitions—triggered by autonomous vehicles, rather than by local human operators occupying these autonomous vehicles—that exceeds a threshold frequency based on the corpus of driving records accessed in Block S110; and then associate a location of this road segment, represented in the navigation map, with a remote operator trigger in Block S130. In this variation, the remote computer system can also implement methods and techniques described above to associate this remote operator trigger with additional conditions.
  • 6. REMOTE OPERATOR TRIGGERS BY HISTORICAL ACCIDENT DATA
  • In one variation shown in FIG. 3C: Block S110 of the method includes accessing historical accident data of human-operated vehicles involved in road accidents within a geographic region; and Block S120 of the method includes identifying a road segment, within the geographic region, associated with a frequency of accidents exceeding a threshold accident frequency. Generally, in this variation, the remote computer system can: access road vehicle accident data, such as from an accident database for human-operated vehicles, in Block S110; and then extract trends from these data to identify locations (and local conditions) for which greater risk of accidents or collisions exist in Block S120. The remote computer system can then define remote operator flags for these locations (and conditions) and write these remote operator flags to the navigation map (or other container) accordingly in Block S130, as shown in FIGS. 1 and 3C.
  • In one implementation, the remote computer system extracts, from available accident data: geospatial locations (e.g., latitudes and longitudes); lane identifiers; and directions of motion of vehicles involved in recorded accidents. The remote computer system can also extract, from these accident data: navigational actions; days; times of day; weather conditions; numbers and types of vehicles involved (e.g., cars, trucks, cyclists); numbers of pedestrians involved; accident severities (e.g., minor impact, vehicle totaled); types of accidents (e.g., rear-end collisions, side-impact collisions, sideswipe collisions, vehicle rollover, head-on collisions, or multi-vehicle pile-ups); etc. of recorded accidents and vehicles involved in these accidents.
  • The remote computer system can then compile these accident data into a geospatial heatmap of accidents. For example, the remote computer system can weight each recorded accident by: how recently the accident occurred; a number of vehicles involved in the accident; and/or a severity of the accident (e.g., as a function of total cost of vehicle damage and human injuries). The remote computer system can then flag discrete geospatial locations, specific intersections, or specific road segments (e.g., 100-meter lengths of road) over which weighted rates of accidents per vehicle passing this location or per unit time exceed a threshold rate. (In this example, the remote computer system can adjust the threshold rate as a function of availability of remote operators.)
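The accident-weighting scheme in this example can be sketched as below; the reciprocal recency weight, the 1-to-5 severity scale, and the per-vehicle normalization are assumptions chosen for illustration.

```python
from datetime import date

def weighted_accident_rate(accidents, vehicle_count, today):
    """Compute a weighted accident rate for one road segment.

    accidents: list of dicts with 'date' (datetime.date),
        'n_vehicles' (vehicles involved), and 'severity'
        (illustrative 1-5 scale).
    vehicle_count: total vehicles that passed the segment over the
        same observation window.
    """
    total = 0.0
    for a in accidents:
        age_years = (today - a["date"]).days / 365.0
        recency = 1.0 / (1.0 + age_years)   # newer accidents weigh more
        total += recency * a["n_vehicles"] * a["severity"]
    return total / max(vehicle_count, 1)


def flag_segment(accidents, vehicle_count, today, threshold):
    """Flag the segment when its weighted rate exceeds the threshold
    (which, per the text, may itself vary with operator availability)."""
    return weighted_accident_rate(accidents, vehicle_count, today) > threshold
```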
  • Upon amassing a set of discrete geospatial locations, specific intersections, and/or specific road segments at which accidents have occurred in the past, such as with significant frequency and/or severity, the remote computer system can write remote operator triggers to these locations, intersections, and/or road segments in the navigation map in Block S130, as described above.
  • The remote computer system can also write a weight (or “priority”) value to each of these remote operator triggers; and the autonomous vehicle and the remote computer system can cooperate to selectively engage a remote operator to assist the autonomous vehicle in passing a location of a remote operator trigger based on a weight assigned to this remote operator trigger and current resource load at the remote operator manager (i.e., based on current availability of remote operators), as described below.
  • Therefore, the remote computer system can: access a corpus of historical traffic accident data of manually-operated vehicles involved in traffic accidents within a geographic region in Block S110; identify a road segment—within this geographic region—associated with a frequency of traffic accidents that exceeds a threshold frequency in Block S120 based on this corpus of historical traffic accident data; and associate a location of this road segment with a remote operator trigger accordingly in Block S130.
  • 7. REMOTE OPERATOR TRIGGERS BY ROAD CHARACTERISTICS
  • In another variation, Block S110 of the method recites accessing a specification for triggering manual control of autonomous vehicles; and Block S120 of the method recites identifying a road segment, within a geographic region, exhibiting characteristics defined by the specification. Generally, in this variation, the remote computer system can: access a remote operator trigger specification defined by a human or generate a remote operator trigger specification based on historical autonomous vehicle operation and remote operator data in Block S110; then scan a navigational map, autonomous vehicle data, and/or existing traffic data, etc. for a geographic region for locations associated with characteristics that match the remote operator trigger specification; and flag these locations for assignment of remote operator triggers.
  • In one implementation shown in FIG. 1, the remote computer system: accesses a manual list of characteristics of locations of remote operator triggers or automatically characterizes these locations based on available test period and/or accident data in Block S110; and then scans a navigation map for discrete locations, intersections, and/or road segments that exhibit similar characteristics in Block S120. In this implementation, the remote computer system can automatically populate the navigation map with remote operator triggers based on characteristics of roadways represented in the navigation map rather than specifically as a function of past manual control and/or accident locations.
  • In another implementation, the remote computer system further processes manual control data for autonomous vehicle test periods—described above—and extracts additional trends from these data, such as: autonomous vehicle direction; lane occupied by an autonomous vehicle; navigational action (e.g., turning, lane change, merging) performed before, during, and/or after an autonomous-to-manual-operation transition; times of day; local traffic conditions (e.g., vehicle traffic density and speed); lengths of road segments traversed during autonomous-to-manual-operation transitions; types and proximity of obstacles near an autonomous vehicle during an autonomous-to-manual-operation transition; etc. Based on these trends, the remote computer system can correlate various parameters—such as navigational action, intersection type, road segment type, etc.—to elected manual control of autonomous vehicles. In particular, the remote computer system can implement pattern recognition, regression, or other techniques to correlate local operator manual control of autonomous vehicles to certain characteristics of intersections or road segments. For example, the remote computer system can identify discrete lane segments and navigational actions over which local human operators are likely to elect manual control of autonomous vehicles, such as: right turns exceeding 110°; navigating through railroad crossings; navigating through road construction; unprotected left turns; etc. The remote computer system can then: scan the navigation map for road segments or intersections, etc. that exhibit substantially similar characteristics in Block S120; and annotate the navigation map with remote operator triggers at these locations accordingly in Block S130, as described above.
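A simple way to surface such correlations, before applying heavier regression or pattern-recognition techniques, is to tabulate the manual-election rate per characteristic tag. A minimal sketch follows; the tag-set representation of traversal records is an assumption.

```python
from collections import defaultdict

def election_rate_by_feature(records):
    """Estimate how strongly each scene/maneuver characteristic
    correlates with elected manual control.

    records: iterable of (features, manual_elected) pairs, where
        features is a set of tags such as {"unprotected_left", "night"}
        and manual_elected is a bool.
    Returns a dict mapping each feature to the fraction of traversals
    exhibiting that feature in which manual control was elected.
    """
    counts = defaultdict(lambda: [0, 0])   # feature -> [elections, total]
    for features, elected in records:
        for f in features:
            counts[f][1] += 1
            if elected:
                counts[f][0] += 1
    return {f: e / t for f, (e, t) in counts.items()}
```

Features with rates well above the fleet-wide average are candidates for remote operator trigger specifications.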
  • The remote computer system can implement similar methods and techniques to correlate accidents with certain characteristics of intersections or road segments in Block S110 and then scan and annotate the navigation map accordingly in Block S120 and S130.
  • 7.1 EXAMPLES
  • In one example, the remote computer system correlates unprotected left turns with above-average rates of manual control by local operators and/or above-average rates of accidents in Block S120. Accordingly, the remote computer system identifies all unprotected left turns represented in the navigational map and labels the corresponding locations as remote operator triggers in Block S130. The autonomous vehicle thus submits a request for manual assistance in Block S150 upon approaching an unprotected left turn.
  • In the foregoing example, the remote computer system can also identify a correlation between unprotected left turns and manual control by local operators and/or above-average rates of accidents during high-traffic periods, when local traffic is moving at high speed, or during particular times of day. The remote computer system can then annotate the navigation map with remote operator triggers—including temporal, traffic, and/or other local condition parameters—at locations of known unprotected left turns represented in the navigation map in Block S130. Upon approaching an unprotected left turn at a time specified by a corresponding remote operator trigger in the navigation map or when local traffic conditions match or exceed minimum traffic conditions specified by the remote operator trigger, the autonomous vehicle can submit a request for manual assistance in Block S150 and then automatically transition to manual control by the remote operator, such as upon entering the corresponding unprotected left turn lane, in Block S154. Upon completing the left turn under manual control or guidance, the autonomous vehicle can transition back to autonomous navigation. However, if the autonomous vehicle determines that conditions specified by the remote operator trigger have not been met—based on data collected by the autonomous vehicle in real-time as the autonomous vehicle approaches this unprotected left turn—the autonomous vehicle can autonomously navigate through the unprotected left turn without remote operator assistance.
  • In another example, the remote computer system annotates the navigation map with remote operator triggers at locations of all known railroad crossings. The remote computer system can also write a conditional traffic-related statement to remote operator triggers for these known railroad crossings, such as a condition to request remote operator assistance if another vehicle is stopped in the autonomous vehicle's lane, on the other side of the railroad crossing, and within a threshold distance of the railroad crossing (e.g., three car lengths or twenty meters).
  • 7.2 REMOTE OPERATOR TRIGGER PROPAGATION
  • In a similar implementation shown in FIGS. 3A and 3C, after assigning a remote operator trigger to a road segment in Block S130, the remote computer system can: derive a set of characteristics of the road segment; scan the navigation map—of the geographic region containing this road segment—for a second road segment exhibiting characteristics similar to those of the road segment; associate a second location of the second road segment with a second remote operator trigger in Block S130; and write this second remote operator trigger to the navigation map (or other container).
  • The remote computer system can therefore automatically identify additional road segments—that may obligate remote manual operation over autonomous operation for deployed autonomous vehicles—in a geographic region, even if historical data for autonomous vehicle operation through these road segments is unavailable or limited, based on similarities between these additional road segments and road segments previously associated with remote operator triggers.
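Trigger propagation by similarity can be sketched with set-valued segment characteristics; the tag representation and the shared-tag threshold are illustrative assumptions rather than the patent's similarity metric.

```python
def propagate_triggers(triggered_segments, candidate_segments, min_shared=3):
    """Propagate remote operator triggers to road segments whose
    characteristics closely resemble an already-triggered segment.

    Each segment is a dict: {"id": segment_id, "features": set_of_tags}.
    min_shared: minimum number of shared characteristic tags for a
        candidate to inherit a trigger (illustrative threshold).
    Returns the set of candidate segment ids to newly trigger.
    """
    new_triggers = set()
    for cand in candidate_segments:
        for trig in triggered_segments:
            if len(cand["features"] & trig["features"]) >= min_shared:
                new_triggers.add(cand["id"])
                break   # one similar triggered segment suffices
    return new_triggers
```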
  • 8. LOCATION-AGNOSTIC REMOTE OPERATOR TRIGGERS
  • In one variation, the remote computer system can derive a constellation of location-agnostic scene characteristics and/or autonomous vehicle characteristics exhibiting high correlation with autonomous-to-manual-operation transitions, autonomous vehicle disengagements, traffic accidents, etc. from historical data described above, such as by implementing regression or deep learning techniques. The remote computer system can then define remote operator triggers for deployed autonomous vehicles based on this constellation of location-agnostic scene characteristics and/or autonomous vehicle characteristics. For example, the remote computer system can define a constellation of location-agnostic scene characteristics and/or autonomous vehicle characteristics including: damp-to-wet road condition; solar offset window (e.g., within the range of +/−15° zenith and +/−20° azimuthal to the Sun); and pedestrian present. Thus, when an autonomous vehicle operating in an autonomous mode detects a pedestrian and falls within this solar offset window during or after rainfall, the autonomous vehicle can serve a request to the remote operator manager for remote operator assistance according to this location-agnostic remote operator trigger.
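The example constellation above can be expressed as a simple predicate over scene attributes; the field names and the 0.3 wetness cutoff for "damp-to-wet" are assumptions introduced for illustration.

```python
def location_agnostic_trigger(scene):
    """Check whether a scene matches the example constellation from the
    text: damp-to-wet road condition, the Sun within the offset window
    (within 15 deg zenith and 20 deg azimuth), and a pedestrian present.

    scene: dict with 'road_wetness' (0.0-1.0), 'sun_zenith_offset_deg',
        'sun_azimuth_offset_deg', and 'pedestrian_present' (bool).
    """
    return (scene["road_wetness"] >= 0.3                  # damp-to-wet (assumed cutoff)
            and abs(scene["sun_zenith_offset_deg"]) <= 15
            and abs(scene["sun_azimuth_offset_deg"]) <= 20
            and scene["pedestrian_present"])
```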
  • 9. REMOTE OPERATOR REQUEST
  • Block S140 of the method recites, at an autonomous vehicle, autonomously navigating along a route; and Block S150 recites, at the autonomous vehicle, transmitting a request for manual assistance to the remote operator in response to approaching the location associated with the remote operator trigger. Generally, in Block S140, the autonomous vehicle can implement autonomous navigation techniques to autonomously navigate from a start location (e.g., a pickup location specified by a rideshare user), along a route, toward a destination location (e.g., a dropoff location specified by the rideshare user). While navigating along this route, the autonomous vehicle can monitor its location and/or characteristics of a scene around the autonomous vehicle for conditions specified in a remote operator trigger, such as defined in a navigation map stored locally on the autonomous vehicle. Thus, as the autonomous vehicle approaches a road segment specified in a remote operator trigger (and confirms scene and/or autonomous vehicle characteristics specified in this remote operator trigger, such as traffic, weather, and time of day conditions), the autonomous vehicle can transmit a request for human assistance to a remote operator. Accordingly, the autonomous vehicle can cede operational controls to a remote operator in Block S154 until the autonomous vehicle passes the road segment or until autonomous control is returned to the autonomous vehicle by the remote operator. In particular, the autonomous vehicle can identify a set of conditions (e.g., autonomous vehicle location and orientation, local conditions) that fulfill a remote operator trigger and, accordingly, automatically return a request for manual human assistance to a remote operator manager (or to a remote operator directly).
  • In one implementation, while autonomously navigating along a route that intersects a location of a remote operator trigger defined in the navigation map (or other container), the autonomous vehicle: estimates its time of arrival at this location; and then transmits a request for manual assistance to the remote operator manager in response to this time of arrival falling below a threshold duration (e.g., ten seconds; or five seconds when the autonomous vehicle is travelling at ten miles per hour, ten seconds when the autonomous vehicle is travelling at thirty miles per hour, and fifteen seconds when the autonomous vehicle is travelling at sixty miles per hour).
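The speed-dependent threshold in this example can be sketched by linearly interpolating the quoted points (5 s at 10 mph, 10 s at 30 mph, 15 s at 60 mph); the interpolation itself is an assumption, since the text only gives discrete examples.

```python
def request_lead_time_s(speed_mph):
    """Speed-dependent lead time before a trigger location at which the
    autonomous vehicle should request manual assistance, linearly
    interpolating the example points from the text."""
    points = [(10, 5.0), (30, 10.0), (60, 15.0)]  # (mph, seconds)
    if speed_mph <= points[0][0]:
        return points[0][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if speed_mph <= x1:
            return y0 + (y1 - y0) * (speed_mph - x0) / (x1 - x0)
    return points[-1][1]


def should_request(eta_s, speed_mph):
    """Transmit the request once the estimated time of arrival falls
    below the speed-dependent lead time."""
    return eta_s <= request_lead_time_s(speed_mph)
```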
  • Alternatively, the remote computer system can define a remote operator trigger along a length of road segment in Block S130; and the autonomous vehicle can automatically transmit a request for manual assistance to the remote operator manager in Block S150 in response to entering this road segment. For example, the remote computer system can define a georeferenced boundary around a cluster of autonomous-to-manual-operation transitions and offset outwardly from a perimeter of this cluster by a trigger distance (e.g., 30 meters) and link this georeferenced boundary to a remote operator trigger for a road segment in Block S130. Upon crossing this georeferenced boundary and entering this road segment (and upon confirming other conditions specified in the remote operator trigger), the autonomous vehicle can automatically transmit a request for manual assistance to the remote operator manager in Block S150.
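A georeferenced boundary of this kind can be sketched as a circle around the transition cluster, offset outward by the trigger distance. The circular boundary in a local planar frame is a simplifying assumption; the text describes an offset from the cluster perimeter, which in practice would likely be a polygon in geodetic coordinates.

```python
import math

def trigger_boundary(cluster_points, offset_m=30.0):
    """Build a circular georeferenced boundary enclosing a cluster of
    autonomous-to-manual transition points, offset outward by offset_m.

    cluster_points: list of (x, y) positions in meters in a local
        planar frame (an assumption for this sketch).
    Returns (center, radius_m).
    """
    cx = sum(p[0] for p in cluster_points) / len(cluster_points)
    cy = sum(p[1] for p in cluster_points) / len(cluster_points)
    radius = max(math.hypot(px - cx, py - cy) for px, py in cluster_points)
    return (cx, cy), radius + offset_m


def inside_boundary(position, center, radius):
    """True when the vehicle has crossed into the trigger boundary."""
    return math.hypot(position[0] - center[0],
                      position[1] - center[1]) <= radius
```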
  • Upon receipt of a request for manual assistance from the autonomous vehicle, the remote operator manager can: select a particular remote operator from a set of available remote operators; and then route sensor data—received from the autonomous vehicle in Block S152 described below—to a remote operator portal associated with the remote operator, such as via a computer network. For example, upon receipt of a request for manual assistance from the autonomous vehicle responsive to a remote operator trigger, the remote operator manager can selectively reject the autonomous vehicle's request or connect the autonomous vehicle to an available remote operator based on a weight or priority associated with this remote operator trigger and based on current resource load (i.e., availability of remote operators). In this example, the remote operator manager can implement resource allocation techniques to assign autonomous vehicles approaching locations of highest-priority remote operator triggers to available remote operators first, then assign autonomous vehicles approaching locations of lower-priority remote operator triggers to available remote operators up until a target resource load is met (e.g., 90% of remote operators are currently assisting autonomous vehicles).
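The priority-ordered assignment up to a target resource load can be sketched as follows; treating capacity as a fixed fraction of the operator pool is an illustrative reading of the 90% example.

```python
def assign_requests(requests, n_operators, target_load=0.9):
    """Assign pending assistance requests to remote operators, highest
    trigger priority first, until a target fraction of the operator
    pool is busy.

    requests: list of (vehicle_id, priority) pairs; larger priority
        values are served first.
    Returns (assigned_vehicle_ids, rejected_vehicle_ids).
    """
    capacity = int(n_operators * target_load)
    ordered = sorted(requests, key=lambda r: r[1], reverse=True)
    assigned = [v for v, _ in ordered[:capacity]]
    rejected = [v for v, _ in ordered[capacity:]]
    return assigned, rejected
```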
  • Alternatively, the autonomous vehicle can serve a request for manual assistance directly to a remote operator. For example, a remote operator can be assigned a preselected set of autonomous vehicles currently in operation within a geographic region, and the remote operator can monitor low-resolution sensor data—streamed from these autonomous vehicles when operating within the geographic region—through her remote operator portal. When the autonomous vehicle enters or approaches a road segment associated with a remote operator trigger, the autonomous vehicle can return a request for remote operator control and return high-resolution sensor data (e.g., lower compression, larger sized, and/or greater frame rate color camera data) directly to the remote operator's portal in Block S150. Accordingly, the remote operator portal can surface a sensor feed from the autonomous vehicle, enable remote controls for the autonomous vehicle, and prompt the remote operator to remotely engage the autonomous vehicle.
  • Yet alternatively, the remote computer system can automatically: track an autonomous vehicle; generate a request for manual assistance for the autonomous vehicle when conditions of a remote operator trigger are met at the autonomous vehicle; and serve this request to a remote operator. For example, the autonomous vehicle can stream low-resolution sensor, perception, and/or telemetry data to the remote computer system throughout operation. The remote computer system can then automatically queue a remote operator to assume manual control of the autonomous vehicle when telemetry data received from the autonomous vehicle indicates that the autonomous vehicle is approaching a location assigned a remote operator trigger and when low-resolution perception data (e.g., types and locations of objects detected in camera and/or LIDAR data recorded by the autonomous vehicle) received from the autonomous vehicle indicates that conditions of this remote operator trigger are met.
  • However, the autonomous vehicle, the remote operator manager, and/or the remote computer system can implement any other method or technique to selectively connect the autonomous vehicle to a remote operator responsive to a request for manual assistance based on a remote operator trigger.
  • 10. REMOTE CONTROLS
  • The method further includes: Block S152, which recites transmitting sensor data to a remote operator portal associated with the remote operator; Block S154, which recites executing a navigational command received from the remote operator via the remote operator portal; and Block S160, which recites, in response to passing the location, resuming autonomous navigation along the planned route. Generally, in Block S152, the autonomous vehicle can serve data—such as raw sensor, perception, and/or telemetry data sufficient for enabling a remote operator to efficiently and reliably trigger a navigational action or assume manual control of the autonomous vehicle—to a remote operator. The autonomous vehicle can then: execute commands received from the remote operator in Block S154 in order to navigate through or past the road segment linked to the remote operator trigger; and transition back to autonomous operation in Block S160 upon exiting this road segment and/or upon confirmation from the remote operator to resume autonomous navigation, as shown in FIGS. 2 and 4.
  • For example, once the autonomous vehicle determines that conditions of the remote operator trigger are met, returns a request for manual assistance to the remote operator, and/or receives confirmation of manual assistance from the remote operator or remote operator manager at an initial remote operation time, the autonomous vehicle can stream raw sensor data, perception data (e.g., perception of a scene around the autonomous vehicle derived from raw sensor data recorded through sensors in the autonomous vehicle), and/or telemetry data to the remote operator portal in real-time over a wireless computer network following the initial remote operation time. The autonomous vehicle can concurrently transition control of some or all actuators in the autonomous vehicle to the remote operator portal.
  • 10.1 BINARY REMOTE CONTROL FUNCTION
  • In one implementation shown in FIGS. 1 and 2, as the autonomous vehicle approaches the location of a remote operator trigger and once a remote operator is assigned to the autonomous vehicle, the autonomous vehicle (or the remote operator manager) enables a binary control function of the autonomous vehicle at the remote operator's portal, such as including: a confirm function to trigger the autonomous vehicle to execute a preselected navigational action (e.g., enter an intersection or execute a left turn through the road segment associated with the remote operator trigger); and a delay function to delay execution of this preselected navigational action.
  • In one example, the remote computer system: writes a remote operator trigger to an unprotected left turn in the navigation map; and assigns a binary control function—including navigational action confirmation and delay options—to this remote operator trigger in Block S130. As the autonomous vehicle traverses its assigned route and approaches this unprotected left turn in Block S140, the autonomous vehicle can query the remote operator manager for remote manual control according to these binary control functions in Block S150. Once the remote operator manager assigns a remote operator to the autonomous vehicle, the autonomous vehicle can stream sensor data to the remote operator manager in Block S152, such as: color camera feeds from forward-, left-, and right-facing cameras on the autonomous vehicle; composite point clouds containing concurrent data output by multiple LIDAR sensors on the autonomous vehicle; telemetry data; and/or vehicle speed, braking position, and accelerator position data; etc. The remote operator manager can then distribute these sensor feeds to the operator portal associated with this remote operator.
  • In this example, as the autonomous vehicle nears the unprotected left turn, the autonomous vehicle can autonomously slow to a stop just ahead of this intersection while awaiting a command from the remote operator. Simultaneously, the remote operator portal can render these sensor data for the remote operator in (near) real-time and enable binary controls for the autonomous vehicle. Upon determining that the autonomous vehicle has right of way to enter the intersection ahead and will avoid oncoming traffic when executing a left turn action, the remote operator can submit confirmation to execute the planned left turn action; upon receipt of confirmation to execute the planned left turn action, the autonomous vehicle can resume autonomous execution of its planned route, including entering the intersection ahead and autonomously executing the left turn, in Blocks S154 and S160.
  • Therefore, in this example, the autonomous vehicle can: slow to a stop in response to approaching a location associated with a remote operator trigger; transmit a request for manual confirmation to resume autonomous navigation along the route as the autonomous vehicle slows upon approach to this location; and then resume autonomous navigation along this route—past the location specified by the remote operator trigger—in response to receipt of binary, manual confirmation from the remote operator.
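The binary control function reduces to a two-branch command handler at the vehicle; the state-dictionary representation here is an assumption made for this sketch.

```python
def handle_binary_command(command, vehicle_state):
    """Apply a remote operator's binary command at a trigger location.

    'confirm' releases the vehicle to execute its preselected
    navigational action (e.g., the planned left turn); 'delay' keeps
    the vehicle holding short of the intersection.
    """
    if command == "confirm":
        vehicle_state["holding"] = False
        vehicle_state["action"] = vehicle_state["planned_action"]
    elif command == "delay":
        vehicle_state["holding"] = True
    else:
        raise ValueError(f"unknown binary command: {command}")
    return vehicle_state
```

Restricting the operator to these two options keeps the interface fast and unambiguous relative to full drive-by-wire control.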
  • 10.2 MULTIPLE REMOTE CONTROL FUNCTIONS
  • In a similar implementation, the remote computer system can assign multiple possible navigational actions—such as “delay,” “sharp left,” “sweeping left,” “slow left,” and/or “fast left”—to a remote operator trigger in Block S130. As an autonomous vehicle approaches the location specified by this remote operator trigger, the autonomous vehicle can transmit a request to the remote operator manager for manual assistance; and the remote operator manager can assign the autonomous vehicle to a remote operator and enable selection of navigational actions specified by this remote operator trigger at this remote operator's portal. The autonomous vehicle can then execute navigational actions selected by the remote operator via the remote operator portal in Block S154.
  • Once the remote operator confirms one or a subset of these navigational actions, once the autonomous vehicle has moved fully past the location specified in this remote operator trigger, and/or once the remote operator confirms transition back to autonomous operation, the autonomous vehicle can return to full autonomous operation in Block S160.
  • 10.3 FULL REMOTE CONTROL FUNCTIONS
  • In another implementation, the remote computer system assigns full manual control of an autonomous vehicle—such as including control of brake, accelerator, and steering actuators in the autonomous vehicle—to a remote operator trigger in Block S130. Thus, as an autonomous vehicle approaches the location specified in this remote operator trigger while autonomously navigating along a planned route, the autonomous vehicle can request assistance from a remote operator in Block S150. Once the remote operator manager assigns the autonomous vehicle to a remote operator, the autonomous vehicle, the remote computer system, and the remote operator's portal can cooperate to transition real-time drive-by-wire controls of brake, accelerator, and steering positions in the autonomous vehicle to the remote operator portal.
  • For example, once the remote operator is assigned to assist the autonomous vehicle: the autonomous vehicle can stream sensor data to the remote operator manager for distribution to the remote operator portal in Block S152; and the autonomous vehicle and the computer system can cooperate to transition from 100% autonomous/0% manual control of actuators in the autonomous vehicle to 0% autonomous/100% manual control of these actuators over a period of time (e.g., four seconds). The remote operator can thus assume full manual control of the autonomous vehicle—such as via a joystick or other interface connected to the remote operator portal—and remotely navigate the autonomous vehicle through the location or road segment associated with this remote operator trigger.
  • Furthermore, once the autonomous vehicle is fully past the location or road segment linked to this remote operator trigger—such as confirmed by the remote operator—the autonomous vehicle and the remote computer system can cooperate to transition from 0% autonomous/100% manual control back to 100% autonomous/0% manual control, such as instantaneously or over a period of time (e.g., two seconds) in Block S160.
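One plausible reading of the timed handoff is a linear blend of autonomous and manual actuator commands over the ramp duration; the text specifies only the endpoints and the period, so the linear profile is an assumption.

```python
def blended_command(auto_cmd, manual_cmd, t_since_handoff_s, ramp_s=4.0):
    """Linearly blend actuator commands during the handoff from
    autonomous to manual control over ramp_s seconds (four seconds in
    the text's example).

    auto_cmd, manual_cmd: dicts mapping actuator name -> command value.
    Returns the blended command dict for the current instant.
    """
    # alpha is the manual share: 0.0 at handoff start, 1.0 after ramp_s.
    alpha = min(max(t_since_handoff_s / ramp_s, 0.0), 1.0)
    return {k: (1 - alpha) * auto_cmd[k] + alpha * manual_cmd[k]
            for k in auto_cmd}
```

Running the same function with the roles swapped (manual as the fading source) covers the return to autonomous control described above.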
  • Therefore, in response to confirmation of manual assistance from the remote operator, the autonomous vehicle can transfer braking, acceleration, and steering controls of the autonomous vehicle to the remote operator portal; and then execute braking, acceleration, and/or steering commands received from the remote operator portal in Block S154. The autonomous vehicle can then: cease transmission of sensor data to the remote operator portal and resume autonomous navigation along its assigned route after passing the location and/or in response to receipt of confirmation from the remote operator to resume autonomous navigation in Block S160.
  • 11. DAISY CHAIN
  • In one variation, the remote computer system (or the remote operator portal) identifies multiple autonomous vehicles scheduled or anticipated to approach a location of a remote operator trigger within a short period of time and then assigns a single remote operator to manually assist each of these autonomous vehicles as they sequentially traverse this location. For example, the remote computer system can group a string of five autonomous vehicles (out of eight total vehicles) in-line at an unprotected left turn and assign manual control of these vehicles to a single remote operator; the remote operator can then manually confirm execution of a left turn action for each autonomous vehicle in the group individually or for the group of autonomous vehicles as a whole.
  • Therefore, in this variation, because a remote operator may become increasingly familiar with a segment of road, an intersection, current traffic conditions, current weather conditions, current pedestrian traffic, etc. near a location or road segment linked to a remote operator trigger as the remote operator handles manual assistance requests from autonomous vehicles passing this location or road segment over time, the remote computer system can reduce cognitive load on the remote operator by continuing to assign autonomous vehicles approaching this location or road segment to this same remote operator, such as within a short, contiguous duration of time. In particular, the remote computer system can assign the same remote operator to multiple autonomous vehicles passing through a particular remote operator trigger location within a limited period of time in order to enable the remote operator to make rapid, higher-accuracy navigational decisions for these autonomous vehicles and with less cognitive load.
  • Once a number of autonomous vehicles approaching this remote operator trigger location drops below a preset threshold quantity or frequency, the remote computer system can transition the remote operator to assist other autonomous vehicles passing or approaching other locations or road segments in the geographic region associated with remote operator triggers.
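The daisy-chain policy can be sketched as a "sticky" operator assignment per trigger location that is released once the request rate falls below a threshold; the requests-per-minute threshold and the first-available fallback are assumptions for this sketch.

```python
def choose_operator(trigger_id, sticky_assignments, request_rate,
                    available_operators, min_rate=2.0):
    """Prefer the operator already handling a trigger location while
    requests keep arriving frequently (daisy-chaining); release the
    sticky assignment once the request rate drops below min_rate
    requests per minute (illustrative threshold).

    Returns (chosen_operator, updated_sticky_assignments).
    """
    sticky = sticky_assignments.get(trigger_id)
    if sticky is not None and request_rate >= min_rate:
        return sticky, sticky_assignments     # keep the familiar operator
    if request_rate < min_rate:
        sticky_assignments.pop(trigger_id, None)  # free the operator
    operator = available_operators[0]         # simplest selection policy
    sticky_assignments[trigger_id] = operator
    return operator, sticky_assignments
```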
  • Alternatively, the remote computer system can dedicate a particular remote operator trigger to a single remote operator, such as over the full duration of this remote operator's shift. Therefore, in this variation, the remote computer system can assign this particular remote operator to assist each autonomous vehicle that approaches the location specified by this remote operator trigger over this period of time.
  • However, the remote computer system can cooperate with an autonomous vehicle in any other way to selectively and intermittently enable manual control of the autonomous vehicle by a remote operator as the autonomous vehicle approaches and navigates past a remote operator trigger location defined in a navigation map.
  • 12. AUTONOMOUS TECHNOLOGY UPDATE
  • In one variation, the remote computer system (or autonomous vehicles in the fleet) can aggregate: sensor data (e.g., camera, LIDAR, and telemetry data) recorded by autonomous vehicles when approaching, entering, and passing locations or road segments specified by remote operator triggers; remote operator commands returned to these autonomous vehicles while responding to autonomous vehicle requests for manual assistance; and results of execution of these commands by these autonomous vehicles (e.g., whether an autonomous vehicle collided with another object, or the proximity of other objects to the autonomous vehicle during execution of navigational commands received from a remote operator). The remote computer system can then implement deep learning, artificial intelligence, regression, and/or other methods and techniques to retrain an autonomous navigation (or “path planning”) model based on sensor data, remote operator commands, and navigation results recorded near locations of remote operator triggers. For example, the remote computer system can retrain the autonomous navigation model to elect a navigational action, an autonomous vehicle trajectory, and/or autonomous vehicle actuator positions more rapidly and/or more accurately at these remote operator trigger locations based on scene and autonomous vehicle characteristics near these locations and navigational commands issued by remote operators when guiding these autonomous vehicles through these remote operator trigger locations.
The remote computer system can then push this retrained (or “updated,” “revised”) autonomous navigation model to deployed autonomous vehicles, which can then implement this autonomous navigation model when operating autonomously, thereby reducing the need for remote operators to manually assist these autonomous vehicles near remote operator trigger locations.
  • For example, as the remote computer system updates the autonomous navigation model as described above and pushes updated autonomous navigation models to deployed autonomous vehicles over time, the remote computer system can transition remote operator triggers from full remote manual control to binary control—as described above—given that autonomous vehicles executing updated autonomous navigation models may be increasingly better suited to quickly and accurately select next navigational actions when approaching these remote operator trigger locations. By thus transitioning these remote operator triggers from full remote manual control to binary control, the remote computer system can reduce involvement and resource load of remote operators tasked with remotely assisting these autonomous vehicles over time.
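The transition from full remote manual control to binary control could be driven by a per-trigger statistic, for example the rate at which the retrained model's proposed action matches the action a remote operator approved. A minimal sketch follows; the function name, the match-rate threshold, and the minimum sample count are illustrative assumptions, as the patent does not specify them.

```python
def trigger_mode(success_count, total_count,
                 binary_threshold=0.95, min_samples=20):
    """Decide the control mode for a remote operator trigger location:
    'full_manual' (operator drives) until the updated model proposes
    acceptable actions often enough, then 'binary' (operator merely
    confirms or rejects the vehicle's proposed action).

    success_count: interventions where the model's proposed action
                   matched the operator's command (illustrative metric).
    total_count:   interventions observed at this trigger location.
    """
    if total_count < min_samples:
        # Too little evidence: keep the conservative mode.
        return "full_manual"
    match_rate = success_count / total_count
    return "binary" if match_rate >= binary_threshold else "full_manual"
```

Downgrading triggers this way reduces operator involvement gradually, trigger by trigger, as model updates accumulate.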
  • Therefore, the remote computer system can: record a corpus of sensor data received from an autonomous vehicle following a request for manual assistance as the autonomous vehicle approaches a remote operator trigger location; record a navigational command entered by a remote operator assigned to this autonomous vehicle and served to the autonomous vehicle responsive to this request for manual assistance; generate a revised autonomous navigation model based on this corpus of sensor data and the navigational command; and load this revised autonomous navigation model onto the autonomous vehicle—and other autonomous vehicles in the fleet.
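The record-retrain-deploy loop in the bullet above can be illustrated with a minimal log-and-policy sketch. Here a lookup of the most frequent operator command per trigger location stands in for the deep-learning retraining step, which the patent describes only at a high level; all class and method names are hypothetical.

```python
from collections import defaultdict, Counter


class InterventionLog:
    """Sketch: record scene snapshots and operator commands captured at
    remote operator trigger locations, then derive a revised policy.
    The modal-command table below is a stand-in for retraining an
    autonomous navigation model on these (scene, command) pairs."""

    def __init__(self):
        self.records = defaultdict(list)  # trigger_id -> [(scene, command)]

    def record(self, trigger_id, scene, command):
        """Log one manual-assistance episode at a trigger location."""
        self.records[trigger_id].append((scene, command))

    def revised_policy(self):
        """Return trigger_id -> most frequent operator command, the
        simplest possible 'revised model' to push back to the fleet."""
        policy = {}
        for trigger_id, pairs in self.records.items():
            commands = Counter(command for _, command in pairs)
            policy[trigger_id] = commands.most_common(1)[0][0]
        return policy
```

A real system would train on the full sensor corpus rather than command counts, but the data flow (record, derive, deploy) is the same.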
  • The remote computer systems and methods described herein can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a human annotator computer or mobile device, wristband, smartphone, or any suitable combination thereof. Other systems and methods of the embodiment can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with apparatuses and networks of the type described above. The computer-readable instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component can be a processor, but any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
  • As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the embodiments of the invention without departing from the scope of this invention as defined in the following claims.

Claims (20)

I claim:
1. A method for transferring control of an autonomous vehicle to a remote operator comprising:
accessing a corpus of driving records of a fleet of autonomous vehicles operating within a geographic region;
identifying a road segment, within the geographic region, associated with a frequency of transitions, from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet, that exceeds a threshold frequency based on the corpus of driving records;
associating a location of the road segment, represented in a navigation map, with a remote operator trigger; and
at the autonomous vehicle operating within the geographic region:
autonomously navigating along a route;
in response to approaching the location associated with the remote operator trigger:
transmitting a request for manual assistance to the remote operator;
transmitting sensor data to a remote operator portal associated with the remote operator; and
executing a navigational command received from the remote operator via the remote operator portal; and
resuming autonomous navigation along the route after passing the location.
2. The method of claim 1:
wherein accessing the corpus of driving records of the fleet of autonomous vehicles comprises accessing geospatial locations, within the geographic region, of instances of transition from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet; and
wherein identifying the road segment comprises:
aggregating instances of transition from autonomous operation to manual operation into a set of groups based on geospatial proximity, the set of groups comprising a first group containing instances of transition at geospatial locations along the road segment;
for the first group, calculating the frequency of transitions based on a ratio of quantity of transitions in the first group to quantity of instances of autonomous vehicles in the fleet traversing the road segment; and
flagging the road segment for the remote operator trigger in response to the frequency of transitions exceeding the threshold frequency.
3. The method of claim 2:
wherein accessing the corpus of driving records of the fleet of autonomous vehicles comprises accessing times of instances of transition from autonomous operation to manual operation;
wherein aggregating instances of transition from autonomous operation to manual operation into the set of groups comprises aggregating instances of transition from autonomous operation to manual operation into the set of groups further based on temporal proximity, the first group containing instances of transition at geospatial locations along the road segment within a daily time window;
further comprising limiting the remote operator trigger for the road segment according to the daily time window; and
wherein transmitting the request for manual assistance to the remote operator comprises, at the autonomous vehicle, transmitting the request for manual assistance to the remote operator in response to approaching the location of the remote operator trigger during the daily time window.
4. The method of claim 1:
wherein accessing the corpus of driving records of the fleet of autonomous vehicles comprises accessing geospatial locations, within the geographic region, of instances of transition from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet and scene characteristics proximal autonomous vehicles during instances of transition from autonomous operation to manual operation;
wherein identifying the road segment comprises:
aggregating instances of transition from autonomous operation to manual operation into a set of groups based on geospatial proximity and similarity of scene characteristics, the set of groups comprising a first group containing instances of transition at geospatial locations along the road segment and associated with a particular scene characteristic; and
flagging the road segment for the remote operator trigger in response to the frequency of transitions in the first group exceeding the threshold frequency;
further comprising limiting the remote operator trigger for the road segment according to the particular scene characteristic; and
wherein transmitting the request for manual assistance to the remote operator comprises, at the autonomous vehicle, transmitting the request for manual assistance to the remote operator in response to detecting the particular scene characteristic while approaching the location of the remote operator trigger.
5. The method of claim 4:
wherein accessing scene characteristics proximal autonomous vehicles during instances of transition from autonomous operation to manual operation comprises accessing scene characteristics comprising offsets between anteroposterior axes of autonomous vehicles and the Sun during instances of transition from autonomous operation to manual operation;
wherein identifying the road segment comprises identifying the first group containing instances of transition at geospatial locations along the road segment concurrent with offsets, between anteroposterior axes of autonomous vehicles and the Sun, within a solar offset window;
wherein limiting the remote operator trigger for the road segment according to the particular scene characteristic comprises limiting the remote operator trigger for the road segment according to the solar offset window; and
wherein transmitting the request for manual assistance to the remote operator comprises, at the autonomous vehicle, transmitting the request for manual assistance to the remote operator in response to an offset between an anteroposterior axis of the autonomous vehicle and the Sun falling within the solar offset window while approaching the location of the remote operator trigger.
6. The method of claim 4:
wherein accessing scene characteristics proximal autonomous vehicles during instances of transition from autonomous operation to manual operation comprises accessing scene characteristics comprising presence of pedestrians proximal autonomous vehicles during instances of transition from autonomous operation to manual operation;
wherein identifying the road segment comprises identifying the first group containing instances of transition at geospatial locations along the road segment concurrent with presence of a minimum quantity of pedestrians;
wherein limiting the remote operator trigger for the road segment according to the particular scene characteristic comprises associating the remote operator trigger for the road segment with presence of the minimum quantity of pedestrians; and
wherein transmitting the request for manual assistance to the remote operator comprises, at the autonomous vehicle, transmitting the request for manual assistance to the remote operator in response to detecting more than the minimum quantity of pedestrians proximal the autonomous vehicle while approaching the location of the remote operator trigger.
7. The method of claim 1:
wherein autonomously navigating along the route comprises, at the autonomous vehicle, autonomously navigating from a pickup location to a drop-off location specified by a user while the user occupies the autonomous vehicle and while a local operator is absent from the autonomous vehicle;
wherein transmitting the request for manual assistance to the remote operator comprises transmitting the request for manual assistance to the remote operator in response to approaching the location associated with the remote operator trigger while a local operator is absent from the autonomous vehicle; and
further comprising, at a second autonomous vehicle occupied by a second local operator:
autonomously navigating along a second route;
in response to approaching the location associated with the remote operator trigger while occupied by the second local operator, prompting the second local operator to assume manual control of the second autonomous vehicle; and
in response to passing the location, prompting the second local operator to confirm autonomous navigation of the second autonomous vehicle along the second route.
8. The method of claim 1:
wherein transmitting the request for manual assistance to the remote operator comprises transmitting the request for manual assistance to a remote operator manager in response to approaching the location associated with the remote operator trigger;
at the remote operator manager:
in response to receiving the request for manual assistance from the autonomous vehicle, selecting the remote operator from a set of available remote operators; and
routing sensor data received from the autonomous vehicle to the remote operator portal, associated with the remote operator, via a computer network.
9. The method of claim 1:
further comprising, at the autonomous vehicle, estimating a time of arrival of the autonomous vehicle at the location associated with the remote operator trigger while autonomously navigating along the route;
wherein transmitting the request for manual assistance to the remote operator comprises, at the autonomous vehicle, transmitting the request for manual assistance to the remote operator in response to the time of arrival falling below a threshold duration at a first time;
wherein transmitting sensor data to the remote operator portal comprises, at the autonomous vehicle, streaming sensor data to the remote operator portal in real-time over a wireless computer network following the first time; and
further comprising ceasing transmission of sensor data to the remote operator portal after passing the location.
10. The method of claim 1:
further comprising autonomously slowing to a stop in response to approaching the location associated with the remote operator trigger;
wherein transmitting the request for manual assistance to the remote operator comprises transmitting the request for manual confirmation to resume autonomous navigation along the route; and
wherein executing the navigational command received from the remote operator comprises resuming autonomous navigation along the route past the location in response to receipt of manual confirmation from the remote operator.
11. The method of claim 1:
further comprising, in response to confirmation of manual assistance from the remote operator, transferring braking, acceleration, and steering controls of the autonomous vehicle to the remote operator portal;
wherein executing the navigational command received from the remote operator comprises executing braking, acceleration, and steering commands received from the remote operator portal; and
wherein resuming autonomous navigation along the route comprises resuming autonomous navigation along the route in response to receipt of confirmation from the remote operator to resume autonomous navigation.
12. The method of claim 1, further comprising:
accessing a corpus of historical traffic accident data of manually-operated vehicles involved in traffic accidents within the geographic region;
identifying a second road segment, within the geographic region, associated with a frequency of traffic accidents that exceeds a second threshold frequency based on the corpus of historical traffic accident data;
associating a second location of the second road segment, in the navigation map, with a second remote operator trigger; and
at the autonomous vehicle:
autonomously navigating along a second route;
in response to approaching the second location associated with the second remote operator trigger:
transmitting a second request for manual assistance to a second remote operator;
transmitting sensor data to a second remote operator portal associated with the second remote operator; and
executing a second navigational command received from the second remote operator via the second remote operator portal; and
resuming autonomous navigation along the second route after passing the second location.
13. The method of claim 1:
wherein autonomously navigating along the route comprises, at the autonomous vehicle, autonomously navigating along the route based on an autonomous navigation model;
further comprising, at a remote computer system:
recording a corpus of sensor data received from the autonomous vehicle following the request for manual assistance;
recording the navigational command entered by the remote operator and served to the autonomous vehicle responsive to the request for manual assistance;
generating a revised autonomous navigation model based on the corpus of sensor data and the navigational command; and
loading the revised autonomous navigation model onto the autonomous vehicle.
14. The method of claim 1, further comprising:
deriving a set of characteristics of the road segment;
scanning the navigation map for a second road segment exhibiting characteristics similar to the set of characteristics of the road segment;
associating a second location of the second road segment, in the navigation map, with a second remote operator trigger; and
at the autonomous vehicle:
autonomously navigating along a second route;
in response to approaching the second location associated with the second remote operator trigger:
transmitting a second request for manual assistance to a second remote operator;
transmitting sensor data to a second remote operator portal associated with the second remote operator; and
executing a second navigational command received from the second remote operator via the second remote operator portal; and
resuming autonomous navigation along the second route after passing the second location.
15. The method of claim 1, further comprising:
identifying a second road segment, within the geographic region, associated with a second frequency of transitions, from autonomous operation to remote manual operation triggered by autonomous vehicles in the fleet, that exceeds a second threshold frequency based on the corpus of driving records;
associating a second location of the second road segment, represented in the navigation map, with a second remote operator trigger; and
at the autonomous vehicle:
autonomously navigating along a second route;
in response to approaching the second location associated with the second remote operator trigger:
transmitting a second request for manual assistance to a second remote operator;
transmitting sensor data to a second remote operator portal associated with the second remote operator; and
executing a second navigational command received from the second remote operator via the second remote operator portal; and
resuming autonomous navigation along the second route after passing the second location.
16. A method for transferring control of an autonomous vehicle to a remote operator comprising:
accessing a corpus of historical traffic accident data of manually-operated vehicles involved in traffic accidents within a geographic region;
identifying a road segment, within the geographic region, associated with a frequency of traffic accidents that exceeds a threshold frequency based on the corpus of historical traffic accident data;
associating a location of the road segment, represented in a navigation map, with a remote operator trigger; and
at the autonomous vehicle operating within the geographic region:
autonomously navigating along a route;
in response to approaching the location associated with the remote operator trigger:
transmitting a request for manual assistance to the remote operator;
transmitting sensor data to a remote operator portal associated with the remote operator; and
executing a navigational command received from the remote operator via the remote operator portal; and
resuming autonomous navigation along the route after passing the location.
17. A method for transferring control of an autonomous vehicle to a remote operator comprising:
accessing a specification for triggering manual control of autonomous vehicles;
identifying a road segment, within a geographic region, exhibiting characteristics defined by the specification;
associating a location of the road segment, represented in a navigation map, with a remote operator trigger; and
at the autonomous vehicle operating within the geographic region:
autonomously navigating along a route;
in response to approaching the location associated with the remote operator trigger:
transmitting a request for manual assistance to the remote operator;
transmitting sensor data to a remote operator portal associated with the remote operator; and
executing a navigational command received from the remote operator via the remote operator portal; and
resuming autonomous navigation along the route after passing the location.
18. The method of claim 17:
wherein accessing the specification for triggering manual control of autonomous vehicles comprises accessing a threshold frequency of traffic accidents per unit distance;
wherein identifying the road segment comprises:
accessing a corpus of historical traffic accident data of manually-operated vehicles involved in traffic accidents within the geographic region;
based on the corpus of historical traffic accident data, isolating the road segment associated with a frequency of historical traffic accidents exceeding the threshold frequency.
19. The method of claim 17:
wherein accessing the specification for triggering manual control of autonomous vehicles comprises accessing a threshold frequency of transitions, from autonomous operation to remote manual operation, triggered by autonomous vehicles;
wherein identifying the road segment comprises:
accessing a corpus of driving records of a fleet of autonomous vehicles operating within the geographic region;
isolating the road segment, within the geographic region, associated with a frequency of transitions, from autonomous operation to local manual operation triggered by autonomous vehicles in the fleet, that exceeds the threshold frequency of transitions based on the corpus of driving records.
20. The method of claim 17:
wherein accessing the specification for triggering manual control of autonomous vehicles comprises accessing a threshold frequency of transitions, from autonomous operation to remote manual operation, triggered by local operators occupying autonomous vehicles;
wherein identifying the road segment comprises:
accessing a corpus of driving records of a fleet of autonomous vehicles operating within the geographic region;
isolating the road segment, within the geographic region, associated with a frequency of transitions, from autonomous operation to local manual operation triggered by local operators occupying autonomous vehicles in the fleet, that exceeds the threshold frequency of transitions based on the corpus of driving records.
US16/206,477 2017-11-30 2018-11-30 Method for transferring control of an autonomous vehicle to a remote operator Abandoned US20190163176A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/206,477 US20190163176A1 (en) 2017-11-30 2018-11-30 Method for transferring control of an autonomous vehicle to a remote operator
US16/943,969 US11131990B1 (en) 2017-11-30 2020-07-30 Method for transferring control to an operator
US17/327,362 US11797001B1 (en) 2017-11-30 2021-05-21 Method for transferring control to an operator

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762592806P 2017-11-30 2017-11-30
US16/206,477 US20190163176A1 (en) 2017-11-30 2018-11-30 Method for transferring control of an autonomous vehicle to a remote operator

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/943,969 Continuation US11131990B1 (en) 2017-11-30 2020-07-30 Method for transferring control to an operator

Publications (1)

Publication Number Publication Date
US20190163176A1 true US20190163176A1 (en) 2019-05-30

Family

ID=66633089

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/206,477 Abandoned US20190163176A1 (en) 2017-11-30 2018-11-30 Method for transferring control of an autonomous vehicle to a remote operator
US16/943,969 Active 2039-02-19 US11131990B1 (en) 2017-11-30 2020-07-30 Method for transferring control to an operator
US17/327,362 Active US11797001B1 (en) 2017-11-30 2021-05-21 Method for transferring control to an operator

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/943,969 Active 2039-02-19 US11131990B1 (en) 2017-11-30 2020-07-30 Method for transferring control to an operator
US17/327,362 Active US11797001B1 (en) 2017-11-30 2021-05-21 Method for transferring control to an operator

Country Status (1)

Country Link
US (3) US20190163176A1 (en)

Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180067488A1 (en) * 2016-09-08 2018-03-08 Mentor Graphics Corporation Situational awareness determination based on an annotated environmental model
US20190339697A1 (en) * 2018-05-01 2019-11-07 The Hi-Tech Robotic Systemz Ltd Managing drive modes of a vehicle
US20190389482A1 (en) * 2018-06-26 2019-12-26 Toyota Research Institute, Inc. Systems and methods for end-user modification of driving behavior of autonomous vehicle
US10558224B1 (en) * 2017-08-10 2020-02-11 Zoox, Inc. Shared vehicle obstacle data
US20200084419A1 (en) * 2018-09-10 2020-03-12 Panasonic Intellectual Property Corporation Of America Video transmitting device, video transmitting method, and recording medium
CN111372216A (en) * 2020-02-28 2020-07-03 中南大学 Resource scheduling method, system and storage medium for intelligent networked automobile
US20200239023A1 (en) * 2019-01-25 2020-07-30 Uatc, Llc Operator assistance for autonomous vehicles
US20200247469A1 (en) * 2019-01-31 2020-08-06 StradVision, Inc. Method and device for delivering steering intention of autonomous driving module or driver to steering apparatus of subject vehicle more accurately
US10761542B1 (en) * 2017-07-11 2020-09-01 Waymo Llc Methods and systems for keeping remote assistance operators alert
US20200326702A1 (en) * 2019-04-15 2020-10-15 Toyota Jidosha Kabushiki Kaisha Vehicle remote instruction system
US20200407949A1 (en) * 2018-07-31 2020-12-31 Komatsu Ltd. Work machine
CN112339775A (en) * 2019-08-06 2021-02-09 丰田自动车株式会社 Driving operation relay system and vehicle
CN112349170A (en) * 2019-08-09 2021-02-09 丰田自动车株式会社 Vehicle remote indication training device
CN112415904A (en) * 2019-08-23 2021-02-26 郑州宇通客车股份有限公司 Remote control method, device and system for automatic driving vehicle
CN112462640A (en) * 2019-09-06 2021-03-09 丰田自动车株式会社 Vehicle remote indication system
US10942516B2 (en) * 2018-12-12 2021-03-09 Valeo Schalter Und Sensoren Gmbh Vehicle path updates via remote vehicle control
CN112486162A (en) * 2019-09-12 2021-03-12 丰田自动车株式会社 Vehicle remote indication system
US11008199B2 (en) * 2018-08-22 2021-05-18 Tnt Crane & Rigging, Inc. Remotely operated crane system
GB2590112A (en) * 2019-09-12 2021-06-23 Motional Ad Llc Operation of an autonomous vehicle based on availability of navigational information
US20210191410A1 (en) * 2019-12-18 2021-06-24 Westinghouse Air Brake Technologies Corporation Vehicle control system
US11062617B2 (en) * 2019-01-14 2021-07-13 Polixir Technologies Limited Training system for autonomous driving control policy
US11079759B2 (en) * 2019-02-27 2021-08-03 Gm Cruise Holdings Llc Detection of active emergency vehicles shared within an autonomous vehicle fleet
EP3869843A1 (en) 2020-02-19 2021-08-25 Volkswagen Ag Method for invoking a teleoperated driving session, apparatus for performing the steps of the method, vehicle and computer program
CN113306569A (en) * 2020-02-26 2021-08-27 大众汽车股份公司 Method, computer program, device, vehicle and network entity for predicting deadlock situations of an automated vehicle
WO2021177964A1 (en) 2020-03-05 2021-09-10 Guident, Ltd. Artificial intelligence methods and systems for remote monitoring and control of autonomous vehicles
US11131990B1 (en) 2017-11-30 2021-09-28 Direct Current Capital LLC Method for transferring control to an operator
US20210300420A1 (en) * 2020-03-31 2021-09-30 Honda Motor Co., Ltd. Mobile object control method, mobile object control device, and storage medium
WO2021203079A1 (en) * 2020-04-03 2021-10-07 Uber Technologies, Inc. System and methods for automatic generation of remote assistance sessions based on anomaly data collected from human-driven vehicles
US20210366289A1 (en) * 2020-05-19 2021-11-25 Toyota Motor North America, Inc. Control of transport en route
US11215984B2 (en) * 2018-01-09 2022-01-04 Uatc, Llc Systems and methods for controlling an autonomous vehicle
US11217096B2 (en) * 2018-01-26 2022-01-04 Shandong Provincial Communications Planning And Design Institute Group Co., Ltd. Traffic flow dynamic guiding method based on region block
US20220011767A1 (en) * 2018-12-28 2022-01-13 Robert Bosch Gmbh Method for the at least semi-automated guidance of a motor vehicle
US11247702B2 (en) * 2017-12-25 2022-02-15 Hitachi Astemo, Ltd. Vehicle control device and electronic control system
US11249473B2 (en) * 2018-03-23 2022-02-15 Honda Motor Co., Ltd. Remote driving managing apparatus, and computer readable storage medium
US20220063654A1 (en) * 2020-08-27 2022-03-03 Here Global B.V. Method and apparatus to improve interaction models and user experience for autonomous driving in transition regions
CN114155741A (en) * 2020-09-08 2022-03-08 大众汽车股份公司 Vehicle, method, computer program and apparatus for resolving deadlocked traffic conditions of an automatically operated vehicle
US20220081001A1 (en) * 2020-09-15 2022-03-17 Beijing Baidu Netcom Science Technology Co., Ltd. Method and apparatus of determining guide path, method and apparatus of controlling driving of vehicle, and electronic device
US11314246B2 (en) * 2019-08-16 2022-04-26 Uber Technologies, Inc. Command toolbox for autonomous vehicles
WO2022091517A1 (en) * 2020-10-28 2022-05-05 株式会社デンソー Autonomous driving method, device, program, and system
WO2022090085A1 (en) * 2020-10-28 2022-05-05 Bayerische Motoren Werke Aktiengesellschaft Device and method for controlling a driving function for the automated longitudinal and/or lateral guidance of a vehicle
US11345354B2 (en) * 2019-03-25 2022-05-31 Subaru Corporation Vehicle control device, vehicle control method and computer-readable medium containing program
US11383702B2 (en) * 2018-11-29 2022-07-12 Hyundai Motor Company Vehicle and control method thereof
US11422551B2 (en) * 2018-12-27 2022-08-23 Intel Corporation Technologies for providing a cognitive capacity test for autonomous driving
US11447154B2 (en) * 2019-07-29 2022-09-20 Toyota Jidosha Kabushiki Kaisha Vehicle travel system
US11465652B2 (en) * 2020-06-11 2022-10-11 Woven Planet North America, Inc. Systems and methods for disengagement prediction and triage assistant
US20220335819A1 (en) * 2019-10-10 2022-10-20 Starship Technologies Oü Device, system and method for assisting mobile robots in autonomously crossing roads
US20220343703A1 (en) * 2019-11-19 2022-10-27 Vitesco Technologies GmbH Method for managing sporadic anomalies of a power system of a motor vehicle
CN115311876A (en) * 2021-05-07 2022-11-08 丰田自动车株式会社 Remote assistance management system, remote assistance management method, and remote assistance management program
US11518411B2 (en) * 2019-07-17 2022-12-06 Toyota Jidosha Kabushiki Kaisha Vehicle controller device and remote vehicle control system
US20220410929A1 (en) * 2021-06-25 2022-12-29 Hyundai Motor Company Autonomous vehicle, control system for remotely controlling the same, and method thereof
US11541790B2 (en) * 2018-10-24 2023-01-03 Robert Bosch Gmbh Method and device for adapting a position of a seat device of a vehicle during and/or prior to a switchover of the vehicle from an automated driving mode to a manual driving mode
WO2023276412A1 (en) * 2021-07-02 2023-01-05 株式会社デンソー Remote assistance device and remote assistance program
US20230021615A1 (en) * 2019-12-16 2023-01-26 Hitachi Astemo, Ltd. Vehicle control device, and vehicle control system
US11565420B2 (en) * 2019-02-13 2023-01-31 Phantom Auto Inc. Teleoperation in a smart container yard
US20230032713A1 (en) * 2021-06-02 2023-02-02 May Mobility, Inc. Method and system for remote assistance of an autonomous agent
US20230062744A1 (en) * 2021-08-25 2023-03-02 Toyota Jidosha Kabushiki Kaisha Information processing apparatus, information processing method and non-transitory storage medium
US20230138745A1 (en) * 2021-11-03 2023-05-04 Gm Cruise Holdings Llc Methodology for establishing time of response to map discrepancy detection event
US20230137111A1 (en) * 2021-11-03 2023-05-04 Gm Cruise Holdings Llc Methodology for establishing cadence-based review frequency for map segments
US20230137142A1 (en) * 2020-03-27 2023-05-04 Mercedes-Benz Group AG Method and a device for identifying potential hazard zones in road traffic
US11687094B2 (en) 2020-08-27 2023-06-27 Here Global B.V. Method, apparatus, and computer program product for organizing autonomous vehicles in an autonomous transition region
US11693409B2 (en) * 2018-03-21 2023-07-04 Uatc, Llc Systems and methods for a scenario tagger for autonomous vehicles
US11713979B2 (en) 2020-08-27 2023-08-01 Here Global B.V. Method, apparatus, and computer program product for generating a transition variability index related to autonomous driving
US11731615B2 (en) * 2019-04-28 2023-08-22 Ottopia Technologies Ltd. System and method for remote operator assisted driving through collision avoidance
US11755008B2 (en) * 2020-01-31 2023-09-12 Nissan North America, Inc. Using plays for human-AI decision-making in remote autonomous vehicle support
US11820401B1 (en) * 2020-06-02 2023-11-21 Aurora Operations, Inc. Autonomous vehicle remote teleoperations system
US11989018B2 (en) * 2019-03-29 2024-05-21 Honda Motor Co., Ltd. Remote operation device and remote operation method
WO2024123764A1 (en) * 2022-12-06 2024-06-13 Zoox, Inc. Systems and methods for disengaging or engaging autonomy remotely
US12027053B1 (en) 2022-12-13 2024-07-02 May Mobility, Inc. Method and system for assessing and mitigating risks encounterable by an autonomous vehicle
US12024197B2 (en) 2020-07-01 2024-07-02 May Mobility, Inc. Method and system for dynamically curating autonomous vehicle policies
US12032375B2 (en) 2018-07-20 2024-07-09 May Mobility, Inc. Multi-perspective system and method for behavioral policy selection by an autonomous agent
US12043524B2 (en) 2022-03-04 2024-07-23 Tnt Crane & Rigging, Inc. Remotely operated crane control system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220026004A (en) * 2020-08-24 2022-03-04 현대자동차주식회사 Autonomous driving control apparatus and method

Family Cites Families (13)

Publication number Priority date Publication date Assignee Title
US9798323B2 (en) * 2014-07-28 2017-10-24 GM Global Technology Operations LLC Crowd-sourced transfer-of-control policy for automated vehicles
JP6437629B2 (en) * 2015-03-03 2018-12-12 パイオニア株式会社 Route search apparatus, control method, program, and storage medium
KR20170015115A (en) * 2015-07-30 2017-02-08 삼성전자주식회사 Autonomous vehicle and method for controlling the autonomous vehicle
US20210295439A1 (en) * 2016-01-22 2021-09-23 State Farm Mutual Automobile Insurance Company Component malfunction impact assessment
US9688288B1 (en) * 2016-03-08 2017-06-27 VOLKSWAGEN AG et al. Geofencing for auto drive route planning
JP6355111B2 (en) * 2016-04-28 2018-07-11 本田技研工業株式会社 Vehicle control system
KR102521934B1 (en) * 2016-06-13 2023-04-18 삼성디스플레이 주식회사 Touch sensor and method for sensing touch using thereof
US10054947B2 (en) 2016-08-17 2018-08-21 Omnitracs, Llc Emergency stopping for autonomous commercial vehicles
US10202115B2 (en) 2016-09-13 2019-02-12 Here Global B.V. Method and apparatus for triggering vehicle sensors based on human accessory detection
CN109863545B (en) * 2016-10-21 2022-04-26 三菱电机株式会社 Automatic driving assistance device, automatic driving vehicle, automatic driving assistance method, and computer-readable storage medium
US10133270B2 (en) 2017-03-28 2018-11-20 Toyota Research Institute, Inc. Electronic control units, vehicles, and methods for switching vehicle control from an autonomous driving mode
US20190163176A1 (en) 2017-11-30 2019-05-30 drive.ai Inc. Method for transferring control of an autonomous vehicle to a remote operator
US10678238B2 (en) 2017-12-20 2020-06-09 Intel IP Corporation Modified-reality device and method for operating a modified-reality device

Cited By (103)

Publication number Priority date Publication date Assignee Title
US20180067488A1 (en) * 2016-09-08 2018-03-08 Mentor Graphics Corporation Situational awareness determination based on an annotated environmental model
US10678240B2 (en) * 2016-09-08 2020-06-09 Mentor Graphics Corporation Sensor modification based on an annotated environmental model
US11698643B2 (en) 2017-07-11 2023-07-11 Waymo Llc Methods and systems for keeping remote assistance operators alert
US11269354B2 (en) 2017-07-11 2022-03-08 Waymo Llc Methods and systems for keeping remote assistance operators alert
US10761542B1 (en) * 2017-07-11 2020-09-01 Waymo Llc Methods and systems for keeping remote assistance operators alert
US11449073B2 (en) * 2017-08-10 2022-09-20 Zoox, Inc. Shared vehicle obstacle data
US10558224B1 (en) * 2017-08-10 2020-02-11 Zoox, Inc. Shared vehicle obstacle data
US11131990B1 (en) 2017-11-30 2021-09-28 Direct Current Capital LLC Method for transferring control to an operator
US11797001B1 (en) 2017-11-30 2023-10-24 Direct Current Capital LLC Method for transferring control to an operator
US11247702B2 (en) * 2017-12-25 2022-02-15 Hitachi Astemo, Ltd. Vehicle control device and electronic control system
US11215984B2 (en) * 2018-01-09 2022-01-04 Uatc, Llc Systems and methods for controlling an autonomous vehicle
US11840266B2 (en) 2018-01-09 2023-12-12 Uatc, Llc Systems and methods for controlling an autonomous vehicle
US11217096B2 (en) * 2018-01-26 2022-01-04 Shandong Provincial Communications Planning And Design Institute Group Co., Ltd. Traffic flow dynamic guiding method based on region block
US11693409B2 (en) * 2018-03-21 2023-07-04 Uatc, Llc Systems and methods for a scenario tagger for autonomous vehicles
US11249473B2 (en) * 2018-03-23 2022-02-15 Honda Motor Co., Ltd. Remote driving managing apparatus, and computer readable storage medium
US20190339697A1 (en) * 2018-05-01 2019-11-07 The Hi-Tech Robotic Systemz Ltd Managing drive modes of a vehicle
US10717445B2 (en) * 2018-06-26 2020-07-21 Toyota Research Institute, Inc. Systems and methods for end-user modification of driving behavior of autonomous vehicle
US20190389482A1 (en) * 2018-06-26 2019-12-26 Toyota Research Institute, Inc. Systems and methods for end-user modification of driving behavior of autonomous vehicle
US12032375B2 (en) 2018-07-20 2024-07-09 May Mobility, Inc. Multi-perspective system and method for behavioral policy selection by an autonomous agent
US20200407949A1 (en) * 2018-07-31 2020-12-31 Komatsu Ltd. Work machine
US11008199B2 (en) * 2018-08-22 2021-05-18 Tnt Crane & Rigging, Inc. Remotely operated crane system
US11172167B2 (en) * 2018-09-10 2021-11-09 Panasonic Intellectual Property Corporation Of America Video transmitting device, video transmitting method, and recording medium
US20200084419A1 (en) * 2018-09-10 2020-03-12 Panasonic Intellectual Property Corporation Of America Video transmitting device, video transmitting method, and recording medium
US11541790B2 (en) * 2018-10-24 2023-01-03 Robert Bosch Gmbh Method and device for adapting a position of a seat device of a vehicle during and/or prior to a switchover of the vehicle from an automated driving mode to a manual driving mode
US11383702B2 (en) * 2018-11-29 2022-07-12 Hyundai Motor Company Vehicle and control method thereof
US10942516B2 (en) * 2018-12-12 2021-03-09 Valeo Schalter Und Sensoren Gmbh Vehicle path updates via remote vehicle control
US11422551B2 (en) * 2018-12-27 2022-08-23 Intel Corporation Technologies for providing a cognitive capacity test for autonomous driving
US20220011767A1 (en) * 2018-12-28 2022-01-13 Robert Bosch Gmbh Method for the at least semi-automated guidance of a motor vehicle
US11062617B2 (en) * 2019-01-14 2021-07-13 Polixir Technologies Limited Training system for autonomous driving control policy
US11884293B2 (en) * 2019-01-25 2024-01-30 Uber Technologies, Inc. Operator assistance for autonomous vehicles
US20200239023A1 (en) * 2019-01-25 2020-07-30 Uatc, Llc Operator assistance for autonomous vehicles
US10843728B2 (en) * 2019-01-31 2020-11-24 StradVision, Inc. Method and device for delivering steering intention of autonomous driving module or driver to steering apparatus of subject vehicle more accurately
US20200247469A1 (en) * 2019-01-31 2020-08-06 StradVision, Inc. Method and device for delivering steering intention of autonomous driving module or driver to steering apparatus of subject vehicle more accurately
US11565420B2 (en) * 2019-02-13 2023-01-31 Phantom Auto Inc. Teleoperation in a smart container yard
US11079759B2 (en) * 2019-02-27 2021-08-03 Gm Cruise Holdings Llc Detection of active emergency vehicles shared within an autonomous vehicle fleet
US11726476B2 (en) 2019-02-27 2023-08-15 Gm Cruise Holdings Llc Detection of active emergency vehicles shared within an autonomous vehicle fleet
US11345354B2 (en) * 2019-03-25 2022-05-31 Subaru Corporation Vehicle control device, vehicle control method and computer-readable medium containing program
US11989018B2 (en) * 2019-03-29 2024-05-21 Honda Motor Co., Ltd. Remote operation device and remote operation method
US11774964B2 (en) * 2019-04-15 2023-10-03 Toyota Jidosha Kabushiki Kaisha Vehicle remote instruction system
US20200326702A1 (en) * 2019-04-15 2020-10-15 Toyota Jidosha Kabushiki Kaisha Vehicle remote instruction system
US11731615B2 (en) * 2019-04-28 2023-08-22 Ottopia Technologies Ltd. System and method for remote operator assisted driving through collision avoidance
US11518411B2 (en) * 2019-07-17 2022-12-06 Toyota Jidosha Kabushiki Kaisha Vehicle controller device and remote vehicle control system
US11447154B2 (en) * 2019-07-29 2022-09-20 Toyota Jidosha Kabushiki Kaisha Vehicle travel system
CN112339775A (en) * 2019-08-06 2021-02-09 丰田自动车株式会社 Driving operation relay system and vehicle
US20210043103A1 (en) * 2019-08-09 2021-02-11 Toyota Jidosha Kabushiki Kaisha Vehicle remote instruction training device
US11955031B2 (en) * 2019-08-09 2024-04-09 Toyota Jidosha Kabushiki Kaisha Vehicle remote instruction training device
CN112349170A (en) * 2019-08-09 2021-02-09 丰田自动车株式会社 Vehicle remote indication training device
US11314246B2 (en) * 2019-08-16 2022-04-26 Uber Technologies, Inc. Command toolbox for autonomous vehicles
CN112415904A (en) * 2019-08-23 2021-02-26 郑州宇通客车股份有限公司 Remote control method, device and system for automatic driving vehicle
US11215982B2 (en) * 2019-09-06 2022-01-04 Toyota Jidosha Kabushiki Kaisha Vehicle remote instruction system
CN112462640A (en) * 2019-09-06 2021-03-09 丰田自动车株式会社 Vehicle remote indication system
US11999372B2 (en) 2019-09-12 2024-06-04 Motional Ad Llc Operation of an autonomous vehicle based on availability of navigational information
GB2590112B (en) * 2019-09-12 2022-02-09 Motional Ad Llc Operation of an autonomous vehicle based on availability of navigational information
CN112486162A (en) * 2019-09-12 2021-03-12 丰田自动车株式会社 Vehicle remote indication system
GB2590112A (en) * 2019-09-12 2021-06-23 Motional Ad Llc Operation of an autonomous vehicle based on availability of navigational information
US20220335819A1 (en) * 2019-10-10 2022-10-20 Starship Technologies Oü Device, system and method for assisting mobile robots in autonomously crossing roads
US11721142B2 (en) * 2019-11-19 2023-08-08 Vitesco Technologies GmbH Method for managing sporadic anomalies of a power system of a motor vehicle
US20220343703A1 (en) * 2019-11-19 2022-10-27 Vitesco Technologies GmbH Method for managing sporadic anomalies of a power system of a motor vehicle
US20230021615A1 (en) * 2019-12-16 2023-01-26 Hitachi Astemo, Ltd. Vehicle control device, and vehicle control system
US11720113B2 (en) * 2019-12-18 2023-08-08 Westinghouse Air Brake Technologies Corporation Vehicle control and trip planning system
US20210191410A1 (en) * 2019-12-18 2021-06-24 Westinghouse Air Brake Technologies Corporation Vehicle control system
US11755008B2 (en) * 2020-01-31 2023-09-12 Nissan North America, Inc. Using plays for human-AI decision-making in remote autonomous vehicle support
EP3869843A1 (en) 2020-02-19 2021-08-25 Volkswagen Ag Method for invoking a teleoperated driving session, apparatus for performing the steps of the method, vehicle and computer program
US11513513B2 (en) 2020-02-19 2022-11-29 Volkswagen Aktiengesellschaft Method for invoking a teleoperated driving session, apparatus for performing the steps of the method, vehicle and computer program
EP3872594A1 (en) * 2020-02-26 2021-09-01 Volkswagen Ag A method, a computer program, an apparatus, a vehicle, and a network entity for predicting a deadlock situation for an automated vehicle
CN113306569A (en) * 2020-02-26 2021-08-27 大众汽车股份公司 Method, computer program, device, vehicle and network entity for predicting deadlock situations of an automated vehicle
CN111372216A (en) * 2020-02-28 2020-07-03 中南大学 Resource scheduling method, system and storage medium for intelligent networked automobile
EP4097550A4 (en) * 2020-03-05 2023-09-13 Guident, Ltd. Artificial intelligence methods and systems for remote monitoring and control of autonomous vehicles
WO2021177964A1 (en) 2020-03-05 2021-09-10 Guident, Ltd. Artificial intelligence methods and systems for remote monitoring and control of autonomous vehicles
US20230137142A1 (en) * 2020-03-27 2023-05-04 Mercedes-Benz Group AG Method and a device for identifying potential hazard zones in road traffic
US11814082B2 (en) * 2020-03-31 2023-11-14 Honda Motor Co., Ltd. Mobile object control method, mobile object control device, and storage medium
US11565718B2 (en) * 2020-03-31 2023-01-31 Honda Motor Co., Ltd. Mobile object control method, mobile object control device, and storage medium
US20210300420A1 (en) * 2020-03-31 2021-09-30 Honda Motor Co., Ltd. Mobile object control method, mobile object control device, and storage medium
US20230126140A1 (en) * 2020-03-31 2023-04-27 Honda Motor Co., Ltd. Mobile object control method, mobile object control device, and storage medium
WO2021203079A1 (en) * 2020-04-03 2021-10-07 Uber Technologies, Inc. System and methods for automatic generation of remote assistance sessions based on anomaly data collected from human-driven vehicles
US11704998B2 (en) * 2020-04-03 2023-07-18 Uber Technologies, Inc. System and methods for automatic generation of remote assistance sessions based on anomaly data collected from human-driven vehicle
US11847919B2 (en) * 2020-05-19 2023-12-19 Toyota Motor North America, Inc. Control of transport en route
US20210366289A1 (en) * 2020-05-19 2021-11-25 Toyota Motor North America, Inc. Control of transport en route
US11820401B1 (en) * 2020-06-02 2023-11-21 Aurora Operations, Inc. Autonomous vehicle remote teleoperations system
US11465652B2 (en) * 2020-06-11 2022-10-11 Woven Planet North America, Inc. Systems and methods for disengagement prediction and triage assistant
US12024197B2 (en) 2020-07-01 2024-07-02 May Mobility, Inc. Method and system for dynamically curating autonomous vehicle policies
US11691643B2 (en) * 2020-08-27 2023-07-04 Here Global B.V. Method and apparatus to improve interaction models and user experience for autonomous driving in transition regions
US11687094B2 (en) 2020-08-27 2023-06-27 Here Global B.V. Method, apparatus, and computer program product for organizing autonomous vehicles in an autonomous transition region
US11713979B2 (en) 2020-08-27 2023-08-01 Here Global B.V. Method, apparatus, and computer program product for generating a transition variability index related to autonomous driving
US20220063654A1 (en) * 2020-08-27 2022-03-03 Here Global B.V. Method and apparatus to improve interaction models and user experience for autonomous driving in transition regions
EP3965443A1 (en) * 2020-09-08 2022-03-09 Volkswagen Ag Vehicles, methods, computer programs, and apparatuses for resolving a deadlock traffic situation of an automatically operated vehicle
CN114155741A (en) * 2020-09-08 2022-03-08 大众汽车股份公司 Vehicle, method, computer program and apparatus for resolving deadlocked traffic conditions of an automatically operated vehicle
US11724716B2 (en) * 2020-09-15 2023-08-15 Beijing Baidu Netcom Science Technology Co., Ltd. Method and apparatus of determining guide path, method and apparatus of controlling driving of vehicle, and electronic device
US20220081001A1 (en) * 2020-09-15 2022-03-17 Beijing Baidu Netcom Science Technology Co., Ltd. Method and apparatus of determining guide path, method and apparatus of controlling driving of vehicle, and electronic device
WO2022090085A1 (en) * 2020-10-28 2022-05-05 Bayerische Motoren Werke Aktiengesellschaft Device and method for controlling a driving function for the automated longitudinal and/or lateral guidance of a vehicle
WO2022091517A1 (en) * 2020-10-28 2022-05-05 株式会社デンソー Autonomous driving method, device, program, and system
CN115311876A (en) * 2021-05-07 2022-11-08 丰田自动车株式会社 Remote assistance management system, remote assistance management method, and remote assistance management program
US20220355827A1 (en) * 2021-05-07 2022-11-10 Toyota Jidosha Kabushiki Kaisha Remote assistance management system, remote assistance management method, and non-transitory computer-readable storage medium
US20230032713A1 (en) * 2021-06-02 2023-02-02 May Mobility, Inc. Method and system for remote assistance of an autonomous agent
US20220410929A1 (en) * 2021-06-25 2022-12-29 Hyundai Motor Company Autonomous vehicle, control system for remotely controlling the same, and method thereof
WO2023276412A1 (en) * 2021-07-02 2023-01-05 株式会社デンソー Remote assistance device and remote assistance program
US20230062744A1 (en) * 2021-08-25 2023-03-02 Toyota Jidosha Kabushiki Kaisha Information processing apparatus, information processing method and non-transitory storage medium
US20230137111A1 (en) * 2021-11-03 2023-05-04 Gm Cruise Holdings Llc Methodology for establishing cadence-based review frequency for map segments
US20230138745A1 (en) * 2021-11-03 2023-05-04 Gm Cruise Holdings Llc Methodology for establishing time of response to map discrepancy detection event
US11821738B2 (en) * 2021-11-03 2023-11-21 Gm Cruise Holdings Llc Methodology for establishing time of response to map discrepancy detection event
US12043524B2 (en) 2022-03-04 2024-07-23 Tnt Crane & Rigging, Inc. Remotely operated crane control system
WO2024123764A1 (en) * 2022-12-06 2024-06-13 Zoox, Inc. Systems and methods for disengaging or engaging autonomy remotely
US12027053B1 (en) 2022-12-13 2024-07-02 May Mobility, Inc. Method and system for assessing and mitigating risks encounterable by an autonomous vehicle

Also Published As

Publication number Publication date
US11797001B1 (en) 2023-10-24
US11131990B1 (en) 2021-09-28

Similar Documents

Publication Publication Date Title
US11797001B1 (en) Method for transferring control to an operator
US20230237908A1 (en) Method for accessing supplemental sensor data from other vehicles
US11592833B2 (en) Method for updating a localization map for a fleet of autonomous vehicles
US11685360B2 (en) Planning for unknown objects by an autonomous vehicle
US11495126B2 (en) Systems and methods for driving intelligence allocation between vehicles and highways
US10338591B2 (en) Methods for autonomously navigating across uncontrolled and controlled intersections
US11400925B2 (en) Planning for unknown objects by an autonomous vehicle
US20200400443A1 (en) Systems and methods for localization
US10496099B2 (en) Systems and methods for speed limit context awareness
CN108292474B (en) Coordination of a fleet of dispatching and maintaining autonomous vehicles
US10234864B2 (en) Planning for unknown objects by an autonomous vehicle
US20180281815A1 (en) Predictive teleassistance system for autonomous vehicles
CN114643995A (en) Simulation system and method for autonomous vehicle
CN110807412B (en) Vehicle laser positioning method, vehicle-mounted equipment and storage medium
US11260875B2 (en) Systems and methods for road surface dependent motion planning
US11941980B1 (en) Dynamic access and egress of railroad right of way
WO2018165199A1 (en) Planning for unknown objects by an autonomous vehicle
US20240217529A1 (en) Failover handling in autonomous vehicles
US20200333803A1 (en) Method of assisting with the driving of vehicles, computer program and associated system
KR20240019928A (en) Method for predicting and determining abnormal state of autonomous driving vehicle and apparatus and system therefor

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: DRIVE.AI, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, TAO;SONG, WEI;SIGNING DATES FROM 20181130 TO 20190320;REEL/FRAME:048987/0872

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE