EP3256815A1 - Autonomous navigation system - Google Patents

Autonomous navigation system

Info

Publication number
EP3256815A1
Authority
EP
European Patent Office
Prior art keywords
characterization
vehicle
virtual
driving route
route
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP15813220.9A
Other languages
German (de)
English (en)
Inventor
The designation of the inventor has not yet been filed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc
Publication of EP3256815A1
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/3407: Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415: Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38: Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804: Creation or updating of map data
    • G01C21/3807: Creation or updating of map data characterised by the type of data
    • G01C21/3815: Road data
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062: Adapting control system settings
    • B60W2050/0075: Automatic parameter input, automatic initialising or calibrating means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00: Input parameters relating to data
    • B60W2556/10: Historical data
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00: Input parameters relating to data
    • B60W2556/45: External transmission of data to or from the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00: Input parameters relating to data
    • B60W2556/45: External transmission of data to or from the vehicle
    • B60W2556/50: External transmission of data to or from the vehicle for navigation systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2756/00: Output or target parameters relating to data
    • B60W2756/10: Involving external transmission of data to or from the vehicle

Definitions

  • This disclosure relates generally to autonomous navigation of a vehicle, and in particular to development and evaluation of an autonomous navigation route characterization which can be utilized by at least some portion of a vehicle to autonomously navigate the route.
  • autonomous navigation is enabled via an autonomous navigation system which can process and respond to static features (e.g., roadway lanes, road signs, etc.) and dynamic features (present locations of other vehicles in a roadway on which the route extends, present environmental conditions, roadway obstructions, etc.) along a route in real-time as they are encountered, thereby replicating the real-time processing and driving capabilities of a human being.
  • the processing and control capabilities required to simulate such processing and responsive capability can be impractical, if technically feasible at all, and the complexity and magnitude of computer systems required to be included in a vehicle to enable such real-time processing and responsiveness can present an unsuitably excessive investment in capital costs for each vehicle, thereby rendering the system impractical for usage on a wide scale.
  • autonomous navigation is enabled by developing a detailed map of various routes, including data indicating various features of the road (e.g., road signs, intersections, etc.), specifying various driving rules relative to the various routes (e.g., proper speed limits, lane changing speeds, lane locations, variations of driving rules based on various climate conditions and times of day, etc. for a given portion of a given route), and providing the map to autonomous navigation systems of various vehicles to enable the vehicles to autonomously navigate the various routes using the map.
  • Some embodiments provide a vehicle configured to autonomously navigate a driving route.
  • the vehicle includes sensor devices which monitor characteristics of the driving route based on the vehicle being navigated along the driving route, and an autonomous navigation system which is interoperable with the sensor devices to: implement a succession of updates to a virtual characterization of the driving route, based on monitoring a succession of manual navigations of the vehicle along the driving route, associate a confidence indicator with the virtual characterization, based on monitoring the succession of updates to the virtual characterization, and enable autonomous navigation of the vehicle along the driving route, based at least in part upon a determination that the confidence indicator at least meets a threshold confidence indication, such that the autonomous navigation system autonomously navigates the vehicle along at least a portion of the driving route, based on controlling one or more control elements of the vehicle and based on a user-initiated command, received at the autonomous navigation system via a user interface of the vehicle, to engage in autonomous navigation of the portion of the driving route.
  • Some embodiments provide an apparatus which includes an autonomous navigation system configured to be installed in a vehicle and selectively enable autonomous navigation of the vehicle along a driving route.
  • the autonomous navigation system can include a route characterization module which implements a succession of updates to a virtual characterization of the driving route, wherein each update is based on monitoring a separate one of a succession of manually-controlled navigations of the vehicle along the driving route, and implementing each update of the succession of updates includes associating a confidence indicator with the virtual characterization based on a monitored variation of the virtual characterization which is associated with the respective update.
  • the autonomous navigation system can include a route evaluation module configured to enable user-initiated autonomous navigation of the driving route by the vehicle, based on a determination that a confidence indicator associated with the characterization of the driving route exceeds a threshold confidence indication.
  • Some embodiments provide a method which includes performing, by one or more computer systems installed in a vehicle: receiving a set of sensor data associated with a driving route from a set of sensors included in the vehicle, based at least in part upon the vehicle being manually navigated along the driving route; processing the set of sensor data to update a stored characterization of the driving route, wherein the stored characterization is based on at least one previously-generated set of sensor data associated with one or more historical manual navigations of the vehicle along the driving route; associating a confidence indicator with the updated characterization based on a comparison of the updated characterization with the stored characterization; and enabling availability of user-initiated autonomous navigation, by the vehicle, of the driving route, based at least in part upon a determination that the confidence indicator at least meets a predetermined threshold confidence indication.
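The update-and-enable loop described in the embodiments above can be sketched roughly as follows. The class, names, and simple variation metric are assumptions for illustration; the patent does not specify any particular data structures or formulas.

```python
from dataclasses import dataclass, field


@dataclass
class RouteCharacterization:
    """Hypothetical stored virtual characterization of a driving route."""
    static_features: set = field(default_factory=set)
    confidence: float = 0.0  # the confidence indicator


# Assumed predetermined threshold confidence indication.
CONFIDENCE_THRESHOLD = 0.9


def process_navigation(stored: RouteCharacterization, observed: set) -> bool:
    """One update in the succession of manual navigations: merge newly
    observed static features, adjust the confidence indicator based on how
    much the characterization varied, and report whether user-initiated
    autonomous navigation of the route may be enabled."""
    new_features = observed - stored.static_features
    stored.static_features |= observed
    # Fewer newly discovered features => smaller variation => more confidence.
    variation = len(new_features) / max(len(stored.static_features), 1)
    stored.confidence = 0.5 * stored.confidence + 0.5 * (1.0 - variation)
    return stored.confidence >= CONFIDENCE_THRESHOLD
```

Repeated navigations over an unchanged route drive the variation toward zero and the confidence indicator toward its maximum, after which autonomous navigation could be offered for user selection.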
  • FIG. 1 illustrates a schematic block diagram of a vehicle 100 which comprises an autonomous navigation system (ANS), according to some embodiments.
  • FIG. 2 illustrates a vehicle, which includes an ANS and a set of sensor devices, navigating through a region which includes multiple roadway portions of multiple roadways, according to some embodiments.
  • FIG. 3 illustrates a vehicle, which includes an ANS and a set of sensor devices, navigating through a region which includes multiple roadway portions of a roadway, according to some embodiments.
  • FIG. 4 illustrates a block diagram of an autonomous navigation system (ANS), according to some embodiments.
  • FIG. 5A-C illustrate a user interface associated with the autonomous navigation system, according to some embodiments.
  • FIG. 6 illustrates a user interface associated with the autonomous navigation system, according to some embodiments.
  • FIG. 7 illustrates developing virtual characterizations of one or more roadway portions to enable autonomous navigation of the one or more roadway portions, according to some embodiments.
  • FIG. 8 illustrates a schematic of an autonomous navigation network, according to some embodiments.
  • FIG. 9A-B illustrate a schematic of an autonomous navigation network, according to some embodiments.
  • FIG. 10 illustrates a "curation spectrum" of processing available to generate one or more virtual roadway portion characterizations, according to some embodiments.
  • FIG. 11 illustrates receiving and processing virtual characterizations of one or more roadway portions, according to some embodiments.
  • FIG. 12 illustrates implementing at least a portion of a curation spectrum with regard to one or more virtual characterizations of one or more roadway portions, according to some embodiments.
  • FIG. 13 illustrates an example computer system configured to implement aspects of a system and method for autonomous navigation, according to some embodiments.
  • Reciting that a unit/circuit/component is “configured to” perform one or more tasks is expressly intended not to invoke 35 U.S.C. § 112(f) for that unit/circuit/component.
  • “configured to” can include generic structure (e.g., generic circuitry) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing the task(s) at issue.
  • "Configure to” may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.
  • this term is used to describe one or more factors that affect a determination. This term does not foreclose additional factors that may affect a determination. That is, a determination may be solely based on those factors or based, at least in part, on those factors.
  • Some embodiments include one or more vehicles in which an autonomous navigation system (“ANS”) is included, where the ANS enables autonomous navigation of various driving routes (also referred to herein as “routes”) via developing a virtual characterization of the routes based on monitoring various real-world features of the routes during navigation of the vehicle along the routes.
  • the ANS controls various control elements of a vehicle to autonomously drive the vehicle (herein referred to as “autonomously navigate”, “autonomous navigation”, etc.) along one or more portions of a route based at least in part upon virtual characterizations of the one or more portions of the route.
  • Such autonomous navigation can include controlling vehicle control elements based on characterizations of driving rules included in the virtual route characterizations (e.g., vehicle velocity, spacing relative to other vehicles, position on a roadway, adjustments to same based on environmental conditions, etc.) and characterizations of static features of the route included in the virtual route characterizations (positions of roadway lanes, roadway edges, road signs, landmarks, roadway inclines, intersections, crosswalks, etc.), so that the ANS can safely autonomously navigate the vehicle along at least a portion of a route.
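As a rough illustration of how characterized driving rules might feed control decisions such as vehicle velocity, the sketch below looks up a rule for a roadway portion and adjusts it for environmental conditions. The dictionary layout, field names, and the 0.8 rain factor are assumptions, not drawn from the patent.

```python
def target_speed_mph(driving_rules: dict, portion_id: str, conditions: dict) -> float:
    """Look up the characterized driving rule for a roadway portion and
    apply an example adjustment for environmental conditions, as an ANS
    might when setting vehicle velocity along that portion."""
    base = driving_rules[portion_id]["speed_mph"]
    if conditions.get("rain"):
        base *= 0.8  # illustrative reduction for wet conditions
    return base
```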
  • a "route" includes a pathway along which a vehicle is navigated.
  • a route can extend from a starting location to another separate destination location, extend back to a destination location which is the same as the starting location, etc.
  • a route can extend along one or more various portions of one or more various roadways.
  • a route between a home location and a work location can extend from a home driveway, through one or more residential streets, along one or more portions of one or more avenues, highways, toll ways, etc., and to one or more parking spaces in one or more parking areas.
  • Such routes can be routes which a user repeatedly navigates over time, including multiple times in a given day (e.g., routes between home and work locations may be travelled at least once in a given day).
  • an ANS in a vehicle enables autonomous navigation of one or more portions of a route, based at least in part upon virtual characterizations of the one or more portions.
  • Enabling autonomous navigation can include making autonomous navigation of one or more portions of a route available for selection by a user of the vehicle, such that the ANS can engage in autonomous navigation of the one or more portions based on receiving a user-initiated command to engage in the autonomous navigation of the one or more portions.
  • a virtual characterization of a route portion can include a virtual characterization of a portion of a roadway which is included in a route between one or more locations. Such a characterization can be referred to as a "virtual roadway portion characterization”.
  • a given virtual roadway portion characterization can be independent of any overall routes which can be navigated by a vehicle, so that a route navigated by a vehicle includes a set of roadway portions navigated in series, and one or more virtual roadway portion characterizations can be used by the ANS to autonomously navigate a vehicle along one or more various routes.
  • a virtual route characterization can include a set of one or more virtual roadway portion characterizations associated with a set of one or more roadway portions for which autonomous navigation is enabled and one or more roadway portions for which autonomous navigation is not enabled; the ANS can engage in autonomous navigation of the portions for which autonomous navigation is enabled, interacting with a vehicle user via a user interface of the vehicle, such that control of various control elements of the vehicle is transferred between the user and the ANS based upon which roadway portions along the route the vehicle is navigated.
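The relationship between route characterizations, shared roadway portion characterizations, and the resulting control handoffs could be modeled roughly as below. All class and field names are hypothetical; the patent describes the concepts, not this representation.

```python
from dataclasses import dataclass


@dataclass
class PortionCharacterization:
    """Hypothetical virtual roadway portion characterization."""
    portion_id: str
    autonomy_enabled: bool


@dataclass
class RouteCharacterizationRef:
    """A route characterization holds ordered metadata references into a
    shared store of portion characterizations, so one portion can serve
    many routes."""
    portion_ids: list


def control_handoffs(route: RouteCharacterizationRef, store: dict) -> list:
    """Return (portion_id, controller) pairs: the ANS controls enabled
    portions, and control transfers back to the user elsewhere."""
    return [(pid, "ANS" if store[pid].autonomy_enabled else "user")
            for pid in route.portion_ids]
```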
  • the ANS develops virtual roadway portion characterizations, virtual route characterizations, etc. via monitoring the navigation of a vehicle in which the ANS is included along one or more routes.
  • monitoring can include monitoring one or more of the external environment, vehicle control elements, etc. as the vehicle is manually navigated by a user along the one or more routes.
  • a user can include a driver of the vehicle, a passenger of the vehicle, some combination thereof, etc.
  • the ANS can monitor various aspects of the manual navigation, including monitoring various static features (e.g., road signs, curbs, lane markers, stoplights, trees, landmarks, physical location of the vehicle, etc.) encountered in various roadway portions through which the vehicle is manually navigated, dynamic features (other vehicles navigating along the roadways, emergency vehicles, accidents, weather conditions, etc.) encountered in various roadway portions, driving characteristics of the user with regard to manual navigation of the vehicle (e.g., driving speed at various portions of the route, lane changing speed and operations, acceleration events and rates, deceleration events and rates, position on the roadway relative to static features, spacing from other vehicles, etc.) through various roadway portions, driving characteristics of mobile entities navigating through various roadway portions in proximity to the manually-navigated vehicle (e.g., in different lanes, ahead or behind the vehicle, etc.), some combination thereof, and the like.
  • a “mobile entity” can include a motorized vehicle, including an automobile, truck, etc.; a manually-powered vehicle, including a bicycle, pedi-cab, etc.; a pedestrian, including a human, animal, etc.; some combination thereof, etc.
  • a driving characteristic of a mobile entity, including a vehicle, pedestrian, etc. can include data characterizing how the mobile entity navigates at least a portion of one or more roadway portions. For example, a driving characteristic of a pedestrian can indicate that the pedestrian travels along a roadway, within a certain distance of a certain edge of the roadway, at a certain speed.
  • the system can process input data, generated at various vehicle sensors based on the monitoring, to develop a virtual characterization of the route, which can include characterizations of the static features associated with various portions of the route (referred to herein as “static feature characterizations”), characterizations of driving rules associated with various portions of the route (referred to herein as “driving rule characterizations”), etc.
  • the ANS updates one or more virtual roadway portion characterizations of one or more roadway portions included in a route based upon monitoring successive manual navigations of the route.
  • As the ANS implements successive updates of a virtual characterization of one or more roadway portions, based on multiple, successive navigations of the route, it can develop and update a confidence indicator associated with one or more roadway portion characterizations associated with one or more roadway portions included in the route. For example, where the number of new static features in a roadway portion in a routinely-navigated route, identified based on processing input data from various vehicle sensors, decreases with monitoring successive manual navigations over the route, the confidence indicator associated with that virtual roadway portion characterization can increase with the successive monitoring of navigation through the roadway portion.
  • the ANS can enable an autonomous navigation feature of the vehicle for the one or more roadway portions, so that autonomous navigation of the vehicle along one or more portions of a route which include the one or more roadway portions is enabled.
  • the threshold level can be predetermined.
  • the ANS can adjustably establish a confidence indicator for one or more particular roadway portions, based at least upon monitoring of navigation along the one or more particular roadway portions, signals received from one or more remote services, systems, etc., some combination thereof, or the like.
  • an indicator can include one or more of a particular value, rank, level, some combination thereof, etc.
  • a confidence indicator can include one or more of a confidence value, confidence rank, confidence level, some combination thereof, etc.
  • the indicator can include one or more of a range of indicators.
  • Where the confidence indicator includes a confidence rank, the confidence indicator can include a particular rank of a range of ranks, where the particular rank indicates the relative confidence associated with the indicator.
  • the confidence indicator can include a particular value of a range of values, where the particular value within the range indicates the relative confidence associated with the indicator, relative to one or more confidence extremes represented by the range extremes.
  • the threshold confidence indication can include one or more indicators, values, ranks, levels, etc., and determining that a confidence indicator at least meets a threshold confidence indication can include determining that a value, rank, level, etc. included in the confidence indicator at least matches a value, rank, level, etc. included in the threshold confidence indication. In some embodiments, determining that a confidence indicator at least meets a threshold confidence indication can include determining that a value, rank, level, etc. included in the confidence indicator exceeds a value, rank, level, etc. included in the threshold confidence indication.
  • the threshold confidence indication can, in some embodiments, be referred to as one or more of a threshold confidence indicator, threshold value, threshold rank, threshold level, some combination thereof, etc.
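One minimal way to realize a ranked confidence indicator and the "at least meets" test is sketched below; the three ranks and their ordering are assumptions chosen only to make the comparison concrete.

```python
from enum import IntEnum


class ConfidenceRank(IntEnum):
    """Hypothetical range of ranks; higher means more confidence."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3


def at_least_meets(indicator: ConfidenceRank, threshold: ConfidenceRank) -> bool:
    # "At least meets" covers both matching and exceeding the threshold.
    return indicator >= threshold
```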
  • a virtual route characterization can include a set of virtual roadway portion characterizations of the various roadway portions included in the route.
  • the virtual route characterization can include metadata referencing the various virtual roadway portion characterizations and can characterize driving rules associated with navigating between various roadway portions.
  • autonomous navigation of one or more portions of a route is enabled based at least in part upon a determination that sufficiently large portions of the route, including a set of one or more roadway portions, have associated virtual characterizations for which associated confidence indicators at least meet one or more thresholds.
  • Such a set of roadway portions can include a limited selection of the roadway portions included in the route.
  • For example, where a route includes multiple roadway portions which are 100 feet in length, and a virtual roadway portion characterization associated with a single roadway portion has a confidence indicator which meets the threshold confidence indication while the remainder of the roadway portion characterizations do not, autonomous navigation of the single roadway portion may remain disabled.
  • Where the virtual roadway portion characterizations associated with multiple contiguous roadway portions each have a confidence indicator which meets the threshold and the contiguous length of the roadway portions at least meets a threshold length, autonomous navigation of the portion of the route which includes the multiple contiguous roadway portions can be enabled.
  • a “threshold confidence indication” is referred to interchangeably herein as a "threshold”.
  • the threshold can be based at least in part upon one or more of the distance of the contiguous roadway portions, driving velocity through the one or more roadway portions, estimated elapsed time of navigation through the one or more roadway portions, some combination thereof, etc.
  • the threshold can vary based on various roadway portions included in a route portion for which a determination of whether to enable autonomous navigation is made.
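The contiguity logic described above might look like the following sketch, where the per-portion fields and both thresholds are assumed for illustration.

```python
def autonomy_spans(portions: list, confidence_threshold: float,
                   min_length_ft: float) -> list:
    """Return (start_index, end_index) spans of contiguous roadway portions
    whose confidence indicators all meet the threshold and whose combined
    length meets a minimum, i.e., the route portions for which autonomous
    navigation could be enabled."""
    spans, start = [], None
    sentinel = {"confidence": 0.0, "length_ft": 0}  # flushes a trailing run
    for i, portion in enumerate(portions + [sentinel]):
        if portion["confidence"] >= confidence_threshold:
            if start is None:
                start = i
        elif start is not None:
            if sum(p["length_ft"] for p in portions[start:i]) >= min_length_ft:
                spans.append((start, i - 1))
            start = None
    return spans
```

With 100-foot portions as in the example above, a lone qualifying portion stays disabled while a qualifying contiguous run long enough to meet the length threshold is enabled.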
  • enabling autonomous navigation of one or more portions of a route enables user-initiated engaging of autonomous navigation through one or more particular roadway portions specified based on user interaction with one or more user interfaces.
  • the ANS, in response to autonomous navigation of a roadway portion being enabled, can present to a user, via a user interface included in a vehicle, an option to engage autonomous navigation of the vehicle over one or more route portions which include the one or more roadway portions for which autonomous navigation is enabled.
  • the ANS can receive a user-initiated command to autonomously navigate the vehicle along one or more portions of a route and, in response, engage in the autonomous navigation via controlling one or more control elements of the vehicle.
  • the ANS can update a virtual characterization of a route in response to detecting, via monitoring of the external environment, a change in the static features of a route. For example, where a portion of a roadway in a route routinely travelled by a vehicle undergoes road construction which results in an alteration to the roadway, lane closure, etc., the ANS included in the vehicle can update a characterization of the route in response to monitoring the roadway portion as the vehicle travels through the portion.
  • the ANS can adapt to changes in a route independently of preexisting route characterizations, "maps", etc., including independently of data received from a remote service, system, etc., thereby reducing the amount of time required to enable autonomous navigation of the changed route.
  • Because route characterizations are developed by an ANS of a vehicle based on routes which are successively (i.e., repeatedly) navigated by a user of the vehicle, the routes for which autonomous navigation can be enabled include routes which the user of the vehicle tends to navigate, including routinely-navigated routes.
  • the ANS can autonomously navigate routes routinely navigated by a vehicle user without requiring preexisting route characterizations.
  • the ANS can update virtual characterizations of one or more roadway portions, routes, etc.
  • the ANS can update the characterizations as soon as the changes are encountered by the vehicle, thereby providing updates to virtual characterizations of routes which are navigated by a user and, in some embodiments, without relying upon distributed update information from remote systems, services, etc.
  • an ANS can continue to update virtual characterizations of a roadway portion based on monitoring an autonomous navigation of a vehicle through the roadway portion.
  • the ANS uploads virtual characterizations of one or more routes to one or more remote systems, services, etc. implemented on one or more computer systems which are external to the vehicle in which the ANS is located. Such uploading can be performed in response to a determination that developing a virtual characterization with sufficient confidence to enable autonomous navigation of the route requires processing resources not available locally to the vehicle, in response to a determination that the ANS is unable to build the confidence indicator associated with a characterization at more than a certain rate with successive monitoring of route navigation, etc.
  • an ANS included in the vehicle can upload the characterization of a route, one or more sets of input data associated with the route, etc. to a remote service, and the remote service can process the data, evaluate the characterization, etc. to develop a virtual characterization of the route.
  • the ANS can upload the characterization, input data associated with the route, etc. to a remote service, system, etc. and the remote service, system, etc. can further evaluate the characterization to augment the confidence indicator of the characterization.
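The offloading decision described above could be approximated by a rate check on the confidence indicator's recent growth; the history format and the minimum-gain parameter are assumptions, since the patent leaves the "certain rate" unspecified.

```python
def should_upload(confidence_history: list, min_gain_per_navigation: float = 0.05) -> bool:
    """Decide whether to upload the characterization (and associated input
    data) to a remote service: if the confidence indicator grew by less
    than a minimum amount over the most recent manual navigation, local
    processing alone may not reach the threshold in reasonable time."""
    if len(confidence_history) < 2:
        return False  # not enough successive navigations to judge the rate
    recent_gain = confidence_history[-1] - confidence_history[-2]
    return recent_gain < min_gain_per_navigation
```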
  • the remote system can flag the characterization for manual evaluation of the characterization and can modify the characterization in response to manual inputs from one or more operators.
  • a manual input can include a manually-specified confidence indicator of the characterization.
  • the remote system can dispatch a dedicated sensor suite, which can be included in a dedicated sensor-bearing vehicle, to collect additional input data associated with one or more selected portions of the route in question, where the remote system can utilize the additional input data to modify the characterization.
  • the remote system can flag the route and provide, to the ANS, proposed alternative routes for the vehicle to navigate.
  • Alternative route proposals can include characterizations of one or more of the alternative routes which can have confidence indicators sufficient for the ANS of the vehicle to enable autonomous navigation of said alternative routes and to propose to a user of the vehicle, via an interface, engaging autonomous navigation of an alternative route rather than travelling the first route.
  • the ANS invites the user of the vehicle, via a user interface, to manually navigate one or more alternative routes, so that the ANS can develop virtual characterizations of the one or more alternative routes as part of enabling autonomous navigation of the one or more alternative routes.
  • characterizing a route at an ANS of a vehicle includes monitoring driving characteristics of one or more users of the vehicle in manually navigating the vehicle along one or more portions of the route.
  • Such characterization can include monitoring driving characteristics of one or more various mobile entities, including one or more motorized vehicles, manually-powered vehicles, pedestrians, some combination thereof, etc. travelling one or more portions of the route in proximity to the vehicle.
  • Driving characteristics can include positioning of one or more mobile entities relative to one or more static route features along the route, acceleration events relative to static route features, acceleration rates, driving velocities relative to static features, dynamic features, etc. in one or more portions of a route, etc.
  • the ANS can process the monitored driving characteristics to develop a set of driving rules associated with one or more portions of the route, where the set of driving rules determines the driving characteristics according to which the ANS autonomously navigates the vehicle along the one or more portions of the route.
  • for example, based on monitoring driving characteristics of a user of the local vehicle, driving characteristics of various other vehicles, etc., an ANS of the vehicle can develop a set of driving rules associated with the route which can include rules specifying driving velocity ranges along various portions of the route, locations of lanes along the route, permissible spacing distances between the vehicle and other vehicles along the route, locations along the route where particular acceleration rate ranges are permitted, locations where an amount of acceleration is to be applied (e.g., a roadway incline), likelihood of certain dynamic event occurrences (accidents, abrupt acceleration events, roadway obstructions, pedestrians, etc.), some combination thereof, etc.
  • driving rule characterizations can be included in a virtual roadway portion characterization, virtual route characterization, some combination thereof, etc.
  • driving rules for a route can be developed "experientially", i.e., based on monitoring how one or more users actually navigate one or more vehicles along the route.
  • Such locally-developed driving rules can provide an autonomous driving experience which is tailored to the particular conditions of the route being autonomously navigated, rather than using general driving rules developed independently of direct monitoring of how the route is actually navigated by vehicle users.
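One minimal way to derive such an experiential velocity rule might look like the following sketch; the trimming of outlier observations, the units, and all names are assumptions for illustration only.

```python
def velocity_rule(observed_mps, trim=0.1):
    """Derive a (low, high) driving velocity range for a roadway portion
    from velocities (m/s) observed during manual navigation of the local
    vehicle and proximate vehicles, trimming a fraction `trim` of
    outlier observations from each tail."""
    v = sorted(observed_mps)
    k = int(len(v) * trim)
    trimmed = v[k:len(v) - k] if k else v
    return trimmed[0], trimmed[-1]
```

For example, nine observations between 20 and 28 m/s plus one 50 m/s outlier would yield the range (21, 28), discarding both the outlier and the slowest observation.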
  • driving characteristics can be processed to develop characterizations of static features included in one or more portions of a route.
  • monitoring of driving characteristics, of the local vehicle and one or more various external vehicles, when the vehicles are navigating over a roadway which lacks at least some conventional static features can be processed to develop one or more static feature characterizations, including a characterization of the edges of the roadway, a characterization of the boundaries of the unmarked lanes of the roadway, etc.
  • Driving rule characterizations can be subject to predetermined driving constraints, including driving velocity limits. For example, based on processing input data generated from monitoring of external environment elements, the ANS can identify road signs, along various portions of a route, which specify a speed limit for the roadway over which that portion of the route extends. The ANS can analyze the input data associated with the monitoring of the road sign to identify the indicated speed limit and incorporate the identified speed limit as a driving velocity limit in the driving rules associated with that portion of the route, so that the ANS, when using the driving rule characterizations to autonomously navigate along at least that portion of the route, will at least attempt to not exceed the speed limit associated with that portion of the route.
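A hedged sketch of constraining an experiential velocity rule by a posted speed limit, as described above; the function name and the use of miles per hour are illustrative assumptions.

```python
def apply_speed_limit(rule_lo, rule_hi, posted_limit):
    """Clamp an experientially derived velocity range so autonomous
    navigation does not plan velocities above a recognized posted
    speed limit for that portion of the route."""
    return min(rule_lo, posted_limit), min(rule_hi, posted_limit)
```

For a portion with a 55 mph sign, an observed range of 48 to 62 mph would be clamped to 48 to 55 mph; a range already below the limit is unchanged.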
  • multiple virtual roadway portion characterizations included in multiple navigated routes can be developed at a vehicle, and such multiple characterizations can be incorporated into a set of roadway portion characterizations, of various portions of multiple different roadways navigated via navigation of multiple various routes, where the various characterizations of the multiple roadway portions can be used by the ANS to enable autonomous navigation along various portions of various routes, including portions of multiple separate routes.
  • virtual characterizations of one or more roadway portions can be uploaded from one or more ANSs, included in one or more vehicles, to a remote system, service, etc.
  • a system can include a navigation monitoring system, where multiple ANSs of multiple separate vehicles are communicatively coupled to one or more navigation monitoring systems in a navigation network.
  • the various characterizations of various roadway portions can be incorporated, at the remote system, service, etc. into a "map" of roadway portion characterizations.
  • the characterization map can be distributed to the various ANSs of the various vehicles.
  • incorporating the characterizations into a map can include developing a composite characterization of the roadway portion based on processing the multiple characterizations of the one or more portions.
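One plausible composite-characterization scheme is a confidence-weighted merge of the characterizations received from multiple vehicles; representing a characterization attribute as a single number (e.g., a lane-boundary offset in meters) is an assumption made here for illustration.

```python
def composite(characterizations):
    """Merge several vehicles' characterizations of one roadway portion.
    `characterizations` is a list of (value, confidence) pairs for a
    numeric attribute; returns the confidence-weighted mean and the
    accumulated confidence mass of the composite characterization."""
    total = sum(conf for _, conf in characterizations)
    merged = sum(value * conf for value, conf in characterizations) / total
    return merged, total
```

A higher-confidence characterization thus pulls the composite toward its value, and the accumulated mass can serve as the composite's own confidence indicator.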
  • ANSs of various vehicles can characterize various routes travelled by those respective vehicles, and the various route characterizations developed locally at the various vehicles can be incorporated into a characterization map of route characterizations which can be distributed to other vehicles and utilized by ANSs of the other vehicles to enable autonomous navigation of the other vehicles over the various routes.
  • the ANS included in a vehicle can develop the virtual characterization at least partially locally to the vehicle, based on local monitoring of the environment proximate to the vehicle, thereby precluding a need for a preexisting detailed "map" of roadway portions included in a roadway network, where a map can include a set of virtual characterizations of the roadway portions organized and arranged in accordance with their relative physical, geographic locations, such that the map includes a virtual characterization of the roadway network and the various routes which can be navigated therein. Where a "map" is absent, the ANS can "bootstrap" a map into existence via developing virtual characterizations of one or more portions of one or more routes which the vehicle is navigated.
  • a locally-developed map of characterized routes can include routes which a user of the vehicle tends to navigate, at the expense of routes which the user does not navigate, thereby enabling autonomous navigation of routes which a user routinely travels.
  • an ANS is communicatively coupled to one or more other ANSs and can communicate virtual route characterizations with the one or more other autonomous navigation systems.
  • the ANS can be implemented by one or more computer systems external to one or more vehicles and can modify virtual route characterizations received from one or more other ANSs. Such modification can include incorporating multiple characterizations into one or more composite characterizations, modifying characterizations received from one or more sets of remote ANSs based on input data received from one or more other sets of remote ANSs, some combination thereof, etc.
  • first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another.
  • a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the intended scope.
  • the first contact and the second contact are both contacts, but they are not the same contact.
  • the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context.
  • the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
  • FIG. 1 illustrates a schematic block diagram of a vehicle 100 which comprises an autonomous navigation system (ANS) which is configured to control various control elements of the vehicle to autonomously navigate the vehicle along one or more driving routes, based at least in part upon one or more virtual characterizations of one or more portions of the one or more driving routes, according to some embodiments.
  • Vehicle 100 will be understood to encompass one or more vehicles of one or more various configurations which can accommodate one or more occupants, including, without limitation, one or more automobiles, trucks, vans, etc.
  • Vehicle 100 can include one or more interior cabins configured to accommodate one or more human occupants (e.g., passengers, drivers, etc.), which are collectively referred to herein as vehicle "users".
  • An interior cabin can include one or more user interfaces, including vehicle control interfaces (e.g., steering wheel, throttle control device, brake control device), display interfaces, multimedia interfaces, climate control interfaces, some combination thereof, or the like.
  • Vehicle 100 includes various control elements 120 which can be controlled to navigate ("drive") the vehicle 100 through the world, including navigating the vehicle 100 along one or more routes.
  • one or more control elements 120 are communicatively coupled to one or more user interfaces included in an interior cabin of the vehicle 100, such that the vehicle 100 is configured to enable a user to interact with one or more user interfaces to control at least some of the control elements 120 and manually navigate the vehicle 100.
  • vehicle 100 can include, in the interior cabin, a steering device, throttle device, and brake device which can be interacted with by a user to control various control elements 120 to manually navigate the vehicle 100.
  • Vehicle 100 includes an autonomous navigation system (ANS) 110 which is configured to autonomously navigate vehicle 100.
  • ANS 110 may be implemented by any combination of hardware and/or software configured to perform the various features, modules or other components discussed below.
  • one or multiple general processors, graphical processing units, or dedicated hardware components, such as various kinds of application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other dedicated circuitry may implement all (or portions, in conjunction with program instructions stored in a memory and executed by the processors) of route characterization module 112 and driving control module 114.
  • One or multiple computing systems, such as computer system 1300 in FIG. 13 below, may also implement ANS 110.
  • ANS 110 is communicatively coupled to at least some of the control elements 120 of the vehicle and is configured to control one or more of the elements 120 to autonomously navigate the vehicle 100.
  • autonomous navigation of the vehicle 100 refers to controlled navigation ("driving") of vehicle 100 along at least a portion of a route based upon active control of the control elements 120 of the vehicle 100, including steering control elements, throttle control elements, braking control elements, transmission control elements, etc. independently of control element input commands from a user of the vehicle.
  • Autonomous navigation can include ANS active control of driving control elements 120 while enabling manual override of control of elements 120 via manual input from a user via user interaction with one or more user interfaces included in the vehicle.
  • ANS 110 can autonomously navigate vehicle 100 in the absence of input commands from a vehicle user via one or more user interfaces of the vehicle 100, and ANS 110 can cease control of one or more elements 120 in response to a user-initiated input command to the one or more elements 120 from one or more user interfaces of the vehicle 100.
  • ANS 110 includes a route characterization module 112 which develops and maintains virtual characterizations of various roadway portions, driving routes, etc. and a driving control module 114 which is configured to control one or more control elements 120 of the vehicle 100 to autonomously navigate the vehicle 100 along one or more portions of one or more driving routes based on the virtual characterizations associated with the one or more portions of the route.
  • Vehicle 100 includes a set of one or more external sensor devices 116, also referred to as external sensors 116, which can monitor one or more aspects of an external environment relative to the vehicle 100.
  • sensors can include camera devices, video recording devices, infrared sensor devices, radar devices, light-scanning devices including LIDAR devices, precipitation sensor devices, ambient wind sensor devices, ambient temperature sensor devices, position-monitoring devices which can include one or more global navigation satellite system devices (e.g., GPS, BeiDou, DORIS, Galileo, GLONASS, etc.), some combination thereof, or the like.
  • One or more of external sensor devices 116 can generate sensor data associated with an environment as the vehicle 100 navigates through the environment.
  • Sensor data generated by one or more sensor devices 116 can be communicated to ANS 110 as input data, where the input data can be used by the route characterization module 112 to develop, update, maintain, etc. a virtual characterization of one or more portions of the routes through which the vehicle 100 is being navigated.
  • External sensor devices 116 can generate sensor data when the vehicle 100 is being manually navigated, autonomously navigated, etc.
  • Vehicle 100 includes a set of one or more internal sensors 118, also referred to as internal sensor devices 118, which can monitor one or more aspects of vehicle 100.
  • sensors can include camera devices configured to collect image data of one or more users in the interior cabin of the vehicle, control element sensors which monitor operating states of various control elements 120 of the vehicle, accelerometers, velocity sensors, component sensors which monitor states of various automotive components (e.g., sensors which monitor wheel-turning dynamics of one or more wheels of the vehicle), etc.
  • One or more of internal sensor devices 118 can generate sensor data associated with the vehicle 100 as the vehicle 100 navigates through the environment.
  • Sensor data generated by one or more internal sensor devices 118 can be communicated to ANS 110 as input data, where the input data can be used by the route characterization module to develop, update, maintain, etc. a virtual characterization of one or more portions of the routes through which the vehicle 100 is being navigated.
  • Internal sensor devices 1 18 can generate sensor data when the vehicle 100 is being manually navigated, autonomously navigated, etc.
  • Vehicle 100 includes one or more sets of interfaces 130.
  • One or more interfaces 130 can include one or more user interface devices, also referred to as user interfaces, with which a user of vehicle 100 can interact to engage with one or more portions of ANS 110, control elements 120, etc.
  • an interface 130 can include a display interface with which a user can interact to command ANS 110 to engage autonomous navigation of vehicle 100 along one or more particular routes, based at least in part upon one or more virtual characterizations of one or more portions of the route.
  • one or more interfaces 130 include one or more communication interfaces which can communicatively couple ANS 110 with one or more remote services, systems, etc. via one or more communication networks.
  • an interface 130 can include a wireless communication transceiver which can communicatively couple ANS 110 with one or more remote services via one or more wireless communication networks, including a cloud service.
  • ANS 110 can communicate virtual route characterizations, various sets of input data, etc. to a remote service, system, etc. via one or more interfaces 130, receive virtual characterizations of one or more roadway portions, etc. from the one or more remote services, systems, etc., and the like.
  • an ANS can develop one or more virtual characterizations of one or more roadway portions, which the ANS can subsequently utilize to autonomously navigate a vehicle through the one or more roadway portions, based on monitoring various static features, dynamic features, driving characteristics, etc. while the vehicle is navigated through the one or more roadway portions.
  • Such monitoring can be implemented when the vehicle is manually navigated through the one or more roadway portions by a vehicle user, such that the ANS can develop a characterization of the static features of a route by monitoring the static features when the vehicle is manually navigated along the route and can develop a set of driving rules specifying how ANS is to navigate a vehicle through the one or more roadway portions based on monitoring driving characteristics of the user's manual navigation of the vehicle through the roadway portion, monitoring driving characteristics of other vehicles navigating through the roadway portion in proximity to the local vehicle, etc.
  • a vehicle ANS can both develop a characterization of the physical state of the route (e.g.
  • the ANS can develop characterizations used to engage in autonomous navigation of a route, independently of externally-received or preexisting characterization data.
  • FIG. 2 illustrates a vehicle 202, which includes an ANS 201 and a set of sensor devices 203, navigating through a region 200 which includes multiple roadway portions 210A-D of roadways 208, 218, according to some embodiments.
  • Vehicle 202 can be manually navigated through the route, and sensor devices 203 can include one or more external sensor devices, vehicle sensor devices, etc.
  • Vehicle 202 and ANS 201 can be included in any of the embodiments of a vehicle, ANS, etc.
  • a region 200 which includes one or more various roadways 208, 218 can be divided into various roadway "portions" 210.
  • An ANS can demarcate various roadway portions based on position data received from one or more position sensors in the vehicle 202, one or more various static features included in region 200, etc.
  • Different roadway portions 210 can have different sizes, which can be based at least in part upon driving velocity of vehicles navigating the roadway in which the roadway portions are included, environmental conditions, etc.
  • roadway 208 can be a highway where the average driving velocity is higher than that of roadway 218 which can be an onramp; as a result, each of roadway portions 210A-C of roadway 208 can be larger than roadway portion 210D of roadway 218.
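The velocity-dependent portion sizing described above could be sketched as a simple heuristic; the 10-second travel horizon and 50-meter floor are assumed parameters, not values from the disclosure.

```python
def portion_length_m(avg_velocity_mps, horizon_s=10.0, floor_m=50.0):
    """Demarcate a roadway portion sized to roughly `horizon_s` seconds
    of travel at the typical velocity observed on that roadway, with a
    minimum length floor for slow or stop-and-go roadways."""
    return max(avg_velocity_mps * horizon_s, floor_m)
```

Under this sketch a 30 m/s highway such as roadway 208 yields 300 m portions while a 10 m/s onramp such as roadway 218 yields 100 m portions, matching the relative sizing of portions 210A-C versus 210D.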
  • an ANS included in the vehicle monitors various static feature characteristics of various roadway portions of the various roadways, monitors various driving characteristics of the vehicle user, other proximate vehicles, etc., as the vehicle navigates through the various roadway portions, some combination thereof, and the like.
  • Such monitoring, which can be implemented by the ANS based on input data received from sensors 203, can include processing the various characteristics to develop virtual characterizations of the one or more roadway portions through which the vehicle is navigated. The ANS can subsequently utilize the virtual characterizations to engage in autonomous navigation of the vehicle through the one or more roadway portions.
  • monitoring of various static feature characteristics of a roadway portion includes identifying various static features associated with the roadway portion.
  • sensor devices 203 can monitor various aspects of the external environment of region 200 to identify various static features associated with the roadway portion 210B, including the edges 212A-B, lane boundaries 217A-B, lanes 214A-C of roadway 208.
  • one or more sensor devices 203 can identify the material composition of one or more portions of roadway 208.
  • a sensor device 203 of vehicle 202 can include an internal sensor device which can monitor dynamics of the turning of the wheels of the vehicle 202 to determine whether the vehicle is presently navigating over an asphalt surface, gravel surface, concrete surface, dirt surface, etc.
  • sensor devices 203 can monitor various aspects of the external environment of region 200 to identify various static features associated with the roadway portion 210B of roadway which are external to the roadway 208 itself, including static landmarks 213, natural environmental elements 215, road inclines 242, road signs 221, 223, etc.
  • identifying a static feature includes identifying information associated with the static feature, including identifying information presented on a road sign.
  • region 200 includes road signs 221, 223, where road sign 221 indicates the presence of onramp 218 and road sign 223 is a speed limit sign which indicates a speed limit for at least roadway portion 210B.
  • Monitoring static features associated with roadway portion 210B as vehicle 202 navigates through the portion 210B includes the ANS, based on monitoring the external environment in region 200, determining the physical location of road signs 221, 223 in the roadway portion 210B, identifying the information presented on the road signs 221, 223, and including such information as part of the virtual characterization of the roadway portion.
  • ANS 201 can, based on monitoring of region 200 by sensors 203 as vehicle 202 navigates through roadway portion 210B, identify the physical position of road sign 223 in portion 210B, identify that road sign 223 is a speed limit sign, identify the speed limit indicated by the road sign as 55 miles/hour, and incorporate such information into a driving rule characterization associated with at least the roadway portion 210B as a maximum driving velocity when navigating through at least portion 210B.
  • sensor devices 203 can monitor driving characteristics of vehicle 202, other vehicles 232-236 navigating through roadway portions 210 proximate to vehicle 202, etc. Such driving characteristics can be utilized by ANS 201 to develop one or more portions of the virtual characterization of one or more roadway portions, including one or more static feature characterizations, driving rules, etc. For example, based on monitoring driving velocity of one or more of vehicle 202, vehicles 232-236, etc. navigating through roadway portions 210A-C, ANS 201 can determine a driving velocity range for autonomously navigating through the one or more roadway portions 210A-C.
  • ANS 201 can determine, based on monitoring driving characteristics of the one or more vehicles 202, 232-236, a permissible range of acceleration rates associated with navigating through particular portions 210A-C, locations in the roadway portions where acceleration events are likely, a location of lanes 214A-C in the roadway, a permissible range of spacing distances 252, 254, between vehicle 202 and other vehicles navigating one or more roadway portions 210A-C in a common lane 214 as vehicle 202, a permissible range of spacing distances 256A-B between vehicle 202 and one or more boundaries of the lane 214B in which the vehicle 202 is navigating, etc.
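A minimal sketch of deriving one such rule, a permissible minimum following distance, from monitored spacings; the safety margin is an assumed parameter added here for illustration.

```python
def min_spacing_rule(observed_spacings_m, margin=1.1):
    """Turn spacings (m) observed between the vehicle and vehicles
    ahead of it during navigation of a roadway portion into a minimum
    permissible following distance, inflated by a safety margin so the
    rule is conservative relative to the observed behavior."""
    return min(observed_spacings_m) * margin
```

The tightest observed spacing, rather than the average, anchors the rule so that autonomous navigation never plans a closer following distance than the vehicle's users were themselves observed to tolerate.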
  • driving characteristics monitored while a vehicle navigates through a roadway portion are associated with one or more other roadway portions.
  • where ANS 201 monitors a spacing 252 between vehicle 202 and another following vehicle 234 when vehicle 202 navigates through portion 210B, ANS 201 can develop a driving rule which specifies the spacing 252 as a minimum permissible spacing between vehicle 202 and a vehicle 236 ahead of vehicle 202 when vehicle 202 is navigating through roadway portions 210A and 210C.
  • Such associating can be based at least in part upon similarities between roadway portions.
  • driving characteristics determined based on input data generated while vehicle 202 is navigating through one or more of roadway portions 210A-C can be used to develop driving rule characterizations included in virtual characterizations of any of similar roadway portions 210A-C while such driving characteristics are not used to develop driving rules included in the virtual characterization of dissimilar roadway portion 210D.
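The similarity-based association of driving rules across roadway portions might be sketched as coarse attribute matching; the attribute names (road class, lane count, surface) are assumptions chosen for illustration, not criteria stated in the disclosure.

```python
def similar(a, b):
    """Coarse portion similarity: equal road class, lane count, surface."""
    return all(a[k] == b[k] for k in ("road_class", "lanes", "surface"))

def applicable_rules(rules_by_portion, target):
    """Collect driving rules learned on portions similar to `target`,
    so that experience gained on one highway portion can inform
    navigation of similar highway portions while a dissimilar onramp
    portion is excluded."""
    rules = []
    for portion, portion_rules in rules_by_portion:
        if similar(portion, target):
            rules.extend(portion_rules)
    return rules
```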
  • FIG. 3 illustrates a vehicle 302, which includes an ANS 301 and a set of sensor devices 303, navigating through a region 300 which includes multiple roadway portions 310A-C of roadway 308, according to some embodiments.
  • Vehicle 302 can be manually navigated through the route, and sensor devices 303 can include one or more external sensor devices, vehicle sensor devices, etc.
  • Vehicle 302 and ANS 301 can be included in any of the embodiments of a vehicle, ANS, etc.
  • an ANS can utilize monitored driving characteristics while a vehicle navigates through a roadway portion to determine static feature characteristics of the roadway portion, which are included in the virtual characterization of the roadway portion.
  • vehicle 302 is navigating along an unpaved roadway 308 which lacks well-defined edges and lane boundaries.
  • ANS 301 can determine edges 312A-B of the roadway 308, lanes 314A-B, and a lane boundary 317 based at least in part upon monitoring the driving characteristics of vehicle 302 when a user of vehicle 302 navigates the vehicle 302 through one or more roadway portions 310A-C, monitoring the driving characteristics of one or more other vehicles 332 when the other vehicle 332 navigates through one or more roadway portions 310A-C, some combination thereof, or the like.
  • ANS 301 can determine, based on monitoring the driving characteristics of both vehicle 302 and vehicle 332, the edges 312A-B, lanes 314A-B, and boundary 317. In addition, ANS 301 can determine that lane 314B is associated with driving in an opposite direction, relative to driving in lane 314A.
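The inference of static features from trajectories on an unmarked roadway might be sketched as follows. Representing each vehicle's path as signed lateral offsets (meters) from an arbitrary reference line, and the 3.5-meter default lane width, are assumptions made for illustration.

```python
def infer_lanes(own_offsets, opposing_offsets, lane_width=3.5):
    """Estimate lane centers, the lane boundary, and roadway edges for
    an unmarked roadway from signed lateral offsets of the local
    vehicle and of a vehicle travelling in the opposite direction."""
    center_a = sum(own_offsets) / len(own_offsets)
    center_b = sum(opposing_offsets) / len(opposing_offsets)
    boundary = (center_a + center_b) / 2.0  # midway between lane centers
    lo, hi = sorted((center_a, center_b))
    # Edges lie half a lane width beyond each lane center.
    edges = (lo - lane_width / 2.0, hi + lane_width / 2.0)
    return center_a, center_b, boundary, edges
```

The lane in which the opposing vehicle travels can then be characterized as associated with the opposite driving direction, as described for lane 314B.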
  • FIG. 4 illustrates a block diagram of an autonomous navigation system (ANS), according to some embodiments.
  • ANS 400 can be implemented by one or more computer systems and/or by any combination of hardware and/or software configured to perform the various features, modules or other components discussed below, such as one or more multiple general processors, graphical processing units, or dedicated hardware components, and can be included in any of the embodiments of ANSs.
  • ANS 400 includes various modules, which can be implemented by one or more instances of hardware, software, etc.
  • ANS 400 comprises a route characterization module 401 which is configured to develop virtual characterizations of various roadway portions based on monitoring input data generated based on a vehicle in which the ANS 400 is included navigating through the various roadway portions.
  • ANS 400 includes an input data module 410 which is configured to receive input data from various data sources, which can include one or more sensor devices.
  • module 410 is configured to process at least some input data and determine various static feature characterizations, driving rule characterizations, etc. based on the one or more instances of input data.
  • Input data can be received from various data sources based on a vehicle navigating one or more roadway portions, where such navigation can be manual, autonomous, some combination thereof, etc.
  • Input data module 410 includes an external sensor module 412 which is configured to receive input data from one or more external sensors of a vehicle, where the input data can be generated by the one or more external sensors concurrently with the vehicle navigating through one or more roadway portions along one or more driving routes.
  • Module 412 can include a static feature module 414 which monitors one or more static features included in one or more roadway portions as the vehicle navigates through the one or more roadway portions. Such monitoring can include determining a geographic location of a static feature, identifying information presented by the static feature, categorizing the static feature, etc. For example, module 414 can identify a roadway edge, lane boundary, road sign, etc. based on monitoring image data generated by a camera device monitoring the external environment of the vehicle.
  • module 414 monitors the physical location (also referred to herein as the "geographic location", "geographic position", etc.) of the vehicle in which ANS 400 is included as the vehicle navigates through the one or more roadway portions.
  • Such monitoring can include determining a geographic location of the vehicle in which the ANS 400 is located based at least in part upon input data received from a global navigation satellite system device.
  • Such physical location data can be used to develop static feature characterizations of the roadway portion through which the vehicle in which the ANS 400 is located is navigating, including the physical location of the roadway portion, driving rule characterizations of the roadway portion, including the driving velocity through the roadway portion, some combination thereof, etc.
  • Module 412 can include a dynamic feature module 416 which monitors one or more dynamic features encountered as the vehicle navigates through one or more roadway portions, including other vehicles navigating through the roadway portion, vehicles stopped in the roadway portion, emergency vehicles, vehicle accidents, pedestrians, ambient environmental conditions, visibility, etc. Based on the dynamic features encountered in one or more roadway portions, module 416 can develop one or more driving rule characterizations, static feature characterizations, etc. associated with the roadway portions.
  • Module 412 can include a driving characteristics module 418 which can monitor driving characteristics of one or more external vehicles, relative to the vehicle in which ANS 400 is included, navigating in proximity to the vehicle when the vehicle is navigating through one or more roadway portions.
  • Such driving characteristics can include driving velocity, acceleration rate, spacing between roadway boundaries, lane boundaries, other vehicles, physical position, etc.
  • Based on the monitored driving characteristics of the external vehicles, module 418 can develop one or more driving rule characterizations, static feature characterizations, etc. associated with the roadway portions.
  • Input data module 410 includes an internal sensor module 422 which is configured to receive input data from one or more internal sensors of a vehicle, where the input data can be generated by the one or more internal sensors concurrently with the vehicle in which the ANS 400 is located navigating through one or more roadway portions along one or more driving routes.
  • Module 422 can include a control element module 426 which monitors one or more instances of input data associated with one or more control elements of the vehicle in which the ANS 400 is located as the vehicle navigates through one or more roadway portions.
  • Such input data can include throttle position data, steering element position, braking device state, wheel turning rate, commands to such elements from one or more user interfaces, some combination thereof, etc.
  • Based on such input data, module 426 can develop one or more driving rule characterizations, static feature characterizations, etc. of the roadway portions through which the vehicle in which the ANS 400 is located is navigating.
  • Module 422 can include a local driving characteristic module 428 which can monitor driving characteristics of the vehicle in which the ANS 400 is located as the vehicle is navigating through one or more roadway portions, including driving characteristics of a user of the vehicle as the user is manually navigating the vehicle through the one or more roadway portions. Such driving characteristics can include driving velocity, acceleration rate, spacing between roadway boundaries, lane boundaries, other vehicles, physical position, etc. Based on the monitored driving characteristics of the vehicle in which the ANS 400 is located in one or more roadway portions, module 428 can develop one or more driving rule characterizations, static feature characterizations, etc. associated with the one or more roadway portions.
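As a hedged illustration of how monitored driving characteristics might be distilled into a driving rule characterization, the following Python sketch derives a target driving velocity for a roadway portion from speeds observed during manual navigation. The function name, the median-plus-margin policy, and the dictionary layout are assumptions for illustration, not details taken from this description.

```python
from statistics import median

def derive_speed_rule(observed_speeds_mps, margin=0.1):
    """Derive an illustrative driving-velocity rule for a roadway portion
    from speeds observed during manual navigation (the kind of data
    module 428 is described as monitoring). Uses the median observed
    speed, padded by a margin, as the target autonomous velocity."""
    if not observed_speeds_mps:
        raise ValueError("no observations for this roadway portion")
    target = median(observed_speeds_mps)
    return {
        "target_speed_mps": target,
        "max_speed_mps": target * (1 + margin),
    }
```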
  • ANS 400 includes a processing module 430 which is configured to process input data received at module 410 to develop one or more virtual characterizations of one or more roadway portions.
  • Module 430 is configured to process at least some input data and determine a virtual characterization of a roadway portion which includes a characterization of static features included in the roadway portion, a characterization of driving rules for navigating through the roadway portion, some combination thereof, etc.
  • Module 430 can include a static feature characterization module 432 which is configured to develop a virtual characterization of static features of a particular roadway portion, based on one or more instances of input data associated with the roadway portion received at module 410.
  • Module 430 can include a driving rule characterization module 434 which is configured to develop a virtual characterization of driving rules associated with a particular roadway portion, based on one or more instances of input data associated with the roadway portion received at module 410.
  • Modules 432, 434 are configured to generate virtual characterizations associated with one or more roadway portions based on sensor data generated and received at module 410 when vehicle 401 navigates through the one or more roadway portions.
  • Module 430 is configured to generate one or more virtual roadway portion characterizations of one or more roadway portions, where said characterizations include the various driving rule characterizations and static feature characterizations associated with the roadway portion. In some embodiments, module 430 is configured to generate one or more virtual route characterizations of one or more driving routes, where a generated virtual route characterization includes at least a set of virtual roadway portion characterizations of the various roadway portions included in the route.
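One plausible in-memory shape for these characterizations, sketched in Python with illustrative field names (the comments refer to elements 432, 434, 444, 446, and 448 of this description; the class and field names themselves are assumptions):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RoadwayPortionCharacterization:
    """Virtual characterization of a single roadway portion (sketch)."""
    portion_id: str
    static_features: Dict[str, dict] = field(default_factory=dict)  # cf. static feature characterizations 446
    driving_rules: Dict[str, float] = field(default_factory=dict)   # cf. driving rule characterizations 444
    confidence: float = 0.0                                         # cf. confidence indicator 448

@dataclass
class RouteCharacterization:
    """Virtual route characterization: at least a set of roadway portion
    characterizations, plus start and destination locations (sketch)."""
    route_id: str
    start_location: str
    destination_location: str
    portions: List[RoadwayPortionCharacterization] = field(default_factory=list)
```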
  • Module 430 is configured to update a previously-developed virtual characterization of a roadway portion, based at least in part upon additional sets of input data received at module 410 when the vehicle in which the ANS 400 is located subsequently navigates through the roadway portion at least once.
  • One or more of modules 432, 434 can update one or more portions of a virtual characterization of a roadway portion based at least in part upon determining a difference between the existing characterization and one or more static features, driving characteristics, etc. observed during the subsequent navigation through the roadway portion.
  • Where module 432 determines, based on processing input data generated when the vehicle in which the ANS 400 is located subsequently navigates through the same roadway portion again, the presence of an additional static feature (e.g., a road sign) not characterized in the initial static feature characterization, module 432 can update the static feature characterization of the roadway portion to incorporate the additional static feature.
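A minimal sketch of this update step, assuming features are keyed by an identifier; the merge policy (add unseen features, leave known ones untouched) and all names are illustrative simplifications:

```python
def update_static_features(characterization, newly_observed):
    """Merge features detected on a subsequent traversal into an existing
    static feature characterization (sketch of module 432's update step).
    Returns the set of feature ids that were newly added."""
    added = set()
    for feature_id, feature in newly_observed.items():
        if feature_id not in characterization:
            characterization[feature_id] = feature
            added.add(feature_id)
    return added
```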
  • Module 430 is configured to evaluate a virtual characterization of a roadway portion to determine whether to enable autonomous navigation of at least the roadway portion based on the virtual characterization.
  • Such evaluation can include determining a confidence indicator associated with the virtual characterization of a roadway portion, tracking changes in the confidence indicator over successive monitoring of the roadway portion, based on successive navigations of the vehicle in which the ANS 400 is located through the roadway portion, comparing the confidence indicator with one or more various thresholds, etc.
  • Module 430 can include an evaluation module 436 which is configured to evaluate a virtual characterization of a roadway portion, such that the module 436 associates a confidence indicator with the characterization.
  • A confidence indicator can indicate a confidence that the virtual characterization characterizes, within a certain level of accuracy, precision, some combination thereof, or the like, the static features, driving characteristics, etc. associated with the roadway portion.
  • A confidence indicator associated with a virtual characterization of a roadway portion can indicate a confidence that the virtual characterization characterizes, within a certain level of accuracy, all of the roadway static features (e.g., roadway edges, lanes, lane boundaries, road signs, etc.) associated with the roadway portion.
  • Module 436 updates the confidence indicator of a virtual characterization of a roadway portion over time based on successive processing of successively-generated sets of input data associated with the roadway portion. For example, where the vehicle in which the ANS 400 is located navigates through a given roadway portion multiple times, and successive processing of the successive sets of input data associated with the roadway portion results in fewer or no additional changes to the developed virtual characterization of the roadway portion, module 436 can successively adjust the confidence indicator associated with the virtual characterization to reflect an increased confidence in the accuracy and precision of the virtual characterization. Where a set of input data, upon processing, results in a substantial revision of the virtual characterization of a roadway portion, module 436 can reduce the confidence indicator associated with the virtual characterization.
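The adjustment just described can be sketched as a simple update rule; the gain, penalty, and the cutoff for what counts as a "substantial revision" are invented constants, not values from this description:

```python
def update_confidence(confidence, revision_magnitude,
                      gain=0.1, penalty=0.3, major_revision=0.5):
    """Adjust a confidence indicator after processing a new input data set.
    revision_magnitude in [0, 1] expresses how much the new data changed
    the virtual characterization. All constants are illustrative."""
    if revision_magnitude >= major_revision:
        # substantial revision: reduce confidence in the characterization
        return max(0.0, confidence - penalty)
    # little or no change: increase confidence toward 1.0
    return min(1.0, confidence + gain * (1.0 - revision_magnitude))
```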
  • Evaluation module 436 evaluates one or more portions of a driving route and determines whether to enable autonomous navigation of the vehicle in which the ANS 400 is located through the one or more portions of the driving route, based at least in part upon a determination of whether confidence indicators associated with a set of roadway portion virtual characterizations, which at least meet a certain contiguous distance threshold, at least meet a certain threshold level. For example, based at least in part upon determining that a contiguous set of twelve (12) roadway portions included in a particular driving route have associated confidence indicators which exceed a threshold confidence level of 90%, module 436 can enable an availability of autonomous navigation of at least a portion of the twelve roadway portions.
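The contiguous-distance check in this example can be sketched as a scan for runs of qualifying portions. The twelve-portion run length and 90% threshold mirror the example above; measuring the contiguous threshold in portion counts rather than physical distance is a simplifying assumption:

```python
def enabled_spans(portion_confidences, conf_threshold=0.9, min_run=12):
    """Find contiguous runs of roadway portions along a route whose
    confidence indicators all meet conf_threshold and whose length meets
    min_run. Returns (start, end) index pairs, inclusive."""
    spans, run_start = [], None
    for i, c in enumerate(portion_confidences):
        if c >= conf_threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_run:
                spans.append((run_start, i - 1))
            run_start = None
    # close a run that extends to the end of the route
    if run_start is not None and len(portion_confidences) - run_start >= min_run:
        spans.append((run_start, len(portion_confidences) - 1))
    return spans
```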
  • Such an enabling can include establishing one or more "transition" route portions in which a transition between manual and autonomous navigation occurs.
  • A transition can include an autonomous transition portion in which a user is instructed to release manual control of one or more control elements of the vehicle in which the ANS 400 is located, a manual transition portion in which a user is alerted to assume manual control of one or more control elements of the vehicle in which the ANS 400 is located, some combination thereof, etc.
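Establishing transition route portions at the edges of an enabled span might look like the following sketch. Representing transitions as labeled portion indices, and the one-portion transition length, are assumptions for illustration:

```python
def with_transitions(span, route_len, transition_len=1):
    """Given an enabled span (start, end) of roadway-portion indices on a
    route, mark transition portions: an autonomous transition at the start
    (user releases control) and a manual transition at the end (user
    resumes control), unless the span reaches the route boundary."""
    start, end = span
    labels = {}
    if start > 0:  # route begins under manual control before the span
        for i in range(start, min(start + transition_len, end + 1)):
            labels[i] = "to_autonomous"
    if end < route_len - 1:  # route continues manually after the span
        for i in range(max(end - transition_len + 1, start), end + 1):
            labels[i] = "to_manual"
    return labels
```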
  • Module 430 can include a curation module 438 which is configured to monitor characterizations of one or more roadway portions to determine whether additional processing is needed to enable autonomous navigation of the one or more roadway portions. Such additional processing can include implementing one or more processing operations at one or more computer systems implementing ANS 400.
  • The monitoring at module 438 can include monitoring successive changes in a confidence indicator associated with a virtual characterization over time and determining whether additional processing is needed based on the time-variation of the confidence indicator. For example, module 438 can monitor the rate of change of a confidence indicator associated with a virtual characterization over time.
  • Module 438 determines that additional processing of the virtual characterization of a roadway portion, input data associated with the roadway portion, some combination thereof, etc. is required, based on a determination that a rate of change of an associated confidence indicator does not meet a threshold rate value. For example, if a confidence indicator associated with a virtual characterization of a particular roadway portion fluctuates over time and does not increase at more than a particular rate, module 438 can determine that additional processing associated with that roadway portion is required. Such additional processing can include evaluating multiple sets of input data generated during multiple separate navigations through the roadway portion, evaluating one or more portions of the virtual characterization which are determined to change repeatedly with successive sets of input data, etc.
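The rate-of-change test can be sketched as below. Using the average per-navigation change of the confidence indicator as the "rate", and the threshold value, are illustrative choices rather than details from this description:

```python
def needs_additional_processing(confidence_history, min_rate=0.02):
    """Decide whether a virtual characterization needs additional
    processing (module 438's check, sketched): if the average
    per-navigation change of the confidence indicator stays below
    min_rate, the indicator is fluctuating rather than converging."""
    if len(confidence_history) < 2:
        return False  # not enough navigations to estimate a rate
    rate = (confidence_history[-1] - confidence_history[0]) / (len(confidence_history) - 1)
    return rate < min_rate
```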
  • Module 438 is configured to determine whether to upload one or more of a virtual characterization of a roadway portion, one or more sets of input data associated with a roadway portion, etc. to one or more remote systems, services, etc. for additional curation, processing, etc. For example, if, after additional processing by a computer system implementing ANS 400, a confidence indicator associated with a virtual characterization does not at least meet a threshold level, module 438 can determine to upload the virtual characterization and various sets of input data associated with the roadway portion to a remote service, which can include a cloud service.
  • ANS 400 includes an interface module 450 which is configured to present information associated with autonomous navigation to a user of the vehicle in which the ANS 400 is located via one or more user interfaces of the vehicle in which the ANS 400 is located, receive user-initiated commands from the user via one or more user interfaces of the vehicle in which the ANS 400 is located, etc. For example, based on a determination at module 430 to enable autonomous navigation of a portion of a driving route which includes a set of roadway portions, module 450 can present a representation of the driving route, including a representation of the portion of the driving route for which autonomous navigation is enabled, along with an invitation to the user to indicate whether to engage autonomous navigation of the portion of the driving route.
  • Interface module 450 can receive user-initiated commands to engage autonomous driving of one or more portions of a driving route.
  • The ANS 400 can engage autonomous navigation of a portion of a driving route, for which autonomous navigation is enabled, independently of user interaction with the ANS via one or more user interfaces. For example, upon enabling of autonomous navigation for a roadway portion, the ANS can automatically, without user intervention, engage autonomous navigation upon the vehicle in which the ANS is located encountering the roadway portion. Such automatic engagement of autonomous navigation can be selectively enabled based on user interaction with the ANS via one or more user interfaces included in the vehicle.
  • ANS 400 includes a communication module 460 which is configured to communicatively couple with one or more remote services, systems, etc., via one or more communication networks.
  • Module 460 can communicatively couple with a remote service, system, etc. via a wireless communication network, cellular communication network, satellite communication network, etc.
  • Module 460 can communicate data with the one or more remote services, systems, etc., including uploading virtual characterizations, input data sets, etc. to a remote service, system, etc., receiving one or more virtual characterizations from the remote service, system, etc., some combination thereof, or the like.
  • ANS 400 includes a database module 440 which is configured to store one or more virtual characterizations 442.
  • Such characterizations can include one or more virtual roadway portion characterizations, one or more virtual route characterizations which include one or more sets of virtual roadway portion characterizations, some combination thereof, etc.
  • Virtual characterizations 442 can be developed at module 430 based on one or more sets of input data generated based on monitoring one or more of external data, vehicle data, etc. when the vehicle in which the ANS 400 is located navigates through a particular roadway portion.
  • A virtual route characterization can include a characterization of the various roadway portions included in a route, including an indication of the start and destination locations of the route.
  • A virtual characterization can be developed for each roadway portion, and a virtual route characterization indicates the various roadway portions included in the route.
  • The virtual route characterization includes an indication of the roadway portions in the route for which autonomous navigation is enabled.
  • The various virtual characterizations 442 included in database module 440 can include, for each characterization 442, a set of driving rule characterizations 444 characterizing a set of driving rules which can be utilized to autonomously navigate vehicle 401 through one or more roadway portions, and a set of static feature characterizations 446 which characterize the various static features included in one or more roadway portions.
  • A virtual characterization 442 can include a confidence indicator 448 associated with the characterization 442.
  • FIG. 5A-C illustrate a user interface associated with the autonomous navigation system, according to some embodiments.
  • The user interface can be generated by any of the embodiments of ANSs.
  • User interface 500 is a display interface which presents a graphical user interface (GUI) 502 on a display screen.
  • The illustrated GUI 502 presents a representation of a map which includes a set of roadways 510A-E in a particular geographic region.
  • The set of roadways 510A-E can be referred to as at least a portion of a roadway network.
  • A user interface presented to a user of a vehicle in which an ANS is included includes a representation of a route along which the vehicle can be navigated between one or more locations.
  • The representation of the route can be presented based on one or more user-initiated commands to the interface which command that a particular previously-characterized route be displayed on the represented map of GUI 502.
  • Each route can be associated with a particular title (e.g., "route to work"), and a user can interact with one or more user interfaces to select the particular route based on identifying the particular title associated with the route along which the user desires to navigate the vehicle.
  • A user interface presents a representation of a particular route based at least upon an anticipation that a user of the vehicle will desire to navigate the vehicle along the particular route.
  • Such an anticipation can be based at least in part upon a determination that the vehicle is presently located at a physical location which corresponds to a start location of one or more particular routes, at a particular time of day which corresponds to a time range during which a particular route has historically been navigated from the start location, etc.
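The anticipation heuristics just described (near a stored route's start location, within its historical time-of-day window) can be sketched as a filter over stored routes. The route record layout, the planar distance approximation, and the 200 m radius are illustrative assumptions:

```python
def anticipated_routes(routes, current_location, now_hour, radius_m=200.0):
    """Pick stored routes a user is likely to want: the vehicle is near a
    route's start location and the current hour falls within the hour
    range during which that route has historically been navigated."""
    def flat_distance_m(a, b):
        # crude equirectangular approximation; one degree ~ 111 km,
        # adequate for a same-city proximity check
        return (((a[0] - b[0]) * 111_000.0) ** 2 +
                ((a[1] - b[1]) * 111_000.0) ** 2) ** 0.5

    picks = []
    for route in routes:
        near_start = flat_distance_m(route["start"], current_location) <= radius_m
        first_hour, last_hour = route["usual_hours"]
        if near_start and first_hour <= now_hour <= last_hour:
            picks.append(route["title"])
    return picks
```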
  • The GUI presents an interface element (e.g., one or more icons, message prompts, etc.) which includes one or more interactive elements, each representing a separate route, which a user can interact with to command the interface to present a representation of a particular route.
  • Each route can be a route for which a particular virtual characterization is stored at the ANS and can be associated with a particular route title.
  • The route title can be specified by a user, by the ANS, some combination thereof, etc.
  • The GUI presents an interface element which indicates a limited selection of the routes for which the ANS stores virtual characterizations, based at least in part upon one or more of the present location of the vehicle in which the interface and ANS are located, the present time of day at said location, some combination thereof, etc.
  • The interface can present an interactive element, including one or more representations of the one or more routes, and can prompt a user to interact with one or more of the representations to select one or more of the routes.
  • The interface can interact with the ANS to present a graphical representation of the one or more routes associated with the one or more particular representations.
  • Multiple roadways 510A-E are presented on GUI 502, which also presents a plurality of location icons 520A-D associated with various locations.
  • The vehicle in which the interface 500 is located can be presently proximate to location 520A, which can be a starting location for several separate routes to separate destination locations.
  • Three locations 520B-D are presented on the GUI at positions, relative to the illustrated roadways 510, which correspond to the physical locations of said locations relative to said roadways.
  • Each separate location can be a destination location for one or more driving routes which originate at starting location 520A.
  • The separate locations 520B-D can be presented in the GUI in response to identifying that the vehicle in which the interface 500 is located is proximate to location 520A, identifying multiple separate driving routes for which location 520A is a starting location, and identifying locations 520B-D as destination locations of one or more of said identified separate driving routes.
  • The ANS and interface can be interoperable to identify, based on input data received at the ANS from one or more sensor devices, a present location of the vehicle.
  • The ANS can identify one or more starting locations, of one or more driving routes for which virtual characterizations are stored at the ANS, which are proximate to the present location of the vehicle, and further identify one or more destination locations of the one or more driving routes.
  • The interface can present, to the user, graphical representations of the identified starting locations and destination locations, and can further present one or more interactive interface elements with which the user can interact to select one or more of the driving routes.
  • GUI 502 includes an interface element 580 which includes three separate representations 590A-C of three separate driving routes. Each driving route can have starting location 520A and a separate one of the illustrated destination locations 520B-D. As shown, each representation 590A-C includes a route title associated with the respective driving route.
  • Each representation can be interactive, such that a user can interact with one or more of the representations 590 to select one or more of the driving routes associated with the representations.
  • One or more of the ANS and the interface can identify that a user has selected a particular driving route and present a representation of the same on the GUI.
  • FIG. 5B illustrates GUI 502 presenting a representation of a particular driving route 530 which extends between a start location 520A and a destination location 520B.
  • The driving route 530 includes a set of roadway portions 532 extending between the two locations 520A-B.
  • The represented route 530 does not indicate the boundaries between the various roadway portions 532 included in the route 530.
  • A particular represented driving route includes one or more portions for which autonomous navigation is enabled.
  • The representation 530 of a driving route includes a representation of a portion 540 of the route for which autonomous navigation is enabled.
  • The representation can include a message 570 which invites a user to indicate, via interaction with one or more interactive elements 572 of the GUI 502, whether to engage autonomous navigation of the portion 540 of the route for which autonomous navigation is enabled.
  • A portion of said portion 540 is associated with a transition between manual navigation and autonomous navigation.
  • Transition region 546 of portion 540 is associated with transitioning from autonomous navigation of portion 540 to manual navigation of a remainder of route 530 to location 520B.
  • The GUI is configured to present various messages to a user based on a present location of the vehicle in which the interface device 500 is included. For example, where the vehicle is being autonomously navigated through portion 540 and crosses boundary 545 into the transition portion, GUI 502 can present an alert message alerting the user to imminent transfer to manual navigation.
  • One or more alerts presented on user interface 502 can be accompanied by other alert signals presented via one or more other user interfaces.
  • A presentation of an alert message on GUI 502 can be accompanied by an audio signal presented via one or more speaker interface devices of the vehicle.
  • Portion 540 is represented distinctly from a remainder of route 530.
  • Portion 540 can be represented in a different color relative to a remainder of route 530.
  • An animation effect can be presented on portion 540.
  • FIG. 5C illustrates a user interface associated with the autonomous navigation system, according to some embodiments.
  • The user interface can be generated by any of the embodiments of ANSs.
  • As additional roadway portions in a route become enabled for autonomous navigation, the representation of the portion 540 of the route for which autonomous navigation is enabled can change accordingly. For example, as shown, where the roadway portions in the route for which autonomous navigation is enabled include additional portions extending toward location 520B, relative to the portions as shown in FIG. 5B, the representation of portion 540 can be shown in GUI 502 to be extended accordingly. Where the portion 540 has recently been extended, within a certain period of time, the extended element of portion 540 can be represented distinctly from the remainder of portion 540, including being represented in a different color from the remainder of portion 540.
  • FIG. 6 illustrates a user interface associated with the autonomous navigation system, according to some embodiments.
  • The user interface can be generated by any of the embodiments of ANSs.
  • An indication of such one or more alternative routes can be presented to a user via a user interface, and the user can be requested to engage in navigation of the one or more alternative routes, relative to the most recently-navigated route.
  • An alternative route can be proposed to a user based upon a determination that a confidence indicator associated with a virtual characterization of one or more roadway portions included in a driving route is not sufficiently high to enable autonomous navigation of one or more portions of the route.
  • The alternative route can include a route for which autonomous navigation is enabled for one or more portions thereof, such that a proposal to a user to engage in navigation of the alternative route includes an invitation to engage in autonomous navigation of the one or more portions of the alternative route.
  • The alternative route does not include portions for which autonomous navigation is enabled, and virtual characterizations of one or more portions of the alternative route may be presently non-existent.
  • A proposal to navigate the alternative route can include an invitation to a user to manually navigate along the route, so that a virtual characterization of one or more roadway portions included in the alternative route can be developed and autonomous navigation of the alternative route can be subsequently enabled.
  • GUI 602 illustrates one or more representations of one or more roadways 610A-E and a representation of a particular driving route 620 between a starting location 612A and a destination location 612B.
  • GUI 602 illustrates a representation of an alternative route 630 between the two locations 612A-B and a message 670 prompting a user to selectively engage or decline engaging autonomous navigation of the alternative route 630, rather than navigate along route 620, based at least in part upon interaction with one or more interactive elements 672-674 of the GUI 602.
  • FIG. 7 illustrates developing virtual characterizations of one or more roadway portions to enable autonomous navigation of the one or more roadway portions, according to some embodiments.
  • The developing can be implemented by any of the embodiments of ANSs included in one or more vehicles and can be implemented by one or more computer systems.
  • A set of input data is received from one or more sensor devices of a vehicle based on the vehicle being manually navigated through one or more roadway portions.
  • the set of input data can include external sensor data indicating various static features of the roadway portion, vehicle sensor data indicating various instances of data associated with the vehicle, driving characteristic data associated with one or more of the vehicle, one or more other external vehicles navigating the roadway portion in proximity to the vehicle, some combination thereof, or the like.
  • A virtual characterization of the one or more roadway portions is developed.
  • The virtual characterization can include a characterization of a set of driving rules associated with navigating through the roadway portion, a characterization of the static features of the roadway portion, some combination thereof, etc.
  • Developing a virtual characterization of one or more roadway portions includes developing a virtual characterization of a driving route which includes one or more sets of roadway portions through which the vehicle navigates between one or more start locations, destination locations, etc.
  • A confidence indicator associated with the developed virtual characterizations of the one or more roadway portions is determined.
  • The confidence indicator can indicate a confidence associated with one or more of the accuracy, precision, etc. of the virtual characterization of one or more roadway portions.
  • A determination is made regarding whether the confidence indicator associated with one or more virtual characterizations at least meets a confidence threshold level.
  • The threshold level can be associated with a sufficiently-high confidence indicator that autonomous navigation through the one or more roadway portions can be safely engaged using the virtual characterization of the one or more roadway portions. If so, at 709, autonomous navigation of the one or more roadway portions is enabled, such that autonomous navigation of the one or more roadway portions can be engaged.
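The loop implied by this flow (revise the characterization from each new input data set, re-evaluate its confidence indicator, and enable autonomous navigation once the indicator meets the threshold) can be sketched as follows; the gain, penalty, and revision cutoff are invented stand-ins for the processing at modules 430 and 436:

```python
def process_navigation_pass(state, revision_magnitude, conf_threshold=0.9):
    """One iteration of the FIG. 7 flow (sketch). `state` carries the
    characterization's confidence indicator and enablement flag;
    `revision_magnitude` in [0, 1] stands in for how much the latest
    input data set changed the virtual characterization."""
    gain, penalty, major_revision = 0.1, 0.3, 0.5  # illustrative constants
    if revision_magnitude >= major_revision:
        # substantial revision: confidence in the characterization drops
        state["confidence"] = max(0.0, state["confidence"] - penalty)
    else:
        # characterization stable: confidence grows toward 1.0
        state["confidence"] = min(1.0, state["confidence"] + gain)
    state["autonomy_enabled"] = state["confidence"] >= conf_threshold
    return state
```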
  • Where the rate at which the confidence indicator of one or more virtual characterizations of one or more roadway portions changes is determined at 710 to be less than a confidence rate threshold, one or more of the virtual characterizations, sets of input data, etc. can be uploaded to a remote service, system, etc. for additional processing to modify the virtual characterizations to increase the confidence indicator associated with the virtual characterizations beyond the confidence threshold.
  • The alternative route, in some embodiments, includes one or more portions for which autonomous navigation is enabled.
  • The alternative route can, via one or more user interfaces of the vehicle, be proposed to a user of the vehicle as an option for navigation between a starting location and a destination location, as an alternative to the driving route most recently navigated between the starting location and the destination location.
  • Multiple ANSs are installed in multiple separate vehicles, and each separate ANS can develop virtual characterizations of the one or more driving routes navigated by the respective vehicle in which the respective ANS is installed.
  • The multiple separate ANSs can communicatively couple with a remote system, service, etc. and communicate data therewith.
  • A remote system, service, etc. can include a navigation monitoring system, implemented on one or more computer systems which are external to the multiple vehicles and communicatively coupled to one or more vehicles via one or more communication networks.
  • One or more monitoring systems can be communicatively coupled via one or more communication networks, and an ANS in a given vehicle can communicatively couple with one or more of the navigation monitoring systems.
  • Data communication between the multiple ANSs and one or more monitoring systems can include various ANSs "uploading" one or more sets of virtual route characterizations, virtual roadway portion characterizations, input data received from sensors of the vehicle in which the uploading ANS is located, etc.
  • An ANS uploads a virtual characterization to a remote system, service, etc., which characterization is incorporated, at the navigation monitoring system, into a database of characterizations and input data.
  • An ANS uploads one or more virtual characterizations, sets of input data, etc. to be processed by the navigation monitoring system to refine one or more virtual characterizations, so that automated driving can be enabled for the one or more characterizations.
  • Data communication between the multiple ANSs and one or more monitoring systems can include the navigation monitoring system distributing, or "downloading", one or more virtual characterizations of one or more driving routes, roadway portions, etc. to one or more ANSs installed in one or more vehicles.
  • a virtual characterization distributed to an ANS from the navigation monitoring system can include a virtual characterization developed at least partially at the navigation monitoring system based on data received from one or more ANSs, a virtual characterization which was developed at a separate ANS and uploaded to the navigation monitoring system, some combination thereof, etc.
  • FIG. 8 illustrates a schematic of an autonomous navigation network 800 which comprises multiple ANSs 804A-F, located in separate vehicles 802A-F, which are communicatively coupled to a navigation monitoring system 810 via one or more communication links 820 over one or more communication networks, according to some embodiments.
  • Each ANS 804 illustrated can include any of the ANSs illustrated in any of the above embodiments.
  • a navigation monitoring system 810 implemented on one or more computer systems which are external to the various vehicles 802 in network 800, includes a processing module 812, implemented by one or more instances of processing circuitry included in the navigation monitoring system, which can process one or more sets of input data associated with one or more roadway portions, one or more virtual characterizations of one or more roadway portions, one or more virtual characterizations of one or more driving routes, some combination thereof, etc.
  • a navigation monitoring system 810 includes a database 814 in which multiple various virtual characterizations 816 of one or more driving routes, roadway portions, etc. are stored.
  • the navigation monitoring system 810 communicates with the various ANSs 804A-F via the one or more communication links 820.
  • Such communication can include exchanging virtual characterizations, exchanging sets of input data associated with one or more roadway portions, etc. between one or more ANSs 804 and the navigation monitoring system 810.
  • each ANS 804A-F includes at least one database 806A-F in which one or more sets of input data, virtual characterizations, etc. can be stored.
  • An ANS 804 can upload a virtual characterization, developed by the ANS 804 and stored at the respective database 806, to monitoring system 810 for one or more of processing, storage at database 814, etc.
  • Monitoring system 810 can distribute one or more virtual characterizations 816 stored at database 814, including virtual characterizations received from one or more ANSs 804, virtual characterizations at least partially developed at the navigation monitoring system 810 via processing module 812, etc. to one or more ANSs 804 to be stored in one or more databases 806 of the respective one or more ANSs 804.
  • a virtual route characterization developed at ANS 804E can be uploaded to system 810 and distributed to ANS 804A-D, F.
  • data including some or all of a route characterization can be uploaded continuously from one or more ANSs to system 810.
  • ANS 804A can process input data from various sensor devices of vehicle 802A and continuously upload input data, virtual characterizations based at least in part upon such input data, some combination thereof, etc. to system 810 as vehicle 802A continues to navigate one or more roadway portions.
  • FIG. 9A-B illustrate a schematic of an autonomous navigation network 900 which comprises multiple ANSs 904A-D, located in separate vehicles 902A-D, which are communicatively coupled to a navigation monitoring system 910 via one or more communication links 920A-D over one or more communication networks, according to some embodiments.
  • Each ANS 904 illustrated can include any of the ANSs illustrated in any of the above embodiments.
  • multiple separate ANSs located in separate vehicles develop virtual characterizations of various separate sets of roadway portions, driving routes, etc.
  • the separate ANSs can communicate one or more of such locally-developed virtual characterizations to the navigation monitoring system, where the various characterizations from various ANSs can be incorporated into a collection of virtual characterizations at the navigation monitoring system.
  • one or more ANSs located in one or more separate vehicles autonomously navigating one or more roadway portions concurrently and continuously upload virtual characterizations developed based on sensor data generated during the autonomous navigation of the one or more roadway portions.
  • FIG. 9A illustrates each of the separate ANSs 904A-D of the separate vehicles 902A-D communicating a separate set 909A-D of virtual roadway portion characterizations to monitoring system 910 via separate communication links 920A-D.
  • each separate set 909A-D of virtual characterizations is illustrated in FIG. 9A in separate map representations 908A-D showing the geographic locations and roadways for which the separate sets 909A-D include virtual characterizations.
  • each map 908A-D is an illustrated representation of a common geographic region.
  • each separate set 909A-D of virtual characterizations includes a set of multiple virtual roadway portion characterizations of a separate set of roadway portions.
  • separate sets of virtual characterizations include virtual characterizations of common roadway portions.
  • the sets of virtual characterizations 909A-B include virtual characterizations of roadway portions 911.
  • the various ANSs can communicate the virtual characterizations to the navigation monitoring system based on various triggers.
  • the various ANSs 904 can communicate at least some of the locally-developed virtual characterizations, locally-stored virtual characterizations, etc. to the navigation monitoring system 910 in response to development, updating, etc. of said characterizations, in response to a timestamp trigger, in response to a query from the navigation monitoring system 910, intermittently, continuously, periodically, some combination thereof, etc.
  • the navigation monitoring system can implement processing of the virtual characterization, which can include automatically modifying various elements of the virtual characterization such that the confidence indicator associated with the virtual characterization is improved.
  • the processing, which can be implemented by one or more processing modules 912 of the system 910, can include processing a virtual characterization in response to determining that a confidence indicator associated with the received virtual characterization is less than a threshold confidence indication, processing the virtual characterization in response to identifying a curation flag associated with the received virtual characterization, etc.
  • the navigation monitoring system 910 processes received virtual characterizations, input data sets, etc. associated with one or more roadway portions with respect to stored virtual characterizations of the one or more roadway portions. Such processing can include comparing two separate virtual characterizations of a roadway portion and discarding a virtual characterization in favor of storing another virtual characterization in response to a determination that the confidence indicator associated with the favored virtual characterization is superior to that of the discarded virtual characterization. In some embodiments, such processing can include developing a "composite" virtual characterization of one or more roadway portions based at least in part upon data incorporated from two or more virtual characterizations of the one or more roadway portions.
  • a composite virtual characterization of a roadway portion can be developed based on at least some static feature characterizations included in one virtual characterization of the roadway portion, at least some other static feature characterizations included in another virtual characterization of the roadway portion, and at least some driving rule characterizations incorporated from yet another virtual characterization of the roadway portion.
  • Such incorporation of various elements from various virtual characterizations can be based at least in part upon a determination that a confidence indicator associated with a given element of a given virtual characterization is superior to corresponding elements of other virtual characterizations.
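The element-wise incorporation described above can be sketched as a merge that keeps, for each element, the version carrying the superior confidence indicator. The dictionary layout, element names, and per-element confidence values below are illustrative assumptions, not the specification's data model:

```python
def compose(characterizations):
    """Develop a composite virtual characterization by incorporating, for
    each element (e.g. a static feature characterization or driving rule
    characterization), the version whose associated confidence indicator
    is superior. Each input maps an element name to (value, confidence)."""
    composite = {}
    for vc in characterizations:
        for element, (value, confidence) in vc.items():
            # Incorporate this element only if no version is present yet,
            # or if this version's confidence indicator is superior.
            if element not in composite or confidence > composite[element][1]:
                composite[element] = (value, confidence)
    return composite


# Two characterizations of a common roadway portion, each stronger on
# different elements; the composite incorporates the best of each.
vc_a = {"lane_markings": ("solid", 0.9), "speed_limit": (50, 0.6)}
vc_b = {"lane_markings": ("dashed", 0.7), "speed_limit": (55, 0.8)}
composite = compose([vc_a, vc_b])
```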
  • FIG. 9B illustrates a graphical representation of a set 919 of virtual roadway portion characterizations stored in the database 914 of monitoring system 910, where the various characterizations in the set 919 can be developed based at least in part upon characterizations 909A-D received from one or more of the ANSs 904.
  • the set 919 of virtual characterizations is shown in a map representation 918 showing the geographic locations and roadways for which the set 919 includes virtual characterizations.
  • Set 919 includes a virtual characterization for each of the roadway portions for which a virtual characterization was received in one or more of the sets 909 received from one or more of the ANSs 904.
  • the corresponding virtual characterization in set 919 can include a composite characterization developed from the multiple received characterizations, a selected one of the received virtual characterizations, some combination thereof, etc.
  • a navigation monitoring system distributes at least a portion of the virtual characterizations stored at the navigation monitoring system to one or more ANSs via one or more communication links.
  • the navigation monitoring system can distribute one or more virtual characterizations to an ANS in response to receiving a request from the ANS for such distribution, in response to an update to the virtual characterizations stored at the navigation monitoring system, in response to a timestamp trigger, intermittently, continuously, at periodic intervals, some combination thereof, or the like.
  • monitoring system 910 can distribute one or more of the virtual characterizations included in the stored set 919 of virtual characterizations to one or more of the ANSs 904A-D via one or more of the communication links 920A-D.
  • a navigation monitoring system distributes, to a given ANS, a limited selection of virtual characterizations of one or more roadway portions for which the given ANS does not presently have a stored virtual characterization associated with a greater confidence indicator than the confidence indicator associated with the virtual characterizations in the limited selection.
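The limited-selection distribution described in the bullet above can be sketched as a filter over the monitoring system's stored characterizations. The function name and the id-to-confidence mapping are hypothetical conveniences for illustration:

```python
def select_for_distribution(stored, local):
    """Select the limited set of stored characterizations to distribute to
    a given ANS: only those roadway portions for which the ANS does not
    presently hold a characterization with a greater confidence indicator.
    Both mappings are roadway-portion id -> confidence indicator."""
    return {
        portion: confidence
        for portion, confidence in stored.items()
        # Distribute unless the ANS's local version is strictly better;
        # portions the ANS has never characterized are always distributed.
        if local.get(portion, float("-inf")) <= confidence
    }


stored = {"p1": 0.9, "p2": 0.5, "p3": 0.7}  # at the monitoring system
local = {"p1": 0.95, "p2": 0.4}             # already held by the ANS
selection = select_for_distribution(stored, local)
```

This avoids re-sending characterizations the ANS already holds in a superior form, which is one plausible way to limit traffic over the communication links.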
  • an ANS can communicate with a navigation monitoring system to develop virtual characterizations of one or more roadway portions with sufficiently high confidence indicator to enable autonomous navigation of the one or more roadway portions via the virtual characterizations.
  • Such communication can include uploading one or more sets of input data associated with the roadway portions for development into one or more virtual characterizations, uploading one or more developed virtual characterizations for additional processing, etc.
  • an ANS can upload a virtual characterization, one or more sets of input data, some combination thereof, or the like to a navigation monitoring system, based at least in part upon a determination that the confidence indicator associated with one or more virtual characterizations is changing at less than a threshold rate over a certain number of updates of the one or more virtual characterizations.
  • Such uploading can include communicating a request to the navigation monitoring system to implement additional processing of the virtual characterization, input data, etc. to generate a modified virtual characterization.
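The upload trigger described above, a confidence indicator changing at less than a threshold rate over successive updates, can be sketched as a plateau check. The function name, window size, and threshold rate are illustrative assumptions:

```python
def should_upload(confidence_history, threshold_rate=0.01, window=5):
    """Decide whether an ANS should upload a virtual characterization to
    the navigation monitoring system: True when the confidence indicator
    has changed at less than the threshold rate over the last `window`
    updates, i.e. local refinement has plateaued short of its goal."""
    if len(confidence_history) < window:
        # Not enough successive updates yet to judge the rate of change.
        return False
    recent = confidence_history[-window:]
    # Average per-update change in the confidence indicator.
    rate = abs(recent[-1] - recent[0]) / (window - 1)
    return rate < threshold_rate


should_upload([0.70, 0.71, 0.71, 0.71, 0.71])  # plateaued: upload
should_upload([0.30, 0.45, 0.60, 0.75, 0.90])  # still improving: keep local
```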
  • a processing module of the navigation monitoring system can implement processing of one or more characterizations included in the virtual characterization to generate a modified virtual characterization with an improved associated confidence indicator.
  • Such processing can include implementing processing capabilities not available at the ANS. If such a modified virtual characterization cannot be generated based on the processing at the navigation monitoring system, the navigation monitoring system can flag one or more portions of the virtual characterization for "manual curation", whereby the one or more portions of the virtual characterization can be modified based on user-initiated modification of said portions.
  • a user, operator, etc. can be alerted by the navigation monitoring system that the virtual characterization requires manual modification to improve the associated confidence indicator thereof. If the manual curation does not result in a modified virtual characterization having an associated confidence indicator which at least meets a threshold level, the navigation monitoring system can generate a dispatching command to dispatch a dedicated sensor suite, which can be included in a dedicated sensor vehicle, to the one or more roadway portions characterized in the virtual characterization, where the sensor suite is commanded to collect additional sets of input data associated with the roadway portions. Once such additional sets of input data are received at the navigation monitoring system from the dedicated sensor suite, the data can be processed to implement additional modification of the virtual characterization of the roadway portion.
  • the roadway portion can be associated with a warning flag, and alternative driving routes can be associated with driving routes which include the flagged roadway portion. Characterizations of such alternative driving routes can be distributed to one or more ANSs, including the ANS from which the virtual characterization was originally received.
  • FIG. 10 illustrates a "curation spectrum" 1000 of processing available to generate one or more virtual roadway portion characterizations, according to some embodiments.
  • a curation spectrum 1000 can include one or more ANSs, monitoring systems, etc., which can include any of the above embodiments of ANSs, monitoring systems, etc.
  • an ANS included in a vehicle implements "local curation" 1001 of a virtual characterization of a roadway portion, where the ANS processes one or more sets of input data associated with the roadway portion, and a virtual characterization of the roadway portion, to update the virtual characterization.
  • Such processing can occur in response to receiving the set of input data and can occur multiple times in response to successively-generated sets of input data, based on successive navigations of a vehicle along the roadway portion.
  • Such successive processing can result in changes to a confidence indicator associated with the virtual characterization over time. For example, with each update to a virtual characterization, the confidence indicator can change based at least in part upon the changes, if any, to the virtual characterization with each successive update thereof.
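The update-driven confidence changes described above can be sketched as follows. The update rule (raise the indicator when a successive update leaves the characterization unchanged, lower it when elements change) and the step size are illustrative assumptions; the specification does not prescribe a particular formula:

```python
def update_confidence(confidence, previous_vc, updated_vc, step=0.05):
    """Adjust a confidence indicator with each successive update of a
    virtual characterization: when new input data leaves the
    characterization unchanged the indicator is raised, and when
    elements change it is lowered, clamped to the range [0.0, 1.0]."""
    if updated_vc == previous_vc:
        return min(1.0, confidence + step)
    return max(0.0, confidence - step)


c = 0.5
c = update_confidence(c, {"lanes": 2}, {"lanes": 2})  # stable update: raised
c = update_confidence(c, {"lanes": 2}, {"lanes": 3})  # element changed: lowered
```

Under this rule, repeated navigations that confirm the characterization steadily raise its confidence toward the threshold for enabling autonomous navigation.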
  • a vehicle 1002 can include an ANS 1004 which includes a processing module 1006 which can implement said updating.
  • the processing module 1006 can compare the virtual characterization and associated confidence indicator with one or more threshold confidence indicators, threshold rates, etc.
  • the ANS 1004 can selectively enable autonomous navigation of the roadway portion based at least in part upon a determination that the associated confidence indicator at least meets a confidence indication threshold, which is also referred to herein interchangeably as a "threshold confidence indication", "threshold", etc.
  • the ANS 1004 can determine whether the confidence indicator is changing at a minimum threshold rate over time with successive updates to the associated virtual characterization. If not, the ANS 1004 can upload some or all of the virtual characterization to a navigation monitoring system 1010 to implement the next level of "curation". Such curation can include "remote automatic curation" 1003, where one or more processing modules 1012 included in the navigation monitoring system 1010 process one or more virtual characterizations of one or more roadway portions to develop one or more modified virtual characterizations. Such processing can be implemented automatically, without manual input from one or more users, utilizing processing capabilities not available locally to the ANS 1004. For example, the processing module 1012 of the navigation monitoring system 1010 can include processing systems, processing circuitry, etc.
  • the automatic curation 1003 is implemented to generate a modified virtual characterization of the roadway portion with an associated confidence indicator which is superior to the confidence indicator associated with the virtual characterization of the roadway portion received from the ANS 1004.
  • the automatic curation 1003 can, in some embodiments, result in a modified virtual characterization of a roadway portion associated with a confidence indicator which at least meets a threshold confidence indication associated with enabling autonomous navigation of the roadway portion.
  • the modified virtual characterization can be stored at the navigation monitoring system 1010, distributed to the ANS 1004, etc.
  • the navigation monitoring system 1010 is configured to implement the next level of "curation", which can include "manual curation" 1005, where the processing module 1012 of the navigation monitoring system 1010 implements additional processing of a virtual characterization based on user-initiated manual input.
  • manual curation can include the processing module 1012 responding to a determination that the confidence indicator of a modified virtual characterization of a roadway portion, developed via automatic curation 1003, does not at least meet a threshold.
  • the threshold can be a confidence indication threshold associated with enabling autonomous navigation, a confidence indication of a virtual characterization originally received at the navigation monitoring system 1010 from ANS 1004, some combination thereof, etc.
  • Responding to such a determination can include generating a warning message, which can be transmitted to one or more human operators supported by one or more computer systems, identifying the virtual characterization and requesting "manual curation" of one or more portions of the identified virtual characterization.
  • the navigation monitoring system can receive one or more operator-initiated manual input commands to implement particular modifications to one or more elements of the virtual characterization.
  • the navigation monitoring system can modify the virtual characterization, based on the received one or more operator-initiated manual input commands, to generate a modified virtual characterization of the roadway portion with an associated confidence indicator which is superior to the confidence indicator associated with the virtual characterization of the roadway portion received from the ANS 1004.
  • the manual curation 1005 can, in some embodiments, result in a modified virtual characterization of a roadway portion associated with a confidence indicator which at least meets a threshold confidence indication associated with enabling autonomous navigation of the roadway portion.
  • the modified virtual characterization can be stored at the navigation monitoring system 1010, distributed to the ANS 1004, etc.
  • the navigation monitoring system 1010 is configured to implement the next level of "curation", which can include "additional data curation" 1007, where the processing module 1012 of the navigation monitoring system 1010 implements additional processing of a virtual characterization of one or more roadway portions based on input data associated with said roadway portions, received from one or more dedicated sensor suites which are dispatched to generate data associated with the roadway portions.
  • Such implementation can include the processing module 1012 responding to a determination that the confidence indicator of a modified virtual characterization of a roadway portion, developed via manual curation 1005, does not at least meet a threshold.
  • the threshold can be a confidence indication threshold associated with enabling autonomous navigation, a confidence indicator of a virtual characterization originally received at the navigation monitoring system 1010 from ANS 1004, some combination thereof, etc.
  • Responding to such a determination can include generating a dispatch command to one or more sensor suites 1022, which can be included in one or more dedicated sensor vehicles 1020, to proceed to the roadway portion characterized in the virtual characterization and generate additional input data associated with the roadway portion via the various sensors included in the sensor suite 1022.
  • the processing module 1012 of the navigation monitoring system 1010 can communicate with said sensor suite 1022, via a communication module 1016 of the navigation monitoring system 1010.
  • the navigation monitoring system 1010 can receive additional input data sets from the sensor suite 1022 and, via processing module 1012, implement additional processing of the virtual characterization of the roadway portion based at least in part upon the additional input data.
  • the navigation monitoring system can modify the virtual characterization, based on the received one or more sets of additional input data, to generate a modified virtual characterization of the roadway portion with an associated confidence indicator which is superior to the confidence indicator associated with the virtual characterization of the roadway portion received from the ANS 1004.
  • the curation 1007 can, in some embodiments, result in a modified virtual characterization of a roadway portion associated with a confidence indicator which at least meets a threshold confidence indication associated with enabling autonomous navigation of the roadway portion. Where the curation 1007 results in such a modified virtual characterization, the modified virtual characterization can be stored at the navigation monitoring system 1010, distributed to the ANS 1004, etc.
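The curation spectrum 1000 as a whole can be sketched as an escalation ladder: each level of curation is applied in turn until the confidence indicator at least meets the threshold for enabling autonomous navigation. The threshold value and the stub stage functions below are placeholders standing in for the actual processing at each level; they are assumptions for illustration only:

```python
THRESHOLD = 0.9  # assumed threshold for enabling autonomous navigation


def curate(confidence, stages):
    """Walk the curation spectrum: apply each stage (local curation,
    remote automatic curation, manual curation, additional-data curation)
    in order until the confidence indicator at least meets the threshold,
    recording which stages were applied."""
    applied = []
    for name, stage in stages:
        if confidence >= THRESHOLD:
            break  # characterization is good enough; stop escalating
        confidence = stage(confidence)
        applied.append(name)
    return confidence, applied


# Stub stages: each stands in for the processing at one curation level
# and simply nudges the confidence indicator upward.
stages = [
    ("local", lambda c: c + 0.2),
    ("remote_automatic", lambda c: c + 0.2),
    ("manual", lambda c: c + 0.2),
    ("additional_data", lambda c: c + 0.2),
]
conf, path = curate(0.55, stages)  # escalates only as far as needed
```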
  • FIG. 11 illustrates receiving and processing virtual characterizations, of one or more roadway portions, according to some embodiments.
  • the receiving and processing can be implemented on one or more computer systems, including one or more computer systems implementing one or more monitoring systems, ANSs, etc.
  • one or more virtual characterizations are received.
  • the one or more virtual characterizations can be received from one or more ANSs, monitoring systems, etc. Such virtual characterizations can be received at a navigation monitoring system, ANS, etc.
  • a virtual characterization can include a virtual characterization of a roadway portion, a virtual characterization of a driving route, some combination thereof, etc.
  • two separate virtual characterizations of a common roadway portion can be received from two separate ANSs.
  • a determination of such commonality can be made based at least in part upon comparing one or more static feature characterizations, driving rule characterizations, etc. included in the various virtual characterizations.
  • a determination of such commonality can be made based at least in part upon a determination that two separate virtual roadway portion characterizations include a common set of geographic location coordinates in the static feature characterizations of the separate virtual characterizations.
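The commonality determination described above can be sketched as a test for a shared set of geographic coordinates between two characterizations. The coordinate representation and the minimum number of shared coordinates are illustrative assumptions:

```python
def characterize_common_portion(coords_a, coords_b, min_shared=3):
    """Determine whether two virtual roadway portion characterizations
    characterize a common roadway portion, based on whether their static
    feature characterizations include a common set of geographic
    location coordinates."""
    shared = set(coords_a) & set(coords_b)
    return len(shared) >= min_shared


# Coordinates as (latitude, longitude) pairs from the static feature
# characterizations of two separately-received characterizations.
a = {(37.33, -122.01), (37.34, -122.01), (37.35, -122.02)}
b = {(37.33, -122.01), (37.34, -122.01), (37.35, -122.02), (37.36, -122.03)}
common = characterize_common_portion(a, b)
```

In practice coordinate matching would likely need a tolerance rather than exact equality; exact set intersection is used here only to keep the sketch minimal.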
  • a composite virtual characterization is developed based at least in part upon each of the at least two virtual characterizations. Such development can include incorporating at least some elements of the various virtual characterizations into a common composite virtual characterization. For example, at least some static feature characterizations of one virtual characterization, and at least some static feature characterizations of another separate virtual characterization, can be incorporated into a composite virtual characterization.
  • a virtual characterization is a unique virtual characterization of a roadway portion, driving route, etc., a composite virtual characterization of same, some combination thereof, or the like.
  • the virtual characterization is provided to one or more recipients.
  • Such recipients can include a local database, such that the virtual characterization is stored in the database.
  • Such recipients can include one or more remotely-located ANSs, services, systems, etc., such that the virtual characterization is communicated to same via one or more communication links.
  • the database can be included in a navigation monitoring system communicatively coupled to multiple ANSs, other monitoring systems, some combination thereof, or the like.
  • the navigation monitoring system can distribute one or more stored virtual characterizations to one or more ANSs, monitoring systems, etc.
  • FIG. 12 illustrates implementing at least a portion of a curation spectrum with regard to one or more virtual characterizations, of one or more roadway portions, according to some embodiments.
  • the implementing can be implemented on one or more computer systems, including one or more computer systems implementing one or more monitoring systems, ANSs, etc.
  • one or more virtual characterizations are received.
  • the one or more virtual characterizations can be received from one or more ANSs, monitoring systems, etc. Such virtual characterizations can be received at a navigation monitoring system, ANS, etc.
  • a virtual characterization can include a virtual characterization of a roadway portion, a virtual characterization of a driving route, some combination thereof, etc.
  • a determination can be made whether a confidence indicator associated with a virtual characterization at least meets a certain threshold, which can be a threshold value associated with enabling autonomous navigation of the roadway portion characterized by the virtual characterization.
  • when a determination is made that the confidence indicator associated with the virtual characterization is less than the threshold value, automatic curation of the virtual characterization is implemented.
  • Such implementation can include processing one or more elements of the virtual characterization, including one or more static feature characterizations, driving rule characterizations, etc., to generate a modified virtual characterization.
  • automatic curation is implemented without requiring any manual input from any human operators.
  • Generating a modified virtual characterization includes establishing a confidence indicator associated with the modified virtual characterization.
  • An improved value can include a confidence indicator which is superior to the associated confidence indicator of the unmodified virtual characterization, a confidence indicator which at least meets a threshold value associated with enabling autonomous navigation, some combination thereof, or the like. If so, as shown at 1222, the modified virtual characterization is stored in a database. The modified virtual characterization can be distributed to one or more ANSs, monitoring systems, etc.
  • manual curation of the virtual characterization is implemented.
  • Such implementing can include flagging the virtual characterization for manual curation.
  • flagging can include generating a warning message to one or more human operators supported by one or more computer systems, where the warning message instructs the one or more human operators to provide one or more manual input commands to modify one or more elements of the virtual characterization to generate a modified virtual characterization.
  • the message can include an instruction to modify one or more particular elements of the virtual characterization. For example, where the confidence indicator being below a threshold is determined to be due to one or more particular elements of the virtual characterization, including one or more particular static feature characterizations included therein, the message can include an instruction to modify at least the one or more particular static feature characterizations.
  • Manual curation can include implementing specific modifications to various characterizations included in the virtual characterization, based on manual inputs received from an operator.
  • An improved value can include a confidence indicator which is superior to the associated confidence indicator of the unmodified virtual characterization, a confidence indicator which at least meets a threshold value associated with enabling autonomous navigation, some combination thereof, or the like. If so, as shown at 1222, the modified virtual characterization is stored in a database. The modified virtual characterization can be distributed to one or more ANSs, monitoring systems, etc.
  • the virtual characterization is flagged for additional data curation.
  • flagging can include generating a message to one or more sensor suites, human operators of one or more sensor suites, one or more vehicles which include said one or more sensor suites, etc. to deploy the one or more sensor suites to the one or more roadway portions characterized in the virtual characterization to generate additional sets of input data associated with the one or more roadway portions.
  • one or more sets of additional input data are received, and the virtual characterization is processed to modify one or more portions of the virtual characterization based on the additional input data. Such modification results in a generation of a modified virtual characterization, which can include an updated associated confidence indicator.
  • An improved value can include a confidence indicator which is superior to the associated confidence indicator of the unmodified virtual characterization, a confidence indicator which at least meets a threshold value associated with enabling autonomous navigation, some combination thereof, or the like. If so, as shown at 1222, the modified virtual characterization is stored in a database. The modified virtual characterization can be distributed to one or more ANSs, monitoring systems, etc.
  • FIG. 13 illustrates an example computer system 1300 that may be configured to include or execute any or all of the embodiments described above.
  • computer system 1300 may be any of various types of devices, including, but not limited to, a personal computer system, desktop computer, laptop, notebook, tablet, slate, pad, or netbook computer, cell phone, smartphone, PDA, portable media device, mainframe computer system, handheld computer, workstation, network computer, a camera or video camera, a set top box, a mobile device, a consumer device, video game console, handheld video game device, application server, storage device, a television, a video recording device, a peripheral device such as a switch, modem, router, or in general any type of computing or electronic device.
  • Various embodiments of an autonomous navigation system may be executed in one or more computer systems 1300, which may interact with various other devices.
  • computer system 1300 includes one or more processors 1310 coupled to a system memory 1320 via an input/output (I/O) interface 1330.
  • Computer system 1300 further includes a network interface 1340 coupled to I/O interface 1330, and one or more input/output devices, which can include one or more user interface devices.
  • embodiments may be implemented using a single instance of computer system 1300, while in other embodiments multiple such systems, or multiple nodes making up computer system 1300, may be configured to host different portions or instances of embodiments.
  • some elements may be implemented via one or more nodes of computer system 1300 that are distinct from those nodes implementing other elements.
  • computer system 1300 may be a uniprocessor system including one processor 1310, or a multiprocessor system including several processors 1310 (e.g., two, four, eight, or another suitable number).
  • Processors 1310 may be any suitable processor capable of executing instructions.
  • processors 1310 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA.
  • each of processors 1310 may commonly, but not necessarily, implement the same ISA.
  • System memory 1320 may be configured to store program instructions 1325, data 1326, etc. accessible by processor 1310.
  • system memory 1320 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory.
  • program instructions included in memory 1320 may be configured to implement some or all of an autonomous navigation system incorporating any of the functionality described above.
  • existing data 1326 of memory 1320 may include any of the information or data structures described above.
  • program instructions and/or data may be received, sent or stored upon different types of computer-accessible media or on similar media separate from system memory 1320 or computer system 1300. While computer system 1300 is described as implementing the functionality of functional blocks of previous Figures, any of the functionality described herein may be implemented via such a computer system.
  • I/O interface 1330 may be configured to coordinate I/O traffic between processor 1310, system memory 1320, and any peripheral devices in the device, including network interface 1340 or other peripheral interfaces, such as input/output devices 1350.
  • I/O interface 1330 may perform any necessary protocol, timing or other data transformations to convert data signals from one component (e.g., system memory 1320) into a format suitable for use by another component (e.g., processor 1310).
  • I/O interface 1330 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard, for example.
  • I/O interface 1330 may be split into two or more separate components, such as a north bridge and a south bridge, for example. Also, in some embodiments some or all of the functionality of I/O interface 1330, such as an interface to system memory 1320, may be incorporated directly into processor 1310.
  • Network interface 1340 may be configured to allow data to be exchanged between computer system 1300 and other devices 1360 attached to a network 1350 (e.g., carrier or agent devices) or between nodes of computer system 1300.
  • Network 1350 may in various embodiments include one or more networks including but not limited to Local Area Networks (LANs) (e.g., an Ethernet or corporate network), Wide Area Networks (WANs) (e.g., the Internet), wireless data networks, some other electronic data network, or some combination thereof.
  • network interface 1340 may support communication via wired or wireless general data networks, such as any suitable type of Ethernet network, for example; via telecommunications/telephony networks such as analog voice networks or digital fiber communications networks; via storage area networks such as Fibre Channel SANs, or via any other suitable type of network and/or protocol.
  • Input/output devices may, in some embodiments, include one or more display terminals, keyboards, keypads, touchpads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or accessing data by one or more computer systems 1300. Multiple input/output devices may be present in computer system 1300 or may be distributed on various nodes of computer system 1300. In some embodiments, similar input/output devices may be separate from computer system 1300 and may interact with one or more nodes of computer system 1300 through a wired or wireless connection, such as over network interface 1340.
  • memory 1320 may include program instructions 1325, which may be processor-executable to implement any element or action described above.
  • the program instructions may implement the methods described above.
  • different elements and data may be included. Note that data may include any data or information described above.
  • computer system 1300 is merely illustrative and is not intended to limit the scope of embodiments.
  • the computer system and devices may include any combination of hardware or software that can perform the indicated functions, including computers, network devices, Internet appliances, PDAs, wireless phones, pagers, etc.
  • Computer system 1300 may also be connected to other devices that are not illustrated, or instead may operate as a stand-alone system.
  • the functionality provided by the illustrated components may in some embodiments be combined in fewer components or distributed in additional components.
  • the functionality of some of the illustrated components may not be provided and/or other additional functionality may be available.
  • instructions stored on a computer-accessible medium separate from computer system 1300 may be transmitted to computer system 1300 via transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.
  • Various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-accessible medium.
  • a computer-accessible medium may include a non-transitory, computer-readable storage medium or memory medium such as magnetic or optical media, e.g., disk or DVD/CD-ROM, volatile or non-volatile media such as RAM (e.g. SDRAM, DDR, RDRAM, SRAM, etc.), ROM, etc.
  • a computer-accessible medium may include transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as network and/or a wireless link.
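The description above leaves the wire format between computer system 1300's network interface and remote devices unspecified. As one plausible sketch, a route characterization could be serialized with length-prefixed JSON framing for such an exchange; the payload fields and the framing scheme here are illustrative assumptions, not the application's actual protocol.

```python
import json

# Hypothetical payload: a route characterization as it might be exchanged
# between the vehicle's computer system and a remote system. All field
# names are assumptions for illustration.
characterization = {
    "route_id": "route-42",
    "manual_drives": 7,
    "confidence": 0.93,
    "driving_rules": [{"segment": 0, "max_speed_kph": 50}],
}

def serialize(char: dict) -> bytes:
    # Encode as length-prefixed JSON so a stream receiver can frame the message.
    body = json.dumps(char).encode("utf-8")
    return len(body).to_bytes(4, "big") + body

def deserialize(packet: bytes) -> dict:
    # Read the 4-byte big-endian length, then decode exactly that many bytes.
    length = int.from_bytes(packet[:4], "big")
    return json.loads(packet[4:4 + length].decode("utf-8"))

packet = serialize(characterization)
assert deserialize(packet) == characterization  # round-trip is lossless
```

Any self-delimiting encoding would do equally well; the length prefix simply avoids ambiguity when several characterizations are sent over one connection.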

Abstract

Some embodiments of the present invention relate to an autonomous navigation system which enables autonomous navigation of a vehicle along one or more portions of a driving route, based on monitoring, at the vehicle, various features of the route as the vehicle is manually driven along it, in order to develop a characterization of the route. The characterization is progressively updated with repeated manual navigations of the route, and autonomous navigation of the route is enabled when a confidence indicator of the characterization meets a threshold indication. Characterizations may be updated in response to the vehicle encountering changes in the route and may include a set of driving rules associated with the route, the driving rules being developed based on monitoring the navigation of the route by one or more vehicles. Characterizations may be transferred to a remote system which processes the data to develop and refine route characterizations and provides characterizations to one or more vehicles.
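The loop the abstract describes — accumulate feature observations during manual drives, progressively update a route characterization, and permit autonomous navigation once a confidence indicator meets a threshold — can be sketched as follows. This is a minimal illustration only: the application does not specify a concrete confidence model, so the consistency-times-coverage score and all class and field names below are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class RouteCharacterization:
    """Accumulates feature snapshots of a driving route from manual drives."""
    threshold: float = 0.9          # assumed confidence threshold
    observations: list = field(default_factory=list)

    def record_manual_drive(self, features: dict) -> None:
        # Each manual navigation of the route contributes one feature snapshot.
        self.observations.append(features)

    @property
    def confidence(self) -> float:
        # Assumed model: confidence grows with repeated, mutually consistent drives.
        if len(self.observations) < 2:
            return 0.0
        baseline = self.observations[0]
        agree = sum(1 for obs in self.observations[1:] if obs == baseline)
        consistency = agree / (len(self.observations) - 1)
        coverage = min(1.0, len(self.observations) / 5)  # saturates after 5 drives
        return consistency * coverage

    def autonomy_enabled(self) -> bool:
        # Autonomous navigation is permitted once confidence meets the threshold.
        return self.confidence >= self.threshold

route = RouteCharacterization()
for _ in range(5):
    route.record_manual_drive({"lanes": 2, "speed_limit": 50})
print(route.autonomy_enabled())  # True: five consistent manual drives meet the 0.9 threshold
```

A route change (a snapshot that disagrees with the accumulated baseline) lowers consistency and can drop the route back below the threshold until further manual drives rebuild confidence, mirroring the update-on-change behavior the abstract describes.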
EP15813220.9A 2014-12-05 2015-12-04 Autonomous navigation system Pending EP3256815A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462088428P 2014-12-05 2014-12-05
PCT/US2015/064059 WO2016090282A1 (fr) 2014-12-05 2015-12-04 Autonomous navigation system

Publications (1)

Publication Number Publication Date
EP3256815A1 true EP3256815A1 (fr) 2017-12-20

Family

ID=54884434

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15813220.9A Pending EP3256815A1 (fr) 2014-12-05 2015-12-04 Autonomous navigation system

Country Status (4)

Country Link
US (3) US10451425B2 (fr)
EP (1) EP3256815A1 (fr)
CN (2) CN107624155B (fr)
WO (1) WO2016090282A1 (fr)

Families Citing this family (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9582006B2 (en) 2011-07-06 2017-02-28 Peloton Technology, Inc. Systems and methods for semi-autonomous convoying of vehicles
US20170242443A1 (en) 2015-11-02 2017-08-24 Peloton Technology, Inc. Gap measurement for vehicle convoying
WO2014145918A1 (fr) 2013-03-15 2014-09-18 Peloton Technology, Inc. Systems and methods for vehicle platooning
US20160341555A1 (en) * 2015-05-20 2016-11-24 Delphi Technologies, Inc. System for auto-updating route-data used by a plurality of automated vehicles
US20180211546A1 (en) 2015-08-26 2018-07-26 Peloton Technology, Inc. Devices, systems, and methods for authorization of vehicle platooning
JP6524892B2 (ja) * 2015-11-13 2019-06-05 株式会社デンソー Vehicle travel path information generation system
JP6468173B2 (ja) * 2015-12-01 2019-02-13 株式会社デンソー Driving assistance device
US10239529B2 (en) * 2016-03-01 2019-03-26 Ford Global Technologies, Llc Autonomous vehicle operation based on interactive model predictive control
JP6497353B2 (ja) * 2016-04-28 2019-04-10 トヨタ自動車株式会社 Automated driving control device
US20170329331A1 (en) * 2016-05-16 2017-11-16 Magna Electronics Inc. Control system for semi-autonomous control of vehicle along learned route
DE102016212009A1 (de) * 2016-07-01 2018-01-04 Ford Global Technologies, Llc Method for operating a self-driving motor vehicle and autonomous driving unit for a self-driving motor vehicle
US10838426B2 (en) * 2016-07-21 2020-11-17 Mobileye Vision Technologies Ltd. Distributing a crowdsourced sparse map for autonomous vehicle navigation
KR102026058B1 (ko) 2016-08-08 2019-11-05 닛산 지도우샤 가부시키가이샤 Control method and control device for an autonomous driving vehicle
EP3291197A1 * 2016-08-30 2018-03-07 Volvo Car Corporation Method and system for determining the drivability of a road section for an autonomous vehicle
FR3057227B1 (fr) * 2016-10-07 2019-10-11 Peugeot Citroen Automobiles Sa Method and device for determining information relating to the availability of traffic-lane portions for autonomous vehicle driving
US10144428B2 (en) * 2016-11-10 2018-12-04 Ford Global Technologies, Llc Traffic light operation
JP6780461B2 (ja) * 2016-11-14 2020-11-04 いすゞ自動車株式会社 Driving assistance system and driving assistance method
CN106767866B (zh) * 2016-12-02 2021-02-12 百度在线网络技术(北京)有限公司 Method and device for local path planning
US10679312B2 (en) * 2017-04-25 2020-06-09 Lyft Inc. Dynamic autonomous vehicle servicing and management
CN110914641B (zh) * 2017-06-14 2024-01-30 御眼视觉技术有限公司 Fusion framework and batch alignment of navigation information for autonomous navigation
US10816354B2 (en) * 2017-08-22 2020-10-27 Tusimple, Inc. Verification module system and method for motion-based lane detection with multiple sensors
US10565457B2 (en) 2017-08-23 2020-02-18 Tusimple, Inc. Feature matching and correspondence refinement and 3D submap position refinement system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
US10762673B2 (en) 2017-08-23 2020-09-01 Tusimple, Inc. 3D submap reconstruction system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
US10953880B2 (en) 2017-09-07 2021-03-23 Tusimple, Inc. System and method for automated lane change control for autonomous vehicles
US10953881B2 (en) 2017-09-07 2021-03-23 Tusimple, Inc. System and method for automated lane change control for autonomous vehicles
US10649458B2 (en) 2017-09-07 2020-05-12 Tusimple, Inc. Data-driven prediction-based system and method for trajectory planning of autonomous vehicles
US11089288B2 (en) 2017-09-11 2021-08-10 Tusimple, Inc. Corner point extraction system and method for image guided stereo camera optical axes alignment
US11158088B2 (en) 2017-09-11 2021-10-26 Tusimple, Inc. Vanishing point computation and online alignment system and method for image guided stereo camera optical axes alignment
WO2019098002A1 (fr) * 2017-11-20 2019-05-23 ソニー株式会社 Information processing device, information processing method, program, and mobile body
CN108121347B (zh) * 2017-12-29 2020-04-07 北京三快在线科技有限公司 Method, apparatus, and electronic device for controlling the movement of a device
CN108229386B (zh) * 2017-12-29 2021-12-14 百度在线网络技术(北京)有限公司 Method, apparatus, and medium for detecting lane lines
US10274950B1 (en) 2018-01-06 2019-04-30 Drivent Technologies Inc. Self-driving vehicle systems and methods
US11073838B2 (en) 2018-01-06 2021-07-27 Drivent Llc Self-driving vehicle systems and methods
US10299216B1 (en) 2018-01-06 2019-05-21 Eric John Wengreen Self-driving vehicle actions in response to a low battery
US10303181B1 (en) 2018-11-29 2019-05-28 Eric John Wengreen Self-driving vehicle systems and methods
CN112004729B (zh) 2018-01-09 2023-12-01 图森有限公司 Real-time remote control of vehicles with high redundancy
US11305782B2 (en) 2018-01-11 2022-04-19 Tusimple, Inc. Monitoring system for autonomous vehicle operation
US11262756B2 (en) * 2018-01-15 2022-03-01 Uatc, Llc Discrete decision architecture for motion planning system of an autonomous vehicle
US10915101B2 (en) * 2018-02-02 2021-02-09 Uatc, Llc Context-dependent alertness monitor in an autonomous vehicle
US11009365B2 (en) 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization
US11009356B2 (en) 2018-02-14 2021-05-18 Tusimple, Inc. Lane marking localization and fusion
US10685244B2 (en) 2018-02-27 2020-06-16 Tusimple, Inc. System and method for online real-time multi-object tracking
US10521913B2 (en) * 2018-03-29 2019-12-31 Aurora Innovation, Inc. Relative atlas for autonomous vehicle and generation thereof
US11256729B2 (en) 2018-03-29 2022-02-22 Aurora Operations, Inc. Autonomous vehicle relative atlas incorporating hypergraph data structure
US10503760B2 (en) 2018-03-29 2019-12-10 Aurora Innovation, Inc. Use of relative atlas in an autonomous vehicle
CN110378185A (zh) 2018-04-12 2019-10-25 北京图森未来科技有限公司 Image processing method and device applied to autonomous driving vehicles
CN116129376A (zh) 2018-05-02 2023-05-16 北京图森未来科技有限公司 Road edge detection method and device
US10926759B2 (en) * 2018-06-07 2021-02-23 GM Global Technology Operations LLC Controlling a vehicle based on trailer position
US10899364B2 (en) 2018-07-02 2021-01-26 International Business Machines Corporation Autonomous vehicle system
US11567632B2 (en) 2018-07-03 2023-01-31 Apple Inc. Systems and methods for exploring a geographic region
US10466057B1 (en) 2018-07-30 2019-11-05 Wesley Edward Schwie Self-driving vehicle systems and methods
CN110873568B (zh) * 2018-08-30 2021-02-23 百度在线网络技术(北京)有限公司 Method, device, and computer equipment for generating a high-precision map
US11292480B2 (en) 2018-09-13 2022-04-05 Tusimple, Inc. Remote safe driving methods and systems
US10471804B1 (en) 2018-09-18 2019-11-12 Drivent Llc Self-driving vehicle systems and methods
US10493952B1 (en) 2019-03-21 2019-12-03 Drivent Llc Self-driving vehicle systems and methods
US10289922B1 (en) 2018-09-18 2019-05-14 Eric John Wengreen System for managing lost, mislaid, or abandoned property in a self-driving vehicle
US10479319B1 (en) 2019-03-21 2019-11-19 Drivent Llc Self-driving vehicle systems and methods
US10282625B1 (en) 2018-10-01 2019-05-07 Eric John Wengreen Self-driving vehicle systems and methods
US10223844B1 (en) 2018-09-18 2019-03-05 Wesley Edward Schwie Self-driving vehicle systems and methods
US11644833B2 (en) 2018-10-01 2023-05-09 Drivent Llc Self-driving vehicle systems and methods
US10794714B2 (en) 2018-10-01 2020-10-06 Drivent Llc Self-driving vehicle systems and methods
US10900792B2 (en) 2018-10-22 2021-01-26 Drivent Llc Self-driving vehicle systems and methods
US10832569B2 (en) 2019-04-02 2020-11-10 Drivent Llc Vehicle detection systems
US11221622B2 (en) 2019-03-21 2022-01-11 Drivent Llc Self-driving vehicle systems and methods
US10240938B1 (en) 2018-10-22 2019-03-26 Drivent Technologies Inc. Self-driving vehicle systems and methods
US10762791B2 (en) * 2018-10-29 2020-09-01 Peloton Technology, Inc. Systems and methods for managing communications between vehicles
US10942271B2 (en) 2018-10-30 2021-03-09 Tusimple, Inc. Determining an angle between a tow vehicle and a trailer
US10286908B1 (en) 2018-11-01 2019-05-14 Eric John Wengreen Self-driving vehicle systems and methods
US10474154B1 (en) 2018-11-01 2019-11-12 Drivent Llc Self-driving vehicle systems and methods
DE102018219809A1 (de) 2018-11-19 2020-05-20 Volkswagen Aktiengesellschaft Method for avoiding locally conditioned impending hazards, vehicle for carrying out the method, and computer program
US11131554B2 (en) 2018-12-26 2021-09-28 Beijing Voyager Technology Co., Ltd. Systems and methods for vehicle telemetry
WO2020139324A1 (fr) * 2018-12-26 2020-07-02 Didi Research America, Llc Systems and methods for safe route planning for a vehicle
US11287270B2 (en) 2018-12-26 2022-03-29 Beijing Voyager Technology Co., Ltd. Systems and methods for safe route planning for a vehicle
CN109739230B (zh) * 2018-12-29 2022-02-01 百度在线网络技术(北京)有限公司 Driving trajectory generation method, device, and storage medium
CN111383473B (zh) * 2018-12-29 2022-02-08 安波福电子(苏州)有限公司 Adaptive cruise system based on traffic-sign speed-limit indications
CN109612472B (zh) * 2019-01-11 2020-08-25 中国人民解放军国防科技大学 Method and device for constructing an autonomous navigation system for a deep-space probe
US10377342B1 (en) 2019-02-04 2019-08-13 Drivent Technologies Inc. Self-driving vehicle systems and methods
US10744976B1 (en) 2019-02-04 2020-08-18 Drivent Llc Self-driving vehicle systems and methods
US11313696B2 (en) * 2019-03-01 2022-04-26 GM Global Technology Operations LLC Method and apparatus for a context-aware crowd-sourced sparse high definition map
US20200310420A1 (en) * 2019-03-26 2020-10-01 GM Global Technology Operations LLC System and method to train and select a best solution in a dynamical system
US11427196B2 (en) 2019-04-15 2022-08-30 Peloton Technology, Inc. Systems and methods for managing tractor-trailers
US11823460B2 (en) 2019-06-14 2023-11-21 Tusimple, Inc. Image fusion for autonomous vehicle operation
CN114008682A (zh) * 2019-06-28 2022-02-01 宝马股份公司 Method and system for identifying objects
CN114174137A (zh) * 2019-07-01 2022-03-11 哲内提 Source lateral offset for ADAS or AD features
JP7192709B2 (ja) * 2019-08-09 2022-12-20 トヨタ自動車株式会社 Vehicle remote instruction training device
CN112498350A (zh) * 2019-09-13 2021-03-16 图森有限公司 Supplemental brake control system in an autonomous vehicle
US20210149407A1 (en) * 2019-11-15 2021-05-20 International Business Machines Corporation Autonomous vehicle accident condition monitor
CN111351496B (zh) * 2020-02-27 2023-07-14 歌尔股份有限公司 Virtual map modeling method, apparatus, device, and storage medium
US11718304B2 (en) 2020-03-06 2023-08-08 Deere & Company Method and system for estimating surface roughness of ground for an off-road vehicle to control an implement
US11684005B2 (en) * 2020-03-06 2023-06-27 Deere & Company Method and system for estimating surface roughness of ground for an off-road vehicle to control an implement
US11678599B2 (en) 2020-03-12 2023-06-20 Deere & Company Method and system for estimating surface roughness of ground for an off-road vehicle to control steering
US11667171B2 (en) 2020-03-12 2023-06-06 Deere & Company Method and system for estimating surface roughness of ground for an off-road vehicle to control steering
US11753016B2 (en) * 2020-03-13 2023-09-12 Deere & Company Method and system for estimating surface roughness of ground for an off-road vehicle to control ground speed
US11685381B2 (en) 2020-03-13 2023-06-27 Deere & Company Method and system for estimating surface roughness of ground for an off-road vehicle to control ground speed
EP3893150A1 (fr) 2020-04-09 2021-10-13 Tusimple, Inc. Camera pose estimation techniques
US11796334B2 (en) 2020-05-15 2023-10-24 Apple Inc. User interfaces for providing navigation directions
US11788851B2 (en) 2020-06-11 2023-10-17 Apple Inc. User interfaces for customized navigation routes
AU2021203567A1 (en) 2020-06-18 2022-01-20 Tusimple, Inc. Angle and orientation measurements for vehicles with multiple drivable sections
US20220390248A1 (en) 2021-06-07 2022-12-08 Apple Inc. User interfaces for maps and navigation
CN113850297B (zh) * 2021-08-31 2023-10-27 北京百度网讯科技有限公司 Road data monitoring method, apparatus, electronic device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1045224A2 (fr) * 1999-04-15 2000-10-18 DaimlerChrysler AG Method for updating a road map and method for creating vehicle guidance information based on the map
US20070198177A1 (en) * 2006-02-20 2007-08-23 Denso Corporation Map evaluation system and map evaluation method
WO2009145695A1 (fr) * 2008-05-30 2009-12-03 Atlas Copco Rock Drills Ab Method and arrangement for calculating a conformity between a representation of an environment and said environment
US8527199B1 (en) * 2012-05-17 2013-09-03 Google Inc. Automatic collection of quality control statistics for maps used in autonomous driving
US20140156182A1 (en) * 2012-11-30 2014-06-05 Philip Nemec Determining and displaying auto drive lanes in an autonomous vehicle
WO2014139821A1 (fr) * 2013-03-15 2014-09-18 Volkswagen Aktiengesellschaft Route planning application for automated driving

Family Cites Families (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2524602Y2 (ja) * 1991-03-07 1997-02-05 日産ディーゼル工業株式会社 Vehicle navigation device
EP1205881A3 (fr) 1998-02-17 2004-10-27 Sun Microsystems, Inc. Système graphique avec super-échantillonnage à résolution variable
US10684350B2 (en) * 2000-06-02 2020-06-16 Tracbeam Llc Services and applications for a communications network
US20030067476A1 (en) 2001-10-04 2003-04-10 Eastman Kodak Company Method and system for displaying an image
US7269504B2 (en) * 2004-05-12 2007-09-11 Motorola, Inc. System and method for assigning a level of urgency to navigation cues
US8050863B2 (en) * 2006-03-16 2011-11-01 Gray & Company, Inc. Navigation and control system for autonomous vehicles
US9373149B2 (en) * 2006-03-17 2016-06-21 Fatdoor, Inc. Autonomous neighborhood vehicle commerce network and community
US9070101B2 (en) * 2007-01-12 2015-06-30 Fatdoor, Inc. Peer-to-peer neighborhood delivery multi-copter and method
US8108092B2 (en) * 2006-07-14 2012-01-31 Irobot Corporation Autonomous behaviors for a remote vehicle
US7587260B2 (en) * 2006-07-05 2009-09-08 Battelle Energy Alliance, Llc Autonomous navigation system and method
US10096038B2 (en) * 2007-05-10 2018-10-09 Allstate Insurance Company Road segment safety rating system
US8126642B2 (en) * 2008-10-24 2012-02-28 Gray & Company, Inc. Control and systems for autonomously driven vehicles
US8775063B2 (en) * 2009-01-26 2014-07-08 GM Global Technology Operations LLC System and method of lane path estimation using sensor fusion
US8352112B2 (en) * 2009-04-06 2013-01-08 GM Global Technology Operations LLC Autonomous vehicle management
US8260550B2 (en) * 2009-06-19 2012-09-04 GM Global Technology Operations LLC Presentation of navigation instructions using variable levels of detail
US10198942B2 (en) * 2009-08-11 2019-02-05 Connected Signals, Inc. Traffic routing display system with multiple signal lookahead
WO2011032208A1 (fr) * 2009-09-15 2011-03-24 The University Of Sydney System and method for the autonomous navigation of a tracked or skid-steered vehicle
CN103026174A (zh) * 2010-06-17 2013-04-03 通腾科技股份有限公司 Navigation device and method
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
EP3770558A1 (fr) * 2011-06-03 2021-01-27 Apple Inc. Devices and methods for comparing and selecting alternative navigation routes
US8184069B1 (en) 2011-06-20 2012-05-22 Google Inc. Systems and methods for adaptive transmission of data
US8583361B2 (en) * 2011-08-24 2013-11-12 Modular Mining Systems, Inc. Guided maneuvering of a mining vehicle to a target destination
FR2980295B1 (fr) * 2011-09-15 2014-08-08 Michelin Soc Tech Navigation method and system with a centralized server
US20130079964A1 (en) * 2011-09-27 2013-03-28 Saturna Green Systems Inc. Vehicle communication, analysis and operation system
US20130132434A1 (en) * 2011-11-22 2013-05-23 Inrix, Inc. User-assisted identification of location conditions
US8855847B2 (en) * 2012-01-20 2014-10-07 Toyota Motor Engineering & Manufacturing North America, Inc. Intelligent navigation system
KR101703144B1 (ko) 2012-02-09 2017-02-06 한국전자통신연구원 Autonomous driving apparatus for a vehicle and method thereof
US9429943B2 (en) 2012-03-05 2016-08-30 Florida A&M University Artificial intelligence valet systems and methods
US20140310075A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc Automatic Payment of Fees Based on Vehicle Location and User Detection
US9378601B2 (en) * 2012-03-14 2016-06-28 Autoconnect Holdings Llc Providing home automation information via communication with a vehicle
US9082239B2 (en) * 2012-03-14 2015-07-14 Flextronics Ap, Llc Intelligent vehicle for assisting vehicle occupants
US8457827B1 (en) * 2012-03-15 2013-06-04 Google Inc. Modifying behavior of autonomous vehicle based on predicted behavior of other vehicles
US8718861B1 (en) * 2012-04-11 2014-05-06 Google Inc. Determining when to drive autonomously
CN102706354B (zh) * 2012-04-25 2016-06-08 深圳市华盈泰智能技术有限公司 Method and system for fully automatic incremental upgrading of an intelligent map based on the Internet of Vehicles
US8521352B1 (en) * 2012-05-07 2013-08-27 Google Inc. Controlling a vehicle having inadequate map data
US9638537B2 (en) * 2012-06-21 2017-05-02 Cellepathy Inc. Interface selection in navigation guidance systems
GB201211614D0 (en) * 2012-06-29 2012-08-15 Tomtom Dev Germany Gmbh Generating alternative routes
CN102829790B (zh) * 2012-07-17 2015-09-23 广东翼卡车联网服务有限公司 Method and system for collecting driving routes and updating an existing navigation map
GB2506645A (en) * 2012-10-05 2014-04-09 Ibm Intelligent route navigation
WO2014080802A1 (fr) * 2012-11-26 2014-05-30 日産自動車株式会社 Control device for a hybrid vehicle
US8825258B2 (en) * 2012-11-30 2014-09-02 Google Inc. Engaging and disengaging for autonomous driving
US10514541B2 (en) 2012-12-27 2019-12-24 Microsoft Technology Licensing, Llc Display update time reduction for a near-eye display
US9361409B2 (en) 2013-01-10 2016-06-07 International Business Machines Corporation Automatic driver modeling for integration of human-controlled vehicles into an autonomous vehicle network
US20140236483A1 (en) * 2013-02-19 2014-08-21 Navteq B.V. Method and apparatus for determining travel path geometry based on mapping information
US9423261B2 (en) * 2013-02-19 2016-08-23 Here Global B.V. Path curve confidence factors
KR101736306B1 (ko) 2013-02-27 2017-05-29 한국전자통신연구원 Apparatus and method for cooperative autonomous driving between vehicle and driver
US9727991B2 (en) 2013-03-01 2017-08-08 Microsoft Technology Licensing, Llc Foveated image rendering
GB2511750B (en) * 2013-03-11 2015-07-29 Jaguar Land Rover Ltd A driving assistance system, vehicle and method
US8972175B2 (en) * 2013-03-14 2015-03-03 Qualcomm Incorporated Navigation using crowdsourcing data
US8849494B1 (en) * 2013-03-15 2014-09-30 Google Inc. Data selection by an autonomous vehicle for trajectory modification
JP2016513805A (ja) * 2013-03-15 2016-05-16 キャリパー コーポレイション Lane-level vehicle navigation for vehicle routing and traffic management
US9121719B2 (en) * 2013-03-15 2015-09-01 Abalta Technologies, Inc. Vehicle range projection
SE540269C2 (sv) * 2013-03-19 2018-05-22 Scania Cv Ab Device and method for controlling an autonomous vehicle
DE102013205392A1 (de) * 2013-03-27 2014-10-02 Bayerische Motoren Werke Aktiengesellschaft Backend for driver assistance systems
US9141107B2 (en) * 2013-04-10 2015-09-22 Google Inc. Mapping active and inactive construction zones for autonomous driving
US11372936B2 (en) * 2013-04-15 2022-06-28 Autoconnect Holdings Llc System and method for adapting a control function based on a user profile
FR3006088B1 (fr) * 2013-05-27 2015-11-27 Renault Sas Device for estimating the duration of autonomous-mode operation of a motor vehicle and associated method
US20160267884A1 (en) 2015-03-12 2016-09-15 Oculus Vr, Llc Non-uniform rescaling of input data for displaying on display device
CN111367287A (zh) * 2015-05-13 2020-07-03 Uatc有限责任公司 Autonomous driving vehicle operated with guidance assistance
US10853139B2 (en) * 2018-10-19 2020-12-01 EMC IP Holding Company LLC Dynamic workload management based on predictive modeling and recommendation engine for storage systems
US10401852B2 (en) * 2015-11-04 2019-09-03 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
US9630619B1 (en) * 2015-11-04 2017-04-25 Zoox, Inc. Robotic vehicle active safety systems and methods
US10248119B2 (en) * 2015-11-04 2019-04-02 Zoox, Inc. Interactive autonomous vehicle command controller
US9507346B1 (en) * 2015-11-04 2016-11-29 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
US11010956B2 (en) 2015-12-09 2021-05-18 Imagination Technologies Limited Foveated rendering
US20170343360A1 (en) * 2016-05-31 2017-11-30 Hima Harikrishnan Method and system for managing navigation and tracking of, for and by portable and wearable computing and communications devices
US10838426B2 (en) * 2016-07-21 2020-11-17 Mobileye Vision Technologies Ltd. Distributing a crowdsourced sparse map for autonomous vehicle navigation
US20190251509A1 (en) * 2016-09-15 2019-08-15 Erik M. Simpson Price based navigation
US10471829B2 (en) * 2017-01-16 2019-11-12 Nio Usa, Inc. Self-destruct zone and autonomous vehicle navigation
US10752239B2 (en) * 2017-02-22 2020-08-25 International Business Machines Corporation Training a self-driving vehicle
US10698421B1 (en) * 2017-09-25 2020-06-30 State Farm Mutual Automobile Insurance Company Dynamic autonomous vehicle train
US11199413B2 (en) * 2018-07-19 2021-12-14 Qualcomm Incorporated Navigation techniques for autonomous and semi-autonomous vehicles
US20200026279A1 (en) * 2018-07-20 2020-01-23 Ford Global Technologies, Llc Smart neighborhood routing for autonomous vehicles
DK201870686A1 (en) * 2018-08-02 2020-02-20 Aptiv Technologies Limited MANAGEMENT OF MULTIPLE AUTONOMOUS VEHICLES
JP6964062B2 (ja) * 2018-11-26 2021-11-10 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
KR102209421B1 (ko) * 2019-05-22 2021-02-01 엘지전자 주식회사 Autonomous vehicle and driving control system and method using the same
US20200378771A1 (en) * 2019-05-29 2020-12-03 Here Global B.V. Method and apparatus for providing drop-off locations for passengers of a vehicle to reach different destinations via a multimodal route
US11294387B2 (en) * 2019-06-17 2022-04-05 Toyota Research Institute, Inc. Systems and methods for training a vehicle to autonomously drive a route
US11248914B2 (en) * 2019-06-20 2022-02-15 Lyft, Inc. Systems and methods for progressive semantic mapping
US11733050B2 (en) * 2019-06-21 2023-08-22 Here Global B.V. Method and apparatus for providing an isoline map of a time to park at a destination
US11548518B2 (en) * 2019-06-28 2023-01-10 Woven Planet North America, Inc. Subjective route comfort modeling and prediction
KR102195935B1 (ko) * 2019-08-13 2020-12-30 엘지전자 주식회사 Method and system for determining the driving mode and route of an autonomous vehicle
US11820255B2 (en) * 2020-02-03 2023-11-21 Nio Technology (Anhui) Co., Ltd. Predictive regenerative braking
US20210247762A1 (en) * 2020-02-12 2021-08-12 Qualcomm Incorporated. Allocating Vehicle Computing Resources to One or More Applications
WO2021168058A1 (fr) * 2020-02-19 2021-08-26 Nvidia Corporation Behavior planning for autonomous vehicles

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1045224A2 (fr) * 1999-04-15 2000-10-18 DaimlerChrysler AG Method for updating a road map and method for creating vehicle guidance information based on the map
US20070198177A1 (en) * 2006-02-20 2007-08-23 Denso Corporation Map evaluation system and map evaluation method
WO2009145695A1 (fr) * 2008-05-30 2009-12-03 Atlas Copco Rock Drills Ab Method and arrangement for calculating a conformity between a representation of an environment and said environment
US8527199B1 (en) * 2012-05-17 2013-09-03 Google Inc. Automatic collection of quality control statistics for maps used in autonomous driving
US20140156182A1 (en) * 2012-11-30 2014-06-05 Philip Nemec Determining and displaying auto drive lanes in an autonomous vehicle
WO2014139821A1 (fr) * 2013-03-15 2014-09-18 Volkswagen Aktiengesellschaft Route planning application for automated driving

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2016090282A1 *

Also Published As

Publication number Publication date
US10451425B2 (en) 2019-10-22
US20220373338A1 (en) 2022-11-24
US20170363430A1 (en) 2017-12-21
US20200049515A1 (en) 2020-02-13
CN113654561A (zh) 2021-11-16
US11402221B2 (en) 2022-08-02
WO2016090282A1 (fr) 2016-06-09
CN107624155B (zh) 2021-09-28
CN107624155A (zh) 2018-01-23

Similar Documents

Publication Publication Date Title
US11402221B2 (en) Autonomous navigation system
EP3299921B1 (fr) Location-specific assistance for an autonomous vehicle control system
JP6894471B2 (ja) Patrol of a patrol car by a subsystem of an autonomous driving vehicle (ADV)
US10259457B2 (en) Traffic light anticipation
US11340094B2 (en) Updating map data for autonomous driving vehicles based on sensor data
US11011064B2 (en) System and method for vehicle platooning
US20150153184A1 (en) System and method for dynamically focusing vehicle sensors
CA3085725A1 (fr) Use of scene difficulty prediction models in vehicle routing
GB2536771A (en) Autonomous driving refined in virtual environments
GB2536549A (en) Virtual autonomous response testbed
EP3814909A2 (fr) Using divergence to conduct log-based simulations
CN111845771A (zh) Data collection automation system
CN113924241B (zh) Tracking vanished objects for autonomous vehicles
US11403949B2 (en) System for predicting vehicle behavior
US7286930B2 (en) Ghost following
CN114084164A (zh) System and method for improving driver warnings during automated driving
US20210390225A1 (en) Realism in log-based simulations
US20240083458A1 (en) Using simulations to identify differences between behaviors of manually-driven and autonomous vehicles
JP2023133049A (ja) Perception-based parking assistance for autonomous machine systems and applications
US11708049B2 (en) Systems and methods for preventing an operation of a car application that reduces a quality of service of a computer system of a vehicle
US20230294728A1 (en) Road segment spatial embedding
US20230195830A1 (en) Calibration metrics for measuring trajectory prediction
US20230288223A1 (en) Keyframe-based compression for world model representation in autonomous systems and applications
CN109649385B (zh) Driving assistance device
JP2022024099A (ja) Information processing device, information processing method, and information processing program

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20171018

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: AL-DAHLE, AHMAD

Inventor name: LYON, BENJAMIN

Inventor name: SIEH, PHILIP J.

Inventor name: LAST, MATTHEW E.

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: APPLE INC.

17Q First examination report despatched

Effective date: 20180713

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS