CN112955362A - Assessing ride quality of autonomous vehicles - Google Patents
Info
- Publication number
- CN112955362A (application number CN201980073648.2A)
- Authority
- CN
- China
- Prior art keywords: ride, data, model, overall, vehicle
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B60W40/09—Driving style or behaviour (estimation of driving parameters related to drivers or passengers)
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
- B60W60/00253—Planning or execution of driving tasks specially adapted for specific operations; taxi operations
- B60W2540/215—Input parameters relating to occupants; selection or confirmation of options
- G05D1/0276—Control of position or course in two dimensions, specially adapted to land vehicles, using signals provided by a source external to the vehicle
- G06N20/00—Machine learning
- G06N3/044—Neural networks; recurrent networks, e.g. Hopfield networks
- G06N3/045—Neural networks; combinations of networks
- G06N3/08—Neural networks; learning methods
- G06Q30/0278—Marketing; product appraisal
- G08G1/202—Monitoring the location of vehicles belonging to a group; dispatching vehicles on the basis of a location, e.g. taxi dispatching
Abstract
The present disclosure relates to training models for identifying events that may cause discomfort to passengers of an autonomous vehicle and for evaluating the overall ride quality of an autonomous vehicle ride. For example, ride data for a first ride may be associated with a ride quality value indicating a level of discomfort and/or a first overall ride quality value that a passenger provided for the first ride. This ride data may be used to train a model such that, in response to receiving ride data for a second ride as input, the model outputs a list of events that may cause passenger discomfort during the second ride and/or a second overall ride quality value for the second ride.
Description
Cross Reference to Related Applications
This application claims the benefit of application serial No. 16/214,991, filed December 10, 2018, which claims the benefit of provisional patent application No. 62/747,815, filed October 19, 2018, the disclosures of which are incorporated herein by reference.
Background
Autonomous vehicles, such as vehicles that do not require a human driver, may be used to assist in transporting passengers or items from one location to another. Such vehicles may operate in a fully autonomous mode in which the user provides some initial input, such as a pickup or destination location, and the vehicle maneuvers itself to that location. When a person (or user) wants to be physically transported and/or to transport goods between two locations via a vehicle, they may use any number of taxi or delivery services. Heretofore, these services have typically involved a human driver who is given dispatch instructions to a location to pick up and drop off users and/or goods. In some cases, the passenger is given the opportunity to "rate" his or her overall experience with the ride service, for example, by providing the driver with a tip and/or a star or other rating.
Disclosure of Invention
One aspect of the present disclosure provides a method of training a model for identifying events that may cause discomfort to an occupant of an autonomous vehicle. The method includes receiving, by one or more server computing devices, first ride data identifying an output from a planner system and an output from a perception system. The first ride data is associated with a ride quality value indicative of a level of discomfort identified by an occupant of the autonomous vehicle during a first ride. The method also includes training, by the one or more server computing devices, a model using the first ride data and any associated ride quality values such that the model is configured to output a list of events occurring in a second ride that may cause discomfort to one or more passengers during the second ride in response to receiving second ride data for the second ride as input.
In one example, the method further includes inputting second ride data for the second ride into the model to determine the list of events, and flagging the second ride data for review based on the list of events. In one example, the second ride data corresponds to simulated ride data for a simulated ride generated using the autonomous vehicle control software. In another example, the ride quality value is also associated with a point in time during the first ride. In another example, the ride data also includes context from the map used to control the autonomous vehicle during the first ride. In this example, the ride data also includes information about the locations and types of road users other than the vehicle. In another example, the ride data also includes acceleration information for the first ride. In this example, the ride data also includes the derivative of the acceleration information (i.e., jerk) of the autonomous vehicle for the first ride.
Another aspect of the disclosure provides a method of training a model for assessing overall ride quality of an autonomous vehicle ride. The method includes receiving, by one or more server computing devices, first ride data identifying an output from a planner system and an output from a perception system. The first ride data is associated with a first overall ride quality value indicative of an overall ride quality provided by an occupant of the autonomous vehicle for the first ride. The method also includes training, by the one or more server computing devices, a model using the first ride data and the first overall ride quality value such that the model is configured to output a second overall ride quality value for the second ride in response to receiving as input second ride data for the second ride.
In one example, the second ride data corresponds to simulated ride data for a simulated ride generated using the autonomous vehicle control software. In another example, the first ride data further includes context from a map used to control the autonomous vehicle during the first ride. In another example, the ride data also includes information about the locations and types of road users other than the vehicle. In another example, the first ride data further includes acceleration information of the autonomous vehicle. In another example, the method further includes receiving an output from a second model that identifies a list of events that may cause discomfort to the passenger during the first ride, and the training is further based on the list of events. In another example, the method further includes converting a first ride quality value to the first overall ride quality value, where the first ride quality value corresponds to a star rating of the first ride provided by the passenger. In another example, the model is a convolutional neural network applied over time, enabling the model to recognize relationships between events, when the events occur during a ride, and how those relationships affect overall ride quality. In another example, the method further includes inputting the ride data for the second ride into the model to determine the second overall ride quality value, and flagging the second ride data for review based on the second overall ride quality value. In another example, the method further includes inputting ride data for a second ride into the model to determine a second overall ride quality value, and using the second overall ride quality value to evaluate ride quality over a plurality of miles driven by the autonomous vehicle. In another example, the method further includes inputting ride data for a second ride into the model to determine a second overall ride quality value, and using the second overall ride quality value to evaluate ride quality for a particular type of maneuver. In this example, the particular type of maneuver includes a left turn or a right turn. Alternatively, the particular type of maneuver includes a lane change.
Drawings
FIG. 1 is a functional diagram of an example vehicle, according to an example embodiment.
Fig. 2 is an example of map information according to aspects of the present disclosure.
Fig. 3 is an example exterior view of a vehicle according to aspects of the present disclosure.
Fig. 4 is a schematic diagram of an example system according to aspects of the present disclosure.
Fig. 5 is a functional diagram of the system of fig. 4, according to aspects of the present disclosure.
FIG. 6 is an example representation of a driving situation according to aspects of the present disclosure.
Fig. 7 is an example flow diagram in accordance with aspects of the present disclosure.
Fig. 8A and 8B are example interfaces for providing occupant feedback according to aspects of the present disclosure.
FIG. 9 is an example of an output of a model according to aspects of the present disclosure.
Fig. 10 is an example flow diagram in accordance with aspects of the present disclosure.
FIG. 11 is another example of an output of a model according to aspects of the present disclosure.
Detailed Description
Overview
The present technology relates to assessing ride quality, for instance during simulations. To do so, one or more models may be trained using feedback provided by actual passengers during actual rides, together with log data from those rides, including outputs from various systems of the vehicle (such as the planner and perception systems). The models may thus be trained to take log data as input, identify individual events that may compromise ride quality, and provide ride quality values for both real-world and simulated ride data. This information can then be used to assess ride quality across different aspects of a ride and improve the autonomous vehicle control software, and also to predict rider retention in view of a low-quality ride.
To train the model, data on ride quality must first be collected. For example, during a trip or ride with human riders (such as actual passengers or test riders), these riders may be given the ability to provide real-time feedback about the ride. As an example, a passenger may be able to provide feedback during the ride, e.g., in response to a particular event, and/or to provide feedback on the ride as a whole, e.g., a general rating of the ride.
The first model may be trained using ratings and associated ride data for multiple rides by a test rider to identify a list of events that may be uncomfortable to the passenger and a level of such discomfort for each event in the list. The first model may then be used to detect discomfort events from the ride data for different rides. For example, ride data from a simulated ride may be input into the model in order to provide a ride quality assessment (i.e., a list of events that may be uncomfortable and a level of discomfort for each event) for a given autonomous vehicle control software version and simulation. Similarly, ride data from actual passenger rides may be input into the model in order to provide ride quality assessments for particular rides. This, in turn, can be used to make adjustments to the software to improve the overall and/or passenger-specific ride quality.
The second model may be trained using the passengers' ratings and the associated ride data for multiple rides to identify an overall ride quality value. Further, to better inform the second model, it may be trained using the identification of discomfort events during the ride (i.e., the output of the first model). The second model may also be trained using additional information from the passenger, such as whether the passenger no longer takes rides (i.e., has churned) and/or motion sickness reported during rides.
The second model may also be used to determine the overall ride quality of real-world and simulated rides. For example, the output of the second model may be used to evaluate which types of events may cause passenger churn, in order to attempt to address such issues and to prioritize engineering efforts to adjust the software and improve ride quality overall. The overall ride quality data may be used to evaluate the autonomous vehicle software and/or the service as a whole.
Thus, the features described herein may allow for detection of discomfort events and assessment of overall ride quality in both real-world and simulated rides. This, in turn, allows for prioritizing engineering efforts to improve the autonomous vehicle software, changing ride behavior to increase rider retention and avoid passenger churn, and providing intervention in the event of an adverse event.
Example System
As shown in fig. 1, a vehicle 100 according to one aspect of the present disclosure includes various components. While certain aspects of the present disclosure are particularly useful for a particular type of vehicle, the vehicle may be any type of vehicle, including but not limited to an automobile, a truck, a motorcycle, a bus, a recreational vehicle, and the like. The vehicle may have one or more computing devices, such as computing device 110 containing one or more processors 120, memory 130, and other components typically found in a general purpose computing device.
The instructions 134 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on a computing device readable medium. In that regard, the terms "software," "instructions," and "programs" may be used interchangeably herein. The instructions may be stored in an object code format for direct processing by the processor, or in any other computing device language (including scripts or collections of independent source code modules that are interpreted or pre-compiled as needed). The function, method and routine of the instructions are explained in more detail below.
The one or more processors 120 may be any conventional processor, such as a commercially available CPU. Alternatively, one or more processors may be special purpose devices, such as an ASIC or other hardware-based processor. While fig. 1 functionally shows the processor, memory, and other elements of the computing device 110 as being within the same block, those of ordinary skill in the art will appreciate that a processor, computing device, or memory may actually comprise multiple processors, computing devices, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard drive or other storage medium located in a different housing than the computing device 110. Thus, references to a processor or computing device are to be understood as including references to a collection of processors or computing devices or memories that may or may not operate in parallel.
In one example, the computing device 110 may be an autonomous driving computing system incorporated into the vehicle 100. The autonomous driving computing system may be capable of communicating with various components of the vehicle in order to control the movement of the vehicle 100 according to the autonomous vehicle control software of memory 130, as discussed further below. For example, returning to fig. 1, the computing device 110 may be in communication with various systems of the vehicle 100, such as a deceleration system 160, an acceleration system 162, a steering system 164, a signaling system 166, a planning system 168, a positioning system 170, a perception system 172, and a power system 174 (i.e., the engine or motor of the vehicle), in order to control the movement, speed, etc. of the vehicle 100 in accordance with the instructions 134 of memory 130. Each of these systems may include various hardware (processors and memory similar to processor 120 and memory 130) as well as software, enabling these systems to perform various tasks. Again, although these systems are shown as external to the computing device 110, in actuality, these systems may also be incorporated into the computing device 110, again as an autonomous driving computing system for controlling the vehicle 100.
As an example, the computing device 110 may interact with one or more actuators of the deceleration system 160 and/or the acceleration system 162 (such as a brake, an accelerator pedal, and/or an engine or motor of the vehicle) to control the speed of the vehicle. Similarly, one or more actuators of the steering system 164 (such as a steering wheel, a steering shaft, and/or gears and racks in a rack and pinion system) may be used by the computing device 110 to control the direction of the vehicle 100. For example, if the vehicle 100 is configured for use on a roadway, such as a car or truck, the steering system may include one or more actuators to control the angle of the wheels to turn the vehicle. The signaling system 166 may be used by the computing device 110 to signal other drivers or vehicles of the vehicle's intent when needed, such as by illuminating a turn signal or brake light.
The planning system 168 may be used by the computing device 110 to determine and follow a route to a location. For example, the planning system may periodically generate trajectories in order to follow a particular route to a destination. In this regard, the planning system 168 and/or the data 132 may store detailed map information, e.g., highly detailed maps identifying the shape and elevation of roadways, lane lines, intersections, crosswalks, speed limits, traffic signals, buildings, signs, real-time traffic information, vegetation, or other such objects and information.
Fig. 2 is an example of map information 200 for a section of road that includes intersections 202 and 204. In this example, the map information 200 includes information identifying the shape, location, and other characteristics of lane lines 210, 212, 214, traffic signal lights 220, 222, crosswalk 230, sidewalk 240, stop signs 250, 252, and yield sign 260. Although the map information is depicted here as an image-based map, the map information need not be entirely image based (e.g., raster). For example, the map information may include one or more roadgraphs or graph networks of information such as roads, lanes, intersections, and the connections between these features. Each feature may be stored as graph data and may be associated with information such as a geographic location and whether or not it is linked to other related features; for example, a stop sign may be linked to a road and an intersection, etc. In some examples, the associated data may include grid-based indices of a roadgraph to allow for efficient lookup of certain roadgraph features.
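As a rough illustration of such a grid-indexed roadgraph, consider the following sketch; the Roadmap class, its field names, and the 50-meter cell size are illustrative assumptions rather than the actual map format.

```python
from collections import defaultdict

class Roadmap:
    """A minimal graph-style roadmap with a grid-based spatial index."""

    def __init__(self, cell_size: float = 50.0):
        self.cell_size = cell_size
        self.grid = defaultdict(list)  # (cell_x, cell_y) -> list of features

    def _cell(self, x: float, y: float) -> tuple:
        return (int(x // self.cell_size), int(y // self.cell_size))

    def add(self, feature: dict) -> None:
        # Example feature: {"type": "stop sign", "xy": (10.0, 42.5),
        #                   "links": ["road_1", "intersection_202"]}
        self.grid[self._cell(*feature["xy"])].append(feature)

    def nearby(self, x: float, y: float) -> list:
        """Efficiently look up the map features in the cell containing (x, y)."""
        return self.grid[self._cell(x, y)]
```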
The positioning system 170 may be used by the computing device 110 to determine the relative or absolute position of the vehicle on a map or on the earth. For example, the positioning system 170 may include a GPS receiver to determine the latitude, longitude, and/or altitude location of the device. Other positioning systems, such as laser-based positioning systems, inertia-assisted GPS, or camera-based positioning, may also be used to identify the location of the vehicle. The location of a vehicle may include absolute geographic locations (such as latitude, longitude, and altitude), as well as relative location information (such as the location relative to other vehicles in the immediate vicinity, which may often be determined with less noise than an absolute geographic location).
The positioning system 170 may also include other devices in communication with the computing device 110, such as accelerometers, gyroscopes, or other direction/velocity detection devices to determine the direction and velocity of the vehicle or changes thereto. By way of example only, the acceleration device may determine its pitch, yaw, or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto. The device can also track the increase or decrease in speed and the direction of such changes. The location and orientation data provided by a device as described herein may be automatically provided to the computing device 110, other computing devices, and combinations of the foregoing.
The perception system 172 also includes one or more components for detecting objects external to the vehicle, such as other vehicles, obstacles in the road, traffic signals, signs, trees, and the like. For example, the perception system 172 may include lasers, sonar, radar, cameras, and/or any other detection devices that record data that may be processed by the computing device 110. In the case where the vehicle is a passenger vehicle such as a minivan, the minivan may include a laser or other sensors mounted on the roof or another convenient location. For example, fig. 3 is an example exterior view of the vehicle 100. In this example, the roof housing 310 and dome housing 312 may include a LIDAR sensor as well as various cameras and radar units. Further, the housing 320 located at the front end of the vehicle 100 and the housings 330, 332 on the driver and passenger sides of the vehicle may each house a LIDAR sensor. For example, the housing 330 is located in front of the driver's door 350. The vehicle 100 also includes housings 340, 342 for radar units and/or cameras, also located on the roof of the vehicle 100. Additional radar units and cameras (not shown) may be located at the front and rear ends of the vehicle 100 and/or at other locations along the roof or roof housing 310. The vehicle 100 also includes many features of a typical passenger vehicle, such as doors 350, 352, wheels 360, 362, etc.
Various systems of the vehicle may function using autonomous vehicle control software in order to determine how to control the vehicle and to control the vehicle. As an example, the perception system software modules of the perception system 172 may use sensor data generated by one or more sensors of the autonomous vehicle (such as cameras, LIDAR sensors, radar units, sonar units, etc.) to detect and identify objects and their characteristics. These characteristics may include location, type, heading, orientation, speed, acceleration, change in acceleration, size, shape, etc. In some cases, the characteristics may be input into a behavior prediction system software module that uses various models to output predicted future behavior of a detected object based on the object type. In other cases, the characteristics may be input into one or more detection system software modules, such as a construction zone detection system software module configured to detect construction zones from sensor data generated by the one or more sensors of the vehicle, and an emergency vehicle detection system configured to detect emergency vehicles from sensor data generated by the sensors of the vehicle. Each of these detection system software modules may use various models to output a likelihood that an object is a construction zone or an emergency vehicle. The detected objects, the predicted future behaviors, the various likelihoods from the detection system software modules, map information identifying the environment of the vehicle, position information from the positioning system 170 identifying the location and orientation of the vehicle, the destination of the vehicle, and feedback from various other systems of the vehicle may be input into a planning system software module of the planning system 168. The planning system may use this input to generate trajectories for the vehicle to follow over a short period of time in the future. A control system software module of the computing device 110 may be configured to control the movement of the vehicle, for example by controlling the braking, acceleration, and steering of the vehicle, in order to follow a trajectory.
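The module chain described above might be wired together as in the following sketch; the function names and signatures are illustrative assumptions, not the actual interfaces of the autonomous vehicle control software.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class DetectedObject:
    object_type: str  # e.g., "vehicle", "pedestrian", "cyclist"
    location: tuple
    heading: float
    speed: float

def perceive(sensor_data) -> List[DetectedObject]:
    """Perception system module: detect objects and their characteristics."""
    raise NotImplementedError

def predict_behavior(obj: DetectedObject) -> Dict:
    """Behavior prediction module: predicted future behavior by object type."""
    raise NotImplementedError

def plan(objects, predictions, map_info, pose, destination) -> List[tuple]:
    """Planning system module: a short-horizon trajectory to follow."""
    raise NotImplementedError

def control(trajectory) -> None:
    """Control system module: braking, acceleration, and steering commands."""
    raise NotImplementedError

def drive_cycle(sensor_data, map_info, pose, destination) -> None:
    objects = perceive(sensor_data)
    predictions = {i: predict_behavior(o) for i, o in enumerate(objects)}
    trajectory = plan(objects, predictions, map_info, pose, destination)
    control(trajectory)
```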
The computing device 110 may autonomously control the direction and speed of the vehicle by controlling the various components. To do so, the computing device 110 may accelerate the vehicle (e.g., by increasing fuel or other energy provided to the engine by the acceleration system 162), decelerate (e.g., by decreasing the fuel supplied to the engine, shifting gears, and/or applying brakes via the deceleration system 160), change direction (e.g., by turning the front or rear wheels of the vehicle 100 via the steering system 164), and signal such changes (e.g., by illuminating a turn signal of the signaling system 166). Thus, the acceleration system 162 and the deceleration system 160 may be part of a drivetrain that includes various components between the engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, the computing device 110 may also control the drivetrain of the vehicle in order to maneuver the vehicle autonomously.
The computing device 110 of the vehicle 100 may also transmit information to or receive information from other computing devices, such as those that are part of the transportation service and other computing devices. Fig. 4 and 5 are pictorial and functional diagrams, respectively, of an example system 400, the example system 400 including a plurality of computing devices 410, 420, 430, 440 and a storage system 450 connected via a network 460. The system 400 also includes a vehicle 100 and vehicles 100A, 100B that may be configured the same as or similar to the vehicle 100. Although only a few vehicles and computing devices are depicted for simplicity, a typical system may include significantly more vehicles and computing devices.
As shown in fig. 5, each of the computing devices 410, 420, 430, 440 may include one or more processors, memory, data, and instructions. Such processors, memories, data, and instructions may be configured similarly to the one or more processors 120, memories 130, data 132, and instructions 134 of the computing device 110.
In one example, the one or more computing devices 410 may include one or more server computing devices having multiple computing devices, e.g., a load balancing server farm, that exchanges information with different nodes of a network for the purpose of receiving data from, processing data, and sending data to other computing devices. For example, the one or more computing devices 410 may include one or more server computing devices capable of communicating with the computing device 110 of the vehicle 100 or similar computing devices of the vehicle 100A and the computing devices 420, 430, 440 via the network 460. For example, the vehicles 100, 100A may be part of a fleet of vehicles that may be dispatched by a server computing device to various locations. In this regard, the server computing device 410 may function as a dispatch server computing system that may be used to dispatch vehicles, such as the vehicle 100 and the vehicle 100A, to different locations for pick-up and drop-off of passengers. Further, server computing device 410 may send and present information to users, such as users 422, 432, 442, on displays, such as displays 424, 434, 444 of computing devices 420, 430, 440 using network 460. In this regard, the computing devices 420, 430, 440 may be considered client computing devices.
As shown in fig. 5, each client computing device 420, 430, 440 may be a personal computing device intended for use by a user 422, 432, 442 and have all of the components typically used in connection with a personal computing device, including one or more processors (e.g., Central Processing Units (CPUs)), memory (e.g., RAM and internal hard drives) that stores data and instructions, a display (such as display 424, 434, 444) (e.g., a monitor having a screen, a touch screen, a projector, a television or other device operable to display information), and a user input device 426, 436, 446 (e.g., a mouse, a keyboard, a touch screen, or a microphone). The client computing device may also include a camera for recording video streams, speakers, a network interface device, and all components for connecting these elements to each other.
While the client computing devices 420, 430, and 440 may each comprise full-size personal computing devices, they may alternatively comprise mobile computing devices capable of wirelessly exchanging data with a server over a network such as the internet. By way of example only, the client computing device 420 may be a mobile phone or a device such as a wireless-enabled PDA, a tablet PC, a wearable computing device or system, or a netbook capable of obtaining information via the internet or other network. In another example, the client computing device 430 may be a wearable computing system, as shown in fig. 4, shown as a watch. As an example, a user may input information using a keypad, a microphone, using a visual signal with a camera, or a touch screen.
In some examples, the client computing device 440 may be a concierge workstation used by an administrator or operator to communicate with passengers, as discussed further below. Although only a single concierge workstation 440 is shown in fig. 4 and 5, any number of such workstations may be included in a typical system. Moreover, while the workstation is depicted as a desktop computer, it may include various types of personal computing devices, such as a portable computer, a netbook, a tablet computer, and so forth.
Like memory 130, storage system 450 may be any type of computerized storage device capable of storing information accessible to server computing device 410, such as a hard drive, memory card, ROM, RAM, DVD, CD-ROM, writable and read-only memory. Further, storage system 450 may comprise a distributed storage system in which data is stored on a plurality of different storage devices that may be physically located in the same or different geographic locations. As shown in fig. 4 and 5, storage system 450 may be connected to computing devices via network 460 and/or may be connected directly to any computing device 110, 410, 420, 430, 440, etc. or incorporated into any computing device 110, 410, 420, 430, 440, etc.
The storage system 450 may store various types of information. For example, the storage system 450 may store the above-mentioned autonomous vehicle control software to be used by a vehicle, such as the vehicle 100, to operate the vehicle in an autonomous driving mode. The autonomous vehicle control software stored in the storage system 450 may include various unvalidated and validated versions of the autonomous vehicle control software. Once validated, the autonomous vehicle control software may be transmitted to, for example, the memory 130 of the vehicle 100 for use by the vehicle's computing device to control the vehicle in the autonomous driving mode.
The storage system 450 may store various types of information, as described in more detail below. This information may be retrieved or otherwise accessed by a server computing device (such as the one or more server computing devices 410) in order to perform some or all of the features described herein. For example, the storage system 450 may store log data. This log data may include, for example, sensor data generated by a perception system, such as the perception system 172 of vehicle 100. The perception system may include a plurality of sensors that generate sensor data. As an example, the sensor data may include raw sensor data as well as data identifying characteristics defining perceived objects (including other road users), such as the shape, location, orientation, speed, etc. of objects such as vehicles, pedestrians, bicyclists, vegetation, curbs, lane lines, sidewalks, crosswalks, buildings, etc. The log data may also include "event" data identifying different types of events (such as collisions or near collisions with other objects), planned trajectories describing the planned geometry and/or speed of the potential path of the vehicle 100, the actual location of the vehicle at different times, the actual orientation/heading of the vehicle at different times, the actual speed, acceleration, and deceleration of the vehicle at different times (and/or changes in speed, heading/steering angle, acceleration, etc.), the classifications of and responses to perceived objects over time, behavior predictions of perceived objects, the states of the various systems of the vehicle (such as acceleration, deceleration, perception, steering, signaling, routing, power, etc.) at different times (including recorded errors), the inputs and outputs of the various systems of the vehicle at different times, and so on. As such, these events and the sensor data may be used to "render" the environment of the vehicle, including the perceived objects and the behavior of the vehicle, in a simulation.
At least some of this log data may be "ride data," i.e., log data generated during a particular trip or ride taken by a passenger in an autonomous vehicle such as vehicle 100. The ride data may include all of the above-mentioned features of the log data, or may include only certain types of data, such as motion planning commands from a planner system of the vehicle (such as the planning system 168), telemetry from the vehicle, context from the map information used to control the vehicle, processed or raw sensor data for other road users (such as vehicles, bicyclists, pedestrians, etc.) from a perception system of the vehicle (such as the perception system 172), acceleration information, jerk (the derivative of acceleration) information, and so on. The ride data may also be associated with ride feedback provided by one or more passengers for the particular ride. At least some of this feedback may be provided by a passenger, for example, during or immediately after an unpleasant event, as discussed further below. Additionally or alternatively, the feedback may include an overall ride quality value provided after the ride has been completed, as discussed further below.
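One way to picture such ride data, with its associated feedback, is the following sketch; the field names are assumptions for illustration, not the patent's schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FeedbackEvent:
    timestamp: float
    rating: int  # 1 (slight discomfort) .. 4 (strong discomfort)

@dataclass
class RideData:
    ride_id: str
    planner_outputs: List[dict]     # motion planning commands over time
    perception_outputs: List[dict]  # other road users: type, location, speed
    map_context: List[dict]         # map features used to control the vehicle
    acceleration: List[float]       # acceleration samples over the ride
    jerk: List[float]               # derivative of the acceleration samples
    feedback: List[FeedbackEvent] = field(default_factory=list)
    overall_rating: Optional[int] = None  # 1..4 stars, provided after the ride
```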
Example method
In addition to the operations described above and shown in the figures, various operations will now be described. It should be understood that the following operations need not be performed in the precise order described below. Rather, various steps may be processed in a different order or concurrently, and steps may also be added or omitted.
In one aspect, a user may download an application for requesting a transportation or ride service, or more specifically a vehicle, to a client computing device. For example, users 422 and 432 may download the application to client computing devices 420 and 430 via a link in an email, directly from a website, or from an application store. For example, a client computing device may send a request for the application to one or more server computing devices 410, e.g., over a network, and receive the application in response. The application may be installed locally at the client computing device.
The user may then use his or her client computing device to access the application and request a vehicle. As an example, a user such as user 432 may use the client computing device 430 to send a request for a vehicle to the one or more server computing devices 410. As part of this, the user may identify a pickup location, a destination location, and, in some cases, one or more intermediate stopping locations anywhere within the service area where the vehicle can stop. The one or more server computing devices 410 may then dispatch a vehicle to pick up the passenger at the pickup location, transport the passenger (i.e., make a "trip" or "ride") to the destination location, and drop off the passenger at the destination location.
Fig. 6 represents ride data 600 for a ride by a passenger (such as user 422 or 432) in a vehicle having an autonomous driving mode (such as vehicle 100). In this example, the vehicle may have followed the route 610 from the pickup location 620 to the destination location 630. Although not shown, this ride data 600 may also include the aforementioned log data, including sensor data generated by the perception system (e.g., perception system 172), event data identifying different types of events (such as collisions or near collisions with other objects), planned trajectories describing the planned geometry and/or speed of the potential path of the vehicle 100, the actual location of the vehicle at different times, the actual orientation/heading of the vehicle at different times, the actual speed, acceleration, and deceleration of the vehicle at different times (and/or changes in speed, heading/steering angle, acceleration, etc. over time), classifications of and responses to perceived objects, behavior predictions of perceived objects, the states of the various systems of the vehicle (e.g., acceleration, deceleration, perception, steering, signaling, routing, power, etc.) at different times (including recorded errors), the inputs and outputs of the various systems of the vehicle at different times, and so on. This information, together with feedback from users 422, 432 discussed further below, may be used to train one or more models, as discussed further below.
Fig. 7 includes an example flow diagram 700 of some of the examples described herein for training a model to identify events that may cause discomfort to an occupant of an autonomous vehicle, which may be performed by one or more processors (such as the processors of computing device 410). To train the model, data regarding ride quality must first be collected and stored in the storage system 450, as discussed above.
For example, during a ride with human riders (such as actual passengers or test riders), these riders may be given the ability to provide real-time feedback about the ride. As an example, a test rider may provide real-time feedback during or immediately after an unpleasant event. Fig. 8A provides a user interface 800 for providing real-time feedback during a ride. The user interface 800 includes a plurality of buttons that are either virtual (e.g., presented on a touchscreen display, such as the internal electronic display 152 shown in fig. 8A, or on a display of a client computing device, such as the display 424 of the client computing device 420 or the display 434 of the client computing device 430) or actual physical buttons located in the vehicle. Each button 810, 820, 830, 840 includes a corresponding rating value: "1", "2", "3", and "4". These values may indicate a level of discomfort, for example in the range of 1 to 4, where 1 represents a very low or slight level of discomfort and 4 represents a very high or strong level of discomfort. Alternatively, rather than simply using a numerical rating value, the rating value may be a static or animated icon, such as a face representing the emotional state of the occupant.
During or immediately after the unpleasant event, the passenger may select a rating, for example by tapping or pressing one of the buttons with his or her finger, in order to provide real-time feedback about the ride to the computing device 110. For example, returning to fig. 6, the passenger (e.g., user 422, 432) may have provided feedback at point 650 along route 610. As shown, this feedback may indicate a rating value of "3" provided by the passenger, indicating a significant level of discomfort. This information may also be associated with an identifier (ID) and a timestamp (T1).
This feedback or rating value may also be converted to some other normalized value (e.g., on a scale of 0 to 1) representing the level of discomfort of the corresponding event. The rating may also be associated with the ride data for the ride (e.g., ride data 600), including the outputs from the various systems of the vehicle discussed above at times proximate to the input and/or for the entire ride.
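The conversion to a normalized value might look like the following; the linear mapping is an assumption, since the text does not specify the exact conversion.

```python
def normalize_rating(rating: int, max_rating: int = 4) -> float:
    """Map a 1..max_rating rating onto a 0..1 normalized discomfort scale."""
    return (rating - 1) / (max_rating - 1)

assert normalize_rating(1) == 0.0  # slight discomfort
assert normalize_rating(4) == 1.0  # strong discomfort
```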
The ride data and associated feedback for a particular ride may be sent to the server computing device 410 by the client computing device and/or computing device 110, for example, via the network 460 and/or by otherwise communicating information to the server computing device 410. The server computing device 410 may store the ride data and associated feedback in the storage system 450.
At the end of the ride, the same or a similar interface may be used to provide feedback indicating an overall ride quality value after the ride has been completed. In this regard, the values of buttons 1, 2, 3, and 4 may be repurposed to represent star rating values on a scale of 1 to 4, where 1 represents a very low quality ride and 4 represents a very high quality ride. While the above-mentioned interface 800 within the vehicle may be used, alternatively, a similar arrangement of virtual buttons may appear on the display of the passenger's client computing device after the ride is completed.
For example, fig. 8B provides a user interface 850 for providing feedback after a ride. The user interface 850 includes a plurality of buttons that are either virtual (e.g., presented on a touchscreen display, such as the internal electronic display 152 shown in fig. 8A, or on a display of a client computing device, such as the display 424 of the client computing device 420) or actual physical buttons located in the vehicle. Each button 860, 870, 880, 890 includes a corresponding star rating value: "1", "2", "3", and "4". These star rating values may indicate an overall level of satisfaction with a completed ride, where 1 represents very low satisfaction or low overall ride quality and 4 represents very high satisfaction or high overall ride quality. For example, the passenger may indicate a star rating value by highlighting a given number of stars out of the maximum number of stars (i.e., 3 out of 4 would give a 3-star rating value) or by simply tapping a representation of a particular value (i.e., "3" stars would give a 3-star rating value). For example, returning to fig. 6, once the vehicle has completed the ride along route 610, the passenger (e.g., user 432) may have provided feedback. As shown, this feedback may indicate a 2-star overall ride quality value provided by the passenger, indicating a relatively low overall ride quality. This feedback may also be associated with a ride identifier (RID) and a timestamp (T2). Again, the star rating value may also be converted to some other normalized value (e.g., on a scale of 0 to 1, where 0 is a very low ride quality and 1 is a very high ride quality) representing the overall ride quality for a given ride. Of course, the scale could also be reversed such that 1 is a very low quality ride and 0 is a very high quality ride. Again, alternatively, rather than simply using a numerical rating value, the rating value may be a static or animated icon, such as a face representing the emotional state of the occupant.
The client computing device (via the application) may then automatically send this star rating value and/or the normalized value to the server computing device 410 and/or the computing device 110. The computing device 110 may also provide the aforementioned ride data to the server computing device 410, e.g., via the network 460 and/or other transmission, to be associated with the overall ride quality value provided by the passenger. All of this information may then be stored in the storage system 450.
To train the first model, the server computing device 410 may receive the ride data and feedback from the storage system 450, as shown in block 710 of fig. 7. In this case, the feedback may include a ride quality value indicative of a level of discomfort identified by an occupant of the autonomous vehicle during the first ride. As described above, this ride quality value may indicate the level of discomfort of the passenger during an event that occurred during a particular ride.
The first model may be trained using the ride data and the ride quality values such that the model is configured to output a list of events that may cause occupant discomfort during a second ride in response to receiving the ride data for the second ride as input, as shown in block 720 of fig. 7. The training data may also include the event codes mentioned above. In this regard, the ride data may be used as training input, and the event codes and ride quality values may be used as training output. The more information (i.e., ride data and feedback) used to train the model, the better the model can be expected to predict events that may cause discomfort. In this regard, the first model may be trained using the ride data and associated discomfort levels for multiple rides by test riders in order to identify events that make riders uncomfortable and the level of such discomfort. Again, the discomfort level may correspond to a value on the above-mentioned normalized scale. The model may include any type of machine learning model, such as a neural network, a deep neural network, a convolutional neural network applied over time, a recurrent neural network, and so on. Alternatively, other non-machine-learning models may be used. For example, if a vehicle crosses an intersection during a yellow light, the amount of time spent in that intersection while the light is yellow may, in a sense, be used as a measure of comfort.
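As a minimal training sketch, the following uses a gradient-boosted regressor over fixed-length windows of ride data features; this is a stand-in for illustration only, since the patent allows neural networks, temporal convolutional networks, recurrent networks, or non-machine-learning heuristics, and a parallel classifier could likewise be trained to emit event codes.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Placeholder training set: one row per short window of ride data (planner,
# perception, map-context, acceleration, and jerk features), labeled with the
# normalized discomfort level reported near that window.
rng = np.random.default_rng(0)
X_train = rng.random((500, 32))
y_train = rng.random(500)

discomfort_model = GradientBoostingRegressor().fit(X_train, y_train)
predicted = discomfort_model.predict(X_train[:5])  # predicted discomfort, 0..1
```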
In some cases, the first model is not a single model but may actually be a collection or aggregation of different models that may be used separately or together to predict different types of events. Each of these different models may be used to identify a certain event code or group of event codes (e.g., a category). For example, different models may be used to classify events, or, more specifically, event codes that identify different types of behavior that may cause discomfort. As an example, the first model may include a model for detecting that the vehicle is turning sharply and providing an event code associated with the sharp turn. Similarly, the first model may also include models for detecting that, given the context in which the vehicle is currently located, the vehicle is braking, traveling too slowly (e.g., traveling at 30 mph on a highway when other vehicles are traveling at 55 mph or above), turning at high speed, coming too close to another object without hard braking, and so on.
The first model may then be used to detect discomfort events from the ride data of different rides. For example, the autonomous vehicle control software for controlling a vehicle may be tested by generating simulations, for example from log data of previous real-world driving situations. The output of a simulation may include ride data, such as only certain types of data: motion planning commands from the planner system of the vehicle (or, more precisely, the planner system software module), telemetry from the vehicle, context from the map information used to control the vehicle, processed or raw sensor data for other road users from the perception system of the vehicle (or, more precisely, the perception system software module), acceleration information, jerk information, and so on. The simulated ride data may be input into the first model in order to analyze the ride data generated by the autonomous vehicle control software and provide a list of events and the level of discomfort for each event. As such, the first model may provide a ride quality assessment for a given simulation or simulations.
Fig. 9 provides an example 900 of the output of the first model for a given ride or simulated ride. Example 900 provides output for a ride or simulation identified by the ride identifier "00001". The ride identifier and timestamps may be used to identify when, and in which simulation or ride, each event occurred. The output also includes a list of event codes that may be used to identify certain types of events or maneuvers, such as left turns, right turns, hard braking events, etc., that occurred during the ride or simulation, as well as a level of discomfort for each, corresponding to a value on the above-mentioned normalized scale.
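Such an output might be represented as follows; the event code labels are hypothetical, and the T7 entry mirrors the example discussed below.

```python
# Sketch of the first model's per-ride output, mirroring example 900.
first_model_output = {
    "ride_id": "00001",
    "events": [
        {"timestamp": "T3", "event_code": "LEFT_TURN", "discomfort": 0.1},
        {"timestamp": "T7", "event_code": "HARD_BRAKE", "discomfort": 0.5},
    ],
}
```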
From this information, the first model may be used to evaluate ride quality for different versions of the autonomous vehicle control software. For example, the output of the first model may be used to identify certain types of events caused by a given version of the autonomous vehicle control software that may be uncomfortable for passengers, or, more precisely, those that correspond to relatively high levels of discomfort (i.e., closer to 1 on the above-mentioned normalized scale). This output may then be flagged for review according to the list of events and the levels of discomfort. For example, referring to example 900, given that a level of discomfort of 0.5 may be considered quite high, the event at timestamp T7 and/or ride identifier 00001 may be flagged for further review.
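Flagging might then reduce to a simple threshold over that output, as in the following sketch; the threshold value and policy are illustrative assumptions.

```python
DISCOMFORT_REVIEW_THRESHOLD = 0.5  # illustrative; the text treats 0.5 as quite high

def flag_for_review(model_output: dict) -> list:
    """Return the events in a ride whose discomfort meets the review threshold."""
    return [event for event in model_output["events"]
            if event["discomfort"] >= DISCOMFORT_REVIEW_THRESHOLD]

# With the output sketched above, only the T7 hard-brake event is flagged.
```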
This, in turn, may be used to adjust the software to improve ride quality generally, for example, by reducing the number of events of this particular type caused by the autonomous vehicle control software during future rides. As an example, engineers may prioritize reprogramming the autonomous vehicle control software to reduce the types of events with the highest levels of discomfort. As another example, engineers may prioritize reprogramming the autonomous vehicle control software to reduce the number of events of a given type that have a relatively high likelihood of occurring, even if events of that type have a lower level of discomfort. Additionally or alternatively, the autonomous vehicle control software may be reprogrammed to "replace" events with high levels of discomfort, such as certain types of maneuvers, with other types of events or maneuvers with lower levels of discomfort.
The first model may also be used to detect discomfort events from the ride data of certain real-world rides. For example, as described above, a passenger may provide an overall ride quality value after the ride has been completed. Again, as described above, this may include a rating, such as 1 to 4 stars. For rides rated below a certain threshold (e.g., 2 stars or fewer), the first model may be run automatically to detect discomfort events that occurred during the ride, in order to avoid such events in the future for the particular passenger who provided the low overall ride quality value (or more precisely, star rating value). This information may also be used to adjust the software to improve ride quality overall, as discussed above with respect to simulated rides, and to potentially identify patterns of discomfort specific to certain locations and times of day, which may be used to avoid driving at those locations (e.g., on certain routes) at those times of day in order to improve comfort.
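A minimal sketch of this trigger, assuming a 2-star threshold and a generic event-detection function, might be:

```python
LOW_RATING_THRESHOLD = 2  # stars, per the example above

def analyze_if_low_rated(ride_data, star_rating, detect_events):
    """Run the event model only for rides rated at or below the threshold."""
    if star_rating <= LOW_RATING_THRESHOLD:
        return detect_events(ride_data)
    return []
```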
Fig. 10 includes an example flow diagram 1000 of some of the examples described herein, which may be performed by one or more processors (such as the processors of the server computing device 410), for training a model for evaluating overall ride quality for an autonomous vehicle ride. As shown in block 1010, first ride data is received that identifies an output from a planner system of an autonomous vehicle and an output from a perception system of the autonomous vehicle. The first ride data is associated with a first overall ride quality value indicative of the overall ride quality experienced by a passenger of the autonomous vehicle during the first ride. For example, the server computing device 410 may retrieve the ride data and associated feedback, here the overall ride quality value and/or the star rating value, from the storage system 450.
The second model may be trained using the first ride data and the first overall ride quality value, such that the model is configured to output a second overall ride quality value for a second ride in response to receiving second ride data for the second ride as input, as shown in block 1020. For example, the second model may be trained using the star rating values and associated ride data for multiple rides taken by passengers in order to identify an overall ride quality value. As described above, the star rating value may be converted to some other normalized value to represent overall ride quality and associated with the ride data for the ride. Further, to better inform the second model, the output of the first model identifying discomfort events during the rides whose ride data is used for training may also be used to train the second model. In this regard, the output of the first model and/or the ride data may be used as training inputs, and the overall ride quality value may be used as a training output. Again, the more information (i.e., ride data and feedback) used to train the second model, the better the second model can be expected to predict the overall ride quality value.
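The training setup of block 1020 might be sketched as follows, where the feature names, the example data, and the use of a simple ridge regressor (as a stand-in for the neural network discussed below) are all assumptions:

```python
import numpy as np
from sklearn.linear_model import Ridge  # stand-in; see the network sketch below

def normalize_stars(stars: int, max_stars: int = 4) -> float:
    return stars / max_stars  # 1..4 stars -> 0.25..1.0

def featurize(ride: dict) -> list:
    # Combine the first model's event output with simple ride statistics.
    events = ride["events"]
    return [len(events),
            max((e["discomfort"] for e in events), default=0.0),
            ride["duration_s"]]

training_rides = [
    {"events": [{"discomfort": 0.5}], "duration_s": 600, "stars": 2},
    {"events": [],                    "duration_s": 480, "stars": 4},
]
X = np.array([featurize(r) for r in training_rides])
y = np.array([normalize_stars(r["stars"]) for r in training_rides])
second_model = Ridge().fit(X, y)  # predicts a normalized overall quality value
```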
The second model may include any type of machine learning model, such as a neural network, a deep neural network, a convolutional neural network over time, a recurrent neural network, and so on. In this case, using a convolutional neural network over time may allow the second model to identify relationships between events, when those events occur during the ride, and how they affect overall ride quality. For example, if an uncomfortable event occurs at the beginning or end of a ride, this may have a greater impact on the overall ride quality than if the same event occurred midway through the ride.
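A minimal PyTorch sketch of such a convolutional network over time, assuming fixed-length rides represented as per-timestep feature vectors and arbitrary layer sizes, might look like this; because the convolution slides over the time axis, the network can react differently to an event near the start or end of a ride than to one in the middle:

```python
import torch
import torch.nn as nn

class RideQualityNet(nn.Module):
    def __init__(self, n_features: int, seq_len: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * seq_len, 1),
            nn.Sigmoid(),  # overall ride quality on the normalized 0..1 scale
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features, seq_len) -- per-timestep ride features
        return self.head(self.conv(x))

model = RideQualityNet(n_features=8, seq_len=128)
quality = model(torch.randn(1, 8, 128))  # -> tensor of shape (1, 1)
```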
The second model may also be trained using additional information from passengers. For example, some passengers may stop using the service after experiencing a low-quality ride. As such, this information may be used to train the model to predict the likelihood that a passenger will continue to use the service. For passengers predicted to be likely to stop using the service, the first model may be used to identify any uncomfortable events during one or more rides for that passenger, and in the future the vehicle may be controlled to avoid such events for that particular passenger. Again, whether a passenger is likely to discontinue use of the service may be expressed as a normalized value (e.g., on a scale of 0 to 1, where 0 is a very low likelihood of discontinuing use of the service and 1 is a very high likelihood of discontinuing use of the service) representing the overall likelihood of the passenger ceasing to use the service.
As another example, the second model may be trained using data identifying motion sickness estimates for rides. For example, motion sickness for a ride may be assessed based on lateral and forward acceleration, steering angle (and change in steering angle over time), and/or rate-of-change-of-acceleration values. As an example, an interface similar to interface 800 of Fig. 8A may be used to indicate motion sickness during or immediately after a particular event. The values for the buttons 810, 820, 830, 840 may indicate a level of motion sickness, for example in the range of 1 to 4, where 1 represents a very low or slight level of motion sickness and 4 represents a very high or strong level of motion sickness. During or immediately after an unpleasant event, the passenger may make a selection, for example by tapping the display with a finger or by pressing one of the physical buttons, in order to provide real-time feedback to the computing device 110 regarding the ride. In this regard, motion sickness assessments may be handled similarly to the ride quality values discussed above. As such, the second model may also identify whether a ride is associated with a high motion sickness value. Again, these values may be converted to some other normalized value (e.g., on a scale of 0 to 1, where 0 is very low motion sickness and 1 is very high motion sickness) representing the overall motion sickness value for a given ride. This may be accomplished, for example, by summing the values, taking an average, taking a moving average with an exponential decay, and the like.
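One concrete reading of the moving-average-with-exponential-decay option, in which later reports are weighted most heavily, is sketched below; the decay factor is an assumption:

```python
def overall_motion_sickness(levels, decay=0.9):
    """Collapse per-event reports (1..4) into one 0..1 value using an
    exponentially decaying moving average; recent reports weigh most."""
    ewma = 0.0
    for level in levels:
        ewma = decay * ewma + (1 - decay) * (level - 1) / 3  # map 1..4 -> 0..1
    return ewma

print(overall_motion_sickness([1, 2, 4]))  # later, stronger reports dominate
```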
Fig. 11 provides an example 1100 of the output of the second model for given rides or simulated rides. Example 1100 provides output for a plurality of rides or simulations identified by ride identifiers "00001" through "n". The output includes a ride identifier that can be used to identify the ride or ride simulation, a motion sickness value (motion sickness) representing how likely the passenger is to experience motion sickness, an overall ride quality value (overall) corresponding to a value on the normalized scale mentioned above, and a likelihood value (stop value) indicating how likely the passenger is to stop using the service after the ride. As shown, these values are normalized on the scale of 0 to 1 mentioned above.
The second model may also be used to determine the overall ride quality of real-world and simulated rides. For example, the second model may be used to evaluate which overall ride quality values may be responsible for passenger loss. This output may then be flagged for review based on the overall ride quality value and/or the likelihood that the passenger will stop using the service. As an example, referring to example 1100, given a high motion sickness value (0.4), a high likelihood of the passenger ceasing to use the service (0.5), and a low overall ride quality value (0.3), the ride with ride identifier 00002 may be flagged for further review. Further, as described above, the ride data for any ride having a lower (i.e., closer to 0) overall ride quality value may be input into the first model in order to identify the specific events most likely to have affected the overall ride quality value.
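A sketch of this ride-level flagging, with thresholds chosen simply to match the worked example for ride identifier 00002, might be:

```python
# Thresholds are assumptions; only the 00002 values come from the text.
def flag_ride(row, sickness_max=0.4, stop_max=0.5, quality_min=0.3):
    return (row["motion_sickness"] >= sickness_max
            or row["stop_value"] >= stop_max
            or row["overall"] <= quality_min)

ride_00002 = {"motion_sickness": 0.4, "stop_value": 0.5, "overall": 0.3}
print(flag_ride(ride_00002))  # True -> flag for further review
```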
This, in turn, can be used to prioritize engineering work to make adjustments to the software that improve ride quality overall. Further, simulated ride data may be input into the second model in order to analyze the ride data generated by the autonomous vehicle control software and provide an overall ride quality value and/or a likelihood that a passenger would stop using the service for that simulation. The output of the second model across many different simulations may be used to prioritize engineering work to make adjustments to the software that improve ride quality overall.
The overall ride quality value may be used to assess the autonomous vehicle control software and/or the service as a whole. For example, the output of the second model may be used to provide an estimate of the overall ride quality for the service over a given period of time, given the number of miles driven. Further, the same data may be used to evaluate how certain types of maneuvers identified in the ride data (such as left turns, right turns, lane changes, highway merges, etc.) affect the overall ride quality value. Again, this information can be used to prioritize engineering work to make adjustments to the software that improve ride quality overall.
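Such a per-maneuver evaluation might be sketched as follows, assuming each ride record lists the maneuver types it contained; the field names and data are illustrative:

```python
from collections import defaultdict

def quality_by_maneuver(rides):
    """Average the overall ride quality value over each maneuver type."""
    buckets = defaultdict(list)
    for ride in rides:
        for maneuver in ride["maneuvers"]:
            buckets[maneuver].append(ride["overall"])
    return {m: sum(vals) / len(vals) for m, vals in buckets.items()}

rides = [{"maneuvers": ["LEFT_TURN", "LANE_CHANGE"], "overall": 0.6},
         {"maneuvers": ["LEFT_TURN"], "overall": 0.9}]
print(quality_by_maneuver(rides))  # {'LEFT_TURN': 0.75, 'LANE_CHANGE': 0.6}
```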
In addition to the "offline" evaluation mentioned above, the first and second models may also be provided to one or more vehicles in order to allow computing devices of those vehicles to make better informed driving decisions. For example, the first model and the second model may be sent to the computing device 110 of the vehicle 100, e.g., via the network 460 or otherwise load this information into the computing device 110. This information may then be stored in the memory 130 of the computing device 110 to allow the computing device to evaluate the ride quality of the passenger using the first model and the second model, as well as essentially evaluating the passenger's perception during the ride to provide some level of intervention if necessary.
For example, the first model may be run to identify events that may be uncomfortable for the passenger. In response, the computing device may take one or more intervening actions. These may include, for example, changing aspects of the user interface to acknowledge what the passenger is experiencing, connecting the passenger with a concierge (e.g., the user 442 operating the concierge workstation 440) to talk to the passenger and address any concerns, automatically changing the environment (e.g., entertaining and distracting the passenger by playing calming or upbeat music and/or by displaying other information), and so on, all of which may help the passenger relax.
For example, after an uncomfortable hard braking event, the first model may be used to determine the level of discomfort for that event. The ride data for the event may be analyzed to identify the cause of the event, such as another vehicle, a pedestrian, a cyclist, a yellow light, and so on. This information can then be used to generate a message explaining the situation, such as "We had to brake for the vehicle that just cut in from the right; we apologize for any discomfort this caused." The message may be communicated audibly to the passenger using the speaker 154 and/or visually, for example, by being displayed on the internal electronic display 152. Such communication may have the potential to reduce the negative experience of the passenger.
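Such message generation might be sketched as a simple template lookup keyed on the event code and its identified cause; the keys and fallback text here are invented for illustration, and only the hard-brake message mirrors the example quoted above:

```python
# Hypothetical template table mapping (event code, cause) to a message.
MESSAGES = {
    ("HARD_BRAKE", "vehicle_cut_in"): (
        "We had to brake for the vehicle that just cut in from the right; "
        "we apologize for any discomfort this caused."),
}

def explain(event_code: str, cause: str) -> str:
    return MESSAGES.get((event_code, cause),
                        "Sorry for the discomfort during your ride.")

print(explain("HARD_BRAKE", "vehicle_cut_in"))
```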
In another scenario, the second model may be run to determine how well or how poorly a particular ride is going for the passenger. Depending on the data used to train the second model, this may essentially allow the computing device 110 to assess the emotional state of the passenger, and depending on that emotional state (i.e., the overall rating the passenger might give at any given point during the ride), the computing device may take any of the one or more intervening actions mentioned above.
Although the examples herein relate to gathering feedback by receiving explicit user input, such as described with respect to Figs. 8A and 8B, feedback may also be generated in other ways. For example, a camera and/or a biometric (e.g., skin) sensor may be used to monitor the passenger during the ride. Changes in the passenger's expression or biometric readings may be used to estimate the level of discomfort corresponding to a particular event, whether the passenger is experiencing motion sickness, and/or the overall level of satisfaction with a completed ride.
Thus, the features described herein may allow for the detection of discomfort events and the assessment of overall ride quality in both real-world and simulated rides. This, in turn, allows engineering efforts to be prioritized to improve the autonomous vehicle control software and alter ride behavior, increasing rider retention and avoiding passenger loss, and allows intervention to be provided when an uncomfortable event occurs.
Unless otherwise specified, the foregoing alternative examples are not mutually exclusive and may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. Furthermore, the provision of examples described herein and terms such as "such as," "including," and the like, should not be construed to limit claimed subject matter to the specific examples; rather, these examples are intended to illustrate only one of many possible embodiments. In addition, the same reference numbers in different drawings may identify the same or similar elements.
Claims (21)
1. A method of training a model for identifying events that may cause discomfort to an occupant of an autonomous vehicle, the method comprising:
receiving, by one or more server computing devices, first ride data identifying an output from a planner system of an autonomous vehicle and an output from a perception system of the autonomous vehicle, the first ride data being associated with a ride quality value indicative of a level of discomfort identified by a passenger of the autonomous vehicle during a first ride; and
training, by the one or more server computing devices, a model using the first ride data and any associated ride quality values such that the model is configured to output, in response to receiving second ride data for a second ride as input, a list of events occurring in the second ride that may cause discomfort to one or more passengers during the second ride.
2. The method of claim 1, further comprising:
inputting the second ride data for the second ride into the model to determine the list of events; and
flagging the second ride data for review based on the list of events.
3. The method of claim 1, wherein the second ride data corresponds to simulated ride data for a simulated ride generated using autonomous vehicle control software.
4. The method of claim 1, wherein the ride quality value is further associated with a point in time during the first ride.
5. The method of claim 1, wherein the first ride data further comprises context from map information used to control the autonomous vehicle during the first ride.
6. The method of claim 5, wherein the first ride data further comprises information about the locations and types of road users other than vehicles.
7. The method of claim 1, wherein the first ride data further comprises acceleration information for the first ride.
8. The method of claim 7, wherein the first ride data further comprises a derivative of the acceleration information of the autonomous vehicle for the first ride.
9. A method of training a model for evaluating overall ride quality for an autonomous vehicle ride, the method comprising:
receiving, by one or more server computing devices, first ride data identifying an output from a planner system of an autonomous vehicle and an output from a perception system of the autonomous vehicle, the first ride data being associated with a first overall ride quality value indicative of an overall ride quality provided by a passenger of the autonomous vehicle for a first ride; and
training, by the one or more server computing devices, a model using the first ride data and the first overall ride quality value such that the model is configured to output a second overall ride quality value for a second ride in response to receiving second ride data for the second ride as input.
10. The method of claim 9, wherein the second ride data corresponds to simulated ride data for a simulated ride generated using autonomous vehicle control software.
11. The method of claim 9, wherein the first ride data further includes context from map information used to control the autonomous vehicle during the first ride.
12. The method of claim 9, wherein the first ride data further comprises information about the locations and types of road users other than vehicles.
13. The method of claim 9, wherein the first ride data further includes acceleration information of the autonomous vehicle.
14. The method of claim 9, further comprising receiving an output from a second model that identifies a list of events that may cause discomfort to a passenger during the first ride, and wherein the training is further based on the list of events.
15. The method of claim 9, further comprising converting a first ride quality value to the first overall ride quality value, wherein the first ride quality value corresponds to a star rating of the first ride provided by the passenger.
16. The method of claim 9, wherein the model is a convolutional neural network over time, enabling the model to identify relationships between events, when the events occur during a ride, and how those relationships affect overall ride quality.
17. The method of claim 9, further comprising:
inputting the second ride data for the second ride into the model to determine the second overall ride quality value; and
flagging the second ride data for review based on the second overall ride quality value.
18. The method of claim 9, further comprising:
inputting the second ride data for the second ride into the model to determine the second overall ride quality value; and
evaluating ride quality over a plurality of miles driven by the autonomous vehicle using the second overall ride quality value.
19. The method of claim 9, further comprising:
inputting the second ride data for the second ride into the model to determine the second overall ride quality value; and
evaluating ride quality for a particular type of maneuver using the second overall ride quality value.
20. The method of claim 19, wherein the particular type of maneuver comprises a left turn or a right turn.
21. The method of claim 19, wherein the particular type of maneuver comprises a lane change.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862747815P | 2018-10-19 | 2018-10-19 | |
US62/747,815 | 2018-10-19 | ||
US16/214,991 | 2018-12-10 | ||
US16/214,991 US11657318B2 (en) | 2018-10-19 | 2018-12-10 | Assessing ride quality for autonomous vehicles |
PCT/US2019/055161 WO2020081287A1 (en) | 2018-10-19 | 2019-10-08 | Assessing ride quality for autonomous vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112955362A true CN112955362A (en) | 2021-06-11 |
Family
ID=70281010
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980073648.2A Pending CN112955362A (en) | 2018-10-19 | 2019-10-08 | Assessing ride quality of autonomous vehicles |
Country Status (4)
Country | Link |
---|---|
US (1) | US11657318B2 (en) |
EP (1) | EP3849867A4 (en) |
CN (1) | CN112955362A (en) |
WO (1) | WO2020081287A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113320537A (en) * | 2021-07-16 | 2021-08-31 | 北京航迹科技有限公司 | Vehicle control method and system |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11077844B2 (en) * | 2019-04-19 | 2021-08-03 | GM Global Technology Operations LLC | System and method for increasing passenger satisfaction in a vehicle having an automated driving system |
US11869048B2 (en) * | 2019-06-12 | 2024-01-09 | Gm Cruise Holdings Llc | User recommendations and reviews using autonomous vehicle information |
CN111861128B (en) * | 2020-06-20 | 2024-03-22 | 清华大学 | Method, system and storage medium for evaluating connection pipe comfort in man-machine cooperative control process of automatic driving vehicle |
KR20220003744A (en) * | 2020-07-02 | 2022-01-11 | 현대자동차주식회사 | Vehicle control system of autonomous vehicle for reducing motion sickness |
US20210107496A1 (en) * | 2020-12-22 | 2021-04-15 | Fabian Oboril | Passenger discomfort measurement during vehicle maneuver |
US11904886B1 (en) * | 2021-08-13 | 2024-02-20 | Waymo Llc | Modifying autonomous vehicle behavior prediction based on behavior prediction errors |
US11878718B2 (en) * | 2021-08-20 | 2024-01-23 | Ford Global Technologies, Llc | Autonomous vehicle rider drop-off sensory systems and methods |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8457827B1 (en) * | 2012-03-15 | 2013-06-04 | Google Inc. | Modifying behavior of autonomous vehicle based on predicted behavior of other vehicles |
US20170203766A1 (en) * | 2016-01-14 | 2017-07-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Method and system for improving ride quality in an autonomous vehicle |
US20170305434A1 (en) * | 2016-04-26 | 2017-10-26 | Sivalogeswaran Ratnasingam | Dynamic Learning Driving System and Method |
CN107949514A (en) * | 2015-09-08 | 2018-04-20 | 苹果公司 | The comfort level configuration file of autonomous vehicle |
US20180275667A1 (en) * | 2017-03-27 | 2018-09-27 | Uber Technologies, Inc. | Machine Learning for Event Detection and Classification in Autonomous Vehicles |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4237737B2 (en) | 2005-08-04 | 2009-03-11 | 株式会社日本自動車部品総合研究所 | Automatic control device for on-vehicle equipment and vehicle equipped with the device |
JP4946374B2 (en) | 2006-11-13 | 2012-06-06 | トヨタ自動車株式会社 | Self-driving vehicle |
US10107635B2 (en) | 2016-08-19 | 2018-10-23 | Waymo Llc | Method and system for determining and dynamically updating a route and driving style for passenger comfort |
CN110225852B (en) | 2017-01-24 | 2023-01-17 | 福特全球技术公司 | Feedback for autonomous vehicles |
US10449958B2 (en) | 2017-02-15 | 2019-10-22 | Ford Global Technologies, Llc | Feedback-based control model generation for an autonomous vehicle |
US10346888B2 (en) | 2017-06-16 | 2019-07-09 | Uber Technologies, Inc. | Systems and methods to obtain passenger feedback in response to autonomous vehicle driving events |
US10019011B1 (en) | 2017-10-09 | 2018-07-10 | Uber Technologies, Inc. | Autonomous vehicles featuring machine-learned yield model |
US20190225232A1 (en) | 2018-01-23 | 2019-07-25 | Uber Technologies, Inc. | Passenger Experience and Biometric Monitoring in an Autonomous Vehicle |
US10464577B1 (en) | 2018-11-15 | 2019-11-05 | GM Global Technology Operations LLC | Contextual autonomous vehicle support through written interaction |
- 2018-12-10: US application US16/214,991 filed (US11657318B2, status: Active)
- 2019-10-08: CN application CN201980073648.2A filed (CN112955362A, status: Pending)
- 2019-10-08: WO application PCT/US2019/055161 filed (WO2020081287A1)
- 2019-10-08: EP application EP19873103.6A filed (EP3849867A4, status: Pending)
Also Published As
Publication number | Publication date |
---|---|
WO2020081287A1 (en) | 2020-04-23 |
EP3849867A4 (en) | 2022-05-25 |
US11657318B2 (en) | 2023-05-23 |
EP3849867A1 (en) | 2021-07-21 |
US20200125989A1 (en) | 2020-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111480052B (en) | Resolving scene difficulties in vehicle routes using predictive models | |
US11657318B2 (en) | Assessing ride quality for autonomous vehicles | |
KR102365050B1 (en) | Method and system for determining and dynamically updating a route and driving style for passenger comfort | |
KR102425960B1 (en) | Real-time lane change selection for autonomous vehicles | |
US20210070309A1 (en) | Exhaustive Driving Analytical Systems and Modelers | |
US11790131B2 (en) | Simulations with modified agents for testing autonomous vehicle software | |
CN111565990A (en) | Software validation for autonomous vehicles | |
US11774259B2 (en) | Mapping off-road entries for autonomous vehicles | |
US12103542B2 (en) | Extracting agent intent from log data for running log-based simulations for evaluating autonomous vehicle software | |
US20210390225A1 (en) | Realism in log-based simulations | |
CN113692373A (en) | Retention and range analysis for autonomous vehicle services | |
US11851086B2 (en) | Using simulations to identify differences between behaviors of manually-driven and autonomous vehicles | |
US12085935B2 (en) | Open door reconstruction for sensor simulation | |
US20240017741A1 (en) | Validation of trajectory planning for autonomous vehicles | |
US20230391363A1 (en) | User-controlled route selection for autonomous vehicles | |
US20220413510A1 (en) | Targeted driving for autonomous vehicles | |
CN115214625A (en) | Identifying parkable areas of an autonomous vehicle | |
US20230326335A1 (en) | Wrong-way driving modeling | |
EP4120172A1 (en) | Real time event triggered feedback for autonomous vehicles | |
US12030509B1 (en) | Realism in log-based simulations | |
US12036996B2 (en) | Automated method to detect road user frustration due to autonomous vehicle driving behavior | |
US20240046324A1 (en) | Digital token generation and output for riders of autonomous vehicles | |
Ojiaku | Will AI in cars lead to less road congestion? |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||