WO2018237018A1 - Risk processing for vehicles having autonomous driving capabilities - Google Patents
Risk processing for vehicles having autonomous driving capabilities
- Publication number
- WO2018237018A1 (PCT/US2018/038520)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- risk
- vehicle
- sensor signals
- driving
- report
- Prior art date
Classifications
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
- B60W60/0015—Planning or execution of driving tasks specially adapted for safety
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
- B60W10/20—Conjoint control of vehicle sub-units including control of steering systems
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/0953—Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
- B60W30/0956—Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
- B60W30/10—Path keeping
- B60W30/18163—Lane change; Overtaking manoeuvres
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters related to vehicle motion
- B60W50/0097—Predicting future conditions
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- G07C5/008—Registering or indicating the working of vehicles communicating information to a remotely located station
- B60K2360/652—Crash protection features
- B60W2050/0005—Processor details or data handling, e.g. memory registers or chip architecture
- B60W2050/0064—Manual parameter input using a remote, e.g. cordless, transmitter or receiver unit, e.g. remote keypad or mobile phone
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2050/143—Alarm means
- B60W2520/06—Direction of travel
- B60W2520/10—Longitudinal speed
- B60W2540/30—Driving style
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/50—Barriers
- B60W2554/4041—Characteristics of dynamic objects: position
- B60W2554/4042—Characteristics of dynamic objects: longitudinal speed
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2555/60—Traffic rules, e.g. speed limits or right of way
- B60W2556/10—Historical data
- B60W2720/106—Longitudinal acceleration
Definitions
- This description relates to systems and methods for processing risk for vehicles having autonomous driving capabilities.
- A vehicle having autonomous driving capabilities may encounter risks while driving on a road. Such risks can involve, for example, a pedestrian suddenly crossing the street in front of the vehicle such that impact with the pedestrian may be difficult to avoid. Such risks can also involve the possibility of a collision with another vehicle on the road, or the possibility of accidents in adverse driving conditions, such as rain or snow, among others.
- In one aspect, sensor signals are received using a vehicle comprising an autonomous driving capability.
- A risk associated with operating the vehicle is identified based on the sensor signals.
- The autonomous driving capability is modified in response to the risk.
- The operation of the vehicle is updated based on the modifying of the autonomous capability; a minimal sketch of this loop follows below.
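The claimed pipeline (receive signals, identify a risk, modify the capability, update operation) can be illustrated with a short control-loop sketch. All names here (`Risk`, `identify_risk`, `risk_processing_step`, the distance threshold) are hypothetical illustrations, not terms used by the application:

```python
# Minimal sketch of the claimed pipeline, under assumed names and thresholds.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Risk:
    kind: str        # e.g., "collision", "risky_substance"
    severity: float  # 0.0 (negligible) .. 1.0 (critical)

def identify_risk(sensor_signals: dict) -> Optional[Risk]:
    """Analyze sensor signals; return a Risk if one is detected or predicted."""
    if sensor_signals.get("min_obstacle_distance_m", float("inf")) < 5.0:
        return Risk(kind="collision", severity=0.9)
    return None

def risk_processing_step(sensor_signals: dict, capability: dict) -> dict:
    risk = identify_risk(sensor_signals)
    if risk is not None:
        # Modify the autonomous driving capability in response to the risk
        # (here: cap speed); the updated operation follows from the new cap.
        capability["max_speed_mps"] = min(capability["max_speed_mps"],
                                          5.0 * (1.0 - risk.severity))
    return capability

capability = {"max_speed_mps": 15.0}
capability = risk_processing_step({"min_obstacle_distance_m": 3.2}, capability)
print(capability)  # max speed reduced in response to the identified risk
```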
- Some embodiments of identifying a risk may include detecting or predicting the risk.
- The identifying may include analyzing the sensor signals, or known or predicted risks, or both.
- The identifying may include analyzing the sensor signals to determine a position of an object.
- The identifying may include analyzing the sensor signals to evaluate a speed of an object.
- The identifying may include analyzing the sensor signals to evaluate speed profiles of two or more objects over time.
- The identifying may include analyzing the sensor signals to identify a boundary of an object.
- The identifying may include analyzing the sensor signals to identify overlapping boundaries of two or more objects.
- The identifying may include analyzing the sensor signals to determine a concentration of a chemical.
- The identifying may include analyzing the sensor signals to segment one or more objects in an image or a video.
- The identifying may include tracking the segmented one or more objects.
- The identifying may include assessing a threat.
- The identifying may include learning a pattern of known risks.
- The pattern may include associations of the known risks with one or more of objects, times, road configurations, or geolocations.
- Updating the operation of the vehicle may include executing a lane change by a motion planning system of the vehicle. Updating the operation of the vehicle may include executing a trajectory change by a motion planning system of the vehicle. Responding to the risk may include treating analyzed sensor signals as prior information for a perception system of the vehicle. Responding to the risk may include invoking intervention in an operation of the vehicle by a remote operator. Updating the operation of the vehicle may include generating and inserting new machine instructions into existing machine instructions of an operation of the vehicle.
- Responding to the risk may include generating an input to an operation of the vehicle.
- The method may include generating a report of a risk.
- Generating a report of a risk may include extracting and aggregating information associated with the risk from sensor signals. Extracting and aggregating information may include evaluating one or more of the following: overlapping geographic zones, overlapping time periods, or report frequencies (see the aggregation sketch following this list).
- Generating a report of a risk may include recording an environment of the risk.
- Generating a report of a risk may include stitching images or videos to form a view of the risk.
- Generating a report of the risk may include removing private information associated with the risk.
- Generating a report of a risk may include providing an interface to allow an interface user to provide information associated with the risk.
- Generating a report of a risk may include integrating two or more reports associated with the risk.
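A sketch of the extract-and-aggregate logic follows. The criteria (overlapping geographic zones and overlapping time periods) come from the text; the data layout, the ~100 m zone radius, and the greedy clustering are assumptions of this sketch. Report frequency could then be read off as the size of each cluster:

```python
# Hedged sketch: group risk reports that plausibly describe the same event,
# using overlapping geographic zones and overlapping time periods.
from dataclasses import dataclass

@dataclass
class RiskReport:
    lat: float
    lon: float
    start_s: float   # start of observation window (epoch seconds)
    end_s: float     # end of observation window

def zones_overlap(a: RiskReport, b: RiskReport, radius_deg: float = 0.001) -> bool:
    # Crude "overlapping zone" test: reports within roughly 100 m of each other.
    return abs(a.lat - b.lat) < radius_deg and abs(a.lon - b.lon) < radius_deg

def times_overlap(a: RiskReport, b: RiskReport) -> bool:
    return a.start_s <= b.end_s and b.start_s <= a.end_s

def aggregate(reports: list[RiskReport]) -> list[list[RiskReport]]:
    """Greedy aggregation: each cluster holds reports of one apparent risk."""
    clusters: list[list[RiskReport]] = []
    for r in reports:
        for cluster in clusters:
            if any(zones_overlap(r, c) and times_overlap(r, c) for c in cluster):
                cluster.append(r)
                break
        else:
            clusters.append([r])
    return clusters
```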
- One embodiment of the method may include receiving a report of a risk from a remote data source.
- One embodiment of the method may include assessing a risk factor of the vehicle. Assessing a risk factor may include determining a risk associated with passing through a risky region.
- Assessing a risk factor may include determining a risk associated with a driving distance or a driving time period. Assessing a risk factor may include determining a risk associated with an aging component of the vehicle. Assessing a risk factor may include determining a risk associated with an inactive autonomous driving capability. Assessing a risk factor may include determining a risk based on a profile of a user of the vehicle. Assessing a risk factor may include determining a risk based on a social network of a user of the vehicle. Assessing a risk factor may include determining a risk associated with a driving behavior. Assessing a risk factor may include determining a risk associated with following traffic rules. One way such factors could be combined into a single score is sketched below.
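One plausible reading of "assessing a risk factor" is a weighted combination of the enumerated factors. The factor names mirror the list above, but the weights and the weighted-sum form are illustrative assumptions, not values given in the application:

```python
# Hedged sketch: combine the enumerated risk factors into one scalar score.
# Weights are illustrative; a deployed system would calibrate them on data.
RISK_FACTOR_WEIGHTS = {
    "risky_region": 0.30,             # passing through a risky region
    "driving_distance": 0.10,         # long distance or long driving period
    "aging_component": 0.20,          # worn vehicle component
    "inactive_autonomy": 0.15,        # autonomous capability switched off
    "user_profile": 0.10,             # profile of the vehicle's user
    "driving_behavior": 0.10,         # observed driving behavior
    "traffic_rule_violations": 0.05,  # adherence to traffic rules
}

def assess_risk_factor(factors: dict) -> float:
    """Weighted sum of per-factor scores, each clamped to [0, 1]."""
    return sum(RISK_FACTOR_WEIGHTS[name] * min(max(score, 0.0), 1.0)
               for name, score in factors.items()
               if name in RISK_FACTOR_WEIGHTS)

score = assess_risk_factor({"risky_region": 0.8, "aging_component": 0.4})
print(score)  # 0.32 under these illustrative weights
```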
- In one aspect, a vehicle having autonomous driving capabilities includes steering, acceleration, and deceleration devices that respond to control signals from a driving control system to drive the vehicle autonomously on a road network.
- The vehicle also includes a monitoring element on the vehicle that receives sensor signals and identifies a risk associated with operating the vehicle based on the sensor signals.
- The vehicle further includes a controller that responds to the risk by configuring the driving control system to modify the autonomous driving capability and, based on the modified capability, update an operation of the vehicle to maneuver the vehicle to a goal location.
- One embodiment of identifying a risk may include detecting or predicting the risk.
- The identifying may include analyzing sensor signals, or known or predicted risks, or both.
- The identifying may include analyzing the sensor signals to determine a position of an object.
- The identifying may include analyzing the sensor signals to evaluate a speed of an object.
- The identifying may include analyzing the sensor signals to evaluate speed profiles of two or more objects over time.
- The identifying may include analyzing the sensor signals to identify a boundary of an object.
- The identifying may include analyzing the sensor signals to identify overlapping boundaries of two or more objects.
- The identifying may include analyzing the sensor signals to determine a concentration of a chemical.
- The identifying may include analyzing the sensor signals to segment one or more objects in an image or a video.
- The identifying may include tracking the segmented one or more objects.
- The identifying may include assessing a threat.
- The identifying may include learning a pattern of known risks.
- The pattern may include associations of the known risks with one or more of objects, times, road configurations, or geolocations.
- Updating the operation of the vehicle may include executing a lane change by a motion planning system of the vehicle. Updating the operation of the vehicle may include executing a trajectory change by a motion planning system of the vehicle.
- Responding to the risk may include treating analyzed sensor signals as prior information for a perception system of the vehicle. Responding to the risk may include invoking intervention in an operation of the vehicle by a remote operator. Updating the operation of the vehicle may include generating and inserting new machine instructions into existing machine instructions of an operation of the vehicle. Responding to the risk may include generating an input to an operation of the vehicle.
- The vehicle may include a reporting element to generate a report of a risk.
- Generating a report of a risk may include extracting and aggregating information associated with the risk from sensor signals. Extracting and aggregating information may include evaluating one or more of the following: overlapping geographic zones, overlapping time periods, or report frequencies.
- Generating a report of a risk may include recording an environment of the risk.
- Generating a report of a risk may include stitching images or videos to form a view of the risk.
- Generating a report of the risk may include removing private information associated with the risk.
- Generating a report of a risk may include providing an interface to allow an interface user to provide information associated with the risk.
- Generating a report of a risk may include integrating two or more reports associated with the risk.
- The reporting element may receive a report of a risk from a remote data source.
- One embodiment of the vehicle may include assessing a risk factor of the vehicle. Assessing a risk factor may include determining a risk associated with passing through a risky region.
- Assessing a risk factor may include determining a risk associated with a driving distance or a driving time period. Assessing a risk factor may include determining a risk associated with an aging component of the vehicle. Assessing a risk factor may include determining a risk associated with an inactive autonomous driving capability. Assessing a risk factor may include determining a risk based on a profile of a user of the vehicle. Assessing a risk factor may include determining a risk based on a social network of a user of the vehicle. Assessing a risk factor may include determining a risk associated with a driving behavior. Assessing a risk factor may include determining a risk associated with following traffic rules.
- In one aspect, an apparatus includes a processor configured to process data to identify a risk of driving a vehicle comprising an autonomous driving capability.
- The processor is also configured to modify the autonomous driving capability in response to the risk, and to update operation of the vehicle based on the modified capability.
- The apparatus also includes an alarm configured to issue an alert of the identified risk.
- Identifying a risk may include detecting or predicting the risk.
- The data may include sensor signals, or known or predicted risks, or both.
- The identifying may include analyzing the sensor signals to determine a position of an object.
- The identifying may include analyzing the sensor signals to evaluate a speed of an object.
- The identifying may include analyzing the sensor signals to evaluate speed profiles of two or more objects over time.
- The identifying may include analyzing the sensor signals to identify a boundary of an object.
- The identifying may include analyzing the sensor signals to identify overlapping boundaries of two or more objects.
- The identifying may include analyzing the sensor signals to determine a concentration of a chemical.
- The identifying may include analyzing the sensor signals to segment one or more objects in an image or a video.
- The identifying may include tracking the segmented one or more objects.
- The identifying may include assessing a threat.
- The identifying may include evaluating a driving behavior of the vehicle or another vehicle. Evaluating a driving behavior may include evaluating a speed, a heading, a trajectory, a vehicular operation, or combinations of them.
- The identifying may include learning a pattern of known risks.
- The pattern may include associations of the known risks with one or more of objects, times, road configurations, or geolocations.
- Updating the operation of the vehicle may include executing a lane change by a motion planning system of the vehicle. Updating the operation of the vehicle may include executing a trajectory change by a motion planning system of the vehicle.
- Responding to the risk may include treating analyzed sensor signals as prior information for a perception system of the vehicle. Responding to the risk may include invoking intervention in an operation of the vehicle by a remote operator. Updating the operation of the vehicle may include generating and inserting new machine instructions into existing machine instructions of an operation of the vehicle. Responding to the risk may include generating an input to an operation of the vehicle.
- The apparatus may include a processor configured to generate a report of a risk. Generating a report of a risk may include extracting and aggregating information associated with the risk from sensor signals. Extracting and aggregating information may include evaluating one or more of the following: overlapping geographic zones, overlapping time periods, or report frequencies.
- Generating a report of a risk may include recording an environment of the risk.
- Generating a report of a risk may include stitching images or videos to form a view of the risk.
- Generating a report of the risk may include removing private information associated with the risk.
- Generating a report of a risk may include providing an interface to allow an interface user to provide information associated with the risk.
- Generating a report of a risk may include integrating two or more reports associated with the risk.
- The apparatus may include a processor configured to receive a report of a risk from a remote data source.
- One embodiment of the apparatus may include a processor configured to assess a risk factor of the vehicle. Assessing a risk factor may include determining a risk associated with passing through a risky region. Assessing a risk factor may include determining a risk associated with a driving distance or a driving time period. Assessing a risk factor may include determining a risk associated with an aging component of the vehicle. Assessing a risk factor may include determining a risk associated with an inactive autonomous driving capability. Assessing a risk factor may include determining a risk based on a profile of a user of the vehicle. Assessing a risk factor may include determining a risk based on a social network of a user of the vehicle. Assessing a risk factor may include determining a risk associated with a driving behavior. Assessing a risk factor may include determining a risk associated with following traffic rules.
- FIG. 1 illustrates an example of an autonomous vehicle having autonomous capability.
- FIGs. 2-4 illustrate examples of architectures of risk processing systems.
- FIG. 5 illustrates an example of an autonomous vehicle detecting a collision in its vicinity while driving on a road.
- FIG. 6A illustrates an example of object detection by an autonomous vehicle by encoding information about an elevation profile of the road surface.
- FIGs. 6B and 6C illustrate examples of a risk monitoring process monitoring driving behaviors.
- FIGs. 7-10 illustrate examples of a risk processing system.
- FIG. 11 illustrates an example of an interface of a risk processing system.
- FIG. 12 illustrates an exemplary "cloud" computing environment.
- FIG. 13 illustrates an example of a computer system.
- FIG. 1 illustrates an example of an autonomous vehicle 100 having autonomous capability.
- "Autonomous capability" refers to a function, feature, or facility that enables a vehicle to be operated without real-time human intervention, unless specifically requested by the vehicle.
- An autonomous vehicle (AV) is a vehicle that possesses autonomous capability.
- "Vehicle" includes means of transportation of goods or people; vehicles can be cars, buses, trains, airplanes, drones, trucks, boats, ships, submersibles, or dirigibles, among others.
- A driverless car is an example of an AV.
- The term "trajectory" refers to a path or route generated by an AV to navigate from a first spatio-temporal location to a second spatio-temporal location. The first spatio-temporal location is referred to as the initial or starting location, and the second spatio-temporal location is referred to as the goal or goal position.
- The spatio-temporal locations correspond to real-world locations, and include pickup or drop-off locations for picking up or dropping off persons or goods; a minimal data-structure reading of these terms is sketched below.
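For illustration only, the "trajectory" and "spatio-temporal location" definitions above could be modeled as the following data structures. The field names (`lat`, `lon`, `time_s`, `waypoints`) are assumptions of this sketch, not terms used by the application:

```python
# Hedged sketch of the "trajectory" definition: an ordered path from an
# initial spatio-temporal location to a goal location.
from dataclasses import dataclass

@dataclass
class SpatioTemporalLocation:
    lat: float
    lon: float
    time_s: float  # time at which the AV is (expected to be) at this point

@dataclass
class Trajectory:
    waypoints: list[SpatioTemporalLocation]

    @property
    def initial(self) -> SpatioTemporalLocation:  # starting location
        return self.waypoints[0]

    @property
    def goal(self) -> SpatioTemporalLocation:     # goal position
        return self.waypoints[-1]
```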
- "Risk processing" refers to accumulating information about a risk, predicting a risk, monitoring a risk, analyzing a risk, reacting to a risk, assessing factors associated with a risk, or any combination of the above.
- "Risk processing system" refers to any kind of hardware, software, firmware, computer, or device of any kind, or a combination of two or more of them, that performs risk processing.
- "One or more" includes a function being performed by one element, a function being performed by more than one element (e.g., in a distributed fashion), several functions being performed by one element, several functions being performed by several elements, or any combination of the above.
- Although the terms "first," "second," etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact.
- The phrase "if it is determined" or "if [a stated condition or event] is detected" is, optionally, construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
- An AV system refers to the AV along with the array of hardware, software, stored data, and data generated in real time that supports the operation of the AV.
- In some embodiments, the AV system is incorporated within the AV; in other embodiments, the AV system may be spread across several locations.
- For example, some of the software of the AV system may be implemented in a cloud computing environment similar to the cloud computing environment 1200 described below with respect to FIG. 12.
- This document describes technologies applicable to any vehicle that has one or more autonomous capabilities, including fully autonomous vehicles, highly autonomous vehicles, and conditionally autonomous vehicles, such as so-called Level 5, Level 4, and Level 3 vehicles, respectively (see SAE International's standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, which is incorporated by reference in its entirety, for more details on the classification of levels of autonomy in vehicles). Vehicles with autonomous capabilities may attempt to control the steering or speed of the vehicle.
- The technologies described in this document can also be applied to partially autonomous vehicles and driver-assisted vehicles, such as so-called Level 2 and Level 1 vehicles (see the same SAE J3016 standard).
- Level 1, 2, 3, 4, and 5 vehicle systems may automate certain vehicle operations (e.g., steering, braking, and using maps) under certain operating conditions based on processing of sensor inputs.
- The technologies described in this document can benefit vehicles at any level, ranging from fully autonomous vehicles to human-operated vehicles.
- Headings are provided for improved readability. Although headings are provided, information related to a particular heading but not found in the section having that heading may also be found elsewhere in the specification.
- An AV system 120 operates the AV 100 autonomously or semi-autonomously along a trajectory 198 through an environment 190 to a goal location 199 while avoiding objects (e.g., natural obstructions 191, vehicles 193, pedestrians 192, cyclists, and other obstacles) and obeying rules of the road (e.g., rules of operation or driving preferences).
- The AV system 120 includes devices 101 that are instrumented to receive and act on operational commands from the computer processors 146.
- Computing processors 146 are similar to the processor 1304 described below in reference to FIG. 13. Examples of devices 101 include a steering control 102, brakes 103, gears, accelerator pedal, windshield wipers, side-door locks, window controls, and turn indicators.
- The AV system 120 includes sensors 121 for measuring or inferring properties of the state or condition of the AV 100, such as the AV's position, linear and angular velocity and acceleration, and heading (e.g., an orientation of the leading end of AV 100).
- Examples of sensors 121 include GPS, inertial measurement units (IMU), wheel speed sensors for measuring or estimating wheel slip ratios, wheel brake pressure or braking torque sensors, monocular or stereo video cameras 122 in the visible light, infrared, or thermal (or both) spectra, LiDAR 123, radar, ultrasonic sensors, time-of-flight (TOF) depth sensors, speed sensors, temperature sensors, humidity sensors, and precipitation sensors.
- The AV system 120 includes a data storage unit 142 and memory 144 for storing machine instructions associated with computer processors 146 or data collected by sensors 121.
- The data storage unit 142 is similar to the ROM 1308 or storage device 1310 described below in relation to FIG. 13, and memory 144 is similar to the main memory 1306 described below.
- The data storage unit 142 and memory 144 store historical, real-time, and/or predictive information about the environment 190, such as maps, driving performance, traffic congestion updates, or weather conditions.
- Data relating to the environment 190 is transmitted to the AV 100 via a communications channel from a remotely located database 134.
- The AV system 120 includes communications devices 140, for example Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication devices. The communications devices 140 communicate across the electromagnetic spectrum (including radio and optical communications) or other media (e.g., air and acoustic media).
- A combination of Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication is sometimes referred to as Vehicle-to-Everything (V2X) communication. V2X communication typically conforms to one or more communications standards for communication with, between, and among autonomous vehicles.
- The communication devices 140 include communication interfaces, for example wired, wireless, WiMAX, Wi-Fi, Bluetooth, satellite, cellular, optical, near-field, infrared, or radio interfaces.
- The communication interfaces transmit data from a remotely located database 134 to the AV system 120.
- The remotely located database 134 is embedded in a cloud computing environment 1200 as described in FIG. 12.
- The communication interfaces 140 transmit data collected from sensors 121 or other data related to the operation of AV 100 to the remotely located database 134.
- Communication interfaces 140 also transmit information that relates to teleoperations to the AV 100.
- The AV 100 communicates with other remote (e.g., "cloud") servers 136.
- The remotely located database 134 also stores and transmits digital data (e.g., data such as road and street locations). Such data may be stored in the memory 144 on the AV 100, or transmitted to the AV 100 via a communications channel from the remotely located database 134.
- The remotely located database 134 stores and transmits historical information about driving properties (e.g., speed and acceleration profiles) of vehicles that have previously traveled along trajectory 198 at similar times of day. Such data may be stored in the memory 144 on the AV 100, or transmitted to the AV 100 via a communications channel from the remotely located database 134.
- Computing devices 146 located on the AV 100 algorithmically generate control actions based on both real-time sensor data and prior information, allowing the AV system 120 to execute its autonomous driving capabilities.
- The AV system 120 may include computer peripherals 132 coupled to computing devices 146 for providing information and alerts to, and receiving input from, a user (e.g., an occupant or a remote user) of the AV 100.
- Peripherals 132 are similar to the display 1312, input device 1314, and cursor controller 1316 discussed below in reference to FIG. 13.
- The coupling may be wireless or wired. Any two or more of the interface devices may be integrated into a single device.
- FIGs. 2-4 illustrate examples of architectures of risk processing systems.
- A risk processing system 230 includes the following elements:
- A risk processing client 201 realized by integrated circuits, field-programmable gate arrays, hardware, software, or firmware, or a combination of two or more of the above.
- The risk processing client 201 is installed on the AV system 200.
- The risk processing client 201 may interact with components of the AV system 200 (e.g., sensors 216 and 218, communication devices 210, user interface devices, memory 212, a processor 214, a database 220, or functional devices, or combinations of them).
- The risk processing client 201 sends and receives information and commands.
- The risk processing client 201 communicates via a communication device 210 (which may be at least partly wireless) with a risk processing server 231.
- In one embodiment, the communication device 210 is a communication interface.
- The risk processing client 252 is installed on a mobile device 250.
- The risk processing client 252 may utilize signals collected by the sensors of the mobile device 250, such as GPS sensors, cameras, accelerometers, gyroscopes, and barometers.
- The risk processing client 252 can communicate with a risk processing server 231 over a communication network.
- The risk processing client 201 may be installed on a combination of the AV system (in particular, on the AV itself) and a mobile device 250.
- A risk processing server 231 may be on board the AV of the AV system 200 or in a location remote from the AV, for example, at least 0.1, 1, 2, 3, 4, 5, 10, 20, 30, 40, 50, 100, 200, 300, 400, 500, 600, 700, 900, or 1000 meters away from the AV of the AV system 200.
- A graphical user interface 232 may be presented by the risk processing client 201, or the risk processing server 231, or both. Embodiments may present on the interface 232 information about one or more of the following: risk factors, a known risk, an active risk, a present risk, a potential risk, a road network, a condition of the AV of the AV system 200, an environment of the AV of the AV system 200, or sensor signals, among other things.
- A risk processing client 311 may communicate with two or more risk processing servers 321, 322, and 323.
- Two or more servers (e.g., 321 and 322) receive and aggregate information for risk processing or for presentation on an interface 332.
- A server (e.g., 323) may receive risk information from two or more risk processing clients 311 and 312, which are installed, for example, on different AV systems 301 and 302, respectively.
- In some embodiments, a risk processing client (e.g., 312) is configured as a server to receive and aggregate information from one or more other risk processing clients (e.g., 311 or 313, or both).
- A risk processing client (e.g., 312) may play a role as a relaying device for establishing and maintaining communication between a server 323 and another client 311.
- A risk processing system 400 may communicate with one or more sensors (e.g., 402 and 404) to collect signals via a communication interface 410.
- Sensor signals, or existing data, or both may be stored in memory 422 or a database 424 or both.
- A database 424 may be onboard the AV, remote from it, or both.
- The database 424 may store data from sensors, government agencies, police stations, or insurance companies, or combinations of them. Examples of the data stored in the database 424 include timestamps, time windows, peak traffic, weather, maps, street light settings, traffic sign settings, traffic light settings, road configurations, addresses, normal vehicle operator behaviors, abnormal vehicle operator behaviors, normal AV operations, abnormal AV operations, traffic due to social events, traffic due to sport events, locations of hospitals, locations of police stations, locations of fire stations, known risks along a trajectory, predicted risks along a trajectory, characteristics (for example, age, gender, ethnicity, or socio-economic status) of individuals involved in high-risk situations along a trajectory, characteristics (for example, color, make, model, or engine type) of vehicles involved in high-risk situations along a trajectory, values of insurance claims filed or processed for high-risk situations along a trajectory, and costs of repairs associated with high-risk situations along a trajectory.
- Processing and analyzing the signals and data may be realized by a processor 420, or a computing resource of the processor 420.
- In one embodiment, a risk processing system includes a risk monitoring process 432 to predict potential risks or detect existing risks in the environment of the AV system.
- In one embodiment, a risk processing system includes a risk reporting process 434 to report predicted or detected risks.
- In one embodiment, a risk processing system includes a risk reaction process 436 to configure the AV system to take suitable actions when a risk is predicted or detected.
- The risk reaction process 436 includes or communicates with a teleoperation system 442 to allow a remote operator to operate AV 502 in response to a risk.
- In one embodiment, a risk processing system includes a risk factor assessing process 438 to evaluate risk factors affecting the on-road operation of AV 502.
- In one embodiment, a risk processing system includes a report integration process 440 to integrate two or more risk reports.
- A risk monitoring process identifies risks by monitoring an environment near the AV, an operation of the AV system, or the interior of the AV.
- FIG. 5 illustrates an example of an autonomous vehicle AV 502 detecting a collision in its vicinity while driving on a road.
- AV 502 upon analyzing signals from sensors (e.g., a vision sensor, a lidar or a radar, or combinations of them) AV 502 produces information about other objects (e.g., vehicles 512 and 514, infrastructure, and pedestrians) in the environment; examples of such information include: locations, speeds, orientations, boundaries, sizes, dimensions, status of traffic lights (for example, red, green, amber, malfunction, etc.), information related to manufacturers, plate numbers, owners, drivers, and operational state of other vehicles on the road along with AV 502. The information is analyzed by risk processing server to predict a potential collision or detect an existing collision.
- sensors e.g., a vision sensor, a lidar or a radar, or combinations of them
- AV 502 produces information about other objects (e.g., vehicles 512 and 514, infrastructure, and pedestrians) in the environment; examples of such information include: locations
- A sensor (e.g., lidar) emits a wave beam (e.g., an electromagnetic beam, or an acoustic beam, or both), and a returning beam component can indicate a boundary point of an object.
- When a sensor emits a scan comprising M beams, each of which produces N returning beam components, a cloud of M×N points (also called a point cloud) is acquired; a small sketch of assembling such a cloud follows.
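As a concrete illustration of the M×N point cloud described above, the sketch below assembles one from synthetic range data. The array shapes, the 2-D (range, azimuth) geometry, and the stand-in random ranges are assumptions of this sketch:

```python
# Hedged sketch: assemble a point cloud from a scan of M beams, each beam
# yielding N returning components, giving M*N candidate boundary points.
import numpy as np

M, N = 64, 2                                          # 64 beams, 2 returns each
ranges = np.random.uniform(1.0, 60.0, size=(M, N))    # stand-in sensor data (m)
azimuths = np.linspace(0.0, 2.0 * np.pi, M, endpoint=False)

# Convert (range, azimuth) pairs to 2-D Cartesian points; a real lidar
# would also use beam elevation to produce 3-D points.
xs = ranges * np.cos(azimuths)[:, None]
ys = ranges * np.sin(azimuths)[:, None]
point_cloud = np.stack([xs, ys], axis=-1).reshape(M * N, 2)
assert point_cloud.shape == (M * N, 2)                # the "cloud of M x N points"
```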
- FIG. 6 A illustrates an example of object detection by an autonomous vehicle AV system 601 by encoding information about an elevation profile of the road surface.
- a map used by the AV system 601 may encode information about an elevation profile of the road surface 600. This information can be used to classify a given point as belonging to the road surface as follows.
- In one embodiment, segmentation is applied to an image, including depth information, from a vision sensor (e.g., a stereo camera) to identify a background region, or a foreground object, or both. Segmentation results may be used alone or be integrated with map information (e.g., projected onto a map) for classifying points in a point cloud.
- Given information about the current position and orientation of the AV system 601 and the position of the sensor 603, the AV system 601 derives a point (e.g., 609, 610, 611 or 612) at which an emitted beam (e.g., 605, 606, 607, or 608) is expected to encounter the ground. If a point (613) returned by a beam (608) is closer to the AV 601 than the expected point (612) by a predefined difference, then the beam is determined to have encountered an object 602 (e.g., a point on the foreground), and the point 613 is classified as a boundary point of the object 602.
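- As an illustration only, the following Python sketch shows one way the expected-ground-point test could be implemented; the function name, the fixed range margin, and the flat-ground simplification (a real system would query the map's elevation profile) are hypothetical and not specified by this description:

```python
import math

def classify_return(sensor_height_m, beam_elevation_rad, returned_range_m,
                    range_margin_m=0.5):
    """Classify a lidar return as a road-surface point or an object
    boundary point, assuming locally flat ground below the sensor."""
    if beam_elevation_rad >= 0:
        # An upward or level beam is never expected to reach the ground.
        return "no ground intersection expected"
    # Range at which a downward beam should encounter flat ground.
    expected_range_m = sensor_height_m / math.sin(-beam_elevation_rad)
    # A return significantly closer than the expected ground point means
    # the beam encountered a foreground object (e.g., object 602).
    if returned_range_m < expected_range_m - range_margin_m:
        return "object boundary point"
    return "road surface point"

# Sensor mounted 1.8 m above the road, beam tilted 10 degrees downward.
print(classify_return(1.8, math.radians(-10), 6.0))   # object boundary point
print(classify_return(1.8, math.radians(-10), 10.4))  # road surface point
```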
- In one embodiment, machine learning (for example, deep learning) is used to perform foreground classification.
- Such an approach fuses data from multiple sensors (such as lidar, radar and camera) to improve the classification accuracy.
- the risk monitoring process tracks the boundary and determines a speed of the object.
- In one embodiment, using a speed sensor (e.g., based on radar) and the tracked boundaries of the objects at time t, the risk monitoring process can predict their positions and boundary locations at time t+1. When the predicted boundaries of two objects overlap, a collision is predicted.
- When a risk monitoring process detects that the boundaries of objects 512 and 514 overlap and the speeds of the objects drop swiftly to zero, a collision is detected.
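- A minimal sketch of this boundary-overlap test, assuming axis-aligned bounding boxes and a constant-velocity motion model (both simplifications, not specified by this description):

```python
def predict_box(box, velocity, dt=1.0):
    """Advance an axis-aligned bounding box (xmin, ymin, xmax, ymax)
    by a constant-velocity motion model."""
    xmin, ymin, xmax, ymax = box
    vx, vy = velocity
    return (xmin + vx * dt, ymin + vy * dt, xmax + vx * dt, ymax + vy * dt)

def boxes_overlap(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def assess_collision(box_a, vel_a, box_b, vel_b, dt=1.0, stop_speed=0.3):
    """Predict a collision from boundary overlap at t+1, or detect an
    existing collision from overlapping boundaries and near-zero speeds."""
    speed = lambda v: (v[0] ** 2 + v[1] ** 2) ** 0.5
    if (boxes_overlap(box_a, box_b)
            and speed(vel_a) < stop_speed and speed(vel_b) < stop_speed):
        return "collision detected"
    if boxes_overlap(predict_box(box_a, vel_a, dt), predict_box(box_b, vel_b, dt)):
        return "collision predicted"
    return "no collision"

# One vehicle heading east at 12 m/s, another stationary 10 m ahead.
print(assess_collision((0, 0, 4, 2), (12, 0), (14, 0, 18, 2), (0, 0)))
# -> collision predicted
```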
- the AV system 502 may use sensor signals (e.g., lidar, radar, images, GPS, or information in vehicle-to-vehicle signals, or combinations of them) to determine the locations of the objects 512 and 514, and detect a collision that occurred before the AV 502 arrives in the vicinity of the collision between the objects 512 and 514.
- the AV system 502 may use sensor signals to determine shapes, boundaries, sizes, or dimensions of the objects 512 and 514, or combinations of them.
- segmentation on an image may identify locations and boundaries of the objects 512 and 514.
- the AV system 502 may use traffic information (e.g., traffic volume and flow) to infer a collision.
- the AV system 502 may measure the traffic flow to determine if the traffic flow is in an abnormal condition. For instance, objects 512 and 514 involved in a collision have zero speed, and the traffic 532 behind the objects 512 and 514 is slow or congested, but the traffic 534 in front of the objects 512 and 514 is faster.
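- One possible encoding of this abnormal-flow inference, with hypothetical speed thresholds:

```python
def infer_collision_from_flow(speeds_behind_mps, speeds_ahead_mps,
                              congested_mps=2.0, free_flow_mps=8.0):
    """Flag a likely collision zone when traffic behind a point is
    congested while traffic ahead of the same point flows freely."""
    mean = lambda xs: sum(xs) / len(xs)
    behind, ahead = mean(speeds_behind_mps), mean(speeds_ahead_mps)
    return behind < congested_mps and ahead > free_flow_mps

# Traffic 532 behind the objects crawls; traffic 534 ahead moves faster.
print(infer_collision_from_flow([0.5, 1.0, 0.8], [10.0, 12.5, 11.2]))  # True
```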
- Risky substances. By analyzing signals from sensors (e.g., a smoke sensor, a chemical sensor, a temperature sensor, a flame sensor, a fire sensor, a radioactivity sensor, or combinations of them), the risk monitoring process may detect a fire, flame 522, smoke, or radioactivity in the environment of the AV 502.
- the risk monitoring process may include one or more sensors (e.g., chemical sensors, radar, vision sensors, optical sensors, and infrared sensors) or may have access to signals of the sensors.
- Sensor signals may provide information about the chemical composition of the AV's environment, such as a concentration of a certain chemical species or combinations of them (e.g., carbon monoxide, carbon dioxide, composition C, sulfides, explosives, and toxic chemicals).
- sensor signals may provide information about a shape of a risky object (e.g., a gun, a bomb, or a grenade).
- the risk monitoring process analyzes sensor signals based on a pattern recognition algorithm to predict or detect presence of a risky substance, or a source of a risky substance, or both. For instance, when a fire (e.g., 522 in FIG. 5) exists, the air in the environment may contain substances whose concentrations deviate from normal values, and the risk monitoring process may compute a likelihood of the presence of the fire 522.
- the AV system may analyze distributions of the risky substances in a space over time to determine a source of a risk; for example, concentrations near the source are higher than those in a distant location.
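- A simple sketch of such source localization, using a concentration-weighted centroid over spatial sensor samples (one of many possible estimators; the coordinates and readings below are illustrative):

```python
def locate_source(readings):
    """Estimate a risk source as the concentration-weighted centroid of
    spatial sensor readings: concentrations near the source are higher
    than those at distant locations.

    readings: list of ((x, y), concentration) samples.
    """
    total = sum(c for _, c in readings)
    if total == 0:
        return None  # no substance detected anywhere
    x = sum(px * c for (px, _), c in readings) / total
    y = sum(py * c for (_, py), c in readings) / total
    return (x, y)

# Higher readings cluster near (1, 1), suggesting the source is nearby.
samples = [((0, 0), 5.0), ((1, 1), 40.0), ((2, 2), 8.0), ((3, 0), 2.0)]
print(locate_source(samples))  # approximately (1.13, 1.02)
```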
- the risk monitoring process may monitor an interior of the AV to detect presence of risky substances, such as explosives, combustibles, poisonous gases, and flammables.
- the AV may detect increased concentration of carbon dioxide due to the AV being located in an enclosed area.
- the risk monitoring process analyzes sensor signals and data to assess threats. For example, the risk monitoring process detects an abnormal concentration of a chemical (e.g., an alcohol level, a toxic substance, or an explosive material) possessed by an AV occupant or by an approaching object (e.g., a person or an occupant of a vehicle).
- the risk monitoring process evaluates a driving behavior of a vehicle possessing an autonomous driving capability.
- FIGs. 6B - 6C illustrate examples of a risk monitoring process monitoring driving behaviors.
- the risk monitoring process monitors whether a vehicle follows traffic rules. For instance, referring to scenario 620 in FIG. 6B, the risk monitoring process monitors driving behavior of a vehicle 621 when encountering a stop sign 622. Driving behavior 623 involves slowing down only when nearly arriving at the stop sign 622, but second driving behavior 624 involves gradually slowing down within a reasonable distance from the stop sign 622. Thus, driving behavior 623 is riskier than the second driving behavior 624.
- the risk monitoring process monitors driving behavior of a vehicle 626 when encountering stop signs 627.
- Third driving behavior 628 has a smoother speed profile than fourth driving behavior 629, so the driving behavior 628 makes the occupant of the vehicle feel more comfortable than the fourth driving behavior 629.
- a speed profile is a plot of the variation in speed of a vehicle over a period of time.
- a jagged speed profile with several peaks and valleys is an indicator of haphazard starts and stops and consequently unsafe or rough driving.
- Different driving behaviors are weighted and balanced against one another.
- the third driving behavior 628 also does not make a full stop at the stop signs, so the third driving behavior 628 is riskier than the fourth driving behavior 629.
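- The comparisons above could be quantified, for example, by counting slope reversals in a sampled speed profile (an indicator of jaggedness) and checking for a full stop; the function and thresholds below are hypothetical:

```python
def speed_profile_risk(speeds_mps, stop_mps=0.2, jag_threshold=3):
    """Score a speed profile: count direction reversals (peaks and
    valleys), an indicator of haphazard starts and stops, and check
    whether the vehicle ever came to a full stop."""
    reversals = 0
    for prev, cur, nxt in zip(speeds_mps, speeds_mps[1:], speeds_mps[2:]):
        if (cur - prev) * (nxt - cur) < 0:  # slope changes sign
            reversals += 1
    return {"jagged": reversals >= jag_threshold,
            "full_stop": min(speeds_mps) <= stop_mps}

smooth_with_stop = [8, 6, 4, 2, 0.0, 2, 4, 6]   # gradual stop and restart
jagged_no_stop = [8, 3, 7, 2, 6, 1.5, 7, 3]     # haphazard starts and stops
print(speed_profile_risk(smooth_with_stop))  # {'jagged': False, 'full_stop': True}
print(speed_profile_risk(jagged_no_stop))    # {'jagged': True, 'full_stop': False}
```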
- the risk monitoring process monitors headings or driving trajectories of vehicles. For instance, when driving on a straight road segment, fifth driving behavior 631 involving a vehicle wiggling is riskier than sixth driving behavior 632 involving the vehicle maintaining a straight trajectory.
- the risk monitoring process monitors how a vehicle reacts to dynamic objects on the road. For example, referring to FIG. 6C, the risk monitoring process determines how the vehicle 635 slows down when approaching a crossing 636, when encountering a pedestrian 637, or when detecting an object 638.
- the risk monitoring process evaluates driving behaviors based on analyzing sensor signals. For example, the risk monitoring process uses a speed sensor (e.g., based on radar) to monitor speeds. In one embodiment, the risk monitoring process uses a position sensor (e.g., based on GPS) to monitor a position, or a series of positions. In some cases, the risk monitoring process uses an odometer onboard a vehicle to monitor a driving distance.
- the risk monitoring process includes a sensor onboard a vehicle to monitor the operation of the steering wheel, the brake pedal, the acceleration, or the deceleration, or combinations of them.
- a vehicle utilizes in-vehicle cameras to monitor vehicle operational characteristics associated with the operator of a vehicle. For example, the vehicle may analyze operator attentiveness and wakefulness by monitoring the operator's eyes and pupil dilation, or detect alcohol consumption by analyzing the operator's breath.
- a template driving behavior includes driving based on: traffic rules, preferred driving behaviors, driving behaviors of human drivers, a statistical summary of human drivers, driving behaviors of AV systems, or a statistical summary of AV systems.
- a template driving behavior includes two vehicles from the same manufacturer or different manufacturers.
- a template driving behavior includes two AV systems from the same provider or different providers.
- the risk monitoring process explores a database of past risks or statistics of risks, or both.
- the database may be hosted by government agencies, police departments, or insurance companies, or combinations of them.
- the risk monitoring process learns patterns of risks.
- a learning algorithm may infer one or more of (or combinations of) the following: regions with frequent disasters, regions with frequent collisions, regions with frequent drunk drivers, regions with frequent sports events, regions with frequent protests, drivers with bad driving behaviors, frequent types of risks in a region, and frequent causes of risks in a region, among others.
- the patterns include time period information, e.g., peaks, mornings, afternoons, evenings, nights, weekdays, and weekends.
- the patterns include road configurations, e.g., parallel parking streets, crosswalks, 4-way stops, 3-way stops, highways, bifurcations, merges, dedicated lanes, and bicycle lanes.
- the patterns include distributions of risks within a region or within a time period; for example, more collisions take place at the center of the intersection of a 4-way stop and fewer collisions take place away from the center.
- the patterns include a dynamic model to describe the risks.
- In one embodiment, a probabilistic model (e.g., Gaussian distributions or Poisson distributions) may be used to describe risks on a road configuration, within a region, or within a time period, or combinations of them.
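- For instance, a Poisson model could be fit to historical collision counts for a road configuration; the counts below are illustrative, not drawn from any real database:

```python
import math

def fit_poisson_rate(weekly_counts):
    """Maximum-likelihood Poisson rate: the mean of observed counts."""
    return sum(weekly_counts) / len(weekly_counts)

def prob_at_least_one(rate):
    """P(N >= 1) under a Poisson model: 1 - P(N = 0) = 1 - exp(-rate)."""
    return 1.0 - math.exp(-rate)

# Hypothetical collisions per week at a 4-way stop.
counts = [0, 2, 1, 3, 0, 1, 2, 1]
rate = fit_poisson_rate(counts)
print(f"rate={rate:.2f}/week, P(>=1 collision)={prob_at_least_one(rate):.2f}")
# rate=1.25/week, P(>=1 collision)=0.71
```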
- the learned patterns are used as prior information for the AV system to configure driving behaviors of the AV on a road. For example, when the AV is approaching a region having frequent risks, the AV slows down when passing the region, or it plans a trajectory to avoid passing the region or a combination of the two. In some applications, when approaching a region having frequent risks involving one or more specific types of objects (e.g., children, bicycles, trucks, pedestrians, or animals), a perception process of the AV system is dedicated to detecting these specific types of objects. For example, prior probabilities of the presence of these specific types of objects may become high in the perception process.
- the patterns include a model describing risks associated with trajectory information; for example, a database may show that a right turn at an intersection frequently is associated with accidents, and a model may describe the right-turn and the corresponding probability of collisions.
- a risk reporting process automatically reports a potential risk or an existing risk.
- An entity receiving a report may include a risk processing provider, a transportation service provider, a government agency, a fire station, a police station, a health service provider, an insurance company, an auto manufacturer, or a road user, or combinations of them.
- a report of a risk includes snapshot or temporal signals from sensors, such as images, radar signals, lidar signals, GPS signals, or speed signals, or combinations of them.
- a report of a risk includes information associated with the risk, such as timestamps, maps, traffic volumes, traffic flows, street light setting, travel signal settings, road configurations, addresses, hospitals, parties involved in the risk, injured parties involved in the risk, or object features (e.g., types, colors, sizes, shapes, models, make, plate numbers, VINs or owners, or combinations of them).
- the risk reporting process includes processing received signals to extract information associated with the risk.
- the risk reporting process may segment the objects involved in the risk, identify the parties (e.g., based on plate numbers, or on information embedded in V2V or V2I communications, or on both) involved in the high risk situation, or recognize traffic configurations (e.g., volumes, speeds, traffic lanes, traffic lights, traffic signs, and infrastructure), or combinations of them.
- the risk reporting process identifies a geolocation where a signal is taken; the geolocation information may be embedded in the signal or be inferred based on one or more GPS signals in the vicinity of the risk.
- the risk reporting process may evaluate if the same risk has been previously or simultaneously reported.
- the risk may be predicted or detected by another source (e.g., another AV system) or be notified by another source (e.g., a risk processing server, a government agency's server, or a news provider, or combinations of them).
- the risk reporting process may evaluate one or more of the following factors.
- the risk reporting process defines a zone of a risk.
- the risk reporting process records a timestamp for a time at which a high risk situation was identified.
- When two timestamps (e.g., 724 and 726) are close in time, their associated risks may be identical or similar; when the timestamps are distant, the associated risks may involve a dissimilar or different risk.
- the risk reporting process may record reported events.
- When a large number of reports is associated with a risk 732, the deduced risk 732 likely exists; otherwise, for a risk 734 with a small number of reports, the detection of the risk may be a false positive.
- In one embodiment, geolocation is taken into account. For example, the number of reports of a risk in an area with a high population density (e.g., a metropolitan area) is expected to be higher than in an area with a low population density (e.g., a rural area).
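- A sketch of this report aggregation, with a hypothetical zone radius, time window, and report-count threshold (a real system might scale the threshold by population density, as noted above):

```python
def same_risk(report_a, report_b, zone_m=50.0, window_s=300.0):
    """Treat two reports as the same risk when their geolocations and
    timestamps overlap within the zone radius and time window."""
    (xa, ya), ta = report_a
    (xb, yb), tb = report_b
    close = ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 <= zone_m
    return close and abs(ta - tb) <= window_s

def deduce_risks(reports, min_reports=3):
    """Cluster reports, then deduce a risk only when enough independent
    reports support it; sparse clusters may be false positives."""
    clusters = []
    for report in reports:
        for cluster in clusters:
            if same_risk(report, cluster[0]):
                cluster.append(report)
                break
        else:
            clusters.append([report])
    return [(c[0][0], len(c), len(c) >= min_reports) for c in clusters]

reports = [((10, 10), 0), ((15, 12), 60), ((12, 8), 200),   # risk 732
           ((900, 40), 30)]                                  # risk 734
print(deduce_risks(reports))
# [((10, 10), 3, True), ((900, 40), 1, False)]
```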
- the risk reporting process includes an interface to allow a user to report a risk.
- a risk reporting process provides a user with an interface 810, for example, to submit one or more images or videos about a risk (e.g., a collision between objects 812 and 814).
- a risk reporting process provides a user with an interface 850 to report a risk on a map (e.g., a collision between objects 852 and 854).
- the risk reporting process allows the interface user to provide (e.g., by clicking, typing, or speaking, or combinations of them) one or more of the following: a location of a risk, parties involved in a risk, locations of the parties, event details, road conditions, weather conditions, or traffic configurations (e.g., volumes, speeds, traffic lanes, traffic lights, traffic signs, and infrastructure).
- the risk reporting process processes a report to comply with laws, regulations, or policies, or combinations of them. For example, the risk reporting process may remove private information (e.g., social security number and driver license number) associated with a risk before transmitting the report to a third party.
- Report Integration
- a report integration process (440 in FIG. 4) synthesizes two or more risk reports into an integrated risk report.
- For example, each of the reports may include partial information about a single risk.
- one report may include a time and a location of the risk, and another may include a time, a location, a road configuration, and a travel signal.
- the report integration process resolves the discrepancy between the reported times, or between the reported locations, or both, and then generates a report associated with the risk with one or more of the following: a single timestamp or a single time period, a single location or a region, a road configuration, and a travel signal.
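- One possible sketch of this synthesis, averaging discrepant times and locations and carrying over fields that appear in only some reports (the field names are illustrative):

```python
def integrate_reports(reports):
    """Synthesize partial risk reports into one integrated report.

    Discrepant times and locations are resolved by averaging; fields
    present in only some reports (e.g., a road configuration or a
    travel signal) are carried into the integrated report."""
    times = [r["time"] for r in reports if "time" in r]
    locations = [r["location"] for r in reports if "location" in r]
    merged = {
        "time": sum(times) / len(times),
        "location": (sum(x for x, _ in locations) / len(locations),
                     sum(y for _, y in locations) / len(locations)),
    }
    for report in reports:
        for key, value in report.items():
            merged.setdefault(key, value)  # keep the resolved time/location
    return merged

report_1 = {"time": 1000.0, "location": (10.0, 20.0)}
report_2 = {"time": 1004.0, "location": (11.0, 19.0),
            "road_configuration": "4-way stop", "travel_signal": "red"}
print(integrate_reports([report_1, report_2]))
# {'time': 1002.0, 'location': (10.5, 19.5), 'road_configuration': '4-way stop', ...}
```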
- a report integration process stitches two or more sensor signals.
- one report may include images or videos recording a collision from a side view with scenes 900 and 910.
- the two vehicles 901 and 902 in scene 900 collide and become collided vehicles 911 and 912 in scene 910.
- Another report may include images or videos from a front view showing a scene 920 of collided vehicles 921 and 922, which reveals that the collision was caused by the vehicle 922 (corresponding to the vehicle 902 in scene 900) attempting to pass the vehicle 921 (corresponding to the vehicle 901 in scene 900).
- the two different reports associated with the same collision may reveal different information.
- the report integration process may utilize image processing and computer vision to stitch data from different reports.
- the report integration process reconstructs a view of a risk in a two-dimensional or three-dimensional space.
- the report integration process may further reconstruct the view over time to show how the risk was evolving.
- a report integration process provides a user interface to allow a user to verify a report of a risk or an aggregated report of one or more risks.
- a risk reaction process (436 in FIG. 4) configures the AV system in response to an existing risk or a potential risk.
- the AV system receives a notice of a risk from one of the following: a risk monitoring process, a risk reporting process, a risk processing server, another AV system, an object on the road, or an infrastructure, or combinations of the above.
- an existing or potential risk is stored in a database (e.g., map data).
- map data may annotate a previously reported risk, a known existing risk, a known future risk, a collision-prone zone, a construction, and a heavy traffic zone.
- the risk reaction process adapts a motion planning process of the AV system based on one or more risks. For example, when a risk is near the AV, a motion planning process may plan a lane-changing trajectory to bypass the risk. In contrast, when a risk is far away, a motion planning process may plan a trajectory to its goal position by choosing another route to circumvent the risk.
- the risk reaction process enhances or alters a perception process of the AV system.
- the information about the risk is assigned a higher priority for the perception process.
- the perception process recognizes objects (e.g., 901 and 902) in their normal conditions.
- the perception process may fail to recognize the two collided objects (e.g., 911 and 912), because the collided objects 911 and 912 have deformed shapes, or the perception process may misclassify the collided objects 911 and 912 as a single unknown object.
- the risk reaction process configures the perception process to consider the risk information as prior information C and to use probabilistic inference p(S|C) to correctly recognize the collided objects (S) in the environment of the AV.
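- A minimal numerical sketch of this inference, with illustrative likelihoods and a risk-informed prior (the class labels and probabilities are hypothetical):

```python
def posterior(likelihoods, prior):
    """Compute p(S | C) proportional to p(C | S) * p(S): combine perception
    likelihoods with risk-derived priors and normalize."""
    unnorm = {s: likelihoods[s] * prior[s] for s in likelihoods}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

# The deformed shapes look ambiguous to the classifier alone...
likelihoods = {"two_collided_vehicles": 0.3, "single_unknown_object": 0.7}
# ...but a nearby risk report (prior information C) shifts the prior.
prior_with_risk_report = {"two_collided_vehicles": 0.8,
                          "single_unknown_object": 0.2}
print(posterior(likelihoods, prior_with_risk_report))
# {'two_collided_vehicles': 0.63..., 'single_unknown_object': 0.36...}
```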
- the risk reaction process triggers a teleoperation system (442 in FIG. 4) for a tele-operator to guide the driving of the AV. For example, when a risk is observed and the AV is unable to drive along a previously planned trajectory, the risk reaction process sends a request to the tele-operation system. An intervention on the driving of the AV may be invoked based on the tele-operation system. Additional information about such tele-operation is found in United States patent application serial number 15/624,780, filed June 16, 2017, which is incorporated here by reference.
- [0122] In one embodiment, the way the risk reaction process changes or adapts another process is based on a flag.
- the flag is turned from inactive (e.g., represented by 0) to active (e.g., represented by 1), so the other process will retrieve or listen to outputs from the risk reaction process.
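- A sketch of this flag-based coupling between processes (the class and method names are hypothetical):

```python
import queue

class RiskReactionProcess:
    def __init__(self):
        self.flag = 0              # 0 = inactive, 1 = active
        self.outputs = queue.Queue()

    def raise_risk(self, instruction):
        """Activate the flag and publish an output for other processes."""
        self.flag = 1
        self.outputs.put(instruction)

class MotionPlanningProcess:
    def step(self, reaction: RiskReactionProcess):
        # The planner retrieves outputs from the reaction process only
        # while the flag is active; otherwise it follows its routine plan.
        if reaction.flag == 1 and not reaction.outputs.empty():
            return reaction.outputs.get()
        return "follow routine trajectory"

reaction, planner = RiskReactionProcess(), MotionPlanningProcess()
print(planner.step(reaction))          # follow routine trajectory
reaction.raise_risk("change lanes to bypass risk")
print(planner.step(reaction))          # change lanes to bypass risk
```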
- the way the risk reaction process alters another underlying process is based on programming code.
- the underlying process 1000 executes routine instructions.
- the risk reaction process 1002 dynamically generates a set of instructions that can be inserted into the executions of the underlying process 1000.
- the underlying process 1010 takes one or more inputs (e.g., detected objects in the environment of the AV system), and an output of the reaction process 1012 is treated as an input (e.g., an additional stationary object blocking the current trajectory of the AV system) to the underlying process 1010.
- the risk reaction process generates an alarm of a potential risk or an existing risk; the alarm is based on visual or audio signals.
- FIG. 11 illustrates an example of an interface 1100 of a risk processing system.
- the interface 1100 presents a detected risk (e.g., collision 1122) detected near the AV 1120.
- the interface generates an audio signal (e.g., a sound, or a spoken language, or both) to provide an alert for the risk 1122.
- the risk reaction process also may change a previous navigation guidance for a left-turn trajectory 1124 for the AV 1120 to a new navigation guidance for a straight trajectory 1126. The change in trajectory may be presented on the interface 1100, or verbally described by the interface 1100, or both.
- a risk factor assessing process calculates existing or potential risk factors of the AV system.
- the risk factor assessment may be used to compute insurance premiums.
- the risk factor assessing process evaluates the number of miles to be traversed for a trajectory. A larger number of miles may imply a higher risk.
- the risk factor assessing process evaluates the number of miles to be traveled during a time period. A larger number of miles may imply a higher risk.
- [0127] In one embodiment, a risk is not physically external to the AV. In some cases, the risk factor assessing process evaluates a health condition of the AV.
- components of the AV e.g., tires, brakes, engines, steering wheels, accessory belt tension pulleys, accessory belt tensioners, camshaft position sensors, crankshaft position sensors, crankshaft pulleys, crankshaft seals, cylinder heads, engine gasket sets, harmonic balancers, knock sensors, motor and transmission mounts, motor and transmission mount brackets, oil cooler hoses, oil dipsticks, oil drain plug gasket, oil filters, oil level sensors, oil pans, oil pan gaskets, oil pressure switches, oil pumps, rod bearing sets, timing belts, timing belt kits, timing belt tensioners, valve cover, valve cover gaskets, valve stem seals, perception sensors, perception process, motion planning process, databases, and computing processors) that have aged or have been used heavily may imply a higher risk.
- the risk factor assessing process evaluates if a risk processing system is active when driving the AV.
- An inactive (e.g., due to malfunctioning or deactivation by the vehicle operator) risk processing system may imply a higher risk.
- the risk factor assessing process evaluates if an anti-theft system (e.g., alarms and sensors) is active when driving the AV system.
- An inactive (e.g., due to malfunctioning or deactivation by the vehicle operator) anti-theft system may imply a higher risk.
- the risk factor assessing process evaluates if a trajectory of the AV is through a risky region. Passing through a risky region may imply a higher risk.
- the risk factor assessing process evaluates a user's profile.
- a user may be an occupant of the AV, or a party using the AV for transporting someone else or something else. Examples of a user profile include age, occupation, driving history, driving behaviors, driving purpose, active driving license, a frequency of using vehicles, income level and history, education level and history, and home address, among others.
- the risk factor assessing process evaluates a user's social network profile.
- the risk processing system connects to the user's social network accounts (e.g., Facebook, LinkedIn, Instagram, YouTube, and a personal website) to evaluate red flags for the user.
- the risk factor assessing process evaluates a driving behavior. For example, a driving behavior deviating from a normal behavior may imply a higher risk. As another example, a driving behavior violating a traffic rule may imply a higher risk. As another example, a driving behavior inducing a less comfort level may imply a higher risk.
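- As an illustration, several of the factors above could be aggregated into a single score; the weights and normalizers below are hypothetical and are not part of this description:

```python
def assess_risk_factors(miles, component_age_years, risk_system_active,
                        passes_risky_region, traffic_violations):
    """Aggregate illustrative risk factors into a score in [0, 1],
    e.g., as an input to insurance premium computation."""
    factors = {
        # Each factor is normalized to [0, 1] and weighted.
        "mileage":         min(miles / 20_000.0, 1.0) * 0.25,
        "component_age":   min(component_age_years / 10.0, 1.0) * 0.20,
        "inactive_risk_system": (0.0 if risk_system_active else 1.0) * 0.25,
        "risky_region":    (1.0 if passes_risky_region else 0.0) * 0.15,
        "driving_behavior": min(traffic_violations / 5.0, 1.0) * 0.15,
    }
    return sum(factors.values()), factors

score, breakdown = assess_risk_factors(
    miles=12_000, component_age_years=6, risk_system_active=True,
    passes_risky_region=True, traffic_violations=1)
print(f"risk score: {score:.2f}")  # risk score: 0.45
```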
- FIG. 12 illustrates an exemplary "cloud" computing environment.
- Cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services).
- one or more large cloud data centers house the machines used to deliver the services provided by the cloud.
- the cloud computing environment 1200 includes cloud data centers 1204a, 1204b, and 1204c that are interconnected through the cloud 1202.
- Data centers 1204a, 1204b, and 1204c provide cloud computing services to computer systems 1206a, 1206b, 1206c, 1206d, 1206e, and 1206f connected to cloud 1202.
- the cloud computing environment 1200 includes one or more cloud data centers.
- a cloud data center, for example the cloud data center 1204a shown in FIG. 12, refers to the physical arrangement of servers that make up a cloud, for example the cloud 1202 shown in FIG. 12, or a particular portion of a cloud.
- servers can be physically arranged in the cloud datacenter into rooms, groups, rows, and racks.
- a cloud datacenter has one or more zones, which include one or more rooms of servers. Each room has one or more rows of servers, and each row includes one or more racks. Each rack includes one or more individual server nodes.
- Servers in zones, rooms, racks, and/or rows may be arranged into groups based on physical infrastructure requirements of the datacenter facility, which include power, energy, thermal, heat, and/or other requirements.
- the server nodes are similar to the computer system described in FIG. 13.
- the data center 1204a has many computing systems distributed through many racks.
- the cloud 1202 includes cloud data centers 1204a, 1204b, and 1204c along with the network and networking resources (for example, networking equipment, nodes, routers, switches, and networking cables) that interconnect the cloud data centers 1204a, 1204b, and 1204c and help facilitate the computing systems' 1206a-1206f access to cloud computing services.
- the network represents any combination of one or more local networks, wide area networks, or internetworks coupled using wired or wireless links deployed using terrestrial or satellite connections. Data exchanged over the network is transferred using any number of network layer protocols, such as Internet Protocol (IP), Multiprotocol Label Switching (MPLS), Asynchronous Transfer Mode (ATM), Frame Relay, etc. Furthermore, in embodiments where the network represents a combination of multiple sub-networks, different network layer protocols are used at each of the underlying sub-networks. In some embodiments, the network represents one or more interconnected internetworks, such as the public Internet.
- the computing systems 1206a-1206f or cloud computing services consumers are connected to the cloud 1202 through network links and network adapters.
- the computing systems 1206a-1206f are implemented as various computing devices, for example servers, desktops, laptops, tablets, smartphones, IoT devices, autonomous vehicles (including cars, drones, shuttles, trains, buses, etc.) and consumer electronics.
- the computing systems 1206a-1206f may also be implemented in, or as a part of, other systems.
- FIG. 13 illustrates an example of a computer system 1300.
- the computer system 1300 is a special purpose computing device.
- the special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination.
- Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques.
- the special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, network devices or any other device that incorporates hardwired and/or program logic to implement the techniques.
- the computer system 1300 may include a bus 1302 or other communication mechanism for communicating information, and a hardware processor 1304 coupled with a bus 1302 for processing information.
- the hardware processor 1304 may be, for example, a general-purpose microprocessor.
- the computer system 1300 also includes a main memory 1306, such as a random-access memory (RAM) or other dynamic storage device, coupled to the bus 1302 for storing information and instructions to be executed by the processor 1304.
- main memory 1306 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1304.
- Such instructions when stored in non-transitory storage media accessible to the processor 1304, render the computer system 1300 into a special-purpose machine that is customized to perform the operations specified in the instructions.
- the computer system 1300 further includes a read only memory (ROM) 1308 or other static storage device coupled to the bus 1302 for storing static information and instructions for the processor 1304.
- a storage device 1310 such as a magnetic disk, optical disk, or solid-state drive is provided and coupled to the bus 1302 for storing information and instructions.
- the computer system 1300 may be coupled via the bus 1302 to a display 1312, such as a cathode ray tube (CRT), a liquid crystal display (LCD), plasma display, light emitting diode (LED) display, or an organic light emitting diode (OLED) display for displaying information to a computer user.
- An input device 1314 is coupled to bus 1302 for communicating information and command selections to the processor 1304.
- In one embodiment, a cursor controller 1316, such as a mouse, a trackball, a touch-enabled display, or cursor direction keys, is coupled to the bus 1302 for communicating direction information and command selections to the processor 1304 and for controlling cursor movement on the display 1312.
- This input device typically has two degrees of freedom in two axes, a first axis (e.g., x-axis) and a second axis (e.g., y-axis), that allows the device to specify positions in a plane.
- the techniques herein are performed by the computer system 1300 in response to the processor 1304 executing one or more sequences of one or more instructions contained in the main memory 1306. Such instructions may be read into the main memory 1306 from another storage medium, such as the storage device 1310. Execution of the sequences of instructions contained in the main memory 1306 causes the processor 1304 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
- Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as the storage device 1310.
- Volatile media includes dynamic memory, such as the main memory 1306.
- storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
- Storage media is distinct from but may be used in conjunction with transmission media.
- Transmission media participates in transferring information between storage media.
- transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 1302.
- transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
- Various forms of media may be involved in carrying one or more sequences of one or more instructions to the processor 1304 for execution.
- the instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer.
- the remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem.
- a modem local to the computer system 1300 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal.
- An infrared detector can receive the data carried in the infrared signal and appropriate circuitry can place the data on the bus 1302.
- the bus 1302 carries the data to the main memory 1306, from which processor 1304 retrieves and executes the instructions.
- the instructions received by the main memory 1306 may optionally be stored on the storage device 1310 either before or after execution by processor 1304.
- the computer system 1300 also includes a communication interface 1318 coupled to the bus 1302.
- the communication interface 1318 provides a two-way data communication coupling to a network link 1320 that is connected to a local network 1322.
- the communication interface 1318 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line.
- the communication interface 1318 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN.
- Wireless links may also be implemented.
- the communication interface 1318 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
- the network link 1320 typically provides data communication through one or more networks to other data devices.
- the network link 1320 may provide a connection through the local network 1322 to a host computer 1324 or to a cloud data center or equipment operated by an Internet Service Provider (ISP) 1326.
- the ISP 1326 in turn provides data communication services through the world-wide packet data communication network now commonly referred to as the "Internet" 1328.
- the local network 1322 and Internet 1328 both use electrical, electromagnetic or optical signals that carry digital data streams.
- the signals through the various networks and the signals on the network link 1320 and through the communication interface 1318, which carry the digital data to and from the computer system 1300, are example forms of transmission media.
- the network 1320 may contain or may be a part of the cloud 1202 described above.
- the computer system 1300 can send messages and receive data, including program code, through the network(s), the network link 1320, and the communication interface 1318.
- the computer system 1300 may receive code for processing.
- the received code may be executed by the processor 1304 as it is received, and/or stored in storage device 1310, or other nonvolatile storage for later execution.
- Although the descriptions in this document have described embodiments wherein the tele-operator is a person, tele-operator functions can be performed partially or fully automatically.
Abstract
Among other things, sensor signals are received using a vehicle comprising an autonomous driving capability. A risk associated with operating the vehicle is identified based on the sensor signals. The autonomous driving capability is modified in response to the risk. The operation of the vehicle is updated based on the modifying of the autonomous capability.
Description
RISK PROCESSING FOR VEHICLES HAVING AUTONOMOUS DRIVING CAPABILITIES
CROSS-REFERENCE TO RELATED APPLICATIONS
[001] This application claims priority to and the benefit of U.S. Provisional Application Serial No. 62/522,254, filed on June 20, 2017, the entire contents of which is incorporated here by reference.
FIELD OF THE INVENTION
[002] This description relates to systems and methods for processing risk for vehicles having autonomous driving capabilities.
BACKGROUND
[003] A vehicle having autonomous driving capabilities may encounter risks while driving on a road. Such risks can involve, for example, a pedestrian suddenly crossing the street in front of the vehicle such that impact with the pedestrian may be difficult to avoid. Such risks can also involve, for example, possibility of collision with another vehicle on the road. Such risks can also involve, for example, possibility of accidents in adverse driving conditions, such as in rain or snow, among others.
SUMMARY
[004] In general, in one aspect, sensor signals are received using a vehicle comprising an autonomous driving capability. A risk associated with operating the vehicle is identified based on the sensor signals. The autonomous driving capability is modified in response to the risk. The operation of the vehicle is updated based on the modifying of the autonomous capability.
[005] Some embodiments of identifying a risk may include detecting or predicting the risk. The identifying may include analyzing the sensor signals, or known or predicted risks, or both. The identifying may include analyzing the sensor signals to determine a position of an object. The identifying may include analyzing the sensor signals to evaluate a speed of an object. The identifying may include analyzing the sensor signals to evaluate speed profiles of two or more objects over time. The identifying may include analyzing the sensor signals to identify a boundary of an object. The identifying may include analyzing the sensor signals to identify overlapping boundaries of two or more objects. The identifying may include analyzing the sensor signals to determine a concentration of a chemical.
[006] In one embodiment, the identifying may include analyzing the sensor signals to segment one or more objects on an image or a video. The identifying may include tracking the segmented one or more objects.
[007] In one embodiment, the identifying may include assessing a threat.
[008] In one embodiment, the identifying may include evaluating a driving behavior of the vehicle or another vehicle. Evaluating a driving behavior may include evaluating a speed, a heading, a trajectory, a vehicular operation, or combinations of these.
[009] In one embodiment, the identifying may include learning a pattern of known risks. The pattern may include associations of the known risks with one or more of objects, times, road configurations, or geolocations.
[010] In one embodiment, updating the operation of the vehicle may include executing a lane change by a motion planning system of the vehicle. Updating the operation of the vehicle may include executing a trajectory change by a motion planning system of the vehicle. Responding to the risk may include treating analyzed sensor signals as prior information to a perception system of the vehicle. Responding to the risk may include invoking intervention on an operation of the vehicle from a remote operator. Updating the operation of the vehicle may include generating and inserting new machine instructions into existing machine instructions of an operation of the vehicle.
Responding to the risk may include generating an input to an operation of the vehicle.
[011] In one embodiment, the method may include generating a report of a risk. Generating a report of a risk may include extracting and aggregating information associated with the risk from sensor signals. Extracting and aggregating information may include evaluating one or more of the following: overlapping geographic zones, overlapping time periods, or report frequencies.
[012] In one embodiment, generating a report of a risk may include recording an environment of the risk. Generating a report of a risk may include stitching images or videos to form a view of the risk. Generating a report of the risk may include removing private information associated with the risk. Generating a report of a risk may include providing an interface to allow an interface user to provide information associated with the risk. Generating a report of a risk may include integrating two or more reports associated with the risk.
[013] One embodiment of the method may include receiving a report of a risk from a remote data source.
[014] One embodiment of the method may include assessing a risk factor of the vehicle. Assessing a risk factor may include determining a risk associated with passing through a risky region.
Assessing a risk factor may include determining a risk associated with a driving distance or a driving time period. Assessing a risk factor may include determining a risk associated with an aging component of the vehicle. Assessing a risk factor may include determining a risk associated with an inactive autonomous driving capability. Assessing a risk factor may include determining a risk based on a profile of a user of the vehicle. Assessing a risk factor may include determining a risk based on a social network of a user of the vehicle. Assessing a risk factor may include determining a risk associated with a driving behavior. Assessing a risk factor may include determining a risk associated with following traffic rules.
[015] In general, in one aspect, a vehicle having autonomous driving capabilities includes steering, acceleration, and deceleration devices that respond to control signals from a driving control system to drive the vehicle autonomously on a road network. The vehicle also includes a monitoring element on the vehicle that receives sensor signals, and identifies a risk associated with operating the vehicle based on the sensor signals. The vehicle further includes a controller that responds to the risk by configuring the driving control system to modify the autonomous driving capability in response to the risk and, based on the modifying of the autonomous capability, update an operation of the vehicle to maneuver the vehicle to a goal location.
[016] One embodiment of identifying a risk may include detecting or predicting the risk. The identifying may include analyzing sensor signals, or known or predicted risks, or both. The identifying may include analyzing the sensor signals to determine a position of an object. The identifying may include analyzing the sensor signals to evaluate a speed of an object. The identifying may include analyzing the sensor signals to evaluate speed profiles of two or more objects over time. The identifying may include analyzing the sensor signals to identify a boundary of an object. The identifying may include analyzing the sensor signals to identify overlapping boundaries of two or more objects. The identifying may include analyzing the sensor signals to determine a concentration of a chemical.
[017] In one embodiment of the vehicle, the identifying may include analyzing the sensor signals to segment one or more objects on an image or a video. The identifying may include tracking the segmented one or more objects.
[018] In one embodiment of the vehicle, the identifying may include assessing a threat.
[019] In one embodiment of the vehicle, the identifying may include evaluating a driving behavior of the vehicle or another vehicle. Evaluating a driving behavior may include evaluating a speed, a heading, a trajectory, a vehicular operation, or combinations of them.
[020] In one embodiment of the vehicle, the identifying may include learning a pattern of known risks. The pattern may include associations of the known risks with one or more of objects, times, road configurations, or geolocations.
[021] In one embodiment of the vehicle, updating the operation of the vehicle may include executing a lane change by a motion planning system of the vehicle. Updating the operation of the vehicle may include executing a trajectory change by a motion planning system of the vehicle.
Responding to the risk may include treating analyzed sensor signals as prior information to a perception system of the vehicle. Responding to the risk may include invoking intervention on an operation of the vehicle from a remote operator. Updating the operation of the vehicle may include generating and inserting new machine instructions into existing machine instructions of an operation of the vehicle. Responding to the risk may include generating an input to an operation of the vehicle.
[022] In one embodiment, the vehicle may include a reporting element to generate a report of a risk. Generating a report of a risk may include extracting and aggregating information associated with the risk from sensor signals. Extracting and aggregating information may include evaluating one or more of the following: overlapping geographic zones, overlapping time periods, or report frequencies.
[023] In one embodiment of the vehicle, generating a report of a risk may include recording an environment of the risk. Generating a report of a risk may include stitching images or videos to form a view of the risk. Generating a report of the risk may include removing private information associated with the risk. Generating a report of a risk may include providing an interface to allow an interface user to provide information associated with the risk. Generating a report of a risk may include integrating two or more reports associated with the risk.
[024] In one embodiment of the vehicle, the reporting element may include receiving a report of a risk from a remote data source.
[025] One embodiment of the vehicle may include assessing a risk factor of the vehicle. Assessing a risk factor may include determining a risk associated with passing through a risky region.
Assessing a risk factor may include determining a risk associated with a driving distance or a driving time period. Assessing a risk factor may include determining a risk associated with an aging component of the vehicle. Assessing a risk factor may include determining a risk associated with an inactive autonomous driving capability. Assessing a risk factor may include determining a risk based on a profile of a user of the vehicle. Assessing a risk factor may include determining a risk based on a social network of a user of the vehicle. Assessing a risk factor may include determining a risk associated with a driving behavior. Assessing a risk factor may include determining a risk associated with following traffic rules.
[026] In general, in one aspect, an apparatus includes a processor configured to process data to identify a risk of driving a vehicle comprising an autonomous driving capability. The processor is also configured to modify the autonomous driving capability in response to the risk, and update operation of the vehicle based on the modifying of the autonomous capability. The apparatus also includes an alarm configured to issue an alert of the identified risk.
[027] In one embodiment of the apparatus, identifying a risk may include detecting or predicting the risk. The data may include sensor signals, or known or predicted risks, or both. The identifying may include analyzing the sensor signals to determine a position of an object. The identifying may include analyzing the sensor signals to evaluate a speed of an object. The identifying may include analyzing the sensor signals to evaluate speed profiles of two or more objects over time. The identifying may include analyzing the sensor signals to identify a boundary of an object. The identifying may include analyzing the sensor signals to identify overlapping boundaries of two or more objects. The identifying may include analyzing the sensor signals to determine a concentration of a chemical.
[028] In one embodiment of the apparatus, the identifying may include analyzing the sensor signals to segment one or more objects on an image or a video. The identifying may include tracking the segmented one or more objects.
[029] In one embodiment of the apparatus, the identifying may include assessing a threat.
[030] In one embodiment of the apparatus, the identifying may include evaluating a driving behavior of the vehicle or another vehicle. Evaluating a driving behavior may include evaluating a speed, a heading, a trajectory, a vehicular operation, or combinations of them.
[031] In one embodiment of the apparatus, the identifying may include learning a pattern of known risks. The pattern may include associations of the known risks with one or more of objects, times, road configurations, or geolocations.
[032] In one embodiment of the apparatus, updating the operation of the vehicle may include executing a lane change by a motion planning system of the vehicle. Updating the operation of the vehicle may include executing a trajectory change by a motion planning system of the vehicle.
Responding to the risk may include treating analyzed sensor signals as prior information to a perception system of the vehicle. Responding to the risk may include invoking intervention on an operation of the vehicle from a remote operator. Updating the operation of the vehicle may include generating and inserting new machine instructions into existing machine instructions of an operation of the vehicle. Responding to the risk may include generating an input to an operation of the vehicle.
[033] In one embodiment, the apparatus may include a processor configured to generate a report of a risk. Generating a report of a risk may include extracting and aggregating information associated with the risk from sensor signals. Extracting and aggregating information may include evaluating one or more of the following: overlapping geographic zones, overlapping time periods, or report frequencies.
[034] In one embodiment of the apparatus, generating a report of a risk may include recording an environment of the risk. Generating a report of a risk may include stitching images or videos to form a view of the risk. Generating a report of the risk may include removing private information associated with the risk. Generating a report of a risk may include providing an interface to allow an interface user to provide information associated with the risk. Generating a report of a risk may include integrating two or more reports associated with the risk.
[035] In one embodiment of the apparatus, the processor may include a processor configured to receive a report of a risk from a remote data source.
[036] One embodiment of the apparatus may include a processor configured to assess a risk factor of the vehicle. Assessing a risk factor may include determining a risk associated with passing through a risky region. Assessing a risk factor may include determining a risk associated with a driving distance or a driving time period. Assessing a risk factor may include determining a risk associated with an aging component of the vehicle. Assessing a risk factor may include determining a risk associated with an inactive autonomous driving capability. Assessing a risk factor may include determining a risk based on a profile of a user of the vehicle. Assessing a risk factor may include determining a risk based on a social network of a user of the vehicle. Assessing a risk factor may include determining a risk associated with a driving behavior. Assessing a risk factor may include determining a risk associated with following traffic rules.
[037] These and other aspects, features, and embodiments can be expressed as methods, apparatus, systems, components, program products, methods of doing business, means or steps for performing a function, and in other ways.
[038] These and other aspects, features, and embodiments will become apparent from the following descriptions, including the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[039] FIG. 1 illustrates an example of an autonomous vehicle having autonomous capability.
[040] FIGs. 2-4 illustrate examples of architectures of risk processing systems.
[041] FIG. 5 illustrates an example of an autonomous vehicle detecting a collision in its vicinity while driving on a road.
[042] FIG. 6A illustrates an example of object detection by an autonomous vehicle by encoding information about an elevation profile of the road surface.
[043] FIGs. 6B and 6C illustrate examples of a risk monitoring process monitoring driving behaviors.
[044] FIGs. 7-10 illustrate examples of a risk processing system.
[045] FIG. 11 illustrates an example of an interface of a risk processing system.
[046] FIG. 12 illustrates an exemplary "cloud" computing environment.
[047] FIG. 13 illustrates an example of a computer system.
DETAILED DESCRIPTION
[048] FIG. 1 illustrates an example of an autonomous vehicle 100 having autonomous capability.
[049] As used herein, the term "autonomous capability" refers to a function, feature, or facility that enables a vehicle to be operated without real-time human intervention, unless specifically requested by the vehicle.
[050] As used herein, an autonomous vehicle (AV) is a vehicle that possesses autonomous capability.
[051] As used herein, vehicle includes means of transportation of goods or people. For example, vehicles can be cars, buses, trains, airplanes, drones, trucks, boats, ships, submersibles, dirigibles, among others. A driverless car is an example of an AV.
[052] As used herein, the term "trajectory" refers to a path or route generated by an AV to navigate from a first spatio-temporal location to a second spatio-temporal location. In an embodiment, the first spatio-temporal location is referred to as the initial or starting location and the second spatio-temporal location is referred to as the goal or goal-position. In an embodiment, the spatio-temporal locations correspond to real world locations. For example, the spatio-temporal locations include pickup or drop off locations to pick up or drop off persons or goods.
[053] As used herein, the term "risk processing" refers to accumulating information about a risk, predicting a risk, monitoring a risk, analyzing a risk, reacting to a risk, assessing factors associated with a risk, or any combination of the above.
[054] As used herein, the term "risk processing system" refers to any kind of hardware, software, firmware, computer, or device of any kind, or a combination of two or more of them, that performs risk processing.
[055] "One or more" includes a function being performed by one element, a function being performed by more than one element, e.g., in a distributed fashion, several functions being performed by one element, several functions being performed by several elements, or any combination of the above.
[056] It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact.
[057] The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "includes," "including," "comprises," and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[058] As used herein, the term "if" is, optionally, construed to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is, optionally, construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
[059] As used herein, an AV system refers to the AV along with the array of hardware, software, stored data, and data generated in real-time that supports the operation of the AV. In an embodiment, the AV system is incorporated within the AV. In an embodiment, the AV system may be spread across several locations. For example, some of the software of the AV system may be implemented on a cloud computing environment similar to cloud computing environment 1300 described below with respect to FIG. 13.
[060] In general, this document describes technologies applicable to any vehicles that have one or more autonomous capabilities, including fully autonomous vehicles, highly autonomous vehicles, and conditionally autonomous vehicles, such as so-called Level 5, Level 4 and Level 3 vehicles, respectively (see SAE International's standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, which is incorporated by reference in its entirety, for more details on the classification of levels of autonomy in vehicles). Vehicles with autonomous capabilities may attempt to control the steering or speed of the vehicles. The technologies described in this document also can be applied to partially autonomous vehicles and
driver assisted vehicles, such as so-called Level 2 and Level 1 vehicles (see SAE International's standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle
Automated Driving Systems). One or more of the Level 1, 2, 3, 4 and 5 vehicle systems may automate certain vehicle operations (e.g., steering, braking, and using maps) under certain operating conditions based on processing of sensor inputs. The technologies described in this document can benefit vehicles at any level of autonomy, ranging from fully autonomous vehicles to human-operated vehicles.
[061] In the following description, headings are provided for improved readability. Although headings are provided, information related to a particular heading but not found in the section having that heading may also be found elsewhere in the specification.
[062] Referring to FIG. 1, an AV system 120 operates the AV 100 autonomously or semi- autonomously along a trajectory 198 through an environment 190 to a goal location 199 while avoiding objects (e.g., natural obstructions 191, vehicles 193, pedestrians 192, cyclists, and other obstacles) and obeying rules of the road (e.g., rules of operation or driving preferences).
[063] In an embodiment, the AV system 120 includes devices 101 that are instrumented to receive and act on operational commands from the computer processors 146. In an embodiment, computing processors 146 are similar to the processor 1304 described below in reference to FIG. 13. Examples of devices 101 include a steering control 102, brakes 103, gears, accelerator pedal, windshield wipers, side-door locks, window controls, and turn-indicators.
[064] In an embodiment, the AV system 120 includes sensors 121 for measuring or inferring properties of the state or condition of the AV 100, such as the AV's position, linear and angular velocity and acceleration, and heading (e.g., an orientation of the leading end of AV 100). Examples include GPS, inertial measurement units (IMU) that measure both vehicle linear accelerations and angular rates, wheel speed sensors for measuring or estimating wheel slip ratios, wheel brake pressure or braking torque sensors, engine torque or wheel torque sensors, and steering angle and angular rate sensors.
[065] In an embodiment, the sensors 121 also include sensors for sensing or measuring properties of the AV's environment, for example, monocular or stereo video cameras 122 in the visible light, infrared or thermal (or both) spectra, LiDAR 123, radar, ultrasonic sensors, time-of-flight (TOF) depth sensors, speed sensors, temperature sensors, humidity sensors, and precipitation sensors.
[066] In an embodiment, the AV system 120 includes a data storage unit 142 and memory 144 for
storing machine instructions associated with computer processors 146 or data collected by sensors 121. In an embodiment, the data storage unit 142 is similar to the ROM 1308 or storage device 1310 described below in relation to FIG. 13. In an embodiment, memory 144 is similar to the main memory 1306 described below. In an embodiment, the data storage unit 142 and memory 144 store historical, real-time, and/or predictive information about the environment 190. In an embodiment, the stored information includes maps, driving performance, traffic congestion updates or weather conditions. In an embodiment, data relating to the environment 190 is transmitted to the AV 100 via a communications channel from a remotely located database 134.
[067] In an embodiment, the AV system 120 includes communications devices 140 for
communicating measured or inferred properties of other vehicles' states and conditions, such as positions, linear and angular velocities, linear and angular accelerations, and linear and angular headings to the AV 100. These devices include Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication devices and devices for wireless communications over point-to-point or ad hoc networks or both. In an embodiment, the communications devices 140 communicate across the electromagnetic spectrum (including radio and optical communications) or other media (e.g., air and acoustic media). A combination of Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication (and, in some embodiments, one or more other types of communication) is sometimes referred to as Vehicle-to-Everything (V2X) communication. V2X communication typically conforms to one or more communications standards for communication with, between, and among autonomous vehicles.
[068] In an embodiment, the communication devices 140 include communication interfaces. For example, wired, wireless, WiMAX, Wi-Fi, Bluetooth, satellite, cellular, optical, near field, infrared, or radio interfaces. The communication interfaces transmit data from a remotely located database 134 to AV system 120. In an embodiment, the remotely located database 134 is embedded in a cloud computing environment 1200 as described in FIG. 12. The communication interfaces 140 transmit data collected from sensors 121 or other data related to the operation of AV 100 to the remotely located database 134. In an embodiment, communication interfaces 140 transmit information that relates to teleoperations to the AV 100. In some embodiments, the AV 100 communicates with other remote (e.g., "cloud") servers 136.
[069] In an embodiment, the remotely located database 134 also stores and transmits digital data
(e.g., storing data such as road and street locations). Such data may be stored on the memory 144 on the AV 100, or transmitted to the AV 100 via a communications channel from the remotely located database 134.
[070] In an embodiment, the remotely located database 134 stores and transmits historical information about driving properties (e.g., speed and acceleration profiles) of vehicles that have previously traveled along trajectory 198 at similar times of day. Such data may be stored on the memory 144 on the AV 100, or transmitted to the AV 100 via a communications channel from the remotely located database 134.
[071] Computing devices 146 located on the AV 100 algorithmically generate control actions based on both real-time sensor data and prior information, allowing the AV system 120 to execute its autonomous driving capabilities.
[072] In an embodiment, the AV system 120 may include computer peripherals 132 coupled to computing devices 146 for providing information and alerts to, and receiving input from, a user (e.g., an occupant or a remote user) of the AV 100. In an embodiment, peripherals 132 are similar to the display 1312, input device 1314, and cursor controller 1316 discussed below in reference to FIG. 13. The coupling may be wireless or wired. Any two or more of the interface devices may be integrated into a single device.
[073] FIGs. 2-4 illustrate examples of architectures of risk processing systems. Referring to FIG. 2, a risk processing system 230 includes the following elements:
• A risk processing client 201, realized by integrated circuits, field-programmable gate arrays, hardware, software, or firmware, or a combination of two or more of the above.
In one embodiment, the risk processing client 201 is installed on the AV system 200. The risk processing client 201 may interact with components of the AV system 200 (e.g., sensors 216 and 218, communication devices 210, user interface devices, memory 212, a processor 214, a database 220, or functional devices, or combinations of them). For example, the risk processing client 201 sends and receives information and commands. The risk processing client 201 communicates via a communication device 210 (that may be at least partly wireless) with a risk processing server 231. In one embodiment, the communication device 210 is a communication interface.
In one embodiment, the risk processing client 252 is installed on a mobile device 250. The risk processing client 252 may utilize signals collected by the sensors of the mobile device 250, such as GPS sensors, cameras, accelerometers, gyroscopes, and barometers. The risk processing client 252 can communicate with a risk processing server 231 over a
communication interface of the mobile device 250.
In one embodiment, the risk processing client 201 may be installed on a combination of the AV system (in particular on the AV itself) and a mobile device 250.
• A risk processing server 231 is on board the AV of the AV system 200 or in a remote
location, for example, at least 0.1, 1, 2, 3, 4, 5, 10, 20, 30, 40, 50, 100, 200, 300, 400, 500, 600, 700, 900, or 1000 meters away from the AV of the AV system 200.
• A graphical user interface 232 may be presented by the risk processing client 201, or the risk processing server 231, or both. Embodiments may present on the interface 232 information about one or more of the following: risk factors, a known risk, an active risk, a present risk, a potential risk, a road network, a condition of the AV of the AV system 200, an environment of the AV of the AV system 200, or sensor signals, among other things.
[074] Referring to FIG. 3, in one embodiment, a risk processing client 311 may communicate with two or more risk processing servers 321, 322 and 323. In some cases, two or more servers (e.g., 321 and 322) receive and aggregate information for risk processing or for presentation on an interface 332. In one embodiment, a server (e.g., 323) may receive risk information from two or more risk processing clients 311 and 312, which are installed, for example, on different AV systems 301 and 302, respectively. One embodiment allows a server (e.g., 322) to receive risk information from two or more risk processing clients 311 and 313, which are installed on an AV system 301 and a mobile device 303, respectively.
[075] In one embodiment, a risk processing client (e.g., 312) is configured as a server to receive and aggregate information from one or more other risk processing clients (e.g., 311 or 313, or both). In one embodiment, a risk processing client (e.g., 312) may act as a relaying device for establishing and maintaining communication between a server 323 and another client 311.
[076] Referring to FIG. 4, a risk processing system 400 may communicate with one or more sensors (e.g., 402 and 404) to collect signals via a communication interface 410. Sensor signals or
existing data or both may be stored in memory 422 or a database 424 or both.
[077] A database 424 may be onboard the AV, remote from the AV, or both. The database 424 may store data from sensors, government agencies, police stations, or insurance companies, or combinations of them. Examples of the data stored in the database 424 include timestamps, time windows, peak traffic, weather, maps, street light settings, traffic sign settings, traffic light settings, road configurations, addresses, normal vehicle operator behaviors, abnormal vehicle operator behaviors, normal AV operations, abnormal AV operations, traffic due to social events, traffic due to sport events, location of hospitals, location of police stations, location of fire stations, known risks along a trajectory, predicted risks along a trajectory, characteristics (for example, age, gender, ethnicity, or socio-economic status) of individuals involved in high risk situations along a trajectory,
characteristics (for example, color, make, model, or engine type) of vehicles involved in high risk situations along a trajectory, value of insurance claims filed/processed for high risk situations along a trajectory, cost of repairs associated with high risk situations along a trajectory, and
cost/characteristics of insurance policies offered to protect against high risk situations along a trajectory.
[078] Processing and analyzing the signals and data may be realized by a processor 420, or a computing resource of the processor 420.
[079] In one embodiment, a risk processing system includes a risk monitoring process 432 to predict potential risks or detect existing risks in the environment of the AV system.
[080] In one embodiment, a risk processing system includes a risk reporting process 434 to report predicted or detected risks.
[081] In one embodiment, a risk processing system includes a risk reaction process 436 to configure the AV system to take suitable actions when a risk is predicted or detected. In one embodiment, the risk reaction process 436 includes or communicates with a teleoperation system 442 to allow a remote operator to operate AV 502 in response to a risk.
[082] In one embodiment, a risk processing system includes a risk factor assessing process 438 to evaluate risk factors affecting the on-road operation of AV 502.
[083] In one embodiment, a risk processing system includes a report integration process 440 to integrate two or more risk reports.
Risk Monitoring
[084] Among other things, a risk monitoring process identifies risks by monitoring an environment near the AV, an operation of the AV system, or the interior of the AV.
[085] Collisions. FIG. 5 illustrates an example of an autonomous vehicle AV 502 detecting a collision in its vicinity while driving on a road. For instance, upon analyzing signals from sensors (e.g., a vision sensor, a lidar, or a radar, or combinations of them), AV 502 produces information about other objects (e.g., vehicles 512 and 514, infrastructure, and pedestrians) in the environment; examples of such information include: locations, speeds, orientations, boundaries, sizes, dimensions, status of traffic lights (for example, red, green, amber, malfunction, etc.), information related to manufacturers, plate numbers, owners, drivers, and operational state of other vehicles on the road along with AV 502. The information is analyzed by a risk processing server to predict a potential collision or detect an existing collision.
[086] In one embodiment, a sensor (e.g., lidar) may emit a wave beam (e.g., an electromagnetic beam, or an acoustic beam, or both) and a component of the beam may return after hitting an object. A returning beam component can indicate a boundary point of the object. When a sensor emits a scan comprising M beams, each of which produces N returning beam components, a cloud of M×N points (also called a point cloud) is acquired.
[087] Analyzing a map from a database or images from a vision sensor, or both, can further determine foreground and background. FIG. 6A illustrates an example of object detection by an autonomous vehicle AV system 601 by encoding information about an elevation profile of the road surface. As shown in FIG. 6A, a map used by the AV system 601 may encode information about an elevation profile of the road surface 600. This information can be used to classify a given point as belonging to the road surface as follows. In one embodiment, an image, including depth information, from a vision sensor (e.g., a stereo camera) is acquired and segmentation is applied to identify a background region, or a foreground object, or both. Segmentation results may be used alone or be integrated with map information (e.g., projected onto a map) for classifying points in a point cloud.
[088] Given information about the current position and orientation of the AV system 601 and the position of the sensor 603, the AV system 601 derives a point (e.g., 609, 610, 611 or 612) at which an emitted beam (e.g., 605, 606, 607, or 608) is expected to encounter the ground. If a point (613)
returned by a beam (608) is closer to the AV 601 than the expected point (612) by at least a predefined difference, then the beam is determined to have encountered an object 602 (e.g., a point on the foreground), and the point 613 is classified as a boundary point of the object 602. In one
embodiment, machine learning (for example, deep learning) is used to perform foreground classification. Such an approach fuses data from multiple sensors (such as lidar, radar and camera) to improve the classification accuracy.
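As a concrete illustration of the geometric test described above, the following Python sketch classifies a single lidar return by comparing its measured range against the range at which the beam would be expected to meet the road surface. The flat-road geometry, sensor height, and tolerance value are illustrative assumptions, not parameters prescribed by this description.

```python
import math

# Illustrative sketch (not the patented implementation): classify a lidar
# return as "ground" or "object" by comparing the measured range with the
# range at which the beam is expected to meet the road surface.
# Assumed inputs: sensor height (m), beam pitch below horizontal (rad),
# measured range (m), and a tolerance for the "predefined difference".

def classify_return(sensor_height: float, pitch_down: float,
                    measured_range: float, tolerance: float = 0.5) -> str:
    """Return 'object' if the return is closer than the expected ground point."""
    if pitch_down <= 0:
        return "object"  # a beam at or above horizontal never meets flat ground
    expected_range = sensor_height / math.sin(pitch_down)  # flat-road assumption
    if measured_range < expected_range - tolerance:
        return "object"   # beam intercepted early, e.g., point 613 in FIG. 6A
    return "ground"       # return near the expected point, e.g., points 609-612

# Example: a sensor 1.8 m above the road, beam tilted 5 degrees down, return at 12 m.
print(classify_return(1.8, math.radians(5.0), 12.0))  # -> 'object'
```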
[089] When an object's boundary is detected, the risk monitoring process tracks the boundary and determines a speed of the object. In one embodiment, a speed sensor (e.g., based on radar) is employed to determine the object's speed. Referring to FIG. 5, based on measured speeds of vehicles 512 and 514 at a time t, the risk monitoring process can predict their positions and boundary locations at time t+1. When the boundaries of objects 512 and 514 at time t+1 overlap, a collision is predicted. In some cases, when a risk monitoring process detects that the boundaries of objects 512 and 514 overlap and the speeds of the objects drop swiftly to zero, a collision is detected.
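The prediction step described in this paragraph can be sketched as follows, under a constant-velocity assumption and with object boundaries simplified to axis-aligned bounding boxes; both simplifications are assumptions made for illustration only.

```python
# Hedged sketch: predict a collision by advancing two tracked bounding boxes
# one time step under a constant-velocity assumption and testing for overlap.
# Assumed box format: (xmin, ymin, xmax, ymax) in meters; velocity (vx, vy) in m/s.

def advance(box, velocity, dt=1.0):
    x0, y0, x1, y1 = box
    vx, vy = velocity
    return (x0 + vx * dt, y0 + vy * dt, x1 + vx * dt, y1 + vy * dt)

def boxes_overlap(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def collision_predicted(box_a, vel_a, box_b, vel_b, dt=1.0):
    # Boundaries predicted at time t+1 overlap -> collision predicted.
    return boxes_overlap(advance(box_a, vel_a, dt), advance(box_b, vel_b, dt))

# Example: vehicle 512 drifting toward vehicle 514 one lane over.
print(collision_predicted((0, 0, 4, 2), (11, 1.5), (12, 3, 16, 5), (2, 0)))  # True
```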
[090] In one embodiment, the AV system 502 may use sensor signals (e.g., lidar, radar, images, GPS, or information in vehicle-to-vehicle signals, or combinations of them) to determine the locations of the objects 512 and 514, and detect a collision that occurred before the AV 502 arrives in the vicinity of the collision between the objects 512 and 514. In one embodiment, the AV system 502 may use sensor signals to determine shapes or boundaries or sizes or dimensions, or
combinations of them, of the objects 512 and 514, and infer the collision based on overlapping between the objects. For instance, segmentation on an image may identify locations and boundaries of the objects 512 and 514.
[091] In one embodiment, the AV system 502 may use traffic information (e.g., traffic volume and flow) to infer a collision. The AV system 502 may measure the traffic flow to determine if the traffic flow is in an abnormal condition. For instance, objects 512 and 514 involved in a collision have zero speed, and the traffic 532 behind the objects 512 and 514 is slow or congested, but the traffic 534 in front of the objects 512 and 514 is faster.
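One possible form of such a traffic-flow heuristic is sketched below; the speed thresholds and the grouping of measurements into upstream, stopped, and downstream traffic are assumptions for illustration.

```python
# Illustrative heuristic (assumed thresholds): infer a collision when stopped
# objects separate slow upstream traffic from free-flowing downstream traffic.

def collision_likely(upstream_speeds, stopped_speeds, downstream_speeds,
                     stop_thresh=0.5, congested_thresh=3.0, free_thresh=10.0):
    stopped = all(s <= stop_thresh for s in stopped_speeds)
    congested_behind = sum(upstream_speeds) / len(upstream_speeds) <= congested_thresh
    free_ahead = sum(downstream_speeds) / len(downstream_speeds) >= free_thresh
    return stopped and congested_behind and free_ahead

# Example: objects 512/514 stopped, traffic 532 crawling behind, traffic 534 flowing ahead.
print(collision_likely([1.0, 2.0, 1.5], [0.0, 0.0], [13.0, 15.0]))  # True
```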
[092] Risky substances. By analyzing signals from sensors (e.g., a smoke sensor, a chemical sensor, a temperature sensor, a flame sensor, a fire sensor, a radioactivity sensor, or combinations of them) the risk monitoring process may detect a fire, flame 522, smoke, or radioactivity in the
environment of the AV 502.
[093] For example, the risk monitoring process may include one or more sensors (e.g., chemical sensors, radar, vision sensors, optical sensors, and infrared sensors) or may have access to signals of the sensors. Sensor signals may provide information about the chemical composition of the AV's environment, such as a concentration of a certain chemical species or combinations of them (e.g., carbon monoxide, carbon dioxide, composition C, sulfides, explosives, and toxic chemicals). In some cases, sensor signals may provide information about a shape of a risky object (e.g., gun, bomb, and grenade).
[094] The risk monitoring process analyzes sensor signals based on a pattern recognition algorithm to predict or detect presence of a risky substance, or a source of a risky substance, or both. For instance, when a fire (e.g., 522 in FIG. 5) exists, the air in the environment may contain substances whose concentrations deviate from normal values, and the risk monitoring process may compute a likelihood of the presence of the fire 522. In some cases, the AV system may analyze distributions of the risky substances in a space over time to determine a source of a risk; for example, concentrations near the source are higher than those in a distant location. In some applications, the risk monitoring process may monitor an interior of the AV to detect presence of risky substances, such as explosives, combustibles, poisonous gases, and flammables. For example, the AV may detect increased concentration of carbon dioxide due to the AV being located in an enclosed area.
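A minimal sketch of such concentration-based detection follows; the species names, baseline and alert levels, and the maximum-concentration heuristic for locating a source are all illustrative assumptions rather than values prescribed by this description.

```python
# Hedged sketch: flag a risky substance when a measured concentration deviates
# from its normal level, and point toward the source by comparing spatially
# distributed samples (concentrations near the source are higher).

NORMAL = {"CO": 0.5, "CO2": 400.0}          # assumed ambient baselines (ppm)
THRESHOLD = {"CO": 9.0, "CO2": 1000.0}      # assumed alert levels (ppm)

def detect_risky_substances(readings):
    """readings: {species: ppm}. Returns species whose levels exceed alert thresholds."""
    return [s for s, ppm in readings.items() if ppm > THRESHOLD.get(s, float("inf"))]

def likely_source(samples):
    """samples: list of ((x, y), ppm); the highest concentration marks the source side."""
    return max(samples, key=lambda item: item[1])[0]

print(detect_risky_substances({"CO": 42.0, "CO2": 450.0}))              # ['CO']
print(likely_source([((0, 0), 12.0), ((5, 0), 30.0), ((10, 0), 8.0)]))  # (5, 0)
```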
[095] Threat assessment. The risk monitoring process analyzes sensor signals and data to assess threats. For example, the risk monitoring process detects an abnormal concentration of a chemical (e.g., an alcohol level, a toxic substance, and an explosive material) that is possessed by an AV occupant or possessed by an approaching object (e.g., a person or an occupant of a vehicle).
[096] Vehicle Operation Characteristics. In an embodiment, the risk monitoring process evaluates a driving behavior of a vehicle possessing an autonomous driving capability. FIGs. 6B - 6C illustrate examples of a risk monitoring process monitoring driving behaviors.
[097] The risk monitoring process monitors whether a vehicle follows traffic rules. For instance, referring to scenario 620 in FIG. 6B, the risk monitoring process monitors driving behavior of a vehicle 621 when encountering a stop sign 622. Driving behavior 623 involves slowing down only upon nearly arriving at the stop sign 622, whereas second driving behavior 624 involves gradually slowing down within a reasonable distance from the stop sign 622. Thus, driving behavior 623 is riskier than second driving behavior 624.
[098] In another scenario 625 in FIG. 6B, the risk monitoring process monitors driving behavior of a vehicle 626 when encountering stop signs 627. Third driving behavior 628 has a smoother speed profile than fourth driving behavior 629, so the driving behavior 628 makes the occupant of the vehicle feel more comfortable than the fourth driving behavior 629. In this context, a speed profile is a plot of the variation in speed of a vehicle over a period of time. A jagged speed profile with several peaks and valleys is an indicator of haphazard starts and stops and consequently unsafe or rough driving. Different driving behaviors are weighted and balanced against one another. For example, the third driving behavior 628 also does not make a full stop at the stop signs, so the third driving behavior 628 is riskier than the fourth driving behavior 629.
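One way to quantify the jaggedness of a speed profile is sketched below; the particular metric (acceleration reversals plus mean absolute acceleration) is an assumption chosen for illustration, not a formula prescribed by this description.

```python
# Illustrative metric: score a speed profile by the magnitude and reversals of
# acceleration; jagged profiles with many peaks and valleys score as rougher
# (and thus riskier) than smooth ones.

def roughness(speeds, dt=1.0):
    accels = [(b - a) / dt for a, b in zip(speeds, speeds[1:])]
    reversals = sum(1 for a, b in zip(accels, accels[1:]) if a * b < 0)
    harshness = sum(abs(a) for a in accels) / len(accels)
    return reversals + harshness

smooth = [10, 9, 8, 6, 4, 2, 0]   # gradual stop, like driving behavior 628
jagged = [10, 4, 9, 2, 8, 1, 0]   # haphazard starts and stops, like behavior 629
print(roughness(smooth) < roughness(jagged))  # True
```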
[099] Referring to FIG. 6C, the risk monitoring process monitors headings or driving trajectories of vehicles. For instance, when driving on a straight road segment, fifth driving behavior 631 involving a vehicle wiggling is riskier than sixth driving behavior 632 involving the vehicle maintaining a straight trajectory.
[0100] In one embodiment, the risk monitoring process monitors how a vehicle reacts to dynamic objects on the road. For example, referring to FIG. 6C, the risk monitoring process determines how the vehicle 635 slows down when approaching a crossing 636, when encountering a pedestrian 637, or when detecting an object 638.
[0101] The risk monitoring process evaluates driving behaviors based on analyzing sensor signals. For example, the risk monitoring process uses a speed sensor (e.g., based on radar) to monitor speeds. In one embodiment, the risk monitoring process uses a position sensor (e.g., based on GPS) to monitor a position, or a series of positions. In some cases, the risk monitoring process uses an odometer onboard a vehicle to monitor a driving distance. In some cases, the risk monitoring process includes a sensor onboard a vehicle to monitor the operations of the steering wheel, the brake pedal, the acceleration, or the deceleration, or combinations of the above. In an embodiment, a vehicle utilizes in-vehicle cameras to monitor vehicle operational characteristics associated with the operator of a vehicle. For example, the vehicle may analyze operator attentiveness and wakefulness by monitoring the operator's eyes and pupil dilation, or may detect alcohol consumption by analyzing the operator's breath.
[0102] The risk monitoring process evaluates driving behaviors based on comparing a driving behavior with a template driving behavior. In one embodiment, a template driving behavior includes driving based on: traffic rules, preferred driving behaviors, driving behaviors of human drivers, a statistical summary of human drivers, driving behaviors of AV systems, or a statistical summary of AV systems. In one embodiment, the comparison involves two vehicles from the same manufacturer or from different manufacturers. In one embodiment, the comparison involves two AV systems from the same provider or from different providers.
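A minimal sketch of such a template comparison follows, scoring an observed speed trace against a template trace by root-mean-square deviation; the template values and the choice of metric are illustrative assumptions.

```python
# Hedged sketch: score a driving behavior against a template behavior (for
# example, a statistical summary of human drivers) as the root-mean-square
# deviation between the observed and template speed traces.

def deviation_from_template(observed, template):
    assert len(observed) == len(template), "traces must be time-aligned"
    return (sum((o - t) ** 2 for o, t in zip(observed, template)) / len(observed)) ** 0.5

template = [12.0, 10.0, 6.0, 2.0, 0.0]   # e.g., a preferred stop-sign approach
observed = [12.0, 11.5, 10.0, 6.0, 0.5]  # late braking relative to the template
print(round(deviation_from_template(observed, template), 2))  # larger = riskier
```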
[0103] Database exploration. In an embodiment, the risk monitoring process explores a database of past risks or statistics of risks, or both. For example, the database may be hosted by government agencies, police departments, or insurance companies, or combinations of them.
[0104] In one embodiment, the risk monitoring process learns patterns of risks. For example, a learning algorithm may infer one or more of (or combinations of) the following: regions with frequent disasters, regions with frequent collisions, regions with frequent drunk drivers, regions with frequent sports events, regions with frequent protests, drivers with bad driving behaviors, frequent types of risks in a region, and frequent causes of risks in a region, among others.
[0105] In some cases, the patterns include time period information, e.g., peaks, mornings, afternoons, evenings, nights, weekdays, and weekends. In some cases, the patterns include road configurations, e.g., parallel parking streets, crosswalks, 4-way stops, 3-way stops, highways, bifurcations, merges, dedicated lanes, and bicycle lanes. In some cases, the patterns include distributions of risks within a region or within a time period; for example, more collisions take place at the center of the intersection of a 4-way stop and fewer collisions take place away from the center.
[0106] In one embodiment, the patterns include a dynamic model to describe the risks. For example, a probabilistic model (e.g., Gaussian distributions or Poisson distributions) may be used to describe risks on a road configuration, within a region, or within a time period, or combinations of them.
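For example, a Poisson model of collision counts in a region and time period could be fitted and queried as sketched below; the historical counts are assumed data used only for illustration.

```python
import math

# Illustrative sketch: model collision counts in a region and time window as
# Poisson with rate lambda estimated from historical daily counts, then ask
# how surprising a given day's count would be under that model.

def poisson_pmf(k: int, lam: float) -> float:
    return math.exp(-lam) * lam ** k / math.factorial(k)

historical_counts = [2, 0, 1, 3, 1, 2, 2]            # collisions per day (assumed)
lam = sum(historical_counts) / len(historical_counts)

# Probability of observing 5 or more collisions in one day under this model.
p_tail = 1.0 - sum(poisson_pmf(k, lam) for k in range(5))
print(f"lambda={lam:.2f}, P(X>=5)={p_tail:.3f}")
```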
[0107] The learned patterns are used as prior information for the AV system to configure driving behaviors of the AV on a road. For example, when the AV is approaching a region having frequent risks, the AV slows down when passing the region, or it plans a trajectory to avoid passing the region or a combination of the two. In some applications, when approaching a region having
frequent risks involving one or more specific types of objects (e.g., children, bicycles, trucks, pedestrians, or animals), a perception process of the AV system is dedicated to detecting these specific types of objects. For example, prior probabilities of the presence of these specific types of objects may become high in the perception process. In one embodiment, the patterns include a model describing risks associated with trajectory information; for example, a database may show that a right turn at an intersection frequently is associated with accidents, and a model may describe the right-turn and the corresponding probability of collisions.
Risk Reporting
[0108] In an embodiment, a risk reporting process automatically reports a potential risk or an existing risk. An entity receiving a report may include a risk processing provider, a transportation service provider, a government agency, a fire station, a police station, a health service provider, an insurance company, an auto manufacturer, or a road user, or combinations of them.
[0109] The potential risk or the existing risk is determined automatically from a risk monitoring process. A report of a risk includes snapshot or temporal signals from sensors, such as images, radar signals, lidar signals, GPS signals, or speed signals, or combinations of them. A report of a risk includes information associated with the risk, such as timestamps, maps, traffic volumes, traffic flows, street light settings, travel signal settings, road configurations, addresses, hospitals, parties involved in the risk, injured parties involved in the risk, or object features (e.g., types, colors, sizes, shapes, models, makes, plate numbers, VINs or owners, or combinations of them).
[0110] In an embodiment, the risk reporting process includes processing received signals to extract information associated with the risk. For example, given an image or a video, the risk reporting process may segment the objects involved in the risk, identify the parties (e.g., based on plate numbers, or on information embedded in V2V or V2I communications, or on both) involved in the high risk situation, or recognize traffic configurations (e.g., volumes, speeds, traffic lanes, traffic lights, traffic signs, and infrastructure), or combinations of them. In one embodiment, the risk reporting process identifies a geolocation where a signal is taken; the geolocation information may be embedded in the signal or be inferred based on one or more GPS signals in the vicinity of the risk.
[0111] When receiving information about a risk from a risk monitoring process, the risk reporting process may evaluate if the same risk has been previously or simultaneously reported. In some cases,
in addition to the prediction or detection performed by the risk monitoring process of the AV system, the risk may be predicted or detected by another source (e.g., another AV system) or be notified by another source (e.g., a risk processing server, a government agency's server, or a news provider, or combinations of them). To determine if the risk has been reported and if the risk is real, the risk reporting process may evaluate one or more of the following factors.
1. Geographic zones. Referring to FIG. 7, the risk reporting process defines a zone of a risk.
When two zones (e.g., 712 and 714) are close to one another, their associated risks may be identical or similar. In contrast, when a zone (e.g., 716) is far away from another zone, it may involve a dissimilar or different risk.
2. Time. Referring to FIG. 7, the risk reporting process records a timestamp for a time at which a high risk situation was identified. When two timestamps (e.g., 724 and 726) are close in time, their associated risks may be identical or similar. In contrast, when there is a gap between two timestamps, the associated risks may involve a dissimilar or different risk.
3. Report frequency. Referring to FIG. 7, the risk reporting process may record reported events. When a large number of reports is associated with a risk 732, the deduced risk 732 is considered to exist; otherwise, for a risk 734 with a small number of reports, the detection of the risk may be a false positive. In one embodiment, a geolocation is taken into account. For example, the number of reports of a risk in an area with a high population density (e.g., a metropolitan area) is expected to be higher than in an area with a low population density (e.g., a rural area). A sketch combining these three factors appears after this list.
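The following sketch combines the three factors above: two reports are treated as the same risk when their zones are close and their timestamps are close, and a deduplicated risk is treated as real only when enough reports support it. The distance, time-gap, and report-count thresholds are assumptions for illustration.

```python
import math

# Hedged sketch of report deduplication using geographic zones, timestamps,
# and report frequency. Each report is (x, y, t) in meters and seconds.

def same_risk(report_a, report_b, max_dist_m=50.0, max_gap_s=300.0):
    (xa, ya, ta), (xb, yb, tb) = report_a, report_b
    close_in_space = math.hypot(xa - xb, ya - yb) <= max_dist_m
    close_in_time = abs(ta - tb) <= max_gap_s
    return close_in_space and close_in_time

def deduplicate(reports, min_reports=2):
    clusters = []
    for r in reports:
        for c in clusters:
            if same_risk(r, c[0]):   # compare against the cluster's first report
                c.append(r)
                break
        else:
            clusters.append([r])
    # Clusters with enough supporting reports are treated as real risks;
    # min_reports could be raised in dense areas, per the population-density note.
    return [c for c in clusters if len(c) >= min_reports]

reports = [(0, 0, 1000), (10, 5, 1100), (900, 900, 1050)]
print(len(deduplicate(reports)))  # 1 confirmed risk; the lone report is filtered
```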
[0112] In one embodiment, the risk reporting process includes an interface to allow a user to report a risk. Referring to FIG. 8, a risk reporting process provides a user with an interface 810, for example, to submit one or more images or videos about a risk (e.g., a collision between objects 812 and 814). In some cases, a risk reporting process provides a user with an interface 850 to report a risk on a map (e.g., a collision between objects 852 and 854). In some applications, the risk reporting process allows the interface user to provide (e.g., by clicking, typing, or speaking, or combinations of them) one or more of the following: a location of a risk, parties involved in a risk, locations of the parties, event details, road conditions, weather conditions, or traffic configurations (e.g., volumes, speeds, traffic lanes, traffic lights, traffic signs, and infrastructure).
[0113] In one embodiment, the risk reporting process processes a report to comply with laws, regulations, or policies, or combinations of them. For example, the risk reporting process may remove private information (e.g., social security number and driver license number) associated with a risk before transmitting the report to a third party.
Report Integration
[0114] In one embodiment, a report integration process (440 in FIG. 4) synthesizes two or more risk reports into an integrated risk report. When two or more risk reports are made, each may include partial information about a single risk. For example, one report may include a time and a location of the risk, and another may include a time, a location, a road configuration, and a travel signal. In one embodiment, the report integration process resolves the discrepancy between the reported times, or between the reported locations, or both, and then generates a report associated with the risk with one or more of the following: a single timestamp or a single time period, a single location or a region, a road configuration, and a travel signal.
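One possible implementation of such synthesis is sketched below; the report field names and the use of averaging to resolve numeric discrepancies are illustrative assumptions.

```python
# Hedged sketch: synthesize two partial reports of the same risk into a single
# integrated report. Discrepancies in times and locations are resolved here by
# simple averaging; non-numeric fields are carried over from whichever report
# supplied them.

def integrate(reports):
    merged = {}
    for r in reports:
        for key, value in r.items():
            merged.setdefault(key, []).append(value)
    out = {}
    for key, values in merged.items():
        if key in ("time", "x", "y"):
            out[key] = sum(values) / len(values)   # resolve numeric discrepancy
        else:
            out[key] = values[0]                   # keep the first reported value
    return out

a = {"time": 1000.0, "x": 10.0, "y": 20.0}
b = {"time": 1004.0, "x": 11.0, "y": 19.0, "road_config": "4-way stop",
     "travel_signal": "stop sign"}
print(integrate([a, b]))  # single timestamp/location plus the extra fields
```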
[0115] In one embodiment, a report integration process stitches two or more sensor signals. For example, referring to FIG. 9, one report may include images or videos recording a collision from a side view with scenes 900 and 910. The two vehicles 901 and 902 in scene 900 collide and become collided vehicles 911 and 912 in scene 910. Another report may include images or videos from a front view showing a scene 920 of collided vehicles 921 and 922, which reveals that the collision was caused by the vehicle 922 (corresponding to the vehicle 902 in scene 900) attempting to pass the vehicle 921 (corresponding to the vehicle 901 in scene 900). The two different reports associated with the same collision may reveal different information. Thus, the report integration process may utilize image processing and computer vision to stitch data from different reports. In some applications, the report integration process reconstructs a view of a risk in a two-dimensional or three-dimensional space. The report integration process may further reconstruct the view over time to show how the risk was evolving.
[0116] In one embodiment, a report integration process provides a user interface to allow a user to verify a report of a risk or an aggregated report of one or more risks.
Risk Reaction
[0117] In one embodiment, a risk reaction process (436 in FIG. 4) configures the AV system in
response to an existing risk or a potential risk.
[0118] The AV system receives a notice of a risk from one of the following: a risk monitoring process, a risk reporting process, a risk processing server, another AV system, an object on the road, or an infrastructure, or combinations of the above. In one embodiment, an existing or potential risk is stored in a database (e.g., map data). For example, map data may annotate a previously reported risk, a known existing risk, a known future risk, a collision-prone zone, a construction zone, and a heavy traffic zone.
[0119] In one embodiment, the risk reaction process adapts a motion planning process of the AV system based on one or more risks. For example, when a risk is near the AV, a motion planning process may plan a lane-changing trajectory to bypass the risk. In contrast, when a risk is far away, a motion planning process may plan a trajectory to its goal position by choosing another route to circumvent the risk.
[0120] In one embodiment, the risk reaction process enhances or alters a perception process of the AV system. When the AV is expected to pass near a risk, the information about the risk is assigned a higher priority for the perception process. For example, referring to FIG. 9, in an embodiment, the perception process recognizes objects (e.g., 901 and 902) in their normal conditions. However, when the two objects collide, the perception process may fail to recognize the two collided objects (e.g., 911 and 912), because the collided objects 911 and 912 have deformed shapes or the perception process may misclassify the collided objects 911 and 912 as a single unknown object. In such cases, the risk reaction process configures the perception process to consider the risk information as prior information C and to use probabilistic inference p(S|C) to correctly recognize the collided objects (S) in the environment of the AV.
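A minimal sketch of this prior-injection idea follows: when the risk context C is active, the perception system's class priors are rescaled so that the posterior p(S|C) favors the collided-vehicles hypothesis S. The class names, likelihood values, and prior values are illustrative assumptions.

```python
# Hedged sketch: Bayesian update with risk information C as prior knowledge.
# posterior(S) is proportional to likelihood(S) * prior(S), normalized over classes.

def posterior(likelihoods, priors):
    unnorm = {s: likelihoods[s] * priors[s] for s in likelihoods}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

# Sensor evidence is ambiguous between "one unknown object" and "collided vehicles".
likelihoods = {"unknown_object": 0.5, "collided_vehicles": 0.4, "truck": 0.1}

neutral_prior = {"unknown_object": 0.4, "collided_vehicles": 0.1, "truck": 0.5}
risk_prior    = {"unknown_object": 0.1, "collided_vehicles": 0.7, "truck": 0.2}  # C active

neutral = posterior(likelihoods, neutral_prior)
risky = posterior(likelihoods, risk_prior)
print(max(neutral, key=neutral.get))  # unknown_object (misclassification)
print(max(risky, key=risky.get))      # collided_vehicles (risk prior applied)
```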
[0121] In one embodiment, the risk reaction process triggers a teleoperation system (442 in FIG. 4) for a tele-operator to guide the driving of the AV. For example, when a risk is observed and the AV is unable to drive along a previously planned trajectory, the risk reaction process sends a request to the tele-operation system. An intervention on the driving of the AV may be invoked based on the tele-operation system. Additional information about such tele-operation is found in United States patent application serial number 15/624,780, filed June 16, 2017, which is incorporated here by reference.
[0122] In one embodiment, the way the risk reaction process changes or adapts another process is based on a flag. For instance, when a risk is predicted or detected, the flag is turned from inactive (e.g., represented by 0) to active (e.g., represented by 1), so the other process will retrieve or listen to outputs from the risk reaction process.
[0123] In one embodiment, the way the risk reaction process alters another underlying process is based on programming code. For example, referring to FIG. 10, in an embodiment, the underlying process 1000 executes routine instructions. When the AV system is aware of a risk, the risk reaction process 1002 dynamically generates a set of instructions that can be inserted into the executions of the underlying process 1000. In some cases, the underlying process 1010 takes one or more inputs (e.g., detected objects in the environment of the AV system), and an output of the reaction process 1012 is treated as an input (e.g., an additional stationary object blocking the current trajectory of the AV system) to the underlying process 1010.
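Both mechanisms can be sketched together as follows; the class and attribute names are assumptions made for illustration, not part of the described embodiments.

```python
# Hedged sketch of the flag and input-injection mechanisms: when the risk flag
# is active, an output of the risk reaction process (e.g., a stationary object
# blocking the current trajectory) is treated as an input to the underlying process.

class UnderlyingProcess:
    def __init__(self):
        self.risk_flag = 0   # 0 = inactive, 1 = active (set by the risk reaction process)
        self.inputs = []     # e.g., detected objects in the environment

    def step(self, risk_reaction_output=None):
        if self.risk_flag and risk_reaction_output is not None:
            self.inputs.append(risk_reaction_output)   # listen to the reaction process
        return list(self.inputs)

proc = UnderlyingProcess()
proc.inputs = ["vehicle_512", "pedestrian_537"]
proc.step("stationary_obstacle_at_collision_site")   # flag inactive: output ignored
proc.risk_flag = 1                                   # a risk is predicted or detected
print(proc.step("stationary_obstacle_at_collision_site"))  # obstacle now injected
```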
[0124] In one embodiment, the risk reaction process generates an alarm of a potential risk or an existing risk; the alarm is based on visual or audio signals. The reactions to be taken or recommended by the risk reaction process are provided via visual or audio signals. FIG. 11 illustrates an example of an interface 1100 of a risk processing system. In an embodiment, the interface 1100 presents a risk (e.g., collision 1122) detected near the AV 1120. In some cases, the interface generates an audio signal (e.g., a sound, or a spoken language, or both) to provide an alert for the risk 1122. The risk reaction process also may change a previous navigation guidance for a left-turn trajectory 1124 for the AV 1120 to a new navigation guidance for a straight trajectory 1126. The change in trajectory may be presented on the interface 1100, or verbally described by the interface 1100, or both.
Risk Factor Assessment
[0125] In one embodiment, a risk factor assessing process calculates existing or potential risk factors of the AV system. The risk factor assessment may be used to compute insurance premiums. For example, in one embodiment, the risk factor assessing process evaluates the number of miles to be traversed for a trajectory. A larger number of miles may imply a higher risk.
[0126] In one embodiment, the risk factor assessing process evaluates the number of miles to be traveled during a time period. A larger number of miles may imply a higher risk.
[0127] In one embodiment, a risk is not physically external to the AV. In some cases, the risk factor assessing process evaluates a health condition of the AV. For example, components of the AV (e.g., tires, brakes, engines, steering wheels, accessory belt tension pulleys, accessory belt tensioners, camshaft position sensors, crankshaft position sensors, crankshaft pulleys, crankshaft seals, cylinder heads, engine gasket sets, harmonic balancers, knock sensors, motor and transmission mounts, motor and transmission mount brackets, oil cooler hoses, oil dipsticks, oil drain plug gasket, oil filters, oil level sensors, oil pans, oil pan gaskets, oil pressure switches, oil pumps, rod bearing sets, timing belts, timing belt kits, timing belt tensioners, valve cover, valve cover gaskets, valve stem seals, perception sensors, perception process, motion planning process, databases, and computing processors) that have aged or have been used heavily may imply a higher risk.
[0128] In one embodiment, the risk factor assessing process evaluates if a risk processing system is active when driving the AV. An inactive (e.g., due to malfunctioning or deactivation by the vehicle operator) risk processing system may imply a higher risk.
[0129] In one embodiment, the risk factor assessing process evaluates if an anti-theft system (e.g., alarms and sensors) is active when driving the AV. An inactive (e.g., due to malfunctioning or deactivation by the vehicle operator) anti-theft system may imply a higher risk.
[0130] In one embodiment, the risk factor assessing process evaluates if a trajectory of the AV is through a risky region. Passing through a risky region may imply a higher risk.
[0131] In one embodiment, the risk factor assessing process evaluates a user's profile. A user may be an occupant of the AV, or a party using the AV for transporting someone else or something else. Examples of a user profile include age, occupation, driving history, driving behaviors, driving purpose, active driving license, a frequency of using vehicles, income level and history, education level and history, and home address, among others.
[0132] In one embodiment, the risk factor assessing process evaluates a user's social network profile. For example, in some cases, the risk processing system connects to the user's social network accounts (e.g., Facebook, LinkedIn, Instagram, YouTube, and personal website) to evaluate red flags for the user. A user having, or being prone to, for example, a psychiatric disorder, a terroristic tendency, or abnormal use of weapons, or combinations of these, may imply a higher risk.
[0133] In one embodiment, the risk factor assessing process evaluates a driving behavior. For example, a driving behavior deviating from a normal behavior may imply a higher risk. As another example, a driving behavior violating a traffic rule may imply a higher risk. As another example, a driving behavior inducing a lower comfort level may imply a higher risk.
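As an illustration of how several of these factors could be combined, for example when computing an insurance premium, consider the following sketch; the linear form and the weights are assumptions chosen for illustration, not a formula prescribed by this description.

```python
# Hedged sketch: combine several risk factors from the paragraphs above into a
# single risk score. Weights (and the linear combination itself) are assumed.

def risk_score(miles, component_age_years, risk_system_active,
               anti_theft_active, passes_risky_region,
               w=(0.002, 0.05, 0.3, 0.1, 0.2)):
    return (w[0] * miles                                 # more miles -> higher risk
            + w[1] * component_age_years                 # aged components -> higher risk
            + w[2] * (0 if risk_system_active else 1)    # inactive risk system penalized
            + w[3] * (0 if anti_theft_active else 1)     # inactive anti-theft penalized
            + w[4] * (1 if passes_risky_region else 0))  # risky region penalized

print(round(risk_score(miles=120, component_age_years=4,
                       risk_system_active=True, anti_theft_active=False,
                       passes_risky_region=True), 3))  # 0.74
```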
[0134] FIG. 12 illustrates an exemplary "cloud" computing environment. Cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services). In typical cloud computing systems, one or more large cloud data centers house the machines used to deliver the services provided by the cloud. Referring now to FIG. 12, the cloud computing environment 1200 includes cloud data centers 1204a, 1204b, and 1204c that are interconnected through the cloud 1202. Data centers 1204a, 1204b, and 1204c provide cloud computing services to computer systems 1206a, 1206b, 1206c, 1206d, 1206e, and 1206f connected to cloud 1202.
[0135] The cloud computing environment 1200 includes one or more cloud data centers. In general, a cloud data center, for example the cloud data center 1204a shown in FIG. 12, refers to the physical arrangement of servers that make up a cloud, for example the cloud 1202 shown in FIG. 12, or a particular portion of a cloud. For example, servers can be physically arranged in the cloud datacenter into rooms, groups, rows, and racks. A cloud datacenter has one or more zones, which include one or more rooms of servers. Each room has one or more rows of servers, and each row includes one or more racks. Each rack includes one or more individual server nodes. Servers in zones, rooms, racks, and/or rows may be arranged into groups based on physical infrastructure requirements of the datacenter facility, which include power, energy, thermal, heat, and/or other requirements. In an embodiment, the server nodes are similar to the computer system described in FIG. 13. The data center 1204a has many computing systems distributed through many racks.
[0136] The cloud 1202 includes cloud data centers 1204a, 1204b, and 1204c along with the network and networking resources (for example, networking equipment, nodes, routers, switches, and networking cables) that interconnect the cloud data centers 1204a, 1204b, and 1204c and help facilitate the computing systems' 1206a-1206f access to cloud computing services. In an
embodiment, the network represents any combination of one or more local networks, wide area networks, or internetworks coupled using wired or wireless links deployed using terrestrial or satellite connections. Data exchanged over the network is transferred using any number of network
layer protocols, such as Internet Protocol (IP), Multiprotocol Label Switching (MPLS), Asynchronous Transfer Mode (ATM), Frame Relay, etc. Furthermore, in embodiments where the network represents a combination of multiple sub-networks, different network layer protocols are used at each of the underlying sub-networks. In some embodiments, the network represents one or more interconnected internetworks, such as the public Internet.
[0137] The computing systems 1206a-1206f or cloud computing services consumers are connected to the cloud 1202 through network links and network adapters. In an embodiment, the computing systems 1206a-1206f are implemented as various computing devices, for example servers, desktops, laptops, tablets, smartphones, IoT devices, autonomous vehicles (including cars, drones, shuttles, trains, buses, etc.) and consumer electronics. The computing systems 1206a-1206f may also be implemented in, or as a part of, other systems.
[0138] FIG. 13 illustrates an example of a computer system 1300. In an embodiment, the computer system 1300 is a special purpose computing device. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, network devices or any other device that incorporates hardwired and/or program logic to implement the techniques.
[0139] The computer system 1300 may include a bus 1302 or other communication mechanism for communicating information, and a hardware processor 1304 coupled with a bus 1302 for processing information. The hardware processor 1304 may be, for example, a general-purpose microprocessor.
The computer system 1300 also includes a main memory 1306, such as a random-access memory
(RAM) or other dynamic storage device, coupled to the bus 1302 for storing information and instructions to be executed by processor 1304. The main memory 1306 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 1304. Such instructions, when stored in non-transitory storage media
accessible to the processor 1304, render the computer system 1300 into a special-purpose machine that is customized to perform the operations specified in the instructions.
[0140] In an embodiment, the computer system 1300 further includes a read only memory (ROM) 1308 or other static storage device coupled to the bus 1302 for storing static information and instructions for the processor 1304. A storage device 1310, such as a magnetic disk, optical disk, or solid-state drive is provided and coupled to the bus 1302 for storing information and instructions.
[0141] The computer system 1300 may be coupled via the bus 1302 to a display 1312, such as a cathode ray tube (CRT), a liquid crystal display (LCD), plasma display, light emitting diode (LED) display, or an organic light emitting diode (OLED) display for displaying information to a computer user. An input device 1314, including alphanumeric and other keys, is coupled to bus 1302 for communicating information and command selections to the processor 1304. Another type of user input device is a cursor controller 1316, such as a mouse, a trackball, a touch-enabled display, or cursor direction keys for communicating direction information and command selections to the processor 1304 and for controlling cursor movement on the display 1312. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x-axis) and a second axis (e.g., y-axis), that allows the device to specify positions in a plane.
[0142] According to one embodiment, the techniques herein are performed by the computer system 1300 in response to the processor 1304 executing one or more sequences of one or more instructions contained in the main memory 1306. Such instructions may be read into the main memory 1306 from another storage medium, such as the storage device 1310. Execution of the sequences of instructions contained in the main memory 1306 causes the processor 1304 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
[0143] The term "storage media" as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as the storage device 1310. Volatile media includes dynamic memory, such as the main memory 1306. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any
other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
[0144] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 1302. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
[0145] Various forms of media may be involved in carrying one or more sequences of one or more instructions to the processor 1304 for execution. For example, the instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the computer system 1300 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector can receive the data carried in the infrared signal and appropriate circuitry can place the data on the bus 1302. The bus 1302 carries the data to the main memory 1306, from which processor 1304 retrieves and executes the instructions. The instructions received by the main memory 1306 may optionally be stored on the storage device 1310 either before or after execution by processor 1304.
[0146] The computer system 1300 also includes a communication interface 1318 coupled to the bus 1302. The communication interface 1318 provides a two-way data communication coupling to a network link 1320 that is connected to a local network 1322. For example, the communication interface 1318 may be an integrated service digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, the communication interface 1318 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such embodiment, the communication interface 1318 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
[0147] The network link 1320 typically provides data communication through one or more networks
to other data devices. For example, the network link 1320 may provide a connection through the local network 1322 to a host computer 1324 or to a cloud data center or equipment operated by an Internet Service Provider (ISP) 1326. The ISP 1326 in turn provides data communication services through the world-wide packet data communication network now commonly referred to as the "Internet" 1328. The local network 1322 and Internet 1328 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on the network link 1320 and through the communication interface 1318, which carry the digital data to and from the computer system 1300, are example forms of transmission media. In an embodiment, the network 1320 may contain or may be a part of the cloud 1202 described above.
[0148] The computer system 1300 can send messages and receive data, including program code, through the network(s), the network link 1320, and the communication interface 1318. In an embodiment, the computer system 1300 may receive code for processing. The received code may be executed by the processor 1304 as it is received, and/or stored in storage device 1310, or other nonvolatile storage for later execution.
[0149] Although the descriptions in this document have described embodiments wherein the tele-operator is a person, tele-operator functions can be performed partially or fully automatically.
[0150] Other embodiments are also within the scope of the claims.
[0151] In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. The sole and exclusive indicator of the scope of the invention, and what is intended by the applicants to be the scope of the invention, is the literal and equivalent scope of the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in such claims shall govern the meaning of such terms as used in the claims. In addition, when we use the term "further comprising," in the foregoing specification or following claims, what follows this phrase can be an additional step or entity, or a sub-step/sub-entity of a previously-recited step or entity.
Claims
1. An apparatus comprising:
a processor configured to:
process data to identify a risk of driving a vehicle comprising an autonomous driving capability,
modify the autonomous driving capability in response to the risk, and
update operation of the vehicle based on the modifying of the autonomous capability; and
an alarm configured to issue an alert of the identified risk.
2. The apparatus of claim 1, wherein the identifying comprises detecting or predicting the risk.
3. The apparatus of claim 1, wherein the data comprises at least one of sensor signals, or known or predicted risks.
4. The apparatus of claim 1, wherein the identifying comprises at least one of:
analyzing the sensor signals to determine a position of an object,
analyzing the sensor signals to evaluate a speed of an object,
analyzing the sensor signals to evaluate speed profiles of two or more objects over time,
analyzing the sensor signals to identify a boundary of an object,
analyzing the sensor signals to identify overlapping boundaries of two or more objects, or
analyzing the sensor signals to determine a concentration of a chemical.
5. The apparatus of claim 1, wherein the identifying comprises analyzing the sensor signals to segment one or more objects in an image or a video.
6. The apparatus of claim 5, wherein the identifying comprises tracking the segmented one or more objects.
7. The apparatus of claim 1, wherein the identifying comprises assessing a threat.
8. The apparatus of claim 1, wherein the identifying comprises evaluating a driving behavior of at least one of the vehicle or another vehicle.
9. The apparatus of claim 8, wherein evaluating a driving behavior comprises evaluating at least one of a speed, a heading, a trajectory, or a vehicular operation.
10. The apparatus of claim 1, wherein the identifying comprises learning a pattern of known risks.
11. The apparatus of claim 10, wherein the pattern comprises associations of the known risks with one or more of objects, times, road configurations, or geolocations.
12. The apparatus of claim 1, wherein updating the operation of the vehicle comprises executing at least one of a lane change by a motion planning system of the vehicle, or a trajectory change by a motion planning system of the vehicle.
13. The apparatus of claim 1, wherein the processor is configured to treat analyzed sensor signals as prior information to a perception system of the vehicle in response to the risk.
14. The apparatus of claim 1, wherein the processor is configured to invoke intervention on an operation of the vehicle from a remote operator in response to the risk.
15. The apparatus of claim 1, wherein updating the operation of the vehicle comprises at least one of:
generating and inserting new machine instructions into existing machine instructions of an operation of the vehicle, or
generating an input to an operation of the vehicle.
16. The apparatus of claim 1, wherein the processor is configured to generate a report of a risk.
17. The apparatus of claim 16, wherein generating a report of a risk comprises extracting and
aggregating, from sensor signals, information associated with the risk.
18. The apparatus of claim 17, wherein extracting and aggregating the information comprises evaluating one or more of overlapping geographic zones, overlapping time periods, or report frequencies.
19. The apparatus of claim 16, wherein generating a report of a risk comprises one of:
recording an environment of the risk,
stitching images or videos to form a view of the risk,
removing private information associated with the risk,
providing a graphical user interface to allow a user to provide information associated with the risk, or
integrating two or more reports associated with the risk.
20. The apparatus of claim 1, wherein the processor is configured to receive a report of a risk from a remote data source.
21. The apparatus of claim 1, wherein the processor is configured to assess a risk factor of the
vehicle.
22. The apparatus of claim 21, wherein assessing a risk factor comprises one of:
determining a risk associated with passing through a risky region,
determining a risk associated with a driving distance or a driving time period,
determining a risk associated with an aging component of the vehicle,
determining a risk associated with an inactive autonomous driving capability,
determining a risk based on a profile of a user of the vehicle,
determining a risk based on social network information of a user of the vehicle,
determining a risk associated with a driving behavior, or
determining a risk associated with following traffic rules.
23. A method comprising:
receiving sensor signals using a vehicle comprising an autonomous driving capability;
identifying a risk associated with operating the vehicle based on the sensor signals;
modifying the autonomous driving capability in response to the risk; and
updating the operation of the vehicle based on the modifying of the autonomous capability.
24. A vehicle having autonomous driving capabilities, the vehicle comprising:
steering, acceleration, and deceleration devices that respond to control signals from a driving control system to drive the vehicle autonomously on a road network;
a monitoring element on the vehicle that:
receives sensor signals, and
identifies a risk associated with operating the vehicle based on the sensor signals; and
a controller that responds to the risk by configuring the driving control system to:
modify the autonomous driving capability in response to the risk, and
based on the modifying of the autonomous capability, update an operation of the vehicle to maneuver the vehicle to a goal location.
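For illustration only, the following minimal sketch shows one way the flow recited in claim 23 — receiving sensor signals, identifying a risk, modifying an autonomous driving capability, and updating vehicle operation — might be organized. The specification discloses no source code; this is not the claimed implementation, and every name, threshold, and data structure below (SensorSignals, identify_risk, max_speed, and so on) is a hypothetical assumption introduced here.

```python
# Illustrative sketch of the risk-processing loop of claim 23.
# All names and thresholds are hypothetical, not drawn from the patent.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class RiskLevel(Enum):
    LOW = auto()
    HIGH = auto()


@dataclass
class SensorSignals:
    object_positions: list          # positions of tracked objects (cf. claim 4)
    object_speeds: list             # speeds of tracked objects (cf. claim 4)
    chemical_concentration: float   # chemical reading, e.g. ppm (cf. claim 4)


@dataclass
class Risk:
    level: RiskLevel
    description: str


def identify_risk(signals: SensorSignals) -> Optional[Risk]:
    """Identify a risk from sensor signals; thresholds are placeholders."""
    if signals.chemical_concentration > 400.0:
        return Risk(RiskLevel.HIGH, "elevated chemical concentration")
    if any(speed > 30.0 for speed in signals.object_speeds):
        return Risk(RiskLevel.LOW, "fast-moving nearby object")
    return None


def process_cycle(signals: SensorSignals, capability: dict) -> dict:
    """One monitoring cycle: identify risk, modify capability, update operation."""
    risk = identify_risk(signals)
    if risk is None:
        return {"capability": capability, "action": "continue"}
    # Modify the autonomous driving capability in response to the risk.
    modified = dict(capability, max_speed=min(capability["max_speed"], 15.0))
    # Update vehicle operation based on the modified capability, e.g. by
    # handing a new constraint to a motion-planning system (cf. claim 12).
    return {"capability": modified, "action": f"replan ({risk.description})"}


if __name__ == "__main__":
    signals = SensorSignals([(3.0, 4.0)], [35.0], 120.0)
    print(process_cycle(signals, {"max_speed": 25.0}))
```

A real system would derive such thresholds from learned patterns of known risks (cf. claims 10-11) and from aggregated risk reports (cf. claims 16-19) rather than hard-coded constants.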
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18820858.1A EP3642068A4 (en) | 2017-06-20 | 2018-06-20 | Risk processing for vehicles having autonomous driving capabilities |
KR1020227011824A KR102628790B1 (en) | 2017-06-20 | 2018-06-20 | Risk processing for vehicles having autonomous driving capabilities |
KR1020207001316A KR102387240B1 (en) | 2017-06-20 | 2018-06-20 | Risk handling for vehicles with autonomous driving capabilities |
CN201880053834.5A CN110997387B (en) | 2017-06-20 | 2018-06-20 | Risk handling for vehicles with autonomous driving capability |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762522254P | 2017-06-20 | 2017-06-20 | |
US62/522,254 | 2017-06-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018237018A1 (en) | 2018-12-27 |
Family
ID=64656555
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2018/038520 WO2018237018A1 (en) | 2017-06-20 | 2018-06-20 | Risk processing for vehicles having autonomous driving capabilities |
Country Status (5)
Country | Link |
---|---|
US (2) | US11008000B2 (en) |
EP (1) | EP3642068A4 (en) |
KR (2) | KR102387240B1 (en) |
CN (1) | CN110997387B (en) |
WO (1) | WO2018237018A1 (en) |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11008000B2 (en) | 2017-06-20 | 2021-05-18 | Motional Ad Llc | Risk processing for vehicles having autonomous driving capabilities |
CN107749193B (en) * | 2017-09-12 | 2020-12-04 | 华为技术有限公司 | Driving risk analysis and risk data sending method and device |
JP6575016B2 (en) * | 2017-09-22 | 2019-09-18 | 本田技研工業株式会社 | Vehicle control apparatus, vehicle control method, and program |
US10902336B2 (en) * | 2017-10-03 | 2021-01-26 | International Business Machines Corporation | Monitoring vehicular operation risk using sensing devices |
DE102017221634B4 (en) * | 2017-12-01 | 2019-09-05 | Audi Ag | Motor vehicle with a vehicle guidance system, method for operating a vehicle guidance system and computer program |
DE102018118761A1 (en) * | 2018-08-02 | 2020-02-06 | Robert Bosch Gmbh | Method for at least partially automated driving of a motor vehicle |
EP3621310A1 (en) * | 2018-09-10 | 2020-03-11 | Panasonic Intellectual Property Corporation of America | Video transmitting device, video transmitting method, and program |
DE102018218922A1 (en) * | 2018-11-06 | 2020-05-07 | Robert Bosch Gmbh | Prediction of expected driving behavior |
US11302182B2 (en) * | 2018-12-28 | 2022-04-12 | Beijing Voyager Technology Co., Ltd. | Systems and methods for vehicle identification |
US11400944B2 (en) | 2019-01-04 | 2022-08-02 | Byton North America Corporation | Detecting and diagnosing anomalous driving behavior using driving behavior models |
US20200216027A1 (en) * | 2019-01-04 | 2020-07-09 | Byton North America Corporation | Detecting vehicle intrusion using command pattern models |
EP3966799A4 (en) * | 2019-05-06 | 2023-01-04 | 3M Innovative Properties Company | Risk assessment for temporary zones |
WO2021021008A1 (en) * | 2019-08-01 | 2021-02-04 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods for risk management for autonomous devices and related node |
US11945440B2 (en) | 2019-08-23 | 2024-04-02 | Motional Ad Llc | Data driven rule books |
JP7161458B2 (en) * | 2019-09-09 | 2022-10-26 | 本田技研工業株式会社 | VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM |
CN111161527B (en) * | 2020-01-03 | 2021-11-26 | 财团法人车辆研究测试中心 | Remote monitoring system and method for self-driving |
US11866037B2 (en) * | 2020-03-16 | 2024-01-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Behavior-based vehicle alerts |
CN113799793B (en) * | 2020-05-29 | 2023-05-12 | 魔门塔(苏州)科技有限公司 | System for realizing automatic iteration of prediction model based on data driving |
WO2021261680A1 (en) | 2020-06-26 | 2021-12-30 | 주식회사 에스오에스랩 | Sensor data sharing and utilizing method |
US20220032926A1 (en) * | 2020-08-03 | 2022-02-03 | Autobrains Technologies Ltd | Construction area alert |
US11568688B2 (en) | 2020-08-25 | 2023-01-31 | Motional Ad Llc | Simulation of autonomous vehicle to improve safety and reliability of autonomous vehicle |
US11691643B2 (en) | 2020-08-27 | 2023-07-04 | Here Global B.V. | Method and apparatus to improve interaction models and user experience for autonomous driving in transition regions |
US11687094B2 (en) | 2020-08-27 | 2023-06-27 | Here Global B.V. | Method, apparatus, and computer program product for organizing autonomous vehicles in an autonomous transition region |
US11713979B2 (en) | 2020-08-27 | 2023-08-01 | Here Global B.V. | Method, apparatus, and computer program product for generating a transition variability index related to autonomous driving |
US20220080962A1 (en) * | 2020-09-14 | 2022-03-17 | Motional Ad Llc | Vehicle operation using a behavioral rule model |
US20220089187A1 (en) * | 2020-09-22 | 2022-03-24 | Coast Autonomous, Inc. | Multi-layer autonomous vehicle control architecture |
US11886195B2 (en) * | 2020-11-30 | 2024-01-30 | Woven By Toyota, U.S., Inc. | Active learning and validation system for vehicles |
CN112896187B (en) * | 2021-02-08 | 2022-07-26 | 浙江大学 | System and method for considering social compatibility and making automatic driving decision |
US11820387B2 (en) * | 2021-05-10 | 2023-11-21 | Qualcomm Incorporated | Detecting driving behavior of vehicles |
US20220374845A1 (en) * | 2021-05-18 | 2022-11-24 | International Business Machines Corporation | Predict vehicle maintenance based on navigation route roadway characteristics |
CN113320536A (en) * | 2021-07-16 | 2021-08-31 | 北京航迹科技有限公司 | Vehicle control method and system |
CN113687991B (en) * | 2021-08-25 | 2023-08-22 | 北京赛目科技股份有限公司 | Vehicle defect recommending method and device |
CN113859250B (en) * | 2021-10-14 | 2023-10-10 | 泰安北航科技园信息科技有限公司 | Intelligent networking automobile information security threat detection system based on driving behavior anomaly recognition |
WO2022104295A1 (en) * | 2021-12-29 | 2022-05-19 | Innopeak Technology, Inc. | Object detection using fusion of vision, location, and/or other signals |
CN115402236B (en) * | 2022-09-19 | 2024-05-14 | 阿维塔科技(重庆)有限公司 | Vehicle-mounted sensor position monitoring system and method |
CN116653963B (en) * | 2023-07-31 | 2023-10-20 | 福思(杭州)智能科技有限公司 | Vehicle lane change control method, system and intelligent driving domain controller |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5530651A (en) * | 1992-08-03 | 1996-06-25 | Mazda Motor Corporation | Running-safety system for an automotive vehicle |
US20150019043A1 (en) * | 2013-07-12 | 2015-01-15 | Jaybridge Robotics, Inc. | Computer-implemented method and system for controlling operation of an autonomous driverless vehicle in response to obstacle detection |
US9248834B1 (en) | 2014-10-02 | 2016-02-02 | Google Inc. | Predicting trajectories of objects based on contextual information |
US20160176397A1 (en) * | 2014-12-23 | 2016-06-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Risk mitigation for autonomous vehicles relative to oncoming objects |
US20160347310A1 (en) * | 2015-05-27 | 2016-12-01 | Dov Moran | Alerting predicted accidents between driverless cars |
US20170015318A1 (en) * | 2014-03-03 | 2017-01-19 | Inrix Inc. | Personalization of automated vehicle control |
EP3141926A1 (en) | 2015-09-10 | 2017-03-15 | Continental Automotive GmbH | Automated detection of hazardous drifting vehicles by vehicle sensors |
WO2017079236A2 (en) | 2015-11-04 | 2017-05-11 | Zoox, Inc. | Internal safety systems for robotic vehicles |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9552726B2 (en) * | 2009-08-24 | 2017-01-24 | Here Global B.V. | Providing driving condition alerts using road attribute data |
JP2011253302A (en) * | 2010-06-01 | 2011-12-15 | Toyota Motor Corp | Risk calculation device for vehicle |
US9958272B2 (en) * | 2012-08-10 | 2018-05-01 | Telogis, Inc. | Real-time computation of vehicle service routes |
WO2015088522A1 (en) * | 2013-12-11 | 2015-06-18 | Intel Corporation | Individual driving preference adapted computerized assist or autonomous driving of vehicles |
US9663111B2 (en) | 2014-05-30 | 2017-05-30 | Ford Global Technologies, Llc | Vehicle speed profile prediction using neural networks |
CN105980950B (en) * | 2014-09-05 | 2019-05-28 | 深圳市大疆创新科技有限公司 | The speed control of unmanned vehicle |
US9494935B2 (en) * | 2014-11-13 | 2016-11-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | Remote operation of autonomous vehicle in unexpected environment |
KR101714273B1 (en) * | 2015-12-11 | 2017-03-08 | 현대자동차주식회사 | Method and apparatus for controlling path of autonomous driving system |
EP3433131B1 (en) * | 2016-03-23 | 2023-07-12 | Netradyne, Inc. | Advanced path prediction |
US10789842B2 (en) * | 2016-05-17 | 2020-09-29 | Ford Global Technologies, Llc | Apparatus and methods for detection and notification of icy conditions using integrated vehicle sensors |
US20180239359A1 (en) * | 2016-08-16 | 2018-08-23 | Faraday&Future Inc. | System and method for determining navigational hazards |
US10452068B2 (en) * | 2016-10-17 | 2019-10-22 | Uber Technologies, Inc. | Neural network system for autonomous vehicle control |
JP7036610B2 (en) * | 2017-03-16 | 2022-03-15 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Learning methods and programs |
US10134279B1 (en) * | 2017-05-05 | 2018-11-20 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for visualizing potential risks |
US11008000B2 (en) * | 2017-06-20 | 2021-05-18 | Motional Ad Llc | Risk processing for vehicles having autonomous driving capabilities |
2018
- 2018-06-20 US US16/013,459 patent/US11008000B2/en active Active
- 2018-06-20 EP EP18820858.1A patent/EP3642068A4/en active Pending
- 2018-06-20 CN CN201880053834.5A patent/CN110997387B/en active Active
- 2018-06-20 WO PCT/US2018/038520 patent/WO2018237018A1/en unknown
- 2018-06-20 KR KR1020207001316A patent/KR102387240B1/en active IP Right Grant
- 2018-06-20 KR KR1020227011824A patent/KR102628790B1/en active IP Right Grant
2021
- 2021-05-14 US US17/320,887 patent/US11897460B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3642068A4 |
Also Published As
Publication number | Publication date |
---|---|
US20210339742A1 (en) | 2021-11-04 |
KR102387240B1 (en) | 2022-04-15 |
KR20200019696A (en) | 2020-02-24 |
KR102628790B1 (en) | 2024-01-24 |
US11008000B2 (en) | 2021-05-18 |
US20180362031A1 (en) | 2018-12-20 |
US11897460B2 (en) | 2024-02-13 |
CN110997387A (en) | 2020-04-10 |
EP3642068A4 (en) | 2020-10-21 |
KR20220050233A (en) | 2022-04-22 |
CN110997387B (en) | 2023-06-20 |
EP3642068A1 (en) | 2020-04-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11897460B2 (en) | Risk processing for vehicles having autonomous driving capabilities | |
US11714413B2 (en) | Planning autonomous motion | |
US20220244063A1 (en) | Data processing system communicating with a map data processing system to determine or alter a navigation path based on one or more road segments | |
US10593197B1 (en) | Networked vehicle control systems to facilitate situational awareness of vehicles | |
US11860979B2 (en) | Synchronizing image data with either vehicle telematics data or infrastructure data pertaining to a road segment | |
US11568688B2 (en) | Simulation of autonomous vehicle to improve safety and reliability of autonomous vehicle | |
KR20190042088A (en) | Unexpected impulse change collision detector | |
KR102565573B1 (en) | Metric back-propagation for subsystem performance evaluation | |
KR102657847B1 (en) | Vehicle operation using a behavioral rule model | |
US11932260B2 (en) | Selecting testing scenarios for evaluating the performance of autonomous vehicles | |
US12058552B2 (en) | Systems and methods for selecting locations to validate automated vehicle data transmission | |
WO2013170882A1 (en) | Collaborative vehicle detection of objects with a predictive distribution | |
US20220413502A1 (en) | Method, apparatus, and system for biasing a machine learning model toward potential risks for controlling a vehicle or robot | |
US20230096773A1 (en) | Mitigating fire risks in vehicles | |
WO2023126695A1 (en) | Application of mean time between failure (mtbf) models for autonomous vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18820858 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 20207001316 Country of ref document: KR Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2018820858 Country of ref document: EP Effective date: 20200120 |