US20230356751A1 - Malicious event detection for autonomous vehicles - Google Patents
- Publication number
- US20230356751A1 (application US18/343,210)
- Authority
- US
- United States
- Prior art keywords
- events
- series
- normalcy
- autonomous vehicle
- mode
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H04W4/90—Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
- B60W60/0016—Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
- B60W60/00188—Planning or execution of driving tasks specially adapted for safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions related to detected security violation of control systems, e.g. hacking of moving vehicle
- B60R25/102—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device a signal being sent to a remote location, e.g. a radio signal being transmitted to a police station, a security company or the owner
- B60R25/104—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device characterised by the type of theft warning signal, e.g. visual or audible signals with special characteristics
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G07C5/0808—Diagnosing performance data
- G08G1/0133—Traffic data processing for classifying traffic situation
- G08G1/052—Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H04L67/141—Setup of application sessions
- H04W4/44—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
- B60W2300/14—Trailers, e.g. full trailers, caravans
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2420/408—
- B60W2420/42—Image sensing, e.g. optical camera
- B60W2420/52—Radar, Lidar
- B60W2510/06—Combustion engines, Gas turbines
- B60W2510/10—Change speed gearings
- B60W2520/10—Longitudinal speed
- B60W2554/20—Static objects
- B60W2554/4029—Pedestrians
- B60W2556/10—Historical data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2756/10—Involving external transmission of data to or from the vehicle
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
- G08G1/0145—Measuring and analyzing of parameters relative to traffic conditions for specific applications for active traffic flow control
- G08G1/0965—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages responding to signals from another vehicle, e.g. emergency vehicle
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
- G08G1/096741—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where the source of the transmitted information selects which information to transmit to each vehicle
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
Definitions
- the present disclosure relates generally to autonomous vehicles. More particularly, the present disclosure is related to malicious event detection for autonomous vehicles.
- One aim of autonomous vehicle technologies is to provide vehicles that can safely navigate towards a destination.
- an autonomous vehicle may encounter an unexpected situation on its way to a destination.
- an autonomous vehicle may encounter a situation where a third party, such as a vehicle, an individual, or a pedestrian, attempts to tamper with the AV.
- a third party may attempt to force the autonomous vehicle to deviate from its predetermined traveling path or force the autonomous vehicle to pull over.
- Current autonomous vehicle technologies may not be configured to account for encountering specific unexpected situations.
- This disclosure recognizes various problems and previously unmet needs related to detecting malicious events affecting autonomous vehicles.
- Current autonomous vehicle technologies may not be configured to account for malicious intents of vehicles, individuals, or pedestrians attempting to tamper with an autonomous vehicle (AV).
- one or more vehicles may intentionally force the AV to deviate from its routing plan or traveling path by invading space within a threshold distance from the AV.
- one or more vehicles may intentionally force the AV to pull over.
- one or more vehicles may intentionally force the AV to slow down.
- the one or more vehicles causing the above-identified events may attempt to access or steal cargo carried by the AV, or devices and autonomous technology present in the AV.
- one or more vehicles may intentionally or inadvertently collide with the AV and flee the scene of the accident.
- Certain embodiments of this disclosure provide unique technical solutions to technical problems of current autonomous vehicle technologies, including those problems described above, by detecting a series of events within a threshold period, where the series of events corresponds to a malicious event.
- the detected series of events may indicate a deviation from a normalcy mode, where the normalcy mode comprises expected or predictable scenarios in various road environments.
- the various road environments may comprise, for example, when the AV is in traffic.
- sensors of the AV detect that all surrounding vehicles in the traffic are stopped or slowing down.
- if the sensors detect that one or more surrounding vehicles are slowing down while other vehicles are not slowing down, the AV may determine that this situation is a deviation from the normalcy mode.
- Another example of a road environment in the normalcy mode may be when the AV is driving along a road.
- the sensors of the AV detect that 1) vehicles driving next to the AV do not drive parallel to the AV for more than a threshold period and 2) the vehicles driving next to the AV do not invade a threshold distance from the AV for more than a threshold period.
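The traffic example above can be sketched as a simple check: a uniform slowdown of all surrounding vehicles matches the normalcy mode, while a mixed pattern in which only some vehicles slow down is flagged as a deviation. This is an illustrative sketch only; the function name and the speed-delta threshold are assumptions, not taken from the disclosure.

```python
def deviates_from_normalcy(speed_deltas, slowdown_threshold=-1.0):
    """Flag a deviation when only some surrounding vehicles slow down.

    speed_deltas: recent speed changes (m/s) of surrounding vehicles.
    A uniform slowdown (ordinary traffic) or uniform cruising matches
    the normalcy mode; a mixed pattern, with some vehicles braking
    while others keep speed, does not.
    """
    slowing = [dv <= slowdown_threshold for dv in speed_deltas]
    if all(slowing) or not any(slowing):
        return False  # uniform behavior: an expected scenario
    return True       # mixed behavior: deviation from the normalcy mode
```

A production system would fuse many sensor signals rather than a single speed delta per vehicle; the point here is only the shape of the normalcy comparison.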
- Upon detecting a series of events that deviates from the normalcy mode, the series of events is escalated to be addressed. For example, a communication path may be established with the AV such that a remote operator can be seen and/or heard from a communication module at the AV in order to discourage the individuals tampering with the AV from continuing to do so. As another example, a notifying message may be sent to law enforcement indicating that the AV is being tampered with at a particular location coordinate.
- a system comprises an AV that comprises at least one vehicle sensor, where the AV is configured to travel along a road.
- the system further comprises a control device that is operably coupled with the AV.
- the control device detects, from sensor data received from the vehicle sensor, a series of events within a threshold period, where a number of events in the series of events is above a threshold number.
- the series of events in the aggregate within the threshold period deviates from a normalcy mode.
- the normalcy mode comprises events that are expected to be encountered by the AV.
- the control device determines whether the series of events corresponds to a malicious event.
- In response to determining that the series of events corresponds to the malicious event, the control device escalates the series of events to be addressed, where escalating the series of events comprises performing at least one countermeasure to resolve the series of events. The at least one countermeasure comprises establishing a communication path between the AV and an operator such that the operator is able to converse with accomplices causing the series of events.
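The detection flow just described (collect events within a time window, compare the count of deviating events against a threshold number, then decide whether to escalate) can be sketched as follows. The class, event, and parameter names are hypothetical; the disclosure does not prescribe a software interface.

```python
from collections import deque


class MaliciousEventDetector:
    """Sketch of the control-device logic: keep events inside a sliding
    time window and report escalation when more than a threshold number
    of events in the window deviate from the normalcy mode."""

    def __init__(self, threshold_number, threshold_period, expected_scenarios):
        self.threshold_number = threshold_number          # max tolerated deviations
        self.threshold_period = threshold_period          # window length, seconds
        self.expected_scenarios = set(expected_scenarios) # normalcy mode
        self.window = deque()                             # (timestamp, event) pairs

    def observe(self, timestamp, event):
        """Record an event; return True when escalation is warranted."""
        self.window.append((timestamp, event))
        # Drop events older than the threshold period.
        while self.window and timestamp - self.window[0][0] > self.threshold_period:
            self.window.popleft()
        deviating = [e for _, e in self.window
                     if e not in self.expected_scenarios]
        return len(deviating) > self.threshold_number
```

For example, with `threshold_number=2` and `threshold_period=10.0`, a third deviating event within ten seconds triggers escalation, while events that age out of the window no longer count.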
- the disclosed systems provide several practical applications and technical advantages, which include: 1) technology that builds a normalcy mode, where the normalcy mode comprises expected or predictable scenarios in various road environments; 2) technology that detects a series of events in a threshold period and determines whether the series of events deviates from the normalcy mode, where the series of events is greater than a threshold number of events; 3) technology that establishes a communication path with the AV, in response to determining that the series of events corresponds to a malicious event, where the communication path supports voice and visual communications; 4) technology that sends a notifying message to law enforcement indicating that the AV is being tampered with at a particular location coordinate, in response to determining that the series of events corresponds to a malicious event; 5) technology that remotely activates a horn of the AV, in response to determining that the series of events corresponds to a malicious event; and 6) technology that activates a surveillance sensor to record the series of events, in response to determining that the series of events corresponds to a malicious event.
- the systems described in this disclosure may be integrated into a practical application of determining a more efficient, safe, and reliable solution for detecting malicious events perpetrated against the AV.
- the disclosed system may determine that a series of events detected within a threshold period deviates from the normalcy mode. The disclosed system may compare the detected series of events with the normalcy mode. If a corresponding expected scenario is found, the disclosed system determines that the series of events (in aggregation) does not correspond to a malicious event. If, however, no corresponding expected scenario is found, the disclosed system determines that the series of events (in aggregation) corresponds to a malicious event.
- the disclosed system may determine that more than a threshold number of events from the series of events detected within the threshold period deviate from the normalcy mode. In another example, the disclosed system may compare each event from the series of events with the expected scenarios to determine whether each event corresponds to an expected scenario.
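The two comparison strategies above, matching the detected series as a whole against the expected scenarios versus checking each event individually against a threshold, can be sketched in one hypothetical function (names and the scenario representation are assumptions for illustration):

```python
def series_is_malicious(series, expected_scenarios, per_event_threshold=None):
    """Decide whether a detected series of events corresponds to a
    malicious event.

    Aggregate mode (default): the series is malicious when it matches
    no expected scenario as a whole.
    Per-event mode: each event is checked individually, and the series
    is malicious when more than `per_event_threshold` events are
    unexpected.
    """
    if per_event_threshold is None:
        # Aggregate comparison against whole expected scenarios.
        return tuple(series) not in {tuple(s) for s in expected_scenarios}
    # Per-event comparison against the pool of expected events.
    expected_events = {e for s in expected_scenarios for e in s}
    unexpected = sum(1 for e in series if e not in expected_events)
    return unexpected > per_event_threshold
```

The aggregate mode mirrors the first example (no corresponding expected scenario found implies a malicious event); the per-event mode mirrors the second (count of deviating events above a threshold).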
- the systems described in this disclosure may be integrated into an additional practical application of determining a more efficient, safe, and reliable solution to address and perhaps resolve a situation where a third party is tampering with the AV.
- the disclosed system may establish a communication path with the AV and enable a remote operator to be seen and/or heard from a communication module at the AV in order to discourage the individuals tampering with the AV from continuing to do so.
- the disclosed system may remotely activate a horn of the AV in order to discourage the individuals tampering with the AV from continuing to do so.
- the disclosed system may send a notifying message to law enforcement indicating that the AV is being tampered with at a particular location coordinate.
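The countermeasures above (operator communication path, horn activation, law-enforcement notification) can be sketched as a dispatcher over injected channels. The channel interface is an assumption made for illustration; the disclosure does not define one.

```python
def escalate(series_of_events, location, channels):
    """Perform the countermeasures available for a confirmed malicious
    event. `channels` maps countermeasure names to callables supplied
    by the integrating system. Returns the countermeasures performed."""
    performed = []
    if "operator_link" in channels:
        # Establish a two-way audio/visual path so a remote operator
        # can be seen and heard at the AV's communication module.
        channels["operator_link"](series_of_events)
        performed.append("operator_link")
    if "horn" in channels:
        channels["horn"]()  # remotely activate the AV's horn
        performed.append("horn")
    if "law_enforcement" in channels:
        channels["law_enforcement"](
            f"AV is being tampered with at {location}")
        performed.append("law_enforcement")
    return performed
```

Injecting the channels keeps the escalation logic testable in isolation: tests can pass in stub callables and verify which countermeasures fired without any vehicle hardware.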
- FIG. 1 illustrates an embodiment of a system configured to detect malicious events for an autonomous vehicle (AV) and a simplified schematic diagram of example series of events corresponding to malicious events according to certain embodiments of this disclosure;
- FIG. 2 illustrates an example flowchart of a method for detecting malicious events for an AV;
- FIG. 3 illustrates a block diagram of an example AV configured to implement autonomous driving operations;
- FIG. 4 illustrates an example system for providing autonomous driving operations used by the AV of FIG. 3; and
- FIG. 5 illustrates a block diagram of an in-vehicle control computer included in the AV of FIG. 3.
- FIG. 1 illustrates an embodiment of a system that is configured to detect malicious events encountered by an AV.
- FIG. 1 further illustrates the AV traveling along a road where various examples of anomalous series of events occur that each corresponds to a malicious event individually or that, in the aggregate, correspond to a malicious event.
- FIG. 2 illustrates an example flowchart of an embodiment of a method for detecting malicious events for an AV.
- FIGS. 3 - 5 illustrate an example AV and various systems and devices for implementing autonomous driving operations by an AV, including the malicious event detection operation described in this disclosure.
- FIG. 5 illustrates an example control device of the example AV shown in FIG. 3 for implementing the malicious event detection operations described in this disclosure.
- FIG. 1 illustrates an embodiment of a system 100 for detecting malicious events 112 for an AV 302 .
- FIG. 1 further illustrates a simplified schematic diagram 102 that comprises various examples of anomalous road conditions or series of events 114 that correspond to tampering with an AV 302 .
- system 100 comprises the AV 302 and an operation server 140 .
- the system 100 may further comprise an application server 162 , a remote operator 164 , and a network 184 that provides communication paths for all of the illustrated components of the system 100 to communicate with each other.
- the system 100 may be configured as shown or any other suitable configurations.
- the system 100 detects a series of events 114 that may correspond to a malicious event 112 , where the malicious event 112 indicates that a third party, such as a vehicle 122 or a pedestrian is tampering with the AV 302 .
- the series of events 114 comprises a number of events above a threshold number 116 that occur within a threshold period of time 118 .
- the series of events 114 may comprise a first event 104 a and a second event 104 b that, taken in the aggregate within a threshold period of time 118 , amount to the series of events 114 that deviates from a normalcy mode 106 .
- system 100 also contemplates identifying and escalating a singular event 104 that deviates from a normalcy mode 106 as a malicious event 112 in the same or similar manner to how a series of events 114 is identified and escalated as a malicious event 112 .
- Normalcy mode 106 is described in detail further below.
- the normalcy mode 106 comprises events or scenarios 108 that are expected to be encountered by the AV 302 in the normal course of operation.
- System 100 may detect the series of events 114 from sensor data 178 received from sensors 346 associated with the AV 302 .
- upon detecting the series of events 114 within the threshold period 118 , system 100 determines whether the series of events 114 corresponds to a malicious event 112 . In response to determining that the series of events 114 corresponds to a malicious event 112 , system 100 escalates the series of events 114 to be addressed. For example, the control device 350 communicates the series of events 114 to the operation server 140 to be addressed by a remote operator 164 .
- the various examples of an anomalous series of events 114 are described in detail further below. The corresponding description below comprises a brief description of certain components of the system 100 .
- the AV 302 may include a semi-truck tractor unit attached to a trailer to transport cargo or freight from one location to another location (see FIG. 3 ).
- the semi-truck tractor may be referred to as a cab of the AV 302 .
- the AV 302 is navigated by a plurality of components described in detail in FIGS. 3 - 5 .
- the operation of the AV 302 is described in greater detail in FIG. 3 .
- the corresponding description below includes brief descriptions of certain components of the AV 302 .
- the AV 302 includes an in-vehicle control computer 350 which is operated to facilitate autonomous driving of the AV 302 .
- the in-vehicle control computer 350 may be interchangeably referred to as a control device 350 .
- Control device 350 is generally configured to control the operation of the AV 302 and its components.
- the control device 350 is further configured to determine a pathway in front of the AV 302 that is safe to travel and free of objects/obstacles, and navigate the AV 302 to travel in that pathway. This process is described in more detail in FIGS. 3 - 5 .
- the control device 350 generally includes one or more computing devices in signal communication with other components of the AV 302 (see FIG. 3 ).
- the control device 350 receives sensor data 178 from one or more sensors 346 positioned on the AV 302 to determine a safe pathway to travel.
- the sensor data 178 includes data captured by the sensors 346 .
- Sensors 346 are configured to capture any object within their detection zones or fields of view, such as landmarks, lane markings, lane boundaries, road 120 boundaries, vehicles 122 , pedestrians, road 120 /traffic signs, among other objects.
- the sensors 346 may include cameras, LiDAR sensors, motion sensors, infrared sensors, and the like. In one embodiment, the sensors 346 may be positioned around the AV 302 to capture the environment surrounding the AV 302 .
- the sensors 346 may detect a series of events 114 within the threshold period 118 that corresponds to a malicious event 112 .
- the sensor data 178 may include one or more indications indicating anomalous or abnormal series of events 114 within the threshold period of time 118 .
- the threshold period of time 118 may be determined to be thirty seconds, one minute, two minutes, or any other appropriate duration.
- the control device 350 analyzes the sensor data 178 and determines whether the series of events 114 corresponds to a malicious event 112 . For example, the control device 350 may compare the series of events 114 with the normalcy mode 106 to determine whether the series of events 114 corresponds to any of the expected scenarios 108 .
- control device 350 may compare each event 104 from the series of events 114 individually with the normalcy mode 106 (or each of the expected scenarios 108 ). For example, if above the threshold number 116 of events 104 are detected within the threshold period of time 118 such that each of those events 104 does not correspond to the normalcy mode 106 , the control device 350 may determine that those events 104 in the aggregate indicate a deviation from the normalcy mode 106 , and correspond to a malicious event 112 .
- the control device 350 may compare the series of events 114 , as a whole, with the normalcy mode 106 (or the expected scenarios 108 ). For example, if the series of events 114 , taken as a whole, does not correspond to any of the expected scenarios 108 , the control device 350 may determine that the series of events 114 is a deviation from the normalcy mode 106 , and corresponds to a malicious event 112 . In other words, if a corresponding expected scenario 108 is found, the control device 350 determines that the series of events 114 , as a whole, does not correspond to a malicious event 112 . If, however, the series of events 114 does not correspond to any of the expected scenarios 108 , the control device 350 determines that the series of events 114 corresponds to the malicious event 112 .
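One way to picture the whole-series comparison is to model each expected scenario 108 as a set of event labels that may legitimately occur together; a series matches the normalcy mode only if some expected scenario covers every event in the series. This modeling choice, the `EXPECTED_SCENARIOS` contents, and the label names are assumptions made for illustration, not the disclosed data structures.

```python
# Illustrative expected scenarios: sets of event labels that can co-occur
# in normal operation (assumed representation).
EXPECTED_SCENARIOS = [
    {"approach_traffic_light", "neighbor_slowdown", "stop"},
    {"merge_signal", "neighbor_lane_change"},
]

def series_matches_normalcy(series_labels):
    """A series matches when every event label fits one expected scenario."""
    series = set(series_labels)
    return any(series <= scenario for scenario in EXPECTED_SCENARIOS)

def classify_series(series_labels):
    # Whole-series comparison: malicious only if no expected scenario matches.
    return "benign" if series_matches_normalcy(series_labels) else "malicious"
```

A stop behind a traffic light matches the first scenario and is classified benign, while repeated unexplained impacts match nothing and are classified malicious.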
- the control device 350 is in signal communication with the operation server 140 .
- the control device 350 is configured to communicate the sensor data 178 to the operation server 140 , for example, via network 184 .
- the control device 350 may communicate the sensor data 178 to the operation server 140 periodically (e.g., every minute, every few minutes, or any other suitable interval), continuously, and/or upon receiving a request from the operation server 140 to send sensor data 178 .
- the sensor data 178 may include data describing the environment surrounding the AV 302 , such as image feed, video feed, LiDAR data feed, and other data captured from the fields of view of the sensors 346 .
- the sensor data 178 may further include location coordinates 138 associated with the AV 302 . See the corresponding description of FIG. 3 for further description of the control device 350 .
- Operation server 140 is generally configured to oversee the operations of the AV 302 . Details of the operation server 140 are described further below.
- the operation server 140 comprises a processor 142 , a memory 144 , a network interface 146 , and a user interface 148 .
- the components of the operation server 140 are operably coupled to each other.
- the processor 142 may include one or more processing units that perform various functions as described herein.
- the memory 144 stores any data and/or instructions used by the processor 142 to perform its functions.
- the memory 144 stores software instructions 168 that when executed by the processor 142 causes the operation server 140 to perform one or more functions described herein.
- the operation server 140 is in signal communication with the AV 302 and its components.
- the operation server 140 is configured to receive the sensor data 178 and the series of events 114 from the control device 350 , analyze them, and, in response, confirm (or update) the determination of the control device 350 regarding whether the series of events 114 corresponds to a malicious event 112 .
- the operation server 140 is further configured to detect objects on and around a road 120 traveled by the AV 302 by analyzing the sensor data 178 .
- the operation server 140 may detect objects on and around a road 120 by implementing object detection machine learning modules 158 .
- the object detection machine learning modules 158 may be implemented using neural networks and/or machine learning algorithms for detecting objects from images, videos, infrared images, point clouds, radar data, etc. The object detection machine learning modules 158 are described in more detail further below.
- Normalcy mode 106 generally comprises scenarios 108 that are expected to be encountered by the AV 302 .
- the normalcy mode 106 is built or generated by a normalcy mode building engine 110 .
- the normalcy mode building engine 110 is described further below.
- the normalcy mode 106 may be built by the processor 142 executing the software instructions 168 .
- the normalcy mode 106 generally corresponds to expected or predictable scenarios 108 that indicate 1 ) expected actions or behaviors of the AV 302 and 2 ) expected actions or behaviors of objects within and outside detection zones of the sensors 346 , including moving objects (such as vehicles 122 , pedestrians, etc.) and static objects (such as traffic lights 126 , etc.) in various situations.
- the normalcy mode 106 may include expected scenarios 108 in situations where the AV 302 is in traffic, behind a traffic light 126 , detected an impact or collision with a vehicle 122 or an object 124 , among other situations which are described below.
- a first expected scenario 108 in the normalcy mode 106 may indicate that when the AV 302 is in congested traffic or behind a traffic light 126 , vehicles 122 surrounding the AV 302 are expected to slow down or to have stopped.
- the control device 350 determines that this situation may be a deviation from the normalcy mode 106 .
- the control device 350 may determine that there is no traffic by detecting that other vehicles 122 (for example, vehicles 122 e and 122 f ) are not slowing down, where vehicles 122 that are not slowing down may be in the same lane or a different lane than a lane traveled by the AV 302 .
- the control device 350 may also determine that there is no traffic ahead of the AV 302 from traffic data 156 , for example, from live traffic reporting, etc.
- a deviation from the first expected scenario 108 in the normalcy mode 106 may comprise indications indicating that multiple vehicles 122 around the AV 302 are slowing down (e.g., vehicles 122 a - b , vehicles 122 a - c , or vehicles 122 a - d ) and attempting to force the AV 302 to slow down, while there is no traffic or traffic light 126 detected by the sensors 346 .
- the multiple vehicles 122 around the AV 302 are impeding the progress of the AV 302 to “box-in” the AV 302 . In this way, the vehicles 122 may force the AV 302 to slow down, pull over, or deviate from its routing plan 152 .
- the control device 350 may determine that such situations correspond to an event 104 or a series of events 114 that deviates from the normalcy mode 106 .
- a second expected scenario 108 in the normalcy mode 106 may indicate that a distance of a vehicle 122 from the AV 302 is expected to be above a threshold distance 128 , and if the distance of that vehicle 122 from the AV 302 becomes less than the threshold distance 128 , it is expected that that vehicle 122 does not persist in staying within a distance less than the threshold distance 128 from the AV 302 for more than a threshold period of time 118 .
- a deviation from the second expected scenario 108 in the normalcy mode 106 may comprise indications indicating that one or more vehicles 122 are persisting in staying within a distance less than the threshold distance 128 from the AV 302 for more than the threshold period of time 118 (e.g., thirty seconds, one minute, two minutes, or any other appropriate duration).
- the control device 350 may determine that such situations correspond to an event 104 or a series of events 114 that deviates from the normalcy mode 106 .
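The persistence check behind the second expected scenario 108 can be sketched as a scan over distance readings for a tracked vehicle: a violation is flagged only when the vehicle stays inside the threshold distance for longer than the threshold period. The `(timestamp, distance)` sample format is an assumption for this sketch.

```python
def proximity_violation(distance_samples, threshold_distance, threshold_period):
    """distance_samples: list of (timestamp_s, distance_m) readings for one
    tracked vehicle, in time order. Returns True when the vehicle remains
    closer than threshold_distance for more than threshold_period seconds."""
    run_start = None
    for t, d in distance_samples:
        if d < threshold_distance:
            if run_start is None:
                run_start = t          # start of a continuous intrusion
            if t - run_start > threshold_period:
                return True
        else:
            run_start = None           # vehicle backed off; reset the run
    return False
```

A vehicle that retreats beyond the threshold distance mid-run resets the timer, mirroring the expectation that brief intrusions are normal and only persistent ones deviate.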
- a third expected scenario 108 in the normalcy mode 106 may indicate that the AV 302 is expected to drive within a particular speed range provided by the driving instructions 154 according to a speed limit of the road 120 traveled by the AV 302 .
- if a vehicle drive subsystem 342 (see FIG. 3 ) monitoring an engine 342 a of the AV 302 (see FIG. 3 ) detects that the speed of the engine 342 a is going out of the particular speed range while components of the vehicle control subsystems 348 are operating according to instructions provided in the driving instructions 154 , the control device 350 may determine that this situation is a deviation from the normalcy mode 106 .
- this situation may occur if a vehicle 122 d drags the AV 302 back by a cable 182 , or otherwise impedes the progress of the AV 302 .
- the control device 350 determines that such situations may correspond to an event 104 or a series of events 114 that deviates from the normalcy mode 106 .
- a fourth expected scenario 108 in the normalcy mode 106 may indicate that in response to being involved in an accident or a collision with a vehicle 122 , an individual from the vehicle 122 is expected to approach the AV 302 within the threshold period of time 118 .
- the control device 350 may determine that this situation is a deviation from the normalcy mode 106 , i.e., it may be a case of “hit and run.”
- a fifth expected scenario 108 in the normalcy mode 106 may indicate that in response to detecting one or more vehicles 122 exhibiting unexpected or anomalous driving behaviors, it is expected that such unexpected or anomalous driving behaviors do not persist for more than the threshold period of time 118 .
- Some examples of unexpected or anomalous driving behaviors of the one or more vehicles 122 may comprise invading the space within the threshold distance 128 from the AV 302 , making contact or colliding with the AV 302 , swerving in front of the AV 302 , among others.
- Some examples of one or more vehicles 122 tampering with the AV 302 may comprise forcing the AV 302 to pull over, deviate from its routing plan 152 , slow down, speed up, drive over an object 124 , crash, or collide with another vehicle 122 .
- the control device 350 may determine that this situation is a deviation from the normalcy mode 106 .
- the series of events 114 generally includes events 104 that in the aggregate indicate that a third party, such as at least one vehicle 122 or an individual is tampering with the AV 302 .
- determining that a series of events 114 in the aggregate indicates a deviation from the normalcy mode 106 may comprise detecting that each event 104 from the series of events 114 deviates from the normalcy mode 106 .
- determining that a series of events 114 in the aggregate indicates a deviation from the normalcy mode 106 may comprise detecting that at least a threshold number 116 of events 104 (or at least a subset of the series of events 114 above the threshold number 116 ) within the threshold period of time 118 deviate from the normalcy mode 106 .
- determining that a series of events 114 in the aggregate indicates a deviation from the normalcy mode 106 may comprise grouping or taking a collection of events 104 together to form the series of events 114 that, as a whole, is compared with the expected scenarios 108 to determine whether the series of events 114 deviates from the normalcy mode 106 .
- series of events 114 are illustrated in FIG. 1 and described in detail below.
- Some examples of the series of events 114 may be related to unexpected or abnormal behaviors of moving objects, such as vehicles 122 , individuals, and/or pedestrians detected within the detection zones of the sensors 346 .
- Some examples of the series of events 114 may be related to unexpected or abnormal behaviors of moving objects, such as vehicles 122 , individuals, and/or pedestrians, where the moving objects are not within the detection zones of the sensors 346 .
- Some examples of the series of events 114 may be related to unexpected or abnormal behaviors of static or stationary objects, such as traffic lights 126 .
- the series of events 114 may not be zone- or region-specific, which means that the series of events 114 may occur in any region.
- the AV 302 is traveling on the road 120 according to its predetermined routing plan 152 when one or more examples of the series of events 114 occur.
- a first series of events 114 may indicate that the AV 302 is forced to deviate from its predetermined routing plan 152 by one or more vehicles 122 such that the AV 302 is forced to re-route or pull over.
- vehicles 122 a and 122 b on both sides of the AV 302 drive with a distance less than the threshold distance 128 from the AV 302 for more than the threshold period of time 118 .
- vehicles 122 a and 122 b invade the space within a threshold distance 128 from the AV 302 for more than the threshold period of time 118 .
- the sensors 346 detect these invasions of the space within the threshold distance 128 , and communicate sensor data 178 indicating these invasions to the control device 350 .
- vehicles 122 a and 122 b may force the AV 302 to re-route from its routing plan 152 and take the exit 130 .
- vehicles 122 a and 122 b may force the AV 302 to pull over to a side of the road 120 (as noted in FIG. 1 as pull over 180).
- Although FIG. 1 illustrates vehicles 122 a and 122 b on sides of the AV 302 , it is understood that any number of vehicles 122 on one or more sides of the AV 302 may contribute to forcing the AV 302 to deviate from its routing plan 152 or pull over.
- vehicle 122 c may also contribute to this malicious event by impeding the AV 302 from speeding up or otherwise evading vehicles 122 a and 122 b in their attempt to force the AV 302 to re-route or pull over. In this way, one or more of vehicles 122 a - c may “box in” AV 302 and force it to deviate from its routing plan 152 .
- a second series of events 114 may indicate that the AV 302 is forced to slow down by one or more vehicles 122 where other vehicles 122 around the AV 302 are not slowing down. For instance, assume that while the AV 302 is traveling along the road 120 , first vehicle 122 a (on the left side), second vehicle 122 b (on the right side), and third vehicle 122 c (on the front) unexpectedly slow down even though there is no traffic (i.e., vehicles 122 e and 122 f are not slowing down) and no traffic lights 126 detected by the sensors 346 . As such, the AV 302 is forced to slow down.
- FIG. 1 illustrates vehicles 122 a - c surrounding the AV 302
- any number of vehicles 122 on one or more sides of the AV 302 may contribute to forcing the AV 302 to slow down.
- another vehicle 122 on the rear side of the AV 302 may also match (or come close to) the speed of the vehicles 122 a - c to box in the AV 302 , thus forcing the AV 302 to slow down.
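The "box-in" condition of this second series of events 114 can be sketched as a comparison between the speeds of the vehicles immediately around the AV and ambient traffic, with the traffic-light observation ruling out a legitimate slowdown. The function name, speed units, and `slow_threshold` value are illustrative assumptions.

```python
def boxed_in(neighbor_speeds, ambient_speeds, traffic_light_detected, slow_threshold):
    """Flag the box-in pattern: every neighboring vehicle slows below
    slow_threshold (m/s) while ambient traffic keeps normal speed and no
    traffic light explains the slowdown."""
    if traffic_light_detected or not neighbor_speeds or not ambient_speeds:
        return False
    neighbors_slow = all(v < slow_threshold for v in neighbor_speeds)
    ambient_normal = sum(ambient_speeds) / len(ambient_speeds) >= slow_threshold
    return neighbors_slow and ambient_normal
```

If a traffic light 126 is detected, the same slowdown is treated as an expected scenario 108 and not flagged.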
- a third series of events 114 may indicate that the AV 302 is forced to slow down as detected by the control device 350 monitoring an engine 342 a of the AV 302 (see FIG. 3 ). For instance, assume that while the AV 302 is traveling along the road 120 , a fourth vehicle 122 d drags the AV 302 back with a cable 182 attached to the AV 302 , thus forcing the AV 302 to slow down or otherwise impeding its movements. The control device 350 may store this event as a first event 104 a that is initiated at a first timestamp 136 . Also, assume that an individual from the fourth vehicle 122 d or an accomplice vehicle 122 has attached the cable 182 to the AV 302 in a manner that did not trigger an event 104 that deviates from the normalcy mode 106 .
- the fourth vehicle 122 d that is tampering with the AV 302 is not within the detection zone of the sensors 346 .
- the sensors 346 may not detect the presence of the fourth vehicle 122 d .
- the control device 350 that is monitoring the speed of the engine 342 a detects that the speed of the engine 342 a is not within a particular speed range that is provided in the driving instructions 154 , as expected.
- the particular speed range is determined according to the speed limit of the road 120 and other criteria, such as fuel-saving, providing a safe driving experience for the AV 302 , other vehicles 122 , pedestrians, among other criteria.
- the control device 350 may also detect that the engine 342 a (see FIG. 3 ) and the other components' performance indicators indicate that their performance is within a normal range, e.g., 80%, and that they are not damaged.
- the control device 350 may store this set of determinations (indicating that the engine 342 a is in normal operation) as a second event 104 b at a second timestamp 136 .
- control device 350 determines that these events amount to the third series of events 114 .
- control device 350 detects that the third series of events 114 has occurred even though no suspected vehicle 122 potentially causing the AV 302 to slow down is detected by the sensors 346 .
- Although FIG. 1 illustrates that the fourth vehicle 122 d is dragging the AV 302 back, forcing the AV 302 to slow down, it is understood that the fourth vehicle 122 d may be in front of the AV 302 and pull the AV 302 forward, forcing the AV 302 to speed up, for example, to miss its predetermined exit 130 or to deviate from its routing plan 152 .
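The engine-monitoring logic of this third series of events 114 reduces to a conjunction: the engine runs outside its commanded range, the control subsystems report nominal operation, and no suspect vehicle is visible to the sensors (as in the cable-drag example). The function and parameter names below are hypothetical, and the RPM values in the usage are invented for illustration.

```python
def engine_anomaly(engine_speed_rpm, expected_range, controls_nominal,
                   suspect_vehicle_detected):
    """Flag a deviation when the engine runs outside the commanded speed
    range even though the vehicle control subsystems report nominal
    operation and the sensors detect no vehicle explaining the change."""
    low, high = expected_range
    out_of_range = not (low <= engine_speed_rpm <= high)
    return out_of_range and controls_nominal and not suspect_vehicle_detected
```

An engine lugging at 900 RPM against a commanded 1500-2500 RPM range, with healthy subsystems and an empty detection zone, would be stored as a deviating event; an in-range reading would not.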
- a fourth series of events 114 may indicate one or more impacts with the AV 302 within the threshold period of time 118 by one or more vehicles 122 tampering with the AV 302 .
- the first vehicle 122 a hits or collides with the AV 302 at a first timestamp 136 .
- the sensors 346 detect the first collision, and communicate this event (i.e., first event 104 a ) to the control device 350 .
- the first vehicle 122 a (or the second vehicle 122 b ) hits or collides with the AV 302 at a second timestamp 136 .
- the sensors 346 detect the second collision, and communicate this event (i.e., second event 104 b ) to the control device 350 . If the control device 350 determines that the first event 104 a and the second event 104 b have occurred within the threshold period of time 118 , the control device 350 determines that the events 104 a and 104 b taken in the aggregate amount to the fourth series of events 114 that deviates from the normalcy mode 106 .
- an individual from the first vehicle 122 a hits the AV 302 at the first timestamp 136 , for example, by an object, such as a rock or a crowbar.
- an individual from the first vehicle 122 a hits the AV 302 at the second timestamp 136 , for example, by an object, such as a rock or a crowbar.
- if the control device 350 determines that these hits or impacts with the AV 302 are within the threshold period of time 118 , the control device 350 determines that these events taken in the aggregate amount to the fourth series of events 114 that deviates from the normalcy mode 106 .
- a fifth series of events 114 may indicate unexpected driving behaviors from one or more vehicles 122 .
- the first vehicle 122 a unexpectedly invades the space within threshold distance 128 from the AV 302 and swerves in front of the AV 302 at a first timestamp 136 .
- the sensors 346 detect this invasion of the space within threshold distance 128 , and communicate sensor data 178 indicating this invasion to the control device 350 .
- the control device 350 may store this event as a first event 104 a .
- the first vehicle 122 a slows down at a second timestamp 136 , thus, forcing the AV 302 to slow down.
- the sensors 346 detect that the first vehicle 122 a is slowing down, and communicate corresponding sensor data 178 indicating that to the control device 350 .
- the control device 350 may store this event as a second event 104 b . If the control device 350 determines that events 104 a and 104 b occur within the threshold period of time 118 , the control device 350 determines that the events 104 a and 104 b taken in the aggregate amount to the fifth series of events 114 that deviates from the normalcy mode 106 .
- the first vehicle 122 a unexpectedly swerves in front of the AV 302 at a first timestamp 136 .
- the second vehicle 122 b unexpectedly swerves in front of the AV 302 at a second timestamp 136 .
- if the control device 350 determines that the events 104 a and 104 b occur within the threshold period of time 118 , the control device 350 determines that these events 104 a - b taken in the aggregate amount to the fifth series of events 114 that deviates from the normalcy mode 106 .
- a sixth series of events 114 may indicate that at least one sensor 346 from the sensors 346 is non-responsive or disabled. For instance, assume that while the AV 302 is traveling along the road 120 , a sensor 346 from the sensors 346 becomes non-responsive as a result of an impact. The sensor 346 may become non-responsive, for example, when the first vehicle 122 a or an individual from the first vehicle 122 a hits the sensor 346 in an attempt to disable or damage the sensor 346 .
- the control device 350 analyzes sensor data 178 captured by the sensor 346 (before it became non-responsive) and determines that the sensor 346 was disabled as a result of an impact from an object, such as a rock, a crowbar, etc., or the first vehicle 122 a.
- a first sensor 346 becomes non-responsive at a first timestamp 136 (stored as the first event 104 a ); and a second sensor 346 becomes non-responsive at a second timestamp 136 (stored as the second event 104 b ). If the control device 350 determines that the events 104 a and 104 b occur within the threshold period of time 118 , the control device 350 determines that the events 104 a and 104 b taken in the aggregate amount to the sixth series of events 114 that deviates from the normalcy mode 106 .
- a sensor 346 becomes non-responsive as a result of tampering.
- a sensor 346 may become non-responsive as a result of a cybersecurity breach in data communication between the sensor 346 and the control device 350 .
- the sensor 346 may become non-responsive at a first timestamp 136 as a result of a cybersecurity breach.
- the control device 350 may detect the cybersecurity breach, for example, by detecting a third-party attempt to establish unauthorized access to the sensor 346 or the control device 350 .
- a sensor 346 may become non-responsive as a result of propagating jamming signals, radio waves, light beams, and the like.
- jamming signals may be used to tamper with infrared sensors 346
- jamming radio waves may be used to tamper with Radar sensors 346 b (see FIG. 3 )
- jamming light (or jamming laser) beams may be used to tamper with LiDAR sensors 346 f (see FIG. 3 ).
- the control device 350 may detect such events 104 initiated at their corresponding timestamps 136 , and if they persist for more than the threshold period of time 118 , the control device 350 determines that such events 104 amount to a series of events 114 that deviates from the normalcy mode 106 .
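Non-responsiveness of the kind described in this sixth series of events 114 is commonly detected with a heartbeat check: each sensor's last message timestamp is tracked, and silence beyond a timeout marks the sensor non-responsive. The sensor names, timeout, and escalation count below are assumptions for the sketch, not values from the disclosure.

```python
def sensors_compromised(last_seen, now, nonresponse_timeout, threshold_count):
    """last_seen: mapping of sensor name -> timestamp (s) of its last message.
    A sensor is considered non-responsive after nonresponse_timeout seconds
    of silence; the aggregate is escalated when threshold_count or more
    sensors are silent at once. Returns (escalate, silent_sensor_names)."""
    silent = [name for name, t in last_seen.items()
              if now - t > nonresponse_timeout]
    return len(silent) >= threshold_count, silent
```

This check does not distinguish physical damage, a cybersecurity breach, or jamming; it only surfaces the silence, leaving the cause analysis to the stored sensor data 178 from before the outage.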
- a seventh series of events 114 may indicate that the AV 302 is forced to drive over an object 124 as a result of unexpected driving behaviors of one or more vehicles 122 .
- the first vehicle 122 a unexpectedly swerves in front of the AV 302 at a first timestamp 136 , forcing the AV 302 to deviate from its traveling path (stored as the first event 104 a ), and as a result, the AV 302 drives over the object 124 at a second timestamp 136 (stored as the second event 104 b ).
- if the control device 350 determines that the events 104 a and 104 b occur within the threshold period of time 118 , the control device 350 determines that the events 104 a and 104 b taken in the aggregate amount to the seventh series of events 114 that deviates from the normalcy mode 106 .
- the control device 350 stores this event as a third event 104 c .
- the control device 350 determines that events 104 a - c taken in the aggregate amount to a series of events 114 that deviates from the normalcy mode 106 .
- an eighth series of events 114 may indicate that a scheduled action indicated in map data 150 unexpectedly does not occur.
- Map data 150 is described in detail further below.
- the map data 150 comprises detailed information about the environment on and around the traveling path of the AV 302 including objects on and around the road 120 , such as location coordinates of the road signs, buildings, terrain, traffic lights 126 , railroad crossing lights, among others.
- the map data 150 further comprises scheduling information of the traffic lights 126 , scheduling information of the railroad crossing lights, and any other scheduling information that the AV 302 may encounter during its routing plan 152 .
- the map data 150 comprises timestamps 136 when the traffic light 126 indicates yellow, green, and red lights.
- the map data 150 comprises timestamps 136 when a railroad crossing light indicates red and green lights.
- the AV 302 reaches the traffic light 126 and stops behind the traffic light 126 that is indicating a red light.
- the map data 150 indicates that a wait time for the traffic light 126 to change from a red light to a green light is a particular duration, for example, one minute.
- the sensors 346 are detecting the red light from the traffic light 126 for more than the particular duration indicated in the map data 150 .
- the control device 350 compares the scheduling information associated with the traffic light 126 provided by the map data 150 with the sensor data 178 captured by the sensors 346 .
- the control device 350 determines that a scheduled action (i.e., the traffic light 126 indicating a green light after one minute) has not occurred.
- the control device 350 may store this event as the first event 104 a initiated at a first timestamp 136 .
- a vehicle 122 invades the space within threshold distance 128 from the AV 302 at a second timestamp 136 .
- the sensors 346 detect this invasion of the threshold distance 128 , and communicate sensor data 178 indicating this invasion to the control device 350 .
- the control device 350 may store this event as the second event 104 b .
- if the control device 350 determines that the events 104 a and 104 b occur within the threshold period of time 118 , the control device 350 determines that the events 104 a and 104 b taken in the aggregate amount to the eighth series of events 114 that deviates from the normalcy mode 106 .
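The scheduled-action comparison in this eighth series of events 114 amounts to checking the observed red-light duration against the wait time from the map data 150. The grace interval below, which absorbs normal signal-timing jitter, is an assumption added for the sketch and is not part of the disclosure.

```python
def schedule_violation(red_started_at, now, expected_red_duration, grace=5.0):
    """Return True when the observed red-light duration exceeds the wait
    time from the map data by more than a small grace interval (seconds).
    The grace value is an illustrative assumption."""
    return (now - red_started_at) > (expected_red_duration + grace)
```

With a one-minute expected wait, a light still red after 70 seconds would be stored as a deviating event 104 , while 62 seconds would fall within the grace interval.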
- a ninth series of events 114 may indicate that a field-of-view of at least one sensor 346 is obfuscated. For instance, assume that while the AV 302 is traveling along the road 120 , an object is used to obfuscate a detection zone or a field-of-view of the sensor 346 at a first timestamp 136 . In a particular example, sensor data 178 received from the sensor 346 prior to the first timestamp 136 indicate that a blanket is thrown over the sensor 346 . The control device 350 determines that the sensor 346 is functional because the sensor 346 is responsive to communication with the control device 350 . In other words, the control device 350 can receive sensor data 178 from the sensor 346 .
- the sensor data 178 received after the first timestamp 136 , however, is not as expected compared to the sensor data 178 received prior to the first timestamp 136 . If the control device 350 determines that these events 104 (beginning from the first timestamp 136 ) persist for more than the threshold period of time 118 , the control device 350 determines that these events 104 amount to the ninth series of events 114 that deviates from the normalcy mode 106 .
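The obfuscation check above — a sensor that still responds, but whose readings collapse relative to a pre-event baseline for longer than the threshold period — can be sketched as follows. The mean-intensity statistic, the 20% ratio, and the timing values are illustrative assumptions, not the disclosed design.

```python
# Sketch of the obfuscation check: frames keep arriving (the sensor is
# responsive), but their mean intensity stays far below the pre-event
# baseline. Flag the sensor once the condition persists past the
# threshold period of time. Thresholds here are assumptions.

def detect_obfuscation(baseline_mean, readings, threshold_s, ratio=0.2):
    """readings: list of (timestamp, mean_intensity) from a responsive
    sensor, in time order. Returns True when intensity stays below
    ratio * baseline_mean for at least threshold_s seconds."""
    dark_since = None
    for ts, value in readings:
        if value < ratio * baseline_mean:
            if dark_since is None:
                dark_since = ts  # start of the dark interval
            if ts - dark_since >= threshold_s:
                return True
        else:
            dark_since = None  # reading recovered; reset the interval
    return False

# A blanket is thrown over the sensor at t=5; readings stay dark through t=35.
readings = [(0, 120.0), (5, 3.0), (15, 2.5), (35, 2.0)]
print(detect_obfuscation(baseline_mean=118.0, readings=readings, threshold_s=30))  # True
```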
- In response to detecting any of the example series of events 114 described above, the control device 350 escalates the series of events 114 to be addressed. For example, the control device 350 communicates the series of events 114 to the operation server 140 to be addressed by the remote operator 164 . This process is described in detail in conjunction with the operational flow of the system 100 further below. It should be understood that the series of events 114 described above are mere examples and are not an exhaustive list of events 104 or series of events 114 that may be identified as deviating from the normalcy mode 106 . This disclosure contemplates any suitable number and combination of events 104 deviating from a normalcy mode 106 that may be identified and escalated even if not specifically described as an example herein.
- the operation server 140 includes at least one processor 142 , at least one memory 144 , at least one network interface 146 , and at least one user interface 148 .
- the operation server 140 may be configured as shown or in any other suitable configuration.
- the operation server 140 may be implemented by a cluster of computing devices that may serve to oversee the operations of the AV 302 .
- the operation server 140 may be implemented by a plurality of computing devices using distributed computing and/or cloud computing systems.
- the operation server 140 may be implemented by a plurality of computing devices in one or more data centers.
- the operation server 140 may include more processing power than the control device 350 .
- the operation server 140 is in signal communication with one or more AVs 302 and their components (e.g., the control device 350 ).
- the operation server 140 is configured to determine a particular routing plan 152 for the AV 302 .
- the operation server 140 may determine a particular routing plan 152 for an AV 302 that leads to reduced driving time and a safer driving experience for reaching the destination of that AV 302 .
- Processor 142 comprises one or more processors operably coupled to the memory 144 .
- the processor 142 is any electronic circuitry including, but not limited to, state machines, one or more central processing unit (CPU) chips, logic units, cores (e.g., a multi-core processor), field-programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), or digital signal processors (DSPs).
- the processor 142 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination of the preceding.
- the processor 142 is communicatively coupled to and in signal communication with the memory 144 , network interface 146 , and user interface 148 .
- the one or more processors are configured to process data and may be implemented in hardware or software.
- the processor 142 may be 8-bit, 16-bit, 32-bit, 64-bit or of any other suitable architecture.
- the processor 142 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, registers and other components.
- the one or more processors are configured to implement various instructions.
- the one or more processors are configured to execute software instructions 168 to implement the function disclosed herein, such as some or all of those described with respect to FIGS. 1 and 2 .
- the function described herein is implemented using logic units, FPGAs, ASICs, DSPs, or any other suitable hardware or electronic circuitry.
- Memory 144 stores any of the information described above with respect to FIGS. 1 and 2 along with any other data, instructions, logic, rules, or code operable to implement the function(s) described herein when executed by processor 142 .
- the memory 144 may store normalcy mode 106 , normalcy mode building engine 110 , malicious event 112 , series of events 114 , threshold number 116 , threshold period 118 , threshold distance 128 , confidence score 132 , threshold score 134 , timestamps 136 , location coordinates 138 , sensor data 178 , map data 150 , routing plan 152 , driving instructions 154 , traffic data 156 , object detection machine learning modules 158 , countermeasures 166 , software instructions 168 , and/or any other data/instructions.
- the software instructions 168 include code that when executed by the processor 142 causes the operation server 140 to perform the functions described herein, such as some or all of those described in FIGS. 1 and 2 .
- the memory 144 comprises one or more disks, tape drives, or solid-state drives, and may be used as an over-flow data storage device, to store programs when such programs are selected for execution, and to store instructions and data that are read during program execution.
- the memory 144 may be volatile or non-volatile and may comprise read-only memory (ROM), random-access memory (RAM), ternary content-addressable memory (TCAM), dynamic random-access memory (DRAM), and static random-access memory (SRAM).
- the memory 144 may include one or more of a local database, cloud database, Network-attached storage (NAS), etc.
- Network interface 146 is configured to enable wired and/or wireless communications.
- the network interface 146 is configured to communicate data between the operation server 140 and other network devices, systems, or domain(s).
- the network interface 146 may comprise a WIFI interface, a local area network (LAN) interface, a wide area network (WAN) interface, a modem, a switch, or a router.
- the processor 142 is configured to send and receive data using the network interface 146 .
- the network interface 146 may be configured to use any suitable type of communication protocol.
- User interfaces 148 may include one or more user interfaces that are configured to interact with users, such as the remote operator 164 .
- the user interfaces 148 may include peripherals of the operation server 140 , such as monitors, keyboards, mouse, trackpads, touchpads, etc.
- the remote operator 164 may use the user interfaces 148 to access the memory 144 to review sensor data 178 , review the series of events 114 , and address the detected malicious event 112 .
- Normalcy mode building engine 110 may be implemented by the processor 142 executing the software instructions 168 , and is generally configured to build the normalcy mode 106 .
- the normalcy mode building engine 110 may use simulated or offline driving situations to determine expected scenarios 108 (similar to those described above) and build the normalcy mode 106 .
- the normalcy mode building engine 110 generates the normalcy mode 106 that corresponds to a pattern-of-life for the AV 302 in the context of driving.
- the normalcy mode building engine 110 may be implemented by machine learning neural networks, including a plurality of convolutional neural networks, and the like. In one embodiment, the normalcy mode building engine 110 may be implemented by supervised pattern learning techniques and/or unsupervised pattern learning techniques, such as Bayesian Non-Parametric Modeling, decision trees, etc.
- the expected scenarios 108 in the normalcy mode 106 may be determined by offline driving simulations in various road environments.
- a first environment where the AV 302 is in traffic may be simulated to determine scenarios expected from the environment around the AV 302 including its surrounding vehicles 122 in this situation.
- expected scenarios 108 comprise detecting that surrounding vehicles 122 are stopped or slowing down, for example, by determining speed profiles, trajectory profiles, detecting that rear red lights of the surrounding vehicles 122 are turned on, and any other indication that indicates the AV 302 is in traffic.
- a second environment where the AV 302 is behind the traffic light 126 may be simulated to determine expected scenarios 108 from the environment around the AV 302 including its surrounding vehicles 122 and the traffic light 126 in this situation.
- expected scenarios 108 comprise 1) detecting that the traffic light 126 is indicating a red light, 2) expecting that the traffic light 126 changes its status (i.e., from red light to green) based on its corresponding scheduling information provided in the map data 150 , 3) detecting that surrounding vehicles 122 are stopped or slowing down, and any other indication that indicates the AV 302 is behind the traffic light 126 .
- a third environment where one or more vehicles 122 are driving around the AV 302 may be simulated to determine expected scenarios 108 from the environment around the AV 302 including its surrounding vehicles 122 in this situation.
- expected scenarios 108 comprise 1) expecting that the one or more vehicles 122 do not invade the threshold distance 128 from the AV 302 , 2) expecting that the one or more vehicles 122 do not persist in driving parallel to the AV 302 for more than a threshold period 118 , and 3) if the one or more vehicles 122 invade the threshold distance 128 from the AV 302 , expecting that the one or more vehicles 122 do not persist in this situation for more than the threshold period 118 .
- the threshold distance 128 may vary depending on which side of the AV 302 it is being measured. For example, a threshold distance 128 from the AV 302 from sides of the AV 302 may be less than a threshold distance 128 from the AV 302 from the front and the rear.
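The side-dependent threshold distance described above can be sketched as a small lookup. The specific distances (a larger front/rear buffer than the side buffer) are illustrative assumptions, not values taken from the disclosure.

```python
# Sketch of the side-dependent threshold distance 128: the buffer the AV
# keeps to its sides may be smaller than the buffer to its front and rear,
# as the passage notes. The distances below are illustrative assumptions.

THRESHOLD_DISTANCE_M = {"front": 10.0, "rear": 10.0, "left": 2.0, "right": 2.0}

def invades_threshold(side, distance_m):
    """True when a detected vehicle is closer than the threshold distance
    configured for the given side of the AV."""
    return distance_m < THRESHOLD_DISTANCE_M[side]

print(invades_threshold("front", 6.0))  # True: within the 10 m front buffer
print(invades_threshold("left", 2.5))   # False: outside the 2 m side buffer
```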
- Map data 150 may include a virtual map of a city which includes the road 120 .
- the map data 150 may include the map 458 and map database 436 (see FIG. 4 for descriptions of the map 458 and map database 436 ).
- the map data 150 may include drivable areas, such as roads 120 , paths, highways, and undrivable areas, such as terrain (determined by the occupancy grid module 460 , see FIG. 4 for descriptions of the occupancy grid module 460 ).
- the map data 150 may specify location coordinates of road signs, lanes, lane markings, lane boundaries, road boundaries, traffic lights 126 , etc.
- Routing plan 152 is a plan for traveling from a start location (e.g., a first AV 302 launchpad/landing pad) to a destination (e.g., a second AV 302 launchpad/landing pad).
- the routing plan 152 may specify a combination of one or more streets/roads/highways in a specific order from the start location to the destination.
- the routing plan 152 may specify stages including the first stage (e.g., moving out from the start location), a plurality of intermediate stages (e.g., traveling along particular lanes of one or more particular street/road/highway), and the last stage (e.g., entering the destination).
- the routing plan 152 may include other information about the route from the start position to the destination, such as road/traffic signs in that routing plan 152 , etc.
- Driving instructions 154 may be implemented by the planning module 462 (See descriptions of the planning module 462 in FIG. 4 ).
- the driving instructions 154 may include instructions and rules to adapt the autonomous driving of the AV 302 according to the driving rules of each stage of the routing plan 152 .
- the driving instructions 154 may include instructions to stay within the speed range of a road 120 traveled by the AV 302 , adapt the speed of the AV 302 with respect to observed changes by the sensors 346 , such as speeds of surrounding vehicles 122 , objects within the detection zones of the sensors 346 , etc.
- Object detection machine learning modules 158 may be implemented by the processor 142 executing software instructions 168 , and are generally configured to detect objects from the sensor data 178 .
- the object detection machine learning modules 158 may be implemented using neural networks and/or machine learning algorithms for detecting objects from any data type, such as images, videos, infrared images, point clouds, Radar data, etc.
- the object detection machine learning modules 158 may be implemented using machine learning algorithms, such as Support Vector Machine (SVM), Naive Bayes, Logistic Regression, k-Nearest Neighbors, Decision Trees, or the like.
- the object detection machine learning modules 158 may utilize a plurality of neural network layers, convolutional neural network layers, and/or the like, in which weights and biases of perceptrons of these layers are optimized in the training process of the object detection machine learning modules 158 .
- the object detection machine learning modules 158 may be trained by a training dataset which includes samples of data types labeled with one or more objects in each sample.
- the training dataset may include sample images of objects (e.g., vehicles 122 , lane markings, pedestrian, road signs, etc.) labeled with object(s) in each sample image.
- the training dataset may include samples of other data types, such as videos, infrared images, point clouds, Radar data, etc. labeled with object(s) in each sample data.
- the object detection machine learning modules 158 may be trained, tested, and refined by the training dataset and the sensor data 178 .
- the object detection machine learning modules 158 use the sensor data 178 (which are not labeled with objects) to increase their accuracy of predictions in detecting objects.
- supervised and/or unsupervised machine learning algorithms may be used to validate the predictions of the object detection machine learning modules 158 in detecting objects in the sensor data 178 .
- Traffic data 156 may include traffic data of roads/streets/highways in the map data 150 .
- the operation server 140 may use traffic data 156 that is captured by one or more mapping vehicles.
- the operation server 140 may use traffic data 156 that is captured from any source, such as crowd-sourced traffic data 156 captured from external sources, e.g., Waze and Google maps, live traffic reporting, etc.
- Countermeasures 166 comprise instructions to be carried out in response to escalating the series of events 114 and determining that the series of events 114 corresponds to a malicious event 112 .
- the countermeasures 166 may comprise instructions that indicate to establish a communication path 160 with a communication module at the AV 302 in order to converse with individuals causing the series of events 114 and tampering with the AV 302 .
- the countermeasures 166 may comprise instructions that indicate to activate a horn of the AV 302 .
- the countermeasures 166 may comprise instructions that indicate to send a notifying message 172 to law enforcement 170 , where the notifying message 172 comprises an indication that the AV 302 has been tampered with at particular location coordinates 138 where the series of events 114 has occurred.
- countermeasures 166 may be performed by the remote operator 164 as described further below.
- performing the countermeasures 166 may be computerized and performed by the operation server 140 .
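The countermeasures above (establishing a communication path, activating the horn, notifying law enforcement) can be sketched as a dispatch table of handlers. The handler bodies here are stand-ins; how the operation server actually carries each measure out is not shown in this passage.

```python
# Sketch of a countermeasure dispatch table tying the countermeasures 166
# named above to simple handler functions. All handler bodies are
# illustrative stand-ins, not the disclosed implementations.

def open_communication_path(event):
    return f"communication path opened for event at {event['location']}"

def activate_horn(event):
    return "horn activated"

def notify_law_enforcement(event):
    return f"AV tampered with at {event['location']}"

COUNTERMEASURES = [open_communication_path, activate_horn, notify_law_enforcement]

def address_malicious_event(event):
    """Run every configured countermeasure and collect the outcomes."""
    return [measure(event) for measure in COUNTERMEASURES]

results = address_malicious_event({"location": (37.77, -122.42)})
print(len(results))  # 3
```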
- the application server 162 is generally any computing device configured to communicate with other devices, such as other servers (e.g., operation server 140 ), AV 302 , databases, etc., via the network 184 .
- the application server 162 is configured to perform specific functions described herein and interact with the remote operator 164 , e.g., via communication path 174 using its user interfaces. Examples of the application server 162 include, but are not limited to, desktop computers, laptop computers, servers, etc.
- the application server 162 may act as a presentation layer where remote operator 164 accesses the operation server 140 .
- the operation server 140 may send sensor data 178 , the series of events 114 , countermeasures 166 and/or any other data/instructions to the application server 162 , e.g., via the network 184 .
- the remote operator 164 after establishing the communication path 174 with the application server 162 , may review the received data and carry out the countermeasures 166 in addressing the series of events 114 .
- the remote operator 164 can directly access the operation server 140 , and after establishing the communication path 176 with the operation server 140 , may carry out the countermeasures 166 in addressing the series of events 114 .
- the remote operator 164 may be an individual who is associated with and has access to the operation server 140 .
- the remote operator 164 may be an administrator that can access and view the information regarding the AV 302 , such as sensor data 178 and other information that is available on the memory 144 .
- Network 184 may be any suitable type of wireless and/or wired network including, but not limited to, all or a portion of the Internet, an Intranet, a private network, a public network, a peer-to-peer network, the public switched telephone network, a cellular network, a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), and a satellite network.
- the network 184 may be configured to support any suitable type of communication protocol as would be appreciated by one of ordinary skill in the art.
- the operational flow of the system 100 begins when the control device 350 detects a series of events 114 , such as those described above or any other examples of a series of events 114 that deviates from the normalcy mode 106 .
- the control device 350 detects the series of events 114 by analyzing the sensor data 178 .
- the control device 350 determines whether the series of events 114 corresponds to a malicious event 112 .
- the control device 350 may compare the series of events 114 as a whole with the expected scenarios 108 stored in the normalcy mode 106 to determine whether the series of events 114 , taken as a whole, deviates from the normalcy mode 106 .
- the control device 350 may compare events 104 from the series of events 114 with the expected scenarios 108 to determine whether at least a threshold number 116 of events 104 (from the series of events 114 ) within the threshold period of time 118 deviate from the normalcy mode 106 .
- the control device 350 may compare each event 104 detected within the threshold period of time 118 individually with each of the expected scenarios 108 to determine whether each event 104 deviates from the normalcy mode 106 .
- in some cases, a correspondence may be found between each individual event 104 (from the series of events 114 ) and the expected scenarios 108 , while no correspondence is found between the series of events 114 taken as a whole (comprising the same events 104 within the threshold period of time 118 ) and the expected scenarios 108 . In such cases, the series of events 114 may still be determined to deviate from the normalcy mode 106 and be considered a malicious event 112 .
- the control device 350 determines whether the series of events 114 corresponds to any of the expected scenarios 108 . If a corresponding expected scenario 108 is found, the control device 350 determines that the series of events 114 does not correspond to a malicious event 112 . If, however, the control device 350 determines that the series of events 114 does not correspond to any of the expected scenarios 108 , it determines that the series of events 114 corresponds to the malicious event 112 .
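The comparison just described can be sketched as a membership test: a series of events that matches no expected scenario 108 in the normalcy mode 106 is treated as a malicious event. Encoding events as strings and scenarios as tuples is an assumption for the example, not the disclosed representation.

```python
# Sketch of the core detection comparison: the series of events 114 is
# matched against the expected scenarios 108 stored in the normalcy mode
# 106; no match -> malicious event 112. The encodings are assumptions.

def corresponds_to_malicious_event(series, expected_scenarios):
    """series: ordered tuple of event types. expected_scenarios: set of
    such tuples. Returns True when no expected scenario corresponds."""
    return tuple(series) not in expected_scenarios

normalcy_mode = {
    ("red_light", "vehicles_stopping"),          # waiting at a traffic light
    ("vehicles_slowing", "brake_lights_ahead"),  # ordinary traffic
}

series = ("red_light_overdue", "vehicle_invades_threshold")
print(corresponds_to_malicious_event(series, normalcy_mode))  # True
```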
- At least one surveillance sensor 346 i may be used to record the series of events 114 (in addition to or instead of other sensors 346 ).
- the surveillance sensor 346 i may be hidden from sight.
- the surveillance sensor 346 i may be any of the example sensors 346 described in FIG. 3 or any other object detection sensor 346 .
- the surveillance sensor 346 i may be positioned on the outer body and/or inside the AV 302 at any suitable position.
- surveillance sensors 346 i may be positioned in the cab of the AV 302 behind the front and/or side windows.
- a surveillance sensor 346 i may be positioned underneath the AV 302 .
- the surveillance sensor 346 i may be activated in response to detecting the series of events 114 .
- the control device 350 activates the surveillance sensors 346 i to record the series of events 114 .
- the control device 350 upon detecting the series of events 114 , assigns a confidence score 132 to the series of events 114 , where the confidence score 132 indicates a probability that the series of events 114 corresponds to the malicious event 112 . For instance, if every event 104 from the series of events 114 corresponds to a deviation from the normalcy mode 106 , the control device 350 assigns a high confidence score 132 (e.g., 75%) to the detected series of events 114 .
- the control device 350 assigns a high confidence score 132 (e.g., 90%) to the detected series of events 114 .
- if the control device 350 detects that the first vehicle 122 a swerved in front of the AV 302 at a first timestamp 136 (stored as a first event 104 a ), followed by detecting that the first vehicle 122 a is slowing down at a second timestamp 136 (stored as a second event 104 b ), and detecting that other surrounding vehicles 122 are not slowing down and no traffic light 126 is detected by the sensors 346 (stored as a third event 104 c ), and this situation persists for more than the threshold period of time 118 , the control device 350 assigns a high confidence score 132 (e.g., 90%) to these events 104 a - c.
- the control device 350 may assign a high confidence score 132 to the event 104 . For example, if the control device 350 detects that a field-of-view of a sensor 346 is obfuscated according to sensor data 178 received from the sensor 346 prior to the detection of the obfuscation event 104 , and this situation persists for more than the threshold period of time 118 , the control device 350 assigns a high confidence score 132 (e.g., 70%) to this event 104 .
- if the control device 350 detects a first event 104 a that deviates from the normalcy mode 106 at a first timestamp 136 , and a second event 104 b that deviates from the normalcy mode 106 at a second timestamp 136 , and the first timestamp 136 and the second timestamp 136 are not both within the threshold period of time 118 , the control device 350 assigns a low confidence score 132 (e.g., 30%) to this series of events 114 comprising events 104 a and 104 b .
- for example, assume that the control device 350 detects a first unexpected driving behavior from the first vehicle 122 a at a first timestamp 136 , such as the first vehicle 122 a unexpectedly swerving in front of the AV 302 (stored as a first event 104 a ), and detects a second unexpected driving behavior from the second vehicle 122 b at a second timestamp 136 , such as the second vehicle 122 b unexpectedly swerving in front of the AV 302 (stored as a second event 104 b ).
- assume further that each of the first event 104 a and the second event 104 b indicates a deviation from the normalcy mode 106 , and that the first timestamp 136 and the second timestamp 136 are not within the threshold period of time 118 of each other.
- in this case, the control device 350 assigns a low confidence score 132 to this series of events 114 that comprises events 104 a and 104 b.
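The confidence-score assignment described in the examples above can be sketched as follows: deviating events that all fall within the threshold period earn a high score, while deviating events spread beyond it earn a low one. The 90%/30% values mirror the examples in the text, but the function itself is an illustrative assumption.

```python
# Sketch of the confidence score 132 assignment: events that all deviate
# from the normalcy mode 106 and occur within the threshold period of time
# 118 earn a high score; deviations spread over a longer span earn a low
# one. The scoring rule and values are illustrative assumptions.

def assign_confidence(timestamps, threshold_period_s):
    """timestamps: times of the deviating events in a series. Returns a
    probability-like score that the series is a malicious event."""
    within_period = max(timestamps) - min(timestamps) <= threshold_period_s
    return 0.90 if within_period else 0.30

print(assign_confidence([10, 25, 40], threshold_period_s=60))  # 0.9
print(assign_confidence([10, 300], threshold_period_s=60))     # 0.3
```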
- In response to detecting that the series of events 114 corresponds to a malicious event 112 , the control device 350 escalates the series of events 114 to be addressed by communicating the series of events 114 to the operation server 140 .
- the operation server 140 may confirm (or update) the determination of the control device 350 regarding whether the series of events 114 corresponds to a malicious event 112 .
- the remote operator 164 may confirm (or update) the determination of the operation server 140 (and the control device 350 ) regarding whether the series of events 114 corresponds to a malicious event 112 . This confirmation (or update) is used by the normalcy mode building engine 110 to further refine the normalcy mode 106 .
- the normalcy mode 106 is updated to include the series of events 114 indicating that the series of events 114 does not correspond to a malicious event 112 .
- the normalcy mode 106 may be updated by the remote operator 164 reviewing the series of events 114 .
- a supervised machine learning technique may be leveraged in refining and updating the normalcy mode 106 .
- the normalcy mode building engine 110 may learn from the confirmations and updates by the remote operator 164 and refine or update the normalcy mode 106 .
- the normalcy mode building engine 110 may adapt to the updated normalcy mode 106 using an unsupervised machine learning technique, for example, by adjusting weight and bias values of neural network layers of the normalcy mode building engine 110 .
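The feedback loop described above can be sketched as follows: when the remote operator confirms that an escalated series of events is benign, it is folded into the normalcy mode so the same pattern is not escalated again. The set-of-tuples encoding is an assumption for the example, standing in for whatever model the normalcy mode building engine actually maintains.

```python
# Sketch of refining the normalcy mode 106 from operator feedback: a
# series the remote operator 164 marks as benign is added to the expected
# scenarios 108. The data structures are illustrative assumptions.

def refine_normalcy_mode(normalcy_mode, series, operator_says_malicious):
    """Return an updated set of expected scenarios."""
    if not operator_says_malicious:
        return normalcy_mode | {tuple(series)}
    return normalcy_mode

mode = {("red_light", "vehicles_stopping")}
mode = refine_normalcy_mode(mode, ["parade_crossing", "vehicles_stopping"], False)
print(("parade_crossing", "vehicles_stopping") in mode)  # True
```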
- the operation server 140 may take particular countermeasures 166 to address (or perhaps resolve) the series of events 114 and tampering with the AV 302 .
- the corresponding description below describes non-limiting examples of countermeasures 166 in addressing (or perhaps resolving) the series of events 114 .
- the remote operator 164 establishes a communication path 160 between the operation server 140 and the AV 302 .
- the communication path 160 may follow a one-way communication protocol, where data can be transmitted from the operation server 140 to the AV 302 .
- the communication path 160 may be configured to support voice-based, message-based, visual-based, and/or any other appropriate types of communication.
- the communication path 160 may be established between the operation server 140 and a communication module that is associated with the AV 302 .
- the communication module may be installed at any appropriate location inside and/or on the outer body of the AV 302 .
- the communication module may be installed inside the cab of the AV 302 , behind the front windows.
- the communication module may include one or more user interfaces including, but not limited to, a speaker, a monitor screen, and a microphone.
- the communication module may operably be coupled with a camera in a surveillance room where the remote operator 164 is located.
- the remote operator 164 may configure the communication path 160 to show themselves on the monitor screen at the AV 302 , such that the remote operator 164 is visible from the monitor screen to the individuals causing the series of events 114 .
- the remote operator 164 can converse with the individuals causing the series of events 114 to discourage the individuals causing the series of events 114 from tampering with the AV 302 .
- the communication path 160 may follow a two-way communication protocol, where data can be transmitted and received from both sides.
- a countermeasure 166 to address (or perhaps resolve) the malicious event 112 may comprise activating a horn of the AV 302 .
- the remote operator 164 may remotely activate the horn of the AV 302 .
- a countermeasure 166 to address (or perhaps resolve) the malicious event 112 may comprise notifying law enforcement 170 .
- the remote operator 164 may send a notifying message 172 indicating that the AV 302 is being tampered with at particular location coordinates 138 .
- the countermeasures 166 described above may be computerized and be carried out by the operation server 140 .
- FIG. 2 illustrates an example flowchart of a method 200 for detecting malicious events 112 for an AV 302 . Modifications, additions, or omissions may be made to method 200 .
- Method 200 may include more, fewer, or other steps. For example, steps may be performed in parallel or in any suitable order. While at times discussed as the AV 302 , operation server 140 , control device 350 , or components of any thereof performing steps, any suitable system or components of the system may perform one or more steps of the method 200 .
- one or more steps of method 200 may be implemented, at least in part, in the form of software instructions 168 , 380 (respectively from FIGS. 1 and 3 ) stored on non-transitory, tangible, machine-readable media (e.g., memory 144 , data storage device 390 , and memory 502 , respectively from FIGS. 1 , 3 , and 5 ) that when run by one or more processors (e.g., processors 142 , 370 , and 504 , respectively from FIGS. 1 , 3 , and 5 ) may cause the one or more processors to perform steps 202 - 208 .
- Method 200 begins at step 202 where the control device 350 detects a series of events 114 within a threshold period of time 118 .
- the control device 350 may detect a series of events 114 within a threshold period of time 118 , where the series of events 114 comprises events 104 above a threshold number 116 .
- in some embodiments, the threshold number 116 may be one, and in other embodiments, the threshold number 116 may be more than one depending on the circumstances.
- the control device 350 may detect the series of events 114 by analyzing the sensor data 178 captured by the sensors 346 .
- the control device 350 may detect any of the example series of events 114 described in FIG. 1 .
- the series of events 114 may correspond to a deviation from the normalcy mode 106 .
- the series of events 114 may comprise a first event 104 a and a second event 104 b that taken in the aggregate amount to a series of events 114 that deviates from the normalcy mode 106 .
- the series of events 114 may comprise one or more events 104 that are not detected by the sensors 346 , i.e., they are not within the detection zones of the sensors 346 .
- vehicle 122 d that drags the AV 302 back by a cable 182 may not be within the detection zone of the sensors 346 .
- the sensors 346 may not detect the presence of the vehicle 122 d .
- the control device 350 may detect that the AV 302 is slowing down by monitoring the speed and performance of the engine 342 a of the AV 302 (see FIG. 4 ).
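The indirect detection just described — the dragging vehicle sits outside the sensors' detection zones, so the control device infers trouble from engine performance versus achieved speed — can be sketched as follows. The throttle/speed relationship and the tolerance value are illustrative assumptions.

```python
# Sketch of inferring an out-of-view drag (e.g., a vehicle pulling the AV
# back by a cable) from engine monitoring: high engine output but a speed
# well below what that output should produce suggests an external force.
# The thresholds and the linear speed model are assumptions.

def unexpected_drag(throttle_pct, expected_speed_kph, actual_speed_kph,
                    tolerance_kph=5.0):
    """True when the AV is slowing despite substantial engine output,
    hinting at an event the sensors cannot see."""
    return throttle_pct > 50 and (expected_speed_kph - actual_speed_kph) > tolerance_kph

# 70% throttle should yield ~60 km/h on this road, but the AV only does 35.
print(unexpected_drag(throttle_pct=70, expected_speed_kph=60, actual_speed_kph=35))  # True
```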
- the series of events 114 may comprise one or more events 104 that are detected on lane(s) other than the lane traveled by the AV 302 .
- vehicles 122 a and 122 b that invade the threshold distance 128 from the AV 302 may be in side-lanes with respect to the AV 302 .
- the threshold period of time 118 may be determined to be thirty seconds, one minute, two minutes, or any other appropriate duration of time.
- the threshold period of time 118 may vary depending on an encountered series of events 114 (and/or a number of events 104 in the series of events 114 ). For example, the threshold period of time 118 may increase as the number of events 104 in the series of events 114 increases.
- for one series of events 114 , the control device 350 may determine the threshold period of time 118 to be shorter compared to another series of events 114 , such as one where a vehicle 122 on a side of the AV 302 is driving parallel to the AV 302 .
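A threshold period that grows with the number of events in the series, as the passage suggests, can be sketched as a simple linear rule. The base period and per-event increment are illustrative assumptions.

```python
# Sketch of a threshold period of time 118 that increases as the number of
# events 104 in the series of events 114 increases. The base window and
# per-event increment are illustrative assumptions.

def threshold_period_s(num_events, base_s=30.0, per_event_s=15.0):
    """More events in the series -> a longer window within which they must
    all occur before the series is escalated."""
    return base_s + per_event_s * max(0, num_events - 1)

print(threshold_period_s(1))  # 30.0 : single event, base window
print(threshold_period_s(4))  # 75.0 : larger series, longer window
```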
- the control device 350 determines whether the series of events 114 corresponds to a malicious event 112 .
- the control device 350 may compare the series of events 114 , taken as a whole, with the expected scenarios 108 stored in the normalcy mode 106 . If no correspondence is found between the series of events 114 , taken as a whole, and the expected scenarios 108 , the control device 350 may determine that the series of events 114 corresponds to a malicious event 112 , i.e., the series of events 114 is a deviation from the normalcy mode 106 . If, however, a correspondence is found, the control device 350 may determine that the series of events 114 does not correspond to a malicious event 112 . In one embodiment, the control device 350 may compare each event 104 (from the series of events 114 ) with the expected scenarios 108 .
- the control device 350 may determine that the series of events 114 does not correspond to a malicious event 112 . Otherwise, the control device 350 may determine that the series of events 114 corresponds to a malicious event 112 .
- control device 350 may determine whether the series of events 114 corresponds to a malicious event 112 by assigning a confidence score 132 to the series of events 114 and determining whether the assigned confidence score 132 is above the threshold score 134 , similar to that described in FIG. 1 .
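One way to picture the confidence-score comparison is as a deviation ratio against the stored expected scenarios 108. This is a minimal sketch under invented assumptions: the `NormalcyMode` class, the event labels, and the default threshold score are illustrative, and the actual scoring in the disclosure may weigh events quite differently.

```python
from dataclasses import dataclass

@dataclass
class NormalcyMode:
    """Expected scenarios 108 the AV anticipates on the road (labels only)."""
    expected_scenarios: set

def is_malicious(series_of_events: list, normalcy: NormalcyMode,
                 threshold_score: float = 0.5) -> bool:
    """Score the series as a whole: the confidence score is the fraction
    of events with no corresponding expected scenario; above the
    threshold score, the series is treated as a malicious event."""
    if not series_of_events:
        return False
    deviations = sum(1 for event in series_of_events
                     if event not in normalcy.expected_scenarios)
    confidence_score = deviations / len(series_of_events)
    return confidence_score > threshold_score
```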
- method 200 may proceed to step 206 . If, however, it is determined that the series of events 114 does not correspond to a malicious event 112 , method 200 may be terminated.
- control device 350 may communicate the series of events 114 to the operation server 140 so that the remote operator 164 can confirm, update, or override the determination of the control device 350 .
- the control device 350 escalates the series of events 114 to be addressed.
- the control device 350 communicates the series of events 114 to the operation server 140 to be addressed by the remote operator 164 (or the operation server 140 ).
- the remote operator 164 may carry out particular countermeasures 166 to address the series of events 114 , similar to that described in FIG. 1 .
- countermeasures 166 may comprise establishing a communication path 160 with the AV 302 such that individuals causing the series of events 114 can hear and/or see the remote operator 164 from a speaker and/or a monitor screen of a communication module installed in the AV 302 , remotely activating a horn of the AV 302 , and/or sending a notifying message 172 to law enforcement 170 indicating that the AV 302 is being tampered with at the particular location coordinates 138 .
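The countermeasures listed above can be sketched as a small dispatcher. The action names and tuple format are hypothetical; in the disclosure these would map to the communication module at the AV 302, the horn, and the notifying message 172 sent to law enforcement 170 with the location coordinates 138.

```python
# Hypothetical dispatcher for the countermeasures 166. Names are invented.
def select_countermeasures(confirmed_malicious: bool, location_coords):
    """Return an ordered list of (action, payload) countermeasures."""
    if not confirmed_malicious:
        return []
    return [
        ("open_communication_path", None),           # operator audio/video at the AV
        ("activate_horn", None),                     # remotely sound the AV horn
        ("notify_law_enforcement", location_coords)  # tampering report with coordinates
    ]
```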
- FIG. 3 shows a block diagram of an example vehicle ecosystem 300 in which autonomous driving operations can be determined.
- the AV 302 may be a semi-trailer truck.
- the vehicle ecosystem 300 includes several systems and components that can generate and/or deliver one or more sources of information/data and related services to the in-vehicle control computer 350 that may be located in an AV 302 .
- the in-vehicle control computer 350 can be in data communication with a plurality of vehicle subsystems 340 , all of which can be resident in the AV 302 .
- a vehicle subsystem interface 360 is provided to facilitate data communication between the in-vehicle control computer 350 and the plurality of vehicle subsystems 340 .
- the vehicle subsystem interface 360 can include a controller area network (CAN) controller to communicate with devices in the vehicle subsystems 340 .
- the AV 302 may include various vehicle subsystems that support the operation of AV 302 .
- the vehicle subsystems may include the control device 350 , a vehicle drive subsystem 342 , a vehicle sensor subsystem 344 , and/or a vehicle control subsystem 348 .
- the components or devices of the vehicle drive subsystem 342 , the vehicle sensor subsystem 344 , and the vehicle control subsystem 348 shown in FIG. 3 are examples.
- the AV 302 may be configured as shown or in any other configuration.
- the vehicle drive subsystem 342 may include components operable to provide powered motion for the AV 302 .
- the vehicle drive subsystem 342 may include an engine/motor 342 a , wheels/tires 342 b , a transmission 342 c , an electrical subsystem 342 d , and a power source 342 e.
- the vehicle sensor subsystem 344 may include a number of sensors 346 configured to sense information about an environment or condition of the AV 302 .
- the vehicle sensor subsystem 344 may include one or more cameras 346 a or image capture devices, a Radar unit 346 b , one or more temperature sensors 346 c , a wireless communication unit 346 d (e.g., a cellular communication transceiver), an inertial measurement unit (IMU) 346 e , a laser range finder/LiDAR unit 346 f , a Global Positioning System (GPS) transceiver 346 g , and/or a wiper control system 346 h .
- the vehicle sensor subsystem 344 may also include sensors 346 configured to monitor internal systems of the AV 302 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature sensor, etc.).
- the IMU 346 e may include any combination of sensors 346 (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the AV 302 based on inertial acceleration.
- the GPS transceiver 346 g may be any sensor configured to estimate a geographic location of the AV 302 .
- the GPS transceiver 346 g may include a receiver/transmitter operable to provide information regarding the position of the AV 302 with respect to the Earth.
- the Radar unit 346 b may represent a system that utilizes radio signals to sense objects within the local environment of the AV 302 .
- the Radar unit 346 b may additionally be configured to sense the speed and the heading of the objects proximate to the AV 302 .
- the laser range finder or LiDAR unit 346 f may be any sensor configured to sense objects in the environment in which the AV 302 is located using lasers.
- the cameras 346 a may include one or more devices configured to capture a plurality of images of the environment of the AV 302 .
- the cameras 346 a may be still image cameras or motion video cameras.
- the vehicle control subsystem 348 may be configured to control the operation of the AV 302 and its components. Accordingly, the vehicle control subsystem 348 may include various elements such as a throttle and gear 348 a , a brake unit 348 b , a navigation unit 348 c , a steering system 348 d , and/or an autonomous control unit 348 e .
- the throttle 348 a may be configured to control, for instance, the operating speed of the engine and, in turn, control the speed of the AV 302 .
- the gear 348 a may be configured to control the gear selection of the transmission.
- the brake unit 348 b can include any combination of mechanisms configured to decelerate the AV 302 .
- the brake unit 348 b can use friction to slow the wheels in a standard manner.
- the brake unit 348 b may include an Anti-Lock Brake system (ABS) that can prevent the brakes from locking up when the brakes are applied.
- the navigation unit 348 c may be any system configured to determine a driving path or route for the AV 302 .
- the navigation unit 348 c may additionally be configured to update the driving path dynamically while the AV 302 is in operation.
- the navigation unit 348 c may be configured to incorporate data from the GPS transceiver 346 g and one or more predetermined maps so as to determine the driving path (e.g., along the road 120 of FIG. 1 ) for the AV 302 .
- the steering system 348 d may represent any combination of mechanisms that may be operable to adjust the heading of AV 302 in an autonomous mode or in a driver-controlled mode.
- the autonomous control unit 348 e may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles or obstructions in the environment of the AV 302 .
- the autonomous control unit 348 e may be configured to control the AV 302 for operation without a driver or to provide driver assistance in controlling the AV 302 .
- the autonomous control unit 348 e may be configured to incorporate data from the GPS transceiver 346 g , the Radar 346 b , the LiDAR unit 346 f , the cameras 346 a , and/or other vehicle subsystems to determine the driving path or trajectory for the AV 302 .
- the in-vehicle control computer 350 may include at least one data processor 370 (which can include at least one microprocessor) that executes processing instructions 380 stored in a non-transitory computer-readable medium, such as the data storage device 390 or memory.
- the in-vehicle control computer 350 may also represent a plurality of computing devices that may serve to control individual components or subsystems of the AV 302 in a distributed fashion.
- the data storage device 390 may contain processing instructions 380 (e.g., program logic) executable by the data processor 370 to perform various methods and/or functions of the AV 302 , including those described with respect to FIGS. 1 and 2 .
- the data storage device 390 may contain additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of the vehicle drive subsystem 342 , the vehicle sensor subsystem 344 , and the vehicle control subsystem 348 .
- the in-vehicle control computer 350 can be configured to include a data processor 370 and a data storage device 390 .
- the in-vehicle control computer 350 may control the function of the AV 302 based on inputs received from various vehicle subsystems (e.g., the vehicle drive subsystem 342 , the vehicle sensor subsystem 344 , and the vehicle control subsystem 348 ).
- FIG. 4 shows an exemplary system 400 for providing precise autonomous driving operations.
- the system 400 includes several modules that can operate in the in-vehicle control computer 350 , as described in FIG. 3 .
- the in-vehicle control computer 350 includes a sensor fusion module 402 shown in the top left corner of FIG. 4 , where the sensor fusion module 402 may perform at least four image or signal processing operations.
- the sensor fusion module 402 can obtain images from cameras located on an autonomous vehicle to perform image segmentation 404 to detect the presence of moving objects (e.g., other vehicles 122 , pedestrians, etc.) and/or static obstacles (e.g., stop sign, speed bump, terrain, etc.) located around the autonomous vehicle.
- the sensor fusion module 402 can obtain LiDAR point cloud data item from LiDAR sensors 346 located on the autonomous vehicle to perform LiDAR segmentation 406 to detect the presence of objects and/or obstacles located around the autonomous vehicle.
- the sensor fusion module 402 can perform instance segmentation 408 on image and/or point cloud data item to identify an outline (e.g., boxes) around the objects and/or obstacles located around the autonomous vehicle.
- the sensor fusion module 402 can perform temporal fusion where objects and/or obstacles from one image and/or one frame of point cloud data item are correlated with or associated with objects and/or obstacles from one or more images or frames subsequently received in time.
- the sensor fusion module 402 can fuse the objects and/or obstacles from the images obtained from the camera and/or point cloud data item obtained from the LiDAR sensors 346 .
- the sensor fusion module 402 may determine, based on the locations of two cameras, that an image from one of the cameras comprising one half of a vehicle located in front of the autonomous vehicle captures the same vehicle as the image from the other camera.
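Temporal fusion and cross-sensor association are commonly implemented with an overlap test such as intersection-over-union (IoU) between detection boxes. The sketch below shows that generic technique, not necessarily the module's actual method; the box format and the matching threshold are assumptions.

```python
# Generic IoU-based association, a common way to decide whether two
# detections (from successive frames or from two cameras) are one object.
def iou(box_a, box_b) -> float:
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union > 0 else 0.0

def same_object(box_a, box_b, threshold: float = 0.3) -> bool:
    """Treat two detections as one object when their overlap is high enough."""
    return iou(box_a, box_b) >= threshold
```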
- the sensor fusion module 402 sends the fused object information to the interference module 446 and the fused obstacle information to the occupancy grid module 460 .
- the in-vehicle control computer includes the occupancy grid module 460 , which can retrieve landmarks from a map database 458 stored in the in-vehicle control computer.
- the occupancy grid module 460 can determine drivable areas and/or obstacles from the fused obstacles obtained from the sensor fusion module 402 and the landmarks stored in the map database 458 . For example, the occupancy grid module 460 can determine that a drivable area may include a speed bump obstacle.
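A toy version of the occupancy grid idea is sketched below: fused obstacles and map landmarks are rasterized into a grid of drivable versus occupied cells, with a speed bump left drivable as in the example above. The grid representation and landmark kinds are assumptions for illustration.

```python
# Minimal occupancy-grid sketch: 0 = drivable, 1 = occupied.
def build_occupancy_grid(width: int, height: int, obstacles, landmarks):
    """Mark fused obstacle cells and non-drivable landmark cells."""
    grid = [[0] * width for _ in range(height)]
    for (x, y) in obstacles:
        grid[y][x] = 1
    for (x, y, kind) in landmarks:
        # A speed bump is still drivable (it only constrains speed),
        # so only truly non-drivable landmarks block the cell.
        if kind != "speed_bump":
            grid[y][x] = 1
    return grid
```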
- the in-vehicle control computer 350 includes a LiDAR based object detection module 412 that can perform object detection 416 based on point cloud data item obtained from the LiDAR sensors 414 located on the autonomous vehicle.
- the object detection 416 technique can provide a location (e.g., in 3D world coordinates) of objects from the point cloud data item.
- the in-vehicle control computer includes an image based object detection module 418 that can perform object detection 424 based on images obtained from cameras 420 located on the autonomous vehicle.
- the object detection 424 technique can employ a deep machine learning technique to provide a location (e.g., in 3D world coordinates) of objects from the image provided by the camera.
- the Radar 456 on the autonomous vehicle can scan an area in front of the autonomous vehicle or an area towards which the autonomous vehicle is driven.
- the Radar data is sent to the sensor fusion module 402 that can use the Radar data to correlate the objects and/or obstacles detected by the Radar 456 with the objects and/or obstacles detected from both the LiDAR point cloud data item and the camera image.
- the Radar data is also sent to the interference module 446 that can perform data processing on the Radar data to track objects by object tracking module 448 as further described below.
- the in-vehicle control computer includes an interference module 446 that receives the locations of the objects from the point cloud and the objects from the image, and the fused objects from the sensor fusion module 402 .
- the interference module 446 also receives the Radar data with which the interference module 446 can track objects by object tracking module 448 from one point cloud data item and one image obtained at one time instance to another (or the next) point cloud data item and another image obtained at another subsequent time instance.
- the interference module 446 may perform object attribute estimation 450 to estimate one or more attributes of an object detected in an image or point cloud data item.
- the one or more attributes of the object may include a type of object (e.g., pedestrian, car, or truck, etc.).
- the interference module 446 may perform behavior prediction 452 to estimate or predict motion pattern of an object detected in an image and/or a point cloud.
- the behavior prediction 452 can be performed to detect a location of an object in a set of images received at different points in time (e.g., sequential images) or in a set of point cloud data item received at different points in time (e.g., sequential point cloud data items).
- the behavior prediction 452 can be performed for each image received from a camera and/or each point cloud data item received from the LiDAR sensor.
- the interference module 446 can reduce computational load by performing behavior prediction 452 on every other image, or after every pre-determined number of images received from a camera or point cloud data items received from the LiDAR sensor (e.g., after every two images or after every three point cloud data items).
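The frame-skipping idea above reduces to selecting every Nth input for the expensive prediction step. A minimal sketch, with the stride value invented:

```python
# Select which frame indices get behavior prediction 452 when it runs
# only after every `stride` frames (e.g., every third image).
def frames_to_process(num_frames: int, stride: int = 3) -> list:
    """Indices of frames on which prediction would actually run."""
    return [i for i in range(num_frames) if i % stride == 0]
```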
- the behavior prediction 452 feature may determine the speed and direction of the objects that surround the autonomous vehicle from the Radar data, where the speed and direction information can be used to predict or determine motion patterns of objects.
- a motion pattern may comprise predicted trajectory information of an object over a pre-determined length of time in the future after an image is received from a camera.
- the interference module 446 may assign motion pattern situational tags to the objects (e.g., “located at coordinates (x,y),” “stopped,” “driving at 50 mph,” “speeding up” or “slowing down”).
- the situation tags can describe the motion pattern of the object.
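The tag assignment above can be sketched as a mapping from consecutive speed estimates to a situational tag. This is an illustrative simplification; the disclosure's tags also include position tags such as "located at coordinates (x,y)," which are omitted here.

```python
# Hypothetical tag assignment from two consecutive Radar speed samples.
def situational_tag(speed_mph: float, prev_speed_mph: float) -> str:
    """Derive a motion-pattern situational tag for a tracked object."""
    if speed_mph == 0:
        return "stopped"
    if speed_mph > prev_speed_mph:
        return "speeding up"
    if speed_mph < prev_speed_mph:
        return "slowing down"
    return f"driving at {speed_mph:.0f} mph"
```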
- the interference module 446 sends the one or more object attributes (e.g., types of the objects) and motion pattern situational tags to the planning module 462 .
- the interference module 446 may perform an environment analysis 454 using any information acquired by system 400 and any number and combination of its components.
- the in-vehicle control computer includes the planning module 462 that receives the object attributes and motion pattern situational tags from the interference module 446 , the drivable area and/or obstacles, and the vehicle location and pose information from the fused localization module 426 (further described below).
- the planning module 462 can perform navigation planning 464 to determine a set of trajectories on which the autonomous vehicle can be driven.
- the set of trajectories can be determined based on the drivable area information, the one or more object attributes of objects, the motion pattern situational tags of the objects, and the location of the obstacles.
- the navigation planning 464 may include determining an area next to the road 120 (see FIG. 1 ) where the autonomous vehicle can be safely parked in case of emergencies.
- the planning module 462 may include behavioral decision making 466 to determine driving actions (e.g., steering, braking, throttle) in response to determining changing conditions on the road 120 (see FIG. 1 ).
- the planning module 462 performs trajectory generation 468 and selects a trajectory from the set of trajectories determined by the navigation planning operation 464 .
- the selected trajectory information is sent by the planning module 462 to the control module 470 .
- the in-vehicle control computer includes a control module 470 that receives the proposed trajectory from the planning module 462 and the autonomous vehicle location and pose from the fused localization module 426 .
- the control module 470 includes a system identifier 472 .
- the control module 470 can perform a model based trajectory refinement 474 to refine the proposed trajectory.
- the control module 470 can apply a filter (e.g., Kalman filter) to make the proposed trajectory data smooth and/or to minimize noise.
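A full trajectory filter operates on multidimensional state, but the predict/update cycle of a Kalman filter can be shown on a single scalar track. The sketch below is a generic textbook 1-D Kalman filter, not the control module's actual refinement; the noise variances are invented.

```python
# Minimal scalar Kalman filter: predict, then blend each new noisy
# measurement with the running estimate using the Kalman gain.
def kalman_smooth_1d(measurements, process_var=1e-3, meas_var=0.25):
    """Smooth a noisy scalar signal, e.g. one trajectory coordinate."""
    estimate, error = measurements[0], 1.0
    smoothed = [estimate]
    for z in measurements[1:]:
        error += process_var                 # predict: uncertainty grows
        gain = error / (error + meas_var)    # update: weigh the measurement
        estimate += gain * (z - estimate)
        error *= (1.0 - gain)
        smoothed.append(estimate)
    return smoothed
```

On an alternating 0/1 input the output stays well inside the raw measurement range, which is the smoothing behavior the control module relies on.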
- the control module 470 may perform the robust control 476 by determining, based on the refined proposed trajectory information and current location and/or pose of the autonomous vehicle, an amount of brake pressure to apply, a steering angle, a throttle amount to control the speed of the vehicle, and/or a transmission gear.
- the control module 470 can send the determined brake pressure, steering angle, throttle amount, and/or transmission gear to one or more devices in the autonomous vehicle to control and facilitate precise driving operations of the autonomous vehicle.
- the deep image-based object detection 424 performed by the image based object detection module 418 can also be used to detect landmarks (e.g., stop signs, speed bumps, etc.) on the road 120 (see FIG. 1 ).
- the in-vehicle control computer includes a fused localization module 426 that obtains landmarks detected from images, the landmarks obtained from a map database 436 stored on the in-vehicle control computer, the landmarks detected from the point cloud data item by the LiDAR based object detection module 412 , the speed and displacement from the odometer sensor 444 and the estimated location of the autonomous vehicle from the GPS/IMU sensor 438 (i.e., GPS sensor 440 and IMU sensor 442 ) located on or in the autonomous vehicle. Based on this information, the fused localization module 426 can perform a localization operation 428 to determine a location of the autonomous vehicle, which can be sent to the planning module 462 and the control module 470 .
- the fused localization module 426 can estimate pose 430 of the autonomous vehicle based on the GPS and/or IMU sensors 438 .
- the pose of the autonomous vehicle can be sent to the planning module 462 and the control module 470 .
- the fused localization module 426 can also estimate status (e.g., location, possible angle of movement) of the trailer unit based on, for example, the information provided by the IMU sensor 442 (e.g., angular rate and/or linear velocity).
- the fused localization module 426 may also check the map content 432 .
- FIG. 5 shows an exemplary block diagram of an in-vehicle control computer 350 included in the AV 302 .
- the in-vehicle control computer 350 includes at least one processor 504 and a memory 502 having instructions stored thereupon (e.g., software instructions 168 and processing instructions 380 of FIGS. 1 and 3 , respectively).
- the instructions upon execution by the processor 504 configure the in-vehicle control computer 350 and/or the various modules of the in-vehicle control computer 350 to perform the operations described in FIGS. 1 - 5 .
- the transmitter 506 transmits or sends information or data to one or more devices in the autonomous vehicle. For example, the transmitter 506 can send an instruction to one or more motors of the steering wheel to steer the autonomous vehicle.
- the receiver 508 receives information or data transmitted or sent by one or more devices. For example, the receiver 508 receives a status of the current speed from the odometer sensor or the current transmission gear from the transmission.
- the transmitter 506 and receiver 508 are also configured to communicate with a plurality of vehicle subsystems 340 and the in-vehicle control computer 350 described above in FIGS. 3 and 4 .
- a system comprising:
- Clause 3 The system of Clause 1, wherein detecting the series of events within the threshold period of time comprises detecting one or more of:
- Clause 4 The system of Clause 1, wherein determining whether the series of events corresponds to the malicious event comprises:
- Clause 5 The system of Clause 1, wherein the processor is further configured to:
- Clause 6 The system of Clause 5, wherein the processor is further configured to in response to determining that the confidence score is below the threshold score, update the normalcy mode to include the series of events indicating that the series of events does not correspond to the malicious event.
- Clause 9 The method of Clause 8, wherein determining whether the series of events corresponds to the malicious event comprises:
- Clause 10 The method of Clause 8, wherein determining whether the series of events corresponds to the malicious event comprises:
- Clause 11 The method of Clause 8, wherein the communication path comprises one or more of audio and visual communications.
- Clause 13 The method of Clause 8, wherein the threshold period of time is determined based at least in part upon the number of events in the series of events such that as the number of events in the series of events increases, the threshold period of time increases.
- Clause 14 The method of Clause 8, wherein escalating the series of events comprises remotely activating a horn of the AV to discourage accomplices causing the series of events.
- Clause 15 The method of Clause 14, further comprising in response to determining that the series of events does not correspond to the malicious event, updating the normalcy mode to include the series of events.
- a computer program comprising executable instructions stored in a non-transitory computer-readable medium that when executed by one or more processors causes the one or more processors to:
- Clause 18 The computer program of Clause 16, wherein the at least one vehicle sensor comprises at least one of a camera, Light Detection and Ranging (LiDAR) sensor, motion sensor, and infrared sensor.
- Clause 19 The computer program of Clause 16, wherein the at least one vehicle sensor comprises a sensor monitoring performance of at least one of an engine, a wheel, a tire, a transmission component, and an electrical component of the AV.
- Clause 20 The computer program of Clause 16, wherein the AV is a tractor unit and is attached to a trailer.
Abstract
A system comprises an autonomous vehicle (AV) and a control device operably coupled with the AV. The control device detects a series of events within a threshold period of time, where a number of events in the series of events is above a threshold number. The series of events taken in the aggregate within the threshold period of time deviates from a normalcy mode. The normalcy mode comprises events that are expected to be encountered by the AV. The control device determines whether the series of events corresponds to a malicious event, where the malicious event indicates tampering with the AV. In response to determining that the series of events corresponds to the malicious event, the series of events is escalated to be addressed.
Description
- This application is a continuation of U.S. patent application Ser. No. 17/165,396 filed Feb. 2, 2021, and entitled “MALICIOUS EVENT DETECTION FOR AUTONOMOUS VEHICLES,” which is incorporated herein by reference.
- The present disclosure relates generally to autonomous vehicles. More particularly, the present disclosure is related to malicious event detection for autonomous vehicles.
- One aim of autonomous vehicle technologies is to provide vehicles that can safely navigate towards a destination. In some cases, an autonomous vehicle may encounter an unexpected situation on its way to a destination. For example, an autonomous vehicle may encounter a situation where a third party, such as a vehicle, an individual, or a pedestrian attempts to tamper with the AV. For example, a third party may attempt to force the autonomous vehicle to deviate from its predetermined traveling path or force the autonomous vehicle to pull over. Current autonomous vehicle technologies may not be configured to account for encountering specific unexpected situations.
- This disclosure recognizes various problems and previously unmet needs related to detecting malicious events affecting autonomous vehicles. Current autonomous vehicle technologies may not be configured to account for malicious intents of vehicles, individuals, or pedestrians attempting to tamper with an autonomous vehicle (AV). For instance, one or more vehicles may intentionally force the AV to deviate from its routing plan or traveling path by invading space within a threshold distance from the AV. As another instance, one or more vehicles may intentionally force the AV to pull over. As another instance, one or more vehicles may intentionally force the AV to slow down. The one or more vehicles causing the above-identified events may attempt to access or steal cargo carried by the AV, or devices and autonomous technology present in the AV. As another instance, one or more vehicles may intentionally or inadvertently collide with the AV and flee the scene of the accident.
- Certain embodiments of this disclosure provide unique technical solutions to technical problems of current autonomous vehicle technologies, including those problems described above by detecting a series of events within a threshold period, where the series of events corresponds to a malicious event. For example, the detected series of events may indicate a deviation from a normalcy mode, where the normalcy mode comprises expected or predictable scenarios in various road environments.
- The various road environments may comprise, for example, when the AV is in traffic. In this example, it is expected that sensors of the AV detect that all surrounding vehicles in the traffic are stopped or slowing down. As such, if the sensors detect that one or more surrounding vehicles are slowing down, while other vehicles are not slowing down, the AV may determine that this situation is a deviation from the normalcy mode. Another example of a road environment in the normalcy mode may be when the AV is driving along a road. In this example, it is expected that the sensors of the AV detect that 1) vehicles driving next to the AV do not drive parallel to the AV for more than a threshold period and 2) the vehicles driving next to the AV do not invade a threshold distance from the AV for more than a threshold period.
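The traffic example above can be sketched as a simple rule: in traffic, the normalcy mode expects all surrounding vehicles to be stopped or slowing, so a mix of slow and fast neighbors is a deviation. The speed cutoff is an invented constant.

```python
# Hypothetical normalcy-mode rule for the in-traffic road environment.
def deviates_from_traffic_normalcy(surrounding_speeds_mph, slow_mph=10.0):
    """Return True when some neighbors are slowing while others are not,
    which the normalcy mode does not expect in traffic."""
    slow = any(s <= slow_mph for s in surrounding_speeds_mph)
    fast = any(s > slow_mph for s in surrounding_speeds_mph)
    return slow and fast
```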
- Upon detecting a series of events that deviates from the normalcy mode, the series of events is escalated to be addressed. For example, a communication path may be established with the AV such that a remote operator can be seen and/or heard from a communication module at the AV in order to discourage individuals tampering with the AV from tampering with the AV. As another example, a notifying message may be sent to law enforcement indicating that the AV is being tampered with at a particular location coordinate.
- In one embodiment, a system comprises an AV that comprises at least one vehicle sensor, where the AV is configured to travel along a road. The system further comprises a control device that is operably coupled with the AV. The control device detects, from sensor data received from the vehicle sensor, a series of events within a threshold period, where a number of events in the series of events is above a threshold number. The series of events in the aggregate within the threshold period deviates from a normalcy mode. The normalcy mode comprises events that are expected to be encountered by the AV. The control device determines whether the series of events corresponds to a malicious event. In response to determining that the series of events corresponds to the malicious event, the control device escalates the series of events to be addressed, where escalating the series of events comprises performing at least one countermeasure to resolve the series of events. The at least one countermeasure comprises establishing a communication path between the AV and an operator such that the operator is able to converse with accomplices causing the series of events.
- The disclosed systems provide several practical applications and technical advantages which include: 1) technology that builds a normalcy mode, where the normalcy mode comprises expected or predictable scenarios in various road environments; 2) technology that detects a series of events in a threshold period and determines whether the series of events deviates from the normalcy mode, where the number of events in the series of events is greater than a threshold number; 3) technology that establishes a communication path with the AV, in response to determining that the series of events corresponds to a malicious event, where the communication path supports voice and visual communications; 4) technology that sends a notifying message to law enforcement indicating that the AV is being tampered with at a particular location coordinate, in response to determining that the series of events corresponds to a malicious event; 5) technology that remotely activates a horn of the AV, in response to determining that the series of events corresponds to a malicious event; and 6) technology that activates a surveillance sensor to record the series of events, in response to determining that the series of events corresponds to a malicious event, where the surveillance sensor is hidden from sight.
- As such, the systems described in this disclosure may be integrated into a practical application of determining a more efficient, safe, and reliable solution for detecting malicious events targeting the AV. For example, the disclosed system may determine that a series of events detected within a threshold period deviates from the normalcy mode. The disclosed system may compare the detected series of events with the normalcy mode. If a corresponding expected scenario is found, the disclosed system determines that the series of events (in aggregation) does not correspond to a malicious event. If, however, no corresponding expected scenario is found, the disclosed system determines that the series of events (in aggregation) corresponds to a malicious event. In another example, the disclosed system may determine that above a threshold number of events from the series of events detected within the threshold period of time deviate from the normalcy mode. In another example, the disclosed system may compare each event from the series of events with the expected scenarios to determine whether each event corresponds to an expected scenario.
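The threshold-number variant described above, counting how many individual events lack a corresponding expected scenario, can be sketched as follows. The labels and threshold are illustrative assumptions.

```python
# Hypothetical aggregate rule: flag the series when more than a threshold
# number of its events deviate from the normalcy mode's expected scenarios.
def deviating_event_count(series_of_events, expected_scenarios) -> int:
    """Count events with no corresponding expected scenario."""
    return sum(1 for e in series_of_events if e not in expected_scenarios)

def series_is_malicious(series_of_events, expected_scenarios,
                        threshold_number: int) -> bool:
    """Apply the threshold-number test to the deviation count."""
    return deviating_event_count(series_of_events,
                                 expected_scenarios) > threshold_number
```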
- Furthermore, the systems described in this disclosure may be integrated into an additional practical application of determining a more efficient, safe, and reliable solution to address and perhaps resolve a situation where a third party is tampering with the AV. For example, the disclosed system may establish a communication path with the AV, and enable a remote operator to be seen and/or heard from a communication module at the AV in order to discourage individuals tampering with the AV from tampering with the AV. As another example, the disclosed system may remotely activate a horn of the AV in order to discourage individuals tampering with the AV from tampering with the AV. As another example, the disclosed system may send a notifying message to law enforcement indicating that the AV is being tampered with at a particular location coordinate.
- Certain embodiments of this disclosure may include some, all, or none of these advantages. These advantages and other features will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.
- For a more complete understanding of this disclosure, reference is now made to the following brief description, taken in connection with the accompanying drawings and detailed description, wherein like reference numerals represent like parts.
-
FIG. 1 illustrates an embodiment of a system configured to detect malicious events for an autonomous vehicle (AV) and a simplified schematic diagram of example series of events corresponding to malicious events according to certain embodiments of this disclosure; -
FIG. 2 illustrates an example flowchart of a method for detecting malicious events for an AV; -
FIG. 3 illustrates a block diagram of an example AV configured to implement autonomous driving operations; -
FIG. 4 illustrates an example system for providing autonomous driving operations used by the AV of FIG. 3 ; and -
FIG. 5 illustrates a block diagram of an in-vehicle control computer included in the AV of FIG. 3 . - As described above, previous technologies may fail to provide efficient, reliable, and safe solutions for detecting malicious events for autonomous vehicles. This disclosure provides various systems, methods, and devices for 1) improving the performance of a traveling AV; 2) providing a safe driving experience for the AV, other vehicles, and pedestrians; and 3) securing a cargo carried by the AV by detecting a series of events that corresponds to a malicious event, escalating the malicious event to be addressed, and carrying out countermeasures to address (and perhaps resolve) the malicious event.
FIG. 1 illustrates an embodiment of a system that is configured to detect malicious events encountered by an AV. FIG. 1 further illustrates the AV traveling along a road where various examples of anomalous series of events occur, each of which corresponds to a malicious event individually or, in the aggregate, corresponds to a malicious event. FIG. 2 illustrates an example flowchart of an embodiment of a method for detecting malicious events for an AV. FIGS. 3-5 illustrate an example AV and various systems and devices for implementing autonomous driving operations by an AV, including the malicious event detection operation described in this disclosure. For example, FIG. 5 illustrates an example control device of the example AV shown in FIG. 3 for implementing the malicious event detection operations described in this disclosure. -
FIG. 1 illustrates an embodiment of a system 100 for detecting malicious events 112 for an AV 302. FIG. 1 further illustrates a simplified schematic diagram 102 that comprises various examples of anomalous road conditions or series of events 114 that correspond to tampering with an AV 302. In one embodiment, system 100 comprises the AV 302 and an operation server 140. The system 100 may further comprise an application server 162, a remote operator 164, and a network 184 that provides communication paths for all of the illustrated components of the system 100 to communicate with each other. The system 100 may be configured as shown or in any other suitable configuration. - In general, the
system 100 detects a series of events 114 that may correspond to a malicious event 112, where the malicious event 112 indicates that a third party, such as a vehicle 122 or a pedestrian, is tampering with the AV 302. The series of events 114 comprises a number of events above a threshold number 116 that occur within a threshold period of time 118. For example, the series of events 114 may comprise a first event 104 a and a second event 104 b that, taken in the aggregate within a threshold period of time 118, amount to the series of events 114 that deviates from a normalcy mode 106. Although this disclosure is detailed with respect to a series of events 114 that includes more than one event 104 that in the aggregate deviate from a normalcy mode 106, it should be understood that system 100 also contemplates identifying and escalating a singular event 104 that deviates from a normalcy mode 106 as a malicious event 112 in the same or similar manner to how a series of events 114 is identified and escalated as a malicious event 112. Normalcy mode 106 is described in detail further below. In brief, the normalcy mode 106 comprises events or scenarios 108 that are expected to be encountered by the AV 302 in the normal course of operation. System 100 may detect the series of events 114 from sensor data 178 received from sensors 346 associated with the AV 302. Upon detecting the series of events 114 within the threshold period 118, system 100 determines whether the series of events 114 corresponds to a malicious event 112. In response to determining that the series of events 114 corresponds to a malicious event 112, system 100 escalates the series of events 114 to be addressed. For example, the control device 350 communicates the series of events 114 to the operation server 140 to be addressed by a remote operator 164. The various examples of an anomalous series of events 114 are described in detail further below.
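The detect-then-escalate flow above can be sketched as a trailing time window over deviating events. This is a minimal sketch under assumptions: the class and method names are illustrative, and escalation is reduced to a boolean return value rather than communication with an operation server.

```python
from collections import deque

class MaliciousEventDetector:
    """Collect deviating events over a trailing threshold period; when more
    than a threshold number of them accumulate, the series is escalated."""
    def __init__(self, threshold_period_s, threshold_count):
        self.threshold_period_s = threshold_period_s
        self.threshold_count = threshold_count
        self._events = deque()  # (timestamp_s, event) pairs, oldest first

    def observe(self, timestamp_s, event):
        """Record a deviating event; return True when the aggregated series
        should be escalated as a malicious event."""
        self._events.append((timestamp_s, event))
        # Keep only events inside the trailing threshold period.
        while timestamp_s - self._events[0][0] > self.threshold_period_s:
            self._events.popleft()
        return len(self._events) > self.threshold_count
```

With a one-minute window and a threshold number of two, a third deviating event within the window would trigger escalation, while events spaced farther apart than the window would not aggregate.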
The corresponding description below comprises a brief description of certain components of the system 100. - In one embodiment, the
AV 302 may include a semi-truck tractor unit attached to a trailer to transport cargo or freight from one location to another location (see FIG. 3 ). In this disclosure, the semi-truck tractor may be referred to as a cab of the AV 302. The AV 302 is navigated by a plurality of components described in detail in FIGS. 3-5 . The operation of the AV 302 is described in greater detail in FIG. 3 . The corresponding description below includes brief descriptions of certain components of the AV 302. In brief, the AV 302 includes an in-vehicle control computer 350 which is operated to facilitate autonomous driving of the AV 302. In this disclosure, the in-vehicle control computer 350 may be interchangeably referred to as a control device 350. -
Control device 350 is generally configured to control the operation of the AV 302 and its components. The control device 350 is further configured to determine a pathway in front of the AV 302 that is safe to travel and free of objects/obstacles, and to navigate the AV 302 to travel in that pathway. This process is described in more detail in FIGS. 3-5 . - The
control device 350 generally includes one or more computing devices in signal communication with other components of the AV 302 (see FIG. 3 ). The control device 350 receives sensor data 178 from one or more sensors 346 positioned on the AV 302 to determine a safe pathway to travel. The sensor data 178 includes data captured by the sensors 346. Sensors 346 are configured to capture any object within their detection zones or fields of view, such as landmarks, lane markings, lane boundaries, road 120 boundaries, vehicles 122, pedestrians, and road 120/traffic signs, among other objects. The sensors 346 may include cameras, LiDAR sensors, motion sensors, infrared sensors, and the like. In one embodiment, the sensors 346 may be positioned around the AV 302 to capture the environment surrounding the AV 302. In some cases, the sensors 346 may detect a series of events 114 within the threshold period 118 that corresponds to a malicious event 112. For example, the sensor data 178 may include one or more indications indicating an anomalous or abnormal series of events 114 within the threshold period of time 118. The threshold period of time 118 may be determined to be thirty seconds, one minute, two minutes, or any other appropriate duration. - The
control device 350 analyzes the sensor data 178 and determines whether the series of events 114 corresponds to a malicious event 112. For example, the control device 350 may compare the series of events 114 with the normalcy mode 106 to determine whether the series of events 114 corresponds to any of the expected scenarios 108. - In one embodiment, the
control device 350 may compare each event 104 from the series of events 114 individually with the normalcy mode 106 (or each of the expected scenarios 108). For example, if more than the threshold number 116 of events 104 are detected within the threshold period of time 118 such that each of those events 104 does not correspond to the normalcy mode 106, the control device 350 may determine that those events 104 in the aggregate indicate a deviation from the normalcy mode 106 and correspond to a malicious event 112. - In one embodiment, the
control device 350 may compare the series of events 114, as a whole, with the normalcy mode 106 (or the expected scenarios 108). For example, if the series of events 114, taken as a whole, does not correspond to any of the expected scenarios 108, the control device 350 may determine that the series of events 114 is a deviation from the normalcy mode 106 and corresponds to a malicious event 112. In other words, if a corresponding expected scenario 108 is found, the control device 350 determines that the series of events 114, as a whole, does not correspond to a malicious event 112. If, however, the series of events 114 does not correspond to any of the expected scenarios 108, the control device 350 determines that the series of events 114 corresponds to the malicious event 112. - The
control device 350 is in signal communication with the operation server 140. The control device 350 is configured to communicate the sensor data 178 to the operation server 140, for example, via network 184. The control device 350 may communicate the sensor data 178 to the operation server 140 periodically (e.g., every minute, every few minutes, or at any other suitable interval), continuously, and/or upon receiving a request from the operation server 140 to send sensor data 178. The sensor data 178 may include data describing the environment surrounding the AV 302, such as image feed, video feed, LiDAR data feed, and other data captured from the fields of view of the sensors 346. The sensor data 178 may further include location coordinates 138 associated with the AV 302. See the corresponding description of FIG. 3 for further description of the control device 350. -
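The reporting policy just described (periodic transmission and/or transmission on request) can be sketched as a small decision helper. The function name, its parameters, and the returned reason labels are illustrative assumptions, not part of the disclosed system.

```python
def due_transmissions(last_sent_s, now_s, interval_s, request_pending):
    """Return the reasons (if any) to send sensor data now: the periodic
    interval has elapsed, and/or the operation server has requested data."""
    reasons = []
    if now_s - last_sent_s >= interval_s:
        reasons.append("periodic")
    if request_pending:
        reasons.append("on_request")
    return reasons
```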
Operation server 140 is generally configured to oversee the operations of the AV 302. Details of the operation server 140 are described further below. In brief, the operation server 140 comprises a processor 142, a memory 144, a network interface 146, and a user interface 148. The components of the operation server 140 are operably coupled to each other. - The
processor 142 may include one or more processing units that perform various functions as described herein. The memory 144 stores any data and/or instructions used by the processor 142 to perform its functions. For example, the memory 144 stores software instructions 168 that, when executed by the processor 142, cause the operation server 140 to perform one or more functions described herein. - The
operation server 140 is in signal communication with the AV 302 and its components. The operation server 140 is configured to receive the sensor data 178 and the series of events 114 from the control device 350, analyze them, and, in response, confirm (or update) the determination of the control device 350 regarding whether the series of events 114 corresponds to a malicious event 112. - The
operation server 140 is further configured to detect objects on and around a road 120 traveled by the AV 302 by analyzing the sensor data 178. For example, the operation server 140 may detect objects on and around a road 120 by implementing object detection machine learning modules 158. The object detection machine learning modules 158 may be implemented using neural networks and/or machine learning algorithms for detecting objects from images, videos, infrared images, point clouds, radar data, etc. The object detection machine learning modules 158 are described in more detail further below. -
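The object-detection step above can be sketched as a thin wrapper around any detection model. This is purely illustrative: the wrapper name, the model interface (a callable returning label/confidence pairs), and the confidence threshold are assumptions, since the text does not specify the module's interface.

```python
def detect_objects(frame, model, confidence_threshold=0.5):
    """Run an object-detection model on one sensor frame and keep only
    detections at or above the confidence threshold."""
    return [(label, conf) for label, conf in model(frame)
            if conf >= confidence_threshold]
```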
Normalcy mode 106 generally comprises scenarios 108 that are expected to be encountered by the AV 302. The normalcy mode 106 is built or generated by a normalcy mode building engine 110. The normalcy mode building engine 110 is described further below. For example, the normalcy mode 106 may be built by the processor 142 executing the software instructions 168. - The
normalcy mode 106 generally corresponds to expected or predictable scenarios 108 that indicate 1) expected actions or behaviors of the AV 302 and 2) expected actions or behaviors of objects within and outside the detection zones of the sensors 346, including moving objects (such as vehicles 122, pedestrians, etc.) and static objects (such as traffic lights 126, etc.) in various situations. For example, the normalcy mode 106 may include expected scenarios 108 for situations where the AV 302 is in traffic, is behind a traffic light 126, or has detected an impact or collision with a vehicle 122 or an object 124, among other situations which are described below. - As an example, a first expected
scenario 108 in the normalcy mode 106 may indicate that when the AV 302 is in congested traffic or behind a traffic light 126, vehicles 122 surrounding the AV 302 are expected to slow down or stop. Thus, in scenarios where the sensors 346 detect that particular vehicles 122 surrounding the AV 302 are slowing down and there is no traffic light 126 or traffic detected by the sensors 346, the control device 350 determines that this situation may be a deviation from the normalcy mode 106. The control device 350 may determine that there is no traffic by detecting that other vehicles 122 (for example, vehicles 122 e and 122 f) are not slowing down, where the vehicles 122 that are not slowing down may be in the same lane or a different lane than a lane traveled by the AV 302. The control device 350 may also determine that there is no traffic ahead of the AV 302 from traffic data 156, for example, from live traffic reporting, etc. - In a particular example, a deviation from the first expected
scenario 108 in the normalcy mode 106 may comprise indications indicating that multiple vehicles 122 around the AV 302 are slowing down (e.g., vehicles 122 a-b, vehicles 122 a-c, or vehicles 122 a-d) and attempting to force the AV 302 to slow down, while there is no traffic or traffic light 126 detected by the sensors 346. In other words, the multiple vehicles 122 around the AV 302 are impeding the progress of the AV 302 to “box-in” the AV 302. In this way, the vehicles 122 may force the AV 302 to slow down, pull over, or deviate from its routing plan 152. Thus, in situations where the sensors 346 detect that vehicles 122 surrounding the AV 302 are slowing down while there is no traffic or traffic light 126 detected by the sensors 346, the control device 350 may determine that such situations correspond to an event 104 or a series of events 114 that deviates from the normalcy mode 106. - As another example, a second expected
scenario 108 in the normalcy mode 106 may indicate that the distance of a vehicle 122 from the AV 302 is expected to be above a threshold distance 128, and that if the distance of that vehicle 122 from the AV 302 becomes less than the threshold distance 128, that vehicle 122 is expected not to remain at a distance less than the threshold distance 128 from the AV 302 for more than a threshold period of time 118. In a particular example, a deviation from the second expected scenario 108 in the normalcy mode 106 may comprise indications indicating that one or more vehicles 122 are remaining at a distance less than the threshold distance 128 from the AV 302 for more than the threshold period of time 118 (e.g., thirty seconds, one minute, two minutes, or any other appropriate duration). Thus, in situations where the sensors 346 detect that one or more vehicles 122 stay at a distance less than the threshold distance 128 from the AV 302 for more than the threshold period 118, the control device 350 may determine that such situations correspond to an event 104 or a series of events 114 that deviates from the normalcy mode 106. - As another example, a third
expected scenario 108 in the normalcy mode 106 may indicate that the AV 302 is expected to drive within a particular speed range provided by the driving instructions 154 according to a speed limit of the road 120 traveled by the AV 302. Thus, in situations where a vehicle drive subsystem 342 (see FIG. 3 ) monitoring an engine 342 a of the AV 302 (see FIG. 3 ) detects that the speed of the engine 342 a is going out of the particular speed range while components of the vehicle control subsystems 348 (see FIG. 3 ) are operating according to instructions provided in the driving instructions 154, the control device 350 may determine that this situation is a deviation from the normalcy mode 106. In a particular example, this situation may occur if a vehicle 122 d drags the AV 302 back by a cable 182 or otherwise impedes the progress of the AV 302. Thus, the control device 350 determines that such situations may correspond to an event 104 or a series of events 114 that deviates from the normalcy mode 106. - As another example, a fourth expected
scenario 108 in the normalcy mode 106 may indicate that, in response to the AV 302 being involved in an accident or a collision with a vehicle 122, an individual from the vehicle 122 is expected to approach the AV 302 within the threshold period of time 118. Thus, if the sensors 346 detect that the vehicle 122 involved in the accident is fleeing the scene of the accident, the control device 350 may determine that this situation is a deviation from the normalcy mode 106, i.e., it may be a case of “hit and run.” - As another example, a fifth expected
scenario 108 in the normalcy mode 106 may indicate that, in response to detecting that one or more vehicles 122 are exhibiting unexpected or anomalous driving behaviors, such unexpected or anomalous driving behaviors are expected not to persist for more than the threshold period of time 118. Some examples of unexpected or anomalous driving behaviors of the one or more vehicles 122 may comprise invading the space within the threshold distance 128 from the AV 302, making contact or colliding with the AV 302, and swerving in front of the AV 302, among others. Some examples of one or more vehicles 122 tampering with the AV 302 may comprise forcing the AV 302 to pull over, deviate from its routing plan 152, slow down, speed up, drive over an object 124, crash, or collide with another vehicle 122. Thus, if the sensors 346 detect that any one or any combination of the driving scenarios described above persists for more than the threshold period of time 118, the control device 350 may determine that this situation is a deviation from the normalcy mode 106. - As described above, the series of
events 114 generally includes events 104 that in the aggregate indicate that a third party, such as at least one vehicle 122 or an individual, is tampering with the AV 302. - In one embodiment, determining that a series of
events 114 in the aggregate indicates a deviation from the normalcy mode 106 may comprise detecting that each event 104 from the series of events 114 deviates from the normalcy mode 106. - In one embodiment, determining that a series of
events 114 in the aggregate indicates a deviation from the normalcy mode 106 may comprise detecting that at least a threshold number 116 of events 104 (or at least a subset of the series of events 114 above the threshold number 116) within the threshold period of time 118 deviate from the normalcy mode 106. - In one embodiment, determining that a series of
events 114 in the aggregate indicates a deviation from the normalcy mode 106 may comprise grouping or taking a collection of events 104 together to form the series of events 114 that, as a whole, is compared with the expected scenarios 108 to determine whether the series of events 114 deviates from the normalcy mode 106. - Various examples of such series of
events 114 are illustrated in FIG. 1 and described in detail below. Some examples of the series of events 114 may be related to unexpected or abnormal behaviors of moving objects, such as vehicles 122, individuals, and/or pedestrians detected within the detection zones of the sensors 346. Some examples of the series of events 114 may be related to unexpected or abnormal behaviors of moving objects, such as vehicles 122, individuals, and/or pedestrians, where the moving objects are not within the detection zones of the sensors 346. Some examples of the series of events 114 may be related to unexpected or abnormal behaviors of static or stationary objects, such as traffic lights 126. The series of events 114 may not be zone- or region-specific, which means that the series of events 114 may occur in any region. - As illustrated in
FIG. 1 , the AV 302 is traveling on the road 120 according to its predetermined routing plan 152 when one or more examples of the series of events 114 occur. - As an example, a first series of
events 114 may indicate that the AV 302 is forced to deviate from its predetermined routing plan 152 by one or more vehicles 122 such that the AV 302 is forced to re-route or pull over. For instance, assume that while the AV 302 is traveling along the road 120, vehicles 122 a and 122 b surrounding the AV 302 drive at a distance less than the threshold distance 128 from the AV 302 for more than the threshold period of time 118. In other words, vehicles 122 a and 122 b invade the space within the threshold distance 128 from the AV 302 for more than the threshold period of time 118. The sensors 346 detect these invasions of the space within the threshold distance 128 and communicate sensor data 178 indicating these invasions to the control device 350. In one example, vehicles 122 a and 122 b force the AV 302 to re-route from its routing plan 152 and take the exit 130. In another example, vehicles 122 a and 122 b force the AV 302 to pull over to a side of the road 120 (as noted in FIG. 1 as pull over 180). Although FIG. 1 illustrates vehicles 122 a and 122 b alongside the AV 302, it is understood that any number of vehicles 122 on one or more sides of the AV 302 may contribute to forcing the AV 302 to deviate from its routing plan 152 or pull over. For example, vehicle 122 c may also contribute to this malicious event by impeding the AV 302 from speeding up or otherwise evading vehicles 122 a and 122 b to avoid being re-routed or pulled over. In this way, one or more of vehicles 122 a-c may “box in” the AV 302 and force it to deviate from its routing plan 152. - As another example, a second series of
events 114 may indicate that the AV 302 is forced to slow down by one or more vehicles 122 while other vehicles 122 around the AV 302 are not slowing down. For instance, assume that while the AV 302 is traveling along the road 120, a first vehicle 122 a (on the left side), a second vehicle 122 b (on the right side), and a third vehicle 122 c (in front) unexpectedly slow down even though no traffic (i.e., other vehicles 122 that are not slowing down) or traffic lights 126 are detected by the sensors 346. As such, the AV 302 is forced to slow down. Although FIG. 1 illustrates vehicles 122 a-c surrounding the AV 302, it is understood that any number of vehicles 122 on one or more sides of the AV 302 may contribute to forcing the AV 302 to slow down. For example, another vehicle 122 on the rear side of the AV 302 may also match (or come close to) the speed of the vehicles 122 a-c to box in the AV 302, thus forcing the AV 302 to slow down. - As another example, a third series of
events 114 may indicate that the AV 302 is forced to slow down, as detected by the control device 350 monitoring an engine 342 a of the AV 302 (see FIG. 3 ). For instance, assume that while the AV 302 is traveling along the road 120, a fourth vehicle 122 d drags the AV 302 back with a cable 182 attached to the AV 302, thus forcing the AV 302 to slow down or otherwise impeding its movements. The control device 350 may store this event as a first event 104 a that is initiated at a first timestamp 136. Also, assume that an individual from the fourth vehicle 122 d or an accomplice vehicle 122 attached the cable 182 to the AV 302 in a manner that did not trigger an event 104 that deviates from the normalcy mode 106. - In this particular example, the
fourth vehicle 122 d that is tampering with the AV 302 is not within the detection zone of the sensors 346. Thus, the sensors 346 may not detect the presence of the fourth vehicle 122 d. However, the control device 350 that is monitoring the speed of the engine 342 a (see FIG. 3 ) detects that the speed of the engine 342 a is not within a particular speed range that is provided in the driving instructions 154, as expected. The particular speed range is determined according to the speed limit of the road 120 and other criteria, such as saving fuel and providing a safe driving experience for the AV 302, other vehicles 122, and pedestrians, among other criteria. The control device 350 may also detect that the engine 342 a (see FIG. 3 ) and other components contributing to the speed of the AV 302 indicate that they are operating normally. For example, the control device 350 may detect that the performance indicators of the engine 342 a (see FIG. 3 ) and the other components indicate that their performance is within a normal range, e.g., 80%, and that they are not damaged. As another example, the control device 350 may detect that the engine 342 a (see FIG. 3 ) and the other components are not overheated (for example, their temperature is within a normal range, e.g., 35-40 degrees), do not lack fuel (for example, the fuel level is above a threshold level, e.g., 70%), do not lack electrical power (for example, a battery level indicator of a battery producing electrical power indicates a level above a threshold level, e.g., 80%), and are not subject to any other conditions that may cause the AV 302 to slow down. The control device 350 may store this set of determinations (indicating that the engine 342 a is in normal operation) as a second event 104 b at a second timestamp 136. If these events 104 a-b persist for more than the threshold period of time 118, the control device 350 determines that these events amount to the third series of events 114.
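The engine-health check just described can be sketched as follows, using the example thresholds given in the text (performance within a normal range of 80%, temperature within 35-40 degrees, fuel above 70%, battery above 80%). The function name and the dictionary field names are illustrative assumptions.

```python
def unexplained_slowdown(engine_rpm, expected_rpm_range, health):
    """Return True when the engine speed falls outside the commanded range
    while every health indicator reads normal, suggesting an external force
    (e.g., a cable dragging the AV) rather than a mechanical fault."""
    lo, hi = expected_rpm_range
    out_of_range = not (lo <= engine_rpm <= hi)
    components_ok = (
        health["performance_pct"] >= 80       # performance within normal range
        and 35 <= health["temp_c"] <= 40      # not overheated
        and health["fuel_pct"] >= 70          # does not lack fuel
        and health["battery_pct"] >= 80       # does not lack electrical power
    )
    return out_of_range and components_ok
```

If either check fails (speed is in range, or a component is genuinely unhealthy), the slowdown has an ordinary explanation and is not flagged.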
- Thus, for this particular example, the control device 350 detects that the third series of events 114 has occurred even though no suspect vehicle 122 potentially causing the AV 302 to slow down is detected by the sensors 346. Although FIG. 1 illustrates the fourth vehicle 122 d dragging the AV 302 back and forcing the AV 302 to slow down, it is understood that the fourth vehicle 122 d may instead be in front of the AV 302 and pull the AV 302 forward, forcing the AV 302 to speed up, for example, to miss its predetermined exit 130 or to deviate from its routing plan 152. - As another example, a fourth series of
events 114 may indicate one or more impacts with the AV 302 within the threshold period of time 118 by one or more vehicles 122 tampering with the AV 302. For instance, assume that while the AV 302 is traveling along the road 120, the first vehicle 122 a hits or collides with the AV 302 at a first timestamp 136. The sensors 346 detect the first collision and communicate this event (i.e., first event 104 a) to the control device 350. Also, assume that the first vehicle 122 a (or the second vehicle 122 b) hits or collides with the AV 302 at a second timestamp 136. Similarly, the sensors 346 detect the second collision and communicate this event (i.e., second event 104 b) to the control device 350. If the control device 350 determines that the first event 104 a and the second event 104 b have occurred within the threshold period of time 118, the control device 350 determines that the events 104 a-b taken in the aggregate amount to the fourth series of events 114 that deviates from the normalcy mode 106. - In another instance, assume that while the
AV 302 is traveling along the road 120, an individual from the first vehicle 122 a hits the AV 302 at the first timestamp 136, for example, with an object such as a rock or a crowbar. Also, assume that an individual from the first vehicle 122 a (or the second vehicle 122 b) hits the AV 302 at the second timestamp 136, for example, with an object such as a rock or a crowbar. Similar to the instance described above, if the control device 350 determines that these hits or impacts with the AV 302 are within the threshold period of time 118, the control device 350 determines that these events taken in the aggregate amount to the fourth series of events 114 that deviates from the normalcy mode 106. - As another example, a fifth series of
events 114 may indicate unexpected driving behaviors from one or more vehicles 122. For instance, assume that while the AV 302 is traveling along the road 120, the first vehicle 122 a unexpectedly invades the space within the threshold distance 128 from the AV 302 and swerves in front of the AV 302 at a first timestamp 136. The sensors 346 detect this invasion of the space within the threshold distance 128 and communicate sensor data 178 indicating this invasion to the control device 350. The control device 350 may store this event as a first event 104 a. Also, assume that the first vehicle 122 a slows down at a second timestamp 136, thus forcing the AV 302 to slow down. Similarly, the sensors 346 detect that the first vehicle 122 a is slowing down and communicate corresponding sensor data 178 indicating that to the control device 350. The control device 350 may store this event as a second event 104 b. If the control device 350 determines that events 104 a and 104 b occurred within the threshold period of time 118, the control device 350 determines that the events 104 a-b taken in the aggregate amount to the fifth series of events 114 that deviates from the normalcy mode 106. - In another instance, assume that while the
AV 302 is traveling along the road 120, the first vehicle 122 a unexpectedly swerves in front of the AV 302 at a first timestamp 136. Also, assume that the second vehicle 122 b unexpectedly swerves in front of the AV 302 at a second timestamp 136. Similar to the instance described above, if the control device 350 determines that the events 104 a and 104 b occurred within the threshold period of time 118, the control device 350 determines that these events 104 a-b taken in the aggregate amount to the fifth series of events 114 that deviates from the normalcy mode 106. - As another example, a sixth series of
events 114 may indicate that at least one sensor 346 from the sensors 346 is non-responsive or disabled. For instance, assume that while the AV 302 is traveling along the road 120, a sensor 346 from the sensors 346 becomes non-responsive as a result of an impact. The sensor 346 may become non-responsive, for example, when the first vehicle 122 a or an individual from the first vehicle 122 a hits the sensor 346 in an attempt to disable or damage the sensor 346. The control device 350 analyzes sensor data 178 captured by the sensor 346 (before it became non-responsive) and determines that the sensor 346 was disabled as a result of an impact from an object, such as a rock, a crowbar, etc., or from the first vehicle 122 a. - In another instance, assume that a
first sensor 346 becomes non-responsive at a first timestamp 136 (stored as the first event 104 a), and a second sensor 346 becomes non-responsive at a second timestamp 136 (stored as the second event 104 b). If the control device 350 determines that the events 104 a and 104 b occurred within the threshold period of time 118, the control device 350 determines that the events 104 a-b taken in the aggregate amount to the sixth series of events 114 that deviates from the normalcy mode 106. - In another instance, assume that a
sensor 346 becomes non-responsive as a result of tampering. In one example, a sensor 346 may become non-responsive as a result of a cybersecurity breach in data communication between the sensor 346 and the control device 350. For example, the sensor 346 may become non-responsive at a first timestamp 136 as a result of a cybersecurity breach. The control device 350 may detect the cybersecurity breach, for example, by detecting a third-party attempt to establish unauthorized access to the sensor 346 or the control device 350. - In another instance, a
sensor 346 may become non-responsive as a result of propagated jamming signals, radio waves, light beams, and the like. For example, jamming signals may be used to tamper with infrared sensors 346, jamming radio waves may be used to tamper with Radar sensors 346 b (see FIG. 3 ), and jamming light (or laser) beams may be used to tamper with LiDAR sensors 346 f (see FIG. 3 ). - The
control device 350 may detect such events 104 initiated at their corresponding timestamps 136, and if they persist for more than the threshold period of time 118, the control device 350 determines that such events 104 amount to a series of events 114 that deviates from the normalcy mode 106. - As another example, a seventh series of
events 114 may indicate that the AV 302 is forced to drive over an object 124 as a result of unexpected driving behaviors of one or more vehicles 122. For instance, assume that while the AV 302 is traveling along the road 120, the first vehicle 122a unexpectedly swerves in front of the AV 302 at a first timestamp 136, forcing the AV 302 to deviate from its traveling path (stored as the first event 104a), and as a result, the AV 302 drives over the object 124 at a second timestamp 136 (stored as the second event 104b). If the control device 350 determines that the events 104a and 104b occurred within the threshold period of time 118, the control device 350 determines that the events 104a and 104b taken in the aggregate amount to a series of events 114 that deviates from the normalcy mode 106. In another instance, following driving over the object 124, assume that a tire of the AV 302 is blown at a third timestamp 136. The control device 350 stores this event as a third event 104c. Thus, the control device 350 determines that events 104a-c taken in the aggregate amount to a series of events 114 that deviates from the normalcy mode 106. - As another example, an eighth series of
events 114 may indicate that a scheduled action indicated in the map data 150 unexpectedly does not occur. Map data 150 is described in detail further below. In brief, the map data 150 comprises detailed information about the environment on and around the traveling path of the AV 302, including objects on and around the road 120, such as location coordinates of road signs, buildings, terrain, traffic lights 126, and railroad crossing lights, among others. The map data 150 further comprises scheduling information of the traffic lights 126, scheduling information of the railroad crossing lights, and any other scheduling information that the AV 302 may encounter during its routing plan 152. For example, the map data 150 comprises timestamps 136 when the traffic light 126 indicates yellow, green, and red lights. In another example, the map data 150 comprises timestamps 136 when a railroad crossing light indicates red and green lights. - Continuing the example of the eighth series of
events 114, assume that while the AV 302 is traveling along the road 120, the AV 302 reaches the traffic light 126 and stops behind the traffic light 126, which is indicating a red light. Also, assume that the map data 150 indicates that the wait time for the traffic light 126 to change from a red light to a green light is a particular duration, for example, one minute. Also, assume the sensors 346 are detecting the red light from the traffic light 126 for more than the particular duration indicated in the map data 150. The control device 350 compares the scheduling information associated with the traffic light 126 provided by the map data 150 with the sensor data 178 captured by the sensors 346. In this particular instance, the control device 350, based on the comparison between the map data 150 and the sensor data 178, determines that a scheduled action (i.e., the traffic light 126 indicating a green light after one minute) has not occurred. The control device 350 may store this event as the first event 104a initiated at a first timestamp 136. Also, assume that following the delay in changing an indication light by the traffic light 126, a vehicle 122 invades the space within the threshold distance 128 from the AV 302 at a second timestamp 136. The sensors 346 detect this invasion of the threshold distance 128 and communicate sensor data 178 indicating this invasion to the control device 350. The control device 350 may store this event as the second event 104b. If the control device 350 determines that the events 104a and 104b occurred within the threshold period of time 118, the control device 350 determines that the events 104a and 104b taken in the aggregate amount to a series of events 114 that deviates from the normalcy mode 106. - As another example, a ninth series of
events 114 may indicate that a field-of-view of at least one sensor 346 is obfuscated. For instance, assume that while the AV 302 is traveling along the road 120, an object is used to obfuscate a detection zone or a field-of-view of the sensor 346 at a first timestamp 136. In a particular example, sensor data 178 received from the sensor 346, compared with sensor data 178 received prior to the first timestamp 136, indicates that a blanket has been thrown over the sensor 346. The control device 350 determines that the sensor 346 is functional because the sensor 346 is responsive to communication with the control device 350. In other words, the control device 350 can receive sensor data 178 from the sensor 346. However, the sensor data 178 is not as expected compared to the sensor data 178 received prior to the first timestamp 136. If the control device 350 determines that these events 104 (beginning from the first timestamp 136) persist for more than the threshold period of time 118, the control device 350 determines that these events 104 amount to the ninth series of events 114 that deviates from the normalcy mode 106. - In response to detecting any of the example series of
events 114 described above, the control device 350 escalates the series of events 114 to be addressed. For example, the control device 350 communicates the series of events 114 to the operation server 140 to be addressed by the remote operator 164. This process is described in detail in conjunction with the operational flow of the system 100 further below. It should be understood that the series of events 114 described above are mere examples and are not an exhaustive list of events 104 or series of events 114 that may be identified as deviating from the normalcy mode 106. This disclosure contemplates any suitable number and combination of events 104 that may deviate from a normalcy mode 106 and that may be identified and escalated even if not specifically described as an example herein. - Aspects of an embodiment of the
operation server 140 are described above, and additional aspects are provided below. The operation server 140 includes at least one processor 142, at least one memory 144, at least one network interface 146, and at least one user interface 148. The operation server 140 may be configured as shown or in any other suitable configuration. - In one embodiment, the
operation server 140 may be implemented by a cluster of computing devices that may serve to oversee the operations of the AV 302. For example, the operation server 140 may be implemented by a plurality of computing devices using distributed computing and/or cloud computing systems. In another example, the operation server 140 may be implemented by a plurality of computing devices in one or more data centers. As such, in one embodiment, the operation server 140 may include more processing power than the control device 350. The operation server 140 is in signal communication with one or more AVs 302 and their components (e.g., the control device 350). In one embodiment, the operation server 140 is configured to determine a particular routing plan 152 for the AV 302. For example, the operation server 140 may determine a particular routing plan 152 for an AV 302 that leads to reduced driving time and a safer driving experience for reaching the destination of that AV 302. -
Processor 142 comprises one or more processors operably coupled to the memory 144. The processor 142 is any electronic circuitry including, but not limited to, state machines, one or more central processing unit (CPU) chips, logic units, cores (e.g., a multi-core processor), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), or digital signal processors (DSPs). The processor 142 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination of the preceding. The processor 142 is communicatively coupled to and in signal communication with the memory 144, network interface 146, and user interface 148. The one or more processors are configured to process data and may be implemented in hardware or software. For example, the processor 142 may be 8-bit, 16-bit, 32-bit, 64-bit, or of any other suitable architecture. The processor 142 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, registers, and other components. The one or more processors are configured to implement various instructions. For example, the one or more processors are configured to execute software instructions 168 to implement the functions disclosed herein, such as some or all of those described with respect to FIGS. 1 and 2. In some embodiments, the functions described herein are implemented using logic units, FPGAs, ASICs, DSPs, or any other suitable hardware or electronic circuitry. -
Memory 144 stores any of the information described above with respect to FIGS. 1 and 2 along with any other data, instructions, logic, rules, or code operable to implement the function(s) described herein when executed by processor 142. For example, the memory 144 may store the normalcy mode 106, normalcy mode building engine 110, malicious event 112, series of events 114, threshold number 116, threshold period 118, threshold distance 128, confidence score 132, threshold score 134, timestamps 136, location coordinates 138, sensor data 178, map data 150, routing plan 152, driving instructions 154, traffic data 156, object detection machine learning modules 158, countermeasures 166, software instructions 168, and/or any other data/instructions. The software instructions 168 include code that when executed by the processor 142 causes the operation server 140 to perform the functions described herein, such as some or all of those described in FIGS. 1 and 2. The memory 144 comprises one or more disks, tape drives, or solid-state drives, and may be used as an overflow data storage device, to store programs when such programs are selected for execution, and to store instructions and data that are read during program execution. The memory 144 may be volatile or non-volatile and may comprise read-only memory (ROM), random-access memory (RAM), ternary content-addressable memory (TCAM), dynamic random-access memory (DRAM), and static random-access memory (SRAM). The memory 144 may include one or more of a local database, cloud database, network-attached storage (NAS), etc. -
Network interface 146 is configured to enable wired and/or wireless communications. The network interface 146 is configured to communicate data between the operation server 140 and other network devices, systems, or domain(s). For example, the network interface 146 may comprise a WIFI interface, a local area network (LAN) interface, a wide area network (WAN) interface, a modem, a switch, or a router. The processor 142 is configured to send and receive data using the network interface 146. The network interface 146 may be configured to use any suitable type of communication protocol. -
User interfaces 148 may include one or more user interfaces that are configured to interact with users, such as the remote operator 164. For example, the user interfaces 148 may include peripherals of the operation server 140, such as monitors, keyboards, mice, trackpads, touchpads, etc. The remote operator 164 may use the user interfaces 148 to access the memory 144 to review sensor data 178, review the series of events 114, and address the detected malicious event 112. - Normalcy
mode building engine 110 may be implemented by the processor 142 executing the software instructions 168, and is generally configured to build the normalcy mode 106. In one embodiment, the normalcy mode building engine 110 may use simulated or offline driving situations to determine expected scenarios 108 (similar to those described above) and build the normalcy mode 106. In other words, the normalcy mode building engine 110 generates the normalcy mode 106 that corresponds to a pattern-of-life for the AV 302 in the context of driving. - In one embodiment, the normalcy
mode building engine 110 may be implemented by machine learning neural networks, including a plurality of convolutional neural networks, and the like. In one embodiment, the normalcy mode building engine 110 may be implemented by supervised pattern learning techniques and/or unsupervised pattern learning techniques, such as Bayesian non-parametric modeling, decision trees, etc. - In one embodiment, the expected
scenarios 108 in the normalcy mode 106 may be determined by offline driving simulations in various road environments. In one example, a first environment where the AV 302 is in traffic may be simulated to determine scenarios expected from the environment around the AV 302, including its surrounding vehicles 122, in this situation. In this example, expected scenarios 108 comprise detecting that surrounding vehicles 122 are stopped or slowing down, for example, by determining speed profiles and trajectory profiles, detecting that the rear red lights of the surrounding vehicles 122 are turned on, and any other indication that the AV 302 is in traffic. - In another example, a second environment where the
AV 302 is behind the traffic light 126 may be simulated to determine expected scenarios 108 from the environment around the AV 302, including its surrounding vehicles 122 and the traffic light 126, in this situation. In this example, expected scenarios 108 comprise 1) detecting that the traffic light 126 is indicating a red light, 2) expecting that the traffic light 126 changes its status (i.e., from red light to green) based on its corresponding scheduling information provided in the map data 150, 3) detecting that surrounding vehicles 122 are stopped or slowing down, and any other indication that the AV 302 is behind the traffic light 126. - In another example, a third environment where one or more vehicles 122 are driving around the
AV 302 may be simulated to determine expected scenarios 108 from the environment around the AV 302, including its surrounding vehicles 122, in this situation. In this example, expected scenarios 108 comprise 1) expecting that the one or more vehicles 122 do not invade the threshold distance 128 from the AV 302, 2) expecting that the one or more vehicles 122 do not persist in driving parallel to the AV 302 for more than a threshold period 118, and 3) if the one or more vehicles 122 invade the threshold distance 128 from the AV 302, expecting that the one or more vehicles 122 do not persist in this situation for more than the threshold period 118. The threshold distance 128 may vary depending on which side of the AV 302 it is being measured from. For example, the threshold distance 128 from the sides of the AV 302 may be less than the threshold distance 128 from the front and the rear.
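The side-dependent threshold distance just described can be pictured as a simple per-side lookup. This is a minimal sketch: the concrete distances and the names `THRESHOLD_DISTANCE` and `invades_threshold` are invented for illustration; the disclosure only states that the side thresholds may be smaller than the front and rear ones.

```python
# Hypothetical per-side threshold distances 128, in meters.
THRESHOLD_DISTANCE = {
    "front": 30.0,
    "rear": 30.0,
    "left": 2.0,   # sides smaller than front/rear, per the text
    "right": 2.0,
}

def invades_threshold(side: str, measured_distance: float) -> bool:
    """True when another vehicle is closer to the given side of the AV
    than that side's threshold distance."""
    return measured_distance < THRESHOLD_DISTANCE[side]

print(invades_threshold("left", 1.5))    # True: inside the 2 m side threshold
print(invades_threshold("front", 45.0))  # False: beyond the 30 m front threshold
```

A lookup keyed by side keeps the geometry assumption in one place, so the same check can serve both the parallel-driving and the swerving examples above.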
Map data 150 may include a virtual map of a city which includes the road 120. In some examples, the map data 150 may include the map 458 and map database 436 (see FIG. 4 for descriptions of the map 458 and map database 436). The map data 150 may include drivable areas, such as roads 120, paths, and highways, and undrivable areas, such as terrain (determined by the occupancy grid module 460; see FIG. 4 for descriptions of the occupancy grid module 460). The map data 150 may specify location coordinates of road signs, lanes, lane markings, lane boundaries, road boundaries, traffic lights 126, etc.
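One way to picture the scheduling information carried by the map data 150, and the scheduled-action check from the traffic-light example above, is the following sketch. The dictionary layout, field names, coordinates, and tolerance are assumptions of this illustration, not the disclosure's data model.

```python
# Hypothetical map-data entry for one traffic light, with scheduling info.
map_data = {
    "traffic_light_126": {
        "location": (32.7767, -96.7970),  # illustrative coordinates
        "red_to_green_wait_s": 60.0,      # scheduled wait before green
    }
}

def scheduled_action_missed(light_id, observed_red_duration_s, tolerance_s=5.0):
    """Compare the wait time promised by the map data against how long
    the sensors have actually observed a red light; a small tolerance
    absorbs ordinary timing jitter."""
    scheduled = map_data[light_id]["red_to_green_wait_s"]
    return observed_red_duration_s > scheduled + tolerance_s

# Red observed for 95 s against a 60 s schedule: the scheduled action
# (the light turning green) has not occurred, so an event is recorded.
print(scheduled_action_missed("traffic_light_126", 95.0))  # True
print(scheduled_action_missed("traffic_light_126", 62.0))  # False
```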
Routing plan 152 is a plan for traveling from a start location (e.g., a first AV 302 launchpad/landing pad) to a destination (e.g., a second AV 302 launchpad/landing pad). For example, the routing plan 152 may specify a combination of one or more streets/roads/highways in a specific order from the start location to the destination. The routing plan 152 may specify stages, including the first stage (e.g., moving out from the start location), a plurality of intermediate stages (e.g., traveling along particular lanes of one or more particular streets/roads/highways), and the last stage (e.g., entering the destination). The routing plan 152 may include other information about the route from the start position to the destination, such as road/traffic signs in that routing plan 152, etc. - Driving
instructions 154 may be implemented by the planning module 462 (see descriptions of the planning module 462 in FIG. 4). The driving instructions 154 may include instructions and rules to adapt the autonomous driving of the AV 302 according to the driving rules of each stage of the routing plan 152. For example, the driving instructions 154 may include instructions to stay within the speed range of a road 120 traveled by the AV 302, and to adapt the speed of the AV 302 with respect to observed changes by the sensors 346, such as speeds of surrounding vehicles 122, objects within the detection zones of the sensors 346, etc. - Object detection
machine learning modules 158 may be implemented by the processor 142 executing software instructions 168, and are generally configured to detect objects from the sensor data 178. The object detection machine learning modules 158 may be implemented using neural networks and/or machine learning algorithms for detecting objects from any data type, such as images, videos, infrared images, point clouds, Radar data, etc. - In one embodiment, the object detection
machine learning modules 158 may be implemented using machine learning algorithms, such as Support Vector Machines (SVM), Naive Bayes, Logistic Regression, k-Nearest Neighbors, Decision Trees, or the like. In one embodiment, the object detection machine learning modules 158 may utilize a plurality of neural network layers, convolutional neural network layers, and/or the like, in which the weights and biases of the perceptrons of these layers are optimized in the training process of the object detection machine learning modules 158. The object detection machine learning modules 158 may be trained by a training dataset which includes samples of data types labeled with one or more objects in each sample. For example, the training dataset may include sample images of objects (e.g., vehicles 122, lane markings, pedestrians, road signs, etc.) labeled with the object(s) in each sample image. Similarly, the training dataset may include samples of other data types, such as videos, infrared images, point clouds, Radar data, etc., labeled with the object(s) in each sample. The object detection machine learning modules 158 may be trained, tested, and refined by the training dataset and the sensor data 178. The object detection machine learning modules 158 use the sensor data 178 (which is not labeled with objects) to increase the accuracy of their predictions in detecting objects. For example, supervised and/or unsupervised machine learning algorithms may be used to validate the predictions of the object detection machine learning modules 158 in detecting objects in the sensor data 178.
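The labeled training dataset described above can be pictured as a list of (sample, labels) pairs. This is only a sketch of the data layout: the file names, field names, and label strings are illustrative, not from the disclosure.

```python
# Hypothetical labeled samples: each pairs raw sensor data with the
# objects annotated in it.
training_dataset = [
    {"sample": "image_0001.png", "labels": ["vehicle", "lane_marking"]},
    {"sample": "image_0002.png", "labels": ["pedestrian", "road_sign"]},
    {"sample": "pointcloud_0001.pcd", "labels": ["vehicle"]},
]

def label_frequencies(dataset):
    """Count how often each object class appears across the dataset,
    a routine sanity check before training a detector."""
    counts = {}
    for example in dataset:
        for label in example["labels"]:
            counts[label] = counts.get(label, 0) + 1
    return counts

print(label_frequencies(training_dataset))
# {'vehicle': 2, 'lane_marking': 1, 'pedestrian': 1, 'road_sign': 1}
```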
Traffic data 156 may include traffic data of roads/streets/highways in the map data 150. The operation server 140 may use traffic data 156 that is captured by one or more mapping vehicles. The operation server 140 may also use traffic data 156 that is captured from any other source, such as crowd-sourced traffic data 156 captured from external sources, e.g., Waze and Google Maps, live traffic reporting, etc. -
Countermeasures 166 comprise instructions to be carried out in response to escalating the series of events 114 and determining that the series of events 114 corresponds to a malicious event 112. For example, the countermeasures 166 may comprise instructions to establish a communication path 160 with a communication module at the AV 302 in order to converse with the individuals causing the series of events 114 and tampering with the AV 302. As another example, the countermeasures 166 may comprise instructions to activate a horn of the AV 302. As another example, the countermeasures 166 may comprise instructions to send a notifying message 172 to law enforcement 170, where the notifying message 172 comprises an indication that the AV 302 has been tampered with at the particular location coordinates 138 where the series of events 114 has occurred. In one embodiment, countermeasures 166 may be performed by the remote operator 164, as described further below. In one embodiment, performing the countermeasures 166 may be computerized and performed by the operation server 140. - The
application server 162 is generally any computing device configured to communicate with other devices, such as other servers (e.g., operation server 140), the AV 302, databases, etc., via the network 184. The application server 162 is configured to perform the specific functions described herein and interact with the remote operator 164, e.g., via communication path 174, using its user interfaces. Examples of the application server 162 include, but are not limited to, desktop computers, laptop computers, servers, etc. In one example, the application server 162 may act as a presentation layer from which the remote operator 164 accesses the operation server 140. As such, the operation server 140 may send sensor data 178, the series of events 114, countermeasures 166, and/or any other data/instructions to the application server 162, e.g., via the network 184. The remote operator 164, after establishing the communication path 174 with the application server 162, may review the received data and carry out the countermeasures 166 in addressing the series of events 114. In another embodiment, the remote operator 164 can directly access the operation server 140 and, after establishing the communication path 176 with the operation server 140, may carry out the countermeasures 166 in addressing the series of events 114. The remote operator 164 may be an individual who is associated with and has access to the operation server 140. For example, the remote operator 164 may be an administrator who can access and view the information regarding the AV 302, such as sensor data 178 and other information that is available in the memory 144. -
Network 184 may be any suitable type of wireless and/or wired network, including, but not limited to, all or a portion of the Internet, an Intranet, a private network, a public network, a peer-to-peer network, the public switched telephone network, a cellular network, a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), and a satellite network. The network 184 may be configured to support any suitable type of communication protocol as would be appreciated by one of ordinary skill in the art. - The operational flow of the
system 100 begins when the control device 350 detects a series of events 114, such as those described above or any other example of a series of events 114 that deviates from the normalcy mode 106. For example, the control device 350 detects the series of events 114 by analyzing the sensor data 178. Upon detection of the series of events 114, the control device 350 determines whether the series of events 114 corresponds to a malicious event 112. In one embodiment, in this process, the control device 350 may compare the series of events 114 as a whole with the expected scenarios 108 stored in the normalcy mode 106 to determine whether the series of events 114, taken as a whole, deviates from the normalcy mode 106. In one embodiment, the control device 350 may compare events 104 from the series of events 114 with the expected scenarios 108 to determine whether at least a threshold number 116 of events 104 (from the series of events 114) within the threshold period of time 118 deviate from the normalcy mode 106. - In one embodiment, the
control device 350 may compare each event 104 detected within the threshold period of time 118 individually with each of the expected scenarios 108 to determine whether each event 104 deviates from the normalcy mode 106. In this embodiment, one or more correspondences between the individual events 104 (from the series of events 114) and the expected scenarios 108 may be found that result in determining that the series of events 114 deviates from the normalcy mode 106, even though a correspondence may not be found between the series of events 114, taken as a whole (comprising the same events 104), within the threshold period of time 118 and the overall expected scenarios 108 such that the series of events 114 is considered a malicious event 112. - The
control device 350 determines whether the series of events 114 corresponds to any of the expected scenarios 108. If a corresponding expected scenario 108 is found, the control device 350 determines that the series of events 114 does not correspond to a malicious event 112. If, however, the control device 350 determines that the series of events 114 does not correspond to any of the expected scenarios 108, it determines that the series of events 114 corresponds to the malicious event 112. - In one embodiment, at least one
surveillance sensor 346i may be used to record the series of events 114 (in addition to or instead of other sensors 346). The surveillance sensor 346i may be hidden from sight. The surveillance sensor 346i may be any of the example sensors 346 described in FIG. 3 or any other object detection sensor 346. The surveillance sensor 346i may be positioned on the outer body and/or inside the AV 302 at any suitable position. For example, surveillance sensors 346i may be positioned in the cab of the AV 302 behind the front and/or side windows. In another example, a surveillance sensor 346i may be positioned underneath the AV 302. In one embodiment, the surveillance sensor 346i may be activated in response to detecting the series of events 114. For example, upon detection of the series of events 114, the control device 350 activates the surveillance sensors 346i to record the series of events 114. - In one embodiment, upon detecting the series of
events 114, the control device 350 assigns a confidence score 132 to the series of events 114, where the confidence score 132 indicates a probability that the series of events 114 corresponds to the malicious event 112. For instance, if every event 104 from the series of events 114 corresponds to a deviation from the normalcy mode 106, the control device 350 assigns a high confidence score 132 (e.g., 75%) to the detected series of events 114. - As another example, if the series of
events 114 comprises a number of events 104 above the threshold number 116 detected within the threshold period of time 118, the control device 350 assigns a high confidence score 132 (e.g., 90%) to the detected series of events 114. For example, if the control device 350 detects that the first vehicle 122a swerved in front of the AV 302 at a first timestamp 136 (stored as a first event 104a), followed by detecting that the first vehicle 122a is slowing down at a second timestamp 136 (stored as a second event 104b), detecting that other surrounding vehicles 122 are not slowing down and no traffic light 126 is detected by the sensors 346 (stored as a third event 104c), and this situation persists for more than the threshold period of time 118, the control device 350 assigns a high confidence score 132 (e.g., 90%) to these events 104a-c. - As another example, even if only one
event 104 that corresponds to a deviation from the normalcy mode 106 is detected and persists over the threshold period of time 118, the control device 350 may assign a high confidence score 132 to the event 104. For example, if the control device 350 detects that a field-of-view of a sensor 346 is obfuscated according to sensor data 178 received from the sensor 346 prior to the detection of the obfuscation event 104, and this situation persists for more than the threshold period of time 118, the control device 350 assigns a high confidence score 132 (e.g., 70%) to this event 104. - In contrast, for instance, if the
control device 350 detects a first event 104a that deviates from the normalcy mode 106 at a first timestamp 136, and a second event 104b that deviates from the normalcy mode 106 at a second timestamp 136, and determines that the first timestamp 136 and the second timestamp 136 are not both within the threshold period of time 118, the control device 350 assigns a low confidence score 132 (e.g., 30%) to this series of events 114 comprising events 104a and 104b. In another example, assume that the control device 350 detects a first unexpected driving behavior from the first vehicle 122a at a first timestamp 136, such as the first vehicle 122a unexpectedly swerving in front of the AV 302 (stored as a first event 104a), and detects a second unexpected driving behavior from the second vehicle 122b at a second timestamp 136, such as the second vehicle 122b unexpectedly swerving in front of the AV 302 (stored as a second event 104b). Also, assume that each of the first event 104a and the second event 104b indicates a deviation from the normalcy mode 106, and that the first timestamp 136 and the second timestamp 136 are not within the threshold period of time 118. In such situations, the control device 350 assigns a low confidence score 132 to this series of events 114 that comprises events 104a and 104b. - In response to detecting that the series of
events 114 corresponds to a malicious event 112, the control device 350 escalates the series of events 114 to be addressed by communicating the series of events 114 to the operation server 140. In one embodiment, the operation server 140 may confirm (or update) the determination of the control device 350 regarding whether the series of events 114 corresponds to a malicious event 112. In one embodiment, the remote operator 164 may confirm (or update) the determination of the operation server 140 (and the control device 350) regarding whether the series of events 114 corresponds to a malicious event 112. This confirmation (or update) is used by the normalcy mode building engine 110 to further refine the normalcy mode 106. - For example, if it is determined that the series of
events 114 does not correspond to a malicious event 112, the normalcy mode 106 is updated to include the series of events 114, indicating that the series of events 114 does not correspond to a malicious event 112. - In one embodiment, the
normalcy mode 106 may be updated by the remote operator 164 reviewing the series of events 114. As such, a supervised machine learning technique may be leveraged in refining and updating the normalcy mode 106. For example, the normalcy mode building engine 110 may learn from the confirmations and updates by the remote operator 164 and refine or update the normalcy mode 106. The normalcy mode building engine 110 may adapt to the updated normalcy mode 106 using an unsupervised machine learning technique, for example, by adjusting the weight and bias values of the neural network layers of the normalcy mode building engine 110. - The operation server 140 (or the remote operator 164) may take
particular countermeasures 166 to address (or perhaps resolve) the series of events 114 and the tampering with the AV 302. The description below provides non-limiting examples of countermeasures 166 for addressing (or perhaps resolving) the series of events 114. - In one embodiment, the
remote operator 164 establishes a communication path 160 between the operation server 140 and the AV 302. In one embodiment, the communication path 160 may follow a one-way communication protocol, where data can be transmitted from the operation server 140 to the AV 302. For example, the communication path 160 may be configured to support voice-based, message-based, visual-based, and/or any other appropriate types of communication. The communication path 160 may be established between the operation server 140 and a communication module that is associated with the AV 302. The communication module may be installed at any appropriate location inside and/or on the outer body of the AV 302. For example, the communication module may be installed inside the cab of the AV 302, behind the front windows. The communication module may include one or more user interfaces including, but not limited to, a speaker, a monitor screen, and a microphone. The communication module may be operably coupled with a camera in a surveillance room where the remote operator 164 is located. As such, the remote operator 164 may configure the communication path 160 to show themselves on the monitor screen at the AV 302, such that the remote operator 164 is visible on the monitor screen to the individuals causing the series of events 114. For example, the remote operator 164 can converse with the individuals causing the series of events 114 to discourage them from tampering with the AV 302. In another embodiment, the communication path 160 may follow a two-way communication protocol, where data can be transmitted and received from both sides. - In one embodiment, a
countermeasure 166 to address (or perhaps resolve) the malicious event 112 may comprise activating a horn of the AV 302. For example, the remote operator 164 may remotely activate the horn of the AV 302. In one embodiment, a countermeasure 166 to address (or perhaps resolve) the malicious event 112 may comprise notifying law enforcement 170. For example, the remote operator 164 may send a notifying message 172 indicating that the AV 302 is being tampered with at particular location coordinates 138. In one embodiment, the countermeasures 166 described above may be computerized and carried out by the operation server 140. -
FIG. 2 illustrates an example flowchart of a method 200 for detecting malicious events 112 for an AV 302. Modifications, additions, or omissions may be made to method 200. Method 200 may include more, fewer, or other steps. For example, steps may be performed in parallel or in any suitable order. While at times discussed as the AV 302, operation server 140, control device 350, or components of any thereof performing steps, any suitable system or components of the system may perform one or more steps of the method 200. For example, one or more steps of method 200 may be implemented, at least in part, in the form of software instructions (e.g., the software instructions of FIGS. 1 and 3), stored on non-transitory, tangible, machine-readable media (e.g., memory 144, data storage device 390, and memory 502, respectively from FIGS. 1, 3, and 5) that when run by one or more processors (e.g., the processors of FIGS. 1, 3, and 5) may cause the one or more processors to perform steps 202-208. -
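The flow of steps 202-208 can be summarized in a short sketch. The Python below is illustrative only — the function name run_method_200, the event dictionaries, and the crude confidence formula are assumptions made for exposition and do not appear in the disclosure:

```python
def run_method_200(events, window_s, threshold_number,
                   expected_scenarios, threshold_score, escalate):
    """Illustrative walk through steps 202-206 of method 200."""
    # Step 202: keep only events inside the threshold period of time 118.
    series = [e for e in events if e["t"] <= window_s]
    if len(series) <= threshold_number:
        return "no series detected"
    # Step 204: compare the series with the expected scenarios of the normalcy mode.
    matching = sum(1 for e in series if e["kind"] in expected_scenarios)
    confidence = 1.0 - matching / len(series)  # fewer matches -> higher suspicion
    if confidence <= threshold_score:
        return "terminated"  # series corresponds to expected scenarios
    # Step 206: escalate the series to the operation server / remote operator.
    escalate(series)
    return "escalated"
```

A series of unexpected events inside the window escalates; a series that matches the expected scenarios terminates the method.
-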
Method 200 begins at step 202 where the control device 350 detects a series of events 114 within a threshold period of time 118. In one embodiment, the control device 350 may detect a series of events 114 within a threshold period of time 118, where the series of events 114 comprises events 104 above a threshold number 116. In some embodiments, the threshold number 116 may be one, and in other embodiments, the threshold number 116 may be more than one depending on the circumstances. In this process, the control device 350 may detect the series of events 114 by analyzing the sensor data 178 captured by the sensors 346. The control device 350 may detect any of the example series of events 114 described in FIG. 1. The series of events 114 may correspond to a deviation from the normalcy mode 106. For example, the series of events 114 may comprise a first event 104 a and a second event 104 b that, taken in the aggregate, amount to a series of events 114 that deviates from the normalcy mode 106. - In some examples, the series of
events 114 may comprise one or more events 104 that are not detected by the sensors 346, i.e., they are not within the detection zones of the sensors 346. For instance, as described in FIG. 1, vehicle 122 d that drags the AV 302 back by a cable 182 may not be within the detection zone of the sensors 346. As such, the sensors 346 may not detect the presence of the vehicle 122 d. However, the control device 350 may detect that the AV 302 is slowing down by monitoring the speed and performance of the engine 342 a of the AV 302 (see FIG. 4). - In some examples, the series of
events 114 may comprise one or more events 104 that are detected on lane(s) other than the lane traveled by the AV 302. For instance, as described in FIG. 1, vehicles 122 within the threshold distance 128 from the AV 302 may be in side-lanes with respect to the AV 302. - The threshold period of
time 118 may be determined to be thirty seconds, one minute, two minutes, or any other appropriate duration of time. The threshold period of time 118 may vary depending on an encountered series of events 114 (and/or a number of events 104 in the series of events 114). For example, the threshold period of time 118 may increase as the number of events 104 in the series of events 114 increases. For example, if the control device 350 detects that a set of vehicles 122 is surrounding the AV 302 and the set of vehicles 122 is invading the space within threshold distance 128 from the AV 302, the control device 350 may determine the threshold period of time 118 to be shorter compared to another series of events 114, such as where one vehicle 122 on a side of the AV 302 is driving parallel to the AV 302. - At
step 204, the control device 350 determines whether the series of events 114 corresponds to a malicious event 112. - In this process, the
control device 350 may compare the series of events 114, taken as a whole, with the expected scenarios 108 stored in the normalcy mode 106. If no correspondence is found between the series of events 114, taken as a whole, and the expected scenarios 108, the control device 350 may determine that the series of events 114 corresponds to a malicious event 112, i.e., the series of events 114 is a deviation from the normalcy mode 106. If, however, a correspondence is found, the control device 350 may determine that the series of events 114 does not correspond to a malicious event 112. In one embodiment, the control device 350 may compare each event 104 (from the series of events 114) with the expected scenarios 108. If more than the threshold number 116 of events 104 within the threshold period of time 118 correspond to the expected scenarios 108, the control device 350 may determine that the series of events 114 does not correspond to a malicious event 112. Otherwise, the control device 350 may determine that the series of events 114 corresponds to a malicious event 112. - In one embodiment, the
control device 350 may determine whether the series of events 114 corresponds to a malicious event 112 by assigning a confidence score 132 to the series of events 114 and determining whether the assigned confidence score 132 is above the threshold score 134, similar to that described in FIG. 1. - In one embodiment, if it is determined that the series of
events 114 corresponds to a malicious event 112, method 200 may proceed to step 206. If, however, it is determined that the series of events 114 does not correspond to a malicious event 112, method 200 may be terminated. - In another embodiment, if it is determined that the series of
events 114 does not correspond to a malicious event 112, the control device 350 may communicate the series of events 114 to the operation server 140 so that the remote operator 164 can confirm, update, or override the determination of the control device 350. - At
step 206, the control device 350 escalates the series of events 114 to be addressed. For example, the control device 350 communicates the series of events 114 to the operation server 140 to be addressed by the remote operator 164 (or the operation server 140). For example, in response to receiving the series of events 114, the remote operator 164 (or the operation server 140) may carry out particular countermeasures 166 to address the series of events 114, similar to that described in FIG. 1. Some examples of countermeasures 166 may comprise establishing a communication path 160 with the AV 302 such that individuals causing the series of events 114 can hear and/or see the remote operator 164 from a speaker and/or a monitor screen of a communication module installed in the AV 302, remotely activating a horn of the AV 302, and sending a notifying message 172 to law enforcement 170 indicating that the AV 302 is being tampered with at the particular location coordinates 138. -
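The example countermeasures 166 can be arranged as an ordered escalation. The sketch below is hypothetical: the RemoteOperatorStub class, its method names, and the fixed ordering are invented here for illustration and are not part of the disclosure:

```python
class RemoteOperatorStub:
    """Hypothetical stand-in for the remote operator 164 / operation server 140."""
    def __init__(self):
        self.log = []

    def open_communication_path(self):
        # Voice/visual link to the communication module installed in the cab.
        self.log.append("communication_path")

    def activate_horn(self):
        # Remotely sound the horn of the AV.
        self.log.append("horn")

    def notify_law_enforcement(self, message):
        # Send a notifying message with the AV's location coordinates.
        self.log.append(f"notify: {message}")

def escalate_series(operator, location):
    """Carry out the example countermeasures in order and return the action log."""
    operator.open_communication_path()
    operator.activate_horn()
    operator.notify_law_enforcement(f"AV tampering at {location}")
    return operator.log
```

In practice the ordering could also be chosen by the remote operator rather than fixed; the stub merely shows one plausible dispatch sequence.
-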
FIG. 3 shows a block diagram of an example vehicle ecosystem 300 in which autonomous driving operations can be determined. As shown in FIG. 3, the AV 302 may be a semi-trailer truck. The vehicle ecosystem 300 includes several systems and components that can generate and/or deliver one or more sources of information/data and related services to the in-vehicle control computer 350 that may be located in an AV 302. The in-vehicle control computer 350 can be in data communication with a plurality of vehicle subsystems 340, all of which can be resident in the AV 302. A vehicle subsystem interface 360 is provided to facilitate data communication between the in-vehicle control computer 350 and the plurality of vehicle subsystems 340. In some embodiments, the vehicle subsystem interface 360 can include a controller area network (CAN) controller to communicate with devices in the vehicle subsystems 340. - The
AV 302 may include various vehicle subsystems that support the operation of AV 302. The vehicle subsystems may include the control device 350, a vehicle drive subsystem 342, a vehicle sensor subsystem 344, and/or a vehicle control subsystem 348. The components or devices of the vehicle drive subsystem 342, the vehicle sensor subsystem 344, and the vehicle control subsystem 348 shown in FIG. 3 are examples. The AV 302 may be configured as shown or in any other configuration. - The
vehicle drive subsystem 342 may include components operable to provide powered motion for the AV 302. In an example embodiment, the vehicle drive subsystem 342 may include an engine/motor 342 a, wheels/tires 342 b, a transmission 342 c, an electrical subsystem 342 d, and a power source 342 e. - The
vehicle sensor subsystem 344 may include a number of sensors 346 configured to sense information about an environment or condition of the AV 302. The vehicle sensor subsystem 344 may include one or more cameras 346 a or image capture devices, a Radar unit 346 b, one or more temperature sensors 346 c, a wireless communication unit 346 d (e.g., a cellular communication transceiver), an inertial measurement unit (IMU) 346 e, a laser range finder/LiDAR unit 346 f, a Global Positioning System (GPS) transceiver 346 g, and/or a wiper control system 346 h. The vehicle sensor subsystem 344 may also include sensors 346 configured to monitor internal systems of the AV 302 (e.g., an O2 monitor, a fuel gauge, an engine oil temperature sensor, etc.). - The
IMU 346 e may include any combination of sensors 346 (e.g., accelerometers and gyroscopes) configured to sense position and orientation changes of the AV 302 based on inertial acceleration. The GPS transceiver 346 g may be any sensor configured to estimate a geographic location of the AV 302. For this purpose, the GPS transceiver 346 g may include a receiver/transmitter operable to provide information regarding the position of the AV 302 with respect to the Earth. The Radar unit 346 b may represent a system that utilizes radio signals to sense objects within the local environment of the AV 302. In some embodiments, in addition to sensing the objects, the Radar unit 346 b may additionally be configured to sense the speed and the heading of the objects proximate to the AV 302. The laser range finder or LiDAR unit 346 f may be any sensor configured to sense objects in the environment in which the AV 302 is located using lasers. The cameras 346 a may include one or more devices configured to capture a plurality of images of the environment of the AV 302. The cameras 346 a may be still image cameras or motion video cameras. - The
vehicle control subsystem 348 may be configured to control the operation of the AV 302 and its components. Accordingly, the vehicle control subsystem 348 may include various elements such as a throttle and gear 348 a, a brake unit 348 b, a navigation unit 348 c, a steering system 348 d, and/or an autonomous control unit 348 e. The throttle 348 a may be configured to control, for instance, the operating speed of the engine and, in turn, control the speed of the AV 302. The gear 348 a may be configured to control the gear selection of the transmission. The brake unit 348 b can include any combination of mechanisms configured to decelerate the AV 302. The brake unit 348 b can use friction to slow the wheels in a standard manner. The brake unit 348 b may include an Anti-Lock Brake system (ABS) that can prevent the brakes from locking up when the brakes are applied. The navigation unit 348 c may be any system configured to determine a driving path or route for the AV 302. The navigation unit 348 c may additionally be configured to update the driving path dynamically while the AV 302 is in operation. In some embodiments, the navigation unit 348 c may be configured to incorporate data from the GPS transceiver 346 g and one or more predetermined maps so as to determine the driving path (e.g., along the road 120 of FIG. 1) for the AV 302. The steering system 348 d may represent any combination of mechanisms that may be operable to adjust the heading of AV 302 in an autonomous mode or in a driver-controlled mode. - The autonomous control unit 348 e may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles or obstructions in the environment of the
AV 302. In general, the autonomous control unit 348 e may be configured to control the AV 302 for operation without a driver or to provide driver assistance in controlling the AV 302. In some embodiments, the autonomous control unit 348 e may be configured to incorporate data from the GPS transceiver 346 g, the Radar 346 b, the LiDAR unit 346 f, the cameras 346 a, and/or other vehicle subsystems to determine the driving path or trajectory for the AV 302. - Many or all of the functions of the
AV 302 can be controlled by the in-vehicle control computer 350. The in-vehicle control computer 350 may include at least one data processor 370 (which can include at least one microprocessor) that executes processing instructions 380 stored in a non-transitory computer-readable medium, such as the data storage device 390 or memory. The in-vehicle control computer 350 may also represent a plurality of computing devices that may serve to control individual components or subsystems of the AV 302 in a distributed fashion. In some embodiments, the data storage device 390 may contain processing instructions 380 (e.g., program logic) executable by the data processor 370 to perform various methods and/or functions of the AV 302, including those described with respect to FIGS. 1 and 2. - The
data storage device 390 may contain additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of the vehicle drive subsystem 342, the vehicle sensor subsystem 344, and the vehicle control subsystem 348. The in-vehicle control computer 350 can be configured to include a data processor 370 and a data storage device 390. The in-vehicle control computer 350 may control the function of the AV 302 based on inputs received from various vehicle subsystems (e.g., the vehicle drive subsystem 342, the vehicle sensor subsystem 344, and the vehicle control subsystem 348). -
FIG. 4 shows an exemplary system 400 for providing precise autonomous driving operations. The system 400 includes several modules that can operate in the in-vehicle control computer 350, as described in FIG. 3. The in-vehicle control computer 350 includes a sensor fusion module 402 shown in the top left corner of FIG. 4, where the sensor fusion module 402 may perform at least four image or signal processing operations. The sensor fusion module 402 can obtain images from cameras located on an autonomous vehicle to perform image segmentation 404 to detect the presence of moving objects (e.g., other vehicles 122, pedestrians, etc.) and/or static obstacles (e.g., stop sign, speed bump, terrain, etc.) located around the autonomous vehicle. The sensor fusion module 402 can obtain LiDAR point cloud data items from LiDAR sensors 346 located on the autonomous vehicle to perform LiDAR segmentation 406 to detect the presence of objects and/or obstacles located around the autonomous vehicle. - The
sensor fusion module 402 can perform instance segmentation 408 on image and/or point cloud data items to identify an outline (e.g., boxes) around the objects and/or obstacles located around the autonomous vehicle. The sensor fusion module 402 can perform temporal fusion, where objects and/or obstacles from one image and/or one frame of point cloud data are correlated with or associated with objects and/or obstacles from one or more images or frames subsequently received in time. - The
sensor fusion module 402 can fuse the objects and/or obstacles from the images obtained from the camera and/or point cloud data items obtained from the LiDAR sensors 346. For example, the sensor fusion module 402 may determine, based on the locations of two cameras, that an image from one of the cameras comprising one half of a vehicle located in front of the autonomous vehicle shows the same vehicle as that captured by the other camera. The sensor fusion module 402 sends the fused object information to the interference module 446 and the fused obstacle information to the occupancy grid module 460. The in-vehicle control computer includes the occupancy grid module 460, which can retrieve landmarks from a map database 458 stored in the in-vehicle control computer. The occupancy grid module 460 can determine drivable areas and/or obstacles from the fused obstacles obtained from the sensor fusion module 402 and the landmarks stored in the map database 458. For example, the occupancy grid module 460 can determine that a drivable area may include a speed bump obstacle. - Below the
sensor fusion module 402, the in-vehicle control computer 350 includes a LiDAR based object detection module 412 that can perform object detection 416 based on point cloud data items obtained from the LiDAR sensors 414 located on the autonomous vehicle. The object detection 416 technique can provide a location (e.g., in 3D world coordinates) of objects from the point cloud data. Below the LiDAR based object detection module 412, the in-vehicle control computer includes an image based object detection module 418 that can perform object detection 424 based on images obtained from cameras 420 located on the autonomous vehicle. The object detection 424 technique can employ a deep machine learning technique to provide a location (e.g., in 3D world coordinates) of objects from the image provided by the camera. - The
Radar 456 on the autonomous vehicle can scan an area in front of the autonomous vehicle or an area towards which the autonomous vehicle is driven. The Radar data is sent to the sensor fusion module 402, which can use the Radar data to correlate the objects and/or obstacles detected by the Radar 456 with the objects and/or obstacles detected from both the LiDAR point cloud data and the camera image. The Radar data is also sent to the interference module 446, which can perform data processing on the Radar data to track objects by the object tracking module 448, as further described below. - The in-vehicle control computer includes an
interference module 446 that receives the locations of the objects from the point cloud, the objects from the image, and the fused objects from the sensor fusion module 402. The interference module 446 also receives the Radar data, with which the interference module 446 can track objects, by the object tracking module 448, from one point cloud data item and one image obtained at one time instance to another (or the next) point cloud data item and another image obtained at a subsequent time instance. - The
interference module 446 may perform object attribute estimation 450 to estimate one or more attributes of an object detected in an image or point cloud data item. The one or more attributes of the object may include a type of the object (e.g., pedestrian, car, truck, etc.). The interference module 446 may perform behavior prediction 452 to estimate or predict a motion pattern of an object detected in an image and/or a point cloud. The behavior prediction 452 can be performed to detect a location of an object in a set of images received at different points in time (e.g., sequential images) or in a set of point cloud data items received at different points in time (e.g., sequential point cloud data items). In some embodiments, the behavior prediction 452 can be performed for each image received from a camera and/or each point cloud data item received from the LiDAR sensor. In some embodiments, the interference module 446 can reduce computational load by performing behavior prediction 452 on every other image, or after every pre-determined number of images received from a camera or point cloud data items received from the LiDAR sensor (e.g., after every two images or after every three point cloud data items). - The
behavior prediction 452 feature may determine the speed and direction of the objects that surround the autonomous vehicle from the Radar data, where the speed and direction information can be used to predict or determine motion patterns of objects. A motion pattern may comprise predicted trajectory information of an object over a pre-determined length of time in the future after an image is received from a camera. Based on the predicted motion pattern, the interference module 446 may assign motion pattern situational tags to the objects (e.g., "located at coordinates (x,y)," "stopped," "driving at 50 mph," "speeding up," or "slowing down"). The situational tags can describe the motion pattern of the object. The interference module 446 sends the one or more object attributes (e.g., types of the objects) and motion pattern situational tags to the planning module 462. The interference module 446 may perform an environment analysis 454 using any information acquired by system 400 and any number and combination of its components. - The in-vehicle control computer includes the
planning module 462 that receives the object attributes and motion pattern situational tags from the interference module 446, the drivable area and/or obstacles from the occupancy grid module 460, and the vehicle location and pose information from the fused localization module 426 (further described below). - The
planning module 462 can perform navigation planning 464 to determine a set of trajectories on which the autonomous vehicle can be driven. The set of trajectories can be determined based on the drivable area information, the one or more object attributes of objects, the motion pattern situational tags of the objects, and the location of the obstacles. In some embodiments, the navigation planning 464 may include determining an area next to the road 120 (see FIG. 1) where the autonomous vehicle can be safely parked in case of emergencies. The planning module 462 may include behavioral decision making 466 to determine driving actions (e.g., steering, braking, throttle) in response to determining changing conditions on the road 120 (see FIG. 1) (e.g., a traffic light turned yellow, or the autonomous vehicle is in an unsafe driving condition because another vehicle drove in front of the autonomous vehicle and occupies a region within a pre-determined safe distance of the location of the autonomous vehicle). The planning module 462 performs trajectory generation 468 and selects a trajectory from the set of trajectories determined by the navigation planning operation 464. The selected trajectory information is sent by the planning module 462 to the control module 470. - The in-vehicle control computer includes a
control module 470 that receives the proposed trajectory from the planning module 462 and the autonomous vehicle location and pose from the fused localization module 426. The control module 470 includes a system identifier 472. The control module 470 can perform a model based trajectory refinement 474 to refine the proposed trajectory. For example, the control module 470 can apply a filter (e.g., a Kalman filter) to make the proposed trajectory data smooth and/or to minimize noise. The control module 470 may perform the robust control 476 by determining, based on the refined proposed trajectory information and the current location and/or pose of the autonomous vehicle, an amount of brake pressure to apply, a steering angle, a throttle amount to control the speed of the vehicle, and/or a transmission gear. The control module 470 can send the determined brake pressure, steering angle, throttle amount, and/or transmission gear to one or more devices in the autonomous vehicle to control and facilitate precise driving operations of the autonomous vehicle. - The deep image-based
object detection 424 performed by the image based object detection module 418 can also be used to detect landmarks (e.g., stop signs, speed bumps, etc.) on the road 120 (see FIG. 1). The in-vehicle control computer includes a fused localization module 426 that obtains the landmarks detected from images, the landmarks obtained from a map database 436 stored on the in-vehicle control computer, the landmarks detected from the point cloud data by the LiDAR based object detection module 412, the speed and displacement from the odometer sensor 444, and the estimated location of the autonomous vehicle from the GPS/IMU sensor 438 (i.e., GPS sensor 440 and IMU sensor 442) located on or in the autonomous vehicle. Based on this information, the fused localization module 426 can perform a localization operation 428 to determine a location of the autonomous vehicle, which can be sent to the planning module 462 and the control module 470. - The fused
localization module 426 can estimate the pose 430 of the autonomous vehicle based on the GPS and/or IMU sensors 438. The pose of the autonomous vehicle can be sent to the planning module 462 and the control module 470. The fused localization module 426 can also estimate the status (e.g., location, possible angle of movement) of the trailer unit based on, for example, the information provided by the IMU sensor 442 (e.g., angular rate and/or linear velocity). The fused localization module 426 may also check the map content 432. -
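The model based trajectory refinement 474 described above applies a filter (e.g., a Kalman filter) to smooth the proposed trajectory. As a simplified illustration only — exponential smoothing standing in for the Kalman filter, with invented names and parameters — the idea can be sketched as:

```python
def smooth_trajectory(points, alpha=0.5):
    """Exponentially smooth (x, y) waypoints to reduce noise in a proposed trajectory.

    alpha close to 1 trusts new points; alpha close to 0 trusts the smoothed history.
    """
    if not points:
        return []
    smoothed = [points[0]]
    for x, y in points[1:]:
        px, py = smoothed[-1]
        smoothed.append((alpha * x + (1 - alpha) * px,
                         alpha * y + (1 - alpha) * py))
    return smoothed
```

A Kalman filter would additionally weight each update by estimated measurement and process noise; this sketch keeps only the smoothing behavior.
-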
FIG. 5 shows an exemplary block diagram of an in-vehicle control computer 350 included in an AV 302. The in-vehicle control computer 350 includes at least one processor 504 and a memory 502 having instructions stored thereupon (e.g., software instructions 168 and processing instructions 380 of FIGS. 1 and 3, respectively). The instructions, upon execution by the processor 504, configure the in-vehicle control computer 350 and/or the various modules of the in-vehicle control computer 350 to perform the operations described in FIGS. 1-5. The transmitter 506 transmits or sends information or data to one or more devices in the autonomous vehicle. For example, the transmitter 506 can send an instruction to one or more motors of the steering wheel to steer the autonomous vehicle. The receiver 508 receives information or data transmitted or sent by one or more devices. For example, the receiver 508 receives a status of the current speed from the odometer sensor or the current transmission gear from the transmission. The transmitter 506 and receiver 508 are also configured to communicate with the plurality of vehicle subsystems 340 and the in-vehicle control computer 350 described above in FIGS. 3 and 4. - While several embodiments have been provided in this disclosure, it should be understood that the disclosed systems and methods might be embodied in many other specific forms without departing from the spirit or scope of this disclosure. The present examples are to be considered as illustrative and not restrictive, and the intention is not to be limited to the details given herein. For example, the various elements or components may be combined or integrated into another system or certain features may be omitted, or not implemented.
- In addition, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of this disclosure. Other items shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and could be made without departing from the spirit and scope disclosed herein.
- To aid the Patent Office, and any readers of any patent issued on this application in interpreting the claims appended hereto, applicants note that they do not intend any of the appended claims to invoke 35 U.S.C. § 112(f) as it exists on the date of filing hereof unless the words “means for” or “step for” are explicitly used in the particular claim.
- Implementations of the disclosure can be described in view of the following clauses, the features of which can be combined in any reasonable manner.
-
Clause 1. A system, comprising: -
- an autonomous vehicle (AV) comprising at least one vehicle sensor located on the AV, wherein the AV is configured to travel along a road;
- a control device associated with the AV and comprising a processor configured to:
- detect, from sensor data received from the at least one vehicle sensor, a series of events within a threshold period of time, wherein:
- the series of events taken in the aggregate within the threshold period of time deviates from a normalcy mode;
- the normalcy mode comprises events that are expected to be encountered by the AV;
- a number of events in the series of events is above a threshold number;
- determine whether the series of events corresponds to a malicious event; and
- in response to determining that the series of events corresponds to the malicious event, escalate the series of events to be addressed, wherein:
- escalating the series of events comprises performing at least one countermeasure to address the series of events; and
- the at least one countermeasure comprises establishing a communication path between the AV and an operator such that the operator is able to converse with accomplices causing the series of events.
- detect, from sensor data received from the at least one vehicle sensor, a series of events within a threshold period of time, wherein:
- Clause 2. The system of
Clause 1, wherein: -
- the series of events comprises at least one event that is not within a field-of-view of the at least one vehicle sensor; and
- the field-of-view of the at least one sensor corresponds to a detection zone of the at least one vehicle sensor.
- Clause 3. The system of
Clause 1, wherein detecting the series of events within the threshold period of time comprises detecting one or more of: -
- a first series of events indicating that the AV is forced to deviate from a predetermined routing plan by one or more vehicles such that the AV is forced to re-route or pullover;
- a second series of events indicating that the AV is forced to slow down by one or more vehicles where other surrounding vehicles are not slowing down;
- a third series of events indicating that the AV is forced to slow down as detected by monitoring a speed of an engine of the AV;
- a fourth series of events indicating one or more impacts with the AV by one or more vehicles tampering with the AV;
- a fifth series of events indicating unexpected driving behaviors from one or more vehicles comprising invading a threshold distance from the AV;
- a sixth series of events indicating a vehicle sensor located on the AV is non-responsive as a result of an impact;
- a seventh series of events indicating that the AV is forced to drive over an object on the road as a result of unexpected driving behaviors of one or more vehicles;
- an eighth series of events indicating that a scheduled action indicated in map data unexpectedly did not occur, wherein the scheduled action comprises at least one of scheduling of a traffic light and scheduling of a railroad crossing light; and
- a ninth series of events indicating that a field of view of the at least one vehicle sensor is obfuscated.
- Clause 4. The system of
Clause 1, wherein determining whether the series of events corresponds to the malicious event comprises: -
- comparing the series of events with the normalcy mode;
- determining whether above a threshold number of events from the series of events correspond to any of the expected events; and
- in response to determining that the series of events does not correspond to any of the expected events, determining that the series of events corresponds to the malicious event.
- Clause 5. The system of Clause 1, wherein the processor is further configured to:
- assign a confidence score to the series of events, wherein the confidence score indicates a probability of the series of events corresponding to the malicious event;
- determine whether the confidence score is above a threshold score; and
- in response to determining that the confidence score is above the threshold score, escalate the series of events to be addressed.
- Clause 6. The system of Clause 5, wherein the processor is further configured to in response to determining that the confidence score is below the threshold score, update the normalcy mode to include the series of events indicating that the series of events does not correspond to the malicious event.
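A minimal sketch of the Clause 5/6 decision logic, assuming a scalar confidence score and a set-based normalcy mode (function and variable names are illustrative, not from the specification):

```python
def handle_series(series_of_events, confidence_score, threshold_score, normalcy_mode):
    """Escalate when the confidence score exceeds the threshold (Clause 5);
    otherwise treat the series as benign and fold it into the normalcy
    mode so it is recognized as expected in the future (Clause 6)."""
    if confidence_score > threshold_score:
        return "escalated"
    normalcy_mode.update(series_of_events)
    return "normalcy_mode_updated"

mode = {"expected_event"}
print(handle_series(["benign_event"], 0.3, 0.8, mode))  # normalcy_mode_updated
print("benign_event" in mode)                           # True
```

The key design point is the feedback loop: low-confidence series are not discarded but absorbed into the normalcy mode, reducing repeat false positives.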
- Clause 7. The system of Clause 1, wherein:
- the system further comprises a surveillance sensor associated with the AV such that the surveillance sensor is hidden from sight;
- the surveillance sensor is configured to be activated upon detecting the series of events;
- and the surveillance sensor is further configured to record the series of events.
- Clause 8. A method, comprising:
- detecting, from sensor data received from at least one vehicle sensor associated with an autonomous vehicle (AV), a series of events within a threshold period of time, wherein:
- the series of events taken in the aggregate within the threshold period of time deviates from a normalcy mode;
- the normalcy mode comprises events that are expected to be encountered by the AV;
- a number of events in the series of events is above a threshold number;
- determining whether the series of events corresponds to a malicious event; and
- in response to determining that the series of events corresponds to the malicious event, escalating the series of events to be addressed, wherein:
- escalating the series of events comprises performing at least one countermeasure to address the series of events; and
- the at least one countermeasure comprises establishing a communication path between the AV and an operator such that the operator is able to converse with accomplices causing the series of events.
- Clause 9. The method of Clause 8, wherein determining whether the series of events corresponds to the malicious event comprises:
- comparing each event from the series of events with the normalcy mode;
- determining whether each event from the series of events corresponds to the normalcy mode; and
- in response to determining that each event from the series of events does not correspond to the normalcy mode, determining that the series of events corresponds to the malicious event.
- Clause 10. The method of Clause 8, wherein determining whether the series of events corresponds to the malicious event comprises:
- comparing a threshold number of events from the series of events with the normalcy mode, wherein the threshold number of events is a subset of the series of events;
- determining whether the threshold number of events from the series of events in the aggregate corresponds to the normalcy mode; and
- in response to determining that the threshold number of events from the series of events in the aggregate corresponds to the normalcy mode, determining that the series of events corresponds to the malicious event.
- Clause 11. The method of Clause 8, wherein the communication path comprises one or more of audio and visual communications.
- Clause 12. The method of Clause 8, wherein escalating the series of events comprises sending a notifying message to law enforcement indicating that the AV is being tampered with at a particular location where the series of events is detected.
- Clause 13. The method of Clause 8, wherein the threshold period of time is determined based at least in part upon the number of events in the series of events such that as the number of events in the series of events increases, the threshold period of time increases.
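Clause 13 only requires that the threshold period grow with the number of events; a linear relation is one simple choice satisfying that monotonicity (the base and per-event constants below are hypothetical):

```python
def threshold_period_seconds(num_events, base_seconds=30.0, per_event_seconds=10.0):
    """Return a detection window that grows linearly with the number of
    events, so longer event series are evaluated over longer periods."""
    return base_seconds + per_event_seconds * num_events

print(threshold_period_seconds(3))  # 60.0
print(threshold_period_seconds(5))  # 80.0
```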
- Clause 14. The method of Clause 8, wherein escalating the series of events comprises remotely activating a horn of the AV discouraging accomplices causing the series of events.
- Clause 15. The method of Clause 14, further comprising in response to determining that the series of events does not correspond to the malicious event, updating the normalcy mode to include the series of events.
- Clause 16. A computer program comprising executable instructions stored in a non-transitory computer-readable medium that when executed by one or more processors causes the one or more processors to:
- detect, from sensor data received from at least one vehicle sensor associated with an autonomous vehicle (AV), a series of events within a threshold period of time, wherein:
- the series of events taken in the aggregate within the threshold period of time deviates from a normalcy mode;
- the normalcy mode comprises events that are expected to be encountered by the AV;
- a number of events in the series of events is above a threshold number;
- determine whether the series of events corresponds to a malicious event; and
- in response to determining that the series of events corresponds to the malicious event, escalate the series of events to be addressed, wherein:
- escalating the series of events comprises performing at least one countermeasure to resolve the series of events; and
- the at least one countermeasure comprises establishing a communication path between the AV and an operator such that the operator is able to converse with accomplices causing the series of events.
- Clause 17. The computer program of Clause 16, wherein the events in the normalcy mode correspond to events expected from at least one of:
- moving objects comprising vehicles and pedestrians; and
- static objects comprising road signs and traffic lights.
- Clause 18. The computer program of Clause 16, wherein the at least one vehicle sensor comprises at least one of a camera, Light Detection and Ranging (LiDAR) sensor, motion sensor, and infrared sensor.
- Clause 19. The computer program of Clause 16, wherein the at least one vehicle sensor comprises a sensor monitoring performance of at least one of an engine, a wheel, a tire, a transmission component, and an electrical component of the AV.
- Clause 20. The computer program of Clause 16, wherein the AV is a tracker unit and is attached to a trailer.
Claims (20)
1. A server, comprising:
a memory configured to store at least one of:
a series of events experienced by an autonomous vehicle, wherein the series of events occur within a threshold period of time; and
information about a normalcy mode that comprises events that are expected to be experienced by the autonomous vehicle; and
a processor operably coupled to the memory, and configured to:
receive the series of events from the autonomous vehicle;
determine whether the series of events taken in the aggregate deviates from the normalcy mode; and
in response to determining that the series of events in the aggregate deviate from the normalcy mode, perform at least one countermeasure action to address the series of events.
2. The system of claim 1 , wherein:
the series of events comprises at least one event that is not within a field-of-view of at least one sensor coupled to the autonomous vehicle; and
the field-of-view of the at least one sensor corresponds to a detection zone of the at least one sensor.
3. The system of claim 1 , wherein to perform the at least one countermeasure the processor is further configured to establish a communication path between the autonomous vehicle and an operator such that the operator is able to converse, using the established communication path, with entities that are causing the series of events.
4. The system of claim 1 , wherein the processor is further configured to generate the normalcy mode based on simulating offline driving conditions for the autonomous vehicle in various road environments, wherein the various road environments comprise at least one of a first road environment where the autonomous vehicle is behind traffic, a second road environment where the autonomous vehicle is approaching a traffic light, or a third road environment where a set of vehicles are driving along a road near the autonomous vehicle.
5. The system of claim 1 , wherein the series of events comprises one or more of:
a first series of events indicating that the autonomous vehicle is forced to deviate from a predetermined routing plan by one or more vehicles such that the autonomous vehicle is forced to re-route or pull over;
a second series of events indicating that the autonomous vehicle is forced to slow down by one or more vehicles where other surrounding vehicles are not slowing down;
a third series of events indicating that the autonomous vehicle is forced to slow down as detected by monitoring a speed of an engine of the autonomous vehicle;
a fourth series of events indicating one or more impacts with the autonomous vehicle by one or more vehicles tampering with the autonomous vehicle;
a fifth series of events indicating unexpected driving behaviors from one or more vehicles comprising invading a threshold distance from the autonomous vehicle;
a sixth series of events indicating a vehicle sensor located on the autonomous vehicle is non-responsive as a result of an impact;
a seventh series of events indicating that the autonomous vehicle is forced to drive over an object on the road as a result of unexpected driving behaviors of one or more vehicles;
an eighth series of events indicating that a scheduled action indicated in map data unexpectedly did not occur, wherein the scheduled action comprises at least one of scheduling of a traffic light and scheduling of a railroad crossing light; and
a ninth series of events indicating that a field of view of the at least one vehicle sensor is obfuscated.
6. The system of claim 1 , wherein determining that the series of events in the aggregate deviate from the normalcy mode is in response to:
comparing the series of events with the normalcy mode information;
determining whether more than a threshold number of events from the series of events correspond to any of the expected events; and
in response to determining that the series of events does not correspond to any of the expected events, determining that the series of events corresponds to a malicious event.
7. The system of claim 1 , wherein determining that the series of events in the aggregate deviate from the normalcy mode is in response to confirming a determination made by a second processor associated with the autonomous vehicle that the series of events corresponds to a malicious event.
8. The system of claim 1 , wherein:
the system further comprises a surveillance sensor associated with the autonomous vehicle wherein the surveillance sensor is hidden from sight;
the surveillance sensor is configured to be activated upon detecting the series of events; and
the surveillance sensor is further configured to record the series of events.
9. The system of claim 1 , wherein the processor is further configured to:
determine that the series of events in the aggregate does not deviate from the normalcy mode; and
in response to determining that the series of events does not deviate from the normalcy mode, update the normalcy mode to include the series of events indicating that the series of events does not correspond to a malicious event.
10. The system of claim 3 , wherein the established communication path supports at least one of a voice-based, a message-based, or a visual-based communication.
11. The system of claim 3 , wherein the established communication path supports a two-way communication between the autonomous vehicle and the operator.
12. The system of claim 1 , wherein the at least one countermeasure action comprises causing the autonomous vehicle to activate a horn at the autonomous vehicle discouraging accomplices causing the series of events.
13. The system of claim 1 , wherein the at least one countermeasure action comprises sending a notifying message to law enforcement indicating that the autonomous vehicle is being tampered with at a particular location where the series of events is detected.
14. The system of claim 1 , wherein the threshold period of time is determined based at least in part upon the number of events in the series of events such that as the number of events in the series of events increases, the threshold period of time increases.
15. The system of claim 1 , wherein the events in the normalcy mode correspond to events expected from at least one of:
moving objects comprising vehicles and pedestrians; or
static objects comprising road signs and traffic lights.
16. The system of claim 1 , wherein the autonomous vehicle comprises a tracker unit and is attached to a trailer.
17. A method, comprising:
receiving a series of events experienced by an autonomous vehicle, wherein the series of events occur within a threshold period of time;
storing the series of events in a memory;
determining whether the series of events taken in the aggregate deviates from a normalcy mode; and
in response to determining that the series of events in the aggregate deviate from the normalcy mode, performing at least one countermeasure action to address the series of events.
18. The method of claim 17 , wherein:
the series of events comprises at least one event that is not within a field-of-view of at least one sensor coupled to the autonomous vehicle; and
the field-of-view of the at least one sensor corresponds to a detection zone of the at least one sensor.
19. The method of claim 18 , wherein performing the at least one countermeasure comprises establishing a communication path between the autonomous vehicle and an operator such that the operator is able to converse, using the established communication path, with entities that are causing the series of events.
20. The method of claim 18 , wherein determining that the series of events in the aggregate deviate from the normalcy mode is in response to:
comparing the series of events with information about a normalcy mode, the information about the normalcy mode comprising events that are expected to be experienced by the autonomous vehicle;
determining whether more than a threshold number of events from the series of events correspond to any of the expected events; and
in response to determining that the series of events does not correspond to any of the expected events, determining that the series of events corresponds to a malicious event.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/343,210 US20230356751A1 (en) | 2021-02-02 | 2023-06-28 | Malicious event detection for autonomous vehicles |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/165,396 US11731657B2 (en) | 2021-02-02 | 2021-02-02 | Malicious event detection for autonomous vehicles |
US18/343,210 US20230356751A1 (en) | 2021-02-02 | 2023-06-28 | Malicious event detection for autonomous vehicles |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/165,396 Continuation US11731657B2 (en) | 2021-02-02 | 2021-02-02 | Malicious event detection for autonomous vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230356751A1 true US20230356751A1 (en) | 2023-11-09 |
Family
ID=80218409
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/165,396 Active 2041-07-31 US11731657B2 (en) | 2021-02-02 | 2021-02-02 | Malicious event detection for autonomous vehicles |
US18/343,210 Pending US20230356751A1 (en) | 2021-02-02 | 2023-06-28 | Malicious event detection for autonomous vehicles |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/165,396 Active 2041-07-31 US11731657B2 (en) | 2021-02-02 | 2021-02-02 | Malicious event detection for autonomous vehicles |
Country Status (5)
Country | Link |
---|---|
US (2) | US11731657B2 (en) |
EP (1) | EP4037353A1 (en) |
JP (1) | JP2022118722A (en) |
CN (1) | CN114834477A (en) |
AU (1) | AU2022200562A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11932268B2 (en) * | 2022-05-27 | 2024-03-19 | Plusai, Inc. | Methods and apparatus for tamper detection of a vehicle system |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8645311B2 (en) * | 2011-03-25 | 2014-02-04 | Siemens Aktiengesellschaft | Critical threshold parameters for defining bursts in event logs |
US9442350B2 (en) * | 2014-12-04 | 2016-09-13 | Ford Global Technologies, Llc | Hidden camera assembly with microprocessor control |
GB2559139B (en) * | 2017-01-26 | 2020-07-29 | Jaguar Land Rover Ltd | Apparatus and method for incident response |
US20190009785A1 (en) * | 2017-07-05 | 2019-01-10 | Panasonic Intellectual Property Management Co., Ltd. | System and method for detecting bullying of autonomous vehicles while driving |
US10818187B2 (en) * | 2017-07-17 | 2020-10-27 | Uatc, Llc | Systems and methods for deploying an autonomous vehicle to oversee autonomous navigation |
WO2019118836A1 (en) * | 2017-12-15 | 2019-06-20 | Walmart Apollo, Llc | System and method for autonomous vehicle intrusion counter-measures |
US11150663B2 (en) * | 2018-01-26 | 2021-10-19 | Nvidia Corporation | Detection of hazardous driving using machine learning |
US20200019173A1 (en) | 2018-07-12 | 2020-01-16 | International Business Machines Corporation | Detecting activity near autonomous vehicles |
US11377072B2 (en) | 2018-11-02 | 2022-07-05 | Uatc, Llc | Systems and methods for tamper evident electronic detection |
US20210170989A1 (en) * | 2019-12-09 | 2021-06-10 | James Andrew Cameron | Anti Theft Device for Open Air Compartment Vehicles |
2021
- 2021-02-02 US US17/165,396 patent/US11731657B2/en active Active
2022
- 2022-01-27 CN CN202210100010.7A patent/CN114834477A/en active Pending
- 2022-01-28 AU AU2022200562A patent/AU2022200562A1/en active Pending
- 2022-02-01 EP EP22154444.8A patent/EP4037353A1/en active Pending
- 2022-02-01 JP JP2022013972A patent/JP2022118722A/en active Pending
2023
- 2023-06-28 US US18/343,210 patent/US20230356751A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN114834477A (en) | 2022-08-02 |
US11731657B2 (en) | 2023-08-22 |
JP2022118722A (en) | 2022-08-15 |
EP4037353A1 (en) | 2022-08-03 |
AU2022200562A1 (en) | 2022-08-18 |
US20220242451A1 (en) | 2022-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11731627B2 (en) | Road anomaly detection for autonomous vehicle | |
US10077007B2 (en) | Sidepod stereo camera system for an autonomous vehicle | |
US10896122B2 (en) | Using divergence to conduct log-based simulations | |
JP2018152056A (en) | Risk-based driver assistance for approaching intersections with limited visibility | |
US20180224850A1 (en) | Autonomous vehicle control system implementing teleassistance | |
US11932286B2 (en) | Responder oversight system for an autonomous vehicle | |
US20230020040A1 (en) | Batch control for autonomous vehicles | |
US20230139740A1 (en) | Remote access application for an autonomous vehicle | |
US20230356751A1 (en) | Malicious event detection for autonomous vehicles | |
JP7057874B2 (en) | Anti-theft technology for autonomous vehicles to transport cargo | |
US20220348223A1 (en) | Autonomous vehicle to oversight system communications | |
US11767031B2 (en) | Oversight system to autonomous vehicle communications | |
US11767032B2 (en) | Direct autonomous vehicle to autonomous vehicle communications | |
EP4261093A1 (en) | Method comprising the detection of an abnormal operational state of an autonomous vehicle | |
US20230365143A1 (en) | System and method for remote control guided autonomy for autonomous vehicles | |
US20230199450A1 (en) | Autonomous Vehicle Communication Gateway Architecture | |
WO2023220509A1 (en) | System and method for remote control guided autonomy for autonomous vehicles | |
WO2023122586A1 (en) | Autonomous vehicle communication gateway architecture | |
JP2022171625A (en) | Oversight system to autonomous vehicle communications | |
WO2021202410A1 (en) | Systems and methods for capturing passively-advertised attribute information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TUSIMPLE, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAMMOUD, RIAD I.;REEL/FRAME:064097/0836 Effective date: 20210201 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |