WO2023028610A1 - Emergency siren detection in autonomous vehicles - Google Patents

Emergency siren detection in autonomous vehicles

Info

Publication number
WO2023028610A1
Authority
WO
WIPO (PCT)
Prior art keywords
audio
vehicle
siren
emergency vehicle
audio signal
Prior art date
Application number
PCT/US2022/075549
Other languages
English (en)
Inventor
Scott Douglas Foster
Neil M. OVERMON
Erik Orlando PORTILLO
Zhujia Shi
Joyce TAM
Mohammad Poorsartep
Christopher Miller
Pengji DUAN
Lingting Ge
Navid SARMADNIA
Panqu WANG
Zhe Huang
Original Assignee
Tusimple, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tusimple, Inc. filed Critical Tusimple, Inc.
Priority to CN202280072475.4A priority Critical patent/CN118511207A/zh
Priority to EP22777879.2A priority patent/EP4392962A1/fr
Priority claimed from US17/822,735 external-priority patent/US20230065647A1/en
Publication of WO2023028610A1 publication Critical patent/WO2023028610A1/fr


Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0965Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages responding to signals from another vehicle, e.g. emergency vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0015Planning or execution of driving tasks specially adapted for safety
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/14Systems for determining distance or velocity not using reflection or reradiation using ultrasonic, sonic, or infrasonic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/80Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using ultrasonic, sonic or infrasonic waves
    • G01S3/802Systems for determining direction or deviation from predetermined direction
    • G01S3/808Systems for determining direction or deviation from predetermined direction using transducers spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0043Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2050/0057Frequency analysis, spectral techniques or transforms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/54Audio sensitive means, e.g. ultrasound
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/60Doppler effect
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/404Characteristics
    • B60W2554/4041Position
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • B60W2554/802Longitudinal distance

Definitions

  • the present document relates generally to autonomous vehicles. More particularly, the present document is related to operating an autonomous vehicle (AV) appropriately on public roads, highways, and locations with other vehicles or pedestrians.
  • One aim of autonomous vehicle technologies is to provide vehicles that can safely navigate towards a destination with limited or no driver assistance.
  • the safe navigation of an autonomous vehicle (AV) from one point to another may include the ability to signal other vehicles, navigate around other vehicles in shoulders or emergency lanes, change lanes, bias appropriately in a lane, and navigate all portions or types of highway lanes.
  • Autonomous vehicle technologies may enable an AV to operate without requiring extensive learning or training by surrounding drivers, by ensuring that the AV can operate safely, in a way that is evident, logical, or familiar to surrounding drivers and pedestrians.
  • An autonomous vehicle includes audio sensors configured to detect audio in an environment around the autonomous vehicle and to generate audio signals based on the detected audio.
  • a processor in the autonomous vehicle receives the audio signals and compares a time domain or frequency domain representation of the audio signals to a corresponding representation of a known emergency vehicle siren. The comparison causes the processor to output a first determination indicating whether the audio signals are indicative of an emergency vehicle siren.
  • the processor also applies a trained neural network to the audio signals that causes the processor to output a second determination indicating whether the audio signals are indicative of the emergency vehicle siren. Based on the first determination and the second determination indicating presence of an emergency vehicle siren in the environment around the autonomous vehicle, the autonomous vehicle is caused to perform an action.
  • FIG. 1 illustrates a schematic diagram of a system including an autonomous vehicle
  • FIG. 2 shows a flow diagram for operation of an autonomous vehicle (AV) safely in light of the health and surroundings of the AV;
  • FIG. 3 illustrates a system that includes one or more autonomous vehicles, a control center or oversight system with a human operator (e.g., a remote center operator (RCO)), and an interface for third-party interaction;
  • FIG. 4 is a schematic diagram of an autonomous vehicle according to some implementations.
  • FIG. 5 is a schematic diagram of an audio sensor array according to some implementations.
  • FIG. 6 is a block diagram illustrating functional modules executed by an autonomous vehicle to detect emergency vehicle sirens based on audio signals, according to some implementations.
  • FIG. 7 is a flowchart illustrating a process for detecting emergency vehicle sirens based on audio signals, according to some implementations.
  • the ability to recognize a malfunction in its systems and stop safely is necessary for lawful and safe operation of the vehicle.
  • systems and methods for the safe and lawful operation of an autonomous vehicle on a roadway including the execution of maneuvers that bring the autonomous vehicle in compliance with the law while signaling surrounding vehicles of its condition.
  • an autonomous vehicle includes a plurality of audio sensors configured to detect audio in an environment around the autonomous vehicle and to generate one or more audio signals based on the detected audio.
  • One or more processors in the vehicle receive the audio signals and compare a time domain or frequency domain representation of the one or more audio signals to a corresponding representation of a known emergency vehicle siren. The comparison causes the processor to output a first determination indicating whether the one or more audio signals are indicative of an emergency vehicle siren in the environment around the vehicle.
  • the processor also applies a trained neural network to the audio signals that causes the processor to output a second determination indicating whether the audio signals are indicative of the emergency vehicle siren in the environment around the autonomous vehicle. If either the first determination or the second determination indicates presence of an emergency vehicle siren in the environment around the autonomous vehicle, the autonomous vehicle is caused to perform an action, for example to safely move out of a pathway of the emergency vehicle.
  • FIG. 1 shows a system 100 that includes a tractor 105 of an autonomous truck.
  • the tractor 105 includes a plurality of vehicle subsystems 140 and an in-vehicle control computer 150.
  • the plurality of vehicle subsystems 140 includes vehicle drive subsystems 142, vehicle sensor subsystems 144, and vehicle control subsystems.
  • An engine or motor, wheels and tires, a transmission, an electrical subsystem, and a power subsystem may be included in the vehicle drive subsystems.
  • the engine of the autonomous truck may be an internal combustion engine, a fuel-cell powered electric engine, a battery powered electrical engine, a hybrid engine, or any other type of engine capable of moving the wheels on which the tractor 105 moves.
  • the tractor 105 may have multiple motors or actuators to drive the wheels of the vehicle, such that the vehicle drive subsystems 142 include two or more electrically driven motors.
  • the transmission may include a continuous variable transmission or a set number of gears that translate the power created by the engine into a force that drives the wheels of the vehicle.
  • the vehicle drive subsystems may include an electrical system that monitors and controls the distribution of electrical current to components within the system, including pumps, fans, and actuators.
  • the power subsystem of the vehicle drive subsystem may include components that regulate the power source of the vehicle.
  • Vehicle sensor subsystems 144 can include sensors for general operation of the autonomous truck 105, including those which would indicate a malfunction in the AV or another cause for an AV to perform a limited or minimal risk condition (MRC) maneuver.
  • the sensors for general operation of the autonomous vehicle may include cameras, a temperature sensor, an inertial sensor (IMU), a global positioning system, a light sensor, a LIDAR system, a radar system, and wireless communications.
  • a sound detection array such as a microphone or array of microphones, may be included in the vehicle sensor subsystem 144.
  • the microphones of the sound detection array are configured to receive audio indications of the presence of, or instructions from, authorities, including sirens and commands such as “Pull over.”
  • These microphones are mounted, or located, on the external portion of the vehicle, specifically on the outside of the tractor portion of an autonomous truck 105.
  • Microphones used may be any suitable type, mounted such that they are effective both when the autonomous truck 105 is at rest, as well as when it is moving at normal driving speeds.
  • Cameras included in the vehicle sensor subsystems 144 may be rear facing so that flashing lights from emergency vehicles may be observed from all around the autonomous truck 105. These cameras may include video cameras, cameras with filters for specific wavelengths, as well as other cameras suitable to detect emergency vehicle lights based on color, flashing, intensity, and/or the like parameters.
  • the vehicle control subsystem 146 may be configured to control operation of the autonomous vehicle, or truck, 105 and its components. Accordingly, the vehicle control subsystem 146 may include various elements such as an engine power output subsystem, a brake unit, a navigation unit, a steering system, and an autonomous control unit.
  • the engine power output subsystem may control the operation of the engine, including the torque produced or horsepower provided, as well as control the gear selection of the transmission.
  • the brake unit can include any combination of mechanisms configured to decelerate the autonomous vehicle 105.
  • the brake unit can use friction to slow the wheels in a standard manner.
  • the brake unit may include an Anti-lock brake system (ABS) that can prevent the brakes from locking up when the brakes are applied.
  • the navigation unit may be any system configured to determine a driving path or route for the autonomous vehicle 105.
  • the navigation unit may additionally be configured to update the driving path dynamically while the autonomous vehicle 105 is in operation.
  • the navigation unit may be configured to incorporate data from the GPS device and one or more predetermined maps so as to determine the driving path for the autonomous vehicle 105.
  • the steering system may represent any combination of mechanisms that may be operable to adjust the heading of autonomous vehicle 105 in an autonomous mode or in a driver-controlled mode.
  • the autonomous control unit may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the autonomous vehicle 105.
  • the autonomous control unit may be configured to control the autonomous vehicle 105 for operation without a driver or to provide driver assistance in controlling the autonomous vehicle 105.
  • the autonomous control unit may be configured to incorporate data from the GPS device, the RADAR, the LiDAR (also referred to as LIDAR), the cameras, and/or other vehicle subsystems to determine the driving path or trajectory for the autonomous vehicle 105.
  • the autonomous control unit may activate systems that the AV 105 has which are not present in a conventional vehicle, including those systems which can allow the AV to communicate with surrounding drivers or signal surrounding vehicles or drivers for safe operation of the AV.
  • An in-vehicle control computer 150 which may be referred to as a VCU, includes a vehicle subsystem interface 160, a driving operation module 168, one or more processors 170, a compliance module 166, a memory 175, and a network communications subsystem 178.
  • This in-vehicle control computer 150 controls many, if not all, of the operations of the autonomous truck 105 in response to information from the various vehicle subsystems 140.
  • the one or more processors 170 execute the operations that allow the system to determine the health of the AV, such as whether the AV has a malfunction or has encountered a situation requiring service or a deviation from normal operation, and to give instructions accordingly.
  • Data from the vehicle sensor subsystems 144 is provided to VCU 150 so that the determination of the status of the AV can be made.
  • the compliance module 166 determines what action should be taken by the autonomous truck 105 to operate according to the applicable (i.e., local) regulations. Data from other vehicle sensor subsystems 144 may be provided to the compliance module 166 so that the best course of action in light of the AV’s status may be appropriately determined and performed. Alternatively, or additionally, the compliance module 166 may determine the course of action in conjunction with another operational or control module, such as the driving operation module 168.
  • the memory 175 may contain additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of the vehicle drive subsystem 142, the vehicle sensor subsystem 144, and the vehicle control subsystem 146, including the autonomous control system.
  • the in-vehicle control computer (VCU) 150 may control the function of the autonomous vehicle 105 based on inputs received from various vehicle subsystems (e.g., the vehicle drive subsystem 142, the vehicle sensor subsystem 144, and the vehicle control subsystem 146). Additionally, the VCU 150 may send information to the vehicle control subsystems 146 to direct the trajectory, velocity, signaling behaviors, and the like, of the autonomous vehicle 105.
  • the autonomous control vehicle control subsystem may receive a course of action to be taken from the compliance module 166 of the VCU 150 and consequently relay instructions to other subsystems to execute the course of action.
  • FIG. 2 shows a flow diagram for operation of an autonomous vehicle (AV) safely in light of the health and surroundings of the AV.
  • the vehicle sensor subsystem 144 receives visual, auditory, or both visual and auditory signals indicating the environmental condition of the AV, as well as vehicle health or sensor activity data, in step 205. These visual and/or auditory signal data are transmitted from the vehicle sensor subsystem 144 to the in-vehicle control computer system (VCU) 150, as in step 210. Either or both of the driving operation module and the compliance module receive the data transmitted from the vehicle sensor subsystem, in step 215. Then, one or both of those modules determine whether the current status of the AV allows it to proceed in the usual manner or whether the AV needs to alter its course to prevent damage or injury or to allow for service, in step 220.
  • the information indicating that a change to the course of the AV is needed may include an indicator of sensor malfunction; an indicator of a malfunction in the engine, brakes, or other components necessary for the operation of the autonomous vehicle; a determination of a visual instruction from authorities such as flares, cones, or signage; a determination of authority personnel present on the roadway; a determination of a law enforcement vehicle on the roadway approaching the autonomous vehicle, including from which direction; and a determination of a law enforcement or first responder vehicle moving away from or on a separate roadway from the autonomous vehicle.
  • This information indicating that a change to the AV’s course of action is needed may be used by the compliance module to formulate a new course of action to be taken which accounts for the AV’s health and surroundings, in step 225.
  • the course of action to be taken may include slowing, stopping, moving into a shoulder, changing route, changing lane while staying on the same general route, and the like.
  • the course of action to be taken may include initiating communications with any oversight or human interaction systems present on the autonomous vehicle.
  • the course of action to be taken may then be transmitted from the VCU 150 to the autonomous control system, in step 230.
  • the vehicle control subsystems 146 then cause the autonomous truck 105 to operate in accordance with the course of action to be taken that was received from the VCU 150 in step 235.
  • FIG. 3 illustrates a system 300 that includes one or more autonomous vehicles 105, a control center or oversight system 350 with a human operator 355, and an interface 362 for third-party 360 interaction.
  • a human operator 355 may also be known as a remote center operator (RCO).
  • Communications between the autonomous vehicles 105, oversight system 350 and user interface 362 take place over a network 370.
  • the autonomous vehicles 105 may communicate with each other over the network 370 or directly.
  • the VCU 150 of each autonomous vehicle 105 may include a module for network communications 178.
  • An autonomous truck may be in communication with an oversight system.
  • the oversight system may serve many purposes, including: tracking the progress of one or more autonomous vehicles (e.g., an autonomous truck); tracking the progress of a fleet of autonomous vehicles; sending maneuvering instructions to one or more autonomous vehicles; monitoring the health of the autonomous vehicle(s); monitoring the status of the cargo of each autonomous vehicle in contact with the oversight system; facilitating communications between third parties (e.g., law enforcement, clients whose cargo is being carried) and each, or a specific, autonomous vehicle; allowing for tracking of specific autonomous trucks in communication with the oversight system (e.g., third-party tracking of a subset of vehicles in a fleet); arranging maintenance service for the autonomous vehicles (e.g., oil changing, fueling, maintaining the levels of other fluids); alerting an affected autonomous vehicle of changes in traffic or weather that may adversely impact a route or delivery plan; pushing over-the-air updates to autonomous trucks to keep all components up to date; and other purposes or functions that improve the safety of the autonomous vehicle, its cargo, and its surroundings.
  • An oversight system may also determine performance parameters of an autonomous vehicle or autonomous truck, including any of: data logging frequency, compression rate, location, data type; communication prioritization; how frequently to service the autonomous vehicle (e.g., how many miles between services); when to perform a minimal risk condition (MRC) maneuver while monitoring the vehicle’s progress during the maneuver; when to hand over control of the autonomous vehicle to a human driver (e.g., at a destination yard); ensuring an autonomous vehicle passes pre-trip inspection; ensuring an autonomous vehicle performs or conforms to legal requirements at checkpoints and weigh stations; ensuring an autonomous vehicle performs or conforms to instructions from a human at the site of a roadblock, cross-walk, intersection, construction, or accident; and the like.
  • an oversight system or command center includes the ability to relay over-the-air, real-time weather updates to autonomous vehicles in a monitored fleet.
  • the over-the-air weather updates may be pushed to all autonomous vehicles in the fleet or may be pushed only to autonomous vehicles currently on a mission to deliver a cargo.
  • priority to push or transmit over-the-air weather reports may be given to fleet vehicles currently on a trajectory or route that leads towards or within a predetermined radius of a severe weather event.
  • trailer metadata may include the type of cargo being transported, the weight of the cargo, temperature thresholds for the cargo (e.g., trailer interior temperature should not fall below or rise above predetermined temperatures), time sensitivities, acceleration/deceleration sensitivities (e.g., jerking motion may be bad because of the fragility of the cargo), trailer weight distribution along the length of the trailer, cargo packing or stacking within the trailer, and the like.
  • each autonomous vehicle may be equipped with a communication gateway.
  • the communication gateway may have the ability to do any of the following: allow for AV to oversight system communication (i.e., V2C) and oversight system to AV communication (C2V); allow for AV to AV communication within the fleet (V2V); transmit the availability or status of the communication gateway; acknowledge received communications; ensure security around remote commands between the AV and the oversight system; convey the AV’s location reliably at set time intervals; enable the oversight system to ping the AV for location and vehicle health status; allow for streaming of various sensor data directly to the command or oversight system; allow for automated alerts between the AV and oversight system; comply with ISO 21434 standards; and the like.
  • An oversight system or command center may be operated by one or more humans, each known as an operator or a remote center operator (RCO).
  • the operator may set thresholds for autonomous vehicle health parameters, so that when an autonomous vehicle meets or exceeds the threshold, precautionary action may be taken.
  • Examples of vehicle health parameters for which thresholds may be established by an operator may include any of: fuel levels; oil levels; miles traveled since last maintenance; low tire-pressure detected; cleaning fluid levels; brake fluid levels; responsiveness of steering and braking subsystems; diesel exhaust fluid (DEF) level; communication ability (e.g., lack of responsiveness); positioning sensors ability (e.g., GPS, IMU malfunction); impact detection (e.g., vehicle collision); perception sensor ability (e.g., camera, LIDAR, radar, microphone array malfunction); computing resources ability (e.g., VCU or ECU malfunction or lack of responsiveness, temperature abnormalities in computing units); angle between a tractor and trailer in a towing situation (e.g., tractor-trailer, 18-wheeler, or semi-truck); unauthorized access by a living entity (e.g., a person or an animal) to the interior of an autonomous truck; and the like.
  • the precautionary action may include execution of a minimal risk condition (MRC) maneuver, seeking service, or exiting a highway or other such re-routing that may be less taxing on the autonomous vehicle.
  • An autonomous vehicle whose system health data meets or exceeds a threshold set at the oversight system or by the operator may receive instructions that are automatically sent from the oversight system to perform the precautionary action.
  • the operator may be made aware of situations affecting one or more autonomous vehicles in communication with or being monitored by the oversight system that the affected autonomous vehicle(s) may not be aware of.
  • Such situations may include: irregular or sudden changes in traffic flow (e.g., traffic jam or accident); abrupt weather changes; abrupt changes in visibility; emergency conditions (e.g., fire, sink-hole, bridge failure); power outage affecting signal lights; unexpected road work; large or ambiguous road debris (e.g., object unidentifiable by the autonomous vehicle); law enforcement activity on the roadway (e.g., car chase or road clearing activity); and the like.
  • An autonomous vehicle may not be able to detect such situations because of limitations of sensor systems or lack of access to the information distribution means (e.g., no direct communication with weather agency).
  • An operator at the oversight system may push such information to affected autonomous vehicles that are in communication with the oversight system.
  • the affected autonomous vehicles may proceed to alter their route, trajectory, or speed in response to the information pushed from the oversight system.
  • the information received by the oversight system may trigger a threshold condition indicating that MRC (minimal risk condition) maneuvers are warranted; alternatively, or additionally, an operator may evaluate a situation and determine that an affected autonomous vehicle should perform a MRC maneuver and subsequently send such instructions to the affected vehicle.
  • each autonomous vehicle receiving either information or instructions from the oversight system or the oversight system operator uses its on-board computing unit (e.g., VCU) to determine how to safely proceed, including performing an MRC maneuver that includes pulling over or stopping.
  • Other interactions that the remote center operator (RCO) may have with an autonomous vehicle or a fleet of autonomous vehicles include any of the following: pre-planned event avoidance; real-time route information updates; real-time route feedback; trailer hookup status; first responder communication request handling; notification of aggressive surrounding vehicle(s); identification of construction zone changes; status of an AV with respect to its operational design domain (ODD), such as alerting the RCO when an autonomous vehicle is close to or enters a status out of ODD; RCO notification of when an AV is within a threshold distance from a toll booth and appropriate instruction/communication with the AV or toll authority may be sent to allow the AV to bypass the toll; RCO notification of when an AV bypasses a toll; RCO notification of when an AV is within a threshold distance from a weigh station and appropriate instruction/communication with the AV or appropriate authority may be sent to allow the AV to bypass the weigh station; RCO notification of when an AV bypasses a weigh station; notification to the AV from
  • An oversight system or command center may allow a third party to interact with the oversight system operator, with an autonomous truck, or with both the human system operator and an autonomous truck.
  • a third party may be a customer whose goods are being transported, a law enforcement or emergency services provider, or a person assisting the autonomous truck when service is needed.
  • the oversight system may recognize different levels of access, such that a customer concerned about the timing or progress of a shipment may only be allowed to view status updates for an autonomous truck, or may be able to view status and provide input regarding what parameters to prioritize (e.g., speed, economy, maintaining originally planned route) to the oversight system.
  • The actions that an autonomous vehicle, particularly an autonomous truck, as described herein may be configured to execute to safely traverse a course while abiding by the applicable rules, laws, and regulations may include those actions successfully accomplished by an autonomous truck driven by a human.
  • These actions, or maneuvers, may be described as features of the truck, in that these actions may be executable programming stored on the VCU 150 (i.e., the in-vehicle control computer unit).
  • actions or features may include those related to reactions to the detection of certain types of conditions or objects, such as: appropriate motion in response to detection of an emergency vehicle with flashing lights; appropriate motion in response to detecting one or more vehicles approaching the AV; motions or actions in response to encountering an intersection; execution of a merge into traffic in an adjacent lane or area of traffic; detection of a need to clean one or more sensors and the cleaning of the appropriate sensor; and the like.
  • Other features of an autonomous truck may include those actions or features which are needed for any type of maneuvering, including that needed to accomplish the features or actions that are reactionary, listed above.
  • Such features, which may be considered supporting features, may include: the ability to maintain an appropriate following distance; the ability to turn right and left with appropriate signaling and motion; and the like.
  • These supporting features, as well as the reactionary features listed above, may include controlling or altering the steering, engine power output, brakes, or other vehicle control subsystems 146.
  • FIG. 4 is a schematic diagram illustrating a configuration of a subset of sensors in the autonomous vehicle 105, according to some implementations.
  • the subset of sensors depicted in FIG. 4 can be part of the vehicle sensor subsystem 144 described with respect to FIG. 1.
  • the autonomous vehicle 105 has a front side 405, a rear side 410, a left side 415, and a right side 420.
  • One or more audio sensors can be positioned on each side of the autonomous vehicle, where each audio sensor is configured to generate a respective audio signal based on audio detected in an environment around the autonomous vehicle.
  • an audio sensor array 430 is disposed on each side of the autonomous vehicle.
  • the audio sensor arrays 430 can be positioned on the vehicle at locations that reduce ambient sound caused by wind pressure changes as the autonomous vehicle is operated. For example, each array 430 can be positioned at a location that experiences a lowest wind pressure change during operation of the autonomous vehicle.
  • the audio sensors can be disposed on the tractor, on the trailer, or both.
  • some implementations of the autonomous vehicle 105 further include one or more sensors configured to detect light signals (such as a camera); the light-detecting sensors in these implementations can similarly be located on the tractor, on the trailer, or both.
  • FIG. 5 is a schematic diagram illustrating an example audio sensor array 430.
  • the audio sensor array 430 includes multiple audio sensors 502 distributed along a length of the array.
  • the audio sensors 502 have an even spacing distance 504 between adjacent sensors.
  • the audio sensors 502 may be unevenly spaced, e.g., to fit contours of the vehicle.
  • FIG. 6 is a block diagram illustrating functional modules executed by the autonomous vehicle 105 to detect emergency vehicle sirens based on audio signals, according to some implementations.
  • the autonomous vehicle 105 can include an audio preprocessing module 605, a signal processing module 610, and a siren detection module 625, as well as store or have access to a siren data repository 615 and a siren classification model 620.
  • the autonomous vehicle 105 can include additional, fewer, or different modules, and functionality described herein can be divided differently between the modules.
  • the term “module” refers broadly to software components, firmware components, and/or hardware components.
  • the modules 605, 610, and 625 may each comprise software, firmware, and/or hardware components implemented in, or accessible to, the autonomous vehicle 105.
  • the modules 605, 610, and 625 are executed by the in-vehicle control computer 150, for example by the one or more processors 170 executing computer-readable instructions stored in the memory 175.
  • the audio pre-processing module 605 receives raw audio signals output by the audio sensors and performs one or more pre-processing steps to prepare the audio signals for analysis by other modules. For example, the audio pre-processing module 605 can apply one or more filters or windows to the audio signals to smooth the audio signal or to prepare portions of the audio signal for analysis by the signal processing module 610 or the siren detection module 625.
  • the audio pre-processing module 605 can also use audio signals generated by some audio sensors in the autonomous vehicle to remove ambient sounds from audio signals generated by other audio sensors in the autonomous vehicle. For example, when the autonomous vehicle 105 is traveling on a limited-access road such as a freeway, an emergency vehicle is expected to approach from either substantially the front or substantially the rear of the vehicle 105. Thus, the audio sensors on the sides of the autonomous vehicle can be used to represent ambient audio in the environment around the autonomous vehicle. The audio pre-processing module 605 uses the signals from the side-facing audio sensors to modify audio signals generated by the forward- and backward-facing sensors on the autonomous vehicle by removing the ambient audio or portions of the ambient audio (e.g., predominant frequencies), as in the sketch below.
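A minimal Python sketch of one way to realize this side-reference ambient removal; the patent does not specify an algorithm, so the use of spectral subtraction, the function name, the spectral floor, and the single-channel framing are all illustrative assumptions:

```python
import numpy as np

def suppress_ambient(front: np.ndarray, side: np.ndarray, floor: float = 0.1) -> np.ndarray:
    """Reduce ambient content in a front/rear microphone signal, using a
    side-facing microphone as the ambient reference (spectral subtraction)."""
    n = min(len(front), len(side))
    front_spec = np.fft.rfft(front[:n])
    side_mag = np.abs(np.fft.rfft(side[:n]))
    # Subtract the ambient magnitude estimate; keep a small spectral floor so
    # the resulting magnitude never goes negative.
    mag = np.maximum(np.abs(front_spec) - side_mag, floor * np.abs(front_spec))
    # Resynthesize using the original phase of the front/rear signal.
    return np.fft.irfft(mag * np.exp(1j * np.angle(front_spec)), n=n)
```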
  • the signal processing module 610 performs classical signal processing techniques on audio signals, such as the raw audio signals output by the audio sensors or pre-processed audio signals generated by the pre-processing module 605.
  • the output of the signal processing module 610 can include a determination of whether an audio signal matches expected characteristics of an emergency vehicle siren, a prediction of whether the emergency vehicle is approaching or moving away from the autonomous vehicle 105, or an estimation of a position of an emergency vehicle relative to the autonomous vehicle 105.
  • the signal processing module 610 generates a first determination of whether an audio signal generated by the audio sensors in the autonomous vehicle 105 contains an emergency vehicle siren by comparing a time domain or frequency domain representation of an audio signal to corresponding representations of known sirens.
  • the representations of known sirens can be stored in the siren data repository 615, which can be stored locally on the autonomous vehicle 105 or at a remote location accessible to the vehicle 105.
  • the signal processing module 610 detects and classifies siren sounds as defined by current guidelines issued by relevant jurisdictions, such as US Dept of Justice NIJ Guide 500-00 and SAE J-1849 (including siren patterns for Yelp and Wail), or siren sounds used for civil defense or to alert local populations to potential extreme weather or civil defense events as defined by guidelines issued by relevant jurisdictions, such as current FEMA CPG 1-17 and FEMA Manual 1550.2.
  • the signal processing module 610 compares an audio signal detected by the audio sensors on the autonomous vehicle to the predefined siren sounds defined by the guidelines in the jurisdiction in which the vehicle 105 is operating.
  • the signal processing module 610 performs a frequency analysis of the audio signals to generate a frequency spectrum representing at least a portion of audio. For example, the signal processing module 610 performs a fast Fourier transform on the audio signals to identify predominant frequencies in the signal. The identified predominant frequencies can be compared to predominant frequencies expected for each of several known types of emergency sirens in the siren data repository 615, causing the signal processing module 610 to determine that the audio signal contains an emergency siren when at least a threshold degree of match is found, as sketched below.
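A minimal sketch of this frequency-domain comparison; the reference frequencies standing in for the siren data repository 615, the peak count, and the tolerance are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Hypothetical reference data standing in for the siren data repository 615.
KNOWN_SIRENS = {"wail": (725.0, 1450.0), "yelp": (850.0, 1700.0)}

def first_determination(audio: np.ndarray, sample_rate: int = 48_000,
                        tol_hz: float = 25.0) -> bool:
    """Compare the signal's predominant FFT frequencies to known sirens."""
    windowed = audio * np.hanning(len(audio))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(windowed), d=1.0 / sample_rate)
    predominant = freqs[np.argsort(spectrum)[-5:]]  # five strongest peaks
    for ref in KNOWN_SIRENS.values():
        # Threshold degree of match: every reference frequency has a nearby peak.
        if all(np.min(np.abs(predominant - r)) <= tol_hz for r in ref):
            return True
    return False
```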
  • the autonomous vehicle 105 determines if an average spectrum of the audio signal is within a threshold of an expected average spectrum for an emergency vehicle. Other implementations of the autonomous vehicle 105 analyze a curve fit of a time-domain representation of an audio signal or perform audio fingerprinting analysis of the audio signal to determine if the sound signal matches expected time domain features of an emergency vehicle’s siren.
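A sketch of the average-spectrum variant just described; the frame size, distance threshold, and volume normalization are illustrative assumptions:

```python
import numpy as np

def average_spectrum_match(audio: np.ndarray, expected_spectrum: np.ndarray,
                           threshold: float = 0.25, frame: int = 2048) -> bool:
    """Average the magnitude spectra of successive frames and compare against
    an expected siren spectrum, both volume-normalized. Assumes the audio is
    at least one frame long and expected_spectrum has length frame//2 + 1."""
    frames = [audio[i:i + frame] for i in range(0, len(audio) - frame + 1, frame)]
    mean_spec = np.mean([np.abs(np.fft.rfft(f)) for f in frames], axis=0)
    mean_spec = mean_spec / np.linalg.norm(mean_spec)
    template = expected_spectrum / np.linalg.norm(expected_spectrum)
    return float(np.linalg.norm(mean_spec - template)) <= threshold
```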
  • the signal processing module 610 can additionally analyze audio signals to determine an angle between the autonomous vehicle 105 and a source of a sound detected by the audio sensors. Based on the known spacing distance 504 between adjacent sensors, some implementations of the signal processing module 610 calculate an angle of incidence of sound on the audio sensor array 430. For example, the autonomous vehicle 105 determines a phase shift angle Δφ between the audio signal generated by a first sensor in the array and the audio signal generated by a second sensor in the array offset by distance d from the first sensor, for a sound signal with frequency f, and applies the phase shift angle to determine the angle of incidence θ of the sound signal on the audio sensor array 430 as follows: θ = arcsin(c·Δφ / (2π·f·d)), where c is the speed of sound.
  • the autonomous vehicle 105 can then calculate an angle between the autonomous vehicle 105 and a source of the siren (presumably an emergency vehicle) by computing a sum of the angle of incidence and the known angle at which the audio sensor array 430 is positioned with respect to the autonomous vehicle.
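A sketch of this bearing computation under standard linear-array geometry; the assumed speed of sound, function names, and clipping of the arcsine argument are illustrative:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s near 20 °C (assumed value)

def angle_of_incidence(phase_shift: float, freq_hz: float, spacing_m: float) -> float:
    """Angle of incidence (radians) on a sensor pair: the path difference
    d*sin(theta) corresponds to a phase shift of 2*pi*f*d*sin(theta)/c."""
    arg = SPEED_OF_SOUND * phase_shift / (2.0 * np.pi * freq_hz * spacing_m)
    return np.arcsin(np.clip(arg, -1.0, 1.0))

def bearing_to_siren(phase_shift: float, freq_hz: float, spacing_m: float,
                     array_mount_angle: float) -> float:
    """Bearing of the siren relative to the vehicle: the incidence angle plus
    the known mounting angle of the array, per the text above."""
    return angle_of_incidence(phase_shift, freq_hz, spacing_m) + array_mount_angle
```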
  • the signal processing module 610 can additionally or alternatively determine a distance to the source of an audio signal based on a phase or time difference between signals detected at opposite sides of the autonomous vehicle (e.g., at an audio sensor on a left side of the vehicle and an audio sensor on the right side of the vehicle, or at an audio sensor on a front of the vehicle and an audio sensor on a rear of the vehicle).
  • the autonomous vehicle 105 further includes one or more light sensors that are configured to detect emergency vehicle lights, and a difference between a time at which the lights are detected and a time at which the siren sound reaches the autonomous vehicle 105 can be used to estimate the distance to the emergency vehicle.
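A sketch of the light/sound arrival-lag distance estimate, treating the light as arriving effectively instantaneously (an assumption):

```python
SPEED_OF_SOUND = 343.0  # m/s (assumed)

def distance_from_light_sound_lag(t_lights_s: float, t_siren_s: float) -> float:
    """Estimate the distance (m) to an emergency vehicle from the lag between
    detecting its lights and hearing its siren, as described above."""
    return SPEED_OF_SOUND * max(t_siren_s - t_lights_s, 0.0)

# e.g., a 0.5 s lag implies the source is roughly 170 m away:
# distance_from_light_sound_lag(10.0, 10.5) -> 171.5
```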
  • Some implementations of the signal processing module 610 are further configured to determine whether an emergency vehicle is moving towards or away from the autonomous vehicle 105 by analyzing audio signals for the Doppler effect.
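A sketch of one Doppler-based approach/recede check, assuming the siren's rest frequency is known (e.g., from the siren data repository 615) and neglecting the autonomous vehicle's own motion:

```python
SPEED_OF_SOUND = 343.0  # m/s (assumed)

def radial_speed(observed_hz: float, rest_hz: float) -> float:
    """Speed of the siren source along the line of sight, positive when
    approaching, from the Doppler relation f_obs = f_src * c / (c - v)."""
    return SPEED_OF_SOUND * (1.0 - rest_hz / observed_hz)

def is_approaching(observed_hz: float, rest_hz: float) -> bool:
    """An observed frequency above the rest frequency implies approach."""
    return observed_hz > rest_hz
```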
  • the siren detection module 625 uses the siren classification model 620 and/or the outputs of the signal processing module 610 to detect an emergency vehicle siren in an environment around the autonomous vehicle 105.
  • the siren detection module 625 can take, as input, one or more audio signals and/or features of audio signals generated by the signal processing module 610, and produce as output a determination indicating whether an emergency vehicle siren has been detected.
  • the siren classification model 620 is a trained machine learning model, such as a neural network, that is trained to classify audio signals or portions of audio signals as either likely to be an emergency vehicle siren or not likely to be an emergency vehicle siren.
  • a "model,” as used herein, refers to a construct that is trained using training data to make predictions or provide probabilities for new data items, whether or not the new data items were included in the training data.
  • training data for supervised learning can include items with various parameters and an assigned classification.
  • a new data item can have parameters that a model can use to assign a classification to the new data item.
  • a model can be a probability distribution resulting from the analysis of training data, such as a likelihood of an emergency vehicle siren being nearby based on an analysis of a large set of audio signal data that includes audio signals indicative of and not indicative of the presence of an emergency vehicle siren.
  • Examples of models include: neural networks, support vector machines, decision trees, decision tree forests, Parzen windows, Bayes clustering, reinforcement learning, and probability distributions, among others. Models can be configured for various situations, data types, sources, and output formats.
  • the siren classification model 620 can include a neural network with multiple input nodes that receive an input data point or signal, such as a signal received from an audio sensor associated with the autonomous vehicle 105.
  • the input nodes can correspond to functions that receive the input and produce results. These results can be provided to one or more levels of intermediate nodes that each produce further results based on a combination of lower-level node results.
  • a weighting factor can be applied to the output of each node before the result is passed to the next layer node.
  • at the output layer, one or more nodes can produce a value classifying the input that, once the model is trained, can be used to cause an output in the autonomous vehicle 105.
  • such neural networks can have multiple layers of intermediate nodes with different configurations, can be a combination of models that receive different parts of the input and/or input from other parts of the deep neural network, or can be recurrent, partially using output from previous iterations of applying the model as further input to produce results for the current input.
  • a machine learning model can be trained with supervised learning, where the training data includes inputs and desired outputs.
  • the inputs can include, for example, the different partial or complete siren sounds generated by different emergency vehicles.
  • Example outputs used for training can include an indication of whether an emergency vehicle was present at the time the training inputs were collected and/or a classification of a type of the emergency vehicle that was present.
  • Output from the model can be compared to the desired output for the corresponding inputs and, based on the comparison, the model can be modified, such as by changing weights between nodes of the neural network or parameters of the functions used at each node in the neural network (e.g., applying a loss function).
  • the model can be trained to evaluate new data points (such as new audio signals) to generate the outputs.
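A minimal supervised-training sketch in Python using PyTorch; the architecture, feature dimension, and hyperparameters are illustrative assumptions, not values from the patent:

```python
import torch
import torch.nn as nn

class SirenClassifier(nn.Module):
    """Small binary classifier over pre-extracted audio features (assumed)."""
    def __init__(self, n_features: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),  # logit: siren vs. no siren
        )

    def forward(self, x):
        return self.net(x)

def train(model: nn.Module, loader, epochs: int = 10, lr: float = 1e-3) -> nn.Module:
    """Compare model output to the desired output and update the weights via a
    loss function, as described in the text above."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for features, labels in loader:  # labels: 1.0 = siren present, 0.0 = absent
            opt.zero_grad()
            loss = loss_fn(model(features).squeeze(-1), labels)
            loss.backward()
            opt.step()
    return model
```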
  • the siren classification model 620 can be trained to identify sirens produced by different types of emergency vehicles in different jurisdictions.
  • emergency vehicle sirens include a short sound pattern that is repeated periodically while the siren is activated.
  • the sound pattern produced by a given type of siren in a given jurisdiction may be different from the sound pattern produced by a different type of siren or in a different jurisdiction.
  • the sirens of an ambulance and a police vehicle within the same jurisdiction are typically different from one another.
  • ambulances in different countries can have different sirens.
  • An operator of an emergency vehicle can also manually operate the vehicle’s siren (e.g., turning it on and off a few times in short succession), generating a unique audio pattern.
  • the siren classification model 620 can be trained with a training set that includes audio signals produced by an autonomous vehicle’s audio sensors in the presence of different types of sirens and different lengths of siren sound patterns, as well as sirens produced from different angles around the vehicle, different distances from the vehicle, and/or at different volumes.
  • the audio signals used for the training data can be audio signals produced by the sensors in the autonomous vehicle 105 in which the trained model is to be deployed, or by sensors in another vehicle or set of vehicles (such as sensors in a test vehicle).
  • the siren detection module 625 applies the siren classification model 620 to one or more audio signals received from the audio sensors in the vehicle. When applied, the siren classification model 620 outputs a second determination of whether the audio signal is likely indicative of or not indicative of an emergency vehicle siren.
  • the siren detection module 625 can use the siren classification model 620 to continuously process real-time audio signals while the vehicle is operated. As the data is captured, it can be input to the siren classification model 620 to detect or classify audio signals as either being indicative of or not indicative of a presence of an emergency vehicle siren, as sketched below.
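A sketch of this continuous windowed classification; the window and hop lengths are illustrative assumptions:

```python
import numpy as np
from typing import Callable, Iterator, Tuple

def stream_classify(classify: Callable[[np.ndarray], bool], audio: np.ndarray,
                    sample_rate: int = 48_000, window_s: float = 1.0,
                    hop_s: float = 0.5) -> Iterator[Tuple[float, bool]]:
    """Slide a window over captured audio and apply the trained classifier to
    each window, yielding (time in seconds, siren detected) pairs."""
    window, hop = int(window_s * sample_rate), int(hop_s * sample_rate)
    for start in range(0, len(audio) - window + 1, hop):
        yield start / sample_rate, classify(audio[start:start + window])
```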
  • the siren detection module 625 assesses whether an emergency vehicle is likely to be present in the environment of the autonomous vehicle 105 using both the first determination output by the signal processing module 610 and the second determination output based on application of the trained siren classification model 620. If either the first determination or the second determination indicates that an emergency vehicle is present, the siren detection module 625 outputs an alert of the emergency vehicle’s presence that can be used to guide the autonomous vehicle 105 out of the path of the emergency vehicle.
  • FIG. 7 is a flowchart illustrating a process 700 for detecting emergency vehicles, according to some implementations.
  • the process 700 can be performed by the autonomous vehicle 105, for example by one or more processors in the vehicle 105 executing computer-readable instructions.
  • Other implementations of the process 700 can include additional, fewer, or different steps, and can perform the steps in different orders than shown.
  • the autonomous vehicle 105 receives one or more audio signals generated by respective audio sensors disposed on or in the autonomous vehicle.
  • the audio sensors are configured to detect audio in an environment around the autonomous vehicle and generate the audio signal based on the detected audio.
  • the autonomous vehicle 105 compares a time domain or frequency domain representation of the one or more audio signals to a corresponding representation of a known emergency vehicle siren. The comparison causes the autonomous vehicle 105 to output a first determination indicating whether the one or more audio signals are indicative of an emergency vehicle siren in an environment around the autonomous vehicle.
  • the autonomous vehicle 105 applies a trained machine learning model, such as a neural network, to the one or more audio signals. Applying the trained model causes the processor to output a second determination indicating whether the audio signals are indicative of an emergency vehicle siren in the environment.
  • the autonomous vehicle 105 determines at block 708 if an emergency vehicle siren is present. In some cases, the autonomous vehicle 105 decides the emergency vehicle siren is present if either the first determination or the second determination indicates the audio signals are indicative of an emergency vehicle siren in the environment around the autonomous vehicle. In other cases, the autonomous vehicle 105 processes the first and second determinations as binary values, where a value of 0 indicates an emergency vehicle siren is not present and a value of 1 indicates a siren is present. At decision block 708, the autonomous vehicle 105 computes a weighted sum of the first and second determinations.
  • If the weighted sum satisfies a threshold, the autonomous vehicle 105 determines an emergency vehicle siren has been detected.
  • the weights applied in the weighted sum can be configurable fixed weights or weights that are dynamically generated based on, for example, a duration of the captured audio signal that is being evaluated.
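A sketch of the weighted-sum fusion at decision block 708; the weights and threshold shown are illustrative, and with these values the rule reduces to the either/or behavior described above:

```python
def siren_detected(first: int, second: int, w1: float = 0.5, w2: float = 0.5,
                   threshold: float = 0.5) -> bool:
    """Fuse the two binary determinations (0 = no siren, 1 = siren) with a
    weighted sum and compare against a threshold. With equal weights of 0.5
    and a threshold of 0.5, this is a logical OR of the two determinations."""
    return w1 * first + w2 * second >= threshold
```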
  • If the autonomous vehicle 105 determines an emergency vehicle siren is present, the autonomous vehicle 105 performs an action at block 710. For example, a vehicle control system causes the autonomous vehicle 105 to slow or pull to a side of the road to allow the emergency vehicle to pass. As another example, in implementations where the autonomous vehicle 105 is not operated fully autonomously, an alert can be output to a driver of the autonomous vehicle to navigate the autonomous vehicle to the side of the road or away from the path of the emergency vehicle.
  • the autonomous vehicle 105 determines a distance to the source of the emergency vehicle siren based, for example, on a difference between audio signals detected on opposite sides of the autonomous vehicle or based on a difference between when an emergency vehicle light pattern was detected and when the emergency vehicle siren was detected.
  • the autonomous vehicle 105 can also determine whether the source of the siren is moving towards or away from the autonomous vehicle 105. When the direction of and distance to the source of the siren is calculated, the autonomous vehicle may not take the action at block 710 unless the emergency vehicle is moving towards the autonomous vehicle 105, and/or the emergency vehicle is within a threshold distance of the autonomous vehicle 105.
  • the threshold distance can be calculated based on the time estimated for the autonomous vehicle 105 to move out of a pathway of an approaching emergency vehicle. For example, the threshold distance can be calculated as the distance needed at a typical vehicle velocity for the autonomous vehicle to clear a complex intersection and pull over to make way for an approaching emergency vehicle travelling at 90 MPH (40.23 meters/second) with sirens enabled, approaching the same intersection.
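A sketch of the threshold-distance arithmetic using the 90 MPH example above; the clearing-time input is an assumed planner-supplied estimate, as the patent gives no explicit formula:

```python
EV_SPEED_MPS = 40.23  # 90 MPH, the example speed from the text above

def threshold_distance(clearing_time_s: float,
                       ev_speed_mps: float = EV_SPEED_MPS) -> float:
    """Distance an approaching emergency vehicle covers while the AV clears
    the intersection and pulls over; act on sirens detected within it."""
    return ev_speed_mps * clearing_time_s

# e.g., if clearing takes about 15 s, act on sirens within ~600 m:
# threshold_distance(15.0) -> 603.45
```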
  • the autonomous vehicle’s use of both machine learning-based siren detection and time domain- or frequency domain-based detection enables the autonomous vehicle to more accurately identify sirens based on audio signals.
  • the time domain- or frequency domain-based methods may be more accurate than a trained model when detecting sirens that are allowed to run continuously across multiple periods of the siren’s sound pattern, for well-known sound patterns.
  • the trained model may be more accurate when the autonomous vehicle encounters a new type of siren or when the siren is manually operated in short bursts that are shorter than the period of the siren’s sound pattern.
  • the autonomous vehicle 105 can more readily ensure that it correctly detects the presence of an emergency vehicle siren and takes appropriate action in response.
  • the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to.”
  • the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof.
  • the words “herein,” “above,” “below,” and words of similar import can refer to this application as a whole and not to any particular portions of this application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)

Abstract

An autonomous vehicle includes audio sensors configured to detect audio in an environment around the autonomous vehicle and to generate audio signals based on the detected audio. A processor in the autonomous vehicle receives the audio signals and compares a time domain or frequency domain representation of the audio signals to a corresponding representation of a known emergency vehicle siren. The comparison causes the processor to output a first determination indicating whether the audio signals are indicative of an emergency vehicle siren. The processor also applies a trained neural network to the audio signals, causing the processor to output a second determination indicating whether the audio signals are indicative of the emergency vehicle siren. If either the first determination or the second determination indicates the presence of an emergency vehicle siren in the environment around the autonomous vehicle, the autonomous vehicle is caused to perform an action.
PCT/US2022/075549 2021-08-27 2022-08-26 Emergency siren detection in autonomous vehicles WO2023028610A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202280072475.4A 2021-08-27 2022-08-26 Emergency siren detection in autonomous driving vehicles
EP22777879.2A 2021-08-27 2022-08-26 Emergency siren detection in autonomous vehicles

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163238089P 2021-08-27 2021-08-27
US63/238,089 2021-08-27
US17/822,735 2022-08-26
US17/822,735 US20230065647A1 (en) 2021-08-27 2022-08-26 Emergency siren detection in autonomous vehicles

Publications (1)

Publication Number Publication Date
WO2023028610A1 true WO2023028610A1 (fr) 2023-03-02

Family

ID=83457386

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/075549 WO2023028610A1 (fr) Emergency siren detection in autonomous vehicles

Country Status (1)

Country Link
WO (1) WO2023028610A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180211528A1 (en) * 2017-01-26 2018-07-26 Magna Electronics Inc. Vehicle acoustic-based emergency vehicle detection
US20190220248A1 (en) * 2019-03-27 2019-07-18 Intel Corporation Vehicle with external audio speaker and microphone
US20210125494A1 (en) * 2019-10-23 2021-04-29 Zoox, Inc. Emergency vehicle detection
US20210233554A1 (en) * 2020-01-24 2021-07-29 Motional Ad Llc Detection and classification of siren signals and localization of siren signal sources

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220126866A1 (en) * 2020-10-23 2022-04-28 Tusimple, Inc. Safe driving operations of autonomous vehicles
US11884298B2 (en) * 2020-10-23 2024-01-30 Tusimple, Inc. Safe driving operations of autonomous vehicles

Similar Documents

Publication Publication Date Title
EP3472680B1 (fr) Vehicle control systems
US20220365530A1 (en) Systems and methods for operating an autonomous vehicle
US20220379924A1 (en) Systems and methods for operating an autonomous vehicle
US20230123611A1 (en) Systems and methods for operating an autonomous vehicle
US11884298B2 (en) Safe driving operations of autonomous vehicles
US20230020966A1 (en) Systems and methods for operating an autonomous vehicle
US20220410894A1 (en) Systems and methods for operating an autonomous vehicle
CN115179860A (zh) Detection of small objects underneath an autonomous vehicle chassis
WO2023028610A1 (fr) Emergency siren detection in autonomous vehicles
EP4303538A1 (fr) System and method for optimized routing of autonomous vehicles with risk-aware maps
US20230065647A1 (en) Emergency siren detection in autonomous vehicles
US11955006B2 (en) Traffic light indication system with suppressed notification for a vehicle
US20230356744A1 (en) System and method for fleet scene inquiries
US20230399021A1 (en) Systems and methods for detecting restricted traffic zones for autonomous driving
EP4275959A1 (fr) Removable foot holds in a truck cab for an autonomous vehicle
US20230394190A1 (en) Method and system to mitigate aquaplaning
EP4261093B1 (fr) Method comprising detecting an abnormal operating condition of an autonomous vehicle
US20230367309A1 (en) System and method for predicting non-operational design domain (odd) scenarios
US12103522B2 (en) Operating a vehicle according to an artificial intelligence model
US20240132060A1 (en) Operating a Vehicle According to an Artificial Intelligence Model
CN117355451A (zh) Systems and methods for operating an autonomous driving vehicle
CN117545671A (zh) Systems and methods for operating an autonomous driving vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22777879

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022777879

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2022777879

Country of ref document: EP

Effective date: 20240327