US20180186369A1 - Collision Avoidance Using Auditory Data Augmented With Map Data

Info

Publication number
US20180186369A1
Authority
US
United States
Prior art keywords
vehicle
predicted location
microphones
predicted
sound
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/906,910
Inventor
Brielle Reiff
Madeline Jane Schrier
Nithika Sivashankar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Priority to US15/906,910
Assigned to FORD GLOBAL TECHNOLOGIES, LLC. Assignment of assignors' interest (see document for details). Assignors: SIVASHANKAR, NITHIKA; REIFF, BRIELLE; SCHRIER, MADELINE JANE
Publication of US20180186369A1
Legal status: Abandoned (current)

Classifications

    • B60W 30/09 Active safety: taking automatic action to avoid collision, e.g. braking and steering
    • B60W 30/0956 Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W 10/04 Conjoint control of vehicle sub-units including control of propulsion units
    • B60W 10/18 Conjoint control of vehicle sub-units including control of braking systems
    • B60W 10/20 Conjoint control of vehicle sub-units including control of steering systems
    • B60W 60/001 Drive control systems specially adapted for autonomous road vehicles: planning or execution of driving tasks
    • B62D 15/0265 Automatic obstacle avoidance by steering
    • G01S 5/16 Position-fixing by co-ordinating two or more direction or position line determinations using electromagnetic waves other than radio waves
    • G01S 5/28 Position-fixing using ultrasonic, sonic, or infrasonic waves by co-ordinating position lines of different shape, e.g. hyperbolic, circular, elliptical or radial
    • G05D 1/0088 Control of position, course, or altitude characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D 1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D 1/0274 Control of position or course in two dimensions specially adapted to land vehicles using mapping information stored in a memory device
    • G08G 1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G 1/0965 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, responding to signals from another vehicle, e.g. an emergency vehicle
    • H04R 1/326 Arrangements for obtaining a desired directional characteristic, for microphones
    • Indexing codes: B60W 2420/10 (transducer, e.g. piezoelectric elements); B60W 2420/54 (audio sensitive means, e.g. ultrasound); B60W 2550/14; B60W 2552/00 (input parameters relating to infrastructure); B60W 2554/80 (spatial relation or speed relative to objects); B60W 2556/20 (data confidence level); B60W 2556/45 (external transmission of data to or from the vehicle); B60W 2556/50 (external transmission of data for navigation systems); B60W 2710/18 (braking system); B60W 2710/20 (steering systems); B60W 2720/10 (longitudinal speed); G01S 2013/9316 (anti-collision radar for land vehicles combined with communication equipment with other vehicles or with base stations); G01S 2013/9322 (anti-collision radar using additional data, e.g. driver condition, road state, or weather data); G01S 2013/9357; G01S 2013/936; G05D 2201/0213 (road vehicle, e.g. car or truck); H04R 2499/13 (acoustic transducers and sound field adaptation in vehicles)


Abstract

A controller for an autonomous vehicle receives audio signals from one or more microphones and identifies sounds. The controller further identifies an estimated location of the sound origin and the type of the sound source, i.e., whether the source is a vehicle and, if so, the type of vehicle. The controller analyzes map data and attempts to identify a landmark within a tolerance of the estimated location. If a landmark is found corresponding to the estimated location and type of the sound origin, then the certainty that the source of the sound is at that location and is of that type is increased. Collision avoidance is then performed with respect to the location and type of the sound origin, with the certainty as augmented using the map data. Collision avoidance may include automatically actuating brake, steering, and accelerator actuators in order to avoid the location of the sound origin.

Description

    CROSS REFERENCE TO RELATED PATENT APPLICATION
  • The present application is a continuation of U.S. patent application Ser. No. 14/876,269, filed on Oct. 6, 2015, which is incorporated by reference in its entirety.
  • BACKGROUND Field of the Invention
  • This invention relates to performing obstacle avoidance in autonomous vehicles.
  • Background of the Invention
  • Autonomous vehicles are equipped with sensors that detect their environment. An algorithm evaluates the output of the sensors and identifies obstacles. A navigation system may then steer the vehicle, brake, and/or accelerate to both avoid the identified obstacles and reach a desired destination. Sensors may include imaging systems, e.g., video cameras, as well as RADAR or LIDAR sensors.
  • The systems and methods disclosed herein provide an improved approach for detecting obstacles.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order that the advantages of the invention will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through use of the accompanying drawings, in which:
  • FIG. 1 is a schematic block diagram of a system for implementing embodiments of the invention;
  • FIG. 2 is a schematic block diagram of an example computing device suitable for implementing methods in accordance with embodiments of the invention;
  • FIGS. 3A and 3B are diagrams illustrating obstacle detection using auditory and map data; and
  • FIG. 4 is a process flow diagram of a method for performing collision avoidance based on both auditory and map data in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • It will be readily understood that the components of the present invention, as generally described and illustrated in the Figures herein, could be arranged and designed in a wide variety of different configurations. Thus, the following more detailed description of the embodiments of the invention, as represented in the Figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of certain examples of presently contemplated embodiments in accordance with the invention. The presently described embodiments will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout.
  • Embodiments in accordance with the present invention may be embodied as an apparatus, method, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “module” or “system.” Furthermore, the present invention may take the form of a computer program product embodied in any tangible medium of expression having computer-usable program code embodied in the medium.
  • Any combination of one or more computer-usable or computer-readable media may be utilized. For example, a computer-readable medium may include one or more of a portable computer diskette, a hard disk, a random access memory (RAM) device, a read-only memory (ROM) device, an erasable programmable read-only memory (EPROM or Flash memory) device, a portable compact disc read-only memory (CDROM), an optical storage device, and a magnetic storage device. In selected embodiments, a computer-readable medium may comprise any non-transitory medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++, or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on a computer system as a stand-alone software package, on a stand-alone hardware unit, partly on a remote computer spaced some distance from the computer, or entirely on a remote computer or server. In the latter scenario, the remote computer may be connected to the computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • The present invention is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions or code. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a non-transitory computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • Referring to FIG. 1, a controller 102 may be housed within a vehicle. The vehicle may include any vehicle known in the art. The vehicle may have all of the structures and features of any vehicle known in the art, including wheels, a drive train coupled to the wheels, an engine coupled to the drive train, a steering system, a braking system, and other systems known in the art to be included in a vehicle.
  • As discussed in greater detail herein, the controller 102 may perform autonomous navigation and collision avoidance. In particular, auditory and map data may be analyzed to identify potential obstacles.
  • The controller 102 may include or access a database 104 housed in the vehicle or otherwise accessible by the controller 102. The database 104 may include data sufficient to enable identification of an obstacle using map data. For example, sound data 106 may contain data describing sounds generated by one or more types of vehicles or other potential obstacles. For example, sound data 106 may include samples of the sounds made by one or more types of vehicles, animals (e.g. a dog barking), people conversing, and the like. Alternatively, sound data 106 may contain data describing such sounds, such as a spectrum of such sounds, or other data derived from a recording of such sounds.
  • The database 104 may further include map data 108. The map data 108 may include maps of the region in which the vehicle is located, such as the city, state, or country. The maps may include data describing roads, landmarks, businesses, public buildings, etc. In particular, the map data 108 may include the locations of emergency vehicle stations (fire stations, hospitals with ambulance service, police stations, etc.).
  • In some embodiments, the controller 102 may periodically connect to a network 110, such as the Internet or other network. The controller 102 may retrieve some or all of the data stored in the database 104 from one or more servers 112 hosting or accessing a database 114 storing such information. For example, sound signatures or samples of sounds of one or more vehicles or other potential obstacles may be retrieved from the database 114. Likewise, current map data 108 may be periodically retrieved from a database 114.
  • The controller 102 may receive one or more image streams from one or more imaging devices 116. For example, one or more cameras may be mounted to the vehicle and output image streams received by the controller 102.
  • The controller 102 may further receive audio signals from one or more microphones 118. The one or more microphones 118 may be an array of microphones offset from one another such that differences in amplitude and time of arrival of a sound may be used to determine one or both of the direction to a source of the sound and the distance to that source. The one or more microphones may be directional microphones that are more sensitive to sounds originating from a particular direction. The microphones 118, and the circuits or algorithms used to derive one or both of the distance and direction to a source of a sound, may be according to any method known in the art of SONAR or any other approach known in the art for identifying the location of a source of sound.
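As a concrete illustration of the time-of-arrival approach described above, the following sketch estimates a bearing from the delay between two microphone signals. It is a minimal, hypothetical example, not the disclosed implementation: the cross-correlation delay estimator, the 0.5 m microphone spacing, and the far-field bearing formula are all assumptions.

```python
# Hypothetical sketch: bearing estimation from the time difference of arrival
# (TDOA) at a two-microphone pair, in the spirit of the array 118.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C


def tdoa_seconds(sig_a: np.ndarray, sig_b: np.ndarray, sample_rate: int) -> float:
    """Estimate the arrival-time difference between two microphone signals
    from the peak of their cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # lag in samples; positive: a lags b
    return lag / sample_rate


def bearing_degrees(delay_s: float, mic_spacing_m: float) -> float:
    """Convert a TDOA into a bearing relative to the array broadside,
    under a far-field assumption; the ratio is clipped to stay in [-1, 1]."""
    ratio = np.clip(delay_s * SPEED_OF_SOUND / mic_spacing_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))


# Demo: a broadband source arriving ~0.3 ms later at microphone A.
fs = 48_000
rng = np.random.default_rng(0)
source = rng.normal(size=fs // 10)         # 100 ms of noise-like sound
mic_b = source
mic_a = np.roll(source, int(0.0003 * fs))  # delayed copy at microphone A
print(bearing_degrees(tdoa_seconds(mic_a, mic_b, fs), mic_spacing_m=0.5))
```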
  • The controller may execute a collision avoidance module 120 that receives the image streams and audio signals, identifies possible obstacles, and takes measures to avoid them. In the embodiments disclosed herein, image and auditory data are used to perform collision avoidance; however, other sensors for detecting obstacles, such as RADAR, LIDAR, and SONAR, may also be used.
  • The collision avoidance module 120 may include an obstacle identification module 122 a that analyzes the one or more image streams and identifies potential obstacles, including people, animals, vehicles, buildings, curbs, and other objects and structures. In particular, the obstacle identification module 122 a may identify vehicle images in the one or more image streams. The obstacle identification module 122 a may include a sound processing module 124 that identifies potential obstacles using the audio signals in combination with map data 108 and possibly the sound data 106. The method by which auditory and map data are used to identify potential obstacles is described in greater detail below.
  • The collision avoidance module 120 may further include a collision prediction module 122 b that predicts which identified obstacles are likely to collide with the vehicle based on its current trajectory or current intended path. A decision module 122 c may make a decision to stop, accelerate, turn, etc. in order to avoid the obstacles. The manner in which the collision prediction module 122 b predicts potential collisions and the manner in which the decision module 122 c takes action to avoid potential collisions may be according to any method or system known in the art of autonomous vehicles.
  • The decision module 122 c may control the trajectory of the vehicle by actuating one or more actuators 126 controlling the direction and speed of the vehicle. For example, the actuators 126 may include a steering actuator 128 a, accelerator actuator 128 b, and a brake actuator 128 c. The configuration of the actuators 128 a-128 c may be according to any implementation of such actuators known in the art of autonomous vehicles.
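The disclosure leaves the decision logic to known methods; as a purely illustrative sketch, a decision module along the lines of 122 c might map an avoidance decision onto actuator commands as follows. The ActuatorCommand fields, the 1.5 s time-to-collision threshold, and the fixed steering offsets are hypothetical.

```python
# Hypothetical sketch of a decision module translating an avoidance decision
# into commands for steering, accelerator, and brake actuators (cf. 128a-128c).
from dataclasses import dataclass


@dataclass
class ActuatorCommand:
    steering_angle_deg: float  # positive = steer left (assumed convention)
    throttle: float            # 0.0 .. 1.0
    brake: float               # 0.0 .. 1.0


def avoidance_command(time_to_collision_s: float,
                      obstacle_bearing_deg: float) -> ActuatorCommand:
    """Brake hard when a collision is imminent; otherwise steer away from the
    obstacle (bearing: positive = obstacle to the left, assumed) and slow."""
    if time_to_collision_s < 1.5:  # assumed emergency-braking threshold
        return ActuatorCommand(steering_angle_deg=0.0, throttle=0.0, brake=1.0)
    steer_away = -10.0 if obstacle_bearing_deg >= 0.0 else 10.0
    return ActuatorCommand(steering_angle_deg=steer_away, throttle=0.2, brake=0.0)


print(avoidance_command(time_to_collision_s=0.9, obstacle_bearing_deg=12.0))
```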
  • FIG. 2 is a block diagram illustrating an example computing device 200. Computing device 200 may be used to perform various procedures, such as those discussed herein. The controller 102 may have some or all of the attributes of the computing device 200.
  • Computing device 200 includes one or more processor(s) 202, one or more memory device(s) 204, one or more interface(s) 206, one or more mass storage device(s) 208, one or more Input/Output (I/O) device(s) 210, and a display device 230 all of which are coupled to a bus 212. Processor(s) 202 include one or more processors or controllers that execute instructions stored in memory device(s) 204 and/or mass storage device(s) 208. Processor(s) 202 may also include various types of computer-readable media, such as cache memory.
  • Memory device(s) 204 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 214) and/or nonvolatile memory (e.g., read-only memory (ROM) 216). Memory device(s) 204 may also include rewritable ROM, such as Flash memory.
  • Mass storage device(s) 208 include various computer readable media, such as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in FIG. 2, a particular mass storage device is a hard disk drive 224. Various drives may also be included in mass storage device(s) 208 to enable reading from and/or writing to the various computer readable media. Mass storage device(s) 208 include removable media 226 and/or non-removable media.
  • I/O device(s) 210 include various devices that allow data and/or other information to be input to or retrieved from computing device 200. Example I/O device(s) 210 include cursor control devices, keyboards, keypads, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, lenses, CCDs or other image capture devices, and the like.
  • Display device 230 includes any type of device capable of displaying information to one or more users of computing device 200. Examples of display device 230 include a monitor, display terminal, video projection device, and the like.
  • Interface(s) 206 include various interfaces that allow computing device 200 to interact with other systems, devices, or computing environments. Example interface(s) 206 include any number of different network interfaces 220, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet. Other interface(s) include user interface 218 and peripheral device interface 222. The interface(s) 206 may also include one or more peripheral interfaces such as interfaces for printers, pointing devices (mice, track pad, etc.), keyboards, and the like.
  • Bus 212 allows processor(s) 202, memory device(s) 204, interface(s) 206, mass storage device(s) 208, I/O device(s) 210, and display device 230 to communicate with one another, as well as other devices or components coupled to bus 212. Bus 212 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
  • For purposes of illustration, programs and other executable program components are shown herein as discrete blocks, although it is understood that such programs and components may reside at various times in different storage components of computing device 200, and are executed by processor(s) 202. Alternatively, the systems and procedures described herein can be implemented in hardware, or a combination of hardware, software, and/or firmware. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein.
  • Turning now to FIGS. 3A and 3B, in many instances a vehicle housing the controller 102 (hereinafter the vehicle 300) may be prevented from visually detecting a potential obstacle, such as another vehicle 302, by an occluding object 304 such as a building, tree, or sign. Accordingly, imaging devices 116 may not be effective at detecting such obstacles. However, the vehicle 300 may be close enough to detect sound generated by the other vehicle 302 or other obstacle. Although the methods disclosed herein are particularly useful where there is an occluding object 304, the identification of obstacles as described herein may also be performed where image data is available and may, for example, confirm the location of an obstacle that is also visible to imaging devices 116.
  • Audible signals detected from the other vehicle 302 or other obstacle, as shown in FIG. 3A, may be compared to map data as shown in FIG. 3B. For example, the position 306 of the vehicle 300 may be identified in the map using a GPS (global positioning system) receiver mounted to the vehicle 300, and landmarks in the region of the position 306 may be identified from map data. The identity and location of the occluding object 304 may also be identified. A landmark 308 corresponding to the vehicle 302 or other obstacle may be selected from the map data as corresponding to one or both of the direction and distance to a source of sound as detected using the one or more microphones 118. For example, a direction and/or location of a sound source as detected using the one or more microphones 118 may have an uncertainty or tolerance associated with it. The landmark 308 corresponding to the sound source may be selected due to the landmark 308 being positioned within that tolerance of the direction and/or location of the sound source as determined from the audio signals from the microphones 118.
  • For example, where the landmark 308 corresponding to a sound source is determined to be a parking garage, it may be inferred that a vehicle is exiting the parking garage, and measures may be taken to avoid it. Likewise, where the landmark 308 is an emergency vehicle station and the sound detected is a siren, it may be inferred that an emergency vehicle is leaving the station, and the vehicle 300 may pull over or otherwise take measures to avoid it. If the vehicle 300 is driving on a first road and the landmark 308 is a second road that intersects the first road, it may be inferred that a vehicle on the second road could be about to turn onto the first road.
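  • Such inferences amount to a lookup from a landmark type and a detected sound type to an avoidance behavior. A minimal sketch of such a rule table, with hypothetical type labels and behavior names, might be:

    # Hypothetical rule table; the labels are illustrative only.
    INFERENCE_RULES = {
        ("parking_garage", "vehicle"): "yield_to_exiting_vehicle",
        ("emergency_station", "siren"): "pull_over",
        ("intersecting_road", "vehicle"): "anticipate_turning_vehicle",
    }

    def infer_action(landmark_type, sound_type):
        # Returns None when no rule applies, so default collision
        # avoidance continues unchanged.
        return INFERENCE_RULES.get((landmark_type, sound_type))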
  • FIG. 4 illustrates a method 400 that may be executed by the controller 102 by processing audio signals from the one or more microphones 118.
  • The method 400 may include detecting 402 a sound and determining 404 one or more likely sources of the sound. For example, a waveform or spectrum of the sound may be compared to those of one or more sources in the sound data 106. Candidate sound sources may be identified at step 404 as those having a similarity to the detected sound that exceeds a threshold condition. A candidate sound source may be estimated to be a vehicle, person, animal, or other sound-producing entity for which sound data 106 is stored.
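  • One simple realization of this comparison is cosine similarity between the magnitude spectrum of the detected sound and stored spectra from the sound data 106. The sketch below assumes single-channel NumPy arrays and a hypothetical template dictionary (keyed by source name), so it is illustrative only; the stored templates are assumed to share the detected spectrum's length.

    import numpy as np

    def magnitude_spectrum(frame, n_fft=2048):
        # Magnitude spectrum of one mono audio frame.
        return np.abs(np.fft.rfft(frame, n=n_fft))

    def candidate_sources(detected_spec, templates, threshold=0.8):
        # Compare against stored source spectra (e.g., engine, siren) and
        # keep those whose cosine similarity exceeds the threshold.
        d = detected_spec / (np.linalg.norm(detected_spec) + 1e-12)
        matches = []
        for name, tmpl in templates.items():
            t = tmpl / (np.linalg.norm(tmpl) + 1e-12)
            sim = float(np.dot(d, t))
            if sim >= threshold:
                matches.append((name, sim))
        return sorted(matches, key=lambda m: -m[1])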
  • Some or all of the remaining steps of the method 400 may be executed for all sounds detected 402 or only for sounds corresponding to vehicles or other potential obstacles. Accordingly, if, at step 404, the sound is found not to match a vehicle or other potential obstacle, then the remaining steps of the method 400 may be omitted.
  • The method 400 may include one or both of estimating 406 a distance to the origin of the sound and estimating 408 a direction to the origin of the sound. In some instances, by determining differences in the time of arrival of the sound at offset microphones 118, both the distance to the origin and its direction may be determined simultaneously, i.e., a location estimate is derived. In other embodiments, separate microphones 118 or processing steps are used to estimate 406, 408 the distance and direction to the origin of the sound.
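  • The time-of-arrival technique mentioned above can be sketched as follows for a single pair of offset microphones: the inter-channel delay is estimated from the peak of the cross-correlation, and a far-field bearing follows from the microphone spacing. Bearings from two or more non-collinear pairs can then be intersected to estimate distance as well. The function names and constants below are illustrative assumptions, not the disclosed implementation.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

    def tdoa_seconds(chan_a, chan_b, sample_rate):
        # Inter-channel delay estimated from the cross-correlation peak.
        corr = np.correlate(chan_a, chan_b, mode="full")
        lag = int(np.argmax(corr)) - (len(chan_b) - 1)
        return lag / sample_rate

    def bearing_from_pair(tau, mic_spacing_m):
        # Far-field approximation: sin(theta) = c * tau / d.
        s = np.clip(SPEED_OF_SOUND * tau / mic_spacing_m, -1.0, 1.0)
        return float(np.degrees(np.arcsin(s)))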
  • The method 400 may include retrieving 410 map data in a region including the estimated location of the sound origin as determined at steps 406 and 408, and evaluating 412 whether the map data includes a landmark corresponding to the location and candidate source of the sound origin. For example, a landmark closest to the location of the sound origin may be identified 412. Where the candidate sound source is determined at step 404 to be a vehicle and a parking garage is within a specified tolerance of the location determined at steps 406 and 408, it may be determined that the parking garage is the landmark corresponding to the sound detected at step 402. As noted above, the tolerance may be a region or range of angles and distances corresponding to the uncertainty in determining the location, direction, and distance, respectively, of the sound origin. In another scenario, where an emergency vehicle station is within the tolerance of the sound origin and the candidate sound source is an emergency vehicle, it may be determined at step 412 that the landmark corresponding to the sound detected at step 402 is the emergency vehicle station.
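  • Step 412 can therefore be viewed as a filtered nearest-neighbor query: restrict the map landmarks to those within the tolerance of the estimated origin and of a type compatible with the candidate source, then take the closest. A minimal sketch follows, in which the COMPATIBLE table and the landmark record fields are assumed for illustration.

    import math

    # Hypothetical compatibility between candidate sources and landmark types.
    COMPATIBLE = {
        "vehicle": {"parking_garage", "road"},
        "emergency_vehicle": {"emergency_station"},
    }

    def corresponding_landmark(origin_xy, tolerance_m, candidate, landmarks):
        # Nearest landmark within the location tolerance whose type is
        # compatible with the candidate sound source (step 412).
        near = [lm for lm in landmarks
                if math.dist(origin_xy, lm["xy"]) <= tolerance_m
                and lm["type"] in COMPATIBLE.get(candidate, set())]
        return min(near, key=lambda lm: math.dist(origin_xy, lm["xy"]), default=None)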
  • If a corresponding landmark is identified at step 412, then the method 400 may include increasing 414 a certainty or confidence value indicating that a vehicle is located at the location determined at steps 406, 408. For example, a collision avoidance algorithm may identify potential obstacles. An obstacle may have a confidence value associated therewith that indicates the likelihood that an artifact in an image or detected in audio signals actually corresponds to a vehicle. Only those obstacles having a confidence value higher than a threshold may be considered for collision avoidance.
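  • The confidence handling of step 414 and the subsequent threshold test can be sketched with a simple additive update; the increment and threshold values below are illustrative assumptions rather than values taken from the disclosure.

    CONFIDENCE_THRESHOLD = 0.6   # illustrative value
    LANDMARK_BOOST = 0.25        # illustrative increment for step 414

    def update_confidence(obstacle, landmark_found):
        # Step 414: a corroborating landmark raises confidence, capped at 1.0.
        if landmark_found:
            obstacle["confidence"] = min(1.0, obstacle["confidence"] + LANDMARK_BOOST)
        return obstacle

    def obstacles_for_avoidance(obstacles):
        # Only sufficiently certain obstacles are passed to collision avoidance.
        return [o for o in obstacles if o["confidence"] >= CONFIDENCE_THRESHOLD]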
  • Alternatively, increasing 414 the certainty may increase options available to avoid the vehicle at the sound origin. For example, if a vehicle is detected but there is low certainty as to its location, a collision avoidance module 120 may slow down the vehicle in order to avoid a potential collision at a wide range of possible locations. However, if the location of the vehicle at the sound origin is known with high certainty (i.e. as increased at step 414), then the collision avoidance module 120 need only adjust speed and direction to avoid that known location along with any other identified obstacles.
  • The method 400 may further include increasing 416 certainty as to the candidate source of the sound based on the landmark identified at step 412. For example, if the candidate source of the sound is an emergency vehicle and the landmark determined at step 412 is an emergency vehicle station, then the confidence that the source of the sound was in fact an emergency vehicle may be increased 416. The collision avoidance module 120 may therefore take steps to pull over or otherwise avoid the emergency vehicle.
  • In either outcome of step 412, collision avoidance is performed 418 with respect to detected obstacles. As noted above, increasing 414, 416 the certainty as to the location and source of a sound may be used by the collision avoidance module 120 to avoid collisions. However, where no corresponding landmark is identified, the source of the sound detected at step 402 is not necessarily ignored during collision avoidance 418; rather, the range of its possible locations considered may simply be greater. This is particularly true where the candidate sound source determined at step 404 is a vehicle or person.
  • The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative, and not restrictive. The scope of the invention is, therefore, indicated by the appended claims, rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

1. A controller for an autonomous vehicle comprising:
one or more processing devices programmed to:
receive one or more audio streams from one or more microphones;
retrieve map data in a region about the autonomous vehicle;
detect, in the one or more audio streams, a probable vehicle-originated sound;
identify a first predicted location of the probable vehicle-originated sound;
retrieve map data in a region proximate the first predicted location; and
when the map data indicates a vehicle-specific landmark within a threshold distance from the first predicted location, perform obstacle avoidance with respect to the first predicted location.
2. The controller of claim 1, wherein the one or more processing devices are further programmed to invoke obstacle avoidance with respect to the first predicted location by actuating at least one of a steering actuator, accelerator actuator, and brake actuator of the autonomous vehicle effective to avoid the first predicted location.
3. The controller of claim 1, wherein:
the one or more microphones are an array of microphones and the one or more audio streams are a plurality of audio streams from the array of microphones;
the one or more processing devices are further programmed to identify the first predicted location of the probable vehicle-originated sound by comparing at least one of time of arrival and intensity of the probable vehicle-originated sound in the plurality of audio streams.
4. The controller of claim 1, wherein the one or more microphones are directional microphones.
5. The controller of claim 1, wherein the one or more processing devices are further programmed to:
identify landmarks in the region proximate the first predicted location in the map data;
identify, from the map data, the vehicle-specific landmark among the landmarks in the region proximate the first predicted location; and
select, as a probable location of another vehicle, the vehicle-specific landmark.
6. The controller of claim 5, wherein the vehicle-specific landmark is at least one of a parking lot and emergency vehicle station.
7. The controller of claim 6, wherein the one or more processing devices are further programmed to:
identify a predicted vehicle type from the probable vehicle-originated sound;
if the predicted vehicle type corresponds to the vehicle-specific landmark, confirm the predicted vehicle type.
8. The controller of claim 7, wherein the one or more processing devices are further programmed to:
if the predicted vehicle type corresponds to the vehicle-specific landmark and the vehicle-specific landmark is an emergency vehicle station, invoke pulling over and stopping of the autonomous vehicle.
9. The controller of claim 1, wherein the first predicted location is not within a line of sight of an imaging system of the autonomous vehicle.
10. The controller of claim 1, wherein the one or more processing devices are further programmed to:
receive outputs from one or more sensors, the one or more sensors including at least one of a Light Detection and Ranging (LIDAR) sensor and a Radio Detection and Ranging (RADAR) sensor;
detect an obstacle set in the outputs from the one or more sensors;
add the first predicted location to the obstacle set; and
perform obstacle avoidance with respect to the obstacle set.
11. An autonomous vehicle comprising:
a vehicle including an engine and wheels selectively coupled to the engine;
at least one of a steering actuator, accelerator actuator, and brake actuator;
one or more microphones;
a controller operably coupled to the one or more microphones and the at least one of the steering actuator, the accelerator actuator, and the brake actuator, the controller including one or more processing devices programmed to:
receive one or more audio streams from the one or more microphones;
detect, in the one or more audio streams, a probable vehicle-originated sound;
identify a first predicted location of the probable vehicle-originated sound;
retrieve map data in a region proximate the first predicted location;
when the map data indicates a vehicle-specific landmark within a threshold distance from the first predicted location, perform obstacle avoidance with respect to the first predicted location.
12. The autonomous vehicle of claim 11, wherein the one or more processing devices are further programmed to invoke obstacle avoidance with respect to the first predicted location by actuating at least one of a steering actuator, accelerator actuator, and brake actuator of the autonomous vehicle effective to avoid the first predicted location.
13. The autonomous vehicle of claim 11, wherein:
the one or more microphones are an array of microphones and the one or more audio streams are a plurality of audio streams from the array of microphones;
the one or more processing devices are further programmed to identify the first predicted location of the probable vehicle-originated sound by comparing at least one of time of arrival and intensity of the probable vehicle-originated sound in the plurality of audio streams.
14. The autonomous vehicle of claim 11, wherein the one or more microphones are directional microphones.
15. The autonomous vehicle of claim 11, wherein the one or more processing devices are further programmed to:
identify landmarks in the region proximate the first predicted location;
identify, from the map data, the vehicle-specific landmark among the landmarks in the region proximate the first predicted location; and
select, as a probable location of another vehicle, the vehicle-specific landmark.
16. The autonomous vehicle of claim 15, wherein the vehicle-specific landmark is at least one of a parking lot and emergency vehicle station.
17. The autonomous vehicle of claim 16, wherein the one or more processing devices are further programmed to:
identify a predicted vehicle type from the probable vehicle-originated sound;
if the predicted vehicle type corresponds to the vehicle-specific landmark, confirm the predicted vehicle type.
18. The autonomous vehicle of claim 17, wherein the one or more processing devices are further programmed to:
if the predicted vehicle type corresponds to the vehicle-specific landmark and the vehicle-specific landmark is an emergency vehicle station, invoke pulling over and stopping of the autonomous vehicle.
19. The autonomous vehicle of claim 11, wherein the first predicted location is not within a line of sight of an imaging system of the autonomous vehicle.
20. The autonomous vehicle of claim 11, wherein the one or more processing devices are further programmed to:
receive outputs from one or more sensors, the one or more sensors including at least one of a Light Detection and Ranging (LIDAR) sensor and a Radio Detection and Ranging (RADAR) sensor;
detect an obstacle set in the outputs from the one or more sensors;
add the first predicted location to the obstacle set; and
perform obstacle avoidance with respect to the obstacle set.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/906,910 US20180186369A1 (en) 2015-10-06 2018-02-27 Collision Avoidance Using Auditory Data Augmented With Map Data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/876,269 US9937922B2 (en) 2015-10-06 2015-10-06 Collision avoidance using auditory data augmented with map data
US15/906,910 US20180186369A1 (en) 2015-10-06 2018-02-27 Collision Avoidance Using Auditory Data Augmented With Map Data

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/876,269 Continuation US9937922B2 (en) 2015-10-06 2015-10-06 Collision avoidance using auditory data augmented with map data

Publications (1)

Publication Number Publication Date
US20180186369A1 true US20180186369A1 (en) 2018-07-05

Family

ID=57571233

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/876,269 Active 2036-03-25 US9937922B2 (en) 2015-10-06 2015-10-06 Collision avoidance using auditory data augmented with map data
US15/906,910 Abandoned US20180186369A1 (en) 2015-10-06 2018-02-27 Collision Avoidance Using Auditory Data Augmented With Map Data

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/876,269 Active 2036-03-25 US9937922B2 (en) 2015-10-06 2015-10-06 Collision avoidance using auditory data augmented with map data

Country Status (6)

Country Link
US (2) US9937922B2 (en)
CN (1) CN106560365B (en)
DE (1) DE102016118902A1 (en)
GB (1) GB2545053A (en)
MX (1) MX2016013080A (en)
RU (1) RU2016138295A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10495722B2 (en) * 2017-12-15 2019-12-03 Walmart Apollo, Llc System and method for automatic determination of location of an autonomous vehicle when a primary location system is offline
US11257242B2 (en) * 2018-12-31 2022-02-22 Wipro Limited Method and device for determining operation of an autonomous device
GB2611559A (en) * 2021-10-08 2023-04-12 Virtual Vehicle Res Gmbh Method and device to detect traffic hazards based on sound events

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10412368B2 (en) 2013-03-15 2019-09-10 Uber Technologies, Inc. Methods, systems, and apparatus for multi-sensory stereo vision for robotics
US10338225B2 (en) 2015-12-15 2019-07-02 Uber Technologies, Inc. Dynamic LIDAR sensor controller
US9996080B2 (en) * 2016-02-26 2018-06-12 Ford Global Technologies, Llc Collision avoidance using auditory data
US10281923B2 (en) 2016-03-03 2019-05-07 Uber Technologies, Inc. Planar-beam, light detection and ranging system
US20170329332A1 (en) * 2016-05-10 2017-11-16 Uber Technologies, Inc. Control system to adjust operation of an autonomous vehicle based on a probability of interference by a dynamic object
US9952317B2 (en) * 2016-05-27 2018-04-24 Uber Technologies, Inc. Vehicle sensor calibration system
US10479376B2 (en) 2017-03-23 2019-11-19 Uatc, Llc Dynamic sensor selection for self-driving vehicles
US10746858B2 (en) 2017-08-17 2020-08-18 Uatc, Llc Calibration for an autonomous vehicle LIDAR module
US10775488B2 (en) 2017-08-17 2020-09-15 Uatc, Llc Calibration for an autonomous vehicle LIDAR module
US10569784B2 (en) 2017-09-28 2020-02-25 Waymo Llc Detecting and responding to propulsion and steering system errors for autonomous vehicles
SE541252C2 (en) 2017-10-10 2019-05-14 Kai Elodie Abiakle Method for stopping a vehicle
US10914820B2 (en) 2018-01-31 2021-02-09 Uatc, Llc Sensor assembly for vehicles
CN110329260A (en) * 2018-03-28 2019-10-15 比亚迪股份有限公司 Vehicle travel control method, system and auxiliary driving controller
US20180224860A1 (en) * 2018-04-02 2018-08-09 GM Global Technology Operations LLC Autonomous vehicle movement around stationary vehicles
DE112019002668T5 (en) * 2018-05-25 2021-03-11 Sony Corporation ROAD-SIDE DEVICE AND VEHICLE-SIDE DEVICE FOR ROAD-TO-VEHICLE COMMUNICATION, AND ROAD-TO-VEHICLE COMMUNICATION SYSTEM
US10976748B2 (en) * 2018-08-22 2021-04-13 Waymo Llc Detecting and responding to sounds for autonomous vehicles
US10800409B2 (en) * 2018-09-04 2020-10-13 Caterpillar Paving Products Inc. Systems and methods for operating a mobile machine using detected sounds
JP7147513B2 (en) * 2018-11-29 2022-10-05 トヨタ自動車株式会社 INFORMATION PROVISION SYSTEM, SERVER, IN-VEHICLE DEVICE, AND INFORMATION PROVISION METHOD
US11567510B2 (en) 2019-01-24 2023-01-31 Motional Ad Llc Using classified sounds and localized sound sources to operate an autonomous vehicle
DE102019202634B3 (en) * 2019-02-27 2020-07-23 Zf Friedrichshafen Ag Method, control device for an automated road vehicle, computer program product for recognizing objects in road traffic and automated road vehicle for mobility services
JP7120077B2 (en) * 2019-02-27 2022-08-17 トヨタ自動車株式会社 driving support system
JP7133155B2 (en) * 2019-03-04 2022-09-08 トヨタ自動車株式会社 driving support system
CN110040134B (en) * 2019-03-13 2020-06-16 重庆邮电大学 Vehicle collision time calculation method considering environmental factors
JP7147648B2 (en) * 2019-03-20 2022-10-05 トヨタ自動車株式会社 Driving support device
US11209831B2 (en) 2019-05-03 2021-12-28 Ford Global Technologies, Llc Object sound detection
US11433886B2 (en) * 2019-06-24 2022-09-06 GM Global Technology Operations LLC System, vehicle and method for adapting a driving condition of a vehicle upon detecting an event in an environment of the vehicle
GB201910864D0 (en) * 2019-07-30 2019-09-11 Blackberry Ltd Processing data for driving automation system
US11328592B2 (en) * 2019-08-14 2022-05-10 Toyota Motor North America, Inc. Systems and methods for roadway obstruction detection
US11788859B2 (en) 2019-12-02 2023-10-17 Here Global B.V. Method, apparatus, and computer program product for road noise mapping
US11393489B2 (en) * 2019-12-02 2022-07-19 Here Global B.V. Method, apparatus, and computer program product for road noise mapping
US11295757B2 (en) 2020-01-24 2022-04-05 Motional Ad Llc Detection and classification of siren signals and localization of siren signal sources
US11851049B1 (en) * 2020-02-28 2023-12-26 Zoox, Inc. System to detect impacts
US11483649B2 (en) 2020-08-21 2022-10-25 Waymo Llc External microphone arrays for sound source localization
CN112298173B (en) * 2020-11-06 2021-12-21 吉林大学 Intelligent driving-oriented vehicle safe driving control system and control method
US11364910B1 (en) 2021-08-26 2022-06-21 Motional Ad Llc Emergency vehicle detection system and method

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5170352A (en) * 1990-05-07 1992-12-08 Fmc Corporation Multi-purpose autonomous vehicle with path plotting
US6084973A (en) * 1997-12-22 2000-07-04 Audio Technica U.S., Inc. Digital and analog directional microphone
US6285771B1 (en) * 1996-12-31 2001-09-04 Etymotic Research Inc. Directional microphone assembly
US6529831B1 (en) * 2000-06-21 2003-03-04 International Business Machines Corporation Emergency vehicle locator and proximity warning system
US20050143918A1 (en) * 2003-12-29 2005-06-30 Hilliard Donald P. GPS collision avoidance apparatus
US20060058920A1 (en) * 2004-09-10 2006-03-16 Honda Motor Co., Ltd. Control apparatus for movable robot
US7116792B1 (en) * 2000-07-05 2006-10-03 Gn Resound North America Corporation Directional microphone system
US20090066538A1 (en) * 2006-06-21 2009-03-12 Dave Thomas Method and apparatus for object recognition and warning system of a primary vehicle for nearby vehicles
US20100217435A1 (en) * 2009-02-26 2010-08-26 Honda Research Institute Europe Gmbh Audio signal processing system and autonomous robot having such system
US20110077813A1 (en) * 2009-09-28 2011-03-31 Raia Hadsell Audio based robot control and navigation
US8072491B2 (en) * 2002-10-18 2011-12-06 Sony Corporation Information processing system and method, information processing apparatus, image-capturing device and method, recording medium, and program
US20130222127A1 (en) * 2012-02-16 2013-08-29 Bianca RAY AVALANI Intelligent driver assist system based on multimodal sensor fusion
US8571743B1 (en) * 2012-04-09 2013-10-29 Google Inc. Control of vehicles based on auditory signals
US8676427B1 (en) * 2012-10-11 2014-03-18 Google Inc. Controlling autonomous vehicle using audio data
US20150283703A1 (en) * 2014-04-03 2015-10-08 Brain Corporation Apparatus and methods for remotely controlling robotic devices
US20160026182A1 (en) * 2014-07-25 2016-01-28 Here Global B.V. Personalized Driving of Autonomously Driven Vehicles
US20160161271A1 (en) * 2014-12-09 2016-06-09 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle detection of and response to intersection priority
US20160217689A1 (en) * 2015-01-26 2016-07-28 Autoliv Asp, Inc. Supplemental automotive safety method and system
US9478139B2 (en) * 2014-12-25 2016-10-25 Automotive Research & Testing Center Driving safety system and barrier screening method thereof
US20170101093A1 (en) * 2015-10-13 2017-04-13 Verizon Patent And Licensing Inc. Collision prediction system
US20170120908A1 (en) * 2015-10-28 2017-05-04 Honda Motor Co., Ltd. Vehicle control apparatus, vehicle control method, and vehicle control program

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0592767U (en) 1992-05-18 1993-12-17 株式会社豊田中央研究所 Approaching vehicle recognition device
JPH06231388A (en) * 1993-01-29 1994-08-19 Sadayoshi Iwabuchi On-vehicle emergency vehicle presence/absence in forming device
JP5040237B2 (en) * 2006-09-29 2012-10-03 株式会社デンソー Vehicle travel determination device
JP4967927B2 (en) 2007-08-27 2012-07-04 日産自動車株式会社 Hearing monitor device for vehicle
DE102007058542A1 (en) 2007-12-06 2009-06-10 Robert Bosch Gmbh Driver assistance system for monitoring driving safety and corresponding method for detecting and evaluating a vehicle movement
DE102008003205A1 (en) * 2008-01-04 2009-07-09 Wabco Gmbh Device, method and computer program for collision avoidance or for reducing the collision severity as a result of a collision for vehicles, in particular commercial vehicles
US7791499B2 (en) * 2008-01-15 2010-09-07 Qnx Software Systems Co. Dynamic siren detection and notification system
JP5303998B2 (en) 2008-04-03 2013-10-02 日産自動車株式会社 Outside vehicle information providing apparatus and outside vehicle information providing method
WO2011001684A1 (en) * 2009-07-02 2011-01-06 パナソニック株式会社 Vehicle position detecting device and vehicle position detecting method
JP2011232292A (en) 2010-04-30 2011-11-17 Toyota Motor Corp Vehicle exterior sound detection device
US8521352B1 (en) * 2012-05-07 2013-08-27 Google Inc. Controlling a vehicle having inadequate map data
JP5888414B2 (en) 2012-05-25 2016-03-22 トヨタ自動車株式会社 Approaching vehicle detection device and driving support system
GB2511748B (en) * 2013-03-11 2015-08-12 Jaguar Land Rover Ltd Emergency braking system for a vehicle
JP2014211756A (en) * 2013-04-18 2014-11-13 トヨタ自動車株式会社 Driving assist device
KR101526668B1 (en) * 2013-06-10 2015-06-05 현대자동차주식회사 Apparatus for detecting accidental contact of the vehicle and method thereof
GB2521415B (en) * 2013-12-19 2020-03-04 Here Global Bv An apparatus, method and computer program for controlling a vehicle

Also Published As

Publication number Publication date
US20170096138A1 (en) 2017-04-06
US9937922B2 (en) 2018-04-10
DE102016118902A1 (en) 2017-04-06
GB2545053A (en) 2017-06-07
MX2016013080A (en) 2017-04-27
RU2016138295A (en) 2018-03-28
GB201616938D0 (en) 2016-11-16
CN106560365A (en) 2017-04-12
CN106560365B (en) 2021-06-22

Similar Documents

Publication Publication Date Title
US9937922B2 (en) Collision avoidance using auditory data augmented with map data
US9996080B2 (en) Collision avoidance using auditory data
US9873428B2 (en) Collision avoidance using auditory data
CN107527092B (en) Training algorithms for collision avoidance using auditory data
US9598076B1 (en) Detection of lane-splitting motorcycles
US10474964B2 (en) Training algorithm for collision avoidance
US10849543B2 (en) Focus-based tagging of sensor data
WO2019006743A1 (en) Method and device for controlling travel of vehicle
GB2559032A (en) Autonomous school bus
US11270689B2 (en) Detection of anomalies in the interior of an autonomous vehicle
JP2016130966A (en) Risk estimation device, risk estimation method and computer program for risk estimation
JP2020126634A (en) Method and apparatus for detecting emergency vehicle in real time and planning travel route for accommodating situation which may be caused by emergency vehicle
WO2020105347A1 (en) Automated delivery method based on occupancy prediction
US20170103270A1 (en) Self-Recognition of Autonomous Vehicles in Mirrored or Reflective Surfaces
US10768631B2 (en) Method and apparatus for controlling a mobile robot
WO2018017094A1 (en) Assisted self parking
US11904856B2 (en) Detection of a rearward approaching emergency vehicle
US11508118B2 (en) Provisioning real-time three-dimensional maps for autonomous vehicles
US20210183221A1 (en) Theft proof techniques for autonomous driving vehicles used for transporting goods
CN109765886B (en) Target track identification method followed by vehicle
US20210039660A1 (en) Anomaly Detector For Vehicle Control Signals
KR20200128469A (en) Collision Prevention Apparatus for Autonomous Vehicles
JP7371679B2 (en) Information processing device, information processing method, and information processing program
KR20200128467A (en) Recording Medium
KR20180086099A (en) Method and system for managing accident based on pass prediction

Legal Events

Date Code Title Description
AS Assignment

Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REIFF, BRIELLE;SCHRIER, MADELINE JANE;SIVASHANKAR, NITHIKA;SIGNING DATES FROM 20150825 TO 20150925;REEL/FRAME:045055/0174

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION