US10007262B1 - System for monitoring underneath an autonomous vehicle - Google Patents

System for monitoring underneath an autonomous vehicle

Info

Publication number
US10007262B1
US10007262B1
Authority
US
United States
Prior art keywords
autonomous vehicle
topography
ground surface
autonomous
underneath
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/392,478
Other versions
US20180181120A1 (en)
Inventor
Oliver Schwindt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH
Priority to US15/392,478 (US10007262B1)
Assigned to ROBERT BOSCH GMBH; assignment of assignors interest (see document for details); Assignors: SCHWINDT, OLIVER
Priority to EP17825223.5A (EP3563205B1)
Priority to JP2019535300A (JP6771673B2)
Priority to PCT/EP2017/084151 (WO2018122104A1)
Priority to KR1020197018607A (KR102460053B1)
Application granted
Publication of US10007262B1
Publication of US20180181120A1
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0055: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot with safety arrangements
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00: Arrangement of adaptations of instruments
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0055: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot with safety arrangements
    • G05D 1/0061: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot with safety arrangements for transition from automatic pilot to manual pilot and vice versa
    • G06K 9/6215
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00: Registering or indicating the working of vehicles
    • G07C 5/008: Registering or indicating the working of vehicles communicating information to a remotely located station
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures

Abstract

A method and system for monitoring an area underneath an autonomous vehicle. The method includes capturing a first topography of a ground surface underneath the autonomous vehicle with a sensor when the autonomous vehicle is stationary; storing the first topography of the ground surface in a memory prior to the autonomous vehicle being switched off; capturing a second topography of the ground surface underneath the autonomous vehicle with the sensor before the autonomous vehicle begins moving; and comparing, with an electronic processor, the first topography with the second topography. The method includes enabling autonomous driving of the autonomous vehicle when the first topography and the second topography match.

Description

BACKGROUND
Autonomous vehicles include external sensors that monitor the surroundings of the vehicle. When objects are detected by the sensors, control systems within the autonomous vehicle may provide automated maneuvering, stopping, and steering functions. However, external sensors are limited in their field of view and typically do not detect objects underneath the autonomous vehicle. In addition, these systems may demand a high level of system resources and processing time for detecting and tracking objects.
SUMMARY
One embodiment provides a method of monitoring an area underneath an autonomous vehicle. The method includes capturing a first topography of a ground surface underneath the autonomous vehicle with a sensor when the autonomous vehicle is stationary; storing the first topography of the ground surface in a memory prior to the autonomous vehicle being switched off; capturing a second topography of the ground surface underneath the autonomous vehicle with the sensor before the autonomous vehicle begins moving; and comparing, with an electronic processor, the first topography with the second topography. The method also includes enabling autonomous driving of the autonomous vehicle when the first topography and the second topography match.
Another embodiment provides a system for monitoring an area underneath an autonomous vehicle. The system includes a sensor with a field of view that extends underneath the autonomous vehicle, an input/output interface configured to communicatively connect to a notification device, and an electronic control unit with an electronic processor and a memory communicatively connected to the sensor and the input/output interface. The electronic processor is configured to capture a first topography of a ground surface underneath the autonomous vehicle with the sensor when the autonomous vehicle is stationary, store the first topography of the ground surface in the memory of the electronic control unit prior to the autonomous vehicle being switched off, and capture a second topography of the ground surface underneath the autonomous vehicle with the sensor when the autonomous vehicle is switched on. The electronic processor is also configured to compare the first topography with the second topography and enable autonomous driving of the autonomous vehicle when the first topography and the second topography match.
Other aspects, features, and embodiments will become apparent by consideration of the detailed description and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of an autonomous vehicle equipped with a system for monitoring an area underneath the autonomous vehicle according to one embodiment.
FIG. 2 is a block diagram of an electronic control unit of the system of FIG. 1 according to one embodiment.
FIG. 3 is a flowchart of a method of operating the autonomous vehicle of FIG. 1 according to one embodiment.
DETAILED DESCRIPTION
Before any embodiments are explained in detail, it is to be understood that this disclosure is not intended to be limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. Embodiments are capable of other configurations and of being practiced or carried out in various ways.
A plurality of hardware- and software-based devices, as well as a plurality of different structural components, may be used to implement various embodiments. In addition, embodiments may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware. However, one of ordinary skill in the art would recognize, based on a reading of this detailed description, that, in at least one embodiment, the electronic-based aspects of the invention may be implemented in software (for example, stored on a non-transitory computer-readable medium) executable by one or more processors. For example, the “control units” and “controllers” described in the specification can include one or more electronic processors, one or more memory modules including non-transitory computer-readable medium, one or more input/output interfaces, one or more application-specific integrated circuits (ASICs), and various connections (for example, a system bus) connecting the various components.
FIG. 1 illustrates an autonomous vehicle 100 equipped with a monitoring system 105 for monitoring an area underneath the autonomous vehicle 100 according to one embodiment. The autonomous vehicle 100, although illustrated as a four-wheeled vehicle, encompasses various types and designs. For example, the autonomous vehicle 100 may include an automobile, a motorcycle, a truck, a bus, a semi-tractor, and others. In the example illustrated, the monitoring system 105 includes an electronic control unit (ECU) 110, a sensor 115 (for example, radar, lidar, ultrasound, infrared, and others), and a notification device 120 (described in greater detail below).
The electronic control unit 110 is communicatively connected to the sensor 115 and the notification device 120. The electronic control unit 110 may be configured to communicate with the sensor 115 and the notification device 120 via various mechanisms or protocols. For example, the electronic control unit 110 and the sensor 115 may be directly wired, wired through a communication bus, or wirelessly connected (for example, via a wireless network). The electronic control unit 110 and the notification device 120 may be connected via similar connections as those listed above or may be connected via a wide area network (for example, the internet), a cellular network, or others. As discussed below, the electronic control unit 110 is configured to receive information from the sensor 115 regarding the surroundings of the autonomous vehicle 100 and to generate notifications to send to the notification device 120.
The notification device 120 may be of various different types and use various different technologies. In one example, the notification device 120 is mounted within the autonomous vehicle 100 and viewable by a user of the autonomous vehicle 100 (for example, mounted on the console, mounted within a seatback, mounted on the roof, etc.). In this case, the notification device 120 may include a display screen, a speaker, or other mechanism for creating an audial, visual, or haptic notification to the user. In other examples, the notification device 120 may be separate from the vehicle 100, have other functionalities, and be configured to communicate with the electronic control unit 110. For example, the notification device 120 may be a portable communication device of a user of the autonomous vehicle 100. In yet another example, the notification device 120 may be a computer terminal positioned at a remote monitoring service that controls, coordinates, or monitors performance of the autonomous vehicle 100. In some embodiments, the notification device 120 includes an input mechanism for receiving a message from a user (for example, an “all-clear” message) and may be configured to send the message back to the electronic control unit 110 of the autonomous vehicle 100.
FIG. 2 is a block diagram of an electronic control unit 110 of the monitoring system 105 according to one embodiment. The electronic control unit 110 includes a plurality of electrical and electronic components that provide power, operational control, and protection to the components and modules within the electronic control unit 110. The electronic control unit 110 includes, among other things, an electronic processor 210 (such as a programmable electronic microprocessor, microcontroller, or similar device), a memory 215, and an input/output interface 220. The term “memory 215” includes any type of non-transitory, machine-readable memory including various types of non-volatile memory. The memory 215 may include internal and external memory, hard drives, disks, and others. In some embodiments, the electronic control unit 110 includes additional, fewer, or different components. For example, the electronic control unit 110 may be implemented in several independent electronic control units or modules each configured to perform specific steps or functions of the electronic control unit 110.
The electronic processor 210, in coordination with the memory 215, the input/output interface 220, and other components of the electronic control unit 110, is configured to perform the processes and methods discussed herein. For example, the electronic processor 210 is configured to retrieve from memory 215 and execute, among other things, instructions related to receiving sensor data from the sensor 115, generating notifications for the notification device 120, and enabling/disabling autonomous control of the autonomous vehicle 100. The input/output interface 220 is configured to perform input/output functions for the electronic processor 210. For example, the input/output interface 220 is configured to communicate with the sensor 115 and the notification device 120.
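As an illustrative aid (not part of the patent disclosure), the following minimal Python sketch models the arrangement of the electronic processor 210, memory 215, and input/output interface 220; all class and callback names are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict


@dataclass
class ElectronicControlUnit:
    """Toy stand-in for ECU 110: the two callbacks model the input/output
    interface 220, and the dict models memory 215."""
    read_sensor: Callable[[], Any]    # input path from sensor 115
    notify: Callable[[str], None]     # output path to notification device 120
    memory: Dict[str, Any] = field(default_factory=dict)

    def store(self, key: str, value: Any) -> None:
        # A real ECU would persist this across ignition cycles.
        self.memory[key] = value

    def recall(self, key: str, default: Any = None) -> Any:
        return self.memory.get(key, default)


# Usage sketch: stubbed hardware callbacks in place of real devices.
ecu = ElectronicControlUnit(read_sensor=lambda: [[0.31, 0.30], [0.29, 0.30]],
                            notify=print)
ecu.store("first_topography", ecu.read_sensor())
```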
FIG. 3 is a flowchart of a method of operating the autonomous vehicle 100 according to one embodiment. In the illustrated method, the electronic processor 210 first determines whether the autonomous vehicle 100 is “stationary” (block 305). This may include determining whether the autonomous vehicle 100 is in any of the following states: parked, stopped, or shutting down. These conditions may be determined based on various operational parameters of the autonomous vehicle 100 including a location of the autonomous vehicle 100, an amount of time that the autonomous vehicle 100 has not been moving, arrival at a predetermined destination, and others. In some embodiments, the determination of being “stationary” occurs only under one particular condition. For example, in some embodiments, the electronic processor 210 sets a flag indicating that the autonomous vehicle 100 is “stationary” only when the autonomous vehicle 100 is parked (for example, in the parked gear). In other embodiments, the electronic processor 210 sets a flag indicating that the autonomous vehicle 100 is “stationary” only when the autonomous vehicle 100 has not been moving (for example, stopped) for more than a particular period of time. In yet other embodiments, the electronic processor 210 sets a flag indicating that the autonomous vehicle 100 is “stationary” only when the autonomous vehicle 100 is in the process of turning off (for example, in a power-down sequence).
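To make block 305 concrete, here is a hedged sketch of the stationary determination; the condition names, the 30-second stop period, and the function signature are assumptions for illustration, not details from the patent.

```python
import time
from enum import Enum, auto
from typing import Optional


class StationaryCondition(Enum):
    PARKED = auto()         # transmission in the parked gear
    STOPPED = auto()        # not moving for more than a set period
    SHUTTING_DOWN = auto()  # in a power-down sequence


def is_stationary(in_park: bool,
                  stopped_since: Optional[float],
                  powering_down: bool,
                  stop_period_s: float = 30.0,
                  required: Optional[StationaryCondition] = None) -> bool:
    """Return True when the vehicle counts as stationary (block 305).

    With `required` set, only that single condition raises the flag,
    mirroring the single-condition embodiments described above;
    otherwise any of the three conditions suffices.
    """
    now = time.monotonic()
    met = {
        StationaryCondition.PARKED: in_park,
        StationaryCondition.STOPPED: (stopped_since is not None
                                      and now - stopped_since > stop_period_s),
        StationaryCondition.SHUTTING_DOWN: powering_down,
    }
    return met[required] if required is not None else any(met.values())
```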
When the electronic processor 210 determines that the autonomous vehicle 100 is stationary, the electronic processor 210 captures a first topography of a ground surface underneath the autonomous vehicle 100 with the sensor 115 (block 310). The topography may be obtained by scanning the ground surface with the sensor 115. For example, the sensor 115 may sense the distance to various points of the ground surface using a series of radio-frequency reflections and create a topographical map based on the reflections. In some embodiments, the sensor 115 includes a lidar scanner or a high-resolution radar scanner that senses the elevation of the autonomous vehicle 100 at multiple points underneath the autonomous vehicle 100. In some embodiments, the sensor 115 includes an ultrasonic sensor, and the electronic processor 210 determines a distance to the ground at various points underneath the autonomous vehicle 100 using ultrasonic waves. In these embodiments, the electronic processor 210 may create a topographical map based on ultrasonic reflections received by the ultrasonic sensor. In other embodiments, the sensor 115 includes an infrared camera, and the electronic processor 210 detects temperature variations underneath the autonomous vehicle 100. The temperature variations may indicate the presence of living objects underneath the autonomous vehicle 100. In still other embodiments, the sensor 115 also scans the ground surface around the perimeter of the autonomous vehicle 100, and the electronic processor 210 includes the ground surface around the perimeter in the topographical map. The electronic processor 210 then stores the topography of the ground surface in the memory 215 (block 315).
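One simple way to represent such a topography is a grid of per-cell range readings. The sketch below assumes a hypothetical `measure_distance(row, col)` call standing in for the radar, lidar, or ultrasonic ranging interface, which the patent does not specify.

```python
from typing import Callable, List

Topography = List[List[float]]  # sensed distance to the ground, in meters


def capture_topography(measure_distance: Callable[[int, int], float],
                       rows: int = 8, cols: int = 16) -> Topography:
    """Scan a coarse grid of points under the vehicle (blocks 310/325),
    recording the sensed distance to the ground at each grid cell."""
    return [[measure_distance(r, c) for c in range(cols)]
            for r in range(rows)]


# Usage sketch: a flat-ground stub in place of real sensor readings.
flat_ground = capture_topography(lambda r, c: 0.30)
```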
The electronic processor 210 determines whether the autonomous vehicle 100 is about to begin moving (block 320). In some embodiments, this includes determining when the autonomous vehicle 100 is switched back on (for example, the vehicle's engine is turned on). In other embodiments, this includes determining whether the autonomous vehicle 100 is shifting into a drive gear or detecting another indication that the autonomous vehicle 100 is about to move. For example, in some embodiments, another vehicle controller in the autonomous vehicle 100 sends a signal to the electronic processor 210 indicating that the autonomous vehicle 100 is about to move. Before the autonomous vehicle 100 begins moving, the electronic processor 210 captures a second topography of the ground surface underneath the autonomous vehicle 100 (block 325). The second topography may be captured using the same technique as the first topography or a different one.
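The movement-onset check of block 320 reduces to a handful of triggers; the trigger names below are illustrative assumptions.

```python
def about_to_move(ignition_on: bool,
                  in_drive_gear: bool,
                  controller_signal: bool) -> bool:
    """Return True on any indication of imminent movement (block 320):
    the ignition switched back on, a shift into a drive gear, or a
    signal from another vehicle controller."""
    return ignition_on or in_drive_gear or controller_signal
```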
The electronic processor 210 then compares the first topography with the second topography (block 330). This may include determining one or more differences between the topographies. For example, the electronic processor 210 may overlay the first topography on the second topography and identify which regions or points in the topographies are not the same. In this way, the electronic processor 210 determines whether the first topography matches the second topography (block 335). In some embodiments, the electronic processor 210 determines whether the differences exceed a predetermined threshold. In this case, the electronic processor 210 flags the topographies as different when the differences exceed the threshold. In some embodiments, the electronic processor 210 determines a difference score indicative of an amount of differences detected between the first topography and the second topography. In this case, the electronic processor 210 determines whether the difference score exceeds a predetermined threshold.
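One plausible reading of blocks 330 and 335 is a cell-by-cell overlay that counts points differing by more than a small per-point tolerance and then tests that count against the predetermined threshold; the tolerance and threshold values below are assumptions, not figures from the patent.

```python
from typing import List

Topography = List[List[float]]


def difference_score(first: Topography, second: Topography,
                     point_tolerance_m: float = 0.02) -> int:
    """Overlay the topographies and count grid cells whose readings
    differ by more than a per-point tolerance (block 330)."""
    return sum(1
               for row_a, row_b in zip(first, second)
               for a, b in zip(row_a, row_b)
               if abs(a - b) > point_tolerance_m)


def topographies_match(first: Topography, second: Topography,
                       score_threshold: int = 3) -> bool:
    """Match when the difference score stays below the predetermined
    threshold (block 335)."""
    return difference_score(first, second) < score_threshold
```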
When the first topography does not match the second topography, the electronic processor 210 disables autonomous driving of the autonomous vehicle 100 (block 340). In some embodiments, the electronic processor 210 disables autonomous driving only when the difference score exceeds the threshold. In addition, the electronic processor 210 may send a notification to the notification device 120 indicating that autonomous driving is disabled (block 345). In some embodiments, the notification includes a prompt to check under the autonomous vehicle 100 for the presence of objects. Conversely, when the first topography matches the second topography, the electronic processor 210 enables autonomous driving of the autonomous vehicle 100 (block 350). In some embodiments, if autonomous driving has previously been disabled by the electronic processor 210 based on differences in the topographies, the electronic processor 210 re-enables autonomous driving when the topographies match. Once autonomous driving is disabled, the electronic processor 210 may continue to perform the method 300. In this case, if the topographies match upon a subsequent iteration of the method 300, the electronic processor 210 may then enable autonomous driving.
In some embodiments, however, when the topographies do not match, the electronic processor 210 may wait for an “all-clear” signal before enabling autonomous driving. For example, a passenger of the autonomous vehicle 100 may receive the notification that autonomous driving has been disabled (see block 345). As stated above, the notification may include a prompt to check under the autonomous vehicle 100 for the presence of objects. The passenger may then input a selection on the notification device 120 or another input mechanism indicating that the autonomous vehicle 100 is clear of objects. For example, the passenger may check under the autonomous vehicle 100 and determine that an object is present but does not pose a hazard to the autonomous vehicle 100 or to the object itself. This check prevents the autonomous vehicle 100 from running over a person, a pet, a foot, and so on. The notification device 120 then sends the all-clear signal to the electronic processor 210 based on the input from the passenger.
In another embodiment, the electronic processor 210 may be configured to receive an all-clear signal from an external source, such as a monitoring service for the autonomous vehicle 100. In one example, the sensor 115 includes a camera that is configured to capture a picture or video of the area underneath the autonomous vehicle 100. The electronic processor 210 may transmit the picture or video to the monitoring service for inspection. When the picture or video is free of objects, the electronic processor 210 may receive an all-clear signal from the monitoring service.
When an all-clear signal is received, the electronic processor 210 may re-enable autonomous driving of the autonomous vehicle 100. Once the autonomous vehicle 100 resumes motion, the electronic processor 210 may reset and restart the method for the next time that the autonomous vehicle 100 is stationary.
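Pulling blocks 335 through 350 together with the all-clear path, a minimal decision routine might look like the sketch below, where `notify` and `wait_for_all_clear` are assumed callbacks standing in for the notification device 120 and for the passenger's or monitoring service's response.

```python
from typing import Callable


def decide_autonomous_driving(topographies_match: bool,
                              notify: Callable[[str], None],
                              wait_for_all_clear: Callable[[], bool]) -> bool:
    """Return True when autonomous driving may be (re-)enabled."""
    if topographies_match:
        return True                                    # block 350
    notify("Autonomous driving disabled: check under "
           "the vehicle for objects.")                 # blocks 340 and 345
    return wait_for_all_clear()                        # re-enable on all-clear


# Usage sketch: a console prompt stands in for the notification device.
enabled = decide_autonomous_driving(
    topographies_match=False,
    notify=print,
    wait_for_all_clear=lambda: input("All clear? [y/N] ").strip().lower() == "y",
)
```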
Various features, advantages, and embodiments are set forth in the following claims.

Claims (16)

What is claimed is:
1. A method of monitoring an area underneath an autonomous vehicle, the method comprising:
capturing a first topography of a ground surface underneath the autonomous vehicle with a sensor when the autonomous vehicle is stationary;
storing the first topography of the ground surface in a memory prior to the autonomous vehicle being switched off;
capturing a second topography of the ground surface underneath the autonomous vehicle with the sensor before the autonomous vehicle begins moving;
comparing, with an electronic processor, the first topography and the second topography; and
enabling autonomous driving of the autonomous vehicle when the first topography of the ground surface underneath the autonomous vehicle and the second topography of the ground surface underneath the autonomous vehicle match.
2. The method according to claim 1, the method further comprising when the first topography of the ground surface underneath the autonomous vehicle and the second topography of the ground surface underneath the autonomous vehicle are different, disabling autonomous driving of the autonomous vehicle.
3. The method according to claim 1, the method further comprising when the first topography of the ground surface underneath the autonomous vehicle and the second topography of the ground surface underneath the autonomous vehicle are different, generating a notification that autonomous driving is disabled.
4. The method according to claim 3, wherein generating the notification that autonomous driving is disabled includes sending the notification to at least one selected from a group consisting of a user of the autonomous vehicle and a remote monitoring service for the autonomous vehicle.
5. The method according to claim 3, wherein generating the notification that autonomous driving is disabled includes generating a message to check underneath the autonomous vehicle for an object.
6. The method according to claim 1, the method further comprising
receiving, at the electronic processor, a signal indicating that the autonomous vehicle is clear of the object; and
enabling autonomous driving of the autonomous vehicle when the signal indicating that the autonomous vehicle is clear of the object is received.
7. The method according to claim 1, wherein comparing, by the electronic processor, the first topography with the second topography includes determining a difference score indicative of an amount of differences detected between the first topography and the second topography.
8. The method according to claim 7, wherein comparing, by the electronic processor, the first topography with the second topography includes comparing the difference score to a threshold, and wherein enabling autonomous driving of the autonomous vehicle occurs when the difference score is less than the threshold.
9. A system for monitoring an area underneath an autonomous vehicle, the system comprising:
a sensor with a field of view that extends underneath the autonomous vehicle;
an input/output interface configured to communicatively connect to a notification device; and
an electronic control unit with an electronic processor and a memory communicatively connected to the sensor and the input/output interface, the electronic control unit configured to
capture a first topography of a ground surface underneath the autonomous vehicle with the sensor when the autonomous vehicle is stationary,
store the first topography of the ground surface in the memory of the electronic control unit prior to the autonomous vehicle being switched off,
capture a second topography of the ground surface underneath the autonomous vehicle with the sensor when the autonomous vehicle is switched on,
compare the first topography of the ground surface underneath the autonomous vehicle with the second topography of the ground surface underneath the autonomous vehicle, and
enable autonomous driving of the autonomous vehicle when the first topography and the second topography match.
10. The system according to claim 9, wherein the electronic control unit is further configured to disable autonomous driving of the autonomous vehicle when the first topography and the second topography are different.
11. The system according to claim 9, wherein the electronic control unit is further configured to generate an indication that autonomous driving is disabled when the first topography and the second topography are different.
12. The system according to claim 11, wherein the electronic control unit is further configured to send the indication to at least one selected from a group consisting of a user of the autonomous vehicle and a remote monitoring service for the autonomous vehicle.
13. The system according to claim 11, wherein the electronic control unit is further configured to generate a message to check underneath the autonomous vehicle for an object.
14. The system according to claim 10, wherein the electronic control unit is further configured to
receive a signal indicating that the autonomous vehicle is clear of the object; and
enable autonomous driving of the autonomous vehicle when the signal indicating that the autonomous vehicle is clear of the object is received.
15. The system according to claim 11, wherein the electronic control unit is further configured to determine a difference score indicative of an amount of differences detected between the first topography and the second topography.
16. The system according to claim 15, wherein the electronic control unit is further configured to compare the difference score to a threshold, and to enable autonomous driving of the autonomous vehicle when the difference score is less than the threshold.
US15/392,478 2016-12-28 2016-12-28 System for monitoring underneath an autonomous vehicle Active US10007262B1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US15/392,478 US10007262B1 (en) 2016-12-28 2016-12-28 System for monitoring underneath an autonomous vehicle
EP17825223.5A EP3563205B1 (en) 2016-12-28 2017-12-21 System and method for monitoring the area underneath an autonomous vehicle and selectively enabling autonomous driving
JP2019535300A JP6771673B2 (en) 2016-12-28 2017-12-21 System for monitoring below autonomous driving vehicles
PCT/EP2017/084151 WO2018122104A1 (en) 2016-12-28 2017-12-21 System for monitoring of the area underneath an autonomous vehicle
KR1020197018607A KR102460053B1 (en) 2016-12-28 2017-12-21 A system for monitoring the underside of autonomous vehicles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/392,478 US10007262B1 (en) 2016-12-28 2016-12-28 System for monitoring underneath an autonomous vehicle

Publications (2)

Publication Number Publication Date
US10007262B1 true US10007262B1 (en) 2018-06-26
US20180181120A1 US20180181120A1 (en) 2018-06-28

Family

ID=60923485

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/392,478 Active US10007262B1 (en) 2016-12-28 2016-12-28 System for monitoring underneath an autonomous vehicle

Country Status (5)

Country Link
US (1) US10007262B1 (en)
EP (1) EP3563205B1 (en)
JP (1) JP6771673B2 (en)
KR (1) KR102460053B1 (en)
WO (1) WO2018122104A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3664063A1 (en) 2018-12-07 2020-06-10 Zenuity AB Under vehicle inspection

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11321497A (en) * 1998-05-14 1999-11-24 Nissei Giken Kk Safety operation support device
JP5196920B2 (en) * 2007-09-03 2013-05-15 アルパイン株式会社 Vehicle intrusion warning device
JP2016074314A (en) * 2014-10-07 2016-05-12 株式会社デンソー Vehicular warning device
DE102015201317A1 (en) * 2015-01-27 2016-07-28 Bayerische Motoren Werke Aktiengesellschaft Measuring a dimension on a surface
JP2016215751A (en) * 2015-05-18 2016-12-22 株式会社デンソー Automatic travel control device, or automatic travel control system

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6990253B2 (en) 2001-05-23 2006-01-24 Kabushiki Kaisha Toshiba System and method for detecting obstacle
US6861972B2 (en) 2003-07-28 2005-03-01 Ellistar Sensor Systems, Inc. Object detection apparatus and method
US20050253694A1 (en) 2004-05-17 2005-11-17 Kuznarowis Mark E Vehicle mounted pedestrian sensor system
US8473173B1 (en) 2008-09-08 2013-06-25 William Robles Motion sensor braking system and associated method
US20140049405A1 (en) * 2010-06-30 2014-02-20 Wabco Gmbh Device and Method for Outputting a Signal When There is a Hazardous Underlying Surface Under a Vehicle
EP2441330A2 (en) 2010-10-14 2012-04-18 Deere & Company Undesired matter detection system
US20130128048A1 (en) * 2011-11-21 2013-05-23 Sony Corporation Imaging system and imaging device
US20140300740A1 (en) 2013-04-08 2014-10-09 Beat-Sonic Co., Ltd. Vehicle-mounted camera adapter in vehicle-mounted monitoring system
US20150198951A1 (en) 2014-01-16 2015-07-16 Volvo Car Corporation Vehicle adapted for autonomous driving and a method for detecting obstructing objects
US20160101734A1 (en) * 2014-10-13 2016-04-14 Lg Electronics Inc. Under vehicle image provision apparatus and vehicle including the same
US20160257248A1 (en) 2015-03-02 2016-09-08 Tk Holdings Inc. Vehicle object detection and notification system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11173883B2 (en) * 2017-01-27 2021-11-16 Knorr-Bremse Systeme Fuer Nutzfahrzeuge Gmbh Sensor arrangement and a method for detecting an object around a trailer of a vehicle
US11724740B2 (en) * 2017-05-31 2023-08-15 Valeo Schalter Und Sensoren Gmbh Method for operating an ultrasonic sensor device for a motor vehicle to monitor a ground area below the motor vehicle, ultrasonic sensor device, driver assistance system and motor vehicle
US10848719B2 (en) 2017-09-13 2020-11-24 Alarm.Com Incorporated System and method for gate monitoring during departure or arrival of an autonomous vehicle
US11394933B2 (en) 2017-09-13 2022-07-19 Alarm.Com Incorporated System and method for gate monitoring during departure or arrival of an autonomous vehicle
US20210080568A1 (en) * 2018-04-25 2021-03-18 Waymo Llc Underbody Radar Units
US20190375423A1 (en) * 2018-06-06 2019-12-12 Ford Global Technologies, Llc Methods and systems for oil leak determination
US10843702B2 (en) * 2018-06-06 2020-11-24 Ford Global Technologies, Llc Methods and systems for oil leak determination
US20220009514A1 (en) * 2020-07-08 2022-01-13 Toyota Jidosha Kabushiki Kaisha Vehicle surrounding monitoring apparatus
US11673571B2 (en) * 2020-07-08 2023-06-13 Toyota Jidosha Kabushiki Kaisha Vehicle surrounding monitoring apparatus
US20230087119A1 (en) * 2021-09-22 2023-03-23 Motional Ad Llc Switchable wheel view mirrors
US20230087169A1 (en) * 2021-09-22 2023-03-23 Motional Ad Llc Switchable wheel view mirrors

Also Published As

Publication number Publication date
KR102460053B1 (en) 2022-10-31
EP3563205B1 (en) 2022-03-23
JP2020504694A (en) 2020-02-13
JP6771673B2 (en) 2020-10-21
WO2018122104A1 (en) 2018-07-05
US20180181120A1 (en) 2018-06-28
EP3563205A1 (en) 2019-11-06
KR20190101385A (en) 2019-08-30

Similar Documents

Publication Publication Date Title
US10007262B1 (en) System for monitoring underneath an autonomous vehicle
US10852741B2 (en) Using cameras for detecting objects near a vehicle
US10146227B2 (en) Driving assistance apparatus
US11205348B2 (en) Drive assist device
CN107856671B (en) Method and system for road condition identification through automobile data recorder
US10684625B2 (en) Automated parking for virtual parking spot
US10011299B2 (en) Trailer angle detection using rear camera
US10259427B1 (en) Vehicle security system using sensor data
JP6040870B2 (en) Vehicle monitoring system
US20120268601A1 (en) Method of recording traffic images and a drive recorder system
CN113655788B (en) Vehicle remote control parking method, system, terminal equipment and readable storage medium
US11225265B2 (en) Electronic device and method for recognizing object using plurality of sensors
JP5742458B2 (en) MOBILE BODY MONITORING DEVICE AND MOBILE BODY MONITORING METHOD
US20190286118A1 (en) Remote vehicle control device and remote vehicle control method
JP5720400B2 (en) Image processing apparatus and image processing method
CN111862576A (en) Method for tracking suspected target, corresponding vehicle, server, system and medium
JP5747847B2 (en) Vehicle periphery monitoring device
US11853035B2 (en) Camera assisted docking system for commercial shipping assets in a dynamic information discovery protocol environment
JP2023158546A (en) Monitoring system
JP2005044196A (en) Vehicle circumference monitoring device, automobile, vehicle circumference monitoring method, control program, and readable storage medium
JP2020145501A (en) Information processor, detection method and program
JP2023066239A (en) Object detection device and object detection method
CN112651621A (en) Information processing method and device for intelligent container falling identification
CN116605245A (en) Vehicle steering auxiliary control method and system

Legal Events

Date Code Title Description
AS Assignment

Owner name: ROBERT BOSCH GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHWINDT, OLIVER;REEL/FRAME:041112/0501

Effective date: 20170103

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4