US20190204845A1 - Sensor integration for large autonomous vehicles - Google Patents

Sensor integration for large autonomous vehicles

Info

Publication number
US20190204845A1
US20190204845A1 (application US16/009,499)
Authority
US
United States
Prior art keywords
sensors
vehicle
housing
lidar
sensor assembly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/009,499
Inventor
William Grossman
Benjamin Pitzer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Waymo LLC
Original Assignee
Waymo LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Waymo LLC
Priority to US16/009,499
Assigned to Waymo LLC (Assignors: GROSSMAN, WILLIAM; PITZER, BENJAMIN)
Priority to PCT/US2018/066808 (published as WO2019133437A1)
Priority to EP18834222.4A (published as EP3710862A1)
Priority to CN201880084695.2A (published as CN111566511B)
Publication of US20190204845A1
Priority to US17/724,559 (granted as US11899466B2)
Legal status: Abandoned


Classifications

    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/247 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G05D1/249 Arrangements for determining position or orientation using signals from positioning sensors located off-board the vehicle, e.g. from cameras
    • B60R1/12 Mirror assemblies combined with other articles, e.g. clocks
    • B60R2001/1223 Mirror assemblies combined with sensors or transducers
    • G01S7/4813 Housing arrangements (constructional features common to transmitter and receiver)
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/023
    • G01S17/936
    • G01S2013/9318 Controlling the steering
    • G01S2013/93185 Controlling the brakes
    • G01S2013/9319 Controlling the accelerator
    • G01S2013/93274 Sensor installation details on the side of the vehicles
    • G05D2201/0212
    • G05D2201/0213

Definitions

  • Autonomous vehicles, such as vehicles that do not require a human driver, can be used to aid in the transport of passengers, cargo or other items from one location to another.
  • Such vehicles may operate in a fully autonomous mode or a partially autonomous mode where a person in the vehicle may provide some driving input.
  • one or more sets of sensors are used to detect features and objects in the environment around the vehicle.
  • the sensors may be placed at different locations about the vehicle in order to gather information about the surrounding environment.
  • aspects of the disclosure provide a sensor tower assembly that is especially beneficial for trucks, busses, construction equipment and other large vehicles.
  • the assembly co-locates various types of sensors in an integrated housing.
  • the integrated housing is rigidly affixed to a side of the large vehicle in a manner that provides enhanced fields of view for the sensors.
  • the integrated housing augments or replaces a side view mirror housing.
  • Conduits provide power, control and cooling/heating to the various sensors, and return acquired sensor information from the sensors to a control system of the vehicle so that it may operate in an autonomous or semi-autonomous mode.
  • According to aspects of the disclosure, a side sensor assembly is provided for use on a truck or bus capable of operating in an autonomous driving mode.
  • the side sensor assembly comprises a housing, a mounting element, a plurality of sensors and a conduit.
  • the housing has one or more exterior surfaces and an interior receptacle. At least one of the one or more exterior surfaces includes a side view mirror thereon.
  • the mounting element has a first end and a second end remote from the first end. The first end is coupled to the housing along one or more mounting points. The second end is configured to rigidly secure the housing to the truck or bus.
  • the plurality of sensors is received within the interior receptacle of the housing.
  • the plurality of sensors includes a pair of light detection and ranging (LIDAR) sensors.
  • a first one of the pair of LIDAR sensors is a long range LIDAR having a detection range of at least 50 meters and a second one of the pair of LIDAR sensors is a short range LIDAR having a detection range of no more than 50 meters.
  • the conduit is received within the mounting element.
  • the conduit provides one or more of a power line, a data line and a cooling line to the plurality of sensors received within the housing and is configured for connection to one or more operational systems of the truck or bus.
  • the long range LIDAR is arranged along a first end of the interior receptacle and the short range LIDAR is arranged along a second end of the interior receptacle opposite the long range LIDAR.
  • the long range LIDAR is positioned closer to a roof of the truck or bus than the short range LIDAR so that the long range LIDAR has a field of view that extends past a front hood of the truck or bus during operation.
  • the plurality of sensors further includes at least one of a radar sensor and a camera sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
  • the at least one radar sensor may comprise a plurality of radar sensors arranged to provide overlapping fields of view along a side of the truck or bus during operation.
  • the at least one camera sensor may comprise a plurality of cameras arranged to provide overlapping fields of view along a side of the truck or bus during operation.
  • the plurality of sensors may further include at least one inertial sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
  • the plurality of sensors received within the interior receptacle of the housing are affixed within the housing relative to a common axis or physical reference point of the housing.
  • the plurality of sensors may be calibrated collectively relative to the common axis or physical reference point.
  • the side sensor assembly comprises a pair of side sensor assemblies.
  • Each one of the pair has a respective housing, mounting element, plurality of sensors and conduit.
  • a first one of the pair is configured for affixation to a left side of the truck or bus and a second one of the pair is configured for affixation to a right side of the truck or bus.
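The claim that the long range LIDAR is mounted high enough for its field of view to extend past the front hood reduces to simple line-of-sight geometry. The sketch below (Python, with entirely hypothetical truck dimensions) estimates how far ahead of the sensor the road surface becomes visible over the hood edge:

```python
def clears_hood(sensor_height_m: float, hood_height_m: float,
                hood_distance_m: float) -> float:
    """Return the ground distance at which a sensor's line of sight,
    grazing the hood edge, reaches the road surface.

    By similar triangles, the ray from the sensor over the hood edge
    hits the ground at hood_distance * sensor_height / height_drop.
    """
    if sensor_height_m <= hood_height_m:
        raise ValueError("sensor must sit above the hood to see past it")
    return hood_distance_m * sensor_height_m / (sensor_height_m - hood_height_m)

# Hypothetical dimensions: sensor at 2.5 m, hood edge 1.4 m high, 2.0 m ahead.
blind_distance = clears_hood(2.5, 1.4, 2.0)
```

Lowering the hypothetical mount from 2.5 m to 1.6 m stretches the occluded region from roughly 4.5 m to 16 m, which illustrates why the disclosure positions the long range LIDAR closer to the roof.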
  • a vehicle is configured to operate in an autonomous driving mode.
  • the vehicle comprises a driving system configured to perform driving operations, a perception system configured to detect objects in an environment surrounding the vehicle, and a control system.
  • the control system is operatively coupled to the driving system and the perception system.
  • the control system has one or more computer processors configured to receive data from the perception system and to direct the driving system when operating in the autonomous driving mode.
  • the perception system includes a pair of side sensor assemblies attached to opposite sides of the vehicle.
  • Each side sensor assembly includes a housing, a mounting element, a plurality of sensors and a conduit.
  • the housing has one or more exterior surfaces and an interior receptacle. At least one of the one or more exterior surfaces includes a side view mirror thereon.
  • the mounting element has a first end and a second end remote from the first end.
  • the first end is coupled to the housing along one or more mounting points.
  • the second end is configured to rigidly secure the housing to a corresponding side of the vehicle.
  • the plurality of sensors is received within the interior receptacle of the housing.
  • the plurality of sensors includes a pair of light detection and ranging (LIDAR) sensors.
  • a first one of the pair of LIDAR sensors is a long range LIDAR having a detection range of at least 50 meters and a second one of the pair of LIDAR sensors is a short range LIDAR having a detection range of no more than 50 meters.
  • the conduit is received within the mounting element.
  • the conduit provides one or both of a power line and a data line to the plurality of sensors received within the housing and connects to one or more operational systems of the vehicle.
  • the long range LIDAR is arranged along a first end of the interior receptacle and the short range LIDAR is arranged along a second end of the interior receptacle opposite the long range LIDAR.
  • the long range LIDAR is positioned closer to a roof of the vehicle than the short range LIDAR so that the long range LIDAR has a field of view that extends past a front hood of the vehicle during operation.
  • the plurality of sensors in each side sensor assembly further includes at least one of a radar sensor and a camera sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
  • the plurality of sensors in each side sensor assembly may further include at least one inertial sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle. The at least one inertial sensor in each side sensor assembly may provide redundancy to the at least one inertial sensor in the other side sensor assembly.
  • the plurality of sensors received within the interior receptacle of the housing are affixed within the housing relative to a common axis or physical reference point of the housing.
  • the plurality of sensors in each side sensor assembly may be calibrated collectively relative to the common axis or physical reference point of that side sensor assembly.
  • the plurality of sensors in each side sensor assembly is calibrated relative to the other side sensor assembly.
  • the vehicle is one of a truck, a bus, or a construction vehicle.
  • the autonomous driving mode is a Level 4 or Level 5 autonomous mode of operation.
  • the conduit further provides a cooling line to the plurality of sensors received within the housing.
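Affixing every sensor relative to a common axis or physical reference point of the housing means the assembly's extrinsics can be maintained as a single chain of transforms. A minimal planar sketch, with made-up poses:

```python
import math

def compose(a, b):
    """Compose two planar poses (x, y, yaw): result = a applied, then b."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

# Hypothetical extrinsics, each expressed relative to the housing's
# common reference point, as the disclosure suggests.
housing_in_vehicle = (1.0, -1.2, 0.0)        # right-side assembly pose
sensors_in_housing = {
    "long_range_lidar":  (0.0,  0.0, 0.0),
    "short_range_lidar": (0.0, -0.4, 0.0),
    "radar":             (0.0, -0.2, math.radians(-90)),
}

# Because every sensor is fixed to the common reference point, updating
# the single housing pose recalibrates all sensors collectively.
sensors_in_vehicle = {
    name: compose(housing_in_vehicle, pose)
    for name, pose in sensors_in_housing.items()
}
```

Under this scheme, re-estimating the one `housing_in_vehicle` pose collectively recalibrates every sensor in the assembly, matching the collective-calibration language above; calibrating the left assembly relative to the right is a comparison of two such poses.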
  • FIGS. 1A-B illustrate an example tractor-trailer for use with sensor towers according to aspects of the disclosure.
  • FIGS. 1C-D illustrate an example bus for use with sensor towers according to aspects of the disclosure.
  • FIG. 2 illustrates a system diagram of an autonomous vehicle in accordance with aspects of the disclosure.
  • FIGS. 3A-B are example sensor assembly configurations in accordance with aspects of the disclosure.
  • FIGS. 4A-D illustrate arrangements of sensors and conduits with the sensor assembly configurations of FIGS. 3A-B, in accordance with aspects of the disclosure.
  • FIG. 5 is an example of short and long range LIDAR coverage for a large vehicle in accordance with aspects of the disclosure.
  • FIG. 6 is an example of radar or camera coverage for a large vehicle in accordance with aspects of the disclosure.
  • FIGS. 1A-B illustrate an example truck 100.
  • FIGS. 1C-D illustrate an example bus 120.
  • The truck 100 may be, e.g., a single, double or triple tractor-trailer, or another medium- or heavy-duty truck such as one in weight classes 4 through 8.
  • The bus 120 may be, e.g., a school bus, mini bus, trolley, motorcoach, double-decker bus, etc.
  • the large vehicle may be longer than 8-10 meters.
  • the large vehicle may not exceed the length of a triple tractor trailer. Smaller or larger vehicles can also implement the sensor technologies discussed here.
  • Such large vehicles may have multiple blind spot areas on the sides and to the rear. Placing sensors on top of the truck cab or trailer, or on the roof of the bus, may not resolve the blind spot issue, and may or may not be feasible. For example, given the heights of such vehicles, it may be impractical to locate sensors on the roof or top due to low clearance bridges, underpasses, tunnels, parking structures, etc. This may limit routes available to the vehicle. It may also be difficult to maintain or service sensors placed on top of large vehicles.
  • One way to address certain blind spot issues is via side view mirror assemblies.
  • The side view mirror assemblies on large trucks and busses can be placed towards the front of the vehicle. These assemblies can be secured by one or more bracket elements, and project away from the vehicle to the side and/or front, for instance as shown in the top views of FIGS. 1B and 1D.
  • Incorporating various sensor components into the side view mirror assemblies provides the autonomous or semi-autonomous driving system with good fields of view at a beneficial height. Specifics of this arrangement are provided in detail below.
  • Level 0 has no automation and the driver makes all driving-related decisions.
  • Level 1 includes some driver assistance, such as cruise control.
  • Level 2 has partial automation of certain driving operations, while Level 3 involves conditional automation that can enable a person in the driver's seat to take control as warranted.
  • Level 4 is a high automation level where the vehicle is able to drive without assistance in select conditions.
  • Level 5 is a fully autonomous mode in which the vehicle is able to drive without assistance in all situations.
  • The technologies described herein can function in any of the semi- or fully autonomous modes, e.g., Levels 1-5, which are referred to herein as "autonomous" driving modes.
  • Thus, reference to an autonomous driving mode includes both partial and full autonomy.
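The level taxonomy above can be captured in a small enumeration. The names below are illustrative, and the one-line summaries follow the text above rather than the full SAE J3016 definitions:

```python
from enum import IntEnum

class DrivingAutomationLevel(IntEnum):
    """SAE-style driving automation levels, as summarized above."""
    NO_AUTOMATION = 0           # driver makes all driving-related decisions
    DRIVER_ASSISTANCE = 1       # some assistance, e.g. cruise control
    PARTIAL_AUTOMATION = 2      # partial automation of certain operations
    CONDITIONAL_AUTOMATION = 3  # person in driver's seat takes control as warranted
    HIGH_AUTOMATION = 4         # drives without assistance in select conditions
    FULL_AUTOMATION = 5         # drives without assistance in all situations

def is_autonomous_mode(level: DrivingAutomationLevel) -> bool:
    """Per the disclosure's usage, Levels 1-5 count as "autonomous" modes."""
    return level >= DrivingAutomationLevel.DRIVER_ASSISTANCE
```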
  • FIG. 2 illustrates a block diagram 200 with various components and systems of a vehicle, such as a truck or a bus, capable of operating in a fully or semi-autonomous mode of operation.
  • the vehicle may have a control system of one or more computing devices, such as computing devices 202 containing one or more processors 204 , memory 206 and other components typically present in general purpose computing devices.
  • the memory 206 stores information accessible by the one or more processors 204 , including instructions 208 and data 210 that may be executed or otherwise used by the processors 204 .
  • the memory 206 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium.
  • the memory is a non-transitory medium such as a hard-drive, memory card, optical disk, solid-state drive, tape memory, or the like. Systems may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
  • the instructions 208 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor.
  • the instructions may be stored as computing device code on the computing device-readable medium.
  • the terms “instructions” and “programs” may be used interchangeably herein.
  • the instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance.
  • the data 210 may be retrieved, stored or modified by one or more processors 204 in accordance with the instructions 208 .
  • data 210 of memory 206 may store information, such as calibration information, to be used when calibrating different types of sensors.
  • the one or more processors 204 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor. Although FIG. 2 functionally illustrates the processor(s), memory, and other elements of computing devices 202 as being within the same block, such devices may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing. Similarly, the memory 206 may be a hard drive or other storage media located in a housing different from that of the processor(s) 204 . Accordingly, references to a processor or computing device will be understood to include references to a collection of processors or computing devices or memories that may or may not operate in parallel.
  • the computing devices 202 may form an autonomous driving computing system incorporated into vehicle 100 or 120 .
  • the autonomous driving computing system may be capable of communicating with various components of the vehicle.
  • the computing devices 202 may be in communication with various systems of the vehicle, including a driving system including a deceleration system 212 (for controlling braking of the vehicle), acceleration system 214 (for controlling acceleration of the vehicle), steering system 216 (for controlling the orientation of the wheels and direction of the vehicle), signaling system 218 (for controlling turn signals), navigation system 220 (for navigating the vehicle to a location or around objects) and a positioning system 222 (for determining the position of the vehicle).
  • the computing devices 202 are also operatively coupled to a perception system 224 (for detecting objects in the vehicle's environment), a power system 226 (for example, a battery and/or gas or diesel powered engine) and a transmission system 230 in order to control the movement, speed, etc., of vehicle 100 in accordance with the instructions 208 of memory 206 in an autonomous driving mode which does not require continuous or periodic input from a passenger of the vehicle.
  • the wheels/tires 228 are coupled to the transmission system 230 , and the computing devices 202 may be able to receive information about tire pressure, balance and other factors that may impact driving in an autonomous mode.
  • the computing devices 202 may control the direction and speed of the vehicle by controlling various components.
  • computing devices 202 may navigate the vehicle to a destination location completely autonomously using data from the map information and navigation system 220 .
  • Computing devices 202 may use the positioning system 222 to determine the vehicle's location and the perception system 224 to detect and respond to objects when needed to reach the location safely.
  • computing devices 202 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 214 ), decelerate (e.g., by decreasing the fuel supplied to the engine, changing gears, and/or by applying brakes by deceleration system 212 ), change direction (e.g., by turning the front or other wheels of vehicle 100 or 120 by steering system 216 ), and signal such changes (e.g., by lighting turn signals of signaling system 218 ).
  • the acceleration system 214 and deceleration system 212 may be a part of a drivetrain or other transmission system 230 that includes various components between an engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, computing devices 202 may also control the transmission system 230 of the vehicle in order to maneuver the vehicle autonomously.
  • computing devices 202 may interact with deceleration system 212 and acceleration system 214 in order to control the speed of the vehicle.
  • steering system 216 may be used by computing devices 202 in order to control the direction of vehicle.
  • the steering system 216 may include components to control the angle of wheels to turn the vehicle.
  • Signaling system 218 may be used by computing devices 202 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.
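The control flow described above — the computing devices directing the acceleration, deceleration, steering and signaling systems — can be sketched as a command dispatched to subsystem callables. All names and units below are hypothetical, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class DrivingCommand:
    """One control step issued by the computing devices (hypothetical units)."""
    throttle: float = 0.0        # 0..1 fraction of available acceleration
    brake: float = 0.0           # 0..1 fraction of available braking
    steering_angle: float = 0.0  # radians, positive is left
    left_signal: bool = False
    right_signal: bool = False

def dispatch(cmd, acceleration, deceleration, steering, signaling):
    """Route a command to stand-ins for the subsystems in the block
    diagram: acceleration system 214, deceleration system 212,
    steering system 216 and signaling system 218."""
    acceleration(cmd.throttle)
    deceleration(cmd.brake)
    steering(cmd.steering_angle)
    signaling(cmd.left_signal, cmd.right_signal)
```

A lane-change maneuver, for example, would be issued as a sequence of such commands with the signal set before the steering angle changes.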
  • Navigation system 220 may be used by computing devices 202 in order to determine and follow a route to a location.
  • the navigation system 220 and/or data 210 may store map information, e.g., highly detailed maps that computing devices 202 can use to navigate or control the vehicle.
  • these maps may identify the shape and elevation of roadways, lane markers, intersections, crosswalks, speed limits, traffic signal lights, buildings, signs, real time traffic information, vegetation, or other such objects and information.
  • the lane markers may include features such as solid or broken double or single lane lines, solid or broken lane lines, reflectors, etc.
  • a given lane may be associated with left and right lane lines or other lane markers that define the boundary of the lane. Thus, most lanes may be bounded by a left edge of one lane line and a right edge of another lane line.
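The lane-boundary model described above — a lane bounded by the left edge of one lane line and the right edge of another — suggests a simple map schema. The one below is illustrative only:

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in map coordinates, meters

@dataclass
class LaneLine:
    kind: str            # "solid", "broken", "double", "reflectors", ...
    points: List[Point]  # polyline sampled along the lane line

@dataclass
class Lane:
    """A lane bounded by two lane lines, per the description above."""
    left: LaneLine
    right: LaneLine
    speed_limit_mps: float

def lane_width(lane: Lane, i: int = 0) -> float:
    """Approximate width at sample i as the distance between boundaries."""
    (lx, ly), (rx, ry) = lane.left.points[i], lane.right.points[i]
    return ((lx - rx) ** 2 + (ly - ry) ** 2) ** 0.5
```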
  • the perception system 224 also includes one or more components for detecting objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc.
  • the perception system 224 may include one or more light detection and ranging (LIDAR) sensors, sonar devices, radar units, cameras, inertial (e.g., gyroscopic) sensors, and/or any other detection devices that record data which may be processed by computing devices 202 .
  • LIDAR light detection and ranging
  • sonar devices e.g., radar units
  • cameras inertial (e.g., gyroscopic) sensors, and/or any other detection devices that record data which may be processed by computing devices 202 .
  • the sensors of the perception system may detect objects and their characteristics such as location, orientation, size, shape, type (for instance, vehicle, pedestrian, bicyclist, etc.), heading, and speed of movement, etc.
  • the raw data from the sensors and/or the aforementioned characteristics can sent for further processing to the computing devices 202 periodically and continuously as it is generated by the perception system 224 .
  • Computing devices 202 may use the positioning system 222 to determine the vehicle's location and perception system 224 to detect and respond to objects when needed to reach the location safely.
  • the computing devices 202 may perform calibration of individual sensors, all sensors in a particular sensor assembly, or between sensors in different sensor assemblies.
  • the perception system 224 includes one or more sensor assemblies 232 , which may be arranged as sensor towers integrated into the side-view mirrors on the truck, bus or other large vehicle such as construction equipment.
  • a connection conduit 234 provides the necessary power, communication, cooling/heating and other connections between a given sensor housing assembly and the vehicle.
  • a data communication bus may provide bidirectional communication between the sensors of the sensor housing assembly and the computing devices 202 .
  • a power line may be connected directly or indirectly to the power system 226 , or to a separate power source such as a battery controlled by the computing devices 202 .
  • a cooling line may also couple to the power system 226 or to a dedicated cooling system of the vehicle. The cooling may be active, e.g., using a cooling fluid or forced cool air, or passive. Alternatively, in very cold or wintry environments, heating may be applied instead of cooling.
  • FIGS. 3A and 3B illustrate two examples of sensor assemblies.
  • FIG. 3A illustrates a sensor assembly 300 with a housing 302 and a mounting element 304 .
  • a mirror 306 is arranged on an outside surface of the housing 302 .
  • FIG. 3B similarly illustrates another sensor assembly 320 with a housing 322 and a mounting element 324 .
  • multiple mirrors 326 a and 326 b may be arranged on different exterior surfaces of the housing 322 .
  • Each housing is configured to store the various LIDAR sensors, sonar devices, radar units, cameras, inertial and/or gyroscopic sensors therein.
  • the mounting elements are configured to rigidly secure the housing to the vehicle.
  • mounting element 304 may couple the housing 302 to the cab of a tractor-trailer vehicle such as vehicle 100 .
  • mounting element 324 may couple the housing 322 to the side of a bus such as bus 120 .
  • Each side of the vehicle may have a housing 302 or 322 rigidly mounted thereon.
  • FIG. 4A illustrates an example of the housing 302 with selected sensors illustrated therein.
  • the sensors may include a long range, narrow field of view (FOV) LIDAR 400 and a short range, tall FOV LIDAR 402 .
  • the long range LIDAR 400 may have a range of 50-250 meters or more, while the short range LIDAR 402 has a range of no more than 1-50 meters.
  • the short range LIDAR 402 may generally cover up to 10-15 meters from the vehicle while the long range LIDAR 400 may cover a range exceeding 100 meters.
  • the long range is between 10-200 meters, while the short range has a range of 0-20 meters.
  • the long range exceeds 80 meters while the short range is below 50 meters.
  • Intermediate ranges of, e.g., 10-100 meters can be covered by one or both of the long range and short range LIDARs, or by a medium range LIDAR that may also be included in the housing 302.
  • the medium range LIDAR may be disposed between the long and short range LIDARs, and may be aligned about the same common axis or other fixed point as discussed below.
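The tiered ranges described above can be sketched as a simple lookup that reports which co-located LIDARs cover a given target distance. The specific range values below are illustrative assumptions drawn loosely from the examples in the text, not fixed specifications:

```python
# Hypothetical range tiers for the co-located LIDARs described above.
# The exact ranges vary between the examples in the text; these values
# are assumptions for illustration only.
LIDAR_TIERS = [
    ("short", 0.0, 20.0),     # covers areas immediately adjacent to the vehicle
    ("medium", 10.0, 100.0),  # optional intermediate coverage
    ("long", 80.0, 250.0),    # sees well past the hood of the vehicle
]

def tiers_covering(distance_m):
    """Return the names of the LIDAR tiers whose range covers a distance."""
    return [name for name, lo, hi in LIDAR_TIERS if lo <= distance_m <= hi]
```

Note how the overlapping tier boundaries mean an intermediate distance may be covered by more than one sensor, mirroring the overlap described above.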
  • a set of cameras 404 may be distributed along the housing 302 , for instance to provide forward, side and rear-facing imagery.
  • a set of radars 406 may be distributed along the housing 302 to provide forward, side and rear-facing data.
  • the sensors 408 may include an inertial sensor, a gyroscope, an accelerometer and/or other sensors. Each of the sensors may be aligned or arranged relative to a common axis 409 or physical point within the housing 302 . Examples of these sensors are also illustrated in FIG. 4C .
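Because each sensor is arranged relative to a common axis or physical point within the housing, a measurement from any one sensor can be expressed in a single shared frame. The following is a minimal 2D sketch of that idea; the mounting offsets are invented for illustration and are not from the source:

```python
import math

# Hypothetical mounting offsets (x, y in meters, yaw in radians) of each
# sensor relative to the housing's common reference axis. With a shared
# reference, a point seen by any sensor can be expressed in one frame.
MOUNT_OFFSETS = {
    "long_range_lidar": (0.0, 0.30, 0.0),
    "short_range_lidar": (0.0, -0.30, 0.0),
    "forward_camera": (0.05, 0.10, 0.0),
}

def to_housing_frame(sensor, x, y):
    """Transform a 2D point from a sensor's frame into the housing frame."""
    dx, dy, yaw = MOUNT_OFFSETS[sensor]
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    # Rotate by the sensor's yaw, then translate by its mounting offset.
    return (dx + cos_y * x - sin_y * y, dy + sin_y * x + cos_y * y)
```

Sharing one reference point is what makes the self-referenced calibration mentioned in the abstract possible: calibrating each sensor against the common axis calibrates all of them against each other.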
  • FIGS. 4B and 4D illustrate a conduit 410 for providing integrated power, data and cooling to the housings. While only one conduit 410 is illustrated, multiple conduits may be provided in each mounting element.
  • sensors on the roof of the vehicle can be hard to access and have side view limitations.
  • mounting various sensors on the roof may interfere with aerodynamic roof fairings. While different sensors could be distributed along the front, sides and rear of the vehicle, this may be costly and require running individual data, power and/or cooling lines to each individual sensor.
  • such a solution could be very hard to implement with legacy vehicles, or when the cab of a truck is capable of operating in an autonomous mode but the trailer is a legacy trailer without the necessary sensors.
  • the sensor housing is integrated into a side view mirror assembly, such as shown in FIGS. 3A and 3B .
  • a side mirror assembly is very sturdy, being mounted to the vehicle by a mounting element 304 or 324 that may be cast metal or some other durable material.
  • the sensors, which may weigh upwards of 10 kg or more, can be safely secured to the vehicle via the sensor housing.
  • a side view mirror sensor housing could be provided with a new vehicle, or could be easily retrofitted onto an older vehicle chassis.
  • Assembling the system would include running the conduit from the sensor housing to the truck cab or vehicle chassis. Aggregating the cooling, power and data lines in the conduit, or in separate sub-conduits, and running them to one location on the side of the vehicle significantly simplifies the design, lowers the cost of the components and reduces the time and expense of putting the sensors on the vehicle.
  • the typical height of the side view mirror for a semi-truck or a bus is on the order of 2 meters, more or less, for instance between 1.5-2.5 meters from the ground. This may be an ideal height for the LIDARs, radars, cameras and other sensors of an integrated sensor tower. And because truck and bus side view mirrors are designed to provide clear lines of sight down the side of the vehicle, the sensors within the housing will enjoy the same visibility. In addition, placing the sensors in the side view mirror assembly protects them from road debris and wheel splash, as the sensors will be at least 1.5-2.5 meters from the ground and away from the wheel wells.
  • Integrating the sensor housing as part of the side view mirror has the added benefit of avoiding occlusion by a conventional side view mirror. And by conforming to the form factors and placements of side view mirrors, the sensor housing will conform to requirements set forth by the U.S. National Highway Traffic Safety Administration and other governing bodies regarding placement of such elements external to the vehicle. And from a branding standpoint, a common appearance can be provided with a sensor assembly used by various types of large vehicles.
  • the sensors and algorithms for those sensors that are designed to work with passenger cars can be employed in this new arrangement as well.
  • the height of the sensors at around 1.5-2.5 meters, is approximately the height of sensors located on the roof of a sedan or sport utility vehicle.
  • One advantage of co-locating the sensors in the side view mirror housing is that this location provides visibility over the hood of the vehicle and more than a 180° FOV for sensors such as LIDARs, radars and cameras.
  • An example of this is shown in FIG. 5 , which illustrates coverage 500 for both long range LIDARs and short range LIDARs on both sides of a tractor-trailer.
  • the long range LIDARs may be located along a top or upper area of the sensor housings 502 . For instance, this portion of the housing 502 may be located closest to the top of the truck cab or roof of the vehicle. This placement allows the long range LIDAR to see over the hood of the vehicle.
  • the short range LIDARs may be located along a bottom area of the sensor housing 502 opposite the long range LIDARs. This allows the short range LIDARs to cover areas immediately adjacent to the cab of the truck or the front portion of a bus. This would allow the perception system to determine whether an object such as another vehicle, pedestrian, bicyclist, etc. is next to the front of the vehicle and take that information into account when determining how to drive or turn. Both types of LIDARs may be co-located in the housing, aligned along a common axis.
  • the long range LIDARs on the left and right sides of the vehicle have fields of view 504 . These encompass significant areas along the sides and front of the vehicle. As shown, there is an overlap region 506 of their fields of view in front of the vehicle. A space is shown between regions 504 and 506 for clarity; however in actuality there is no break in the coverage. The short range LIDARs on the left and right sides have smaller fields of view 508 .
  • the overlap region 506 provides the perception system with additional information about a very important region that is directly in front of the vehicle. This redundancy also has a safety aspect. Should one of the long range LIDAR sensors suffer degradation in performance, the redundancy would still allow for operation in an autonomous mode.
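The redundancy argument above can be made concrete: the forward region lies in the overlap of both long range LIDARs, so losing either one still leaves it covered. The field-of-view angles below are illustrative assumptions, given as (start, end) bearings in degrees with 0° pointing straight ahead:

```python
# Sketch of the redundancy idea: a bearing remains covered as long as at
# least one healthy sensor's field of view includes it.

def covers(fov, bearing):
    """True if a (start, end) field of view includes the given bearing."""
    start, end = fov
    return start <= bearing <= end

def forward_covered(sensor_health, fovs, bearing=0.0):
    """True if any healthy sensor's FOV still covers the given bearing."""
    return any(covers(fovs[name], bearing)
               for name, healthy in sensor_health.items() if healthy)
```

With assumed fields of view of (-120°, 30°) for the left housing and (-30°, 120°) for the right, the forward bearing of 0° stays covered even when one side is degraded, while a far-left bearing does not.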
  • FIG. 6 illustrates coverage 600 for either (or both) of radar and camera sensors on both sides of a tractor-trailer.
  • the sensors may be arranged so that the side and rear fields of view 604 overlap, and the side fields of view may overlap with the forward facing fields of view 606 .
  • the forward facing fields of view 606 also have an overlap region 608 . This overlap region provides similar redundancy to the overlap region 506 , and has the same benefits should one sensor suffer degradation in performance.
  • vehicle level calibration between left and right side sensor housings can be accomplished by matching features (e.g., by convolution) in front of the vehicle, or other overlapping data points. Knowing where the features are with respect to the vehicle also gives the system extrinsic calibrations. And for sensor subsystems, such as an inertial sensor subsystem that may employ redundant sensor packages, the different sensor packages may be mounted in each of the side view mirror housings. This has the added benefit of providing high resolution orientation information for all of the co-located sensors.
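A minimal sketch of the vehicle-level calibration idea above: features detected in the overlap region in front of the vehicle are seen by both side housings, and averaging the correspondence differences estimates the residual offset between the two housings' frames. This assumes rotation has already been accounted for and that correspondences are known; the feature matching itself (e.g., by convolution) is not shown:

```python
# Estimate the residual translation between the left and right housing
# frames from matched feature points seen by both housings. Purely an
# illustrative sketch; real calibration would also solve for rotation.

def estimate_offset(left_points, right_points):
    """Average (dx, dy) that maps right-housing points onto left-housing points."""
    assert len(left_points) == len(right_points) and left_points
    n = len(left_points)
    dx = sum(l[0] - r[0] for l, r in zip(left_points, right_points)) / n
    dy = sum(l[1] - r[1] for l, r in zip(left_points, right_points)) / n
    return dx, dy
```

A persistent nonzero offset would indicate that one housing has shifted relative to the other, which the system could then fold into its extrinsic calibration.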

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The technology relates to autonomous vehicles for transporting cargo and/or people between locations. Distributed sensor arrangements may not be suitable for vehicles such as large trucks, busses or construction vehicles. Side view mirror assemblies are provided that include a sensor suite of different types of sensors, including LIDAR, radar, cameras, etc. Each side assembly is rigidly secured to the vehicle by a mounting element. The sensors within the assembly may be aligned or arranged relative to a common axis or physical point of the housing. This enables self-referenced calibration of all sensors in the housing. Vehicle-level calibration can also be performed between the sensors on the left and right sides of the vehicle. Each side view mirror assembly may include a conduit that provides one or more of power, data and cooling to the sensors in the housing.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefit of the filing date of U.S. Provisional Patent Application No. 62/611,685 filed Dec. 29, 2017, the entire disclosure of which is incorporated by reference herein.
  • BACKGROUND
  • Autonomous vehicles, such as vehicles that do not require a human driver, can be used to aid in the transport of passengers, cargo or other items from one location to another. Such vehicles may operate in a fully autonomous mode or a partially autonomous mode where a person in the vehicle may provide some driving input. To aid driving in an autonomous mode, one or more sets of sensors are used to detect features and objects in the environment around the vehicle. The sensors may be placed at different locations about the vehicle in order to gather information about the surrounding environment. However, there may be concerns regarding the placement of such sensors and the cost of equipping large vehicles with them.
  • BRIEF SUMMARY
  • Aspects of the disclosure provide a sensor tower assembly that is especially beneficial for trucks, busses, construction equipment and other large vehicles. The assembly co-locates various types of sensors in an integrated housing. The integrated housing is rigidly affixed to a side of the large vehicle in a manner that provides enhanced fields of view for the sensors. In one instance, the integrated housing augments or replaces a side view mirror housing. Conduits provide power, control and cooling/heating to the various sensors, and return acquired sensor information from the sensors to a control system of the vehicle so that it may operate in an autonomous or semi-autonomous mode.
  • According to aspects of the disclosure, a side sensor assembly is provided for use on a truck or bus capable of operating in an autonomous driving mode. The side sensor assembly comprises a housing, a mounting element, a plurality of sensors and a conduit. The housing has one or more exterior surfaces and an interior receptacle. At least one of the one or more exterior surfaces including a side view mirror thereon. The mounting element has a first end and a second end remote from the first end. The first end is coupled to the housing along one or more mounting points. The second end is configured to rigidly secure the housing to the truck or bus. The plurality of sensors is received within the interior receptacle of the housing. The plurality of sensors includes a pair of light detection and ranging (LIDAR) sensors. A first one of the pair of LIDAR sensors is a long range LIDAR having a detection range of at least 50 meters and a second one of the pair of LIDAR sensors is a short range LIDAR having a detection range of no more than 50 meters. The conduit is received within the mounting element. The conduit provides one or more of a power line, a data line and a cooling line to the plurality of sensors received within the housing and is configured for connection to one or more operational systems of the truck or bus.
  • In one example, the long range LIDAR is arranged along a first end of the interior receptacle and the short range LIDAR is arranged along a second end of the interior receptacle opposite the long range LIDAR. When the mounting element is affixed to the truck or bus, the long range LIDAR is positioned closer to a roof of the truck or bus than the short range LIDAR so that the long range LIDAR has a field of view that extends past a front hood of the truck or bus during operation.
  • In another example, the plurality of sensors further includes at least one of a radar sensor and a camera sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle. Here, the at least one radar sensor may comprise a plurality of radar sensors arranged to provide overlapping fields of view along a side of the truck or bus during operation. The at least one camera sensor may comprise a plurality of cameras arranged to provide overlapping fields of view along a side of the truck or bus during operation. The plurality of sensors may further include at least one inertial sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
  • In a further example, the plurality of sensors received within the interior receptacle of the housing are affixed within the housing relative to a common axis or physical reference point of the housing. In this case, the plurality of sensors may be calibrated collectively relative to the common axis or physical reference point.
  • In yet another example, the side sensor assembly comprises a pair of side sensor assemblies. Each one of the pair has a respective housing, mounting element, plurality of sensors and conduit. A first one of the pair is configured for affixation to a left side of the truck or bus and a second one of the pair is configured for affixation to a right side of the truck or bus.
  • According to further aspects of the disclosure, a vehicle is configured to operate in an autonomous driving mode. The vehicle comprises a driving system configured to perform driving operations, a perception system configured to detect objects in an environment surrounding the vehicle, and a control system. The control system is operatively coupled to the driving system and the perception system. The control system has one or more computer processors configured to receive data from the perception system and to direct the driving system when operating in the autonomous driving mode. The perception system includes a pair of side sensor assemblies attached to opposite sides of the vehicle. Each side sensor assembly includes a housing, a mounting element, a plurality of sensors and a conduit. The housing has one or more exterior surfaces and an interior receptacle. At least one of the one or more exterior surfaces includes a side view mirror thereon. The mounting element has a first end and a second end remote from the first end. The first end is coupled to the housing along one or more mounting points. The second end is configured to rigidly secure the housing to a corresponding side of the vehicle. The plurality of sensors is received within the interior receptacle of the housing. The plurality of sensors includes a pair of light detection and ranging (LIDAR) sensors. A first one of the pair of LIDAR sensors is a long range LIDAR having a detection range of at least 50 meters and a second one of the pair of LIDAR sensors is a short range LIDAR having a detection range of no more than 50 meters. The conduit is received within the mounting element. The conduit provides one or both of a power line and a data line to the plurality of sensors received within the housing and connects to one or more operational systems of the vehicle.
  • In one example, the long range LIDAR is arranged along a first end of the interior receptacle and the short range LIDAR is arranged along a second end of the interior receptacle opposite the long range LIDAR. Here, the long range LIDAR is positioned closer to a roof of the vehicle than the short range LIDAR so that the long range LIDAR has a field of view that extends past a front hood of the vehicle during operation.
  • In another example, the plurality of sensors in each side sensor assembly further includes at least one of a radar sensor and a camera sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle. In this case, the plurality of sensors in each side sensor assembly may further include at least one inertial sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle. The at least one inertial sensor in each side sensor assembly may provide redundancy to the at least one inertial sensor in the other side sensor assembly.
  • In a further example, the plurality of sensors received within the interior receptacle of the housing are affixed within the housing relative to a common axis or physical reference point of the housing. Here, the plurality of sensors in each side sensor assembly may be calibrated collectively relative to the common axis or physical reference point of that side sensor assembly.
  • According to another example, the plurality of sensors in each side sensor assembly is calibrated relative to the other side sensor assembly. In yet another example, the vehicle is one of a truck, a bus, or a construction vehicle. In another example, the autonomous driving mode is a Level 4 or Level 5 autonomous mode of operation. And in a further example, the conduit further provides a cooling line to the plurality of sensors received within the housing.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1A-B illustrate an example tractor-trailer for use with sensor towers according to aspects of the disclosure.
  • FIGS. 1C-D illustrate an example bus for use with sensor towers according to aspects of the disclosure.
  • FIG. 2 illustrates a system diagram of autonomous vehicle in accordance with aspects of the disclosure.
  • FIGS. 3A-B are example sensor assembly configurations in accordance with aspects of the disclosure.
  • FIGS. 4A-D illustrate arrangements of sensors and conduits with the sensor assembly configurations of FIGS. 3A-B, in accordance with aspects of the disclosure.
  • FIG. 5 is an example of short and long range LIDAR coverage for a large vehicle in accordance with aspects of the disclosure.
  • FIG. 6 is an example of radar or camera coverage for a large vehicle in accordance with aspects of the disclosure.
  • DETAILED DESCRIPTION Overview
  • The technology relates to autonomous or semi-autonomous vehicles for transporting cargo and/or people between locations. Large trucks, busses and construction equipment, unlike passenger cars, typically do not provide good 360° visibility from a single vantage point. For instance, FIGS. 1A-B illustrate an example truck 100, and FIGS. 1C-D illustrate an example bus 120. The truck 100 may be, e.g., a single, double or triple tractor-trailer, or other medium or heavy duty truck such as in weight classes 4 through 8. The bus 120 may be, e.g., a school bus, mini bus, trolley, motorcoach, double decker bus, etc. In one example, the large vehicle may be longer than 8-10 meters. In another example, the large vehicle may not exceed the length of a triple tractor trailer. Smaller or larger vehicles can also implement the sensor technologies discussed here.
  • Such large vehicles may have multiple blind spot areas on the sides and to the rear. Placing sensors on top of the truck cab or trailer, or on the roof of the bus, may not resolve the blind spot issue, and may or may not be feasible. For example, given the heights of such vehicles, it may be impractical to locate sensors on the roof or top due to low clearance bridges, underpasses, tunnels, parking structures, etc. This may limit routes available to the vehicle. It may also be difficult to maintain or service sensors placed on top of large vehicles.
  • One way to address certain blind spot issues is via side view mirror assemblies. The side view mirror assemblies on large trucks and busses can be placed towards the front of the vehicle. These assemblies can be secured by one or more bracket elements, and project away from the vehicle to the side and/or front, for instance as shown in the top views of FIGS. 1B and 1D. Incorporating various sensor components into the side view mirror assemblies provides the autonomous or semi-autonomous driving system with good fields of view at a height that is beneficial. Specifics of this arrangement are provided in detail below.
  • There are different degrees of autonomy that may occur in a partially or fully autonomous driving system. The U.S. National Highway Traffic Safety Administration and the Society of Automotive Engineers have identified different levels to indicate how much, or how little, the vehicle controls the driving. For instance, Level 0 has no automation and the driver makes all driving-related decisions. The lowest semi-autonomous mode, Level 1, includes some drive assistance such as cruise control. Level 2 has partial automation of certain driving operations, while Level 3 involves conditional automation that can enable a person in the driver's seat to take control as warranted. In contrast, Level 4 is a high automation level where the vehicle is able to drive without assistance in select conditions. And Level 5 is a fully autonomous mode in which the vehicle is able to drive without assistance in all situations. The architectures, components, systems and methods described herein can function in any of the semi or fully-autonomous modes, e.g., Levels 1-5, which are referred to herein as “autonomous” driving modes. Thus, reference to an autonomous driving mode includes both partial and full autonomy.
  • Example Systems
  • FIG. 2 illustrates a block diagram 200 with various components and systems of a vehicle, such as a truck or a bus, capable of operating in a full or semi-autonomous mode of operation. As shown in the block diagram, the vehicle may have a control system of one or more computing devices, such as computing devices 202 containing one or more processors 204, memory 206 and other components typically present in general purpose computing devices.
  • The memory 206 stores information accessible by the one or more processors 204, including instructions 208 and data 210 that may be executed or otherwise used by the processors 204. The memory 206 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium. The memory is a non-transitory medium such as a hard-drive, memory card, optical disk, solid-state memory, tape memory, or the like. Systems may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
  • The instructions 208 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. The data 210 may be retrieved, stored or modified by one or more processors 204 in accordance with the instructions 208. As an example, data 210 of memory 206 may store information, such as calibration information, to be used when calibrating different types of sensors.
  • The one or more processors 204 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor. Although FIG. 2 functionally illustrates the processor(s), memory, and other elements of computing devices 202 as being within the same block, such devices may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing. Similarly, the memory 206 may be a hard drive or other storage media located in a housing different from that of the processor(s) 204. Accordingly, references to a processor or computing device will be understood to include references to a collection of processors or computing devices or memories that may or may not operate in parallel.
  • In one example, the computing devices 202 may form an autonomous driving computing system incorporated into vehicle 100 or 120. The autonomous driving computing system may be capable of communicating with various components of the vehicle. For example, returning to FIG. 2, the computing devices 202 may be in communication with various systems of the vehicle, including a driving system including a deceleration system 212 (for controlling braking of the vehicle), acceleration system 214 (for controlling acceleration of the vehicle), steering system 216 (for controlling the orientation of the wheels and direction of the vehicle), signaling system 218 (for controlling turn signals), navigation system 220 (for navigating the vehicle to a location or around objects) and a positioning system 222 (for determining the position of the vehicle). The computing devices 202 are also operatively coupled to a perception system 224 (for detecting objects in the vehicle's environment), a power system 226 (for example, a battery and/or gas or diesel powered engine) and a transmission system 230 in order to control the movement, speed, etc., of vehicle 100 in accordance with the instructions 208 of memory 206 in an autonomous driving mode which does not require or need continuous or periodic input from a passenger of the vehicle. The wheels/tires 228 are coupled to the transmission system 230, and the computing devices 202 may be able to receive information about tire pressure, balance and other factors that may impact driving in an autonomous mode.
  • The computing devices 202 may control the direction and speed of the vehicle by controlling various components. By way of example, computing devices 202 may navigate the vehicle to a destination location completely autonomously using data from the map information and navigation system 220. Computing devices 202 may use the positioning system 222 to determine the vehicle's location and the perception system 224 to detect and respond to objects when needed to reach the location safely. In order to do so, computing devices 202 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 214), decelerate (e.g., by decreasing the fuel supplied to the engine, changing gears, and/or by applying brakes by deceleration system 212), change direction (e.g., by turning the front or other wheels of vehicle 100 or 120 by steering system 216), and signal such changes (e.g., by lighting turn signals of signaling system 218). Thus, the acceleration system 214 and deceleration system 212 may be a part of a drivetrain or other transmission system 230 that includes various components between an engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, computing devices 202 may also control the transmission system 230 of the vehicle in order to maneuver the vehicle autonomously.
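The control flow described above can be sketched as a routine that routes a simple maneuver to the named systems. This is purely an illustrative sketch, not the actual control interface; system and command names are invented for the example:

```python
# Illustrative sketch of how a computing device might sequence commands to
# the deceleration, acceleration, signaling and steering systems for a turn.

def plan_turn(current_speed_mps, target_speed_mps, wheel_angle_deg):
    """Return an ordered list of (system, command) pairs for a simple turn."""
    commands = []
    if target_speed_mps < current_speed_mps:
        commands.append(("deceleration_system", "apply_brakes"))
    elif target_speed_mps > current_speed_mps:
        commands.append(("acceleration_system", "increase_energy"))
    # Signal the intended change before turning the wheels.
    commands.append(("signaling_system",
                     "left_turn_signal" if wheel_angle_deg < 0
                     else "right_turn_signal"))
    commands.append(("steering_system", f"set_wheel_angle:{wheel_angle_deg}"))
    return commands
```

Ordering the signaling command before the steering command mirrors the text's point that the vehicle signals its intent to other drivers before or while changing direction.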
  • As an example, computing devices 202 may interact with deceleration system 212 and acceleration system 214 in order to control the speed of the vehicle. Similarly, steering system 216 may be used by computing devices 202 in order to control the direction of vehicle. For example, if the vehicle is configured for use on a road, such as a tractor-trailer or a bus, the steering system 216 may include components to control the angle of wheels to turn the vehicle. Signaling system 218 may be used by computing devices 202 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.
  • Navigation system 220 may be used by computing devices 202 in order to determine and follow a route to a location. In this regard, the navigation system 220 and/or data 210 may store map information, e.g., highly detailed maps that computing devices 202 can use to navigate or control the vehicle. As an example, these maps may identify the shape and elevation of roadways, lane markers, intersections, crosswalks, speed limits, traffic signal lights, buildings, signs, real time traffic information, vegetation, or other such objects and information. The lane markers may include features such as solid or broken double or single lane lines, solid or broken lane lines, reflectors, etc. A given lane may be associated with left and right lane lines or other lane markers that define the boundary of the lane. Thus, most lanes may be bounded by a left edge of one lane line and a right edge of another lane line.
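The lane-boundary idea above, where a lane is bounded by the edges of two lane markers, can be sketched as a toy lane record. Everything here is simplified to straight, axis-aligned markers so the "is this point inside the lane?" question becomes an interval test; real map data would use curved geometry, and the field names and values are illustrative assumptions:

```python
# Toy lane record: a lane bounded on each side by a lane marker, plus a
# speed limit as an example of the per-lane information a detailed map
# might carry. Values are illustrative, not from the source.
LANE = {
    "left_marker": {"style": "solid_single", "x": 0.0},
    "right_marker": {"style": "broken_single", "x": 3.7},  # ~lane width in meters
    "speed_limit_mps": 29.0,
}

def in_lane(lane, x):
    """True if lateral position x lies between the lane's two markers."""
    return lane["left_marker"]["x"] <= x <= lane["right_marker"]["x"]
```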
  • The perception system 224 also includes one or more components for detecting objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc. For example, the perception system 224 may include one or more light detection and ranging (LIDAR) sensors, sonar devices, radar units, cameras, inertial (e.g., gyroscopic) sensors, and/or any other detection devices that record data which may be processed by computing devices 202. The sensors of the perception system may detect objects and their characteristics such as location, orientation, size, shape, type (for instance, vehicle, pedestrian, bicyclist, etc.), heading, and speed of movement, etc. The raw data from the sensors and/or the aforementioned characteristics can be sent to the computing devices 202 for further processing, periodically and continuously, as it is generated by the perception system 224. Computing devices 202 may use the positioning system 222 to determine the vehicle's location and the perception system 224 to detect and respond to objects when needed to reach the location safely. In addition, the computing devices 202 may perform calibration of individual sensors, all sensors in a particular sensor assembly, or between sensors in different sensor assemblies.

  • As indicated in FIG. 2, the perception system 224 includes one or more sensor assemblies 232, which may be arranged as sensor towers integrated into the side-view mirrors on the truck, bus or other large vehicle such as construction equipment. A connection conduit 234 provides the necessary power, communication, cooling/heating and other connections between a given sensor housing assembly and the vehicle. For instance, a data communication bus may provide bidirectional communication between the sensors of the sensor housing assembly and the computing devices 202. A power line may be connected directly or indirectly to the power system 226, or to a separate power source such as a battery controlled by the computing devices 202. A cooling line may also couple to the power system 226 or to a dedicated cooling system of the vehicle. The cooling may be active, e.g., using a cooling fluid or forced cool air, or passive. Alternatively, in very cold or wintry environments, heating may be applied instead of cooling.
  • FIGS. 3A and 3B illustrate two examples of sensor assemblies. For instance, FIG. 3A illustrates a sensor assembly 300 with a housing 302 and a mounting element 304. As shown, a mirror 306 is arranged on an outside surface of the housing 302. FIG. 3B similarly illustrates another sensor assembly 320 with a housing 322 and a mounting element 324. Here, multiple mirrors 326 a and 326 b may be arranged on different exterior surfaces of the housing 322. Each housing is configured to store the various LIDAR sensors, sonar devices, radar units, cameras, inertial and/or gyroscopic sensors therein. The mounting elements are configured to rigidly secure the housing to the vehicle. For instance, mounting element 304 may couple the housing 302 to the cab of a tractor-trailer vehicle such as vehicle 100. And mounting element 324 may couple the housing 322 to the side of a bus such as bus 120. Each side of the vehicle may have a housing 302 or 322 rigidly mounted thereon.
  • FIG. 4A illustrates an example of the housing 302 with selected sensors illustrated therein. For instance, the sensors may include a long range, narrow field of view (FOV) LIDAR 400 and a short range, tall FOV LIDAR 402. In one example, the long range LIDAR 400 may have a range exceeding 50-250 meters, while the short range LIDAR 402 has a range no greater than 1-50 meters. Alternatively, the short range LIDAR 402 may generally cover up to 10-15 meters from the vehicle while the long range LIDAR 400 may cover a range exceeding 100 meters. In another example, the long range is between 10-200 meters, while the short range has a range of 0-20 meters. In a further example, the long range exceeds 80 meters while the short range is below 50 meters. Intermediate ranges of between, e.g., 10-100 meters can be covered by one or both of the long range and short range LIDARs, or by a medium range LIDAR that may also be included in the housing 302. The medium range LIDAR may be disposed between the long and short range LIDARs, and may be aligned about the same common axis or other fixed point as discussed below.
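The overlapping range bands described above can be expressed as a small coverage check. The thresholds below use one of the example splits from this paragraph (short range 0-20 meters, long range 10-200 meters); the function name is an assumption for illustration, not part of the disclosure.

```python
# Sketch of deciding which co-located LIDAR in the housing covers a target
# at a given distance, using example ranges from the description (short
# range LIDAR 402: 0-20 m; long range LIDAR 400: 10-200 m; overlap 10-20 m).
SHORT_RANGE_MAX_M = 20.0
LONG_RANGE_MIN_M = 10.0
LONG_RANGE_MAX_M = 200.0


def sensors_covering(distance_m: float) -> list:
    """Return the LIDARs whose range band covers a target at distance_m."""
    covering = []
    if 0.0 <= distance_m <= SHORT_RANGE_MAX_M:
        covering.append("short_range_lidar")
    if LONG_RANGE_MIN_M <= distance_m <= LONG_RANGE_MAX_M:
        covering.append("long_range_lidar")
    return covering
```

Targets in the 10-20 meter band are seen by both sensors, which is one way a medium range gap can be covered without a third LIDAR.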
  • A set of cameras 404 may be distributed along the housing 302, for instance to provide forward, side and rear-facing imagery. Similarly, a set of radars 406 may be distributed along the housing 302 to provide forward, side and rear-facing data. And the sensors 408 may include an inertial sensor, a gyroscope, an accelerometer and/or other sensors. Each of the sensors may be aligned or arranged relative to a common axis 409 or physical point within the housing 302. Examples of these sensors are also illustrated in FIG. 4C. And FIGS. 4B and 4D illustrate a conduit 410 for providing integrated power, data and cooling to the housings. While only one conduit 410 is illustrated, multiple conduits may be provided in each mounting element.
  • Example Implementations
  • In addition to the structures and configurations described above and illustrated in the figures, various implementations will now be described.
  • As noted above, for large trucks, buses, construction equipment and other vehicles, it may be impractical to place sensors on the roof of the vehicle. The roof can be hard to access and has side view limitations. In addition, mounting various sensors on the roof may interfere with aerodynamic roof fairings. While different sensors could be distributed along the front, sides and rear of the vehicle, this may be costly and require running individual data, power and/or cooling lines to each individual sensor. Furthermore, such a solution could be very hard to implement with legacy vehicles, or when the cab of a truck is capable of operating in an autonomous mode but the trailer is a legacy trailer without the necessary sensors.
  • Thus, according to one aspect, the sensor housing is integrated into a side view mirror assembly, such as shown in FIGS. 3A and 3B. A side mirror assembly is very sturdy, being mounted to the vehicle by a mounting element 304 or 324 that may be cast metal or some other durable material. The sensors, which may weigh 10 kg or more, can be safely secured to the vehicle via the sensor housing. A side view mirror sensor housing could be provided with a new vehicle, or could be easily retrofitted onto an older vehicle chassis.
  • Assembling the system would include running the conduit from the sensor housing to the truck cab or vehicle chassis. Aggregating the cooling, power and data lines in the conduit, or in separate sub-conduits, and running them to one location on the side of the vehicle significantly simplifies the design, lowers the cost of the components and reduces the time and expense of putting the sensors on the vehicle.
  • Furthermore, the typical height of the side view mirror for a semi-truck or a bus is on the order of 2 meters, more or less, for instance between 1.5-2.5 meters from the ground. This may be an ideal height for the LIDARs, radars, cameras and other sensors of an integrated sensor tower. And because truck and bus side view mirrors are designed to provide clear lines of sight down the side of the vehicle, the sensors within the housing will enjoy the same visibility. In addition, placing the sensors in the side view mirror assembly protects them from road debris and wheel splash, as the sensors will be at least 1.5-2.5 meters from the ground and away from the wheel wells.
  • Integrating the sensor housing as part of the side view mirror has the added benefit of avoiding occlusion by a conventional side view mirror. And by conforming to the form factors and placements of side view mirrors, the sensor housing will conform to requirements set forth by the U.S. National Highway Traffic Safety Administration and other governing bodies regarding placement of such elements external to the vehicle. And from a branding standpoint, a common appearance can be provided with a sensor assembly used by various types of large vehicles.
  • While arranging multiple types of sensors in a side view mirror housing for a large truck or bus may be different than a solution employed for a smaller passenger vehicle, the sensors and algorithms for those sensors that are designed to work with passenger cars can be employed in this new arrangement as well. For instance, the height of the sensors, at around 1.5-2.5 meters, is approximately the height of sensors located on the roof of a sedan or sport utility vehicle.
  • One advantage of co-locating the sensors in the side view mirror housing is that this location provides visibility over the hood of the vehicle and more than a 180° FOV for sensors such as LIDARs, radars and cameras. An example of this is shown in FIG. 5, which illustrates coverage 500 for both long range LIDARs and short range LIDARs on both sides of a tractor-trailer.
  • The long range LIDARs may be located along a top or upper area of the sensor housings 502. For instance, this portion of the housing 502 may be located closest to the top of the truck cab or roof of the vehicle. This placement allows the long range LIDAR to see over the hood of the vehicle. And the short range LIDARs may be located along a bottom area of the sensor housing 502 opposite the long range LIDARs. This allows the short range LIDARs to cover areas immediately adjacent to the cab of the truck or the front portion of a bus. This would allow the perception system to determine whether an object such as another vehicle, pedestrian, bicyclist, etc. is next to the front of the vehicle and take that information into account when determining how to drive or turn. Both types of LIDARs may be co-located in the housing, aligned along a common axis.
  • As illustrated in FIG. 5, the long range LIDARs on the left and right sides of the vehicle have fields of view 504. These encompass significant areas along the sides and front of the vehicle. As shown, there is an overlap region 506 of their fields of view in front of the vehicle. A space is shown between regions 504 and 506 for clarity; however, in actuality there is no break in the coverage. The short range LIDARs on the left and right sides have smaller fields of view 508. The overlap region 506 provides the perception system with additional information about a very important region that is directly in front of the vehicle. This redundancy also has a safety aspect. Should one of the long range LIDAR sensors suffer degradation in performance, the redundancy would still allow for operation in an autonomous mode.
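Treating each LIDAR's horizontal coverage as an azimuth interval gives a simple way to reason about the overlap region 506. The angles below are hypothetical placeholders, chosen only to show how left-side and right-side fields of view can overlap directly ahead of the vehicle; they are not values from the disclosure.

```python
# Sketch of computing the shared coverage of two sensor fields of view,
# each modeled as an azimuth interval in degrees (0 deg = straight ahead).
# The FOV bounds are illustrative assumptions.
def fov_overlap(fov_a, fov_b):
    """Return the overlapping azimuth interval of two FOVs, or None."""
    lo = max(fov_a[0], fov_b[0])
    hi = min(fov_a[1], fov_b[1])
    return (lo, hi) if lo < hi else None


left_lidar_fov = (-170.0, 30.0)   # sweeps from rear-left around past front
right_lidar_fov = (-30.0, 170.0)  # sweeps from front around to rear-right

# The shared interval directly ahead of the cab, analogous to region 506.
overlap = fov_overlap(left_lidar_fov, right_lidar_fov)
```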
  • FIG. 6 illustrates coverage 600 for either (or both) of radar and camera sensors on both sides of a tractor-trailer. Here, there may be multiple radar and/or camera sensors in the sensor housing 602. As shown, there may be sensors with side and rear fields of view 604 and sensors with forward facing fields of view 606. The sensors may be arranged so that the side and rear fields of view 604 overlap, and the side fields of view may overlap with the forward facing fields of view 606. As with the long range LIDARs discussed above, the forward facing fields of view 606 also have an overlap region 608. This overlap region provides similar redundancy to the overlap region 506, and has the same benefits should one sensor suffer degradation in performance.
  • In addition to the cost benefits and reduction in installation time, another benefit to co-locating the LIDAR, radar, camera and/or other sensors in a side view mirror housing involves calibration. Placing these sensors in the same housing means that they are all subject to the same relative movement, as they may be affixed within the housing relative to a common axis or reference point of the housing. This reduces the complexity involved in calibrating each sensor individually and with respect to the other co-located sensors. Calibration of all sensors in one of the side view mirror housings can be done for the whole assembly so that everything is referenced to itself. This is easily accomplished because all sensors in the housing can be rigidly mounted with respect to each other.
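The simplification described above, where every sensor pose is stored once relative to the housing's common axis or reference point, can be sketched as a single rigid transform per sensor. A 2-D rotation plus translation is used for brevity; the pose values and function name are illustrative assumptions rather than the calibration procedure of the disclosure.

```python
# Sketch of mapping a detection from one sensor's frame into the shared
# housing frame via that sensor's fixed pose. 2-D for brevity; values are
# illustrative assumptions.
import math


def to_housing_frame(point, sensor_pose):
    """Map (x, y) from a sensor frame into the housing frame.

    sensor_pose = (tx, ty, yaw_radians) of the sensor within the housing.
    """
    tx, ty, yaw = sensor_pose
    x, y = point
    hx = tx + x * math.cos(yaw) - y * math.sin(yaw)
    hy = ty + x * math.sin(yaw) + y * math.cos(yaw)
    return (hx, hy)
```

Because each sensor is rigidly mounted, its pose is a constant, so every sensor's data lands in one consistent frame with a single transform rather than pairwise cross-calibrations.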
  • Furthermore, vehicle level calibration between left and right side sensor housings can be accomplished by matching features (e.g., via convolution) in front of the vehicle, or other overlapping data points. Knowing where the features are with respect to the vehicle also gives the system extrinsic calibrations. And for sensor subsystems, such as an inertial sensor subsystem that may employ redundant sensor packages, the different sensor packages may be mounted in each of the side view mirror housings. This has the added benefit of providing high resolution orientation information for all of the co-located sensors.
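As a toy version of the cross-housing calibration idea, feature points seen by both side assemblies in the overlapping region can yield a residual offset estimate between the two housings. The disclosure does not specify the estimator; the plain mean-offset computation below is an assumption chosen purely for illustration.

```python
# Hedged sketch: given matched feature points observed from the left and
# right housings (in a common vehicle frame), the mean residual estimates
# the relative extrinsic offset between the housings. Estimator is an
# illustrative assumption.
def estimate_offset(left_points, right_points):
    """Average (dx, dy) between matched feature points from each housing."""
    n = len(left_points)
    dx = sum(r[0] - l[0] for l, r in zip(left_points, right_points)) / n
    dy = sum(r[1] - l[1] for l, r in zip(left_points, right_points)) / n
    return (dx, dy)
```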
  • Unless otherwise stated, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of the examples described herein, as well as clauses phrased as “such as,” “including” and the like, should not be interpreted as limiting the subject matter of the claims to the specific examples; rather, the examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings can identify the same or similar elements.

Claims (20)

1. A side sensor assembly for use on a truck or bus capable of operating in an autonomous driving mode, the side sensor assembly comprising:
a housing having one or more exterior surfaces and an interior receptacle, at least one of the one or more exterior surfaces including a side view mirror thereon;
a mounting element having a first end and a second end remote from the first end, the first end being coupled to the housing along one or more mounting points, the second end being configured to rigidly secure the housing to the truck or bus;
a plurality of sensors received within the interior receptacle of the housing, the plurality of sensors including a pair of light detection and ranging (LIDAR) sensors, a first one of the pair of LIDAR sensors being a long range LIDAR having a detection range of at least 50 meters and a second one of the pair of LIDAR sensors being a short range LIDAR having a detection range of no more than 50 meters; and
a conduit received within the mounting element, the conduit providing one or more of a power line, a data line and a cooling line to the plurality of sensors received within the housing and configured for connection to one or more operational systems of the truck or bus.
2. The side sensor assembly of claim 1, wherein:
the long range LIDAR is arranged along a first end of the interior receptacle and the short range LIDAR is arranged along a second end of the interior receptacle opposite the long range LIDAR; and
when the mounting element is affixed to the truck or bus, the long range LIDAR is positioned closer to a roof of the truck or bus than the short range LIDAR so that the long range LIDAR has a field of view that extends past a front hood of the truck or bus during operation.
3. The side sensor assembly of claim 1, wherein the plurality of sensors further includes at least one of a radar sensor and a camera sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
4. The side sensor assembly of claim 3, wherein the at least one radar sensor comprises a plurality of radar sensors arranged to provide overlapping fields of view along a side of the truck or bus during operation.
5. The side sensor assembly of claim 3, wherein the at least one camera sensor comprises a plurality of cameras arranged to provide overlapping fields of view along a side of the truck or bus during operation.
6. The side sensor assembly of claim 3, wherein the plurality of sensors further includes at least one inertial sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
7. The side sensor assembly of claim 1, wherein the plurality of sensors received within the interior receptacle of the housing are affixed within the housing relative to a common axis or physical reference point of the housing.
8. The side sensor assembly of claim 7, wherein the plurality of sensors is calibrated collectively relative to the common axis or physical reference point.
9. The side sensor assembly of claim 1, wherein the side sensor assembly comprises a pair of side sensor assemblies, each one of the pair having a respective housing, mounting element, plurality of sensors and conduit, a first one of the pair being configured for affixation to a left side of the truck or bus and a second one of the pair being configured for affixation to a right side of the truck or bus.
10. A vehicle configured to operate in an autonomous driving mode, the vehicle comprising:
a driving system configured to perform driving operations;
a perception system configured to detect objects in an environment surrounding the vehicle; and
a control system operatively coupled to the driving system and the perception system, the control system having one or more computer processors configured to receive data from the perception system and to direct the driving system when operating in the autonomous driving mode;
wherein the perception system includes a pair of side sensor assemblies attached to opposite sides of the vehicle, each side sensor assembly including:
a housing having one or more exterior surfaces and an interior receptacle, at least one of the one or more exterior surfaces including a side view mirror thereon;
a mounting element having a first end and a second end remote from the first end, the first end being coupled to the housing along one or more mounting points, the second end being configured to rigidly secure the housing to a corresponding side of the vehicle;
a plurality of sensors received within the interior receptacle of the housing, the plurality of sensors including a pair of light detection and ranging (LIDAR) sensors, a first one of the pair of LIDAR sensors being a long range LIDAR having a detection range of at least 50 meters and a second one of the pair of LIDAR sensors being a short range LIDAR having a detection range of no more than 50 meters; and
a conduit received within the mounting element, the conduit providing one or both of a power line and a data line to the plurality of sensors received within the housing and connecting to one or more operational systems of the vehicle.
11. The vehicle of claim 10, wherein:
the long range LIDAR is arranged along a first end of the interior receptacle and the short range LIDAR is arranged along a second end of the interior receptacle opposite the long range LIDAR; and
the long range LIDAR is positioned closer to a roof of the vehicle than the short range LIDAR so that the long range LIDAR has a field of view that extends past a front hood of the vehicle during operation.
12. The vehicle of claim 10, wherein the plurality of sensors in each side sensor assembly further includes at least one of a radar sensor and a camera sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
13. The vehicle of claim 12, wherein the plurality of sensors in each side sensor assembly further includes at least one inertial sensor disposed between the long range LIDAR and the short range LIDAR within the interior receptacle.
14. The vehicle of claim 13, wherein the at least one inertial sensor in each side sensor assembly provides redundancy to the at least one inertial sensor in the other side sensor assembly.
15. The vehicle of claim 10, wherein the plurality of sensors received within the interior receptacle of the housing are affixed within the housing relative to a common axis or physical reference point of the housing.
16. The vehicle of claim 15, wherein the plurality of sensors in each side sensor assembly is calibrated collectively relative to the common axis or physical reference point of that side sensor assembly.
17. The vehicle of claim 10, wherein the plurality of sensors in each side sensor assembly is calibrated relative to the other side sensor assembly.
18. The vehicle of claim 10, wherein the vehicle is one of a truck, a bus, or a construction vehicle.
19. The vehicle of claim 10, wherein the autonomous driving mode is a Level 4 or Level 5 autonomous mode of operation.
20. The vehicle of claim 10, wherein the conduit further provides a cooling line to the plurality of sensors received within the housing.
US16/009,499 2017-12-29 2018-06-15 Sensor integration for large autonomous vehicles Abandoned US20190204845A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US16/009,499 US20190204845A1 (en) 2017-12-29 2018-06-15 Sensor integration for large autonomous vehicles
PCT/US2018/066808 WO2019133437A1 (en) 2017-12-29 2018-12-20 Sensor integration for large autonomous vehicles
EP18834222.4A EP3710862A1 (en) 2017-12-29 2018-12-20 Sensor integration for large autonomous vehicles
CN201880084695.2A CN111566511B (en) 2017-12-29 2018-12-20 Side sensor assembly and autonomous vehicle including the same
US17/724,559 US11899466B2 (en) 2017-12-29 2022-04-20 Sensor integration for large autonomous vehicles

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762611685P 2017-12-29 2017-12-29
US16/009,499 US20190204845A1 (en) 2017-12-29 2018-06-15 Sensor integration for large autonomous vehicles

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/724,559 Continuation US11899466B2 (en) 2017-12-29 2022-04-20 Sensor integration for large autonomous vehicles

Publications (1)

Publication Number Publication Date
US20190204845A1 true US20190204845A1 (en) 2019-07-04

Family

ID=67058259

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/009,499 Abandoned US20190204845A1 (en) 2017-12-29 2018-06-15 Sensor integration for large autonomous vehicles
US17/724,559 Active US11899466B2 (en) 2017-12-29 2022-04-20 Sensor integration for large autonomous vehicles

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/724,559 Active US11899466B2 (en) 2017-12-29 2022-04-20 Sensor integration for large autonomous vehicles

Country Status (4)

Country Link
US (2) US20190204845A1 (en)
EP (1) EP3710862A1 (en)
CN (1) CN111566511B (en)
WO (1) WO2019133437A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200200911A1 (en) * 2018-12-21 2020-06-25 Easymile Method and system for handling blind sectors of scanning layers of redundant sensors in a vehicle
EP3771923A1 (en) * 2019-07-31 2021-02-03 Tusimple, Inc. Lidar mirror sensor assembly
WO2021137884A1 (en) * 2019-12-30 2021-07-08 Waymo Llc Perimeter sensor housings
US11124132B2 (en) * 2019-06-14 2021-09-21 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle
US20210297142A1 (en) * 2020-03-18 2021-09-23 Qualcomm Incorporated Determining beam directions of a repeater
US11142125B2 (en) * 2018-11-01 2021-10-12 Elektrobit Automotive Gmbh Camera device, driver assist system, and vehicle
CN113917452A (en) * 2021-09-30 2022-01-11 北京理工大学 Blind road detection device and method combining vision and radar
WO2022036344A1 (en) * 2019-02-20 2022-02-17 Waymo Llc Self-driving sensor system
US20220097625A1 (en) * 2019-06-14 2022-03-31 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle enabling compensation for uneven road camber
US20220144185A1 (en) * 2019-03-01 2022-05-12 Kodiak Robotics, Inc. Sensor assembly for autonomous vehicles
US20220150392A1 (en) * 2020-11-10 2022-05-12 Deere & Company Work vehicle perception systems and front modules
US20220169270A1 (en) * 2020-11-30 2022-06-02 Nuro, Inc. Hardware systems for an autonomous vehicle
US20220196802A1 (en) * 2019-08-15 2022-06-23 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Autonomous vehicle
US11402497B2 (en) * 2019-08-30 2022-08-02 Robert Bosch Gmbh Device for a vehicle
US11427175B2 (en) * 2019-12-16 2022-08-30 Waymo Llc Vehicle braking systems and methods
US11442167B2 (en) * 2018-09-12 2022-09-13 Sick Ag Sensor and autonomous vehicle
US11557127B2 (en) 2019-12-30 2023-01-17 Waymo Llc Close-in sensing camera system
EP3990330A4 (en) * 2019-07-29 2023-08-09 Waymo LLC Maintaining road safety when there is a disabled autonomous vehicle

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112208440B (en) * 2020-09-23 2022-08-09 东风商用车有限公司 Commercial vehicle rearview mirror structure integrated with automobile auxiliary driving system

Family Cites Families (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD250332S (en) 1976-11-16 1978-11-21 Tetsuo Aiga Rear-view mirror for vehicle
USD356537S (en) 1993-12-13 1995-03-21 Eisenbraun Kenneth D Rear view mirror
USD375070S (en) 1995-12-01 1996-10-29 Donald Yeabower Angled rearview mirror
DE19734169B4 (en) 1997-08-07 2007-02-15 Mekra Lang Gmbh & Co. Kg Commercial vehicle with a mirror in the front area
US6592230B2 (en) 1997-10-16 2003-07-15 Holland Hitch Company Truck rearview mirror assembly having a display for displaying trailer coupling status information
US6229434B1 (en) 1999-03-04 2001-05-08 Gentex Corporation Vehicle communication system
SE520360C2 (en) * 1999-12-15 2003-07-01 Goeran Sjoenell Warning device for vehicles
US6535158B2 (en) 2000-03-15 2003-03-18 Utah State University Research Foundation Kinematic analysis of conically scanned environmental properties
WO2007143756A2 (en) 2006-06-09 2007-12-13 Carnegie Mellon University System and method for autonomously convoying vehicles
US8602573B2 (en) 2006-10-31 2013-12-10 Velvac Incorporated Electronics module for mirrors
GB2447672B (en) 2007-03-21 2011-12-14 Ford Global Tech Llc Vehicle manoeuvring aids
ITMI20081038A1 (en) * 2008-06-06 2009-12-07 Selle Italia Srl MULTIFUNCTIONAL DEVICE FOR VEHICLES
US8245928B2 (en) 2008-10-23 2012-08-21 Lockheed Martin Corporation Dual band threat warning system
USD636308S1 (en) 2009-10-12 2011-04-19 Lang-Mekra North America, Llc Mirror housing
KR101071362B1 (en) * 2011-03-25 2011-10-07 위재영 Vehicular object ranging system and method of operation
US9367065B2 (en) * 2013-01-25 2016-06-14 Google Inc. Modifying behavior of autonomous vehicles based on sensor blind spots and limitations
US9110169B2 (en) * 2013-03-08 2015-08-18 Advanced Scientific Concepts, Inc. LADAR enabled impact mitigation system
US20160272163A1 (en) * 2015-03-17 2016-09-22 Magna Electronics Inc. Vehicle camera with lens washer system
US9625582B2 (en) * 2015-03-25 2017-04-18 Google Inc. Vehicle with multiple light detection and ranging devices (LIDARs)
US10082797B2 (en) * 2015-09-16 2018-09-25 Ford Global Technologies, Llc Vehicle radar perception and localization
US10267908B2 (en) * 2015-10-21 2019-04-23 Waymo Llc Methods and systems for clearing sensor occlusions
US20170158136A1 (en) 2015-12-02 2017-06-08 Ford Global Technologies, Llc Vehicle side mirror system
US10315578B2 (en) * 2016-01-14 2019-06-11 Faraday&Future Inc. Modular mirror assembly
CN108603937B (en) * 2016-01-31 2024-01-05 威力登激光雷达有限公司 LIDAR 3-D imaging with far field illumination overlay
US20180032822A1 (en) 2016-08-01 2018-02-01 Ford Global Technologies, Llc Vehicle exterior monitoring
US10024970B2 (en) * 2016-08-19 2018-07-17 Dura Operating, Llc Sensor housing assembly for attachment to a motor vehicle
JP7124700B2 (en) * 2016-08-26 2022-08-24 ソニーグループ株式会社 MOBILE BODY CONTROL DEVICE, MOBILE BODY CONTROL METHOD, AND MOBILE BODY
US20180095473A1 (en) 2016-10-03 2018-04-05 Navya Autonomous electric vehicle for transportation of goods and/or people
KR102626014B1 (en) 2016-12-07 2024-01-17 한온시스템 주식회사 Vehicle thermal management system
US10703341B2 (en) * 2017-02-03 2020-07-07 Magna Electronics Inc. Vehicle sensor housing with theft protection
USD815577S1 (en) 2017-02-08 2018-04-17 Volvo Lastvagnar Ab Mirror for vehicle
WO2018170074A1 (en) 2017-03-14 2018-09-20 Starsky Robotics, Inc. Vehicle sensor system and method of use
US10195994B2 (en) 2017-04-07 2019-02-05 GM Global Technology Operations LLC Vehicle side mirror automation
WO2018196001A1 (en) 2017-04-28 2018-11-01 SZ DJI Technology Co., Ltd. Sensing assembly for autonomous driving
EP3621864B1 (en) 2017-05-03 2023-04-26 Soltare Inc. Audio processing for vehicle sensory systems
JP2020520008A (en) * 2017-05-09 2020-07-02 ブレーン コーポレーションBrain Corporation System and method for robot motion control
US10564261B2 (en) 2017-05-11 2020-02-18 Ford Global Technologies, Llc Autonomous vehicle LIDAR mirror
DE102017004842A1 (en) * 2017-05-19 2017-12-14 Daimler Ag Method for monitoring an exterior mirror assembly
US10444759B2 (en) 2017-06-14 2019-10-15 Zoox, Inc. Voxel based ground plane estimation and object segmentation
US20180372875A1 (en) 2017-06-27 2018-12-27 Uber Technologies, Inc. Sensor configuration for an autonomous semi-truck
US20190041859A1 (en) * 2017-08-04 2019-02-07 Aptiv Technologies Limited Sensor failure compensation system for an automated vehicle
US11210744B2 (en) * 2017-08-16 2021-12-28 Mobileye Vision Technologies Ltd. Navigation based on liability constraints
US10656245B2 (en) 2017-09-05 2020-05-19 Valeo Radar Systems, Inc. Automotive radar sensor blockage detection using adaptive overlapping visibility
US10391943B2 (en) * 2017-10-09 2019-08-27 Ford Global Technologies, Llc Vehicle lamp assembly
WO2019079211A1 (en) 2017-10-19 2019-04-25 DeepMap Inc. Lidar to camera calibration for generating high definition maps
US11052913B2 (en) * 2017-10-23 2021-07-06 Uatc, Llc Cargo trailer sensor assembly
US10551276B2 (en) 2017-12-05 2020-02-04 Electricfil Corporation Vehicle coolant flow and coolant quality sensor assembly
JP2020507137A (en) * 2017-12-11 2020-03-05 ベイジン ディディ インフィニティ テクノロジー アンド ディベロップメント カンパニー リミティッド System and method for identifying and positioning objects around a vehicle
US10529238B2 (en) 2017-12-19 2020-01-07 Denso International America, Inc. Blind spot detection system

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11442167B2 (en) * 2018-09-12 2022-09-13 Sick Ag Sensor and autonomous vehicle
US11142125B2 (en) * 2018-11-01 2021-10-12 Elektrobit Automotive Gmbh Camera device, driver assist system, and vehicle
US20200200911A1 (en) * 2018-12-21 2020-06-25 Easymile Method and system for handling blind sectors of scanning layers of redundant sensors in a vehicle
US10962649B2 (en) * 2018-12-21 2021-03-30 Easymile Method and system for handling blind sectors of scanning layers of redundant sensors in a vehicle
WO2022036344A1 (en) * 2019-02-20 2022-02-17 Waymo Llc Self-driving sensor system
US20230056180A1 (en) * 2019-03-01 2023-02-23 Kodiak Robotics, Inc. Sensor assembly with lidar for autonomous vehicles
US20230052632A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with lidar for autonomous vehicles
US20230048707A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with radar for autonomous vehicles
US20230053054A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with radar for autonomous vehicles
US20230047330A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with lidar for autonomous vehicles
US20230053265A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with lidar for autonomous vehicles
US20230048944A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with radar for autonomous vehicles
US20220144185A1 (en) * 2019-03-01 2022-05-12 Kodiak Robotics, Inc. Sensor assembly for autonomous vehicles
US20230058449A1 (en) * 2019-03-01 2023-02-23 Kodiak Robotics, Inc. Sensor assembly for autonomous vehicles
US20230051375A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with lidar for autonomous vehicles
US20230052355A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly for autonomous vehicles
US20230051970A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with radar for autonomous vehicles
US20230047244A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with lidar and radar for autonomous vehicles
US20230049599A1 (en) * 2019-03-01 2023-02-16 Kodiak Robotics, Inc. Sensor assembly with radar for autonomous vehicles
US20230068067A1 (en) * 2019-03-01 2023-03-02 Kodiak Robotics, Inc. Sensor assembly with lidar for autonomous vehicles
US20230057515A1 (en) * 2019-03-01 2023-02-23 Kodiak Robotics, Inc. Sensor assembly for autonomous vehicles
US20220097625A1 (en) * 2019-06-14 2022-03-31 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle enabling compensation for uneven road camber
US11584309B2 (en) * 2019-06-14 2023-02-21 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle enabling lane change decisions
US11932173B2 (en) * 2019-06-14 2024-03-19 Stack Av Co. Mirror pod environmental sensor arrangement for autonomous vehicle enabling compensation for uneven road camber
US11584308B2 (en) * 2019-06-14 2023-02-21 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle enabling lane center offset mimicry
US20220097624A1 (en) * 2019-06-14 2022-03-31 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle enabling lane change decisions
US20220097623A1 (en) * 2019-06-14 2022-03-31 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle enabling lane center offset mimicry
US11124132B2 (en) * 2019-06-14 2021-09-21 Locomation, Inc. Mirror pod environmental sensor arrangement for autonomous vehicle
EP3990330A4 (en) * 2019-07-29 2023-08-09 Waymo LLC Maintaining road safety when there is a disabled autonomous vehicle
US11794640B2 (en) 2019-07-29 2023-10-24 Waymo Llc Maintaining road safety when there is a disabled autonomous vehicle
EP3771923A1 (en) * 2019-07-31 2021-02-03 Tusimple, Inc. Lidar mirror sensor assembly
US11634079B2 (en) 2019-07-31 2023-04-25 Tusimple, Inc. Lidar mirror sensor assembly
EP3885197A4 (en) * 2019-08-15 2023-09-06 Beijing Baidu Netcom Science And Technology Co., Ltd. Autonomous vehicle
US20220196802A1 (en) * 2019-08-15 2022-06-23 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Autonomous vehicle
US11402497B2 (en) * 2019-08-30 2022-08-02 Robert Bosch Gmbh Device for a vehicle
US11427175B2 (en) * 2019-12-16 2022-08-30 Waymo Llc Vehicle braking systems and methods
US11887378B2 (en) 2019-12-30 2024-01-30 Waymo Llc Close-in sensing camera system
US11880200B2 (en) 2019-12-30 2024-01-23 Waymo Llc Perimeter sensor housings
CN115135554A (en) * 2019-12-30 2022-09-30 伟摩有限责任公司 Perimeter sensor housing
WO2021137884A1 (en) * 2019-12-30 2021-07-08 Waymo Llc Perimeter sensor housings
KR102649924B1 (en) * 2019-12-30 2024-03-25 웨이모 엘엘씨 Peripheral sensor housing
JP7457811B2 (en) 2019-12-30 2024-03-28 ウェイモ エルエルシー ambient sensor housing
US11557127B2 (en) 2019-12-30 2023-01-17 Waymo Llc Close-in sensing camera system
US11493922B1 (en) 2019-12-30 2022-11-08 Waymo Llc Perimeter sensor housings
KR20220104244A (en) * 2019-12-30 2022-07-26 웨이모 엘엘씨 Ambient sensor housing
EP4061682A4 (en) * 2019-12-30 2024-01-03 Waymo Llc Perimeter sensor housings
US20210297142A1 (en) * 2020-03-18 2021-09-23 Qualcomm Incorporated Determining beam directions of a repeater
US20220150392A1 (en) * 2020-11-10 2022-05-12 Deere & Company Work vehicle perception systems and front modules
US20220169270A1 (en) * 2020-11-30 2022-06-02 Nuro, Inc. Hardware systems for an autonomous vehicle
US11807259B2 (en) * 2020-11-30 2023-11-07 Nuro, Inc. Hardware systems for an autonomous vehicle
CN113917452A (en) * 2021-09-30 2022-01-11 北京理工大学 Blind road detection device and method combining vision and radar

Also Published As

Publication number Publication date
CN111566511B (en) 2024-04-16
CN111566511A (en) 2020-08-21
US11899466B2 (en) 2024-02-13
US20220244737A1 (en) 2022-08-04
WO2019133437A1 (en) 2019-07-04
EP3710862A1 (en) 2020-09-23

Similar Documents

Publication Publication Date Title
US11899466B2 (en) Sensor integration for large autonomous vehicles
US11125881B2 (en) Lidar-based trailer tracking
US11772719B2 (en) Efficient autonomous trucks
AU2021266362B2 (en) Camera ring structure for autonomous vehicles
KR102649924B1 (en) Peripheral sensor housing
US11794640B2 (en) Maintaining road safety when there is a disabled autonomous vehicle
US11822011B2 (en) Mirrors to extend sensor field of view in self-driving vehicles
US11675357B2 (en) Independently actuated wheel sets for large autonomous self-driving vehicles
US9043071B1 (en) Steering-based scrub braking
US11738743B2 (en) Collision avoidance method and system for a vehicle
KR102665624B1 (en) Proximity detection camera system
US20240106987A1 (en) Multi-Sensor Assembly with Improved Backward View of a Vehicle

Legal Events

Date Code Title Description
AS Assignment

Owner name: WAYMO LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GROSSMAN, WILLIAM;PITZER, BENJAMIN;REEL/FRAME:046102/0014

Effective date: 20180130

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: APPEAL READY FOR REVIEW

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED AFTER REQUEST FOR RECONSIDERATION

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION