EP3997484A1 - System and method for real-time control of an autonomous device - Google Patents

System and method for real-time control of an autonomous device

Info

Publication number
EP3997484A1
Authority
EP
European Patent Office
Prior art keywords
occupancy grid
sdsf
delivery vehicle
global
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20750954.8A
Other languages
German (de)
English (en)
Inventor
Dirk A. Van Der Merwe
Arunabh Mishra
Christopher C. Langenfeld
Michael J. Slate
Christopher J. PRINCIPE
Gregory J. BUITKUS
Justin M. WHITNEY
Raajitha GUMMADI
Derek G. Kane
Emily A. CARRIGG
Patrick Steele
Benjamin V. Hersh
Fnu G SIVA PERUMAL
David Carrigg
Daniel F. Pawlowski
Yashovardhan Chaturvedi
Kartik Khanna
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deka Products LP
Original Assignee
Deka Products LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deka Products LP filed Critical Deka Products LP
Publication of EP3997484A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0025Planning or execution of driving tasks specially adapted for specific operations
    • B60W60/00256Delivery operations
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3453Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/86Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808Evaluating distance, position or velocity data
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0002Automatic control, details of type of controller or control system architecture
    • B60W2050/0004In digital systems, e.g. discrete-time systems involving sampling
    • B60W2050/0005Processor details or data handling, e.g. memory registers or chip architecture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0002Automatic control, details of type of controller or control system architecture
    • B60W2050/0008Feedback, closed loop systems or details of feedback error signal
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0043Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2050/0052Filtering, filters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/54Audio sensitive means, e.g. ultrasound
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/50External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/025Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9318Controlling the steering
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/93185Controlling the brakes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9319Controlling the accelerator
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9323Alternative operation using light waves

Definitions

  • the present teachings relate generally to AVs, and more specifically to autonomous route planning, global occupancy grid management, on-vehicle sensors, surface feature detection and traversal, and real-time vehicle configuration changes.
  • Navigation of autonomous vehicles (AVs) can rely on long range sensors including, for example, but not limited to, LIDAR, cameras, stereo cameras, and radar.
  • Long range sensors can sense between 4 and 100 meters from the AV.
  • object avoidance and/or surface detection typically relies on short range sensors including, for example, but not limited to, stereo-cameras, short-range radar, and ultrasonic sensors. These short range sensors typically observe the area or volume around the AV out to about 5 meters. Sensors can enable, for example, orienting the AV within its environment.
  • Sensors can also enable visioning humans, signage, traffic lights, obstacles, and surface features.
  • The topology of substantially discontinuous surface features (SDSFs) can be unique to a specific geography.
  • SDSFs such as, for example, but not limited to, inclines, edges, curbs, steps, and curb-like geometries (referred to herein, in a non-limiting way, as SDSFs or simply surface features), however, can include some typical characteristics that can assist in their identification.
  • Surface/road conditions and surface types can be recognized and classified by, for example, fusing multi-sensor data, which can be complex and costly. Surface features and conditions can be used to control, in real time, the physical reconfiguration of an AV.
  • Sensors can be used to enable the creation of an occupancy grid that can represent the world for path planning purposes for the AV.
  • Path planning requires a grid that identifies a space as free, occupied, or unknown.
  • a probability that the space is occupied can improve decision-making with respect to the space.
  • Log-odds representation of the probabilities can be used to maintain numerical accuracy near the probability boundaries of 0 and 1.
  • the probability that the cell is occupied can depend at least upon new sensor information, previous sensor information, and prior occupancy information.
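As a rough illustration of the log-odds bookkeeping described in the preceding bullets, the following Python sketch updates a single cell's occupancy estimate from successive measurements; the function names and the 0.7 measurement confidence are illustrative assumptions, not values from the present teachings.

```python
import math

def prob_to_logodds(p):
    # Convert a probability to log-odds; stays finite away from exactly 0 or 1.
    return math.log(p / (1.0 - p))

def logodds_to_prob(l):
    # Convert log-odds back to a probability in (0, 1).
    return 1.0 - 1.0 / (1.0 + math.exp(l))

def update_cell(prior_logodds, p_measurement, p_prior=0.5):
    # Bayesian log-odds update: add the measurement evidence and
    # subtract the prior so it is not counted twice.
    return prior_logodds + prob_to_logodds(p_measurement) - prob_to_logodds(p_prior)

# Example: a cell starts unknown (p = 0.5) and two consecutive scans
# each report it occupied with 0.7 confidence.
cell = prob_to_logodds(0.5)
for measurement in (0.7, 0.7):
    cell = update_cell(cell, measurement)
print(round(logodds_to_prob(cell), 3))  # ~0.845 after two positive observations
```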
  • What is needed is advantageous sensor placement to achieve physical configuration change, variable terrain traversal, and object avoidance. What is needed is the ability to locate SDSFs based on a multi-part model that is associated with several criteria for SDSF identification. What is needed is determining candidate surface feature traversals based upon criteria such as candidate traversal approach angle, candidate traversal driving surface on both sides of the candidate surface feature, and real-time determination of candidate traversal path obstructions. What is needed is a system and method for incorporating drivable surface and device mode information into occupancy grid determination.
  • the AV of the present teachings can autonomously navigate to a desired location.
  • the AV can include sensors, a device controller including a perception subsystem, an autonomy subsystem, and a driver subsystem, a power base, four powered wheels, two caster wheels, and a cargo container.
  • the perception and autonomy subsystems can receive and process sensor information
  • the AV can operate in multiple distinct modes. The modes can enable complex terrain traversal, among other benefits.
  • a combination of the map (surface classification, for example), the sensor data (sensing features surrounding the AV), the occupancy grid (probability that the upcoming path point is occupied), and the mode (ready to traverse difficult terrain or not) can be used to identify the direction, configuration, and speed of the AV.
  • the method of the present teachings for creating a map to navigate at least one SDSF encountered by an AV, where the AV travels a path over a surface, where the surface includes the at least one SDSF, and where the path includes a starting point and an ending point can include, but is not limited to including, accessing point cloud data representing the surface, filtering the point cloud data, forming the filtered point cloud data into processable parts, and merging the processable parts into at least one concave polygon.
  • the method can include locating and labeling the at least one SDSF in the at least one concave polygon.
  • the locating and labeling can form labeled point cloud data.
  • the method can include creating graphing polygons based at least on the at least one concave polygon, and choosing the path from the starting point to the ending point based at least on the graphing polygons.
  • the AV can traverse the at least one SDSF along the path.
  • Filtering the point cloud data can optionally include conditionally removing points representing transient objects and points representing outliers from the point cloud data, and replacing the removed points having a pre-selected height.
  • Forming processable parts can optionally include segmenting the point cloud data into the processable parts, and removing points of a pre-selected height from the processable parts.
  • Merging the processable parts can optionally include reducing the size of the processable parts by analyzing outliers, voxels, and normals, growing regions from the reduced-size processable parts, determining initial drivable surfaces from the grown regions, segmenting and meshing the initial drivable surfaces, locating polygons within the segmented and meshed initial drivable surfaces, and setting the drivable surfaces based at least on the polygons.
  • Locating and labeling the at least one SDSF feature can optionally include sorting the point cloud data of the drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points, and locating the at least one SDSF point based at least on whether the categories of points, in combination, meet at least one first pre-selected criterion.
  • the method can optionally include creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF points, in combination, meet at least one second pre-selected criterion.
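One plausible way to picture a filter with at least three categories of points, and the requirement that the categories in combination meet a criterion, is sketched below; the height bands, window contents, and function names are assumptions chosen only for illustration, not the specific filter of the present teachings.

```python
def classify_point(height_above_ground):
    # Hypothetical three-category filter: ground-level, riser, and top-of-feature bands.
    if height_above_ground < 0.05:
        return "ground"
    if height_above_ground < 0.20:
        return "riser"
    return "top"

def is_candidate_sdsf(points_in_window):
    # A window of nearby points is flagged as a candidate SDSF point only when
    # all three categories are present together (the "in combination" criterion).
    categories = {classify_point(h) for (_, _, h) in points_in_window}
    return {"ground", "riser", "top"} <= categories

# Example window: x, y, height-above-ground triples spanning a curb-like step.
window = [(0.0, 0.0, 0.02), (0.1, 0.0, 0.12), (0.2, 0.0, 0.24)]
print(is_candidate_sdsf(window))  # True: ground, riser, and top points all present
```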
  • Creating graphing polygons further can optionally include creating at least one polygon from the at least one drivable surface.
  • the at least one polygon can include edges.
  • Creating graphing polygons can include smoothing the edges, forming a driving margin based on the smoothed edges, adding the at least one SDSF trajectory to the at least one drivable surface, and removing edges from the at least one drivable surface according to at least one third pre-selected criterion. Smoothing of the edges can optionally include trimming the edges outward. Forming a driving margin of the smoothed edges can optionally include trimming the outward edges inward.
  • the system of the present teachings for creating a map for navigating at least one SDSF encountered by an AV, where the AV travels a path over a surface, where the surface includes the at least one SDSF, where the path includes a starting point and an ending point can include, but is not limited to including, a first processor accessing point cloud data representing the surface, a first filter filtering the point cloud data, a second processor forming processable parts from the filtered point cloud data, a third processor merging the processable parts into at least one concave polygon, a fourth processor locating and labeling the at least one SDSF in the at least one concave polygon, the locating and labeling forming labeled point cloud data, a fifth processor creating graphing polygons, and a path selector choosing the path from the starting point to the ending point based at least on the graphing polygons.
  • the AV can traverse the at least one SDSF along the path.
  • the first filter can optionally include executable code that can include, but is not limited to including, conditionally removing points representing transient objects and points representing outliers from the point cloud data, and replacing the removed points having a pre-selected height.
  • the segmenter can optionally include executable code that can include, but is not limited to including, segmenting the point cloud data into the processable parts, and removing points of a pre-selected height from the processable parts.
  • the third processor can optionally include executable code that can include, but is not limited to including, reducing the size of the processable parts by analyzing outliers, voxels, and normals, growing regions from the reduced-size processable parts, determining initial drivable surfaces from the grown regions, segmenting and meshing the initial drivable surfaces, locating polygons within the segmented and meshed initial drivable surfaces, and setting the drivable surfaces based at least on the polygons.
  • the fourth processor can optionally include executable code that can include, but is not limited to including, sorting the point cloud data of the drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points, and locating the at least one SDSF point based at least on whether the categories of points, in combination, meet at least one first pre-selected criterion.
  • the system can optionally include executable code that can include, but is not limited to including, creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF points, in combination, meet at least one second pre-selected criterion.
  • Creating graphing polygons can optionally include executable code that can include, but is not limited to including, creating at least one polygon from the at least one drivable surface, the at least one polygon including edges, smoothing the edges, forming a driving margin based on the smoothed edges, adding the at least one SDSF trajectory to the at least one drivable surface, and removing edges from the at least one drivable surface according to at least one third pre-selected criterion.
  • Smoothing the edges can optionally include executable code that can include, but is not limited to including, trimming the edges outward.
  • Forming a driving margin of the smoothed edges can optionally include executable code that can include, but is not limited to including, trimming the outward edges inward.
  • the method of the present teachings for creating a map for navigating at least one SDSF encountered by an AV, where the AV travels a path over a surface, where the surface includes the at least one SDSF, where the path includes a starting point and an ending point can include, but is not limited to including, accessing a route topology.
  • the route topology can include at least one graphing polygon that can include filtered point cloud data.
  • the point cloud data can include labeled features and a drivable margin.
  • the method can include transforming the point cloud data into a global coordinate system, determining boundaries of the at least one SDSF, creating SDSF buffers of a pre-selected size around the boundaries, determining which of the at least one SDSFs can be traversed based at least on at least one SDSF traversal criterion, creating an edge/weight graph based at least on the at least one SDSF traversal criterion, the transformed point cloud data, and the route topology, and choosing a path from the starting point to the destination point based at least on the edge/weight graph.
  • the at least one SDSF traversal criterion can optionally include a pre-selected width of the at least one SDSF and a pre-selected smoothness of the at least one SDSF, a minimum ingress distance and a minimum egress distance between the at least one SDSF and the AV including a drivable surface, and a minimum ingress distance between the at least one SDSF and the AV that can accommodate approximately a 90° approach by the AV to the at least one SDSF.
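A minimal sketch of how such traversal criteria might be evaluated follows; the numeric thresholds and field names are illustrative assumptions, since the present teachings only require that pre-selected criteria of this kind exist.

```python
from dataclasses import dataclass

@dataclass
class SdsfCandidate:
    width_m: float              # measured width of the feature
    smoothness: float           # 0..1 smoothness score (assumed metric)
    ingress_clearance_m: float  # drivable surface before the feature
    egress_clearance_m: float   # drivable surface after the feature

def is_traversable(c, min_width=0.3, min_smoothness=0.5,
                   min_ingress=1.0, min_egress=1.0):
    # Illustrative thresholds; the particular numbers are not from the patent.
    wide_and_smooth = c.width_m >= min_width and c.smoothness >= min_smoothness
    room_both_sides = (c.ingress_clearance_m >= min_ingress
                       and c.egress_clearance_m >= min_egress)
    # Enough ingress room to line up a roughly 90-degree approach.
    can_square_up = c.ingress_clearance_m >= min_ingress
    return wide_and_smooth and room_both_sides and can_square_up

print(is_traversable(SdsfCandidate(0.65, 0.8, 1.5, 1.2)))  # True
```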
  • the system of the present teachings for creating a map for navigating at least one SDSF encountered by an AV, where the AV travels a path over a surface, where the surface includes the at least one SDSF, and where the path includes a starting point and an ending point can include, but is not limited to including, a sixth processor accessing a route topology.
  • the route topology can include at least one graphing polygon that can include filtered point cloud data.
  • the point cloud data can include labeled features and a drivable margin.
  • the system can include a seventh processor transforming the point cloud data into a global coordinate system, and an eighth processor determining boundaries of the at least one SDSF.
  • the eighth processor can create SDSF buffers of a pre-selected size around the boundaries.
  • the system can include a ninth processor determining which of the at least one SDSFs can be traversed based at least on at least one SDSF traversal criterion, a tenth processor creating an edge/weight graph based at least on the at least one SDSF traversal criterion, the transformed point cloud data, and the route topology, and a base controller choosing a path from the starting point to the destination point based at least on the edge/weight graph.
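The edge/weight graph and the path choice based on it can be pictured with a generic shortest-path search; the small graph, the weights, and the Dijkstra routine below are an illustrative stand-in, not the specific weighting applied by the tenth processor.

```python
import heapq

def shortest_path(edges, start, goal):
    # edges: {node: [(neighbor, weight), ...]}; standard Dijkstra search.
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in edges.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical graph: nodes are graphing polygons, heavier edges cross SDSFs.
edges = {"A": [("B", 1.0), ("C", 4.0)], "B": [("C", 1.5)], "C": []}
print(shortest_path(edges, "A", "C"))  # (2.5, ['A', 'B', 'C'])
```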
  • the method of the present teachings for creating a map for navigating at least one SDSF encountered by an AV, where the AV travels a path over a surface, where the surface includes the at least one SDSF, and where the path includes a starting point and an ending point can include, but is not limited to including, accessing point cloud data representing the surface.
  • the method can include filtering the point cloud data, forming the filtered point cloud data into processable parts, and merging the processable parts into at least one concave polygon.
  • the method can include locating and labeling the at least one SDSF in the at least one concave polygon. The locating and labeling can form labeled point cloud data.
  • the method can include creating graphing polygons based at least on the at least one concave polygon.
  • the graphing polygons can form a route topology.
  • the point cloud data can include labeled features and a drivable margin.
  • the method can include transforming the point cloud data into a global coordinate system, determining boundaries of the at least one SDSF, creating SDSF buffers of a pre-selected size around the boundaries, determining which of the at least one SDSFs can be traversed based at least on at least one SDSF traversal criterion, creating an edge/weight graph based at least on the at least one SDSF traversal criterion, the transformed point cloud data, and the route topology, and choosing a path from the starting point to the destination point based at least on the edge/weight graph.
  • Filtering the point cloud data can optionally include conditionally removing points representing transient objects and points representing outliers from the point cloud data, and replacing the removed points having a pre-selected height.
  • Forming processable parts can optionally include segmenting the point cloud data into the processable parts, and removing points of a pre-selected height from the processable parts.
  • Merging the processable parts can optionally include reducing the size of the processable parts by analyzing outliers, voxels, and normals, growing regions from the reduced-size processable parts, determining initial drivable surfaces from the grown regions, segmenting and meshing the initial drivable surfaces, locating polygons within the segmented and meshed initial drivable surfaces, and setting the drivable surfaces based at least on the polygons.
  • Locating and labeling the at least one SDSF feature can optionally include sorting the point cloud data of the drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points, and locating the at least one SDSF point based at least on whether the categories of points, in combination, meet at least one first pre-selected criterion.
  • the method can optionally include creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF points, in combination, meet at least one second pre-selected criterion.
  • Creating graphing polygons further can optionally include creating at least one polygon from the at least one drivable surface.
  • the at least one polygon can include edges.
  • Creating graphing polygons can include smoothing the edges, forming a driving margin based on the smoothed edges, adding the at least one SDSF trajectory to the at least one drivable surface, and removing edges from the at least one drivable surface according to at least one third pre-selected criterion. Smoothing of the edges can optionally include trimming the edges outward. Forming a driving margin of the smoothed edges can optionally include trimming the outward edges inward.
  • the at least one SDSF traversal criterion can optionally include a pre-selected width of the at least one SDSF and a pre-selected smoothness of the at least one SDSF, a minimum ingress distance and a minimum egress distance between the at least one SDSF and the AV including a drivable surface, and a minimum ingress distance between the at least one SDSF and the AV that can accommodate approximately a 90° approach by the AV to the at least one SDSF.
  • the system of the present teachings for creating a map for navigating at least one SDSF encountered by an AV, where the AV travels a path over a surface, where the surface includes the at least one SDSF, where the path includes a starting point and an ending point can include, but is not limited to including, a point cloud accessor accessing point cloud data representing the surface, a first filter filtering the point cloud data, a segmenter forming processable parts from the filtered point cloud data, a third processor merging the processable parts into at least one concave polygon, a fourth processor locating and labeling the at least one SDSF in the at least one concave polygon, the locating and labeling forming labeled point cloud data, a fifth processor creating graphing polygons.
  • the route topology can include at least one graphing polygon that can include filtered point cloud data.
  • the point cloud data can include labeled features and a drivable margin.
  • the system can include a seventh processor transforming the point cloud data into a global coordinate system, and an eighth processor determining boundaries of the at least one SDSF.
  • the eighth processor can create SDSF buffers of a pre-selected size around the boundaries.
  • the system can include a ninth processor determining which of the at least one SDSFs can be traversed based at least on at least one SDSF traversal criterion, a tenth processor creating an edge/weight graph based at least on the at least one SDSF traversal criterion, the transformed point cloud data, and the route topology, and a base controller choosing a path from the starting point to the destination point based at least on the edge/weight graph.
  • the first filter can optionally include executable code that can include, but is not limited to including, conditionally removing points representing transient objects and points representing outliers from the point cloud data, and replacing the removed points having a pre-selected height.
  • the segmenter can optionally include executable code that can include, but is not limited to including, segmenting the point cloud data into the processable parts, and removing points of a pre-selected height from the processable parts.
  • the third processor can optionally include executable code that can include, but is not limited to including, reducing the size of the processable parts by analyzing outliers, voxels, and normals, growing regions from the reduced-size processable parts, determining initial drivable surfaces from the grown regions, segmenting and meshing the initial drivable surfaces, locating polygons within the segmented and meshed initial drivable surfaces, and setting the drivable surfaces based at least on the polygons.
  • the fourth processor can optionally include executable code that can include, but is not limited to including, sorting the point cloud data of the drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points, and locating the at least one SDSF point based at least on whether the categories of points, in combination, meet at least one first pre-selected criterion.
  • the system can optionally include executable code that can include, but is not limited to including, creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF points, in combination, meet at least one second pre-selected criterion.
  • Creating graphing polygons can optionally include executable code that can include, but is not limited to including, creating at least one polygon from the at least one drivable surface, the at least one polygon including edges, smoothing the edges, forming a driving margin based on the smoothed edges, adding the at least one SDSF trajectory to the at least one drivable surface, and removing edges from the at least one drivable surface according to at least one third pre-selected criterion.
  • Smoothing the edges can optionally include executable code that can include, but is not limited to including, trimming the edges outward.
  • Forming a driving margin of the smoothed edges can optionally include executable code that can include, but is not limited to including, trimming the outward edges inward.
  • a SDSF can be identified by its dimensions.
  • a curb can include, but is not limited to including, a width of about 0.6-0.7m.
  • point cloud data can be processed to locate SDSFs, and those data can be used to prepare a path for the AV from a beginning point to a destination.
  • the path can be included in the map and provided to the perception subsystem.
  • SDSF traversal can be accommodated through sensor-based positioning of the AV enabled in part by the perception subsystem.
  • the perception subsystem can execute on at least one processor within the AV.
  • the AV can include, but is not limited to including, a power base including two powered front-wheels, two powered back-wheels, energy storage, and at least one processor.
  • the power base can be configured to move at a commanded velocity.
  • the AV can include a cargo platform, mechanically attached to the power base, including a plurality of short-range sensors.
  • the AV can include a cargo container, mounted atop the cargo platform in some configurations, having a volume for receiving one or more objects to deliver.
  • the AV can include a long-range sensor suite, mounted atop the cargo container in some configurations, that can include, but is not limited to including, LIDAR and one or more cameras.
  • the AV can include a controller that can receive data from the long-range sensor suite and the short-range sensor suite.
  • the short-range sensor suite can optionally detect at least one characteristic of the drivable surface, and can optionally include stereo cameras, an IR projector, two image sensors, an RGB sensor, and radar sensors.
  • the short-range sensor suite can optionally supply RGB-D data to the controller.
  • the controller can optionally determine the geometry of the road surface based on RGB-D data received from the short-range sensor suite.
  • the short-range sensor suite can optionally detect objects within 4 meters of the AV, and the long-range sensor suite can optionally detect objects more than 4 meters from the AV.
  • the perception subsystem can use the data collected by the sensors to populate the occupancy grid.
  • the occupancy grid of the present teachings can be configured as a 3D grid of points surrounding the AV, with the AV occupying the center point. In some configurations, the occupancy grid can stretch 10m to the left, right, back, and front of the AV.
  • the grid can include, approximately, the height of the AV, and can virtually travel with the AV as it moves, representing obstacles surrounding the AV.
  • the grid can be converted to two dimensions by reducing its vertical axis, and can be divided into polygons, for example, but not limited to, approximately 5cm x 5cm in size. Obstacles appearing in the 3D space around the AV can be reduced into a 2D shape.
  • a polygon onto which an obstacle is mapped can be given the value of 100, indicating that the space is occupied. Any polygons left unfilled can be given the value of 0, and can be referred to as free space, where the AV can move.
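A minimal sketch of this 3D-to-2D reduction follows, using the 10 m extent, 5 cm cells, and 0/100 occupancy values mentioned above; the function and variable names are illustrative assumptions.

```python
CELL_M = 0.05         # 5 cm x 5 cm cells
HALF_EXTENT_M = 10.0  # grid reaches 10 m left, right, front, and back of the AV
N_CELLS = int(round(2 * HALF_EXTENT_M / CELL_M))  # 400 x 400 cells

def make_grid():
    # 0 means free space; 100 will mean occupied.
    return [[0] * N_CELLS for _ in range(N_CELLS)]

def mark_obstacle(grid, x_m, y_m):
    # Obstacles detected in the 3D volume around the AV are flattened onto the
    # 2D grid; the AV sits at the center cell. Coordinates are AV-relative meters.
    col = int((x_m + HALF_EXTENT_M) / CELL_M)
    row = int((y_m + HALF_EXTENT_M) / CELL_M)
    if 0 <= row < N_CELLS and 0 <= col < N_CELLS:
        grid[row][col] = 100   # occupied

grid = make_grid()
mark_obstacle(grid, 2.0, -1.5)  # hypothetical obstacle 2 m ahead, 1.5 m to the side
print(sum(cell == 100 for row in grid for cell in row))  # 1 occupied cell
```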
  • a method of the present teachings for real-time control of a configuration of a device, where the device includes a chassis, at least four wheels, a first side of the chassis operably coupled with at least one of the at least four wheels, and an opposing second side of the chassis operably coupled with at least one of the at least four wheels, the method can include, but is not limited to including, receiving environmental data, determining a surface type based at least on the environmental data, determining a mode based at least on the surface type and a first configuration, determining a second configuration based at least on the mode and the surface type, determining movement commands based at least on the second configuration, and controlling the configuration of the device by using the movement commands to change the device from the first configuration to the second configuration.
  • the method can optionally include populating the occupancy grid based at least on the surface type and the mode.
  • the environmental data can optionally include RGB-D image data and a topology of a road surface.
  • the configuration can optionally include two clustered pairs of the at least four wheels. A first pair of the two pairs can be positioned on the first side, and a second pair of the two pairs can be positioned on the second side.
  • the first pair can include a first front wheel and a first rear wheel, and the second pair can include a second front wheel and a second rear wheel.
  • the controlling of the configuration can optionally include coordinated powering of the first pair and the second pair based at least on the environmental data.
  • the controlling of the configuration can optionally include transitioning from driving on the at least four wheels with a pair of casters retracted to driving on two wheels, with the clustered first pair and the clustered second pair rotated to lift the first front wheel and the second front wheel.
  • the pair of casters can be operably coupled with the chassis.
  • the device can rest on the first rear wheel, the second rear wheel, and the pair of casters.
  • the controlling of the configuration can optionally include rotating a pair of clusters operably coupled with two powered wheels on the first side and two powered wheels on the second side based at least on the environmental data.
  • the system of the present teachings for real-time control of a configuration of an AV can include, but is not limited to including, a device processor and a power base processor.
  • the AV can include a chassis, at least four wheels, a first side of the chassis, and an opposing second side of the chassis.
  • the device processor can receive real-time environmental data surrounding the AV, determine a surface type based at least on the environmental data, determine a mode based at least on the surface type and a first configuration, and determine a second configuration based at least on the mode and the surface type.
  • the power base processor can enable the AV to move based at least on the second configuration, and can enable the AV to change from the first configuration to the second configuration.
  • the device processor can optionally include populating the occupancy grid based at least on the surface type and the mode.
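The surface-type, mode, and configuration decision flow described in the preceding bullets might be sketched as follows; the surface labels, mode names, and configuration names are assumptions for illustration only, not terminology defined by the present teachings.

```python
def choose_mode(surface_type, current_config):
    # Mode depends on the perceived surface and the current configuration
    # (a device already reconfigured for rough terrain stays in that mode here).
    if surface_type in ("curb", "stairs", "rough") or current_config == "two_wheel_clustered":
        return "enhanced_traction"
    return "standard"

def choose_configuration(mode, surface_type):
    # In an enhanced-traction mode the wheel clusters might rotate so the
    # vehicle rests on the rear wheels and casters (assumed configuration names).
    if mode == "enhanced_traction":
        return "two_wheel_clustered"
    return "four_wheel"

def control_step(environment, current_config):
    surface = environment["surface_type"]        # e.g. from RGB-D classification
    mode = choose_mode(surface, current_config)
    target_config = choose_configuration(mode, surface)
    commands = {"reconfigure_to": target_config, "mode": mode}
    return commands

print(control_step({"surface_type": "curb"}, "four_wheel"))
# {'reconfigure_to': 'two_wheel_clustered', 'mode': 'enhanced_traction'}
```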
  • the method of the present teachings for navigating the AV along a path line in a travel area towards a goal point across at least one SDSF, the AV including a leading edge and a trailing edge can include, but is not limited to including, receiving SDSF information and obstacle information for the travel area, detecting at least one candidate SDSF from the SDSF information, and selecting a SDSF line from the at least one candidate SDSF line based on at least one selection criterion.
  • the method can include determining at least one traversable part of the selected SDSF line based on at least one location of at least one obstacle found in the obstacle information in the vicinity of the selected SDSF line, heading the AV, operating at a first speed towards the at least one traversable part, by turning the AV to travel along a line perpendicular to the traversable part, and constantly correcting a heading of the AV based on a relationship between the heading and the perpendicular line.
  • the method can include driving the AV at a second speed by adjusting the first speed of the AV based at least on the heading and a distance between the AV and the traversable part.
  • the method can include traversing the SDSF by elevating the leading edge relative to the trailing edge and driving the AV at a third increased speed per degree of elevation, and driving the AV at a fourth speed until the AV has cleared the SDSF.
  • Detecting at least one candidate SDSF from the SDSF information can optionally include (a) drawing a closed polygon encompassing a location of the AV, and a location of a goal point, (b) drawing a path line between the goal point and the location of the AV, (c) selecting two SDSF points from the SDSF information, the SDSF points being located within the polygon, and (d) drawing a SDSF line between the two points.
  • Detecting at least one candidate SDSF can include (e) repeating steps (c)-(e) if there are fewer than a first pre-selected number of points within a first pre-selected distance of the SDSF line, and if there have been fewer than a second pre-selected number of attempts at choosing the SDSF points, drawing a line between them, and having fewer than the first pre-selected number of points around the SDSF line.
  • Detecting at least one candidate SDSF can include (f) fitting a curve to the SDSF points that fall within the first pre-selected distance of the SDSF line if there are the first pre-selected number of points or more, (g) identifying the curve as the SDSF line if a first number of the SDSF points that are within the first pre-selected distance of the curve exceeds a second number of the SDSF points within the first pre-selected distance of the SDSF line, and if the curve intersects the path line, and if there are no gaps between the SDSF points on the curve that exceed a second pre-selected distance.
  • Detecting at least one candidate SDSF can include (h) repeating steps (f)-(h) if the number of points that are within the first pre-selected distance of the curve does not exceed the number of points within the first pre-selected distance of the SDSF line, or if the curve does not intersect the path line, or if there are gaps between the SDSF points on the curve that exceed the second pre-selected distance, and if the SDSF line is not remaining stable, and if steps (f)-(h) have not been attempted more than the second pre-selected number of attempts.
  • the closed polygon can optionally include a pre-selected width, and the pre-selected width can optionally include a width dimension of the AV.
  • Selecting the SDSF points can optionally include random selection.
  • the at least one selection criterion can optionally include that a first number of the SDSF points within the first pre-selected distance of the curve exceeds a second number of SDSF points within the first pre-selected distance of the SDSF line, that the curve intersects the path line, and that there are no gaps between the SDSF points on the curve that exceed a second pre-selected distance.
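The random selection of SDSF point pairs, the line drawn between them, and the inlier-style selection criteria described above resemble a RANSAC-style search; the sketch below is one hypothetical rendering of that idea, with distances, attempt counts, and data invented for the example.

```python
import random

def detect_sdsf_line(sdsf_points, max_attempts=20, inlier_dist=0.1, min_inliers=5):
    # Loosely mirroring steps (c)-(h) above: repeatedly pick two SDSF points,
    # draw a line, and keep it only if enough points lie within a
    # pre-selected distance of that line.
    best = None
    for _ in range(max_attempts):
        (x1, y1), (x2, y2) = random.sample(sdsf_points, 2)
        a, b = y2 - y1, x1 - x2                       # line: a*x + b*y + c = 0
        c = -(a * x1 + b * y1)
        norm = (a * a + b * b) ** 0.5 or 1.0
        inliers = [p for p in sdsf_points
                   if abs(a * p[0] + b * p[1] + c) / norm <= inlier_dist]
        if len(inliers) >= min_inliers and (best is None or len(inliers) > len(best)):
            best = inliers
    return best  # points supporting the selected SDSF line, or None

random.seed(0)
curb = [(x * 0.2, 1.0 + 0.01 * x) for x in range(10)] + [(0.5, 3.0)]
print(len(detect_sdsf_line(curb) or []))  # 10: the off-line point is rejected
```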
  • Determining at least one traversable part of the selected SDSF can optionally include selecting a plurality of obstacle points from the obstacle information.
  • Each of the plurality of obstacle points can include a probability that the obstacle point is associated with the at least one obstacle.
  • Determining at least one traversable part can include projecting the plurality of obstacle points to the SDSF line if the probability is higher than a pre-selected percent, and if any of the plurality of obstacle points lies between the SDSF line and the goal point, and if any of the plurality of obstacle points is less than a third pre-selected distance from the SDSF line, forming at least one projection.
  • Determining at least one traversable part can optionally include connecting at least two of the at least one projection to each other, locating end points of the connected at least two projections along the SDSF line, marking as a non-traversable SDSF section the connected at least two projections, and marking as at least one traversable section the SDSF line outside of the non-traversable section.
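A minimal sketch of projecting probable obstacle points onto the SDSF line and marking the blocked spans as non-traversable follows; the probability threshold, projection distance, and blocking radius are illustrative assumptions.

```python
def traversable_sections(sdsf_start, sdsf_end, obstacles,
                         min_prob=0.5, max_dist=0.5, block_radius=0.5):
    # Obstacle points with a high enough occupancy probability that lie close to
    # the SDSF line are projected onto it; the projected spans are marked
    # non-traversable and the remaining spans are returned as traversable.
    (x1, y1), (x2, y2) = sdsf_start, sdsf_end
    dx, dy = x2 - x1, y2 - y1
    length = (dx * dx + dy * dy) ** 0.5
    blocked = []
    for (ox, oy, prob) in obstacles:
        if prob < min_prob:
            continue
        t = ((ox - x1) * dx + (oy - y1) * dy) / (length * length)  # projection parameter
        px, py = x1 + t * dx, y1 + t * dy
        if ((ox - px) ** 2 + (oy - py) ** 2) ** 0.5 <= max_dist:
            blocked.append((max(0.0, t * length - block_radius),
                            min(length, t * length + block_radius)))
    free, cursor = [], 0.0
    for start, end in sorted(blocked):
        if start > cursor:
            free.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < length:
        free.append((cursor, length))
    return free

# Curb running from (0,0) to (4,0); one likely obstacle near the 2 m mark.
print(traversable_sections((0, 0), (4, 0), [(2.0, 0.2, 0.9)]))
# [(0.0, 1.5), (2.5, 4.0)]
```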
  • Traversing the at least one traversable part of the SDSF can optionally include heading the AV, operating at a first speed, towards the traversable part, turning the AV to travel along a line perpendicular to the traversable part, constantly correcting a heading of the AV based on the relationship between the heading and the perpendicular line, and driving the AV at a second speed by adjusting the first speed of the AV based at least on the heading and a distance between the AV and the traversable part.
  • Traversing the at least one traversable part of the SDSF can optionally include if the SDSF is elevated relative to a surface of the travel route, traversing the SDSF by elevating the leading edge relative to the trailing edge and driving the AV at a third increased speed per degree of elevation, and driving the AV at a fourth speed until the AV has cleared the SDSF.
  • Traversing the at least one traversable part of the SDSF can alternatively optionally include (a) ignoring updates of the SDSF information and driving the AV at a pre-selected speed if a heading error is less than a third pre-selected amount with respect to a line perpendicular to the SDSF line, (b) driving the AV forward and increasing the speed of the AV to an eighth pre-selected speed per degree of elevation if an elevation of a front part of the AV relative to a rear part of the AV is between a sixth pre-selected amount and a fifth pre-selected amount, (c) driving the AV forward at a seventh pre-selected speed if the front part is elevated less than a sixth pre-selected amount relative to the rear part, and (d) repeating steps (a)-(d) if the rear part is less than or equal to a fifth pre-selected distance from the SDSF line.
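The heading correction toward a line perpendicular to the SDSF, the speed staging on approach, and the speed increase per degree of elevation might be sketched as follows; the angle convention, thresholds, and gains are assumptions for illustration.

```python
def approach_command(av_heading_deg, sdsf_bearing_deg, distance_m,
                     approach_speed=0.5, slow_speed=0.2):
    # Illustrative approach controller: keep steering toward the line
    # perpendicular to the SDSF and slow down as the feature gets close.
    perpendicular = (sdsf_bearing_deg + 90.0) % 360.0
    heading_error = (perpendicular - av_heading_deg + 180.0) % 360.0 - 180.0
    speed = approach_speed if distance_m > 1.0 else slow_speed
    if abs(heading_error) > 20.0:
        speed = 0.1   # mostly turn in place until roughly square to the SDSF
    return {"turn_deg": heading_error, "speed_mps": speed}

def climb_speed(base_speed, pitch_deg, gain_per_deg=0.02):
    # Once the leading edge rises onto the feature, add a small speed
    # increment per degree of elevation, then hold speed until clear.
    return base_speed + gain_per_deg * max(0.0, pitch_deg)

# AV heading 80 deg approaches a curb running along bearing 0 deg, 0.8 m away.
print(approach_command(80.0, 0.0, 0.8))   # {'turn_deg': 10.0, 'speed_mps': 0.2}
print(round(climb_speed(0.2, 5.0), 2))    # 0.3 when pitched up 5 degrees
```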
  • the SDSF and the wheels of the AV can be automatically aligned to avoid system instability.
  • Automatic alignment can be implemented by, for example, but not limited to, continually testing for and correcting the heading of the AV as the AV approaches the SDSF.
  • Another aspect of the SDSF traversal feature of the present teachings is that the SDSF traversal feature automatically confirms that sufficient free space exists around the SDSF before attempting traversal.
  • traversing SDSFs of varying geometries is possible. Geometries can include, for example, but not limited to, squared and contoured SDSFs.
• the orientation of the AV with respect to the SDSF can determine the speed and direction at which the AV proceeds.
  • the SDSF traversal feature can adjust the speed of the AV in the vicinity of SDSFs. When the AV ascends the SDSF, the speed can be increased to assist the AV in traversing the SDSF.
  • An autonomous delivery vehicle comprising: a power base including two powered front wheels, two powered back wheels and energy storage, the power base configured to move at a commanded velocity and in a commanded direction to perform a transport of at least one object; a cargo platform including a plurality of short-range sensors, the cargo platform mechanically attached to the power base; a cargo container with a volume for receiving the at least one object, the cargo container mounted on top of the cargo platform; a long-range sensor suite comprising LIDAR and one or more cameras, the long-range sensor suite mounted on top of the cargo container; and a controller to receive data from the long-range sensor suite and the plurality of short-range sensors, the controller determining the commanded velocity and the commanded direction based at least on the data, the controller providing the commanded velocity and the commanded direction to the power base to complete the transport.
  • the data from the plurality of short-range sensors comprise at least one
• the autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprises at least one stereo camera.
• the autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprise at least one IR projector, at least one image sensor, and at least one RGB sensor.
• the autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprises at least one radar sensor.
• the autonomous delivery vehicle of claim 1 wherein the data from the plurality of short-range sensors comprise RGB-D data.
• the autonomous delivery vehicle of claim 1 wherein the controller determines a geometry of a road surface based on RGB-D data received from the plurality of short-range sensors.
  • the autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors detect objects within 4 meters of the AV and the long-range sensor suite detects objects more than 4 meters from the autonomous delivery vehicle.
  • the plurality of short-range sensors comprise a cooling circuit.
• the controller comprises: executable code, the executable code including: accessing a map, the map formed by a map processor, the map processor comprising: a first processor accessing point cloud data from the long-range sensor suite, the point cloud data representing the surface; a filter filtering the point cloud data; a second processor forming processable parts from the filtered point cloud data; a third processor merging the processable parts into at least one polygon; a fourth processor locating and labeling the at least one substantially discontinuous surface feature (SDSF) in the at least one polygon, if present, the locating and labeling forming labeled point cloud data; a fifth processor creating graphing polygons from the labeled point cloud data; and a sixth processor choosing a path from a starting point to an ending point based at least on the graphing polygons, the AV traversing the at least one SDSF along the path.
• SDSF: substantially discontinuous surface feature
  • the filter comprises: a seventh processor executing code including: conditionally removing points representing transient objects and points representing outliers from the point cloud data; and replacing the removed points having a pre-selected height.
  • the second processor includes the executable code comprising: segmenting the point cloud data into the processable parts; and removing points of a pre-selected height from the processable parts.
• the fourth processor includes the executable code comprising: creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF point, in combination, meet at least one second pre-selected criterion.
  • creating graphing polygons includes an eighth processor including the executable code comprising: creating at least one polygon from the at least one drivable surface, the at least one polygon including exterior edges; smoothing the exterior edges; forming a driving margin based on the smoothed exterior edges; adding the at least one SDSF trajectory to the at least one drivable surface; and removing interior edges from the at least one drivable surface according to at least one third pre-selected criterion.
• the autonomous delivery vehicle as in claim 17 wherein the smoothing the exterior edges includes a ninth processor including the executable code comprising: trimming the exterior edges outward forming outward edges.
• forming the driving margin of the smoothed exterior edges includes a tenth processor including the executable code comprising: trimming the outward edges inward.
• the at least one SDSF traversal criterion comprises: a pre-selected width of the at least one SDSF and a pre-selected smoothness of the at least one SDSF; a minimum ingress distance and a minimum egress distance between the at least one SDSF and the AV including a drivable surface; and the minimum ingress distance between the at least one SDSF and the AV accommodating approximately a 90° approach by the AV to the at least one SDSF.
• a method for managing a global occupancy grid for an autonomous device comprising: receiving sensor data from sensors associated with the autonomous device; creating a local occupancy grid based at least on the sensor data, the local occupancy grid having local occupancy grid cells; if the autonomous device has moved from a first area to a second area, accessing historical data associated with the second area; creating a static grid based at least on the historical data; moving the global occupancy grid to maintain the autonomous device in a central position of the global occupancy grid; updating the moved global occupancy grid based on the static grid; marking at least one of the global occupancy grid cells as unoccupied, if the at least one of the global occupancy grid cells coincides with a location of the autonomous device; for each of the local occupancy grid cells, calculating a position of the local occupancy grid cell on the global occupancy grid; accessing a first occupied probability from the global occupancy grid cell at the position; accessing a second occupied probability from the local occupancy grid cell at the position; and calculating a new occupied probability based at least on the first occupied probability and the second occupied probability.
  • the method as in claim 22 further comprising: range-checking the new occupied probability.
• the range-checking comprises: setting the new occupied probability to 0 if the new occupied probability < 0; and setting the new occupied probability to 1 if the new occupied probability > 1.
  • the method as in claim 22 further comprising: setting the global occupancy grid cell to the new occupied probability.
• The method as in claim 23 further comprising: setting the global occupancy grid cell to the range-checked new occupied probability.
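For illustration only, a minimal sketch of the per-cell probability update and range check described in the items above, assuming the new occupied probability is derived from the global and local probabilities by a log-odds sum; the text does not specify the fusion rule, only that a new occupied probability is computed from the two values and kept within [0, 1].

```python
import math

def clamp01(p, eps=1e-6):
    # keep probabilities strictly inside (0, 1) so the log-odds stay finite
    return min(max(p, eps), 1.0 - eps)

def logodds(p):
    return math.log(p / (1.0 - p))

def inv_logodds(l):
    return 1.0 / (1.0 + math.exp(-l))

def fuse_cell(global_p, local_p):
    """Derive a new occupied probability for one cell from the global and
    local grid probabilities (here via a log-odds sum), then range-check it."""
    new_p = inv_logodds(logodds(clamp01(global_p)) + logodds(clamp01(local_p)))
    # range check: clamp the result to [0, 1]
    return min(max(new_p, 0.0), 1.0)
```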
  • a method for creating and managing occupancy grids comprising:
• creating a plurality of local occupancy grids; creating a static occupancy grid based on surface characteristics in a repository, the surface characteristics associated with a position of the device; moving a global occupancy grid associated with the position of the device to maintain the device and the local occupancy grid approximately centered with respect to the global occupancy grid; adding information from the static occupancy grid to the global occupancy grid; marking an area in the global occupancy grid currently occupied by the device as unoccupied; for each of at least one cell in each local occupancy grid, determining a location of the at least one cell in the global occupancy grid; accessing a first value at the location; determining a second value at the location based on a relationship between the first value and a cell value at the at least one cell in the local occupancy grid; comparing the second value against a pre-selected probability range; and setting the global occupancy grid with the new value if a probability value is within the pre-selected probability range.
  • the method as in claim 27 further comprising: publishing the global occupancy grid.
• a system for creating and managing occupancy grids comprising: a plurality of local grid creation nodes creating at least one local occupancy grid, the at least one local occupancy grid associated with a position of a device, the at least one local occupancy grid including at least one cell; a global occupancy grid manager accessing the at least one local occupancy grid, the global occupancy grid manager creating a static occupancy grid based on surface characteristics in a repository, the surface characteristics associated with the position of the device, moving a global occupancy grid associated with the position of the device to maintain the device and the at least one local occupancy grid approximately centered with respect to the global occupancy grid; adding information from the static occupancy grid to at least one global occupancy grid; marking an area in the global occupancy grid currently occupied by the device as unoccupied; for each of the at least one cell in each local occupancy grid, determining a location of the at least one cell in the global occupancy grid; accessing a first value at
  • a method for updating a global occupancy grid comprising: if an autonomous device has moved to a new position, updating the global occupancy grid with information from a static grid associated with the new position; analyzing surfaces at the new position; if the surfaces are drivable, updating the surfaces and updating the global occupancy grid with the updated surfaces; and updating the global occupancy grid with values from a repository of static values, the static values being associated with the new position.
  • updating the surfaces comprises: accessing a local occupancy grid associated with the new position; for each cell in the local occupancy grid, accessing a local occupancy grid surface classification confidence value and a local occupancy grid surface classification; if the local occupancy grid surface classification is the same as a global surface classification in the global occupancy grid in the cell, adding a global surface classification confidence value in the global occupancy grid to the local occupancy grid surface classification confidence value to form a sum, and updating the global occupancy grid at the cell with the sum; if the local occupancy grid surface classification is not the same as the global surface classification in the global occupancy grid in the cell, subtracting the local occupancy grid surface classification confidence value from the global surface classification confidence value in the global occupancy grid to form a difference, and updating the global occupancy grid with the difference; if the difference is less than zero, updating the global occupancy grid with the local occupancy grid surface classification.
  • updating the global occupancy grid with the values from the repository of static values comprises: for each cell in a local occupancy grid, accessing a local occupancy grid probability that the cell is occupied value, a logodds value, from the local occupancy grid; updating the logodds value in the global occupancy grid with the local occupancy grid logodds value at the cell; if a pre-selected certainty that the cell is not occupied is met, and if the autonomous device is traveling within lane barriers, and if a local occupancy grid surface classification indicates a drivable surface, decreasing the logodds that the cell is occupied in the local occupancy grid; if the autonomous device expects to encounter relatively uniform surfaces, and if the local occupancy grid surface classification indicates a relatively non-uniform surface, increasing the logodds in the local occupancy grid; and if the autonomous device expects to encounter relatively uniform surfaces, and if the local occupancy grid surface classification indicates a relatively uniform surface, decreasing the logodds in the local occupancy grid.
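For illustration, a hedged sketch of the per-cell surface-classification and log-odds updates described in the two preceding items; the dict field names, the 0.5 adjustment step, and the 'drivable'/'uniform'/'non-uniform' labels are assumptions made only for this example.

```python
def update_cell(global_cell, local_cell, in_lane, expect_uniform,
                not_occupied_certain):
    """Merge one local-grid cell into the global grid.

    global_cell and local_cell are assumed to be dicts with 'surface',
    'confidence', and 'logodds' keys; the field names are illustrative.
    """
    # surface classification and confidence
    if local_cell["surface"] == global_cell["surface"]:
        # same classification: the confidences reinforce each other
        global_cell["confidence"] += local_cell["confidence"]
    else:
        # disagreement: reduce confidence in the stored classification
        diff = global_cell["confidence"] - local_cell["confidence"]
        global_cell["confidence"] = diff
        if diff < 0:
            # the local observation overrides the stored classification
            global_cell["surface"] = local_cell["surface"]

    # occupancy log-odds, biased by context
    logodds = local_cell["logodds"]
    if not_occupied_certain and in_lane and local_cell["surface"] == "drivable":
        logodds -= 0.5   # illustrative decrement
    if expect_uniform and local_cell["surface"] == "non-uniform":
        logodds += 0.5   # illustrative increment
    elif expect_uniform and local_cell["surface"] == "uniform":
        logodds -= 0.5
    global_cell["logodds"] = logodds
```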
  • a method for real-time control of a configuration of a device including a chassis, at least four wheels, a first side of the chassis operably coupled with at least one of the at least four wheels, and an opposing second side of the chassis operably coupled with at least one of the at least four wheels, the method comprising: creating a map based at least on prior surface features and an occupancy grid, the map being created in non-real time, the map including at least one location, the at least one location associated with at least one surface feature, the at least one surface feature being associated with at least one surface classification and at least one mode; determining current surface features as the device travels; updating the occupancy grid in real-time with the current surface features; determining, from the occupancy grid and the map, a path the device can travel to traverse the at least one surface feature.
  • a method for real-time control of a configuration of a device including a chassis, at least four wheels, a first side of the chassis operably coupled with at least one of the at least four wheels, and an opposing second side of the chassis operably coupled with at least one of the at least four wheels, the method comprising: receiving environmental data; determining a surface type based at least on the environmental data; determining a mode based at least on the surface type and a first configuration; determining a second configuration based at least on the mode and the surface type; determining movement commands based at least on the second configuration; and controlling the configuration of the device by using the movement commands to change the device from the first configuration to the second configuration.
• the environmental data comprises RGB-D image data.
• the method as in claim 36 further comprising: populating an occupancy grid based at least on the surface type and the mode; and determining the movement commands based at least on the occupancy grid.
• the method as in claim 38 wherein the occupancy grid comprises information based at least on data from at least one image sensor.
• the method as in claim 36 wherein the environmental data comprises a topology of a road surface.
• the configuration comprises two clustered pairs of the at least four wheels, a first pair of the two pairs being positioned on the first side, a second pair of the two pairs being positioned on the second side, the first pair including a first front wheel and a first rear wheel, and the second pair including a second front wheel and a second rear wheel.
• the controlling of the configuration comprises: coordinated powering of the first pair and the second pair based at least on the environmental data.
• the controlling of the configuration comprises: transitioning from driving the at least four wheels and a pair of casters retracted, the pair of casters operably coupled to the chassis, to driving two wheels with the clustered first pair and the clustered second pair rotated to lift the first front wheel and the second front wheel, the device resting on the first rear wheel, the second rear wheel, and the pair of casters.
  • the controlling of the configuration comprises: rotating a pair of clusters operably coupled with a first two powered wheels on the first side and a second two powered wheels on the second side based at least on the environmental data.
• the device further comprises a cargo container, the cargo container mounted on the chassis, the chassis controlling a height of the cargo container.
• the method as in claim 45 wherein the height of the cargo container is based at least on the environmental data.
  • a system for real-time control of a configuration of a device including a chassis, at least four wheels, a first side of the chassis, and an opposing second side of the chassis, the system comprising: a device processor receiving real-time environmental data surrounding the device, the device processor determining a surface type based at least on the environmental data, the device processor determining a mode based at least on the surface type and a first configuration, the device processor determining a second configuration based at least on the mode and the surface type; and a powerbase processor determining movement commands based at least on the second configuration, the powerbase processor controlling the configuration of the device by using the movement commands to change the device from the first configuration to the second configuration.
  • the environmental data comprises RGB-D image data.
  • the device processor comprises populating an occupancy grid based at least on the surface type and the mode.
  • the powerbase processor comprises determining the movement commands based at least on the occupancy grid.
  • the occupancy grid comprises information based at least on data from at least one image sensor.
• the environmental data comprises a topology of a road surface.
• the configuration comprises two clustered pairs of the at least four wheels, a first pair of the two pairs being positioned on the first side, a second pair of the two pairs being positioned on the second side, the first pair having a first front wheel and a first rear wheel, and the second pair having a second front wheel and a second rear wheel.
• the controlling of the configuration comprises: coordinated powering of the first pair and the second pair based at least on the environmental data.
  • controlling of the configuration comprises: transitioning from driving the at least four wheels and a pair of casters retracted, the pair of casters operably coupled to the chassis, to driving two wheels with the clustered first pair and the clustered second pair rotated to lift the first front wheel and the second front wheel, the device resting on the first rear wheel, the second rear wheel, and the pair of casters.
  • a method for maintaining a global occupancy grid comprising: locating a first position of an autonomous device; when the autonomous device moves to a second position, the second position being associated with the global occupancy grid and a local occupancy grid, updating the global occupancy grid with at least one occupied probability value associated with the first position; updating the global occupancy grid with at least one drivable surface associated with the local occupancy grid; updating the global occupancy grid with surface confidences associated with the at least one drivable surface; updating the global occupancy grid with logodds of the at least one occupied probability value using a first Bayesian function; and adjusting the logodds based at least on characteristics associated with the second position; and when the autonomous device remains in the first position and the global occupancy grid and the local occupancy grid are co-located, updating the global occupancy grid with the at least one drivable surface associated with the local occupancy grid; updating the global occupancy grid with the surface confidences associated with the at least one drivable surface; updating the global occupancy grid with the logodds of the at least one occupied probability
• creating the map comprises: accessing point cloud data representing the surface; filtering the point cloud data; forming the filtered point cloud data into processable parts; merging the processable parts into at least one concave polygon; locating and labeling the at least one SDSF in the at least one concave polygon, the locating and labeling forming labeled point cloud data; creating graphing polygons based at least on the at least one concave polygon; and choosing the path from a starting point to an ending point based at least on the graphing polygons, the AV traversing the at least one SDSF along the path.
  • the method as in claim 57 wherein the filtering the point cloud data comprises: conditionally removing points representing transient objects and points representing outliers from the point cloud data; and replacing the removed points having a pre-selected height.
• forming the processable parts comprises: segmenting the point cloud data into the processable parts; and removing points of a pre-selected height from the processable parts.
  • the merging the processable parts comprises: reducing a size of the processable parts by analyzing outliers, voxels, and normals; growing regions from the reduced-size processable parts; determining initial drivable surfaces from the grown regions; segmenting and meshing the initial drivable surfaces; locating polygons within the segmented and meshed initial drivable surfaces; and setting at least one drivable surface based at least on the polygons.
  • the method as in claim 60 wherein the locating and labeling the at least one SDSF comprises: sorting the point cloud data of the initial drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points; and locating at least one SDSF point based at least on whether the at least three categories of points, in combination, meet at least one first pre-selected criterion.
• the method as in claim 61 further comprising: creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF point, in combination, meet at least one second pre-selected criterion.
  • the creating graphing polygons further comprises: creating at least one polygon from the at least one drivable surface, the at least one polygon including exterior edges; smoothing the exterior edges; forming a driving margin based on the smoothed exterior edges; adding the at least one SDSF trajectory to the at least one drivable surface; and removing interior edges from the at least one drivable surface according to at least one third pre-selected criterion.
  • the smoothing of the exterior edges comprises: trimming the exterior edges outward forming outward edges.
  • forming the driving margin of the smoothed exterior edges comprises: trimming the outward edges inward.
• An autonomous delivery vehicle comprising: a power base including two powered front wheels, two powered back wheels and energy storage, the power base configured to move at a commanded velocity; a cargo platform including a plurality of short-range sensors, the cargo platform mechanically attached to the power base; a cargo container with a volume for receiving one or more objects to deliver, the cargo container mounted on top of the cargo platform; a long-range sensor suite comprising LIDAR and one or more cameras, the long-range sensor suite mounted on top of the cargo container; and a controller to receive data from the long-range sensor suite and the plurality of short-range sensors.
  • the plurality of short-range sensors detect at least one characteristic of a drivable surface.
• The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors are stereo cameras.
• The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors comprise an IR projector, two image sensors and an RGB sensor.
• The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors are radar sensors.
• The autonomous delivery vehicle of claim 66 wherein the short-range sensors supply RGB-D data to the controller.
• The autonomous delivery vehicle of claim 66 wherein the controller determines a geometry of a road surface based on RGB-D data received from the plurality of short-range sensors.
• An autonomous delivery vehicle comprising: a power base including at least two powered back wheels, caster front wheels and energy storage, the power base configured to move at a commanded velocity; a cargo platform including a plurality of short-range sensors, the cargo platform mechanically attached to the power base; a cargo container with a volume for receiving one or more objects to deliver, the cargo container mounted on top of the cargo platform; a long-range sensor suite comprising LIDAR and one or more cameras, the long-range sensor suite mounted on top of the cargo container; and a controller to receive data from the long-range sensor suite and the plurality of short-range sensors.
• the autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors detect at least one characteristic of a drivable surface.
• The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors are stereo cameras.
• The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors comprise an IR projector, two image sensors and an RGB sensor.
• The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors are radar sensors.
• The autonomous delivery vehicle of claim 74 wherein the short-range sensors supply RGB-D data to the controller.
• The autonomous delivery vehicle of claim 74 wherein the controller determines a geometry of a road surface based on RGB-D data received from the plurality of short-range sensors.
• the autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors detect objects within 4 meters of the autonomous delivery vehicle and the long-range sensor suite detects objects more than 4 meters from the autonomous delivery vehicle.
• An autonomous delivery vehicle comprising: a power base including at least two powered back wheels, caster front wheels and energy storage, the power base configured to move at a commanded velocity; a cargo platform, the cargo platform mechanically attached to the power base; and a short-range camera assembly mounted to the cargo platform that detects at least one characteristic of a drivable surface, the short-range camera assembly comprising: a camera; a first light; and a first liquid-cooled heat sink, wherein the first liquid-cooled heat sink cools the first light and the camera.
• the short-range camera assembly further comprises a thermoelectric cooler between the camera and the liquid-cooled heat sink.
• the autonomous delivery vehicle according to claim 83 wherein the first light and the camera are recessed in a cover with openings that deflect illumination from the first light away from the camera.
  • the camera has a field of view and the first light comprises two LEDs with lenses to produce two beams of light that spread to illuminate the field of view of the camera.
• the lights are angled approximately 50° apart and the lenses produce a 60° beam.
• the autonomous delivery vehicle according to claim 83 wherein the short-range camera assembly includes an ultrasonic sensor mounted above the camera.
• The autonomous delivery vehicle according to claim 83, where the short-range camera assembly is mounted in a center position on a front face of the cargo platform.
• The autonomous delivery vehicle according to claim 83, further comprising at least one corner camera assembly mounted on at least one corner of a front face of the cargo platform, the at least one corner camera assembly comprising: an ultrasonic sensor; a corner camera; a second light; and a second liquid-cooled heat sink, wherein the second liquid-cooled heat sink cools the second light and the corner camera.
• The method as in claim 22 wherein the historical data comprises surface data.
• The method as in claim 22 wherein the historical data comprises discontinuity data.
  • FIG. 1-1 is a schematic block diagram of the major components of the system of the present teachings.
  • FIG. 1-2 is a schematic block diagram of the major components of the map processor of the present teachings.
  • FIG. 1-3 is a schematic block diagram of the major components of the perception processor of the present teachings
  • FIG. 1-4 is a schematic block diagram of the major components of the autonomy processor of the present teachings
• FIG. 1A is a schematic block diagram of the system of the present teachings for preparing a travel path for the AV;
• FIG. 1B is a pictorial diagram of an exemplary configuration of a device incorporating the system of the present teachings
• FIG. 1C is a side view of the autonomous delivery vehicle showing fields of view of some long-range and short-range sensors.
  • FIG. 1D is a schematic block diagram of the map processor of the present teachings
• FIG. 1E is a pictorial diagram of the first part of the flow of the map processor of the present teachings.
• FIG. 1F is an image of the segmented point cloud of the present teachings.
  • FIG. 1G is a pictorial diagram of the second part of the map processor of the present teachings.
  • FIG. 1H is an image of the drivable surface detection result of the present teachings
• FIG. 1I is a pictorial diagram of the flow of the SDSF finder of the present teachings.
  • FIG. 1J is a pictorial diagram of the SDSF categories of the present teachings.
• FIG. 1K is an image of the SDSFs identified by the system of the present teachings.
• FIGs. 1L and 1M are pictorial diagrams of the polygon processing of the present teachings
• FIG. 1N is an image of the polygons and SDSFs identified by the system of the present teachings
  • FIG. 2A is an isometric view of the autonomous vehicle of the present teachings
• FIG. 2B is a top view of the cargo container showing fields of view of selected long-range sensors;
• FIGs. 2C-2F are views of the long-range sensor assembly;
• FIG. 2G is a top view of the cargo container showing fields of view of selected short-range sensors;
  • FIG. 2H is an isometric view of the cargo platform of the present teachings.
• FIGs. 2I-2L are isometric views of a short-range sensor
• FIGs. 2M-2N are isometric views of the autonomous vehicle of the present teachings.
• FIGs. 2O-2P are isometric views of the autonomous vehicle of the present teachings with skin panels removed;
  • FIG. 2Q is an isometric view of the autonomous vehicle of the present teachings with part of the top panel removed;
• FIGs. 2R-2V are views of long-range sensors on the autonomous vehicle of the present teachings
• FIGs. 2W-2Z are views of an ultrasonic sensor
• FIGs. 2AA-2BB are views of the center short-range camera assembly
• FIGs. 2CC-2DD are views of the corner short-range camera assembly
• FIGs. 2EE-2HH are various views of the center short-range camera assembly
  • FIG. 3A is a schematic block diagram of the system of one configuration of the present teachings.
  • FIG. 3B is a schematic block diagram of the system of another configuration of the present teachings.
  • FIG. 3C is a schematic block diagram of the system of the present teachings that can initially create the global occupancy grid
  • FIG. 3D is a pictorial representation of the static grid of the present teachings.
• FIGs. 3E and 3F are pictorial representations of the creation of the occupancy grid of the present teachings.
  • FIG. 3G is a pictorial representation of the prior occupancy grid of the present teachings.
  • FIG. 3H is a pictorial representation of the updating the global occupancy grid of the present teachings.
• FIG. 3I is a flow diagram of the method of the present teachings for publishing the global occupancy grid
  • FIG. 3J is a flow diagram of the method of the present teachings for updating the global occupancy grid
• FIGs. 3K-3M are flow diagrams of another method of the present teachings for updating the global occupancy grid.
• FIG. 4A is a perspective pictorial diagram of a device of the present teachings situated in various modes
  • FIG. 4B is a schematic block diagram of the system of the present teachings.
  • FIG. 4C is a schematic block diagram of the drive surface processor components of the present teachings.
  • FIG. 4D is a schematic block/pictorial flow diagram of the process of the present teachings.
• FIGs. 4E and 4F are perspective and side view diagrams, respectively, of a configuration of the device of the present teachings in standard mode;
• FIGs. 4G and 4H are perspective and side view diagrams, respectively, of a configuration of the device of the present teachings in 4-Wheel mode;
• FIGs. 4I and 4J are perspective and side view diagrams, respectively, of a configuration of the device of the present teachings in raised 4-Wheel mode;
  • FIG. 4K is a flowchart of the method of the present teachings.
  • FIG. 5A is a schematic block diagram of the device controller of the present teachings.
  • FIG. 5B is a schematic block diagram of the SDSF processor of the present teachings.
  • FIG. 5C is an image of the SDSF approaches identified by the system of the present teachings.
  • FIG. 5D is an image of the route topology created by the system of the present teachings.
  • FIG. 5E is a schematic block diagram of the modes of the present teachings.
• FIGs. 5F-5J are flowcharts of the method of the present teachings for traversing SDSFs
• FIG. 5K is a schematic block diagram of the system of the present teachings for traversing SDSFs;
• FIGs. 5L-5N are pictorial representations of the method of FIGs. 5F-5H; and
• FIG. 5O is a pictorial representation of converting an image to a polygon.
  • the system and method of the present teachings can use on-board sensors and previously-developed maps to develop an occupancy grid and use these aids to navigate an AV across surface features, including reconfiguring the AV based on the surface type and previous
  • AV system 100 can include a structure upon which sensors 10701 can be mounted, and within which device controller 10111 can execute.
  • the structure can include power base 10112 that can direct movement of wheels that are part of the structure and that can enable movement of the AV.
  • Device controller 10111 can execute on at least one processor located on the AV, and can receive data from sensors 10701 that can be, but are not limited to being, located on the AV.
  • Device controller 10111 can provide speed, direction, and configuration information to base controller 10114 that can provide movement commands to power base 10112.
  • Device controller 10111 can receive map information from map processor 10104, which can prepare a map of the area surrounding the AV.
  • Device controller 10111 can include, but is not limited to including, sensor processor 10703 that can receive and process input from sensors 10701, including on-AV sensors.
  • device controller 10111 can include perception processor 2143, autonomy processor 2145, and driver processor 2127.
  • Perception processor 2143 can, for example, but not limited to, locate static and dynamic obstacles, determine traffic light state, create an occupancy grid, and classify surfaces.
  • Autonomy processor 2145 can, for example, but not limited to, determine the maximum speed of the AV and determine the type of situation the AV is navigating in, for example, on a road, on a sidewalk, at an intersection, and/or under remote control.
  • Driver processor 2127 can, for example, but not limited to, create commands according to the direction of autonomy processor 2145 and send them on to base controller 10114.
  • map processor 10104 can create a map of surface features and can provide the map, through device controller 10111, to perception processor 2143, which can update an occupancy grid.
  • Map processor 10104 can include, among many other aspects, feature extractor 10801, point cloud organizer 10803, transient processor 10805, segmenter 10807, polygon generator 10809, SDSF line generator 10811, and combiner 10813.
  • Feature extractor 10801 can include a first processor accessing point cloud data representing the surface.
  • Point cloud organizer 10803 can include a second processor forming processable parts from the filtered point cloud data.
  • Transient processor 10805 can include a first filter filtering the point cloud data.
  • Segmenter 10807 can include executable code that can include, but is not limited to including, segmenting the point cloud data into the processable parts, and removing points of a pre-selected height from the processable parts.
  • the first filter can optionally include executable code that can include, but is not limited to including, conditionally removing points representing transient objects and points representing outliers from the point cloud data, and replacing the removed points having a pre-selected height.
  • Polygon generator 10809 can include a third processor merging the processable parts into at least one concave polygon.
• the third processor can optionally include executable code that can include, but is not limited to including, reducing the size of the processable parts by analyzing outliers, voxels, and normals, growing regions from the reduced-size processable parts, determining initial drivable surfaces from the grown regions, segmenting and meshing the initial drivable surfaces, locating polygons within the segmented and meshed initial drivable surfaces, and setting the drivable surfaces based at least on the polygons.
  • SDSF line generator 10811 can include a fourth processor locating and labeling the at least one SDSF in the at least one concave polygon, the locating and labeling forming labeled point cloud data.
  • the fourth processor can optionally include executable code that can include, but is not limited to including, sorting the point cloud data of the drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points, and locating the at least one SDSF point based at least on whether the categories of points, in combination, meet at least one first pre-selected criterion.
  • Combiner 10813 can include a fifth processor creating graphing polygons.
  • Creating graphing polygons can optionally include executable code that can include, but is not limited to including, creating at least one polygon from the at least one drivable surface, the at least one polygon including edges, smoothing the edges, forming a driving margin based on the smoothed edges, adding the at least one SDSF trajectory to the at least one drivable surface, and removing edges from the at least one drivable surface according to at least one third pre-selected criterion.
  • Smoothing the edges can optionally include executable code that can include, but is not limited to including, trimming the edges outward.
  • Forming a driving margin of the smoothed edges can optionally include executable code that can include, but is not limited to including, trimming the outward edges inward.
• maps can be provided to an AV that can include on-board sensors, powered wheels, and processors to receive the sensor and map data and use those data to power and configure the AV to traverse various kinds of surfaces, among other things, as the AV, for example, delivers goods.
  • the on-board sensors can provide data that can populate an occupancy grid and can be used to detect dynamic obstacles.
  • the occupancy grid can also be populated by the map.
  • Device controller 10111 can include perception processor 2143 that can receive and process sensor data and map data, and can update the occupancy grid with those data.
  • device controller 10111 can include configuration processor 41023 that can automatically determine the configuration of the AV based at least upon the mode of the AV and encountered surface features.
  • Autonomy processor 2145 can include control processor 40325 that can determine, based at least on the map (the planned route to be followed), the information from configuration processor 41023, and the mode of the AV, what kind of surface needs to be traversed and what configuration the AV needs to assume to traverse the surface.
  • Autonomy processor 2145 can supply commands to motor drive processor 40326 to implement the commands.
  • map processor 10104 can enable a device, for example, but not limited to, an AV or a semi-autonomous device, to navigate in
• the map can include features such as SDSFs.
  • the features in the map can enable, along with on-board sensors, the AV to travel on a variety of surfaces.
  • SDSFs can be accurately identified and labeled so that the AV can automatically maintain the performance of the AV during ingress and egress of the SDSF, and the AV speed, configuration, and direction can be controlled for safe SDSF traversal.
  • system 100 for managing the traversal of SDSFs can include AV 10101, core cloud infrastructure 10103, AV services 10105, device controller 10111, sensor(s) 10701, and power base 10112.
  • AV 10101 can provide, for example, but not limited to, transport and escort services from an origin to a destination, following a dynamically-determined path, as modified by incoming sensor information.
  • AV 10101 can include, but is not limited to including, devices that have autonomous modes, devices that can operate entirely autonomously, devices that can be operated at least partially remotely, and devices that can include a combination of those features.
  • Transport device services 10105 can provide drivable surface information including features to device controller 10111.
  • Device controller 10111 can modify the drivable surface information at least according to, for example, but not limited to, incoming sensor information and feature traversal requirements, and can choose a path for AV 10101 based on the modified drivable surface information.
  • Device controller 10111 can present commands to power base 10112 that can direct power base 10112 to provide speed, direction, and configuration commands to wheel motors and cluster motors, the commands causing AV 10101 to follow the chosen path, and to raise and lower its cargo accordingly.
  • Transport device services 10105 can access route-related information from core cloud infrastructure 10103, which can include, but is not limited to including, storage and content distribution facilities.
• core cloud infrastructure 10103 can include commercial products such as, for example, but not limited to, AMAZON WEB SERVICES.
  • an exemplary AV that can include device controller 10111 (FIG. 1A) that can receive information from map processor 10104 (FIG. 1A) of the present teachings can include a power base assembly such as, for example, but not limited to, the power base that is described fully in, for example, but not limited to, U.S. Patent Application # 16/035,205, filed on July 13, 2018, entitled Mobility Device, or U.S. Patent # 6,571,892, filed on August 15, 2001, entitled Control System and Method, both of which are incorporated herein by reference in their entirety.
  • An exemplary power base assembly is described herein not to limit the present teachings but instead to clarify features of any power base assembly that could be useful in implementing the technology of the present teachings.
  • An exemplary power base assembly can optionally include power base 10112, wheel cluster assembly 11100, and payload carrier height assembly 10068.
  • An exemplary power base assembly can optionally provide the electrical and mechanical power to drive wheels 11203 and clusters 11100 that can raise and lower wheels 11203.
  • Power base 10112 can control the rotation of cluster assembly 11100 and the lift of payload carrier height assembly 10068 to support the substantially discontinuous surface traversal of the present teachings. Other such devices can be used to accommodate the SDSF detection and traversal of the present teachings.
  • sensors internal to an exemplary power base can detect the orientation and rate of change in orientation of AV 10101, motors can enable servo operation, and controllers can assimilate information from the internal sensors and motors.
  • Appropriate motor commands can be computed to achieve transporter performance and to implement the path following commands.
• Left and right wheel motors can drive wheels on either side of AV 10101.
  • front and back wheels can be coupled to drive together, so that two left wheels can drive together and two right wheels can drive together.
  • turning can be accomplished by driving left and right motors at different rates, and a cluster motor can rotate the wheelbase in the fore/aft direction.
  • Payload carrier 10173 can be automatically raised and lowered based at least on the underlying terrain.
  • point cloud data can include route information for the area in which AV 10101 is to travel.
• Point cloud data, possibly collected by a mapping device similar or identical to AV 10101, can be time-tagged.
  • the path along which the mapping device travels can be referred to as a mapped trajectory.
  • Point cloud data processing that is described herein can happen as a mapping device traverses the mapped trajectory, or later after point cloud data collection is complete. After the point cloud data are collected, they can be subjected to point cloud data processing that can include initial filtering and point reduction, point cloud segmentation, and feature detection as described herein.
  • core cloud infrastructure 10103 can provide long- or short-term storage for the collected point cloud data, and can provide the data to AV services 10105.
  • AV services 10105 can select among possible point cloud datasets to find the dataset that covers the territory surrounding a desired starting point for AV 10101 and a desired destination for AV 10101.
  • AV services 10105 can include, but are not limited to including, map processor 10104 that can reduce the size of point cloud data and determine the features represented in the point cloud data.
  • map processor 10104 can determine the location of SDSFs from point cloud data.
  • polygons can be created from the point cloud data as a technique to segment the point cloud data and to ultimately set a drivable surface.
  • SDSF finding and drivable surface determination can proceed in parallel.
  • SDSF finding and drivable surface determination can proceed sequentially.
  • the AV may be configured to deliver cargo and/or perform other functions involving autonomously navigating to a desired location.
  • the AV may be remotely guided.
  • AV 20100 comprises a cargo container that can be opened remotely, in response to user inputs, automatically or manually to allow users to place or remove packages and other items.
  • the cargo container 20110 is mounted on the cargo platform 20160, which is mechanically connected to the power base 20170.
  • the power base 20170 includes the four powered wheels 20174 and two caster wheels 20176.
  • the power base provides speed and directional control to move the cargo container 20110 along the ground and over obstacles including curbs and other discontinuous surface features.
  • cargo platform 20160 is connected to the power base 20170 through two U-frames 20162.
  • Each U-frame 20162 is rigidly attached to the structure of the cargo platform 20160 and includes two holes that allow a rotatable joint 20164 to be formed with the end of each arm 20172 on the power base 20170.
  • the power base controls the rotational position of the arms and thus controls the height and attitude of the cargo container 20110.
  • AV 20100 includes one or more processors to receive data, navigate a path and select the direction and speed of the power base 20170.
  • map processor 10104 of the present teachings can position SDSFs on a map.
  • Map processor 10104 can include, but is not limited to including, feature extractor 10801, point cloud organizer 10803, transient processor 10805, segmenter 10807, polygon generator 10809, SDSF line generator 10811, and data combiner 10813.
  • feature extractor 10801 can include, but is not limited to including, line of sight filtering 10121 of point cloud data 10131 and mapped trajectory 10133. Line of sight filtering can remove points that are hidden from the direct line of sight of the sensors collecting the point cloud data and forming the mapped trajectory.
  • Point cloud organizer 10803 can organize 10151 reduced point cloud data 10132 according to pre-selected criteria possibly associated with a specific feature.
• transient processor 10805 (FIG. 1-2) can remove 10153 transient points from organized point cloud data and mapped trajectory 10133 by any number of methods, including the method described herein. Transient points can complicate processing, in particular if the specific feature is stationary. Segmenter 10807 (FIG. 1-2) can split processed point cloud data 10135 into processable chunks.
• processed point cloud data 10135 can be segmented 10155 into sections having a pre-selected minimum number of points, for example, but not limited to, about 100,000 points.
• further point reduction can be based on pre-selected criteria that could be related to the features to be extracted. For example, if points above a certain height are unimportant to locating a feature, those points could be deleted from the point cloud data.
  • the height of at least one of the sensors collecting point cloud data could be considered an origin, and points above the origin could be removed from the point cloud data when, for example, the only points of interest are associated with surface features.
  • polygon generator 10809 (FIG. 1-2) can locate drivable surfaces by generating 10161 polygons 10139, for example, but not limited to, as described herein.
  • SDSF line generator 10811 (FIG. 1-2) can locate surface features by generating 10163 SDSF lines 10141, for example, but not limited to, as described herein.
  • combiner 10813 (FIG. 1-2) can create a dataset that can be further processed to generate the actual path that AV 10101 (FIG. 1A) can travel by combining 10165 polygons 10139 and SDSFs 10141.
• eliminating 10153 (FIG. 1D), from point cloud data 10131 (FIG. 1D), objects that are transient with respect to mapped trajectory 10133, such as exemplary time-stamped points 10751, can include casting ray 10753 from time-stamped points on mapped trajectory 10133 to each time-stamped point within point cloud data 10131 (FIG. 1D) that has substantially the same time stamp. If ray 10753 intersects a point, for example, point D 10755, between the time-stamped point on mapped trajectory 10133 and the end point of ray 10753, intersecting point D 10755 can be assumed to have entered the point cloud data during a different sweep of the camera.
• the intersecting point, for example, intersecting point D 10755, can be removed from the point cloud data.
  • the result is processed point cloud data 10135 (FIG. 1D), free of, for example, but not limited to, transient objects. Points that had been removed as parts of transient objects but also are substantially at ground level can be returned 10754 to the processed point cloud data 10135 (FIG. 1D).
  • Transient objects cannot include certain features such as, for example, but not limited to, SDSFs 10141 (FIG. 1D), and can therefore be removed without interfering with the integrity of point cloud data 10131 (FIG. 1D) when SDSFs 10141 (FIG. 1D) are the features being detected.
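For illustration only, a minimal sketch of the ray-casting removal of transient points just described, assuming a brute-force ray march and illustrative tolerances (step, hit_radius, ground_z); a production implementation would use a spatial index rather than the quadratic search shown here.

```python
import numpy as np

def remove_transients(points, times, trajectory, ground_z=0.05,
                      step=0.05, hit_radius=0.10):
    """points: (N, 3) array; times: (N,) time stamps; trajectory: dict mapping
    a time stamp to the sensor position at that time.  All names and
    tolerances are illustrative placeholders."""
    keep = np.ones(len(points), dtype=bool)
    for i in range(len(points)):
        p, t = points[i], times[i]
        origin = np.asarray(trajectory[t])
        ray = p - origin
        length = np.linalg.norm(ray)
        if length < 2 * step:
            continue
        direction = ray / length
        same_time = np.where(times == t)[0]
        # march along the ray; any other same-time point near the ray interior
        # is assumed to belong to a different camera sweep and is removed
        for d in np.arange(step, length - step, step):
            sample = origin + d * direction
            dists = np.linalg.norm(points[same_time] - sample, axis=1)
            hits = same_time[dists < hit_radius]
            keep[hits[hits != i]] = False
    # points removed as transients but lying near ground level are returned
    keep |= points[:, 2] < ground_z
    return points[keep]
```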
• segmenting 10155 (FIG. 1D) processed point cloud data 10135 (FIG. 1D) can produce sections 10757 having a pre-selected size and shape, for example, but not limited to, rectangles 10154 (FIG. 1F) having a minimum pre-selected side length and including about 100,000 points. From each section 10757, points that are not necessary for the specific task, for example, but not limited to, points that lie above a pre-selected level, can be removed 10157 (FIG. 1D) to reduce the dataset size.
  • the pre-selected level can be the height of AV 10101 (FIG. 1A).
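For illustration, a small sketch of segmenting the cloud and removing points above the pre-selected level; it uses a fixed x/y grid with an assumed section size and height limit rather than the ~100,000-point rectangles described above.

```python
import numpy as np

def segment_and_trim(points, section_size=5.0, max_height=1.5):
    """Split a cloud into fixed-size x/y sections and drop points above a
    level (e.g. the vehicle height); both values are illustrative."""
    sections = {}
    keys = np.floor(points[:, :2] / section_size).astype(int)
    for key in np.unique(keys, axis=0):
        mask = np.all(keys == key, axis=1)
        section = points[mask]
        # remove points above the pre-selected level
        section = section[section[:, 2] <= max_height]
        sections[tuple(key)] = section
    return sections
```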
  • map processor 10104 can supply to device controller 10111 at least one dataset that can be used to produce direction, speed, and configuration commands to control AV 10101 (FIG. 1A).
  • the at least one dataset can include points that can be connected to other points in the dataset, where each of the lines that connects points in the dataset traverses a drivable surface.
  • segmented point cloud data 10137 can be divided into polygons 10139, and the vertices of polygons 10139 can possibly become the route points.
  • Polygons 10139 can include the features such as, for example, SDSFs 10141.
  • creating processed point cloud data 10135 can include filtering voxels.
  • the centroid of each voxel in the dataset can be used to approximate the points in the voxel, and all points except the centroid can be eliminated from the point cloud data.
  • the center of the voxel can be used to approximate the points in the voxel.
  • Other methods to reduce the size of filtered segments 10251 can be used such as, for example, but not limited to, taking random point subsamples so that a fixed number of points, selected uniformly at random, can be eliminated from filtered segments 10251 (FIG. 1G).
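A minimal sketch of the centroid-per-voxel downsampling described in the preceding items; the voxel size is an illustrative value.

```python
import numpy as np

def voxel_downsample(points, voxel_size=0.05):
    """Replace all points in each voxel with the voxel centroid."""
    voxel_idx = np.floor(points / voxel_size).astype(int)
    _, inverse = np.unique(voxel_idx, axis=0, return_inverse=True)
    counts = np.bincount(inverse)
    centroids = np.zeros((inverse.max() + 1, 3))
    for dim in range(3):
        # average the coordinates of all points that fall in the same voxel
        centroids[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return centroids
```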
  • creating processed point cloud data 10135 can include computing the normals from the dataset from which outliers have been removed and which has been downsized through voxel filtering. Normals to each point in the filtered dataset can be used for various processing possibilities, including curve reconstruction algorithms.
  • estimating and filtering normals in the dataset can include obtaining the underlying surface from the dataset using surface meshing techniques, and computing the normals from the surface mesh.
  • estimating normals can include using approximations to infer the surface normals from the dataset directly, such as, for example, but not limited to, determining the normal to a fitting plane obtained by applying a total least squares method to the k nearest neighbors to the point.
  • the value of k can be chosen based at least on empirical data.
  • Filtering normals can include removing any normals that are more than about 45° from perpendicular to the x-y plane.
  • a filter can be used to align normals in the same direction. If part of the dataset represents a planar surface, redundant information contained in adjacent normals can be filtered out by performing either random sub-sampling, or by filtering out one point out of a related set of points.
  • choosing the point can include recursively decomposing the dataset into boxes until each box contains at most k points. A single normal can be computed from the k points in each box.
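For illustration, a sketch of estimating normals from the k nearest neighbors by a total least squares plane fit and then filtering out normals more than about 45° from perpendicular to the x-y plane; the brute-force neighbor search and the value of k are placeholders.

```python
import numpy as np

def estimate_normals(points, k=20):
    """Fit a plane (total least squares) to each point's k nearest neighbours
    and take the plane normal; k would be chosen empirically."""
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        # brute-force neighbour search (a k-d tree would be used in practice)
        dists = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(dists)[:k]]
        centered = nbrs - nbrs.mean(axis=0)
        # the singular vector with the smallest singular value is the normal
        _, _, vt = np.linalg.svd(centered)
        n = vt[-1]
        normals[i] = n if n[2] >= 0 else -n   # align normals consistently
    return normals

def filter_by_normal(points, normals, max_tilt_deg=45.0):
    """Keep only points whose normal is within max_tilt_deg of vertical."""
    tilt = np.degrees(np.arccos(np.clip(np.abs(normals[:, 2]), 0.0, 1.0)))
    mask = tilt <= max_tilt_deg
    return points[mask], normals[mask]
```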
  • creating processed point cloud data 10135 can include growing regions within the dataset by clustering points that are geometrically compatible with the surface represented the dataset, and refining the surface as the region grows to obtain the best approximation of the largest number of points.
  • Region growing can merge the points in terms of a smoothness constraint.
  • the smoothness constraint can be determined empirically, for example, or can be based on a desired surface smoothness.
• the smoothness constraint can include a range of about 10π/180 to about 20π/180.
  • the output of region growing is a set of point clusters, each point cluster being a set of points, each of which is considered to be a part of the same smooth surface.
  • region growing can be based on the comparison of the angles between normals. Region growing can be accomplished by algorithms such as, for example, but not limited to, region growing segmentation
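A hedged sketch of region growing driven by the angle between normals, with the smoothness constraint expressed in radians (roughly the 10π/180 to 20π/180 range mentioned above); the precomputed neighbor lists are an assumption of the example.

```python
import numpy as np
from collections import deque

def grow_regions(points, normals, neighbors, angle_thresh=np.deg2rad(15)):
    """Cluster points into smooth regions: a neighbour joins a region when the
    angle between its normal and the current point's normal is below
    angle_thresh.  neighbors[i] is a precomputed list of neighbour indices."""
    labels = -np.ones(len(points), dtype=int)
    region = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        queue = deque([seed])
        labels[seed] = region
        while queue:
            i = queue.popleft()
            for j in neighbors[i]:
                if labels[j] != -1:
                    continue
                cos_angle = abs(np.dot(normals[i], normals[j]))
                if np.arccos(np.clip(cos_angle, 0.0, 1.0)) < angle_thresh:
                    labels[j] = region
                    queue.append(j)
        region += 1
    return labels
```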
  • segmented point cloud data 10137 (FIG. 1D) can be used to generate 10161 (FIG. 1D) polygons 10759, for example, 5m x 5m polygons.
  • Point sub-clusters can be converted into polygons 10759 using meshing, for example. Meshing can be accomplished by, for example, but not limited to, standard methods such as marching cubes, marching tetrahedrons, surface nets, greedy meshing, and dual contouring.
  • polygons 10759 can be generated by projecting the local neighborhood of a point along the point’s normal, and connecting unconnected points.
  • Resulting polygons 10759 can be based at least on the size of the neighborhood, the maximum acceptable distance for a point to be considered, the maximum edge length for the polygon, the minimum and maximum angles of the polygons, and the maximum deviation that normals can take from each other. In some configurations, polygons 10759 can be filtered according to whether or not polygons 10759 would be too small for AV
  • a circle the size of AV 10101 (FIG. 1A) can be dragged around each of polygons 10759 by known means. If the circle falls substantially within polygon 10759, then polygon 10759, and thus the resulting drivable surface, can accommodate AV 10101 (FIG. 1A).
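One way to approximate the circle test described above is to erode the polygon by the AV's radius; a non-empty result means a disc of that size fits somewhere inside the polygon. A minimal sketch assuming the shapely library (the radius value is illustrative):

    from shapely.geometry import Polygon

    def accommodates_av(polygon_coords, av_radius=0.6):
        """True if a disc of av_radius fits entirely inside the polygon somewhere."""
        polygon = Polygon(polygon_coords)
        # Eroding by av_radius leaves exactly the set of valid disc-center positions.
        return not polygon.buffer(-av_radius).is_empty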
  • the area of polygon 10759 can be compared to the footprint of AV 10101 (FIG. 1A). Polygons can be assumed to be irregular so that a first step for determining the area of polygons 10759 is to separate polygon 10759 into regular polygons 10759A by known methods.
  • For each regular polygon 10759A, standard area equations can be used to determine its size.
  • the areas of each regular polygon 10759A can be added together to find the area of polygon 10759, and that area can be compared to the footprint of AV 10101 (FIG. 1A).
  • Filtered polygons can include the subset of polygons that satisfy the size criteria. The filtered polygons can be used to set a final drivable surface.
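A minimal sketch of the size criterion, assuming the polygon is given as an ordered list of vertices; the shoelace formula returns the same total area as summing the areas of a decomposition into simpler polygons:

    import numpy as np

    def polygon_area(vertices):
        """Area of a simple polygon from its ordered (N, 2) vertices (shoelace formula)."""
        v = np.asarray(vertices, dtype=float)
        x, y = v[:, 0], v[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

    def satisfies_size_criterion(vertices, av_footprint_m2):
        """Compare the polygon area against the AV footprint used as the size criterion."""
        return polygon_area(vertices) >= av_footprint_m2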
  • polygons 10759 can be processed by removing outliers by conventional means such as, for example, but not limited to, statistical analysis techniques such as those available in the Point Cloud Library, http://pointclouds.org/documentation/tutorials/statistical_outlier.php. Filtering can include downsizing segments 10137 (FIG. 1D) by conventional means including, but not limited to, a voxelized grid approach such as is available in the Point Cloud Library.
  • Concave polygons 10263 can be created, for example, but not limited to, by the process set out in A New Concave Hull Algorithm and Concaveness Measure for n-dimensional Datasets, Park et al., Journal of Information Science and Engineering 28, pp. 587-600, 2012.
  • processed point cloud data 10135 (FIG. 1D) can be used to determine initial drivable surface 10265.
  • Region growing can produce point clusters that can include points that are part of a drivable surface.
  • a reference plane can be fit to each of the point clusters.
  • the point clusters can be filtered according to a relationship between the orientation of the point clusters and the reference plane. For example, if the angle between the point cluster plane and the reference plane is less than, for example, but not limited to, about 30°, the point cluster can be deemed, preliminarily, to be part of an initial drivable surface.
  • point clusters can be filtered based on, for example, but not limited to, a size constraint.
  • point clusters that are greater in point size than about 20% of the total points in point cloud data 10131 (FIG. 1D) can be deemed too large, and point clusters that are smaller in size than about 0.1% of the total points in point cloud data 10131 (FIG. 1D) can be deemed too small.
  • the initial drivable surface can include the filtered point clusters.
  • point clusters can be split apart to continue further processing by any of several known methods.
  • density based spatial clustering of applications with noise (DBSCAN) can be used to split the point clusters, while in some configurations, k-means clustering can be used to split the point clusters.
  • DBSCAN can group together points that are closely packed together, and mark as outliers the points that are substantially isolated or in low-density regions. To be considered closely packed, the point must lie within a pre-selected distance from a candidate point.
  • a scaling factor for the pre-selected distance can be empirically or dynamically determined. In some configurations, the scaling factor can be in the range of about 0.1 to 1.0.
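A minimal sketch of splitting an oversized point cluster with DBSCAN, assuming scikit-learn; the base distance, scaling factor, and minimum sample count are illustrative (the text suggests a scaling factor of roughly 0.1 to 1.0):

    import numpy as np
    from sklearn.cluster import DBSCAN

    def split_cluster(points, base_distance=0.5, scale=0.5, min_samples=10):
        """Split a cluster into densely packed sub-clusters; isolated points are dropped."""
        labels = DBSCAN(eps=base_distance * scale,
                        min_samples=min_samples).fit_predict(points)
        # DBSCAN marks substantially isolated / low-density points with label -1.
        return [points[labels == k] for k in np.unique(labels) if k != -1]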
  • generating 10163 (FIG. 1D) SDSF lines can include locating SDSFs by further filtering of concave polygons 10263 on drivable surface 10265 (FIG. 1H).
  • points from the point cloud data that make up the polygons can be categorized as either upper donut point 10351 (FIG. 1J), lower donut point 10353 (FIG. 1J), or cylinder point 10355 (FIG. 1J).
  • Upper donut points 10351 (FIG. 1J) can fall into the shape of SDSF model 10352 that is farthest from the ground.
  • Each donut 10371 can be divided into multiple parts, for example, two hemispheres. Another criterion for determining if the points in donut 10371 represent an SDSF is whether the majority of the points lie in opposing hemispheres of the parts of donut 10371. Cylinder points 10355 (FIG. 1J) can occur in either first cylinder region 10357 (FIG. 1J) or second cylinder region 10359 (FIG. 1J). Another criterion for SDSF selection is that there must be a minimum number of points in both cylinder regions 10357/10359 (FIG. 1J). In some configurations, the minimum number of points can be selected empirically and can fall into the range of 3 to 20.
  • donut 10371 must include at least two of the three categories of points, i.e. upper donut point 10351 (FIG. 1J), lower donut point 10353 (FIG. 1J), and cylinder point 10355 (FIG. 1J).
  • polygons can be processed in parallel.
  • Each category worker 10362 can search its assigned polygon for SDSF points 10789 (FIG. 1N) and can assign SDSF points 10789 (FIG. 1N) to categories 10763 (FIG. 1G).
  • the resulting point categories 10763 (FIG. 1G) can be combined 10363 forming combined categories 10366, and the categories can be shortened 10365 forming shortened combined categories 10368.
  • Shortening SDSF points 10789 (FIG. 1N) can include filtering SDSF points 10789 (FIG. 1N) with respect to their distances from the ground.
  • Shortened combined categories 10368 can be averaged, possibly processed in parallel by average workers 10373, by searching an area around each SDSF point 10766 (FIG. 1G) and generating average points 10765 (FIG. 1G), the category’s points forming a set of averaged donuts 10375.
  • the radius around each SDSF point 10766 (FIG. 1G) can be determined empirically.
  • the radius around each SDSF point 10766 (FIG. 1G) can include a range of between 0.1m to 1.0m.
  • the height change between one point and another on SDSF trajectory 10377 (FIG. 1G) for the SDSF at average point 10765 (FIG. 1G) can be calculated.
  • Connecting averaged donuts 10375 together can generate SDSF trajectory 10377 (FIGs. 1G and 1K).
  • the next point can be chosen based at least on forming as straight a line as possible among the previous line segments, the starting point, and the candidate destination point, and on which candidate next point represents the smallest change in SDSF height relative to the previous points.
  • SDSF height can be defined as the difference between the height of upper donut 10351 (FIG. 1J) and lower donut 10353 (FIG. 1J).
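A minimal sketch of the next-point selection, scoring each candidate by its deviation from the current heading and its change in SDSF height; the weights are illustrative placeholders, not values from the text:

    import numpy as np

    def choose_next_point(current, heading, candidates, heights, prev_height,
                          alpha=1.0, beta=1.0):
        """Pick the candidate that keeps the line straight and the SDSF height steady.

        current     - (2,) current trajectory point
        heading     - (2,) unit vector along the previous line segment
        candidates  - (N, 2) candidate next points
        heights     - (N,) SDSF height at each candidate (upper minus lower donut)
        prev_height - SDSF height at the previous points
        """
        vectors = candidates - current
        vectors = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
        turn = np.arccos(np.clip(vectors @ heading, -1.0, 1.0))   # deviation from straight
        height_change = np.abs(np.asarray(heights) - prev_height) # change in SDSF height
        return candidates[np.argmin(alpha * turn + beta * height_change)]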
  • combining 10165 (FIG. 1D) concave polygons and SDSF lines can produce a dataset including polygons 10139 (FIG. 1D) and SDSFs 10141 (FIG. 1D), and the dataset can be manipulated to produce graphing polygons with SDSF data.
  • Manipulating concave polygons 10263 can include, but is not limited to including, merging concave polygons 10263 to form merged polygon 10771.
  • Merging concave polygons 10263 can be accomplished using known methods such as, for example, but not limited to, those found in (http://www.angusi.com/delphi/clipper.php).
  • Merged polygon 10771 can be expanded to smooth the edges and form expanded polygon 10772.
  • Expanded polygon 10772 can be contracted to provide a driving margin, forming contracted polygon 10774, to which SDSF trajectories 10377 (FIG. 1M) can be added.
  • Inward trimming (contraction) can ensure that there is room near the edges for AV 10101 (FIG. 1A) to travel by reducing the size of the drivable surface by a pre-selected amount based at least on the size of AV 10101 (FIG. 1A).
  • Polygon expansion and contraction can be accomplished by known methods such as, for example, but not limited to, the ARCGIS® clip command (http://desktop.arcgis.com/en/arcmap/10.3/manage-data/editing-existing-features/clipping-a-polygon-feature.htm).
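A minimal sketch of the merge, expansion, and contraction steps, assuming the shapely library; the smoothing and driving margins are illustrative and would in practice be derived from the AV's size:

    from shapely.geometry import Polygon
    from shapely.ops import unary_union

    def build_drivable_region(concave_polygons, smooth_margin=0.25, drive_margin=0.5):
        """Merge concave polygons, smooth the edges, then contract for a driving margin."""
        merged = unary_union([Polygon(p) for p in concave_polygons])      # merged polygon
        expanded = merged.buffer(smooth_margin)                           # smooth the edges
        contracted = expanded.buffer(-(smooth_margin + drive_margin))     # inward trimming
        return contracted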
  • contracted polygon 10774 can be partitioned into polygons 10778, each of which can be traversed without encountering non- drivable surfaces.
  • Contracted polygon 10774 can be partitioned by conventional means such as, for example, but not limited to, ear slicing, optimized by z-order curve hashing and extended to handle holes, twisted polygons, degeneracies, and self-intersections.
  • SDSF trajectory 10377 can include SDSF points 10789 (FIG. 1N) that can be connected to polygon vertices 10781. Vertices 10781 can be considered to be possible path points that can be connected to each other to form possible travel paths for AV 10101 (FIG. 1A). In the dataset, SDSF points 10789 (FIG. 1N) can be labeled as such. As partitioning progresses, it is possible that redundant edges are introduced such as, for example, but not limited to, edges 10777 and 10779.
  • Removing one of edges 10777 or 10779 can reduce the complexity of further analyses and can retain the polygon mesh.
  • a Hertel-Mehlhorn polygon partitioning algorithm can be used to remove edges, skipping edges that have been labeled as features.
  • the set of polygons 10778, including the labeled features, can be subjected to further simplification to reduce the number of possible path points, and the possible path points can be provided to device controller 10111 (FIG. 1 A) in the form of annotated point data 10379 (FIG. 5B) which can be used to populate the occupancy grid.
  • sensor data gathered by an AV can also be used to populate the occupancy grid.
  • the processors in the AV can receive data from the sensors in long-range sensor assembly 20400 mounted on top of cargo container 20110 and from short-range sensors 20510, 20520, 20530, 20540 and other sensors located in cargo platform 20160.
  • the processors may receive data from optional short-range sensor 20505 mounted near the top of the front of cargo-container 20110.
  • the processors may also receive data from one or more antennas 20122A, 20122B (FIG. 1C) including cellular, WiFi, and/or GPS.
  • AV 20100 has a GPS antenna 20122A (FIG. 1C).
  • processors can be located anywhere in AV 20100. In some examples, one or more processors are located in long-range sensor assembly 20400. Additional processors may be located in cargo platform 20160. In other examples, the processors may be located in cargo container 20110 and/or as part of power base 20170.
  • long-range sensor assembly 20400 is mounted on top of the cargo-container to provide an improved view of the environment surrounding the AV.
  • long-range sensor assembly 20400 is more than 1.2m above the travel surface or ground.
  • long-range sensor assembly 20400 may be 1.8m above the ground that the AV is moving over.
  • Long-range sensor assembly 20400 provides information about environment around the AV from a minimum distance out to a maximum range. The minimum distance may be defined by the relative position of long-range sensors 20400 and cargo-container 20110. The minimum distance may be further defined by the field of view (FOV) of the sensors.
  • the maximum distance may be defined by the range of the long-range sensors in long-range sensor assembly 20400 and/or by the processors. In one example, the range of the long-range sensors is limited to 20 meters. In one example, a Velodyne Puck LIDAR has a range of up to 100m.
  • Long-range sensor assembly 20400 may provide data on objects in all directions. The sensor assembly may provide information on structures, surfaces, and obstacles over a 360° angle around the AV 20100.
  • three long-range cameras observing through windows 20434, 20436 and 20438 can provide horizontal FOVs 20410, 20412, 20414 that together provide 360 ° FOV.
  • the horizontal FOV may be defined by the selected camera and the location of cameras within the long-range camera assembly 20400.
  • the zero angle is a ray located in a vertical plane through the center of the AV 20100 and perpendicular to the front of the AV. The zero angle ray passes through the front of the AV.
  • Front long-range camera viewing through window 20434 has a 96 ° FOV 20410 from 311 ° to 47 ° .
  • Long-range sensor assembly 20400 may include an industrial camera located to observe through window 20432 that provides more detailed information on objects and surfaces in front of AV 20100 than the long-range cameras.
  • the industrial camera located behind window 20432 may have FOV 20416 defined by the selected camera and the location of the cameras within long-range camera assembly 20400. In one example, the industrial camera behind window 20432 has a FOV from 23 ° to 337 ° .
  • LIDAR 20420 provides a 360 ° horizontal FOV around AV 20100.
  • the vertical FOV may be limited by the LIDAR instrument.
  • a vertical FOV 20418 of 40 ° , with the sensor mounted at 1.2m to 1.8m above the ground, sets the minimum distance of the sensor at 3.3m to 5m from AV 20100.
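The minimum sensing distance follows from simple geometry: the sensor's lowest ray points (tilt + FOV/2) below horizontal and meets the ground at a distance of mount height divided by the tangent of that angle. A minimal sketch reproducing the numbers above:

    import math

    def min_ground_distance(mount_height_m, vertical_fov_deg, tilt_below_horizontal_deg=0.0):
        """Horizontal distance at which the sensor's lowest ray first reaches the ground."""
        depression = math.radians(tilt_below_horizontal_deg + vertical_fov_deg / 2.0)
        return mount_height_m / math.tan(depression)

    # A horizontally mounted LIDAR with a 40 degree vertical FOV:
    print(round(min_ground_distance(1.2, 40.0), 1))   # 3.3 (meters)
    print(round(min_ground_distance(1.8, 40.0), 1))   # 4.9 (meters)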
  • cover 20430 includes windows 20434, 20432, 20436 through which the long-range cameras and industrial camera observe the environment around AV 20100.
  • Cover 20430 for long-range sensor assembly 20400 is sealed from the weather by an O-ring between cover 20430 and the top of cargo container 20110.
  • LIDAR sensor 20420 provides data with regard to the range or distance to surfaces around the AV. These data may be provided to processor 20470 located in long-range sensor assembly 20400.
  • the LIDAR is mounted on the structure 20405 above the long-range cameras 20440A-C and cover 20430.
  • LIDAR sensor 20420 is one example of a ranging sensor based on reflected laser pulsed light. Other ranging sensors such as radar that use reflected radio waves can also be used.
  • LIDAR sensor 20420 is the Puck sensor by VELODYNE LIDAR® of San Jose, CA.
  • Three long-range cameras 20440A, 20440B, 20440C provide digital images of the objects, surfaces and structures around AV 20100.
  • Three long-range cameras 20440A, 20440B, 20440C are arranged around structure 20405 with respect to cover 20430 to provide three horizontal FOVs that cover the entire 360 ° around the AV.
  • Long-range cameras 20440A, 20440B, 20440C are on elevated ring structure 20405 that is mounted to cargo container 20110.
  • Long-range cameras 20440A, 20440B, 20440C receive images through windows 20434, 20436, 20438 that are mounted in cover 20430.
  • the long-range cameras may comprise a camera on a printed circuit board (PCB) and a lens.
  • one example of long-range camera 20440A may comprise digital camera 20444 with fisheye lens 20442 mounted in front of digital camera 20444.
  • Fisheye lens 20442 may expand the FOV of the camera to a much wider angle.
  • the fisheye lens expands the field of view to 180 ° .
  • digital camera 20444 is similar to e-cam52A_56540_MOD by E-con Systems of San Jose, CA.
  • fisheye lens 20442 is similar to model DSL227 by Sunex of Carlsbad, CA.
  • long-range sensor assembly 20400 may also include industrial camera 20450 that receives visual data through window 20432 in cover 20430.
  • Industrial camera 20450 provides additional data on objects, surfaces and structures in front of the AV to processor 20470.
  • the camera may be similar to a Kowa industrial camera part number LM6HC.
  • Industrial camera 20450 and long-range cameras 20440A-C are located 1.2m to 1.8m above the surface that AV 20100 is moving over.
  • mounting long-range sensor assembly 20400 on top of the cargo-container provides at least two advantages.
  • the fields of view for the long-range sensors including long-range cameras 20440A-C, industrial camera 20450 and LIDAR 20420 are less often blocked by nearby objects such as people, cars, low walls, etc., when the sensors are mounted further above the ground.
  • pedestrian ways are architected to provide visual cues including signage, fence heights etc. for people to perceive, and a typical eye level is in the range of 1.2m to 1.8m.
  • Mounting long-range sensor assembly 20400 to the top of the cargo-container puts the long-range cameras 20440A-C, 20450 on the same level as signage and other visual cues directed at pedestrians.
  • the long-range sensors are mounted on the structure 20405 that provides a substantial and rigid mount that resists deflections caused by movement of AV 20100.
  • the long-range sensor assembly may include an inertial measurement unit (IMU) and one or more processors that receive data from the long-range sensors and output processed data to other processors for navigation.
  • IMU 20460 with a vertical reference (VRU) is mounted to structure 20405.
  • IMU/VRU 20460 may be located directly under LIDAR 20420 so as to provide positional data on LIDAR 20420.
  • the position and orientation from IMU/VRU 20460 may be combined with data from the other long-range sensors.
  • IMU/VRU 20460 is model MTi 20 supplied by Xsens Technologies of The Netherlands.
  • the one or more processors may include processor 20465 that receives data from at least industrial camera 20450.
  • processor 20470 may receive data from the at least one of the following, LIDAR 20420, long-range cameras 20440A-C, industrial camera 20450, and IMU/VRU 20460.
  • Processor 20470 may be cooled by liquid-cooled heat exchanger 20475 that is connected to a circulating coolant system.
  • AV 20100 may include a number of short-range sensors that detect driving surfaces and obstacles within a predetermined distance from the AV.
  • Short-range sensors 20510, 20520, 20530, 20540, 20550, and 20560 are located on the periphery of the container platform 20160. These sensors are located below cargo- container 20110 (FIG. 2B) and are closer to the ground than long-range sensor assembly 20400 (FIG. 2C).
  • Short-range sensors 20510, 20520, 20530, 20540, 20550, and 20560 are angled downward to provide FOVs that capture surfaces and objects that cannot be seen by the sensors in long-range sensor assembly 20400 (FIG. 2C).
  • the field of view of a sensor located closer to the ground and angled downward is less likely to be obstructed by nearby objects and pedestrians than sensors mounted further from the ground.
  • the short range sensors provide information about the ground surfaces and objects up to 4m from AV 20100.
  • the vertical FOVs for two of short-range sensors are shown in a side view of the AV 20100.
  • the vertical FOV 20542 of the aft-facing sensor 20540 is centered about the center line 20544.
  • the center line 20544 is angled below the top surface of the cargo platform 20160.
  • sensor 20540 has a vertical FOV of 42 ° and a center line angled 22 ° to 28 ° (angle 20546) below the plane 20547 defined by the top plate of the cargo platform 20160.
  • the short range sensors 20510 and 20540 are approximately 0.55m to 0.71m above the ground.
  • the resulting vertical FOVs 20512, 20542 cover the ground from 0.4m to 4.2m from the AV.
  • Short range sensors 20510, 20520, 20530, 20550 (FIG. 2G), 20560 (FIG. 2G) mounted on the cargo base 20160 have similar vertical fields of view and a center-line angle relative to the top of the cargo- platform.
  • the short-range sensors mounted on the cargo platform 20160 can view the ground from 0.4 to 4.7 meters out from the outer edge of the AV 20100.
  • short-range sensor 20505 may be mounted on the front surface near to the top of cargo-container 20110.
  • sensor 20505 may provide additional views of the ground in front of the AV beyond the view provided by short-range sensor 20510.
  • sensor 20505 may provide a view of the ground in front of the AV in place of the view provided by short-range sensor 20510.
  • short-range sensor 20505 may have a vertical FOV 20507 of 42 ° and the angle of the centerline to the top of the cargo-platform 20160 is 39 ° . The resulting view of the ground extends from 0.7m to 3.75m from the AV.
  • the horizontal FOVs of short range sensor 20510, 20520, 20530, 20540, 20550, 20560 cover all the directions around the AV 20100.
  • the horizontal FOVs 20522 and 20532 of adjacent sensors such as 20520 and 20530 overlap at a distance out from the AV 20100.
  • the horizontal FOVs 20522, 20532 and 20562, 20552 of adjacent sensors, 20520, 20530 and 20560, 20550 overlap at 0.5 to 2 meters from the AV.
  • the short-range sensors are distributed around the periphery of cargo- base 20160, have horizontal fields of view, and are placed at specific angles to provide nearly complete visual coverage of the ground surrounding the AV.
  • the short-range sensors have horizontal FOV of 69°.
  • Front sensor 20510 faces forward at zero angle relative to the AV and has FOV 20512.
  • two front corner sensors 20520, 20560 are angled so that the center lines are at angle 20564 of 65 ° .
  • rear side sensors 20530, 20550 are angled so that the center lines of 20530 and 20550 are at angle 20534 of 110 ° .
  • other numbers of sensors with other horizontal FOV that are mounted around the periphery of cargo base 20160 to provide nearly complete view of the ground around AV 20100 are possible.
  • short-range sensors 20510, 20520, 20530, 20540, 20550, 20560 are located on the periphery of cargo base 20160.
  • the short-range cameras are mounted in the protuberances that set the angle and location of the short-range sensors.
  • the sensors are mounted on the interior of the cargo base and receive visual data through windows aligned with the outer skin of cargo base 20160.
  • short-range sensors 20600 mount in skin element 20516 of cargo base 20160 and may include a liquid cooling system.
  • Skin element 20516 includes formed protrusion 20514 that holds short-range sensor assembly 20600 at the predetermined location and vertical angle relative to the top of cargo base 20160 and at an angle relative to the front of the cargo base 20160.
  • short-range sensor 20510 is angled downward with respect to cargo platform 20160 by 28°
  • short-range sensors 20520 and 20560 are angled downward 18° and forward 25°
  • short-range sensors 20530 and 20550 are angled downward 34° and rearward 20°
  • short range sensor 20540 is angled downward with respect to cargo platform 20160 by 28°.
  • Skin element 20516 includes a cavity 20517 for receiving camera assembly 20600.
  • Skin element 20516 may also include a plurality of elements 20518 to receive mechanical fasteners including but not limited to rivets, screws and buttons.
  • the camera assembly may be mounted with an adhesive or held in place with a clip that fastens to skin element 20516.
  • Gasket 20519 can provide a seal against the front of camera 20610.
  • short-range sensor assembly 20600 comprises short-range sensor 20610 mounted on bracket 20622 that is attached to water-cooled plate 20626. Outer case 20612, transparent cover 20614 and heat sink 20618 have been partially removed in FIGs. 2K and 2L to better visualize heat dissipating elements sensor block 20616 and electronic block 20620 of the short-range sensor 20610.
  • Short-range sensor assembly 20600 may include one or more thermal-electric coolers (TEC) 20630 between bracket 20622 and liquid-cooled plate 20626. Liquid-cooled plate 20626 is cooled by coolant pumped through 20628 that is thermally connected to plate 20626.
  • the TECs are electrically powered elements with a first and a second side.
  • An electrically powered TEC cools the first side, while rejecting the thermal energy removed from the first side plus the electrical power at the second side.
  • TECs 20630 cool bracket 20622 and transfer the thermal cooling energy plus the electrical energy to water cooled plate 20626.
  • TEC 20630 can be used to actively control the temperature of camera 20600 by varying the magnitude and polarity of the voltage supplied to TEC 20630. Operating TEC 20630 in a cooling mode allows short-range sensor 20610 to operate at temperatures below the coolant temperature.
  • Bracket 20622 is thermally connected to short-range sensor 20610 in two places to maximize cooling of sensor block 20616 and electronic block 20620.
  • Bracket 20622 includes tab 20624 that is thermally attached to heat sink 20618 via screw 20625. Heat sink 20618 is thermally connected to sensor block 20616. The bracket is thus thermally connected to sensor block 20616 via heat sink 20618, screw 20625 and tab 20624.
  • Bracket 20622 is also mechanically attached to electronic block 20620 to provide direct cooling of electronics block 20620.
  • Bracket 20622 may include a plurality of mechanical attachments including but not limited to screws and rivets that engage with elements 20518 in FIG. 2J.
  • Short-range sensor 20610 may incorporate one or more sensors including but not limited to a camera, a stereo camera, an ultrasonic sensor, a short-range radar, and an infrared projector and CMOS sensor.
  • One example short-range sensor is similar to the RealSense depth camera D435 by Intel of Santa Clara, California, that comprises an IR projector, two imager chips, and an RGB camera.
  • AV 20100A includes a cargo container 20110 mounted on a cargo platform 20160 and power base 20170.
  • AV 20100A includes a plurality of long-range and short range sensors.
  • the primary long-range sensors are mounted in sensor pylon 20400A on top of the cargo container 20110.
  • the sensor pylon may include a LIDAR 20420 and a plurality of long- range cameras (not shown) aimed in divergent directions to provide a wide field of view.
  • LIDAR 20420 can be used as described elsewhere herein, for example, but not limited to, providing point cloud data that can enable population of an occupancy grid, and to provide information to identify landmarks, locate the AV 20100 within its environment and/or determine the navigable space.
  • a long-range camera from Leopard Imaging Inc. can be used to identify landmarks, locate the AV 20100 within its environment and/or determine the navigable space.
  • the short range sensors are primarily mounted in the cargo platform 20160 and provide information about obstacles near AV 20100A.
  • the short-range sensors supply data on obstacles and surfaces within 4m of AV 20100A.
  • the short-range sensors provide information up to 10m from the AV 20100A.
  • a plurality of cameras that are at least partially forward facing are mounted in the cargo platform 20160. In some configurations, the plurality of cameras can include three cameras.
  • the top cover 20830 has been partially cut away to reveal a sub-roof 20810.
  • the sub-roof 20810 provides a single piece upon which a plurality of antennas 20820 can be mounted.
  • ten antennas 20820 are mounted to sub-roof 20810.
  • four cellular communications channels each have two antennas, and there are two WiFi antennas.
  • the antennas are wired as a main antenna and an auxiliary antenna for cellular transmissions and reception.
  • the auxiliary antenna may improve cellular functionality by several methods including but not limited to reducing interference, and achieving 4G LTE connectivity.
  • the sub-roof 20810 as well as the top cover 20830 are non-metallic.
  • the sub-roof 20810 is a non-structural plastic surface within 10mm to 20mm of the top cover 20830 that allows the antennas to be connected to the processors before the top cover 20830 is attached. Antenna connections are often high impedance and sensitive to dirt, grease, and mishandling. Mounting and connecting the antennas to the sub-roof 20810 allows the top cover to be installed and removed without touching the antenna connections. Maintenance and repair operations may include removing the top cover without removing the sub-roof or disconnecting the antennas. Assembling the antennas apart from installing the top cover 20830 facilitates testing and repair.
  • the top cover 20830 is weatherproof and prevents water and grit from entering the cargo container 20110. Mounting the antennas on the sub-roof minimizes the number of openings on the top-cover 20830.
  • the LRSA may include a LIDAR and a plurality of long-range cameras that are mounted at different positions on the LRSA structure 20950 to provide a panoramic view of the environment of AV 20100A.
  • the LIDAR 20420 is mounted top most on the LRSA structure 20950 to provide an uninterrupted view.
  • the LIDAR can include a VELODYNE LIDAR.
  • a plurality of long-range cameras 20910A-20910D are mounted on the LRSA structure 20950 on the next level below the LIDAR 20420.
  • each camera is either aligned with the direction of motion or orthogonal to the direction of movement.
  • one camera lines up with each of the principal faces of AV 20100A: front, back, left side, and right side.
  • the long-range cameras are model LI-AR01 44-MIPI-M12 made by Leopard Imaging Inc.
  • the long-range cameras may have a MIPI CSI-2 interface to provide high-speed data transfer to a processor.
  • the long-range cameras may have a horizontal field of view between 50° and 70° and a vertical field of view between 30° and 40°.
  • a long-range processor 20940 is located on the LRSA structure 20950 below the long-range cameras 20910A-20910D and the LIDAR 20420.
  • the long-range processor 20940 receives data from the long-range cameras and the LIDAR.
  • the long-range processor is in communication with one or more processors elsewhere in AV 20100A.
  • the long-range processor 20940 provides data derived from the long-range cameras and the LIDAR to one or more processors located elsewhere on AV 20100A, described elsewhere herein.
  • the long-range processor 20940 may be liquid cooled by cooler 20930.
  • the cooler 20930 may be mounted to the structure under the long-range cameras and LIDAR.
  • the cooler 20930 may provide a mounting location for the long- range processor 20940.
  • the cooler 20930 is described in U.S. Patent Application # 16/883,668, filed on May 26, 2020, entitled Apparatus for Electronic Cooling on an Autonomous Device (Atty. Dkt. # AA280), incorporated herein by reference in full.
  • the cooler is provided with a liquid supply conduit and a return conduit that provide cooling liquid to the cooler 20930.
  • the short-range camera assemblies 20740 A- C are mounted on the front of the container platform 20160 and angled to collect information about the travel surface and the obstacles, steps, curbs and other substantially discontinuous surface features (SDSFs).
  • the camera assemblies 20740A-C include one or more LED lights to illuminate the travel surfaces, objects on the ground and SDSFs.
  • the camera assemblies 20740A-B include lights 20734 to illuminate the ground and objects to provide improved image data from the cameras 20732.
  • camera-assembly 20740A is a mirror of 20740C and descriptions of 20740A apply implicitly to 20740C.
  • the camera 20732 may include a single vision camera, a stereo camera, and/or an infrared projector and CMOS sensor.
  • One example of a camera is the RealSense Depth D435 camera by Intel of Santa Clara, CA, that comprises an IR projector, two imager chips with lenses, and an RGB camera.
  • the LED lights 20734 may be used at night or in low- light conditions or may be used at all times to improve image data.
  • One theory of operation is that the lights create contrast by illuminating projecting surfaces and creating shadows in depressions.
  • the LED lights may be white LEDs in an example.
  • the LED lights 20734 are Xlamp XHP50s from Cree Inc.
  • the LED lights may emit in the infrared to provide illumination for the camera 20732 without distracting or bothering nearby pedestrians or drivers.
  • the placement and angle of the lights 20734 and the shape of the covers 20736A, 20736B prevent the camera 20732 from seeing the lights 20734.
  • the angle and placement of the lights 20734 and the covers 20736A, 20736B prevent the lights from interfering with drivers or bothering pedestrians. It is advantageous that camera 20732 not be exposed to the light 20734, to prevent the sensor in the camera 20732 from being blinded by the light 20734 and therefore being prevented from detecting the lower light signals from the ground and objects in front of and to the side of the AV 20100A.
  • the camera 20732 and/or the lights 20734 may be cooled with liquid that flows into and out of the camera assemblies through ports 20736.
  • the short-range camera assembly 20740A includes an ultrasonic or sonar short-range sensor 20730A.
  • the second short-range camera assembly 20740C also includes an ultrasonic short-range sensor 20730B (FIG. 2N).
  • the ultrasonic sensor 20730A is mounted above the camera 20732.
  • the center line of the ultrasonic sensor 20730A is parallel with the base of the cargo container 20110, which often means sensor 20730A is horizontal.
  • Sensor 20730A is angled 45 ° from facing forward.
  • the cover 20736A provides a horn 20746 to direct the ultrasonic waves emerging from and received by the ultrasonic sensor 20730A.
  • a cross-section of the camera 20732 and the light 20734 within the camera assembly 20740A illustrates the angles and openings in the cover 20736.
  • the cameras in the short-range camera assemblies are angled downward to better image the ground in front of and to the side of the AV.
  • the center camera assembly 20740B is oriented straight ahead in the horizontal plane.
  • the corner camera assemblies 20740A, 20740C are angled 25 ° to their respective sides with respect to straight ahead in the horizontal plane.
  • the camera 20732 is angled 20 ° downward with respect to the top of the cargo platform in the vertical plane. As the AV generally holds the cargo platform horizontal, the camera is therefore angled 20 ° below horizontal.
  • the center camera assembly 20740B (FIG. 2M) is angled below horizontal by 28 ° .
  • the cameras in the camera assemblies may be angled downward by 25 ° to 35 ° .
  • the cameras in the camera assemblies may be angled downward by 15 ° to 45 ° .
  • the LED lights 20734 are similarly angled downward to illuminate the ground that is imaged by the camera 20732 and to minimize distraction to pedestrians.
  • the LED light centerline 20742 is parallel within 5° of the camera centerline 20738.
  • the cover 20736A protects both the camera 20732 and pedestrians from the bright light of LEDs 20734 in the camera assemblies 20740A-C.
  • the cover that isolates the light emitted by the LEDs also provides a flared opening 20737 to maximize the field of view of the camera 20732.
  • the lights are recessed at least 4mm from the opening of the cover.
  • the light opening is defined by upper wall 20739 and lower wall 20744.
  • the upper wall 20739 is approximately parallel (within 5 ° ) with the center line 20742.
  • the lower wall 20744 is flared approximately 18 ° from the center line 20742 to maximize illumination of the ground and objects near the ground.
  • the light 20734 includes two LEDs 20734A, each under a square lens 20734B, to produce a beam of light.
  • the LED/lenses are angled and located with respect to the camera 20732 to illuminate the camera’s field of view with minimal spillover of light outside the FOV of the camera 20732.
  • the two LED/lenses are mounted together on a single PCB 20752 with a defined angle 20762 between the two lights.
  • the two LED/lenses are mounted individually on the heat sink 20626A on separate PCBs at an angle with respect to each other.
  • the lights are Xlamp XHP50s from Cree, Inc., and the lenses are 60 ° lenses HB-SQ-W from LEDil.
  • the lights are angled approximately 50 ° with respect to each other so the angle 20762 between the front of the lenses is 130 ° .
  • the lights are located approximately 18mm (+5mm) 20764 behind the front of the camera 20732 and approximately 30mm 20766 below the center line of the camera 20732.
  • the camera 20732 is cooled by a thermo- electric cooler (TEC) 20630, which is cooled along with the light 20734 by liquid coolant that flows through the cold block 20626A.
  • the camera is attached to bracket 20622 via screw 20625 that threads into a sensor block portion of the camera, while the back of the bracket 20622 is bolted to the electronics block of the camera.
  • the bracket 20622 is cooled by two TECs in order to maintain the performance of the IR imaging chips (CMOS chips) in the camera 20732.
  • the TECs reject heat from the bracket 20622 and the electrical power they draw to the cold block 20626A.
  • the coolant is directed through a U-shaped path created by a central fin 20626D.
  • the coolant flows directly behind the LED/lenses/PCBs of the light 20734.
  • Fins 20626B, 20626C improve heat transfer from the light 20734 to the coolant.
  • Coolant flows upward to pass by the hot side of the TECs 20630.
  • the fluid path is created by a plate 20737 (FIG. 2X) attached to the back of the cold block 20626A.
  • sensor data and map data can be used to update an occupancy grid.
  • the system and method of the present teachings can manage a global occupancy grid for a device that is navigating autonomously with respect to a grid map.
  • the grid map can include routes or paths that the device can follow from a beginning point to a destination.
  • the global occupancy grid can include free space indications that can indicate where it is safe for the device to navigate. The possible paths and the free space indications can be combined on the global occupancy grid to establish an optimum path upon which the device can travel to safely arrive at the destination.
  • the global occupancy grid that will be used to determine an unobstructed navigation route can be accessed based on the location of the device, and the global occupancy grid can be updated as the device moves.
  • the updates can be based at least on the current values associated with the global occupancy grid at the location of the device, a static occupancy grid that can include historical information about the neighborhood where the device is navigating, and data being collected by sensors as the device travels.
  • the sensors can be located on the device, as described herein, and they can be located elsewhere.
  • the global occupancy grid can include cells, and the cells can be associated with occupied probability values.
  • Each cell of the global occupancy grid can be associated with information such as whether obstacles have been identified at the location of the cell, the characteristics and discontinuities of the traveling surface at and surrounding the location as determined from previously collected data and as determined by data collected as the device navigates, and the prior occupancy data associated with the location.
  • Data captured as the device navigates can be stored in a local occupancy grid centered on the device.
  • static previously-collected data can be combined with the local occupancy grid data and global occupancy data determined in a previous update to create a new global occupancy grid with the space occupied by the device marked as unoccupied.
  • a Bayesian method can be used to update the global occupancy grid.
  • the method can include, for each cell in the local occupancy grid, calculating the position of the cell on the global occupancy grid, accessing the value at that position from the current global occupancy grid, accessing the value at the position from the static occupancy grid, accessing the value at the position from the local occupancy grid, and computing a new value at the position on the global occupancy grid as a function of the current value from the global occupancy grid, the value from the static occupancy grid, and the value from the local occupancy grid.
  • the relationship used to compute the new value can include the sum of the static value and the local occupancy grid value minus the current value.
  • the new value can be bounded by pre-selected values based on computational limitations, for example.
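A minimal sketch of the cell update described above, assuming numpy arrays that are already aligned to the same cells; the bounding values are illustrative computational limits, not values from the text:

    import numpy as np

    def fuse_occupancy(global_grid, static_grid, local_grid, lower=-10.0, upper=10.0):
        """New global value = static value + local value - current value, bounded."""
        return np.clip(static_grid + local_grid - global_grid, lower, upper)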
  • system 30100 of the present teachings can manage a global occupancy grid.
  • the global occupancy grid can begin with initial data, and can be updated as the device moves. Creating the initial global occupancy grid can include a first process, and updating the global occupancy grid can include a second process.
  • System 30100 can include, but is not limited to including, global occupancy server 30121 that can receive information from various sources and can update global occupancy grid 30505 based at least on the information.
  • the information can be supplied by, for example, but not limited to, sensors located upon the device and/or elsewhere, static information, and navigation information.
  • sensors can include cameras and radar that can detect surface characteristics and obstacles, for example.
  • the sensors can be advantageously located on the device, for example, to provide enough coverage of the surroundings to enable safe travel by the device.
  • LIDAR 30103 can provide LIDAR point cloud (PC) data 30201 that can enable populating a local occupancy grid with LIDAR free space information 30213.
  • conventional ground detect inverse sensor model (ISM) 30113 can process LIDAR PC data 30201 to produce LIDAR free space information 30213.
  • RGB-D cameras 30101 can provide RGB-D PC data 30202 and RGB camera data 30203.
  • RGB-D PC data 30202 can populate a local occupancy grid with depth free space information 30209
  • RGB-D camera data 30203 can populate a local occupancy grid with surface data 30211.
  • RGB-D PC data 30202 can be processed by, for example, but not limited to, conventional stereo free space ISM 30109, and RGB-D camera data 30203 can be fed to, for example, but not limited to, conventional surface detect neural network 30111.
  • RGB MIPI cameras 30105 can provide RGB data 30205 to produce, in combination with LIDAR PC data 30201, a local occupancy grid with LIDAR/MIPI free space information 30215.
  • RGB data 30205 can be fed to conventional free space neural network 30115, the output of which can be subjected to pre- selected mask 30221 that can identify which parts of RGB data 30205 are most important for accuracy, before being fed, along with LIDAR PC data 30201, to conventional 2D-3D registration 30117.
  • 2D-3D registration 30117 can project the image from RGB data 30205 onto LIDAR PC data 30201.
  • 2D-3D registration 30117 is not needed.
  • Any combination of sensors and methods for processing the sensor data can be used to gather data to update the global occupancy grid. Any number of free space estimation procedures can be used and combined to enable determination and verification of occupied probabilities in the global occupancy grid.
  • historical data can be provided by, for example, repository 30107 of previously collected and processed data having information associated with the navigation area.
  • repository 30107 can include, for example, but not limited to, route information such as, for example, polygons 30207.
  • these data can be fed to conventional polygon parser 30119 which can provide edges 30303, discontinuities 30503, and surfaces 30241, to global occupancy grid server 30121.
  • Global occupancy grid server 30121 can fuse the local occupancy grid data collected by the sensors with the processed repository data to determine global occupancy grid 30505.
  • Grid map 30601 (FIG. 3D) can be created from global occupancy data.
  • sensors can include sonar 30141 that can provide local occupancy grid with sonar free space 30225 to global occupancy grid server 30121.
  • Depth data 30209 can be processed by conventional free space ISM 30143.
  • Local occupancy grid with sonar free space 30225 can be fused with local occupancy grid with surfaces and discontinuities 30223, local occupancy grid with LIDAR free space 30213, local occupancy grid with LIDAR/MIPI free space 30215, local occupancy grid with stereo free space 30209, and edges 30303 (FIG. 3F), discontinuities 30503 (FIG. 3F), navigation points 30501 (FIG. 3F), surface confidences 30513 (FIG. 3F), and surfaces 30241 (FIG. 3F) to form global occupancy grid 30505.
  • global occupancy grid initialization 30200 can include creating, by global occupancy grid server 30121, global occupancy grid 30505 and static grid 30249.
  • Global occupancy grid 30505 can be created by fusing data from local occupancy grids 30118 with edges 30303, discontinuities 30503, and surfaces 30241 located in the region of interest.
  • Static grid 30249 (FIG. 3D) can be created to include data such as, for example, but not limited to, surface data 30241, discontinuity data 30503, edges 30303, and polygons 30207.
  • An initial global occupancy grid 30505 can be computed by adding the occupancy probability data from static grid 30249 (FIG. 3D).
  • Local occupancy grids 30118 can include, but are not limited to including, local occupancy grid data resulting from stereo free space estimation 30209 (FIG. 3B) through an ISM, local occupancy grid data including surface/discontinuity detection results 30223 (FIG. 3B), local occupancy grid data resulting from LIDAR free space estimation 30213 (FIG. 3B) through an ISM, and local occupancy grid data resulting from LIDAR/MIPI free space estimation 30215 (FIG. 3B) following in some configurations, 2D- 3D registration 30117 (FIG. 3B).
  • local occupancy grids 30118 can include local occupancy grid data resulting from sonar free space estimation 30225 through an ISM.
  • the various local occupancy grids with free space estimation can be fused according to pre-selected known processes into local occupancy grids 30118.
  • From global occupancy grid 30505 can be created grid map 30601 (FIG. 3E) that can include occupancy and surface data in the vicinity of the device.
  • grid map 30601 (FIG. 3E) and static grid 30249 (FIG. 3D) can be published using, for example, but not limited to, robot operating system (ROS) subscribe/publish features.
  • occupancy grid update 30300 can include updating the local occupancy grid with respect to the data measured when the device is moving, and combining those data with static grid 30249.
  • Static grid 30249 is accessed when the device moves out of the working occupancy grid range.
  • the device can be positioned in occupancy grid 30245A at first location 30513A at a first time.
  • At a second time, the device can be positioned in occupancy grid 30245B, which includes a set of values derived from its new location and possibly from values in occupancy grid 30245A.
  • Data from static grid 30249 and surface data from the initial global occupancy grid 30505 (FIG. 3C) that locationally coincide with cells in occupancy grid 30245B at a second time can be used, along with measured surface data and occupied probabilities, to update each grid cell according to a pre-selected relationship.
  • the relationship can include summing the static data with the measured data.
  • the resulting occupancy grid 30245C at a third time and third location 30513C can be made available to movement manager 30123 to inform navigation of the device.
  • method 30450 for creating and managing occupancy grids can include, but is not limited to including, transforming 30451, by local occupancy grid creation node 30122, sensor measurements to the frame of reference associated with the device, creating 30453 a time-stamped measurement occupancy grid, and publishing 30455 the time-stamped measurement occupancy grid as a local occupancy grid 30234 (FIG. 3G).
  • the system associated with method 30450 can include multiple local grid creation nodes 30122, for example, one for each sensor, so that multiple local occupancy grids 30234 (FIG. 3G) can result.
  • Sensors can include, but are not limited to including, RGB-D cameras 30325 (FIG. 3G), LIDAR/MIPI 30231 (FIG. 3G), and LIDAR 30233 (FIG. 3G).
  • the system associated with method 30450 can include global occupancy grid server 30121 that can receive the local occupancy grid(s) and process them according to method 30450.
  • method 30450 can include loading 30242 surfaces, accessing 30504 surface discontinuities such as, for example, but not limited to, curbs, and creating 30248 static occupancy grid 30249 from any characteristics that are available in repository 30107, which can include, for example, but not limited to, surfaces and surface discontinuities.
  • Method 30450 can include receiving 30456 the published local occupancy grid and moving 30457 the global occupancy grid to maintain the device in the center of the map. Method 30450 can include setting 30459 new regions on the map with prior information from static prior occupancy grid 30249, and marking 30461 the area currently occupied by the device as unoccupied. Method 30450 can, for each cell in each local occupancy grid, execute loop 30463. Loop 30463 can include, but is not limited to including, calculating the position of the cell on the global occupancy grid, accessing the previous value at the position on the global occupancy grid, and calculating a new value at the cell position based on a relationship between the previous value and the value at the cell in the local occupancy grid. The relationship can include, but is not limited to including, summing the values.
  • Loop 30463 can include comparing the new value against a pre- selected acceptable probability range, and setting the global occupancy grid with the new value.
  • the comparison can include setting the probability to a minimum or maximum acceptable probability if the probability is lower or higher than the minimum or maximum acceptable probability.
  • Method 30450 can include publishing 30467 the global occupancy grid.
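A minimal sketch of recentering the global occupancy grid on the device, seeding the newly exposed cells from the static prior, and marking the device's own footprint unoccupied; it assumes numpy arrays, that the static grid is already expressed in the shifted frame, and an illustrative free-space value:

    import numpy as np

    def recenter_global_grid(global_grid, static_grid, shift_cells, device_mask,
                             free_value=0.0):
        """Shift the global grid by (dy, dx) cells so the device stays centered."""
        dy, dx = shift_cells
        shifted = np.roll(global_grid, shift=(-dy, -dx), axis=(0, 1))

        # Rows/columns that wrapped around hold stale data; reseed them with the
        # static prior for the newly exposed region.
        if dy > 0:
            shifted[-dy:, :] = static_grid[-dy:, :]
        elif dy < 0:
            shifted[:-dy, :] = static_grid[:-dy, :]
        if dx > 0:
            shifted[:, -dx:] = static_grid[:, -dx:]
        elif dx < 0:
            shifted[:, :-dx] = static_grid[:, :-dx]

        # The area currently occupied by the device is marked unoccupied.
        shifted[device_mask] = free_value
        return shifted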
  • an alternate method 30150 for creating a global occupancy grid can include, but is not limited to including, if 30151 the device has moved, accessing 30153 the occupied probability values associated with an old map area (where the device was before it moved) and updating the global occupancy grid on the new map area (where the device was after it moved) with the values from the old map area, accessing 30155 drivable surfaces associated with the cells of the global occupancy grid in the new map area, and updating the cells in the updated global occupancy grid with the drivable surfaces, and proceeding at step 30159.
  • method 30150 can include updating 30159 the possibly updated global occupancy grid with surface confidences associated with the drivable surfaces from at least one local occupancy grid, updating 30161 the updated global occupancy grid with logodds of the occupied probability values from at least one local occupancy grid using, for example, but not limited to, a Bayesian function, and adjusting 30163 the logodds based at least on characteristics associated with the location. If 30157 the global occupancy grid is not co-located with the local occupancy grid, method 30150 can include returning to step 30151.
  • characteristics can include, but are not limited to including, setting the location of the device as unoccupied.
  • method 30250 for creating a global occupancy grid can include, but is not limited to including, if 30251 the device has moved, updating 30253 the global occupancy grid with information from a static grid associated with the new location of the device.
  • Method 30250 can include analyzing 30257 the surfaces at the new location. If 30259, the surfaces are drivable, method 30250 can include updating 30261 the surfaces on the global occupancy grid and updating 30263 the global occupancy grid with values from a repository of static values that are associated with the new position on the map.
  • updating 30261 the surfaces can include, but is not limited to including, accessing 30351 a local occupancy grid (LOG) for a particular sensor. If 30353 there are more cells in the local occupancy grid to process, method 30261 can include accessing 30355 the surface classification confidence value and the surface classification from the local occupancy grid. If 30357 the surface classification at the cell in the local occupancy grid is the same as the surface classification in the global occupancy grid at the location of the cell, method 30261 can include setting 30461 the new global occupancy grid (GOG) surface confidence to the sum of the old global occupancy grid surface confidence and the local occupancy grid surface confidence.
  • method 30261 can include setting 30359 the new global occupancy grid surface confidence to the difference between the old global occupancy grid surface confidence and the local occupancy grid surface confidence. If 30463 the new global occupancy grid surface confidence is less than zero, method 30261 can include setting 30469 the new global occupancy grid surface classification to the value of the local occupancy grid surface classification.
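A minimal sketch of the surface-confidence update for a single cell; the handling of a negative confidence after the classification flips is an assumption, since the text only specifies that the classification is replaced:

    def update_surface_cell(gog_class, gog_conf, log_class, log_conf):
        """Fuse one local-occupancy-grid (LOG) cell into the global occupancy grid (GOG)."""
        if log_class == gog_class:
            new_conf = gog_conf + log_conf          # matching classes reinforce confidence
        else:
            new_conf = gog_conf - log_conf          # conflicting classes reduce confidence
            if new_conf < 0:
                gog_class = log_class               # adopt the local classification
                new_conf = abs(new_conf)            # assumption: keep confidence non-negative
        return gog_class, new_conf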
  • updating 30263 the global occupancy grid with values from a repository of static values can include, but is not limited to including, if 30361 there are more cells in the local occupancy grid to process, method 30263 can include accessing 30363 the logodds from the local occupancy grid and updating 30365 the logodds in the global occupancy grid with the value from the local occupancy grid at the location. If 30367 maximum certainty that the cell is empty is met, and if 30369 the device is traveling within pre-determined lane barriers, and if 30371 the surface is drivable, method 30263 can include updating 30373 the probability that the cell is occupied and returning to continue processing more cells.
  • method 30263 can include returning to consider more cells without updating the logodds. If the device is in standard mode, i.e. a mode in which the device can navigate relatively uniform surfaces, and the surface classification indicates that the surface is not relatively uniform, method 30263 can adjust the device’s path by increasing the probability that the cell is occupied by updating 30373 the logodds.
  • method 30263 can adjust the device’s path by decreasing the probability that the cell is occupied by updating 30373 the logodds. If the device is traveling in 4-wheel mode, i.e. a mode in which the device can navigate non- uniform terrain, adjustments to the probability that the cell is occupied may not be necessary.
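A minimal sketch of the log-odds update for one cell, including the mode-dependent adjustment; the rough-surface penalty and the bound on certainty that a cell is empty are illustrative placeholders, not values from the text:

    import math

    def update_cell_logodds(gog_logodds, log_logodds, surface_uniform, standard_mode,
                            max_empty_logodds=-6.0, rough_penalty=2.0):
        """Add the measured log-odds to the stored value and apply mode adjustments."""
        new_logodds = gog_logodds + log_logodds
        # In standard mode, a non-uniform surface is made to look more occupied so the
        # planner steers around it; in 4-wheel mode no adjustment is needed.
        if standard_mode and not surface_uniform:
            new_logodds += rough_penalty
        return max(new_logodds, max_empty_logodds)  # cap the certainty that the cell is empty

    def occupied_probability(logodds):
        """Convert log-odds back to the probability that the cell is occupied."""
        return 1.0 / (1.0 + math.exp(-logodds))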
  • the AV can travel in a specific mode that can be associated with a device configuration, for example, the configuration depicted in device 42114A and the configuration depicted in device 42114B.
  • the system of the present teachings for real-time control of the configuration of the device, based on at least one environmental factor and the situation of the device can include, but is not limited to including, sensors, a movement means, a chassis operably coupled with the sensors and the movement means, the movement means being driven by motors and a power supply, a device processor receiving data from the sensors, and a powerbase processor controlling the movement means.
  • the device processor can receive environmental data, determine the environmental factors, determine configuration changes according to the environmental factors and the situation of the device, and provide the configuration changes to the powerbase processor.
  • the powerbase processor can issue commands to the movement means to move the device from place to place, physically reconfiguring the device when required by the road surface type.
  • the sensors collecting the environmental data can include, but are not limited to including, for example, cameras, LIDAR, radar, thermometers, pressure sensors, and weather condition sensors, several of which are described herein.
  • the device processor can determine environmental factors upon which device configuration changes can be based.
  • environmental factors can include surface factors such as, for example, but not limited to, surface type, surface features, and surface conditions.
  • the device processor can determine in real-time, based on environmental factors and the current situation of the device, how to change the configuration to accommodate traversing the detected surface type.
  • the configuration change of device 42114A/B/C can include a change in the configuration of the movement means, for example.
  • Other configuration changes are contemplated, such as user information displays and sensor controls, that can depend on current mode and surface type.
  • the movement means can include at least four drive wheels 442101, two on each side of chassis 42112, and at least two caster wheels 42103 operably coupled with chassis 42112, as described herein.
  • drive wheels 442101 can be operably coupled in pairs 42105, where each pair 42105 can include first drive wheel 42101A and second drive wheel 42101B of the four drive wheels 442101, and pairs 42105 are each located on opposing sides of chassis 42112.
  • the operable coupling can include wheel cluster assembly 42110.
  • powerbase processor 41016 (FIG. 4B) can control the rotation of cluster assembly 42110.
  • Left and right wheel motors 41017 (FIG. 4B) can drive wheels 42101 on either side of chassis 42112. Turning can be accomplished by driving left and right wheel motors 41017 at different rates.
  • Cluster motors 41019 can rotate the wheelbase in the fore/aft direction. Rotation of the wheelbase can allow cargo to rotate, if at all,
  • Cluster assembly 42110 can independently operate each pair 42105 of two wheels, thereby providing forward, reverse and rotary motion of device 42114, upon command.
  • Cluster assembly 42110 can provide the structural support for pairs 42105.
  • Cluster assembly 42110 can provide the mechanical power to rotate wheel drive assemblies together, allowing for functions dependent on cluster assembly rotation, for example, but not limited to, discontinuous surface feature climbing, various surface types, and uneven terrain. Further details about the operation of clustered wheels can be found in U.S. patent application # 16,035,205 entitled Mobility Device, filed on July 13, 2018, attorney docket # X80, which is incorporated herein by reference in its entirety.
  • the configuration of device 42114 can be associated with, but is not limited to being associated with, mode 41033 (FIG. 4B) of device 42114.
  • Device 42114 can operate in several of modes 41033 (FIG. 4B).
  • in standard mode 10100-1 (FIG. 5E), device 42114B can operate on two of drive wheels 42101B and two of caster wheels 42103.
  • Standard mode 10100-1 (FIG. 5E) can provide turning performance and mobility on relatively firm, level surfaces for example, but not limited to, indoor environments, sidewalks, and pavement.
  • in enhanced mode 10100-2 (FIG. 5E), device 42114A/C can command four of drive wheels 42101A/B, can be actively stabilized through onboard sensors, and can elevate and/or reorient chassis 42112, casters 42103, and cargo.
  • 4-Wheel mode 10100-2 (FIG. 5E) can provide mobility in a variety of environments, enabling device 42114A/C to travel up steep inclines and over soft, uneven terrain.
  • in 4-Wheel mode 10100-2 (FIG. 5E), all four of drive wheels 42101A/B can be deployed and caster wheels 42103 can be retracted.
  • Rotation of cluster 42110 can allow operation on uneven terrain, and drive wheels 42101A/B can drive up and over discontinuous surface features.
  • This configuration can provide device 42114A/C with mobility in a wide variety of outdoor environments.
  • Device 42114B can operate on outdoor surfaces that are firm and stable but wet. Frost heaves and other natural phenomena can degrade outdoor surfaces, creating cracks and loose material.
  • Modes 41033 are described in detail in United States Patent # 6,571,892, entitled Control System and Method, issued on June 3, 2003 (‘892), incorporated herein by reference in its entirety.
  • system 41000 can drive device 42114 (FIG. 4A) by processing inputs from sensors 41031, generating commands to wheel motors 41017 to drive wheels 42101 (FIG. 4A), and generating commands to cluster motors 41019 to drive clusters 42110 (FIG. 4A).
  • System 41000 can include, but is not limited to including, device processor 41014 and powerbase processor 41016.
  • Device processor 41014 can receive and process environmental data 41022 from sensors 41031, and provide configuration information 40125 to powerbase processor 41016.
  • device processor 41014 can include sensor processor 41021 that can receive and process environmental data 41022 from sensors 41031.
  • Sensors 41031 can include, but are not limited to including, cameras, as described herein. From these data, information about, for example, the driving surface that is being traversed by device 42114 (FIG. 4A) can be accumulated and processed. In some configurations, the driving surface information can be processed in real-time.
  • Device processor 41014 can include configuration processor 41023 that can determine surface type 40121 from environmental data 41022, for example, that is being traversed by device 42114 (FIG. 4A).
  • Configuration processor 41023 can include, for example, drive surface processor 41029 (FIG. 4C) that can create, for example, a drive surface classification layer, a drive surface confidence layer, and an occupancy layer from environmental data 41022.
  • Configuration 40125 can be based, at least in part, on surface type 40121.
  • Surface type 40121 and mode 41033 can be used to determine, at least in part, occupancy grid information 41022, which can include a probability that a cell in the occupancy grid is occupied.
  • the occupancy grid can, at least in part, enable determination of a path that device 42114 (FIG. 4A) can take.
  • powerbase processor 41016 can receive configuration information 40125 from device processor 41014, and process configuration information 40125 along with other information, for example path information.
  • Powerbase processor 41016 can include control processor 40325 that can create movement commands 40127 based at least on configuration information 40125 and provide movement commands 40127 to motor drive processor 40326.
  • Motor drive processor 40326 can generate motor commands 40128 that can direct and move device 42114 (FIG. 4A). Specifically, motor drive processor 40326 can generate motor commands 40128 that can drive wheel motors 41017, and can generate motor commands 40128 that can drive cluster motors 41019.
  • real-time surface detection of the present teachings can include, but is not limited to including, configuration processor 41023 that can include drive surface processor 41029.
  • Drive surface processor 41029 can determine the characteristics of the driving surface upon which device 42114 (FIG. 4A) is navigating.
  • Drive surface processor 41029 can include, but is not limited to including, neural network processor 40207, data transforms 40215, 40219, and 40239, layer processor 40241, and occupancy grid processor 40242. Together these components can produce information that can direct the change of the configuration of device 42114 (FIG. 4A) and can enable modification of occupancy grid 40244 (FIG. 4C) that can inform path planning for the travel of device 42114 (FIG. 4A).
  • neural network processor 40207 can subject environmental data 41022 (FIG. 4B) to a trained neural network that can indicate, for each point of data collected by sensors 41031 (FIG. 4B), the type of surface the point is likely to represent.
  • Environmental data 41022 (FIG. 4B) can be, but is not limited to being, received as camera images 40202, where the cameras can be associated with camera properties 40204.
  • Camera images 40202 can include 2D grids of points 40201 having X-resolution 40205 (FIG. 4D) and Y-resolution 40204 (FIG. 4D). In some configurations, camera images 40202 can include RGB-D images, and Y-resolution 40204 can include 480 pixels.
  • camera images 40202 can be converted to images formatted according to the requirements of the chosen neural network.
  • the data can be normalized, scaled, and converted from 2D to 1D, which can improve processing efficiency of the neural network.
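The normalize/scale/flatten step can be pictured as follows; the target resolution, the 0-1 normalization, and the nearest-neighbor scaling are illustrative assumptions rather than values taken from the text.

```python
import numpy as np

def preprocess_for_network(image: np.ndarray,
                           target_hw: tuple = (480, 640)) -> np.ndarray:
    """Normalize, scale, and flatten an image for a neural network.

    `image` is an H x W x C array (e.g., the color planes of an RGB-D
    frame).  The target resolution and 0-1 normalization are illustrative.
    """
    # Scale to the resolution the network expects (nearest-neighbor
    # subsampling keeps the sketch dependency-free).
    h, w = image.shape[:2]
    rows = np.linspace(0, h - 1, target_hw[0]).astype(int)
    cols = np.linspace(0, w - 1, target_hw[1]).astype(int)
    scaled = image[np.ix_(rows, cols)]

    # Normalize pixel values to [0, 1].
    normalized = scaled.astype(np.float32) / 255.0

    # Convert the 2D grid to a 1D vector, as described above.
    return normalized.reshape(-1)
```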
  • the neural network can be trained in many ways including, but not limited to, training with RGB-D camera images.
  • the trained neural network represented in neural network file 40209 (FIG. 4D), can be made available to neural network processor 40207 through a direct connection to the processor executing the trained neural network or, for example, through a communications channel.
  • neural network processor 40207 can use trained neural network file 40209 (FIG. 4D) to identify surface types 40121 (FIG. 4C) within environmental data 41022 (FIG. 4B).
  • surface types 40121 (FIG. 4C) can include, but are not limited to including, not drivable, hard drivable, soft drivable, and curb.
  • the result of the neural network processing can include surface classification grid 40303 of points 40213 having X-resolution 40205 and Y-resolution 40203, and center 40211. Each point 40213 in surface classification grid 40303 can be associated with a likelihood of being a specific one of surface types 40121 (FIG. 4C).
  • drive surface processor 41029 can include 2D to 3D transform 40215 that can deproject from 2D surface classification grid 40303 (FIG. 4D) in a 2D camera frame to 3D image cube 40307 (FIG. 4D) in 3D real world coordinates as seen by the camera. Deprojection can recover the 3D properties of 2D data, and can transform a 2D image from a RGB-D camera to 3D camera frame 40305 (FIG. 4C). Points 40233 (FIG. 4D) in cube 40307 (FIG. 4D) can each be associated with the likelihood of being a specific one of surface types 40121 (FIG. 4C).
  • point cube 40307 (FIG. 4D) can be delimited according to, for example, but not limited to, camera properties 40204 (FIG. 4C), such as, for example, focal length x, focal length y, and projection center 40225.
  • camera properties 40204 can include a maximum range over which the camera can reliably project.
  • casters 42103 (FIG. 4A) could interfere with the view of camera(s) 40227 (FIG. 4D). These factors can limit the number of points in point cube 40307 (FIG. 4D).
  • camera(s) 40227 cannot reliably project beyond about six meters, which can represent the high limit of the range of camera 40227 (FIG. 4D), and can limit the number of points in point cubes 40307 (FIG. 4D).
  • features of device 42114 can act as the minimum limit of the range of camera 40227 (FIG. 4D). For example, the presence of casters 42103 (FIG. 4A) could interfere with the view of camera(s) 40227 (FIG. 4D).
  • points in point cubes 40307 can be limited to points that are one meter or more from camera 40227 (FIG. 4D) and six meters or less from camera 40227 (FIG. 4D).
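A hedged sketch of the deprojection and range-limiting steps described above, assuming a standard pinhole camera model with focal lengths fx, fy and projection center (cx, cy). The pinhole formula and the NumPy layout are assumptions; the one-to-six-meter window follows the range limits given above.

```python
import numpy as np

def deproject(depth: np.ndarray,
              fx: float, fy: float, cx: float, cy: float,
              min_range: float = 1.0, max_range: float = 6.0) -> np.ndarray:
    """Deproject a depth image into 3D points in the camera frame.

    Uses the standard pinhole model: for pixel (u, v) with depth z,
        x = (u - cx) * z / fx,   y = (v - cy) * z / fy.
    Points closer than `min_range` or farther than `max_range` are
    discarded, mirroring the one-to-six-meter window described above.
    Returns an (N, 3) array of [x, y, z] points.
    """
    v, u = np.indices(depth.shape)          # pixel row/column grids
    z = depth.astype(np.float32)
    valid = (z >= min_range) & (z <= max_range)

    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=-1)
```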
  • drive surface processor 41029 can include baselink transform 40219 that can transform the 3D cube of points to coordinates associated with device 42114 (FIG. 4A), i.e., baselink frame 40309 (FIG. 4C).
  • Baselink transform 40219 can transform 3D data points 40223 (FIG. 4D) in cube 40307 (FIG. 4D) into points 40233 (FIG. 4D) in cube 40308 (FIG. 4D) in which the Z dimension is set to the base of device 42114 (FIG. 4A).
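The camera-frame-to-baselink change of coordinates can be sketched as a single homogeneous transform; the 4x4 matrix and the example mounting offset are illustrative, since the text only states that the Z dimension is referenced to the base of the device.

```python
import numpy as np

def camera_to_baselink(points_cam: np.ndarray,
                       T_base_cam: np.ndarray) -> np.ndarray:
    """Transform (N, 3) camera-frame points into the baselink frame.

    `T_base_cam` is a 4x4 homogeneous transform describing where the
    camera sits relative to the base of the device.
    """
    n = points_cam.shape[0]
    homogeneous = np.hstack([points_cam, np.ones((n, 1))])   # (N, 4)
    return (T_base_cam @ homogeneous.T).T[:, :3]

# Example: camera mounted 0.5 m above the base (value is hypothetical).
T_base_cam = np.eye(4)
T_base_cam[2, 3] = 0.5
```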
  • Layer processor 40241 can flatten points 40237 (FIG. 4D) into various layers 40312 (FIG. 4C), depending on the data represented by points 40237 (FIG. 4D). In some configurations, layer processor 40241 can apply a scalar value to layers 40312 (FIG. 4C). In some configurations, layers 40312 (FIG. 4C) can include probability of occupancy layer 40243, surface classification layer 40245 (FIG. 4C), and surface type confidence layer 40247 (FIG. 4C).
  • surface type confidence layer 40247 can be determined by normalizing the class scores from neural network processor 40207 into a probability distribution over the output classes as log(class score)/Σ log(each class score).
  • one or more layers can be replaced or augmented by a layer that provides the probability of a non-drivable surface.
  • the probability value in occupancy layer 40243 can be represented as a log-odds value, ln(p/(1-p)).
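The two value conversions above (the normalized class confidence and the log-odds representation of occupancy) can be sketched as follows; the epsilon guard is an implementation detail not mentioned in the text.

```python
import math

def confidence_scores(class_scores: dict) -> dict:
    """Normalize raw class scores into per-class confidence values as
    log(class score) / sum(log(each class score)), as described above."""
    logs = {name: math.log(score) for name, score in class_scores.items()}
    total = sum(logs.values())
    return {name: value / total for name, value in logs.items()}

def prob_to_logodds(p: float, eps: float = 1e-6) -> float:
    """Occupancy probability -> log-odds: ln(p / (1 - p))."""
    p = min(max(p, eps), 1.0 - eps)   # guard against exactly 0 or 1
    return math.log(p / (1.0 - p))

def logodds_to_prob(l: float) -> float:
    """Inverse conversion, useful when reading the occupancy grid back."""
    return 1.0 / (1.0 + math.exp(-l))
```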
  • the probability value in occupancy layer 40243 can be based at least on a combination of mode 41033 (FIG. 4B) and surface type 40121 (FIG. 4B).
  • pre-selected probability values in occupancy layer 40243 can be chosen to cover situations such as, for example, but not limited to, (1) when surface type 40121 (FIG. 4A) is hard and drivable, and when device 42114 (FIG. 4A) is in a pre-selected set of modes 41033 (FIG. 4B).
  • probability values can include, but are not limited to including, those set out in Table I.
  • the neural network- predicted probabilities can be tuned, if necessary, and can replace the probabilities listed in Table I.
  • occupancy grid processor 40242 can provide, in real-time, parameters that can affect probability values of occupancy grid 40244 such as, for example, but not limited to, surface type 40121 and occupancy grid information 41022, to global occupancy grid processor 41025.
  • Configuration information 40125 (FIG. 4B) such as, for example, but not limited to, mode 41033 and surface type 40121, can be provided to powerbase processor 41016 (FIG. 4B).
  • Powerbase processor 41016 (FIG. 4B) can determine motor commands 40128 (FIG. 4B), which can set the configuration of device 42114 (FIG. 4A), based at least on configuration information 40125 (FIG. 4B).
  • device 42100A can be configured according to the present teachings to operate in standard mode.
  • casters 42103 and second drive wheel 42101B can rest on the ground as device 42100A navigates its path.
  • First drive wheel 42101A can be raised by a pre-selected amount 42102 (FIG. 4F) and can clear the driving surface.
  • Device 42100A can navigate successfully on relatively firm, level surfaces.
  • occupancy grid 40244 (FIG. 4C) can reflect the surface type limitation (see Table I), and therefore can enable a compatible choice of mode, or can enable a configuration change based on the surface type and the current mode.
  • device 42100B/C can be configured according to the present teachings to operate in 4-Wheel mode.
  • first drive wheel 42101A and second drive wheel 42101B can rest on the ground as device 42100B/C navigates its path.
  • Casters 42103 can be retracted and can clear the driving surface by a pre-selected amount 42104 (FIG. 4H).
  • first drive wheel 42101A and second drive wheel 42101B can substantially rest on the ground as device 42100A navigates its path.
  • Casters 42103 can be retracted and chassis 42111 can be rotated (thus moving casters 42103 farther from the ground) to accommodate, for example, a discontinuous surface.
  • casters 42103 can clear the driving surface by a pre-selected amount 42108 (FIG. 4J).
  • Devices 42100A/C can navigate successfully on a variety of surfaces including soft surfaces and discontinuous surfaces.
  • second drive wheel 42101B can rest on the ground while first drive wheel 42101A can be raised as device 42100C (FIG. 4J) navigates its path.
  • Casters 42103 can be retracted and chassis 42111 can be rotated (thus moving casters 42103 farther from the ground) to accommodate, for example, a discontinuous surface.
  • occupancy grid 40244 (FIG. 4C) can reflect the surface type (see Table I), and therefore can enable a compatible choice of mode or can enable a configuration change based on the surface type and the current mode.
  • method 40150 for real-time control of a device configuration of a device can include, but is not limited to including, receiving 40151 sensor data, determining 40153 a surface type based at least on the sensor data, and determining 40155 a current mode based at least on the surface type and a current device configuration.
  • Method 40150 can include determining 40157 a next device configuration based at least on the current mode and the surface type, determining 40159 movement commands based at least on the next device configuration, and changing 40161 the current device configuration to the next device configuration based at least on the movement commands.
  • annotated point data 10379 (FIG. 5B) can be provided to device controller 10111.
  • Annotated point data 10379 (FIG. 5B), which can be the basis for route information that can be used to instruct AV 10101 (FIG. 1A) to travel a path, can include, but is not limited to including, navigable edges, a mapped trajectory such as, for example, but not limited to mapped trajectory 10413/10415 (FIG. 5D), and labeled features such as, for example, but not limited to, SDSFs 10377 (FIG. 5C).
  • Mapped trajectory 10413/10415 can include cost modifiers associated with the surfaces of the route space, and drive modes associated with the edges. Drive modes can include, but are not limited to including, path following and SDSF climbing. Other modes can include operational modes such as, for example, but not limited to, autonomous, mapping, and waiting for intervention. Ultimately, the path can be selected based at least on lower cost modifiers. Topology that is relatively distant from mapped trajectory 10413/10415 (FIG. 5D) can be assigned relatively higher cost modifiers.
  • Initial weights can be adjusted while AV 10101 (FIG. 1A) is operational, possibly causing a modification in the path. Adjusted weights can be used to adjust edge/weight graph 10381 (FIG. 5B), and can be based at least on the current drive mode, the current surface, and the edge category.
  • device controller 10111 can include a feature processor that can perform specific tasks related to incorporating the eccentricities of any features into the path.
  • the feature processor can include, but is not limited to including, SDSF processor 10118.
  • device controller 10111 can include, but is not limited to including, SDSF processor 10118, sensor processor 10703, mode controller 10122, and base controller 10114, each described herein. SDSF processor 10118, sensor processor 10703, and mode controller 10122 can provide input to base controller 10114.
  • base controller 10114 can determine, based at least on the inputs provided by mode controller 10122, SDSF processor 10118, and sensor processor 10703, information that power base 10112 can use to drive AV 10101 (FIG. 1A) on a path determined by base controller 10114 based at least on edge/weight graph 10381 (FIG. 5B). In some configurations, base controller 10114 can ensure that AV 10101 (FIG. 1A) can follow a pre-determined path from a starting point to a destination, and modify the pre-determined path based at least on external and/or internal conditions.
  • external conditions can include, but are not limited to including, stoplights, SDSFs, and obstacles in or near the path being driven by AV 10101 (FIG. 1A).
  • internal conditions can include, but are not limited to including, mode transitions reflecting the response that AV 10101 (FIG. 1A) makes to external conditions.
  • Device controller 10111 can determine commands to send to power base 10112 based at least on the external and internal conditions. Commands can include, but are not limited to including, speed and direction commands that can direct AV 10101 (FIG. 1A) to travel the commanded speed in the commanded direction. Other commands can include, for example, groups of commands that enable feature response such as, for example, SDSF climbing.
  • Base controller 10114 can determine a desired speed between waypoints of the path by conventional methods, including, but not limited to, Interior Point Optimizer (IPOPT) large-scale nonlinear optimization (https://projects.coin-or.org/Ipopt), for example.
  • Base controller 10114 can determine a desired path based at least on conventional technology such as, for example, but not limited to, technology based on Dijkstra’s algorithm, the A* search algorithm, or the Breadth-first search algorithm.
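As one example of the conventional technology named above, a compact Dijkstra search over an edge/weight graph might look like the following; the adjacency-map representation is an assumption for illustration.

```python
import heapq

def shortest_path(graph: dict, start, goal):
    """Dijkstra's algorithm over an adjacency map
    {vertex: [(neighbor, weight), ...]}; returns the vertex sequence."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, vertex, path = heapq.heappop(queue)
        if vertex == goal:
            return path
        if vertex in visited:
            continue
        visited.add(vertex)
        for neighbor, weight in graph.get(vertex, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None  # no path found
```

In the system described above, the edge weights would come from the cost modifiers and drive modes attached to the topology, which can be re-adjusted at runtime.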
  • Base controller 10114 can form a box around mapped trajectory 10413/10415 (FIG. 5C) to set an area in which obstacle detection can be performed.
  • the height of the payload carrier, when adjustable, can be adjusted based at least in part on the directed speed.
  • base controller 10114 can convert speed and direction determinations to motor commands. For example, when a SDSF such as, for example, but not limited to, a curb or slope is encountered, base controller 10114, in SDSF climbing mode, can direct power base 10112 to raise payload carrier 10173 (FIG. 1A), align AV 10101 (FIG. 1A) at approximately a 90° angle with the SDSF, and reduce the speed to a relatively low level. When AV 10101 (FIG. 1A) climbs the substantially discontinuous surface, base controller 10114 can direct power base 10112 to transition to a climbing phase in which the speed is increased because increased torque is required to move AV 10101 (FIG. 1A) up an incline.
  • base controller 10114 can reduce the speed in order to remain atop any flat part of the SDSF.
  • base controller 10114 can allow speed to increase.
  • when a SDSF such as, for example, but not limited to, a slope is encountered, the slope can be identified and processed as a structure.
  • Features of the structure can include a pre-selected size of a ramp, for example.
  • the ramp can include an approximately 30° incline, and can optionally, but not limited to, be on both sides of a plateau.
  • Device controller 10111 (FIG. 5A) can distinguish between an obstacle and a slope by comparing the angle of the perceived feature to an expected slope ramp angle, where the angle can be received from sensor processor 10703 (FIG. 5A).
  • SDSF processor 10118 can locate, from the blocks of drivable surfaces formed by a mesh of polygons represented in annotated point data 10379, navigable edges that can be used to create a path for traversal by AV 10101 (FIG. 1A).
  • within SDSF buffer 10407 (FIG. 5C), navigable edges can be erased (see FIG. 5D) in preparation for the special treatment given SDSF traversal. Closed line segments such as segment 10409 (FIG. 5C) can be drawn to bisect SDSF buffer 10407 (FIG. 5C).
  • the traversal criteria can include that segment ends 10411 fall in an unobstructed part of the drivable surface, that there is enough room for AV 10101 (FIG. 1A) to travel between adjacent SDSF points 10789 (FIG. 1N) along line segments, and that the area between SDSF points 10789 (FIG. 1N) is a drivable surface.
  • Segment ends 10411 can be connected to the underlying topology, forming vertices and drivable edges.
  • line segments 10461, 10463, 10465, and 10467 (FIG. 5C) that met the traversal criteria are shown as part of the topology in FIG. 5D.
  • line segment 10409 (FIG. 5C) did not meet the criteria at least because segment end 10411 (FIG. 5C) does not fall on a drivable surface.
  • Overlapping SDSF buffers 10506 (FIG. 5C) can indicate SDSF discontinuity, which could weigh against SDSF traversal of the SDSFs within the overlapped SDSF buffers 10506 (FIG. 5C).
  • SDSF line 10377 (FIG. 5C) can be smoothed, and the locations of SDSF points 10789 (FIG. 1N) can be adjusted so that they fall about a pre-selected distance apart, the pre-selected distance being based at least on the footprint of AV 10101 (FIG. 1A).
  • SDSF processor 10118 can transform annotated point data 10379 into edge/weight graph 10381, including topology modifications for SDSF traversal.
  • SDSF processor 10118 can include seventh processor 10601, eighth processor 10702, ninth processor 10603, and tenth processor 10605.
  • Seventh processor 10601 can transform the coordinates of the points in annotated point data 10379 to a global coordinate system, to achieve compatibility with GPS coordinates, producing GPS-compatible dataset 10602.
  • Seventh processor 10601 can use conventional processes such as, for example, but not limited to, affine matrix transform and PostGIS transform, to produce GPS-compatible dataset 10602.
  • the World Geodetic System can be used as the standard coordinate system as it takes into account the curvature of the earth.
  • the map can be stored in the Universal Transverse Mercator (UTM) coordinate system, and can be switched to WGS when it is necessary to find where specific addresses are located.
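A sketch of the UTM-to-WGS84 switch described above, using the pyproj library as one possible (assumed) implementation; the UTM zone and the coordinate values are placeholders.

```python
from pyproj import Transformer

# EPSG:32619 is UTM zone 19N (an example zone); EPSG:4326 is WGS84.
utm_to_wgs = Transformer.from_crs("EPSG:32619", "EPSG:4326", always_xy=True)
wgs_to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32619", always_xy=True)

# Map points are stored in UTM; switch to WGS84 when an address lookup
# needs latitude/longitude.
easting, northing = 330000.0, 4750000.0          # hypothetical map point
lon, lat = utm_to_wgs.transform(easting, northing)

# ...and back again once the address has been resolved.
e2, n2 = wgs_to_utm.transform(lon, lat)
```

Keeping the map in UTM keeps local distances metric for planning, while WGS84 handles the address lookups, as described above.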
  • eighth processor 10702 can smooth SDSFs and determine the boundary of SDSF 10377, create buffers 10407 around the SDSF boundary, and increase the cost modifier of the surface the farther it is from a SDSF boundary.
  • Mapped trajectory 10413/10415 can be a special case lane having the lowest cost modifier.
  • Lower cost modifiers 10406 can be generally located near the SDSF boundary, while higher cost modifiers 10408 can be generally located relatively farther from the SDSF boundary.
  • Eighth processor 10702 can provide point cloud data with costs 10704 (FIG. 5B) to ninth processor 10603 (FIG. 5B).
  • ninth processor 10603 can calculate approximately 90° approaches 10604 (FIG. 5B) for AV 10101 (FIG. 1A) to traverse SDSFs 10377 that have met the criteria to label them as traversable. Criteria can include SDSF width and SDSF smoothness. Line segments, such as line segment 10409, can be created such that their length is indicative of a minimum ingress distance that AV 10101 (FIG. 1A) might require to approach SDSF 10377, and a minimum egress distance that might be required to exit SDSF 10377. Segment endpoints, such as endpoint 10411, can be integrated with the underlying routing topology. The criteria used to determine if a SDSF approach is possible can eliminate some approach possibilities. SDSF buffers such as SDSF buffer 10407 can be used to calculate valid approaches and route topology edge creation.
  • tenth processor 10605 can create edge/weight graph 10381 from the topology, a graph of edges and weights, developed herein that can be used to calculate paths through the map.
  • the topology can include cost modifiers and drive modes, and the edges can include directionality and capacity.
  • the weights can be adjusted at runtime based on information from any number of sources.
  • Tenth processor 10605 can provide at least one sequence of ordered points to base controller 10114, plus a recommended drive mode at particular points, to enable path generation.
  • Each point in each sequence of points represents the location and labeling of a possible path point on the processed drivable surface.
  • the labeling can indicate that the point represents part of a feature that could be encountered along the path, such as, for example, but not limited to, a SDSF.
  • the feature could be further labeled with suggested processing based on the type of feature. For example, in some configurations, if the path point is labeled as a SDSF, further labeling can include a mode. The mode can be interpreted by AV 10101 (FIG. 1A) as suggested driving instructions for AV 10101 (FIG. 1A), such as, for example, switching AV 10101 (FIG. 1A) into SDSF climbing mode 10100-31 (FIG. 5E) to enable AV 10101 (FIG. 1A) to traverse SDSF 10377 (FIG. 5C).
  • mode controller 10122 can provide to base controller 10114 (FIG. 5A) directions to execute a mode transition.
  • Mode controller 10122 can establish the mode in which AV 10101 (FIG. 1A) is traveling.
  • mode controller 10122 can provide to base controller 10114 a change of mode indication, changing between, for example, path following mode 10100-32 and SDSF climbing mode 10100-31 when a SDSF is identified along the travel path.
  • annotated point data 10379 (FIG. 5B) can include mode identifiers at various points along the route, for example, when the mode changes to accommodate the route. For example, if SDSF 10377 (FIG. 5C) lies along the route, a SDSF climbing mode identifier can be associated with the nearby route points.
  • device controller 10111 can determine the mode identifier(s) associated with the route point(s) and possibly adjust the instructions to power base 10112 (FIG. 5A) based on the desired mode.
  • AV 10101 (FIG. 1A) can support operating modes that can include, but are not limited to including, standard mode 10100-1 and enhanced (4-Wheel) mode 10100-2, described herein.
  • the height of payload carrier 10173 (FIG. 1A) can be adjusted to provide necessary clearance over obstacles and along slopes.
  • method 11150 for navigating the AV towards a goal point across at least one SDSF can include, but is not limited to including, receiving 11151 SDSF information related to the SDSF, the location of the goal point, and the location of the AV.
  • the SDSF information can include, but is not limited to including, a set of points each classified as SDSF points, and an associated probability for each point that the point is a SDSF point.
  • Method 11150 can include drawing 11153 a closed polygon encompassing the location of the AV, the location of the goal point, and drawing a path line between the goal point and the location of the AV.
  • the closed polygon can include a pre-selected width. Table I includes possible ranges for the pre-selected variables discussed herein.
  • Method 11150 can include selecting 11155 two of the SDSF points located within the polygon and drawing 11157 a SDSF line between the two points. In some configurations, the selection of the SDSF points can be at random or in any other way. If 11159 there are fewer than a first pre-selected number of points within a first pre-selected distance of the SDSF line, and if 11161 there have been fewer than a second pre-selected number of attempts at choosing SDSF points, drawing a line between them, and having fewer than the first pre-selected number of points around the SDSF line, method 11150 can include returning to step 11155.
  • method 11150 can include noting 11163 that no SDSF line was detected. Referring now primarily to FIG. 5G, if 11159 (FIG. 5F) there are the first pre-selected number of points or more, method 11150 can include fitting 11165 a curve to the points that fall within the first pre-selected distance of the SDSF line.
  • method 11150 can include identifying 11175 the curve as the SDSF line.
  • method 11150 can include returning to step 11165.
  • a stable SDSF line is the result of subsequent iterations yielding the same or fewer points.
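The select-two-points, count-nearby-points, and retry loop above resembles a RANSAC-style line fit; a sketch under that interpretation follows, with hypothetical thresholds standing in for the pre-selected values in the text.

```python
import random
import numpy as np

def detect_sdsf_line(sdsf_points: np.ndarray,
                     inlier_dist: float = 0.15,
                     min_inliers: int = 20,
                     max_attempts: int = 50):
    """Pick pairs of candidate SDSF points (x, y), draw a line between
    them, and keep the line only if enough points fall close to it.
    Returns (p0, p1, inliers) or None when no SDSF line is detected."""
    if len(sdsf_points) < 2:
        return None
    for _ in range(max_attempts):
        i, j = random.sample(range(len(sdsf_points)), 2)
        p0, p1 = sdsf_points[i], sdsf_points[j]
        d = p1 - p0
        norm = np.linalg.norm(d)
        if norm == 0:
            continue
        d = d / norm
        # Perpendicular distance of every point to the candidate line.
        rel = sdsf_points - p0
        dist = np.abs(rel[:, 0] * d[1] - rel[:, 1] * d[0])
        inliers = sdsf_points[dist < inlier_dist]
        if len(inliers) >= min_inliers:
            return p0, p1, inliers
    return None   # no SDSF line was detected
```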
  • method 11150 can include receiving 11179 occupancy grid information.
  • the occupancy grid can provide the probability that obstacles exist at certain points.
  • the occupancy grid information can augment the SDSF and path information that are found in the polygon that surrounds the AV path and the SDSF(s) when the occupancy grid includes data captured and/or computed over the common geographic area with the polygon.
  • Method 11150 can include selecting 11181 a point from the common geographic area and its associated probability.
  • method 11150 can include projecting 11187 the obstacle to the SDSF line.
  • method 11150 can include resuming processing at step 11179.
  • method 11150 can include connecting 11191 the projections and finding the end points of the connected projections along the SDSF line.
  • Method 11150 can include marking 11193 the part of the SDSF line between the projection end points as non-traversable.
  • Method 11150 can include marking 11195 the part of the SDSF line that is outside of the non-traversable section as traversable.
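The obstacle-projection and marking steps above can be sketched as follows; the probability threshold and the parametric representation of the SDSF line are assumptions.

```python
import numpy as np

def traversable_spans(sdsf_p0: np.ndarray, sdsf_p1: np.ndarray,
                      obstacle_points: np.ndarray,
                      occupancy_prob: np.ndarray,
                      prob_threshold: float = 0.5):
    """Project obstacles onto the SDSF line and return the parameter
    ranges t in [0, 1] along the line that remain traversable."""
    d = sdsf_p1 - sdsf_p0
    length_sq = float(np.dot(d, d))

    # Parameter t of each obstacle's projection onto the SDSF line.
    rel = obstacle_points - sdsf_p0
    t = (rel @ d) / length_sq
    blocking = (occupancy_prob > prob_threshold) & (t >= 0.0) & (t <= 1.0)

    if not np.any(blocking):
        return [(0.0, 1.0)]                     # whole SDSF line traversable

    t_min, t_max = t[blocking].min(), t[blocking].max()
    spans = []
    if t_min > 0.0:
        spans.append((0.0, t_min))              # traversable before obstacles
    if t_max < 1.0:
        spans.append((t_max, 1.0))              # traversable after obstacles
    return spans
```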
  • Method 11150 can include turning 11197 the AV to within a fifth pre-selected amount of perpendicular to the traversable section of the SDSF line.
  • method 11150 can include slowing 11251 the AV by a ninth pre-selected amount.
  • Method 11150 can include driving 11253 the AV forward towards the SDSF line, slowing by a second pre-selected amount per meter distance between the AV and the traversable SDSF line. If 11255 the distance of the AV from the traversable SDSF line is less than a fourth pre-selected distance, and if 11257 the heading error is greater than or equal to a third pre-selected amount with respect to a line perpendicular to the SDSF line, method 11150 can include slowing 11252 the AV by the ninth pre-selected amount.
  • method 11150 can include ignoring 11260 updated SDSF information and driving the AV at a pre-selected speed. If 11259 the elevation of a front part of the AV relative to a rear part of the AV is between a sixth pre-selected amount and the fifth pre-selected amount, method 11150 can include driving 11261 the AV forward and increasing the speed of the AV to an eighth pre-selected amount per degree of elevation.
  • method 11150 can include driving 11265 the AV forward at a seventh pre-selected speed. If 11267 the rear of the AV is more than a fifth pre-selected distance from the SDSF line, method 11150 can include noting 11269 that the AV has completed traversing the SDSF. If 11267 the rear of the AV is less than or equal to the fifth pre-selected distance from the SDSF line, method 11150 can include returning to step 11260.
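The approach-and-climb behavior above can be pictured as a small speed-command function; every numeric constant below is a placeholder for one of the pre-selected amounts and distances, which the text deliberately leaves unspecified.

```python
def sdsf_speed_command(distance_to_sdsf: float,
                       heading_error_deg: float,
                       elevation_deg: float,
                       base_speed: float = 0.6) -> float:
    """Return a forward speed for the SDSF approach/climb phases.

    All constants are hypothetical tuning values: the text only specifies
    the shape of the behavior (slow on approach, speed up with elevation
    while climbing).
    """
    slow_speed = 0.1

    # Poorly aligned and close to the SDSF: slow nearly to a stop and re-align.
    if distance_to_sdsf < 0.5 and heading_error_deg >= 10.0:
        return slow_speed

    # Climbing: raise speed with the front-to-rear elevation angle so there
    # is enough torque to get up and over the discontinuity.
    if 2.0 < elevation_deg < 20.0:
        return min(base_speed + 0.02 * elevation_deg, 1.0)

    # Approach: the closer the AV gets to the SDSF line, the slower it drives.
    return min(base_speed, max(slow_speed, slow_speed + 0.2 * distance_to_sdsf))
```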
  • system 51100 for navigating an AV towards a goal point across at least one SDSF can include, but is not limited to including, path line processor 11103, SDSF detector 11109, and SDSF controller 11127.
  • System 51100 can be operably coupled with surface processor 11601 that can process sensor information that can include, for example, but not limited to, images of the surroundings of AV 10101 (FIG. 5L).
  • Surface processor 11601 can provide real-time surface feature updates, including indications of SDSFs.
  • cameras can provide RGB-D data whose points can be classified according to surface type.
  • system 51100 can process the points that have been classified as SDSFs and their associated probabilities.
  • System 51100 can be operably coupled with system controller 11602, which can manage aspects of the operation of AV 10101 (FIG. 5L).
  • System controller 11602 can maintain occupancy grid 11138 that can include information from available sources concerning navigable areas near AV 10101 (FIG. 5L).
  • Occupancy grid 11138 can include probabilities that obstacles exist. This information can be used, in conjunction with SDSF information, to determine if SDSF 10377 (FIG. 5N) can be traversed by AV 10101 (FIG. 5L), without encountering obstacle 11681 (FIG. 5M).
  • System controller 11602 can determine, based on environmental and other information, speed limit 11148 that AV 10101 (FIG. 5N) should not exceed.
  • Speed limit 11148 can be used as a guide, or can override, speeds set by system 51100.
  • System 51100 can be operably coupled with base controller 10114 which can send drive commands 11144 generated by SDSF controller 11127 to the drive components of the AV 10101 (FIG. 5L).
  • Base controller 10114 can provide information to SDSF controller 11127 about the orientation of AV 10101 (FIG. 5L) during SDSF traverse.
  • path line processor 11103 can continuously receive in real time surface classification points 10789 that can include, but are not limited to including, points classified as SDSFs.
  • Path line processor 11103 can receive the location of goal point 11139, and AV location 11141 as indicated by, for example, but not limited to, center 11202 (FIG. 5L) of AV 10101 (FIG. 5L).
  • System 51100 can include polygon processor 11105 drawing polygon 11147 encompassing AV location 11141, the location of goal point 11139, and path 11214 between goal point 11139 and AV location 11141.
  • Polygon 11147 can include the pre-selected width.
  • the pre-selected width can include approximately the width of AV 10101 (FIG. 5L). SDSF points 10789 that fall within polygon 11147 can be identified.
  • SDSF detector 11109 can receive surface classification points 10789, path 11214, polygon 11147, and goal point 11139, and can determine the most suitable SDSF line 10377, according to criteria set out herein, available within the incoming data.
  • SDSF detector 11109 can include, but is not limited to including, point processor 11111 and SDSF line processor 11113.
  • Point processor 11111 can include selecting two of SDSF points 10789 located within polygon 11147, and drawing SDSF line 10377 between the two points.
  • point processor 11111 can include again looping through the selecting-drawing-testing loop as stated herein. If there have been the second pre-selected number of attempts at choosing SDSF points, drawing a line between them, and having fewer than the first pre-selected number of points around the SDSF line, point processor 11111 can include noting that no SDSF line was detected.
  • SDSF line processor 11113 can include, if there are the first pre-selected number or more of points 10789, fitting curve 11609-11611 (FIG. 5L) to points 10789 that fall within the first pre-selected distance of SDSF line 10377. If the number of points 10789 that are within the first pre-selected distance of curve 11609-11611 (FIG. 5L) exceeds the number of points 10789 within the first pre-selected distance of SDSF line 10377, and if curve 11609-11611 (FIG. 5L) intersects path line 11214, and if there are no gaps between the points 10789 on curve 11609-11611 (FIG. 5L) that exceed the second pre-selected distance,
  • SDSF line processor 11113 can include identifying curve 11609-11611 (FIG. 5L) (for example) as SDSF line 10377. If the number of points 10789 that are within the pre-selected distance of curve 11609-11611 (FIG. 5L) does not exceed the number of points 10789 within the first pre-selected distance of SDSF line 10377, or if curve 11609-11611 (FIG. 5L) does not intersect path line 11214, or if there are gaps between points 10789 on curve 11609-11611 (FIG. 5L) that exceed the second pre-selected distance, and if SDSF line 10377 is not remaining stable, and if the curve fit has not been attempted more than the second pre-selected number of attempts, SDSF line processor 11113 can execute the curve fit loop again.
  • SDSF controller 11127 can receive SDSF line 10377, occupancy grid 11138, AV orientation changes 11142, and speed limit 11148, and can generate SDSF commands 11144 to drive AV 10101 (FIG. 5L) to correctly traverse SDSF 10377 (FIG. 5N).
  • SDSF controller 11127 can include, but is not limited to including, obstacle processor 11115, SDSF approach 11131, and SDSF traverse 11133. Obstacle processor 11115 can receive SDSF line 10377, goal point 11139, and occupancy grid 11138, and can determine if, among the obstacles identified in occupancy grid 11138, any of them could impede AV 10101 (FIG. 5N) as it traverses SDSF 10377 (FIG. 5N).
  • Obstacle processor 11115 can include, but is not limited to including, obstacle selector 11117, obstacle tester 11119, and traverse locator 11121.
  • Obstacle selector 11117 can include, but is not limited to including, receiving occupancy grid 11138 as described herein. Obstacle selector 11117 can include selecting an occupancy grid point and its associated probability from the geographic area that is common to both occupancy grid 11138 and polygon 11147.
  • Obstacle tester 11119 can include, if the probability that an obstacle exists at the selected grid point is higher than the pre-selected percent, and if the obstacle lies between AV 10101 (FIG. 5M) and SDSF line 10377,
  • obstacle tester 11119 can include projecting the obstacle to SDSF line 10377, forming projections 11621 that intersect SDSF line 10377. If there is less than or equal to the pre-selected percent probability that the location includes an obstacle, or if the obstacle does not lie between AV 10101 (FIG. 5M) and SDSF line 10377,
  • obstacle tester 11119 can include, if there are more obstacles to process, resuming execution at receiving occupancy grid 11138.
  • traverse locator 11121 can include connecting projection points and locating end points 11622/11623 (FIG. 5M) of connected projections 11621 (FIG. 5M) along SDSF line 10377.
  • Traverse locator 11121 can include marking part 11624 (FIG. 5M) of SDSF line 10377 between projection end points 11622/11623 (FIG. 5M) as non-traversable.
  • Traverse locator 11121 can include marking parts 11626 (FIG. 5M) of SDSF line 10377 that are outside of non-traversable part 11624 (FIG. 5M) as traversable.
  • SDSF approach 11131 can include sending SDSF commands 11144 to turn AV 10101 (FIG. 5N) to within the fifth pre-selected amount perpendicular to traversable part 11626 (FIG. 5N) of SDSF line 10377. If the heading error with respect to perpendicular line 11627 (FIG. 5N), perpendicular to traversable section 11626 (FIG. 5N) of SDSF line 10377, is greater than the first pre-selected amount, SDSF approach 11131 can include sending SDSF commands 11144 to slow AV 10101 (FIG. 5N) by the ninth pre-selected amount. In some configurations, the ninth pre-selected amount can range from very slow to completely stopped.
  • SDSF approach 11131 can include sending SDSF commands 11144 to drive AV 10101 (FIG. 5N) forward towards SDSF line 10377, sending SDSF commands 11144 to slow AV 10101 (FIG. 5N) by the second pre-selected amount per meter traveled. If the distance between AV 10101 (FIG. 5N) and traversable SDSF line 11626 (FIG. 5N) is less than the fourth pre-selected distance, and if the heading error is greater than or equal to the third pre-selected amount with respect to a line perpendicular to SDSF line 10377, SDSF approach 11131 can include sending SDSF commands 11144 to slow AV 10101 (FIG. 5N) by the ninth pre-selected amount.
  • SDSF traverse 11133 can include ignoring updated SDSF information and sending SDSF commands 11144 to drive AV 10101 (FIG. 5N) at the pre-selected rate. If the AV orientation changes 11142 indicate that the elevation of leading edge 11701 (FIG. 5N) of AV 10101 (FIG. 5N) relative to trailing edge 11703 (FIG. 5N) of AV 10101 (FIG. 5N) is between the sixth pre-selected amount and the fifth pre-selected amount,
  • SDSF traverse 11133 can include sending SDSF commands 11144 to drive AV 10101 (FIG. 5N) forward, and sending SDSF commands 11144 to increase the speed of AV 10101 (FIG. 5N) to the pre-selected rate per degree of elevation. If AV orientation changes 11142 indicate that leading edge 11701 (FIG. 5N) to trailing edge 11703 (FIG. 5N) elevation of AV 10101 (FIG. 5N) is less than the sixth pre-selected amount, SDSF traverse 11133 can include sending SDSF commands 11144 to drive AV 10101 (FIG. 5N) forward at the seventh pre-selected speed.
  • If AV location 11141 indicates that trailing edge 11703 (FIG. 5N) is more than the fifth pre-selected distance from SDSF line 10377, SDSF traverse 11133 can include noting that AV 10101 (FIG. 5N) has completed traversing SDSF 10377. If AV location 11141 indicates that trailing edge 11703 (FIG. 5N) is less than or equal to the fifth pre-selected distance from SDSF line 10377, SDSF traverse 11133 can include executing again the loop beginning with ignoring the updated SDSF information.
  • the system of the present teachings can produce locations in three-dimensional space of various surface types upon receiving data such as, for example, but not limited to, RGB-D camera image data.
  • the system can rotate images 12155 and translate them from camera coordinate system 12157 into UTM coordinate system 12159.
  • the system can produce polygon files from the transformed images, and the polygon files can represent the three-dimensional locations that are associated with surface type 12161.
  • Method 12150, for AV 10101 having pose 12163, can include, but is not limited to including, receiving, by AV 10101, camera images 12155.
  • Each of camera images 12155 can include an image timestamp 12165, and each of images 12155 can include image color pixels 12167 and image depth pixels 12169.
  • Method 12150 can include receiving pose 12163 of AV 10101, pose 12163 having pose timestamp 12171, and determining selected image 12173 by identifying an image from camera images 12155 having a closest image timestamp 12165 to pose timestamp 12171.
  • Method 12150 can include separating image color pixels 12167 from image depth pixels 12169 in selected image 12173, and determining image surface classifications 12161 for selected image 12173 by providing image color pixels 12167 to first machine learning model 12177 and image depth pixels 12169 to second machine learning model 12179.
  • Method 12150 can include determining perimeter points 12181 of the features in camera image 12173, where the features can include feature pixels 12151 within the perimeter, each of feature pixels 12151 having the same surface classification 12161, each of perimeter points 12181 having set of coordinates 12157.
  • Method 12150 can include converting each of sets of coordinates 12157 to UTM coordinates 12159.
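A condensed sketch of the image-selection and pixel-separation steps of method 12150; the image record layout and the model `predict` interfaces are hypothetical, and the combination rule is illustrative only.

```python
import numpy as np

def select_image_for_pose(images: list, pose_timestamp: float) -> dict:
    """Pick the camera image whose timestamp is closest to the pose timestamp.
    Each image is assumed to be a dict with 'timestamp', 'color', and 'depth'."""
    return min(images, key=lambda img: abs(img["timestamp"] - pose_timestamp))

def classify_surfaces(image: dict, color_model, depth_model) -> np.ndarray:
    """Separate the color and depth pixels and run each through its own
    (hypothetical) machine-learning model, combining the per-pixel
    surface classifications."""
    color_classes = color_model.predict(image["color"])   # assumed interface
    depth_classes = depth_model.predict(image["depth"])   # assumed interface
    # Keep the class where the two models agree; mark disagreements as
    # unknown (-1).
    return np.where(color_classes == depth_classes, color_classes, -1)
```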
  • Configurations of the present teachings are directed to computer systems for accomplishing the methods discussed in the description herein, and to computer readable media containing programs for accomplishing these methods.
  • the raw data and results can be stored for future retrieval and processing, printed, displayed, transferred to another computer, and/or transferred elsewhere.
  • Communications links can be wired or wireless, for example, using cellular communication systems, military communications systems, and satellite communications systems. Parts of the system can operate on a computer having a variable number of CPUs. Other alternative computer platforms can be used.
  • the present configuration is also directed to software for accomplishing the methods discussed herein, and computer readable media storing software for accomplishing these methods.
  • the various modules described herein can be accomplished on the same CPU, or can be accomplished on a different computer.
  • the present configuration has been described in language more or less specific as to structural and methodical features. It is to be understood, however, that the present configuration is not limited to the specific features shown and described, since the means herein disclosed comprise preferred forms of putting the present configuration into effect.
  • Methods can be, in whole or in part, implemented electronically.
  • Signals representing actions taken by elements of the system and other disclosed configurations can travel over at least one live communications network.
  • Control and data information can be electronically executed and stored on at least one computer-readable medium.
  • the system can be implemented to execute on at least one computer node in at least one live communications network.
  • At least one computer-readable medium can include, for example, but not be limited to, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a compact disk read only memory or any other optical medium, punched cards, paper tape, or any other physical medium with patterns of holes, a random access memory, a programmable read only memory, an erasable programmable read only memory (EPROM), a Flash EPROM, or any other memory chip or cartridge, or any other medium from which a computer can read.
  • the at least one computer readable medium can contain graphs in any form, subject to appropriate licenses where necessary, including, but not limited to, Graphic Interchange Format (GIF), Joint Photographic Experts Group (JPEG), Portable Network Graphics (PNG), Scalable Vector Graphics (SVG), and Tagged Image File Format (TIFF).

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Acoustics & Sound (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to an autonomous vehicle having sensors with advantageously varied capabilities, advantageously positioned, and advantageously impervious to environmental conditions. A system executing on the autonomous vehicle can receive a map including, for example, substantially discontinuous surface features together with data from the sensors, create an occupancy grid based on the map and the data, and modify the configuration of the autonomous vehicle based on the type of surface on which the autonomous vehicle is navigating. The device can navigate safely over surfaces and surface features, including traversing discontinuous surfaces and other obstacles.
EP20750954.8A 2019-07-10 2020-07-10 Système et procédé de commande en temps réel d'un dispositif autonome Pending EP3997484A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962872396P 2019-07-10 2019-07-10
US201962872320P 2019-07-10 2019-07-10
US202062990485P 2020-03-17 2020-03-17
PCT/US2020/041711 WO2021007561A1 (fr) 2019-07-10 2020-07-10 Système et procédé de commande en temps réel d'un dispositif autonome

Publications (1)

Publication Number Publication Date
EP3997484A1 true EP3997484A1 (fr) 2022-05-18

Family

ID=71944349

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20750954.8A Pending EP3997484A1 (fr) 2019-07-10 2020-07-10 Système et procédé de commande en temps réel d'un dispositif autonome

Country Status (11)

Country Link
EP (1) EP3997484A1 (fr)
JP (1) JP2023112104A (fr)
KR (1) KR20220034843A (fr)
CN (1) CN114270140A (fr)
AU (1) AU2020310932A1 (fr)
BR (1) BR112022000356A2 (fr)
CA (1) CA3146648A1 (fr)
GB (1) GB2600638A (fr)
IL (1) IL289689A (fr)
MX (1) MX2022000458A (fr)
WO (1) WO2021007561A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11988749B2 (en) 2021-08-19 2024-05-21 Argo AI, LLC System and method for hybrid LiDAR segmentation with outlier detection
DE102023001762A1 (de) 2022-05-24 2023-11-30 Sew-Eurodrive Gmbh & Co Kg Mobiles System
DE102023202244A1 (de) 2023-03-13 2024-09-19 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren für eine dreidimensionale Straßenbereichssegmentierung für ein Fahrzeug

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000054719A1 (fr) 1999-03-15 2000-09-21 Deka Products Limited Partnership Systeme et procede de commande d'un fauteuil roulant
US8717456B2 (en) * 2002-02-27 2014-05-06 Omnivision Technologies, Inc. Optical imaging systems and methods utilizing nonlinear and/or spatially varying image processing
CN108369775B (zh) * 2015-11-04 2021-09-24 祖克斯有限公司 响应于物理环境的改变自适应制图以对自主车辆进行导航

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
B-H SCHAFER ET AL: "The Application of Design Schemata in Off-Road Robotics", IEEE INTELLIGENT TRANSPORTATION SYSTEMS MAGAZINE, vol. 5, no. 1, 1 January 2013 (2013-01-01), USA, pages 4 - 27, XP055733621, ISSN: 1939-1390, DOI: 10.1109/MITS.2012.2217591 *

Also Published As

Publication number Publication date
KR20220034843A (ko) 2022-03-18
GB202201700D0 (en) 2022-03-30
WO2021007561A1 (fr) 2021-01-14
AU2020310932A1 (en) 2022-01-27
GB2600638A (en) 2022-05-04
CN114270140A (zh) 2022-04-01
JP2022540640A (ja) 2022-09-16
MX2022000458A (es) 2022-04-25
GB2600638A8 (en) 2022-11-23
IL289689A (en) 2022-03-01
JP2023112104A (ja) 2023-08-10
CA3146648A1 (fr) 2021-01-14
BR112022000356A2 (pt) 2022-05-10

Similar Documents

Publication Publication Date Title
US11927457B2 (en) System and method for real time control of an autonomous device
EP3759562B1 (fr) Localisation faisant appel à une caméra pour véhicules autonomes
US11790668B2 (en) Automated road edge boundary detection
EP3997484A1 (fr) Système et procédé de commande en temps réel d'un dispositif autonome
CN111542860A (zh) 用于自主车辆的高清地图的标志和车道创建
CN110832279A (zh) 对准由自主车辆捕获的数据以生成高清晰度地图
AU2020229324B2 (en) System and method for surface feature detection and traversal
CN112819943B (zh) 一种基于全景相机的主动视觉slam系统
WO2015180021A1 (fr) Système robot d'élagage
CN112800524A (zh) 一种基于深度学习的路面病害三维重建方法
CN116508071A (zh) 用于注释汽车雷达数据的系统和方法
JP7572420B2 (ja) 自律的デバイスのリアルタイム制御のためのシステムおよび方法
CN116338729A (zh) 一种基于多层地图的三维激光雷达导航方法
Yang et al. An optimization-based selection approach of landing sites for swarm unmanned aerial vehicles in unknown environments
Li et al. Intelligent vehicle localization and navigation based on intersection fingerprint roadmap (IRM) in underground parking lots
Azim 3D perception of outdoor and dynamic environment using laser scanner
Gustafsson Wall and curb detection and mapping for mobile robots in urban environments
Yoon Path planning and sensor knowledge store for unmanned ground vehicles in urban area evaluated by multiple ladars
Mukhija et al. Mapping a Network of Roads for an On-road Navigating Robot

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220131

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230519

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17Q First examination report despatched

Effective date: 20240126

PUAG Search results despatched under rule 164(2) epc together with communication from examining division

Free format text: ORIGINAL CODE: 0009017

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240801

B565 Issuance of search results under rule 164(2) epc

Effective date: 20240801

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 17/931 20200101ALI20240729BHEP

Ipc: G01S 17/86 20200101ALI20240729BHEP

Ipc: G01S 17/42 20060101ALI20240729BHEP

Ipc: G01S 15/931 20200101ALI20240729BHEP

Ipc: G01S 15/86 20200101ALI20240729BHEP

Ipc: G01S 13/86 20060101ALI20240729BHEP

Ipc: G01C 21/34 20060101ALI20240729BHEP

Ipc: B62D 15/02 20060101ALI20240729BHEP

Ipc: G01S 13/931 20200101ALI20240729BHEP

Ipc: G01S 7/48 20060101ALI20240729BHEP

Ipc: G05D 1/00 20060101ALI20240729BHEP

Ipc: G01C 21/30 20060101ALI20240729BHEP

Ipc: G01S 17/87 20200101ALI20240729BHEP

Ipc: G01S 17/89 20200101AFI20240729BHEP