GB2600638A - System and method for real time control of an autonomous device


Info

Publication number
GB2600638A
Authority
GB
United Kingdom
Prior art keywords
occupancy grid
delivery vehicle
global
autonomous delivery
short
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2201700.8A
Other versions
GB2600638A8 (en)
GB202201700D0 (en)
Inventor
A Van Der Meree Dirk
Mishra Arunabh
Langenfeld Christopher
J Slate Michael
J Principe Christopher
J Buitkus Gregory
M Whitney Justin
Gummadi Raajitha
G Kane Derek
A Carrigg Emily
Steele Patrick
V Hersh Benjamin
G Siva Perumal Fnu
Carrigg David
F Pawlowski Daniel
Chaturvedi Yashovardhan
Khanna Kartik
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deka Products LP
Original Assignee
Deka Products LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deka Products LP filed Critical Deka Products LP
Publication of GB202201700D0 (en)
Publication of GB2600638A (en)
Publication of GB2600638A8 (en)


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0025Planning or execution of driving tasks specially adapted for specific operations
    • B60W60/00256Delivery operations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3453Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/86Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808Evaluating distance, position or velocity data
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0002Automatic control, details of type of controller or control system architecture
    • B60W2050/0004In digital systems, e.g. discrete-time systems involving sampling
    • B60W2050/0005Processor details or data handling, e.g. memory registers or chip architecture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0002Automatic control, details of type of controller or control system architecture
    • B60W2050/0008Feedback, closed loop systems or details of feedback error signal
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0043Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2050/0052Filtering, filters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/54Audio sensitive means, e.g. ultrasound
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/50External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/025Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9318Controlling the steering
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/93185Controlling the brakes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9319Controlling the accelerator
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9323Alternative operation using light waves

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Acoustics & Sound (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

An autonomous vehicle has sensors that are varied in capability, advantageously positioned, and resistant to environmental conditions. A system executing on the autonomous vehicle receives a map that includes, for example, substantially discontinuous surface features, together with data from the sensors; creates an occupancy grid based on the map and the data; and changes the configuration of the autonomous vehicle based on the type of surface on which it navigates. The device can safely navigate surfaces and surface features, including traversing discontinuous surfaces and other obstacles.

Claims (93)

1. An autonomous delivery vehicle comprising: a power base including two powered front wheels, two powered back wheels and energy storage, the power base configured to move at a commanded velocity and in a commanded direction to perform a transport of at least one object; a cargo platform including a plurality of short-range sensors, the cargo platform mechanically attached to the power base; a cargo container with a volume for receiving the at least one object, the cargo container mounted on top of the cargo platform; a long-range sensor suite comprising LIDAR and one or more cameras, the long-range sensor suite mounted on top of the cargo container; and a controller to receive data from the long-range sensor suite and the plurality of short-range sensors, the controller determining the commanded velocity and the commanded direction based at least on the data, the controller providing the commanded velocity and the commanded direction to the power base to complete the transport.
2. The autonomous delivery vehicle of claim 1 wherein the data from the plurality of short-range sensors comprise at least one characteristic of a surface upon which the power base travels.
3. The autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprises at least one stereo camera.
4. The autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprise at least one IR projector, at least one image sensor, and at least one RGB sensor.
5. The autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprises at least one radar sensor.
6. The autonomous delivery vehicle of claim 1 wherein the data from the plurality of short-range sensors comprise RGB-D data.
7. The autonomous delivery vehicle of claim 1 wherein the controller determines a geometry of a road surface based on RGB-D data received from the plurality of short-range sensors.
8. The autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors detect objects within 4 meters of the autonomous delivery vehicle and the long-range sensor suite detects objects more than 4 meters from the autonomous delivery vehicle.
9. The autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprise a cooling circuit.
10. The autonomous delivery vehicle of claim 1 wherein the plurality of short-range sensors comprise an ultrasonic sensor.
11. The autonomous delivery vehicle of claim 2 wherein the controller comprises: executable code, the executable code including: accessing a map, the map formed by a map processor, the map processor comprising: a first processor accessing point cloud data from the long-range sensor suite, the point cloud data representing the surface; a filter filtering the point cloud data; a second processor forming processable parts from the filtered point cloud data; a third processor merging the processable parts into at least one polygon; a fourth processor locating and labeling the at least one substantially discontinuous surface feature (SDSF) in the at least one polygon, if present, the locating and labeling forming labeled point cloud data; a fifth processor creating graphing polygons from the labeled point cloud data; and a sixth processor choosing a path from a starting point to an ending point based at least on the graphing polygons, the autonomous delivery vehicle traversing the at least one SDSF along the path.
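The map pipeline recited in claim 11 reads as a staged point cloud computation. Below is a minimal NumPy sketch of the filter, segmentation, and merge stages; the helper names, thresholds, and four-way split are illustrative assumptions, not the patent's implementation:

```python
# Illustrative pipeline for the claim-11 map processor; thresholds and
# helper names are assumptions, not the patent's actual code.
import numpy as np

def remove_outliers(points: np.ndarray, z_thresh: float = 3.0) -> np.ndarray:
    """Filter stage: drop points whose height deviates strongly from the mean."""
    z = points[:, 2]
    keep = np.abs(z - z.mean()) < z_thresh * (z.std() + 1e-9)
    return points[keep]

def voxel_downsample(points: np.ndarray, voxel: float = 0.05) -> np.ndarray:
    """Merge-stage helper: keep one point per voxel cell."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

def build_map(points: np.ndarray) -> np.ndarray:
    filtered = remove_outliers(points)                        # filter the point cloud
    parts = np.array_split(filtered, 4)                       # form processable parts
    merged = np.vstack([voxel_downsample(p) for p in parts])  # merge the parts
    # SDSF labeling, graphing polygons, and path selection would follow here.
    return merged
```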
12. The autonomous delivery vehicle as in claim 11 wherein the filter comprises: a seventh processor executing code including: conditionally removing points representing transient objects and points representing outliers from the point cloud data; and replacing the removed points having a pre-selected height.
13. The autonomous delivery vehicle as in claim 11 wherein the second processor includes the executable code comprising: segmenting the point cloud data into the processable parts; and removing points of a pre-selected height from the processable parts.
14. The autonomous delivery vehicle as in claim 11 wherein the third processor includes the executable code comprising: reducing a size of the processable parts by analyzing outliers, voxels, and normals; growing regions from the reduced-size processable parts; determining initial drivable surfaces from the grown regions; segmenting and meshing the initial drivable surfaces; locating polygons within the segmented and meshed initial drivable surfaces; and setting at least one drivable surface based at least on the polygons.
15. The autonomous delivery vehicle as in claim 14 wherein the fourth processor includes the executable code comprising: sorting the point cloud data of the initial drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points; and locating at least one SDSF point based at least on whether the at least three categories of points, in combination, meet at least one first pre-selected criterion.
16. The autonomous delivery vehicle as in claim 15 wherein the fourth processor includes the executable code comprising: creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF point, in combination, meet at least one second pre-selected criterion.
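Claims 15 and 16 sort points into at least three categories and declare SDSF points where those categories jointly satisfy a criterion. One plausible concrete reading, sketched below, bins points by height so that a curb-like feature exposes ground, riser, and top points within a small window; the bands, window size, and single-axis scan are all assumptions:

```python
# Illustrative reading of claims 15-16: three height categories must
# co-occur within a window for a point to be flagged as an SDSF point.
import numpy as np

def sdsf_points(points: np.ndarray, window: float = 0.2,
                low: float = 0.05, high: float = 0.15) -> list:
    """points: (N, 3) array; returns x-positions of candidate SDSF points."""
    cats = np.digitize(points[:, 2], [low, high])  # 0=ground, 1=riser, 2=top
    order = np.argsort(points[:, 0])
    xs, cs = points[order, 0], cats[order]
    hits = []
    for i in range(len(xs)):
        near = cs[(xs >= xs[i] - window) & (xs <= xs[i] + window)]
        if {0, 1, 2} <= set(near.tolist()):  # all three categories present
            hits.append(xs[i])
    return hits
```

A run of adjacent SDSF points could then be chained into the SDSF trajectory of claim 16.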
17. The autonomous delivery vehicle as in claim 14 wherein creating graphing polygons includes an eighth processor including the executable code comprising: creating at least one polygon from the at least one drivable surface, the at least one polygon including exterior edges; smoothing the exterior edges; forming a driving margin based on the smoothed exterior edges; adding the at least one SDSF trajectory to the at least one drivable surface; and removing interior edges from the at least one drivable surface according to at least one third pre-selected criterion.
18. The autonomous delivery vehicle as in claim 17 wherein the smoothing the exterior edges includes a ninth processor including the executable code comprising: trimming the exterior edges outward forming outward edges.
19. The autonomous delivery vehicle as in claim 18 wherein forming the driving margin of the smoothed exterior edges includes a tenth processor including the executable code comprising: trimming the outward edges inward.
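Read together, claims 18 and 19 trim the drivable-surface boundary outward and then back inward, which behaves like a morphological closing followed by an inset driving margin. A sketch using Shapely's buffer operation, with assumed trim distances:

```python
# Hedged sketch of claims 18-19: outward-then-inward trimming of a
# drivable-surface polygon via Shapely buffers.
from shapely.geometry import Polygon

def smooth_with_margin(poly: Polygon, trim: float = 0.5,
                       margin: float = 0.3) -> Polygon:
    # Claim 18: trim exterior edges outward (fills notches, smooths edges).
    outward = poly.buffer(trim)
    # Claim 19: trim the outward edges inward; trimming farther than the
    # outward step leaves an interior driving margin.
    return outward.buffer(-(trim + margin))

jagged = Polygon([(0, 0), (5, 0), (5, 1), (3, 1), (3, 1.1), (0, 1)])
print(smooth_with_margin(jagged).area)  # smaller, smoothed drivable area
```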
20. The autonomous delivery vehicle as in claim 1 wherein the controller comprises: a subsystem for navigating at least one substantially discontinuous surface feature (SDSF) encountered by the autonomous delivery vehicle (AV), the AV traveling a path over a surface, the surface including the at least one SDSF, the path including a starting point and an ending point, the subsystem comprising: a first processor accessing a route topology, the route topology including at least one graphing polygon including filtered point cloud data, the filtered point cloud data including labeled features, the point cloud data including a drivable margin; a second processor transforming the point cloud data into a global coordinate system; a third processor determining boundaries of the at least one SDSF, the third processor creating SDSF buffers of a pre-selected size around the boundaries; a fourth processor determining which of the at least one SDSFs can be traversed based at least on at least one SDSF traversal criterion; a fifth processor creating an edge/weight graph based at least on the at least one SDSF traversal criterion, the transformed point cloud data, and the route topology; and a base controller choosing the path from the starting point to the ending point based at least on the edge/weight graph.
21. The autonomous delivery vehicle as in claim 20 wherein the at least one SDSF traversal criterion comprises: a pre-selected width of the at least one SDSF and a pre-selected smoothness of the at least one SDSF; a minimum ingress distance and a minimum egress distance between the at least one SDSF and the AV including a drivable surface; and the minimum ingress distance between the at least one SDSF and the AV accommodating approximately a 90° approach by the AV to the at least one SDSF.
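The base controller of claim 20 selects a path over an edge/weight graph in which traversable SDSF crossings remain available but can carry extra weight. A conventional shortest-path search over such a graph might look like the following; the graph contents and penalty value are assumptions:

```python
# Sketch of the claim-20 path choice: Dijkstra over an edge/weight graph
# where edges crossing a traversable SDSF carry an assumed penalty.
import heapq

def choose_path(graph: dict, start: str, goal: str) -> list:
    """graph: {node: [(neighbor, weight), ...]}; returns a node list."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

SDSF_PENALTY = 5.0  # discourages, but does not forbid, curb crossings
graph = {"A": [("B", 1.0), ("C", 1.0 + SDSF_PENALTY)],
         "B": [("D", 1.0)], "C": [("D", 0.5)], "D": []}
print(choose_path(graph, "A", "D"))  # ['A', 'B', 'D']
```

Here the SDSF edge A-C is usable but penalized, so the planner prefers the curb-free route unless it is much longer.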
22. A method for managing a global occupancy grid for an autonomous device, the global occupancy grid including global occupancy grid cells, the global occupancy grid cells being associated with occupied probability, the method comprising: receiving sensor data from sensors associated with the autonomous device; creating a local occupancy grid based at least on the sensor data, the local occupancy grid having local occupancy grid cells; if the autonomous device has moved from a first area to a second area, accessing historical data associated with the second area; creating a static grid based at least on the historical data; moving the global occupancy grid to maintain the autonomous device in a central position of the global occupancy grid; updating the moved global occupancy grid based on the static grid; marking at least one of the global occupancy grid cells as unoccupied, if the at least one of the global occupancy grid cells coincides with a location of the autonomous device; for each of the local occupancy grid cells, calculating a position of the local occupancy grid cell on the global occupancy grid; accessing a first occupied probability from the global occupancy grid cell at the position; accessing a second occupied probability from the local occupancy grid cell at the position; and computing a new occupied probability at the position on the global occupancy grid based at least on the first occupied probability and the second occupied probability.
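A minimal sketch of the per-cell update of claim 22, with the range check of claims 23 and 24 applied as a clamp. The additive blend of the first and second probabilities is an assumption; a production system would more likely fuse in log-odds space:

```python
# Sketch of the claim-22 cell update: combine each local-grid probability
# with the corresponding global cell and range-check the result.
import numpy as np

def fuse_local_into_global(global_grid: np.ndarray, local_grid: np.ndarray,
                           offset: tuple) -> np.ndarray:
    """offset: (row, col) of the local grid's origin within the global grid."""
    r0, c0 = offset
    h, w = local_grid.shape
    first = global_grid[r0:r0 + h, c0:c0 + w]   # first occupied probability
    second = local_grid                         # second occupied probability
    new_p = first + second - 0.5                # compute new probability (assumed blend)
    global_grid[r0:r0 + h, c0:c0 + w] = np.clip(new_p, 0.0, 1.0)  # range check
    return global_grid

g = np.full((10, 10), 0.5)
l = np.array([[0.9, 0.1], [0.5, 0.8]])
print(fuse_local_into_global(g, l, (4, 4))[4:6, 4:6])
```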
23. The method as in claim 22 further comprising: range-checking the new occupied probability.
24. The method as in claim 23 wherein the range-checking comprises: setting the new occupied probability to 0 if the new occupied probability <0; and setting the new occupied probability to 1 if the new occupied probability >1.
25. The method as in claim 22 further comprising: setting the global occupancy grid cell to the new occupied probability.
26. The method as in claim 23 further comprising: setting the global occupancy grid cell to the range-checked new occupied probability.
27. A method for creating and managing occupancy grids comprising: transforming, by a local occupancy grid creation node, sensor measurements to a frame of reference associated with a device; creating a time-stamped measurement occupancy grid; publishing the time-stamped measurement occupancy grid as a local occupancy grid; creating a plurality of local occupancy grids; creating a static occupancy grid based on surface characteristics in a repository, the surface characteristics associated with a position of the device; moving a global occupancy grid associated with the position of the device to maintain the device and the local occupancy grid approximately centered with respect to the global occupancy grid; adding information from the static occupancy grid to the global occupancy grid; marking an area in the global occupancy grid currently occupied by the device as unoccupied; for each of at least one cell in each local occupancy grid, determining a location of the at least one cell in the global occupancy grid; accessing a first value at the location; determining a second value at the location based on a relationship between the first value and a cell value at the at least one cell in the local occupancy grid; comparing the second value against a pre-selected probability range; and setting the global occupancy grid with the second value if the second value is within the pre-selected probability range.
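The "moving the global occupancy grid" step of claim 27 can be realized as an array shift opposite to the device's motion, backfilling newly exposed cells with an unknown prior. A sketch, with the shift mechanism and prior value assumed:

```python
# Sketch of recentering the global grid around the device (claim 27).
import numpy as np

UNKNOWN = 0.5  # assumed prior occupied probability for unseen cells

def recenter(grid: np.ndarray, drow: int, dcol: int) -> np.ndarray:
    """Device moved by (drow, dcol) cells; shift contents to keep it centered."""
    out = np.full_like(grid, UNKNOWN)
    h, w = grid.shape
    rs, re = max(0, -drow), min(h, h - drow)
    cs, ce = max(0, -dcol), min(w, w - dcol)
    out[rs:re, cs:ce] = grid[rs + drow:re + drow, cs + dcol:ce + dcol]
    return out

g = np.zeros((5, 5)); g[2, 2] = 1.0   # obstacle at the old center
print(recenter(g, 1, 0))              # obstacle now appears one row up
```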
28. The method as in claim 27 further comprising: publishing the global occupancy grid.
29. The method as in claim 27 wherein the surface characteristics comprise surface type and surface discontinuities.
30. The method as in claim 27 wherein the relationship comprises summing.
31. A system for creating and managing occupancy grids comprising: a plurality of local grid creation nodes creating at least one local occupancy grid, the at least one local occupancy grid associated with a position of a device, the at least one local occupancy grid including at least one cell; a global occupancy grid manager accessing the at least one local occupancy grid, the global occupancy grid manager creating a static occupancy grid based on surface characteristics in a repository, the surface characteristics associated with the position of the device, moving a global occupancy grid associated with the position of the device to maintain the device and the at least one local occupancy grid approximately centered with respect to the global occupancy grid; adding information from the static occupancy grid to the global occupancy grid; marking an area in the global occupancy grid currently occupied by the device as unoccupied; for each of the at least one cell in each local occupancy grid, determining a location of the at least one cell in the global occupancy grid; accessing a first value at the location; determining a second value at the location based on a relationship between the first value and a cell value at the at least one cell in the local occupancy grid; comparing the second value against a pre-selected probability range; and setting the global occupancy grid with the second value if the second value is within the pre-selected probability range.
32. A method for updating a global occupancy grid comprising: if an autonomous device has moved to a new position, updating the global occupancy grid with information from a static grid associated with the new position; analyzing surfaces at the new position; if the surfaces are drivable, updating the surfaces and updating the global occupancy grid with the updated surfaces; and updating the global occupancy grid with values from a repository of static values, the static values being associated with the new position.
33. The method as in claim 32 wherein updating the surfaces comprises: accessing a local occupancy grid associated with the new position; for each cell in the local occupancy grid, accessing a local occupancy grid surface classification confidence value and a local occupancy grid surface classification; if the local occupancy grid surface classification is the same as a global surface classification in the global occupancy grid in the cell, adding a global surface classification confidence value in the global occupancy grid to the local occupancy grid surface classification confidence value to form a sum, and updating the global occupancy grid at the cell with the sum; if the local occupancy grid surface classification is not the same as the global surface classification in the global occupancy grid in the cell, subtracting the local occupancy grid surface classification confidence value from the global surface classification confidence value in the global occupancy grid to form a difference, and updating the global occupancy grid with the difference; if the difference is less than zero, updating the global occupancy grid with the local occupancy grid surface classification.
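Claim 33's per-cell rule maps almost directly onto code: agreement adds confidence, disagreement subtracts it, and a negative result adopts the new classification. In the sketch below the field names are illustrative, and carrying the magnitude over after a flip is an assumption the claim leaves open:

```python
# Hedged sketch of claim 33's surface-classification confidence update.
def update_surface(global_cell: dict, local_class: str, local_conf: float) -> dict:
    if global_cell["class"] == local_class:
        global_cell["conf"] += local_conf      # agreement: reinforce
    else:
        global_cell["conf"] -= local_conf      # disagreement: erode
        if global_cell["conf"] < 0:            # flipped: adopt the new class
            global_cell["class"] = local_class
            global_cell["conf"] = -global_cell["conf"]  # magnitude carry-over is assumed
    return global_cell

cell = {"class": "asphalt", "conf": 0.25}
print(update_surface(cell, "grass", 0.75))  # {'class': 'grass', 'conf': 0.5}
```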
34. The method as in claim 32 wherein updating the global occupancy grid with the values from the repository of static values comprises: for each cell in a local occupancy grid, accessing, from the local occupancy grid, a logodds value representing the probability that the cell is occupied; updating the logodds value in the global occupancy grid with the local occupancy grid logodds value at the cell; if a pre-selected certainty that the cell is not occupied is met, and if the autonomous device is traveling within lane barriers, and if a local occupancy grid surface classification indicates a drivable surface, decreasing the logodds that the cell is occupied in the local occupancy grid; if the autonomous device expects to encounter relatively uniform surfaces, and if the local occupancy grid surface classification indicates a relatively non-uniform surface, increasing the logodds in the local occupancy grid; and if the autonomous device expects to encounter relatively uniform surfaces, and if the local occupancy grid surface classification indicates a relatively uniform surface, decreasing the logodds in the local occupancy grid.
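The context-dependent adjustments of claim 34 amount to biasing the occupancy log-odds by where the device is and what surface it expects. A sketch with assumed bias magnitude and certainty threshold:

```python
# Sketch of claim 34's context-dependent log-odds biasing; the bias
# magnitude and certainty threshold are assumed values.
def bias_logodds(logodds: float, surface_class: str, in_lane: bool,
                 expect_uniform: bool, free_certainty: float,
                 bias: float = 0.4, certain: float = 0.9) -> float:
    if free_certainty > certain and in_lane and surface_class == "drivable":
        logodds -= bias        # confident free space on a drivable lane
    if expect_uniform:
        if surface_class == "non_uniform":
            logodds += bias    # unexpected roughness looks like an obstacle
        else:
            logodds -= bias    # uniform surface, as expected
    return logodds

print(bias_logodds(0.0, "drivable", True, False, 0.95))  # -0.4
```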
35. A method for real-time control of a configuration of a device, the device including a chassis, at least four wheels, a first side of the chassis operably coupled with at least one of the at least four wheels, and an opposing second side of the chassis operably coupled with at least one of the at least four wheels, the method comprising: creating a map based at least on prior surface features and an occupancy grid, the map being created in non-real time, the map including at least one location, the at least one location associated with at least one surface feature, the at least one surface feature being associated with at least one surface classification and at least one mode; determining current surface features as the device travels; updating the occupancy grid in real-time with the current surface features; and determining, from the occupancy grid and the map, a path the device can travel to traverse the at least one surface feature.
36. A method for real-time control of a configuration of a device, the device including a chassis, at least four wheels, a first side of the chassis operably coupled with at least one of the at least four wheels, and an opposing second side of the chassis operably coupled with at least one of the at least four wheels, the method comprising: receiving environmental data; determining a surface type based at least on the environmental data; determining a mode based at least on the surface type and a first configuration; determining a second configuration based at least on the mode and the surface type; determining movement commands based at least on the second configuration; and controlling the configuration of the device by using the movement commands to change the device from the first configuration to the second configuration.
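Claim 36 describes a pipeline from environmental data to a configuration change. The sketch below shows that flow as a single control step; the surface and mode tables and the helper behaviors are invented for illustration:

```python
# Sketch of the claim-36 flow: environment data -> surface type -> mode
# -> target configuration -> movement commands. All table entries and
# helper behaviors are assumptions.
SURFACE_TO_MODE = {"sidewalk": "standard", "grass": "four_wheel",
                   "curb": "enhanced", "stairs": "enhanced"}
MODE_TO_CONFIG = {"standard": "two_wheel_casters", "four_wheel": "four_wheel",
                  "enhanced": "clusters_rotated"}

def classify_surface(env_data: dict) -> str:
    # Placeholder: a real system would classify RGB-D imagery (claim 37).
    return env_data.get("surface", "sidewalk")

def commands_for_transition(src: str, dst: str) -> list:
    # Placeholder movement commands for the powerbase processor.
    return [f"reconfigure:{src}->{dst}"]

def control_step(env_data: dict, current_config: str) -> list:
    surface = classify_surface(env_data)             # determine surface type
    mode = SURFACE_TO_MODE.get(surface, "standard")  # determine mode
    target = MODE_TO_CONFIG[mode]                    # determine second configuration
    if target != current_config:
        return commands_for_transition(current_config, target)
    return []                                        # already in the right configuration

print(control_step({"surface": "curb"}, "two_wheel_casters"))
```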
37. The method as in claim 36 wherein the environmental data comprises RGB-D image data.
38. The method as in claim 36 further comprising: populating an occupancy grid based at least on the surface type and the mode; and determining the movement commands based at least on the occupancy grid.
39. The method as in claim 38 wherein the occupancy grid comprises information based at least on data from at least one image sensor.
40. The method as in claim 36 wherein the environmental data comprises a topology of a road surface.
41. The method as in claim 36 wherein the configuration comprises two pairs of clustered wheels of the at least four wheels, a first pair of the two pairs being positioned on the first side, a second pair of the two pairs being positioned on the second side, the first pair including a first front wheel and a first rear wheel, and the second pair including a second front wheel and a second rear wheel.
42. The method as in claim 41 wherein the controlling of the configuration comprises: coordinated powering of the first pair and the second pair based at least on the environmental data.
43. The method as in claim 41 wherein the controlling of the configuration comprises: transitioning from driving the at least four wheels and a pair of casters retracted, the pair of casters operably coupled to the chassis, to driving two wheels with the clustered first pair and the clustered second pair rotated to lift the first front wheel and the second front wheel, the device resting on the first rear wheel, the second rear wheel, and the pair of casters.
44. The method as in claim 41 wherein the controlling of the configuration comprises: rotating a pair of clusters operably coupled with a first two powered wheels on the first side and a second two powered wheels on the second side based at least on the environmental data.
45. The method as in claim 36 wherein the device further comprises a cargo container, the cargo container mounted on the chassis, the chassis controlling a height of the cargo container.
46. The method as in claim 45 wherein the height of the cargo container is based at least on the environmental data.
47. A system for real-time control of a configuration of a device, the device including a chassis, at least four wheels, a first side of the chassis, and an opposing second side of the chassis, the system comprising: a device processor receiving real-time environmental data surrounding the device, the device processor determining a surface type based at least on the environmental data, the device processor determining a mode based at least on the surface type and a first configuration, the device processor determining a second configuration based at least on the mode and the surface type; and a powerbase processor determining movement commands based at least on the second configuration, the powerbase processor controlling the configuration of the device by using the movement commands to change the device from the first configuration to the second configuration.
48. The system as in claim 47 wherein the environmental data comprises RGB-D image data.
49. The system as in claim 47 wherein the device processor populates an occupancy grid based at least on the surface type and the mode.
50. The system as in claim 49 wherein the powerbase processor determines the movement commands based at least on the occupancy grid.
51. The system as in claim 49 wherein the occupancy grid comprises information based at least on data from at least one image sensor.
52. The system as in claim 47 wherein the environmental data comprises a topology of a road surface.
53. The system as in claim 47 wherein the configuration comprises two pairs of clustered wheels of the at least four wheels, a first pair of the two pairs being positioned on the first side, a second pair of the two pairs being positioned on the second side, the first pair having a first front wheel and a first rear wheel, and the second pair having a second front wheel and a second rear wheel.
54. The system as in claim 53 wherein the controlling of the configuration comprises: coordinated powering of the first pair and the second pair based at least on the environmental data.
55. The system as in claim 53 wherein the controlling of the configuration comprises: transitioning from driving the at least four wheels and a pair of casters retracted, the pair of casters operably coupled to the chassis, to driving two wheels with the clustered first pair and the clustered second pair rotated to lift the first front wheel and the second front wheel, the device resting on the first rear wheel, the second rear wheel, and the pair of casters.
56. A method for maintaining a global occupancy grid comprising: locating a first position of an autonomous device; when the autonomous device moves to a second position, the second position being associated with the global occupancy grid and a local occupancy grid, updating the global occupancy grid with at least one occupied probability value associated with the first position; updating the global occupancy grid with at least one drivable surface associated with the local occupancy grid; updating the global occupancy grid with surface confidences associated with the at least one drivable surface; updating the global occupancy grid with logodds of the at least one occupied probability value using a first Bayesian function; and adjusting the logodds based at least on characteristics associated with the second position; and when the autonomous device remains in the first position and the global occupancy grid and the local occupancy grid are co-located, updating the global occupancy grid with the at least one drivable surface associated with the local occupancy grid; updating the global occupancy grid with the surface confidences associated with the at least one drivable surface; updating the global occupancy grid with the logodds of the at least one occupied probability value using a second Bayesian function; and adjusting the logodds based at least on characteristics associated with the second position.
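Both Bayesian updates recited in claim 56 are consistent with the standard log-odds recursion for occupancy grids. The patent does not state a formula, so the following is the textbook form rather than a quotation:

```latex
\ell_t(c) = \ell_{t-1}(c) + \log\frac{p(\mathrm{occ} \mid z_t)}{1 - p(\mathrm{occ} \mid z_t)} - \ell_0,
\qquad
\ell(c) = \log\frac{p(c\ \text{occupied})}{1 - p(c\ \text{occupied})}
```

where z_t is the current sensor measurement and ℓ0 is the log-odds of the prior occupancy probability; the "adjusting the logodds" steps of the claim then add context-dependent offsets to ℓ_t(c), as in the claim-34 sketch above.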
57. The method as in claim 35 wherein creating the map comprises: accessing point cloud data representing the surface; filtering the point cloud data; forming the filtered point cloud data into processable parts; merging the processable parts into at least one concave polygon; locating and labeling the at least one substantially discontinuous surface feature (SDSF) in the at least one concave polygon, the locating and labeling forming labeled point cloud data; creating graphing polygons based at least on the at least one concave polygon; and choosing the path from a starting point to an ending point based at least on the graphing polygons, the device traversing the at least one SDSF along the path.
58. The method as in claim 57 wherein the filtering the point cloud data comprises: conditionally removing points representing transient objects and points representing outliers from the point cloud data; and replacing the removed points having a pre-selected height.
59. The method as in claim 57 wherein forming the processable parts comprises: segmenting the point cloud data into the processable parts; and removing points of a pre-selected height from the processable parts.
60. The method as in claim 57 wherein the merging the processable parts comprises: reducing a size of the processable parts by analyzing outliers, voxels, and normals; growing regions from the reduced-size processable parts; determining initial drivable surfaces from the grown regions; segmenting and meshing the initial drivable surfaces; locating polygons within the segmented and meshed initial drivable surfaces; and setting at least one drivable surface based at least on the polygons.
61. The method as in claim 60 wherein the locating and labeling the at least one SDSF comprises: sorting the point cloud data of the initial drivable surfaces according to a SDSF filter, the SDSF filter including at least three categories of points; and locating at least one SDSF point based at least on whether the at least three categories of points, in combination, meet at least one first pre-selected criterion.
62. The method as in claim 61 further comprising: creating at least one SDSF trajectory based at least on whether a plurality of the at least one SDSF point, in combination, meet at least one second pre-selected criterion.
63. The method as in claim 62 wherein the creating graphing polygons further comprises: creating at least one polygon from the at least one drivable surface, the at least one polygon including exterior edges; smoothing the exterior edges; forming a driving margin based on the smoothed exterior edges; adding the at least one SDSF trajectory to the at least one drivable surface; and removing interior edges from the at least one drivable surface according to at least one third pre-selected criterion.
64. The method as in claim 63 wherein the smoothing of the exterior edges comprises: trimming the exterior edges outward forming outward edges.
65. The method as in claim 63 wherein forming the driving margin of the smoothed exterior edges comprises: trimming the outward edges inward.
66. An autonomous delivery vehicle comprising: a power base including two powered front wheels, two powered back wheels and energy storage, the power base configured to move at a commanded velocity; a cargo platform including a plurality of short-range sensors, the cargo platform mechanically attached to the power base; a cargo container with a volume for receiving one or more objects to deliver, the cargo container mounted on top of the cargo platform; a long-range sensor suite comprising LIDAR and one or more cameras, the long-range sensor suite mounted on top of the cargo container; and a controller to receive data from the long-range sensor suite and the plurality of short-range sensors.
67. The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors detect at least one characteristic of a drivable surface.
68. The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors are stereo cameras.
69. The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors comprise an IR projector, two image sensors and an RGB sensor.
70. The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors are radar sensors.
71. The autonomous delivery vehicle of claim 66 wherein the short-range sensors supply RGB-D data to the controller.
72. The autonomous delivery vehicle of claim 66 wherein the controller determines a geometry of a road surface based on RGB-D data received from the plurality of short-range sensors.
73. The autonomous delivery vehicle of claim 66 wherein the plurality of short-range sensors detect objects within 4 meters of the autonomous delivery vehicle and the long-range sensor suite detects objects more than 4 meters from the autonomous delivery vehicle.
74. An autonomous delivery vehicle comprising: a power base including at least two powered back wheels, caster front wheels and energy storage, the power base configured to move at a commanded velocity; a cargo platform including a plurality of short-range sensors, the cargo platform mechanically attached to the power base; a cargo container with a volume for receiving one or more objects to deliver, the cargo container mounted on top of the cargo platform; a long-range sensor suite comprising LIDAR and one or more cameras, the long-range sensor suite mounted on top of the cargo container; and a controller to receive data from the long-range sensor suite and the plurality of short-range sensors.
75. The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors detect at least one characteristic of a drivable surface.
76. The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors are stereo cameras.
77. The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors comprise an IR projector, two image sensors and an RGB sensor.
78. The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors are radar sensors.
79. The autonomous delivery vehicle of claim 74 wherein the short-range sensors supply RGB-D data to the controller.
80. The autonomous delivery vehicle of claim 74 wherein the controller determines a geometry of a road surface based on RGB-D data received from the plurality of short-range sensors.
81. The autonomous delivery vehicle of claim 74 wherein the plurality of short-range sensors detect objects within 4 meters of the autonomous delivery vehicle and the long-range sensor suite detects objects more than 4 meters from the autonomous delivery vehicle.
82. The autonomous delivery vehicle of claim 74, further comprising a second set of powered wheels that may engage the ground, while the caster wheels are lifted off the ground.
83. An autonomous delivery vehicle comprising: a power base including at least two powered back wheels, caster front wheels and energy storage, the power base configured to move at a commanded velocity; a cargo platform, the cargo platform mechanically attached to the power base; and a short-range camera assembly mounted to the cargo platform that detects at least one characteristic of a drivable surface, the short-range camera assembly comprising: a camera; a first light; and a first liquid-cooled heat sink, wherein the first liquid-cooled heat sink cools the first light and the camera.
84. The autonomous delivery vehicle according to claim 83, wherein the short-range camera assembly further comprises a thermoelectric cooler between the camera and the first liquid-cooled heat sink.
85. The autonomous delivery vehicle according to claim 83, wherein the first light and the camera are recessed in a cover with openings that deflect illumination from the first light away from the camera.
86. The autonomous delivery vehicle according to claim 83, wherein the lights are angled downward by at least 15° and recessed at least 4 mm in a cover to minimize illumination distracting a pedestrian.
87. The autonomous delivery vehicle according to claim 83, wherein the camera has a field of view and the first light comprises two LEDs with lenses to produce two beams of light that spread to illuminate the field of view of the camera.
88. The autonomous delivery vehicle according to claim 87, wherein the lights are angled approximately 50° apart and the lenses produce a 60° beam.
89. The autonomous delivery vehicle according to claim 83, wherein the short-range camera assembly includes an ultrasonic sensor mounted above the camera.
90. The autonomous delivery vehicle according to claim 83, wherein the short-range camera assembly is mounted in a center position on a front face of the cargo platform.
91. The autonomous delivery vehicle according to claim 83, further comprising at least one corner camera assembly mounted on at least one corner of a front face of the cargo platform, the at least one corner camera assembly comprising: an ultrasonic sensor; a corner camera; a second light; and a second liquid-cooled heat sink, wherein the second liquid-cooled heat sink cools the second light and the corner camera.
92. The method as in claim 22 wherein the historical data comprises surface data.
93. The method as in claim 22 wherein the historical data comprises discontinuity data.
GB2201700.8A 2019-07-10 2020-07-10 System and method for real time control of an autonomous device Withdrawn GB2600638A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962872396P 2019-07-10 2019-07-10
US201962872320P 2019-07-10 2019-07-10
US202062990485P 2020-03-17 2020-03-17
PCT/US2020/041711 WO2021007561A1 (en) 2019-07-10 2020-07-10 System and method for real time control of an autonomous device

Publications (3)

Publication Number Publication Date
GB202201700D0 (en) 2022-03-30
GB2600638A (en) 2022-05-04
GB2600638A8 (en) 2022-11-23

Family

ID=71944349

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2201700.8A Withdrawn GB2600638A (en) 2019-07-10 2020-07-10 System and method for real time control of an autonomous device

Country Status (11)

Country Link
EP (1) EP3997484A1 (en)
JP (2) JP2022540640A (en)
KR (1) KR20220034843A (en)
CN (1) CN114270140A (en)
AU (1) AU2020310932A1 (en)
BR (1) BR112022000356A2 (en)
CA (1) CA3146648A1 (en)
GB (1) GB2600638A (en)
IL (1) IL289689A (en)
MX (1) MX2022000458A (en)
WO (1) WO2021007561A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11988749B2 (en) 2021-08-19 2024-05-21 Argo AI, LLC System and method for hybrid LiDAR segmentation with outlier detection
DE102023001762A1 (en) 2022-05-24 2023-11-30 Sew-Eurodrive Gmbh & Co Kg Mobile system
DE102023202244A1 (en) 2023-03-13 2024-09-19 Robert Bosch Gesellschaft mit beschränkter Haftung Method for three-dimensional road area segmentation for a vehicle
JP7551091B1 (en) 2024-05-24 2024-09-17 株式会社トクイテン Autonomous driving system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU774742B2 (en) 1999-03-15 2004-07-08 Deka Products Limited Partnership Control system and method for wheelchair

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
B.-H. Schäfer, C. Armbrust, T. Föhst, K. Berns, "The Application of Design Schemata in Off-Road Robotics", IEEE Intelligent Transportation Systems Magazine, IEEE, USA, vol. 5, no. 1, 1 January 2013, pages 4-27, XP055733621, ISSN: 1939-1390, DOI: 10.1109/MITS.2012.2217591 *

Also Published As

Publication number Publication date
GB2600638A8 (en) 2022-11-23
AU2020310932A1 (en) 2022-01-27
WO2021007561A1 (en) 2021-01-14
KR20220034843A (en) 2022-03-18
IL289689A (en) 2022-03-01
EP3997484A1 (en) 2022-05-18
JP2023112104A (en) 2023-08-10
BR112022000356A2 (en) 2022-05-10
GB202201700D0 (en) 2022-03-30
CA3146648A1 (en) 2021-01-14
MX2022000458A (en) 2022-04-25
JP2022540640A (en) 2022-09-16
CN114270140A (en) 2022-04-01

Similar Documents

Publication Publication Date Title
GB2600638A (en) System and method for real time control of an autonomous device
US20240247945A1 (en) System and method for real time control of an autonomous device
US20210362708A1 (en) Polyline contour representations for autonomous vehicles
US11842430B2 (en) Methods and systems for ground segmentation using graph-cuts
Cherubini et al. A new tentacles-based technique for avoiding obstacles during visual navigation
JP7497365B2 (en) Systems and methods for surface feature detection and traversal - Patents.com
JP2023112104A5 (en)
Yamauchi Autonomous urban reconnaissance using man-portable UGVs
Beinschob et al. Advances in 3d data acquisition, mapping and localization in modern large-scale warehouses
US20230056589A1 (en) Systems and methods for generating multilevel occupancy and occlusion grids for controlling navigation of vehicles
WO2022115215A1 (en) Systems and methods for monocular based object detection
Bansal et al. A lidar streaming architecture for mobile robotics with application to 3d structure characterization
Yan et al. RH-Map: Online Map Construction Framework of Dynamic Object Removal Based on 3D Region-wise Hash Map Structure
CN116048120B (en) Autonomous navigation system and method for small four-rotor unmanned aerial vehicle in unknown dynamic environment
JPWO2021007561A5 (en)
KR102249485B1 (en) System and method for autonomously traveling mobile robot
CN117545674A (en) Technique for identifying curbs
CN113671511A (en) Laser radar high-precision positioning method for regional scene
Carballo et al. High density ground maps using low boundary height estimation for autonomous vehicles
US20240192371A1 (en) Detection of a Change in a Drivable Area
Huang et al. Obstacle recognition in front of vehicle based on geometry information and corrected laser intensity
Lebakula 3D shape estimation of negative obstacles using LiDAR point cloud data
JP2024022888A (en) Map automatic generation device and transport vehicle system
Hu et al. A Slope-Adaptive Navigation Approach for Ground Mobile Robots
Omi Accuracy Assessment of a Hesai LiDAR for Semi-Autonomous Vehicle

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)