US20220379820A1 - Automated moving platform
- Publication number: US20220379820A1
- Application number: US 17/330,112
- Authority: US (United States)
- Prior art keywords: platform, self-propelled, sensor, wheels
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0088—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0248—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/3415—Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60G—VEHICLE SUSPENSION ARRANGEMENTS
- B60G13/00—Resilient suspensions characterised by arrangement, location or type of vibration dampers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K1/00—Arrangement or mounting of electrical propulsion units
- B60K1/02—Arrangement or mounting of electrical propulsion units comprising more than one electric motor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60T—VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
- B60T7/00—Brake-action initiating means
- B60T7/12—Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
- B60T7/22—Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger initiated by contact of vehicle, e.g. bumper, with an external object, e.g. another vehicle, or by means of contactless obstacle detectors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18159—Traversing an intersection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01M—TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
- G01M17/00—Testing of vehicles
- G01M17/007—Wheeled or endless-tracked vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0011—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
- G05D1/0027—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/028—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R2011/0042—Arrangements for holding or mounting articles, not otherwise provided for characterised by mounting means
- B60R2011/008—Adjustable or movable supports
- B60R2011/0084—Adjustable or movable supports with adjustment by linear movement in their operational position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R2011/0042—Arrangements for holding or mounting articles, not otherwise provided for characterised by mounting means
- B60R2011/008—Adjustable or movable supports
- B60R2011/0092—Adjustable or movable supports with motorization
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/301—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
Abstract
This disclosure describes systems and methods used in the development and validation of an autonomous vehicle's ability to track objects with its sensors. This application describes a self-propelled autonomous platform and methods for carrying a pedestrian-, cyclist-, or vehicle-type target in a predetermined pattern during one or more testing runs. The self-propelled autonomous platform includes a sensor configured to retract within a platform housing of the self-propelled autonomous platform when the platform is driven over during a test run.
Description
- This description relates to a self-propelled autonomous platform and, in some embodiments, to a self-propelled autonomous platform configured to carry a target to test the performance of at least one autonomous vehicle.
- Autonomous vehicles can be used to transport people and/or cargo (e.g., packages, objects, or other items) from one location to another. For example, an autonomous vehicle can navigate to the location of a person, wait for the person to board the autonomous vehicle, and navigate to a specified destination (e.g., a location selected by the person). To navigate in the environment, these autonomous vehicles are equipped with various types of sensors to detect objects in the surroundings.
- Anticipating the behavior of objects using the sensors of an autonomous vehicle can be difficult. The present disclosure is directed to systems, methods, and computer program products for developing and validating the ability of the autonomous vehicle to track objects with the sensors. This application describes a self-propelled autonomous platform and methods for carrying an object representing a pedestrian, cyclist, or vehicle during one or more testing runs. In some embodiments, the one or more testing runs can include carrying (e.g., moving) the object in a predetermined pattern. Carrying these targets in the predetermined pattern or patterns allows tests to be conducted that determine how accurately the sensors are able to detect and predict the behavior of objects in the environment surrounding an autonomous vehicle. Generally, the autonomous vehicle's computer system is configured to receive input from one or more sensors of the vehicle, detect one or more objects in the environment surrounding the vehicle based on the received input, and operate the vehicle based upon the predicted behavior of the objects.
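- For illustration only, the sketch below shows one way such a test run could be specified as data: a target type plus a predetermined pattern of waypoints for the platform to carry it through. The class and field names (TestScenario, Waypoint, and so on) are assumptions made for this example and are not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    """A single point on the platform's predetermined pattern."""
    x_m: float          # longitudinal position, meters
    y_m: float          # lateral position, meters
    speed_mps: float    # target speed at this point, meters/second

@dataclass
class TestScenario:
    """Hypothetical description of one testing run."""
    target_type: str            # e.g. "pedestrian", "cyclist", "vehicle"
    pattern: List[Waypoint]     # predetermined pattern carried by the platform
    repetitions: int = 1        # number of times the run is repeated

# Example: a pedestrian target carried across a crosswalk at walking speed.
crosswalk_crossing = TestScenario(
    target_type="pedestrian",
    pattern=[Waypoint(0.0, 0.0, 0.0),
             Waypoint(0.0, 1.5, 1.4),
             Waypoint(0.0, 4.5, 1.4),
             Waypoint(0.0, 6.0, 0.0)],
    repetitions=5,
)
```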
- In some embodiments, a self-propelled platform includes a plurality of wheels; a motor configured to drive at least one of the plurality of wheels; a platform housing comprising a support surface configured to carry at least one target and a sloped periphery configured to accommodate passage of an autonomous vehicle over the platform housing; and a suspension comprising a plurality of springs coupling the plurality of wheels to the platform housing, the plurality of springs configured to transition the platform from a first state to a second state in response to a threshold amount of weight being applied to the platform housing, wherein the platform housing is lower in the second state than it is in the first state.
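- As a minimal sketch of the two-state suspension behavior described above, the following function maps the weight applied to the platform housing to a raised or lowered state. The numeric threshold is an arbitrary assumed value, not one specified in the disclosure.

```python
def platform_state(applied_weight_n: float, threshold_n: float = 3000.0) -> str:
    """Return "first" (raised) or "second" (lowered) housing state.

    The springs compress once the weight applied to the housing exceeds the
    threshold, lowering the platform so a vehicle can pass over it.
    """
    return "second" if applied_weight_n >= threshold_n else "first"
```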
- In some embodiments, provided is a self-propelled platform, including a sensor; at least one processing circuit; a wireless communication module; and at least one non-transitory storage media storing instructions which, when executed by the at least one processing circuit, cause performance of operations including: following a first movement route in accordance with a user input; recording a plurality of positions of the self-propelled platform based on data collected by the sensor while following the first movement route; and following a second movement route based on the plurality of positions in response to an autonomous vehicle arriving at a predetermined position.
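- A rough sketch of the record-and-replay operations recited above is shown below. The Platform interface (user_is_driving, sensor.position, av_at_trigger_position, drive_toward) is hypothetical and stands in for whatever drive and sensing hardware the platform actually exposes; it is not an API defined by this disclosure.

```python
import time
from typing import List, Tuple

Position = Tuple[float, float]  # (x, y) in meters, from the onboard sensor

class RoutePlayer:
    """Sketch of the record-then-replay behavior described above."""

    def __init__(self, platform):
        self.platform = platform          # hypothetical platform interface
        self.recorded_route: List[Position] = []

    def record_first_route(self, sample_period_s: float = 0.1) -> None:
        """Record positions while a user manually drives the platform."""
        while self.platform.user_is_driving():
            self.recorded_route.append(self.platform.sensor.position())
            time.sleep(sample_period_s)

    def replay_when_av_arrives(self) -> None:
        """Follow the recorded route once the AV reaches a predetermined position."""
        while not self.platform.av_at_trigger_position():
            time.sleep(0.01)              # wait for e.g. a laser-gate or radio trigger
        for waypoint in self.recorded_route:
            self.platform.drive_toward(waypoint)
```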
- These and other aspects, features, and implementations can be expressed as methods, apparatuses, systems, components, program products, means or steps for performing a function, and in other ways.
- These and other aspects, features, and implementations will become apparent from the following descriptions, including the claims.
- FIG. 1 shows an example of an autonomous vehicle having autonomous capability.
- FIG. 2 illustrates an example "cloud" computing environment.
- FIG. 3 illustrates a computer system.
- FIG. 4 shows an example architecture for an autonomous vehicle.
- FIG. 5 shows an example of inputs and outputs that may be used by a perception module.
- FIG. 6 shows an example of a LiDAR system.
- FIG. 7 shows the LiDAR system in operation.
- FIG. 8 shows the operation of the LiDAR system in additional detail.
- FIG. 9 shows a block diagram of the inputs and outputs of a control module.
- FIG. 10 shows a block diagram of the inputs, outputs, and components of a controller.
- FIGS. 11A-11C show various views of a self-propelled autonomous platform useful for testing the autonomous navigation system of an autonomous vehicle.
- FIG. 12 shows a rear-facing perspective view of the platform depicted in FIGS. 11A-11C.
- FIG. 13 shows a perspective view of a downward-facing surface of the platform depicted in FIGS. 11A-12.
- FIGS. 14A-14B show detailed views of a sensor retraction mechanism in the first and second states, respectively.
- FIGS. 15A-15C show an alternative platform suspension to the suspension depicted in FIGS. 11A-14B.
- FIG. 16 shows a perspective view of a bottom surface of a platform, which incorporates the alternative suspension depicted in FIG. 15.
- FIG. 17A shows an exemplary testing setup in which an autonomous platform with a pedestrian target mounted atop it is positioned at the entrance to a cross-walk.
- FIG. 17B shows a perspective view of another testing setup in which a laser transmitter and a laser receiver are employed to determine when an autonomous vehicle has reached a predetermined location.
- FIGS. 18A-18B show top views of an exemplary intersection testing setup for an autonomous vehicle with multiple autonomous platforms configured to execute different movement patterns.
- FIG. 19 is a flow chart of an example process for causing a self-propelled platform to follow a first movement route in accordance with a user input.
- In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, that the present disclosure may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
- In the drawings, specific arrangements or orderings of schematic elements, such as those representing devices, modules, instruction blocks and data elements, are shown for ease of description. However, it should be understood by those skilled in the art that the specific ordering or arrangement of the schematic elements in the drawings is not meant to imply that a particular order or sequence of processing, or separation of processes, is required. Further, the inclusion of a schematic element in a drawing is not meant to imply that such element is required in all embodiments or that the features represented by such element may not be included in or combined with other elements in some embodiments.
- Further, in the drawings, where connecting elements, such as solid or dashed lines or arrows, are used to illustrate a connection, relationship, or association between or among two or more other schematic elements, the absence of any such connecting elements is not meant to imply that no connection, relationship, or association can exist. In other words, some connections, relationships, or associations between elements are not shown in the drawings so as not to obscure the disclosure. In addition, for ease of illustration, a single connecting element is used to represent multiple connections, relationships, or associations between elements. For example, where a connecting element represents a communication of signals, data, or instructions, it should be understood by those skilled in the art that such element represents one or multiple signal paths (e.g., a bus), as may be needed, to effect the communication.
- Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
- Several features are described hereafter that can each be used independently of one another or with any combination of other features. However, any individual feature may not address any of the problems discussed above or might only address one of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein. Although headings are provided, information related to a particular heading, but not found in the section having that heading, may also be found elsewhere in this description. Embodiments are described herein according to the following outline:
-
- 1. General Overview
- 2. Hardware Overview
- 3. Autonomous Vehicle Architecture
- 4. Autonomous Vehicle Inputs
- 5. Autonomous Vehicle Planning
- 6. Autonomous Vehicle Control
- 7. Computing System for Object Detection Using Pillars
- 8. Example Point Clouds and Pillars
- 9. Example Process for Detecting Objects and Operating the Vehicle Based on the Detection of the Objects
- Autonomous vehicles driving in complex environments (e.g., an urban environment) pose a great technological challenge. In order for autonomous vehicles to navigate these environments, the vehicles detect various types of objects such as vehicles, pedestrians, and bikes in real-time using sensors such as LIDAR, optical imagery and/or RADAR. While these sensors are able to identify and track objects, predicting the behavior of the objects can be challenging and treating the tracked objects too conservatively can result in autonomous vehicles being unable to function. The disclosed embodiments include a low-profile self-propelled autonomous platform capable of carrying and maneuvering pedestrian, cyclist and/or vehicular type targets while testing one or more autonomous vehicles.
- In particular, the system and techniques described herein enhance the ability of testers to comprehensively validate the ability of an autonomous navigation system of an autonomous vehicle to safely navigate a road or intersection. The described platform includes a retractable sensor that allows the platform to track nearby objects and the autonomous vehicle during one or more test runs. The retractable nature of the sensor allows it to be positioned in a location with good visibility with minimal risk of damage being done to the sensor in the event the platform comes into contact with the autonomous vehicle. The onboard sensor also allows the platform to maneuver relative to dynamic objects that may not always follow the same path. In this way, the platform has more flexibility and does not need to be reprogrammed or redirected every time a scenario is adjusted.
- FIG. 1 shows an example of an autonomous vehicle 100 having autonomous capability.
- As used herein, the term "autonomous capability" refers to a function, feature, or facility that enables a vehicle to be partially or fully operated without real-time human intervention, including without limitation fully autonomous vehicles, highly autonomous vehicles, and conditionally autonomous vehicles.
- As used herein, an autonomous vehicle (AV) is a vehicle that possesses autonomous capability.
- As used herein, “vehicle” includes means of transportation of goods or people. For example, cars, buses, trains, airplanes, drones, trucks, boats, ships, submersibles, dirigibles, etc. A driverless car is an example of a vehicle.
- As used herein, “trajectory” refers to a path or route to navigate an AV from a first spatiotemporal location to a second spatiotemporal location. In an embodiment, the first spatiotemporal location is referred to as the initial or starting location and the second spatiotemporal location is referred to as the destination, final location, goal, goal position, or goal location. In some examples, a trajectory is made up of one or more segments (e.g., sections of road) and each segment is made up of one or more blocks (e.g., portions of a lane or intersection). In an embodiment, the spatiotemporal locations correspond to real world locations. For example, the spatiotemporal locations are pick up or drop-off locations to pick up or drop-off persons or goods.
- As used herein, “sensor(s)” includes one or more hardware components that detect information about the environment surrounding the sensor. Some of the hardware components can include sensing components (e.g., image sensors, biometric sensors), transmitting and/or receiving components (e.g., laser or radio frequency wave transmitters and receivers), electronic components such as analog-to-digital converters, a data storage device (such as a RAM and/or a nonvolatile storage), software or firmware components and data processing components such as an ASIC (application-specific integrated circuit), a microprocessor and/or a microcontroller.
- As used herein, a “scene description” is a data structure (e.g., list) or data stream that includes one or more classified or labeled objects detected by one or more sensors on the AV vehicle or provided by a source external to the AV.
- As used herein, a “road” is a physical area that can be traversed by a vehicle, and may correspond to a named thoroughfare (e.g., city street, interstate freeway, etc.) or may correspond to an unnamed thoroughfare (e.g., a driveway in a house or office building, a section of a parking lot, a section of a vacant lot, a dirt path in a rural area, etc.). Because some vehicles (e.g., 4-wheel-drive pickup trucks, sport utility vehicles, etc.) are capable of traversing a variety of physical areas not specifically adapted for vehicle travel, a “road” may be a physical area not formally defined as a thoroughfare by any municipality or other governmental or administrative body.
- As used herein, a “lane” is a portion of a road that can be traversed by a vehicle, and may correspond to most or all of the space between lane markings, or may correspond to only some (e.g., less than 50%) of the space between lane markings. For example, a road having lane markings spaced far apart might accommodate two or more vehicles between the markings, such that one vehicle can pass the other without traversing the lane markings, and thus could be interpreted as having a lane narrower than the space between the lane markings, or having two lanes between the lane markings. A lane could also be interpreted in the absence of lane markings. For example, a lane may be defined based on physical features of an environment, e.g., rocks and trees along a thoroughfare in a rural area.
- “One or more” includes a function being performed by one element, a function being performed by more than one element, e.g., in a distributed fashion, several functions being performed by one element, several functions being performed by several elements, or any combination of the above.
- It will also be understood that, although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact, unless specified otherwise.
- The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this description, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
- As used herein, an AV system refers to the AV along with the array of hardware, software, stored data, and data generated in real-time that supports the operation of the AV. In an embodiment, the AV system is incorporated within the AV. In an embodiment, the AV system is spread across several locations. For example, some of the software of the AV system is implemented on a cloud computing environment similar to cloud computing environment 200 described below with respect to FIG. 2.
- In general, this document describes technologies applicable to any vehicles that have one or more autonomous capabilities including fully autonomous vehicles, highly autonomous vehicles, and conditionally autonomous vehicles, such as so-called Level 5, Level 4, and Level 3 vehicles, respectively (see SAE International's standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, which is incorporated by reference in its entirety, for more details on the classification of levels of autonomy in vehicles). The technologies described in this document are also applicable to partially autonomous vehicles and driver assisted vehicles, such as so-called Level 2 and Level 1 vehicles (see SAE International's standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems). In an embodiment, one or more of the Level 1, 2, 3, 4 and 5 vehicle systems may automate certain vehicle operations (e.g., steering, braking, and using maps) under certain operating conditions based on processing of sensor inputs. The technologies described in this document can benefit vehicles at any level, ranging from fully autonomous vehicles to human-operated vehicles.
- Referring to FIG. 1, an AV system 120 operates the AV 100 along a trajectory 198 through an environment 190 to a destination 199 (sometimes referred to as a final location) while avoiding objects (e.g., natural obstructions 191, vehicles 193, pedestrians 192, cyclists, and other obstacles) and obeying rules of the road (e.g., rules of operation or driving preferences).
AV system 120 includesdevices 101 that are instrumented to receive and act on operational commands from thecomputer processors 146. In an embodiment, computingprocessors 146 are similar to theprocessor 304 described below in reference toFIG. 3 . Examples ofdevices 101 include asteering control 102,brakes 103, gears, accelerator pedal or other acceleration control mechanisms, windshield wipers, side-door locks, window controls, and turn-indicators. - In an embodiment, the
AV system 120 includessensors 121 for measuring or inferring properties of state or condition of theAV 100, such as the AV's position, linear and angular velocity and acceleration, and heading (e.g., an orientation of the leading end of AV 100). Example ofsensors 121 are GPS, inertial measurement units (IMU) that measure both vehicle linear accelerations and angular rates, wheel speed sensors for measuring or estimating wheel slip ratios, wheel brake pressure or braking torque sensors, engine torque or wheel torque sensors, and steering angle and angular rate sensors. - In an embodiment, the
sensors 121 also include sensors for sensing or measuring properties of the AV's environment. For example, monocular orstereo video cameras 122 in the visible light, infrared or thermal (or both) spectra,LiDAR 123, RADAR, ultrasonic sensors, time-of-flight (TOF) depth sensors, speed sensors, temperature sensors, humidity sensors, and precipitation sensors. - In an embodiment, the
AV system 120 includes a data storage unit 142 andmemory 144 for storing machine instructions associated withcomputer processors 146 or data collected bysensors 121. In an embodiment, the data storage unit 142 is similar to theROM 308 orstorage device 310 described below in relation toFIG. 3 . In an embodiment,memory 144 is similar to themain memory 306 described below. In an embodiment, the data storage unit 142 andmemory 144 store historical, real-time, and/or predictive information about theenvironment 190. In an embodiment, the stored information includes maps, driving performance, traffic congestion updates or weather conditions. In an embodiment, data relating to theenvironment 190 is transmitted to theAV 100 via a communications channel from a remotely locateddatabase 134. - In an embodiment, the
AV system 120 includescommunications devices 140 for communicating measured or inferred properties of other vehicles' states and conditions, such as positions, linear and angular velocities, linear and angular accelerations, and linear and angular headings to theAV 100. These devices include Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication devices and devices for wireless communications over point-to-point or ad hoc networks or both. In an embodiment, thecommunications devices 140 communicate across the electromagnetic spectrum (including radio and optical communications) or other media (e.g., air and acoustic media). A combination of Vehicle-to-Vehicle (V2V) Vehicle-to-Infrastructure (V2I) communication (and, in some embodiments, one or more other types of communication) is sometimes referred to as Vehicle-to-Everything (V2X) communication. V2X communication typically conforms to one or more communications standards for communication with, between, and among autonomous vehicles. - In an embodiment, the
communication devices 140 include communication interfaces. For example, wired, wireless, WiMAX, WiFi, Bluetooth, satellite, cellular, optical, near field, infrared, or radio interfaces. The communication interfaces transmit data from a remotely locateddatabase 134 toAV system 120. In an embodiment, the remotely locateddatabase 134 is embedded in a cloud computing environment 200 as described inFIG. 2 . The communication interfaces 140 transmit data collected fromsensors 121 or other data related to the operation ofAV 100 to the remotely locateddatabase 134. In an embodiment, communication interfaces 140 transmit information that relates to teleoperations to theAV 100. In some embodiments, theAV 100 communicates with other remote (e.g., “cloud”)servers 136. - In an embodiment, the remotely located
database 134 also stores and transmits digital data (e.g., storing data such as road and street locations). Such data is stored on thememory 144 on theAV 100, or transmitted to theAV 100 via a communications channel from the remotely locateddatabase 134. - In an embodiment, the remotely located
database 134 stores and transmits historical information about driving properties (e.g., speed and acceleration profiles) of vehicles that have previously traveled alongtrajectory 198 at similar times of day. In one implementation, such data may be stored on thememory 144 on theAV 100, or transmitted to theAV 100 via a communications channel from the remotely locateddatabase 134. -
Computing devices 146 located on theAV 100 algorithmically generate control actions based on both real-time sensor data and prior information, allowing theAV system 120 to execute its autonomous driving capabilities. - In an embodiment, the
AV system 120 includescomputer peripherals 132 coupled to computingdevices 146 for providing information and alerts to, and receiving input from, a user (e.g., an occupant or a remote user) of theAV 100. In an embodiment,peripherals 132 are similar to thedisplay 312,input device 314, andcursor controller 316 discussed below in reference toFIG. 3 . The coupling is wireless or wired. Any two or more of the interface devices may be integrated into a single device. - In an embodiment,
AV system 120 can be incorporated into an autonomous platform configured to carry a target suitable for testing performance of sensors ofAV 100 and is described and depicted in greater detail below inFIGS. 11A-16 . The autonomous platform can be configured withsteering controller 102,brakes 103,communication devices 140 and one or more processors for receiving and processing instructions in the form of computer code stored on local or remote computer storage. The autonomous platform can also include one ormore sensors 121 that can include a LiDAR sensor, a video camera, a GPS receiver and the like. In some embodiments, one or more ofsensors 121 can be used by the autonomous platform to refine a desired position or path taken by the autonomous platform with respect to its environment and or theAV 100 it is being used to test. -
- FIG. 2 illustrates an example "cloud" computing environment. Cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services). In typical cloud computing systems, one or more large cloud data centers house the machines used to deliver the services provided by the cloud. Referring now to FIG. 2, the cloud computing environment 200 includes cloud data centers 204 a, 204 b, and 204 c that are interconnected through the cloud 202. The data centers provide cloud computing services to the computer systems 206 a-f connected to the cloud 202.
cloud data center 204 a shown inFIG. 2 , refers to the physical arrangement of servers that make up a cloud, for example thecloud 202 shown inFIG. 2 , or a particular portion of a cloud. For example, servers are physically arranged in the cloud datacenter into rooms, groups, rows, and racks. A cloud datacenter has one or more zones, which include one or more rooms of servers. Each room has one or more rows of servers, and each row includes one or more racks. Each rack includes one or more individual server nodes. In some implementation, servers in zones, rooms, racks, and/or rows are arranged into groups based on physical infrastructure requirements of the datacenter facility, which include power, energy, thermal, heat, and/or other requirements. In an embodiment, the server nodes are similar to the computer system described inFIG. 3 . Thedata center 204 a has many computing systems distributed through many racks. - The
cloud 202 includescloud data centers cloud data centers - The computing systems 206 a-f or cloud computing services consumers are connected to the
cloud 202 through network links and network adapters. In an embodiment, the computing systems 206 a-f are implemented as various computing devices, for example servers, desktops, laptops, tablet, smartphones, Internet of Things (IoT) devices, autonomous vehicles (including, cars, drones, shuttles, trains, buses, etc.) and consumer electronics. In an embodiment, the computing systems 206 a-f are implemented in or as a part of other systems. -
FIG. 3 illustrates acomputer system 300. In an implementation, thecomputer system 300 is a special purpose computing device. The special-purpose computing device is hard-wired to perform the techniques or includes digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. In various embodiments, the special-purpose computing devices are desktop computer systems, portable computer systems, handheld devices, network devices or any other device that incorporates hard-wired and/or program logic to implement the techniques. - In an embodiment, the
computer system 300 includes a bus 302 or other communication mechanism for communicating information, and ahardware processor 304 coupled with a bus 302 for processing information. Thehardware processor 304 is, for example, a general-purpose microprocessor. Thecomputer system 300 also includes amain memory 306, such as a random-access memory (RAM) or other dynamic storage device, coupled to the bus 302 for storing information and instructions to be executed byprocessor 304. In one implementation, themain memory 306 is used for storing temporary variables or other intermediate information during execution of instructions to be executed by theprocessor 304. Such instructions, when stored in non-transitory storage media accessible to theprocessor 304, render thecomputer system 300 into a special-purpose machine that is customized to perform the operations specified in the instructions. - In an embodiment, the
computer system 300 further includes a read only memory (ROM) 308 or other static storage device coupled to the bus 302 for storing static information and instructions for theprocessor 304. Astorage device 310, such as a magnetic disk, optical disk, solid-state drive, or three-dimensional cross point memory is provided and coupled to the bus 302 for storing information and instructions. - In an embodiment, the
computer system 300 is coupled via the bus 302 to adisplay 312, such as a cathode ray tube (CRT), a liquid crystal display (LCD), plasma display, light emitting diode (LED) display, or an organic light emitting diode (OLED) display for displaying information to a computer user. Aninput device 314, including alphanumeric and other keys, is coupled to bus 302 for communicating information and command selections to theprocessor 304. Another type of user input device is acursor controller 316, such as a mouse, a trackball, a touch-enabled display, or cursor direction keys for communicating direction information and command selections to theprocessor 304 and for controlling cursor movement on thedisplay 312. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x-axis) and a second axis (e.g., y-axis), that allows the device to specify positions in a plane. - According to one embodiment, the techniques herein are performed by the
computer system 300 in response to theprocessor 304 executing one or more sequences of one or more instructions contained in themain memory 306. Such instructions are read into themain memory 306 from another storage medium, such as thestorage device 310. Execution of the sequences of instructions contained in themain memory 306 causes theprocessor 304 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry is used in place of or in combination with software instructions. - The term “storage media” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media includes non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, solid-state drives, or three-dimensional cross point memory, such as the
storage device 310. Volatile media includes dynamic memory, such as themain memory 306. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, and EPROM, a FLASH-EPROM, NV-RAM, or any other memory chip or cartridge. - Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 302. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
- In an embodiment, various forms of media are involved in carrying one or more sequences of one or more instructions to the
processor 304 for execution. For example, the instructions are initially carried on a magnetic disk or solid-state drive of a remote computer. The remote computer loads the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to thecomputer system 300 receives the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector receives the data carried in the infrared signal and appropriate circuitry places the data on the bus 302. The bus 302 carries the data to themain memory 306, from whichprocessor 304 retrieves and executes the instructions. The instructions received by themain memory 306 may optionally be stored on thestorage device 310 either before or after execution byprocessor 304. - The
computer system 300 also includes acommunication interface 318 coupled to the bus 302. Thecommunication interface 318 provides a two-way data communication coupling to a network link 320 that is connected to alocal network 322. For example, thecommunication interface 318 is an integrated service digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, thecommunication interface 318 is a local area network (LAN) card to provide a data communication connection to a compatible LAN. In some implementations, wireless links are also implemented. In any such implementation, thecommunication interface 318 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information. - The network link 320 typically provides data communication through one or more networks to other data devices. For example, the network link 320 provides a connection through the
local network 322 to ahost computer 324 or to a cloud data center or equipment operated by an Internet Service Provider (ISP) 326. TheISP 326 in turn provides data communication services through the world-wide packet data communication network now commonly referred to as the “Internet” 328. Thelocal network 322 andInternet 328 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on the network link 320 and through thecommunication interface 318, which carry the digital data to and from thecomputer system 300, are example forms of transmission media. In an embodiment, the network 320 contains thecloud 202 or a part of thecloud 202 described above. - The
computer system 300 sends messages and receives data, including program code, through the network(s), the network link 320, and thecommunication interface 318. In an embodiment, thecomputer system 300 receives code for processing. The received code is executed by theprocessor 304 as it is received, and/or stored instorage device 310, or other non-volatile storage for later execution. -
FIG. 4 shows an example architecture 400 for an autonomous vehicle (e.g., the AV 100 shown in FIG. 1). The architecture 400 includes a perception module 402 (sometimes referred to as a perception circuit), a planning module 404 (sometimes referred to as a planning circuit), a control module 406 (sometimes referred to as a control circuit), a localization module 408 (sometimes referred to as a localization circuit), and a database module 410 (sometimes referred to as a database circuit). Each module plays a role in the operation of the AV 100. Together, the modules 402, 404, 406, 408, and 410 may be part of the AV system 120 shown in FIG. 1. In some embodiments, any of the modules 402, 404, 406, 408, and 410 is a combination of computer software and computer hardware.
- In use, the planning module 404 receives data representing a destination 412 and determines data representing a trajectory 414 (sometimes referred to as a route) that can be traveled by the AV 100 to reach (e.g., arrive at) the destination 412. In order for the planning module 404 to determine the data representing the trajectory 414, the planning module 404 receives data from the perception module 402, the localization module 408, and the database module 410.
perception module 402 identifies nearby physical objects using one ormore sensors 121, e.g., as also shown inFIG. 1 . The objects are classified (e.g., grouped into types such as pedestrian, bicycle, automobile, traffic sign, etc.) and a scene description including the classifiedobjects 416 is provided to theplanning module 404. - The
planning module 404 also receives data representing theAV position 418 from thelocalization module 408. Thelocalization module 408 determines the AV position by using data from thesensors 121 and data from the database module 410 (e.g., a geographic data) to calculate a position. For example, thelocalization module 408 uses data from a GNSS (Global Navigation Satellite System) sensor and geographic data to calculate a longitude and latitude of the AV. In an embodiment, data used by thelocalization module 408 includes high-precision maps of the roadway geometric properties, maps describing road network connectivity properties, maps describing roadway physical properties (such as traffic speed, traffic volume, the number of vehicular and cyclist traffic lanes, lane width, lane traffic directions, or lane marker types and locations, or combinations of them), and maps describing the spatial locations of road features such as crosswalks, traffic signs or other travel signals of various types. - The
control module 406 receives the data representing thetrajectory 414 and the data representing theAV position 418 and operates the control functions 420 a-c (e.g., steering, throttling, braking, ignition) of the AV in a manner that will cause theAV 100 to travel thetrajectory 414 to thedestination 412. For example, if thetrajectory 414 includes a left turn, thecontrol module 406 will operate the control functions 420 a-c in a manner such that the steering angle of the steering function will cause theAV 100 to turn left and the throttling and braking will cause theAV 100 to pause and wait for passing pedestrians or vehicles before the turn is made. -
- FIG. 5 shows an example of inputs 502 a-d (e.g., sensors 121 shown in FIG. 1) and outputs 504 a-d (e.g., sensor data) that are used by the perception module 402 (FIG. 4). One input 502 a is a LiDAR (Light Detection and Ranging) system (e.g., LiDAR 123 shown in FIG. 1). LiDAR is a technology that uses light (e.g., bursts of light such as infrared light) to obtain data about physical objects in its line of sight. A LiDAR system produces LiDAR data as output 504 a. For example, LiDAR data is collections of 3D or 2D points (also known as point clouds) that are used to construct a representation of the environment 190.
input 502 b is a RADAR system. RADAR is a technology that uses radio waves to obtain data about nearby physical objects. RADARs can obtain data about objects not within the line of sight of a LiDAR system. ARADAR system 502 b produces RADAR data asoutput 504 b. For example, RADAR data are one or more radio frequency electromagnetic signals that are used to construct a representation of theenvironment 190. - Another
input 502 c is a camera system. A camera system uses one or more cameras (e.g., digital cameras using a light sensor such as a charge-coupled device [CCD]) to obtain information about nearby physical objects. A camera system produces camera data asoutput 504 c. Camera data often takes the form of image data (e.g., data in an image data format such as RAW, JPEG, PNG, etc.). In some examples, the camera system has multiple independent cameras, e.g., for the purpose of stereopsis (stereo vision), which enables the camera system to perceive depth. Although the objects perceived by the camera system are described here as “nearby,” this is relative to the AV. In use, the camera system may be configured to “see” objects far, e.g., up to a kilometer or more ahead of the AV. Accordingly, the camera system may have features such as sensors and lenses that are optimized for perceiving objects that are far away. - Another
- Another input 502 d is a traffic light detection (TLD) system. A TLD system uses one or more cameras to obtain information about traffic lights, street signs, and other physical objects that provide visual navigation information. A TLD system produces TLD data as output 504 d. TLD data often takes the form of image data (e.g., data in an image data format such as RAW, JPEG, PNG, etc.). A TLD system differs from a system incorporating a camera in that a TLD system uses a camera with a wide field of view (e.g., using a wide-angle lens or a fish-eye lens) in order to obtain information about as many physical objects providing visual navigation information as possible, so that the AV 100 has access to all relevant navigation information provided by these objects. For example, the viewing angle of the TLD system may be about 120 degrees or more.
- In some embodiments, outputs 504 a-d are combined using a sensor fusion technique. Thus, either the individual outputs 504 a-d are provided to other systems of the AV 100 (e.g., provided to a planning module 404 as shown in FIG. 4), or the combined output can be provided to the other systems, either in the form of a single combined output or multiple combined outputs of the same type (e.g., using the same combination technique or combining the same outputs or both) or different types (e.g., using different respective combination techniques or combining different respective outputs or both). In some embodiments, an early fusion technique is used. An early fusion technique is characterized by combining outputs before one or more data processing steps are applied to the combined output. In some embodiments, a late fusion technique is used. A late fusion technique is characterized by combining outputs after one or more data processing steps are applied to the individual outputs.
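To make the early/late distinction concrete, here is a minimal Python sketch (an illustrative assumption, not the perception module's actual code): the same toy detection step is applied either after combining the raw sensor outputs or to each output individually before the results are merged.

```python
# Hedged illustration of the early- vs late-fusion distinction described above.
# The function names and the toy "processing" step are assumptions, not the
# perception module's actual API.

def detect(points):
    """Toy processing step: report an object if enough points are present."""
    return {"object_present": len(points) >= 3, "count": len(points)}

def early_fusion(lidar_points, radar_points):
    # Combine the raw outputs first, then apply processing to the combined set.
    return detect(lidar_points + radar_points)

def late_fusion(lidar_points, radar_points):
    # Process each output individually, then combine the per-sensor results.
    results = [detect(lidar_points), detect(radar_points)]
    return {"object_present": any(r["object_present"] for r in results),
            "count": sum(r["count"] for r in results)}

lidar = [(1.0, 0.2), (1.1, 0.3)]
radar = [(1.05, 0.25)]
print(early_fusion(lidar, radar))  # 3 combined points -> object_present True
print(late_fusion(lidar, radar))   # each sensor alone sees too few points -> False
```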
- FIG. 6 shows an example of a LiDAR system 602 (e.g., the input 502 a shown in FIG. 5). The LiDAR system 602 emits light 604 a-c from a light emitter 606 (e.g., a laser transmitter). Light emitted by a LiDAR system is typically not in the visible spectrum; for example, infrared light is often used. Some of the light 604 b emitted encounters a physical object 608 (e.g., a vehicle) and reflects back to the LiDAR system 602. (Light emitted from a LiDAR system typically does not penetrate physical objects, e.g., physical objects in solid form.) The LiDAR system 602 also has one or more light detectors 610, which detect the reflected light. In an embodiment, one or more data processing systems associated with the LiDAR system generates an image 612 representing the field of view 614 of the LiDAR system. The image 612 includes information that represents the boundaries 616 of a physical object 608. In this way, the image 612 is used to determine the boundaries 616 of one or more physical objects near an AV.
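A minimal sketch of the underlying ranging idea follows, assuming a simple time-of-flight model that the patent does not spell out: the distance to the reflecting object follows from half the round-trip travel time of the emitted light.

```python
# Minimal time-of-flight sketch (an assumed model the patent does not spell out):
# the range to object 608 is half the round-trip travel time times the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_return(round_trip_seconds):
    """Distance in meters for one detected LiDAR return."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return arriving 200 nanoseconds after emission corresponds to roughly 30 m.
print(f"{range_from_return(200e-9):.1f} m")
```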
- FIG. 7 shows the LiDAR system 602 in operation. In the scenario shown in this figure, the AV 100 receives both camera system output 504 c in the form of an image 702 and LiDAR system output 504 a in the form of LiDAR data points 704. In use, the data processing systems of the AV 100 compare the image 702 to the data points 704. In particular, a physical object 706 identified in the image 702 is also identified among the data points 704. In this way, the AV 100 perceives the boundaries of the physical object based on the contour and density of the data points 704.
- FIG. 8 shows the operation of the LiDAR system 602 in additional detail. As described above, the AV 100 detects the boundary of a physical object based on characteristics of the data points detected by the LiDAR system 602. As shown in FIG. 8, a flat object, such as the ground 802, will reflect light 804 a-d emitted from a LiDAR system 602 in a consistent manner. Put another way, because the LiDAR system 602 emits light using consistent spacing, the ground 802 will reflect light back to the LiDAR system 602 with the same consistent spacing. As the AV 100 travels over the ground 802, the LiDAR system 602 will continue to detect light reflected by the next valid ground point 806 if nothing is obstructing the road. However, if an object 808 obstructs the road, light 804 e-f emitted by the LiDAR system 602 will be reflected from points 810 a-b in a manner inconsistent with the expected consistent manner. From this information, the AV 100 can determine that the object 808 is present.
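The following Python sketch illustrates this consistency check under assumed numbers; the threshold, spacing, and function name are placeholders rather than the patent's algorithm.

```python
# Hedged sketch of the ground-consistency check: expected ground returns step
# outward with roughly even spacing, so a much smaller step suggests an object.
# The spacing, tolerance, and sample ranges are assumed values.

def find_obstruction(ranges, expected_spacing, tolerance=0.25):
    """ranges: successive ground-return distances (meters) along the road."""
    for i in range(1, len(ranges)):
        gap = ranges[i] - ranges[i - 1]
        if gap < expected_spacing * (1.0 - tolerance):
            return i  # index where returns bunch up, i.e. likely object 808
    return None

flat_road = [5.0, 6.0, 7.0, 8.0, 9.0]
with_object = [5.0, 6.0, 6.1, 6.15, 9.0]  # returns bunch up at an obstacle
print(find_obstruction(flat_road, 1.0))    # None
print(find_obstruction(with_object, 1.0))  # 2
```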
- FIG. 9 shows a block diagram 900 of the inputs and outputs of a control module 406 (e.g., as shown in FIG. 4). A control module operates in accordance with a controller 902 which includes, for example, one or more processors (e.g., one or more computer processors such as microprocessors or microcontrollers or both) similar to processor 304, short-term and/or long-term data storage (e.g., random-access memory or flash memory or both) similar to main memory 306, ROM 308, and storage device 310, and instructions stored in memory that carry out operations of the controller 902 when the instructions are executed (e.g., by the one or more processors).
- In an embodiment, the controller 902 receives data representing a desired output 904. The desired output 904 typically includes a velocity, e.g., a speed and a heading. The desired output 904 can be based on, for example, data received from a planning module 404 (e.g., as shown in FIG. 4). In accordance with the desired output 904, the controller 902 produces data usable as a throttle input 906 and a steering input 908. The throttle input 906 represents the magnitude with which to engage the throttle (e.g., acceleration control) of an AV 100, e.g., by engaging the gas pedal, or engaging another throttle control, to achieve the desired output 904. In some examples, the throttle input 906 also includes data usable to engage the brake (e.g., deceleration control) of the AV 100. The steering input 908 represents a steering angle, e.g., the angle at which the steering control (e.g., steering wheel, steering angle actuator, or other functionality for controlling steering angle) of the AV should be positioned to achieve the desired output 904.
- In an embodiment, the controller 902 receives feedback that is used in adjusting the inputs provided to the throttle and steering. For example, if the AV 100 encounters a disturbance 910, such as a hill, the measured speed 912 of the AV 100 is lowered below the desired output speed. In an embodiment, any measured output 914 is provided to the controller 902 so that the necessary adjustments are performed, e.g., based on the differential 913 between the measured speed and desired output. The measured output 914 includes measured position 916, measured velocity 918 (including speed and heading), measured acceleration 920, and other outputs measurable by sensors of the AV 100.
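A hedged sketch of such a feedback loop follows (gains, dynamics, and class names are assumptions; the patent does not specify a control law): the throttle command is driven by the differential between the desired and measured speed.

```python
# Hedged sketch of a feedback speed loop in the spirit of controller 902.
# Gains, the toy vehicle dynamics, and the class name are assumptions.

class SpeedFeedbackController:
    def __init__(self, kp=0.5, ki=0.05):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def throttle(self, desired_speed, measured_speed, dt):
        error = desired_speed - measured_speed        # the speed differential
        self.integral += error * dt
        command = self.kp * error + self.ki * self.integral
        return max(0.0, min(1.0, command))            # clamp to a valid throttle range

controller = SpeedFeedbackController()
speed = 10.0
for _ in range(50):
    u = controller.throttle(desired_speed=15.0, measured_speed=speed, dt=0.1)
    speed += (4.0 * u - 1.0) * 0.1   # toy dynamics: drive force minus drag/grade
print(round(speed, 2))               # approaches the desired 15 m/s
```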
- In an embodiment, information about the disturbance 910 is detected in advance, e.g., by a sensor such as a camera or LiDAR sensor, and provided to a predictive feedback module 922. The predictive feedback module 922 then provides information to the controller 902 that the controller 902 can use to adjust accordingly. For example, if the sensors of the AV 100 detect ("see") a hill, this information can be used by the controller 902 to prepare to engage the throttle at the appropriate time to avoid significant deceleration.
- FIG. 10 shows a block diagram 1000 of the inputs, outputs, and components of the controller 902. The controller 902 has a speed profiler 1002 which affects the operation of a throttle/brake controller 1004. For example, the speed profiler 1002 instructs the throttle/brake controller 1004 to engage acceleration or engage deceleration using the throttle/brake 1006 depending on, e.g., feedback received by the controller 902 and processed by the speed profiler 1002.
- The controller 902 also has a lateral tracking controller 1008 which affects the operation of a steering controller 1010. For example, the lateral tracking controller 1008 instructs the steering controller 1010 to adjust the position of the steering angle actuator 1012 depending on, e.g., feedback received by the controller 902 and processed by the lateral tracking controller 1008.
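For illustration only, a Stanley-style lateral tracking rule is sketched below; the patent does not name a particular algorithm, so the gain, limits, and example values are assumptions.

```python
import math

# Illustrative Stanley-style lateral tracking rule (an assumption; the patent
# does not specify the algorithm used by lateral tracking controller 1008).

def steering_angle(heading_error, cross_track_error, speed, gain=1.5, max_steer=0.6):
    """Return a steering command in radians, clamped to actuator limits."""
    raw = heading_error + math.atan2(gain * cross_track_error, max(speed, 0.1))
    return max(-max_steer, min(max_steer, raw))

# Vehicle 0.5 m left of the lane center and pointed 5 degrees off at 10 m/s.
print(round(steering_angle(math.radians(-5.0), -0.5, 10.0), 3))
```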
- The controller 902 receives several inputs used to determine how to control the throttle/brake 1006 and steering angle actuator 1012. A planning module 404 provides information used by the controller 902, for example, to choose a heading when the AV 100 begins operation and to determine which road segment to traverse when the AV 100 reaches an intersection. A localization module 408 provides information to the controller 902 describing the current location of the AV 100, for example, so that the controller 902 can determine if the AV 100 is at a location expected based on the manner in which the throttle/brake 1006 and steering angle actuator 1012 are being controlled. In an embodiment, the controller 902 receives information from other inputs 1014, e.g., information received from databases, computer networks, etc.
- FIGS. 11A-11C show various views of a self-propelled autonomous platform 1100 useful for testing the autonomous navigation system of an autonomous vehicle similar to AV 100. FIG. 11A shows a top view of platform 1100. Platform 1100 includes platform housing 1102, which can take the form of a rigid structure having a sloped periphery 1104 that allows vehicles to drive over platform 1100 without damaging the vehicles or platform 1100. In some embodiments the sloped periphery can have a linear incline, and in some embodiments sloped periphery 1104 can have a non-linear incline (e.g., a concave or convex arced incline and/or the like). A central region of platform housing 1102 can include a flat or sloped support surface 1106 configured to support at least one target capable of detection by one or more sensors of AV 100. When platform 1100 includes an optical sensor, support surface 1106 can include a sensor cover 1108 that forms a portion of support surface 1106 when a vehicle drives over platform 1100. Support surface 1106 also includes magnetic attachment points 1110, which are configured to attach to a target support structure. The target support structure includes a base with a magnet or magnetically attractable material that is able to magnetically interact and attach to magnetic attachment points 1110. In some embodiments, magnetic attachment points 1110 can take the form of magnetically attractable material that magnetically couples to a magnet that is incorporated into the base of the target support structure. Once securely attached to one or both of magnetic attachment points 1110, the target support structure can be used to keep a target upright and stable while platform 1100 moves around.
- FIG. 11B shows a cross-sectional view of platform 1100 in a first state in accordance with section line A-A. The first state can also be referred to as a normal operating state in which platform 1100 remains until platform housing 1102 receives a threshold amount of force. In this first state, sensor cover 1108 is elevated above support surface 1106, which allows sensor 1112 an unobstructed view outside of platform housing 1102. In some embodiments, sensor 1112 allows platform 1100 to adjust its speed relative to another platform or vehicle during a test based on data generated by sensor 1112. As depicted, wheels 1114 protrude from a downward facing surface of platform housing 1102. Wheels 1114 can be driven by one or more motors disposed within platform housing 1102. FIG. 11B also shows a wheel 1116 associated with sensor retraction mechanism 1118.
- FIG. 11C shows a cross-sectional view of platform 1100 in a second state in accordance with section line A-A. The second state can also be referred to as a stationary state, since platform 1100 is not capable of propelling itself in this state. In this second state, sensor cover 1108 is flush or substantially flush with support surface 1106. This retracted position of sensor 1112 prevents sensor 1112 from sustaining damage in the event a wheel of AV 100 happens to run directly over sensor cover 1108. FIG. 11C also shows how wheels 1114 and wheel 1116 retract into platform housing 1102 in the second state. Wheels 1114 and wheel 1116 are kept in the first state by sturdy springs that are configured to compress or extend only once a threshold amount of force is applied to platform housing 1102. This disclosed structure allows platform 1100 to have a low profile of less than about 80 mm in height, which makes the platform more aerodynamic and does not substantially increase the height of targets it carries. In this way, targets having proportions consistent with normal human or vehicular height can be used without having to adjust their height.
- FIG. 12 shows a rear facing perspective view of platform 1100. In particular, a target support structure 1202 taking the form of a cylindrical beam is shown attached to a rear magnetic attachment point of platform housing 1102. Because target support structure 1202 is magnetically coupled to platform housing 1102, in the event a target attached to target support structure 1202 is hit by AV 100, both the target and target support structure 1202 can detach easily from platform housing 1102 without doing substantial damage to platform 1100 or AV 100. It should be appreciated that target support structure 1202 can take different forms. For example, target support structure 1202 could have target attachment features that allow the target support structure to be securely attached to a target, thereby preventing inadvertent detachment of the target from platform 1100.
- FIG. 13 shows a perspective view of a downward facing surface of platform 1100. Target 1302 is shown attached to an upward facing surface of platform 1100. In some embodiments, target 1302 can be attached directly to one of the magnetic attachment points of platform housing 1102. It should be noted that while target 1302 is depicted as a pedestrian in FIG. 13, platform 1100 is capable of carrying other types of targets, such as a person atop a bike or, in cooperation with other platforms, inflatable vehicle-shaped targets.
- FIG. 13 also shows a close up view 1304 of the downward facing surface of platform 1100 where platform 1100 is in the second state. As depicted, an interior of platform housing 1102 is substantially hollow, allowing room for wheels 1114 and structural ribs 1306. Structural ribs 1306 help to keep platform housing 1102 rigid enough to structurally support the passage of AV 100 over platform housing 1102. Wheel assemblies attaching wheels 1114 to platform housing 1102 are attached by pins to one or more of structural ribs 1306. FIG. 13 also shows motors 1308, which are in axial alignment with wheels 1114-1 and 1114-2. In some embodiments, motors 1308 can be configured to move with wheels 1114 when platform 1100 changes from the second state to the first state to maintain the alignment of motors 1308 with wheels 1114. Having two motors 1308 allows platform 1100 to perform turns by applying differential inputs to motors 1308, thereby allowing platform 1100 to make left and right turns. The differential inputs can take many forms including differential power, control signals, electrical current, etc.
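A minimal sketch of the differential-input idea follows, assuming a simple two-wheel kinematic model and a made-up track width: commanding different speeds to the left and right wheels yields left and right turns.

```python
# Hedged two-wheel kinematic sketch of the differential-input idea. The track
# width and the unicycle model are assumptions for illustration only.

TRACK_WIDTH_M = 0.5  # assumed lateral spacing between the driven wheels

def wheel_speeds(linear_mps, yaw_rate_rps):
    """Convert a desired platform velocity into left/right wheel speeds."""
    left = linear_mps - yaw_rate_rps * TRACK_WIDTH_M / 2.0
    right = linear_mps + yaw_rate_rps * TRACK_WIDTH_M / 2.0
    return left, right

print(wheel_speeds(1.0, 0.0))  # straight line: both wheels at 1.0 m/s
print(wheel_speeds(1.0, 0.8))  # left turn: right wheel runs faster than left
```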
- FIGS. 14A-14B show detailed views of sensor retraction mechanism 1118 in the first and second states respectively. FIG. 14A shows sensor 1112 protruding above support surface 1106. Sensor 1112 is mechanically coupled to wheel 1116 by way of linkage 1402, wheel support 1404 and axle 1406. While wheel support 1404 and linkage 1402 are shown as two different pieces, in some embodiments, these two pieces could be combined into a single piece. Wheel 1116 is kept in the position depicted in FIG. 14A by spring 1408. A first end of spring 1408 is secured to sensor retraction mechanism body 1410 and a second end of spring 1408 is secured to linkage 1402 at pin 1412. Spring 1408 is configured to prevent linkage 1402 from rotating about an axis defined by pin 1414 until a threshold amount of force is applied to platform housing 1102, at which point spring 1408 is configured to lengthen, thereby allowing linkage 1402 to rotate.
- FIG. 14B shows a position of sensor retraction mechanism 1118 when platform 1100 is in the second state. Linkage 1402 is shown in a new position after spring 1408 lengthens to accommodate rotation of linkage 1402. Rotation of linkage 1402 results in the retraction of wheel 1116 into platform housing 1102 (not shown) and the retraction of sensor 1112 into platform housing 1102. Sensor 1112 retracts into platform housing 1102 and sensor retraction body 1410 on account of a distal end of linkage 1402 pushing pin 1416 downward. Since pin 1416 is coupled to sensor 1112, sensor 1112 is retracted into platform housing 1102 as depicted.
- FIGS. 15A-15C show an alternative platform suspension. FIG. 15A shows a perspective view of platform suspension 1500. While platform 1100 has a suspension that includes individual springs for controlling the movement of each wheel, suspension 1500 includes a single chassis 1502 made of rigid material that attaches each of wheels 1504 to a platform housing. In this way, as the platform moves between a normal operating state and a stationary state, wheels 1504 all move together with chassis 1502. FIG. 15A also shows how motors 1506 are coupled to chassis 1502. Since both motors 1506 and wheels 1504 are attached to the chassis 1502, alignment between motors 1506 and wheels 1504 remains constant regardless of state. Motors 1506 engage wheels 1504 by way of bevel gearing 1508. Alternatively, suspension 1500 could be widened and drive shafts of motors 1506 could be aligned directly with the axes of rotation of the wheels similar to the configuration shown in FIG. 13.
- Chassis 1502 is also coupled to housing brackets 1510 by multiple springs 1512 and linkages 1514. Springs 1512 prevent movement of chassis 1502 relative to housing brackets 1510 until a threshold amount of force is applied to them through a respective platform housing. Once springs 1512 begin to stretch under the applied force, linkages 1514 are configured to control movement of chassis 1502 with respect to housing brackets 1510. Because housing brackets 1510 are both rigidly coupled to a respective platform housing, linkages 1514 also control the movement of chassis 1502 with respect to the respective platform housing.
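The threshold behavior can be pictured with a simple preload calculation; the spring count, preload force, and masses below are illustrative assumptions, not values from the patent.

```python
# Illustrative preload check for the threshold-force behavior. Spring count,
# per-spring preload, and the example masses are assumed values.

G = 9.81  # m/s^2

def stays_in_first_state(applied_mass_kg, preload_n_per_spring, spring_count=4):
    """True while the total spring preload exceeds the applied load."""
    return applied_mass_kg * G < preload_n_per_spring * spring_count

# A 40 kg target keeps the suspension in the first (normal operating) state,
# while a ~500 kg wheel load from a passing vehicle collapses it to the second.
print(stays_in_first_state(40, 125.0))   # True
print(stays_in_first_state(500, 125.0))  # False
```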
- FIGS. 15B and 15C show side views of suspension 1500 in normal and stationary states. The figures demonstrate how chassis 1502 moves upward and laterally with respect to housing brackets 1510 to facilitate retraction of chassis 1502 into the respective platform housing as springs 1512 extend. FIG. 15C also shows how wheels 1504 retract to an extent that they become even with a base of housing bracket 1510. In some embodiments, linkages 1514 can be configured such that wheels 1504 retract above the base of housing bracket 1510.
- FIG. 16 shows a perspective view of a bottom surface of platform 1600, which incorporates the alternative suspension 1500 depicted in FIGS. 15A-15C. As depicted, platform 1600 includes a platform housing 1602 that defines a series of structural ribs 1604 that span a periphery of platform housing 1602. In the peripheral region, structural ribs 1604 are arranged in a grid pattern. In a central region of platform housing 1602, structural ribs 1606 only run in a single direction and are broken up to allow space to attach suspension 1500 within the central region.
- FIG. 17A shows an exemplary testing setup in which an autonomous platform 1702 with a pedestrian target mounted atop it is positioned at the entrance to a cross-walk 1704. When AV 100 reaches a predetermined position 1706, autonomous platform 1702 can be configured to traverse cross-walk 1704. Arrival of AV 100 at position 1706 can be determined in a number of ways. For example, a sensor can be embedded within the road at position 1706 and configured to identify passage of AV 100. In some embodiments, the sensor can be an RFID reader configured to emit an electromagnetic field to sense passage of an RFID tag secured to a forward portion of AV 100. The RFID reader can then transmit a signal to autonomous platform 1702 relaying that AV 100 has arrived at position 1706. In some embodiments, an on-board optical sensor can be used to determine when AV 100 has arrived at position 1706. The optical sensor can be configured to measure the size of a target or feature positioned on an exterior surface of AV 100, allowing analysis of the imagery to provide distance information. The optical sensor can also be configured to determine distance from AV 100 by measuring a distance between features separated by a known distance.
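A hedged pinhole-camera sketch of this optical ranging follows; the focal length, marker size, and pixel measurement are assumed values used only to show the similar-triangles relationship.

```python
# Hedged pinhole-camera sketch of ranging from apparent size. The focal length
# (in pixels), the marker size, and the measured width are assumed values.

def distance_from_apparent_size(real_width_m, width_px, focal_length_px):
    """Similar triangles: distance = focal_length * real_size / apparent_size."""
    return focal_length_px * real_width_m / width_px

# A 0.3 m wide marker on AV 100 imaged at 40 px by an 800 px focal-length
# camera is roughly 6 m away.
print(distance_from_apparent_size(0.3, 40, 800.0))  # 6.0
```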
- A path taken by autonomous platform 1702 can be a constant speed straight path or it can vary substantially. In some embodiments the path taken by autonomous platform 1702 can be based on a manual or programmatic input. For example, a test manager can manually input directional commands that cause autonomous platform 1702 to do a straight traversal across cross-walk 1704 or a more meandering path that can vary in direction and speed while staying within the bounds of cross-walk 1704.
- Regardless of the type of input provided to autonomous platform 1702, it is important that autonomous platform 1702 is able to repeat the same set of movements so that improvements to the autonomous management of AV 100 can be tracked in the event AV 100 is struggling with a particular scenario. Autonomous platform 1702 can perform the movements multiple times by recording instructions and/or a series of positions it occupies during a particular test run. This can allow for the movements to be repeated with precision. It can also be desirable for autonomous platform 1702 to have the capability to make adjustments or modifications to a previous set of input commands. This may be helpful where AV 100 responds perfectly to the traversal of autonomous platform 1702 across cross-walk 1704, as it allows testers to see whether specific changes to the movement of autonomous platform 1702 across cross-walk 1704 cause a failure in the performance of AV 100. For example, autonomous platform 1702 can be configured to make controlled adjustments in speed and/or direction to ensure AV 100 is able to react accordingly to a wide variety of scenarios. In addition to providing a large number of controlled scenario variations, these autonomous adjustments also help reduce the amount of time needed by individuals to set up the scenarios, as one doesn't need to have a dedicated worker driving every autonomous platform 1702.
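A minimal sketch of replaying a stored command sequence with controlled variations is shown below; the data layout and scaling parameters are assumptions made for illustration.

```python
# Hedged sketch of re-running a recorded command sequence with controlled
# variations; the (speed, heading) layout and the perturbations are assumptions.

def vary_run(commands, speed_scale=1.0, heading_offset_deg=0.0):
    """commands: list of (speed_mps, heading_deg) pairs from a recorded run."""
    return [(s * speed_scale, h + heading_offset_deg) for s, h in commands]

baseline = [(1.2, 90.0), (1.2, 90.0), (1.0, 80.0), (1.0, 100.0)]
print(vary_run(baseline, speed_scale=1.25))          # same path, 25% faster
print(vary_run(baseline, heading_offset_deg=-10.0))  # same speeds, skewed path
```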
- FIG. 17B shows a perspective view of another testing setup in which a laser transmitter 1710 and laser receiver 1712 are employed to determine when AV 100 has reached a predetermined location. The use of a laser detection system may provide a quicker response than an RFID reader, as a signal can be transmitted as soon as AV 100 prevents laser receiver 1712 from receiving the laser transmitted by laser transmitter 1710. In some embodiments, the laser detection system can also be configured to transmit the speed of AV 100 at the predetermined location by measuring how long the laser is blocked. The speed of AV 100 at the predetermined location may also be used to determine when autonomous platform 1702 begins movement and/or at what speed the movement is carried out. This could be useful in a case where a driver is maneuvering AV 100 at different speeds and evaluation of the anti-collision system requires autonomous platform 1702 to be positioned for a collision with a front of AV 100. In some embodiments, an onboard sensor of autonomous platform 1702 may be used to assist in positioning autonomous platform 1702 in a particular position relative to AV 100 at a time of contact. For example, while initial movement and direction of autonomous platform 1702 can be made in accordance with detection of AV 100 at a predetermined position, one or more sensors such as a LiDAR, RADAR or imaging sensor can be used to provide cuing to autonomous platform 1702 so it is positioned as intended prior to its closest point of approach to AV 100.
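Assuming the beam-blocking approach described above, a short sketch of the speed calculation follows (vehicle length and blocked duration are made-up example values).

```python
# Hedged sketch of inferring AV speed from how long the laser beam is blocked.
# The vehicle length and blocked duration are assumed example values.

def speed_from_blocked_beam(vehicle_length_m, blocked_seconds):
    """Speed is the length of the body that crossed the beam over the block time."""
    return vehicle_length_m / blocked_seconds

# A 4.8 m long AV that blocks the beam for 0.6 s is travelling at 8 m/s.
print(speed_from_blocked_beam(4.8, 0.6))  # 8.0
```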
- FIG. 18A shows a top view of an exemplary intersection testing setup 1800 for AV 100 with multiple autonomous platforms 1802-1808 configured to execute different movement patterns. Autonomous platforms 1802-1808 can be configured to execute their movement patterns in response to AV 100 arriving at a singular predetermined location, or autonomous platforms 1802-1808 can be configured to execute movement patterns in response to AV 100 arriving at different predetermined positions. For example, when AV 100 arrives at position 1810, certain autonomous platforms can be configured to traverse crosswalk 1812, and when AV 100 reaches position 1814, other autonomous platforms can be configured to traverse their respective crosswalks. In response to the presence of platform 1802 within crosswalk 1816 being detected by one or more sensors of AV 100, AV 100 may decide to slow down or stop prior to making a right turn crossing through crosswalk 1816. In some embodiments, autonomous platforms 1802-1808 can be configured to augment readings from their position sensor or sensors by using an onboard sensor to determine a position of AV 100.
- FIG. 18B shows a top view of testing setup 1800 with autonomous platforms 1802-1808 cooperating to carry vehicular target 1850, which can take the form of an inflatable vehicular target. Autonomous platforms 1802-1808 can be configured to maintain a formation to carry a large target such as vehicular target 1850. Sensors aboard autonomous platforms 1802-1808 can be configured to help maintain the relative positioning of autonomous platforms 1802-1808. In some embodiments, one of autonomous platforms 1802-1808 can be configured to guide motion of vehicular target 1850 and the other autonomous platforms can be configured to follow changes in direction and speed of the one autonomous platform based solely upon feedback from onboard sensors, or alternatively can be configured to receive wireless control signals from the controlling autonomous platform and/or cuing from onboard sensors. In some embodiments, vehicular target 1850 can be magnetically coupled to only a single one of autonomous platforms 1802-1808. This configuration allows for easier detachment of vehicular target 1850 from the autonomous platforms in the event of a collision.
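One way such formation keeping could be sketched (an assumption, not the patent's control scheme) is a leader-follower rule in which each follower steers toward a fixed offset from the guiding platform and matches its speed.

```python
import math

# Hedged leader-follower sketch for cooperating platforms; the formation offset,
# gain, and the simple proportional rule are assumptions, not the patent's scheme.

def follower_command(leader_pos, leader_heading, leader_speed, follower_pos,
                     offset=(-1.5, 0.8), gain=0.8):
    """Return (speed, heading) driving the follower toward its formation slot."""
    cos_h, sin_h = math.cos(leader_heading), math.sin(leader_heading)
    # Formation slot expressed in the world frame.
    slot_x = leader_pos[0] + offset[0] * cos_h - offset[1] * sin_h
    slot_y = leader_pos[1] + offset[0] * sin_h + offset[1] * cos_h
    dx, dy = slot_x - follower_pos[0], slot_y - follower_pos[1]
    distance = math.hypot(dx, dy)
    speed = leader_speed + gain * distance  # close the gap, then match the leader
    heading = math.atan2(dy, dx) if distance > 1e-3 else leader_heading
    return speed, heading

print(follower_command((10.0, 5.0), 0.0, 1.0, (8.0, 5.5)))
```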
- Autonomous platforms 1802-1808 can be further configured to adjust their operation in response to AV 100 arriving at one or more predetermined positions. For example, an autonomous platform can be configured to slow down in response to AV 100 arriving at position 1810 and come to a complete stop when AV 100 arrives at position 1814.
- FIG. 19 is a flow chart of an example process 1900 for controlling a self-propelled platform. At 1902, a processor of the self-propelled platform causes the self-propelled platform to move based on a first movement route in accordance with a user input (e.g., a user input received by the processor). In some embodiments, the processor can be configured to supply inputs to motors powering the self-propelled platform to effectuate movement and maneuvering of the self-propelled platform based on the first movement route. The self-propelled platform can be constructed in accordance with any of the descriptions found in FIGS. 11A-16. In some embodiments, the user input can be received by the self-propelled platform prior to execution of the movement and stored in local or cloud-based computer storage as a subroutine that includes a desired direction and rate of movement of the self-propelled platform. The user input could alternatively be input by a remote input control, allowing an individual controlling the self-propelled platform with the remote input control to specify a specific path relative to the surroundings of the self-propelled platform in real-time. - At 1904, during movement of the self-propelled platform, positions of the self-propelled platform at particular times or at particular velocities are recorded in computer memory as a movement route. The position information can be recorded in a number of different reference frames. For example, it may be desirable for the self-propelled platform to always traverse the same portion of a testing setup. Alternatively, it may be more advantageous for the movement to be based entirely or at least in part upon movement of the self-propelled platform relative to a self-propelled vehicle undergoing testing. The position data can be obtained from one or more systems aboard the self-propelled platform including, e.g., a satellite navigation system and an optical or RADAR sensor. On-board sensors capable of providing data about objects surrounding the self-propelled platform are useful in recording the first movement route with respect to one or more other self-propelled vehicles.
- At 1906, the self-propelled platform can be configured to follow a second movement route based on the recorded positions of the self-propelled platform during the first movement route. In some embodiments, the second movement route can be exactly the same or as close to exactly the same as the first movement route as possible given the accuracy of the recorded positions.
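A minimal record-and-replay sketch in the spirit of process 1900 is shown below; the file format, field names, and sample values are illustrative assumptions.

```python
import json

# Minimal record-and-replay sketch in the spirit of process 1900; the JSON file
# name, field names, and sample values are illustrative assumptions.

def record_run(samples, path="first_route.json"):
    """samples: list of dicts like {"t": seconds, "x": m, "y": m, "speed": m/s}."""
    with open(path, "w") as f:
        json.dump(samples, f)

def load_second_route(path="first_route.json"):
    """Return (x, y, speed) waypoints replayed during the second movement route."""
    with open(path) as f:
        return [(p["x"], p["y"], p["speed"]) for p in json.load(f)]

record_run([{"t": 0.0, "x": 0.0, "y": 0.0, "speed": 0.0},
            {"t": 1.0, "x": 1.1, "y": 0.0, "speed": 1.1},
            {"t": 2.0, "x": 2.3, "y": 0.1, "speed": 1.2}])
print(load_second_route())
```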
- In the foregoing description, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. The description and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. The sole and exclusive indicator of the scope of the invention, and what is intended by the applicants to be the scope of the invention, is the literal and equivalent scope of the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in such claims shall govern the meaning of such terms as used in the claims. In addition, when we use the term “further comprising,” in the foregoing description or following claims, what follows this phrase can be an additional step or entity, or a sub-step/sub-entity of a previously-recited step or entity.
Claims (20)
1. A self-propelled platform, comprising:
a plurality of wheels;
a motor configured to drive at least one of the plurality of wheels;
a platform housing comprising a support surface configured to carry at least one target and a sloped periphery configured to accommodate passage of an autonomous vehicle over the platform housing; and
a suspension comprising a plurality of springs coupling the plurality of wheels to the platform housing, the plurality of springs configured to transition the platform from a first state to a second state in response to a threshold amount of weight being applied to the platform housing, wherein the platform housing is lower in the second state than it is in the first state.
2. The self-propelled platform of claim 1 , further comprising a retractable sensor, wherein the retractable sensor protrudes from the platform housing when the suspension is in the first state and is retracted within the platform housing when the suspension is in the second state.
3. The self-propelled platform of claim 2 , wherein the retractable sensor is mechanically coupled to one of the plurality of wheels by a spring and at least one linkage.
4. The self-propelled platform of claim 2 , wherein the retractable sensor comprises an optical sensor.
5. The self-propelled platform of claim 2 , wherein the retractable sensor comprises a LiDAR sensor.
6. The self-propelled platform of claim 2 , wherein the retractable sensor is a first retractable sensor and the self-propelled platform further comprises a second retractable sensor facing in a different direction than the first retractable sensor.
7. The self-propelled platform of claim 2 , further comprising a motor configured to transition the retractable sensor between the protruding position and the retracted position based on a determined proximity of the autonomous vehicle to the self-propelled platform.
8. The self-propelled platform of claim 1 , wherein the sloped periphery of the platform housing and the plurality of wheels are in direct contact with a surface upon which the self-propelled platform rests when the self-propelled platform is in the second state.
9. The self-propelled platform of claim 1 , wherein the suspension comprises a rigid chassis coupled to each wheel of the plurality of wheels, wherein the chassis is coupled to the platform housing by the plurality of springs.
10. The self-propelled platform of claim 1 , wherein a first wheel of the plurality of wheels is coupled to the platform housing by a first spring of the plurality of springs and a second wheel of the plurality of wheels is coupled to the platform housing by a second spring of the plurality of springs.
11. The self-propelled platform of claim 1 , wherein the threshold amount of weight is between 25 kg and 75 kg.
12. The self-propelled platform of claim 1 , further comprising a weight on wheels sensor configured to distinguish between the first state and the second state, wherein the motor is configured to cease operation in response to sensor data from the weight on wheels sensor indicating the self-propelled platform is in the second state.
13. The self-propelled platform of claim 1 , wherein the motor is a first motor and the self-propelled platform further comprises a second motor configured to drive a second wheel of the plurality of wheels, wherein the first motor is configured to drive a first wheel of the plurality of wheels independent from the second motor.
14. The self-propelled platform of claim 13 , wherein a drive axis of the first motor is aligned with an axis of rotation of a first wheel of the plurality of wheels.
15. The self-propelled platform of claim 1 , wherein in the first state the self-propelled platform has an overall height of less than 8 cm.
16. The self-propelled platform of claim 1 , wherein an outside surface of the platform housing comprises at least one permanent magnet configured to secure a target with a magnetically attractable plate or magnet to the platform housing.
17. The self-propelled platform of claim 16 , wherein the target is shaped to simulate a pedestrian or cyclist.
18. The self-propelled platform of claim 1 , further comprising a wireless communication module configured to receive commands that change a movement route of the self-propelled platform.
19. The self-propelled platform of claim 1 , further comprising:
a retractable sensor, wherein the retractable sensor protrudes from the platform housing when the suspension is in the first state and is retracted within the platform housing when the suspension is in the second state;
a motor configured to transition the retractable sensor between the protruding position and the retracted position; and
a wireless communication module configured to receive commands that direct the motor to transition the retractable sensor between the protruding and retracted positions.
20-39. (canceled)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/330,112 US11505134B1 (en) | 2021-05-25 | 2021-05-25 | Automated moving platform |
GB2116241.7A GB2607130A (en) | 2021-05-25 | 2021-11-11 | Automated moving platform |
DE102021133741.0A DE102021133741A1 (en) | 2021-05-25 | 2021-12-17 | AUTOMATED MOBILE PLATFORM |
KR1020210183051A KR20220159249A (en) | 2021-05-25 | 2021-12-20 | Automated moving platform |
CN202111669761.2A CN115388904A (en) | 2021-05-25 | 2021-12-31 | Self-propelled platform, method for carrying out same and storage medium |
US18/056,168 US20230085010A1 (en) | 2021-05-25 | 2022-11-16 | Automated moving platform |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/330,112 US11505134B1 (en) | 2021-05-25 | 2021-05-25 | Automated moving platform |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/056,168 Continuation US20230085010A1 (en) | 2021-05-25 | 2022-11-16 | Automated moving platform |
Publications (2)
Publication Number | Publication Date |
---|---|
US11505134B1 US11505134B1 (en) | 2022-11-22 |
US20220379820A1 true US20220379820A1 (en) | 2022-12-01 |
Family
ID=79163532
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/330,112 Active US11505134B1 (en) | 2021-05-25 | 2021-05-25 | Automated moving platform |
US18/056,168 Pending US20230085010A1 (en) | 2021-05-25 | 2022-11-16 | Automated moving platform |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/056,168 Pending US20230085010A1 (en) | 2021-05-25 | 2022-11-16 | Automated moving platform |
Country Status (5)
Country | Link |
---|---|
US (2) | US11505134B1 (en) |
KR (1) | KR20220159249A (en) |
CN (1) | CN115388904A (en) |
DE (1) | DE102021133741A1 (en) |
GB (1) | GB2607130A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3811049B1 (en) * | 2018-06-22 | 2023-01-25 | Anthony Best Dynamics Ltd | Soft target movement platform |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180037172A1 (en) * | 2016-08-05 | 2018-02-08 | MotoCrane, LLC | Releasable vehicular camera mount |
US10063815B1 (en) * | 2011-09-26 | 2018-08-28 | Jenesia1, Inc. | Mobile communication platform |
US20210070245A1 (en) * | 2019-09-10 | 2021-03-11 | Toyota Research Institute, Inc. | Ramp structures for a mobile platform |
US20220075057A1 (en) * | 2020-09-08 | 2022-03-10 | Lyft, Inc. | Universal sensor assembly for vehicles |
US20220075030A1 (en) * | 2020-09-09 | 2022-03-10 | Motional Ad Llc | Vehicle Sensor Mounting System |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8583358B2 (en) | 2011-07-13 | 2013-11-12 | Dynamic Research, Inc. | Devices, systems, and methods for testing crash avoidance technologies |
DE102014206086A1 (en) | 2014-03-31 | 2015-10-01 | Robert Bosch Gmbh | Method for operating a self-propelled mobile platform |
GB2569774A (en) | 2017-10-20 | 2019-07-03 | Kompetenzzentrum Das Virtuelle Fahrzeug | Method for virtual testing of real environments with pedestrian interaction and drones |
Also Published As
Publication number | Publication date |
---|---|
CN115388904A (en) | 2022-11-25 |
KR20220159249A (en) | 2022-12-02 |
US11505134B1 (en) | 2022-11-22 |
US20230085010A1 (en) | 2023-03-16 |
GB2607130A (en) | 2022-11-30 |
DE102021133741A1 (en) | 2022-12-01 |
GB2607130A9 (en) | 2023-06-28 |
GB202116241D0 (en) | 2021-12-29 |