US11978259B2 - Systems and methods for particle filter tracking - Google Patents
Systems and methods for particle filter tracking
- Publication number
- US11978259B2 (application US17/371,637)
- Authority
- US
- United States
- Prior art keywords
- cuboid
- track
- value
- score
- angular velocity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/66—Tracking systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4814—Constructional features, e.g. arrangements of optical elements of transmitters alone
- G01S7/4815—Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Definitions
- the present disclosure relates generally to object tracking systems. More particularly, the present disclosure relates to implementing systems and methods for particle filter tracking.
- Modern day vehicles have at least one on-board computer and have internet/satellite connectivity.
- The software running on these on-board computers monitors and/or controls operations of the vehicles.
- the vehicle also comprises LiDAR detectors and machine learning algorithms trained for detecting and tracking objects in proximity thereto.
- The LiDAR detectors generate LiDAR datasets that measure the distance from the vehicle to an object at a plurality of different times. These distance measurements can be used for tracking movements of the object, making predictions as to the object's trajectory, and planning paths of travel for the vehicle based on the predicted object trajectory.
- the present disclosure concerns implementing systems and methods for operating a mobile platform (e.g., an autonomous vehicle).
- the methods comprise, by a computing device: obtaining a LiDAR point cloud; using the LiDAR point cloud to generate a track for a given object in accordance with a particle filter algorithm; using the track to train a machine learning algorithm to detect and classify objects based on sensor data; and/or causing the machine learning algorithm to be used for controlling movement of the mobile platform.
- the particle filter algorithm is configured to generate states of a given object over time. Each state may be defined by a position, a velocity and a heading for the given object at a particular time. Each state has a score indicating a likelihood that a cuboid would be created given an acceleration value and an angular velocity value.
- the acceleration value and the angular velocity value may comprise random numbers.
- The score is generated by: setting a score value for the cuboid equal to zero; generating a first adjusted score value by adding to the score value a likelihood of seeing the acceleration value and the angular velocity value in a context; and generating a second adjusted score value by adding to the first adjusted score value a negative squared distance from each data point of said LiDAR point cloud to a closest edge of the cuboid.
- the particle filter algorithm may also be configured to: generate an initial cuboid encompassing at least some data points in the LiDAR point cloud; randomly select different sets of acceleration and angular velocity values; create a set of cuboids using the initial cuboid and the different sets of acceleration and angular velocity values; determine a score for each cuboid of the set of cuboids that indicates a likelihood that the cuboid would be created given the respective one of the different sets of acceleration and angular velocity values; identify scores that are less than a maximum score minus a threshold value; and/or remove cuboids from the set of cuboids that are associated with the scores which were identified.
- the implementing systems can comprise: a processor; and/or a non-transitory computer-readable storage medium comprising programming instructions that are configured to cause the processor to implement a method for operating a mobile platform.
- FIG. 1 is an illustration of a system.
- FIG. 2 is an illustration of an architecture for a vehicle.
- FIG. 3 is an illustration of an architecture for a LiDAR system.
- FIG. 4 is an illustration of a computing device.
- FIG. 5 provides a block diagram of an illustrative vehicle trajectory planning process.
- FIG. 6 provides a flow diagram of an illustrative method for training machine learning algorithms and/or operating a vehicle.
- FIG. 7 provides a graph including a LiDAR dataset and a cuboid.
- FIG. 8 provides a flow diagram of an illustrative method for determining track(s).
- FIG. 9 provides a flow diagram of an illustrative method for generating an amodal cuboid using particle filter algorithm(s).
- FIG. 10 provides a flow diagram for determining a score for a cuboid.
- FIG. 11 provides a flow diagram of an illustrative method for validating a track.
- FIG. 12 provides an illustration that is useful for understanding how a cuboid is generated in accordance with a particle filter algorithm.
- FIG. 13 provides an illustration showing a LiDAR dataset and a set of cuboids generated using the LiDAR dataset.
- An “electronic device” or a “computing device” refers to a device that includes a processor and memory. Each device may have its own processor and/or memory, or the processor and/or memory may be shared with other devices as in a virtual machine or container arrangement.
- the memory will contain or receive programming instructions that, when executed by the processor, cause the electronic device to perform one or more operations according to the programming instructions.
- The terms "memory," "memory device," "data store," "data storage facility" and the like each refer to a non-transitory device on which computer-readable data, programming instructions or both are stored. Except where specifically stated otherwise, these terms are intended to include single device embodiments, embodiments in which multiple memory devices together or collectively store a set of data or instructions, as well as individual sectors within such devices.
- The terms "processor" and "processing device" refer to a hardware component of an electronic device that is configured to execute programming instructions. Except where specifically stated otherwise, the singular term "processor" or "processing device" is intended to include both single-processing-device embodiments and embodiments in which multiple processing devices together or collectively perform a process.
- The term "vehicle" refers to any moving form of conveyance that is capable of carrying one or more human occupants and/or cargo and is powered by any form of energy.
- The term "vehicle" includes, but is not limited to, cars, trucks, vans, trains, autonomous vehicles, aircraft, aerial drones and the like.
- An “autonomous vehicle” is a vehicle having a processor, programming instructions and drivetrain components that are controllable by the processor without requiring a human operator.
- An autonomous vehicle may be fully autonomous in that it does not require a human operator for most or all driving conditions and functions, or it may be semi-autonomous in that a human operator may be required in certain conditions or for certain operations, or that a human operator may override the vehicle's autonomous system and may take control of the vehicle.
- a cuboid for an object is a 3D oriented bounding box that represents (i) a heading of the object, and (ii) a full extent of the object.
- a track may comprise a plurality of cuboids that are temporally arranged (e.g., to indicate observed and/or predicted motion or movement of the object over time).
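Under the two definitions above, a cuboid and a track might be represented as follows; the field names and types are illustrative assumptions (the disclosure only requires that a cuboid capture the object's heading and full extent, and that a track arrange cuboids temporally):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Cuboid:
    """3D oriented bounding box for an object at one point in time."""
    t: float        # timestamp of the frame this cuboid labels
    x: float        # center position
    y: float
    z: float
    heading: float  # yaw of the box in radians — the object's heading
    length: float   # full extent of the object
    width: float
    height: float

def make_track(cuboids: List[Cuboid]) -> List[Cuboid]:
    """A track: the object's cuboids arranged temporally."""
    return sorted(cuboids, key=lambda c: c.t)
```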
- The methods generally involve operating a mobile platform (e.g., an autonomous vehicle) by performing the following operations with a computing device.
- the methods comprise, by a computing device: obtaining a LiDAR point cloud; using the LiDAR point cloud to generate a track for a given object in accordance with a particle filter algorithm; using the track to train a machine learning algorithm to detect and classify objects based on sensor data; and/or causing the machine learning algorithm to be used for controlling movement of the mobile platform (e.g., cause autonomous vehicle to drive or travel along a trajectory, cause an articulating arm to extend and/or grip an object, etc.).
- the implementing systems can comprise: a processor; and a non-transitory computer-readable storage medium comprising programming instructions that are configured to cause the processor to implement a method for operating an autonomous robot (e.g., an autonomous vehicle) or other mobile platform (e.g., an articulating arm coupled to a mobile or fixed base).
- the present solution has many advantages. Goals of the present solution are to accelerate track labeling and produce quality tight cuboids.
- A current implementation for track labeling uses linear interpolation between consecutive key frames, which forces an object's motion to be approximated as piecewise linear. Multiple key frames are therefore required for non-linear motion. Labelers compensate by drawing unrealistically large cuboids, since the LiDAR data points can then shift around inside them, but they still need multiple key frames whenever the object accelerates or rotates.
- The novel particle filter algorithm of the present solution uses interpolation that requires less manual interaction, decreases the time required for labeling objects, allows for tighter-fitting cuboids, produces more realistic dynamical behavior, and produces better-quality data.
- The present solution is described herein in the context of autonomous robots (e.g., autonomous vehicles).
- the present solution is not limited to autonomous robot applications.
- the present solution can be used in other applications.
- System 100 comprises a vehicle 102 1 that is traveling along a road in a semi-autonomous or autonomous manner.
- Vehicle 102 1 is also referred to herein as an AV.
- the AV 102 1 can include, but is not limited to, a land vehicle (as shown in FIG. 1 ), an aircraft, a watercraft or a spacecraft.
- AV 102 1 is generally configured to detect objects in proximity thereto.
- the objects can include, but are not limited to, a vehicle 102 2 , a cyclist (not shown) (such as a rider of a bicycle, electric scooter, motorcycle, or the like) and/or a pedestrian (not shown).
- the object detection may be achieved using machine learning algorithms that were trained with tracks determined in accordance with the present solution. Each track comprises a plurality of cuboids. The manner in which the tracks are determined or otherwise generated will become evident as the discussion progresses. Still, it should be understood that the tracks are determined/generated using LiDAR datasets generated by a LiDAR detector which may be onboard the AV 102 1 and/or onboard another platform.
- the LiDAR detector generally measures the distance to an object 102 2 by illuminating the object 102 2 with light 104 (e.g., a laser light) and measuring the reflected light 106 with a sensor.
- the LiDAR detector generates LiDAR datasets at a plurality of times t, t+1, t+2, . . . , t+n.
- Each LiDAR dataset is also referred to herein as a frame of LiDAR data.
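The time-of-flight relationship behind these distance measurements is standard: the emitted pulse travels to the object and back, so the range is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_range(round_trip_seconds):
    """Range from time of flight; halve because the pulse goes out and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```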
- The frames of LiDAR data are processed by an onboard computing device of the AV 102 1 and/or by a remote computing device 110 to generate cuboids for objects given the LiDAR datasets.
- the LiDAR datasets may be communicated from the AV 102 1 to the remote computing device 110 over a network 108 (e.g., the Internet) via wired and/or wireless connections.
- the LiDAR datasets may also be stored in a memory of the AV 102 1 , which may be manually removed from the AV 102 1 and connected to the remote computing device 110 .
- the LiDAR datasets may additionally be stored in a remote datastore 112 (e.g., a database).
- The cuboids for the objects are then used to train machine learning algorithms for making object detections and/or object trajectory predictions.
- AV 102 1 When such an object detection is made using the trained machine learning algorithm, AV 102 1 performs operations to: generate one or more possible object trajectories for the detected object; analyze the generated possible object trajectory(ies) to determine a trajectory for the AV 102 1 ; and cause the AV 102 1 to follow the trajectory.
- Referring now to FIG. 2, there is provided an illustration of an illustrative system architecture 200 for a vehicle.
- Vehicles 102 1 and/or 102 2 of FIG. 1 can have the same or similar system architecture as that shown in FIG. 2 .
- system architecture 200 is sufficient for understanding vehicle(s) 102 1 , 102 2 of FIG. 1 .
- the vehicle 200 includes an engine or motor 202 and various sensors 204 - 218 for measuring various parameters of the vehicle.
- the sensors may include, for example, an engine temperature sensor 204 , a battery voltage sensor 206 , an engine Rotations Per Minute (RPM) sensor 208 , and a throttle position sensor 210 .
- the vehicle may have an electric motor, and accordingly will have sensors such as a battery monitoring system 212 (to measure current, voltage and/or temperature of the battery), motor current 214 and voltage 216 sensors, and motor position sensors 218 (e.g., resolvers and encoders).
- Operational parameter sensors that are common to both types of vehicles include, for example, a position sensor 236 (e.g., an accelerometer, gyroscope and/or inertial measurement unit), a speed sensor 238 , and/or an odometer sensor 240 .
- the vehicle also may have a clock 242 that the system uses to determine vehicle time during operation.
- the clock 242 may be encoded into the vehicle on-board computing device, it may be a separate device, or multiple clocks may be available.
- the vehicle also will include various sensors that operate to gather information about the environment in which the vehicle is traveling. These sensors may include, for example: a location sensor 260 (e.g., a Global Positioning System (GPS) device); object detection sensors such as one or more cameras 262 ; a LiDAR sensor system 264 ; and/or a radar and/or a sonar system 266 .
- the sensors also may include environmental sensors 268 such as a precipitation sensor and/or ambient temperature sensor.
- the object detection sensors may enable the vehicle to detect objects that are within a given distance range of the vehicle 200 in any direction, while the environmental sensors collect data about environmental conditions within the vehicle's area of travel.
- the vehicle on-board computing device 220 analyzes the data captured by the sensors and optionally controls operations of the vehicle based on results of the analysis. For example, the vehicle on-board computing device 220 may control: braking via a brake controller 232 ; direction via a steering controller 224 ; speed and acceleration via a throttle controller 226 (in a gas-powered vehicle) or a motor speed controller 228 (such as a current level controller in an electric vehicle); a differential gear controller 230 (in vehicles with transmissions); and/or other controllers.
- Geographic location information may be communicated from the location sensor 260 to the vehicle on-board computing device 220 , which may then access a map of the environment that corresponds to the location information to determine known fixed features of the environment such as a ground surface, streets, buildings, stop signs and/or stop/go signals. Captured images from the cameras 262 and/or object detection information captured from sensors such as LiDAR sensor system 264 is communicated from those sensors to the vehicle on-board computing device 220 . The object detection information and/or captured images are processed by the vehicle on-board computing device 220 to detect objects in proximity to the vehicle 200 . Any known or to be known technique for making an object detection based on sensor data and/or captured images can be used in the embodiments disclosed in this document.
- the vehicle on-board computing device 220 When the vehicle on-board computing device 220 detects a moving object, the vehicle on-board computing device 220 will generate one or more possible object trajectories for the detected object, and use the possible object trajectories to determine a vehicle trajectory for the AV. The vehicle on-board computing device 220 then performs operations to cause the AV to follow the defined vehicle trajectory. For example, the vehicle on-board computing device 220 uses the object trajectory information to decide what space has been occupied by the object, and then generates a vehicle trajectory in which the AV is not planned to travel to that space.
- LiDAR sensor system 264 of FIG. 2 may be the same as or substantially similar to the LiDAR system 300 . As such, the discussion of LiDAR system 300 is sufficient for understanding LiDAR sensor system 264 of FIG. 2 .
- the LiDAR system 300 includes a housing 306 which may be rotatable 360° about a central axis such as hub or axle.
- the housing may include an emitter and aperture(s) 312 made of a material transparent to light.
- multiple apertures for emitting and/or receiving light may be provided. Either way, the LiDAR system 300 can emit light through the aperture(s) 312 and receive reflected light back toward the aperture(s) 312 as the housing 306 rotates around the internal components.
- the outer shell of housing 306 may be a stationary dome, at least partially made of a material that is transparent to light, with rotatable components inside of the housing 306 .
- the light emitter system 304 may include any number of individual emitters (e.g., 8 emitters, 64 emitters, or 128 emitters). The emitters may emit light of substantially the same intensity or of varying intensities. The individual beams emitted by the light emitter system 304 will have a well-defined state of polarization that is not the same across the entire array. As an example, some beams may have vertical polarization and other beams may have horizontal polarization.
- the LiDAR system 300 will also include a light detector 308 containing a photodetector or array of photodetectors positioned and configured to receive light reflected back into the system.
- the light emitter system 304 and light detector 308 would rotate with the rotating shell, or they would rotate inside the stationary dome of the housing 306 .
- One or more optical element structures 310 may be positioned in front of the light emitter system 304 and/or the light detector 308 to serve as one or more lenses or wave plates that focus and direct light that is passed through the optical element structure(s) 310 .
- One or more optical element structures 310 may be positioned in front of a mirror 312 to focus and direct light that is passed through the optical element structure(s) 310 .
- the system includes optical element structure(s) 310 positioned in front of the mirror 312 and connected to the rotating elements of the system so that the optical element structure(s) 310 rotate(s) with the mirror 312 .
- the optical element structure(s) 310 may include multiple such structures (e.g., lenses and/or wave plates).
- multiple optical element structure(s) 310 may be arranged in an array on or integral with the shell portion of the housing 306 .
- each optical element structure 310 may include a beam splitter that separates light that the system receives from light that the system generates.
- the beam splitter may include, for example, a quarter-wave or half-wave wave plate to perform the separation and ensure that received light is directed to the receiver unit rather than to the emitter system (which could occur without such a wave plate as the emitted light and received light should exhibit the same or similar polarizations).
- the LiDAR system 300 will include a power unit 318 to power the light emitter system 304 , a motor 316 , and electronic components.
- the LiDAR system will also include an analyzer 314 with elements such as a processor 322 and non-transitory computer-readable memory 320 containing programming instructions that are configured to enable the system to receive data collected by the light detector 308 , analyze it to measure characteristics of the light received, and generate information that a connected system can use to make decisions about operating in an environment from which the data was collected.
- the analyzer 314 may be integral with the LiDAR system 300 as shown, or some or all of it may be external to the LiDAR system and communicatively connected to the LiDAR system via a wired or wireless communication network or link.
- Referring now to FIG. 4, there is provided an illustration of an illustrative architecture for a computing device 400.
- the computing device 110 of FIG. 1 and/or the vehicle on-board computing device 220 of FIG. 2 is/are the same as or similar to computing device 400 .
- the discussion of computing device 400 is sufficient for understanding the computing device 110 of FIG. 1 and the vehicle on-board computing device 220 of FIG. 2 .
- Computing device 400 may include more or fewer components than those shown in FIG. 4. However, the components shown are sufficient to disclose an illustrative solution implementing the present solution.
- the hardware architecture of FIG. 4 represents one implementation of a representative computing device configured to (i) train a machine learning algorithm or other algorithm and/or (ii) operate a vehicle using a trained machine learning algorithm, as described herein. As such, the computing device 400 of FIG. 4 implements at least a portion of the method(s) described herein.
- the hardware includes, but is not limited to, one or more electronic circuits.
- the electronic circuits can include, but are not limited to, passive components (e.g., resistors and capacitors) and/or active components (e.g., amplifiers and/or microprocessors).
- the passive and/or active components can be adapted to, arranged to and/or programmed to perform one or more of the methodologies, procedures, or functions described herein.
- the computing device 400 comprises a user interface 402 , a Central Processing Unit (CPU) 406 , a system bus 410 , a memory 412 connected to and accessible by other portions of computing device 400 through system bus 410 , a system interface 460 , and hardware entities 414 connected to system bus 410 .
- the user interface can include input devices and output devices, which facilitate user-software interactions for controlling operations of the computing device 400 .
- the input devices include, but are not limited to, a physical and/or touch keyboard 450 .
- the input devices can be connected to the computing device 400 via a wired or wireless connection (e.g., a Bluetooth® connection).
- the output devices include, but are not limited to, a speaker 452 , a display 454 , and/or light emitting diodes 456 .
- System interface 460 is configured to facilitate wired or wireless communications to and from external devices (e.g., network nodes such as access points, etc.).
- Hardware entities 414 perform actions involving access to and use of memory 412 , which can be a Random Access Memory (RAM), a disk drive, flash memory, a Compact Disc Read Only Memory (CD-ROM) and/or another hardware device that is capable of storing instructions and data.
- Hardware entities 414 can include a disk drive unit 416 comprising a computer-readable storage medium 418 on which is stored one or more sets of instructions 420 (e.g., software code) configured to implement one or more of the methodologies, procedures, or functions described herein.
- the instructions 420 can also reside, completely or at least partially, within the memory 412 and/or within the CPU 406 during execution thereof by the computing device 400 .
- the memory 412 and the CPU 406 also can constitute machine-readable media.
- machine-readable media refers to a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 420 .
- machine-readable media also refers to any medium that is capable of storing, encoding or carrying a set of instructions 420 for execution by the computing device 400 and that cause the computing device 400 to perform any one or more of the methodologies of the present disclosure.
- Referring now to FIG. 5 , there is provided a block diagram that is useful for understanding how movement of an autonomous robot (e.g., an AV) and/or other mobile platform may be achieved in accordance with the present solution. All of the operations performed in blocks 502 - 512 can be performed by the on-board computing device of the autonomous robot (e.g., AV 102 1 of FIG. 1 ) and/or other mobile platform.
- a location of the autonomous robot (e.g., AV 102 1 of FIG. 1 ) is detected. This detection can be made based on sensor data output from a location sensor (e.g., location sensor 260 of FIG. 2 ) of the autonomous robot. This sensor data can include, but is not limited to, GPS data.
- the detected location of the autonomous robot is then passed to block 506 .
- an object (e.g., vehicle 102 2 of FIG. 1 ) is detected within a given distance of the autonomous robot (e.g., <100+ meters). This detection is made based on sensor data output from camera(s) (e.g., camera(s) 262 of FIG. 2 ) of the autonomous robot or another device, and/or a LiDAR system (e.g., LiDAR sensor system 264 of FIG. 2 ) of the autonomous robot or another device.
- image processing is performed to detect an instance of an object of a certain class (e.g., a vehicle or pedestrian) in one or more images.
- LiDAR datasets are also processed to detect instances of objects of certain classes represented by point cloud data.
- Such sensor data processing can be achieved using machine learning algorithms that are trained based on tracks comprising cuboids generated/produced in accordance with the present solution.
- the machine learning algorithms are trained to detect patterns in images and/or LiDAR datasets which identify objects of a given classes (e.g., a vehicle or pedestrian). Any machine learning algorithm can be used here.
- one or more of the following machine learning algorithms is employed here: supervised learning; unsupervised learning; semi-supervised learning; and reinforcement learning.
- a predicted trajectory is determined in block 504 for the object.
- the object's trajectory is predicted in block 504 based on results of the machine learning algorithms (e.g., an object class), a cuboid geometry, a track (defined by cuboids over time), and/or contents of a map 518 (e.g., a road/terrain map 270 of FIG. 2 including information specifying sidewalk locations, lane locations, lane directions of travel, driving rules, etc.).
- the cuboid geometry is determined using the LiDAR dataset, images and/or the map 518 .
- Techniques for predicting object trajectories based on cuboid geometries are well known in the art. Any known or to be known technique for predicting object trajectories based on cuboid geometries can be used herein without limitation. For example, one technique involves predicting that the object is moving on a linear path in the same direction as the heading direction of the cuboid.
- the predicted object trajectories can include, but are not limited to, the following trajectories:
- the possible speed(s) and/or possible direction(s) of travel may be pre-defined for objects in the same class and/or sub-class as the object. It should be noted once again that the cuboid defines a full extent of the object and a heading of the object. The heading defines a direction in which the object's front is pointed, and therefore provides an indication as to the actual and/or possible direction of travel for the object.
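The enumeration of predicted trajectories described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the per-class speed table, the heading offsets, and the helper name are all hypothetical:

```python
import itertools

# Hypothetical per-class possible speeds (miles per hour) and heading offsets
# (degrees relative to the object's actual heading, toward the robot).
POSSIBLE_SPEEDS_MPH = {"pedestrian": [2, 4], "vehicle": [5, 10]}
POSSIBLE_HEADING_OFFSETS_DEG = [0, -40, 40]

def candidate_trajectories(obj_class, actual_speed_mph, actual_heading_deg):
    """Enumerate (speed, heading) pairs combining the object's actual motion
    with class-predefined alternative speeds and directions of travel."""
    speeds = [actual_speed_mph] + POSSIBLE_SPEEDS_MPH.get(obj_class, [])
    headings = [actual_heading_deg + off for off in POSSIBLE_HEADING_OFFSETS_DEG]
    return list(itertools.product(speeds, headings))

cands = candidate_trajectories("vehicle", 1, 270)  # 270 degrees = west
```

Each pair would then seed one candidate trajectory for the motion planner to consider.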
- Information 520 specifying the object's predicted trajectory and the cuboid geometry is provided to block 506 .
- a classification of the object is also passed to block 506 .
- a platform trajectory is generated using the information from blocks 502 and 504 .
- Techniques for determining a platform trajectory using cuboid(s) are well known in the art. Any known or to be known technique for determining a platform trajectory using cuboid(s) can be used herein without limitation.
- such a technique involves determining a trajectory for the autonomous robot that would pass the object when the object is in front of the autonomous robot, the cuboid has a heading direction that is aligned with the direction in which the autonomous robot is moving, and the cuboid has a length that is greater than a threshold value.
- the present solution is not limited to the particulars of this scenario.
- the platform trajectory 508 can be determined based on the location information from block 502 , the object detection information from block 504 , and/or map 518 (which may be pre-stored in a data store of the autonomous robot).
- the platform trajectory 508 may represent a smooth path that does not have abrupt changes that would otherwise cause passenger discomfort.
- the platform trajectory is defined by a path of travel along a given lane of a road in which the object is not predicted to travel within a given amount of time.
- the platform trajectory 508 is then provided to block 510 .
- a steering angle and velocity command is generated based on the platform trajectory 508 .
- the steering angle and velocity command is provided to block 512 for platform dynamics control, i.e., the steering angle and velocity command causes the autonomous robot to follow the platform trajectory 508 .
- Method 600 begins with 602 and continues with 604 where a mobile platform (e.g., an autonomous robot such as AV 102 1 of FIG. 1 ) performs operations to capture at least one image and/or generate at least one LiDAR dataset.
- the image(s) and/or LiDAR dataset(s) is(are) communicated in 606 to one or more computing devices (e.g., remote computing device 110 of FIG. 1 and/or vehicle on-board computing device 220 of FIG. 2 ).
- the LiDAR dataset(s) is(are) plotted on 3D graph(s) as shown by 608 .
- Each 3D graph has an x-axis, a y-axis and a z-axis with an origin defined at a center of a LiDAR sensor, the x-axis pointing forward and the z-axis pointing upward.
- An illustration of a LiDAR dataset 702 plotted on a graph 700 is provided in FIG. 7 .
- graph 700 only shows the 2D point of view from the x-axis and the z-axis for ease of illustration.
- Techniques for plotting LiDAR datasets on 3D graphs are well known in the art, and therefore will not be described here. Any known or to be known technique for plotting LiDAR datasets on 3D graphs can be used here.
- the image(s) and/or 3D graph(s) are used in 610 to detect an object that is located in proximity to the mobile platform. This detection can be made manually by an individual or automatically/automatedly by the computing device(s). In the manual scenarios, one or more individuals analyze the 3D graphs displayed on a screen of the computing device(s) to identify data points that appear to define an object. In the automatic/automated scenarios, the computing device(s) can employ any known or to be known algorithm to identify data points that appear to define an object. Machine learning algorithms can be used here to facilitate the object detection(s) and/or classification(s). Such machine learning algorithms are well known.
- Track(s) is(are) defined in 612 using an interactive tracker, the images, the LiDAR datasets and/or the 3D graphs. Operations of the interactive tracker will be discussed in detail below in relation to FIGS. 8 - 10 .
- Each track comprises a plurality of cuboids that are temporally arranged.
- Each cuboid comprises a 3D oriented bounded box that represents (i) a heading of the object (e.g., object 102 2 of FIG. 1 ), (ii) the full extent of the object (e.g., object 102 2 of FIG. 1 ), and/or the center/centroid of the object.
- the cuboid encompasses the LiDAR data points in the 3D graph that are associated with a given object.
- An illustration showing a cuboid 704 defined on a graph 700 is provided in FIG. 7 .
- data points of a LiDAR dataset 702 reside within the cuboid 704 .
- the edges 706 , 708 , 710 , 712 of the cuboid touch or are otherwise in contact with the data points of the LiDAR dataset 702 .
- the present solution is not limited to the particulars of this illustration.
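A minimal sketch of testing whether LiDAR points fall inside a heading-oriented cuboid, as the cuboid 704 encompasses the dataset 702 above. The function name and parameterization are assumptions for illustration; points are rotated into the cuboid's own frame so the containment test becomes axis-aligned:

```python
import math

def points_in_cuboid(points, center, length, width, height, heading_rad):
    """Count LiDAR points falling inside an oriented cuboid.
    Points (x, y, z) are rotated into the cuboid's frame, where the
    x-axis points along the cuboid's heading."""
    cx, cy, cz = center
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    inside = 0
    for (x, y, z) in points:
        dx, dy, dz = x - cx, y - cy, z - cz
        # rotate by -heading so the cuboid is axis-aligned in its own frame
        lx = cos_h * dx + sin_h * dy
        ly = -sin_h * dx + cos_h * dy
        if abs(lx) <= length / 2 and abs(ly) <= width / 2 and abs(dz) <= height / 2:
            inside += 1
    return inside
```

A coverage ratio computed this way could back the ">90% of the object's points" criterion mentioned later in the disclosure.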
- the cuboids of the track(s) and/or 3D graph(s) are then used in 614 to train a machine learning algorithm for object detection/classification purposes, motion prediction purposes, and/or motion planning purposes (e.g., to make predictions as to trajectories for objects).
- the track(s) may additionally or alternatively be used to train other algorithm(s) such as a track validation algorithm.
- Methods for training algorithms using track(s), cuboid(s) and 3D graphs are well known.
- the trained algorithm(s) may subsequently be used to facilitate movement-related operations of the mobile platform or another mobile platform (e.g., driving-related operations of AV 102 1 of FIG. 1 or joint control operations of an articulating arm).
- the trained algorithm(s) can be used for object detection and/or object classification as shown by 616 .
- an object's predicted trajectory may be determined based on results of the object detection/object classification operations.
- a trajectory for the mobile platform or the other mobile platform may be determined based on results of the object detection/object classification/object trajectory generation operations.
- the mobile platform or the other mobile platform may be caused to follow the trajectory as shown by 620 .
- 622 is performed where method 600 ends or other operations are performed (e.g., return to 602 ).
- operation 612 of FIG. 6 begins with 802 and continues with 804 where a LiDAR point cloud is selected at a given time (e.g., manually by an individual performing user-software interactions or automatically/automatedly by a computing device).
- the LiDAR point cloud is processed to remove or otherwise filter ground points therefrom.
- the ground points can be identified using content of a road/terrain map, a known sensor height, and/or a plane fitting algorithm. Road/terrain maps are well known.
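One of the options above, filtering by a known sensor height, can be sketched as follows. The tolerance value is an assumption, and real systems would combine this with map content or plane fitting:

```python
def remove_ground_points(points, sensor_height_m, tolerance_m=0.2):
    """Drop points near the ground plane. Points are (x, y, z) in the sensor
    frame, so a flat ground surface lies near z = -sensor_height_m."""
    ground_z = -sensor_height_m
    return [p for p in points if p[2] > ground_z + tolerance_m]

pts = [(1.0, 0.0, -1.8), (2.0, 1.0, 0.5), (3.0, -1.0, -1.75)]
non_ground = remove_ground_points(pts, sensor_height_m=1.8)
```

Only the point well above the ground plane survives the filter.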
- the LiDAR point cloud is input into one or more particle filter algorithms as shown by 806 .
- Each particle filter algorithm uses a set of particles (or samples) to represent a posterior distribution of a stochastic process given noisy and/or partial observations.
- each particle filter algorithm is configured to generate states of a given object over time, where each state is defined by a position, a velocity and a heading for the given object at a particular time.
- each particle filter algorithm is configured to generate a cuboid c for a plurality of sequential times t 0 , t 1 , . . . , t m .
- Each cuboid comprises a position, a velocity and a heading.
- the cuboids can be used to generate a set of states for the object.
- An illustration showing a set of cuboids 1300 generated by a particle filter algorithm is provided in FIG. 13 .
- the set 1300 comprises cuboids c 1 , c 2 , . . . , c 10 encompassing data points 1304 of LiDAR point cloud 1306 .
- the present solution is not limited in this regard.
- the set can include any number of cuboids in accordance with a given application. The manner in which the cuboids are generated by the particle filter algorithm(s) will become evident as the discussion progresses.
- a track T for a given object is defined in 810 using the cuboids generated in 808 .
- the track T is defined to comprise some or all cuboids c 1 , c 2 , . . . , c 10 shown in FIG. 13 .
- the track T may optionally be validated as shown by 812 . If the track is validated, then the track is stored for subsequent use in training machine learning algorithm(s) and/or other algorithm(s). In contrast, if the track is not validated, then the track may be discarded or stored in a manner (e.g., have a flag set to a given value) so that it will not be used to subsequently train the machine learning algorithm(s) and/or other algorithm(s).
- the track validation can be achieved in accordance with the process shown in FIG. 11 . More specifically, 812 can involve: using a point cloud accumulator in 1104 to validate the track and/or identify other track(s); and/or performing a temporal analysis in 1106 of the cuboids defining the track to detect whether any anomalies exist with regard to the cuboid sequence in time.
- the anomalies can include, but are not limited to, cases where a cuboid of the track is not surrounding LiDAR data points and/or cases where a heading of a cuboid in the track is different than an expected heading for the object.
- Point cloud accumulators are well known. They perform operations involving the accumulation of LiDAR data points over successive frames and/or cuboids in a track.
- the point cloud accumulator can validate a correctly tracked object by showing a clear "image" of the object. Fuzzy or smeared-out accumulated point clouds indicate a tracking failure.
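A rough sketch of the accumulator idea, assuming 2D poses for simplicity: each frame's object points are mapped into the cuboid's frame, and a crude spread metric stands in for the visual "clear vs. smeared" judgment. The function names and the spread measure are illustrative assumptions:

```python
import math

def accumulate_object_points(frames):
    """Each frame is (points, cuboid_pose) with pose (cx, cy, heading).
    Points are mapped into the object's frame; a well-tracked object
    accumulates into a tight, clear cloud across frames."""
    accumulated = []
    for points, (cx, cy, heading) in frames:
        c, s = math.cos(-heading), math.sin(-heading)
        for (x, y) in points:
            dx, dy = x - cx, y - cy
            accumulated.append((c * dx - s * dy, s * dx + c * dy))
    return accumulated

def spread(points):
    """Mean squared distance from the centroid - a crude smear measure."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    return sum((p[0] - mx) ** 2 + (p[1] - my) ** 2 for p in points) / n
```

A perfectly tracked object yields zero spread; a drifting track smears the accumulated cloud and raises the metric.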
- method 800 continues with 814 where track(s) for sub-sequence(s) of times can be generated by performing another iteration of at least 806 - 810 . Subsequently, 816 is performed where method 800 ends or other operations are performed (e.g., return to 806 for generation of a next track for the same or different object).
- Referring now to FIG. 9 , there is provided a flow diagram of an illustrative method that is useful for understanding how cuboids are generated in accordance with the particle filter algorithm(s) of the present solution.
- the process of FIG. 9 can be performed in block 808 of FIG. 8 .
- a particle is defined as a cuboid with a position, a velocity and a heading.
- the cuboid position can be expressed in two dimensions or three dimensions.
- the cuboid position comprises an x-axis coordinate for a center thereof center_x and a y-axis coordinate for the center thereof center_y.
- a z-axis coordinate is also provided for the center of the cuboid.
- the z-axis coordinate center_z can be determined as the height of the cuboid's center relative to a ground surface specified in a map (e.g., road/terrain map 270 of FIG. 2 ).
- the cuboid's velocity is defined by an x-axis coordinate velocity_x and a y-axis coordinate velocity_y.
- the heading is defined by an angle angle_h.
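Under the definitions above, a particle can be represented as follows; this is a sketch whose field names simply mirror the text:

```python
from dataclasses import dataclass

@dataclass
class Particle:
    """A particle is a cuboid with a position, a velocity and a heading."""
    center_x: float
    center_y: float
    center_z: float   # height of the cuboid's center relative to the ground surface
    velocity_x: float
    velocity_y: float
    angle_h: float    # heading angle

p = Particle(1.0, 2.0, 0.9, 3.0, 0.0, 0.0)
```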
- an initial cuboid c initial is generated using the LiDAR point cloud.
- the initial cuboid can be constructed in accordance with known or to be known techniques.
- the initial cuboid is constructed by: obtaining pre-defined cuboid dimensions (a length, a width, a height); and setting a center of a cuboid equal to a center of the LiDAR data points associated with the object detected in 610 .
- the cuboid can comprise a 3D shape that (i) encompasses a given percentage (e.g., >90%) of the LiDAR data points of an object and/or (ii) none or a minimal number of the LiDAR data points for other objects (but allowing for the inclusion of LiDAR data points for ground surface).
- the initial cuboid is constructed by: fusing the LiDAR point cloud, a vector map and a visual heading; and defining a cuboid along the visual heading with a highest likelihood.
- the vector map may contain a lane direction which provides a strong indication for a heading of the cuboid.
- the visual heading may be estimated for an object from camera image(s).
- Other operations may be performed such as: transforming the coordinates of the cuboid from a first coordinate system to a second different coordinate system; and/or adjusting the coordinates of the cuboid corners to have minimal values for encompassing a given number of LiDAR data points for the object (with a tolerance for outlier LiDAR data points).
- the first coordinate system may comprise a LiDAR system/sensor coordinate system, i.e., an xyz coordinate system having an origin of the three axes at a center of a LiDAR system/sensor center.
- the second coordinate system may comprise an xyz coordinate system having an origin of the three axes at a center of an object, the x-axis pointing forward (i.e., towards the heading of the object), and the z-axis pointing upward.
- Tolerance thresholds may need to be met. For example, 95% of all LiDAR data points for the object need to be included in the cuboid.
- the present solution is not limited to the particulars of these examples.
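The first construction described above (pre-defined cuboid dimensions, center set to the center of the object's LiDAR data points) can be sketched as follows; the dictionary representation is an assumption for illustration:

```python
def initial_cuboid(object_points, length, width, height):
    """Center a cuboid of pre-defined dimensions on the centroid of the
    LiDAR data points associated with the detected object."""
    n = len(object_points)
    cx = sum(p[0] for p in object_points) / n
    cy = sum(p[1] for p in object_points) / n
    cz = sum(p[2] for p in object_points) / n
    return {"center": (cx, cy, cz),
            "length": length, "width": width, "height": height}

cub = initial_cuboid([(0, 0, 0), (2, 2, 2)], length=4.5, width=2.0, height=1.6)
```

The alternative construction (fusing the point cloud, a vector map lane direction, and a camera-derived visual heading) would add a heading estimate on top of this centroid step.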
- an acceleration value a and an angular velocity value ω for each element of the set S are selected at multiple times t 0 , . . . , t g .
- the acceleration value can be defined by an x-axis value a x and y-axis value a y .
- These values a x , a y and ⁇ can be selected randomly using a random number generator (e.g., a Gaussian random number generator), a pseudo-random number generator and/or a chaotic number generator.
- Values may be selected for each of the times t 0 , . . . , t g (e.g., a initial-t0 , . . . , a initial-tg , and ω initial-t0 , . . . , ω initial-tg ).
- the present solution is not limited in this regard.
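Selecting the values with a Gaussian random number generator, as described above, might look like the following sketch; the sigma hyperparameter values are assumptions:

```python
import random

def sample_motion_values(n_times, sigma_ax=0.5, sigma_ay=0.5, sigma_w=0.1):
    """Draw an (a_x, a_y, omega) triple for each of n_times transition steps
    from zero-mean Gaussian distributions."""
    return [(random.gauss(0.0, sigma_ax),
             random.gauss(0.0, sigma_ay),
             random.gauss(0.0, sigma_w)) for _ in range(n_times)]

samples = sample_motion_values(5)
```

A pseudo-random or chaotic number generator could be substituted for `random.gauss` without changing the structure.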
- a set C of cuboids is created using set S, the acceleration value(s) (e.g., a initial-t0 , . . . , a initial-tg ) and the angular velocity value(s) (e.g., ω initial-t0 , . . . , ω initial-tg ).
- Each acceleration value specifies how a velocity of a cuboid in set S (e.g., c initial ) is to be changed to define another cuboid.
- Each angular velocity value specifies how the heading of the cuboid in set S (e.g., c initial ) is to be changed to define that other cuboid.
- each transition for the particle filter algorithm is given by acceleration a x (variance VAX), acceleration a y (variance VAY) and angular velocity ω (variance VAV).
- a x , a y and ⁇ may be modeled to come from a Gaussian distribution with zero mean.
- the variances VAX, VAY and VAV are hyperparameters.
- the angular velocity variance VAV can impart stiffness (resistance to turning) or freedom to turn. Two hyperparameter sets are chosen, one for turning and one for driving straight, with the straight set having a greater stiffness and the turning set being required for turns.
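One way to apply such a transition to a particle state is sketched below. The specific variance values in the two hyperparameter sets are assumptions, shown only to illustrate the stiff-straight versus free-turning distinction:

```python
import math

# Hypothetical hyperparameter sets: 'straight' is stiffer (small VAV),
# 'turning' permits larger heading changes.
HYPERPARAMS = {
    "straight": {"VAX": 0.25, "VAY": 0.25, "VAV": 0.01},
    "turning":  {"VAX": 0.25, "VAY": 0.25, "VAV": 0.25},
}

def transition(state, a_x, a_y, omega, dt):
    """Propagate (x, y, vx, vy, heading) one step: the accelerations change
    the velocity, and the angular velocity changes the heading."""
    x, y, vx, vy, h = state
    vx += a_x * dt
    vy += a_y * dt
    h = (h + omega * dt) % (2 * math.pi)
    return (x + vx * dt, y + vy * dt, vx, vy, h)

new_state = transition((0.0, 0.0, 1.0, 0.0, 0.0), a_x=0.5, a_y=0.0, omega=0.0, dt=1.0)
```

Each cuboid in set C corresponds to one such propagated state.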
- FIG. 12 shows LiDAR data being collected by a mobile platform 1200 (e.g., AV 102 1 of FIG. 1 ).
- the LiDAR data defines LiDAR point clouds 1202 .
- a cuboid 1204 is created by the particle filter algorithm.
- the cuboid 1204 can be included in set 1300 of FIG. 13 .
- Set C created in 910 can include set 1300 of FIG. 13 .
- the present solution is not limited to the particulars of these illustrations.
- a score l is determined for each cuboid of the set C.
- the score l indicates a likelihood that a cuboid is created given the acceleration value selected in 908 , the angular velocity value selected in 908 and the LiDAR point cloud input into the particle filter algorithm.
- the score l may comprise a sum of log likelihoods of a x , a y and v angular of all LiDAR data points in a given cuboid.
- if ten cuboids c 1 , c 2 , . . . , c 10 are in set C as shown in FIG. 13 , then ten scores l 1 , l 2 , . . . , l 10 are determined or computed in 912 .
- the likelihood values can be computed as follows: let the x-acceleration be chosen from Gaussian(0,sigma_a x ); let the y-acceleration be chosen from Gaussian(0,sigma_a y ); let the angular velocity be chosen from Gaussian(0,sigma_a v ); compute the log-likelihood of the x-acceleration, as a function of q, in accordance with mathematical equation log(Gaussian(0,sigma_a x )(q)); compute the log-likelihood of the y-acceleration, as a function of q, in accordance with mathematical equation log(Gaussian(0,sigma_a y )(q)); and compute the log-likelihood of the angular velocity, as a function of q, in accordance with mathematical equation log(Gaussian(0,sigma_a v )(q)).
- a final term of the log likelihood i.e., the likelihood of the
- An illustrative process for determining the score l is provided in FIG. 10 .
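The score computation per mathematical equation (1) can be sketched as follows, assuming zero-mean Gaussian distributions for the accelerations and the angular velocity as described above:

```python
import math

def gaussian_log_pdf(q, sigma):
    """Log of the zero-mean Gaussian density evaluated at q."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - q ** 2 / (2 * sigma ** 2)

def score(a_x, a_y, v_angular, sigma_ax, sigma_ay, sigma_av):
    """l = log L(a_x) + log L(a_y) + log L(v_angular), per equation (1)."""
    return (gaussian_log_pdf(a_x, sigma_ax)
            + gaussian_log_pdf(a_y, sigma_ay)
            + gaussian_log_pdf(v_angular, sigma_av))

l = score(0.0, 0.0, 0.0, 1.0, 1.0, 0.1)
```

Transitions with small accelerations and little turning score highest, which matches the intuition that objects usually move smoothly.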
- a maximum score l max is identified in 914 , which comprises the score (e.g., score l 2 ) with the highest or greatest value of all scores determined in 912 (e.g., scores l 1 , l 3 , . . . , l 10 ).
- the computing device performs operations to identify scores that are less than a number T defined by the following mathematical equation (2).
- T = l max − thr (2)
- thr represents a threshold value which may be pre-defined or dynamically determined based on certain criteria (e.g., a x , a y , v angular , a type of the particle filter algorithm, number of points in the LiDAR point cloud, number of LiDAR data points in the cuboids, etc.).
- cuboids are removed from the set C which are associated with the scores identified in 916 (i.e., the scores that are less than the maximum score minus a threshold value). For example, with reference to FIG. 13 , the set C is reduced from ten cuboids to eight cuboids.
- the present solution is not limited in this regard.
- the process can return to 910 so that the particle filter operations are repeated for each frame of LiDAR point cloud data. Subsequently, 922 is performed where the process ends, or other operations are performed.
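Steps 914 - 918 (identify l max, form T = l max − thr per equation (2), and drop the low-scoring cuboids) can be sketched as:

```python
def prune_cuboids(cuboids, scores, thr):
    """Keep cuboids whose score is at least l_max - thr (equation (2));
    cuboids scoring below that threshold are removed from set C."""
    l_max = max(scores)
    threshold = l_max - thr
    return [c for c, l in zip(cuboids, scores) if l >= threshold]

kept = prune_cuboids(["c1", "c2", "c3", "c4"], [-1.0, -0.2, -5.0, -0.3], thr=1.0)
```

Here l max is −0.2, so the threshold is −1.2 and only the cuboid scoring −5.0 is discarded before the next iteration.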
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Software Systems (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Medical Informatics (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Computing Systems (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Traffic Control Systems (AREA)
- Radar Systems Or Details Thereof (AREA)
- Train Traffic Observation, Control, And Security (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
- a trajectory defined by the object's actual speed (e.g., 1 mile per hour) and actual direction of travel (e.g., west);
- a trajectory defined by the object's actual speed (e.g., 1 mile per hour) and another possible direction of travel (e.g., south, south-west, or X (e.g., 40°) degrees from the object's actual direction of travel in a direction towards the autonomous robot) for the object;
- a trajectory defined by another possible speed for the object (e.g., 2-10 miles per hour) and the object's actual direction of travel (e.g., west); and/or
- a trajectory defined by another possible speed for the object (e.g., 2-10 miles per hour) and another possible direction of travel (e.g., south, south-west, or X (e.g., 40°) degrees from the object's actual direction of travel in a direction towards the autonomous robot) for the object.
l = log L(a x ) + log L(a y ) + log L(v angular ) (1)
T=l max −thr (2)
Claims (20)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/371,637 US11978259B2 (en) | 2021-07-09 | 2021-07-09 | Systems and methods for particle filter tracking |
DE112022002791.6T DE112022002791T5 (en) | 2021-07-09 | 2022-06-08 | PARTICLE FILTER TRACKING SYSTEMS AND METHODS |
CN202280048305.2A CN117795378A (en) | 2021-07-09 | 2022-06-08 | System and method for particle filter tracking |
PCT/US2022/072808 WO2023283511A2 (en) | 2021-07-09 | 2022-06-08 | Systems and methods for particle filter tracking |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/371,637 US11978259B2 (en) | 2021-07-09 | 2021-07-09 | Systems and methods for particle filter tracking |
Publications (2)
Publication Number | Publication Date |
---|---|
US20230012257A1 US20230012257A1 (en) | 2023-01-12 |
US11978259B2 true US11978259B2 (en) | 2024-05-07 |
Family
ID=84799398
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/371,637 Active 2042-08-18 US11978259B2 (en) | 2021-07-09 | 2021-07-09 | Systems and methods for particle filter tracking |
Country Status (4)
Country | Link |
---|---|
US (1) | US11978259B2 (en) |
CN (1) | CN117795378A (en) |
DE (1) | DE112022002791T5 (en) |
WO (1) | WO2023283511A2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116469041B (en) * | 2023-06-20 | 2023-09-19 | 成都理工大学工程技术学院 | Target object motion trail prediction method, system and equipment |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140368807A1 (en) * | 2013-06-14 | 2014-12-18 | Microsoft Corporation | Lidar-based classification of object movement |
US20190096086A1 (en) | 2017-09-22 | 2019-03-28 | Zoox, Inc. | Three-Dimensional Bounding Box From Two-Dimensional Image and Point Cloud Data |
US20190156569A1 (en) | 2017-11-17 | 2019-05-23 | Thales Canada, Inc. | Point cloud rail asset data extraction |
US20190180467A1 (en) * | 2017-12-11 | 2019-06-13 | Beijing Didi Infinity Technology And Development Co., Ltd. | Systems and methods for identifying and positioning objects around a vehicle |
US20190266741A1 (en) | 2018-02-23 | 2019-08-29 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for object detection using edge characteristics |
US20190317519A1 (en) | 2018-04-17 | 2019-10-17 | Baidu Usa Llc | Method for transforming 2d bounding boxes of objects into 3d positions for autonomous driving vehicles (advs) |
US20190355171A1 (en) | 2018-05-17 | 2019-11-21 | Denso Corporation | Surround monitoring system for vehicles |
US10621747B2 (en) | 2016-11-15 | 2020-04-14 | Magic Leap, Inc. | Deep learning system for cuboid detection |
US20200159225A1 (en) | 2018-11-16 | 2020-05-21 | Uber Technologies, Inc. | End-To-End Interpretable Motion Planner for Autonomous Vehicles |
US20200309923A1 (en) * | 2019-03-27 | 2020-10-01 | Panosense Inc. | Identifying and/or removing false positive detections from lidar sensor output |
US20210027546A1 (en) | 2019-07-22 | 2021-01-28 | Scale AI, Inc. | Techniques for labeling cuboids in point cloud data |
US20210149022A1 (en) | 2019-11-14 | 2021-05-20 | Toyota Research Institute, Inc. | Systems and methods for 3d object detection |
US20220222889A1 (en) | 2021-01-12 | 2022-07-14 | Toyota Research Institute, Inc. | Monocular 3d vehicle modeling and auto-labeling using semantic keypoints |
US11776215B1 (en) * | 2019-12-16 | 2023-10-03 | Scale AI, Inc. | Pre-labeling data with cuboid annotations |
-
2021
- 2021-07-09 US US17/371,637 patent/US11978259B2/en active Active
-
2022
- 2022-06-08 WO PCT/US2022/072808 patent/WO2023283511A2/en active Application Filing
- 2022-06-08 DE DE112022002791.6T patent/DE112022002791T5/en active Pending
- 2022-06-08 CN CN202280048305.2A patent/CN117795378A/en active Pending
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140368807A1 (en) * | 2013-06-14 | 2014-12-18 | Microsoft Corporation | Lidar-based classification of object movement |
US10621747B2 (en) | 2016-11-15 | 2020-04-14 | Magic Leap, Inc. | Deep learning system for cuboid detection |
US20190096086A1 (en) | 2017-09-22 | 2019-03-28 | Zoox, Inc. | Three-Dimensional Bounding Box From Two-Dimensional Image and Point Cloud Data |
US20190156569A1 (en) | 2017-11-17 | 2019-05-23 | Thales Canada, Inc. | Point cloud rail asset data extraction |
US20190180467A1 (en) * | 2017-12-11 | 2019-06-13 | Beijing Didi Infinity Technology And Development Co., Ltd. | Systems and methods for identifying and positioning objects around a vehicle |
US20190266741A1 (en) | 2018-02-23 | 2019-08-29 | Toyota Motor Engineering & Manufacturing North America, Inc. | System and method for object detection using edge characteristics |
US20190317519A1 (en) | 2018-04-17 | 2019-10-17 | Baidu Usa Llc | Method for transforming 2d bounding boxes of objects into 3d positions for autonomous driving vehicles (advs) |
US20190355171A1 (en) | 2018-05-17 | 2019-11-21 | Denso Corporation | Surround monitoring system for vehicles |
US20200159225A1 (en) | 2018-11-16 | 2020-05-21 | Uber Technologies, Inc. | End-To-End Interpretable Motion Planner for Autonomous Vehicles |
US20200309923A1 (en) * | 2019-03-27 | 2020-10-01 | Panosense Inc. | Identifying and/or removing false positive detections from lidar sensor output |
US20210027546A1 (en) | 2019-07-22 | 2021-01-28 | Scale AI, Inc. | Techniques for labeling cuboids in point cloud data |
US20210149022A1 (en) | 2019-11-14 | 2021-05-20 | Toyota Research Institute, Inc. | Systems and methods for 3d object detection |
US11776215B1 (en) * | 2019-12-16 | 2023-10-03 | Scale AI, Inc. | Pre-labeling data with cuboid annotations |
US20220222889A1 (en) | 2021-01-12 | 2022-07-14 | Toyota Research Institute, Inc. | Monocular 3d vehicle modeling and auto-labeling using semantic keypoints |
Non-Patent Citations (5)
Title |
---|
Deng, Z. et al., "Amodal Detection of 3D Objects: Inferring 3D Bounding Boxes from 2D Ones in RGB-Depth Images," IEEE Conf. on Computer Vision and Pattern Recognition (CVPR) 2017. |
For information about related patents and patent applications, see section 4 of the accompanying Information Disclosure Statement Letter. |
International Search Report and Written Opinion dated Feb. 13, 2023 for PCT/US2022/0728808 (8 pages). |
International Search Report and Written Opinion dated Jul. 27, 2022 in International Appln. No. PCT/US2022/071770 (11 pages). |
U.S. Appl. No. 17/241,637, filed Apr. 27, 2021, Systems and Methods for Producing Amodal Cuboids. |
Also Published As
Publication number | Publication date |
---|---|
DE112022002791T5 (en) | 2024-03-28 |
US20230012257A1 (en) | 2023-01-12 |
WO2023283511A2 (en) | 2023-01-12 |
CN117795378A (en) | 2024-03-29 |
WO2023283511A3 (en) | 2023-03-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230289999A1 (en) | | Methods and systems for joint pose and shape estimation of objects from sensor data |
US11657572B2 (en) | | Systems and methods for map generation based on ray-casting and semantic class images |
US20230123184A1 (en) | | Systems and methods for producing amodal cuboids |
US11978259B2 (en) | | Systems and methods for particle filter tracking |
US11977440B2 (en) | | On-board feedback system for autonomous vehicles |
EP4148599A1 (en) | | Systems and methods for providing and using confidence estimations for semantic labeling |
US20220032953A1 (en) | | Systems and methods for controlling vehicles using an amodal cuboid based algorithm |
US20220221585A1 (en) | | Systems and methods for monitoring lidar sensor health |
US11919546B2 (en) | | Systems and methods for estimating cuboids from LiDAR, map and image data |
EP4181089A1 (en) | | Systems and methods for estimating cuboid headings based on heading estimations generated using different cuboid defining techniques |
US20230003886A1 (en) | | Systems and methods for temporal decorrelation of object detections for probabilistic filtering |
US20240077615A1 (en) | | Systems and methods for convolutional high resolution lidar imaging |
US20230237793A1 (en) | | False track mitigation in object detection systems |
US20240069207A1 (en) | | Systems and methods for spatial processing of lidar data |
EP4145352A1 (en) | | Systems and methods for training and using machine learning models and algorithms |
US20240151817A1 (en) | | Systems and methods for static detection based amodalization placement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: ARGO AI, LLC, PENNSYLVANIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: PLAYER, KEVIN JAMES; REEL/FRAME: 056803/0702. Effective date: 20210707 |
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: ARGO AI, LLC; REEL/FRAME: 063025/0346. Effective date: 20230309 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| ZAAB | Notice of allowance mailed | Free format text: ORIGINAL CODE: MN/=. |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |