CN115485177A - Object speed and/or yaw for radar tracking

Object speed and/or yaw for radar tracking

Info

Publication number
CN115485177A
Authority
CN
China
Prior art keywords
yaw rate
determining
radar
data structure
data
Prior art date
Legal status
Pending
Application number
CN202180027842.4A
Other languages
Chinese (zh)
Inventor
A·M·邦吉奥卡尔曼
M·C·博赛
S·达斯
F·帕皮
J·钱
S·盛
王闯
Current Assignee
Zoox Inc
Original Assignee
Zoox Inc
Priority date
Filing date
Publication date
Application filed by Zoox Inc filed Critical Zoox Inc
Publication of CN115485177A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 - Systems of measurement based on relative movement of target
    • G01S13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/581 - Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets
    • G01S13/582 - Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets adapted for simultaneous range and velocity measurements
    • G01S13/589 - Velocity or trajectory determination systems; Sense-of-movement determination systems measuring the velocity vector
    • G01S13/66 - Radar-tracking systems; Analogous systems
    • G01S13/72 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726 - Multiple target tracking
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9321 - Velocity regulation, e.g. cruise control
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

Some radar sensors may provide a Doppler measurement that indicates the velocity of an object relative to the radar sensor. Techniques for determining a two-dimensional or multi-dimensional velocity from one or more radar measurements associated with an object may include determining a data structure that includes a yaw rate hypothesis and a set of weights for adjusting the influence of the yaw rate hypothesis. Determining the two-dimensional or multi-dimensional velocity may also include using the data structure as part of a regression algorithm to determine a velocity and/or yaw rate associated with the object.

Description

Object speed and/or yaw for radar tracking
Cross Reference to Related Applications
This PCT application claims priority to U.S. patent application No. 16/795,411, filed on February 19, 2020, which is incorporated herein by reference.
Background
An autonomous vehicle may use radar sensors to capture data about the environment through which the autonomous vehicle passes and use that data to detect objects in the environment in order to avoid collisions. Some radar sensors may provide Doppler measurements that indicate the velocity of an object relative to the radar sensor. However, a Doppler measurement only provides the velocity of the object relative to the radar sensor. In other words, the Doppler measurement is a one-dimensional velocity and does not specify the yaw rate of the object, which may be necessary for safe operation in certain environments, for example, for use by an autonomous vehicle.
Drawings
The embodiments are described with reference to the accompanying drawings. In the drawings, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference symbols in different drawings indicates similar or identical items.
FIG. 1 illustrates an example scenario in which an autonomous vehicle configured with a perception and tracking component may track a previous and current location, speed, and/or heading of an object in the environment surrounding the autonomous vehicle and generate a trajectory for controlling the autonomous vehicle based at least in part on the tracking.
FIG. 2 illustrates a block diagram of an example architecture of a system for determining two-or multi-dimensional velocity from radar data associated with an object.
FIG. 3 shows a pictorial flow diagram of an example process for determining a return associated with an object and determining a two- or more-dimensional speed and/or yaw rate from radar data associated with the object.
FIG. 4 shows a pictorial flow diagram of an example process for determining a two- or more-dimensional speed and/or yaw rate from radar data associated with an object.
FIG. 5 shows a pictorial flow diagram of an example process for determining a two- or more-dimensional speed and/or yaw rate from radar data associated with an object.
Detailed Description
Techniques described herein are directed to characterizing the movement of objects in an environment based on radar data. For example, the techniques may include determining one or more radar returns associated with an object in an environment of the radar sensor and determining a two-dimensional (or multi-dimensional) velocity and/or yaw rate of the object based at least in part on the returns. While many systems may benefit from the techniques described herein, an example system implementing the techniques of this disclosure may include an autonomous vehicle having one or more radar sensors (and/or sensors of other or different modalities). The techniques may additionally or alternatively include determining a tracking associated with the object. The tracking may include historical, current, and/or predicted velocity, position, yaw rate, orientation, acceleration, and the like, determined according to the techniques discussed herein. In some examples, the tracking may identify that an object detected in previous sensor data is the same as an object detected in current sensor data.
The techniques may include: receiving radar data associated with an environment; detecting an object in the environment based at least in part on the radar data and/or data from one or more other sensors; and identifying a subset of the radar data associated with the object. The subset of radar data may be used to determine a velocity associated with the object. For example, U.S. patent application No. 16/386,249, filed on April 16, 2019, and U.S. patent application No. 16/587,605, filed on September 30, 2019, both of which are incorporated herein in their entirety, discuss detecting objects based at least in part on radar data. In the discussion that follows, unless specifically stated otherwise, the speed determined by the techniques may include a two-dimensional or three-dimensional speed and/or a rotational speed, e.g., a yaw rate. In such examples, the speed may additionally or alternatively include one or more of a linear speed and a rotational speed.
In a first implementation of the techniques, determining the yaw rate and/or the speed may include receiving a subset of radar data associated with the object and determining whether a complete solution of a rigid body estimation exists based on the subset of radar data. For example, the rigid body estimation may include a complete set of constraints selected to reflect physical constraints, e.g., an assumption that an object such as a vehicle will not slide laterally. The complete solution may be a two-dimensional velocity and yaw rate that solves the rigid body estimation using linear, Cauchy, and/or reweighted regression while satisfying the complete (physical) constraints. In some examples, prior to determining the speed and/or yaw rate, the techniques may include removing outliers, for example, using a random sample consensus (RANSAC) algorithm.
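As a concrete illustration of the rigid body estimation described above, the following sketch solves for a two-dimensional velocity and yaw rate from the Doppler measurements of a subset of radar returns. It is illustrative only and assumes ordinary least squares; the Cauchy and reweighted variants, the lateral-slide constraint, and the outlier removal mentioned above are omitted, and the function and variable names are hypothetical.

    import numpy as np

    def rigid_body_velocity(points, dopplers, sensor_xy, center_xy):
        """Estimate (v_x, v_y, yaw_rate) for a rigid body from radar returns.

        points: (N, 2) positions of the returns associated with the object.
        dopplers: (N,) ground Doppler velocities measured along each line of sight.
        sensor_xy: (2,) radar sensor position.  center_xy: (2,) assumed center of rotation c.
        """
        points = np.asarray(points, dtype=float)
        los = np.asarray(sensor_xy, dtype=float) - points      # line-of-sight vectors toward the sensor
        los /= np.linalg.norm(los, axis=1, keepdims=True)       # unit vectors
        dx = points[:, 0] - center_xy[0]
        dy = points[:, 1] - center_xy[1]
        # Rigid body model: the velocity at point i is (v_x - w*dy_i, v_y + w*dx_i); its
        # projection onto the line of sight must match the measured Doppler velocity.
        A = np.column_stack([los[:, 0], los[:, 1], los[:, 1] * dx - los[:, 0] * dy])
        solution, *_ = np.linalg.lstsq(A, np.asarray(dopplers, dtype=float), rcond=None)
        v_x, v_y, yaw_rate = solution
        return v_x, v_y, yaw_rate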
If a complete solution does not exist (e.g., no solution satisfies the physical constraints), the first implementation may include determining whether the subset of the radar data satisfies each rule of a set of rules, where the set of rules is a more complex physical-constraint check. If all of the rules are satisfied, the first implementation may include determining a two-dimensional velocity associated with the object, but may not be able to determine the yaw rate. If any rule is not satisfied, the first implementation may include determining a one-dimensional velocity of the object based at least in part on the Doppler measurements indicated by the subset of radar data. In such a case, the yaw rate may not be available, which may be problematic for tracking the object, providing a prediction of future locations of the object, and the like.
In other words, according to the first implementation, the techniques may include attempting a rigid body estimation using the subset of radar points to obtain a solution that includes a speed and a yaw rate. However, there may be no solution to the rigid body estimation, since it may include a fully constrained multiple regression that does not converge or that yields a solution violating the constraints. When no solution is tractable, the first implementation may include checking whether a set of constraints is satisfied and, if so, determining the velocity based on a dual velocity determination, where the first determination is based on Doppler values associated with the radar points and the second determination is based on an assumption that the velocity is in the yaw direction. If either constraint is not satisfied, the first implementation may fall back to relying on the range rate (i.e., Doppler) associated with the object.
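A minimal sketch of the fallback cascade described in the preceding two paragraphs is shown below. The callables passed in are hypothetical stand-ins for the rigid body regression, the rule set, and the two fallback velocity determinations; they are not part of the application.

    def estimate_motion_first_implementation(radar_subset, solve_rigid_body,
                                             constraint_rules, two_d_velocity,
                                             doppler_velocity):
        """Fallback cascade: full solution -> rule check -> 2-D velocity -> 1-D Doppler."""
        solution = solve_rigid_body(radar_subset)        # fully constrained rigid body regression
        if solution is not None:
            return solution                              # two-dimensional velocity and yaw rate
        if all(rule(radar_subset) for rule in constraint_rules):
            return two_d_velocity(radar_subset)          # two-dimensional velocity, no yaw rate
        return doppler_velocity(radar_subset)            # one-dimensional range-rate fallback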
In additional or alternative embodiments, the techniques for determining a yaw rate and/or a speed associated with an object may include: receiving a set of radar data associated with the object; rejecting outliers of the set using a random sample consensus (RANSAC) technique to determine a subset of radar points; and generating two different data structures based at least in part on the subset of radar data. The first data structure may include a yaw rate hypothesis, and the second data structure may lack the yaw rate hypothesis. Both data structures may be used as observation variables in part of a regression algorithm for determining a two-dimensional (or multi-dimensional) velocity and a yaw rate associated with the object. In some examples, the techniques may include determining a first error associated with the first data structure and a second error associated with the second data structure. The yaw rate and/or speed associated with whichever data structure results in the lower of the two errors may be output by the system in association with the object.
For example, the techniques may include populating the first data structure with the subset of radar data and a yaw rate hypothesis based at least in part on a tracking associated with the object; the tracking may indicate previous, current, and/or predicted locations, speed, heading (which may include yaw), acceleration, and/or yaw rate of the object. The techniques may also include populating the second data structure with the subset of the radar data, but may exclude the yaw rate hypothesis. The techniques may include determining a first yaw rate and a first speed based at least in part on solving a linear regression using the first data structure, and determining a second yaw rate and a second speed based at least in part on solving a linear regression using the second data structure. A first error may be determined in association with the first data structure and a second error may be determined in association with the second data structure. The techniques may include selecting the yaw rate and/or speed associated with whichever of the first error and the second error is lower.
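One possible, purely illustrative reading of the two-data-structure approach is sketched below: the yaw rate hypothesis from the tracking is appended as an extra observation row in one linear system and omitted from the other, both systems are solved by linear regression, and the solution with the lower residual error over the radar returns is kept. All names, the row layout, and the error metric are assumptions.

    import numpy as np

    def rigid_body_rows(points, dopplers, sensor_xy, center_xy):
        """Build the rigid-body observation rows (same model as the earlier sketch)."""
        points = np.asarray(points, dtype=float)
        los = np.asarray(sensor_xy, dtype=float) - points
        los /= np.linalg.norm(los, axis=1, keepdims=True)
        dx = points[:, 0] - center_xy[0]
        dy = points[:, 1] - center_xy[1]
        A = np.column_stack([los[:, 0], los[:, 1], los[:, 1] * dx - los[:, 0] * dy])
        return A, np.asarray(dopplers, dtype=float)

    def solve_with_optional_yaw_hypothesis(points, dopplers, sensor_xy, center_xy,
                                           yaw_rate_hypothesis):
        A, b = rigid_body_rows(points, dopplers, sensor_xy, center_xy)
        candidates = []
        for use_hypothesis in (True, False):
            if use_hypothesis:
                A_i = np.vstack([A, [0.0, 0.0, 1.0]])       # extra row observing the yaw rate
                b_i = np.append(b, yaw_rate_hypothesis)     # hypothesis taken from the tracking
            else:
                A_i, b_i = A, b                             # second data structure: no hypothesis
            x, *_ = np.linalg.lstsq(A_i, b_i, rcond=None)
            error = float(np.sum((A @ x - b) ** 2))         # residual over the radar returns only
            candidates.append((error, x))
        return min(candidates, key=lambda c: c[0])[1]       # (v_x, v_y, yaw_rate) with lower error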
In some examples, the techniques may additionally or alternatively include fusing the first data structure and the second data structure into a single data structure that includes the yaw hypothesis, and determining the speed and/or yaw rate based at least in part on a weighted regression. The weighted regression may functionally increase or decrease the effect of the yaw rate hypothesis, and may effectively turn the yaw hypothesis off. The weighted regression may also be used to reject or reduce the weighting of outliers, so an outlier-rejection preprocessing operation (e.g., applying RANSAC to the subset of the radar data associated with the object) may be omitted in some examples.
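The fused, weighted variant could look roughly like the following sketch, in which the yaw rate hypothesis is a single pseudo-observation whose weight can amplify, attenuate, or effectively turn off the hypothesis, while an iteratively reweighted least squares loop down-weights outlying returns. The Cauchy-style weight, the scale, and the iteration count are illustrative choices, not values from the application.

    import numpy as np

    def weighted_rigid_body(A, b, yaw_rate_hypothesis, yaw_weight=1.0, iters=5, scale=1.0):
        """A: (N, 3) rigid-body rows; b: (N,) Doppler measurements (see earlier sketches)."""
        A_aug = np.vstack([A, [0.0, 0.0, 1.0]])          # pseudo-observation for the yaw hypothesis
        b_aug = np.append(b, yaw_rate_hypothesis)
        w = np.ones(len(b_aug))
        w[-1] = yaw_weight                                # 0.0 effectively turns the hypothesis off
        x = np.zeros(3)
        for _ in range(iters):
            sw = np.sqrt(w)
            x, *_ = np.linalg.lstsq(sw[:, None] * A_aug, sw * b_aug, rcond=None)
            r = A_aug @ x - b_aug
            w[:-1] = 1.0 / (1.0 + (r[:-1] / scale) ** 2)  # Cauchy-style down-weighting of outliers
            w[-1] = yaw_weight                            # keep the hypothesis weight fixed
        return x                                          # (v_x, v_y, yaw_rate)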
In some examples, the techniques may additionally or alternatively include determining one or more covariances associated with the selected data structure, and determining the yaw rate and/or the velocity may include providing the selected data structure and the one or more covariances to a Kalman filter. The Kalman filter may additionally or alternatively receive a tracking as an input and may update a state associated with the object indicated by the tracking. In examples where the first data structure, which includes the yaw hypothesis, is selected, the covariance may be used to weight the yaw hypothesis.
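For context, a standard Kalman filter measurement update of the kind alluded to above is sketched below, assuming the selected data structure yields a measurement z = [v_x, v_y, yaw_rate] with a measurement covariance R (the covariance discussed above), and the tracking carries a state estimate x with covariance P. The state layout and measurement matrix H are assumptions, and the prediction step is omitted.

    import numpy as np

    def kalman_update(x, P, z, R, H):
        """One measurement update of a linear Kalman filter."""
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new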
The techniques discussed herein may improve the safety of a vehicle by improving the vehicle's ability to predict the movement and/or behavior of objects around the vehicle. The techniques may increase the accuracy and/or usability of two-dimensional or multi-dimensional velocity detection based on radar data. For example, the techniques discussed herein are less likely, or unlikely, to produce the duplicate object-detection artifact that previous techniques could cause when a detected object rotates. Further, the techniques discussed herein eliminate the need for manually tuned physical-constraint rules, reduce or eliminate the need to test the system during setup, increase the ease of troubleshooting the system, and reduce the number of anomalous situations that can reduce the accuracy of the system and are not adequately reflected by manually tuned physical-constraint rules.
Example scenarios
FIG. 1 shows an example scenario 100 including a vehicle 102. In some cases, the vehicle 102 may be an autonomous vehicle configured to operate according to a Level 5 classification issued by the U.S. National Highway Traffic Safety Administration, which describes a vehicle capable of performing all safety-critical functions for the entire trip, with the driver (or occupant) not being expected to control the vehicle at any time. However, in other examples, the vehicle 102 may be a fully or partially autonomous vehicle having any other level or classification. It is contemplated that the techniques discussed herein may have application in areas other than robotic control, such as control of autonomous vehicles. For example, the techniques discussed herein may be applied to mining, manufacturing, augmented reality, and the like. Further, even though the vehicle 102 is depicted as a land vehicle, the vehicle 102 may be a spacecraft, a watercraft, and/or the like. In some examples, the vehicle 102 may be represented in a simulation as a simulated vehicle. For simplicity, the discussion herein does not distinguish between a simulated vehicle and a real-world vehicle. Thus, reference to a "vehicle" may refer to a simulated vehicle and/or a real-world vehicle.
In accordance with the techniques discussed herein, the vehicle 102 may receive sensor data from the sensors 104 of the vehicle 102. For example, the sensors 104 may include positioning sensors (e.g., Global Positioning System (GPS) sensors), inertial sensors (e.g., accelerometer sensors, gyroscope sensors, etc.), magnetic field sensors (e.g., compasses), position/velocity/acceleration sensors (e.g., speedometers, drive system sensors), depth position sensors (e.g., lidar sensors, radar sensors, sonar sensors, time-of-flight (ToF) cameras, depth cameras, ultrasonic sensors, and/or other depth-sensing sensors), image sensors (e.g., cameras), audio sensors (e.g., microphones), and/or environmental sensors (e.g., barometers, hygrometers, etc.).
The sensors 104 may generate sensor data that may be received by a computing device 106 associated with the vehicle 102. However, in other examples, some or all of the sensors 104 and/or computing devices 106 may be located separate from the vehicle 102 and/or remote from the vehicle 102, and data capture, processing, commands, and/or control may be communicated to/from the vehicle 102 via wired and/or wireless network communications through one or more remote computing devices.
The computing device 106 can include a memory 108 that stores a perception component 110, a planning component 112, and/or a tracking component 114. In some examples, the perception component 110 may include a radar component 116 configured to determine a yaw rate and/or a speed of an object based at least in part on radar data. The perception component 110, the planning component 112, the tracking component 114, and/or the radar component 116 can include one or more machine learning (ML) models and/or other computer-executable instructions.
In general, the perception component 110 can determine what is in the environment surrounding the vehicle 102, and the planning component 112 can determine how to operate the vehicle 102 based on information received from the perception component 110. For example, the planning component 112 may determine the trajectory 118 based at least in part on the perception data and/or other information (e.g., one or more maps, positioning information (e.g., a location of the vehicle 102 in the environment relative to the map and/or features detected by the perception component 110), etc.). Trajectory 118 may include instructions for a controller of autonomous vehicle 102 to actuate a drive assembly of vehicle 102 to achieve a steering angle and/or a steering rate, which may result in a vehicle position, a vehicle speed, and/or a vehicle acceleration. For example, the trajectory 118 may include a target heading, a target steering angle, a target steering rate, a target position, a target speed, and/or a target acceleration for the controller to track. In some examples, the controller may include software and/or hardware for actuating a drive assembly of vehicle 102 sufficient to track trajectory 118. For example, the controller may include one or more proportional-integral-derivative (PID) controllers.
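For illustration only, a proportional-integral-derivative controller of the kind mentioned above might track a single target value from the trajectory 118 (e.g., a steering angle or speed) roughly as follows; the gains and interface are placeholders, not the controller used by the vehicle 102.

    class PIDController:
        """Minimal PID sketch for tracking one target quantity from a trajectory."""

        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = None

        def step(self, target, measured, dt):
            error = target - measured
            self.integral += error * dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative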
In some examples, the perception component 110 may receive sensor data from the sensors 104 and determine data related to objects in the vicinity of the vehicle 102 (e.g., object classifications associated with detected objects, instance segmentation, semantic segmentation, two-and/or three-dimensional bounding boxes, tracking), route data specifying a vehicle destination, global map data identifying road characteristics (e.g., features detectable in different sensor modalities that are useful for locating an autonomous vehicle), local map data identifying characteristics detected in the vicinity of the vehicle (e.g., locations and/or dimensions of buildings, trees, fences, fire hydrants, parking markers, and any other features detectable in various sensor modalities), tracking data (e.g., environmental representations, object detection, and/or tracking as discussed herein), and the like.
In some examples, the perception component 110 may include a pipeline of hardware and/or software, which may include one or more GPUs, ML models, Kalman filters, and/or the like. In some examples, the perception component 110 can monitor as much of the environment around the autonomous vehicle as possible, which can be limited by sensor capabilities, objects and/or environmental obstructions (e.g., buildings, elevation changes, objects in front of other objects), and/or environmental influences such as fog, snow, and the like. For example, the sensor data may include radar data, which the perception component 110 may receive as input. The perception component 110 can be configured to detect as many objects and as much information about the environment as possible to avoid failing to account for events or object behaviors that should be considered by the planning component 112 in determining the trajectory 118.
In some examples, one or more components of the perception component 110 may determine a subset of radar data associated with an object (e.g., the vehicle 120 in the example scenario 100). Regardless, radar component 116 can receive at least a subset of the radar data associated with the object and can determine a two-dimensional (or multi-dimensional) velocity associated with the object in accordance with the techniques discussed herein. For example, the velocity 122 may be a velocity having at least two components, e.g., a lateral component, a longitudinal component, and/or a yaw component, relative to the coordinate system 124. The coordinate system may be a coordinate system oriented based on an environment, a road or path, an inertial direction of movement associated with the autonomous vehicle 102, and/or the like. The depicted coordinate system 124 may be relative to the pose of the autonomous vehicle.
In some examples, the velocities determined by the techniques discussed herein may be provided to a prediction component of the perception component 110 and/or the planning component 112. The prediction component can use the speed to predict a future state of the detected object, e.g., a predicted trajectory (e.g., predicted heading, predicted speed, predicted path), based at least in part on the speed. In additional or alternative examples, the speed may be provided to the planning component 112 for the planning component 112 to determine the trajectory 118 for controlling the autonomous vehicle 102. The planning component 112 may determine the trajectory 118 based at least in part on the predicted trajectory and/or the velocity determined by the radar component 116.
In some examples, radar component 116 may additionally or alternatively estimate a center of the detected object and/or a size and/or dimensions of the detected object based at least in part on a subset of radar data associated with the object.
The object classifications determined by the perception component 110 can distinguish between different object types, such as passenger cars, pedestrians, cyclists, delivery trucks, semi-trucks, traffic signs, and the like. A tracking may include historical, current, and/or predicted object positions, velocities, accelerations, and/or headings. The data generated by the perception component 110 may be collectively referred to as perception data, which may include the speed 122 and/or the yaw rate 126 determined according to the techniques discussed herein. Once the perception component 110 has generated the perception data, the perception component 110 can provide the perception data to the planning component 112. In some examples, the perception data may include the outputs of sensor-specific pipelines (e.g., vision, lidar, radar) and/or hybrid sensor pipelines (e.g., vision-lidar, radar-lidar).
Fig. 1 also depicts an example of an error caused by previous techniques. The first implementation discussed above may result in the assignment of two regions of interest (denoted as "dual regions of interest 128" in FIG. 1) to a single object. This problem is more likely to occur when a detected vehicle is turning or otherwise rotating about the yaw axis.
The tracking component 114 can receive sensor data and/or object detections from the sensors 104 and/or the perception component 110 and can determine whether to associate a previously generated tracking with a current object detection or to generate a new tracking associated with the current object detection. In other words, the tracking component 114 identifies whether a recently detected object has been detected previously and, if so, associates the existing tracking information with the detected object. The tracking component 114 can output a tracking associated with the detected object that identifies historical, current, and/or predicted location, heading, speed, acceleration, yaw rate, and perception data associated with the object (e.g., a region of interest, an occlusion state, e.g., whether all or part of the object is hidden from one or more sensors), and the like. In some examples, the tracking information may be included as part of the perception data output by the perception component 110.
In some examples, the tracking may additionally or alternatively include various current and/or previous data about the object that is useful for a planning component of the autonomous vehicle to predict the motion/behavior of the object and determine a trajectory and/or path for controlling the autonomous vehicle. For example, tracking may additionally or alternatively include an indication of an environmental region currently and/or previously occupied by the object, a classification of the object associated with the object (e.g., vehicle, oversize vehicle, pedestrian, rider), a current/or previous heading associated with the object, a current and/or previous velocity and/or acceleration of the object, and/or a current position and/or velocity of the object. For example, the tracking may associate a region of interest (ROI) (e.g., bounding box, mask, some other identification of the region of the environment occupied by the object) generated by the perception component with the same tracking. The ROI may be associated with a detected object, an object classification of the object, a heading of the object, a speed and/or acceleration of the object, an altitude of the object, and/or the like.
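A hypothetical sketch of the kind of per-object tracking record described in the preceding paragraphs is shown below; the field names and types are illustrative assumptions rather than the actual data structure.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class Track:
        track_id: int
        classification: str                                    # e.g., "vehicle", "pedestrian", "cyclist"
        positions: List[Tuple[float, float]] = field(default_factory=list)   # historical/current x, y
        headings: List[float] = field(default_factory=list)
        velocities: List[Tuple[float, float]] = field(default_factory=list)
        accelerations: List[Tuple[float, float]] = field(default_factory=list)
        yaw_rates: List[float] = field(default_factory=list)
        roi: Optional[Tuple[float, float, float, float]] = None  # region of interest (bounding box)
        occluded: bool = False                                    # occlusion state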
The planning component 112 may use the perception data received from the perception component 110 and/or the tracking component 114 to determine one or more trajectories, control the motion of the vehicle 102 to traverse a path or route, and/or otherwise control operation of the vehicle 102, although any such operation may be performed in various other components (e.g., positioning may be performed by a positioning component, which may be based at least in part on the perception data). For example, the planning component 112 may determine a route for the vehicle 102 from a first location to a second location; generate, substantially simultaneously and based at least in part on the perception data and/or simulated perception data (which may further include predictions regarding objects detected in such data), a plurality of potential trajectories for controlling the motion of the vehicle 102 according to a receding-horizon technique (e.g., 1 microsecond, half a second) to control the vehicle to traverse the route (e.g., in order to avoid any of the detected objects); and select one of the potential trajectories as the trajectory 118 of the vehicle 102, which may be used to generate a drive control signal that may be transmitted to the drive components of the vehicle 102. FIG. 1 depicts an example of such a trajectory 118, represented as an arrow indicating heading, speed, and/or acceleration, although the trajectory itself may include instructions for the controller, which may in turn actuate the drive system of the vehicle 102.
Example System
Fig. 2 illustrates a block diagram of an example system 200 that implements the techniques discussed herein. In some cases, the example system 200 may include a vehicle 202, which may represent the vehicle 102 in FIG. 1. In some cases, the vehicle 202 may be an autonomous vehicle configured to operate according to a Level 5 classification issued by the U.S. National Highway Traffic Safety Administration, which describes a vehicle capable of performing all safety-critical functions for the entire trip, with the driver (or occupant) not being expected to control the vehicle at any time. However, in other examples, the vehicle 202 may be a fully or partially autonomous vehicle having any other level or classification. Further, in some cases, the techniques described herein may also be used by non-autonomous vehicles.
The vehicle 202 may include a vehicle computing device 204, sensors 206, a transmitter 208, a network interface 210, and/or a drive component 212. The vehicle computing device 204 may represent the computing device 106 and the sensor 206 may represent the sensor 104. System 200 may additionally or alternatively include a computing device 214.
In some cases, the sensors 206 may represent the sensors 104 and may include lidar sensors, radar sensors, ultrasonic transducers, sonar sensors, positioning sensors (e.g., Global Positioning System (GPS), compass, etc.), inertial sensors (e.g., Inertial Measurement Unit (IMU), accelerometer, magnetometer, gyroscope, etc.), image sensors (e.g., Red Green Blue (RGB), infrared (IR), intensity, depth, time-of-flight cameras, etc.), microphones, wheel encoders, environmental sensors (e.g., thermometer, hygrometer, light sensors, pressure sensors, etc.), and so forth. The sensors 206 may include multiple instances of each of these or other types of sensors. For example, the radar sensors may include individual radar sensors located at the corners, front, rear, sides, and/or top of the vehicle 202. As another example, the camera sensors may include multiple cameras disposed at various locations around the exterior and/or interior of the vehicle 202. The sensors 206 may provide input to the vehicle computing device 204 and/or the computing device 214.
As described above, the vehicle 202 may also include a transmitter 208 for emitting light and/or sound. In this example, the transmitter 208 may include an internal audio and visual transmitter to communicate with the occupant of the vehicle 202. By way of example and not limitation, the internal transmitter may include: speakers, lights, signs, display screens, touch screens, tactile transmitters (e.g., vibration and/or force feedback), mechanical actuators (e.g., seat belt pretensioners, seat positioners, headrest positioners, etc.), and the like. In this example, the transmitter 208 may also include an external transmitter. By way of example and not limitation, the external transmitters in this example include lights for signaling direction of travel or other indicators of vehicle action (e.g., indicator lights, signs, light arrays, etc.), and one or more audio transmitters (e.g., speakers, speaker arrays, horns, etc.) for audibly communicating with pedestrians or other nearby vehicles, one or more of which include acoustic beam steering technology.
The vehicle 202 may also include a network interface 210, the network interface 210 enabling communication between the vehicle 202 and one or more other local or remote computing devices. For example, the network interface 210 may facilitate communication with other local computing devices on the vehicle 202 and/or the drive component 212. Also, the network interface 210 may additionally or alternatively allow the vehicle to communicate with other nearby computing devices (e.g., other nearby vehicles, traffic lights, etc.). The network interface 210 may additionally or alternatively enable the vehicle 202 to communicate with a computing device 214. In some examples, computing device 214 may include one or more nodes of a distributed computing system (e.g., a cloud computing architecture).
The network interface 210 may include a physical and/or logical interface for connecting the vehicle computing device 204 to another computing device or network (e.g., the network 216). For example, the network interface 210 may enable Wi-Fi based communications, e.g., via frequencies defined by the IEEE 802.11 standards, short-range wireless frequencies (e.g., Bluetooth), cellular communication (e.g., 2G, 3G, 4G LTE, 5G, etc.), or any suitable wired or wireless communication protocol that enables the respective computing device to interface with other computing devices. In some cases, the vehicle computing device 204 and/or the sensors 206 may transmit sensor data to the computing device 214 via the network 216 at a particular frequency, after a predetermined period of time has elapsed, in near real-time, and/or the like.
In some examples, the vehicle 202 may include one or more drive assemblies 212. In some examples, the vehicle 202 may have a single drive assembly 212. In some examples, the drive assembly 212 may include one or more sensors to detect conditions of the drive assembly 212 and/or the surroundings of the vehicle 202. By way of example and not limitation, the sensors of the drive assembly 212 may include: one or more wheel encoders (e.g., rotary encoders) to sense rotation of the wheels of the drive assembly; inertial sensors (e.g., inertial measurement units, accelerometers, gyroscopes, magnetometers, etc.) to measure orientation and acceleration of the drive assembly; cameras or other image sensors; ultrasonic sensors to acoustically detect objects in the environment surrounding the drive assembly; lidar sensors; radar sensors; and the like. Some sensors, such as the wheel encoders, may be unique to the drive assembly 212. In some cases, the sensors on the drive assembly 212 may overlap or supplement corresponding systems (e.g., the sensors 206) of the vehicle 202.
The drive assembly 212 may include any number of vehicle systems, including: a high voltage battery; an electric motor to propel the vehicle; an inverter to convert direct current from the battery into alternating current for use by other vehicle systems; a steering system including a steering motor and a steering rack (which may be electric); a braking system including hydraulic or electric actuators; a suspension system including hydraulic and/or pneumatic components; a stability control system to distribute braking force to mitigate loss of traction and maintain control; an HVAC system; lighting (e.g., headlights/taillights to illuminate the exterior environment of the vehicle); and one or more other systems (e.g., a cooling system, a safety system, an on-board charging system, other electrical components such as a DC/DC converter, a high voltage junction, a high voltage cable, a charging system, a charging port, etc.). Additionally, the drive assembly 212 may include a drive assembly controller that may receive and pre-process data from the sensors and control the operation of the various vehicle systems. In some examples, the drive assembly controller may include one or more processors and a memory communicatively coupled with the one or more processors. The memory may store one or more components to perform various functions of the drive assembly 212. In addition, the drive assembly 212 may also include one or more communication connections that enable the respective drive assembly to communicate with one or more other local or remote computing devices.
The vehicle computing device 204 may include one or more processors 218 and a memory 220 communicatively coupled to the one or more processors 218. The memory 220 may represent the memory 108. The computing device 214 may also include a processor 222 and/or a memory 224. The processor 218 and/or the processor 222 may be any suitable processor capable of executing instructions to process data and perform operations as described herein. By way of example, and not limitation, the processor 218 and/or the processor 222 may include one or more Central Processing Units (CPUs), Graphics Processing Units (GPUs), integrated circuits (e.g., Application Specific Integrated Circuits (ASICs)), gate arrays (e.g., Field Programmable Gate Arrays (FPGAs)), and/or any other device or portion of a device that processes electronic data to convert that electronic data into other electronic data that may be stored in registers and/or memory.
The memory 220 and/or the memory 224 may be examples of non-transitory computer-readable media. The memory 220 and/or the memory 224 may store an operating system and one or more software applications, instructions, programs, and/or data to implement the methods described herein and the functions attributed to the various systems. In various implementations, the memory may be implemented using any suitable memory technology (e.g., Static Random Access Memory (SRAM), Synchronous Dynamic RAM (SDRAM), non-volatile/flash-type memory, or any other type of memory capable of storing information). The architectures, systems, and individual elements described herein may include many other logical, programmatic, and physical components, of which those shown in the figures are merely examples relevant to the discussion herein.
In some instances, the memory 220 and/or the memory 224 may store a positioning component 226, a perception component 228, a planning component 230, a radar component 232, a map 234, and/or a system controller 236. Perception component 228 may represent perception component 110 and may include tracking component 114, planning component 230 may represent planning component 112, and/or radar component 232 may represent radar component 116.
In at least one example, the positioning component 226 may include hardware and/or software to receive data from the sensors 206 to determine a position, velocity, and/or orientation (e.g., one or more of an x-position, a y-position, a z-position, roll, pitch, or yaw) of the vehicle 202. For example, the positioning component 226 may include and/or request/receive a map 234 of the environment and may continuously determine a position, velocity, and/or orientation of the autonomous vehicle within the map 234. In some examples, the positioning component 226 may utilize SLAM (simultaneous localization and mapping), CLAMS (calibration, localization and mapping, simultaneously), relative SLAM, bundle adjustment, non-linear least squares optimization, and the like to receive image data, lidar data, radar data, IMU data, GPS data, wheel encoder data, and the like to accurately determine the position, pose, and/or velocity of the autonomous vehicle. In some examples, the positioning component 226 may provide data to various components of the vehicle 202 to determine an initial position of the autonomous vehicle 202 for generating a trajectory and/or for generating map data, as discussed herein. In some examples, the positioning component 226 may provide the position and/or orientation of the vehicle 202 relative to the environment and/or sensor data associated therewith to the perception component 228.
In some instances, the perception component 228 may include a primary perception system and/or a prediction system implemented in hardware and/or software. The perception component 228 may detect objects in the environment surrounding the vehicle 202 (e.g., identify that an object is present), classify the objects (e.g., determine an object type associated with a detected object), segment sensor data and/or other representations of the environment (e.g., identify a portion of the sensor data and/or a representation of the environment as being associated with a detected object and/or an object type), determine characteristics associated with an object (e.g., a tracking identifying current, predicted, and/or previous location, heading, speed, and/or acceleration associated with an object), and so forth. The data determined by the perception component 228 is referred to as perception data.
The planning component 230 may receive the location and/or orientation of the vehicle 202 from the positioning component 226 and/or the perception data from the perception component 228, and may determine instructions for controlling the operation of the vehicle 202 based at least in part on any of this data. In some examples, determining the instructions may include determining the instructions based at least in part on a format associated with the system with which the instructions are associated (e.g., a first instruction for controlling the motion of the autonomous vehicle may be formatted in a first format of messages and/or signals (e.g., analog, digital, pneumatic, kinematic) that the system controller 236 and/or the drive component 212 may parse/cause to be executed, and a second instruction for the transmitter 208 may be formatted according to a second format associated therewith).
The radar component 232 may operate on the vehicle 202 and/or the computing device 214. In some examples, the radar component 232 may be upstream of the planning component 230 in the pipeline (providing input to the planning component 230) and downstream of at least part of the perception component 228 (receiving input from those components). The radar component 232 may be configured to pass all, part, or none of its output to the tracking component and/or the planning component 230. In some examples, the radar component 232 may be part of the perception component 228. In some examples, the radar component 232 may determine a speed and/or a yaw rate according to the techniques discussed herein.
The memory 220 and/or the memory 224 may additionally or alternatively store a mapping system (e.g., generating a map based at least in part on sensor data), a planning system, a ride management system, and the like. While the positioning component 226, the perception component 228, the planning component 230, the radar component 232, the map 234, and/or the system controller 236 are illustrated as being stored in the memory 220, any of these components may include processor-executable instructions, machine-learning models (e.g., neural networks), and/or hardware, and all or part of any of these components may be stored in the memory 224 or configured as part of the computing device 214.
As described herein, the positioning component 226, the perception component 228, the planning component 230, the radar component 232, and/or other components of the system 200 may include one or more ML models. For example, the positioning component 226, the perception component 228, the planning component 230, and/or the radar component 232 may each include a different ML model pipeline. In some examples, an ML model may include a neural network. An exemplary neural network is a biologically inspired technique that passes input data through a series of connected layers to produce an output. Each layer in a neural network may also include another neural network, or may include any number of layers (whether convolutional or not). As can be understood in the context of this disclosure, a neural network may utilize machine learning, which may refer to a broad class of such algorithms in which an output is generated based on learned parameters.
Although discussed in the context of neural networks, any type of machine learning may be used consistent with this disclosure. For example, machine learning algorithms may include, but are not limited to, regression algorithms (e.g., ordinary least squares regression (OLSR), linear regression, logistic regression, stepwise regression, multivariate adaptive regression splines (MARS), locally estimated scatterplot smoothing (LOESS)), regularization algorithms (e.g., ridge regression, least absolute shrinkage and selection operator (LASSO), elastic net, least-angle regression (LARS)), decision tree algorithms (e.g., classification and regression trees (CART), iterative dichotomiser 3 (ID3), chi-squared automatic interaction detection (CHAID), decision stumps, conditional decision trees), Bayesian algorithms (e.g., naive Bayes, Gaussian naive Bayes, multinomial naive Bayes, averaged one-dependence estimators (AODE), Bayesian belief networks (BNN), Bayesian networks), clustering algorithms (e.g., k-means, k-medians, expectation maximization (EM), hierarchical clustering), association rule learning algorithms, artificial neural network algorithms (e.g., perceptron, back-propagation, Hopfield networks, radial basis function networks (RBFN)), deep learning algorithms (e.g., deep Boltzmann machines (DBM), deep belief networks (DBN), convolutional neural networks (CNN), stacked auto-encoders), dimensionality reduction algorithms (e.g., principal component analysis (PCA), principal component regression (PCR), partial least squares regression (PLSR), Sammon mapping, multidimensional scaling (MDS), projection pursuit, linear discriminant analysis (LDA), mixture discriminant analysis (MDA), quadratic discriminant analysis (QDA), flexible discriminant analysis (FDA)), ensemble algorithms (e.g., boosting, bootstrap aggregation (bagging), AdaBoost, stacked generalization (blending), gradient boosting machines (GBM), gradient boosted regression trees (GBRT), random forests), SVM (support vector machines), supervised learning, unsupervised learning, semi-supervised learning, etc. Other examples of architectures include neural networks such as ResNet-50, ResNet-101, VGG, DenseNet, PointNet, and the like. In some examples, the ML models discussed herein may include PointPillars, SECOND, top-down feature layers (see, e.g., U.S. patent application No. 15/963,833, which is incorporated herein in its entirety), and/or VoxelNet. Architectures directed at latency optimization may include MobileNetV2, ShuffleNet, ChannelNet, PeleeNet, and the like. In some examples, an ML model may include a residual block, such as PIXOR.
The memory 220 may additionally or alternatively store one or more system controllers 236, which may be configured to control steering, propulsion, braking, safety, transmitters, communications, and other systems of the vehicle 202. These system controllers 236 may communicate with and/or control corresponding systems of the drive assembly 212 and/or other components of the vehicle 202. For example, the planning component 230 may generate instructions based at least in part on the perception data generated by the perception component 228, and may verify the perception data and/or send the instructions to the system controller 236. The system controller 236 may control operation of the vehicle 202 based at least in part on instructions received from the planning component 230.
It should be noted that although fig. 2 is illustrated as a distributed system, in alternative examples, components of vehicle 202 may be associated with computing device 214 and/or components of computing device 214 may be associated with vehicle 202. That is, vehicle 202 may perform one or more functions associated with computing device 214, and vice versa.
Example procedure
Fig. 3 shows a pictorial flow diagram of an example process 300 for determining a two-dimensional or multi-dimensional velocity from radar data associated with an object based at least in part on a subset of the radar data associated with the object. In the examples described herein, the radar data may be obtained by one or more radar sensors (e.g., a radar sensor 302) disposed on a vehicle 304. In some implementations, the operations of the example process 300 may be accomplished by one or more components of a perception component (e.g., a radar component of the perception component) of an autonomous, semi-autonomous, or non-autonomous vehicle.
At operation 306, the example process 300 may include receiving radar data 308 from one or more radar sensors according to any of the techniques discussed herein. In the example shown, vehicle 304 may traverse the environment generally in the direction indicated by arrow 310 (although in other embodiments, vehicle 304 may be stationary or moving in different directions) such that radar sensor 302 is disposed at the front of vehicle 304, for example, to capture data about objects in front of vehicle 304. More specifically, radar sensor 302 may capture radar data 308, e.g., via one or more radar scans.
In some example embodiments, the radar data 308 may include location information indicative of the location of an object in the environment, e.g., a distance and an azimuth relative to the vehicle 304 and/or a location in a local or global coordinate system. The radar data 308 may also include signal strength information. For example, the signal strength information may be an indication of the type or composition of the surface that reflected the radio waves. In some instances, the signal strength may be a radar cross-section (RCS) measurement. The radar data 308 may also include velocity information, such as a range rate determined from Doppler measurements. For example, the velocity of the object may be based on the frequency of the radio energy reflected by the object and/or the time at which the reflected radio energy is detected. The radar data 308 may additionally or alternatively include sensor-specific information, including, but not limited to, an orientation of the sensor, e.g., a pose of the sensor relative to the vehicle, a pulse repetition frequency (PRF) or pulse repetition interval (PRI) of the sensor, a field of view or detection arc, and the like.
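A hedged sketch of the fields a single radar return might carry, per the paragraph above, is shown below; the names and units are assumptions for illustration.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class RadarReturn:
        range_m: float            # distance from the sensor to the reflecting surface
        azimuth_rad: float        # bearing relative to the sensor/vehicle
        doppler_mps: float        # range rate from the Doppler measurement
        rcs_dbsm: float           # signal strength, e.g., radar cross-section
        sensor_pose: Tuple[float, float, float]   # pose of the sensor relative to the vehicle
        timestamp_s: float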
At operation 312, the example process 300 may include determining an echo associated with the object based at least in part on the radar data according to any of the techniques discussed herein. The illustration includes a visualization 314 of the radar data 316. More specifically, the visualization 314 includes multiple representations of radar points, shown as circular points, that may represent echoes caused by objects, surfaces, or other reflective items in the environment of the vehicle 304. As shown, some of the circular points are labeled with reference numerals 316 (1), 316 (2), 316 (3), 316 (4), 316 (5), 316 (6). In this example, each of the points depicted as part of the visualization 314 may correspond to a location of a surface that reflects radio waves emitted by the radar device and detected at the radar sensor 302. As will be appreciated, radar sensor 302 may receive and/or generate several types of information about the environment, and thus visualization 314 may include more or less information; the visualization 314 is for illustration purposes only.
The visualization 314 shows that some of the points appear close in location and thus may be associated with a single object. For example, points 316 (2) -316 (6) are located very close (e.g., within a threshold distance), and in some cases, these points may be estimated to indicate a single object, e.g., object 318. For example, a data association component, as discussed in more detail in U.S. patent application Ser. No. 16/416,686, which is incorporated by reference in its entirety, may determine that points 316 (2) -316 (6) may be identified as a point cluster (point cluster) representing an object 318 in the environment surrounding vehicle 304. In examples described herein, and as discussed above, a cluster of points may include multiple points with some likelihood (e.g., a level of similarity and/or a degree of similarity) to identify a single object or group of objects that should be considered together, e.g., by a planning system of an autonomous vehicle. In aspects of the present disclosure, information other than location information may be used to determine the point clusters. Further, the data correlation component may consider historical information (e.g., tracking information of the vehicle 304 and/or the object 318) to properly correlate the radar returns with the object 318.
Thus, in embodiments of the present disclosure, operation 312 may determine the echoes representative of object 318, e.g., by clustering points based on one or more different types of information; sensor data from multiple scans and/or multiple radar sensors may be used to create the clusters. Operation 312 may result in determining, from the radar data of the one or more scans, a subset of radar data associated with object 318.
In some examples, operation 312 may also include performing a RANSAC algorithm to reject outliers in the radar data and/or the subset of radar data. By way of non-limiting example, the RANSAC method can reject echoes that were identified as part of the subset of radar data associated with object 318 but that are not actually from object 318. For example, the RANSAC method may reject echoes that are near the object 318 but originate from the ground. Of course, the present disclosure is not limited to using RANSAC for outlier rejection; other techniques may alternatively be used, including but not limited to iterative least squares. In some examples, the RANSAC operation may be based at least in part on an assumption about the motion of a detected object (e.g., a rigid body model), although this assumption may change based at least in part on a classification of the detected object (e.g., a classification determined based at least in part on other perception data, such as image and/or lidar data).
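A minimal sketch of RANSAC-style outlier rejection over Doppler echoes is shown below, assuming a rigid-body motion hypothesis and a known candidate center of rotation; the sampling scheme, tolerance, and helper names are illustrative assumptions rather than the specific implementation contemplated herein.

```python
import numpy as np

def doppler_row(px, py, ux, uy, cx, cy):
    """Row of a rigid-body Doppler model for one return (consistent with the linear model below)."""
    rx, ry = px - cx, py - cy
    return [ux, uy, rx * uy - ry * ux]

def ransac_reject(returns, center, iters=100, tol=0.3, seed=0):
    """Keep the largest consensus set under a rigid-body motion hypothesis.

    returns: list of (px, py, ux, uy, gdv) tuples, where (ux, uy) is the unit direction
    from the detection toward the sensor; center: assumed (cx, cy) center of rotation.
    Illustrative sketch only; thresholds and sampling are assumptions.
    """
    rng = np.random.default_rng(seed)
    A = np.array([doppler_row(px, py, ux, uy, *center) for px, py, ux, uy, _ in returns])
    y = np.array([gdv for *_, gdv in returns])
    best_inliers = np.zeros(len(returns), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(returns), size=3, replace=False)   # minimal sample
        sol, *_ = np.linalg.lstsq(A[idx], y[idx], rcond=None)   # fit [v_x, v_y, omega]
        inliers = np.abs(A @ sol - y) < tol                     # consensus by Doppler residual
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers  # boolean mask of echoes kept as part of the object
```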
At operation 320, the example process 300 may include determining a two-dimensional (or multi-dimensional) speed and/or yaw rate of the detected object based at least in part on a subset of radar data associated with the object, in accordance with any of the techniques discussed herein. The visualization 322 includes the vehicle 304 (with the radar sensor 302) and three points 316 (2), 316 (3), and 316 (5) associated with the object 318. The visualization 322 also shows a coordinate system 324, which coordinate system 324 can be oriented based at least in part on the pose of the vehicle 304 in the depicted example.
In the visualization 322, for clarity, point 316 (2) is represented as a first detection point d_0, point 316 (5) is represented as a second detection point d_1, and point 316 (3) is represented as a third detection point d_2. Although only three detection points from the radar data 308 are shown in the example, the techniques described herein may use any and all echoes associated with the object 318. As shown, the object 318 may have a center of rotation c, and the movement of the object 318 may be described by a velocity in the x-direction (e.g., velocity v_x), a velocity in the y-direction (e.g., velocity v_y) (at least in a Cartesian coordinate system, although other coordinate systems may be used), and a yaw rate ω about the center of rotation c. In an example, the center of rotation c may be arbitrarily selected, although in additional or alternative examples the center of rotation may be determined by the radar component and/or another perception component, and/or may be included as part of a tracking associated with the object 318.
In the example shown, the positions of the detection points d_0-d_2, the position of the radar sensor 302 (which generated the echo associated with the first detection point d_0), and the position of the center of rotation c are known or have been estimated, at least by the perception component. In addition, each of the first detection point d_0, the second detection point d_1, and the third detection point d_2 has an associated observed speed, e.g., a Doppler velocity. These speeds are denoted gdv_0, gdv_1, and gdv_2, respectively, in the example, where gdv stands for "ground Doppler velocity." Such a speed is the speed in the direction between a point and the sensor that sensed it. For example, gdv_0 is the speed along the direction from the first detection point d_0 to the radar sensor 302, gdv_1 is the speed along the direction from the second detection point d_1 to the radar sensor 302, and gdv_2 is the speed along the direction from the third detection point d_2 to the radar sensor 302. As will be appreciated, the velocities gdv_0, gdv_1, and gdv_2 indicate that the object 318 is moving counterclockwise about the center of rotation c.
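To make this geometry concrete, the sketch below evaluates the rigid-body relationship between a planar velocity, a yaw rate about a center of rotation, and the ground Doppler velocity that would be observed at each detection point; the coordinates and motion values are illustrative assumptions.

```python
import numpy as np

def ground_doppler(point, sensor, center, v, omega):
    """Expected ground Doppler velocity at `point` for a rigid body.

    The point's velocity is v + omega x r (with r measured from the center of
    rotation), projected onto the unit vector from the point to the sensor.
    All inputs are 2-D tuples/arrays in a common Cartesian frame.
    """
    point, sensor, center, v = map(np.asarray, (point, sensor, center, v))
    r = point - center
    point_vel = v + omega * np.array([-r[1], r[0]])   # omega x r in 2-D
    u = (sensor - point) / np.linalg.norm(sensor - point)
    return float(point_vel @ u)

# Illustrative numbers: object translating +x at 2 m/s while yawing counterclockwise at 0.5 rad/s.
center = (10.0, 5.0)
sensor = (0.0, 0.0)
for d in [(9.0, 4.0), (11.0, 4.5), (10.5, 6.0)]:
    print(d, round(ground_doppler(d, sensor, center, v=(2.0, 0.0), omega=0.5), 3))
```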
The techniques discussed herein may include determining a two-dimensional (or multi-dimensional) speed (e.g., the speed depicted in fig. 3) and/or a yaw rate (as discussed in more detail in fig. 4).
Fig. 4 shows a pictorial flow diagram of an example process 400 for determining two-dimensional or multi-dimensional velocity from radar data associated with an object based at least in part on a subset of the radar data associated with the object. For example, a subset of radar data may be determined based at least in part on operation 306 and/or operation 312. The example process may additionally or alternatively determine a center and/or a size/dimension associated with the object.
At operation 402, the example process 400 may include receiving radar data associated with an object in accordance with any of the techniques discussed herein. For example, the radar data may be a subset of n radar points determined at operation 306 and/or operation 312, where n is a positive integer. A point d_n of the radar data may include, for example, the position of the detected surface, a Doppler velocity (gdv_n), a vector indicating the length and direction between the detection and the radar sensor, an RCS value, a rotation angle θ_n that may be determined based at least in part on the direction vector, and the like. Additionally or alternatively, the perception component may provide a tracking associated with the object to the radar component, which may include a previous yaw α and/or yaw rate, an estimated center, a position, and/or the like. In some examples, operation 402 may additionally or alternatively include running an outlier rejection algorithm to further prune the radar data. For example, the outlier rejection algorithm may include a RANSAC algorithm, an M-estimator, and the like.
At operation 404, the example process 400 may include determining a first data structure based at least in part on the radar data and/or the tracking according to any of the techniques discussed herein. Operation 404 may include populating the data structure with data indicated by the n radar points. For example, the first data structure may include the following matrix:

$$A_{1}=\begin{bmatrix}\cos\theta_{1} & \sin\theta_{1} & r_{1,x}\sin\theta_{1}-r_{1,y}\cos\theta_{1}\\ \vdots & \vdots & \vdots\\ \cos\theta_{n} & \sin\theta_{n} & r_{n,x}\sin\theta_{n}-r_{n,y}\cos\theta_{n}\end{bmatrix}\qquad(1)$$

wherein:

$$r_{i}=d_{i}-c\qquad(2)$$

denotes the position of the i-th detection relative to the center of rotation c, and θ_i may include a rotation angle associated with the Doppler velocity of the i-th radar observation d_i (e.g., a smoothed rotation angle determined from the direction vector). Note that each row of equation (1) may include the lateral contribution of the two- or multi-dimensional velocity to the ground Doppler echo, the longitudinal contribution, and the yaw rate contribution.
At operation 406, the example process 400 may include determining a second data structure based at least in part on the radar data and/or the tracking according to any of the techniques discussed herein. Operation 406 may include populating the data structure with data indicated by the n radar points and tracking data associated with the object. For example, the second data structure may include the following matrix:

$$A_{2}=\begin{bmatrix}\cos\theta_{1} & \sin\theta_{1} & r_{1,x}\sin\theta_{1}-r_{1,y}\cos\theta_{1}\\ \vdots & \vdots & \vdots\\ \cos\theta_{n} & \sin\theta_{n} & r_{n,x}\sin\theta_{n}-r_{n,y}\cos\theta_{n}\\ \sin\alpha & -\cos\alpha & 0\end{bmatrix}\qquad(3)$$

The second data structure may be identical to the first data structure, except that the second data structure may additionally include a yaw hypothesis, indicated by the last row of equation (3). In other words, the second data structure includes a term that imposes an additional constraint on yaw based on the hypothesized yaw α: the corresponding "measurement" inserted into the subset of radar values is 0, so that the lateral velocity and the longitudinal velocity are related by the angle α. The data structures indicated by equations (1) and (3) may be parameters of a linear or non-linear (e.g., least squares) regression or a robust regression algorithm. For example, a linear model that may form the basis of a linear regression or a robust regression may be given by the following equation:
Ax=Y (4)
wherein:

$$x=\begin{bmatrix}v_{x}\\ v_{y}\\ \omega\end{bmatrix}\qquad(5)$$

$$Y=\begin{bmatrix}gdv_{1}\\ \vdots\\ gdv_{n}\end{bmatrix}\qquad(6)$$

with Y augmented by a final 0, corresponding to the yaw-hypothesis row, when the second data structure of equation (3) is used.
Note that Y represents the observed radar values, i.e., the ground Doppler velocities in the given example, and x represents the velocity components that the system is determining. Further, even though the above equations and discussion relate to determining a velocity associated with an object, the example process 400 may additionally or alternatively include determining a center and/or a size/dimensions associated with the object. For example, the first data structure and/or the second data structure may further include a hypothesis of a center and/or size/dimensions of the object, which may be based at least in part on the center and/or size/dimensions indicated by a previous tracking associated with the object. If the object has not been detected or a tracking operation has not been performed in a previous frame, the center and/or size/dimensions may be initialized based at least in part on perception data (e.g., a classification) associated with the object, or a default initialization may be used.
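The following sketch illustrates, with synthetic values, how the two regressions described above might be set up and solved under the reconstructed forms of equations (1) and (3)-(6); a plain least-squares solver is used for brevity, whereas the disclosure also contemplates robust and non-linear regression, and the unit direction vectors, yaw hypothesis, and measurements are assumptions.

```python
import numpy as np

def build_rows(points, dirs, center):
    """Rows [cos(theta_i), sin(theta_i), r_x*sin(theta_i) - r_y*cos(theta_i)] of A x = Y."""
    rows = []
    for (px, py), (ux, uy) in zip(points, dirs):
        rx, ry = px - center[0], py - center[1]
        rows.append([ux, uy, rx * uy - ry * ux])
    return np.array(rows)

def solve(A, Y):
    """Plain least-squares solve; returns the estimate and a residual-based error."""
    x = np.linalg.lstsq(A, Y, rcond=None)[0]
    return x, float(np.sum((A @ x - Y) ** 2))

# Synthetic subset of returns associated with one object (values are illustrative and
# roughly consistent with v_x ~ 2 m/s, v_y ~ 0, omega ~ 0.5 rad/s about the center).
points = [(9.0, 4.0), (11.0, 4.5), (10.5, 6.0), (9.5, 5.5)]
dirs = [(-0.91, -0.41), (-0.93, -0.38), (-0.87, -0.50), (-0.87, -0.50)]  # unit vectors toward the sensor
center = (10.0, 5.0)
gdv = np.array([-2.08, -2.27, -1.43, -1.39])

A1 = build_rows(points, dirs, center)                        # first data structure, eq. (1)
alpha = 0.05                                                 # yaw hypothesis, e.g., from the tracking
A2 = np.vstack([A1, [np.sin(alpha), -np.cos(alpha), 0.0]])   # second data structure, eq. (3)
Y2 = np.append(gdv, 0.0)                                     # yaw-hypothesis "measurement" is 0

(x1, e1), (x2, e2) = solve(A1, gdv), solve(A2, Y2)
chosen = x1 if e1 <= e2 else x2   # operation 412: keep the lower-error candidate
print("v_x, v_y, omega:", chosen, "errors:", e1, e2)
```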
At operation 408, the example process 400 may include determining a first speed, a first yaw rate, and/or a first error according to any of the techniques discussed herein. For example, the first error may be determined based at least in part on a residual determined by the expression Ax-Y. Operation 408 may include solving equation (4) based at least in part on the first data structure given by equation (1).
At operation 410, the example process 400 may include determining a second speed, a second yaw rate, and/or a second error according to any of the techniques discussed herein. Operation 410 may include solving equation (4) based at least in part on the second data structure given by equation (3). For operation 408 and/or operation 410, solving equation (4) may include a linear regression. The linear regression may determine errors associated with one or more of the radar points (e.g., a row of the matrix) and/or the yaw hypothesis (e.g., the last row of the second data structure). In some examples, a loss function may be part of the regression, and the first error and/or the second error may be determined based at least in part on a total error determined by the regression algorithm. In some examples, the errors discussed herein may not be associated with a failure of the underlying sensor system, but may instead indicate a likelihood that the velocity associated therewith is correct. For example, the error may be a posterior probability.
In some examples, operation 408 and/or operation 410 may additionally or alternatively include determining a covariance associated with the first speed and the first yaw rate and/or the second speed and the second yaw rate, respectively. For example, the covariance associated with speed and yaw may be given by:
$$\mathrm{cov}[x,x]=\left(A^{T}\,\mathrm{cov}[Y,Y]^{-1}A\right)^{-1}\qquad(7)$$

wherein:

$$\mathrm{cov}[Y,Y]=\mathrm{diag}(var_{1},\ldots,var_{n})\qquad(8)$$

for equation (1), or

$$\mathrm{cov}[Y,Y]=\mathrm{diag}(var_{1},\ldots,var_{n},var_{n+1})$$

for equation (3), where the final entry is associated with the yaw-hypothesis row.
The values var_n may be static coefficients or, in additional or alternative examples, var_n may be a variance, e.g., a range-rate measurement variance. In some examples, the value of var_n may be set based at least in part on a detected environment of the vehicle. For example, if the perception component determines that the vehicle 304 is on a highway, the radar component may reduce the value of var_n compared to detecting that the vehicle 304 is traveling on a downtown street.
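As an illustration of equation (7) with a diagonal cov[Y, Y], the sketch below computes the estimate covariance for two assumed variance settings (e.g., a highway scene versus a downtown scene); the matrix entries and variance values are synthetic.

```python
import numpy as np

def estimate_covariance(A, var_per_row):
    """cov[x, x] = (A^T cov[Y, Y]^{-1} A)^{-1}, with cov[Y, Y] diagonal (eq. (7))."""
    cov_Y_inv = np.diag(1.0 / np.asarray(var_per_row))
    return np.linalg.inv(A.T @ cov_Y_inv @ A)

# Synthetic 3-return data structure (rows as in equation (1)).
A = np.array([[0.99, 0.10, 0.8],
              [0.97, 0.22, -0.5],
              [0.95, 0.31, 0.3]])
# Smaller range-rate variance might be used when the perception component reports a highway scene.
var_highway = [0.05, 0.05, 0.05]
var_downtown = [0.20, 0.20, 0.20]
print(estimate_covariance(A, var_highway))
print(estimate_covariance(A, var_downtown))
```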
At operation 412, the example process 400 may include selecting the first speed and/or the first yaw rate or the second speed and/or the second yaw rate based at least in part on the first error and the second error, according to any of the techniques discussed herein. For example, the regression algorithm may determine a total error as part of determining the speed during operation 408 and/or operation 410. Operation 412 may include selecting whichever speed is associated with the lower of the two errors. In some examples, one of the two data structures may result in a system of equations that is unsolvable or has multiple solutions, in which case operation 412 may include selecting the speed associated with the solvable system. For example, the matrix may be non-invertible, the error may diverge numerically during the regression, and/or the solution may violate a constraint. In some examples, the selected speed may be associated with the tracking associated with the object.
At operation 414, the example process 400 may include determining a final speed in accordance with any of the techniques discussed herein. Determining the final speed may be based at least in part on the selected speed and/or the error. For example, the speeds determined at operations 408 and 410 (and the selected speed) may be suggested or candidate speeds, and the speed determined at operation 414 may, in some examples, be determined with greater accuracy. Operation 414 may be accomplished through a Bayesian filter (e.g., a Kalman filter) and/or another estimation algorithm. In some examples, determining the velocity using one of the data structures may include: determining a covariance according to equation (7) based on that data structure; determining an estimated speed and/or an estimated yaw rate using a regression algorithm based at least in part on equations (4)-(6); and determining the final speed and/or a final yaw rate using a Kalman filter. The Kalman filter may receive as inputs the covariance and the estimated speed and/or the estimated yaw rate and determine the final speed and/or final yaw rate based thereon. In addition to or instead of the covariance, the Kalman filter may receive a noise signal associated with the radar data. In some examples, the Kalman filter may receive at least a portion of the tracking associated with the object or data associated therewith, e.g., a previous position, speed, yaw rate, etc., of the object. In some examples, operation 414 may be optional, or may be additional or alternative to operation 412.
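One common way to realize the fusion described above is a standard Kalman measurement update that combines the candidate estimate and its covariance with a prior taken from the tracking; the state ordering, prior values, and noise terms below are illustrative assumptions rather than the tracker's actual configuration.

```python
import numpy as np

def kalman_update(x_prior, P_prior, z, R):
    """Kalman measurement update with an identity measurement model.

    x_prior, P_prior: prior state [v_x, v_y, omega] and covariance, e.g., from the tracking.
    z, R: candidate estimate from the regression and its covariance, e.g., from eq. (7).
    """
    H = np.eye(len(x_prior))
    S = H @ P_prior @ H.T + R                      # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post

x_prior = np.array([1.8, 0.1, 0.4])                # previous speed / yaw rate from the tracking
P_prior = np.diag([0.5, 0.5, 0.2])
z = np.array([2.1, -0.05, 0.55])                   # candidate estimate from the regression
R = np.diag([0.1, 0.1, 0.05])                      # covariance per equation (7)
x_final, P_final = kalman_update(x_prior, P_prior, z, R)
print("final v_x, v_y, omega:", x_final)
```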
At operation 416, the example process 400 may include controlling the autonomous vehicle based at least in part on the speed and/or the error according to any of the techniques discussed herein. The speed may be the selected speed determined at operation 412 and/or the final speed determined at operation 414. In some cases, operation 416 may additionally or alternatively be based at least in part on a yaw rate.
Fig. 5 shows a pictorial flow diagram of an example process 500 for determining a two-dimensional or multi-dimensional velocity from radar data associated with an object based at least in part on a subset of the radar data associated with the object. For example, the subset of radar data may be determined based at least in part on operation 306 and/or operation 312. In some examples, the example process 500 may be deterministic, since it may eliminate the use of RANSAC (which is randomized). In some cases, the example process 500 may be faster than the example process 400, at least because the example process 500 includes solving one system while the example process 400 may include solving two systems.
At operation 502, the example process 500 may include receiving radar data associated with an object in accordance with any of the techniques discussed herein. For example, operation 502 may include operation 402.
At operation 504, the example process 500 may include determining a data structure based at least in part on the radar data and the set of weights in accordance with any of the techniques discussed herein. Operation 504 may include populating the data structure with data indicated by the n radar points and tracking data associated with the object. For example, the data structure may include the following matrix:

$$A_{w}=\begin{bmatrix}\beta_{1}\cos\theta_{1} & \beta_{1}\sin\theta_{1} & \beta_{1}\left(r_{1,x}\sin\theta_{1}-r_{1,y}\cos\theta_{1}\right)\\ \vdots & \vdots & \vdots\\ \beta_{n}\cos\theta_{n} & \beta_{n}\sin\theta_{n} & \beta_{n}\left(r_{n,x}\sin\theta_{n}-r_{n,y}\cos\theta_{n}\right)\\ \beta_{n+1}\sin\alpha & -\beta_{n+1}\cos\alpha & 0\end{bmatrix}\qquad(9)$$

wherein β_i is the weight applied to the i-th row and the remaining terms are as defined for equations (1)-(3).
in some examples, the set of weights may include a different weight β for each row i (where i is the row/radar point number), although the weighting β for each column may also be determined i,j Where j ranges from the first column to the third column. Equation 9 is constructed according to the previous example. This set of weights may be used to maintain, increase or decrease the effect of the row on the speed determination at operation 508, which may correspond to the first n rows of data points in the speed determination and/or the (n + 1) th weight corresponding to the last row of the matrix given at (9) above may be used to turn on/off or increase or decrease the effect of the yaw hypothesis given by the last row of the matrix.
At operation 506, the example process 500 may include determining a set of weights based at least in part on the radar data according to any of the techniques discussed herein. Determining the set of weights β_i (where i is the row/radar point number) may include determining an error (e.g., a variance) associated with a radar point, and determining a weight based at least in part on the error to minimize an overall error of the subset of radar data associated with the object. For example, determining the set of weights may include a robust regression, e.g., an iteratively reweighted least squares algorithm. In a robust regression, a loss function may determine residuals associated with the radar detections and weight the radar detections based at least in part on the residuals. For example, the loss function may be a Cauchy loss function, a Huber loss function, or the like. In such an example, the loss function may apply a smaller weighting factor to radar detections associated with larger residuals. In some examples, the weighting process may include iteratively re-weighting the radar detections until the weighted sum converges, which may include iterating operations 504, 506, and/or 508. In some examples, the error determined at operation 506 may be determined based at least in part on a total error associated with the data structure determined by the regression algorithm. This set of weights may be used instead of determining and selecting between two data structures, reducing the solution of two systems to one.
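A minimal iteratively reweighted least squares loop with a Huber-style weighting is sketched below to illustrate how per-row weights β_i (including a weight on the yaw-hypothesis row) might be derived from residuals; the loss scale, convergence criterion, and synthetic inputs are assumptions, not the loss or parameters used by the system described herein.

```python
import numpy as np

def huber_weight(residual, delta=0.5):
    """Huber-style row weight: 1 inside the delta band, shrinking outside it."""
    r = abs(residual)
    return 1.0 if r <= delta else delta / r

def irls(A, Y, iters=20, tol=1e-6):
    """Iteratively reweighted least squares for A x = Y (illustrative sketch)."""
    x = np.linalg.lstsq(A, Y, rcond=None)[0]
    beta = np.ones(len(Y))
    for _ in range(iters):
        residuals = A @ x - Y
        beta = np.array([huber_weight(r) for r in residuals])   # per-row weights
        W = np.diag(beta)
        x_new = np.linalg.lstsq(W @ A, W @ Y, rcond=None)[0]
        if np.linalg.norm(x_new - x) < tol:                     # converged
            return x_new, beta
        x = x_new
    return x, beta

# Rows in the style of the (reconstructed) equation (9); last row is the yaw hypothesis.
A = np.array([[0.99, 0.10, 0.8],
              [0.97, 0.22, -0.5],
              [0.95, 0.31, 0.3],
              [0.05, -0.999, 0.0]])
Y = np.array([2.3, 1.6, 2.0, 0.0])   # synthetic ground Doppler values, plus 0 for the hypothesis
x, beta = irls(A, Y)
print("v_x, v_y, omega:", x, "row weights:", beta)
```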
In some examples, operation 506 may include populating the data structure based at least in part on previous tracking or initialization assumptions using the center and/or size/dimensions of the object.
At operation 508, the example process 500 may include determining a speed and/or an error based at least in part on the data structure according to any of the techniques discussed herein. For example, determining the velocity and/or error may be part of the weighted linear regression described above. In some examples, the covariance may be determined in association with the velocity, for example, by equations 7 and 9 given above. Operation 508 may additionally or alternatively include determining a center and/or a size/dimension associated with the object.
At operation 510, the example process 500 may include determining a final speed according to any of the techniques discussed herein. Determining the final speed may be based at least in part on the speed and/or covariance determined at operation 508. For example, the speed determined at operation 508 may be a suggested speed or a candidate speed, and the speed determined at operation 510 may be determined with greater accuracy in some examples. In some cases, operation 510 may be accomplished by a bayesian filter (e.g., a kalman filter) and/or other estimation algorithms.
At operation 512, the example process 500 may include controlling the autonomous vehicle based at least in part on the suggested and/or final speed and/or error in accordance with any of the techniques discussed herein.
Example clauses
A. A method, comprising: receiving radar data associated with an object; determining a data structure based at least in part on the radar data, the data structure comprising a first parameter portion and a second parameter portion, wherein the first parameter portion comprises: a lateral contribution to a velocity of the object, the lateral contribution being associated with the first radar observation; a longitudinal contribution associated with the first radar observation; a yaw rate contribution associated with the first radar observation; and wherein the second parameter portion comprises an assumption of a yaw rate associated with the object; determining a set of weights to be applied to the data structure, the set of weights comprising at least a first weight associated with the first radar point and a second weight associated with the yaw rate contribution; determining a yaw rate associated with the object based at least in part on the data structure and the set of weights; and controlling the autonomous vehicle based at least in part on the yaw rate.
B. The method of paragraph a, wherein determining the velocity comprises determining a weighted regression based at least in part on the data structure and the set of weights.
C. The method of paragraphs a or B, further comprising: determining a velocity based at least in part on the data structure and the set of weights; providing the speed and yaw rate to a prediction system; and receiving a predicted trajectory of the object from the prediction system, wherein the autonomous vehicle is also controlled based at least in part on the predicted trajectory.
D. The method of any of paragraphs a to C, further comprising: receiving a tracking associated with the object, the tracking including at least one of a yaw or a yaw rate associated with the object, wherein the yaw assumption is based at least in part on the at least one of the tracked yaw or yaw rate.
E. The method of any of paragraphs a through D, wherein determining the yaw rate comprises determining a suggested yaw rate, and the method further comprises: determining a covariance based at least in part on the data structure; and determining a final yaw rate of the object based at least in part on the covariance and the suggested yaw rate, wherein the vehicle is further controlled based at least in part on the final yaw rate and the covariance.
F. A system, comprising: one or more processors; and a memory storing processor-executable instructions that, when executed by the one or more processors, cause the system to perform operations comprising: receiving radar data associated with an object; determining a data structure based at least in part on the radar data, the data structure comprising a first parameter portion and a second parameter portion, wherein the second parameter portion comprises an assumption of a yaw rate associated with the object; determining a set of weights to be applied to the data structure, the set of weights comprising at least a first weight associated with the first radar point and a second weight associated with the yaw rate contribution; determining a yaw rate associated with the object based at least in part on the data structure and the set of weights; and controlling the autonomous vehicle based at least in part on the yaw rate.
G. The system of paragraph F, wherein the operations further comprise: determining a velocity based at least in part on the data structure and the set of weights; providing the speed and yaw rate to a prediction system; and receiving a predicted trajectory of the object from the prediction system, wherein the autonomous vehicle is also controlled based at least in part on the predicted trajectory.
H. The system of paragraph F or G, wherein the data structure includes radar observations, the radar observations including at least one of: a doppler observation associated with the object, a doppler direction vector, or an angle associated with the vector.
I. The system of any of paragraphs F through H, wherein the operations further comprise: receiving a tracking associated with the object, the tracking including at least one of a yaw or a yaw rate associated with the object, wherein the yaw assumption is based at least in part on the at least one of the yaw or the yaw rate of the tracking.
J. The system of any of paragraphs F to I, wherein determining the yaw rate comprises determining a suggested yaw rate, and the operations further comprise: determining a covariance based at least in part on the data structure; and determining a final yaw rate of the object based at least in part on the covariance and the suggested yaw rate, wherein the vehicle is controlled further based at least in part on the yaw rate and the covariance.
K. The system of any of paragraphs F through J, wherein the operations further comprise: associating the velocity with a previously generated track associated with the object, or generating a new track associated with the object and associating the velocity with the new track.
L. The system of any of paragraphs F through K, wherein the operations further comprise at least one of: determining a center or a size of the object.
M. A non-transitory computer-readable medium storing processor-executable instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving radar data associated with an object; determining a data structure based at least in part on the radar data, the data structure comprising a first parameter portion and a second parameter portion, wherein the second parameter portion comprises an assumption of a yaw rate associated with the object; determining a set of weights to be applied to the data structure, the set of weights comprising at least a first weight associated with the first radar point and a second weight associated with the yaw rate contribution; determining a yaw rate associated with the object based at least in part on the data structure and the set of weights; and controlling the autonomous vehicle based at least in part on the yaw rate.
N. The non-transitory computer-readable medium of paragraph M, wherein the operations further comprise: determining a velocity based at least in part on the data structure and the set of weights; providing the speed and yaw rate to a prediction system; and receiving a predicted trajectory of the object from the prediction system, wherein the autonomous vehicle is also controlled based at least in part on the predicted trajectory.
O. The non-transitory computer-readable medium of paragraph M or N, wherein the data structure comprises radar observations, the radar observations comprising at least one of: a Doppler observation associated with the object, a Doppler direction vector, or an angle associated with the vector.
P. The non-transitory computer-readable medium of any of paragraphs M to O, wherein the operations further comprise: receiving a tracking associated with the object, the tracking including at least one of a yaw or a yaw rate associated with the object, wherein the yaw assumption is based at least in part on the at least one of the tracked yaw or yaw rate.
Q. The non-transitory computer-readable medium of any one of paragraphs M to P, wherein determining the yaw rate comprises determining a suggested yaw rate, and the operations further comprise: determining a covariance based at least in part on the data structure; and determining a final yaw rate of the object based at least in part on the covariance and the suggested yaw rate, wherein the vehicle is controlled further based at least in part on the yaw rate and the covariance.
R. The non-transitory computer-readable medium of any one of paragraphs M to Q, wherein the operations further comprise: associating the velocity with a previously generated track associated with the object, or generating a new track associated with the object and associating the velocity with the new track.
S. The non-transitory computer-readable medium of any of paragraphs M to R, wherein determining the first weighting comprises: determining a residual based at least in part on the data structure and the yaw rate; and changing the first weighting to reduce the residual.
T. The non-transitory computer-readable medium of any one of paragraphs M to S, wherein the operations further comprise at least one of: determining a center or a size of the object.
U. The method of any of paragraphs A to T, wherein the data structure comprises radar observations, the radar observations comprising at least one of: a Doppler observation associated with the object, a Doppler direction vector, or an angle associated with the vector.
V. A method comprising: receiving radar data associated with an object; determining a first data structure comprising a first parameter portion based at least in part on the radar data; determining, based at least in part on the radar data, a second data structure comprising a second parameter portion, the second parameter portion comprising an assumption of a yaw associated with the object; determining, based at least in part on the first data structure, a first speed and a first yaw rate associated with the object and a first error associated with the first speed and the first yaw rate; determining, based at least in part on the second data structure, a second speed and a second yaw rate associated with the object and a second error associated with the second speed and the second yaw rate; and controlling the autonomous vehicle based at least in part on the first speed and the first yaw rate or the second speed and the second yaw rate, based at least in part on the first error and the second error.
W. A system, comprising: one or more processors; and a memory storing processor-executable instructions that, when executed by the one or more processors, cause the system to perform the method of any one of paragraphs A to E, U, or V.
X. A non-transitory computer-readable medium storing processor-executable instructions that, when executed by one or more processors, cause the one or more processors to perform the method of any one of paragraphs A to E, U, or V.
While the above example clauses are described with respect to one particular implementation, it should be understood that in the context of this document, the contents of the example clauses may also be implemented via a method, an apparatus, a system, a computer-readable medium, and/or another implementation. Further, any of examples a through X may be implemented alone or in combination with any other one or more of examples a through X.
Conclusion
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claims.
The components described herein represent instructions that may be stored in any type of computer-readable medium and may be implemented in software and/or hardware. All of the above described methods and processes may be embodied in, and fully automated via, software code components and/or computer executable instructions executed by one or more computers or processors, hardware, or some combination thereof. Some or all of the methods may alternatively be embodied in specialized computer hardware.
At least some of the procedures discussed herein are illustrated as logical flow diagrams, each operation of which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions stored on one or more non-transitory computer-readable storage media that, when executed by one or more processors, cause a computer or autonomous vehicle to perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and so forth that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement a process.
Conditional language such as "may," "could," "can," or "might," unless explicitly stated otherwise, is understood in this context to mean that some examples include, while other examples do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that certain features, elements and/or steps are in any way required for one or more examples or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether certain features, elements and/or steps are included or are to be performed in any particular example.
Unless specifically stated otherwise, conjunctive language (e.g., the phrase "at least one of X, Y, or Z") should be understood to mean that an item, term, etc. can be X, Y, or Z, or any combination thereof, including multiples of each element. The use of "a" or "an" means both the singular and the plural, unless explicitly described otherwise.
Any routine descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the figures should be understood as potentially representing modules, segments, or portions of code that include one or more computer-executable instructions for implementing specific logical functions or elements in the routine. Alternative implementations are included within the scope of the examples described herein in which elements or functions may be deleted or performed out of the order shown or discussed, including substantially concurrently, in the reverse order, with additional operations, or with operations omitted, depending on the functionality involved, as would be understood by those skilled in the art. Note that the term "substantially" may denote a range. For example, "substantially simultaneously" may indicate that two activities occur within a time range of each other, "substantially the same dimensions" may indicate that two elements have dimensions within a range of each other, and so on.
Many variations and modifications may be made to the above-described examples, and the elements thereof should be understood to fall within the scope of other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims (15)

1. A system, comprising:
one or more processors; and
a memory storing processor-executable instructions that, when executed by the one or more processors, cause the system to perform operations comprising:
receiving radar data associated with an object;
determining a data structure based at least in part on the radar data, the data structure comprising a first parameter portion and a second parameter portion, wherein the second parameter portion comprises an assumption of a yaw rate associated with the object;
determining a set of weights to be applied to the data structure, the set of weights comprising at least a first weight associated with a first radar point and a second weight associated with a yaw rate contribution;
determining a yaw rate associated with the object based at least in part on the data structure and the set of weights; and
controlling an autonomous vehicle based at least in part on the yaw rate.
2. The system of claim 1, wherein the operations further comprise:
determining a velocity based at least in part on the data structure and the set of weights;
providing the speed and the yaw rate to a prediction system; and
receiving a predicted trajectory of the object from the prediction system,
wherein the autonomous vehicle is further controlled based at least in part on the predicted trajectory.
3. The system of claim 1 or 2, wherein the data structure comprises radar observations comprising at least one of: a Doppler observation associated with the object, a Doppler direction vector, or an angle associated with the vector.
4. The system of any of claims 1 to 3, wherein the operations further comprise: receiving a tracking associated with the object, the tracking comprising at least one of a yaw or a yaw rate associated with the object, wherein the yaw assumption is based at least in part on the at least one of the yaw or the yaw rate of the tracking.
5. The system of any of claims 1 to 4, wherein determining the yaw rate comprises determining a suggested yaw rate, and the operations further comprise:
determining a covariance based at least in part on the data structure; and
determining a final yaw rate of the object based at least in part on the covariance and the suggested yaw rate,
wherein the vehicle is further controlled based at least in part on the yaw rate and the covariance.
6. The system of any of claims 1 to 5, wherein the operations further comprise:
correlating the velocity with a previously generated tracking associated with the object, or
generating a new tracking associated with the object and associating the velocity with the new tracking.
7. The system of any of claims 1 to 6, wherein the operations further comprise at least one of: determining a center or size of the object.
8. A non-transitory computer-readable medium storing processor-executable instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising:
receiving radar data associated with an object;
determining a data structure based at least in part on the radar data, the data structure comprising a first parameter portion and a second parameter portion, wherein the second parameter portion comprises an assumption of a yaw rate associated with the object;
determining a set of weights to be applied to the data structure, the set of weights comprising at least a first weight associated with a first radar point and a second weight associated with a yaw rate contribution;
determining a yaw rate associated with the object based at least in part on the data structure and the set of weights; and
controlling an autonomous vehicle based at least in part on the yaw rate.
9. The non-transitory computer-readable medium of claim 8, wherein the operations further comprise:
determining a velocity based at least in part on the data structure and the set of weights;
providing the speed and the yaw rate to a prediction system; and
receiving a predicted trajectory of the object from the prediction system,
wherein the autonomous vehicle is further controlled based at least in part on the predicted trajectory.
10. The non-transitory computer-readable medium of claim 8 or 9, wherein the data structure comprises radar observations comprising at least one of: a Doppler observation associated with the object, a Doppler direction vector, or an angle associated with the vector.
11. The non-transitory computer-readable medium of any of claims 8 to 10, wherein the operations further comprise: receiving a tracking associated with the object, the tracking comprising at least one of a yaw or a yaw rate associated with the object, wherein the yaw assumption is based at least in part on the at least one of the yaw or the yaw rate of the tracking.
12. The non-transitory computer-readable medium of any one of claims 8-11, wherein determining the yaw rate includes determining a suggested yaw rate, and the operations further comprise:
determining a covariance based at least in part on the data structure; and
determining a final yaw rate of the object based at least in part on the covariance and the suggested yaw rate,
wherein the vehicle is further controlled based at least in part on the yaw rate and the covariance.
13. The non-transitory computer-readable medium of any one of claims 8 to 12, wherein the operations further comprise:
associating the velocity with a previously generated tracking associated with the object, or
generating a new tracking associated with the object and associating the velocity with the new tracking.
14. The non-transitory computer-readable medium of any of claims 8 to 13, wherein determining the first weighting comprises:
determining a residual based at least in part on the data structure and the yaw rate; and
changing the first weighting to reduce the residual.
15. The non-transitory computer-readable medium of any of claims 8 to 14, wherein the operations further comprise at least one of: determining a center or size of the object.
CN202180027842.4A 2020-02-19 2021-02-19 Object speed and/or yaw for radar tracking Pending CN115485177A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16/795,411 US11609321B2 (en) 2020-02-19 2020-02-19 Radar-tracked object velocity and/or yaw
US16/795,411 2020-02-19
PCT/US2021/018820 WO2021168282A1 (en) 2020-02-19 2021-02-19 Radar-tracked object velocity and/or yaw

Publications (1)

Publication Number Publication Date
CN115485177A true CN115485177A (en) 2022-12-16

Family

ID=77272755

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180027842.4A Pending CN115485177A (en) 2020-02-19 2021-02-19 Object speed and/or yaw for radar tracking

US (1) US11609321B2 (en)
EP (1) EP4107044A4 (en)
JP (1) JP2023514618A (en)
CN (1) CN115485177A (en)
WO (1) WO2021168282A1 (en)


Also Published As

Publication number Publication date
EP4107044A4 (en) 2024-02-28
US20210255307A1 (en) 2021-08-19
WO2021168282A1 (en) 2021-08-26
EP4107044A1 (en) 2022-12-28
JP2023514618A (en) 2023-04-06
US11609321B2 (en) 2023-03-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination