CN113544538A - Identifying radar reflections using velocity and position information

Info

Publication number
CN113544538A
Authority
CN
China
Prior art keywords
echo
radar
velocity
reflected
vehicle
Prior art date
Legal status
Pending
Application number
CN202080017300.4A
Other languages
Chinese (zh)
Inventor
王闯
J·K·科恩
Current Assignee
Zoox Inc
Original Assignee
Zoox Inc
Priority date
Filing date
Publication date
Priority claimed from US16/289,068 (US11353578B2)
Priority claimed from US16/288,990 (US11255958B2)
Application filed by Zoox Inc
Publication of CN113544538A

Classifications

    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G01S13/582 Velocity or trajectory determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets, adapted for simultaneous range and velocity measurements
    • G01S13/584 Velocity or trajectory determination systems using transmission of continuous unmodulated or modulated waves and based upon the Doppler effect resulting from movement of targets, adapted for simultaneous range and velocity measurements
    • G01S13/66 Radar-tracking systems; analogous systems
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Abstract

Techniques for identifying reflected echoes in radar sensor data are discussed. In some examples, radar echo pairs may be compared to each other. For example, a velocity associated with a first radar echo may be projected in a radial direction associated with a second radar echo to determine a projected velocity. In some examples, the second radar echo may be a reflected echo if the magnitude of the projected velocity corresponds to the magnitude of the velocity of the second radar echo. In other instances, a reflection point may be determined using location data, and the second radar echo may be a reflected echo if an object is located at the reflection point. In some instances, a vehicle, such as an autonomous vehicle, may be controlled while excluding information from reflected echoes.

Description

Identifying radar reflections using velocity and position information
Cross Reference to Related Applications
This PCT international patent application claims the benefit of and priority to U.S. patent application No. 16/288,990, filed February 28, 2019, and U.S. patent application No. 16/289,068, filed February 28, 2019, the disclosures of each of which are incorporated herein by reference.
Background
Autonomous vehicles utilize various methods, devices, and systems to traverse through an environment that includes obstacles. For example, an autonomous vehicle may utilize route planning methods, apparatus, and systems to navigate through an area that may include other vehicles, buildings, pedestrians, and the like. These planning systems may rely on sensor data including radar data, lidar data, image data, and/or other data. However, in some examples, the presence of vehicles, buildings, and/or objects in the environment may cause reflections, resulting in inaccurate or incorrect sensor data, including, for example, false positives. Such inaccurate and/or incorrect sensor data can present challenges to safely and comfortably traversing through the environment.
Drawings
Specific embodiments are described with reference to the accompanying drawings. In the drawings, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears. The use of the same reference symbols in different drawings indicates similar or identical items or features.
FIG. 1 is a schematic view illustrating an example vehicle and an environment in which the vehicle operates, the vehicle including a radar sensor that senses objects in the environment, according to an embodiment of the present disclosure.
Fig. 2 is a schematic view of the environment of fig. 1 illustrating an example technique for using velocity information to distinguish radar returns associated with objects in the environment from reflected returns reflected from intermediate objects in accordance with an embodiment of the present disclosure.
Fig. 3 is another schematic view of the environment of fig. 1, illustrating an example technique for using location information to distinguish radar returns associated with objects in the environment from reflected returns reflected from intermediate objects in accordance with an embodiment of the present disclosure.
Fig. 4 is a schematic block diagram of an example system including a vehicle and a computing device that may be used to implement the radar reflection identification techniques described herein, in accordance with an embodiment of the present disclosure.
FIG. 5 depicts a flowchart of an example process for implementing a radar reflection identification technique using velocity information, in accordance with an embodiment of the disclosure.
FIG. 6 depicts a flowchart of an example process for implementing a radar reflection identification technique using location information, according to an embodiment of the disclosure.
Detailed Description
As discussed above, certain types of sensor data (e.g., radar data) may be susceptible to reflections, for example, from vehicles, buildings, and other objects in the environment. Such reflected echoes may present challenges to the autonomous vehicle to safely and/or comfortably traverse the environment. For example, a reflected echo may represent a proximate object that does not actually exist. Planning in response to these non-existing or phantom "objects" may result in the vehicle taking unnecessary actions, e.g., braking, steering, etc.
Techniques for identifying reflected echoes in radar sensor data captured by a radar system are described. Typically, radar sensors can emit radio energy that can reflect (or bounce) off objects in the environment before returning to the sensor. When the transmitted energy returns directly from the object to the radar sensor, the radar sensor may capture object echoes that include accurate data about the object (e.g., distance, position, velocity, etc.). However, in some instances, radio energy may reflect from multiple objects in the environment before returning to the radar sensor. In these examples, the radar sensor may capture the reflected echo. The reflected echoes do not accurately represent objects in the environment. In at least some examples, safety critical path planning for autonomous vehicles may be affected without knowing whether the echoes are from actual objects or reflections.
In some examples, the techniques described herein may use a pair-wise comparison of echoes (e.g., pairing an echo under consideration with a known object echo) to determine that a radar echo is a reflected echo. More specifically, radar returns that are not associated with known objects may be compared to object returns (e.g., returns associated with known objects) to determine whether the radar returns correspond to theoretical reflections of physical returns from some hypothetical object. In some examples, the object echoes may be determined from all radar echoes based on information associating an echo with a track or other previously obtained information. By way of non-limiting example, a theoretical reflection may be determined by projecting the object velocity (the magnitude and direction of the object's velocity, which may be provided by the radar echo) onto a radial direction extending from the vehicle through the echo under consideration. A radial component (e.g., extending along the radial direction) of the projected velocity represents the expected velocity of a reflection of the object echo along the radial direction. Thus, when the magnitude of the radial component of the projected velocity corresponds to the velocity associated with the echo in question, such an echo may be identified as a reflected echo.
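This check lends itself to a compact implementation. The following is a minimal sketch under stated assumptions, not the patented implementation: positions are 2D points in the radar sensor's frame (sensor at the origin), the object velocity is available as a 2D vector (e.g., from a track), and the function names and the 0.5 m/s tolerance are illustrative.

```python
import numpy as np

def projected_reflection_speed(obj_pos, obj_vel, cand_pos):
    """Radial speed a reflection of the object echo would exhibit at the
    candidate echo's location: the object velocity is mirrored about the
    perpendicular bisector of the segment joining the two echoes, then
    projected onto the candidate's radial direction."""
    n = cand_pos - obj_pos
    n = n / np.linalg.norm(n)              # mirror normal, along the joining segment
    mirrored = obj_vel - 2.0 * np.dot(obj_vel, n) * n  # reflected velocity vector
    radial = cand_pos / np.linalg.norm(cand_pos)       # radial direction from sensor
    return float(np.dot(mirrored, radial)) # radial component of the projection

def is_likely_reflection(obj_pos, obj_vel, cand_pos, cand_radial_speed, tol=0.5):
    """Flag the candidate echo as a possible reflection when its measured
    radial speed matches the projected speed within a tolerance."""
    expected = projected_reflection_speed(obj_pos, obj_vel, cand_pos)
    return abs(expected - cand_radial_speed) < tol
```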
In some examples, the techniques described herein may use a pair-wise comparison of radar echoes, without any a priori knowledge being available, to determine that an echo is a reflection. For example, any two echoes may be compared to determine whether one of the echoes (e.g., the farther of the two) could theoretically be a reflection of the other, as in the sketch below. In some examples, such a comparison may be based on projecting the velocity of the closer echo onto the location of the farther echo. In other examples, the locations of the echoes may be used to determine a theoretical reflection point on a line between the sensor and the farther echo.
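A hedged sketch of that pairwise sweep, reusing is_likely_reflection() from the previous block; the dict-based echo representation ("pos", "vel", "radial_speed" keys) is an assumption for illustration only.

```python
def tag_potential_reflections(echoes):
    """Return indices of echoes that may be reflections of a nearer echo."""
    flagged = set()
    for i, near in enumerate(echoes):
        for j, far in enumerate(echoes):
            if i == j:
                continue
            # Only the farther echo of a pair can be the reflection.
            if np.linalg.norm(far["pos"]) <= np.linalg.norm(near["pos"]):
                continue
            near_vel = near.get("vel")
            if near_vel is None:
                # Without a track, approximate the nearer echo's velocity by
                # its radial (Doppler) velocity along its own bearing.
                near_vel = (near["radial_speed"] * near["pos"]
                            / np.linalg.norm(near["pos"]))
            if is_likely_reflection(near["pos"], near_vel,
                                    far["pos"], far["radial_speed"]):
                flagged.add(j)
    return flagged
```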
In some examples, the techniques described herein may also be used to confirm that an identified reflected echo is actually a reflection. For example, the techniques described herein may determine an assumed reflection point (e.g., a point along the radial direction of the echo under consideration) at which the radio energy would be reflected if the echo under consideration were a reflected echo. In aspects of the present disclosure, additional sensor information about the environment (e.g., lidar data, additional radar data, time-of-flight data, sonar data, image data, etc.) may be used to confirm the presence of an object near the reflection point. In other words, the presence of an object at an assumed or calculated reflection point may further indicate (and/or confirm) that the echo is a reflection of another echo.
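One illustrative way to perform that confirmation, continuing the numpy sketch above and assuming a lidar point cloud is available as an (N, 2) array in the same frame as the radar data; the 1.0 m search radius is an assumption.

```python
def object_near(point, lidar_points, radius=1.0):
    """True if any lidar return lies within `radius` of the hypothesized
    reflection point, suggesting a reflecting surface is actually there."""
    dists = np.linalg.norm(lidar_points - point, axis=1)
    return bool(np.any(dists < radius))
```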
Moreover, in some implementations, the techniques described herein may determine a subset of all radar returns to investigate as potential reflected returns. For example, the radar returns may be filtered to include only those returns that could be reflected returns and/or could have a non-negligible effect on vehicle operation. In some examples, echoes associated with a track or other previously obtained information may be designated as object echoes, and thus are not potential reflected echoes. Furthermore, echoes that are relatively closer (e.g., radially closer) to the vehicle than an object echo cannot be reflections of that echo and thus may be excluded from consideration using the techniques described herein. Furthermore, radar returns with a speed equal to or lower than a threshold speed may also be ignored, for example, because the vehicle's planning system may not take such returns into account. In some examples, the threshold speed may vary with distance from the vehicle. In at least some examples, filtering such radar returns may reduce the amount of time, processing, and/or memory required to determine whether a return is a reflection or an actual object.
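A minimal sketch of that filtering step; the dict-based echo representation, the "tracked" flag, and the 0.25 m/s threshold are illustrative assumptions (the patent notes the threshold may also vary with range).

```python
def candidate_reflections(echoes, object_range, min_speed=0.25):
    """Keep only echoes worth testing as reflections: not already associated
    with a track, farther than the known object echo, and moving faster
    than a minimum speed."""
    keep = []
    for e in echoes:
        if e.get("tracked"):                          # known object echoes
            continue
        if np.linalg.norm(e["pos"]) <= object_range:  # closer than the object
            continue
        if abs(e["radial_speed"]) <= min_speed:       # negligible to planning
            continue
        keep.append(e)
    return keep
```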
In some examples, information regarding the reflected radar echo may be output for use by a vehicle computing device of the autonomous vehicle to control the autonomous vehicle to safely traverse through the environment. For example, the vehicle computing device may exclude reflected echoes from route and/or trajectory planning. In this way, the autonomous vehicle will not brake, steer, or otherwise take action in response to the phantom "object". Further, the vehicle computing device may not track or otherwise follow the echo determined to be a reflected echo, which may, for example, reduce processing load.
The techniques discussed herein may improve the functionality of a computing device in a variety of ways. For example, in the context of determining control for a vehicle, the amount of data to be considered may be reduced, for example by excluding reflected echoes, thereby reducing excess resources dedicated to unnecessary determinations about the environment. Improved trajectory generation may improve safety results and may improve driver experience (e.g., by reducing unnecessary braking in response to a phantom object, steering to avoid the phantom object, etc.). These and other improvements to computer functionality and/or user experience are discussed herein.
The techniques described herein may be implemented in a variety of ways. Example implementations are provided below with reference to the following figures. Although discussed in the context of an autonomous vehicle, the methods, apparatus, and systems described herein may be applied to various systems (e.g., robotic platforms) and are not limited to autonomous vehicles. In another example, these techniques may be utilized in an aeronautical or nautical context, or in any system that uses machine vision.
Fig. 1 is a schematic view of an environment 100 in which a vehicle 102 operates. In the illustrated example, the vehicle 102 is traveling in the environment 100, although in other examples the vehicle 102 may be stationary and/or parked in the environment 100. The vehicle 102 includes one or more radar sensor systems 104 that capture data representative of the environment 100. By way of example and not limitation, the vehicle 102 may be an autonomous vehicle configured to operate according to a Level 5 classification issued by the U.S. National Highway Traffic Safety Administration, which describes vehicles capable of performing all safety-critical functions for an entire trip, where a driver (or occupant) is not expected to control the vehicle at any time. In such an example, the vehicle 102 may be unoccupied, since it may be configured to control all functions from start to stop (including all parking functions). This is merely an example, and the systems and methods described herein may be incorporated into any ground, air, or marine vehicle, ranging from vehicles that always require manual control by a driver to vehicles that are partially or fully autonomously controlled. Additional details associated with the vehicle 102 are described below.
Vehicle 102 may travel through the environment 100, for example, in a direction generally indicated by arrow 106, relative to one or more other objects. For example, the environment 100 may include dynamic objects, such as an additional vehicle 108 (traveling generally in the direction indicated by arrow 110), and static objects, such as a first parked vehicle 112(1), a second parked vehicle 112(2), and a third parked vehicle 112(3) (collectively, "parked vehicles 112"), and a first building 114(1), a second building 114(2), and a third building 114(3) (collectively, "buildings 114"). The additional vehicle 108, the parked vehicles 112, and the buildings 114 are merely examples of objects that may be in the environment 100; additional and/or different objects may also or alternatively be located in the environment 100, including, but not limited to, vehicles, pedestrians, riders, trees, street signs, fixtures, and the like.
In at least one example, and as noted above, the vehicle 102 may be associated with the radar sensor(s) 104, and the radar sensor(s) 104 may be disposed on the vehicle 102. The radar sensor(s) 104 may be configured to measure a range to an object and/or a velocity of the object. In some example systems, the radar sensor(s) 104 may include Doppler sensors, pulse-type sensors, continuous wave frequency modulation (CWFM) sensors, and the like. The radar sensor(s) 104 may transmit pulses of radio energy at predetermined intervals. In some implementations, the interval may be configurable, for example, to facilitate enhanced detection of objects at relatively far or relatively near distances. Typically, the radio energy pulses transmitted by the radar sensor(s) 104 reflect off objects in the environment 100 and may be received by the radar sensor(s) 104, for example, as radar data or radar returns.
In the example of fig. 1, the radio energy emitted by the radar sensor(s) 104 may contact the additional vehicle 108 generally in the direction of arrow 116 and be reflected back to the radar sensor(s) 104, for example, generally in the direction of arrow 118. In this example, the radio energy is generally transmitted and returned along the same path, which may be the object echo path 120. In the example shown, the object echo path 120 is substantially a straight line. The reflected radio energy along the object echo path 120 is captured by the radar sensor(s) 104, for example as an object echo 122. The object echo 122 is shown as a square at a location determined based on the captured data. For example, the information associated with the object echo 122 may include information indicative of a location in the environment, e.g., a location of the additional vehicle 108. The positioning information may include a distance and an orientation relative to the vehicle 102, or a position in a local coordinate system or a global coordinate system. Also, in implementations, the object echo 122 may include signal strength information. For example, the signal strength information may indicate a type of the object. More specifically, radio waves may be strongly reflected by objects having certain shapes and/or compositions. For example, broad, flat surfaces and/or sharp edges have a higher reflectivity than round surfaces, and metals have a higher reflectivity than humans. In some instances, the signal strength may include a radar cross-section (RCS) measurement. The object echo 122 may also include velocity information. For example, the speed of the additional vehicle 108 may be based on the frequency of the radio energy reflected by the additional vehicle 108 and/or the time at which the reflected radio energy is detected.
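As a concrete (hypothetical) illustration of the information carried by a single return such as the object echo 122, a container like the following could hold the range, bearing, Doppler velocity, and signal strength described above; the field names are assumptions for illustration, not the patent's data model.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class RadarReturn:
    range_m: float        # distance traveled by the received radio energy
    azimuth_rad: float    # bearing relative to the sensor
    radial_speed: float   # closing speed along the bearing (Doppler)
    rcs_dbsm: float       # signal strength as a radar cross-section

    @property
    def pos(self) -> np.ndarray:
        """2D location implied by range and bearing, sensor at the origin."""
        return self.range_m * np.array(
            [np.cos(self.azimuth_rad), np.sin(self.azimuth_rad)])
```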
Object returns 122 may be an example of accurate data (e.g., corresponding to additional vehicles 108) captured by radar sensor(s) 104. However, the radar sensor(s) 104 may also capture echoes that may be less reliable. For example, fig. 1 also shows that radio energy emitted by radar sensor(s) 104 generally in the direction of arrow 124 may be reflected from first building 114(1) to travel generally in the direction of arrow 126. In the example shown, the radio energy reflected from the first building 114(1) may then be reflected by the additional vehicle 108 back to the first building 114(1) generally in the direction of arrow 128 (i.e., opposite the direction of arrow 126). Finally, the radio energy may then be reflected again from the first building 114(1) and returned to the radar sensor(s) 104 generally in the direction of arrow 130. Accordingly, radio energy reflected from the additional vehicle 108 may return to the radar sensor(s) 104 along a first reflected echo path 132, the first reflected echo path 132 including: a first branch 134 between the additional vehicle 108 and the first building 114(1), and a second branch 136 between the first building 114(1) and the vehicle 102 (e.g., radar sensor(s) 104). This reflected energy may be captured by the radar sensor(s) 104 as a first reflected echo 138. As shown in fig. 1, the first reflected echo 138 is received along the direction of the second branch 136 (e.g., along arrows 124, 130), but has a range equal to the distance of the first reflected echo path 132 (e.g., the distance is the sum of the distance of the first branch 134 and the distance of the second branch 136). As will be appreciated from fig. 1 and the foregoing description, the first reflected echo 138 represents the presence of an object at a location corresponding to the first reflected echo 138 (i.e., in the direction of the second branch 136). However, as described herein, such "objects" are non-existent phantom "objects".
FIG. 1 shows that the radar sensor(s) 104 may also capture a second reflected echo 140, corresponding to radio energy reflected from the first parked vehicle 112(1) and the additional vehicle 108. More specifically, radio energy emitted by the radar sensor(s) 104 generally in the direction of arrow 142 may be reflected from the first parked vehicle 112(1) to travel generally in the direction of arrow 144. The radio energy may then contact the additional vehicle 108 and reflect from the additional vehicle 108, returning to the first parked vehicle 112(1) generally in the direction of arrow 146 (i.e., opposite the direction of arrow 144). Finally, the radio energy may then be reflected again from the first parked vehicle 112(1) generally in the direction of arrow 148. Accordingly, radio energy corresponding to the second reflected echo 140 may travel along a second reflected echo path 154, the second reflected echo path 154 including: a first branch 156 between the additional vehicle 108 and the first parked vehicle 112(1), and a second branch 158 between the first parked vehicle 112(1) and the vehicle 102 (e.g., the radar sensor(s) 104). As shown in fig. 1, the second reflected echo 140 includes information about the radio energy received along the second branch 158 (e.g., in the direction of arrows 142, 148), but has a range equal to the distance of the second reflected echo path 154 (e.g., the sum of the distance of the first branch 156 and the distance of the second branch 158). As will be appreciated from fig. 1 and the foregoing description, the second reflected echo 140 represents the presence of an object at a location corresponding to the second reflected echo 140 (i.e., in the direction of the second branch 158). However, as described herein, such an "object" is a non-existent phantom "object".
In addition to the location information (e.g., from range and bearing) shown in fig. 1, the object echo 122, the first reflected echo 138, and the second reflected echo 140 (the first reflected echo 138 and the second reflected echo 140 may be referred to herein as the "reflected echoes 138, 140") may also include velocity information. More specifically, the object echo 122 may include information about the object echo velocity 160, which corresponds to the velocity of the additional vehicle 108 along the direction of arrow 118. Similarly, the first reflected echo 138 may include information regarding a first reflected echo velocity 162 (e.g., a velocity along the direction of the second branch 136 of the first reflected echo path 132), and the second reflected echo 140 may include information regarding a second reflected echo velocity 164 (e.g., a velocity along the direction of the second branch 158 of the second reflected echo path 154). Thus, the object echo 122 provides information (e.g., location, velocity) about the additional vehicle 108, while the first reflected echo 138 represents an object (a phantom object) approaching from the location associated with that echo at the first reflected echo velocity 162, and the second reflected echo 140 represents an object (a phantom object) approaching from the location associated with that echo at the second reflected echo velocity 164. As described herein, the vehicle 102 may include, among other functionality, a planning system that may determine routes, trajectories, and/or controls relative to objects in the environment 100 based on the received sensor data. However, a planning system using the reflected echoes 138, 140 may plan to react to objects that do not actually exist.
The techniques described herein may improve the accuracy and performance of a planning system by identifying echoes (e.g., the echoes 138, 140) as reflected echoes. For example, as further shown in fig. 1, the radar sensor(s) 104 may be one of a plurality of sensor systems 164 associated with the vehicle 102. In some examples, the sensor systems 164 may also include one or more additional sensors 166, which may include additional radar sensors, light detection and ranging (lidar) sensors, ultrasonic transducers, sound navigation and ranging (sonar) sensors, location sensors (e.g., Global Positioning System (GPS), compass, etc.), inertial sensors (e.g., inertial measurement units, accelerometers, magnetometers, gyroscopes, etc.), cameras (e.g., RGB, IR, intensity, depth, time-of-flight, etc.), wheel encoders, microphones, environmental sensors (e.g., temperature sensors, humidity sensors, light sensors, pressure sensors, etc.), and so forth.
As also shown in fig. 1, the radar sensor(s) 104 may generate radar data 168, and the additional sensor(s) 166 may generate sensor data 170. The radar data 168 may include information about the object echo 122, the first reflected echo 138, and the second reflected echo 140, including, but not limited to, position, velocity, and/or other information associated therewith. In some examples, the radar data 168 may also include signal strength information, which may include RCS measurements. The radar data may also include information specific to the sensor, including, but not limited to, the orientation of the sensor (e.g., the pose of the sensor relative to the vehicle), the pulse repetition frequency (PRF) or pulse repetition interval (PRI) of the sensor, the field of view or detection arc, and the like. In some implementations, certain aspects of the radar data 168 (e.g., field of view, orientation, PRF or PRI, etc.) may be preconfigured, in which case that data may be available (e.g., in storage) and may not, for example, be transmitted with the radar echoes. The sensor data 170 may include any information about the environment 100 and/or the additional sensor(s) 166. By way of non-limiting example, when the additional sensor(s) 166 include a lidar sensor, the sensor data 170 may include point cloud data, and when the additional sensor(s) 166 include a camera, the sensor data 170 may include image data. The radar sensor(s) 104 and the additional sensor(s) 166 may generate the radar data 168 and the sensor data 170, respectively, at predetermined intervals, which may be the same or different for different sensors in the sensor systems 164. For example, the radar sensor(s) 104 may have a scan frequency at which echoes are captured and the radar data 168 is generated.
The radar data 168 and the sensor data 170 may be received at one or more vehicle computing devices 172 and utilized by the vehicle computing device(s) 172 to perform planning, for example, using a planning system (not shown). In the example shown, the sensor system 164 and the vehicle computing device(s) 172 are part of the vehicle 102 (e.g., disposed on the vehicle 102). However, in other examples, some or all of the sensor system 164 and/or the vehicle computing device(s) 172 may be separate from the vehicle 102 and/or located remotely from the vehicle 102. In such an arrangement, data capture, processing, commands, and/or control may be transmitted to/from the vehicle 102 by one or more remote computing devices via a wired and/or wireless network.
In at least one example, the vehicle computing device(s) 172 can utilize the radar data 168 and the sensor data 170 captured by the sensor system(s) 164 in a reflection identification component 174. For example, the reflection identification component 174 may receive the radar data 168, including the object echo 122, the first reflected echo 138, and the second reflected echo 140, and may determine that the reflected echoes 138, 140 are reflected echoes rather than echoes associated with actual objects in the environment. By making this distinction, the reflection identification component 174 may send only the object echo 122 (rather than the reflected echoes 138, 140) to the planning system and/or use additional echoes to further refine the estimated position and/or velocity of the object. Thus, control planning can be implemented with the reflected echoes 138, 140 excluded. As described in more detail below with reference to fig. 2, the reflection identification component 174 may determine whether an echo (e.g., a candidate echo or an unconfirmed echo) is a reflected echo by comparing two echoes (e.g., the object echo 122 and an additional echo, such as the echo 138 and/or 140). For example, the reflection identification component 174 may "project" a known object echo, such as the object echo 122, onto a line extending along a direction associated with the unconfirmed echo (e.g., a direction extending radially from the radar sensor(s) 104 through the unconfirmed echo). The technique for projecting the object echo 122 is described in more detail below with reference to fig. 2, but the projection of the echo will have a velocity component in the same direction (e.g., a radial direction relative to the radar sensor(s) 104) as the velocity associated with the unconfirmed echo. The reflection identification component 174 may also determine the magnitude of the velocity associated with the projection of the object echo 122. In some implementations, when that magnitude corresponds to the magnitude of the velocity of the unconfirmed echo, the reflection identification component 174 may determine that the unconfirmed echo is (or is likely to be) a reflected echo. As used herein, magnitudes may correspond when they are substantially equal (e.g., within a certain threshold range or margin of error).
The reflection identification component 174 may utilize some a priori knowledge of objects in the environment and/or of the environment itself. For example, the reflection identification component 174 may only project echoes that are known object echoes. In some examples, this knowledge may come from tracking objects and/or from previously identified objects, e.g., the additional vehicle 108. For example, such tracking and/or identification may be based on previously captured radar data 168 and/or previously captured sensor data 170 used to sense the object. For example, an object such as the additional vehicle 108 may be tracked over several seconds and/or over an extended distance. On the other hand, reflected echoes may be more transient, since they require alignment of the vehicle 102, the additional vehicle 108, and the intervening reflective objects/surfaces (e.g., the first parked vehicle 112(1) and the first building 114(1)). As one or more of these objects move, the conditions (e.g., relative alignment and/or orientation) that allow reflected echoes to be produced may disappear, causing the reflected echoes to vanish from subsequent scans of the radar sensor(s) 104. Additionally or alternatively, map data available to the vehicle 102 (e.g., map data that may be downloaded at any time based on the location of the vehicle 102 or otherwise accessible by the vehicle) may include a three-dimensional representation of the environment, e.g., a mesh of the localized three-dimensional environment. In such an example, the reflection identification component 174 may determine that the first reflected echo 138 is a non-physical echo based on knowledge of the corresponding building 114(1).
The reflection identification component 174 may also include functionality to confirm its determination that a candidate echo is a reflected echo. In some examples, the reflection identification component 174 may attempt to track echoes received as radar data 168, e.g., through subsequent scans. As noted above, relative movement of the vehicle 102 and the additional vehicle 108 may eliminate the reflection conditions, resulting in a failure to receive subsequent echoes corresponding to the "object" associated with the reflected echo. However, disregarding a potential (upcoming) object before its track can be verified by successively collected radar scans may slow reaction time, which may be unsafe. Thus, in other implementations, the reflection identification component 174 may determine whether a reflecting object is disposed along the direction of the reflected echo, for example, using the sensor data 170 and/or the radar data 168. Referring to fig. 1, in some examples, lidar data, image data, or the like may confirm the presence of the first parked vehicle 112(1) and/or the first building 114(1). Accordingly, the reflection identification component 174 may identify the first parked vehicle 112(1) and/or the first building 114(1) and, based on the locations of these objects (e.g., at the intersection between the branches of the respective reflected echo paths 132, 154), confirm the previous determination that the echo is a reflected echo. Map data can help identify static objects, such as the buildings 114, ground topology, street signs, public fixtures, and the like. However, real-time or near-real-time sensor data may be required to identify movable objects, whether currently static or dynamic. For example, the parked vehicles 112, pedestrians, riders, other moving vehicles, and the like may not be available through map data.
In some implementations, the reflection identification component 174 may process the radar returns in real time or near real time to determine whether they are reflection returns. For example, the reflection identification component 174 may compare multiple echoes in parallel to known object echoes, such as the object echo 122, as in the sketch below. In one example, all echoes in a radar scan that are not known object echoes may be compared to the known object echoes to determine whether such echoes are reflections. In some other implementations, the reflection identification component 174 may filter the echoes to exclude, for example, points that are unlikely, or physically impossible, to be reflections. By way of non-limiting example, echoes that are relatively closer than a known object echo cannot be reflections of that object echo. In other examples, the reflection identification component 174 may exclude echoes indicating a velocity below a certain threshold or a velocity of zero. Other filtering techniques may also be used.
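A hedged numpy sketch of that parallel comparison, batching the projection check from earlier against one object echo; the array shapes and the tolerance are assumptions.

```python
def batch_reflection_mask(obj_pos, obj_vel, cand_pos, cand_radial_speed, tol=0.5):
    """cand_pos: (N, 2) candidate locations; cand_radial_speed: (N,) measured
    radial speeds. Returns a boolean mask of echoes whose measured speeds
    are consistent with being reflections of the object echo."""
    n = cand_pos - obj_pos                                  # (N, 2) mirror normals
    n = n / np.linalg.norm(n, axis=1, keepdims=True)
    mirrored = obj_vel - 2.0 * (n @ obj_vel)[:, None] * n   # (N, 2) reflected velocities
    radial = cand_pos / np.linalg.norm(cand_pos, axis=1, keepdims=True)
    expected = np.einsum("ij,ij->i", mirrored, radial)      # (N,) projected speeds
    return np.abs(expected - cand_radial_speed) < tol
```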
While the examples described herein may compare points or echoes to the known object echo 122, in other examples the reflection identification component 174 may additionally or alternatively compare echo pairs, e.g., where one of the echoes is not known to be an object echo. More specifically, whether or not one of the echoes is an object echo, the techniques described herein may determine that two echoes may be reflections of each other. For example, when considering an echo pair, the farther echo may be tagged or otherwise designated as a potential reflection. Further processing may be used as a basis for determining whether the closer echo is associated with an object in the environment (or whether the farther echo is associated with a reflection). The reflection identification component 174 may perform high-level filtering, as described herein, to identify a subset of points for pair-wise comparison. By way of non-limiting example, echoes below a threshold velocity may be ignored, and echo pairs that differ too much in location and/or velocity (e.g., by a difference equal to or above a threshold) may not be compared. In at least some examples, various techniques may be combined to improve the level of certainty as to whether an echo is a reflection.
Fig. 2 is another schematic view of the environment 100, illustrating additional aspects of the present disclosure. To avoid confusion, elements introduced in fig. 1 that are also discussed with reference to fig. 2 (e.g., the environment 100) retain the same reference numbers in figs. 1 and 2. Furthermore, certain elements of the environment 100 are shown in gray in fig. 2 for clarity.
More specifically, FIG. 2 shows the vehicle 102, the additional vehicle 108, the first parked vehicle 112(1), and the first building 114(1). Fig. 2 also shows the object echo 122, the first reflected echo 138, and the second reflected echo 140. As discussed above in connection with fig. 1, the object echo 122 and the reflected echoes 138, 140 may include position information and velocity information based on properties of the radio energy received at the radar sensor(s) 104. For example, the object echo 122, which is spaced a distance from the vehicle 102 along a line 202 passing through both the vehicle 102 and the additional vehicle 108, may be indicative of both the location of the object echo (and thus the location of the additional vehicle 108) and the object echo velocity 160. Note that because the object echo 122 is a direct reflection from the additional vehicle 108 (e.g., along the line 202), which is known (e.g., from previously obtained data, such as a track, about the environment 100 and/or the additional vehicle 108), the object echo may be considered an accurate representation of a portion of the additional vehicle 108. As noted above, knowledge of the presence of an object (e.g., the additional vehicle 108), or that an echo is an object echo (e.g., the object echo 122), may not be required in all instances. The techniques described herein may be applied to a pair-wise comparison of echoes to determine the likelihood that one echo is a reflection of the other.
The reflected echoes 138, 140 also include at least position and velocity information associated with the radio energy received at the radar sensor(s) 104. As discussed above, the first reflected echo 138 includes information about the first reflected echo velocity along the line 204 at the illustrated location. The distance of the first reflected echo 138 along line 204 is a distance or range determined by the radar sensor(s) 104 and corresponding to the distance traveled by the received radio energy (e.g., the distance corresponds to the first reflected echo path 132 shown in fig. 1). Similarly, the second reflected echo 140 includes information about the velocity of the second reflected echo along line 206 at the location shown. The distance of the second reflected echo 140 along line 206 is a distance or range determined by the radar sensor(s) 104 and corresponding to the distance traveled by the received radio energy (e.g., the distance corresponds to the second reflected echo path 154 shown in fig. 1).
The techniques described herein may utilize geometry in the environment 100 to determine whether a radar echo is a reflected echo. For example, the object echo 122 and the reflected echoes 138, 140 are merely echoes at the time of capture at the radar sensor(s) 104. The object echo 122 may be unambiguously attributed to the additional vehicle 108 only through some previously acquired knowledge about the environment 100 and/or objects in the environment 100. In some instances, a tracker or other component associated with the vehicle 102 may have tracked the additional vehicle 108 and identified the echo as the object echo 122 when the echo matches an expectation associated with the tracking. In other examples, the object echo 122 may be considered, for example, as one echo of an echo pair, regardless of whether the object echo 122 is attributable to the additional vehicle 108, as described herein. Although the object echo 122 is shown as a single point at a central location in front of the additional vehicle 108, the object echo 122 may also correspond to one or more other echoes and/or one or more other locations on the additional vehicle 108. By way of non-limiting example, a property of the object echo (e.g., the object echo velocity 160) may be determined based on a plurality of echoes associated with the additional vehicle 108. For example, the object echo 122 may be an average of multiple echoes from the additional vehicle 108.
As noted above, unlike the object echo 122, reflected echoes may be short-lived, being produced only when certain conditions, such as geometric conditions, exist (which may cause reflected echoes to move erratically, appear, or disappear). Therefore, there may be no track or other prior knowledge associated with reflected echoes. However, the techniques described herein can determine whether an echo is a reflection or corresponds directly to an actual object in the environment 100.
As noted above, the first reflected echo 138 corresponds to an echo located along the line 204. To characterize the first reflected echo 138 as a reflected echo, the techniques herein may pair that echo with the object echo 122. For example, because the direction of the line 204 is known and the location of the first reflected echo 138 on the line 204 relative to the radar sensor is known, the object echo 122 may be projected onto the location of the first reflected echo 138. Conceptually, projecting the object echo 122 may include determining a line 208 connecting the location associated with the object echo 122 (e.g., along the radial line 202) and the location associated with the first reflected echo 138 (e.g., along the radial line 204). As also shown, a line 210 perpendicular to and bisecting the line 208 intersects the line 204 at a reflection point 212; thus, the distance of a line segment 214 between the reflection point 212 and the object echo 122 is equal to the distance of a line segment 216 between the reflection point 212 and the first reflected echo 138. Further, the object echo velocity 160 may be mirrored about the line 210 as a first reflection velocity 218. Once determined, the first reflection velocity 218 may be resolved into two velocity components: a radial velocity component 220 generally along the line 204, and a tangential velocity component 222 perpendicular to the radial velocity component 220. The radial velocity component 220 is the projection velocity of the object echo 122 along the direction of the line 204. In other words, the first projection velocity is the radial velocity component 220 of the first reflection velocity 218, and the first reflection velocity 218 is the mirror image, about the line 210, of the vector representing the object echo velocity 160.
In the example of fig. 2, the radial velocity component 220 (the projection velocity) is the velocity (i.e., magnitude and direction) that would be sensed by the radar sensor(s) 104 if the radio energy reflected from the additional vehicle 108 were also reflected at the reflection point 212. As will be appreciated, the radial velocity component 220 of the first reflection velocity 218 is in the same direction (e.g., along the line 204) as the first reflected echo velocity 162 (shown in fig. 1, but not in fig. 2). Thus, if the magnitudes of the radial velocity component 220 (the first projection velocity) and the first reflected echo velocity 162 are substantially similar (e.g., within a predetermined threshold or range of each other), the first reflected echo 138 may be flagged, tagged, or otherwise identified as a possible reflection.
Fig. 2 illustrates a similar conceptualization for determining whether the second reflected echo 140 is a reflected echo. For example, the second reflection velocity 224 at the location of the second reflected echo 140 may be a mirror image of the object echo velocity 160 with respect to the line 226. The line 226 is perpendicular to and bisects a line 228 extending between the location of the object echo 122 and the location of the second reflected echo 140. The second reflection velocity 224 includes a radial velocity component 230 (i.e., along the radial line 206) and a tangential velocity component 232 (i.e., perpendicular to the radial line 206).
As also shown in fig. 2, the line 226 meets the radial line 206 at a reflection point 234. As discussed above, the radial velocity component 230 of the second reflection velocity 224 is the projected (e.g., expected) velocity (i.e., direction and magnitude) of an echo associated with radio energy bouncing off the additional vehicle 108 and reflecting back to the radar sensor(s) 104 at the reflection point 234. In other words, the radial velocity component 230 is the projection velocity and corresponds to the expected echo associated with a reflection at the reflection point 234. As with the first projection velocity, the direction of the radial velocity component 230 is substantially the same as the direction of the second reflected echo velocity 164; that is, both are along the line 206. Thus, if the magnitude of the radial velocity component 230 (the second projection velocity) and the magnitude of the second reflected echo velocity 164 are substantially the same, the second reflected echo 140 may be flagged, tagged, or otherwise identified as a possible reflection.
In implementations of the present disclosure, and as further described herein, upon determining that the reflected echoes 138, 140 are reflected echoes (e.g., because the respective radial velocity components 220, 230 are substantially the same as the first and second reflected echo velocities 162, 164), the vehicle 102 may ignore (e.g., exclude from planning) the reflected echoes 138, 140. The vehicle 102 may also confirm the reflections, for example, by determining that an object is present at each of the reflection points 212, 234. For example, the vehicle 102 may use sensor data, map data, or the like to confirm the presence of the object, as described herein. In at least some examples, echoes can be tagged so that other components know that there may be no corresponding object associated with them.
Fig. 2 also illustrates another example implementation, in which a pedestrian 236 exits a building 238 and enters the environment 100. As shown, the pedestrian 236 may enter the environment at a location proximate the location associated with the second reflected echo 140. The radar sensor(s) 104 may generate radar data that includes information about the newly appearing pedestrian. Because the pedestrian 236 has not been previously tracked (i.e., because the pedestrian 236 has just appeared), the implementations described herein may compare the echo associated with the pedestrian 236 to known object echoes (e.g., the object echo 122) using the techniques described herein. For example, the object echo velocity 160 may be projected onto the location associated with the pedestrian 236. Because the pedestrian 236 is farther from the radar sensor(s) 104 than the additional vehicle 108, a reflection point may also be identified; in the example shown, that reflection point would be close to the reflection point 234. As in other examples, the radial component of the projected object echo velocity at the location of the pedestrian echo may also be determined. However, because the pedestrian echo is not a reflected echo, the magnitude of the radial velocity component of the projected object echo velocity will generally differ from the measured velocity. For example, unless the pedestrian walks in such a way that the component of the pedestrian's velocity toward the radar sensor(s) 104 matches that of a would-be reflected echo, the pedestrian echo will (correctly) not be identified as a reflection. Upon determining that the pedestrian echo is not a reflection, the movement of the pedestrian may be tracked and/or information about the pedestrian 236 may be used to generate controls for the vehicle.
Aspects of the present disclosure are not limited to the example implementation shown in fig. 2. For example, the techniques described herein may be used to identify reflections of other objects in the environment (including reflections from the vehicle 102). In some examples, radio energy reflected by the additional vehicle 108 may reflect (or bounce) off the vehicle 102, travel back to the additional vehicle 108 (or another object), and reflect off the additional vehicle again before being captured by the radar sensor(s) 104. In other words, the detected radio energy may traverse the path along the line 202 four times (twice in each direction) before being captured by the radar sensor(s) 104. This "secondary bounce" may cause an echo along the direction of the line 202, but twice as far from the vehicle 102. In some examples, the magnitude of the velocity will be different from (e.g., half of) the magnitude of the object echo velocity, which may allow the reflected echo to be identified.
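A hedged check for that secondary-bounce signature: an echo at roughly twice the range of a known object echo along (approximately) the same bearing. Both tolerances are illustrative assumptions, and a velocity comparison in the same spirit as the earlier sketches could be added.

```python
def is_second_bounce(obj_pos, cand_pos, range_tol=1.0, bearing_tol=0.02):
    """True if the candidate sits at ~2x the object echo's range on the same
    bearing, consistent with a double reflection off the ego vehicle."""
    r_obj = np.linalg.norm(obj_pos)
    r_cand = np.linalg.norm(cand_pos)
    cos_angle = np.clip(np.dot(obj_pos, cand_pos) / (r_obj * r_cand), -1.0, 1.0)
    same_bearing = np.arccos(cos_angle) < bearing_tol
    return bool(same_bearing and abs(r_cand - 2.0 * r_obj) < range_tol)
```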
Further, while fig. 2 shows the intermediate reflective objects as stationary, in other implementations the techniques described herein may also determine whether an echo is reflected from a moving object. For example, the velocity of the intermediate object may be known (e.g., from other radar returns), and this velocity can be used to adjust the radial component of the projected velocity. Other modifications are also contemplated.
Fig. 3 shows another schematic view of the environment 100 and is used to illustrate an alternative method for determining whether an echo is (or may be) a reflection of another echo. More specifically, fig. 3 illustrates a technique for using geometry to determine a reflection point, which may then be compared to environmental information, for example, to determine whether the reflection point corresponds to an object in the environment.
More specifically, fig. 3 shows the object echo 122 (off the additional vehicle 108) and the first reflected echo 138. For clarity, the second reflected echo 140 is not shown, although the techniques described with reference to an echo pair comprising the first reflected echo 138 and the object echo 122 may be applied to any radar echo pair (e.g., the object echo 122 and the second reflected echo 140, the first reflected echo 138 and the second reflected echo 140, and/or any other echo pair). As described herein, although a priori knowledge of the environment 100 may allow the object echo 122 to be associated with the additional vehicle 108, implementations of the present disclosure may be equally applicable to any radar echo pair, regardless of any association with known objects and/or the availability of a priori knowledge of the environment 100. Thus, although the echoes are labeled as an "object" echo and a "reflected" echo, one or both of these labels may not be known until after additional processing. In at least some examples, such labels may be used to help guide an autonomous vehicle. For example, a "reflection" echo may still be considered for navigation or otherwise in planning, albeit with reduced weight or importance.
As described herein, the radar sensor(s) 104 may receive only range, location, and/or velocity information about an echo. Thus, for example, the radar sensor(s) 104 may generate sensor data indicative of the location of the object echo 122 and the location of the first reflected echo 138. In fig. 3, the line 302 is a line between the location of the object echo 122 and the location of the radar sensor(s) 104, and the line 304 is a line between the location of the first reflected echo 138 and the location of the radar sensor(s) 104. Based on the echoes 122, 138 (i.e., their locations), the lengths of the lines 302, 304 and the angle 306 between the lines 302, 304 are known (or can be readily determined). The techniques described herein may use the lines 302, 304 and the angle 306 to determine a potential or theoretical reflection point 308 along the line 304, at which radio energy first reflected from the object (e.g., the additional vehicle 108) associated with the object echo 122 would be reflected before being received at the radar sensor(s) 104. Once the location of the theoretical reflection point 308 is determined, the techniques described herein may determine whether an object is present at the theoretical reflection point 308, thereby confirming (or at least indicating) that the first reflected echo 138 is a reflection of the object echo.
More specifically, the distance between the first reflected echo 138 and the object echo 122 (e.g., the length of the line 310 shown in fig. 3) may be determined using the lengths of the lines 302, 304 and the angle 306. For example, the length of the line 310 may be determined using the law of cosines, although this is not required. The theoretical reflection point 308 may then be determined as the intersection of the line 304 and a line 312, the line 312 bisecting, and being perpendicular to, the line 310 extending between the echoes 122, 138. For example, simple geometry may be used to determine the angle 314 between the line 304 (extending between the radar sensor(s) 104 and the first reflected echo 138) and the line 310 (extending between the first reflected echo 138 and the object echo 122). The length of the line segment 316 between the first reflected echo 138 and the theoretical reflection point 308 may then also be readily determined, providing the location of the reflection point 308 along the line 304. Furthermore, if the first reflected echo 138 is a true reflection, the length of the line segment 316 will be equal to the distance from the object echo 122 to the reflection point 308. This additional information may be used (additionally or alternatively) to calculate the position of the reflection point 308.
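For illustration, the construction above lends itself to a compact computation. The following is a minimal sketch, assuming a two-dimensional frame with the radar sensor at the origin; the function name, the use of numpy, and the example coordinates are illustrative assumptions, not part of the disclosure.

```python
# Sketch: compute the theoretical reflection point 308 as the intersection of
# line 304 (sensor to the farther echo) with line 312 (the perpendicular
# bisector of line 310 between the two echoes). Sensor assumed at the origin.
from typing import Optional
import numpy as np

def theoretical_reflection_point(p_obj: np.ndarray,
                                 p_refl: np.ndarray) -> Optional[np.ndarray]:
    """p_obj: 2D location of the object echo (e.g., echo 122).
    p_refl: 2D location of the candidate reflected echo (e.g., echo 138).
    Returns the point on the ray sensor->p_refl equidistant from both echoes,
    or None if no such point lies between the sensor and the farther echo.
    """
    # Parameterize candidates as R = t * p_refl and solve
    # |R - p_obj|^2 == |R - p_refl|^2 for t, which has a closed form.
    denom = 2.0 * (np.dot(p_refl, p_refl) - np.dot(p_refl, p_obj))
    if abs(denom) < 1e-9:
        return None  # degenerate geometry; no unique bisector crossing
    t = (np.dot(p_refl, p_refl) - np.dot(p_obj, p_obj)) / denom
    # A physical reflection point must lie between sensor and farther echo.
    return t * p_refl if 0.0 < t < 1.0 else None

# Example: object echo 20 m ahead; candidate echo farther away, off to the side.
point = theoretical_reflection_point(np.array([0.0, 20.0]), np.array([15.0, 15.0]))
# point -> array([2.5, 2.5]); note |sensor->point| + |point->object| equals the
# candidate echo's range, consistent with a multipath (ghost) return.
```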
In the implementation just described, geometry alone may be used to determine the potential reflection point 308. For example, no velocity information about the echoes is required. However, a potential reflection point may be determined for any echo pair having different ranges. Thus, the techniques described herein may also determine whether an object is present at the potential reflection point to determine whether one of the echoes (e.g., the farther echo) is a reflection. For example, sensor data (including but not limited to lidar data, image data, additional radar data, etc.) may be used to determine whether an object is present near the location of the theoretical reflection point 308. In the illustrated example, the building 114(1) may be identified using image data, lidar data, and/or other sensor data captured by the vehicle 102. In other embodiments, map data may confirm that an object is present at the reflection point 308. In some examples, velocity information may be used to determine potential reflections, based at least in part on projecting a velocity from one echo onto a unit vector associated with another echo. This projected velocity can then be directly compared with the velocity of the other echo.
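As one possible form of this presence check, the theoretical reflection point could be compared against object locations already known from lidar data, image data, radar data, and/or map data; the object list, the distance test, and the tolerance in the sketch below are illustrative assumptions.

```python
# Sketch: decide whether any known object lies near the theoretical
# reflection point. Known object centers might come from perception output
# or map data; the tolerance is an illustrative value, not prescribed here.
from typing import Iterable
import numpy as np

def object_near_point(point: np.ndarray,
                      known_object_centers: Iterable[np.ndarray],
                      tolerance_m: float = 1.5) -> bool:
    return any(np.linalg.norm(point - center) <= tolerance_m
               for center in known_object_centers)

# If an object (e.g., building 114(1)) sits near the theoretical reflection
# point, the farther echo of the pair is flagged as a likely reflection.
```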
Aspects of fig. 2 and 3 (and fig. 5 and 6 below) are described in the environment 100 of fig. 1 by way of example with reference to the components shown in fig. 1. However, the examples illustrated and described with reference to fig. 2, 3, 5, and 6 are not limited to being performed in environment 100 or using the components of fig. 1. For example, some or all of the examples described with reference to fig. 2, 3, 5, and 6 may be performed by one or more components of fig. 4, as described herein, or by one or more other systems or components.
Fig. 4 depicts a block diagram of an example system 400 for implementing techniques described herein. In at least one example, the system 400 may include a vehicle 402, which vehicle 402 may be the same as or different from the vehicle 102 shown in fig. 1.
The vehicle 402 may include vehicle computing device(s) 404, one or more sensor systems 406, one or more emitters 408, one or more communication connections 410, one or more drive modules 412, and at least one direct connection 414.
The vehicle computing device 404 may include one or more processors 416 and memory 418 communicatively coupled to the one or more processors 416. In the example shown, the vehicle 402 is an autonomous vehicle. However, the vehicle 402 may be any other type of vehicle, or any other system having at least one sensor (e.g., a camera-enabled smartphone). In the illustrated example, the memory 418 of the vehicle computing device 404 stores a positioning component 420, a perception component 422, a prediction component 424, a planning component 426, a reflection identification component 428, one or more system controllers 430, one or more maps 432, and a tracker component 434. While the positioning component 420, the perception component 422, the prediction component 424, the planning component 426, the reflection identification component 428, the system controller(s) 430, the map(s) 432, and/or the tracker component 434 are depicted in fig. 4 as residing in the memory 418 for illustrative purposes, it is contemplated that these components may additionally or alternatively be accessible to the vehicle 402 (e.g., stored on or otherwise accessible by a memory remote from the vehicle 402). In some examples, vehicle computing device(s) 404 may correspond to vehicle computing system(s) 172 of fig. 1 or may be an example of vehicle computing system(s) 172.
In at least one example, the positioning component 420 can include functionality to receive data from the sensor system(s) 406 to determine a position and/or orientation (e.g., one or more of x-position, y-position, z-position, roll, pitch, or yaw) of the vehicle 402. For example, the positioning component 420 may include and/or request/receive a map of the environment, and may continuously determine a location and/or orientation of the autonomous vehicle within the map. In some examples, the positioning component 420 may utilize SLAM (simultaneous localization and mapping), CLAMS (calibration, localization and mapping, simultaneously), relative SLAM, bundle adjustment, non-linear least squares optimization, etc. to receive image data, lidar data, radar data, IMU data, GPS data, wheel encoder data, and the like to accurately determine a location of the autonomous vehicle. In some examples, the positioning component 420 may provide data to various components of the vehicle 402 to determine an initial position of the autonomous vehicle 402 for generating a trajectory.
In some instances, the perception component 422 may include functionality to perform object detection, segmentation, and/or classification. In some examples, the perception component 422 may provide processed sensor data that indicates the presence of an object in proximity to the vehicle 402 and/or a classification of the object as an object type (e.g., car, pedestrian, cyclist, animal, building, tree, road surface, curb, sidewalk, unknown, etc.). In additional and/or alternative examples, the perception component 422 may provide processed sensor data that indicates one or more characteristics associated with a detected object (e.g., a tracked object) and/or the environment in which the object is located. In some examples, the characteristics associated with the object may include, but are not limited to, an x-position (global and/or local position), a y-position (global and/or local position), a z-position (global and/or local position), an orientation (e.g., roll, pitch, or yaw), an object type (e.g., a classification), a velocity of the object, an acceleration of the object, an extent (size) of the object, and/or the like. Characteristics associated with the environment may include, but are not limited to, the presence of another object in the environment, the state of another object in the environment, the time of day, the day of the week, a season, a weather condition, an indication of darkness/light, and the like. In some examples, the perception component 422 may determine objects using radar data, and may receive information about reflected echoes, e.g., to include/exclude sensor data, as described herein.
In some examples, the prediction component 424 may include functionality to generate predicted trajectories of objects in the environment. For example, the prediction component 424 may generate one or more predicted trajectories for vehicles, pedestrians, animals, and the like within a threshold distance of the vehicle 402. In some instances, the prediction component 424 may measure a trace of an object and generate a trajectory for the object. In some instances, the prediction component 424 may cooperate with the tracker 434 to track objects as they move through the environment. In some examples, information from the prediction component 424 may be used when determining whether radar returns are from known objects.
In general, the planning component 426 may determine a path for the vehicle 402 to follow through the environment. For example, the planning component 426 may determine various routes and trajectories at various levels of detail. For example, the planning component 426 may determine a route to travel from a first location (e.g., a current location) to a second location (e.g., a target location). For purposes of this discussion, a route may be a sequence of waypoints for traveling between the two locations. As non-limiting examples, waypoints include streets, intersections, Global Positioning System (GPS) coordinates, and the like. Further, the planning component 426 may generate instructions for guiding the autonomous vehicle along at least a portion of the route from the first location to the second location. In at least one example, the planning component 426 may determine how to guide the autonomous vehicle from a first waypoint in the sequence of waypoints to a second waypoint in the sequence of waypoints. In some examples, the instructions may be a trajectory or a portion of a trajectory. Further, in some implementations, multiple trajectories may be generated substantially simultaneously (e.g., within technical tolerances) in accordance with a receding horizon technique, wherein one of the multiple trajectories is selected for the vehicle 402 to navigate. In some examples, the planning component 426 may generate one or more trajectories for the vehicle 402 based at least in part on sensor data (e.g., radar returns). For example, the planning component 426 may exclude echoes determined to be reflected echoes.
In general, the reflection identification component 428 may include functionality to identify whether sensor data (e.g., radar returns) corresponds to an actual object or to a reflection of an object off of some intermediate object.
In some examples, the reflection identification component 428 may correspond to the reflection identification component 174 in fig. 1. As discussed herein, the reflection identification component 428 may receive radar data, lidar data, image data, map data, and the like to determine whether the sensed object is an actual object or merely a reflection of an actual object. In some examples, the reflection identification component 428 may provide sensor information determined not to correspond to a reflection to the planning component 426 to determine when to control the vehicle 402 to traverse the environment.
In some instances, the reflection identification component 428 may exclude sensor data determined to correspond to a reflection from the data provided to the planning component 426, e.g., such that the planning component 426 determines controls with that sensor data excluded (e.g., radar data determined to be associated with a reflection from an intermediate object). The reflection identification component 428 may also provide sensor information to the tracker 434, for example, to cause the tracker 434 to track a dynamic object in the environment.
The system controller(s) 430 may be configured to control the steering system, propulsion system, braking system, safety systems, emitter systems, communication systems, and other systems of the vehicle 402, for example, based on controls generated by, and/or based on information from, the planning component 426. The system controller(s) 430 may communicate with and/or control corresponding systems of the drive module(s) 412 and/or other components of the vehicle 402.
The map(s) 432 may be used by the vehicle 402 to navigate within the environment. For purposes of this discussion, a map may be any number of data structures, modeled in two dimensions, three dimensions, or N dimensions, that are capable of providing information about an environment, such as, but not limited to, topologies (e.g., intersections), streets, mountain ranges, roads, terrain, and the environment in general. In some examples, a map may include, but is not limited to, texture information (e.g., color information (e.g., RGB color information, Lab color information, HSV/HSL color information), etc.), intensity information (e.g., lidar information, radar information, etc.), spatial information (e.g., image data projected onto a grid, individual "surfels" (e.g., polygons associated with individual colors and/or intensities)), and reflectivity information (e.g., specular reflectivity information, retroreflectivity information, BRDF information, BSSRDF information, etc.). In one example, a map may include a three-dimensional mesh of the environment. In some instances, a map may be stored in a tiled format, such that individual tiles of the map represent discrete portions of the environment and may be loaded into working memory as needed. In some examples, the map(s) 432 may include at least one map (e.g., an image and/or a mesh). The vehicle 402 may be controlled based at least in part on the map(s) 432. That is, the map(s) 432 may be used in conjunction with the positioning component 420, the perception component 422, the prediction component 424, the planning component 426, and/or the reflection identification component 428 to determine a location of the vehicle 402, identify objects in the environment, and/or generate routes and/or trajectories to navigate within the environment. Further, and as described herein, the map(s) 432 may be used to verify the presence of an object (e.g., an intermediate object from which radio energy may be reflected before being received at a radar sensor).
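Purely as an illustration of the tiled storage just described, a map might be indexed by discrete tile coordinates and deserialized on demand; the tile size, cache capacity, and loader in the sketch below are assumptions, not details of the disclosure.

```python
# Sketch: load map tiles into working memory on demand, keeping only
# recently used tiles resident. Tile size and cache size are illustrative.
from functools import lru_cache

TILE_SIZE_M = 50.0  # assumed edge length of a square tile, in meters

def tile_index(x_m: float, y_m: float) -> tuple:
    """Map a world coordinate to the discrete tile containing it."""
    return int(x_m // TILE_SIZE_M), int(y_m // TILE_SIZE_M)

@lru_cache(maxsize=64)  # evict tiles that have not been used recently
def load_tile(ix: int, iy: int) -> dict:
    # Placeholder for deserializing a stored tile (e.g., mesh, texture,
    # intensity, and reflectivity layers read from disk or a remote store).
    return {"index": (ix, iy), "data": None}
```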
In some examples, map(s) 432 may be stored on one or more remote computing devices (e.g., one or more computing devices 438) accessible via one or more networks 436. In some examples, map(s) 432 may include a plurality of similar maps that are stored based on, for example, characteristics (e.g., type of entity, time of day, day of week, season of year, etc.). Storing multiple maps 432 in this manner may have similar memory requirements, but increases the speed at which data in the maps may be accessed.
The tracker 434 may include functionality to track (e.g., follow) the motion of an object. For example, the tracker 434 may receive, from one or more of the sensor system(s) 406, sensor data representing dynamic objects in the environment of the vehicle. For example, an image sensor on the vehicle 402 may capture image sensor data, a lidar sensor may capture point cloud data, and a radar sensor may obtain echoes indicative of the position, pose, etc. of objects in the environment at multiple times. Based on this data, the tracker 434 can determine tracking information for an object. For example, the tracking information may provide historical positions, velocities, accelerations, etc. of the associated object. Further, in some instances, the tracker 434 can track objects in occluded regions of the environment. Techniques for tracking objects in occluded regions are described in U.S. patent application No. 16/147,177, "Radar Spatial Estimation," filed September 28, 2018, the entire disclosure of which is incorporated herein by reference. As described herein, information from the tracker 434 may be used to verify that radar returns correspond to tracked objects. In at least some examples, the tracker 434 can be associated with the perception component 422, such that the perception component 422 performs data association to determine whether a newly identified object should be associated with a previously identified object.
As can be appreciated, the components discussed herein (e.g., the positioning component 420, the perception component 422, the prediction component 424, the planning component 426, the reflection identification component 428, the system controller(s) 430, the map(s) 432, and the tracker 434) are described separately for illustrative purposes. However, the operations performed by the various components may be combined in, or performed by, any other component. By way of example, the reflection identification function may be performed by the perception component 422 and/or the planning component 426 (e.g., rather than by the reflection identification component 428) to reduce the amount of data transmitted by the system.
In some instances, aspects of some or all of the components discussed herein may include models, algorithms, and/or machine learning algorithms. For example, in some instances, components in memory 418 (and/or memory 442 discussed below) may be implemented as a neural network.
As described herein, an example neural network is a biologically inspired technique in which input data is passed through a series of connected layers to produce an output. Each layer in a neural network may also comprise another neural network, or may comprise any number of layers (whether convolutional or not). As can be understood in the context of this disclosure, a neural network may utilize machine learning, which can refer to a broad class of such algorithms in which an output is generated based on learned parameters.
Although discussed in the context of neural networks, any type of machine learning may be used consistent with this disclosure. For example, machine learning algorithms may include, but are not limited to, regression algorithms (e.g., ordinary least squares regression (OLSR), linear regression, logistic regression, stepwise regression, multivariate adaptive regression splines (MARS), locally estimated scatterplot smoothing (LOESS)), instance-based algorithms (e.g., ridge regression, least absolute shrinkage and selection operator (LASSO), elastic net, least-angle regression (LARS)), decision tree algorithms (e.g., classification and regression tree (CART), iterative dichotomiser 3 (ID3), chi-squared automatic interaction detection (CHAID), decision stump, conditional decision trees), Bayesian algorithms (e.g., naïve Bayes, Gaussian naïve Bayes, multinomial naïve Bayes, average one-dependence estimators (AODE), Bayesian belief network (BNN), Bayesian networks), clustering algorithms (e.g., k-means, k-medians, expectation maximization (EM), hierarchical clustering), artificial neural network algorithms (e.g., perceptron, back-propagation, Hopfield network, radial basis function network (RBFN)), deep learning algorithms (e.g., deep Boltzmann machine (DBM), deep belief networks (DBN), convolutional neural network (CNN), stacked auto-encoders), dimensionality reduction algorithms (e.g., principal component analysis (PCA), principal component regression (PCR), partial least squares regression (PLSR), Sammon mapping, multidimensional scaling (MDS), projection pursuit, linear discriminant analysis (LDA), mixture discriminant analysis (MDA), quadratic discriminant analysis (QDA), flexible discriminant analysis (FDA)), ensemble algorithms (e.g., boosting, bootstrapped aggregation (bagging), AdaBoost, stacked generalization (blending), gradient boosting machines (GBM), gradient boosted regression trees (GBRT), random forest), SVM (support vector machine), supervised learning, unsupervised learning, semi-supervised learning, etc.
Additional examples of architectures include neural networks such as ResNet50, ResNet101, VGG, DenseNet, PointNet, and the like.
In at least one example, sensor system(s) 406 can include lidar sensors, radar sensors, ultrasonic transducers, sonar sensors, positioning sensors (e.g., GPS, compass, etc.), inertial sensors (e.g., Inertial Measurement Unit (IMU), accelerometer, magnetometer, gyroscope, etc.), cameras (e.g., RGB, IR, intensity, depth, time of flight, etc.), microphones, wheel encoders, environmental sensors (e.g., temperature sensors, humidity sensors, light sensors, pressure sensors, etc.), and so forth. Sensor system(s) 406 may include multiple instances of each of these or other types of sensors. For example, the lidar sensors may include individual lidar sensors located at corners, front, rear, sides, and/or top of the vehicle 402. As another example, the camera sensors may include multiple cameras disposed at various locations around the exterior and/or interior of the vehicle 402. As another example, the radar sensors may include multiple instances of the same or different radar sensors disposed at various locations around the vehicle 402. The sensor system(s) 406 may provide input to the vehicle computing device 404. Additionally or alternatively, the sensor system(s) 406 may transmit sensor data to the computing device(s) 438 via the one or more networks 436 at a particular frequency, after a predetermined period of time has elapsed, in near real-time, or the like. In some examples, sensor system(s) 406 may correspond to sensor system(s) 164 of fig. 1, including radar sensor(s) 104 and/or additional sensor(s) 166.
The emitter(s) 408 may be configured to emit light and/or sound. The emitter(s) 408 in this example may include interior audio and visual emitters to communicate with occupants of the vehicle 402. By way of example and not limitation, interior emitters may include: speakers, lights, signs, display screens, touch screens, haptic emitters (e.g., vibration and/or force feedback), mechanical actuators (e.g., seatbelt pretensioners, seat positioners, headrest positioners, etc.), and the like. The emitter(s) 408 in this example may also include exterior emitters. By way of example and not limitation, the exterior emitters in this example include lights to signal a direction of travel or other indicators of vehicle action (e.g., indicator lights, signs, light arrays, etc.), and one or more audio emitters (e.g., speakers, speaker arrays, horns, etc.) to audibly communicate with pedestrians or other nearby vehicles, one or more of which may comprise acoustic beam steering technology.
Communication connection(s) 410 may enable communication between vehicle 402 and one or more other local or remote computing devices. For example, communication connection(s) 410 may facilitate communication with other local computing device(s) and/or drive module(s) 414 on vehicle 402. Also, communication connection(s) 410 may allow the vehicle to communicate with other nearby computing device(s) (e.g., other nearby vehicles, traffic signals, etc.). The communication connection(s) 410 also enable the vehicle 402 to communicate with a remotely operated computing device or other remote service.
The communication connection(s) 410 may include a physical and/or logical interface for connecting the vehicle computing device 404 to another computing device or network (e.g., network(s) 436). For example, communication connection(s) 410 may enable Wi-Fi-based communication, e.g., via frequencies defined by the IEEE 802.11 standards, short-range wireless frequencies (e.g., Bluetooth®), cellular communication (e.g., 2G, 3G, 4G LTE, 5G, etc.), or any suitable wired or wireless communication protocol that enables the respective computing device to interface with other computing device(s).
In at least one example, the vehicle 402 may include drive module(s) 412. In some examples, the vehicle 402 may have a single drive module 412. In at least one example, the vehicle 402 may have multiple drive modules 412, with individual drive modules 412 positioned at opposite ends (e.g., forward and rearward, etc.) of the vehicle 402. In at least one example, the drive module(s) 412 can include one or more sensor systems to detect conditions of the drive module(s) 412 and/or the surroundings of the vehicle 402. By way of example and not limitation, the sensor system(s) associated with the drive module(s) 412 may include: one or more wheel encoders (e.g., rotary encoders) to sense rotation of the wheels of the drive module; inertial sensors (e.g., inertial measurement units, accelerometers, gyroscopes, magnetometers, etc.) to measure orientation and acceleration of the drive module; cameras or other image sensors; ultrasonic sensors to acoustically detect objects in the environment surrounding the drive module; lidar sensors; radar sensors; etc. Some sensors, such as the wheel encoders, may be unique to the drive module(s) 412. In some cases, the sensor system(s) on the drive module(s) 412 may overlap or supplement corresponding systems of the vehicle 402 (e.g., sensor system(s) 406).
The drive module(s) 412 may include many of the vehicle systems, including: a high voltage battery, an electric motor to propel the vehicle, an inverter to convert direct current from the battery to alternating current for use by other vehicle systems, a steering system including a steering motor and steering rack (which may be electric), a braking system including hydraulic or electric actuators, a suspension system including hydraulic and/or pneumatic components, a stability control system to distribute braking power to mitigate traction loss and maintain control, an HVAC system, lighting (e.g., lighting such as headlights/taillights to illuminate the exterior environment of the vehicle), and one or more other systems (e.g., a cooling system, a security system, an on-board charging system, other electrical components such as a DC/DC converter, a high voltage junction, a high voltage cable, a charging system, a charging port, etc.). Additionally, the drive module(s) 412 can include a drive module controller that can receive and pre-process data from the sensor system(s) and control the operation of various vehicle systems. In some examples, the drive module controller may include one or more processors and a memory communicatively coupled with the one or more processors. The memory may store one or more modules to perform various functions of the drive module(s) 412. In addition, the drive module(s) 412 may also include one or more communication connections that enable the respective drive module to communicate with one or more other local or remote computing devices.
In at least one example, the direct connection 414 may provide a physical interface to couple the drive module(s) 412 with the body of the vehicle 402. For example, the direct connection 414 may allow energy, fluid, air, data, etc. to be transferred between the drive module(s) 412 and the vehicle. In some examples, the direct connection 414 may further releasably secure the drive module(s) 412 to the body of the vehicle 402.
In at least one example, the positioning component 420, the perception component 422, the prediction component 424, the planning component 426, the reflection identification component 428, the system controller(s) 430, the map(s) 432, and/or the tracker 434 can process the sensor data as described above and can send their respective outputs to the one or more computing devices 438 over the one or more networks 436. In at least one example, the location component 420, perception component 422, prediction component 424, planning component 426, reflection identification component 428, system controller(s) 430, map(s) 432, and/or tracker 434 can send their respective outputs to one or more computing devices 438 at a particular frequency, after a predetermined period of time has elapsed, in near real-time, and/or the like.
In some examples, the vehicle 402 may transmit the sensor data to the computing device(s) 438, e.g., via the network(s) 436. In some examples, the vehicle 402 may send the raw sensor data to the computing device(s) 438. In other examples, the vehicle 402 may send the processed sensor data and/or a representation of the sensor data (e.g., spatial grid data) to the computing device(s) 438. In some examples, the vehicle 402 may transmit the sensor data to the computing device(s) 438 at a particular frequency, after a predetermined period of time has elapsed, in near real-time, and so on. In some cases, the vehicle 402 may send the sensor data (raw or processed) to the computing device(s) 438 as one or more log files.
Computing device(s) 438 may include processor(s) 440 and memory 442, the memory 442 storing one or more maps 444 and/or a reflection identification component 446.
In some instances, map(s) 444 may be similar to map(s) 432. The reflection identification component 446 may perform substantially the same functions as those described for the reflection identification component 428, in addition to or instead of those functions being performed at the vehicle computing device(s) 404.
The processor(s) 416 of the vehicle 402 and the processor(s) 440 of the computing device(s) 438 may be any suitable processor capable of executing instructions to process data and perform operations as described herein. By way of example, and not limitation, processor(s) 416 and 440 may include one or more Central Processing Units (CPUs), Graphics Processing Units (GPUs), or any other device or portion of a device that processes electronic data to convert that electronic data into other electronic data that may be stored in registers and/or memory. In some examples, integrated circuits (e.g., ASICs, etc.), gate arrays (e.g., FPGAs, etc.), and other hardware devices may be considered processors, provided that they are configured to implement the coded instructions.
Memory 418 and memory 442 are examples of non-transitory computer-readable media. Memory 418 and memory 442 may store an operating system and one or more software applications, instructions, programs, and/or data to implement the methods described herein and the functions attributed to the various systems. In various implementations, the memory may be implemented using any suitable storage technology (e.g., static random access memory (SRAM), synchronous dynamic RAM (SDRAM), non-volatile/flash-type memory, or any other type of memory capable of storing information). The architectures, systems, and individual elements described herein may include many other logical, programmatic, and physical components, of which those shown in the figures are merely examples relevant to the discussion herein.
It should be noted that although fig. 4 is illustrated as a distributed system, in alternative examples, components of vehicle 402 may be associated with computing device(s) 438 and/or components of computing device(s) 438 may be associated with vehicle 402. That is, the vehicle 402 may perform one or more of the functions associated with the computing device(s) 438, and vice versa. Further, aspects of reflection identification components 428, 446 and/or tracker 434 may be performed on any of the devices discussed herein.
Fig. 5 is a flow diagram of an example process 500 for determining whether a radar echo is a reflected echo, in accordance with an embodiment of the present disclosure. Although discussed in the context of radar data, the example process 500 may also be used in the context of and/or in conjunction with lidar data, sonar data, time-of-flight image data, and/or other types of data.
At operation 502, the process 500 may include receiving radar data for an environment. For example, the radar data may include radar returns including position, velocity, and/or intensity information. In some examples, the radar system may include one or more radar sensors, and may be a sensor system of an autonomous vehicle (e.g., the vehicle 102 described above). The radar data may include a plurality of echoes, including static echoes, which correspond to stationary objects and have a position and zero velocity, and dynamic echoes, or radar tracks, which correspond to moving objects and have a position and a non-zero velocity.
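For illustration only, a radar return of the kind described here might be represented in memory as follows; the field names and the zero-velocity threshold are assumptions for the sketch, not part of the disclosure.

```python
# Sketch: one possible in-memory form of a radar return carrying position,
# Doppler velocity, and intensity information, as described above.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float       # radial distance from the sensor
    azimuth_rad: float   # angular position relative to the sensor
    radial_speed: float  # signed Doppler speed along the radial direction
    intensity: float     # strength of the return

    @property
    def is_static(self) -> bool:
        # Static returns have (approximately) zero velocity; the threshold
        # here is an illustrative value.
        return abs(self.radial_speed) < 0.1
```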
At operation 504, the process 500 may include determining a first velocity along a first radial direction at a first location based on a first radar return in the radar data. The first radar echo may comprise a velocity in a first direction between the object and the vehicle, e.g. in a radial direction extending from (the radar sensor on) the vehicle and through the object. In some instances, the first radar echo may be an object echo, e.g., corresponding to a known object in the environment. In some examples, it may be determined that the first radar echo corresponds to an object echo based on tracking information generated prior to receiving the radar data. However, in other instances, the techniques described herein may operate without any a priori knowledge of the echo. In other words, the first echo (and additional echoes including the second echo discussed below) may not be associated with objects in the environment, and in some instances such association may not be necessary. The position of the first radar echo may be a distance along a first radial direction.
At operation 506, the process 500 may include determining a second velocity along a second radial direction at a second location based on a second radar return in the radar data. For example, as with the first radar echo, the second radar echo (and other radar echoes) may not be associated with known objects in the environment. The second radar echo may be an echo from another object in the environment (e.g., a newly detected object) and/or an echo from a reflection from some intermediate object in the environment. In the latter case, the second radar echo may identify a phantom "object" at a location along the second radial direction.
At operation 508, the process 500 may determine the projection velocity as the projection of the first velocity in the second radial direction. For example, operation 508 may project one echo of an echo pair (e.g., the first echo) into a direction associated with the other echo of the echo pair (e.g., a unit vector associated with the second echo). As will be appreciated, because a reflected echo cannot be closer than the object echo (e.g., the direct echo), operation 508 may project the velocity associated with the closer of the first and second echoes into a direction associated with the farther of the echoes. For example, and referring to the example of fig. 2, the first echo may be the object echo 122 and the second echo may be the first reflected echo 138. The projection velocity determined by operation 508 may be the radial component of the object echo velocity 160 reflected about line 210, where line 210 bisects, and is perpendicular to, line 208 connecting the object echo 122 and the first reflected echo 138. Thus, the projection velocity may be the velocity that would result from radio energy first reflecting from the object and subsequently reflecting from an intermediate object disposed along the second direction.
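A minimal sketch of this projection follows, assuming a two-dimensional frame with the sensor at the origin and a static intermediate object; the names and the use of numpy are illustrative, not part of the disclosure.

```python
# Sketch of operation 508: mirror the first echo's measured radial velocity
# about the perpendicular bisector of the segment joining the two echoes
# (line 210 in fig. 2), then keep the component along the second echo's
# radial direction.
import numpy as np

def projected_speed(p1: np.ndarray, s1: float, p2: np.ndarray) -> float:
    """p1, p2: 2D echo locations (sensor at origin); s1: Doppler speed of
    the first echo, signed along the direction sensor->p1."""
    u1 = p1 / np.linalg.norm(p1)               # first radial direction
    u2 = p2 / np.linalg.norm(p2)               # second radial direction
    v1 = s1 * u1                               # measured radial velocity vector
    n = (p2 - p1) / np.linalg.norm(p2 - p1)    # normal of the bisector line
    v_mirrored = v1 - 2.0 * np.dot(v1, n) * n  # reflect v1 about the bisector
    return float(np.dot(v_mirrored, u2))       # radial component along u2
```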
At operation 510, the process 500 may determine whether the projection speed corresponds to a second speed. For example, at operation 510, a magnitude of a radial component of the projected velocity (e.g., a component along the second radial direction) may be compared to a magnitude of the second velocity (which is measured by the radar sensor along the second radial direction). For example, such a comparison may determine whether the magnitude of the radial component is substantially equal to the magnitude of the second velocity. As used herein, the term "substantially equal" and similar terms may mean that two values are equal or within some threshold margin of each other. For example, the margin may be an absolute value (e.g., 0.1m/s, 0.5m/s, 2m/s, etc.), a percentage (e.g., 0.5%, 1%, 10%), or some other metric. In some examples, the margin may be based on a fidelity of the radar sensor, a range of the object, a range associated with the echo, a velocity of the object, a velocity associated with the echo, and/or other features and/or factors.
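The "substantially equal" test might then be implemented with both an absolute margin and a percentage margin; the values in the sketch below are drawn from the illustrative ranges mentioned above and are not prescribed by the disclosure.

```python
# Sketch of the comparison in operation 510: the projected speed and the
# measured second speed are "substantially equal" if they agree within an
# absolute margin or a percentage margin (both illustrative values).
def substantially_equal(projected: float, measured: float,
                        abs_margin_mps: float = 0.5,
                        rel_margin: float = 0.10) -> bool:
    diff = abs(abs(projected) - abs(measured))
    scale = max(abs(projected), abs(measured), 1e-6)
    return diff <= abs_margin_mps or diff <= rel_margin * scale
```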
If it is determined at operation 510 that the projection speed corresponds to the second speed, at operation 512, the process 500 may identify the second radar return as a reflected radar return (or possibly a reflected radar return). For example, the vehicle computing device may determine that the echo is (possibly) a reflected echo because the echo closely corresponds to a (theoretical) reflection of the first echo.
At operation 514, the process 500 may optionally receive additional sensor data, and at operation 516, the process 500 may optionally confirm the presence of the intermediate object based on the additional sensor data. For example, operations 514, 516 may confirm that the second radar echo is a reflected echo by confirming the presence of the intermediate object (which would have reflected the radio energy) causing the reflection. As described herein, the additional sensor data may be any type of sensor data from one or more sensor modalities disposed on or otherwise associated with the vehicle. In some examples, the presence of the intermediate object may also be confirmed from map data (e.g., when the intermediate object is a fixture, a terrain feature, or the like). In other implementations, the additional sensor data received at operation 514 may be subsequently received data, including but not limited to subsequent radar data. By way of non-limiting example, the subsequently received data may be used to attempt to track the second echo over time. As noted herein, a reflected echo may be short-lived, so attempts to track the second echo using previous and/or subsequent sensor data may be unsuccessful, which may itself suggest that the echo is a reflection.
At operation 518, the process 500 may control the vehicle with the radar returns excluded. For example, because it has been determined that the radar echo is a reflected echo and is not representative of an actual object in the environment, control of the vehicle may not rely on the radar echo. As described herein, conventional planning systems may have controlled the vehicle (e.g., by braking, steering, etc.) to react to phantom "objects" represented by radar returns. However, by identifying and excluding reflected echoes, the techniques described herein may provide improved control, e.g., only responding to actual objects in the environment.
Conversely, if it is determined at operation 510 that the projection speed does not correspond to the second speed, at operation 520, the process 500 may identify the second radar echo as a potential additional object in the environment. For example, when the magnitude of the second velocity does not correspond to the radial component of the projected velocity, the second radar echo may not be a reflected echo. Instead, the second echo may correspond to an actual object in the environment. For example, the second echo may be from a newly detected object, such as a pedestrian exiting a building in the example of fig. 2.
At operation 522, the process 500 may optionally confirm the presence of the additional object. For example, operation 522 may include receiving additional sensor information and determining that an object is present at a location associated with the second radar return based at least in part on the additional sensor information. The additional sensor information may include one or more of lidar data, image data, additional radar data, time-of-flight data, and the like. In some examples, operation 522 may be substantially the same as operation 516, but would confirm that the object is present at the second location, rather than at some intermediate point.
At operation 524, the process 500 may include controlling the vehicle based at least in part on the second radar echo. For example, because the second radar return may be associated with an actual dynamic object, the techniques described herein may control the vehicle with respect to that object. In some examples, a trajectory of the vehicle may be determined based at least in part on the second radar echo. By way of non-limiting example, a prediction system (e.g., the prediction component 424) can determine a predicted trajectory of the additional object, and a planning system (e.g., the planning component 426) can generate a trajectory of the vehicle relative to the predicted trajectory of the additional object. In some implementations, operation 524 may also or alternatively include tracking the additional object, e.g., such that subsequent echoes from the additional object may be treated as object echoes. Such object echoes may then be used to identify reflected echoes caused by the additional object.
Fig. 6 is a flow diagram of another example process 600 for determining whether a radar echo is a reflected echo, in accordance with an embodiment of the present disclosure. The process 600 may be used in place of or in addition to the process 500 discussed above. Although process 600 is not limited to the environment shown in fig. 3, aspects of process 600 may correspond to the techniques discussed above with reference to fig. 3. Further, although discussed in the context of radar data, the example process 600 may be used in the context of and/or in conjunction with lidar data, sonar data, time-of-flight image data, and/or other types of data.
At operation 602, the process 600 may include receiving radar data for an environment. For example, the radar data may include radar returns including position, velocity, and/or intensity information. In some examples, the radar system may include one or more radar sensors, and may be a sensor system of an autonomous vehicle (e.g., the vehicle 102 described above). The radar data may include a plurality of echoes, including static echoes, which correspond to stationary objects and have a position and zero velocity, and dynamic echoes, or radar tracks, which correspond to moving objects and have a position and a non-zero velocity.
At operation 604, the process 600 may include identifying a first location of a first echo of the radar data. The first radar echo may identify a depth or range of the echo, a position (e.g., angular position) and/or a velocity of the echo relative to the sensor. In some instances, the first radar echo may be an object echo, e.g., corresponding to a known object in the environment. In some examples, it may be determined that the first radar echo corresponds to an object echo based on tracking information generated prior to receiving the radar data. However, in other examples, the techniques described herein may operate without any a priori knowledge of the echo. In other words, the first echo (and additional echoes including the second echo discussed below) may not be associated with objects in the environment, and in some instances such association may not be necessary. The position of the first radar echo may be a position along a first radial direction, an x-y coordinate in a coordinate system, or some other position information.
At operation 606, the process 600 may include identifying a second location of a second echo of the radar data. The second radar echo may identify a depth or range of the echo, a position (e.g., angular position) and/or a velocity of the echo relative to the sensor. In other implementations, the location of the second radar echo may be an x-y coordinate in a coordinate system or some other location information. The second radar echo may be an echo from another object in the environment (e.g., a newly detected object) and/or an echo from a reflection from some intermediate object in the environment. In the latter case, the radar returns may identify phantom "objects" at locations along the radial direction. The techniques described herein may be used to determine whether an echo is a reflected echo.
At operation 608, the process 600 may determine a location of the reflection point based on the first location and the second location. For example, the techniques described herein may determine the reflection point as a point in space, along the second radial direction, at which radio energy reflected from an object associated with the first echo may be reflected toward the sensor to generate the second echo. In some examples, operation 608 may determine the reflection point using geometry associated with the locations of the first echo and the second echo. Fig. 3 illustrates an example technique in which the reflection point may be determined as the intersection of a first line extending between the sensor and the second echo and a second line perpendicular to, and bisecting, the line extending between the first echo and the second echo. Additionally or alternatively, the assumption that the line segments from the reflection point to the two echo locations are equal in length may be used to determine the position of the reflection point along the line. As will be appreciated, for any given pair of echoes, the reflection point must lie on the line between the sensor and the echo that is radially farther from the sensor; thus, in the example of fig. 6, the second echo will be farther than the first echo.
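For illustration, under the assumption that the sensor sits at the origin and the echo locations are treated as planar points p1 (first echo) and p2 (farther, second echo), this construction admits a closed form; the symbols p1, p2, and t are introduced here for illustration and are not reference numerals of the disclosure. Parameterizing the candidate reflection point as R = t·p2 along the ray toward the second echo, the equal-segment condition yields:

```latex
\lVert t\,\mathbf{p}_2 - \mathbf{p}_1 \rVert^{2} = \lVert t\,\mathbf{p}_2 - \mathbf{p}_2 \rVert^{2}
\quad\Longrightarrow\quad
t = \frac{\lVert \mathbf{p}_2 \rVert^{2} - \lVert \mathbf{p}_1 \rVert^{2}}
         {2\left(\lVert \mathbf{p}_2 \rVert^{2} - \mathbf{p}_1 \cdot \mathbf{p}_2\right)}
```

A physically meaningful reflection point requires 0 < t < 1, consistent with the observation that the reflection point must lie between the sensor and the farther echo. This is the same closed form used in the sketch following the discussion of fig. 3 above.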
At operation 610, the process 600 may receive data regarding objects in an environment. For example, operation 610 may include receiving additional sensor data about the environment. Such additional sensor data may be any type of sensor data from one or more sensor modalities disposed on or otherwise associated with the vehicle. In some examples, operation 610 may include receiving map data of an environment. In the examples described herein, the data about the object may be from any source capable of providing information about whether the object in the environment is static or dynamic.
At operation 612, the process 600 determines whether an object is at the location of the reflection point. As described above, the geometry of the echoes allows a theoretical reflection point to be determined, i.e., the point at which radio energy first reflected from an object associated with the first echo would subsequently be reflected to produce an echo at the location of the second echo. Such an echo is a ghost reflection (or ghost image), as opposed to a (direct) reflection from an object in the environment. Thus, while such a theoretical point may be determined for any echo pair, the data about objects received at operation 610 may be used to determine whether an object is actually present at the theoretical reflection point.
At operation 612, if it is determined that an object is present at the reflection point, at operation 614, the process 600 may identify the second radar echo as a reflected radar echo. For example, the vehicle computing device may determine that the echo is a reflected echo because the location of the second echo closely corresponds to the location of the (theoretical) reflection of the first echo at the reflection point, and there is an object at the theoretical reflection point. For example, the vehicle computing device may mark the second echo as a potential reflected echo and/or send information to other components of the vehicle.
At operation 616, the process 600 may control the vehicle with the second radar echo excluded. For example, control of the vehicle may not rely on the second radar echo, because the second radar echo has been determined to be a reflected echo and is not representative of an actual object in the environment. As described herein, conventional planning systems may have controlled the vehicle (e.g., by braking, steering, etc.) to react to the phantom "object" represented by the radar echo. However, by identifying and excluding reflected echoes, the techniques described herein may provide improved control, e.g., responding only to actual objects in the environment. In some implementations, the techniques described herein may also track the second echo to confirm that it is a reflected echo. As indicated above, the environment geometry creates the conditions that allow an echo to be reflected, but that geometry is constantly changing. Thus, while a reflected echo may be present in one radar scan, it is unlikely to be present in a subsequent (or previous) scan, and attempts to track such a short-lived echo may therefore be unsuccessful, further suggesting that the echo is a reflection.
Conversely, if it is determined at operation 612 that an object is not present at the location of the reflection point, at operation 618, the process 600 may identify the second radar echo as a potential additional object in the environment. For example, when no object is present at the reflection point, the second radar echo is likely not a reflected echo. Instead, the second echo may correspond to an actual object in the environment. For example, the echo may be from a newly detected object, such as a pedestrian exiting a building in the example of fig. 3.
At operation 620, the process 600 may optionally confirm the presence of the additional object. For example, operation 620 may include receiving additional sensor information and determining, based at least in part on the additional sensor information, that an object is present at the location associated with the second radar echo. The additional sensor information may include one or more of lidar data, image data, additional radar data, time-of-flight data, and the like. In some implementations, operation 620 may be substantially the same as operations 610, 612, except that, rather than determining that an object is present at the reflection point, the process 600 determines that an object is present at the second location (i.e., the location of the second echo).
At operation 622, the process 600 may include controlling the vehicle based at least in part on the second radar echo. For example, because the second radar return may be associated with an actual dynamic object, the techniques described herein may control the vehicle with respect to the object. In some examples, a trajectory of the vehicle may be determined based at least in part on the second radar echo. By way of non-limiting example, a prediction system (e.g., the prediction component 424) can determine a predicted trajectory of the additional object, and a planning system (e.g., the planning component 426) can generate a trajectory of the vehicle relative to the predicted trajectory of the additional object. In some implementations, operation 622 may also or alternatively include tracking the additional object, e.g., such that subsequent echoes from the additional object may be considered object echoes. Such object echoes may be used to determine reflected echoes caused by additional objects.
The operations of processes 500, 600 may be performed serially and/or in parallel. By way of non-limiting example, the radar data received at operations 502, 602 may include a plurality of radar returns. In some implementations, operations 506, 508, 510, 512, 608, 610, 612, etc. may be performed, e.g., in parallel, for each of the radar returns. Thus, all reflected echoes in the radar data may be identified relatively quickly and removed from consideration, as described herein. Furthermore, although only a single echo pair is referenced in each of figs. 5 and 6, multiple echo pairs may also be compared using the techniques described herein. By way of non-limiting example, the same echo may be compared to a plurality of other echoes (which may or may not be known to be associated with objects in the environment). In some examples, multiple radar returns may be received for each known object, and these returns may be compared (e.g., using process 500 or process 600) to other returns that are not associated with the object. Further, echo pairs may be processed according to both process 500 and process 600, for example, to obtain a further determination that an echo is a reflection. In other implementations, the multiple radar returns may be filtered, e.g., such that only a subset of the multiple radar returns is processed. For example, radar returns that are closer to the vehicle 102 than a known object cannot be reflected returns, and thus may not be investigated according to aspects of the process 500. Furthermore, radar returns with velocities below a minimum velocity (e.g., a non-zero velocity) may also be excluded from investigation, for example, because such returns (even if reflected returns) may have minimal impact on the vehicle. Further, echoes whose angle (relative to the radar sensor(s) 104/vehicle 102) is equal to or greater than a threshold angle (e.g., 45 degrees, 60 degrees, 90 degrees, etc.) may also be excluded. Other filtering techniques and criteria may also be used; one possible form of such filtering is sketched below.
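The sketch below mirrors the return representation used earlier (range, azimuth, signed Doppler speed); the thresholds are illustrative assumptions and are not prescribed by the disclosure.

```python
# Sketch: pre-filter candidate returns before reflection checking. Returns
# closer than the known object, returns with near-zero speed, and returns
# at large angles (per the example thresholds above) are excluded.
from typing import NamedTuple
import math

class Return(NamedTuple):
    range_m: float
    azimuth_rad: float
    radial_speed: float

def worth_investigating(ret: Return, known_object_range_m: float,
                        min_speed_mps: float = 0.5,
                        max_angle_rad: float = math.radians(60.0)) -> bool:
    return (ret.range_m > known_object_range_m          # reflections appear farther than the object
            and abs(ret.radial_speed) >= min_speed_mps  # drop near-zero-velocity returns
            and abs(ret.azimuth_rad) < max_angle_rad)   # drop returns at or beyond the threshold angle
```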
Fig. 5 and 6 illustrate example processes according to embodiments of the disclosure. The processes of fig. 5 and 6 may, but need not, be performed as multiple sub-processes, for example, by different components of the vehicle 102, 402. The processes 500, 600 are illustrated as logical flow diagrams, wherein each operation represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions stored on one or more non-transitory computer-readable storage media that, when executed by one or more processors, cause a computer or autonomous vehicle to perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and so forth that perform particular functions or implement particular abstract data types. The order in which the operations are described should not be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the processes.
Example clauses
A: an example autonomous vehicle includes: a radar sensor on the autonomous vehicle; one or more processors; and a memory storing processor-executable instructions that, when executed by the one or more processors, cause the autonomous vehicle to perform actions comprising: receiving radar data of an environment from a radar sensor, the radar data comprising: a first radar echo comprising: a first speed and a first range along a first radial direction from the radar sensor; a second radar echo comprising: a second speed and a second range along a second radial direction from the radar sensor; projecting the first velocity onto a point corresponding to the second radar echo as a projection velocity; determining, based at least in part on the second velocity and the projection velocity, that the second radar echo corresponds to a reflected radar echo reflected from the intermediate surface; and controlling the autonomous vehicle within the environment with the second radar echo excluded.
B: the autonomous vehicle of example a, wherein projecting the first velocity comprises: determining a reflection line based at least in part on a first extent in a first radial direction and a second extent in a second radial direction; reflecting the first velocity about the reflected line from the first position to the second position; and determining the projection velocity as a component of the reflection velocity along the second radial direction.
C: the autonomous vehicle of example a or example B, wherein determining that the second radar return corresponds to the reflected radar return comprises: comparing the first magnitude of the projection velocity with the second magnitude of the second velocity; and determining that the first size is substantially equal to the second size.
D: the autonomous vehicle of any of examples a-C, the actions further comprising: prior to receiving the radar data, receiving a previous radar echo associated with the object; and identifying the first radar echo as an object echo associated with the object based at least in part on the previous radar echo.
E: the autonomous vehicle of any of examples a-D, further comprising: at least one additional sensor on the vehicle, the acts further comprising: receiving additional sensor data from at least one additional sensor; and verifying the presence of the intermediate object based at least in part on the additional sensor data.
F: the autonomous vehicle of any of examples a-E, the actions further comprising: selecting a location associated with the first echo or a velocity associated with the first echo and the second echo based at least in part on one or more of the distances associated with the first echo and the second echo, the first echo and the second echo from the plurality of candidate radar echoes.
G: an example method, comprising: capturing, by a radar sensor on a vehicle, radar data of an environment, the radar data including a plurality of radar returns; determining a first velocity along a first radial direction extending from the vehicle based at least on a first radar echo of the plurality of radar echoes; determining a second velocity along a second direction extending from the vehicle based at least in part on a second radar echo of the plurality of radar echoes; determining a projection speed of the first speed along the second direction; and determining, based at least in part on the comparison of the projection velocity and the second velocity, that the second radar echo corresponds to a reflected radar echo that is reflected from the object and an intermediate object between the object and the radar sensor.
H: The method of example G, further comprising: receiving sensor data from at least one of the radar sensor or an additional sensor; and identifying, based at least in part on the sensor data, that the first echo is associated with the object in the environment.
I: The method of example G or example H, further comprising: tracking the object in the environment based at least in part on the sensor data and prior to capturing the first radar echo and the second radar echo, wherein determining that the second radar echo corresponds to the reflected radar echo is further based at least in part on the tracking.
J: The method of any of examples G to I, wherein determining the projected velocity comprises projecting the first velocity onto a location corresponding to the second echo along the second direction.
K: The method of any one of examples G to J, wherein determining the projected velocity comprises: determining a reflection line based at least in part on a first range associated with the first echo and a second range associated with the second echo; reflecting the first velocity about the reflection line to determine a reflected velocity; and determining the projected velocity as a component of the reflected velocity along the second direction.
L: The method of any of examples G to K, further comprising: determining a reflection point as an intersection of the reflection line and a line extending between the radar sensor and a second location associated with the second echo, the reflection point being a location associated with the intermediate object.
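The intersection recited in examples K and L can be sketched in the same way, again under the assumed perpendicular-bisector model of the reflection line; the names below are ours.

    import numpy as np

    def reflection_point(p1, p2, u2):
        """Intersect the hypothesized reflection line (perpendicular bisector
        of the segment P1-P2) with the ray from the sensor (at the origin)
        along the second radial direction u2. Returns None when the lines
        are parallel or the intersection falls behind the sensor."""
        p1, p2, u2 = (np.asarray(x, dtype=float) for x in (p1, p2, u2))
        mid = 0.5 * (p1 + p2)
        seg = p2 - p1
        d = np.array([-seg[1], seg[0]])    # reflection-line direction
        A = np.column_stack((u2, -d))      # solve s*u2 - t*d = mid for (s, t)
        if abs(np.linalg.det(A)) < 1e-9:
            return None
        s, _ = np.linalg.solve(A, mid)
        return s * u2 if s > 0 else None

With p1 = (3, 0), p2 = (7, 0), and u2 = (1, 0), the sketch returns (5, 0): the spot where the intermediate surface would have to sit, which example M then checks against additional sensor or map data.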
M: The method of any one of examples G to L, further comprising: receiving at least one of additional sensor data or map data; and identifying the intermediate object at the reflection point based at least in part on the additional sensor data or the map data.
N: The method of any of examples G to M, wherein the first echo and the second echo are selected based at least in part on at least one of: a range associated with the first echo and the second echo, a position associated with the first echo and the second echo, or a velocity associated with the first echo and the second echo.
O: The method of any of examples G to N, wherein the first echo is associated with the object in the environment, and the second echo is selected based at least in part on at least one of: the second radar echo having a second range greater than a first range associated with the first radar echo, or the second radar echo having a second velocity equal to or greater than a threshold velocity.
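Examples N and O narrow the field of candidates before the geometric test is run. A plausible reading of example O, sketched with an assumed record layout (the patent does not prescribe one): a ghost travels a longer path than the direct return, so it appears at greater range, and only echoes with appreciable Doppler are worth testing.

    def candidate_reflected_echoes(echoes, object_range, speed_threshold=0.5):
        """Select candidate reflected echoes per example O: range greater than
        the tracked object's range and speed at or above a threshold. Each
        echo is assumed to be a dict with 'range' and 'speed' keys; both the
        layout and the default threshold are illustrative."""
        return [e for e in echoes
                if e['range'] > object_range
                and abs(e['speed']) >= speed_threshold]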
P: One or more example non-transitory computer-readable media storing instructions that, when executed, cause one or more processors to perform operations comprising: capturing, by a radar sensor, radar data of an environment, the radar data comprising: a first radar echo comprising a first velocity, a first direction, and a first range; and a second radar echo comprising a second velocity, a second direction, and a second range; determining a projected velocity of the first velocity relative to the second direction; and determining that the second radar echo corresponds to a reflected radar echo based at least in part on a comparison of the projected velocity and the second velocity.
Q: The one or more non-transitory computer-readable media of example P, wherein determining the projected velocity comprises: determining a reflection line based at least in part on the first echo and the second echo; reflecting the first velocity about the reflection line, from a first position to a second position, to determine a reflected velocity; and determining the projected velocity as a component of the reflected velocity along the second direction.
R: The one or more non-transitory computer-readable media of example P or example Q, the operations further comprising: determining a reflection point as a point along the second direction such that a line from the reflection point bisects a connecting line from the first position to the second position; receiving additional sensor data; determining a presence of a surface at the reflection point based at least in part on the additional sensor data; and verifying that the second radar echo corresponds to the reflected radar echo.
S: The one or more non-transitory computer-readable media of any of examples P to R, the operations further comprising: receiving sensor data from at least one of the radar sensor or an additional sensor; and determining, based at least in part on the sensor data, that the first radar echo is associated with an object in the environment.
T: The one or more non-transitory computer-readable media of any of examples P to S, the operations further comprising: identifying one or more candidate reflected echoes from a plurality of radar echoes, the one or more candidate reflected echoes including the second radar echo, wherein the identifying is based at least in part on at least one of a range associated with individual ones of the one or more candidate reflected echoes or a velocity associated with individual ones of the one or more candidate reflected echoes.
U: An example autonomous vehicle, comprising: one or more sensors on the autonomous vehicle, including at least one radar sensor; one or more processors; and a memory storing processor-executable instructions that, when executed by the one or more processors, cause the autonomous vehicle to perform actions comprising: receiving radar data of an environment from the radar sensor, the radar data comprising: a first radar echo having an associated first location, the first location including a first range along a first radial direction from the radar sensor; and a second radar echo having an associated second location, the second location including a second range along a second radial direction from the radar sensor; determining a reflection point along the second radial direction based at least in part on the first location and the second location; receiving additional sensor data from the one or more sensors; identifying an object in the environment based at least in part on the additional sensor data; determining that the second radar echo is a reflected echo based at least in part on the object being disposed at the reflection point; and controlling the autonomous vehicle within the environment with the second radar echo excluded.
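Example U pairs the position test with cross-modal verification. Below is a rough end-to-end sketch reusing reflection_point from the sketch above; surface_at stands in for whatever lidar, camera, or map lookup the system provides and is entirely hypothetical.

    import numpy as np

    def is_reflected_echo(first, second, surface_at):
        """Position-based check from example U: compute the hypothesized
        reflection point from the two echoes, then ask another modality
        whether a reflecting object actually sits there. Each echo is an
        assumed dict with 'range' and 'direction' keys."""
        p1 = first['range'] * np.asarray(first['direction'], dtype=float)
        p2 = second['range'] * np.asarray(second['direction'], dtype=float)
        point = reflection_point(p1, p2, second['direction'])
        return point is not None and bool(surface_at(point))

An echo flagged this way would then be excluded from the radar data used to control the vehicle, per the final clause of example U.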
V: The autonomous vehicle of example U, wherein determining the reflection point comprises: determining a reflection line based at least in part on the first location and the second location; and determining the reflection point as an intersection of the reflection line and a line extending along the second radial direction.
W: The autonomous vehicle of example U or example V, the actions further comprising: receiving, from the radar sensor and prior to receiving the radar data, previous sensor data associated with a second object; and identifying the first radar echo as an object echo associated with the second object based at least in part on the previous sensor data.
X: The autonomous vehicle of any of examples U-W, the actions further comprising: selecting the second radar echo from a plurality of radar echoes based at least in part on the second range being greater than the first range.
Y: The autonomous vehicle of any of examples U-X, the actions further comprising: receiving, from the radar sensor and after receiving the radar data, additional radar echoes associated with the environment; and verifying that the second radar echo is a reflected echo based at least in part on the additional radar echoes.
Z: An example method, comprising: receiving radar data of an environment from a radar sensor on a vehicle, the radar data including a plurality of radar echoes; determining a first location in a first radial direction extending from the radar sensor based at least in part on a first radar echo of the plurality of radar echoes; determining a second location in a second radial direction extending from the radar sensor based at least in part on a second radar echo of the plurality of radar echoes; determining a reflection point along the second radial direction and between the radar sensor and the second location; and determining that the second radar echo is a reflected echo based at least in part on additional data about the environment.
AA: The method of example Z, further comprising: controlling the vehicle within the environment with the second radar echo excluded.
BB: The method of example Z or example AA, wherein determining the reflection point comprises: determining a reflection line based at least in part on the first location and the second location.
CC: the method of any one of examples Z to BB, wherein the additional data comprises at least one of sensor data or map data, the sensor data comprising one or more of lidar data, additional radar data, or image data.
DD: The method of any one of examples Z to CC, further comprising: receiving additional radar echoes associated with the environment from the radar sensor; and verifying that the second radar echo is a reflected echo based at least in part on the additional radar echoes.
EE: the method of any one of examples Z to DD, further comprising: receiving, from a radar sensor and prior to receiving radar data, a previous radar echo associated with a tracked object in an environment; and identifying the first radar echo as an object echo associated with the tracked object based at least in part on the previous radar echo.
FF: the method of any one of example Z to example EE, wherein the first echo and the second echo are selected based at least in part on at least one of: a range associated with the first echo and the second echo, the first location and the second location, or a velocity associated with the first echo and the second echo.
GG: The method of any one of examples Z to FF, wherein the first echo is associated with an object in the environment, and the second echo is selected based at least in part on at least one of: the second radar echo having a second range greater than a first range associated with the first radar echo, or the second radar echo having a second velocity equal to or greater than a threshold velocity.
HH: the method of any one of examples Z to GG, further comprising: receiving additional sensor data from additional sensors on the vehicle; and identifying an object associated with the first echo based at least in part on the additional sensor data.
II: One or more example non-transitory computer-readable media storing instructions that, when executed, cause one or more processors to perform operations comprising: receiving radar data of an environment from a radar sensor on a vehicle, the radar data including a plurality of radar echoes; determining a first location in a first radial direction extending from the radar sensor based at least in part on a first radar echo of the plurality of radar echoes; determining a second location in a second radial direction extending from the radar sensor based at least in part on a second radar echo of the plurality of radar echoes; determining a reflection point along the second radial direction and between the radar sensor and the second location; determining a presence of an object at a location corresponding to the reflection point based at least in part on additional data about the environment; and determining that the second radar echo is a reflected echo based at least in part on the presence of the object at the location.
JJ: the one or more non-transitory computer-readable media of example II, wherein determining the reflection point comprises: a reflected line is determined based at least in part on the first location and the second location.
KK: the one or more non-transitory computer-readable media of example II or example JJ, wherein the additional data comprises at least one of sensor data or map data, the sensor data comprising one or more of lidar data or image data.
LL: The one or more non-transitory computer-readable media of any one of example II to example KK, the operations further comprising: receiving additional radar echoes associated with the environment from the radar sensor; and verifying that the second radar echo is a reflected echo based at least in part on the additional radar echoes.
MM: one or more non-transitory computer-readable media of any one of example II to example LL, the operations further comprising: receiving, from a radar sensor and prior to receiving radar data, a previous radar echo associated with a tracked object in an environment; and identifying the first radar echo as an object echo associated with the tracked object based at least in part on the previous radar echo.
NN: the one or more non-transitory computer-readable media of any one of examples II to MM, wherein the first echo and the second echo are selected based at least in part on at least one of: a range associated with the first echo and the second echo, the first location and the second location, or a velocity associated with the first echo and the second echo.
Although the above example clauses are described with respect to particular implementations, it should be understood that, in the context of this document, the content of the example clauses may also be implemented via a method, a device, a system, a computer-readable medium, and/or another implementation.
Conclusion
Although one or more examples of the techniques described herein have been described, various alterations, additions, permutations, and equivalents thereof are included within the scope of the techniques described herein.
In the description of the examples, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific examples of the claimed subject matter. It is to be understood that other examples may be used and that changes or alterations, such as structural changes, may be made. Such examples, changes, or variations are not necessarily departures from the scope of the claimed subject matter. Although the steps herein may be presented in a certain order, in some cases the order may be changed such that certain inputs are provided at different times or in a different order without altering the functionality of the systems and methods described. The disclosed processes may also be performed in a different order. Additionally, the various computations described herein need not be performed in the order disclosed, and other examples using alternative orders of computation may be readily implemented. In addition to being reordered, in some instances, these computations can also be broken down into sub-computations that have the same result.

Claims (15)

1. An autonomous vehicle comprising:
a radar sensor on the autonomous vehicle;
one or more processors; and
a memory storing processor-executable instructions that, when executed by the one or more processors, cause the autonomous vehicle to perform actions comprising:
receiving radar data of an environment from the radar sensor, the radar data comprising:
a first radar echo comprising: a first velocity along a first radial direction from the radar sensor; and a first position at a first range along the first radial direction;
a second radar echo comprising: a second velocity along a second radial direction from the radar sensor; and a second position at a second range along the second radial direction;
determining that the second radar echo corresponds to a reflected radar echo reflected from an intermediate surface based, at least in part, on at least one of: the first position and the second position, or the first velocity and the second velocity; and
controlling the autonomous vehicle within the environment with the second radar echo excluded.
2. The autonomous vehicle of claim 1, wherein determining that the second radar echo corresponds to the reflected radar echo comprises:
projecting the first velocity onto a point corresponding to the second radar echo to determine a projected velocity; and
determining that a first magnitude of the projected velocity is substantially equal to a second magnitude of the second velocity.
3. The autonomous vehicle of claim 2, wherein projecting the first velocity comprises:
determining a reflection line based at least in part on the first range in the first radial direction and the second range in the second radial direction;
reflecting the first velocity about the reflection line, from the first position to the second position, to determine a reflected velocity; and
determining the projected velocity as a component of the reflected velocity along the second radial direction.
4. The autonomous vehicle of any of claims 1-3, wherein determining that the second radar echo corresponds to the reflected radar echo comprises:
determining a reflection point along the second radial direction based at least in part on the first position and the second position;
receiving additional sensor data from one or more additional sensors;
identifying an object in the environment based at least in part on the additional sensor data; and
determining that the object is disposed at the reflection point.
5. The autonomous vehicle of claim 4, wherein determining the reflection point comprises:
determining a reflection line based at least in part on the first position and the second position; and
determining the reflection point as an intersection of the reflection line and a line extending along the second radial direction.
6. The autonomous vehicle of any of claims 1-5, the actions further comprising:
prior to receiving the radar data, receiving a previous radar echo associated with an object; and
identifying the first radar echo as an object echo associated with the object based at least in part on the previous radar echo.
7. The autonomous vehicle of any of claims 1-6, further comprising: at least one additional sensor on the autonomous vehicle, the actions further comprising:
receiving additional sensor data from the at least one additional sensor; and
verifying the presence of the intermediate surface based at least in part on the additional sensor data.
8. A method, comprising:
capturing, by a radar sensor on a vehicle, radar data of an environment, the radar data including a plurality of radar echoes;
determining at least one of a first velocity along a first radial direction extending from the vehicle or a first position at a first range along the first radial direction based at least in part on a first radar echo of the plurality of radar echoes;
determining at least one of a second velocity along a second radial direction extending from the vehicle or a second position at a second range along the second radial direction based at least in part on a second radar echo of the plurality of radar echoes; and
determining that the second radar echo corresponds to a reflected radar echo based at least in part on at least one of: the first position and the second position, or the first velocity and the second velocity, the reflected radar echo being reflected from an object and from an intermediate object between the object and the radar sensor.
9. The method of claim 8, further comprising:
projecting the first velocity onto a point corresponding to the second radar echo to determine a projected velocity; and
determining that a first magnitude of the projected velocity is substantially equal to a second magnitude of the second velocity.
10. The method of claim 9, wherein projecting the first velocity comprises:
determining a reflection line based at least in part on the first range in the first radial direction and the second range in the second radial direction;
reflecting the first velocity about the reflection line, from the first position to the second position, to determine a reflected velocity; and
determining the projected velocity as a component of the reflected velocity along the second radial direction.
11. The method of claim 10, further comprising: determining a reflection point as an intersection of the reflection line and a line extending between the radar sensor and the second position, the reflection point being a location associated with the intermediate object.
12. The method of any of claims 8-11, further comprising:
determining a reflection point along the second radial direction based at least in part on the first position and the second position;
receiving additional sensor data from one or more additional sensors; and
identifying additional objects in the environment based at least in part on the additional sensor data,
wherein determining that the second radar echo corresponds to the reflected radar echo is based at least in part on the additional object being disposed at the reflection point.
13. The method of claim 12, wherein determining the reflection point comprises:
determining a reflection line based at least in part on the first position and the second position; and
determining the reflection point as an intersection of the reflection line and a line extending along the second radial direction.
14. The method of any of claims 8-13, further comprising:
selecting the first radar echo and the second radar echo based at least in part on at least one of: the first range and the second range, the first position and the second position, or the first velocity and the second velocity.
15. One or more non-transitory computer-readable media storing instructions that, when executed, cause one or more processors to perform the method of any one of claims 8-14.
CN202080017300.4A 2019-02-28 2020-02-25 Identifying radar reflections using velocity and position information Pending CN113544538A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US16/289,068 US11353578B2 (en) 2019-02-28 2019-02-28 Recognizing radar reflections using position information
US16/289,068 2019-02-28
US16/288,990 US11255958B2 (en) 2019-02-28 2019-02-28 Recognizing radar reflections using velocity information
US16/288,990 2019-02-28
PCT/US2020/019674 WO2020176483A1 (en) 2019-02-28 2020-02-25 Recognizing radar reflections using velocity and position information

Publications (1)

Publication Number Publication Date
CN113544538A (en)

Family

ID=69941497

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080017300.4A Pending CN113544538A (en) 2019-02-28 2020-02-25 Identifying radar reflections using velocity and position information

Country Status (4)

Country Link
EP (1) EP3931593A1 (en)
JP (1) JP7464616B2 (en)
CN (1) CN113544538A (en)
WO (1) WO2020176483A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11693110B2 (en) 2020-11-04 2023-07-04 Ford Global Technologies, Llc Systems and methods for radar false track mitigation with camera
DE102022211987A1 (en) 2021-11-12 2023-05-17 Zf Friedrichshafen Ag Method, computer program, machine-readable storage medium and system for classifying ghost objects in an environment of a road vehicle, a transportation system and/or a component of a traffic infrastructure

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3608991B2 (en) 1999-10-22 2005-01-12 富士通テン株式会社 Inter-vehicle distance sensor
JP3770189B2 (en) * 2002-03-19 2006-04-26 株式会社デンソー Object recognition device, object recognition method, radar device
JP5061814B2 (en) 2007-09-25 2012-10-31 株式会社デンソー Vehicle width detection method and device, vehicle control device
EP3164859B1 (en) 2014-07-03 2022-08-31 GM Global Technology Operations LLC Vehicle radar methods and systems
US9810782B2 (en) * 2015-03-20 2017-11-07 Delphi Technologies, Inc. Vehicle radar system with image reflection detection
US10296001B2 (en) * 2016-10-27 2019-05-21 Uber Technologies, Inc. Radar multipath processing
JP7053982B2 (en) * 2017-05-25 2022-04-13 ミツミ電機株式会社 Ghost removal method and radar device

Also Published As

Publication number Publication date
WO2020176483A1 (en) 2020-09-03
EP3931593A1 (en) 2022-01-05
JP2022522298A (en) 2022-04-15
JP7464616B2 (en) 2024-04-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination