EP3749977A1 - Method and apparatus for object detection using a beam steering radar and convolutional neural network system - Google Patents

Method and apparatus for object detection using a beam steering radar and convolutional neural network system

Info

Publication number
EP3749977A1
Authority
EP
European Patent Office
Prior art keywords
scan
radar
fmcw signal
chirp
object detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19746621.2A
Other languages
German (de)
French (fr)
Other versions
EP3749977A4 (en)
Inventor
Armin R. Völkel
Matthew HARRISON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Metawave Corp
Original Assignee
Metawave Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Metawave Corp filed Critical Metawave Corp
Publication of EP3749977A1 publication Critical patent/EP3749977A1/en
Publication of EP3749977A4 publication Critical patent/EP3749977A4/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/032 - Constructional details for solid-state radar subsystems
    • G01S 7/415 - Identification of targets based on measurements of movement associated with the target
    • G01S 7/417 - Target characterisation using analysis of the echo signal, involving the use of neural networks
    • G01S 7/4814 - Constructional features, e.g. arrangements of optical elements, of transmitters alone
    • G01S 7/4817 - Constructional features relating to scanning
    • G01S 7/4911 - Transmitters of non-pulse systems
    • G01S 7/4913 - Receivers of non-pulse systems; circuits for detection, sampling, integration or read-out
    • G01S 13/34 - Distance measurement using transmission of continuous, frequency-modulated waves heterodyned with a locally generated signal related to the contemporaneously transmitted signal
    • G01S 13/343 - Distance measurement using sawtooth frequency modulation
    • G01S 13/38 - Distance measurement using phase comparison with the contemporaneously transmitted signal, wherein more than one modulation frequency is used
    • G01S 13/426 - Scanning radar, e.g. 3D radar
    • G01S 13/536 - Discriminating between fixed and moving objects, or between objects moving at different speeds, using continuous-wave transmission
    • G01S 13/584 - Doppler-based velocity determination adapted for simultaneous range and velocity measurements
    • G01S 13/931 - Radar specially adapted for anti-collision purposes of land vehicles
    • G01S 2013/9322 - Anti-collision radar for land vehicles using additional data, e.g. driver condition, road state or weather data
    • G01S 17/32 - Lidar systems for measuring distance only using transmission of continuous waves
    • G01S 17/931 - Lidar systems specially adapted for anti-collision purposes of land vehicles

Definitions

  • Already in the market are Advanced-Driver Assistance Systems (“ADAS”) that automate, adapt and enhance vehicles for safety and better driving.
  • The next step will be vehicles that increasingly assume control of driving functions such as steering, accelerating, braking and monitoring the surrounding environment and driving conditions to respond to events, such as changing lanes or speed when needed to avoid traffic, crossing pedestrians, animals, and so on.
  • The requirements for object and image detection are critical and specify the time required to capture data, process it and turn it into action, all while ensuring accuracy, consistency and cost optimization.
  • An aspect of making this work is the ability to detect and classify objects in the surrounding environment at the same or possibly even better level than humans.
  • Humans are adept at recognizing and perceiving the world around them with an extremely complex human visual system that essentially has two main functional parts: the eye and the brain.
  • In autonomous driving technologies, the eye may include a combination of multiple sensors, such as camera, radar, and lidar, while the brain may involve multiple artificial intelligence, machine learning and deep learning systems.
  • The goal is to have full understanding of a dynamic, fast-moving environment in real time and human-like intelligence to act in response to changes in the environment.
  • FIG. 1 illustrates an example environment in which a beam steering radar system in an autonomous vehicle is used to detect and identify objects
  • FIG. 2 is a schematic diagram of an autonomous driving system for an autonomous vehicle in accordance with various examples
  • FIG. 3 is a schematic diagram of a beam steering radar system as in FIG. 2 in accordance with various examples
  • FIG. 4 is a flowchart for operating the beam steering radar system implemented as in FIG. 3 in accordance with various examples
  • FIG. 5 illustrates a radar signal and its associated scan parameters in accordance with various examples
  • FIG. 6 illustrates a graph showing the range resolution per bandwidth of a beam steering radar system in accordance with various examples
  • FIGs. 7A-B illustrate example trade-offs between scan parameters and design goals
  • FIG. 8A illustrates a flowchart for adjusting scan parameters in accordance with various examples
  • FIG. 8B illustrates example scan parameters for the adjustment as in FIG. 8A
  • FIG. 9 illustrates a flowchart for adjusting scan parameters in accordance with various examples
  • FIG. 10 illustrates a flowchart for adjusting scan parameters in accordance with various examples.
  • FIG. 11 illustrates example scan parameters for the adjustment of FIG. 10.
  • the methods and apparatuses include the acquisition of raw data by a beam steering radar in an autonomous vehicle and the processing of that data through a perception module to extract information about multiple objects in the vehicle’s Field-of-View (“FoV”).
  • This information may be parameters, measurements or descriptors of detected objects, such as location, size, speed, object categories, and so forth.
  • the objects may include structural elements in the vehicle’s FoV such as roads, walls, buildings and road center medians, as well as other vehicles, pedestrians, bystanders, cyclists, plants, trees, animals and so on.
  • the beam steering radar incorporates at least one beam steering antenna that is dynamically controlled such as to change its electrical or electromagnetic configuration to enable beam steering.
  • the dynamic control is aided by the perception module, which upon identifying objects in the vehicle’s FoV, informs the beam steering radar where to steer its beams and focus on areas of interest by adjusting its radar scan parameters.
  • FIG. 1 illustrates an example environment in which a beam steering radar system in an autonomous vehicle is used to detect and identify objects.
  • Ego vehicle 100 is an autonomous vehicle with a beam steering radar system 106 for transmitting a radar signal to scan a FoV or specific area.
  • the radar signal is transmitted according to a set of scan parameters that can be adjusted to result in multiple transmission beams 118.
  • the scan parameters may include, among others, the total angle of the scanned area from the radar transmission point, the power of the transmitted radar signal, the scan angle of each incremental transmission beam, as well as the angle between each beam or overlap therebetween.
  • the entire FoV or a portion of it can be scanned by a compilation of such transmission beams 118, which may be in successive adjacent scan positions or in a specific or random order.
  • FoV is used herein in reference to the radar transmissions and does not imply an optical FoV with unobstructed views.
  • the scan parameters may also indicate the time interval between these incremental transmission beams, as well as start and stop angle positions for a full or partial scan.
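As a concrete illustration of how such a set of scan parameters might be held and adjusted, the sketch below is a minimal, hypothetical configuration object; the field names and example values are assumptions for illustration and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ScanParameters:
    """Hypothetical container for the scan parameters described above."""
    total_scan_angle_deg: float  # total angle of the scanned area from the radar transmission point
    tx_power_dbm: float          # power of the transmitted radar signal
    beam_scan_angle_deg: float   # scan angle of each incremental transmission beam
    beam_spacing_deg: float      # angle between adjacent beams (smaller values imply overlap)
    beam_interval_s: float       # time interval between incremental transmission beams
    start_angle_deg: float       # start angle position for a full or partial scan
    stop_angle_deg: float        # stop angle position for a full or partial scan

# Example: a coarse forward scan covering -60 to +60 degrees in 5 degree steps.
forward_scan = ScanParameters(
    total_scan_angle_deg=120.0,
    tx_power_dbm=10.0,
    beam_scan_angle_deg=5.0,
    beam_spacing_deg=5.0,
    beam_interval_s=1e-3,
    start_angle_deg=-60.0,
    stop_angle_deg=60.0,
)
```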
  • the ego vehicle 100 may also have other perception sensors, such as camera 102 and lidar 104. These perception sensors are not required for the ego vehicle 100, but may be useful in augmenting the object detection capabilities of the beam steering radar system 106.
  • Camera sensor 102 may be used to detect visible objects and conditions and to assist in the performance of various functions.
  • the lidar sensor 104 can also be used to detect objects and provide this information to adjust control of the vehicle. This information may include information such as congestion on a highway, road conditions, and other conditions that would impact the sensors, actions or operations of the vehicle.
  • Camera sensors are currently used in Advanced Driver Assistance Systems (“ADAS”) to assist drivers in driving functions such as parking (e.g., in rear view cameras).
  • Cameras are able to capture texture, color and contrast information at a high level of detail, but similar to the human eye, they are susceptible to adverse weather conditions and variations in lighting.
  • Camera 102 may have a high resolution but cannot resolve objects beyond 50 meters.
  • Lidar sensors typically measure the distance to an object by calculating the time taken by a pulse of light to travel to an object and back to the sensor.
  • When positioned on top of a vehicle, a lidar sensor is able to provide a 360° 3D view of the surrounding environment. Other approaches may use several lidars at different locations around the vehicle to provide the full 360° view.
  • However, lidar sensors such as lidar 104 are still prohibitively expensive, bulky in size, sensitive to weather conditions and are limited to short ranges (typically < 150-200 meters).
  • Radars, on the other hand, have been used in vehicles for many years and operate in all-weather conditions. Radars also use far less processing than the other types of sensors and have the advantage of detecting objects behind obstacles and determining the speed of moving objects. When it comes to resolution, lidars’ laser beams are focused on small areas, have a smaller wavelength than RF signals, and are able to achieve around 0.25 degrees of resolution.
  • the beam steering radar system 106 is capable of providing a 360° true 3D vision and human-like interpretation of the ego vehicle’s path and surrounding environment.
  • The radar system 106 is capable of shaping and steering RF beams in all directions in a 360° FoV with a beam steering antenna module (having at least one beam steering antenna) and of recognizing objects quickly and with a high degree of accuracy over a long range of around 300 meters or more.
  • the short range capabilities of camera 102 and lidar 104 along with the long range capabilities of radar 106 enable a sensor fusion module 108 in ego vehicle 100 to enhance its object detection and identification.
  • FIG. 2 illustrates a schematic diagram of an autonomous driving system for an ego vehicle in accordance with various examples.
  • Autonomous driving system 200 is a system for use in an ego vehicle that provides some or full automation of driving functions.
  • the driving functions may include, for example, steering, accelerating, braking, and monitoring the surrounding environment and driving conditions to respond to events, such as changing lanes or speed when needed to avoid traffic, crossing pedestrians, animals, and so on.
  • the autonomous driving system 200 includes a beam steering radar system 202 and other sensor systems such as camera 204, lidar 206, infrastructure sensors 208, environmental sensors 210, operational sensors 212, user preference sensors 214, and other sensors 216.
  • Autonomous driving system 200 also includes a communications module 218, a sensor fusion module 220, a system controller 222, a system memory 224, and a V2V communications module 226. It is appreciated that this configuration of autonomous driving system 200 is an example configuration and not meant to be limiting to the specific structure illustrated in FIG. 2. Additional systems and modules not shown in FIG. 2 may be included in autonomous driving system 200.
  • beam steering radar system 202 includes at least one beam steering antenna for providing dynamically controllable and steerable beams that can focus on one or multiple portions of a 360° FoV of the vehicle.
  • the beams radiated from the beam steering antenna are reflected back from objects in the vehicle’s path and surrounding environment and received and processed by the radar system 202 to detect and identify the objects.
  • Radar system 202 includes a perception module that is trained to detect and identify objects and control the radar module as desired.
  • Camera sensor 204 and lidar 206 may also be used to identify objects in the path and surrounding environment of the ego vehicle, albeit at a much lower range.
  • Infrastructure sensors 208 may provide information from infrastructure while driving, such as from a smart road configuration, bill board information, traffic alerts and indicators, including traffic lights, stop signs, traffic warnings, and so forth. This is a growing area, and the uses and capabilities derived from this information are immense.
  • Environmental sensors 210 detect various conditions outside, such as temperature, humidity, fog, visibility, precipitation, among others.
  • Operational sensors 212 provide information about the functional operation of the vehicle. This may be tire pressure, fuel levels, brake wear, and so forth.
  • the user preference sensors 214 may be configured to detect conditions that are part of a user preference. This may be temperature adjustments, smart window shading, etc.
  • Other sensors 216 may include additional sensors for monitoring conditions in and around the vehicle.
  • the sensor fusion module 220 optimizes these various functions to provide an approximately comprehensive view of the vehicle and environments.
  • Many types of sensors may be controlled by the sensor fusion module 220. These sensors may coordinate with each other to share information and consider the impact of one control action on another system.
  • A noise detection module (not shown) may identify that there are multiple radar signals that may interfere with the vehicle. This information may be used by a perception module in radar 202 to adjust the radar’s scan parameters so as to avoid these other signals and minimize interference.
  • environmental sensor 210 may detect that the weather is changing, and visibility is decreasing.
  • the sensor fusion module 220 may determine to configure the other sensors to improve the ability of the vehicle to navigate in these new conditions.
  • The configuration may include turning off camera or lidar sensors 204-206 or reducing the sampling rate of these visibility-based sensors. This effectively places reliance on the sensor(s) adapted for the current situation.
  • the perception module configures the radar 202 for these conditions as well. For example, the radar 202 may reduce the beam width to provide a more focused beam, and thus a finer sensing capability.
  • the sensor fusion module 220 may send a direct control to the metastructure antenna based on historical conditions and controls.
  • the sensor fusion module 220 may also use some of the sensors within system 200 to act as feedback or calibration for the other sensors.
  • an operational sensor 212 may provide feedback to the perception module and/or the sensor fusion module 220 to create templates, patterns and control scenarios. These are based on successful actions or may be based on poor results, where the sensor fusion module 220 learns from past actions.
  • Sensor fusion module 220 may itself be controlled by system controller 222, which may also interact with and control other modules and systems in the vehicle. For example, system controller 222 may turn the different sensors 202-216 on and off as desired, or provide instructions to the vehicle to stop upon identifying a driving hazard (e.g., deer, pedestrian, cyclist, or another vehicle suddenly appearing in the vehicle’s path, flying debris, etc.)
  • Autonomous driving system 200 also includes system memory 224, which may store information and data (e.g., static and dynamic data) used for operation of system 200 and the ego vehicle using system 200.
  • V2V communications module 226 is used for communication with other vehicles. The V2V communications may also include information from other vehicles that is invisible to the user, driver, or rider of the vehicle, and may help vehicles coordinate to avoid an accident.
  • FIG. 3 illustrates a schematic diagram of a beam steering radar system as in FIG. 2 in accordance with various examples.
  • Beam steering radar system 300 is a “digital eye” with true 3D vision and capable of a human-like interpretation of the world.
  • The “digital eye” and human-like interpretation capabilities are provided by two main modules: radar module 302 and a perception module 304.
  • the radar module 302 includes at least one beam steering antenna 306 for providing dynamically controllable and steerable beams that can focus on one or multiple portions of a 360° FoV of an autonomous ego vehicle. It is noted that current beam steering antenna implementations are able to steer beams of up to 120-180° FoV. Multiple beam steering antennas may be needed to provide steerability to reach the full 360° FoV.
  • the beam steering antenna 306 is integrated with RFICs for providing RF signals at multiple steering angles.
  • the antenna may be a metastructure antenna, a phase array antenna, or any other antenna capable of radiating RF signals in millimeter wave frequencies.
  • A metastructure, as generally defined herein, is an engineered structure capable of controlling and manipulating incident radiation in a desired direction based on its geometry.
  • the metastructure antenna may include various structures and layers, including, for example, a feed or power division layer to divide power and provide impedance matching, an RF circuit layer with RFICs to provide steering angle control and other functions, and a metastructure antenna layer with multiple microstrips, gaps, patches, vias, and so forth.
  • the metastructure layer may include a metamaterial layer.
  • Various configurations, shapes, designs and dimensions of the beam steering antenna 306 may be used to implement specific designs and meet specific constraints.
  • Radar control is provided in part by the perception module 304.
  • Radar data generated by the radar module 302 is provided to the perception module 304 for object detection and identification.
  • the radar data is acquired by the transceiver 308, which has a radar chipset capable of transmitting the RF signals radiated by the metastructure antenna 306 and receiving the reflections of these RF signals.
  • Object detection and identification in perception module 304 is performed in a Machine Learning Module (“MLM”) 312 and in a classifier 314.
  • Upon identifying objects in the FoV of the vehicle, the perception module 304 provides object data and control instructions to antenna control 310 and scan parameter control 316 in radar module 302 for adjusting the beam steering and beam characteristics as needed.
  • Antenna control 310 controls antenna parameters such as the steering angle while the scan parameter control 316 adjusts the scan parameters of the radar signal in transceiver 308.
  • the perception module 304 may detect a cyclist on the path of the vehicle and direct the radar module 302 to focus additional RF beams at a given steering angle and within the portion of the FoV corresponding to the cyclist’s location.
  • The MLM 312 implements a CNN that, in various examples, is a fully convolutional neural network (“FCN”) with three stacked convolutional layers from input to output (additional layers may also be included in the CNN). Each of these layers also performs a rectified linear activation function and batch normalization as a substitute for traditional L2 regularization, and each layer has 64 filters. Unlike many FCNs, the data is not compressed as it propagates through the network because the size of the input is relatively small and runtime requirements are satisfied without compression. In various examples, the CNN may be trained with raw radar data, synthetic radar data, lidar data and then retrained with radar data, and so on. Multiple training options may be implemented for training the CNN to achieve good object detection and identification performance.
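As an illustration of that architecture, the following is a minimal PyTorch sketch of a three-layer fully convolutional stack with 64 filters per layer, each layer followed by batch normalization and a rectified linear activation, and with no downsampling so the data is not compressed as it propagates. The input/output shapes, kernel size and detection head are assumptions for illustration, not the patent's actual design.

```python
import torch
import torch.nn as nn

class RadarFCN(nn.Module):
    """Minimal fully convolutional sketch: three stacked conv layers,
    64 filters each, batch norm + ReLU, no pooling or striding (no compression)."""

    def __init__(self, in_channels: int = 1, out_channels: int = 1):
        super().__init__()

        def block(c_in: int, c_out: int) -> nn.Sequential:
            # 'same' padding keeps the spatial size unchanged through the stack.
            return nn.Sequential(
                nn.Conv2d(c_in, c_out, kernel_size=3, padding=1),
                nn.BatchNorm2d(c_out),
                nn.ReLU(inplace=True),
            )

        self.features = nn.Sequential(
            block(in_channels, 64),
            block(64, 64),
            block(64, 64),
        )
        # 1x1 convolution maps the 64 feature maps to per-cell detection scores.
        self.head = nn.Conv2d(64, out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Example: a batch of single-channel range-angle maps, 128 range bins x 64 angle bins.
scores = RadarFCN()(torch.randn(4, 1, 128, 64))  # output shape stays (4, 1, 128, 64)
```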
  • The classifier 314 may also include a CNN or other object classifier to enhance the object identification capabilities of perception module 304 with the use of the velocity information and micro-Doppler signatures in the radar data acquired by the radar module 302.
  • The location of the object, such as in the far-right lane of a highway in some countries (e.g., the United States of America), indicates a slower-moving type of vehicle. If the movement of the object does not follow the path of a road, then the object may be an animal, such as a deer, running across the road. All of this information may be determined from a variety of sensors and information available to the vehicle, including information provided from weather and traffic services, as well as from other vehicles or the environment itself, such as smart roads and smart traffic signs.
  • Radar data is in a multi-dimensional format having data tuples of the form (r_i, θ_i, φ_i, I_i, v_i), where r_i, θ_i and φ_i represent the location coordinates of an object, with r_i denoting the range or distance between the radar system 300 and the object along its line of sight, θ_i the azimuthal angle and φ_i the elevation angle; I_i is the intensity or reflectivity indicating the amount of transmitted power returned to the transceiver 308, and v_i is the speed between the radar system 300 and the object along its line of sight.
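For illustration only, such tuples map naturally onto a simple structured array; the field names below are hypothetical and merely mirror the (r_i, θ_i, φ_i, I_i, v_i) format described above, not a format specified by the patent.

```python
import numpy as np

# One record per detection: range (m), azimuth (rad), elevation (rad),
# intensity/reflectivity, and radial velocity (m/s) along the line of sight.
detection_dtype = np.dtype([
    ("range_m", np.float32),
    ("azimuth_rad", np.float32),
    ("elevation_rad", np.float32),
    ("intensity", np.float32),
    ("velocity_mps", np.float32),
])

detections = np.array(
    [(52.3, 0.10, 0.02, 0.74, -3.1),
     (118.0, -0.25, 0.00, 0.31, 12.6)],
    dtype=detection_dtype,
)
print(detections["range_m"])  # prints the range column, e.g. [ 52.3 118. ]
```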
  • The location and velocity information provided by the perception module 304 to the radar module 302 enables the antenna control 310 and scan parameter control 316 to adjust their parameters accordingly.
  • FIG. 4 illustrates a flowchart for operating the beam steering radar system implemented as in FIG. 3.
  • the beam steering radar system initiates a transmission of a beam steering radar scan with a set of scan parameters (400).
  • The radar scan may be, in various examples, a Frequency-Modulated Continuous Wave (“FMCW”) signal.
  • An FMCW signal enables the radar system to measure range to an object by measuring the differences in phase or frequency between the transmitted signal and the received/reflected signal or echo.
  • Within FMCW formats, there are a variety of modulation patterns that may be used, including triangular, sawtooth, rectangular and so forth, each having advantages and purposes.
  • Within an FMCW signal there may be multiple waveforms or chirps, each corresponding to a transmission beam.
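To make the range measurement principle above concrete, the standard FMCW relation (stated here in general terms as an illustration, not quoted from the patent) ties the beat frequency produced by mixing the received echo with the transmitted chirp to the round-trip delay of the echo:

$$ f_{\text{beat}} \;=\; \frac{B}{T_{\text{chirp}}}\,\tau \;=\; \frac{2RB}{c\,T_{\text{chirp}}} \quad\Longrightarrow\quad R \;=\; \frac{c\,T_{\text{chirp}}\,f_{\text{beat}}}{2B}, $$

where $B$ is the chirp bandwidth, $T_{\text{chirp}}$ the chirp duration, $\tau = 2R/c$ the round-trip delay and $R$ the range to the object. The phase progression of the beat signal from chirp to chirp additionally encodes the Doppler shift, and hence the radial velocity.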
  • the beam will reflect off the object and the return signal or echo is received at the radar system (402).
  • the echo is analyzed by the MLM at the perception module 304 to detect an object (404). If an object is not detected, the beam steering radar system 300 continues to wait for echoes with further scans. Note that the beam steering radar system does not stop transmitting beams; scanning is accomplished as long as the ego vehicle is in operation.
  • When the perception module 304 indicates that an object has been detected in an echo, the object information, such as the object’s location and its velocity, is extracted and sent to the radar module 302 (406).
  • the perception module 304 may also send information on where to focus the radar beams in a next scan.
  • the object information will inform the scan parameter control 316 to adjust its scan parameters (408) in various ways, such as described below with references to FIGs. 5-11.
  • The perception module 304 then classifies the object (410) and sends the object classification results to a sensor fusion module in the vehicle to determine, in combination with object detection/classification from other sensors, what control action (e.g., reduce speed, change lanes, brake, and so on), if any, to take on the vehicle.
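The flow just described (steps 400-410) can be summarized as a simple control loop. The sketch below is a hypothetical pseudocode-style rendering; the module interfaces (transmit_scan, detect, classify and so on) are placeholder names, not interfaces defined by the patent.

```python
def radar_perception_loop(radar_module, perception_module, sensor_fusion, scan_params):
    """Hypothetical rendering of the FIG. 4 flow: scan, detect, adapt, classify."""
    while True:
        # (400) Transmit a beam steering radar scan (e.g., FMCW) with the current parameters.
        radar_module.transmit_scan(scan_params)

        # (402) Receive the echoes reflected from objects in the FoV.
        echoes = radar_module.receive_echoes()

        # (404) The machine learning module analyzes the echoes for objects.
        objects = perception_module.detect(echoes)
        if not objects:
            continue  # keep scanning; transmission never stops while the vehicle operates

        # (406) Extract object information (location, velocity) and send it to the radar module.
        object_info = [(obj.location, obj.velocity) for obj in objects]
        radar_module.update_focus(object_info)

        # (408) Adjust the scan parameters for the next scan (e.g., focus beams, change resolution).
        scan_params = radar_module.scan_parameter_control.adjust(scan_params, object_info)

        # (410) Classify the detected objects and pass the results to sensor fusion.
        labels = perception_module.classify(objects)
        sensor_fusion.report(objects, labels)
```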
  • FIG. 5 illustrates a radar signal and its associated scan parameters in more detail.
  • Radar signal 500 is an FMCW signal containing a series of chirps, such as chirps 502-506.
  • Signal 500 is defined by a set of parameters that impact how an object’s location, resolution and velocity are determined. The parameters associated with the signal 500 and illustrated in FIG. 5 include:
  • f_min and f_max, the minimum and maximum frequency of the chirp signal;
  • T_total, the total time for one chirp sequence;
  • T_delay, the settling time for a Phase-Locked Loop (“PLL”) in the radar system;
  • T_meas, the actual measurement time (e.g., > 2 μs for a chirp sequence to detect objects within 300 meters; see the check following this list);
  • T_chirp, the total time of one chirp;
  • T_repeat, the repeat time between chirps;
  • B, the total bandwidth of the chirp;
  • B_eff, the effective bandwidth of the chirp;
  • N_r, the number of measurements taken per chirp (i.e., for each chirp, how many measurements will be taken of echoes); and
  • N_c, the number of chirps per sequence.
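As the check referenced in the T_meas item above: the 2 μs figure is simply the round-trip travel time of the radar signal to an object at the 300 meter detection range,

$$ \tau \;=\; \frac{2R}{c} \;=\; \frac{2 \times 300\ \text{m}}{3 \times 10^{8}\ \text{m/s}} \;=\; 2\ \mu\text{s}, $$

so each chirp's measurement window must be at least this long for the farthest echo of interest to arrive within it.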
  • The velocity and velocity resolution of an object are fully determined by the chirp sequence parameters as well.
  • The minimum velocity, or velocity resolution, achieved is determined as follows (with c denoting the speed of light):
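A standard form of this chirp-sequence relation (an illustrative reconstruction in the notation above, not a quotation of the patent's own Eqs. 1-9) is

$$ v_{\max} \;=\; \frac{\lambda}{4\,T_r}, \qquad \Delta v \;=\; \frac{\lambda}{2\,N_c\,T_r}, \qquad \lambda \;\approx\; \frac{c}{f_{\max}}, $$

where $T_r$ denotes the chirp-to-chirp repetition interval (the chirp time plus any gap between chirps). The maximum unambiguous velocity is thus set by how quickly chirps repeat, while the velocity resolution improves with the number of chirps $N_c$, i.e., with the total sequence time, which is the trade-off discussed with FIGs. 7A-B below.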
  • The sample rate f_sample in Eq. 5 determines how fine a range resolution can be achieved for a selected maximum velocity and range.
  • Eqs. 1-9 below can be used to establish scan parameters for given design goals. For example, to detect objects at high resolution at long ranges, the radar system 300 needs to take a large number of measurements per chirp. If the goal is to detect objects moving at high speed at long range, the chirp time has to be short, which in turn limits the measurement time available per chirp. In the first case, high resolution detection at long range is limited by the bandwidth of the signal processing unit in the radar system.
  • FIGs. 7A-B illustrate example trade-offs between scan parameters and design goals.
  • Table 700 shows values for scan parameters for a single chirp sequence in which the goal is to detect objects at long range. Note that increased speed and range of objects results in lower distance resolution. Note also that velocity resolution can be increased by adding more chirps to the sequence, at the cost of increasing the total sequence time.
  • Table 702 shows values for scan parameters for a single chirp sequence in which the goal is to detect objects at short range.
  • Distance resolution can be improved to as fine as 0.2 meters if the maximum speed of a detected object is less than 20 m/s.
  • Better velocity resolution increases the sequence time; for example, a velocity resolution of 0.3 m/s needs a sequence time of 7.1 ms.
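As a rough consistency check of that 7.1 ms figure (assuming operation near 77 GHz, so that λ ≈ 3.9 mm), the chirp-sequence relation above gives

$$ T_{\text{seq}} \;\approx\; \frac{\lambda}{2\,\Delta v} \;\approx\; \frac{3.9 \times 10^{-3}\ \text{m}}{2 \times 0.3\ \text{m/s}} \;\approx\; 6.5\ \text{ms}, $$

which is of the same order as the 7.1 ms quoted, with the difference plausibly absorbed by per-chirp overheads such as the PLL settling time T_delay and the inter-chirp gap T_repeat.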
  • FIGs. 8A, 9 and 10 illustrate flowcharts for adjusting scan parameters as desired and in accordance with various examples.
  • the beam steering radar system first performs a low-resolution scan by, for example, using fewer measurement points per chirp ramp and/or chirp number to reduce computational time, or by using fewer chirps in a sequence to reduce measurement time (800). This scan is followed by a higher resolution scan (802).
  • the higher resolution scan is conducted for example in areas with expected traffic (e.g., by using a map tool to predict where the road is 300 meters away from the front of the ego vehicle), in areas that have been flagged as containing an object or multiple objects during the low-resolution scan, or in areas that have been flagged by other sensors, e.g., by camera and/or lidar sensors in the ego vehicle.
  • FIG. 8B shows example scan parameters for the low-resolution scan in row 806 and the high-resolution scan in row 808. Note that the high-resolution scan is conducted with twice the measurement time and four times as many data points for processing as the low-resolution scan.
  • the beam steering radar system first performs a scan with a wider beam width (900) and then rescans the FoV with a narrow beam (902).
  • the narrower scan is conducted for example in areas with expected traffic, in areas that have been flagged as containing an object or multiple objects during the low resolution scan, or in areas that have been flagged by other sensors, e.g., by camera and/or lidar sensors in the ego vehicle.
  • both operations can be performed in parallel when the beam steering radar system is implemented with multiple antennas.
  • Cross-talk can be avoided by operating each set of antennas at a different frequency sub-band within an allowed frequency band in the 76-81 GHz range.
  • An antenna having double the beam width will allow scanning a fixed FoV in half the time using the same scan parameters.
  • the beam steering radar system performs a first scan with high range resolution, but lower absolute speed capture (1000). This is followed by a second scan at a lower range resolution but high absolute speed capture (1002).
  • Corresponding example scan parameters are shown in FIG. 11 in table 1100, with row 1102 showing a high range resolution scan with a maximum velocity of 30 m/s and row 1104 showing a low range resolution scan with a maximum velocity of 85 m/s.
  • the beam steering radar system is capable of achieving multiple design goals and detecting objects’ location and velocity in both short and long ranges.
  • the objects are further classified with the MLM in the perception module coupled to the radar module as shown in FIG. 3.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Optics & Photonics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)

Abstract

Examples disclosed herein relate to a radar system in an autonomous vehicle for object detection and classification. The radar system has a radar module with at least one beam steering antenna, a transceiver, and a scan parameter control adapted to adjust a set of scan parameters for the transceiver, and a perception module having a machine learning module and a classifier to detect and classify objects in a path and surrounding environment of the autonomous vehicle, the perception module to transmit object data and radar control information to the radar module.

Description

METHOD AND APPARATUS FOR OBJECT DETECTION USING A BEAM STEERING RADAR AND CONVOLUTIONAL NEURAL NETWORK SYSTEM
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Application No. 62/626,569, filed on February 5th, 2018, and is related to U.S. Patent Application No. 16/240,666, filed on January 4th, 2019, and which itself claims priority to U.S. Provisional Application No. 62/613,675, filed on January 4th, 2018, both incorporated herein by reference in their entirety.
BACKGROUND
[0002] Autonomous driving is quickly moving from the realm of science fiction to becoming an achievable reality. Already in the market are Advanced-Driver Assistance Systems (“ADAS”) that automate, adapt and enhance vehicles for safety and better driving. The next step will be vehicles that increasingly assume control of driving functions such as steering, accelerating, braking and monitoring the surrounding environment and driving conditions to respond to events, such as changing lanes or speed when needed to avoid traffic, crossing pedestrians, animals, and so on. The requirements for object and image detection are critical and specify the time required to capture data, process it and turn it into action, all while ensuring accuracy, consistency and cost optimization.
[0003] An aspect of making this work is the ability to detect and classify objects in the surrounding environment at the same or possibly even better level than humans. Humans are adept at recognizing and perceiving the world around them with an extremely complex human visual system that essentially has two main functional parts: the eye and the brain. In autonomous driving technologies, the eye may include a combination of multiple sensors, such as camera, radar, and lidar, while the brain may involve multiple artificial intelligence, machine learning and deep learning systems. The goal is to have full understanding of a dynamic, fast-moving environment in real time and human-like intelligence to act in response to changes in the environment.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] The present application may be more fully appreciated in connection with the following detailed description taken in conjunction with the accompanying drawings, which are not drawn to scale and in which like reference characters refer to like parts throughout, and wherein:
[0005] FIG. 1 illustrates an example environment in which a beam steering radar system in an autonomous vehicle is used to detect and identify objects;
[0006] FIG. 2 is a schematic diagram of an autonomous driving system for an autonomous vehicle in accordance with various examples;
[0007] FIG. 3 is a schematic diagram of a beam steering radar system as in FIG. 2 in accordance with various examples;
[0008] FIG. 4 is a flowchart for operating the beam steering radar system implemented as in FIG. 3 in accordance with various examples;
[0009] FIG. 5 illustrates a radar signal and its associated scan parameters in accordance with various examples;
[0010] FIG. 6 illustrates a graph showing the range resolution per bandwidth of a beam steering radar system in accordance with various examples;
[0011] FIGs. 7A-B illustrate example trade-offs between scan parameters and design goals;
[0012] FIG. 8A illustrates a flowchart for adjusting scan parameters in accordance with various examples;
[0013] FIG. 8B illustrates example scan parameters for the adjustment as in FIG. 8A;
[0014] FIG. 9 illustrates a flowchart for adjusting scan parameters in accordance with various examples;
[0015] FIG. 10 illustrates a flowchart for adjusting scan parameters in accordance with various examples; and
[0016] FIG. 11 illustrates example scan parameters for the adjustment of FIG. 10.
DETAILED DESCRIPTION
[0017] Methods and apparatuses for object detection using a beam steering radar and convolutional neural network system are disclosed. The methods and apparatuses include the acquisition of raw data by a beam steering radar in an autonomous vehicle and the processing of that data through a perception module to extract information about multiple objects in the vehicle’s Field-of-View (“FoV”). This information may be parameters, measurements or descriptors of detected objects, such as location, size, speed, object categories, and so forth. The objects may include structural elements in the vehicle’s FoV such as roads, walls, buildings and road center medians, as well as other vehicles, pedestrians, bystanders, cyclists, plants, trees, animals and so on. The beam steering radar incorporates at least one beam steering antenna that is dynamically controlled such as to change its electrical or electromagnetic configuration to enable beam steering. The dynamic control is aided by the perception module, which upon identifying objects in the vehicle’s FoV, informs the beam steering radar where to steer its beams and focus on areas of interest by adjusting its radar scan parameters.
[0018] It is appreciated that, in the following description, numerous specific details are set forth to provide a thorough understanding of the examples. However, it is appreciated that the examples may be practiced without limitation to these specific details. In other instances, well- known methods and structures may not be described in detail to avoid unnecessarily obscuring the description of the examples. Also, the examples may be used in combination with each other.
[0019] FIG. 1 illustrates an example environment in which a beam steering radar system in an autonomous vehicle is used to detect and identify objects. Ego vehicle 100 is an autonomous vehicle with a beam steering radar system 106 for transmitting a radar signal to scan a FoV or specific area. As described in more detail below, the radar signal is transmitted according to a set of scan parameters that can be adjusted to result in multiple transmission beams 118. The scan parameters may include, among others, the total angle of the scanned area from the radar transmission point, the power of the transmitted radar signal, the scan angle of each incremental transmission beam, as well as the angle between each beam or overlap therebetween. The entire FoV or a portion of it can be scanned by a compilation of such transmission beams 118, which may be in successive adjacent scan positions or in a specific or random order. Note that the term FoV is used herein in reference to the radar transmissions and does not imply an optical FoV with unobstructed views. The scan parameters may also indicate the time interval between these incremental transmission beams, as well as start and stop angle positions for a full or partial scan.
[0020] In various examples, the ego vehicle 100 may also have other perception sensors, such as camera 102 and lidar 104. These perception sensors are not required for the ego vehicle 100, but may be useful in augmenting the object detection capabilities of the beam steering radar system 106. Camera sensor 102 may be used to detect visible objects and conditions and to assist in the performance of various functions. The lidar sensor 104 can also be used to detect objects and provide this information to adjust control of the vehicle. This information may include information such as congestion on a highway, road conditions, and other conditions that would impact the sensors, actions or operations of the vehicle. Camera sensors are currently used in Advanced Driver Assistance Systems (“ADAS”) to assist drivers in driving functions such as parking (e.g., in rear view cameras). Cameras are able to capture texture, color and contrast information at a high level of detail, but similar to the human eye, they are susceptible to adverse weather conditions and variations in lighting. Camera 102 may have a high resolution but cannot resolve objects beyond 50 meters.
[0021] Lidar sensors typically measure the distance to an object by calculating the time taken by a pulse of light to travel to an object and back to the sensor. When positioned on top of a vehicle, a lidar sensor is able to provide a 360° 3D view of the surrounding environment. Other approaches may use several lidars at different locations around the vehicle to provide the full 360° view. However, lidar sensors such as lidar 104 are still prohibitively expensive, bulky in size, sensitive to weather conditions and are limited to short ranges (typically < 150-200 meters). Radars, on the other hand, have been used in vehicles for many years and operate in all-weather conditions. Radars also use far less processing than the other types of sensors and have the advantage of detecting objects behind obstacles and determining the speed of moving objects. When it comes to resolution, lidars’ laser beams are focused on small areas, have a smaller wavelength than RF signals, and are able to achieve around 0.25 degrees of resolution.
[0022] In various examples and as described in more detail below, the beam steering radar system 106 is capable of providing a 360° true 3D vision and human-like interpretation of the ego vehicle’s path and surrounding environment. The radar system 106 is capable of shaping and steering RF beams in all directions in a 360° FoV with a beam steering antenna module (having at least one beam steering antenna) and of recognizing objects quickly and with a high degree of accuracy over a long range of around 300 meters or more. The short range capabilities of camera 102 and lidar 104 along with the long range capabilities of radar 106 enable a sensor fusion module 108 in ego vehicle 100 to enhance its object detection and identification.
[0023] Attention is now directed to FIG. 2, which illustrates a schematic diagram of an autonomous driving system for an ego vehicle in accordance with various examples. Autonomous driving system 200 is a system for use in an ego vehicle that provides some or full automation of driving functions. The driving functions may include, for example, steering, accelerating, braking, and monitoring the surrounding environment and driving conditions to respond to events, such as changing lanes or speed when needed to avoid traffic, crossing pedestrians, animals, and so on. The autonomous driving system 200 includes a beam steering radar system 202 and other sensor systems such as camera 204, lidar 206, infrastructure sensors 208, environmental sensors 210, operational sensors 212, user preference sensors 214, and other sensors 216. Autonomous driving system 200 also includes a communications module 218, a sensor fusion module 220, a system controller 222, a system memory 224, and a V2V communications module 226. It is appreciated that this configuration of autonomous driving system 200 is an example configuration and not meant to be limiting to the specific structure illustrated in FIG. 2. Additional systems and modules not shown in FIG. 2 may be included in autonomous driving system 200.
[0024] In various examples, beam steering radar system 202 includes at least one beam steering antenna for providing dynamically controllable and steerable beams that can focus on one or multiple portions of a 360° FoV of the vehicle. The beams radiated from the beam steering antenna are reflected back from objects in the vehicle’s path and surrounding environment and received and processed by the radar system 202 to detect and identify the objects. Radar system 202 includes a perception module that is trained to detect and identify objects and control the radar module as desired. Camera sensor 204 and lidar 206 may also be used to identify objects in the path and surrounding environment of the ego vehicle, albeit at a much lower range.
[0025] Infrastructure sensors 208 may provide information from infrastructure while driving, such as from a smart road configuration, billboard information, traffic alerts and indicators, including traffic lights, stop signs, traffic warnings, and so forth. This is a growing area, and the uses and capabilities derived from this information are immense. Environmental sensors 210 detect various conditions outside, such as temperature, humidity, fog, visibility, precipitation, among others. Operational sensors 212 provide information about the functional operation of the vehicle. This may be tire pressure, fuel levels, brake wear, and so forth. The user preference sensors 214 may be configured to detect conditions that are part of a user preference. This may be temperature adjustments, smart window shading, etc. Other sensors 216 may include additional sensors for monitoring conditions in and around the vehicle.
[0026] In various examples, the sensor fusion module 220 optimizes these various functions to provide an approximately comprehensive view of the vehicle and environments. Many types of sensors may be controlled by the sensor fusion module 220. These sensors may coordinate with each other to share information and consider the impact of one control action on another system. In one example, in a congested driving condition, a noise detection module (not shown) may identify that there are multiple radar signals that may interfere with the vehicle. This information may be used by a perception module in radar 202 to adjust the radar’s scan parameters so as to avoid these other signals and minimize interference.
[0027] In another example, environmental sensor 210 may detect that the weather is changing, and visibility is decreasing. In this situation, the sensor fusion module 220 may determine to configure the other sensors to improve the ability of the vehicle to navigate in these new conditions. The configuration may include turning off camera or lidar sensors 204-206 or reducing the sampling rate of these visibility-based sensors. This effectively places reliance on the sensor(s) adapted for the current situation. In response, the perception module configures the radar 202 for these conditions as well. For example, the radar 202 may reduce the beam width to provide a more focused beam, and thus a finer sensing capability.
[0028] In various examples, the sensor fusion module 220 may send a direct control to the metastructure antenna based on historical conditions and controls. The sensor fusion module 220 may also use some of the sensors within system 200 to act as feedback or calibration for the other sensors. In this way, an operational sensor 212 may provide feedback to the perception module and/or the sensor fusion module 220 to create templates, patterns and control scenarios. These are based on successful actions or may be based on poor results, where the sensor fusion module 220 learns from past actions.
[0029] Data from sensors 202-216 may be combined in sensor fusion module 220 to improve the target detection and identification performance of autonomous driving system 200. Sensor fusion module 220 may itself be controlled by system controller 222, which may also interact with and control other modules and systems in the vehicle. For example, system controller 222 may turn the different sensors 202-216 on and off as desired, or provide instructions to the vehicle to stop upon identifying a driving hazard (e.g., deer, pedestrian, cyclist, or another vehicle suddenly appearing in the vehicle’s path, flying debris, etc.).
[0030] All modules and systems in autonomous driving system 200 communicate with each other through communication module 218. Autonomous driving system 200 also includes system memory 224, which may store information and data (e.g., static and dynamic data) used for operation of system 200 and the ego vehicle using system 200. V2V communications module 226 is used for communication with other vehicles. The V2V communications may also include information from other vehicles that is invisible to the user, driver, or rider of the vehicle, and may help vehicles coordinate to avoid an accident.
[0031] FIG. 3 illustrates a schematic diagram of a beam steering radar system as in FIG. 2 in accordance with various examples. Beam steering radar system 300 is a “digital eye” with true 3D vision and capable of a human-like interpretation of the world. The “digital eye” and human-like interpretation capabilities are provided by two main modules: radar module 302 and a perception module 304. The radar module 302 includes at least one beam steering antenna 306 for providing dynamically controllable and steerable beams that can focus on one or multiple portions of a 360° FoV of an autonomous ego vehicle. It is noted that current beam steering antenna implementations are able to steer beams of up to 120-180° FoV. Multiple beam steering antennas may be needed to provide steerability to reach the full 360° FoV.
[0032] In various examples, the beam steering antenna 306 is integrated with RFICs for providing RF signals at multiple steering angles. The antenna may be a metastructure antenna, a phased array antenna, or any other antenna capable of radiating RF signals in millimeter wave frequencies. A metastructure, as generally defined herein, is an engineered structure capable of controlling and manipulating incident radiation at a desired direction based on its geometry. The metastructure antenna may include various structures and layers, including, for example, a feed or power division layer to divide power and provide impedance matching, an RF circuit layer with RFICs to provide steering angle control and other functions, and a metastructure antenna layer with multiple microstrips, gaps, patches, vias, and so forth. The metastructure layer may include a metamaterial layer. Various configurations, shapes, designs and dimensions of the beam steering antenna 306 may be used to implement specific designs and meet specific constraints.
[0033] Radar control is provided in part by the perception module 304. Radar data generated by the radar module 302 is provided to the perception module 304 for object detection and identification. The radar data is acquired by the transceiver 308, which has a radar chipset capable of transmitting the RF signals radiated by the metastructure antenna 306 and receiving the reflections of these RF signals. Object detection and identification in perception module 304 is performed in a Machine Learning Module (“MLM”) 312 and in a classifier 314. Upon identifying objects in the FoV of the vehicle, the perception module 304 provides object data and control instructions to antenna control 310 and scan parameter control 316 in radar module 302 for adjusting the beam steering and beam characteristics as needed. Antenna control 310 controls antenna parameters such as the steering angle while the scan parameter control 316 adjusts the scan parameters of the radar signal in transceiver 308. For example, the perception module 304 may detect a cyclist on the path of the vehicle and direct the radar module 302 to focus additional RF beams at a given steering angle and within the portion of the FoV corresponding to the cyclist’s location.
[0034] The MLM 312, in various examples, implements a CNN that is a fully convolutional neural network (“FCN”) with three stacked convolutional layers from input to output (additional layers may also be included in the CNN). Each of these layers also performs the rectified linear activation function and batch normalization as a substitute for traditional L2 regularization, and each layer has 64 filters. Unlike many FCNs, the data is not compressed as it propagates through the network because the size of the input is relatively small and runtime requirements are satisfied without compression. In various examples, the CNN may be trained with raw radar data, with synthetic radar data, with lidar data and then retrained with radar data, and so on. Multiple training options may be implemented for training the CNN to achieve good object detection and identification performance.
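The following is a minimal sketch of such a network, assuming a PyTorch implementation, a 3x3 kernel size and a two-channel range-Doppler-style input. The layer count, the 64 filters per layer, batch normalization in place of L2 regularization and the absence of downsampling follow the description above; all other choices (names, kernel size, number of classes, input shape) are illustrative assumptions rather than the patent's implementation.

import torch
import torch.nn as nn

class SmallFCN(nn.Module):
    """Three stacked convolutional layers, 64 filters each, no compression."""
    def __init__(self, in_channels: int = 2, num_classes: int = 4):
        super().__init__()
        layers = []
        channels = in_channels
        for _ in range(3):                                  # three stacked conv layers
            layers += [
                nn.Conv2d(channels, 64, kernel_size=3, padding=1),
                nn.BatchNorm2d(64),                         # used instead of L2 regularization
                nn.ReLU(inplace=True),                      # rectified linear activation
            ]
            channels = 64
        self.features = nn.Sequential(*layers)
        self.head = nn.Conv2d(64, num_classes, kernel_size=1)   # per-cell class scores

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Spatial size is preserved end to end (padding=1, no pooling).
        return self.head(self.features(x))

# Example: one range-azimuth map with two channels (e.g., intensity and Doppler).
scores = SmallFCN()(torch.randn(1, 2, 64, 256))             # -> shape (1, 4, 64, 256)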
[0035] The classifier 314 may also include a CNN or other object classifier to enhance the object identification capabilities of perception module 304 with the use of the velocity information and micro-Doppler signatures in the radar data acquired by the radar module 302. When an object is moving slowly, or is moving outside a road lane, it most likely is not a motorized vehicle, but rather a person, animal, cyclist and so forth. Similarly, when one object is moving at a high speed, but lower than the average speed of other vehicles on a highway, the classifier 314 uses this velocity information to determine if that vehicle is a truck or another object which tends to move more slowly. Similarly, the location of the object, such as in the far-right lane of a highway in some countries (e.g., in the United States of America), indicates a slower-moving type of vehicle. If the movement of the object does not follow the path of a road, then the object may be an animal, such as a deer, running across the road. All of this information may be determined from a variety of sensors and information available to the vehicle, including information provided from weather and traffic services, as well as from other vehicles or the environment itself, such as smart roads and smart traffic signs.
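As a toy illustration only, and not the classifier 314 itself, the kind of velocity- and lane-based heuristic described above could be sketched as follows; the thresholds, labels and function name are assumptions.

def velocity_hint(speed_mps: float, avg_traffic_speed_mps: float, in_lane: bool) -> str:
    """Coarse object hint from track speed and lane context (illustrative only)."""
    if speed_mps < 3.0 or not in_lane:
        return "pedestrian/animal/cyclist"             # slow or off-lane movers
    if speed_mps < 0.8 * avg_traffic_speed_mps:
        return "truck or other slower vehicle"         # fast, but below traffic flow
    return "passenger vehicle"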
[0036] Note that velocity information is unique to radar sensors. Radar data is in a multi-dimensional format having data tuples of the form (r_i, θ_i, φ_i, I_i, v_i), where r_i, θ_i and φ_i represent the location coordinates of an object, with r_i denoting the range or distance between the radar system 300 and the object along its line of sight, θ_i the azimuthal angle, and φ_i the elevation angle; I_i is the intensity or reflectivity indicating the amount of transmitted power returned to the transceiver 308; and v_i is the speed between the radar system 300 and the object along its line of sight. The location and velocity information provided by the perception module 304 to the radar module 302 enables the antenna control 310 and scan parameter control 316 to adjust their parameters accordingly.
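A minimal container for one such data tuple might look like the sketch below; the field names and units are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class RadarDetection:
    range_m: float        # r_i: line-of-sight distance between radar and object
    azimuth_rad: float    # θ_i: azimuthal angle
    elevation_rad: float  # φ_i: elevation angle
    intensity: float      # I_i: reflectivity / returned power
    velocity_mps: float   # v_i: line-of-sight (radial) speed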
[0037] Attention is now directed to FIG. 4, which illustrates a flowchart for operating the beam steering radar system implemented as in FIG. 3. First, the beam steering radar system initiates a transmission of a beam steering radar scan with a set of scan parameters (400). The radar scan may be, in various examples, a Frequency-Modulated Continuous Wave (“FMCW”) signal. An FMCW signal enables the radar system to measure range to an object by measuring the differences in phase or frequency between the transmitted signal and the received/reflected signal or echo. Within FMCW formats, there are a variety of modulation patterns that may be used, including triangular, sawtooth, rectangular and so forth, each having advantages and purposes. Within an FMCW signal, there may be multiple waveforms or chirps, each corresponding to a transmission beam. When a transmission beam encounters an object, the beam will reflect off the object and the return signal or echo is received at the radar system (402).

[0038] The echo is analyzed by the MLM at the perception module 304 to detect an object (404). If an object is not detected, the beam steering radar system 300 continues to wait for echoes with further scans. Note that the beam steering radar system does not stop transmitting beams; scanning is accomplished as long as the ego vehicle is in operation. Once the perception module 304 indicates that an object has been detected in an echo, the object information such as the object’s location and its velocity is extracted and sent to the radar module 302 (406). The perception module 304 may also send information on where to focus the radar beams in a next scan. The object information will inform the scan parameter control 316 to adjust its scan parameters (408) in various ways, such as described below with reference to FIGs. 5-11. The perception module 304 then classifies the object (410) and sends the object classification results to a sensor fusion module in the vehicle to determine, in combination with object detection/classification from other sensors, what control action (e.g., reduce speed, change lanes, brake, and so on), if any, to take on the vehicle.
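The flow of FIG. 4 can be summarized with the following schematic sketch; the radar, perception and sensor_fusion objects and their method names are assumed for illustration and do not correspond to a specific API.

def radar_loop(radar, perception, sensor_fusion, scan_params):
    """Schematic version of the FIG. 4 flowchart (steps 400-410)."""
    while radar.vehicle_in_operation():
        radar.transmit_scan(scan_params)                     # (400) transmit radar scan
        echo = radar.receive_echo()                          # (402) receive echo
        detections = perception.detect(echo)                 # (404) detect objects
        if not detections:
            continue                                         # keep scanning
        info = perception.extract_info(detections)           # (406) location, velocity
        scan_params = radar.adjust_scan_parameters(info)     # (408) adjust scan parameters
        labels = perception.classify(detections)             # (410) classify objects
        sensor_fusion.update(labels, info)                   # hand off to sensor fusion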
[0039] FIG. 5 illustrates a radar signal and its associated scan parameters in more detail. Radar signal 500 is an FMCW signal containing a series of chirps, such as chirps 502-506. Signal 500 is defined by a set of parameters that impact how to determine an object’s location, its resolution, and velocity. The parameters associated with the signal 500 and illustrated in FIG. 5 include the following: (1) f_min and f_max for the minimum and maximum frequency of the chirp signal; (2) T_total for the total time for one chirp sequence; (3) T_delay representing the settling time for a Phase Locked Loop (“PLL”) in the radar system; (4) T_meas for the actual measurement time (e.g., > 2 μs for a chirp sequence to detect objects within 300 meters); (5) T_chirp for the total time of one chirp; (6) T_repeat for the repeat time between chirps; (7) B for the total bandwidth of the chirp; (8) B_eff for the effective bandwidth of the chirp; (9) ΔB_eff for the bandwidth between consecutive measurements; (10) N_r for the number of measurements taken per chirp (i.e., for each chirp, how many measurements will be taken of echoes); and (11) N_c for the number of chirps per sequence.
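For illustration, these chirp-sequence parameters can be carried in a simple container such as the sketch below; the field names and units are assumptions and not a radar chipset configuration interface.

from dataclasses import dataclass

@dataclass
class ChirpSequenceParams:
    f_min_hz: float        # minimum chirp frequency
    f_max_hz: float        # maximum chirp frequency
    t_total_s: float       # total time for one chirp sequence
    t_delay_s: float       # PLL settling time
    t_meas_s: float        # actual measurement time
    t_chirp_s: float       # total time of one chirp
    t_repeat_s: float      # repeat time between chirps
    b_hz: float            # total chirp bandwidth
    b_eff_hz: float        # effective chirp bandwidth
    delta_b_eff_hz: float  # bandwidth between consecutive measurements
    n_r: int               # measurements per chirp
    n_c: int               # chirps per sequence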
[0040] The distance and distance resolution of an object are fully determined by the chirp parameters N_r and B_eff (Eqs. 1 and 2), as illustrated in FIG. 6 with graph 600 showing the range resolution per bandwidth B. The velocity and velocity resolution of an object are fully determined by the chirp sequence parameters as well. The minimum velocity, or velocity resolution, achieved is given by Eq. 3 (with c denoting the speed of light). Note that higher radar frequencies give better velocity resolution for the same sequence parameters. The maximum velocity is given by Eq. 4.
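For orientation, the standard FMCW chirp-sequence relations that follow from the parameter definitions above can be sketched as follows, with λ = c/f_max denoting the carrier wavelength and T_c the chirp repetition interval; these are the usual textbook forms and are offered here as an assumption of what Eqs. 1-4 express, not as a verbatim reproduction of them.

\begin{align}
\Delta R &= \frac{c}{2\,B_{\mathrm{eff}}} && \text{(range resolution)}\\
R_{\max} &= \frac{N_r\,c}{4\,B_{\mathrm{eff}}} && \text{(maximum range)}\\
\Delta v &= \frac{\lambda}{2\,N_c\,T_c} && \text{(velocity resolution)}\\
v_{\max} &= \frac{\lambda}{4\,T_c} && \text{(maximum unambiguous velocity)}
\end{align}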
[0041] Additional relationships between the scan parameters are given by the following equations, with Eq. 5 denoting the sample rate, Eq. 6 denoting the time between consecutive measurements Δt, Eq. 7 denoting the range resolution, Eq. 8 denoting the chirp time, and Eq. 9 denoting the maximum velocity:
f_sample = f_ADC / decimation (Eq. 5)
[0042] Note that once the parameter R_max is fixed, v_max and R_min are no longer independent. Also, the sample rate f_sample in Eq. 5 determines how fine a range resolution can be achieved for a selected maximum velocity and range. Note also that Eqs. 1-9 above can be used to establish scan parameters for given design goals. For example, to detect objects at high resolution at long ranges, the radar system 300 needs to take a large number of measurements per chirp. If the goal is to detect objects at high speed at long range, the chirp time has to be low, which in turn limits the measurement time per chirp. In the first case, high resolution detection at long range is limited by the bandwidth of the signal processing unit in the radar system. And in the second case, high maximum velocity at long range is limited by the data acquisition speed of the radar chipset (which also limits resolution).

[0043] FIGs. 7A-B illustrate example trade-offs between scan parameters and design goals. In FIG. 7A, table 700 shows values for scan parameters for a single chirp sequence in which the goal is to detect objects at long range. Note that increased speed and range of objects results in lower distance resolution. Note also that velocity resolution can be increased by adding more chirps in the sequence at a cost of increasing the total sequence time. In FIG. 7B, table 702 shows values for scan parameters for a single chirp sequence in which the goal is to detect objects at short range. Note that distance resolution can be improved to a fine 0.2 meters if the maximum speed for a detected object is less than 20 m/s. Further, note that better velocity resolution increases the sequence time. For example, a velocity resolution of 0.3 m/s needs a sequence time of 7.1 ms.
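These trade-offs can be explored numerically with the short sketch below, which applies the standard relations assumed earlier (not necessarily the exact Eqs. 1-9); the function name and the example parameter values are illustrative only and do not reproduce tables 700 and 702.

C = 3.0e8  # speed of light, m/s

def chirp_sequence_metrics(b_eff_hz: float, n_r: int, n_c: int,
                           t_c_s: float, f_carrier_hz: float = 77e9) -> dict:
    """Derived detection capabilities of one chirp sequence (standard FMCW forms)."""
    wavelength = C / f_carrier_hz
    return {
        "range_resolution_m": C / (2.0 * b_eff_hz),
        "max_range_m": n_r * C / (4.0 * b_eff_hz),
        "velocity_resolution_mps": wavelength / (2.0 * n_c * t_c_s),
        "max_velocity_mps": wavelength / (4.0 * t_c_s),
        "sequence_time_s": n_c * t_c_s,
    }

# Illustrative long-range sequence (1 m range resolution out to 300 m, higher max speed)
# versus a short-range sequence (0.2 m resolution, max speed below 20 m/s).
long_range = chirp_sequence_metrics(b_eff_hz=150e6, n_r=600, n_c=128, t_c_s=25e-6)
short_range = chirp_sequence_metrics(b_eff_hz=750e6, n_r=512, n_c=128, t_c_s=50e-6)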
[0044] The trade-offs between scan parameters and design goals in terms of object detection capabilities enable multiple ways to adjust scan parameters during the operation of the beam steering radar system. FIGs. 8A, 9 and 10 illustrate flowcharts for adjusting scan parameters as desired and in accordance with various examples. In FIG. 8A, the beam steering radar system first performs a low-resolution scan by, for example, using fewer measurement points per chirp ramp and/or chirp number to reduce computational time, or by using fewer chirps in a sequence to reduce measurement time (800). This scan is followed by a higher resolution scan (802). The higher resolution scan is conducted, for example, in areas with expected traffic (e.g., by using a map tool to predict where the road is 300 meters away from the front of the ego vehicle), in areas that have been flagged as containing an object or multiple objects during the low-resolution scan, or in areas that have been flagged by other sensors, e.g., by camera and/or lidar sensors in the ego vehicle. FIG. 8B shows example scan parameters for the low-resolution scan in row 806 and the high-resolution scan in row 808. Note that the high-resolution scan is conducted with twice the measurement time and four times as many data points for processing as the low-resolution scan.
[0045] In FIG. 9, the beam steering radar system first performs a scan with a wider beam width (900) and then rescans the FoV with a narrow beam (902). As with the higher resolution scan of FIG. 8, the narrower scan is conducted, for example, in areas with expected traffic, in areas that have been flagged as containing an object or multiple objects during the low-resolution scan, or in areas that have been flagged by other sensors, e.g., by camera and/or lidar sensors in the ego vehicle. Note that both operations can be performed in parallel when the beam steering radar system is implemented with multiple antennas. Cross-talk can be avoided by operating each set of antennas at a different frequency sub-band within an allowed frequency band in the 76-81 GHz range. Note also that an antenna having double the beam width will allow scanning a fixed FoV in half the time using the same scan parameters.
[0046] In FIG. 10, the beam steering radar system performs a first scan with high range resolution, but lower absolute speed capture (1000). This is followed by a second scan at a lower range resolution but high absolute speed capture (1002). Corresponding example scan parameters are shown in FIG. 11 in table 1100, with row 1102 showing a high range resolution scan with a maximum velocity of 30 m/s and row 1104 showing a low range resolution scan with a maximum velocity of 85 m/s. Regardless of the method used to adjust scan parameters, the beam steering radar system is capable of achieving multiple design goals and detecting objects’ location and velocity in both short and long ranges. The objects are further classified with the MLM in the perception module coupled to the radar module as shown in FIG. 3.
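The adjustment strategies of FIGs. 8A, 9 and 10 share the same two-pass structure, sketched below; the Refinement enum, the radar object and its parameter-builder methods are assumed names for illustration, not an actual radar module interface.

from enum import Enum

class Refinement(Enum):
    HIGHER_RESOLUTION = 1   # FIG. 8A: more measurement points and/or more chirps
    NARROWER_BEAM = 2       # FIG. 9: rescan flagged areas with a narrow beam
    HIGHER_MAX_SPEED = 3    # FIG. 10: shorter chirps, coarser range resolution

def two_pass_scan(radar, flagged_regions, mode: Refinement):
    """Coarse first pass over the FoV, then a refined pass over flagged regions."""
    coarse = radar.coarse_parameters()
    radar.scan(coarse)                                    # first, low-cost pass
    for region in flagged_regions:                        # expected traffic, prior
        if mode is Refinement.HIGHER_RESOLUTION:          # detections, camera/lidar flags
            params = coarse.with_more_measurements()
        elif mode is Refinement.NARROWER_BEAM:
            params = coarse.with_narrow_beam()
        else:
            params = coarse.with_shorter_chirps()
        radar.scan(params, region=region)                 # refined second pass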
[0047] These various examples support autonomous driving with improved sensor performance, all-weather/all-condition detection, advanced decision-making algorithms and interaction with other sensors through sensor fusion. These configurations optimize the use of radar sensors, as radar is not inhibited by weather conditions in many applications, such as for self-driving cars. The radar described here is effectively a “digital eye,” having true 3D vision and capable of human-like interpretation of the world.
[0048] It is appreciated that the previous description of the disclosed examples is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these examples will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other examples without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the examples shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims

WHAT IS CLAIMED IS:
1. A radar system in an autonomous vehicle for object detection and classification, comprising:
a radar module, comprising:
at least one beam steering antenna;
a transceiver; and
a scan parameter control adapted to adjust a set of scan parameters for the transceiver; and
a perception module comprising a machine learning module and a classifier to detect and classify objects in a path and surrounding environment of the autonomous vehicle, the perception module to transmit object data and radar control information to the radar module.
2. The radar system of claim 1, wherein the radar module further comprises an antenna control to control the at least one beam steering antenna according to the object data and the radar control information transmitted by the perception module to the radar module.
3. The radar system of claim 1, wherein the machine learning module comprises a convolutional neural network.
4. The radar system of claim 1, wherein the transceiver initiates a transmission of a beam steering radar scan with the set of scan parameters.
5. The radar system of claim 4, wherein the set of scan parameters are adjusted according to the object data and the radar control information transmitted by the perception module to the radar module.
6. The radar system of claim 4, wherein the radar scan comprises an FMCW signal.
7. The radar system of claim 6, wherein the scan parameters comprise parameters selected from the group consisting of: a minimum and a maximum frequency of the FMCW signal; a total time for a chirp sequence in the FMCW signal; a settling time for a Phase Locked Loop; a measurement time; a total time of one chirp in the FMCW signal; a repeat time between chirps in the FMCW signal; a total bandwidth of a chirp in the FMCW signal; an effective bandwidth of a chirp in the FMCW signal; a bandwidth between consecutive measurements; a number of measurements taken per chirp in the FMCW signal; and a number of chirps per sequence.
8. An object detection and classification method, comprising:
initiating a transmission of a beam steering radar scan with a transceiver;
receiving an echo;
detecting an object in the received echo;
extracting object information about the detected object; and
adjusting a set of scan parameters in the transceiver for a next beam steering radar scan.
9. The object detection and classification method of claim 8, wherein the beam steering radar scan is an FMCW signal.
10. The object detection and classification method of claim 8, further comprising classifying the detected object with a convolutional neural network and a classifier.
11. The object detection and classification method of claim 8, wherein the object information comprises an object location and a velocity.
12. The object detection and classification method of claim 8, further comprising sending object classification information to a sensor fusion module.
13. The object detection and classification method of claim 9, wherein the scan parameters comprise parameters selected from the group consisting of: a minimum and a maximum frequency of the FMCW signal; a total time for a chirp sequence in the FMCW signal; a settling time for a Phase Locked Loop; a measurement time; a total time of one chirp in the FMCW signal; a repeat time between chirps in the FMCW signal; a total bandwidth of a chirp in the FMCW signal; an effective bandwidth of a chirp in the FMCW signal; a bandwidth between consecutive measurements; a number of measurements taken per chirp in the FMCW signal; and a number of chirps per sequence.
14. An object detection and classification method, comprising:
performing a first scan with a first set of scan parameters;
adjusting the first set of scan parameters based on object detection information from a received echo; and
performing a second scan with a second set of parameters adjusted from the first set of scan parameters.
15. The object detection and classification method of claim 14, wherein the first scan has a lower resolution than the second scan.
16. The object detection and classification method of claim 14, wherein the first scan is performed with a beam steering antenna having a first beam width.
17. The object detection and classification method of claim 14, wherein the second scan is performed with a beam steering antenna having a second beam width, the second beam width narrower than the first beam width.
18. The object detection and classification method of claim 14, wherein the second scan has a lower resolution and higher speed capture than the first scan.
19. The object detection and classification method of claim 14, wherein the first set of scan parameters comprise parameters selected from the group consisting of: a minimum and a maximum frequency of the FMCW signal; a total time for a chirp sequence in the FMCW signal; a settling time for a Phase Locked Loop; a measurement time; a total time of one chirp in the FMCW signal; a repeat time between chirps in the FMCW signal; a total bandwidth of a chirp in the FMCW signal; an effective bandwidth of a chirp in the FMCW signal; a bandwidth between consecutive measurements; a number of measurements taken per chirp in the FMCW signal; and a number of chirps per sequence.
20. The object detection and classification method of claim 14, wherein the first and second scans are performed with a metastructure antenna.
EP19746621.2A 2018-02-05 2019-02-05 Method and apparatus for object detection using a beam steering radar and convolutional neural network system Pending EP3749977A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862626569P 2018-02-05 2018-02-05
PCT/US2019/016723 WO2019153016A1 (en) 2018-02-05 2019-02-05 Method and apparatus for object detection using a beam steering radar and convolutional neural network system

Publications (2)

Publication Number Publication Date
EP3749977A1 true EP3749977A1 (en) 2020-12-16
EP3749977A4 EP3749977A4 (en) 2021-11-10

Family

ID=67478616

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19746621.2A Pending EP3749977A4 (en) 2018-02-05 2019-02-05 Method and apparatus for object detection using a beam steering radar and convolutional neural network system

Country Status (4)

Country Link
EP (1) EP3749977A4 (en)
JP (1) JP2021516763A (en)
KR (1) KR20200108097A (en)
WO (1) WO2019153016A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111220958B (en) * 2019-12-10 2023-05-26 西安宁远电子电工技术有限公司 Radar target Doppler image classification and identification method based on one-dimensional convolutional neural network
CN110988872B (en) * 2019-12-25 2023-10-03 中南大学 Rapid identification method for detecting wall health state by unmanned aerial vehicle through-wall radar
ES2894200A1 (en) * 2020-08-05 2022-02-11 Univ Rovira I Virgili Device and procedure for communication Vehicle-infrastructure and vehicle-vehicle (Machine-translation by Google Translate, not legally binding)
WO2023163354A1 (en) * 2022-02-25 2023-08-31 삼성전자 주식회사 Vehicle electronic device and operation method thereof
EP4386427A1 (en) * 2022-12-16 2024-06-19 Imec VZW An radar apparatus and a method for visual prescan for moving objects

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04323584A (en) * 1991-04-23 1992-11-12 Mitsubishi Electric Corp Device for identifying radar signal
WO2001013141A2 (en) * 1999-08-12 2001-02-22 Automotive Systems Laboratory, Inc. Neural network radar processor
US7683827B2 (en) * 2004-12-15 2010-03-23 Valeo Radar Systems, Inc. System and method for reducing the effect of a radar interference signal
US7327308B2 (en) * 2005-04-28 2008-02-05 Chung Shan Institute Of Science And Technology, Armaments Bureau, M.N.D. Programmable method and test device for generating target for FMCW radar
JP5226185B2 (en) * 2006-02-15 2013-07-03 富士通株式会社 Detecting and ranging device
JP4712826B2 (en) * 2008-05-15 2011-06-29 古河電気工業株式会社 Pulse Doppler radar device
JP6190140B2 (en) * 2012-06-21 2017-08-30 古野電気株式会社 Target detection apparatus and target detection method
US9720072B2 (en) * 2014-08-28 2017-08-01 Waymo Llc Methods and systems for vehicle radar coordination and interference reduction
JP6567832B2 (en) * 2015-01-29 2019-08-28 日本電産株式会社 Radar system, radar signal processing apparatus, vehicle travel control apparatus and method, and computer program
JP6392152B2 (en) * 2015-03-24 2018-09-19 パナソニック株式会社 Radar device and traveling vehicle detection method
TWI588510B (en) * 2016-05-12 2017-06-21 桓達科技股份有限公司 Method of processing FMCW radar signal

Also Published As

Publication number Publication date
WO2019153016A1 (en) 2019-08-08
JP2021516763A (en) 2021-07-08
EP3749977A4 (en) 2021-11-10
KR20200108097A (en) 2020-09-16

Similar Documents

Publication Publication Date Title
US10739438B2 (en) Super-resolution radar for autonomous vehicles
US20220308204A1 (en) Beam steering radar with selective scanning mode for autonomous vehicles
US11378654B2 (en) Recurrent super-resolution radar for autonomous vehicles
US11133577B2 (en) Intelligent meta-structure antennas with targeted polarization for object identification
US11479262B2 (en) Geographically disparate sensor fusion for enhanced target detection and identification in autonomous vehicles
US11327156B2 (en) Reinforcement learning engine for a radar system
US20190204834A1 (en) Method and apparatus for object detection using convolutional neural network systems
US11495877B2 (en) Multi-layer, multi-steering antenna system for autonomous vehicles
US11585896B2 (en) Motion-based object detection in a vehicle radar using convolutional neural network systems
EP3749977A1 (en) Method and apparatus for object detection using a beam steering radar and convolutional neural network system
US11852749B2 (en) Method and apparatus for object detection using a beam steering radar and a decision network
US11587204B2 (en) Super-resolution radar for autonomous vehicles
US11719803B2 (en) Beam steering radar with adjustable long-range radar mode for autonomous vehicles
US11867830B2 (en) Side lobe reduction in a beam steering vehicle radar antenna for object identification
US11867789B2 (en) Optimized proximity clustering in a vehicle radar for object identification
US12066518B2 (en) GAN-based data synthesis for semi-supervised learning of a radar sensor
US20220252721A1 (en) Guard band antenna in a beam steering radar for resolution refinement
EP3931903A1 (en) Switchable reflective phase shifter for millimeter wave applications

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200804

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20211008

RIC1 Information provided on ipc code assigned before grant

Ipc: G01S 13/536 20060101ALN20211004BHEP

Ipc: G01S 13/38 20060101ALN20211004BHEP

Ipc: G01S 7/03 20060101ALN20211004BHEP

Ipc: G01S 13/58 20060101ALI20211004BHEP

Ipc: G01S 13/34 20060101ALI20211004BHEP

Ipc: G01S 13/42 20060101ALI20211004BHEP

Ipc: G01S 13/931 20200101ALI20211004BHEP

Ipc: G01S 7/41 20060101ALI20211004BHEP

Ipc: G01S 17/32 20200101ALI20211004BHEP

Ipc: G01S 7/491 20200101ALI20211004BHEP

Ipc: G01S 7/481 20060101AFI20211004BHEP