WO2012061896A1 - Radar image processing - Google Patents
Radar image processing
- Publication number
- WO2012061896A1 (PCT/AU2011/001458)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- radar
- azimuth angle
- terrain
- image
- value
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/411—Identification of targets based on measurements of radar reflectivity
- G01S7/412—Identification of targets based on measurements of radar reflectivity based on a comparison between measured values and known or stored values
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
Definitions
- the present invention relates to the processing of radar images.
- Autonomous vehicles may be implemented in many outdoor applications such as mining, earth moving, agriculture, and planetary exploration.
- Imaging sensors mounted on the vehicles facilitate obstacle avoidance, task-specific target detection and generation of terrain maps for navigation.
- Visibility conditions may be poor in the scenarios in which autonomous vehicles are implemented. For example, day/night cycles change illumination conditions, and weather phenomena (such as fog, rain, snow and hail) and the presence of dust or smoke clouds may impede visual perception.
- Imaging sensors, such as laser range-finders and cameras, tend to be adversely affected by these conditions.
- Sonar is a common sensor typically not affected by such visibility restrictions. However, sonar suffers from a limited maximum range, poor angular resolution, and reflections by specular surfaces.
- the present invention provides a method for performing radar image segmentation, the method comprising: using a radar, generating a radar image of an area of terrain, the radar image deriving from radar observations taken along a plurality of azimuth angles; performing a background extraction process on the radar image to extract a background image comprising extracted radar observations, the background extraction process comprising estimating a range spread of a radar echo from the surface of the terrain as a function of the azimuth angle and a tilt of the radar relative to the surface of the terrain; fitting a model to the extracted radar observations along a particular azimuth angle; determining a value of a parameter indicative of the goodness of fit between the model and the extracted radar observations along the particular azimuth angle; and determining a classification depending on the value of the parameter for that azimuth angle.
- the radar used to generate the radar image may be either directly in contact with the terrain, or mounted on a system or apparatus that is directly in contact with the terrain.
- the radar observations may be taken in a near-field region of the radar.
- the radar observations may be taken in a far-field region of the radar.
- the steps of fitting a model, determining a value of a parameter, and determining a classification may be performed for each azimuth angle in the plurality of azimuth angles.
- the estimate of the range spread of the radar echo from the surface of the terrain may be determined using the following equations:
- R 0 is a value of slant range of a boresight of the radar
- R 1 is a range from the radar to a proximal border of a footprint area illuminated by the radar on the surface of the terrain during generation of the radar image;
- R 2 is the range from the radar to a distal border of a footprint area illuminated by the radar on the surface of the terrain during generation of the radar image;
- h is a height of an origin of the radar beam above the surface of the terrain;
- φ and θ are the roll and pitch angles respectively of the radar relative to the surface of the terrain;
- θ e is a beamwidth of the radar.
- the model may be a power return model.
- the power return model may be:
- R is a value of the range of a target on the terrain from the radar
- P r is a received power of the signal reflected from the target at distance R;
- R 0 is the slant range of a boresight of the radar; k is the power return at the slant range R 0;
- G is a value of the gain of the radar
- γ is a grazing angle of the radar beam.
- the parameter may be a coefficient of efficiency.
- the step of classifying the background image may comprise: classifying data points along the particular azimuth angle in the background image as belonging to a first class if the value of the parameter for the particular azimuth angle is above a predetermined threshold value; and classifying data points along the particular azimuth angle in the background image as belonging to a second class if the value of the parameter for the particular azimuth angle is not above the predetermined threshold value.
- the step of classifying the background image may further comprise: for an azimuth angle along which data points are classified as belonging to the first class, performing a physical consistency check using an output of the background extraction process for that azimuth angle, and classifying the data points along that azimuth angle as belonging to a third class depending on an output of the physical consistency check.
- the step of classifying the background image may further comprise: for an azimuth angle along which data points are classified as belonging to the third class, determining a value of a further parameter; and identifying that an object is present along that azimuth angle if the value of the further parameter is above a further predetermined threshold.
- the further parameter may be a percentage relative change of the maximum intensity value of the radar echo along the respective azimuth angle between an observation along that azimuth angle and the model.
- the present invention provides apparatus for processing a radar image, the apparatus comprising: a radar arranged to generate a radar image of an area of terrain, the radar image deriving from radar observations taken along a plurality of azimuth angles; and one or more processors arranged to: perform a background extraction process on the radar image to extract a background image comprising extracted radar observations, the background extraction process comprising estimating a range spread of a radar echo from the surface of the terrain as a function of the azimuth angle and a tilt of the radar relative to the surface of the terrain; fit a model to the extracted radar observations along a particular azimuth angle; determine a value of a parameter indicative of the goodness of fit between the model and the extracted radar observations along the particular azimuth angle; and determine a classification depending on the value of the parameter for that azimuth angle.
- the present invention provides a program or plurality of programs arranged such that when executed by a computer system or one or more processors it/they cause the computer system or the one or more processors to operate in accordance with the method of any of the above aspects.
- the present invention provides a machine readable storage medium storing a program or at least one of the plurality of programs according to the above aspect.
- Figure 1 is a schematic illustration (not to scale) of a vehicle in which an embodiment of a process of generating a model of the ground in the vicinity of the vehicle is implemented;
- Figure 2 is a schematic illustration of an example terrain modelling scenario in which the vehicle is used to scan a terrain area;
- Figure 3 shows a so-called pencil radar beam hitting the surface of the terrain at a particular grazing angle;
- Figure 4 is a process flow-chart of an embodiment of a radar ground segmentation process;
- Figure 5 is a process flow-chart of a background extraction process performed at step s2 of the radar ground segmentation process.
- Figure 6 is a process flow-chart of a process of power spectrum analysis performed at step s4 of the radar ground segmentation process.
- the term "ground" is used herein to refer to a geometric configuration of an underlying supporting surface of an environment or a region of an environment.
- the underlying supporting surface may, for example, include surfaces such as the underlying geological terrain in a rural setting, or the artificial support surface in an urban setting, either indoors or outdoors.
- the term "ground based" is used herein to refer to a system that is either directly in contact with the ground, or that is mounted on a further system that is directly in contact with the ground.
- Figure 1 is a schematic illustration (not to scale) of a vehicle 2 in which an embodiment of a process of generating a model of the ground in the vicinity of the vehicle 2 is implemented. This process will hereinafter be referred to as a "radar ground segmentation process".
- the vehicle 2 comprises a radar system 4, and a processor 6.
- the vehicle 2 is an autonomous and unmanned ground-based vehicle.
- the ground-based vehicle 2 is in contact with a surface of the terrain area 8, i.e. the ground.
- the radar system is a ground-based system (because it is mounted in the ground-based vehicle 2).
- the radar system 4 is coupled to the processor 6.
- the radar system 4 comprises a mechanically scanned millimetre-wave radar.
- the radar is a 95-GHz Frequency Modulated Continuous Wave (FMCW) millimetre-wave radar that reports the amplitude of echoes at ranges between 1m and 120m.
- the wavelength of the emitted radar signal is 3mm.
- the beam-width of the emitted radar signal is 3.0° in elevation and 3.0° in azimuth.
- a radar antenna of the radar system 4 scans horizontally across the angular range of 360°.
- the radar system 4 radiates a continuous wave (CW) signal towards a target through an antenna. An echo is received from the target by the antenna. A signal corresponding to the received echo is sent from the radar system 4 to the processor 6.
- the processor 6 comprises a spectrum analyzer to produce a range-amplitude profile that represents the target, i.e. a radar image.
- the processor 6 performs a radar ground segmentation process on the radar image, as described in more detail later below with reference to Figure 4.
- Figure 2 is a schematic illustration of an example terrain modelling scenario in which the vehicle 2 is used to scan a terrain area 8.
- the vehicle 2 uses the radar system 4 to scan the terrain area 8.
- the radar system 4 (i.e. the millimetre-wave radar) provides a so-called pencil beam with relatively small antenna apertures.
- a relatively accurate range map (i.e. radar image) of the terrain area 8 is constructed through the scanning of the terrain area with the pencil beam.
- the beam width is proportional to the radar signal wavelength and is inversely proportional to the antenna aperture. Using a narrower beam tends to produce more accurate terrain maps and obstacle detection than using a wider beam. However, in this embodiment, radar antenna size is limited by vehicle size and spatial constraints.
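The proportionality above can be made concrete with the small-angle rule θ ≈ λ/D. The unity proportionality constant used here is an illustrative simplification (the exact factor depends on the antenna illumination, which is not specified above):

```python
import math

# Rough antenna-aperture estimate from the small-angle rule theta ~ lambda / D.
# The 3 mm wavelength and 3.0 degree beamwidth come from the embodiment above;
# the unity proportionality constant is an illustrative assumption.
wavelength_m = 0.003
beamwidth_rad = math.radians(3.0)

aperture_m = wavelength_m / beamwidth_rad
print(f"approximate antenna aperture: {aperture_m * 1000:.0f} mm")  # ~57 mm
```

This is consistent with the statement that antenna size is limited by vehicle size: halving the beamwidth would roughly double the required aperture.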
- Radars are typically used to sense targets in the so-called antenna far-field region.
- the far-field region for the radar antenna of the radar system 4 begins at a distance of approximately 15m from the radar system 4.
- short-range sensing by the vehicle 2 is implemented because many targets fall within the so-called near-field region (i.e. at a distance of less than approximately 15m from the vehicle 2).
- the antenna pattern is range-dependent and the average energy density of the radar signal remains relatively constant at different distances from the antenna.
- the radar system 4 is used to generate a radar image of the area of terrain 8 in the near-field region of the radar in the radar system 4.
- a radar operating partially, or wholly, in the near-field may be conveniently referred to as "short-range”.
- the generated image may be conveniently referred to as a "short-range image”.
- Figure 3 is a schematic illustration of the beam geometries of the radar system 4 in this embodiment.
- the radar is directed at the front of the vehicle 2 with a constant pitch or grazing angle γ of about 11 degrees.
- the scanning pencil beam intersects the ground at near-grazing angles.
- Figure 3 shows the pencil beam hitting the surface of the terrain 8 at a grazing angle γ.
- a beamwidth of the radar beam is indicated in Figure 3 by the reference symbol θ e.
- a proximal border of a footprint area illuminated by the diverging beam is indicated in Figure 3 with the reference symbol A.
- a distal border of a footprint area illuminated by the diverging beam is indicated in Figure 3 with the reference symbol B.
- a height of the beam origin O with respect to the surface of the terrain 8 is indicated in Figure 3 by the reference symbol h.
- a slant range of the radar boresight is indicated in Figure 3 by the reference symbol R 0.
- a range from the radar to the proximal border A is indicated in Figure 3 by the reference symbol R 1.
- a range from the radar to the distal border B is indicated in Figure 3 by the reference symbol R 2 .
- Short-range sensing in the near-field region tends to stretch the pencil-beam footprint, resulting in range-echo spread.
- the computation of the area on the ground surface that is instantaneously illuminated by the radar depends on the geometry of the radar boresight, the elevation beamwidth, the resolution, and the incidence angle to the local surface.
- a signal corresponding to the received echo is sent from the radar system 4 to the processor 6.
- the processor 6 produces a radar image of the surface of the terrain 8 using the received signal. Also, the processor 6 performs a radar ground segmentation process on the radar image.
- the radar image is composed of a foreground and a background.
- the background of the radar image is the part of the image that results from reflections from the ground (i.e. terrain surface 8).
- the foreground of the radar image is the part of the image that results from reflection from objects, or terrain features, above the ground.
- Radar observations belonging to the background tend to show a wide pulse, produced by a surface at a high incidence angle.
- exceptions to this are possible, for example due to local unevenness or occlusion produced by obstacles of large cross-sections in the foreground.
- Figure 4 is a process flow-chart of an embodiment of a radar ground segmentation process.
- a background extraction process is performed on the radar image.
- This process extracts the ground echo from the radar image.
- the background extraction process is described in more detail later below with reference to Figure 5.
- at step s4, the power spectrum across the background is analysed.
- This process results in a segmented ground model of the terrain 8 in the vicinity of the vehicle 2.
- Figure 5 is a process flow-chart of a background extraction process performed at step s2 of the radar ground segmentation process of Figure 4.
- a range spread of the ground echo is predicted.
- the prediction of the range spread of the ground echo as a function of the azimuth angle and the tilt of the vehicle is obtained using the following geometrical model:
- R 0 is the slant range of the radar boresight as shown in Figure 3;
- R 1 is the range to the proximal border A as shown in Figure 3;
- R 2 is the range to the distal border B as shown in Figure 3;
- h is the height of the radar beam origin O with respect to the surface of the terrain 8, as shown in Figure 3;
- φ and θ are the roll and pitch angles respectively of the radar system 4 on the vehicle 2. Together, φ and θ describe the tilt of the vehicle 2;
- φ and θ are conventional Euler angles (the ZYX Euler angles being φ, θ, and ψ, usually referred to as the roll, pitch, and yaw angles respectively);
- α is an azimuth angle measured by the radar system 4;
- θ e is the beamwidth of the radar beam as shown in Figure 3.
- the above geometrical model is based on an assumption of globally flat ground. Therefore, discrepancies in radar observations may be produced by the presence of irregularities or obstacles in the radar-illuminated area. In this embodiment, these discrepancies are compensated for by the performance of step s8, as described in more detail below.
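The equations for R 0, R 1 and R 2 are not reproduced in this text, but under the stated flat-ground assumption a standard pencil-beam geometry can be sketched as below. How roll, pitch and azimuth combine into an effective grazing angle is an assumption of this sketch, not the exact formula from the patent:

```python
import math

def range_spread(h, roll, pitch, azimuth, beamwidth):
    """Estimate the boresight, proximal and distal ranges (R0, R1, R2) of the
    ground echo for a flat-ground geometry.

    All angles are in radians. The effective grazing angle is modelled here
    as the beam depression produced by pitch, modulated by roll as the beam
    scans in azimuth; this combination is an illustrative assumption.
    """
    grazing = pitch * math.cos(azimuth) + roll * math.sin(azimuth)
    r0 = h / math.sin(grazing)                    # slant range of the boresight
    r1 = h / math.sin(grazing + beamwidth / 2.0)  # near edge of the footprint
    r2 = h / math.sin(grazing - beamwidth / 2.0)  # far edge of the footprint
    return r0, r1, r2

# Example: radar 2 m above flat ground, 11 degree grazing angle at boresight,
# 3 degree elevation beamwidth, looking straight ahead (azimuth = 0).
# The lower half of the beam hits the ground sooner (R1 < R0 < R2).
r0, r1, r2 = range_spread(2.0, 0.0, math.radians(11.0), 0.0, math.radians(3.0))
print(f"R1={r1:.1f} m  R0={r0:.1f} m  R2={r2:.1f} m")
```

Note how, at such shallow grazing angles, a mere 3 degree beamwidth spreads the echo over several metres of range, which is the range-echo spread discussed above.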
- a change detection algorithm is applied in the vicinity of the model prediction.
- the change detection algorithm is a cumulative sum (CUSUM) test, based on cumulative sum charts, which detects systematic changes over time in a measured stationary variable.
- the CUSUM test tends to be computationally simple and intuitively easy to understand, and is fairly robust to different types of changes (abrupt or incipient). In this embodiment, the CUSUM test looks at prediction errors ε t of a power intensity value.
- x t is the power intensity of a particular point t in the radar image;
- μ is the mean of the power intensity of the observed radar data;
- σ is the standard deviation of the power intensity;
- ε t is a measure of the deviation of an observed power intensity value from a target value, i.e. ε t = (x t − μ)/σ.
- this test is implemented as a time recursion.
- the CUSUM test gives an alarm when the recent prediction errors have been sufficiently positive for a certain amount of time. Also, in this embodiment, the CUSUM test provides an alarm only if the power intensity increases.
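The one-sided CUSUM recursion described above can be sketched as follows; the drift and alarm-threshold values are illustrative tuning choices, not values from the patent:

```python
def cusum_alarm(samples, mean, std, drift=0.5, threshold=5.0):
    """One-sided CUSUM over standardized prediction errors.

    Alarms only on a sustained *increase* in power intensity, matching the
    behaviour described above. `drift` and `threshold` are tuning values
    assumed for illustration.
    """
    g = 0.0
    for t, x in enumerate(samples):
        eps = (x - mean) / std        # standardized prediction error
        g = max(0.0, g + eps - drift) # time recursion; clamped at zero
        if g > threshold:
            return t                  # index at which the alarm is raised
    return None                       # no systematic increase detected

# A flat signal followed by a sustained jump in intensity: the alarm fires
# shortly after the jump, once the accumulated evidence exceeds the threshold.
signal = [10.0] * 20 + [14.0] * 20
alarm_at = cusum_alarm(signal, mean=10.0, std=1.0)
print("alarm at sample:", alarm_at)
```

Clamping `g` at zero is what makes the test one-sided: decreases in intensity never accumulate evidence, so only rising echoes (e.g. the onset of the ground return) raise an alarm.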
- the ground echo is extracted from the radar image for a given azimuth angle.
- the background of the radar image is extracted from the radar image.
- Figure 6 is a process flow-chart of a process of power spectrum analysis performed at step s4 of the radar ground segmentation process of Figure 4.
- a power return model is fit to the radar observation for each azimuth angle.
- R is a distance of a target from the radar system 4;
- P r is a received power of the signal reflected from the target at distance R;
- R 0 is the slant range of the radar boresight as shown in Figure 3; k is the power return at the slant range R 0;
- G is the antenna gain
- a good match between the parametric model of the power return and the data attests to a high likelihood of traversable ground. Conversely, a poor goodness of fit between the model and the data suggests a low likelihood (due, for example, to the presence of an obstacle or to irregular terrain).
- P r is a function of the parameters R 0 and k.
- the values of k can be interpreted as the power return corresponding to the range of the central beam R 0, and can be estimated by data fitting for each azimuth angle.
- the parameters are continuously updated across the image background. This advantageously allows the model to adjust to local ground roughness and tends to produce a more accurate estimation of R 0.
- the initial parameter estimates (of R 0 and k) are chosen as the predicted range of the central beam and the maximum measured power value respectively. This advantageously tends to limit the problems of ill-conditioning and divergence.
- the output of the fitting process of step s10 is updated parameter values for R 0 and k. Also, an estimate of the goodness of fit of the model is output.
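Since the exact power return model is not reproduced in this text, the sketch below substitutes an assumed Gaussian-shaped echo, P r (R) = k·exp(−(R − R 0)²/(2w²)), which peaks at k at the boresight range R 0, and fits it by plain Gauss-Newton (the embodiment uses a Gauss-Newton-Marquardt method). The initial guesses follow the strategy above: the predicted central range for R 0 and the maximum measured power for k.

```python
import math

def fit_power_model(ranges, powers, r0_init, k_init, width=1.0, iters=30):
    """Fit Pr(R) = k * exp(-(R - R0)^2 / (2 w^2)) by Gauss-Newton.

    A Gaussian-shaped echo centred on the boresight range R0, peaking at k,
    is an illustrative stand-in for the patent's power return model.
    Returns updated (R0, k) estimates.
    """
    r0, k = r0_init, k_init
    for _ in range(iters):
        f = [k * math.exp(-((r - r0) ** 2) / (2 * width ** 2)) for r in ranges]
        resid = [p - fi for p, fi in zip(powers, f)]
        # Analytic Jacobian of the model with respect to (R0, k).
        j_r0 = [fi * (r - r0) / width ** 2 for r, fi in zip(ranges, f)]
        j_k = [fi / k for fi in f]
        # Solve the 2x2 normal equations J^T J d = J^T resid.
        a = sum(x * x for x in j_r0)
        b = sum(x * y for x, y in zip(j_r0, j_k))
        c = sum(y * y for y in j_k)
        g1 = sum(x * e for x, e in zip(j_r0, resid))
        g2 = sum(y * e for y, e in zip(j_k, resid))
        det = a * c - b * b
        if abs(det) < 1e-12:
            break
        r0 += (c * g1 - b * g2) / det
        k += (a * g2 - b * g1) / det
    return r0, k

# Synthetic echo samples drawn from the assumed model with R0 = 10 m, k = 5.
ranges = [8.0, 8.5, 9.0, 9.5, 10.0, 10.5, 11.0, 11.5, 12.0]
powers = [5.0 * math.exp(-((r - 10.0) ** 2) / 2.0) for r in ranges]
# Initial guesses per the text: predicted central range, maximum measured power.
r0_est, k_est = fit_power_model(ranges, powers, r0_init=9.5, k_init=max(powers))
print(f"R0 = {r0_est:.2f} m, k = {k_est:.2f}")
```

Starting k at the maximum measured power and R 0 near the predicted central range keeps the normal equations well conditioned, which is the point made above about limiting ill-conditioning and divergence.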
- a coefficient of efficiency is determined for each azimuth angle in the extracted image background using the output parameter values (R 0 and k) for that azimuth angle (that are determined at step s10 above).
- the coefficient of efficiency for a particular azimuth angle is determined using the following formula: E = 1 − Σ i (t i − y i)² / Σ i (t i − t̄)², where:
- E is the coefficient of efficiency for the particular azimuth angle
- t i is the measured intensity value of the i-th data point along the particular azimuth angle;
- t̄ is the mean of the measured intensity values of the data points along the particular azimuth angle;
- y i is the output of the fitting process of step s10 for the i-th data point.
- E ranges from −∞ to 1. Also, E is equal to 0 when the sum of the squared differences between measured and estimated values is as large as the variability in the measured data.
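The coefficient of efficiency as characterised above (a Nash-Sutcliffe-style measure) can be computed as:

```python
def coefficient_of_efficiency(measured, fitted):
    """Nash-Sutcliffe-style coefficient of efficiency E.

    E = 1 for a perfect fit; E = 0 when the squared fit error equals the
    variability of the measured data; E tends to -inf for very poor fits.
    """
    mean = sum(measured) / len(measured)
    ss_err = sum((t - y) ** 2 for t, y in zip(measured, fitted))
    ss_tot = sum((t - mean) ** 2 for t in measured)
    return 1.0 - ss_err / ss_tot

intensities = [2.0, 4.0, 6.0, 8.0]
print(coefficient_of_efficiency(intensities, intensities))           # 1.0 (perfect fit)
print(coefficient_of_efficiency(intensities, [5.0, 5.0, 5.0, 5.0]))  # 0.0 (mean predictor)
```

A model no better than predicting the mean intensity therefore scores E = 0, which is why the threshold described below is set well above zero.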
- radar observations in every azimuth angle are labelled.
- the classification or labelling of the radar observations along an azimuth angle is performed as follows.
- the data points along a particular azimuth angle are labelled as "ground" if the determined coefficient of efficiency E for that azimuth angle is greater than or equal to an experimentally determined threshold T 1.
- T 1 is equal to 0.8 (or 80%).
- T 1 is equal to a different value.
- T 1 is determined by a different appropriate method, i.e. other than experimentally.
- the data points along a particular azimuth angle are labelled as "not ground" if the determined coefficient of efficiency E for that azimuth angle is less than T 1.
- a physical consistency check is performed for each data point along an azimuth angle labelled as "ground”.
- a physical consistency check is performed by comparing the updated values of the proximal, distal and central range (i.e. R 1, R 2 and R 0 respectively) to each other. If the difference between the proximal and central range, i.e. (R 1 − R 0), is lower than a further experimentally determined threshold T 2, then the radar observation is more correctly labelled as "uncertain ground".
- T 2 is a further experimentally determined threshold.
- a similar check is performed between the central and distal range, i.e. (R 0 − R 2).
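The labelling and consistency-check logic described above can be sketched as follows; the threshold values used here are illustrative stand-ins for the experimentally determined T 1 and T 2:

```python
def classify_azimuth(e, r0, r1, r2, t1=0.8, t2=0.5):
    """Label the observations along one azimuth angle.

    `e` is the coefficient of efficiency for that azimuth; r1, r0, r2 are
    the fitted proximal, central and distal ranges. t1 and t2 stand in for
    the experimentally determined thresholds T1 and T2.
    """
    if e < t1:
        return "not ground"
    # Physical consistency check: the footprint edges should sit a plausible
    # distance either side of the central range.
    if abs(r0 - r1) < t2 or abs(r2 - r0) < t2:
        return "uncertain ground"
    return "ground"

print(classify_azimuth(0.95, r0=10.5, r1=9.2, r2=12.1))   # ground
print(classify_azimuth(0.95, r0=10.5, r1=10.3, r2=12.1))  # uncertain ground
print(classify_azimuth(0.4, r0=10.5, r1=9.2, r2=12.1))    # not ground
```

A good power-model fit whose fitted footprint has collapsed onto the central range is thus demoted to "uncertain ground" rather than accepted, which is the role of the physical consistency check.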
- an additional, optional process is performed for each azimuth angle labelled as "uncertain ground".
- an additional check is performed to detect possible obstacles present in the region of interest. These obstacles may appear as narrow pulses of high intensity.
- a value k is recorded (this value defines a variation range for the ground return).
- if the percentage relative change of the maximum intensity value (between the observation and the model) exceeds a predetermined threshold T 3, then it is determined that an object is present along that azimuth angle.
- the value of the predetermined threshold T 3 is determined experimentally. This process advantageously tends to detect obstacles present along that azimuth angle which appear as narrow pulses of high intensity.
- An optional additional process of assessing the accuracy of the system may be performed.
- the accuracy of the system in measuring the distance from the ground may be assessed through comparison with a "true ground map".
- the above described system outputs a relative slant range R 0,j for each azimuth angle j.
- a corresponding 3-D point P in a world reference frame is estimated. This is compared to its closest neighbour P* in the ground truth map.
- a mean square error in the elevation map is:
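Although the mean-square-error formula is not reproduced in this text, the comparison reduces to averaging squared elevation differences between estimated points and their closest ground-truth neighbours. A minimal sketch, assuming the nearest-neighbour pairing has already been done:

```python
def elevation_mse(estimated_z, truth_z):
    """Mean square error between estimated and ground-truth elevations.

    Assumes each estimated point has already been paired with the elevation
    of its closest neighbour in the true ground map.
    """
    n = len(estimated_z)
    return sum((z - zt) ** 2 for z, zt in zip(estimated_z, truth_z)) / n

# Three estimated elevations against a flat ground-truth patch at z = 0.
print(elevation_mse([0.1, 0.0, -0.2], [0.0, 0.0, 0.0]))  # ~0.0167 m^2
```

The square root of this value gives an RMS elevation error in metres, which is the figure usually quoted when assessing "sub-pixel" accuracy claims of the kind made below.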
- An advantage provided by the above described radar ground segmentation process is that obstacle avoidance, task-specific target detection, and the generation of terrain maps for navigation tend to be facilitated. Moreover, other applications, including scene understanding, segmentation and classification, and dynamic tracking, tend to be advantageously facilitated.
- a further advantage is that radar tends to provide information about distributed targets, and about multiple targets that appear in a single observation.
- a model describing the geometric and intensity properties of the ground echo in radar imagery is advantageously provided and exploited. This model advantageously facilitates the performance of the radar ground segmentation process, which tends to allow classification of observed ground returns.
- the above described process advantageously tends to enhance vehicle perception capabilities, e.g. in natural terrain and in all conditions.
- the identification of the ground tends to be facilitated (the ground typically being the terrain that is most likely to be traversable).
- the provided method and system advantageously tends to allow the vehicle to identify a traversable patch of its nearby environment with a single sweep.
- the ground echo model tends to allow range estimation along the entire ground footprint, for accurate environment mapping.
- a further advantage of the provided process is that the process tends to be relatively fast and reliable, and capable of extracting features from a large set of noisy data.
- a further advantage of the provided process is that the radar antennas used in the process tend to be of a size that allows the radars to be mounted on a vehicle, e.g. an autonomous ground vehicle.
- a further advantage provided by the above described process is that the accuracy of the measured ground surface tends to be improved to "sub-pixel" levels. This tends to yield improved accuracy over other conventional methods, such as selecting the highest-intensity peak as the ground point, which is subject to the range resolution of the radar.
- Apparatus, including the processor 6, for implementing the above arrangement, and performing the method steps described above, may be provided by configuring or adapting any suitable apparatus, for example one or more computers or other processing apparatus or processors, and/or providing additional modules.
- the apparatus may comprise a computer, a network of computers, or one or more processors, for implementing instructions and using data, including instructions and data in the form of a computer program or plurality of computer programs stored in or on a machine readable storage medium such as computer memory, a computer disk, ROM, PROM etc., or any combination of these or other storage media.
- the vehicle is an autonomous and unmanned land-based vehicle.
- the vehicle is a different type of vehicle.
- the vehicle is a manned and/or semi-autonomous vehicle.
- the above described radar ground segmentation process is implemented on a different type of entity instead of or in addition to a vehicle.
- the above described system/method may be implemented in an Unmanned Aerial Vehicle, or helicopter (e.g. to improve landing operations), or as a so-called "robotic cane" for visually impaired people.
- the above described system/method is implemented in a stationary system for security application, e.g. a fixed area scanner for tracking people or other moving objects by separating them from the ground return.
- the radar is a 95-GHz Frequency Modulated Continuous Wave (FMCW) millimetre-wave radar that reports the amplitude of echoes at ranges between 1 m and 120m.
- the wavelength of the emitted radar signal is 3mm.
- the beam-width of the emitted radar signal is 3.0° in elevation and 3.0° in azimuth.
- the radar is a different appropriate type of radar e.g. a radar having different appropriate specifications.
- the vehicle is used to implement the radar ground segmentation process in the scenario described above with reference to Figure 2.
- the above described process is implemented in a different appropriate scenario, for example, a scenario in which a variety of terrain features and/or objects are present, and/or in the presence of challenging environmental conditions such as adverse weather conditions or dust/smoke clouds.
- the beginning of the far-field region for the radar antenna is 15m from the radar system.
- the far-field region begins at a different distance from the radar system.
- the radar signal is directed at the front of the vehicle with a constant pitch or grazing angle γ of about 11 degrees.
- the radar signal is directed from a different area of the vehicle at any appropriate grazing angle.
- the geometrical model used at step s6 to estimate the range spread of the ground echo is based on an assumption of globally flat ground. However, in other embodiments this assumption is not made, or a different assumption is made.
- the radar system is used to generate a radar image of the area of terrain in the near-field region of the radar in the radar system 4.
- the radar operates partially, or wholly, in the radar near-field.
- the radar system may be used to generate images, i.e. operate, partially or wholly in the radar far-field.
- a change detection algorithm is implemented.
- a cumulative sum (CUSUM) test is used.
- a different appropriate change detection process is used, for example, applying edge detection techniques to the whole radar image.
- a power return model is fit to the radar observation for each azimuth angle.
- the power return model used in the above embodiments is as described above with reference to step s10. However, in other embodiments a different type of model, or different power return model is fit to the radar observation.
- a non-linear least squares approach using the Gauss-Newton-Marquardt method is adopted for data fitting.
- a different data fitting method is used.
- a coefficient of efficiency is determined for each azimuth angle in the extracted image background.
- a different type of confidence measure is determined for the extracted image background.
- data points along each azimuth angle are classified as either “ground”, “not ground”, or “uncertain”.
- any number of different classifications may be used instead of or in addition to those classifications.
- the data points along a particular azimuth angle are labelled as "ground” if the determined coefficient of efficiency E for that azimuth angle is greater than or equal to a threshold.
- a data point is classified as "ground" if one or more different criteria are satisfied instead of or in addition to the criterion that the coefficient of efficiency is greater than or equal to a threshold.
- the data points along a particular azimuth angle are labelled as "not ground” if the determined coefficient of efficiency E for that azimuth angle is less than a threshold.
- a data point is classified as "not ground” if one or more different criteria are satisfied instead of or in addition to the criterion that the coefficient of efficiency is less than a threshold.
- a physical consistency check is performed for each data point along an azimuth angle labelled as "ground". This may lead to a data point that has been classified as "ground" being reclassified as "uncertain". However, in other embodiments a data point is classified as "uncertain" if one or more different criteria are satisfied instead of or in addition to the criterion based on the physical consistency check. Also, in other embodiments, a different type of consistency check is used.
- a percentage relative change in the maximum intensity value between the observation and the model along the particular azimuth angle is determined. This value is then used to identify whether there is an obstacle in the region of interest. However, in other embodiments this process is not performed. Also, in other embodiments a different process is used to identify whether there is an object along a particular azimuth angle.
- the radar system radiates a continuous wave.
- the radar signal has a different type of radar modulation.
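As a concrete illustration of the CUSUM embodiment mentioned above, the following sketch flags a sustained rise in return power along a range profile. This is a minimal one-sided CUSUM; the `drift` and `threshold` values are illustrative assumptions, not parameters taken from the patent.

```python
def cusum(samples, target_mean, drift=0.5, threshold=5.0):
    """One-sided CUSUM: return the index at which a sustained upward
    change in the mean is flagged, or None if no change is detected.
    drift and threshold are illustrative tuning values."""
    s = 0.0
    for i, x in enumerate(samples):
        # accumulate excess above the expected mean plus an allowance (drift)
        s = max(0.0, s + (x - target_mean - drift))
        if s > threshold:
            return i
    return None

# Usage: flat background power, then a step up at index 10.
profile = [1.0] * 10 + [4.0] * 10
print(cusum(profile, target_mean=1.0))  # flags at index 12, shortly after the step
```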
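The non-linear least-squares fitting embodiment can be sketched as follows. This is Gauss-Newton with Marquardt damping on a two-parameter model; the exponential form `p(r) = a·exp(-b·r)` is an assumed stand-in for the patent's power return model, and the starting values and damping factor are illustrative.

```python
import math

def fit_power_model(ranges, powers, a0=1.0, b0=0.1, lam=1e-3, iters=50):
    """Fit p(r) = a * exp(-b * r) to (range, power) data by Gauss-Newton
    iterations with a fixed Marquardt damping term lam on the normal
    equations. The model form and constants are illustrative."""
    a, b = a0, b0
    for _ in range(iters):
        # residuals and the Jacobian of the residual vector
        res = [y - a * math.exp(-b * x) for x, y in zip(ranges, powers)]
        ja = [-math.exp(-b * x) for x in ranges]          # d res / d a
        jb = [a * x * math.exp(-b * x) for x in ranges]   # d res / d b
        # damped normal equations: (J^T J + lam * I) delta = -J^T res
        m00 = sum(p * p for p in ja) + lam
        m01 = sum(p * q for p, q in zip(ja, jb))
        m11 = sum(q * q for q in jb) + lam
        v0 = -sum(p * r for p, r in zip(ja, res))
        v1 = -sum(q * r for q, r in zip(jb, res))
        det = m00 * m11 - m01 * m01
        da = (v0 * m11 - v1 * m01) / det
        db = (v1 * m00 - v0 * m01) / det
        a, b = a + da, b + db
    return a, b

# Usage: recover the parameters from noiseless synthetic data;
# the iterations converge towards (a, b) = (2.0, 0.3).
xs = list(range(10))
ys = [2.0 * math.exp(-0.3 * x) for x in xs]
a, b = fit_power_model(xs, ys)
```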
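The coefficient-of-efficiency confidence measure and the threshold-based "ground"/"not ground" labelling can be sketched as below. E here is computed in the usual Nash-Sutcliffe style; the 0.8 threshold is an illustrative assumption, not a value from the patent.

```python
def coefficient_of_efficiency(observed, modelled):
    """Nash-Sutcliffe-style coefficient of efficiency E for one azimuth angle:
    1 is a perfect fit; values near or below 0 mean the model explains the
    observation no better than its mean."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - m) ** 2 for o, m in zip(observed, modelled))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def classify_azimuth(E, threshold=0.8):
    """Label the data points along one azimuth angle; threshold is illustrative."""
    return "ground" if E >= threshold else "not ground"
```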
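Finally, the obstacle-identification embodiment based on the percentage relative change in maximum intensity can be sketched as follows; the 20% decision threshold is an illustrative assumption.

```python
def relative_max_change(observed, modelled):
    """Percentage relative change in peak intensity between the observation
    and the model along one azimuth angle."""
    return 100.0 * abs(max(observed) - max(modelled)) / max(modelled)

def obstacle_present(observed, modelled, pct_threshold=20.0):
    """Flag a possible obstacle when the observed peak deviates from the
    modelled peak by more than the (illustrative) percentage threshold."""
    return relative_max_change(observed, modelled) > pct_threshold
```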
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar Systems Or Details Thereof (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11839025.1A EP2638410A1 (en) | 2010-11-11 | 2011-11-10 | Radar image processing |
AU2011326353A AU2011326353A1 (en) | 2010-11-11 | 2011-11-10 | Radar image processing |
US13/884,850 US20130293408A1 (en) | 2010-11-11 | 2011-11-10 | Radar image processing |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2010905003 | 2010-11-11 | ||
AU2010905003A AU2010905003A0 (en) | 2010-11-11 | Radar Image Processing |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012061896A1 true WO2012061896A1 (en) | 2012-05-18 |
Family
ID=46050247
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2011/001458 WO2012061896A1 (en) | 2010-11-11 | 2011-11-10 | Radar image processing |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130293408A1 (en) |
EP (1) | EP2638410A1 (en) |
AU (1) | AU2011326353A1 (en) |
WO (1) | WO2012061896A1 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10417918B2 (en) * | 2016-01-20 | 2019-09-17 | Honeywell International Inc. | Methods and systems to assist in a search and rescue mission |
FI127505B (en) * | 2017-01-18 | 2018-08-15 | Novatron Oy | Earth moving machine, range finder arrangement and method for 3d scanning |
US10939207B2 (en) * | 2017-07-14 | 2021-03-02 | Hewlett-Packard Development Company, L.P. | Microwave image processing to steer beam direction of microphone array |
CN109073744A (en) * | 2017-12-18 | 2018-12-21 | 深圳市大疆创新科技有限公司 | Landform prediction technique, equipment, system and unmanned plane |
EP3505959A1 (en) | 2017-12-29 | 2019-07-03 | Acconeer AB | An autonomous mobile cleaning robot |
EP3599484A1 (en) * | 2018-07-23 | 2020-01-29 | Acconeer AB | An autonomous moving object |
SE542921C2 (en) * | 2019-01-24 | 2020-09-15 | Acconeer Ab | Autonomous moving object with radar sensor |
CN111722187B (en) * | 2019-03-19 | 2024-02-23 | 富士通株式会社 | Radar installation parameter calculation method and device |
CN110309790B (en) * | 2019-07-04 | 2021-09-03 | 闽江学院 | Scene modeling method and device for road target detection |
CN111751796B (en) * | 2020-07-03 | 2023-08-22 | 成都纳雷科技有限公司 | Traffic radar angle measurement method, system and device based on one-dimensional linear array |
CN114509042B (en) * | 2020-11-17 | 2024-05-24 | 易图通科技(北京)有限公司 | Shading detection method, shading detection method of observation route and electronic equipment |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5240159A (en) * | 1992-10-15 | 1993-08-31 | Bianchi International | Shoulder harness for backpack |
US6179186B1 (en) * | 1997-01-06 | 2001-01-30 | Global Act Ab | Backpack |
US20030000985A1 (en) * | 2001-06-30 | 2003-01-02 | Terry Schroeder | Posture pack TM - posture friendly backpack |
JP2003125951A (en) * | 2001-10-25 | 2003-05-07 | Nagatanien:Kk | Stirring container |
US20040020958A1 (en) * | 2002-07-31 | 2004-02-05 | Gallant Industrial Co., Ltd. | Backpack |
US6926183B2 (en) * | 2001-12-28 | 2005-08-09 | Danny Yim Hung Lui | Shoulder-borne carrying straps, carrying strap assemblies and golf bags incorporating the same |
US20050230445A1 (en) * | 2004-04-19 | 2005-10-20 | Wallace Woo | Backpack |
US20060093710A1 (en) * | 2004-11-02 | 2006-05-04 | Bengtson Timothy A | Beverage container with juice extracting feature |
US20080123464A1 (en) * | 2006-11-24 | 2008-05-29 | Jason Griffin | Combination drink dispenser |
US20090120932A1 (en) * | 2007-11-09 | 2009-05-14 | Mclaughlin Kevin W | Cocktail shaker |
WO2009104099A2 (en) * | 2008-02-19 | 2009-08-27 | Egidio Renna | Disposable shaker |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69721085T2 (en) * | 1996-05-14 | 2004-04-22 | Honeywell International Inc. | Autonomous landing system |
US7307575B2 (en) * | 2004-09-14 | 2007-12-11 | Bae Systems Information And Electronic Systems Integration Inc. | Through-the-wall frequency stepped imaging system utilizing near field multiple antenna positions, clutter rejection and corrections for frequency dependent wall effects |
US7479918B2 (en) * | 2006-11-22 | 2009-01-20 | Zimmerman Associates, Inc. | Vehicle-mounted ultra-wideband radar systems and methods |
US7773205B2 (en) * | 2007-06-06 | 2010-08-10 | California Institute Of Technology | High-resolution three-dimensional imaging radar |
US7782251B2 (en) * | 2007-10-06 | 2010-08-24 | Trex Enterprises Corp. | Mobile millimeter wave imaging radar system |
GB2453927A (en) * | 2007-10-12 | 2009-04-29 | Curtiss Wright Controls Embedded Computing | Method for improving the representation of targets within radar images |
US8044846B1 (en) * | 2007-11-29 | 2011-10-25 | Lockheed Martin Corporation | Method for deblurring radar range-doppler images |
US7532150B1 (en) * | 2008-03-20 | 2009-05-12 | Raytheon Company | Restoration of signal to noise and spatial aperture in squint angles range migration algorithm for SAR |
US20100152600A1 (en) * | 2008-04-03 | 2010-06-17 | Kai Sensors, Inc. | Non-contact physiologic motion sensors and methods for use |
US8362946B2 (en) * | 2008-10-03 | 2013-01-29 | Trex Enterprises Corp. | Millimeter wave surface imaging radar system |
US8144052B2 (en) * | 2008-10-15 | 2012-03-27 | California Institute Of Technology | Multi-pixel high-resolution three-dimensional imaging radar |
EP2320247B1 (en) * | 2009-11-04 | 2017-05-17 | Rockwell-Collins France | A method and system for detecting ground obstacles from an airborne platform |
KR101142737B1 (en) * | 2009-12-10 | 2012-05-04 | 한국원자력연구원 | Countermeasure system for birds |
CA2789737A1 (en) * | 2010-02-16 | 2011-08-25 | Sky Holdings Company, Llc | Systems, methods and apparatuses for remote device detection |
JP5580621B2 (en) * | 2010-02-23 | 2014-08-27 | 古野電気株式会社 | Echo signal processing device, radar device, echo signal processing method, and echo signal processing program |
US8487807B2 (en) * | 2010-06-28 | 2013-07-16 | Institut National D'optique | Synthetic aperture imaging interferometer |
WO2012000076A1 (en) * | 2010-06-28 | 2012-01-05 | Institut National D'optique | Method and apparatus for determining a doppler centroid in a synthetic aperture imaging system |
US9442189B2 (en) * | 2010-10-27 | 2016-09-13 | The Fourth Military Medical University | Multichannel UWB-based radar life detector and positioning method thereof |
- 2011
- 2011-11-10 WO PCT/AU2011/001458 patent/WO2012061896A1/en active Application Filing
- 2011-11-10 EP EP11839025.1A patent/EP2638410A1/en not_active Withdrawn
- 2011-11-10 US US13/884,850 patent/US20130293408A1/en not_active Abandoned
- 2011-11-10 AU AU2011326353A patent/AU2011326353A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
AU2011326353A1 (en) | 2013-05-30 |
US20130293408A1 (en) | 2013-11-07 |
EP2638410A1 (en) | 2013-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20130293408A1 (en) | Radar image processing | |
Reina et al. | Radar‐based perception for autonomous outdoor vehicles | |
EP3663790A1 (en) | Method and apparatus for processing radar data | |
US7132974B1 (en) | Methods and systems for estimating three dimensional distribution of turbulence intensity using radar measurements | |
US6792684B1 (en) | Method for determination of stand attributes and a computer program to perform the method | |
Dierking et al. | Change detection for thematic mapping by means of airborne multitemporal polarimetric SAR imagery | |
Reymann et al. | Improving LiDAR point cloud classification using intensities and multiple echoes | |
WO2012122589A1 (en) | Image processing | |
US11333753B2 (en) | Stripmap synthetic aperture radar (SAR) system utilizing direct matching and registration in range profile space | |
US10444398B2 (en) | Method of processing 3D sensor data to provide terrain segmentation | |
Negahdaripour | On 3-D scene interpretation from FS sonar imagery | |
Gross et al. | Segmentation of tree regions using data of a full-waveform laser | |
Lee et al. | Investigations into the influence of object characteristics on the quality of terrestrial laser scanner data | |
EP1515160B1 (en) | A target shadow detector for synthetic aperture radar | |
Helgesen et al. | Low altitude georeferencing for imaging sensors in maritime tracking | |
CN112313535A (en) | Distance detection method, distance detection device, autonomous mobile platform, and storage medium | |
Hyyppä et al. | Airborne laser scanning | |
Mandlburger et al. | Feasibility investigation on single photon LiDAR based water surface mapping | |
Reina et al. | Short-range radar perception in outdoor environments | |
Mecocci et al. | Radar image processing for ship-traffic control | |
Glira et al. | 3D mobile mapping of the environment using imaging radar sensors | |
Jose et al. | Relative radar cross section based feature identification with millimeter wave radar for outdoor slam | |
Brooker et al. | High‐resolution millimeter‐wave radar systems for visualization of unstructured outdoor environments | |
Kuttikkad et al. | Building 2D wide-area site models from single-and multipass single-polarization SAR data | |
Overbye et al. | Radar-Only Off-Road Local Navigation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11839025 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011839025 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2011326353 Country of ref document: AU Date of ref document: 20111110 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13884850 Country of ref document: US |