GB2554179A - Method for generating an inverse synthetic aperture radar image - Google Patents


Publication number
GB2554179A
Authority
GB
United Kingdom
Prior art keywords
target
radar
image
geometric image
point
Prior art date
Legal status (assumed, not a legal conclusion): Granted
Application number
GB1713833.0A
Other versions
GB201713833D0 (en)
GB2554179B (en)
Inventor
Veyer Antoine
Corretja Vincent
Garrec Patrick
Current Assignee
Thales SA
Original Assignee
Thales SA
Priority date
Filing date
Publication date
Application filed by Thales SA filed Critical Thales SA
Publication of GB201713833D0 publication Critical patent/GB201713833D0/en
Publication of GB2554179A publication Critical patent/GB2554179A/en
Application granted granted Critical
Publication of GB2554179B publication Critical patent/GB2554179B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS; G01 — MEASURING; TESTING; G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/90 — Radar or analogous systems specially adapted for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR]
    • G01S 13/9094 — Theoretical aspects
    • G01S 13/904 — SAR modes; G01S 13/9064 — Inverse SAR [ISAR]
    • G01S 7/412 — Identification of targets based on a comparison between measured radar-reflectivity values and known or stored values
    • G01S 7/415 — Identification of targets based on measurements of movement associated with the target

Abstract

Method of generating an inverse synthetic aperture radar (ISAR) image of a target seen by a radar, the target having at least two parts with a distinct radial speed relative to the radar. A geometric image is provided including the target seen from the radar, the target and the radar being in the desired relative position. For each point of the geometric image corresponding to part of the target, the radial speed and the distance from the radar are determined. For each point of the geometric image corresponding to part of the target, the signal amplitude is simulated by randomly drawing a value from among a plurality of values. The target may be a vessel such as a ship (fig 3, not shown); the pitch and roll of the target may also be taken into account. The geometric image may be displayed via a graphics engine, and points corresponding to the environment of the target may be eliminated from the image.

Description

(71) Applicant(s):
Thales
TOUR CARPE DIEM, Place des Corolles,
Esplanade Nord, Courbevoie 92400,
France (including Overseas Departments and Territories)
(72) Inventor(s): Antoine Veyer, Vincent Corretja, Patrick Garrec
(74) Agent and/or Address for Service:
A.A. Thornton & CO.
Old Bailey, London, EC4M 7NG, United Kingdom
(51) INT CL: G01S 13/90 (2006.01); G01S 7/41 (2006.01)
(56) Documents Cited:
WO 2007/133085 A1; CN 104407336 A; JP 2008026106 A; US 2010/0052977 A1
EUSAR 2014; Proceedings of 10th European Conference on Synthetic Aperture Radar, Knapskog et al., 'Target Recognition of Ships in Harbour Based on Simulated SAR Images Produced with MOCEM Software', pages 608-611
8th European Conference on Synthetic Aperture Radar (EUSAR), 2010, A. O. Knapskog, 'Classification of Ships in TerraSAR-X Images Based on 3D Models and Silhouette Matching', pages 234-237 (58) Field of Search:
INT CL G01S
Other: WPI, EPODOC, Patents Fulltext, INSPEC, XPI3E, Internet
(54) Title of the Invention: Method for generating an inverse synthetic aperture radar image
Abstract Title: Method for generating an inverse synthetic aperture radar image
[Figures 1 to 8 of the drawings are omitted from this text rendering; they are described below.]
Method for generating an inverse synthetic aperture radar image
The present invention relates to a method for generating an inverse synthetic aperture radar image and an associated computer program product.
Radar imaging is a technique for computing images representative of maps of elementary reflectors. It is based on processing the signals backscattered by the object to be observed and recorded by a coherent radar system.
When the radar system has a wide bandwidth and a relative rotational movement exists between the radar and the object (synthetic aperture), imaging based on the ISAR technique can be used. The acronym ISAR stands for Inverse Synthetic Aperture Radar; the term inverse indicates that the exploited movement is that of the object.
Imaging using the ISAR technique makes it possible to obtain two-dimensional images of a given object with a good resolution. Such images are in particular used in maritime patrol aircraft to classify and identify vessels.
It is desirable in simulations to be able to re-create such images obtained using the ISAR technique. Test benches and ground trainers in particular use these simulated images. In the case of test benches, this makes it possible to validate the chain ensuring the transfer of the image between the radar and the mission system as well as the display of the image. For ground trainers, the re-created images are useful to train maritime patrol aircraft on the use of a radar provided with imaging using the ISAR technique.
To that end, simulation methods are known using calculations involving the radar wave propagation equation. Consequently, the simulation method involves modeling various propagation and reflection phenomena of the electromagnetic wave. This assumes knowledge of the geometry and materials of the object illuminated by the radar. For example, the object is generally broken down into elementary facets with which absorption and/or reflection coefficients of the energy emitted by the radar are associated. Such coefficients are representative of the material making up the facet of the object.
However, such simulation methods have two major drawbacks. On the one hand, the reflection coefficients on the object of the wave emitted by the radar are often not well known, which limits the precision of the generation. On the other hand, the calculations are complex and require a relatively long calculation time.
There is therefore a need for a method for generating an inverse synthetic aperture radar image that is easier to implement while making it possible to retain at least the precision of the generating methods based on a breakdown of the object into elementary facets.
To that end, the description describes a method for generating an inverse synthetic aperture radar image of a target seen by a radar, the target having several parts, at least two parts having a distinct radial speed relative to the radar, the method including at least the steps of:
- providing the desired relative position of the target and the radar for the image to be generated,
- providing a geometric image including the target seen from the radar, the target and the radar being in the desired relative position,
- for each point of the geometric image corresponding to part of the target, determining the distance between the part of the target in question and the radar and the radial speed of that target part,
- for each point of the geometric image corresponding to part of the target, simulating the signal amplitude received by the radar by randomly drawing a value from among a plurality of values, and
- building the image to be generated, each point of the image to be generated being associated bijectively with a point of the geometric image and having the determined distance, the determined speed and the randomly drawn value of the point of the geometric image in question as coordinates.
According to specific embodiments, the method comprises one or more of the following features, considered alone or according to any technically possible combinations:
- the target is a vessel.
- the method includes a step for providing the rotational pitch speed of the target and the rotational roll speed of the target and wherein, in the determining step, the speed of each point of the geometric image corresponding to part of the target is determined from the rotational pitch speed of the target, the rotational roll speed of the target and the coordinate of the point in the geometric image.
- the method includes the detection of the coordinates of each point of the geometric image corresponding to part of the target.
- the method includes displaying the geometric image via a graphic engine including a buffer depth memory and the detection includes comparing values from the buffer depth memory before display and after display.
- the detection also includes converting the depth into coordinates.
- the geometric image also includes the environment of the target and the method includes a step for eliminating points corresponding to the environment.
- the method includes displaying the geometric image via a graphic engine including a buffer color memory and the elimination step includes comparing values from the buffer color memory before display and after display.
- the method includes providing the viewing direction of the radar, the aspect angle of the target and the distance between the radar and the target.
The description also proposes a computer program product suitable for implementing a method for generating an inverse synthetic aperture radar image as previously described.
The description also describes a computer medium storing instructions of the preceding computer program product.
Other features and advantages of the invention will appear upon reading the following description of embodiments of the invention, provided as an example only and in reference to the drawings, which are:
- figure 1, a schematic view of a graphic engine and a monitor,
- figure 2, a flowchart of an example implementation of part of a generating method,
- figure 3, a schematic view of a geometric image of a vessel seen by a radar,
- figure 4, a view illustrating the passage matrix between several systems of coordinates,
- figure 5, a graph showing the breakdown of the roll rotation in a plane of reference,
- figure 6, a graph showing the breakdown of the pitch rotation in a plane of reference,
- figure 7, a graph of the distribution of points in a distance and radial speed plane of reference, and
- figure 8, an inverse synthetic aperture radar image of a vessel seen by a radar, the image being obtained by implementing the determination method according to figure 2.
A monitor 10 and a graphic display engine 12 are shown in figure 1.
The monitor 10 is able to display images.
The monitor 10 makes it possible to define a display window size. Typically, the display window size corresponds to the number of pixels of the monitor 10.
The graphic display engine 12 is more simply referred to as graphic engine 12 hereinafter.
The graphic engine 12 is a set of components able to carry out a display of an image.
According to the example of figure 1, the graphic engine 12 includes a processor 14, a buffer depth memory 16 and a buffer color memory 18.
The processor 14 is able to control each component of the graphic engine 12 and to carry out calculation operations.
The buffer depth memory 16 is also referred to as the "Z-buffer".
The buffer depth memory 16 is a memory storing a table associating each pixel of the image with the distance of the pixel from the viewer of the image.
More specifically, the table of the buffer depth memory 16 has a size identical to the display window and provides, for each pixel of the display window, the distance from the pixel to the observer.
The buffer color memory 18 is also referred to as the "color buffer".
The buffer color memory 18 is a memory storing a table associating each pixel of the image with the color of the pixel in a colorimetric base.
The RGB base and the HSL (hue, saturation, lightness) base are two examples of colorimetric bases.
The operation of the monitor 10 and the graphic engine 12 will now be described in reference to the implementation of a method for generating an inverse synthetic aperture radar image.
The method makes it possible to generate an inverse synthetic aperture radar image of a target seen by the radar.
The target has several parts, at least two parts having a distinct radial speed. The radial speed is defined relative to a radar, the radial speed being the projection of the speed over the line of sight of the radar. The line of sight of the radar is the straight line passing through the radar and the center of the target.
As an example hereinafter, it is assumed that the target is a vessel.
The generating method includes a supply phase P1, a processing phase P2 and a generating phase P3 strictly speaking.
Only the processing phase P2 is shown schematically in figure 2.
The supply phase P1 includes a plurality of supply steps.
The supply phase P1 in particular includes a step for supplying the desired relative position of a vessel and a radar for the image to be generated.
According to the specific illustrated example, the supply phase P1 includes supplying the following information: for the radar, position, speed, route and direction of the sighting, and for the vessel, position, attitude, speed, route, as well as the pitch and roll speeds.
According to another embodiment, the supply step includes supplying the viewing direction of the radar, the distance between the vessel and the radar and the aspect angle of the vessel.
The aspect angle is defined as the angle between the viewing direction of the radar and the axis of the vessel.
The supply phase P1 also includes a step for supplying a geometric image including the vessel seen from the radar, the vessel and the radar being in the desired relative position.
A geometric image is an image of the vessel seen from the radar as the vessel would be seen by a camera operating in the visible domain.
The geometric image makes it possible to view at least the outside of the vessel, in particular its hull and the superstructures.
According to one embodiment, the geometric image makes it possible to view only an emerging part of the vessel.
According to another embodiment, the geometric image results from a reconstruction.
For example, it is assumed that the supply step includes supplying a geometric image including the vessel seen from the radar, the vessel and the radar being in the desired relative position.
The relative positioning is done by the processor 14 from a geometric model of the vessel, in particular one made available by the builder of the vessel.
Subsequently, a plane of reference is defined. As an example, the origin of the plane of reference is on the vertical of the radar at zero altitude, the axes being an axis Xref toward the North, an axis Yref toward the East and an axis Zref downward.
A plane of reference related to the target is also defined. The plane of reference related to the target has its origin at the center of the target and three axes, namely an axis Xtarget in the forward direction, an axis Ytarget toward the right and an axis Ztarget downward.
More specifically, during the reconstruction, the vessel is inserted at a zero altitude using its relative position with the radar as well as its three attitude angles (heading, pitch and roll).
Furthermore, an environment is also inserted to obtain a horizontal flat surface with a predefined color, zero altitude, representing the sea.
An orthographic projection of the assembly of the vessel and the environment is next implemented.
The orthographic projection depends on the size of the display window and the distance of cutting planes along the viewing axis (near plane and far plane).
The set of the preceding operations is, according to one embodiment, implemented by the graphic engine 12.
One example geometric image obtained at the end of the performance of the supply step is in particular illustrated by figure 3.
In this image, the vessel is denoted N and the environment, which is the sea, is denoted E.
The processing phase P2 includes, according to the example of figure 2, a step 50 for displaying an image, a step 52 for obtaining particular pixels of the image, a calculating step 54 and a simulation step 56.
In the display step 50, the geometric image obtained in the supply step is displayed.
The geometric image is displayed using the graphic engine 12.
In the display step, the buffer depth memory 16 and the buffer color memory 18 are initialized at a predefined value corresponding to an erasure value.
The display changes the value of the depth for certain pixels of the buffer depth memory 16 and gives a color value for each pixel of the buffer color memory 18.
In the step 52 for obtaining pixels, the points of the geometric image corresponding to part of the vessel N are obtained.
This makes it possible to eliminate the pixels belonging to the environment E or the parts of the vessel that are concealed by the environment E.
To that end, during the obtaining step, coordinates are detected of each point of the geometric image corresponding to part of the vessel N.
According to the illustrated example, the detection includes comparing values from the buffer depth memory 16 before display of the geometric image and after display of the geometric image.
This makes it possible to obtain pixels for which the value of the buffer depth memory 16 is modified by the display, which are pixels corresponding to part of the vessel and, if applicable, the environment.
Since the geometric image also includes the environment E of the vessel N, in the obtaining step 52, a step for eliminating points corresponding to the environment E is carried out.
According to the illustrated example, the detection includes comparing values from the buffer color memory 18 before display and after display.
This makes it possible to obtain candidate pixels from among which a selection is made to select the pixels corresponding to a predefined color.
For example, the predefined color is blue for the sea.
Consequently, in an embodiment using an RGB colorimetric base, the selected candidate pixels are the pixels having a predominant blue component (for example, greater than 75%, the sum of the red and green components being less than 25%).
The selected candidate pixels are next eliminated.
Thus, in the step 52 for obtaining pixels, the points of the geometric image corresponding to part of the vessel N are obtained.
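By way of illustration, the detection and elimination of step 52 can be sketched as follows, with the two buffers modeled as NumPy arrays; the function name, the array shapes and the blue-dominance threshold follow the example above, and all other names are hypothetical.

```python
import numpy as np

def target_pixel_mask(depth_before, depth_after, color_after):
    """Return a boolean mask of the pixels belonging to the vessel.

    depth_before / depth_after : 2-D arrays read from the buffer depth
    memory before and after the display of the geometric image.
    color_after : H x W x 3 RGB array read from the buffer color memory
    after the display.
    """
    # Pixels whose depth changed were written by the display:
    # they belong either to the vessel or to the environment (sea).
    drawn = depth_after != depth_before

    # Candidate sea pixels: predominant blue component, as in the text
    # (blue greater than 75 % of the sum of the components).
    total = color_after.sum(axis=-1).astype(float)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    blue_ratio = color_after[..., 2] / total
    sea = drawn & (blue_ratio > 0.75)

    # Keep only the drawn pixels that are not sea.
    return drawn & ~sea
```

Applied to the buffers of figure 3, such a mask would retain only the pixels of the vessel N, the sea E being discarded by the color test.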
The calculating 54 and simulation 56 steps are carried out for each point of the geometric image corresponding to part of the vessel N.
This is illustrated by the test 58; as long as a pixel corresponding to part of the vessel N has not been processed in the image, the calculating 54 and simulation 56 steps are reiterated, shown by arrow 60 in figure 2. When all of the pixels corresponding to part of the vessel N have been processed, the result is obtained, which is shown by the result 62 in figure 2.
The calculating step 54 includes determining the distance between the part of the vessel N in question and the radar on the one hand, and determining the radial speed of the vessel N part, on the other hand.
First, the distance between the part of the vessel N in question and the radar is determined.
For example, for the pixels corresponding to the vessel N, the position of each of the pixels is calculated.
To that end, the value V of the buffer depth memory 16 is read to provide the distance between the radar and the considered part of the vessel N.
The read value v is a normalized value, for example between 0 and 1.
The orthographic distance is then calculated using the following formula:
d = dp + v × (di − dp)
Where:
• d is the orthographic distance,
• dp and di are the distances of the two cutting planes (near plane and far plane) used in the orthographic rendering projection of the scene, and
• v is the value read from the buffer depth memory 16.
The position of a pixel is next determined in the display plane perpendicular to the line of sight. Such a determination is in particular illustrated by figure 4.
The position is given by three coordinates X, Y and Z, these coordinates being given by the following relationships:
X = d
Y = −L/2 + C × L / L(in pixels)
Z = −H/2 + R × H / H(in pixels)
Where:
• R and C are the row and column positions of the pixel on the monitor 10 (the origin being situated in the bottom left corner),
• L designates the horizontal dimension, in meters, of the orthographic projection of the rendering of the scene, while L(in pixels) is the horizontal dimension of the monitor 10 in pixels, and
• H designates the vertical dimension, in meters, of the orthographic projection of the rendering of the scene, while H(in pixels) is the vertical dimension of the monitor 10 in pixels.
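By way of illustration, the depth denormalization and the pixel-to-meter conversion above can be sketched as follows. This is a minimal sketch under the conventions just stated (origin in the bottom-left corner, orthographic projection); the function and parameter names are hypothetical.

```python
def viewing_frame_coords(v, row, col, d_near, d_far,
                         width_m, height_m, width_px, height_px):
    """Convert a normalized depth value and a pixel position into
    (X, Y, Z) coordinates in the viewing frame, in meters.

    v        : normalized depth read from the Z-buffer (0..1)
    row, col : pixel position R, C (origin in the bottom-left corner)
    d_near, d_far       : distances dp and di of the two cutting planes
    width_m, height_m   : dimensions L and H of the orthographic projection
    width_px, height_px : dimensions of the display window in pixels
    """
    # Denormalize the depth: d = dp + v * (di - dp)
    d = d_near + v * (d_far - d_near)
    x = d
    # Map the column onto [-L/2, +L/2] and the row onto [-H/2, +H/2].
    y = -width_m / 2.0 + col * width_m / width_px
    z = -height_m / 2.0 + row * height_m / height_px
    return x, y, z
```

For instance, the center pixel of a 100 × 100 window with a mid-scale depth value lands on the viewing axis at the mid-distance between the two cutting planes.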
The three coordinates X, Y and Z are next converted into coordinates of the pixel. For example, such a conversion uses the fact that in the reference plane of reference, the line of sight is defined by A the azimuth angle (rotation angle in the horizontal plane) and S the elevation angle (rotation angle in the vertical plane) as well as the fact that the projection is orthographic.
As a result, the conversion is done by applying the following passage formula:
( Xr )   ( cos(A)  −sin(A)  0 )   ( cos(S)   0  sin(S) )   ( X )
( Yr ) = ( sin(A)   cos(A)  0 ) × (   0      1    0    ) × ( Y )
( Zr )   (   0        0     1 )   ( −sin(S)  0  cos(S) )   ( Z )
Where Xr, Yr and Zr designate the coordinates of the pixel in the reference plane of reference.
The distance D between the radar and the pixel is next calculated by using the following formula:
D = √(Xr² + Yr² + Zr²)
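As a sketch of this conversion (the function name and argument order are hypothetical; angles in radians), the two rotations and the distance can be written:

```python
import math

def to_reference_frame(x, y, z, azimuth, elevation):
    """Rotate viewing-frame coordinates into the reference frame using
    the azimuth angle A and the elevation angle S, then return the
    rotated coordinates and the radar-pixel distance
    D = sqrt(Xr^2 + Yr^2 + Zr^2)."""
    ca, sa = math.cos(azimuth), math.sin(azimuth)
    ce, se = math.cos(elevation), math.sin(elevation)
    # Elevation rotation first (about the Y axis) ...
    x1 = ce * x + se * z
    y1 = y
    z1 = -se * x + ce * z
    # ... then azimuth rotation (about the Z axis).
    xr = ca * x1 - sa * y1
    yr = sa * x1 + ca * y1
    zr = z1
    return (xr, yr, zr), math.sqrt(xr * xr + yr * yr + zr * zr)
```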
Secondly, the radial speed of the vessel part is determined.
The vessel is driven by a speed Vx along the axis X of its plane of reference and by rotation speeds Ωt in pitch and Ωr in roll. The pitch rotation vector is carried by the axis Y, the roll rotation vector by the axis X. The components of the speed of a pixel, whose position is denoted by the points P and Q in figures 5 (relative to the roll rotation) and 6 (relative to the pitch rotation), can be expressed as follows in the plane of reference connected to the vessel:
V = Vx·x + OP·Ωr·u + OQ·Ωt·v
where u and v denote the unit vectors of the elementary displacements induced by the roll and pitch rotations, respectively.
This speed is next expressed in the reference plane of reference using the inverse matrix of the passage matrix from the reference plane of reference to the plane of reference of the vessel.
Let M be the passage matrix from the reference plane of reference to the plane of reference of the vessel. The matrix M is calculated traditionally by using the three attitude angles of the vessel: heading, pitch and roll. The coordinates of the pixel in the reference plane of reference are given by:
( Xr )         ( X )   ( xn )
( Yr ) = M⁻¹ × ( Y ) + ( yn )
( Zr )         ( Z )   ( zn )
where (xn, yn, zn) represents the coordinates of the origin of the vessel in the reference plane of reference.
By using, in this plane of reference, the speed Vp of the pixel and the speed V0 of the radar that were previously provided, the radial speed is expressed by the equation:
vR = (Vp − V0) · uOP
where uOP is the unit vector of the line of sight from the radar to the pixel.
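A compact numerical sketch of this radial-speed computation follows, with the rigid-body rotation written as the standard cross product ω × OP (equivalent to the OP·Ωr and OQ·Ωt terms above) and M assumed orthogonal so that M⁻¹ = Mᵀ. All names are hypothetical stand-ins for the quantities defined above.

```python
import numpy as np

def radial_speed(pixel_pos_vessel, vx, pitch_rate, roll_rate,
                 M, vessel_origin_ref, radar_velocity_ref, radar_pos_ref):
    """Radial speed of one pixel of the vessel as seen by the radar.

    pixel_pos_vessel : (x, y, z) of the pixel in the vessel frame
    vx               : forward speed of the vessel along its X axis
    pitch_rate, roll_rate : rotation speeds (rad/s) about Y and X
    M : passage matrix from the reference frame to the vessel frame
    """
    p = np.asarray(pixel_pos_vessel, dtype=float)
    # Speed in the vessel frame: translation plus the rigid-body
    # rotation omega x OP, with omega = (roll_rate, pitch_rate, 0).
    omega = np.array([roll_rate, pitch_rate, 0.0])
    v_vessel = np.array([vx, 0.0, 0.0]) + np.cross(omega, p)

    # Express the pixel speed and position in the reference frame
    # (M is assumed orthogonal, so its inverse is its transpose).
    M = np.asarray(M, dtype=float)
    v_ref = M.T @ v_vessel
    p_ref = M.T @ p + np.asarray(vessel_origin_ref, dtype=float)

    # Project the relative speed onto the radar -> pixel unit vector.
    u = p_ref - np.asarray(radar_pos_ref, dtype=float)
    u = u / np.linalg.norm(u)
    return float((v_ref - np.asarray(radar_velocity_ref, dtype=float)) @ u)
```

For a stationary radar sighting the vessel head-on, the radial speed of the vessel's origin reduces to the forward speed Vx, as expected.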
The simulation step 56 is next carried out.
To that end, the signal amplitude received by the radar is simulated by randomly drawing a value from among a plurality of values.
The plurality of values includes more than 10 distinct values, preferably more than 100 distinct values.
According to the described example, the plurality of values includes 256 values, namely the integer values between 0 and 255 inclusive.
According to a first embodiment, the drawing is a uniform random drawing.
According to a second embodiment, the drawing follows a normal or Gaussian law.
According to a third embodiment, the drawing is weighted by the height of the pixels relative to sea level.
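The three drawing embodiments can be sketched as follows. The exact laws (mid-scale Gaussian, linear height weighting) are not specified in the text, so the parameters below are illustrative assumptions, as are all names.

```python
import random

def draw_amplitude(mode="uniform", height=0.0, max_height=1.0, levels=256):
    """Randomly draw a signal amplitude among `levels` integer values
    (0 .. levels-1), according to one of the three drawing schemes."""
    if mode == "uniform":
        # First embodiment: uniform random drawing.
        return random.randrange(levels)
    if mode == "gaussian":
        # Second embodiment: normal (Gaussian) law, here centered
        # mid-scale and clipped to the valid range (assumed parameters).
        value = int(random.gauss(levels / 2, levels / 6))
        return max(0, min(levels - 1, value))
    if mode == "height":
        # Third embodiment: drawing weighted by the height of the
        # pixel relative to sea level (linear weighting assumed).
        weight = max(0.0, min(1.0, height / max_height))
        return int(random.uniform(0, weight * (levels - 1)))
    raise ValueError(mode)
```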
Symbolically, as illustrated by figure 7, the implementation of all of the simulation steps 56 corresponds, for the graph of the distribution of points in a distance and radial speed plane of reference, to choosing a value for each of the depicted points. In this graph, the origin of the horizontal axis coincides with the distance of the origin of the vessel and the origin of the vertical axis corresponds to a zero radial speed.
The generating phase P3 includes a step for building the image to be generated, each point of the image to be generated being associated bijectively with a point of the geometric image and having the determined distance, the determined speed and the randomly drawn value of the point of the geometric image in question as coordinates.
An inverse synthetic aperture radar image is thus obtained, for example like that of figure 8.
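Putting the pieces together, the construction of phase P3 amounts to placing one point per vessel pixel in a distance / radial-speed plane carrying the drawn amplitude. The sketch below uses an affine binning onto a fixed grid; the bin counts, the choice of keeping the strongest amplitude per cell, and all names are illustrative assumptions.

```python
import numpy as np

def build_isar_image(points, n_range_bins=64, n_speed_bins=64):
    """Build the ISAR image from (distance, radial_speed, amplitude)
    triplets, one per vessel pixel: each point is mapped to one cell
    of a distance / radial-speed grid holding its amplitude."""
    pts = np.asarray(points, dtype=float)
    d, v, a = pts[:, 0], pts[:, 1], pts[:, 2]

    # Affine mapping of distances and speeds onto the grid indices,
    # guarding against a degenerate single-valued cloud.
    d_span = d.max() - d.min()
    v_span = v.max() - v.min()
    d_span = d_span if d_span > 0 else 1.0
    v_span = v_span if v_span > 0 else 1.0
    di = ((d - d.min()) / d_span * (n_range_bins - 1)).astype(int)
    vi = ((v - v.min()) / v_span * (n_speed_bins - 1)).astype(int)

    image = np.zeros((n_speed_bins, n_range_bins))
    # Keep the strongest amplitude when several points share a cell.
    np.maximum.at(image, (vi, di), a)
    return image
```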
The general principle of the method described above is as follows: displaying, via the graphic engine 12, the vessel N illuminated by the radar, while respecting its relative position and aspect angles; for each pixel of the image of the vessel N, calculating its position and its distance from the radar; knowing the pitch and roll rotation speeds of the vessel and the position of the pixel, calculating its speed, then its radial speed; randomly drawing a value representing the level (coded from 0 to 255, for example) of the signal returned by this pixel, and knowing the distance and the radial speed, inserting the point into the inverse synthetic aperture radar image.
In summary, the method makes it possible, from the geometric image of a vessel N displayed via a graphic engine 12 according to the aspect angle of the vessel N and the illumination angle of the radar, to obtain the distance of the points of the vessel illuminated by the radar, and therefore their position.
The method thus makes it possible to simplify the implementation of the generation of inverse synthetic aperture radar images.
The method makes it possible to avoid the use of the radar wave propagation equation.
The method uses only the geometry of the vessel contained in a file in the market standard format. Reference is no longer made to the materials or to the reflection coefficients of the superstructures of the vessel.
The method is implemented using the capacities of a graphic engine 12. Indeed, the calculations related to the display are done entirely by the graphic engine 12. One additional advantage is that these calculations are offloaded onto the graphic engine 12. This makes it possible to avoid using a dedicated processor.
Thus, the method for generating an inverse synthetic aperture radar image is easier to implement while making it possible to retain at least the precision of the generating methods based on a breakdown of the object into elementary facets.
Other alternatives corresponding to combinations of the embodiments previously described may also be considered.
In particular, it is possible to implement the generating method with a system and a computer program product. The interaction of the computer program product with the system makes it possible to carry out this method.
The system is a computer.
More generally, the system is an electronic computer able to manipulate and/or transform data represented as electronic or physical quantities in registers of the system and/or memories into other similar data corresponding to physical data in the memories, registers or other types of display, transmission or storage devices.
The system includes a processor comprising a data processing unit, memories and an information medium reader. The system also comprises a keyboard and a display unit.
The computer program product includes a readable information medium.
A readable information medium is a medium readable by the system, usually by the data processing unit. The readable information medium is a medium suitable for storing electronic instructions and able to be coupled with a bus of a computer system.
As an example, the readable information medium is a floppy disk, an optical disc, a CD-ROM, a magnetic-optical disc, a ROM memory, a RAM memory, an EPROM memory, an EEPROM memory, a magnetic card or an optical card.
A computer program comprising program instructions is stored on the readable information medium.
The computer program can be loaded on the data processing unit and is suitable for driving the implementation of a method as previously described when the computer program is implemented on the data processing unit.
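As a hedged illustration of converting a stored depth into a distance (the depth-to-coordinate conversion performed from the graphic engine's depth buffer), the following sketch assumes a standard OpenGL perspective depth encoding in [0, 1]; the function and parameter names are illustrative, not from the patent.

```python
def depth_to_distance(d_buf, near, far):
    """Convert a depth-buffer value d_buf in [0, 1] back into a metric
    eye-space distance, assuming a perspective projection with the
    given near and far clip plane distances (in metres)."""
    z_ndc = 2.0 * d_buf - 1.0                      # [0, 1] -> [-1, 1]
    return (2.0 * near * far) / (far + near - z_ndc * (far - near))
```

With near = 1 m and far = 1000 m, a buffer value of 0.0 maps back to 1 m and a value of 1.0 maps back to 1000 m, recovering the distance of each displayed point from the radar's viewpoint.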

Claims (10)

1. - A method for generating an inverse synthetic aperture radar image of a target (N) seen by a radar, the target (N) having several parts, at least two parts having a distinct radial speed relative to the radar, the method including at least the steps of:
- providing the desired relative position of the target (N) and the radar for the image to be generated,
- providing a geometric image including the target (N) seen from the radar, the target (N) and the radar being in the desired relative position,
- for each point of the geometric image corresponding to part of the target (N), determining the distance between the part of the target (N) in question and the radar, and the radial speed of that part of the target (N),
- for each point of the geometric image corresponding to part of the target (N), simulating the signal amplitude received by the radar by randomly drawing a value from among a plurality of values,
- building the image to be generated, each point of the image to be generated being associated bijectively with a point of the geometric image and having, as coordinates, the determined distance, the determined radial speed and the randomly drawn value of the point of the geometric image in question.
2. - The method according to claim 1, wherein the target (N) is a vessel.
3. - The method according to claim 1 or 2, wherein the method includes a step for providing the rotational pitch speed of the target (N) and the rotational roll speed of the target (N) and wherein, in the determining step, the speed of each point of the geometric image corresponding to part of the target (N) is determined from the rotational pitch speed of the target (N), the rotational roll speed of the target (N) and the coordinates of the point in the geometric image.
4. - The method according to any one of claims 1 to 3, wherein the method includes the detection of the coordinates of each point of the geometric image corresponding to part of the target (N).
5. - The method according to claim 4, wherein the method includes displaying the geometric image via a graphic engine (12) including a depth buffer memory (16) and the detection includes comparing values from the depth buffer memory (16) before display and after display.
6. - The method according to claim 5, wherein the detection also includes converting the depth into coordinates.
7. - The method according to any one of claims 1 to 6, wherein the geometric image also includes the environment (E) of the target (N) and the method includes a step for eliminating points corresponding to the environment (E).
8. - The method according to claim 7, wherein the method includes displaying the geometric image via a graphic engine (12) including a color buffer memory (18) and the elimination step includes comparing values from the color buffer memory (18) before display and after display.
9. - The method according to any one of claims 1 to 8, wherein the method includes providing the viewing direction of the radar, the aspect angle of the target (N) and the distance between the radar and the target (N).
10.- A computer program product suitable for implementing a method for generating an inverse synthetic aperture radar image according to any one of claims 1 to 9.
GB1713833.0A 2016-08-31 2017-08-29 Method for generating an inverse synthetic aperture radar image Active GB2554179B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FR1601284A FR3055422B1 (en) 2016-08-31 2016-08-31 PROCESS FOR GENERATING A RADAR IMAGE WITH REVERSE OPENING SYNTHESIS

Publications (3)

Publication Number Publication Date
GB201713833D0 GB201713833D0 (en) 2017-10-11
GB2554179A true GB2554179A (en) 2018-03-28
GB2554179B GB2554179B (en) 2022-02-23

Family

ID=57860911

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1713833.0A Active GB2554179B (en) 2016-08-31 2017-08-29 Method for generating an inverse synthetic aperture radar image

Country Status (3)

Country Link
DE (1) DE102017118992A1 (en)
FR (1) FR3055422B1 (en)
GB (1) GB2554179B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108594230A (en) * 2018-07-17 2018-09-28 电子科技大学 A kind of diameter radar image emulation mode of seagoing vessel scene

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113671501B (en) * 2021-08-12 2023-08-18 广电计量检测集团股份有限公司 Direction simulation correction method and device based on ISAR imaging

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007133085A1 (en) * 2006-05-15 2007-11-22 Telefonaktiebolaget Lm Ericsson (Publ) A method and system for automatic classification of objects
JP2008026106A (en) * 2006-07-20 2008-02-07 Mitsubishi Electric Corp Inverse synthetic aperture radar apparatus
US20100052977A1 (en) * 2008-06-26 2010-03-04 Raytheon Company Inverse Synthetic Aperture Radar Image Processing
CN104407336A (en) * 2014-10-27 2015-03-11 中国电子科技集团公司第二十九研究所 Orientation-sensitive object electromagnetic echo simulation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105116408A (en) * 2015-06-30 2015-12-02 电子科技大学 Ship ISAR image structure feature extraction method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
8th European Conference on Synthetic Aperture Radar (EUSAR), 2010, A. O. Knapskog, 'Classification of Ships in TerraSAR-X Images Based on 3D Models and Silhouette Matching', pages 234-237 *
EUSAR 2014; Proceedings of 10th European Conference on Synthetic Aperture Radar, Knapskog et al., 'Target Recognition of Ships in Harbour Based on Simulated SAR Images Produced with MOCEM Software', pages 608-611 *


Also Published As

Publication number Publication date
GB201713833D0 (en) 2017-10-11
FR3055422B1 (en) 2020-10-30
GB2554179B (en) 2022-02-23
FR3055422A1 (en) 2018-03-02
DE102017118992A1 (en) 2018-03-01
