GB2511908A - Image processing - Google Patents

Image processing

Info

Publication number
GB2511908A
Authority
GB
United Kingdom
Prior art keywords
image
terrain
camera
pixels
registered
Prior art date
Legal status
Granted
Application number
GB1400185.3A
Other versions
GB201400185D0 (en)
GB2511908B (en)
Inventor
Mark Eccles
Christopher Charles Rawlinson Jones
Current Assignee
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date
Filing date
Publication date
Application filed by BAE Systems PLC filed Critical BAE Systems PLC
Publication of GB201400185D0
Publication of GB2511908A
Application granted
Publication of GB2511908B
Status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/885Radar or analogous systems specially adapted for specific applications for ground probing
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V3/00Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation
    • G01V3/15Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation specially adapted for use during transport, e.g. by a person, vehicle or boat
    • G01V3/16Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation specially adapted for use during transport, e.g. by a person, vehicle or boat specially adapted for use from aircraft
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073Surveillance aids
    • G08G5/0086Surveillance aids for monitoring terrain

Abstract

Method and apparatus for capturing and processing images of terrain 6, comprising: using each of a plurality of cameras (14, 16, 18, 20) to capture an image of a surface of the terrain, thereby producing a plurality of camera images. Each camera image comprises a plurality of pixels, and the images are registered to create a plurality of sets of registered pixels, which are then classified. A ground penetrating radar (8) captures a radar image (24) of the terrain, and a detection algorithm is performed on the radar image (24) to detect an at least partially subterranean object or a terrain feature (26) in the radar image. The detected object or terrain feature (26) is associated with at least one classified set of registered pixels (28, 30), preferably by projecting at least part of the object or terrain feature and at least one camera image onto a common plane. Classification of the pixel sets may be achieved by determining a spectral signature and comparing it to a spectral signature stored in a database. Preferably, the frequency range of each camera overlaps at least partially with that of another camera. The cameras may respectively detect the following frequency ranges: ultraviolet, visible light, and infrared.

Description

IMAGE PROCESSING
FIELD OF THE INVENTION
The present invention relates to the capturing and processing of images.
BACKGROUND
Typically, multi-spectral imaging comprises capturing 2-dimensional image data of a scene across multiple distinct frequency ranges within the electromagnetic spectrum. These images may then be registered. This process provides information about the scene that would not be provided were image data of the scene across only a single frequency range measured.
A spectral signature of an object is the specific combination of electromagnetic (EM) radiation across a range of frequencies that is reflected and absorbed by that object. A spectral signature can be used to identify a type of object. For example, it is known to process a 2-dimensional image to extract a spectral signature for each pixel. These signatures are used to divide the image into groups of similar pixels (referred to as segments). A class is then assigned to each segment. By matching a measured spectral signature of an unknown object to a stored spectral signature that has been assigned a class, that unknown object may be classified.
SUMMARY OF THE INVENTION
In a first aspect, the present invention provides a method of capturing and processing images, the images being of terrain, the method comprising: using each of a plurality of cameras, capturing an image of a surface of the terrain thereby producing a plurality of camera images, each camera image comprising a plurality of pixels; registering the camera images, thereby producing a plurality of sets of registered pixels; classifying each set of registered pixels; using a ground penetrating radar, capturing a radar image of the terrain; performing a detection algorithm on the radar image to detect an at least partially subterranean (i.e. at least partially buried or underground) object or terrain feature in the radar image; and associating the detected object or terrain feature with at least one classified set of registered pixels. In this way, the detected object or terrain feature may be classified.
Each of the plurality of cameras may be for detecting electromagnetic radiation in a different frequency range to each of the other cameras in the plurality.
The frequency range in which each camera detects electromagnetic radiation may overlap at least partially with the frequency range in which a different camera detects electromagnetic radiation.
The plurality of cameras may comprise a first camera for detecting electromagnetic radiation in the ultraviolet range of frequencies, a second camera for detecting electromagnetic radiation in the visible light range of frequencies, and a third camera for detecting electromagnetic radiation in the infrared range of frequencies.
The step of classifying each set of registered pixels may comprise, for each set of registered pixels, determining a spectral signature using the image data from each of the plurality of cameras, and classifying a set of registered pixels depending on its spectral signature.
The step of classifying a set of registered pixels depending on its spectral signature may comprise comparing the spectral signature of the set of registered pixels to a spectral signature stored in a database.
The method may further comprise generating the database, wherein the step of generating the database may comprise, using each of a further plurality of cameras, capturing an image of a further area of terrain thereby producing a further plurality of camera images, each camera image in the further plurality comprising a plurality of pixels, registering the camera images in the further plurality, thereby producing a further plurality of sets of registered pixels, for each set of registered pixels in the further plurality, determining a spectral signature using the image data from each of the further plurality of cameras, assigning a class to each of determined spectral signatures, and forming the database from the determined spectral signatures and corresponding assigned classes.
The spectral signature of a set of registered pixels may span at least part of the following frequency ranges: the ultraviolet range of frequencies, the visible light range of frequencies, and the infrared range of frequencies.
The step of associating the detected object or terrain feature with at least one classified set of registered pixels may comprise projecting at least part of that object or terrain feature and at least one camera image onto a common plane. For example, the detected object or terrain feature may be projected onto a camera image (e.g. in a direction perpendicular to the plane of the camera image, e.g. vertically). For example, the detected object or terrain feature may be projected onto the registered camera images.
The step of associating the detected object or terrain feature with at least one classified set of registered pixels may comprise registering at least part of the radar image with one or more of the camera images. For example, a part of the radar image that corresponds to the surface of the ground may be registered with a camera image (which is an image of the surface of the ground). The detected object or terrain feature within the radar image may then be associated with parts of a camera image directly above it.
The method may further comprise performing an identification process to identify the detected object or terrain feature using the at least one classified set of registered pixels associated with the detected object or terrain feature.
The plurality of cameras and the ground penetrating radar may be mounted on an aircraft.
In a further aspect, the present invention provides apparatus for capturing and processing images, the images being of terrain, the apparatus comprising: a plurality of cameras, each of the plurality of cameras being for capturing an image of a surface of the terrain thereby producing a plurality of camera images, each camera image comprising a plurality of pixels; a ground penetrating radar for capturing a radar image of the area of terrain; and one or more processors arranged to: register the camera images, thereby producing a plurality of sets of registered pixels; classify each set of registered pixels; perform a detection algorithm on the radar image to detect an at least partially subterranean object or terrain feature in the radar image; and associate the detected object or terrain feature with at least one classified set of registered pixels.
In a further aspect, the present invention provides an aircraft comprising apparatus in accordance with the preceding aspect.
In a further aspect, the present invention provides a program or plurality of programs arranged such that when executed by a computer system or one or more processors it/they cause the computer system or the one or more processors to operate in accordance with the method of any of the above aspects.
In a further aspect, the present invention provides a machine readable storage medium storing a program or at least one of the plurality of programs according to the preceding aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a schematic illustration (not to scale) of a scenario in which an aircraft is used to implement an embodiment of a method of capturing and processing images;
Figure 2 is a schematic illustration (not to scale) of the aircraft used in this scenario to implement an embodiment of a method of capturing and processing images;
Figure 3 is a schematic illustration (not to scale) of the set of imaging sensors on the aircraft;
Figure 4 is a process flow-chart showing certain steps of an embodiment of a method of capturing and processing images implemented by the aircraft;
Figure 5 is a schematic illustration (not to scale) of an example of a spectral signature for a set of registered pixels;
Figure 6 is a process flow-chart showing certain steps of an embodiment of a method of using a database generated using the process of Figure 4 to survey an area of terrain;
Figure 7 is a schematic illustration (not to scale) of an example of an image generated by a Ground Penetrating Radar; and
Figure 8 is a schematic illustration (not to scale) of the Ground Penetrating Radar image together with the associated classification information from the imaging sensor images.
DETAILED DESCRIPTION
Figure 1 is a schematic illustration (not to scale) of a scenario in which an aircraft 2 is used to implement an embodiment of a method of capturing and processing images.
In this embodiment, the aircraft 2 is an unmanned aircraft.
In this scenario, the aircraft 2 flies over an area of terrain 4.
As the aircraft 2 flies over the area of terrain 4, the aircraft captures images of a portion of the area of terrain 4, as described in more detail later below with reference to Figures 4 and 6.
A portion of the area of terrain that the aircraft 2 captures images of is indicated in Figure 1 by the reference numeral 6.
Figure 2 is a schematic illustration (not to scale) of the aircraft 2 used in this scenario to implement an embodiment of a method of capturing and processing images.
In this embodiment, the aircraft 2 comprises a ground penetrating radar (GPR) 8, and a set of imaging sensors (indicated in Figure 2 by a single box and the reference numeral 10), and a processor 12.
In this embodiment, the GPR 8 is arranged to capture an image, hereinafter referred to as the GPR image, of the portion 6 of the area of terrain 4 as the aircraft 2 flies above the area of terrain 4. The GPR 8 is capable of detecting objects on or buried near to the surface of the ground.
In this embodiment, the GPR 8 emits radio waves towards the portion 6 of the area of terrain 4. The GPR 8 detects these radio waves after they have been reflected from the portion 6 and determines a range using these measurements.
In this embodiment, the GPR 8 measures the range (i.e. the distance) between detected objects/terrain features and the GPR 8. Thus, the GPR 8, in effect, produces a "range image" of the portion 6 of the area of terrain 4.
In this embodiment, the GPR 8 is connected to the processor 12 such that images captured by the GPR 8 are sent to the processor 12, as described in more detail later below with reference to Figure 6.
The imaging sensors 10 are described in more detail later below with reference to Figure 3.
In this embodiment, each of the imaging sensors 10 is arranged to capture an image of the portion 6 of the area of terrain 4 as the aircraft 2 flies above the area of terrain 4.
In this embodiment, each of the imaging sensors 10 measures an intensity of electromagnetic radiation reflected from objects/terrain features in the area of terrain 4. Thus, the imaging sensors 10, in effect, produce 2-dimensional image data.
In this embodiment, each of the image sensors 10 is a camera, i.e. a sensor that is used to detect electromagnetic radiation (originating from a remote source, e.g. the Sun) reflected by the portion 6 of the area of terrain 4.
In this embodiment, each of the imaging sensors 10 is connected to the processor 12 such that images captured by the imaging sensors 10 are sent to the processor 12, as described in more detail later below with reference to Figures 4 and 6.
In this embodiment, the processor 12 is connected to the GPR 8 and each of the imaging sensors 10. The processor 12 processes images received from the GPR 8 and the imaging sensors 10 as described in more detail later below with reference to Figures 4 and 6.
Figure 3 is a schematic illustration (not to scale) of the set of imaging sensors 10.
In this embodiment, the set of imaging sensors 10 comprises an ultra-violet (UV) camera 14, a hyperspectral visible-light detecting camera (hereinafter referred to as the "visible camera 16"), a short-wave infrared (SWIR) camera 18, and a long-wave infrared (LWIR) camera 20.
In this embodiment, the UV camera 14 is arranged to capture an image, hereinafter referred to as the UV image, of the portion 6 of the area of terrain 4 as the aircraft 2 flies above the area of terrain 4.
In this embodiment, the UV camera 14 detects electromagnetic radiation within the UV range of the electromagnetic spectrum.
In this embodiment, the visible camera 16 is arranged to capture an image, hereinafter referred to as the visible image, of the portion 6 of the area of terrain 4 as the aircraft 2 flies above the area of terrain 4.
In this embodiment, the visible camera 16 detects electromagnetic radiation within the visible range of the electromagnetic spectrum.
In this embodiment, the SWIR camera 18 is arranged to capture an image, hereinafter referred to as the SWIR image, of the portion 6 of the area of terrain 4 as the aircraft 2 flies above the area of terrain 4.
In this embodiment, the SWIR camera 18 detects electromagnetic radiation within the short-wave infrared range of the electromagnetic spectrum.
In this embodiment, the LWIR camera 20 is arranged to capture an image, hereinafter referred to as the LWIR image, of the portion 6 of the area of terrain 4 as the aircraft 2 flies above the area of terrain 4.
In this embodiment, the LWIR camera 20 detects electromagnetic radiation within the long-wave infrared range of the electromagnetic spectrum.
In this embodiment, the range of frequencies detected by the UV camera 14 overlaps to some extent the range of frequencies detected by the visible camera 16. In this embodiment, the range of frequencies detected by the UV camera 14 does not overlap to any extent the range of frequencies detected by the SWIR camera 18 or the LWIR camera 20.
In this embodiment, the range of frequencies detected by the visible camera 16 overlaps to some extent each of the ranges of frequencies detected by the UV camera 14 and the SWIR camera 18. In this embodiment, the range of frequencies detected by the visible camera 16 does not overlap to any extent the range of frequencies detected by the LWIR camera 20.
In this embodiment, the range of frequencies detected by the SWIR camera 18 overlaps to some extent each of the ranges of frequencies detected by the visible camera 16 and the LWIR camera 20. In this embodiment, the range of frequencies detected by the SWIR camera 18 does not overlap to any extent the range of frequencies detected by the UV camera 14.
In this embodiment, the range of frequencies detected by the LWIR camera 20 overlaps to some extent the range of frequencies detected by the SWIR camera 18. In this embodiment, the range of frequencies detected by the LWIR camera 20 does not overlap to any extent the range of frequencies detected by the visible camera 16 or the UV camera 14.
Figure 4 is a process flow-chart showing certain steps of an embodiment of a method of capturing and processing images implemented by the aircraft 2.
At step s2, as the aircraft 2 flies over the area of terrain 4, each of the imaging sensors 10 is used to capture an image of the portion 6 of the area of terrain 4.
In other words, as the aircraft 2 flies over the area of terrain 4, a UV image of the portion 6 is captured using the UV camera 14, a visible image of the portion 6 is captured using the visible camera 16, a SWIR image of the portion 6 is captured using the SWIR camera 18, and a LWIR image of the portion 6 is captured using the LWIR camera 20.
At step s4, the images captured at step s2 are sent from the imaging sensors 10 to the processor 12.
At step s6, the processor 12 registers the received images, thereby producing a registered set of images.
In this embodiment, a conventional image registration technique is used to register the images, e.g. a feature-based registration process.
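The description deliberately leaves the registration technique open ("e.g. a feature-based registration process"). As a non-authoritative illustration of one simple alternative, a pure translation between two band images can be estimated by phase correlation; the function name, synthetic images, and shift values below are all invented for the sketch:

```python
import numpy as np

def phase_correlation_shift(reference, moving):
    """Estimate the integer (row, col) translation aligning `moving`
    with `reference`, from the phase-correlation peak."""
    F_ref = np.fft.fft2(reference)
    F_mov = np.fft.fft2(moving)
    cross_power = F_ref * np.conj(F_mov)
    cross_power /= np.abs(cross_power) + 1e-12  # keep phase only
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Peaks past the midpoint correspond to negative shifts (FFT wrap-around).
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, correlation.shape))

# Synthetic example: one band image, and the same scene shifted by (3, -5).
rng = np.random.default_rng(0)
band_a = rng.random((64, 64))
band_a[20:30, 20:30] += 5.0          # a bright terrain feature
band_b = np.roll(band_a, shift=(3, -5), axis=(0, 1))

print(phase_correlation_shift(band_b, band_a))  # -> (3, -5)
```

A feature-based process, as the patent suggests, would additionally handle rotation and scale; this sketch only recovers a translation.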
The overlapping of the frequency ranges detected by the image sensors (e.g. the overlapping of the range of frequencies detected by the UV camera 14 with the range of frequencies detected by the visible camera 16, and so on) tends to facilitate the registration process.
In this embodiment, the registration process provides that each pixel in a portion of an image is registered with, or associated with, a pixel in each of the other images. For example, a pixel in the UV image is registered to a pixel in each of the visible image, the SWIR image, and the LWIR image.
In other words, in this embodiment a plurality of sets of pixels is produced. Each set of pixels comprises a pixel from each of the UV, visible, SWIR and LWIR images, and those pixels are registered, or associated, together. Such a set of pixels is hereinafter referred to as a "set of registered pixels". Each pixel in a set of registered pixels corresponds to the same point in the portion 6 of the area of terrain 4.
At step s8, for each set of registered pixels, a spectral signature is produced.
In this embodiment, a spectral signature of a set of registered pixels comprises values of the amplitude of electromagnetic radiation (as measured for the point corresponding to that registered set of pixels by the image sensors 10) for a range of frequencies. In this embodiment, this range of frequencies encompasses each of the respective ranges of frequencies detected by the UV camera 14, the visible camera 16, the SWIR camera 18, and the LWIR camera 20.
Figure 5 is a schematic illustration (not to scale) of an example of a spectral signature 22 for a set of registered pixels.
As shown in Figure 5, a spectral signature 22 of a set of registered pixels comprises the amplitude of the electromagnetic radiation measured across a range of frequencies at the point corresponding to that set of registered pixels.
The range of frequencies across which the amplitude of the electromagnetic radiation is measured comprises the range of frequencies detected by the UV camera 14, the range of frequencies detected by the visible camera 16, the range of frequencies detected by the SWIR camera 18, and the range of frequencies detected by the LWIR camera 20.
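Assuming the four band images have been registered onto a common pixel grid, the spectral signature described above can be sketched as the vector of per-band amplitudes at one pixel location. The array names and amplitude values below are illustrative, not taken from the patent:

```python
import numpy as np

# Four registered single-band images of the same 4 x 4 patch of terrain,
# stacked so axis 2 runs UV -> visible -> SWIR -> LWIR (values invented).
uv, vis, swir, lwir = (np.full((4, 4), v, dtype=float)
                       for v in (0.2, 0.8, 0.5, 0.3))
cube = np.stack([uv, vis, swir, lwir], axis=2)  # shape (rows, cols, bands)

def spectral_signature(cube, row, col):
    """Return the per-band amplitude vector for one set of registered pixels."""
    return cube[row, col, :]

sig = spectral_signature(cube, 1, 2)
print(sig)  # -> [0.2 0.8 0.5 0.3]
```

Each such vector plays the role of the signature 22 of Figure 5: amplitude as a function of frequency band across the UV, visible, SWIR and LWIR ranges.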
Returning to the process of Figure 4, at step s10, one of a plurality of different classes is assigned to each set of registered pixels.
In this embodiment, the class to which a set of registered pixels is assigned depends on the terrain feature, or type of terrain, present at the point in the portion 6 of the area of terrain 4 to which that set of registered pixels corresponds.
For example, if the point in the portion 6 of the area of terrain 4 was covered in grass (e.g. the point was in a field), the corresponding set of registered pixels would be classified as "grass". If there was a building at the point in the portion 6 of the area of terrain 4, the corresponding set of registered pixels would be classified as "building".
Examples of other types of class, i.e. other types of terrain, or terrain features, which may be present at a point in the portion 6, are: rocks, sand, forest, water, and roads.
The use of a combination of UV, visible, SWIR and LWIR cameras tends to provide that a greater number of different types of terrain, or terrain features, may be distinguished from one another. Thus, a greater number of unique classes may be used to classify a set of registered pixels.
For example, an area of ground that has been recently disturbed tends to have a different spectral signature in the infrared range of frequencies to the spectral signature it would have if it had not been recently disturbed. Thus, it tends to be possible to classify recently disturbed ground differently to areas of ground that have not recently been disturbed. Such a differentiation may not be possible if a narrower range of frequencies (e.g. only the visible range) were used to provide the spectral signatures used for classification.
In this embodiment, the set of classes used for classification comprises the classes "undisturbed ground", "recently disturbed ground", and "object above surface of ground".
In this example, assignment of a classification to each set of registered pixels is performed manually, i.e. by a human operator.
Thus, after step s10, for each set of registered pixels, a spectral signature is determined and a class is specified. Thus, each determined signature corresponds to a certain class.
At step s12, a database, or "library", is formed. In this embodiment, this database comprises each of the spectral signatures determined at step s8 and the class to which that spectral signature corresponds.
In effect, this database provides a "look-up" table whereby a spectral signature measured at a point on the ground can be looked-up, and a corresponding classification for that point can be returned.
Thus, a method of capturing and processing images is provided. This method produces a database comprising a plurality of classes with one or more spectral signatures matched to each of those classes.
Figure 6 is a process flow-chart showing certain steps of an embodiment of a method of capturing and processing images in which the database generated using the process of Figure 4 is used to survey a further portion of the area of terrain 4.
At step s14, as the aircraft 2 flies over the area of terrain 4, each of the imaging sensors 10 is used to capture an image of a further portion of the area of terrain 4.
In this embodiment, the further portion of the area of terrain 4 is different to the portion 6 of the area of terrain 4. However, in other embodiments the portion 6 and the further portion are the same portion of the area of terrain 4.
In other words, as the aircraft 2 flies over the area of terrain 4, a UV image of the further portion is captured using the UV camera 14, a visible image of the further portion is captured using the visible camera 16, a SWIR image of the further portion is captured using the SWIR camera 18, and a LWIR image of the further portion is captured using the LWIR camera 20.
At step s16, the images captured at step s14 are sent from the imaging sensors 10 to the processor 12.
At step s18, the processor 12 registers the images received from the imaging sensors 10, thereby producing a further registered set of images.
In this embodiment, the registration process used at step s18 is the same as that used at step s6 and described in more detail above with reference to Figure 4.
Similarly to step s6 above, in this embodiment further sets of pixels are produced. Each further set of pixels comprises a pixel from each of the UV, visible, SWIR and LWIR images of the further portion, and those pixels are registered together. Such a further set of pixels is hereinafter referred to as a "further set of registered pixels". Each pixel in a further set corresponds to the same point in the further portion of the area of terrain 4.
At step s20, for each further set of registered pixels, a spectral signature 22 is determined.
In this embodiment, a spectral signature 22 of a further set of registered pixels is determined in the same way as a spectral signature 22 of a set of registered pixels is determined (i.e. as performed at step s8 and described in more detail above with reference to Figure 4).
At step s22, each further set of registered pixels is classified as one of the classes.
In this embodiment, each further set of registered pixels is classified using the database generated by performing the process of Figure 4.
In this embodiment, a further set of registered pixels is classified as follows.
Firstly, a spectral signature in the database that is the closest to, or is substantially identical to, the spectral signature of the further set of registered pixels being classified is identified. In this embodiment, this is performed using a conventional statistical analysis process to compare the measured spectral signatures of the further sets of registered pixels to the stored spectral signatures in the database.
Secondly, the further set of registered pixels is classified as the class that corresponds to the identified spectral signature in the database.
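A minimal sketch of this closest-match classification, substituting a plain Euclidean nearest-neighbour search for whatever "conventional statistical analysis process" is intended: the stored signature values are invented, while the class names come from the description above.

```python
import numpy as np

# Library built at step s12: stored spectral signatures and their classes.
# Each signature spans the UV, visible, SWIR and LWIR bands (values invented).
library = [
    (np.array([0.1, 0.6, 0.4, 0.3]), "undisturbed ground"),
    (np.array([0.1, 0.5, 0.7, 0.6]), "recently disturbed ground"),
    (np.array([0.3, 0.9, 0.2, 0.1]), "object above surface of ground"),
]

def classify(measured):
    """Return the class whose stored signature is closest to `measured`."""
    distances = [np.linalg.norm(measured - sig) for sig, _ in library]
    return library[int(np.argmin(distances))][1]

print(classify(np.array([0.12, 0.52, 0.68, 0.58])))
# -> recently disturbed ground
```

In practice the comparison might instead use a statistical distance (e.g. one that weights bands by their variance), but the look-up structure is the same: measured signature in, corresponding class out.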
At step s24, as the aircraft 2 flies over the area of terrain 4, the GPR 8 is used to capture an image of the further portion of the area of terrain 4.
In other words, a GPR image of the further portion is captured.
At step s26, the GPR image captured at step s24 is sent from the GPR 8 to the processor 12.
At step s28, the processor 12 processes the GPR image to detect features of interest within the GPR image.
In this embodiment, the features of interest within the GPR image are detected using a conventional detection process. Any appropriate detection process may be used.
In this embodiment, the GPR image is, in effect, a three-dimensional image generated from radar reflections from the surface of the ground within the area of terrain and from a volume beneath the surface of the ground within the area of terrain.
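As a hedged illustration of a detection step operating on such a three-dimensional volume, one could take a horizontal slice at a chosen depth and threshold the reflection amplitudes. This crude stand-in is not the patent's detection algorithm; the volume contents, depth index, and threshold are all invented:

```python
import numpy as np

# Illustrative 3-D GPR volume: axes (row, col, depth), values = amplitude.
volume = np.zeros((8, 8, 5))
volume[2, 1:7, 3] = 9.0   # a buried wire-like feature at depth index 3

def detect_at_depth(volume, depth_index, threshold):
    """Return (row, col) coordinates in one depth slice whose reflection
    amplitude exceeds the threshold -- a stand-in for a detection algorithm."""
    slice_2d = volume[:, :, depth_index]
    rows, cols = np.nonzero(slice_2d > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

print(detect_at_depth(volume, 3, 5.0))
# -> [(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6)]
```

The thresholded slice corresponds to the plan view of Figure 7: an intersection of the GPR image with a plane parallel to the ground surface at a certain depth.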
Figure 7 is a schematic illustration (not to scale) of an example of at least part of a GPR image 24. In particular, in this embodiment, the GPR image 24 is an image of the further area of terrain at a certain depth below the surface of the ground, when viewed from vertically above the surface of the ground. In other words, Figure 7 shows an intersection between the GPR image and a plane that is parallel to the surface of the ground at the certain depth beneath the surface of the ground.
The GPR image 24 comprises two image features, hereinafter referred to as the "GPR image features 26", which are detected at step s28.
In this embodiment, the GPR image features 26 are the images of wires that are buried beneath the surface of the ground (at the certain depth below the surface of the ground) within the further portion of the area of terrain 4.
Returning to the process-flowchart of Figure 6, at step s30, the processor 12 registers, or associates, the GPR image 24 with the images from the imaging sensors 10.
The registering, or associating together, of the images from the GPR 8 and the imaging sensors 10 may be performed using any appropriate process.
For example, some or all of the GPR image 24 may be projected onto one or more of the images captured by the imaging sensors 10. Some or all of the GPR image 24 may be projected onto the 2-dimensional plane of the images captured by the imaging sensors 10 (i.e. the surface of the ground). In some embodiments, only the detected GPR image features 26 are projected onto the plane of the image sensor images. In some embodiments, some or all of the GPR image 24 (for example, each of the detected GPR image features 26) is projected onto an image sensor image along a line that is oriented in the direction in which the imaging sensors and/or GPR were facing when the images were captured. For example, a GPR image feature 26 may be projected onto the parts of the image sensor images that are directly (i.e. vertically) above that GPR image feature 26.
Alternatively, some or all of the GPR image 24 and some or all of the images captured by the imaging sensors 10 may be projected onto a common 2-dimensional plane that is different from the plane of the images captured by the imaging sensors 10.
Alternatively, the parts of the GPR image 24 that correspond to the surface of the ground (i.e. the parts of the GPR image 24 that result from radar reflections from the surface of the ground within the further area of terrain) may be registered with the images captured by the imaging sensors 10. Each detected GPR image feature 26 may then be associated with an area of the imaging sensor images that is above that GPR image feature 26, e.g. in the direction in which the GPR 8 was facing when the GPR image was captured.
Thus, GPR image features 26 are associated with one or more further sets of registered pixels of the image sensor images.
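The vertical-projection option described above can be sketched as dropping the depth coordinate of each detected subsurface feature point and converting the remaining ground-plane coordinates (in metres) into pixel coordinates of the registered imaging sensor images. The ground-sampling scale and image origin used here are illustrative assumptions, not values from the embodiment:

```python
def project_vertically(feature_points_m, origin_m, metres_per_pixel):
    """Project 3-D GPR feature points (x, y, depth) straight up onto the
    2-D image plane, returning integer (col, row) pixel coordinates."""
    pixels = []
    for x, y, _depth in feature_points_m:
        col = int(round((x - origin_m[0]) / metres_per_pixel))
        row = int(round((y - origin_m[1]) / metres_per_pixel))
        pixels.append((col, row))
    return pixels

# Two samples along a hypothetical buried wire, 0.3 m deep:
wire_points = [(12.0, 5.0, 0.3), (12.5, 5.0, 0.3)]
print(project_vertically(wire_points, origin_m=(10.0, 4.0), metres_per_pixel=0.5))
# -> [(4, 2), (5, 2)]
```

Projection along a non-vertical sensor boresight would use the sensor orientation instead of simply discarding depth, but the bookkeeping is the same.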
At step s32, after associating together the GPR image 24 and the imaging sensor images, the further sets of registered pixels of the imaging sensor images associated with (e.g. positioned at or proximate to) the GPR image features 26 are identified. In some embodiments, after the GPR image 24 has been projected onto the 2-dimensional plane of the imaging sensor images, the parts of the imaging sensor images that are at or proximate to the locations of the GPR image features 26 are identified.
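A minimal sketch of this identification step, assuming the projected feature locations and the classified pixel regions are both expressed in the same image coordinates, and taking "proximate to" to mean within a fixed pixel radius (a threshold the patent leaves open); the region names and coordinates are hypothetical:

```python
def sets_near_features(feature_pixels, region_centres, radius):
    """Return the names of classified pixel regions whose centre lies
    within `radius` pixels of any projected GPR image feature."""
    def close(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= radius ** 2
    return sorted(
        name for name, centre in region_centres.items()
        if any(close(centre, f) for f in feature_pixels)
    )

regions = {"region_28": (4, 3), "region_30": (5, 2), "region_32": (40, 40)}
print(sets_near_features([(4, 2), (5, 2)], regions, radius=2))
# -> ['region_28', 'region_30']
```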
At step s34, the classifications of the further sets of registered pixels identified at step s32 are associated with the points of the GPR image 24 that correspond to those further sets of registered pixels.
In this embodiment, this association of the classes to points in the GPR image 24 is performed to provide further information about the GPR image features 26 that are identified at step s28.
Figure 8 is a schematic illustration (not to scale) of the GPR image 24 together with the associated classification information from the imaging sensor images.
The classification information shown in Figure 8 shows a region classified as "object above surface of ground" (indicated in Figure 8 by dotted lines and the reference numeral 28), two regions classified as "recently disturbed ground" (indicated in Figure 8 by dotted lines and the reference numeral 30), and a region classified as "undisturbed ground" (indicated in Figure 8 by dotted lines and the reference numeral 32).
At step s36, the GPR image 24 together with the associated classification information is provided to a user, or operator, for analysis.
In this embodiment, the user analyses the GPR image 24 together with the associated classification information to detect and identify objects of interest (targets) in the area of terrain 4. In this embodiment, this target detection and identification is manually performed by the user.
The provided GPR image 24 and classification information advantageously allows the user to analyse and compare information across multiple image spectra. This tends to facilitate target detection and identification.
In this embodiment, the GPR image 24 and classification information provided to the user allow the user to interconnect individual features detected across multiple image spectra. In particular, in this embodiment the user may infer from the provided GPR image 24 and classification information that a detected object (i.e. the region classified as "object above surface of ground" 28) is a man-made object, as it is connected by underground wires (i.e. the GPR image features 26) to a location remote from the object. Furthermore, as the ground surrounding the underground wires is classified as "recently disturbed ground" 30, the user may infer that the object has been installed recently.
The user may perform, or initiate the performance of, an action depending on their analysis of the GPR image 24 and classification information.
Thus, a method of using the database generated using the process of Figure 4 to survey a further portion of the area of terrain 4 is provided.
The above described systems and methods provide that information across multiple image spectra is used to classify regions in the area of terrain. This tends to provide more accurate classifications of terrain types and/or terrain features.
Furthermore, the use of spectral signatures that span multiple image spectra tends to facilitate the differentiation of terrain types/features. Also, this tends to provide that a greater number of different terrain types and/or terrain features may be differentiated from one another. Thus, it tends to be possible to have a greater number of unique classes into which types of terrain or terrain features may be classified.
In other words, using information from multiple image spectra tends to allow for the differentiation between terrain types or terrain features that would otherwise be undifferentiable (i.e. if information from only a single image spectrum was used).
As described in more detail above, the GPR is used to detect objects on, or buried near to, the ground surface. This information may be used to connect image features detected across the multiple image spectra. Likewise, the classifications provided by the multiple image spectra information provide a context for the GPR information. Thus, the multiple image spectra information may be advantageously used to support, clarify and/or facilitate the interpretation of the GPR information, and vice versa.
The synergistic combination of GPR information (i.e. range data, or 3-dimensional data) and the multiple image spectra information provided by the imaging sensors (2-dimensional classification data) advantageously tends to allow for more accurate detection and identification of targets.
It tends to be possible to generate a database of spectral signatures for classification purposes at any time prior to that database being used for classification purposes.
Furthermore, a database may be tailored depending on the particular scenario in which the surveying of the area of terrain is performed. For example, database entries may be removed from the database such that the resulting reduced database only includes classifications for terrain features, types of terrain, or objects that might reasonably be expected to occur in the area being surveyed. Also, the entries of the database (i.e. the spectral signatures) may be normalised or filtered, or in some other way processed, to account for environmental conditions (i.e. the weather) present when the surveying of the terrain is performed, and/or to account for the time-of-day (i.e. light levels and temperature) at which the surveying of the terrain is performed. Such database tailoring tends to achieve greater efficiency and more accurate classification/target detection.
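Such database tailoring can be sketched as a simple filter-and-scale pass over the signature database. The single illumination gain factor used here is a crude, assumed stand-in for the weather and time-of-day compensation described; the class names and band values are illustrative:

```python
def tailor_database(database, expected_classes, gain=1.0):
    """Reduce a signature database to the classes expected in the survey
    area, scaling each stored signature by an illumination gain factor."""
    return {
        cls: tuple(band * gain for band in signature)
        for cls, signature in database.items()
        if cls in expected_classes
    }

database = {
    "undisturbed ground": (0.10, 0.35, 0.50, 0.40),
    "open water": (0.02, 0.05, 0.01, 0.30),
}

# Survey an inland area at dusk: drop "open water", halve expected levels.
reduced = tailor_database(database, {"undisturbed ground"}, gain=0.5)
print(reduced)
# -> {'undisturbed ground': (0.05, 0.175, 0.25, 0.2)}
```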
A further advantage of the above described system and methods is that real-time, or near-real-time, classification, database generation, and/or target detection/identification tend to be possible.
A further advantage of the above described system and methods is that, using the above described techniques, it tends to be possible to determine whether a disturbance of an area of ground is a recent (i.e. relatively fresh) disturbance, or whether a disturbance of an area of ground is a relatively older disturbance.
Apparatus, including the processor 12, for implementing the above arrangement, and performing the method steps described above, may be provided by configuring or adapting any suitable apparatus, for example one or more computers or other processing apparatus or processors, and/or providing additional modules. The apparatus may comprise a computer, a network of computers, or one or more processors, for implementing instructions and using data, including instructions and data in the form of a computer program or plurality of computer programs stored in or on a machine readable storage medium such as computer memory, a computer disk, RAM, PROM etc., or any combination of these or other storage media.
It should be noted that certain of the process steps depicted in the flowcharts of Figures 4 and 6 and described above may be omitted or such process steps may be performed in differing order to that presented above and shown in the Figures. Furthermore, although all the process steps have, for convenience and ease of understanding, been depicted as discrete temporally-sequential steps, nevertheless some of the process steps may in fact be performed simultaneously or at least overlapping to some extent temporally.
In the above embodiments, the above described systems and methods are implemented in the particular scenario of Figure 1. However, in other embodiments one or both of these methods are implemented in a different scenario. For example, in the above embodiments the aircraft is used to capture and process images so as to generate a database to be used, by the aircraft, to perform a classification process. However, in other embodiments, the aircraft is used to capture and process images for generating a database for use by an entity other than the aircraft. In other embodiments, the aircraft uses a database generated by an entity other than the aircraft.
In the above embodiments, an aircraft is used to implement the above described systems and methods. However, in other embodiments one or more different types of entity (e.g. a different type of vehicle) are used to implement an above described system and/or method. For example, in other embodiments a land-based vehicle or a water-based vehicle is used to implement an above described system or method.
In the above embodiments, the set of imaging sensors comprises a UV camera, a visible camera, a SWIR camera, and a LWIR camera. However, in other embodiments the set of imaging sensors comprises a different set of sensors or cameras such that information across multiple image spectra is captured. For example, in other embodiments, the set of imaging sensors comprises one or more different types of sensor instead of or in addition to any of those listed previously.
In the above embodiments, at step s10 the user manually classifies each set of registered pixels (as one of a plurality of different classes). However, in other embodiments the classification of a set of registered pixels is performed using a different appropriate method. For example, the signatures for different types of ground could be specified, e.g. manually by an operator, or computed from other data.
In the above embodiments, the database generated using the process of Figure 4 is used to survey a further portion of the area of terrain. In the above embodiments, the database is used, by a user, to facilitate the detection and identification of targets of interest within the further portion of the area of terrain.
However, in other embodiments the database is used for a different purpose.
For example, in other embodiments the database is used to ascertain a state of a known asset, e.g. GPR measurements may be used to detect an underground pipeline, whilst the multi-spectral image data may be used to locate any leaks or damage to that pipeline.
In the above embodiments, range image data from the GPR is used in combination with the 2-dimensional image data gained by capturing images across multiple image spectra. The surveying of the further portion of the area of terrain is performed using images taken across multiple image spectra, and images from a GPR. However, in other embodiments a range image from a different type of source (i.e. a source other than a GPR) is used in combination with the 2-dimensional image data from the imaging sensors. For example, in other embodiments a Light Detection and Ranging (LIDAR) sensor, which is capable of detecting surface contours or changes, is used in combination with the 2-dimensional image data gained by capturing images across multiple image spectra.
In the above embodiments, the user manually analyses the GPR image together with the associated classification information to detect and identify targets in the further portion of the area of terrain. However, in other embodiments, the analysis of the GPR image and associated classification information is performed in a different appropriate way. For example, in other embodiments a fuzzy logic algorithm, e.g. performed by a processor, may be used to detect and identify targets.
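The automated analysis mentioned above could, for instance, combine the GPR detections and the surrounding classifications into a single target score. The following is a toy rule-based stand-in for such a fuzzy-logic algorithm; the rules, weights, and class names are assumptions for illustration, not part of the described embodiment:

```python
def target_score(has_buried_wires, surrounding_classes):
    """Toy rule-based scoring of how likely a detected region is a
    recently installed man-made target, combining GPR evidence with
    the classifications of nearby registered pixel sets."""
    score = 0.0
    if has_buried_wires:
        score += 0.5  # connected by wires to a remote location
    if "object above surface of ground" in surrounding_classes:
        score += 0.3  # visible surface object present
    if "recently disturbed ground" in surrounding_classes:
        score += 0.2  # ground freshly dug, suggesting recent installation
    return score

score = target_score(True, {"object above surface of ground",
                            "recently disturbed ground"})
print(score)
# -> 1.0
```

A genuine fuzzy-logic implementation would use membership functions and rule inference rather than fixed additive weights, but the inputs it would fuse are the same.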

Claims (15)

  1. A method of capturing and processing images, the images being of terrain (6), the method comprising: using each of a plurality of cameras (14, 16, 18, 20), capturing an image of a surface of the terrain (6) thereby producing a plurality of camera images, each camera image comprising a plurality of pixels; registering the camera images, thereby producing a plurality of sets of registered pixels; classifying each set of registered pixels; using a ground penetrating radar (8), capturing a radar image (24) of the terrain (6); performing a detection algorithm on the radar image (24) to detect an at least partially subterranean object or terrain feature (26) in the radar image; and associating the detected object or terrain feature (26) with at least one classified set of registered pixels (28, 30).
  2. A method according to claim 1, wherein each of the plurality of cameras (14, 16, 18, 20) is for detecting electromagnetic radiation in a different frequency range to each of the other cameras in the plurality.
  3. A method according to claim 2, wherein the frequency range in which each camera (14, 16, 18, 20) detects electromagnetic radiation overlaps at least partially with the frequency range in which a different camera detects electromagnetic radiation.
  4. A method according to any of claims 2 to 3, wherein the plurality of cameras (14, 16, 18, 20) comprises a first camera (14) for detecting electromagnetic radiation in the ultraviolet range of frequencies, a second camera (16) for detecting electromagnetic radiation in the visible light range of frequencies, and a third camera (18, 20) for detecting electromagnetic radiation in the infrared range of frequencies.
  5. A method according to any of claims 1 to 4, wherein the step of classifying each set of registered pixels comprises: for each set of registered pixels, determining a spectral signature using the image data from each of the plurality of cameras (14, 16, 18, 20); and classifying a set of registered pixels depending on its spectral signature.
  6. A method according to claim 5, wherein the step of classifying a set of registered pixels depending on its spectral signature comprises comparing the spectral signature of the set of registered pixels to a spectral signature stored in a database.
  7. A method according to claim 6, the method further comprising generating the database, wherein the step of generating the database comprises: using each of a further plurality of cameras, capturing an image of a further area of terrain thereby producing a further plurality of camera images, each camera image in the further plurality comprising a plurality of pixels; registering the camera images in the further plurality, thereby producing a further plurality of sets of registered pixels; for each set of registered pixels in the further plurality, determining a spectral signature using the image data from each of the further plurality of cameras; assigning a class to each of the determined spectral signatures; and forming the database from the determined spectral signatures and corresponding assigned classes.
  8. A method according to any of claims 5 to 7, wherein the spectral signature of a set of registered pixels spans at least part of the following frequency ranges: the ultraviolet range of frequencies, the visible light range of frequencies, and the infrared range of frequencies.
  9. A method according to any of claims 1 to 8, wherein the step of associating the detected object or terrain feature (26) with at least one classified set of registered pixels (28, 30) comprises projecting at least part of that object or terrain feature (26) and at least one camera image onto a common plane.
  10. A method according to any of claims 1 to 9, wherein the step of associating the detected object or terrain feature (26) with at least one classified set of registered pixels (28, 30) comprises: registering at least part of the radar image with one or more of the camera images; and associating the detected object or terrain feature (26) with a set of registered pixels (28, 30) located above the detected object or terrain feature (26).
  11. A method according to any of claims 1 to 10, the method further comprising performing an identification process to identify the detected object or terrain feature (26) using the at least one classified set of registered pixels associated with the detected object or terrain feature (26).
  12. Apparatus for capturing and processing images, the images being of terrain (6), the apparatus comprising: a plurality of cameras (14, 16, 18, 20), each of the plurality of cameras (14, 16, 18, 20) being for capturing an image of a surface of the terrain thereby producing a plurality of camera images, each camera image comprising a plurality of pixels; a ground penetrating radar (8) for capturing a radar image (24) of the terrain (6); and one or more processors (12) arranged to: register the camera images, thereby producing a plurality of sets of registered pixels; classify each set of registered pixels; perform a detection algorithm on the radar image to detect an at least partially subterranean object or terrain feature (26) in the radar image (24); and associate the detected object or terrain feature (26) with at least one classified set of registered pixels (28, 30).
  13. An aircraft (2) comprising the apparatus of claim 12.
  14. A program or plurality of programs arranged such that when executed by a computer system or one or more processors it/they cause the computer system or the one or more processors to operate in accordance with the method of any of claims 1 to 11.
  15. A machine readable storage medium storing a program or at least one of the plurality of programs according to claim 14.
GB1400185.3A 2013-01-07 2014-01-07 Image processing Active GB2511908B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1300169.8A GB201300169D0 (en) 2013-01-07 2013-01-07 Image processing

Publications (3)

Publication Number Publication Date
GB201400185D0 GB201400185D0 (en) 2014-02-26
GB2511908A true GB2511908A (en) 2014-09-17
GB2511908B GB2511908B (en) 2015-11-11

Family ID=47748010

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1300169.8A Ceased GB201300169D0 (en) 2013-01-07 2013-01-07 Image processing
GB1400185.3A Active GB2511908B (en) 2013-01-07 2014-01-07 Image processing

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1300169.8A Ceased GB201300169D0 (en) 2013-01-07 2013-01-07 Image processing

Country Status (1)

Country Link
GB (2) GB201300169D0 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006128087A2 (en) * 2005-05-27 2006-11-30 Entech Engineering, Inc. System of subterranean anomaly detection and repair
WO2012050595A1 (en) * 2010-09-29 2012-04-19 Entech Engineering, Inc. Complete remote sensing bridge investigation system
US20120274505A1 (en) * 2011-04-27 2012-11-01 Lockheed Martin Corporation Automated registration of synthetic aperture radar imagery with high resolution digital elevation models


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT201800009761A1 (en) * 2018-10-24 2020-04-24 Ids Georadar Srl Photogrammetric system to assist in positioning the georadar data on the measurement scenario
WO2020084551A1 (en) * 2018-10-24 2020-04-30 Ids Georadar S.R.L. Photogrammetric system for positioning georadar data on the measurement scenario
WO2020097038A1 (en) * 2018-11-06 2020-05-14 Saudi Arabian Oil Company Drone-based electromagnetics for early detection of shallow drilling hazards
US10845498B2 (en) 2018-11-06 2020-11-24 Saudi Arabian Oil Company Drone-based electromagnetics for early detection of shallow drilling hazards

Also Published As

Publication number Publication date
GB201300169D0 (en) 2013-02-20
GB201400185D0 (en) 2014-02-26
GB2511908B (en) 2015-11-11
