GB2491395A - Temporal Alignment of Sensor Data - Google Patents

Temporal Alignment of Sensor Data

Info

Publication number
GB2491395A
GB2491395A GB1109297.0A GB201109297A GB2491395A GB 2491395 A GB2491395 A GB 2491395A GB 201109297 A GB201109297 A GB 201109297A GB 2491395 A GB2491395 A GB 2491395A
Authority
GB
United Kingdom
Prior art keywords
image
images
sequence
mark
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1109297.0A
Other versions
GB201109297D0 (en)
Inventor
Christopher Charles Rawlinson Jones
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems PLC filed Critical BAE Systems PLC
Priority to GB1109297.0A priority Critical patent/GB2491395A/en
Publication of GB201109297D0 publication Critical patent/GB201109297D0/en
Publication of GB2491395A publication Critical patent/GB2491395A/en
Withdrawn legal-status Critical Current


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/025 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures by scanning the object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/37 Determination of transform parameters for the alignment of images, i.e. image registration using transform domain methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/38 Registration of image sequences
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)

Abstract

For each of a plurality of different time-steps in a time period, a mark is projected onto a surface in the form of a laser dot. For a first portion of the time period, a first sensor generates a first image or image sequence of the surface with at least a portion of each of the projected marks in that image or sequence, and for a second portion of the time period, a second sensor generates a second image or image sequence of the surface with at least a portion of each of the projected marks in that image or sequence. A process that is dependent upon the respective positions of the mark in the first image or image sequence and the second image or image sequence is then performed, namely temporally aligning the two images or image sequences, or registering them. The different time-steps may be selected by a pseudo-random sequence. The sensors may be of different types, e.g. UV, infra-red or visible light, and may be mounted on an aircraft.

Description

SENSOR DATA PROCESSING
FIELD OF THE INVENTION
The present invention relates to processing of sensor data. In particular, the present invention relates to the processing of data corresponding to respective images of a scene generated using two respective sensors.
BACKGROUND
Image registration is an image processing technique that is performed to match two or more pictures taken, for example, at different times, from different sensors, or from different viewpoints.
Image registration tends to be an important technique in applications such as target recognition, surveillance, autonomous navigation, performing medical diagnosis, and quality checking.
Typically, image registration comprises determining a spatial transformation to be performed on one image to register that image to another.
The spatial transformation is typically calculated using intensity-based and/or feature-based alignment algorithms.
Intensity-based algorithms compare intensity patterns in images, whereas feature-based algorithms find correspondences between image features such as points, object edges, and contours.
However, in some applications, for example when a surface being imaged is relatively homogeneous, changes in image intensity may be very slight and/or useful image features may be relatively rare.
SUMMARY OF THE INVENTION
In a first aspect, the present invention provides a method of performing image processing, the method comprising: at each of a plurality of different time-steps in a time period, positioning a mark onto a surface to provide a plurality of projected marks; for a first portion of the time period, using a first sensor, generating a first image or first sequence of images of the surface with at least a portion of each of the projected marks in the first image or first sequence of images; for a second portion of the time period, using a second sensor, generating a second image or second sequence of images of the surface with at least a portion of each of the projected marks in the second image or second sequence of images; and performing a process that is dependent upon the respective positions of the mark in the first image or first sequence of images and the second image or second sequence of images.
The first portion and the second portion may overlap to some extent.
The step of performing a process that is dependent upon the respective positions of the mark in the first image or first sequence of images and the second image or second sequence of images may comprise performing a time-alignment process to temporally align the first and second images or first and second sequence of images.
The step of positioning a mark onto a surface may comprise projecting the mark onto the surface.
The mark may be projected onto the surface using a laser.
The mark may be a laser dot.
The time-steps in which the mark is positioned onto the surface may form a random, or pseudo-random, sequence of time-steps in the time period.
A sensor may be a line-scan sensor, and, for each time-step in which the mark is positioned on the surface, the mark may be positioned on the surface such that the at least a portion of the mark in the image generated by that sensor is present in a single line of that image.
The first sensor and the second sensor may be different types of sensor.
The second sensor may be a visible light camera and the first sensor may be either an ultra-violet camera or an infrared camera.
The step of performing a process that is dependent upon the respective positions of the at least a portion of the mark in the first image or sequence of images and the second image or sequence of images may comprise performing an image registration process to register the first and second images or register the first and second sequences of images.
The first sensor and the second sensor may be mounted on an aircraft.
In a further aspect, the present invention provides apparatus for performing image processing, the apparatus comprising: means for, at each of a plurality of different time-steps in a time period, positioning a mark onto a surface to provide a plurality of projected marks; a first sensor arranged to, for a first portion of the time period, generate a first image or first sequence of images of the surface with at least a portion of each of the projected marks in the first image or first sequence of images; a second sensor arranged to, for a second portion of the time period, generate a second image or second sequence of images of the surface with at least a portion of each of the projected marks in the second image or second sequence of images; and a processor arranged to perform a process that is dependent upon the respective positions of the mark in the first image or first sequence of images and the second image or second sequence of images.
In a further aspect, the present invention provides a program or plurality of programs arranged such that when executed by a computer system or one or more processors it/they cause the computer system or the one or more processors to operate in accordance with the method of any of the above aspects.
In a further aspect, the present invention provides a machine readable storage medium storing a program or at least one of the plurality of programs according to the above aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a schematic illustration (not to scale) of an aircraft that will be used to implement an example of an image registration process;
Figure 2 is a schematic illustration (not to scale) of a scenario in which the aircraft will be used to implement an example of the image registration process;
Figure 3 is a process flow-chart showing certain steps of an example of the image registration process;
Figure 4 is a schematic illustration (not to scale) of an ultraviolet (UV) image and a visible light image;
Figure 5 is a schematic illustration (not to scale) showing the UV image and the visible light image such that positions of a laser dot in the images are aligned;
Figure 6 is a process flow chart showing certain steps of an image registration process according to a further example;
Figure 7 is a schematic illustration (not to scale) of the aircraft flying in the proximity of the area of terrain in the further example;
Figure 8 is a schematic illustration (not to scale) showing the lines relative to the direction of travel of the aircraft;
Figure 9 is a schematic illustration (not to scale) of the UV image and the visible light image in a second further example;
Figure 10 is a process flow-chart showing certain steps of an image registration and camera alignment process used in the second further example;
Figure 11 is a schematic illustration of the UV image and the visible light image that have been registered;
Figure 12 is a schematic illustration (not to scale) of an image taken using the UV camera over a first time-period;
Figure 13 is a schematic illustration (not to scale) of an image taken using the visible light camera over a second time-period;
Figure 14 is a process flow-chart showing certain steps of an example of an image time-alignment process implemented using the processor; and
Figure 15 is a schematic illustration (not to scale) of a time-aligned second UV image and second visible light image.
DETAILED DESCRIPTION
Figure 1 is a schematic illustration (not to scale) of an aircraft 2 that will be used to implement an example of an image registration process. The example image registration process is useful in the understanding of an embodiment of an image time-alignment process, which is described in more detail later below with reference to Figures 12 -15.
In this example the image registration process is performed to register images from two different types of camera.
In this example, the aircraft 2 is an unmanned aircraft. The aircraft 2 comprises an ultraviolet (UV) camera 4, a visible light camera 6, a laser marker 8, and a processor 10.
In this example, the UV camera 4 is a line-scan camera, i.e. an image capturing device having a sensor which is formed of a single line of ultraviolet light sensitive elements (pixels). As the aircraft 2 flies in the proximity of an area of terrain, the UV camera 4 is arranged to capture high resolution UV band images of the area of terrain, as described in more detail later below with reference to Figure 2. The image acquisition is made line by line as the aircraft 2 flies over the area of terrain.
Also, in this example the UV camera 4 is coupled to the processor 10.
In this example, the visible light camera 6 is a line-scan camera, i.e. an image capturing device having a sensor which is formed of a single line of visible light sensitive elements (pixels). As the aircraft 2 flies in the proximity of an area of terrain, the visible light camera 6 is arranged to capture high resolution visible light band images of the area of terrain, as described in more detail later below with reference to Figure 2. The image acquisition is made line by line as the aircraft 2 flies over the area of terrain.
Also, in this example the visible light camera 6 is coupled to the processor 10.
In this example, the UV camera 4 and the visible light camera 6 are mounted on an optical bench or optical table (not shown in Figure 1) inside the aircraft 2. The optical bench provides that unwanted movement of one camera relative to the other camera is substantially prevented.
In this example, the optical bench comprises one or more actuators (not shown in Figure 1). The one or more actuators provide that one or both cameras 4, 6 may be moved relative to the other camera on the optical bench.
The movement of a camera relative to the other camera is performed according to an instruction received at the optical bench from the processor 10. Such an instruction, and the determination thereof, is described in more detail later below with reference to Figure 4.
Furthermore, the optical bench provides that unwanted rotation of a camera is substantially prevented.
In this example, the laser marker 8 is arranged to emit laser light such that the laser light is detectable by the UV camera 4 and the visible light camera 6. In this example, data corresponding to the images captured by the UV camera 4 and the visible light camera 6 is sent from the relevant camera 4, 6 to the processor 10. The processor 10 processes the received data, and generates a movement instruction for one or both of the cameras 4, 6, as described in more detail later below with reference to Figure 4.
Figure 2 is a schematic illustration (not to scale) of a scenario in which the aircraft 2 will be used to implement an example of an image registration process.
In this scenario, the aircraft 2 flies in the proximity of the area of terrain 12.
As the aircraft 2 flies in the proximity of the area of terrain 12, each camera 4, 6 captures an image of part of the area of terrain 12.
In this example, the cameras 4, 6 are line-scan cameras. Thus, the respective images are captured line-by-line as the aircraft flies over the area of terrain.
The image of the area of terrain 12 captured by the UV camera 4 is hereinafter referred to as the "UV image".
The image of the area of terrain 12 captured by the visible light camera 6 is hereinafter referred to as the "visible light image".
In this example, as the aircraft flies over the area of terrain 12, and as the cameras 4, 6 are capturing respective images of the area of terrain 12, at a particular point in time the laser marker 8 emits a laser pulse. The emitted laser pulse forms a mark on the area of terrain. In this example the mark is a single dot formed by the laser marker on the terrain and is hereinafter referred to as the "laser dot 14", as indicated in Figure 2 by an 'X'.
The laser dot 14 is captured by the cameras 4, 6 as they capture their respective images because the laser dot 14 is made by light that has a wavelength that is detectable by both the UV and visible light cameras 4, 6.
In this example, the duration of the laser pulse from the laser marker 8 is such that the laser dot 14 is present in a single line of the respective line-scanned images taken by the UV camera 4 and the visible light camera 6. Also, the diameter of the laser pulse from the laser marker 8 is such that the laser dot 14 occupies a single pixel in a particular line of each of the images captured by the UV camera 4 and the image captured by the visible light camera 6. Thus, in other words the laser dot 14 appears in both the UV image and the visible light image on a single line of that image, and at a single pixel along that line.
In this example, data corresponding to the UV and visible light images gathered by the UV camera 4 and visible light camera 6 respectively is sent to the processor 10.
Figure 3 is a process flow-chart showing certain steps of an image registration and camera alignment process implemented using the processor 10.
At step s2, image data from the UV camera 4 and the visible light camera 6 is received by the processor 10. In this example, data from the cameras 4, 6 is sent from the cameras 4, 6 to the processor 10 as the images are being captured.
At step s4, the image data from the UV camera 4 is processed to determine a position of the image of the laser dot 14 in the UV image. Also, the image data from the visible light camera 6 is processed to determine a position of the image of the laser dot 14 in the visible light image.
Figure 4 is a schematic illustration (not to scale) of the UV image 16 and the visible light image 18. The position of the image of the laser dot 14 in each of the images 16, 18 is indicated in Figure 4 by an "X" and the reference numeral 14.
In this example, in the UV image 16 the laser dot 14 appears in the ith line of that image, and in the jth pixel along the ith line. Thus, the coordinates of the laser dot 14 in the UV image 16 relative to an origin O are (j, i).
In this example, in the visible light image 18 the laser dot 14 appears in the kth line of that image, and in the lth pixel along the kth line. Thus, the coordinates of the laser dot 14 in the visible light image 18 relative to the origin O are (l, k).
At step s6, the processor 10 determines a transformation T that translates the UV image 16 relative to the origin O such that the position of the laser dot 14 in the UV image 16 is aligned with the position of the laser dot 14 in the visible light image 18.
Figure 5 is a schematic illustration (not to scale) showing the UV image 16 and the visible light image 18 such that the positions of the laser dot 14 in the images 16, 18 are aligned.
In this example, the transformation T that translates the UV image 16 relative to the origin O such that the positions of the laser dot 14 in the UV and visible light images 16, 18 are aligned is the translation

T = (l − j, k − i)

i.e. a shift of (l − j) pixels along the scan-line direction and (k − i) lines across the image.
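By way of illustration only, the translation can be derived directly from the two detected dot positions. The following minimal Python sketch is an assumption (the function name and coordinate convention are not part of the original disclosure); it assumes the dot has been located at (j, i) in the UV image 16 and at (l, k) in the visible light image 18:

    # Illustrative sketch only: compute the translation that aligns the UV
    # image with the visible light image from the detected laser-dot positions.
    def dot_translation(uv_dot, vis_dot):
        # uv_dot = (j, i) and vis_dot = (l, k), given as (pixel, line) pairs.
        j, i = uv_dot
        l, k = vis_dot
        # Shifting the UV image by this amount moves its dot onto the visible dot.
        return (l - j, k - i)  # (shift along the scan line, shift across lines)

    # Example: a dot at pixel 120, line 45 in the UV image and at pixel 132,
    # line 52 in the visible light image gives a shift of (12, 7).
    print(dot_translation((120, 45), (132, 52)))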
At step s8, a movement instruction for the UV camera 4 is determined by the processor 10 using the transformation T determined at step s6 above.
In this example, the movement instruction is an instruction to move the UV camera 4 relative to the visible light camera 6 such that an image being captured by that camera is transformed according to the transformation T determined at step s6 above.
Thus, in this example UV camera 4 is moveable (relative to the visible light camera 6) in at least two directions, those two directions spanning the plane of the UV image 16. For example, the UV camera 4 may be moveable in a direction substantially parallel to a line scanned by the UV camera 4, and a direction substantially perpendicular to a line scanned by the UV camera 4.
At step s10, the movement instruction is sent from the processor 10 to the optical bench.
At step s12, the UV camera 4 is moved relative to the visible light camera 6 using the actuators of the optical bench according to the received movement instruction.
Thus, an image registration and camera alignment process is provided.
In this example, the UV camera 4 is moved relative to the visible light camera 6 to register images produced by those cameras 4, 6. In particular, a transformation that transforms the UV image 16 such that the positions of the laser dot 14 in the UV and visible light images 16, 18 overlap is determined, and the UV camera 4 is moved such that the UV image 16 being captured by the UV camera 4 is transformed by that transformation.
However, in other examples the visible light camera 6 is moved relative to the UV camera 4 to register images produced by those cameras 4, 6, i.e. the UV camera 4 may be held relatively stationary, and the visible light camera 6 moved as described above for the UV camera 4.
Also, in other examples both the UV and visible light cameras 4, 6 may be moved as described above. For example, each camera may be moved so that the image of the laser dot 14 in the image generated using that camera is at a particular pixel (e.g. a pixel substantially in the centre of the image).
An advantage provided by the above described image registration and camera alignment process is that the UV images and the visible light images are aligned. Thus, UV and visible light image data is available for the same area of terrain.
Moreover, after the movement of the UV camera 4 at step s12, the overlap between the UV and visible light images 16, 18 tends to be maximised.
This advantageously tends to provide that the area of terrain for which contemporaneous UV and visible light information is gathered, is maximised.
A further advantage provided by the above described process is that image registration is performed in such a way that tends to avoid the use of computationally expensive processes, such as feature extraction. The above described method tends to be more computationally efficient than conventional methods.
The provided method tends to provide a relatively precise alignment of two images of the same region or space. The process tends to be relatively fast and efficient. This advantageously tends to allow for real-time or near real-time processing of image data.
Conventionally, image registration techniques tend to involve the use of complex and/or very large computer programs to distinguish a set of reference features. This is typically a lengthy process and is typically prone to error, especially in cases where distinguishable features are rare. The provided process tends to alleviate the difficult problem of having few or no distinguishable reference features by projecting a laser dot on to the region or space under observation. The provided process tends to allow for image registration irrespective of whether there are useful reference features or not.
Thus, the provided process tends to be particularly useful in industrial applications such as sheet metal production quality checking.
The above described system/process advantageously uses a marker device to project a mark on to an imaged surface to create a feature that is used to allow alignment of different images of that surface.
In the above examples, one or more of the cameras 4, 6 that are used to capture images are moved relative to the other camera. However, in other examples, the cameras are not moved (or indeed moveable) relative to one another. Figure 6 is a process flow chart showing certain steps of an image registration process according to one such further example in which the cameras are not moved.
In this further example, steps s2 to s6 are performed as described in more detail above with reference to Figure 3.
In this further example, after performing step s6, the process proceeds to step s14.
At step s14, the UV image 16 is transformed using the determined transformation T. Thus, an image registration process is performed.
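As a purely illustrative sketch (assuming the images are held as NumPy arrays; none of these names come from the original disclosure), the determined translation can be applied to the UV image 16 in software as follows:

    # Illustrative sketch only: shift a 2-D image array by (dx, dy) pixels,
    # padding the exposed border with zeros, to register it with another image.
    import numpy as np

    def translate_image(img, dx, dy):
        h, w = img.shape
        out = np.zeros_like(img)
        dst_rows = slice(max(dy, 0), min(h, h + dy))
        dst_cols = slice(max(dx, 0), min(w, w + dx))
        src_rows = slice(max(-dy, 0), min(h, h - dy))
        src_cols = slice(max(-dx, 0), min(w, w - dx))
        out[dst_rows, dst_cols] = img[src_rows, src_cols]
        return out

    # Example: register the UV image using the translation found at step s6.
    uv_image = np.zeros((200, 300))          # placeholder line-scan image
    registered_uv = translate_image(uv_image, 12, 7)

Only the overlapping region of the shifted UV image and the visible light image then contains contemporaneous data, as discussed below.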
In this further example, the overlap between the UV and visible light images 16, 18 may not be maximised. Thus, the area of terrain for which contemporaneous UV and visible light information is gathered tends not to be maximised. In other words, contemporaneous UV and visible light information is gathered for an area of terrain corresponding to the overlap of the UV image 16 and the visible light image 18 shown in Figure 5.
However, an advantage provided by this further example is that the cameras 4, 6 on the aircraft 2 do not need to be moveable with respect to one another. This tends to increase the stability of the cameras 4, 6 on the aircraft 2.
A further advantage provided by this further example is that the registration of the images can be implemented by processing alone. Also, the image registration process can be performed after the images have been collected. For example, images could be collected over a particular time period, and, if desired, those images could be registered at a later time.
In the above examples, a single laser dot 14 is projected onto the area of terrain 12. The image of this laser dot 14 in each of the UV and visible light images 16, 18 is then used to register those images 16, 18. However in other examples, a different number of laser dots are projected on to the area of terrain and used in the image registration process. In such examples, the plurality of laser dots may be projected from one or more laser markers. Also, one or more of the plurality of laser dots may have a different wavelength to a number of the other laser dots. The laser dots may be projected onto the area of terrain in any appropriate pattern, for example a pattern having no lines of symmetry.
An advantage of projecting a plurality of laser dots onto the area of terrain in a pattern that has no lines of symmetry is that registration of the images tends to correct for rotation of one of the images with respect to the other. In other words, the orientation of a mark (i.e. the pattern produced by the laser marker on the surface of the terrain) can be determined and, thus, detection of whether an image and/or camera is rotated relative to another tends to be possible. In such examples, a resultant movement instruction for a camera may comprise an instruction to rotate the camera relative to the other camera, e.g. to correct for such rotations.
In the above examples, the registration processes are implemented using laser dots. However, in other examples a laser marker may project a different shape (e.g. a line) on to the terrain. A second further example in which two lines are projected onto the area of terrain will now be described with reference to Figures 7 to 11.
Figure 7 is a schematic illustration (not to scale) of the aircraft 2 flying in the proximity of the area of terrain 12 in the further example.
As shown in Figure 7, in this second further example the laser marker 8 projects two laser lines 20 on to the area of terrain 12. In this example, the laser lines are projected by the laser marker 8, for example by using a laser beam splitter. However, in other examples a separate laser marker may project each laser line.
Figure 8 is a schematic illustration (not to scale) showing the laser lines relative to the direction of travel 22 of the aircraft 2 (the direction 22 shown as an arrow in Figure 8).
In this example, the laser lines 20 are cast along carefully measured angles. In this example, the angle between a laser line 20 and the direction of travel of the aircraft 2 is denoted by φ. Thus, in this example, the angle between the two laser lines 20 projected on the area of terrain 12 is 2φ.
In this example, the lines scanned by the line-scan UV and visible light cameras 4, 6 are substantially perpendicular to the direction 22 of the aircraft 2.
A line scanned by the UV camera 4 is shown as a line in Figure 8 and is indicated by the reference numeral 24. This line is hereinafter referred to as "the UV line".
A line scanned by the visible light camera 6 is shown as a line in Figure 8 and is indicated by the reference numeral 26. This line is hereinafter referred to as "the visible light line".
In this example, the laser lines 20 are projected on to the area of terrain 12 by a pulse from the laser marker 8. The duration of the laser pulse from the laser marker 8 is such that each laser line 20 appears as a single dot in a single line of the respective line-scanned images taken by the UV camera 4 and the visible light camera 6. Also, the width of each laser line 20 projected by the laser marker 8 is such that each laser line 20 occupies a single pixel in a particular line of each of the images captured by the UV camera 4 and the image captured by the visible light camera 6. Thus, in other words each laser line 20 appears in both the UV image 16 and the visible light image 18 on a single line of that image, and at a single pixel along that line.
In this example, at the time the laser lines 20 are projected on to the area of terrain 12, the UV line 24 and the visible light line 26 are being scanned at different relative positions. Thus, as shown in Figure 9, the laser lines 20 appear as dots in the UV image 16 that are a distance d1 apart. Also, as shown in Figure 9, the laser lines 20 appear as dots in the visible light image 18 that are a distance d2 apart.
Figure 9 is a schematic illustration (not to scale) of the UV image 16 and the visible light image 18 in the second further example. The position of the image of each of the laser lines 20 in each of the images 16, 18 is indicated in Figure 9 by an "X" and the reference numeral 20.
In the UV image 16, the dots resulting from the laser lines 20 are at a distance of d1 from each other.
In the visible light image 18, the dots resulting from the laser lines 20 are at a distance of d2 from each other.
The laser lines 20 are captured (as dots) by the cameras 4, 6 as they capture their respective images because the laser lines 20 are made by light that has a wavelength that is detectable by both the UV and visible light cameras 4, 6.
In this second further example, data corresponding to the UV and visible light images gathered by the UV camera 4 and visible light camera 6 respectively is sent to the processor 10.
Figure 10 is a process flow-chart showing certain steps of an image registration and camera alignment process used in the second further example.
At step s16, image data from the UV camera 4 and the visible light camera 6 is received by the processor 10. In this second further example, data from the cameras 4, 6 is sent from the cameras 4, 6 to the processor 10 as the images are being captured.
At step s18, the image data from the UV camera 4 is processed to determine a distance between the laser dots 20 in the UV image 16, i.e. the distance d1 is determined.
At step s19, the image data from the visible light camera 6 is processed to determine a distance between the laser dots 20 in the visible light image 18, i.e. the distance d2 is determined.
At step s20, the processor 10 determines a transformation T' that registers the UV image 16 and the visible light image 18.
Figure 11 is a schematic illustration of the UV image 16 and the visible light image 18 that have been registered. For purposes of clarity, the UV image 16 is shown as a dotted line.
In this example, for the UV image 16 and the visible light image 18 to be registered the UV image 16 is moved in a direction opposite to the direction of travel 22 of the aircraft 2 by a distance D. In this second further example, the distance D is determined as follows:

D = x·δ = (d2 − d1) / (2 tan φ)

where: x is the length of a single pixel in an image 16, 18 in the direction of travel 22; δ is the number of pixels that an image is to be moved in the direction of travel; d1 is the distance between the images of the laser lines 20 in the UV image 16; d2 is the distance between the images of the laser lines 20 in the visible light image 18; and φ is the angle between a laser line 20 and the direction of travel 22.
Thus, δ may be expressed as:

δ = (d2 − d1) / (2 x tan φ)

A similar calculation can be performed to perform lateral image alignment. For example, lateral image alignment may be achieved as follows.
Firstly, a central point of a line joining the dots (i.e. the image of the laser lines 20) in each image is determined.
Secondly, one or both of the images are shifted in a direction substantially parallel to the line joining the dots in the respective image, until the central points in the images are laterally aligned.
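The along-track and lateral corrections described above can be expressed compactly. The following Python sketch is illustrative only (the function names, and the assumption that dot positions are given as positions along the scan line in the same units as d1 and d2, are mine rather than the original disclosure's):

    import math

    # Along-track correction: delta = (d2 - d1) / (2 * x * tan(phi)) pixels,
    # where x is the pixel length in the direction of travel (same units as
    # d1 and d2) and phi is the angle, in radians, between a laser line and
    # the direction of travel.
    def along_track_shift_pixels(d1, d2, phi, pixel_length):
        return (d2 - d1) / (2.0 * pixel_length * math.tan(phi))

    # Lateral correction: shift until the mid-points of the two dot pairs
    # (positions along the scan line) are laterally aligned.
    def lateral_shift(uv_dots, vis_dots):
        uv_mid = 0.5 * (uv_dots[0] + uv_dots[1])
        vis_mid = 0.5 * (vis_dots[0] + vis_dots[1])
        return vis_mid - uv_mid

    # Example with illustrative numbers only.
    print(along_track_shift_pixels(1.0, 1.5, math.radians(30.0), 0.05))
    print(lateral_shift((110, 190), (118, 198)))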
At step s22, a movement instruction for the UV camera 4 is determined by the processor 10 using the transformation T' determined at step s20 above.
In this example, the movement instruction is determined in the same way as described above at step s8 for a previous example.
At step s24, the movement instruction is sent from the processor 10 to the optical bench.
At step s26, the UV camera 4 is moved relative to the visible light camera 6 using the actuators of the optical bench according to the received movement instruction.
Thus, a further image registration and camera alignment process is provided.
In addition to those advantages provided by the above described examples, the second further example advantageously provides that a rotation of one camera relative to the other camera may be corrected for.
In this second further example, the UV camera 4 is moved relative to the visible light camera 6 to register images produced by those cameras 4, 6.
However, in other examples the visible light camera 6 is moved relative to the UV camera 4, or both the UV and visible light cameras are moved relative to each other and/or the optical bench.
In the above examples, a laser dot 14 (or pair of laser lines 20) is projected onto the area of terrain 12 at a single time-step, i.e. by a single pulse from the laser marker 8. This advantageously provides that two images, e.g. from different sources/cameras can be aligned in time.
For example, in an above example the laser dot 14 is found to occupy the ith line of the UV image 16 and the kth line of the visible light image 18.
Thus, it can be determined that the ith line of the UV image 16 and the kth line of the visible light image were captured at substantially the same point in time.
In a further example, if instead of a continuous line-scanned image, the UV and visible light cameras 4, 6 took a sequence of images of the area of terrain 12, then if the image of the laser dot 14 was found to occupy the nth image in the sequence of images taken by the UV camera 4, and the mth image in the sequence of images taken by the visible light camera 6, it can be determined that the nth image taken by the UV camera 4 and the mth image taken by the visible light camera 6 were captured at substantially the same point in time.
Thus, time-alignment of images can be performed without registering the images. A time-alignment process can be performed instead of, or in addition to, any or all of the above described image registration and/or image registration and camera alignment processes.
An embodiment of a process of time-aligning images or image sequences will now be described. This embodiment of a process of image time-alignment tends to be more accurate than performing time-alignment of images (or sequences of images) using a single laser dot (or pair of laser lines) created by a single pulse.
In this embodiment, as the aircraft 2 flies over the area of terrain 12, the laser marker 8 projects a laser dot 14 on to the area of terrain at different points in time.
In this embodiment, the time-steps at which the laser dot 14 is projected onto the area of terrain are in a pseudo-random sequence.
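By way of illustration only (the names and numbers below are assumptions, not from the original disclosure), such a pseudo-random firing schedule could be generated with a seeded random number generator, so that the same sequence can be reproduced when analysing the images:

    import random

    # Illustrative sketch: pick a pseudo-random subset of line-scan time-steps
    # at which the laser marker fires during the observation period.
    def pulse_schedule(num_time_steps, num_pulses, seed=12345):
        rng = random.Random(seed)  # seeded, so the sequence is reproducible
        return sorted(rng.sample(range(num_time_steps), num_pulses))

    # Example: 12 pulses spread over 1000 line-scan periods.
    print(pulse_schedule(1000, 12))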
Figure 12 is a schematic illustration (not to scale) of an image taken using the UV camera 4 over a first time-period. This image is hereinafter referred to as the "second UV image" and is indicated in Figure 12 by the reference numeral 30. The images in the second UV image 30 of the laser dots that are projected on to the area of terrain 12 in a pseudo-random sequence are indicated in Figure 12 by an 'X'.
Figure 13 is a schematic illustration (not to scale) of an image taken using the visible light camera 6 over a second time-period. This image is hereinafter referred to as the "second visible light image" and is indicated in Figure 13 by the reference numeral 32. The images in the second visible light image 32 of the laser dots that are projected on to the area of terrain 12 in a pseudo-random sequence are indicated in Figure 13 by an 'X'.
In this embodiment, the first and second time-periods are different time-periods, i.e. time-periods of different lengths and/or time-periods that span different time ranges. However, in this embodiment, the first and second time periods overlap to some extent. In other embodiments, the time periods may be the same length and may overlap wholly or partially.
In this embodiment, data corresponding to the second UV and visible light images 30, 32 gathered by the UV camera 4 and visible light camera 6 respectively is sent to the processor 10 as the aircraft 2 flies over the area of terrain 12.
Figure 14 is a process flow-chart showing certain steps of an embodiment of an image time-alignment process implemented using the processor 10.
At step s30, image data corresponding to the second UV image 30 and the second visible light image 32 is received by the processor 10 from the UV camera 4 and the visible light camera 6 respectively. In this embodiment, data from the cameras 4, 6 is sent from the cameras 4, 6 to the processor 10 as the second images 30, 32 are being captured.
At step s32, the image data from the UV camera 4 is processed to determine a position of each of the images of the laser dots in the second UV image 30. Also, the image data from the visible light camera 6 is processed to determine a position of each of the images of the laser dots in the second visible light image 32. In this embodiment, the position of an image of the laser dot in a line-scan image corresponds to a time-step at which that laser dot was captured by the camera.
At step s34, the processor 10 determines a time-period by which to shift the second UV image 30 relative to the second visible light image 32 until the positions of the laser dots in the second UV image 30 are aligned with the positions of the laser dots in the second visible light image 32.
In this embodiment, the time period by which to shift an image is determined as follows.
Firstly, positions of the laser dots in each image are represented as a binary sequence. In this embodiment, this is done by creating a binary sequence indicative of the spacing of the laser dots in the image. A '1' in the binary sequence represents a relatively large distance between successive laser dots in the image, whereas a '0' in the binary sequence represents a relatively small distance between successive laser dots in the image. Thus, a binary sequence is generated for each of the UV and visible light images 30, 32.
Secondly, corresponding elements in the binary sequences are summed using an exclusive or (XOR) operation (i.e. the first integers in the sequences are summed, the second integers in the sequences are summed, and so on). If each of the XOR sums is equal to zero then it is identified that the images are temporally aligned. However, if at least one of the XOR sums is not equal to zero (i.e. equal to 1) then the images are not temporally aligned.
In the case that the images are not temporally aligned, one of the binary sequences is shifted by one bit and the process of summing corresponding elements of the binary sequences is repeated. This process is iterated until each of the XOR sums is zero. The number of bits that a binary sequence has been shifted by to provide that each XOR sum is zero is then used to determine a time-period by which to shift the UV and visible light images 30, 32 relative to each other to provide temporal alignment of the images 30, 32.
In other embodiments, the time period by which to shift an image is determined using a different technique. For example, a line of a line-scanned image may be represented as a '0' if no mark is present in that line, or a '1' if a mark is present in that line. Thus, each line-scanned image 30, 32 may be represented as a binary sequence in a different way to that described above.
The above described iterative process of summing corresponding elements using an XOR operator, and shifting a binary sequence relative to the other by a single bit if a sum is non-zero, may then be performed. In this embodiment, the number of bits required to shift a binary sequence by in order to provide temporal alignment corresponds to the number of lines in a line scanned image that an image is to be shifted by to time-align it with the other image.
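A minimal Python sketch of the shift search described above, using the per-line representation (a '1' where a mark is present in a scan line, a '0' otherwise), is given below; the function name and the example sequences are illustrative assumptions only:

    # Illustrative sketch: find the shift, in scan lines, at which two binary
    # mark sequences agree, by XOR-comparing corresponding elements.
    def find_line_shift(seq_a, seq_b, max_shift):
        for shift in range(max_shift + 1):
            pairs = list(zip(seq_a[shift:], seq_b))
            if pairs and all((a ^ b) == 0 for a, b in pairs):
                return shift  # shift seq_b by this many lines to align it
        return None  # no alignment found within max_shift

    # Example: the visible light sequence starts two lines later than the UV one.
    uv_lines = [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
    vis_lines = [1, 0, 1, 1, 0, 0, 1, 0]
    print(find_line_shift(uv_lines, vis_lines, 5))  # prints 2

A shift in the opposite direction can be handled by swapping the arguments.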
At step s36, the second UV image 30 is shifted by the determined time-period such that the positions of the laser dots in the second UV image 30 and the second visible light image 32 are aligned.
The use of a pseudo-random sequence of laser dots tends to provide that a pattern of laser dots that can be used to assure relatively precise time-alignment is created in the images. In other embodiments, this feature is provided by a different type of sequence of laser dots, for example, a random sequence, or a pre-defined sequence having a specific pattern that facilitates alignment of images.
Figure 15 is a schematic illustration (not to scale) of the second UV image 30 and the second visible light image 32 after the second UV image 30 has been shifted such that the positions of the laser dots in the second UV image 30 and the second visible light image 32 are aligned.
In this embodiment, the second UV image 30 is shifted relative to the second visible light image 32. However, in other embodiments the visible light image 32 is shifted relative to the second UV image 30. Also, in other embodiments both UV and visible light images may be shifted.
An advantage provided by the above described time-alignment process is that the alignment in time of images, or image sequences, from different sources is facilitated. Also, the provided process works when the images, or image sequences, from the different sources are of different sizes/lengths, or comprise different numbers of images.
A further advantage provided by the above described time-alignment process is that the identification of a portion of an image that corresponds in time to a particular portion of an image from a different source is facilitated.
Likewise, the identification of a sub-sequence of images from a larger image sequence that corresponds in time to a sub-sequence of images from a different source is facilitated. Thus, if for example an interesting feature is identified in images from one source at a particular point in time, then images taken at that particular time by different sources can be easily identified and analysed to provide further information about the interesting feature.
Moreover, the above described time-alignment process may be advantageously used in combination with any of the above described image registration and/or image registration and camera alignment processes. Thus, the alignment in time, and registration, of images or image sequences taken by different sources tends to be facilitated. Moreover, the above described processes tend to provide that this can be performed in real-time, near real-time and/or in post-processing at a later time.
A further advantage provided by the combination with any of the above described image registration and/or image registration and camera alignment processes is that the registering of images and/or the alignment of cameras may be performed over any length of observation.
The provided process of time-alignment advantageously tends to provide for the alignment of sequences of discrete images taken of a scene by different image sources. The sequences of discrete images may be taken by cameras/sensors that may or may not be moving relative to the scene.
Apparatus, including the processor 10, for implementing the above arrangement, and performing any of the method steps described above, may be provided by configuring or adapting any suitable apparatus, for example one or more computers or other processing apparatus or processors, and/or providing additional modules. The apparatus may comprise a computer, a network of computers, or one or more processors, for implementing instructions and using data, including instructions and data in the form of a computer program or plurality of computer programs stored in or on a machine readable storage medium such as computer memory, a computer disk, ROM, PROM etc., or any combination of these or other storage media.
It should be noted that certain of the process steps depicted in the flowcharts of Figures 3 and 6 and described above may be omitted or such process steps may be performed in a differing order to that presented above and shown in the Figures. Furthermore, although all the process steps have, for convenience and ease of understanding, been depicted as discrete temporally-sequential steps, nevertheless some of the process steps may in fact be performed simultaneously or at least overlapping to some extent temporally.
In the above embodiments, the provided processes are implemented using an autonomous aircraft to take images of an area of terrain as the aircraft flies over the area. However, in other embodiments any or all of the above described processes are implemented in a different system, for example on a different type of vehicle such as a manned or semi-autonomous vehicle, or a manned or unmanned land-based or water-based vehicle. Any or all of the above described processes are particularly useful in industrial applications e.g. applications that involve the monitoring of continuous production or treatment processes and/or applications in which there tend to be few useful reference features in images that are typically used in conventional techniques. For example, the above described processes tend to be particularly effective in sheet metal production quality checking and the like (because the above described technique can be advantageously enhanced to provide a more persistent mark/feature for subsequent remedial action where a defect is discovered).
In the above embodiment, the cameras are moving relative to the surface being imaged (because the aircraft is moving). However, in other embodiments, one or more of the cameras/sensors is stationary with respect to the surface, e.g. in another embodiment the cameras are stationary relative to the scene and each camera is arranged to capture a sequence of discrete images of the scene.
In the above embodiments, images are gathered by two cameras, namely a UV camera and a visible light camera. However, in other embodiments images are collected from a different number of cameras. These images may then be registered/time-aligned as described above. Moreover, in other embodiments one or more of the cameras may be replaced by a different type of sensor, such as radar, sonar, or an infrared camera.
In the above embodiments, the cameras are line scan cameras.
However, in other embodiments one or more of the sensors/cameras may produce an image in a different way, for example by taking a sequence of discrete images of the terrain/scene being analysed.
In the above embodiments, the laser pulse that produces the laser dot is of a wavelength that is visible by all cameras/sensors being used. However, in other embodiments, one or more laser pulses may be implemented at a time, and any number of these one or more laser pulses may be visible to any number of the cameras/sensors being used.
In the above embodiments, the laser dot/line produced by the laser marker had a diameter such that it occupied a single pixel in an image line.
Also, the laser pulse that produced that laser dot had a duration that provided that the laser dot was present in a single line of an image. However, in other embodiments a laser feature in the image may have a diameter such that it occupies a different number of image pixels in a line. Also, in other embodiments, the laser pulse may have a different duration such that the laser feature in the image may occupy a different number of lines, for example in other embodiments the laser marker may emit a continuous laser beam.
In the above embodiments, a single laser marker is used to project a single laser dot or single pair of laser lines on to the terrain at a particular time.
However, in other embodiments a different number of laser markers may be used to project a different number of laser features. Also, in other embodiments, one or more of the projected laser features may be a different appropriate shape or pattern and of any appropriate size.
In the above embodiments, the feature (i.e. laser dot) that is projected on to the imaged surface (i.e. terrain area) is provided by a laser marker. However, in other embodiments, a different appropriate type of mark is projected onto an imaged surface from an appropriate source so as to provide a feature that can be used to register/align images. Also, in other embodiments a feature is positioned on to the imaged surface by a different method, i.e. other than projection. For example, in other embodiments physical markers may be placed on a surface, e.g. paint spots, chemical markers, burn marks, dye, etc. may be positioned on a surface and used to perform an above described process.
In the above embodiments, the processor (and image processing) is implemented on board the aircraft. However, in other embodiments the processor/processing is implemented at a location remote from the aircraft, e.g. at a later point in time.
In the above embodiments, the cameras are mounted on an optical bench in the aircraft. However, in other embodiments the cameras are mounted on the aircraft in a different appropriate way so as to provide the above described functionality.

Claims (15)

  1. A method of performing image processing, the method comprising: at each of a plurality of different time-steps in a time period, positioning a mark (14, 20) onto a surface (12) to provide a plurality of projected marks; for a first portion of the time period, using a first sensor (4), generating a first image (30) or first sequence of images of the surface (12) with at least a portion of each of the projected marks (14, 20) in the first image (30) or first sequence of images; for a second portion of the time period, using a second sensor (6), generating a second image (32) or second sequence of images of the surface (12) with at least a portion of each of the projected marks (14, 20) in the second image (32) or second sequence of images; and performing a process that is dependent upon the respective positions of the mark (14, 20) in the first image (30) or first sequence of images and the second image (32) or second sequence of images.
  2. A method according to claim 1, wherein the first portion and the second portion overlap to some extent.
  3. A method according to claim 1 or 2, wherein the step of performing a process that is dependent upon the respective positions of the mark (14, 20) in the first image (30) or first sequence of images and the second image (32) or second sequence of images comprises performing a time-alignment process to temporally align the first and second images or first and second sequence of images.
  4. A method according to any of claims 1 to 3, wherein the step of positioning a mark (14, 20) onto a surface (12) comprises projecting the mark (14, 20) onto the surface (12).
  5. A method according to claim 4, wherein the mark (14, 20) is projected onto the surface using a laser (8).
  6. A method according to claim 5, wherein the mark (14, 20) is a laser dot (14).
  7. A method according to any of claims 1 to 6, wherein the time-steps in which the mark (14, 20) is positioned onto the surface form a random, or pseudo-random, sequence of time-steps in the time period.
  8. A method according to any of claims 1 to 7, wherein: a sensor (4, 6) is a line-scan sensor; and for each time-step in which the mark (14, 20) is positioned on the surface (12), the mark (14, 20) is positioned on the surface (12) such that the at least a portion of the mark (14, 20) in the image generated by that sensor is present in a single line of that image.
  9. A method according to any of claims 1 to 8, wherein the first sensor (4) and the second sensor (6) are different types of sensor.
  10. A method according to claim 9, wherein the second sensor (6) is a visible light camera and the first sensor (4) is either an ultra-violet camera or an infrared camera.
  11. A method according to claim 1, wherein the step of performing a process that is dependent upon the respective positions of the at least a portion of the mark (14, 20) in the first image (30) or first sequence of images and the second image (32) or second sequence of images comprises performing an image registration process to register the first and second images or register the first and second sequences of images.
  12. A method according to any of claims 1 to 11, wherein the first sensor (4) and the second sensor (6) are mounted on an aircraft (2).
  13. Apparatus for performing image processing, the apparatus comprising: means (8) for, at each of a plurality of different time-steps in a time period, positioning a mark (14, 20) onto a surface (12) to provide a plurality of projected marks; a first sensor (4) arranged to, for a first portion of the time period, generate a first image (30) or first sequence of images of the surface (12) with at least a portion of each of the projected marks (14, 20) in the first image (30) or first sequence of images; a second sensor (6) arranged to, for a second portion of the time period, generate a second image (32) or second sequence of images of the surface (12) with at least a portion of each of the projected marks (14, 20) in the second image (32) or second sequence of images; and a processor arranged to perform a process that is dependent upon the respective positions of the mark (14, 20) in the first image (30) or first sequence of images and the second image (32) or second sequence of images.
  14. A program or plurality of programs arranged such that when executed by a computer system or one or more processors it/they cause the computer system or the one or more processors to operate in accordance with the method of any of claims 1 to 12.
  15. A machine readable storage medium storing a program or at least one of the plurality of programs according to claim 14.
GB1109297.0A 2011-06-03 2011-06-03 Temporal Alignment of Sensor Data Withdrawn GB2491395A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1109297.0A GB2491395A (en) 2011-06-03 2011-06-03 Temporal Alignment of Sensor Data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1109297.0A GB2491395A (en) 2011-06-03 2011-06-03 Temporal Alignment of Sensor Data

Publications (2)

Publication Number Publication Date
GB201109297D0 GB201109297D0 (en) 2011-07-20
GB2491395A true GB2491395A (en) 2012-12-05

Family

ID=44343330

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1109297.0A Withdrawn GB2491395A (en) 2011-06-03 2011-06-03 Temporal Alignment of Sensor Data

Country Status (1)

Country Link
GB (1) GB2491395A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7015954B1 (en) * 1999-08-09 2006-03-21 Fuji Xerox Co., Ltd. Automatic video system using multiple cameras
US20060256229A1 (en) * 2005-05-11 2006-11-16 Sony Ericsson Mobile Communications Ab Digital cameras with triangulation autofocus systems and related methods
US20090010633A1 (en) * 2007-07-06 2009-01-08 Flir Systems Ab Camera and method for use with camera


Also Published As

Publication number Publication date
GB201109297D0 (en) 2011-07-20

Similar Documents

Publication Publication Date Title
US10019838B2 (en) Human body three-dimensional imaging method and system
JP4221768B2 (en) Method and apparatus for positioning an object in space
US8265376B2 (en) Method and system for providing a digital model of an object
JP3930482B2 (en) 3D visual sensor
US20070057946A1 (en) Method and system for the three-dimensional surface reconstruction of an object
JP5051493B2 (en) 3D measurement marker and 3D measurement method using the same
AU2015250748A1 (en) 3D point clouds
Fiala et al. Visual odometry using 3-dimensional video input
JP2019032218A (en) Location information recording method and device
JP2023004964A (en) Sensor calibration method and apparatus
US20200114519A1 (en) System, method and marker for the determination of the position of a movable object in space
Shim et al. A mobile robot localization using external surveillance cameras at indoor
AU2021204030A1 (en) Multi-sensor calibration system
JP2017004228A (en) Method, device, and program for trajectory estimation
CN113850280A (en) Intelligent airplane part code spraying identification method and device, storage medium and equipment
EP2530649A1 (en) Sensor data processing
CN116117800B (en) Machine vision processing method for compensating height difference, electronic device and storage medium
GB2491396A (en) Sensor data alignment processing
US20230070281A1 (en) Methods and systems of generating camera models for camera calibration
EP2530650A1 (en) Sensor data processing
GB2491395A (en) Temporal Alignment of Sensor Data
JP5285487B2 (en) Image recording system and image recording method
Hsu et al. Online recalibration of a camera and lidar system
Zhang et al. Efficient closed-loop detection and pose estimation for vision-only relative localization in space with a cooperative target
JP2014052212A (en) System and method for measuring pipe position

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)