GB2246261A - Tracking arrangements and systems - Google Patents

Tracking arrangements and systems

Info

Publication number
GB2246261A
GB2246261A GB9114518A
Authority
GB
United Kingdom
Prior art keywords
image
camera
tracking system
video
frame
Prior art date
Legal status
Granted
Application number
GB9114518A
Other versions
GB9114518D0 (en)
GB2246261B (en)
Inventor
Christopher George Harris
Richard John Evans
Current Assignee
Roke Manor Research Ltd
Original Assignee
Roke Manor Research Ltd
Priority date
Filing date
Publication date
Application filed by Roke Manor Research Ltd filed Critical Roke Manor Research Ltd
Publication of GB9114518D0 publication Critical patent/GB9114518D0/en
Publication of GB2246261A publication Critical patent/GB2246261A/en
Application granted granted Critical
Publication of GB2246261B publication Critical patent/GB2246261B/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/785Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
    • G01S3/7864T.V. type tracking systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147Details of sensors, e.g. sensor lenses

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Vascular Medicine (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

A tracking system for tracking the relative positions of a video camera 8 and a three dimensional object (eg a runway) between which there is unknown but continuous relative motion, the tracking system including a video camera 8 for viewing the object, first means for defining a visual model of the object (eg at a ground station 7), the model consisting of a specification of a set of control points 3 which are located at high contrast edges of the object and whose position 5 in the video image define the position and orientation of the object relative to the camera, and second means 11 for comparing the visual model with each frame of the camera's image of the object and for measuring the difference between the two images at each of the control points, the difference measurements for each frame being used to modify an initial or previous estimate of the position and attitude of the object relative to the camera which in turn is used to revise the predicted position of the set of control points on the image for comparison with the next succeeding frame of the video camera whereby the visual model and camera image are gradually brought into conformity. The tracking system may be used as an aircraft landing aid.

Description

TRACKING ARRANGEMENTS AND SYSTEMS
This invention relates to tracking arrangements and systems and is especially applicable to a three dimensional (3D) model based tracker.
Measurements of an object's position and attitude are currently made for a number of purposes by the manual interpretation of video recordings or cine film, for example to determine the path taken in tests by bombs dropped from aircraft. This manual analysis is extremely time consuming, and automatic equipment operating at video rate would be much preferred.
A known method of obtaining measurements at or near video rate is reported on pages 85-90 of the Proceedings of the Fifth Alvey Vision Club (AVC89) by R.S. Stephens, Computing Device Limited. This method uses the detection of high contrast image edges at prespecified control points to perform a model fit, but it employs a widely spaced pair of cameras and an elaborate and computationally expensive edge detector, and model movement is determined by using a Hough binning method. The use of a pair of cameras limits the utility of this method. For example, it would be impracticable to install such cameras in an aircraft.
Furthermore, Stephens' reported work is still at the simulation stage, and when complete will need considerable processing power.
It is an object of the present invention to provide a tracking system which, given an approximate estimate of the position and attitude of an object, refines these estimates and automatically tracks the movement of the object using the video-rate imagery provided by a TV camera.
The invention provides a tracking system for tracking the relative positions of a video camera and a three dimensional object between which there is unknown but continuous relative motion, the tracking system including a video camera for viewing the object, first means for defining a visual model of the object, the model consisting of a specification of a set of control points which are located at high contrast edges of the object and whose position in the video image define the position and orientation of the object relative to the camera and second means for comparing the visual model with each frame of the camera's image of the object and for measuring the difference between the two images at each of the control points, the difference measurements for each frame being used to modify an initial or previous estimate of the position and attitude of the object relative to the camera which in turn is used to revise the predicted position of the set of control points on the image for comparison with the next succeeding frame of the video camera whereby the visual model and camera image are gradually brought into conformity.
Given an approximate initial estimate of the position and attitude of a predefined object, the invention enables these estimates to be refined, and the movement of the object to be automatically tracked using the video-rate imagery provided by a TV camera. The principles of this invention and a means of realisation will be described for use in controlling the flight of an unmanned aircraft during landing.
Given a manually supplied initial estimate of runway position and attitude (relative to the camera), errors in the initial estimate are
corrected by minimising the root-mean-square error between the observed position of the object's edges and edges specified at control points in a visual model of the object being tracked. This process can be repeated for subsequent image frames by using the refined estimate of position and attitude as the initial approximate estimate of position and attitude in the succeeding video frame to be processed; this enables a moving object to be tracked.
The invention will also be applicable to dynamic measurement and control generally. Examples are robot object manipulation, vehicle convoying, satellite docking, and aircraft landing aids.
The advantages of the vision based measurements technique according to the present invention in relation to most other techniques are that there is little impact on the design of the object to be tracked and that video cameras are now highly developed and inexpensive, making their use in hostile environments viable.
In the case of controlling an unmanned aircraft, the low cost of cameras makes their use in disposable unmanned aircraft an economic proposition. For this application of the tracking system according to the present invention, the 3D model-based tracker would enable the position of the aircraft relative to the landing runway to be determined, as either a means of monitoring performance, a landing aid, or an input to an autonomous landing system.
The tracking system according to the present invention will be described in relation to controlling the flight of an unmanned aircraft during landing. The invention is applicable however, in many other situations where the object to be tracked is rigid and can be prespecified from design drawings or prior measurements.
The foregoing and other features according to the present invention will be better understood from the following description with reference to the accompanying drawings, in which:
FIGURES 1(a) to (f) illustrate, for an aircraft landing tracking system, the effects of an incorrectly estimated camera viewpoint,
FIGURE 2 illustrates, for an aircraft landing tracking system, an arrangement of predicted control points, lines of search and detected edge positions,
FIGURE 3 illustrates, for the tracking system according to the invention, a method of detecting the edge position of an object,
FIGURE 4 shows a tracking system according to the present invention for a 3D tracker for the control of an unmanned aircraft,
FIGURE 5 illustrates a visual model of an airfield runway,
FIGURE 6 shows azimuth and elevation track output of the tracking system according to the invention, and
FIGURE 7 shows multiple attitude tracks derived from a single segment of video data.
In accordance with one aspect of the present invention means are provided for defining from existing measurements a visual model of an aircraft runway and nearby features. The visual model consists of a specification of a set of control points at which there is a prominent visible edge, such as occurring between the runway surface (concrete or tarmac) and adjacent grass, or occurring at painted or other markers placed for this purpose. Each control point is specified by its position (in three dimensions relative to a convenient origin near the centre of the runway), the direction (in three dimensions) of the visible edge at this point, and the polarity of the edge (i.e. which side is brighter than the other). This technique can be extended to utilise lines, as opposed to visible edges, but this extension is not described herein.
Given an initial approximate estimate of the runway's position and attitude relative to a video camera, the expected shape and position of the runway as seen in the camera's image can be predicted. Because of errors in the initial estimate, there will be some discrepancy between the expected position and shape and the true position and shape. This is illustrated in FIGURES 1(a) to (f), which show for an aircraft runway the expected outline 1 of the runway, i.e. the visual model, in relation to the true image 2 of the runway as seen by the video camera. In FIGURE 1(a) the position of the true runway 2 is lower than the initial estimate of the expected runway outline 1, in FIGURE 1(b) the true runway 2 is rolled to the left of the runway outline 1, in FIGURE 1(c) the true runway 2 is positioned to the left of the runway outline 1, in FIGURE 1(d) the orientation of the true runway 2 is pitched up from the initial estimate of the expected runway 1, in FIGURE 1(e) the position of the true runway 2 is nearer than the expected runway 1 and in FIGURE 1(f) the orientation of the true runway 2 is yawed to the left of the initial estimate of the expected runway outline 1. As the aircraft mounted camera is viewing the airfield at a shallow angle, the runway, which is actually very long, will (as shown in FIGURE 1) appear very much foreshortened. The error in the initial estimate is calculated by measuring the discrepancy between the predicted and true runway shape and position, as described below.
As illustrated in FIGURE 2, the positions of a number of predicted control points 3 are shown on the expected outline 1 of the visual model of the runway. For each control point 3 a line of search 4 in the image is predicted, centred on the predicted control point 3 position and approximately perpendicular to the expected edge direction. The actual direction of the line of search 4 is selected to be either horizontal, vertical or diagonal in the array of brightness values making up the image. The image brightness along each line of search 4 is examined in order to locate a true detected edge position 5.
The method of locating the detected edge position 5 is illustrated in FIGURE 3. A brightness threshold is calculated for each line of search 4 to be mid-way between the image brightnesses at each end of the line of search. Image brightnesses are then examined sequentially, starting from one end of the line of search, until the brightness is observed to cross the brightness threshold.
The detected edge position 5 is deemed to be found at the point where the threshold was crossed. In certain circumstances the detected edge position 5 may not be found, for example, if the specified polarity of the edge is inconsistent with the relative brightnesses of the image at the ends of the line of search 4, or if the difference in brightnesses at the ends of the line of search 4 is below a predefined threshold of acceptability. The detection of detected edge position may be performed in other ways, but the above has been found adequate in most cases. Alternative techniques include methods to reject unacceptably unsharp edges, by use of double brightness thresholds or analysis of brightness derivatives.
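By way of illustration, the threshold-crossing search described above may be expressed as the following sketch. Python is used here (the reported implementation was in Pascal), and the function name, the array layout and the min_contrast parameter are assumptions made for this example only.

```python
import numpy as np

def find_edge_along_search_line(brightness, min_contrast=10.0, expected_polarity=+1):
    """Locate an edge along one line of search, as described above.

    brightness        -- 1-D array of image brightnesses sampled along the line
                         of search, centred on the predicted control point.
    min_contrast      -- reject the line if the end-to-end brightness difference
                         is below this threshold of acceptability (illustrative value).
    expected_polarity -- +1 if the far end is expected to be brighter than the
                         near end, -1 otherwise.
    Returns the offset along the line at which the brightness first crosses the
    mid-way threshold, or None if no acceptable edge is found.
    """
    lo, hi = float(brightness[0]), float(brightness[-1])
    if (hi - lo) * expected_polarity <= 0:   # polarity inconsistent with the model edge
        return None
    if abs(hi - lo) < min_contrast:          # brightness difference below acceptability
        return None
    threshold = 0.5 * (lo + hi)              # mid-way between the two end brightnesses
    for i in range(1, len(brightness)):
        if (brightness[i - 1] - threshold) * (brightness[i] - threshold) <= 0:
            return i                         # first crossing of the threshold
    return None
```

For example, find_edge_along_search_line(np.array([20.0, 22.0, 25.0, 90.0, 95.0])) returns 3, the offset at which the brightness first crosses the mid-way threshold of 57.5.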
For those control points 3 where the corresponding detected edge position 5 has been found, the edge displacement error is calculated. This is the displacement between the predicted control point 3 position and the detected edge position 5, measured perpendicular to the predicted edge orientation.
Finally the correction required to the initial approximate estimate 1 of the runway position and attitude is calculated by minimising the root-mean-square of calculated edge displacement errors. This can be done by simple algebra on the assumption that, for small errors, the edge displacement error is a linear function of the error in position and attitude. The formulae used in this calculation are described below.
Tracking of a moving object, e.g. the runway, can, as outlined above, be achieved by performing a vision task on successive frames of an image sequence. By making use of the continuity of object motion found in the closely-spaced frames of video imagery, the process of tracking becomes simplified, permitting real-time object tracking to be accomplished. By noting the difference in image position between the observed image features and the projection of the 3D model features with the model in its currently estimated pose, the required small corrections to the object pose can be calculated.
The particular model features that are used to perform tracking are points located on high contrast edges. The projection of these edges into the image is easy to perform, and the corresponding image edges are simple to locate by searching the image pixels perpendicularly to the edge direction. The set of measured displacements of these edges is used to update the estimate of model pose by linearising the resulting equations.
For the edges to be successfully detected, the edges need to be locally straight and uncluttered. Rudimentary edge detection is performed by considering the row of pixels perpendicular to the model edge that passes through the projected control point, and finding the pixel whose brightness is half-way between the brightnesses of the pixels at either end of the row.
In general, the projected model control points will not lie precisely on the observed image edges because of errors in the estimate of the object pose. However, because frames are captured at video rate, the errors in the object pose are small, allowing edges to be correctly matched. Small changes in the pose of the model will cause image movement of the projected control points which is linear in the change in the pose, and so the perpendicular distances between the projected control points and the observed image edges will vary linearly with the change in the pose. This linearity enables the small variations of the object pose that serve to minimise these perpendicular distances to be determined by solving a set of simultaneous linear equations, hence producing a fast pose correction. Applying the resulting correction to the pose estimate enables the pose estimate to converge on the correct object pose as subsequent frames are processed.
For the algorithm used in the calculation it is necessary to define the (Cartesian) camera coordinate system, which has its origin at the camera pinhole, Z-axis aligned along the optical axis of the camera, and X and Y axes aligned along the horizontal (rightward) and vertical (downward) image axes respectively. Imaging of points in 3D is catered for by the introduction of a conceptual image-plane situated at unit distance in front of the camera pin-hole.
A point at position R = (X, Y, Z) in camera coordinates will then project to image position
r = (x, y) = (X/Z, Y/Z)
Define a model coordinate system, with origin located at T in camera coords, and with axes aligned with the camera coordinate system.
Consider a control point on the model located at P in model coordinates, and situated on a prominent 3D edge. This control point will project onto the image at
r = (1/(Tz+Pz)) (Tx+Px, Ty+Py)
Let the tangent to the 3D edge on which the control point is located be called the control edge. The control edge is in practice defined by specifying a companion control point to P, also located on the control edge and which projects onto the image at s. By considering the image displacement between r and s, the expected orientation of the control edge on the image can be determined. Let this be an angle α from the image x-axis, so that tan α = (sy − ry)/(sx − rx).
The next stage is to find the perpendicular distance of r from the appropriate image edge. Assuming that the orientations of the image edge and the control edge are very nearly the same, a one-dimensional search for the image edge can be conducted by looking perpendicularly to the control edge from r. However, to search the image exactly perpendicular to the edge at the control point would require finding the image intensity at non-pixel positions. To avoid this inconvenience, the image is searched for the edge in one of four directions: horizontally, vertically, or diagonally (that is, by simultaneous unit pixel displacements in both the horizontal and vertical directions), and the measured distance is corrected to allow for the discrepancy between the desired and actual search directions. The direction which is closest to being perpendicular to the control edge is chosen, and a row of pixel values centred on r, the projection of the control point, is read from the image. Each control point will result in a measured perpendicular distance between the points 3 and 5 as illustrated in FIGURE 2. The set of these perpendicular distances will be used to find the small change in the object pose that should minimise the perpendicular distances in the next frame processed.
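The projection of a control point, the estimation of the control edge orientation α from its companion point, and the choice of one of the four pixel search directions can be sketched as follows. The camera model is the unit-focal-length pinhole defined above; the function names are illustrative only.

```python
import numpy as np

def project(point_camera):
    """Pinhole projection onto the unit-distance image plane: (X, Y, Z) -> (X/Z, Y/Z)."""
    p = np.asarray(point_camera, dtype=float)
    return p[:2] / p[2]

def control_edge_angle(T, P, P_companion):
    """Expected image orientation (angle alpha from the image x-axis) of the control
    edge, found from the projections r and s of a control point and its companion
    point, both given in model coordinates with the model origin at T (camera coords)."""
    r = project(np.asarray(T) + np.asarray(P))
    s = project(np.asarray(T) + np.asarray(P_companion))
    dx, dy = s - r
    return np.arctan2(dy, dx)

def search_direction(alpha):
    """Pick the pixel step (horizontal, vertical or one of the two diagonals) that is
    closest to being perpendicular to a control edge at angle alpha, as described above."""
    perpendicular = alpha + np.pi / 2.0
    steps = {(1, 0): 0.0, (1, 1): np.pi / 4, (0, 1): np.pi / 2, (-1, 1): 3 * np.pi / 4}
    def gap(a, b):                           # angular difference modulo 180 degrees
        g = abs(a - b) % np.pi
        return min(g, np.pi - g)
    return min(steps, key=lambda step: gap(perpendicular, steps[step]))
```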
Consider rotating the model about the model origin by a small rotation vector θ, and translating it by a small distance Δ. Write these two small displacements as the six-vector q.
R'(q) = (X', Y', Z') = T + Δ + P + θ × P
that is
X' = Tx + Δx + Px + θy Pz − θz Py
Y' = Ty + Δy + Py + θz Px − θx Pz
Z' = Tz + Δz + Pz + θx Py − θy Px
This will project onto the image at
r'(q) = (x', y') = (X'/Z', Y'/Z')
Expanding in small Δ and θ, and retaining terms up to first order, gives
x'(q) = x/z + q·a,   y'(q) = y/z + q·b
where x = (Tx+Px), y = (Ty+Py), z = (Tz+Pz), and a and b are the 6-vectors of first-order coefficients. Thus r'(q) can be written
r'(q) = r + (q·a, q·b)
Hence the perpendicular distance of the control point from the image edge is
L'(q) = L + q·a sin α − q·b cos α = L + q·c
where c = a sin α − b cos α and L is the originally measured distance. Consider now not just one control point, but N control points, labelled i = 1..N. The perpendicular distance of the i'th control point to its image edge is
L'i(q) = Li + q·ci
The next stage is to find the small change of pose, q, that aligns the model edges precisely with the observed image edges, that is, to make all L'i(q) zero. If the number of control points, N, is greater than 6, then this is not in general mathematically possible as the system is overdetermined. Instead, the sum of squares of the perpendicular distances,
E(q) = Σi [Li + q·ci]²
is minimised.
By setting to zero the differentials of E with respect to q, the following equations are obtained:
(Σi ci ciᵀ) q = −Σi Li ci
This is a set of 6 simultaneous linear equations, and so can be solved using standard linear algebra.
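As a minimal numerical sketch of this least-squares step (assuming the perpendicular distances Li and the 6-vectors ci have already been assembled into arrays), the six simultaneous linear equations can be solved as follows; the function and argument names are illustrative.

```python
import numpy as np

def pose_correction(L, C):
    """Solve for the small pose change q minimising E(q) = sum_i (L_i + q.c_i)^2.

    L -- length-N array of the measured perpendicular distances L_i
    C -- N-by-6 array whose i'th row is the 6-vector c_i
    The normal equations (sum_i c_i c_i^T) q = -sum_i L_i c_i form a 6-by-6 linear
    system, solved here with standard linear algebra.
    """
    A = C.T @ C            # 6-by-6 matrix: sum_i c_i c_i^T
    b = -C.T @ L           # 6-vector: -sum_i L_i c_i
    return np.linalg.solve(A, b)
```

At least six well-placed control points must be detected for the 6-by-6 matrix to be invertible; the Kalman filter formulation given later in the description is one way of handling the degenerate cases.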
The change, q, in the model pose specified by the above algorithm must now be applied to the model. Applying the change in model position is straightforward:
T := T + Δ
The change in object attitude, however, causes some difficulties. Conceptually, the positions of the control points on the model should be updated thus:
Pi := Pi + θ × Pi
However, after numerous (thousands of) cycles of the algorithm, finite numerical precision and the approximation to rotation represented by the above equation result in the control points no longer being correctly positioned with respect to each other, and thus the model distorts. To overcome this problem, the attitude of the model is represented by the rotation vector φ (a 3-vector whose direction is the axis of rotation and whose magnitude is the angle of rotation about this axis), which rotates the model from its reference attitude, in which the model has its axes aligned with the camera coordinate axes. From the rotation vector φ can be constructed the orthonormal rotation matrix A(φ), which appropriately rotates any vector to which it is applied. Conceptually, the rotation vector φ should be updated by the model attitude change θ, thus
A(φ) := A(θ) A(φ)
but in doing this the orthonormality of the rotation matrix may be lost in time, so in practice the rotation vector φ is updated directly by use of quaternions. If A(φ) is the rotation matrix after updating, and the i'th model point is located in reference coordinates at Pi(ref), then the position of this point in model coordinates at the beginning of the next cycle will be
Pi = A(φ) Pi(ref)
If the object is executing continuous motion, the above method of updating the pose vector will produce a result that lags behind the actual pose. It is advantageous to counter this deficiency by predicting ahead, as inter-frame motion in excess of the half-length of the row of pixels examined in edge detection can then be tolerated.
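A sketch of this attitude update, working through quaternions so that orthonormality cannot drift, is given below. SciPy's rotation class is used purely as a convenient stand-in for the quaternion arithmetic (the reported implementation was in Pascal), and the function names are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def update_attitude(phi, theta):
    """Compose the small attitude change theta (rotation vector) with the current
    model attitude phi (rotation vector).  The composition is done through
    quaternions, so the equivalent rotation matrix A(phi) stays exactly orthonormal."""
    composed = Rotation.from_rotvec(theta) * Rotation.from_rotvec(phi)
    return composed.as_rotvec()

def model_points(phi, P_ref):
    """Re-derive the model-coordinate control points from their fixed reference
    positions, Pi = A(phi) Pi(ref), so that the model cannot distort over many cycles."""
    return Rotation.from_rotvec(phi).apply(np.asarray(P_ref, dtype=float))
```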
This predicting ahead is most simply achieved by using a position and velocity tracker, the so-called alpha-beta tracker.
Let xt be the 6-vector that represents the estimate of the pose of the object at frame number t, and qt the change in pose requested by the above algorithm. The alpha-beta tracker is given by
xt+1 = xt + vt + α qt
vt+1 = vt + β qt
where the 6-vector vt is the estimated velocity, that is, the rate of change of pose per frame processed. α and β are scalars affecting the integration time of the predictor.
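The alpha-beta predictor above amounts to the following few lines; the gain values shown are illustrative only and are not those used in the reported system.

```python
import numpy as np

def alpha_beta_update(x, v, q, alpha=0.5, beta=0.1):
    """One cycle of the alpha-beta predictor described above.

    x -- current 6-vector pose estimate x_t
    v -- current 6-vector velocity estimate v_t (rate of change of pose per frame)
    q -- change in pose requested by the tracking algorithm for this frame
    Returns (x_{t+1}, v_{t+1}) = (x_t + v_t + alpha*q, v_t + beta*q).
    """
    x_next = x + v + alpha * q
    v_next = v + beta * q
    return x_next, v_next
```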
It can be seen from the foregoing that by considering small changes in model pose, the projection of a 3D model onto an image plane can be linearised.
Furthermore, by defining a visual model in terms of control points, a robust algorithm for estimating object pose can be derived.
This algorithm is undemanding in terms of processing requirements and can be implemented at video rate on modest hardware.
The basic algorithm described above provides the basis for a fast, general tracking system. The possible enhancements on the basic algorithm are: to enable and disable control points appropriately as the object attitude varies and control points are revealed or obscured respectively; to weight the effect of each control point in the algorithm by the observed edge strength; to disregard the contribution of a control point if the observed 'polarity' of the edge (i.e. on which side of the edge it is brighter) is not as expected; and more sophisticated frame-to-frame prediction and smoothing.
In order to improve the estimates of position and attitude resulting from the processing of each image frame and to automatically cope with variations in measurement noise, a Kalman filter (also operating at or near video rate) can be utilised to smooth the sequence of measurements obtained from the processing of a sequence of video frames and predict position and orientation in future frames.
In order that the Kalman filter can be used to best integrate the sequence of observations, statistical parameters are calculated for both (1) the apparent motion of the object in the camera's field of view and (2) the likely error in each observation. The parameters of the apparent motion of the object are calculated, for the currently estimated relative position of the object and camera, from prior knowledge of the nature of the motion of the object and the camera defined separately. The statistical parameters of the likely error in each observation are calculated from the positions of the found detected edges. The formulas used in this calculation are described below.
The measuring technique outlined above for measuring displacement error by using special algorithms is highly efficient and has been demonstrated at video rate with a standard mini-computer.
This model-based technique culminates, as specified above, in the determination of pose by minimising (by linear algebra) the quantity E(q) = Σi [Li + q·ci]², where q is a 6-vector defining the pose of the object, Li is the image plane distance between an observed edge and the predicted position of the i'th edge of the model at a specified control point, and ci is a 6-vector whose value depends on the position of the edge within the model and on an initial estimate of the object's pose. (Model edges which pass undetected in the image are simply ignored.) When applying this technique to a practical case of a moving object, it is in principle possible to use the pose estimate, calculated by processing one video frame, as the initial estimate of the object's pose in the next video frame. This approach to tracking a moving object has the disadvantage that the object's motion would be limited to small movements between frames, because the system searches for model edges in a limited region about the predicted position. This problem can, as outlined above, be overcome by using a simple predictor, such as an α-β tracker, which also has the advantage of performing a temporal smoothing of pose estimates, to reduce measurement noise. In practice, however, it has been found difficult to set the tracker parameters as the measurement noise depends on the number and position of edges found, and also on the current position of the object. In some extreme cases, the edges detected in a particular frame may not define all the object's degrees of freedom, and consequently a more sophisticated predictor/filter is required.
The formulation of a Kalman filter and the manner in which the required models are constructed for use in filtering pose estimates are outlined below.
(a) Kalman Filter Outline
Let x̂t be a vector that represents the estimated state of a system at time t. Given a new measurement, yt, made at that same instant, the state vector estimate is updated to x̂'t, given by
x̂'t = x̂t + K(yt − H x̂t)
where K is the Kalman gain matrix, and H is a matrix which maps the estimated state to the corresponding expected observation.
Between observations it is assumed that the true state of the system evolves according to
xt+1 = A xt + εt
where εt is a random variable of zero mean and covariance defined by the matrix Qt. Thus given x̂'t, x̂t+1 = A x̂'t. If the error in the observation yt has zero mean and covariance Rt, and the error in x̂t has zero mean and covariance Pt, then the optimal choice of K (that which minimises the trace of P't, the covariance of x̂'t) is
K = Pt Hᵀ [H Pt Hᵀ + Rt]⁻¹, and P't = Pt − K H Pt
In the time to the next observation, however, confidence in the state vector estimate worsens because of the uncertainty in evolution, thus
Pt+1 = A P't Aᵀ + Qt
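The filter equations above translate directly into code. The sketch below is a generic update/predict pair under the stated assumptions, with matrix names following the notation of the description; it is illustrative rather than a description of the reported implementation.

```python
import numpy as np

def kalman_update(x, P, y, H, R):
    """Measurement update: x' = x + K(y - Hx) and P' = P - KHP,
    with the gain K = P H^T [H P H^T + R]^-1."""
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    x_new = x + K @ (y - H @ x)
    P_new = P - K @ H @ P
    return x_new, P_new

def kalman_predict(x, P, A, Q):
    """Time update between observations: x -> Ax and P -> A P A^T + Q."""
    return A @ x, A @ P @ A.T + Q
```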
(b) The Object Motion Model
In this application of Kalman filtering, the tracking system pose estimate, yt, is the 6-vector pose estimate found by the minimisation of E(q). In the simplest moving object case we assume uniform motion, so the state vector contains both position and velocity terms.
In particular, x = (r, φ, dr/dt, dφ/dt)ᵀ, where r is the object's position 3-vector (relative to the sensor), and φ is a rotation 3-vector defining its orientation;
A = | I6  I6 |        H = [ I6  06 ]
    | 06  I6 |
where I6 and 06 are the 6-by-6 identity and zero matrices.
It is assumed that the above motion model is accurate apart from a random fluctuation in velocities, i.e.
Q = | 03  03  03  03  |
    | 03  03  03  03  |
    | 03  03  Q1  Q2ᵀ |
    | 03  03  Q2  Q3  |
where Q1, Q2 and Q3 are to be determined.
In many practical situations neither the object being tracked nor the sensor are rigidly fixed, as for example when the object is one aircraft and the sensor is mounted on another flying nearby. In such cases Q will be the combined effect of the motion of the object in global coordinates and the motion of the sensor in global coordinates.
In particular, if the sensor rotates, the object will move in sensor coordinates and the greater the distance between sensor and object, the greater will be the amount of movement.
Suppose u and u' are the velocities of the sensor and object in global coordinates and ω and ω' are their rotation-vector rates, and that all the (co)variances of these quantities are known. Thus if υ and c are the variance and covariance operators, υ(ω1) and c(ω1, u'2) etc. are input parameters for the model. Q1 is simply the sum of the covariances of ω and ω'.
Q2 and Q3 depend on the current position of the object relative to the sensor. The velocity of the object in sensor coordinates is v = ω × r − u + u', where r is the position of the object in sensor coordinates. Suppose ω is subjected to a small fluctuation Δω about its mean value, and similarly ω', u and u'; then the corresponding fluctuation in v is Δv = Δω × r − Δu + Δu', i.e.
Δvi = εijk Δωj rk − Δui + Δu'i
where εijk is the Levi-Civita symbol and the summation convention is in force. For small fluctuations in the rotation rate of the sensor and object, the corresponding change in relative rotation rate is ΔΩi = Δωi − Δω'i.
Q2, the covariance of the rotational and velocity components of the object in sensor coordinates, is
c(Ωi, vp) = E[ (Δωi − Δω'i)(εpjk Δωj rk − Δup + Δu'p) ]
In very many cases, such as aircraft flying nearly parallel, it can be assumed that the motion of the sensor and that of the object are uncorrelated, i.e. E[Δωi Δu'p] = E[Δω'i Δωj] = E[Δω'i Δup] = 0. A further simplification, which may not be accurate for aerodynamic bodies, is that the individual components of motion of the sensor and object are uncorrelated with each other, i.e. E[Δωi Δup] = E[Δω'i Δu'p] = 0 for all i and p, and E[Δωi Δωj] = 0 for i ≠ j. In these circumstances the above expression simplifies to
Q2 = |  0           −υ(ω1) r3    υ(ω1) r2 |
     |  υ(ω2) r3     0          −υ(ω2) r1 |
     | −υ(ω3) r2     υ(ω3) r1    0        |
Similarly Q3, the covariance of the components of the velocity of the object in sensor coordinates, is
c(vi, vp) = E[ (εijk Δωj rk − Δui + Δu'i)(εpmn Δωm rn − Δup + Δu'p) ]
With assumptions about sensor and object motion as above, this simplifies to the symmetric matrix Q3 with diagonal elements
Q3(1,1) = r3²υ(ω2) + r2²υ(ω3) + υ(u1) + υ(u'1)
Q3(2,2) = r1²υ(ω3) + r3²υ(ω1) + υ(u2) + υ(u'2)
Q3(3,3) = r1²υ(ω2) + r2²υ(ω1) + υ(u3) + υ(u'3)
and off-diagonal elements
Q3(1,2) = −r2 r1 υ(ω3),  Q3(1,3) = −r3 r1 υ(ω2),  Q3(2,3) = −r3 r2 υ(ω1)
Thus Qt can be calculated for each update, from the model input parameters, using the current best estimate of the relative position of the sensor and object.
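Under the simplifying assumptions stated above, the blocks Q2 and Q3 can be assembled mechanically from the current estimate of r and the variance parameters, for example as in the sketch below. The argument names are illustrative; Q1, being simply the sum of the two rotation-rate covariances, is omitted.

```python
import numpy as np

def q2_block(r, var_w):
    """Q2: covariance between the relative rotation rate and the object velocity in
    sensor coordinates.  r is the object position in sensor coordinates; var_w holds
    the variances of the three sensor rotation-rate components."""
    r1, r2, r3 = r
    w1, w2, w3 = var_w
    return np.array([[0.0,      -w1 * r3,  w1 * r2],
                     [ w2 * r3,  0.0,     -w2 * r1],
                     [-w3 * r2,  w3 * r1,  0.0]])

def q3_block(r, var_w, var_u, var_u_obj):
    """Q3: covariance of the object velocity in sensor coordinates, built from the
    sensor rotation-rate variances var_w, the sensor velocity variances var_u and
    the object velocity variances var_u_obj."""
    r1, r2, r3 = r
    w1, w2, w3 = var_w
    diag = [r3**2 * w2 + r2**2 * w3 + var_u[0] + var_u_obj[0],
            r1**2 * w3 + r3**2 * w1 + var_u[1] + var_u_obj[1],
            r1**2 * w2 + r2**2 * w1 + var_u[2] + var_u_obj[2]]
    Q3 = np.diag(np.asarray(diag, dtype=float))
    Q3[0, 1] = Q3[1, 0] = -r2 * r1 * w3
    Q3[0, 2] = Q3[2, 0] = -r3 * r1 * w2
    Q3[1, 2] = Q3[2, 1] = -r3 * r2 * w1
    return Q3
```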
(c) The Measurement Model
The confidence attached to a particular refined pose estimate depends on how well defined the minimum of the quadratic surface of the function E(q) = Σi [Li + q·ci]² is, where the ci depend on the detected feature points at the initially estimated object pose. Imagine an ellipse touching the quadratic surface at the minimum such that the second derivatives of the quadratic surface and the ellipse match. This ellipse is defined by the equation
(q − q') [Σi ci ciᵀ] (q − q')ᵀ = 1
where q' is the centre of the ellipse. It is assumed that the axes and orientation of this ellipse provide a good estimate of the covariance, R, of the resulting measurement. Thus by analogy with normal distribution theory, R is set as follows:
R = [Σi ci ciᵀ]⁻¹
Unfortunately, when only a few control points are detected this inverse cannot be calculated because of ill-conditioning. There are also certain situations when the detected control points do not fully define the pose of the object; in these cases the coefficient matrix is singular. The formula defining the Kalman filter gain can be rearranged, however, to avoid the need to compute the inverse, i.e.
K = Pt Hᵀ Rt⁻¹ [H Pt Hᵀ Rt⁻¹ + I]⁻¹, where Rt⁻¹ = Σi ci ciᵀ
With this formulation for K, the filter gain can be robustly calculated for each filter cycle, weighting each measurement according to its expected accuracy.
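A sketch of this rearranged gain computation follows. C is assumed to be the N-by-6 array of the ci vectors from the pose minimisation, so that Rt⁻¹ = Σi ci ciᵀ = CᵀC, and H is the observation matrix defined for the motion model; the function name is illustrative.

```python
import numpy as np

def robust_kalman_gain(P, H, C):
    """Compute K = P H^T R^-1 [H P H^T R^-1 + I]^-1 without ever inverting R, so the
    gain remains well defined even when too few control points are detected and
    R^-1 = C^T C is singular."""
    R_inv = C.T @ C
    M = H @ P @ H.T @ R_inv + np.eye(R_inv.shape[0])
    return P @ H.T @ R_inv @ np.linalg.inv(M)
```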
The tracking system algorithms have been implemented in Pascal running on a VAX3400 machine hosting an IT100 card for image capture and display. Although the VAX3400 is a general purpose machine, the tracking system algorithms run fast enough for video rate processing to be demonstrated.
Furthermore, video data has been recorded from a forward looking sensor mounted in the nose of an unmanned aircraft. As it approaches an airfield and lands, under the control of a remote pilot, the video shows the selected runway growing in the field of view.
Using a simple model of the airfield runways as shown in FIGURE 5, the airfield (the object) is smoothly tracked after being manually acquired. FIGURE 6 shows a typical track, plotting (in ground coordinates) the height and lateral displacement of the aircraft as a function of range as it descends to land.
FIGURE 7 shows the attitude of the aircraft. In this figure a set of tracks have been plotted, obtained by repeatedly processing a single segment of recorded video data. This figure gives an indication of the good repeatability of the technique up to the point, at about 800 metres from touch down, where the model has expanded beyond the camera's field of view. The discrepancies at about 1400 metres correspond to a temporary burst of video interference. The tracker is seen to be notably stable despite this period of very noisy data.
The tracking system according to the present invention may be realised by implementing the processing algorithms in software within a suitable programmable processor.
As shown in FIGURE 4, the main features of the tracking system according to the invention for a 3D tracker for the control of an unmanned aircraft are an airborne equipment 6 and ground equipment 7.
The airborne equipment 6 comprises a video camera 8 the output of which is connected to a video frame store 9 and a transmission link unit 10, a programmable processor 11 connected to the outputs of the video frame store 9 and the unit 10, and a flight control unit 12 connected to the output of the processor 11.
The ground equipment 7 includes a ground station 13 connected to a transmission link unit 14. The transmission link 15 established by the units 10 and 14 enables the airborne equipment 6 to be linked to the ground station 13 and to thereby be remotely operated by a human pilot.
In operation, the video camera 8 is placed in a position to look in a forward direction. The video signals are connected to both the transmission link unit 10 and the video frame store 9.
The transmission link 15 transmits down to the ground station 13 the video pictures generated by the video camera 8. The ground station 13 transmits up via the link 15 the initial approximate position of the runway, to be used to start the 3D model-based tracker algorithms described above.
The video frame store 9 temporarily holds a copy of the image brightness values of each video picture frame generated by the camera 8.
The programmable processor 11 receives the initial estimate of the runway position and attitude and selectively, according to the tracker algorithm, reads parts of the image brightness values held in the video frame store 9 and calculates new estimates of position and attitude according to the tracker algorithm.
The ground station 13 has an operator display and an appropriate device for allowing the operator to define the initial estimate of runway position and attitude (relative to the aircraft).
The flight control unit 12 receives the series of the runway position and orientation estimates, calculated in the processor 11, to generate the control signals to move the aircraft's control surfaces (rudder, elevators etc) to maintain the required course.
To improve the estimates of position and attitude resulting from the processing of each image frame, a Kalman filter (also operating at or near video rate) is utilised in a manner outlined above to smooth the sequence of measurements obtained from the processing of a sequence of video frames.
The advantages of the tracking system according to the present invention are that accurate measurements may be made with inexpensive sensors, major modification to the object to be tracked is not required, expensive computer hardware is not required for fast (i.e. video rate) processing and the overall technique is robust in that it does not rely on the perfect operation of the initial processing steps.

Claims (10)

1. A tracking system for tracking the relative positions of a video camera and a three dimensional object between which there is unknown but continuous relative motion, the tracking system including a video camera for viewing the object, first means for defining a visual model of the object, the model consisting of a specification of a set of control points which are located at high contrast edges of the object and whose position in the video image define the position and orientation of the object relative to the camera, and second means for comparing the visual model with each frame of the camera's image of the object and for measuring the difference between the two images at each of the control points, the difference measurements for each frame being used to modify an initial or previous estimate of the position and attitude of the object relative to the camera which in turn is used to revise the predicted position of the set of control points on the image for comparison with the next succeeding frame of the video camera whereby the visual model and camera image are gradually brought into conformity.
2. A tracking system as claimed in claim 1 wherein the second means includes a Kalman filter for smoothing the sequence of difference measurements from the processing of a sequence of video frames and predicting position and orientation in future frames.
3. A tracking system as claimed in claim 1 or claim 2 wherein the second means for comparing the visual model with each frame of the camera's image of the object includes means for calculating a line of search in the image for each control point, means for calculating a brightness threshold for each line of search to be mid-way between the image brightness at each end of the line of search, means for sequentially measuring the image brightnesses along the line of search, a true detected edge position being found where the measured brightness crosses the brightness threshold.
4. A tracking system as claimed in claim 3 wherein the direction of the line of search is substantially perpendicular to the expected edge direction of the visual model on which the associated control point is located.
5. A tracking system as claimed in any one of the preceding claims wherein the means for modifying an initial or previous estimate of the position and attitude of the object relative to the camera which in turn is used to revise the predicted position of the set of control points on the image is effected by minimising the rootmean-square of the difference measurements.
6. A tracking system as claimed in any one of preceding claims substantially as hereinbefore described with reference to the accompanying drawings.
7. An aircraft landing system including a tracking system as claimed in any one of the preceding claims.
8. An aircraft landing system as claimed in claim 7 including a video camera located on an aircraft for viewing objects in the direction of movement of the aircraft, video frame storage means for the temporary storage of a copy of the image brightness values of each frame of the output of the video camera, transmission link means for transmitting the output of the video camera to a ground station and for transmitting from the ground station to the aircraft the visual model of the landing runway, a programmable processor for comparing the visual model received from the ground station with the image brightness values held in the video frame storage means to effect modification of an initial or previous estimate of the position and attitude of the object relative to the camera which in turn is used to revise the predicted position of the set of control points on the image for comparison with the next succeeding frame of the video camera and thereby gradually bring the visual model of the runway into conformity with the camera image of the runway, and flight control means for generating, in response to the output of the processor, control signals for maintaining the aircraft on a required course.
9. An aircraft landing system as claimed in claim 7 or claim 8 and substantially as hereinbefore described with reference to the accompanying drawings.
10. A dynamic measurement and control system including a tracking system as claimed in any one of claims 1 to 6.
GB9114518A 1990-07-16 1991-07-05 Tracking arrangements and systems Expired - Lifetime GB2246261B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB909015594A GB9015594D0 (en) 1990-07-16 1990-07-16 Tracking arrangements and systems

Publications (3)

Publication Number Publication Date
GB9114518D0 GB9114518D0 (en) 1991-08-21
GB2246261A true GB2246261A (en) 1992-01-22
GB2246261B GB2246261B (en) 1994-05-11

Family

ID=10679159

Family Applications (2)

Application Number Title Priority Date Filing Date
GB909015594A Pending GB9015594D0 (en) 1990-07-16 1990-07-16 Tracking arrangements and systems
GB9114518A Expired - Lifetime GB2246261B (en) 1990-07-16 1991-07-05 Tracking arrangements and systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB909015594A Pending GB9015594D0 (en) 1990-07-16 1990-07-16 Tracking arrangements and systems

Country Status (1)

Country Link
GB (2) GB9015594D0 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2105941A (en) * 1978-10-11 1983-03-30 Emi Ltd Correlation of representations of a reference and a scene
GB2094088A (en) * 1979-01-09 1982-09-08 Emi Ltd Correlation arrangements
EP0139292A2 (en) * 1983-10-17 1985-05-02 Hitachi, Ltd. Navigation apparatus for mobile system

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2267360A (en) * 1992-05-22 1993-12-01 Octec Ltd Method and system for interacting with floating objects
GB2267360B (en) * 1992-05-22 1995-12-06 Octec Ltd Method and system for interacting with floating objects
GB2272343A (en) * 1992-11-10 1994-05-11 Gec Ferranti Defence Syst Automatic aircraft landing system calibration
EP0609948A1 (en) * 1993-02-02 1994-08-10 Nikon Corporation Camera having target follow up function
US5475466A (en) * 1993-02-02 1995-12-12 Nikon Corporation Camera having target follow up function
EP0631214A1 (en) * 1993-05-27 1994-12-28 Oerlikon Contraves AG Method for the automatic landing of aircrafts and device for implementing it
US5792147A (en) * 1994-03-17 1998-08-11 Roke Manor Research Ltd. Video-based systems for computer assisted surgery and localisation
WO1996009207A1 (en) * 1994-09-19 1996-03-28 Siemens Corporate Research, Inc. Autonomous video-based aircraft docking system, apparatus, and method
WO1997018484A1 (en) * 1995-11-14 1997-05-22 Israel Aircraft Industries Ltd. Automatic aircraft landing
US6154693A (en) * 1995-11-14 2000-11-28 Israel Aircraft Industries Ltd. Automatic aircraft landing
US6542086B2 (en) 1997-09-22 2003-04-01 Siemens Aktiengesellschaft Docking system for airport terminals
WO2001027651A1 (en) 1999-10-12 2001-04-19 Honeywell Inc. Method and apparatus for navigating an aircraft from an image of the runway
WO2002044749A1 (en) * 2000-11-28 2002-06-06 Roke Manor Research Limited Optical tracking systems
US6926673B2 (en) 2000-11-28 2005-08-09 Roke Manor Research Limited Optical tracking systems
WO2003046290A1 (en) 2001-11-21 2003-06-05 Roke Manor Research Limited Detection of undesired objects on surfaces
WO2004002352A2 (en) 2002-07-01 2004-01-08 Claron Technology Inc. A video pose tracking system and method
US6978167B2 (en) 2002-07-01 2005-12-20 Claron Technology Inc. Video pose tracking system and method
US7374103B2 (en) * 2004-08-03 2008-05-20 Siemens Corporate Research, Inc. Object localization
GB2449517A (en) * 2007-03-08 2008-11-26 Honeywell Int Inc Vision based navigation and guidance system
GB2449517B (en) * 2007-03-08 2009-08-12 Honeywell Int Inc Vision based navigation and guidance system
US7881497B2 (en) 2007-03-08 2011-02-01 Honeywell International Inc. Vision based navigation and guidance system
CN101281644B (en) * 2007-03-08 2012-04-18 霍尼韦尔国际公司 Vision based navigation and guidance system
AU2008201108B2 (en) * 2007-03-08 2013-07-25 Honeywell International Inc. Vision based navigation and guidance system
WO2013069012A1 (en) * 2011-11-07 2013-05-16 Dimensional Perception Technologies Ltd. Method and system for determining position and/or orientation
US8866938B2 (en) 2013-03-06 2014-10-21 International Business Machines Corporation Frame to frame persistent shadow reduction within an image
US8872947B2 (en) 2013-03-06 2014-10-28 International Business Machines Corporation Frame to frame persistent shadow reduction within an image
US9844324B2 (en) 2013-03-14 2017-12-19 X-Nav Technologies, LLC Image guided navigation system
EP2866049A1 (en) * 2013-10-14 2015-04-29 Guidance Navigation Limited Tracking device
EP2869082A1 (en) * 2013-10-14 2015-05-06 Guidance Navigation Limited Tracking device
US9500746B2 (en) 2013-10-14 2016-11-22 Guidance Navigation Limited Tracking device
EP2866048A1 (en) * 2013-10-14 2015-04-29 Guidance Navigation Limited Tracking device
US9402691B2 (en) 2014-09-16 2016-08-02 X-Nav Technologies, LLC System for determining and tracking movement during a medical procedure
US9943374B2 (en) 2014-09-16 2018-04-17 X-Nav Technologies, LLC Image guidance system for detecting and tracking an image pose
GB2545908A (en) * 2015-12-23 2017-07-05 Guidance Marine Ltd Markerless tracking of an object
US10679375B2 (en) 2015-12-23 2020-06-09 Guidance Marine Limited Markerless tracking of an object
GB2545908B (en) * 2015-12-23 2021-07-14 Guidance Marine Ltd Markerless tracking of an object

Also Published As

Publication number Publication date
GB9114518D0 (en) 1991-08-21
GB2246261B (en) 1994-05-11
GB9015594D0 (en) 1991-04-24

Legal Events

Date Code Title Description
PE20 Patent expired after termination of 20 years

Expiry date: 20110704