GB2289389A - Missile location - Google Patents
- Publication number
- GB2289389A GB9508451A
- Authority
- GB
- United Kingdom
- Prior art keywords
- terrain
- picture
- micro
- pattern
- missile
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/12—Target-seeking control
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G7/00—Direction control systems for self-propelled missiles
- F41G7/34—Direction control systems for self-propelled missiles based on predetermined target position data
- F41G7/343—Direction control systems for self-propelled missiles based on predetermined target position data comparing observed and stored data of target position or of distinctive marks along the path towards the target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
Description
Method and Device of Updating the Inertial Navigation of a
Missile Autonomously Heading for a Remote Target
Technical Field
The invention relates to a method of updating the inertial navigation of a missile heading for a remote target by means of an image-generating sensor looking sidewards to the flight direction and detecting the terrain flown-over, the sensor providing terrain data, by comparison of which with known terrain data a position of the missile is obtained, this position, in turn, being compared with the position determined by the inertial navigation, the inertial navigation system being corrected in accordance with this comparison.
Furthermore, the invention relates to a device for updating the inertial navigation of a missile autonomously heading for a remote target, such device carrying out the aforementioned method.
Background Art
Low-flying autonomous missiles are, for example, the so-called "cruise missiles". The navigation of such missiles is based primarily on inertial navigation. An inertial sensor unit constructed with gyros and accelerometers provides data about position, velocity and heading of the missile. Such inertial sensor units are subject to drift. The indicated position and heading of the missile slowly diverge from the true values, and the errors grow larger the longer the flight continues. It is therefore necessary to check the position and heading of the missile from time to time and to apply corrections to the inertial sensor unit.
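The drift behaviour described above can be illustrated with a minimal sketch (not from the patent): a constant accelerometer bias, integrated twice, produces a position error that grows quadratically with flight time. All numbers are illustrative assumptions.

```python
# Sketch of inertial drift: how an uncorrected accelerometer bias makes
# the indicated position drift away from the true one over time.
# The bias value and flight times are illustrative assumptions.

def position_error(accel_bias, t):
    """Position error from a constant accelerometer bias after time t:
    integrating a constant acceleration error twice gives 0.5 * b * t^2."""
    return 0.5 * accel_bias * t * t

# A tiny bias of 0.001 m/s^2 grows quadratically with flight time:
for t in (60.0, 300.0, 600.0):            # 1, 5 and 10 minutes
    print(t, position_error(1e-3, t))      # about 1.8 m, 45 m, 180 m
```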
It is possible to update the position by receiving and processing signals from navigation satellites (GPS). The position determination based thereon ensures high precision. With military equipment, however, it is often required that the navigation be absolutely autonomous.
Furthermore, it is known to scan the terrain flown-over by means of a radar set provided on board the missile. But this type of position check does not meet the requirement of absolute passivity, that is, the interdiction of the emission of telltale active radiation.
Furthermore, it is known to take snapshot-like individual pictures of the terrain flown-over by means of a passive, image-generating sensor. If at least three known and identified objects are in the picture simultaneously, both the position and the orientation of the sensor at the moment of taking the picture can be computed from the angles between these objects appearing in the picture and the angles relative to the direction of view of the sensor by applying a three-dimensional "three-point problem". If the image generating sensor detects less than three objects, it is possible to include the flight path between the taking of two individual pictures into the computation, the flight path being determined by means of the inertial sensor unit (R. Koch, R. Bader, W. Hinding: "A Study of an Integrated Image and Inertial Sensor System" in "Agard Conference Proceedings" No. 474 (1990)).
The applicability of this procedure depends on sufficiently many identified objects being available in the terrain. The errors of the measurement largely depend on the position of these objects relative to the missile. The requirement of unambiguous identification necessitates an expensive recognition process. Depending on the direction of view, it may be impossible to compare, for the purpose of identifying the object, the image contents directly with a stored model of the object. Rather, it is necessary, with an oblique direction of view, to make a quantitative comparison of the expected object with the seen object through a perspective transformation of the terrain model stored in the missile into the image plane. Only with the aid of this comparison can the position and the heading be determined and, if necessary, corrected.
Between the updating objects, the missile has to navigate "blind" with the inertial navigation only. If the updating objects are located close to one another, confusion of objects having similar appearance may occur due to positional deviations caused by drift. If one updating object fails to show up, the positional deviation may become so large that the next but one updating object cannot be found any longer.
DE-A-3,412,533 describes an image generating sensor for the three-dimensional covering of a scene, in particular for use in industrial automation. A sequence of pictures is taken from different directions. Pairs of pictures are generated, from which a stereoscopic picture of the covered scene can be obtained. Image processing is effected by means of a parallel-processing, digital network for making quick correlation comparisons of image sections.
EP-A-0,122,048 describes a parallelly-operating data processor.

Disclosure of the Invention
It is the object of the invention to provide a method of the type mentioned in the beginning for the updating of the inertial navigation of a missile autonomously heading for a remote target, wherein the information provided by the image generating sensor about the terrain flown-over is optimally utilized, the updating of the inertial navigation by this information is effected quasi-continuously, and the updating is independent of the presence of prominent updating objects in the terrain flown-over.
According to the invention this object is achieved with a method of the type mentioned in the beginning in that
(a) the image generating sensor, during the movement of the missile over the terrain, continuously takes pictures of the terrain flown-over from different missile positions,
(b) these pictures are stored electronically,
(c) a three-dimensional representation of the terrain is computed by stereo-picture evaluation from stored pictures and the associated position differences of the missile derived from the inertial navigation,
(d) the computed representation of the terrain is compared with a stored model of the terrain, and the position and the heading of the missile are determined therefrom.
A device for the carrying-out of the method is characterized by
(a) an image generating sensor looking sidewards to the flight direction at a finite angle and covering the terrain flown-over, this sensor being arranged to take pictures of the terrain flown-over from different missile positions, while the missile moves over the terrain, to generate a picture sequence,
(b) a memory for electronically storing the pictures taken by the image generating sensor,
(c) computer means with means for computing by stereo-picture evaluation a three-dimensional representation of the terrain from stored pictures and the associated position differences of the missile derived from the inertial navigation,
(d) means for storing a three-dimensional model of the terrain flown-over,
(e) means for comparing the computed three-dimensional representation of the landscape with the stored three-dimensional model of the landscape, and
(f) means for determining the position and heading of the missile from the comparison of the computed representation of the terrain and the stored model of this terrain.
Modifications of the invention are subject matter of the sub-claims.
An embodiment of the invention is described in greater detail hereinbelow with reference to the accompanying drawings.
Fig. 1 is a schematic-perspective representation and shows how a terrain is observed consecutively from two positions of a missile by means of an image generating sensor attached to the missile.
Fig.2 shows the image contents covered by the image generating sensor in the two positions.
Fig. 3 illustrates the determination of position by pattern comparison of the observed terrain with a stored terrain.
Fig. 4 is a block diagram and illustrates a device for the updating of the inertial navigation of a missile heading for a remote target by means of an image generating sensor looking sidewards to the flight direction at a finite angle and covering the landscape flown-over.
Fig. 5 is a schematic representation and illustrates the image processing of micro-patterns in the picture of the terrain by means of a parallel computer.
Preferred Embodiment of the Invention

Referring to Fig. 1, numeral 10 designates a terrain, which is schematically indicated by a tree 12, two houses 14 and 16 and a path 18 passing therebetween. A missile flies with a heading 20 along a trajectory 22. The missile has an image generating sensor in the form of a video camera. The image generating sensor "looks" sidewards to the flight direction of the missile at a finite angle. The sensor covers, in each position, a field of view of rectangular cross section. Fig. 1 shows the field of view 24 of the sensor for a first position 26 of the missile and the field of view 28 of the sensor for a second position 30.
In the positions 26 and 30, the sensor provides pictures 32 and 34, respectively, of the terrain 10. In picture 32 of Fig. 2, the tree 12 is arranged in front of the house 16. In picture 34 of Fig. 2, the tree 12 is visible between the houses 16 and 14. A stereoscopic representation of the terrain 10 can be computed from the two pictures 32 and 34. The stereo-basis is the distance 36 between the two positions 26 and 30. This distance can be provided by the inertial sensor unit of the missile. The error of the stereo-basis is, as a rule, small, if the pictures are taken at not too large time intervals. A three-dimensional representation of the terrain flown-over results. This three-dimensional representation of the terrain flown-over is compared with a stored model of the terrain. The deviation of the position provided by the inertial navigation and of the heading provided by the inertial navigation from the actual position and the actual heading, respectively, can be determined by a correlation computation. The output of the inertial sensor unit can be corrected and updated correspondingly.
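The stereo computation sketched in this paragraph rests on the usual relation between stereo-basis, disparity and depth. The following minimal example assumes a pinhole camera model; the focal length, baseline and disparity values are illustrative, not taken from the patent.

```python
# Depth of a terrain point from two pictures taken one stereo-basis apart.
# Under a pinhole camera model: depth = focal_length_px * baseline / disparity,
# where the disparity is the shift (in pixels) of the point between the
# two pictures. All values below are illustrative assumptions.

def depth_from_stereo(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("point must move between the two pictures")
    return focal_px * baseline_m / disparity_px

# Baseline of 50 m (the distance 36 between positions 26 and 30, supplied
# by the inertial sensor unit), 800 px focal length, 8 px disparity:
print(depth_from_stereo(800.0, 50.0, 8.0))   # 5000.0 m
```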
This is schematically illustrated in Fig. 3. Referring to Fig. 3, numeral 38 designates a trajectory of the missile, as resulting from the inertial navigation. Accordingly, the missile would be at a position 40 at a moment t0 and at a position 42 at a moment t1, corresponding to the positions 26 and 30 in Fig. 1. In its field of view corresponding to the field of view 28 of Fig. 1, the sensor observes a terrain which, in Fig. 3, is illustrated as a road 46 with junctions. The trajectory 38 (in a map) indicated by the inertial navigation and the observed road, the position of which in the map is referenced to this trajectory 38, are illustrated in dashed lines in Fig. 3. In the stored "map", the road is actually located at the position 48. The trajectory 38 provided by the inertial navigation has to be corrected to provide a "true" trajectory 50 with the positions 52 at the moment t0 and 54 at the moment t1, which is located with respect to the road 48 stored in the map in the same way as the trajectory 38 is located with respect to the road 46 shown in dashed lines.
This can be done in the following way:
Small windows, which are characterized by clear contrasts, are considered in the three-dimensional representation of the terrain. Let it be assumed that these are the areas around the points A and B in Fig. 3. A correlation function with the stored model of the landscape (map) is computed for these windows. Practically, the window is shifted relative to the map until optimal conformity has been reached and the correlation function has a maximum. In this way, the points A' and B' on the map are determined which correspond to the contrast-rich windows around the points A and B, respectively, which are observed by the sensor. No recognition of objects is required for this. Within the range of the relatively small windows, distortions of the representation of the terrain due to inaccuracies of the stereo-basis are less critical than would be the case if the correlation function had been formed over the whole picture contents. The computing expenditure and the required memory capacity are considerably reduced as compared to the latter alternative.
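The window search described above can be sketched as follows. Here conformity is measured with a sum of absolute differences that is minimized (equivalent to maximizing a correlation measure), the search is reduced to one dimension for brevity, and the "map" and window data are made up for illustration.

```python
# Shift a small high-contrast window over a stored map until the sum of
# absolute differences (SAD) is minimal, i.e. conformity is optimal.
# The 1-D "map row" and window below are made-up illustrative data.

def best_match(window, map_row):
    """Return the offset in map_row where the window fits best (SAD minimum)."""
    n = len(window)
    scores = [
        sum(abs(window[i] - map_row[off + i]) for i in range(n))
        for off in range(len(map_row) - n + 1)
    ]
    return scores.index(min(scores))

map_row = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0]    # stored terrain model values
window  = [5, 9, 5]                          # contrast-rich window around point A
print(best_match(window, map_row))           # 3 (window found at offset 3)
```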
The position of the points A and B relative to the position 42 affected by errors is known. From the computation of the three-dimensional terrain structure, the distances of the position 42 from the points A and B of the observed terrain can be derived. From the now known points A' and B' on the map, determined with the aid of the correlation function, it is now possible to determine the point 54 on the map, i.e. the true position. The true position 54 is located in exactly the same relative position with respect to the "map-fixed" points A' and B' as the position 42 provided by the inertial navigation with respect to the points A and B in the observed terrain.
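Under the simplifying assumption that the drift amounts to a pure translation, the construction of the true position from the map-fixed points can be sketched like this; the helper `true_position` and all coordinates are illustrative, not from the patent.

```python
# The true position keeps the same relative location to the map-fixed
# points A', B' as the INS position has to the observed points A, B.
# With the drift modelled as a pure translation, the correction is the
# (averaged) displacement from observed to map-fixed points.
# All coordinates below are illustrative assumptions.

def true_position(ins_pos, observed, map_fixed):
    """observed / map_fixed: lists of matching (x, y) points, e.g. [A, B]."""
    dx = sum(m[0] - o[0] for o, m in zip(observed, map_fixed)) / len(observed)
    dy = sum(m[1] - o[1] for o, m in zip(observed, map_fixed)) / len(observed)
    return (ins_pos[0] + dx, ins_pos[1] + dy)

A,  B  = (120.0, 40.0), (180.0, 55.0)    # points in the observed terrain
A2, B2 = (130.0, 42.0), (190.0, 57.0)    # matched points A', B' in the map
print(true_position((150.0, 50.0), [A, B], [A2, B2]))   # (160.0, 52.0)
```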
When the position of the missile has been determined in the manner described, and the inertial navigation unit has been updated correspondingly, the further movement of the picture points is tracked quasi-continuously. Picture points are defined by small, high-contrast "micro-patterns" of, for example, 3 x 3 pixels. In each picture, a correlation function is formed or a direct comparison is made to ascertain to which location a "micro-pattern" observed in the preceding picture has moved. To this end it is only necessary to consider in each picture the near surroundings of that spot at which the respective micro-pattern was located in the preceding picture. Provision can be made that the picture points move substantially in the direction of the rows of the picture raster of the image generating sensor. There will be displacement vectors for each considered micro-pattern, which represent the movement of the respective micro-pattern over the field of view covered by the sensor. During this procedure, new micro-patterns continuously enter the field of view at the front edge of the field of view, a number of such micro-patterns being selected and the movements of such selected micro-patterns being tracked and stored. Micro-patterns leave the field of view at its rear edge.
A three-dimensional representation of the terrain, as far as it is represented by the micro-patterns, is computed from the positions of the micro-patterns in pairs of different pictures, which are separated by a time interval during which the missile has traversed one stereo-basis. As the micro-patterns are continuously tracked during their passage through the field of view of the sensor, their positions in each of the pictures of the sequence are known. Therefore, it is not necessary to recover micro-patterns in pictures which are taken at substantially different times. The three-dimensional representation of the terrain is computed as follows: When a micro-pattern enters the field of view of the sensor at the front edge thereof, the running number of the associated picture of the sequence is stored and, thereby, the time when this micro-pattern enters the field of view of the sensor. In the consecutive pictures of the sequence, the micro-pattern travels through the field of view and, eventually, leaves the field of view at its opposite rear edge. The running number of the associated picture is also stored and, thereby, the time when the micro-pattern leaves the field of view. The stereo-basis is obtained from the time difference and the velocity of the missile. Thereby, the location of the micro-pattern can be computed three-dimensionally. The same procedure is applied to all considered micro-patterns travelling through the field of view. Thereby, a three-dimensional representation of the terrain scanned by the field of view of the sensor is continuously obtained.
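The stereo-basis computation described above can be sketched as follows; the frame rate, missile speed and picture numbers are illustrative assumptions.

```python
# Stereo-basis of a micro-pattern from the running numbers of the
# pictures at which it enters and leaves the field of view.
# Frame rate, speed and picture numbers are illustrative assumptions.

def stereo_basis(frame_in, frame_out, frame_rate_hz, speed_mps):
    """Distance flown while the micro-pattern crossed the field of view."""
    dt = (frame_out - frame_in) / frame_rate_hz   # time difference
    return speed_mps * dt

# Missile at 250 m/s, 25 pictures per second; a micro-pattern entered
# at picture 100 and left at picture 200 -> 4 s -> 1000 m stereo-basis:
print(stereo_basis(100, 200, 25.0, 250.0))   # 1000.0
```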
On the basis of the micro-patterns, high-contrast windows in the observed terrain, which are used for the updating of the inertial navigation, can be continuously tracked from picture to picture. The quasi-continuously computed three-dimensional representation of the terrain in the window is continuously compared with a stored terrain pattern. Therefrom, the true position of the missile can be continuously determined by forming a correlation function over the window between this representation of the terrain and the terrain model, as has been described above. In this procedure, the computation of the three-dimensional representation of the terrain and the correction of the position provided by the inertial navigation alternate continuously. Thereby, the trajectory indicated by the inertial navigation deviates only slightly from the true trajectory. Correspondingly, the computation of the correlation function for the determination of the deviation can confine itself to the immediate neighborhood of the window fixed in the field of view of the sensor.
The continuous updating of the comparison between the presently observed terrain and the on-board terrain model (map) also permits the inclusion of picture contents which, as small individual objects, would be difficult to identify. The terrain model permits prediction of what can be observed by the sensor in the picture, as the position of the window in the terrain pattern is, to a large extent, known. It is then possible to "search" and also process such picture contents. This permits utilization of picture details which are little characteristic and, by themselves, would not yet permit position information to be derived therefrom. Thereby redundancy is achieved, which is required, in particular, in little-structured terrain.
Fig. 4 shows, as a block diagram, a device for the carrying-out of the described method.
Referring to Fig. 4, numeral 56 designates an image generating sensor. The image generating sensor is a video camera. The sensor 56 supplies pictures to a parallel computer 58. The parallel computer 58 compares the micro-patterns with the required high processing speed. The parallel computer 58 provides, for each micro-pattern which is observed during its travel through a plurality of pictures, the length of the displacement vector of the micro-pattern and the picture numbers of the first and last pictures in which this micro-pattern was observed.
A following computer stage 60 is composed of standard computer devices, such as conventional signal processors. The computer stage 60 receives the flight velocity vector from the inertial sensor unit and the flight time between that picture of the sequence of pictures of the sensor 56, in which a considered micro-pattern occurs for the first time, and that picture, in which the micro-pattern reaches the rear edge of the field of view. The stereo-basis is computed from the time difference between the pictures and the flight velocity vector. Therewith, the location of the terrain portion represented by the micro-pattern is computed by triangulation. Furthermore, the computer stage causes projection of the three-dimensional terrain representation obtained from the various micro-patterns onto a horizontal plane. Thereby, a two-dimensional terrain representation similar to a map is obtained for comparison with an also two-dimensional, map-like terrain model, which is stored in the missile.
A third computer stage 62 effects comparison of the terrain representation derived from the pictures of the sensor 56 in the described manner with a terrain pattern, which is stored in a memory 64. The computer stage provides the shifting of the terrain representation derived from the pictures of the sensor 56 (in the coordinate system of the terrain model) relative to the terrain model. A navigation computer 66 determines therefrom, in a manner described with reference to Fig. 3, the true position and the true heading of the missile in the terrain pattern (the map). From the deviations of position and heading of the missile, correction values for the inertial sensor unit are derived.
Fig.5 illustrates the image processing of micro-patterns in the picture of the terrain by means of the parallel computer 58.
In the schematic illustration of Fig.5, numeral 68 designates memory elements in which picture elements of the picture covered by the sensor 56 are stored. Fig.5 shows three rows of such memory elements 68, in which the picture elements of three rows of the picture are stored. The parallel computer 58, which represents the first computer stage, contains a one-dimensional array of processor elements 70. A local memory 72 having a plurality of memory cells is associated with each processor element 70.
Out of the picture elements (pixels) of an (n-1)th picture, a micro-pattern 74 of 3 x 3 pixels is stored at its location in the (n-1)th picture. To this end, the (n-1)th picture has been scanned row by row by the processor elements 70. The pixels of the various rows are filed in the memory elements in the various planes of the local memory, thus "one below the other" in Fig. 5.
In the n-th picture, the same micro-pattern, due to the movement of the sensor 56, appears as micro-pattern 74A at a different location of the picture, namely further to the left in Fig. 5. The shift is to be determined. This is done by means of a correlation method. In order to determine the length of the shift vector of the micro-pattern 74 from picture to picture, the amounts of the differences for all nine pixels of the micro-pattern and of a micro-pattern to be compared therewith are formed and summed up:
K = Σ |Pixel(B_n) - Pixel(B_n-1)|

This provides a measure of the degree of conformity of 3x3 micro-patterns. If it is assumed that the flight direction of the missile is exactly parallel to the rows of the pictures, then the micro-pattern 74 needs only be shifted within the rows to the left in Fig. 5, until the value of K becomes a minimum. The minimum of a correlation function

K(Θ) = Σ |Pixel_n(x - Θ, y) - Pixel_n-1(x, y)|

with Θ as shift coordinate is determined. The position of the minimum is designated by Θ_min. The sum is again taken over all nine pixels surrounding the original position and the sought position. If Θ is counted from the position of the micro-pattern in the (n-1)th picture, Θ_min is the length of the shift vector 76 for the respective micro-pattern between two pictures. K(Θ_min) is a measure of the quality of the conformity. The conformity is the better, the smaller K(Θ_min) is.

In detail, the procedure is as follows: The n-th picture is processed row by row. At first, the three pixels 86, 88 and 90 of the first row of the micro-pattern are compared with the pixels which have been stored from the (n-1)th picture in the memory elements 92, 94 and 96. The processor elements 80, 82 and 84 form the differences of the pixel contents. The amounts of these differences are added and are stored in a memory element of the local memory of the central processor element 82. Then, in the same way, the pixels 98, 100, 102 of the second row of the micro-pattern 74A are compared with the pixels which have been stored from the (n-1)th picture in the memory elements 104, 106 and 108 of the local memories of the processor elements 80, 82 and 84, respectively. Again the differences of the pixel contents are formed. The amounts of these differences are added and are added to the difference sum stored in the local memory of the central processor element.
The same happens with the three pixels of the third row of the micro-pattern 74A and the pixels stored in the memory elements 110, 112 and 114. Also here, the differences of the pixel contents and the sum of the amounts of these differences are formed and again added to the difference sum stored in the local memory of the processor element 82 from the other two rows. Thereby, the correlation value of the micro-pattern 74A of the n-th picture with the micro-pattern stored "therebelow" from the (n-1)th picture is stored in the local memory of the central processor element 82, "central" with respect to the micro-pattern 74A.
Subsequently, the micro-pattern 74 is "shifted by one step", i.e. it is compared in the same way with the 3x3 micro-patterns stored in the local memories of the processor elements 82, 84 and 116, and the correlation function is formed. This proceeds step-by-step, until the comparison with the micro-pattern 74 is effected. The micro-patterns 74 and 74A are identical. The correlation function becomes a minimum, ideally zero. Thereby, the micro-pattern 74 of the (n-1)th picture has been "retrieved" in the micro-pattern 74A of the n-th picture. The number of the steps required herefor, the variable Θ, yields the shift vector 76, by which the micro-pattern 74 has been shifted in the field of view of the sensor 56 in the time interval from
the (n-1)th picture to the n-th picture. In the present case, this shift vector extends in the direction of the rows. The amount of the shift vector is added to the sum of the previously determined shift vectors of the respective micro-pattern. The latter sum had been stored in the local memory of the central processor element at the location of the micro-pattern 74. The new sum is stored in the local memory of the processor element 82. The micro-pattern 74A is stored in the local memories of the processor elements 80, 82 and 84. The micro-pattern 74 is erased.
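The row-wise correlation search described above can be sketched in compact form. The two small "pictures" and the 3 x 3 grey values are made-up data, and for simplicity the pattern moves to the right here, whereas the direction of shift in Fig. 5 depends on the viewing geometry.

```python
# Sketch of K(theta) = sum |Pixel_n(x - theta, y) - Pixel_(n-1)(x, y)|
# over the nine pixels of a 3x3 micro-pattern, shifted along the rows
# until K is minimal. The two 3-row "pictures" are made-up grey values.

def sad(pattern, picture, col):
    """Sum of absolute differences of a 3x3 pattern placed at column col."""
    return sum(
        abs(pattern[r][c] - picture[r][col + c])
        for r in range(3) for c in range(3)
    )

def find_shift(pattern, picture, orig_col, max_shift):
    """Try shifts 0..max_shift from the pattern's original column and
    return theta_min, the shift at which K becomes a minimum."""
    scores = [sad(pattern, picture, orig_col + t) for t in range(max_shift + 1)]
    return scores.index(min(scores))

prev = [[10, 200, 10, 0, 0, 0],     # (n-1)th picture, pattern at column 0
        [200, 255, 200, 0, 0, 0],
        [10, 200, 10, 0, 0, 0]]
curr = [[0, 0, 10, 200, 10, 0],     # n-th picture: same pattern, moved by 2
        [0, 0, 200, 255, 200, 0],
        [0, 0, 10, 200, 10, 0]]

pattern = [row[0:3] for row in prev]     # the stored 3x3 micro-pattern 74
print(find_shift(pattern, curr, 0, 3))   # 2 (length of the shift vector)
```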
Now the same operation is repeated with the (n+1)th picture and the n-th picture. This procedure is repeated over the sequence of the pictures.
The procedure described can be carried out in parallel by all processor elements of the row for all valid micro-patterns. Only three rows of the current picture are illustrated in Fig. 5. Actually, the picture has substantially more rows. These rows are processed, with the micro-patterns contained therein, consecutively in the described manner by means of the processor elements 70.
For the adding of new micro-patterns from younger pictures, an admission test for micro-patterns is required. Not every 3x3 matrix of the picture can and should be processed in the manner described. There are homogeneous and (as compared to the picture noise) little-structured areas in the picture with which the described procedure would fail. Therefore, only those 3x3 areas are admitted as valid micro-patterns which have a predetermined minimum distinctiveness, whereby sufficiently marked minima of the correlation function, not falsified by picture noise, can be expected. A criterion herefor is a variance measure.
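The admission test can be sketched as a simple variance threshold; the threshold value and grey values below are illustrative assumptions.

```python
# Admission test sketch: a 3x3 area qualifies as a micro-pattern only if
# its grey-value variance exceeds a threshold, so that the correlation
# minimum stands out against picture noise. The threshold is an assumption.

def variance3x3(block):
    pixels = [p for row in block for p in row]
    mean = sum(pixels) / 9.0
    return sum((p - mean) ** 2 for p in pixels) / 9.0

def is_valid_micro_pattern(block, threshold=100.0):
    return variance3x3(block) >= threshold

flat  = [[50, 51, 50], [50, 50, 49], [51, 50, 50]]       # homogeneous area
edged = [[10, 200, 10], [200, 255, 200], [10, 200, 10]]  # high contrast
print(is_valid_micro_pattern(flat))    # False
print(is_valid_micro_pattern(edged))   # True
```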
In this way, the micro-pattern travels across the field of view of the sensor 56. The stored characteristic data of those micro-patterns which have reached the rear edge of the field of view are read out. These characteristic data include the time interval required for the total travel of the micro-pattern, the length of the total distance travelled by the micro-pattern and, at least, the picture half-tone of the central pixel. The time interval required for the total travel is the time difference between the first appearance of the micro-pattern and its last shifting step. This time interval is obtained from the difference of the associated picture numbers divided by the picture frequency. The length of the total distance of shift results from the difference of the column addresses of the first and last positions of the micro-pattern. In order to determine this distance with an accuracy of fractions of a pixel, it is possible to carry out a parabolic interpolation, for example, between the three measured values of the correlation function surrounding the theoretical minimum of the correlation function. The characteristic data are forwarded to the computer stage 60.
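The parabolic interpolation mentioned above is the standard three-point vertex formula: fit a parabola through the correlation values around the discrete minimum and take its vertex. The correlation values below are illustrative.

```python
# Sub-pixel refinement sketch: fit a parabola through the three
# correlation values surrounding the discrete minimum and take the
# offset of its vertex. Values are illustrative assumptions.

def subpixel_minimum(k_left, k_min, k_right):
    """Offset (between -0.5 and +0.5) of the parabola vertex from the
    discrete minimum position."""
    denom = k_left - 2.0 * k_min + k_right
    if denom == 0:
        return 0.0
    return 0.5 * (k_left - k_right) / denom

# K at shifts theta-1, theta, theta+1:
print(subpixel_minimum(40.0, 10.0, 20.0))   # 0.25 -> true minimum at theta + 0.25
```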
The respective stereo-basis can be determined from the characteristic data. Simple triangulation yields the location of the object detail represented by the micro-pattern.
The projection of the so-obtained three-dimensional terrain representation into a horizontal plane, which is to be carried out at the end, is effected, in the simplest case, by setting the altitude coordinate to zero.
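This simplest projection can be sketched directly; the point list is illustrative.

```python
# Map the triangulated 3-D terrain points into a horizontal, map-like
# plane by dropping the altitude coordinate. The points are illustrative.

def project_to_map(points_3d):
    """Project (x, y, z) terrain points onto the horizontal plane z = 0."""
    return [(x, y) for (x, y, z) in points_3d]

terrain = [(100.0, 40.0, 12.5), (130.0, 42.0, 3.0), (190.0, 57.0, 0.8)]
print(project_to_map(terrain))   # [(100.0, 40.0), (130.0, 42.0), (190.0, 57.0)]
```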
In order to compare, in the computer stage 62, patterns of the projected, two-dimensional terrain representation and the terrain profile stored in the memory 64, at first, at the beginning of the navigation updating, a searched pattern such as the road intersection 'W' in Fig. 3 is encoded as a set of rules for composing the pattern from elementary picture elements like lines and angles. This set of rules represents a kind of "design direction" of the searched pattern. If, from the picture elements contained in the projected terrain representation, the searched pattern as described in the design direction can be retrieved, the pattern is regarded as found. This procedure is substantially identical with the procedure in the above-mentioned paper "Agard Conference Proceedings" No. 474 (1990). The advantage of this procedure for the first search, with only inaccurate knowledge of position, orientation and size of the searched pattern, is the far-reaching tolerance with respect to rather large translational, rotational and scale variations.
If in this way access to the pattern comparison between the terrain representation obtained from the sensor pictures and the stored terrain model (map) has been gained, the position of further objects becoming visible in the course of the flight can be predicted with progressive accuracy. The search areas become small. Scale and rotational deviations become virtually negligible. Then pattern correlation methods as described above with reference to the micro-patterns are used.
Claims

1. A method of updating the inertial navigation of a missile heading for a remote target by means of an image-generating sensor looking sidewards to the flight direction and detecting the terrain flown-over, the sensor providing terrain data, by comparison of which with known terrain data a position of the missile is obtained, this position, in turn, being compared with the position determined by the inertial navigation, the inertial navigation system being corrected in accordance with this comparison, wherein:
(a) the image generating sensor, during the movement of the missile over the terrain, continuously takes pictures of the terrain flown-over from different missile positions, (b) these pictures are stored electronically, (c) a three-dimensional representation of the terrain is computed by stereo-picture evaluation from stored pictures and the associated position differences of the missile derived from the inertial navigation, (d) the computed representation of the terrain is compared with a stored model of the terrain, and the position and the heading of the missile are determined therefrom.
2. A method as claimed in claim 1, wherein sequences of pictures are generated, at short time intervals, by the image generating sensor and are stored, and the three-dimensional representations of the terrain are generated, at these time intervals, from respective pairs of pictures of this sequence, the moments of taking the pictures of each pair differing by a plurality of such time intervals.
3. A method as claimed in claim 2, wherein high-contrast micro-patterns in the pictures of the image generating sensor are continuously tracked and are subjected to the stereo-picture evaluation to compute the three-dimensional representation.
4. A method as claimed in claim 2, wherein only the high-contrast micro-patterns from each picture of the sequence are stored and processed.
5. A method as claimed in claim 4, wherein a micro-pattern in one picture of the sequence is stored, in the next-following picture, the micro-pattern shifted by a shift vector in the field of view of the sensor in the meantime due to the movement of the missile is searched by a correlation method, and the shifted micro-pattern thus searched is stored anew together with characteristic data of the micro-pattern.
6. A method as claimed in claim 5, wherein the characteristic data of each micro-pattern are read out, after the micro-pattern has travelled across the whole field of view of the sensor, for the computation of the position of the micro-pattern in a three-dimensional representation of the terrain.
7. A method as claimed in claim 6, wherein the stored characteristic data of each micro-pattern include, at least, the following information:
the running number of the pictures in which the micro-pattern appeared for the first time and for the last time, the shift vector between the positions of the micro-pattern at its appearance for the first time and at its appearance for the last time, and the picture half-tone of a central pixel of the micro-pattern.
8. A method as claimed in any one of claims 5 to 7, wherein the micro-patterns are processed row-by-row and in parallel in each row.
9. A method as claimed in claim 8, wherein all micro-patterns appearing in one row are processed in parallel.
10. A method as claimed in any one of claims 1 to 9, wherein the three-dimensional representation of the terrain is converted to a two-dimensional representation by projecting the representation onto a plane by computation, and the two-dimensional representation is compared with a stored two-dimensional terrain model.
11. A method as claimed in claim 10, wherein, in order to gain access to the position updating, marked points of the terrain are at first selected, which are structured out of elementary picture components in accordance with particular rules, and after these points have been found and the position has been updated by comparison of the found points with the stored terrain pattern, the further position updating is effected by a pattern correlation method.
12. A device for the updating of the inertial navigation of a missile autonomously heading for a remote target, for carrying out the method of claim 1, comprising: (a) an image generating sensor looking sidewards to the flight direction at a finite angle and covering the terrain flown-over, this sensor being arranged to take pictures of the terrain flown-over from different missile positions, while the missile moves over the terrain, to generate a picture sequence,
(b) a memory for electronically storing the pictures taken by the image generating sensor, (c) computer means with means for computing, by stereo-picture evaluation, a three-dimensional representation of the terrain from stored pictures and the associated position differences of the missile derived from the inertial navigation, (d) means for storing a three-dimensional model of the terrain flown-over, (e) means for comparing the computed three-dimensional representation of the landscape with the stored three-dimensional model of the landscape, and (f) means for determining the position and heading of the missile from the comparison of the computed representation of the terrain and the stored model of this terrain.
13. A device as claimed in claim 5, wherein the computer means have a parallel computer structure.
14. A method of updating the inertial navigation of a missile heading for a remote target substantially as described hereinbefore with reference to the accompanying drawings.
15. A device for carrying out the method according to claim 14 substantially as described hereinbefore with reference to the accompanying drawings and as shown in Figure 4, or in Figure 4 and Figure 5.
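The stereo-picture evaluation of claims 1 to 3 can be sketched with the usual pinhole-camera relations. This is a simplified illustration under assumptions the claims do not spell out (a pinhole model, disparity along one axis); the symbols f, B and d are the conventional focal length, baseline and disparity, not terms from the patent. The distinctive point of the claimed method is that the baseline B is the distance the missile travelled between the two pictures of a pair, derived from the inertial navigation, rather than the fixed separation of a two-camera rig.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo: range Z = f * B / d, with f in pixels, B in metres
    (missile travel between the two pictures, from inertial navigation)
    and d the pixel shift of a tracked micro-pattern between them."""
    if disparity_px <= 0:
        raise ValueError("micro-pattern must shift between the two pictures")
    return focal_px * baseline_m / disparity_px

def triangulate(focal_px, baseline_m, x_px, y_px, disparity_px):
    """3-D position (metres) of a tracked micro-pattern in the frame of
    the first picture: back-project the pixel (x, y) at the range Z."""
    z = depth_from_disparity(focal_px, baseline_m, disparity_px)
    return (x_px * z / focal_px, y_px * z / focal_px, z)
```

Triangulating every tracked micro-pattern this way yields the point cloud from which the three-dimensional terrain representation of claim 1, step (c), would be assembled.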
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE4416557A DE4416557A1 (en) | 1994-05-11 | 1994-05-11 | Method and device for supporting the inertial navigation of a missile autonomously controlling a distant target |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9508451D0 GB9508451D0 (en) | 1995-06-14 |
GB2289389A true GB2289389A (en) | 1995-11-15 |
GB2289389B GB2289389B (en) | 1998-06-24 |
Family
ID=6517824
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9508451A Expired - Fee Related GB2289389B (en) | 1994-05-11 | 1995-04-26 | Method and device of updating the inertial navigation of a missile autonomously heading for a remote target |
Country Status (3)
Country | Link |
---|---|
DE (1) | DE4416557A1 (en) |
FR (1) | FR2719920A1 (en) |
GB (1) | GB2289389B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL153531A (en) * | 2002-12-19 | 2005-11-20 | Rafael Armament Dev Authority | Personal rifle-launched reconnaissance system |
FR2897163B1 (en) * | 2006-02-08 | 2008-04-11 | Thales Sa | METHOD FOR GEO-LOCALIZATION OF ONE OR MORE TARGETS |
DE102007018187B4 (en) * | 2007-04-18 | 2013-03-28 | Lfk-Lenkflugkörpersysteme Gmbh | Method for optimizing the image-based automatic navigation of an unmanned missile |
FR3071624B1 (en) | 2017-09-22 | 2019-10-11 | Thales | DISPLAY SYSTEM, DISPLAY METHOD, AND COMPUTER PROGRAM |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2108347A (en) * | 1981-08-25 | 1983-05-11 | Diehl Gmbh & Co | Terrain surveying |
EP0095660A2 (en) * | 1982-05-19 | 1983-12-07 | Messerschmitt-Bölkow-Blohm Gesellschaft mit beschränkter Haftung | Stereo-photogrammetrical recording and interpretation method |
EP0118324A1 (en) * | 1983-01-25 | 1984-09-12 | Thomson-Csf | Indicating apparatus for film-registered topographical data and its application to aerial navigation |
US4514733A (en) * | 1976-09-15 | 1985-04-30 | Vereinigte Flugtechnische Werke | Control system for aircraft |
EP0157414A2 (en) * | 1984-04-06 | 1985-10-09 | Honeywell Inc. | Range measurement method and apparatus |
EP0381178A1 (en) * | 1989-02-02 | 1990-08-08 | Honeywell Inc. | Method and apparatus for aircraft navigation |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4144571A (en) * | 1977-03-15 | 1979-03-13 | E-Systems, Inc. | Vehicle guidance system |
DE2965307D1 (en) * | 1979-05-09 | 1983-06-09 | Hughes Aircraft Co | Scene tracker system |
DE2938853A1 (en) * | 1979-09-26 | 1981-04-09 | Vereinigte Flugtechnische Werke Gmbh, 2800 Bremen | AREA NAVIGATION SYSTEM FOR AIRCRAFT |
US4520445A (en) * | 1981-03-30 | 1985-05-28 | E-Systems, Inc. | Method of determining the position and velocity of a vehicle |
US5146228A (en) * | 1990-01-24 | 1992-09-08 | The Johns Hopkins University | Coherent correlation addition for increasing match information in scene matching navigation systems |
FR2699666B1 (en) * | 1992-12-22 | 1995-02-24 | Telecommunications Sa | Navigation method for aircraft using a digital terrain model. |
- 1994-05-11 DE DE4416557A patent/DE4416557A1/en not_active Withdrawn
- 1995-04-26 FR FR9505188A patent/FR2719920A1/en active Granted
- 1995-04-26 GB GB9508451A patent/GB2289389B/en not_active Expired - Fee Related
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2362213A (en) * | 1997-07-14 | 2001-11-14 | British Aerospace | Inertial navigation accuracy enhancement |
GB2362213B (en) * | 1997-07-14 | 2002-02-27 | British Aerospace | Inertial navigation accuracy enhancement |
US6912464B1 (en) | 1997-07-14 | 2005-06-28 | Bae Systems Plc | Inertial navigation accuracy enhancement |
WO1999053335A1 (en) * | 1998-04-15 | 1999-10-21 | Commonwealth Scientific And Industrial Research Organisation | Method of tracking and sensing position of objects |
US6442476B1 (en) | 1998-04-15 | 2002-08-27 | Commonwealth Scientific And Industrial Research Organisation | Method of tracking and sensing position of objects |
AU756108B2 (en) * | 1998-04-15 | 2003-01-02 | Commonwealth Scientific And Industrial Research Organisation | Method of tracking and sensing position of objects |
WO2003071692A2 (en) | 2002-02-19 | 2003-08-28 | Motorola, Inc. | Device for use with a portable inertial navigation system (pins) and method for processing pins signals |
EP1478903A4 (en) * | 2002-02-19 | 2010-01-13 | Motorola Inc | Device for use with a portable inertial navigation system (pins) and method for processing pins signals |
EP1478903A2 (en) * | 2002-02-19 | 2004-11-24 | Motorola, Inc. | Device for use with a portable inertial navigation system (pins) and method for processing pins signals |
GB2393870A (en) * | 2002-08-28 | 2004-04-07 | Lockheed Corp | Means for determining the exact geographic location of a target on a battlefield |
WO2005119178A1 (en) * | 2004-06-02 | 2005-12-15 | Athena Technologies, Inc. | Image-augmented inertial navigation system (iains) and method |
US8407000B2 (en) | 2004-06-02 | 2013-03-26 | Rockwell Collins Control Technologies, Inc. | Image augmented inertial navigation system (IAINS) and method |
US7725260B2 (en) | 2004-06-02 | 2010-05-25 | Athena Technologies, Inc. | Image-augmented inertial navigation system (IAINS) and method |
EP2060873A2 (en) * | 2007-11-17 | 2009-05-20 | LFK-Lenkflugkörpersysteme GmbH | Method of supporting independent navigation in a low-flying aircraft |
EP2060873A3 (en) * | 2007-11-17 | 2012-06-27 | LFK-Lenkflugkörpersysteme GmbH | Method of supporting independent navigation in a low-flying aircraft |
DE102007054950B4 (en) * | 2007-11-17 | 2013-05-02 | Mbda Deutschland Gmbh | Method for supporting the automatic navigation of a low-flying missile |
US8213706B2 (en) | 2008-04-22 | 2012-07-03 | Honeywell International Inc. | Method and system for real-time visual odometry |
EP2133662A3 (en) * | 2008-06-09 | 2010-07-14 | Honeywell International Inc. | Methods and system of navigation using terrain features |
DE102012224107A1 (en) * | 2012-12-20 | 2014-06-26 | Continental Teves Ag & Co. Ohg | Method for determining a reference position as starting position for an inertial navigation system |
US9658069B2 (en) | 2012-12-20 | 2017-05-23 | Continental Teves Ag & Co. Ohg | Method for determining a reference position as the starting position for an inertial navigation system |
Also Published As
Publication number | Publication date |
---|---|
FR2719920B1 (en) | 1997-03-07 |
GB9508451D0 (en) | 1995-06-14 |
GB2289389B (en) | 1998-06-24 |
DE4416557A1 (en) | 1995-11-23 |
FR2719920A1 (en) | 1995-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4689748A (en) | Device for aircraft and spacecraft for producing a digital terrain representation | |
US5801970A (en) | Model-based feature tracking system | |
US4695959A (en) | Passive range measurement apparatus and method | |
US5128874A (en) | Inertial navigation sensor integrated obstacle detection system | |
EP0896267B1 (en) | Position recognizing system of autonomous running vehicle | |
US5259037A (en) | Automated video imagery database generation using photogrammetry | |
US6639553B2 (en) | Passive/ranging/tracking processing method for collision avoidance guidance | |
Sim et al. | Integrated position estimation using aerial image sequences | |
US4635203A (en) | Passive range measurement apparatus and method | |
US8050458B2 (en) | Frontal view imaging and control device installed on movable object | |
KR102627453B1 (en) | Method and device to estimate position | |
US4602336A (en) | Guidance systems | |
GB2289389A (en) | Misile location | |
CN107850449A (en) | Method and system for generating and using locating reference datum | |
CN110223380B (en) | Scene modeling method, system and device fusing aerial photography and ground visual angle images | |
US5564650A (en) | Processor arrangement | |
CN109900274B (en) | Image matching method and system | |
EP3624057A1 (en) | Subpixel computations for increasing distance resolution at a distant location | |
CN112200911A (en) | Region overlapping type three-dimensional map construction method and device combined with markers | |
JP2017181476A (en) | Vehicle location detection device, vehicle location detection method and vehicle location detection-purpose computer program | |
JP2002532770A (en) | Method and system for determining a camera pose in relation to an image | |
US6016116A (en) | Navigation apparatus | |
CN115457084A (en) | Multi-camera target detection tracking method and device | |
CN114898314A (en) | Target detection method, device and equipment for driving scene and storage medium | |
CN115597592B (en) | Comprehensive positioning method applied to unmanned aerial vehicle inspection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20080426 |