EP1432958A1 - Method for guiding a rocket - Google Patents

Method for guiding a rocket

Info

Publication number
EP1432958A1
EP1432958A1 (application EP02783193A)
Authority
EP
European Patent Office
Prior art keywords
rocket
images
imaging device
target
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP02783193A
Other languages
German (de)
French (fr)
Other versions
EP1432958B1 (en)
Inventor
Michel Broekaert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Safran Electronics and Defense SAS
Original Assignee
Sagem SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sagem SA filed Critical Sagem SA
Publication of EP1432958A1 publication Critical patent/EP1432958A1/en
Application granted granted Critical
Publication of EP1432958B1 publication Critical patent/EP1432958B1/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Classifications

    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G7/00Direction control systems for self-propelled missiles
    • F41G7/20Direction control systems for self-propelled missiles based on continuous observation of target position
    • F41G7/22Homing guidance systems
    • F41G7/2273Homing guidance systems characterised by the type of waves
    • F41G7/2293Homing guidance systems characterised by the type of waves using electromagnetic waves other than radio waves
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G7/00Direction control systems for self-propelled missiles
    • F41G7/007Preparatory measures taken before the launching of the guided missiles
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G7/00Direction control systems for self-propelled missiles
    • F41G7/20Direction control systems for self-propelled missiles based on continuous observation of target position
    • F41G7/22Homing guidance systems
    • F41G7/2206Homing guidance systems using a remote control station
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41WEAPONS
    • F41GWEAPON SIGHTS; AIMING
    • F41G7/00Direction control systems for self-propelled missiles
    • F41G7/20Direction control systems for self-propelled missiles based on continuous observation of target position
    • F41G7/22Homing guidance systems
    • F41G7/2253Passive homing systems, i.e. comprising a receiver and do not requiring an active illumination of the target

Definitions

  • the invention relates to rocket guidance.
  • A rocket is a small unguided projectile. It is often used against tanks and can be launched from a land, sea or air vehicle, for example an airplane or a helicopter.
  • The invention applies equally well to missiles; when the text speaks of rockets, the term should be taken in its general sense and understood to cover missiles as well.
  • Before launching a rocket, an operator first acquires the target in his sight, identifies it, tracks it to determine its angular velocity, then ranges it to determine its distance and, finally, the position of the target in his reference frame. With these data and a flight model of the projectile, the fire-control computer computes a future aim point, materialized by a reticle in the sight.
  • The present application aims to improve the accuracy of rockets and, to this end, relates to a method for guiding a rocket onto a target in which, the rocket being equipped with self-guidance means comprising an imaging device and trajectory-correction means:
  • the target is acquired by a sighting device and its position is determined,
  • the sighting device and the rocket's imaging device are harmonized,
  • the images of the rocket's imaging device are stabilized,
  • a guidance law is developed,
  • the rocket is launched, and
  • the rocket is guided according to this law until it itself acquires the target.
  • The harmonization of the two devices (the sighting device of the launcher and the imaging device of the rocket) can be carried out quite simply, first by harmonizing the sighting and shooting axes, respectively, then by computing the image of the launcher's sight in the reference frame of the rocket's imaging device.
  • The stabilization of the images of the rocket's imaging device makes it possible at least to compensate for the disturbances of the launcher before launch and thus to stabilize these images with respect to the absolute landscape of the target.
  • In one implementation, an initial guidance law is developed before launch and the rocket is guided according to this initial law until it acquires the target.
  • Preferably, however, an initial guidance law is developed before launch and, after launch, a continuously variable guidance and trajectory-correction law is developed until the rocket acquires the target.
  • To harmonize the two devices, an electronic harmonization is carried out in which, in a terrestrial frame of reference, the images of the scene taken at the same instants by the two devices are filtered in a low-pass filter to retain only the low spatial frequencies, and the optical-flow equation between these pairs of respective images of the two devices is solved to determine the rotations and the variation of the ratio of the respective zoom parameters to be applied to these images in order to harmonize them with each other.
  • Preferably, the images of the rocket's imaging device are stabilized in a terrestrial frame of reference, on the landscape, even though stabilization by an inertial unit remains possible.
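Both the stabilization and the harmonization steps above come down to registering two images of the same scene after low-pass filtering. As a minimal, self-contained sketch, the global shift between two frames can be recovered by phase correlation; this is a stand-in for the patent's optical-flow solve, and all names here are illustrative:

```python
import numpy as np

def register_shift(prev, curr):
    """Estimate the global (dy, dx) shift of `curr` relative to `prev`
    by phase correlation (a stand-in for the optical-flow solve)."""
    h, w = prev.shape
    f_prev = np.fft.fft2(prev)
    f_curr = np.fft.fft2(curr)
    cps = f_prev * np.conj(f_curr)
    cps /= np.abs(cps) + 1e-12           # normalized cross-power spectrum
    corr = np.fft.ifft2(cps).real
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
    dy, dx = (h - iy) % h, (w - ix) % w  # the peak sits at minus the shift
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

# Stabilizing image t on image t-1 then amounts to undoing the estimated shift:
rng = np.random.default_rng(0)
frame_prev = rng.random((64, 64))
frame_curr = np.roll(frame_prev, (5, -3), axis=(0, 1))  # simulated motion
shift = register_shift(frame_prev, frame_curr)
stabilized = np.roll(frame_curr, (-shift[0], -shift[1]), axis=(0, 1))
```

Phase correlation handles a pure translation exactly; the method described in the text additionally recovers rotations and a zoom ratio by solving the optical-flow equation.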
  • Figure 1 is a schematic axial sectional view of a rocket equipped with self-guidance means comprising an imaging device and trajectory-correction means;
  • Figure 2 is a block representation of the electrical, electronic and optical functional means of the rocket of Figure 1;
  • Figure 3 illustrates the geometry of the movement of a camera;
  • Figure 4 is a block diagram of the rocket's imaging device allowing the implementation of the electronic stabilization of its images and the harmonization with the sighting device;
  • Figure 5 is a representation of the image of the rocket's imaging device showing the various fields of view; and
  • Figure 6 is a schematic view illustrating the method of guiding a rocket onto a target from a helicopter.
  • The rocket comprises a body 1, of which only the front part is shown, the rear part comprising the payload and the trajectory-correction members, which may be control surfaces or small directional rockets, and a nose 2, covered by a cap 3.
  • The cap 3 carries a first lens which acts as an aerodynamic window and which focuses the image on the detector with the aid of the rest of the optics discussed below.
  • The rocket is a spinning rocket with a seeker, housed partly in the nose and partly in the body, as will be seen below, but whose nose 2 and body 1 are decoupled in rotation, the nose 2 carrying, via a hollow shaft 4, a flywheel 5 disposed in the body 1 that creates a differential spin between the nose 2 and the body 1, so that the nose 2 rotates only very slowly, if at all.
  • The hollow shaft 4 thus extends on either side of the joint plane 6 between the nose 2 and the body 1, in rolling bearings 7 and 8 located respectively in the nose 2 and the body 1 of the rocket.
  • The rocket's seeker comprises, in the nose 2, behind the cap 3 and a fixed optical system 9, an imaging device 10 and, in the body 1, a trajectory-correction equipment 11 controlled by the device 10.
  • The equipment 11 also performs, after launch, the comparison of the image taken by the imaging device 10 with the stored wide-field and small-field images of the scene taken, before launch, with the carrier's sighting device, which will be discussed below.
  • the imaging device 10 comprises a camera 13, with its conventional proximity electronic circuits 14, an analog-digital converter 15 and an image transmission component 16.
  • the device 10 is supplied from the body of the rocket and through the hollow shaft 4, by a rechargeable battery 12.
  • The camera 13 can be a video or an infrared device.
  • The transmission component 16 can be a laser diode or an LED (light-emitting diode). This component 16 can be placed in the imaging device 10, in which case the transmission of images through the hollow shaft 4 and the flywheel 5 is effected by an optical fiber 17 extending along the roll axis 30 of the projectile.
  • Alternatively, the image-transmission component 22 can be placed in the flywheel 5, facing a diode 24 that receives the transmitted images, in which case the signal transmission between the imaging device 10 and the component 22 is effected by wires through the hollow shaft 4.
  • The imaging device is cooled by the Peltier effect, if necessary.
  • The flywheel 5, symbolized in FIG. 2 by the two vertical dashed lines, carries the secondary 19 of a transformer 18, connected to the battery 12, for coupling the power supply to the nose 2 of the rocket, a wheel 20 of an optical encoder 21 and a laser diode 22, or an LED, as the case may be, for transmitting to the body 1 of the rocket the images of the device 10.
  • the trajectory correction equipment 11 of the rocket body comprises the transceiver 23 of the optical encoder 21, the diode 24 receiving the transmitted images, the primary 25 of the transformer 18, with its source 26, and circuits 27 for processing the received images and for guiding and controlling the control surfaces 28 of the rocket, connected to the receiving diode 24 and to the transceiver 23 of the encoder 21.
  • The circuits 27 include an on-board computer.
  • the encoder 21 indicates the relative angular position between the imaging device 10 and the body 1 of the rocket.
  • The rocket is guided using the computer of the circuits 27, as a function of this angular position and of the comparison between the images received from the imaging device and stabilized in the circuits 27 and the stored images previously supplied, for example, by a sight.
  • The guidance commands are applied in synchronism with the rocket's own rotation, also taking into account the position of the control surfaces.
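As an illustration of applying commands in synchronism with the spin, a lateral correction computed in a non-rolling frame must be de-rotated by the roll angle reported by the encoder before being sent to the control surfaces. This is a hypothetical sketch; the frame conventions and names are assumptions, not the patent's:

```python
import math

def body_frame_command(cmd_yaw, cmd_pitch, roll_angle):
    """Rotate a lateral command from a non-rolling reference frame into the
    rolling body frame, given the roll angle (radians) measured by the
    optical encoder (rotation by -roll_angle)."""
    c, s = math.cos(roll_angle), math.sin(roll_angle)
    u = c * cmd_yaw + s * cmd_pitch
    v = -s * cmd_yaw + c * cmd_pitch
    return u, v

# A command of (1, 0) with the body rolled by 90 degrees appears along -v:
u, v = body_frame_command(1.0, 0.0, math.pi / 2)
```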
  • Before launching the rocket, the operator, using a sighting device, takes a wide-field image 52 of the scene, which is stored and which will serve, at low spatial frequencies, to determine the approximate direction of the target (Figure 5). He also takes a small-field image 53, which is likewise stored.
  • The overall view is a view 50 of the navigation field, containing a view 51 of the rocket's seeker field, then a view 52 of the wide field and, further inside, a small-field view 53.
  • FIG. 6 shows a sighting device 62 and a fire-control computer 63 of the helicopter, as well as the field angle of the rocket's seeker, corresponding to view 51, and the small-field angle v of the helicopter's sighting device 62, corresponding to view 53, the angles within which the tank 61 is located.
  • The fire-control operator (the gunner) of the helicopter 60 begins by acquiring the target 61 using his sighting device 62, that is to say he determines the position, distance and speed of the target 61, which will subsequently enable him, in combination with a flight model and with the aid of the fire-control computer 63, to develop an initial guidance, or command, law.
  • The helicopter pilot aligns the axis of the helicopter as closely as possible with the direction aimed at by the gunner, thanks to a repeater.
  • After acquisition of the target 61 and its designation by the operator, the on-board computer proceeds to the harmonization of the sighting device 62 and the imaging device 10 of the rocket, then to the stabilization of the images of the rocket's imaging device, before developing the optimal rocket guidance law.
  • The camera is in a three-dimensional Cartesian or polar coordinate system with the origin placed on the front lens of the camera and the z axis directed along the viewing direction.
  • The position of the camera relative to the carrier's center of gravity is defined by three rotations (ac, bc, gc) and three translations (Txc, Tyc, Tzc).
  • The relationship between the 3D coordinates of the camera and those of the carrier is:
  • R is a 3 × 3 rotation matrix,
  • T is a 1 × 3 translation matrix.
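The relationship above can be written X_carrier = R · X_camera + T. A minimal sketch follows; the yaw-pitch-roll composition order is an assumption (the text only states that R is a 3 × 3 rotation matrix):

```python
import numpy as np

def rotation_matrix(a, b, g):
    """3x3 rotation built from yaw a, pitch b, roll g (radians).
    The composition order Rz @ Ry @ Rx is an assumed convention."""
    ca, sa = np.cos(a), np.sin(a)
    cb, sb = np.cos(b), np.sin(b)
    cg, sg = np.cos(g), np.sin(g)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    return Rz @ Ry @ Rx

def camera_to_carrier(p_cam, angles, translation):
    """Map a 3D point from camera coordinates to carrier coordinates."""
    return rotation_matrix(*angles) @ p_cam + translation

# With zero rotations, the mapping reduces to the pure translation T:
p = camera_to_carrier(np.array([0.0, 0.0, 1.0]),
                      (0.0, 0.0, 0.0),
                      np.array([0.1, 0.2, 0.3]))
```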
  • ẋ(t) = F(t)·x(t) + u(t) + v(t)
  • H(t) is an m × n matrix function of t and w is an m-dimensional white Gaussian noise, which can be assimilated to the angular and linear vibrations of the camera relative to the carrier's center of gravity (they enter the measurement equation y(t) = H(t)·x(t) + w(t)).
  • x_k = [aP_k, aV_k, bP_k, bV_k, gP_k, gV_k, xP_k, xV_k, yP_k, yV_k, zP_k, zV_k]ᵀ is the state vector at instant k of the trajectory, composed of the yaw, pitch and roll angles and rates and of the positions and velocities in x, y and z.
  • u_k is the known input vector, a function of k; it is the flight or trajectory model of the carrier's center of gravity.
  • v_k is the n-dimensional white Gaussian noise representing the acceleration noises in yaw, pitch and roll and in the x, y and z positions.
  • If the angles and translations to which the camera is subjected relative to the center of gravity are not constant during the trajectory (in a sight, for example), it suffices to describe their measured or controlled values (ac(t), bc(t), gc(t), Txc(t), Tyc(t), Tzc(t)) as a function of t or k.
  • The trajectory of the camera can be defined by a vector.
  • The camera undergoes pure 3D rotations and three translations, the values of which are provided by the vector x'k+1.
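Between two instants, the 12-component state vector described above can be propagated with a constant-velocity model; the sketch below shows such a transition (the time step and the exact state ordering are assumptions consistent with the listing above, and a Kalman filter would add a measurement-update step):

```python
import numpy as np

def predict_state(x, dt):
    """Propagate the 12-state vector
    (yaw, yaw rate, pitch, pitch rate, roll, roll rate, x, vx, y, vy, z, vz)
    over dt with a constant-velocity model: each (value, rate) pair obeys
    p <- p + dt * v and v <- v."""
    f2 = np.array([[1.0, dt], [0.0, 1.0]])
    F = np.kron(np.eye(6), f2)   # block-diagonal transition matrix
    return F @ x

x0 = np.zeros(12)
x0[6], x0[7] = 0.0, 2.0          # x position 0, x velocity 2
x1 = predict_state(x0, 0.5)      # x position becomes 1.0
```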
  • Figure 3 shows the geometry of the camera movement in 3D space in the real world.
  • F1(X, Y) is the focal length of the camera at time t.
  • aw, bw, gw are the angular vibrations and xw, yw, zw the linear vibrations.
  • image_k+1(Ai, Aj) ≈ image_k(Ai, Aj) + GradientX(Ai, Aj)·dAi·pasH + GradientY(Ai, Aj)·dAj·pasV, with GradientX and GradientY the derivatives along X and Y of image_k(X, Y).
  • Low-pass filtering consists, in a conventional manner, of dragging a convolution kernel from pixel to pixel over the digital images of the camera, the pixel at the origin of the kernel being replaced by the average of the gray levels of the pixels covered by the kernel.
  • The results obtained with a rectangular kernel 7 pixels high (V) and 20 pixels wide (H) are very satisfactory on normally contrasted scenes.
  • Wavelet functions can also be used as the averaging kernel.
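The averaging kernel described above is separable, so the 7 (V) × 20 (H) sliding mean can be applied as two 1-D passes; a minimal sketch, with zero padding at the edges as an assumed boundary choice:

```python
import numpy as np

def mean_filter(img, kh=7, kw=20):
    """Sliding-mean low-pass filter with a kh x kw rectangular kernel,
    applied as two separable 1-D convolutions (edges are zero-padded)."""
    img = np.asarray(img, dtype=float)
    row_kernel = np.ones(kw) / kw
    col_kernel = np.ones(kh) / kh
    tmp = np.apply_along_axis(np.convolve, 1, img, row_kernel, mode="same")
    return np.apply_along_axis(np.convolve, 0, tmp, col_kernel, mode="same")

# On a constant image the interior (away from the padded borders) is unchanged:
smooth = mean_filter(np.ones((40, 60)))
```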
  • The optical-flow equation measures the total displacement of the camera. As seen above, the movements of the camera deduced from those of the carrier can be distinguished more finely from the real movements of the camera by saying that the carrier and the camera have the same trajectory, but that the camera additionally undergoes the linear and angular vibrations.
  • The displacements due to the trajectory of the camera are contained in the state vector x'k+1 of the camera, or rather in the estimate that can be made of it, by averaging or by a Kalman filter, which provides the best estimate.
  • The fourth axis (zoom) is not strictly necessary, but it may prove essential in the case of an optical zoom, and also when the focal length is not known with sufficient precision or when it varies with temperature (IR, germanium optics, etc.) or pressure (air refractive index).
  • image_k+1(X, Y) = image_k(X − dX_k+1(X, Y), Y − dY_k+1(X, Y))
  • a(:,:,2) = DeriveeX(Ai, Aj) . (1 + (Ai.pasH / F1(X, Y))²)
  • a(:,:,3) = DeriveeY(Ai, Aj) . Ai . pasH / pasV − DeriveeX(Ai, Aj) . Aj . pasV / pasH
  • a(:,:,4) = DeriveeX(Ai, Aj) . Ai + DeriveeY(Ai, Aj) . Aj
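The coefficient arrays above feed a linear least-squares system in the unknown rotations and zoom variation. The sketch below uses a simplified small-motion flow model (two translations, one roll, one zoom about the image centre) rather than the patent's exact coefficients, which depend on the focal length and pixel pitches:

```python
import numpy as np

def estimate_global_motion(prev, curr):
    """Least-squares fit of (dx, dy, roll, zoom) to the brightness-constancy
    (optical-flow) equation between two frames, with the flow model
    u = (dx - roll*y + zoom*x, dy + roll*x + zoom*y) about the centre."""
    gy, gx = np.gradient(prev.astype(float))
    h, w = prev.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    xx -= w / 2.0
    yy -= h / 2.0
    dt = (curr - prev).ravel()
    A = np.stack([gx.ravel(),
                  gy.ravel(),
                  (-gx * yy + gy * xx).ravel(),
                  (gx * xx + gy * yy).ravel()], axis=1)
    params, *_ = np.linalg.lstsq(A, -dt, rcond=None)
    return params  # (dx, dy, roll, zoom)

# Two smooth blobs so all four parameters are observable; shift by one pixel.
yy, xx = np.mgrid[0:64, 0:64].astype(float)
prev = (np.exp(-((xx - 20) ** 2 + (yy - 30) ** 2) / 128.0)
        + np.exp(-((xx - 44) ** 2 + (yy - 14) ** 2) / 128.0))
curr = np.roll(prev, 1, axis=1)  # pure horizontal shift, dx = 1
motion = estimate_global_motion(prev, curr)
```

The gradient-based linearization holds only for small displacements, which is why the text applies it to low-pass-filtered (low-spatial-frequency) images.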
  • The camera 13 delivers its video image signal to a low-pass filter 42 as well as to a processing block 43, which receives on a second input the stabilization data and supplies as output the stabilized images. On its second input, the block 43 thus receives the rotation speeds to be applied to the images taken by the camera 13.
  • The two buffer memories 44 and 45 are connected to two inputs of a computation component 46, which is either an ASIC or an FPGA (field-programmable gate array).
  • The calculation component 46 is connected to a working memory 47 and, at its output, to the processing block 43. All the electronic components of the
  • The harmonization implemented in the guidance method of the invention is an extrapolation of the stabilization step, the sighting device and the rocket's imaging device being, before launch, mounted on the same carrier.
  • The stabilization of the images of the rocket's imaging device is a self-stabilization process, in which the image at time t is stabilized on the image at time t-1.
  • One can thus say that each image of the imaging system is harmonized with the previous one.
  • The two images of the two devices are taken and stabilized one on the other, that is to say the two devices are harmonized.
  • Harmonizing amounts to making the optical axes of the two devices coincide and to matching the pixels of the two images pairwise, preferably also making these pixels coincide.
  • The two devices to be harmonized by this process must be of the same optical nature, that is to say they must operate at comparable wavelengths.
  • The two devices both taking images of the same scene, in a terrestrial frame of reference, the images of the scene taken at the same instants by the two devices are filtered in a low-pass filter to retain only the low spatial frequencies, and the optical-flow equation between these pairs of respective images of the two devices is solved to determine the rotations and the variation of the ratio of the respective zoom parameters to be applied to these images in order to harmonize them with each other.
  • the initial guidance law is developed by means of the position, distance and speed of the target, on the one hand, and a flight model on the other.
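The initial law rests on a future aim point: with a constant-velocity target and a constant rocket speed (a deliberately crude stand-in for the flight model mentioned above), the intercept time solves a quadratic. All names here are illustrative:

```python
import numpy as np

def future_aim_point(target_pos, target_vel, rocket_speed):
    """Solve |target_pos + target_vel * t| = rocket_speed * t for the smallest
    positive intercept time t and return (t, aim point). Assumes a
    straight-running rocket at constant speed (crude flight model)."""
    p = np.asarray(target_pos, dtype=float)
    v = np.asarray(target_vel, dtype=float)
    a = v @ v - rocket_speed ** 2
    b = 2.0 * (p @ v)
    c = p @ p
    if abs(a) < 1e-12:
        raise ValueError("target and rocket speeds coincide in this model")
    disc = b * b - 4.0 * a * c
    if disc < 0:
        raise ValueError("no intercept with this crude model")
    roots = [(-b - np.sqrt(disc)) / (2 * a), (-b + np.sqrt(disc)) / (2 * a)]
    t = min(r for r in roots if r > 0)
    return t, p + v * t

# Target 1000 m away crossing at 10 m/s, rocket at 100 m/s:
t, aim = future_aim_point([1000.0, 0.0], [0.0, 10.0], 100.0)
```

At the solution, the distance flown by the rocket equals the distance to the aim point, which is the defining property of the "future goal" materialized by the reticle.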
  • Once the initial guidance law has been developed, the fire-control operator launches the rocket. Down to a certain distance from the target 61, until the rocket itself acquires the target, the image taken by the rocket's imaging device 10 is compared with the stored wide-field image 52 of the scene taken at the start with the sighting device 62; that is to say, the rocket's guidance is permanently controlled.
  • The guidance of the rocket is then continued until the terminal phase by comparing the image taken by the rocket's imaging device 10 with the stored small-field image 53.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Telescopes (AREA)

Abstract

The invention concerns a method for guiding a rocket (1) onto a target, wherein, the rocket (1) is equipped with guidance by self-reacting imaging device (10) and trajectory correction means (11); said method consists in: acquiring the target through a sighting device and determining its position; harmonizing the sighting device and the imaging device (10) of the rocket; stabilising the images of the imaging device (10) of the rocket; working out a guidance law; launching the rocket (1) and guiding it in accordance with the guidance law until it acquires the target by itself.

Description

Method for guiding a rocket

The invention relates to rocket guidance.

A rocket is a small unguided projectile. It is often used against tanks and can be launched from a land, sea or air vehicle, for example an airplane or a helicopter. However, the invention applies equally well to missiles, and when the text speaks of rockets the term should be taken in its general sense and understood to cover missiles as well.

The accuracy of a rocket is not very high. Moreover, when fired from a helicopter, it is further affected by the rotor downwash, which deflects the trajectory.

Before launching a rocket, an operator first acquires the target in his sight, identifies it, tracks it to determine its angular velocity, then ranges it to determine its distance and, finally, the position of the target in his reference frame. With these data and a flight model of the projectile, the fire-control computer computes a future aim point, materialized by a reticle in the sight.

It will be recalled that many missiles are equipped with self-guidance means, that is to say a deviation-measurement system intended, depending on the result of the comparison between the images of the reference target and the images captured in flight by an imaging device, to actuate control surfaces or directional trajectory-correction rockets.

The present application aims to improve the accuracy of rockets and, to this end, relates to a method for guiding a rocket onto a target in which, the rocket being equipped with self-guidance means comprising an imaging device and trajectory-correction means:
- the target is acquired by a sighting device and its position is determined,
- the sighting device and the rocket's imaging device are harmonized,
- the images of the rocket's imaging device are stabilized,
- a guidance law is developed,
- the rocket is launched, and
- the rocket is guided according to this law until it itself acquires the target.

It will be noted that the harmonization of the two devices (the sighting device of the launcher and the imaging device of the rocket) can be carried out quite simply, first by harmonizing the sighting and shooting axes, respectively, then by computing the image of the launcher's sight in the reference frame of the rocket's imaging device.

It will also be noted that the stabilization of the images of the rocket's imaging device makes it possible at least to compensate for the disturbances of the launcher before launch and thus to stabilize these images with respect to the absolute landscape of the target.

In a particular implementation of the method of the invention, an initial guidance law is developed before launch and the rocket is guided according to this initial law until it acquires the target.

Preferably, however, an initial guidance law is developed before launch and, after launch, a continuously variable guidance and trajectory-correction law is developed until the rocket acquires the target.
More preferably, to harmonize the sighting device and the rocket's imaging device, an electronic harmonization is carried out in which, in a terrestrial frame of reference, the images of the scene taken at the same instants by the two devices are filtered in a low-pass filter to retain only the low spatial frequencies, and the optical-flow equation between these pairs of respective images of the two devices is solved to determine the rotations and the variation of the ratio of the respective zoom parameters to be applied to these images in order to harmonize them with each other.

Preferably also, the images of the rocket's imaging device are stabilized in a terrestrial frame of reference, on the landscape, even though stabilization by an inertial unit remains possible.

In this case it is advantageous, in this terrestrial frame of reference, to filter the images of the scene taken by the imaging device in a low-pass filter, to retain only the low spatial frequencies, and to solve the optical-flow equation to determine the rotations to be applied to the images in order to stabilize them on the previous images.
The invention will be better understood from the following description, with reference to the appended drawing, in which:

- Figure 1 is a schematic axial sectional view of a rocket equipped with self-guidance means comprising an imaging device and trajectory-correction means;
- Figure 2 is a block representation of the electrical, electronic and optical functional means of the rocket of Figure 1;
- Figure 3 illustrates the geometry of the movement of a camera;
- Figure 4 is a block diagram of the rocket's imaging device allowing the implementation of the electronic stabilization of its images and the harmonization with the sighting device;
- Figure 5 is a representation of the image of the rocket's imaging device showing the various fields of view; and
- Figure 6 is a schematic view illustrating the method of guiding a rocket onto a target from a helicopter.
The rocket comprises a body 1, of which only the front part is shown, the rear part containing the payload and the trajectory-correction members, which may be control surfaces or small directional thrusters, and a nose 2 covered by a cap 3. The cap 3 carries a first lens which acts as an aerodynamic window and which, together with the rest of the optics discussed below, focuses the image on the detector.
The rocket is a spin-stabilized rocket with a seeker housed partly in the nose and partly in the body, as will be seen below, but whose nose 2 and body 1 are decoupled in rotation: the nose 2 carries, via a hollow shaft 4, a flywheel 5 located in the body 1, creating a differential spin between the nose 2 and the body 1, so that the nose 2 rotates only very slowly, if at all.
The hollow shaft 4 therefore extends on either side of the joint plane 6 between the nose 2 and the body 1, in rolling bearings 7 and 8 located respectively in the nose 2 and the body 1 of the rocket. The rocket's seeker comprises, in the nose 2, behind the cap 3 and a fixed optical system 9, an imaging device 10 and, in the body 1, trajectory-correction equipment 11 controlled by the device 10.
The equipment 11 also performs, after launch, the comparison of the image taken by the imaging device 10 with the stored wide-field and narrow-field images of the scene taken, before launch, with the carrier's sighting device, discussed below.
The imaging device 10 comprises a camera 13, with its conventional proximity electronics 14, an analog-to-digital converter 15 and an image-transmission component 16. The device 10 is powered from the body of the rocket, through the hollow shaft 4, by a rechargeable battery 12. The camera 13 may be a video or infrared camera. The transmission component 16 may be a laser diode or an LED (light-emitting diode). This component 16 may be placed in the imaging device 10, in which case the images are transmitted through the hollow shaft 4 and the flywheel 5 by an optical fiber 17 running along the roll axis 30 of the vehicle. Alternatively, the image-transmission component 22 may be placed in the flywheel 5, facing a diode 24 that receives the transmitted images, in which case the signal between the imaging device 10 and the component 22 is carried by wires through the hollow shaft 4. The imaging device is cooled by Peltier effect, if necessary.
The flywheel 5, symbolized in Figure 2 by the two vertical dashed lines, carries the secondary 19 of a transformer 18 coupling the power supply to the nose 2 of the rocket, connected to the battery 12, a wheel 20 of an optical encoder 21, and a laser diode 22, or an LED as the case may be, for transmitting the images from the device 10 into the body 1 of the rocket.
The trajectory-correction equipment 11 in the rocket body comprises the transceiver 23 of the optical encoder 21, the diode 24 receiving the transmitted images, the primary 25 of the transformer 18 with its source 26, and circuits 27 for processing the received images and for guiding and controlling the control surfaces 28 of the rocket, connected to the receiving diode 24 and to the transceiver 23 of the encoder 21. The circuits 27 include an on-board computer.
The encoder 21 indicates the relative angular position between the imaging device 10 and the body 1 of the rocket. The rocket is guided by means of the computer in circuits 27, as a function of this angular position and of the comparison between the images received from the imaging device and stabilized in circuits 27, and the stored images previously supplied, for example by a sight.
The guidance commands are applied in synchronism with the rocket's own rotation, also taking into account the location of the control surface.
Before launching the rocket, the operator, using a sighting device, takes a wide-field image 52 of the scene, which is stored and which, as regards low spatial frequencies, will be used to determine the approximate direction of the target (Figure 5). He also takes a narrow-field image 53, which is likewise stored.
With reference to Figure 5, the overall view is a navigation-field view 50, containing a view 51 of the field of the rocket's seeker, then a wide-field view 52 and, further inside still, a narrow-field view 53.
With reference to Figure 6, the example is illustrated of an operator in a helicopter 60 equipped, on each of its two flanks, with a rocket 1, 2 to be guided onto the target to be reached, in this case a tank 61. Figure 6 shows a sighting device 62 and a fire-control computer 63 of the helicopter, as well as the field angle θ of the seeker of the right-hand rocket, corresponding to view 51, and the narrow-field angle v of the helicopter's sighting device 62, corresponding to view 53, within both of which the tank 61 lies.
Thus, the fire-control operator, the gunner of the helicopter 60, begins by acquiring the target 61 using his sighting device 62, that is, he determines the position, distance and speed of the target 61, which will later enable him, in combination with a flight model and with the aid of the fire-control computer 63, to establish an initial guidance, or control, law. Meanwhile, the pilot of the helicopter brings the axis of the helicopter as close as possible to the direction aimed at by the gunner, with the aid of a repeater.
After acquisition of the target 61 and its designation by the operator, the on-board computer harmonizes the sighting device 62 with the imaging device 10 of the rocket, then stabilizes the images from the rocket's imaging device, before establishing the optimal guidance law for the rocket.
For reasons which will become apparent later, the step of stabilizing the images from the rocket's imaging device will be described first.
Consider the observation and guidance camera 13 of the rocket of Figure 1. It may be a video camera or an infrared camera.
If the scene is stationary, the points of the scene seen by the camera between two images are related by the trajectory of the carrier.
The Cartesian coordinates of the scene in the carrier's reference frame are P = (x, y, z)'; the origin is the carrier's center of gravity, with the z axis oriented along the main roll axis, the x axis corresponding to the yaw axis and the y axis to the pitch axis.
The camera is in a three-dimensional Cartesian or polar coordinate system with the origin placed on the front lens of the camera and the z axis directed along the line of sight.
The position of the camera relative to the carrier's center of gravity is defined by three rotations (ac, bc, gc) and three translations (Txc, Tyc, Tzc). The relationship between the 3D coordinates of the camera and those of the carrier is:

(x', y', z')' = R(ac, bc, gc) * (x, y, z)' + T(Txc, Tyc, Tzc)

where:
• R is a 3 x 3 rotation matrix,
• T is a 1 x 3 translation matrix.
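As an illustration, the change of frame above can be sketched in Python/NumPy. This is a hypothetical fragment: the axis assignment of the elementary rotations, the function names and all numerical values are assumptions, not taken from the patent.

```python
import numpy as np

def rot_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def carrier_to_camera(p, ac, bc, gc, t_cam):
    """(x', y', z')' = R(ac, bc, gc) * (x, y, z)' + T, as in the text.
    Assumption: yaw about x, pitch about y, roll about z (angles in radians)."""
    R = rot_z(gc) @ rot_y(bc) @ rot_x(ac)
    return R @ np.asarray(p, dtype=float) + np.asarray(t_cam, dtype=float)

# With zero rotations the transform reduces to a pure translation
p_cam = carrier_to_camera([1.0, 2.0, 10.0], 0.0, 0.0, 0.0, [0.1, 0.0, 0.5])
```

The composition order of the three elementary rotations is a convention; only its consistency with the inverse transform matters.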
The trajectory of the center of gravity characterizes the evolution of the state of the system and can be described by the system of differential equations

x(t) = F(t).x(t) + u(t) + v(t)

where:
x = state vector of dimension n
F(t) = matrix of dimension n, a function of t
u = known input vector, a function of t
v = n-dimensional white Gaussian noise
The state of the system is itself observed, using the camera and the resolution of the optical flow equation, by m measurements z(t) related to the state x by the observation equation:

z(t) = H(t).x(t) + w(t)

where H(t) is an m x n matrix, a function of t, and w is a white Gaussian noise of dimension m, which can be identified with the angular and linear vibrations of the camera relative to the carrier's center of gravity.
The discrete model is written:

xk+1 = Fk * xk + uk + vk
zk = Hk * xk + wk

where xk = [aPk, aVk, bPk, bVk, gPk, gVk, xPk, xVk, yPk, yVk, zPk, zVk]' is the state vector of the trajectory at instant k, composed of the yaw, pitch and roll angles and rates and of the positions and velocities in x, y and z.
xk+1 is the state vector at instant k+1, with tk+1 - tk = Ti.
uk is the known input vector as a function of k; it is the flight or trajectory model of the carrier's center of gravity.
vk is the n-dimensional white Gaussian noise representing the acceleration noises in yaw, pitch and roll and in the x, y, z positions.
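The discrete model above can be illustrated for a single (position, velocity) pair of the 12-component state vector. The constant-velocity form of Fk and the numerical values are illustrative assumptions, not taken from the patent; Ti = 20 ms is the frame period stated later in the text.

```python
import numpy as np

Ti = 0.02  # frame period (20 ms), as stated later in the text

# One (position, velocity) pair of the state vector; constant-velocity block.
F = np.array([[1.0, Ti],
              [0.0, 1.0]])

def step(x, u=np.zeros(2), v=np.zeros(2)):
    """x_{k+1} = F_k * x_k + u_k + v_k (u: known input, v: process noise)."""
    return F @ x + u + v

x0 = np.array([0.0, 1.0])  # position 0, velocity 1 unit/s
x1 = step(x0)              # position advances by velocity * Ti
```

In the full model, each of the six degrees of freedom (yaw, pitch, roll, x, y, z) contributes such a 2 x 2 block to Fk.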
If the angles and translations to which the camera is subjected relative to the center of gravity are not constant along the trajectory, in a sight for example, it suffices to describe their measured or commanded values ac(t), bc(t), gc(t), Txc(t), Tyc(t), Tzc(t) as functions of t or of k.
As the trajectory of the carrier's center of gravity is defined by the vector xk+1, the trajectory of the camera can be defined by a vector

xck+1 = R(ac, bc, gc) * (Fk*xk + uk + vk) + T(Txc, Tyc, Tzc)
Between observation instants k and k+1, the camera undergoes pure 3D rotations and three translations, whose values are provided by the vector xck+1.
Consider the situation where the elements of the scene are projected into the image plane of the camera and only these projections are known.
Figure 3 shows the geometry of the camera movement in the 3D space of the real world.
The camera is in a three-dimensional Cartesian or polar coordinate system with the origin placed on the front lens of the camera and the z axis directed along the line of sight. Two cases of different complexity exist:
• The scene is stationary while the camera zooms and rotates in 3D space.
• The scene is stationary while the camera zooms, rotates and translates in 3D space.
Let P = (x, y, z)' = (d, a, b)' be the Cartesian or polar camera coordinates of a stationary point at time t:
• x = d.sin(a).cos(b)
• y = d.sin(b).cos(a)
• z = d.cos(a).cos(b)
and P' = (x', y', z')' = (d', a', b')' the corresponding camera coordinates at time t' = t + Ti.
The camera coordinates (x, y, z) = (d, a, b) of a point in space and the coordinates (X, Y) of its image in the image plane are linked by a perspective transformation:

X = Fl(X, Y).x/z = Fl(X, Y).tg(a)
Y = Fl(X, Y).y/z = Fl(X, Y).tg(b)

where Fl(X, Y) is the focal length of the camera at time t.
(x', y', z')' = R(da, db, dg) * (x, y, z)' + T(Tx, Ty, Tz)

where:
• R = Rγ.Rβ.Rα is a 3 x 3 rotation matrix, and alpha = da, beta = db, gamma = dg are, respectively, the yaw angle, the pitch angle and the roll angle of the camera between times t and t';
• T is a 1 x 3 translation matrix with Tx = x' - x, Ty = y' - y and Tz = z' - z, the translations of the camera between times t and t'.
Since the camera observations are made at the frame frequency (Ti = 20 ms), it may be noted that these angles change little between two frames, and certain calculations can be simplified accordingly.
When the focal length of the camera changes between times t and t', we have:

F2(X, Y) = s.Fl(X, Y)

where s is called the zoom parameter, and the coordinates (X', Y') in the image plane can be expressed as:

• X' = F2(X, Y).x'/z' = F2(X, Y).tg(a')
• Y' = F2(X, Y).y'/z' = F2(X, Y).tg(b')
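The polar-to-Cartesian and perspective relations above can be checked numerically. In this sketch the names and values are illustrative assumptions; the focal length is taken as a scalar, whereas the text allows Fl(X, Y) to vary over the field.

```python
import numpy as np

def polar_to_cartesian(d, a, b):
    # x = d.sin(a).cos(b), y = d.sin(b).cos(a), z = d.cos(a).cos(b)
    return (d * np.sin(a) * np.cos(b),
            d * np.sin(b) * np.cos(a),
            d * np.cos(a) * np.cos(b))

def project(x, y, z, f):
    """X = f.x/z = f.tan(a), Y = f.y/z = f.tan(b)."""
    return f * x / z, f * y / z

f1 = 100.0                 # illustrative focal length (pixels)
s = 1.5                    # zoom parameter: F2 = s.F1
x, y, z = polar_to_cartesian(20.0, 0.05, -0.02)
X1, Y1 = project(x, y, z, f1)
X2, Y2 = project(x, y, z, s * f1)   # zooming scales image coordinates by s
```

Note that x/z = tan(a) and y/z = tan(b) follow directly from the polar parametrization, which is why the projection can be written either way.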
If we wish to distinguish more finely between the camera movements deduced from those of the carrier and the actual camera movements, we say that the carrier and the camera have the same trajectory, but that the camera additionally undergoes linear and angular vibrations.
(x', y', z')' = R(da+aw, db+bw, dg+gw) * (x, y, z)' + T(Tx+xw, Ty+yw, Tz+zw)

where aw, bw, gw, xw, yw, zw are the angular and linear vibrations.
These linear and angular vibrations can be treated as zero-mean noises, white or not depending on the spectrum of the carrier considered.

The optical flow equation is written:
imagek+1(X, Y) = imagek(X, Y) + ∂imagek(X, Y)/∂X . dXk+1(X, Y) + ∂imagek(X, Y)/∂Y . dYk+1(X, Y)   (1)

or:

imagek+1(Ai, Aj) = imagek(Ai, Aj) + GradientX(Ai, Aj).dAi.pasH + GradientY(Ai, Aj).dAj.pasH

with GradientX and GradientY the derivatives along X and Y of imagek(X, Y).
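This first-order expansion can be checked numerically in the simplest case of a uniform shift of a smooth synthetic landscape. All names and values below are illustrative assumptions; the pixel pitches (pasH) are folded into a single shift expressed in pixels.

```python
import numpy as np

# Smooth synthetic "landscape" so the first-order expansion is accurate
yy, xx = np.mgrid[0:64, 0:64].astype(float)
image_k = np.sin(xx / 10.0) + np.cos(yy / 12.0)

dX, dY = 0.3, -0.2   # small global shift (pixels), uniform over the image

# Gradients along X (columns) and Y (rows), as in the flow equation
Gy, Gx = np.gradient(image_k)

# First-order optical-flow prediction of the next frame
image_k1_pred = image_k + Gx * dX + Gy * dY

# Reference: the same landscape evaluated at the shifted sample points
image_k1_true = np.sin((xx + dX) / 10.0) + np.cos((yy + dY) / 12.0)

# Interior error (the image border uses one-sided differences)
err = np.max(np.abs(image_k1_pred - image_k1_true)[2:-2, 2:-2])
```

The residual error is second order in (dX, dY), which is why the text later notes that the angles change little between two frames at 20 ms.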
To estimate the gradients, only the adjacent points are used. Since only the global motion of the landscape image is sought, only the very low spatial frequencies of the image are of interest, and the image is therefore filtered accordingly. The gradients thus calculated are then significant.
Low-pass filtering conventionally consists in sliding a convolution kernel, pixel by pixel, over the digitized camera images, the pixel at the origin of the kernel being replaced by the average of the gray levels of the pixels covered by the kernel. The results obtained with a rectangular kernel 7 pixels high (V) and 20 pixels wide (H) are very satisfactory on normally contrasted scenes. On the other hand, if the algorithm is also to work on a few isolated hot spots, it is better to use a kernel which preserves the local maxima and does not create discontinuities in the gradients.
A pyramid-shaped averaging kernel was therefore used (a triangle along X convolved with a triangle along Y). The complexity of the filter is not increased, since a rectangular sliding-average kernel of [V = 4; H = 10] is simply applied twice. Wavelet functions can also be used as the averaging kernel.
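A minimal sketch of this two-pass box filtering, assuming NumPy and illustrative image sizes: convolving a rectangular (box) kernel with itself yields the triangular profile whose separable product is the pyramid kernel.

```python
import numpy as np

def box(n):
    return np.ones(n) / n

# A triangle along one axis is a box convolved with itself;
# the 2-D pyramid is the separable product (triangle in X) x (triangle in Y).
tri_h = np.convolve(box(10), box(10))   # H = 10 applied twice -> 19-tap triangle
tri_v = np.convolve(box(4), box(4))     # V = 4 applied twice -> 7-tap triangle

def lowpass(img, kv, kh):
    """Separable low-pass: convolve rows with kh, then columns with kv."""
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kh, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kv, mode='same'), 0, tmp)

img = np.random.default_rng(0).random((32, 48))
smooth = lowpass(img, tri_v, tri_h)
```

Separability is what keeps the cost low: two 1-D passes replace one full 2-D convolution, so the pyramid costs no more than the sliding box average applied twice.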
Only dX and dY are unknown; but if dX and dY can be decomposed as functions of the state-vector parameters of interest and of X and Y (or Ai, Aj), so that the only remaining unknowns are the state-vector parameters, the equation can be written in the vector form B = A*Xtrans, with A and B known. Since every point of the image can supply such an equation, we have an overdetermined system, A*Xtrans = B, which can be solved by the least-squares method.
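The least-squares formulation can be illustrated in the simplest case of a pure global translation, where dX and dY are constant over the image and the state parameters reduce to two unknowns. The synthetic frames and all names below are assumptions for illustration.

```python
import numpy as np

def scene(X, Y):
    return np.sin(X / 7.0) + np.cos(Y / 9.0)

# Synthetic pair of frames differing by a small uniform shift (u, v)
yy, xx = np.mgrid[0:40, 0:40].astype(float)
u_true, v_true = 0.25, -0.15
img_k  = scene(xx, yy)
img_k1 = scene(xx + u_true, yy + v_true)

Gy, Gx = np.gradient(img_k)

# One flow equation per pixel: Gx.u + Gy.v = image_{k+1} - image_k
A = np.column_stack([Gx.ravel(), Gy.ravel()])   # known
B = (img_k1 - img_k).ravel()                    # known
Xtrans, *_ = np.linalg.lstsq(A, B, rcond=None)  # solve A*Xtrans = B
u_est, v_est = Xtrans
```

With one equation per pixel and only two unknowns, the system is massively overdetermined, which is what makes the global-motion estimate robust to local noise.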
The optical flow equation measures the totality of the camera's displacements. As seen above, the camera movements deduced from those of the carrier can be distinguished more finely from the actual camera movements by saying that the carrier and the camera have the same trajectory, but that the camera additionally undergoes linear and angular vibrations.
(x', y', z')' = R(da+aw, db+bw, dg+gw) * (x, y, z)' + T(Tx+xw, Ty+yw, Tz+zw)

where aw, bw, gw, xw, yw, zw are the angular and linear vibrations.
Now, the displacements due to the camera's trajectory (da, db, dg, Tx, Ty, Tz) are contained in the state vector x'k+1 of the camera, or rather in the estimate that can be made of it, by averaging, or by means of a Kalman filter which supplies the best estimate of it.
Since the optical flow equation measures the totality of the displacements, the angular and linear vibrations aw, bw, gw, xw, yw, zw can be deduced from it, for stabilization purposes.
It should be noted that, except in extremely particular configurations, the linear vibrations can never be seen, given the observation distance and their small amplitudes compared with the carrier's movements. What will therefore be observed is: da + aw, db + bw, dg + gw, Tx, Ty, Tz.
Let us return to the optical flow equation:
imagek+1(X, Y) = imagek(X, Y) + ∂imagek(X, Y)/∂X . dXk+1(X, Y) + ∂imagek(X, Y)/∂Y . dYk+1(X, Y)

or:

imagek+1(X + dXk+1(X, Y), Y + dYk+1(X, Y)) = imagek(X, Y)

If this operation is carried out, it can be seen that the images of the sequence are stabilized in an absolute manner. Unlike an inertial stabilization, in which the line of sight is tainted with biases, drifts and scale-factor errors, a representation of the scene free of biases and drifts can be created if it is stabilized along three axes and if the distortion defects of the optics have been compensated. The fourth axis (zoom) is not necessarily required, but it may prove indispensable in the case of an optical zoom, and also when the focal length is not known with sufficient precision or when it varies with temperature (IR optics, germanium, etc.) or with pressure (refractive index of the air).
This may be of interest for applications where frames are to be accumulated without smearing, or where an absolute reference of the landscape is to be kept (the dynamic harmonization of a seeker and a sight, for example).
But it may also concern applications where the aim is to restore the landscape information optimally, by obtaining an image free of the effects of sampling and of detector size.
On peut obtenir simultanément une amélioration de la résolution spatiale et une réduction des bruits temporels ou du brait spatial fixe.An improvement in spatial resolution and a reduction in temporal noise or fixed spatial brait can be obtained simultaneously.
It may be noted that the same equation can also be written:

imagek+1(X, Y) = imagek(X - dXk+1(X, Y), Y - dYk+1(X, Y))

The values dXk+1(X, Y) and dYk+1(X, Y) are obviously not known at time k. On the other hand, using the camera motion equations, they can be estimated at time k+1.
This provides better robustness in the measurement of the speeds and allows large movement dynamics. Since the same point P of the landscape, with coordinates Xk, Yk in image k, will be found at coordinates Xk+1, Yk+1 in image k+1 because of the three rotations aVk+1.Ti, bVk+1.Ti, gVk+1.Ti and of the change of focal length, opposite rotations and zoom factors must therefore be applied in order to stabilize image k+1 absolutely on image k.
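The absolute stabilization described above amounts to resampling image k+1 with the opposite displacement field. A minimal sketch, assuming the per-pixel displacements dXk+1 and dYk+1 have already been estimated; the function name and the choice of bilinear resampling are illustrative, not from the patent:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def stabilize_on_previous(image_k1, dX, dY):
    """Resample image k+1 with the opposite per-pixel displacement so that
    it is registered on image k, i.e. evaluate
    image_{k+1}(X + dX(X, Y), Y + dY(X, Y)), which equals image_k(X, Y)."""
    h, w = image_k1.shape
    Y, X = np.mgrid[0:h, 0:w].astype(float)
    # map_coordinates expects (row, col) = (Y, X) sample positions
    return map_coordinates(image_k1, [Y + dY, X + dX], order=1, mode='nearest')
```

On a synthetic horizontal shift of one pixel, the warped image k+1 reproduces image k everywhere except at the border, where the landscape has left the field of view.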
Let us now examine the particular case of a stationary scene with no camera translation.

When the camera undergoes pure 3D rotations, the relation between the 3D Cartesian camera coordinates before and after the camera movement is:

(x', y', z')' = R * (x, y, z)'

where R is a 3 x 3 rotation matrix and alpha = da, beta = db, gamma = dg are, respectively, the yaw angle, the pitch angle and the roll angle of the camera between times t and t'.
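As an illustration, a sketch building a 3 x 3 rotation from the three angles. The axis conventions and the composition order Rz @ Rx @ Ry are assumptions, since the text only states that R is parameterized by the yaw, pitch and roll angles:

```python
import numpy as np

def rotation_matrix(da, db, dg):
    """3 x 3 rotation R built from yaw da, pitch db and roll dg
    (composition order and axis conventions are assumed)."""
    cy, sy = np.cos(da), np.sin(da)
    cp, sp = np.cos(db), np.sin(db)
    cr, sr = np.cos(dg), np.sin(dg)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw: vertical axis
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch: lateral axis
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll: optical axis
    return Rz @ Rx @ Ry
```

Whatever the convention chosen, the product of the three elementary rotations remains orthonormal with determinant 1, which is all the derivation requires.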
In 3D camera polar coordinates, the relation before and after the camera movement is:

(d', a', b')' = K(da, db, dg) * (d, a, b)'

The scene being stationary, we have:

d' = d for all the points of the landscape
X = F1(X, Y) . x/z = F1(X, Y) . tg(a)

Y = F1(X, Y) . y/z = F1(X, Y) . tg(b)

When the focal length of the camera at time t changes, we have:

F2(X, Y) = s . F1(X, Y), where s is called the zoom parameter; the coordinates (X', Y') of the image plane can then be expressed as:

X' = F2(X, Y) . x'/z' = F2(X, Y) . tg(a')

Y' = F2(X, Y) . y'/z' = F2(X, Y) . tg(b')

There are therefore four parameters which can vary.
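The projection equations above can be illustrated with a constant focal length, i.e. ignoring the field dependence F1(X, Y) introduced by distortion; the numerical values are arbitrary:

```python
def project(x, y, z, f):
    """Pinhole projection onto the image plane:
    X = f * x/z = f * tg(a),  Y = f * y/z = f * tg(b)."""
    return f * x / z, f * y / z
```

A zoom multiplies the focal length, F2 = s * F1, so for the same 3D point the image-plane coordinates are simply scaled by the zoom parameter s.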
Let us consider the practical case, for solving the optical-flow equation, of estimating the yaw, pitch and roll speeds and the change of focal length.

B(:,:,1) = imagek+1(Ai, Aj) - imagek(Ai, Aj)

If we set:

A(:,:,1) = DeriveeY(Ai, Aj) . (1 + (Aj.pasV/F1(X, Y))^2)

A(:,:,2) = DeriveeX(Ai, Aj) . (1 + (Ai.pasH/F1(X, Y))^2)

A(:,:,3) = DeriveeY(Ai, Aj) . Ai . pasH/pasV - DeriveeX(Ai, Aj) . Aj . pasV/pasH

A(:,:,4) = DeriveeX(Ai, Aj) . Ai + DeriveeY(Ai, Aj) . Aj

Xtrans(1) = F1(0, 0) . bVk+1.Ti / pasV

Xtrans(2) = F1(0, 0) . aVk+1.Ti / pasH

Xtrans(3) = gVk+1.Ti

Xtrans(4) = (s - 1) . Ti

we then seek to solve the equation:

A * Xtrans - B = 0
The least-squares method is used to minimize the norm.

The equation can be written for all the points of the image. But to improve precision and limit the computations, it can be noted that in the equation A * Xtrans = B the term B is the difference of two successive images, so that all the values that are too weak or too close to the noise can be eliminated. In the tests carried out, all the points lying between +/-0.6 Max(B) and +/-Max(B) were kept. For the sequences studied, the number of points ranged from a few tens to about 1500. A fixed number of points of the order of 1000, close to the maximum, can also be taken among the sequences.
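A compact sketch of this least-squares step, with the thresholding of the frame difference B described above (the function name and the keep_fraction parameter are illustrative):

```python
import numpy as np

def solve_flow(A, B, keep_fraction=0.6):
    """Least-squares solution of A @ Xtrans = B, keeping only the points
    where the frame difference |B| lies between keep_fraction * Max|B|
    and Max|B|. A is (N, 4) with the pitch, yaw, roll and zoom columns;
    B is the (N,) difference of the two successive images."""
    mask = np.abs(B) >= keep_fraction * np.abs(B).max()
    Xtrans, *_ = np.linalg.lstsq(A[mask], B[mask], rcond=None)
    return Xtrans
```

On noise-free synthetic data the four unknowns are recovered exactly; on real imagery the thresholding discards the points dominated by sensor noise.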
With reference to Figure 4, the imaging system allowing the implementation of the stabilization step will now be described briefly.

The shooting camera 13 delivers its video image signal to a low-pass filter 42 as well as to a processing block 43, which receives the stabilization data on a second input and supplies the stabilized images as output. On its second input, block 43 therefore receives the rotation speeds to be applied to the images taken by the camera 13. The output of the filter 42 is connected to two buffer memories 44, 45, which respectively store the two filtered images of the present instant t and of the past instant t-1. The two buffer memories 44, 45 are connected to two inputs of a computation component 46, which is either an ASIC or an FPGA (field programmable gate array). The computation component 46 is connected to a working memory 47 and, at its output, to the processing block 43. All the electronic components of the system are controlled by a management microcontroller 48.
Having now described the stabilization step, the harmonization step can be addressed.

The harmonization implemented in the guidance method of the invention is an extrapolation of the stabilization step, the sighting device and the imaging device of the rocket being mounted, before launch, on the same carrier.

The stabilization of the images of the imaging device of the rocket is a self-stabilization process, in which the image of instant t is stabilized on the image of instant t-1. In other words, each image of the imaging system is harmonized on the previous one. To harmonize the two devices, the two images of the two devices are taken at the same instant t and are stabilized one on the other, that is to say the two devices are harmonized.
Harmonizing amounts to making the optical axes of the two devices coincide, to matching the pixels of the two images pair by pair and, preferably, to making these pixels coincide as well.

Naturally, the two devices to be harmonized according to this process must be of the same optical nature, that is to say they must operate at comparable wavelengths.

In the present case, the two devices both taking images of the same scene, the images of the scene taken at the same instants by the two devices are filtered, in a terrestrial frame of reference, in a low-pass filter so as to retain only the low spatial frequencies, and the optical-flow equation between these respective pairs of images of the two devices is solved in order to determine the rotations and the variation of the ratio of the respective zoom parameters to be applied to these images in order to harmonize them one on the other.
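A sketch of this harmonization computation under simplifying assumptions: constant focal length F1, image-centered pixel coordinates, and a Gaussian low-pass with an arbitrary sigma; pasH and pasV are the horizontal and vertical pixel pitches from the flow equations above. The function name and default values are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def harmonize(img_sight, img_rocket, F1, pasH=1.0, pasV=1.0, sigma=3.0):
    """Estimate the rotation and zoom corrections aligning the rocket
    imager on the sighting device: low-pass filter the two simultaneous
    images, then solve the optical-flow equation between the pair by
    least squares, exactly as between two successive frames."""
    a = gaussian_filter(np.asarray(img_sight, dtype=float), sigma)
    b = gaussian_filter(np.asarray(img_rocket, dtype=float), sigma)
    h, w = a.shape
    Aj, Ai = np.mgrid[0:h, 0:w] - np.array([h // 2, w // 2])[:, None, None]
    dY, dX = np.gradient(a)          # image derivatives DeriveeY, DeriveeX
    B = (b - a).ravel()              # difference of the two filtered images
    A = np.column_stack([            # same four columns as in the text
        (dY * (1 + (Aj * pasV / F1) ** 2)).ravel(),               # pitch
        (dX * (1 + (Ai * pasH / F1) ** 2)).ravel(),               # yaw
        (dY * Ai * pasH / pasV - dX * Aj * pasV / pasH).ravel(),  # roll
        (dX * Ai + dY * Aj).ravel(),                              # zoom
    ])
    sol, *_ = np.linalg.lstsq(A, B, rcond=None)
    return sol
```

When the two devices are already harmonized the filtered images coincide, B vanishes, and the estimated corrections are zero.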
The initial guidance law, as recalled above, is developed by means of the position, the distance and the speed of the target, on the one hand, and of a flight model, on the other hand.

Having developed the initial guidance law of the rocket, the fire-control operator proceeds to its launch. Up to a certain distance from the target 61, until the rocket acquires the target, the image taken by the imaging device 10 of the rocket is compared with the stored wide-field image 52 of the scene taken at the start with the sighting device 62; in other words, the guidance of the rocket is permanently controlled.

After acquisition of the target 61 by the rocket, the guidance of the rocket is continued until the terminal phase by comparing the image taken by the imaging device 10 of the rocket with the small-field image 53, which is also stored.

Claims

1. Method for guiding a rocket (1) onto a target, in which, the rocket (1) being equipped with self-guidance means comprising an imaging device (10) and trajectory-correction means (11),

- the target is acquired by a sighting device and its position is determined,

- the sighting device and the imaging device (10) of the rocket are harmonized,

- the images of the imaging device (10) of the rocket are stabilized,

- a guidance law is developed,

- the rocket (1) is launched, and

- the rocket is guided according to this law until it itself acquires the target.
2. Guidance method according to claim 1, in which, before launch, an initial guidance law is developed and the rocket (1) is guided according to this initial law until it acquires the target.

3. Guidance method according to claim 1, in which, before launch, an initial guidance law is developed and, after launch, a continuously variable guidance and trajectory-correction law is developed until the rocket (1) acquires the target.

4. Method according to one of claims 1 to 3, in which, in order to harmonize the sighting device and the imaging device (10) of the rocket, an electronic harmonization is carried out according to which, in a terrestrial frame of reference, the images of the scene taken at the same instants by the two devices are filtered in a low-pass filter (42), so as to retain only the low spatial frequencies, and the optical-flow equation between these respective pairs of images of the two devices is solved in order to determine the rotations and the variation of the ratio of the respective zoom parameters to be applied to these images in order to harmonize them one on the other.

5. Method according to one of claims 1 to 4, in which the images of the imaging device (10) of the rocket are stabilized on the landscape, in a terrestrial frame of reference.

6. Method according to claim 5, in which, in the terrestrial frame of reference, the images of the scene taken by the imaging device (10) are filtered in a low-pass filter (42), so as to retain only the low spatial frequencies, and the optical-flow equation is solved to determine the rotations to be applied to the images in order to stabilize them on the previous images.
EP02783193A 2001-09-25 2002-09-23 Method for guiding a rocket Expired - Lifetime EP1432958B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
FR0112330A FR2830078B1 (en) 2001-09-25 2001-09-25 GUIDING PROCESS OF A ROCKET
FR0112330 2001-09-25
PCT/FR2002/003240 WO2003027599A1 (en) 2001-09-25 2002-09-23 Method for guiding a rocket

Publications (2)

Publication Number Publication Date
EP1432958A1 true EP1432958A1 (en) 2004-06-30
EP1432958B1 EP1432958B1 (en) 2006-08-30

Family

ID=8867590

Family Applications (1)

Application Number Title Priority Date Filing Date
EP02783193A Expired - Lifetime EP1432958B1 (en) 2001-09-25 2002-09-23 Method for guiding a rocket

Country Status (5)

Country Link
US (1) US7083139B2 (en)
EP (1) EP1432958B1 (en)
DE (1) DE60214407T2 (en)
FR (1) FR2830078B1 (en)
WO (1) WO2003027599A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107966156A (en) * 2017-11-24 2018-04-27 北京宇航系统工程研究所 A kind of Design of Guidance Law method suitable for the vertical exhausting section of carrier rocket

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7643893B2 (en) 2006-07-24 2010-01-05 The Boeing Company Closed-loop feedback control using motion capture systems
US7813888B2 (en) 2006-07-24 2010-10-12 The Boeing Company Autonomous vehicle rapid development testbed systems and methods
US7885732B2 (en) 2006-10-25 2011-02-08 The Boeing Company Systems and methods for haptics-enabled teleoperation of vehicles and other devices
DE102007054950B4 (en) * 2007-11-17 2013-05-02 Mbda Deutschland Gmbh Method for supporting the automatic navigation of a low-flying missile
US8686326B1 (en) * 2008-03-26 2014-04-01 Arete Associates Optical-flow techniques for improved terminal homing and control
US8068983B2 (en) 2008-06-11 2011-11-29 The Boeing Company Virtual environment systems and methods
WO2010083517A1 (en) * 2009-01-16 2010-07-22 Bae Systems Land & Armaments L.P. Munition and guidance navigation and control unit
IL214191A (en) 2011-07-19 2017-06-29 Elkayam Ami Munition guidance system and method of assembling the same
IL227982B (en) * 2013-08-15 2018-11-29 Rafael Advanced Defense Systems Ltd Missile system with navigation capability based on image processing
US9464876B2 (en) * 2014-05-30 2016-10-11 General Dynamics Ordnance and Tactical Systems, Inc. Trajectory modification of a spinning projectile by controlling the roll orientation of a decoupled portion of the projectile that has actuated aerodynamic surfaces
DE102015000873A1 (en) * 2015-01-23 2016-07-28 Diehl Bgt Defence Gmbh & Co. Kg Seeker head for a guided missile
RU2722903C1 (en) * 2019-10-23 2020-06-04 Акционерное общество "Научно-производственное предприятие "Дельта" Method of identifying a target using a radio fuse of a missile with a homing head
RU2722904C1 (en) * 2019-10-23 2020-06-04 Акционерное общество "Научно-производственное предприятие "Дельта" Method of target detection by a missile radio fuse

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3459392A (en) * 1959-09-24 1969-08-05 Goodyear Aerospace Corp Passive homing guidance system
US3712563A (en) * 1963-12-04 1973-01-23 Us Navy Automatic path follower guidance system
US3794272A (en) * 1967-02-13 1974-02-26 Us Navy Electro-optical guidance system
US3986682A (en) * 1974-09-17 1976-10-19 The United States Of America As Represented By The Secretary Of The Navy Ibis guidance and control system
DE3334729A1 (en) * 1983-09-26 1985-04-11 Siemens AG, 1000 Berlin und 8000 München Method for aligning a homing head of a self-controlled missile
US4881270A (en) * 1983-10-28 1989-11-14 The United States Of America As Represented By The Secretary Of The Navy Automatic classification of images
US6491253B1 (en) * 1985-04-15 2002-12-10 The United States Of America As Represented By The Secretary Of The Army Missile system and method for performing automatic fire control
GB8925196D0 (en) * 1989-11-08 1990-05-30 Smiths Industries Plc Navigation systems
US5785281A (en) * 1994-11-01 1998-07-28 Honeywell Inc. Learning autopilot
DE19546017C1 (en) * 1995-12-09 1997-04-24 Daimler Benz Aerospace Ag Missile weapon system
US5881969A (en) * 1996-12-17 1999-03-16 Raytheon Ti Systems, Inc. Lock-on-after launch missile guidance system using three dimensional scene reconstruction
US6347762B1 (en) * 2001-05-07 2002-02-19 The United States Of America As Represented By The Secretary Of The Army Multispectral-hyperspectral sensing system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO03027599A1 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107966156A (en) * 2017-11-24 2018-04-27 北京宇航系统工程研究所 A kind of Design of Guidance Law method suitable for the vertical exhausting section of carrier rocket
CN107966156B (en) * 2017-11-24 2020-09-18 北京宇航系统工程研究所 Guidance law design method suitable for carrier rocket vertical recovery section

Also Published As

Publication number Publication date
FR2830078B1 (en) 2004-01-30
US20040245370A1 (en) 2004-12-09
DE60214407T2 (en) 2007-05-10
US7083139B2 (en) 2006-08-01
DE60214407D1 (en) 2006-10-12
WO2003027599A1 (en) 2003-04-03
FR2830078A1 (en) 2003-03-28
EP1432958B1 (en) 2006-08-30

Similar Documents

Publication Publication Date Title
EP1432958B1 (en) Method for guiding a rocket
EP3273318B1 (en) Autonomous system for collecting moving images by a drone with target tracking and improved target positioning
EP1298592B1 (en) Stabilizing images of a scene, gray level offset correction, moving object detection and adjustment of two image pickup devices based on image stabilization
EP2048475B1 (en) Method of determining the attitude, position and velocity of a mobile unit
EP0432014A1 (en) Optoelectronic assistance system for air-raid missions and navigation
EP3268691B1 (en) Airborne device for detecting shots and for piloting assistance
EP1623567A1 (en) Method of transmitting data representing the spatial position of a video camera and system implementing said method
EP0484202A1 (en) System for the transfer of alignment between the inertial system of a carried vehicle and that of the carrier vehicle
JP6529411B2 (en) Mobile object identification device and mobile object identification method
WO2005066024A1 (en) Carrier-based modular optronic system
Pajusalu et al. Characterization of asteroids using nanospacecraft flybys and simultaneous localization and mapping
EP1676444B1 (en) Method and device for capturing a large-field image and region of interest thereof
FR2788845A1 (en) Shooting supervisor for machine gun, rocket launcher or anti-tank includes first and second modules to measure and visualize line of sight of projectile
EP0985900A1 (en) Method and device for guiding a flying device, in particular a missile, to a target
EP3929692B1 (en) Coded aperture seeker for navigation
EP0013195B1 (en) Air-ground radar telemetry apparatus for airborne fire-control system and use of such apparatus in a fire control system
FR2828314A1 (en) Electronic stabilization of images taken or observed with an imaging system such as portable thermal imaging binoculars wherein the output signals are passed through a low pass filter prior to image processing
JP7394724B2 (en) Space situation monitoring business equipment, space situation monitoring system, monitoring equipment, and ground equipment
EP0176121B1 (en) Method for detecting and eliminating parasitic pictures produced by a pyramidal ir dome
EP1785688B1 (en) Method and apparatus for determining the rotation speed of the projectile-target line and guidance apparatus for a projectile, especially an ammunition
FR3079943A1 (en) ELECTRONIC DEVICE AND METHOD FOR CONTROLLING A DRONE WITH TRAVELING COMPENSATION EFFECT, ASSOCIATED COMPUTER PROGRAM
WO2020144418A1 (en) Title of the invention: firearm for precisely simulating a shooting sequence with a calculation of the point of impact of the projectile(s)
FR3082012A1 (en) ELECTRONIC DEVICE, AND METHOD, FOR CONTROLLING A DRONE, COMPUTER PROGRAM
FR2737071A1 (en) Matrix array camera movement system for site surveillance and e.g. being mounted on platform on boat - has random position camera with position picture and coordinate memorisation and display, and is provided with gyroscope
FR2731785A1 (en) Target seeking head for automatic target tracking

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040416

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE GB IT SE

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

RAP2 Party data changed (patent owner data changed or rights of a patent transferred)

Owner name: SAGEM DEFENSE SECURITE

REF Corresponds to:

Ref document number: 60214407

Country of ref document: DE

Date of ref document: 20061012

Kind code of ref document: P

REG Reference to a national code

Ref country code: SE

Ref legal event code: TRGR

GBT Gb: translation of ep patent filed (gb section 77(6)(a)/1977)

Effective date: 20061220

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20070531

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 20120824

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20120829

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20130820

Year of fee payment: 12

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20130823

Year of fee payment: 12

REG Reference to a national code

Ref country code: SE

Ref legal event code: EUG

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130924

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130923

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 60214407

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20140923

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 60214407

Country of ref document: DE

Effective date: 20150401

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150401

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20140923