EP3260808B1 - Method for correcting the deviation of a weapon system - Google Patents

Method for correcting the deviation of a weapon system

Info

Publication number
EP3260808B1
EP3260808B1 (application EP17000925.2A)
Authority
EP
European Patent Office
Prior art keywords
image
projectile
action
effect
time
Prior art date
Legal status
Active
Application number
EP17000925.2A
Other languages
German (de)
English (en)
Other versions
EP3260808A2 (fr)
EP3260808A3 (fr)
Inventor
Hans-Ludwig Reischmann
Thomas Frei
Axel Pfersmann
Current Assignee
Diehl Defence GmbH and Co KG
Original Assignee
Diehl Defence GmbH and Co KG
Priority date
Filing date
Publication date
Application filed by Diehl Defence GmbH and Co KG filed Critical Diehl Defence GmbH and Co KG
Publication of EP3260808A2
Publication of EP3260808A3
Application granted
Publication of EP3260808B1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G3/00: Aiming or laying means
    • F41G3/14: Indirect aiming means
    • F41G3/142: Indirect aiming means based on observation of a first shoot; using a simulated shoot
    • F41G3/16: Sighting devices adapted for indirect laying of fire
    • F41G3/165: Sighting devices adapted for indirect laying of fire using a TV-monitor
    • F41G3/04: Aiming or laying means for dispersing fire from a battery; for controlling spread of shots; for coordinating fire from spaced weapons
    • F41G3/08: Aiming or laying means with means for compensating for speed, direction, temperature, pressure, or humidity of the atmosphere

Definitions

  • the invention relates to a method for correcting the offset of a weapon system, in which a projectile is fired from a barrel weapon of the weapon system in a target direction at an object, a direction of an impact point of the projectile on the object is detected, the direction difference between the target direction and the impact point direction is detected as an offset, and a target direction of a subsequent shot at the object is corrected using the offset.
  • Shots from barrel weapons are subject to a certain degree of dispersion in terms of their accuracy.
  • a small dispersion within a single shot sequence with the barrel remaining in the same position can be distinguished from a larger dispersion over longer periods of time and between different weapon models.
  • machine weapons have a very small short-term dispersion, which is generally tolerable.
  • the large dispersion is caused by longer-term changes, such as transport of the weapon, temperature effects, wind changes from day to day, and the like. The large dispersion can therefore lead to systematically incorrect placement in a combat situation, in which the barrel weapon hits a small scatter area with high accuracy and in a reproducible manner, but this scatter area of impact is next to the target object.
  • a positioning shot can be fired at the target object or generally in a desired direction.
  • the impact of the positioning projectile is then detected and an impact point direction is determined from the barrel to the point of impact.
  • the distance from the impact to the target object, or the difference in direction between the impact point direction and the target direction, can now be determined as the offset.
  • a later shot can be corrected using the offset so that this later shot hits the target object.
  • EP 3 034 986 A1 describes a machine gun with an associated control computer for installation on a vehicle.
  • the control computer analyzes an image or a sequence of images from an image recording device, which shows a field of fire at the time of impact on a target.
  • the image or sequence of images shows both the impact or firing point and the marking of the target. From the data on the marking and the data on the recorded impact or firing point, the control computer calculates the offset of the shot to the actual target location.
  • DE 697 20 749 T2 deals with a target aiming system with which automatic gun re-orientation is possible. It is proposed to record a sequence of individual images after a shot has been fired, which show the movement of the fired shot towards a target, to determine an actual trajectory of the shot from this, and to compare the actual trajectory with predicted trajectories to determine deviations. In order to obtain information on the firing accuracy, it is further proposed to monitor the area of an image containing the target for a change which indicates an explosion.
  • the weapon system can comprise a machine gun for firing ammunition in bursts, in which several shots can be fired automatically one after the other. Subsequent shots are expediently in an automated relationship to the previous shot, for example in the firing time difference and/or the automated aiming correction. It is practical to have a control computer that automatically aligns a barrel weapon of the weapon system, for example based on instructions from an operator.
  • the aiming direction is conveniently known, for example from an aiming process.
  • the object can be recorded in a target image and an operator can mark the imaged object, for example with a crosshair.
  • the marking determines the direction from the weapon system to the object and thus the aiming direction.
  • the aiming direction does not have to be a geo-related absolute direction, but can be a relative direction, for example to a reference direction of the weapon system, such as an image corner, a barrel alignment or the like.
  • the point of impact of the projectile is a point at which the projectile visibly produces an effect, for example as an explosion whose flash is visible. It is also possible for dust to be stirred up by a detonation pressure wave, a cloud of smoke or another visible effect.
  • the point of impact is on the object and can therefore be directly on the object, for example in the form of an impact on the object, or in the vicinity of the object. If, for example, the projectile is fired with a timer and ignites near the object, for example to achieve a scattering effect, then in this case too the point of impact of the projectile is on the object.
  • the offset does not have to be the absolute offset and can be subsequently changed or corrected, for example by a lead for an accelerating target object.
  • Another parameter can also be used to correct the offset, for example the measurement of changing wind, a vibration of the weapon, a predicted acceleration of the target object and the like.
  • a possible method for correcting the weapon system's alignment can include the steps of determining the target direction, determining the direction of the point of impact, determining the offset and correcting the alignment of the machine weapon.
  • an optical system with a camera can record the object, expediently in a regular, in particular continuous, sequence of images. These images can be shown to an operator on a screen so that he can follow the object, like in a film. The operator can select the object and point a marker at it, for example a crosshair. If the object moves relative to the camera orientation, the operator can move the marker along with the object, or the object is automatically tracked so that the marker automatically follows the object.
  • the object is recorded as such in the object image and recognized in the subsequent object images so that the marker is moved from object image to object image with the moving object.
  • the position of the marker in an object image can be used to determine the direction of the target when the projectile is fired, for example relative to the camera orientation or the object image, or absolutely if an absolute orientation of the camera or the object image in space is known.
  • an object image can be examined using image processing methods for image anomalies that can be attributed to a bullet impact. If an image anomaly is found that resembles a bullet impact to a certain extent, the location of this image anomaly in the object image can be determined. The object image is now an effective image that shows the impact of the bullet. The direction of the point of impact can now be determined from the location of the bullet impact or the image anomaly in the effective image and can possibly be the location itself.
  • the difference in direction can now be determined from the aiming direction and the direction of the point of impact, for example by subtraction, and used as at least a provisional offset.
  • the weapon's alignment is now corrected using the offset, and a subsequent shot is aimed at the object. If necessary, the offset can be further corrected as described above.
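The geometry of this procedure can be sketched in a few lines. The following Python fragment is an illustrative approximation only, not taken from the patent: it assumes a simple pinhole camera with a known field of view, uses a linear pixel-to-angle mapping (valid for small angles), and all names and numbers in it are hypothetical.

```python
import numpy as np

def pixel_to_direction(px, py, image_w, image_h, fov_x_deg, fov_y_deg):
    """Map a pixel position to an angular direction (azimuth, elevation)
    relative to the camera axis; small-angle linear approximation."""
    az = (px - image_w / 2) / image_w * fov_x_deg
    el = (image_h / 2 - py) / image_h * fov_y_deg  # image y grows downward
    return np.array([az, el])

# Target direction from the marker position in the target image,
# impact point direction from the anomaly position in the effect image.
target_dir = pixel_to_direction(412, 306, 1024, 768, 10.0, 7.5)
impact_dir = pixel_to_direction(455, 331, 1024, 768, 10.0, 7.5)

offset = impact_dir - target_dir      # direction difference = the offset
corrected_aim = target_dir - offset   # corrected aim for the next shot
```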
  • the positioning shot can be carried out with the same or different ammunition as the later and corrected combat shot, for example to make the projectile effect more visible and more pronounced.
  • a positioning shot can also consist of a salvo of several individual shots that are aimed at the object one after the other and whose effect is recorded in an effective image.
  • the actual image and the target image can be superimposed, for example if the orientation of a camera recording the two images is not changed between the two recording times. If the target direction is recorded in the target image and the effective point direction in the actual image, both directions can then be directly compared with each other, for example subtracted from each other, so that the difference in direction and thus the offset results. However, it is also possible that the later actual image is shifted to the target image by a movement of the recording camera, which was tracked to the object by the operator or by auto-tracking, for example. A movement of a vehicle carrying the weapon system can also lead to a movement of the camera and thus a shift of the actual image to the target image.
  • a target image depicting the object is recorded and the target direction is determined in the target image.
  • the target image can now be correlated with the later recorded actual image using pattern recognition in the two images.
  • the two images can be superimposed so that an object depicted in both images is brought into alignment.
  • a relative position of the two images to each other can be determined.
  • the difference in direction can be determined from the relative position of the two images and the position of the target direction in the target image and the effective point direction in the actual image.
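One common way to determine such a relative position of two images is phase correlation. The sketch below is an assumption on my part, not the patent's prescribed method; it works on two equally sized grayscale arrays and only recovers a pure translation.

```python
import numpy as np

def relative_shift(target_img, effect_img):
    """Estimate the translation between two images by FFT phase
    correlation; valid for pure shifts (no rotation or scale)."""
    F1 = np.fft.fft2(target_img)
    F2 = np.fft.fft2(effect_img)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12            # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap large indices to negative shifts (sign convention depends
    # on which image is taken as the reference).
    if dy > corr.shape[0] // 2:
        dy -= corr.shape[0]
    if dx > corr.shape[1] // 2:
        dx -= corr.shape[1]
    return dx, dy
```

The recovered shift can then be subtracted from the effect-point position before forming the direction difference.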
  • a target marker may be carried along with the object, for example in subsequent object images.
  • for determining the target direction, the direction to the object at the time the projectile is fired is expediently decisive.
  • a target image depicting the object at the time of firing is recorded, and the target direction is determined in the target image.
  • the target image is therefore expediently recorded at the time of firing.
  • the image whose recording time is the shortest time apart from the time of firing can be used as the target image.
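In code, selecting that target image amounts to a minimum search over timestamps; a minimal sketch with hypothetical data structures:

```python
def select_target_image(frames, t_fire):
    """Pick the object image whose recording time is closest to the
    firing time; `frames` is a list of (timestamp, image) pairs."""
    return min(frames, key=lambda frame: abs(frame[0] - t_fire))
```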
  • a sequence of object images depicting the object is recorded, with a target marking in these images being moved along with the movement of the object.
  • the target image can therefore be an image in which the target or object is marked when fired.
  • the invention proposes that a sequence of object images depicting the object be recorded, the images of the sequence be examined for the projectile effect, and several object images in which the projectile effect is found pictorially are evaluated as effect images for a development of the pictorial projectile effect in the effect images. For example, an impact point and also an impact time can be deduced from the extent of a smoke or dust cloud. It would also be possible to deduce the type of projectile from the course of a projectile's burn and thus distinguish a marker shot from a shot from another gun.
  • the development can be compared with a signal course that is typical of the projectile, in particular of a marker fire of a special type of ammunition for a positioning shot. If, for example, a tracer projectile is fired, the tracer can be followed in the several effect images and an impact point can be clearly assigned to the projectile.
  • when a positioning shot is fired, the weapon system firing it can easily be detected from the target object. This creates the risk of a counter-attack.
  • when fired upon, the object may defend itself by generating interference signals, for example in the form of flares.
  • interference signals are recognized as such in a sequence of effect images or object images. Such recognition can occur by evaluating their temporal occurrence, for example the start of the signal, a temporal signal profile and/or a radiation spectrum of the interference signals. The signals can then be compared with a corresponding parameter of an expected projectile impact or with a corresponding parameter of known interference elements.
  • if the temporal occurrence of an image conspicuity cannot be caused by the impact of a projectile, because, for example, the projectile cannot yet have reached the object, the corresponding image conspicuity can be recognized as an interference signal.
  • an object image can therefore have a number of image anomalies that must be examined to determine whether they represent the desired projectile effect.
  • Such an evaluation can be computationally intensive and therefore time-consuming.
  • confusion can also occur, meaning that the offset is determined incorrectly.
  • an image evaluation in an object image is spatially limited to an expected area in which the projectile effect is expected.
  • An expected area can, for example, be limited to the local environment of the nominal target point, for example around the target direction.
  • the size of the expected range is expediently a maximum of half the image area of the object image, in particular a maximum of 1/10 of the image area of the object image.
  • An evaluation result is, for example, a calculation result as to whether an image conspicuity could represent the projectile effect, for example a probability that an image conspicuity represents the projectile effect. Weighting could be carried out in such a way that an image conspicuity outside the expected range is classified as a projectile effect with a lower probability than the same image conspicuity within the expected range.
  • An expectation area can be divided into several areas.
  • the expectation area can include a core area with the highest weighting and an edge area with a lower weighting, possibly even a third outer area with an even lower weighting, which is however greater than the weighting outside the expectation area.
  • a continuous weighting curve that decreases with increasing distance from an expectation point at which the projectile effect of the positioning shot is expected is also possible.
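Both weighting variants described above can be sketched as follows. The tier radii and weight values are illustrative assumptions; the patent only requires that the weighting decrease from the inside outwards.

```python
import numpy as np

def anomaly_weight_tiered(pos, expected, core_r, edge_r):
    """Tiered weighting: highest in the core area, lower in the edge
    area, lowest (but non-zero) outside the expectation area."""
    d = np.linalg.norm(np.asarray(pos, float) - np.asarray(expected, float))
    if d <= core_r:
        return 1.0
    if d <= edge_r:
        return 0.6
    return 0.2

def anomaly_weight_continuous(pos, expected, sigma):
    """Continuous alternative: Gaussian falloff with distance from the
    expectation point."""
    d = np.linalg.norm(np.asarray(pos, float) - np.asarray(expected, float))
    return float(np.exp(-0.5 * (d / sigma) ** 2))
```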
  • the size of the expected area in the object image is not fixed, but can be selected depending on one or more parameters.
  • the parameter of a design-specific deviation spread of the gun can be used for this.
  • the individual deviation spread of the gun is also a useful parameter of this kind.
  • a wind strength and/or a wind direction can be taken into account.
  • a movement or the speed of a gun-carrying vehicle during firing and/or the temperature of a gun part are parameters that can advantageously be taken into account.
  • the invention provides that a time window within which the projectile effect is expected is determined. An evaluation of a series of object images can then be limited to those object images that are at least partially within the time window. Generally speaking, a sequence of object images depicting the object is recorded and images of the sequence are examined for the projectile effect, with a time of impact of a projectile effect within the time window being given a higher weighting than outside the time window.
  • the occurrence of an image abnormality in one or more object images can be used to determine the time of effect, i.e. the time at which a suspected projectile effect occurs. If this time of effect is outside the time window, the probability that the corresponding image abnormality is a different effect, for example of a different gun, is greater than if the time of effect is within the time window.
  • the time of effect can be a time at which the projectile develops its effect, in particular begins to develop. If, for example, a flare from an explosion or the combustion of a marker projectile lasts for a longer period of time and is shown in several object images, the time of effect is at the beginning of this effect. This also applies if the time of effect itself is not visible in the images, for example because an impact is obscured. For example, the subsequent expansion of a cloud of smoke or dust caused by the projectile explosion can be used to determine the time of impact or explosion of the projectile based on the rate of expansion. This time can then be determined as the time of impact.
  • an image abnormality that lies outside the time window can be omitted if it is clear that this image abnormality cannot be attributed to a projectile effect due to time constraints. According to the invention, it is provided that, in order to determine the direction of the point of impact from which the placement is determined, an image abnormality that could represent a projectile effect whose time of effect lies outside the time window can be discarded.
  • the position and/or size of the time window can be determined from the distance of the object from the machine gun and/or the timing of the projectile.
  • the time window is placed symmetrically around the time of explosion of the projectile known from the timing.
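A minimal sketch of such a time window and the associated temporal weighting, assuming the expected time of effect is derived from range and an average projectile speed (with timed ammunition, the preset timer value would be used instead); all constants are illustrative:

```python
def time_window(t_fire, distance_m, v_projectile, half_width):
    """Place a time window symmetrically around the expected time of
    effect t_E, here estimated as firing time + flight time."""
    t_expected = t_fire + distance_m / v_projectile
    return (t_expected - half_width, t_expected + half_width)

def temporal_weight(t_effect, window):
    """Full weight inside the window, reduced but non-zero outside."""
    lo, hi = window
    return 1.0 if lo <= t_effect <= hi else 0.2

# e.g. 2000 m range at an average 900 m/s, half-width 0.3 s
w = time_window(0.0, 2000.0, 900.0, 0.3)   # roughly (1.92 s, 2.52 s)
```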
  • the exposure period or integration period for recording the effective image can be determined depending on the position and size of the time window. It is advisable for the integration to take place within the time window. In this way, the evaluation can be limited to a single image.
  • the integration period is placed around an expected time of impact, for example symmetrically around the expected time of impact.
  • the invention provides that a development of the pictorial projectile effect is evaluated in a sequence of effect images. In this way, for example, the expansion of a smoke cloud can be used to calculate back the time of its formation and thus the time of effect.
  • it may be that the burn of a marker projectile is not visible from the beginning, for example because the burn starts too small.
  • if the end of the burn or a significant burn phase is recognized and, for example, the total duration or the course of the burn is known, the time of effect, i.e. the time at which the projectile effect began, can also be determined in this way.
  • a time of effect is determined from the retrospective consideration of the development, and it is checked whether the time of effect is within the time window. Particularly when the effect images are recorded in the infrared spectral range, it is advantageous to examine the development of the projectile effect. The enlargement of a heat area, for example caused by the projectile explosion, can be tracked, and from this the time of impact can be determined by back calculation. It may also be the case that the integration time of the recording of an infrared effect image is longer than the time window. In general, i.e. regardless of the time window, the time of effect can only be determined imprecisely from a single effect image. By back calculating an expanding heat cloud, however, conclusions can be drawn about the time and/or place of its formation.
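The back calculation can be illustrated with a linear fit: if the cloud radius grows roughly linearly, extrapolating the fitted line back to radius zero estimates the time of effect. A sketch under that assumption, with made-up sample values:

```python
import numpy as np

def back_calculate_onset(times, radii):
    """Fit the growth of a heat/smoke cloud (radius vs. time) with a
    straight line and extrapolate back to radius 0 to estimate the
    time of effect; assumes roughly linear expansion."""
    slope, intercept = np.polyfit(times, radii, 1)
    return -intercept / slope   # time at which the fitted radius is zero

# radii in pixels measured in three effect images (illustrative data)
t_effect = back_calculate_onset([2.10, 2.15, 2.20], [4.0, 7.1, 10.2])
# -> about 2.03 s, slightly before the first observation
```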
  • the image scene in an object image may have strong signal structures and contrasts.
  • Such effects can significantly increase the potential for error in detecting the impact of a projectile, but they can be reduced by a clever choice of the spectral range used by a recording camera.
  • Such processing is expediently carried out using a previously recorded object image.
  • at least one object image of the object can be recorded between the firing of the projectile and the time at which the projectile takes effect, with the later recorded effect image being processed using this object image, for example by amplifying image differences.
  • Such processing can in particular consist of image subtraction, so that, for example, the object image and the effect image are subtracted from one another.
  • the projectile effect can now be determined from a remaining image conspicuity and from this the direction of the effect point. Strong signal structures can also be reduced or even suppressed in this way, provided they change sufficiently slowly.
  • the period from which an object image used for this image processing is taken is expediently determined by the fluctuation of the signal structure in a series of object images, i.e. by the temporal change of the signal structure.
  • a reference image is obtained from several object images and the actual image is processed using the reference image, for example by subtracting the reference image from the actual image.
  • the several object images can be a sequence of object images. Structures with strong or rapid fluctuations cannot be eliminated in this way, but they are given less weight and thus partially averaged out.
  • the reference image can be formed from an average of the object images or an interpolation of the object images. If, for example, a clear development of a structure can be recognized from several object images, this structure can be extrapolated to the point in time of the actual image to be examined by interpolation and thus fed into image processing, for example image subtraction.
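A compact sketch of this reference-image subtraction, assuming 8-bit grayscale frames as NumPy arrays; the averaging variant is shown, and the gain corresponds to the signal amplification mentioned below:

```python
import numpy as np

def amplified_difference(frames, current, gain=5.0):
    """Form a reference image as the mean of several previous object
    images, subtract it from the current effect image and amplify the
    remaining differences, suppressing slowly changing scene structure."""
    reference = np.mean(np.stack(frames).astype(float), axis=0)
    diff = np.abs(current.astype(float) - reference)
    return np.clip(diff * gain, 0, 255)
```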
  • a reference image is created based on several object images, it is advantageous if the number of object images from which the reference image is formed is determined depending on the image recording frequency and/or a fluctuation in the scene around the object.
  • the fluctuation in the scene can be determined from one or more previous object images by image processing.
  • the difference image is processed with image processing in the form of signal amplification, for example by at least a factor of 2, in particular a factor of 5.
  • image processing is only carried out if no sufficient image abnormality that could represent a bullet impact has been found beforehand.
  • the effect of a projectile usually unfolds in several spectral ranges.
  • a flash of explosion can be detected in the visible range and the expansion of a cloud of smoke can also be detected in the visible range.
  • the expansion of a cloud of heat can be sensed in the infrared spectral range.
  • explosion flashes can be distinguished very well from flames in the ultraviolet range.
  • a projectile effect can be determined from the effective images individually or from a combination of the results from the effective images and from this a point of effect direction. In this way, a point of effect direction can be determined from all the effective images of the cameras. It is also possible to first determine projectile effects, assign matching projectile effects in the effective images to a positioning projectile and determine only one point of effect direction.
  • the visual development of the projectile effect usually unfolds differently in different spectral ranges. For example, an explosion flash in the visible and/or ultraviolet range is very short, while the spread of a heat cloud takes considerably longer. The spread of a smoke or dust cloud takes even longer. It is therefore advantageous if the development of the visual projectile effect is evaluated in the impact images in each spectral range. The development of the visual projectile effects can then be compared with a signal curve expected for the positioning shot, for example for an explosion flash, a spreading heat cloud or an expanding smoke cloud.
  • it can be useful to limit the spectral range used to examine the impact of the projectile. This can suppress false signals.
  • at least one spectral range is limited to a spectral band around a marking color of the projectile whose width is less than a factor of 1.5.
  • the factor can also be selected to be smaller, for example 1.2 or even 1.1.
  • the factor 1.5 means that the frequency of the upper end of the band is a maximum of 1.5 times higher than the frequency of the lower end of the band.
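As a worked example, placing the band geometrically around a marking color keeps the upper/lower frequency ratio exactly at the chosen factor; the center frequency used here is illustrative:

```python
import math

def band_edges(f_center, factor=1.5):
    """Band limits placed geometrically around the marking color so
    that the upper edge is exactly `factor` times the lower edge."""
    f_lo = f_center / math.sqrt(factor)
    f_hi = f_center * math.sqrt(factor)
    return f_lo, f_hi

# e.g. a marking color at 550 THz (green): band of roughly 449-674 THz
print(band_edges(550e12))
```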
  • a positioning shot is fired with marker ammunition that differs in its beam characteristics upon impact or explosion from ammunition used to combat the object.
  • ammunition with a special light characteristic can be used that differs from the subsequent, offset-corrected projectiles.
  • the probability of error can be further reduced if the signal characteristics of the projectile effect are examined in terms of their intensity over time in at least two different spectral ranges.
  • Another way to reduce false detections is to fire a salvo of several projectiles at the object.
  • a projectile effect can now be recognized and a time of effect can be determined.
  • the chronological sequence of the times of effect can now be checked for plausibility, for example by comparing the chronological sequence with the chronological sequence of the firing times.
  • a time window in which the projectile effect is expected is determined for each projectile and the position of the points of effect in relation to the time windows is determined. From the position of the points of effect, it can be concluded whether the projectile effect or the image anomaly found is actually the projectile effect of the marker projectile(s).
  • an allocation value can be formed from the position of the points of effect in relation to the time windows, and an overall allocation value can be formed from a plurality of allocation values. The overall allocation value can indicate an allocation probability of the projectile effects being associated with the projectiles in the salvo.
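A sketch of such allocation values, with the scoring constants as free assumptions (the patent does not prescribe a formula); effects and windows are matched in firing order:

```python
def allocation_value(t_effect, window):
    """Score how well a detected time of effect matches a projectile's
    time window: 1 inside, decaying linearly with distance outside."""
    lo, hi = window
    if lo <= t_effect <= hi:
        return 1.0
    gap = min(abs(t_effect - lo), abs(t_effect - hi))
    return max(0.0, 1.0 - gap / 0.5)   # assumed decay over 0.5 s

def overall_allocation(t_effects, windows):
    """Combine per-shot allocation values for a salvo into an overall
    allocation value (here a simple mean)."""
    values = [allocation_value(t, w) for t, w in zip(t_effects, windows)]
    return sum(values) / len(values)
```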
  • another way to reduce differentiation errors is to fire different marker projectiles in the salvo, each of which has a different projectile effect. All marker projectiles used are expediently special projectiles for a positioning shot, so that they differ from the combat ammunition fired later.
  • each module preferably follows a different strategy or method for reducing detection errors.
  • the different modules can be used individually or in combination, preferably depending on the combat situation, the projectiles present and/or the detection systems present.
  • the invention also relates to a weapon system according to the features of claim 13 with a targeting device for entering and detecting a target direction in which a projectile from a weapon of the weapon system is fired in a target direction at an object.
  • the weapon system also comprises an optical system with a camera and an evaluation unit for determining a direction of an impact point of the projectile on the object.
  • the evaluation unit is prepared to determine a deviation in the form of a directional difference between the target direction and the impact point direction.
  • the targeting device is prepared to correct a target direction of a later shot at the object using the deviation.
  • the evaluation unit is prepared to detect the impact point direction by recording an impact image representing the projectile impact on the object and determining the impact point direction from the impact image.
  • FIG 1 shows an object scene 2 in a landscape.
  • the object scene 2 contains a number of vehicles, of which FIG 1 shows as examples a missile carrier 4 for launching surface-to-air missiles 44, two tanks 6, 8, a truck 10 and a number of other wheeled vehicles 12.
  • a weapon system 14 is shown schematically, which is mounted on a wheeled vehicle 16, which is also only shown schematically.
  • the weapon system 14 comprises a tube weapon 18 for firing projectiles and a target unit 20 with several cameras 22, of which, for simplicity, only two cameras 22 are shown in FIG 1.
  • the weapon system 14 comprises an evaluation unit 24 for the numerical detection of an offset and for the automatic offset correction of the barrel weapon 18.
  • one of the vehicles from the object scene is to be attacked by the weapon system 14.
  • the target unit 20 generates a target image 26 with the aid of one or more of the cameras 22, in which the object scene 2 and in particular the vehicle to be attacked are shown.
  • An operator of the weapon system 14 marks an object 10 in the target image 26 that is to be attacked, for example the front truck. This marking can be done via a target marking 28 in the target image 26, for example a crosshair, which, for the sake of better clarity, is shown in FIG 1 in the object scene 2 on the front wheeled vehicle.
  • the operator places the target marker 28 over the object 10 to be attacked or at the location where a shot from the gun 18 is to be placed.
  • the position of the target marker 28 in the target image 26 is recorded by the target unit 20.
  • the target direction 30 is only a relative target direction 30, since its absolute direction does not have to be known.
  • the orientation of the camera 22 recording the target image 26 is recorded, so that the position of the target image 26 in the object scene 2 and thus also the position of the target marker 28 in the object scene 2 is known.
  • This target direction can now move, for example due to a movement of the object 10 through the object scene 2 or a movement of the weapon system 14 in space.
  • the target direction 30 can be moved along with this, for example due to the position of the target marker 28 in the target image.
  • the target marker 28 in the target image 26 can be tracked by the operator and thus kept on the object 10 to be attacked.
  • Another possibility is to automatically track the object 10 so that the target marker 28 and thus the target direction 30 automatically remain on the object 10.
  • a series of target images 26 are recorded over time, which are compared with the original target image 26 by means of image processing methods so that the object 10 is recognized in the subsequent target images 26. Based on the original position of the target marking 28 on the object 10, the position of the target marking 28 is now automatically adjusted to the new position of the object 10 in the object scene 2.
  • the aiming direction 30 is stored at the moment of firing. This aiming direction is then used to calculate the offset. Previous aiming directions 30 before firing can be stored and used as a basis for other calculations, but initially play no role in calculating the offset.
  • the target image 26, which is shown on a display unit of the target unit 20, is shown enlarged in FIG 2. It shows the target marking 28 on the front vehicle or object 10 at the time the projectile is fired in the target direction 30 from the gun 18.
  • a series of object images 32 are recorded, which are shown schematically in FIG 3. These object images 32 show the object scene 2, which changes slightly over time.
  • the object images 32, or at least a part of them, are examined for a pictorial representation of the projectile effect of the projectile fired from the barrel weapon 18. If such a pictorial projectile effect 34 is found in an object image 32, this object image 32 is used as the effect image 36 for calculating the offset.
  • FIG 4 shows the effect image 36, in which the projectile effect 34 is visible next to the object 10.
  • the projectile effect 34 is recognized as such in the effect image 36, and the position of the pictorial representation of the projectile effect 34 in the effect image 36 is recognized by the evaluation unit 24.
  • in FIG 1 it is shown schematically how the evaluation unit 24 determines an effect point direction 38, R from the position L of the projectile effect 34 in the effect image 36, i.e. the position at which the projectile effect 34 is located relative to a reference point, for example a corner of the image or a point on the weapon system 14, such as the muzzle of the gun.
  • the evaluation unit 24 determines a deviation A by which the fired projectile missed the object 10 or the desired target point therein.
  • the deviation A can consist of the direction difference 40, ⁇ R.
  • the evaluation unit 24 now calculates a deviation correction K of the barrel weapon 18 in the form of a change in direction in which the barrel weapon 18 is then swiveled. Subsequent shots at the object 10 are now fired in the corrected direction so that the object 10 is hit.
  • FIG 1 shows the position of object 10 at the time of launch, as shown in target image 26.
  • FIG 1 shows the projectile effect 34, which only occurs later and is not visible in the target image 26, but only in the effective image 36.
  • the object 10 has already moved a little further, so that the offset is based on the target direction 30 at the time of firing and the effect point direction 38 at the time of impact of the projectile, two directions 30, 38 which are determined at different times. This is shown correspondingly in schematic form in FIG 1.
  • the recording camera 22 is at rest between the time of firing or the recording of the target image 26 and the time of effect or the recording of the effect image 36.
  • the two images 26, 36 can simply be placed on top of each other, so that the target direction, corresponding to the position of the target marking 28 in the target image 26, and the effect point direction, corresponding to the position of the projectile effect 34 in the effect image 36, can easily be related to each other.
  • in general, however, the camera 22 will move during the projectile flight time, so that the object scene 2 moves through the images 26, 36. This can be seen from the illustrations in FIG 2 and FIG 4.
  • the camera 22 has been swiveled slightly to the left during the flight time of the projectile of about 1.5 seconds by the movement of the wheeled vehicle 16 carrying it.
  • At least the actual image 36 is correlated with the target image 26 by image processing so that the two object scenes 2 can be superimposed, for example by static landscape points.
  • the image shift is recognized from this and taken into account when calculating the direction difference 40, ⁇ R.
  • the object images 32 are examined for image anomalies that could show the projectile impact 34. For example, all object images 32 taken one after the other are examined one after the other for a pictorial representation of the projectile impact 34.
  • the object scene 2 can show a variety of events that can easily be confused with a pictorial representation of the projectile effect 34.
  • in FIG 1 it is shown, as an example, how muzzle flashes light up from the two tanks 6, 8.
  • the object 10 to be attacked carries a machine gun, the muzzle flash of which also lights up brightly in the upper area of the object 10.
  • a grenade impact 42 can be seen, which can also easily be confused with a projectile effect 34, especially if the projectile effect 34 consists primarily of a smoke effect or the throwing up of dust and stones.
  • a surface-to-air missile 44 is launched from the missile carrier 4, the hot engine gases of which also cause an unusual image with a very strong signature. It is therefore not immediately apparent to the evaluation unit 24 whether the grenade impact 42, the muzzle flash of the tank 6 or the muzzle flash of the machine gun on the object 10 is the effect of the projectile from the tube weapon 18. If a mix-up occurs, the offset A will be incorrectly calculated and subsequent shots from the barrel weapon 18 will be incorrectly placed.
  • the evaluation unit determines an expectation area 46, which is shown as an example in FIG 4.
  • the expectation area 46 is determined depending on the target direction 30 and is, for example, placed symmetrically around the target direction 30, as shown by way of example in FIG 4, whereby the target direction from the target image 26 is of course used.
  • the impact or explosion of the fired projectile is expected in this expectation area 46.
  • the expectation area 46 can be placed through all object images 32, so that in these only the corresponding expectation area 46 is examined for the pictorial representation of the projectile effect 34.
  • a multi-level expectation area is determined, which can consist of a core area and one or more areas around the core area, for example a middle area and an outer area. Depending on the position of the pictorial representation of the projectile effect 34 in the areas, the weighting is increasingly lower from the inside to the outside.
  • the weighting can also be changed continuously, for example, continuously decreasing as the distance of the projectile effect 34 from the target direction 30 increases. In this way, image anomalies further away from the targeted object 10 are not suppressed, but are increasingly weighted less, so that a closer impact is given greater consideration.
  • the evaluation unit 24 expediently takes into account both a design-specific deviation spread of the gun and an individual deviation spread of the gun, in this case the barrel weapon 18. While the design-specific deviation spread can be specified by the manufacturer, an individual deviation spread of the gun can either also be specified by the manufacturer or obtained from previous tests.
  • Another useful parameter that can be taken into account in the size and position of the expected range 46 is the wind strength and in particular the wind direction. This can be measured by the weapon system 14 and used to determine the expected range 46. Depending on the wind direction, the expected range 46 can be shifted a little in the direction of the wind, for example relative to a symmetry around the target direction 30, and the size of the expected range 46 can be made dependent on the wind strength. It is also useful to include a movement of the gun or the vehicle 16 carrying the gun in the calculation of the expected range. 46. The stronger the movement, the larger the expected range 46, since a movement, such as a shake, can increase the dispersion.
  • a temporal window can be defined in which a projectile effect 34 is expected. This is explained with reference to the illustration in FIG 5.
  • FIG 5 shows the recording periods 48 of a large number of recordings of object images 32 on a timeline. Between the recording periods 48, a readout period is indicated by hatching, in which no recording takes place and the corresponding detector of the image-recording camera 22 is read out. It is of course also possible to reduce the readout periods or even eliminate them by sequential reading or other suitable methods. In the processes shown in FIG 5, the hatched readout periods mainly serve to make the process easier to recognize.
  • the projectile is fired from the barrel weapon 18.
  • the target image 26 is recorded in the corresponding recording period 48.
  • a large number of object images 32 are recorded, as indicated by the interruption of the timeline using the two lines.
  • the time t_E indicates an expected time at which a projectile effect 34 is expected, for example an impact of the projectile on or near the object 10 or an explosion of the projectile in the air.
  • the evaluation unit 24 places a time window 50 around the expected time t_E, in which events are exclusively taken into account, or are taken into account more strongly, analogously to the spatial expectation area 46.
  • the object images 32 lying before the time window 50 are not taken into account or are not recorded at all or are taken into account with less weighting in terms of their image conspicuities.
  • a weighting or an allocation value can generally be a parameter, even in a spatial expectation area, that is used to decide whether an image conspicuity is to be classified as a projectile effect. It is expediently a parameter whose size is decisive for the result of whether an image conspicuity is to be classified as a projectile effect.
  • the parameter can, for example, be a probability that the image conspicuity is a projectile effect.
  • image conspicuity 52 is, for example, the muzzle flash of the machine gun on truck 10.
  • This image conspicuity 52 lasts for a while, for example 50 ms, as indicated by the double arrow around the image conspicuity 52 in FIG 5. In the example shown, it extends over three object images 32 or their recording periods 48.
  • the entire time range of the image conspicuity 52 lies outside the time window 50, so that the image conspicuity 52 is not taken into account or is only taken into account with less weight than if it had been within the time window 50.
  • the projectile effect 34, which is omitted in FIG 5 for the sake of clarity, lies within the time window 50. It is taken into account accordingly, so that the position of its image conspicuity in the effect image 36 is used to calculate the offset A.
  • it may be that a projectile effect 54 does not manifest itself, or not only, as a flash, but as a swirling of dust, a cloud of smoke or the throwing up of stones, as exemplified by the grenade impact 42 in FIG 4.
  • An explosion flash may be hidden or only insufficiently visible.
  • in FIG 5 it can be seen that a projectile effect 54 is visible in an effect image 36 that was taken after the time window 50. This is difficult to recognize at first and may not even be recognized as such. In the following effect image 36, the projectile effect 54 is already larger, and in the subsequent effect image 36 it is larger still, as indicated in FIG 5. For example, it is the expansion of a heat cloud caused by the explosion of a projectile of the tube weapon 18, which is recorded by an IR camera 22 of the weapon system 14.
  • FIG 6 shows another possibility for the efficient use of the calculated time window 50.
  • Recording periods 48 of object images 32 are shown, this time with the readout periods omitted.
  • the uninteresting image conspicuity 52 and the slowly growing projectile effect 54 are shown.
  • the recording and/or evaluation of object images 32 whose recording period lies completely before the time window 50 was omitted. In this respect, the image conspicuity 52 is not even discovered.
  • the image recording of object images 32 is only started at the beginning of the time window 50, so that the first object image covers the beginning of the time window 50. As can be seen in FIG 6, two object images completely cover the time window 50. However, the projectile effect 54 is not detected in these.
  • the evaluation of the object images 32 only around the time window 50 would not lead to a positive result. It is therefore sensible to attach a subsequent time window 56 to the actual time window 50 and also to examine this subsequent time window 56 for image anomalies that could show the projectile effect 54. With this method, the projectile effect 54 can be successfully detected in the form of the heat cloud.
  • the size of the subsequent time window 56 is expediently selected so that image anomalies from dust swirling, smoke development and/or heat development are reliably detected if their origin lies within the actual time window 50, as shown in FIG 6.
  • FIG 7 shows the recording of a series of object images 32.
  • Each of these object images 32 is compared with the previous object image, which is marked 32-1 in FIG 7, for example by image subtraction.
  • image subtraction is indicated by the marking A - B in FIG 7.
  • Constant image features are thereby eliminated, and only image changes remain, as shown in FIG 7 using the example of a projectile effect 34 or its pictorial representation.
  • the comparison image or difference image 58 can be processed using image processing in the form of signal amplification.
  • the correspondingly processed image 60 now shows the projectile effect 34 more clearly, so that its position can be determined in the effect image 60 processed in this way.
  • the position is determined, for example, relative to a reference point, in the example of FIG 7 the top left corner of the image. Accordingly, two coordinates result, as represented in FIG 7 by the two double arrows.
  • a group of object images 32 is grouped into a reference image 62, for example by averaging the object images 32 of the group or by interpolation from the object images 32.
  • the reference image 62 is then compared with the next object image 32, for example by image subtraction, as indicated in FIG 7 by the marking A - B.
  • the further procedure can be carried out as described above.
  • the number of object images 32 from which the reference image 62 is formed can be made dependent on the image fluctuation within the object images 32.
  • the image fluctuation of the entire object image or only from the expected area 46 can be taken into account.
  • the image fluctuation results from the change in the image content from one object image 32 to the next. With high image fluctuation, i.e. rapid changes in the image content, fewer object images 32 are processed into the reference image 62 than with a lower image fluctuation.
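This choice can be sketched as a simple rule: measure the fluctuation as the mean absolute change between consecutive frames and use fewer frames when it is high. Threshold and frame counts are assumed tuning values:

```python
import numpy as np

def frames_for_reference(frames, n_max=10, n_min=2, threshold=4.0):
    """Choose how many object images go into the reference image from
    the scene fluctuation, measured as the mean absolute gray-value
    change between consecutive frames."""
    diffs = [np.mean(np.abs(a.astype(float) - b.astype(float)))
             for a, b in zip(frames[:-1], frames[1:])]
    fluctuation = float(np.mean(diffs))
    return n_min if fluctuation > threshold else n_max
```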
  • the corresponding image anomalies 64, 66, 68, 70 show a different progression. While an explosion flash in the ultraviolet spectral range is only detected over a relatively short period of time, in the visual range the explosion flash is detected in addition to a dust swirl, which expands relatively slowly and thus becomes increasingly noticeable, as indicated in FIG 8 in the middle area of the drawing.
  • the heat cloud detected in the infrared spreads more slowly than the flash of light becomes noticeable, but its size grows faster than the smoke and dust swirls detected in the visible range.
  • different characteristic conspicuity or image progressions, i.e. intensity and size progressions as a function of time, thus result for each spectral range.
  • Each of the cameras 22 records a corresponding series of effect images 36 in its spectral range.
  • Image anomalies are examined for their possible representation of a projectile effect 34.
  • the image anomalies 64, 66, 68, 70 are discovered and the development of the projectile effect in the effect images 36 is compared in each spectral range with a signal curve expected for the positioning shot in each spectral range.
  • typical signal curves of several types of ammunition are stored in the evaluation unit 24 for each spectral range.
  • the type of ammunition of the projectile that was fired is stored, so that the typical signal curves for this projectile can be called up and compared with the measured signal curves, for example those according to FIG 8.
  • the signal curve of the projectile effect 34 is therefore examined in terms of its intensity over time in all three spectral ranges.
  • the image anomalies 64, 66, 68, 70 are only assigned to the projectile effect 34 if the signal curve in all tested spectral ranges corresponds to the stored signal characteristics within a specified deviation.
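One plausible implementation of this multi-band test, assumed rather than taken from the patent, scores each measured curve against its stored signature with a normalized correlation and requires a minimum match in every band; the band keys are hypothetical names:

```python
import numpy as np

def curve_match(measured, template):
    """Normalized correlation between a measured intensity curve and a
    stored signature; both resampled to the same time base beforehand."""
    m = (measured - measured.mean()) / (measured.std() + 1e-12)
    t = (template - template.mean()) / (template.std() + 1e-12)
    return float(np.mean(m * t))   # 1.0 = identical shape

def is_marker_effect(curves, templates, min_match=0.8):
    """Assign anomalies to the projectile effect only if the signal
    curve matches the stored characteristic in every spectral range."""
    return all(curve_match(curves[band], templates[band]) >= min_match
               for band in ("uv", "visible", "infrared"))
```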
  • the spectral ranges are limited to a relatively narrow spectral band.
  • the bandwidths of two spectral ranges, namely the visual and ultraviolet spectral ranges, are limited by a factor of 1.5, so that the frequency of the upper end of the band is at most 1.5 times as high as the frequency of the lower end of the band.
  • the width of the infrared spectral band is limited by a factor of 2, so that the frequency of the upper end of the band is twice as high as the frequency of the lower end of the band.
  • the projectile fired by the barrel weapon 18 can be a special signal projectile that burns with a marker fire that is specific to the projectile.
  • This marker fire differs significantly from normal muzzle flashes or grenade explosions both in its spectrum and in the temporal progression of the intensity.
  • the signal curve of such a marker fire, which initially radiates strongly in the ultraviolet range and then over a longer period in the visual range, in particular with a characteristic intensity and/or spectral progression over time, can be compared with the stored signal curve, so that the projectile effect 34 of the positioning shot can be clearly recognized as such.
  • a projectile effect 34, in particular of a special positioning projectile or marking projectile, can also be distinguished from interference signals.
  • if decoys, such as flares, are deployed to defend the object 10, these are clearly visible as image anomalies in the object images 32, but their radiation characteristics differ from those of a normal projectile or a special positioning projectile.
  • the projectile effect 34 can be clearly distinguished from such disruptive signals.
  • Another way to reduce the susceptibility to errors is to fire a salvo of several projectiles from the gun 18 at the object 10. A projectile effect 34 is then determined for each projectile. The susceptibility to confusion can now be reduced in several test stages.
  • FIG 9 shows the firing of a salvo of six projectiles over time.
  • the first shot of the salvo is fired at time t_1.
  • the remaining shots of the salvo are fired at times t_2, t_3, ...
  • the corresponding projectile effects 34 should each be visible at the same location in their effect image 36. If six projectile effects 34, or more generally: as many projectile effects as projectiles were fired in the salvo, are found at one location in the object scene 2, this is a strong indication that these are actually projectile effects 34 from projectiles from the barrel weapon 18.
  • in FIG 9 it can be seen that the first three shots of the salvo follow one another regularly, then there is a longer pause, and the last three shots of the salvo again follow one another regularly, but with a smaller time interval between them.
  • this temporal coding of the firings is known for each projectile, and the time of effect is determined for each projectile effect 34. If the temporal coding of the times of effect is the same as the coding of the firings, this is a strong indication that the projectile effects 34 actually sought have been found.
  • Another possibility is to determine a time window for each projectile in which the projectile effect 34 is expected. This is also shown in FIG 9.
  • Six time windows 50 are calculated, the temporal progression of which corresponds to the temporal coding.
  • the time interval between a firing time t_i and the corresponding time window 50 can be determined from the distance between the weapon system 14 and the object 10, or from a timing of the projectiles in the salvo, i.e. a preset time period after firing, after which the projectile explodes.
  • an allocation value for each impact time can be determined from the position of the impact times in relation to the time windows.
  • in FIG 9 it is shown that three smaller image anomalies occur before the first time window 50. Their times of effect accordingly lie before the first time window, so that the assignment value of these times of effect to a projectile in the salvo is low.
  • the fifth image anomaly also lies behind a time window 50.
  • the assignment value to a projectile is low, although somewhat larger due to the temporal proximity to the respective time window 50. All image anomalies are treated accordingly.
  • an overall assignment value is formed from the assignment values. Since the individual image anomalies are not in the time windows 50, the assignment values are low and the overall assignment value is also low. In this embodiment the overall assignment value is too low for the image anomalies to be assigned to the salvo.
  • the position of all times of effect is then examined for a systematic shift in position relative to the time windows 50. It is noticeable that the fourth, fifth and sixth image anomalies, as well as the eighth, ninth and tenth image anomalies, are located at the same time interval behind a time window 50. This may therefore be a systematic shift in position of the times of effect relative to the time windows.
  • all image anomalies are now shifted by the systematic shift in position. It is noticeable that six of the ten image anomalies shown in FIG 9 are then located symmetrically in the time windows 50. Their assignment value is corrected accordingly. As a result, the overall assignment value is above a threshold, so that these six image anomalies are assigned to the projectiles of the salvo.
  • the assignment values of the first three image anomalies and the seventh image anomaly are still low, so it is assumed that these do not belong to the salvo.
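The search for such a systematic shift can be sketched as a small grid search over candidate shifts, counting how many times of effect fall into their windows after shifting; the pairing of effects to windows in firing order is a simplifying assumption:

```python
def best_common_shift(t_effects, windows, candidates):
    """Find the candidate time shift that moves the most detected
    times of effect into their time windows, modelling the systematic
    shift discussed above (e.g. from a wrong range estimate)."""
    def hits(shift):
        return sum(lo <= t - shift <= hi
                   for t, (lo, hi) in zip(t_effects, windows))
    return max(candidates, key=hits)

# e.g. test shifts between -1 s and +1 s in 10 ms steps
shifts = [s / 100.0 for s in range(-100, 101)]
```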
  • the weapon system 14 is equipped with several modules 72 for detecting the projectile effects 34. These modules 72 are in FIG 1 indicated.
  • the different modules 72 for determining the projectile effect 34 can be used individually or in combination according to an operator input or an evaluation result of the evaluation unit 24. Also relevant is which cameras 22 are present in the weapon system 14, which spectral filters are present, which projectiles are available for positioning shots and which projectiles are used, and the like.
  • One of the modules 72 defines an expected range in the object images or an effective image in which the projectile effect 34 is expected.
  • Another module 72 determines a time window 50 in which the projectile effect 34 is expected.
  • a further module 72 evaluates different radiation spectra of the projectile effect.
  • a further module 72 controls the use of several cameras 22 in different spectral ranges to compare spectrally different image anomalies 64, 66, 68.
  • a further module 72 controls the use of special marking ammunition for positioning shots, which is different from ammunition fired later. This module 72 also controls the corresponding testing of the image anomalies 64, 66, 68 for their similarity to stored radiation profiles. Another module 72 controls the use of a marking salvo with several projectiles fired one after the other and evaluates the image anomalies accordingly.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Claims (13)

  1. Method for correcting the deviation of a weapon system (14), in which a projectile is fired from a barrel weapon (18) of the weapon system (14) in a target direction (30) at an object (10), a direction (38) of a point of effect of the projectile on the object (10) is detected, the direction difference (40) between the target direction (30) and the point-of-effect direction (38) is detected as the deviation (A), and the target direction of a later shot at the object (10) is corrected using the deviation (A), the point-of-effect direction (38) being detected by recording an effect image (36) which depicts the projectile effect (34, 54) on the object (10), and the point-of-effect direction (38) being determined from the effect image (36),
    characterised in that
    a time window (50) within which the projectile effect (34, 54) is expected is determined, a sequence of object images (32) depicting the object (10) is recorded, object images of the sequence are examined for the projectile effect (34, 54), several object images (32) in which the projectile effect (34, 54) is found pictorially are evaluated as a sequence of effect images (36) with regard to a development of the pictorial projectile effect (34, 54), a time of effect (tw) at which the projectile effect (34, 54) began is determined from the retrospective examination of the development, and it is checked whether the time of effect (tw) lies within the time window (50), a time of effect (tw) within the time window (50) being given a higher weighting than a time of effect (tw) outside the time window (50), and an image anomaly which could represent a projectile effect whose time of effect (tw) lies outside the time window (50) being given a lower weighting or being discarded for determining the point-of-effect direction (38) from which the deviation is calculated.
  2. Procédé selon la revendication 1,
    une image cible (26) qui reproduit l'objet (10) étant capturée et la direction cible (30) étant définie dans l'image cible (26), l'image cible (26) étant corrélée à l'image d'action (36), capturée ultérieurement, à l'aide une reconnaissance de forme et à partir de là une position relative des deux images (26, 36) est déterminée l'une par rapport à l'autre et la différence de direction (40) est déterminée à l'aide de la position relative.
  3. Procédé selon l'une des revendications précédentes,
    une zone d'attente (46), dans laquelle l'effet de projectile (34, 54) est attendu, étant sélectionnée comme sous-zone dans une image d'objet (32) représentant l'objet (10), et un résultat d'évaluation dans la zone d'attente (46) étant affecté d'une pondération supérieure à celle d'un résultat d'évaluation à l'extérieur de la zone d'attente (46).
  4. Procédé selon la revendication 3,
    la taille et/ou la position de la zone d'attente (46) dans l'image d'objet (32) étant sélectionnées en fonction d'au moins un des paramètres du groupe de paramètres suivant : dispersion de déviation spécifique à la conception du canon, dispersion de déviation individuelle du canon, la force du vent, la direction du vent, le mouvement d'un véhicule (16) portant le canon pendant le tir, la température d'une pièce de canon.
  5. Procédé selon l'une des revendications précédentes,
    la position temporelle de l'intégration de la capture de l'image d'action (36) est déterminée à l'aide de la fenêtre temporelle (50).
  6. Procédé selon l'une des revendications précédentes,
    au moins une image d'objet (32) de l'objet (10) étant capturée entre un tir du projectile et un instant d'action (tw) du projectile, l'image d'action (36) étant traitée à l'aide de l'image d'objet (32), en particulier l'image d'objet (32) étant soustraite de l'image d'action (36) et la direction d'instant d'action (38) étant déterminée à partir de l'image de différence (58).
  7. Procédé selon l'une des revendications précédentes,
    plusieurs caméras (22) étant présentes qui détectent dans différentes plages spectrales et chaque caméra (22) capturant au moins une image d'action (36), et la direction d'instant d'action (38) étant déterminée à partir des images d'action (36).
  8. Procédé selon l'une des revendications précédentes,
    une caractéristique de signal de l'effet de projectile trouvée dans une ou plusieurs images d'action (36) étant comparée à une caractéristique de signal stockée du projectile.
  9. Procédé selon l'une des revendications précédentes,
    une salve de plusieurs projectiles étant tirée sur l'objet (10), un effet de projectile (34, 54) et un instant d'action (tw) de l'effet de projectile (34, 54) étant déterminés pour chaque projectile et la séquence temporelle des instants d'action (tw) étant comparée à la séquence temporelle des instants de tir (tA).
  10. Procédé selon la revendication 9,
    la séquence temporelle des instants de tir (tA) étant codée de manière irrégulière.
  11. Procédé selon la revendication 9 ou 10,
    différents projectiles traçants étant tirés dans la salve qui produisent un effet de projectile différent (34, 54) et qui diffèrent d'une munition de combat tirée ultérieurement.
  12. Procédé selon l'une des revendications précédentes,
    différents modules (72) étant présents pour déterminer l'effet de projectile (34, 54), lesquels peuvent être utilisés individuellement ou en combinaison et, en fonction d'une situation de combat, d'un projectile existant et/ou de systèmes de détection existants, une détermination est effectuée quant à quel module (72) est utilisé pour déterminer l'effet de projectile (34, 54),
    un module ou plusieurs modules (72) du groupe suivant étant présents : zone d'attente (46) de l'effet de projectile (34, 54) dans l'image d'action (36), fenêtre temporelle (50) pour l'effet de projectile (34, 54), spectre de rayonnement de l'effet de projectile (34, 54), soustraction d'image d'une image d'objet (32) sans effet de projectile (34, 54) de l'image d'action (36), utilisation de plusieurs caméras (22) de différentes plages spectrales, l'utilisation d'une munition traçante qui est différente de la munition tirée ultérieurement, l'utilisation d'une salve traçante de plusieurs projectiles tirés les uns après les autres et leur évaluation quant à plusieurs effets de projectiles (34, 54).
  13. Système d'armes (14) comprenant une unité cible (20) destinée à entrer et détecter une direction cible (30) dans laquelle un projectile provenant d'une arme à canon (18) est tiré sur un objet (10), un système optique muni d'une caméra (22) et une unité d'évaluation (24) destinée à déterminer une direction (38) d'un point d'action du projectile sur l'objet (10) et une déviation (A) sous la forme d'une différence de direction (40) entre la direction cible (30) et la direction de point d'action (38), le dispositif cible (20) étant préparé pour corriger une direction cible (30) d'un tir ultérieur sur l'objet (10) à l'aide de la déviation (A), l'unité d'évaluation (24) étant préparée pour détecter la direction d'instant d'action (38) par capture d'une image d'action (36) représentant l'effet de projectile (34, 54) sur l'objet et la direction d'instant d'action (38) étant déterminée à partir de l'image d'action (36), caractérisé en ce que
    l'unité d'évaluation (24) est en outre préparée pour déterminer une fenêtre temporelle (50), dans laquelle l'effet de projectile (34, 54) est attendu, et examiner une séquence capturée d'images d'objet (32), représentant l'objet (10), en termes d'effet de projectile (34, 54), évaluer plusieurs images d'objet (32), dans lesquelles l'effet de projectile (34, 54) est constaté de manière imagée, comme séquence d'images d'action (36) en termes d'évolution de l'effet de projectile imagé (34, 54), déterminer un instant d'action (tw), auquel l'effet de projectile (34, 54) a commencé, à partir de l'examen rétrospectif de l'évolution et vérifier si l'instant d'action (tw) est situé à l'intérieur de la fenêtre temporelle (50), et affecter à un instant d'action (tw) à l'intérieur de la fenêtre temporelle (50) une pondération supérieure à celle d'un instant d'action (tw) à l'extérieur de la fenêtre temporelle (50), affecter à une anomalie d'image qui pourrait représenter un effet de projectile dont l'instant d'action (tw) est situé à l'extérieur de la fenêtre temporelle (50) pour déterminer la direction d'instant d'action (38) à partir de laquelle la déviation est calculée, une pondération plus faible ou la rejeter.
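
The image subtraction named in claim 6 can be illustrated by the following sketch, in the same hedged spirit as the sketch above: a minimal illustration assuming grayscale images as NumPy arrays, in which the noise threshold, the centroid localisation and the linear pixel-to-angle conversion via rad_per_pixel are assumptions, not details taken from the patent.

    import numpy as np

    def impact_direction(object_image, effect_image, boresight_px,
                         rad_per_pixel, threshold=30.0):
        # Subtract the object image (32), captured before the impact,
        # from the effect image (36); the projectile effect (34) remains
        # in the difference image (58).
        diff = (effect_image.astype(np.float64)
                - object_image.astype(np.float64))
        diff[diff < threshold] = 0.0  # suppress noise and darkened pixels,
                                      # keep only new bright structure

        if diff.sum() == 0.0:
            return None  # no projectile effect in the difference image

        # Centroid of the difference image as the impact point in pixels.
        rows, cols = np.indices(diff.shape)
        row_c = (rows * diff).sum() / diff.sum()
        col_c = (cols * diff).sum() / diff.sum()

        # Convert the pixel offset from the camera boresight into angles,
        # giving the impact point direction (38) as (azimuth, elevation).
        azimuth = (col_c - boresight_px[1]) * rad_per_pixel
        elevation = (boresight_px[0] - row_c) * rad_per_pixel
        return azimuth, elevation

The offset (A) then follows as the direction difference (40) between this impact point direction and the target direction (30) defined in the target image (26).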
EP17000925.2A 2016-06-23 2017-06-01 Method for correcting the offset of a weapon system Active EP3260808B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102016007624.0A 2016-06-23 Method for correcting the offset of a weapon system

Publications (3)

Publication Number Publication Date
EP3260808A2 EP3260808A2 (fr) 2017-12-27
EP3260808A3 EP3260808A3 (fr) 2018-01-10
EP3260808B1 true EP3260808B1 (fr) 2024-07-24

Family

ID=59009486

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17000925.2A Active EP3260808B1 (fr) 2016-06-23 2017-06-01 Procédé de correction de déviation d'un système d'arme

Country Status (3)

Country Link
EP (1) EP3260808B1 (fr)
DE (1) DE102016007624A1 (fr)
ZA (1) ZA201704186B (fr)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9620614D0 (en) * 1996-10-03 1997-03-12 Barr & Stroud Ltd Target aiming system
WO2007056753A2 (fr) * 2005-11-08 2007-05-18 General Atomics Apparatus and methods for use in flash detection
US20080022575A1 (en) * 2006-05-08 2008-01-31 Honeywell International Inc. Spotter scope
IL204455A (en) * 2010-03-14 2015-03-31 Shlomo Cohen Artillery firing system and method
FR2989456B1 * 2012-04-12 2018-05-04 Philippe Levilly Remotely operated target processing system
DE102014019200A1 * 2014-12-19 2016-06-23 Diehl Bgt Defence Gmbh & Co. Kg Machine gun

Also Published As

Publication number Publication date
EP3260808A2 (fr) 2017-12-27
EP3260808A3 (fr) 2018-01-10
DE102016007624A1 (de) 2018-01-11
ZA201704186B (en) 2018-08-29

Similar Documents

Publication Publication Date Title
EP1304539B1 Method and device for aiming a gun barrel and use of the device
WO2007054278A1 Automatic protection system for a combat vehicle or other objects to be protected
DE10117007A1 Method and device for protecting mobile military installations
DE10230939A1 Method and device for protecting battlefield vehicles
DE102015002737B4 Method and device for providing a decoy target to protect a vehicle and/or object from radar-guided seeker heads
EP0411073A1 Method and device for improving firing accuracy
DE3733962A1 Method for automatic target classification by land and water combat vehicles and device for carrying out the method
EP3260808B1 Method for correcting the offset of a weapon system
EP3376154B1 Method for protecting a cruise missile
EP3227711B1 Method for locating and combating threats, in particular in asymmetric threat situations
EP3591427B1 Missile warning device and method for warning of a missile
EP2899493B1 Method for firearms training in a weapons simulator, weapons simulator for carrying out such a method, central control computer of such a weapons simulator, and computer program for running on such a control computer
DE19716199A1 Method for aiming the weapon of a weapon system and weapon system for carrying out the method
EP3034983B1 Automatic gun
DE102010036026A1 Device and method for determining the effectiveness of a smoke screen for producing an effective smoke cloud
DE2701042A1 Device for remote detection of hits on a target
DE102008023520C5 Method for classifying RAM projectiles
EP0154809A1 Combat simulation method
DE102011107950A1 System of weapons
DE102005054776A1 Guidance method for missiles
EP3367046A1 Method for determining an ammunition requirement of a weapon system
EP1122508A2 Device for identifying a shooter
EP3350534B1 Remotely operated turret and method for controlling a remotely operated turret
DE69510612T2 Method for indicating the direction of observation of an object and device for carrying out the method
DE102020003080A1 Method and control system for guiding a missile onto a target object

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIC1 Information provided on ipc code assigned before grant

Ipc: F41G 3/08 20060101ALN20171204BHEP

Ipc: F41G 3/14 20060101AFI20171204BHEP

Ipc: F41G 3/16 20060101ALI20171204BHEP

Ipc: F41G 3/04 20060101ALN20171204BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180709

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200320

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: F41G 3/08 20060101ALN20240223BHEP

Ipc: F41G 3/04 20060101ALN20240223BHEP

Ipc: F41G 3/16 20060101ALI20240223BHEP

Ipc: F41G 3/14 20060101AFI20240223BHEP

INTG Intention to grant announced

Effective date: 20240315

RIC1 Information provided on ipc code assigned before grant

Ipc: F41G 3/08 20060101ALN20240301BHEP

Ipc: F41G 3/04 20060101ALN20240301BHEP

Ipc: F41G 3/16 20060101ALI20240301BHEP

Ipc: F41G 3/14 20060101AFI20240301BHEP

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502017016280

Country of ref document: DE