EP3239644B1 - Hilfsverfahren und -vorrichtung zum Zielen für die Laserlenkung eines Projektils (Aiming assistance method and device for the laser guidance of a projectile) - Google Patents


Info

Publication number
EP3239644B1
EP3239644B1 (application EP17167889.9A)
Authority
EP
European Patent Office
Prior art keywords
target
guide beam
environment
aiming
contact
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP17167889.9A
Other languages
English (en)
French (fr)
Other versions
EP3239644A1 (de)
Inventor
Nikolaus Boos
Current Assignee
Airbus Helicopters SAS
Original Assignee
Airbus Helicopters SAS
Priority date
Filing date
Publication date
Application filed by Airbus Helicopters SAS filed Critical Airbus Helicopters SAS
Priority to PL17167889T priority Critical patent/PL3239644T3/pl
Publication of EP3239644A1 publication Critical patent/EP3239644A1/de
Application granted granted Critical
Publication of EP3239644B1 publication Critical patent/EP3239644B1/de

Classifications

    • F — not shown; all classifications fall under F41G (WEAPON SIGHTS; AIMING), in F41 (WEAPONS), section F (MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING):
    • F41G3/02 Aiming or laying means using an independent line of sight
    • F41G3/145 Indirect aiming means using a target illuminator
    • F41G7/007 Preparatory measures taken before the launching of the guided missiles
    • F41G7/2246 Active homing systems, i.e. comprising both a transmitter and a receiver
    • F41G7/226 Semi-active homing systems, i.e. comprising a receiver and involving auxiliary illuminating means, e.g. using auxiliary guiding missiles
    • F41G7/2273 Homing guidance systems characterised by the type of waves
    • F41G7/2293 Homing guidance systems characterised by the type of waves, using electromagnetic waves other than radio waves

Definitions

  • the present invention is in the field of projectile guidance. It relates more particularly to the guidance of projectiles using a laser beam.
  • the present invention relates to a method for assisting in the targeting of a target as well as a device for assisting in the targeting of a target.
  • the present invention also relates to a method for guiding a projectile by a laser beam using such an aiming assistance method, as well as a device for guiding a projectile by a laser beam equipped with such an aiming assistance device.
  • Guidance by a laser beam is used in particular by the military to guide a missile or any other projectile on a target illuminated by means of a laser beam.
  • This technique is semi-active laser homing, designated by the acronym "SALH" ("Semi-Active Laser Homing").
  • a laser beam is kept pointed by an operator, often designated by the term "shooter”, on a target. Reflections of this laser beam are then dispersed in a multitude of directions by reflection on the target.
  • a projectile, such as a missile, can then be launched or dropped towards the target.
  • a receiving device that the projectile comprises receives part of the laser beam reflected by the target and then determines the source of this part of the reflected laser beam, namely the target.
  • the trajectory of the projectile is then adjusted in the direction of this source.
  • the projectile, having no autonomous means of detecting the target itself, is then guided towards the source only by the part of the reflected laser beam which it receives.
  • the trajectory of the projectile can be corrected in order to guide the projectile exactly on the target.
  • the emission of the laser beam is therefore dissociated from the projectile and is carried out for example by an operator.
  • the operator must have the target in his field of vision in order to point the laser beam at it.
  • the projectile launch zone is completely independent of the laser beam emission zone.
  • the laser beam is emitted by a generator of a laser beam such as a laser designator.
  • a laser beam used for guiding a projectile generally consists of a succession of pulses emitted at regular or irregular time intervals, but in all cases known, so as to be identifiable by the projectile's receiving device.
  • a laser beam used for guiding a projectile can also be a continuous laser beam.
  • the aiming procedure always begins by scanning the environment visible to the operator in search of targets, then by stopping the scanning to focus on a target. Therefore, the operator must continuously point the laser beam at the target in order to guide the projectile towards it.
  • the target can be a moving vehicle, for example an automobile or an aircraft.
  • the operator can also be in motion, for example being on board a rolling vehicle or else an aircraft.
  • the operator has no direct visual feedback on the point of the environment which is actually illuminated by the laser beam.
  • the reflection of the laser on the target is generally not visible to the operator.
  • the operator can therefore only rely on his aiming, carried out for example through the sighting reticle of a sighting scope for a portable laser designator, or by means of a display integrated into a helmet for a laser designator carried on board a vehicle. Consequently, an offset between the aiming reticle and the actual laser beam can exist and cause an aiming error which goes unnoticed by the operator. Only the impact of the projectile informs the operator about the accuracy of the initial aim and any aiming error. In the latter case, where the projectile has missed the target, the operator can possibly correct his aim as a function of the position of the projectile's point of impact relative to the target, but only after a first failure.
  • a laser beam is a particular light beam composed of coherent and concentrated light.
  • the term "laser" is an acronym for "Light Amplification by Stimulated Emission of Radiation".
  • the term "light beam" is intended to mean a beam generally composed of light visible to the human eye.
  • a laser beam can therefore be a light beam located both in the area visible to the human eye and in the non-visible area.
  • a known correction device emits a beam for guiding projectiles directed towards a target.
  • this guide beam is divided into at least five partial beams: a central partial beam effectively directed at the target and at least four partial beams inclined with respect to the central partial beam. Projectiles illuminated by an inclined partial beam are therefore not pointing towards the target and have their trajectories corrected accordingly.
  • the document US 2009/078817 discloses a projectile guidance system which reduces the number of pulses of the guide beam in order to reduce the total energy sent to the target.
  • This device requires communication between the projectile and the generator of the guide beam in order to synchronize the reception of the reflected guide beam and the emission of this guide beam.
  • an image capture means allows the analysis of the guide beam through the contact points of this guide beam on the target.
  • a radiation capture means allows the analysis of the guide beam reflected by the target, in particular the time of arrival of this reflected guide beam on the sensor, its angle of arrival and/or its arrival position on the sensor.
  • the device of document US 6023322 makes it possible to determine the ratio between the number of contact points of the guide beam reflected by the target and the number of pulses of the guide beam emitted, making it possible, for example, to find the best zone of the target to aim at with the guide beam.
  • the present invention aims to allow a reliable and precise aiming of a guide beam on a target.
  • the present invention makes it possible to provide the operator with feedback on the zone actually targeted by means of an image of the environment and of the target.
  • the present invention notably uses a new type of camera allowing the creation of a selective image of the target in the environment.
  • the subject of the present invention is therefore a method for assisting in the aiming of a target as well as a device for assisting in the aiming of a target, making it possible to overcome the limitations mentioned above in order to improve the quality and accuracy of aiming at the target via a guide beam.
  • the present invention also relates to a method of guiding a projectile by a guide beam using such an aiming assistance method, as well as a device for guiding a projectile by a guide beam equipped with such an aiming assistance device.
  • This method according to the invention is particularly intended for methods of guiding a projectile by a guide beam towards a target.
  • the guide beam is emitted by a generator of a guide beam.
  • the guide beam can be a light beam visible or not visible to the human eye depending on the wavelength(s) making up this light beam.
  • the guide beam is preferably a laser beam. This laser beam is for example emitted by a generator of a laser beam of known type such as a laser designator dedicated to the aiming of a target.
  • the guide beam can be a continuous beam or else be formed by a succession of pulses at regular intervals.
  • the guide beam is notably defined by temporal characteristics which are the frequency and the duration of these pulses.
  • the generator of a guide beam can be portable and used directly by an operator.
  • the generator of a guide beam can also be embedded in a vehicle.
  • the generator of a guide beam can be linked to the projectile launching device, the generator of a guide beam and the projectile launching device being for example carried by the same vehicle.
  • the generator of a guide beam can also be carried by a third party, for example by a shooter on the ground, and thus isolated from the projectile launching device, which is carried for example by a vehicle. This is referred to as "external designation" or "remote designation".
  • This method also uses a camera in order to record the environment and the target as well as a visualization means for displaying in particular the images recorded by the camera.
  • the display means can be integrated into a telescopic sight of a generator of a portable guide beam or else into a helmet for a generator of guide beam on board a vehicle.
  • the display means can also be a remote screen vis-à-vis the generator of a guide beam.
  • the camera can be linked to the generator of a guide beam or isolated from this generator of a guide beam.
  • for example, the generator of a guide beam is carried by an operator located on the ground while the camera is carried by a vehicle, the vehicle possibly also carrying the device for launching the projectile.
  • the camera used by the method according to the invention is preferably a new type of camera known as a "bio-inspired" or "event-based" camera. These cameras are characterized by a very high radiometric dynamic range, for example the ability to see light and dark objects at the same time, and by a very high temporal resolution, of the order of a microsecond. By their principle, these cameras make it possible for each pixel to measure a change in radiometry with high temporal precision. A change in the scene, for example the presence of a pulse of a guide beam or of a moving object, is therefore detected naturally. "Radiometry of an object" is understood to mean the measurement of the quantity of energy of the radiation emitted by this object, or of derived properties such as the flux or intensity of this radiation.
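As a rough illustration of this per-pixel change detection, the following sketch simulates contrast events from two intensity samples. The function name, threshold value and frame-based sampling are illustrative assumptions, not the camera's actual interface; a real event-based sensor operates asynchronously, pixel by pixel.

```python
import numpy as np

def events_from_frames(prev_frame, next_frame, threshold=0.15):
    """Toy model of an event-based ('bio-inspired') sensor: each pixel
    emits an event only when its log-intensity changes by more than a
    contrast threshold, instead of reporting full frames.
    Returns an array of (row, col, polarity) events."""
    eps = 1e-6  # avoid log(0)
    delta = np.log(next_frame + eps) - np.log(prev_frame + eps)
    rows, cols = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return np.column_stack((rows, cols, polarity))

# A guide-beam pulse brightens a single pixel between two samples;
# only that pixel produces an event, all static pixels stay silent.
prev = np.full((4, 4), 0.2)
nxt = prev.copy()
nxt[2, 1] = 0.9  # pulse contact point
print(events_from_frames(prev, nxt))
```

This selectivity is what later allows the camera to record only the target and the beam's contact points rather than the whole scene.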
  • An aiming procedure always begins with a scan of the visible environment in search of targets, then by stopping the scan to focus on a target.
  • the first step of the method according to the invention therefore consists in a complete scanning of an environment using the camera. A complete image of this environment is therefore recorded. This complete image of this environment is then displayed during the second step on the display means so that a target on this complete image of the environment is identified and selected during the third step.
  • This identification and selection is made by an operator who is for example the operator in charge of aiming the target with the guide beam.
  • the operator identifies the target on the complete image of the environment and then selects it.
  • This selection can be made by the operator aiming at the target via the generator of a guide beam, but without emitting a guide beam.
  • the operator then uses the telescopic sight of this generator of a guide beam and when he aims at the target, he actuates a selection means such as a push button or a switch to select the targeted target.
  • the operator generally uses the sighting reticle present in the sighting scope of the generator of a guide beam to aim at the target.
  • This selection can also be made directly on the display means by moving the sighting reticle on the target, then by actuating the selection means.
  • the reticle can be moved by a mouse or directly on the display means which is then a touch screen, the target selection means also being this mouse or this touch screen.
  • the target can also be identified by its coordinates, for example according to a satellite location system, the operator then selecting it via the selection means to confirm that the coordinates correspond to the target.
  • the selection of the target can be automatic when the operator aims at a stationary target or when the aiming reticle is kept stationary for a first predetermined duration.
  • This first predetermined duration is for example 3 seconds (3s).
  • This automatic selection is also possible for a moving target, in particular by using an image processing system, designated for example in English by the expression "moving target indicators", aligning the reticle on the identified moving target.
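The automatic selection on a stationary reticle described above can be sketched as follows. This is a minimal illustration; the sample format, the pixel tolerance and the function name are assumptions, not the patent's implementation.

```python
def auto_select(reticle_track, hold_duration=3.0, tolerance=2.0):
    """Return the timestamp at which a target is auto-selected: the first
    moment the aiming reticle has stayed within `tolerance` pixels of an
    anchor position for `hold_duration` seconds (the 'first predetermined
    duration', e.g. 3 s). `reticle_track` is a list of (t, x, y) samples.
    Returns None if the reticle never stays still long enough."""
    anchor = None
    for t, x, y in reticle_track:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if (x - x0) ** 2 + (y - y0) ** 2 > tolerance ** 2:
            anchor = (t, x, y)          # reticle moved: restart the hold timer
        elif t - t0 >= hold_duration:
            return t                    # held long enough: select the target
    return None

track = [(0.0, 100, 100), (1.0, 101, 100), (2.0, 100, 101), (3.0, 100, 100)]
print(auto_select(track))  # reticle held ~3 s: selection at t = 3.0
```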
  • the operator points the guide beam at the target during the fourth step in order to guide the projectile to the target.
  • the operator generally uses the sighting reticle present in the sighting scope of the generator of a guide beam to aim at the target, or aims directly on the display means by positioning the aiming reticle on the target.
  • the operator must in fact continuously point the guide beam at the target via the guide beam generator until the projectile hits the target. If the operator points the guide beam at another object other than the target, the projectile will move towards this other object. Likewise, if the operator switches off the generator of a guide beam and no guide beam is emitted, the projectile will no longer know where to go.
  • the first step and the second step are preferably repeated as long as the operator has not pointed the guide beam at the target, in order to keep the complete display of the environment up to date.
  • the operator generally does not see the point of contact of the guide beam on the target, the guide beam being either visible or not visible to the human eye. The operator therefore cannot verify whether the beam actually illuminates the intended target.
  • the guide beam and its reflection on the target are advantageously always visible by the camera.
  • the contact points of the guide beam in the environment and in particular on the target are always visible and can be recorded by the camera.
  • the method according to the invention then advantageously comprises a fifth step during which the complete image of this environment is displayed on the display means with the point of contact of the guide beam in the environment.
  • the operator can then visualize the point of contact of the guide beam on the image of the environment and verify that this point of contact is indeed on the target. If this contact point is not located on the target, the operator can then correct the aim. Indeed, an offset between the sighting reticle and the guide beam itself may exist due to inaccuracies in the system, causing a shift between the sighting direction and the direction of the guide beam and, consequently, an aiming error.
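The check that the contact point actually lies on the target, and the resulting correction, might look like this in outline. The rectangular target region and the returned offset are illustrative assumptions; the patent does not specify how the target's extent is represented.

```python
def aim_feedback(contact, target_box):
    """Check whether the guide beam's contact point falls on the selected
    target and, if not, return the correction the operator should apply.
    `target_box` is (x_min, y_min, x_max, y_max) in image coordinates."""
    x, y = contact
    x_min, y_min, x_max, y_max = target_box
    on_target = x_min <= x <= x_max and y_min <= y <= y_max
    if on_target:
        return True, (0.0, 0.0)
    # Offset from the box centre, i.e. the reticle/beam misalignment to correct.
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
    return False, (cx - x, cy - y)

print(aim_feedback((48, 52), (40, 45, 60, 60)))   # contact point on target
print(aim_feedback((70, 52), (40, 45, 60, 60)))   # off target: correction needed
```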
  • the operator again points the target with the guide beam.
  • during this sixth step, the operator can possibly correct the aiming as a function of the position of the point of contact of the guide beam in the environment with respect to the target on the image displayed during the previous step.
  • the camera used by the method advantageously makes it possible during the seventh step to selectively scan the target and the point of contact of the guide beam in the environment.
  • the camera used by the method according to the invention makes it possible to record only part of the environment for which a change in radiometry is detected.
  • this camera can specifically record the target selected during the third step and impacted by the guide beam as well as the point of contact of this guide beam.
  • Other environmental objects may also be recorded according to the change in their respective radiometry.
  • the number of objects recorded during the seventh step and then displayed during the eighth step is thus significantly reduced compared with the recording and display of a complete image of the environment during the first and second steps respectively.
  • a selective image of the target and of at least one point of contact of the guide beam in the environment is displayed.
  • the operator can visualize the contact point of the guide beam on the selective image of the environment and check that this contact point is always on the target.
  • this selective image is simplified and mainly displays the target, the point of contact of the guide beam on the environment and possibly other objects whose radiometry changes. This selective display advantageously allows a faster analysis on the part of the operator who immediately sees the position of the contact point of the guide beam on the environment vis-à-vis the target.
  • the sixth, seventh and eighth steps are then repeated until the impact of the projectile, these steps being carried out continuously.
  • the operator first uses the complete image displayed during the fifth step to possibly correct the aiming of the target and then uses the selective images displayed successively during the eighth step.
  • the aiming aid method according to the invention advantageously makes it possible to provide the operator in real time and during his aiming operation with feedback on the positions of the target and of the contact points of the guide beam on the environment thanks to the display of the complete image and then selective images of the environment.
  • This display of the selective image thus makes it possible to improve the aiming precision, the operator being able to correct immediately, in real time and continuously, any deviation of the position of the point of contact of the guide beam in the environment with respect to the target.
  • the aiming reticle can be displayed on the target. This aiming reticle can thus be displayed during the second, fifth and eighth display steps of the aiming aid method according to the invention.
  • the aiming aid method according to the invention may comprise, after the eighth step, additional steps making it possible to quantify the precision of the aiming of the target.
  • the image displayed during the second, fifth and eighth steps comprising in particular the target and the point of contact of the guide beam in the environment, it is possible, by analyzing each successively displayed image, to determine a first number of contact points of the guide beam touching the target and a second number of contact points of the guide beam not touching the target since the target was selected during the third step.
  • the aiming aid method according to the invention can thus include a ninth step of calculating the first number of contact points of the guide beam touching the target, and the second number of contact points of the guide beam not touching the target since the target was selected in the third step.
  • the percentage of contact points of the guide beam actually touching the target among all of the contact points of the guide beam in the environment can possibly be calculated during this ninth step.
  • This information on the precision of the contact points of the guide beam touching the target can be this first number and this second number, or else the percentage of the contact points of the guide beam actually touching the target among all the contact points of the guide beam in the environment.
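The bookkeeping of this ninth step can be sketched as follows. The helper is hypothetical and assumes the target is represented as a rectangular region in image coordinates.

```python
def aiming_precision(contact_points, target_box):
    """Split recorded contact points into those touching the target and
    those missing it, and return the hit percentage (the 'precision
    information'). `target_box` is (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = target_box
    hits = sum(1 for (x, y) in contact_points
               if x_min <= x <= x_max and y_min <= y <= y_max)
    misses = len(contact_points) - hits
    percentage = 100.0 * hits / len(contact_points) if contact_points else 0.0
    return hits, misses, percentage

# Four recorded contact points, one of which misses the target box:
points = [(50, 50), (52, 48), (75, 50), (51, 49)]
print(aiming_precision(points, (40, 40, 60, 60)))  # (3, 1, 75.0)
```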
  • the contact points of the guide beam in the environment can be recorded for a second predetermined duration, then be displayed with the selective image of the target.
  • the operator can thus view the successive positions of the contact points over this second predetermined duration and thus observe a possible drift in the precision of his aim or else an improvement in this aim.
  • the selective image of the target and the current contact point of the guide beam in the environment can be displayed simultaneously with at least one of the contact points previously displayed.
  • “Current contact point” of the guide beam is understood to mean the contact point of the guide beam picked up by the camera during the seventh selective scanning step immediately preceding this eighth step.
  • "previously displayed contact point" is understood to mean the contact point displayed during the fifth step of displaying a complete image of the environment and the point of contact of the guide beam in the environment, as well as any contact point displayed during any previous eighth display step.
  • the contact points previously displayed are for example partly constituted by the contact points registered during the second predetermined duration.
  • the ninth step can also take place over the second predetermined duration. In this way, the precision information displayed during the tenth step is determined over this second predetermined duration.
  • the aiming aid method according to the invention may include another additional step taking place after the fourth step, that is to say after aiming the guide beam at the target, and in parallel with the following steps, namely from the fifth step to the eighth step.
  • This additional step allows the identification of the guide beam aimed at the target.
  • the contact points of the guide beam in the environment are always visible and recordable by the camera.
  • the temporal characteristics of the guide beam used to aim at the target are known and constitute the code of the guide beam. It is then advantageously possible to verify that these temporal characteristics of the guide beam visible on the target do indeed correspond to the code of the expected guide beam and to thus determine that the guide beam visible on the target by the camera is indeed the expected guide beam.
  • the aiming aid method according to the invention can thus include an eleventh step of analysis and identification of the guide beam, the contact points of the guide beam in the environment being analyzed in order to determine the temporal characteristics of the guide beam and thus identify the code of the guide beam visible on the target.
  • the temporal characteristics of the guide beam are the frequency and the duration of its pulses.
  • a continuous guide beam has no pulses and no frequency can be determined.
  • the temporal characteristics of such a continuous beam are in fact the absence of pulses and frequency.
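The eleventh-step identification of the guide-beam code from its temporal characteristics could be sketched as follows. This is an assumed simplification in which the code is reduced to a mean pulse period compared against the expected one; as the text notes, real codes may also involve pulse duration and irregular intervals, and the continuous-beam case yields no period at all.

```python
def identify_pulse_code(timestamps, expected_period, tolerance=0.05):
    """Estimate the pulse period of the guide beam from the timestamps
    (in seconds) of its contact points seen by the camera, and test
    whether it matches the expected code's period within a relative
    tolerance. Fewer than 2 pulses (e.g. a continuous beam) yields no
    period and no match."""
    if len(timestamps) < 2:
        return None, False
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    period = sum(intervals) / len(intervals)
    matches = abs(period - expected_period) <= tolerance * expected_period
    return period, matches

# Pulses recorded at roughly 10 Hz, expected code period 0.1 s:
ts = [0.00, 0.10, 0.21, 0.30, 0.40]
print(identify_pulse_code(ts, expected_period=0.1))
```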
  • the points of contact of the guide beam in the environment can be recorded from the pointing of the target by the guide beam during the fourth step or else over a third predetermined duration.
  • This third predetermined duration can be equal to the second predetermined duration.
  • Such recordings of the contact points of the guide beam in the environment may in particular be useful for an analysis subsequent to the firing of the projectile, for example in the event of a target error, and in particular of fratricidal firing.
  • the method for assisting in the aiming of a target may comprise, between the fourth step of pointing at the target and the fifth display step, an intermediate step of complete scanning of the environment using the camera.
  • This intermediate step therefore consists of a new complete scan of the environment in order to update the display of the target and of the environment before the complete display of this environment during the fifth step.
  • This intermediate step notably makes it possible to take into account a possible displacement of the objects of the environment and in particular of the target.
  • the method of assisting with the aiming of a target according to the invention advantageously makes it possible, during the fifth and eighth steps, to visualize the spatial behavior of the guide beam in the environment and thus to verify the efficiency of the aiming.
  • this method makes it possible, during the ninth and tenth steps, to quantify this spatial behavior of the guide beam by providing the precision information.
  • this method makes it possible, during the eleventh step, to quantify the temporal behavior of the guide beam and to identify the code of the guide beam in order to ensure that this guide beam is indeed that expected.
  • the guide beam is preferably a laser beam.
  • the aiming assistance method previously described is then applied to the step of illuminating the target, in order to improve the precision of this illumination and, consequently, the precision of launching a projectile at the target.
  • the step of locking the projectile onto the target can then be carried out as a function of the information on the precision of the contact points of the guide beam touching the target and/or of the temporal characteristics of the guide beam impacting the target.
  • This lock-on step can be carried out manually by the operator.
  • This lock-on can also be carried out automatically if the information on the precision of the contact points of the guide beam touching the target is greater than or equal to a predetermined threshold and/or if the temporal characteristics of the guide beam impacting the target correspond to the code of the expected guide beam.
  • the projectile launching step can be canceled as a function of this information on the precision of the contact points of the guide beam touching the target.
  • This cancellation can be done manually by the operator.
  • This cancellation can also be carried out if the operator notices, on the display of the selective image during the eighth step, an unforeseen event, for example a vehicle approaching the target which must not be hit by the projectile.
  • This cancellation can also be done automatically if the information on the precision of the contact points of the guide beam touching the target is less than the predetermined threshold and/or if the temporal characteristics of the guide beam impacting the target do not correspond to the code of the expected guide beam.
  • in the first case, the lock-on and/or the firing of the projectile is carried out. Otherwise, the probability that the projectile will hit the target is too low and the projectile is neither locked on nor fired.
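The automatic lock-on or cancellation logic described above can be sketched as follows. This is illustrative only: the text allows "and/or" combinations of the two criteria, whereas this sketch requires both, and the threshold value is an assumption.

```python
def firing_decision(hit_percentage, code_matches, threshold=90.0):
    """Decide between lock-on/firing and cancellation: the projectile is
    locked on and fired only if the aiming-precision information reaches
    the predetermined threshold AND the observed guide-beam code matches
    the expected one; otherwise the shot is cancelled."""
    if hit_percentage >= threshold and code_matches:
        return "lock-on and fire"
    return "cancel"

print(firing_decision(95.0, True))    # precision and code both good
print(firing_decision(95.0, False))   # cancel: wrong guide-beam code
print(firing_decision(60.0, True))    # cancel: precision below threshold
```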
  • the projectile launching step can be carried out before the step of locking the projectile onto the target.
  • the present invention also relates to a device for assisting in the aiming of a target comprising a camera, a display means, a computer and a selection means.
  • the camera makes it possible to record information specifically on particular objects in the recorded environment, the target being a particular object in the environment.
  • the particular objects are isolated by this camera according to a change in their respective radiometry, for example following a movement of the particular objects.
  • the computer of this device for assisting in the targeting of a target can be configured in particular in order to analyze each image successively displayed on the display means and to determine a first number of contact points of the guide beam touching the target and a second number of contact points of the guide beam not touching the target.
  • This computer also makes it possible to determine the percentage of these points of contact of the guide beam actually touching the target among all the points of contact of the guide beam in the environment.
  • the computer can also be configured to analyze the environment seen by the camera and in particular the contact points of the guide beam in this environment in order to determine the temporal characteristics of the guide beam and, consequently, to identify the guide beam code.
  • the device for assisting with the aiming of a target can thus implement the method for assisting with the aiming of a target previously described in order to improve the accuracy of aiming of the target.
  • the present invention also relates to a system for guiding a projectile by a guide beam comprising a generator of a guide beam, a sighting aid device as previously described and a projectile provided with a receiving device.
  • the sighting aid is configured to improve the accuracy of target illumination so that the accuracy of projectile fire on the target is improved.
  • the generator of a guide beam is preferably a generator of a laser beam.
  • This system for guiding a projectile by a guide beam can thus implement the method for guiding a projectile by a guide beam previously described.
  • figure 1 represents a system 20 for guiding a projectile 10 by a guide beam, comprising a generator 6 of a guide beam 9, a projectile 10 provided with a receiving device 11 and a device 1 for aiming assistance.
  • This system 20 for guiding a projectile 10 by a guide beam ensures the guiding of the projectile 10, a missile for example, towards a target 5.
  • the generator 6 of a guide beam 9 can be used by an operator to aim a target 5 with the guide beam 9, the operator being located on the ground and fixed.
  • the generator 6 of a guide beam 9 then generally comprises a telescopic sight 61 making it possible to achieve the aim of the target 5.
  • the generator 6 of a guide beam 9 can also be embarked in a vehicle such as an aircraft and used then both when the vehicle is moving or when it is stationary.
  • This guide beam 9 is for example a laser beam consisting of successive pulses.
  • The target 5 is first illuminated by the guide beam 9 emitted by the generator 6, and reflections of this guide beam 9 are then scattered in a multitude of directions by the target 5.
  • The guide beam 9 may or may not be visible to the human eye, depending on the wavelength(s) making up this guide beam 9.
  • The projectile 10 is launched in the direction of the target 5.
  • The projectile 10 comprises a receiving device 11 which, as it approaches the target 5, receives part of the guide beam 9 reflected by the target 5 and then determines the source of this reflected part.
  • The projectile 10 is finally guided and directed towards this source, namely the target 5, as long as the guide beam 9 points at the target 5 and illuminates it.
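The patent only states that the receiving device 11 determines the source of the reflected beam and that the projectile steers towards it; one common arrangement in such seekers is a four-quadrant detector. The sketch below is purely illustrative (the function name and the simple proportional steering law are assumptions, not the patented mechanism):

```python
def steering_command(detector_quadrants, gain=1.0):
    """Derive pitch/yaw corrections from a four-quadrant seeker reading.

    detector_quadrants: intensities (up, down, left, right) of the
    reflected guide beam on the seeker; the projectile steers toward
    the brighter side. Returns (pitch_cmd, yaw_cmd), both zero when
    the reflected spot is centered or no beam is received.
    """
    up, down, left, right = detector_quadrants
    total = up + down + left + right
    if total == 0:          # no reflected beam received: no correction
        return 0.0, 0.0
    pitch = gain * (up - down) / total
    yaw = gain * (right - left) / total
    return pitch, yaw
```

Normalizing by the total intensity makes the command depend on the spot's position rather than on the absolute reflected power.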
  • The device 1 for assisting in aiming at a target 5 comprises a camera 2, a display means 3, a computer 4 and a selection means 7.
  • The display means 3 is a screen.
  • The device 1 for assisting in aiming at a target 5 is configured to improve the accuracy of aiming at the target 5 by implementing a method for assisting in aiming at a target, a block diagram of which is shown in figure 2 and which includes the following steps.
  • During a first step 101, a complete scan of an environment is carried out using the camera 2.
  • During a second step 102, a complete image of this environment, corresponding to this complete scan, is displayed on the display means 3. This complete image is shown in figure 3.
  • During a third step 103, a target 5 is identified and then selected in this complete image of the environment. This identification is made by an operator who is in charge of aiming at the target with the guide beam 9.
  • The operator selects the target 5 on the complete image of the environment.
  • This selection is made by means of a selection means 7, such as a push button, while the operator is aiming at the target 5.
  • This selection is made by the operator while he is aiming at the target 5 by means of the generator 6 of a guide beam, but without emitting a guide beam.
  • The operator uses, for example, the telescopic sight 61 of this generator 6 of a guide beam to aim at the target 5, then actuates the selection means 7 to select the targeted target 5.
  • This selection of the target 5 can also be made automatically when the operator aims at a stationary target 5 or holds the aim for a first predetermined duration.
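Such an automatic selection of a stationary target held for a predetermined duration might be approximated as below; this is an assumption-laden sketch, since the patent does not describe the stillness test (the function name, pixel tolerance and sampling scheme are all illustrative):

```python
def auto_select(track_positions, duration_s, frame_dt, motion_tol=2.0):
    """Return True when the aimed point has stayed still long enough.

    track_positions: recent (x, y) pixel positions of the aimed point,
    sampled every frame_dt seconds. The target is auto-selected if all
    positions over the last duration_s seconds stay within motion_tol
    pixels of the newest one (a stationary target, held for the first
    predetermined duration).
    """
    needed = int(duration_s / frame_dt)
    if len(track_positions) < needed:
        return False
    recent = track_positions[-needed:]
    x0, y0 = recent[-1]
    return all((x - x0) ** 2 + (y - y0) ** 2 <= motion_tol ** 2
               for x, y in recent)
```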
  • During a fourth step 104, the operator points the guide beam 9 at the target 5 in order to guide the projectile 10 towards this target 5.
  • The operator generally uses the aiming reticle present in the telescopic sight 61 of the generator 6 of a guide beam to aim at the target 5.
  • The first step 101 and the second step 102 are repeated before this fourth step 104 is carried out, in order to update the complete display of the environment.
  • During a fifth display step 105, the complete image of this environment is displayed on the display means 3 together with the contact point 91 of the guide beam 9 in the environment.
  • The guide beam 9 and its reflection on the target 5 are advantageously always visible to the camera 2.
  • An aiming reticle 8 can also be displayed on the display means 3, thus indicating to the operator the targeted point in the environment.
  • This complete image, comprising the contact point 91 of the guide beam 9 and the aiming reticle 8, is shown in figure 4.
  • This fifth display step 105 thus allows the operator to check, on the one hand, that the aiming reticle 8 is indeed pointing at the target 5 and, on the other hand, that the contact point 91 of the guide beam 9 is also on the target 5. Indeed, the operator must keep the guide beam 9 pointed at the target 5 until the projectile 10 impacts the target 5.
  • An intermediate step 115 of completely scanning the environment can be carried out between the fourth pointing step 104 and the fifth display step 105.
  • During a sixth pointing step 106, the operator again points the guide beam 9 at the target 5.
  • This sixth pointing step 106 advantageously allows the operator to correct the aim if the contact point 91 is not located on the target 5 in the complete image displayed during the fifth display step 105.
  • During a seventh step 107, a selective scan of the target 5 and of the contact point 91 of the guide beam 9 in the environment is carried out by the camera 2.
  • The camera 2 is in fact a camera capable of specifically recording information on particular objects in the environment according to a change in their radiometry following, for example, their movement.
  • The camera 2 thus records specifically, and only, the information on particular objects in the environment which are in motion, as well as the contact points 91 of the guide beam 9.
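A camera that records only objects whose radiometry changes can be approximated, with a conventional sensor, by differencing successive frames; the sketch below is illustrative only and its names are assumptions, not the camera actually used by the device:

```python
def changed_pixels(prev_frame, frame, threshold):
    """Approximate selective scanning: keep only pixels whose radiometry
    changed between two frames (moving objects, beam contact points).

    Frames are 2D lists of intensities; returns the set of (row, col)
    coordinates whose intensity changed by more than threshold.
    """
    return {(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if abs(value - prev_frame[r][c]) > threshold}
```

Static background pixels are discarded, leaving only the moving target and the bright, pulsed contact point of the guide beam.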
  • During an eighth step 108, this selective image of the target 5 and of at least one contact point 91 of the guide beam 9 in the environment is displayed.
  • The operator can view the contact point 91 of the guide beam 9 on the selective image of the environment and check that this contact point 91 is still indeed on the target 5.
  • This selective image is simplified and mainly displays the target 5, which is for example in motion, and the contact point 91 of the guide beam 9 in the environment.
  • This selective image advantageously allows a quicker analysis by the operator, who immediately sees the position of the contact point 91 with respect to the target 5.
  • The aiming reticle 8 can be displayed on the display means 3 in order to indicate to the operator the targeted point in the environment.
  • This selective image, comprising the contact point 91 of the guide beam 9 and the aiming reticle 8, is shown in figure 5.
  • The sixth, seventh and eighth steps are then repeated continuously until the impact of the projectile 10.
  • The aiming aid device 1 thus advantageously provides the operator, in real time and during the aiming operation, with feedback on the positions of the target 5 and of the contact point 91 of the guide beam 9 in the environment, thanks to the images displayed on the display means 3 after the target 5 has been identified and selected. The operator can then immediately correct a deviation of the contact point 91 from the target 5 and thus improve the aiming accuracy.
  • The aiming aid device 1 also makes it possible to quantify the aiming accuracy.
  • The computer 4 is configured to analyze each image successively displayed on the display means 3 and to determine, during a ninth step 109, a first number of contact points 91 at which the guide beam 9 touches the target 5 and a second number of contact points 91 at which the guide beam does not touch the target 5.
  • This computer 4 also makes it possible to calculate the percentage of contact points 91 actually touching the target 5 among all the contact points 91 of the guide beam 9 in the environment.
  • During a tenth step 110, information 92 on the accuracy of the contact points 91 touching the target 5, formed by this percentage of contact points 91 actually touching the chosen target, can be displayed on the display means 3.
  • The ninth step 109 and the tenth step 110 preferably take place simultaneously with the sixth, seventh and eighth steps, as shown in the block diagram of figure 2, and are repeated until the impact of the projectile 10.
  • The contact points 91 of the guide beam 9 in the environment can be recorded over a second predetermined duration and then be displayed with the selective image of the target 5.
  • The selective image of the target 5 and the current contact point 91 of the guide beam 9, captured during the seventh scanning step 107, can be displayed simultaneously with at least one of the contact points 91 previously displayed during the fifth display step 105 and during previous eighth display steps 108, if any.
  • The ninth step 109 can also take place over the second predetermined duration. In this way, the accuracy information 92 displayed during the tenth step 110 is determined over this second predetermined duration.
  • This information 92 is displayed on the display means 3 with the contact points 91 recorded during the second predetermined duration, as shown in figures 6 and 7.
  • The operator can thus visualize the positions of the contact points 91 over this second predetermined duration and thereby visualize the accuracy of the contact points 91 with respect to the target 5.
  • This information 92 can also be used by the system 20 for guiding a projectile 10 in order to confirm or cancel the locking of the projectile 10 onto the target 5 and the launching of the projectile 10 in the direction of the target 5. Indeed, if the operator considers the aiming accuracy too low, he can cancel the launching of the projectile 10, or else suspend it momentarily until the aiming accuracy is sufficient. This cancellation can also occur automatically if the information 92 on the accuracy of the contact points 91 of the guide beam 9 touching the target 5 is below a predetermined threshold.
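The automatic confirmation or cancellation against a predetermined threshold could be sketched as follows, combining the sliding recording window with the accuracy test; the class name, the window size and the 80% default threshold are assumptions for illustration only:

```python
from collections import deque

class LaunchGate:
    """Confirm or cancel launch from recent contact-point accuracy.

    Keeps hit/miss flags over the last `window` images (the second
    predetermined duration) and authorizes launch only while the hit
    percentage stays at or above `threshold_percent`.
    """
    def __init__(self, window=50, threshold_percent=80.0):
        self.flags = deque(maxlen=window)   # oldest flags drop out automatically
        self.threshold = threshold_percent

    def update(self, on_target):
        """Record whether the latest contact point touched the target."""
        self.flags.append(bool(on_target))

    def accuracy(self):
        """Percentage of recorded contact points touching the target."""
        if not self.flags:
            return 0.0
        return 100.0 * sum(self.flags) / len(self.flags)

    def launch_authorized(self):
        """False means the launch is cancelled or suspended."""
        return self.accuracy() >= self.threshold
```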
  • The aiming aid device 1 also makes it possible to identify the code of the guide beam 9 aimed at the target 5.
  • The computer 4 is configured to analyze the environment seen by the camera 2, and in particular the contact points 91 of the guide beam 9 in the environment. The computer 4 can thus determine, during an eleventh step 111, the temporal characteristics of the guide beam 9 and identify the code of this guide beam 9. This eleventh step 111 takes place after the fourth step 104 and in parallel with the following steps.


Claims (16)

  1. Method for assisting in aiming at a target (5),
    characterized in that the method comprises the following steps:
    - a first step (101) of completely scanning an environment using a camera (2),
    - a second step (102) of displaying a complete image of the environment,
    - a third step (103) of identifying and selecting a target (5) in the complete image of the environment,
    - a fourth step (104) of a user aiming at the target (5) with a guide beam (9),
    - a fifth step (105) of displaying a complete image of the environment and of the contact point (91) of the guide beam (9) in the environment,
    - a sixth step (106) of the user aiming at the target (5) with the guide beam (9),
    - a seventh step (107) of selectively scanning the target (5) and the contact point (91) of the guide beam (9) in the environment using the camera (2),
    - an eighth step (108) of displaying a selective image of the target (5), of the current contact point (91) of the guide beam (9) in the environment and of at least one of the previously displayed contact points (91),
    - a ninth step (109) of calculating a first number of contact points (91) at which the guide beam (9) touches the target (5) and a second number of contact points (91) at which the guide beam (9) does not touch the target (5), and
    - a tenth step (110) of displaying information on the accuracy of the contact points (91) at which the guide beam (9) touches the target (5).
  2. Method for assisting in aiming at a target (5) according to claim 1,
    characterized in that, in the eighth step (108), the contact points (91) of the guide beam (9) in the environment are recorded over a predetermined duration and then displayed with the selective image of the target (5).
  3. Method for assisting in aiming at a target (5) according to either of claims 1 and 2,
    characterized in that the information on the accuracy of the contact points (91) of the guide beam (9) touching the target (5) is formed by the first number and the second number, or by the percentage, among all the contact points (91) of the guide beam (9) in the environment, of those contact points (91) that actually touch the target (5).
  4. Method for assisting in aiming at a target (5) according to any one of claims 1 to 3,
    characterized in that the ninth step (109) takes place over a predetermined duration.
  5. Method for assisting in aiming at a target (5) according to any one of claims 1 to 4,
    characterized in that a guide beam (9) is defined by a code formed by temporal characteristics, the method comprising, after the fourth step (104) of aiming at the target (5) and in parallel with the subsequent steps, an eleventh step (111) of analyzing and identifying the guide beam (9), in which the contact points (91) of the guide beam (9) in the environment are analyzed in order to determine the temporal characteristics of the guide beam (9) and thus identify the code of the guide beam (9).
  6. Method for assisting in aiming at a target (5) according to claim 5,
    characterized in that, when the guide beam (9) is formed by a succession of pulses, the temporal characteristics of the guide beam (9) are a frequency and a duration of the pulses.
  7. Method for assisting in aiming at a target (5) according to any one of claims 1 to 6,
    characterized in that the method comprises, between the fourth step (104) of aiming at the target (5) and the fifth step (105) of displaying, an intermediate step (115) of completely scanning the environment using the camera (2) in order to update the display of the target (5) and of the environment.
  8. Method for assisting in aiming at a target (5) according to any one of claims 1 to 7,
    characterized in that, during the second, fifth and eighth display steps (102, 105, 108), an aiming reticle (8) is displayed on the target (5) in order to facilitate the identification of the target (5).
  9. Method for assisting in aiming at a target (5) according to any one of claims 1 to 8,
    characterized in that, during the seventh scanning step (107), the selective scanning of the target (5) and of the contact point (91) of the guide beam (9) is performed by recording each object of the environment for which a change in radiometry is detected, as well as the contact point (91).
  10. Method for assisting in aiming at a target (5) according to any one of claims 1 to 8,
    characterized in that, during the seventh scanning step (107), the selective scanning of the target (5) and of the contact point (91) of the guide beam (9) is performed by recording information on particular objects of the environment that are in motion, as well as the contact point (91).
  11. Method for guiding a projectile (10) by a guide beam, comprising:
    - a step of illuminating a target (5) with a guide beam (9),
    - a step of locking the projectile (10) onto the target (5),
    - a step of launching the projectile (10), and
    - a step of guiding the projectile (10) towards the target (5),
    characterized in that the method for assisting in aiming according to any one of claims 1 to 10 is applied during the illuminating step in order to improve the accuracy of the illumination of the target (5).
  12. Method for guiding a projectile (10) by a guide beam according to claim 11,
    characterized in that the step of locking the projectile (10) onto the target (5) and/or the step of launching the projectile (10) is cancelled depending on information on the accuracy with which the contact points (91) of the guide beam (9) touch the target (5) and/or on temporal characteristics of the guide beam (9).
  13. Device (1) for assisting in aiming at a target (5), comprising a camera (2), a display means (3), a computer (4) and a selection means (7),
    characterized in that the aiming aid device (1) is configured to carry out the method according to any one of claims 1 to 10, in that the camera (2) is a camera for specifically recording information on particular objects of the recorded environment, in that the target (5) is a particular object of the environment, and in that the computer (4) is configured to analyze each selective image successively displayed on the display means (3), to determine a first number of contact points (91) at which the guide beam (9) touches the target (5) and a second number of contact points (91) at which the guide beam (9) does not touch the target (5), and then to determine information on the accuracy of the contact points (91) at which the guide beam (9) touches the target (5).
  14. Device (1) for assisting in aiming at a target (5) according to claim 13,
    characterized in that the computer (4) is configured to analyze the environment seen by the camera (2), and in particular the contact points (91) of the guide beam (9) in the environment, in order to determine temporal characteristics of the guide beam (9) and to identify the guide beam (9) from these temporal characteristics.
  15. System (20) for guiding a projectile (10) by means of a guide beam, comprising a generator (6) of a guide beam and a projectile (10) provided with a receiving device (11),
    characterized in that the system (20) for guiding a projectile (10) comprises a device (1) for assisting in aiming according to either of claims 13 and 14.
  16. System (20) for guiding a projectile (10) by means of a guide beam according to claim 15,
    characterized in that the system (20) for guiding a projectile (10) by means of a guide beam implements the method for guiding a projectile (10) by means of a guide beam according to either of claims 11 and 12.
EP17167889.9A 2016-04-29 2017-04-25 Hilfsverfahren und -vorrichtung zum zielen für die laserlenkung eines projektils Active EP3239644B1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PL17167889T PL3239644T3 (pl) 2016-04-29 2017-04-25 Sposób i urządzenie wspomagające celowanie do laserowego prowadzenia pocisku

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FR1600721A FR3050814B1 (fr) 2016-04-29 2016-04-29 Procede et dispositif d'aide a la visee pour le guidage laser d'un projectile

Publications (2)

Publication Number Publication Date
EP3239644A1 EP3239644A1 (de) 2017-11-01
EP3239644B1 true EP3239644B1 (de) 2020-02-19

Family

ID=57396488

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17167889.9A Active EP3239644B1 (de) 2016-04-29 2017-04-25 Hilfsverfahren und -vorrichtung zum zielen für die laserlenkung eines projektils

Country Status (4)

Country Link
US (1) US10281239B2 (de)
EP (1) EP3239644B1 (de)
FR (1) FR3050814B1 (de)
PL (1) PL3239644T3 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017011407A1 (de) * 2017-12-11 2019-06-13 Mbda Deutschland Gmbh System und verfahren zur personenkoordinierten zielfindung eines lenkflugkörpers
DE102022122842A1 (de) * 2022-09-08 2024-03-14 Rheinmetall Electronics Gmbh Vorrichtung zum Bestimmen einer Winkelabweichung, Fahrzeug und Verfahren zur Bestimmung einer Winkelabweichung

Family Cites Families (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2557401A (en) * 1945-01-10 1951-06-19 Arma Corp Remote control apparatus
US2969018A (en) * 1957-05-01 1961-01-24 Itt Quadrant homing system
US3239674A (en) * 1960-02-02 1966-03-08 Thompson Ramo Wooldridge Inc Radiant energy receiving and detection systems
US3306206A (en) * 1962-12-04 1967-02-28 Rodney E Grantham Radio frequency free communication system
US3366346A (en) * 1965-07-19 1968-01-30 Army Usa Remote missile command system
US3617016A (en) * 1968-05-27 1971-11-02 Emil J Bolsey Image motion and change transducers and systems controlled thereby
US4143835A (en) * 1972-09-12 1979-03-13 The United States Of America As Represented By The Secretary Of The Army Missile system using laser illuminator
US3859460A (en) * 1972-11-27 1975-01-07 Baird Atomic Inc Passive image stabilization system
DE2947492C2 (de) * 1979-11-24 1983-04-28 Licentia Patent-Verwaltungs-Gmbh, 6000 Frankfurt Lenkverfahren für Flugkörper
DE3230267A1 (de) * 1982-08-14 1984-02-16 Licentia Patent-Verwaltungs-Gmbh, 6000 Frankfurt Halbaktives leitsystem fuer einen zielsuchenden, lenkbaren flugkoerper
US4615590A (en) * 1984-07-17 1986-10-07 Schwem Instruments Optically stabilized camera lens system
US6491253B1 (en) * 1985-04-15 2002-12-10 The United States Of America As Represented By The Secretary Of The Army Missile system and method for performing automatic fire control
US6487953B1 (en) * 1985-04-15 2002-12-03 The United States Of America As Represented By The Secretary Of The Army Fire control system for a short range, fiber-optic guided missile
US4678142A (en) * 1985-07-25 1987-07-07 The United States Of America As Represented By The Secretary Of The Air Force Precision guided antiaircraft munition
US4913547A (en) * 1988-01-29 1990-04-03 Moran Steven E Optically phased-locked speckle pattern interferometer
US4911541A (en) * 1988-04-06 1990-03-27 Schwem Technology Incorporated Inertial pendulum optical stabilizer
US5122908A (en) * 1989-04-21 1992-06-16 Tinsley Laboratories, Inc. Non-linear controller functions for inertial optical stabilizers
US5375008A (en) * 1991-07-17 1994-12-20 Electronic Warfare Associates, Inc. Systems for distinguishing between friendly ground targets and those of a foe
US5194908A (en) * 1991-11-29 1993-03-16 Computing Devices Canada Ltd. Detecting target movement
DE4416211C2 (de) 1994-05-07 1996-09-26 Rheinmetall Ind Gmbh Verfahren und Vorrichtung zur Flugbahnkorrektur von Geschossen
US6023322A (en) * 1995-05-04 2000-02-08 Bushnell Corporation Laser range finder with target quality display and scan mode
DE69706738T2 (de) * 1996-04-05 2002-07-04 Luchaire Defense Sa Geschoss dessen Sprengladung durch einen Zielanzeiger ausgelöst wird
US5664741A (en) * 1996-04-19 1997-09-09 The United States Of America As Represented By The Secretary Of The Army Nutated beamrider guidance using laser designators
FR2753796B1 (fr) * 1996-09-25 1998-11-13 Detecteur photosensible et mosaique de detecteurs photosensibles pour la detection d'eclats lumineux et applications
FR2753785B1 (fr) * 1996-09-25 1998-11-13 Autodirecteur d'un corps volant
US6069656A (en) * 1997-12-17 2000-05-30 Raytheon Company Method and apparatus for stabilization of images by closed loop control
US6671538B1 (en) * 1999-11-26 2003-12-30 Koninklijke Philips Electronics, N.V. Interface system for use with imaging devices to facilitate visualization of image-guided interventional procedure planning
FR2821929B1 (fr) * 2001-03-06 2003-08-29 Sagem Systeme de visee a pointage laser a telemetre designateur
DE10147837A1 (de) * 2001-09-27 2003-04-24 Rheinmetall Landsysteme Gmbh Wurfsystem für einen Gefechtskopf mit einer Richtvorrichtung zur Neutralisierung von Minen
US6891984B2 (en) * 2002-07-25 2005-05-10 Lightlab Imaging, Llc Scanning miniature optical probes with optical distortion correction and rotational control
EP1553758A1 (de) * 2003-12-03 2005-07-13 Stueckler Gerd Optische Vorrichtung zur Ausgleich des Bildverwakelns mit Anzeige der durchgeführten Korrektur
US6851645B1 (en) * 2003-12-05 2005-02-08 Lockheed Martin Corporation Non-coherent fresnel direction finding method and apparatus
EP1607710A1 (de) * 2004-06-18 2005-12-21 Saab Ab System zur Zielentfernungsermittlung für eine Laserlenkungswaffe
FR2885213B1 (fr) * 2005-05-02 2010-11-05 Giat Ind Sa Procede de commande d'une munition ou sous-munition, systeme d'attaque, munition et designateur mettant en oeuvre un tel procede
JP4832013B2 (ja) * 2005-07-05 2011-12-07 富士フイルム株式会社 像振れ補正装置
US7767945B2 (en) * 2005-11-23 2010-08-03 Raytheon Company Absolute time encoded semi-active laser designation
US7575191B2 (en) * 2006-01-27 2009-08-18 Lockheed Martin Corporation Binary optics SAL seeker (BOSS)
FR2921149B1 (fr) * 2007-09-14 2009-11-06 Thales Sa Procede de telemetrie sur image stabilisee
FR2922008B1 (fr) * 2007-10-03 2015-12-11 Nexter Munitions Dispositif de telecommande d'un designateur de cible a partir d'un module d'attaque, module d'attaque et designateur mettant en oeuvre un tel dispositif
US7978313B2 (en) * 2008-05-30 2011-07-12 The Boeing Company Systems and methods for targeting directed energy devices
SG170644A1 (en) * 2009-11-02 2011-05-30 Dso Nat Lab A device for illuminating a target
US9590000B2 (en) * 2009-12-14 2017-03-07 Shilat Optical Systems Ltd. Laser daylight designation and pointing
US8344302B1 (en) * 2010-06-07 2013-01-01 Raytheon Company Optically-coupled communication interface for a laser-guided projectile
FR2965935B1 (fr) * 2010-10-06 2012-11-16 Sagem Defense Securite Dispositif optronique d'observation et/ou de visee d'une scene comportant un telemetre, et procede de telemetrie associe
DE102010062161A1 (de) * 2010-11-30 2012-05-31 Hilti Aktiengesellschaft Distanzmessgerät und Vermessungssystem
US8970708B2 (en) * 2011-05-23 2015-03-03 The Johns Hopkins University Automatic device alignment mechanism
US8525088B1 (en) * 2012-03-21 2013-09-03 Rosemont Aerospace, Inc. View-point guided weapon system and target designation method
US9360680B1 (en) * 2012-08-10 2016-06-07 Ilias Syrgabaev Electromagnetic beam or image stabilization system
IL233692A (en) * 2014-07-17 2017-04-30 Elbit Systems Electro-Optics Elop Ltd A system and method for analyzing the quality of radiation stain criteria
WO2016151928A1 (ja) * 2015-03-20 2016-09-29 富士フイルム株式会社 測距装置、測距用制御方法、及び測距用制御プログラム
WO2016151929A1 (ja) * 2015-03-20 2016-09-29 富士フイルム株式会社 測距装置、測距用制御方法、及び測距用制御プログラム
JP6348225B2 (ja) * 2015-03-20 2018-06-27 富士フイルム株式会社 測距装置、測距用制御方法、及び測距用制御プログラム
WO2016151927A1 (ja) * 2015-03-20 2016-09-29 富士フイルム株式会社 測距装置、測距用制御方法、及び測距用制御プログラム
JP6348221B2 (ja) * 2015-03-20 2018-06-27 富士フイルム株式会社 測距装置、測距用制御方法、及び測距用制御プログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
US20170314891A1 (en) 2017-11-02
EP3239644A1 (de) 2017-11-01
US10281239B2 (en) 2019-05-07
FR3050814B1 (fr) 2019-06-07
FR3050814A1 (fr) 2017-11-03
PL3239644T3 (pl) 2020-07-13

Similar Documents

Publication Publication Date Title
US9897688B2 (en) Laser detection and image fusion system and method
US5408541A (en) Method and system for recognizing targets at long ranges
US10408574B2 (en) Compact laser and geolocating targeting system
EP0432014A1 (de) Optoelektronisches Hilfssystem für die Flugnavigation und Luftangriffsaufträge
EP0033679A1 (de) System zur Andeutung eines Objektes mittels eines Lasers
EP3239644B1 (de) Hilfsverfahren und -vorrichtung zum zielen für die laserlenkung eines projektils
US20140022388A1 (en) Air Surveillance System for Detecting Missiles Launched from Inside an Area to be Monitored and Air Surveillance Method
EP1925902A1 (de) Zielvorrichtung mit integriertem Abstandsanzeiger
FR2514884A1 (fr) Procede et dispositif pour corriger globalement, d'un tir au suivant, le tir d'une arme a tir tendu
US20240125936A1 (en) Time-resolved contrast imaging for lidar
EP3103062A1 (de) Verfahren zur bestimmung und klassifizierung von ereignissen in einer szene
US8547531B2 (en) Imaging device
US10267900B2 (en) System and method for covert pointer/communications and laser range finder
EP2652430B1 (de) Verfahren und system zur erkennung eines elektromagnetischen impulsstroms und vorrichtung mit einem solchen detektionssystem sowie zur elektromagnetischen leitung einer munition zu einem ziel
WO2013108204A1 (en) Laser target seeker with photodetector and image sensor
EP2625544B1 (de) Optoelektronische vorrichtung zum beobachten und/oder anvisieren einer szene mit einem entfernungsmesser und entsprechendes entfernungsmessungsverfahren
FR2765960A1 (fr) Methode et dispositif destines a eliminer des mines souterraines
EP2929284B1 (de) Optronische vorrichtung
EP2364455B1 (de) Entfernungsmesser
EP0605290A1 (de) Optronische Schiesshilfevorrichtung für Handwaffe und Anwendung zum Fortschritt in feindlicher Umgebung
Steindorfer et al. Daylight space debris laser ranging
FR2524137A1 (fr) Systeme de conduite de tir a lunette d'observation asservie par un dispositif de poursuite automatique
FR2974432A1 (fr) Dispositif de guidage differentiel par imagerie active laser
FR2736731A1 (fr) Dispositif de detection d'organes optiques pointes sur le dispositif
FR3129493A1 (fr) Procédé et dispositif de visualisation d'impacts d'impulsions laser sur une cible

Legal Events

Code   Title / Description

PUAI   Public reference made under Article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)

STAA   Status: the application has been published

AK     Designated contracting states (kind code of ref document: A1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX     Request for extension of the European patent; extension states: BA ME

STAA   Status: request for examination was made

17P    Request for examination filed; effective date: 20180130

RBV    Designated contracting states (corrected): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA   Status: examination is in progress

17Q    First examination report despatched; effective date: 20190314

GRAP   Despatch of communication of intention to grant a patent (original code: EPIDOSNIGR1)

STAA   Status: grant of patent is intended

INTG   Intention to grant announced; effective date: 20191007

GRAS   Grant fee paid (original code: EPIDOSNIGR3)

GRAA   (Expected) grant (original code: 0009210)

STAA   Status: the patent has been granted

AK     Designated contracting states (kind code of ref document: B1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG    Reference to a national code: country CH, legal event code EP

REG    Reference to a national code: country DE, legal event code R096, ref document number 602017011840

REG    Reference to a national code: country AT, legal event code REF, ref document number 1235449, kind code T, effective date 20200315

REG    Reference to a national code: country IE, legal event code FG4D (language of EP document: French)

REG    Reference to a national code: country NL, legal event code MP, effective date 20200219

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: RS (effective 20200219), FI (effective 20200219), NO (effective 20200519)

REG    Reference to a national code: country LT, legal event code MG4D

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: LV (effective 20200219), GR (effective 20200520), BG (effective 20200519), HR (effective 20200219), IS (effective 20200619), SE (effective 20200219)

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: NL (effective 20200219)

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SK (effective 20200219), RO (effective 20200219), LT (effective 20200219), PT (effective 20200712), CZ (effective 20200219), DK (effective 20200219), SM (effective 20200219), EE (effective 20200219), ES (effective 20200219)

REG    Reference to a national code: country DE, legal event code R119, ref document number 602017011840

REG    Reference to a national code: country AT, legal event code MK05, ref document number 1235449, kind code T, effective date 20200219

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MC (effective 20200219)

REG    Reference to a national code: country CH, legal event code PL

PLBE   No opposition filed within time limit (original code: 0009261)

STAA   Status: no opposition filed within time limit

26N    No opposition filed; effective date: 20201120

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of non-payment of due fees: LI (effective 20200430), LU (effective 20200425), CH (effective 20200430), DE (effective 20201103). Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: AT (effective 20200219)

REG    Reference to a national code: country BE, legal event code MM, effective date 20200430

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SI (effective 20200219). Lapse because of non-payment of due fees: BE (effective 20200430)

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of non-payment of due fees: IE (effective 20200425)

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: TR (effective 20200219), MT (effective 20200219), CY (effective 20200219)

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]. Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MK (effective 20200219), AL (effective 20200219)

P01    Opt-out of the competence of the Unified Patent Court (UPC) registered; effective date: 20230530

PGFP   Annual fee paid to national office [announced via postgrant information from national office to EPO]: IT (payment date 20230426, year of fee payment 7), FR (payment date 20230424, year of fee payment 7)

PGFP   Annual fee paid to national office [announced via postgrant information from national office to EPO]: PL (payment date 20230417, year of fee payment 7)

PGFP   Annual fee paid to national office [announced via postgrant information from national office to EPO]: GB (payment date 20230419, year of fee payment 7)