EP4038412A1 - Method for tracking a space object using on-board radar and lidar systems - Google Patents

Method for tracking a space object using on-board radar and lidar systems

Info

Publication number
EP4038412A1
EP4038412A1 (application number EP20819807.7A)
Authority
EP
European Patent Office
Prior art keywords
target
radar
view
field
trajectory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP20819807.7A
Other languages
English (en)
French (fr)
Inventor
Benjamin GIGLEUX
Julien CANTEGREIL
Henri CARRON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spaceable SAS
Original Assignee
Spaceable SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spaceable SAS filed Critical Spaceable SAS
Publication of EP4038412A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/50 Systems of measurement based on relative movement of target
    • G01S17/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/10 Artificial satellites; Systems of such satellites; Interplanetary vehicles
    • B64G1/1078 Maintenance satellites
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/22 Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/24 Guiding or controlling apparatus, e.g. for attitude control
    • B64G1/242 Orbits and trajectories
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/22 Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/66 Arrangements or adaptations of apparatus or instruments, not otherwise provided for
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 Combination of radar systems with lidar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/66 Tracking systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/10 Artificial satellites; Systems of such satellites; Interplanetary vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/22 Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/24 Guiding or controlling apparatus, e.g. for attitude control
    • B64G1/36 Guiding or controlling apparatus, e.g. for attitude control using sensors, e.g. sun-sensors, horizon sensors

Definitions

  • the present invention relates to a system on board a space vehicle for the detection, location and tracking of a space object, hereinafter referred to as "target".
  • the target can be cooperative or non-cooperative, and maneuverable or non-maneuverable.
  • the invention relates more particularly to a navigation system on board a space vehicle, hereinafter referred to as the "hunter", this navigation system being configured to perform a "space rendezvous", that is to say to bring the hunter towards the target in order to carry out proximity operations with the target, such as an inspection, a formation flight or a docking, while limiting the risk of collision.
  • the existing navigation systems suitable for carrying out such a mission generally employ a single active electromagnetic radar or lidar sensor.
  • Some navigation systems combine an active sensor with one or more passive sensors, such as infrared cameras or visible-light cameras offering a wide field of view. Indeed, the joint use of one or more passive sensors improves the performance of these systems compared to the use of a single active sensor. However, the performance of these passive sensors can be greatly degraded in the event of a rapid transition between day and night or between night and day. These passive sensors can also be damaged by exposure to direct sunlight. Passive sensors, even when used in combination with an active sensor, therefore do not make it possible to solve the problem of short-range flight safety under all the operational conditions that may arise.
  • the active sensors with a narrow field of view generally used for space rendezvous missions, namely radars with directional antennas and scanning lidars, do not make it possible to resolve this problem, in particular in the presence of flickering phenomena.
  • Such flickering phenomena correspond to a large variation in the radar (or lidar) signal re-emitted by the target in the direction of the radar (or lidar). These variations depend on the orientation of the target relative to the sensor, the geometry of the target, the flight conditions, etc.
  • Embodiments relate to a method of tracking a target located on a known trajectory by a space vehicle, the method comprising an acquisition phase comprising the steps of: activating a lidar system in scan mode to scan a region in an estimated direction of a target, acquiring signals from the lidar system, determining target trajectory data using the signals acquired from the lidar system, engaging the space vehicle on an approach or inspection trajectory towards the target, determined according to the target trajectory data, and, if the target is no longer detected in the acquisition phase, activating a detection phase at a short distance from the target, comprising the steps of: activating a wide-field-of-view radar, acquiring and processing signals from the wide-field-of-view radar to detect the target, and, if the target is detected in the signals from the wide-field-of-view radar, activating the acquisition phase, otherwise engaging the space vehicle on a trajectory away from the target.
  • the method comprises a long-range detection phase, comprising the steps of: activating a narrow-field-of-view radar in scanning mode in the space vehicle, acquiring and processing signals coming from the narrow-field-of-view radar to detect a target, and, if a target is detected in the signals from the narrow-field-of-view radar, determining target trajectory data based on the signals acquired from the narrow-field-of-view radar, and activating the acquisition phase.
  • the method comprises the steps of: receiving from a communication system static parameters of the trajectory of the target, estimating the trajectory data of the target as a function of the static trajectory parameters, and activating the long-range detection phase when the estimated position of the target is at a distance from the space vehicle less than a first distance threshold value.
  • the method comprises, upon activation of the long-range detection phase, the steps of pointing the lidar system in a direction estimated as a function of the target trajectory data, and acquiring and processing signals from the lidar system to detect the target.
  • a narrow-field-of-view radar is activated at the same time as the lidar system, data from the narrow-field-of-view radar and the lidar system being merged using Kalman filters to refine the target trajectory data.
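The Kalman-filter fusion mentioned above can be illustrated with a minimal scalar measurement update, fusing a coarse radar range with a more precise lidar range. This is a sketch only; the patent does not specify the filter's state vector or noise models, so the variances below are assumptions.

```python
def kalman_fuse(x_est, p_est, z, r):
    """One scalar Kalman measurement update.
    x_est, p_est: prior estimate and its variance.
    z, r: new measurement and its variance.
    Returns the posterior estimate and variance."""
    k = p_est / (p_est + r)            # Kalman gain
    x_post = x_est + k * (z - x_est)   # corrected estimate
    p_post = (1.0 - k) * p_est         # reduced uncertainty
    return x_post, p_post

# Fuse a radar-level range estimate with a more precise lidar range
# (illustrative numbers: radar variance 25 m^2, lidar variance 1 m^2).
x, p = 1000.0, 25.0
x, p = kalman_fuse(x, p, 998.0, 1.0)
```

The posterior estimate is pulled towards the more precise lidar measurement, and the posterior variance is smaller than either sensor's variance alone, which is the point of fusing the two independent measurements.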
  • the wide-field-of-view radar is activated alternately with a narrow-field-of-view radar when the space vehicle is on the outbound trajectory.
  • when the target is detected by the wide-field-of-view radar, the lidar system is pointed in an initial direction defined by target trajectory data determined from the signals from the wide-field-of-view radar or a narrow-field-of-view radar, then, in the absence of detection of the target by the lidar system, in successive directions located along a first scan pattern.
  • a narrow-field-of-view radar is pointed in an initial direction defined by the target trajectory data determined from the signals from the wide-field-of-view radar, then, in the absence of detection by the narrow-field-of-view radar, in successive directions along a second scan pattern.
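The "successive directions along a scan pattern" can be generated, for example, as an expanding square spiral of beam pointings around the initial direction. This is only an illustrative pattern; the patent does not define the actual scan geometry.

```python
def spiral_scan(az0, el0, step, n):
    """Return n (azimuth, elevation) pointings on an expanding square
    spiral centred on (az0, el0), with beam spacing `step` (degrees)."""
    dirs = [(az0, el0)]
    da, de = step, 0.0          # current step direction
    leg, steps_in_leg = 1, 0    # spiral leg length and progress
    az, el = az0, el0
    while len(dirs) < n:
        az, el = az + da, el + de
        dirs.append((az, el))
        steps_in_leg += 1
        if steps_in_leg == leg:
            steps_in_leg = 0
            da, de = -de, da    # turn 90 degrees
            if de == 0.0:       # after every second turn, lengthen the leg
                leg += 1
    return dirs
```

Starting from the estimated direction, the pattern revisits no pointing and covers a growing square around the initial estimate, which suits a search around a slightly wrong initial direction.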
  • the distance between the space vehicle and the target is compared with a second distance threshold value, less than the first distance threshold value, defining a target inspection zone where the space vehicle is engaged on a target inspection trajectory.
  • the method comprises a step of estimating a risk of collision with the target in the short-range detection phase or the acquisition phase, as a function of a trajectory followed by the space vehicle and estimated trajectory data of the target, and a step of engaging the space vehicle on the outbound trajectory if the risk of collision is greater than a threshold value.
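A simple proxy for the collision-risk estimate is the minimum separation at the closest point of approach, computed here under a constant-relative-velocity assumption. This is a simplification; the actual method would use the full estimated trajectories and their error amplitudes.

```python
def closest_approach(p_rel, v_rel, horizon):
    """Minimum hunter-target separation over `horizon` seconds, assuming
    constant relative velocity.
    p_rel: relative position (m), v_rel: relative velocity (m/s), 3-tuples.
    Returns (minimum distance in m, time of closest approach in s)."""
    dot = sum(p * v for p, v in zip(p_rel, v_rel))
    vv = sum(v * v for v in v_rel)
    # time of closest approach, clamped to [0, horizon]
    t_min = 0.0 if vv == 0.0 else max(0.0, min(horizon, -dot / vv))
    d2 = sum((p + v * t_min) ** 2 for p, v in zip(p_rel, v_rel))
    return d2 ** 0.5, t_min

# Example: target 1 km ahead, closing at 10 m/s with a 2 m/s lateral drift.
dist, t = closest_approach((1000.0, 0.0, 0.0), (-10.0, 2.0, 0.0), 300.0)
```

Comparing `dist` against a safety radius gives a binary risk decision of the kind that would trigger the outbound trajectory.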
  • Embodiments may also relate to a computer program product loadable into a memory and which, when executed by a computer connected to the memory, configures the computer to implement the previously defined method.
  • Embodiments may also relate to a computer comprising interface circuits for receiving data from a wide field of view radar and a lidar system, the computer being configured to implement the method defined above.
  • the interface circuits are configured to receive data from a narrow-field-of-view radar, or to transmit commands to and receive data from a radar system with a field of view configurable between a wide field of view and a narrow field of view.
  • Embodiments can also relate to a space vehicle comprising the previously defined computer, a radar with a wide field of view and a lidar system.
  • FIG. 1 schematically represents a space vehicle performing the function of the hunter, according to one embodiment
  • FIG. 2 schematically represents fields of view of a radar system on board the space vehicle, according to one embodiment
  • FIG. 3 schematically represents a functional architecture of a navigation system on board the space vehicle, according to one embodiment
  • FIG. 4 schematically represents a hardware architecture of the navigation system on board the space vehicle, according to one embodiment
  • FIG. 5 schematically represents a hardware architecture of a mission computer of the navigation system
  • FIG. 6 schematically illustrates different phases of a space rendezvous mission of the spacecraft with respect to a target, according to one embodiment
  • FIG. 7 represents a state and transition diagram illustrating the operation of the navigation system, according to one embodiment.
  • a space vehicle jointly uses a radar whose field of view is configurable by analog or digital means, and an analog or digital scanning lidar, within the framework of a space rendezvous between the space vehicle (or hunter) and a space object (or target), in particular to allow the hunter to carry out proximity operations on the target.
  • the variable field of view of the radar makes it possible to optimize the link budget in a specific direction or, on the contrary, to cover a large field of view instantly.
  • the lidar sensor scanning makes it possible to achieve the angular resolution and precision necessary for the safety of proximity operations carried out by the hunter.
  • Other sensors can be used in combination to improve target detection and tracking performance.
  • Proximity operations carried out by the hunter can be an inspection of the target and/or its immediate environment (presence of debris, space weather, analysis of the electromagnetic spectrum), docking of the hunter to the target and/or a maintenance operation on the target.
  • the target can be a satellite, a space station, a space vehicle or a piece of debris orbiting a celestial body or following a known trajectory in the solar system.
  • the radar is first used in a narrow or directional field of view, to gradually scan the hunter's environment, until the target is detected and located.
  • Lidar can be used to confirm the radar detection and to refine the target location data provided by the radar.
  • the wide-field-of-view radar is used with a higher refresh rate than when the target is tracked, to quickly find the target again and direct the lidar towards it to refine its localization.
  • the invention thus makes it possible to recover the tracking of the target within a short time and to compare two independent measurements of the position of the target, in particular to interrupt the rendezvous in the event of an anomaly.
  • the joint use of two spectrally distant detection means makes it possible to reduce the risks of loss of tracking of the target, caused in particular by the flicker of the target in a given spectral band. The safety of short-range flight is thus increased.
  • FIG. 1 represents a space vehicle 101, according to one embodiment.
  • the hunter 101 is equipped with solar panels 102 to recover energy from solar radiation, which is stored in batteries supplying electricity to the on-board systems.
  • the hunter 101 can include an imager 103 in the visible domain for photographing a target during a target inspection phase.
  • the hunter 101 can be of the nano-satellite type.
  • the hunter 101 can also include one or more of the following devices: an imager in the infrared range, a spectrum analyzer, a remote control configured to transmit commands to the target, a space weather analyzer configured to perform the desired proximity monitoring operations.
  • the hunter 101 comprises a telecommunications system 104 providing a communication link between the hunter and one or more earth or space stations.
  • the hunter 101 can thus receive mission orders from an operator.
  • the received mission orders may include data characterizing a trajectory making it possible to estimate the position of a target at any time, this data being referred to in the following as "static trajectory parameters".
  • Telecommunications circuits also allow the hunter to transmit spatial data collected during the inspection of a target to a ground or space station.
  • the hunter 101 comprises a radar system with a configurable field of view comprising a transmitting antenna 105 and several receiving antennas 106.
  • the transmitting antenna 105 transmits an incident radar signal propagating in space, and reflecting off one or more targets that may be in the field of the incident radar signal.
  • the reception antennas 106 collect the signals reflected by any targets.
  • the field of view of the radar system can be adjusted between a wide field of view and a narrow field of view.
  • the radar system comprises a wide field of view radar and a narrow field of view radar.
  • the hunter 101 comprises a scanning lidar 107.
  • This scanning can be carried out by a mechanical or electronic device.
  • FIG. 2 represents the fields of view 201, 202 of the radar system, according to one embodiment.
  • the radar system has a narrow field of view mode 201, in which the transmitted energy and the received energy are concentrated by the radar system in a given direction.
  • the narrow field of view has an angular width A1 which may be identical in azimuth and in elevation.
  • the angular width A1 is between 3° and 10°, for example equal to 5°.
  • the radar system has a wide field of view mode 202 used to observe a large angular sector almost instantaneously.
  • the wide field of view has an angular width A2 which may be identical in azimuth and in elevation.
  • the angular width A2 is between 30° and 180°, for example equal to 100°.
  • the radar system is configured in narrow field of view mode when the target is at a distance from the hunter 101 between distances d1 and d2, and in wide field of view mode when the target is at a distance from the hunter 101 less than distance d2.
  • the distance d1 is between 1 and 5 km.
  • the distance d2 is between 50 m and 1000 m, for example equal to 500 m.
  • the distance d1 depends on the sensitivity of the radar system in narrow field of view mode, and on the radar cross-section of the target, taking into account the parts of the target that reflect the radiation emitted by the radar back towards it.
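The threshold logic around d1 and d2 can be sketched as a small mode selector driven by the estimated hunter-target range. The numeric thresholds below are the example values given in the text; the actual values depend on the radar sensitivity and target radar cross-section as noted above.

```python
D1 = 5000.0   # long-range threshold d1 (m), upper end of the 1-5 km range
D2 = 500.0    # short-range threshold d2 (m), example value from the text

def radar_mode(range_m):
    """Select the radar field-of-view mode from the estimated
    hunter-target distance, following the d1/d2 thresholds."""
    if range_m > D1:
        return "inactive"   # beyond d1: radar not yet activated
    if range_m > D2:
        return "narrow"     # d2 < range <= d1: concentrate the energy
    return "wide"           # below d2: observe a large sector instantly
```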
  • FIG. 3 shows the functional architecture of the hunter's navigation system 101, according to one embodiment.
  • the navigation system comprises: an MCLC mission computer, the COMC telecommunication system coupled to the MCLC computer by interface circuits 303, a NAVC navigation system coupled to the MCLC computer by interface circuits 305, the RDRS radar system coupled to the MCLC computer by interface circuits 307, the LDRS scanning lidar system coupled to the MCLC computer by interface circuits 309, and an SDCC spatial data acquisition system, to acquire information relating to a target and its close vicinity, the SDCC system being coupled to the MCLC computer by interface circuits 311.
  • the MCLC computer is configured to receive mission orders from the COMC communication system, and to transmit data to the latter, through the interface circuit 303.
  • the MCLC computer is also configured to control the RDRS radar and LDRS lidar systems and the NAVC navigation system, and to receive and process data from these systems, via interface circuits 305, 307, 309.
  • a ground operator can transmit to the MCLC computer an inspection mission order including static parameters of a target's trajectory, allowing the approximate position of the target to be determined at any time.
  • the static trajectory parameters include orbital parameters which may include the semi-major axis of the orbit, the eccentricity of the orbit, the inclination of the orbital plane, the longitude of the ascending node, the argument of the periapsis and the time of passage of the target at the periapsis.
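Propagating the six static orbital parameters listed above to an estimated target position is standard Keplerian mechanics. The sketch below assumes an unperturbed elliptical Earth orbit; the patent does not state which propagator is used.

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter (m^3/s^2)

def kepler_position(a, e, inc, raan, argp, t_peri, t):
    """ECI position (m) of a target on a Keplerian orbit at time t.
    a: semi-major axis (m), e: eccentricity, inc: inclination,
    raan: longitude of ascending node, argp: argument of periapsis,
    t_peri: time of periapsis passage (angles in rad, times in s)."""
    n = math.sqrt(MU / a**3)            # mean motion
    m_anom = n * (t - t_peri)           # mean anomaly
    ecc = m_anom                        # solve Kepler's equation by Newton
    for _ in range(20):
        ecc -= (ecc - e * math.sin(ecc) - m_anom) / (1 - e * math.cos(ecc))
    # true anomaly and orbital radius
    nu = 2 * math.atan2(math.sqrt(1 + e) * math.sin(ecc / 2),
                        math.sqrt(1 - e) * math.cos(ecc / 2))
    r = a * (1 - e * math.cos(ecc))
    # position in the orbital plane, then rotate by argp, inc, raan
    x_p, y_p = r * math.cos(nu), r * math.sin(nu)
    cr, sr = math.cos(raan), math.sin(raan)
    ci, si = math.cos(inc), math.sin(inc)
    cw, sw = math.cos(argp), math.sin(argp)
    x = (cr * cw - sr * sw * ci) * x_p + (-cr * sw - sr * cw * ci) * y_p
    y = (sr * cw + cr * sw * ci) * x_p + (-sr * sw + cr * cw * ci) * y_p
    z = (sw * si) * x_p + (cw * si) * y_p
    return x, y, z
```

This kind of propagation is what yields the "assumed position of the target at any time" used for the long-range approach.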
  • the NAVC navigation system can include a GNSS ("Global Navigation Satellite System") receiver configured to determine the position of the hunter 101 in a geocentric reference frame, one or more electric thrusters, reaction wheels and magnetorquers to orient the hunter in a desired direction.
  • the navigation system can also include one or more chemical thrusters to orient the hunter in the desired direction, and a star sensor to determine the orientation of the hunter with respect to the stars, this device being known to those skilled in the art under the term "star tracker".
  • the MCLC mission computer determines the trajectory to be followed by the hunter to position itself at the approach distance d1 from the assumed position of the target. For this, the MCLC mission computer uses the position of the hunter in a frame of reference, for example the geocentric frame of reference, provided by the navigation system NAVC, and the static parameters of the trajectory of the target, received by the communication system COMC. The trajectory to be followed is then implemented by the MCLC computer, which determines navigation commands to be transmitted to the NAVC navigation system.
  • the RDRS radar system comprises a frequency-modulated continuous wave (FMCW) radar, operating in a frequency band between 0.1 GHz and 100 GHz, for example in the frequency band 14.3 GHz - 14.4 GHz.
  • the radar is used to detect, locate and track the target as soon as the hunter arrives at the approach distance d1 from the supposed position of the target.
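For an FMCW radar of this kind, target range follows from the beat frequency between the transmitted and received chirps, r = c·f_beat·T_chirp / (2·B). This is the textbook relation; the chirp parameters below are illustrative and not taken from the patent.

```python
C = 299_792_458.0  # speed of light (m/s)

def fmcw_range(f_beat, bandwidth, t_chirp):
    """Target range (m) from the FMCW beat frequency:
    r = c * f_beat * T_chirp / (2 * B)."""
    return C * f_beat * t_chirp / (2.0 * bandwidth)

# Illustrative chirp: 100 MHz sweep over 1 ms, beat frequency 1 MHz.
r = fmcw_range(1.0e6, 100.0e6, 1.0e-3)
```

A 100 MHz sweep as in the 14.3-14.4 GHz example band gives a range resolution of c/(2B), about 1.5 m, which is consistent with locating a target at kilometre-scale distances.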
  • the LDRS lidar system comprises a mechanical or electronic scanning lidar.
  • the mechanically scanned lidar has servomotors to steer the lidar transmitting and receiving devices, based on commands from the MCLC mission computer.
  • the RDRS radar and LDRS lidar systems provide detection data to the MCLC computer, which correlates and merges this data to, if necessary, confirm the presence of the target and refine the estimation of its position.
  • the fusion algorithm uses Kalman filters known to those skilled in the art to merge radar data with lidar data.
  • FIG. 4 shows the RDRS radar system, according to one embodiment.
  • the RDRS radar system includes the active electronically scanned transmission antenna 105, for example of the AESA antenna type ("Active Electronically Scanned Array").
  • the transmit antenna 105 has a control interface 402 for controlling the gain of the transmit antenna 105 and configuring it for a wide field of view or a narrow (directional) field of view.
  • the RDRS radar system also includes active, electronically scanned receiving antennas 106.
  • the comparison of the phases of the signals received by the antennas 106 makes it possible to determine the direction of arrival of the signal reflected by the target.
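The phase-comparison step corresponds to standard interferometric direction finding: for two antennas separated by a baseline d, sin(θ) = Δφ·λ / (2π·d). The baseline and wavelength below are illustrative, chosen to match the example 14.3-14.4 GHz band.

```python
import math

def angle_of_arrival(phase_diff, wavelength, baseline):
    """Direction of arrival (radians off boresight) from the phase
    difference between two antennas: sin(theta) = dphi * lambda / (2*pi*d).
    Unambiguous only while |dphi| < pi for a half-wavelength baseline."""
    s = phase_diff * wavelength / (2.0 * math.pi * baseline)
    return math.asin(max(-1.0, min(1.0, s)))

# Half-wavelength baseline at 14.35 GHz (~20.9 mm wavelength, illustrative).
wl = 299792458.0 / 14.35e9
theta = angle_of_arrival(math.pi / 2, wl, wl / 2)
```

With three or more receiving antennas, as in the text, the same relation applied on two orthogonal baselines yields both azimuth and elevation.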
  • the radar comprises at least three reception antennas 106 (four in the example of FIGS. 1 and 4).
  • Each reception antenna 403 comprises a control interface 404 making it possible to control the reception gain of the reception antennas and to configure their field of view as wide or narrow (directional).
  • the RDRS radar system comprises TXC transmission circuits ensuring the generation of the radar signal to be transmitted, which is supplied to the transmission antenna 401, and RXC reception circuits ensuring the filtering and digitization of the signals received by the reception antennas 106.
  • An interface 407 allows the TXC and RXC circuits to synchronize.
  • the RDRS radar system also includes an RPRC radar signal processor for processing radar signals to detect and locate a target.
  • An interface 409 makes it possible to synchronize the RPRC processor with the TXC transmission circuits.
  • An interface 410 makes it possible to synchronize the RPRC processor with the RXC reception circuits.
  • a digital bus 411 ensures the transmission of the digitized signals by the reception circuits RXC to the processor RPRC.
  • the RDRS radar system also includes an input-output interface 412 allowing the RPRC radar processor to receive commands and provide trajectory data and data characterizing targets detected by the radar system.
  • the estimated "trajectory data" of a target denotes either data defining at least two time-stamped positions of the target, these data being able to include estimated future positions, each position being able to be associated with an error amplitude, or data defining a time-stamped position associated with data defining a time-stamped velocity vector, the velocity vector possibly also being associated with error amplitudes.
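The trajectory data defined above could be represented, for example, by time-stamped states with optional error amplitudes. This is an illustrative structure; the field names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TrajectoryPoint:
    """One time-stamped target state as defined in the text: a position,
    optionally a velocity vector, each with an error amplitude."""
    t: float                              # time stamp (s)
    position: tuple                       # (x, y, z) in metres
    pos_error: float = 0.0                # position error amplitude (m)
    velocity: Optional[tuple] = None      # (vx, vy, vz) in m/s, optional
    vel_error: float = 0.0                # velocity error amplitude (m/s)

@dataclass
class TrajectoryData:
    """At least two time-stamped positions, possibly including
    estimated future positions."""
    points: list = field(default_factory=list)
```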
  • the RDRS radar system includes a wide field of view transmitting antenna and a narrow field of view transmitting antenna.
  • the antenna selection is made by means of a circulator which routes the radar signal to be transmitted to the specified antenna while isolating the unspecified antenna.
  • FIG. 5 schematically represents a hardware architecture of the MCLC mission computer on board the space vehicle, according to one embodiment.
  • the MCLC computer comprises: a central processing unit CPU configured to execute a mission program, a clock circuit CLK to provide a regular time reference to the computer, a non-volatile memory NVM, for example of the ROM ("Read-Only Memory") type, containing the information needed to initialize the processing unit CPU and the mission program, a random access memory VM, for example of the RAM ("Random Access Memory") type, allowing the storage of the data necessary for operation after starting the processing unit CPU and during the execution of the mission software, an IOC interface circuit comprising input and/or output ports making it possible to connect the MCLC mission computer to the other on-board systems (COMC, NAVC, RDRS, LDRS, SDCC), and an address bus AB allowing the central processing unit CPU to specify an access address in the NVM or VM memories, or an input and/or output port of the IOC interface circuit.
  • the central processing unit CPU may include one or more processors, and in particular a processor dedicated to processing signals from the RDRS radar system and / or the LDRS lidar system.
  • processors can be dedicated to processing radar and lidar signals to correlate and merge these signals, for example by Kalman filters, and provide estimated trajectory data of objects detected by the RDRS radar system and / or the LDRS lidar system.
  • FIG. 6 illustrates an example of a mission entrusted to hunter 101, for example in the form of an order received by the communication system COMC.
  • this mission consists of an inspection of a target 502 located on a known trajectory. Prior to the start of this mission, the hunter 101 is located on a holding path, awaiting a mission order.
  • the mission order specifies the static trajectory parameters of the target 502, for example orbital data in the case of a satellite orbiting a celestial body.
  • a target inspection mission comprises:
  • the MCLC mission computer of the hunter 101 controls the NAVC navigation system and uses the static parameters of the target trajectory to navigate and reach the distance d1 from the assumed position of the target 502.
  • the MCLC mission computer controls the RDRS radar system so that it scans the environment in a directional field of view, until the target 502 is detected and located.
  • activation of the RDRS radar system can be performed periodically, in narrow field of view mode.
  • the MCLC mission computer switches to the observation phase 504, by configuring the RDRS radar system in a narrow field of view, and by orienting the RDRS radar and LDRS lidar systems towards the target 502, as determined during the detection phase.
  • the data from the RDRS and LDRS systems are used by the MCLC computer to estimate more precisely current and future trajectory parameters of the target 502.
  • the inspection phase 505 begins when the trajectory of the target 502 has been estimated and the flight conditions and the required safety constraints allow it.
  • the RDRS radar system remains configured in a directional field of view in order to reduce the power to be transmitted into space.
  • the LDRS scanning lidar system is directed towards target 502 and its immediate surroundings.
  • the tracking of the target may deteriorate, in particular in the event of spontaneous maneuvering of the target, or of flickering of the target, during transitions between day and night, or in the presence of the sun in the direction observed, this degradation being able to lead to a loss of tracking.
  • in this case, the RDRS radar system is automatically configured for a wide field of view so as to detect the target 502 again within a short time.
  • the MCLC computer switches to the outbound phase 506, during which it commands the NAVC navigation system to move away from the target 502.
  • the RDRS radar and LDRS lidar systems remain active until the relative distance between the hunter 101 and the target 502 is large enough to prevent any risk of collision, even in the event of a maneuver by the target.
  • FIG. 7 illustrates an example of an algorithm implemented by the MCLC mission computer, according to one embodiment.
  • After performing INIT initialization operations, the MCLC computer enters a standby state SSO, in which the hunter is maintained on a holding path, for example in a parking orbit.
  • the MCLC computer remains in the SSO state as long as no mission order is received (condition 602). In this state, the MCLC computer ensures that the hunter 101 is kept on the holding path, using the NAVC navigation system to correct its trajectory in the event of any drift in its trajectory parameters.
  • Upon receipt of a mission order (condition 603), the MCLC computer exits the SSO standby state and enters the APS long-range approach state.
  • the transition 603 between the SSO and APS states is triggered by the receipt, via the COMC telecommunications system, of a mission order specifying static trajectory parameters of a target 502.
  • the MCLC computer remains in the APS approach state as long as the distance between the hunter and the assumed position of the target remains greater than the distance d1 (condition 604).
  • the MCLC computer periodically estimates the distance between the hunter 101 and the target 502 as a function of the hunter's position in the geocentric reference frame, provided by the navigation system NAVC, and of the static trajectory parameters of the target.
  • the MCLC computer also determines an approach trajectory and follows it towards the target until the distance d1 is reached.
  • the MCLC computer exits the APS approach state and switches to the DTS detection state of the target, by periodically activating the RDRS radar system configured in narrow field of view, in order to detect and locate the target 502.
  • the RDRS radar system is configured in narrow field of view in order to optimize the link budget. At long range, free-space propagation losses have a strong impact on the radar link budget. These losses can be partially compensated by concentrating the emitted energy in a given direction, which makes it possible to detect the target without excessively increasing the transmitted power. Indeed, the greater the power to be transmitted, the better the reception antennas 106 must be isolated from the emission antennas 105 of the radar, in order to prevent transmission leaks from saturating the reception chains connected to the reception antennas.
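The link-budget argument above can be made concrete with the classical monostatic radar equation. The sketch below is illustrative only: the transmitter power, gains, wavelength and radar cross-section are hypothetical values, not figures from the patent. It shows that echo power falls off as range to the fourth power, and that raising antenna gain (i.e. narrowing the field of view) compensates for this without raising the transmitted power.

```python
import math

def received_power_w(p_tx_w, gain_tx, gain_rx, wavelength_m, rcs_m2, range_m):
    """Monostatic radar equation: received echo power in watts.

    Free-space loss grows as range**4, which is why a narrow
    (high-gain) beam is preferred at long distance.
    """
    return (p_tx_w * gain_tx * gain_rx * wavelength_m**2 * rcs_m2) / \
           ((4.0 * math.pi)**3 * range_m**4)

# Hypothetical values: 10 W transmitter, 1 m^2 target at 50 km, 10 GHz.
wide = received_power_w(10.0, gain_tx=100.0, gain_rx=100.0,
                        wavelength_m=0.03, rcs_m2=1.0, range_m=50e3)
narrow = received_power_w(10.0, gain_tx=10_000.0, gain_rx=10_000.0,
                          wavelength_m=0.03, rcs_m2=1.0, range_m=50e3)
# Raising each gain 100x raises the echo power 10^4-fold at equal
# transmitted power, so the reception chain isolation problem noted
# in the text is avoided.
```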
  • the radar beam is pointed in the direction corresponding to the estimated position of the target.
  • even in narrow field of view mode, the radar beam is wide enough to cover the angular range in which the target may physically be.
  • the radar beam is aimed in different directions in a square or circular spiral pattern, starting from the estimated direction of the target.
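The square-spiral search mentioned above can be sketched as a generator of beam-pointing offsets expanding outward from the estimated target direction. This is a minimal illustration, assuming a hypothetical step unit of one beamwidth; the patent does not specify the step size or ordering.

```python
def square_spiral(n_steps):
    """Yield (az, el) grid offsets spiraling outward from (0, 0),
    the estimated target direction (step unit = one beamwidth).
    Run lengths follow the pattern 1, 1, 2, 2, 3, 3, ...
    """
    x = y = 0
    yield (x, y)
    dx, dy = 1, 0
    run, emitted = 1, 1
    while emitted < n_steps:
        for _ in range(2):              # two runs per run length
            for _ in range(run):
                x, y = x + dx, y + dy
                yield (x, y)
                emitted += 1
                if emitted >= n_steps:
                    return
            dx, dy = -dy, dx            # rotate direction 90 degrees
        run += 1

# The first 9 pointings cover the 3x3 grid around the estimate.
offsets = list(square_spiral(9))
```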
  • the radar beam can be electronically pointed in the estimated direction of the target.
  • the radar beam can be pointed mechanically in the estimated direction of the target, thanks to the navigation system capable of orienting the hunter, or else by means of a motorized sighting system.
  • the RDRS radar system can be kept active until the target is detected.
  • the RDRS radar system employs multiple linear frequency modulated waveforms.
  • the parameters of the waveform are the transmitted bandwidth, the rise time between the minimum frequency and the maximum frequency, the fall time between the maximum frequency and the minimum frequency, and the relaxation time during which the frequency remains at its minimum value. These parameters partly determine the performance of the radar, namely range accuracy, range resolution, maximum unambiguous distance, radial speed accuracy, radial speed resolution and unambiguous radial speed range.
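The dependence of these performance figures on the waveform parameters can be illustrated with textbook FMCW relations. These formulas and the example numbers (an X-band waveform with a 150 MHz sweep) are standard illustrations, not values taken from the patent.

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_performance(bandwidth_hz, chirp_period_s, n_chirps, carrier_hz):
    """Textbook FMCW relations linking the waveform parameters to the
    performance figures named in the text. chirp_period_s is the sum
    of the rise, fall and relaxation times of one ramp."""
    wavelength = C / carrier_hz
    return {
        # Finer range cells require more transmitted bandwidth.
        "range_resolution_m": C / (2.0 * bandwidth_hz),
        # A longer chirp period extends the unambiguous range...
        "max_unambiguous_range_m": C * chirp_period_s / 2.0,
        # ...but shrinks the unambiguous radial-speed interval.
        "max_unambiguous_speed_mps": wavelength / (4.0 * chirp_period_s),
        # Speed resolution improves with total observation time.
        "speed_resolution_mps": wavelength / (2.0 * n_chirps * chirp_period_s),
    }

# Hypothetical waveform: 150 MHz sweep, 100 us chirps, 128 chirps, 10 GHz.
perf = fmcw_performance(150e6, 100e-6, 128, 10e9)
```

The trade-off stated in the next bullet is visible here: stretching the chirp period widens the unambiguous range while narrowing the unambiguous speed interval, and vice versa.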
  • Each waveform offers a compromise between the unambiguous measurement range and the precision/resolution pair.
  • the radar system employs a waveform with a wide unambiguous distance measurement range and a wide unambiguous radial speed measurement range.
  • the radar employs waveforms with high precision and resolution in distance and high precision and resolution in radial speed. The ambiguity of the measurements associated with these waveforms is resolved by virtue of the measurements obtained previously.
  • when the target 502 is detected, the RDRS radar system periodically alternates between a waveform with wide unambiguous measurement ranges and a waveform with high precision and resolution.
  • the MCLC computer uses the measurements provided by the radar system to refine the estimated trajectory data of the target using Kalman filters, providing estimates of the current and future positions of the target.
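The patent does not specify the filter model, so the following is only a minimal sketch of such a refinement: a constant-velocity Kalman filter on a single axis, with hypothetical process and measurement noise values, fed with radar position measurements.

```python
import numpy as np

def kalman_step(x, P, z, dt, q=1e-3, r=25.0):
    """One predict/update cycle of a 1-D constant-velocity Kalman
    filter (state x = [position, velocity]); z is a radar position
    measurement with variance r (m^2), q a process-noise density."""
    F = np.array([[1.0, dt], [0.0, 1.0]])          # motion model
    H = np.array([[1.0, 0.0]])                      # radar measures position
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    # Predict ahead by dt.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new measurement.
    y = z - (H @ x)[0]                              # innovation
    S = (H @ P @ H.T)[0, 0] + r                     # innovation variance
    K = (P @ H.T)[:, 0] / S                         # Kalman gain
    x = x + K * y
    P = (np.eye(2) - np.outer(K, H[0])) @ P
    return x, P

# Feed a noise-free synthetic track at 10 m/s: the velocity estimate
# converges even though only positions are measured.
x, P = np.array([0.0, 0.0]), np.eye(2) * 1e4
for k in range(1, 50):
    x, P = kalman_step(x, P, z=10.0 * k, dt=1.0)
```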
  • a volume is also defined around each calculated position, containing the target with a certain probability, taking into account the positioning errors and the errors of the Kalman filter models. This probability is set at a value between 0.5 and 0.9999, and preferably 0.9900.
  • the defined volume is used, for example, to estimate the risk of collision or to define the environment to be observed by the RDRS radar or the LDRS lidar.
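Under a Gaussian error assumption, the containment volume described above is the confidence ellipsoid of the Kalman position covariance, whose squared Mahalanobis radius is a chi-square quantile with 3 degrees of freedom. The sketch below hardcodes the standard quantiles for the probabilities named in the text (0.5, 0.99, 0.9999); the covariance values are hypothetical.

```python
import numpy as np

# Chi-square quantiles for 3 degrees of freedom (3-D position):
# the squared Mahalanobis radius of the containment ellipsoid.
CHI2_3DOF = {0.50: 2.366, 0.99: 11.345, 0.9999: 21.108}

def inside_confidence_volume(point, mean, cov, p=0.99):
    """True if `point` lies in the ellipsoid centred on the estimated
    position `mean` containing the target with probability p, given
    Gaussian errors with covariance `cov` (Kalman filter output)."""
    d = np.asarray(point, dtype=float) - np.asarray(mean, dtype=float)
    m2 = d @ np.linalg.inv(cov) @ d      # squared Mahalanobis distance
    return bool(m2 <= CHI2_3DOF[p])

# Hypothetical covariance: 10 m, 10 m and 20 m standard deviations.
cov = np.diag([100.0, 100.0, 400.0])
ok = inside_confidence_volume([15.0, 0.0, 0.0], [0.0, 0.0, 0.0], cov)
```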
  • the speed vector of the target can be calculated from the radial speed supplied by the radar system and from the tangential speed calculated from several successive calculated positions of the target and from the time elapsed between these successive positions.
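The combination described in this bullet can be sketched as follows: the tangential component comes from differencing successive positions, while the radial component is replaced by the (typically more accurate) radar radial-speed measurement along the line of sight. The numeric values are illustrative only.

```python
import numpy as np

def target_velocity(p_prev, p_curr, dt, radial_speed, los_unit):
    """Combine the radar radial speed with the tangential speed derived
    from two successive calculated positions. los_unit is the unit
    line-of-sight vector from the radar to the target."""
    los = np.asarray(los_unit, dtype=float)
    v_pos = (np.asarray(p_curr, dtype=float) -
             np.asarray(p_prev, dtype=float)) / dt
    # Keep only the component perpendicular to the line of sight...
    v_tangential = v_pos - (v_pos @ los) * los
    # ...and take the radial component from the radar measurement.
    return radial_speed * los + v_tangential

v = target_velocity([0.0, 0.0, 0.0], [10.0, 5.0, 0.0], dt=1.0,
                    radial_speed=9.8, los_unit=[1.0, 0.0, 0.0])
```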
  • the MCLC calculator remains in the DTS detection state as long as the condition hunter out of inspection area 606 is true.
  • the inspection zone corresponds to a sphere of radius d2 centered on the assumed or estimated position of the target 502.
  • the MCLC computer exits the DTS detection state and enters the DPS outbound state when condition 607 hunter in inspection zone and target not detected is true.
  • This transition condition corresponds to the absence of a detection confirming the presence of the target 502 or its detectability, even though the hunter 101 has reached a distance less than or equal to the distance d2 from the supposed position of the target.
  • the MCLC computer also exits the DTS detection state to go into an OPS observation state when the condition 608 hunter in inspection area and detected target is true.
  • This transition condition implies that the target is detected and that the hunter 101 is located at a distance less than the distance d2 from the estimated position of the target 502.
  • the MCLC computer remains in the OPS observation state as long as condition 609 inspection maneuver being calculated and acceptable risk of collision is true.
  • the MCLC computer evaluates the risk of collision with the target 502 as a function of the observation position of the hunter 101 with respect to the estimated position of the target 502.
  • the MCLC computer controls the NAVC navigation system to maintain a constant distance from the estimated position of the target.
  • the MCLC computer periodically activates the RDRS radar system configured in a narrow field of view, and possibly the LDRS lidar system, to detect and locate the target. If the LDRS lidar system is enabled, it is used to confirm and refine the target position determined by the RDRS radar system.
  • the initial aiming direction of the lidar corresponds to the last calculated position of the target 502.
  • the lidar system emits a frequency modulated electromagnetic wave, and samples the received signal.
  • the hunter-target distance is determined as a function of the sampling instant corresponding to the sample in which the target is detected in the received signal, knowing that the instant of this sample gives the sum of the travel time of the wave from the transmitter to the target and the return time from the target to the receiver, the outward and return distances being assumed to be identical.
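With outward and return paths assumed identical, the range follows directly from the detected sample's round-trip time: distance = c × t / 2. A one-line illustration, with a hypothetical 1 GS/s sampling rate:

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_range_m(sample_index, sample_rate_hz):
    """Hunter-target distance from the index of the sample in which
    the echo is detected: range = c * t / 2, the outward and return
    distances being assumed identical."""
    round_trip_s = sample_index / sample_rate_hz
    return C * round_trip_s / 2.0

# Echo detected at sample 667 of a 1 GS/s receiver: roughly 100 m.
d = lidar_range_m(sample_index=667, sample_rate_hz=1e9)
```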
  • the LDRS lidar system scans around the initial aiming direction, following a scan pattern. This pattern can be a spiral, for example the Galileo spiral, or closed concentric curves, such as circles or ellipses centered on the supposed position of the target. The lidar scan is repeated periodically from the supposed position of the target.
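One of the patterns named above, concentric circles centred on the supposed target position, can be generated as a list of angular offsets. The ring count, point density and angular step below are hypothetical parameters for illustration.

```python
import math

def concentric_scan(n_rings, pts_per_ring, ring_step_deg):
    """Lidar scan directions as closed concentric circles centred on
    the supposed target position. Returns (azimuth_offset,
    elevation_offset) pairs in degrees."""
    dirs = [(0.0, 0.0)]                 # start on the supposed position
    for ring in range(1, n_rings + 1):
        r = ring * ring_step_deg        # radius grows ring by ring
        for k in range(pts_per_ring):
            a = 2.0 * math.pi * k / pts_per_ring
            dirs.append((r * math.cos(a), r * math.sin(a)))
    return dirs

scan = concentric_scan(n_rings=3, pts_per_ring=8, ring_step_deg=0.5)
```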
  • the assumed position of target 502 is determined based on the last known position and estimated target trajectory data, and the time elapsed since the last position was obtained.
  • the MCLC computer determines one or more inspection paths and a risk of collision with the target, associated with each path, and selects an inspection path associated with an acceptable risk of collision.
  • the risk of collision can be estimated according to the trajectory data of the hunter 101 and the last known location data of the target 502, taking into account all the possible maneuvers of the target, each maneuver being weighted by its probability of occurrence if this can be estimated. To guarantee a high level of flight safety, the risk of collision can be evaluated by also considering events with a low probability of occurrence, for example a failure of a GNSS satellite used for estimating the position of the hunter.
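The probability-weighted evaluation described above amounts to a total-probability sum over the possible target maneuvers. The maneuver list, probabilities and safety threshold in this sketch are hypothetical; the patent does not give numeric values.

```python
def collision_risk(maneuvers):
    """Total collision probability as the probability-weighted sum over
    the possible target maneuvers. Each entry is a pair
    (probability_of_maneuver, probability_of_collision_given_maneuver).
    Low-probability events are kept in the sum rather than truncated."""
    return sum(p_m * p_c for p_m, p_c in maneuvers)

risk = collision_risk([
    (0.95, 1e-6),   # no maneuver: near-zero risk on the chosen path
    (0.04, 1e-3),   # small station-keeping burn by the target
    (0.01, 5e-2),   # rare spontaneous maneuver towards the hunter
])
acceptable = risk < 1e-3   # hypothetical mission safety threshold
```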
  • the MCLC computer keeps the RDRS radar system active by continuing to scan the environment, and commands the hunter 101 to hold position relative to the estimated position of the target 502, derived from the latest location data received from the RDRS radar and LDRS lidar systems. The risk of collision is then evaluated by also taking into account the time elapsed since the last detection of the target. The MCLC computer can also periodically perform tests of the correct functioning of the hunter 101.
  • the MCLC computer exits the OPS observation state to go into the DPS outbound state when condition 610 unacceptable risk of collision associated with the observation position is true.
  • Transition condition 610 corresponds to an unacceptable risk of collision while observing the target.
  • the risk of collision may also be unacceptable in the event of insufficient tracking of the target, or in the event of detection of a spontaneous maneuver of the target 502, or of a malfunction of the hunter 101.
  • the MCLC computer exits the OPS observation state to enter an IPS inspection state when the condition 611 calculated inspection path is true.
  • the transition condition 611 corresponds to obtaining a satisfactory inspection trajectory, in particular as regards the risk of collision with the target 502, which assumes sufficient tracking of the target.
  • the MCLC computer remains in the IPS inspection state as long as the condition 613 is true, that is, when the inspection success criteria are not yet met, the risk of collision associated with the inspection maneuver remains acceptable, and target tracking remains sufficient.
  • the inspection success criteria depend on the mission order and may correspond, for example, to a collection of information of sufficient quality and diversity from the target 502.
  • the MCLC computer periodically activates the RDRS radar system configured in narrow field of view and the LDRS lidar system to detect and locate the target.
  • the data from the RDRS and LDRS systems are correlated and merged, for example, using Kalman filters, to locate the target 502 and more precisely determine the trajectory data of the latter.
  • the MCLC computer controls the NAVC navigation system to periodically correct the trajectory of the hunter to follow the inspection trajectory previously calculated in the OPS observation state.
  • the MCLC computer periodically calculates the risk of collision associated with the inspection path, based on the navigation data of the hunter 101 and the location data of the target 502.
  • the MCLC computer activates the SDCC spatial data acquisition system, to acquire information relating to the target 502 and to its close vicinity.
  • the MCLC computer exits the IPS inspection state and enters the DPS outbound state when condition 619 inspection success criteria met or unacceptable risk of collision associated with the inspection maneuver is true.
  • the MCLC computer exits the IPS inspection state to enter the STP inspection interrupt state when the loss of tracking or insufficient tracking condition 615 is true.
  • Condition 615 is true when the inspection success criteria have not yet been met, the risk of collision associated with the inspection maneuver remains acceptable, and target tracking is insufficient.
  • the transition condition 615 results from a degradation of the detection of the target 502 in the IPS state, this degradation possibly being caused by the flickering of a reflective surface of the target.
  • the loss of tracking condition can be reached when the target 502 is not detected in N acquisitions among the M last consecutive acquisitions by the RDRS radar and/or LDRS lidar systems.
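The N-of-M criterion above can be sketched as a sliding-window monitor over the sequence of acquisition outcomes. N and M are mission-tunable parameters; the values used in the example are hypothetical.

```python
from collections import deque

class TrackingMonitor:
    """Declares loss of tracking when the target was missed in at
    least N of the M last consecutive radar/lidar acquisitions."""

    def __init__(self, n_missed, m_window):
        self.n_missed = n_missed
        self.window = deque(maxlen=m_window)   # last M outcomes

    def report(self, detected):
        """Record one acquisition outcome, return the lost flag."""
        self.window.append(bool(detected))
        return self.tracking_lost()

    def tracking_lost(self):
        misses = sum(1 for d in self.window if not d)
        # Only decide once a full window of M acquisitions exists.
        return len(self.window) == self.window.maxlen and \
            misses >= self.n_missed

# Example: lost when 3 of the last 5 acquisitions miss the target.
mon = TrackingMonitor(n_missed=3, m_window=5)
states = [mon.report(d)
          for d in [True, True, False, False, True, False, False]]
```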
  • the MCLC computer remains in the STP inspection interrupt state as long as the loss of tracking condition 615 is true and the time allowed from the start of the loss of tracking has not expired.
  • the MCLC computer controls the NAVC navigation system to follow the inspection path, and determines the time elapsed since entering the lost track state.
  • the MCLC computer also periodically controls the RDRS radar system configured in a wide field of view 202 to observe its surroundings almost instantaneously with the aim of finding the target 502. Another sensor system, such as the LDRS lidar system, can also be activated in the STP state to locate the target again.
  • the MCLC computer exits the STP tracking loss state to return to the IPS inspection state when the target tracking condition 618 is true.
  • the transition condition 618 corresponds to the resumption of tracking following the detection of the target 502 by one or more sensors on board the hunter 101.
  • the field of view of the RDRS radar system can then be gradually reduced until it reaches the narrow field of view 201. With each new acquisition, the radar is pointed so as to center its field of view on the direction of the target determined from the signals supplied by the RDRS radar system or the LDRS lidar system during the previous acquisition.
  • the MCLC computer exits the STP tracking loss state to go into the DPS outbound state when the timeout condition 617 is true, that is, when the target 502 is still not detected and the time allowed since the loss of tracking has expired.
  • the MCLC computer remains in the DPS outbound state as long as condition 621 hunter in the inspection zone is true, in other words, as long as the hunter 101 is at a distance less than or equal to the distance d2 from the estimated position of the target 502.
  • the MCLC computer determines an outbound trajectory and commands the NAVC navigation system to direct the hunter along this trajectory.
  • the MCLC computer periodically activates the RDRS radar system, configured alternately in wide field of view and in narrow field of view, to detect and locate the target 502.
  • the MCLC computer can also activate one or more other sensors to detect and locate the target.
  • the MCLC computer exits the DPS outbound state to enter the BSO return-to-holding-path state, for example towards its parking orbit, when condition 622 hunter out of inspection area is true.
  • the transition condition 622 is true when the estimated position of the target 502 is at a distance greater than the distance d2 from the hunter 101.
  • the MCLC computer controls the NAVC navigation system to move towards the holding path.
  • the MCLC computer remains in the BSO state as long as condition 624 hunter out of holding path is true, that is, until the hunter has returned to its holding path.
  • the MCLC computer exits the BSO return-to-holding-path state to enter the SSO standby state when the hunter 101 has reached its holding path (condition 625).
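The state machine of FIG. 7 described above can be summarized as a transition table keyed by the condition numbers from the text. This is a structural sketch only, not flight software; the APS-to-DTS condition number (605) is an assumption (the complement of condition 604), as the text does not number that transition explicitly.

```python
# Transition table for the FIG. 7 state machine; condition numbers are
# those used in the text, except 605 (assumed: complement of 604).
TRANSITIONS = {
    ("SSO", 603): "APS",   # mission order received
    ("APS", 605): "DTS",   # distance to target <= d1 (assumed number)
    ("DTS", 607): "DPS",   # in inspection zone, target not detected
    ("DTS", 608): "OPS",   # in inspection zone, target detected
    ("OPS", 610): "DPS",   # unacceptable collision risk
    ("OPS", 611): "IPS",   # inspection path calculated
    ("IPS", 619): "DPS",   # success criteria met or risk unacceptable
    ("IPS", 615): "STP",   # loss of tracking / insufficient tracking
    ("STP", 618): "IPS",   # target reacquired
    ("STP", 617): "DPS",   # timeout since loss of tracking
    ("DPS", 622): "BSO",   # hunter out of inspection zone
    ("BSO", 625): "SSO",   # holding path reached
}

def next_state(state, condition):
    """Follow a transition, or remain in the current state."""
    return TRANSITIONS.get((state, condition), state)

# Nominal mission: approach, detect, observe, inspect, depart, return.
path = ["SSO"]
for cond in (603, 605, 608, 611, 619, 622, 625):
    path.append(next_state(path[-1], cond))
```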
  • the arrangements described above make it possible to ensure the safety of the short-range space rendezvous, by minimizing the risks of loss of tracking of the target and by reducing the duration of any loss of tracking.
  • the implementation of at least two active sensors operating in widely separated spectral bands makes it possible to reduce the risk of loss of tracking, by decorrelating the flickering of the target across the observed spectral bands and by increasing resilience to rapid transitions between day and night and to interference from other electromagnetic signals.
  • the implementation of a radar system with a wide field of view allows the target to be detected quickly, especially when it is at a short distance and has just performed an uncooperative maneuver.
  • the present invention is susceptible of various variant embodiments and various applications.
  • the invention is not limited to the sequence of steps described with reference to FIG. 7, nor to the list of sensors activated at each of these steps. It is simply important that the hunter goes through a phase of detecting the target at long range, then an acquisition phase once the target is detected, and a short-range detection phase when the hunter no longer detects the target while in its vicinity.
  • the invention is also not limited to periodic activation of radar and lidar systems.
  • the moment of activation of one of these systems can be defined randomly or determined according to the signals acquired previously, and in particular according to the precision of these signals.
  • the time interval between two acquisitions can be reduced if this precision is not satisfactory.
  • the long-range DTS detection phase can be omitted when the static trajectory parameters provided in the mission order are sufficiently precise to allow pointing of the radar configured in narrow field of view and/or of the lidar at the target 502.

EP20819807.7A 2019-11-04 2020-11-03 Verfahren zur verfolgung eines raumobjektes unter verwendung von radar- und lidar-systemen an bord Withdrawn EP4038412A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1912345A FR3102862B1 (fr) 2019-11-04 2019-11-04 Procede de poursuite d’un objet spatial a l’aide de systemes radar et lidar embarques
PCT/FR2020/051983 WO2021089938A1 (fr) 2019-11-04 2020-11-03 Procede de poursuite d'un objet spatial a l'aide de systemes radar et lidar embarques

Publications (1)

Publication Number Publication Date
EP4038412A1 true EP4038412A1 (de) 2022-08-10

Family

ID=72885590

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20819807.7A Withdrawn EP4038412A1 (de) 2019-11-04 2020-11-03 Verfahren zur verfolgung eines raumobjektes unter verwendung von radar- und lidar-systemen an bord

Country Status (6)

Country Link
US (1) US20220390605A1 (de)
EP (1) EP4038412A1 (de)
JP (1) JP2023501349A (de)
CA (1) CA3161114A1 (de)
FR (1) FR3102862B1 (de)
WO (1) WO2021089938A1 (de)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
UA150996U (uk) * 2022-01-04 2022-05-18 Товариство З Обмеженою Відповідальністю "Курс-Орбітал" Багатофункціональний модуль зближення і захвату
WO2023105353A1 (en) * 2022-11-30 2023-06-15 Karampour Rouhollah Star tracker device for astrophotography titled auto-guider
CN115877370B (zh) * 2023-03-08 2023-07-07 中国西安卫星测控中心 一种利用双雷达距离与方位角快速计算航天器轨道的方法
CN116659521B (zh) * 2023-06-01 2024-05-07 山东理工大学 空间碎片双弧测角值一体化定初轨及关联方法及装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9187189B2 (en) * 2012-10-12 2015-11-17 The Aerospace Corporation System, apparatus, and method for active debris removal

Also Published As

Publication number Publication date
FR3102862A1 (fr) 2021-05-07
JP2023501349A (ja) 2023-01-18
FR3102862B1 (fr) 2021-10-29
US20220390605A1 (en) 2022-12-08
WO2021089938A1 (fr) 2021-05-14
CA3161114A1 (fr) 2021-05-14


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220506

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20230425

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20231107