EP3969845A1 - 720° absolute inclinometer capable of operating in milli-gravity - Google Patents

720° absolute inclinometer capable of operating in milli-gravity

Info

Publication number
EP3969845A1
Authority
EP
European Patent Office
Prior art keywords
ball
sphere
image
attitude
housing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20737236.8A
Other languages
English (en)
French (fr)
Inventor
Emile REMETEAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Centre National d'Etudes Spatiales CNES
Original Assignee
Centre National d'Etudes Spatiales CNES
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Centre National d'Etudes Spatiales CNES
Publication of EP3969845A1
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 Measuring inclination, e.g. by clinometers, by levels
    • G01C9/10 Measuring inclination, e.g. by clinometers, by levels by using rolling bodies, e.g. spheres, cylinders, mercury droplets
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/10 Artificial satellites; Systems of such satellites; Interplanetary vehicles
    • B64G1/105 Space science
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/10 Artificial satellites; Systems of such satellites; Interplanetary vehicles
    • B64G1/105 Space science
    • B64G1/1064 Space science specifically adapted for interplanetary, solar or interstellar exploration
    • B64G1/1071 Planetary landers intended for the exploration of the surface of planets, moons or comets
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/22 Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/66 Arrangements or adaptations of apparatus or instruments, not otherwise provided for
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/24 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for cosmonautical navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 Measuring inclination, e.g. by clinometers, by levels
    • G01C9/02 Details
    • G01C9/06 Electric or photoelectric indication or reading means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 Measuring inclination, e.g. by clinometers, by levels
    • G01C9/02 Details
    • G01C9/06 Electric or photoelectric indication or reading means
    • G01C2009/068 Electric or photoelectric indication or reading means resistive
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 Measuring inclination, e.g. by clinometers, by levels
    • G01C9/10 Measuring inclination, e.g. by clinometers, by levels by using rolling bodies, e.g. spheres, cylinders, mercury droplets
    • G01C2009/107 Measuring inclination, e.g. by clinometers, by levels by using rolling bodies, e.g. spheres, cylinders, mercury droplets spheres

Definitions

  • TITLE: 720° absolute inclinometer capable of operating in milli-gravity
  • the present invention relates to the field of space exploration, and relates to an opto-mechanical sensor suitable for use in a low-gravity space environment, such as that present on the surface of an asteroid or a small moon of the solar system.
  • the present invention relates to a device for recognizing the attitude of an object, to an object comprising such a device, to a space probe comprising such a device, to a landing gear comprising such a device and to a method of attitude recognition of an object, implemented by such a device.
  • the use of automatic exploration probes makes it possible to carry out missions which are not currently possible for manned missions.
  • Space exploration probes tend more and more often to carry one or more small daughter probes released by a mother probe to the surface of the celestial object to be explored, when it arrives in the vicinity of the latter.
  • the daughter probes are automatic space vehicles, also equipped with scientific instruments, which land on the surface of the celestial bodies to be explored, unlike the mother probes, which explore, from a distance, the greatest possible surface of the celestial body.
  • This approach makes it possible to supplement the global scientific data measured remotely by the mother probe with local scientific data, measured in situ at the surface by the daughter probe.
  • the daughter probes intended to land on the surface of celestial bodies are qualified by the generic term "landers" or by the more specific English terms "lander", "hopper" or "rover" depending on their mobility capacities.
  • the probes designated by the English term "lander" (in French, "atterrisseur") are not provided with means of locomotion. They carry out scientific measurements at the place of their landing.
  • the probes designated by the English term "hopper", which can be translated as "jumper", are equipped with a device allowing them to jump so that they can visit sites other than the initial landing site.
  • the probes designated by the term, in English, "rover”, which can be translated as “astromobile” in French, are generally equipped with wheels in order to be able to roll on the surface of the object to be explored in order to reach target sites defined by the ground teams according to their potential scientific interest and the risks involved in reaching them.
  • the lander is dropped at an altitude varying between several tens of meters and several kilometers above the surface of the celestial body to be explored.
  • the lander falls towards the surface of the body to be explored, by gravity.
  • the duration of the trip to the surface is between a few seconds and several hours.
  • Due to its impact speed and the low gravity at the surface of small bodies in the solar system, after coming into contact with the body the lander typically bounces several times on the surface before stabilizing in an unknown location, with an unknown attitude. The distance travelled during the rebounds can be significant.
  • gyroscopes have several drawbacks.
  • the cost of a spatialized gyroscope, that is to say one which can be used in a space environment (a hostile environment with strong thermal contrasts, where vacuum generally reigns, where gravity is lower than terrestrial gravity and where the level of cosmic radiation is significantly higher than that observed on the surface of the Earth), is particularly high.
  • a gyroscope uses the initial conditions of the probe (attitude of the probe relative to the landing site) to deduce therefrom an angular orientation of the lander.
  • the deduction of the angular orientation of the probe with respect to the ground is made by integrating over time the angle values measured between the first attitude, just before release by the mother probe, and the final attitude when the lander is on the ground, that is to say after having bounced several times before stabilizing. However, a restart of the on-board computer following an unexpected change of state in the electronics due to an ionizing event ("SEU" for "Single Event Upset"), or a crash of the gyroscope electronics following an SEU during the descent phase or during the lander's rebounds, could result in a loss of information.
  • SEU ionizing event
  • SEU Single Event Upset
  • the shocks and collisions of the undercarriage with the geological elements that may be present at the surface during the rebounds constitute non-linearities which are liable to distort the measurements.
  • the relatively high electrical consumption of gyroscopes can be very penalizing for missions which are often very constrained from an energy point of view.
  • the solution using photodiodes consists in analyzing the signal at the output of photodiodes placed at several locations of the lander in order to deduce therefrom the attitude of the latter.
  • the regolith which generally makes up the surface of the body to be explored can cover the diodes during bounces, which can also distort the measurements.
  • the present invention aims to overcome the above drawbacks, and to do so proposes a device for recognizing the attitude of an object, remarkable in that it comprises: a sphere comprising a wall capable of at least partially allowing a luminous flux to pass; a ball, movable inside said sphere;
  • an image acquisition device arranged in said attitude recognition device to acquire a two-dimensional image of the ball and of the sphere;
  • a device for emitting a luminous flux arranged in said attitude recognition device to emit a luminous flux in the direction of said sphere; an image processing device, connected to the image acquisition device, driven by an image processing algorithm and designed to drive the device for emitting a luminous flux, to recover said image acquired by said image acquisition device and to determine from said image a gravity vector having as its origin the center of the sphere and as its end the center of the ball.
  • the present invention makes it possible to constitute a solution which is precise, robust and completely independent from the solutions of the prior art and which is more economical than a solution resulting from the prior art offering similar performance.
  • the measurement of the gravity vector is obtained over 4π steradians, i.e. over 720°, which allows the attitude recognition device according to the invention to be fully operational regardless of the attitude of the lander inside which it is intended to be mounted.
  • the device therefore performs an absolute measurement, independent of any initial state, robust to any transient failure due to radiation during the descent phase towards the surface, independent of any disturbance brought about by more or less violent shocks during the rebounds, and able to operate even in very low gravity provided the ball is allowed time to stabilize in the sphere.
  • This measurement can, moreover, be repeated at any time during the mission to the surface when the lander is stationary and the need to know the attitude of the lander arises.
  • This measurement at any time can therefore be performed without having to integrate the movements of the lander during its displacements ("hopper" and "rover" cases) and therefore without consuming electrical energy during said displacements, the device being able to be kept de-energized outside the measurement phases.
  • the image processing device embedded in the attitude recognition device, being able to control the device for emitting a light flux and the image acquisition device, and being able to process the acquired images in order to deduce therefrom the direction and sense of the gravity vector, makes it possible to identify the attitude of the lander in complete autonomy, without requiring any human intervention and without requiring the transmission of data to an external computer. This is particularly advantageous for space missions in the case where no communication is possible with another device, and for simplifying the flight software of the lander and reducing its development cost.
  • the image processing device may also be called a “control and processing device”.
  • the image processing device integrates the image processing algorithm.
  • the attitude recognition device of an object is configured for use in a spatial environment.
  • the attitude recognition device of an object is configured for use in space.
  • the term "configured for use in a space environment" is understood to mean specifically designed for use in a space environment and having undergone qualification tests making it possible to validate the behavior and proper functioning of the attitude recognition device in the space environment.
  • the attitude recognition device of an object is configured to be used in a space environment when it has been demonstrated that it is compatible in particular with use in environments with high levels of radiation, very low pressure (space vacuum), numerous and significant thermal variations (for example between -50 °C and +70 °C), mechanical vibrations and significant shocks (due in particular to the launching and landing phases).
  • the device for recognizing the attitude of an object comprises a housing inside which are mounted:
  • o the image acquisition device, arranged in a first zone of the housing, o the device for emitting a luminous flux, arranged in the first zone of the housing, o the sphere, inside which the ball is mounted, arranged in a second zone of the housing.
  • the device for emitting a light flux being arranged in the same area of the housing as the image acquisition device, it is possible to illuminate the sphere without dazzling the image acquisition device.
  • the image processing device is mounted inside the housing or is directly integrated into the image acquisition device.
  • the image processing device is on-board and completely autonomous with respect to other devices.
  • the device for emitting a light flux comprises a first set of light-emitting diodes.
  • the robustness, mass and energy efficiency of light-emitting diodes are suitable for use in space.
  • the first set of light emitting diodes is mounted between the image acquisition device and the sphere.
  • the first set of light emitting diodes is contained in at least one plane normal to an image acquisition direction.
  • the image acquisition direction is parallel to a straight line passing through the axis of the image acquisition device and the center of the sphere.
  • the direction of image acquisition corresponds to a longitudinal plane of the housing.
  • the at least one plane in which the first set of light-emitting diodes is contained is arranged between the image acquisition device and the sphere.
  • the attitude recognition device comprises a diaphragm, which extends from an outer face of the wall of the sphere, said diaphragm lying in a plane of the sphere substantially orthogonal to a longitudinal axis of the housing, said diaphragm defining, on the one hand, a first compartment of the housing, inside which are mounted the image acquisition device and the first set of light-emitting diodes, and, on the other hand, a second compartment, the device for emitting a luminous flux further comprising a second set of light-emitting diodes, mounted in the second compartment of said housing. It is understood that the diaphragm extends from an outer face of the wall of the sphere, without necessarily touching it.
  • the role of the diaphragm is to prevent the second set of light emitting diodes from directly illuminating and dazzling the image acquisition device. It is understood that the first compartment comprises the first zone of the housing. It is understood that the second zone of the housing comprises the second compartment. It is understood that the first compartment comprises a first portion of the sphere. It is understood that the second compartment comprises a second portion of the sphere.
  • the first portion of the sphere is smaller than the second portion of the sphere.
  • the sphere and the ball can be illuminated in different ways.
  • the sphere and the ball can be illuminated by the first set of light-emitting diodes and / or by the second set of light-emitting diodes.
  • the illuminated surface of the ball can have different aspects depending on the lighting. This makes it possible to more accurately calculate the position of the ball in the sphere.
  • the second set of light emitting diodes is similar to the first set of light emitting diodes and is disposed in at least one plane normal to the image acquisition direction.
  • the first and second sets of light emitting diodes can be driven independently of each other by the image processing device.
  • the second set of light-emitting diodes is mounted on or near the diaphragm. This enables the diaphragm to provide good concealment of the light flux produced by the second set of light-emitting diodes.
  • the ball has a different color from that of the internal walls of the housing. This makes it easier to distinguish the ball in the image acquired by the image acquisition device.
  • the ball is white in color and the internal walls of the housing are black in color. This combination of colors makes it possible to bring out the ball in the image acquired by the image acquisition device.
  • the image acquisition device, the device for emitting a light flux and the image processing device are designed to be used in a spatial environment.
  • This disclosure also relates to an object comprising an attitude recognition device according to any one of the aforementioned characteristics.
  • the present disclosure also relates to a space probe comprising an attitude recognition device according to any one of the aforementioned characteristics.
  • the invention also relates to a lander, consisting of a space vehicle designed to land, or to land and move, on the surface of a celestial body to be explored, remarkable in that it comprises an attitude recognition device according to any one of the aforementioned characteristics.
  • the invention further relates to a method for recognizing the attitude of an object implemented by an attitude recognition device according to any one of the aforementioned characteristics, remarkable in that it comprises the following steps aimed at:
  • o calculate two sets of three-dimensional coordinates (XB1, YB1, ZB1) and (XB2, YB2, ZB2) in the direct orthonormal coordinate system (O, X, Y, Z), which may correspond to the position of the ball in the sphere for the position (Xb, Yb) of the center of the ball in the image; o determine, from the appearance of an illuminated surface of the ball, which of the two sets of three-dimensional coordinates (XB1, YB1, ZB1) and (XB2, YB2, ZB2) corresponds to the position of the ball in the sphere.
  • the step consisting in determining, from the aspect of an illuminated surface of the ball, which of the two sets of three-dimensional coordinates (XB1, YB1, ZB1) and (XB2, YB2, ZB2) corresponds to the position of the ball in the sphere consists of the following steps:
  • the step aiming to determine which of the two sets of coordinates (XB1, YB1, ZB1) or (XB2, YB2, ZB2) is that which corresponds to the position of the ball in the sphere comprises a step of acquiring an image of the sphere and of the ball by the image acquisition device, in which at least the second set of light-emitting diodes is turned on.
  • the step aimed at determining which of the two sets of coordinates (XB1, YB1, ZB1) or (XB2, YB2, ZB2) corresponds to the position of the ball in the sphere can include two acquisitions of images of the sphere and of the ball under two different lighting conditions.
  • the first image acquisition is made with only the first set of light-emitting diodes on and the second acquisition is made with at least the second set of light-emitting diodes on.
  • FIG. 1 illustrates the landing gear according to the invention.
  • FIG. 2 illustrates a first embodiment of the attitude recognition device of the invention.
  • FIG. 3 represents the landing gear in a first attitude.
  • FIG. 4 shows the landing gear attitude recognition device according to the attitude given in FIG. 3.
  • FIG. 5 represents the image acquired when the landing gear is in the attitude of FIG. 3.
  • FIG. 6 shows the landing gear in a second attitude.
  • FIG. 7 shows the landing gear attitude recognition device according to the attitude given in FIG. 6.
  • FIG. 8 represents the image acquired when the landing gear is in the attitude of FIG. 6.
  • FIG. 9 shows the landing gear in a third attitude.
  • FIG. 10 shows the landing gear attitude recognition device according to the attitude given in FIG. 9.
  • FIG. 11 represents the image acquired when the landing gear is in the attitude of FIG. 9.
  • FIG. 12 represents a model of the attitude recognition device.
  • FIG. 13 shows in transverse view the model of FIG. 12 with two possible positions of the ball for a given position in the image.
  • FIG. 14 is a view similar to that of FIG. 13, the ball being positioned in the sphere at a location distinct from that of FIG. 12.
  • FIG. 15 illustrates a second embodiment of the attitude recognition device of the invention.
  • FIG. 16 gives an example of use of the attitude recognition device according to the second embodiment.
  • FIG. 17 gives another example of use of the attitude recognition device according to the second embodiment.
  • the terms "upstream" and "downstream" should be understood in relation to the attitude recognition device, the upstream being located on the left with reference to FIG. 2 and the downstream being located on the right with reference to FIG. 2.
  • the lander (“lander” or “rover” or “hopper” in English) is a space exploration daughter probe released from a space exploration mother probe (not shown) in order to land on a celestial body to explore.
  • the mother probe is most often placed in orbit around the celestial body to be explored and serves in particular as a communication relay between the lander and the Earth.
  • Reference is made to FIG. 1, which illustrates a lander 1 according to the invention in contact with a ground 3 of a celestial body to be explored.
  • the lander 1 is a daughter probe, for example a space vehicle designed to move by rolling on the ground 3 of the celestial body to be explored.
  • such a lander is designated by the term, in English, "rover".
  • the lander 1 also designates the probes designated by the term, in English, "landers”, which are not provided with displacement capabilities, as well as the probes designated by the term, in English, "hoppers”, which are equipped with a device allowing them to perform jumps in order to be able to visit sites other than the initial landing site.
  • the lander is often designed to have a certain degree of autonomy in carrying out its activities and in making decisions.
  • the movement of the lander 1 on the ground 3 is ensured by means of a set of wheels 5, and an on-board navigation algorithm allows it to move autonomously and in complete safety towards the target designated by the teams on Earth.
  • the lander 1 comprises an attitude recognition device 7 of the lander 1, the purpose of which is to enable the control system of the lander 1 to determine its attitude with respect to the ground 3.
  • the attitude recognition device 7 is preferably mounted inside a casing 9 of the undercarriage 1.
  • Reference is made to FIG. 2, illustrating the attitude recognition device 7.
  • the attitude recognition device 7 comprises a hollow sphere 11.
  • the sphere 11 comprises a wall 13 capable of at least partially allowing a luminous flux to pass into its internal volume 15.
  • the wall 13 is therefore not opaque and is preferably transparent.
  • the wall 13 of the sphere has preferably received an anti-reflection treatment.
  • the sphere has a radius of between 2 and 6 cm, preferably around 4 cm.
  • the sphere is fixed relative to the attitude recognition device 7.
  • the sphere 11 comprises in its internal volume a ball 17, movable inside the sphere 11.
  • the movement of the ball 17 in the sphere 11 results from the acceleration undergone by the recognition device 7 when the landing gear 1 is subjected to a force, in particular its weight under the effect of gravity.
  • the ball 17 is relatively light, for example 10 grams.
  • the ball 17 is preferably made of a non-electrostatic material so that the electrostatic forces do not hinder its movements inside the sphere 11.
  • the attitude recognition device 7 also comprises an image acquisition device 19, fixed relative to the attitude recognition device 7.
  • the image acquisition device 19 is spatialized.
  • the term “spatialized” is understood to mean the characteristic according to which the device is designed to be used in a spatial environment. To this end, when it is spatialized, the image acquisition device 19 is preferably specifically designed for use in a space environment and undergoes qualification tests making it possible to validate its resistance to the space environment.
  • the image acquisition device 19 is thus qualified as "spatialized" when it has been demonstrated that it is compatible in particular with use in an environment exhibiting high levels of radiation, very low pressure (space vacuum), numerous and significant thermal variations (for example between -50 °C and +70 °C), mechanical vibrations and significant shocks.
  • the image acquisition device 19 can be obtained by a digital spatialized camera, or by an analog spatialized camera associated with an image digitizer (“frame grabber” in English) also spatialized.
  • the camera is associated with a spatialized (optical) lens, the characteristics of which (focal length, aperture, optical focus, etc.) are suited to the need for acquiring clear images of the entire sphere and the ball.
  • the image acquisition device 19 is arranged in the attitude recognition device 7 to acquire a two-dimensional image 29 of the ball 17 and of the sphere 11, as will be seen in the remainder of the description illustrating the lander 1 on the ground 3 of the body to explore in different attitudes.
  • the distance between the image acquisition device 19 and the sphere 11 is preferably of the order of a few centimeters, so as to obtain a clear image of the ball 17 while optimizing the compactness of the attitude recognition device.
  • the image acquisition device 19 may be in the form of a parallelepiped of 2 cm x 2 cm x 4 cm.
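The working distance of a few centimeters and the roughly 4 cm sphere radius quoted above can be sanity-checked with a little geometry. The sketch below is purely illustrative: only the sphere radius comes from the description, and the 10 cm distance between the projection center and the center of the sphere is an assumed value.

```python
import math

# Illustrative sizing check (the 0.10 m working distance is an assumption;
# only the ~4 cm sphere radius is taken from the description above).
R_S = 0.04   # sphere radius [m]
Z_L = 0.10   # assumed distance between projection center OL and sphere center OS [m]

# The whole sphere must fit in the field of view, so the half field of view
# must exceed the half-angle subtended by the sphere as seen from OL.
half_fov_needed_deg = math.degrees(math.asin(R_S / Z_L))
print(f"required half field of view: {half_fov_needed_deg:.1f} deg "
      f"(full field of view about {2 * half_fov_needed_deg:.1f} deg)")
```

With these assumed values the lens would need a full field of view of roughly 47°, which is compatible with a compact wide-angle objective.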
  • a phase of calibration of the image acquisition device 19 is carried out prior to its operational use.
  • the calibration makes it possible to define the intrinsic and extrinsic parameters of the image acquisition device 19 with a view to correcting the geometric distortions and alignment errors of the images produced.
  • This phase can possibly be omitted if the quality of the design and the realization of the image acquisition device allows it (very low geometric distortion, excellent tolerances and opto-mechanical alignments, etc.).
  • the attitude recognition device 7 further comprises a device for emitting a light flux 21.
  • the device for emitting a luminous flux 21 is arranged in the attitude recognition device 7 so as to emit a luminous flux in the direction of the sphere 11.
  • the luminous power of the device for emitting a luminous flux 21 is calculated and chosen so that it allows the image acquisition device 19 to acquire images of the sphere 11 and of the ball 17 which can be used by an image processing device described in the remainder of the description.
  • An image is said to be "usable" when the brightness of the image of the sphere 11 and of the ball 17 is such that the sphere 11 and the ball 17 are visible in the image acquired by the image acquisition device 19 and the image is neither underexposed nor overexposed.
  • examples of embodiments of the device for emitting a light flux 21 are given below, without limitation, as well as examples of the arrangement of the device for emitting a light flux 21 in the attitude recognition device 7.
  • the image acquisition device 19 is connected to an image processing device 23, which may also be called a “control and processing device”.
  • the image processing device 23 is driven by an image processing algorithm and is designed to drive the device for emitting a light flux 21, to drive the image acquisition device 19, to recover an image acquired by the image acquisition device 19, to rectify said image and to determine from the rectified image the gravity vector G having as its origin the center OS of the sphere 11 and as its end the center OB of the ball 17.
  • the assembly formed by the image acquisition device 19, the sphere 11, inside which the ball 17 is mounted, the device for emitting a luminous flux 21 and the image processing device can be mounted inside a housing 25 of the attitude recognition device 7.
  • the image acquisition device 19 and the device for emitting a luminous flux 21 are mounted in the housing 25 upstream with respect to the sphere 11.
  • the image acquisition device 19 and the device for emitting a luminous flux 21 are mounted in a first zone of the housing and the sphere 11 is mounted in a second zone of the housing.
  • the image processing device 23 can be mounted indifferently in one or the other of the aforementioned areas of the housing 25.
  • the device for emitting a light flux 21 preferably comprises a first set of light-emitting diodes 27, called “upstream”, because of their locations in the housing 25.
  • the first set of upstream light-emitting diodes 27 is however preferably mounted between the image acquisition device 19 and the sphere 11.
  • the upstream light-emitting diodes 27 can for example be distributed in the housing so as to form a circle contained in a plane (XY) with reference to the direct trihedron (X, Y, Z) shown in Figure 2.
  • Other light sources can also be considered instead of light emitting diodes (for example incandescent bulbs, fluorescent sources ...) but their robustness, mass and energy efficiency are generally less well suited to a space mission.
  • the housing 25 adopts for example a parallelepipedal shape.
  • the ball 17 preferably has a different color from that of the internal walls of the housing 25.
  • the ball 17 has a relatively light color, such as white, preferably matt.
  • the internal walls of the housing 25 preferably have a dark color such as black, preferably matt, in order to make the images acquired of the sphere 11 and of the ball 17 more usable.
  • the ball and the walls can be painted if necessary to obtain the required colors.
  • the image processing device 23 is preferably mounted inside the housing 25 which makes it possible to make the attitude recognition device 7 autonomous.
  • the image processing device 23 can be integrated into the image acquisition device 19.
  • the image processing device 23 can be physically deported from the attitude recognition device 7 in which case, it is, for example, integrated into the onboard computer of the undercarriage 1.
  • the image processing device 23 is made up of either a microcontroller, a microprocessor, a field-programmable gate array (designated by the acronym "FPGA" for "Field-Programmable Gate Array") or an application-specific integrated circuit (designated by the acronym "ASIC" for "Application-Specific Integrated Circuit"), together with the associated electronic components and, where applicable, the software necessary for its proper functioning, for controlling the device for emitting a luminous flux 21, for controlling the image acquisition device 19 and for communications with the on-board computer of the lander 1.
  • FPGA Field-Programmable Gate Array
  • ASIC Application-Specific Integrated Circuit
  • the image processing device 23 is spatialized, in the same way as the image acquisition device 19 and the device for emitting a luminous flux 21.
  • the undercarriage 1 in its “rover” version has stabilized on the ground 3 upright, that is to say that the wheels 5 of the landing gear 1 rest on the ground 3 which is flat here (the gravity vector is perpendicular to the ground and directed towards the ground).
  • the ball 17 ends up stabilizing in contact with the internal face of the wall 13 of the sphere 11, under the effect of gravity, as shown in FIG. 4, which shows the attitude recognition device 7 in a simplified manner, the device for emitting a light flux 21 and the image processing device 23 not being shown.
  • FIG. 5 illustrates the image 29 acquired by the image acquisition device 19 when the landing gear 1 is in the attitude shown in FIG. 3.
  • the ball 17 is located at the bottom of the image 29, which reflects the fact that the lander 1 has its back facing upwards, that the ball is on the Y axis, which reflects the fact that the lander is not inclined to its right or to its left, and finally that the ball is in contact with the circle representing the image of the sphere 11, which reflects the fact that the lander 1 is not inclined forward or backward either. From the image it can be deduced that the lander 1 is therefore correctly placed on its wheels and that it is level.
  • FIGS. 6 to 8 give another example of the attitude of the landing gear 1, still in its “rover” version.
  • the attitude of the lander 1 with respect to the ground is such that the lander 1 has landed on its back, that is to say that it rests on its upper face 30, the wheels 5 not being in contact with the ground.
  • the upper face 30 of the lander 1 is not perpendicular to the gravity vector, either because the ground is locally sloping or because the lander is not lying flat on the ground, for example because of the presence of stones.
  • FIG. 7 illustrates (in a simplified view, only the sphere 11, the ball 17 and the image acquisition device 19 being represented) the attitude recognition device 7 when the landing gear 1 is in the attitude shown in figure 6.
  • FIG. 8 illustrates the image 29 acquired by the image acquisition device 19 when the lander 1 is in the attitude shown in FIG. 6. It is noted that the ball 17 is located at the top of the image 29, which reflects the fact that the lander 1 is lying on its back, that the ball is on the Y axis, which means that the lander is not tilted to its right or to its left, and finally that the ball is not in contact with the circle representing the sphere 11, which reflects the fact that the lander 1 is moreover inclined forward or backward.
  • FIGS. 9 to 11 give another example of the attitude of the landing gear 1, still in its “rover” version.
  • the attitude of the landing gear 1 with respect to the ground is such that the landing gear 1 has landed on its left side, that is to say that it is resting on its left side face.
  • FIG. 10 illustrates (in a simplified view, only the sphere 11, the ball 17 and the image acquisition device 19 being represented) the attitude recognition device 7 when the landing gear 1 is in the attitude shown in figure 9.
  • FIG. 11 illustrates the image 29 acquired by the image acquisition device 19 when the landing gear 1 is in the attitude shown in FIG. 9.
  • the ball 17 is located to the left of the image 29, which reflects the fact that the lander 1 is tilted onto its left side, that the ball is on the X axis, which reflects the fact that the lander is not tilted forward or backward either, and finally that the ball is in contact with the circle representing the sphere 11, which reflects the fact that the left side of the lander is perpendicular to the gravity vector.
  • the identification of the direction of the gravity vector can be obtained over 4π steradians, that is to say over 720°, which therefore allows the attitude recognition device 7 of the invention to determine the direction of the gravity vector regardless of the attitude of the lander 1 with respect to the ground 3.
  • Reference is made to FIGS. 12 and 13 in order to explain the steps of the method for recognizing the attitude of the lander 1 implemented by the attitude recognition device 7 according to the invention.
  • FIG. 12 illustrates a modeling of the attitude recognition system 7.
  • the latter can be modeled by a projection center OL and an image plane 31 located at the focal length f downstream from the projection center OL and upstream from the sphere 11.
  • the ray of light 33 connecting the center of projection OL to the center of the sphere Os is the main axis of the system. It crosses the image plane 31 at the main point Oi.
  • the ray of light 35 connecting the center of projection OL to the center OB of the ball 17 intersects the image plane 31 at a point PB.
  • the point PB is therefore the point of the image corresponding to the center of the ball 17.
  • All of the steps below aim to determine the direction of the gravity (or acceleration) vector from the image of the ball 17 acquired by the image acquisition device 19.
  • the first step of the method for recognizing the attitude of the lander 1 aims to define a three-dimensional direct orthonormal frame (O, X, Y, Z) having for origin O the center OS of the sphere 11 and whose Z-axis direction vector points towards the center of projection OL.
  • the coordinates (XL, YL, ZL) of the center of projection OL are (0, 0, ZL), ZL being the distance between OL, the center of projection, and OS, the center of the sphere 11.
  • the image plane 31 is parallel to the plane (O, X, Y) and its Z coordinate is equal to (ZL − f), f being the focal length of the image acquisition device 19.
  • the coordinates of the center OB of the ball 17 are (XB, YB, ZB) and vary according to the position of the ball in the sphere.
  • an image 29 (shown in FIGS. 5, 8 and 11) is acquired, thanks to the image acquisition device 19, on which the sphere 11 and the ball 17 are visible.
  • the image acquisition device 19 then communicates to the image processing device 23 the acquired two-dimensional image.
  • the image processing device 23 performs the rectification of the acquired image. This step aims to use the results of the calibration of the image acquisition device 19 to correct the geometric distortions of the image.
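As an illustration of this rectification step, the following sketch undistorts an acquired image with OpenCV. The intrinsic matrix and distortion coefficients are placeholder values standing in for the results of the calibration phase, and the file name is hypothetical; this is not the patented implementation itself.

```python
import cv2
import numpy as np

# Placeholder calibration results (normally produced by the calibration phase
# of the image acquisition device 19; values here are illustrative only).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])            # focal lengths and principal point (u0, v0), in pixels
dist = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])    # radial and tangential distortion coefficients

raw = cv2.imread("sphere_and_ball.png", cv2.IMREAD_GRAYSCALE)  # acquired image 29 (hypothetical file)
rectified = cv2.undistort(raw, K, dist)          # geometric distortions corrected
```

As noted above, this step can be omitted when the optics are good enough that residual distortion is negligible.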
  • the ball 17 is identified in the image 29 acquired and rectified.
  • the image processing algorithm of the image processing device 23 can advantageously be designed to search for the brightest group of connected pixels in the image, corresponding to the ball 17.
  • the image processing algorithm of the image processing device 23 identifies the image coordinates (ub, vb), in pixels and fractions of pixels, of the center of the ball.
  • the image processing algorithm of the image processing device 23 can advantageously be designed to calculate the coordinates of the barycenter of the pixels identified as belonging to the ball 17 in the fourth step.
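A minimal sketch of these two steps (finding the brightest connected group of pixels and taking its barycenter) is given below. The threshold value and the tie-breaking criterion between connected components are assumptions made for illustration, not values specified in the patent.

```python
import numpy as np
from scipy import ndimage

def find_ball_center(rectified, threshold=200):
    """Locate the ball 17 in the rectified image and return the image
    coordinates (ub, vb) of its barycenter, plus its apparent area in pixels."""
    mask = rectified >= threshold                     # bright pixels: white ball on dark walls
    labels, n = ndimage.label(mask)                   # connected groups of bright pixels
    if n == 0:
        raise RuntimeError("ball not found in image")
    # keep the component with the highest mean brightness (assumed criterion)
    means = ndimage.mean(rectified, labels, index=range(1, n + 1))
    ball_label = 1 + int(np.argmax(means))
    vb, ub = ndimage.center_of_mass(mask, labels, ball_label)   # (row, column) barycenter
    area = int(np.sum(labels == ball_label))          # apparent surface, reused in the tenth step
    return ub, vb, area
```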
  • In a sixth step of the method of the invention, a two-dimensional direct orthonormal frame (Oi, X, Y), derived from the three-dimensional direct orthonormal frame (O, X, Y, Z), is associated with the acquired image 29.
  • the frame of reference (Oi, X, Y) is defined so as to form a direct orthonormal frame as seen by the image acquisition device 19, the center Oi of the frame (Oi, X, Y) corresponding to the center O of the frame (O, X, Y, Z), and therefore to the center OS of the sphere 11.
  • the points OL, Oi and O are therefore all three located on the main axis defined by the ray of light 33.
  • In a seventh step of the method of the invention, the coordinates (Xb, Yb) of the center OB of the ball 17 in the two-dimensional direct orthonormal frame (Oi, X, Y) are identified in the acquired and rectified image 29, from the image coordinates (ub, vb), the physical size of the camera pixels and the image coordinates (u0, v0) of the point Oi.
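This conversion amounts to re-centering the pixel coordinates on the principal point and scaling by the pixel pitch. The sketch below assumes square pixels and a sign convention in which the Y axis of the frame (Oi, X, Y) points opposite to the pixel row direction; both are editorial assumptions.

```python
def image_to_plane_coords(ub, vb, u0, v0, pixel_pitch):
    """Convert the pixel coordinates (ub, vb) of the ball center into metric
    coordinates (Xb, Yb) in the frame (Oi, X, Y) of the image plane 31.

    pixel_pitch: physical size of one (assumed square) pixel, in meters.
    The sign of Yb is flipped because pixel rows grow downwards while
    (Oi, X, Y) is taken as a direct orthonormal frame (assumed convention).
    """
    Xb = (ub - u0) * pixel_pitch
    Yb = -(vb - v0) * pixel_pitch
    return Xb, Yb
```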
  • FIG. 13 illustrates, on a longitudinal section of the model of the attitude recognition device 7, the fact that for a given position of the ball 17 in the image 29, in general two positions of the ball 17 in sphere 11 are possible. It is noted, however, that there is a set of positions of the ball 17 in the image 29 for which only one position of the ball 17 in the sphere 11 is possible.
  • the coordinates (XB1, YB1, ZB1) and (XB2, YB2, ZB2) of the two possible positions of the ball are calculated in the three-dimensional frame (O, X, Y, Z), corresponding to the coordinates (Xb, Yb) of the center OB of the ball 17 in the two-dimensional direct orthonormal coordinate system (Oi, X, Y).
  • with R = RS − RB, the difference between the radius of the sphere 11 and the radius of the ball 17.
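Using the definitions above (OL at (0, 0, ZL), image plane at Z = ZL − f, ball center constrained to a sphere of radius R = RS − RB centered on O), this eighth step reduces to a ray-sphere intersection. The sketch below is one possible implementation of that geometry under those assumptions, not the patented algorithm verbatim.

```python
import numpy as np

def candidate_ball_positions(Xb, Yb, Z_L, f, R_S, R_B):
    """Return the (at most two) possible positions (XB, YB, ZB) of the ball center OB
    in the frame (O, X, Y, Z) for a ball center seen at (Xb, Yb) in the image plane 31."""
    R = R_S - R_B                                   # radius of the locus of possible ball centers
    o_l = np.array([0.0, 0.0, Z_L])                 # projection center OL
    p_b = np.array([Xb, Yb, Z_L - f])               # point PB on the image plane 31
    d = p_b - o_l                                   # direction of the light ray 35
    a = d @ d
    b = 2.0 * (o_l @ d)
    c = o_l @ o_l - R * R
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return []                                   # ray 35 misses the locus of possible positions
    t1 = (-b - np.sqrt(disc)) / (2.0 * a)
    t2 = (-b + np.sqrt(disc)) / (2.0 * a)
    return [o_l + t * d for t in sorted((t1, t2))]  # candidate nearer OL first, farther one second
```

When the discriminant is zero the two candidates coincide, which corresponds to the set of image positions for which only one position of the ball in the sphere is possible.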
  • the method of the invention comprises a ninth and a tenth step according to which it is determined which of the two sets of three-dimensional coordinates (XB1, YB1, ZB1) and (XB2, YB2, ZB2) is that which corresponds to the real position of the ball 17 in the sphere 11.
  • the processing algorithm calculates the value of the apparent surface of the ball 17 in the image 29 acquired.
  • the processing algorithm compares the value of the apparent surface of the ball 17 with a value or a set of calibration values previously obtained during a calibration phase of the attitude recognition device 7.
  • the two possible three-dimensional positions of the ball 17 being on the light ray 35 passing through the projection center OL and the point PB, intersection between the light ray 35 and the image plane 31, one obtains a position OB1 for which the ball 17 is closer to the center of projection OL and a position OB2 for which the ball 17 is further from OL. Due to the perspective projection, when the ball 17 is closer to the center of projection OL, the apparent area of the ball 17 in the image 29 is greater than the apparent area obtained when the ball 17 is further from the center of projection OL.
  • the step according to which the value of the apparent surface of the ball 17 is compared with a value or with a set of calibration values obtained previously during the calibration phase of the attitude recognition device 7 makes it possible to determine whether the ball 17 is in the position closer to or further from the center of projection OL and thus to determine which of the two sets of three-dimensional coordinates calculated in the eighth step is the correct one (tenth step of the method according to the invention).
  • the algorithm deduces the gravity vector G (or acceleration vector), the origin of which is given by the center OS of the sphere 11 and the end of which is given by the coordinates (XB, YB, ZB) of the center OB of the ball 17 in the direct orthonormal frame (O, X, Y, Z), determined at the end of the tenth step of the method, or at the end of the eighth step of the method in the particular case where only one three-dimensional position of the ball 17 in the sphere 11 corresponds to the two-dimensional position of the ball 17 in the image 29.
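The ninth and tenth steps and the final deduction of G can be sketched as follows. Comparing the measured apparent surface against two reference values (one for the near case, one for the far case) is an assumed reading of the calibration values mentioned above, and normalizing G is an editorial choice, since only its direction and sense matter here.

```python
import numpy as np

def gravity_direction(candidates, apparent_area, area_near_ref, area_far_ref):
    """Choose the real position of the ball center OB and return the unit gravity
    (or acceleration) direction G, running from OS (the origin) towards OB.

    candidates      : output of candidate_ball_positions(), nearer-OL candidate first
    apparent_area   : apparent surface of the ball 17 measured in the image 29
    area_near_ref,
    area_far_ref    : calibration values of the apparent surface for the near and far
                      positions (obtained during the calibration phase; assumed available)
    """
    if not candidates:
        raise ValueError("no geometrically possible ball position")
    if len(candidates) == 1:
        chosen = candidates[0]                      # degenerate case: a single possible position
    else:
        near, far = candidates
        # perspective projection: the ball looks larger when it is closer to OL
        closer_to_near = abs(apparent_area - area_near_ref) <= abs(apparent_area - area_far_ref)
        chosen = near if closer_to_near else far
    g = np.asarray(chosen, dtype=float)             # coordinates (XB, YB, ZB) of OB
    return g / np.linalg.norm(g)
```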
  • FIG. 14 illustrates the fact that for certain attitudes of the attitude recognition device 7 the two positions of the ball 17 in the sphere 11 corresponding to the position of the ball 17 in the image 29 can be very close to each other, thus potentially making it difficult to choose between the two sets of three-dimensional coordinates possible by analyzing the apparent surface of the ball in the image. For most missions, we can advantageously treat this case by calculating the barycenter of the two positions. However, for the missions for which it is desired to improve the precision of the measurements when a situation such as that illustrated in FIG. 14 occurs, the attitude recognition device 7 can be obtained according to a second embodiment illustrated in FIG. 15 to which reference is now made, showing the attitude recognition device 7 in longitudinal section.
  • the attitude recognition device 7 further comprises a diaphragm 39, fixed relative to the attitude recognition device 7 and the device for emitting a light flux 21 further comprises a second assembly of light-emitting diodes 47.
  • the diaphragm 39 extends from an interior face of the housing 25, along a plane (XY) substantially orthogonal to a longitudinal axis 41 of the housing, corresponding to the direction of image acquisition.
  • the diaphragm may or may not be in contact with the outer surface of the sphere.
  • the diaphragm 39 thus defines a first compartment, called upstream compartment 43, of the housing 25 and a second compartment, called downstream compartment 45, of the housing 25.
  • the diaphragm is preferably mounted at a distance ZD from the center OS of the sphere such that:
  • ZL being the distance between OS, the center of the sphere 11, and OL, the center of projection of the image acquisition device 19, and
  • R = RS − RB being the difference between the radius of the sphere 11 and the radius of the ball 17.
  • the device for emitting a luminous flux 21 comprises a second set of light-emitting diodes, called the set of downstream light-emitting diodes 47, which is mounted in the second, downstream compartment 45 of the housing 25.
  • the set of downstream light-emitting diodes 47 is mounted close to the diaphragm 39 or on the diaphragm 39.
  • the second set of downstream light-emitting diodes 47 is arranged in the attitude recognition device 7 for emitting a luminous flux in the direction of the sphere 11.
  • the method implemented by the attitude recognition device 7 obtained according to the second embodiment of the invention provides for an additional step comprising the extinction of the first set of upstream light-emitting diodes 27 and the ignition of the second set of downstream light-emitting diodes 47, mounted in the second, downstream compartment of the housing 25, then a step of acquiring an image of the sphere 11 and of the ball 17, the lighting this time being provided at least by the second set of downstream light-emitting diodes 47, here only by the second set of downstream light-emitting diodes 47.
  • when the ball 17 is upstream of the diaphragm 39, the ball 17 will appear illuminated from the front in the image acquired under upstream lighting and illuminated from behind in the image acquired under downstream lighting.
  • when the ball 17 is downstream of the diaphragm 39, the ball 17 will appear illuminated from the front in the image acquired under upstream lighting and also illuminated from the front in the image acquired under downstream lighting.
  • the method according to the invention determines whether the ball 17 is located upstream or downstream of the diaphragm 39 and deduces therefrom which is the correct set of three-dimensional coordinates of the ball 17 in the direct orthonormal coordinate system (O, X, Y, Z).
  • the method of the invention then deduces the gravity vector, the origin of which is the center OS of the sphere 11 and the end of which is given by the coordinates (XB, YB, ZB) of the center of the ball 17 in the direct orthonormal coordinate system (O, X, Y, Z).
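For this second embodiment, the upstream/downstream decision described above can be sketched as a simple brightness comparison over the ball pixels in the two acquired images. The 0.5 ratio used as a decision threshold is an assumed value for illustration; the patent only states that the ball appears lit from behind in one case and lit from the front in the other.

```python
import numpy as np

def ball_is_upstream_of_diaphragm(img_upstream_lit, img_downstream_lit, ball_mask):
    """Decide on which side of the diaphragm 39 the ball 17 lies.

    img_upstream_lit   : image acquired with only the first set of upstream LEDs 27 on
    img_downstream_lit : image acquired with (at least) the second set of downstream LEDs 47 on
    ball_mask          : boolean mask of the pixels identified as belonging to the ball 17
    """
    mean_up = float(np.mean(img_upstream_lit[ball_mask]))
    mean_down = float(np.mean(img_downstream_lit[ball_mask]))
    # A ball lit from behind shows a darker visible face than a ball lit from the front,
    # so a marked brightness drop under downstream lighting means the ball is upstream.
    return mean_down < 0.5 * mean_up
```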
  • the ignition of the first set of upstream light-emitting diodes 27 and of the second set of downstream light-emitting diodes 47 at the right time, as well as the sequencing of the steps, are controlled by the image processing device 23.
  • the advantages set out above for the attitude recognition device apply equally to this object comprising the attitude recognition device of the invention, to this lander and to this method.
  • the invention is not limited to the embodiments of the attitude recognition device described above, given only as illustrative examples, but on the contrary embraces all the variants involving the technical equivalents of the means described as well as their combinations if they come within the scope of the invention.
  • the present invention can be used on board any space vehicle needing to determine its attitude relative to an acceleration vector, provided that this vector does not vary rapidly over time, in order to allow time for the ball to stabilize.
  • the present invention can also function perfectly well on Earth, in terrestrial gravity, in any application requiring an absolute attitude measurement over 720°, provided that the attitude does not vary too quickly over time, in order to allow time for the ball to stabilize.
  • the various constituent elements of the invention obviously do not need to be spatialized.
  • the attitude recognition device can be intended to be integrated into an object designed to operate in a terrestrial environment.
  • Such an object can be provided with movement capacities, such as an aircraft, or devoid of its own movement capacities but for which one wants for example to know the evolution of the attitude during its handling or its transport, such as for example a box or a container.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
EP20737236.8A 2019-05-17 2020-05-14 720° absolute inclinometer capable of operating in milli-gravity Pending EP3969845A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1905203A FR3096128A1 (fr) 2019-05-17 2019-05-17 Inclinomètre absolu 720° pouvant fonctionner en milli-gravité
PCT/FR2020/050803 WO2020234529A1 (fr) 2019-05-17 2020-05-14 Inclinomètre absolu 720° pouvant fonctionner en milli-gravité

Publications (1)

Publication Number Publication Date
EP3969845A1 true EP3969845A1 (de) 2022-03-23

Family

ID=67810876

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20737236.8A Pending EP3969845A1 (de) 2019-05-17 2020-05-14 Absoluter 720°-neigungsmesser für den betrieb in der mikrogravitation

Country Status (3)

Country Link
EP (1) EP3969845A1 (de)
FR (2) FR3096128A1 (de)
WO (1) WO2020234529A1 (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118293880B (zh) * 2024-06-05 2024-08-16 青岛雷悦重工股份有限公司 一种集装箱贴标用水平度检测装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3100350A (en) * 1960-07-18 1963-08-13 Clifford K Brown Magnetic direction and inclination indicating device
JPS5856810B2 (ja) * 1980-11-17 1983-12-16 工業技術院長 全方向の検出方法及びその装置
DE102016110144A1 (de) * 2016-06-01 2017-11-16 Deutsches Zentrum für Luft- und Raumfahrt e.V. Vorrichtung und Verfahren zur Lagebestimmung eines Landers
CN108469251B (zh) * 2018-01-22 2020-07-07 北京邮电大学 一种基于图像识别的球形倾角传感器

Also Published As

Publication number Publication date
FR3096127A1 (fr) 2020-11-20
FR3096127B1 (fr) 2023-01-20
WO2020234529A1 (fr) 2020-11-26
FR3096128A1 (fr) 2020-11-20

Similar Documents

Publication Publication Date Title
EP1407214B1 (de) Vorrichtung und dazugehöriges verfahren zum bestimmen der richtung eines zieles
EP3488540B1 (de) Kombiniertes bildgebungs- und laserkommunikationssystem
EP2495531B1 (de) Verfahren zum Messen der Stabilität einer Sichtlinie und entsprechender Sternsensor
US20170370725A1 (en) Daytime and nighttime stellar sensor with active polarizer
WO2020234529A1 (fr) Inclinomètre absolu 720° pouvant fonctionner en milli-gravité
EP3117260A1 (de) Optisches verfahren zur erfassung von räumlichen beweglichen objekten und teleskopsysteme zur erfassung von räumlichen beweglichen objekten
EP1740457B1 (de) Satellit, verfahren und satellitenflotte zur beobachtung eines himmelskörpers
WO1998031985A1 (fr) Dispositif apte a determiner la direction d'une cible dans un repere predefini
WO2005066024A1 (fr) Systeme optronique modulaire embarquable sur un porteur
WO2014147042A2 (fr) Procédé et dispositif de détermination d'une interdistance entre un drone et un objet, procédé de pilotage de vol d'un drone
EP3476734B1 (de) Drohne zum suchen und markieren eines ziels
FR3067817A1 (fr) Systeme d'observation embarque comprenant un lidar pour l'obtention d'images tridimentionnelles haute resolution
EP3571468B1 (de) Verfahren zur beobachtung der erdoberfläche und vorrichtung zur durchführung desselben
FR2867283A1 (fr) Procede d'occultation stellaire, dispositif et ensemble de mise en oeuvre du procede
EP0608945B1 (de) Sterndetektor mit ladungsgekoppelter Matrix, Verfahren zur Detektion und Anwendung zur Neueinstellung eines Raumfahrzeuges
Hockman et al. Gravity poppers: Hopping probes for the interior mapping of small solar system bodies
FR2726643A1 (fr) Dispositif d'observation d'une zone de terrain
EP2388646B1 (de) Verfahren zur Bilderfassung
FR2675114A1 (fr) Dispositif a engin volant pour le survol d'une zone, notamment en vue de sa surveillance.
FR2981149A1 (fr) Aeronef comprenant un senseur optique diurne et nocturne, et procede de mesure d'attitude associe
WO2020193881A1 (fr) Système de neutralisation d'une cible utilisant un drone et un missile
EP4183066B1 (de) Verfahren zur datenübertragung durch ein raumfahrzeug mit einem laser-sendemodul
Pack et al. Flight Operations of Two Rapidly Assembled CubeSats with Commercial Infrared Cameras: The Rogue-Alpha, Beta Program
FR3049066A1 (fr) Systeme de surveillance et de detection d’un evenement a la surface terrestre par une constellation de satellites
FR3132142A1 (fr) Sextant electronique astral diurne et nocturne a plateforme inertielle

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20211130

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240117