EP3060480A1 - Collaborative robot for the visual inspection of an aircraft - Google Patents

Collaborative robot for the visual inspection of an aircraft

Info

Publication number
EP3060480A1
Authority
EP
European Patent Office
Prior art keywords
visual inspection
robot
aircraft
anomaly
inspection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14787189.1A
Other languages
English (en)
French (fr)
Inventor
Nicolas Colin
Frank GUIBERT
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Airbus SAS
Original Assignee
Airbus Group SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Airbus Group SAS filed Critical Airbus Group SAS
Publication of EP3060480A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0038Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/007Manipulators mounted on wheels or on carriages mounted on wheels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64FGROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F5/00Designing, manufacturing, assembling, cleaning, maintaining or repairing aircraft, not otherwise provided for; Handling, transporting, testing or inspecting aircraft components, not otherwise provided for
    • B64F5/60Testing or inspecting aircraft components or systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/04Analysing solids
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/22Details, e.g. general constructional or apparatus details
    • G01N29/225Supports, positioning or alignment in moving situation
    • G01N29/226Handheld or portable devices
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2291/00Indexing codes associated with group G01N29/00
    • G01N2291/26Scanned objects
    • G01N2291/269Various geometry objects
    • G01N2291/2694Wings or other aircraft parts
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45066Inspection robot
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S901/00Robots
    • Y10S901/30End effector
    • Y10S901/44End effector inspection

Definitions

  • the present invention belongs to the field of non-destructive testing of aircraft.
  • the invention relates to a collaborative robot for visually inspecting an aircraft on the ground as part of the verification or control operations and relates to an inspection method implementing such a robot.
  • the invention finds particular application in the field of control operations before the flight of an aircraft.
  • visual inspection does not require any particular disassembly, at most the opening of doors or hatches, which allows a relatively fast inspection.
  • the visual inspection is performed by a ground operator, who is for example a mechanic or a pilot of the aircraft.
  • the inspection by a ground operator is carried out following a checklist, but the operator can freely inspect elements or areas of the aircraft other than those provided in the checklist, particularly if clues lead to an interest in a particular area, which improves the detection of possible anomalies.
  • Patent application WO2012/047479 illustrates an example of an automated inspection device for an aircraft.
  • the inspection of the aircraft is carried out by a set of cameras that are fixed, mobile on the ground by means of mobile robots, or mobile in flight by means of flying robots, and that are assigned to different parts of the aircraft.
  • the display means communicate with a remote computer center which processes the received images to deduce the presence of an anomaly and determines the maintenance operations to be performed.
  • the device of the invention for the visual inspection of the external surfaces of an aircraft comprises an inspection area intended to receive an aircraft and comprises at least one visual inspection robot in which a mobile platform carries a turret with display means, and comprises processing means providing guidance of the mobile platform and processing of the information received from the display means.
  • the visual inspection device comprises a control center with a station for at least one control operator, and the processing means of the visual inspection robot are adapted to:
  • the visual inspection robot comprises means for determining at any time during an inspection the position of the visual inspection robot and the orientation of the display means in an axis system linked to the aircraft.
  • the robot is thus able to move autonomously with respect to the aircraft both for the management of its movements and for that of the areas of the aircraft to be inspected visually.
  • the robot processing means are adapted to determine the position of the robot and the orientation of the display means by processing images of the aircraft to be inspected obtained by the display means.
  • Such means make it possible to readjust the position of the robot with respect to the aircraft, even if the aircraft is not exactly at the theoretical position it should have, and to correct the deviations from this position without measurements or means external to the device.
  • the visual inspection robot has absolute localization means, such as a GPS receiver or laser rangefinders pointing at reference targets, and/or means for integrating its movements, for example by odometry.
  • the robot is thus able to move independently of any registration on the position of the aircraft to get close to the aircraft and after registration to determine its precise position during its movements even without establishing its position continuously by direct observation of the aircraft.
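  • As a purely illustrative aside (not part of the patent text), the sketch below shows one conventional way such a registration with respect to the aircraft could be computed: a 2D rigid transform is fitted between landmark positions measured in the robot frame and their nominal positions in the aircraft frame, the translation then giving the robot position. The function name and example coordinates are hypothetical.

```python
# Illustrative sketch only: estimating the robot pose in the aircraft OX-OY frame by
# registering observed landmarks against their nominal positions (2D Kabsch fit).
import numpy as np

def estimate_pose_2d(landmarks_aircraft, landmarks_robot):
    """Return (R, t) such that p_aircraft ~= R @ p_robot + t."""
    A = np.asarray(landmarks_robot, dtype=float)     # landmarks measured in the robot frame
    B = np.asarray(landmarks_aircraft, dtype=float)  # same landmarks, nominal aircraft frame
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                        # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cb - R @ ca                                  # robot origin expressed in the aircraft frame
    return R, t

# hypothetical example: three recognised landmarks (nose gear, left/right main gear doors)
nominal = [(0.0, 0.0), (10.0, 3.0), (10.0, -3.0)]    # aircraft frame, metres
observed = [(4.2, 1.1), (13.9, 5.0), (14.6, -0.9)]   # robot frame, metres
R, t = estimate_pose_2d(nominal, observed)
heading_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
print(f"robot position in aircraft frame: {t}, heading offset: {heading_deg:.1f} deg")
```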
  • the visual inspection device comprises all or some of the following features:
  • the processing means comprise data storage means comprising, at least temporarily, characteristics, in particular geometric and graphic, of an aircraft to be inspected.
  • the robot thus locally has the nominal characteristics of the aircraft it must inspect and against which it must visually identify any anomalies.
  • the processing means comprise data storage means comprising anomaly characteristics, for example in an anomaly library.
  • the robot, in addition to logic engines to identify anomalies that are not necessarily listed, is thus able to compare any observed visual element against known anomalies.
  • the processing means comprise image processing algorithms for detecting visible anomalies in the images produced by the viewing means in at least one of the wavelengths of the optical spectrum.
  • the display means comprise means of illumination in light of the visible range and/or in the infrared range and/or in the ultraviolet range.
  • the display means and the processing means are configured to determine a three-dimensional shape of the exterior surfaces inspected of the aircraft. It is thus possible to identify which forms of the outer surface of the aircraft do not conform to the nominal shapes.
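  • As a purely illustrative aside (not part of the patent text), the following sketch shows how a measured three-dimensional surface patch could be compared with its nominal shape to flag a local depression; the tolerance and patch data are hypothetical.

```python
# Illustrative sketch only: flagging deviations of a measured 3D surface patch from its
# nominal shape (e.g. a depression left by an impact). Tolerances are hypothetical.
import numpy as np

def surface_deviation_map(measured_z, nominal_z, tolerance_mm=1.5):
    """Return a boolean map of points deviating beyond tolerance, and the worst deviation."""
    deviation = np.asarray(measured_z, float) - np.asarray(nominal_z, float)
    anomaly_mask = np.abs(deviation) > tolerance_mm
    return anomaly_mask, float(np.max(np.abs(deviation)))

# 50 x 50 mm patch sampled every millimetre; a roughly 12 mm wide, 3 mm deep dent is simulated
y, x = np.mgrid[0:50, 0:50]
nominal = np.zeros((50, 50))                        # nominally flat patch (heights in mm)
measured = nominal - 3.0 * np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / (2 * 6.0 ** 2))
mask, worst = surface_deviation_map(measured, nominal)
print(f"{mask.sum()} points out of tolerance, worst deviation {worst:.1f} mm")
```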
  • the visual inspection device comprises means of non-destructive control of the structure of the inspected aircraft.
  • the non-destructive inspection means are carried by the visual inspection robot or are carried, in whole or in part, by at least one control robot whose behavior is controlled by the visual inspection robot.
  • it is thus possible to determine whether the visual anomaly is the sign of a deeper structural anomaly and to measure non-visible damage.
  • the display means are orientable in elevation and in azimuth with respect to a reference frame of the platform of the visual inspection robot. It is thus possible to quickly scan all the outer surfaces of the aircraft visible to the robot during its movement.
  • the processing means are configured to determine the position of a fault detected on an aircraft with respect to elements of the internal structure of said aircraft, which are not visible from outside the aircraft. The consequences of the defect are thus better evaluated and the position of the defect is localized for the maintenance operators in relation to identifiable structural elements.
  • the inspection robot is a robot rolling on the floor of the inspection area or a robot moving airborne in a volume whose footprint substantially corresponds to the inspection area.
  • a plurality of inspection robots configured to perform a visual inspection of the same aircraft is implemented. This results in a faster and, if necessary, more complete visual inspection if specialized robots to inspect certain areas are implemented.
  • the invention is also directed to a method of visually inspecting an aircraft in which images of an outer surface of the inspected aircraft are transmitted to processing means of a visual inspection robot, wherein the processing means analyze the images to identify the presence of any visible anomalies, and wherein, when a visible anomaly is detected, the detected anomaly data are transmitted to a control center and the visual inspection is interrupted, at least when the processing means identify an anomaly belonging to a category of anomalies considered critical.
  • instructions are transmitted to the visual inspection robot by the control center for the continuation of the visual inspection, said instructions determining how the robot must continue the inspection.
  • the amplitude of a visible anomaly is calculated by the processing means from optical deformation measurement means and/or by a colorimetric analysis in the visible, infrared and/or ultraviolet ranges of the light spectrum.
  • an area affected by the visible defect is subjected to non-destructive testing by the visual inspection robot or by a non-destructive control robot controlled by the visual inspection robot.
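  • As a purely illustrative aside (not part of the patent text), the sketch below condenses the inspection flow of the method described above: images are analysed zone by zone, each detection is reported to the control center, and the inspection is interrupted when an anomaly of a category considered critical is found. The category names and function signatures are hypothetical.

```python
# Illustrative sketch only of the inspection flow described above. All identifiers are hypothetical.
CRITICAL_CATEGORIES = {"structural_damage", "fluid_leak", "open_hatch"}

def run_inspection(zones, capture_image, analyse_image, report_to_control_center):
    for zone in zones:
        image = capture_image(zone)
        anomalies = analyse_image(image)            # list of (category, data) tuples
        for category, data in anomalies:
            report_to_control_center(zone, category, data, image)
            if category in CRITICAL_CATEGORIES:     # critical anomaly: stop and wait for instructions
                return f"inspection interrupted at {zone}: critical anomaly '{category}'"
    return "inspection completed, no critical anomaly"

# minimal stubs to exercise the loop
zones = ["nose_gear", "left_wing", "fuselage_left"]
result = run_inspection(
    zones,
    capture_image=lambda z: f"img:{z}",
    analyse_image=lambda img: [("fluid_leak", {"area_cm2": 40})] if img == "img:left_wing" else [],
    report_to_control_center=lambda *args: None,
)
print(result)
```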
  • Figure 1 shows an example of a basic arrangement of the main components of a visual inspection device of an aircraft implementing a collaborative robot and a remote control center;
  • Figure 2 shows an example of a flowchart of the main steps of the visual inspection process implementing a collaborative robot and a remote control center.
  • a collaborative inspection robot of an aircraft 90 such as the robot 10 illustrated in FIG. 1 comprises a mobile platform 11 provided with a turret 12 carrying viewing means 13.
  • the illustrated mobile platform 11 is mounted on four wheels ensuring the stability of the platform and its displacement on the ground by the motorization of at least one of the wheels.
  • the turret 12 is secured, at a lower portion of said turret, to the platform 11, for example on an upper face of said platform, and the display means 13, in the illustrated example a digital camera, are integral with the turret 12 at an upper part of said turret.
  • All the mechanical links between the platform 11 and the turret 12, and between the turret 12 and the display means 13, are arranged so that a main direction of observation 131 of the display means 13 is steerable in elevation and in azimuth in all possible directions, to aim at points above the ground and advantageously also on the ground on which the robot 10 moves, if necessary taking into account the displacement capabilities of the platform, which provide the possibility of changing the azimuth of a heading line 111 of said platform.
  • the turret 12 is designed to modify the azimuth of the main direction of observation 131 with a limited angular amplitude, for example 180°, the other directions then being reached by movements of the platform on the ground.
  • the orientation capability in elevation of the main direction of observation 131 is advantageously at least 90° between a substantially horizontal aim and a substantially vertical aim upward.
  • advantageously, the substantially horizontal aim extends to a downward aim, to allow observation from above of areas that may be located under the viewing means 13, placed for example in an elevated position by means of the turret 12.
  • such areas are for example parts of the ground under the aircraft likely to show signs of a leak of a fluid from the aircraft.
  • Such zones are, for example, tire treads likely to bear signs of abnormal wear, or parts of the structure that can be observed from above because of the raised position of the display means.
  • the turret 12 includes elevation means 121 of the display means for modifying the height above the ground of said display means.
  • Such elevation means 121 consist for example of an assembly of articulated arms, two or more, which unfold or fold by means of actuators, not shown, so that the height, above the platform 11 and therefore above the ground, of the upper end of the turret 12 carrying the display means 13 can be modified.
  • the display means 13 in an elementary form, consist of a camera providing images in the field of visible light and intended to be analyzed.
  • the images can be transmitted in analogue form to be processed if necessary after being converted into digital form or transmitted in digital form.
  • the display means 13 comprise illumination means 132, for example integral with the turret 12, oriented to illuminate areas to be inspected located in a field of view of said viewing means.
  • the illumination means 132 may be offset from the main viewing direction 131 of the display means 13 to create more or less grazing lights for viewing certain defects such as superficial depressions.
  • the illumination means 132 may also be arranged to produce illuminations in particular areas of the visible light spectrum, of red, green or blue color, for example, or non-visible, corresponding to particular spectral bands of the infrared or ultraviolet spectra, and are advantageously switchable between the different visible or non-visible spectral bands.
  • the spectra of the lights of the illumination means 132 are set according to the spectral sensitivities of the cameras used for the display means 13 and according to the expected behaviors of the zones inspected under the different lighting, in particular when contrast effects or fluorescence are potentially expected to visualize and / or diagnose an abnormality.
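  • As a purely illustrative aside (not part of the patent text), a minimal sketch of such a mapping between the anomaly type sought and the illumination band selected is given below; the table entries are hypothetical examples, not data from the patent.

```python
# Illustrative sketch only: selecting an illumination band according to the anomaly type sought.
ILLUMINATION_FOR_ANOMALY = {
    "surface_depression": "white_grazing",   # grazing light reveals shallow dents
    "fluid_leak":         "ultraviolet",     # many fluids fluoresce under UV
    "overheating_marker": "infrared",
    "paint_damage":       "white_diffuse",
}

def select_illumination(anomaly_type, default="white_diffuse"):
    return ILLUMINATION_FOR_ANOMALY.get(anomaly_type, default)

print(select_illumination("fluid_leak"))   # -> ultraviolet
```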
  • the display means 13 comprise particular equipment, not shown in the drawings, such as for example stereoscopic cameras and/or other devices for giving a three-dimensional perception, for example by stereo-correlation techniques, shearography, laser telemetry ...
  • the perception in three dimensions makes it possible to check the conformity and to detect the deformations of both curved surfaces and plane surfaces.
  • the mobile platform 11 is self-powered.
  • the propulsion can be carried out by combustion engines ensuring the driving of the wheels for the displacement of the mobile platform, for example by a hydrostatic transmission, and producing the hydraulic and/or electrical power necessary for the actuators used for the movements of the turret 12 and of the viewing means 13.
  • Propulsion can also be achieved by electric motors powered by electrical generating means carried by the platform or by electric accumulators.
  • the use of electric accumulators to power electric motors and electric actuators for the movements of the mobile platform 11 and of the turret 12 is well suited to the inspection of aircraft, which is performed in an airport environment with infrastructure that easily allows recharging of the electric accumulators between two inspections or as often as necessary.
  • the collaborative robot further comprises calculation means 20 necessary for visual inspection.
  • the calculation means 20 comprise in particular acquisition means 21 for the observation data transmitted by the display means 13, processing means 22 for said observation data, communication means 23 with a control center 50, remote from the area in which the collaborative robot is to perform inspections, and localization means 24.
  • the various means 21, 22, 23, 24 of the calculation means 20 on board the robot 10 are advantageously means communicating with each other digitally via a communication bus of said robot.
  • the acquisition means 21 of the observation data mainly consist of digital data storage memories having sufficient capacity to keep the observation data for the period necessary for their processing, in order to determine whether anomalies have been observed, and, if necessary, of data preprocessing means for their exploitation.
  • the observation data are stored in memory by the robot 10, or in a remote storage center, not shown, for subsequent downloading by the robot, with reference to the aircraft on which said observation data were established, so that comparative processing of the observation data between two or more inspections can be performed.
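  • As a purely illustrative aside (not part of the patent text), the sketch below shows one possible data structure for keeping observation records per aircraft and per zone so that two successive inspections can be compared; the class and field names are hypothetical.

```python
# Illustrative sketch only: storing observation records per aircraft and per zone for comparison.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Observation:
    aircraft_id: str
    zone: str
    timestamp: datetime
    image_ref: str                       # reference to the stored image data
    anomalies: list = field(default_factory=list)

class ObservationStore:
    def __init__(self):
        self._records = {}               # (aircraft_id, zone) -> list of Observation

    def add(self, obs: Observation):
        self._records.setdefault((obs.aircraft_id, obs.zone), []).append(obs)

    def last_two(self, aircraft_id, zone):
        history = self._records.get((aircraft_id, zone), [])
        return history[-2:]              # enough for a comparison between two inspections

store = ObservationStore()
store.add(Observation("F-WXYZ", "left_wing", datetime(2014, 10, 1), "img_001"))
store.add(Observation("F-WXYZ", "left_wing", datetime(2014, 10, 23), "img_057", ["scratch"]))
print(store.last_two("F-WXYZ", "left_wing"))
```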
  • the processing means 22 mainly comprise digital processing units, for example processors and/or microcontrollers, random access memories (RAM), storage memories (ROM, flash memory, SDRAM, hard disks, etc.), digital or analog interfaces and communication buses between the different components of said processing means.
  • the processing means 22 comprise specialized circuits for the rapid processing of the signals transmitted by the display means, for example circuits for the processing of the signals corresponding to fixed or moving images.
  • the processing means 22 are configured to produce sequences of instructions ensuring:
  • the robot comprises the mechanical and other means which will appear to those skilled in the art to be necessary for the implementation described by way of example below.
  • the collaborative robot of the invention is not a fully automated means of inspection for detecting anomalies on an aircraft 90 but a means of assistance available to an operator for performing a visual inspection.
  • the collaborative robot is particularly adapted to allow remote inspection by the operator and to provide inspection assistance for more accurate, detailed, repeatable and reliable diagnoses.
  • a collaborative robot is initialized to be able to perform the desired inspection.
  • the robot 10 receives a position 211 of the aircraft 90 to be inspected, at least an approximate position such as the number or the geographical position of a stand on which the aircraft is parked, and receives information 212 on the aircraft to be inspected, at least a type of aircraft and preferably also an identity of the aircraft.
  • since the collaborative robot may be required to carry out inspections on aircraft that can be parked at different locations of a parking area, it is then informed, during the first step 210, of the location of the aircraft 90 to be inspected.
  • the collaborative robot 10, when it reaches the area in which the aircraft 90 it is to inspect is located, examines the aircraft overall with its display means 13, from which examination it deduces, from a database of aircraft shapes, the type of aircraft in question.
  • the collaborative robot by processing the images acquired during this step, detects the registration of the aircraft and by consulting a database of registered aircraft verifies that the type of aircraft bearing said registration corresponds to the type identified by the analysis of the shape of the aircraft.
  • the information on the aircraft 90 obtained during the first step 210 and during the second step 220 is compared during a third step 230 to check the consistency of the data transmitted by the control center and to detect any inconsistency as soon as possible.
  • Any anomaly or inconsistency detected during the third step is immediately reported by the collaborative robot to an operator responsible for the inspection, who will decide on the arrangements to be made: stop the inspection by the robot in order to correct the information on the type and identity of the aircraft, send the robot orders to obtain additional visual data before making a decision, or give the collaborative robot instructions to continue the inspection while ignoring certain detected inconsistencies, if necessary by also requesting an intervention by an operator to carry out an on-site analysis and remove the doubt.
  • during a fourth step 240, which fourth step can be carried out as early as the previous steps on the basis of the instructions received by the collaborative robot, said collaborative robot downloads the data necessary for the inspection to be performed, at least when it does not already have all of said data in its internal storage memories.
  • Said data comprise, for example, an inspection circuit to be carried out, determined by a nominal trajectory, lists of inspection operations to be performed for the aircraft type, and lists of inspection operations to be performed for the particular aircraft, for example based on previous observations on the same aircraft or on other aircraft of the same type.
  • said data, at least some of them, are acquired by the collaborative robot by learning, for example during inspections carried out in remote-controlled mode, and are completed during successive inspections to improve the future detection of anomalies that have already been detected.
  • said data also include characteristics of the aircraft: nominal geometric characteristics, colors and patterns painted on the aircraft, positions and types of hatches, locks, probes ... to be verified during the visual inspection, known anomalies, for example detected during previous inspections ...
  • the collaborative robot will perform the visual inspection operations following the inspection circuit while moving on the ground near the aircraft 90.
  • the collaborative robot continuously determines its position in a reference frame linked to the aircraft being inspected, for example an OX-OY frame whose ground trace corresponds to a longitudinal X axis of the aircraft and a transverse Y axis of the aircraft.
  • the collaborative robot uses conventional techniques, for example the recognition of landmarks constituted by characteristic shapes or subassemblies of the aircraft, data which are advantageously coupled with geolocation information, for example by GPS, by laser telemetry or by other localization devices, and with odometry techniques to integrate the movements of said robot, the various techniques being advantageously hybridized to make accurate estimates of the position and orientation of the robot in the OX-OY frame, typically about one centimeter or less in position and one minute of angle or less in orientation.
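  • As a purely illustrative aside (not part of the patent text), the sketch below illustrates the principle of hybridizing a drifting odometry estimate with an occasional absolute fix by inverse-variance weighting; the variances are hypothetical figures chosen to land near the centimetre accuracy quoted above.

```python
# Illustrative sketch only: fusing dead-reckoned odometry with an absolute fix
# (landmark recognition, GPS, laser telemetry) by inverse-variance weighting.
import numpy as np

def fuse(position_odometry, var_odometry, position_fix, var_fix):
    """Combine two position estimates of the same point into one weighted estimate."""
    w_odo = 1.0 / var_odometry
    w_fix = 1.0 / var_fix
    fused = (w_odo * np.asarray(position_odometry) + w_fix * np.asarray(position_fix)) / (w_odo + w_fix)
    fused_var = 1.0 / (w_odo + w_fix)
    return fused, fused_var

# dead reckoning has drifted a few centimetres; the landmark fix is more precise
odo = (12.34, -5.10)        # metres; variance grows with distance travelled
fix = (12.30, -5.05)
pos, var = fuse(odo, 0.04 ** 2, fix, 0.01 ** 2)
print(f"fused position: {pos}, standard deviation ~{var ** 0.5 * 100:.1f} cm")
```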
  • the collaborative robot 10 is called upon to operate on a substantially flat and horizontal surface of an aircraft parking area of an airport facility; where appropriate, the localization techniques used also ensure the determination of the relative position in height of the robot with respect to the aircraft.
  • the robot 10 following the instructions stored in a program of the processing means, will move along a path allowing it to visualize all the points of the aircraft to be inspected.
  • the robot 10 generally follows a predetermined trajectory 91 on the ground.
  • the robot detects 253 any obstacles, for example by optical means and/or by specific sensors, for example ultrasonic emission sensors, and where appropriate deviates from the predetermined trajectory, performing the necessary movements to circumvent the obstacle(s) and to ensure that the visual inspection of all areas to be inspected is properly performed.
  • a detected obstacle is reported 254 to the operator responsible for the inspection as a potential anomaly that must be addressed before the aircraft 90 leaves the parking spot.
  • To perform the visual inspection, the robot, according to its position and its estimated orientation in the OX-OY frame, positions and directs the display means so that images of the zones of the aircraft to be inspected are taken and transmitted to the processing means 22.
  • the images are made continuously during the movements of the robot and during its stops, so as to visualize the areas of the aircraft that can be observed from the ground given the possibilities of orientation and of elevation above the ground of the display means 13.
  • the processing means 22 perform a processing of the received images 256 to identify any type of anomaly resulting in identifiable characteristics in the images. For example, such characteristics correspond to geometric, contrast and/or colorimetric singularities, singularities that can be observed in one or in different illumination spectra; if necessary the robot switches the illumination spectra during the positioning and orientation step 255.
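  • As a purely illustrative aside (not part of the patent text), a minimal sketch of one such image processing is given below: a new image of a zone is compared with a reference image of the same zone taken under the same illumination, and a bounding box of suspect pixels is returned; the thresholds are hypothetical.

```python
# Illustrative sketch only: flagging contrast/colorimetric singularities by differencing
# a new image of a zone against a reference image of the same zone.
import numpy as np

def detect_singularities(image, reference, diff_threshold=30, min_pixels=50):
    """Return a bounding box (row0, row1, col0, col1) of suspect pixels, or None."""
    diff = np.abs(image.astype(int) - reference.astype(int))
    suspect = diff > diff_threshold                 # pixels departing from the reference
    if suspect.sum() < min_pixels:                  # ignore isolated noise
        return None
    rows, cols = np.where(suspect)
    return rows.min(), rows.max(), cols.min(), cols.max()

reference = np.full((100, 100), 120, dtype=np.uint8)   # uniform grey panel
image = reference.copy()
image[40:55, 60:80] = 40                                # dark stain (e.g. a drip)
print(detect_singularities(image, reference))           # -> (40, 54, 60, 79)
```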
  • the robot 10 selects a given spectrum of the illumination means 132 according to the types of anomalies sought in an area being inspected or according to the facies of an anomaly detected, for example in white light, in order to establish a diagnosis.
  • the spectrum of illumination is for example in the field of visible light, in white light or in a particular color of the visible spectrum, or in the infrared or ultraviolet range, for example to identify areas having particular fluorescences.
  • Such fluorescence properties may be the direct trace of an anomaly such as, for example, a leak of a liquid, such as fuel, hydraulic fluid or the like, leaving a visible trace on an outer surface of the aircraft and/or on the ground. They may also be, for example, the sign of an aircraft electrically charged with electrostatic charges, through the presence of corona discharges perceptible at the static dischargers of the fuselage or of the wings.
  • such indicators can also be the result of markers revealed by particular conditions, for example following exposure to abnormal heat or to a shock.
  • such markers consist, for example, of paints whose spectral properties are modified by exposure to heat, or of paints comprising microspheres filled with a fluid that becomes visible when illuminated by light of a particular wavelength and whose fluid is dispersed during a mechanical impact on a structure covered with such paint.
  • the image processing is carried out, in addition to the detection by indicators such as those mentioned above, to detect abnormal shapes or changes of shape between successive inspections, or dissymmetries of shape between nominally symmetrical areas of the aircraft, for example bulges or depressions, such changes of shape being interpreted in general as anomalies of the structure.
  • the processing means 22 perform an analysis of the images transmitted by the display means 13 by analogy with virtual representations of the observed areas, in particular of elements located on these particular areas, kept in memory.
  • the virtual representations are, for example, three-dimensional views or digital representations that make it possible to place a virtual representation in the position, orientation and distance under which the actual particular zone is observed by the display means 13, in order to carry out a digital comparison of the real element with its virtual representation.
  • anomalies of greater or lesser dimensions can be sought according to criteria established in advance, the display means 13 being advantageously provided with magnification means, for example of the optical zoom type, to perform the search for small-size defects.
  • the particular zones are for example zones comprising a door or an inspection hatch, or engine cowlings whose closure and the position of whose locking devices must be verified.
  • the particular zones are, for example, zones comprising visible equipment such as an angle-of-attack probe, a total aerodynamic pressure tap, a static pressure tap, an ice detector, an antenna, a drain or any other type of equipment that can be damaged, for example twisted or torn off, or obstructed, for example by probe covers.
  • the collaborative robot transmits inspection reports to the control center 50, supervised by an operator responsible for the inspection of the aircraft.
  • If the robot 10 does not detect any anomaly 258 in an inspected area, it transmits information of absence of detected anomaly, translated in the control center for example by a "green light", and information on the progress of the inspection can be displayed on a screen 51 of the control center 50 so that it can be followed. If the robot detects an anomaly 259, it transmits an alert to the control center 50, which alert is accompanied by images of the visual observation of the area produced at the detection of the anomaly, with a diagnosis or a list of possible diagnoses ranked according to their probabilities given the typology of the anomaly processed by the robot 10.
  • In a case where the collaborative robot identifies an anomaly associated with a high probability that the anomaly affects safety, said collaborative robot interrupts the inspection and transmits to the control center 50 an anomaly detection report with the alert message and a "red" warning, an intervention on the aircraft being in this case a priori necessary.
  • In the case where the collaborative robot 10 identifies an anomaly for which said collaborative robot is not able to make a diagnosis, or only with a high uncertainty on the consequences of the anomaly, said collaborative robot transmits the alert message with an "orange" alarm (caution) and goes into a standby condition and into a remote-controlled mode 262 in which it waits for instructions from the operator responsible for the inspection (a sketch of this reporting logic is given below, after the operator's options).
  • the operator can, after examining the images and data transmitted by the collaborative robot, decide either:
  • remotely control 264 the robot to obtain additional data on the area that generated the alarm, before making a decision to continue 263 or stop 265 the visual inspection;
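  • As a purely illustrative aside (not part of the patent text), the sketch below condenses the green / orange / red reporting logic of the preceding paragraphs into a single decision function; the probability thresholds and field names are hypothetical.

```python
# Illustrative sketch only of the green / orange / red reporting logic described above.
def alert_level(anomaly, safety_probability, diagnosis_confidence):
    """Map a detection to the alert reported to the control center."""
    if anomaly is None:
        return "green", "no anomaly detected, inspection continues"
    if safety_probability > 0.8:
        return "red", "inspection interrupted, intervention a priori required"
    if diagnosis_confidence < 0.5:
        return "orange", "robot on standby in remote-controlled mode, awaiting instructions"
    return "green", "anomaly logged, inspection continues"

print(alert_level({"type": "impact"}, safety_probability=0.9, diagnosis_confidence=0.7))
```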
  • the data collected by the robot 10 and the diagnostic information provided by the operator via the control center 50 are processed to enrich fault databases and to ensure, during subsequent inspections, better detection of the type of anomaly encountered by the robot 10 performing the visual inspection or by a fleet of inspection robots.
  • a given aircraft may comprise an antenna that is missing on other aircraft of the same type, or an antenna of a different model, a situation that may lead to the detection of a false anomaly.
  • the knowledge by the robot 10 of the exact identity of the aircraft 90 allows it to have information specific to this aircraft and for example the presence in a zone of a particularity of the aircraft in question, for example an antenna attached to this particular aircraft.
  • the operator responsible for the inspection, notified by the robot 10 of a detected anomaly, will be able to see that the appearance, for example of the antenna, is normal and confirm this information to the robot, which will enrich the database for the aircraft 90 for subsequent inspections and thus avoid new alerts for the detection of the same false anomaly.
  • When the robot 10 searches for possible anomalies on an aircraft 90 during an inspection, during the fifth step of the method, it inspects the different zones of the aircraft successively to detect visible traces attributable to an abnormal state of the aircraft.
  • the robot 10 moves around the aircraft 90 by methodically scanning with the display means 13 the different parts of the aircraft, or even the ground under the aircraft.
  • the images obtained are, after processing, compared with data representative of normal situations and of abnormal situations to detect the presence of a possible anomaly.
  • the images obtained by the visualization means are processed as much as necessary in order to be compared with the data of the database known to the robot 10.
  • the images are also analyzed to identify any generic characteristics of potential defects such as deformations of the outer surface of the aircraft 90, missing parts, which in practice result in irregular openings or in openings at locations where the robot does not know of any opening, traces such as scratches or drips on the outer surface of the aircraft, missing paint in some locations, traces with a lightning-strike facies ...
  • defects potentially associated with these characteristics are in practice not sought at a particular location but may be found at many points on the outer surface of the aircraft.
  • the processing applied to the images may be general or specific to the detection of a particular problem.
  • the images are processed by contrast-enhancement algorithms, extraction of image areas according to their color or spectral content, contour extraction, texture analysis ...
  • the detection may for example relate to a shape or a color that does not conform to the nominal shape or color, the observed shape or color possibly reflecting a deformation or the presence of a foreign body.
  • An anomaly of form results in an observed form that does not correspond to the expected form.
  • the robot 10 characterizes the anomaly on the one hand by its type, that is to say the characteristics which led to considering the presence of an anomaly, for example: the shape, the color, the extent, the sensitivity at certain wavelengths, the presence or absence of an unexpected or expected element, contrast ..., and on the other hand quantifies the anomaly.
  • the robot also locates the observed anomaly in the axis system of the aircraft, a localization that it performs taking into account its precise position in the OX-OY axis system in which it operates on the ground, the direction in which the display means are oriented and the geometric characteristics of the observed aircraft.
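  • As a purely illustrative aside (not part of the patent text), the sketch below shows how such a localization could be computed from the robot position and heading, the turret azimuth and elevation, and a measured range (for example from laser telemetry); all input values are hypothetical.

```python
# Illustrative sketch only: locating an observed anomaly in the aircraft frame from the robot
# pose, the turret azimuth/elevation and a measured range to the anomaly.
import math

def anomaly_position(robot_xy, robot_heading_deg, camera_height,
                     azimuth_deg, elevation_deg, range_m):
    az = math.radians(robot_heading_deg + azimuth_deg)   # viewing direction in the aircraft frame
    el = math.radians(elevation_deg)
    horizontal = range_m * math.cos(el)
    x = robot_xy[0] + horizontal * math.cos(az)
    y = robot_xy[1] + horizontal * math.sin(az)
    z = camera_height + range_m * math.sin(el)
    return x, y, z

# hypothetical case: robot 4 m to the side of the fuselage, camera raised to 2 m, looking up at the wing
print(anomaly_position(robot_xy=(12.0, -4.0), robot_heading_deg=90.0, camera_height=2.0,
                       azimuth_deg=-30.0, elevation_deg=35.0, range_m=5.0))
```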
  • the knowledge of the position of the anomaly on the outer surface of the aircraft is advantageously exploited by the processing means 22 to perform a diagnosis of the anomaly.
  • For example, it is known that lightning attaches to privileged locations of an aircraft, and the presence of a trace having a lightning-impact facies will not necessarily be interpreted in the same way, as to its causes and effects, according to the location where the trace is found.
  • the knowledge of the position of the anomaly on the surface of the aircraft also makes it possible to relate the anomaly to structural characteristics of the aircraft at the location of the anomaly.
  • the structure may be a metal structure or a composite material structure with properties depending on the type of composite material, and in the latter case with a protection against lightning currents which may differ, for example a more or less dense bronze mesh or a particular conductive paint.
  • the processing means make a prognosis so as to provide the operator in charge of the inspection detailed information on the anomaly and the consequent risks.
  • an impact-type anomaly on the outer surface of the aircraft detected during the inspection results, for example, in a message intended for the control center of the type "fuselage coating depression at the level of stringer No. 14 between fuselage frames 11 and 12 - no traces of coating tear - low structural risk, repair to be programmed", accompanied by images of the area concerned.
  • the processing means perform a quantitative analysis of the defect: extent (surface, length, width, etc.) of the anomaly, for example of a surface affected by an impact, depth of a deformation, intensity of a cause that must have led to the observed anomaly such as a rise in temperature, intensity of a leak in the case of the presence of a suspicious fluid ...
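  • As a purely illustrative aside (not part of the patent text), the sketch below assembles a quantified report of the kind quoted above from the characterization data; the structural references and wording are hypothetical.

```python
# Illustrative sketch only: assembling a quantified anomaly report for the control center.
def anomaly_report(kind, structural_location, extent_mm, depth_mm, tear, risk, action):
    tear_text = "coating tear observed" if tear else "no traces of coating tear"
    return (f"{kind} at {structural_location} - extent {extent_mm} mm, depth {depth_mm} mm - "
            f"{tear_text} - {risk} structural risk, {action}")

print(anomaly_report("fuselage coating depression",
                     "stringer No. 14 between fuselage frames 11 and 12",
                     extent_mm=60, depth_mm=2, tear=False,
                     risk="low", action="repair to be programmed"))
```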
  • the robot 10 uses as much as possible its visual means associated with the processing means, as already specified, by stereoscopic vision techniques and/or by any known optical method such as shearography, holography, telemetry ...
  • the robot performs structural examinations by means of non-destructive testing instruments to quantify the anomalies.
  • the robot, according to the diagnosis made, is positioned, in a step 2591, with respect to the anomaly to deploy the non-destructive testing instruments, not shown, in order to obtain additional information to improve the diagnosis based on the visual inspection.
  • Such non-destructive control means can implement probes, for example ultrasonic probes, eddy current probes, temperature measuring means such as thermal imaging cameras or any other type of probe capable of producing a local examination of the material in a particular area.
  • the probes are carried by one or more articulated arms so that the robot can apply/orient said probes on/toward the locations to be examined, each probe being, during an examination using said probe, associated with measurement equipment embedded on the platform of said robot.
  • the probes are stored in a probe holder of the robot to be implemented via a single articulated arm.
  • the articulated arm, controlled by the robot processing means or by an operator of the control center 50 having chosen to remotely control the examination, grips the probe appropriate for a given examination by any type of gripping means, connects it to the measuring instrument to which the probe is to be connected, then applies/orients the probe on/toward the zone to be controlled while ensuring the desired displacements of said probe; when the control has been carried out and the processing means and/or the remote operator consider that they have obtained the data necessary for the inspection, the articulated arm returns the probe to the probe holder.
  • the robot 10 continues the inspection.
  • When the inspection of the aircraft, or of the part of the aircraft entrusted to the robot 10, is completed, said robot returns to a waiting station 30 during a sixth step 270, advantageously a waiting station providing protection for said robot to prevent damage by the various vehicles and aircraft travelling on the airport platform, and preferably a position where said robot is connected to a power source allowing it, for example, to recharge its electric accumulator batteries if it has them.
  • the waiting station 30 is provided with short-range communication means 31 with the robot, where appropriate wired connections, to allow a rapid exchange of information, with little risk of interference, with the control center 50, for example to transmit all the data collected during the inspection with a view to storing said data and possibly processing them later.
  • Such means may also be implemented in the first step 210 when the robot 10 loads the useful information for its next inspection.
  • the data transmitted by the robot 10 to the control center 50 during the visual inspection may be received by cockpit equipment so that a member of the technical crew has the opportunity to follow the progress of the inspection and to assess any anomalies detected.
  • control means of the robot are arranged in the cockpit to intervene on the progress of the inspection.
  • the robot resumes the inspection so that all the requirements nominally provided for an inspection are fulfilled before the inspection can be declared complete.
  • the robot 10 provides safety functions to avoid collisions with persons or objects, fixed or moving, in its close environment.
  • the robot 10 uses its display means 13 to detect the objects with which a collision can occur.
  • the robot may also include means specifically dedicated to the detection of such objects.
  • said means control the movements of the platform to avoid the collision, if necessary by emitting a sound and/or light signal or other to alert a person who may be involved in the potential collision.
  • the processing means provide guidance of the platform autonomously to avoid the risk of collision by using known localization techniques with respect to an environment whose characteristics they reconstruct, for example using a SLAM (Simultaneous Localization and Map Building) method.
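  • As a purely illustrative aside (not part of the patent text), a minimal sketch of such a safety reflex is given below, combining an ultrasonic-style range reading with a stop-and-alert behaviour; the distances and return values are hypothetical.

```python
# Illustrative sketch only: a minimal collision-avoidance reflex based on range readings.
def safety_step(range_readings_m, stop_distance_m=1.0, warn_distance_m=2.5):
    nearest = min(range_readings_m)
    if nearest < stop_distance_m:
        return "stop", "sound_and_light_alert"      # halt and warn the person or object
    if nearest < warn_distance_m:
        return "slow_down_and_replan", None         # deviate from the nominal trajectory
    return "continue", None

print(safety_step([4.2, 3.8, 0.7]))   # -> ('stop', 'sound_and_light_alert')
```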
  • a single robot 10 performs the visual inspection of the aircraft 90.
  • the robot 10 performs partial visual inspections in combination with partial inspections performed by other similar or specialized robots.
  • two or more identical robots are used to carry out the visual inspection of a right-hand side and of a left-hand side of the aircraft, and/or of a front part and of a rear part.
  • the inspections carried out by two robots are for example synchronized in order to obtain comparative data between two zones of the aircraft, such as a zone on the right side and a theoretically symmetrical zone on the left side, so as to identify or confirm the presence of an anomaly.
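  • As a purely illustrative aside (not part of the patent text), the sketch below compares a zone on the right-hand side with the theoretically symmetrical zone on the left-hand side by mirroring one image and measuring the residual difference; the threshold is hypothetical.

```python
# Illustrative sketch only: symmetry check between a right-hand zone and the mirrored left-hand zone.
import numpy as np

def dissymmetry_score(right_zone, left_zone):
    """Mean absolute difference between the right zone and the mirrored left zone."""
    mirrored_left = np.fliplr(left_zone)
    return float(np.mean(np.abs(right_zone.astype(int) - mirrored_left.astype(int))))

right = np.full((64, 64), 100, dtype=np.uint8)
left = np.full((64, 64), 100, dtype=np.uint8)
left[30:40, 10:20] = 180                     # a mark present only on the left side
score = dissymmetry_score(right, left)
print(f"dissymmetry score: {score:.2f}", "-> anomaly suspected" if score > 1.0 else "")
```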
  • robots with different capabilities can also be implemented, for example robots adapted to parts of the aircraft for which it is necessary to implement particular viewing means, for example because of their heights, such as the vertical fin of an aircraft, or their shapes, such as air intakes or engine nozzle outlets.
  • all or part of the non-destructive control means are carried by at least one control robot whose mobile platform is independent of the visual inspection robot.
  • the control robot is controlled by the visual inspection robot which, when it has identified an area to be subjected to non-destructive testing, sends an intervention command to the control robot with all the data necessary for the control robot to perform the requested control, in particular the exact location on the aircraft of the area to be controlled, its extent and the type of control desired.
  • When the control robot has performed the requested control, it transmits to the visual inspection robot the data obtained by the control, possibly processed to provide an interpretation of the control, and said control robot is placed in a waiting position.
  • a control robot, which is intended to intervene only occasionally and at the request of a visual inspection robot, can serve several visual inspection robots, priority rules managing potential conflicts if two visual inspection robots solicit the control robot at the same time.
  • several control robots may also be implemented, for example one robot for performing ultrasonic checks and another robot for performing eddy-current checks. More or less specialized control robots are thus obtained, whose performance can be adapted to each type of control.
  • the visual inspection robot 10 described in the exemplary embodiment is a robot moving on the ground; however, other types of robots, for example flying robots, can be implemented, if necessary in combination with one or more robots on the ground.
  • said robot comprises means, for example carried by an articulated arm, for opening and closing doors on the surface of the aircraft giving access from outside the aircraft 90 to connectors or indicators, so that the robot 10 is able to close a door left open by mistake, or to open a door to visually inspect a housing and then close the housing door.
  • the inspection is carried out collaboratively between the robot 10 and the remote operator responsible for the inspection, to whom said robot returns all the results of the visual inspection, where appropriate the non-destructive test results for an area presenting a visually detected anomaly, and from whom it awaits instructions whenever an anomaly is detected and a decision requires the intervention of the operator, to whom the images and all the characteristics of the anomaly that could be measured are presented.
  • the operator responsible for the visual inspection remains totally in control of the decision to declare the absence of any anomaly affecting the operational use of the inspected aircraft, with the possibility of intervening remotely on the robot to obtain all the information, in particular the images of doubtful zones, to make a quick decision adapted to the case.
  • the control center from which the person responsible for the inspection operates may be distant from the place of the visual inspection; for example, a visual inspection officer may be located in a maintenance center of the aircraft operator and supervise visual inspections at airports around the world, in particular by making use of available digital links over landlines, terrestrial radio networks and satellite radio networks.
EP14787189.1A 2013-10-24 2014-10-23 Kollaborativer roboter zur optischen inspektion eines flugzeugs Withdrawn EP3060480A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1360395A FR3012425B1 (fr) 2013-10-24 2013-10-24 Robot collaboratif d'inspection visuelle d'un aeronef
PCT/EP2014/072785 WO2015059241A1 (fr) 2013-10-24 2014-10-23 Robot collaboratif d'inspection visuelle d'un aéronef

Publications (1)

Publication Number Publication Date
EP3060480A1 true EP3060480A1 (de) 2016-08-31

Family

ID=50489156

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14787189.1A Withdrawn EP3060480A1 (de) 2013-10-24 2014-10-23 Kollaborativer roboter zur optischen inspektion eines flugzeugs

Country Status (5)

Country Link
US (1) US9952593B2 (de)
EP (1) EP3060480A1 (de)
CN (1) CN105873825B (de)
FR (1) FR3012425B1 (de)
WO (1) WO2015059241A1 (de)

Families Citing this family (74)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2529847B (en) 2014-09-03 2018-12-19 Dyson Technology Ltd A mobile Robot with Independently Adjustable Light Sources
GB2529846B (en) * 2014-09-03 2019-02-20 Dyson Technology Ltd Illumination Control of a Vision System for a Mobile Robot
US10377509B2 (en) * 2014-09-19 2019-08-13 Raytheon Company Dynamic testing of attitude determination and control systems, reaction wheel and torque rods using automotive robotic techniques
FR3035510B1 (fr) * 2015-04-21 2018-10-26 Airbus Group Sas Moyen acoustique de detection, de localisation et d'evaluation automatique d'impacts subis par une structure
US9964398B2 (en) 2015-05-06 2018-05-08 Faro Technologies, Inc. Three-dimensional measuring device removably coupled to robotic arm on motorized mobile platform
GB2538231A (en) * 2015-05-07 2016-11-16 Airbus Operations Ltd Method and apparatus for aircraft inspection
FR3037429B1 (fr) * 2015-06-15 2018-09-07 Donecle Systeme et procede d'inspection automatique de surface
GB2539661B (en) 2015-06-22 2019-06-26 Q Bot Ltd Robotic Vehicle
DE102015221095A1 (de) * 2015-10-28 2017-05-04 Airbus Defence and Space GmbH Verfahren zum Nachweis von Oberflächenrückständen auf Bauteilen mittels UV-Bestrahlung
DE102015120660A1 (de) * 2015-11-27 2017-06-01 Airbus Defence and Space GmbH Luftfahrzeuginspektionssystem
FR3046848B1 (fr) * 2016-01-14 2018-01-05 Donecle Procede et systeme de determination de la position d'un engin mobile
ITUA20161534A1 (it) * 2016-03-10 2017-09-10 Wpweb Srl Procedimento per analizzare un velivolo, relativo sistema di analisi di un velivolo e sistema di sghiacciamento e/o antigelo
US10053236B1 (en) * 2016-03-28 2018-08-21 Amazon Technologies, Inc. Automated aerial vehicle inspections
CN106027955A (zh) * 2016-04-27 2016-10-12 中国科学院上海高等研究院 一种用于辅助飞机检查与远程故障诊断的图像采集系统
US10272572B2 (en) * 2016-06-10 2019-04-30 The Boeing Company Remotely controlling robotic platforms based on multi-modal sensory data
US10220964B1 (en) * 2016-06-21 2019-03-05 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration validation before flight
US9823089B1 (en) 2016-06-21 2017-11-21 Amazon Technologies, Inc. Unmanned aerial vehicle sensor calibration as part of departure from a materials handling facility
US10170011B2 (en) * 2016-07-26 2019-01-01 International Business Machines Corporation Guide drones for airplanes on the ground
US10820574B2 (en) 2016-07-29 2020-11-03 International Business Machines Corporation Specialized contextual drones for virtual fences
US9987971B2 (en) 2016-07-29 2018-06-05 International Business Machines Corporation Drone-enhanced vehicle external lights
CN106370602A (zh) * 2016-08-31 2017-02-01 纳路易爱姆斯株式会社 利用无人机的大型结构物超声波检查方法及系统
JP6698479B2 (ja) * 2016-09-06 2020-05-27 シャープ株式会社 自律走行装置
KR102559745B1 (ko) * 2016-10-13 2023-07-26 엘지전자 주식회사 공항 로봇 및 그를 포함하는 공항 로봇 시스템
US20180114302A1 (en) * 2016-10-23 2018-04-26 The Boeing Company Lightning strike inconsistency aircraft dispatch mobile disposition tool
WO2018104794A1 (en) * 2016-12-07 2018-06-14 Abb Schweiz Ag System and method for handling and charting data obtained by an inspection vehicle
US10445873B2 (en) * 2017-02-23 2019-10-15 The Boeing Company Automated validation of condition of assembly
US10497110B2 (en) * 2017-02-23 2019-12-03 The Boeing Company Identifying a pathway for condition of assembly validation
US10954001B2 (en) 2017-02-23 2021-03-23 The Boeing Company Hanging clamped supports for aircraft
FR3063488B1 (fr) * 2017-03-06 2019-08-23 Nexeya France Dispositif d'eclairage de l'interieur d'un aeronef
US10682677B2 (en) * 2017-05-10 2020-06-16 General Electric Company System and method providing situational awareness for autonomous asset inspection robot monitor
FR3067634B1 (fr) * 2017-06-15 2020-10-23 Donecle Plateforme de commande et de suivi des inspections de surfaces d'objets predetermines par des robots d'inspection et systeme d'inspection mettant en oeuvre une telle plateforme
KR101867553B1 (ko) * 2017-07-21 2018-06-14 노진석 드론 관리 장치 및 방법
CN107688345B (zh) * 2017-08-29 2019-04-19 平安科技(深圳)有限公司 屏幕状态自动检测机器人、方法及计算机可读存储介质
DE102017120923A1 (de) 2017-09-11 2019-03-14 Airbus Operations Gmbh Verbindungseinrichtung und Installationssystem
US10810501B1 (en) 2017-10-20 2020-10-20 Amazon Technologies, Inc. Automated pre-flight and in-flight testing of aerial vehicles by machine learning
US20190155237A1 (en) * 2017-11-21 2019-05-23 Deere & Company Remote control of multiple different machines
US10989795B2 (en) * 2017-11-21 2021-04-27 Faro Technologies, Inc. System for surface analysis and method thereof
US11486697B1 (en) * 2017-12-29 2022-11-01 II John Tyson Optical structural health monitoring
US20200340802A1 (en) * 2017-12-29 2020-10-29 II John Tyson Optical structural health monitoring
US10346969B1 (en) 2018-01-02 2019-07-09 Amazon Technologies, Inc. Detecting surface flaws using computer vision
CN108363409A (zh) * 2018-02-01 2018-08-03 沈阳无距科技有限公司 无人机巡检控制方法、装置及系统
US11238675B2 (en) * 2018-04-04 2022-02-01 The Boeing Company Mobile visual-inspection system
US10830889B2 (en) 2018-05-02 2020-11-10 Faro Technologies, Inc. System measuring 3D coordinates and method thereof
US10800550B2 (en) * 2018-06-21 2020-10-13 The Boeing Company Positioning enhancements to localization process for three-dimensional visualization
US10974396B2 (en) * 2018-06-22 2021-04-13 Southwest Research Institute Robotic system for surface treatment of vehicles
CN112512763A (zh) * 2018-08-08 2021-03-16 索尼公司 控制装置、控制方法和程序
CN109079795B (zh) * 2018-09-25 2023-10-20 广州供电局有限公司 具有视觉识别功能的巡检机器人
CN109358058B (zh) * 2018-10-10 2021-11-19 中石化(洛阳)科技有限公司 管道状态检测系统和方法
CN109856471A (zh) * 2018-12-09 2019-06-07 北京航天计量测试技术研究所 一种多参数接口匹配和验证系统
US11220356B2 (en) * 2019-01-02 2022-01-11 The Boeing Company Non-destructive inspection using unmanned aerial vehicle
US10611497B1 (en) 2019-02-18 2020-04-07 Amazon Technologies, Inc. Determining vehicle integrity using vibrometric signatures
US10861164B1 (en) 2019-03-28 2020-12-08 Amazon Technologies, Inc. Visually determining vibrometric behavior of aerial vehicles
CN110027729A (zh) * 2019-05-09 2019-07-19 西北工业大学 一种无人机动力装置系统快速检测方法
US11079303B1 (en) * 2019-06-11 2021-08-03 Amazon Technologies, Inc. Evaluating joints using vibrometric signatures
GB2584914A (en) * 2019-06-28 2020-12-23 Airbus Operations Ltd Autonomous mobile aircraft inspection system
JP7002501B2 (ja) * 2019-07-29 2022-01-20 Ihi運搬機械株式会社 無人飛行機の状態確認装置および状態確認方法
CN110641732A (zh) * 2019-09-29 2020-01-03 蚌埠和能信息科技有限公司 一种两自由度微型飞行器测试平台
CN110654571B (zh) * 2019-11-01 2023-10-20 西安航通测控技术有限责任公司 一种飞机蒙皮表面缺陷无损检测机器人系统及方法
US11794222B1 (en) 2019-11-27 2023-10-24 Amazon Technologies, Inc. Passive brushes for cleaning surfaces in the presence of acoustic excitation
US11461929B2 (en) * 2019-11-28 2022-10-04 Shanghai United Imaging Intelligence Co., Ltd. Systems and methods for automated calibration
CN111050079B (zh) * 2019-12-27 2021-09-14 成都睿铂科技有限责任公司 一种基于无人机的航拍方法
CN111634442B (zh) * 2020-06-03 2022-07-29 西北工业大学 一种用于飞机装配质量检测的悬挂式机器人结构
CN111846275B (zh) * 2020-07-10 2021-11-23 杭州天为航空技术服务有限公司 一种航空器清洁管理系统
CN111830872B (zh) * 2020-07-17 2022-02-25 珠海格力智能装备有限公司 机器人的控制方法、装置、存储介质和处理器
CN111942615A (zh) * 2020-07-31 2020-11-17 中国民航大学 基于机器视觉的室外环境下飞机外表面雷击状况检查车
CN111965259B (zh) * 2020-08-19 2021-05-07 深圳职业技术学院 基于声波的故障检测及巡检系统
CN113844675A (zh) * 2020-12-30 2021-12-28 上海飞机制造有限公司 一种检测系统和控制方法
CN112884733A (zh) * 2021-02-09 2021-06-01 北京首都机场节能技术服务有限公司 一种用于对飞行器的表面进行缺陷识别的方法和系统
US11752631B2 (en) 2021-02-09 2023-09-12 Ford Global Technologies, Llc Fleet inspection and maintenance mobile robot
US20220309644A1 (en) * 2021-03-24 2022-09-29 The Boeing Company Automated assessment of aircraft structure damage
CN113264200A (zh) * 2021-04-14 2021-08-17 泛博科技(中山)有限公司 一种航空智能自检多场景通用无人车系统
CN113740185B (zh) * 2021-08-16 2024-05-03 中国飞机强度研究所 一种飞机疲劳试验中飞机内舱结构损伤巡检架构
US20230415347A1 (en) * 2022-06-21 2023-12-28 Wilder Systems Inc. Auto-locating and locking movable gantry system
KR102613005B1 (ko) * 2023-06-07 2023-12-11 (주)위플로 비행체 점검용 로봇 및 이를 이용한 비행체의 점검 방법

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6907799B2 (en) * 2001-11-13 2005-06-21 Bae Systems Advanced Technologies, Inc. Apparatus and method for non-destructive inspection of large structures
US8812154B2 (en) * 2009-03-16 2014-08-19 The Boeing Company Autonomous inspection and maintenance
DE102009015648B4 (de) * 2009-03-31 2011-05-05 Lufthansa Technik Ag Anordnung zur Durchführung von Wartungsarbeiten an Flugzeugen
EP2621811A2 (de) * 2010-09-29 2013-08-07 Aerobotics, Inc. Neuartige systeme und verfahren zur zerstörungsfreien prüfung von flugzeugen
US8982207B2 (en) * 2010-10-04 2015-03-17 The Boeing Company Automated visual inspection system
US8833169B2 (en) * 2011-12-09 2014-09-16 General Electric Company System and method for inspection of a part with dual multi-axis robotic devices

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2015059241A1 *

Also Published As

Publication number Publication date
FR3012425A1 (fr) 2015-05-01
US20160264262A1 (en) 2016-09-15
WO2015059241A1 (fr) 2015-04-30
CN105873825A (zh) 2016-08-17
FR3012425B1 (fr) 2017-03-24
US9952593B2 (en) 2018-04-24
CN105873825B (zh) 2018-11-09

Similar Documents

Publication Publication Date Title
WO2015059241A1 (fr) Robot collaboratif d'inspection visuelle d'un aéronef
US10934023B2 (en) Image recognition for vehicle safety and damage inspection
EP3308232B1 (de) System und verfahren zur automatischen prüfung von oberflächen
CN110418957B (zh) 对具有运行机构的设施进行状态监控的方法和装置
JP5873500B2 (ja) 自動目視検査システム
US11494888B2 (en) Work terminal, oil leakage detection apparatus, and oil leakage detection method
FR2945630A1 (fr) Procede et systeme d'inspection a distance d'une structure
Liu et al. Framework for automated UAV-based inspection of external building façades
JP2022151763A (ja) 航空機構造体の損傷の自動評価
EP1936330B1 (de) Verfahren und System zur Verarbeitung und Visualisierung von Bildern der Umgebung eines Luftfahrzeugs
EP2193477B1 (de) Verfahren und system zur flugzeugrollunterstützung
AU2021361004A9 (en) Passive hyperspectral visual and infrared sensor package for mixed stereoscopic imaging and heat mapping
Wojciechowski et al. Detection of Critical Infrastructure Elements Damage with Drones
AU2021278260B2 (en) Method for the machine-based determination of the functional state of support rollers of a belt conveyor system, computer program and machine-readable data carrier
US11834065B2 (en) System, method, and computer program product for detecting road marking points from LiDAR data
US20240054621A1 (en) Removing reflection artifacts from point clouds
CA3162616A1 (fr) Procede de controle d'une pluralite de stations de controle d'un aeronef et systeme de controle associe
US20240054731A1 (en) Photogrammetry system for generating street edges in two-dimensional maps
US20240054789A1 (en) Drone data collection optimization for evidence recording
FR3067634A1 (fr) Plateforme de commande et de suivi des inspections de surfaces d'objets predetermines par des robots d'inspection et systeme d'inspection mettant en oeuvre une telle plateforme
CN117647526A (zh) 一种风力发电机叶片表面缺陷检测方法及装置
CN116183609A (zh) 一种基于无人机光谱的管道缺陷的检测方法及无人机
CN116476888A (zh) 一种地铁隧道病害识别检测装置及方法

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160513

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: AIRBUS (SAS)

17Q First examination report despatched

Effective date: 20181120

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200819