JP2019051585A - Method for controlling location of end effector of robot using location alignment feedback

Info

Publication number
JP2019051585A
Authority
JP
Japan
Prior art keywords
effector
target
robot
distance
state machine
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2018110992A
Other languages
Japanese (ja)
Inventor
James J. Troy
Gary E. Georgeson
Scott W. Lea
Daniel James Wright
Original Assignee
The Boeing Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US15/623,304 (US10625427B2)
Application filed by The Boeing Company
Publication of JP2019051585A

Classifications

    • B25J9/16 Programme controls
    • B25J13/088 Controls for manipulators by means of sensing devices with position, velocity or acceleration sensors
    • B25J11/00 Manipulators not otherwise provided for
    • B25J19/023 Optical sensing devices including video camera means
    • B25J9/10 Programme-controlled manipulators characterised by positioning means for manipulator elements
    • B25J9/12 Programme-controlled manipulators characterised by electric positioning means for manipulator elements
    • B25J9/162 Mobile manipulator, movable base with manipulator arm mounted on it
    • B25J9/1692 Calibration of manipulator
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • G01B11/002 Measuring arrangements characterised by the use of optical means for measuring two or more coordinates
    • G01C3/08 Optical rangefinders using electric radiation detectors
    • G01J5/04 Casings; Mountings (radiation pyrometry details)
    • G01N25/72 Investigating presence of flaws by thermal means
    • G01J2005/0081 Thermography
    • G05B2219/39024 Calibration of manipulator
    • G05B2219/40298 Manipulator on vehicle, wheels, mobile
    • G05B2219/45066 Inspection robot
    • G05B2219/45071 Aircraft, airplane, ship cleaning manipulator, paint stripping
    • Y10S901/01 Mobile robot
    • Y10S901/30 End effector
    • Y10S901/44 End effector inspection

Abstract

PROBLEM TO BE SOLVED: To provide a system and method for automating alignment of a robotic end effector.
SOLUTION: An alignment process involves computing offset distances and rotation angles to guide a robotic end effector to a desired location relative to a target object. The relative alignment process enables the development of robotic motion path planning applications that minimize online and offline motion path script creation, resulting in easier-to-use robotic applications. The relative alignment process can be used together with an independent (off-board) method for target object coordinate system registration. One example implementation uses a finite state machine configuration to control a holonomic motion robotic platform with a rotational end effector used for grid-based scan acquisition for non-destructive inspection.
SELECTED DRAWING: Figure 5

Description

  The present disclosure relates generally to systems and methods for controlling the position of a robot's end effector relative to a target object during implementation of automated procedures such as non-destructive inspection (NDI) and other maintenance operations.

  Existing robot programming techniques require the input of individual robot motion sequences for motion control processes (e.g., motion scripts), typically created using online teaching methods or offline programming methods. However, these methods usually demand a high level of expertise and effort from the robot programmer.

  Standard open-loop programming techniques can also run into problems if the alignment between the robot and the target object is compromised. This may be the case for ground-based robots that move across a surface that is uneven or that has cracks, holes, or other discontinuities. Existing open-loop robotic programming techniques cannot adapt to misalignment (e.g., dead reckoning odometry alone is not sufficient, because errors accumulate over time). To address this situation, some approaches use external hardware that provides continuous closed-loop feedback regarding scanner position. One example is motion capture (see U.S. Patent Nos. 7,643,893 and 8,892,252), in which multiple cameras are used to track retroreflective optical targets attached to the moving object and to the target object. These types of solutions require external hardware to be set up prior to use, which can be a problem for some use cases. Other solutions use a contact-based alignment process to help align the end effector with the workpiece.

  The subject matter disclosed in detail below is directed to systems and methods for controlling the position of an end effector of a ground-based robotic mobile platform relative to a target object during performance of an automated task. In one ground-based robotic mobile platform, the robot includes a robot arm having a wrist at its distal end. An end effector is attached to the wrist, and a tool (for example, an NDI sensor) is attached to the end effector. Typically, the robot controller controls the various motors of the robot to move the end effector to a position where the tool attached to the end effector can scan the target object. (As used herein, the term "position" includes both placement and orientation.)

  Specifically, the subject matter disclosed herein concerns systems and methods for automating the alignment of a robot end effector using real-time data from a plurality of distance sensors to control relative translational and rotational movements. In one embodiment, the alignment process includes calculating an offset distance and a rotation angle to guide the robot end effector to a desired position relative to the target object. (As used herein, the "target offset distance" is the desired (i.e., target) distance between the distal end of the tool attached to the end effector and the target surface. In the case of an NDI process, the system operator specifies the target offset distance to be achieved between the NDI sensor and the target object.) This relative alignment process enables the development of robot motion path planning applications that minimize online and offline motion path script creation, resulting in robot applications that are easier to use. Also disclosed below is the integration of this relative alignment process with an independent (off-board) method for registration of the coordinate system of the target object. An exemplary embodiment is provided that uses a finite state machine configuration to control a holonomic motion robot platform with a rotating end effector used for grid-based scan acquisition for NDI. The finite state machine control application takes high-level goals, sensor feedback, constraints, and trigger conditions as inputs and generates commands that are sent to the robot motion controller in real time.

  The process described below simplifies the programming of the robot motion path by eliminating many of the open-loop steps involved in creating a typical robot path, and improves the robustness of the overall system by feeding back, in real time, measurement data related to the placement and orientation of the end effector relative to the target object. The disclosed process significantly speeds up and simplifies robot path planning by using multiple distance sensors to align the placement and orientation of the end effector relative to the target object. It also provides the ability to adapt to environmental changes, improving process robustness. Optionally, a localization process may be used to register the NDI scan data in the coordinate system of the target object and to record location information for archival purposes in maintenance and repair applications. The proposed system also allows sensor feedback to be used for robot motion planning without the need for online or offline motion programming, reducing path planning time and the need for specially trained operators.

  For purposes of illustration, systems and methods for inspecting a fuselage made of composite material (e.g., a composite laminate made of fiber-reinforced plastic) using active infrared thermography will be described in detail. Active (i.e., pulsed) infrared thermography is one method used in the aviation and power generation industries to non-destructively evaluate structural components for subsurface defects. However, the end effector alignment concept disclosed herein is not limited to applications in which an infrared thermographic scanner is mounted on the end effector. The alignment process disclosed herein may also be used to align other types of NDI sensors (such as ultrasonic transducer arrays or eddy current sensors) or non-NDI tools.

  According to one embodiment using an infrared thermography (IRT) scanner, the system provides a compact and relatively low-cost platform, comprising a vertically extendable arm with a rotating wrist and a modular tool mount, alignment sensor elements, and a minimal ground contact area (footprint), that can reach the required areas around the fuselage. According to a preferred embodiment, the vertically extendable arm is rigidly connected to a holonomic motion base platform, and an IRT scanner, comprising an infrared camera and at least one flash lamp, is mounted on the end effector.

  Various embodiments of systems and methods for controlling the position of a robot's end effector relative to a target object are described in detail below; one or more of these embodiments may be characterized by one or more of the following aspects.

  One aspect of the subject matter disclosed in detail below is a method for controlling the position of an end effector of a robotic mobile platform relative to a target object. The method comprises moving the end effector to a first position (e.g., a starting position) and enabling the robot controller to perform operations specified by a finite state machine control application, the operations comprising: acquiring distance data from first, second, and third distance sensors attached to the end effector while the end effector is in the first position, the acquired distance data representing the respective distances separating the first, second, and third distance sensors from respective areas on the surface of the target object; and using this distance data to align the end effector with the target object and to move the end effector from the first position to a first grid position. According to one embodiment, the alignment comprises rotating the end effector so that the axis of the end effector is perpendicular to the surface of the target object, and translating the end effector so that it is separated from the surface of the target object by a target offset distance.

  The method described in the preceding paragraph may further comprise calculating the coordinates of the position of the external tracking system in the coordinate system of the target object, and subsequently aiming a laser beam produced by the external tracking system at a specified coordinate position on the target surface, thereby forming a laser spot. In this example, moving the end effector to the first position comprises driving the robotic mobile platform so as to align the laser spots produced by the first, second, and third laser rangefinders around the laser spot produced by the external tracking system. The method may further comprise using the external tracking system, while the end effector is at the first grid position, to calculate the coordinates of a tool attached to the end effector in the coordinate system of the target object.

  Another aspect of the subject matter disclosed in detail below is a robotic mobile platform comprising: a self-propelled mobile base platform comprising a plurality of rotating elements and a plurality of motors respectively coupled to the rotating elements; a vertically extendable mast supported by the base platform; an arm having a proximal end fixedly coupled to the vertically extendable mast; an end effector pivotally coupled to the distal end of the arm; a non-transitory tangible computer-readable storage medium storing a finite state machine control application; first, second, and third distance sensors attached to the end effector and configured to acquire distance data representing the respective distances separating them from respective areas on the surface of a target object; and a controller configured to control the operation of the first, second, and third distance sensors and to move the end effector relative to the ground in accordance with commands generated by the finite state machine control application, the finite state machine control application comprising instructions executable by the controller for generating commands to move the end effector using the distance data acquired by the first, second, and third distance sensors.

  A further aspect of the subject matter disclosed in detail below is a method for controlling the position of an end effector of a robotic mobile platform relative to a target object, in which the robot controller performs operations specified by a finite state machine control application, the operations comprising: (a) moving the end effector to a nominal, unaligned position not in contact with the surface of the target object, in accordance with pre-stored grid pattern data representing a grid pattern; (b) acquiring distance data from first, second, and third distance sensors attached to the end effector while the end effector is in the unaligned position, the distance data representing the respective distances separating the first, second, and third distance sensors from respective areas on the surface of the target object; (c) using this distance data to align the end effector with the target object, moving it from the unaligned position to an aligned position; (d) activating a tool attached to the end effector while the end effector is in the aligned position; and (e) repeating steps (a) through (d) for each of a plurality of aligned positions of the grid pattern. According to one embodiment, the alignment comprises rotating the end effector so that the axis of the end effector is perpendicular to the surface of the target object, and translating the end effector so that it is separated from the surface of the target object by a target offset distance.

  Other aspects of systems and methods for controlling the position of a robot end effector are disclosed below.

  The features, functions, and advantages discussed in the preceding section can be implemented independently in various embodiments or may be combined in yet other embodiments. For the purpose of illustrating the above and other aspects, various embodiments are described below with reference to the drawings. None of the drawings briefly described in the following section are drawn to scale.

FIG. 1 is a block diagram identifying some components of a system for thermographic imaging of a fuselage.
FIG. 2 is a side view of a ground-based robotic NDI mobile platform, according to one embodiment.
FIG. 3 is a side view of a ground-based robotic NDI mobile platform according to another embodiment.
FIG. 4 is an exploded view of several components of the robotic NDI mobile platform shown in FIG. 3.
FIG. 5 is a perspective view of a ground-based robotic NDI mobile platform in the process of scanning a curved composite workpiece (the laser rangefinders attached to the robot end effector are not shown in FIG. 5, but are shown in FIG. 6).
FIG. 6 is a side view of a portion of the robotic NDI mobile platform shown in FIG. 5, including the end effector and three laser rangefinders attached to the end effector.
FIG. 7 is a perspective view of an infrared thermographic scanner mounted on the end effector of the robotic NDI mobile platform shown in FIG. 5.
FIG. 8 is a front view of the infrared thermographic scanner shown in FIG. 7.
FIG. 9 is a block diagram identifying some components of an alignment system, according to one embodiment.
FIG. 10 is a diagram representing a 3x2 scan pattern for IRT inspection of a large workpiece.
FIG. 11 is a flow diagram identifying some steps of a method of non-destructive inspection using an end effector alignment process according to one embodiment.
FIGS. 11A and 11B together form a flowchart identifying some of the steps performed by the finite state machine of the alignment process used in the method described at a higher level in FIG. 11.
FIG. 12 represents the measurement of the boundary of a scan area using a local positioning system (LPS).
FIG. 13 is a diagram representing the initial alignment of the robotic system using the LPS.
FIG. 14 is a perspective view of an LPS performing a robot-to-part localization process, according to one embodiment.
FIGS. 15A-15C are front, side, and top views, respectively, of three laser rangefinders arranged in a triangular pattern on a common plane and directed at respective spots on the surface of a target object, the laser rangefinders being separated from those spots by respective distances.
FIG. 16 is a top view of a holonomic motion base platform having four mecanum wheels, with various dimensions represented by double-headed arrows.
FIG. 17 is a block diagram identifying some components of a system for infrared thermographic inspection of a large composite structure, according to one embodiment.

  In the following, reference is made to the drawings. In the various drawings, similar elements are provided with the same reference numerals.

  The process described in detail below provides a method for automating tasks such as NDI scanning of large surfaces, such as an aircraft fuselage. The process also provides an easy-to-use, high-level interface that allows scan sequences to be defined with minimal instruction, thereby eliminating the need for customized path programming. The system described below has on-board feedback sensors for relative alignment to a target object and does not require continuous feedback from external sensors. This solution includes techniques for acquiring distance measurement data, for computing distance and angular alignment, and for controlling the alignment process with a finite state machine control application.

  In addition, an independent localization process is included to provide three-dimensional (3D) position data for the scan area, defined in the local coordinate system of the target object. This enables post-processing of scan data aligned with other reference information, such as a 3D computer-aided design (CAD) model. Initial alignment of the scanning system with the target object can also be performed using the localization process.

  For purposes of illustration, systems and methods for non-destructive inspection of fuselage parts made of composite material (e.g., composite laminates made of fiber-reinforced plastic) using active thermography will now be described in detail. However, not all features of an actual implementation are described herein. Those skilled in the art will appreciate that, in developing any actual implementation, numerous implementation-specific decisions must be made to achieve the developer's specific goals, such as compliance with system-related and business-related constraints that vary from one implementation to another. Moreover, it will be appreciated that such development effort might be complex and time consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.

  Infrared thermography methods and apparatus allow for non-destructive testing of materials to detect defects, material property variations, or material coating or layer thickness differences. Infrared imaging can detect local variations in thermal diffusivity or thermal conductivity at or below the surface of the material. Infrared thermography can be used for metals such as ferrous materials including steel, or non-metallic materials such as plastics, ceramics, or composite materials.

  Active thermography is used to non-destructively evaluate a sample for subsurface defects. It is useful for finding internal bond discontinuities, delaminations, voids, inclusions, and other structural defects that are not detectable by visual inspection of the sample. In general, active thermography involves heating or cooling the sample to create a difference between the sample temperature and the ambient temperature, and then observing the infrared thermal signature emitted by the sample as its temperature returns to ambient. An infrared camera is used because it can detect anomalies in the cooling behavior caused by subsurface defects that block the diffusion of heat from the surface of the sample into its interior. Specifically, these defects cause the surface immediately above them to cool at a different rate than the surrounding defect-free areas. As the sample cools, the infrared camera monitors and records a time sequence of images of the surface temperature, producing a record of the change in surface temperature over time.
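
As an illustration of how such a recorded time sequence might be screened, the following minimal sketch exploits the fact that idealized one-dimensional cooling of the excess surface temperature is a straight line in log-log space, so pixels whose fitted cooling slope departs from the median slope of the surface are candidate defect indications. This is a hedged illustration of the principle described above, not the processing algorithm prescribed by this disclosure; the array shapes are assumptions.

```python
import numpy as np

def cooling_anomaly_map(frames, times, ambient):
    """Flag pixels whose cooling deviates from the global trend.

    frames : (T, H, W) array of surface temperatures after the flash
    times  : (T,) acquisition times (s) after the flash
    ambient: pre-flash ambient temperature
    """
    # Excess temperature above ambient; ideal 1-D cooling is a straight
    # line in log-log space, so fit log(dT) vs log(t) per pixel.
    dT = np.clip(frames - ambient, 1e-3, None)
    logt = np.log(times)[:, None]
    logT = np.log(dT).reshape(len(times), -1)
    A = np.hstack([logt, np.ones_like(logt)])
    coeffs, *_ = np.linalg.lstsq(A, logT, rcond=None)  # slope, intercept per pixel
    slopes = coeffs[0].reshape(frames.shape[1:])
    # Defective regions cool at a different rate, so their fitted slope
    # departs from the median slope of the (mostly sound) surface.
    return np.abs(slopes - np.median(slopes))
```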

  Typically, the material surface is heated using a flash lamp and, after a set period of time, a thermal image of the heated surface is captured. Thermographic heating systems typically use xenon flash tubes and off-the-shelf photographic power supplies to excite the sample. The infrared camera images the infrared spectral radiance from the surface of the material, which is representative of the surface temperature of the material. Differences in the surface temperature of the material indicate differing thermal properties, and such variations in the thermal properties of the material indicate a possible defect or the presence of foreign material.

  Knowing the exact position of the infrared camera's field of view on the surface of the fuselage makes it possible to retrieve the structural thickness and stack-up arrangement needed for processing the infrared signatures.

  FIG. 1 is a block diagram identifying some components of a system for thermographic imaging of a fuselage 2. The infrared thermographic inspection system includes a digital infrared camera 4 having a lens that is directed through a camera lens aperture 5 in a hood 12. The hood 12 is designed to form an enclosure that is placed close to the surface under inspection. A pair of flash lamps 6a and 6b are arranged inside the hood 12 in a fixed spatial relationship to it. The flash lamps 6a and 6b produce flashes in response to a trigger signal from an infrared thermography computer 8, which also controls the operation of the infrared camera 4. One example of a type of infrared camera 4 suitable for use in at least some of the embodiments disclosed herein includes a focal plane array (FPA) device configured to function as a spectroradiometer. Further details regarding other components that may be included in a flash lamp assembly of the type comprising an infrared camera, a pair of flash lamps, and a hood can be found, for example, in U.S. Patent No. 7,186,981.

  According to one method of thermographic inspection, the flash lamps 6a and 6b are first triggered to transfer heat to the composite material of the fuselage 2. Preferably, during cooling of the composite material, the infrared camera 4 is triggered periodically to capture successive digital images of the varying spectral radiance of the heated portion of the fuselage 2. Preferably, the thermally excited (heated) region of the composite material being inspected cools monotonically after the excitation source is removed, until the sample reaches thermal equilibrium with its surroundings. The digital infrared imaging data captured by the infrared camera 4 is received by the infrared thermography computer 8 for processing. The infrared thermography computer 8 is programmed to process the infrared imaging data to detect and locate material edge boundaries, foreign material beneath the surface, or other material anomalies such as out-of-tolerance delaminations or voids. The infrared imaging data may be displayed on a display monitor (not shown in FIG. 1), which may be integrated with or separate from the infrared thermography computer 8.

  According to the embodiment shown in FIG. 1, the infrared thermography computer 8 has data acquisition functionality for converting the infrared imaging data obtained from the infrared camera 4 into a format that can be analyzed and mathematically manipulated by the infrared thermography computer 8. An optional data acquisition module 10 may be integrated with the infrared thermography computer 8 or may be separate from it (as shown in FIG. 1). The data acquisition module 10 can be used when the surface of the composite structure is too large to fit in a single image frame, in which case the infrared camera 4 captures multiple spatially offset images that are combined to generate a composite image of the surface of the composite structure. The infrared thermography computer 8 may be further programmed to analyze the infrared imaging data captured by the infrared camera 4. Specifically, the time history of the surface temperature response of the fuselage 2 as it returns to room temperature can be analyzed to detect the presence of defects in the composite material.

  In the context of the particular application of fuselage inspection, a non-destructive inspection system may include means for scanning the fuselage skin from a vantage point outside the fuselage. In the embodiments disclosed below, the external scanning means comprise a robot equipped with an infrared camera. The robot has a movable robot base and a robot arm having a proximal end coupled to the robot base. The robot base may be a mobile holonomic crawler vehicle. An infrared thermographic scanner is coupled to the distal end of the robot arm. The infrared thermographic scanner includes an infrared camera and two or more flash lamps mounted inside a hood. The hood may be sized to cover a square area on the outer surface of the fuselage. Infrared imaging data acquired from adjacent square areas can be stitched together based on measurements of the respective positions of the robot base in a local coordinate system. The stitching process may be performed in real time or at a later time.

  Various embodiments of an NDI system configured to use the position alignment feedback concept disclosed herein will now be described in detail. According to one embodiment, the NDI system is an automated platform with an end effector that can reach the top and bottom centerlines of the fuselage from either side of the aircraft. This NDI system includes a holonomic motion base platform with mecanum wheels, a vertically extendable mast supported on the base platform, a pivoting end effector, proximity sensors, and support for multiple types of NDI devices mounted on the end effector. The vertical support mast, with the pivoting end effector on an extension arm, allows inspection of the full height of the aircraft fuselage. The holonomic motion base platform allows the robot to reposition the NDI scanner unit quickly and efficiently along the length of the fuselage. Motion control software with distance sensor feedback enables automatic capture of partially overlapping, grid-based scans. Reference position data is also captured to register the NDI scan results in the appropriate aircraft coordinate system. Whether in automated or manual control mode, the system is relatively easy to set up and use. The system may be configured to accept various types of NDI units mounted on the end effector, including eddy current sensors, ultrasonic sensors, and infrared thermography (IRT) sensors.

  FIG. 2 is a side view of a ground-based robotic NDI mobile platform 200 according to one embodiment. The platform comprises a holonomic motion base platform 204, an infrared thermography (IRT) scanner 214, and an automated scanner support apparatus carried on the holonomic motion base platform 204, operating under the control of a robot controller (not shown). The automated scanner support apparatus includes a vertically extendable mast 206 that can be extended and retracted as needed to change the elevation of the IRT scanner 214. The vertically extendable mast 206 comprises: a first mast portion 206a having a linear axis and one end fixedly coupled to the holonomic motion base platform 204; a second mast portion 206b having a linear axis, slidably coupled to the first mast portion 206a so as to slide along a line parallel to the axis of the first mast portion 206a; and a third mast portion 206c having a linear axis, slidably coupled to the second mast portion 206b so as to slide along a line parallel to the axis of the second mast portion 206b. According to one embodiment, the vertical extension of the mast is controlled by a single motor and a cable-pulley system.

  The ground-based robotic NDI mobile platform 200 shown in FIG. 2 further includes a four-bar linkage arm mechanism 208, which controls the placement and orientation of an end effector 212 pivotally coupled to its distal end. The drive link of the four-bar linkage 208 is driven to rotate relative to the third mast portion 206c by a motor-driven lead screw or hydraulic cylinder 210. The IRT scanner 214 is attached to, and rotates with, the end effector 212. An IRT shroud 216 surrounds the IRT scanner 214 to isolate the gap between the IRT scanner 214 and the curved workpiece 202 (e.g., a fuselage) from the surrounding environment.

  FIG. 3 is a side view of a ground-based robotic NDI mobile platform 220 according to another embodiment. This embodiment includes a vertically extendable mast 206, a rigid extension arm 222 fixedly coupled to the third mast portion 206c, and an end effector 224 pivotally coupled to the distal end of the rigid extension arm 222. FIG. 4 is an exploded view of some components of the robotic NDI mobile platform 220 shown in FIG. 3. In this embodiment, the extension height and the pitch of the end effector are independently and programmably controlled. The pitch rotation of the end effector 224 can be driven by a position-controlled motor 246 (see FIG. 4) through a non-backdrivable gearbox (not shown).

  FIG. 5 is a perspective view of a prototype ground-based robotic NDI mobile platform 230 in the process of scanning a curved workpiece 202 made of composite material. An IRT scanner 214 is attached to the end effector 224, which is pivotable about a pitch axis under the control of a robot controller 80. The end effector 224 is pivotally coupled to a rigid extension arm 232 that is fixedly coupled to the uppermost mast portion of a vertically extendable mast 206. The IRT scanner 214 transmits the acquired data to an infrared thermography computer (not shown in FIG. 5) via an electrical cable 242. The robotic NDI mobile platform 230 also includes a warning light 244 that is switched on when the system is enabled.

  According to one proposed embodiment, the holonomic motion base platform 204 uses four mecanum wheels, with one pair of type A wheels placed on one diagonal and one pair of type B wheels on the other diagonal. Type A and type B mecanum wheels differ in that the tapered rollers of the type A wheels are oriented at a different angle than the tapered rollers of the type B wheels. Each mecanum wheel may be driven by its own independently controllable stepper motor. A vehicle with mecanum wheels can be moved in any direction and turned in place by controlling the speed and direction of rotation of each wheel. For example, rotating all four wheels in the same direction at the same speed produces forward or backward motion; rotating the wheels on one side at the same speed as, but in the opposite direction to, the wheels on the other side produces rotation of the vehicle; and rotating the type A wheels at the same speed as, but in the opposite direction to, the type B wheels produces sideways motion. The holonomic motion base platform 204 moves under the control of an onboard control computer (i.e., the robot controller). A holonomic motion base platform with suitable mecanum wheels is described in U.S. Patent No. 9,410,659, the disclosure of which is incorporated herein by reference in its entirety.
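
For readers unfamiliar with mecanum kinematics, the sketch below shows the standard inverse kinematics that turn a commanded platform velocity into individual wheel speeds. The wheel geometry values are placeholders, not dimensions taken from this disclosure, and the wheel ordering and sign conventions are assumptions.

```python
import numpy as np

# Half wheelbase (L) and half track width (W), in meters; placeholder
# values, not dimensions from this disclosure.
L, W = 0.30, 0.25
R = 0.075  # wheel radius (m), also a placeholder

def wheel_speeds(vx, vy, omega):
    """Standard mecanum-wheel inverse kinematics.

    vx    : desired forward velocity (m/s)
    vy    : desired sideways velocity (m/s)
    omega : desired rotation rate about the vertical axis (rad/s)
    Returns angular speeds (rad/s) for the
    [front-left, front-right, rear-left, rear-right] wheels.
    """
    k = L + W
    return np.array([
        vx - vy - k * omega,  # front-left
        vx + vy + k * omega,  # front-right
        vx + vy - k * omega,  # rear-left
        vx - vy + k * omega,  # rear-right
    ]) / R

# Pure sideways translation: wheels on each diagonal spin together,
# and the two diagonals spin in opposite directions.
print(wheel_speeds(0.0, 0.2, 0.0))
```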

  According to one embodiment, a plurality of sensors (not shown in FIG. 5) are mounted around the perimeter of the holonomic motion base platform 204 to detect the presence of obstacles near particular areas of the vehicle. The motion controller uses this sensor data to prevent further motion in the direction associated with the triggered sensor, while motion in other directions remains possible. Candidate sensors include contact sensors, transmissive sensors, and proximity sensors. This collision avoidance system operates in a manner similar to that described in U.S. Patent No. 7,194,358.

  As described above, the position alignment feedback process disclosed herein uses distance sensors to determine the placement and orientation (i.e., position) of the IRT scanner 214 relative to the target object (e.g., workpiece 202). At least three non-collinear distance measurement devices can be used to calculate the relative position in real time. To reduce any possibility of scratching the surface of the target object, laser rangefinders were chosen over contact probes for use as the distance sensors. In addition to short-range distance and angle guidance, laser rangefinders offer the platform motion controller the advantage of longer-range feedback for general navigation purposes.
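
The geometric idea behind using three non-collinear distance measurements can be sketched as follows: each rangefinder's mounting point, beam direction, and measured distance yield a 3D point on the target surface, and the three points define the local surface plane, whose normal is the direction the end effector axis should assume. The function names and the choice of the end effector frame below are illustrative assumptions, not definitions from this disclosure.

```python
import numpy as np

def measured_points(mounts, directions, distances):
    """3-D surface points from each rangefinder's mounting position (3x3),
    unit beam direction (3x3), and measured distance (3,), all expressed
    in the end effector frame."""
    return mounts + directions * np.asarray(distances)[:, None]

def surface_normal(p1, p2, p3):
    """Unit normal of the plane through three non-collinear surface points."""
    n = np.cross(p2 - p1, p3 - p1)  # perpendicular to both in-plane edges
    return n / np.linalg.norm(n)
```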

  According to one embodiment, three laser rangefinders (not shown in FIG. 5) are attached to the end effector 224. FIG. 6 is a side view of a portion of the robotic NDI mobile platform shown in FIG. 5, including the end effector 224 and the three laser rangefinders attached to it. Only two of the three laser rangefinders (laser rangefinders 236 and 238) are visible in FIG. 6; the third (laser rangefinder 240) is visible in FIG. 7. As seen in FIG. 6, the first laser rangefinder 236 is attached to an L-shaped mounting plate 218a, which is in turn attached to the end effector 224. Similarly, the second laser rangefinder 238 is attached to an L-shaped mounting plate 218b (shown in FIGS. 6 and 7), which is attached to the end effector 224, and the third laser rangefinder 240 is attached to an L-shaped mounting plate 218c (shown in FIG. 7), which is likewise attached to the end effector 224.

  FIG. 7 is a perspective view of the IRT scanner 214 (with the shroud 216 removed) attached to the end effector 224, which is pivotally coupled to the rigid extension arm 232. As described above, laser rangefinders 236, 238, and 240 are attached to the end effector 224. As best seen in the front view of FIG. 8, laser rangefinder 236 is mounted at a height above the highest point of the hood 12 of the IRT scanner 214, while laser rangefinders 238 and 240 are mounted at heights below the lowest point of the hood 12 and are separated from each other by a distance. Preferably, laser rangefinders 236, 238, and 240 are located at the vertices of an isosceles triangle. In the arrangement shown in FIG. 15A, the distance separating laser rangefinders 238 and 240 (i.e., the base of the isosceles triangle) is a, and the distance separating laser rangefinder 236 from the midpoint between laser rangefinders 238 and 240 (i.e., the height of the isosceles triangle) is b.
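
Given the isosceles-triangle geometry just described, the misalignment angles can be estimated directly from the three distance readings: the difference between the two lower readings across the base a indicates yaw, and the difference between the upper reading and the average of the lower readings across the height b indicates pitch. A minimal sketch follows; the sign conventions and the averaging used for the standoff error are assumptions, not formulas quoted from this disclosure.

```python
import math

def alignment_offsets(d1, d2, d3, a, b, target_offset):
    """Pitch/yaw misalignment and standoff error from three rangefinder
    readings, under the geometry of FIG. 15A.

    d1            : distance read by the upper rangefinder (236)
    d2, d3        : distances read by the lower rangefinders (238, 240)
    a             : separation of the two lower rangefinders (triangle base)
    b             : distance from the upper rangefinder to the midpoint
                    of the base (triangle height)
    target_offset : desired standoff distance from the surface
    """
    yaw = math.atan2(d2 - d3, a)               # rotation about the vertical axis
    pitch = math.atan2(d1 - (d2 + d3) / 2, b)  # rotation about the horizontal axis
    standoff_error = (d1 + d2 + d3) / 3 - target_offset
    return pitch, yaw, standoff_error
```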

  The system shown in FIGS. 5-7 uses an on-board alignment system to determine the relative position (placement and orientation) offset of the end effector 224 with respect to the target object. This process uses the distance information from laser rangefinders 236, 238, and 240 to compute the relative position in real time. The system then feeds this data to the robot controller, which produces the desired motion of the end effector 224 based on the feedback (which may also include motion control of other parts of the robot).

  One form of control that this process enables is semi-automated control that assists the operator with certain aspects of alignment, for example managing the orientation of the end effector 224 so that it is always perpendicular to the surface of the target object, or always at a specified distance from that surface.

  FIG. 9 is a block diagram identifying some components of an alignment system according to one embodiment. Distance sensors 14 (e.g., laser rangefinders 236, 238, and 240) provide distance information to a computer 16 (e.g., a robot controller). The computer 16 is configured (e.g., programmed) to determine, based on the distance information received from the distance sensors 14, what movements are required to align the end effector 224 with the surface of the target object. These movements may include one or more of the following: moving the holonomic motion base platform 204 to a new position, extending or retracting the vertically extendable mast 206, and pivoting the end effector 224 about the pitch axis. The robotic NDI mobile platform includes a plurality of motors 20, each controlled by a respective motor controller 18. The computer 16 sends command signals to selected motor controllers 18 to produce the robot motions necessary to align the end effector 224 with the surface of the target object.

  Another form of control enabled by this process is fully automated motion control, in which an operator specifies a high-level goal, such as an m x n grid pattern, and the automated controller plans the motion based on that goal and feedback from the alignment system. For example, FIG. 10 shows a 3x2 scan pattern 22 for IRT inspection of a large workpiece. First, the IRT scanner acquires IRT data for scan area 26a. The IRT scanner then moves upward and stops at a position where it acquires IRT data for scan area 26b. To facilitate stitching of the scan results and to ensure there are no gaps in coverage, scan area 26b preferably overlaps scan area 26a slightly. The IRT scanner then moves to the right and stops at a position where it acquires IRT data for scan area 26c, and then moves downward and stops at a position where it acquires IRT data for scan area 26d. It subsequently moves to the right to acquire IRT data for scan area 26e, and then upward to acquire IRT data for scan area 26f. The scan path 28 of the IRT scanner during this process is represented by the arrows in FIG. 10.
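
The grid traversal of FIG. 10 is a serpentine (boustrophedon) path over overlapping scan areas. The sketch below generates such a sequence of nominal scan-area centers from a high-level column/row specification; the parameter names and the overlap fraction are illustrative assumptions.

```python
def serpentine_grid(cols, rows, width, height, overlap=0.1):
    """Scan-area centers for a cols x rows grid traversed in serpentine
    order, as in the 3x2 pattern of FIG. 10.

    width, height : dimensions of one scan area
    overlap       : fraction of each dimension shared with the neighbor
    """
    dx = width * (1.0 - overlap)   # horizontal step between columns
    dy = height * (1.0 - overlap)  # vertical step between rows
    path = []
    for c in range(cols):
        # Alternate the vertical direction on each column.
        rows_order = range(rows) if c % 2 == 0 else reversed(range(rows))
        for r in rows_order:
            path.append((c * dx, r * dy))
    return path

# 3 columns x 2 rows, matching FIG. 10: up the first column, across,
# down the second column, across, up the third column.
print(serpentine_grid(3, 2, width=0.5, height=0.5))
```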

  This alignment process provides an alternative to directly programming the robot's individual movements, and it allows the system to adapt to unexpected changes in the environment. It also provides collision avoidance for the end effector, achieving the desired placement and orientation relative to the target surface without the end effector contacting that surface.

  The automated process used here is based on a finite state machine control application that manages transitions from one state to another based on external inputs. This framework allows the system to generate responses based on multiple types of input and on the current state of the system. The various movements of the system needed to produce the automatically generated motion path plans and scanner control signals follow from satisfying the criteria required to transition from one operating mode to another. According to one embodiment, the finite state machine uses sensor feedback to trigger transitions between discrete sets of system states.
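
A finite state machine of this kind can be captured in a few lines. The sketch below models the GRID_MOVE / ALIGN / SCAN cycle described below in connection with FIGS. 11A and 11B; the context object `ctx` and its helper methods are assumed interfaces introduced for illustration, not APIs defined by this disclosure.

```python
from enum import Enum, auto

class State(Enum):
    GRID_MOVE = auto()  # drive to the next nominal grid location
    ALIGN = auto()      # servo pitch/yaw/standoff from rangefinder data
    SCAN = auto()       # trigger the NDI scanner and wait for completion
    DONE = auto()

def step(state, ctx):
    """One iteration of a finite-state-machine control loop modeled on
    FIGS. 11A-11B. `ctx` holds the grid counter, sensor readings, and
    scanner status."""
    if state is State.GRID_MOVE:
        if ctx.grid_count >= ctx.grid_total:
            return State.DONE
        ctx.move_to_next_grid_location()       # coarse, unaligned move
        return State.ALIGN
    if state is State.ALIGN:
        pitch, yaw, standoff = ctx.compute_offsets()  # from 3 rangefinders
        if ctx.within_tolerance(pitch, yaw, standoff):
            return State.SCAN
        ctx.correct_alignment(pitch, yaw, standoff)   # small corrective move
        return State.ALIGN
    if state is State.SCAN:
        ctx.trigger_scanner()
        ctx.wait_for_scan_complete()
        ctx.grid_count += 1
        return State.GRID_MOVE
    return State.DONE
```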

  This process will now be described in the context of a robotic NDI mobile platform in which a base platform (e.g., a holonomic motion base platform) supports a robot arm having an NDI sensor (e.g., an IRT scanner) at its distal end, and the operations of the base platform and robot arm are controlled by a device controller (e.g., a robot controller). FIG. 11 shows a high-level process for the overall operation of the system, while FIGS. 11A and 11B show details of the alignment-related aspects of the process. Digital signals transmitted between the robot controller and the NDI sensor control computer (e.g., the infrared thermography computer 8 identified in FIG. 1) enable synchronization between the moving robot and the NDI sensor system.

  FIG. 11 identifies some steps of a method 100 of non-destructive inspection using an end effector alignment process according to one embodiment. To begin the process, the system operator identifies the initial position of the NDI sensor relative to the target object (step 102). This step can be accomplished visually (by the operator) or automatically (using a pointing device such as the LPS). Next, the system operator can maneuver the base platform and robot arm to move the NDI sensor to an approximate first position (step 104). In step 106, the system operator provides the desired number of scans in the pattern to the device controller. This number is compared with the count stored in a grid position counter, which is incremented each time a scan in the pattern is acquired. In step 108, the device controller computes backup path parameters and begins automated scan acquisition. The system operator also enables proximity/collision detection (step 110). The system operator then captures the 3D coordinates of the first location using an external tracking system (e.g., the LPS) (step 112). (The external tracking system has already been calibrated, so its position relative to the coordinate system of the target object is known; the LPS computer can therefore compute the 3D coordinates of the first location in the coordinate system of the target object.) Thereafter, the finite state machine controlling the operation of the NDI sensor during the alignment and scanning processes is enabled (step 114) (i.e., the process proceeds to point A in FIG. 11A). (The finite state machine is described in the following paragraphs in connection with FIGS. 11A and 11B.) After the NDI sensor has been aligned and the scan pattern completed, the system operator uses the external tracking system to capture the end position of the NDI sensor (step 116). A combined image can then be assembled by stitching together the scan data obtained from adjacent scans.

  FIGS. 11A and 11B together form a flowchart identifying some of the steps performed by the finite state machine used in the method described at a higher level in FIG. 11. A finite state machine is a mathematical model of a process that can be in only one of a finite number of states at any given time.

  According to one proposed embodiment, the robot controller first checks (i.e., determines) whether the finite state machine (FSM) is in the GRID_MOVE state (step 120). GRID_MOVE is the state in which the robot moves between grid positions defined at a higher level. For example, if the system operator wants the system to capture data in a 3x2 pattern, the robot will move through successive grid positions along the scan path 28 of FIG. 10. If the robot controller determines in step 120 that the FSM is not in the GRID_MOVE state, it proceeds directly to step 128. If the robot controller determines in step 120 that the FSM is in the GRID_MOVE state, it determines whether there are additional grid positions in the sequence (step 122). This is accomplished by comparing the current count of the grid position counter with the preset number of scans to be acquired. If the robot controller determines in step 122 that there are no more grid positions in the sequence (i.e., the count equals the preset number), the process returns to step 116 of FIG. 11. If the robot controller determines in step 122 that there are more grid positions in the sequence (i.e., the count is less than the preset number), the robot moves the NDI sensor to the next, unaligned position (step 124), and the state of the finite state machine is then set to ALIGN (step 126). In the next step, the robot controller determines whether the finite state machine is in the ALIGN state (step 128).

  The ALIGN state is the state in which the robot uses the three distance sensors to adjust the pitch and yaw of the end effector so that the aim axis of the NDI scanner is perpendicular to the surface of the target object. If the robot controller determines in step 128 that the finite state machine is not in the ALIGN state, it proceeds directly to step 144 of FIG. 11B. If the robot controller determines in step 128 that the finite state machine is in the ALIGN state, it determines whether the accuracy of the position of the NDI sensor needs to be improved (step 130). If the robot controller determines in step 130 that the accuracy of the position of the NDI sensor does not need to be improved (i.e., the aim axis of the NDI scanner is perpendicular to the surface of the target object), the robot controller sets the state of the finite state machine to SCAN (step 132) and proceeds directly to step 144 of FIG. 11B. If the robot controller determines in step 130 that the accuracy of the position of the NDI sensor needs to be improved (i.e., the aim axis of the NDI scanner is not perpendicular to the surface of the target object), the robot controller performs the following steps in sequence: (a) acquire distance data from the distance sensors (step 134); (b) compute the orientation and translation offsets from the desired aligned position (step 136); (c) adjust the distance to achieve the desired offset (step 138); (d) adjust the yaw angle of the end effector to achieve perpendicularity to the surface of the target object, and adjust the horizontal placement (step 140); (e) adjust the pitch angle of the end effector to achieve perpendicularity to the surface of the target object, and adjust the height (step 142); and (f) return to step 130.
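
Steps 130-142 amount to an iterative servo loop: measure, compute offsets, apply small corrections, and repeat until within tolerance. The sketch below expresses that loop using the alignment_offsets function sketched earlier; the robot interface, the tolerance values, and the correction signs are illustrative assumptions, not parameters from this disclosure.

```python
import math

# Illustrative tolerances, not values from this disclosure.
ANGLE_TOL = math.radians(0.5)  # rad
DIST_TOL = 0.002               # m

def align_end_effector(robot, a, b, target_offset, max_iters=20):
    """Iterative alignment loop corresponding to steps 130-142: read the
    three rangefinders, compute offsets, command small corrections, and
    repeat until within tolerance. `robot` and its methods are assumed
    interfaces."""
    for _ in range(max_iters):
        d1, d2, d3 = robot.read_rangefinders()                 # step 134
        pitch, yaw, err = alignment_offsets(
            d1, d2, d3, a, b, target_offset)                   # step 136
        if abs(pitch) < ANGLE_TOL and abs(yaw) < ANGLE_TOL and abs(err) < DIST_TOL:
            return True                                        # aligned: go to SCAN
        robot.translate_normal(-err)                           # step 138
        robot.rotate_yaw(-yaw)                                 # step 140
        robot.rotate_pitch(-pitch)                             # step 142
    return False
```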

  As described above, if the robot controller determines in step 130 that the accuracy of the position of the NDI sensor does not need to be improved, it sets the state of the finite state machine to SCAN (step 132) and proceeds directly to step 144 of FIG. 11B. In step 144, the robot controller determines whether the finite state machine is in the SCAN state. If the robot controller determines in step 144 that the finite state machine is not in the SCAN state, it returns to step 120 of FIG. 11A. If the robot controller determines in step 144 that the finite state machine is in the SCAN state, it transmits a scanner control command to the NDI scanner control computer (e.g., the infrared thermography computer 8 identified in FIG. 1) (step 146). The robot controller then checks the scanner's response (step 148) and determines whether the scan pattern is complete (step 150). If the robot controller determines in step 150 that the scan pattern is not complete, it returns to step 148. If the robot controller determines in step 150 that the scan pattern is complete, it performs the following steps in sequence: (a) return the NDI scanner to its position (step 152); (b) set the state of the finite state machine to GRID_MOVE; (c) increment the grid position counter; and (d) return to step 120 of FIG. 11A.

  After completion of the automated scan sequence, the individual images from each IRT scan can be stitched together to create a single representation of the inspected area.

  The above system has a number of possible use cases involving general alignment tasks for robotic manipulators or other devices. One such use case is grid-based NDI scan acquisition in aerospace manufacturing and maintenance environments, for example grid-based scanning of an aircraft fuselage.

  In typical operation, the system is driven (teleoperated) by the user to an approximate first position. The system is then set to automatically acquire scans of a grid arranged in an operator-defined vertical and horizontal pattern along either side of the aircraft fuselage, as shown in FIG. 10.

  The automatic grid scan function involves feedback of distance data from the three laser rangefinders 236, 238, and 240 to the motion control algorithm, which sets the horizontal and vertical placement of the platform 204 and end effector 224 and their yaw and pitch orientations, respectively. This approach eliminates the need for a separate predetermined motion path for the system, thereby simplifying use and reducing setup time.

  The system can also be fully controlled in remote operation mode so that the operator can acquire data manually. A semi-automated mode is also possible, in which the system operator controls the platform position and mast height while the system automatically adapts the pitch orientation of the end effector to maintain perpendicular alignment with the surface in front of it.

  In order to accurately locate the scan results in the aircraft coordinate system, 3D coordinate position measurements of the boundary regions of the scan results are taken. These boundary reference values allow the combined scan image to be placed in the same coordinate system as the target object and its associated CAD model. This makes it possible to associate the acquired scan results with the corresponding 3D model of the target object and to archive the position data for future reference. In this system, a local positioning system (LPS) 24 (shown in FIG. 12) is used to acquire 3D coordinate position data in the coordinate system of the target object 54. For example, FIG. 12 shows the LPS 24 directing a laser beam 30 at a boundary location 60 to be measured. Assuming the LPS 24 has already been calibrated with respect to the coordinate system of the target object 54, the boundary position data points obtained by the LPS 24 can be used to determine the coordinates of each boundary position in the coordinate system of the target object 54.

  According to one embodiment, acquisition of the scan result boundaries can be achieved by targeting the corners of the IRT shroud 216 while the IRT scanner 214 (see FIGS. 5 and 8) is capturing scan data at a particular location. These LPS boundary measurements are performed before and after the first scan, or at any intermediate position in the grid sequence. According to one proposed embodiment, the corners (or other known locations) of the IRT shroud 216 can carry active optical targets (e.g., LEDs), passive optical targets, or other visible features. The passive approach requires the system operator to operate the LPS 24 to target each point; active LED targets permit an automated approach in which the LPS camera detects the LEDs. Ideally, all four corners of the scanned area would be measured, but the IRT shroud 216 sometimes occludes the optical targets, making them difficult to target. The minimum number of optical targets required in this part of the process is two, because assumptions about the shape of the X-by-Y scan region can be made using, for example, the surface normals from the 3D CAD model of the target object.

  The motorized pan/tilt control aspect of the LPS 24 also allows the LPS 24 to provide an initial position reference and a guidance function pointing to the desired first position to be scanned. After performing an initial calibration of the LPS 24 to known positions on the target object, the operator can direct the LPS 24 to aim its laser pointer at specific 3D coordinates on the target surface, producing the laser spot 38 shown in FIG. The operator then drives the robot to align the laser spots 32a, 32b, and 32c of the robot's laser rangefinders 236, 238, and 240 around the LPS laser spot 38, as shown in FIG.

  In order to obtain measurements in the coordinate system of the target object 54 (e.g., an aircraft) using the LPS 24, the system operator needs three known points on the target object. These three calibration points are separate from the points on the IRT shroud 216 measured for scan registration purposes. This means that if the scan data are to be aligned with aircraft coordinates, the minimum total number of LPS measurements is five: three for the initial calibration of the LPS and two to define the rectangular area that was scanned.

  FIG. 14 is a perspective view of a system capable of performing a robot-to-object position identification process according to one embodiment. The robot-to-object position identification process is performed using an LPS 24 comprising a single camera 40 and a laser rangefinder (not shown) mounted on a controllable pan/tilt unit 42. The LPS operation and calibration process is disclosed in US Pat. No. 7,859,655, the disclosure of which is incorporated herein by reference in its entirety.

  Specifically, the local positioning system shown in FIG. 14 includes a video camera 40 that may have an automatic (remotely controlled) zoom function. The video camera 40 is supported on a pan/tilt mechanism 42. The video camera 40 and the pan/tilt mechanism 42 may be operated by the LPS control computer 48, which communicates with them through a video/control cable 46. Alternatively, the LPS control computer 48 may communicate with the video camera 40 and the pan/tilt mechanism 42 through a wireless communication path (not shown). The LPS control computer 48 is configured to control the operation of the LPS hardware, including a laser rangefinder (not shown), the video camera 40, and the pan/tilt mechanism 42. For example, the pan and tilt angles of the pan/tilt mechanism 42, and therefore the orientation of the video camera 40, can be controlled using the keyboard of the computer 48 or other user interface hardware 36 (e.g., a gamepad). The optical image field observed by the video camera 40 can be displayed on the monitor 34 of the computer 48.

  The pan/tilt mechanism 42 is controlled to rotate the laser rangefinder (not shown) and the video camera 40 to selected angles about a vertical azimuth (pan) axis and a horizontal elevation (tilt) axis. A direction vector 66 (shown in broken lines in FIG. 14), indicating the orientation of the laser rangefinder (not shown) and the video camera 40 relative to the fixed coordinate system of the tripod 44 (or other platform on which the pan/tilt unit is mounted), is determined by the pan and tilt angles when the camera is aimed at the point of interest. In FIG. 14, the direction vector 66 extends from the laser rangefinder (not shown) and video camera 40 and intersects point 94a on one corner of the shroud 216.
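
  The dependence of the direction vector 66 on the pan and tilt angles can be written compactly. The sketch below assumes a conventional spherical parameterization (x forward, z up); the actual axis conventions of the LPS are not specified here.

```python
import numpy as np

def direction_vector(pan_rad: float, tilt_rad: float) -> np.ndarray:
    """Unit vector along the camera/rangefinder boresight for given
    azimuth (pan) and elevation (tilt) angles, in the tripod frame."""
    return np.array([
        np.cos(tilt_rad) * np.cos(pan_rad),
        np.cos(tilt_rad) * np.sin(pan_rad),
        np.sin(tilt_rad),
    ])
```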

  The laser rangefinder may be incorporated within the housing of the camera 40 or mounted outside the camera 40 so as to transmit a laser beam along the direction vector 66. The laser rangefinder measures the distance to any visible feature on the shroud 216 (e.g., one of the corners 94a-c) or to any calibration point (e.g., points 92a-c) on the curved workpiece 202. (Each calibration point may be a visible feature on the curved workpiece 202 or an optical target attached to the curved workpiece 202.) The laser rangefinder comprises a laser and a unit configured to calculate the distance based on the laser light detected after reflection from the point of impact.

  The local positioning system shown in FIG. 14 further includes three-dimensional position identification software loaded on the LPS control computer 48. For example, the three-dimensional position identification software may be of the type that uses multiple calibration points 92a-c on the curved workpiece 202 to define the position (placement and orientation) of the video camera 40 relative to the curved workpiece 202. The calibration points 92a-c may be visible features at known locations in the local coordinate system of the curved workpiece 202, as determined from a three-dimensional database of feature positions (e.g., a CAD model) or by other measurement techniques. During the LPS calibration process, X, Y, Z data for at least three non-collinear points are extracted from the CAD model. Typically, calibration points are selected that correspond to features that can be easily located on the target object. The three-dimensional position identification software uses the X, Y, Z data of the calibration points 92a-c together with the pan and tilt data of the pan/tilt mechanism 42 to define the placement and orientation of the video camera 40 relative to the local coordinate system of the curved workpiece 202. The measured distances to the calibration points 92a-c can be used in conjunction with the pan and tilt angles obtained from the pan/tilt mechanism 42 to determine the placement and orientation of the camera relative to the curved workpiece 202. A method for generating an instrument-to-object calibration transformation matrix (sometimes referred to as the camera pose) is disclosed in US Pat. No. 7,859,655. The calibration process uses the known and measured data to compute a 4×4 homogeneous transformation matrix that defines the placement and orientation of the video camera 40 relative to the curved workpiece 202.
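
  As a rough illustration of this calibration step, each (pan, tilt, distance) measurement of a calibration point can be converted to Cartesian coordinates in the instrument frame, after which a rigid transform aligning the measured points with their known CAD coordinates can be estimated. The sketch below uses the Kabsch/SVD method as a stand-in for the patented procedure of US Pat. No. 7,859,655 and reuses direction_vector() from the sketch above; all function names here are illustrative assumptions.

```python
import numpy as np

def measurement_to_xyz(pan, tilt, dist):
    """Cartesian point in the instrument frame from a pan/tilt/range reading."""
    return dist * direction_vector(pan, tilt)

def camera_pose(instrument_pts, cad_pts):
    """4x4 homogeneous transform mapping instrument-frame points onto the
    object (CAD) frame, estimated from three non-collinear point pairs."""
    P = np.asarray(instrument_pts, dtype=float)  # 3x3 measured points
    Q = np.asarray(cad_pts, dtype=float)         # 3x3 known CAD coordinates
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)          # SVD of the covariance matrix
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                           # reflection-safe rotation
    t = Q.mean(axis=0) - R @ P.mean(axis=0)      # translation
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```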

  Once the placement and orientation of the video camera 40 relative to the curved workpiece 202 have been determined and a camera pose transformation matrix has been generated, the camera pan data (rotation angle of the video camera 40 about the azimuth axis) and tilt data (rotation angle of the video camera 40 about the elevation axis), in combination with the calculated placement and orientation of the video camera 40, may be used to determine the X, Y, and Z coordinates of any point of interest on the shroud 216 in the coordinate system of the curved workpiece 202. By identifying the position of the shroud 216 at the beginning and end of the scan pattern, the position of the scan pattern in the coordinate system of the curved workpiece 202 can be determined.
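
  With the pose matrix in hand, locating a point of interest reduces to one measurement and one matrix multiply, along the lines of the following sketch (continuing the hypothetical helpers above):

```python
import numpy as np

def locate_point(T, pan, tilt, dist):
    """X, Y, Z of a measured point in the workpiece coordinate system."""
    p_instrument = np.append(measurement_to_xyz(pan, tilt, dist), 1.0)
    return (T @ p_instrument)[:3]
```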

  Specifically, a relative location identification process may be used to determine, at the beginning and end of the scan pattern, the location in the coordinate system of the curved workpiece 202 of a visible feature on the shroud 216 (e.g., any one of the corners 94a-c shown in FIG. 14). The basic process sequence as applied to the shroud 216 is as follows: (1) the local positioning system is calibrated to the coordinate system of the target object to be examined (e.g., the curved workpiece 202) by measuring three known points 92a-c on the target object; (2) when the robot is at the beginning of the scan pattern (e.g., scan area 26a in FIG. 12), the local positioning system measures the position of a visible feature (e.g., corner 94a) on the shroud 216; (3) later, when the robot is at the end of the scan pattern (e.g., scan area 26f in FIG. 12), the local positioning system measures the position of the same or another visible feature (e.g., corner 94b or 94c) on the shroud 216; (4) from these measurements, the operator can determine the boundary of the scan range that constitutes the mosaic pattern.

  The LPS control software running on the computer 48 calculates the position of each visible feature on the shroud 216 with respect to the coordinate system of the curved workpiece 202. The LPS control computer 48 (see FIG. 14) sends the position data to an expert workstation 74, shown in FIG. 17, which is configured to record the position coordinates for future reference. This position data can also be used to align the scan data with the CAD model of the target object.

  The LPS control software on the computer 48 outputs the point data as X, Y, and Z values, but the control application needs more than just the X, Y, and Z data points. To solve the placement and orientation problem, the X, Y, and Z data from the three measured points 92a-c, together with the known dimensions between these points, are used to compute a full six-degree-of-freedom placement and orientation representation. This is what the position identification software described above does. The placement and orientation format used by the position identification software is a 4×4 transformation matrix, but there are other ways to represent the data.

  If the system operator wishes to perform a relative LPS scan (described in US Patent Application Publication No. 2015/0268033), the operator can use any three non-collinear points on the target object, and it is not necessary to know the 3D coordinates of those points in advance (as it is with the standard LPS method). In the relative mode the results are not expressed in the coordinate system of the target object, which is acceptable in some applications. A relative LPS location identification process can be used to ensure that the NDI sensor is aligned with the same area as in a previous session. The process is also useful for stitching together multiple scan results, or for moving the LPS if necessary.

  As disclosed above, this system uses distance measuring devices such as lasers, string encoders, and ultrasonic sensors, with the basic requirement that at least three non-collinear distance measuring devices be used. In the distance sensor configuration described above, three distance measuring lasers arranged in a triangular formation are used. In an alternative embodiment, four distance measuring lasers are arranged in a rectangular formation. Regardless of which configuration is used, the distance data is fed to the robot controller 80 along with the end effector orientation data. A feedback control method can be used to drive the error between the current angle and the desired angle to zero.

  A method of measuring angles using laser rangefinders will now be described with reference to FIGS. 15A-15C, which show front, side, and top views, respectively, of three laser rangefinders 236, 238, and 240 arranged in a triangular pattern on a common plane, directed at respective points on the surface of the target object 54, and separated from those points by their respective measured distances.

In addition to being used to measure the distance to the target object, the three lasers are used to measure the yaw and pitch angles. FIG. 15A shows the arrangement of the laser rangefinders 236, 238, and 240 relative to one another, characterized by a horizontal dimension a and a vertical dimension b, along with the measured distances d1, d2, and d3 to the surface of the target object 54. Equations (1) and (2) can be used to calculate the pitch and yaw angles:
PitchAngle = atan2(d1 − (d2 + d3)/2, b)    (1)
YawAngle = atan2(d2 − d3, a)    (2)
where the pitch angle and the yaw angle are the currently calculated angles of the alignment apparatus shown in FIGS. 15A-15C relative to the surface of the target object 54. The goal values for these angles, which correspond to perpendicularity to the surface at the current position, are both zero; the process for achieving these goal angles is described below.
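
  Equations (1) and (2) translate directly into code; the following transcription (a sketch, not taken from the patent) assumes distances and dimensions in consistent units and returns angles in radians:

```python
import math

def pitch_yaw_angles(d1, d2, d3, a, b):
    """Pitch and yaw of the sensor plane relative to the target surface,
    per equations (1) and (2); zero means perpendicular."""
    pitch = math.atan2(d1 - (d2 + d3) / 2.0, b)  # equation (1)
    yaw = math.atan2(d2 - d3, a)                 # equation (2)
    return pitch, yaw
```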

By calculating the current yaw and pitch angles, the system motion controller can apply a velocity control method to the controlled motions (pan, tilt, and distance). A feedback controller, such as a proportional-integral-derivative (PID) controller, can be used to drive the error between the current angle and the desired angle to zero. Equations (3) and (4) can be used to calculate the pitch and yaw motion control:
PitchRate = Kp_pitch * (PitchAngle − PitchAngle_goal)    (3)
YawRate = Kp_yaw * (YawAngle − YawAngle_goal)    (4)
where PitchRate and YawRate are the angular rotation speeds about the pitch axis of the alignment apparatus and the yaw axis of the base, respectively; Kp_pitch and Kp_yaw are the proportional feedback gains for the pitch and yaw axes, respectively; PitchAngle and YawAngle are the angles calculated from equations (1) and (2), respectively; and PitchAngle_goal and YawAngle_goal are the desired goal angles toward which the controller drives the system (as noted above, both are zero in the present embodiment). Integral and derivative feedback can also be used, but is not shown here.
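
  A proportional-only transcription of equations (3) and (4) looks as follows; the gain values are placeholders, and the goal angles default to zero as in the present embodiment:

```python
KP_PITCH, KP_YAW = 0.8, 0.8  # placeholder proportional gains

def pitch_yaw_rates(pitch_angle, yaw_angle, pitch_goal=0.0, yaw_goal=0.0):
    """Commanded angular rates per equations (3) and (4)."""
    pitch_rate = KP_PITCH * (pitch_angle - pitch_goal)  # equation (3)
    yaw_rate = KP_YAW * (yaw_angle - yaw_goal)          # equation (4)
    return pitch_rate, yaw_rate
```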

The basic speed equations are:
Vel_x = Kp_x * (MinDist_x − offset_x)    (5)
Vel_y = Kp_y * (MinDist_y − offset_y)    (6)
where Vel_x and Vel_y are the horizontal speeds of the base; Kp_x and Kp_y are the proportional feedback gains for the base in the X and Y directions, respectively; MinDist_x and MinDist_y are the minimum laser-measured distances in the X and Y directions, respectively; and offset_x and offset_y are the goal offset distances. For some applications, the lasers are not configured to measure in both the X and Y directions; in that case, the X or Y velocity control equation associated with this alignment process is not used.

For a holonomic-motion base platform comprising a base frame 62, one pair of type A mecanum wheels W1 and W3 on one diagonal, and one pair of type B mecanum wheels W2 and W4 on the other diagonal, inverse kinematics can be used to calculate the speeds of the four individual wheels. The dimensions of the vehicle (L and D) and the desired rotation point (indicated by the distances a1, a2, b1, and b2) are shown in FIG. The individual wheel speeds for wheels W1-W4 are given by equations (7)-(10):
V_W1 = Vel_y − Vel_x + YawRate * (a1 + b1)    (7)
V_W2 = Vel_y + Vel_x − YawRate * (a1 + b2)    (8)
V_W3 = Vel_y − Vel_x − YawRate * (a2 + b2)    (9)
V_W4 = Vel_y + Vel_x + YawRate * (a2 + b1)    (10)
where V_Wi (i = 1, 2, 3, 4) are the individual wheel speeds; Vel_x and Vel_y are the horizontal speeds obtained from equations (5) and (6); YawRate is the yaw rotation speed obtained from equation (4); and a1, a2, b1, and b2 are the distances to the rotation point shown in FIG.
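
  Equations (5) through (10) chain together as follows; this sketch simply transcribes them, with all gains, offsets, and rotation-point distances supplied by the caller:

```python
def base_velocities(min_dist_x, min_dist_y, offset_x, offset_y,
                    kp_x=1.0, kp_y=1.0):
    """Horizontal base velocities per equations (5) and (6)."""
    vel_x = kp_x * (min_dist_x - offset_x)  # equation (5)
    vel_y = kp_y * (min_dist_y - offset_y)  # equation (6)
    return vel_x, vel_y

def mecanum_wheel_speeds(vel_x, vel_y, yaw_rate, a1, a2, b1, b2):
    """Individual speeds of mecanum wheels W1-W4 per equations (7)-(10)."""
    v_w1 = vel_y - vel_x + yaw_rate * (a1 + b1)  # equation (7)
    v_w2 = vel_y + vel_x - yaw_rate * (a1 + b2)  # equation (8)
    v_w3 = vel_y - vel_x - yaw_rate * (a2 + b2)  # equation (9)
    v_w4 = vel_y + vel_x + yaw_rate * (a2 + b1)  # equation (10)
    return v_w1, v_w2, v_w3, v_w4
```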

  The basic configurations for the pivot axes of the end effector are as follows: (a) for a single-axis pivot: one motor and one angle sensor; (b) for a two-axis gimbal: two motors and two angle sensors.

  The alignment process described above addresses both discrete and continuous sensor-update use cases, and the concept can be packaged as a stand-alone system or as part of an existing system.

  The concepts disclosed herein have application to holonomic-motion base platforms, but variations are applicable to other systems as well. Possible use cases include holonomic and non-holonomic platforms, articulated robot arms, gantry arms, hybrid moving-base/arm systems, helicopters and UAVs, cameras, lights, and tools.

  The laser-based alignment process disclosed herein enables system operation without the need for online robot teaching or offline programming, which makes this approach easier to use. The process guides the end effector into place while adapting to unexpected changes in the environment. Rather than stepping through a list of pre-programmed motions, the system operates as a state machine that uses sensor feedback to transition between the various steps of the alignment, grid-based motion, and scanning process.

  The alignment sensors also provide a collision avoidance function for the end effector. The configuration of this system makes it possible to reach every area of the fuselage, from ground level to the crown (top), from the ground-based holonomic platform. This solution also provides an optional process for collecting location reference data using an external measurement system (the LPS).

  The ability to collect position data defined in the coordinate system of the target object (for example, an aircraft) makes it possible to accurately register scan data with CAD data for maintenance and repair applications, and to record position information for archival purposes.

  The configuration of this system, which uses a vertical extension arm with a rotating wrist, a modular tool mount, and alignment sensor elements, provides a compact, low-cost platform that can reach the required areas around the fuselage relatively easily while occupying a minimal ground contact area.

  The systems disclosed herein can be configured to accept various types of NDI devices attached to the end effector, including eddy current sensors, ultrasonic sensors, and infrared thermography (IRT) sensors. A vertical support mast with a pivoting end effector on an extension arm allows inspection of the entire height of an aircraft fuselage. The holonomic-motion base allows efficient relocation of the sensor unit along the length of the fuselage. The motion control software makes it possible to automatically capture partially overlapping scan results in a grid pattern. Reference position data are captured for alignment of the scan results with aircraft coordinates.

  During operation, the system can be driven (remotely operated) by the operator to an approximate location in the starting area. The system is then configured to automatically acquire scans of grids arranged in vertical and horizontal patterns defined by the operator along either side of the aircraft fuselage. One feature of the motion control algorithm used herein is that it relies on distance sensor feedback instead of requiring a separate predefined motion path for the system. As a result, use is simplified and setup time is reduced. In order to accurately identify the position of the scan results in the aircraft coordinate system, the 3D coordinate positions of the boundary regions of the scan results are measured. A local positioning system is used to acquire the 3D coordinate position data in the aircraft coordinate system. This reference position data is then used to align the NDI scan results with the appropriate aircraft coordinate system.

  FIG. 17 is a block diagram identifying several components of a system for infrared thermography inspection of a large composite structure according to one computer architecture. The operation of the robot 64 is controlled by the robot controller 80 based on a finite state machine and at least feedback from the distance sensors (e.g., the three laser rangefinders). The operation and activation of the LPS 24 are controlled by the LPS control computer 48, which also receives laser tracking data from the LPS 24. Activation of the infrared camera 4 and the flash lamp 6 is controlled by the infrared thermography computer 8, which also receives infrared image data from the infrared camera 4. All of these computers may be capable of wired or wireless communication with the master computer of the expert workstation 74, which may be programmed to correlate the laser tracking data with the infrared imaging data. The master computer may be further programmed to request 3D model data from the 3D model database server 96. In the case of thermographic porosity measurement, the master computer of the expert workstation 74 may also be programmed to request reference thermal trace data from the reference thermal trace database server 98.

  The LPS control computer 48 acquires position data for the infrared camera 4 in the 3D coordinate system of the composite structure. In the case of a barrel-shaped fuselage section, the infrared imaging data can be mapped directly onto the 3D model of the fuselage section. Overlaying the infrared imaging data on the 3D model data allows for improved data analysis and potential automation of the analysis. For example, by directly overlaying the infrared imaging data on the 3D model, feature/defect indications can be correlated directly with the fuselage structure. In addition, the direct overlay of the data on the model can be used to determine the local or spatial point thickness required for porosity quantification. In one embodiment, the process includes pasting strips of infrared imaging data as one or more computer graphics texture maps projected onto the surface of the 3D model in a virtual environment displayed on the monitor or computer screen of the expert workstation 74.

  Although a method for controlling the position of a robot end effector relative to a target object has been described with reference to various embodiments, it will be understood by those skilled in the art that various changes may be made, and that elements thereof may be replaced by equivalents, without departing from the scope of the teachings herein. In addition, many modifications may be made to adapt the teachings herein to a particular situation without departing from their scope. Accordingly, it is intended that the claims not be limited to the particular embodiments disclosed herein.

  As used in the claims, the term “position” includes placement in a three-dimensional coordinate system and orientation relative to that coordinate system. As used in the claims, the term “move the end effector” should be interpreted broadly to include at least one of the following: moving the base platform relative to the ground, moving the robot arm relative to the base platform, and moving the end effector relative to the robot arm.

  The methods described herein may be encoded as executable instructions implemented in a non-transitory tangible computer readable medium including, but not limited to, storage devices and / or memory devices. Such instructions, when executed on a processing system or computer system, can cause the system apparatus to perform at least some of the methods described herein.

  Furthermore, the present invention includes embodiments according to the following clauses.

Article 1. A method for controlling the position of an end effector of a robot movement platform with respect to a target object, the method comprising:
moving the end effector to a first position; and
enabling a robot controller to perform actions specified by a finite state machine control application, the actions comprising:
acquiring distance data from first, second, and third distance sensors attached to the end effector while the end effector is in the first position, wherein the acquired distance data represent the respective distances separating the first, second, and third distance sensors from respective areas on the surface of the target object; and
using the distance data to move the end effector from the first position to a first grid position by aligning the end effector with respect to the target object.

  Article 2. The method of clause 1, wherein aligning includes rotating the end effector such that the axis of the end effector is perpendicular to the surface of the target object.

  Article 3. The method of clause 2, wherein rotating the end effector includes rotating the end effector about a pitch axis.

  Article 4. The method of clause 3, further comprising rotating the base of the robot movement platform about the yaw axis.

  Article 5. The method of any one of clauses 2 to 4, wherein the aligning further comprises moving the end effector such that the end effector is separated from the surface of the target object by a target offset distance.

  Article 6. The method of any one of clauses 1 to 5, wherein the aligning further comprises moving the end effector such that the end effector is separated from the surface of the target object by a target offset distance.

  Article 7. The method according to any one of clauses 1 to 6, further comprising calculating the coordinates of the position of an external tracking system in the coordinate system of the target object.

  Article 8. The method of clause 7, further comprising aiming a laser beam produced by the external tracking system at a specific coordinate position on the surface of the target object, thereby forming a laser spot, wherein moving the end effector to the first position comprises driving the robot movement platform to align the laser spots created by the first, second, and third rangefinders around the laser spot created by the external tracking system.

  Article 9. The method of clause 7 or 8, further comprising calculating, using the external tracking system while the end effector is in the first grid position, the coordinates in the coordinate system of the target object of a visible feature on a tool attached to the end effector.

Article 10. The method of any one of clauses 1 to 9, wherein the actions specified by the finite state machine control application further comprise:
activating a tool attached to the end effector while the end effector is in the first grid position;
using the finite state machine control application to move the end effector from the first grid position to a second position;
moving the end effector from the second position to a second grid position by aligning the end effector with respect to the target object using the finite state machine control application; and
activating the tool while the end effector is in the second grid position.

Article 11. The method of clause 10, wherein the tool is an infrared thermography scanner and the actions specified by the finite state machine control application comprise:
acquiring a first infrared thermography scan result while the end effector is in the first grid position; and
acquiring a second infrared thermography scan result while the end effector is in the second grid position.

  Article 12. The method of clause 11, further comprising stitching together the first infrared thermography scan result and the second infrared thermography scan result.

Article 13. A robot movement platform comprising:
a self-propelled mobile base platform comprising a plurality of rotating elements and a plurality of motors respectively coupled to the plurality of rotating elements;
a vertically extendable mast supported by the base platform;
an arm having a proximal end fixedly coupled to the vertically extendable mast;
an end effector pivotably coupled to a distal end of the arm;
a non-transitory tangible computer-readable storage medium storing a finite state machine control application;
first, second, and third distance sensors attached to the end effector, each configured to acquire distance data representing the respective distance separating it from a respective area on the surface of a target object; and
a controller configured to control the operation of the first, second, and third distance sensors and to move the end effector relative to the ground in accordance with commands generated by the finite state machine control application, wherein the finite state machine control application generates instructions executable by the controller to move the end effector using the distance data acquired by the first, second, and third distance sensors.

  Article 14. The robot movement platform of clause 13, wherein the first, second, and third distance sensors are laser rangefinders.

  Article 15. The robot movement platform of clause 14, further comprising a tool attached to the end effector.

  Article 16. The robot movement platform according to clause 15, wherein the tool is an infrared thermographic scanner.

  Article 17. The robot movement platform according to clause 16, wherein the infrared thermographic scanner comprises a shroud.

  Article 18. The robot movement platform according to any one of clauses 13 to 17, wherein the controller is further configured to move the end effector from a first position to a second position by aligning the end effector with respect to the target object using the finite state machine control application, and wherein the aligning comprises rotating the end effector so that the axis of the end effector is perpendicular to the surface of the target object.

  Article 19. The robot movement platform of clause 18, wherein the aligning further comprises moving the end effector such that the end effector is separated from the surface of the target object by a target offset distance.

Article 20. A method for controlling the position of an end effector of a robot movement platform relative to a target object, comprising enabling a robot controller to perform actions specified by a finite state machine control application, the actions comprising:
(a) moving the end effector to a reference position not in contact with the surface of the target object in accordance with pre-stored grid pattern data representing a grid pattern;
(b) acquiring distance data from first, second, and third distance sensors attached to the end effector while the end effector is in an unaligned position, wherein the acquired distance data represent the respective distances separating the first, second, and third distance sensors from respective areas on the surface of the target object;
(c) moving the end effector from the reference position to an aligned position by aligning the end effector with respect to the target object using the distance data;
(d) activating a tool attached to the end effector while the end effector is in the aligned position; and
(e) repeating steps (a) through (d) for each of a plurality of aligned positions of the grid pattern.

  Article 21. The method of clause 20, wherein the aligning comprises rotating the end effector so that the axis of the end effector is perpendicular to the surface of the target object, and moving the end effector so that the end effector is separated from the surface of the target object by a target offset distance.

  Article 22. The method of clause 20 or 21, wherein the tool is an infrared thermography scanner, wherein the actions specified by the finite state machine control application further comprise acquiring a respective infrared thermography scan result while the end effector is in each aligned position, and wherein the method further comprises stitching together the scan results.

  With respect to the process claims set forth below, the steps recited therein should not be construed as requiring performance in alphabetical order (any alphabetical ordering in the claims is used solely for the purpose of referring to previously recited steps) or in the order in which they are recited, unless the claim language explicitly specifies or states conditions indicating a particular order for performing some or all of those steps. Nor should a process claim be construed as excluding any portions of two or more steps being performed concurrently or alternately, unless the claim language explicitly states a condition that excludes such an interpretation.

Claims (11)

  1. A method for controlling the position of an end effector (224) of a robot movement platform (200) relative to a target object (54), the method comprising:
    moving the end effector (224) to a first position; and
    enabling a robot controller (16) to perform actions specified by a finite state machine control application, the actions comprising:
    acquiring distance data from first, second, and third distance sensors (236, 238, 240) attached to the end effector while the end effector (224) is in the first position, wherein the acquired distance data represent the respective distances separating the first, second, and third distance sensors (236, 238, 240) from respective areas on the surface of the target object (54); and
    moving the end effector (224) from the first position to a first grid position by aligning the end effector (224) with respect to the target object (54) using the distance data.
  2.   The method of claim 1, wherein the aligning comprises rotating the end effector such that an axis of the end effector (224) is perpendicular to the surface of the target object (54).
  3.   The method of claim 2, wherein the rotating the end effector (224) comprises rotating the end effector about a pitch axis.
  4.   The method of claim 3, further comprising rotating a base of the robotic mobile platform (200) about a yaw axis.
  5.   The method according to claim 1, wherein the aligning further comprises moving the end effector (224) such that the end effector (224) is separated from the surface of the target object (54) by a target offset distance.
  6.   The method according to any one of the preceding claims, further comprising calculating coordinates of a position of an external tracking system in the coordinate system of the target object (54).
  7.   The method of claim 6, further comprising aiming a laser beam produced by the external tracking system at a specific coordinate position on the surface of the target object, thereby forming a laser spot (38), wherein moving the end effector (224) to the first position comprises driving the robot movement platform (200) to align the laser spots (32a, 32b, 32c) created by the first, second, and third rangefinders (236, 238, 240) around the laser spot (38) created by the external tracking system.
  8.   The method according to claim 6 or 7, further comprising calculating, using the external tracking system while the end effector (224) is in the first grid position, the coordinates in the coordinate system of the target object (54) of a visible feature on a tool attached to the end effector (224).
  9. The method of any one of the preceding claims, wherein the actions specified by the finite state machine control application further comprise:
    activating a tool attached to the end effector (224) while the end effector (224) is in the first grid position;
    moving the end effector (224) from the first grid position to a second position using the finite state machine control application;
    moving the end effector (224) from the second position to a second grid position by aligning the end effector (224) with respect to the target object (54) using the finite state machine control application; and
    activating the tool while the end effector (224) is in the second grid position.
  10. A robot movement platform (200) comprising:
    a self-propelled mobile base platform (204) comprising a plurality of rotating elements and a plurality of motors respectively coupled to the plurality of rotating elements;
    a vertically extendable mast supported by the base platform;
    an arm having a proximal end fixedly coupled to the vertically extendable mast;
    an end effector pivotably coupled to a distal end of the arm;
    a non-transitory tangible computer-readable storage medium storing a finite state machine control application;
    first, second, and third distance sensors attached to the end effector and configured to acquire distance data representing the respective distances separating the first, second, and third distance sensors from respective areas on the surface of a target object; and
    a controller configured to control the operation of the first, second, and third distance sensors and to move the end effector relative to the ground in accordance with commands generated by the finite state machine control application,
    wherein the finite state machine control application generates instructions executable by the controller to move the end effector using the distance data acquired by the first, second, and third distance sensors.
  11. A method of controlling the position of an end effector of a robot movement platform relative to a target object, the method comprising enabling a robot controller to perform actions specified by a finite state machine control application, the actions comprising:
    (a) moving the end effector to a reference position not in contact with the surface of the target object in accordance with pre-stored grid pattern data representing a grid pattern;
    (b) acquiring distance data from first, second, and third distance sensors attached to the end effector while the end effector is in an unaligned position, wherein the acquired distance data represent the respective distances separating the first, second, and third distance sensors from respective areas on the surface of the target object;
    (c) moving the end effector from the reference position to an aligned position by aligning the end effector with respect to the target object using the distance data;
    (d) activating a tool attached to the end effector while the end effector is in the aligned position; and
    (e) repeating steps (a) through (d) for each of a plurality of aligned positions of the grid pattern.