US9149929B2 - Methods and systems for inspection sensor placement - Google Patents

Methods and systems for inspection sensor placement

Info

Publication number
US9149929B2
Authority
US
United States
Prior art keywords
sensor
aircraft
inspection sensor
inspection
control system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/787,885
Other languages
English (en)
Other versions
US20110295427A1 (en)
Inventor
William P. Motzer
Gary E. Georgeson
Scott W. Lea
Peter J. Hellenbrand
James J. Troy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Boeing Co
Original Assignee
Boeing Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Boeing Co filed Critical Boeing Co
Priority to US12/787,885 priority Critical patent/US9149929B2/en
Assigned to THE BOEING COMPANY reassignment THE BOEING COMPANY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GEORGESON, GARY E, HELLENBRAND, PETER J, LEA, SCOTT W, MOTZER, WILLIAM P, TROY, JAMES J
Priority to EP11716337.8A priority patent/EP2576156B1/en
Priority to JP2013512617A priority patent/JP5955316B2/ja
Priority to AU2011258831A priority patent/AU2011258831B2/en
Priority to PCT/US2011/029717 priority patent/WO2011149582A1/en
Priority to CN201180025897.8A priority patent/CN102917844B/zh
Publication of US20110295427A1 publication Critical patent/US20110295427A1/en
Publication of US9149929B2 publication Critical patent/US9149929B2/en
Application granted granted Critical
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1615Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
    • B25J9/162Mobile manipulator, movable base with manipulator arm mounted on it
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/06Programme-controlled manipulators characterised by multi-articulated arms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/06Programme-controlled manipulators characterised by multi-articulated arms
    • B25J9/065Snake robots
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39014Match virtual world with real world
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39024Calibration of manipulator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40039Robot mounted or sliding inside vehicle, on assembly line or for test, service
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40234Snake arm, flexi-digit robotic manipulator, a hand at each end
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45071Aircraft, airplane, ship cleaning manipulator, paint stripping

Definitions

  • the subject matter described herein relates generally to inspections and more particularly to methods and systems for placement of inspection sensors.
  • Known aircraft generally undergo routine inspection of various components. Numerous aircraft components typically are inspected, and the equipment used to perform such inspections can vary from component to component depending, for example, on the component type and/or location. Inspecting at least some components may be difficult because of various spatial restrictions. For example, access to at least some components may require disassembly of at least one occluding structure and/or removal of the component prior to inspection. Inspecting such components may be a tedious and time-consuming task.
  • articulated robot manipulator arms have been used to position inspection sensors within at least some limited access areas.
  • Such articulated robot manipulator arms facilitate avoiding disassembly of portions of the aircraft in connection with performing inspections. Due to joint and link flexibility and the high degrees of freedom of such robot manipulator arms, accurate, real-time positioning and orientation tracking of such arms can be difficult. Further, positioning errors may accumulate along the chain of articulated segments, growing with a location's distance from the base. As such, the position of the end effector, i.e., the location of an inspection sensor, generally has the largest errors.
  • a method for positioning a remote sensor within a target object. The method includes determining a position of the target object using a first sensor, and calibrating a virtual representation of the target object with respect to the position of the target object. A first position of the remote sensor is determined, and movement of the remote sensor is tracked relative to the target object.
  • a control system for positioning a remote sensor within a target object.
  • the control system is configured to determine a position of the target object using a first sensor, and calibrate a virtual representation of the target object with respect to the position of the target object.
  • the control system is further configured to determine a first position of the remote sensor, and track movement of the remote sensor relative to the target object.
  • a system for positioning a remote sensor within a target object.
  • the system includes an articulated robotic system coupled to the remote sensor, a positioning system that determines a position of the target object and determines a first position of the remote sensor, and a control system that calibrates a virtual representation of the target object with respect to the position of the target object and tracks movement of the remote sensor relative to the target object.
  • FIG. 1 is an illustration of an exemplary system that may be used to place and/or visualize a sensor within a target object being inspected;
  • FIG. 2 is an enlarged schematic illustration of a portion of the system shown in FIG. 1 ;
  • FIG. 3 is an illustration of an exemplary control system that may be used with the system shown in FIG. 1 ;
  • FIG. 4 is a flow chart illustrating an exemplary method of positioning a sensor that may be used with the system shown in FIG. 1 .
  • the subject matter described herein relates generally to the inspection of a target object. More particularly, the subject matter described herein relates to methods and systems that facilitate remotely positioning a sensor within a target object being inspected.
  • a sensor is remotely positioned within a target object, and a positioning system determines a position of the target object and determines a first position of the sensor.
  • a control system calibrates a virtual representation of the target object with respect to the position of the target object and tracks movement of the sensor relative to the target object.
  • An exemplary technical effect of the methods and systems described herein includes at least one of: (a) determining a position of the target object using a first sensor; (b) calibrating a virtual representation of the target object with respect to the position of the target object; (c) determining a first position of the remote sensor; (d) determining a second position of the remote sensor relative to the first position of the remote sensor; and (e) tracking movement of the remote sensor relative to the target object based on at least the first position and the second position of the remote sensor.
  • FIGS. 1 and 2 illustrate an exemplary system 100 that may be used to place and/or visualize an inspection sensor 102 within a target object or structure 104 being inspected.
  • any type of inspection sensor, such as a non-destructive inspection (NDI) sensor, that enables system 100 to function as described herein may be used.
  • inspection sensor 102 detects at least one parameter of structure 104 .
  • inspection sensor 102 may be used to inspect a surface of structure 104 and/or to provide scan data for system 100.
  • Inspection sensor 102 may be, without limitation, an optical sensor, a camera, an infrared sensor, an ultrasonic sensor, an eddy current sensor, a vibration sensor, a magnetometer, a laser scanner, a temperature probe, a microphone, a speaker, a capacitance-based gap measurement meter, an electrical multimeter, a voltage meter, a resistance meter, a current meter, a conductivity meter, a static charge meter, and/or any combination of the aforementioned components.
  • an articulated robotic system 200, such as a robotic snake system, is coupled to inspection sensor 102 to position, move, and/or orient inspection sensor 102 relative to structure 104.
  • articulated robotic system 200 is a pedestal-mounted robotic snake system, also referred to as an elephant trunk robot, that includes a mobile base 212 and an articulated arm 204 extending from mobile base 212.
  • articulated robotic system 200 may be, but is not limited to being, a crawling robotic snake system, an endoscope, and/or a bore scope that does not include a base 212 .
  • any articulated robotic system that enables system 100 to function as described herein may be used.
  • arm 204 includes a plurality of jointed segments (not numbered) that enable articulated robotic system 200 to be selectively positioned with multiple degrees of freedom.
  • articulated robotic system 200 is configured to selectively move and/or orient inspection sensor 102 in various positions suitable for inspecting and/or evaluating structure 104 .
  • the motion instructions are variably selected with a desired speed and direction that will result in a desired movement of inspection sensor 102 located at the end effector of arm 204 .
  • arm 204 is navigable in a three-dimensional space by variably transmitting motion instructions simultaneously to each jointed segment in order to produce bending, twisting, spiraling, and/or turning motions (see the forward-kinematics sketch after this list).
  • articulated robotic system 200 includes at least one sensor system capable of determining its current position and location, such as positioning sensor 206 , that is a self-contained unit capable of tracking and/or monitoring movement of at least one location on arm 204 and/or inspection sensor 102 , including transient oscillations of arm 204 and/or inspection sensor 102 .
  • Positioning sensor 206 provides a positional awareness for system 100 and may be capable of measuring both a position and an orientation of its location on the articulated robotic system 200 relative to structure 104 .
  • positioning sensor 206 is an inertial sensor, such as a microelectromechanical system (MEMS).
  • Positioning sensor 206 is part of a measurement system, which may include a processor (not shown), a plurality of accelerometers (not shown) that measure linear acceleration, a plurality of gyroscopes (not shown) that measure rotational velocity, and software (not shown) to process the linear acceleration and/or rotational velocity data to produce relative position and orientation information. A simple dead-reckoning sketch of this processing appears after this list.
  • Other types of self-contained positioning sensors 206 are also possible, such as those that use cameras to process image data to determine the location of positioning sensor 206 within structure 104 .
  • a local coordinate measurement system 300 provides positional awareness data to facilitate determining a first position of inspection sensor 102 relative to structure 104 .
  • local coordinate measurement system 300 is a local positioning system (LPS) that includes a range meter 302 and/or a digital camera that is coupled to a pan and tilt unit 304 .
  • Local coordinate measurement system 300 may be used to calibrate articulated robotic system 200 to the coordinate system of structure 104 .
  • range meter 302 measures relative distances to visible features 110 of structure 104 to determine the position of local coordinate measurement system 300 relative to structure 104.
  • range meter 302 measures relative distances to exterior features 210 of articulated robotic system 200, such as points on base 212.
  • local coordinate measurement system 300 facilitates aligning articulated robotic system 200 and/or structure 104 with respect to a coordinate system to enable registering a relative location of positioning sensor 206 and inspection sensor 102 .
  • Pan and tilt unit 304 is actuatable to variably orient range meter 302 of local coordinate measurement system 300 .
  • pan and tilt unit 304 enables range meter 302 to rotate about a vertical axis of rotation 306 and about a horizontal axis of rotation 308 .
  • range meter 302 is rotatable about vertical axis of rotation 306 to pan range meter 302, and about horizontal axis of rotation 308 to tilt range meter 302.
  • the height as well as the lateral position of the range meter 302 is variably adjustable.
  • pan and tilt unit 304 is configured to measure a horizontal and/or vertical angle between exterior features 110 , 210 .
  • the inspection process begins by inserting the robot's end effector containing the positioning sensor 206 and inspection sensor 102 through access port 112 .
  • the operator 412 directs arm 204 past obstacles 116 inside target object 104 by watching a virtual display of target object 104, robotic arm 204, positioning sensor 206, and inspection sensor 102 on a graphical presentation interface 406.
  • Information from the positioning sensor 206 is converted into the coordinate system of target object 104 in order to place the virtual objects in the proper positions and orientations on graphical presentation interface 406 (see the coordinate-transformation sketch after this list).
  • internal landmarks or obstacles 116 can be used to adjust or re-calibrate the position data measured by the positioning sensor 206 .
  • for example, if arm 204 contacts an obstacle 116 at a known position but positioning sensor 206 reports a different position, the data from the positioning sensor can be adjusted to reference the known position of obstacle 116.
  • FIG. 3 illustrates an exemplary control system 400 , also illustrated in FIGS. 1 and 2 , that may be used to operate system 100 .
  • control system 400 includes a memory device 402 and a processor 404 coupled to memory device 402 for executing instructions.
  • executable instructions and/or model data for structure 104 are stored in memory device 402 .
  • the term “processor” is not limited to integrated circuits referred to in the art as a computer, but broadly refers to a controller, a microcontroller, a microcomputer, a programmable logic controller (PLC), an application specific integrated circuit, and other programmable circuits.
  • Control system 400 is configurable to perform one or more operations described herein by programming processor 404 .
  • processor 404 may be programmed by encoding an operation as one or more executable instructions and by providing the executable instructions in memory device 402 .
  • Processor 404 may include one or more processing units (e.g., in a multi-core configuration).
  • Memory device 402 includes one or more devices that enable information, such as executable instructions and/or other data, to be selectively stored and retrieved.
  • such other data includes at least a predetermined three-dimensional computer-aided design (CAD) model that is representative of structure 104 .
  • Memory device 402 may include one or more computer readable media, such as, without limitation, dynamic random access memory (DRAM), static random access memory (SRAM), a solid state disk, and/or a hard disk.
  • memory device 402 may be configured to store, without limitation, executable instructions and/or any other type of data.
  • control system 400 includes a graphical presentation interface 406 that is coupled to processor 404 to enable information to be presented to a user 412 .
  • graphical presentation interface 406 may include a display adapter (not shown) that is coupleable to a display device (not shown), such as a cathode ray tube (CRT), a liquid crystal display (LCD), a light emitting diode (LED) display, an organic LED (OLED) display, and/or an “electronic ink” display.
  • graphical presentation interface 406 enables the user 412 to selectively position and/or visualize the position of inspection sensor 102 using system 100 .
  • graphical presentation interface 406 includes one or more display devices.
  • graphical presentation interface 406 may be coupled to, and/or include, a printer.
  • control system 400 includes an input interface 408 that receives input, such as control commands, from user 412 .
  • input interface 408 receives information suitable for use with the methods described herein.
  • Input interface 408 is coupled to processor 404 and may include, for example, a joystick, a keyboard, a pointing device, a mouse, a stylus, a touch sensitive panel (e.g., a touch pad or a touch screen), and/or a position detector.
  • a single component for example, a touch screen, may function as both a display device of graphical presentation interface 406 and as an input interface 408 .
  • control system 400 includes a communication interface 410 coupled to processor 404 .
  • communication interface 410 communicates with a remote device, such as inspection sensor 102 , articulated robotic system 200 , positioning sensor 206 , local coordinate measurement system 300 , and/or another control system 400 .
  • control system 400 cooperates with graphical presentation interface 406 and/or input interface 408 , to enable user 412 to operate system 100 .
  • communication interface 410 may include, without limitation, a wired network adapter, a wireless network adapter, and/or a mobile telecommunications adapter.
  • control system 400 may be coupled to articulated robotic system 200 , local coordinate measurement system 300 , and/or another control system 400 via a network (not shown).
  • a network may include, without limitation, the Internet, a local area network (LAN), a wide area network (WAN), a wireless LAN (WLAN), a mesh network, and/or a virtual private network (VPN) or other suitable communication means.
  • control system 400 is electrically coupled directly to, and/or formed integrally with, articulated robotic system 200 and/or local coordinate measurement system 300 .
  • FIG. 4 illustrates an exemplary method 500 for use in selectively positioning inspection sensor 102 relative to a target object, or structure 104 , being inspected.
  • control system 400 facilitates remotely positioning inspection sensor 102 to enable inspection and/or evaluation of structure 104 . More specifically, the embodiments described herein enable a position of sensor 206 , and as a result the position of the inspection sensor 102 , to be accurately tracked using a three-dimensional CAD model of structure 104 in a three-dimensional coordinate system.
  • articulated robotic system 200 is positioned 502 proximate and/or adjacent to a target object or structure 104 .
  • range meter 302 measures 504 a distance to an exterior position of structure 104.
  • positional data is transmitted to control system 400 to enable control system 400 to accurately determine a position and/or an orientation of structure 104 and/or articulated robotic system 200 relative to local coordinate measurement system 300 .
  • triangulation techniques are used to determine the relative positions of structure 104 and articulated robotic system 200 (see the coordinate-registration sketch after this list).
  • control system 400 determines 508 a first position of positioning sensor 206 , which is coupled to articulated robotic system 200 .
  • At least one virtual representation of structure 104 is received 510 from memory device 402 .
  • a type of structure 104 may be identified and/or determined based on positional data of structure 104 , and a virtual representation of structure 104 may be determined and/or selected based on the type of structure 104 .
  • a virtual representation of articulated robotic system 200 and/or positioning sensor 206 may be provided based on the first position of positioning sensor 206 . The virtual representations of structure 104 , robotic system 200 and/or positioning sensor 206 are then registered 512 on a three-dimensional coordinate system.
  • articulated robotic system 200 actuates arm 204 to suitably position, orient, and/or move 514 inspection sensor 102 relative to structure 104 for inspection and/or evaluation of structure 104 .
  • user 412 may use graphical presentation interface 406 and/or input interface 408 to navigate arm 204 through an access port 112 and/or around other internal landmarks and/or obstructions 116 .
  • user 412 directs arm 204 past obstructions 116 inside structure 104 by watching a virtual display of structure 104 , robotic arm 204 , position sensor 206 , and inspection sensor 102 on graphical presentation interface 406 .
  • Information from positioning sensor 206 is converted into the three-dimensional coordinate system in order to place the virtual objects in the proper positions and orientations on graphical presentation interface 406.
  • data provided by inspection sensor 102 and/or positioning sensor 206 may be used to enable partial and/or full automation of the navigation process to suitably position, orient, and/or move 514 inspection sensor 102 and/or positioning sensor 206 .
  • positional awareness data acquired and/or provided by positioning sensor 206 is continuously monitored and/or tracked 516 to provide real-time and post-processed position and/or orientation tracking as positioning sensor 206 is moved 514 from a first location to a second location relative to structure 104 . More specifically, in the exemplary embodiment, movement of positioning sensor 206 is tracked 516 with respect to the position of structure 104 , the first position of positioning sensor 206 , and/or other data provided by positioning sensor 206 and/or inspection sensor 102 . As such, in the exemplary embodiment, the second location of positioning sensor 206 may be determined 518 based at least on the first position of positioning sensor 206 and the movement tracked from the first position to the second position. In the exemplary embodiment, the position and/or orientation of positioning sensor 206 is continuously displayed on graphical presentation interface 406 in the three-dimensional coordinate system to virtually track movement of positioning sensor 206 through limited access areas of structure 104 .
  • data provided by inspection sensor 102 and/or positioning sensor 206 may be used to calibrate 520 the virtual representation of structure 104 , articulated robotic system 200 , and/or positioning sensor 206 on the three-dimensional coordinate system based at least on the position of structure 104 and/or the first and second positions of positioning sensor 206 .
  • local coordinate measurement system 300 facilitates calibrating structure 104 and/or articulated robotic system 200 to the three-dimensional coordinate system. More specifically, in the exemplary embodiment, relative distances and/or angles between local coordinate measurement system 300, exterior feature 110, and exterior feature 210 are determined.
  • any motion of arm 204 and positioning sensor 206 may then be converted into the three-dimensional coordinate system.
  • obstructions 116 may be used to adjust and/or re-calibrate the positional data measured by positioning sensor 206. For example, if robot arm 204 is in contact with an obstruction 116 that is at a known position, but positioning sensor 206 is reporting a different position, the data from positioning sensor 206 may be adjusted to reference the known position of obstruction 116 (see the landmark re-calibration sketch after this list).
  • an updated and/or recalibrated estimate position and/or orientation of inspection sensor 102 and/or positioning sensor 206 may be provided during operation.
  • the virtual representation may be recalibrated, as necessary, when positional awareness data provided by inspection sensor 102 and/or positioning sensor 206 is not consistent with the virtual representation.
  • the embodiments described herein provide for remotely placing and/or visualizing a sensor to inspect various components within limited access areas.
  • the exemplary methods and systems facilitate reducing a time and/or cost associated with aircraft inspections.
  • the exemplary systems and methods are not limited to the specific embodiments described herein, but rather, components of each system and/or steps of each method may be utilized independently and separately from other components and/or method steps described herein. Each component and each method step may also be used in combination with other components and/or method steps.
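
The forward-kinematics sketch referenced above is not part of the patent disclosure. It is a minimal illustration, in Python with NumPy, of how motion instructions sent to each jointed segment of an arm such as arm 204 can be chained into an end-effector pose. The per-segment yaw/pitch parameterization, the segment length, and the function names are assumptions made for the example.

    # Hypothetical sketch (not from the patent): forward kinematics for a chain of
    # jointed segments, showing how commanded joint angles map to an end-effector pose.
    import numpy as np

    def rot_z(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    def rot_y(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

    def forward_kinematics(joint_angles, segment_length=0.1):
        """joint_angles: list of (yaw, pitch) pairs, one per jointed segment.
        Returns the 4x4 pose of the arm tip (inspection sensor mount) in the base frame."""
        pose = np.eye(4)
        for yaw, pitch in joint_angles:
            step = np.eye(4)
            step[:3, :3] = rot_z(yaw) @ rot_y(pitch)  # orient this segment
            step[:3, 3] = step[:3, :3] @ np.array([segment_length, 0.0, 0.0])  # advance along it
            pose = pose @ step
        return pose

    # Example: ten segments producing a gentle spiral-like bend.
    angles = [(np.radians(5), np.radians(8))] * 10
    tip = forward_kinematics(angles)
    print("end-effector position in base frame:", np.round(tip[:3, 3], 3))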
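The dead-reckoning sketch referenced above shows one plausible way, assumed for illustration only, that a measurement system built around accelerometers and gyroscopes could turn raw samples into relative position and orientation information. Gravity compensation, bias estimation, and re-orthonormalization of the rotation matrix are omitted, and the sample values are invented.

    # Hypothetical sketch (not from the patent): simple strap-down dead reckoning that
    # turns gyroscope (rad/s) and accelerometer (m/s^2) samples into a relative pose,
    # i.e. movement of a positioning sensor from a first position toward a second one.
    import numpy as np

    def skew(w):
        return np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])

    def dead_reckon(gyro, accel, dt, R0=np.eye(3), v0=np.zeros(3), p0=np.zeros(3)):
        """gyro, accel: arrays of shape (N, 3) in the sensor frame.
        Returns orientation R, velocity v and position p relative to the starting frame."""
        R, v, p = R0.copy(), v0.copy(), p0.copy()
        for w, a in zip(gyro, accel):
            R = R @ (np.eye(3) + skew(w) * dt)   # first-order orientation update
            a_world = R @ a                      # rotate measured acceleration into start frame
            v = v + a_world * dt                 # (gravity assumed already removed)
            p = p + v * dt
        return R, v, p

    # Example: constant yaw rate while accelerating gently along the sensor's x axis.
    N, dt = 200, 0.01
    gyro = np.tile([0.0, 0.0, 0.05], (N, 1))
    accel = np.tile([0.02, 0.0, 0.0], (N, 1))
    R, v, p = dead_reckon(gyro, accel, dt)
    print("relative displacement after 2 s:", np.round(p, 4))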
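The coordinate-registration sketch referenced above illustrates how pan, tilt, and range readings from an instrument such as local coordinate measurement system 300 might be converted to Cartesian points and fit to known feature coordinates with a best-fit rigid transform (here the Kabsch algorithm). The patent does not prescribe a particular algorithm; the measurement values and CAD coordinates below are invented for the example.

    # Hypothetical sketch (not from the patent): register the instrument frame to the
    # structure's coordinate system from a handful of measured exterior features.
    import numpy as np

    def spherical_to_cartesian(pan, tilt, rng):
        """pan about the vertical axis, tilt about the horizontal axis, range in metres."""
        return np.array([rng * np.cos(tilt) * np.cos(pan),
                         rng * np.cos(tilt) * np.sin(pan),
                         rng * np.sin(tilt)])

    def rigid_transform(src, dst):
        """Best-fit rotation R and translation t so that dst is approximately R @ src + t."""
        src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
        U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = dst.mean(0) - R @ src.mean(0)
        return R, t

    # Example: three features measured as (pan, tilt, range) and the same features'
    # assumed coordinates in the structure's CAD model.
    measurements = [(0.10, 0.05, 4.2), (0.35, 0.02, 5.1), (-0.20, 0.12, 3.8)]
    measured_pts = np.array([spherical_to_cartesian(*m) for m in measurements])
    cad_pts = np.array([[10.0, 2.0, 1.5], [10.8, 3.4, 1.4], [9.1, 1.2, 1.9]])  # assumed values
    R, t = rigid_transform(measured_pts, cad_pts)
    print("instrument-to-structure rotation:\n", np.round(R, 3), "\ntranslation:", np.round(t, 3))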
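The coordinate-transformation sketch referenced above shows, again only as an assumed illustration, how a pose reported relative to the robot base (for example by a positioning sensor or by the arm kinematics) can be expressed in the coordinate system of the target object so that the virtual objects can be drawn in the proper positions on a graphical presentation interface. The numeric transforms are placeholders.

    # Hypothetical sketch (not from the patent): chain homogeneous transforms to express
    # the sensor pose in structure (target object) coordinates.
    import numpy as np

    def make_pose(R, t):
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, t
        return T

    # Assumed inputs: T_structure_base would come from a registration step (e.g. the Kabsch
    # sketch above); T_base_sensor from the arm kinematics or the inertial positioning sensor.
    T_structure_base = make_pose(np.eye(3), np.array([10.0, 2.0, 1.5]))
    T_base_sensor = make_pose(np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]), np.array([0.6, 0.1, 0.3]))

    T_structure_sensor = T_structure_base @ T_base_sensor  # chain the transforms
    sensor_position_in_structure = T_structure_sensor[:3, 3]
    print("inspection sensor drawn at", sensor_position_in_structure, "in structure coordinates")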
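Finally, the landmark re-calibration sketch referenced above illustrates the idea of re-anchoring the positioning data when the arm touches an obstruction whose position is known: the reported position is replaced by the known position and the same offset is applied to subsequent readings. This is a simplified, position-only correction invented for the example; the patent leaves the adjustment method open.

    # Hypothetical sketch (not from the patent): correct accumulated drift by snapping the
    # reported position to a landmark of known position in the structure's coordinate system.
    import numpy as np

    def recalibrate(reported_position, landmark_position):
        """Returns a correction offset and the corrected position."""
        offset = np.asarray(landmark_position) - np.asarray(reported_position)
        return offset, np.asarray(reported_position) + offset

    reported = np.array([4.92, 1.07, 0.48])   # position reported by the positioning sensor
    landmark = np.array([5.00, 1.00, 0.50])   # known position of the contacted obstruction
    offset, corrected = recalibrate(reported, landmark)
    print("correction offset:", offset)

    # Subsequent sensor readings can then be shifted by the same offset until the next landmark.
    next_reading = np.array([5.13, 1.22, 0.47])
    print("corrected reading:", next_reading + offset)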

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Manipulator (AREA)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US12/787,885 US9149929B2 (en) 2010-05-26 2010-05-26 Methods and systems for inspection sensor placement
PCT/US2011/029717 WO2011149582A1 (en) 2010-05-26 2011-03-24 Methods and systems for inspection sensor placement
JP2013512617A JP5955316B2 (ja) 2010-05-26 2011-03-24 検査センサを配置する方法及びシステム
AU2011258831A AU2011258831B2 (en) 2010-05-26 2011-03-24 Methods and systems for inspection sensor placement
EP11716337.8A EP2576156B1 (en) 2010-05-26 2011-03-24 Methods and systems for inspection sensor placement
CN201180025897.8A CN102917844B (zh) 2010-05-26 2011-03-24 用于检查传感器放置的方法和系统

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/787,885 US9149929B2 (en) 2010-05-26 2010-05-26 Methods and systems for inspection sensor placement

Publications (2)

Publication Number Publication Date
US20110295427A1 US20110295427A1 (en) 2011-12-01
US9149929B2 (en) 2015-10-06

Family

ID=44050149

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/787,885 Active 2031-02-21 US9149929B2 (en) 2010-05-26 2010-05-26 Methods and systems for inspection sensor placement

Country Status (6)

Country Link
US (1) US9149929B2 (en)
EP (1) EP2576156B1 (en)
JP (1) JP5955316B2 (ja)
CN (1) CN102917844B (zh)
AU (1) AU2011258831B2 (en)
WO (1) WO2011149582A1 (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9082208B2 (en) * 2011-07-12 2015-07-14 Spirit Aerosystems, Inc. System and method for locating and displaying aircraft information
US9310317B2 (en) 2012-01-25 2016-04-12 The Boeing Company Automated system and method for tracking and detecting discrepancies on a target object
WO2014100829A1 (en) * 2012-12-21 2014-06-26 John Bean Technology Corporation Thermal measurement and process control
US9588515B2 (en) * 2012-12-31 2017-03-07 General Electric Company Systems and methods for remote control of a non-destructive testing system
US9152304B2 (en) * 2012-12-31 2015-10-06 General Electric Company Systems and methods for virtual control of a non-destructive testing system
US9581438B2 (en) * 2012-12-31 2017-02-28 General Electric Company Systems and methods for control of a non-destructive testing system
US20140207406A1 (en) * 2013-01-22 2014-07-24 General Electric Company Self-directed inspection plan
US20140207403A1 (en) * 2013-01-22 2014-07-24 General Electric Company Inspection instrument auto-configuration
US9954908B2 (en) * 2013-01-22 2018-04-24 General Electric Company Systems and methods for collaborating in a non-destructive testing system
US10725478B2 (en) * 2013-07-02 2020-07-28 The Boeing Company Robotic-mounted monument system for metrology systems
JP2015112708A (ja) * 2013-12-16 2015-06-22 多摩川精機株式会社 締め装置及びその締め方法
US9856037B2 (en) 2014-06-18 2018-01-02 The Boeing Company Stabilization of an end of an extended-reach apparatus in a limited-access space
FR3028615B1 (fr) * 2014-11-14 2017-01-13 Aircelle Sa Procede d’inspection d’un produit tel qu’un composant d’une nacelle de turboreacteur
CN106292655A (zh) * 2015-06-25 2017-01-04 松下电器(美国)知识产权公司 远程作业装置和控制方法
CN105092928B (zh) * 2015-07-23 2018-04-20 深圳市华谊智测科技股份有限公司 数字钳型表及其自动测量方法
US9841836B2 (en) * 2015-07-28 2017-12-12 General Electric Company Control of non-destructive testing devices
US10196927B2 (en) * 2015-12-09 2019-02-05 General Electric Company System and method for locating a probe within a gas turbine engine
GB2550395B (en) * 2016-05-19 2020-08-12 Hs Marston Aerospace Ltd Method and system for thermographic analysis
EP3260250B1 (en) * 2016-06-21 2019-10-02 Ansaldo Energia IP UK Limited Robotic system for confined space operations background
ES2899284T3 (es) * 2016-07-15 2022-03-10 Fastbrick Ip Pty Ltd Vehículo que incorpora una máquina de colocación de ladrillos
FR3056134B1 (fr) * 2016-09-20 2018-08-31 Airbus Sas Dispositif robotise pour l'inspection d'une structure d'aeronef
US10814480B2 (en) 2017-06-14 2020-10-27 The Boeing Company Stabilization of tool-carrying end of extended-reach arm of automated apparatus
US10625427B2 (en) 2017-06-14 2020-04-21 The Boeing Company Method for controlling location of end effector of robot using location alignment feedback
DE102018103333B3 (de) 2018-02-14 2019-05-09 Gesellschaft zur Förderung angewandter Informatik eV Verfahren und System zur dynamischen Strukturanalyse
US11084169B2 (en) * 2018-05-23 2021-08-10 General Electric Company System and method for controlling a robotic arm
US20190383158A1 (en) * 2018-06-14 2019-12-19 General Electric Company Probe Motion Compensation
US11906506B1 (en) * 2021-12-21 2024-02-20 Omidreza Ghanadiof System and method for inspecting and maintaining the exterior elevated elements of building structures
US12033314B2 (en) 2021-12-21 2024-07-09 Omidreza Ghanadiof System and method for inspecting and maintaining the exterior elevated elements of building structures
IT202200005888A1 (it) * 2022-03-24 2023-09-24 Bm Group Holding S P A Apparecchiatura per effettuare misurazioni tra le gabbie in un processo di laminazione.

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6067093A (ja) * 1983-09-22 1985-04-17 株式会社東芝 連続部材
FR2625936A1 (fr) * 1988-01-14 1989-07-21 Hispano Suiza Sa Procede de mise en place d'un robot porte-outils destine a des interventions en milieu humainement hostile
JPH0386484A (ja) * 1989-08-25 1991-04-11 Fujitsu Ltd ロボットの遠隔操作装置
JPH08155863A (ja) * 1994-12-02 1996-06-18 Fujitsu Ltd ロボット遠隔操作システム
JPH11104984A (ja) * 1997-10-06 1999-04-20 Fujitsu Ltd 実環境情報表示装置及び実環境情報表示処理を実行するプログラムを記録したコンピュータ読み取り可能な記録媒体
FR2822573B1 (fr) * 2001-03-21 2003-06-20 France Telecom Procede et systeme de reconstruction a distance d'une surface
JP2002292582A (ja) * 2001-03-30 2002-10-08 Hitachi Zosen Corp 作業用ロボット装置
SE0203908D0 (sv) * 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
JP2004223128A (ja) * 2003-01-27 2004-08-12 Hitachi Ltd 医療行為支援装置および方法
US7448271B2 (en) * 2005-08-17 2008-11-11 The Boeing Company Inspection system and associated method
JP4734120B2 (ja) * 2006-01-06 2011-07-27 株式会社東芝 航空機機体の検査方法および装置
JP4298757B2 (ja) * 2007-02-05 2009-07-22 ファナック株式会社 ロボット機構のキャリブレーション装置及び方法
CN102859317A (zh) * 2010-05-04 2013-01-02 形创有限公司 使用参考的体积分析传感器的物体检查

Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4362977A (en) * 1980-06-30 1982-12-07 International Business Machines Corporation Method and apparatus for calibrating a robot to compensate for inaccuracy of the robot
US5374830A (en) * 1984-10-12 1994-12-20 Sensor Adaptive Machines, Inc. Target based determination of robot and sensor alignment
US5525883A (en) * 1994-07-08 1996-06-11 Sara Avitzour Mobile robot location determination employing error-correcting distributed landmarks
JP2000070269A (ja) 1998-09-01 2000-03-07 Honda Seiki Kk 磁気センサー及び感圧センサーによりバーチャル・リアリティに誘導される能動内視鏡とその操作システム。
US6968224B2 (en) * 1999-10-28 2005-11-22 Surgical Navigation Technologies, Inc. Method of detecting organ matter shift in a patient
US7171279B2 (en) * 2000-08-18 2007-01-30 Oliver Crispin Robotics Limited Articulating arm for positioning a tool at a location
US6378387B1 (en) * 2000-08-25 2002-04-30 Aerobotics, Inc. Non-destructive inspection, testing and evaluation system for intact aircraft and components and method therefore
US20030089183A1 (en) * 2001-11-13 2003-05-15 Jacobsen Robert A. Apparatus and method for non-destructive inspection of large structures
US6907799B2 (en) * 2001-11-13 2005-06-21 Bae Systems Advanced Technologies, Inc. Apparatus and method for non-destructive inspection of large structures
US20040013295A1 (en) * 2002-03-15 2004-01-22 Kohtaro Sabe Obstacle recognition apparatus and method, obstacle recognition program, and mobile robot apparatus
US7117067B2 (en) * 2002-04-16 2006-10-03 Irobot Corporation System and methods for adaptive control of robotic devices
US6822412B1 (en) * 2003-06-11 2004-11-23 Zhongxue Gan Method for calibrating and programming of a robot application
US7387179B2 (en) * 2003-06-17 2008-06-17 Science Applications International Corporation Toroidal propulsion and steering system
US7099745B2 (en) * 2003-10-24 2006-08-29 Sap Aktiengesellschaft Robot system using virtual world
US20050203382A1 (en) * 2004-02-23 2005-09-15 Assaf Govari Robotically guided catheter
US20090171151A1 (en) * 2004-06-25 2009-07-02 Choset Howard M Steerable, follow the leader device
US20080195343A1 (en) * 2005-03-08 2008-08-14 Peter Osterlund Method of Calibration
US20070113690A1 (en) 2005-08-31 2007-05-24 Honeywell International, Inc. Method and system for navigating a nondestructive evaluation device
US7499772B2 (en) * 2005-08-31 2009-03-03 Honeywell International Inc. Method and system for navigating a nondestructive evaluation device
US20090069937A1 (en) 2006-02-09 2009-03-12 Gunter Battenberg Method and Device for The Fully Authomatic Final Inspection of Components and/or Their Functional Units
US20080004523A1 (en) * 2006-06-29 2008-01-03 General Electric Company Surgical tool guide
US20100010504A1 (en) * 2006-09-19 2010-01-14 The Trustees Of Columbia University In The City Of New York Systems, devices, and methods for surgery on a hollow anatomically suspended organ
US20080097156A1 (en) * 2006-10-23 2008-04-24 Pentax Corporation Camera calibration for endoscope navigation system
US20100188510A1 (en) * 2007-03-13 2010-07-29 Ki-Sung Yoo Landmark for position determination of mobile robot and apparatus and method using it
US20080302200A1 (en) * 2007-06-06 2008-12-11 Tobey Wayland E Modular hybrid snake arm
US20090137952A1 (en) * 2007-08-14 2009-05-28 Ramamurthy Bhaskar S Robotic instrument systems and methods utilizing optical fiber sensor
US20090086199A1 (en) 2007-09-28 2009-04-02 The Boeing Company Method involving a pointing instrument and a target object
US20090086014A1 (en) 2007-09-28 2009-04-02 The Boeing Company Local positioning system and method
US20100102980A1 (en) 2008-10-28 2010-04-29 The Boeing Company Hand-Held Positioning Interface for Spatial Query
US20100153051A1 (en) 2008-12-15 2010-06-17 Georgeson Gary E Locating A Component Underneath A Surface Of A Target Object And Locating An Access Panel For Accessing The Component
EP2216144A2 (de) 2009-02-06 2010-08-11 Günther Battenberg Verfahren und System zur Kontrolle von Bauteilen und/oder Funktionseinheiten mit einer Prüfvorrichtung
US20100228506A1 (en) 2009-03-09 2010-09-09 Motzer William P Non-destructive inspection apparatus
US20100235037A1 (en) * 2009-03-16 2010-09-16 The Boeing Company Autonomous Inspection and Maintenance

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Buckingham, R. et al.; Snake-Arm Robots: A New Approach to Aircraft Assembly; SAE Aerotech Congress, Los Angeles, CA; Sep. 17, 2007; pp. 1-6.
International Search Report and Written Opinion of PCT/US2011/029717; Jun. 16, 2011; 14 pages.
Notice of Reasons for Rejection from the Japanese Patent Office for application No. 2013-512617, Apr. 7, 2015, 4 pages.
U.S. Appl. No. 12/640,211, filed Dec. 17, 2009.

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9625287B2 (en) * 2013-10-22 2017-04-18 The United States Of America, As Represented By The Secretary Of The Army Controllable-arch sensor boom or crane
US9911251B2 (en) * 2014-12-15 2018-03-06 Bosch Automotive Service Solutions Inc. Vehicle diagnostic system and method
US20160171802A1 (en) * 2014-12-15 2016-06-16 Bosch Automotive Service Solutions Inc. Vehicle Diagnostic System and Method
US10737396B2 (en) * 2015-04-15 2020-08-11 Abb Schweiz Ag Method and apparatus for robot path teaching
US10611022B2 (en) * 2016-11-29 2020-04-07 Rolls-Royce Plc Methods, apparatus, computer programs and non-transitory computer readable storage mediums for controlling a hyper redundant manipulator
US20190054640A1 (en) * 2017-08-15 2019-02-21 Avigilon Corporation Camera on movable arm
US10543605B2 (en) * 2017-08-15 2020-01-28 Avigilon Corporation Camera on movable arm
US20190054638A1 (en) * 2017-08-18 2019-02-21 Rolls-Royce Plc Hyper-redundant manipulators
US10786903B2 (en) * 2017-10-05 2020-09-29 Institute Of Nuclear Energy Research, Atomic Energy Council, Executive Yuan Map creation system and method thereof for movable robot
US10488349B2 (en) 2017-11-14 2019-11-26 General Electric Company Automated borescope insertion system
US10489896B2 (en) 2017-11-14 2019-11-26 General Electric Company High dynamic range video capture using variable lighting
US10775315B2 (en) 2018-03-07 2020-09-15 General Electric Company Probe insertion system
US11707819B2 (en) 2018-10-15 2023-07-25 General Electric Company Selectively flexible extension tool
US11702955B2 (en) 2019-01-14 2023-07-18 General Electric Company Component repair system and method
US11118948B2 (en) 2019-08-23 2021-09-14 Toyota Motor North America, Inc. Systems and methods of calibrating vehicle sensors using augmented reality
US10782267B1 (en) 2019-11-04 2020-09-22 Equate Petrochemical Company Mobile non-destructive testing inspection system
US11692650B2 (en) 2020-01-23 2023-07-04 General Electric Company Selectively flexible extension tool
US11752622B2 (en) 2020-01-23 2023-09-12 General Electric Company Extension tool having a plurality of links
US11613003B2 (en) 2020-01-24 2023-03-28 General Electric Company Line assembly for an extension tool having a plurality of links
US11834990B2 (en) 2020-03-10 2023-12-05 Oliver Crispin Robotics Limited Insertion tool
US12091981B2 (en) 2020-06-11 2024-09-17 General Electric Company Insertion tool and method
US11654547B2 (en) 2021-03-31 2023-05-23 General Electric Company Extension tool

Also Published As

Publication number Publication date
CN102917844A (zh) 2013-02-06
EP2576156A1 (en) 2013-04-10
JP2013527040A (ja) 2013-06-27
AU2011258831A1 (en) 2012-09-20
JP5955316B2 (ja) 2016-07-20
WO2011149582A1 (en) 2011-12-01
AU2011258831B2 (en) 2016-02-25
CN102917844B (zh) 2016-09-28
EP2576156B1 (en) 2019-05-29
US20110295427A1 (en) 2011-12-01

Similar Documents

Publication Publication Date Title
US9149929B2 (en) Methods and systems for inspection sensor placement
JP5722224B2 (ja) 空間照会のための手持ち型位置決めインタフェース
ES2785302T3 (es) Método y sistema para inspeccionar una pieza de trabajo
US10665012B2 (en) Augmented reality camera for use with 3D metrology equipment in forming 3D images from 2D camera images
CN103608642B (zh) 通过激光跟踪仪对维度数据的自动测量
US10286553B2 (en) Methods and systems for automatically inspecting an object
CN108759834B (zh) 一种基于全局视觉的定位方法
JP2009031295A (ja) 移動体姿勢検出装置
CN110849363A (zh) 激光雷达与组合惯导的位姿标定方法、系统及介质
CN113384347B (zh) 一种机器人标定方法、装置、设备及存储介质
CN112902965A (zh) 机器人跨楼层轨迹的显示方法与系统
JP2021096566A (ja) 走行位置検証システム、走行位置計測システム及び走行位置補正システム
Karam Developing a SLAM-based backpack mobile mapping system for indoor mapping

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE BOEING COMPANY, ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MOTZER, WILLIAM P;GEORGESON, GARY E;LEA, SCOTT W;AND OTHERS;REEL/FRAME:024451/0684

Effective date: 20100527

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8