WO2018183709A1 - Systems and methods for performing an inspection inside a machine - Google Patents

Systems and methods for performing an inspection inside a machine

Info

Publication number
WO2018183709A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
image
air gap
crawler
path
Prior art date
Application number
PCT/US2018/025188
Other languages
French (fr)
Inventor
Hetal V. LAKHANI
Remus Boca
Gregory F. Rossano
Gregory A. COLE
Biao Zhang
Cajetan Pinto
Original Assignee
Abb Schweiz Ag
Priority date
Filing date
Publication date
Application filed by Abb Schweiz Ag
Publication of WO2018183709A1

Classifications

    • H ELECTRICITY
    • H02 GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02K DYNAMO-ELECTRIC MACHINES
    • H02K15/00 Methods or apparatus specially adapted for manufacturing, assembling, maintaining or repairing of dynamo-electric machines
    • H02K15/0006 Disassembling, repairing or modifying dynamo-electric machines
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/9515 Objects of complex shape, e.g. examined with use of a surface follower device
    • G01N21/954 Inspecting the inner surface of hollow bodies, e.g. bores
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • G01N2021/889 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques providing a bare video image, i.e. without visual measurement aids
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/2476 Non-optical details, e.g. housings, mountings, supports
    • G02B23/2484 Arrangements in relation to a camera or imaging device
    • G02B23/2492 Arrangements for use in a hostile environment, e.g. a very hot, cold or radioactive environment
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Manufacturing & Machinery (AREA)
  • Power Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A method for performing an inspection inside a machine includes inserting a remote controlled vehicle into the machine, wherein the remote controlled vehicle includes at least one camera. The remote controlled vehicle is directed along a path inside the machine. A plurality of images are captured inside the machine with the at least one camera. The images are processed to generate a panoramic image and/or a 3D image. The panoramic image and/or the 3D image are displayed on a display.

Description

SYSTEMS AND METHODS FOR PERFORMING AN INSPECTION INSIDE A MACHINE
TECHNICAL FIELD
The present application generally relates to machine inspection and more particularly, but not exclusively, to systems and methods for performing an inspection inside a machine.
BACKGROUND
Inspection systems of various types, e.g., for inspecting internal components of machines, remain an area of interest. Some existing systems have various shortcomings, drawbacks and disadvantages relative to certain applications. For example, in some inspection systems, some internal portions of the machine may not be readily perceivable by the user of the inspection system. Accordingly, there remains a need for further contributions in this area of technology.
SUMMARY
One embodiment of the present invention is a unique method for performing an inspection inside a machine. Another embodiment is a unique system for performing an inspection inside an electrical machine. Other embodiments include apparatuses, systems, devices, hardware, methods, and combinations for inspecting internal components of machines. Further embodiments, forms, features, aspects, benefits, and advantages of the present application shall become apparent from the description and figures provided herewith.
BRIEF DESCRIPTION OF THE FIGURES
The description herein makes reference to the accompanying drawings wherein like reference numerals refer to like parts throughout the several views, and wherein:
FIG. 1 schematically illustrates some aspects of a non-limiting example of an inspection system for performing an inspection inside an electrical machine in accordance with an embodiment of the present invention.
FIG. 2 schematically illustrates some aspects of a non-limiting example of an inspection system for performing an inspection inside an electrical machine in accordance with an embodiment of the present invention.
FIG. 3 schematically illustrates some aspects of a non-limiting example of a crawler disposed within an air gap inside an electrical machine in accordance with an embodiment of the present invention.
FIG. 4 schematically illustrates some aspects of a non-limiting example of a plurality of paths inside a machine in which images are captured in accordance with an embodiment of the present invention.
FIGS. 5A - 5C schematically illustrate some aspects of a non-limiting example of head position based image display in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENTS
For the purposes of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles of the invention as described herein are contemplated as would normally occur to one skilled in the art to which the invention relates.
Referring to FIGS. 1 and 2, some aspects of a non-limiting example of an inspection system 10 for performing an inspection inside a machine, such as an electrical machine, in accordance with an embodiment of the present invention are schematically illustrated. System 10 includes a remote controlled vehicle 12, a controller 14, an image processor 16, and a head-mounted display (HMD) 18. Some embodiments may include an input device 20 coupled to controller 14, e.g., for receiving input to guide remote controlled vehicle 12. Input device 20 may be, for example, a joystick and/or a keyboard. In some embodiments, input device 20 or another input device for guiding remote controlled vehicle 12 may be communicatively coupled to image processor 16 and/or communicatively coupled to and/or associated with head-mounted display 18 in addition to or in place of being coupled to controller 14.
Controller 14 is coupled to remote controlled vehicle 12 via a tether 22, and operative to supply power and control signals to remote controlled vehicle 12 via tether 22. Remote controlled vehicle 12 is operative to provide data, e.g., image data and telemetry data, to controller 14. In one form, remote controlled vehicle 12 is a magnetic air gap crawler, and is referred to herein as magnetic air gap crawler 12 or crawler 12. In other embodiments, remote controlled vehicle 12 may take other forms. Crawler 12 is constructed to fit within and move about within a desired portion of a machine being inspected, e.g., an electrical machine being inspected, such as a generator or a motor or a transformer. For example, crawler 12 is constructed to fit within and move about within the air gap between the stator and the rotor in an electrical machine, e.g., including crawling along the length of the laminated stator core or rotor of the electrical machine. In the illustrated embodiments, crawler 12 includes a body 28 having four (4) drive modules 30 and a central module 32. In other embodiments, crawler 12 may take other forms, and may have a greater or lesser number of modules and/or different modules.
Each drive module 30 includes a motor-driven track or tread 40 supported by rollers 42, and provides propulsion to crawler 12. Drive modules 30 are adhered to ferrous or other magnetic materials, such as stator core materials, by magnets 44 disposed adjacent to and under tracks 40. The two outer drive modules 30 are coupled to the two inner drive modules 30 on each side via hinge joints 46. The two inner drive modules 30 are coupled to central module 32 by hinge joints 46. Hinge joints 46 allow the various modules of crawler 12 to shift and rotate relative to each other as necessary to operate in confined spaces on curved or compound surfaces, such as the inner surface of an electrical machine laminated stator, e.g., a motor or generator stator core, in the air gap between the stator and the rotor. Controller 14 is operative to transmit motor drive commands and power signals to crawler 12 via tether 22 to direct crawler 12 to move in a desired direction within the electrical machine being inspected.
Central module 32 is coupled to controller 14 via tether 22, and receives power for operating drive modules 30 and camera modules 50 from controller 14 via tether 22. Central module 32 is electrically coupled to each drive module 30 and supplies electrical power for driving each drive module 30 from controller 14. Central module 32 includes a plurality of camera modules 50, each of which includes a camera and a light, e.g., an LED light. Some embodiments may include a structured light projector with one or more camera modules 50. Camera modules 50 are operative to capture still image data and video image data of machine components in the vicinity of crawler 12. Central module 32 is operative to transmit image data from camera modules 50 to controller 14 via tether 22. In some embodiments, central module 32 is also operative to transmit telemetry data from crawler 12 to controller 14 via tether 22. Telemetry data may be generated, measured or sensed by crawler 12 and/or controller 14 using one or more sensors (not shown), and may include, for example and without limitation, crawler 12 position within the machine, e.g., stator slot number and distance along the slot; crawler 12 tilt angle; humidity; and/or other telemetry data. In addition, in some embodiments, a borescope may be mounted on crawler 12, e.g., directly or via a separate module attached to crawler 12, which may also provide image data.
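As a concrete illustration of the kind of telemetry record described above, the following minimal Python sketch defines a hypothetical sample structure; the field names, types and units are assumptions chosen for illustration, not the actual format exchanged between crawler 12 and controller 14.

```python
from dataclasses import dataclass


@dataclass
class CrawlerTelemetry:
    """Hypothetical telemetry sample; fields mirror the examples in the text."""
    stator_slot: int        # stator slot number the crawler is traversing
    slot_distance_m: float  # distance travelled along the slot (assumed metres)
    tilt_deg: float         # crawler tilt angle
    humidity_pct: float     # relative humidity near the crawler
    timestamp_s: float      # capture time, for pairing samples with image frames
```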
Controller 14 includes a plurality of display monitors 54 for displaying the image data captured, e.g., by camera modules 50. Controller 14 also includes a microcontroller 56 for controlling the operation of crawler 12, which in some embodiments includes a module (not shown) for transmitting data received from crawler 12 to image processor 16. In some embodiments, controller 14 also includes an image recorder 58, such as a digital video recorder (DVR), for recording video and still image data, as well as telemetry data received from crawler 12. In some embodiments, controller 14 includes a power distribution module 60 for supplying power to microcontroller 56 and crawler 12. In some embodiments, controller 14 includes a keyboard (not shown).
Image processor 16 is communicatively coupled to controller 14. Controller 14 is operative to transmit image data (video and still image data) and telemetry data to image processor 16. Image processor 16 is operative to process the image data captured by camera modules 50, e.g., camera modules 50 disposed at a front portion 62 of crawler 12, to generate a panoramic image and/or a 3D image of the scene viewed by cameras 50, e.g., the inside of the electrical machine. Image processor 16 includes an image stitching / 3D reconstruction module 64 operative to process the image data to generate the panoramic image and/or 3D image, e.g., as selected by the user. Image processor 16 also includes a display monitoring module 66, and includes a head-mounted display controller 68 operative to control the images and graphics displayed on HMD 18.
In one form, HMD 18 is a virtual reality headset. In one form, HMD 18 includes dual ocular displays. In other embodiments, other display forms may be employed. HMD 18 is operative to perform user head position tracking, e.g., for changing the viewing angle or other viewing parameters of the images and telemetry displayed by HMD 18, and also for controlling the operation of crawler 12 in some embodiments. A suitable example of a virtual reality headset is the Oculus Rift, commercially available from Oculus VR, LLC of Menlo Park, CA, USA. In other embodiments, HMD 18 may take other forms, and may be, for example, an augmented reality headset. A suitable example of an augmented reality headset is the Vuzix augmented reality headset, commercially available from Vuzix Corporation of West Henrietta, NY, USA. In addition to displaying video and still image data on HMD 18, some embodiments also display video and still image data on one or more display monitors 54.
During operation, in one form, controller 14 is operative to transmit control commands 74 to crawler 12, e.g., motor driver commands, lighting control commands and other commands to operate crawler 12. Crawler 12 is operative to transmit telemetry and image data 76 to controller 14, e.g., including telemetry data, and video and still image data. Controller 14 is operative to transmit the telemetry and image data 76 to image processor 16. In other embodiments, crawler 12, a portion of tether 22 and/or an interface dongle plugged into controller 14 or image processor 16 may be operative to transmit telemetry and image data 76 to image processor 16. In some embodiments, the telemetry and image data 76 transmitted by controller 14 to image processor 16 may include annotation data, e.g., operator notes pertaining to the transmitted images, such as notes pertaining to the condition of the machine illustrated in the images, for display or audio playback via, for example, HMD 18. Image processor 16 is operative to process the telemetry and image data 76 into a video display output 78 suitable for HMD 18, and transmits the video display output 78 to HMD 18. HMD 18 is operative to display the telemetry data and image data, as well as display or play back annotation data. HMD 18 is also operative to transmit head tracking telemetry and user inputs 80 to image processor 16, e.g., for controlling crawler 12 and/or for changing viewing perspective or other viewing features of the image and telemetry data displayed on HMD 18 based on user head position. In some embodiments, image processor 16 is operative to parse any user input data 82 used for controlling crawler 12 from head tracking telemetry and user inputs 80 used for controlling the display of images. The user input data 82 may be, for example, generated directly in HMD 18 based on user head position or movement, and/or based on other user input associated with HMD 18, e.g., a hand controller or other user input device(s) associated with HMD 18. Image processor 16 is operative to transmit the user input data 82 to controller 14 for generating control signals to control crawler 12 based on the user input. In some embodiments, some user input data 82 may also or alternatively be supplied from input device 20 to controller 14.
Referring to FIG. 3, some aspects of a non-limiting example of crawler 12 inside an electrical machine 86, e.g., a turbo or hydro generator, or a motor or motor/generator, or a transformer, are illustrated in accordance with an embodiment of the present invention. In the illustration of FIG. 3, crawler 12 is disposed inside electrical machine 86 within an air gap 88 between a stator 90 and a rotor 92 of electrical machine 86. In other embodiments, air gap 88 may be disposed between other components. Crawler 12 is constructed to fit within air gap 88, and is operative to transport itself along a desired path 94, e.g., across the stator and/or the rotor inside the air gap 88. Crawler 12 is operative to capture a plurality of images inside the electrical machine 86 with one or more camera modules 50 while inside air gap 88. Crawler 12 is magnetically adhered to stator 90 by magnets 44, and is able to move along the desired path 94 on stator 90 using tracks 40 of drive modules 30. The desired path 94 may be, for example, a path along a particular stator slot or pole 96 sought to be inspected. A stator winding overhang 98 may also be inspected by crawler 12, e.g., by extending a camera or borescope tip from front portion 62 of crawler 12.
Referring to FIG. 4, during an inspection procedure, crawler 12 may be guided down adjacent paths, e.g., labeled as paths 94A - 94P, wherein images may be taken at intervals 100 along each path 94A - 94P. The number of paths and the circumferential spacing of each path from an adjacent path, as well as the length of each interval 100 along each path may vary with the needs of the application. In some embodiments, the intervals 100 may be small enough to effectively yield a continuous image stream, e.g., continuous video images along each path. In various embodiments, the images along a particular path and/or as between paths may be combined, e.g., converted to panoramic images, 3D images or a 3D model of the internals of electrical machine 86 formed as a composite of the images.
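To make the path-and-interval scheme concrete, the sketch below enumerates capture positions for a hypothetical inspection. The sixteen paths correspond to paths 94A - 94P; the core length, interval spacing and circumferential layout are assumed values chosen only for illustration, since the text notes these vary with the needs of the application.

```python
import numpy as np

n_paths = 16                 # paths 94A - 94P
core_length_m = 4.0          # assumed axial length of the traversable core
interval_m = 0.10            # assumed length of intervals 100; small enough for overlap

path_spacing_deg = 360.0 / n_paths
waypoints = [
    (path * path_spacing_deg, dist)   # (circumferential angle, axial position)
    for path in range(n_paths)
    for dist in np.arange(0.0, core_length_m, interval_m)
]
print(f"{len(waypoints)} capture positions")  # 16 paths x 40 intervals = 640
```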
The front three (3) camera modules 50 on crawler 12 can combine to give a panoramic view overlooking the stator lamination stampings, e.g., 2-4 stampings, depending upon the stator lamination and wedge size. HMD 18 effectively allows the user to experience full 3D immersion into the electrical machine 86 environment, e.g., within air gap 88. In some embodiments, HMD 18 provides a stereoscopic 3D rendering with a high resolution display, providing, for example, a field of view 110 degrees wide, with low latency head tracking to immerse the user in a virtual "world" within electrical machine 86 during the inspection procedure. The feed of the front camera modules 50 of crawler 12 may be projected so as to be viewed in an immersive virtual reality manner that may in some embodiments allow the user to feel as if the user is inside the electrical machine 86.
Referring to FIGS. 5A - 5C, in some embodiments, HMD 18 is configured to receive user input in the form of user head angle, e.g., and image processor 16 is configured to process images obtained from camera modules 50. Camera modules 50 in front portion 62 of crawler 12 have overlapping views. In some embodiments, the front camera modules can project, in combination, approximately 125 degrees of view. A built-in head tracking mechanism in HMD 18 allows HMD 18 and image processor 16 to determine what portion of the captured images the user would like to see. Thus, for example, in some embodiments, when the user looks to the left, the images displayed on HMD 18 include left-most views LV1 and LV2, as well as central-left views CLV1 and CLV2 from the center and left camera modules 50. When the user looks to the center, the images displayed on HMD 18 include central-left views CLV1 and CLV2, as well as central-right views CRV1 and CRV2 displayed by the front left, central and right camera modules 50. When the user looks to the right, the images displayed on HMD 18 include central-right views CRV1 and CRV2 and right-most views RV1 and RV2 from the central and right camera modules 50. Thus, for example, image processor 16 may process image data from camera modules 50 on the front portion 62 of crawler 12 into a panoramic view or a 3D view, and the user head position or orientation may be used as an input to determine which portion of the panoramic 2D or 3D image is displayed by HMD 18. Accordingly, when the user looks left, the left-hand side view of the stator 90 and/or rotor 92 is presented in HMD 18; when the user looks to the center, the central portion of the view of stator 90 and/or rotor 92 is presented in HMD 18; and when the user looks to the right, the right-hand portion of the view of the stator 90 and/or rotor 92 is presented in HMD 18.
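A minimal sketch of this head-tracked windowing follows, assuming the stitched panorama spans roughly the 125 degrees noted above. The yaw convention, field-of-view values and function interface are illustrative assumptions, not the actual HMD 18 interface.

```python
import numpy as np


def viewport(panorama: np.ndarray, head_yaw_deg: float,
             pano_fov_deg: float = 125.0, view_fov_deg: float = 60.0) -> np.ndarray:
    """Return the slice of a stitched panorama centred on the user's head yaw.

    head_yaw_deg: 0 looks straight ahead; negative looks left, positive right.
    """
    h, w = panorama.shape[:2]
    px_per_deg = w / pano_fov_deg
    centre = w / 2 + head_yaw_deg * px_per_deg      # yaw 0 maps to panorama centre
    half = (view_fov_deg * px_per_deg) / 2
    left = int(np.clip(centre - half, 0, w))
    right = int(np.clip(centre + half, 0, w))
    return panorama[:, left:right]
```

With this windowing, a negative yaw slides the displayed slice toward views LV1/LV2 and a positive yaw toward RV1/RV2, mirroring FIGS. 5A - 5C.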
In some embodiments, an alternative approach to the use of a virtual reality HMD 18 is the use of an augmented reality HMD 18, e.g., wherein the opacity of the display can be adjusted such that the user can fade the view from seeing none of the camera image to being completely immersed in the camera image. This may be useful so that the operator can see the operator's hands while performing the inspection procedure, manipulate tools and controls, and possibly interact with the crawler itself without having to remove and reattach HMD 18.
In one form, camera modules 50 acquire images in synchronous mode. The captured images are transmitted from crawler 12 to controller 14, and then from controller 14 to image processor 16. Image processor 16 is operative to generate one or more different image display modalities, such as a 2D panorama or spherical view, or a 3D model of the view, for example, as selected by the user. The generated image/model output is sent from image processor 16 to HMD 18 for display to the user. The user can select from a global view of the generated model or can see partial views. The partial views can be changed using different modes including but not limited to head motion, hand gesture or motion, or using a keyboard or mouse or other input device.
Embodiments of the present invention include many variants for processing the images captured by camera modules 50 and generating the desired output, e.g., displayed in HMD 18. The processing of the images for one set of synchronous raw images may include feature extraction in all images, followed by feature matching. If a panoramic image is desired, then a stitching algorithm is used. This relates to the process of taking the disparate 2D images or image feeds from each of the camera modules 50 and stitching them together into a shaped 2D image through the identification of features within overlapping images, lining those features up to spatially locate them, and forming a spatially registered 2D image shell from the crawler 12 video feeds.
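The following OpenCV sketch illustrates the feature-extraction, matching and registration steps of such a stitching pipeline for one overlapping pair of frames. It is a minimal sketch, not the pipeline actually used by image processor 16; the ORB detector, match count and two-frame scope are assumptions, and a production stitcher would additionally blend seams and chain more than two frames.

```python
import cv2
import numpy as np


def stitch_pair(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    # 1. Feature extraction in both overlapping frames.
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # 2. Feature matching; keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:200]

    # 3. Line the matched features up: estimate the homography registering B onto A.
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(pts_b, pts_a, cv2.RANSAC, 5.0)

    # 4. Warp B into A's frame and overlay A, yielding a spatially registered mosaic.
    h, w = img_a.shape[:2]
    pano = cv2.warpPerspective(img_b, H, (w * 2, h))
    pano[:h, :w] = img_a
    return pano
```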
If a 3D model is desired, then triangulation followed by a dense reconstruction can be used. The 3D location of different features can be identified from 2D features in multiple images to generate a depth point cloud through triangulation. Multiple images and processing passes can be used to generate a 3D model of the physical features found in crawler 12 video feeds. In some embodiments, this is processor intensive and may be performed offline, e.g., for post inspection viewing rather than live navigating of crawler 12, whereas in other embodiments it may be performed online, e.g., for live viewing and live navigating.
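A sketch of the triangulation step, assuming calibrated projection matrices are available for two camera modules; the pixel correspondences would come from feature matching as above. The interface is an illustrative assumption.

```python
import cv2
import numpy as np


def triangulate(P1: np.ndarray, P2: np.ndarray,
                pts1: np.ndarray, pts2: np.ndarray) -> np.ndarray:
    """Recover 3D points from matched 2D features seen by two calibrated cameras.

    P1, P2: 3x4 projection matrices (K [R|t]) from calibration.
    pts1, pts2: 2xN arrays of matching pixel coordinates.
    Returns an Nx3 array forming part of the depth point cloud.
    """
    pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)  # homogeneous, 4xN
    return (pts4d[:3] / pts4d[3]).T                    # normalise, transpose to Nx3
```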
In order to speed up the calculation at inspection time, a calibration of each camera, followed by a stereo calibration, is performed. In order to extract 3D information and, in some embodiments, 2D feature information, each image may be adjusted for different aspects such as lens warp, focus, depth of field, etc. This can be extracted from the properties of each image in software, but creating a static transformation for each camera by performing a precise calibration ahead of time may eliminate the need for real time image calibration information extraction, by allowing the static transformation to be applied to each image. In some embodiments, if the calibration is not performed before using crawler 12 for an inspection procedure, then a bundle adjustment algorithm may be used to generate the 3D model of the scene and calibrate the cameras at the same time. For example, if the camera modules 50 are not individually calibrated ahead of the video collection, an algorithm (a bundle adjustment algorithm) can be used to rectify the image information based on known truths, e.g., images captured at the same time from each camera module 50 are assumed to have a rigid transformation between the viewed object and each other. This information can be used to calculate camera module 50 image capture properties. In some embodiments, such a procedure may be more processor intensive and somewhat less accurate than performing a pre-calibration of the camera modules 50 individually and then as a group.
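A minimal pre-calibration sketch using a checkerboard target, as is conventional with OpenCV; the 9x6 pattern, file locations and single-camera scope are assumptions, and the stereo step (e.g., cv2.stereoCalibrate) would follow the same pattern per camera pair rather than what is shown here.

```python
import glob

import cv2
import numpy as np

# One known 3D point per inner checkerboard corner (all on the z = 0 plane).
pattern = (9, 6)                                   # assumed corner grid
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in glob.glob("calibration/*.png"):        # assumed image location
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Intrinsics K and distortion coefficients form the static per-camera transformation.
_, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)

# At inspection time the stored transformation is simply applied to every frame.
frame = cv2.imread("frame.png")
undistorted = cv2.undistort(frame, K, dist)
```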
The algorithms mentioned above use distinct scene features to perform their calculations. If the scene viewed by camera modules 50 lacks distinct features, then a structured light projector can be used to generate features. For example, if there are enough distinct and shaped features in the imaging field, then the camera module 50 information alone can be used to generate a 3D model. When the imaging field is very uniform and the features become smaller, it becomes more difficult to use camera images to reconstruct 3D information as feature size approaches the boundary of the technology's abilities. In these scenarios a structured light projector may be employed on crawler 12 to project a known pattern on the imaging field, e.g., horizontal, vertical or angled stripes, which allows the cameras to more easily detect and measure features of smaller sizes. A striped structured light pattern is projected onto the 3D shape. Deviations from the striped pattern can be used to find the 3D shape more easily than with a regularly lit object. The structured light projector may be included as part of one or more camera modules 50. In some embodiments, the 3D models of the scene viewed by camera modules 50 can be used as odometry to find the position of the crawler within the specific elements of electrical machine 86. Once a 3D model of the internal portion of electrical machine 86 is generated, e.g., of structures viewable from within air gap 88, the 3D model may be used to accurately identify the position of crawler 12 within the electrical machine in subsequent inspections, thus tracking where crawler 12 is, how long it has been running, etc. During model generation, as the 3D model is created, it can be used to determine how far the crawler has travelled. In some embodiments, the real time version of this may be more processing intensive. This information can be used to guide the crawler inside electrical machine 86. In some embodiments, the 3D information can be used to ultimately support autonomous inspection, where crawler 12 can be placed in electrical machine 86 and guide itself throughout electrical machine 86 to perform a complete inspection.
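A rough sketch of how stripe deviations can be turned into relative depth, assuming vertical stripes of known pixel period are projected onto the surface and imaged in grayscale. The Otsu thresholding and the single scale factor are illustrative simplifications of a real structured light decoder, and the function is a hypothetical helper rather than part of the described system.

```python
import cv2
import numpy as np


def stripe_deviation_depth(striped: np.ndarray, period_px: float,
                           depth_per_px: float) -> np.ndarray:
    """Estimate relative depth from lateral deviation of projected stripes.

    On a flat reference surface the bright stripes land on exact multiples of
    period_px; surface relief shifts them sideways, and that shift (triangulated
    through the assumed projector-camera baseline) is proportional to the local
    depth change.
    """
    _, mask = cv2.threshold(striped, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    depth = np.zeros(striped.shape, np.float32)
    for y in range(striped.shape[0]):
        xs = np.flatnonzero(mask[y])                   # stripe pixels in this row
        if xs.size == 0:
            continue
        # Signed deviation from the nearest reference stripe position.
        dev = ((xs + period_px / 2) % period_px) - period_px / 2
        depth[y, xs] = dev * depth_per_px              # scale from projector geometry
    return depth
```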
Based on the images collected from camera modules 50 at intervals 100 along the desired paths, e.g., paths 94A-94P, in some embodiments, image processor 16 can build a 3D virtual world, i.e., a 3D model of the inside of the electrical machine 86, e.g., of all features viewable from within air gap 88. This allows the user to explore electrical machine 86 in the virtual world without moving the crawler, once the 3D model has been built. The user can move the viewpoint via inputs from a joystick, gestures, etc., using one of the input devices. This feature may allow the user to see a pattern develop over an area. For example, a slight discoloration seen over a large area may make a pattern apparent. Alternatively, the user can direct crawler 12 to move manually or automatically to the area in the real world and collect the images. Since there may be multiple pathways for determining the crawler position at this stage, the system uses a robust positioning method to make sure there is not a circular dependency in determining position.
In some embodiments, the user can add marks/voice notes to the images, e.g., in the virtual world, as references. In some embodiments, the user may put annotations on the screen during an inspection, which may be tied into the 3D model for subsequent viewing.
Different modes of operations may be employed in various embodiments. For example, in one mode of operation, crawler 12 is in a static position and a user is exploring a specific area. In another mode of operation, crawler 12 is moving. The image data, raw images, can be recorded, processed and then played back later.
In addition to the five primary cameras mounted on crawler 12, in some embodiments, a borescopic or other camera may be attached to crawler 12. Unlike the other cameras, the orientation of the borescope can change. As the user employs a joystick or similar device to change the orientation of the borescope, the user can also move the user's head such that the view displayed by HMD 18 follows the borescope orientation. In another embodiment, the user's head position can be used as input to control the direction of the borescope.
In addition to crawler mounted camera modules, in some embodiments, cameras may be mounted near the entrance of air gap 88 in order for the crawler 12 user to have a view of crawler 12 as it enters air gap 88. In some embodiments, camera(s) may be mounted on a crawler insertion device. Cameras can also be mounted on or near the stator or other electrical machine 86 components. HMD 18 may in some embodiments take images from these fixed mounted cameras to provide more flexible, hands free view control of crawler 12, e.g., to make it easier for the user to drive crawler 12 into air gap 88.
In some embodiments, crawler 12 is inserted into electrical machine 86, e.g., into air gap 88. Crawler 12 is then directed along each desired path, e.g., paths 94A-94P, and images are captured from the front-mounted camera modules 50, e.g., at each interval 100, or continuously (e.g., video images) along each path. The user may use these images to perform a live inspection. In some embodiments, panoramic images may be generated by image processor 16, and displayed on HMD 18 continuously or at intervals. Using HMD 18, the user may change perspective and view angle by changing head position, e.g., looking to the left, center or right, as illustrated in FIGS. 5A - 5C, allowing inspection of the portion of, e.g., stator 90 and/or rotor 92 in the vicinity of crawler 12 front mounted camera modules 50. The telemetry data may be imprinted on the viewed images. In addition, the user may make notes, e.g., recorded verbal notes, typewritten notes, screen annotations and other annotation data, which may be attached to or associated with the particular images to which the annotation data pertains, e.g., so that features, wear patterns, discoloration, etc., may be noted or highlighted for future reference. In some embodiments, 3D images may be generated by image processor 16 in addition to or in place of a 2D panoramic image, and then displayed on HMD 18. Images, e.g., from left to right, may be overlapping, e.g., having been taken by adjacent camera modules 50 and/or at different times or locations within air gap 88, and may be stitched together to generate the panoramic images, and may also or alternatively be triangulated to generate the 3D images, e.g., live. Once the panoramic and/or 3D images are generated, e.g., live, the user may explore the images, e.g., by turning from left to right as illustrated in FIGS. 5A - 5C.
In some embodiments, the images captured along the desired paths, e.g., paths 94A - 94P, whether continuously or at intervals 100, may be formed into a 3D model of the internals of electrical machine 86, e.g., of features viewable from within air gap 88. The telemetry data and annotations, notes, etc., may be included in the model. The 3D model may be stored for subsequent interrogation, and may also be used for manually or automatically guiding crawler 12 inside electrical machine 86 in subsequent inspection procedures. In some embodiments, some or all views of a current inspection procedure or a current 3D model may be compared to previous 3D models generated from previous inspections, which may allow the user to determine any progression of damage or wear within electrical machine 86 by observation, by viewing annotations and notes, and by listening to verbal notes, if any. The location within the model and other parameters may be indicated by the telemetry data stored for each image or group of images with each 3D model. Thus, the latest inspection 3D model may be compared with previously stored inspection results in order to determine any progression of damage or wear.
Embodiments of the present invention include a method for performing an inspection inside a machine, comprising: inserting a remote controlled vehicle into the machine, wherein the remote controlled vehicle includes at least one camera; directing the remote controlled vehicle along a path inside the machine; capturing a plurality of images inside the machine with the at least one camera; processing the images to generate a panoramic image and/or a 3D image; and displaying the panoramic image and/or the 3D image.
In a refinement, the panoramic image and/or the 3D image is displayed on a head-mounted virtual reality display or a head-mounted augmented reality display having a head position tracking feature, and the method further comprises changing a viewing perspective of the panoramic image and/or the 3D image based on a user head position.
In another refinement, the plurality of images includes a subset of overlapping images; and the processing of the images includes stitching the subset of overlapping images together to generate the panoramic image.
In yet another refinement, the plurality of images includes a subset of overlapping images; and the processing of the images includes triangulating between images of the subset of overlapping images to generate the 3D image.
In still another refinement, the method further comprises generating a 3D model of an interior portion of the machine.
In yet still another refinement, the method further comprises comparing a generated 3D model of a current inspection with a generated 3D model of a previous inspection.
In a further refinement, the machine includes a first component and a second component and an air gap disposed between the first component and the second component, the method further comprising: controlling the remote controlled vehicle to traverse along a first path in the air gap; capturing a first set of images using the at least one camera at intervals along the first path; controlling the remote controlled vehicle to traverse along a second path in the air gap adjacent to the first path; and capturing a second set of images at the intervals along the second path, wherein the second set of images overlaps the first set of images; and wherein the processing of the images includes generating the 3D model of the interior portion of the machine based on the first set of images and the second set of images.
In a yet further refinement, the method further comprises capturing telemetry data along the first path and along the second path, wherein the telemetry data is included as part of the 3D model.
In a still further refinement, the telemetry data is captured at each of the intervals; and the telemetry data includes the position of the remote controlled vehicle along the first path and along the second path.
In a yet still further refinement, the machine is an electrical machine; wherein the first component is a rotor and the second component is a stator having a plurality of slots; and the telemetry data includes identification of a stator slot number and a distance of the remote controlled vehicle along each slot.
Embodiments of the present invention include a system for performing an inspection inside an electrical machine, the electrical machine having a rotor, a stator and an air gap formed between the rotor and the stator, comprising: a magnetic air gap crawler constructed to fit within the air gap, the magnetic air gap crawler having at least one camera mounted thereon, and the magnetic air gap crawler being operative to transport itself along a desired path across the stator and/or the rotor inside the air gap, wherein the magnetic air gap crawler is operative to capture a plurality of images inside the electrical machine with the at least one camera; a controller operative to control an operation of the magnetic air gap crawler; an image processor operative to process the plurality of images to generate a panoramic image and/or a 3D image; and a head-mounted virtual reality display or a head-mounted augmented reality display operative to display the panoramic image and/or the 3D image.
In a refinement, the plurality of images includes a subset of overlapping images; and the image processor is operative to stitch the subset of overlapping images together to generate the panoramic image.
In another refinement, the plurality of images includes a subset of overlapping images; and the image processor is operative to triangulate between images of the subset of overlapping images to generate the 3D image.
In yet another refinement, the image processor is operative to associate user-generated annotation data with the panoramic image and/or the 3D image.
In still another refinement, the image processor is operative to generate a 3D model of an interior portion of the electrical machine.
In yet still another refinement, the controller is operative to direct the magnetic air gap crawler to traverse along a first path in the air gap; direct the at least one camera to capture a first set of images at intervals along the first path; direct the magnetic air gap crawler to traverse along a second path in the air gap adjacent to the first path; direct the at least one camera to capture a second set of images at the intervals along the second path, wherein the second set of images partially overlaps the first set of images; and wherein the image processor is operative to generate the 3D model of the interior portion of the machine based on the first set of images and the second set of images.
In a further refinement, the magnetic air gap crawler is operative to capture telemetry data along the first path and along the second path; and the telemetry data is included as part of the 3D model.
In a yet further refinement, the telemetry data is captured at each of the intervals, and wherein the telemetry data includes the position of the remote controlled vehicle along the first path and along the second path.
In a still further refinement, the stator includes a plurality of slots; and the telemetry data includes identification of a stator slot number and a distance of the magnetic air gap crawler along each slot.
Embodiments of the present invention include a system for performing an inspection inside an electrical machine, the electrical machine having a rotor, a stator and an air gap formed between the rotor and the stator, comprising: a magnetic air gap crawler constructed to fit within the air gap, the magnetic air gap crawler having a plurality of cameras mounted thereon operative to capture a plurality of images inside the electrical machine; a controller operative to control an operation of the magnetic air gap crawler; an image processor operative to process the plurality of images to generate a panoramic image and/or a 3D image; and a head-mounted virtual reality display or a head-mounted augmented reality display having a user body feature position tracking feature, wherein the head-mounted virtual reality display or the head-mounted augmented reality display is operative to display the panoramic image and/or the 3D image, and to change a viewing perspective of the panoramic image and/or the 3D image based on a user body feature position.
While the invention has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiments have been shown and described and that all changes and modifications that come within the spirit of the inventions are desired to be protected. It should be understood that while the use of words such as preferable, preferably, preferred or more preferred utilized in the description above indicate that the feature so described may be more desirable, it nonetheless may not be necessary and embodiments lacking the same may be contemplated as within the scope of the invention, the scope being defined by the claims that follow. In reading the claims, it is intended that when words such as "a," "an," "at least one," or "at least one portion" are used there is no intention to limit the claim to only one item unless specifically stated to the contrary in the claim. When the language "at least a portion" and/or "a portion" is used the item can include a portion and/or the entire item unless specifically stated to the contrary.
Unless specified or limited otherwise, the terms "mounted," "connected," "supported," and "coupled" and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, "connected" and "coupled" are not restricted to physical or mechanical connections or couplings.

Claims

CLAIMS
What is claimed is:
1. A method for performing an inspection inside a machine, comprising:
inserting a remote controlled vehicle into the machine, wherein the remote controlled vehicle includes at least one camera;
directing the remote controlled vehicle along a path inside the machine;
capturing a plurality of images inside the machine with the at least one camera;
processing the images to generate a panoramic image and/or a 3D image; and
displaying the panoramic image and/or the 3D image.
2. The method of claim 1, wherein the panoramic image and/or the 3D image is displayed on a head-mounted virtual reality display or a head-mounted augmented reality display having a head position tracking feature, further comprising changing a viewing perspective of the panoramic image and/or the 3D image based on a user head position.
3. The method of claim 1, wherein the plurality of images includes a subset of overlapping images; and wherein the processing of the images includes stitching the subset of overlapping images together to generate the panoramic image.
4. The method of claim 1, wherein the plurality of images includes a subset of overlapping images; and wherein the processing of the images includes triangulating between images of the subset of overlapping images to generate the 3D image.
5. The method of claim 1, further comprising generating a 3D model of an interior portion of the machine.
6. The method of claim 5, further comprising comparing a generated 3D model of a current inspection with a generated 3D model of a previous inspection.
7. The method of claim 5, wherein the machine includes a first component and a second component and an air gap disposed between the first component and the second component, further comprising: controlling the remote controlled vehicle to traverse along a first path in the air gap; capturing a first set of images using the at least one camera at intervals along the first path; controlling the remote controlled vehicle to traverse along a second path in the air gap adjacent to the first path; and capturing a second set of images at the intervals along the second path, wherein the second set of images overlaps the first set of images; and wherein the processing of the images includes generating the 3D model of the interior portion of the machine based on the first set of images and the second set of images.
8. The method of claim 7, further comprising capturing telemetry data along the first path and along the second path, wherein the telemetry data is included as part of the 3D model.
9. The method of claim 8, wherein the telemetry data is captured at each of the intervals; and wherein the telemetry data includes the position of the remote controlled vehicle along the first path and along the second path.
10. The method of claim 8, wherein the machine is an electrical machine; wherein the first component is a rotor and the second component is a stator having a plurality of slots; and wherein the telemetry data includes identification of a stator slot number and a distance of the remote controlled vehicle along each slot.
11. A system for performing an inspection inside an electrical machine, the electrical machine having a rotor, a stator and an air gap formed between the rotor and the stator, comprising:
a magnetic air gap crawler constructed to fit within the air gap, the magnetic air gap crawler having at least one camera mounted thereon, and the magnetic air gap crawler being operative to transport itself along a desired path across the stator and/or the rotor inside the air gap, wherein the magnetic air gap crawler is operative to capture a plurality of images inside the electrical machine with the at least one camera;
a controller operative to control an operation of the magnetic air gap crawler;
an image processor operative to process the plurality of images to generate a panoramic image and/or a 3D image; and
a head-mounted virtual reality display or a head-mounted augmented reality display operative to display the panoramic image and/or the 3D image.
12. The system of claim 11, wherein the plurality of images includes a subset of overlapping images; and wherein the image processor is operative to stitch the subset of overlapping images together to generate the panoramic image.
13. The system of claim 11, wherein the plurality of images includes a subset of overlapping images; and wherein the image processor is operative to triangulate between images of the subset of overlapping images to generate the 3D image.
14. The system of claim 11, wherein the image processor is operative to associate user-generated annotation data with the panoramic image and/or the 3D image.
15. The system of claim 11, wherein the image processor is operative to generate a 3D model of an interior portion of the electrical machine.
16. The system of claim 15, wherein the controller is operative to direct the magnetic air gap crawler to traverse along a first path in the air gap; direct the at least one camera to capture a first set of images at intervals along the first path; direct the magnetic air gap crawler to traverse along a second path in the air gap adjacent to the first path; direct the at least one camera to capture a second set of images at the intervals along the second path, wherein the second set of images partially overlaps the first set of images; and wherein the image processor is operative to generate the 3D model of the interior portion of the machine based on the first set of images and the second set of images.
17. The system of claim 16, wherein the magnetic air gap crawler is operative to capture telemetry data along the first path and along the second path; and wherein the telemetry data is included as part of the 3D model.
18. The system of claim 17, wherein the telemetry data is captured at each of the intervals, and wherein the telemetry data includes the position of the remote controlled vehicle along the first path and along the second path.
19. The system of claim 17, wherein the stator includes a plurality of slots; and wherein the telemetry data includes identification of a stator slot number and a distance of the magnetic air gap crawler along each slot.
20. A system for performing an inspection inside an electrical machine, the electrical machine having a rotor, a stator and an air gap formed between the rotor and the stator, comprising:
a magnetic air gap crawler constructed to fit within the air gap, the magnetic air gap crawler having a plurality of cameras mounted thereon operative to capture a plurality of images inside the electrical machine;
a controller operative to control an operation of the magnetic air gap crawler;
an image processor operative to process the plurality of images to generate a panoramic image and/or a 3D image; and
a head-mounted virtual reality display or a head-mounted augmented reality display having a user body feature position tracking feature, wherein the head-mounted virtual reality display or the head-mounted augmented reality display is operative to display the panoramic image and/or the 3D image, and to change a viewing perspective of the panoramic image and/or the 3D image based on a user body feature position.
PCT/US2018/025188 2017-03-31 2018-03-29 Systems and methods for performing an inspection inside a machine WO2018183709A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762479926P 2017-03-31 2017-03-31
US62/479,926 2017-03-31

Publications (1)

Publication Number Publication Date
WO2018183709A1 2018-10-04

Family

Family ID: 63678071

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/025188 WO2018183709A1 (en) 2017-03-31 2018-03-29 Systems and methods for performing an inspection inside a machine

Country Status (1)

Country Link
WO (1) WO2018183709A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021094533A3 (en) * 2019-11-15 2021-07-15 Lufthansa Technik Ag Borescope having a rotary head
WO2022167141A1 (en) * 2021-02-02 2022-08-11 Siemens Energy Global GmbH & Co. KG Apparatus for inspecting a component using time-of-flight sensors

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040131232A1 (en) * 1998-04-08 2004-07-08 Jeffrey Meisner Augmented reality technology
US20070217672A1 (en) * 2006-03-20 2007-09-20 Siemens Power Generation, Inc. Combined 2D and 3D nondestructive examination
US20090154293A1 (en) * 2007-12-18 2009-06-18 Anandraj Sengupta System and method for augmented reality inspection and data visualization
US20090301168A1 (en) * 2008-06-04 2009-12-10 Siemens Power Generation, Inc. Apparatus For Impact Testing For Electric Generator Stator Wedge Tightness
US20110175641A1 (en) * 2010-01-19 2011-07-21 Markus Wiesendanger Inspection vehicle for inspecting an air gap between the rotor and the stator of a generator
US20150054939A1 (en) * 2013-08-21 2015-02-26 Siemens Energy, Inc. Internal inspection of machinery by stitched surface imaging
WO2016138529A1 (en) * 2015-02-27 2016-09-01 Abb Technology Ag Localization, mapping and haptic feedback for inspection of a confined space in machinery
US20160284079A1 (en) * 2015-03-26 2016-09-29 Faro Technologies, Inc. System for inspecting objects using augmented reality


Similar Documents

Publication Publication Date Title
US11879852B1 (en) Multi-camera apparatus for wide angle pipe internal inspection
JP7007396B2 (en) Techniques for recording augmented reality data
US20230045393A1 (en) Volumetric depth video recording and playback
US9017163B2 (en) System and method for acquiring virtual and augmented reality scenes by a user
CN103258339A (en) Real-time compositing of live recording-based and computer graphics-based media streams
JP4739002B2 (en) Image processing method and image processing apparatus
CN111465886A (en) Selective tracking of head mounted displays
JP7059937B2 (en) Control device for movable image pickup device, control method and program for movable image pickup device
EP3061092B1 (en) Motion tracking system
JP4700476B2 (en) Multi-view video composition device and multi-view video composition system
JP2011528252A5 (en)
JP2006285787A (en) Calibration method and device
CN103177470A (en) Method and system for playing an augmented reality in a motor vehicle display
CN1957374A (en) Method and device for determining optical overlaps with AR objects
US11209657B2 (en) Position tracking system for head-mounted display systems that includes angle sensitive detectors
JP7177054B2 (en) Head-mounted display with user head rotation guide
WO2017126433A1 (en) Information processing device and user guide presentation method
WO2018183709A1 (en) Systems and methods for performing an inspection inside a machine
JP2007233971A (en) Image compositing method and device
JP2017156796A (en) Object tracking system, object tracking device and program, as well as physical object with location display body
JP2008194095A (en) Mileage image generator and generation program
JP2016140017A (en) Information processing device, display device, and information processing method
KR20210153472A (en) Method, appararus and system for providing real-time broadcasting platform using motion and facial capture
Krinidis et al. An audio-visual database for evaluating person tracking algorithms
US20220230357A1 (en) Data processing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18777507

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18777507

Country of ref document: EP

Kind code of ref document: A1