EP2774118A1 - System and method for automated defect detection utilizing prior data - Google Patents
System and method for automated defect detection utilizing prior data
- Publication number
- EP2774118A1 (EP12798915.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- prior
- database
- current image
- current
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Definitions
- the present disclosure relates to automated inspection techniques and, more particularly, relates to automated visual inspection techniques of images or videos captured by image capture devices such as borescopes.
- video inspection systems, such as borescopes, have been widely used for capturing images or videos of difficult-to-reach locations by "snaking" image sensor(s) to these locations.
- Applications utilizing borescope inspections include aircraft engine blade inspection, power turbine blade inspection, internal inspection of mechanical devices and the like.
- a variety of techniques for inspecting the images or videos provided by borescopes for determining defects therein have been proposed in the past. Most such techniques capture and display images or videos to human inspectors for defect detection and interpretation. Human inspectors then decide whether any defect within those images or videos exists. These techniques are prone to errors resulting from human inattention. Some other techniques utilize automated inspection in which the most common defects are categorized into classes such as leading edge defects, erosion, nicks, cracks, or cuts, and any incoming images or videos from the borescopes are examined to find those specific classes of defects. These techniques are thus focused on low-level feature extraction, identifying damage by matching features. Although somewhat effective in circumventing errors from human involvement, categorizing all kinds of blade damage defects within classes is difficult, and images having defects other than those pre-defined classes are not detected.
- a method of performing automated defect detection by utilizing prior inspection data may include providing an image capture device for capturing and transmitting at least one current image of an object and providing a database for storing at least one prior image from prior inspections.
- the method may further include registering the at least one current image with the at least one prior image, comparing the registered at least one current image with the at least one prior image to determine a transformation therebetween and updating the database with the at least one current image.
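The claimed sequence of steps (retrieve a prior image, register the current image against it, compare, update the database) can be sketched as follows. This is a minimal illustration only: the function names, the dictionary "database", and the flat-list image representation are all assumptions, and the registration step is a placeholder rather than the patent's actual transformation estimation.

```python
# Hypothetical sketch of the claimed method's flow; names and the
# dict-based "database" are illustrative assumptions, not the patent's
# implementation.

def register(current, prior):
    # Placeholder: a real system would estimate and apply a spatial
    # transformation between the two images; here the images are
    # assumed already aligned.
    return list(current)

def compare(registered, prior, threshold=10.0):
    # Report pixel positions whose intensity difference exceeds a
    # threshold (a stand-in for full defect comparison).
    return [i for i, (r, p) in enumerate(zip(registered, prior))
            if abs(r - p) > threshold]

def inspect(database, blade_id, current):
    prior = database.get(blade_id)
    if prior is not None:
        registered = register(current, prior)
        differences = compare(registered, prior)
    else:
        differences = []          # first inspection: nothing to compare
    database[blade_id] = current  # update the database with the current image
    return differences
```

A first inspection of a blade stores its image and reports no differences; a later inspection is compared against the stored prior.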
- a system for performing automated defect detection may include an image capture device for capturing and transmitting current images of one or more components of a machine and a database for storing prior images of the one or more components from prior inspections.
- the system may also include a monitoring and analysis site in at least indirect communication with the image capture device and the database, the monitoring and analysis site capable of retrieving prior images from the database and comparing those images with the current images of the same one or more components to determine a transformation therebetween.
- the method may include providing an image capture device capable of capturing and transmitting at least one current image of one or more blades of an engine and providing a database capable of storing at least one prior image of the one or more blades of the engine from prior inspections.
- the method may also include accessing the database to retrieve the at least one prior image corresponding to the at least one current image, registering the at least one current image to the at least one prior image and comparing the registered at least one current image to the at least one prior image.
- the method may further include updating the database with the at least one current image.
- FIG. 1 is a schematic illustration of an automated defect detection system, in accordance with at least some embodiments of the present disclosure.
- FIG. 2 is a flowchart outlining steps of performing automated defect detection by utilizing prior data and using the automated defect detection system of FIG. 1, in accordance with at least some embodiments of the present disclosure;
- the automated defect detection system 2 may be an automated borescope inspection (ABI) system.
- the automated defect detection system 2 may include an engine 4 having a plurality of stages 6, each of the stages having a plurality of blades 8, some or all of which may require visual inspection periodically or at predetermined intervals by one or more image capture devices 10.
- the engine may be representative of a wide variety of engines such as jet aircraft engines, aeroderivative industrial gas turbines, steam turbines, diesel engines, automotive and truck engines, and the like.
- the ABI system 2 may be employed to inspect other parts of the engine inaccessible by other means, as well as to perform inspection in other equipment and fields such as medical endoscope inspection, inspecting critical interior surfaces in machined or cast parts, forensic inspection, inspection of civil structures such as buildings, bridges, piping, etc.
- the image capture device(s) 10 may be an optical device having an optical lens or other imaging device or image sensor at one end and capable of capturing and transmitting still images or video images (referred hereinafter to as "data") through a communication channel 12 to a monitoring and analysis site 14.
- the image capture device(s) 10 may be representative of any of a variety of flexible borescopes or fiberscopes, rigid borescopes, video borescopes or other devices such as endoscopes, which are capable of capturing and transmitting data of difficult-to-reach areas through the communication channel 12.
- the communication channel 12 in turn may be an optical channel or alternatively, may be any other wired, wireless or radio channel or any other type of channel capable of transmitting data between two points including links involving the World Wide Web (www) or the internet.
- the monitoring and analysis site 14 may be located on-site near or on the engine 4, or alternatively, it may be located on a remote site away from the engine.
- the monitoring and analysis site 14 may include one or more processing systems 16 (e.g., computer systems having a central processing unit and memory) for recording, processing and storing the data received from the image capture device(s) 10, as well as personnel for controlling operation of the one or more processing systems.
- the monitoring and analysis site 14 may receive data of the blades 8 captured and transmitted by the image capture device(s) 10 via the communication channel 12.
- the monitoring and analysis site 14 and, particularly, the one or more processing systems 16 may process that data to determine any defects within any of the blades 8.
- Results 20 may then be reported through communication channel 18.
- the results 20 may also relay information about the type of defect, the location of the defect, size of the defect, etc. If defects are found in any of the inspected blades 8, alarm(s) to alert personnel or users may be raised as well.
- the communication channel 18 may be any of a variety of communication links including wired channels, optical or wireless channels, radio channels, or possibly links involving the World Wide Web (www) or the internet. It will also be understood that although the results 20 have been shown as being a separate entity from the monitoring and analysis site 14, this need not always be the case. Rather, in at least some embodiments, the results 20 may be stored within and reported through the monitoring and analysis site 14 as well. Furthermore, in at least some embodiments, the results 20 may be stored within a database 21 for future reference that, along with or in addition to another one of the database 21, may be employed for performing the automated defect detection by the monitoring and analysis site 14, as described below with respect to FIG. 2.
- referring to FIG. 2, a flowchart 22 outlining sample steps which may be followed in performing automated defect detection using the automated defect detection system 2 by utilizing prior data is shown, in accordance with at least some embodiments of the present invention.
- the process may proceed to a step 26, where the database (or databases) 21 may be accessed.
- defect detection of the engine 4 may be performed periodically, for example, after the engine has been in service for a specific amount of time or after the engine has been operational in certain environmental conditions.
- a set of maintenance data or metadata (e.g., defects; type, size and location of defects; etc.) and comments for that inspection may be recorded within the database 21.
- the database 21 may store information regarding undamaged ones of the blades 8 as well.
- the database 21 with the prior inspection data may be accessed.
- the database 21 may be accessed manually by an inspector or user or, alternatively, the database may be accessed automatically by utilizing an image (or video) retrieving tool.
- prior inspection data pertaining to one or more of the blades 8 may be accessed by inputting information manually to identify those blades. For example, in at least some embodiments, information specifying the specific one of the engine 4, a model number of the engine 4, and/or a stage number may be input by a user or an inspector. In other embodiments, other types of inputs, such as blade type, etc. may be input as well.
- an automatic image retrieval system may be employed to find corresponding information for the current image (or video) within the database 21.
- the current images (or videos) may be directly input into the automatic image retrieval system to retrieve the corresponding prior images (or videos) of those blade(s) 8 from the database 21.
- commonly known video or image searching and retrieval techniques may be employed.
- a combination of the aforementioned two methods may be employed as well.
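The two retrieval routes described above can be illustrated side by side: a manual lookup keyed on inspector-supplied metadata, and an automatic lookup that picks the stored image most similar to the current one. The key layout, function names, and the sum-of-absolute-differences similarity metric are assumptions for illustration, not the patent's retrieval tool.

```python
# Illustrative sketch of manual (metadata-keyed) versus automatic
# (similarity-based) prior-image retrieval; all names and the toy
# similarity metric are assumptions.

def retrieve_by_metadata(database, engine_serial, stage, blade):
    # Manual route: the inspector supplies engine, stage and blade
    # identifiers that key the prior-inspection record.
    return database.get((engine_serial, stage, blade))

def retrieve_by_similarity(database, current):
    # Automatic route: return the stored image closest to the current
    # one, using sum of absolute pixel differences as a toy distance.
    def distance(prior):
        return sum(abs(c - p) for c, p in zip(current, prior))
    if not database:
        return None
    return min(database.values(), key=distance)
```

A combined system might try the metadata key first and fall back to similarity search when no key is available.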
- multiple prior images (or videos) corresponding to one current image (or video) may be stored within the database 21.
- all of the images (or videos) corresponding to the one current image (or video) may be retrieved in some embodiments, while in certain other embodiments, only the most recent one of the prior image (or video) may be retrieved.
- for a blade (or multiple blades) that has never undergone inspection, no corresponding prior images (or videos) may be found within the database 21. Upon inspection, images (or videos) of such blades may be stored within the database 21 for any future inspections of those blades.
- the current and/or the prior images (or videos) may correspond to a single one of the blades 8 within a single one of the stages 6, or alternatively, may correspond to multiple blades within the single stage.
- the current and/or the prior images (or videos) may correspond to a complete or partial view of a blade 8. In at least some embodiments, the images (or videos) may even correspond to multiple ones of the blades 8 from multiple ones of the stages 6, particularly where damage is correlated across multiple blades in multiple stages.
- an image/video registration (e.g., alignment of the current and the prior images for performing a comparison therebetween) may be performed.
- the process has been explained with respect to only images. It will, however, be understood that the images may correspond to either one or more still images or video images (e.g., frames within the video).
- a feature based approach for extracting features such as corner-like features and intensity gradient features, to determine any common features between the current and the prior images may be adopted.
- an image based approach where the entire current and the prior images are compared may be employed in some embodiments.
- a combination of the feature based and the image based approaches, or some other commonly employed technique for aligning (e.g., registering) and comparing the current and the prior images may be employed as well.
- techniques such as the Harris Corner Detector, SURF (Speeded-Up Robust Features) and SIFT (Scale-Invariant Feature Transform) may be employed for feature correspondence extraction, or techniques such as Phase Correlation and NCC (Normalized Cross-Correlation) may be employed for image-based comparison. All of the aforementioned techniques are well known in the art and, accordingly, for conciseness of expression, they have not been described here. Notwithstanding the fact that in the present embodiment, only the Harris Corner Detector, SURF, SIFT, Phase Correlation and NCC techniques for image comparison have been mentioned, in at least some embodiments, other types of techniques that are commonly employed for comparing images may be used.
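Of the image-based techniques named above, normalized cross-correlation (NCC) is compact enough to sketch directly. The version below works on flat intensity lists purely for illustration; real inspection systems compute it over 2-D image arrays, often per window.

```python
import math

# Minimal normalized cross-correlation between two equal-length
# intensity sequences; a flat-list stand-in for 2-D image NCC.

def ncc(a, b):
    # Subtract each signal's mean, then divide the cross term by the
    # product of the deviations' magnitudes. The score lies in [-1, 1];
    # 1 means the images differ only by brightness/contrast scaling.
    mean_a = sum(a) / len(a)
    mean_b = sum(b) / len(b)
    da = [x - mean_a for x in a]
    db = [x - mean_b for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da)) * math.sqrt(sum(y * y for y in db))
    return num / den if den else 0.0
```

Because NCC is invariant to linear brightness changes, it suits comparisons where the current and prior images were captured under different lighting.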
- the aforementioned registration techniques may be employed when the fields of view (FOV) of the current and the prior images overlap with one another. Since the current and the prior images stored within the database 21 may not exactly be equivalent to one another (e.g., both of the images may correspond to the same blade(s) but one of the images may be rotated, for example, by a few degrees or translated a few inches relative to the other image), a warping technique to align the FOVs of both the current and the prior images prior to comparing those images may be performed.
- warping may be performed by transforming the current and the prior images into matrix form, and by multiplying the prior image matrix with a mathematical transformation (e.g., a transformation corresponding to the difference of alignment between the current and the prior images) to obtain a warped image corresponding to the current image.
- the new image matrix may be multiplied by an inverse warping. Since the warping of current or prior images is entirely analogous, hereinafter, only one of the two approaches has been described. Warping techniques are well known in the art and, accordingly, for conciseness of expression, they have not been described here in detail.
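The warping idea described above, applying a transformation to the prior image so it lines up with the current one, can be sketched with the simplest possible transformation, an integer translation, using inverse mapping (each output pixel looks up its source pixel). A production system would use a full homography with interpolation; everything here is a simplifying assumption.

```python
# Toy warp of a 2-D image (list of rows) by an integer translation,
# using inverse mapping. A stand-in for multiplying coordinates by a
# full transformation matrix; illustrative only.

def warp(image, dx, dy, fill=0):
    h, w = len(image), len(image[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Inverse mapping: for each output pixel, find the source
            # pixel that the translation moves onto it.
            sx, sy = x - dx, y - dy
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = image[sy][sx]
    return out
```

Applying the inverse translation (`-dx`, `-dy`) undoes the warp wherever the pixels were not pushed out of frame, mirroring the "inverse warping" mentioned above.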
- the database 21 may contain additional information beyond images such as model or serial numbers, part identification numbers, metrology information, etc.
- the current image may then correspond to the warped prior image.
- the current image and the warped prior image may be compared to determine any differences between the two images.
- regions in the current image that have changed from the last inspection may be identified.
- a thresholding technique or a classifier technique to compare the current and the prior images may be employed.
- other techniques to compare those images may be utilized as well.
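The thresholding comparison mentioned above reduces, in its simplest form, to flagging positions where the absolute difference between the current image and the warped prior image exceeds a threshold. The flat-list representation and default threshold below are assumptions for illustration; a classifier-based comparison would replace the threshold test with a learned decision.

```python
# Toy thresholding comparison between the current image and the warped
# prior image; representation and threshold value are assumptions.

def changed_pixels(current, warped_prior, threshold=25):
    # Flag every position whose intensity change since the prior
    # inspection exceeds the threshold.
    return [i for i, (c, p) in enumerate(zip(current, warped_prior))
            if abs(c - p) > threshold]
```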
- at a step 38, it is determined whether any significant difference between the current image and the prior image has been found based upon the comparison of those images conducted at the step 36.
- a difference between the current and the prior images may be classified as significant if the differences (for example, in circumference, area, texture or color) are above a certain pre-set or adaptive threshold.
- the difference may correspond to either a significant defect being found in the current image compared to a prior image of an undamaged blade or, alternatively, the difference may correspond to a defect that has worsened from the prior image. Minor defects in the current image compared to the prior image may or may not be classified as significant, depending upon the threshold value mentioned above.
- the process may proceed to a step 40, where the blade(s) corresponding to the current image may be flagged for repair. Subsequent to repair at the step 40, the repaired blade(s) 8 may be re-inspected and the database 21 updated accordingly.
- the database 21 may be updated by replacing the prior image with the current image or appending the current image along with the prior image for any usage in any future inspections of that blade(s).
- information such as the date of storing the current image into the database 21 may additionally be documented to determine the latest one of multiple images that may be stored within the database 21. It will also be understood that when multiple prior images of a blade are found, in at least some embodiments, the latest prior image corresponding to that blade may be employed for reference and inspection, although in other embodiments, the current image may be compared to some or all of the prior images of that blade to determine the succession of any defect (or repair) within that blade. Furthermore, metadata such as types of defects, size of defects, location of defects, etc., may also be stored within the database 21 at the step 42.
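The step-42 update described above, appending the current image with its storage date so later inspections can pick the latest entry or walk the whole history, can be sketched as follows. The record layout and function names are assumptions, not the patent's schema.

```python
from datetime import date

# Hypothetical sketch of the database-update step: each blade keeps a
# dated history of images plus optional defect metadata.

def update(database, blade_id, image, metadata=None, when=None):
    record = {"image": image,
              "date": when or date.today(),
              "metadata": metadata or {}}
    # Append rather than replace, preserving the inspection history.
    database.setdefault(blade_id, []).append(record)

def latest(database, blade_id):
    # Return the most recent record for a blade, or None if it has
    # never been inspected.
    records = database.get(blade_id, [])
    return max(records, key=lambda r: r["date"]) if records else None
```

Keeping the full history, rather than overwriting, is what enables the progression tracking described above (comparing the current image against some or all priors).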
- the process may proceed directly to the step 42 where the database 21 may be updated with the warped current (or prior) image (if warping was performed) created at the step 34.
- the current image "as-is" may be stored within the database 21 and the database may be updated.
- the warped (or un-warped) current image may either replace one or more of the prior images already stored within the database 21 or it may be appended to the prior images.
- the present disclosure sets forth a system and method for performing automated defect detection by utilizing data from previous inspections.
- a current image of one or more blades or other component is compared with the corresponding image(s) of those blade(s) stored within a database from prior inspections.
- a warping technique to align or register the current image to the prior image may be performed.
- if significant differences are found, those blade(s) or component(s) may be flagged for repair. If no significant differences are found, the warped or un-warped image may be stored within the database for future inspections.
- the present disclosure allows for a more robust diagnostic of an engine blade or other mechanical component by storing multiple images of that blade or component for the same location. These multiple images may allow for a more robust comparison with a current image and may provide a mechanism for determining whether any defect is getting worse or not. Furthermore, the aforementioned technique may be performed in an automated (or semi-automated) fashion, thereby relieving human inspectors from manually inspecting blades or components to at least a certain extent.
Landscapes
- Engineering & Computer Science (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/288,567 US8593543B2 (en) | 2010-11-04 | 2011-11-03 | Imaging apparatus |
PCT/US2012/063370 WO2013067387A1 (en) | 2011-11-03 | 2012-11-02 | System and method for automated defect detection utilizing prior data |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2774118A1 true EP2774118A1 (en) | 2014-09-10 |
Family
ID=47326312
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12798915.0A Ceased EP2774118A1 (en) | 2011-11-03 | 2012-11-02 | System and method for automated defect detection utilizing prior data |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP2774118A1 (en) |
WO (1) | WO2013067387A1 (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2949220B1 (en) * | 2009-08-21 | 2011-09-09 | Snecma | METHOD AND SYSTEM FOR DETECTING THE INGESTION OF AN OBJECT BY AN AIRCRAFT TURBOJUSTER DURING A MISSION |
-
2012
- 2012-11-02 WO PCT/US2012/063370 patent/WO2013067387A1/en active Application Filing
- 2012-11-02 EP EP12798915.0A patent/EP2774118A1/en not_active Ceased
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2013067387A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2013067387A1 (en) | 2013-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8792705B2 (en) | System and method for automated defect detection utilizing prior data | |
US8781209B2 (en) | System and method for data-driven automated borescope inspection | |
US8744166B2 (en) | System and method for multiple simultaneous automated defect detection | |
US9471057B2 (en) | Method and system for position control based on automated defect detection feedback | |
US8761490B2 (en) | System and method for automated borescope inspection user interface | |
US8781210B2 (en) | Method and system for automated defect detection | |
EP3086286B1 (en) | Method and system for automated inspection utilizing a multi-modal database | |
US11449980B2 (en) | System and method for combined automatic and manual inspection | |
US20170323163A1 (en) | Sewer pipe inspection and diagnostic system and method | |
CN110555831B (en) | Deep learning-based drainage pipeline defect segmentation method | |
CN106846304B (en) | Electrical equipment detection method and device based on infrared detection | |
Guan et al. | Automatic fault diagnosis algorithm for hot water pipes based on infrared thermal images | |
Hezaveh et al. | Roof damage assessment using deep learning | |
EP3276131A1 (en) | Systems and methods for indexing and detecting components | |
EP3852059A1 (en) | System and method for assessing the health of an asset | |
WO2013067387A1 (en) | System and method for automated defect detection utilizing prior data | |
CN113727022B (en) | Method and device for collecting inspection image, electronic equipment and storage medium | |
CN110189301B (en) | Foreign matter detection method for generator stator core steel sheet stacking platform | |
CN113792829A (en) | Water turbine inspection method and device, computer equipment and storage medium | |
CN115909155A (en) | Inspection method for auxiliary equipment of thermal power plant | |
Guo et al. | Novel methods for inspection of damage on airframes | |
CN116258681A (en) | Spacer abnormality detection method | |
CN111429422A (en) | Laser near-field state analysis method and device based on deep learning | |
CN116432078A (en) | Building electromechanical equipment monitoring system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20140602 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: UNITED TECHNOLOGIES CORPORATION |
|
17Q | First examination report despatched |
Effective date: 20170316 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20191108 |