GB2570377A - Defect inspection method and defect inspection system - Google Patents

Defect inspection method and defect inspection system

Info

Publication number
GB2570377A
GB2570377A
Authority
GB
United Kingdom
Prior art keywords
defect
imaging device
processing unit
positioning
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1818898.7A
Other versions
GB201818898D0 (en)
GB2570377B (en)
Inventor
Konishi Takaaki
Kobayashi Ryousuke
Naganuma Junichiro
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi GE Nuclear Energy Ltd
Original Assignee
Hitachi GE Nuclear Energy Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi GE Nuclear Energy Ltd filed Critical Hitachi GE Nuclear Energy Ltd
Publication of GB201818898D0 publication Critical patent/GB201818898D0/en
Publication of GB2570377A publication Critical patent/GB2570377A/en
Application granted granted Critical
Publication of GB2570377B publication Critical patent/GB2570377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8806 Specially adapted optical and illumination features
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/08 Testing mechanical properties
    • G01M11/081 Testing mechanical properties by using a contact-less detection method, i.e. with a camera
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M5/00 Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings
    • G01M5/0033 Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings by determining damage, crack or wear
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M5/00 Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings
    • G01M5/0075 Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings by means of external apparatus, e.g. test benches or portable test systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M5/00 Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings
    • G01M5/0091 Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings by using electromagnetic excitation or detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/30 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/30 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G01B11/306 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces for measuring evenness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8806 Specially adapted optical and illumination features
    • G01N2021/8809 Adjustment for highlighting flaws
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E30/00 Energy generation of nuclear origin
    • Y02E30/30 Nuclear fission reactors

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Biochemistry (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Electromagnetism (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

A defect inspection system comprises: a defect extraction processing unit configured to extract a region 11 including a defect 12 from an image captured by an imaging device 20; a defect determination processing unit 53 configured to reference information characteristic of each defect type and classify the type of the defect included in the extracted region; a device positioning controller 54 configured to position the imaging device relative to the defect based on the determined defect type; and a display controller 55 configured to control a display 60 so that it subsequently displays the image of the defect captured by the imaging device controlled by the device positioning controller. In use, the positioning of the imaging device is controlled so that a defect is easily visually recognized. The imaging and illuminating devices 30 may be underwater, aerial or wheeled drones, and the defect may be a crack or peeling paint in the surface of a pipeline.

Description

The present invention relates to a defect inspection method and a defect inspection system.
[0002]
As a method for inspecting the health of a structure, visual testing (VT) is applied to a social infrastructure structure such as a tunnel or a power generation plant in some cases. In the visual testing, an inspecting person directly visually recognizes a target region to be inspected or indirectly visually recognizes a video image captured by an imaging device such as a camera from a display device to determine whether or not a defect targeted for detection exists on the target region.
When the structure to be inspected exists at a high place or a narrow place or in a harsh environment such as a high-temperature environment or a high-radiation environment, and it is difficult to directly visually recognize the structure to be inspected, a video image captured by an imaging device attached to a remotely controlled device is used. Thus, an inspecting person can visually recognize the video image from a separate place and perform the visual testing.
[0003]
The visibility of a defect included in a video image captured for the visual testing may vary depending on a state and characteristics of the defect, a surface state of a structure to be inspected, conditions of an imaging device, conditions of an illuminator used together with the imaging device, or the like. Thus, a method for quantitatively evaluating the visibility of an imaging system has been proposed.
For example, Japanese Unexamined Patent Application Publication No. 2008-197087 describes a method for causing an imaging system to image an object so that the object has predetermined spatial resolution and evaluating contrast.
SUMMARY

[0005]
To reliably detect a defect in visual testing, it is desirable to adjust the positioning of a device for imaging a defect so that the defect can be clearly visually recognized. This is because images of the same defect captured by an imaging device from different positions differ in the apparent shape, surface state, contrast, and the like of the defect.
[0006]
Thus, a support system configured to actively control the positioning of a device in order to improve the visibility of a defect is more useful than a system, such as that described in Japanese Unexamined Patent Application Publication No. 2008-197087, that merely evaluates the visibility of a defect. The positioning of the imaging device that makes a crack on a wall easy to image may differ from the positioning that makes shallow peeling on the wall easy to image. It is, therefore, desirable to optimize the positioning of the imaging device based on specific details of a defect.
[0007]
A main object of the present invention is to control the positioning of an imaging device for imaging a defect so that the defect is easily visually inspected.
[0008]
To achieve the aforementioned object, a defect inspection method according to the present invention has the following characteristics.
According to the present invention, a defect inspection system includes a database, a defect extraction processing unit, a defect determination processing unit, a device positioning controller, and a display controller.
In the database, defect type data including defect types related to a structure and characteristic information associated with each of the defect types, and device control data defining device positioning for each of the defect types are associated with each other and registered.
The defect extraction processing unit is configured to extract a defect region including a defect from an image captured by an imaging device.
The defect determination processing unit is configured to reference the characteristic information of the defect type data and classify the type of the defect included in the defect region extracted by the defect extraction processing unit.
The device positioning controller is configured to reference the device control data, determine the positioning of the imaging device associated with the type of the defect classified by the defect determination processing unit, and control the imaging device so that the imaging device is positioned based on the determined positioning.
The display controller is configured to control a display device so that the display device displays the image captured by the imaging device controlled by the device positioning controller.
Other aspects are described later.
[0009]
According to the present invention, it is possible to control the positioning of an imaging device for imaging a defect so that the defect is easily visually inspected.
BRIEF DESCRIPTION OF THE DRAWINGS

[0010]
FIG. 1 is a diagram showing an entire configuration of a defect inspection system according to an embodiment of the present invention;
FIG. 2 is a flowchart showing a process to be executed by the defect inspection system according to the embodiment of the present invention;
FIG. 3 is a perspective view of a three-dimensional coordinate system of the defect inspection system according to the embodiment of the present invention;
FIG. 4 is a diagram showing an xy plane when viewed from the side of a front surface of a region to be inspected according to the embodiment of the present invention;
FIG. 5 is a cross-sectional view of an imaging device shown in FIG. 4 according to the embodiment of the present invention;
FIG. 6 is a cross-sectional view of an illuminator shown in FIG. 4 according to the embodiment of the present invention;
FIG. 7 is a perspective view showing a positional relationship of the defect inspection system in an image capturing process according to the embodiment of the present invention;
FIG. 8 is a side view showing a crack shown in FIG. 7 according to the embodiment of the present invention;
FIG. 9 is a screen diagram showing an image captured from the side of the front surface and indicating a region to be inspected according to the embodiment of the present invention;
FIG. 10 is a graph of a luminance distribution in the captured image shown in FIG. 9 according to the embodiment of the present invention;
FIG. 11 is a configuration diagram describing a defect type determination process according to the embodiment of the present invention;
FIG. 12 is a perspective view describing a positioning determination process and showing orientation θ of devices according to the embodiment of the present invention;
FIG. 13 is a perspective view describing the positioning determination process and showing angles φ of the devices with respect to a crack according to the embodiment of the present invention; and
FIG. 14 is a perspective view describing the positioning determination process and showing angles φ of the devices with respect to peeling of a surface layer according to the embodiment of the present invention.
DETAILED DESCRIPTION

[0011]
Hereinafter, an embodiment of the present invention is described in detail with reference to the accompanying drawings.
[0012]
FIG. 1 is a diagram showing an entire configuration of a defect inspection system. The defect inspection system includes an imaging device driving mechanism 20, an illuminator driving mechanism 30, a PC 50, and a display device 60.
The imaging device driving mechanism 20 moves an imaging device 21 and a self-position measurer 22 while holding the imaging device 21 and the self-position measurer 22. The illuminator driving mechanism 30 moves an illuminator 31 and a relative position measurer 32 while holding the illuminator 31 and the relative position measurer 32. The imaging device driving mechanism 20 and the illuminator driving mechanism 30 are remotely controlled by manual commands based on remote control operations performed by a person or are remotely controlled by automatic commands from the PC 50.
[0013]
The imaging device driving mechanism 20 and the illuminator driving mechanism 30 are configured as underwater moving devices whose positions and orientation in water are controlled using a plurality of thrusters, for example. Alternatively, instead of the underwater moving devices, flying devices such as drones or traveling devices with wheels may be used.
FIG. 1 shows an example in which the two mechanisms, the imaging device driving mechanism 20 and the illuminator driving mechanism 30, are moved independently of each other. The two mechanisms, however, may be configured as a single integrated mechanism. For example, the imaging device driving mechanism 20 may have an arm for holding the illuminator driving mechanism 30 and freely change the relative position of the illuminator driving mechanism 30 to the position of the imaging device driving mechanism 20.
[0014]
The imaging device 21 images a defect 12 on a region 11 to be inspected. The illuminator 31 radiates light toward the defect 12 on the region 11 to be inspected.
The self-position measurer 22 measures the position of the imaging device 21. The relative position measurer 32 measures a relative position of the illuminator 31 to the imaging device 21. A position measuring device that uses a laser distance meter to treat a peripheral structure as a reference structure is used for each of the self-position measurer 22 and the relative position measurer 32.
[0015]
As sections for measuring a spatial position of the imaging device 21 and a spatial position of the illuminator 31, a combination of the self-position measurer 22, capable of independently measuring its own position, and the relative position measurer 32, capable of measuring its position relative to another device, is described as an example. Alternatively, a self-position measurer may be used to measure the position of the illuminator 31, and a relative position measurer may be used to measure the position of the imaging device 21. As a further alternative, self-position measurers may be used to measure the position of the imaging device 21 and the position of the illuminator 31, respectively.

[0016]
In addition, the sections for measuring the position of the imaging device 21 and the position of the illuminator 31 may use one or more of the following exemplified methods.
- A method for measuring a distance using a laser to estimate a positional relationship with a peripheral structure
- A method for using a camera to acquire an image indicating a distance to a peripheral structure and estimating a position
- A method for using a laser to trace a marker position and estimating a relative position
- A method for using a radio wave or an ultrasonic wave to estimate a position

[0017]
The PC 50 includes a defect extraction processing unit 52, a defect determination processing unit 53, a device positioning controller 54, a display controller 55, and a database 51, and is connected to the display device 60. The PC 50 is configured as a computer including a central processing unit (CPU), a memory, a storage unit (storage section) such as a hard disk, and a network interface.
This computer causes the CPU to execute a program (also referred to as an application (app)) read into the memory, thereby operating a controller (control section) constituted by the processing units.
[0018]
The database 51 stores defect type data 51a and device control data 51b in association with each other. The defect type data 51a is used to classify defect types such as a crack and peeling of a surface layer. The device control data 51b is used to identify the position of the imaging device 21 and the position of the illuminator 31 at which a defect is easy to view. For example, when a defect is a crack, data indicating that the imaging device 21 is to be positioned on an upper side (in a vertical direction) with respect to the surface layer is defined as the device control data 51b so that a deep portion of the crack is easy to view.

[0019]
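To illustrate how the database 51 might associate the defect type data 51a with the device control data 51b, the following is a minimal sketch using a plain dictionary; every field name, angle, and distance here is a hypothetical placeholder, not a value taken from the patent.

```python
# Hypothetical sketch of the database 51: each defect type is registered
# together with characteristic information (used for classification) and
# device positioning (used to control the imaging device 21 and the
# illuminator 31). Angles follow the patent's polar convention
# (orientation theta, angle phi, distance L); all values are illustrative.

defect_database = {
    "crack": {
        # Characteristic information: a crack appears as a thin, elongated
        # low-luminance region (large aspect ratio).
        "characteristics": {"min_aspect_ratio": 5.0},
        # Device control data: e.g. position the imaging device above the
        # surface layer so the deep portion of the crack is easy to view.
        "device_control": {
            "camera": {"theta": 90.0, "phi": 0.0, "L": 0.5},
            "illuminator": {"theta": 270.0, "phi": 45.0, "L": 0.5},
        },
    },
    "peeling": {
        # Peeling of a surface layer is a broader, less elongated region.
        "characteristics": {"min_aspect_ratio": 1.0},
        "device_control": {
            "camera": {"theta": 0.0, "phi": 30.0, "L": 0.8},
            "illuminator": {"theta": 180.0, "phi": 60.0, "L": 0.8},
        },
    },
}

def lookup_positioning(defect_type):
    """Return the device positioning registered for a defect type."""
    return defect_database[defect_type]["device_control"]
```

A real system would hold this mapping in persistent storage, but the lookup performed by the device positioning controller 54 is essentially this key-based retrieval.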
Upon receiving image data 23 captured by the imaging device 21 as an acquired image 91, the defect extraction processing unit 52 extracts a defect region 92 included in the acquired image 91.
The defect determination processing unit 53 crosschecks the defect 12 included in the defect region 92 with the defect type data 51a registered in the database 51 and determines the type of the defect 12.
The device positioning controller 54 references the device control data 51b registered in the database 51 and determines positioning of the imaging device 21 and the illuminator 31 suitable for the type of the defect 12 determined by the defect determination processing unit 53. Then, the device positioning controller 54 controls the imaging device driving mechanism 20 and the illuminator driving mechanism 30 based on the determined positioning.
The display controller 55 controls the display device 60 so that the acquired image 91 received from the imaging device 21 is displayed on a screen of the display device 60. An inspecting person can thus visually recognize, from the screen, the acquired image 91 captured with the imaging device 21 and the illuminator 31 automatically positioned to suit imaging of the defect 12, without manually adjusting their positions. The inspecting person can therefore concentrate on details of the displayed defect 12 and visually inspect the defect 12 in a stable fashion.
[0020]
FIG. 2 is a flowchart showing a process to be executed by the defect inspection system.
In a device movement process of S101, the devices (or the imaging device driving mechanism 20 and the illuminator driving mechanism 30) are moved toward the region 11 to be inspected.
In an inspection target region arrival process of S102, the devices moved in S101 arrive at positions at which the defect 12 included in the region 11 to be inspected can be imaged.
In an image capturing process of S103, the imaging device 21 images, as captured image data 23, the region 11 to be inspected, including the defect 12 (refer to FIGS. 7 and 8 for details).
In a defect extraction process of S104, the defect extraction processing unit 52 receives the captured image data 23 as the acquired image 91 from the imaging device 21 and extracts, from the acquired image 91, a defect region 92 including the defect 12 and a surface region 93 excluding the defect region 92 (refer to FIGS. 9 and 10 for details).

[0021]
In a defect existence determination process of S111, the defect extraction processing unit 52 determines whether or not the defect region 92 exists in the acquired image 91 as a result of S104. When the answer to S111 is Yes, the process proceeds to S112. When the answer to S111 is No, the process proceeds to S115.
In a defect type determination process of S112, the defect determination processing unit 53 uses image data of the defect region 92 as a search key to reference the defect type data 51a stored in the database 51 and determines (identifies) the type of the defect 12 included in the defect region 92 (refer to FIG. 11 for details).

[0022]
In a defect visual recognition determination process of S113, the defect determination processing unit 53 uses a predetermined defect visibility parameter to evaluate the defect 12 included in the defect region 92 and determined in S112, and determines whether or not the defect can be visually recognized (or can be easily viewed by a person). This process determines that the defect can be visually recognized when, for example, the defect visibility parameter exceeds a predetermined threshold. When the answer to S113 is Yes, the process proceeds to S114. When the answer to S113 is No, the process proceeds to S121.
[0023]
In a defect image display process of S114, the display controller 55 causes the display device 60 to display the acquired image 91 as a result of imaging the defect 12.
In an entire region inspection completion determination process of S115, the defect extraction processing unit 52 determines whether or not scheduled imaging of the entire region 11 to be inspected has been completed. When the answer to S115 is Yes, the process shown in FIG. 2 is terminated. When the answer to S115 is No, the process returns to S101 and the devices are moved to the remaining imaging positions.

[0024]
In a device positioning determination process of S121, the device positioning controller 54 references the device control data 51b associated with the defect type data 51a indicating the defect type determined in S112 and determines device positioning defined based on the type of the defect 12 (refer to FIGS. 12 to 14 for details). As described later, the device positioning controller 54 may determine not only the positions of the devices but also the postures (orientation) of the devices with respect to the positions.
In a device positioning adjustment process of S122, the device positioning controller 54 adjusts the positions of the devices (or the imaging device driving mechanism 20 and the illuminator driving mechanism 30) based on the device positioning determined in S121. Specifically, the device positioning controller 54 transmits driving command signals to the imaging device driving mechanism 20 and the illuminator driving mechanism 30 so that the positions measured by the self-position measurer 22 and the relative position measurer 32 match the device positions determined in S121. Then, the process returns to S103.
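The control flow of S101 to S122 can be sketched as a loop over the regions to be inspected; the helper below stubs the imaging and extraction steps with precomputed contrast values, so it illustrates the branching only, not an implementation of the processing units.

```python
# Minimal sketch of the S101-S122 control flow. Each region is a dict with a
# precomputed 'contrast' value per attempt, standing in for images captured
# after successive repositioning (S121/S122). A value of None means no defect
# region was extracted (the S111 "No" branch).

def inspect(regions, visibility_threshold=2.0, max_retries=3):
    """Run the inspection loop and return the regions whose defect image
    was displayed (S114)."""
    displayed = []
    for region in regions:                           # S101/S102: move, arrive
        for attempt in range(max_retries):
            contrast = region["contrast"][attempt]   # S103/S104: image, extract
            if contrast is None:                     # S111: no defect region
                break
            if contrast > visibility_threshold:      # S113: visible enough?
                displayed.append(region["name"])     # S114: display the image
                break
            # S121/S122: otherwise reposition the devices and image again
    return displayed                                 # S115: all regions done

regions = [
    {"name": "weld A", "contrast": [1.2, 2.5, 3.0]},     # visible on 2nd try
    {"name": "weld B", "contrast": [None, None, None]},  # no defect found
]
result = inspect(regions)
```

In the sample data, "weld A" becomes visible only after one repositioning step, mirroring the S113 → S121 → S103 loop of the flowchart.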
[0025]
A three-dimensional coordinate system of the defect inspection system is described with reference to FIGS. 3 to 6.
FIG. 3 is a perspective view describing the three-dimensional coordinate system of the defect inspection system. The three-dimensional coordinate system of the defect inspection system is defined as an x, y, z coordinate system in which a flat surface of the region 11 to be inspected is treated as a plane (x, y) and a vertical line 13 perpendicular to the plane (x, y) of the region 11 to be inspected is treated as a z axis.
In addition, the three-dimensional coordinate system (x, y, z) is also defined as a polar coordinate system (orientation θ, angle φ, distance L) in which an intersection (or a substantially central position of the defect 12) of the region 11 to be inspected and the vertical line 13 is used as the origin of the coordinate system.
Specifically, a three-dimensional position of the imaging device 21 is expressed by (orientation θ1, angle φ1, distance L1). Similarly, a three-dimensional position of the illuminator 31 is expressed by (orientation θ2, angle φ2, distance L2).
[0026]
When the region 11 to be inspected is not a flat surface but a curved surface, a surface having an irregularity, or the like, a plane perpendicular to the vertical line 13 extending from the position of the defect 12 may be defined as the reference xy plane. In addition, positional information of the relative positions of the imaging device 21 and the illuminator 31 to the defect 12 may be expressed in a world coordinate system (x, y, z) instead of the polar coordinate system (orientation θ, angle φ, distance L), or each of the positions of the imaging device 21 and the illuminator 31 may be defined by arbitrary three parameters defining the position in a three-dimensional space.
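Converting a device position from this polar coordinate system into the world coordinate system is the conventional spherical-to-Cartesian mapping; the sketch below assumes θ is measured in the xy plane from the x axis and φ is the inclination from the vertical line 13 (the z axis), and the function name is illustrative.

```python
import math

def polar_to_world(theta_deg, phi_deg, distance):
    """Convert (orientation theta, angle phi, distance L) to (x, y, z).

    theta is measured counterclockwise from the x axis in the inspected
    plane; phi is the inclination from the vertical line 13 (the z axis);
    the origin is the intersection of the plane and the vertical line,
    i.e. roughly the center of the defect 12.
    """
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    x = distance * math.sin(phi) * math.cos(theta)
    y = distance * math.sin(phi) * math.sin(theta)
    z = distance * math.cos(phi)
    return (x, y, z)

# A device directly above the defect (phi = 0) lies on the z axis.
overhead = polar_to_world(0.0, 0.0, 1.0)
```

The same conversion applies to both the imaging device 21 (θ1, φ1, L1) and the illuminator 31 (θ2, φ2, L2).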
[0027]
FIG. 4 is a diagram showing the xy plane when viewed from the side of the front surface of the region 11 to be inspected. When the region 11 to be inspected is treated as the reference plane (xy plane), orientation θ is a counterclockwise angle measured using the x axis as 0 degrees.
FIG. 4 also shows a sectional line 14L-14R extending in the direction of the orientation θ1 in which the imaging device 21 is located and a sectional line 15L-15R extending in the direction of the orientation θ2 in which the illuminator 31 is located. Cross-sectional views taken along the sectional lines are described below.
[0028]
FIG. 5 is a cross-sectional view, taken along the sectional line 14L-14R, of the imaging device 21 shown in FIG. 4. The angle φ1 indicates how much the imaging device 21 is inclined toward the sectional line 14L-14R with respect to the vertical line 13.
FIG. 6 is a cross-sectional view, taken along the sectional line 15L-15R, of the illuminator 31 shown in FIG. 4. The angle φ2 indicates how much the illuminator 31 is inclined toward the sectional line 15L-15R with respect to the vertical line 13.
[0029]
Next, the image capturing process of S103 is described with reference to FIGS. 7 and 8.
FIG. 7 is a perspective view showing a positional relationship of the defect inspection system in the image capturing process of S103. The imaging device 21 images, as the captured image data 23, the region 11 to be inspected, including the defect 12, in a state in which the defect 12 is irradiated with light by the illuminator 31.
A crack, which is an example of the defect 12, occurs in a surface of a structure over time and has an opening with a width W, a length L, and a depth D in the surface to be inspected.
FIG. 8 is a side view showing the crack that is the defect 12 shown in FIG. 7. The larger the depth D, the harder it is for the illuminator 31 to irradiate the bottom of the crack with light. Thus, a gray-scaled image portion occurs at the position of the defect 12 in the captured image data 23.
[0030]
In addition, the defect extraction process of S104 is described with reference to FIGS. 9 and 10.
FIG. 9 is a screen diagram showing, as an example of the captured image data 23, the acquired image 91 captured from the side of the front surface of the region 11 to be inspected. A vertical direction of the acquired image 91 is treated as a Y axis, while a horizontal direction of the acquired image 91 is treated as an X axis.
In the acquired image 91, the surface region 93, which does not include the defect 12 and is irradiated with illumination light, has high luminance (appears white), while the defect region 92, which includes the defect 12, has low luminance (appears black) since the amount of illumination light reaching it is reduced. To aid the description of FIG. 10, FIG. 9 also shows a sectional line 94L-94R that extends through the center of the defect region 92 across the surface region 93.
[0031]
FIG. 10 is a graph of a luminance distribution on the sectional line 94L-94R in the acquired image 91 shown in FIG. 9. An abscissa of the graph indicates the X axis of the acquired image 91, and an ordinate of the graph indicates a luminance value of each position on the X axis.
The defect extraction processing unit 52 executes image processing on the acquired image 91 and extracts the defect region 92 having low luminance. Specifically, the defect extraction processing unit 52 sets an arbitrary defect threshold for a luminance value or gray level information and executes a binarization process to extract, as the defect, the region having luminance lower than the defect threshold.
The defect extraction processing unit 52 need not execute the binarization process to extract a defect boundary portion, as long as the defect region with low luminance is extracted. For example, the defect extraction processing unit 52 may execute an edge extraction process to extract the defect boundary portion and classify pixels into the defect region 92 or the surface region 93.
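The binarization step of S104 can be sketched as follows; the pixel values and the defect threshold of 100 are illustrative, and a real implementation would operate on the captured image data 23 rather than a hard-coded list.

```python
# Minimal sketch of the defect extraction of S104: binarize a grayscale
# image against a defect threshold and collect the low-luminance pixels as
# the defect region 92; the remaining pixels form the surface region 93.
# The 4x6 "image" and the threshold of 100 are illustrative values only.

image = [
    [200, 205, 198, 202, 201, 199],
    [203,  40,  35, 201, 200, 202],
    [201,  38,  42, 199, 203, 200],
    [202, 200, 201, 198, 202, 201],
]
DEFECT_THRESHOLD = 100

defect_region = []   # pixel coordinates whose luminance is below threshold
surface_region = []  # everything else
for y, row in enumerate(image):
    for x, luminance in enumerate(row):
        if luminance < DEFECT_THRESHOLD:
            defect_region.append((x, y))
        else:
            surface_region.append((x, y))

defect_found = len(defect_region) > 0  # drives the S111 branch
```

In this toy image the four dark pixels form the defect region, and `defect_found` corresponds to the Yes branch of the defect existence determination of S111.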
[0032]
The luminance distribution graph shown in FIG. 10 is useful not only for determining whether or not a defect exists (in the defect extraction process of S104) but also for determining visibility when a defect exists (in the defect visibility enabling determination process of S113). When the average luminance value of the defect region 92 is L1, the average luminance value of the surface region 93 is L2, and the variation in the luminance of the surface region 93 is N, the defect determination processing unit 53 calculates the contrast C (= (L2 - L1) / N) of the two regions as the defect visibility parameter.
When C > a predetermined threshold (for example, 2), the defect determination processing unit 53 determines that the defect region and the surface region can be distinguished upon visual recognition. The calculation method used in the determination is not limited to this; any method is acceptable as long as the visibility of the defect region and that of the surface region are quantitatively expressed and the calculated values are evaluated against a threshold.
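The contrast calculation and threshold test above can be written out as follows. This is a minimal sketch assuming the "variation" N is the standard deviation of the surface luminance (one plausible reading); the names and toy data are illustrative.

```python
import numpy as np

def visibility_contrast(defect_pixels, surface_pixels):
    """Defect visibility parameter C = (L2 - L1) / N, where L1 and L2
    are the average luminances of the defect and surface regions and
    N is the standard deviation of the surface luminance."""
    l1 = np.mean(defect_pixels)
    l2 = np.mean(surface_pixels)
    n = np.std(surface_pixels)
    return (l2 - l1) / n

surface = np.array([200.0, 210.0, 190.0, 205.0, 195.0])  # bright, slightly noisy
defect = np.array([60.0, 55.0, 65.0])                    # dark defect region
c = visibility_contrast(defect, surface)
visible = c > 2.0  # the example threshold given in the text
```

Raising L2, lowering L1, or reducing N — the three levers named in the text — all increase C under this formula.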
To improve this defect visibility parameter C, it is sufficient if the positioning of the devices is changed so that the average luminance value L1 of the defect region 92 is reduced, the average luminance value L2 of the surface region 93 is increased, and the variation N in the luminance of the surface region 93 is reduced.
[0033]
FIG. 11 is a configuration diagram describing the defect type determination process of S112.
The defect determination processing unit 53 determines the type of the defect included in the defect region 92 by fitting a predetermined shape so that it surrounds the defect region 92. To do so, the defect determination processing unit 53 references the defect type data 51a, in which the predetermined shape is associated with the defect type.
FIG. 11 shows, as an example of the defect type data 51a, the case where the predetermined shape is a thin and long ellipse, and a defect to which the thin and long ellipse is applied is a crack. The thin and long ellipse is formed in an extended shape in which a ratio of the longest diameter a of the thin and long ellipse to the shortest diameter b of the thin and long ellipse is larger than a predetermined value (of 5 or the like), and the longest diameter a is longer than the shortest diameter b.
The defect determination processing unit 53 calculates the minimum elliptical shape circumscribing the defect 12 included in the defect region 92 and calculates the longest diameter a of the calculated elliptical shape, the shortest diameter b of the calculated elliptical shape, and an inclination Θ of the major axis of the calculated elliptical shape with respect to an x axis in the coordinate system. Then, the defect determination processing unit 53 determines, based on the calculated inclination Θ, the orientation of the imaging device 21 to be positioned and the orientation of the illuminator 31 to be positioned and determines, based on the ratio a/b of the longest diameter a to the shortest diameter b, the angle of the imaging device 21 to be positioned and the angle of the illuminator 31 to be positioned.
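The ellipse parameters a, b, and θ can be estimated in several ways. The sketch below uses the moment-equivalent ellipse of the defect mask (from second-order image moments) as a stand-in for the minimum circumscribing ellipse the text describes — a common approximation, not necessarily the patent's method; all names are illustrative.

```python
import numpy as np

def equivalent_ellipse(mask):
    """Estimate the long axis a, short axis b, and inclination theta
    (radians, measured from the x axis) of a defect mask from the
    eigen-decomposition of its pixel-coordinate covariance."""
    ys, xs = np.nonzero(mask)
    x = xs - xs.mean()
    y = ys - ys.mean()
    cov = np.cov(np.stack([x, y]))
    evals, evecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    b, a = 2.0 * np.sqrt(np.clip(evals, 0.0, None))
    major = evecs[:, 1]                         # eigenvector of the largest eigenvalue
    theta = np.arctan2(major[1], major[0])
    return a, b, theta

# Horizontal 1x9 "crack": the long axis lies along x.
mask = np.zeros((5, 9), dtype=bool)
mask[2, :] = True
a, b, theta = equivalent_ellipse(mask)
```

The ratio a/b and the inclination θ recovered here feed directly into the orientation and angle determinations described in the text.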
[0034]
The defect described in the embodiment is not limited to the crack. As described later, it is sufficient if the defect described in the embodiment has a shape in which a gray-scaled image portion occurs in an irregular portion of a surface of the defect when the defect is irradiated with reference light and imaged. For example, a swell, an attached foreign object, surface layer's peeling, or the like may be an object to be detected.
For example, when a shape (close to a true circle) in which a ratio of the longest diameter a of the shape to the shortest diameter b of the shape is smaller than that of the thin and long ellipse to be applied to a crack is applied as another example of the predetermined shape to be applied, the defect type data 51a indicating that the defect is surface layer's peeling may be used.
In addition, when the predetermined shape is applied to a shape (protrusion) in which the defect region 92 protrudes in the z axis direction from the xy plane of the region 11 to be inspected, the defect type data 51a indicating that the defect is a swell or an attached foreign object may be used. For the defect type data 51a, not only the method of fitting an ellipse but also a method of calculating the directivity and orientation of the defect region may be used.
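Put together, the classification rules sketched in paragraphs [0033] and [0034] might look like the following. The rule order, the threshold value, and the return labels are illustrative assumptions, not the patent's exact logic.

```python
def classify_defect(a, b, protrudes, elongation_threshold=5.0):
    """Classify a defect from the circumscribing-ellipse axes a >= b:
    a region protruding above the surface is a swell or attached
    foreign object, a thin elongated ellipse (a/b above the threshold,
    5 in the text's example) is a crack, and a near-circular region
    is surface layer's peeling."""
    if protrudes:
        return "swell or attached foreign object"
    if a / b > elongation_threshold:
        return "crack"
    return "peeling"
```

For example, a 12:1.5 ellipse classifies as a crack, while a 4:3.5 ellipse classifies as peeling.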
[0035]
The device positioning determination process of S121 that is executed based on the device control data 51b is described with reference to FIGS. 12 to 14.
The distance L1 of the imaging device 21 is determined based on the number of pixels of the imaging device 21 and the sizes of the pixels by setting the resolution for the width of the defect in advance. Alternatively, the distance L1 of the imaging device 21 may be determined by a known automatic focusing mechanism so that the imaging device 21 brings the defect into focus.
The distance L2 of the illuminator 31 is determined based on required luminance of the surface region and required illuminance of the illuminator.
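As a hedged illustration of how these two distances might be computed, the sketch below uses a pinhole-camera approximation for L1 and the inverse-square law for L2. The formulas, parameter names, and numbers are assumptions for illustration, not the patent's exact method.

```python
import math

def imaging_distance(focal_length_mm, pixel_pitch_mm, required_resolution_mm):
    """Working distance at which one pixel spans the required
    object-space resolution, from the pinhole relation
    resolution = pixel_pitch * L / f, i.e. L = resolution * f / pitch."""
    return required_resolution_mm * focal_length_mm / pixel_pitch_mm

def illuminator_distance(luminous_intensity_cd, required_illuminance_lx):
    """Distance (m) from the inverse-square law E = I / d**2 for a
    point source aimed at the surface."""
    return math.sqrt(luminous_intensity_cd / required_illuminance_lx)

l1 = imaging_distance(focal_length_mm=25.0, pixel_pitch_mm=0.005,
                      required_resolution_mm=0.05)       # -> 250.0 mm
l2 = illuminator_distance(luminous_intensity_cd=400.0,
                          required_illuminance_lx=100.0)  # -> 2.0 m
```

An autofocus mechanism, as the text notes, can replace the L1 calculation entirely.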
[0036]
FIG. 12 is a perspective view describing the positioning determination process and showing the orientations θ of the devices.
As device positioning for improvement of the defect visibility parameter C, the orientation θ2 of the illuminator 31 is set to be perpendicular to the long axis of the defect region 92, for example. Thus, the luminance of the defect region 92 is reduced by reducing the amount of light with which the inner cross-sectional surface of the defect 12 is irradiated.
The orientation θ1 of the imaging device 21 is set to be perpendicular to the long axis of the defect region 92 and opposite to the orientation θ2 of the illuminator 31. Thus, the luminance of the defect region 92 is reduced by reducing the amount of light reflected or diffused from the inner cross-sectional surface of the defect 12 and incident on the imaging device 21.
The device control data 51b may be defined as data to be used to determine the orientation of the devices so that the defect 12 is easily viewed while the orientation in which the defect 12 defined by the defect type data 51a exists is treated as reference orientation.
[0037]
FIG. 13 is a perspective view describing the positioning determination process and showing the angles φ of the devices with respect to the crack.
The angle φ2 of the illuminator 31 is smaller than the inclination angle of the defect (and is measured on the side of the vertical line 13). The inclination angle of the defect = tan⁻¹(the shortest diameter b / the longest diameter a). In FIG. 13, the inclination of the defect is indicated by a solid line, and the inclination angle of the defect is defined by three points (a point Pd on the vertical line 13, the deepest point Pc of the defect 12, and a surface point Pb of the defect 12). When the illuminator 31 is positioned almost immediately above the crack and oriented in an almost vertical direction toward the crack located almost immediately below it so that incident light reaches the deepest point Pc of the defect 12, light reflected from the inner cross-sectional surface of the defect 12 can be suppressed and the contrast can be increased.
A relationship between the inclination angle of the defect and the angle of the illuminator is determined so that the difference between the two is equal to or larger than a predetermined angular value and the illuminator does not interfere with the target to be inspected. Although the inclination angle of the defect is calculated as tan⁻¹(the shortest diameter b / the longest diameter a) as an example, the method for estimating the inclination angle of the defect is not limited to this, and another method may be used as long as the same effects can be obtained.
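A minimal sketch of this relationship, assuming the inclination angle is estimated as tan⁻¹(b / a) and the predetermined angular value (margin) is 10 degrees — both the margin value and the names are illustrative assumptions:

```python
import math

def illuminator_angle(a, b, margin_deg=10.0):
    """Angle phi2 of the illuminator, measured from the vertical line 13,
    chosen smaller than the defect inclination angle atan(b / a) by at
    least a fixed margin so incident light reaches the deepest point,
    and clamped at 0 (immediately above the defect)."""
    inclination_deg = math.degrees(math.atan2(b, a))
    return max(0.0, inclination_deg - margin_deg)
```

For a thin crack (small b/a) the result clamps to 0, matching the text's remark that the illuminator is positioned almost immediately above the crack.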
[0038]
When the surface region 93 is a mirror surface, the angle φ1 of the imaging device 21 to be positioned is set to be equal to the angle φ2 of the illuminator 31 to be positioned. Specifically, an angle measured on the side of the vertical line 13 and smaller than the defect's inclination angle defined by three points (the point Pd on the vertical line 13, the deepest point Pc of the defect 12, and a surface point Pa of the defect 12) is treated as the angle φ1 of the imaging device 21. Thus, by causing regularly reflected light from the illuminator 31 to be incident on the imaging device 21, the luminance of the surface region 93 can be improved and the contrast can be increased.
In addition, when the surface region 93 is a rough surface, light diffused from the surface region 93 is nearly isotropic regardless of the angle φ1 of the imaging device 21. Thus, the visibility of the defect can be improved by positioning the imaging device 21 immediately above the defect (in the direction of the vertical line 13) so that a large number of shadow regions of the defect can be imaged.
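The mirror-surface versus rough-surface rule for the imaging angle φ1 condenses to a one-line selection. This is an illustrative sketch of the rule described above; the names are assumptions.

```python
def imaging_angle(phi2_deg, surface_is_mirror):
    """Angle phi1 of the imaging device, measured from the vertical line:
    equal to the illuminator angle phi2 for a mirror surface (to catch
    regularly reflected light), and 0 (immediately above the defect)
    for a rough surface whose diffuse reflection is nearly isotropic."""
    return phi2_deg if surface_is_mirror else 0.0
```

In the mirror case this places the imaging device symmetrically opposite the illuminator, as FIG. 13 suggests.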
[0039]
FIG. 14 is a perspective view describing the positioning determination process and showing angles φ of the devices with respect to surface layer's peeling. Differently from the crack shown in FIG. 13, the peeling shown in FIG. 14 is easily viewed when the imaging device 21 and the illuminator 31 are oriented almost along the xy plane of the region 11 to be inspected.
The angle φ1 of the imaging device 21 is formed between the vertical line 13 and an extended line connecting a bottom surface point Pf of the peeling to a surface point Pa of the peeling. Similarly, the angle φ2 of the illuminator 31 is formed between the vertical line 13 and an extended line connecting a bottom surface point Pe of the peeling to a surface point Pb of the peeling.
[0040]
As described above, the device positioning controller 54 determines the position (orientation θ1, angle φ1, distance L1) of the imaging device 21 and the position (orientation θ2, angle φ2, distance L2) of the illuminator 31 based on the type of the defect 12 (crack, peeling, or the like) determined by the defect determination processing unit 53 to improve the visibility of the defect.
In addition, the device positioning controller 54 may determine the posture (orientation) of the imaging device 21 and the posture (orientation) of the illuminator 31 so that the imaging device 21 and the illuminator 31 are oriented toward the defect 12 from the determined positions of the imaging device 21 and the illuminator 31.
[0041]
In the embodiment described above, the defect extraction processing unit 52 extracts, from the acquired image 91 received from the imaging device 21, the defect region 92 including the defect 12. Then, the defect determination processing unit 53 classifies the type of the defect based on characteristic information such as the shape of the defect 12 included in the defect region 92 and a gray level of the defect 12. In addition, the device positioning controller 54 reads, from the database 51, the device control data 51b associated with the defect type data 51a indicating the classified type of the defect and controls the imaging device driving mechanism 20 and the illuminator driving mechanism 30.
Since positioning of the imaging device 21 and the illuminator 31 in states in which the imaging device 21 easily images defects 12 is defined for each type of defect 12 in the defect type data 51a and the device control data 51b, the imaging device 21 controlled based on the determined positioning can capture an image 91 in which the visibility of the defect 12 is high. Thus, even when the inspecting persons are dependent on their skills (or those skills vary), the defect determination can be made efficiently and in a stable manner.
[0042]
The present invention is not limited to the embodiment and includes various modified examples. For example, the embodiment is described above in detail to clearly explain the present invention and may not include all the configurations described above.
In addition, one or more of the configurations described in the embodiment may be replaced with a configuration according to another embodiment. In addition, a configuration according to another embodiment may be added to one or more of the configurations described in the embodiment.
In addition, another configuration may be added to or replaced with one or more of the configurations described in the embodiment, and one or more of the configurations described in the embodiment may be omitted. All or one or more of the configurations, the functions, the processing units, the processing sections, and the like may be enabled by hardware or designed as an integrated circuit, for example.
In addition, the configurations, the functions, and the like may be enabled using software by causing a processor to interpret and execute a program for enabling the functions.
[0043]
Information such as the program for enabling the functions, a table, and a file may be stored in a storage device such as a memory, a hard disk, or a solid state drive (SSD) or in a storage medium such as an integrated circuit (IC) card, an SD card, or a digital versatile disc (DVD).
In addition, while control lines and information lines that are necessary for the description are shown, all control lines and information lines of the products are not necessarily shown. In fact, it may be considered that almost all the configurations are connected to each other.
Furthermore, a communication section that connects the devices to each other is not limited to a wireless LAN and may be replaced with a wired LAN or another communication section.

Claims (5)

1. A defect inspection method to be executed by a defect inspection system including a database, a defect extraction processing unit, a defect determination processing unit, a device positioning controller, and a display controller, the defect inspection method comprising:
registering, in the database, defect type data including defect types related to a structure and characteristic information indicating and associated with each of the defect types, and device control data defining, for each of the defect types, device positioning enabling defects to be easily viewed so that the defect type data is associated with the device control data;
causing the defect extraction processing unit to extract a defect region including a defect from an image captured by an imaging device;
causing the defect determination processing unit to reference the characteristic information of the defect type data and classify the type of the defect included in the defect region extracted by the defect extraction processing unit;
causing the device positioning controller to reference the device control data, determine positioning, associated with the type of the defect classified by the defect determination processing unit, of the imaging device, and control the imaging device so that the imaging device is positioned based on the determined positioning; and causing the display controller to control a display device so that the display device displays the image captured by the imaging device controlled by the device positioning controller.
2. The defect inspection method according to claim 1, further comprising:
registering a crack of a flat surface of a structure as a defect type in the database;
causing the defect determination processing unit to reference the defect type data indicating the crack and determine that the type of the defect included in the defect region is a crack when a ratio of the longest diameter of an elliptical shape circumscribing the defect region to the shortest diameter of the elliptical shape is larger than a predetermined value; and causing the device positioning controller to reference the device control data indicating the crack, and positioning the imaging device so that an angle of the imaging device with respect to the flat surface of the structure is large to enable the deepest point of the crack to be visually recognized.
3. The defect inspection method according to claim 2, further comprising:
causing the device positioning controller to position an illuminator at one of edge points on a line perpendicular to the longest diameter of an elliptical shape circumscribing the crack and position the imaging device at the other edge point on the line for a relationship between the position of the illuminator for radiating light toward the crack and the position of the imaging device for imaging the crack in coordinates of the flat surface of the structure.
4. The defect inspection method according to claim 1, further comprising:
registering peeling of a flat surface of a structure as a defect type in the database;
causing the defect determination processing unit to reference the defect type data indicating the peeling and determine that the type of the defect included in the defect region is peeling when a ratio of the longest diameter of an elliptical shape circumscribing the defect region to the shortest diameter of the elliptical shape is smaller than a predetermined value; and causing the device positioning controller to reference the device control data indicating the peeling and position the imaging device at a position on an extended line connecting a bottom surface point of the peeling to a surface point of the structure.
5. A defect inspection system comprising:
a database in which defect type data including defect types related to a structure and characteristic information indicating and associated with the defect types, and device control data defining, for each of the defect types, device positioning enabling a defect to be easily viewed are associated with each other and registered;
a defect extraction processing unit configured to extract a defect region including a defect from an image captured by an imaging device;
a defect determination processing unit configured to reference the characteristic information of the defect type data and classify the type of the defect included in the defect region extracted by the defect extraction processing unit;
a device positioning controller configured to reference the device control data, determine positioning, associated with the type of the defect classified by the defect determination processing unit, of the imaging device, and control the imaging device so that the imaging device is positioned based on the determined positioning; and a display controller configured to control a display device so that the display device displays the image captured by the imaging device controlled by the device positioning controller.
GB1818898.7A 2017-12-28 2018-11-20 Defect inspection method and defect inspection system Active GB2570377B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2017252743A JP7178171B2 (en) 2017-12-28 2017-12-28 Defect inspection method and defect inspection system

Publications (3)

Publication Number Publication Date
GB201818898D0 GB201818898D0 (en) 2019-01-02
GB2570377A true GB2570377A (en) 2019-07-24
GB2570377B GB2570377B (en) 2020-07-01

Family

ID=64740027

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1818898.7A Active GB2570377B (en) 2017-12-28 2018-11-20 Defect inspection method and defect inspection system

Country Status (2)

Country Link
JP (1) JP7178171B2 (en)
GB (1) GB2570377B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112378924A (en) * 2020-09-24 2021-02-19 宁波市鄞州世纪耀达市政建设有限公司 Pipeline crack positioning method and system, storage medium and intelligent terminal
CN114324387A (en) * 2021-12-14 2022-04-12 北京玖瑞科技有限公司 Plate defect detection device and method
CN114511557B (en) * 2022-04-02 2022-09-13 深圳市君合环保水务科技有限公司 Image processing-based underdrain structure defect detection method

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103383361A (en) * 2013-08-02 2013-11-06 湖州职业技术学院 Steel wire core conveyer belt detection device and method
EP3176537A1 (en) * 2015-12-01 2017-06-07 General Electric Company System for automated in-process inspection of welds
WO2018006180A1 (en) * 2016-07-08 2018-01-11 Ats Automation Tooling Systems Inc. System and method for combined automatic and manual inspection
KR20180133040A (en) * 2017-06-05 2018-12-13 충북대학교 산학협력단 Apparatus and method for classifying defect

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JPH07103905A (en) * 1993-10-07 1995-04-21 Toyo Commun Equip Co Ltd Flaw inspecting equipment
JP2005062037A (en) * 2003-08-15 2005-03-10 Fuji Photo Film Co Ltd Method for detecting flaw of ink jet recording paper and flaw detector therefor
JP2006208347A (en) * 2004-02-25 2006-08-10 Jfe Steel Kk Surface defect detector, grinding device, surface defect detection method and surface defect detection program for reduction roll, and reduction roll grinding method
JP2006058170A (en) * 2004-08-20 2006-03-02 Dainippon Screen Mfg Co Ltd Visual confirmation device and inspection system
JP2008076218A (en) * 2006-09-21 2008-04-03 Olympus Corp Visual inspection apparatus
KR100826153B1 (en) * 2006-11-29 2008-04-30 한국표준과학연구원 Width measurement method of the crack by using the depth value in histogram of image
JP5351673B2 (en) * 2009-09-09 2013-11-27 パナソニック株式会社 Appearance inspection device, appearance inspection method
JP5696221B2 (en) * 2011-09-15 2015-04-08 日立Geニュークリア・エナジー株式会社 Underwater inspection device
JP6099479B2 (en) * 2013-05-21 2017-03-22 大成建設株式会社 Crack detection method

Also Published As

Publication number Publication date
GB201818898D0 (en) 2019-01-02
GB2570377B (en) 2020-07-01
JP7178171B2 (en) 2022-11-25
JP2019120491A (en) 2019-07-22
