JP2010066153A - Method and device for inspecting appearance - Google Patents

Method and device for inspecting appearance

Info

Publication number
JP2010066153A
Authority
JP
Japan
Prior art keywords
image
area
luminance
inspected
inspection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2008233467A
Other languages
Japanese (ja)
Inventor
Naoki Fuse
直紀 布施
Original Assignee
Daido Steel Co Ltd
大同特殊鋼株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daido Steel Co Ltd, 大同特殊鋼株式会社 filed Critical Daido Steel Co Ltd
Priority to JP2008233467A priority Critical patent/JP2010066153A/en
Publication of JP2010066153A publication Critical patent/JP2010066153A/en
Pending legal-status Critical Current


Abstract

Provided are an appearance inspection method and an appearance inspection apparatus capable of accurately inspecting the surface of an object to be inspected by removing the influence of individual differences, even when the surface states of the objects to be inspected differ from one another.
The apparatus includes an imaging means capable of capturing a reference image and an image of the appearance of the object to be inspected; a storage means capable of storing the reference image captured by the imaging means; and a calculating means 61 that extracts a partial area from the image of the appearance of the object to be inspected, extracts an area at the corresponding coordinates from the reference image stored in the storage means, compares the luminance of the area extracted from the image of the object to be inspected with the luminance of the area extracted from the reference image, and corrects the luminance of the image of the object to be inspected based on the comparison result.
[Selection] Figure 1

Description

  The present invention relates to an appearance inspection method and an appearance inspection apparatus, and particularly to an appearance inspection method and an appearance inspection apparatus in which the surface of an object to be inspected, such as a cast or forged product, is imaged and the captured image is processed to inspect the object for defects.

  A cast or forged product, after being formed into a predetermined shape by casting or forging, may be subjected to an appearance inspection to check whether there are defects on its surface. Besides visual inspection by an operator (inspector), the surface of an object to be inspected such as a cast or forged product may be imaged with a CCD camera or the like and the captured image processed. As an appearance inspection performed by image processing, there is, for example, a method in which pattern matching is performed between an image of the surface of a defect-free cast or forged product (that is, a reference image) and an image of the surface of the object to be inspected, and a defect is assumed to exist when the correlation between the reference image and the image of the object to be inspected falls below a predetermined threshold.

  A cast product may be subjected to sandblasting after casting for the purpose of cleaning the surface and making the surface roughness uniform. Sandblasting is a process in which particles such as sand are sprayed onto the surface of the casting; the particles collide with and polish the surface, thereby cleaning it and making the surface roughness uniform.

  Sand used for sandblasting, when new or nearly new, has angular particles and a high ability to polish the surface of a cast product. Through repeated use in the sandblasting process, the sand wears, its particles become rounded overall, and its ability to polish the surface of the cast product is lowered. For this reason, when sandblasting is performed with new or nearly new sand, the surface of the cast product is heavily polished and the sandblasted surface becomes bright. On the other hand, when sandblasting is performed with worn sand, the surface of the cast product is not heavily polished, so its brightness is low. Thus, in the sandblasting process, individual differences may arise in the state of the surface of the cast product (for example, its degree of brightness) depending on the state of the sand used (for example, its usage history).

  Further, a forged product may be subjected to shot blasting for the purpose of cleaning its surface after forging, or to shot peening for the purpose of improving the mechanical properties of its surface. In shot blasting and shot peening, fine steel balls or steel powder are sprayed onto the surface, cleaning it and improving its mechanical properties. In these processes as well, as with sandblasting, individual differences may arise in the surface condition (for example, the degree of brightness) depending on the state and usage history of the steel balls or steel powder.

  When appearance inspection by image processing is performed on cast or forged products that have undergone sandblasting, shot blasting, or the like, the individual differences in surface condition (particularly in the degree of brightness) described above can cause the following problem. When imaging the surface of an object to be inspected, such as a cast or forged product, the surface is illuminated using a light source with a predetermined amount of light so that a portion having a predetermined surface property is imaged at a predetermined luminance. At this time, if there are individual differences in the surface condition (degree of brightness) of the objects to be inspected, the captured images differ in luminance despite the objects having similar surface properties.

  For this reason, when image processing is applied to a captured image, if image processing conditions suited to an image of a specific luminance are used, it becomes difficult to accurately determine the presence or absence of defects in images whose luminance falls outside that range. As a result, erroneous determinations may occur due to individual differences such as the degree of brightness of the surface of the object to be inspected. For example, when a portion having a luminance lower than a predetermined threshold is set to be determined as a defect, an object to be inspected whose surface brightness is low may be determined to have a defect even though no defect exists.

  Further, in a configuration in which the surface of the object to be inspected is illuminated with a light source of a predetermined light amount as described above, the light amount may change as the light source deteriorates over time. If the light amount of the light source differs for each object to be inspected, images of different luminance are captured for each object. This can cause the same problem as when there are individual differences in the surface condition (degree of brightness).

JP 2006-17668 A

  In view of the above situation, an object of the present invention is to provide an appearance inspection method and an appearance inspection apparatus capable of accurately inspecting the surface of an object to be inspected by removing the influence of individual differences, even when there are individual differences in the surface condition (particularly the degree of brightness) of the object to be inspected. Another object is to provide an appearance inspection method and apparatus capable of accurate inspection by removing the influence of luminance differences caused by curved surfaces, even when the surface of the object to be inspected includes portions formed of curved surfaces. A further object is to provide an appearance inspection method and apparatus capable of accurate inspection even when the light amount of the light source differs for each object to be inspected.

  In order to solve the above problems, the present invention is an appearance inspection method for inspecting the surface of an object to be inspected that has been subjected to sandblasting, shot blasting, or shot peening, comprising: a step of imaging the appearance of a reference article to acquire a reference image; a step of obtaining a reference luminance image indicating the luminance distribution of the reference image; a step of imaging the appearance of the object to be inspected to obtain an inspected object image; a step of obtaining an inspected object luminance image indicating the luminance distribution of the inspected object image; a step of extracting an area at a preset position from the inspected object luminance image as an inspected object area; a step of extracting an area corresponding to the position of the inspected object area from the reference luminance image as a reference area; a step of converting the luminance distribution data of the inspected object area and of the reference area into one-dimensional data in a predetermined coordinate direction and setting them as the luminance values of the respective areas; a comparison step of comparing the set luminance values of the areas coordinate by coordinate; a correction step of correcting, based on the comparison result, the luminance distribution of the inspected object luminance image so that it approaches the luminance distribution of the reference image; and a step of performing an appearance inspection by applying predetermined image processing to the corrected inspected object luminance image.

In the comparison step, it is preferable that the luminance value of the inspected object area is divided by the luminance value of the reference area for each coordinate to obtain a ratio for each coordinate, the change of the ratio in the coordinate direction is approximated by a straight line, and the intercept of the approximate straight line is set as a correction rate. In the correction step, it is preferable that the entire inspected object luminance image is uniformly multiplied by the reciprocal of the set correction rate.
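The comparison and correction steps described above can be sketched as follows. This is a minimal illustration with NumPy, not the patent's implementation; the function names and the data used are hypothetical.

```python
import numpy as np

def correction_rate(inspected_profile, reference_profile):
    """Divide the inspected-area luminance by the reference-area luminance
    at each coordinate, approximate the change of the ratio along the
    coordinate direction by a straight line, and return the intercept of
    that line as the correction rate."""
    inspected_profile = np.asarray(inspected_profile, dtype=float)
    reference_profile = np.asarray(reference_profile, dtype=float)
    coords = np.arange(inspected_profile.size)
    ratio = inspected_profile / reference_profile    # ratio at each coordinate
    slope, intercept = np.polyfit(coords, ratio, 1)  # linear approximation
    return intercept

def correct_luminance(image, rate):
    """Uniformly multiply the entire luminance image by the reciprocal of
    the correction rate."""
    return np.asarray(image, dtype=float) * (1.0 / rate)
```

For instance, if the inspected object is uniformly 20% brighter than the reference, the fitted ratio is a flat line with intercept 1.2, and multiplying the whole image by 1/1.2 brings it back to the level of the reference.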

  Further, it is preferable that, after the step of extracting an area at a preset position from the inspected object luminance image as the inspected object area, the method further includes a step of determining whether or not the image of the inspected object area includes a defect, and that, when it is determined in this step that a defect is included, a new inspected object area different from the extracted one is extracted from the inspected object luminance image, and an area corresponding to the newly extracted inspected object area is extracted from the reference image.

  The reference image obtained by imaging the appearance of the reference article may be formed using two images: one of a reference article treated with new sand or steel balls, and one of a reference article treated with worn sand or steel balls. In this case, in the comparison step, the average of the luminance values set for the two reference images is newly set as the luminance value of the reference area, and this newly set luminance value of the reference area is compared with the luminance value of the inspected object area coordinate by coordinate.

  The gist of the present invention is also an appearance inspection apparatus comprising: an imaging means capable of capturing a reference image of the appearance of a reference article and an inspected object image of the appearance of the object to be inspected; a storage means capable of storing the reference image captured by the imaging means; and a calculating means that obtains a reference luminance image indicating the luminance distribution of the reference image and an inspected object luminance image indicating the luminance distribution of the inspected object image, extracts an area at a preset position from the inspected object luminance image as an inspected object area, sets an area corresponding to the position of the inspected object area from the reference luminance image as a reference area, converts the respective luminance distributions of the inspected object area and the reference area into one-dimensional data in a predetermined coordinate direction and sets them as the luminance values of the respective areas, compares the set luminance values of the areas coordinate by coordinate, and corrects the luminance distribution of the inspected object luminance image so that it approaches the luminance distribution of the reference image.

  Here, it is preferable that the calculating means divides the luminance value of the inspected object area by the luminance value of the reference area for each coordinate to obtain a ratio for each coordinate, approximates the change of the ratio in the coordinate direction by a straight line, sets the intercept of the approximate straight line as a correction rate, and uniformly multiplies the entire inspected object luminance image by the reciprocal of the set correction rate.

  In the present invention, an "area" refers to a region in which pixels are arranged two-dimensionally. Specifically, for example, a normal region of 100 pixels × 100 pixels is an "area", and even a region of 1 pixel × 100 pixels (usually called a line) is included in an "area".

  According to the present invention, since the luminance of the image of the inspected object is corrected based on the reference image, the inspection can be performed accurately even when the image of the inspected object is processed under the image processing conditions set for the reference image. Therefore, accurate inspection is possible even when there are individual differences in the surface properties (especially the degree of brightness) of the objects to be inspected, or individual differences in the imaging conditions (especially the illumination conditions) due to deterioration of the light source over time.

  In particular, if the luminance value of the area extracted from the image of the appearance of the object to be inspected is divided by the luminance value of the area extracted from the reference image, and the image of the appearance of the object to be inspected is then multiplied by the reciprocal of the calculated value, the luminance of the image of the object to be inspected can be made substantially equal to the luminance of the reference image. Since the luminance of the images to be inspected can thus be nearly unified, the inspection can be performed accurately even when image processing is applied under the conditions set for the reference image.

  In addition, if a step of determining whether or not the image of the extracted area includes a defect is added after the step of extracting a partial area from the image of the appearance of the inspected object, and, when a defect is found, a different area is newly extracted from the image of the inspected object and the area corresponding to it is extracted from the reference image, the luminance of the entire image can be corrected based on the luminance of an area having no defect. The accuracy of the correction is thereby improved.

  Hereinafter, various embodiments of the present invention will be described in detail with reference to the drawings.

  FIG. 1 is a schematic view showing the configuration of an example of an appearance inspection apparatus 10 according to an embodiment of the present invention. The appearance inspection apparatus 10 is used for the appearance inspection of an impeller 1 (that is, a turbine wheel) for an automobile turbocharger as the object to be inspected. The turbocharger impeller 1 is formed by casting (for example, the lost-wax process), after which the surface is cleaned and the surface roughness made uniform by sandblasting. The appearance inspection apparatus according to the embodiment of the present invention determines the presence or absence of casting defects by inspecting the appearance of the sandblasted object to be inspected (the impeller (turbine wheel)).

  As shown in FIG. 1, the appearance inspection apparatus 10 according to the embodiment of the present invention includes a work holding means 20 that can hold the object to be inspected (the impeller 1) in a predetermined position, an imaging means 30 that can capture the appearance of the impeller 1 and obtain image information, a light source device 40 that can irradiate the impeller 1 with light, a work position sensor 50 that can detect the position of the impeller 1, and a control unit 60 that can perform image processing of captured images, control of the work holding means 20, and the like.

  The work holding means 20 can hold the object to be inspected (the impeller 1) at a predetermined position. As shown in FIG. 1, the work holding means 20 is configured such that a rotation mechanism portion 21 and a support portion 22 that can rotatably support the impeller 1 are mounted on a base 23. The rotation mechanism portion 21 includes a chuck portion 24 that abuts the wrench boss 4 at the axial end of the shaft member 2 of the impeller 1 and supports the impeller 1 on the wrench boss 4 side, and a servo motor 25 connected to the chuck portion 24 for rotating it. The support portion 22 includes a holder portion 26 that abuts the shaft holding portion 5 side of the impeller 1 and holds the impeller 1 rotatably between itself and the chuck portion 24. In this way, both axial ends of the impeller 1 are held by the work holding means 20.

  The imaging means 30 can capture the appearance of the impeller 1 and obtain image information. Various known imaging means such as a CCD camera can be applied as the imaging means 30. The light source device 40 can irradiate the impeller 1 with light; as the light source device 40, for example, various light sources such as an LED light and a flash light can be applied.

  The work position sensor 50 can detect the position (posture) of the impeller 1. The work position sensor 50 is connected to the control means 63 of the control unit 60, and can transmit the position information (posture information) of the impeller 1 held by the work holding means 20 to the control means 63. Various known position sensors can be used as the work position sensor 50.

  The control unit 60 can perform image processing of captured images and control of the imaging means 30 and the work holding means 20. Specifically, the control unit 60 includes a calculation means 61, a storage means 62, and a control means 63. The calculation means 61 is connected to the imaging means 30 and can receive data of an image captured by the imaging means 30 and perform predetermined processing on it. The storage means 62 can store image data captured by the imaging means 30, data processed by the calculation means 61, and the like. The control means 63 is connected to the work position sensor 50, the servo motor 25, and the like, and can receive signals from the work position sensor 50 and control the operation of the servo motor 25. For example, the control means 63 can control the rotation of the servo motor 25 so as to step-feed (rotate) the impeller 1. Further, the control means 63 can change the position and orientation of the imaging means 30 based on the signal of the work position sensor 50, the image information captured by the imaging means 30, and the like. A personal computer or a workstation can be applied as the control unit 60.

  FIG. 2 is a perspective view showing the appearance of the impeller 1 that is the target of the appearance inspection method according to the embodiment of the present invention: (a) is a view of the impeller 1 from one axial side, and (b) is a view of the impeller 1 from the other axial side. As illustrated in FIGS. 2(a) and 2(b), the impeller 1 includes a shaft member 2 and a plurality of twisted blades 3 provided around the shaft member 2. Further, as shown in FIG. 2(a), one end of the shaft member 2 is formed as a wrench boss 4 consisting of a regular polygonal convex portion such as a hexagonal bolt shape. The other end of the shaft member 2 is provided with a shaft holding portion 5 consisting of a crater-like recessed portion whose center is circularly depressed, as shown in FIG. 2(b). The shaft holding portion 5 is provided in order to join the shaft of another member by welding or the like. The impeller 1 is cast by the lost-wax process, and after molding by casting, sandblasting is performed to condition the surface. As the material, alloy stainless steel, TiAl, or the like is used.

  The appearance inspection apparatus 10 and the appearance inspection method according to the embodiment of the present invention are applied to inspect the surface of a cast product such as the impeller 1 after it has been subjected to sandblasting or the like. For example, the inspection is performed for the purpose of detecting pinholes and blowholes appearing on the surface of the impeller 1, or detecting cold-shut (hot water boundary) defects occurring at the ends of the twisted blades.

  Hereinafter, an appearance inspection method using the appearance inspection apparatus according to the embodiment of the present invention (an appearance inspection method according to the embodiment of the present invention) will be described. FIG. 3 is a flowchart showing an outline of the flow of the appearance inspection method according to the embodiment of the present invention. The appearance inspection method according to the embodiment of the present invention can be divided into five stages: (1) a stage of acquiring reference image data; (2) a stage of acquiring image data of the object to be inspected; (3) a stage of determining a correction rate for correcting the image data to be inspected; (4) a stage of correcting the luminance of the image data to be inspected based on the determined correction rate; and (5) a stage of performing an appearance inspection of the object by applying image processing to the luminance-corrected image.

(1) Stage of acquiring a reference image
  In step S-1, a reference image is captured and stored. The reference image is an image of an object having an average surface property and no surface defects (hereinafter called a "reference" to distinguish it from the objects actually inspected). For example, when the life of the sand used in sandblasting is N uses, the surface of a reference sandblasted with sand that has been used N/2 times is taken to have an average surface property. Specifically, when the life of the sand used in the sandblast treatment is 3500 uses, the surface property of a reference sandblasted with sand that has been used about 1800 times is defined as the average surface property. The image obtained by imaging the surface of this reference is set as the reference image.

  In capturing the reference image, it is preferable to adjust the position of the imaging means 30 (CCD camera) and the position and brightness of the light source device 40 (LED) so that the luminance distribution in the imaging region is uniform.

  The size of the image to be captured is not particularly limited; for example, an image covering a range of about 20 mm × 30 mm is used. The imaged region corresponds to the actual inspection target region of the object to be inspected (the impeller 1). The captured image is obtained as matrix data of a plurality of pixels. In matrix data of N rows and M columns (N and M being predetermined natural numbers), for example, the row corresponds to the X coordinate of the image and the column to the Y coordinate. The captured reference image is converted into an image having a predetermined luminance gradation; for example, into an image in which each pixel has a luminance gradation of 256 levels (8-bit gradation). This image is the reference luminance image. The converted image (in particular, the luminance information of each pixel) is stored in the storage means.
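As a sketch of this conversion, assuming NumPy (the array sizes and raw values below are illustrative, not from the text):

```python
import numpy as np

# A captured frame as an N-row x M-column matrix of pixels, with the row
# taken as the X coordinate and the column as the Y coordinate.
N, M = 4, 6
raw_frame = np.random.rand(N, M)  # hypothetical raw sensor values in [0, 1)

# Convert to a 256-gradation (8-bit) luminance image.
reference_luminance_image = (raw_frame * 255).astype(np.uint8)
```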

(2) Stage of acquiring image data of the object to be inspected
  In step S-2, the inspection region of the object actually to be inspected is imaged. The imaging conditions are preferably the same as those for the reference image. As with the reference image, it is difficult to make the luminance distribution uniform when imaging an object having a curved surface; however, since the luminance distribution then tends to be similar to that of the reference image, this is not considered a problem.

  In step S-3, the image of the object to be inspected is processed in the same manner as the reference luminance image to obtain an image converted to a luminance gradation (the inspected object luminance image). Then, a partial region is extracted from the inspected object luminance image. This extracted area is referred to as the "correction area (inspected object area)". The correction area is an area from which the luminance gradation of each pixel is collected in order to calculate a correction rate for correcting the luminance of the captured image. The range of the correction area is not particularly limited; for example, an area of 100 pixels × 100 pixels is extracted.
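Extracting the correction area amounts to taking a fixed slice of the luminance matrix. A minimal sketch with NumPy (the image size and offsets are illustrative assumptions):

```python
import numpy as np

# Hypothetical 8-bit inspected-object luminance image.
inspected_luminance_image = np.full((480, 640), 128, dtype=np.uint8)

# Extract a 100 x 100 pixel correction area at a preset position;
# the offsets below are illustrative only.
top, left = 200, 300
correction_area = inspected_luminance_image[top:top + 100, left:left + 100]
```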

  In step S-4, noise removal processing is performed. An image of the object to be inspected may contain noise, and if the subsequent steps are performed with the noise included, the noise may be determined to be a defect in the image processing stage (the stage of determining the presence or absence of defects based on the captured image), reducing the inspection accuracy. Therefore, in step S-4, noise removal processing is performed on the captured image of the inspected object using a Gaussian filter, an averaging filter, or the like.

  An object to be inspected such as the impeller 1 has been sandblasted and its surface polished by sand particles, so there may be isolated points that were locally polished heavily and have high brightness. Since such locally bright isolated points are not defects, it is preferable, in order to improve inspection accuracy, that they not be determined as defects in the image processing stage. Therefore, such isolated points are treated as noise and smoothed out by noise removal processing using a Gaussian filter, an averaging filter, or the like. The subsequent steps are performed using the image that has undergone this noise removal processing.
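A simple averaging (box) filter of the kind mentioned here can be sketched in plain NumPy; a Gaussian filter from an image processing library could be substituted the same way. This is an illustrative implementation, not the patent's:

```python
import numpy as np

def averaging_filter(image, size=3):
    """Replace each pixel with the mean of its size x size neighborhood
    (edges handled by edge padding).  Isolated bright points are pulled
    toward the level of their surroundings."""
    image = np.asarray(image, dtype=float)
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')
    out = np.zeros_like(image)
    # Sum the shifted copies of the padded image, then divide by the
    # neighborhood size to obtain the local mean.
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (size * size)
```

For example, a single 255-level isolated point on a flat 100-level background is reduced to (8 × 100 + 255) / 9 ≈ 117 by a 3 × 3 averaging filter, making it far less likely to be mistaken for a defect.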

  As shown in the flowchart of FIG. 3, the noise removal processing may be performed on each extracted correction area, or it may be performed on the entire captured image (that is, between step S-2 and step S-3). Further, the noise removal processing is performed selectively and is not strictly necessary.

  In step S-5, luminance gradation sampling is performed. That is, the luminance gradation of each pixel in the extracted correction area is collected. Then, an area corresponding to the extracted correction area is extracted from the reference image, and the luminance gradation of each pixel in that area is collected. In other words, a part (the correction area) is extracted from the image of the object to be inspected and its luminance gradations are collected; similarly, an image at the position (region, coordinates) corresponding to the extracted correction area is extracted from the image of the reference and its luminance gradations are collected.

Then, based on the collected luminance gradations, the X-axis direction components or Y-axis direction components of the luminance are averaged. Mathematically speaking, the image data in the correction area is matrix data of N rows and M columns (N and M being predetermined natural numbers); from this matrix data, a row vector of 1 row and M columns, or a column vector of N rows and 1 column, is created. In the above example, the image data of the correction area is matrix data of 100 pixels × 100 pixels, from which a row vector of 1 row and 100 columns or a column vector of 100 rows and 1 column is created. Specifically, writing the image data in the correction area as a_{n,m} (n = 1, 2, 3, ..., N; m = 1, 2, 3, ..., M), a row vector given by the following formula (1) or a column vector given by formula (2) is created:

b_m = (a_{1,m} + a_{2,m} + ... + a_{N,m}) / N   (1)

c_n = (a_{n,1} + a_{n,2} + ... + a_{n,M}) / M   (2)

These row vectors b_m and column vectors c_n are vectors whose components are the averages of the luminance of the image data of the correction area in the X-axis direction or the Y-axis direction. In this way, the image data in the correction area is converted from two-dimensional data to one-dimensional data; that is, data obtained as a "surface" is converted into "line" data. Subsequent processing is performed based on the created one-dimensional data, and using "line" data in this way simplifies the subsequent processing. Note that when, for example, an area of 1 pixel × 100 pixels (an area generally called a line) is extracted in step S-3, this area is itself obtained as one-dimensional data, and processing may be performed using it directly. Through such processing, the luminance of the correction area and of the corresponding area of the reference image (the reference area) can be set as luminance values of one-dimensional data.
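The conversion from the two-dimensional correction-area data a_{n,m} to the one-dimensional row vector of formula (1) or column vector of formula (2) is a per-axis mean. A sketch with NumPy (the data values are illustrative):

```python
import numpy as np

# Correction-area image data a[n, m] as N-row x M-column matrix data.
N, M = 100, 100
a = np.arange(N * M, dtype=float).reshape(N, M)

# Formula (1): row vector b_m -- the mean over the N rows, i.e. the
# averaged luminance component along one coordinate direction.
b = a.mean(axis=0)   # shape (M,)

# Formula (2): column vector c_n -- the mean over the M columns.
c = a.mean(axis=1)   # shape (N,)
```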

  FIG. 4 is an example of a graph showing the averaged luminance of the correction area and the averaged luminance of the corresponding area of the reference image. In FIG. 4, the object to be inspected was sandblasted using new sand, while the reference of the reference image was sandblasted using sand that had already been used 1800 times. As shown in FIG. 4, because the object to be inspected was sandblasted with new sand, whose polishing ability is high, its luminance is higher overall than that of the reference image.

  Note that in both the reference image and the image obtained by imaging the object to be inspected, the luminance has a gradient. This is considered to be because the surfaces of the reference and of the object to be inspected are curved, so that the state in which reflected light enters the imaging means differs depending on the coordinates. That is, the luminance is high at positions where the traveling direction of the reflected light coincides with or is close to the optical axis of the imaging means, and the luminance decreases as the traveling direction of the reflected light departs from the optical axis of the imaging means.

(3) Step of determining a correction rate for correcting the image data of the object to be inspected In step S-6, the luminance of each pixel (the data after one-dimensional conversion) in the correction area is normalized. Specifically, the luminance of each pixel in the correction area is divided by the luminance of the pixel at the corresponding coordinate in the area extracted from the reference image. This process, which compares the luminance values coordinate by coordinate, is referred to as "normalization". FIG. 5 is a graph showing the values obtained by dividing the luminances of the example of FIG. 4 by the luminance of the pixel at the corresponding coordinate in the area extracted from the reference image (that is, the normalized luminance). The normalized luminance is an index of how much higher or lower the luminance of each pixel in the correction area is compared with the reference image. That is, as in the example of FIG. 4, when the luminance of each pixel in the correction area is higher than the luminance of the pixel at the corresponding coordinate in the area extracted from the reference image, the normalized luminance is greater than 1, as shown in FIG. 5; conversely, when the luminance is lower, the normalized luminance is less than 1.
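The normalization of step S-6 amounts to an element-wise division of the two one-dimensional profiles; a minimal sketch (the profile values are invented for illustration):

```python
import numpy as np

# Minimal sketch of step S-6: divide the correction-area profile by the
# reference-area profile at the same coordinates (values are illustrative).
correction_profile = np.array([120.0, 125.0, 130.0, 128.0])
reference_profile = np.array([100.0, 100.0, 104.0, 100.0])

normalized = correction_profile / reference_profile
# Entries greater than 1 mean the correction area is brighter than the
# reference at that coordinate; entries less than 1 mean it is darker.
```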

  In step S-7, it is determined whether the extracted correction area contains a defect. When a defect is included, it is difficult to calculate a correct correction rate. For this reason, when it is determined that the correction area contains a defect, the process returns to step S-3, and an area other than the defective correction area is extracted from the image obtained by imaging the object to be inspected. The newly extracted area is then set as a new correction area, and the processes from step S-4 onward are repeated. The process for determining whether the extracted correction area contains a defect, and the specific processing performed when it does, will be described later. First, the description proceeds on the assumption that the correction area contains no defect.

  In step S-8, a correction rate is calculated based on the normalized luminance. As shown in FIG. 5, the graph, with the normalized luminance on the vertical axis (y: value range) and the coordinate on the horizontal axis (x: domain), is approximated by a linear function y = ax + b (a: slope, b: intercept). Various known approximation methods, such as the least squares method, can be applied.

  When the surface property (luminance distribution) of the correction area of the object to be inspected has the same tendency as the area extracted from the reference image, the normalized luminance has almost no inclination, as shown in FIG. 5, and is approximated by a nearly horizontal linear function. That is, when the normalized luminance is expressed as y = ax + b (a: slope, x: coordinate, b: intercept), for example by least-squares approximation, the slope a is a value close to zero. The intercept b is defined as the "correction rate".
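A least-squares fit of this kind, with the intercept taken as the correction rate, could be sketched as follows; np.polyfit is one of several ways to realize the least squares method, and the data points are illustrative:

```python
import numpy as np

# Sketch of step S-8: approximate the normalized luminance by y = a*x + b
# with the least squares method; the intercept b is the "correction rate".
x = np.arange(5, dtype=float)                      # pixel coordinates
normalized = np.array([1.19, 1.21, 1.20, 1.20, 1.20])

a, b = np.polyfit(x, normalized, deg=1)            # slope a, intercept b
correction_rate = b
# When the correction area has the same tendency as the reference area,
# the slope a comes out close to zero.
```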

(4) Stage of correcting luminance of image data to be inspected based on the determined correction rate In step S-9, the image obtained by imaging the object to be inspected is corrected. Specifically, the luminance of each pixel of the image obtained by imaging the object to be inspected is multiplied by the reciprocal of the calculated correction rate. FIG. 6 is a graph superimposing the luminance obtained by multiplying the luminance of each pixel in the correction area by the reciprocal of the correction rate on the luminance of the corresponding pixels of the area extracted from the reference image. As shown in FIG. 6, when the luminance of each pixel in the correction area is multiplied by the reciprocal of the correction rate, it becomes substantially equal to the luminance of the reference image. Thus, by multiplying by the reciprocal of the correction rate, the luminance of the image obtained by imaging the object to be inspected can be brought close to the luminance of the reference image.
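The correction of step S-9 is a uniform scaling of the whole image; a sketch with invented pixel values:

```python
import numpy as np

# Sketch of step S-9: multiply every pixel of the inspected-object image
# by the reciprocal of the correction rate (values are illustrative; a
# correction rate of 1.2 means the image is 20% brighter than the reference).
correction_rate = 1.2
inspected = np.array([[120.0, 126.0],
                      [132.0, 120.0]])

corrected = inspected * (1.0 / correction_rate)
# The corrected luminance is now close to the reference-image level.
```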

  In this way, by calculating the correction rate using only a part of the image obtained by imaging the object to be inspected (that is, the correction area) and the region of the reference image corresponding to the correction area (that is, the reference area), the luminance of the entire image of the object to be inspected is corrected. With this configuration, a region that does not contain a defect can be chosen as the correction area, so a highly accurate correction rate can be calculated. In addition, the amount of computation can be reduced compared with a configuration that compares the entire images.

(5) Stage in which an image whose luminance has been corrected is subjected to image processing and an appearance inspection is performed In step S-10, image processing is performed using the corrected image, and the inspection is carried out. As methods of performing an appearance inspection by image processing, a method of determining defects by binarization, or a method of calculating the correlation with the reference image by pattern matching and determining defects from the magnitude of the correlation, can be applied. Since such image-processing methods for appearance inspection are known, their description is omitted.
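As one hedged illustration of the binarization approach mentioned (the threshold and pixel values are assumptions for illustration, not values from the patent):

```python
import numpy as np

# Illustrative sketch of defect detection by binarization in step S-10:
# pixels darker than a threshold are flagged as defect candidates.
corrected = np.array([[100.0, 98.0, 101.0],
                      [99.0, 55.0, 100.0]])   # one pinhole-like dark pixel
threshold = 70.0                              # assumed threshold

binary = corrected < threshold                # True marks defect candidates
has_defect = bool(binary.any())
```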

  As described above, since the luminance of the image obtained by imaging the object to be inspected is corrected to be substantially equal to the luminance of the reference image, the inspection can be performed with high accuracy even when the corrected image is processed under the same image-processing conditions as the reference image. Therefore, the inspection can be performed accurately even when there are individual differences in the surface properties (especially the degree of brightness) of the objects to be inspected, or individual differences in the imaging conditions (especially the illumination conditions) caused by, for example, deterioration of the light source over time.

  Next, the process for determining whether the extracted correction area contains a defect, and the specific processing performed when it does, will be described. FIG. 7A is a graph showing, for the case where the extracted correction area contains a defect, the luminance of each pixel of the correction area and the luminance of each pixel of the corresponding area of the reference image. FIG. 7B is a graph showing the normalized luminance of the correction area containing the defect.

  As shown in FIG. 7A, if there is a defect in the extracted correction area, the luminance partially decreases or increases; FIG. 7A shows a case where the luminance partially decreases. For example, if there is a pinhole or blowhole on the surface (cast surface) of the object to be inspected, the luminance may be partially lowered in the portion where the pinhole or blowhole is imaged, as shown in FIG. 7A. When there are portions whose luminance is low or high compared with the normal surface property, the correction rate calculated based on the luminance of the correction area is affected by the defect and may not take the correct value. For example, if there is a low-luminance defect, the correction rate may become smaller than that calculated from a defect-free correction area. In that case, the image of the entire object to be inspected cannot be corrected properly, and the accuracy of the inspection may fall.

  Therefore, such a case is handled as follows. First, regardless of whether the extracted correction area contains a defect, the processing from step S-1 to step S-6 is executed and the normalized luminance is calculated (see FIG. 3). Then, as shown in FIG. 7B, the difference between the maximum and minimum values of the normalized luminance is calculated; this difference is defined as the peak width. If the peak width is equal to or larger than a predetermined threshold, a defect is considered to exist in the extracted correction area. For example, if the peak width corresponds to a luminance difference of 30% or more in the normalized luminance, a defect is considered to exist. Note that if noise removal has been performed in step S-4, noise and isolated points are not determined to be defects.
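The peak-width test can be sketched as follows, using the 30% figure from the text as the threshold (the profile values are illustrative):

```python
import numpy as np

# Sketch of step S-7: compute the peak width of the normalized luminance
# and suspect a defect if it reaches the threshold (30% per the example).
normalized = np.array([1.20, 1.21, 0.85, 1.19, 1.20])  # dip from a defect

peak_width = normalized.max() - normalized.min()
defect_suspected = bool(peak_width >= 0.30)
# If a defect is suspected, a different correction area is extracted and
# the process is repeated from step S-4.
```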

  When it is determined in this way that there is a defect, the process returns to step S-3, and an area other than the defective correction area is extracted from the entire image obtained by imaging the object to be inspected. The newly extracted area is then set as a new correction area, and the processes from step S-4 onward are repeated. With such processing, the correction rate is calculated with reference to a correction area that contains no defect, so a highly accurate inspection can be performed.

  Alternatively, when it is determined in step S-7 that there is a defect, the object to be inspected may be immediately regarded as defective, without returning to step S-3. That is, when a defect is determined in step S-7, the same processing may be performed as when a defect is determined in step S-10, which originally determines the presence or absence of a defect. With such a configuration, the processing can be reduced and simplified.

  Next, another embodiment of the present invention will be described. In the embodiment described above, an image obtained by imaging the surface of a standard test piece having an average surface property is applied as the reference image. However, such a standard test piece does not necessarily have to be used. The other embodiment shown below is a form in which a standard test piece having an average surface property is not used. Hereinafter, description will be made mainly with reference to FIG.

(1) Stage of acquiring a reference image In step S1-1, first, sandblasting is performed on one reference using new sand, or sand in a state close to new, and the surface of that reference is imaged. Sandblasting is also performed on another reference using sand that has reached, or is close to, the end of its life, and the surface of that reference is imaged. In this way, reference images of surfaces sandblasted with the two types of sand, new and old, are respectively captured. The two captured reference images are then converted into images each having a predetermined luminance gradation; for example, each captured image is converted into an image having 256 luminance gradations (8-bit gradation). The converted images are stored in the storage means. Note that the same imaging conditions as in the above embodiment can be applied.

(2) Stage of acquiring image data to be inspected Since the same configuration as in the above embodiment can be applied to step S-2, step S-3, step S-4, and step S-5, their description is omitted.

(3) Step of determining a correction rate for correcting the image data to be inspected In step S-6, the luminance of each pixel in the extracted correction area is normalized. Specifically, the luminance of each pixel in the correction area is divided by the luminance of the pixel calculated based on the reference images. This processing is referred to as "normalization" in this other embodiment as well.

  Here, the method of calculating the luminance of a pixel based on the reference images will be described. First, the luminance of the area corresponding to the extracted correction area is extracted from each of the two captured and stored reference images. FIG. 8 is a graph showing the luminance of the areas corresponding to the correction area extracted from each of the two reference images; the plots of the two reference images are shown together.

  As shown in FIG. 8, the luminance of the area corresponding to the extracted correction area is extracted from each of the two reference images, and a graph is created with the image coordinate (pixel) on the horizontal axis (x axis: domain) and the luminance on the vertical axis (y axis: value range). Based on each created plot, a function that represents the plot well is derived. Various known methods, such as the least squares method, can be applied to derive the function. In the example shown in the graph of FIG. 8, the luminance function of the reference sandblasted using new sand is y = 0.0332x + 83.027, and the luminance function of the reference sandblasted using sand that has reached the end of its life is y = 0.0244x + 65.854. Further, a function taking intermediate values between the two derived functions is created; in the example of FIG. 8, y = ((0.0332 + 0.0244)/2)x + (83.027 + 65.854)/2 = 0.0288x + 74.441. Then, by substituting the pixel coordinate into the variable x (domain) of the function derived in this way, the luminance of the pixel based on the reference images is calculated.
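Using the two fitted functions quoted in the text, the intermediate reference function can be formed by averaging their slopes and intercepts:

```python
# Sketch using the slopes and intercepts quoted in the text for FIG. 8.
a_new, b_new = 0.0332, 83.027    # reference blasted with new sand
a_old, b_old = 0.0244, 65.854    # reference blasted with end-of-life sand

# Function taking intermediate values between the two derived functions.
a_mid = (a_new + a_old) / 2      # 0.0288
b_mid = (b_new + b_old) / 2      # about 74.441

def reference_luminance(x):
    """Luminance predicted at pixel coordinate x by the intermediate line."""
    return a_mid * x + b_mid
```

Substituting a pixel coordinate into reference_luminance then yields the denominator used in the normalization of step S-6.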

  Then, the value obtained by dividing the luminance of each pixel in the extracted correction area by the luminance of the pixel calculated based on the reference images (the function derived as described above), that is, the normalized luminance, is calculated. This value is an index of how much higher or lower the luminance of each pixel in the correction area is compared with the reference. That is, when the luminance of a pixel in the correction area is higher than the calculated reference luminance at the corresponding coordinate, the normalized luminance is greater than 1; conversely, when the luminance is lower, the normalized luminance is less than 1.

  Since the same configuration as in the above embodiment can be applied to step S-7 and step S-8, their description is omitted. Similarly, for (4) the step of correcting the luminance of the image data to be inspected based on the determined correction rate and (5) the step of performing an appearance inspection of the object to be inspected by image processing of the luminance-corrected image, the same configuration as in the above embodiment can be applied, so their description is omitted.

  The embodiments of the present invention have been described in detail with reference to the drawings; however, the present invention is not limited to these embodiments, and it goes without saying that various modifications can be made without departing from the spirit of the present invention. For example, in the above embodiments, an impeller (turbine wheel) that has been cast and then sandblasted is cited as the object of the appearance inspection, but the object of the appearance inspection is not limited to an impeller (turbine wheel); the invention can be applied to various cast products.

  Moreover, besides cast products, the invention can be applied to objects to be inspected that have been subjected to shot peening or shot blasting. In other words, the present invention can be applied to objects to be inspected that have undergone a process in which the degree of light reflection at the surface varies even though the surface property is normal.

FIG. 1 is a diagram schematically showing the outline of the configuration of the appearance inspection apparatus according to the embodiment of the present invention.
FIG. 2 is a perspective view showing the appearance of the impeller (turbine wheel) to which the appearance inspection method according to the embodiment of the present invention is applied, in which (a) is a view of the impeller (turbine wheel) seen from one side in the axial direction and (b) is a view seen from the other side in the axial direction.
FIG. 3 is a flowchart showing the outline of the flow of the appearance inspection method according to the embodiment of the present invention.
FIG. 4 is an example of a graph showing the luminance of each pixel of the extracted correction area and the luminance of each pixel of the corresponding area of the reference image (pixels extracted from the reference image).
FIG. 5 is a graph showing the normalized luminance of the example shown in FIG. 4.
FIG. 6 is a graph superimposing the luminance of each pixel in the correction area of the image of the object to be inspected, multiplied by the reciprocal of the correction rate, on the luminance of the corresponding pixels of the area extracted from the reference image.
FIG. 7 concerns the case where the extracted correction area contains a defect: (a) is a graph showing the luminance of each pixel of the extracted correction area and the luminance of each pixel of the corresponding reference image, and (b) is a graph showing the normalized luminance of the correction area containing the defect.
FIG. 8 is a graph showing the luminance of the areas corresponding to the correction area extracted from each of the two reference images.

Explanation of symbols

DESCRIPTION OF SYMBOLS 1 Impeller 2 Shaft member 3 Torsion blade 4 Wrench boss 5 Shaft holding part 10 Appearance inspection apparatus 20 Work holding means 21 Rotating mechanism part 22 Support part 24 Chuck part 25 Servo motor 26 Holder part
30 Imaging means 40 Light source 50 Work position sensor 60 Control unit 61 Calculation means 62 Storage means 63 Control means

Claims (6)

  1. An appearance inspection method for inspecting the surface of an object to be inspected that has been subjected to sandblasting, shot blasting, or shot peening,
    Capturing a reference image by imaging the appearance of a reference article, and obtaining a reference luminance image indicating a luminance distribution of the reference image;
    Capturing an inspected object image by imaging the appearance of the object to be inspected, and acquiring an inspected object luminance image indicating a luminance distribution of the inspected object image;
    Extracting an area of a preset position from the luminance image of the inspected object as an inspected object area;
    Extracting an area corresponding to the position of the inspected object area from the reference luminance image as a reference area;
    Converting each luminance distribution data of the inspected object area and the reference area into one-dimensional data in a predetermined coordinate direction, and setting as a luminance value of each area;
    A comparison stage for comparing the brightness values of the set areas for each of the coordinates;
    Based on the comparison result, a correction stage for performing correction to bring the luminance distribution of the inspection object luminance image closer to the luminance distribution of the reference image;
    Performing a visual inspection by performing predetermined image processing on the corrected inspected object luminance image;
    An appearance inspection method comprising:
  2. The comparison stage divides the luminance value of the inspected object area for each coordinate by the luminance value of the reference area to obtain a ratio for each coordinate, approximates the change of the ratio in the coordinate direction with a straight line, and sets the intercept of the approximate straight line as a correction rate,
    The appearance inspection method according to claim 1, wherein in the correction stage, the entire inspected object luminance image is uniformly multiplied by the reciprocal of the set correction rate.
  3.   Further comprising, after the stage of extracting an area at a preset position from the inspected object luminance image as the inspected object area, a stage of determining whether the image of the inspected object area contains a defect; when it is determined in that stage that a defect is contained, a new inspected object area different from the extracted inspected object area is newly extracted from the inspected object luminance image, The appearance inspection method according to claim 1, wherein an area corresponding to the newly extracted inspected object area is extracted from the reference image.
  4. The reference images obtained by imaging the appearance of the reference articles are two images: an image of an article sandblasted or shot-blasted using new sand or steel balls, and an image of an article sandblasted or shot-blasted using old ones,
    The appearance inspection method according to claim 1, wherein in the comparison stage, the average of the luminance values set from the two reference images is newly set as the luminance value of the reference area, and the newly set luminance value of the reference area and the luminance value of the inspected object area are compared for each coordinate.
  5. Imaging means capable of capturing a reference image of the appearance of an article serving as a reference and an inspected object image of the appearance of an object to be inspected;
    Storage means capable of storing a reference image captured by the imaging means;
    Calculation means capable of: calculating a reference luminance image indicating the luminance distribution of the reference image and an inspected object luminance image indicating the luminance distribution of the inspected object image; extracting an area at a preset position from the inspected object luminance image as an inspected object area and setting an area corresponding to the position of the inspected object area from the reference luminance image as a reference area; converting the luminance distribution data of the set inspected object area and reference area into one-dimensional data and setting it as the luminance value of each area; comparing the set luminance values of the areas for each coordinate; and performing correction to bring the luminance distribution of the inspected object luminance image closer to the luminance distribution of the reference image;
    An appearance inspection apparatus comprising:
  6.   The appearance inspection apparatus according to claim 5, wherein the calculation means divides the luminance value of the inspected object area for each coordinate by the luminance value of the reference area to obtain a ratio for each coordinate, approximates the change of the ratio in the coordinate direction with a straight line, sets the intercept of the approximate straight line as a correction rate, and uniformly multiplies the entire inspected object luminance image by the reciprocal of the set correction rate.
JP2008233467A 2008-09-11 2008-09-11 Method and device for inspecting appearance Pending JP2010066153A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008233467A JP2010066153A (en) 2008-09-11 2008-09-11 Method and device for inspecting appearance


Publications (1)

Publication Number Publication Date
JP2010066153A (en) 2010-03-25

Family

ID=42191860

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008233467A Pending JP2010066153A (en) 2008-09-11 2008-09-11 Method and device for inspecting appearance

Country Status (1)

Country Link
JP (1) JP2010066153A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101191303B1 (en) * 2011-11-09 2012-10-16 동국대학교 경주캠퍼스 산학협력단 Quantitative evaluation methodology of mar-induced surface damage on coatings
WO2013069847A1 (en) * 2011-11-09 2013-05-16 동국대학교 경주캠퍼스 산학협력단 Method for quantitatively estimating surface damage such as cracks or scratches of coating material
JP2013190284A (en) * 2012-03-13 2013-09-26 Daido Steel Co Ltd Method for inspecting defect of bladed wheel
US10055824B2 (en) 2014-03-28 2018-08-21 Nec Corporation Image correction device, image correction method and storage medium
WO2017217121A1 (en) * 2016-06-15 2017-12-21 株式会社Screenホールディングス Appearance inspection device, surface processing system, appearance inspection method, program, and projection material replacement determination method
TWI630070B (en) * 2016-06-15 2018-07-21 斯庫林集團股份有限公司 Appearance inspection apparatus, surface processing system, appearance inspection method, appearance inspection program and method for determining replacement of projecting material
EP3474004A4 (en) * 2016-06-15 2020-01-15 SCREEN Holdings Co., Ltd. Appearance inspection device, surface processing system, appearance inspection method, program, and projection material replacement determination method
WO2018225460A1 (en) * 2017-06-05 2018-12-13 株式会社Screenホールディングス Inspecting device and inspecting method
