WO2019058633A1 - Image processing method, image processing device, and inspection method - Google Patents

Image processing method, image processing device, and inspection method

Info

Publication number
WO2019058633A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
imaging
illumination
edge
unit
Prior art date
Application number
PCT/JP2018/018126
Other languages
French (fr)
Japanese (ja)
Inventor
Shun Naito (駿 内藤)
Original Assignee
SCREEN Holdings Co., Ltd. (株式会社SCREENホールディングス)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SCREEN Holdings Co., Ltd.
Publication of WO2019058633A1 publication Critical patent/WO2019058633A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction

Definitions

  • The present invention relates to an image processing technique for specifying a target area in which an image of an object is present in an original image obtained by imaging the object while it is irradiated with illumination light from all or some of a plurality of mutually different illumination directions.
  • The present invention also relates to an inspection method for inspecting an object using this image processing technique.
  • To inspect whether an inspection object such as an industrial part (hereinafter referred to as a "work") has a defect such as a scratch, an inspection method has conventionally been proposed in which the image of the work contained in an original image obtained by imaging the work (hereinafter referred to as the "inspection image") is compared with a reference image acquired in advance by imaging a defect-free work with an imaging unit such as a camera, and various defects are inspected based on the differences.
  • With this inspection method, it is important to extract the inspection image properly from the original image in order to improve inspection accuracy. That is, since the original image contains a background image in addition to the inspection image, the area of the original image in which the inspection image exists (corresponding to the "target area" of the present invention) must be specified, and the background image must be excluded from the original image.
  • In the conventional apparatus, a background image is acquired against a fixed background material whenever the imaging conditions, such as the illumination intensity or the camera aperture, change. That is, the background is fixed and the background image is static.
  • On the other hand, when an industrial part or the like is the object of inspection, the work is transported to the inspection position by a table, a hand, or the like, and is imaged at that position.
  • In such an environment, where various peripheral mechanical elements operate around the work, the background image is not constant but changes dynamically.
  • It is therefore desirable, instead of preparing background images in advance, to specify the area containing the image of the work (hereinafter referred to as the "target area") from the original image obtained by imaging the work under inspection, and to extract the inspection image based on that specified target area.
  • The present invention has been made in view of the above problems, and aims to provide an image processing technique capable of specifying, with high accuracy, the target area in which the image of an object is present in an original image captured by an imaging unit, and an inspection method capable of inspecting the object with high accuracy using that image processing technique.
  • The first aspect of the present invention is an image processing method for specifying a target area in which an image of an object is present in an original image obtained by imaging the object irradiated with illumination light from all or some of a plurality of different illumination directions. The method comprises: an image acquisition step of acquiring an original image of the object as an illumination switching image at each switching of the illumination light, while switching the illumination light to the object in multiple stages; an image processing step of acquiring an edge-emphasized image for each illumination switching image acquired in the image acquisition step by performing edge emphasis processing that emphasizes the edges of the image of the object; an image generation step of generating a composite image by adding the plurality of edge-emphasized images together; and a region specifying step of specifying the target area from the composite image.
  • The second aspect of the present invention is an image processing apparatus for specifying a target area in which an image of an object is present in an original image obtained by imaging the object irradiated with illumination light from all or some of a plurality of different illumination directions. The apparatus comprises: an image acquisition unit that acquires an original image of the object as an illumination switching image each time the illumination light is switched, while switching the illumination light to the object in multiple stages; an image processing unit that obtains an edge-emphasized image for each illumination switching image by performing edge emphasis processing that emphasizes the edges of the image of the object; an image generation unit that generates a composite image by adding the plurality of edge-emphasized images together; and an area specifying unit that specifies the target area from the composite image.
  • The third aspect of the present invention is an inspection method for inspecting an object, comprising: specifying a target area in which an image of the object is present by the above image processing method; an extraction step of extracting the image of the object based on the specified target area; and an inspection step of inspecting the object based on the image extracted in the extraction step.
  • In the invention configured as described above, the illumination switching images are acquired while the illumination light is switched, and edge emphasis processing is performed on each illumination switching image to acquire an edge-emphasized image.
  • The edge-emphasized images are thus obtained while the illumination light is irradiated in various illumination modes, and in the composite image generated by adding them together, the edges of the image of the object become clear. The target area is then accurately specified from a composite image having these characteristics.
  • As described above, according to the present invention, a plurality of illumination switching images are acquired while the illumination light to the object is switched in multiple stages, and the edge-emphasized images obtained by performing edge emphasis processing on each illumination switching image are added together.
  • The target area is specified based on the composite image thus generated, and can therefore be specified with high accuracy. Furthermore, the image of the object can be extracted using the target area specified in this way, and the object can be inspected with high accuracy by inspecting it based on that image.
  • FIG. 9 shows an example of an illumination switching image obtained when imaging a workpiece.
  • FIG. 1 is a view showing the configuration of an inspection apparatus equipped with an image processing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a plan view showing the main body of the inspection apparatus shown in FIG.
  • the inspection apparatus 1 is an apparatus that inspects the workpiece W by comparing an inspection image obtained by imaging, for example, a metal workpiece W manufactured by forging or casting with a reference image acquired in advance.
  • the inspection apparatus 1 includes a main body 11 and a control unit 12 configured by a computer.
  • the main body 11 includes a stage 2, an imaging unit 3, and a light source unit 4.
  • the work W is placed on the stage 2.
  • the main body 11 is provided with a light shielding cover (not shown) for preventing external light from reaching the stage 2, and the stage 2, the imaging unit 3 and the light source unit 4 are provided in the light shielding cover.
  • the imaging unit 3 includes one upper imaging unit 31, eight oblique imaging units 32, and eight side imaging units 33.
  • In FIG. 2, the upper imaging unit 31 is not shown (the same applies to the upper light source unit 41 described later).
  • As shown in FIG. 1, the upper imaging unit 31 is disposed above the stage 2 on a central axis J1 extending vertically upward from the center of the stage 2, and outputs an original image of the workpiece captured from directly above to the control unit 12.
  • When the main body 11 is viewed in plan, the eight oblique imaging units 32 are arranged around the stage 2.
  • the eight oblique imaging units 32 are arranged at an angular interval (pitch) of 45 ° in the circumferential direction about the central axis J1.
  • An angle θ2 formed by the imaging optical axis K2 and the central axis J1 is approximately 45°.
  • the eight side imaging units 33 are also arranged around the stage 2 when the main body 11 is viewed in plan.
  • the eight side imaging units 33 are arranged at an angular interval of 45 ° in the circumferential direction about the central axis J1.
  • An angle θ3 formed by the imaging optical axis K3 and the central axis J1 is approximately 90°.
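The arrangement above (eight units at a 45° angular pitch around the central axis J1, with optical axes inclined at θ2 ≈ 45° and θ3 ≈ 90°) can be sketched as follows. The function names and the radius/height values are hypothetical and only illustrate the geometry, not the actual apparatus dimensions.

```python
import math

def unit_positions(n_units=8, radius=0.5, height=0.5):
    # hypothetical layout helper: place n_units at an equal angular pitch
    # (360 / n_units degrees) on a circle around the central axis J1
    pitch = 360.0 / n_units
    return [
        (radius * math.cos(math.radians(i * pitch)),
         radius * math.sin(math.radians(i * pitch)),
         height)
        for i in range(n_units)
    ]

def axis_angle_deg(radius, height):
    # angle between an imaging optical axis aimed at the stage center and the
    # central axis J1: 45 deg when radius == height (oblique units),
    # 90 deg when height == 0 (side units)
    return math.degrees(math.atan2(radius, height))
```

With eight units the pitch is 45°, matching the embodiment; the oblique case (equal radius and height) gives θ2 = 45° and the side case (zero height) gives θ3 = 90°.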
  • The upper imaging unit 31, the oblique imaging units 32, and the side imaging units 33 each have an image sensor such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor, and acquire multi-gradation images.
  • the upper imaging unit 31, the oblique imaging unit 32, and the side imaging unit 33 are supported by a support (not shown).
  • the light source unit 4 includes one upper light source unit 41, eight oblique light source units 42, and eight side light source units 43.
  • the upper light source unit 41 is a light source unit in which a plurality of LEDs (light emitting diodes) are arranged in a ring shape centering on the central axis J1.
  • the ring-shaped upper light source unit 41 is fixed to the upper imaging unit 31 so as to surround the periphery of the upper imaging unit 31.
  • the illumination light can be irradiated by the upper light source unit 41 along the direction parallel to the central axis J1 from directly above the work W on the stage 2.
  • each of the oblique light source units 42 is a light source unit in which a plurality of LEDs are arranged in a bar shape extending in a tangential direction of a circumference centered on the central axis J1.
  • The line connecting the center of the exit surface of each oblique light source unit 42 and (the center of) the work W is referred to as the "oblique illumination axis".
  • illumination light can be irradiated to the work W on the stage 2 from obliquely above along the oblique illumination axis.
  • the oblique light source units 42 are respectively fixed to the oblique imaging unit 32.
  • Each side light source unit 43 is a light source unit in which a plurality of LEDs are arranged in a bar shape extending in a tangential direction of a circumference centered on the central axis J1.
  • When the line connecting the center of the emission surface of each side light source unit 43 and the work W is referred to as the "side illumination axis", the angle formed between the side illumination axis and the central axis J1, in the plane containing both, is approximately 90 degrees.
  • illumination light can be irradiated to the work W on the stage 2 from the side along the side illumination axis.
  • the side light source units 43 are fixed to the side imaging units 33 respectively.
  • the workpiece W can be illuminated with illumination light from all or a part of 17 illumination directions different from one another.
  • the distance between the upper imaging unit 31 and the upper light source unit 41 and the work W is about 55 cm (centimeter).
  • The distance between the oblique imaging units 32 and oblique light source units 42 and the work W is about 50 cm.
  • The distance between the side imaging units 33 and side light source units 43 and the work W is about 40 cm.
  • light sources of types other than LEDs may be used.
  • a control unit 12 is provided to control each part of the main body 11 and to inspect the workpiece W by comparing the inspection image of the workpiece W with the reference image as described later.
  • The control unit 12 includes an arithmetic processing unit 5 configured by a CPU (Central Processing Unit) and a storage unit 6 that stores programs and various data such as the reference image data 61, illumination switching image data 62, edge-emphasized image data 63, composite image data 64, target area data 65, and inspection image data 66, which will be described in detail later.
  • The arithmetic processing unit 5 controls each part of the apparatus according to the above program to: acquire a plurality of illumination switching images while switching the illumination light to the workpiece W in multiple stages; perform smoothing processing and edge emphasis processing on each illumination switching image to obtain a plurality of edge-emphasized images; add the edge-emphasized images together to generate a composite image; specify the target area based on the composite image; extract the image of the work W, i.e. the inspection image, based on the target area; and compare the inspection image with the reference image to inspect the workpiece W.
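As a rough illustration of the processing sequence just described (smoothing, edge emphasis, addition into a composite image, and region specification), the following NumPy-only sketch uses a box blur, a gradient-magnitude edge operator, and a simple statistical threshold. These particular operators and the threshold rule are assumptions, since the embodiment does not name the exact filters used.

```python
import numpy as np

def smooth(img, k=3):
    # noise-removal step: a simple k x k box blur (the exact smoothing filter
    # is not specified in the embodiment, so this is an assumption)
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def edge_emphasize(img):
    # edge emphasis step: gradient magnitude of the smoothed image
    gy, gx = np.gradient(smooth(img))
    return np.hypot(gx, gy)

def specify_target_region(switching_images):
    # image processing step: one edge-emphasized image per illumination mode
    edges = [edge_emphasize(g) for g in switching_images]
    # image generation step: add the edge-emphasized images together
    composite = np.sum(edges, axis=0)
    # region specifying step: binarize the composite (threshold rule assumed)
    mask = composite > composite.mean() + composite.std()
    return composite, mask
```

On synthetic illumination switching images of a bright rectangular workpiece, the resulting mask is set along the workpiece edges and clear in the background, which is the property the region specifying step relies on.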
  • the arithmetic processing unit 5 functions as the image acquisition unit 51, the image processing unit 52, the image generation unit 53, the target area identification unit 54, the image extraction unit 55, and the inspection unit 56.
  • identification of a target area from an original image including an image of the work W and inspection of the work W will be described with reference to FIGS. 3 to 14.
  • FIG. 3 is a flowchart showing the flow of inspection processing in the inspection apparatus.
  • FIG. 4 is a flowchart showing a process of specifying a target area performed in the inspection process.
  • the workpiece W to be inspected is placed on the stage 2 (step S1).
  • A holding unit (not shown) conforming to the shape, size, etc. of the work W is provided on the stage 2, and holds the work W at a predetermined position on the stage 2 with the main surface W1 of the work W directed vertically upward.
  • the control unit 12 acquires imaging setting information on the workpiece W on the stage 2 based on an input by the operator or the like (step S2).
  • The imaging setting information indicates which imaging units in the imaging unit 3 are used (hereinafter referred to as "selected imaging units") and which light source units in the light source unit 4 are lit when each selected imaging unit acquires a captured image.
  • For example, the imaging setting information may include the following information.
  • The upper imaging unit 31 is used as a selected imaging unit, and all the oblique light source units 42 are lit in order to illuminate the work W from illumination directions suitable for it.
  • Similarly, part of the oblique light source units 42 are used as light sources while using one oblique imaging unit 32a, and part of the side light source units 43 are used to illuminate the work W from suitable illumination directions while using one side imaging unit 33a (see FIG.
  • the arithmetic processing unit 5 executes a process of specifying a target area prior to imaging and inspection of the workpiece W based on the imaging setting information (step S3).
  • The arithmetic processing unit 5 determines, from among the selected imaging units (the upper imaging unit 31, the oblique imaging unit 32a, and the side imaging unit 33a in the specific example above), one selected imaging unit for which specification of the target area has not yet been completed (step S31).
  • Next, the arithmetic processing unit 5 sets the light source units to be lit (hereinafter referred to as "lighting light source units") and a plurality of lighting patterns to be used when imaging the work W with the selected imaging unit determined in step S31 (step S32).
  • Although the lighting patterns will be described later using a specific example, the work W can be illuminated in a plurality of illumination modes by switching these lighting patterns.
  • The arithmetic processing unit 5 switches to one of the plurality of lighting patterns and, with the desired lighting light source units selectively lit, acquires the original image of the work W captured by the selected imaging unit as an illumination switching image, storing its image data (illumination switching image data 62) in the storage unit 6 (step S33). Subsequently, the arithmetic processing unit 5 performs smoothing processing on the illumination switching image data 62 to remove noise components contained in the image, then performs edge emphasis processing to emphasize the edges of the image of the workpiece W, acquiring an edge-emphasized image whose image data (edge-emphasized image data 63) is stored in the storage unit 6 (step S34).
  • The image acquisition step (step S33) and the image processing step (step S34) are repeated until the arithmetic processing unit 5 determines that all lighting patterns have been processed ("YES" in step S35).
  • The arithmetic processing unit 5 adds together the plurality of edge-emphasized images obtained in this way to generate a composite image, and stores the composite image data 64 in the storage unit 6 (step S36: image generation step). Furthermore, the arithmetic processing unit 5 subjects the composite image to binarization processing to specify the target area in which the image of the work W exists in the original image captured by the selected imaging unit determined in step S31 (step S37: region specifying step).
  • The arithmetic processing unit 5 then determines whether the target area has been specified for all of the selected imaging units (step S38); while it determines that not all have been completed, the process returns to step S31, and the above series of steps (image acquisition step + image processing step + image generation step + region specifying step) is repeated to specify the target area for each selected imaging unit, with the target area data 65 stored in the storage unit 6.
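The binarization applied to the composite image in the region specifying step could, for example, use an automatically chosen threshold. The sketch below implements Otsu's method in plain NumPy as one plausible choice; the text only says "binarization processing", so the particular thresholding rule is an assumption.

```python
import numpy as np

def otsu_threshold(img, bins=256):
    # choose the threshold that maximizes between-class variance (Otsu),
    # offered here as one hypothetical way to binarize the composite image
    hist, edges = np.histogram(img.ravel(), bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0
        m1 = (hist[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t
```

Applying `composite > otsu_threshold(composite)` would then yield the binary mask from which the target area is taken.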
  • the process of specifying the target area will be specifically described based on the specific example.
  • In this specific example, three units, namely the upper imaging unit 31, the oblique imaging unit 32a, and the side imaging unit 33a, are selected as the selected imaging units.
  • When the upper imaging unit 31 is determined to be the "one selected imaging unit" in step S31, all eight oblique light source units 42 are selected as lighting light source units, as shown in FIGS. 5 to 7.
  • A total of eight lighting patterns (A-1) to (A-8), each lighting one of the oblique light source units 42, are set (step S32).
  • In FIGS. 5 to 7, dots are attached to clearly indicate the light source units 42 to be lit.
  • The illumination switching image SG(A-1) is acquired in the illumination mode in which only the lighting light source unit 42a lights (step S33).
  • the edge enhanced image EG (A-1) is acquired by the smoothing process and the edge enhancement process on the illumination switching image SG (A-1) (step S34). Further, each time the lighting patterns (A-2) to (A-8) are sequentially switched, the edge-emphasized images EG (A-2) to EG (A-8) are acquired in the same manner as described above.
  • The eight edge-emphasized images EG(A-1) to EG(A-8) thus acquired are added together to generate a composite image as shown in the upper part of FIG. 8 (step S36).
  • the target region OR surrounded by the boundary line BL is determined (step S37).
  • the image shown in the lower part of FIG. 8 shows an inspection image obtained by the extraction processing described below.
  • When the oblique imaging unit 32a is determined to be the "one selected imaging unit" in step S31, the oblique light source units 42b to 42f and 42h among the eight oblique light source units 42 are selected as lighting light source units.
  • Four lighting patterns (B-1) to (B-4), each lighting one of the oblique light source units 42c to 42e and 42h, and a lighting pattern (B-5) for simultaneously lighting the five oblique light source units 42b to 42f, a total of five lighting patterns, are set (step S32).
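One natural way to represent such lighting patterns in software is as sets of light-source identifiers. The sketch below encodes the (A) and (B) pattern groups described above; the string IDs ("42a" to "42h") are hypothetical names for the eight oblique light source units, and the ordering of patterns (B-1) to (B-4) is an assumption beyond the fact that (B-1) lights only 42h.

```python
# hypothetical string IDs "42a"-"42h" for the eight oblique light source units
UNITS = [f"42{c}" for c in "abcdefgh"]

# patterns (A-1)-(A-8): light each oblique light source unit one at a time
patterns_A = [frozenset({u}) for u in UNITS]

# patterns (B-1)-(B-4): one of 42h, 42c, 42d, 42e at a time;
# pattern (B-5): the five units 42b-42f lit simultaneously
patterns_B = [frozenset({f"42{c}"}) for c in "hcde"] + \
             [frozenset(f"42{c}" for c in "bcdef")]
```

Iterating over such a pattern list, switching the listed units on, and capturing one illumination switching image per pattern corresponds to the loop over steps S33 to S35.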
  • The illumination switching image SG(B-1) is acquired in the illumination mode in which only the lighting light source unit 42h lights (step S33).
  • the edge enhanced image EG (B-1) is acquired by the smoothing process and the edge enhancement process on the illumination switching image SG (B-1) (step S34).
  • the edge-emphasized images EG (B-2) to EG (B-5) are acquired in the same manner as described above.
  • The five edge-emphasized images EG(B-1) to EG(B-5) thus obtained are added together to generate a composite image as shown in the upper part of FIG. 11 (step S36).
  • the target area OR surrounded by the boundary line BL is determined (step S37).
  • the image shown in the lower part of FIG. 11 shows an inspection image obtained by the extraction processing described below.
  • When the side imaging unit 33a is determined to be the "one selected imaging unit" in step S31, the side light source units 43b to 43f among the eight side light source units 43 are selected as lighting light source units.
  • Three lighting patterns (C-1) to (C-3), each lighting one of the side light source units 43c to 43e, and a lighting pattern (C-4) for simultaneously lighting the five side light source units 43b to 43f, a total of four lighting patterns, are set (step S32).
  • The illumination switching image SG(C-1) is acquired in the illumination mode in which only the side light source unit 43d lights (step S33).
  • the edge enhanced image EG (C-1) is obtained by the smoothing process and the edge enhancement process on the illumination switching image SG (C-1) (step S34). Further, each time the lighting patterns (C-2) to (C-4) are sequentially switched, the edge enhanced images EG (C-2) to EG (C-4) are acquired in the same manner as described above.
  • the four edge-emphasized images EG (C-1) to EG (C-4) obtained in this way are added together to generate a composite image as shown in the upper part of FIG. 14 (step S36).
  • the target region OR surrounded by the boundary line BL is determined (step S37).
  • the image shown in the lower part of FIG. 14 shows an inspection image obtained by the extraction processing described below.
  • After the target areas have been specified, the arithmetic processing unit 5 extracts the inspection images (step S4: extraction step). That is, the image outside the target area OR in the composite image is excluded as the background image, and the image of the work W, i.e. the inspection image, is extracted (see the lower parts of FIG. 8, FIG. 11 and FIG. 14), with its image data (inspection image data 66) stored in the storage unit 6.
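The extraction step, i.e. discarding everything outside the target area OR and keeping only the image of the work W, can be sketched as a masking-and-cropping operation. This minimal version assumes the target area is given as a boolean mask; the function name is hypothetical.

```python
import numpy as np

def extract_inspection_image(original, target_mask):
    # extraction step (sketch): suppress everything outside the target area,
    # then crop to the bounding box of the target area
    ys, xs = np.nonzero(target_mask)
    masked = np.where(target_mask, original, 0)
    return masked[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
```

The cropped result plays the role of the inspection image that is subsequently compared with the reference image.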
  • the arithmetic processing unit 5 reads from the storage unit 6 the data of the reference image of the work W stored in advance in the storage unit 6, that is, the reference image data 61 (step S5: reading step).
  • the reference image is a photographed image obtained by imaging the workpiece W having no defect such as a scratch based on the imaging setting information, and is acquired before the inspection and stored in the storage unit 6.
  • the arithmetic processing unit 5 compares the inspection image with the reference image to inspect whether there is a defect such as a flaw in the work W (step S6: inspection step).
  • As described above, in this embodiment, the illumination switching images are acquired while the illumination light is switched, edge emphasis processing is performed on each illumination switching image to acquire edge-emphasized images, and these edge-emphasized images are added together to generate a composite image. The edges of the work W therefore become clear in the composite image, so the target area in which the image of the work W (the inspection image) is present can be specified with high accuracy. In addition, since the inspection image is extracted using the target area specified in this way, the workpiece W can be inspected with high accuracy.
  • the work W corresponds to an example of the "object” of the present invention.
  • the arithmetic processing unit 5 functions as the "image processing apparatus" of the present invention.
  • In the above embodiment, the extraction step, the reading step, and the inspection step are performed after the specification of the target area is completed for all selected imaging units, but the execution timing of these steps is not limited thereto.
  • the identification process of the target area and the extraction process may be performed continuously for each selected imaging unit.
  • identification of a target area, an extraction process, a reading process, and an inspection process may be performed for each selected imaging unit.
  • In the above embodiment, the present invention is applied to an inspection apparatus which selects three units, namely the upper imaging unit 31, the oblique imaging unit 32a, and the side imaging unit 33a, among the 17 imaging units as selected imaging units.
  • the type and number of imaging units are not limited to this.
  • Although the present invention is applied above to an inspection apparatus provided with 17 imaging units and 17 light source units, the number of imaging units and light source units and the type of apparatus are not limited to the above embodiment.
  • The present invention can be applied generally to any apparatus provided with an imaging unit for imaging an object irradiated with illumination light from all or some of a plurality of mutually different illumination directions.
  • The present invention relates generally to an image processing technique for specifying a target area in which an image of an object is present in an original image obtained by imaging the object irradiated with illumination light from all or some of a plurality of different illumination directions, and can be applied to an inspection method that inspects an object using that image processing technique.
  • inspection apparatus 5 ... arithmetic processing unit (image processing apparatus) 31 ... upper imaging unit 32 ... oblique imaging unit 33 ... side imaging unit 41 ... upper light source unit 42 ... oblique light source unit 43 ... side light source unit 51 ... image acquisition unit 52 ... image processing unit 53 ... image generation unit 54 ... target area identification unit 55 ... image extraction unit 56 ... inspection unit OR ... target region SG ... illumination switching image W ... work (object)

Abstract

An image processing method provided with: an image acquisition step for acquiring, while switching illumination light to a subject in multiple stages, an original image of the subject as an illumination switching image at each switching of the illumination light; an image processing step for performing, on each illumination switching image acquired in the image acquisition step, an edge emphasis process that emphasizes an edge of an image of the subject to acquire an edge-emphasized image; an image generation step for generating a composite image resulting from adding a plurality of edge-emphasized images together; and an area specification step for specifying a subject area from the composite image.

Description

Image processing method, image processing apparatus, and inspection method
 The present invention relates to an image processing technique for specifying a target area in which an image of an object is present in an original image obtained by imaging the object while it is irradiated with illumination light from all or some of a plurality of mutually different illumination directions, and to an inspection method for inspecting an object using that image processing technique.
 The disclosure of the specification, drawings, and claims of the following Japanese application is incorporated herein by reference in its entirety:
 Japanese Patent Application No. 2017-183242 (filed September 25, 2017).
 To inspect whether an inspection object such as an industrial part (hereinafter referred to as a "work") has a defect such as a scratch, an inspection method has conventionally been proposed in which the image of the work contained in an original image obtained by imaging the work (hereinafter referred to as the "inspection image") is compared with a reference image acquired in advance by imaging a defect-free work with an imaging unit such as a camera, and various defects are inspected based on the differences. When this inspection method is adopted, it is important to extract the inspection image properly from the original image in order to improve inspection accuracy. That is, since the original image contains a background image in addition to the inspection image, the area of the original image in which the inspection image exists (corresponding to the "target area" of the present invention) must be specified, and the background image must be excluded from the original image.
 Image processing which removes the background image and extracts the inspection image in this way has conventionally been proposed. For example, in the apparatus described in Patent Document 1, background images containing no workpiece are registered in advance under various imaging conditions; the original image obtained by imaging the workpiece is compared with the background image corresponding to its imaging conditions, the background image is excluded, and the inspection image is obtained.
Japanese Patent Application Laid-Open No. 2002-33957
 However, in the above conventional apparatus, a background image is acquired against a background material each time the imaging conditions, such as the illumination intensity or the camera aperture, change. That is, the background is fixed and the background image is static. On the other hand, when an industrial part or the like is the object of inspection, the work is transported to the inspection position by a table, a hand, or the like, and is imaged at that position. In such an environment, where various peripheral mechanical elements operate around the work, the background image is not constant but changes dynamically. Therefore, instead of preparing background images in advance, it is desirable to specify the area containing the image of the work (hereinafter referred to as the "target area") from the original image obtained by imaging the work under inspection, and to extract the inspection image based on that specified target area.
 The present invention has been made in view of the above problem, and an object thereof is to provide an image processing technique capable of specifying with high accuracy the target area in which the image of an object is present in an original image captured by an imaging unit, and an inspection method capable of inspecting the object with high accuracy using that image processing technique.
 A first aspect of the present invention is an image processing method for specifying a target area in which an image of an object is present in an original image obtained by imaging the object irradiated with illumination light from all or part of a plurality of mutually different illumination directions, the method comprising: an image acquisition step of acquiring an original image of the object as an illumination switching image at each switching of the illumination light while switching the illumination light applied to the object in multiple stages; an image processing step of performing, for each illumination switching image acquired in the image acquisition step, edge enhancement processing that emphasizes the edges of the image of the object to acquire an edge-emphasized image; an image generation step of generating a composite image by adding together the plurality of edge-emphasized images; and a region specifying step of specifying the target area from the composite image.
 A second aspect of the present invention is an image processing apparatus for specifying a target area in which an image of an object is present in an original image obtained by imaging the object irradiated with illumination light from all or part of a plurality of mutually different illumination directions, the apparatus comprising: an image acquisition unit that acquires an original image of the object as an illumination switching image at each switching of the illumination light while switching the illumination light applied to the object in multiple stages; an image processing unit that performs, for each illumination switching image, edge enhancement processing that emphasizes the edges of the image of the object to obtain an edge-emphasized image; an image generation unit that generates a composite image by adding together the plurality of edge-emphasized images; and a region specifying unit that specifies the target area from the composite image.
 A third aspect of the present invention is an inspection method for inspecting an object, comprising: a step of specifying, by the above image processing method, the target area in which the image of the object is present; an extraction step of excluding the image outside the target area from the composite image to extract the image of the object; and an inspection step of inspecting the object based on the image of the object extracted in the extraction step.
 In the invention thus configured, illumination switching images are acquired while the illumination light is switched, and edge enhancement processing is performed on each illumination switching image to acquire edge-emphasized images. Because the edge-emphasized images are obtained while illumination light is applied in various illumination modes, the edges of the image of the object become clear in the composite image generated by adding them together. The target area is then specified with high accuracy from a composite image having such characteristics.
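As a concrete illustration of the sequence just described (acquire illumination switching images, edge-enhance each, add them into a composite, binarize, and take the target area), the following is a minimal sketch in NumPy. The gradient-magnitude edge operator, the fixed relative threshold, the bounding-box output, and all function names are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def edge_emphasize(img):
    # Gradient-magnitude edge emphasis: one simple stand-in for the
    # edge enhancement processing applied to each switching image.
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def specify_target_region(switch_images, rel_threshold=0.5):
    # Image generation step: add the edge-emphasized images together.
    composite = sum(edge_emphasize(img) for img in switch_images)
    # Region specifying step: binarize the composite and return the
    # bounding box (row0, row1, col0, col1) of the foreground pixels.
    binary = composite > rel_threshold * composite.max()
    rows, cols = np.nonzero(binary)
    return rows.min(), rows.max(), cols.min(), cols.max()

# Toy illumination switching images: the same rectangular "workpiece"
# seen under three lighting patterns of different apparent brightness.
rng = np.random.default_rng(0)
switch_images = []
for gain in (0.6, 0.8, 1.0):
    img = rng.normal(0.05, 0.01, (64, 64))
    img[20:44, 24:52] += gain  # the workpiece region
    switch_images.append(img)

target_region = specify_target_region(switch_images)
```

Because the edge magnitudes accumulate across lighting patterns while the noise does not, the binarized composite cleanly delimits the rectangle regardless of how bright the object appeared in any single image.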
 As described above, according to the present invention, a plurality of illumination switching images are acquired while the illumination light applied to the object is switched in multiple stages, and the target area is specified based on a composite image generated by adding together the edge-emphasized images obtained by applying edge enhancement processing to each illumination switching image. The target area can therefore be specified with high accuracy. Furthermore, by extracting the image of the object using the target area thus specified and inspecting the object based on that image, the object can be inspected with high accuracy.
 Not all of the plurality of components in each aspect of the present invention described above are essential; in order to solve some or all of the problems described above, or to achieve some or all of the effects described in this specification, some of those components may be changed, deleted, replaced with other new components, or have part of their limitations removed, as appropriate. Furthermore, in order to solve some or all of the problems described above, or to achieve some or all of the effects described in this specification, some or all of the technical features included in one aspect of the present invention described above may be combined with some or all of the technical features included in another aspect of the present invention described above to form an independent aspect of the present invention.
FIG. 1 is a diagram showing the configuration of an inspection apparatus equipped with an embodiment of an image processing apparatus according to the present invention.
FIG. 2 is a plan view showing the main body of the inspection apparatus shown in FIG. 1.
FIG. 3 is a flowchart showing the flow of inspection processing in the inspection apparatus.
FIG. 4 is a flowchart showing the target area specifying processing executed in the inspection processing.
FIGS. 5 to 7 are diagrams showing examples of illumination switching images and edge-emphasized images obtained when the workpiece is imaged by the upper imaging unit.
FIG. 8 is a diagram showing a composite image created from the edge-emphasized images shown in FIGS. 5 to 7 and an inspection image extracted from the composite image.
FIGS. 9 and 10 are diagrams showing examples of illumination switching images and edge-emphasized images obtained when the workpiece is imaged by an oblique imaging unit.
FIG. 11 is a diagram showing a composite image created from the edge-emphasized images shown in FIGS. 9 and 10 and an inspection image extracted from the composite image.
FIGS. 12 and 13 are diagrams showing examples of illumination switching images and edge-emphasized images obtained when the workpiece is imaged by a side imaging unit.
FIG. 14 is a diagram showing a composite image created from the edge-emphasized images shown in FIGS. 12 and 13 and an inspection image extracted from the composite image.
 FIG. 1 is a diagram showing the configuration of an inspection apparatus equipped with an embodiment of an image processing apparatus according to the present invention. FIG. 2 is a plan view showing the main body of the inspection apparatus shown in FIG. 1. The inspection apparatus 1 is an apparatus that inspects a metal workpiece W manufactured by, for example, forging or casting, by comparing an inspection image obtained by imaging the workpiece W with a reference image acquired in advance.
 As shown in FIG. 1, the inspection apparatus 1 includes a main body 11 and a control unit 12 configured by a computer. The main body 11 includes a stage 2, an imaging unit 3, and a light source unit 4. The workpiece W is placed on the stage 2. The main body 11 is provided with a light shielding cover (not shown) that prevents external light from reaching the stage 2, and the stage 2, the imaging unit 3, and the light source unit 4 are provided inside the light shielding cover.
 As shown in FIGS. 1 and 2, the imaging unit 3 includes one upper imaging unit 31, eight oblique imaging units 32, and eight side imaging units 33. In FIG. 2, the upper imaging unit 31 is not shown (the same applies to the upper light source unit 41 described later). As shown in FIG. 1, the upper imaging unit 31 is disposed above the stage 2 on a central axis J1 extending vertically upward from the center of the stage 2, and outputs to the control unit 12 an original image obtained by imaging the upper surface of the workpiece W on the stage 2 from directly above.
 As shown in FIG. 2, when the main body 11 is viewed from above (that is, in plan view), the eight oblique imaging units 32 are arranged around the stage 2. The eight oblique imaging units 32 are arrayed at an angular interval (pitch) of 45° in the circumferential direction about the central axis J1. In the plane containing the imaging optical axis K2 of each oblique imaging unit 32 and the central axis J1 (see FIG. 1), the angle θ2 formed by the imaging optical axis K2 and the central axis J1 is approximately 45°. Each oblique imaging unit 32 can acquire an original image obtained by imaging the workpiece W on the stage 2 obliquely from above.
 When the main body 11 is viewed in plan, the eight side imaging units 33 are also arranged around the stage 2, like the eight oblique imaging units 32. The eight side imaging units 33 are arrayed at an angular interval of 45° in the circumferential direction about the central axis J1. In the plane containing the imaging optical axis K3 of each side imaging unit 33 and the central axis J1, the angle θ3 formed by the imaging optical axis K3 and the central axis J1 is approximately 90°. Each side imaging unit 33 can acquire an original image obtained by imaging the workpiece W on the stage 2 from the side. The upper imaging unit 31, the oblique imaging units 32, and the side imaging units 33 each include, for example, a CCD (Charge Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor, and acquire multi-gradation images. The upper imaging unit 31, the oblique imaging units 32, and the side imaging units 33 are supported by a support (not shown).
 The light source unit 4 includes one upper light source unit 41, eight oblique light source units 42, and eight side light source units 43. The upper light source unit 41 is a light source unit in which a plurality of LEDs (light emitting diodes) are arranged in a ring centered on the central axis J1. The ring-shaped upper light source unit 41 is fixed to the upper imaging unit 31 so as to surround it. The upper light source unit 41 can irradiate the workpiece W on the stage 2 with illumination light from directly above, along a direction parallel to the central axis J1.
 When the main body 11 is viewed in plan, the eight oblique light source units 42 are arranged around the stage 2 at an angular interval of 45° in the circumferential direction about the central axis J1. Each oblique light source unit 42 is a light source unit in which a plurality of LEDs are arranged in a bar extending in the tangential direction of a circle centered on the central axis J1. When the line connecting the center of the emission surface of each oblique light source unit 42 and the center of the workpiece W is called the "oblique illumination axis", the angle formed by the oblique illumination axis and the central axis J1, in the plane containing both, is approximately 45°. Each oblique light source unit 42 can irradiate the workpiece W on the stage 2 with illumination light obliquely from above along its oblique illumination axis. In the inspection apparatus 1, each oblique light source unit 42 is fixed to the corresponding oblique imaging unit 32.
 When the main body 11 is viewed in plan, the eight side light source units 43 are arranged around the stage 2 at an angular interval of 45° in the circumferential direction about the central axis J1. Each side light source unit 43 is a light source unit in which a plurality of LEDs are arranged in a bar extending in the tangential direction of a circle centered on the central axis J1. As with the oblique light source units 42, when the line connecting the center of the emission surface of each side light source unit 43 and the workpiece W is called the "side illumination axis", the angle formed by the side illumination axis and the central axis J1, in the plane containing both, is approximately 90°. Each side light source unit 43 can irradiate the workpiece W on the stage 2 with illumination light from the side along its side illumination axis. In the inspection apparatus 1, each side light source unit 43 is fixed to the corresponding side imaging unit 33.
 As described above, in the present embodiment, 17 light source units (= 1 upper light source unit 41 + 8 oblique light source units 42 + 8 side light source units 43) are arranged so as to surround the stage 2 in a dome shape, and illumination light can be applied to the workpiece W from all or part of 17 mutually different illumination directions. For example, the distance between the upper imaging unit 31 and upper light source unit 41 and the workpiece W is about 55 cm (centimeters), the distance between the oblique imaging units 32 and oblique light source units 42 and the workpiece W is about 50 cm, and the distance between the side imaging units 33 and side light source units 43 and the workpiece W is about 40 cm. Light sources other than LEDs may be used in the upper light source unit 41, the oblique light source units 42, and the side light source units 43.
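The dome geometry just described (one top source on the central axis J1, eight oblique sources at roughly 45° to it, eight side sources at roughly 90°, each ring on a 45° circumferential pitch) determines 17 illumination direction vectors. The sketch below enumerates them under the stated idealized angles; the coordinate convention (z up along J1, azimuth measured from an arbitrary reference direction) is an assumption for illustration only.

```python
import math

def illumination_directions():
    # Unit vectors pointing from each light source toward the workpiece,
    # with z along the central axis J1 (pointing up). The polar angle is
    # measured from J1: 0 deg for the upper source, ~45 deg for the
    # eight oblique sources, ~90 deg for the eight side sources; each
    # ring is laid out at a 45 deg circumferential pitch.
    dirs = [(0.0, 0.0, -1.0)]  # upper light source unit 41
    for polar_deg in (45.0, 90.0):
        polar = math.radians(polar_deg)
        for k in range(8):
            az = math.radians(45.0 * k)
            dirs.append((-math.sin(polar) * math.cos(az),
                         -math.sin(polar) * math.sin(az),
                         -math.cos(polar)))
    return dirs

directions = illumination_directions()
```

Enumerating the directions this way makes the "all or part" wording concrete: a lighting pattern is simply a subset of these 17 vectors.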
 In the present embodiment, the control unit 12 is provided to control each part of the main body 11 and, as described later, to inspect the workpiece W by comparing an inspection image of the workpiece W with a reference image. The control unit 12 includes an arithmetic processing unit 5 configured by a CPU (Central Processing Unit) and a storage unit 6 that stores programs and various data such as reference image data 61 and, as detailed later, illumination switching image data 62, edge-emphasized image data 63, composite image data 64, target area data 65, and inspection image data 66. By controlling each part of the apparatus in accordance with the programs, the arithmetic processing unit 5 acquires a plurality of illumination switching images while switching the illumination light applied to the workpiece W in multiple stages, applies smoothing processing and edge enhancement processing to each illumination switching image to acquire a plurality of edge-emphasized images, adds the edge-emphasized images together to generate a composite image, specifies the target area based on the composite image, extracts the image of the workpiece W (that is, the inspection image) based on the target area, and inspects the workpiece W by comparing the inspection image with the reference image. The arithmetic processing unit 5 thus functions as an image acquisition unit 51, an image processing unit 52, an image generation unit 53, a target area specifying unit 54, an image extraction unit 55, and an inspection unit 56. The specification of the target area from an original image including the image of the workpiece W and the inspection of the workpiece W are described below with reference to FIGS. 3 to 14.
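The division of the arithmetic processing unit 5 into functional units 51 to 56 can be mirrored in code. The following is a hypothetical sketch of that structure (inspection itself is omitted); the gradient edge operator, the relative binarization threshold, the toy `capture` function, and all names are assumptions, not the apparatus's actual processing.

```python
import numpy as np

class ProcessingUnit:
    # Hypothetical mirror of units 51-55: acquire illumination switching
    # images, edge-enhance them, composite, specify the target area, and
    # extract the workpiece image.

    def acquire(self, capture, lighting_patterns):
        # Image acquisition unit 51: one image per lighting pattern.
        return [capture(p) for p in lighting_patterns]

    def edge_enhance(self, img):
        # Image processing unit 52: gradient-magnitude edge emphasis.
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gx, gy)

    def composite(self, edge_images):
        # Image generation unit 53: add the edge-emphasized images.
        return np.sum(edge_images, axis=0)

    def specify(self, composite_img, frac=0.5):
        # Target area specifying unit 54: binarize the composite.
        return composite_img > frac * composite_img.max()

    def extract(self, img, mask):
        # Image extraction unit 55: keep only pixels inside the mask.
        out = np.zeros_like(img)
        out[mask] = img[mask]
        return out

def capture(gain):  # toy stand-in for an imaging unit + lighting pattern
    img = np.full((16, 16), 0.1)
    img[4:12, 4:12] += gain
    return img

unit = ProcessingUnit()
imgs = unit.acquire(capture, [0.5, 1.0])
edges = [unit.edge_enhance(im) for im in imgs]
mask = unit.specify(unit.composite(edges))
extracted = unit.extract(imgs[1], mask)
```

Keeping each step a separate method reflects how the patent attributes each stage to a distinct functional block of the same CPU.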
 FIG. 3 is a flowchart showing the flow of inspection processing in the inspection apparatus, and FIG. 4 is a flowchart showing the target area specifying processing executed in the inspection processing. First, the workpiece W to be inspected is placed on the stage 2 (step S1). A holding unit (not shown) suited to the shape, size, and so on of the workpiece W is provided on the stage 2, and holds the workpiece W at a predetermined position on the stage 2 with the main surface W1 of the workpiece W facing vertically upward. Subsequently, the control unit 12 acquires imaging setting information for the workpiece W on the stage 2 based on input by the operator or the like (step S2). Here, the imaging setting information indicates the imaging units to be used in the imaging unit 3 (hereinafter referred to as "selected imaging units") and the light source units to be lit in the light source unit 4 when each selected imaging unit captures an image. For example, in a case where the workpiece W is imaged from three imaging directions, namely from above, obliquely, and from the side, to acquire three original images and inspect the workpiece W (hereinafter referred to as the "specific example"), the imaging setting information may include the following:
・the upper imaging unit 31 is used as a selected imaging unit, and all the oblique light source units 42 are lit as light source units to illuminate the workpiece W from illumination directions suited to it;
・one oblique imaging unit 32a (see FIG. 2) of the eight oblique imaging units 32 is used as a selected imaging unit, and some of the oblique light source units 42 are lit as light source units to illuminate the workpiece W from illumination directions suited to it;
・one side imaging unit 33a (see FIG. 2) of the eight side imaging units 33 is used as a selected imaging unit, and some of the side light source units 43 are lit as light source units to illuminate the workpiece W from illumination directions suited to it.
 It goes without saying that the type and number of imaging units to be used and the lighting mode of the light source units are not limited to these and may be changed as appropriate according to, for example, the type of the workpiece W.
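One hypothetical way to encode the imaging setting information of the specific example is a mapping from each selected imaging unit to the light source units lit for it. The identifiers and the dictionary layout below are illustrative assumptions; the particular subsets for units 32a and 33a are taken from the lighting patterns described later in the text (oblique sources 42b to 42f and 42h; side sources 43b to 43f).

```python
# Keys and values are illustrative identifiers, not from the source.
IMAGING_SETTINGS = {
    "upper_31":    ["42" + c for c in "abcdefgh"],         # all 8 oblique sources
    "oblique_32a": ["42" + c for c in "bcdef"] + ["42h"],  # 6 oblique sources
    "side_33a":    ["43" + c for c in "bcdef"],            # 5 side sources
}

def lighting_sources(selected_unit):
    # Light source units lit while the given selected imaging unit
    # images the workpiece W.
    return IMAGING_SETTINGS[selected_unit]
```

A table like this makes it straightforward to change, per workpiece type, which imaging units are used and which sources are lit, as the text notes.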
 When the imaging setting information for the workpiece W has been acquired, the arithmetic processing unit 5 executes the target area specifying processing prior to imaging and inspecting the workpiece W based on that imaging setting information (step S3). First, the arithmetic processing unit 5 determines, among the selected imaging units (in the specific example, the upper imaging unit 31, the oblique imaging unit 32a, and the side imaging unit 33a), one selected imaging unit for which specification of the target area has not yet been completed (step S31). The arithmetic processing unit 5 also sets the light source units to be lit when imaging the workpiece W with the selected imaging unit determined in step S31 (hereinafter referred to as "lighting light source units") and a plurality of lighting patterns (step S32). The lighting patterns are described later using the specific example; by switching among these lighting patterns, the workpiece W can be illuminated in a plurality of illumination modes.
 The arithmetic processing unit 5 switches to one of the plurality of lighting patterns, acquires the original image of the workpiece W captured by the selected imaging unit while the desired lighting light source units are selectively lit as an illumination switching image, and stores its image data (illumination switching image data 62) in the storage unit 6 (step S33). Subsequently, the arithmetic processing unit 5 applies smoothing processing to the illumination switching image data 62 to remove noise components contained in the image, then performs edge enhancement processing that emphasizes the edges of the image of the workpiece W to acquire an edge-emphasized image, and stores its image data (edge-emphasized image data 63) in the storage unit 6 (step S34). This image acquisition step (step S33) and image processing step (step S34) are repeated until the arithmetic processing unit 5 determines that all lighting patterns have been completed ("YES" in step S35). Through this repetition, an edge-emphasized image is obtained for each illumination mode while the workpiece W is illuminated in a plurality of illumination modes.
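The per-image processing of steps S33 and S34 (smooth the illumination switching image to suppress noise, then emphasize edges) might look like the following sketch. The box filter and the Sobel operator are one common concrete choice for smoothing and edge enhancement, not necessarily the processing the apparatus uses; kernel sizes and the toy image are assumptions.

```python
import numpy as np

def smooth(img, k=3):
    # Mean (box) filter: a simple smoothing step to suppress noise
    # before edge emphasis; the kernel size k is an assumption.
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def sobel_edges(img):
    # Sobel gradient magnitude as one concrete edge-emphasis operator.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            win = padded[dy:dy + h, dx:dx + w]
            gx += kx[dy, dx] * win
            gy += ky[dy, dx] * win
    return np.hypot(gx, gy)

# One illumination switching image: noisy background, brighter object.
rng = np.random.default_rng(1)
switch_img = rng.normal(0.1, 0.02, (32, 32))
switch_img[8:24, 8:24] += 1.0
edge_img = sobel_edges(smooth(switch_img))
```

Smoothing first keeps isolated noise pixels from surviving the edge operator, so the edge-emphasized image responds mainly at the object contour.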
 In the next step S36, the arithmetic processing unit 5 adds together the plurality of edge-emphasized images thus obtained to generate a composite image, and stores the composite image data 64 in the storage unit 6 (image generation step). The arithmetic processing unit 5 further applies binarization processing to the composite image to specify the target area in which the image of the workpiece W is present (reference sign OR in FIGS. 8, 11, and 14 described later) within the original image of the workpiece W captured by the selected imaging unit determined in step S31 (step S37: region specifying step). The arithmetic processing unit 5 then determines whether the target area has been specified for all of the selected imaging units (step S38); while it determines that not all have been completed, it returns to step S31 and repeats the above series of steps (= image acquisition step + image processing step + image generation step + region specifying step) to specify the target area for each selected imaging unit, storing the target area data 65 in the storage unit 6.
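The point of steps S36 and S37 — edges that one lighting pattern fails to reveal are supplied by another, so the binarized sum delimits the whole object — can be illustrated with two synthetic switching images, each making only half of the object visible. The fixed threshold and the bounding-box summary of the region are illustrative assumptions.

```python
import numpy as np

def edge_map(img):
    # Gradient-magnitude edge emphasis (illustrative choice).
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

# Two synthetic illumination switching images: each lighting pattern
# makes only one half of a 20x20 object visible.
left = np.zeros((40, 40));  left[10:30, 10:20] = 1.0
right = np.zeros((40, 40)); right[10:30, 20:30] = 1.0

# Step S36: add the edge-emphasized images; step S37: binarize.
composite = edge_map(left) + edge_map(right)
binary = composite > 0.25          # fixed threshold (assumption)
ys, xs = np.nonzero(binary)
target = (ys.min(), ys.max(), xs.min(), xs.max())
```

Either edge map alone covers only one half of the object; their sum, once binarized, spans the full extent, which is why the composite is what gets binarized rather than any single edge-emphasized image.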
 Here, to further aid understanding of the target area specifying processing, it is described concretely based on the specific example. In the specific example, three of the 17 imaging units, namely the upper imaging unit 31, the oblique imaging unit 32a, and the side imaging unit 33a, are selected as the selected imaging units.
 When the upper imaging unit 31 is determined to be the "one selected imaging unit" in step S31, as shown in FIGS. 5 to 7, all eight oblique light source units 42 are selected as lighting light source units, and a total of eight lighting patterns (A-1) to (A-8), each lighting one oblique light source unit 42, are set (step S32). In these drawings (and in FIGS. 9, 10, 12, and 13 described later), the light source units to be lit are marked with dots. When the lighting pattern is switched to (A-1), the illumination switching image SG(A-1) is acquired under an illumination mode in which only the lighting light source unit 42a is lit (step S33). The edge-emphasized image EG(A-1) is then acquired by applying smoothing processing and edge enhancement processing to the illumination switching image SG(A-1) (step S34). Likewise, each time the lighting pattern is switched in turn to (A-2) to (A-8), the edge-emphasized images EG(A-2) to EG(A-8) are acquired in the same manner. By adding together the eight edge-emphasized images EG(A-1) to EG(A-8) thus acquired, a composite image as shown in the upper part of FIG. 8 is generated (step S36). By applying binarization processing to this composite image, the target area OR surrounded by the boundary line BL is determined as shown in the lower part of FIG. 8 (step S37). The image shown in the lower part of FIG. 8 shows the inspection image obtained by the extraction processing described below.
 When the oblique imaging unit 32a (see FIG. 2) is determined to be the "one selected imaging unit" in step S31, as shown in FIGS. 9 and 10, six of the eight oblique light source units 42, namely the oblique light source units 42b to 42f and 42h, are selected as lighting light source units, and a total of five lighting patterns are set: four lighting patterns (B-1) to (B-4), each lighting one of the oblique light source units 42c to 42e and 42h, and a lighting pattern (B-5) lighting the five oblique light source units 42b to 42f simultaneously (step S32). When the lighting pattern is switched to (B-1), the illumination switching image SG(B-1) is acquired under an illumination mode in which only the lighting light source unit 42h is lit (step S33). The edge-emphasized image EG(B-1) is then acquired by applying smoothing processing and edge enhancement processing to the illumination switching image SG(B-1) (step S34). Likewise, each time the lighting pattern is switched in turn to (B-2) to (B-5), the edge-emphasized images EG(B-2) to EG(B-5) are acquired in the same manner. By adding together the five edge-emphasized images EG(B-1) to EG(B-5) thus acquired, a composite image as shown in the upper part of FIG. 11 is generated (step S36). By applying binarization processing to this composite image, the target area OR surrounded by the boundary line BL is determined as shown in the lower part of FIG. 11 (step S37). The image shown in the lower part of FIG. 11 shows the inspection image obtained by the extraction processing described below.
 Furthermore, when the side imaging unit 33a (see FIG. 2) is determined to be the "one selected imaging unit" in step S31, then, as shown in FIGS. 12 and 13, five of the eight side light source units 43 (the side light source units 43b to 43f) are selected as lighting light source units, and a total of four lighting patterns are set (step S32): three patterns (C-1) to (C-3), each of which lights one of the side light source units 43c to 43e individually, and a pattern (C-4) that lights the five side light source units 43b to 43f simultaneously. When the illumination is switched to lighting pattern (C-1), an illumination switching image SG(C-1) is acquired under an illumination mode in which only the side light source unit 43d is lit (step S33). An edge-enhanced image EG(C-1) is then acquired by applying smoothing processing and edge-enhancement processing to the illumination switching image SG(C-1) (step S34). Each time the illumination is switched in turn to lighting patterns (C-2) to (C-4), the edge-enhanced images EG(C-2) to EG(C-4) are acquired in the same manner. The four edge-enhanced images EG(C-1) to EG(C-4) obtained in this way are added together to generate a composite image as shown in the upper part of FIG. 14 (step S36). Applying binarization processing to this composite image determines the target region OR surrounded by the boundary line BL, as shown in the lower part of FIG. 14 (step S37). The image shown in the lower part of FIG. 14 is the inspection image obtained by the extraction processing described below.
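The per-pattern processing of steps S33 to S37 (acquire an illumination switching image, smooth it, emphasize its edges, add the edge-enhanced images into a composite, and binarize the result) can be sketched as follows. This is a minimal illustration, not the patented implementation: the text does not fix particular operators, so a box filter stands in for the smoothing, a gradient magnitude stands in for the edge emphasis, and the threshold value is hypothetical.

```python
import numpy as np

def smooth(img, k=3):
    """k-by-k box filter: a stand-in for the smoothing of step S34."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def edge_enhance(img):
    """Gradient magnitude: a stand-in for the edge emphasis of step S34."""
    gy, gx = np.gradient(img)
    return np.hypot(gx, gy)

def composite_target_region(switch_images, threshold):
    """Add the edge-enhanced images of all lighting patterns (step S36)
    and binarize the composite to delimit the target region (step S37)."""
    edges = [edge_enhance(smooth(g)) for g in switch_images]
    composite = np.sum(edges, axis=0)
    return composite, composite > threshold
```

Because each lighting pattern reveals different edges of the workpiece, summing the per-pattern edge images before binarizing lets the workpiece boundary survive even when no single illumination mode shows it completely.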
 Returning now to FIG. 3, the description continues. When identification of the target region OR (see FIGS. 8, 11, and 14) is complete, the arithmetic processing unit 5 reads the composite image data 64 and the target region data 65 from the storage unit 6 and, based on them, extracts the inspection image from the composite image (step S4: extraction step). That is, images other than the target region OR in the composite image are excluded as background by an area threshold, and the image of the workpiece W, that is, the inspection image, is extracted (see the lower parts of FIGS. 8, 11, and 14); the inspection image data 66 is stored in the storage unit 6. The arithmetic processing unit 5 then reads from the storage unit 6 the reference image data 61, the data of the reference image of the workpiece W stored there in advance (step S5: reading step). The reference image is a captured image obtained by imaging a workpiece W free of defects such as scratches on the basis of the imaging setting information; it is acquired before the inspection and stored in the storage unit 6. Finally, the arithmetic processing unit 5 compares the inspection image with the reference image to inspect whether a defect such as a scratch is present on the workpiece W (step S6: inspection step).
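Steps S4 to S6 (discard everything outside the target region with an area threshold, then compare the extracted inspection image against a defect-free reference) can be sketched as below. This is a hedged illustration: the connected-component labeling, the 4-connectivity, and the per-pixel difference test are assumptions standing in for details the text does not specify.

```python
import numpy as np
from collections import deque

def area_filter(binary, min_area):
    """Keep only connected regions of at least min_area pixels (4-connectivity);
    smaller blobs are discarded as background, as in extraction step S4."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    keep = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not seen[y, x]:
                seen[y, x] = True
                queue, component = deque([(y, x)]), [(y, x)]
                while queue:  # breadth-first flood fill of one region
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                            component.append((ny, nx))
                if len(component) >= min_area:
                    for cy, cx in component:
                        keep[cy, cx] = True
    return keep

def inspect(inspection_img, reference_img, region, diff_tol):
    """Report a defect when the inspection image deviates from the reference
    by more than diff_tol at any pixel inside the target region (step S6)."""
    diff = np.abs(inspection_img.astype(float) - reference_img.astype(float))
    return bool((diff[region] > diff_tol).any())
```

In practice, library routines (for example connected-component labeling from an image processing package) would replace the hand-rolled flood fill; the structure above only mirrors the order of operations described in the text.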
 As described above, according to this embodiment, illumination switching images are acquired while the illumination light is switched, edge-enhancement processing is applied to each illumination switching image to obtain an edge-enhanced image, and these edge-enhanced images are added together to generate a composite image. The edges of the workpiece W therefore become clear in the composite image, so the target region containing the image of the workpiece W (the inspection image) can be identified with high accuracy. Moreover, because the inspection image is extracted using the target region identified in this way, the workpiece W can be inspected with high precision.
 In the above embodiment, the workpiece W corresponds to an example of the "object" of the present invention, and the arithmetic processing unit 5 functions as the "image processing apparatus" of the present invention.
 The present invention is not limited to the embodiment described above; various modifications other than those described can be made without departing from its spirit. For example, in this embodiment the extraction step, reading step, and inspection step are performed after identification of the target region has been completed for all selected imaging units, but the execution timing of these steps is not limited to this: the identification of the target region and the extraction step may be performed consecutively for each selected imaging unit, or the identification, extraction, reading, and inspection steps may all be performed for each selected imaging unit.
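The two execution orders discussed in this modification differ only in scheduling, not in outcome, so long as the steps have no side effects on one another. A sketch (function names are hypothetical, not from the patent):

```python
def run_batch(units, identify, extract, inspect):
    """Embodiment order: identify the target region for every selected
    imaging unit first, then extract and inspect each one."""
    regions = {u: identify(u) for u in units}
    return {u: inspect(extract(u, regions[u])) for u in units}

def run_per_unit(units, identify, extract, inspect):
    """Modified order: complete identification, extraction, and
    inspection for each selected imaging unit before moving on."""
    results = {}
    for u in units:
        results[u] = inspect(extract(u, identify(u)))
    return results
```

The per-unit order lets inspection results appear earlier for the first imaging unit, while the batch order keeps all illumination switching together; both produce the same inspection results.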
 Further, in the above embodiment the present invention is applied to an inspection apparatus that selects three of the 17 imaging units (the upper imaging unit 31, the oblique imaging unit 32a, and the side imaging unit 33a) as selected imaging units, but the type and number of selected imaging units are not limited to this. Likewise, although the present invention is applied here to an inspection apparatus provided with 17 imaging units and 17 light source units, the numbers of imaging units and light source units and the type of apparatus are not limited to the above embodiment; the present invention can be applied generally to any apparatus provided with an imaging unit that images an object irradiated with illumination light from all or part of a plurality of mutually different illumination directions.
 While the invention has been described above with reference to a specific embodiment, this description is not intended to be construed in a limiting sense. Various modifications of the disclosed embodiment, as well as other embodiments of the present invention, will become apparent to persons skilled in the art upon reference to this description. The appended claims are therefore considered to cover any such modifications or embodiments that do not depart from the true scope of the invention.
 The present invention is applicable to image processing techniques in general for specifying a target region in which an image of an object is present in an original image obtained by imaging the object irradiated with illumination light from all or part of a plurality of mutually different illumination directions, and to inspection methods that inspect an object using such image processing techniques.
 1 ... inspection apparatus
 5 ... arithmetic processing unit (image processing apparatus)
 31 ... upper imaging unit
 32 ... oblique imaging unit
 33 ... side imaging unit
 41 ... upper light source unit
 42 ... oblique light source unit
 43 ... side light source unit
 51 ... image acquisition unit
 52 ... image processing unit
 53 ... image generation unit
 54 ... target region specification unit
 55 ... image extraction unit
 56 ... inspection unit
 OR ... target region
 SG ... illumination switching image
 W ... workpiece (object)

Claims (6)

  1.  An image processing method for specifying a target region in which an image of an object is present in an original image obtained by imaging the object while it is irradiated with illumination light from all or part of a plurality of mutually different illumination directions, the method comprising:
     an image acquisition step of acquiring an original image of the object as an illumination switching image each time the illumination light is switched, while switching the illumination light applied to the object in multiple stages;
     an image processing step of acquiring an edge-enhanced image by performing, for each illumination switching image acquired in the image acquisition step, edge-enhancement processing that emphasizes edges of the image of the object;
     an image generation step of generating a composite image in which the plurality of edge-enhanced images are added together; and
     a region specification step of specifying the target region from the composite image.
  2.  The image processing method according to claim 1, wherein the image processing step obtains the edge-enhanced image by performing the edge-enhancement processing after applying smoothing processing to the illumination switching image.
  3.  The image processing method according to claim 1 or 2, wherein the region specification step specifies the target region by applying binarization processing to the composite image.
  4.  The image processing method according to any one of claims 1 to 3, wherein a plurality of imaging units are provided to image the object from mutually different imaging directions, and the image acquisition step, the image processing step, the image generation step, and the region specification step are executed for each imaging unit to specify the target region in which the image of the object is present in the original image captured by that imaging unit.
  5.  An image processing apparatus for specifying a target region in which an image of an object is present in an original image obtained by imaging the object while it is irradiated with illumination light from all or part of a plurality of mutually different illumination directions, the apparatus comprising:
     an image acquisition unit that acquires an original image of the object as an illumination switching image each time the illumination light is switched, while switching the illumination light applied to the object in multiple stages;
     an image processing unit that obtains an edge-enhanced image by performing, for each illumination switching image, edge-enhancement processing that emphasizes edges of the image of the object;
     an image generation unit that generates a composite image in which the plurality of edge-enhanced images are added together; and
     a region specification unit that specifies the target region from the composite image.
  6.  An inspection method for inspecting an object, comprising:
     a step of specifying, by the image processing method according to any one of claims 1 to 4, a target region in which an image of the object is present;
     an extraction step of extracting the image of the object by excluding images other than the target region from the composite image; and
     an inspection step of inspecting the object based on the image of the object extracted in the extraction step.
PCT/JP2018/018126 2017-09-25 2018-05-10 Image processing method, image processing device, and inspection method WO2019058633A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017183242A JP2019061317A (en) 2017-09-25 2017-09-25 Image processing method, image processing device and inspection method
JP2017-183242 2017-09-25

Publications (1)

Publication Number Publication Date
WO2019058633A1 true WO2019058633A1 (en) 2019-03-28

Family

ID=65809596

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/018126 WO2019058633A1 (en) 2017-09-25 2018-05-10 Image processing method, image processing device, and inspection method

Country Status (2)

Country Link
JP (1) JP2019061317A (en)
WO (1) WO2019058633A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01169581A (en) * 1987-12-24 1989-07-04 Toshiba Corp Image processor
JPH11345335A (en) * 1998-06-02 1999-12-14 Amada Co Ltd Method for image-picking up machined material by three-dimensional laser beam machining device and three-dimensional laser beam machining device using the method
JP2001317916A (en) * 2000-05-10 2001-11-16 Fuji Mach Mfg Co Ltd Edge detection method and apparatus
JP2009250740A (en) * 2008-04-04 2009-10-29 Bridgestone Corp Image synthesizing method and visual inspecting device of tire
JP2010134768A (en) * 2008-12-05 2010-06-17 Tokyo Univ Of Science Device and method for generating moving image having image feature composed therein, and program
JP2015173344A (en) * 2014-03-11 2015-10-01 三菱電機株式会社 object recognition device

Also Published As

Publication number Publication date
JP2019061317A (en) 2019-04-18

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18857778; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18857778; Country of ref document: EP; Kind code of ref document: A1