WO2022130843A1 - Appearance inspection method, appearance inspection device, and method and device for processing structure - Google Patents

Appearance inspection method, appearance inspection device, and method and device for processing structure Download PDF

Info

Publication number
WO2022130843A1
WO2022130843A1 PCT/JP2021/041436 JP2021041436W WO2022130843A1 WO 2022130843 A1 WO2022130843 A1 WO 2022130843A1 JP 2021041436 W JP2021041436 W JP 2021041436W WO 2022130843 A1 WO2022130843 A1 WO 2022130843A1
Authority
WO
WIPO (PCT)
Prior art keywords
visual inspection
unit
photographing
illumination
reflected light
Prior art date
Application number
PCT/JP2021/041436
Other languages
French (fr)
Japanese (ja)
Inventor
啓晃 笠井
薫 酒井
真由香 大崎
信昭 中須
淳也 小坂
Original Assignee
株式会社日立ハイテクファインシステムズ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立ハイテクファインシステムズ filed Critical 株式会社日立ハイテクファインシステムズ
Publication of WO2022130843A1 publication Critical patent/WO2022130843A1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/958Inspecting transparent materials or objects, e.g. windscreens

Definitions

  • the present invention relates to a technique for inspecting the appearance of a structure composed of a plurality of members, particularly a structure including a transparent member and other members. Furthermore, the present invention also relates to a structure processing technique using inspection results.
  • the transparent member may be sufficient to transmit the inspection light used for the inspection in a predetermined amount or more, and includes a so-called translucent member.
  • the structure may have a defect in the members constituting the structure. For example, protrusions called parting lines and burrs may occur. In addition, there may be missing parts in the design. Therefore, at the production site, the appearance of the structure is inspected. In this appearance inspection, it is general that the inspection is performed by performing image processing (including image recognition) on the image input from the photographing apparatus.
  • the structure contains a transparent member, it may be difficult to perform a highly accurate visual inspection. In the inspection of such a structure, it is necessary to consider the presence of a transparent member.
  • Patent Document 1 discloses "a foreign matter inspection device capable of inspecting foreign matter adhering to the front surface and the back surface of an object with high accuracy.
  • This "foreign matter inspection device is a light projecting unit that projects the inspection light so that the inspection light is obliquely incident on the light-transmitting subject from the light projection position, and the scattering of the foreign matter caused by the inspection light. It includes a light receiving unit that receives light at a light receiving position and a processing unit that processes the light receiving result of the light receiving unit to determine the presence or absence of the foreign matter.
  • FIG. 1 shows a composite member 10 which is an example of this structure.
  • the composite member 10 is formed by integrally molding the glass 101 and the rubber frame 102 provided on the outer peripheral surface of the glass 101.
  • FIG. 1A is a cross-sectional view of the composite member 10, and FIG. 1B shows a top view thereof.
  • defects of burrs 1021 and 1022 occur on the upper surface and the lower surface of the glass 101 of the rubber frame 102 of the composite member 10, respectively.
  • Patent Document 1 describes detecting the presence of a foreign substance, it does not consider recognizing a defect related to the shape of the member. Therefore, the present invention relates to an inspection of a structure in which a defect occurs in another member, a member constituting the other member, or another member at the interface of the transparent member, and the following points are the problems.
  • An object of the present invention is to recognize the shape of a defect in shape in a structure having a transparent member and other members.
  • the photographing unit performs an imaging with respect to the reflected light in the first direction with respect to the transmissive member, or an imaging in which the pixel value of the reflected light becomes a certain value or more. This is done to recognize the shape of the defect related to the first surface of the interface of the transparent member.
  • the surface means the image pickup unit side of the interface of the transmissive member.
  • the shape of the "defect" in the present specification includes members and parts within the design tolerance range.
  • the present invention relates to a visual inspection method using an visual inspection device for inspecting a defect of a structure including a permeable member, wherein the visual inspection device includes a holding portion for holding the structure and the above-mentioned.
  • An imaging unit that photographs the held structure, and a lighting unit that illuminates at least the first surface of the transparent member of the retained structure for imaging by the imaging unit.
  • It has a control unit that controls the holding unit, the photographing unit, and the lighting unit, and the control unit has at least one of the first surfaces in the photographing with the photographing unit with respect to the first surface of the transmissive member.
  • At least one of the lighting unit and the photographing unit is controlled so that the pixel value of the unit becomes equal to or more than a predetermined value, and when the pixel value becomes equal to or more than the predetermined value, at least one of the structures.
  • the imaging unit is controlled so that the unit captures the reflected light in the first direction with respect to the first surface, and the imaging information of at least a part of the imaged structure is used to capture the image of the structure. This is a visual inspection method for recognizing the shape of a defect related to the first surface.
  • the control unit In order for the control unit to control the pixel value to be equal to or higher than a predetermined value, (1) the illumination unit irradiates the illumination from a predetermined first direction, and (2) the illumination unit determines. It includes at least one of irradiating the illumination with the above luminance and (3) controlling the gain in the photographing of the photographing unit.
  • the present invention also includes the above-mentioned visual inspection device and an information processing device for realizing the control unit. Further, a computer program for operating the information processing apparatus and a storage medium for storing the program are also included.
  • the present invention also includes a processing method and a processing apparatus using the results of visual inspection.
  • a computer program for executing this processing method and a storage medium for storing this program are also included.
  • the figure which shows the composite member which is the inspection target in each Example The block diagram which shows the structure of the appearance inspection apparatus in Example 1.
  • Flow chart showing the processing contents in the first embodiment The figure which shows the design information used in Example 1.
  • a diagram showing shooting information in Example 1 and (b) a diagram showing an example of an image when the shooting conditions are not used.
  • the figure which shows the relationship between the brightness and the pixel in the shooting information when it is difficult to distinguish The block diagram which shows the structure of the manufacturing system in Example 1.
  • Flow chart showing the processing contents in the second embodiment The figure which shows the design information used in Example 2.
  • (A) (b) A diagram showing imaging information in Examples 2 and 3, and (c) a diagram illustrating extraction of a cutting line in Examples 2 and 3.
  • the composite member 10 shown in FIG. 1 will be described as an example to be inspected.
  • the composite member 10 is used as a window of a railroad vehicle, and a glass portion and a frame supporting the glass portion are integrally manufactured.
  • the composite member 10 is installed in a railway vehicle as a so-called unit frame.
  • the composite member 10 is not limited to being installed in a railroad vehicle, but can be installed in various means of transportation such as automobiles, aircraft, and ships, and in buildings such as buildings.
  • the glass 101 which is a transparent member, is surrounded by the rubber frame 102 and held. Then, as shown in FIG. 1A, burrs 1021 and burrs 1022 are generated on the upper surface and the lower surface of the glass 101, respectively, on the rubber frame 102. Examples of these burrs include thorns having a thickness of about 0.1 mm, strips having a thickness of about 0.1 to 0.3 mm, and granules having a thickness of 0.1 mm or less.
  • the image is taken from the upper surface side using a photographing device such as a camera or an photographing unit.
  • the photographing device photographs from the direction of the arrow shown in FIG. 1 (a).
  • burrs 1021 are generated on the upper surface side in the drawing.
  • burrs 1022 are generated on the lower surface side.
  • both the burrs 1021 and the burrs 1022 can be observed. Therefore, when these are photographed by using a photographing device, not only the burr 1021 but also the burr 1022 is photographed.
  • burr 1021 on the side facing the photographing device and the burr 1022 in the direction of passing the transmitted light through the glass 101 are photographed. This means that burrs 1021 and burrs 1022 cannot be distinguished. Therefore, the burr 1021 cannot be extracted or recognized. As a result, the cutting position for removing the burr 1021 cannot be specified.
  • the burr 1021 is recognized, in other words, the process for removing the burr 1022 is executed.
  • the recognition process is executed in consideration of the influence of the transparent member such as the glass 101 on the image to be inspected such as the rubber frame 102.
  • FIG. 2 is a block diagram showing the configuration of the visual inspection apparatus 20 in this embodiment.
  • the visual inspection device 20 includes a camera 21 that photographs the composite member 10 that is an inspection object, a coaxial epi-illuminator 22 that illuminates the composite member 10, an XY stage 23 that holds the composite member 10, and each of these configurations, that is, a visual inspection. It has a control unit 24 that controls the device 20.
  • the camera 21 is a means for realizing a photographing device or a photographing unit having a function of photographing the composite member 10.
  • the shooting of the camera 21 is not limited to a still image, and a moving image may be shot.
  • the coaxial epi-illumination device 22 functions as a lighting unit, and is a means for realizing the lighting unit that illuminates the composite member 10 photographed by the camera 21.
  • the coaxial epi-illuminator 22 reflects the illumination (light) from the light source 221 by a reflector 222 such as a half mirror, and emits illumination parallel to the optical axis of the camera 21.
  • a reflector 222 such as a half mirror
  • illumination parallel to the optical axis of the camera 21 As a result, it becomes possible to irradiate at least a part of the composite member 10 with the illumination from the coaxial epi-illuminator 22.
  • "coaxial epi-illumination" illumination that irradiates illumination parallel to the optical axis of the camera 21 is used, but other illumination may be used.
  • the XY stage 23 is a means for realizing the holding portion having a function of holding the composite member 10.
  • the XY stage 23 holds the composite member 10 so that the upper surface, which is the first surface to be the inspection surface of the composite member 10, faces the camera 21 and the coaxial epi-illuminator 22.
  • the direction in which the upper surface faces directly means the direction in which the upper surface, which is the inspection surface, is suitable for inspection. Therefore, this direction can be appropriately determined according to the shape and characteristics of the inspection target and the inspection surface, and the accuracy of the inspection.
  • the XY stage 23 may be operated according to the control unit 24, or the inspector may manually operate the XY stage 23 so as to be in that direction.
  • it has a drive mechanism represented by a motor for driving the XY stage 23.
  • control unit 24 controls the operations of the camera 21, the coaxial epi-illuminator 22, and the XY stage 23. That is, the control unit 24 controls the operation of the visual inspection device 20. Therefore, the control unit 24 is connected to each of the camera 21, the coaxial epi-illuminator 22, and the XY stage 23, and sends and receives information for control to and from each other.
  • control unit 24 may be configured integrally with the visual inspection device 20 or may be configured in a separate housing.
  • the above connection can be realized in the internal communication path.
  • the network enables connection.
  • control unit 24 can be realized by an information processing device such as a PC, a tablet, or a server.
  • the configuration of the control unit 24 thus realized in a separate housing will be described with reference to FIG.
  • the basic configuration is the same as when it is realized in a separate housing, and the difference will be mentioned in the description of FIG.
  • FIG. 3 is a block diagram showing the configuration of the control unit 24.
  • the control unit 24 includes a processing unit 241, an input unit 242, a display unit 243, a communication I / F 244, a main storage unit 245, and an auxiliary storage unit 246. These are connected to each other via an internal communication channel such as a bus.
  • an internal communication channel such as a bus.
  • the processing unit 241 is realized by an arithmetic unit such as a CPU. That is, the process according to each program stored or expanded in the main storage unit 245 is executed. The details of this process will be described later.
  • the function of the control unit 24 is realized by using a program, that is, software, but it may be realized by hardware. That is, the processing unit 241 may be configured by a circuit that executes the processing described later.
  • the input unit 242 can be realized by an input device such as a keyboard or pointing, and accepts the operation of the inspector. This operation includes operations in inspections such as starting and stopping of the camera 21, as well as measures to be taken in the event of a failure. When the visual inspection device 20 is automatically operated, the input unit 242 may be omitted.
  • the display unit 243 can be realized as a so-called display and outputs various information. Further, the display unit 243 may be integrally molded with the input unit 242 like a touch panel. Further, when the visual inspection device 20 is automatically operated, the display unit 243 may be omitted.
  • the communication I / F 244 has a function of connecting to various configurations of the visual inspection device 20 via a network.
  • it is connected to one visual inspection device 20, but it may be connected to a plurality of units.
  • the control unit 24 is integrally configured with the visual inspection device 20, this configuration may be omitted and the control unit 24 may be directly connected to another configuration.
  • the main storage unit 245 is realized by a storage medium such as a memory. Then, the main storage unit 245 expands each program for performing the processing in the processing unit 241 to the main storage unit 245. Therefore, it is usually desirable that each program is stored in another storage unit such as the auxiliary storage unit 246 or a storage medium.
  • the program includes various programs for controlling the visual inspection device 20. These can be exemplified by the following.
  • Camera control program 2451 that controls the activation of the camera 21
  • Coaxial epi-illuminator control program 2452 that controls the activation of the coaxial epi-illuminator 22
  • XY stage control program 2453 that controls the operation of the XY stage 23
  • the main storage unit 245 also stores, as a program, an image processing program 2454 for executing various image processing on the shooting information shot by the camera 21. Details of these programs will be described later.
  • the auxiliary storage unit 246 stores various information.
  • the various information includes the design information 2461 of the composite member 10 and the shooting information 2462 taken by the camera 21.
  • the auxiliary storage unit 246 can be realized by a hard disk drive (HDD), a solid state drive (SSD), various optical disks, or the like.
  • the auxiliary storage unit 246 may also store the above-mentioned various programs. Further, the information stored in the auxiliary storage unit 246 may be stored in an external storage device connected via the communication I / F 244.
  • FIG. 4 is a flowchart showing the processing contents using the processing unit 241 in this embodiment.
  • step S100 the processing unit 241 activates the camera 21 and the coaxial epi-illuminator 22 according to the camera control program 2451 and the coaxial epi-illuminator control program 2452.
  • the activation condition includes an inspector's operation via the input unit 242 and reception of a detection signal indicating that the composite member 10 is held from the XY stage 23.
  • the composite member identification information for identifying the composite member 10 held by the XY stage 23 is included in the operation or detection signal for the input unit 242, and the processing unit 241 specifies this.
  • the processing unit 241 specifies the imaging conditions of the composite member 10 to be inspected according to the image processing program 2454.
  • the processing unit 241 reads out the imaging conditions of the corresponding composite member 10 from the design information 2461 of the auxiliary storage unit 246.
  • An example of this design information 2461 is shown in FIG.
  • the design information 2461 includes composite member identification information 2461-1 for identifying the composite member to be inspected, and member 2461-2 for identifying each member constituting the composite member identification information 2461-1.
  • the design information 2461 also includes a dimension 2461-3 indicating the size and shape of each member and a tolerance 2461-4 for determining whether correction processing is necessary for this dimension. Further, the design information 2461 also includes a shooting condition 2461-5, which is a condition for shooting the composite member for each composite member. It is preferable that the shooting conditions 2461-5 include the illumination brightness thereof and the camera gain in the camera 21. Further, instead of the camera gain, the ISO sensitivity, the aperture, and the shutter speed of the camera 21 for realizing this camera gain may be stored.
  • step S101 the record corresponding to the composite member identification information specified in step S100 is specified from the design information 2461.
  • the processing unit 241 specifies the shooting conditions 2461-5 included in the specified record.
  • the processing unit 241 specifies “a” as the illumination brightness and “b” as the gamer gain. It should be noted that these shooting conditions indicate conditions in which the pixel value of the reflected light in the transmissive member is equal to or higher than a certain level. This pixel value includes luminance. Further, as an example in which the pixel value is above a certain level, shooting information such as so-called overexposure is saturated.
  • step S102 the processing unit 241 outputs a control signal for moving to the XY stage 23 according to the XY stage control program 2453.
  • the processing unit 241 identifies the position of the composite member 10 photographed by the camera 21, and sends a control signal to the XY stage 23 so that the cutting line for cutting the burr is included in the field of view of the camera 21. Output.
  • This cutting line can be specified by the processing unit 241 using the dimensions of the design information 2461.
  • the processing unit 241 outputs a control signal to the camera 21 that moves the position of the camera so that the cutting line is included in the field of view of the camera 21 as described above according to the camera control program. May be good.
  • the cutting line may be controlled to be included in the field of view of the camera 21 by the relative positional relationship between the camera 21 and the XY stage (or the composite member 10 held by the camera 21).
  • the cutting line is a kind of processing position for processing a member according to a design.
  • step S103 the processing unit 241 outputs a control signal for shooting according to the shooting conditions specified in step S101 to the camera 21 according to the camera control program 2451. Further, the processing unit 241 outputs a control signal for lighting according to the shooting conditions specified in step S101 to the coaxial epi-illuminator 22 according to the coaxial epi-illuminator control program 2452.
  • the coaxial epi-illuminator 22 irradiates the illumination under the illumination conditions included in the photographing conditions specified in step S101.
  • the coaxial epi-illuminator 22 generates reflected light in the first direction from the composite member 10 or the glass 101. It is desirable that this reflected light is substantially specular reflected light.
  • the substantially specularly reflected light means the reflected light including the specularly reflected light and falling within a certain range from the reflection angle of the specularly reflected light.
  • the reflected light in the first direction can be defined as having a pixel value of a certain value or more in the shooting information. An example of this is saturation including so-called overexposure.
  • the reflected light in the first direction is a substantially specular reflected light, and will be described as causing saturation.
  • the camera 21 performs shooting including substantially specular reflected light with the output control signal, that is, the camera gain specified in step S101. That is, the camera 21 captures an image in which the pixel value of a part of the glass 101 is saturated.
  • the processing unit 241 may be controlled as follows in step S103. (1) As described above, the coaxial epi-illuminator 22 irradiates the illumination from a predetermined first direction, (2) the coaxial epi-illuminator 22 irradiates the illumination with a predetermined brightness or more, (3). At least one of controlling the gain in the shooting of the camera 21 is included. That is, it suffices if any one or a combination of these can realize shooting in which the pixel value is equal to or higher than a certain value.
  • FIG. 6A shows an example schematically showing the shooting information 2462 acquired in step S013.
  • the glass portion 101-1 of the photographing information 2462 is saturated.
  • the rubber frame portion 102-1 is present on the right side of FIG. 6A.
  • the rubber frame portion 102-1 shows a burr portion 1021-1 indicating the burr 1021 generated on the upper surface side of the glass 101.
  • FIG. 6 (b) an image when the shooting conditions of step S101 are not used is shown in FIG. 6 (b).
  • the glass portion 101-1 is not saturated. Therefore, in this image, the burr portion 1022-1 showing the burr 1022 generated on the lower surface side of the glass 101 is also included in the shooting information.
  • the burr portion 1022-1 on the lower surface side is also included in the shooting information 2462, which makes it difficult to specify the burr portion 1021-1. Therefore, when removing burrs, it becomes difficult to identify an accurate cutting line.
  • FIGS. 6 (a) and 6 (b) schematically show shooting information and images, respectively. Therefore, in the photographing information 2462 shown in FIG. 6A, a negligible burr portion 1022-1 may be included.
  • step S101 the shooting conditions 2461-5 included in the design information 2461 are used.
  • the camera 21 may take a picture on a trial basis, and the processing unit 241 may determine whether the glass portion 101-1 is saturated according to the image processing program 2454. In this case, the shooting of the camera 21 and the determination by the processing unit 241 are repeatedly performed. Further, the inspector may use the image displayed on the display unit 243 to determine whether or not the image is saturated.
  • step S103 the shooting information, which is an image shot by the camera 21, is output to the control unit 24.
  • the processing unit 241 of the control unit 24 stores this shooting information in the auxiliary storage unit 246 as shooting information 2462.
  • the shooting information 2462 may be the image itself taken by the camera 21, and it is preferable that the shooting information 2462 includes information for identifying the individual of the composite member 10 to be shot from this image.
  • step S104 the processing unit 241 extracts a cutting line from the shooting information 2462 according to the image processing program 2454.
  • This extraction will be described with reference to FIG. FIG. 7 (a) is the same diagram as in FIG. 6 (a), and is a diagram schematically showing shooting information 2462.
  • FIG. 7B a cutting line is extracted using the information showing the relationship between the pixel and the luminance shown in FIG. 7C.
  • the processing unit 241 has a pixel value from the rubber frame portion 102-1 (right side in the drawing) shown in FIG. 7 (c), that is, a pixel whose brightness first shows a minimum value, and a glass portion 101-1 and rubber.
  • the cutting line 103 is extracted by sequentially specifying along the boundary region of the frame portion 102-1.
  • the processing unit 241 of this embodiment can execute the extraction of the cutting line 103 not only by using the minimum value of the luminance value but also by using the characteristics of the luminance value.
  • the processing unit 241 of the present embodiment may detect and realize a portion where the rate of change in the slope of the luminance is larger than a predetermined value. As shown in FIG. 7 (d), it may be difficult to determine the minimum value.
  • a minimum value may be formed, or a minimum value may not be obtained and the change in inclination may be large. Therefore, the processing unit 241 may extract the cutting line 103 by utilizing both the minimum value and the change in the slope.
  • the processing unit 241 extracts the burr end face on the left side from the cutting line 103 (see FIG. 7 (c)). In this case, the processing unit 241 sets the difference between the burr end face position and the cutting line 103 position as the burr size. In some cases, the cutting line 103 cannot be extracted, that is, there is no need for cutting without burrs. Similarly in this case, the processing unit 241 sets the difference between the burr end face position and the cutting line 103 position as the burr size (see FIG. 7 (d)).
  • the processing unit 241 stores information indicating the cutting line so that it is possible to determine which inner side of the rubber frame 102 is the cutting line.
  • step S105 the processing unit 241 determines the presence or absence of burrs based on the extracted cutting line 103 according to the image processing program 2454. This determination is executed as follows. If the cutting line 103 is not extracted, the processing unit 241 determines that there is no burr. Further, when the cutting line 103 is extracted, the processing unit 241 uses the tolerance 2461-4 of the design information 2461. That is, the processing unit 241 determines whether or not it is within the range of the tolerance 2461-4 from the cutting line. Using this result, the processing unit 241 determines that there is no burr if it is within the range.
  • this step it becomes possible to determine whether the rubber frame 102 needs to be cut. It is desirable that the processing unit 241 associates this determination result with the cutting line extracted in step S104 and stores it in the auxiliary storage unit 246.
  • step S106 the processing unit 241 determines, according to the image processing program 2454, whether or not the entire circumference of the upper surface side of the composite member 10, that is, each inner side of the rubber frame 102 on the upper surface side has been photographed. For this reason, it is desirable that the processing unit 241 use the design information 2461. As a result, if it is not completed (NO), the process returns to step S101. When finished (YES), the process of this flowchart ends.
  • the lower surface side can be similarly performed. That is, the lower surface side can be inspected by performing the same processing as in FIG. 4 on the composite member 10 installed on the XY stage 23 at the top and bottom (front and back) in the opposite direction to the above.
  • step S101 It is also possible to omit the use of the imaging conditions in step S101 for the inspection of the lower surface side. That is, if the burr 1021 is cut as described later, the burr 1021 on the upper surface that is not the inspection target is not photographed. Therefore, in the inspection of the lower surface, it is possible to omit step S101.
  • the cutting device for cutting burrs may be integrally configured with the visual inspection device 20, or may be configured with a separate housing. However, it is desirable to be connected to the control unit 24 which may be used in any case. Further, it may be realized as the manufacturing system 200 shown in the block diagram of FIG.
  • the manufacturing system 200 of FIG. 8 has an inspection station 201 having the same function as the visual inspection device 20, and a processing station 202 which is a kind of cutting device. That is, the camera 21 and the coaxial epi-illuminator 22 are installed in the inspection station 201. Further, a laser oscillator 25 for cutting burrs is installed in the processing station 202. In this embodiment, the laser oscillator 25 is used as the cutting device, but the present invention is not limited to this. Further, in this embodiment, cutting is taken as an example of processing, but other processing such as deletion is included. That is, a processing device other than the cutting device may be used.
  • control unit 24 is connected to each configuration of the inspection station 201 and the processing station 202, and outputs a control signal to control them.
  • the processing unit 241 of the control unit 24 may be provided at each of the inspection station 201 and the processing station 202.
  • In the manufacturing system 200, the processing unit 241 outputs, according to the XY stage control program 2453, a control signal to the XY stage 23-a instructing it to move the composite member 10 toward the processing station 202.
  • The XY stage 23-a then uses its drive function to move the composite member 10 to the processing station 202 side.
  • The result of calibrating the coordinates between the inspection station 201 and the processing station 202 is stored in the auxiliary storage unit 246.
  • The processing unit 241 outputs a control signal to the laser oscillator 25 according to a cutting control program (not shown).
  • This control signal instructs cutting, along the cutting line extracted in step S104, of each inner side of the rubber frame 102 that was determined to have a burr (YES) in step S105.
  • For this, the processing unit 241 uses the cutting-line information and the burr presence/absence information stored in the auxiliary storage unit 246. As a result, the laser oscillator 25 outputs a laser along the cutting line extracted in step S104 and cuts off the burr 1021 of the rubber frame 102 of the composite member 10.
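The mapping from the extracted cutting line to the laser path is not specified in the text above. As a rough illustration only, the cutting-line pixels could be converted into processing-station coordinates using the stored calibration, here assumed to be a simple scale-and-offset model (all names and values below are hypothetical):

```python
# Illustrative sketch (not from the patent text): converting a cutting line
# extracted in image coordinates at the inspection station into laser-head
# coordinates at the processing station, using a stored coordinate calibration.
# The calibration model (scale + offset per axis) is an assumption.

def pixel_to_laser(px, py, calib):
    """Map an image pixel to processing-station coordinates (mm)."""
    sx, sy = calib["scale_mm_per_px"]
    ox, oy = calib["offset_mm"]
    return (px * sx + ox, py * sy + oy)

def cutting_path(line_pixels, calib):
    """Turn the extracted cutting-line pixels into an ordered laser path."""
    return [pixel_to_laser(px, py, calib) for px, py in line_pixels]

calib = {"scale_mm_per_px": (0.05, 0.05), "offset_mm": (120.0, 80.0)}
path = cutting_path([(0, 0), (10, 0), (20, 0)], calib)
```

The laser oscillator would then be driven along `path` point by point; the interface for doing so is device-specific and not shown.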
  • In the above, the burr is cut after every inner side (the entire circumference) of the rubber frame 102 has been photographed. However, the burr may instead be cut off each time a side is photographed.
  • FIG. 9 is a block diagram showing the configuration of the visual inspection apparatus 20 in the second embodiment.
  • In the second embodiment, an illuminator 26 is added compared with the first embodiment; the other components are the same as in the first embodiment.
  • The illuminator 26 is connected to the control unit 24 and is installed so that it illuminates the object from a position and at an angle different from those of the coaxial epi-illuminator 22.
  • This second embodiment is suitable when it is difficult to extract the cutting line from the imaging information obtained under the illumination of the coaxial epi-illuminator 22.
  • When imaging information such as that shown in FIG. 12A is acquired using the illumination of the coaxial epi-illuminator 22, the difference between the pixel values of the burr portion 1021-1 and the rubber frame portion 102-1 is small, and extracting the cutting line may be difficult. In such a case, it is preferable to use this embodiment.
  • In the second embodiment, an appearance inspection is performed using two sets of imaging information obtained under two illuminations.
  • Here, two illuminations and two sets of imaging information are used, but three or more illuminations and sets of imaging information may be used.
  • the details of the process for this inspection will be described below with reference to the flowchart of FIG.
  • Step S100 is the same process as in the first embodiment.
  • Next, step S201 is executed; this process is basically the same as step S101 of the first embodiment, except that the posture and angle of the illuminator 26 are additionally specified.
  • That is, the processing unit 241 specifies, according to the image processing program 2454, the "illumination posture/position" of the illuminator 26 as an imaging condition for the composite member 10 to be inspected. This indicates the irradiation angle of the illuminator 26 with respect to the composite member 10. This irradiation angle differs from that of the coaxial epi-illuminator 22 and is obtained as described below. Accordingly, in imaging using the illuminator 26, the reflected light in a second direction, which differs from the first direction, is photographed.
  • For this purpose, the processing unit 241 uses the design information 2461a shown in FIG. 11 instead of the design information 2461. That is, the processing unit 241 specifies the illumination posture/position from the imaging conditions 2461-5a of the design information 2461a.
  • The illumination posture/position is recorded as a value optimized for the shape of the edge portion of the rubber frame 102 of the composite member 10. That is, the illumination posture/position is the optimum irradiation angle at which, when the illuminator 26 illuminates the object, the pixel value (for example, the brightness) near the cutting line exceeds a predetermined condition. This optimum value can be obtained from the shape of the rubber frame 102. Therefore, in this step, the processing unit 241 may derive it from the shape of the rubber frame 102 included in the design information 2461 or the design information 2461a.
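The text above states that the optimum irradiation angle can be obtained from the shape of the rubber frame 102 but does not give the derivation. As a rough illustration only, under a simple specular-reflection model (an assumption, not the patent's method), the angle maximizing the brightness near the cutting line can be found by brute force:

```python
import math

# Illustrative sketch of deriving an "illumination posture/position": under a
# simple specular-reflection model (an assumption), a surface element tilted
# by t degrees returns the most light toward the camera when the illumination
# arrives at about 2*t degrees from the camera axis. We brute-force the
# candidate angle that maximizes the modeled brightness near the edge.

def modeled_brightness(illum_deg, edge_tilt_deg):
    # Peak when illum_deg == 2 * edge_tilt_deg; falls off as a cosine lobe.
    return max(0.0, math.cos(math.radians(illum_deg - 2 * edge_tilt_deg)))

def optimal_irradiation_angle(edge_tilt_deg, candidates=range(0, 91)):
    return max(candidates, key=lambda a: modeled_brightness(a, edge_tilt_deg))

print(optimal_irradiation_angle(15))  # 30 under this model
```

In practice the brightness model would come from the edge geometry in the design information rather than the cosine stand-in used here.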
  • Next, step S102 is executed in the same manner as in the first embodiment; that is, the XY stage 23 moves the composite member 10 to the imaging position.
  • In step S203, the same image acquisition as in step S103 is performed, except that the illumination is changed from the coaxial epi-illuminator 22 to the illuminator 26. That is, in step S203, the processing unit 241 outputs to the camera 21, according to the camera control program 2451, a control signal for imaging under the conditions specified in step S201. Further, the processing unit 241 outputs to the illuminator 26, according to the coaxial epi-illuminator control program 2452 or an illumination control program (not shown), a control signal for illumination under the conditions specified in step S201.
  • Then, the processing unit 241 acquires the imaging information captured by the camera 21, that is, the reflected light in the second direction under the illumination of the illuminator 26, and stores it in the auxiliary storage unit 246.
  • Compared with the imaging information obtained with the coaxial epi-illuminator 22 in FIG. 12(a), this imaging information has the following characteristics: (1) pixels of both the burr portion 1021-1 on the upper surface side and the burr portion 1022-1 on the lower surface side are included, and (2) the difference in pixel values between the burr portions 1021-1 and 1022-1 and the rubber frame portion 102-1 is large, which makes it easier to extract the cutting line.
  • In step S204, the processing unit 241 performs, according to the image processing program 2454, the same processing as in step S104 of the first embodiment. That is, the processing unit 241 extracts the cutting line from the imaging information acquired in step S203. The cutting line is extracted using the characteristics of the pixel values, as in the first embodiment; since this extraction is the same as in Example 1, its details are omitted.
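The extraction details are given in Example 1, which is outside this excerpt. As one possible reading, suggested by the "minimum value" pixels mentioned later for step S205, each image row could contribute its darkest pixel to the cutting line (a sketch under that assumption, not the patent's exact algorithm):

```python
import numpy as np

# Illustrative sketch of cutting-line extraction: the patent only says the
# pixel-value characteristics are used. Here, each row of a grayscale image
# contributes the column of lowest brightness, on the assumption that the
# groove between the rubber frame and the burr images darker than either.

def extract_cutting_line(gray):
    """gray: 2-D uint8 array. Returns [(row, col), ...] of darkest pixels."""
    cols = np.argmin(gray, axis=1)
    return [(r, int(c)) for r, c in enumerate(cols)]

img = np.full((4, 6), 200, dtype=np.uint8)
img[:, 2] = 30          # a dark vertical groove between frame and burr
line = extract_cutting_line(img)
# line == [(0, 2), (1, 2), (2, 2), (3, 2)]
```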
  • In step S205, the processing unit 241 determines, according to the image processing program 2454, whether the burr portion 1021-1 on the upper surface side can be extracted from the imaging information acquired in step S203. That is, it determines whether the burr portion 1021-1 and the burr portion 1022-1 in FIG. 12B are present on the upper surface side.
  • As a criterion for this determination, for example, whether the pixel value of the glass portion 101-1 in FIG. 12B is at or above a certain value (for example, whether it is saturated) can be used.
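This saturation criterion might be implemented as follows; the threshold and the required fraction of pixels are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Illustrative check for the criterion of step S205: decide that the glass
# region has reached "a certain value or more" (e.g. sensor saturation) when
# enough of its pixels meet the threshold. Threshold/fraction are assumptions.

def glass_region_saturated(glass_pixels, threshold=250, fraction=0.9):
    """True if at least `fraction` of the glass-region pixels reach `threshold`."""
    glass_pixels = np.asarray(glass_pixels)
    return bool(np.mean(glass_pixels >= threshold) >= fraction)

print(glass_region_saturated([255, 255, 254, 255]))  # True
print(glass_region_saturated([120, 130, 125, 128]))  # False
```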
  • Note that step S205 may be omitted, in which case the process proceeds directly from step S204 to step S103.
  • In step S103, the same process as step S103 of the first embodiment is executed. That is, the processing unit 241 acquires imaging information using the coaxial epi-illuminator 22.
  • In step S206, the processing unit 241, according to the image processing program 2454, draws the cutting line extracted in step S204 onto the image of the composite member 10 and displays the result on the display unit 243.
  • Specifically, the processing unit 241 combines the imaging information acquired in step S103 with the cutting line extracted in step S204.
  • As a result, the processing unit 241 obtains information in which (1) the burr portion 1021-1 on the upper surface side is identified and (2) the cutting line 103 is extracted, as shown in FIG. 12(c).
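The composition of the coaxial-illumination image with the extracted cutting line could be sketched as follows; the drawing value used to mark the line is an illustrative assumption:

```python
import numpy as np

# Illustrative sketch of step S206's composition: the cutting-line pixels
# extracted under the oblique illumination are burned into a copy of the
# coaxial-illumination image, so both can be displayed together.

def overlay_cutting_line(image, line_pixels, value=0):
    """Return a copy of `image` with the cutting line drawn at `value`."""
    out = image.copy()
    for r, c in line_pixels:
        out[r, c] = value
    return out

base = np.full((3, 3), 180, dtype=np.uint8)
marked = overlay_cutting_line(base, [(0, 1), (1, 1), (2, 1)])
# marked[:, 1] is now 0 while `base` is unchanged
```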
  • Then, steps S105 and S106 are executed in the same manner as in the first embodiment. Also in the second embodiment, the burr can be cut in the same manner as in the first embodiment.
  • In the above, the imaging conditions (illumination brightness, illumination posture/angle) of both the coaxial epi-illuminator 22 and the illuminator 26 were obtained in step S201. However, only the imaging conditions of the illuminator 26 may be obtained there, and the imaging conditions of the coaxial epi-illuminator 22 may be obtained immediately before step S103.
  • Further, in the above, the image under the coaxial epi-illuminator 22 is captured (step S103) after the image under the illuminator 26 (step S203), but this order may be reversed.
  • In this case, in step S205, whether the cutting line could be extracted is determined as follows.
  • The processing unit 241 determines, according to the image processing program 2454, whether the cutting line could be extracted in step S203. This judgment can be made based on whether the number, or the length, of the pixels extracted as showing the minimum value is at or above a predetermined value. Alternatively, the processing unit 241 may display the pixels showing the minimum value over the imaging information on the display unit 243 and let the inspector make the determination. If it is determined that the cutting line could be extracted (YES), the process proceeds to step S106; if it could not be extracted (NO), the process proceeds to step S103.
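The count-or-length test described above could be sketched as follows; both thresholds are illustrative assumptions:

```python
# Illustrative sketch of the extraction-success test: extraction is judged
# successful when enough minimum-value pixels were found in total, or when
# they form a long enough run of consecutive rows. Thresholds are assumptions.

def extraction_succeeded(line_pixels, min_count=50, min_run=20):
    if len(line_pixels) >= min_count:
        return True
    # length of the longest run of vertically consecutive line pixels
    rows = sorted(r for r, _ in line_pixels)
    run = best = 1 if rows else 0
    for a, b in zip(rows, rows[1:]):
        run = run + 1 if b == a + 1 else 1
        best = max(best, run)
    return best >= min_run

print(extraction_succeeded([(r, 5) for r in range(60)]))  # True (count)
print(extraction_succeeded([(r, 5) for r in range(25)]))  # True (run of 25)
print(extraction_succeeded([(0, 5), (7, 5)]))             # False
```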
  • In this case as well, the imaging condition of the coaxial epi-illuminator 22 may be obtained in step S201, and the imaging condition of the illuminator 26 may be obtained immediately before step S203, which now comes later.
  • In the above, two physical illuminators, the coaxial epi-illuminator 22 and the illuminator 26, are used, but the device may instead be configured to acquire the two sets of imaging information with a single illuminator.
  • In this case, the illuminator is moved under the control of the control unit 24 before it irradiates.
  • For example, the processing unit 241 can acquire the two sets of imaging information by changing the angle of the reflector 222 according to the coaxial epi-illuminator control program 2452. In this case as well, as described above, three or more sets of imaging information may be acquired.
  • In Example 2 as well, the same burr cutting processing as in Example 1 may be performed; that is, the second embodiment can be applied to the manufacturing system 200 shown in FIG. 8. This concludes the description of the second embodiment.
  • FIG. 13 is a block diagram showing the configuration of the visual inspection device 20 in the third embodiment.
  • In the third embodiment, a camera 27, which is a photographing unit, is added compared with the first embodiment shown in FIG. 2.
  • The camera 27 is connected to the control unit 24, and the processing unit 241 controls its imaging according to the camera control program 2451.
  • The processing of this embodiment is basically the same as the flowchart of Example 2, except for steps S201, S203, and S103. The third embodiment will be described below with a focus on these steps.
  • In step S201, the design information 2461a shown in FIG. 11 was used in Example 2, but in this embodiment the design information 2461 shown in FIG. 5 is used, as in Example 1.
  • Next, step S203 is executed as follows.
  • First, the processing unit 241 outputs to the camera 21, according to the camera control program 2451, a control signal for imaging under the conditions specified in step S201. Further, the processing unit 241 outputs to the coaxial epi-illuminator 22, according to the coaxial epi-illuminator control program 2452, a control signal for illumination under those conditions.
  • Then, the processing unit 241 acquires the imaging information from the camera 21.
  • The imaging information acquired in this way can be expressed as shown in FIG. 12A, as in the second embodiment.
  • In step S103, the processing unit 241 outputs to the camera 27, according to the camera control program 2451, a control signal for imaging under the conditions specified in step S201. Further, as in step S203, the processing unit 241 outputs to the coaxial epi-illuminator 22, according to the coaxial epi-illuminator control program 2452, a control signal for illumination under the conditions specified in step S201.
  • Then, the processing unit 241 acquires the imaging information from the camera 27.
  • The imaging information acquired in this way can be expressed as shown in FIG. 12(b), as in the second embodiment.
  • The processes from step S206 onward are executed in the same manner as in the second embodiment.
  • The order of step S203 and step S103 may be exchanged, as in the second embodiment. In that case, it is desirable to change the processing of step S205 as in the second embodiment.
  • Alternatively, the processing unit 241 may output control signals for imaging to the camera 21 and the camera 27 in parallel according to the camera control program 2451.
  • It is also possible to move a single camera to acquire two or more sets of imaging information.
  • Here, the camera 21 and the camera 27 have different imaging angles, that is, different fields of view. Therefore, it is desirable that the processing unit 241 perform, according to the image processing program 2454, an image conversion that aligns the imaging information of the camera 21 with that of the camera 27.
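The patent does not specify the image conversion. A calibrated affine warp with nearest-neighbour sampling is one common choice for aligning two camera views and is sketched below under that assumption:

```python
import numpy as np

# Illustrative sketch of aligning the two cameras' imaging information.
# The 2x3 affine matrix would come from a prior calibration between the
# fields of view of camera 21 and camera 27 (an assumption, not specified
# in the patent). Nearest-neighbour sampling keeps the sketch dependency-free.

def warp_affine(src, matrix, out_shape):
    """Resample src so that dst(r, c) = src(A @ [r, c, 1])."""
    dst = np.zeros(out_shape, dtype=src.dtype)
    for r in range(out_shape[0]):
        for c in range(out_shape[1]):
            sr = int(round(matrix[0, 0] * r + matrix[0, 1] * c + matrix[0, 2]))
            sc = int(round(matrix[1, 0] * r + matrix[1, 1] * c + matrix[1, 2]))
            if 0 <= sr < src.shape[0] and 0 <= sc < src.shape[1]:
                dst[r, c] = src[sr, sc]
    return dst

img27 = np.arange(16, dtype=np.uint8).reshape(4, 4)
identity = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
aligned = warp_affine(img27, identity, (4, 4))
# identity transform leaves the image unchanged
```

In a production system this would more likely use an optimized library routine (e.g. an OpenCV-style `warpAffine`) with the same calibration matrix.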
  • The burr may be cut in the third embodiment as well, as in the first and second embodiments; that is, the third embodiment can be applied to the manufacturing system 200 shown in FIG. 8. This concludes the description of the third embodiment.
  • As described above, in each embodiment, the reflected light in the first direction with respect to the transmissive member is photographed, or imaging is performed so that the pixel value of the reflected light becomes a certain value or more.
  • Then, by further photographing the reflected light in the second direction, the shape of a defect occurring on the first surface (e.g., the upper surface on the imaging-device side) of the transmissive member constituting the structure is recognized.
  • In each embodiment, a burr of a member constituting the structure (the rubber frame 102) is given as an example of a defect, but the object in which the defect occurs is not limited to this. For example, recognizing an object that does not form part of the structure, such as adhering impurities, is also included in each embodiment.
  • Although burrs have been described as an example of defects in the embodiments, the embodiments can also be applied to inspecting a structure by recognizing the shape of other defects, such as parting lines, chipping, and excess or deficiency with respect to dimensions such as thickness.
  • Although the glass 101 was used as the transparent member, another transparent member such as an acrylic plate may also be used. Further, in the embodiments, the rubber frame 102 is used as the member in which a defect occurs, but other members and materials are also applicable; the other member may itself be a transparent member such as plastic or glass.
  • In summary, each embodiment can be applied to a structure in which a defect occurs at an interface of the transparent member constituting the structure.
  • Here, a defect occurring at an interface means that, when the structure is observed from a certain direction, there are both a defect observed through the transmissive member and a defect observed directly, without passing through it. Therefore, the interface associated with a defect is not limited to the upper and lower surfaces; designations such as front surface and back surface, or front surface and side surface, do not matter, and each interface can be expressed as a first surface, a second surface, and so on. Further, the shape of the interface includes not only flat surfaces but also curved surfaces.


Abstract

The purpose of the present invention is to recognize defects such as burrs occurring on a first surface and a second surface of a transparent member of a structure such as a composite member 10. To this end, in an appearance inspection device 20 for inspecting the structure, light reflected in a first direction relative to the transparent member (glass) of the composite member 10 is imaged with a camera 21, or imaging is performed such that the pixel value of the reflected light is greater than or equal to a fixed value, and the shape of defects relating to the first surface, among the interfaces of the transparent member, is recognized. Here, the first surface means the interface of the transparent member on the side of the camera 21, the imaging device.

Description

Appearance inspection method, appearance inspection device, and method and device for processing a structure
The present invention relates to a technique for inspecting the appearance of a structure composed of a plurality of members, particularly a structure including a transparent member and other members. Furthermore, the present invention also relates to a technique for processing a structure using the inspection results. Note that the transparent member need only transmit a predetermined amount or more of the inspection light used for the inspection, and so-called translucent members are also included.
Currently, various structures are produced at production sites such as factories, but they are not always produced as designed. That is, defects may occur in the members constituting a structure. For example, protrusions called parting lines or burrs may occur, and parts may be missing relative to the design. Therefore, at production sites, the appearance of the structure is inspected. In such appearance inspection, it is common to perform the inspection by applying image processing (including image recognition) to images input from a photographing device.
Here, if the structure includes a transparent member, a highly accurate visual inspection may be difficult. In inspecting such a structure, the presence of the transparent member must be taken into account.
For example, Patent Document 1 discloses "a foreign matter inspection device capable of inspecting, with high accuracy, foreign matter adhering to the front surface and the back surface of a test object". This "foreign matter inspection device includes a light projecting unit that projects inspection light from a light projection position so that the inspection light is obliquely incident on the light-transmitting test object, a light receiving unit that receives, at a light receiving position, light scattered by the foreign matter under the inspection light, and a processing unit that processes the light receiving result of the light receiving unit to determine the presence or absence of the foreign matter."
Japanese Unexamined Patent Application Publication No. 2016-133357
Here, some structures require inspection for defects involving other members on at least two of the interfaces that delimit a transparent member (a member having transparency). FIG. 1 shows a composite member 10 as an example of such a structure. The composite member 10 is formed by integrally molding glass 101 and a rubber frame 102 provided on the outer peripheral surface of the glass 101. FIG. 1(a) is a cross-sectional view of the composite member 10, and FIG. 1(b) is its top view. Here, as shown in FIG. 1(a), burr 1021 and burr 1022 defects occur in the rubber frame 102 of the composite member 10, on the upper-surface side and the lower-surface side of the glass 101, respectively. In such a case, the inspection of the defects must distinguish and recognize the burr on the upper surface from the burr on the lower surface. In the example of FIG. 1, the burrs on the upper and lower surfaces of the glass 101, that is, the defects, are recognized, and this also amounts to recognizing the rubber frame 102, that is, the member.
However, although Patent Document 1 describes detecting the presence of foreign matter, it does not consider recognizing defects related to the shape of a member. The present invention therefore relates to the inspection of a structure in which a defect involving another member, a part constituting another member, or something related to another member occurs at an interface of the transparent member, and addresses the following problem: to recognize the shape of a shape-related defect in a structure having a transparent member and other members.
To solve the above problem, in the present invention, in the visual inspection of a structure, the photographing unit photographs the reflected light in a first direction with respect to the transparent member, or photographs so that the pixel value of the reflected light becomes a certain value or more, and the shape of a defect related to a first surface among the interfaces of the transparent member is thereby recognized. Here, the first surface means the interface of the transparent member on the imaging-unit side.
In general production, defects in the narrow sense, such as burrs, may not occur (i.e., the part remains within design tolerances). Even in such cases, the present invention can recognize the shape of the members constituting the structure and of their parts. Therefore, the shape of a "defect" in this specification also includes members and parts within the design tolerance range.
More specifically, the present invention is a visual inspection method using a visual inspection device that inspects a structure including a transparent member for defects. The visual inspection device has a holding unit that holds the structure, a photographing unit that photographs the held structure, an illumination unit that irradiates at least a first surface of the transparent member of the held structure with illumination for photographing by the photographing unit, and a control unit that controls the holding unit, the photographing unit, and the illumination unit. The control unit controls at least one of the illumination unit and the photographing unit so that, in photographing the first surface of the transparent member with the photographing unit, the pixel value of at least a part of the first surface becomes equal to or greater than a predetermined value. When the pixel value becomes equal to or greater than the predetermined value, the control unit controls the photographing unit to photograph, for at least a part of the structure, the reflected light in a first direction with respect to the first surface, and the shape of a defect related to the first surface of the structure is recognized using the photographing information of the photographed part of the structure.
Here, controlling the pixel value to be equal to or greater than the predetermined value includes at least one of (1) having the illumination unit irradiate illumination from a predetermined first direction, (2) having the illumination unit irradiate illumination of a predetermined brightness or more, and (3) controlling the gain of the photographing unit during photographing.
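Option (3), gain control, could be sketched as follows. The `capture` function below is a stand-in for a real camera interface, and all names and values are illustrative assumptions:

```python
# Illustrative sketch of raising camera gain until the monitored first-surface
# pixel value reaches the predetermined value. The `capture` model (linear
# response clipped to 8 bits) and every threshold here are assumptions.

def capture(scene_brightness, gain):
    """Stand-in for a camera exposure: clipped 8-bit response."""
    return min(255, int(scene_brightness * gain))

def raise_gain_until(scene_brightness, target=250, gain=1.0, step=0.5, max_gain=16.0):
    """Increase gain until the observed pixel value reaches `target`."""
    while capture(scene_brightness, gain) < target and gain < max_gain:
        gain += step
    return gain, capture(scene_brightness, gain)

gain, value = raise_gain_until(scene_brightness=60)
# value >= 250 once the loop exits (gain has been raised sufficiently)
```

The same loop structure would apply to option (2), raising illumination brightness instead of gain.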
The present invention also includes the above-described visual inspection device and an information processing device for realizing the control unit, as well as a computer program for causing the information processing device to function and a storage medium storing the program.
Furthermore, the present invention also includes a processing method and a processing device that use the results of the visual inspection, together with a computer program for executing the processing method and a storage medium storing the program.
According to the present invention, the appearance of a structure including a transparent member can be inspected more accurately. Problems, configurations, and effects other than those described above will be clarified in the following description of the embodiments.
FIG. 1 shows the composite member to be inspected in each embodiment.
FIG. 2 is a block diagram showing the configuration of the visual inspection device in the first embodiment.
FIG. 3 is a block diagram showing the configuration of the control unit in the first embodiment.
FIG. 4 is a flowchart showing the processing in the first embodiment.
FIG. 5 shows the design information used in the first embodiment.
FIG. 6 shows (a) imaging information in the first embodiment and (b) an example of an image when the imaging conditions are not used.
FIG. 7 shows (a) imaging information in the first embodiment, (b) the process of extracting a cutting line from the imaging information, (c) the relationship between brightness and pixels in the imaging information, and (d) the relationship between brightness and pixels when the minimum value is difficult to discriminate.
FIG. 8 is a block diagram showing the configuration of the manufacturing system in the first embodiment.
FIG. 9 is a block diagram showing the configuration of the visual inspection device in the second embodiment.
FIG. 10 is a flowchart showing the processing in the second embodiment.
FIG. 11 shows the design information used in the second embodiment.
FIG. 12 shows (a), (b) imaging information in the second and third embodiments and (c) the extraction of the cutting line in the second and third embodiments.
FIG. 13 is a block diagram showing the configuration of the visual inspection device in the third embodiment.
Hereinafter, each embodiment of the present invention will be described with reference to the drawings. First, the inspection target common to all embodiments is described. In each embodiment, the composite member 10 shown in FIG. 1 is used as the example to be inspected. This composite member 10 is used as a window of a railroad vehicle, and its glass portion and the frame supporting it are manufactured integrally. The composite member 10 is then installed in a railroad vehicle as a so-called unit frame. In the present invention, however, the composite member 10 is not limited to installation in railroad vehicles; it can also be installed in various means of transportation such as automobiles, aircraft, and ships, and in structures such as buildings.
As described above, in the composite member 10 to be inspected, the glass 101, a transparent member, is surrounded and held by the rubber frame 102. As shown in FIG. 1(a), burrs 1021 and 1022 occur in the rubber frame 102 on the upper-surface side and the lower-surface side of the glass 101, respectively. These burrs include thorn-like burrs about 0.1 mm thick, strip-like burrs about 0.1 to 0.3 mm thick, and granular burrs 0.1 mm thick or less.
Here, when inspecting the appearance of the composite member 10, it is photographed from the upper-surface side using a photographing device or photographing unit such as a camera, for example from the direction of the arrow shown in FIG. 1(a). In this case, burrs 1021 occur on the upper-surface side of the rubber frame 102, and burrs 1022 on the lower-surface side. Observed from the arrow direction, both the burr 1021 and the burr 1022 are visible because the glass 101 is transparent. Consequently, when these are photographed, not only the burr 1021 but also the burr 1022 is captured: both the burr 1021 facing the photographing device and the burr 1022, seen through the light transmitted through the glass 101, appear in the image. This means the burr 1021 and the burr 1022 cannot be distinguished, so the burr 1021 cannot be extracted or recognized, and the cutting position for removing the burr 1021 cannot be specified.
Therefore, each embodiment of the present invention executes processing for recognizing the burr 1021, in other words, for excluding the burr 1022 from the recognition result. In this way, each embodiment performs recognition processing on an image of the inspection target, such as the rubber frame 102, that takes into account the influence of a transparent member such as the glass 101.
First, Example 1 will be described. FIG. 2 is a block diagram showing the configuration of the visual inspection apparatus 20 in this example. The visual inspection apparatus 20 includes a camera 21 that photographs the composite member 10, which is the inspection object; a coaxial epi-illuminator 22 that illuminates the composite member 10; an XY stage 23 that holds the composite member 10; and a control unit 24 that controls each of these components, that is, the visual inspection apparatus 20 as a whole.
Each component of the visual inspection apparatus 20 will now be described. The camera 21 is one implementation of a photographing device or photographing unit having the function of photographing the composite member 10. The camera 21 is not limited to capturing still images; it may capture moving images.
The coaxial epi-illuminator 22 functions as an illumination unit and is one implementation of an illumination unit that illuminates the composite member 10 photographed by the camera 21. The coaxial epi-illuminator 22 reflects the illumination (light) from a light source 221 with a reflector 222 such as a half mirror and emits illumination parallel to the optical axis of the camera 21. As a result, the illumination from the coaxial epi-illuminator 22 can be applied to at least part of the composite member 10. Although this example uses coaxial epi-illumination, which illuminates parallel to the optical axis of the camera 21, other types of illumination may be used.
Next, the XY stage 23 is one implementation of a holding unit having the function of holding the composite member 10. The XY stage 23 holds the composite member 10 so that its upper surface, the first surface serving as the inspection surface, directly faces the camera 21 and the coaxial epi-illuminator 22. Here, the direction in which the upper surface directly faces them means a direction in which the upper surface, the inspection surface, is suited to inspection. This direction can therefore be determined as appropriate according to the shape and characteristics of the inspection target and its inspection surface, and the required inspection accuracy. To orient the member in this direction, the XY stage 23 may operate under the control unit 24, as described later, or an inspector may manually operate the XY stage 23 into that orientation. Although not shown, the XY stage 23 has a drive mechanism, typified by a motor, for driving it.
Finally, the control unit 24 controls the operations of the camera 21, the coaxial epi-illuminator 22, and the XY stage 23; that is, it controls the operation of the visual inspection apparatus 20. For this purpose, the control unit 24 is connected to each of the camera 21, the coaxial epi-illuminator 22, and the XY stage 23, and exchanges control information with them.
The control unit 24 may be configured integrally with the visual inspection apparatus 20 or as a separate housing. In the former case, the above connections can be realized by an internal communication path; in the latter case, connection is made possible by a network. In the latter case, moreover, the control unit 24 can be realized by an information processing device such as a PC, a tablet, or a server. The configuration of a control unit 24 realized in a separate housing in this way will be described with reference to FIG. 3. When the control unit 24 is configured integrally with the visual inspection apparatus 20, the basic configuration is the same as in the separate-housing case; the differences are noted in the description of FIG. 3.
FIG. 3 is a block diagram showing the configuration of the control unit 24. The control unit 24 includes a processing unit 241, an input unit 242, a display unit 243, a communication I/F 244, a main storage unit 245, and an auxiliary storage unit 246. These are connected to one another via an internal communication path such as a bus. Each component is described below.
First, the processing unit 241 is realized by an arithmetic device such as a CPU. That is, it executes processing according to each program stored in, or loaded into, the main storage unit 245. The details of this processing will be described later. In this example, the functions of the control unit 24 are realized using programs, that is, software, but they may be realized in hardware. In other words, the processing unit 241 may be configured as a circuit that executes the processing described later.
Next, the input unit 242 can be realized by an input device such as a keyboard or a pointing device, and accepts operations by an inspector. These operations include inspection operations such as starting and stopping the camera 21, as well as handling failures. When the visual inspection apparatus 20 is operated automatically, the input unit 242 may be omitted.
Next, the display unit 243 can be realized as a so-called display and outputs various information. The display unit 243 may be formed integrally with the input unit 242, as in a touch panel. Furthermore, when the visual inspection apparatus 20 is operated automatically, the display unit 243 may be omitted.
Next, the communication I/F 244 has the function of connecting to the various components of the visual inspection apparatus 20 via a network. In this example it is connected to a single visual inspection apparatus 20, but it may be connected to multiple apparatuses. When the control unit 24 is configured integrally with the visual inspection apparatus 20, this component may be omitted and the control unit 24 connected directly to the other components.
Next, the main storage unit 245 is realized by a storage medium such as a memory. Each program for the processing performed by the processing unit 241 is loaded into the main storage unit 245. For this reason, each program is usually best stored in another storage unit or storage medium, such as the auxiliary storage unit 246. The programs include various programs for controlling the visual inspection apparatus 20, exemplified by the following:
A camera control program 2451 that controls the activation of the camera 21
A coaxial epi-illuminator control program 2452 that controls the activation of the coaxial epi-illuminator 22
An XY stage control program 2453 that controls the operation of the XY stage 23
Furthermore, the main storage unit 245 also stores, as a program, an image processing program 2454 for executing various image processing on the photographing information captured by the camera 21. The details of these programs will be described later.
Finally, the auxiliary storage unit 246 stores various information, including design information 2461 on the composite member 10 and photographing information 2462 captured by the camera 21. The auxiliary storage unit 246 can be realized by a hard disk drive (HDD), a solid-state drive (SSD), various optical disks, or the like. The auxiliary storage unit 246 may also store the various programs described above. Furthermore, the information stored in the auxiliary storage unit 246 may instead be stored in an external storage device connected via the communication I/F 244.
Next, the details of the various processes performed by the processing unit 241 will be described with reference to FIG. 4. FIG. 4 is a flowchart showing the processing performed by the processing unit 241 in this example.
First, in step S100, the processing unit 241 activates the camera 21 and the coaxial epi-illuminator 22 according to the camera control program 2451 and the coaxial epi-illuminator control program 2452. The activation conditions include an inspector's operation via the input unit 242 and reception of a detection signal from the XY stage 23 indicating that the composite member 10 is being held. At this time, composite member identification information identifying the composite member 10 held on the XY stage 23 is included in the operation on the input unit 242 or in the detection signal, and the processing unit 241 identifies it.
Next, in step S101, the processing unit 241 specifies the photographing conditions for the composite member 10 to be inspected, according to the image processing program 2454. For this purpose, the processing unit 241 reads the photographing conditions for the relevant composite member 10 from the design information 2461 in the auxiliary storage unit 246. An example of this design information 2461 is shown in FIG. 5. The design information 2461 includes composite member identification information 2461-1 identifying the composite member to be inspected and members 2461-2 identifying the individual members constituting it. It is desirable to attach to each member 2461-2 information indicating whether it is a transparent member that affects the inspection target and whether it is itself an inspection target.
The design information 2461 also includes dimensions 2461-3 indicating the size and shape of each member and tolerances 2461-4 for judging whether corrective machining of those dimensions is required. Furthermore, the design information 2461 includes, for each composite member, photographing conditions 2461-5, the conditions for photographing it. The photographing conditions 2461-5 preferably include the illumination brightness and the camera gain of the camera 21. Instead of the camera gain, the ISO sensitivity, aperture, and shutter speed of the camera 21 that realize that camera gain may be stored.
In step S101, the record corresponding to the composite member identification information specified in step S100 is identified in the design information 2461. For example, when #1 is specified in step S100, the first record shown in FIG. 5 is identified. The processing unit 241 then specifies the photographing conditions 2461-5 contained in the identified record. In the example of FIG. 5, the processing unit 241 specifies "a" as the illumination brightness and "b" as the camera gain. These photographing conditions represent conditions under which the pixel values of the light reflected by the transparent member reach or exceed a fixed level. The pixel values include luminance. An example of pixel values at or above a fixed level is saturation of the photographing information, such as so-called white-out (overexposure).
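The record lookup of step S101 can be sketched as follows. This is a minimal illustration only: the field names and table contents are hypothetical stand-ins for the design information 2461 of FIG. 5, not actual data from this example.

```python
# Hypothetical sketch of the design information 2461 and the step S101 lookup.
# Field names and values are illustrative assumptions, not the actual table.
DESIGN_INFO_2461 = [
    {"member_id": "#1", "illumination_brightness": "a", "camera_gain": "b"},
    {"member_id": "#2", "illumination_brightness": "c", "camera_gain": "d"},
]

def specify_photographing_conditions(member_id):
    """Return (illumination brightness, camera gain) for the member identified in step S100."""
    for record in DESIGN_INFO_2461:
        if record["member_id"] == member_id:
            return record["illumination_brightness"], record["camera_gain"]
    raise KeyError("no design record for " + member_id)
```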
Next, in step S102, the processing unit 241 outputs a control signal for moving the XY stage 23, according to the XY stage control program 2453. At this time, the processing unit 241 identifies the position of the composite member 10 photographed by the camera 21 and outputs a control signal to the XY stage 23 so that the cutting line for cutting off the burr falls within the field of view of the camera 21. This cutting line can be specified by the processing unit 241 using the dimensions in the design information 2461.
In this step, the processing unit 241 may instead, according to the camera control program, output to the camera 21 a control signal that moves the camera so that the cutting line falls within the field of view of the camera 21, as described above. In this way, it suffices to control the relative positional relationship between the camera 21 and the XY stage 23 (or the composite member 10 it holds) so that the cutting line is included in the field of view of the camera 21. The cutting line is one kind of machining position for machining the member according to the design.
Next, in step S103, the processing unit 241 outputs to the camera 21, according to the camera control program 2451, a control signal for photographing under the photographing conditions specified in step S101. The processing unit 241 also outputs to the coaxial epi-illuminator 22, according to the coaxial epi-illuminator control program 2452, a control signal for illuminating under the photographing conditions specified in step S101.
As a result, the coaxial epi-illuminator 22 provides illumination under the illumination conditions included in the photographing conditions specified in step S101. Here, the coaxial epi-illuminator 22 produces reflected light in a first direction from the composite member 10, or from the glass 101. This reflected light is desirably substantially specular reflected light, meaning reflected light that includes the specular reflection and falls within a fixed range of the specular reflection angle. The reflected light in the first direction can also be defined as light whose pixel values in the photographing information are at or above a fixed value; an example is saturation, including so-called white-out. In this example, the reflected light in the first direction is described as substantially specular reflected light that causes saturation.
The camera 21 then photographs, including the substantially specular reflected light, using the output control signal, that is, the camera gain specified in step S101. In other words, the camera 21 captures an image in which the pixel values of part of the glass 101 are saturated.
As described above, it is sufficient for the pixel values to reach or exceed a fixed value, so in step S103 the processing unit 241 may perform at least one of the following controls: (1) having the coaxial epi-illuminator 22 illuminate from a predetermined first direction, as described above; (2) having the coaxial epi-illuminator 22 illuminate at or above a predetermined brightness; and (3) controlling the gain used by the camera 21 for photographing. That is, any one of these, or a combination of them, need only realize photographing in which the pixel values reach or exceed a fixed value.
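The requirement that the pixel values of the transparent member reach or exceed a fixed level could be checked as in the sketch below. The 8-bit saturation level of 255 and the 95% fraction are assumptions chosen for illustration; the example itself leaves the concrete thresholds unspecified.

```python
SATURATION_LEVEL = 255  # assumed 8-bit sensor: "white-out" pixels sit at the maximum value

def glass_region_is_saturated(pixels, fraction=0.95):
    """Return True if at least `fraction` of the glass-region pixels are saturated.

    `pixels` is a flat sequence of pixel values sampled from the glass portion 101-1.
    """
    if not pixels:
        return False
    saturated = sum(1 for p in pixels if p >= SATURATION_LEVEL)
    return saturated / len(pixels) >= fraction
```

A check like this could drive the trial-photographing loop mentioned later, repeating capture and judgment until the condition holds.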
Here, setting the flowchart aside for a moment, the significance of photographing the reflected light in the first direction in this example will be explained. FIG. 6(a) schematically shows an example of the photographing information 2462 acquired in step S103. As shown in FIG. 6(a), the glass portion 101-1 of the photographing information 2462 is saturated. On the right side of FIG. 6(a) is the rubber frame portion 102-1, in which a burr portion 1021-1 representing the burr 1021 generated on the upper-surface side of the glass 101 appears.
By contrast, FIG. 6(b) shows an image captured without the photographing conditions of step S101. In the image of FIG. 6(b), the glass portion 101-1 is not saturated. Consequently, a burr portion 1022-1 representing the burr 1022 generated on the lower-surface side of the glass 101 is also included in the photographing information. Thus, when the photographing conditions specified in step S101 are not used, the lower-surface burr portion 1022-1 is also included in the photographing information 2462, making it difficult to identify the burr portion 1021-1 and, in turn, difficult to specify an accurate cutting line for removing the burr.
Note that FIGS. 6(a) and 6(b) schematically show the photographing information and the image, respectively. The photographing information 2462 shown in FIG. 6(a) may therefore include a negligible amount of the burr portion 1022-1.
Furthermore, in this example, the photographing conditions 2461-5 included in the design information 2461 are used in step S101. However, a trial photograph may instead be taken with the camera 21, and the processing unit 241 may judge from the result, according to the image processing program 2454, whether the glass portion 101-1 is saturated. In that case, photographing by the camera 21 and judgment by the processing unit 241 are repeated. Alternatively, an inspector may judge whether saturation has occurred using the image displayed on the display unit 243.
This concludes the explanation of the significance of using the reflected light in the first direction; the description now returns to the flowchart of FIG. 4.
In step S103, the photographing information, that is, the image captured by the camera 21, is output to the control unit 24. In response, the processing unit 241 of the control unit 24 stores this photographing information in the auxiliary storage unit 246 as photographing information 2462. The photographing information 2462 may be the image captured by the camera 21 itself, but preferably includes, together with the image, information identifying the individual composite member 10 being photographed.
Next, in step S104, the processing unit 241 extracts the cutting line from the photographing information 2462 according to the image processing program 2454. This extraction is explained with reference to FIG. 7. FIG. 7(a) is the same as FIG. 6(a) and schematically shows the photographing information 2462. FIG. 7(b) shows the cutting line extracted using the information in FIG. 7(c), which shows the relationship between pixel position and luminance. For example, starting from the rubber frame portion 102-1 (the right side of FIG. 7(c)), the processing unit 241 finds the pixel whose value, that is, luminance, first shows a local minimum, and by identifying such pixels sequentially along the boundary region between the glass portion 101-1 and the rubber frame portion 102-1, it extracts the cutting line 103.
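The search for the first luminance minimum along a profile like that of FIG. 7(c) can be sketched as follows. The scan direction (from the rubber-frame side at the right end of the profile) follows the description above, while the profile values in the usage are hypothetical.

```python
def first_local_minimum(profile):
    """Scan the luminance profile from the rubber-frame side (right end) toward the
    glass and return the index of the first local minimum, taken as the position of
    the cutting line 103 on this scan line; return None if no minimum is found."""
    for i in range(len(profile) - 2, 0, -1):
        if profile[i] < profile[i - 1] and profile[i] < profile[i + 1]:
            return i
    return None
```

The cutting line 103 would then be traced by repeating this search on each scan line along the boundary between the glass portion 101-1 and the rubber frame portion 102-1.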
The processing unit 241 of this example can extract the cutting line 103 using characteristics of the luminance values other than the local minimum. For example, the processing unit 241 may detect points where the rate of change of the luminance gradient exceeds a predetermined value. As shown in FIG. 7(d), a local minimum may be difficult to identify: depending on the shape of the rubber frame 102, the composite member 10 may exhibit a local minimum, or may show no local minimum but a large change in gradient. The processing unit 241 may therefore extract the cutting line 103 using both the local minimum and the change in gradient.
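The gradient-based alternative for the case of FIG. 7(d) could look like the following sketch. The threshold value is an assumption, since the example only states that the rate of change of the gradient is compared with a predetermined value.

```python
def first_gradient_change(profile, threshold):
    """Scan from the rubber-frame side (right end) and return the first index where
    the change in luminance gradient exceeds `threshold`; None if never exceeded.
    Intended as a fallback when the profile has no clear local minimum."""
    for i in range(len(profile) - 2, 0, -1):
        slope_right = profile[i + 1] - profile[i]
        slope_left = profile[i] - profile[i - 1]
        if abs(slope_right - slope_left) > threshold:
            return i
    return None
```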
When the processing unit 241 has extracted the cutting line, it extracts the burr end face to the left of the cutting line 103 (see FIG. 7(c)). In this case, the processing unit 241 takes the difference between the burr end face position and the cutting line 103 position as the burr size. There are also cases where the cutting line 103 cannot be extracted, that is, where there is no burr and no cutting is needed. In such cases as well, the processing unit 241 takes the difference between the burr end face position and the cutting line 103 position as the burr size (see FIG. 7(d)).
For this reason, in this example it is desirable to store such information showing the relationship between luminance and pixel position in the auxiliary storage unit 246, either as the photographing information 2462 or in association with it. In this case, the processing unit 241 stores the information indicating the cutting line in such a way that it can be determined which inner side of the rubber frame 102 the cutting line belongs to.
Next, in step S105, the processing unit 241 judges the presence or absence of a burr based on the extracted cutting line 103, according to the image processing program 2454. This judgment is performed as follows. If no cutting line 103 has been extracted, the processing unit 241 judges that there is no burr. If a cutting line 103 has been extracted, the processing unit 241 uses the tolerance 2461-4 in the design information 2461: it judges whether the burr lies within the tolerance 2461-4 of the cutting line, and if it is within that range, judges that there is no burr.
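The step S105 judgment can be sketched as a simple tolerance comparison. The numeric positions and the tolerance are hypothetical; positions are taken as distances along the scan line in the same unit as the tolerance 2461-4.

```python
def burr_present(cutting_line_pos, burr_end_face_pos, tolerance):
    """Step S105 sketch: no burr if no cutting line was extracted, or if the burr
    end face lies within the design tolerance 2461-4 of the cutting line 103."""
    if cutting_line_pos is None:
        return False
    return abs(burr_end_face_pos - cutting_line_pos) > tolerance
```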
As a result of this judgment, this step makes it possible to determine whether the rubber frame 102 requires cutting. The processing unit 241 desirably stores this judgment result in the auxiliary storage unit 246 in association with the cutting line extracted in step S104.
Then, in step S106, the processing unit 241 judges, according to the image processing program 2454, whether photographing has been completed for the entire circumference of the upper-surface side of the composite member 10, that is, for each inner side of the rubber frame 102 on the upper-surface side. For this purpose, the processing unit 241 desirably uses the design information 2461. If photographing has not been completed (NO), the process returns to step S101; if it has been completed (YES), the processing of this flowchart ends.
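The loop of steps S101 to S106 over the inner sides of the rubber frame 102 can be sketched as below. The camera, extraction, and judgment operations are passed in as stand-in callables, since the concrete devices are controlled through the programs 2451 to 2454; all names here are illustrative.

```python
def inspect_upper_surface(inner_sides, photograph, extract_cutting_line, judge_burr):
    """Sketch of steps S101-S106: photograph each inner side of the rubber frame,
    extract its cutting line, judge the burr, and stop when all sides are done."""
    results = {}
    for side in inner_sides:
        image = photograph(side)                    # steps S101-S103
        cutting_line = extract_cutting_line(image)  # step S104
        results[side] = (cutting_line, judge_burr(cutting_line))  # step S105
    return results                                  # step S106: all sides covered
```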
Although the above flowchart shows the inspection of the upper-surface side of the composite member 10, the lower-surface side can be inspected in the same way. That is, by placing the composite member 10 on the XY stage 23 upside down (front and back reversed) relative to the above and performing the same processing as in FIG. 4, the lower-surface side can be inspected.
For the inspection of the lower-surface side, the use of the photographing conditions of step S101 can also be omitted. That is, if the burr 1021 is cut off as described later, the upper-surface burr 1021, which is not an inspection target, is not photographed. For this reason, step S101 can be omitted in the inspection of the lower surface.
This concludes the description of the processing in the visual inspection of Example 1. In Example 1, machining of the burr, such as the burr cutting described below, may additionally be performed.
In this example, the cutting device that cuts the burr may be configured integrally with the visual inspection apparatus 20 or as a separate housing; in either case, it is desirably connected to the control unit 24. It may also be realized as the manufacturing system 200 shown in the block diagram of FIG. 8. The manufacturing system 200 of FIG. 8 has an inspection station 201 with the same functions as the visual inspection apparatus 20 and a machining station 202, which is a kind of cutting device. That is, the camera 21 and the coaxial epi-illuminator 22 are installed in the inspection station 201, and a laser oscillator 25 for cutting burrs is installed in the machining station 202. In this example, the laser oscillator 25 is used as the cutting device, but the cutting device is not limited to this. Moreover, although cutting is given here as the example of machining, other machining such as removal is also included; that is, a machining device other than a cutting device may be used.
The control unit 24 is connected to each component of the inspection station 201 and the machining station 202 and controls them by outputting control signals. A processing unit 241 of the control unit 24 may be provided in each of the inspection station 201 and the machining station 202.
Next, the processing in the manufacturing system 200 of FIG. 8 will be described briefly. After the processing in the flowchart of FIG. 4 is completed, the processing unit 241 outputs to the XY stage 23-a, according to the XY stage control program 2453, a control signal that moves the composite member 10 to the machining station 202 side. As a result, as shown in FIG. 8, the XY stage 23-a uses its drive function to move the composite member 10 to the machining station 202 side. To enable this movement, the results of calibrating the coordinates of the inspection station 201 and the machining station 202 are stored in advance in the auxiliary storage unit 246.
Then, the processing unit 241 outputs a control signal to the laser oscillator 25 according to a cutting control program (not shown). This control signal instructs cutting, along the cutting line extracted in step S104, of any inner side of the rubber frame 102 that was determined in step S105 to have a burr (YES).
For this purpose, the processing unit 241 uses the cutting line and the information indicating the presence or absence of burrs stored in the auxiliary storage unit 246. As a result, the laser oscillator 25 emits a laser along the cutting line extracted in step S104 and cuts off the burr 1021 of the rubber frame 102 of the composite member 10.
In this embodiment, the burrs are cut after each inner side (the entire circumference) of the rubber frame 102 has been photographed. However, the cutting may instead be executed each time a photograph is taken.
This concludes the description of Example 1; Example 2 is described below.
Compared with the visual inspection device 20 of Example 1, Example 2 adds an illuminator 26 to the illumination unit. FIG. 9 is a block diagram showing the configuration of the visual inspection device 20 in Example 2. As described above, the illuminator 26 is added, and the other components are the same as in Example 1. The illuminator 26 is connected to the control unit 24 and is installed so that it illuminates the object from a position and at an angle different from those of the coaxial epi-illuminator 22. Example 2 is suitable when the cutting line is difficult to extract from photographing information obtained under the illumination of the coaxial epi-illuminator 22.
For example, when photographing information such as that of FIG. 12(a) is acquired using the illumination of the coaxial epi-illuminator 22, the difference in pixel value between the burr portion 1021-1 and the rubber frame portion 102-1 may be small, making it difficult to extract the cutting line. This embodiment is suitable for such cases.
For this reason, in this embodiment the appearance inspection is performed using two sets of photographing information obtained with two illuminators. Although two illuminators and two sets of photographing information are used here, three or more may be used. The details of the processing for this inspection are described below with reference to the flowchart of FIG. 10.
First, step S100 is the same processing as in Example 1. Next, step S201 is executed; this processing is basically the same as step S101 of Example 1, except that the attitude and angle of the illuminator 26 are additionally specified.
Specifically, the processing unit 241 specifies, according to the image processing program 2454, the "illumination attitude/position" of the illuminator 26 as a photographing condition for the composite member 10 to be inspected. This indicates the irradiation angle of the illuminator 26 with respect to the composite member 10. This irradiation angle differs from that of the coaxial epi-illuminator 22 and is obtained as described below. Consequently, photographing with the illuminator 26 captures reflected light in a second direction, which differs from the first direction.
The processing unit 241 uses the design information 2461a shown in FIG. 11 instead of the design information 2461. That is, the processing unit 241 specifies the illumination attitude/position in the photographing conditions 2461-5a of the design information 2461a. The recorded illumination attitude/position has been optimized for the shape of the edge of the rubber frame 102 of the composite member 10. That is, the illumination attitude/position used is the optimum irradiation angle at which, under illumination by the illuminator 26, the pixel values (for example, luminance) near the cutting line exceed a predetermined condition. This optimum value can be obtained from the shape of the rubber frame 102; accordingly, in this step the processing unit 241 may also compute it from the shape of the rubber frame 102 contained in the design information 2461 or 2461a.
Next, step S102 is executed in the same manner as in Example 1; that is, the XY stage 23 moves the composite member 10 to the photographing position.
Next, step S203 performs the same image acquisition as step S103, except that the illumination is provided by the illuminator 26 instead of the coaxial epi-illuminator 22. That is, in step S203, the processing unit 241 outputs to the camera 21, according to the camera control program 2451, a control signal for photographing under the photographing conditions specified in step S201. The processing unit 241 also outputs to the illuminator 26, according to the coaxial epi-illuminator control program 2452 or an illumination control program (not shown), a control signal for illuminating under the photographing conditions specified in step S201.
As a result, the processing unit 241 acquires photographing information in which the camera 21 has captured the reflected light in the second direction under the illumination of the illuminator 26, and stores it in the auxiliary storage unit 246. As shown in FIG. 12(b), this photographing information has the following characteristics compared with the photographing information obtained with the coaxial epi-illuminator 22 in FIG. 12(a): (1) both the upper-surface-side burr portion 1021-1 and the lower-surface-side burr portion 1022-1 are included, and (2) the difference in pixel value between the burr portions 1021-1, 1022-1 and the rubber frame portion 102-1 is large, which makes the cutting line easier to extract.
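The advantage in characteristic (2) can be quantified as a simple region contrast. This metric is an illustrative assumption, not something the text defines; the region masks would come from a prior segmentation step.

```python
import numpy as np

def region_contrast(image, region_a, region_b):
    """Mean absolute pixel-value difference between two labelled regions,
    e.g. a burr portion (1021-1) versus the rubber frame portion (102-1).
    A larger value corresponds to a cutting line that is easier to extract,
    as the text notes for the image of FIG. 12(b)."""
    return abs(float(image[region_a].mean()) - float(image[region_b].mean()))
```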
Next, in step S204, the processing unit 241 performs, according to the image processing program 2454, the same processing as step S104 of Example 1. That is, the processing unit 241 extracts the cutting line from the photographing information acquired in step S203. As in Example 1, the cutting line is extracted using the characteristics of the pixel values; since this extraction is the same as in Example 1, its details are omitted.
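A pixel-value-based extraction of this kind can be sketched as a per-row minimum search. This is a deliberate simplification: the text defers the actual procedure to Example 1 and later mentions local-minimum pixels, so the row-wise `argmin` below stands in for that omitted detail.

```python
import numpy as np

def extract_cutting_line(image):
    """For each row of the grayscale image, take the column with the
    smallest pixel value as a cutting-line candidate. A simplified
    stand-in for the Example 1 procedure, whose details the text omits."""
    return [(r, int(np.argmin(row))) for r, row in enumerate(image)]
```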
Next, in step S205, the processing unit 241 determines, according to the image processing program 2454, whether the upper-surface-side burr portion 1021-1 can be extracted from the photographing information acquired in step S203; that is, whether the burr portions 1021-1 and 1022-1 of FIG. 12(b) include one present on the upper surface side. As a criterion for this determination, it can be checked, for example, whether the pixel values of the glass portion 101-1 in FIG. 12(b) are at or above a certain value (for example, saturated).
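The saturation criterion can be sketched as follows. The saturation level and the 50% coverage ratio are assumed thresholds; the text only says a certain pixel value (e.g. saturation) of the glass portion can be used.

```python
import numpy as np

def upper_burr_detectable(image, glass_mask, saturation_level=250):
    """Step S205 criterion sketch: treat the upper-surface burr as
    extractable when the glass portion 101-1 is at or above a certain
    pixel value (e.g. saturated). ``saturation_level`` and the 0.5
    coverage ratio are assumptions, not values from the text."""
    return bool(np.mean(image[glass_mask] >= saturation_level) >= 0.5)
```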
Note that the processing of step S205 may be omitted, and a configuration in which the flow proceeds directly from step S204 to step S103 may be adopted.
Next, the same processing as step S103 of Example 1 is executed; that is, the processing unit 241 acquires photographing information using the coaxial epi-illuminator 22.
Next, in step S206, the processing unit 241 draws, according to the image processing program 2454, the cutting line extracted in step S204 on the image of the composite member 10 and displays it on the display unit 243. For this purpose, the processing unit 241 composites the photographing information acquired in step S103 with the cutting line extracted in step S204. As a result, the processing unit 241 obtains information, as shown in FIG. 12(c), in which (1) the upper-surface-side burr portion 1021-1 has been identified and (2) the cutting line 103 has been extracted.
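The compositing in step S206 amounts to marking the extracted line pixels on the photographed image. A minimal sketch, assuming grayscale images and an arbitrary display intensity `mark_value`:

```python
import numpy as np

def draw_cutting_line(image, line_pixels, mark_value=255):
    """Return a copy of the grayscale photographing information with the
    extracted cutting-line pixels set to ``mark_value`` for display.
    ``mark_value`` is an assumed display intensity."""
    out = np.asarray(image).copy()
    for r, c in line_pixels:
        out[r, c] = mark_value
    return out
```

The original photographing information is left unmodified; only the returned copy carries the drawn line.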
Next, the same steps S105 and S106 as in Example 1 are executed. In Example 2 as well, the burrs can be cut in the same manner as in Example 1.
In this embodiment, the photographing conditions (illumination luminance, illumination attitude/angle) of the coaxial epi-illuminator 22 and of the illuminator 26 are both obtained in step S201. Alternatively, only the photographing conditions of the illuminator 26 may be obtained in step S201, with the photographing conditions of the coaxial epi-illuminator 22 obtained immediately before step S103.
Furthermore, in this embodiment, photographing under the illuminator 26 (step S203) precedes photographing under the coaxial epi-illuminator 22 (step S103), but this order may be reversed. In that case, step S205 determines whether the cutting line could be extracted, as follows.
The processing unit 241 determines, according to the image processing program 2454, whether the cutting line could be extracted in step S203. This can be judged by whether the number or length of pixels extracted as local minima is at or above a predetermined value. Alternatively, the processing unit 241 may display the local-minimum pixels on the photographing information on the display unit 243 so that an inspector can make the determination. If it is determined that the cutting line could be extracted (YES), the flow proceeds to step S106; if not (NO), the flow proceeds to step S103.
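The automatic form of this judgment reduces to a count threshold on the detected local-minimum pixels. The threshold value is an assumption; the text only says "a predetermined value".

```python
def cutting_line_extracted(minimum_pixels, min_count=50):
    """Judge extraction success: the cutting line counts as extracted
    when the number of pixels detected as local minima reaches a
    predetermined count. ``min_count=50`` is an assumed threshold."""
    return len(minimum_pixels) >= min_count
```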
In the case of this processing order, the photographing conditions of the coaxial epi-illuminator 22 may be obtained in step S201, and the photographing conditions of the illuminator 26 obtained immediately before the now-later step S203.
Furthermore, although this embodiment physically uses two illuminators, the coaxial epi-illuminator 22 and the illuminator 26, a configuration that acquires the two sets of photographing information with a single illuminator is also possible. In that case, the illuminator is moved under the control of the control unit 24 before illuminating. When the coaxial epi-illuminator 22 is used as that illuminator, the processing unit 241 can acquire the two sets of photographing information by changing the angle of the reflector 222 according to the coaxial epi-illuminator control program 2452. In this case as well, three or more sets of photographing information may be acquired, as described above.
Furthermore, the same burr cutting as in Example 1 may also be performed in Example 2; that is, Example 2 can be applied to the manufacturing system 200 shown in FIG. 8. This concludes the description of Example 2.
In Example 2, two illuminators were used to acquire two sets of photographing information in order to handle cases where the cutting line is difficult to extract. This embodiment (Example 3) uses two cameras to solve the same problem. FIG. 13 is a block diagram showing the configuration of the visual inspection device 20 in this embodiment. Compared with Example 1 shown in FIG. 2, a camera 27 serving as a photographing unit is added. The camera 27 is connected to the control unit 24, and the processing unit 241 controls its photographing according to the camera control program 2451.
The processing of this embodiment basically follows the flowchart of Example 2. However, steps S201, S203, and S103 differ, so the description of Example 3 below focuses on these steps.
First, in step S201, whereas Example 2 used the design information 2461a shown in FIG. 11, this embodiment uses the design information 2461 shown in FIG. 5, as in Example 1.
Then, in this embodiment, step S203 is executed after step S102 as follows. In step S203, the processing unit 241 outputs to the camera 21, according to the camera control program 2451, a control signal for photographing under the photographing conditions specified in step S201. The processing unit 241 also outputs to the coaxial epi-illuminator 22, according to the coaxial epi-illuminator control program 2452, a control signal for illuminating under the photographing conditions specified in step S201.
As a result, in this step the processing unit 241 acquires photographing information from the camera 21. The photographing information acquired in this way can be represented as in FIG. 12(a), as in Example 2.
Then, step S103 is performed after step S205. In step S103, the processing unit 241 outputs to the camera 27, according to the camera control program 2451, a control signal for photographing under the photographing conditions specified in step S201. As in step S203, the processing unit 241 also outputs to the coaxial epi-illuminator 22, according to the coaxial epi-illuminator control program 2452, a control signal for illuminating under the photographing conditions specified in step S201.
As a result, in this step the processing unit 241 acquires photographing information from the camera 27. The photographing information acquired in this way can be represented as in FIG. 12(b), as in Example 2.
Then, in this embodiment, the processing from step S206 onward is executed in the same manner as in Example 2.
In Example 3 as well, the order of steps S203 and S103 may be swapped, as in Example 2. In that case, it is desirable to change the processing of step S205 accordingly, as in Example 2. Furthermore, since Example 3 uses two cameras, camera 21 and camera 27, steps S203 and S103 can also be performed together. In that case, the processing unit 241 outputs control signals for photographing to camera 21 and camera 27 in parallel according to the camera control program 2451.
In this embodiment, a single camera (camera 21 or camera 27) may instead be moved to acquire two or more sets of photographing information. Moreover, camera 21 and camera 27 have different photographing angles, that is, different fields of view. It is therefore desirable that the processing unit 241 perform, according to the image processing program 2454, an image transformation that aligns the photographing information of camera 21 with that of camera 27.
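For two rigidly mounted cameras viewing a planar surface, such an alignment can be expressed as a planar homography. The 3x3 matrix `H` is assumed to have been obtained beforehand (e.g. from a calibration target); the text only says an aligning transformation is desirable, without specifying how.

```python
import numpy as np

def align_points(points, H):
    """Map pixel coordinates from camera 27's view into camera 21's view
    with a 3x3 homography ``H``. Obtaining ``H`` (e.g. from a calibration
    target) is assumed here and is not specified in the text."""
    pts = np.asarray(points, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return homog[:, :2] / homog[:, 2:3]  # divide out the projective scale
```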
As in Examples 1 and 2, burr cutting may also be performed in Example 3; that is, Example 3 can be applied to the manufacturing system 200 shown in FIG. 8. This concludes the description of Example 3.
As described above, in each embodiment, photographing is performed such that the reflected light in the first direction from the transmissive member, or the pixel values of that reflected light, reach a certain value or more. In Example 2, the reflected light in the second direction is additionally photographed to recognize the shape of a defect occurring on the first surface of the transmissive member constituting the structure (for example, the upper surface facing the photographing device). In each embodiment, a burr on a member constituting the structure (the rubber frame 102) is given as the example defect, but the object in which a defect occurs is not limited to this; recognizing an object that is not part of the structure, such as adhering impurities, is also included in each embodiment.
A structure in which different members, or different parts of the same member, are arranged at a plurality of interfaces of the transmissive member can also be inspected.
Although burrs are described as the example defect in this embodiment, the invention can also be applied to inspecting structures by recognizing the shapes of other defects, such as parting lines, missing portions, and dimensional excess or deficiency in thickness and the like.
Although the glass 101 is used as the transmissive member, other transmissive members such as an acrylic plate may also be used. Furthermore, although the rubber frame 102 is used in this embodiment as the member in which defects occur, the invention can also be applied to other members and materials; transmissive members such as plastic or glass may be used as such other members.
Furthermore, the inspection of each embodiment is applicable to structures in which defects occur at the interfaces of the transmissive member constituting them. Here, a defect relating to an interface means that, when the structure is observed from a certain direction, some defects are observed through the transmissive member and others are observed without passing through it. The interfaces associated with defects are therefore not limited to the upper and lower surfaces; the terminology does not matter, whether front and back surfaces, front and side surfaces, and so on. Each interface can thus be expressed as a first surface, a second surface, and so forth. The shape of an interface also includes curved surfaces as well as flat ones.
Moreover, each embodiment has the effect of being able to recognize the shape of a defect on the upper surface (first surface). As an additional effect, it may also be possible to distinguish between defects on the upper surface and defects on the lower surface.
10…composite member, 101…glass, 102…rubber frame, 1021…burr, 1022…burr, 20…visual inspection device, 21…camera, 22…coaxial epi-illuminator, 221…light source, 222…reflector, 23…XY stage, 24…control unit, 241…processing unit, 242…input unit, 243…display unit, 244…communication I/F, 245…main storage unit, 2451…camera control program, 2452…coaxial epi-illuminator control program, 2453…XY stage control program, 2454…image processing program, 246…auxiliary storage unit, 2461…design information, 2462…photographing information

Claims (18)

  1.  A visual inspection method using a visual inspection device for inspecting a defect of a structure including a transmissive member, wherein
     the visual inspection device comprises a holding unit that holds the structure, a photographing unit that photographs the held structure, an illumination unit that irradiates at least a first surface of the transmissive member of the held structure with illumination for photographing by the photographing unit, and a control unit that controls the holding unit, the photographing unit, and the illumination unit, and
     the control unit:
     controls at least one of the photographing unit and the illumination unit so that, in photographing of the first surface of the transmissive member by the photographing unit, pixel values of at least a part of the first surface become equal to or greater than a predetermined value;
     when the pixel values become equal to or greater than the predetermined value, controls the photographing unit to photograph, for at least a part of the structure, reflected light in a first direction with respect to the first surface; and
     recognizes a shape of a defect relating to the first surface of the structure by using photographing information of the photographed at least a part of the structure.
  2.  The visual inspection method according to claim 1, wherein
     the control unit:
     controls the illumination unit to irradiate first illumination such that, in the photographing by the photographing unit, the pixel values of at least a part of the first surface become equal to or greater than the predetermined value; and
     when the first illumination is irradiated, controls the photographing unit to photograph, for at least a part of the structure, the reflected light in the first direction produced by the first illumination.
  3.  The visual inspection method according to claim 2, wherein
     the control unit:
     controls the photographing unit to photograph reflected light in a second direction different from the reflected light in the first direction; and
     recognizes the shape of the defect by using the photographing information and second photographing information obtained by photographing the reflected light in the second direction.
  4.  The visual inspection method according to claim 3, wherein
     the control unit controls the photographing unit to photograph the reflected light in the second direction when the control unit cannot recognize the shape of the defect of the structure.
  5.  The visual inspection method according to claim 2, wherein
     the holding unit holds the structure such that the first surface of the transmissive member faces the photographing unit and the illumination unit, and
     the reflected light in the first direction is substantially specular reflected light at the first surface of the transmissive member.
  6.  The visual inspection method according to claim 2, wherein
     the control unit recognizes a shape of a defect relating to a second surface located in a transmission direction of transmitted light from the first surface.
  7.  The visual inspection method according to claim 6, wherein
     the control unit:
     controls the illumination unit to irradiate at least a part of the second surface located in the transmission direction of the transmitted light from the first surface with second illumination;
     when the second illumination is irradiated, controls the photographing unit to photograph, for at least a part of the structure, reflected light in a third direction with respect to the second illumination; and
     recognizes the shape of the defect relating to the second surface of the structure by using photographing information of the reflected light in the third direction.
  8.  The visual inspection method according to claim 7, wherein
     the control unit controls the illumination unit to irradiate the second illumination such that pixel values of at least a part of the second surface become equal to or greater than a predetermined value.
  9.  A processing method for a structure, wherein
     the control unit in the visual inspection method according to claim 1 outputs, to a processing device that processes the structure, a control signal for processing the shape of the defect, and
     the processing device processes the structure according to the control signal.
  10.  A visual inspection device for inspecting a defect of a structure including a transmissive member, comprising:
     a holding unit that holds the structure;
     a photographing unit that photographs the held structure;
     an illumination unit that irradiates at least a first surface of the transmissive member of the held structure with illumination for photographing by the photographing unit; and
     a control unit that controls the holding unit, the photographing unit, and the illumination unit, wherein
     the control unit:
     controls at least one of the photographing unit and the illumination unit so that, in photographing of the first surface of the transmissive member by the photographing unit, pixel values of at least a part of the first surface become equal to or greater than a predetermined value;
     when the pixel values become equal to or greater than the predetermined value, controls the photographing unit to photograph, for at least a part of the structure, reflected light in a first direction with respect to the first surface; and
     recognizes a shape of a defect relating to the first surface of the structure by using photographing information of the photographed at least a part of the structure.
  11.  The visual inspection apparatus according to claim 10, wherein the control unit
     controls the illumination unit to irradiate first illumination such that, in photographing by the photographing unit, the pixel value of at least a part of the first surface becomes equal to or greater than the predetermined value, and
     when the first illumination is irradiated, controls the photographing unit to photograph, from at least a part of the structure, the reflected light in the first direction produced by the first illumination.
  12.  The visual inspection apparatus according to claim 11, wherein the control unit
     controls the photographing unit to photograph reflected light in a second direction different from the direction of the reflected light in the first direction, and
     recognizes the shape of the defect by using the photographing information and second photographing information obtained by photographing the reflected light in the second direction.
  13.  The visual inspection apparatus according to claim 12, wherein the control unit controls the photographing unit to photograph the reflected light in the second direction when the shape of the defect of the structure cannot be recognized.
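The fallback behavior of claims 12 and 13 — photograph a second, differently-angled reflection only when the defect shape cannot be recognized from the first-direction image, then combine both images — can be sketched as below. The function names and the pixel-level fusion rule are hypothetical illustrations; the patent does not specify how the two sets of photographing information are combined.

```python
# Hypothetical sketch of the claim-12/13 fallback. The recognition and
# fusion logic here is illustrative only.


def recognize_with_fallback(camera, recognize):
    """Try recognition from the first-direction reflection; capture the
    second direction only if that fails (claim 13)."""
    first = camera.capture(direction="first")
    shape = recognize(first)
    if shape:  # recognized from the first-direction reflection alone
        return shape
    # Recognition failed: photograph reflected light in the second
    # direction and use both images together (claim 12).
    second = camera.capture(direction="second")
    return recognize_combined(first, second)


def recognize_combined(first, second):
    # Placeholder fusion: flag a pixel as defective if it looks
    # defective (dark) in either of the two views.
    return [(r, c) for r, row in enumerate(first)
            for c, _ in enumerate(row)
            if first[r][c] < 100 or second[r][c] < 100]
```

This ordering avoids the cost of the second capture on structures whose defects are already visible in the specular view.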
  14.  The visual inspection apparatus according to claim 11, wherein
     the holding unit holds the structure such that the first surface of the transmissive member directly faces the photographing unit and the illumination unit, and
     the reflected light in the first direction is substantially specular reflected light at the first surface of the transmissive member.
  15.  The visual inspection apparatus according to claim 11, wherein the control unit recognizes a shape of a defect relating to a second surface that exists in a transmission direction of light transmitted from the first surface.
  16.  The visual inspection apparatus according to claim 15, wherein the control unit
     controls the illumination unit to irradiate at least a part of the second surface, which exists in the transmission direction of light transmitted from the first surface, with second illumination,
     when the second illumination is irradiated, controls the photographing unit to photograph reflected light in a third direction with respect to the second illumination from at least a part of the structure, and
     recognizes a shape of a defect relating to the second surface of the structure by using photographing information of the reflected light in the third direction.
  17.  The visual inspection apparatus according to claim 16, wherein the control unit controls the illumination unit to irradiate the second illumination such that a pixel value of at least a part of the second surface becomes equal to or greater than a predetermined value.
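Claims 15 through 17 extend the same pattern to a second surface lying behind the first in the transmission direction: a second illumination is applied until part of the second surface exceeds the pixel-value threshold, and its defects are recognized from reflected light in a third direction. A minimal sketch, with all names (`select`, `increase_intensity`, `capture`) again hypothetical:

```python
# Hypothetical sketch of the claim-15..17 second-surface inspection.
# Names and threshold are illustrative, not from the patent.

PIXEL_THRESHOLD = 200  # "predetermined value" for the second surface


def inspect_second_surface(camera, illumination, max_steps=10):
    """Apply the second illumination until part of the second surface
    exceeds the threshold, then recognize defects from the
    third-direction reflected light."""
    illumination.select("second")  # second illumination unit/pattern
    for _ in range(max_steps):
        image = camera.capture(direction="third")
        if max(max(row) for row in image) >= PIXEL_THRESHOLD:
            # Placeholder recognition: dark pixels against the bright
            # third-direction return are treated as defect candidates.
            return [(r, c) for r, row in enumerate(image)
                    for c, v in enumerate(row) if v < PIXEL_THRESHOLD]
        illumination.increase_intensity()
    return None  # second surface never reached the threshold
```

Structurally this mirrors the first-surface loop of claim 10; only the illumination source and the capture direction change, which is what lets one apparatus inspect both faces of the transmissive member.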
  18.  A processing apparatus for a structure, wherein the control unit of the visual inspection apparatus according to claim 10 outputs, to the processing apparatus that processes the structure, a control signal for processing the shape of the defect, and
     the processing apparatus processes the structure according to the control signal.
PCT/JP2021/041436 2020-12-16 2021-11-10 Appearance inspection method, appearance inspection device, and method and device for processing structure WO2022130843A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-208290 2020-12-16
JP2020208290A JP7390278B2 (en) 2020-12-16 2020-12-16 Appearance inspection method, appearance inspection device, processing method and device for structures

Publications (1)

Publication Number Publication Date
WO2022130843A1 true WO2022130843A1 (en) 2022-06-23

Family

ID=82059034

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/041436 WO2022130843A1 (en) 2020-12-16 2021-11-10 Appearance inspection method, appearance inspection device, and method and device for processing structure

Country Status (2)

Country Link
JP (1) JP7390278B2 (en)
WO (1) WO2022130843A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11311510A (en) * 1998-04-27 1999-11-09 Asahi Glass Co Ltd Method and apparatus for inspection of very small uneven part
JP2009162570A (en) * 2007-12-28 2009-07-23 Hoya Corp Manufacturing method of glass substrate for magnetic disc, glass substrate for magnetic disc, and magnetic disc
JP2011257257A (en) * 2010-06-09 2011-12-22 Panasonic Corp Inspection device and inspection method, and method of manufacturing panel for image display using the same
JP2013084405A (en) * 2011-10-07 2013-05-09 Panasonic Corp Method for manufacturing plasma display panel
JP2015137927A (en) * 2014-01-22 2015-07-30 株式会社ブルービジョン imaging device and inspection system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011257258A (en) * 2010-06-09 2011-12-22 Panasonic Corp Inspection device and inspection method, and method of manufacturing panel for image display using the same
JP2015175815A (en) * 2014-03-18 2015-10-05 東レ株式会社 Method and device for inspecting defect of transparent sheet


Also Published As

Publication number Publication date
JP7390278B2 (en) 2023-12-01
JP2022095139A (en) 2022-06-28

Similar Documents

Publication Publication Date Title
TWI498546B (en) Defect inspection device and defect inspection method
JP5410092B2 (en) Apparatus and method for inspecting composite structures for inconsistencies
US9140545B2 (en) Object inspection system
JP5144401B2 (en) Wafer inspection equipment
JP2000009591A (en) Inspection equipment
JP2008249568A (en) Visual examination device
CN117571744A (en) Optical detection method for defects on double surfaces of cover plate glass
JPWO2008149712A1 (en) Strain inspection apparatus and strain inspection method
WO2022130843A1 (en) Appearance inspection method, appearance inspection device, and method and device for processing structure
JPH1062354A (en) Device and method of inspecting transparent plate for defect
JP5959430B2 (en) Bottle cap appearance inspection device and appearance inspection method
JP2005241586A (en) Inspection device and method for optical film
TWM514002U (en) Optical inspection device
KR101217173B1 (en) Apparatus for inspecting substrate and method of inspecting substrate
JP2009216647A (en) Defect inspection method and defect inspection device
JP2007333661A (en) Method and apparatus for visual inspection of electronic component
US10255671B1 (en) System and method for capture of high resolution/high magnification images
JP2021056166A (en) Inspection device, inspection system, and method for inspection of inspection device
KR101744149B1 (en) Apparatus for inspecting substrate
JP2004212353A (en) Optical inspection apparatus
KR102528464B1 (en) Vision Inspecting Apparatus
JP7541889B2 (en) Inspection device, inspection method, program, and recording medium
JP2004347525A (en) Visual inspection method for semiconductor chip and device therefor
KR102433319B1 (en) Vision inspection method of diffusing lens for led light source
CN112098420B (en) Curved surface screen detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21906207

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21906207

Country of ref document: EP

Kind code of ref document: A1