CN111757667A - Inspection apparatus and inspection method - Google Patents


Info

Publication number
CN111757667A
CN111757667A
Authority
CN
China
Prior art keywords
image
electronic component
display
images
mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010231009.9A
Other languages
Chinese (zh)
Other versions
CN111757667B (en)
Inventor
河崎武士
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Juki Corp
Original Assignee
Juki Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Juki Corp filed Critical Juki Corp
Publication of CN111757667A publication Critical patent/CN111757667A/en
Application granted granted Critical
Publication of CN111757667B publication Critical patent/CN111757667B/en
Legal status: Active

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/081Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/081Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • H05K13/0812Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines the monitoring devices being integrated in the mounting machine, e.g. for monitoring components, leads, component placement
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8806Specially adapted optical and illumination features
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/083Quality monitoring using results from monitoring devices, e.g. feedback loops
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/084Product tracking, e.g. of substrates during the manufacturing process; Component traceability

Abstract

The invention provides an inspection apparatus and an inspection method capable of suppressing a reduction in work efficiency during inspection. The inspection apparatus comprises: an image acquisition unit that acquires a plurality of images of an electronic component that is provided with a mark and that is illuminated under a plurality of illumination conditions in a state of being mounted on a substrate; and a display control unit that displays the plurality of images in an aligned manner on a display screen of a display device.

Description

Inspection apparatus and inspection method
Technical Field
The present invention relates to an inspection apparatus and an inspection method.
Background
In the manufacturing process of an electronic device, an electronic component mounting apparatus is used to mount electronic components on a substrate. An inspection device inspects the adequacy of the electronic component mounted on the substrate. Optical character recognition (OCR) is used in this inspection: the inspection device acquires an image of an electronic component provided with a mark such as a character or a number, extracts the mark from the image, and recognizes it, thereby determining the adequacy of the electronic component.
Patent document 1: Japanese Patent Laid-Open Publication No. 2004-235582
When optical character recognition is used in the inspection, a plurality of images of the electronic component illuminated under a plurality of illumination conditions are acquired, and the image to be used for optical character recognition is selected from among them. There is a demand for a technique that can suppress a reduction in work efficiency when an operator performs this selection.
Disclosure of Invention
An object of an embodiment of the present invention is to suppress a reduction in work efficiency during inspection.
According to the 1st aspect of the present invention, there is provided an inspection apparatus comprising: an image acquisition unit that acquires a plurality of images of an electronic component that is provided with a mark and that is illuminated under a plurality of illumination conditions in a state of being mounted on a substrate; and a display control unit that displays the plurality of images in a line on a display screen of a display device.
According to the 2nd aspect of the present invention, there is provided an inspection method comprising the steps of: displaying a plurality of images of the electronic component, which is provided with the mark and illuminated under a plurality of illumination conditions in a state of being mounted on the substrate, in an array on a display screen of the display device; recognizing the mark included in an image selected from the plurality of images displayed on the display screen; and comparing the recognized mark with a registered mark to determine whether the electronic component is appropriate.
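The three claimed steps can be sketched as follows. This is only an illustration of the recognize-and-compare logic, not the patent's implementation: the `CandidateImage` type, the stubbed `recognize_mark` function (standing in for a real OCR engine), and all example values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class CandidateImage:
    illumination: str  # which illumination condition produced the image
    ocr_text: str      # what an OCR engine would read from it (stubbed here)

def recognize_mark(image: CandidateImage) -> str:
    """Stand-in for optical character recognition of the mark."""
    return image.ocr_text

def inspect(images: list[CandidateImage], selected: int,
            registered_mark: str) -> bool:
    """Recognize the mark in the operator-selected image and compare it
    with the registered mark to judge whether the component is correct."""
    return recognize_mark(images[selected]) == registered_mark

# Three images of the same component under three illumination conditions.
images = [
    CandidateImage("1st", "1O2"),  # glare: '0' misread as 'O'
    CandidateImage("2nd", "102"),  # clear image of the mark
    CandidateImage("3rd", "1?2"),  # shadow: middle digit unreadable
]
print(inspect(images, 1, "102"))  # True: the clear image matches "102"
```

Only the image selected by the operator is fed to the comparison, which is why selecting a clear image matters for the recognition rate.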
Advantageous Effects of Invention
According to the aspect of the present invention, a decrease in work efficiency during inspection is suppressed.
Drawings
Fig. 1 is a diagram schematically showing a production line of an electronic device according to an embodiment.
Fig. 2 is a diagram schematically showing an electronic component according to an embodiment.
Fig. 3 is a plan view schematically showing the electronic component mounting apparatus according to the embodiment.
Fig. 4 is a view schematically showing a mounting head according to the embodiment.
Fig. 5 is a diagram schematically showing an inspection apparatus according to an embodiment.
Fig. 6 is a diagram showing an example of display data displayed on the display screen of the display device according to the embodiment.
Fig. 7 is a functional block diagram showing a control device according to an embodiment.
Fig. 8 is a flowchart showing an inspection method according to the embodiment.
Fig. 9 is a block diagram showing a computer system according to an embodiment.
Description of the reference numerals
1 … production line, 2 … printer, 3 … electronic component mounting device, 4 … reflow furnace, 5 … inspection device, 6 … control device, 7 … display device, 8 … input device, 31 … base member, 32 … substrate transport device, 32B … conveyor belt, 32G … guide member, 32H … holding member, 33 … electronic component supply device, 33F … tape feeder, 34 … suction nozzle, 34S … shaft, 34T … front end portion, 35 … mounting head, 36 … mounting head moving device, 36X … 1st moving device, 36Y … 2nd moving device, 37 … suction nozzle moving device, 51 … substrate holding device, 51A … base member, 51B … holding member, 51C … actuator, 52 … illumination device, 52S … support member, 53 … imaging device, 53A … optical system, 53B … image sensor, 54 … light source, 54A … 1st light source, 54B … 2nd light source, 54C … 3rd light source, 61 … input data acquisition unit, 62 … relative position control unit, 63 … illumination control unit, 64 … imaging control unit, 65 … image acquisition unit, 66 … image processing unit, 67 … symbol recognition unit, 68 … determination unit, 69 … storage unit, 70 … display control unit, 100 … 1st display area, 101 … display area, 102 … display area, 103 … display area, 200 … 2nd display area, 201 … display area, 202 … display area, 203 … display area, 204 … display area, 205 … display area, 206 … display area, 207 … display area, 208 … display area, 209 … display area, 300 … 3rd display area, 301 … display area, 302 … display area, 303 … display area, 400 … 4th display area, 401 … display area, 402 … display area, 403 … display area, 500 … 5th display area, 600 … operation display area, 700 … display area, 701 … display area, 702 … display area, 703 … region, 704 … region, 705 … region, 706 … region, 800 … button, AX … optical axis, C … electronic component, Cb … body, D … mark, DM … mounting position, M … image, Ma … image, Mb … image, Mc … image, Md … image, Me … image, SM … supply position.
Detailed Description
Embodiments according to the present invention will be described below with reference to the drawings, but the present invention is not limited thereto. The constituent elements of the embodiments described below can be combined as appropriate. In addition, some of the components may not be used.
In the embodiment, an XYZ rectangular coordinate system is defined, and the positional relationship of each portion is described with reference to it. The direction parallel to the X axis in the horizontal plane is referred to as the X-axis direction. The direction parallel to the Y axis, which is orthogonal to the X axis in the horizontal plane, is referred to as the Y-axis direction. The direction parallel to the Z axis, which is orthogonal to the horizontal plane, is referred to as the Z-axis direction. The plane including the X axis and the Y axis is appropriately referred to as the XY plane. The XY plane is parallel to the horizontal plane. The Z axis is parallel to the plumb line, so the Z-axis direction is the up-down direction: the +Z direction is upward and the -Z direction is downward.
[ production line ]
Fig. 1 is a diagram schematically showing a production line 1 of electronic devices according to an embodiment. As shown in fig. 1, a production line 1 includes: a printer 2, an electronic component mounting apparatus 3, a reflow furnace 4, and an inspection apparatus 5.
The printer 2 prints solder paste (cream solder) on the substrate on which electronic components are to be mounted. The electronic component mounting apparatus 3 mounts electronic components on the substrate printed with the solder paste. The reflow furnace 4 heats the substrate on which the electronic components are mounted to melt the solder paste. The molten solder paste is then cooled in the reflow furnace 4, whereby the electronic components are soldered to the substrate.
The inspection device 5 inspects the adequacy of the electronic component mounted on the substrate. The inspection device 5 inspects whether or not the correct electronic component is mounted on the substrate.
[ electronic component ]
Fig. 2 is a diagram schematically showing an electronic component C according to the embodiment. The electronic component C has a body Cb. The electronic component C may be a lead type electronic component having leads protruding from the body Cb. The electronic component C may be a mounted electronic component having no lead.
The main body Cb has a rectangular parallelepiped shape, but it may instead have a cylindrical shape or a disc shape. A mark D is provided on the surface of the body Cb. The mark D includes at least one of letters and numbers; in the embodiment, the mark D is the number "102". The mark D is engraved on the surface of the body Cb, although it may instead be printed on the surface of the body Cb.
[ electronic component mounting apparatus ]
Fig. 3 is a plan view schematically showing the electronic component mounting apparatus 3 according to the embodiment. The electronic component mounting apparatus 3 mounts the electronic component C on the substrate P. The electronic component mounting apparatus 3 includes: a base member 31; a substrate transfer device 32 that transfers the substrate P; an electronic component supply device 33 that supplies the electronic component C; a mounting head 35 having a suction nozzle 34; a mounting head moving device 36 that moves the mounting head 35; and a nozzle moving device 37 that moves the suction nozzle 34.
The base member 31 supports the substrate transfer device 32, the electronic component supply device 33, the mounting head 35, the mounting head moving device 36, and the nozzle moving device 37.
The substrate transfer device 32 transfers the substrate P to the mounting position DM. The mounting position DM is defined in the conveyance path of the substrate conveyance device 32. The substrate transport device 32 includes: a conveyor 32B that conveys the substrate P; a guide member 32G that guides the substrate P; and a holding member 32H that holds the substrate P. The conveyor belt 32B moves by the operation of the actuator, and conveys the substrate P in the X-axis direction. After moving to the mounting position DM, the substrate P is held by the holding member 32H.
The electronic component supply device 33 supplies the electronic component C to the supply position SM. The electronic component supply device 33 includes a plurality of tape feeders 33F. The electronic component supply devices 33 are disposed on both sides of the substrate transport device 32. The electronic component supply device 33 may be disposed only on one side of the substrate transfer device 32.
The mounting head 35 holds the electronic component C supplied from the electronic component supply device 33 by the suction nozzle 34 and mounts the electronic component C on the board P. The mounting head 35 is movable between a supply position SM to which the electronic components C are supplied from the electronic component supply device 33 and a mounting position DM at which the substrate P is disposed. The mounting head 35 holds the electronic component C supplied to the supply position SM by the suction nozzle 34, moves to the mounting position DM, and mounts the electronic component C on the board P placed at the mounting position DM.
The mounting head moving device 36 moves the mounting head 35. The mounting head moving device 36 has: a 1st moving device 36X that moves the mounting head 35 in the X-axis direction; and a 2nd moving device 36Y that moves the mounting head 35 in the Y-axis direction. The 1st and 2nd moving devices 36X and 36Y each include an actuator. The 1st moving device 36X is coupled to the mounting head 35; by its operation, the mounting head 35 moves in the X-axis direction. The 2nd moving device 36Y is connected to the mounting head 35 via the 1st moving device 36X; by moving the 1st moving device 36X in the Y-axis direction, it moves the mounting head 35 in the Y-axis direction.
Fig. 4 is a diagram schematically showing the mounting head 35 according to the embodiment. As shown in fig. 4, the mounting head 35 has a plurality of suction nozzles 34. The suction nozzle 34 detachably holds the electronic component C. The suction nozzle 34 is a suction nozzle that suctions and holds the electronic component C. An opening is provided at a front end portion 34T of the suction nozzle 34. The opening of the suction nozzle 34 is connected to a vacuum system. In a state where the front end portion 34T of the suction nozzle 34 is in contact with the electronic component C, a suction operation from an opening provided in the front end portion 34T of the suction nozzle 34 is performed, whereby the electronic component C is sucked and held by the front end portion 34T of the suction nozzle 34. By releasing the suction operation from the opening, the electronic component C is released from the suction nozzle 34.
The nozzle moving device 37 moves the suction nozzle 34 in the Z-axis direction and in the rotation direction around the Z axis. The nozzle moving device 37 is supported by the mounting head 35. The suction nozzle 34 is connected to the lower end portion of a shaft 34S. A plurality of shafts 34S are provided, and the plurality of suction nozzles 34 are connected to the plurality of shafts 34S, respectively. A plurality of nozzle moving devices 37 are likewise provided and connected to the plurality of shafts 34S, respectively. The suction nozzles 34 are thus supported by the mounting head 35 via the shafts 34S and the nozzle moving devices 37. The nozzle moving device 37 moves the shaft 34S in the Z-axis direction and in the rotation direction around the Z axis, thereby moving the suction nozzle 34.
The suction nozzles 34 can be moved in the X-axis direction, the Y-axis direction, the Z-axis direction, and the rotation direction about the Z-axis by the mounting head moving device 36 and the suction nozzle moving device 37, respectively. By moving the suction nozzle 34, the electronic component C held by the suction nozzle 34 can be moved in the X-axis direction, the Y-axis direction, the Z-axis direction, and the rotation direction about the Z-axis.
The suction nozzle 34 may instead be a gripper-type nozzle that holds the electronic component C by clamping it.
[ inspection apparatus ]
Fig. 5 is a diagram schematically showing the inspection apparatus 5 according to the embodiment. As shown in fig. 5, the inspection apparatus 5 includes: a substrate holding device 51 for holding a substrate P on which electronic components C are mounted; an illumination device 52 capable of illuminating the electronic component C mounted on the substrate P under a plurality of illumination conditions; an imaging device 53 that images the electronic component C mounted on the substrate P; a control device 6 including a computer system; a display device 7 capable of displaying display data; and an input device 8 operated by an operator.
The substrate holding device 51 holds the substrate P on which the electronic component C is mounted in the electronic component mounting device 3. The substrate holding device 51 includes: a base member 51A; a holding member 51B for holding the substrate P; and an actuator 51C that generates power to move the holding member 51B. The base member 51A movably supports the holding member 51B. The holding member 51B holds the substrate P so that the electronic component C mounted on the surface of the substrate P faces the imaging device 53. The holding member 51B can move in the XY plane by the operation of the actuator 51C while holding the substrate P.
The illumination device 52 illuminates the electronic component C mounted on the substrate P with illumination light in a state where the substrate P is held by the substrate holding device 51. The illumination device 52 includes: a light source 54 that emits illumination light; and a support member 52S that supports the light source 54.
The light source 54 has a circular ring shape. The light source 54 is exemplified by a Light Emitting Diode (LED). The light source 54 emits white light as illumination light. The support member 52S is disposed around at least a part of the light source 54.
The illumination device 52 illuminates the electronic component C under a plurality of illumination conditions. The illumination device 52 is disposed above the substrate holding device 51 and has a plurality of light sources 54 that can illuminate the electronic component C under mutually different illumination conditions. Each of the plurality of light sources 54 has a ring shape. In the embodiment, the light sources 54 include: a 1st light source 54A having a 1st inner diameter; a 2nd light source 54B having a 2nd inner diameter larger than the 1st inner diameter; and a 3rd light source 54C having a 3rd inner diameter larger than the 2nd inner diameter. Among the plurality of light sources 54, the 1st light source 54A is disposed farthest from the substrate holding device 51, the 2nd light source 54B second farthest, and the 3rd light source 54C closest. That is, the 1st light source 54A is disposed at the highest position, the 2nd light source 54B at the second highest position, and the 3rd light source 54C at the lowest position.
The illumination condition includes the incident angle θ of the illumination light incident on the electronic component C. The incident angle θ1 of the illumination light emitted from the 1st light source 54A, the incident angle θ2 of the illumination light emitted from the 2nd light source 54B, and the incident angle θ3 of the illumination light emitted from the 3rd light source 54C are different from one another. The illumination device 52 irradiates the electronic component C with illumination light at each of the plurality of incident angles θ.
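For a component centered on the optical axis, the incident angle of a ring light can be approximated from the ring's radius and its height above the component. The sketch below illustrates why the three ring sources yield three different incident angles; the radii and heights are illustrative assumptions, not values from the patent:

```python
import math

def incident_angle_deg(ring_radius_mm: float, height_mm: float) -> float:
    """Approximate angle (measured from the vertical optical axis) at which
    light from a ring of the given radius, mounted at the given height,
    strikes a component centered on the axis."""
    return math.degrees(math.atan2(ring_radius_mm, height_mm))

# Illustrative geometry: the 1st light source is the smallest and highest,
# the 3rd the largest and lowest, giving theta1 < theta2 < theta3.
theta1 = incident_angle_deg(20.0, 100.0)  # 1st light source: near-vertical
theta2 = incident_angle_deg(35.0, 60.0)   # 2nd light source
theta3 = incident_angle_deg(50.0, 25.0)   # 3rd light source: near-grazing
print(theta1 < theta2 < theta3)  # the three conditions differ as described
```

Near-vertical light tends to reveal surface reflectance differences, while grazing light tends to cast shadows into engraved marks, which is what makes the three conditions complementary.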
In the following description, the illumination condition in which the electronic component C is illuminated by illumination light emitted from the 1st light source 54A is appropriately referred to as the 1st illumination condition, the illumination condition in which the electronic component C is illuminated by illumination light emitted from the 2nd light source 54B as the 2nd illumination condition, and the illumination condition in which the electronic component C is illuminated by illumination light emitted from the 3rd light source 54C as the 3rd illumination condition.
When the illumination light is emitted from the 1st light source 54A, no illumination light is emitted from the 2nd light source 54B or the 3rd light source 54C. When the illumination light is emitted from the 2nd light source 54B, no illumination light is emitted from the 3rd light source 54C or the 1st light source 54A. When the illumination light is emitted from the 3rd light source 54C, no illumination light is emitted from the 1st light source 54A or the 2nd light source 54B.
The imaging device 53 images the electronic component C mounted on the substrate P and acquires an image of the electronic component C. The imaging device 53 is disposed above the illumination device 52. The imaging device 53 includes: an optical system 53A; and an image sensor 53B that acquires an image of the electronic component C via the optical system 53A. In the embodiment, the optical axis AX of the optical system 53A is parallel to the Z axis. The optical axis AX of the optical system 53A is disposed inside the annular light sources 54. As the image sensor 53B, at least one of a CCD (Charge-Coupled Device) image sensor and a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor is exemplified. The imaging device 53 can acquire a color image of the electronic component C.
The imaging device 53 images the electronic component C in a state where the electronic component C is illuminated by the illumination device 52. The light source 54 is disposed outside the field of view of the imaging device 53. The imaging device 53 images the electronic component C through the space inside the light source 54.
The holding member 51B moves in the XY plane so that the electronic component C mounted on the substrate P is disposed in the field of view of the imaging device 53. For example, even if the plurality of electronic components C mounted on the substrate P cannot all be placed in the field of view of the imaging device 53 at once, the imaging device 53 can still acquire images of all of them: the relative position in the XY plane between the substrate P held by the holding member 51B and the field of view of the imaging device 53 is adjusted so that the electronic components C enter the field of view one after another.
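The stepping just described can be sketched as a loop that repositions the holding member so each component in turn falls in the fixed field of view. The move and capture callables below are stand-ins for the actual hardware interfaces, and the coordinates are illustrative:

```python
def image_all_components(component_xy, move_holder, capture):
    """Bring each component into the fixed field of view by moving the
    substrate in the XY plane, capturing one image per component."""
    images = []
    for x, y in component_xy:
        move_holder(-x, -y)  # shift the substrate so this component is centered
        images.append(capture())
    return images

# Stub hardware: record commanded moves and hand back a dummy frame per capture.
moves, frames = [], iter(["img_a", "img_b", "img_c"])
result = image_all_components(
    [(10, 5), (20, 5), (30, 5)],
    lambda dx, dy: moves.append((dx, dy)),
    lambda: next(frames),
)
print(result)  # ['img_a', 'img_b', 'img_c']
print(moves)   # [(-10, -5), (-20, -5), (-30, -5)]
```

The same loop works unchanged if the imaging device (with the illumination device) moves instead of the holder; only the sign convention of the move command differs.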
Further, the relative position in the XY plane between the substrate P held by the holding member 51B and the field of view of the imaging device 53 may be adjusted, and the imaging device 53 may be moved in the XY plane, or both the imaging device 53 and the holding member 51B may be moved in the XY plane. In the case where the photographing device 53 moves in the XY plane, the illumination device 52 also moves in the XY plane together with the photographing device 53.
The control device 6 controls the substrate holding device 51, the illumination device 52, and the imaging device 53. The control device 6 controls the substrate holding device 51 to adjust the relative position between the substrate P held by the holding member 51B and the field of view of the imaging device 53. The control device 6 controls the illumination device 52 to adjust the illumination conditions of the electronic component C mounted on the substrate P. The control device 6 controls the imaging device 53 to adjust imaging conditions including at least one of the timing of imaging the electronic component C, the shutter speed, and the aperture of the optical system 53A.
The control device 6 controls the illumination device 52 to illuminate the electronic component C mounted on the substrate P under a plurality of illumination conditions. The control device 6 controls the imaging device 53 to acquire a plurality of images of the electronic component C illuminated under a plurality of illumination conditions.
The control device 6 illuminates the electronic component C under each of the 1st, 2nd, and 3rd illumination conditions: the 1st illumination condition illuminates the electronic component C with the illumination light emitted from the 1st light source 54A, the 2nd illumination condition with that emitted from the 2nd light source 54B, and the 3rd illumination condition with that emitted from the 3rd light source 54C. The control device 6 acquires an image of the electronic component C illuminated under each of the three illumination conditions.
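The acquisition sequence, with the one-source-at-a-time rule stated earlier, can be sketched as below. The hardware interfaces are stubbed, and the source identifiers are only illustrative labels:

```python
def acquire_per_condition(source_ids, set_source, capture):
    """For each illumination condition, turn on only that condition's light
    source (all others off) and capture one image under it."""
    images = {}
    for active in source_ids:
        for s in source_ids:
            set_source(s, s == active)  # exactly one source emits at a time
        images[active] = capture()
    return images

# Stub hardware: the "camera" records which sources were lit for each frame.
lit = {}
snapshots = acquire_per_condition(
    ["54A", "54B", "54C"],
    lambda s, on: lit.__setitem__(s, on),
    lambda: frozenset(s for s, on in lit.items() if on),
)
print(snapshots)  # each frame was taken with exactly its own source lit
```

Each captured frame thus corresponds to exactly one illumination condition, which is what allows the frames to be displayed side by side and compared.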
Using an optical character recognition (OCR) technique, the control device 6 extracts the mark D from the image of the electronic component C acquired by the imaging device 53 and recognizes it. Based on the recognition result of the mark D, the control device 6 determines whether the electronic component C mounted on the substrate P is appropriate, that is, whether the correct electronic component C is mounted on the substrate P. Specifically, the control device 6 compares the mark D recognized from the image of the electronic component C with a registered mark registered in advance, and thereby determines whether the electronic component C mounted on the substrate P is appropriate.
The display device 7 has a display screen for displaying display data. A flat-panel display such as a liquid crystal display (LCD) or an organic EL display (OELD) is exemplified as the display device 7. The operator can check the display screen of the display device 7.
The input device 8 is operated by an operator. The input device 8 generates input data by an operation performed by an operator. The input data generated by the input device 8 is output to the control device 6. As the input device 8, at least one of a keyboard for a computer, a mouse, buttons, switches, and a touch panel is exemplified.
[ display device ]
Fig. 6 is a diagram showing an example of display data displayed on the display screen of the display device 7 according to the embodiment. The control device 6 generates display data based on the image of the electronic component C acquired by the imaging device 53, and displays the generated display data on the display device 7.
Mark D is provided on the surface of main body Cb of electronic component C. The control device 6 controls the illumination device 52 to illuminate the electronic component C mounted on the substrate P under a plurality of illumination conditions. The control device 6 controls the imaging device 53 to acquire a plurality of images of the electronic component C illuminated under a plurality of illumination conditions. The control device 6 displays a plurality of images M of the electronic component C illuminated under a plurality of illumination conditions, respectively, in a line on the display screen of the display device 7.
When the mark D is recognized by an optical character recognition (OCR) technique, an image of the mark D suitable for optical character recognition must be acquired. An example of such an image is a clear image of the mark D. If a clear image of the mark D can be acquired, a decrease in the recognition rate of the mark D is suppressed.
Factors that prevent the clear image of the mark D from being obtained may exist in the electronic component C. As factors that prevent the acquisition of a clear image of the mark D, at least one of the following is exemplified: the depth of the mark D is shallow, the fine irregularities present on the surface of the body Cb, and the regions where the reflectance of the illumination light locally present on the surface of the body Cb differs.
When the mark D is lightly engraved when the mark D is engraved on the surface of the body Cb, it may be difficult to obtain a clear image of the mark D. When fine irregularities exist on the surface of the body Cb, it may be difficult to obtain a clear image of the mark D. When a region having a different reflectance of illumination light exists locally on the surface of the body Cb, it may be difficult to obtain a clear image of the mark D.
Even when a factor that hinders the acquisition of a clear image of the mark D exists in the electronic component C, it is possible to acquire a clear image of the mark D by adjusting the illumination condition at the time of acquiring the image of the electronic component C. That is, by adjusting the illumination condition when the image of the electronic component C is acquired, it is possible to suppress the decrease in the recognition rate of the mark D.
A plurality of images M of the electronic component C illuminated under the plurality of illumination conditions are displayed on the display device 7. The operator checks the plurality of images M of the electronic component C displayed on the display device 7 and selects the image M to be used for optical character recognition from the plurality of images M. By checking the plurality of images M and selecting the image M in which the mark D appears clearly, that is, the image M of the mark D suitable for optical character recognition, the operator can suppress a decrease in the recognition rate of the mark D.
The control device 6 displays the plurality of images M of the electronic component C illuminated under the plurality of illumination conditions in a line on the display screen of the display device 7. The operator can efficiently select the image M of the mark D suitable for optical character recognition by checking the plurality of images M arranged on the display screen of the display device 7. This suppresses a reduction in work efficiency during inspection.
The control device 6 can perform image processing on the image M of the electronic component C acquired by the imaging device 53. The image M displayed on the display device 7 includes an image Ma before image processing and an image Mb after image processing. The control device 6 displays the image Ma before image processing and the image Mb after image processing in a line on the display screen of the display device 7. The control device 6 causes the plurality of images Ma before image processing to be displayed in a line on the display screen of the display device 7. The control device 6 displays the plurality of images Mb after the image processing in a line on the display screen of the display device 7.
As shown in fig. 6, the display screen of the display device 7 includes: a 1 st display area 100 for displaying images Ma before image processing; a 2 nd display area 200 for displaying images Mb after image processing; a 3 rd display area 300 for displaying images Mc of at least one of before and after image processing; a 4 th display area 400 for displaying images Md of at least one of before and after image processing; and a 5 th display area 500 for displaying an image Me of at least one of before and after image processing.
At least a part of the 1 st display area 100 is disposed at an upper left portion of the display screen. The 2 nd display area 200 is set below the 1 st display area 100 in the display screen. The 3 rd display area 300 is set at the upper right portion of the display screen. The 4 th display area 400 is set below the 3 rd display area 300 in the display screen. The 5 th display area 500 is set below the 4 th display area 400 in the display screen.
The image Ma before image processing includes: the image Ma1 of the electronic component C illuminated under the 1 st illumination condition, the image Ma2 of the electronic component C illuminated under the 2 nd illumination condition, and the image Ma3 of the electronic component C illuminated under the 3 rd illumination condition. The control device 6 displays an image Ma1 of the electronic component C illuminated under the 1 st illumination condition in the region 101 of the 1 st display region 100. The control device 6 displays the image Ma2 of the electronic component C illuminated under the 2 nd illumination condition in the region 102 of the 1 st display region 100. The control device 6 displays the image Ma3 of the electronic component C illuminated under the 3 rd illumination condition in the area 103 of the 1 st display area 100.
The region 101, the region 102, and the region 103 are arranged laterally in the 1 st display region 100.
In fig. 6, the "upper image" refers to an image Ma1 of the electronic component C illuminated under the 1 st illumination condition. The "medium image" refers to the image Ma2 of the electronic component C illuminated under the 2 nd illumination condition. The "lower image" refers to the image Ma3 of the electronic component C illuminated under the 3 rd illumination condition.
The image processing includes four arithmetic operations of the plurality of images Ma before the image processing. The four arithmetic operations include: an addition process of adding the plurality of images Ma, a subtraction process of subtracting the 2 nd image Ma from the 1 st image Ma among the plurality of images Ma, a multiplication process of multiplying the plurality of images Ma, and a division process of dividing the 1 st image Ma among the plurality of images Ma by the 2 nd image Ma.
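As a concrete illustration, the four arithmetic operations on a pair of images can be sketched as follows. This is a minimal sketch assuming 8-bit grayscale images; the rescaling for multiplication and the handling of overflow and division by zero are assumptions chosen for illustration, since the text does not specify them, and `combine` is a hypothetical helper name.

```python
import numpy as np

def combine(img1, img2, op):
    """Combine two grayscale images (uint8 arrays) with one of the four
    arithmetic operations. Overflow handling and normalization are
    illustrative assumptions, not specified in the text."""
    a = img1.astype(np.float64)
    b = img2.astype(np.float64)
    if op == "+":
        r = a + b                           # addition process
    elif op == "-":
        r = a - b                           # subtraction process
    elif op == "*":
        r = a * b / 255.0                   # multiplication, rescaled to 0-255
    elif op == "/":
        r = a / np.maximum(b, 1.0) * 255.0  # division, guarded against /0
    else:
        raise ValueError(op)
    return np.clip(r, 0, 255).astype(np.uint8)  # clip back into uint8 range

ma1 = np.full((2, 2), 100, dtype=np.uint8)
ma2 = np.full((2, 2), 50, dtype=np.uint8)
print(combine(ma1, ma2, "+")[0, 0])  # 150
```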
The control device 6 can perform addition processing of adding the image Ma1 to at least one of the image Ma1, the image Ma2, and the image Ma3. The control device 6 can perform addition processing of adding the image Ma2 to at least one of the image Ma2 and the image Ma3. The control device 6 can perform addition processing of adding the image Ma3 to the image Ma3.
The control device 6 can perform subtraction processing of subtracting at least one of the image Ma2 and the image Ma3 from the image Ma1. The control device 6 can perform subtraction processing of subtracting at least one of the image Ma3 and the image Ma1 from the image Ma2. The control device 6 can perform subtraction processing of subtracting at least one of the image Ma1 and the image Ma2 from the image Ma3.
The control device 6 can execute multiplication processing of multiplying the image Ma1 by at least one of the image Ma1, the image Ma2, and the image Ma3. The control device 6 can execute multiplication processing of multiplying the image Ma2 by at least one of the image Ma2 and the image Ma3. The control device 6 can execute multiplication processing of multiplying the image Ma3 by the image Ma3.
The control device 6 can perform division processing of dividing the image Ma1 by at least one of the image Ma2 and the image Ma3. The control device 6 can perform division processing of dividing the image Ma2 by at least one of the image Ma3 and the image Ma1. The control device 6 can perform division processing of dividing the image Ma3 by at least one of the image Ma1 and the image Ma2.
The operator can select at least one of the image processing methods of addition, subtraction, multiplication, and division by operating the input device 8. The control device 6 executes the image processing method selected by the operation of the input device 8. For example, when the addition process is selected, "+" that is a mark indicating the addition process is displayed in the arithmetic display area 600 on the display screen of the display device 7.
When the addition process is selected, the control device 6 displays the image Mb1 obtained by adding the image Ma1 and the image Ma1 in the area 201 of the 2 nd display area 200. The control device 6 displays the image Mb2 obtained by adding the image Ma1 and the image Ma2 in the area 202 of the 2 nd display area 200. The control device 6 displays the image Mb3 obtained by adding the image Ma1 and the image Ma3 in the area 203 of the 2 nd display area 200. The control device 6 displays the image Mb4 obtained by adding the image Ma2 and the image Ma1 in the area 204 of the 2 nd display area 200. The control device 6 displays the image Mb5 obtained by adding the image Ma2 and the image Ma2 in the area 205 of the 2 nd display area 200. The control device 6 displays the image Mb6 obtained by adding the image Ma2 and the image Ma3 in the area 206 of the 2 nd display area 200. The control device 6 displays the image Mb7 obtained by adding the image Ma3 and the image Ma1 in the region 207 of the 2 nd display region 200. The control device 6 displays the image Mb8 obtained by adding the image Ma3 and the image Ma2 in the area 208 of the 2 nd display area 200. The control device 6 displays an image Mb9 obtained by adding the image Ma3 and the image Ma3 in the area 209 of the 2 nd display area 200.
The region 201, the region 202, and the region 203 are arranged laterally in the 2 nd display region 200. The region 204, the region 205, and the region 206 are arranged laterally in the 2 nd display region 200. The region 207, the region 208, and the region 209 are laterally arranged in the 2 nd display region 200. The region 201, the region 204, and the region 207 are arranged vertically in the 2 nd display region 200. The region 202, the region 205, and the region 208 are arranged vertically in the 2 nd display region 200. The region 203, the region 206, and the region 209 are arranged vertically in the 2 nd display region 200.
As described above, the control device 6 arranges the plurality of images M including the plurality of images Ma before image processing and the plurality of images Mb after image processing in a matrix on the display screen of the display device 7.
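The 3×3 matrix of images Mb1 to Mb9 described above (row i, column j combines the i-th source image with the j-th source image) can be sketched as follows for the addition case. This is a sketch assuming 8-bit grayscale images with sums clipped to 255; `addition_grid` is a hypothetical helper name.

```python
import numpy as np

def addition_grid(images):
    """Row i, column j of the grid holds images[i] + images[j], matching
    the layout of the 2 nd display area (regions 201 to 209)."""
    n = len(images)
    return [[np.clip(images[i].astype(np.int32) + images[j].astype(np.int32),
                     0, 255).astype(np.uint8)
             for j in range(n)]
            for i in range(n)]

# three source images Ma1..Ma3, here uniform gray levels for illustration
ma = [np.full((2, 2), v, dtype=np.uint8) for v in (60, 100, 140)]
grid = addition_grid(ma)
print(grid[0][2][0, 0])  # Ma1 + Ma3 = 60 + 140 = 200
```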
When the subtraction process is selected, "-", which is a mark indicating the subtraction process, is displayed in the arithmetic display area 600. When the multiplication process is selected, "×", which is a mark indicating the multiplication process, is displayed in the arithmetic display area 600. When the division process is selected, "÷", which is a mark indicating the division process, is displayed in the arithmetic display area 600.
The control device 6 can apply a weighting coefficient to each image Ma when performing the four arithmetic operations. The operator can operate the input device 8 to specify the weighting coefficients. The display screen of the display device 7 includes a coefficient display area 700 for displaying the specified weighting coefficients.
The coefficient display area 700 includes: a region 701 set between the region 101 and the region 201 in the display screen; a region 702 set between the region 102 and the region 202; region 703, which is set between region 103 and region 203; an area 704 set on the left side of the area 201 on the display screen; a region 705 set to the left of the region 204; and a region 706 set to the left of the region 207.
The weighting coefficient displayed in the region 701 is the weighting coefficient applied to the image Ma1 common to the arithmetic processing of the images Mb displayed in the region 201, the region 204, and the region 207. The weighting coefficient displayed in the region 702 is the weighting coefficient applied to the image Ma2 common to the arithmetic processing of the images Mb displayed in the regions 202, 205, and 208. The weighting coefficient displayed in the region 703 is the weighting coefficient applied to the image Ma3 common to the arithmetic processing of the images Mb displayed in the regions 203, 206, and 209. The weighting coefficient displayed in the region 704 is the weighting coefficient applied to the image Ma1 common to the arithmetic processing of the images Mb displayed in the regions 201, 202, and 203. The weighting coefficient displayed in the region 705 is the weighting coefficient applied to the image Ma2 common to the arithmetic processing of the images Mb displayed in the regions 204, 205, and 206. The weighting coefficient displayed in the region 706 is the weighting coefficient applied to the image Ma3 common to the arithmetic processing of the images Mb displayed in the regions 207, 208, and 209.
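The weighted operation can be sketched as below. The exact formula is not given in the text, so a simple linear weighting — the row coefficient scales the row's source image and the column coefficient scales the column's source image before the addition — is an assumption, and `weighted_combine` is a hypothetical helper name.

```python
import numpy as np

def weighted_combine(img_row, img_col, w_row, w_col):
    """Weighted addition: w_row scales the row's source image (regions
    704-706) and w_col scales the column's source image (regions
    701-703) before the images are added. Assumed formula; the result
    is clipped back into the uint8 range."""
    r = w_row * img_row.astype(np.float64) + w_col * img_col.astype(np.float64)
    return np.clip(r, 0, 255).astype(np.uint8)

ma1 = np.full((2, 2), 100, dtype=np.uint8)
ma2 = np.full((2, 2), 50, dtype=np.uint8)
print(weighted_combine(ma1, ma2, 0.5, 2.0)[0, 0])  # 0.5*100 + 2.0*50 = 150
```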
The operator checks the plurality of images M of the electronic component C displayed on the display device 7 and selects, from the plurality of images M, the image M of the mark D suitable for optical character recognition, that is, the image M to be used for optical character recognition.
The image processing includes at least one of sharpness processing and contrast processing. The sharpness processing includes processing for sharpening the image M and processing for blurring the image M. The contrast processing includes processing for increasing the contrast of the image M and processing for decreasing the contrast of the image M.
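Both processing directions named above can be sketched, for example, as follows. The 3×3 box blur, unsharp-mask sharpening, and mid-gray-pivot contrast scaling are assumptions chosen for illustration, since the text does not fix the algorithms; all helper names are hypothetical.

```python
import numpy as np

def box_blur(img):
    """Sharpness processing, blurring direction: 3x3 box filter with
    edge padding."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    h, w = img.shape
    out = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return np.clip(out, 0, 255).astype(np.uint8)

def sharpen(img):
    """Sharpness processing, sharpening direction: unsharp masking,
    i.e. original + (original - blurred)."""
    detail = img.astype(np.float64) - box_blur(img).astype(np.float64)
    return np.clip(img.astype(np.float64) + detail, 0, 255).astype(np.uint8)

def adjust_contrast(img, gain):
    """Contrast processing: gain > 1 increases contrast, gain < 1
    decreases it, pivoting around mid-gray (128)."""
    r = (img.astype(np.float64) - 128.0) * gain + 128.0
    return np.clip(r, 0, 255).astype(np.uint8)

flat = np.full((3, 3), 100, dtype=np.uint8)
print(adjust_contrast(flat, 2.0)[0, 0])  # (100-128)*2 + 128 = 72
```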
The image M to be subjected to the sharpness processing is selected from the plurality of images Ma displayed in the 1 st display area 100 and the plurality of images Mb displayed in the 2 nd display area 200. The operator selects the image Ma or the image Mb of the mark D suitable for optical character recognition from the plurality of images Ma displayed in the 1 st display area 100 and the plurality of images Mb displayed in the 2 nd display area 200.
The operator operates the input device 8 to select the image Ma or the image Mb to be subjected to the sharpness processing from the plurality of images Ma displayed in the 1 st display area 100 and the plurality of images Mb displayed in the 2 nd display area 200. In the embodiment, the image Ma3 displayed in the region 103 is selected.
The control device 6 causes the image Ma3, selected by the operation of the input device 8 from the plurality of images Ma and images Mb displayed in the 1 st display area 100 and the 2 nd display area 200, to be displayed in a display mode different from that of the unselected images M (Ma1, Ma2, Mb1 to Mb9). In the embodiment, the control device 6 displays the background of the image Ma3 in the region 103 in a color different from the background of the unselected images M (Ma1, Ma2, Mb1 to Mb9).
The control device 6 displays an image Mc1, obtained by blurring the image Ma3 through the sharpness processing, in the area 301 of the 3 rd display area 300. The control device 6 displays the image Mc2, i.e., the image Ma3 without the sharpness processing, in the area 302 of the 3 rd display area 300. The control device 6 displays the image Mc3, obtained by sharpening the image Ma3 through the sharpness processing, in the area 303 of the 3 rd display area 300.
The region 301, the region 302, and the region 303 are arranged laterally in the 3 rd display region 300.
The image M to be subjected to the contrast processing is selected from the plurality of images Mc displayed in the 3 rd display area 300. The operator selects the image Mc of the mark D suitable for optical character recognition from the plurality of images Mc displayed in the 3 rd display area 300.
The operator operates the input device 8 to select an image Mc to be subjected to contrast processing from the plurality of images Mc displayed in the 3 rd display area 300. In the embodiment, it is assumed that the image Mc2 (i.e., the unprocessed image Ma3) displayed in the region 302 is selected.
The control device 6 causes the image Mc2, selected by the operation of the input device 8 from the plurality of images Mc displayed in the 3 rd display area 300, to be displayed in a display mode different from that of the unselected images Mc (Mc1, Mc3). In the embodiment, the control device 6 displays the background of the image Mc2 in the area 302 in a color different from the background of the unselected images Mc (Mc1, Mc3).
The control device 6 displays the image Md1, in which the contrast of the image Mc2 is reduced by the contrast processing, in the area 401 of the 4 th display area 400. The control device 6 displays the image Md2, i.e., the image Mc2 without the contrast processing, in the area 402 of the 4 th display area 400. The control device 6 displays the image Md3, in which the contrast of the image Mc2 is increased by the contrast processing, in the area 403 of the 4 th display area 400.
The region 401, the region 402, and the region 403 are arranged laterally in the 4 th display region 400.
The sizes of the plurality of images M of the electronic component C displayed in each of the 1 st display area 100, the 2 nd display area 200, the 3 rd display area 300, and the 4 th display area 400 are substantially equal to each other. The color of the symbol D included in the image M of the electronic component C displayed in each of the 1 st display area 100, the 2 nd display area 200, the 3 rd display area 300, and the 4 th display area 400 is the 1 st color (for example, white).
The image M to be used for optical character recognition is selected from the plurality of images Md displayed in the 4 th display area 400. The operator selects the image Md of the mark D suitable for optical character recognition from the plurality of images Md displayed in the 4 th display area 400.
The operator operates the input device 8 to select the image Md to be used for the optical character recognition from the plurality of images Md displayed in the 4 th display area 400. In the embodiment, it is assumed that the image Md2 (i.e., the unprocessed image Ma3) displayed in the region 402 is selected.
The control device 6 causes the image Md2, selected by the operation of the input device 8 from the plurality of images Md displayed in the 4 th display area 400, to be displayed in a display mode different from that of the unselected images Md (Md1, Md3). In the embodiment, the control device 6 displays the background of the image Md2 in the region 402 in a color different from the background of the unselected images Md (Md1, Md3).
The control device 6 displays an image Me (Md2) used for the optical character recognition in the 5 th display area 500. The size of the image Me of the electronic component C displayed in the 5 th display area 500 is larger than the sizes of the plurality of images M (Ma, Mb, Mc, Md) of the electronic component C displayed in each of the 1 st, 2 nd, 3 rd, and 4 th display areas 100, 200, 300, and 400.
The control device 6 extracts the mark D included in the selected image Me by using the optical character recognition technique. The control device 6 recognizes the extracted mark D by, for example, a pattern matching method.
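A pattern-matching recognition step of this kind could look like the following sketch: the cut-out mark image is compared pixel-wise with registered character templates and the best-scoring label is taken. The similarity metric, the sample templates, and the helper names are assumptions; the text only states that a pattern matching method is used.

```python
import numpy as np

def match_mark(mark_img, templates):
    """Return the template label that best matches the extracted mark,
    together with its score (fraction of agreeing pixels). All images
    are assumed binarized and of identical size."""
    best_label, best_score = None, -1.0
    for label, tpl in templates.items():
        score = float(np.mean(mark_img == tpl))
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score

# hypothetical registered templates for two characters
templates = {
    "A": np.array([[255, 0], [0, 255]], dtype=np.uint8),
    "B": np.array([[0, 255], [255, 0]], dtype=np.uint8),
}
mark = np.array([[255, 0], [0, 255]], dtype=np.uint8)
print(match_mark(mark, templates))  # ('A', 1.0)
```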
In the embodiment, a button 800 labeled "update" is displayed on the display screen of the display device 7. The operator operates the button 800 via the input device 8. When the button 800 is operated, the control device 6 starts recognition of the extracted mark D.
When the recognition of the mark D succeeds, the control device 6 causes the mark D included in the image Me displayed in the 5 th display area 500 to be displayed in a 2 nd color (for example, blue) different from the 1 st color (white) of the mark D included in the images M of the electronic component C displayed in the 1 st display area 100, the 2 nd display area 200, the 3 rd display area 300, and the 4 th display area 400. On the other hand, when the recognition of the mark D fails, the control device 6 causes the mark D included in the image Me displayed in the 5 th display area 500 to remain displayed in the same 1 st color (white) as the mark D included in those images M. That is, the control device 6 causes the display device 7 to display the mark D that has been successfully recognized and the mark D that has not been successfully recognized in different display modes.
After operating the button 800, the operator can check whether or not the recognition of the mark D succeeded by checking the color of the mark D included in the image Me. When the color of the mark D included in the image Me has changed from the 1 st color to the 2 nd color, the operator can confirm that the mark D was successfully recognized. When the color of the mark D included in the image Me remains the 1 st color, the operator can confirm that the recognition of the mark D failed.
[ control device ]
Fig. 7 is a functional block diagram showing the control device 6 according to the embodiment. The control device 6 includes an input data acquisition unit 61, a relative position control unit 62, an illumination control unit 63, an imaging control unit 64, an image acquisition unit 65, an image processing unit 66, a symbol recognition unit 67, a determination unit 68, a storage unit 69, and a display control unit 70.
The input data acquisition unit 61 acquires input data generated by operating the input device 8.
The relative position control unit 62 outputs a control command to the substrate holding device 51 to adjust the relative position in the XY plane between the holding member 51B holding the substrate P and the imaging device 53. The relative position control unit 62 adjusts the relative position in the XY plane between the electronic component C mounted on the substrate P and the field of view of the imaging device 53.
The illumination control unit 63 outputs a control command to the illumination device 52, and adjusts the illumination conditions when the electronic component C mounted on the substrate P is illuminated with the illumination light. The illumination control unit 63 controls the illumination device 52 so that the electronic component C is illuminated under at least one of the 1 st illumination condition, the 2 nd illumination condition, and the 3 rd illumination condition.
The imaging control unit 64 outputs a control command to the imaging device 53 to control imaging conditions including at least one of the timing of imaging the electronic component C, the shutter speed, and the aperture of the optical system 53A.
The image acquisition unit 65 acquires a plurality of images of the electronic component C illuminated by the illumination device 52 under a plurality of illumination conditions in a state of being mounted on the substrate P. The electronic component C is provided with the mark D. The image acquisition unit 65 acquires the plurality of images of the electronic component C illuminated under the plurality of illumination conditions and captured by the imaging device 53. The image acquisition unit 65 acquires an image of the electronic component C including the mark D when illuminated under the 1 st illumination condition, an image of the electronic component C including the mark D when illuminated under the 2 nd illumination condition, and an image of the electronic component C including the mark D when illuminated under the 3 rd illumination condition.
The image processing unit 66 performs image processing on the image acquired by the image acquisition unit 65. The image processing includes four arithmetic operations of the plurality of images Ma before the image processing. The image processing includes at least one of sharpness processing and contrast processing.
The symbol recognition unit 67 extracts the mark D included in the image Me selected by the operator via the input device 8 from the plurality of images M displayed on the display screen of the display device 7. The symbol recognition unit 67 cuts out the mark D from the image Me. The symbol recognition unit 67 recognizes the extracted mark D based on, for example, a pattern matching method.
The determination unit 68 compares the mark D recognized by the symbol recognition unit 67 with the registration mark stored in the storage unit 69, and determines whether the electronic component C mounted on the substrate P is appropriate. The registration mark is the mark given to the correct electronic component C to be mounted on the substrate P. The registration mark is determined in advance and is registered in the storage unit 69.
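The determination step can be sketched as a simple comparison. The per-location storage of registration marks, the sample data, and the exact-equality rule are assumptions for illustration; the text only states that the recognized mark and the registration mark are compared.

```python
# storage unit 69 analogue: registration marks per board location
# (hypothetical sample data)
registered_marks = {"U1": "ABC123", "U2": "XYZ999"}

def judge_component(location, recognized_mark):
    """The mounted component is judged appropriate when the recognized
    mark equals the registration mark stored for that location."""
    return registered_marks.get(location) == recognized_mark

print(judge_component("U1", "ABC123"))  # True
```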
The display control unit 70 causes the display device 7 to display the display data. As described with reference to fig. 6, the display control unit 70 displays a plurality of images Ma of the electronic component C illuminated under a plurality of illumination conditions, respectively, in a line on the display screen of the display device 7. The display control unit 70 displays the image Ma before image processing and the image Mb after image processing in a line on the display screen of the display device 7. The display control unit 70 displays the plurality of images Mb after the image processing in an aligned manner on the display screen of the display device 7. The display control unit 70 arranges the plurality of images M in a matrix on the display screen of the display device 7.
As described with reference to the area 103, the area 302, and the area 402 of fig. 6, the display control unit 70 causes the image M selected by the operator from the plurality of images M displayed on the display screen of the display device 7 via the input device 8 and the image M not selected to be displayed on the display screen of the display device 7 in different display modes.
The display control unit 70 causes the display screen to display the mark D successfully recognized by the symbol recognition unit 67 and the mark D not successfully recognized in different display modes. As described with reference to the image Me of fig. 6, when the recognition of the mark D by the symbol recognition unit 67 succeeds, the display control unit 70 causes the mark D included in the image Me displayed in the 5 th display area 500 to be displayed in the 2 nd color (blue). When the recognition of the mark D by the symbol recognition unit 67 fails, the display control unit 70 causes the mark D included in the image Me displayed in the 5 th display area 500 to be displayed in the 1 st color (white).
[ inspection method ]
Fig. 8 is a flowchart showing an inspection method according to the embodiment. The substrate P on which the electronic component C is mounted by the electronic component mounting device 3 and which is subjected to reflow soldering and cooling by the reflow furnace 4 is carried into the inspection device 5. The holding member 51B of the substrate holding device 51 holds the substrate P on which the electronic component C is mounted. The relative position control unit 62 adjusts the relative position in the XY plane between the substrate P held by the holding member 51B and the field of view of the imaging device 53 so that the electronic component C is disposed in the field of view of the imaging device 53.
The illumination control unit 63 controls the illumination device 52 to cause the illumination device 52 to illuminate the electronic component C mounted on the substrate P under a plurality of illumination conditions. The imaging control unit 64 controls the imaging device 53 to cause the imaging device 53 to image the electronic component C illuminated under each of the plurality of illumination conditions.
The illumination control unit 63 emits illumination light from the 1 st light source 54A to illuminate the electronic component C mounted on the substrate P under the 1 st illumination condition. The imaging control unit 64 controls the imaging device 53 to cause the imaging device 53 to image the electronic component C illuminated under the 1 st illumination condition.
The illumination control unit 63 emits illumination light from the 2 nd light source 54B to illuminate the electronic component C mounted on the substrate P under the 2 nd illumination condition. The imaging control unit 64 controls the imaging device 53 to cause the imaging device 53 to image the electronic component C illuminated under the 2 nd illumination condition.
The illumination control section 63 emits illumination light from the 3 rd light source 54C to illuminate the electronic component C mounted on the substrate P under the 3 rd illumination condition. The imaging control unit 64 controls the imaging device 53 to cause the imaging device 53 to image the electronic component C illuminated under the 3 rd illumination condition.
The image acquisition unit 65 acquires a plurality of images of the electronic component C illuminated under the plurality of illumination conditions (step S1).
The image acquisition unit 65 acquires an image of the electronic component C including the mark D when illuminated under the 1 st illumination condition, an image of the electronic component C including the mark D when illuminated under the 2 nd illumination condition, and an image of the electronic component C including the mark D when illuminated under the 3 rd illumination condition.
The operator operates the input device 8 to select the image processing method of the four arithmetic operations. The operator operates the input device 8 to select at least one of the addition process, the subtraction process, the multiplication process, and the division process. By operating the input device 8, input data for selecting an image processing method is generated. The input data acquisition unit 61 acquires input data for selecting an image processing method (step S2).
The image processing unit 66 performs four arithmetic operations on the image Ma based on the image processing method selected by the operator via the input device 8 (step S3).
In the embodiment, the image processing unit 66 performs addition processing of the images Ma. By performing the image processing, the images Mb after image processing are generated.
The display controller 70 causes the display device 7 to display a plurality of images Ma (Ma1 to Ma3) of the electronic component C illuminated under the plurality of illumination conditions, which are acquired in step S1, in an aligned manner in the 1 st display region 100. The display controller 70 causes the plurality of images Mb (Mb1 to Mb9) after the image processing generated in step S3 to be displayed in a line in the 2 nd display area 200 of the display device 7 (step S4).
The operator checks the plurality of images M (Ma1 to Ma3, Mb1 to Mb9) displayed on the display device 7, and selects an image M to be used for optical character recognition from the plurality of images M displayed on the display device 7. The operator operates the input device 8 to select the image M of the mark D suitable for the optical character recognition. By operating the input device 8, input data for selecting the image M is generated. The input data acquisition unit 61 acquires the input data for selecting the image M (step S5).
As described with reference to the region 103 of fig. 6, the display control unit 70 displays the image Ma3 selected by the operator from the plurality of images M displayed on the display device 7 via the input device 8 and the images M (Ma1, Ma2, Mb1 to Mb9) that are not selected, in different display modes.
The image processing unit 66 executes sharpness processing of the image Ma3 selected by the operator via the input device 8 (step S6).
The display control unit 70 displays the sharpness-processed image Mc in the area 301 and the area 303 of the 3 rd display area 300 of the display device 7. Further, the display control unit 70 displays the image Mc that is not subjected to the sharpness processing in the area 302 of the 3 rd display area 300 of the display device 7 (step S7).
The operator checks the plurality of images Mc (Mc1 to Mc3) displayed on the display device 7 and selects the image Mc to be used for optical character recognition from the plurality of images Mc displayed on the display device 7. The operator operates the input device 8 to select the image Mc of the mark D suitable for optical character recognition. By operating the input device 8, input data for selecting the image Mc is generated. The input data acquisition unit 61 acquires the input data for selecting the image Mc (step S8).
As described with reference to the area 302 of fig. 6, the display control unit 70 displays the image Mc2 selected by the operator from the plurality of images Mc displayed on the display device 7 via the input device 8 and the images Mc (Mc1, Mc3) that are not selected in different display modes.
The image processing unit 66 executes contrast processing of the image Mc selected by the operator via the input device 8 (step S9).
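The contrast processing of step S9 is likewise not specified in the patent; one hedged illustration is a linear contrast stretch, which maps the darkest pixel to 0 and the brightest to 255 so that the mark D spans the full grayscale range.

```python
# Hypothetical sketch of a contrast step for the image processing unit 66;
# the actual contrast algorithm is not disclosed. A linear contrast
# stretch rescales the image so its darkest pixel becomes 0 and its
# brightest becomes 255.

def contrast_stretch(image):
    lo = min(min(row) for row in image)
    hi = max(max(row) for row in image)
    if hi == lo:  # a flat image has no contrast to stretch
        return [row[:] for row in image]
    scale = 255 / (hi - lo)
    return [[round((p - lo) * scale) for p in row] for row in image]

dim = [[100, 110], [120, 130]]
# after stretching, 100 maps to 0 and 130 maps to 255
assert contrast_stretch(dim) == [[0, 85], [170, 255]]
```

A low-contrast mark D engraved or printed in nearly the same tone as the component body becomes a high-contrast glyph, which is typically easier for optical character recognition to segment.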
The display control unit 70 displays the contrast-processed image Md in the area 401 and the area 403 of the 4th display area 400 of the display device 7. Further, the display control unit 70 displays the image Md that has not been subjected to the contrast processing in the area 402 of the 4th display area 400 of the display device 7 (step S10).
The operator checks the plurality of images Md (Md1 to Md3) displayed on the display device 7 and selects an image Md to be used for optical character recognition from the plurality of images Md displayed on the display device 7. The operator operates the input device 8 to select an image Md in which the mark D is suitable for optical character recognition. By operating the input device 8, input data for selecting the image Md is generated. The input data acquisition unit 61 acquires the input data for selecting the image Md (step S11).
As described with reference to the area 402 of fig. 6, the display control unit 70 causes the image Md2 selected by the operator via the input device 8 to be displayed in a display form different from that of the unselected images Md (Md1, Md3).
The display control unit 70 causes the selected image Me to be displayed in the 5th display area 500 (step S12).
The operator operates the button 800 via the input device 8. When the button 800 is operated, the mark recognition unit 67 extracts the mark D included in the image Me. The mark recognition unit 67 recognizes the extracted mark D based on, for example, a pattern matching method. That is, the mark recognition unit 67 recognizes the mark D by using an optical character recognition technique.
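The patent names only "a pattern matching method" for the mark recognition unit 67; the templates and agreement score below are invented solely to illustrate the idea of matching an extracted glyph against registered character templates.

```python
# Illustrative pattern-matching character recognition in the spirit of
# the mark recognition unit 67. The 3x3 binary glyph templates and the
# pixel-agreement score are assumptions for this sketch only; they are
# not taken from the patent.

TEMPLATES = {
    "T": [[1, 1, 1],
          [0, 1, 0],
          [0, 1, 0]],
    "L": [[1, 0, 0],
          [1, 0, 0],
          [1, 1, 1]],
}

def match_score(glyph, template):
    # fraction of pixels on which the glyph and the template agree
    total = sum(len(row) for row in template)
    same = sum(g == t
               for grow, trow in zip(glyph, template)
               for g, t in zip(grow, trow))
    return same / total

def recognize(glyph, threshold=0.8):
    # the best-scoring template wins; below the threshold, recognition fails
    best, score = max(((ch, match_score(glyph, t))
                       for ch, t in TEMPLATES.items()),
                      key=lambda x: x[1])
    return best if score >= threshold else None

assert recognize(TEMPLATES["T"]) == "T"  # an exact glyph is recognized
```

Returning `None` on a low score corresponds to the "recognition failed" branch of step S13, after which the mark D is shown in the 1st color rather than the 2nd.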
The mark recognition unit 67 determines whether or not the recognition of the mark D is successful (step S13).
When it is determined in step S13 that the recognition of the mark D by the mark recognition unit 67 is successful (Yes in step S13), the display control unit 70 displays the mark D included in the image Me in the 2nd color (blue) (step S14).
When it is determined in step S13 that the recognition of the mark D by the mark recognition unit 67 has failed (No in step S13), the display control unit 70 causes the mark D included in the image Me to be displayed in the 1st color (white) (step S15).
The determination unit 68 compares the mark D recognized by the mark recognition unit 67 with the registered mark stored in the storage unit 69, and determines whether or not the correct electronic component C is mounted on the substrate P (step S16).
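The judgment of the determination unit 68 can be pictured as a string comparison between the recognized mark and the registered mark; the normalization shown below (stripping spaces, uppercasing) is an assumption for illustration, not something the patent specifies.

```python
# Sketch of the judgment performed by the determination unit 68: the
# recognized mark string is compared with the registered mark stored in
# the storage unit 69. The normalization is a hypothetical detail.

def judge_component(recognized_mark, registered_mark):
    if recognized_mark is None:
        return False  # recognition failed, so the component cannot be verified
    def norm(s):
        return s.replace(" ", "").upper()
    return norm(recognized_mark) == norm(registered_mark)

assert judge_component("ABC 123", "abc123") is True   # correct component
assert judge_component("XYZ999", "ABC123") is False   # wrong component
```

A `True` result corresponds to displaying the 1st display data (component correct, step S17), and `False` to the 2nd display data (component incorrect, step S18).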
In step S16, if the mark D recognized by the mark recognition unit 67 matches the registered mark stored in the storage unit 69 and it is determined that the correct electronic component C is mounted on the substrate P (Yes in step S16), the display control unit 70 causes the display device 7 to display the 1st display data indicating that the electronic component C is correct (step S17).
In step S16, if the mark D recognized by the mark recognition unit 67 does not match the registered mark stored in the storage unit 69 and it is determined that the correct electronic component C is not mounted on the substrate P (No in step S16), the display control unit 70 causes the display device 7 to display the 2nd display data indicating that the electronic component C is not correct (step S18).
[ computer System ]
Fig. 9 is a block diagram showing a computer system 1000 according to an embodiment. The control device 6 includes the computer system 1000. The computer system 1000 has: a processor 1001 such as a CPU (Central Processing Unit); a main memory 1002 including a nonvolatile memory such as a ROM (Read Only Memory) and a volatile memory such as a RAM (Random Access Memory); a storage 1003; and an interface 1004 including input-output circuitry. The functions of the control device 6 are stored in the storage 1003 as a program. The processor 1001 reads the program from the storage 1003, loads it into the main memory 1002, and executes the above-described processing in accordance with the program. The program may also be transmitted to the computer system 1000 via a network.
The program causes the computer system 1000 to execute the following operations according to the above-described embodiment: displaying a plurality of images of the electronic component C provided with the mark D and illuminated under a plurality of illumination conditions in a state of being mounted on the substrate P, in an aligned manner on a display screen of the display device 7; recognizing a mark D included in an image Me selected from a plurality of images M displayed on a display screen; and comparing the recognized mark D with the registered mark to determine whether the electronic component C is proper or not.
[ Effect ]
As described above, a plurality of images of the electronic component C provided with the mark D and illuminated under a plurality of illumination conditions in a state of being mounted on the substrate P are acquired, and the acquired plurality of images M are displayed in an aligned manner on the display screen of the display device 7. This allows the operator to efficiently select an image M suitable for use in optical character recognition by checking the plurality of images M arranged on the display screen of the display device 7. Therefore, a decrease in work efficiency during inspection is suppressed.
The image Ma acquired by the image acquisition unit 65 is subjected to image processing by the image processing unit 66. Thus, even if no image Ma in which the mark D is suitable for optical character recognition exists among the plurality of images Ma, an image Mb in which the mark D is suitable for optical character recognition may be generated by the image processing. Not only the images Ma before the image processing but also the images Mb after the image processing are displayed in line on the display screen of the display device 7. This enables the operator to efficiently select an image M suitable for use in optical character recognition from the plurality of images M (Ma, Mb) arranged on the display screen of the display device 7.
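Per claim 5, the image processing that derives the images Mb from the acquired images Ma includes the four arithmetic operations between images. As one hedged example, the per-pixel average of two images Ma taken under different illumination conditions (names are illustrative):

```python
# Illustrative image arithmetic in the sense of claim 5: a per-pixel
# average of two grayscale images taken under different illumination
# conditions. Averaging can suppress glare that appears in only one of
# the two illumination conditions.

def average_images(a, b):
    return [[(pa + pb) // 2 for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

ma1 = [[200, 40], [40, 200]]   # e.g. lit by the 1st light source
ma2 = [[100, 60], [60, 100]]   # e.g. lit by the 2nd light source
assert average_images(ma1, ma2) == [[150, 50], [50, 150]]
```

Subtraction, multiplication, and division between images would follow the same per-pixel pattern, with clamping to the valid pixel range as needed.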
When the operator selects the image M via the input device 8, the selected image M and the unselected image M are displayed in different display modes. This allows the operator to visually confirm the selected image M.
The operator operates the button 800 to start the optical character recognition process. If the recognition of the mark D is successful, the color of the mark D in the image Me changes. That is, a mark D that has been successfully recognized is displayed in a display form different from that of a mark D that has not been successfully recognized. Thus, the operator can visually confirm whether or not the optical character recognition is successful.
[ other embodiments ]
In the above-described embodiment, it is assumed that illumination light is not emitted from each of the 2 nd light source 54B and the 3 rd light source 54C when illumination light is emitted from the 1 st light source 54A, illumination light is not emitted from each of the 3 rd light source 54C and the 1 st light source 54A when illumination light is emitted from the 2 nd light source 54B, and illumination light is not emitted from each of the 1 st light source 54A and the 2 nd light source 54B when illumination light is emitted from the 3 rd light source 54C. The illumination condition may include a condition that illumination light is emitted from one or both of the 2 nd light source 54B and the 3 rd light source 54C in a state where illumination light is emitted from the 1 st light source 54A. The illumination condition may include a condition that illumination light is emitted from one or both of the 3 rd light source 54C and the 1 st light source 54A in a state where illumination light is emitted from the 2 nd light source 54B. The illumination condition may include a condition that illumination light is emitted from one or both of the 1 st light source 54A and the 2 nd light source 54B in a state where illumination light is emitted from the 3 rd light source 54C.
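The combinations described above can be pictured as follows: with three light sources, any non-empty subset may be lit at once, giving seven candidate illumination conditions. A small enumeration sketch (the source labels are illustrative):

```python
# Enumerating candidate illumination conditions for the alternative
# embodiment, where any non-empty combination of the 1st to 3rd light
# sources (54A, 54B, 54C) may emit illumination light simultaneously.

from itertools import combinations

SOURCES = ("54A", "54B", "54C")

def illumination_conditions():
    conds = []
    for n in range(1, len(SOURCES) + 1):
        conds.extend(combinations(SOURCES, n))
    return conds

# three single-source conditions, three pairs, and one all-on condition
assert len(illumination_conditions()) == 7
```

Each tuple corresponds to one illumination condition under which the image acquisition unit 65 could acquire an image of the electronic component C.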
In the above-described embodiment, the illumination condition is the incident angle θ of the illumination light incident on the electronic component C. The illumination condition may instead be the light amount of the illumination light emitted from the light source 54; the illumination device 52 can irradiate the electronic component C with illumination light at each of a plurality of light amounts. The illumination condition may also be the wavelength (color) of the illumination light emitted from the light source 54; for example, the illumination device 52 can irradiate the electronic component C with red light, green light, and blue light, respectively.

Claims (10)

1. An inspection apparatus, comprising:
an image acquisition unit that acquires a plurality of images of an electronic component that is provided with a mark and that is illuminated under a plurality of illumination conditions in a state of being mounted on a substrate; and
a display control unit that displays the plurality of images in an aligned manner on a display screen of a display device.
2. The inspection apparatus according to claim 1,
the display control unit arranges the plurality of images in a matrix on the display screen.
3. The inspection apparatus according to claim 1 or 2,
having an image processing unit that performs image processing on the images,
the display control unit displays an image before the image processing and an image after the image processing in an aligned manner.
4. The inspection apparatus according to claim 1,
having an image processing unit that performs image processing on the images,
the display control unit displays the plurality of images after the image processing in an aligned manner.
5. The inspection apparatus according to claim 3,
the image processing includes the four arithmetic operations (addition, subtraction, multiplication, and division) performed between a plurality of images.
6. The inspection apparatus according to claim 3,
the image processing includes at least one of sharpness processing and contrast processing.
7. The inspection apparatus according to claim 1,
the display control unit causes an image selected from the plurality of images displayed on the display screen and an image not selected to be displayed in different display modes.
8. The inspection apparatus according to claim 1,
comprising:
a mark recognition unit that recognizes the mark included in an image selected from the plurality of images displayed on the display screen; and
a determination unit that compares the mark recognized by the mark recognition unit with a registered mark to determine whether the electronic component is appropriate.
9. The inspection apparatus according to claim 8,
the display control unit displays the mark recognized by the mark recognition unit and the mark not recognized in different display modes.
10. An inspection method, comprising the steps of:
displaying, in an aligned manner on a display screen of a display device, a plurality of images of an electronic component that is provided with a mark and is illuminated under a plurality of illumination conditions in a state of being mounted on a substrate;
recognizing the mark included in an image selected from the plurality of images displayed on the display screen; and
comparing the recognized mark with a registered mark to determine whether the electronic component is appropriate.
CN202010231009.9A 2019-03-29 2020-03-27 Inspection device and inspection method Active CN111757667B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019066784A JP7281942B2 (en) 2019-03-29 2019-03-29 Inspection device and inspection method
JP2019-066784 2019-03-29

Publications (2)

Publication Number Publication Date
CN111757667A true CN111757667A (en) 2020-10-09
CN111757667B CN111757667B (en) 2023-08-25

Family

Family ID: 72673225

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010231009.9A Active CN111757667B (en) 2019-03-29 2020-03-27 Inspection device and inspection method

Country Status (4)

Country Link
JP (1) JP7281942B2 (en)
KR (1) KR20200115324A (en)
CN (1) CN111757667B (en)
TW (1) TW202044980A (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6573523B1 (en) * 2001-12-12 2003-06-03 Lsi Logic Corporation Substrate surface scanning
KR100887178B1 (en) * 2007-03-08 2009-03-09 (주)제이티 Method for recognizing character, and Method for recognizing character formed on Semiconductor device

Patent Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60144884A (en) * 1983-12-31 1985-07-31 Nippon Steel Corp Detecting method of printed letter
JPH05332948A (en) * 1992-06-02 1993-12-17 Nec Corp Defective part indicator in mounting board appearance inspection
CN1121368A (en) * 1993-04-21 1996-04-24 欧姆龙株式会社 Visual inspection support apparatus, substrate inspection apparatus, and soldering inspection and correction methods using the same apparatuses
JPH10207980A (en) * 1997-01-21 1998-08-07 Toshiba Corp Method and device for character recognition
JPH1183455A (en) * 1997-09-04 1999-03-26 Dainippon Printing Co Ltd Appearance inspecting device
JP2003106936A (en) * 2001-09-27 2003-04-09 Japan Science & Technology Corp Sensor head, luminance distribution measuring device provided with the same, appearance inspection device, and device for inspecting and evaluating display unevenness
JP2004153462A (en) * 2002-10-29 2004-05-27 Keyence Corp Magnifying observation apparatus, operating method of the magnifying observation apparatus, magnifying observation apparatus operating program, and computer-readable recording medium
JP2004235582A (en) * 2003-01-31 2004-08-19 Omron Corp Method for inspection of mounting error, and substrate inspection apparatus using this method
CN1751550A (en) * 2003-02-21 2006-03-22 富士机械制造株式会社 Pair circuit substrate operating machine
JP2005291761A (en) * 2004-03-31 2005-10-20 Anritsu Corp Printed circuit board inspection apparatus
JP2006100677A (en) * 2004-09-30 2006-04-13 Omron Corp Method of examining component packaging state and component packaging state examining apparatus using the method
CN1811337A (en) * 2005-01-28 2006-08-02 雅马哈发动机株式会社 Checking result informing device
JP2006210729A (en) * 2005-01-28 2006-08-10 Yamaha Motor Co Ltd Information management system in component mounting line
CN1874671A (en) * 2005-06-03 2006-12-06 富士机械制造株式会社 Job system to circuit basal lamina
JP2007128306A (en) * 2005-11-04 2007-05-24 Omron Corp Image processing apparatus
JP2007150604A (en) * 2005-11-25 2007-06-14 Nikon Corp Electronic camera
CN101101857A (en) * 2006-07-03 2008-01-09 奥林巴斯株式会社 Defect repairing device
JP2008032645A (en) * 2006-07-31 2008-02-14 Sunx Ltd Magnifying observation device
JP2008218706A (en) * 2007-03-05 2008-09-18 Yamaha Motor Co Ltd Component transfer apparatus, surface mounting apparatus, and electronic component inspection device
JP2009210519A (en) * 2008-03-06 2009-09-17 I-Pulse Co Ltd Apparatus for supporting visual inspection of substrate
KR20100046499A (en) * 2008-10-27 2010-05-07 삼성전기주식회사 Visual inspection device for chip component
JP2011058954A (en) * 2009-09-10 2011-03-24 Rexxam Co Ltd Substrate inspection apparatus
WO2011074183A1 (en) * 2009-12-16 2011-06-23 株式会社日立ハイテクノロジーズ Defect observation method and defect observation device
US20130182942A1 (en) * 2012-01-17 2013-07-18 Omron Corporation Method for registering inspection standard for soldering inspection and board inspection apparatus thereby
CN104030161A (en) * 2013-03-06 2014-09-10 株式会社多田野 Machine display system
CN106408800A (en) * 2013-07-16 2017-02-15 东芝泰格有限公司 Information processing apparatus and information processing method
JP2015109321A (en) * 2013-12-04 2015-06-11 パナソニックIpマネジメント株式会社 Electronic component mounting apparatus
JP2015232478A (en) * 2014-06-09 2015-12-24 株式会社キーエンス Inspection device, inspection method, and program
JP2016045019A (en) * 2014-08-20 2016-04-04 オムロン株式会社 Teaching device for substrate inspection device, and teaching method
JP2016050864A (en) * 2014-09-01 2016-04-11 三菱電機株式会社 Appearance inspection device with solder
CN107211572A (en) * 2015-02-04 2017-09-26 富士机械制造株式会社 Image processing apparatus, installation process system, image processing method and program
JP2017125695A (en) * 2016-01-12 2017-07-20 株式会社Screenホールディングス Inspection system, display device, program, and inspection method
JP2018097563A (en) * 2016-12-13 2018-06-21 セイコーエプソン株式会社 Image processing apparatus
JP2018124075A (en) * 2017-01-30 2018-08-09 名古屋電機工業株式会社 Inspection information display device, inspection information display method and inspection information display program
WO2018163384A1 (en) * 2017-03-09 2018-09-13 株式会社Fuji Supplied component inspection device
DE102018208449A1 (en) * 2017-05-31 2018-12-06 Keyence Corporation Image inspection device and image inspection method
JP2018205025A (en) * 2017-05-31 2018-12-27 株式会社キーエンス Image inspection device, image inspection method, image inspection program, and computer readable recording medium and recorded apparatus
JP2019015741A (en) * 2018-10-23 2019-01-31 株式会社キーエンス Imaging inspection device, imaging inspection method, imaging inspection program, and computer readable recording medium as well as instrument
JP2019045510A (en) * 2018-12-07 2019-03-22 株式会社キーエンス Inspection device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wang Guoyun; Li Jishun: "A simulation method for recognizing and tracking image targets", Mining Machinery, no. 05 *
He Zhenzhen; Zhang Xianmin; Kuang Yongcong: "Research on feature modeling and inspection algorithms for electronic components", Mechanical & Electrical Engineering Technology, no. 06 *

Also Published As

Publication number Publication date
JP2020167286A (en) 2020-10-08
CN111757667B (en) 2023-08-25
KR20200115324A (en) 2020-10-07
TW202044980A (en) 2020-12-01
JP7281942B2 (en) 2023-05-26

Similar Documents

Publication Publication Date Title
CN105472960B (en) Electronic component mounting apparatus
JP2002024804A (en) Part recognition data forming method, forming apparatus electronic component mounting apparatus, and recording medium
US20080156207A1 (en) Stencil printers and the like, optical systems therefor, and methods of printing and inspection
JP6223091B2 (en) Position measuring apparatus, alignment apparatus, pattern drawing apparatus, and position measuring method
US20180035582A1 (en) Component mounting machine
JP2008251898A (en) Mounting machine, its mounting method, and method for moving substrate imaging means of mounting machine
JP7098795B2 (en) Setting direction error angle of supply parts Teaching method
JP5335109B2 (en) Feeder, electronic component mounting apparatus, and mounting method
JP2023118927A (en) Substrate handling work system
JP2007335524A (en) Mounting line
JP6648252B2 (en) Image processing system and image processing method
JP6472873B2 (en) Parts inspection machine and parts placement machine
CN111757667B (en) Inspection device and inspection method
JP2007189029A (en) Mounting system, mounting machine, mounting method of printer and electronic component
CN110651538B (en) Working machine and calculation method
CN110214476B (en) Coordinate data generating device and coordinate data generating method
EP3051935A1 (en) Component mounting apparatus
JP4901451B2 (en) Component mounting equipment
JP2009059928A (en) Electronic component mounting method and device, and feeder used for the same device
CN111274756A (en) Drawing and recording system with online evaluation and operation method
JP2000266688A (en) Inspection apparatus for defect of tape carrier
JP2003051698A (en) Method and device for mounting electronic component
CN117769894A (en) Component mounting machine
JP6064168B2 (en) Mark imaging method and component mounting line
JP7269028B2 (en) decision device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant