WO2018055757A1 - Device and method for specifying an illumination condition

Publication number
WO2018055757A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
illumination condition
illumination
image
degree
Prior art date
Application number
PCT/JP2016/078222
Other languages
English (en)
Japanese (ja)
Inventor
弘健 江嵜
一也 小谷
雅史 天野
Original Assignee
富士機械製造株式会社
Priority date
Filing date
Publication date
Application filed by 富士機械製造株式会社 filed Critical 富士機械製造株式会社
Priority to JP2018540586A priority Critical patent/JP6859356B2/ja
Priority to PCT/JP2016/078222 priority patent/WO2018055757A1/fr
Publication of WO2018055757A1 publication Critical patent/WO2018055757A1/fr

Classifications

    • H - ELECTRICITY
    • H05 - ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05K - PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages
    • H05K13/081Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
    • H05K13/0813Controlling of single components prior to mounting, e.g. orientation, component geometry
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing

Definitions

  • The present invention relates to an illumination condition specifying device and an illumination condition specifying method.
  • Patent Document 1 describes an illumination unit having three stages of light emitting diode groups that irradiate light obliquely onto a component.
  • The three stages of light emitting diode groups are attached so that their irradiation directions are at different angles.
  • An image suited to the type and characteristics of the electronic component can be obtained by individually controlling the amount of light for each light emitting diode group.
  • The captured image is used to read information on the shape and suction posture of the electronic component.
  • Illumination conditions, such as which light emitting diode groups emit light and with what amount, can be varied in many ways.
  • Conventionally, a human operator has performed the task of specifying illumination conditions suited to the components to be imaged. This takes time, and an appropriate illumination condition cannot always be specified.
  • The present invention has been made to solve the above problems, and its main object is to automatically specify an appropriate illumination condition according to the object.
  • To achieve this object, the present invention adopts the following means.
  • The illumination condition specifying device of the present invention comprises: an ideal image acquisition unit that acquires an ideal image desired to be obtained by imaging a target object; a captured image acquisition unit that acquires a plurality of captured images obtained by imaging the object under different illumination conditions; and an illumination condition specifying unit that performs, on the plurality of captured images, a degree-of-coincidence derivation process for deriving a degree of coincidence between the ideal image and each captured image, and specifies the illumination condition at the time of capturing the captured image that can be regarded as having a high degree of coincidence among the plurality of captured images as an illumination condition suitable for imaging the object.
  • In this device, an ideal image is prepared as the image desired to be obtained by imaging an object (for example, an image suitable for detecting the shape of a specific portion of the object).
  • The illumination condition at the time of capturing a captured image that can be regarded as having a high degree of coincidence with the ideal image is specified as an illumination condition suitable for imaging the object.
  • Since the illumination condition is specified based on the degree of coincidence with the ideal image, an appropriate illumination condition can be specified automatically according to the object.
  • The "captured image that can be regarded as having a high degree of coincidence" may be the captured image having the highest degree of coincidence among the plurality of captured images, or a captured image having a degree of coincidence higher than a predetermined threshold among the plurality of captured images.
  • The ideal image may be an image in which the difference in luminance value between a pixel corresponding to a contour of at least a part of the shape of the object and a pixel outside it has a value in a predetermined range in which the difference can be regarded as large. Such an image is suitable as an ideal image because it makes it easy to detect the shape of a specific part of the object based on the luminance values of the pixels.
  • The "predetermined range in which the difference in luminance value can be regarded as large" may be a range in which the difference is 80% or more of the number of gradations of the luminance value, may be a range of 90% or more, or may be "the number of gradations of the luminance value - 1" (that is, the maximum possible difference).
  • The "predetermined range in which the difference in luminance value can be regarded as large" may also be a range of values that cannot be realized by imaging under any of the available illumination conditions, or a range equal to or greater than the difference in luminance value required to specify the contour by image processing.
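As a rough illustration, the 80%-of-gradations criterion above could be checked as follows. This is a minimal sketch assuming 256 gradations; the helper name and default ratio are illustrative, not from the patent.

```python
# Assumed 8-bit grayscale: 256 gradations, luminance values 0..255.
GRADATIONS = 256

def difference_is_large(contour_value: int, outside_value: int,
                        ratio: float = 0.8) -> bool:
    """True if the luminance difference between a contour pixel and an
    adjacent outside pixel is at least `ratio` of the number of
    gradations (the 80%-or-more criterion described above)."""
    return abs(contour_value - outside_value) >= ratio * GRADATIONS
```

With the maximum difference of 255 (white contour pixel against black background), the criterion is satisfied; a difference of 150 is not.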
  • The ideal image acquisition unit may generate and acquire the ideal image based on information on at least a part of the shape of the object. In this way, the ideal image can also be created automatically.
  • The illumination condition specifying unit may sequentially perform the degree-of-coincidence derivation process for the plurality of captured images and, when the degree of coincidence derived in the process is higher than a predetermined threshold, specify the illumination condition at the time of capturing that captured image as an illumination condition suitable for imaging the object, and omit the subsequent derivation processes.
  • In this case, once the illumination condition of a captured image whose degree of coincidence is higher than the predetermined threshold has been specified, the derivation process is omitted for the captured images whose degree of coincidence has not yet been derived, so the overall processing time can be shortened.
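The early-exit behaviour described above can be sketched as a simple loop. All callables here are hypothetical stand-ins for the imaging process and the degree-of-coincidence derivation, not names from the patent.

```python
def specify_condition_early(conditions, capture, derive_degree, threshold):
    """Run the degree-of-coincidence derivation sequentially over the
    candidate illumination conditions and stop as soon as one captured
    image exceeds the threshold; remaining derivations are skipped."""
    for condition in conditions:
        captured = capture(condition)          # imaging under this condition
        if derive_degree(captured) > threshold:
            return condition                   # good enough: exit early
    return None                                # no condition exceeded it
```

Because the loop returns on the first success, later conditions are never imaged or scored, which is the time saving the text describes.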
  • The illumination condition specifying device may include an illumination unit capable of emitting light under variable illumination conditions, an imaging unit that images the object irradiated with light from the illumination unit, and an imaging control unit that performs an imaging process of causing the illumination unit to emit light under a predetermined illumination condition and causing the imaging unit to capture a captured image, the imaging process being performed a plurality of times with different illumination conditions; the captured image acquisition unit may then acquire the captured images obtained by the imaging processes.
  • In this way, an illumination condition suitable for imaging can be specified for the illumination unit provided in the illumination condition specifying device.
  • In this case, when the derived degree of coincidence is higher than a predetermined threshold, the illumination condition specifying unit may specify the illumination condition at the time of capturing that captured image as an illumination condition suitable for imaging the object, omit the subsequent degree-of-coincidence derivation processes, and stop the subsequent imaging processes.
  • Since the imaging processes under illumination conditions not yet imaged and the derivation processes for captured images whose degree of coincidence has not yet been derived are both omitted once such an illumination condition has been specified, the overall processing time can be further shortened.
  • The illumination condition specifying method of the present invention includes: an ideal image acquisition step of acquiring an ideal image desired to be obtained by imaging an object; a captured image acquisition step of acquiring a plurality of captured images obtained by imaging the object under different illumination conditions; and an illumination condition specifying step of performing, on the plurality of captured images, a degree-of-coincidence derivation process for deriving a degree of coincidence between the ideal image and each captured image, and specifying the illumination condition at the time of capturing the captured image that can be regarded as having a high degree of coincidence among the plurality of captured images as an illumination condition suitable for imaging the object.
  • With this illumination condition specifying method, as with the illumination condition specifying device described above, an appropriate illumination condition can be specified automatically according to the object.
  • Various aspects of the illumination condition specifying device described above may be adopted in this method, and steps for realizing each function of the device may be added.
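Under the assumption that imaging and coincidence derivation are available as functions, the overall method reduces to capturing under each candidate condition and taking the best match. A minimal sketch; the function names are hypothetical, not from the patent.

```python
def specify_illumination_condition(ideal, conditions, capture, coincidence):
    """Capture the object under every candidate illumination condition,
    derive each image's degree of coincidence with the ideal image, and
    return the condition whose captured image matches best."""
    degrees = {c: coincidence(ideal, capture(c)) for c in conditions}
    return max(degrees, key=degrees.get)   # highest degree of coincidence
```

This is the exhaustive variant; the threshold-based early-exit variant described earlier trades the guaranteed best match for shorter processing time.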
  • FIG. 1 is a perspective view of the mounting apparatus 10.
  • FIG. 2 is a schematic explanatory diagram of the configuration of the parts camera 40 provided in the mounting apparatus 10.
  • FIG. 3 is a block diagram showing a configuration relating to control of the mounting apparatus 10.
  • FIG. 4 is a flowchart of an illumination condition specifying process routine.
  • FIG. 5 is an explanatory drawing of the ideal image 190 of the component 90.
  • the left-right direction (X-axis), the front-rear direction (Y-axis), and the up-down direction (Z-axis) are as shown in FIG.
  • the mounting apparatus 10 includes a base 12, a mounting apparatus main body 14 installed on the base 12, and a reel unit 70 as a component supply apparatus mounted on the mounting apparatus main body 14.
  • the mounting apparatus main body 14 is installed to be exchangeable with respect to the base 12.
  • the mounting apparatus main body 14 includes a substrate transport device 18 that transports and holds the substrate 16, a head 24 that can move in the XY plane, a suction nozzle 37 that is attached to the head 24 and can move along the Z-axis, a mark camera 38 that images reference marks attached to the substrate 16, a parts camera 40 that images a component sucked by the suction nozzle 37, and a controller 60 that executes various controls.
  • the substrate transport device 18 includes a pair of support plates 20, 20 extending in the left-right direction and spaced apart in the front-rear direction, and conveyor belts 22, 22 provided on the opposing surfaces of the support plates 20, 20 (FIG. 1 shows only one of each).
  • the conveyor belts 22, 22 are stretched endlessly over drive wheels and driven wheels provided on the left and right sides of the support plates 20, 20.
  • substrate 16 is mounted on the upper surface of a pair of conveyor belts 22 and 22, and is conveyed from the left to the right.
  • the substrate 16 is supported from its underside by a large number of erected support pins 23.
  • the head 24 is attached to the front surface of the X-axis slider 26.
  • the X-axis slider 26 is attached, so as to be slidable in the left-right direction, to the front surface of the Y-axis slider 30, which can slide in the front-rear direction.
  • the Y-axis slider 30 is slidably attached to a pair of left and right guide rails 32, 32 extending in the front-rear direction.
  • a pair of upper and lower guide rails 28, 28 extending in the left-right direction are provided on the front surface of the Y-axis slider 30, and the X-axis slider 26 is attached to the guide rails 28, 28 so as to be slidable in the left-right direction.
  • the head 24 moves in the left-right direction as the X-axis slider 26 moves in the left-right direction, and moves in the front-rear direction as the Y-axis slider 30 moves in the front-rear direction.
  • the sliders 26 and 30 are driven by drive motors 26a and 30a (see FIG. 3), respectively.
  • the head 24 incorporates a Z-axis motor 34 and adjusts the height of the suction nozzle 37 attached to a ball screw 35 extending along the Z-axis by the Z-axis motor 34. Further, the head 24 incorporates a Q-axis motor 36 (see FIG. 3) that rotates the suction nozzle 37.
  • a mark camera 38 is attached to the lower surface of the X-axis slider 26.
  • the mark camera 38 has an imaging range below the mark camera 38 and moves to the front, rear, left and right together with the X-axis slider 26.
  • the suction nozzle 37 is a member that sucks and holds a component at the tip of the nozzle and releases the suction of the component that is sucked at the tip of the nozzle.
  • the suction nozzle 37 can supply pressure from a pressure supply source (not shown).
  • the suction nozzle 37 sucks a component when negative pressure is supplied, and releases the sucked component when the supply of negative pressure is stopped or when positive pressure is supplied.
  • the suction nozzle 37 protrudes downward from the bottom surface of the main body of the head 24. Further, the suction nozzle 37 is moved up and down along the Z-axis direction by the Z-axis motor 34, so that the height of the component sucked by the suction nozzle 37 is adjusted.
  • by rotating the suction nozzle 37 with the Q-axis motor 36, the orientation of the component sucked by the suction nozzle 37 is adjusted.
  • the parts camera 40 is disposed in front of the front-side support plate 20 of the substrate transport device 18.
  • the parts camera 40 has an imaging range above it, and images a component held by the suction nozzle 37 from below to generate a captured image.
  • the parts camera 40 includes an illumination unit 41 that irradiates the imaging target component with light, an imaging unit 51 that images the component based on the received light, and an imaging control unit 52 that controls the entire parts camera 40.
  • the illumination unit 41 includes a housing 42, a connecting unit 43, an incident light source 44, a half mirror 46, and a side light source 47.
  • the housing 42 is a bowl-shaped member whose upper surface and lower surface (bottom surface) are opened in an octagonal shape.
  • the housing 42 has a shape in which the opening on the upper surface is larger than the opening on the lower surface, and the internal space widens from the lower surface toward the upper surface.
  • the connecting portion 43 is a cylindrical member that connects the housing 42 and the imaging unit 51. The light emitted from the incident light source 44 and the light received by the imaging unit 51 pass through the internal space of the connecting portion 43.
  • the incident light source 44 is a light source for irradiating the component held by the suction nozzle 37 with light in a direction along the optical axis 51 a of the imaging unit 51.
  • the incident light source 44 includes a plurality of LEDs 45 that emit light toward the half mirror 46 in a direction perpendicular to the optical axis 51a.
  • the plurality of LEDs 45 are attached to the inner peripheral surface of the connecting portion 43.
  • the optical axis 51a is along the up-down direction, and the light from the LED 45 is irradiated in the horizontal direction (for example, the left-right direction).
  • the half mirror 46 is disposed inside the connecting portion 43 so as to be inclined from the optical axis 51a (for example, an inclination angle of 45 °).
  • the half mirror 46 reflects the light in the horizontal direction from the incident light source 44 upward. Therefore, the light from the LED 45 of the incident light source 44 is irradiated in the direction along the optical axis 51a of the imaging unit 51 (upward here) after being reflected by the half mirror 46. Further, the half mirror 46 transmits light from above toward the imaging unit 51.
  • the side-emitting light source 47 is a light source for irradiating light on the component held by the suction nozzle 37 in a direction inclined from the optical axis 51a.
  • the side light source 47 includes an upper light source 47a having a plurality of LEDs 48a, a middle light source 47b having a plurality of LEDs 48b disposed below the upper light source 47a, and a lower light source 47c having a plurality of LEDs 48c disposed below the middle light source 47b.
  • These LEDs 48 a to 48 c are attached to the inner peripheral surface of the housing 42.
  • Each of the LEDs 48a to 48c irradiates light in a direction inclined from the optical axis 51a (an inclination angle from the optical axis 51a is more than 0 ° and less than 90 °).
  • among the LEDs 48a to 48c, the LED 48a has the largest inclination angle of its irradiation direction from the optical axis 51a and emits light in a direction close to horizontal, while the LED 48c has the smallest inclination angle.
  • the imaging unit 51 includes an optical system such as a lens (not shown) and an imaging element.
  • when light emitted from the illumination unit 41 is reflected by the imaging target component and reaches the imaging unit 51, the imaging unit 51 receives this light.
  • the imaging unit 51 photoelectrically converts the received light to generate charges corresponding to each pixel of the captured image, and generates the captured image as digital data containing information on each pixel based on the generated charges.
  • the imaging control unit 52 outputs a control signal to the illumination unit 41 to control light irradiation from the illumination unit 41, outputs a control signal to the imaging unit 51 to cause it to capture an image, and outputs the captured image generated by the imaging unit 51 to the controller 60.
  • the imaging control unit 52 controls the value of the current applied to each of the LEDs 45, 48a to 48c of the illumination unit 41 and the energization time, so that the light emission amount per unit time and the light emission time of the incident light source 44 and the side light source 47 can be controlled individually.
  • the imaging control unit 52 can individually control the light emission amount and the light emission time per unit time for each of the upper light source 47a, the middle light source 47b, and the lower light source 47c of the side light source 47.
  • the upper diagram is a bottom view of the component 90, and the lower diagram is a front view of the component 90.
  • the component 90 is configured as a flip chip, for example, and includes a main body portion 91 and a plurality of terminals 92 used for connection to the substrate 16 when the main body portion 91 is disposed on the substrate 16.
  • the terminal 92 is a hemispherical conductor such as metal, and is disposed on the lower surface of the main body 91 and protrudes downward.
  • the plurality of terminals 92 may be divided into a plurality of groups.
  • the plurality of terminals 92 are divided into four groups of first to fourth groups 93a to 93d as shown in FIG.
  • the aspect of the terminals 92 may differ for each of the plurality of groups.
  • examples of such aspects include one or more of the diameter of the terminals 92, the number of terminals 92 (for example, the number in the vertical direction and the number in the horizontal direction in the upper diagram of FIG. 2), and the pitch of the terminals 92.
  • the diameter and pitch of the terminals 92 are the same for all of the first to fourth groups 93a to 93d.
  • the parts camera 40 irradiates the lower surface of the main body 91 of the component 90 sucked by the suction nozzle 37 with light from the illumination unit 41, and the imaging unit 51 receives the reflected light and images the lower surface of the main body 91 including the terminals 92.
  • the controller 60 is configured as a microprocessor centered on the CPU 61, and includes a ROM 62 that stores processing programs, an HDD 63 that stores various data, a RAM 64 used as a work area, and an input/output interface 65 for exchanging electrical signals with external devices; these are connected via a bus 66.
  • the controller 60 outputs drive signals to the substrate transport device 18, the drive motor 26a of the X-axis slider 26, the drive motor 30a of the Y-axis slider 30, the Z-axis motor 34, the Q-axis motor 36, and the pressure supply source (not shown) for the suction nozzle 37.
  • the controller 60 outputs a control signal to the mark camera 38 and inputs a captured image from the mark camera 38. Further, the controller 60 outputs information on the illumination conditions, including the control amounts of the incident light source 44 and the side light source 47 at the time of imaging, to the parts camera 40, and inputs captured images from the parts camera 40.
  • each of the sliders 26, 30 is equipped with a position sensor (not shown), and the controller 60 controls the drive motors 26a, 30a of the sliders 26, 30 while receiving position information from these position sensors.
  • the reel unit 70 includes a plurality of reels 72 and is detachably attached to the front side of the mounting apparatus body 14.
  • a tape is wound around each reel 72, and components are held on the surface of the tape along the longitudinal direction of the tape. These parts are protected by a film covering the surface of the tape.
  • Such a tape is unwound from the reel toward the rear, and the film is peeled off at the feeder portion 74 so that the components are exposed.
  • when the suction nozzle 37 sucks an exposed component, the component is held by the suction nozzle 37 and can be moved together with the head 24.
  • the management computer 80 is a computer that manages a production job of the mounting apparatus 10 and is connected to the controller 60 of the mounting apparatus 10 so as to be communicable.
  • the production job is information that defines which components are mounted on which substrate 16 in which order in the mounting apparatus 10 and how many substrates 16 are to be mounted with components.
  • the management computer 80 stores the production job and outputs information included in the production job to the mounting apparatus 10 as necessary.
  • FIG. 4 is a flowchart illustrating an example of an illumination condition specifying process routine.
  • the illumination condition specifying process routine of FIG. 4 is stored in the HDD 63.
  • in step S100, the CPU 61 first acquires information on the component for which an appropriate illumination condition is to be specified.
  • here, the target component is the component 90.
  • in step S100, the CPU 61 acquires the information on the component 90 from, for example, the management computer 80, or based on an operator's operation.
  • the information on the component 90 may include information on at least a part of the shape of the component 90.
  • the information on the component 90 may include, for example, information on the shape of the portion of the component 90 to be detected from the captured image in the component mounting process described later.
  • the information on the component 90 includes information on the shape of the terminal 92 as the shape of the portion to be detected from the captured image.
  • the information on the component 90 also includes, as information on the outer shape of the component 90 in bottom view, information on the size of the main body 91 (for example, the vertical and lateral lengths of the main body 91 in the upper diagram of FIG. 2).
  • FIG. 5 is an explanatory diagram of an ideal image 190 of the component 90.
  • the ideal image 190 is an ideal image that is desired to be obtained by imaging the component 90 in the component mounting process described later.
  • the ideal image 190 is a grayscale image in which each pixel has a luminance value represented by 256 gradations.
  • in the ideal image 190, the pixels in the circular areas corresponding to the plurality of terminals 92, which are the portions to be detected from the captured image, are white pixels 192 (pixels whose luminance value is the maximum value 255), and the other pixels are black pixels 191 (pixels whose luminance value is the minimum value 0).
  • in this ideal image 190, the difference in luminance value between the pixels corresponding to the contours of the plurality of terminals 92 of the component 90 (here, the pixels at the edge of each circular area of white pixels 192) and the pixels outside them (here, the black pixels 191 adjacent to those white pixels 192) is a value in the predetermined range that can be regarded as large (here, the maximum value, that is, the number of gradations of the luminance value - 1, or 255).
  • the CPU 61 creates the ideal image 190 as follows. First, the CPU 61 determines the size of the ideal image 190 (the number of pixels in the vertical and horizontal directions) based on the information on the outer shape of the component 90 acquired in step S100 (here, the vertical and lateral lengths of the main body 91).
  • then, among the pixels included in the ideal image 190 of the determined size, the CPU 61 sets the pixels in the areas of the shape corresponding to each of the plurality of terminals 92 (here, circular areas) to white pixels 192 and the other pixels to black pixels 191, thereby creating the ideal image 190.
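The two steps above (size the image from the body outline, then fill white circles at the terminal positions) could be sketched as follows. The function name, sizes, and terminal centres are illustrative assumptions, not values from the patent.

```python
import numpy as np

def make_ideal_image(height, width, terminal_centers, radius):
    """Create a black (luminance 0) image of the component-body size,
    with a white (luminance 255) filled circle at each terminal
    position, as in the ideal image 190 described above."""
    image = np.zeros((height, width), dtype=np.uint8)   # black pixels
    yy, xx = np.mgrid[0:height, 0:width]                # pixel coordinates
    for cy, cx in terminal_centers:
        # pixels inside the circle become white
        image[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = 255
    return image
```

The result is a binary grayscale image whose contour/background luminance difference is the maximum 255, matching the "number of gradations - 1" range described above.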
  • that is, based on the information on the component 90 acquired in step S100, the CPU 61 creates, as the ideal image 190, an image with increased contrast between the contour of the terminal 92, which is the portion to be detected from the captured image in the component mounting process described later, and its periphery.
  • the CPU 61 stores the created ideal image 190 in the HDD 63.
  • the ideal image 190 is preferably created taking into account the imaging distance (the distance in the Z-axis direction) between the parts camera 40 and the component 90 when the parts camera 40 images the component 90. That is, it is preferable to create an ideal image 190 enlarged or reduced according to the imaging distance so that, for example, the number of white pixels 192 constituting the circular area corresponding to one terminal 92 in the ideal image 190 is substantially the same as the number of pixels constituting the area corresponding to that terminal 92 in a captured image actually obtained by the parts camera 40.
  • the information regarding the component 90 acquired in step S100 may be information in consideration of an imaging distance in advance (for example, information based on an apparent size when the component 90 is viewed from the part camera 40).
  • the information related to the component 90 acquired in step S100 is information related to the actual size of the component 90, and in step S110, the CPU 61 converts this size based on a predetermined imaging distance to create an ideal image 190.
  • the imaging unit 51 may include a telecentric lens. By using the telecentric lens, the apparent size of the component 90 in the captured image can be made constant even if the imaging distance between the parts camera 40 and the component 90 changes. This eliminates the need to create an ideal image 190 that is enlarged or reduced according to the imaging distance.
  • the CPU 61 moves the head 24 and causes the suction nozzle 37 to suck the target part (part 90 in this case) supplied by the reel unit 70 (step S120). Subsequently, the CPU 61 moves the head 24 and moves the component 90 sucked by the suction nozzle 37 to above the parts camera 40 (step S130). Note that the CPU 61 drives the Z-axis motor 34 as necessary so that the distance between the component 90 and the part camera 40 becomes the above-described imaging distance. Then, the CPU 61 controls the parts camera 40 to start capturing a plurality of images with different illumination conditions (step S140).
  • the imaging control unit 52 of the parts camera 40 sequentially executes the imaging process of causing the imaging unit 51 to capture a captured image by causing the illumination unit 41 to emit light under a predetermined illumination condition with different illumination conditions.
  • the imaging control unit 52 outputs the obtained captured image to the controller 60.
  • in the present embodiment, the HDD 63 of the controller 60 stores in advance a plurality of illumination conditions and their execution order, and the CPU 61 sequentially outputs the illumination conditions to the parts camera 40 according to the execution order to perform the imaging processes.
  • in the present embodiment, the value of the current to be supplied to each of the LED 45 and the LEDs 48a to 48c is determined in advance, and the illumination conditions are varied by varying the light emission time of each light source of the illumination unit 41. More specifically, for each of the four types of light sources (the incident light source 44, the upper light source 47a, the middle light source 47b, and the lower light source 47c) a plurality of light emission times are available (in this embodiment, 10 levels of light emission time, including 0 seconds), and there are as many illumination conditions as there are combinations of the respective light emission times (10,000 in this embodiment). Note that the CPU 61 may execute at least a part of steps S120 to S140 in parallel with step S110.
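The 10,000-condition count follows from enumerating every combination of emission-time levels across the four light sources. A sketch; the concrete level values and the dictionary keys are illustrative assumptions.

```python
from itertools import product

# Assumed 10 emission-time levels per light source, including 0 (off).
EMISSION_LEVELS = range(10)
SOURCES = ("incident", "upper", "middle", "lower")   # the four light sources

def enumerate_conditions(levels=EMISSION_LEVELS):
    """Every combination of emission-time levels for the four light
    sources: 10**4 = 10,000 illumination conditions with the defaults."""
    return [dict(zip(SOURCES, combo))
            for combo in product(levels, repeat=len(SOURCES))]
```

Storing the conditions in a fixed order, as this list does, also gives the execution order that the embodiment keeps in the HDD 63.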
  • next, the CPU 61 starts sequentially acquiring the captured images generated by the imaging processes from the imaging control unit 52 and deriving the degree of coincidence between each acquired captured image and the ideal image 190 (step S150). That is, the CPU 61 sequentially executes, for each captured image captured under a different illumination condition, the degree-of-coincidence derivation process of deriving the degree of coincidence between the ideal image 190 and the captured image.
  • the degree of coincidence is information indicating how similar (close) the ideal image 190 and the captured image are, and can be derived by, for example, known pattern matching.
  • the CPU 61 derives, as the degree of coincidence, a correlation value (≤ 1) obtained by pattern matching using normalized correlation (for example, zero-mean normalized cross-correlation). More specifically, the degree of coincidence is derived as follows. First, using the ideal image 190 as a template, the CPU 61 derives a correlation value between the ideal image 190 and a predetermined target area of the same size as the ideal image 190 in the captured image (for example, the upper-left area of the image). Then, this process is executed repeatedly while changing the target area within the captured image, and the highest of the obtained correlation values is derived as the degree of coincidence of that captured image. Note that the CPU 61 sequentially stores the derived degrees of coincidence in the HDD 63.
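The derivation just described can be sketched as follows. This is a simplified illustration of zero-mean normalized cross-correlation with an exhaustive sliding window, not the embodiment's actual implementation (the function names and array handling are assumptions):

```python
import numpy as np

def zncc(template: np.ndarray, patch: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation of two equal-sized grayscale arrays."""
    t = template.astype(float) - template.mean()
    p = patch.astype(float) - patch.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    # A perfectly flat patch has zero variance; treat it as no correlation.
    return float((t * p).sum() / denom) if denom > 0 else 0.0

def degree_of_coincidence(ideal: np.ndarray, captured: np.ndarray) -> float:
    """Slide the ideal image over every target area of the captured image
    and return the highest correlation value found."""
    th, tw = ideal.shape
    ch, cw = captured.shape
    best = -1.0
    for y in range(ch - th + 1):
        for x in range(cw - tw + 1):
            best = max(best, zncc(ideal, captured[y:y + th, x:x + tw]))
    return best
```

A real system would use an optimized correlation routine rather than this double loop, but the result is the same: the best match over all target areas is the image's degree of coincidence.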
  • the CPU 61 waits until the acquisition of captured images and the derivation of degrees of coincidence for all of the plurality of illumination conditions are completed (step S160). Then, based on the derived degrees of coincidence, the CPU 61 specifies the illumination condition at the time of capturing the captured image with the highest degree of coincidence as the illumination condition suitable for imaging (step S170).
  • as described above, a plurality of illumination conditions and their execution order are stored in the HDD 63. Therefore, for example, the CPU 61 stores the derived degrees of coincidence in the HDD 63 in association with their derivation order, and specifies the illumination condition with the highest degree of coincidence based on the derivation order associated with that degree of coincidence and the execution order of the illumination conditions.
  • the method of specifying the illumination condition corresponding to the highest degree of coincidence is not limited to this; for example, when the imaging control unit 52 outputs the captured image to the controller 60, it may also output the illumination condition used at the time of imaging. Further, the CPU 61 may compare each newly derived degree of coincidence with the one derived immediately before and store only the larger value in association with its derivation order (or the corresponding illumination condition).
  • the CPU 61 stores (sets) the specified illumination condition in the HDD 63 as the illumination condition to be used when the component 90 is imaged (step S180), and ends this routine.
  • the CPU 61 specifies the illumination condition at the time of capturing the captured image having the highest degree of coincidence with the ideal image 190 as the illumination condition suitable for capturing the component 90. Therefore, the CPU 61 can specify an illumination condition that can obtain a captured image as close to the ideal image 190 as possible, and can set the illumination condition to be used when the component 90 is imaged.
  • This component mounting routine is stored in, for example, the HDD 63.
  • the component mounting process is repeatedly executed by the controller 60 when receiving a command from the management computer 80.
  • the CPU 61 mounts the mounting target component on the substrate 16 while imaging the mounting target component with the parts camera 40.
  • the component to be mounted is the component 90.
  • the CPU 61 first moves the head 24 to suck the component 90 to be mounted on the suction nozzle 37, and moves the component 90 above the parts camera 40.
  • the CPU 61 controls the parts camera 40 so that the component 90 sucked by the suction nozzle 37 is imaged under the lighting conditions set in the above-described lighting condition specifying process. Then, the CPU 61 detects the shape of at least a part of the component 90 sucked by the suction nozzle 37 based on the captured image created by the parts camera 40, and determines whether there is an abnormality in the component 90 based on the detected shape. Judgment and derivation of the position (coordinates) of the component 90 are performed.
  • the CPU 61 detects the shapes of the plurality of terminals 92 by detecting edges based on the luminance value of each pixel of the captured image (here, a grayscale image), that is, by detecting the pixels corresponding to the outline of each terminal 92 in the captured image. Then, based on the detected pixels, the CPU 61 derives the number of terminals 92 in the component 90, the arrangement of the terminals 92, the diameter of each terminal 92, and the like. Then, based on the derived information and the information on the component 90 included in the production job acquired from the management computer 80, the CPU 61 determines whether there is an abnormality in the component 90 and derives the position (coordinates) of the component 90 sucked by the suction nozzle 37.
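A highly simplified stand-in for this luminance-based edge detection might look like the following sketch. The gradient-magnitude approach and the threshold value are assumptions for illustration; the embodiment does not specify its edge detector:

```python
import numpy as np

def edge_pixels(gray: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    """Return a boolean mask marking pixels whose local luminance gradient
    magnitude exceeds the threshold (a crude outline detector)."""
    gy, gx = np.gradient(gray.astype(float))   # per-axis luminance gradients
    return np.hypot(gx, gy) >= threshold
```

With an outline mask like this, counting connected outline regions would give the number of terminals, and fitting circles to them would give their diameters and arrangement.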
  • the presence / absence of an abnormality in the component 90 is, for example, an excess or shortage of the number of terminals 92, an abnormality in the diameter of the terminals 92, an abnormality in the arrangement of the terminals 92, or the like.
  • the position of the component 90 is, for example, the center position of the component 90 derived based on the plurality of terminals 92.
  • when an abnormality is detected, the CPU 61 discards the component 90 and repeats the same processing as described above for another component 90.
  • when no abnormality is detected, the component 90 is arranged at a predetermined position on the substrate 16 based on the center position of the component 90, and the component mounting process is terminated.
  • the CPU 61 repeatedly executes this component mounting process until all the components to be mounted are mounted. Note that, when imaging a component type for which no illumination condition has been set in the illumination condition specifying process, such as a component type other than the component 90, the CPU 61 may control the parts camera 40 to capture an image under a general illumination condition stored in advance in the HDD 63, for example.
  • when the terminal 92 of the component 90 is detected based on the captured image as described above, it is preferable that the difference in luminance value between the pixels corresponding to the outline of the terminal 92 and the pixels outside it is large.
  • the terminal 92 protrudes downward as shown in FIG. 2, and the vicinity of the outer edge of the terminal 92 is a surface inclined from the vertical direction (the direction along the optical axis 51a). Therefore, in order to increase (brighten) the luminance value of the pixels corresponding to the outline of the terminal 92, it is preferable to irradiate light from a direction close to the horizontal direction rather than from a direction close to the vertical direction.
  • the component 90 has a plurality of terminals 92, and depending on the position of the terminals 92, it may be preferable to irradiate light from the middle light source 47b or the lower light source 47c.
  • when light is irradiated from a direction close to the direction of the optical axis 51a (the vertical direction in FIG. 2), the light is reflected on the lower surface of the main body 91 and easily reaches the imaging unit 51, so the luminance values of the pixels corresponding to the lower surface of the main body 91 may also increase.
  • an appropriate illumination condition for the component 90 can be automatically specified based on the degree of coincidence between the ideal image 190 and the captured image.
  • the CPU 61 detects at least a part of the shape of the component 90 based on the captured image obtained under the illumination condition specified as described above, so the determination of an abnormality and the derivation of the position based on the detected shape can be performed with higher accuracy.
  • the mounting device 10 of this embodiment corresponds to the illumination condition specifying device of the present invention, and the controller 60 corresponds to an ideal image acquisition unit, a captured image acquisition unit, and an illumination condition specifying unit.
  • an example of the illumination condition specifying method of the present invention is also clarified by describing the operation of the mounting apparatus 10.
  • the CPU 61 of the controller 60 acquires the ideal image 190, which is an image desired to be obtained by imaging the target object (here, the component 90) (here, an image suitable for detecting the shape of the terminal 92).
  • an illumination condition at the time of capturing a captured image that can be regarded as having a high degree of coincidence with the ideal image 190 is specified as an illumination condition suitable for capturing the component 90.
  • the mounting apparatus 10 can automatically specify an appropriate illumination condition corresponding to the component (here, the component 90).
  • the CPU 61 identifies the illumination condition at the time of capturing the captured image having the highest degree of coincidence with the ideal image 190 as the illumination condition suitable for capturing the component 90, it is easy to identify the optimal illumination condition.
  • the ideal image 190 is an image in which the difference in luminance value between the pixels corresponding to the contour of at least a part of the shape of the component 90 (here, the shape of the terminal 92) and the pixels outside it falls within a predetermined range regarded as large. Such an image is suitable as an ideal image because it makes it easy to detect the shape of a specific portion of the component 90 (here, the terminal 92) based on the luminance values of the pixels. Furthermore, since the pixels corresponding to the outline of the terminal 92 are the white pixels 192 and the pixels outside them are the black pixels 191, the ideal image 190 has the largest possible difference in luminance value between these pixels (that is, the number of gradations of the luminance value − 1).
  • the CPU 61 creates and acquires an ideal image 190 based on information regarding at least a part of the shape of the component 90. Therefore, the mounting apparatus 10 can automatically create the ideal image 190 as well as specify the illumination condition.
  • the mounting apparatus 10 includes the illumination unit 41 that can emit light under different illumination conditions, the imaging unit 51 that images the component 90 irradiated with light from the illumination unit 41, and the imaging control unit 52 that performs, a plurality of times with different illumination conditions, the imaging process of causing the illumination unit 41 to emit light under a predetermined illumination condition and causing the imaging unit 51 to capture an image. The CPU 61 then acquires the captured images captured by the imaging process. Therefore, the mounting apparatus 10 can specify an illumination condition suitable for imaging with respect to the illumination unit 41 included in the mounting apparatus 10 itself.
  • the CPU 61 specifies the illumination condition at the time of capturing the captured image with the highest degree of matching in step S170 as the illumination condition suitable for imaging, but is not limited thereto.
  • for example, the CPU 61 may specify, as an illumination condition suitable for imaging the component, the illumination condition at the time of capturing a captured image whose derived degree of coincidence is higher than a predetermined threshold (for example, a correlation value of 0.8 or 0.9).
  • specifically, when the CPU 61 sequentially performs the degree-of-coincidence derivation process in step S150, if a derived degree of coincidence is higher than the predetermined threshold, the CPU 61 may specify the illumination condition at the time of that imaging as the illumination condition suitable for imaging the component, skip the subsequent degree-of-coincidence derivation processes, and stop the remaining imaging processes.
  • in this case, once the illumination condition of a captured image that can be regarded as having a high degree of coincidence (here, a degree of coincidence higher than the predetermined threshold) has been specified, imaging under illumination conditions that have not yet been used and the degree-of-coincidence derivation process for captured images whose degree of coincidence has not yet been derived are omitted, so the overall processing time can be further shortened.
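The early-stopping behavior of this modification can be sketched as follows (the helper names `capture` and `coincidence` and the threshold value are hypothetical, standing in for the imaging process and the degree-of-coincidence derivation process):

```python
def find_condition(conditions, capture, coincidence, threshold=0.9):
    """Image under each condition in order, but stop as soon as one
    captured image's degree of coincidence exceeds the threshold."""
    best_cond, best_score = None, -1.0
    for cond in conditions:
        score = coincidence(capture(cond))   # imaging + match derivation
        if score > best_score:
            best_cond, best_score = cond, score
        if score > threshold:                # "high enough": stop imaging
            break
    return best_cond, best_score
```

The trade-off is the one stated in the text: the result may not be the global optimum over all 10,000 conditions, but the total imaging and matching time can drop sharply.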
  • a plurality of executions of the imaging process in step S140 and a plurality of executions of the matching degree derivation process in step S150 are performed in parallel, but the present invention is not limited to this.
  • the CPU 61 may perform a plurality of imaging processes in advance and store a plurality of captured images in the HDD 63, and then perform a plurality of coincidence derivation processes at different timings thereafter.
  • the CPU 61 may derive a degree of coincidence by masking a partial area of the ideal image 190. That is, the CPU 61 may not use a part of the ideal image 190 for deriving the degree of coincidence.
  • for example, when only the contour of each terminal 92 needs to be detected, the pixels of the terminal 92 in regions other than the contour (the region near the center of the terminal 92) are unnecessary. In this case, among the white pixels 192 of the ideal image 190, the pixels corresponding to regions other than the outline of the terminal 92 may be masked when deriving the degree of coincidence.
  • the degree of coincidence may be derived by masking the pixel area corresponding to the area unnecessary for detecting the shape of the component 90 in the component mounting process in the ideal image 190.
  • by doing so, the degree of coincidence in the regions necessary for detecting the shape of the component 90 more strongly influences the value derived by the degree-of-coincidence derivation process, so a more appropriate illumination condition can be specified.
  • the degree of coincidence is derived by masking a partial area of the ideal image 190 in this way, the pixels in the masked area of the ideal image 190 may have any luminance value.
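One way to realize this masking is to restrict the correlation to unmasked pixels only. A sketch under the assumption that a boolean mask marks the pixels that participate (True) versus those that are masked out (False):

```python
import numpy as np

def masked_zncc(template: np.ndarray, patch: np.ndarray, mask: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation computed over unmasked pixels only.

    `mask` has the template's shape; True marks pixels that take part in the
    comparison, False marks masked-out pixels, which are simply dropped.
    """
    t = template[mask].astype(float)
    p = patch[mask].astype(float)
    t -= t.mean()
    p -= p.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    return float((t * p).sum() / denom) if denom > 0 else 0.0
```

Because masked pixels never enter the sums, their luminance values in the ideal image are irrelevant, consistent with the remark above that the pixels in the masked area may have any luminance value.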
  • the CPU 61 creates the ideal image 190, but the present invention is not limited to this.
  • the CPU 61 may acquire the ideal image 190 from the management computer 80, or the ideal image 190 may be stored in advance in the HDD 63.
  • the ideal image 190 in such a case may be an image created in advance by a human.
  • in the above-described embodiment, the ideal image 190 has the maximum difference in luminance value between the pixels corresponding to the contour of the shape of the terminal 92 and the pixels outside them (the number of gradations of the luminance value − 1).
  • the present invention is not limited to this, and any value within a predetermined range that can be regarded as having a large difference may be used.
  • for example, the difference in luminance value may be 80% or more of the number of gradations of the luminance value of the pixel (for example, a value of 205 or more for 256 gradations), or 90% or more (for example, a value of 231 or more for 256 gradations).
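These example thresholds can be checked with a small calculation (assuming the percentage is rounded up to the nearest integer luminance value, which reproduces the numbers in the text):

```python
import math

gradations = 256  # 8-bit grayscale
for frac in (0.8, 0.9):
    # Smallest integer luminance difference at or above the given fraction
    print(math.ceil(gradations * frac))  # prints 205, then 231
```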
  • the “predetermined range that can be regarded as having a large difference in luminance value” may include values too large to be realized by imaging under any of the possible illumination conditions, or may be a range equal to or greater than a predetermined value of the luminance difference necessary for specifying the contour of a specific portion of the component (for example, the contour of the terminal 92) by image processing.
  • the ideal image 190 is not limited to this, and any image desired to be obtained by imaging the component may be used as the ideal image.
  • the imaging control unit 52 changes the illumination conditions by changing the light emission times of the four types of light sources, the incident light source 44, the upper light source 47a, the middle light source 47b, and the lower light source 47c.
  • the illumination conditions may be varied by varying the value of the current applied to each of the LED 45 and the LEDs 48a to 48c.
  • in the above-described embodiment, the light emission time can be set in 10 steps, but the present invention is not limited to this; for example, the light emission time may be set in 2 steps of on and off.
  • in the above-described embodiment, the illumination unit 41 has four types of light sources, the incident light source 44, the upper light source 47a, the middle light source 47b, and the lower light source 47c, but is not limited to this.
  • the side-emitting light source 47 is not limited to the three types of the upper light source 47a, the middle light source 47b, and the lower light source 47c, and may include four or more light sources or two or less light sources.
  • the illumination unit 41 preferably has two or more types of light sources, but may have only one type of light source as long as the amount of emitted light can be adjusted.
  • in the above-described embodiment, the CPU 61 created the ideal image 190 using, in addition to information related to the shape of the terminal 92 to be detected from the captured image in the component mounting process, information related to the outer shape of the component 90 (here, information related to the size of the main body 91), but the present invention is not limited to this.
  • the CPU 61 does not have to use information regarding the outer shape of the component 90 when creating the ideal image 190.
  • the ideal image 190 only needs to include the white pixels 192 representing the shape corresponding to each of the plurality of terminals 92 to be detected and the black pixels 191 on the outside thereof. The image need not include the entire component 90.
  • in the above-described embodiment, the portion of the component 90 to be detected from the captured image in the component mounting process is the plurality of terminals 92, but the present invention is not limited to this; for example, the portion to be detected may be the outer shape of the component 90, that is, the contour shape of the main body 91.
  • in this case, in the ideal image, pixels in the region corresponding to the contour of the main body 91 may be white pixels, and pixels in the region corresponding to the outside (outside the component 90) may be black pixels.
  • the portion to be detected from the captured image may be the contour of the main body 91 and the contour of the terminal 92.
  • in the above-described embodiment, the component 90 is a flip chip, but the present invention is not limited to this, and another type of component may be used. Since the component 90 is a flip chip, the reflectance of the lower surface of the main body 91 is relatively high, and the luminance values of the pixels corresponding to the lower surface of the main body 91 tend to increase; that is, the detection accuracy of the outer shape of the terminals 92 tends to decrease. For such a component 90, specifying an appropriate illumination condition using the ideal image 190 as in the above-described embodiment is highly significant.
  • the component 90 is exemplified as the imaging target, but the target is not limited to the component.
  • for example, the CPU 61 may specify an illumination condition suitable for detecting a reference mark attached to the substrate 16.
  • in this case, the mark camera 38 includes an illumination unit that can emit light under different illumination conditions, an imaging unit that images the substrate 16 irradiated with light from the illumination unit, and an imaging control unit that performs the imaging process a plurality of times under different illumination conditions.
  • an image suitable for detecting the shape of the reference mark on the substrate 16 may be used as the ideal image.
  • the ideal image 190 is a grayscale image, but is not limited thereto, and may be a color image or a binary image. Even when the ideal image 190 is a color image, it is preferable to derive the degree of coincidence based on a luminance value (for example, a luminance value calculated from RGB gradation values).
  • the CPU 61 does not consider the orientation (rotation) of the component 90 in the captured image when deriving the degree of coincidence between the ideal image 190 and the captured image, but may consider it.
  • for example, the CPU 61 may derive the degree of coincidence between one captured image and the ideal image 190 (the maximum value among the correlation values derived while changing the target region described above) a plurality of times, using images obtained by rotating the ideal image 190 by predetermined angles as templates, and derive the maximum value among the derived degrees of coincidence as the degree of coincidence of that captured image.
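The rotation-aware matching of this modification can be sketched as follows. For simplicity this sketch uses 90-degree steps via `numpy.rot90`, whereas the embodiment allows arbitrary predetermined angles; the helper `match_fn` (the per-template degree-of-coincidence derivation) is hypothetical:

```python
import numpy as np

def rotation_aware_coincidence(ideal, captured, match_fn):
    """Evaluate the match for rotated copies of the ideal image and keep
    the best score (90-degree steps only, for illustration)."""
    return max(match_fn(np.rot90(ideal, k), captured) for k in range(4))
```

For arbitrary angles, an interpolating rotation (for example, `scipy.ndimage.rotate`) would replace `np.rot90`, at extra computational cost per template.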
  • in the above-described embodiment, the illumination condition specifying device of the present invention is embodied as the mounting device 10, but the present invention is not limited to this, and may be embodied as, for example, the management computer 80, which does not include the parts camera 40.
  • the controller 60 has functions of an ideal image acquisition unit, a captured image acquisition unit, and an illumination condition specifying unit.
  • the imaging control unit 52 may have at least a part of these functions.
  • the mounting apparatus 10 includes the suction nozzle 37 that sucks and holds a component, but is not limited thereto as long as the component can be held.
  • the mounting apparatus 10 may include a mechanical chuck that holds and holds a component instead of the suction nozzle 37.
  • the present invention can be used for a mounting apparatus for mounting components on a substrate.

Abstract

A CPU of a mounting device acquires an ideal image desired to be obtained by capturing an image of a component (step S110), then acquires a plurality of captured images obtained by imaging the component under different illumination conditions (step S140), and executes a degree-of-coincidence derivation process to derive a degree of coincidence between the ideal image and each of the plurality of captured images (step S150). The CPU then specifies, as an illumination condition suitable for imaging the component, the illumination condition at the time of capturing the captured image whose degree of coincidence can be regarded as the highest among the plurality of captured images (step S170).
PCT/JP2016/078222 2016-09-26 2016-09-26 Dispositif et procédé de spécification de condition d'éclairage WO2018055757A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018540586A JP6859356B2 (ja) 2016-09-26 2016-09-26 照明条件特定装置及び照明条件特定方法
PCT/JP2016/078222 WO2018055757A1 (fr) 2016-09-26 2016-09-26 Dispositif et procédé de spécification de condition d'éclairage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/078222 WO2018055757A1 (fr) 2016-09-26 2016-09-26 Dispositif et procédé de spécification de condition d'éclairage

Publications (1)

Publication Number Publication Date
WO2018055757A1 true WO2018055757A1 (fr) 2018-03-29

Family

ID=61689406

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/078222 WO2018055757A1 (fr) 2016-09-26 2016-09-26 Dispositif et procédé de spécification de condition d'éclairage

Country Status (2)

Country Link
JP (1) JP6859356B2 (fr)
WO (1) WO2018055757A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03261810A (ja) * 1990-03-12 1991-11-21 Fujitsu Ltd 外観検査装置
JP2003097916A (ja) * 2001-09-25 2003-04-03 Juki Corp 基板マーク認識方法及び装置
JP2003204200A (ja) * 2001-10-30 2003-07-18 Matsushita Electric Ind Co Ltd 教示データ設定装置及び方法、ネットワークを利用した教示データ提供システム及び方法
JP2006120995A (ja) * 2004-10-25 2006-05-11 Juki Corp 電子部品実装装置

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022269912A1 (fr) * 2021-06-25 2022-12-29 株式会社Fuji Dispositif de reconnaissance et procédé de reconnaissance
JP7562858B2 (ja) 2021-06-25 2024-10-07 株式会社Fuji 認識装置及び認識方法
WO2023276059A1 (fr) * 2021-06-30 2023-01-05 株式会社Fuji Machine de montage de composants
JP7562860B2 (ja) 2021-06-30 2024-10-07 株式会社Fuji 部品実装機

Also Published As

Publication number Publication date
JP6859356B2 (ja) 2021-04-14
JPWO2018055757A1 (ja) 2019-07-04

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16916824

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018540586

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16916824

Country of ref document: EP

Kind code of ref document: A1