WO2019030875A1 - Image processing system and component mounting machine - Google Patents

Image processing system and component mounting machine

Info

Publication number
WO2019030875A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
luminance
imaging device
subject
captured image
Prior art date
Application number
PCT/JP2017/029008
Other languages
French (fr)
Japanese (ja)
Inventor
雅史 天野
秀一郎 鬼頭
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji
Priority to JP2019535522A priority Critical patent/JP6728501B2/en
Priority to PCT/JP2017/029008 priority patent/WO2019030875A1/en
Publication of WO2019030875A1 publication Critical patent/WO2019030875A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages

Definitions

  • The present specification discloses an image processing system and a component mounter.
  • Patent Document 1 discloses determining the decrease in luminance of a plurality of light emitters from the time the lighting device was introduced to the present and, when the decrease is within a predetermined range, increasing the drive current of the illumination power supply so that the luminance of the light emitters is raised by the amount of the decrease.
  • However, although Patent Document 1 describes recovering the decrease in luminance of the light emitters by increasing the drive current of the illumination power supply, it makes no mention of the case where an abnormality occurs in some of the plurality of light emitters. For example, when a failure occurs in which some of the light emitters stop emitting light, that part of the image appears dark, and the subject may not be recognized normally from the captured image.
  • The main object of the present disclosure is to obtain a good captured image even when an abnormality occurs in the illumination of the lighting device.
  • The present disclosure takes the following measures to achieve the above main object.
  • The image processing system of the present disclosure is an image processing system that processes images, and includes an imaging device having an illumination device that irradiates a subject with light and an imaging element that receives reflected light from the subject and images the subject, and a control device that controls the imaging device so that the subject is imaged and processes the obtained captured image to recognize the subject. The gist is that the control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance; when it determines that an abnormality has occurred in the illumination device, it measures the luminance distribution of the field of view of the imaging device, determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution, and corrects the luminance values of captured images of subjects subsequently captured by the imaging device using the determined relationship between the position of each pixel and the luminance correction value.
  • The image processing system of the present disclosure thus includes an imaging device having an illumination device and an imaging element, and a control device that processes a captured image obtained by imaging a subject and recognizes the subject.
  • The control device determines an abnormality of the lighting device based on the captured image and reference data acquired in advance. When it determines that an abnormality has occurred in the lighting device, the control device measures the luminance distribution of the field of view of the imaging device and determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution. Then, the control device corrects the luminance values of captured images of subjects subsequently captured by the imaging device, using the determined relationship. As a result, even when an abnormality occurs in the illumination of the illumination device, a good captured image can be obtained, and the subject can be recognized normally from the obtained captured image.
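The flow just summarized can be sketched in pure Python. This is a minimal illustration, not the patent's implementation: the function names, the additive correction scheme, and the 8-bit pixel range are all assumptions.

```python
# Illustrative sketch: detect an illumination abnormality by comparing a
# captured image with reference data, then build and apply a per-pixel
# luminance correction. All names and thresholds are hypothetical.

def detect_abnormality(captured, reference, threshold):
    """True if any pixel is darker than the reference by at least `threshold`."""
    return any(r - c >= threshold
               for row_c, row_r in zip(captured, reference)
               for c, r in zip(row_c, row_r))

def build_correction(luminance_map, target):
    """Per-pixel additive correction that lifts each pixel up to `target`."""
    return [[max(0, target - v) for v in row] for row in luminance_map]

def apply_correction(image, correction):
    """Add the correction value to each pixel, clamping to the 8-bit range."""
    return [[min(255, p + c) for p, c in zip(row_i, row_k)]
            for row_i, row_k in zip(image, correction)]

reference = [[200, 200], [200, 200]]
captured  = [[200, 200], [120, 120]]   # lower row darkened by a failed emitter
if detect_abnormality(captured, reference, threshold=40):
    corr = build_correction(captured, target=200)
    fixed = apply_correction(captured, corr)
```

The corrected image recovers the luminance of the darkened region, which is the effect the disclosure aims for.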
  • The component mounter of the present disclosure is a component mounter that holds a component and mounts it on a mounting target. It includes an imaging device having an illumination device that irradiates a subject with light and an imaging element that receives reflected light from the subject and images the subject, a holder that holds a component, a moving device that moves the holder relative to the mounting target, and a control device that controls the moving device and the imaging device so that the component held by the holder is imaged, processes the obtained captured image to determine the holding state of the component, and controls the holder and the moving device so that the held component is mounted on the mounting target. The gist is that the control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance; when it determines that an abnormality has occurred in the illumination device, it measures the luminance distribution of the field of view of the imaging device, determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution, and corrects the luminance values of captured images of components subsequently captured by the imaging device using the determined relationship.
  • The component mounter of the present disclosure thus includes an imaging device having an illumination device and an imaging element, a holder, a moving device, and a control device that processes the captured image obtained by imaging the component held by the holder to determine the holding state of the component, and then mounts the held component on the mounting target.
  • The control device determines an abnormality of the lighting device based on the captured image and reference data acquired in advance. When it determines that an abnormality has occurred in the lighting device, the control device measures the luminance distribution of the field of view of the imaging device and determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution.
  • Then, the control device corrects the luminance values of captured images of components subsequently captured by the imaging device, using the determined relationship between the position of each pixel and the luminance correction value.
  • FIG. 1 is a block diagram of a component mounting system 1;
  • FIG. 2 is a block diagram of a component mounter 10;
  • FIG. 3 is a configuration diagram of the head 30 and the part camera 40.
  • FIG. 4 is a block diagram showing the electrical connection between the control device 70 and the management device 100.
  • FIG. 5 is a flowchart showing an example of the component mounting process.
  • FIG. 6 is an explanatory view showing how the captured image changes when a failure occurs in part of the illumination of the parts camera.
  • FIG. 7 is a flowchart showing an example of the correction table creation process.
  • FIG. 8 is an explanatory view showing how the luminance distribution is measured.
  • FIG. 9 is an explanatory view showing how the luminance distribution is measured.
  • FIG. 10 is an explanatory view showing an example of the correction table.
  • FIG. 11 is an explanatory view showing a captured image before shading correction.
  • FIG. 12 is an explanatory view showing a captured image after shading correction.
  • FIG. 1 is a block diagram of the component mounting system 1.
  • FIG. 2 is a block diagram of the component mounter 10.
  • FIG. 3 is a block diagram of the head 30 and the part camera 40.
  • FIG. 4 is a block diagram showing the electrical connection between the control device 70 and the management device 100.
  • In FIG. 2, the horizontal (left-right) direction is the X-axis direction, the front-rear direction is the Y-axis direction, and the vertical direction is the Z-axis direction.
  • As shown in FIG. 1, the component mounting system 1 includes a screen printing machine 2, the component mounter 10, a reflow furnace 4, and a management device 100 that manages the whole system.
  • The screen printing machine 2 presses solder into pattern holes formed in a screen by rolling the solder over the screen with a squeegee, thereby printing a wiring pattern of solder onto the substrate B below through the pattern holes.
  • The component mounter 10 picks up an electronic component (hereinafter simply referred to as a "component") P by suction and mounts it on the substrate B on which the solder has been printed.
  • The reflow furnace 4 heats the substrate B on which the components are mounted, melting the solder on the substrate B and thereby bonding the components.
  • The component mounter 10 is configured as a universal mounter capable of mounting various components P of different sizes and shapes, such as chip components including chip resistors, irregularly shaped components such as connectors, and IC components such as QFP (Quad Flat Package) and BGA (Ball Grid Array).
  • As shown in FIG. 2, the component mounter 10 includes a component supply device 22, a substrate transfer device 24, an XY robot 26, a head 30, a mark camera 28, a parts camera 40, a control device 70 (see FIG. 4), and the like.
  • the component supply device 22 supplies the component P to the component supply position.
  • The component supply device 22 includes tape feeders, mounted on the front of the component mounter 10 and arranged along the X-axis direction (left-right direction), each of which feeds a tape containing a plurality of components (chip components, etc.) P of the same type, and a tray feeder, installed at the front of the component mounter 10, which supplies trays each containing a plurality of components (IC components, etc.) P of the same type.
  • The substrate transfer device 24 has a pair of conveyor belts spaced apart in the front-rear direction and stretched in the left-right direction.
  • the substrate B is transported by the conveyor belt of the substrate transport device 24 from left to right in the drawing.
  • the XY robot 26 moves the head 30 in the XY axis direction.
  • The XY robot 26 includes, among others, a Y-axis slider 26b that is movably supported and is driven in the Y-axis direction (front-rear direction) by a Y-axis motor.
  • The head 30 is a rotary head and, as shown in FIG. 3, includes a head body 31 as a rotating body, and a plurality of nozzle holders 32 arranged circumferentially on the head body 31 and supported so as to be able to move up and down. A suction nozzle 33 is removably attached to the tip of each nozzle holder 32. Although not shown, the head 30 also includes an R-axis motor that rotates the head body 31 so that the plurality of nozzle holders 32 revolve around the central axis of the head body 31, a θ-axis motor that rotates the plurality of nozzle holders 32 around their respective axes, and a Z-axis motor that raises and lowers the nozzle holder 32 (suction nozzle 33) located at a predetermined revolving position among the plurality of nozzle holders 32.
  • The mark camera 28 is provided on the head 30; it images the component P supplied from the component supply device 22 from above to recognize the component position, and images reference marks attached to the substrate B transported by the substrate transfer device 24 from above to recognize the substrate position.
  • the parts camera 40 is provided between the parts supply device 22 and the substrate transfer device 24.
  • The parts camera 40 images the component held by suction on the head 30 from below and recognizes its suction posture (suction deviation).
  • the parts camera 40 includes an illumination device 41, a lens 48, and an imaging device 49 (CCD, CMOS, etc.).
  • the illumination device 41 includes a side illumination unit 42 and an epi-illumination unit (coaxial epi-illumination unit) 44.
  • The side illumination unit 42 irradiates the subject with light obliquely and includes a plurality of light emitters (LEDs) 43 arranged in a ring around the lens 48 in top view.
  • The epi-illumination unit 44 irradiates the subject with light from the same direction as the optical axis of the lens 48, and includes a half mirror 46 disposed at an angle of 45 degrees to the optical axis of the lens 48 and a light emitter (LED) 45 that emits light toward the half mirror 46 in a direction (horizontal direction) orthogonal to the optical axis of the lens 48.
  • The control device 70 is configured as a microprocessor centered on the CPU 71 and, in addition to the CPU 71, includes a ROM 72, an HDD 73, a RAM 74, an input/output interface 75, and the like. These are connected via a bus 76.
  • a position sensor (not shown) that detects the position of the XY robot 26 in the XY axis direction, image signals from the part camera 40 and the mark camera 28, and the like are input to the control device 70.
  • The control device 70 outputs various control signals to the component supply device 22, the substrate transfer device 24, the X-axis and Y-axis motors of the XY robot 26, the Z-axis, R-axis, and θ-axis motors of the head 30, the parts camera 40, the mark camera 28, and the like.
  • the management apparatus 100 is configured, for example, as a general-purpose computer, and includes a CPU 101, a ROM 102, an HDD 103, a RAM 104, an input / output interface 105, and the like as shown in FIG.
  • An input signal from the input device 107 is input to the management apparatus 100 via the input / output interface 105.
  • a display signal to the display 108 is output from the management device 100 via the input / output interface 105.
  • Job information of the substrate B is stored in the HDD 103.
  • Job information is information that defines which components P are to be mounted on which substrate B, in what order, in each component mounter 10, and how many substrates B are to be manufactured in this way.
  • the job information includes information on the substrate B, information on the part P, a target mounting position of the part P, information on the head 30, and the like.
  • the management device 100 is communicably connected to the control device 70 of the component mounter 10, and exchanges various information and control signals.
  • FIG. 5 is a flowchart showing an example of the component mounting process.
  • the component mounting process is performed when job information is received from the management device 100.
  • The CPU 71 of the control device 70 first controls the XY robot 26 to move the head 30 above the component supply position, and controls the Z-axis motor so that the component P supplied to the component supply position is picked up by the suction nozzle 33 (S100).
  • The CPU 71 repeats the operation of controlling the R-axis motor to revolve the nozzle holders 32 and controlling the Z-axis motor so that the next suction nozzle 33 picks up a component P, until the planned number of components P have been picked up.
  • Next, the CPU 71 controls the XY robot 26 to move the head 30 above the parts camera 40 (S110), and controls the parts camera 40 to image the component P held by the suction nozzle 33 (S120).
  • The control of the parts camera 40 is performed by controlling the illumination device 41 so that light is emitted onto the component P held by the suction nozzle 33, and controlling the imaging element 49 so that the component P is imaged.
  • The CPU 71 stores the obtained captured image in the HDD 73 in association with the type of the imaged component P (S130).
  • FIG. 6 is an explanatory view showing a state of change of a captured image when a failure occurs in partial illumination of a part camera.
  • The component P has a component body with a rectangular parallelepiped outer shape, and electrodes arranged at both ends in the lateral direction.
  • The captured image of the component P has a dark area, which is the image of the component body, and a bright area, which is the image of the electrodes and whose luminance values are higher than those of the dark area.
  • In S150, it is determined whether the luminance difference of the bright area between the two images is equal to or greater than a predetermined value, that is, whether the luminance value of the bright area of the captured image is lower than that of the corresponding bright area of the reference image by the predetermined value or more.
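A minimal sketch of this S150-style check follows. The electrode mask, the threshold value, and the use of a mean over the bright area are illustrative assumptions, not details from the patent.

```python
# Hypothetical bright-area comparison: the bright (electrode) area of the
# captured image is compared against the same area of the reference image.

def bright_area_mean(image, mask):
    """Mean luminance over pixels where mask is True (the bright area)."""
    vals = [p for row, mrow in zip(image, mask)
            for p, m in zip(row, mrow) if m]
    return sum(vals) / len(vals)

def illumination_failed(captured, reference, mask, threshold):
    """True when the captured bright area is darker than the reference
    bright area by at least `threshold`."""
    diff = bright_area_mean(reference, mask) - bright_area_mean(captured, mask)
    return diff >= threshold

mask      = [[False, True], [False, True]]   # electrode (bright-area) pixels
reference = [[30, 220], [30, 220]]           # image stored under normal lighting
captured  = [[30, 150], [30, 150]]           # electrodes darker: suspect failure
failed = illumination_failed(captured, reference, mask, threshold=50)
```

Here the electrode pixels dropped from 220 to 150, a difference above the assumed threshold, so a lighting failure would be flagged.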
  • When the CPU 71 determines that a change in luminance value has occurred between the two images, it determines that a failure has occurred in the illumination (illumination device 41) of the parts camera 40, and creates a correction table for correcting the luminance value of each pixel of the captured image (S160). This process is performed by executing the correction table creation process illustrated in FIG. 7.
  • The CPU 71 first controls the XY robot 26 to move the component P held by the suction nozzle 33 to the measurement start position within the field of view of the parts camera 40 (S300). Subsequently, the CPU 71 controls the parts camera 40 to image the component P held by the suction nozzle 33 (S310) and stores the obtained captured image in the HDD 73 (S320).
  • Next, the CPU 71 determines whether the component P held by the suction nozzle 33 has reached the measurement end position (S330). If the CPU 71 determines that it has not reached the measurement end position, it controls the XY robot 26 to move the component P to the next imaging position (S340), returns to S310, and repeats the imaging of the component P and the storage of the acquired captured image.
  • FIG. 8 and FIG. 9 are explanatory diagrams showing the state of measurement of the luminance distribution.
  • As illustrated, the component P is moved with the lower-left position as the measurement start position and the upper-right position as the measurement end position, in a zigzag manner from the measurement start position toward the measurement end position: to the right, diagonally up and to the left, to the right, diagonally up and to the left, and to the right.
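The zigzag movement can be sketched as follows. The grid dimensions and step size are assumptions; the patent does not specify them, and the diagonal return moves are simplified here to a row-by-row direction reversal.

```python
# Hypothetical boustrophedon (zigzag) scan: visit grid positions row by row,
# reversing direction on each row, starting at the lower-left measurement
# start position and (for an odd number of rows) ending at the upper-right.

def zigzag_positions(cols, rows, step=1.0):
    """Return (x, y) imaging positions in zigzag order."""
    positions = []
    for r in range(rows):
        xs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in xs:
            positions.append((c * step, r * step))
    return positions

path = zigzag_positions(cols=3, rows=3)
# path[0] is the measurement start position (lower left),
# path[-1] the measurement end position (upper right)
```

Imaging the component at each of these positions covers the whole field of view, which is what the luminance-distribution measurement needs.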
  • If the illumination is normal, the luminance values of the pixels corresponding to the electrode portions are high no matter where within the field of view of the parts camera 40 the component P is imaged.
  • When part of the illumination of the parts camera 40 (the upper part in the figure) fails, as shown in FIG. 9, the luminance values of the pixels corresponding to the electrode portions are relatively low when the component P is imaged at a position the illumination does not reach.
  • When the CPU 71 determines in S330 that the component P has reached the measurement end position, it combines only the electrode portions of the captured images of the component P captured and stored so far, and derives the luminance distribution over the entire field of view of the parts camera 40 (S350).
  • Next, the CPU 71 creates a correction table in which a luminance correction value is determined for each pixel of the imaging element 49 based on the derived luminance distribution (S360).
  • The correction table defines the relationship between the position of each pixel of the imaging element 49 and the luminance correction value.
  • FIG. 10 is an explanatory diagram of an example of the correction table.
  • The correction table is created such that the lower the luminance value of a pixel in the derived luminance distribution, the larger the correction amount for that pixel.
  • For example, the luminance correction value may be determined such that the corrected luminance value of each pixel becomes a predetermined value.
  • Alternatively, the luminance correction value may be set such that the corrected luminance value of each pixel becomes the largest luminance value in the derived luminance distribution.
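One way to realize the second option above can be sketched as follows. The choice of a multiplicative gain and the function name are assumptions for illustration; the patent only states that darker pixels receive larger corrections.

```python
# Hypothetical correction-table creation: each pixel's gain is chosen so
# that the corrected value equals the largest value in the derived
# luminance distribution, so darker pixels get larger corrections.

def make_correction_table(luminance):
    """Per-pixel multiplicative gains that lift every pixel to the maximum."""
    peak = max(v for row in luminance for v in row)
    return [[peak / v if v > 0 else 1.0 for v in row] for row in luminance]

distribution = [[200, 200], [100, 50]]   # dimmer where the failed emitter sat
table = make_correction_table(distribution)
# Darker pixels get larger gains: table[1][1] == 4.0, table[0][0] == 1.0
```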
  • the CPU 71 stores the created correction table in the HDD 73 in association with the type of the part P used for the creation (S370), and ends the correction table creation processing.
  • Next, the CPU 71 causes the parts camera 40 to image the component P held by the suction nozzle 33 again (S170), and corrects the captured image using the created correction table (shading correction).
  • FIG. 11 is an explanatory view showing the appearance of a captured image before shading correction.
  • FIG. 12 is an explanatory view showing the appearance of the captured image after the shading correction.
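A hedged sketch of this shading-correction step is shown below. The multiplicative gain-table format and the clamping to an 8-bit range are illustrative assumptions, not details from the patent.

```python
# Hypothetical shading correction: multiply each pixel of a newly captured
# image by its stored per-pixel gain, clamping to the 0-255 range.

def shading_correct(image, table):
    """Apply per-pixel gains to a grayscale image."""
    return [[min(255, round(p * g)) for p, g in zip(row_i, row_t)]
            for row_i, row_t in zip(image, table)]

table = [[1.0, 1.0], [2.0, 4.0]]   # larger gains where illumination failed
image = [[180, 180], [90, 45]]     # new capture, dark in the failed region
corrected = shading_correct(image, table)
```

After correction, the region darkened by the failed illumination is restored to the same brightness as the normally lit region, matching the before/after contrast of FIG. 11 and FIG. 12.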
  • The CPU 71 then determines the suction position deviation of the component P held by the suction nozzle 33 based on the corrected captured image (S210) and corrects the target mounting position of the component P (S220).
  • The CPU 71 controls the XY robot 26 to move the suction nozzle 33 above the target mounting position (S230), lowers the suction nozzle 33 with the Z-axis motor to mount the component P on the substrate B (S240), and ends the component mounting process.
  • When components P are held by a plurality of suction nozzles 33, the CPU 71 repeats the operation of mounting each component P at its target mounting position.
  • The CPU 71 determines whether a correction table for the same type of component is already stored in the HDD 73 (S190). If the CPU 71 determines that no correction table for the same type of component is stored, the process proceeds to S210. That is, the CPU 71 determines the suction position deviation of the component P based on the captured image of the component P captured in S120 (S210), corrects the target mounting position of the component P (S220), moves the component P above the target mounting position (S230), and mounts it on the substrate B (S240).
  • If the CPU 71 determines in S190 that a correction table for the same type of component is stored in the HDD 73, it corrects the captured image of the component P captured in S120 using the stored correction table (S200), and proceeds to S210. That is, the CPU 71 determines the suction position deviation of the component P based on the captured image corrected using the correction table for that component type (S210), corrects the target mounting position of the component P (S220), moves the component P above the target mounting position (S230), and mounts it on the substrate B (S240).
  • the illumination device 41 corresponds to the illumination device
  • the imaging device 49 corresponds to the imaging device
  • the part camera 40 corresponds to the imaging device
  • the control device 70 corresponds to the control device.
  • the suction nozzle 33 corresponds to a holder
  • the XY robot 26 corresponds to a moving device.
  • The component mounter 10 of the embodiment described above includes the parts camera 40 having the illumination device 41 and the imaging element 49, the suction nozzle 33, the XY robot 26, and the control device 70, which images the component P held by the suction nozzle 33 and mounts the held component P on the substrate B.
  • The control device 70 determines an abnormality of the illumination device 41 based on the captured image and a reference image acquired in advance. When it determines that an abnormality has occurred in the illumination device 41, the control device 70 measures the luminance distribution of the field of view of the parts camera 40 and, based on the measured luminance distribution, creates a correction table in which a luminance correction value is defined for each pixel of the imaging element 49.
  • Then, the control device 70 corrects the luminance values of captured images of components P subsequently captured by the parts camera 40, using the created correction table.
  • The component mounter 10 can thus obtain a good captured image even when an abnormality occurs in the illumination of the illumination device 41, recognize the suction state of the component P normally from the obtained captured image, and perform the mounting operation.
  • The component mounter 10 measures the luminance distribution within the field of view of the parts camera 40 by imaging the component P with the parts camera 40 while moving the component P held by the suction nozzle 33 within that field of view. Thereby, even if a failure occurs in part of the illumination of the parts camera 40, the component mounter 10 can automatically measure the luminance distribution within the field of view and create the correction table.
  • The component mounter 10 measures the luminance distribution of the field of view of the parts camera 40 for each type of component P to create a correction table, and thereafter corrects, using that correction table, the luminance values of captured images obtained when components P of that type are imaged by the parts camera 40.
  • Thus, the component mounter 10 can correct the captured image using a correction table suited to each type of component P, further improving the recognition accuracy of the captured image.
  • In the embodiment, the component mounter 10 imaged the held component P with the parts camera 40, stored the obtained captured image in association with the type of the imaged component P, compared it with the most recently stored captured image (reference image) of the same type of component P, and determined whether a failure had occurred in the illumination device 41 of the parts camera 40.
  • the reference image may be one obtained by averaging the luminance values of captured images of a plurality of similar components stored in the past.
  • the reference image may be a captured image of the part P captured by the part camera 40 in advance before production.
  • the reference image may be created based on the information (type) on the part P.
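The averaging variant mentioned above can be sketched as a per-pixel mean. The text does not specify the averaging method, so a simple arithmetic mean over equally sized grayscale images is assumed here.

```python
# Hypothetical reference-image construction: average the luminance values of
# several past captured images of the same component type, pixel by pixel.

def average_reference(images):
    """Per-pixel mean of equally sized grayscale images."""
    n = len(images)
    h, w = len(images[0]), len(images[0][0])
    return [[sum(img[y][x] for img in images) / n for x in range(w)]
            for y in range(h)]

past = [[[100, 200]], [[110, 210]], [[120, 220]]]   # three 1x2 past images
ref = average_reference(past)
```

Averaging suppresses frame-to-frame noise, so the resulting reference image gives a more stable baseline for the failure check than any single past capture.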
  • In the embodiment, the component mounter 10 measured the luminance distribution within the field of view of the parts camera 40 for each type of component P, created and stored a correction table, and corrected the captured image of each component P using the correction table for its type.
  • the component mounter 10 may correct the captured image of the component P using the same correction table regardless of the type of the component P.
  • In the embodiment, the component mounter 10 measured the luminance distribution within the field of view of the parts camera 40 by imaging the component P with the parts camera 40 while moving the component P held by the suction nozzle 33 within the field of view.
  • However, the component mounter 10 may measure the luminance distribution within the field of view of the parts camera 40 by, for example, providing on the head 30 a jig plate that covers the field of view of the parts camera 40 and imaging the jig plate with the parts camera 40.
  • In the embodiment, the component mounter 10 measured the luminance distribution over the entire area within the field of view of the parts camera 40 and created the correction table.
  • However, the component mounter 10 does not necessarily have to measure the luminance distribution over the entire field of view of the parts camera 40.
  • For example, the component mounter 10 may measure the luminance distribution of only the imaging range corresponding to each type of component P and create a correction table for each type of component P.
  • In the embodiment, when the luminance difference between the captured image and the reference image was equal to or greater than a predetermined value, the component mounter 10 determined that a failure had occurred in the illumination of the parts camera 40, measured the luminance distribution of the field of view, and created the correction table. However, the component mounter 10 may stop the mounting operation without measuring the luminance distribution or creating the correction table when the luminance difference between the captured image and the reference image exceeds a limit value larger than the predetermined value, or when the extent of the bright area whose luminance difference is equal to or greater than the predetermined value exceeds a predetermined extent.
  • In the embodiment, the XY robot 26 moves the head 30 in the XY-axis directions; however, the substrate B may be moved in the XY-axis directions instead.
  • In the embodiment, the CPU 71 created a correction table in which a luminance correction value is determined for each pixel of the imaging element 49 based on the luminance distribution of the field of view of the parts camera 40.
  • However, the CPU 71 may instead create a function that defines the relationship between the position of a pixel of the imaging element 49 and the luminance correction value for that pixel.
  • In this case, the CPU 71 may divide the field of view of the parts camera 40 into areas for the components held by the respective suction nozzles 33 and create a function for each divided area.
  • the present disclosure may be in the form of an image processing system in addition to the form of a component mounter.
  • the present disclosure is applicable to the manufacturing industry of an image processing system and a component mounting machine.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Operations Research (AREA)
  • Manufacturing & Machinery (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Supply And Installment Of Electrical Components (AREA)
  • Image Input (AREA)

Abstract

An image processing system comprising: an imaging device including an illumination device that irradiates light on to a subject and an imaging element that receives reflected light from the subject and captures an image of the subject; and a control device that controls the imaging device such that an image of the subject is captured, processes the captured image, and recognizes the subject. The control device: determines illumination device errors on the basis of the captured image and reference data obtained beforehand; measures luminance distribution in the field-of-view of the imaging device if a determination has been made that an error has occurred in the illumination device; determines the relationship between a luminance correction value and the position of each pixel in the imaging element on the basis of the measured luminance distribution; and corrects the luminance value for the captured image of the subject captured by the imaging device thereafter, using the determined relationship between the position of each pixel and the luminance correction value.

Description

画像処理システムおよび部品実装機Image processing system and component mounting machine
 本明細書は、画像処理システムおよび部品実装機について開示する。 The present specification discloses an image processing system and a component mounter.
 Conventionally, there has been known an image processing system in which a subject is illuminated by an illumination device, the subject is imaged by an imaging device, and the obtained image data is processed. For example, Patent Document 1 discloses a system that determines the decrease in luminance of a plurality of light emitters from the time the illumination device was introduced to the present and, when the luminance decrease is within a predetermined range, increases the drive current of the illumination power supply so that the luminance of the light emitters is raised by the amount of the decrease.
JP 2007-265120 A
 However, although Patent Document 1 describes recovering the luminance decrease of the light emitters by increasing the drive current of the illumination power supply, it makes no mention of the case where an abnormality occurs in some of the plurality of light emitters. For example, when a failure occurs in which some of the light emitters no longer emit light, that portion of the image appears dark, and the subject may not be correctly recognized from the captured image.
 The main object of the present disclosure is to obtain a good captured image even when an abnormality occurs in the illumination of the illumination device.
 To achieve the above main object, the present disclosure adopts the following means.
 The image processing system of the present disclosure is an image processing system that processes images, comprising: an imaging device including an illumination device that irradiates a subject with light and an imaging element that receives reflected light from the subject and captures an image of the subject; and a control device that controls the imaging device so that the subject is imaged and processes the obtained captured image to recognize the subject. The control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance; when it determines that an abnormality has occurred in the illumination device, it measures the luminance distribution of the field of view of the imaging device, determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution, and, using the determined relationship between pixel positions and luminance correction values, corrects the luminance values of captured images of subjects subsequently captured by the imaging device.
 The image processing system of the present disclosure includes an imaging device, which in turn includes an illumination device and an imaging element, and a control device that processes a captured image obtained by imaging a subject and recognizes the subject. The control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance. When it determines that an abnormality has occurred in the illumination device, the control device measures the luminance distribution of the field of view of the imaging device and determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution. The control device then corrects, using the determined relationship, the luminance values of captured images of subjects subsequently captured by the imaging device. As a result, even when an abnormality occurs in the illumination of the illumination device, a good captured image can be obtained, and the subject can be correctly recognized from the obtained captured image.
 The component mounter of the present disclosure is a component mounter that holds a component and mounts it on a mounting target, comprising: an imaging device including an illumination device that irradiates a subject with light and an imaging element that receives reflected light from the subject and captures an image of the subject; a holder that holds a component; a moving device that moves the holder relative to the mounting target; and a control device that controls the moving device and the imaging device so that the component held by the holder is imaged, processes the obtained captured image to determine the holding state of the component, and then controls the holder and the moving device so that the held component is mounted on the mounting target. The control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance; when it determines that an abnormality has occurred in the illumination device, it measures the luminance distribution of the field of view of the imaging device, determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution, and, using the determined relationship between pixel positions and luminance correction values, corrects the luminance values of captured images of components subsequently captured by the imaging device.
 The component mounter of the present disclosure includes an imaging device, which in turn includes an illumination device and an imaging element, a holder, a moving device, and a control device that processes a captured image obtained by imaging the component held by the holder to determine the holding state of the component and then mounts the held component on the mounting target. The control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance. When it determines that an abnormality has occurred in the illumination device, the control device measures the luminance distribution of the field of view of the imaging device and determines the relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution. The control device then corrects, using the determined relationship, the luminance values of captured images of components subsequently captured by the imaging device. As a result, even when an abnormality occurs in the illumination of the illumination device, a good captured image can be obtained, the holding state of the component can be correctly recognized from the obtained captured image, and the mounting operation can be performed.
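The correction principle shared by the two aspects above can be illustrated numerically. The following is a minimal sketch, not taken from the publication; the array sizes, luminance levels, and the choice of a multiplicative per-pixel correction are invented for illustration of restoring a uniformly bright subject after part of the illumination has failed.

```python
import numpy as np

# Simulate the measured luminance distribution of a field of view whose
# upper half is under-lit because some LEDs have failed: luminance 120
# is measured there instead of the normal 240.
field = np.full((4, 4), 240.0)
field[:2, :] = 120.0

# Per-pixel correction values chosen so that every pixel is restored to
# the brightest measured level -- darker pixels get larger factors.
gain = field.max() / field

# A subject imaged under the faulty illumination is corrected with the
# per-pixel table before recognition; its dark upper half is restored.
subject = np.full((4, 4), 200.0)
subject[:2, :] = 100.0          # region under the dead LEDs looks dark
corrected = np.clip(subject * gain, 0, 255)
print(corrected)                # every pixel is restored to 200.0
```

Under these assumptions the corrected image is uniform again, which is the property the disclosed system relies on when recognizing the subject afterwards.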
FIG. 1 is a configuration diagram of the component mounting system 1.
FIG. 2 is a configuration diagram of the component mounter 10.
FIG. 3 is a configuration diagram of the head 30 and the parts camera 40.
FIG. 4 is a block diagram showing the electrical connection between the control device 70 and the management device 100.
FIG. 5 is a flowchart showing an example of the component mounting process.
FIG. 6 is an explanatory diagram showing how the captured image changes when a failure occurs in part of the illumination of the parts camera.
FIG. 7 is a flowchart showing an example of the correction table creation process.
FIG. 8 is an explanatory diagram showing the measurement of the luminance distribution.
FIG. 9 is an explanatory diagram showing the measurement of the luminance distribution.
FIG. 10 is an explanatory diagram showing an example of the correction table.
FIG. 11 is an explanatory diagram showing a captured image before shading correction.
FIG. 12 is an explanatory diagram showing a captured image after shading correction.
 Next, an embodiment for carrying out the present disclosure will be described with reference to the drawings.
 FIG. 1 is a configuration diagram of the component mounting system 1. FIG. 2 is a configuration diagram of the component mounter 10. FIG. 3 is a configuration diagram of the head 30 and the parts camera 40. FIG. 4 is a block diagram showing the electrical connection between the control device 70 and the management device 100. In the present embodiment, the left-right direction in FIG. 2 is the X-axis direction, the front-rear direction is the Y-axis direction, and the up-down direction is the Z-axis direction.
 As shown in FIG. 1, the component mounting system 1 includes a screen printer 2, the component mounter 10, a reflow furnace 4, a management device 100 that manages the entire system, and the like. The screen printer 2 prints a wiring pattern (solder surface) on the substrate B below by pressing solder into pattern holes formed in a screen, rolling the solder on the screen with a squeegee so that it passes through the pattern holes. The component mounter 10 picks up an electronic component (hereinafter simply referred to as a "component") P by suction and mounts it on the substrate B on which the solder has been printed. The reflow furnace 4 heats the substrate B on which the components are mounted, melting the solder on the substrate B and performing solder bonding.
 The component mounter 10 is configured as a general-purpose mounter capable of mounting a wide variety of components P of different sizes and shapes, such as chip components such as chip resistors, odd-form components such as connectors, and IC components such as QFP (Quad Flat Package) and BGA (Ball Grid Array) packages. As shown in FIG. 2, the component mounter 10 includes a component supply device 22, a substrate conveyance device 24, an XY robot 26, a head 30, a mark camera 28, a parts camera 40, a control device 70 (see FIG. 4), and the like.
 The component supply device 22 supplies components P to a component supply position. The component supply device 22 includes tape feeders, which are mounted on the front of the component mounter 10 so as to be arrayed along the X-axis direction (left-right direction) and each supply a tape containing a plurality of components P of the same type (such as chip components), and a tray feeder, which is installed at the front of the component mounter 10 and supplies trays each containing a plurality of components P of the same type (such as IC components).
 As shown in FIG. 2, the substrate conveyance device 24 has a pair of conveyor belts spaced apart in the front-rear direction and spanning the left-right direction. The substrate B is conveyed from left to right in the figure by the conveyor belts of the substrate conveyance device 24.
 The XY robot 26 moves the head 30 in the X-axis and Y-axis directions. As shown in FIG. 2, the XY robot 26 includes an X-axis slider 26a, to which the head 30 is attached and which is movable in the X-axis direction (left-right direction) by driving of an X-axis motor, and a Y-axis slider 26b, which supports the X-axis slider 26a movably in the X-axis direction and is itself movable in the Y-axis direction (front-rear direction) by driving of a Y-axis motor.
 The head 30 is a rotary head and, as shown in FIG. 3, includes a head body 31 as a rotating body and a plurality of nozzle holders 32 arrayed circumferentially on the head body 31 and supported so as to be able to move up and down. A suction nozzle 33 is removably attached to the tip of each nozzle holder 32. Although not shown, the head 30 also includes an R-axis motor that rotates the head body 31 so as to revolve the nozzle holders 32 about the central axis of the head body 31, a θ-axis motor that rotates each nozzle holder 32 about its own axis, and a Z-axis motor that raises and lowers the nozzle holder 32 (suction nozzle 33) located at a predetermined revolving position.
 The mark camera 28 is provided on the head 30. It images components P supplied by the component supply device 22 from above to recognize the component position, and images fiducial marks attached to the substrate B conveyed by the substrate conveyance device 24 from above to recognize the substrate position.
 The parts camera 40 is provided between the component supply device 22 and the substrate conveyance device 24. It images components held by suction on the head 30 from below to recognize their pickup posture (pickup deviation). As shown in FIG. 3, the parts camera 40 includes an illumination device 41, a lens 48, and an imaging element 49 (such as a CCD or CMOS sensor). The illumination device 41 has a side illumination unit 42 and an epi-illumination unit (coaxial epi-illumination unit) 44. The side illumination unit 42 illuminates the subject obliquely and has a plurality of light emitters (LEDs) 43 arranged in a ring around the lens 48 in top view. The epi-illumination unit 44 illuminates the subject from the same direction as the optical axis of the lens 48 and has a half mirror 46 arranged at 45 degrees to the optical axis of the lens 48 and a light emitter (LED) 45 that irradiates light toward the half mirror 46 in a direction orthogonal to the optical axis of the lens 48 (the horizontal direction).
 As shown in FIG. 4, the control device 70 is configured as a microprocessor centered on a CPU 71 and, in addition to the CPU 71, includes a ROM 72, an HDD 73, a RAM 74, an input/output interface 75, and the like, all connected via a bus 76. Input to the control device 70 are, for example, position signals from position sensors (not shown) that detect the XY-axis position of the XY robot 26 and image signals from the parts camera 40 and the mark camera 28. The control device 70 in turn outputs various control signals to the component supply device 22, the substrate conveyance device 24, the X-axis and Y-axis motors of the XY robot 26, the Z-axis, R-axis, and θ-axis motors of the head 30, the parts camera 40, the mark camera 28, and the like.
 The management device 100 is configured, for example, as a general-purpose computer and, as shown in FIG. 4, includes a CPU 101, a ROM 102, an HDD 103, a RAM 104, an input/output interface 105, and the like. Input signals from an input device 107 are input to the management device 100 via the input/output interface 105, and display signals to a display 108 are output from the management device 100 via the input/output interface 105. The HDD 103 stores job information for the substrates B. Here, job information is information that defines, for each component mounter 10, which components P are to be mounted on which substrate B in which order, and how many substrates B so mounted are to be produced. The job information includes information on the substrate B, information on the components P, target mounting positions of the components P, information on the head 30, and the like. The management device 100 is communicably connected to the control device 70 of the component mounter 10 and exchanges various information and control signals with it.
 Next, the operation of the component mounter 10 of the present embodiment configured as described above will be described. FIG. 5 is a flowchart showing an example of the component mounting process. The component mounting process is executed when job information is received from the management device 100.
 When the component mounting process is executed, the CPU 71 of the control device 70 first controls the XY robot 26 to move the head 30 above the component supply position and controls the Z-axis motor to cause a suction nozzle 33 to pick up the component P supplied to the component supply position (S100). The CPU 71 repeats the operation of controlling the R-axis motor to revolve the nozzle holders 32 and controlling the Z-axis motor to cause the next suction nozzle 33 to pick up a component P until the planned number of components P has been picked up. Subsequently, the CPU 71 controls the XY robot 26 to move the head 30 above the parts camera 40 (S110) and controls the parts camera 40 to image the components P held by the suction nozzles 33 (S120). The parts camera 40 is controlled by controlling the illumination device 41 so that light is irradiated onto the components P held by the suction nozzles 33 and controlling the imaging element 49 so that those components P are imaged. The CPU 71 then stores the obtained captured image in the HDD 73 in association with the type of the picked-up component P (S130).
 Next, the CPU 71 compares the obtained captured image with the most recently stored captured image (reference image) of a component P of the same type in the HDD 73 (S140) and determines whether a change in luminance value has occurred between the two images (S150). This processing determines whether a failure has occurred in part of the illumination of the parts camera 40. FIG. 6 is an explanatory diagram showing how the captured image changes when a failure occurs in part of the illumination of the parts camera. In the illustrated example, the component P has a component body with a rectangular parallelepiped outer shape and electrodes arrayed at both ends in the short direction. The captured image of the component P has a dark region, which is the image of the component body, and a bright region, which is the image of the electrodes and has higher luminance values than the dark region. The determination in S150 is made by determining whether the luminance difference of the bright regions between the two images is equal to or greater than a predetermined value, that is, whether the luminance value of the bright region of the captured image is smaller than that of the corresponding bright region of the reference image by the predetermined value or more.
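The luminance comparison of S140 and S150 can be sketched as follows. This is a hypothetical illustration rather than code from the publication; the function name, the 8-bit luminance assumption, and the threshold values are all assumptions.

```python
import numpy as np

def illumination_anomaly(captured: np.ndarray, reference: np.ndarray,
                         bright_threshold: int = 128,
                         luminance_drop: int = 30) -> bool:
    """Return True when the bright (electrode) region of the captured
    image is darker than the corresponding region of the reference
    image by at least `luminance_drop` gray levels."""
    # The bright region is located on the reference image, where the
    # illumination is known to have been normal when it was stored.
    bright_mask = reference >= bright_threshold
    if not bright_mask.any():
        return False  # no bright region to compare
    drop = reference[bright_mask].astype(int) - captured[bright_mask].astype(int)
    return drop.mean() >= luminance_drop
```

In this sketch the comparison is done on the mean of the bright region; a real implementation might instead compare region-wise statistics so that a single dead LED darkening only one side of the field is still detected.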
 When the CPU 71 determines that a change in luminance value has occurred between the two images, it judges that a failure has occurred in the illumination of the parts camera 40 (the illumination device 41) and creates a correction table for correcting the luminance value of each pixel of the captured image (S160). This processing is performed by executing the correction table creation process illustrated in FIG. 7.
 In the correction table creation process, the CPU 71 first controls the XY robot 26 to move the component P held by the suction nozzle 33 to a measurement start position within the field of view of the parts camera 40 (S300). Subsequently, the CPU 71 controls the parts camera 40 to image the component P held by the suction nozzle 33 (S310) and stores the obtained captured image in the HDD 73 (S320). The CPU 71 then determines, based on the XY position of the suction nozzle 33 detected by the position sensors, whether the component P held by the suction nozzle 33 has reached a measurement end position (S330). If the CPU 71 determines that the component P held by the suction nozzle 33 has not reached the measurement end position, it controls the XY robot 26 to move the component P to the next imaging position (S340), returns to S310, and repeats the imaging of the component P and the storage of the obtained captured images. The processing of S300 to S340 is performed by repeatedly imaging the component P while the XY robot 26 moves it within the field of view of the parts camera 40 so that the electrode portions of the component P are imaged over the entire field of view without gaps. FIGS. 8 and 9 are explanatory diagrams showing the measurement of the luminance distribution. In the embodiment, as illustrated, the component P is moved in a zigzag pattern from the measurement start position at the lower left toward the measurement end position at the upper right: rightward, diagonally up and to the left, rightward, diagonally up and to the left, and rightward. When the parts camera 40 is normal, as shown in FIG. 8, the luminance values of the pixels corresponding to the electrode portions are high no matter where within the field of view of the parts camera 40 the component P is imaged.
 On the other hand, when part of the illumination of the parts camera 40 (the upper part in the figure) has failed, as shown in FIG. 9, the luminance values of the pixels corresponding to the electrode portions are relatively low when the component P is imaged at a position not reached by the illumination. When the CPU 71 determines in S330 that the component P has reached the measurement end position, it combines only the electrode portions of the captured images of the component P imaged and stored so far to derive the luminance distribution over the entire field of view of the parts camera 40 (S350). The CPU 71 then creates a correction table that defines a luminance correction value for each pixel of the imaging element 49 based on the derived luminance distribution (S360). The correction table defines the relationship between the position of each pixel of the imaging element 49 and the luminance correction value. FIG. 10 is an explanatory diagram showing an example of the correction table. The correction table is created so that the lower the luminance value of a pixel in the derived luminance distribution, the larger its correction amount. For example, the luminance correction values may be set so that the corrected luminance value of each pixel becomes a predetermined value, or so that the corrected luminance value of each pixel becomes the largest luminance value in the derived distribution. The CPU 71 then stores the created correction table in the HDD 73 in association with the type of the component P used to create it (S370) and ends the correction table creation process.
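The derivation of the correction table in S350 and S360 can be sketched as follows, modeling the luminance correction value as a per-pixel multiplicative gain, which is one common way to realize shading correction; the publication does not fix the arithmetic form of the correction value. The function name and the normalization target are assumptions.

```python
import numpy as np

def build_correction_table(luminance: np.ndarray, target=None) -> np.ndarray:
    """Given the per-pixel luminance distribution measured over the
    camera's field of view, return a per-pixel gain table such that
    luminance * gain == target. When `target` is None, the maximum
    measured luminance is used, as in one of the variants described
    above (each pixel is raised to the brightest measured level)."""
    if target is None:
        target = float(luminance.max())
    # Darker pixels get larger correction factors; clamp the divisor
    # to avoid division by zero on completely dark pixels.
    safe = np.maximum(luminance.astype(float), 1.0)
    return target / safe
```

The other variant described above, correcting every pixel to a predetermined value, corresponds to passing that value as `target`.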
 Returning to the component mounting process, after creating the correction table in this way, the CPU 71 images the picked-up components P again with the parts camera 40 (S170) and corrects the captured image using the created correction table (shading correction) (S180). FIG. 11 is an explanatory diagram showing a captured image before shading correction, and FIG. 12 is an explanatory diagram showing the captured image after shading correction. Next, the CPU 71 determines the pickup position deviation of the component P held by the suction nozzle 33 based on the corrected captured image (S210) and corrects the target mounting position of the component P (S220). The CPU 71 then controls the XY robot 26 to move the suction nozzle 33 above the target mounting position (S230), lowers the suction nozzle 33 with the Z-axis motor to mount the component P on the substrate B (S240), and ends the component mounting process. When components P are held by a plurality of suction nozzles 33, the CPU 71 repeats the operation of mounting each component P at its respective target mounting position.
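Under the multiplicative-gain model assumed here, applying the table to a later captured image (S180, and likewise S200) reduces to a per-pixel product clipped to the sensor range. The function name and the 8-bit range are assumptions for illustration.

```python
import numpy as np

def shading_correct(image: np.ndarray, table: np.ndarray) -> np.ndarray:
    """Apply the per-pixel correction table to a newly captured image,
    clipping the result to the 8-bit range of the imaging element."""
    corrected = image.astype(float) * table
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

For example, a pixel that measured 100 under the failed illumination and has a gain of 2.0 in the table is restored to 200, matching pixels imaged under the intact illumination.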
 If the CPU 71 determines in S150 that no change in luminance value has occurred between the two images, it determines whether a correction table for the same type of component is already stored in the HDD 73 (S190). If the CPU 71 determines that no correction table for the same type of component is stored, it proceeds to S210. That is, the CPU 71 determines the pickup position deviation of the component P based on the captured image of the component P captured in S120 (S210), corrects the target mounting position of the component P (S220), moves the component P above the target mounting position (S230), and mounts it on the substrate B (S240).
 On the other hand, if the CPU 71 determines in S190 that a correction table for the same type of component is stored in the HDD 73, it corrects the captured image of the component P captured in S120 using the stored correction table (S200) and then proceeds to S210. That is, the CPU 71 determines the pickup position deviation of the component P based on the captured image corrected using the correction table for the same type of component (S210), corrects the target mounting position of the component P (S220), moves the component P above the target mounting position (S230), and mounts it on the substrate B (S240).
 Here, the correspondence between the main elements of the embodiment and the main elements of the present disclosure described in the claims will be explained. The illumination device 41 corresponds to the illumination device, the imaging element 49 to the imaging element, the parts camera 40 to the imaging device, and the control device 70 to the control device. The suction nozzle 33 corresponds to the holder, and the XY robot 26 to the moving device.
 The component mounter 10 of the present embodiment described above includes the parts camera 40, which comprises the illumination device 41 and the imaging element 49, the suction nozzle 33, the XY robot 26, and the control device 70, which images the component P held by the suction nozzle 33, processes the obtained captured image to determine the suction state of the component P, and then mounts the held component P on the board B. The control device 70 determines an abnormality of the illumination device 41 based on the captured image and a reference image acquired in advance. When it determines that an abnormality has occurred in the illumination device 41, the control device 70 measures the luminance distribution of the field of view of the parts camera 40 and, based on the measured distribution, creates a correction table that defines a luminance correction value for each pixel of the imaging element 49. The control device 70 then uses the created correction table to correct the luminance values of captured images of components P subsequently taken by the parts camera 40. Thus, even when an abnormality occurs in the illumination of the illumination device 41, the component mounter 10 can obtain good captured images, correctly recognize the suction state of the component P from them, and carry out the mounting operation.
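The per-pixel correction values described here can be read as gains that restore the measured luminance distribution to its reference level. A hedged sketch, where the gain cap for dark or dead pixels is an illustrative choice, not a value from the text:

```python
import numpy as np

def build_correction_table(measured, reference, max_gain=4.0):
    # Per-pixel gain = reference luminance / measured luminance,
    # capped so that dark or dead pixels do not produce runaway gains.
    measured = measured.astype(np.float32)
    reference = reference.astype(np.float32)
    gain = np.divide(reference, measured,
                     out=np.full_like(reference, max_gain),
                     where=measured > 0)
    return np.clip(gain, 0.0, max_gain)
```

Pixels where no light was measured fall back to the capped gain rather than dividing by zero.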
 The component mounter 10 of the present embodiment also measures the luminance distribution within the field of view of the parts camera 40 by imaging the component P with the parts camera 40 while moving the component P, held by the suction nozzle 33, within that field of view. As a result, even if part of the illumination of the parts camera 40 fails, the component mounter 10 can automatically measure the luminance distribution within the field of view and create the correction table.
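Combining the captures taken while the part is shifted across the field of view could, for example, keep the brightest observation at each pixel; the text does not fix the combination rule, so this is one plausible reading:

```python
import numpy as np

def measure_field_luminance(shifted_captures):
    # Each capture shows the part at a different position in the field of
    # view; keep, per pixel, the brightest value seen across all captures.
    stack = np.stack([c.astype(np.float32) for c in shifted_captures])
    return stack.max(axis=0)
```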
 Furthermore, the component mounter 10 of the present embodiment measures the luminance distribution of the field of view of the parts camera 40 and creates a correction table for each type of component P, and thereafter corrects, using that table, the luminance values of captured images obtained when a component P of the same type as the one used to create the table is imaged by the parts camera 40. The component mounter 10 can thus correct captured images with a correction table suited to each component type, further improving the accuracy of image recognition.
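Keeping one correction table per component type is essentially a keyed cache, as in this sketch (the class and type names are illustrative, not from the text):

```python
class CorrectionTableCache:
    # Correction tables keyed by component type, mirroring the per-type
    # tables the embodiment stores on the HDD.
    def __init__(self):
        self._tables = {}

    def store(self, part_type, table):
        self._tables[part_type] = table

    def lookup(self, part_type):
        # None means no table exists yet for this type
        # (the S190 branch: proceed without correction).
        return self._tables.get(part_type)
```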
 Needless to say, the present invention is not limited in any way to the embodiment described above and can be implemented in various modes as long as they fall within the technical scope of the present invention.
 For example, in the embodiment described above, the component mounter 10 images the picked-up component P with the parts camera 40, stores the obtained captured image in association with the type of the component P, and compares it with the most recently stored captured image (reference image) of a component P of the same type to determine whether a failure has occurred in the illumination device 41 of the parts camera 40. However, the reference image may instead be obtained by averaging the luminance values of previously stored captured images of a plurality of components of the same type. The reference image may also be a captured image of the component P taken in advance by the parts camera 40 before production. Furthermore, the reference image may be created based on information about the component P, such as its type.
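The averaged-reference variant mentioned above is straightforward: average the luminance values of the stored same-type captures. A minimal sketch:

```python
import numpy as np

def averaged_reference(past_captures):
    # Reference image formed by averaging the luminance values of
    # previously stored captures of the same component type.
    stack = np.stack([c.astype(np.float32) for c in past_captures])
    return stack.mean(axis=0)
```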
 Also, in the embodiment described above, the component mounter 10 measures the luminance distribution within the field of view of the parts camera 40 for each type of component P, creates and stores a correction table, and corrects the captured image of a component P using the table created for the same component type. However, the component mounter 10 may instead correct the captured image using the same correction table regardless of the type of the component P.
 Also, in the embodiment described above, the component mounter 10 measures the luminance distribution within the field of view of the parts camera 40 by imaging the component P with the parts camera 40 while moving the component P, held by the suction nozzle 33, within the field of view. However, the component mounter 10 may instead measure the luminance distribution by, for example, providing the head 30 with a jig plate that covers the field of view of the parts camera 40 and imaging the jig plate with the parts camera 40.
 Also, in the embodiment described above, the component mounter 10 measures the luminance distribution over the entire field of view of the parts camera 40 to create the correction table. However, as long as the imaging range of the component P is covered, it is not strictly necessary to measure the luminance distribution over the entire field of view. Moreover, when the imaging position within the field of view of the parts camera 40 is fixed for each type of component P, the component mounter 10 may measure the luminance distribution only over the imaging range corresponding to each component type and create a correction table for each type.
 Also, in the embodiment described above, the component mounter 10 judges that a failure has occurred in the illumination of the parts camera 40 when the luminance difference in the bright regions between the captured image and the reference image is equal to or greater than a predetermined value, and then measures the luminance distribution of the field of view and creates the correction table. However, when that luminance difference exceeds a limit value larger than the predetermined value, or when the extent of the bright regions whose luminance difference is equal to or greater than the predetermined value exceeds a predetermined range, the component mounter 10 may stop the mounting operation without measuring the luminance distribution or creating the correction table.
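The paragraph above implies a three-way decision: continue, correct, or stop. A sketch with illustrative threshold values (the text gives no numbers):

```python
def decide_on_luminance_difference(diff, affected_ratio,
                                   diff_threshold=30.0,
                                   diff_limit=80.0,
                                   area_limit=0.5):
    # All numeric thresholds here are illustrative, not from the patent.
    if diff < diff_threshold:
        return "continue"   # no illumination abnormality detected
    if diff > diff_limit or affected_ratio > area_limit:
        return "stop"       # failure too severe: halt the mounting operation
    return "correct"        # measure the field and build a correction table
```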
 Also, in the embodiment described above, the XY robot 26 moves the head 30 in the XY-axis directions; however, it may instead move the board B in the XY-axis directions.
 Also, in the embodiment described above, the CPU 71 creates a correction table that defines a luminance correction value for each pixel of the imaging element 49 based on the luminance distribution of the field of view of the parts camera 40. However, the CPU 71 may instead create a function that defines the relationship between the position of each pixel of the imaging element 49 and its luminance correction value. The CPU 71 may also divide the field of view of the parts camera 40 into regions, one for each component held by each suction nozzle 33, and create a function for each divided region.
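One way to realize the function variant is to fit a low-degree polynomial surface to the per-pixel gains by least squares, so only a few coefficients need to be stored; the fitting method is an assumption, since the text only says a function may be used:

```python
import numpy as np

def _poly_terms(h, w, degree):
    # Normalized 2-D polynomial basis terms x^i * y^j with i + j <= degree.
    ys, xs = np.mgrid[0:h, 0:w]
    xs = xs / max(w - 1, 1)
    ys = ys / max(h - 1, 1)
    return [xs ** i * ys ** j
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]

def fit_correction_function(gains, degree=2):
    # Least-squares fit of a polynomial surface to the per-pixel gains,
    # compressing the full table into a small coefficient vector.
    h, w = gains.shape
    A = np.stack([t.ravel() for t in _poly_terms(h, w, degree)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, gains.ravel(), rcond=None)
    return coeffs

def eval_correction_function(coeffs, h, w, degree=2):
    # Reconstruct the gain at every pixel position from the coefficients.
    return sum(c * t for c, t in zip(coeffs, _poly_terms(h, w, degree)))
```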
 In addition to the form of a component mounter, the present disclosure may also take the form of an image processing system.
 The present disclosure is applicable to, for example, the manufacturing industry for image processing systems and component mounters.
Reference Signs List: 1 component mounting system, 2 screen printing machine, 4 reflow furnace, 10 component mounter, 22 component supply device, 24 board conveyance device, 26 XY robot, 26a X-axis slider, 26b Y-axis slider, 28 mark camera, 30 head, 31 head body, 32 nozzle holder, 33 suction nozzle, 40 parts camera, 41 illumination device, 42 side illumination unit, 43 light emitter, 44 epi-illumination unit, 45 light emitter, 46 half mirror, 48 lens, 49 imaging element, 70 control device, 71 CPU, 72 ROM, 73 HDD, 74 RAM, 75 input/output interface, 76 bus, 100 management device, 101 CPU, 102 ROM, 103 HDD, 104 RAM, 105 input/output interface, 107 input device, 108 display, B board, P component.

Claims (5)

  1.  An image processing system for processing an image, comprising:
     an imaging device including an illumination device that irradiates a subject with light and an imaging element that receives light reflected from the subject and captures an image of the subject; and
     a control device that controls the imaging device so that the subject is imaged, and processes the obtained captured image to recognize the subject,
     wherein the control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance, and, when determining that an abnormality has occurred in the illumination device, measures a luminance distribution of a field of view of the imaging device, determines a relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution, and thereafter corrects, using the determined relationship, a luminance value of a captured image of a subject captured by the imaging device.
  2.  A component mounter that holds a component and mounts it on a mounting target, comprising:
     an imaging device including an illumination device that irradiates a subject with light and an imaging element that receives light reflected from the subject and captures an image of the subject;
     a holder that holds a component;
     a moving device that moves the holder relative to the mounting target; and
     a control device that controls the moving device and the imaging device so that the component held by the holder is imaged, processes the obtained captured image to determine a holding state of the component, and then controls the holder and the moving device so that the held component is mounted on the mounting target,
     wherein the control device determines an abnormality of the illumination device based on the captured image and reference data acquired in advance, and, when determining that an abnormality has occurred in the illumination device, measures a luminance distribution of a field of view of the imaging device, determines a relationship between the position of each pixel of the imaging element and a luminance correction value based on the measured luminance distribution, and thereafter corrects, using the determined relationship, a luminance value of a captured image of a component captured by the imaging device.
  3.  The component mounter according to claim 2, wherein the control device creates a correction table defining the determined relationship between the position of each pixel and the luminance correction value.
  4.  The component mounter according to claim 2 or 3, wherein, when determining that an abnormality has occurred in the illumination device, the control device controls the moving device and the imaging device so that imaging of the component is repeated while the position of the component, held by the holder, is shifted within the field of view of the imaging device, and measures the luminance distribution of the field of view of the imaging device based on the obtained captured images to determine the relationship between the position of each pixel of the imaging element and the luminance correction value.
  5.  The component mounter according to claim 4, wherein the control device measures the luminance distribution of the field of view of the imaging device for each component type to determine the relationship between the position of each pixel of the imaging element and the luminance correction value, and thereafter corrects, using that relationship, the luminance value of a captured image obtained when a component of the same type as the component used to determine the relationship is imaged by the imaging device.
PCT/JP2017/029008 2017-08-09 2017-08-09 Image processing system and component mounting machine WO2019030875A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019535522A JP6728501B2 (en) 2017-08-09 2017-08-09 Image processing system and component mounter
PCT/JP2017/029008 WO2019030875A1 (en) 2017-08-09 2017-08-09 Image processing system and component mounting machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/029008 WO2019030875A1 (en) 2017-08-09 2017-08-09 Image processing system and component mounting machine

Publications (1)

Publication Number Publication Date
WO2019030875A1

Family

ID=65271958

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/029008 WO2019030875A1 (en) 2017-08-09 2017-08-09 Image processing system and component mounting machine

Country Status (2)

Country Link
JP (1) JP6728501B2 (en)
WO (1) WO2019030875A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113455117A (en) * 2019-02-19 2021-09-28 株式会社富士 Component mounting machine

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6156573A (en) * 1984-08-27 1986-03-22 Fujitsu Ltd Detector for deterioration of light source of picture reader
JPH03214695A (en) * 1990-01-18 1991-09-19 Matsushita Electric Ind Co Ltd Image recognizer of electronic parts mounting equipment
JP2013033483A (en) * 2007-12-26 2013-02-14 Sharp Corp Method of generating one-dimensional histogram
JP2015211346A (en) * 2014-04-25 2015-11-24 キヤノンファインテック株式会社 Image reading device, duplication apparatus and image reading method



Also Published As

Publication number Publication date
JP6728501B2 (en) 2020-07-22
JPWO2019030875A1 (en) 2020-01-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17921067; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019535522; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17921067; Country of ref document: EP; Kind code of ref document: A1)