WO2021117319A1 - Component-imaging device and component-mounting device - Google Patents

Component-imaging device and component-mounting device

Info

Publication number
WO2021117319A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
mark
unit
imaging
image pickup
Application number
PCT/JP2020/036798
Other languages
French (fr)
Japanese (ja)
Inventor
康一 岡田
森 秀雄
鷹則 松田
Original Assignee
パナソニックIpマネジメント株式会社
Application filed by パナソニックIpマネジメント株式会社 (Panasonic IP Management Co., Ltd.)
Priority to CN202080082450.3A (CN114747308A)
Priority to JP2021563758A (JPWO2021117319A1)
Publication of WO2021117319A1

Classifications

    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05K: PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00: Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08: Monitoring manufacture of assemblages

Definitions

  • the present disclosure relates to a component imaging device that images a component and a component mounting device that mounts the component on a substrate.
  • the component mounting device has a component supply unit, a mounting head that horizontally moves the components supplied by the component supply unit, and a substrate transfer conveyor that holds the substrate.
  • the mounting head has a nozzle, and the nozzle takes out a component and mounts it on a substrate.
  • the parts mounting device described in Patent Document 1 includes a parts recognition camera (parts imaging device) and a substrate recognition camera (board imaging device).
  • the component recognition camera captures the position of the component held by the nozzle.
  • the board recognition camera moves integrally with the mounting head and captures a mark formed on the board.
  • In order to correct the positional relationship between the nozzle of the mounting head and the substrate recognition camera and thereby ensure component mounting accuracy, Patent Document 1 is configured as follows. A glass plate on which a reference mark is formed is placed on the upper surface of the component recognition camera. The glass plate is arranged so that the reference mark is within the field of view of the component recognition camera. The mounting head then moves so that the nozzle enters this field of view. In this state, the component recognition camera detects the positions of the nozzle and the reference mark. The substrate recognition camera calculates the positional relationship between the nozzle and the substrate recognition camera from the captured reference mark and the position of the optical axis of the substrate recognition camera.
  • the component imaging device of the present disclosure includes a component imaging unit, a mark, a mark imaging unit, and a misalignment amount calculation unit.
  • the component imaging unit is provided on the image sensor and images the component.
  • the mark is provided outside the field of view of the component imaging unit.
  • the mark imaging unit is provided on the image sensor and images the mark.
  • the misalignment amount calculation unit calculates the misalignment amount of the field of view of the component imaging unit based on the result of imaging the mark by the mark imaging unit.
  • the component mounting device of the present disclosure includes a nozzle, a component image pickup device, a substrate image pickup device, and an offset amount calculation unit.
  • the nozzle holds the component and mounts it on the board.
  • the component imaging device images the component held by the nozzle.
  • the substrate imaging device moves integrally with the nozzle to image the substrate.
  • the component imaging device includes a component imaging unit, a mark, and a mark imaging unit.
  • the component imaging unit is provided on the image sensor and images the component.
  • the mark is provided outside the field of view of the component imaging unit.
  • the mark imaging unit is provided on the image sensor and images the mark.
  • the offset amount calculation unit calculates the offset amount between the nozzle and the substrate imaging device based on the mark imaging result by the mark imaging unit of the component imaging device and the mark imaging result by the substrate imaging device.
  • the offset amount between the nozzle and the substrate imaging device can be calculated accurately during the production of the mounting substrate.
  • A diagram illustrating how a component held by a nozzle is imaged by the component recognition camera shown in FIG. 3B.
  • Functional block diagram showing the configuration of the control system of the component mounting device shown in FIGS. 1 and 2.
  • A diagram showing an example of a captured image of a component held by a nozzle, captured by the component imaging unit of the component recognition camera shown in FIG. 3B.
  • A diagram illustrating how a mark provided on the component recognition camera shown in FIG. 3B is imaged by the substrate recognition camera shown in FIG. 5.
  • the component recognition camera in the conventional component mounting device detects the position of the component held by the nozzle from the position of the optical axis of the camera.
  • However, in the process of continuously producing mounting boards, the optical axis of the camera may shift (become distorted) over time due to heat generated by the image sensor included in the component recognition camera. Therefore, in order to maintain component mounting accuracy, it is necessary to periodically detect the deviation of the optical axis of the component recognition camera and appropriately correct the component mounting position.
  • The prior art, including Patent Document 1, does not disclose a method of detecting the deviation of the optical axis of the component recognition camera during production of the mounting board, and there is room for further improvement in component mounting accuracy.
  • the present disclosure provides a component imaging device capable of detecting a deviation of the optical axis of a camera that images a component and correcting the mounting position of the component based on the deviation. That is, this component imaging device calculates the amount of positional deviation of the field of view of the component imaging unit, which is a camera that images components.
  • the present disclosure also provides a component mounting device capable of accurately calculating the offset amount between the nozzle and the board imaging device during the production of the mounting board.
  • the substrate is conveyed along the X-axis.
  • the Y-axis is orthogonal to the X-axis in the horizontal plane.
  • the Z-axis is orthogonal to the horizontal plane and extends from bottom to top.
  • FIG. 2 schematically shows a part of the component mounting device 1 in FIG.
  • the component mounting device 1 executes a component mounting operation for mounting the component D supplied from the component supply unit 4 on the substrate 3.
  • a substrate transport mechanism (hereinafter, transport mechanism) 2 is arranged along the X-axis in the center of the base 1a.
  • the transport mechanism 2 carries, positions, and holds the substrate 3 transported from the upstream to the mounting work position. Further, the transport mechanism 2 carries out the substrate 3 for which the component mounting work has been completed downstream.
  • Parts supply units 4 are arranged on both sides of the Y-axis of the transport mechanism 2.
  • a plurality of tape feeders 5 are mounted in parallel on each component supply unit 4.
  • the tape feeder 5 pitch-feeds the tape 11 having the pocket for storing the component D in the direction from the outside of the component supply unit 4 toward the transport mechanism 2 (tape feed direction).
  • the tape feeder 5 supplies the component D to the component supply position 5a (see FIG. 2) from which the component D is taken out by the mounting head 8, as described below.
  • the Y-axis table 6 is arranged so as to extend along the Y-axis at both ends of the upper surface of the base 1a on the X-axis.
  • Each Y-axis table 6 has a linear drive mechanism.
  • Beams 7 are coupled to the pair of Y-axis tables 6 so as to connect the Y-axis tables 6 to each other.
  • the beam 7 extends along the X axis.
  • the beam 7 can be moved along the Y-axis by the linear drive mechanism of the Y-axis table 6.
  • the beam 7 has a linear drive mechanism like the Y-axis table 6.
  • a mounting head 8 is mounted on the beam 7.
  • the mounting head 8 can be moved along the X-axis by the linear drive mechanism included in the beam 7.
  • As shown in FIG. 2, the mounting head 8 has a suction unit 8a.
  • the suction unit 8a can be raised and lowered, and holds the component D by sucking it.
  • a nozzle 8b that sucks and holds the component D is mounted on the lower end of the suction unit 8a.
  • the Y-axis table 6 and the beam 7 constitute a head moving mechanism (hereinafter, moving mechanism) 9 that moves the mounting head 8 along the X-axis and the Y-axis.
  • the moving mechanism 9 and the mounting head 8 pick up the component D by suction with the nozzle 8b from the component supply position 5a of the tape feeder 5 arranged in the component supply unit 4, and execute a mounting turn in which the component D is mounted at the mounting position on the substrate 3 positioned by the transport mechanism 2.
  • a parts recognition camera (hereinafter, first camera) 20 is arranged between the parts supply unit 4 and the transport mechanism 2.
  • first camera 20 takes an image of the component D held by the mounting head 8 and recognizes the posture of the component D.
  • a substrate recognition camera 30 (hereinafter referred to as a second camera) is attached to the plate 7a to which the mounting head 8 is attached. The second camera 30 moves integrally with the mounting head 8.
  • the second camera 30 moves above the substrate 3 positioned on the transport mechanism 2, images the substrate mark 3a provided on the substrate 3, and recognizes the position of the substrate 3.
  • the mounting position is corrected in consideration of the recognition result of the component D by the first camera 20 and the recognition result of the position of the board 3 by the second camera 30.
  • a carriage 10 is set in the component supply unit 4.
  • the carriage 10 has a feeder base 10a, and a plurality of tape feeders 5 are attached to the feeder base 10a in advance.
  • the carriage 10 holds the tape reel 12.
  • the tape reel 12 stores the tape 11 holding the component D in a wound state.
  • the tape 11 drawn from the tape reel 12 is pitch-fed to the component supply position 5a by the tape feeder 5.
  • a touch panel 13 operated by the operator is installed at a position where the operator works on the front surface of the component mounting device 1.
  • the touch panel 13 displays various information on the display unit thereof. Further, the operator inputs data from the touch panel 13 or operates the component mounting device 1 by using the operation buttons displayed on the display unit.
  • the first camera 20 includes a housing 21, a sensor substrate 23, a lens 24, a component lighting unit 25, and a transparent member 26.
  • the sensor board 23 is installed on the bottom 21a inside the housing 21.
  • An image sensor 22 such as a two-dimensional CMOS (complementary metal oxide semiconductor) sensor is mounted on the sensor substrate 23.
  • the lens 24 is installed above the sensor substrate 23 in the housing 21, and the component illumination unit 25 is installed above the lens 24.
  • the transparent member 26 is installed at a portion of the upper portion 21b of the housing 21 in which a part of the housing 21 is cut out.
  • the transparent member 26 is made of a plate-shaped glass or the like that transmits light.
  • a mark 27 for detecting the deviation of the optical axis of the image sensor 22 is provided on a part of the upper surface 26c of the transparent member 26.
  • the mark 27 is arranged outside the component imaging field of view 26a used when the image sensor 22 images the component D held by the nozzle 8b, and within the mark imaging field of view 26b used when the image sensor 22 images the mark 27.
  • In the mark imaging field of view 26b, five disks formed of a metal thin film that does not transmit light are arranged at equal intervals along the Y-axis as the marks 27. The shape, number, and arrangement of the marks 27 shown in FIG. 3A are an example and are not limited to this example.
  • the transparent member 26 is provided on the first optical path Pa along which the component imaging unit 22a images the component D and on the second optical path Pb along which the mark imaging unit 22b images the mark 27.
  • the image pickup device 22 has a component image pickup unit 22a and a mark image pickup unit 22b arranged side by side along the X axis.
  • the component imaging unit 22a and the mark imaging unit 22b each extend along the Y-axis. That is, for each unit, the length along the X-axis is shorter than the length along the Y-axis.
  • the component imaging unit 22a images the component D held by the nozzle 8b.
  • the mark imaging unit 22b images the mark 27 provided on the transparent member 26.
  • the component imaging unit 22a images the component D whose lower surface Da is located at the component recognition height Ha, which is set above the mark recognition height Hb of the upper surface 26c of the transparent member 26.
  • the image sensor 22 is provided with a component image pickup unit 22a for imaging the component D and a mark image pickup unit 22b for imaging the mark 27.
  • the mark 27 is provided in the mark imaging field of view 26b, which is outside the component imaging field of view 26a, which is the field of view of the component imaging unit 22a.
  • the component imaging unit 22a is provided in one part of the image sensor 22, and the mark imaging unit 22b is provided in a portion of the image sensor 22 different from the component imaging unit 22a.
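Purely as an illustration of this sensor layout (the publication describes hardware only; the region-of-interest model, pixel counts, and names below are invented), the two portions of a single image sensor could be represented as two rectangular regions placed side by side along the X-axis:

```python
# Illustrative only: two regions of interest (ROIs) on one image sensor,
# standing in for the component imaging unit 22a and the mark imaging unit 22b.
# The sensor size, ROI widths, and the assumed elongation along Y are invented.
from dataclasses import dataclass


@dataclass(frozen=True)
class Roi:
    x0: int      # left pixel column
    y0: int      # top pixel row
    width: int
    height: int


SENSOR_WIDTH, SENSOR_HEIGHT = 2048, 2048  # assumed pixel counts

# Component imaging unit 22a: one strip of the sensor.
COMPONENT_ROI = Roi(x0=0, y0=0, width=1536, height=SENSOR_HEIGHT)
# Mark imaging unit 22b: the adjacent strip, in a different portion of the sensor.
MARK_ROI = Roi(x0=1536, y0=0, width=512, height=SENSOR_HEIGHT)

assert COMPONENT_ROI.x0 + COMPONENT_ROI.width == MARK_ROI.x0  # side by side along X
assert MARK_ROI.height > MARK_ROI.width                       # assumed elongation along Y
```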
  • the detection accuracy of the amount of displacement of the visual field of the component image pickup unit 22a is improved, as will be described later.
  • the lens 24 arranged above the image sensor 22 forms, on the component imaging unit 22a, an image of an imaging object (such as the component D) located above the transparent member 26 at the component recognition height Ha. The lens 24 also forms an image of the mark 27, which is formed on the transparent member 26, on the mark imaging unit 22b. The mark 27 is located at the mark recognition height Hb.
  • the physical distance Lpa of the first optical path Pa from the component imaging unit 22a to the imaging object located at the component recognition height Ha differs from the physical distance Lpb of the second optical path Pb from the mark imaging unit 22b to the mark 27.
  • a correction member 28 having a refractive index different from that of air and made of glass or the like that transmits light is installed on the component imaging unit 22a.
  • the refractive index and the thickness T of the correction member 28 are set so as to correct the optical path length Loa (optical distance) of the first optical path Pa and to form an image on each of the component imaging unit 22a and the mark imaging unit 22b.
  • the lens 24 can simultaneously focus on the image of the imaging object to be imaged on the component imaging unit 22a and the image of the mark 27 to be imaged on the mark imaging unit 22b.
  • In this example the correction member 28 is installed on the component imaging unit 22a, but the correction member 28 may be installed anywhere along either the first optical path Pa or the second optical path Pb. That is, the first camera 20 has a correction member 28 that corrects either the optical path length Loa of the first optical path Pa, along which the component imaging unit 22a images the component D, or the optical path length Lob of the second optical path Pb, along which the mark imaging unit 22b images the mark 27. Correction members 28 having different refractive indices and/or thicknesses T may be installed in both the first optical path Pa and the second optical path Pb. Further, if the depth of field of the lens 24 is deep enough that the imaging object and the mark 27 can be brought into focus simultaneously without the correction member 28, the correction member 28 can be omitted.
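For reference, a standard paraxial-optics relation, which is not stated in the publication and is offered only as an illustration of why such a correction member can reconcile the two different physical distances Lpa and Lpb: inserting a plane-parallel plate of thickness $T$ and refractive index $n$ into an imaging path shifts the position of best focus along the optical axis by approximately

$$\Delta \approx T\left(1 - \frac{1}{n}\right).$$

Choosing $T$ and $n$ so that this shift compensates the difference between the two optical path lengths is one way the lens 24 could bring both the component D at the component recognition height Ha and the mark 27 at the mark recognition height Hb into focus at the same time.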
  • the component illumination unit 25 includes a side illumination unit 25a, a coaxial illumination unit 25b, and a half mirror 25c.
  • the side illumination unit 25a has a plurality of light emitting diode (LED) chips and the like, and emits light that illuminates, from diagonally below, the imaging object (such as the component D) located at the component recognition height Ha and the mark 27.
  • the coaxial illumination unit 25b has a plurality of LED chips and the like, and is arranged below the side illumination unit 25a.
  • the half mirror 25c is arranged in the middle of the first optical path Pa and the second optical path Pb.
  • the half mirror 25c reflects the light emitted from the coaxial illumination unit 25b toward the image pickup object (part D or the like) located above the half mirror 25c at the component recognition height Ha and the mark 27. That is, the coaxial illumination unit 25b and the half mirror 25c illuminate the image pickup object and the mark 27 coaxially with the first optical path Pa and the second optical path Pb.
  • the component illumination unit 25 switches between or combines the illumination by the side illumination unit 25a and the illumination by the coaxial illumination unit 25b according to the material or other properties of the object to be imaged.
  • the first camera 20 has an illumination unit (parts illumination unit 25) that illuminates the mark 27.
  • the mounting head 8 holding the component D with the nozzle 8b positions the lower surface Da of the component D at the component recognition height Ha and moves along the X-axis, as indicated by arrow a, so that the component D passes over the first optical path Pa of the first camera 20. During this time, the component illumination unit 25 illuminates the component D.
  • the first camera 20 transmits the image pickup result imaged by the image pickup element 22 (component image pickup unit 22a, mark image pickup unit 22b) to the control unit 40 shown in FIG.
  • the second camera 30 has a housing 31, a window member 34, a camera unit 32 installed inside the housing 31, and a substrate lighting unit 33.
  • the window member 34 is made of glass or the like that transmits light, and is installed in a portion of the lower portion of the housing 31 that is cut out.
  • the camera unit 32 is composed of an image sensor such as a two-dimensional CMOS sensor whose optical axis is directed downward, a lens, and the like.
  • the substrate illumination unit 33 includes a side illumination unit 33a that illuminates the imaging target below, such as the substrate 3, from diagonally above, a coaxial illumination unit 33b, and a half mirror 33c.
  • the half mirror 33c reflects the light emitted from the coaxial illumination unit 33b toward the imaging target below.
  • the substrate illumination unit 33 switches between or combines the illumination by the side illumination unit 33a and the illumination by the coaxial illumination unit 33b according to the material or other properties of the object to be imaged by the camera unit 32.
  • the second camera 30 images an imaging target such as a substrate mark 3a formed on the substrate 3 and a mark 27 provided on the transparent member 26 of the first camera 20.
  • the second camera 30 moves above the image pickup target together with the mounting head 8, and the camera unit 32 images the image pickup target while the substrate illumination unit 33 illuminates the image pickup target.
  • the second camera 30 is a substrate imaging device that has a substrate illumination unit 33 for illuminating the substrate mark 3a formed on the substrate 3 and the mark 27, and that moves integrally with the nozzle 8b to image the substrate 3 and the mark 27.
  • the second camera 30 transmits the image pickup result imaged by the camera unit 32 to the control unit 40.
  • the component mounting device 1 includes a control unit 40, a transport mechanism 2, a tape feeder 5, a mounting head 8, a moving mechanism 9, a first camera 20, a second camera 30, and a touch panel 13.
  • the control unit 40 has an imaging processing unit (hereinafter, first processing unit) 41, a misalignment amount calculation unit (hereinafter, first calculation unit) 42, an offset amount calculation unit (hereinafter, second calculation unit) 43, a mounting operation processing unit (hereinafter, second processing unit) 44, and a mounting storage unit (hereinafter, storage unit) 45.
  • the storage unit 45 stores mounting data 46, imaging data 47, a mark visual field misalignment amount (hereinafter, first misalignment amount) 48, a component visual field misalignment amount (hereinafter, second misalignment amount) 49, a substrate visual field misalignment amount (hereinafter, third misalignment amount) 50, an offset amount 51, and the like.
  • the first processing unit 41, the first calculation unit 42, the second calculation unit 43, and the second processing unit 44 that constitute the control unit 40 are each configured by a CPU (central processing unit) or an LSI (large-scale integrated circuit). A memory may be included if desired. These units may be configured by dedicated circuits, or may be realized by controlling general-purpose hardware with software read from a transitory or non-transitory storage device or recording medium. Two or more of these units may be integrally configured.
  • the storage unit 45 is composed of a rewritable RAM, a flash memory, a hard disk, and the like. The storage unit 45 may be configured as a plurality of memories that individually store the mounting data 46 through the offset amount 51, or may be integrally configured to store these data collectively.
  • the mounting data 46 includes information necessary for component mounting work, such as the size of the board 3, the type of component D to be mounted, and the mounting position (XY coordinates) for each type of board 3.
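As an illustration only of the data items 46 to 51 listed above (the publication does not define any data format; the field names and types below are assumptions):

```python
# Illustrative sketch of the data items held by the storage unit 45:
# mounting data 46, imaging data 47, the first/second/third misalignment
# amounts 48-50, and the offset amount 51. Names and types are assumptions.
from dataclasses import dataclass, field


@dataclass
class FieldShift:
    dx: float = 0.0      # shift along the X-axis
    dy: float = 0.0      # shift along the Y-axis
    dtheta: float = 0.0  # rotation about the Z-axis


@dataclass
class MountingStorage:
    mounting_data: dict = field(default_factory=dict)   # board size, component types, mounting XY coordinates (46)
    imaging_data: dict = field(default_factory=dict)    # raw imaging results from the cameras (47)
    first_misalignment: FieldShift = field(default_factory=FieldShift)   # mark field shift (48)
    second_misalignment: FieldShift = field(default_factory=FieldShift)  # component field shift (49)
    third_misalignment: FieldShift = field(default_factory=FieldShift)   # board-camera field shift (50)
    offset_amount: tuple = (0.0, 0.0)                    # nozzle-to-board-camera offset (51)
```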
  • the first processing unit 41 controls the mark imaging unit 22b and the component lighting unit 25 of the first camera 20 when detecting (calculating) the positional deviation of the imaging field of view of the first camera 20. Specifically, the first processing unit 41 causes the mark imaging unit 22b to image the mark 27 while illuminating the mark 27 provided on the transparent member 26 with the component lighting unit 25. The first processing unit 41 stores the imaging result as imaging data 47 in the storage unit 45.
  • the imaging result of the mark 27 imaged by the first camera 20 will be described.
  • the first camera 20 repeatedly executes imaging of the component D held by the nozzle 8b in the component mounting work.
  • the sensor substrate 23 and the housing 21 may be distorted over time due to the heat generated by the image sensor 22 and the component lighting unit 25.
  • FIG. 7 shows a state in which the sensor substrate 23 is distorted by such a change over time, the image sensor 22 is displaced in the XY plane, and the first optical path Pa (optical axis) of the component imaging unit 22a and the second optical path Pb (optical axis) of the mark imaging unit 22b deviate from their initial state.
  • In this state, the first optical path Pa is deviated by the misalignment amount ΔXd, and the second optical path Pb is deviated by the misalignment amount ΔXm.
  • In the initial state, the position of the first optical path Pa at the component recognition height Ha is defined as the initial component visual field center position Cd0, and the position of the second optical path Pb at the mark recognition height Hb is defined as the initial mark visual field center position Cm0.
  • In the state where the image sensor 22 is displaced, the position of the first optical path Pa at the component recognition height Ha is defined as the component visual field center position Cd, and the position of the second optical path Pb at the mark recognition height Hb is defined as the mark visual field center position Cm.
  • FIG. 8 shows a mark image 60, which is an image of the mark 27 captured by the first camera 20 in a state where the image sensor 22 is displaced in the XY plane as shown in FIG. 7.
  • In the mark image 60, the five marks 27 imaged by the mark imaging unit 22b are shown side by side.
  • the center of the third mark 27 (hereinafter, the center mark 27A) of the five marks 27 arranged side by side is the initial mark visual field center position Cm0.
  • the center 60c of the mark image 60 is the mark visual field center position Cm.
  • In the initial state, the initial mark visual field center position Cm0 is located at the center 60c of the mark image 60, and the five marks 27 are at the positions indicated by the dotted lines. In FIG. 8, the position of the second optical path Pb at the mark recognition height Hb is displaced from the center of the central mark 27A due to the displacement of the image sensor 22.
  • the first calculation unit 42 shown in FIG. 6 calculates the first misalignment amount 48, which is the amount of misalignment of the field of view of the mark imaging unit 22b from its initial state, based on the mark image 60, and stores it in the storage unit 45. Specifically, the first calculation unit 42 calculates the deviation amount ΔXm on the X-axis and the deviation amount ΔYm on the Y-axis from the position of the central mark 27A with respect to the center 60c of the mark image 60 shown in FIG. 8. Further, the deviation amount Δθm in the θ direction, which is the direction of rotation about the Z-axis, is calculated from the inclination of the straight line connecting the centers of the five marks 27.
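A minimal Python sketch of the calculation just described, assuming the mark centers have already been detected in the mark image 60; mark detection, pixel-to-millimeter conversion, the function name, and the sign convention are assumptions added for illustration:

```python
import math


def first_misalignment(mark_centers, image_center):
    """Return (dXm, dYm, dThetam) for the mark imaging field of view.

    mark_centers: five (x, y) centers detected in the mark image 60, ordered
                  along the row of marks; the third entry is the central mark 27A.
    image_center: (x, y) of the image center 60c (the current field center Cm).
    """
    cx, cy = image_center
    mx, my = mark_centers[2]          # central mark 27A marks the initial center Cm0
    d_xm = mx - cx                    # position of mark 27A relative to the center 60c (sign convention is a choice)
    d_ym = my - cy

    # Inclination of the straight line through the five centers (least-squares fit).
    n = len(mark_centers)
    xs = [p[0] for p in mark_centers]
    ys = [p[1] for p in mark_centers]
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # The marks run along the Y-axis, so fit x as a function of y.
    num = sum((y - mean_y) * (x - mean_x) for x, y in mark_centers)
    den = sum((y - mean_y) ** 2 for y in ys)
    d_theta_m = math.atan2(num, den)  # rotation about the Z-axis, in radians
    return d_xm, d_ym, d_theta_m


# Example: central mark shifted by (+4, 0) px from the image center,
# row of marks tilted by about 0.0025 rad (all values invented).
centers = [(515.0 + 0.5 * i, 100.0 + 200.0 * i) for i in range(5)]
print(first_misalignment(centers, image_center=(512.0, 500.0)))
```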
  • FIG. 9 shows a component captured image (hereinafter, first image) 61, which is an image of the component D captured by the first camera 20 in a state where the image sensor 22 shown in FIG. 7 is displaced in the XY plane.
  • the nozzle 8b holds the center of the component D.
  • the center of the component D shown in the first image 61 is the initial component visual field center position Cd0.
  • the center 61c of the first image 61 is the component field of view center position Cd.
  • In the initial state, the initial component visual field center position Cd0 is located at the center 61c of the first image 61, and the component D and the nozzle 8b holding the component D are at the positions indicated by the dotted lines.
  • the second misalignment amount 49, which is the amount of misalignment of the field of view of the component imaging unit 22a from its initial state based on the first image 61, is represented by the deviation amount ΔXd on the X-axis and the deviation amount ΔYd on the Y-axis. That is, the deviation of the position of the center of the component D with respect to the center 61c of the first image 61 gives the deviation amounts ΔXd and ΔYd. Further, the degree of rotation of the component D (the inclination of each side of the component D, etc.) gives the deviation amount Δθd in the θ direction, which is the direction of rotation about the Z-axis.
  • the component image pickup unit 22a and the mark image pickup unit 22b are formed on a single image pickup element 22. Therefore, due to changes over time in the first camera 20, the deviation of the first optical path Pa of the component imaging unit 22a and the deviation of the second optical path Pb of the mark imaging unit 22b are interlocked with each other. Therefore, the first misalignment amount 48 and the second misalignment amount 49 have a predetermined relationship (for example, a proportional relationship) determined by the configuration of the first camera 20. That is, the second misalignment amount 49 can be calculated from the first misalignment amount 48.
  • the first calculation unit 42 executes a predetermined calculation on the first misalignment amount 48 calculated from the mark image 60 to obtain the second misalignment amount 49, and stores it in the storage unit 45.
  • the predetermined calculation is, for example, an operation in which the X-axis component and the Y-axis component are multiplied by predetermined coefficients and the θ-direction component is left as it is. That is, the first calculation unit 42 calculates the misalignment amount of the field of view of the component imaging unit 22a (the second misalignment amount 49) based on the mark image 60, which is the result of imaging the mark 27 by the mark imaging unit 22b. The component imaging unit 22a, the mark imaging unit 22b, the mark 27, and the first calculation unit 42 thus constitute a component imaging device that images the component D held by the nozzle 8b and calculates the second misalignment amount 49.
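A minimal sketch of this predetermined calculation, assuming the proportional model given as an example in the text; the coefficient values are invented and would in practice be determined by the configuration of the first camera 20:

```python
# Deriving the second misalignment amount 49 (component field) from the first
# misalignment amount 48 (mark field): X and Y components are multiplied by
# predetermined coefficients, the theta component is kept as it is.
KX = 1.02  # assumed X-axis coefficient
KY = 1.02  # assumed Y-axis coefficient


def second_misalignment(first):
    """first = (dXm, dYm, dThetam) -> (dXd, dYd, dThetad)."""
    d_xm, d_ym, d_theta_m = first
    return (KX * d_xm, KY * d_ym, d_theta_m)


print(second_misalignment((4.0, 0.0, 0.0025)))  # -> approximately (4.08, 0.0, 0.0025)
```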
  • the second processing unit 44 shown in FIG. 6 controls each unit of the component mounting device 1 based on the mounting data 46. Specifically, the nozzle 8b of the mounting head 8 is controlled to hold the component D supplied by the tape feeder 5. Then, the first camera 20 is made to detect the misalignment of the component D held by the nozzle 8b. Further, the nozzle 8b is controlled to mount the component D on the substrate 3 held at the mounting work position. The second processing unit 44 controls such a series of component mounting operations. At that time, the second processing unit 44 corrects the position of the component D with respect to the nozzle 8b detected by the first camera 20 based on the second misalignment amount 49 stored in the storage unit 45.
  • the mark imaging unit 22b can image the mark 27 while the mounting head 8 takes out the component D from the tape feeder 5 and mounts the component D on the substrate 3. That is, the mark imaging unit 22b can image the mark 27 independently of the operation of the mounting head 8 and the like, in the spare time when the mounting head 8 is not above the first camera 20. Therefore, the mark imaging unit 22b can image the mark 27 in this spare time and update the second misalignment amount 49 to the latest value. As a result, the deviation of the optical axis of the first camera 20 that images the component D (the second misalignment amount 49) is detected and the position where the component D is mounted is corrected without reducing the efficiency of the component mounting work, so that the component D can be mounted at the correct mounting position with high accuracy.
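The publication states only that the recognized position of the component D relative to the nozzle 8b is corrected using the second misalignment amount 49; the sketch below shows one plausible way such a correction could look, with the additive model and sign convention being assumptions:

```python
import math


def corrected_component_pose(measured_pose, second_misalignment):
    """measured_pose and the returned pose are (x, y, theta) of the component D
    relative to the nozzle 8b, as recognized in the first image 61."""
    x, y, theta = measured_pose
    d_xd, d_yd, d_theta_d = second_misalignment
    # Remove the apparent shift caused by the drift of the component imaging field
    # (one plausible convention; the publication does not give the formula).
    return (x - d_xd, y - d_yd, theta - d_theta_d)


print(corrected_component_pose((10.0, -5.0, math.radians(0.3)), (4.08, 0.0, 0.0025)))
```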
  • the first processing unit 41 shown in FIG. 6 detects (calculates) the misalignment of the mounting head 8, the nozzle 8b, and the second camera 30 due to the temporal distortion of the moving mechanism 9. Specifically, the first processing unit 41 controls the moving mechanism 9, the second camera 30, and the first camera 20 to cause the camera unit 32 to image the mark 27 formed on the transparent member 26. The first processing unit 41 stores the imaging result as imaging data 47 in the storage unit 45.
  • the first processing unit 41 shown in FIG. 6 first controls the moving mechanism 9 to move the second camera 30 to a position where it images the mark 27 provided on the first camera 20. Next, the first processing unit 41 controls the component illumination unit 25 of the first camera 20 to illuminate the mark 27 provided on the transparent member 26 from below, and controls the camera unit 32 to image the mark 27.
  • the component illumination unit 25 illuminates the mark 27 with transmitted illumination.
  • the camera unit 32 may take an image of the mark 27 while the substrate illumination unit 33 illuminates the mark 27.
  • in this case, the component illumination unit 25 serves as transmitted illumination.
  • transmitted illumination is an example of an illumination type, and the illumination is not limited to this example.
  • the moving mechanism 9 repeatedly moves the mounting head 8 in the component mounting work.
  • the moving mechanism 9 may be distorted over time due to heat generated by the linear drive mechanism of the moving mechanism 9.
  • the control unit 40 controls the positions of the mounting head 8 and the nozzle 8b with a base point (not shown) set in the component mounting device 1 as the origin, and its various control parameters are set assuming an initial state in which the moving mechanism 9 is not distorted. If the moving mechanism 9 is distorted, the mounting head 8 and the nozzle 8b deviate from their initial positions, so correction is required.
  • FIG. 10 shows a state in which, although the first processing unit 41 controls the moving mechanism 9 so that the third optical path Ph of the camera unit 32 coincides with the center of the mark 27, the second camera 30 is misaligned and stopped due to the distortion of the moving mechanism 9.
  • the second camera 30 including the camera unit 32 moves integrally with the mounting head 8 by the moving mechanism 9.
  • the position on the mark 27 of the third optical path Ph is defined as the initial substrate visual field center position Ch0.
  • the position of the third optical path Ph on the transparent member 26 when the moving mechanism 9 is distorted is defined as the substrate visual field center position Ch.
  • FIG. 11 shows a substrate recognition camera image (hereinafter, second image) 62 of the mark 27 captured by the camera unit 32 of the second camera 30 in a state where the moving mechanism 9 is distorted as shown in FIG. 10.
  • the center 62c of the second image 62 is the substrate field of view center position Ch.
  • the center of the central mark 27A, the third of the five marks 27 shown in the second image 62, is the initial substrate visual field center position Ch0.
  • the initial substrate field of view center position Ch0 is located at the center 62c, and the five marks 27 are at the positions indicated by the dotted lines.
  • the second calculation unit 43 shown in FIG. 6 calculates the third misalignment amount 50, which is the amount of misalignment of the field of view of the camera unit 32 from its initial state, based on the second image 62 shown in FIG. 11, and stores it in the storage unit 45. Specifically, the second calculation unit 43 calculates the deviation amount ΔXh on the X-axis and the deviation amount ΔYh on the Y-axis from the position of the central mark 27A with respect to the center 62c of the second image 62. Further, the deviation amount Δθh in the θ direction, which is the direction of rotation about the Z-axis, is calculated from the inclination of the straight line connecting the centers of the five marks 27.
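The calculation for the second image 62 has the same form as the one sketched earlier for the mark image 60. As a compact illustration of the translational part only (pixel values invented; the rotation Δθh follows from the same line-inclination fit shown earlier):

```python
# Translational part of the third misalignment amount 50: shift of the central
# mark 27A in the second image 62 relative to the image center 62c (position Ch).
center_62c = (640.0, 512.0)        # center of the second image 62 (assumed pixel values)
central_mark_27a = (637.5, 515.0)  # detected center of the central mark 27A (assumed)
d_xh = central_mark_27a[0] - center_62c[0]  # -2.5
d_yh = central_mark_27a[1] - center_62c[1]  # +3.0
print(d_xh, d_yh)
```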
  • the second processing unit 44 controls the second camera 30 to image the substrate mark 3a of the substrate 3 held at the mounting work position and recognizes the position of the substrate 3. At that time, the second processing unit 44 corrects the recognized stop position of the substrate 3 based on the third misalignment amount 50 stored in the storage unit 45. Further, in the component mounting work, it is necessary to correct the position of the nozzle 8b with respect to the base point set in the component mounting device 1 and mount the component D held by the nozzle 8b at a predetermined mounting position on the substrate 3. In order to correct the position of the nozzle 8b, in addition to the distortion of the moving mechanism 9, it is necessary to consider the distortion of the first camera 20 that images the component D held by the nozzle 8b, described above.
  • the second calculation unit 43 calculates the offset amount 51 between the nozzle 8b and the second camera 30 based on the third misalignment amount 50 and the first misalignment amount 48 in addition to the positional relationship (Xn, Yn) between the second camera 30 and the nozzle 8b in the mounting head 8. The second calculation unit 43 then stores the offset amount 51 in the storage unit 45.
  • the first misalignment amount 48 is the misalignment amount of the field of view of the first camera 20 that represents the distortion of the first camera 20.
  • the second processing unit 44 corrects the position of the nozzle 8b based on the mounting data 46 and the offset amount 51 stored in the storage unit 45, and controls the nozzle 8b to mount the component D held by the nozzle 8b at the mounting position on the substrate 3.
  • the amount of misalignment of the field of view of the first camera 20 is calculated by the first calculation unit 42 based on the result of capturing the mark by the mark imaging unit 22b of the first camera 20. That is, the first calculation unit 42 calculates the first misalignment amount 48 based on the mark image 60 shown in FIG.
  • the first processing unit 41 controls the substrate illumination unit 33 of the second camera 30 to illuminate the mark 27 provided on the transparent member 26 from above.
  • the mark imaging unit 22b is controlled to image the mark 27. That is, when the first camera 20 images the mark 27, the substrate illumination unit 33 illuminates the mark 27 with transmitted illumination. Thereby, for example, even when the transparent member 26 is dirty, the mark 27 can be clearly imaged.
  • the second calculation unit 43 calculates the offset amount 51 between the nozzle 8b and the second camera 30 based on the result of imaging the mark 27 by the mark imaging unit 22b of the first camera 20, which is the component imaging device, and the result of imaging the mark 27 by the second camera 30, which is the substrate imaging device. That is, the second calculation unit 43 calculates the offset amount 51 based on the mark image 60 and the substrate recognition camera image 62.
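The extracted text names the inputs of this calculation but not the exact formula. The sketch below shows one plausible combination of the translational components only, offered purely as an illustration; the function name, signs, and example values are assumptions:

```python
# One plausible combination of the quantities named in the description:
# the nominal positional relationship (Xn, Yn) between the second camera 30 and
# the nozzle 8b, the third misalignment amount 50 (board-camera field shift),
# and the first misalignment amount 48 (mark field shift of the first camera 20).
def offset_amount(nominal_xy, third_shift_xy, first_shift_xy):
    """Return the offset amount 51 between the nozzle 8b and the second camera 30
    (translational components only; sign convention is an assumption)."""
    xn, yn = nominal_xy
    d_xh, d_yh = third_shift_xy  # measured from the second image 62
    d_xm, d_ym = first_shift_xy  # measured from the mark image 60
    # Correct (Xn, Yn) by the board camera's measured shift and compensate for
    # the first camera's own field drift.
    return (xn + d_xh - d_xm, yn + d_yh - d_ym)


print(offset_amount((30.0, 0.0), (-2.5, 3.0), (4.0, 0.0)))  # -> (23.5, 3.0)
```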
  • the component mounting device 1 has a component imaging device (the first camera 20 and the first calculation unit 42) that images the component D held by the nozzle 8b, a substrate imaging device (the second camera 30) that moves integrally with the nozzle 8b and images the substrate 3, and an offset amount calculation unit (second calculation unit) 43 that calculates the offset amount 51.
  • Thereby, the offset amount 51 between the nozzle 8b and the substrate imaging device (the second camera 30) can be calculated with high accuracy.
  • the component imaging device of the present disclosure can calculate the deviation of the optical axis of the camera that images the component. Further, the component mounting device of the present disclosure can accurately calculate the offset amount between the nozzle and the board imaging device during the production of the mounting board. Therefore, all of them are useful in the field of mounting components on a substrate.
  • 1 Component mounting device; 1a Base; 2 Substrate transport mechanism (transport mechanism); 3 Substrate; 3a Substrate mark; 4 Component supply unit; 5 Tape feeder; 5a Component supply position; 6 Y-axis table; 7 Beam; 7a Plate; 8 Mounting head; 8a Suction unit; 8b Nozzle; 9 Head moving mechanism (moving mechanism); 10 Carriage; 10a Feeder base; 11 Tape; 12 Tape reel; 13 Touch panel; 20 Component recognition camera (first camera); 21, 31 Housing; 21a Bottom; 21b Upper portion; 22 Image sensor; 22a Component imaging unit; 22b Mark imaging unit; 23 Sensor board; 24 Lens; 25 Component illumination unit; 25a, 33a Side illumination unit; 25b, 33b Coaxial illumination unit; 25c, 33c Half mirror; 26 Transparent member; 26a Component imaging field of view; 26b Mark imaging field of view; 26c Upper surface; 27 Mark; 27A Central mark; 28 Correction member; 30 Substrate recognition camera (second camera); 32 Camera unit; 33 Substrate illumination unit; 34 Window member; 40 Control unit; 41 Imaging processing unit (first processing unit); 42 Misalignment amount calculation unit (first calculation unit); 43 Offset amount calculation unit (second calculation unit); 44 Mounting operation processing unit (second processing unit); 45 Mounting storage unit (storage unit)

Abstract

A component-imaging device, having a component-imaging unit, a mark, a mark-imaging unit, and a positional displacement amount calculation unit. The component-imaging unit is provided to an imaging element and images a component. The mark is provided outside of the field-of-view of the component-imaging unit. The mark-imaging unit is provided to the imaging element and images the mark. The positional displacement amount calculation unit calculates, on the basis of the result of imaging of the mark by the mark-imaging unit, the positional displacement amount of the field-of-view of the component-imaging unit.

Description

Component imaging device and component mounting device

The present disclosure relates to a component imaging device that images a component and a component mounting device that mounts the component on a substrate.

A component mounting device has a component supply unit, a mounting head that horizontally moves the components supplied by the component supply unit, and a substrate transport conveyor that holds the substrate. The mounting head has a nozzle, and the nozzle takes out a component and mounts it on a substrate. In order to improve component mounting accuracy, for example, the component mounting device described in Patent Document 1 includes a component recognition camera (component imaging device) and a substrate recognition camera (substrate imaging device). The component recognition camera images the component held by the nozzle to capture its position. The substrate recognition camera moves integrally with the mounting head and images a mark formed on the substrate.

The mounting head is removable from the main body of the component mounting device. In order to correct the positional relationship between the nozzle of such a mounting head and the substrate recognition camera and thereby ensure component mounting accuracy, Patent Document 1 is configured as follows. A glass plate on which a reference mark is formed is placed on the upper surface of the component recognition camera. The glass plate is arranged so that the reference mark is within the field of view of the component recognition camera. The mounting head then moves so that the nozzle enters this field of view. In this state, the component recognition camera detects the positions of the nozzle and the reference mark. The substrate recognition camera calculates the positional relationship between the nozzle and the substrate recognition camera from the captured reference mark and the position of the optical axis of the substrate recognition camera.

Patent Document 1: Japanese Unexamined Patent Publication No. 2004-179636
The component imaging device of the present disclosure includes a component imaging unit, a mark, a mark imaging unit, and a misalignment amount calculation unit. The component imaging unit is provided on an image sensor and images a component. The mark is provided outside the field of view of the component imaging unit. The mark imaging unit is provided on the image sensor and images the mark. The misalignment amount calculation unit calculates the misalignment amount of the field of view of the component imaging unit based on the result of imaging the mark by the mark imaging unit.

The component mounting device of the present disclosure includes a nozzle, a component imaging device, a substrate imaging device, and an offset amount calculation unit. The nozzle holds a component and mounts it on a substrate. The component imaging device images the component held by the nozzle. The substrate imaging device moves integrally with the nozzle and images the substrate. The component imaging device includes a component imaging unit, a mark, and a mark imaging unit. The component imaging unit is provided on an image sensor and images the component. The mark is provided outside the field of view of the component imaging unit. The mark imaging unit is provided on the image sensor and images the mark. The offset amount calculation unit calculates the offset amount between the nozzle and the substrate imaging device based on the result of imaging the mark by the mark imaging unit of the component imaging device and the result of imaging the mark by the substrate imaging device.

According to the present disclosure, the deviation of the optical axis of the camera that images the component over time can be calculated, and the deviation of the component mounting position caused by this deviation can be corrected. In addition, the offset amount between the nozzle and the substrate imaging device can be calculated accurately during the production of the mounting board.
Brief description of the drawings:
Top view of the component mounting apparatus according to the embodiment of the present disclosure.
A perspective side view showing the configuration of the component mounting device shown in FIG. 1.
Top view of the transparent member of the component recognition camera included in the component mounting device shown in FIGS. 1 and 2.
Configuration explanatory view of the component recognition camera included in the component mounting device shown in FIG. 2.
Top view of the sensor board of the component recognition camera included in the component mounting device shown in FIGS. 1 and 2.
A diagram illustrating how a component held by a nozzle is imaged by the component recognition camera shown in FIG. 3B.
Configuration explanatory view of the substrate recognition camera included in the component mounting device shown in FIGS. 1 and 2.
Functional block diagram showing the configuration of the control system of the component mounting device shown in FIGS. 1 and 2.
A diagram explaining the deviation of the optical axis caused by heat generation in the component recognition camera shown in FIG. 3B.
A diagram showing an example of a captured image of the marks captured by the mark imaging unit of the component recognition camera shown in FIG. 3B.
A diagram showing an example of a captured image of a component held by a nozzle, captured by the component imaging unit of the component recognition camera shown in FIG. 3B.
A diagram illustrating how a mark provided on the component recognition camera shown in FIG. 3B is imaged by the substrate recognition camera shown in FIG. 5.
A diagram showing an example of a captured image of the marks provided on the component recognition camera, captured by the substrate recognition camera shown in FIG. 10.
Prior to describing the embodiment of the present disclosure, the background that led to the idea of the present disclosure will be briefly explained. The component recognition camera in a conventional component mounting device detects the position of the component held by the nozzle from the position of the optical axis of the camera. However, in the process of continuously producing mounting boards, the optical axis of the camera may shift (become distorted) over time due to heat generated by the image sensor included in the component recognition camera. Therefore, in order to maintain component mounting accuracy, it is necessary to periodically detect the deviation of the optical axis of the component recognition camera and appropriately correct the component mounting position. The prior art, including Patent Document 1, does not disclose a method of detecting the deviation of the optical axis of the component recognition camera during production of the mounting board, and there is room for further improvement in component mounting accuracy.

The present disclosure provides a component imaging device capable of detecting the deviation of the optical axis of a camera that images a component and correcting the component mounting position based on that deviation. That is, this component imaging device calculates the amount of positional deviation of the field of view of the component imaging unit, which is the camera that images components. The present disclosure also provides a component mounting device capable of accurately calculating the offset amount between the nozzle and the substrate imaging device during production of the mounting board.

The embodiment of the present disclosure will now be described in detail with reference to the drawings. The configurations, shapes, and the like described below are examples for explanation and can be changed as appropriate according to the specifications of the component mounting device, the substrate recognition camera, and the component recognition camera. In the following, corresponding elements are designated by the same reference numerals in all the drawings, and duplicate descriptions are omitted. In each drawing, the substrate is conveyed along the X-axis. The Y-axis is orthogonal to the X-axis in the horizontal plane. The Z-axis is orthogonal to the horizontal plane and extends from bottom to top.
First, the configuration of the component mounting device 1 will be described with reference to FIGS. 1 and 2. FIG. 2 schematically shows a part of the component mounting device 1 in FIG. 1. The component mounting device 1 executes component mounting work in which the component D supplied from the component supply unit 4 is mounted on the substrate 3. As shown in FIG. 1, a substrate transport mechanism (hereinafter, transport mechanism) 2 is arranged along the X-axis in the center of the base 1a. The transport mechanism 2 carries the substrate 3 transported from upstream into the mounting work position, positions it, and holds it. The transport mechanism 2 also carries the substrate 3 for which the component mounting work has been completed out to the downstream side.

Component supply units 4 are arranged on both sides of the transport mechanism 2 in the Y-axis direction. A plurality of tape feeders 5 are mounted in parallel on each component supply unit 4. The tape feeder 5 pitch-feeds the tape 11, which has pockets storing the components D, in the direction from the outside of the component supply unit 4 toward the transport mechanism 2 (the tape feed direction). The tape feeder 5 thereby supplies the component D to the component supply position 5a (see FIG. 2) from which the component D is taken out by the mounting head 8, as described below.

Y-axis tables 6 are arranged at both ends of the upper surface of the base 1a in the X-axis direction so as to extend along the Y-axis. Each Y-axis table 6 has a linear drive mechanism. A beam 7 is coupled to the pair of Y-axis tables 6 so as to connect them to each other. The beam 7 extends along the X-axis. The beam 7 can be moved along the Y-axis by the linear drive mechanisms of the Y-axis tables 6. The beam 7 also has a linear drive mechanism, like the Y-axis tables 6. A mounting head 8 is mounted on the beam 7. The mounting head 8 can be moved along the X-axis by the linear drive mechanism of the beam 7. As shown in FIG. 2, the mounting head 8 has a suction unit 8a. The suction unit 8a can be raised and lowered and holds the component D by suction. A nozzle 8b that sucks and holds the component D is mounted on the lower end of the suction unit 8a.
 図1において、Y軸テーブル6およびビーム7は、実装ヘッド8をX軸、Y軸に沿って移動させるヘッド移動機構(以下、移動機構)9を構成する。移動機構9および実装ヘッド8は、部品供給部4に配置されたテープフィーダ5の部品供給位置5aから部品Dをノズル8bによって吸着して取り出して、搬送機構2に位置決めされた基板3の実装位置に装着する実装ターンを実行する。すなわち、Y軸テーブル6、ビーム7および実装ヘッド8は、テープフィーダ5の部品供給位置5aに供給される部品Dをノズル8bで保持して基板3に装着する。 In FIG. 1, the Y-axis table 6 and the beam 7 constitute a head moving mechanism (hereinafter, moving mechanism) 9 that moves the mounting head 8 along the X-axis and the Y-axis. The moving mechanism 9 and the mounting head 8 attract and take out the component D from the component supply position 5a of the tape feeder 5 arranged in the component supply unit 4 by the nozzle 8b, and take out the component D from the component supply position 5a. Perform a mounting turn to attach to. That is, the Y-axis table 6, the beam 7, and the mounting head 8 hold the component D supplied to the component supply position 5a of the tape feeder 5 by the nozzle 8b and mount it on the substrate 3.
 部品供給部4と搬送機構2との間には、部品認識カメラ(以下、第1カメラ)20が配置されている。部品供給部4から部品Dを取り出した実装ヘッド8が第1カメラ20の上方を移動する際に、第1カメラ20は実装ヘッド8に保持された部品Dを撮像して部品Dの姿勢を認識する。実装ヘッド8が取り付けられたプレート7aには基板認識カメラ30(以下、第2カメラ)が取り付けられている。第2カメラ30は、実装ヘッド8と一体的に移動する。 A parts recognition camera (hereinafter, first camera) 20 is arranged between the parts supply unit 4 and the transport mechanism 2. When the mounting head 8 from which the component D is taken out from the component supply unit 4 moves above the first camera 20, the first camera 20 takes an image of the component D held by the mounting head 8 and recognizes the posture of the component D. To do. A substrate recognition camera 30 (hereinafter referred to as a second camera) is attached to the plate 7a to which the mounting head 8 is attached. The second camera 30 moves integrally with the mounting head 8.
 As the mounting head 8 moves, the second camera 30 moves above the substrate 3 positioned by the transport mechanism 2, images a substrate mark 3a provided on the substrate 3, and recognizes the position of the substrate 3. In the component mounting operation on the substrate 3 by the mounting head 8, the mounting position is corrected by taking into account both the recognition result of the component D by the first camera 20 and the recognition result of the position of the substrate 3 by the second camera 30.
 In FIG. 2, a carriage 10 is set in the component supply unit 4. The carriage 10 has a feeder base 10a, on which a plurality of tape feeders 5 are mounted in advance. The carriage 10 holds tape reels 12. Each tape reel 12 stores, in a wound state, a tape 11 holding components D. The tape 11 drawn out from the tape reel 12 is pitch-fed to the component supply position 5a by the tape feeder 5.
 In FIG. 1, a touch panel 13 operated by an operator is installed at the position on the front of the component mounting device 1 where the operator works. The touch panel 13 displays various kinds of information on its display unit. Using operation buttons and the like displayed on the display unit, the operator inputs data through the touch panel 13 and operates the component mounting device 1.
 Next, the configuration of the first camera 20 will be described with reference to FIGS. 3A to 3C. As shown in FIG. 3B, the first camera 20 includes a housing 21, a sensor substrate 23, a lens 24, a component illumination unit 25, and a transparent member 26. The sensor substrate 23 is installed on the bottom 21a inside the housing 21. An image sensor 22 such as a two-dimensional CMOS (complementary metal-oxide semiconductor) sensor is mounted on the sensor substrate 23. The lens 24 is installed above the sensor substrate 23 inside the housing 21, and the component illumination unit 25 is installed above the lens 24. The transparent member 26 is installed in a cut-out portion of the upper part 21b of the housing 21.
 The transparent member 26 is made of plate-shaped glass or the like that transmits light. A mark 27 for detecting a deviation of the optical axis of the image sensor 22 is provided on part of the upper surface 26c of the transparent member 26. As shown in FIG. 3A, the mark 27 is arranged outside the component imaging field of view 26a used when the image sensor 22 images a component D held by the nozzle 8b, and inside the mark imaging field of view 26b used when the image sensor 22 images the mark 27. In this example, five disks formed of a metal thin film that does not transmit light are arranged as the mark 27 at equal intervals along the Y-axis within the mark imaging field of view 26b. The shape, number, and arrangement of the marks 27 shown in FIG. 3A are merely an example and are not limited to this example. As shown in FIG. 3B, the transparent member 26 is provided on the first optical path Pa along which the component imaging unit 22a images the component D and on the second optical path Pb along which the mark imaging unit 22b images the mark 27.
 As shown in FIG. 3C, the image sensor 22 has a component imaging unit 22a and a mark imaging unit 22b arranged side by side along the X-axis. The component imaging unit 22a and the mark imaging unit 22b each extend along the Y-axis; that is, their length along the X-axis is shorter than their length along the Y-axis. The component imaging unit 22a images a component D held by the nozzle 8b. The mark imaging unit 22b images the mark 27 provided on the transparent member 26. As shown in FIGS. 3B and 4, the component imaging unit 22a images a component D whose lower surface Da is located at a component recognition height Ha set above the mark recognition height Hb of the upper surface 26c of the transparent member 26. In this way, the image sensor 22 is provided with the component imaging unit 22a that images the component D and the mark imaging unit 22b that images the mark 27. As shown in FIG. 3A, the mark 27 is provided inside the mark imaging field of view 26b, which lies outside the component imaging field of view 26a, i.e., the field of view of the component imaging unit 22a.
 The component imaging unit 22a is provided in one part of the image sensor 22, and the mark imaging unit 22b is provided in a part of the image sensor 22 different from the component imaging unit 22a. Because the component imaging unit 22a and the mark imaging unit 22b are provided on the same image sensor 22 in this way, the accuracy with which the misalignment amount of the field of view of the component imaging unit 22a is detected is improved, as will be described later.
 As shown in FIG. 3B, the lens 24 arranged above the image sensor 22 forms, on the component imaging unit 22a, an image of an imaging object (such as a component D) located above the transparent member 26 at the component recognition height Ha. The lens 24 also forms, on the mark imaging unit 22b, an image of the mark 27 formed on the transparent member 26. The mark 27 is located at the mark recognition height Hb. The physical distance Lpa of the first optical path Pa from the component imaging unit 22a to the imaging object at the component recognition height Ha differs from the physical distance Lpb of the second optical path Pb from the mark imaging unit 22b to the mark 27.
 As shown in FIGS. 3B and 3C, a correction member 28, which has a refractive index different from that of air and is made of glass or the like that transmits light, is installed on the component imaging unit 22a. The refractive index and thickness T of the correction member 28 are set so as to correct the optical path length Loa (optical distance) of the first optical path Pa so that an image is formed on each of the component imaging unit 22a and the mark imaging unit 22b. The lens 24 can thereby focus simultaneously on the image of the imaging object formed on the component imaging unit 22a and the image of the mark 27 formed on the mark imaging unit 22b.
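 As a rough paraxial illustration (an assumption for this description, not a relation stated in the patent): if the correction member 28 is treated as an ideal plane-parallel plate of thickness $T$ and refractive index $n$ inserted into the converging beam, it shifts the plane of best focus of that path by approximately

 $\Delta \approx T\left(1 - \dfrac{1}{n}\right)$,

 so $T$ and $n$ would be chosen such that $\Delta$ compensates the difference between the physical distances Lpa and Lpb of the two paths. With $n \approx 1.5$, for example, the shift is roughly one third of the plate thickness.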
 In the present embodiment, the correction member 28 is installed on the component imaging unit 22a; however, the correction member 28 need only be installed somewhere along either the first optical path Pa or the second optical path Pb. That is, the first camera 20 has a correction member 28 that corrects either the optical path length Loa or the optical path length Lob on either the first optical path Pa along which the component imaging unit 22a images the component D or the second optical path Pb along which the mark imaging unit 22b images the mark 27. Correction members 28 differing in refractive index and/or thickness T may be installed on both the first optical path Pa and the second optical path Pb. Furthermore, if the depth of field of the lens 24 is deep enough that the images of the imaging object and the mark 27 can be brought into focus simultaneously without the correction member 28, the correction member 28 may be omitted.
 As shown in FIG. 3B, the component illumination unit 25 includes a side illumination unit 25a, a coaxial illumination unit 25b, and a half mirror 25c. The side illumination unit 25a has a plurality of light-emitting diode (LED) chips and the like, and emits light that illuminates, from obliquely below, the mark 27 and an imaging object (such as a component D) located at the component recognition height Ha. The coaxial illumination unit 25b also has a plurality of LED chips and the like, and is arranged below the side illumination unit 25a.
 The half mirror 25c is arranged partway along the first optical path Pa and the second optical path Pb. The half mirror 25c reflects the light emitted from the coaxial illumination unit 25b toward the mark 27 and toward the imaging object (such as a component D) located at the component recognition height Ha above the half mirror 25c. That is, the coaxial illumination unit 25b and the half mirror 25c illuminate the imaging object and the mark 27 coaxially with the first optical path Pa and the second optical path Pb.
 The component illumination unit 25 switches between, or combines, illumination by the side illumination unit 25a and illumination by the coaxial illumination unit 25b according to the material and other properties of the object to be imaged. In this way, the first camera 20 has an illumination unit (the component illumination unit 25) that illuminates the mark 27.
 Next, imaging of a component D by the first camera 20 will be described with reference to FIG. 4. The mounting head 8 holding the component D with the nozzle 8b positions the lower surface Da of the component D at the component recognition height Ha and moves along the X-axis, as indicated by arrow a, so that the component D passes over the first optical path Pa of the first camera 20. During this movement, the component illumination unit 25 illuminates the component D. The first camera 20 transmits the imaging results captured by the image sensor 22 (the component imaging unit 22a and the mark imaging unit 22b) to the control unit 40 shown in FIG. 6.
 Next, the configuration of the second camera 30 will be described with reference to FIG. 5. The second camera 30 has a housing 31, a window member 34, and a camera unit 32 and a substrate illumination unit 33 installed inside the housing 31. The window member 34 is made of glass or the like that transmits light and is installed in a cut-out portion of the lower part of the housing 31. The camera unit 32 is composed of an image sensor, such as a two-dimensional CMOS sensor with its optical axis directed downward, a lens, and the like.
 The substrate illumination unit 33 includes a side illumination unit 33a that illuminates the substrate 3 to be imaged below it from obliquely above, a coaxial illumination unit 33b, and a half mirror 33c. The half mirror 33c reflects the light emitted from the coaxial illumination unit 33b toward the imaging object below. The substrate illumination unit 33 switches between, or combines, illumination by the side illumination unit 33a and illumination by the coaxial illumination unit 33b according to the material and other properties of the object imaged by the camera unit 32.
 The second camera 30 images objects such as the substrate mark 3a formed on the substrate 3 and the mark 27 provided on the transparent member 26 of the first camera 20. To do so, the second camera 30 moves above the imaging object together with the mounting head 8, and the camera unit 32 images the object while the substrate illumination unit 33 illuminates it. In this way, the second camera 30 is a substrate imaging device that has the substrate illumination unit 33, which illuminates the substrate mark 3a formed on the substrate 3 and the mark 27, and that moves integrally with the nozzle 8b to image the substrate 3 and the mark 27. The second camera 30 transmits the imaging results captured by the camera unit 32 to the control unit 40.
 Next, the configuration of the control system of the component mounting device 1 will be described with reference to FIG. 6. The component mounting device 1 has a control unit 40, the transport mechanism 2, the tape feeders 5, the mounting head 8, the moving mechanism 9, the first camera 20, the second camera 30, and the touch panel 13. The control unit 40 has an imaging processing unit (hereinafter, first processing unit) 41, a misalignment amount calculation unit (hereinafter, first calculation unit) 42, an offset amount calculation unit (hereinafter, second calculation unit) 43, a mounting operation processing unit (hereinafter, second processing unit) 44, and a mounting storage unit (hereinafter, storage unit) 45. The storage unit 45 stores mounting data 46, imaging data 47, a mark field-of-view misalignment amount (hereinafter, first misalignment amount) 48, a component field-of-view misalignment amount (hereinafter, second misalignment amount) 49, a substrate field-of-view misalignment amount (hereinafter, third misalignment amount) 50, an offset amount 51, and the like.
 The first processing unit 41, the first calculation unit 42, the second calculation unit 43, and the second processing unit 44 that constitute the control unit 40 are each composed of a CPU (central processing unit) or an LSI (large-scale integrated circuit), and may include memory as necessary. They may be configured as dedicated circuits, or may be realized by controlling general-purpose hardware with software read from a transitory or non-transitory storage device or recording medium. Two or more of them may also be configured as a single unit. The storage unit 45 is composed of a rewritable RAM, a flash memory, a hard disk, or the like. The storage unit 45 may be composed of a plurality of memories that individually store the mounting data 46 through the offset amount 51, or may be configured as a single unit that stores these data collectively.
 The mounting data 46 includes, for each type of substrate 3, information necessary for the component mounting work, such as the size of the substrate 3 and the types and mounting positions (XY coordinates) of the components D to be mounted. When detecting (calculating) a misalignment of the imaging field of view of the first camera 20, the first processing unit 41 controls the mark imaging unit 22b and the component illumination unit 25 of the first camera 20. Specifically, the first processing unit 41 causes the mark imaging unit 22b to image the mark 27 provided on the transparent member 26 while the component illumination unit 25 illuminates the mark 27. The first processing unit 41 stores the imaging result in the storage unit 45 as imaging data 47.
 Here, the result of imaging the mark 27 with the first camera 20 will be described with reference to FIGS. 7 and 8. During component mounting work, the first camera 20 repeatedly images components D held by the nozzle 8b. In this process, heat generated by the image sensor 22 and the component illumination unit 25 may cause the sensor substrate 23 and the housing 21 to deform over time. FIG. 7 shows a state in which such a change over time has distorted the sensor substrate 23, displacing the image sensor 22 within the XY plane, so that the first optical path Pa (optical axis) of the component imaging unit 22a and the second optical path Pb (optical axis) of the mark imaging unit 22b have shifted from their initial state. Along the X-axis, the first optical path Pa is shifted by a misalignment amount ΔXd, and the second optical path Pb is shifted by a misalignment amount ΔXm.
 In FIG. 7, in the initial state in which the first camera 20 is free of distortion, the position of the first optical path Pa at the component recognition height Ha is defined as the initial component field-of-view center position Cd0, and the position of the second optical path Pb at the mark recognition height Hb is defined as the initial mark field-of-view center position Cm0. In the state in which the image sensor 22 is displaced within the XY plane, the position of the first optical path Pa at the component recognition height Ha is defined as the component field-of-view center position Cd, and the position of the second optical path Pb at the mark recognition height Hb is defined as the mark field-of-view center position Cm.
 FIG. 8 shows a mark image 60, which is an image of the mark 27 captured by the first camera 20 in the state shown in FIG. 7 in which the image sensor 22 is displaced within the XY plane. The five marks 27 captured by the mark imaging unit 22b appear side by side in the mark image 60. The center of the third of the five marks 27 (hereinafter, center mark 27A) is the initial mark field-of-view center position Cm0, and the center 60c of the mark image 60 is the mark field-of-view center position Cm.
 In the initial state in which the first camera 20 is free of distortion, the initial mark field-of-view center position Cm0 is located at the center 60c of the image captured by the first camera 20, and the five marks 27 lie at the positions indicated by the dotted lines. In FIG. 8, because the image sensor 22 is displaced, the position of the second optical path Pb at the mark recognition height Hb has moved away from the center of the center mark 27A.
 Based on the mark image 60, the first calculation unit 42 shown in FIG. 6 calculates the first misalignment amount 48, which is the amount by which the field of view of the mark imaging unit 22b has shifted from its initial state, and stores it in the storage unit 45. Specifically, the first calculation unit 42 calculates the deviation ΔXm along the X-axis and the deviation ΔYm along the Y-axis from the position of the center mark 27A relative to the center 60c of the mark image 60 shown in FIG. 8. It also calculates the deviation Δθm in the θ direction, i.e., the direction of rotation about the Z-axis, from the inclination of the straight line connecting the centers of the five marks 27.
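 The following Python fragment is a minimal sketch of this kind of calculation, assuming the five mark centers have already been detected in the mark image and converted to physical units; the function and variable names are illustrative only and do not appear in the patent.

```python
import math

def mark_field_misalignment(mark_centers, image_center):
    """Estimate (dx, dy, dtheta) of the mark field of view from five detected mark centers.

    mark_centers: list of (x, y) positions of the five marks, ordered along the Y-axis.
    image_center: (x, y) of the captured image center (the current field-of-view center Cm).
    """
    # Translation: offset of the middle (third) mark, which would sit at the image center
    # in the undistorted initial state.
    cx, cy = image_center
    mx, my = mark_centers[2]
    dx = mx - cx
    dy = my - cy

    # Rotation: slope of the best-fit line through the five centers.  With no distortion the
    # marks lie exactly along the Y-axis, so any X-variation along Y indicates a rotation
    # about the Z-axis.
    n = len(mark_centers)
    mean_x = sum(p[0] for p in mark_centers) / n
    mean_y = sum(p[1] for p in mark_centers) / n
    num = sum((p[1] - mean_y) * (p[0] - mean_x) for p in mark_centers)
    den = sum((p[1] - mean_y) ** 2 for p in mark_centers)
    dtheta = math.atan2(num, den)  # radians; tilt of the mark row about the Z-axis

    return dx, dy, dtheta
```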
 FIG. 9 shows a component image (hereinafter, first image) 61, which is an image of a component D captured by the first camera 20 in the state shown in FIG. 7 in which the image sensor 22 is displaced within the XY plane. The nozzle 8b holds the component D at its center. The center of the component D appearing in the first image 61 is the initial component field-of-view center position Cd0, and the center 61c of the first image 61 is the component field-of-view center position Cd. In the initial state in which the first camera 20 is free of distortion, the initial component field-of-view center position Cd0 is located at the center 61c of the first image 61, and the component D and the nozzle 8b holding it lie at the positions indicated by the dotted lines.
 The second misalignment amount 49, which is the amount by which the field of view of the component imaging unit 22a has shifted from its initial state as determined from the first image 61, is expressed by the deviation ΔXd along the X-axis and the deviation ΔYd along the Y-axis. That is, the deviation of the position of the center of the component D from the center 61c of the first image 61 is given by ΔXd and ΔYd. The degree of rotation of the component D (such as the inclination of its sides) gives the deviation Δθd in the θ direction, the direction of rotation about the Z-axis.
 In the first camera 20, the component imaging unit 22a and the mark imaging unit 22b are formed on a single image sensor 22. Consequently, when the first camera 20 changes over time, the shift of the first optical path Pa of the component imaging unit 22a and the shift of the second optical path Pb of the mark imaging unit 22b occur in conjunction with each other. The first misalignment amount 48 and the second misalignment amount 49 therefore have a predetermined relationship (for example, a proportional relationship) determined by the configuration of the first camera 20. That is, the second misalignment amount 49 can be calculated from the first misalignment amount 48.
 The first calculation unit 42 applies a predetermined operation to the first misalignment amount 48 calculated from the mark image 60 to obtain the second misalignment amount 49, and stores it in the storage unit 45. The predetermined operation is, for example, multiplying the X-axis and Y-axis components by predetermined coefficients while leaving the θ component unchanged. That is, the first calculation unit 42 calculates the misalignment amount of the field of view of the component imaging unit 22a (the second misalignment amount 49) based on the mark image 60, which is the result of imaging the mark 27 with the mark imaging unit 22b. The component imaging unit 22a, the mark imaging unit 22b, the mark 27, and the first calculation unit 42 thus constitute a component imaging device that images the component D held by the nozzle 8b and calculates the second misalignment amount 49.
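 A minimal sketch of such a predetermined operation, assuming a simple per-axis proportional relationship with calibration coefficients determined beforehand from the camera configuration; the coefficient names are illustrative only.

```python
def component_field_misalignment(mark_misalignment, kx, ky):
    """Derive the component-field misalignment (dXd, dYd, dThd) from the mark-field
    misalignment (dXm, dYm, dThm) using the fixed relationship of the camera.

    kx, ky: calibration coefficients determined once from the camera geometry
            (for example, the ratio of magnifications of the two optical paths).
    """
    dxm, dym, dthm = mark_misalignment
    # In this example the X and Y components are scaled and the rotation is taken over unchanged.
    return kx * dxm, ky * dym, dthm
```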
 The second processing unit 44 shown in FIG. 6 controls each part of the component mounting device 1 based on the mounting data 46. Specifically, it controls the nozzle 8b of the mounting head 8 so as to hold a component D supplied by a tape feeder 5, causes the first camera 20 to detect the positional deviation of the component D held by the nozzle 8b, and then controls the nozzle 8b so as to mount the component D on the substrate 3 held at the mounting work position. The second processing unit 44 controls this series of component mounting operations. In doing so, the second processing unit 44 corrects the position of the component D relative to the nozzle 8b detected by the first camera 20, based on the second misalignment amount 49 stored in the storage unit 45.
 The mark imaging unit 22b can image the mark 27 while the mounting head 8 is picking up a component D from a tape feeder 5 or mounting a component on the substrate 3. That is, the mark imaging unit 22b can image the mark 27, independently of the operation of the mounting head 8, during idle periods in which the mounting head 8 is not above the first camera 20. The mark imaging unit 22b can therefore image the mark 27 during such idle periods and update the second misalignment amount 49 to the latest value. As a result, the deviation of the optical axis of the first camera 20 that images the component D (the second misalignment amount 49) can be detected and the position at which the component D is mounted can be corrected without reducing the efficiency of the component mounting work, so that the component D is mounted at its proper mounting position with high accuracy.
 The first processing unit 41 shown in FIG. 6 detects (calculates) positional deviations of the mounting head 8, the nozzle 8b, and the second camera 30 caused by distortion of the moving mechanism 9 over time. Specifically, the first processing unit 41 controls the moving mechanism 9, the second camera 30, and the first camera 20 so that the camera unit 32 images the mark 27 formed on the transparent member 26. The first processing unit 41 stores the imaging result in the storage unit 45 as imaging data 47.
 Next, the details and results of imaging the mark 27 with the second camera 30 will be described with reference to FIGS. 10 and 11. To image the mark 27 with the second camera 30, the first processing unit 41 shown in FIG. 6 first controls the moving mechanism 9 to move the second camera 30 to the position for imaging the mark 27 provided on the first camera 20. The first processing unit 41 then controls the component illumination unit 25 of the first camera 20 to illuminate the mark 27 on the transparent member 26 from below, while controlling the camera unit 32 shown in FIG. 10 to image the mark 27.
 That is, when the second camera 30 serving as the substrate imaging device images the mark 27, the component illumination unit 25 illuminates the mark 27 with transmitted illumination. As a result, the second camera 30 can capture a clear image of the mark 27 even if, for example, dirt has adhered to the transparent member 26. Alternatively, the camera unit 32 may image the mark 27 while the substrate illumination unit 33 illuminates the mark 27. Although transmitted illumination by the component illumination unit 25 is used here, transmitted illumination is merely one example of an illumination type, and the illumination is not limited to this example.
 The moving mechanism 9 repeatedly moves the mounting head 8 during component mounting work. In this process, heat generated by the linear drive mechanisms and other parts of the moving mechanism 9 may distort the moving mechanism 9 over time. In the control unit 40, various control parameters are set so that the positions of the mounting head 8 and the nozzle 8b are controlled, with a reference point (not shown) set in the component mounting device 1 as the origin, on the assumption of an initial state in which the moving mechanism 9 is free of distortion. When the moving mechanism 9 is distorted, the mounting head 8 and the nozzle 8b deviate from their initial-state positions, and correction becomes necessary.
 FIG. 10 shows a state in which, although the first processing unit 41 controls the moving mechanism 9 so as to make the third optical path Ph of the camera unit 32 coincide with the center of the mark 27, the second camera 30 has stopped at a displaced position because of distortion of the moving mechanism 9. As described above, the second camera 30 including the camera unit 32 is moved integrally with the mounting head 8 by the moving mechanism 9. In the initial state in which the moving mechanism 9 is free of distortion, the position of the third optical path Ph on the mark 27 is defined as the initial substrate field-of-view center position Ch0. In the state in which the moving mechanism 9 is distorted, the position of the third optical path Ph on the transparent member 26 is defined as the substrate field-of-view center position Ch.
 FIG. 11 shows a substrate recognition camera image (hereinafter, second image) 62 of the mark 27 captured by the camera unit 32 of the second camera 30 in the state shown in FIG. 10 in which the moving mechanism 9 is distorted. The center 62c of the second image 62 is the substrate field-of-view center position Ch, and the center of the third (center) mark 27A of the five marks 27 appearing in the second image 62 is the initial substrate field-of-view center position Ch0. In the initial state in which the moving mechanism 9 is free of distortion, the initial substrate field-of-view center position Ch0 is located at the center 62c of the second image 62, and the five marks 27 lie at the positions indicated by the dotted lines.
 Based on the second image 62 shown in FIG. 11, the second calculation unit 43 shown in FIG. 6 calculates the third misalignment amount 50, which is the amount by which the field of view of the camera unit 32 has shifted from its initial state, and stores it in the storage unit 45. Specifically, the second calculation unit 43 calculates the deviation ΔXh along the X-axis and the deviation ΔYh along the Y-axis from the position of the center mark 27A relative to the center 62c of the second image 62, and calculates the deviation Δθh in the θ direction, the direction of rotation about the Z-axis, from the inclination of the straight line connecting the centers of the five marks 27.
 The second processing unit 44 controls the second camera 30 to image the substrate mark 3a of the substrate 3 held at the mounting work position and thereby recognizes the position of the substrate 3. In doing so, the second processing unit 44 corrects the stop position of the substrate 3 based on the third misalignment amount 50 stored in the storage unit 45. In the component mounting work, it is also necessary to correct the position of the nozzle 8b relative to the reference point set in the component mounting device 1 so that the component D held by the nozzle 8b is mounted at the predetermined mounting position on the substrate 3. To correct the position of the nozzle 8b, the distortion of the first camera 20 that images the component D held by the nozzle 8b, described above, must be taken into account in addition to the distortion of the moving mechanism 9.
 In FIG. 10, the second calculation unit 43 calculates the offset amount 51 between the nozzle 8b and the second camera 30 based on the positional relationship (Xn, Yn) between the second camera 30 and the nozzle 8b on the mounting head 8, the third misalignment amount 50, and the first misalignment amount 48, and stores the offset amount 51 in the storage unit 45. As described above, the first misalignment amount 48 is the misalignment amount of the field of view of the first camera 20, representing the distortion of the first camera 20. The second processing unit 44 corrects the position of the nozzle 8b based on the mounting data 46 and the offset amount 51 stored in the storage unit 45, controls the nozzle 8b, and causes the component D held by the nozzle 8b to be mounted at the mounting position on the substrate 3.
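 As a sketch only: the patent states that the offset amount is calculated based on the positional relationship (Xn, Yn), the third misalignment amount 50, and the first misalignment amount 48, but does not give the formula, so the additive combination below is an assumption for illustration.

```python
from dataclasses import dataclass

@dataclass
class Misalignment:
    dx: float
    dy: float
    dtheta: float  # rotation about the Z-axis, radians

def nozzle_camera_offset(xn, yn, third: Misalignment, first: Misalignment):
    """Illustrative update of the nozzle-to-second-camera offset.

    (xn, yn): nominal positional relationship between the second camera and the nozzle
              on the mounting head.
    third:    misalignment of the second camera's field of view (from the second image).
    first:    misalignment of the first camera's field of view (from the mark image).
    The simple vector combination below is an assumption; only the inputs are taken
    from the patent text.
    """
    offset_x = xn + third.dx - first.dx
    offset_y = yn + third.dy - first.dy
    return offset_x, offset_y
```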
 As described above, the misalignment amount of the field of view of the first camera 20 is calculated by the first calculation unit 42 based on the result of imaging the mark with the mark imaging unit 22b of the first camera 20. That is, the first calculation unit 42 calculates the first misalignment amount 48 based on the mark image 60 shown in FIG. 8. When the first camera 20 images the mark 27, the first processing unit 41 controls the substrate illumination unit 33 of the second camera 30 to illuminate the mark 27 on the transparent member 26 from above, and controls the mark imaging unit 22b to image the mark 27 in that state. That is, when the first camera 20 images the mark 27, the substrate illumination unit 33 illuminates the mark 27 with transmitted illumination. As a result, the mark 27 can be imaged clearly even if, for example, dirt has adhered to the transparent member 26.
 In this way, the second calculation unit 43 calculates the offset amount 51 between the nozzle 8b and the second camera 30 based on the result of imaging the mark 27 with the mark imaging unit 22b of the first camera 20, which is the component imaging device, and the result of imaging the mark 27 with the second camera 30, which is the substrate imaging device. That is, the second calculation unit 43 calculates the offset amount 51 based on the mark image 60 and the substrate recognition camera image 62.
 As described above, the component mounting device 1 has a component imaging device (the first camera 20 and the first calculation unit 42) that images the component D held by the nozzle 8b, a substrate imaging device (the second camera 30) that moves integrally with the nozzle 8b and images the substrate 3, and an offset amount calculation unit (the second calculation unit 43) that calculates the offset amount 51. The offset amount 51 between the nozzle 8b and the substrate imaging device can thereby be calculated with high accuracy.
 The component imaging device of the present disclosure can calculate the deviation of the optical axis of a camera that images components. The component mounting device of the present disclosure can accurately calculate the offset amount between the nozzle and the substrate imaging device during production of mounting boards. Both are therefore useful in the field of mounting components on substrates.
1  Component mounting device
1a  Base
2  Substrate transport mechanism (transport mechanism)
3  Substrate
3a  Substrate mark
4  Component supply unit
5  Tape feeder
5a  Component supply position
6  Y-axis table
7  Beam
7a  Plate
8  Mounting head
8a  Suction unit
8b  Nozzle
9  Head moving mechanism (moving mechanism)
10  Carriage
10a  Feeder base
11  Tape
12  Tape reel
13  Touch panel
20  Component recognition camera (first camera)
21, 31  Housing
21a  Bottom
21b  Upper part
22  Image sensor
22a  Component imaging unit
22b  Mark imaging unit
23  Sensor substrate
24  Lens
25  Component illumination unit
25a, 33a  Side illumination unit
25b, 33b  Coaxial illumination unit
25c, 33c  Half mirror
26  Transparent member
26a  Component imaging field of view
26b  Mark imaging field of view
26c  Upper surface
27  Mark
27A  Center mark
28  Correction member
30  Substrate recognition camera (second camera)
32  Camera unit
33  Substrate illumination unit
34  Window member
40  Control unit
41  Imaging processing unit (first processing unit)
42  Misalignment amount calculation unit (first calculation unit)
43  Offset amount calculation unit (second calculation unit)
44  Mounting operation processing unit (second processing unit)
45  Mounting storage unit (storage unit)
46  Mounting data
47  Imaging data
48  Mark field-of-view misalignment amount (first misalignment amount)
49  Component field-of-view misalignment amount (second misalignment amount)
50  Substrate field-of-view misalignment amount (third misalignment amount)
51  Offset amount
60  Mark image
60c, 61c, 62c  Center
61  Component image (first image)
62  Substrate recognition camera image (second image)
D  Component
Pa  First optical path
Pb  Second optical path
Ph  Third optical path

Claims (11)

  1. A component imaging device comprising:
    a component imaging unit that is provided on an image sensor and images a component;
    a mark provided outside a field of view of the component imaging unit;
    a mark imaging unit that is provided on the image sensor and images the mark; and
    a misalignment amount calculation unit that calculates a misalignment amount of the field of view of the component imaging unit based on a result of imaging the mark by the mark imaging unit.
  2. The component imaging device according to claim 1, wherein the component imaging unit and the mark imaging unit are provided at mutually different portions of the image sensor.
  3. The component imaging device according to claim 1 or 2, wherein a correction member that corrects an optical path length is provided on at least one of a first optical path along which the component imaging unit images the component and a second optical path along which the mark imaging unit images the mark.
  4. The component imaging device according to any one of claims 1 to 3, further comprising an illumination unit that illuminates the mark.
  5. The component imaging device according to any one of claims 1 to 4, further comprising a transparent member provided on a first optical path along which the component imaging unit images the component and on a second optical path along which the mark imaging unit images the mark,
    wherein the mark is provided on the transparent member.
  6. A component mounting device comprising:
    a nozzle that holds a component and mounts the component on a substrate;
    a component imaging device that images the component held by the nozzle;
    a substrate imaging device that moves integrally with the nozzle and images the substrate; and
    an offset amount calculation unit that calculates an offset amount between the nozzle and the substrate imaging device,
    wherein the component imaging device includes:
      a component imaging unit that is provided on an image sensor and images the component;
      a mark provided outside a field of view of the component imaging unit; and
      a mark imaging unit that is provided on the image sensor and images the mark, and
    the offset amount calculation unit calculates the offset amount based on a result of imaging the mark by the mark imaging unit of the component imaging device and a result of imaging the mark by the substrate imaging device.
  7. The component mounting device according to claim 6, wherein the component imaging unit and the mark imaging unit are provided at mutually different portions of the image sensor.
  8. The component mounting device according to claim 6 or 7, wherein a correction member that corrects an optical path length is provided on at least one of a first optical path along which the component imaging unit images the component and a second optical path along which the mark imaging unit images the mark.
  9. The component mounting device according to any one of claims 6 to 8, wherein the component imaging device has a misalignment amount calculation unit that calculates a misalignment amount of the field of view of the component imaging unit based on the result of imaging the mark by the mark imaging unit, and
    the offset amount calculation unit calculates the offset amount based on the result of imaging the mark by the substrate imaging device and the misalignment amount of the field of view calculated by the component imaging device.
  10. The component mounting device according to any one of claims 6 to 9, wherein the component imaging device has a component illumination unit that illuminates the mark when the component imaging device images the mark, and
    the substrate imaging device has a substrate illumination unit that illuminates the mark when the substrate imaging device images the mark.
  11. The component mounting device according to any one of claims 6 to 10, further comprising a transparent member provided on a first optical path along which the component imaging unit images the component and on a second optical path along which the mark imaging unit images the mark,
    wherein the mark is provided on the transparent member.
PCT/JP2020/036798 2019-12-11 2020-09-29 Component-imaging device and component-mounting device WO2021117319A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080082450.3A CN114747308A (en) 2019-12-11 2020-09-29 Component imaging device and component mounting device
JP2021563758A JPWO2021117319A1 (en) 2019-12-11 2020-09-29

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2019-223373 2019-12-11
JP2019223373 2019-12-11
JP2019-223374 2019-12-11
JP2019223374 2019-12-11

Publications (1)

Publication Number Publication Date
WO2021117319A1 true WO2021117319A1 (en) 2021-06-17

Family

ID=76329368

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/036798 WO2021117319A1 (en) 2019-12-11 2020-09-29 Component-imaging device and component-mounting device

Country Status (3)

Country Link
JP (1) JPWO2021117319A1 (en)
CN (1) CN114747308A (en)
WO (1) WO2021117319A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3129134B2 (en) * 1995-02-23 2001-01-29 松下電器産業株式会社 Chip mounting method
JP2004179636A (en) * 2002-11-13 2004-06-24 Fuji Mach Mfg Co Ltd Method and device of calibration in electronic part packaging apparatus
JP2008270649A (en) * 2007-04-24 2008-11-06 Juki Corp Surface mounting equipment, and camera position correction method thereof
JP2011035148A (en) * 2009-07-31 2011-02-17 Nec Corp Electronic component mounting device, and camera position correcting method in the same
WO2013153834A1 (en) * 2012-04-12 2013-10-17 富士機械製造株式会社 Component mounting machine
WO2013153645A1 (en) * 2012-04-12 2013-10-17 富士機械製造株式会社 Image pickup device and image processing device
WO2015049721A1 (en) * 2013-10-01 2015-04-09 富士機械製造株式会社 Component mounting device and component mounting method

Also Published As

Publication number Publication date
CN114747308A (en) 2022-07-12
JPWO2021117319A1 (en) 2021-06-17

Similar Documents

Publication Publication Date Title
JP6103800B2 (en) Component mounter
JPH07193396A (en) Part mounting device
JP5113406B2 (en) Electronic component mounting equipment
JP2001230597A (en) Detection method for electrical component position
US6195454B1 (en) Component mounting apparatus
JP4331054B2 (en) Adsorption state inspection device, surface mounter, and component testing device
JP2009212251A (en) Component transfer equipment
WO2021117319A1 (en) Component-imaging device and component-mounting device
JP2003234598A (en) Component-mounting method and component-mounting equipment
JP4437686B2 (en) Surface mount machine
JP2022107877A (en) Component imaging apparatus and component mounting device
JP5787397B2 (en) Electronic component mounting apparatus and electronic component mounting method
JP4704218B2 (en) Component recognition method, apparatus and surface mounter
JP3891825B2 (en) Electronic component mounting equipment
JP3499316B2 (en) Calibration data detection method for mounting machine and mounting machine
JP6590949B2 (en) Mounting head movement error detection device and component mounting device
JP4296029B2 (en) Electronic component mounting equipment
JP4386425B2 (en) Surface mount machine
JP2000068696A (en) Part recognition/mounting device and part recognition method
JP2007059776A (en) Electronic part mounting device, method for mounting electronic part and method for detecting noise level
JP4358015B2 (en) Surface mount machine
JP4368709B2 (en) Surface mount machine
JP6498789B2 (en) Mounting head movement error detection device and component mounting device
JP2022120244A (en) Component imaging device
JP6985901B2 (en) Component mounting machine and mounting line

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20900097

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2021563758

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20900097

Country of ref document: EP

Kind code of ref document: A1