WO2020003384A1 - Flatness acquisition system and mounting machine - Google Patents

Flatness acquisition system and mounting machine Download PDF

Info

Publication number
WO2020003384A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
unit
relative position
flatness
imaging
Prior art date
Application number
PCT/JP2018/024223
Other languages
French (fr)
Japanese (ja)
Inventor
雅史 天野
勇太 横井
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji
Priority to PCT/JP2018/024223 priority Critical patent/WO2020003384A1/en
Priority to JP2020526757A priority patent/JP7181292B2/en
Publication of WO2020003384A1 publication Critical patent/WO2020003384A1/en

Links

Images

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/30: Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05K: PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00: Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08: Monitoring manufacture of assemblages

Definitions

  • The present disclosure relates to a flatness acquisition system for acquiring flatness, and to a mounting machine.
  • Patent Literature 1 describes a lift detection device that detects whether or not each lead wire of a component is lifted, the component including a component body and a plurality of lead wires extending side by side from each of the four side surfaces of the component body.
  • The lift detection device includes a slit light source that irradiates slit light onto a plurality of lead wires extending in a line from one side surface of the component held by a component holder, and a camera that images the plurality of lead wires irradiated with the slit light, and detects the presence or absence of lifting of each lead wire based on the captured image. After the presence or absence of lifting of those lead wires has been detected, the component is rotated, and the presence or absence of lifting of each of the lead wires extending from another side surface is detected.
  • In the flatness acquisition system according to the present disclosure, the relative position, in a first direction parallel to the axis of the imaging device, between the object holder that holds the object and the imaging device is changed between the case where a first portion of the target part of the object is imaged and the case where a second portion is imaged. For example, when a part of the target part of the object is out of the depth of field of the imaging device and the first portion including that part is to be imaged, the relative position of the object holder and the imaging device in the first direction can be changed so that the first portion is located within the depth of field.
  • As a result, the relative position of the object holder and the imaging device in the first direction is changed depending on whether the first portion of the target part of the object or the second portion excluding the first portion is imaged, so a good captured image can be acquired even for the first portion, and the flatness of the target part can be acquired satisfactorily.
  • FIG. 2 is a diagram showing the component mounting device of the mounting machine; FIG. 3 is a front view of the imaging unit of the mounting machine; FIG. 4 is a plan view of the imaging unit; and FIG. 5 is a diagram conceptually showing the periphery of the control device of the mounting machine.
  • FIG. 6 is a flowchart representing the flatness acquisition program stored in the storage unit of the control device; FIG. 7 is a flowchart representing a part of the program; FIG. 8 is a diagram showing a state in which the portion B1 of a component is imaged by the imaging unit; and FIG. 9 is a diagram showing a state in which the portion B2 of the component is imaged.
  • This mounting machine includes a flatness acquisition system.
  • The mounting machine 4 mounts electronic components (hereinafter abbreviated as components) on a circuit board S (hereinafter abbreviated as board S), and includes a main body 10, a substrate transport and support device 12, a component supply device 14, a component mounting device 16, an imaging unit 18, and the like.
  • the substrate transport and support device 12 transports and holds the substrate S.
  • x is the transport direction of the substrate S by the substrate transport and support device 12
  • y is the width direction of the substrate S
  • z is the thickness direction of the substrate S.
  • y is the front-back direction of the mounting machine 4
  • z is the up-down direction, and these x-direction, y-direction, and z-direction are orthogonal to each other.
  • the component supply device 14 supplies a component to be mounted on the board S to the component mounting device 16 in a state where it can be delivered.
  • the component supply device 14 includes at least one of a tray type supply device having a tray 20, a tape feeder type supply device having a tape feeder (not shown), and a loose component supply device 21.
  • The components supplied by the component supply device 14 include a BGA (Ball Grid Array) component 30, shown in FIG. 13A, which includes a component main body 26 and a plurality of solder balls 28 formed on the component main body 26 as electrode portions.
  • They also include an SOJ (Small Outline J-Lead) lead component 36, shown in FIG. 13B, which includes a component main body 32 and a plurality of lead wires 34 that extend from the side surfaces of the component main body 32 and are bent in a J shape to serve as electrode portions.
  • the component mounting device 16 picks up and holds the component supplied by the component supply device 14, and mounts the component on the substrate S transported and supported by the substrate transport and support device 12.
  • the component mounting device 16 includes two heads 40 and 41, a head moving device 42 that moves the two heads 40 and 41, and the like.
  • The head moving device 42 includes an x-direction moving device 50 that moves the two heads 40 and 41 simultaneously in the x direction, a y-direction moving device 52 that moves them simultaneously in the y direction, z-direction moving devices 53 and 54 that move them individually in the z direction, and the like.
  • the y direction moving device 52 includes a y slider 55, a y motor 56 which is a linear motor, and the like.
  • The x-direction moving device 50 is provided on the y slider 55 and includes an x slider 60, an x motor 62 as a drive source, a motion conversion mechanism 64 that converts the rotation of the x motor 62 into linear movement and transmits it to the x slider 60, and the like.
  • The z-direction moving devices 53 and 54 are provided on the x slider 60 and include z sliders 68 and 69, z motors 70 and 71 as drive sources, and motion conversion mechanisms (not shown) that convert the rotation of the z motors 70 and 71 into linear motion and transmit it to the z sliders 68 and 69, respectively.
  • the head 40 of the two heads 40 and 41 has one component holder 80.
  • the component holder 80 can be, for example, a suction nozzle that suctions and holds components by negative pressure, or a chuck that holds components by a pair of claws.
  • The imaging unit 18 acquires the three-dimensional shape of the component held by the component holder 80 positioned above it and, as shown in FIGS. 3 and 4, includes two projectors 90 and 91, a camera 92 as an imaging device, and a three-dimensional shape acquisition unit 94 that is built mainly around a computer, controls the projectors 90 and 91 and the camera 92, and acquires the three-dimensional shape of the target part of an object.
  • The camera 92 is an imaging device having an imaging element such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor.
  • the camera 92 is provided so that the axis Lz extends in the z direction, and the projectors 90 and 91 are provided at positions separated by 90 ° around the axis Lz.
  • The projectors 90 and 91 each project a planar pattern whose intensity varies sinusoidally in one direction, from a direction inclined with respect to the z direction as well as to the x and y directions.
  • an imaging region Rc which is an area where an image can be captured by the camera 92 is included in an irradiation area Rp which is an area where a pattern is irradiated by the projectors 90 and 91.
  • In the imaging unit 18, the part of the component onto which the pattern is projected by the projectors 90 and 91 is imaged by the camera 92, and the three-dimensional shape acquisition unit 94 acquires, by the phase shift method, the three-dimensional shape of the portion of the component located inside the imaging region Rc based on the captured images.
  • a pattern whose intensity changes sinusoidally in one direction is emitted a plurality of times by shifting the phase by the projectors 90 and 91, respectively.
  • the camera 92 obtains a captured image that is an image in the imaging region Rc.
  • For each captured image, the three-dimensional shape acquisition unit 94 obtains the luminance of every pixel of the image, and obtains the phase at each pixel based on the luminance values of the same pixel in the plurality of captured images. Pixels having the same phase are then connected to obtain equiphase lines.
  • On the other hand, based on the irradiation angle of the light of that phase (one line of the pattern), the position of the pixel on the image sensor of the camera 92, and the optical or geometric parameters of the imaging unit 18 (the optical center coordinates of the projectors 90 and 91, the optical center coordinates of the camera 92, the focal length, and the like), the distance between the image sensor of the camera 92 and the point on the component corresponding to each pixel connected by an equiphase line is acquired.
  • the three-dimensional shape of the target portion of the component is obtained based on the distance between the plurality of points on the component and the image sensor of the camera 92.
  • The method of acquiring the three-dimensional shape and the pattern projected by the projectors 90 and 91 are not limited to those described above.
  • For example, the three-dimensional shape may be acquired not only by the phase shift method but by pattern projection methods in general, or by a stereo image method, a contour method, or the like.
  • In the present embodiment, it is sufficient that the three-dimensional shape of the component located within a predetermined two-dimensionally (planarly) extending set region is acquired. For example, when the stereo image method is used, no projector is required, and the three-dimensional shape of the component located in the imaging region Rc is acquired based on images captured by a plurality of cameras.
  • The control device 100 is built mainly around a computer and, as shown in FIG. 5, includes an execution unit 110, a storage unit 112, an input/output unit 114, and the like. The three-dimensional shape acquisition unit 94 of the imaging unit 18 is connected to the input/output unit 114, and the substrate transport and support device 12, the component supply device 14, the component mounting device 16, and the like are connected to it via a drive circuit 120.
  • the component holder 80 is moved above the imaging unit 18, and the three-dimensional shape of the target part of the component held by the component holder 80 is acquired by the imaging unit 18.
  • the target part of the component refers to a portion of the component opposite to the surface facing the component holder 80, in other words, a portion of the component mounted on the substrate S.
  • For example, a part of a solder ball 28 may be missing or a lead wire 34 may be bent due to a manufacturing defect or a problem during transportation.
  • If a component 30 with a partly missing solder ball 28 or a component 36 with a bent lead wire 34 is mounted on the board S, problems such as poor electrical conduction to the components 30 and 36 occur. Therefore, in the present embodiment, the flatness of the target part of the components 30 and 36 is acquired, and the components 30 and 36 are checked.
  • For the component 30, the portion including the plurality of solder balls 28 is set as the target part Ta, and the flatness of the virtual plane Pa formed by the set of tips (points) of the plurality of solder balls 28 is acquired based on the three-dimensional shape of the target part Ta.
  • For the component 36, the portion including the parts of the plurality of lead wires 34 located below the component main body 32 is set as the target part Tb, and the flatness of the virtual plane Pb formed by a set of predetermined points on the lower surfaces of those parts of the lead wires 34 is acquired based on the three-dimensional shape of the target part Tb.
  • When the target part Ta of the component held by the component holder 80 (for example, when the component 30 is held) is wider than the imaging region Rc of the imaging unit 18, a captured image including the entire target part Ta of the component 30 cannot be obtained, as shown in FIG. 8.
  • the component holder 80 is moved in the horizontal direction by the x-direction moving device 50 and the y-direction moving device 52. Thereby, the portion of the target portion Ta of the component 30 located in the imaging region Rc is moved, and the portion located in the imaging region Rc is changed. Each time a portion located in the imaging region Rc is changed, an image is taken by the camera 92 to obtain a three-dimensional shape.
  • FIG. 10 is a diagram illustrating a state in which the component 30 is viewed from the target portion Ta. Black arrows indicate the moving directions of the portions B1 to B9 located in the imaging region Rc.
  • the depth of field C refers to a range in which a clear image can be obtained as a captured image before and after a focused object in the camera 92, and is determined by the characteristics of the camera 92. For a part Bx that deviates from the depth of field C, the captured image becomes unclear, and it is difficult to accurately obtain a three-dimensional shape.
  • The inclination k1 of the portion B1 is acquired from the heights h1 and h2, above the image sensor of the camera 92, of two points Q1 and Q2 obtained when the three-dimensional shape of the portion B1 is acquired, and from the horizontal distance w between the two points: k1 = (h2 - h1) / w. The two points Q1 and Q2 are separated in the direction indicated by the arrow F1, that is, in the direction of relative movement between the component holder 80 and the camera 92; the distance w between the points Q1 and Q2 is measured parallel to the arrow F1; and the inclination k1 is the inclination of the portion B1 in the direction parallel to the arrow F1.
  • Similarly, when the portion located in the imaging region Rc is changed from the portion B3 to the portion B4, the inclination k3 in the direction indicated by the arrow F4 is acquired in the portion B3 based on the heights h5 and h6 of two points Q5 and Q6 separated in that direction and the distance w3 between the points Q5 and Q6 measured parallel to the arrow F4. The height h4* of the end E4 of the portion B4 on the side far from the portion B3 is then acquired based on the inclination k3 and the distance x3, parallel to the arrow F4, between the point Q6 and the end E4, and it is determined whether the end E4 lies within the depth of field C. Depending on the determination result, the component holder 80 is moved only in the horizontal direction, or in both the horizontal and vertical directions.
  • a three-dimensional shape is obtained for each of the portions B1 to B9, and the obtained three-dimensional shapes are combined to obtain a three-dimensional shape of the entire target portion.
  • The three-dimensional shapes acquired for the individual portions can be combined based on the raising/lowering amount ΔH of the component holder 80. For example, if the component holder 80 was raised or lowered by ΔH when the portion B2 was imaged, the three-dimensional shape acquired for the portion B1 is combined with the three-dimensional shape acquired for the portion B2 shifted by ΔH.
  • Alternatively, the three-dimensional shape acquired for the portion B1 and the three-dimensional shape acquired for the portion B2 can be combined so that the three-dimensional shape of the overlapping portion Bs1 within the portion B1 matches the three-dimensional shape of the overlapping portion Bs1 within the portion B2. For example, the height of one or more points included in the overlapping portion Bs1 is acquired both from the captured image containing the portion B1 (hd1) and from the captured image containing the portion B2 (hd2), and the three-dimensional shape acquired for the portion B2 is shifted so that the heights hd1 and hd2 coincide.
  • The flatness acquisition program used in this case will be described with reference to the flowchart of FIG. 6.
  • This program is executed by the control device 100, and every time the component holder 80 is moved in the horizontal direction, a three-dimensional shape acquisition command is output to the three-dimensional shape acquisition unit 94.
  • a pattern is emitted by the projectors 90 and 91 under the control of the three-dimensional shape acquisition unit 94, and a captured image is acquired by the camera 92. Then, a three-dimensional shape is acquired based on the captured image and supplied to the control device 100.
  • In step 1 (hereinafter abbreviated as S1; the same applies to the other steps), the count value n of a counter that counts the number of portions for which the three-dimensional shape has been acquired is initialized (set to 0). In S2, the component holder 80 is moved to a predetermined three-dimensional shape acquisition start position above the imaging unit 18; for the component 30, for example, the position of the component holder 80 at which the portion B1 in FIG. 10 is located within the imaging region Rc can be used as the start position.
  • In S3, a three-dimensional shape acquisition command is output to the imaging unit 18, whereby the three-dimensional shape of the portion B1 is acquired, supplied to the control device 100, and stored.
  • In S4, the count value of the counter is incremented by one, and in S5 it is determined whether the count value is equal to or greater than the set value Ns.
  • the set value Ns is the number of the parts B1 to B9 set in the target part Ta of the component 30, and is 9, for example, for the component 30 shown in FIG.
  • If the determination in S8 is NO, the component holder 80 is moved in the horizontal direction and also raised or lowered in S10; the raising/lowering amount is determined so that the height h2* of the end E2 of the portion B2 falls within the depth of field C.
  • S3 and the subsequent steps are then executed in the same way, and S3 to S11 are repeated. When the number of portions for which the three-dimensional shape has been acquired reaches Ns or more, the determination in S5 becomes YES, and the flatness of the component is acquired in S12.
  • In this way, even when the target part Ta is inclined, the component holder 80 is raised or lowered so that the portion located in the imaging region Rc lies within the depth of field. This makes it possible to accurately acquire the three-dimensional shape of the target part Ta of the component 30 and, therefore, to accurately acquire the flatness.
  • the object corresponds to the components 30 and 36
  • the object holder corresponds to the component holder 80
  • the first direction corresponds to the vertical direction
  • the second direction corresponds to the horizontal direction.
  • the second direction may be the direction of relative movement between the component holder 80 (head 40) and the camera 92.
  • the functional portion corresponds to the electrode portion
  • the first portion corresponds to the portion B1
  • the second portion corresponds to the portion B2.
  • The distance acquisition unit and the height acquisition unit are constituted by the part of the three-dimensional shape acquisition unit 94 that acquires the three-dimensional shape of the target part, and the like.
  • The flatness acquisition unit is constituted by the parts of the control device 100 that store and execute S12 of the flatness acquisition program; the first-direction relative position changing unit and the relative height control unit are constituted by the parts that store and execute S11; the second-direction relative position changing unit is constituted by the parts that store and execute S9 and S10; and the relative position changing unit includes the first-direction relative position changing unit and the second-direction relative position changing unit.
  • The x-direction moving device 50, the y-direction moving device 52, and the like constitute a horizontal moving device, and the z-direction moving device 53 and the like constitute a vertical moving device.
  • The distance acquisition unit control unit and the height acquisition control unit are constituted by the parts that store and execute S3, the inclination acquisition unit by the parts that store and execute S6, and the determination unit by the parts that store and execute S8.
  • A flatness acquisition system is constituted by the imaging unit 18, the parts of the control device 100 that store and execute the flatness acquisition program represented by the flowchart of FIG. 6, and the like.
  • In the embodiment, the height h2* of the end E2 of the portion B2 is acquired based on the inclination k1 of the portion B1, and the component holder 80 is moved both horizontally and vertically when the height h2* is out of the depth of field; however, acquiring the height h2* of the end E2 is not essential.
  • For example, the component holder 80 may be moved horizontally while acquiring the inclination k1 of the portion B1 without acquiring the height h2* of the end E2, and the component holder 80 may then be raised or lowered based on the inclination k1 so that the end E2 of the portion B2 is located within the depth of field. It is also not essential to acquire the inclination k1 of the portion B1: if, after the component holder 80 is moved horizontally so that the portion located in the imaging region Rc changes from the portion B1 to the portion B2 and the portion B2 is imaged by the camera 92, at least a part of the portion B2 turns out to be out of the depth of field, the component holder 80 can simply be raised or lowered as appropriate so that the portion B2 lies within the depth of field.
  • In the embodiment, the imaging unit 18 acquires the three-dimensional shape of the target part of the component, but it is also possible to acquire only the heights of a plurality of points of that part from the image sensor of the camera 92; for example, the flatness of the virtual plane Pa can be acquired based on the heights of the tips (points) of the solder balls 28 of the component 30 above the image sensor of the camera 92.
  • When a plurality of portions are set in the target part, it is not essential that the portions have overlapping portions. Further, it is not essential to apply the flatness acquisition system to the mounting machine 4; the flatness acquisition system may also be implemented as a standalone system.
  • The present disclosure can be implemented in modes with various other changes besides those described above.
  • Reference signs: 10: Main body; 18: Imaging unit; 40: Head; 42: Head moving device; 50: x-direction moving device; 52: y-direction moving device; 54: z-direction moving device; 80: Component holder; 90, 91: Projectors; 92: Camera; 94: Three-dimensional shape acquisition unit; 100: Control device; 110: Execution unit; 112: Storage unit

Abstract

The present invention improves a flatness acquisition system and addresses the problem of satisfactorily acquiring the flatness of a target part of an object even when a portion of the target part is outside the depth of field of an imaging device. In this flatness acquisition system, the relative position, in a first direction parallel to the axis of the imaging device, between the imaging device and an object holder that holds the object is changed between when a first portion of the target part of the object is imaged and when a second portion is imaged. For example, when a portion of the target part of the object is outside the depth of field of the imaging device, the relative position of the object holder and the imaging device in the first direction can be changed so that the first portion, which includes that portion, is positioned within the depth of field. As a result, a satisfactory captured image can be acquired even for the first portion, and the flatness of the target part can be acquired satisfactorily.

Description

Flatness acquisition system and mounting machine
The present disclosure relates to a flatness acquisition system for acquiring flatness, and to a mounting machine.
Patent Literature 1 describes a lift detection device that detects whether or not each lead wire of a component is lifted, the component including a component body and a plurality of lead wires extending side by side from each of the four side surfaces of the component body. The lift detection device includes a slit light source that irradiates slit light onto a plurality of lead wires extending in a line from one side surface of the component held by a component holder, and a camera that images the plurality of lead wires irradiated with the slit light, and detects the presence or absence of lifting of each lead wire based on the captured image. After the presence or absence of lifting of those lead wires has been detected, the component is rotated, and the presence or absence of lifting of each of the lead wires extending from another side surface is detected.
JP 2008-288336 A
Problems to be solved by the present disclosure
The object of the present disclosure is to improve a flatness acquisition system so that the flatness of a target part of an object can be acquired satisfactorily even when a part of the target part of the object is out of the depth of field of the imaging device.
Means for solving the problem, operation, and effects
In the flatness acquisition system according to the present disclosure, the relative position, in a first direction parallel to the axis of the imaging device, between the object holder that holds the object and the imaging device is changed between the case where a first portion of the target part of the object is imaged and the case where a second portion is imaged. For example, when a part of the target part of the object is out of the depth of field of the imaging device and the first portion including that part is to be imaged, the relative position of the object holder and the imaging device in the first direction can be changed so that the first portion is located within the depth of field. As a result, the relative position of the object holder and the imaging device in the first direction is changed depending on whether the first portion of the target part of the object or the second portion excluding the first portion is imaged, so a good captured image can be acquired even for the first portion, and the flatness of the target part can be acquired satisfactorily.
FIG. 1 is a perspective view of the mounting machine according to the present embodiment. FIG. 2 is a diagram showing the component mounting device of the mounting machine. FIG. 3 is a front view of the imaging unit of the mounting machine. FIG. 4 is a plan view of the imaging unit. FIG. 5 is a diagram conceptually showing the periphery of the control device of the mounting machine. FIG. 6 is a flowchart representing the flatness acquisition program stored in the storage unit of the control device. FIG. 7 is a flowchart representing a part of the program. FIG. 8 is a diagram showing a state in which the portion B1 of a component is imaged by the imaging unit. FIG. 9 is a diagram showing a state in which the portion B2 of the component is imaged. FIG. 10 is a diagram conceptually showing the plurality of portions set on the component. FIG. 11 is a diagram showing a state in which the portion B1 of the component is imaged by the imaging unit. FIG. 12 is a diagram showing a state in which the portion B2 of the component is imaged. FIGS. 13A and 13B are diagrams conceptually showing the components.
Modes for carrying out the disclosure
Hereinafter, a mounting machine according to an embodiment of the present disclosure will be described in detail with reference to the drawings. This mounting machine includes a flatness acquisition system.
As shown in FIG. 1, the mounting machine 4 mounts electronic components (hereinafter abbreviated as components) on a circuit board S (hereinafter abbreviated as board S), and includes a main body 10, a substrate transport and support device 12, a component supply device 14, a component mounting device 16, an imaging unit 18, and the like.
The substrate transport and support device 12 transports and holds the board S. In FIG. 1, x is the transport direction of the board S by the substrate transport and support device 12, y is the width direction of the board S, and z is the thickness direction of the board S. The y direction is the front-rear direction of the mounting machine 4, the z direction is the up-down direction, and the x, y, and z directions are orthogonal to one another.
The component supply device 14 supplies the components to be mounted on the board S to the component mounting device 16 in a state in which they can be handed over. In the present embodiment, the component supply device 14 includes one or more of a tray-type supply device having trays 20, a tape-feeder-type supply device having tape feeders (not shown), and a loose component supply device 21.
The components supplied by the component supply device 14 include a BGA (Ball Grid Array) component 30, shown in FIG. 13A, which includes a component main body 26 and a plurality of solder balls 28 formed on the component main body 26 as electrode portions, and an SOJ (Small Outline J-Lead) lead component 36, shown in FIG. 13B, which includes a component main body 32 and a plurality of lead wires 34 that extend from the side surfaces of the component main body 32 and are bent in a J shape to serve as electrode portions.
The component mounting device 16 picks up and holds a component supplied by the component supply device 14 and mounts it on the board S transported and supported by the substrate transport and support device 12. As shown in FIG. 2, the component mounting device 16 includes two heads 40 and 41, a head moving device 42 that moves the two heads 40 and 41, and the like. The head moving device 42 includes an x-direction moving device 50 that moves the two heads 40 and 41 simultaneously in the x direction, a y-direction moving device 52 that moves them simultaneously in the y direction, z-direction moving devices 53 and 54 that move them individually in the z direction, and the like. The y-direction moving device 52 includes a y slider 55, a y motor 56 which is a linear motor, and the like. The x-direction moving device 50 is provided on the y slider 55 and includes an x slider 60, an x motor 62 as a drive source, a motion conversion mechanism 64 that converts the rotation of the x motor 62 into linear movement and transmits it to the x slider 60, and the like. The z-direction moving devices 53 and 54 are provided on the x slider 60 and include z sliders 68 and 69, z motors 70 and 71 as drive sources, and motion conversion mechanisms (not shown) that convert the rotation of the z motors 70 and 71 into linear motion and transmit it to the z sliders 68 and 69, respectively.
Of the two heads 40 and 41, the head 40 has one component holder 80. The component holder 80 can be, for example, a suction nozzle that picks up and holds a component by negative pressure, or a chuck that holds a component with a pair of claws.
The imaging unit 18 acquires the three-dimensional shape of the component held by the component holder 80 positioned above it and, as shown in FIGS. 3 and 4, includes two projectors 90 and 91, a camera 92 as an imaging device, and a three-dimensional shape acquisition unit 94 that is built mainly around a computer, controls the projectors 90 and 91 and the camera 92, and acquires the three-dimensional shape of the target part of an object.
The camera 92 is an imaging device having an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. The camera 92 is arranged so that its axis Lz extends in the z direction, and the projectors 90 and 91 are provided at positions separated by 90° around the axis Lz. The projectors 90 and 91 each project a planar pattern whose intensity varies sinusoidally in one direction, from a direction inclined with respect to the z direction as well as to the x and y directions. The imaging region Rc, which is the area that can be imaged by the camera 92, is contained within the irradiation region Rp, which is the area onto which the pattern is projected by the projectors 90 and 91.
In the imaging unit 18, the part of the component onto which the pattern is projected by the projectors 90 and 91 is imaged by the camera 92, and the three-dimensional shape acquisition unit 94 acquires, by the phase shift method, the three-dimensional shape of the portion (part) of the component located inside the imaging region Rc based on the captured images.
Each of the projectors 90 and 91 projects the pattern, whose intensity varies sinusoidally in one direction, a plurality of times while shifting its phase. Each time the pattern is projected, the camera 92 acquires a captured image, that is, an image of the imaging region Rc. For each captured image, the three-dimensional shape acquisition unit 94 obtains the luminance of every pixel of the image, and obtains the phase at each pixel based on the luminance values of the same pixel in the plurality of captured images. Pixels having the same phase are then connected to obtain equiphase lines. On the other hand, based on the irradiation angle of the light of that phase (one line of the pattern), the position of the pixel on the image sensor of the camera 92, and the optical or geometric parameters of the imaging unit 18 (the optical center coordinates of the projectors 90 and 91, the optical center coordinates of the camera 92, the focal length, and the like), the distance between the image sensor of the camera 92 and the point on the component corresponding to each pixel connected by an equiphase line is acquired. The three-dimensional shape of the target part of the component is then acquired based on the distances between these points on the component and the image sensor of the camera 92.
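As a rough illustration of the per-pixel phase computation in the phase shift step described above, the following sketch assumes four pattern projections shifted by 90° each; the patent does not state the number of shifts, and the function and variable names are hypothetical. Converting the phase to a distance additionally requires the calibration parameters listed above.

```python
import numpy as np

def phase_map(i0, i1, i2, i3):
    """Wrapped per-pixel phase from four sinusoidal patterns shifted by 90 degrees.

    i0..i3: 2-D arrays of pixel luminances captured by the camera for the four shifts.
    """
    i0, i1, i2, i3 = (np.asarray(im, dtype=float) for im in (i0, i1, i2, i3))
    # Standard four-step phase-shift formula.
    return np.arctan2(i3 - i1, i0 - i2)

# Pixels with (nearly) equal phase lie on the same equiphase line; together with the
# projector and camera calibration they yield the distance of each point from the sensor.
```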
The method of acquiring the three-dimensional shape and the pattern projected by the projectors 90 and 91 are not limited to the above. For example, the three-dimensional shape may be acquired not only by the phase shift method but by pattern projection methods in general, or by a stereo image method, a contour method, or the like. In the present embodiment, it is sufficient that the three-dimensional shape of the component located within a predetermined two-dimensionally (planarly) extending set region is acquired. For example, when the stereo image method is used, no projector is required, and the three-dimensional shape of the component located in the imaging region Rc is acquired based on images captured by a plurality of cameras.
The control device 100 is built mainly around a computer and, as shown in FIG. 5, includes an execution unit 110, a storage unit 112, an input/output unit 114, and the like. The three-dimensional shape acquisition unit 94 of the imaging unit 18 is connected to the input/output unit 114, and the substrate transport and support device 12, the component supply device 14, the component mounting device 16, and the like are connected to it via a drive circuit 120.
The operation of the mounting machine 4 configured as described above will now be described.
In the mounting machine 4, the component holder 80 is moved above the imaging unit 18, and the three-dimensional shape of the target part of the component held by the component holder 80 is acquired by the imaging unit 18. The target part of the component is the portion on the side opposite to the surface facing the component holder 80, in other words, the portion of the component on the side that is mounted on the board S.
For example, a part of a solder ball 28 may be missing or a lead wire 34 may be bent due to a manufacturing defect or a problem during transportation. If a component 30 with a partly missing solder ball 28 or a component 36 with a bent lead wire 34 is mounted on the board S, problems such as poor electrical conduction to the components 30 and 36 occur. Therefore, in the present embodiment, the flatness of the target part of the components 30 and 36 is acquired, and the components 30 and 36 are checked.
In the present embodiment, for example, for the component 30 shown in FIG. 13A, the portion including the plurality of solder balls 28 is set as the target part Ta, and the flatness of the virtual plane Pa formed by the set of tips (points) of the plurality of solder balls 28 is acquired based on the three-dimensional shape of the target part Ta. For the component 36 shown in FIG. 13B, the portion including the parts of the plurality of lead wires 34 located below the component main body 32 is set as the target part Tb, and the flatness of the virtual plane Pb formed by a set of predetermined points on the lower surfaces of those parts of the lead wires 34 is acquired based on the three-dimensional shape of the target part Tb.
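The patent does not specify how the flatness value of the virtual plane Pa or Pb is computed. As one common convention, which is an assumption for illustration rather than the patent's definition, the sketch below fits a least-squares plane to the measured tip points and reports the peak-to-peak deviation from that plane.

```python
import numpy as np

def flatness(points):
    """Peak-to-peak deviation of 3-D points from their least-squares plane.

    points: (N, 3) array of measured tip coordinates (x, y, height), e.g. solder-ball tips.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    deviations = (pts - centroid) @ normal
    return deviations.max() - deviations.min()
```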
However, when the target part Ta of the component held by the component holder 80 (the case where the component 30 is held is described here) is wider than the imaging region Rc of the imaging unit 18, a captured image including the entire target part Ta of the component 30 cannot be obtained, as shown in FIG. 8.
In the present embodiment, therefore, as shown in FIGS. 8 and 9, the component holder 80 is moved in the horizontal direction by the x-direction moving device 50 and the y-direction moving device 52. The portion of the target part Ta of the component 30 located in the imaging region Rc is thereby moved, and the portion located in the imaging region Rc is changed. Each time the portion located in the imaging region Rc is changed, that portion is imaged by the camera 92 and its three-dimensional shape is acquired. For example, as shown in FIG. 10, a plurality of portions B1 to B9 are set in advance in the target part Ta of the component 30, and the imaging order and the like are determined. The component holder 80 is moved by the x-direction moving device 50 and the y-direction moving device 52 so that the portions B1 to B9 are positioned in the imaging region Rc in order. The portions B1 to B9 have mutually overlapping portions Bs. FIG. 10 shows the component 30 viewed from the target part Ta side, and the black arrows indicate the moving directions of the portions B1 to B9 located in the imaging region Rc.
On the other hand, owing to manufacturing and assembly errors of the component mounting device 16, manufacturing and mounting errors of the imaging unit 18, deviations in the position at which the component holder 80 holds the component 30, manufacturing errors of the component 30, and the like, the target part Ta of the component 30 held by the component holder 80 may, as shown in FIGS. 11 and 12, be inclined with respect to the direction orthogonal to the axis Lz of the camera 92 (a direction that in many cases extends horizontally). In this case, as indicated by the broken line in FIG. 12, when the portion located in the imaging region Rc is changed from the portion B1 to the portion B2 by moving the component holder 80 horizontally, a part Bx of the portion B2 falls outside the depth of field C of the camera 92. The depth of field C is the range before and behind the focused subject within which the camera 92 obtains a sharp captured image, and is determined by the characteristics of the camera 92. For the part Bx outside the depth of field C, the captured image becomes blurred, and it is difficult to acquire the three-dimensional shape accurately.
Therefore, in the present embodiment, the inclination k1 of the portion B1 is acquired based on the heights h1 and h2, above the image sensor of the camera 92, of two points Q1 and Q2 obtained when the three-dimensional shape of the portion B1 of the target part Ta of the component 30 is acquired, and on the horizontal distance w between the two points Q1 and Q2:
k1 = (h2 - h1) / w
In this case, as shown in FIG. 10, the two points Q1 and Q2 are two points separated in the direction indicated by the arrow F1, that is, in the direction of relative movement between the component holder 80 and the camera 92; the distance w between the points Q1 and Q2 is measured parallel to the arrow F1; and the inclination k1 is the inclination of the portion B1 in the direction parallel to the arrow F1.
In addition, based on (a) the inclination k1, (b) the height h2 of the point Q2, which is the one of the two points Q1 and Q2 closer to the portion B2, and (c) the distance x1 between the point Q2 and the end E2 of the portion B2 on the side far from the portion B1, the height h2* that the end E2 of the portion B2 will have when the component holder 80 is moved horizontally so that the portion B2 is located in the imaging region Rc is acquired:
h2* = h2 + k1 × x1
If the height h2* of the end E2 of the portion B2 is within the depth of field C of the camera 92, in other words, between the heights hc1 and hc2 in FIG. 12 (hc1 < h2* < hc2), it is determined that the entire portion B2 is highly likely to lie within the depth of field C. In that case, the component holder 80 is moved horizontally without being raised or lowered, and the portion B2 is positioned in the imaging region Rc. If, on the other hand, the height h2* of the end E2 falls outside the depth of field C (h2* > hc2 or h2* < hc1), it is determined that at least a part of the portion B2 is highly likely to be out of the depth of field. In that case, as shown in FIG. 12, the component holder 80 is raised or lowered while being moved horizontally so that the entire portion B2 lies within the depth of field C. The three-dimensional shape of the portion B2 is then acquired.
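A minimal sketch of this decision step is shown below, using hypothetical variable names that mirror the symbols above (h1, h2, w, x1, hc1, hc2). In the actual machine the decision is made by the flatness acquisition program of the control device 100, and the choice of shifting to the centre of the depth of field is an assumption made only for illustration.

```python
def plan_vertical_adjustment(h1, h2, w, x1, hc1, hc2):
    """Decide whether the component holder must be raised/lowered before imaging B2.

    h1, h2:   heights of points Q1 and Q2 above the image sensor (from portion B1)
    w:        distance between Q1 and Q2 along the relative-movement direction
    x1:       distance from Q2 to the far end E2 of the next portion B2
    hc1, hc2: lower and upper limits of the depth of field C
    Returns the vertical adjustment to apply (0.0 if the holder can stay at its height).
    """
    k1 = (h2 - h1) / w           # inclination of portion B1
    h2_star = h2 + k1 * x1       # predicted height of the end E2 of portion B2
    if hc1 < h2_star < hc2:
        return 0.0               # B2 is expected to fit within the depth of field
    # Assumed policy: shift so that the predicted end height lands mid-way in C.
    return (hc1 + hc2) / 2.0 - h2_star
```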
Similarly, for example, when the portion located in the imaging region Rc is changed from the portion B3 to the portion B4 shown in FIG. 10 by horizontal movement of the component holder 80, the inclination k3 in the direction of the arrow F4 is acquired in the portion B3 based on the heights h5 and h6 of two points Q5 and Q6 separated in the direction indicated by the arrow F4 and the distance w3 between the points Q5 and Q6 measured parallel to the arrow F4. The height h4* of the end E4 of the portion B4 on the side far from the portion B3 is then acquired based on the inclination k3 and the distance x3, parallel to the arrow F4, between the point Q6 and the end E4, and it is determined whether the end E4 lies within the depth of field C. Depending on the result of this determination, the component holder 80 is moved only in the horizontal direction, or in both the horizontal and vertical directions.
In this way, a three-dimensional shape is acquired for each of the portions B1 to B9, and the acquired three-dimensional shapes are merged to obtain the three-dimensional shape of the entire target part.
For example, the three-dimensional shapes acquired for the individual portions can be merged based on the raising/lowering amount ΔH of the component holder 80. If the component holder 80 was raised or lowered by ΔH when the portion B2 was imaged, the three-dimensional shape acquired for the portion B1 is merged with the three-dimensional shape acquired for the portion B2 shifted by ΔH.
Alternatively, the three-dimensional shape acquired for the portion B1 and the three-dimensional shape acquired for the portion B2 can be merged so that the three-dimensional shape of the overlapping portion Bs1 within the portion B1 matches the three-dimensional shape of the overlapping portion Bs1 within the portion B2. For example, the height of one or more points included in the overlapping portion Bs1 is acquired both from the captured image containing the portion B1 (hd1) and from the captured image containing the portion B2 (hd2), and the three-dimensional shape acquired for the portion B2 is shifted so that the heights hd1 and hd2 coincide. The relationship hd1 = hd2 + ΔH should hold between these heights, but other errors may be included. By correcting the shift value based on the difference between the heights hd1 and hd2 of the overlapping portion, the three-dimensional shape of the target part Ta can therefore be acquired more accurately.
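The merging step can be illustrated by the following sketch, which shifts the points measured for the portion B2 by the holder movement ΔH and then refines the shift with the residual observed on the overlap points of Bs1. The function and argument names are hypothetical, and using the mean residual is an assumption; the patent only states that the shift is corrected from the difference between hd1 and hd2.

```python
import numpy as np

def merge_portion_points(points_b1, points_b2, delta_h, overlap_hd1, overlap_hd2):
    """Bring the measured points of portion B2 into the height frame of portion B1.

    points_b1, points_b2: (N, 3) arrays of measured points (x, y, height)
    delta_h:      raising/lowering amount of the component holder between the two shots
    overlap_hd1:  heights of the overlap points Bs1 as seen in the B1 image
    overlap_hd2:  heights of the same overlap points as seen in the B2 image
    """
    # Nominally hd1 = hd2 + delta_h; whatever is left over is treated as error
    # and used to correct the shift applied to the B2 heights.
    residual = np.mean(np.asarray(overlap_hd1) - np.asarray(overlap_hd2) - delta_h)
    shifted = np.array(points_b2, dtype=float)
    shifted[:, 2] += delta_h + residual
    return np.vstack([np.asarray(points_b1, dtype=float), shifted])
```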
The flatness acquisition program used in this case will be described with reference to the flowchart of FIG. 6. The program is executed by the control device 100, and each time the component holder 80 is moved in the horizontal direction, a three-dimensional shape acquisition command is output to the three-dimensional shape acquisition unit 94. In the imaging unit 18, under the control of the three-dimensional shape acquisition unit 94, the pattern is projected by the projectors 90 and 91 and captured images are acquired by the camera 92. A three-dimensional shape is then acquired based on the captured images and supplied to the control device 100.
In step 1 (hereinafter abbreviated as S1; the same applies to the other steps), the count value n of a counter that counts the number of portions for which the three-dimensional shape has been acquired is initialized (set to 0). In S2, the component holder 80 is moved to a predetermined three-dimensional shape acquisition start position above the imaging unit 18; for the component 30, for example, the position of the component holder 80 at which the portion B1 in FIG. 10 is located within the imaging region Rc can be used as the start position. In S3, a three-dimensional shape acquisition command is output to the imaging unit 18, whereby the three-dimensional shape of the portion B1 is acquired, supplied to the control device 100, and stored. Next, in S4, the count value of the counter is incremented by one, and in S5 it is determined whether the count value is equal to or greater than a set value Ns. The set value Ns is the number of portions B1 to B9 set in the target part Ta of the component 30, and is 9 for the component 30 shown in FIG. 10, for example.
 If the determination in S5 is NO, the inclination k is acquired in S6 based on the three-dimensional shape of the portion B1 as described above, and in S7 the height h2* of the end E2 of the portion B2, for which the three-dimensional shape is to be acquired next, is acquired. In S8, it is then determined whether there is a high possibility that the entire portion B2 lies within the depth of field. If the determination is YES, the component holder 80 is moved in the horizontal direction in S9 so that the portion B2 is positioned within the imaging region Rc, and S3 and the subsequent steps are executed in the same manner. The portion B2 is thus located within the imaging region Rc, and its three-dimensional shape is acquired.
 On the other hand, if the determination in S8 is NO, the component holder 80 is moved in the horizontal direction and also raised or lowered in S10. The amount of vertical movement is determined such that the height h2* of the end E2 of the portion B2 falls within the depth of field C. Thereafter, S3 and the subsequent steps are executed in the same manner, and S3 to S11 are repeated. When the number of portions for which the three-dimensional shape has been acquired reaches Ns or more, the determination in S5 becomes YES, and in S12 the flatness of the component is acquired.
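The control flow of S1 to S11 can be summarized as a simple loop. The following Python sketch is only an illustration of that loop; the helper functions (move_to_start, move_horizontal, move_vertical, acquire_shape, tilt_of, predicted_end_height) and the numeric depth-of-field bounds are hypothetical stand-ins for the control device 100, the holder moving device, and the imaging unit 18, not the actual implementation.

```python
# Illustrative sketch of the S1-S11 loop (helper names and values are hypothetical).

NS = 9                           # S5 threshold: number of portions B1..B9 in the target part Ta
DOF_LOW, DOF_HIGH = -0.2, 0.2    # assumed depth of field C of the camera 92, in mm


def acquire_portion_shapes(move_to_start, move_horizontal, move_vertical,
                           acquire_shape, tilt_of, predicted_end_height):
    """Acquire the 3-D shapes of portions B1..B9 and record the vertical
    adjustments applied between portions (used later in S12)."""
    shapes, d_heights = [], []
    n = 0                                              # S1: counter of acquired portions
    move_to_start()                                    # S2: bring B1 into the imaging region Rc
    while True:
        shapes.append(acquire_shape())                 # S3: project pattern, image, get 3-D shape
        n += 1                                         # S4
        if n >= NS:                                    # S5: all portions acquired?
            break
        k = tilt_of(shapes[-1])                        # S6: tilt of the portion just acquired
        h_next = predicted_end_height(shapes[-1], k)   # S7: predicted height h* of the next end
        if DOF_LOW <= h_next <= DOF_HIGH:              # S8: next portion likely within depth of field?
            move_horizontal()                          # S9: horizontal move only
            d_heights.append(0.0)
        else:
            dh = h_next - (DOF_LOW + DOF_HIGH) / 2.0   # lift amount so the end falls inside C
            move_horizontal()                          # S10: horizontal move plus vertical adjustment
            move_vertical(dh)
            d_heights.append(dh)
    return shapes, d_heights                           # S12 then computes the flatness from these
```

The point mirrored here is that the vertical adjustment in S10 is decided before the next portion is imaged, using the tilt estimated from the portion just acquired.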
 The execution of S12 will be described with reference to the flowchart of FIG. 7.
 In S21, the vertical movement amounts ΔH by which the component holder 80 was raised or lowered each time the portion located within the imaging region Rc was changed are read, and in S22, the height of at least one point in each of the overlapping portions Bs1 to Bs9 of the portions B1 to B9 is read. In S23, the three-dimensional shapes of the portions B1 to B9 are merged based on the vertical movement amounts ΔH of the head 40 and the heights of the at least one point in each of the overlapping portions Bs1 to Bs9. Then, in S24, the flatness of the virtual plane Pa is acquired based on the merged three-dimensional shape.
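A minimal sketch of the merging and flatness computation of S21 to S24 follows. The data layout, the helper names, and the use of a least-squares plane with a peak-to-valley flatness value are assumptions made for illustration, since the document does not specify how the flatness of the virtual plane Pa is computed numerically.

```python
import numpy as np


def merge_and_flatness(shapes, d_heights, overlaps):
    """Sketch of S21-S24. shapes[i] is an (N_i, 3) array of points of portion B(i+1)
    in its own imaging frame; d_heights[i] is the vertical adjustment applied before
    portion B(i+2); overlaps[i] gives the height of one shared point of the overlap Bs
    as seen in the previously merged frame and in the current frame."""
    merged = [shapes[0]]
    offset = 0.0
    for i in range(1, len(shapes)):
        offset += d_heights[i - 1]                # S21: undo the holder's lift
        prev_h, curr_h = overlaps[i - 1]          # S22: one point of the overlap Bs
        residual = prev_h - (curr_h + offset)     # remaining mismatch at that point
        shifted = shapes[i].copy()
        shifted[:, 2] += offset + residual        # S23: merge into a common frame
        merged.append(shifted)
    cloud = np.vstack(merged)

    # S24: flatness of the virtual plane Pa, taken here as the peak-to-valley
    # deviation of the merged points from their least-squares plane.
    A = np.c_[cloud[:, 0], cloud[:, 1], np.ones(len(cloud))]
    coeff, *_ = np.linalg.lstsq(A, cloud[:, 2], rcond=None)
    deviation = cloud[:, 2] - A @ coeff
    return float(deviation.max() - deviation.min())
```

Aligning each portion with both the recorded ΔH and a point in the overlap Bs compensates for any residual offset between acquisitions, which is why the overlapping portions are read in S22.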
 As described above, in the present embodiment, even when the target part Ta of the component 30 is wider than the imaging region Rc and the target part Ta is inclined with respect to a plane orthogonal to the axis Lz of the camera 92, the component holder 80 is raised and lowered so that the portion located within the imaging region Rc lies within the depth of field. This makes it possible to accurately acquire the three-dimensional shape of the target part Ta of the component 30 and hence to accurately acquire the flatness.
 In the mounting machine configured as described above, the object corresponds to the components 30 and 36, and the object holder corresponds to the component holder 80. The first direction corresponds to the vertical direction, and the second direction corresponds to the horizontal direction. The second direction can also be regarded as the direction of relative movement between the component holder 80 (head 40) and the camera 92. The functional portions correspond to the electrode portions, the first portion corresponds to the portion B1, and the second portion corresponds to the portion B2.
 The distance acquisition unit and the height acquisition unit are constituted by the portion of the three-dimensional shape acquisition unit 94 that acquires the three-dimensional shape of a portion of the target part, and the like. The flatness acquisition unit is constituted by the portions of the control device 100 that store and execute S12 of the flatness acquisition program; the first direction relative position changing unit and the relative height control unit are constituted by the portions that store and execute S11; the second direction relative position changing unit is constituted by the portions that store and execute S9 and S10; and the relative position changing unit includes the first direction relative position changing unit and the second direction relative position changing unit. A horizontal moving device is constituted by the x-direction moving device 50, the y-direction moving device 52, and the like, and a vertical moving device is constituted by the z-direction moving device 53 and the like. Further, the distance acquisition unit control unit and the height acquisition control unit are constituted by the portions that store and execute S3, the inclination acquisition unit is constituted by the portions that store and execute S6, and the determination unit is constituted by the portions that store and execute S8. The flatness acquisition system is constituted by the imaging unit 18, the portions of the control device 100 that store and execute the flatness acquisition program represented by the flowchart of FIG. 6, and the like.
 In the above embodiment, the height h2* of the end E2 of the portion B2 is acquired based on the inclination k1 of the portion B1, and when the height h2* is out of the depth of field, the component holder 80 is moved in the horizontal direction and also in the vertical direction; however, acquiring the height h2* of the end E2 in advance is not essential. For example, the inclination k1 of the portion B1 may be acquired while the component holder 80 is moved horizontally without acquiring the height h2* of the end E2. Then, if imaging of the portion B2 by the camera 92 shows that at least a part of the portion B2 is out of the depth of field, the height h2* of the end E2 of the portion B2 may be acquired based on the inclination k1 of the portion B1, and the component holder 80 may be raised or lowered so that the height h2* of the end E2 lies within the depth of field. It is also not essential to acquire the inclination k1 of the portion B1. For example, the component holder 80 may be moved horizontally so that the portion located within the imaging region Rc changes from the portion B1 to the portion B2, and if imaging of the portion B2 by the camera 92 shows that at least a part of the portion B2 is out of the depth of field, the component holder 80 may be raised or lowered as appropriate to bring the portion B2 within the depth of field.
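As a contrast to the flowchart of FIG. 6, the variant just described can be sketched as a purely reactive step. The helper names below are hypothetical; the sketch only illustrates the order of operations (move, image, adjust only if needed).

```python
def acquire_portion_reactive(move_horizontal, move_vertical, acquire_shape,
                             out_of_dof, lift_needed):
    """Sketch of the variant above: move horizontally first, image the portion,
    and adjust the height only if part of it turns out to be outside the depth
    of field. All helper names are hypothetical."""
    move_horizontal()                        # bring the next portion (e.g. B2) into Rc
    shape = acquire_shape()
    if out_of_dof(shape):                    # part of the portion outside the depth of field?
        move_vertical(lift_needed(shape))    # raise or lower the component holder 80
        shape = acquire_shape()              # re-image within the depth of field
    return shape
```

This variant trades an occasional extra imaging pass for not having to predict h2* before the move.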
 It is also not essential that the imaging unit 18 acquire the three-dimensional shape of a portion of the target part of the component; only the heights of a plurality of points of the portion from the imaging element of the camera 92 may be acquired. For example, the flatness of the virtual plane Pa can be acquired based on the height of the tip (point) of each solder ball 28 of the component 30 from the imaging element of the camera 92.
 Furthermore, when a plurality of portions are set in the target part, it is not essential that the portions have overlapping portions. It is also not essential to apply the flatness acquisition system to the mounting machine 4; it may instead be executed as a stand-alone system in which the flatness is acquired. In addition, the present invention can be implemented in modes with various modifications based on the knowledge of those skilled in the art.
 4: mounting machine, 10: control device, 18: imaging unit, 40: head, 42: head moving device, 50: x-direction moving device, 52: y-direction moving device, 54: z-direction moving device, 80: component holder, 90, 91: projectors, 92: camera, 94: three-dimensional shape acquisition unit, 100: control device, 110: execution unit, 112: storage unit

Claims (10)

  1.  A flatness acquisition system comprising:
     an imaging unit including an imaging device that acquires an image of a predetermined imaging region;
     a distance acquisition unit that acquires, based on an image captured by the imaging device, each of the distances between the imaging device and each of a plurality of points of a portion, located within the imaging region, of a target part of an object held by an object holder, the target part being on a side of the object opposite to the side facing the object holder;
     a flatness acquisition unit that, with a plurality of the portions being set in the target part of the object, acquires a flatness of the target part based on the distances between the imaging device and each of the plurality of points acquired by the distance acquisition unit in each of the plurality of portions; and
     a relative position changing unit including a first direction relative position changing unit that changes a relative position of the object holder and the imaging device in a first direction, which is a direction parallel to an axis of the imaging device, between a case where the portion located within the imaging region is a first portion of the target part of the object and a case where it is a second portion different from the first portion.
  2.  The flatness acquisition system according to claim 1, wherein the first direction relative position changing unit changes the relative position of the object holder and the imaging device in the first direction by moving the object holder and the imaging device toward and away from each other in the first direction such that the first portion is located within a depth of field of the imaging device and the second portion is located within the depth of field of the imaging device.
  3.  The flatness acquisition system according to claim 1 or 2, wherein the relative position changing unit includes a second direction relative position changing unit that changes the portion of the object located within the imaging region by changing a relative position of the object holder and the imaging device in a second direction orthogonal to the first direction, and
     the flatness acquisition system includes a distance acquisition unit control unit that, each time the portion located within the imaging region is changed by the second direction relative position changing unit, causes the distance acquisition unit to acquire each of the distances between the imaging device and each of the plurality of points of the changed portion.
  4.  The flatness acquisition system according to any one of claims 1 to 3, wherein the relative position changing unit changes the relative position of the object holder and the imaging device in the second direction and also changes the relative position of the object holder and the imaging device in the first direction, based on the distances between the imaging device and each of at least two of the plurality of points of the portion acquired by the distance acquisition unit.
  5.  The flatness acquisition system according to claim 4, wherein the relative position changing unit includes (a) an inclination acquisition unit that acquires an inclination of the portion of the target part with respect to the second direction based on the distances between the imaging device and each of the at least two points of the portion acquired by the distance acquisition unit, and (b) a determination unit that determines, based on the inclination acquired by the inclination acquisition unit, whether there is a high possibility that at least a part of the portion of the target part to be located next within the imaging region as a result of a change of the relative position of the object holder and the imaging device in the second direction will be out of the depth of field, and wherein, when the determination unit determines that there is a high possibility that at least a part of the next portion will be out of the depth of field, the relative position changing unit changes the relative position of the object holder and the imaging device in the second direction and also changes the relative position in the first direction, and when it is determined that the possibility that at least a part of the next portion will be out of the depth of field is low, the relative position changing unit changes the relative position in the second direction without changing the relative position of the object holder and the imaging device in the first direction.
  6.  The flatness acquisition system according to any one of claims 1 to 5, wherein the plurality of portions include portions that overlap one another.
  7.  The flatness acquisition system according to any one of claims 1 to 6, wherein the flatness acquisition unit acquires the flatness of the target part of the object based on the distances between the imaging device and each of the plurality of points acquired by the distance acquisition unit in each of the plurality of portions and on an amount of change of the relative position of the object holder and the imaging device in the first direction effected by the first direction relative position changing unit.
  8.  The flatness acquisition system according to any one of claims 1 to 7, wherein the object includes a main body and a plurality of functional portions provided on the main body,
     the target part includes at least some of the plurality of functional portions, and
     the flatness acquisition unit acquires a flatness of a virtual plane that is a set of predetermined points of at least some of the plurality of functional portions.
  9.  A flatness acquisition system comprising:
     an imaging unit including an imaging device that acquires an image of a predetermined imaging region;
     a distance acquisition unit that acquires, based on a captured image of the portion captured by the imaging device, each of the distances between the imaging device and each of a plurality of points of a portion, located within the imaging region, of a target part of an object held by an object holder, the target part being on a side of the object opposite to the side facing the object holder;
     a relative position changing unit that changes a relative position of the object holder and the imaging device by moving the object holder;
     a distance acquisition control unit that, each time the portion of the target part located within the imaging region is changed as a result of the object holder being moved by the relative position changing unit in a second direction orthogonal to a first direction that is a direction parallel to an axis of the imaging device, causes the distance acquisition unit to acquire each of the distances between the imaging device and each of the plurality of points of the changed portion located within the imaging region; and
     a flatness acquisition unit that sets a plurality of the portions in the target part of the object and acquires a flatness of the target part based on the distances between the imaging device and each of the plurality of points acquired by the distance acquisition unit in each of the plurality of portions,
     wherein the relative position changing unit moves the object holder in the second direction and also in the first direction, based on the distances between the imaging device and each of at least two of the plurality of points of the portion acquired by the distance acquisition unit, thereby positioning the portion of the object held by the object holder that is located within the imaging region within a depth of field of the imaging device.
  10.  A mounting machine that mounts an electronic component on a circuit board, the mounting machine comprising:
     a component holder that holds the electronic component;
     a holder moving device including a horizontal moving device that moves the component holder in a horizontal direction and a vertical moving device that moves the component holder in a vertical direction;
     an imaging unit located below the component holder and including at least an imaging device that has an axis extending in the vertical direction and acquires an image of a predetermined imaging region;
     a height acquisition unit that acquires a height from the imaging device of each of a plurality of points of a portion, located within the imaging region, of a bottom of the electronic component held by the component holder, the bottom being the side to be mounted on the circuit board;
     a relative position control unit that controls a relative position of the component holder and the imaging device by controlling the holder moving device;
     a height acquisition control unit that, each time the portion of the bottom located within the imaging region is changed by a change of the relative position of the component holder and the imaging device in the horizontal direction under control of the horizontal moving device by the relative position control unit, causes the height acquisition unit to acquire the height of each of the plurality of points of the portion; and
     a flatness acquisition unit that sets a plurality of the portions on the bottom of the electronic component and acquires a flatness of the bottom based on the heights of the plurality of points acquired by the height acquisition unit in each of the plurality of portions,
     wherein the relative position control unit includes a relative height control unit that, based on the heights acquired by the height acquisition unit for each of at least two of the plurality of points of the portion, controls the holder moving device so as to move the component holder in the horizontal direction and also in the vertical direction, thereby positioning the portion of the bottom of the electronic component held by the component holder that is located within the imaging region within a depth of field of the imaging device.
PCT/JP2018/024223 2018-06-26 2018-06-26 Flatness acquisition system and mounting machine WO2020003384A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/024223 WO2020003384A1 (en) 2018-06-26 2018-06-26 Flatness acquisition system and mounting machine
JP2020526757A JP7181292B2 (en) 2018-06-26 2018-06-26 Flatness acquisition system and mounter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/024223 WO2020003384A1 (en) 2018-06-26 2018-06-26 Flatness acquisition system and mounting machine

Publications (1)

Publication Number Publication Date
WO2020003384A1 true

Family

ID=68986854

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/024223 WO2020003384A1 (en) 2018-06-26 2018-06-26 Flatness acquisition system and mounting machine

Country Status (2)

Country Link
JP (1) JP7181292B2 (en)
WO (1) WO2020003384A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090180679A1 (en) * 1998-07-08 2009-07-16 Charles A. Lemaire Method and apparatus for parts manipulation, inspection, and replacement
JP2007121981A (en) * 2005-09-30 2007-05-17 Matsushita Electric Ind Co Ltd Substrate test method
JP2013115383A (en) * 2011-11-30 2013-06-10 Fuji Mach Mfg Co Ltd Electronic component mounting device
JP2017142188A (en) * 2016-02-12 2017-08-17 Ckd株式会社 Three-dimensional measurement device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220037175A1 (en) * 2018-10-15 2022-02-03 Koh Young Technology Inc. Apparatus, method and recording medium storing command for inspection
US11694916B2 (en) * 2018-10-15 2023-07-04 Koh Young Technology Inc. Apparatus, method and recording medium storing command for inspection
US20210254970A1 (en) * 2020-02-19 2021-08-19 Faro Technologies, Inc. System and method for verifying a position of a component on an object
US11867499B2 (en) * 2020-02-19 2024-01-09 Faro Technologies, Inc. System and method for verifying a position of a component on an object

Also Published As

Publication number Publication date
JPWO2020003384A1 (en) 2021-07-15
JP7181292B2 (en) 2022-11-30

Similar Documents

Publication Publication Date Title
EP3322275B1 (en) Mounting device, photographic processing method, and photographic unit
CN105309064A (en) Component mounting device and component mounting method
WO2020003384A1 (en) Flatness acquisition system and mounting machine
WO2016203639A1 (en) Component mounting device, and component mounting evaluation method for component mounting device
WO2014174598A1 (en) Component mounting device, mounting head, and control device
JP6147016B2 (en) Assembly machine
JP2009117624A (en) Component mounting apparatus
JP6190229B2 (en) Component mounting equipment
JP6712260B2 (en) Mounting device, imaging processing method, and imaging unit
CN108702867B (en) Mounting device and shooting processing method
JP6116583B2 (en) Component mounter
JP6411663B2 (en) Component mounting equipment
JP5988839B2 (en) Component mounter
JP5507378B2 (en) Electronic component mounting equipment
JP6849815B2 (en) How to determine the component mounting device, shooting method, and mounting order
JP2013236011A (en) Component mounting device
WO2016092673A1 (en) Component-mounting machine
WO2015052755A1 (en) Mounting device
JP5254106B2 (en) Electronic component mounting equipment
JP7095089B2 (en) Mounting machine and mounting system
JP6990309B2 (en) Surface mounter
JP2013096831A (en) Component mounting substrate production apparatus and three-dimensional shape measurement apparatus
JP6670590B2 (en) Image generation device, mounting device, and image generation method
JP6204050B2 (en) Electronic component mounting machine
TWI709826B (en) Correction information generating method, drawing method, correction information generating apparatus and drawing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18923904; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2020526757; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18923904; Country of ref document: EP; Kind code of ref document: A1)