CN114747308A - Component imaging device and component mounting device - Google Patents

Component imaging device and component mounting device

Info

Publication number
CN114747308A
CN114747308A (application CN202080082450.3A)
Authority
CN
China
Prior art keywords
component
imaging
mark
unit
substrate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080082450.3A
Other languages
Chinese (zh)
Inventor
冈田康一
森秀雄
松田鹰则
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Publication of CN114747308A publication Critical patent/CN114747308A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05KPRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
    • H05K13/00Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
    • H05K13/08Monitoring manufacture of assemblages

Landscapes

  • Engineering & Computer Science (AREA)
  • Operations Research (AREA)
  • Manufacturing & Machinery (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Supply And Installment Of Electrical Components (AREA)

Abstract

The component imaging device includes a component imaging unit, a mark imaging unit, and a positional deviation amount calculation unit. The component imaging unit is provided in the imaging element and images the component. The mark is disposed outside the field of view of the component imaging unit. The mark imaging unit is provided in the imaging element and images the mark. The positional deviation amount calculation unit calculates the positional deviation amount of the field of view of the component imaging unit based on the imaging result of the mark imaged by the mark imaging unit.

Description

Component imaging device and component mounting device
Technical Field
The present invention relates to a component imaging apparatus for imaging a component and a component mounting apparatus for mounting a component on a substrate.
Background
A component mounting device includes a component supply unit, a mounting head that horizontally transfers a component supplied from the component supply unit, and a substrate transport conveyor that holds a substrate. The mounting head has a suction nozzle that takes out the component and mounts it on the substrate. To improve the mounting accuracy of components, for example, the component mounting apparatus described in patent document 1 includes a component recognition camera (component imaging apparatus) and a substrate recognition camera (substrate imaging apparatus). The component recognition camera images the component held by the suction nozzle to detect its position. The substrate recognition camera moves integrally with the mounting head and images a mark formed on the substrate.
The mounting head is attachable to and detachable from the main body of the component mounting apparatus. To correct the positional relationship between the suction nozzle of the mounting head and the substrate recognition camera and thereby ensure the mounting accuracy of the component, patent document 1 describes the following procedure. A glass plate on which a reference mark is formed is placed on the upper surface of the component recognition camera, so that the reference mark lies within the field of view of the component recognition camera. The mounting head is moved so that the suction nozzle enters this field of view. In this state, the component recognition camera detects the positions of the suction nozzle and the reference mark. The substrate recognition camera then images the reference mark, and the positional relationship between the suction nozzle and the substrate recognition camera is calculated from the imaged reference mark and the position of the optical axis of the substrate recognition camera.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open publication No. 2004-179636
Disclosure of Invention
A component imaging device of the present invention includes a component imaging unit, a mark imaging unit, and a positional deviation amount calculation unit. The component imaging unit is provided in an imaging element and images a component. A mark is disposed outside the field of view of the component imaging unit. The mark imaging unit is provided in the imaging element and images the mark. The positional deviation amount calculation unit calculates the positional deviation amount of the field of view of the component imaging unit based on the imaging result of the mark imaged by the mark imaging unit.
The component mounting device of the present invention includes a suction nozzle, a component imaging device, a substrate imaging device, and an offset amount calculation unit. The suction nozzle holds a component and mounts it on a substrate. The component imaging device images the component held by the suction nozzle. The substrate imaging device moves integrally with the suction nozzle and images the substrate. The component imaging device includes a component imaging unit, a mark, and a mark imaging unit. The component imaging unit is provided in an imaging element and images the component. The mark is disposed outside the field of view of the component imaging unit. The mark imaging unit is provided in the imaging element and images the mark. The offset amount calculation unit calculates the offset amount between the suction nozzle and the substrate imaging device based on the imaging result of the mark imaged by the mark imaging unit of the component imaging device and the imaging result of the mark imaged by the substrate imaging device.
According to the present invention, the shift over time of the optical axis of the camera that images the component can be calculated, and the shift of the mounting position of the component caused by that shift can be corrected. In addition, the offset between the suction nozzle and the substrate imaging device can be calculated with high accuracy during production of mounting substrates.
Drawings
Fig. 1 is a plan view of a component mounting apparatus according to an embodiment of the present invention.
Fig. 2 is a perspective side view showing the structure of the component mounting apparatus shown in fig. 1.
Fig. 3A is a plan view of a transparent member of the component recognition camera included in the component mounting apparatus shown in fig. 1 and 2.
Fig. 3B is an explanatory diagram of the configuration of the component recognition camera included in the component mounting apparatus shown in fig. 2.
Fig. 3C is a plan view of the sensor substrate of the component recognition camera included in the component mounting device shown in fig. 1 and 2.
Fig. 4 is a diagram illustrating a case where the component held by the suction nozzle is photographed by the component recognition camera shown in fig. 3B.
Fig. 5 is an explanatory diagram of the configuration of the board recognition camera included in the component mounting apparatus shown in fig. 1 and 2.
Fig. 6 is a functional block diagram showing the configuration of a control system of the component mounting apparatus shown in fig. 1 and 2.
Fig. 7 is a diagram illustrating a shift of the optical axis due to heat generation of the component recognition camera shown in fig. 3B.
Fig. 8 is a diagram showing an example of an image of the mark captured by the mark imaging unit of the component recognition camera shown in fig. 3B.
Fig. 9 is a diagram showing an example of an image of a component held by a suction nozzle, captured by the component imaging unit of the component recognition camera shown in fig. 3B.
Fig. 10 is a diagram illustrating a case where the mark provided in the component recognition camera shown in fig. 3B is photographed by the board recognition camera shown in fig. 5.
Fig. 11 is a diagram showing an example of a captured image of a mark provided to the component recognition camera captured by the substrate recognition camera shown in fig. 10.
Detailed Description
Before describing the embodiments of the present invention, the circumstances that led to the present invention will be briefly described. A component recognition camera in a conventional component mounting device detects the position of the component held by the suction nozzle with reference to the position of the optical axis of the camera. However, while mounting substrates are produced continuously, the optical axis of the camera may shift owing to changes over time caused by, for example, heat generation of the imaging element included in the component recognition camera. Therefore, to maintain the mounting accuracy of components, the deviation of the optical axis of the component recognition camera must be detected periodically and the mounting position of the component corrected appropriately. The conventional techniques, including patent document 1, do not disclose a method of detecting a shift of the optical axis of the component recognition camera during production of mounting substrates, and there is room for further improvement in the mounting accuracy of components.
The present invention provides a component imaging device that can detect the deviation of the optical axis of the camera that images a component and correct the mounting position of the component based on that deviation. That is, the component imaging device calculates the positional displacement amount of the field of view of the component imaging unit, which is the camera that images the component. The present invention also provides a component mounting device that can calculate the offset between the suction nozzle and the substrate imaging device with high accuracy during production of mounting substrates.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. The configurations, shapes, and the like described below are examples for explanation, and can be changed as appropriate according to the specifications of the component mounting device, the substrate recognition camera, and the component recognition camera. In the following, corresponding elements in all the drawings are denoted by the same reference numerals, and redundant description is omitted. In each drawing, the substrate is transported along the X axis. The Y axis is orthogonal to the X axis in the horizontal plane. The Z axis is orthogonal to the horizontal plane and points vertically upward.
First, the structure of the component mounting device 1 will be described with reference to figs. 1 and 2. Fig. 2 schematically shows a part of the component mounting device 1 of fig. 1. The component mounting device 1 performs a component mounting operation of mounting a component D supplied from the component supply unit 4 on a substrate 3. As shown in fig. 1, a substrate conveying mechanism (hereinafter referred to as the conveying mechanism) 2 is disposed along the X axis at the center of the base 1a. The conveying mechanism 2 carries in the substrate 3 conveyed from upstream, positions it at the mounting work position, and holds it. The conveying mechanism 2 carries the substrate 3 on which the component mounting operation has been completed out to the downstream side.
The component supply units 4 are disposed on both sides of the conveying mechanism 2 along the Y axis. A plurality of tape feeders 5 are mounted in parallel on each component supply unit 4. The tape feeder 5 pitch-feeds the tape 11, in which recessed portions holding the components D are formed, in the direction (tape feed direction) from the outside of the component supply unit 4 toward the conveying mechanism 2. The tape feeder 5 thereby supplies the component D to the component supply position 5a (see fig. 2) where the component D is taken out by the mounting head 8 described below.
Y-axis tables 6 extending along the Y axis are disposed at both ends of the upper surface of the base 1a in the X direction. Each Y-axis table 6 has a linear drive mechanism. A beam 7 is coupled to the pair of Y-axis tables 6 so as to connect them to each other. The beam 7 extends along the X axis and is movable along the Y axis by the linear drive mechanisms of the Y-axis tables 6. The beam 7 has a linear drive mechanism similar to that of the Y-axis tables 6. The mounting head 8 is mounted on the beam 7 and is movable along the X axis by the linear drive mechanism of the beam 7. As shown in fig. 2, the mounting head 8 has a suction unit 8a. The suction unit 8a can be raised and lowered and holds the component D by suction. A suction nozzle 8b for holding the component D by suction is attached to the lower end of the suction unit 8a.
In fig. 1, the Y-axis tables 6 and the beam 7 constitute a head moving mechanism (hereinafter referred to as the moving mechanism) 9 that moves the mounting head 8 along the X axis and the Y axis. The moving mechanism 9 and the mounting head 8 perform the following mounting cycle: the component D is sucked and taken out by the suction nozzle 8b from the component supply position 5a of the tape feeder 5 disposed in the component supply unit 4, and is mounted at the mounting position on the substrate 3 positioned on the conveying mechanism 2. That is, the Y-axis tables 6, the beam 7, and the mounting head 8 use the suction nozzle 8b to hold the component D supplied to the component supply position 5a of the tape feeder 5 and mount it on the substrate 3.
A component recognition camera (hereinafter referred to as the first camera) 20 is disposed between the component supply unit 4 and the conveying mechanism 2. When the mounting head 8 that has taken the component D out of the component supply unit 4 moves above the first camera 20, the first camera 20 images the component D held by the mounting head 8 and recognizes the posture of the component D. A substrate recognition camera 30 (hereinafter referred to as the second camera) is mounted on the plate 7a to which the mounting head 8 is attached. The second camera 30 moves integrally with the mounting head 8.
As the mounting head 8 moves, the second camera 30 moves above the substrate 3 positioned on the conveying mechanism 2 and images the substrate mark 3a provided on the substrate 3 to recognize the position of the substrate 3. In the component mounting operation in which the mounting head 8 mounts the component on the substrate 3, the mounting position is corrected in consideration of the recognition result of the component D by the first camera 20 and the recognition result of the position of the substrate 3 by the second camera 30.
In fig. 2, a carriage 10 is provided in the component supply unit 4. The carriage 10 has a feeder base 10a on which a plurality of tape feeders 5 are mounted in advance. The carriage 10 also holds a reel 12, which stores the tape 11 holding the components D in a wound state. The tape 11 pulled out from the reel 12 is pitch-fed to the component supply position 5a by the tape feeder 5.
In fig. 1, a touch panel 13 for an operator to operate is provided on the front surface of the component mounting device 1 at a position where the operator performs work. The touch panel 13 displays various information on its display unit. The operator inputs data from the touch panel 13 or operates the component mounting device 1 using operation buttons or the like displayed on the display unit.
Next, the structure of the first camera 20 will be described with reference to figs. 3A to 3C. As shown in fig. 3B, the first camera 20 includes a housing 21, a sensor substrate 23, a lens 24, a component illumination unit 25, and a transparent member 26. The sensor substrate 23 is provided on the bottom 21a inside the housing 21. An imaging element 22 such as a two-dimensional CMOS (complementary metal oxide semiconductor) sensor is mounted on the sensor substrate 23. The lens 24 is provided above the sensor substrate 23 in the housing 21, and the component illumination unit 25 is provided above the lens 24. The transparent member 26 is fitted into a cutout formed in the upper portion 21b of the housing 21.
The transparent member 26 is made of plate-like glass or the like that transmits light. A mark 27 for detecting a shift of the optical axis of the imaging element 22 is provided on a part of the upper surface 26c of the transparent member 26. As shown in fig. 3A, the mark 27 is arranged outside the component imaging field of view 26a used when the imaging element 22 images the component D held by the suction nozzle 8b, and inside the mark imaging field of view 26b used when the imaging element 22 images the mark 27. In this example, five disks formed of an opaque metal thin film are disposed at equal intervals along the Y axis as the marks 27 within the mark imaging field of view 26b. The shape, number, and arrangement of the marks 27 shown in fig. 3A are examples, and are not limited to this example. As shown in fig. 3B, the transparent member 26 is provided on the first optical path Pa along which the component imaging unit 22a images the component D and on the second optical path Pb along which the mark imaging unit 22b images the mark 27.
As shown in fig. 3C, the imaging element 22 includes a component imaging unit 22a and a mark imaging unit 22b arranged side by side along the X axis. The component imaging unit 22a and the mark imaging unit 22b each extend along the Y axis; that is, the length of each along the X axis is shorter than its length along the Y axis. The component imaging unit 22a images the component D held by the suction nozzle 8b. The mark imaging unit 22b images the mark 27 provided on the transparent member 26. As shown in figs. 3B and 4, the component imaging unit 22a images the component D whose lower surface Da is located at a component recognition height Ha set higher than the mark recognition height Hb of the upper surface 26c of the transparent member 26. As described above, the imaging element 22 is provided with the component imaging unit 22a that images the component D and the mark imaging unit 22b that images the mark 27. As shown in fig. 3A, the mark 27 is provided in the mark imaging field of view 26b, outside the field of view of the component imaging unit 22a, i.e., the component imaging field of view 26a.
The component imaging unit 22a is provided in a part of the imaging element 22, and the mark imaging unit 22b is provided in a part of the imaging element 22 different from the component imaging unit 22a. Since the component imaging unit 22a and the mark imaging unit 22b are thus provided on the same imaging element 22, the accuracy of detecting the positional displacement amount of the field of view of the component imaging unit 22a is improved, as will be described later.
As shown in fig. 3B, the lens 24 disposed above the imaging element 22 forms an image of an imaging target (such as the component D) located above the transparent member 26 at the component recognition height Ha on the component imaging unit 22a. The lens 24 also forms an image of the mark 27, which is formed on the transparent member 26 and located at the mark recognition height Hb, on the mark imaging unit 22b. The physical distance Lpa of the first optical path Pa from the component imaging unit 22a to the imaging target located at the component recognition height Ha differs from the physical distance Lpb of the second optical path Pb from the mark imaging unit 22b to the mark 27.
As shown in figs. 3B and 3C, a correction member 28, made of glass or the like that transmits light and has a refractive index different from that of air, is provided on the component imaging unit 22a. The refractive index and the thickness T of the correction member 28 are set so as to correct the optical path length Loa (optical distance) of the first optical path Pa, so that images are focused on the component imaging unit 22a and the mark imaging unit 22b, respectively. Thus, the lens 24 can simultaneously focus the image of the imaging target formed on the component imaging unit 22a and the image of the mark 27 formed on the mark imaging unit 22b.
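As a rough way to see why both the refractive index and the thickness matter (this is the standard paraxial relation for a plane-parallel plate and is not stated in the patent itself), a plate of thickness T and refractive index n inserted into a converging beam displaces the focal plane along the optical axis by approximately ΔL = T × (1 − 1/n); for glass with n of about 1.5, ΔL is roughly T/3. T and n of the correction member 28 can therefore be chosen so that ΔL compensates the focus difference arising from the unequal physical distances Lpa and Lpb, letting both optical paths come to focus on the same sensor plane.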
In the present embodiment, the correction member 28 is provided on the component imaging unit 22a, but the correction member 28 may be provided partway along either the first optical path Pa or the second optical path Pb. That is, the first camera 20 includes the correction member 28, which corrects one of the optical path length Loa and the optical path length Lob, on at least one of the first optical path Pa along which the component imaging unit 22a images the component D and the second optical path Pb along which the mark imaging unit 22b images the mark 27. Correction members 28 having different refractive indices and/or thicknesses T may also be provided on both the first optical path Pa and the second optical path Pb. Furthermore, if the depth of field of the lens 24 is deep enough that the images of the imaging target and the mark 27 can both be brought into focus at the same time without the correction member 28, the correction member 28 may be omitted.
As shown in fig. 3B, the component illumination unit 25 includes a side-emitting illumination unit 25a, a coaxial illumination unit 25b, and a half mirror 25c. The side-emitting illumination unit 25a has a plurality of light-emitting diode (LED) chips and the like, and illuminates the mark 27 and the imaging target (the component D and the like) located at the component recognition height Ha from obliquely below. The coaxial illumination unit 25b has a plurality of LED chips and the like, and is disposed below the side-emitting illumination unit 25a.
The half mirror 25c is disposed partway along the first optical path Pa and the second optical path Pb. The half mirror 25c reflects the light emitted from the coaxial illumination unit 25b upward toward the imaging target (the component D and the like) located at the component recognition height Ha and toward the mark 27. That is, the coaxial illumination unit 25b and the half mirror 25c illuminate the imaging target and the mark 27 coaxially with the first optical path Pa and the second optical path Pb.
The component illumination unit 25 switches between illumination by the side-emitting illumination unit 25a and illumination by the coaxial illumination unit 25b, or combines them, depending on the material of the imaging target and the like. In this way, the first camera 20 has an illumination unit (the component illumination unit 25) that illuminates the mark 27.
Next, the imaging of the component D by the first camera 20 will be described with reference to fig. 4. The mounting head 8 holding the component D by the suction nozzle 8b moves along the X axis as shown by arrow a, so that the lower surface Da of the component D is positioned at the component recognition height Ha and the component D passes over the first optical path Pa of the first camera 20. During this time, the component illumination unit 25 illuminates the component D. The first camera 20 transmits the imaging result captured by the imaging element 22 (the component imaging unit 22a and the mark imaging unit 22b) to the control unit 40 shown in fig. 6.
Next, the structure of the second camera 30 will be described with reference to fig. 5. The second camera 30 includes a housing 31, a window member 34, and a camera unit 32 and a substrate illumination unit 33 provided inside the housing 31. The window member 34 is made of light-transmitting glass or the like and is fitted into a cutout formed in the lower portion of the housing 31. The camera unit 32 includes an imaging element such as a two-dimensional CMOS sensor, a lens, and the like, with its optical axis directed downward.
The substrate illumination unit 33 includes a side-emitting illumination unit 33a that illuminates the imaging target below (such as the substrate 3) from obliquely above, a coaxial illumination unit 33b, and a half mirror 33c. The half mirror 33c reflects the light emitted from the coaxial illumination unit 33b downward toward the imaging target. The substrate illumination unit 33 switches between illumination by the side-emitting illumination unit 33a and illumination by the coaxial illumination unit 33b, or combines them, depending on the material of the object imaged by the camera unit 32 and the like.
The second camera 30 images imaging targets such as the substrate mark 3a formed on the substrate 3 and the mark 27 provided on the transparent member 26 of the first camera 20. At this time, the second camera 30 moves above the imaging target together with the mounting head 8, and the camera unit 32 images the imaging target while the substrate illumination unit 33 illuminates it. In this way, the second camera 30 is a substrate imaging device that moves integrally with the suction nozzle 8b and images the substrate 3, on which the substrate mark 3a is formed, and the mark 27. The second camera 30 transmits the imaging result captured by the camera unit 32 to the control unit 40.
Next, the configuration of the control system of the component mounting device 1 will be described with reference to fig. 6. The component mounting device 1 has the control unit 40, the conveying mechanism 2, the tape feeders 5, the mounting head 8, the moving mechanism 9, the first camera 20, the second camera 30, and the touch panel 13. The control unit 40 includes an imaging processing unit (hereinafter referred to as the first processing unit) 41, a positional deviation amount calculation unit (hereinafter referred to as the first calculation unit) 42, an offset amount calculation unit (hereinafter referred to as the second calculation unit) 43, a mounting operation processing unit (hereinafter referred to as the second processing unit) 44, and a mounting storage unit (hereinafter referred to as the storage unit) 45. The storage unit 45 stores mounting data 46, imaging data 47, a mark field-of-view positional displacement amount (hereinafter referred to as the first positional displacement amount) 48, a component field-of-view positional displacement amount (hereinafter referred to as the second positional displacement amount) 49, a substrate field-of-view positional displacement amount (hereinafter referred to as the third positional displacement amount) 50, an offset 51, and the like.
The first processing unit 41, the first calculation unit 42, the second calculation unit 43, and the second processing unit 44 constituting the control unit 40 are each constituted by a CPU (central processing unit) or an LSI (large-scale integrated circuit), and may include memory as needed. These may be constituted by dedicated circuits, or may be realized by controlling general-purpose hardware with software read from a temporary or non-temporary storage device or a recording medium. Two or more of them may be integrated. The storage unit 45 is constituted by a rewritable RAM, a flash memory, a hard disk, or the like. The storage unit 45 may be configured as a plurality of memories that individually store the mounting data 46 through the offset 51, or may be configured as a single memory that stores these data collectively.
The mounting data 46 includes information necessary for the component mounting work such as the size of the substrate 3, the type of the mounted component D, and the mounting position (XY coordinates) for each substrate 3. The first processing unit 41 controls the mark imaging unit 22b and the component illumination unit 25 of the first camera 20 when detecting (calculating) a positional shift of the imaging field of view of the first camera 20. Specifically, the first processing unit 41 causes the component illumination unit 25 to illuminate the mark 27 provided on the transparent member 26, and causes the mark imaging unit 22b to image the mark 27. The first processing unit 41 stores the imaging result in the storage unit 45 as imaging data 47.
Here, the result of imaging the mark 27 with the first camera 20 will be described with reference to figs. 7 and 8. The first camera 20 repeatedly images the component D held by the suction nozzle 8b during the component mounting work. In this process, the sensor substrate 23 and the housing 21 may deform over time owing to heat generation of the imaging element 22 and of the component illumination unit 25. Fig. 7 shows a state in which, because of such a change over time, the sensor substrate 23 has deformed, the imaging element 22 has been positionally displaced in the XY plane, and the first optical path Pa (optical axis) of the component imaging unit 22a and the second optical path Pb (optical axis) of the mark imaging unit 22b have deviated from the initial state. On the X axis, the first optical path Pa is shifted by the positional shift amount ΔXd, and the second optical path Pb is shifted by the positional shift amount ΔXm.
In fig. 7, in the initial state in which the first camera 20 is not deformed, the position of the first optical path Pa at the component recognition height Ha is defined as the initial component field-of-view center position Cd0, and the position of the second optical path Pb at the mark recognition height Hb is defined as the initial mark field-of-view center position Cm0. In the state in which the imaging element 22 is positionally displaced in the XY plane, the position of the first optical path Pa at the component recognition height Ha is defined as the component field-of-view center position Cd, and the position of the second optical path Pb at the mark recognition height Hb is defined as the mark field-of-view center position Cm.
Fig. 8 shows a mark image 60, which is an image of the mark 27 captured by the first camera 20 in the state in which the imaging element 22 is displaced in the XY plane as shown in fig. 7. In the mark image 60, the five marks 27 imaged by the mark imaging unit 22b appear in a line. The center of the third of the five aligned marks 27 (hereinafter referred to as the center mark 27A) is the initial mark field-of-view center position Cm0. The center 60c of the mark image 60 is the mark field-of-view center position Cm.
In the initial state in which the first camera 20 is not deformed, the initial mark field-of-view center position Cm0 coincides with the center 60c of the mark image 60, and the five marks 27 appear at the positions indicated by the broken lines in the image captured by the first camera 20. In fig. 8, because the imaging element 22 is positionally displaced, the position of the second optical path Pb at the mark recognition height Hb is shifted away from the center of the center mark 27A.
The first calculation unit 42 shown in fig. 6 calculates the first positional displacement amount 48, which is the amount by which the field of view of the mark imaging unit 22b has shifted from the initial state, based on the mark image 60, and stores it in the storage unit 45. Specifically, the first calculation unit 42 calculates the shift amount ΔXm on the X axis and the shift amount ΔYm on the Y axis from the position of the center mark 27A relative to the center 60c of the mark image 60 shown in fig. 8. It further calculates the shift amount Δθm in the θ direction, which is the rotational direction about the Z axis, from the inclination of the straight line connecting the centers of the five marks 27.
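As an illustration of this calculation, the following minimal sketch in Python derives the X/Y shifts from the center mark and the θ shift from the line through the five mark centers. The function name, argument names, pixel-to-millimetre conversion, and sign convention are hypothetical and not taken from the patent.

import math

def mark_field_shift(mark_centers_px, image_center_px, pixel_pitch_mm):
    # mark_centers_px: (x, y) centroids of the 5 marks, ordered along the Y axis
    # image_center_px: (x, y) of the mark image center 60c (= Cm)
    # pixel_pitch_mm:  scale from pixels to millimetres at the mark recognition height Hb
    cx, cy = image_center_px
    mx, my = mark_centers_px[2]               # center mark 27A corresponds to Cm0
    d_xm = (mx - cx) * pixel_pitch_mm         # shift amount on the X axis (sign convention assumed)
    d_ym = (my - cy) * pixel_pitch_mm         # shift amount on the Y axis
    # theta shift: inclination of the line through the mark centers,
    # which is nominally parallel to the Y axis
    (x0, y0) = mark_centers_px[0]
    (x4, y4) = mark_centers_px[-1]
    d_theta_m = math.atan2(x4 - x0, y4 - y0)  # radians, rotation about the Z axis
    return d_xm, d_ym, d_theta_m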
Fig. 9 shows a component captured image (hereinafter referred to as the first image) 61, which is an image of the component D captured by the first camera 20 in the state in which the imaging element 22 shown in fig. 7 is positionally displaced in the XY plane. The suction nozzle 8b holds the center of the component D. The center of the component D appearing in the first image 61 is the initial component field-of-view center position Cd0, and the center 61c of the first image 61 is the component field-of-view center position Cd. In the initial state in which the first camera 20 is not deformed, the initial component field-of-view center position Cd0 coincides with the center 61c of the first image 61, and the component D and the suction nozzle 8b holding it appear at the positions indicated by the broken lines.
The second positional displacement amount 49, which is the amount by which the field of view of the component imaging unit 22a has shifted from the initial state based on the first image 61, is represented by the shift amount ΔXd on the X axis and the shift amount ΔYd on the Y axis. That is, the shift of the center of the component D relative to the center 61c of the first image 61 gives the shift amounts ΔXd and ΔYd. The rotation of the component D (such as the inclination of each side of the component D) gives the shift amount Δθd in the θ direction, which is the rotational direction about the Z axis.
In the first camera 20, the component imaging unit 22a and the mark imaging unit 22b are formed in a single imaging element 22. Therefore, when the first camera 20 changes over time, the shift of the first optical path Pa of the component imaging unit 22a and the shift of the second optical path Pb of the mark imaging unit 22b occur in a linked manner. Consequently, the first positional displacement amount 48 and the second positional displacement amount 49 have a predetermined relationship (for example, a proportional relationship) determined by the structure of the first camera 20, and the second positional displacement amount 49 can be calculated from the first positional displacement amount 48.
The first calculation unit 42 calculates the second positional displacement amount 49 by applying a predetermined calculation to the first positional displacement amount 48 calculated from the mark image 60, and stores it in the storage unit 45. The predetermined calculation is, for example, to multiply the X-axis component and the Y-axis component by predetermined coefficients and to carry the θ-direction component over unchanged. That is, the first calculation unit 42 calculates the positional displacement amount (the second positional displacement amount 49) of the field of view of the component imaging unit 22a based on the mark image 60, which is the imaging result of the mark 27 imaged by the mark imaging unit 22b. The component imaging unit 22a, the mark imaging unit 22b, the mark 27, and the first calculation unit 42 constitute a component imaging device that images the component D held by the suction nozzle 8b and calculates the second positional displacement amount 49.
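A minimal sketch of this conversion is shown below; the coefficient names and default values are placeholders, not values from the patent, and the actual coefficients are determined by the structure of the first camera 20.

def second_field_shift(d_xm, d_ym, d_theta_m, kx=1.0, ky=1.0):
    # Derive the second positional displacement amount 49 (component field of view)
    # from the first positional displacement amount 48 (mark field of view).
    d_xd = kx * d_xm        # X component scaled by a camera-specific coefficient
    d_yd = ky * d_ym        # Y component scaled by a camera-specific coefficient
    d_theta_d = d_theta_m   # theta component carried over unchanged
    return d_xd, d_yd, d_theta_d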
The second processing unit 44 shown in fig. 6 controls each unit of the component mounting device 1 based on the mounting data 46. Specifically, it controls the suction nozzle 8b of the mounting head 8 so as to hold the component D supplied by the tape feeder 5, causes the first camera 20 to detect the positional deviation of the component D held by the suction nozzle 8b, and then controls the suction nozzle 8b so as to mount the component D on the substrate 3 held at the mounting work position. The second processing unit 44 controls this series of component mounting operations. At this time, the second processing unit 44 corrects the position of the component D relative to the suction nozzle 8b detected by the first camera 20, based on the second positional displacement amount 49 stored in the storage unit 45.
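As a sketch of that correction step, the detected component position can simply have the field-of-view drift removed; the subtraction and its sign convention are an assumption for illustration, since the patent only states that the detected position is corrected using the stored displacement amount.

def corrected_component_position(x, y, theta, second_shift):
    # Remove the field-of-view drift of the component imaging unit 22a
    # from the component position detected by the first camera 20.
    d_xd, d_yd, d_theta_d = second_shift
    return x - d_xd, y - d_yd, theta - d_theta_d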
The mark imaging unit 22b can image the mark 27 while the mounting head 8 is taking the component D out of the tape feeder 5 or mounting the component on the substrate 3. That is, the mark imaging unit 22b can image the mark 27 independently of the motion of the mounting head 8 and the like during the idle time in which the mounting head 8 is not positioned above the first camera 20. During this idle time, therefore, the mark imaging unit 22b can image the mark 27 and update the second positional displacement amount 49 to the latest value. Thus, by detecting the deviation of the optical axis of the first camera 20 that images the component D (the second positional displacement amount 49) and correcting the mounting position of the component D, the component D can be accurately mounted at the proper mounting position without reducing the efficiency of the component mounting work.
The first processing unit 41 shown in fig. 6 detects (calculates) the positional displacement of the mounting head 8, the suction nozzle 8b, and the second camera 30 caused by deformation of the moving mechanism 9 over time. Specifically, the first processing unit 41 controls the moving mechanism 9, the second camera 30, and the first camera 20, and causes the camera unit 32 to image the mark 27 formed on the transparent member 26. The first processing unit 41 stores the imaging result in the storage unit 45 as imaging data 47.
Next, the imaging of the mark 27 by the second camera 30 and the imaging result will be described in detail with reference to figs. 10 and 11. When the mark 27 is imaged by the second camera 30, the first processing unit 41 shown in fig. 6 first controls the moving mechanism 9 to move the second camera 30 to the position for imaging the mark 27 provided in the first camera 20. Next, the first processing unit 41 controls the component illumination unit 25 of the first camera 20 to illuminate the mark 27 provided on the transparent member 26 from below, and controls the camera unit 32 shown in fig. 10 to image the mark 27.
That is, when the second camera 30 serving as the substrate imaging device images the mark 27, the component illumination unit 25 illuminates the mark 27 by transmission illumination. Thus, even when dirt adheres to the transparent member 26, for example, the second camera 30 can clearly image the mark 27. The camera unit 32 may instead image the mark 27 while the substrate illumination unit 33 illuminates it. Transmission illumination by the component illumination unit 25 is one example of the type of illumination, and the illumination is not limited to this example.
The moving mechanism 9 repeatedly moves the mounting head 8 during the component mounting operation. In this process, the moving mechanism 9 may deform over time owing to heat generation of its linear drive mechanisms and the like. The control unit 40 sets various control parameters so that the positions of the mounting head 8 and the suction nozzle 8b are controlled with a base point (not shown) set in the component mounting device 1 as the origin, on the assumption of an initial state in which the moving mechanism 9 is not deformed. When the moving mechanism 9 deforms, the mounting head 8 and the suction nozzle 8b are displaced from their initial positions, and correction is therefore required.
Fig. 10 shows a state in which, although the first processing unit 41 controls the moving mechanism 9 so that the third optical path Ph of the camera unit 32 coincides with the center of the mark 27, the second camera 30 has stopped at a displaced position because of deformation of the moving mechanism 9. As described above, the second camera 30 including the camera unit 32 is moved integrally with the mounting head 8 by the moving mechanism 9. In the initial state in which the moving mechanism 9 is not deformed, the position of the third optical path Ph on the mark 27 is defined as the initial substrate field-of-view center position Ch0. The position of the third optical path Ph on the transparent member 26 in the state in which the moving mechanism 9 is deformed is defined as the substrate field-of-view center position Ch.
Fig. 11 shows a substrate recognition camera image (hereinafter referred to as the second image) 62 of the mark 27 captured by the camera unit 32 of the second camera 30 in the state in which the moving mechanism 9 is deformed as shown in fig. 10. The center 62c of the second image 62 is the substrate field-of-view center position Ch. The center of the center mark 27A (the third of the five marks 27) appearing in the second image 62 is the initial substrate field-of-view center position Ch0. In the initial state in which the moving mechanism 9 is not deformed, the initial substrate field-of-view center position Ch0 coincides with the center 62c of the second image 62, and the five marks 27 appear at the positions indicated by the broken lines.
The second calculation unit 43 shown in fig. 6 calculates the third positional displacement amount 50, which is the amount by which the field of view of the camera unit 32 has shifted from the initial state, based on the second image 62 shown in fig. 11, and stores it in the storage unit 45. Specifically, the second calculation unit 43 calculates the shift amount ΔXh on the X axis and the shift amount ΔYh on the Y axis from the position of the center mark 27A relative to the center 62c of the second image 62. It further calculates the shift amount Δθh in the θ direction, which is the rotational direction about the Z axis, from the inclination of the straight line connecting the centers of the five marks 27.
The second processing unit 44 controls the second camera 30 to image the substrate mark 3a of the substrate 3 held at the mounting work position and recognize the position of the substrate 3. At this time, the second processing unit 44 corrects the recognized stop position of the substrate 3 based on the third positional displacement amount 50 stored in the storage unit 45. In the component mounting work, the position of the suction nozzle 8b must be corrected with respect to the base point set in the component mounting device 1 so that the component D held by the suction nozzle 8b is mounted at the predetermined mounting position on the substrate 3. To correct the position of the suction nozzle 8b, the deformation of the first camera 20 that images the component D held by the suction nozzle 8b must be taken into account in addition to the deformation of the moving mechanism 9.
In fig. 10, the second calculation unit 43 calculates the offset 51 between the suction nozzle 8b and the second camera 30 based on the third positional displacement amount 50 and the first positional displacement amount 48, in addition to the positional relationship (Xn, Yn) between the second camera 30 and the suction nozzle 8b in the mounting head 8. The second calculation unit 43 stores the offset 51 in the storage unit 45. As described above, the first positional displacement amount 48 is the positional displacement amount of the field of view of the first camera 20 and represents the deformation of the first camera 20. The second processing unit 44 corrects the position of the suction nozzle 8b based on the mounting data 46 and the offset 51 stored in the storage unit 45, and controls the suction nozzle 8b so that the component D held by the suction nozzle 8b is mounted at the mounting position on the substrate 3.
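The patent does not spell out the arithmetic that combines these quantities; as one hedged illustration, a simple additive combination of the stored values could look like the following sketch, in which the names and signs are assumptions.

def nozzle_camera_offset(xn, yn, third_shift, first_shift):
    # xn, yn:      nominal positional relationship between the second camera 30
    #              and the suction nozzle 8b in the mounting head 8
    # third_shift: (dXh, dYh, dThetah) from deformation of the moving mechanism 9
    # first_shift: (dXm, dYm, dThetam) from deformation of the first camera 20
    d_xh, d_yh, _ = third_shift
    d_xm, d_ym, _ = first_shift
    offset_x = xn + d_xh - d_xm   # illustrative combination only
    offset_y = yn + d_yh - d_ym
    return offset_x, offset_y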
As described above, the positional displacement amount of the field of view of the first camera 20 is calculated by the first calculation unit 42 based on the result of imaging the mark with the mark imaging unit 22b of the first camera 20. That is, the first calculation unit 42 calculates the first positional displacement amount 48 based on the mark image 60 shown in fig. 8. When the first camera 20 images the mark 27, the first processing unit 41 controls the substrate illumination unit 33 of the second camera 30 so that it illuminates the mark 27 provided on the transparent member 26 from above, and in this state controls the mark imaging unit 22b to image the mark 27. That is, when the first camera 20 images the mark 27, the substrate illumination unit 33 illuminates the mark 27 by transmission illumination. This enables the mark 27 to be imaged clearly even when, for example, dirt adheres to the transparent member 26.
In this way, the second calculation unit 43 calculates the offset 51 between the suction nozzle 8b and the second camera 30 based on the imaging result of the mark 27 captured by the mark imaging unit 22b of the first camera 20 serving as the component imaging device and the imaging result of the mark 27 captured by the second camera 30 serving as the substrate imaging device. That is, the second calculation unit 43 calculates the offset 51 based on the mark image 60 and the substrate recognition camera image 62.
As described above, the component mounting device 1 includes the component imaging device (the first camera 20 and the first calculation unit 42) that images the component D held by the suction nozzle 8b, the substrate imaging device (the second camera 30) that moves integrally with the suction nozzle 8b and images the substrate 3, and the offset amount calculation unit (the second calculation unit) 43 that calculates the offset 51. This makes it possible to accurately calculate the offset 51 between the suction nozzle 8b and the substrate imaging device.
Industrial applicability
The component imaging apparatus of the present invention can calculate the deviation of the optical axis of the camera that images the component. In addition, the component mounting apparatus of the present invention can accurately calculate the offset between the suction nozzle and the substrate imaging device during production of mounting substrates. Both are therefore useful in the field of mounting components on a substrate.
Description of the reference numerals
1: component mounting device, 1 a: base station, 2: substrate conveying mechanism (conveying mechanism), 3: substrate, 3 a: substrate mark, 4: component supply unit, 5: tape feeder, 5 a: component supply position, 6: y-axis table, 7: beam, 7 a: plate, 8: mounting head, 8 a: adsorption unit, 8 b: suction nozzle, 9: head moving mechanism (moving mechanism), 10: carriage, 10 a: feeder base, 11: belt, 12: tape reel, 13: touch panel, 20: component recognition camera (first camera), 21, 31: housing, 21 a: bottom, 21 b: upper portion, 22: imaging element, 22 a: component imaging unit, 22 b: marker imaging unit, 23: sensor substrate, 24: lens, 25: component illumination portion, 25a, 33 a: side-emitting illumination unit, 25b, 33 b: coaxial illumination unit, 25c, 33 c: half mirror, 26: transparent member, 26 a: part imaging field of view, 26 b: mark-shooting field of view, 26 c: upper surface, 27: marker, 27A: center mark, 28: correction member, 30: substrate recognition camera (second camera), 32: camera unit, 33: substrate illumination unit, 34: window member, 40: control unit, 41: imaging processing unit (first processing unit), 42: position deviation amount calculation unit (first calculation unit), 43: offset amount calculation unit (second calculation unit), 44: mounting operation processing unit (second processing unit), 45: mounting storage unit (storage unit), 46: installation data, 47: shot data, 48: mark visual field position offset amount (first position offset amount), 49: component visual field position offset amount (second position offset amount), 50: substrate visual field position offset amount (third position offset amount), 51: offset, 60: marker image, 60c, 61c, 62 c: center, 61: component captured image (first image), 62: substrate recognition camera image (second image), D: component, Pa: first optical path, Pb: second light path, Ph: and a third light path.

Claims (11)

1. A component photographing apparatus, wherein,
the component imaging apparatus includes:
a component imaging unit that is provided in the imaging element and that images a component;
a mark provided outside a field of view of the component imaging unit;
a mark imaging unit that is provided in the imaging element and that images the mark; and
a positional deviation amount calculation unit that calculates a positional deviation amount of the field of view of the component imaging unit based on an imaging result of the mark imaged by the mark imaging unit.
2. The component photographing apparatus according to claim 1, wherein,
the component imaging unit and the mark imaging unit are provided in different portions of the imaging element.
3. The component photographing apparatus according to claim 1 or 2, wherein,
a correction member for correcting an optical path length is provided on at least one of a first optical path for the component imaging unit to image the component and a second optical path for the mark imaging unit to image the mark.
4. The component photographing apparatus according to any one of claims 1 to 3, wherein,
the component imaging apparatus further includes an illumination unit configured to illuminate the mark.
5. The component photographing device according to any one of claims 1 to 4,
the component imaging apparatus further includes a transparent member provided on a first optical path on which the component imaging unit images the component and on a second optical path on which the mark imaging unit images the mark,
the mark is provided to the transparent member.
6. A component mounting apparatus, wherein,
the component mounting device includes:
a suction nozzle which holds a component and mounts the component on a substrate;
a component imaging device that images the component held by the suction nozzle;
a substrate photographing device that moves integrally with the suction nozzle and photographs the substrate; and
an offset calculating unit for calculating an offset between the suction nozzle and the substrate imaging device,
the component imaging apparatus includes:
a component imaging unit that is provided in an imaging element and that images the component;
a mark provided outside a field of view of the component imaging unit; and
a mark imaging unit that is provided in the imaging element and that images the mark,
the offset amount calculation unit calculates the offset amount based on an imaging result of the mark imaged by the mark imaging unit of the component imaging device and an imaging result of the mark imaged by the substrate imaging device.
7. The component mounting apparatus according to claim 6,
the component imaging unit and the mark imaging unit are provided in different portions of the imaging element.
8. The component mounting apparatus according to claim 6 or 7,
a correction member for correcting an optical path length is provided on at least one of a first optical path for the component imaging unit to image the component and a second optical path for the mark imaging unit to image the mark.
9. A component mounting apparatus according to any one of claims 6 to 8,
the component imaging device includes a positional deviation amount calculation unit that calculates a positional deviation amount of a field of view of the component imaging unit based on an imaging result of the mark imaged by the mark imaging unit,
the offset amount calculation unit calculates the offset amount based on an imaging result of the mark imaged by the substrate imaging device and a positional offset amount of the field of view of the component imaging device.
10. A component mounting apparatus according to any one of claims 6 to 9,
the component imaging device has a component illumination unit that illuminates the mark when the component imaging device images the mark,
the substrate imaging device includes a substrate illumination unit that illuminates the mark when the substrate imaging device images the mark.
11. A component mounting apparatus according to any one of claims 6 to 10,
the component mounting apparatus further includes a transparent member provided on a first optical path on which the component imaging unit images the component and provided on a second optical path on which the mark imaging unit images the mark,
the mark is provided to the transparent member.
CN202080082450.3A 2019-12-11 2020-09-29 Component imaging device and component mounting device Pending CN114747308A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2019-223373 2019-12-11
JP2019223374 2019-12-11
JP2019-223374 2019-12-11
JP2019223373 2019-12-11
PCT/JP2020/036798 WO2021117319A1 (en) 2019-12-11 2020-09-29 Component-imaging device and component-mounting device

Publications (1)

Publication Number Publication Date
CN114747308A (en) 2022-07-12

Family

ID=76329368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080082450.3A Pending CN114747308A (en) 2019-12-11 2020-09-29 Component imaging device and component mounting device

Country Status (3)

Country Link
JP (1) JPWO2021117319A1 (en)
CN (1) CN114747308A (en)
WO (1) WO2021117319A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07162200A (en) * 1993-12-04 1995-06-23 Tdk Corp Method and apparatus for mounting electronic part
JP3129134B2 (en) * 1995-02-23 2001-01-29 松下電器産業株式会社 Chip mounting method
JP2004179636A (en) * 2002-11-13 2004-06-24 Fuji Mach Mfg Co Ltd Method and device of calibration in electronic part packaging apparatus
JP2008153511A (en) * 2006-12-19 2008-07-03 Juki Corp Component mounting equipment
JP2008270649A (en) * 2007-04-24 2008-11-06 Juki Corp Surface mounting equipment, and camera position correction method thereof
WO2014020733A1 (en) * 2012-08-01 2014-02-06 富士機械製造株式会社 Component mounting apparatus
WO2015040696A1 (en) * 2013-09-18 2015-03-26 富士機械製造株式会社 Component mounting machine
CN105191516A (en) * 2013-03-18 2015-12-23 富士机械制造株式会社 Component mounting device and method of calibration in component mounting device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5471131B2 (en) * 2009-07-31 2014-04-16 日本電気株式会社 Electronic component mounting apparatus and camera position correction method in electronic component mounting apparatus
WO2013153645A1 (en) * 2012-04-12 2013-10-17 富士機械製造株式会社 Image pickup device and image processing device
US20150049183A1 (en) * 2012-04-12 2015-02-19 Fuji Machine Mfg. Co., Ltd Component-mounting machine
EP3054756B1 (en) * 2013-10-01 2019-11-20 FUJI Corporation Component mounting device and component mounting method


Also Published As

Publication number Publication date
WO2021117319A1 (en) 2021-06-17
JPWO2021117319A1 (en) 2021-06-17

Similar Documents

Publication Publication Date Title
US8681210B2 (en) Component mounting device and component mounting method
JP5385010B2 (en) Electronic component mounting equipment
JP5299380B2 (en) Component mounting apparatus and component detection method
US6563530B1 (en) Camera position-correcting method and system and dummy component for use in camera position correction
JP5113406B2 (en) Electronic component mounting equipment
JP4343710B2 (en) Surface mount machine
JP2009212251A (en) Component transfer equipment
JP4331054B2 (en) Adsorption state inspection device, surface mounter, and component testing device
CN111096102B (en) Component mounting apparatus
JP6731003B2 (en) Surface mounter, recognition error correction method
JP2003234598A (en) Component-mounting method and component-mounting equipment
JP4437686B2 (en) Surface mount machine
CN113170607B (en) Component mounting apparatus
JP2009094295A (en) Apparatus for measuring height of electronic component
CN114747308A (en) Component imaging device and component mounting device
JP4810586B2 (en) Mounting machine
JP2022107877A (en) Component imaging apparatus and component mounting device
JP4901451B2 (en) Component mounting equipment
JP4704218B2 (en) Component recognition method, apparatus and surface mounter
KR102432607B1 (en) surface mount machine
JP4323410B2 (en) Electronic component mounting apparatus and mounting method
CN108370662B (en) Movement error detecting device for mounting head and component mounting device
JP4296029B2 (en) Electronic component mounting equipment
JP4358015B2 (en) Surface mount machine
JP4368709B2 (en) Surface mount machine

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination