CN106465580B - Component data generating means, surface mounting apparatus and component data generation method
- Publication number
- CN106465580B (application CN201480078459.1A)
- Authority
- CN
- China
- Prior art keywords
- imaging
- electronic component
- unit
- optical axis
- component
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05K—PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
- H05K13/00—Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
- H05K13/08—Monitoring manufacture of assemblages
- H05K13/081—Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
- H05K13/0813—Controlling of single components prior to mounting, e.g. orientation, component geometry
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05K—PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
- H05K13/00—Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
- H05K13/08—Monitoring manufacture of assemblages
- H05K13/081—Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
- H05K13/0812—Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines the monitoring devices being integrated in the mounting machine, e.g. for monitoring components, leads, component placement
Abstract
A component imaging device (11) includes: a camera (31) having an imaging optical axis (A2) and an imaging region set in advance along the imaging optical axis (A2); a laser illumination unit (40) having an illumination optical axis (A1) that is inclined with respect to the imaging optical axis (A2) and intersects it; an area setting unit that sets a plurality of imaging areas (310) in the Z direction, each imaging area being the region where an electronic component (17) intersects a recognition plane within the imaging region, the recognition plane being a plane orthogonal to the imaging optical axis (A2) and passing through the intersection point of the imaging optical axis (A2) and the illumination optical axis (A1); a control unit that controls the camera (31), the laser illumination unit and the area setting unit so as to acquire a recognition image of the cross-sectional shape of the electronic component for each of the plurality of imaging areas (310); and a generating unit (28) that generates three-dimensional data of the electronic component including leads (L) by synthesizing the plurality of recognition images acquired by the camera (31).
Description
Technical Field
The invention relates to a component data generating device, a surface mounting machine and a component data generating method.
Background
In the field of electronic components, there is a need for techniques that generate three-dimensional data of electronic components. For example, Patent Document 1 discloses a data acquisition method using a three-dimensional sensor. The three-dimensional sensor is configured as follows: laser light mechanically scanned by a polygon mirror is directed through an F-theta lens and irradiated onto an electronic component, and the reflected light is detected by a detection device.
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open Publication No. 2004-235671
In the structure of Patent Document 1, although the undulation of the surface irradiated with the laser light can be recognized, the shape (or size) along the longitudinal direction of the laser line received by the detection device cannot be detected. In addition, a structure using a polygon mirror and an F-theta lens is expensive.
Disclosure of Invention
The present invention has been made in view of the above circumstances, and an object thereof is to provide an inexpensive component data generating device, a surface mounting machine, and a component data generating method, which are capable of generating precise three-dimensional data.
In order to achieve the above object, the present invention relates to a component data generating device that generates three-dimensional data of an electronic component.
The component data generating apparatus of the present invention may include: an imaging unit having an imaging optical axis arranged along a vertical direction perpendicular to a bottom surface of the component body of the electronic component, and an imaging region set in advance along the imaging optical axis.
The component data generating apparatus of the present invention may include: an illumination unit having an illumination optical axis that is inclined with respect to the imaging optical axis and intersects the imaging optical axis, and irradiating illumination light having directivity along the illumination optical axis.
The component data generating apparatus of the present invention may include: an area setting unit that sets a plurality of imaging areas in the vertical direction perpendicular to the bottom surface of the component body, each imaging area being the region where the electronic component intersects a recognition surface within the imaging region, the recognition surface being a surface orthogonal to the imaging optical axis and passing through the intersection point where the imaging optical axis intersects the illumination optical axis.
The component data generating apparatus of the present invention may include: and a control unit configured to control the imaging unit, the illumination unit, and the area setting unit so as to acquire a recognition image of a cross-sectional shape of the electronic component for each of a plurality of imaging areas.
The component data generating apparatus of the present invention may include: a generation unit that generates three-dimensional data of the electronic component by synthesizing the plurality of recognition images acquired by the imaging unit. The generation unit synthesizes the plurality of recognition images using the size of the outline obtained from each image and the vertical dimension of the electronic component corresponding to each recognition image, the vertical dimension being derived from the distance from the top of the electronic component to the imaging unit.
In the above-described aspect, the area setting unit sets a plurality of imaging areas. The imaging unit images the electronic component for which component data is to be generated, for each of the plurality of imaging areas, to acquire a plurality of recognition images. The generation unit generates three-dimensional data from the plurality of recognition images. Here, in each imaging area, the position where the electronic component intersects the recognition surface differs in the vertical direction along the imaging optical axis. Therefore, the plurality of recognition images on which the three-dimensional data is based carry different size information in the vertical direction. Thus, the generating section can calculate the shape of the electronic component in the vertical direction in a layered manner. As a result, three-dimensional data of the electronic component can be generated.
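The layered synthesis described above can be pictured with a short sketch. The following Python fragment is only an illustration of the principle, not the patented implementation; the binary-mask representation of a recognition image, the pixel-pitch conversion, and the function names are assumptions made for clarity.

```python
# Minimal sketch: stacking per-height recognition images into three-dimensional data.
# Each recognition image is reduced to a binary mask of the cross-section and paired
# with the vertical position (z) of the recognition surface at which it was captured.

from dataclasses import dataclass
from typing import List
import numpy as np

@dataclass
class Slice:
    z_mm: float        # vertical position of this cross-section (from the imaging position)
    mask: np.ndarray   # binary cross-section image (H x W), True where a lead or the body is seen

def build_point_cloud(slices: List[Slice], pixel_pitch_mm: float) -> np.ndarray:
    """Combine the slices into an (N, 3) array of XYZ points."""
    points = []
    for s in slices:
        ys, xs = np.nonzero(s.mask)            # pixels belonging to the cross-section
        for x, y in zip(xs, ys):
            points.append((x * pixel_pitch_mm,   # X size taken from the outline in the image
                           y * pixel_pitch_mm,   # Y size taken from the outline in the image
                           s.z_mm))              # Z taken from the imaging position of the slice
    return np.asarray(points)
```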
Further, another aspect of the present invention relates to a surface mounting machine including the component data generating apparatus.
Further, still another aspect of the present invention relates to a component data generation method.
Other features, objects, structures, and effects of the present invention will be more readily understood from the following detailed description and the accompanying drawings.
Drawings
Fig. 1 is a plan view of a surface mounting apparatus using a device imaging apparatus according to the present invention.
Fig. 2 is an overall perspective view of the element imaging device.
Fig. 3 is a schematic diagram schematically showing the configuration of the device imaging apparatus.
Fig. 4 is a diagram showing an image pickup situation of an electronic component by the component image pickup device according to the embodiment of the present invention.
Fig. 5 (A) to (E) are side views showing, in stages, the imaging conditions of an electronic component with leads.
Fig. 6 (A) to (E) are recognition images obtained when the electronic component is imaged as in Fig. 5 (A) to (E).
Fig. 7 is a block diagram showing the structure of the surface mounting machine.
Fig. 8 is a schematic diagram showing a table as an example of a storage area of the interval storage unit.
Fig. 9 is a side view of an electronic component showing an example of size setting at the time of imaging the electronic component.
Fig. 10 is a flowchart of a component recognition operation performed by the component imaging apparatus according to the present invention.
Fig. 11 is a flowchart showing a shooting processing subroutine in fig. 10.
Fig. 12 is a flowchart showing a shooting processing subroutine of the modification in fig. 10.
Fig. 13 is an explanatory diagram showing an example of the outline dimensions corresponding to Fig. 6 (A).
Detailed Description
Hereinafter, embodiments of a component imaging apparatus as an example of the component data generating apparatus according to the present invention, and of a surface mounting machine including the component imaging apparatus, will be described in detail with reference to the drawings. In this embodiment, a component imaging apparatus mounted on a surface mounting machine is a specific example of the component data generating apparatus according to the present invention.
First, the surface mounting machine will be described with reference to fig. 1. In fig. 1, a conveyance direction (a left-right direction in fig. 1) of the printed circuit board 7 (an example of a substrate) for conveying the printed circuit board 7 is represented as an X direction, and a horizontal direction orthogonal to the X direction is represented as a Y direction.
The surface mounting machine 1 includes a base 2, a conveying section 3, a component supply section 4, a component moving section 5 (mounting section), and a detection section 10. The base 2 is formed in a quadrangular shape along the X direction in a plan view, and supports each part of the surface mounting apparatus 1. The conveying unit 3 is provided to traverse the upper surface of the base 2 in the X direction, and conveys the printed circuit board 7. The component supply units 4 are provided at both ends of the base 2 with the conveying unit 3 interposed therebetween, and supply electronic components mounted on the printed circuit board 7. The component moving section 5 is provided above the base 2, and transports the electronic component from the component supply section 4 toward the printed circuit board 7 on the conveying section 3, and mounts the electronic component on the printed circuit board 7. The detection unit 10 images an electronic component by a component imaging device 11 described later, and recognizes the electronic component or detects the lead distal end position of the electronic component based on a recognition image obtained by the imaging.
The conveying unit 3 is composed of a pair of conveyors 6 provided at intervals in the Y direction. The pair of conveyors 6 are belt conveyors, and convey the printed circuit board 7 in the X direction while supporting both ends of the printed circuit board 7 in the Y direction. An electronic component supply device is mounted in the component supply section 4. Fig. 1 shows an example in which a plurality of tape feeders 12 as the electronic component supply device are mounted. A component imaging device 11 is provided between the side of the conveyor 6 and the component supply unit 4.
The element moving unit 5 includes a Y rail unit 13, an X rail unit 14, and a head unit 15. The Y rail units 13 are provided on both ends of the base 2 in the X direction so as to straddle the conveying unit 3 in a pair. The X rail unit 14 is supported to be movable in the Y direction based on the Y rail unit 13. The head unit 15 is supported to be movable in the X direction by the X rail unit 14. The head unit 15 includes a plurality of suction heads (not shown).
These suction heads include suction nozzles 16 (component holding members; see fig. 3) that can be advanced and retracted from the lower end surfaces thereof, and a suction head driving device 20 (see fig. 7) described later. Each suction head has the following functions: the electronic component is sucked and held by the suction nozzle 16, and the suction is released at a position just above the printed circuit board 7 so that the electronic component is mounted on the printed circuit board 7. The suction head also has a function of vertically moving the suction nozzle 16 by means of the suction head driving device 20 and a function of rotating the suction nozzle 16 about a vertical axis. Since the head unit 15 is movable in the Y direction and the X direction by the Y rail unit 13 and the X rail unit 14, the suction nozzle 16 can be moved freely to a desired position in the horizontal direction.
The component imaging device 11 images the electronic component sucked by the suction nozzle 16 from below, and acquires an identification image of the electronic component.
Referring to fig. 2 and 3, the device imaging apparatus 11 includes a housing 30. The device imaging apparatus 11 further includes a camera 31 (imaging unit) and a lens unit 33 housed in the housing 30. The device imaging apparatus 11 further includes an illumination unit 35 that is placed on the top plate 30T of the housing 30 and emits omnidirectional illumination light. The device imaging apparatus 11 further includes a laser illumination unit 40 (illumination unit) attached to an upper peripheral edge of the illumination unit 35 and emitting directional illumination light.
The electronic components imaged by the component imaging apparatus 11 are, for example, the following: a semiconductor element such as a DIP (Dual Inline Package) in which a plurality of leads protrude downward from the package portion; a semiconductor element such as a QFP (Quad Flat Package) in which a plurality of leads (an example of an extended terminal) extend sideways from the package portion and then bend downward; or a semiconductor element such as a BGA (Ball Grid Array) in which spherical or hemispherical ball terminals (another example of an extended terminal) protrude downward from the bottom surface of the package portion. A CSP (Chip Size Package) and the like are also imaging targets. Figs. 2 and 3 illustrate an electronic component 17 that is a QFP having a rectangular parallelepiped component body B and leads L extending from the side surfaces of the component body B. Each lead L includes a lead distal end La (fig. 4) extending downward in a direction perpendicular to the flat bottom surface Ba of the component body B.
Among the above-described components, when the component imaging device 11 is used as the component data generating device, a suitable electronic component is one having a component main body B and a lead L extending from the component main body B and extending in a direction perpendicular to the bottom surface Ba of the component main body B. However, the present invention can be applied to all mounting elements having terminals corresponding to the extended terminals, that is, having terminals protruding or extending from the main body.
A light image of the electronic component 17 illuminated by the illumination unit 35 or the laser illumination unit 40 enters the camera 31. The camera 31 includes a line sensor 32 that converts the light image into an electrical signal. The imaging devices of the line sensor 32 are arranged in the Y direction. The imaging optical axis A2 of the camera 31 is set in the Z direction (vertical direction), that is, a direction passing through the bottom surface Ba of the component body B. The imaging optical axis A2 may also be set obliquely to the Z direction.
The lens unit 33 includes an imaging lens (not shown) for imaging the light image of the electronic element 17 on the light receiving surface of the line sensor 32.
As shown in fig. 3, the illumination unit 35 is an illumination device for illuminating the electronic component 17 from its lower side in all directions. The illumination unit 35 has an octagonal semicircular shape in plan view, and a plurality of illumination LEDs are mounted on the inner wall surface thereof. The LEDs are directed substantially upward along the photographing optical axis a 2. As a result, the illumination unit 35 emits illumination light 35L from the entire periphery of the photographing optical axis a2 toward the photographing optical axis a 2. When the illumination unit 35 is operated, the electronic component 17 passing across the imaging optical axis a2 is irradiated with omnidirectional illumination light 35L.
As shown in fig. 4, the laser illumination unit 40 has an illumination optical axis A1 that is inclined with respect to the imaging optical axis A2 and intersects the imaging optical axis A2, and emits illumination light 40L having directivity along the illumination optical axis A1. The laser illumination unit 40 includes a light source unit 41 having a laser device that emits laser light, and an optical system unit 42 that converts the laser light into linear coherent light and outputs it. A semiconductor laser can be favorably used as the laser device, and a unit including a cylindrical lens can be given as an example of the optical system unit 42. In general, coherence means that two waves, or two different parts of one wave, having a constant relationship between amplitude and phase can form interference fringes. Although no light is perfectly coherent, laser light is generally regarded as highly coherent; therefore, in the present specification, laser light is referred to as "coherent light" on the premise that it has high coherence.
The illumination unit 35 shown in fig. 3 that emits the omnidirectional illumination light 35L is mainly used when the device imaging apparatus 11 acquires an identification image of a general-purpose device without a lead wire at the time of production. On the other hand, the laser illumination unit 40 shown in fig. 4 that emits the directional illumination light 40L is used when the device imaging apparatus 11 acquires the recognition image of the electronic device 17 having the lead L extending downward, particularly the lead distal end La of the lead L, at the time of production and at the time of data generation. This is because, when the omnidirectional illumination light 35L is used in capturing the lead distal end La, not only the lead distal end La but also other portions are reflected in the recognition image, and thus, the accurate position recognition of the lead distal end La may not be performed. This point will be described in detail below.
Fig. 4 is a diagram showing how the lead distal end La of the electronic component 17 is imaged by the component imaging device 11 according to the basic embodiment of the present invention. In fig. 4, the electronic component 17 moves in the direction of arrow A3 along the X direction. The camera 31 has a predetermined imaging region 31A extending in the Z direction along the imaging optical axis A2, and the moving electronic component 17 passes through the imaging region 31A. The imaging region 31A is determined by the optical specification of the lens unit 33, the Z-direction installation position, and the size of the imaging device, so that an optical image of the moving electronic component 17 is formed on the imaging device of the line sensor 32 shown in fig. 4 and an image with the desired accuracy can be obtained. The laser illumination unit 40 irradiates linear coherent illumination light 40L along the illumination optical axis A1, which obliquely crosses the imaging optical axis A2.
Fig. 4 is a schematic diagram for explanation; the actual distance D between the line sensor 32 and the electronic component 17 is set so that the electronic component 17 is contained within the imaging region 31A, which is the imaging range, over its entire length. The illumination light also has a predetermined width, and the intersection point (point p1) described below spreads over a planar region. Therefore, in reality, imaging can be performed over a range that includes all the leads L contained in the imaging area 310 (see fig. 6).
Hereinafter, a principle of imaging with laser light will be described mainly by taking a case of imaging the distal end La of the lead as an example.
Based on the setting of the irradiation direction of the illumination light 40L as described above, the imaging optical axis A2 and the illumination optical axis A1 intersect within the imaging region 31A. A horizontal plane that includes the intersection point (point p1) and lies along the moving direction A3 of the electronic component 17 serves as the recognition plane of the electronic component 17. The line sensor 32 of the camera 31 images the area where the recognition plane intersects the imaging region 31A; this area is the imaging area 310 of the line sensor 32 (see fig. 5). That is, the line sensor 32 images the outline shape, as seen in bottom view, of the portion of the electronic component 17 that intersects the imaging area 310.
The illumination optical axis A1 is not set at an angle close to horizontal. As a result, the illumination light 40L also strikes the vertical midpoint (point p2) of another lead L adjacent to the lead L in the imaging region 31A and the base end (point p3) of a further lead L adjacent to that one, so reflected light is generated at the points p1, p2, and p3. However, only the reflected light from the point p1 enters the camera 31. That is, because the illumination light 40L is directional and is projected obliquely, the only portion (detection region) illuminated by the illumination light 40L within the imaging region 31A is the lead distal end La at the point p1. Therefore, the camera 31 can capture an optical image of the lead distal end La with good contrast. When the electronic component 17 is conveyed slightly further in the direction of the arrow A3 from the state of fig. 4, even if the illuminated portion at the point p2 enters the imaging region 31A, its reflected light does not enter the camera 31, because the lead L extends vertically downward and the camera 31 is disposed vertically below the lead L. Since the irradiation direction of the illumination light 40L is set as in the present embodiment, the intersection region extends horizontally in the Y direction (see fig. 3). As shown in fig. 3, the leads L (lead distal ends La) on both sides of the electronic component 17 are imaged by the line sensor 32, whose imaging devices are arranged in the Y direction, in a single movement of the electronic component 17 along arrow A3 (fig. 4).
When the distance D is changed, the portion of the electronic component 17 that intersects the recognition plane defined by the illumination optical axis A1 and the imaging optical axis A2 changes in the Z direction. Therefore, in the present embodiment, for example as shown in fig. 5 (A) to (E), a plurality of distances D(H) (H is a height identification number) are set according to the type of the electronic component 17, and an imaging area 310 is set for each of the plurality of distances D(H). The line sensor 32 therefore images, for each imaging area 310, the cross-sectional shape of the electronic component 17 orthogonal to the Z-axis direction. After the imaging, the recognition images shown in fig. 6 (A) to (E), corresponding to fig. 5 (A) to (E), are obtained. As shown in fig. 6 (A) to (E), each recognition image shows a different cross-section corresponding to one of the plurality of distances D(H) (or imaging areas 310). In the present embodiment, the Z-direction data is calculated by correlating these cross-sections with the distances D(H).
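As a concrete illustration of how the distance D(H) selects the imaged cross-section, the sketch below assumes that the recognition plane, i.e. the intersection of the illumination optical axis A1 and the imaging optical axis A2, lies at a fixed distance from the line sensor. That fixed value (93.0 mm) is not stated anywhere in the text; it is inferred from the example values of fig. 8 quoted later (D(H) = 104.9 to 100.0 mm against thicknesses t = 11.9 to 7.0 mm, whose difference is constant), so the helpers below are assumptions for illustration only.

```python
# Minimal sketch of the relationship between the imaging position and the imaged slice.

SENSOR_TO_RECOGNITION_PLANE_MM = 93.0   # assumed constant, inferred from the Fig. 8 example

def distance_for_slice(thickness_from_top_mm: float) -> float:
    """Distance D(H) from the line sensor to the top of the component so that the
    cross-section at depth t below the top coincides with the recognition plane."""
    return SENSOR_TO_RECOGNITION_PLANE_MM + thickness_from_top_mm

def slice_depth_for_distance(distance_mm: float) -> float:
    """Inverse mapping: the depth below the component top that is imaged at D(H)."""
    return distance_mm - SENSOR_TO_RECOGNITION_PLANE_MM

# e.g. distance_for_slice(11.9) -> 104.9, matching setting number H = 1 of component "ABC" in fig. 8
```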
Next, a control structure of the surface mounting apparatus 1 will be described with reference to the block diagram of fig. 7. The surface mounting machine 1 further includes a control device 8 (control unit) for controlling the operations of the respective units of the surface mounting machine 1, an X-rail unit driving device 18, a head unit driving device 19, and a suction head driving device 20. The X-rail unit driving device 18 generates a driving force for moving the X rail unit 14 (fig. 1) in the Y direction on the Y rail unit 13. The head unit driving device 19 generates a driving force for moving the head unit 15 in the X direction on the X rail unit 14. The suction head driving device 20 generates a driving force for moving each suction nozzle 16 up and down, or rotating it about a vertical axis, in each suction head included in the head unit 15. As described later, the suction head driving device 20 and the suction nozzles 16 function as an "area setting unit" under the control of the control device 8; that is, they are examples of the "area setting unit" of the present invention.
The control device 8 functionally includes a main control unit 21, a storage unit 22, an axis control unit 23, a conveyor control unit 24, a camera control unit 25, an illumination control unit 26, an image processing unit 27, and a data generation unit 28. The main control unit 21 performs overall control of the surface mounting apparatus 1. In the present embodiment, the main control unit 21 performs the following control: it controls the axis control unit 23, the camera control unit 25, and the illumination control unit 26 so that the electronic component 17 (the lead distal end La) is moved by the component moving unit 5 so as to pass through the detection region, and operates the camera 31 together with the illumination unit 35 or the laser illumination unit 40 so that a recognition image of the electronic component 17 or of the lead distal end La is acquired. That is, the main control unit 21 is an example of the control unit of the present invention.
The data generation unit 28 is a module that synthesizes the captured recognition images to generate three-dimensional data. As shown in figs. 5 and 6, in the present embodiment a plurality of distances D between the electronic component 17 and the line sensor 32 are used, and the recognition images captured at the respective distances D(H) are combined to generate three-dimensional data from the size of the outline acquired from each image and the Z-direction dimension based on each distance D(H). The data generation unit 28 may be software executed by the main control unit 21.
The storage unit 22 stores various information related to the printed circuit board 7 and the electronic component 17. The information related to the electronic component 17 is information such as the type of the electronic component, the number and arrangement of the leads L, and the height position of the lead distal end La.
Here, in the present embodiment, the setting number H and the distance D(H) are set for each electronic component 17, so that appropriate imaging positions can be set.
Referring to fig. 8 and 9, the storage section 22 is provided with a storage area 220 serving as an interval storage unit. The storage area registers a record for each item consisting of {electronic component, setting number H, thickness t, distance D(H)}. The item {electronic component} holds the identifier of the electronic component. In the illustrated example, an electronic component identified as "ABC" is shown first, for which five records are set; the example of component "ABC" corresponds to the imaging example illustrated in figs. 5 and 6. The item {setting number H} registers an identification number that identifies the distance D(H) between the line sensor 32 and the electronic component 17; in the example of "ABC", five setting numbers are stored. The item {thickness t} stores, for each electronic component 17, the Z-direction dimension of the portion that needs to be measured. In the illustrated example, in order to generate the three-dimensional data of the electronic component 17 shown in fig. 5, thicknesses are set so as to image the portions located "11.9" mm, "8.3" mm, "7.7" mm, "7.4" mm, and "7.0" mm (dimensions in the Z direction) below the top of the electronic component 17. The item {distance D(H)} stores the distance corresponding to the imaging position required to generate the three-dimensional data of the electronic component. In the illustrated example, the values "11.9", "8.3", "7.7", "7.4", and "7.0" are stored as the thicknesses t of the five records of "ABC", and the values "104.9", "101.3", "100.7", "100.4", and "100.0" are stored as the corresponding distances D(H). As shown in fig. 8, such settings are made for each of a plurality of electronic components 17, so that specific distances D(H) are prepared for each electronic component 17 that requires them. Alternatively, a plurality of distances D(H) may be set so as to increase (or decrease) gradually at predetermined intervals.
By setting the distances D(H) in advance as described above, a plurality of imaging areas 310 can be set at appropriate thicknesses, as shown in fig. 5 (A) to (E).
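The record structure of the storage area 220 can be sketched as a simple in-memory table, shown below. The field names and the dictionary layout are assumptions for illustration; the values registered for component "ABC" are the ones quoted above from fig. 8.

```python
# Minimal sketch of the interval storage area (storage area 220) of Fig. 8.

from typing import NamedTuple, List, Dict

class IntervalRecord(NamedTuple):
    setting_number: int   # H
    thickness_mm: float   # t, depth of the imaged portion below the component top
    distance_mm: float    # D(H), distance from the line sensor to the component top

INTERVAL_STORAGE: Dict[str, List[IntervalRecord]] = {
    "ABC": [
        IntervalRecord(1, 11.9, 104.9),
        IntervalRecord(2,  8.3, 101.3),
        IntervalRecord(3,  7.7, 100.7),
        IntervalRecord(4,  7.4, 100.4),
        IntervalRecord(5,  7.0, 100.0),
    ],
    # ... records for other component types would follow
}

def record_set(component_id: str) -> List[IntervalRecord]:
    """Return every imaging position registered for one component type."""
    return INTERVAL_STORAGE[component_id]
```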
As shown in fig. 7, the storage unit 22 is provided with a data storage area 221. The data storage area 221 stores the generated three-dimensional data. A preferable specific example of the data storage area 221 is a secondary storage device such as a hard disk drive.
Returning to fig. 7, the axis control unit 23 controls the X rail unit 14, the head unit 15, and the suction head by controlling the X rail unit drive device 18, the head unit drive device 19, and the suction head drive device 20. The conveyor controller 24 controls the operation and stop of the pair of conveyors 6 constituting the conveyor unit 3 to control the conveyance of the printed circuit board 7.
The camera control unit 25 controls the shooting operation of the camera 31. For example, the camera control section 25 controls the shutter timing, shutter speed (exposure amount), and the like of the camera 31.
The illumination control unit 26 controls the light emission operation of the illumination unit 35 and the laser illumination unit 40. The illumination control unit 26 refers to the storage unit 22 to acquire component information for the electronic component taken out from the tape feeder 12, and determines which of the illumination unit 35 and the laser illumination unit 40 to light. The illumination control unit 26 then operates the selected illumination unit according to a predetermined program.
The image processing unit 27 applies known image processing techniques to the recognition image acquired by the camera 31 (line sensor 32) and extracts various kinds of inspection information from the recognition image. For example, based on the inspection information extracted by the image processing unit 27, it is determined whether the suction deviation of the electronic component 17 on the suction nozzle 16, the bending of the leads L, the distortion of the lead distal ends La, and the like are within allowable ranges. In addition, at the time of data generation, the leads L and the component body B are recognized in each recognition image and registered individually. Then, for each registered part, its coordinates and its X-direction and Y-direction dimensions are registered in association with the height D(H). Each data item is digitized.
Fig. 10 is a flowchart of a component recognition operation performed by the component imaging apparatus 11 according to the present embodiment. Here, the operation of the present apparatus will be mainly described with respect to processing for generating three-dimensional data.
Unlike in the production process, the electronic component 17 for which data is to be generated is set by an operator, for example, at a preset pickup position. The suction nozzle 16 of the head unit 15 sucks and holds the set electronic component. The head unit 15 moves in the X direction so as to pass above the component imaging device 11. The component imaging device 11 acquires images of the electronic component 17 while the head unit 15 passes.
In this imaging process, the main control section 21 initializes the necessary sections and then moves the electronic component 17 to the upper limit value (step S1). Here, the upper limit value is a height (position) corresponding to the maximum thickness of electronic component that can be mounted by the surface mounting machine 1 (see fig. 5 (A)), and is a value set in advance in the storage unit 22.
Next, the control device 8 executes the image capturing process based on the control of the camera control unit 25, the illumination control unit 26, the image processing unit 27, the axis control unit 23, and the like (step S2).
Here, a specific example of the imaging process will be described with reference to fig. 11.
A count variable i is set in the control device 8. The count variable i corresponds to the setting number H. The control device 8 initializes the count variable i to 0 and acquires the record set of the electronic component 17 to be imaged (step S201). The record set is the set of data registered for each electronic component 17 to be imaged; for example, in the example of fig. 9, when three-dimensional data of the electronic component "ABC" is acquired, all the data with setting numbers H = 1 to 5 for "ABC" are acquired.
Next, the control device 8 increments the count variable i (step S202). Thereafter, the control device 8 reads the distance D(H) corresponding to D(i) from the data stored in the storage area 220 of the storage unit 22. In the example shown in fig. 8, when H = 1, the distance D(1) read is 104.9. Next, the control device 8 changes the imaging area 310 based on the read distance D(H) (step S204). In the present embodiment, the suction head driving device 20 is operated to lower the suction nozzle 16 from the upper limit value so that the portion of the electronic component 17 at the thickness t (= 11.9) below its top is positioned on the recognition surface at the distance D(H) (= 104.9). By this operation, as shown in fig. 5 (A), the lead distal end La, which is the tip of the lead L of the electronic component 17, stops on the recognition surface, and the imaging area 310 is thereby set on the recognition surface. Based on the principle described with reference to fig. 4 and elsewhere, the illumination light striking the lead distal end La in the imaging area 310 on the recognition surface is imaged by the camera 31 (line sensor 32), and the recognition image of the lead distal end La shown in fig. 6 (A) is acquired. The recognition image is stored in the storage unit 22. Next, the control device 8 determines whether there is an unprocessed distance D(H), that is, whether the entire record set read in step S201 has been imaged (step S207). If data remains, the control device 8 returns the process to step S202 and repeats the above processing. Through this repetition, a plurality of imaging areas 310 are set, the regions shown in fig. 5 (B) to (E) are imaged one imaging area 310 at a time, and the recognition images shown in fig. 6 (B) to (E) are stored in sequence. Thereafter, the control device 8 returns the process to the main routine.
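A compact sketch of this subroutine is given below. The two hardware-facing helpers are stubs standing in for the suction-head, laser-illumination, and line-sensor control described above; they, and the tuple-based record set, are assumptions for illustration only, not the machine's actual interface.

```python
# Minimal sketch of the imaging subroutine of Fig. 11.

from typing import List, Tuple

def lower_component_to(thickness_mm: float) -> None:
    """Stub: lower the suction nozzle so that the slice at this depth below the
    component top sits on the recognition surface (the imaging-area change of step S204)."""

def capture_recognition_image():
    """Stub: operate the laser illumination and the line sensor, return one recognition image."""
    return object()

def imaging_process(record_set: List[Tuple[float, float]]):
    """record_set: (thickness_mm, distance_mm) pairs registered as in Fig. 8 (read in step S201)."""
    images = []
    for i, (thickness_mm, distance_mm) in enumerate(record_set, start=1):  # count variable i (step S202)
        lower_component_to(thickness_mm)                            # set the imaging area 310 (step S204)
        images.append((distance_mm, capture_recognition_image()))   # image and store the slice
    return images                        # loop ends when the record set is exhausted (step S207)
```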
Referring to fig. 10, after the imaging process is completed, the control device 8 generates three-dimensional data under the control of the data generation unit 28 and the like (step S3). Specifically, the Z-direction information is set from the distance D(H) at which each recognition image was captured, and the registered outline dimensions of the leads L and of the component body B are obtained in association with each distance D(H). Three-dimensional data is then generated for each lead L and for the component body B from the distances D(H) and the outline sizes of the recognition images.
The control device 8 stores the generated three-dimensional data in the storage unit 22 for each electronic component (step S4). In other words, three-dimensional data based on the recognition images of fig. 6 (A) to (E) is generated by the processing of step S3 shown in fig. 10, and the generated data is stored in the data storage area 221 of the storage unit 22 by the processing of step S4.
As described above, the present embodiment relates to the component imaging device 11 serving as a component data generating device that generates three-dimensional data of the electronic component 17 including the component body B and the leads L extending in the direction perpendicular to the bottom surface Ba of the component body B.
The component imaging device 11 may include a camera 31 having an imaging region 31A set in advance along an imaging optical axis A2. The imaging optical axis A2 of the camera 31 is disposed along the Z direction passing through the bottom surface Ba of the component body B.
The component imaging device 11 may further include a laser illumination unit 40 that irradiates illumination light having directivity along the illumination optical axis A1. The laser illumination unit 40 is disposed with the illumination optical axis A1 inclined with respect to the imaging optical axis A2 and intersecting the imaging optical axis A2.
The component imaging device 11 further includes an area setting unit (the suction nozzle 16, the suction head driving device 20, and the like) that sets a plurality of imaging areas 310 in the Z direction.
Each imaging area 310 is the region where the electronic component 17 intersects the recognition surface within the imaging region 31A. The recognition surface is a surface orthogonal to the imaging optical axis A2 and passing through the intersection point where the imaging optical axis A2 intersects the illumination optical axis A1.
The component imaging device 11 may further include a control device 8. The control device 8 controls the camera 31, the laser illumination unit 40, and the area setting unit to acquire a recognition image of the cross-sectional shape of the electronic component 17 for each of the plurality of imaging areas 310.
The component imaging device 11 may further include a data generation unit 28 that synthesizes the plurality of recognition images acquired by the camera 31 to generate three-dimensional data of the electronic component 17 including the extended terminals. The data generation unit 28 may be configured, for example, as a module of the control device 8.
In this embodiment, the control device 8, in cooperation with the suction head driving device 20 and the like, functions as the area setting unit and sets a plurality of imaging areas 310. The camera 31 images the electronic component 17 for which component data is to be generated for each of the plurality of imaging areas 310, and acquires a plurality of recognition images. The data generation unit 28 of the control device 8 generates three-dimensional data from the plurality of recognition images. Here, in each imaging area 310, the position where the electronic component 17 intersects the recognition surface differs in the Z direction along the imaging optical axis A2. Therefore, the plurality of recognition images on which the three-dimensional data is based carry different size information in the Z direction, and the data generation unit 28 can calculate the shape of the electronic component 17 in the Z direction layer by layer. As a result, three-dimensional data of the electronic component 17 can be generated.
Specifically, the data generation unit 28 may synthesize the plurality of recognition images (see fig. 6 (A) to (E)) and generate three-dimensional data using the size of the outline obtained from each image and the Z-direction dimensions t1 to t5 of the electronic component 17 (see fig. 9) corresponding to the respective recognition images, based on the distance D(H) from the top of the electronic component 17 to the line sensor 32 of the camera 31 serving as the imaging unit.
In such an embodiment, fine three-dimensional data can be generated from a plurality of recognition images. The "size of the outline" can be obtained, for example, in the manner illustrated in fig. 13 (the dimensions indicated by symbols X001 to X008, y001 to y007, and the like). Since methods for obtaining the outline dimensions from an image are well known, a detailed description thereof is omitted.
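As one possible way of obtaining outline dimensions of the kind shown in fig. 13, the fragment below measures the bounding-box extents of the connected regions in a binary recognition image and converts them to millimetres. The use of scipy's labelling functions and the pixel-pitch conversion are assumptions for illustration, not the method prescribed by the patent.

```python
# Illustrative extraction of outline dimensions from one binary recognition image.
# Each connected bright region (a lead cross-section or the body outline) is measured
# by its bounding box and converted to millimetres with an assumed pixel pitch.

import numpy as np
from scipy import ndimage

def outline_dimensions(mask: np.ndarray, pixel_pitch_mm: float):
    """Return (x_size_mm, y_size_mm) for every connected region in the mask."""
    labeled, count = ndimage.label(mask)            # label the connected cross-sections
    sizes = []
    for region in ndimage.find_objects(labeled):    # bounding box of each region
        y_slice, x_slice = region
        x_mm = (x_slice.stop - x_slice.start) * pixel_pitch_mm
        y_mm = (y_slice.stop - y_slice.start) * pixel_pitch_mm
        sizes.append((x_mm, y_mm))
    return sizes
```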
In some embodiments, the region setting unit may change the imaging position in the Z direction for each predetermined interval, and set the imaging region 310 for each of the plurality of imaging positions.
Specifically, the subroutine of fig. 12 may be executed when the photographing step S2 of fig. 10 is executed.
In fig. 12, instead of step S201 in fig. 11, the upper limit value of the electronic component 17 to be subjected to data generation, the imaging pitch P at the time of lowering (an example of the predetermined interval), and the number of imaging operations N are stored in the storage unit 22 in advance and are read from the storage unit 22 (step S211). In addition, instead of step S203 in fig. 11, the electronic component 17 may be lowered from the upper limit value by the predetermined interval. Further, imaging may be repeated the predetermined number of times for the plurality of imaging areas 310, using the same imaging processing as in steps S202, S204, S205, and S206 (step S217). The imaging pitch P may be any value determined by the user as the resolution of the imaging. The imaging pitch P and the number of imaging operations N may be set so that the entire height (Z-direction dimension) of the electronic component 17 is covered by the series of imaging operations.
In such an embodiment, since the upper limit value of the electronic component 17 to be subjected to data generation, the imaging pitch P at the time of lowering, and the number of imaging operations N can all be read from the storage unit 22 at step S211 before the imaging process is executed, no data reading operation is needed within the imaging loop, and the data reading becomes fast. Further, the imaging position in the Z direction is changed at each designated interval, so that slice data of the electronic component 17 can be acquired in the Z direction for the generation of three-dimensional data.
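The fixed-pitch variant can likewise be sketched in a few lines. The function below is illustrative only; lower_to and capture are caller-supplied stand-ins for the suction-head and camera control, and the parameter names are assumptions.

```python
# Minimal sketch of the modified subroutine of Fig. 12: the component is lowered by a
# fixed pitch P for a fixed number of imaging operations N, instead of using
# per-component distances D(H).

from typing import Callable, List, Tuple

def imaging_process_fixed_pitch(pitch_mm: float, shot_count: int,
                                lower_to: Callable[[float], None],
                                capture: Callable[[], object]) -> List[Tuple[float, object]]:
    images = []
    for n in range(shot_count):               # repeat the imaging N times (cf. step S217)
        depth_mm = n * pitch_mm               # current depth below the upper limit (pitch P)
        lower_to(depth_mm)                    # lower the component by the fixed pitch
        images.append((depth_mm, capture()))  # slice data of the component in the Z direction
    return images
```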
In some embodiments, the apparatus further includes: an interval storage unit that stores, for each type of electronic component 17, a plurality of imaging positions corresponding to the portions to be imaged; and the area setting unit sequentially reads the plurality of imaging positions stored in the interval storage unit to set the imaging areas 310.
In such an embodiment, a plurality of recognition images can be acquired by setting a plurality of imaging positions for the portions to be imaged (fig. 5 (A) to (E); see fig. 8) according to the type of the electronic component 17, changing the distance D(H) shown in fig. 8, and thereby changing the imaged portion for each electronic component 17. Therefore, the number of recognition images can be kept to the minimum necessary for each electronic component 17, and the necessary data can be acquired with the most appropriate number of imaging operations.
In some embodiments, the data generator 28 stores the generated three-dimensional data in the storage 22.
In such an embodiment, the stored three-dimensional data can be used for various purposes. For example, feedback control of the mounting work can be performed based on the stored three-dimensional data.
In some embodiments, the area setting unit includes a suction nozzle 16 that sucks the electronic component 17, and a suction head driving device 20 that drives the suction nozzle 16 in a vertical direction.
In such an embodiment, the present invention can be applied to an existing apparatus having the suction nozzle 16 and the suction head driving device 20, such as the surface mounting apparatus 1, for example, and can generate three-dimensional data. In addition, the implementation becomes simple.
The present embodiment is also a surface mounting apparatus 1 including a conveying unit for conveying a substrate and a component moving unit 5 for mounting an electronic component 17 on the conveyed substrate. The surface mounting apparatus 1 may be provided with a component imaging device 11 as the component data generating device.
In such an embodiment, three-dimensional data can be generated for the electronic component 17 used for the mounting work in the surface mounting machine 1.
In some embodiments, the suction nozzle 16 and the suction head driving device 20 of the component moving unit 5 are the suction nozzle 16 and the suction head driving device 20 of the component imaging device 11.
In the above embodiment, the imaging area 310 setting means constituting the component data generating device of the present invention can be realized by the suction nozzle 16 and the suction head existing in the surface mounting machine 1, and the three-dimensional data can be easily generated at low cost.
Still another embodiment of the present invention is a component data generation method. The component data generation method generates three-dimensional data of the electronic component 17 including extended terminals such as the leads L.
In several embodiments, the component data generation method includes: a step of setting in advance an imaging optical axis A2 along the Z direction passing through the bottom surface Ba of the component body B, and an imaging region 31A along the imaging optical axis A2 (see fig. 5).
In several embodiments, the component data generation method includes: an illumination step (see fig. 4) of irradiating illumination light having directivity along an illumination optical axis that is inclined with respect to the imaging optical axis A2 and intersects the imaging optical axis A2.
In several embodiments, the component data generation method includes: an area setting step (see fig. 5) of setting a plurality of imaging areas 310 in the Z direction by relatively moving the electronic component 17 and the imaging region 31A, each imaging area 310 being the region where the electronic component 17 intersects a recognition plane within the imaging region 31A, the recognition plane being a plane orthogonal to the imaging optical axis A2 and passing through the intersection point where the imaging optical axis A2 intersects the illumination optical axis.
In several embodiments, the component data generation method includes: the imaging step (see step S2 in fig. 10 and 11) images a recognition image of the cross-sectional shape of the electronic component 17 for each of the plurality of imaging regions 310.
In several embodiments, the component data generation method includes: a generation step (step S3 in fig. 10) of synthesizing the plurality of recognition images captured in the capturing step and generating three-dimensional data of the electronic component 17 including the extended terminals such as the leads L.
In the component data generating method according to the preferred embodiment, in the area setting step, the imaging position in the Z direction is changed for each predetermined interval, and the imaging area 310 is set for each of the plurality of imaging positions.
In the component data generating method according to the preferred embodiment, the region setting unit sequentially reads a plurality of imaging positions stored in the interval storage unit to set the imaging region 310.
While various embodiments of the present invention have been described above, the present invention is not limited to the above embodiments, and, for example, the following modified embodiments may be adopted.
(1) In the above embodiment, an example in which the component imaging device 11 according to the present invention is mounted on the surface mounting machine 1 is shown. The component imaging apparatus 11 can be applied to various apparatuses other than the surface mounting apparatus 1, which require imaging of the electronic component 17. For example, the present invention can be applied to a component inspection apparatus for inspecting the electronic component 17. In the above embodiment, the device imaging apparatus 11 according to the present invention is applied to the imaging recognition of the lead distal end La of the electronic device 17 with a lead. The present invention can also be applied to the case of imaging and recognizing the ball terminal distal end of an electronic component as the electronic component 17 in which a hemispherical or spherical ball terminal is provided on the bottom surface Ba of the component main body B.
(2) In the above embodiment, the camera 31 provided with the line sensor 32 as an imaging unit is exemplified. A two-dimensional area sensor may also be employed in place of the line sensor 32.
(3) In the above embodiment, an application example to the surface mounting machine 1 is shown, and a mechanism including the Y rail unit 13, the X rail unit 14, and the head unit 15 is used as the moving mechanism. This is merely an example; the moving mechanism may be any mechanism capable of holding the electronic component 17 and transporting it through the air. Accordingly, the application is not limited to the surface mounting machine 1, and the invention may also be implemented as a dedicated component data generating apparatus installed, for example, on a table.
(4) In the above embodiment, the electronic component 17 is moved, but the camera 31 may instead be moved, without moving the electronic component 17, so that the imaging area (detection area) side is moved.
The present invention is not limited to the above-described embodiments, and various modifications may be added to the present invention without departing from the scope of the present invention.
In the component data generating device according to the preferred embodiment of the present invention, the region setting unit changes the imaging position in the vertical direction for each predetermined interval, and sets the imaging region for each of a plurality of imaging positions.
In this embodiment, the imaging position in the vertical direction is changed at every predetermined interval, and slice data of the electronic component in the vertical direction can be acquired for generating three-dimensional data.
In the component data generating device according to the preferred embodiment of the present invention, the generating unit generates the three-dimensional data by synthesizing the plurality of recognition images, and using a size of an outline obtained from each image and a size of the electronic component in the vertical direction based on a distance from the top of the electronic component to the imaging unit, the size corresponding to the plurality of recognition images.
In this technical solution, fine three-dimensional data can be generated from a plurality of identification images.
The component data generating device according to a preferred embodiment of the present invention further includes: an interval storage unit that stores, for each type of electronic component, a plurality of imaging positions corresponding to the portions of that electronic component to be imaged; wherein the area setting section sequentially reads the plurality of imaging positions stored in the interval storage section to set the imaging areas.
In this configuration, a plurality of imaging positions are set for the portions to be imaged according to the type of electronic component, and the portions to be imaged are changed for each electronic component, whereby a plurality of recognition images are obtained. Therefore, the number of recognition images can be kept to the minimum necessary for each electronic component, and the necessary data can be acquired with the most appropriate number of imaging operations.
In the component data generating device according to a preferred embodiment of the present invention, the generation unit stores the generated three-dimensional data in the storage unit.
In this configuration, the stored three-dimensional data can be used for various purposes. For example, feedback control of the mounting work can be performed based on the stored three-dimensional data.
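A minimal sketch of such a storage unit, assuming the generated data is simply kept in memory keyed by a component identifier so that later processes (for example, mounting feedback control) can read it back:

```python
class ComponentDataStore:
    """Assumed in-memory stand-in for the three-dimensional data storage area."""

    def __init__(self):
        self._models = {}

    def save(self, component_id, model):
        # store the synthesized three-dimensional data for one component
        self._models[component_id] = model

    def load(self, component_id):
        # read back the stored data, e.g. for feedback control of the mounting work
        return self._models.get(component_id)
```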
In the component data generating device according to a preferred embodiment of the present invention, the area setting unit includes a suction nozzle for sucking the electronic component and a suction head driving device for driving the suction nozzle in the vertical direction.
In this configuration, when setting the plurality of imaging areas, the invention can make use of an existing apparatus having a suction nozzle and a suction head, such as a surface mounting machine, so that it is simple to implement.
Another aspect of the present invention is directed to a surface mounting machine including: a conveying part for conveying a substrate; a component moving unit for mounting the electronic component on the substrate; and the component data generating device according to the present invention, wherein the electronic component is supplied to the component moving unit and three-dimensional data of the electronic component is generated by the component data generating device.
In this configuration, three-dimensional data can be generated for electronic components used for mounting work in the surface mounting machine.
Another aspect of the present invention is directed to a surface mounting machine including: a conveying part for conveying a substrate; a component moving part which includes a suction nozzle for sucking the electronic component and a suction head driving device for raising and lowering the suction nozzle, and which mounts the electronic component on the conveyed substrate; and the component data generating device having the suction nozzle and the suction head driving device, wherein the suction nozzle and the suction head driving device of the component moving part also serve as the suction nozzle and the suction head driving device of the component data generating device.
In this configuration, the imaging area setting means of the component data generating device of the present invention can be realized by using the suction nozzle and the suction head driving device already present in the surface mounting machine, so that the three-dimensional data can be generated easily and at low cost.
Another aspect of the present invention is a component data generating method for generating three-dimensional data of an electronic component, including: a presetting step of setting an imaging optical axis along a vertical direction perpendicular to a bottom surface of the component main body, and presetting an imaging region along the imaging optical axis; an illumination step of irradiating illumination light having directivity along an illumination optical axis that is inclined with respect to the imaging optical axis and intersects the imaging optical axis; a region setting step of setting a plurality of imaging areas in the vertical direction by relatively moving the electronic component and the imaging region, each imaging area being an area where the electronic component intersects a recognition surface in the imaging region, the recognition surface being a surface orthogonal to the imaging optical axis and passing through the intersection point where the imaging optical axis intersects the illumination optical axis; an imaging step of capturing a recognition image of the cross-sectional shape of the electronic component for each of the plurality of imaging areas; and a generation step of generating the three-dimensional data of the electronic component by synthesizing the plurality of recognition images captured in the imaging step.
In the component data generating method according to a preferred embodiment of the present invention, in the region setting step, the imaging position in the vertical direction is changed at predetermined intervals, and an imaging area is set for each of the plurality of imaging positions.
In the component data generating method according to a preferred embodiment of the present invention, in the region setting step, the plurality of imaging positions stored in the interval storage unit are sequentially read to set the imaging areas.
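For orientation only, the steps of the method described above can be tied together in a single sketch: the imaging positions are read (interval storage), the component is moved to each position (region setting), a cross-sectional recognition image is captured at each position (imaging), and the images are synthesized into three-dimensional data (generation). Every argument is an assumed callable standing in for a hardware unit or helper; none of these names comes from the disclosure itself.

```python
def generate_component_data(component_type,
                            imaging_positions_for,   # interval storage lookup
                            move_nozzle_to,          # suction head drive (region setting)
                            capture_cross_section,   # camera + directional illumination (imaging)
                            synthesize):             # e.g. the synthesize_3d sketch above
    positions = imaging_positions_for(component_type)
    slices = []
    for z in positions:
        move_nozzle_to(z)                            # region setting step
        slices.append((z, capture_cross_section()))  # imaging step
    return synthesize(slices)                        # generation step
```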
Industrial applicability
The present invention is suitably applicable to industrial fields in which three-dimensional data of an electronic component needs to be generated, particularly of an electronic component including a component main body and extended terminals, such as leads, that extend from the component main body in a direction perpendicular to its bottom surface.
Description of the symbols
1 surface mounting machine
3 conveying part
4 component supply part
5 component moving part
7 printed circuit board (an example of a substrate)
8 control device
11 component imaging device (an example of component data generating device)
16 suction nozzle
17 electronic component
20 suction head driving device (example of the main part of the area setting unit)
21 Main control part (an example of a control part)
22 storage section
28 data generating part (an example of the generating part)
31 camera (an example of an imaging unit)
31A imaging field
32 line sensor
40 laser illumination unit
40L illumination light
220 storage area (an example of an interval storage unit)
221 data storage area (an example of a three-dimensional data storage area)
310 imaging area
A1 illumination optical axis
A2 imaging optical axis
B component main body
Ba bottom surface
D (H) distance
t thickness
X001 to X008, y001 to y007 examples of outline dimensions
Claims (10)
1. A component data generating apparatus for generating three-dimensional data of an electronic component, comprising:
an imaging unit having an imaging optical axis provided along a vertical direction perpendicular to a bottom surface of a component main body of the electronic component, and having an imaging region set in advance along the imaging optical axis;
an illumination unit having an illumination optical axis that is inclined with respect to the imaging optical axis and intersects the imaging optical axis, the illumination unit irradiating illumination light having directivity along the illumination optical axis;
an area setting unit that sets a plurality of imaging areas in the vertical direction perpendicular to the bottom surface of the component main body, each imaging area being an area where the electronic component intersects a recognition surface that is orthogonal to the imaging optical axis and passes through the intersection point where the imaging optical axis intersects the illumination optical axis;
a control unit that controls the imaging unit, the illumination unit, and the area setting unit so as to acquire a recognition image of a cross-sectional shape of the electronic component for each of a plurality of imaging areas;
a generation unit that generates three-dimensional data of the electronic component by synthesizing the plurality of recognition images acquired by the imaging unit; wherein
the generation unit generates the three-dimensional data of the electronic component by synthesizing the plurality of recognition images, using the size of the outline obtained from each image and the size of the electronic component in the vertical direction corresponding to the plurality of recognition images, the latter size being based on the distance from the top of the electronic component to the imaging unit.
2. The component data generation apparatus according to claim 1, wherein:
the area setting unit changes the imaging position in the vertical direction at predetermined intervals, and sets an imaging area for each of the plurality of imaging positions.
3. The component data generation apparatus according to claim 1, characterized by further comprising:
an interval storage unit that stores, for each type of electronic component, a plurality of imaging positions corresponding to the portions of the electronic component that need to be imaged; wherein,
the area setting unit sequentially reads the plurality of imaging positions stored in the interval storage unit to set the imaging area.
4. The component data generation apparatus according to claim 1, wherein:
the generation unit stores the generated three-dimensional data in the storage unit.
5. The component data generation apparatus according to any one of claims 1 to 4, wherein:
the area setting unit includes a suction nozzle for sucking the electronic component and a suction head driving device for driving the suction nozzle in the vertical direction.
6. A surface mounting machine, characterized by comprising:
a conveying part for conveying the substrate;
a component moving unit for mounting the electronic component on the substrate;
the component data generating apparatus according to any one of claims 1 to 4; wherein the electronic component is supplied to the component moving unit, and three-dimensional data of the electronic component is generated by the component data generating apparatus.
7. A surface mounting machine comprising:
a conveying part for conveying the substrate;
a component moving part which includes a suction nozzle for sucking the electronic component and a suction head driving device for raising and lowering the suction nozzle, and which mounts the electronic component on the conveyed substrate; characterized in that the surface mounting machine further comprises:
the component data generating apparatus of claim 5; wherein,
the suction nozzle and the suction head driving device of the component moving part also serve as the suction nozzle and the suction head driving device of the component data generating device.
8. A component data generating method for generating three-dimensional data of an electronic component, characterized by comprising:
a presetting step of setting an imaging optical axis along a vertical direction perpendicular to a bottom surface of a component main body of the electronic component, and presetting an imaging region along the imaging optical axis;
an illumination step of irradiating illumination light having directivity along an illumination optical axis that is inclined with respect to the imaging optical axis and intersects the imaging optical axis;
a region setting step of setting a plurality of imaging regions in the vertical direction by relatively moving the electronic component and the imaging region, the imaging regions being regions where the electronic component intersects with a recognition surface in the imaging region, the recognition surface being a surface orthogonal to the imaging optical axis and passing through an intersection point where the imaging optical axis intersects with the illumination optical axis;
an imaging step of imaging a recognition image of a cross-sectional shape of the electronic component for each of a plurality of imaging areas;
a generation step of generating the three-dimensional data of the electronic component by synthesizing the plurality of recognition images captured in the imaging step,
wherein in the generation step, the three-dimensional data of the electronic component is generated by synthesizing the plurality of recognition images, using the size of the outline obtained from each image and the size of the electronic component in the vertical direction corresponding to the plurality of recognition images.
9. The component data generation method according to claim 8, wherein:
in the region setting step, the imaging position in the vertical direction is changed at predetermined intervals, and an imaging area is set for each of the plurality of imaging positions.
10. The component data generation method according to claim 8 or 9, characterized in that:
in the region setting step, the plurality of imaging positions stored in the interval storage unit are sequentially read to set the imaging areas.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/064484 WO2015181974A1 (en) | 2014-05-30 | 2014-05-30 | Component-data-generating device, surface-mounting machine, and method for generating component data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106465580A CN106465580A (en) | 2017-02-22 |
CN106465580B true CN106465580B (en) | 2019-06-18 |
Family
ID=54698348
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201480078459.1A Active CN106465580B (en) | 2014-05-30 | 2014-05-30 | Component data generating means, surface mounting apparatus and component data generation method |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP6147927B2 (en) |
CN (1) | CN106465580B (en) |
WO (1) | WO2015181974A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016147332A1 (en) * | 2015-03-18 | 2016-09-22 | 富士機械製造株式会社 | Recognition device |
JP6689301B2 (en) * | 2016-02-18 | 2020-04-28 | 株式会社Fuji | Component determination device and component determination method |
CN108575086A (en) * | 2017-03-08 | 2018-09-25 | 台达电子电源(东莞)有限公司 | Electronic component pin station acquisition device, identification device and automatism card machine |
WO2019003267A1 (en) * | 2017-06-26 | 2019-01-03 | ヤマハ発動機株式会社 | Component mounting device and component data creation method |
WO2020075256A1 (en) * | 2018-10-11 | 2020-04-16 | 株式会社Fuji | Work machine |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11220298A (en) * | 1998-02-02 | 1999-08-10 | Matsushita Electric Ind Co Ltd | Electronic component mounting method |
JP2000121338A (en) * | 1998-10-13 | 2000-04-28 | Yamagata Casio Co Ltd | Electronic component inspecting device |
CN1645590A (en) * | 2004-01-23 | 2005-07-27 | 株式会社瑞萨科技 | Fabrication method of semiconductor integrated circuit device |
CN101091426A (en) * | 2005-03-29 | 2007-12-19 | 松下电器产业株式会社 | Component shape profiling method and component mounting method |
CN101324426A (en) * | 2007-06-11 | 2008-12-17 | 西门子公司 | Evaluation of surface structure of devices using different-angle presentation |
CN102099653A (en) * | 2008-07-21 | 2011-06-15 | 伟特机构有限公司 | A method and means for measuring positions of contact elements of an electronic components |
CN102364297A (en) * | 2010-06-15 | 2012-02-29 | Juki株式会社 | Electronic component mounting apparatus |
CN103196364A (en) * | 2012-01-10 | 2013-07-10 | 雅马哈发动机株式会社 | Element photographing device, surface mounting machine and component inspection device |
JP2013191775A (en) * | 2012-03-14 | 2013-09-26 | Panasonic Corp | Component mounting device and component shape measuring method |
CN103376069A (en) * | 2012-04-27 | 2013-10-30 | 松下电器产业株式会社 | Member installation apparatus and member shape identification method |
CN105684568A (en) * | 2013-11-13 | 2016-06-15 | 雅马哈发动机株式会社 | Component image pickup apparatus and surface-mounting apparatus using same |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3629532B2 (en) * | 2001-03-27 | 2005-03-16 | 国立大学法人 和歌山大学 | Method and system for measuring real-time shape of continuously moving object |
JP2006294881A (en) * | 2005-04-12 | 2006-10-26 | Juki Corp | Method and device for component recognition |
JP4715944B2 (en) * | 2009-04-03 | 2011-07-06 | オムロン株式会社 | Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and three-dimensional shape measuring program |
- 2014-05-30 JP JP2016523078A patent/JP6147927B2/en active Active
- 2014-05-30 CN CN201480078459.1A patent/CN106465580B/en active Active
- 2014-05-30 WO PCT/JP2014/064484 patent/WO2015181974A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2015181974A1 (en) | 2017-04-20 |
CN106465580A (en) | 2017-02-22 |
JP6147927B2 (en) | 2017-06-14 |
WO2015181974A1 (en) | 2015-12-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106465580B (en) | Component data generating means, surface mounting apparatus and component data generation method | |
CN105684568B (en) | Element photographic device and the surface mounting apparatus for using the element photographic device | |
KR20010033900A (en) | Electronics assembly apparatus with stereo vision linescan sensor | |
CN103196364B (en) | Element camera head, surface mounting apparatus and component detection device | |
JP6258295B2 (en) | Component recognition system for component mounters | |
KR102224699B1 (en) | 3d measurement device, 3d measurement method, and manufacturing method of substrate | |
TW202111455A (en) | Methods of positioning components in desired positions on a board | |
JP2009094295A (en) | Apparatus for measuring height of electronic component | |
JP5296749B2 (en) | Component recognition device and surface mounter | |
JP2005340648A (en) | Part recognition method, part recognition apparatus, surface mounter, and part inspection apparatus | |
CN106937525B (en) | Image generation device, installation device, and image generation method | |
JP6388135B2 (en) | Electronic component mounting apparatus and electronic component mounting method | |
JP2007225317A (en) | Apparatus for three-dimensionally measuring component | |
JP6392958B2 (en) | Component imaging apparatus and surface mounter using the same | |
JP6912993B2 (en) | Component mounting device | |
JP6770069B2 (en) | measuring device | |
JPWO2015011851A1 (en) | Electronic component mounting apparatus and electronic component mounting method | |
JP6033052B2 (en) | Parts transfer device | |
KR20140071265A (en) | Component mounting apparatus, head and method for recognizing component attitude | |
JP4401210B2 (en) | Electronic component mounting equipment | |
JP6239374B2 (en) | Inclination inspection system for parts | |
CN114287176B (en) | Working machine | |
JP7012887B2 (en) | Parts mounting machine | |
JP4840924B2 (en) | Electronic component mounting machine | |
JP4724612B2 (en) | Component recognition device and surface mounter |
Legal Events
Code | Title
---|---
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
GR01 | Patent grant