WO2010011124A1 - A method and means for measuring positions of contact elements of an electronic components - Google Patents

A method and means for measuring positions of contact elements of an electronic components

Info

Publication number
WO2010011124A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image point
point
contact elements
camera
Prior art date
Application number
PCT/MY2009/000082
Other languages
French (fr)
Inventor
Ting Lik Wong
Chee Kit Loh
Chen Chung Yew
Original Assignee
Vitrox Corporation Bhd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vitrox Corporation Bhd filed Critical Vitrox Corporation Bhd
Priority to CN2009801278488A priority Critical patent/CN102099653B/en
Priority to JP2011520004A priority patent/JP5787258B2/en
Publication of WO2010011124A1 publication Critical patent/WO2010011124A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/06 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness for measuring thickness, e.g. of sheet material
    • G01B11/0608 - Height gauges
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G06T7/593 - Depth or shape recovery from multiple images from stereo images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30152 - Solder

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Supply And Installment Of Electrical Components (AREA)

Abstract

A method and means for measuring positions of contact elements (2, 3) of an electronic component that utilizes scaling factors in the x, y and z dimensions is disclosed herein. The scaling factors are determined during a calibration procedure which also establishes the camera-to-camera relationship of the system. The calibration maps each image point recorded by the first camera (4) for a point on a contact element to the corresponding image point recorded by the second and third cameras (5, 6), as part of the process of obtaining the displacement between a first image point and a second image point and thereby the height difference between different contact elements.

Description

A METHOD AND MEANS FOR MEASURING POSITIONS OF CONTACT ELEMENTS OF AN ELECTRONIC COMPONENTS
FIELD OF INVENTION
The invention relates to the field of machine vision inspection and measurement of contact element positions on an electronic component.
BACKGROUND OF THE INVENTION
In the prior art, methods and means for measuring positions of contact elements of an electronic component utilize two cameras set up at a triangulation angle to capture a first and a second image respectively, for determining the z-position of the contact elements relative to the bottom surface of the device. The accuracy of the determined positions of the contact elements is affected by the triangulation angle between the two cameras set up to record the first and second images. There is, however, a limit to the accuracy of the calculated triangulation because the data used are derived from 2D information. Furthermore, the calculated triangulation angle can be affected by lens distortion, and non-uniform illumination can contribute error to the derived equation. Therefore, it is advantageous to provide an alternative method that is more robust and unaffected by these uncertain factors during calibration and measurement, and that provides 3-dimensional positions of the measured contact elements.
SUMMARY OF THE INVENTION
The present invention relates to a method for measuring respective positions of contact elements of an electronic component comprising the steps of: bringing said contact elements into a calibrated space; illuminating said contact elements; recording a first image of said contact elements with a first camera having a first image plane extending substantially parallel to a calibration plane; recording a second image of said contact elements with a second camera; processing said first image to determine a first image point on each said contact element, wherein each said first image point is a point located on each said contact element; processing said second image to identify a second image point on each said contact element, wherein said second image point and said first image point for each said contact element correspond to a same point on each said contact element; determining within said second image a third image point by a position mapping algorithm; and determining within said second image a displacement between said second image point and said third image point, wherein said third image point is said first image point orthogonally projected to said calibration plane.
The present invention also relates to a means for measuring respective positions of contact elements of an electronic component comprising: an illumination source for illuminating said contact elements predisposed in a calibrated space; a first camera and a second camera, said first camera and said second camera being provided for recording a first and a second image of said contact elements respectively; and a processing device, wherein said processing device is connected to said first camera and said second camera for retrieving the positions of a first image point in said first image and a second image point in said second image for each said contact element, said first image point and said second image point for each said contact element corresponding to a same point on each said contact element, retrieving, by a position mapping algorithm, a third image point in said second image, said third image point being a point orthogonally projected from said first image point, and determining the displacement between said second image point and said third image point.
An additional third camera may be included to compensate for image points recorded by the first camera that cannot be recorded by the second camera. The inclusion of the third camera also shortens the overall inspection time.
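By way of illustration only, and not as part of the original disclosure, the claimed measurement steps can be summarised in the following minimal Python sketch; the image point locations and the position mapping are assumed inputs, and the function name is hypothetical.

    import math

    def measure_contact_height(x1, x2, position_mapping, z_scale):
        # x1: first image point of a contact element (pixels, first camera)
        # x2: second image point of the same contact element (pixels, second camera)
        # position_mapping: callable mapping a first-image point to the point it
        #                   would occupy in the second image if it lay on the
        #                   calibration plane (the "third image point")
        # z_scale: z-scaling factor of the second image (e.g. mm per pixel)
        x3 = position_mapping(x1)               # third image point
        displacement_px = math.dist(x2, x3)     # displacement within the second image
        return displacement_px * z_scale        # height offset from the calibration plane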
BRIEF DESCRIPTION OF THE DRAWINGS
These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings, in which like characters represent like parts throughout, wherein: Figures 1A and 1B illustrate schematically the set-up of the illumination and the cameras according to the present invention.
Figures 2A, 2B and 2C respectively show examples of a first, second and third image of a BGA component as recorded by the first, second and third camera.
Figure 3 shows an example of a multiple-layer reticle used as a calibration means.
Figures 4A, 4B and 4C show examples of images of the calibration reticle means as recorded by the first, second and third camera.
Figure 5 illustrates the mapping algorithm of co-ordinates from the first to the second calibration image.
Figure 6 illustrates the measurement principle according to the invention.
DETAILED DESCRIPTION OF THE INVENTION
The method according to the invention is designed for the automatic computation of the three-dimensional positions of contact elements of electronic components such as BGA (Ball Grid Array)/CSP (Chip Scale Packaging) devices, flip-chip devices, leaded devices (QFP, TSOP) and leadless devices (MLP, QFN). The invention is also able to automatically compute the 3-dimensional positions of a combination of contact elements found on a single device, for example a combination of a BGA/CSP device (1) with a leaded device, as illustrated in Figure 1.
Figure 1 illustrates the electronic component (1) predisposed in a calibrated space in such a manner that its contact elements (2) and (3) are illuminated by illumination sources (7) and (8). A first camera (4) is located substantially perpendicular to the calibration plane to record a first image from the bottom of the component, as illustrated in Figure 2A. A second camera (5) and a third camera (6) are located to record second and third images of side perspective views of the electronic component, as respectively illustrated in Figures 2B and 2C. In a first embodiment of the invention, only two cameras are needed for determining the 3-dimensional positions of the contact elements of the electronic component. The third camera is used to record an image of contact elements of the electronic component in case these contact elements cannot be viewed in the second image. The use of a third camera also allows for a shorter overall image acquisition time. When the third camera is used, the method and means for measuring the positions of the other contact elements recorded in the respective first and third images of the first and third cameras are the same as the method and means used with the first and second cameras. Therefore, the operation of the invention is described using the first and second cameras only.
Preferably, cameras 1 and 2 are positioned relative to each other in such a manner that both cameras are directed towards the electronic component, as illustrated in Figures 1A and 1B. Furthermore, cameras 1, 2 and 3 can be placed at different positions as long as all the contact elements of the entire electronic component can be viewed by at least two different cameras. For example, Figures 1A and 1B together illustrate that all three cameras are located at different x, y and z coordinates so that all possible locations of the contact elements are in the view of at least two different cameras.
The apparatus for measuring the 3-dimensional positions of the contact elements is calibrated before the measurement of the positions of the contact elements can be started. The calibration establishes a relationship between a position X1 in the first image and a corresponding position X3 in the second image, i.e. an image point orthogonally projected to the calibration plane. A calibration reticle means (9) as illustrated in Figure 3 is used for the calibration. It comprises a number of predetermined markings which are represented, for example, by squares whose positions are accurately known.
This reticle (9) is, for example, made of a substrate which is placed at the location provided for predisposing the electronic component to be measured. The substrate is, for example, composed of multiple glass layers disposed, preferably in an array, on a base glass layer. Squares are printed on each of these glass layers using a highly accurate screen-printing process. The thickness of each of these upper glass layers is accurately known, and the accurately printed markings are clearly defined for easy detection by the cameras. Though not illustrated, the calibration reticle means described above may be replaced by another calibration reticle means, such as a reticle with vertical structures projecting out from a layer at accurately known positions.
In order to obtain an accurate calibration, the positions and sizes of the squares printed on the calibration reticle means are preferably known to an accuracy of 0.1 microns. The area and volume spanned by the calibration reticle means define the space that can be calibrated in the x, y and z directions. Therefore, a larger measurement space is available for larger electronic components when a larger space and area are covered by the calibration reticle means in the x, y and z directions. When measurements are carried out later, it is not necessary that the plane in which the contacts are situated lies exactly on the calibration plane (11). Measurements can be carried out as long as the plane in which the contacts are situated is within the established calibrated space.
Figures 4A, 4B and 4C illustrate the first, second and third images of the calibration reticle means recorded by the first, second and third camera respectively. Since the individual markings, such as the squares on the calibration reticle means, can be identified, a mapping algorithm is determined through the calibration procedure to enable mapping between the images later on, as illustrated figuratively in Figure 5. The first camera (4) and the second camera (5) each record a calibration image of the calibration reticle means, as shown in Figures 4A and 4B respectively. As shown in Figures 4A and 4B, the calibration comprises the determination of the positions of the same squares (10) (11), or similar features, in the first image (Figure 4A) and the second image (Figure 4B) as recorded by the first (4) and second camera (5). A position mapping algorithm is derived from the positions of each corresponding feature determined in the first and second images. For example, as illustrated in Figure 5, the co-ordinates of a point (111) in the first image are mapped to its corresponding point (111') in the second image. Hence, plane (11) is the calibration plane, since features on its surface are used for mapping pixels from the first image plane to the second image.
According to Figure 5, a bilinear interpolation technique is used to determine the coordinates of all positions between any two markings. For example, the position of a point X1 in the first image is expressed in fractions dx and dy with respect to the coordinates (111), (112), (221) and (222) that form a rectangular matrix in the first image. Point X3 in the second image is the point orthogonally projected from point X1 in the first image, and X3 is expressed in fractions dx' and dy'. The position mapping algorithm between a first image point in the first image and an image point in the second image, wherein the image point in the second image is the point orthogonally projected from the first image point onto the calibration plane, is obtained by determining the relationship relating dx and dx' and the relationship relating dy and dy'. For example, the position mapping algorithm relating X1 in the first image plane and X3 in the second image plane is a relationship relating dx and dx' for the x-coordinate and a relationship relating dy and dy' for the y-coordinate. The position mapping algorithm is then used during measurements for determining the point in the second image plane which is a first image point orthogonally projected onto the calibration plane.
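A minimal sketch of one possible bilinear position mapping is given below; this is an assumption about how the relationship between (dx, dy) and (dx', dy') might be implemented, not the patented algorithm itself. The four calibration markings surrounding X1 are assumed to be available as pixel coordinates in both images, ordered top-left, top-right, bottom-left, bottom-right.

    def map_point_bilinear(p, cell_first, cell_second):
        # p: point X1 in the first image, as (x, y) pixels.
        # cell_first, cell_second: the same four surrounding calibration markings
        # in the first and second image, ordered (top-left, top-right,
        # bottom-left, bottom-right).
        (ax, ay), (bx, by), (cx, cy), _ = cell_first
        px, py = p
        fx = (px - ax) / (bx - ax)             # fraction dx inside the first-image cell
        fy = (py - ay) / (cy - ay)             # fraction dy inside the first-image cell
        (ax2, ay2), (bx2, by2), (cx2, cy2), (dx2, dy2) = cell_second
        top = (ax2 + fx * (bx2 - ax2), ay2 + fx * (by2 - ay2))
        bot = (cx2 + fx * (dx2 - cx2), cy2 + fx * (dy2 - cy2))
        # Reusing the fractions in the second-image cell gives the projected point X3.
        return (top[0] + fy * (bot[0] - top[0]),
                top[1] + fy * (bot[1] - top[1]))

In practice the relationships relating dx to dx' and dy to dy' could also be fitted per cell or globally; this sketch simply reuses the fractions directly.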
Referring to Figures 3, 4A, 4B and 5, a z-scaling factor is determined by first determining the respective pixel coordinates X1 and X2 in the first and second images for features, such as squares (10), on the calibration reticle means. X3 is the point orthogonally projected onto the calibration plane from the point X1 using the position mapping algorithm. Since the z distance between X2 and X3 is accurately known, the respective z-scaling factors in the second image (4B) and third image (4C) can be determined. For example, suppose that the distance between X2 and X3 is 10 mm and their corresponding distance in the second image is 25 pixels; the z-scaling factor would then be 10 mm / 25 pixels = 0.4 mm/pixel.
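A short numeric check of the z-scaling factor described above; the coordinates are made-up placeholders, and only the 10 mm and 25 pixel figures come from the text.

    import math

    def z_scaling_factor(x2, x3, known_distance_mm):
        # z-scaling factor = known physical separation / measured pixel separation.
        return known_distance_mm / math.dist(x2, x3)

    # 10 mm imaged as 25 pixels in the second image -> 0.4 mm/pixel.
    print(z_scaling_factor((0.0, 0.0), (0.0, 25.0), 10.0))   # 0.4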
After completing the calibration procedure, an x-y coordinate position mapping algorithm and a z-scaling factor have been obtained for determining the positions of the contact elements on the component in the first and second images. In addition, distances between the contact elements along the X, Y and Z axes can be determined from the respective x, y and z-scaling factors. As the positions are accurately determined by the calibration reticle means, the positions of the contact elements can also be determined accurately in microns. The calibration procedure also enables a camera-to-camera calibration to be carried out, establishing the relationship between the first (4) and second camera (5). The relationship of the third camera with the first camera is established in the same manner.
Now that the space for measurement has been calibrated and the set-up of the first and second cameras has been explained, the measurement principle according to the present invention will be explained with the help of Figure 6. The third camera (6) is set up with respect to the first camera (4) in a manner similar to the second camera. Suppose an electronic component is brought into the calibrated space and one of its contact elements is located at a point P with respect to the calibration plane p. The first camera records a first image of the contact elements and a first point P is selected. The position of a first image point X1 for the first point P in the first image of the contact element is determined.
The second camera (5) records a second image of the contact element with the point P, and point P is identified as a second image point X2 in the second image. The point P is orthogonally projected, using the position mapping algorithm, onto the calibration plane p, giving a second point P'. In the second image, the third image point X3 is then the image point of position P'. X3 is the position expected if the contact element at P were located in the calibration plane p. The position of the point X3 in the second image is determined by the position mapping algorithm. The displacement between X2 and X3 arises because the real contact element is usually not located exactly on the calibration plane but at a height difference Δz, which is to be measured. Accordingly, the displacement Δz is determined as the product of the distance between X3 and X2 in the second image (i.e. Δz') and the z-scaling factor of the second image.
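As a worked example with assumed numbers (only the 0.4 mm/pixel scaling factor comes from the calibration example above), the height offset of a contact element could be computed as follows.

    # Hypothetical measurement values, for illustration only.
    x2 = (412.0, 300.0)       # second image point actually observed for the contact element
    x3 = (412.0, 288.0)       # third image point: X1 projected onto the calibration plane
    z_scale = 0.4             # mm per pixel, from the calibration example

    delta_z_px = ((x2[0] - x3[0]) ** 2 + (x2[1] - x3[1]) ** 2) ** 0.5   # Δz' = 12 pixels
    delta_z_mm = delta_z_px * z_scale                                    # Δz = 4.8 mm
    print(delta_z_mm)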
Since the invention enables measurement of a point within the calibrated space, it is evident to a person reasonably skilled in the art that the invention can be used to measure the distance between any two points, as long as the two points are located within the calibrated space.
It is therefore apparent to anyone skilled in the art that such a system and method are practical and useful for measuring the distances between contact elements of a component, which may be an integrated circuit on which the contact elements are situated, and the distance of the contact elements from a calibration plane within a calibrated space. The above description is therefore illustrative and in no way limits the essence of the invention, whose scope and embodiments are claimed below.

Claims

1. A method for measuring respective positions of contact elements (2, 3) of an electronic component comprising the steps of: bringing said contact elements (2, 3) into a calibrated space; illuminating said contact elements (2, 3) ; recording a first image of said contact elements (2, 3) with a first camera (4) having a first image plane extending substantially parallel to a calibration plane; recording a second image of said contact elements with a second camera (5); processing said first image to determine a first image point for each said contact element (2, 3) on each said contact element (2, 3), wherein each said first image point is a point located on each said contact element (2, 3); processing said second image to identify a second image point for each said contact element (2, 3) on each said contact element (2, 3); wherein said second image point and said first image point for each said contact element (2, 3) corresponds to a same point on each said contact element (2, 3); determining within said second image a third image point by a position mapping algorithm; and determining within said second image a displacement between said second image point and said third image point, wherein said third image point is said first image point orthogonally projected to said calibration plane.
2. A method as claimed in claim 1, wherein said step of determining displacement is by multiplying a z-scaling factor and coordinate difference of said second image point and said third image point.
3. A method as claimed in claim 1, further comprising the step of deriving said position mapping algorithm for a particular relative position between said first and second camera (5).
4. A method as claimed in claim 1, further comprising the step of deriving a z-scaling factor for determining the displacement of said second image point and said orthogonally projected image point.
5. A method as claimed in claim 1, further comprising the step of determining displacement between any two points within the calibrated space.
6. A means for measuring respective positions of contact elements (2, 3) of an electronic component comprising: an illumination source (7, 8) for illuminating said contact elements (2, 3) predisposed in a calibrated space; a first camera (4) and a second camera (5); said first camera (4) and said second camera (5) being provided for recording a first and second image of said contact elements (2, 3) respectively; a processing device, wherein said processing device is connected to said first camera (4) and said second camera (5) for retrieving positions of a first image point in said first image for each said contact element (2, 3) and a second image point in said second image for each said contact elements (2, 3), said first image point and said second image point for each said contact elements (2, 3) corresponds to a same point on each said contact element (2, 3), retrieving by a position mapping algorithm in said second image position a third image point, said third image point is a point orthogonally projected from said first image point, and determining displacement between said second image point and said third point.
7. A means for measuring respective positions of contact elements (2, 3) of an electronic component as claimed in claim 6, wherein said processing device determines said displacement by multiplying a z- scaling factor and coordinate difference of said second image point and said third image point.
8. A means for measuring respective positions of contact elements (2, 3) of an electronic component as claimed in claim 6, further comprising a calibration reticle means (9) which image is recorded by said first and second cameras (4, 5) and said processing device derives said position mapping algorithm from said images of said calibration reticle means for a particular relative position between said first and second camera (4, 5).
9. A means for measuring respective positions of contact elements (2, 3) of an electronic component as claimed in claim 8, wherein said calibration reticle means (9) comprising a base layer having features disposed thereupon and multiple layers having features thereupon, said multiple layers have accurately known thickness and are accurately positioned relative to said features upon said base layer.
10. A means for measuring respective positions of contact elements (2, 3) of an electronic component as claimed in claim 8, wherein said processing device derives a z-scaling factor for determining the displacement of said second image point and said third image point.
11. A means for measuring respective positions of contact elements (2, 3) of an electronic component as claimed in claim 7, wherein said processing device determines displacement between any two points within the calibrated space.
12. A means for measuring respective positions of contact elements (2, 3) of an electronic component as claimed in claim 6, further comprising a third camera (6) connected to said processing device for recording a third image of said contact elements, said third image having a fourth image point and a fifth image point, wherein said fifth image point is a point orthogonally projected from said first image point and is not recorded in said second image, and said processing device determines a displacement between said fourth image point and said fifth image point.
PCT/MY2009/000082 2008-07-21 2009-06-26 A method and means for measuring positions of contact elements of an electronic components WO2010011124A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN2009801278488A CN102099653B (en) 2008-07-21 2009-06-26 Method and means for measuring positions of contact elements of an electronic components
JP2011520004A JP5787258B2 (en) 2008-07-21 2009-06-26 Method and apparatus for measuring the position of a contact element of an electronic component

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI20082704A MY148204A (en) 2008-07-21 2008-07-21 A method and means for measuring positions of contact elements of an electronic components
MYPI20082704 2008-07-21

Publications (1)

Publication Number Publication Date
WO2010011124A1 true WO2010011124A1 (en) 2010-01-28

Family

ID=41570469

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2009/000082 WO2010011124A1 (en) 2008-07-21 2009-06-26 A method and means for measuring positions of contact elements of an electronic components

Country Status (5)

Country Link
JP (1) JP5787258B2 (en)
KR (1) KR101633139B1 (en)
CN (1) CN102099653B (en)
MY (1) MY148204A (en)
WO (1) WO2010011124A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102213581A (en) * 2010-04-08 2011-10-12 财团法人工业技术研究院 Object measuring method and system
US8691916B2 (en) 2012-05-07 2014-04-08 Dow Global Technologies Llc Retortable easy opening seals for film extrusion
JP2016151538A (en) * 2015-02-19 2016-08-22 富士機械製造株式会社 Component determination device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6147927B2 (en) * 2014-05-30 2017-06-14 ヤマハ発動機株式会社 Component data generation apparatus, surface mounter, and component data generation method
DE102016112197B4 (en) * 2016-07-04 2018-05-24 Asm Assembly Systems Gmbh & Co. Kg A method and apparatus for stereoscopically determining information regarding the elevation of the front of a port

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5912985A (en) * 1994-11-08 1999-06-15 Matsushita Electric Industrial Co., Ltd. Pattern detection method
WO2000062012A1 (en) * 1999-04-13 2000-10-19 Icos Vision Systems N.V. Measuring positions or coplanarity of contact elements of an electronic component with a flat illumination and two cameras
US6538750B1 (en) * 1998-05-22 2003-03-25 Cyberoptics Corporation Rotary sensor system with a single detector

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6072898A (en) * 1998-01-16 2000-06-06 Beaty; Elwin M. Method and apparatus for three dimensional inspection of electronic components
EP1220596A1 (en) * 2000-12-29 2002-07-03 Icos Vision Systems N.V. A method and an apparatus for measuring positions of contact elements of an electronic component
JP3915033B2 (en) * 2003-05-15 2007-05-16 株式会社テクノホロン Measuring method and measuring apparatus using stereo optical system
CN100374816C (en) * 2003-05-28 2008-03-12 富士机械制造株式会社 Pickup image processing device of electronic part mounting device and pickup image processing method
JP4871352B2 (en) * 2005-03-11 2012-02-08 クリアフォーム インク. Automatic reference system and apparatus for 3D scanning
US7400417B2 (en) * 2005-05-23 2008-07-15 Federal Mogul World Wide, Inc. Diffraction method for measuring thickness of a workpart
KR20070099398A (en) * 2006-04-03 2007-10-09 삼성전자주식회사 Apparatus for inspecting substrate and method of inspecting substrate using the same
JP2009139285A (en) * 2007-12-07 2009-06-25 Univ Nihon Solder ball inspection device, its inspection method, and shape inspection device

Also Published As

Publication number Publication date
JP2011528800A (en) 2011-11-24
KR101633139B1 (en) 2016-06-23
JP5787258B2 (en) 2015-09-30
KR20110043593A (en) 2011-04-27
CN102099653A (en) 2011-06-15
CN102099653B (en) 2012-11-14
MY148204A (en) 2013-03-15

Similar Documents

Publication Publication Date Title
JP5334835B2 (en) Method and system for measuring shape of reflecting surface
TWI440847B (en) Inspection method
US6750899B1 (en) Solder paste inspection system
US8885040B2 (en) Method and apparatus for 3-dimensional vision and inspection of ball and like protrusions of electronic components
US20130194569A1 (en) Substrate inspection method
JPH07311025A (en) Three-dimensional shape inspection device
WO2010011124A1 (en) A method and means for measuring positions of contact elements of an electronic components
JP2007078533A (en) Method of inspecting substrate
CN107271445B (en) Defect detection method and device
KR102011910B1 (en) 3 dimensional shape detection apparatus and method
CN110044266B (en) Photogrammetry system based on speckle projection
KR101183101B1 (en) Method of die bonding for flip chip
JPH04181106A (en) Calibration device of position dimension measuring device
US8102516B2 (en) Test method for compound-eye distance measuring apparatus, test apparatus, and chart used for the same
JP4333349B2 (en) Mounting appearance inspection method and mounting appearance inspection apparatus
WO2002029357A2 (en) Method and apparatus for evaluating integrated circuit packages having three dimensional features
EP1619623A1 (en) Apparatus for three dimensional measuring on an electronic component
CN110020648A (en) Workpiece measures and localization method
CN112985276B (en) Thickness measuring method and system for circuit board
KR102234984B1 (en) Apparatus for detecting particle of a semiconductor wafer
CN113916152B (en) Sample detection device and method based on phase deflection technology
CN113063352B (en) Detection method and device, detection equipment and storage medium
JPH07140088A (en) Correcting method for unit shift amount of appearance-inspection camera in circuit board inspection device
CN115375681B (en) Large-size target measuring method based on image splicing
JP3226712B2 (en) Semiconductor pellet height measuring device and its measuring method

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980127848.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09800602

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
ENP Entry into the national phase

Ref document number: 2011520004

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20117000873

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09800602

Country of ref document: EP

Kind code of ref document: A1