WO2010090605A1 - Methods for examining a bonding structure of a substrate and bonding structure inspection devices - Google Patents

Methods for examining a bonding structure of a substrate and bonding structure inspection devices

Info

Publication number
WO2010090605A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
wire
die
bonding structure
camera
Prior art date
Application number
PCT/SG2010/000042
Other languages
English (en)
Inventor
Kum Pang Chung
Chew Junn Lam
Jian Xu
Tong Liu
Zai Xin Tang
Albertus Zakaria
Original Assignee
Agency For Science, Technology And Research
Component Technology Pte Ltd
Priority date
Filing date
Publication date
Application filed by Agency For Science, Technology And Research and Component Technology Pte Ltd
Priority to SG2011052479A (SG173068A1)
Priority to CN201080015400.XA (CN102439708B)
Publication of WO2010090605A1

Classifications

    • H01L24/85 Methods for connecting semiconductor or other solid state bodies using means for bonding being attached to, or being formed on, the surface to be connected, using a wire connector
    • G01N21/8806 Investigating the presence of flaws or contamination; specially adapted optical and illumination features
    • H01L24/49 Structure, shape, material or disposition of a plurality of wire connectors after the connecting process
    • H01L2223/54426 Marks applied to semiconductor devices or parts, for alignment
    • H01L2223/54473 Marks applied to semiconductor devices or parts, for use after dicing
    • H01L2224/05552 Bonding area external layer: shape in top view
    • H01L2224/05553 Bonding area external layer: shape in top view being rectangular
    • H01L2224/05554 Bonding area external layer: shape in top view being square
    • H01L2224/0603 Bonding areas having different sizes, e.g. different heights or widths
    • H01L2224/45014 Wire connector core members: ribbon connectors, e.g. rectangular cross-section
    • H01L2224/45015 Wire connector core members: cross-sectional shape being circular
    • H01L2224/45144 Wire connector core members: gold (Au) as principal constituent
    • H01L2224/48091 Wire connector loop shape: arched
    • H01L2224/48247 Wire connected to a bond pad of a metallic item stacked with the semiconductor body, e.g. chip-to-substrate
    • H01L2224/48472 Connecting portions: wedge bond on the bonding area of the semiconductor body, the other connecting portion also being a wedge bond (wedge-to-wedge)
    • H01L2224/4903 Plurality of wire connectors: connectors having different sizes, e.g. different diameters
    • H01L2224/49051 Plurality of wire connectors: connectors having different shapes
    • H01L2224/73265 Layer and wire connectors located on different surfaces after the connecting process
    • H01L2224/78901 Means for monitoring the connection process using a computer, e.g. fully- or semi-automatic bonding
    • H01L2224/85121 Active alignment, i.e. by apparatus steering, e.g. optical alignment using marks or sensors
    • H01L2224/859 Wire-connector bonding methods involving monitoring, e.g. feedback loop
    • H01L23/544 Marks applied to semiconductor devices or parts, e.g. registration marks, alignment structures, wafer maps
    • H01L24/45 Structure, shape, material or disposition of an individual wire connector prior to the connecting process
    • H01L24/48 Structure, shape, material or disposition of an individual wire connector after the connecting process
    • H01L2924/00014 Technical content checked by a classifier; subject-matter disclosed without further technical details
    • H01L2924/01005 Boron [B]
    • H01L2924/01013 Aluminum [Al]
    • H01L2924/01014 Silicon [Si]
    • H01L2924/01015 Phosphorus [P]
    • H01L2924/01027 Cobalt [Co]
    • H01L2924/01033 Arsenic [As]
    • H01L2924/01075 Rhenium [Re]
    • H01L2924/01077 Iridium [Ir]
    • H01L2924/01079 Gold [Au]
    • H01L2924/01082 Lead [Pb]
    • H01L2924/014 Solder alloys
    • H01L2924/10253 Semiconductor body material: elemental silicon [Si]
    • H01L2924/12041 Device type: LED
    • H01L2924/12042 Device type: LASER
    • H01L2924/14 Device type: integrated circuits
    • H01L2924/1461 Device type: MEMS

Definitions

  • Embodiments relate to methods for examining a bonding structure of a substrate and bonding structure inspection devices.
  • wire bonding involves using a wire to connect pads residing on a chip or die to a lead in a lead frame. Once the chip and lead frame have been wire bonded, the chip and lead frame may be further packaged in ceramic or plastic to form an integrated circuit device.
  • an inspection of the quality of the wire bond may be performed manually by a human operator using a microscope.
  • this manual method may be time consuming and costly.
  • illumination may need to be designed so as to minimize the specularity on the respective surfaces of the wires and the lead frame to ensure a relatively accurate measurement.
  • good contrast on other respective surfaces, such as the die surface and the bond surface, is also desired so as to be able to carry out two-dimensional (2D) measurements.
  • some examples of state-of-the-art illumination systems include a co-axial light, a dome light, a ring light, a combination of the dome light and the co-axial light, or other combinations thereof.
  • a bonding structure inspection device may be provided.
  • the bonding structure inspection device may include a substrate receiving region configured to receive a substrate, the substrate including the bonding structure to be examined; a stereoscopic image determining device configured to determine one or more stereoscopic images including one or a plurality of predefined regions of interest of portions of the bonding structure to be examined, the predefined regions of interest corresponding to reference regions of interest in a stored predetermined reference model of a reference bonding structure of a reference substrate; a memory configured to store the predetermined reference model of the reference bonding structure of the reference substrate; a feature determiner configured to determine one or more two-dimensional feature parameters and one or more three-dimensional feature parameters of the bonding structure to be examined from one or more of the plurality of predefined regions of interest; and a determiner configured to determine whether the determined feature parameters fulfill at least one predefined quality criterion with respect to the reference model (an illustrative sketch of such a determiner follows below).
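As an illustration only, the following minimal Python sketch shows one way such a determiner could compare measured two-dimensional and three-dimensional feature parameters against tolerances stored with a reference model. All class names, field names and numbers below are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch of a "determiner": every name and value here is invented
# for illustration and is not the patent's implementation.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class ReferenceFeature:
    name: str          # e.g. "bond_placement_x" or "wire_loop_height"
    nominal: float     # nominal value taken from the reference model
    tolerance: float   # allowed absolute deviation (the quality criterion)


def fulfills_quality_criteria(measured: Dict[str, float],
                              reference: List[ReferenceFeature]) -> bool:
    """Return True only if every measured feature parameter lies within the
    tolerance band defined by the reference model."""
    for ref in reference:
        value = measured.get(ref.name)
        if value is None or abs(value - ref.nominal) > ref.tolerance:
            return False
    return True


# Example with made-up values (units: micrometres):
reference_model = [
    ReferenceFeature("bond_placement_x", nominal=0.0, tolerance=10.0),
    ReferenceFeature("wire_loop_height", nominal=150.0, tolerance=25.0),
]
print(fulfills_quality_criteria(
    {"bond_placement_x": 4.2, "wire_loop_height": 160.0}, reference_model))  # True
```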
  • a method for examining a bonding structure of a substrate may be provided.
  • the method may include: providing a substrate including a bonding structure to be examined; determining one or more stereoscopic images including one or a plurality of predefined regions of interest of portions of the bonding structure to be examined, the predefined regions of interest corresponding to reference regions of interest in a stored predetermined reference model of a reference bonding structure of a reference substrate; determining one or more two-dimensional feature parameters and one or more three-dimensional feature parameters of the bonding structure to be examined from one or more of the plurality of predefined regions of interest; and determining whether the determined feature parameters fulfill at least one predefined quality criterion with respect to the reference model (the overall flow is illustrated in the sketch below).
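The sketch below strings these method steps together in Python, purely as an illustration. The image-acquisition and measurement steps are passed in as callables because the patent does not prescribe a software interface, and the height computation assumes the standard rectified-stereo triangulation relation (depth = focal length x baseline / disparity); none of the function or parameter names are from the patent. The resulting feature dictionary would then be checked against the reference model's tolerances.

```python
# Hedged sketch of the inspection flow: capture stereo images for predefined
# regions of interest and derive 2D and 3D feature parameters from them.
# All callables and names are hypothetical placeholders.
from typing import Callable, Dict, List, Tuple


def height_from_disparity(disparity_px: float, focal_px: float,
                          baseline_mm: float) -> float:
    # Classic rectified-stereo relation; an assumption about how a 3D feature
    # parameter such as wire loop height could be derived from a stereo pair.
    return focal_px * baseline_mm / disparity_px


def inspect_bonding_structure(
    rois: List[str],
    capture_stereo: Callable[[str], Tuple[object, object]],      # (left, right) images
    measure_2d: Callable[[object, str], Dict[str, float]],       # e.g. bond placement
    measure_disparity: Callable[[object, object, str], float],   # per-ROI disparity
    focal_px: float,
    baseline_mm: float,
) -> Dict[str, float]:
    measured: Dict[str, float] = {}
    for roi in rois:                                  # predefined regions of interest
        left, right = capture_stereo(roi)             # stereoscopic image of the ROI
        measured.update(measure_2d(left, roi))        # 2D feature parameters
        measured[roi + "_height"] = height_from_disparity(
            measure_disparity(left, right, roi), focal_px, baseline_mm)
    return measured   # subsequently compared against the reference model tolerances
```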
  • FIG. 1A shows a front view of a light detection arrangement, the light detection arrangement including a first light source, a second light source, a first beam splitter and a second beam splitter according to an embodiment
  • FIG. 1B shows a front view of a light detection system, the light detection system including the light detection arrangement of FIG. 1A according to an embodiment
  • FIG. 2A shows a front view of a light detection arrangement, the light detection arrangement including a body structure with a cutout portion according to an embodiment
  • FIG. 2B shows a top view of the body structure with the cutout portion of FIG. 2A according to an embodiment
  • FIG. 3 shows a perspective view of a prototype of the light detection arrangement of FIG. 2A according to an embodiment
  • FIG. 4A shows a left image obtained with the prototype of the light detection arrangement of FIG. 3 according to an embodiment
  • FIG. 4B shows a right image obtained with the prototype of the light detection arrangement of FIG. 3 according to an embodiment
  • FIG. 5A shows a front view of a light detection arrangement, the light detection arrangement including a beam splitter arrangement arranged on the same side of a body structure as an object receiving region according to an embodiment
  • FIG. 5B shows a side view of the light detection arrangement of FIG. 5A according to an embodiment
  • FIG. 6 shows a perspective view of a prototype of a light detection arrangement of FIG. 5A according to an embodiment
  • FIG. 7A shows a left image obtained with the prototype of the light detection arrangement of FIG. 5A according to an embodiment
  • FIG. 7B shows a right image obtained with the prototype of the light detection arrangement of FIG. 5A according to an embodiment
  • FIG. 8A shows a front view of a light detection arrangement, the light detection arrangement including a bar shaped light source according to an embodiment
  • FIG. 8B shows a side view of the light detection arrangement of FIG. 8A according to an embodiment
  • FIG. 9A shows a front view of a light detection arrangement, the light detection arrangement including an arc shaped light source according to an embodiment
  • FIG. 9B shows a top view of the arc shaped light source of FIG. 9A according to an embodiment
  • FIG. 10A shows a front view of a light detection arrangement, the light detection arrangement including a blue color dichroic mirror and a red color dichroic mirror according to an embodiment
  • FIG. 10B shows a top view of the light detection arrangement of FIG. 10A according to an embodiment
  • FIG. 11 shows a flow-chart of a method for detecting light in a light detection arrangement according to an embodiment
  • FIG. 12 shows a front view of a multi-camera wire bond inspection head in accordance with an embodiment
  • FIG. 13 shows a dome light for wire bond stereo inspection in accordance with an embodiment
  • FIG. 14A shows images with poor contrast between the wire end and the die in accordance with an embodiment
  • FIG. 14B shows an image with ideal illumination for each camera in accordance with an embodiment
  • FIG. 15 shows a flow diagram illustrating a method for examining a bonding structure of a substrate in accordance with an embodiment
  • FIG. 16 shows a bonding structure inspection device in accordance with an embodiment
  • FIG. 17 shows a bonding structure inspection device in accordance with an embodiment
  • FIG. 18A shows a perspective view of a wire bond inspection system in accordance with an embodiment
  • FIG. 18B shows a front view of a wire bond inspection system in accordance with an embodiment
  • FIG. 19 shows an overall flow chart of a process in accordance with an embodiment
  • FIG. 20 shows a flow diagram illustrating details of learn product steps in accordance with an embodiment
  • FIG. 21 shows a flow diagram illustrating a 2D wire bond recipe creation in accordance with an embodiment
  • FIG. 22 shows an illustration of bond placement measurement regions of interest (ROIs) in accordance with an embodiment
  • FIG. 23 shows a flow diagram illustrating 3D wire bond recipe creation in accordance with an embodiment
  • FIG. 24 shows an illustration of die tilt ROIs in accordance with an embodiment
  • FIG. 25 shows a flow diagram illustrating details of an inspect product step in accordance with an embodiment
  • FIG. 26 shows a flow diagram illustrating inspecting 2D features in accordance with an embodiment
  • FIG. 27 shows an illustration of long tail ROIs and measurement in accordance with an embodiment
  • FIG. 28 shows a flow diagram illustrating inspecting 3D features in accordance with an embodiment
  • FIG. 29 shows an illustration of wire loop height ROIs and measurement in accordance with an embodiment
  • FIG. 30 shows a flow diagram illustrating calibrating 2D inspection in accordance with an embodiment
  • FIG. 31 shows a flow diagram illustrating calibrating 3D inspection in accordance with an embodiment.
  • An embodiment may provide a light detection arrangement.
  • the light detection arrangement may include a body structure configured to provide light, the body structure including a first light-transmissive portion and a second light-transmissive portion disposed at a distance from the first light-transmissive portion; an object receiving region arranged such that light provided by the body structure may be illuminating at least a portion of the object receiving region; a first camera having a first main optical axis; a second camera having a second main optical axis; wherein the body structure may be arranged between the first camera and the second camera on its one side and the object receiving region on its other side; wherein the first camera may be arranged such that its first main optical axis may be directed to the object receiving region via the first light-transmissive portion; wherein the second camera may be arranged such that its second main optical axis may be directed to the object receiving region via the second light-transmissive portion; and an arrangement configured to provide light reflected from the object receiving region such that a first reflected light portion may be provided as a first co-axial light portion aligned to the first main optical axis, and such that a second reflected light portion may be provided as a second co-axial light portion aligned to the second main optical axis.
  • each of the first light-transmissive portion and the second light-transmissive portion may be configured to at least allow some light to pass through.
  • each of the first light-transmissive portion and the second light-transmissive portion may be translucent.
  • each of the first light-transmissive portion and the second light-transmissive portion may be configured to be transparent.
  • the first light-transmissive portion and the second light-transmissive portion may be formed as two separate openings or as a single opening.
  • the body structure may have a dome shape.
  • the dome shape may have a radius of between about 40 mm and about 45 mm, for example.
  • the dome shape may be sized so as to accommodate the object to be arranged in the object receiving region and at least one body structure illuminating light source.
  • the body structure may include a dome light, for example.
  • the body structure may also include any other light source as long as the light source may provide uniform illumination on the object receiving region.
  • the light detection arrangement may further include a structure having the first light-transmissive portion and the second light-transmissive portion, wherein the body structure has a cutout portion and the structure is arranged in the cutout portion.
  • the structure may be made of a material which does not allow light to pass through.
  • the structure may include a rectangular plate or a diffused white plastic sheet.
  • At least a portion of an inner surface of the body structure facing the object receiving region may have a reflective surface.
  • At least a portion of an inner surface of the body structure facing the object receiving region may be configured to generate light.
  • the at least a portion of an inner surface of the body structure facing the object receiving region may include a light generating coating and/or a plurality of organic light emitting diodes.
  • the light detection arrangement may further include at least one body structure illuminating light source arranged to illuminate at least a portion of an inner surface of the body structure.
  • the at least one body structure illuminating light source may be positioned within the body structure or positioned outside of the body structure.
  • the at least one body structure illuminating light source may include a plurality of lamps.
  • the plurality of lamps may include a light emitting diode array.
  • the plurality of lamps may include a diffuser or a diffuse light source.
  • the arrangement may include a light arrangement.
  • the light arrangement may include a first light source and a second light source.
  • the first light source may be arranged to direct light through the first light-transmissive portion to the object receiving region and/or the second light source may be arranged to direct light through the second light-transmissive portion to the object receiving region.
  • the first light source and the second light source may be arranged such that a light axis of the light emitted by the first light source may be substantially parallel with the first main optical axis and that a light axis of the light emitted by the second light source may be substantially parallel with the second main optical axis.
  • the first light source and/or the second light source may include a plurality of lamps.
  • the plurality of lamps may include a light emitting diode array.
  • the first light source and/or the second light source may provide a disperse light beam or a focused light beam.
  • the light arrangement may include a light source being selected from a group of light sources consisting of an arc shaped light source, a bar shaped light source, and a diffuser.
  • the arrangement may include a beam splitter arrangement.
  • the beam splitter arrangement may be arranged on the same side of the body structure as the first camera and the second camera.
  • the beam splitter arrangement may include a first beam splitter and a second beam splitter.
  • the arrangement may include a light arrangement, the light arrangement including a first light source and a second light source; wherein the first beam splitter may be arranged relative to the first light source and the first camera such that the first beam splitter may be configured to receive the light provided by the first light source and to re-direct the light to the object receiving region via the first light-transmissive portion, thereby generating the second reflected light portion; and wherein the second beam splitter may be arranged relative to the second light source and the second camera such that the second beam splitter may be configured to receive the light provided by the second light source and to re-direct the light to the object receiving region via the second light-transmissive portion, thereby generating the first reflected light portion.
  • the arrangement may include a light arrangement; and wherein the beam splitter arrangement may be arranged relative to the light arrangement and both the first camera and the second camera such that the beam splitter arrangement may be configured to receive the light provided by the light arrangement and to re-direct the light to the object receiving region via the respective first light-transmissive portion and the second light-transmissive portion to generate the second reflected light portion and the first reflected light portion from the received light.
  • the beam splitter arrangement may be arranged on the same side of the body structure as the object receiving region.
  • the beam splitter arrangement may be configured to receive the light provided by the inner surface of the body structure, and to generate the first reflected light portion and the second reflected light portion from the received light.
  • the beam splitter arrangement may include a single beam splitter configured such that the first reflected light portion may be provided as the first co-axial light portion aligned to the first main optical axis and such that the second reflected light portion may be provided as the second co-axial light portion aligned to the second main optical axis.
  • the first camera and the second camera may be arranged along a common camera plane.
  • the single beam splitter may be arranged at a substantially 45 degree angle with respect to the common camera plane.
  • the arrangement may include a first dichroic mirror of a first wavelength arranged in a light path along the first main optical axis and a second dichroic mirror of a second wavelength arranged in a light path along the second main optical axis.
  • the first dichroic mirror may be arranged in the first light-transmissive portion or on the one side of the body structure overlapping the first light-transmissive portion and wherein the second dichroic mirror may be arranged in the second light-transmissive portion or on the one side of the body structure overlapping the second light-transmissive portion.
  • the first dichroic mirror may be a blue color dichroic mirror and the second dichroic mirror may be a red color dichroic mirror; or wherein the second dichroic mirror may be a blue color dichroic mirror and the first dichroic mirror may be a red color dichroic mirror.
  • the light detection arrangement may further include a light separating structure.
  • the light separating structure may include an opaque material, for example black paper, so as to prevent the light from the first light source from interfering with the light from the second light source.
  • the light separating structure may be arranged between the first light source and the second light source, or between the beam splitter arrangement and the at least one body structure illuminating light source.
  • the light detection arrangement may further include an object arranged in the object receiving region.
  • the object may be a die or a chip.
  • the die may include a plurality of bonding structures.
  • the plurality of bonding structures may include a plurality of bond pads and a plurality of wire bonds.
  • Each of the plurality of wire bonds may be configured to connect one of the plurality of bond pads to the die.
  • the first camera and the second camera may be arranged in a manner such that the first main optical axis and the second main optical axis may be axial symmetric with respect to a symmetry axis being perpendicular to a plane defined by the object receiving region.
  • each of the first camera and the second camera may be arranged at a same camera angle (α) with respect to the symmetry axis. Further, each of the first beam splitter and the second beam splitter may be arranged at a same beam splitter angle (β) with respect to the symmetry axis. The difference between the beam splitter angle and the camera angle is about 45°. Correspondingly, the first beam splitter and the second beam splitter may be arranged at a combined beam splitter angle of 90° + 2α (a short consistency check follows below).
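Read together, these two statements are mutually consistent if the combined beam splitter angle is understood as the sum of the two symmetric beam splitter angles about the symmetry axis; the short LaTeX fragment below merely restates that arithmetic (an interpretation, not an additional teaching of the patent):

```latex
% Camera angle \alpha and beam splitter angle \beta, both measured from the symmetry axis.
\begin{align*}
  \beta - \alpha &= 45^{\circ}
      && \text{(stated difference between the two angles)} \\
  \Rightarrow\quad \beta &= \alpha + 45^{\circ} \\
  \text{combined angle} &= 2\beta = 2(\alpha + 45^{\circ}) = 90^{\circ} + 2\alpha
      && \text{(e.g. } \alpha = 10^{\circ} \Rightarrow 110^{\circ}\text{)}
\end{align*}
```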
  • the arrangement configured to provide light reflected from the object receiving region may compensate for the light provided by the body structure and escaping through the first light-transmissive portion and the second light-transmissive portion.
  • An embodiment may provide a method for detecting light in a light detection arrangement
  • the arrangement may be such that the light originating from the second light source may be essentially absorbed by the light separating structure and/or reflected off the object arranged in the object receiving region and thereafter sensed by the first camera.
  • FIG. 1A shows a front view of a light detection arrangement 102, the light detection arrangement 102 including a first light source 104, a second light source 106, a first beam splitter 108 and a second beam splitter 110 according to an embodiment.
  • the light detection arrangement 102 may include a body structure 112 configured to provide light, the body structure 112 including a first light-transmissive portion 114 and a second light-transmissive portion 116 disposed at a distance from the first light- transmissive portion 114. The distance between the first light-transmissive portion 114 and the second light-transmissive portion 116 may vary according to design and user requirements.
  • the light detection arrangement 102 may further include an object receiving region 118 arranged such that light provided by the body structure 112 may be illuminating at least a portion of the object receiving region 118.
  • the light detection arrangement 102 may further include a first camera 120 having a first main optical axis 122 and a second camera 124 having a second main optical axis 126.
  • the body structure 112 may be arranged between the first camera 120 and the second camera 124 on its one side and the object receiving region 118 on its other side.
  • the first camera 120 may be arranged such that its first main optical axis 122 may be directed to the object receiving region 118 via the first light-transmissive portion 114 and the second camera 124 may be arranged such that its second main optical axis 126 may be directed to the object receiving region 118 via the second light-transmissive portion 116.
  • the light detection arrangement 102 may further include an arrangement 128 configured to provide light reflected from the object receiving region 118 such that a first reflected light portion 130 may be provided as a first co-axial light portion 132 aligned to the first main optical axis 122, and such that a second reflected light portion 134 may be provided as a second co-axial light portion 136 aligned to the second main optical axis 126.
  • the arrangement 128 may help to generate a good illumination contrast between the die and the bond wire on an image captured by the second camera 124.
  • Each of the first light-transmissive portion 114 and the second light-transmissive portion 116 may be configured to at least allow some light to pass through or be translucent. Alternatively, each of the first light-transmissive portion 114 and the second light-transmissive portion 116 may be configured to be transparent. For example, the first light-transmissive portion 114 and the second light-transmissive portion 116 may be formed as two separate openings or as a single opening.
  • the body structure 112 may have a dome shape.
  • the dome shape may have a radius of between about 40 mm and about 45 mm, for example.
  • the body structure 112 may include a dome light, for example.
  • an inner surface 138 of the body structure 112 facing the object receiving region 118 may have a reflective surface. At least a portion of an inner surface 138 of the body structure 112 facing the object receiving region 118 may also be configured to generate light. The at least a portion of an inner surface 138 of the body structure 112 facing the object receiving region 118 may include a light generating coating and/or a plurality of organic light emitting diodes.
  • the light detection arrangement 102 may further include a plurality of body structure illuminating light sources 140 arranged to illuminate at least a portion of an inner surface 138 of the body structure 112.
  • the plurality of body structure illuminating light sources 140 may be positioned within the body structure 112.
  • the plurality of body structure illuminating light sources 140 may include a plurality of lamps.
  • the plurality of lamps may include a light emitting diode array; in alternative embodiments, the plurality of lamps may include any other suitable light generating devices such as e.g. organic light emitting diodes.
  • the plurality of lamps may include a diffuser.
  • the arrangement 128 may include a light arrangement 142.
  • the light arrangement 142 may include the first light source 104 and the second light source 106.
  • the first light source 104 may be arranged to direct light through the first light- transmissive portion 114 to the object receiving region 118 and/or the second light source 106 may be arranged to direct light through the second light-transmissive portion 116 to the object receiving region 118.
  • the first light source 104 and/or the second light source 106 may include co-axial light sources.
  • the first light source 104 and/or the second light source 106 may include a focused light beam or a disperse light source, for example a plurality of lamps.
  • the plurality of lamps may include a light emitting diode array; in alternative embodiments, the plurality of lamps may include any other suitable light generating devices such as e.g. organic light emitting diodes.
  • the arrangement 128 may further include a beam splitter arrangement 148.
  • the beam splitter arrangement 148 may be arranged on the same side of the body structure 112 as the first camera 120 and the second camera 124.
  • the beam splitter arrangement 148 may include the first beam splitter 108 and the second beam splitter 110.
  • the arrangement 128 may include any other suitable device configured to re-direct light such that light may be directed to the object receiving region 118 or to the first camera 120 or to the second camera 124, as desired.
  • the first beam splitter 108 may be arranged relative to the first light source 104 and the first camera 120 such that the first beam splitter 108 may be configured to receive the light provided by the first light source 104 and to re-direct the light to the object receiving region 118 via the first light-transmissive portion 114, thereby generating the second reflected light portion 134.
  • the second reflected light portion 134 may be substantially aligned with the second main optical axis 126 such that the second camera 124 detects more of the second reflected light portion 134 than the first camera 120.
  • the second beam splitter 110 may be arranged relative to the second light source 106 and the second camera 124 such that the second beam splitter 110 may be configured to receive the light provided by the second light source 106 and to re-direct the light to the object receiving region 118 via the second light-transmissive portion 116, thereby generating the first reflected light portion 130.
  • the first reflected light portion 130 may be substantially aligned with the first main optical axis 122 such that the first camera 120 detects more of the first reflected light portion 130 than the second camera 124.
  • the light detection arrangement 102 may further include a light separating structure 154.
  • the light separating structure 154 may be arranged between the first light source 104 and the second light source 106.
  • the light separating structure 154 may include an opaque material, for example black paper, so as to prevent the light from the first light source 104 from interfering with the light from the second light source 106.
  • an object 156 may be arranged in the object receiving region 118.
  • the object 156 may be a die (e.g. a semiconductor die); in alternative embodiments, the object 156 may e.g. be any kind of substrate or carrier, e.g. a wafer including a plurality of dies.
  • the die may include a plurality of bonding structures.
  • the first camera 120 and the second camera 124 may be arranged in a manner such that the first main optical axis 122 and the second main optical axis 126 may be axial symmetric with respect to a symmetry axis 190 being perpendicular to a plane 158 defined by the object receiving region 118.
  • Each of the first camera 120 and the second camera 124 may be arranged at a same camera angle (α) with respect to the symmetry axis 190.
  • each of the first beam splitter 108 and the second beam splitter 110 may be arranged at a same beam splitter angle (β) with respect to the symmetry axis 190.
  • the difference between the beam splitter angle and the camera angle may be about 45°.
  • the first beam splitter 108 and the second beam splitter 110 may be arranged at a combined beam splitter angle of 90° + 2α.
  • FIG. 1B shows a front view of a light detection system 192, the light detection system 192 including the light detection arrangement 102 of FIG. 1A according to an embodiment.
  • the light detection system 192 may include an input 160, an output 162, a controlling or processing device 164 and the light detection arrangement 102 of FIG. 1A.
  • the input 160 may include a keyboard and the output 162 may include a display.
  • the controlling or processing device 164 may serve to control the light detection arrangement 102.
  • the controlling or processing device 164 may include a processor 166 (e.g. a programmable microprocessor or any kind of programmable or hard-wired logic) and a memory 168.
  • the user may provide input via the input 160 to the controlling or processing device 164 so as to control the light detection arrangement 102.
  • Data may be received from the light detection arrangement 102 and then stored in the memory 168.
  • the processor 166 may then process the data stored in the memory 168 so as to achieve the desired image data to be displayed on the output 162 (an illustrative sketch of this data flow follows below).
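Purely to illustrate this input/memory/processor/output data flow, the hypothetical Python sketch below models the roles described above; it is not the patent's software, and the class and method names are invented.

```python
# Hypothetical model of the light detection system 192's control and data flow:
# user input -> control of the arrangement -> captured data stored in memory ->
# processing -> display on the output. Names are illustrative only.
from typing import Any, Dict, List


class LightDetectionSystem:
    def __init__(self, arrangement: Any, display: Any) -> None:
        self.arrangement = arrangement      # drives light sources and cameras (102)
        self.display = display              # output 162, e.g. a monitor
        self.memory: List[Any] = []         # memory 168 holding captured image data

    def handle_user_command(self, command: Dict[str, Any]) -> None:
        # input 160: e.g. {"camera": "first", "exposure_ms": 5}
        self.arrangement.configure(command)              # control the arrangement
        self.memory.append(self.arrangement.capture())   # store received data

    def process_and_show(self) -> None:
        # processor 166: derive the desired image data from the stored frames
        if self.memory:
            self.display.show(self.memory[-1])
```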
  • FIG. 2A shows a front view of a light detection arrangement 102, the light detection arrangement 102 including a body structure 112 with a cutout portion 170 according to an embodiment and FIG. 2B shows a top view of the body structure 112 with the cutout portion 170 of FIG. 2A according to an embodiment
  • the light detection arrangement 102 may include the body structure 112 configured to provide light, the body structure 112 including a first light- transmissive portion 114 and a second light-transmissive portion 116 disposed at a distance from the first light-transmissive portion 114.
  • the distance between the first light-transmissive portion 114 and the second light-transmissive portion 116 may vary according to design and user requirements.
  • the light detection arrangement 102 may further include an object receiving region 118 arranged such that light provided by the body structure 112 may be illuminating at least a portion of the object receiving region 118.
  • the light detection arrangement 102 may further include a first camera 120 having a first main optical axis 122 and a second camera 124 having a second main optical axis 126 and the body structure 112 may be arranged between the first camera 120 and the second camera 124 on its one side and the object receiving region 118 on its other side.
  • the first camera 120 may be arranged such that its first main optical axis 122 may be directed to the object receiving region 118 via the first light-transmissive portion 114 and the second camera 124 may be arranged such that its second main optical axis 126 may be directed to the object receiving region 118 via the second light-transmissive portion 116.
  • the light detection arrangement 102 may further include an arrangement 128 configured to provide light reflected from the object receiving region 118 such that a first reflected light portion 130 may be provided as a first co-axial light portion 132 aligned to the first main optical axis 122, and such that a second reflected light portion 134 may be provided as a second co-axial light portion 136 aligned to the second main optical axis 126.
  • the light detection arrangement 102 as shown in FIGs. 2A and 2B may further include a structure 172 having a first light-transmissive portion 114 and a second light-transmissive portion 116.
  • the body structure 112 may include the cutout portion 170 and the structure 172 may be arranged in the cutout portion 170.
  • the structure 172 having the first light-transmissive portion 114 and the second light-transmissive portion 116 may include a rectangular plate or a plate with any suitable shape matching the shape of the cutout portion 170.
  • the structure 172 may include a diffused white plastic sheet.
  • the light detection arrangement 102 may include a light separating structure 154.
  • the light separating structure 154 may be positioned between the first beam splitter 108 and the second beam splitter 110, or indirectly between the first light source 104 and the second light source 106.
  • the embodiment according to FIG. 2A may have a more compact or integrated design. Such a design is of particular advantage if commercially available co-axial light sources are used.
  • a rectangular window may be cut into the dome light, the window being wider than the combined width of the separate first beam splitter 108 and second beam splitter 110, so that an integrated first beam splitter 108 and second beam splitter 110 may be located lower than the ceiling of the dome light.
  • the structure 172, e.g. the rectangular plate, may be adopted, which may include the first light-transmissive portion 114 and the second light-transmissive portion 116 in the form of two circular openings matching the viewing angles of the first camera 120 and the second camera 124.
  • FIG. 3 shows a perspective view of a prototype of the light detection arrangement 102 of FIG. 2A according to an embodiment.
  • the light detection arrangement 102 may include a body structure 112 configured to provide light.
  • the body structure 112 may include a dome light.
  • the light detection arrangement 102 may further include a structure 172 having the first light-transmissive portion (not shown) and the second light-transmissive portion (not shown).
  • the body structure 112 has a cutout portion 170 and the structure 172 is arranged in the cutout portion 170.
  • the light detection arrangement 102 may further include an arrangement 128 configured to provide light reflected from an object receiving region (not shown).
  • FIG. 4A shows a left image 400 obtained with the prototype of the light detection arrangement 102 of FIG. 3 according to an embodiment
  • FIG. 4B shows a right image 402 obtained with the prototype of the light detection arrangement 102 of FIG. 3 according to an embodiment.
  • the left image 400 may be taken by the left camera or the first camera 120 and the right image 402 may be taken by the right camera or the second camera 124. From the respective left image 400 and right image 402, it may be seen that the object 156, e.g. the die 194, includes a plurality of bonding structures 174. Each of the plurality of bonding structures 174 may include a plurality of bond pads 176 and a plurality of wire bonds 178. Each of the plurality of wire bonds 178 may be configured to connect each of the plurality of bond pads 176 to the object 156, e.g. the die 194. In addition, the contrast of the die 194 and the wire bonds 178 in the left image 400 in FIG. 4A and the right image 402 in FIG. 4B seems comparable.
  • FIG. 5A shows a front view of a light detection arrangement 102, the light detection arrangement 102 including a beam splitter arrangement 148 arranged on the same side of a body structure 112 as an object receiving region 118 according to an embodiment.
  • FIG. 5B shows a side view of the light detection arrangement 102 of FIG. 5A according to an embodiment.
  • the light detection arrangement 102 may include the body structure 112 configured to provide light, the body structure 112 including a first light-transmissive portion 114 and a second light-transmissive portion 116 disposed at a distance from the first light-transmissive portion 114; the object receiving region 118 arranged such that light provided by the body structure 112 may be illuminating at least a portion of the object receiving region 118; a first camera 120 having a first main optical axis 122; a second camera 124 having a second main optical axis 126; wherein the body structure 112 may be arranged between the first camera 120 and the second camera 124 on its one side and the object receiving region 118 on its other side; wherein the first camera 120 may be arranged such that its first main optical axis 122 may be directed to the object receiving region 118 via the first light-transmissive portion 114; wherein the second camera 124 may be arranged such that its second main optical axis 126 may be directed to the object receiving region 118 via the second light-transmissive portion 116; and an arrangement 128 configured to provide light reflected from the object receiving region 118 such that a first reflected light portion 130 may be provided as a first co-axial light portion 132 aligned to the first main optical axis 122, and such that a second reflected light portion 134 may be provided as a second co-axial light portion 136 aligned to the second main optical axis 126.
  • the body structure 112 may have a dome shape.
  • the dome shape may include a radius of between about 40 mm and about 45 mm, for example.
  • the body structure 112 may include a dome light for example.
  • the light detection arrangement 102 may further include a plurality of body structure illuminating light sources 140 arranged to illuminate at least a portion of an inner surface 138 of the body structure 112.
  • the plurality of body structure illuminating light sources 140 may be positioned within the body structure 112.
  • the arrangement 128 may include the beam splitter arrangement 148.
  • the beam splitter arrangement 148 may be arranged on the same side of the body structure 112 as the object receiving region 118.
  • the beam splitter arrangement 148 may be configured to receive the light provided by the inner surface 138 of the body structure 112, and to generate the first reflected light portion 130 and the second reflected light portion 134 from the received light. This may be to compensate for the light which may escape through the first light-transmissive portion 114 and the second light-transmissive portion 116.
  • the beam splitter arrangement 148 may include a single beam splitter 180 configured such that the first reflected light portion 130 may be provided as the first coaxial light portion 132 aligned to the first main optical axis 122 and such that the second reflected light portion 134 may be provided as the second co-axial light portion 136 aligned to the second main optical axis 126.
  • the diffused light provided by the inner surface 138 of the body structure 112 may be reflected onto the object 156 through the beam splitter arrangement 148 so as to generate a co-axial lighting effect. Because of the curvature of the dome light, a requirement for different lighting directions may be met. An advantage of this design may be that the alignment becomes relatively simple and easy.
  • the first camera 120 and the second camera 124 may be arranged along a common camera plane 196.
  • the single beam splitter 180 may be arranged at an approximately 45 degree angle with respect to the common camera plane 196.
  • the single beam splitter 180 may be positioned so as to correspond to the position of the respective first light-transmissive portion 114 and the second light-transmissive portion 116. This may be to compensate for the light escaping through the first light-transmissive portion 114 and the second light-transmissive portion 116.
  • the light detection arrangement 102 may further include a light separating structure 154.
  • the light separating structure 154 may be arranged between the beam splitter arrangement 148 and the plurality of body structure illuminating light sources 140 or the body structure 112.
  • the light separating structure 154 may be arranged substantially perpendicular to a plane 158 defined by the object receiving region 118.
  • the first camera 120 and the second camera 124 may be arranged in a manner such that the first main optical axis 122 and the second main optical axis 126 may be axially symmetric with respect to a symmetry axis 190 being perpendicular to a plane 158 defined by the object receiving region 118.
  • Each of the first camera 120 and the second camera 124 may be arranged at a same camera angle (α) with respect to the symmetry axis 190.
  • FIG. 6 shows a perspective view of a prototype of a light detection arrangement 102 of FIG. 5 A according to an embodiment.
  • the light detection arrangement 102 may include a body structure 112 configured to provide light.
  • the body structure 112 may include a dome light.
  • the light detection arrangement 102 may further include a first camera 120 and a second camera 124 disposed above the body structure 112.
  • the arrangement 128 may not be shown in FIG. 6 as the arrangement may be arranged within the body structure 112.
  • FIG. 7A shows a left image 700 obtained with the prototype of the light detection arrangement 102 of FIG. 5A according to an embodiment.
  • FIG. 7B shows a right image 702 obtained with the prototype of the light detection arrangement 102 of FIG. 5A according to an embodiment.
  • the respective left image 700 and right image 702 as shown in FIGs. 7A and 7B may be similar to the respective left image 400 and right image 402 as shown in FIGs. 4A and 4B. Similar to FIGs. 4A and 4B, the left image 700 may be taken by the left camera or the first camera 120 and the right image 702 may be taken by the right camera or the second camera 124 in FIGs. 7A and 7B. From the respective left image and right image in FIGs. 7A and 7B, it may be seen that the object 156, e.g. the die 194, includes a plurality of bonding structures 174. Each of the plurality of bonding structures 174 may include a plurality of bond pads 176 and a plurality of wire bonds 178.
  • Each of the plurality of wire bonds 178 may be configured to connect each of the plurality of bond pads 176 to the object 156, e.g. the die 194.
  • the contrast of the die 194 and the wire bonds 178 in the left image 700 in FIG. 7A and the right image 702 in FIG. 7B seems comparable.
  • FIG. 8A shows a front view of a light detection arrangement 102, the light detection arrangement 102 including a bar shaped light source 182 according to an embodiment.
  • FIG. 8B shows a side view of the light detection arrangement 102 of FIG. 8A according to an embodiment.
  • the light detection arrangement 102 may include a body structure 112 configured to provide light, the body structure 112 including a first light-transmissive portion 114 and a second light-transmissive portion 116 disposed at a distance from the first light-transmissive portion 114; an object receiving region 118 arranged such that light provided by the body structure 112 may be illuminating at least a portion of the object receiving region 118; a first camera 120 having a first main optical axis 122; a second camera 124 having a second main optical axis 126; wherein the body structure 112 may be arranged between the first camera 120 and the second camera 124 on its one side and the object receiving region 118 on its other side; wherein the first camera 120 may be arranged such that its first main optical axis 122 may be directed to the object receiving region 118 via the first light-transmissive portion 114; wherein the second camera 124 may be arranged such that its second main optical axis 126 may be directed to the object receiving region 118 via the second light-transmissive portion 116; and an arrangement 128 configured to provide light reflected from the object receiving region 118 such that a first reflected light portion 130 may be provided as a first co-axial light portion 132 aligned to the first main optical axis 122, and such that a second reflected light portion 134 may be provided as a second co-axial light portion 136 aligned to the second main optical axis 126.
  • the body structure 112 may have a dome shape with a flat portion at the top.
  • the body structure 112 may include a dome light for example.
  • the dome light may include a housing which may have a flat top for the purpose of easy positioning of a beam splitter arrangement 148 on top of the dome light.
  • the light detection arrangement 102 may further include a plurality of body structure illuminating light sources 140 arranged to illuminate at least a portion of an inner surface 138 of the body structure 112.
  • the plurality of body structure illuminating light sources 140 may be positioned within the body structure 112.
  • the arrangement 128 may include a light arrangement 142.
  • the light arrangement 142 may include one or more diffused light sources positioned above or outside of the body structure 112.
  • Each of the diffused light sources may include the bar shaped light source 182, which may have a length longer than the width of the first light-transmissive portion 114 and the second light-transmissive portion 116 so that the required directions of light for the co-axial lighting effect may be met.
  • the arrangement 128 may include a beam splitter arrangement 148.
  • the beam splitter arrangement 148 may be arranged on the same side of the body structure 112 as the first camera 120 and the second camera 124.
  • the beam splitter arrangement 148 may include a single beam splitter 180 configured such that the first reflected light portion 130 may be provided as the first co-axial light portion 132 aligned to the first main optical axis 122 and such that the second reflected light portion 134 may be provided as the second co-axial light portion 136 aligned to the second main optical axis 126.
  • the first camera 120 and the second camera 124 may be arranged along a common camera plane 196.
  • the single beam splitter 180 may be arranged at a substantially 45 degree angle with respect to the common camera plane 196.
  • FIG. 9A shows a front view of a light detection arrangement 102, the light detection arrangement 102 including an arc shaped light source 184 according to an embodiment
  • FIG. 9B shows a top view of the arc shaped light source of FIG. 9A according to an embodiment.
  • the light detection arrangement 102 as shown in FIGs. 9A and 9B may be similar to the light detection arrangement 102 as shown in FIGs. 8A and 8B, with the difference that the light arrangement 142 may include the arc shaped light source 184 instead of the bar shaped light source 182.
  • One effect of using the arc shaped light source 184 may be to simplify the alignment effort when compared to the bar shaped light source 182.
  • FIG. 10A shows a front view of a light detection arrangement 102, the light detection arrangement 102 including a blue color dichroic mirror 186 and a red color dichroic mirror 188 according to an embodiment.
  • FIG. 10B shows a top view of the light detection arrangement 102 of FIG. 10A according to an embodiment.
  • the light detection arrangement 102 may include a body structure 112 configured to provide light, the body structure 112 including an opening 198; an object receiving region 118 arranged such that light provided by the body structure 112 may be illuminating at least a portion of the object receiving region 118; a first camera 120 having a first main optical axis 122; a second camera 124 having a second main optical axis 126; wherein the body structure 112 may be arranged between the first camera 120 and the second camera 124 on its one side and the object receiving region 118 on its other side; wherein the first camera 120 may be arranged such that its first main optical axis 122 may be directed to the object receiving region 118 via the opening 198; wherein the second camera 124 may be arranged such that its second main optical axis 126 may be directed to the object receiving region 118 via the opening 198; and an arrangement 128 configured to provide light reflected from the object receiving region 118 such that a first reflected light portion 130 may be provided as a first co-axial light portion 132 aligned to the first main optical axis 122, and such that a second reflected light portion 134 may be provided as a second co-axial light portion 136 aligned to the second main optical axis 126.
  • the body structure 112 may have a dome shape.
  • the body structure 112 may include a dome light.
  • the dome light may include a white LED dome light with all red (R), green (G) and blue (B) spectrums.
  • the light detection arrangement 102 may further include a plurality of body structure illuminating light sources 140 arranged to illuminate at least a portion of an inner surface 138 of the body structure 112.
  • the plurality of body structure illuminating light sources 140 may be positioned within the body structure 112.
  • the arrangement 128 may include a first dichroic mirror 186 of a first wavelength arranged in a light path along the first main optical axis 122 and a second dichroic mirror 188 of a second wavelength arranged in a light path along the second main optical axis 126.
  • the first wavelength may be different from the second wavelength.
  • the first wavelength may include a range between about 400 nm to about 500 nm and the second wavelength may include a range between about 500 nm to about 650 nm.
  • the first dichroic mirror 186 may be arranged over a portion of the opening 198 or on the one side of the body structure 112 overlapping that portion of the opening 198, and the second dichroic mirror 188 may be arranged over another portion of the opening 198 or on the one side of the body structure 112 overlapping that other portion of the opening 198.
  • the first dichroic mirror 186 may be arranged so as to be in contact with the second dichroic mirror 188 at their respective edges.
  • the first dichroic mirror 186 and the second dichroic mirror 188 may also be arranged so as to substantially overlap the opening 198.
  • the first dichroic mirror 186 may be a blue color dichroic mirror and the second dichroic mirror 188 may be a red color dichroic mirror.
  • the second dichroic mirror 188 may be a blue color dichroic mirror and the first dichroic mirror 186 may be a red color dichroic mirror.
  • the red color dichroic mirror may reflect red light and let other wavelengths pass and the blue color dichroic mirror may reflect blue light and let other wavelengths pass.
  • the diffused red dome light reflected from the red color dichroic mirror illuminates the die surface and the reflected light will pass through the blue color dichroic mirror before reaching the first camera 120.
  • the diffused blue dome light reflected from the blue color dichroic mirror illuminates the die surface and the reflected light will pass through the red color dichroic mirror before reaching the second camera 124.
  • a dome light 112 with mixed blue and red LEDs may be built to control the blue and red illumination intensity separately.
  • the first camera 120 and the second camera 124 may be arranged in a manner such that the first main optical axis 122 and the second main optical axis 126 may be axially symmetric with respect to a symmetry axis 190 being perpendicular to a plane 158 defined by the object receiving region 118.
  • FIG. 11 shows a flow-chart 1100 of a method for detecting light in a light detection arrangement 102 according to an embodiment.
  • the light detection arrangement 102 may include a body structure 112 configured to provide light, the body structure 112 including a first light-transmissive portion 114 and a second light-transmissive portion 116 disposed at a distance from the first light-transmissive portion 114.
  • the light detection arrangement 102 may further include an object receiving region 118 arranged such that light provided by the body structure 112 may be illuminating at least a portion of the object receiving region 118.
  • the light detection arrangement 102 may further include a first camera 120 having a first main optical axis 122 and a second camera 124 having a second main optical axis 126.
  • the body structure 112 may be arranged between the first camera 120 and the second camera 124 on its one side and the object receiving region 118 on its other side.
  • the first camera 120 may be arranged such that its first main optical axis 122 may be directed to the object receiving region 118 via the first light-transmissive portion 114 and the second camera 124 may be arranged such that its second main optical axis 126 may be directed to the object receiving region 118 via the second light-transmissive portion 116.
  • the method may include providing light reflected from the object receiving region 118 such that a first reflected light portion 130 may be provided as a first co-axial light portion 132 aligned to the first main optical axis 122, and such that a second reflected light portion 134 may be provided as a second co-axial light portion 136 aligned to the second main optical axis 126.
  • FIG. 12 shows a front view of a multi-camera wire bond inspection head 1200 in accordance with an embodiment, for example a wire bond inspection vision head 1200.
  • FIG. 13 shows a front view of a dome light system 1300 for stereo vision in accordance with an embodiment, for example a dome light for wire bond stereo inspection.
  • FIG. 13 illustrates a general stereo vision system 1300 which, without losing generality, may include two cameras, a first camera 1302 and a second camera 1304, and a light for illumination.
  • a dome light may be provided by light from a first light source 1310 and a second light source 1312, which may illuminate a dome 1314 so that the dome 1314 may reflect light to the sample 1316.
  • the first camera 1302 may be directed to a sample 1316 through a first imaging hole 1306.
  • the second camera 1304 may be directed to the sample 1316 through a second imaging hole 1308.
  • the two cameras may be positioned at an angle (for example less than 15 degrees) with respect to the vertical axis, and may capture images of the same field of view, for example including the sample 1316.
  • the height of a feature on the object may be calculated using its positions on the images of the two cameras 1302 and 1304. Due to the fact that the camera tilting angle may be small, one or two or more cameras may also be used to calculate both 2D and 3D features.
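  • As an illustration only (not part of the embodiments above), the height calculation from two camera views can be approximated with a simplified, rectified pinhole stereo model in which depth follows from the disparity between the two images; the function and parameter names below (focal length in pixels, baseline, disparities) are assumed for this sketch, and a real system would use the full calibration described later.

```python
def feature_height(disparity_px, reference_disparity_px, focal_px, baseline_mm):
    """Approximate height of a feature above a reference plane from rectified
    stereo disparities (simplified pinhole model, illustrative only)."""
    depth_feature = focal_px * baseline_mm / disparity_px            # Z = f*B/d
    depth_reference = focal_px * baseline_mm / reference_disparity_px
    # the feature lies closer to the cameras than the reference plane,
    # so its height is the difference of the two depths
    return depth_reference - depth_feature

# example with assumed numbers: f = 2000 px, baseline = 30 mm
print(feature_height(disparity_px=152.0, reference_disparity_px=150.0,
                     focal_px=2000.0, baseline_mm=30.0))
```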
  • FIG. 14A shows a scenario 1400 of two images, a first image 1402 and a second image 1404, with poor contrast between the wire end and the die, in each of the first wire end region 1406 in the first image 1402, the second wire end region 1410 in the first image 1402, the first wire end region 1408 in the second image 1404, and the second wire end region 1412 in the second image 1404.
  • the poor contrast image 1400 of wire and die may be obtained when using a stereo dome light 1300 as shown in FIG. 13.
  • FIG. 14B shows an image 1450 with good illumination for each camera in accordance with an embodiment.
  • the contrast in a first wire end region 1452 and in a second wire end region 1454 may be high.
  • bond height may be measured.
  • some bond height measurement regions may be indicated by 1458, 1460, 1462 and 1464.
  • wire loop height may be measured.
  • some wire loop measurement regions may be indicated by 1466 and 1468.
  • bond width may be measured.
  • some bond width measurement regions may be indicated by 1456 and 1470.
  • the good contrast between the wire end and die in the first region 1452 and in the second region 1454 may be used for a reliable edge detection and quality evaluation of the wire bonding in accordance with various embodiments.
  • the poor contrast in the images 1400 of FIG. 14A may be caused by the missing reflection of light that should come from the opposite hole of the dome light.
  • the beam splitter may be placed inside the dome light, aligning with the line formed by the cameras, forming for example an angle of 45 degrees with respect to the two-camera plane.
  • the diffused lighting may be reflected from the internal wall of the dome light onto the object through the beam splitter to generate the similar effect of a co-axial lighting.
  • FIG. 6 shows the experimental setup of stereo imaging with two cameras and a dome light according to various embodiments
  • FIG. 7A and FIG. 7B show the stereo imaging result in accordance with various embodiments.
  • diffused co-axial light may be applied.
  • in various embodiments, a ring light, for example including a mirror surface, may be applied, for example to provide a bright field ring image.
  • in various embodiments, a diffused dome light may be applied. In various embodiments, a combination of dome light and diffused coaxial light may be applied.
  • a co-axial light may be integrated to a dome light through a beam splitter.
  • a relatively large-area light, for example a bar light, may be applied.
  • a beam splitter may be placed at an angle of, for example, approximately 45 degrees with respect to the plane formed by the two camera optical axes, as explained above with reference to FIG. 8A and FIG. 8B.
  • two co-axial lights may be integrated to a dome light through separated beam splitters.
  • Each coaxial light may be aligned with the camera tilt angle to get the best 2D illumination contrast, as explained above with reference to FIG. 1A.
  • two dichroic mirrors may be used to reflect certain wavelengths of light to illuminate the 2D surface.
  • a white LED dome light with all R (red), G (green) and B (blue) spectrums may be used together with a red dichroic mirror (which may reflect red light and may let other wavelengths pass) and a blue dichroic mirror (which may reflect blue light and may let other wavelengths pass).
  • Diffused red dome light reflected from the red dichroic mirror may illuminate the die surface and the reflected light may pass through the blue dichroic mirror before reaching the left camera.
  • diffused blue dome light reflected from the blue dichroic mirror may illuminate the die surface and the reflected light may pass through the red dichroic mirror before reaching the right camera.
  • a dome light with mixed blue and red LEDs may be applied to control the blue and red illumination intensity separately.
  • dome illumination may be used which may be very effective in illuminating specular objects such as leadframe surface and wires on a substrate as the diffused light may be reflected back to the camera from all directions. Dome illumination may be optimal for visualization of both flat surface structures such as die surface features as well as 3D wire loops and bonds.
  • dual coaxial illumination may be used in conjunction with the dome light to increase the contrast of the 2D feature detection, or in an alternative way, to distinguish probe marks, which may cause false contour of wire bond width or displacement.
  • the coaxial lights may be aligned with optical axis of each camera.
  • the light from left coaxial light which may be aligned with the optical axis of the left camera, may illuminate die surface with reflected light being captured by the right camera.
  • the reflected light from right coaxial illumination may be captured by the left camera.
  • various embodiments may be applied to semiconductor devices such as integrated circuit chips.
  • wire bonding may involve bonding a wire to connect pads residing on a silicon die/silicon chip/semiconductor chip to a lead in a lead frame.
  • the chip and lead frame may be packaged in ceramic or plastic to form an integrated circuit device.
  • the inspection of wire bond quality may be done by a human operator using a microscope. This manual method may be time-consuming and costly.
  • wires on a leadframe may be highly specular. Illumination may be desired to be designed to minimize the specularity on wire and lead surfaces to ensure accurate 3D measurement. At the same time, good contrasts on other surfaces such as die and bond pads may allow 2D measurements.
  • Special image- processing tools may be employed to derive features from wire and lead surfaces for 3D reconstruction.
  • an optical system may be desired to inspect wires with thicknesses from 18 μm (micrometers) to 500 μm and loop heights from 20 μm to 3000 μm or more. It may be desired that the system is able to measure gold wires sagging by a few microns (micrometers) while maintaining its focus over a dynamic range of a few thousand microns.
  • an automated 3D inspection system may be provided that may be operated with a manual operator's manual inspection.
  • a new optical 3D wire bond inspection and measurement system may be provided.
  • a new illumination system combining dome illumination for 3D reconstruction and coaxial illumination for 2D measurement may be provided.
  • an automated inspection system for single die wire bond, stack die wire bond, power device wire bond, gold wire bond and the advanced ribbon bonding may be provided.
  • an automated wire bond inspection system for inspecting 3D wire bond features such as wire loop height, die tilt and wire bond height, as well as 2D features such as die surface scratches, bond positions, bond widths, and long tails may be provided.
  • an automated wire bond inspection system that may measure the ever-smaller circuit geometry in an accurate and rapid manner may be provided.
  • an automated wire bond inspection system that may provide higher throughput than manual inspection may be provided.
  • an automated wire bond inspection system that may improve inspection quality and consistency may be provided.
  • an automated wire bond inspection system that may have inspection recipes therein and may create, copy and edit such recipes to customize the system to the user's inspection requirements may be provided.
  • an automated wire bond inspection system that may use digital image analysis to perform wire bond device inspection may be provided.
  • an automated wire bond inspection system that may be trained by inspecting good wire bond devices so that once trained the system may detect variations from what it has learned may be provided.
  • an automated wire bond device inspection system that may include a "good device" training step, an inspection recipe creation step, a defect inspection step, a defect review step, and a report issuing or exporting step may be provided.
  • an automated wire bond inspection system that may provide NIST (National Institute of Standards and Technology) traceable 2D and 3D calibration steps for all sensors to ensure high precision measurement may be provided.
  • NIST National Institute of Standards and Technology
  • an automated wire bond inspection may be provided with measurement results and/or measurement data.
  • an automated wire bond inspection system that may provide for on-the-fly wire bond inspection where a strobe illumination may be used to capture still views of the dynamically moving wire bond device may be provided.
  • an automated method of inspecting a wire bond device in any form including single die wire bond, stacked die wire bond, power device wire bond, gold wire bond and the advanced ribbon bonding etc. may be provided.
  • the method may include (1) calibrating the sensors used for camera intrinsic and extrinsic parameters, (2) training a model as to parameters of a good wire bond via optical viewing of multiple known good wire bond devices, (3) inspecting unknown quality wire bond devices using the model, and (4) reporting the measurement results.
  • FIG. 15 shows a flow diagram 1500 illustrating a method for examining a bonding structure of a substrate in accordance with an embodiment.
  • a substrate including a bonding structure to be examined may be provided.
  • one or more stereoscopic images including one or a plurality of predefined regions of interests of portions of the bonding structure to be examined may be determined, wherein the predefined regions of interests may correspond to reference regions of interests in a stored predetermined reference model of a reference bonding structure of a reference substrate.
  • one or more two-dimensional feature parameters and one or more three-dimensional feature parameters of the bonding structure to be examined may be determined from one or more of a plurality of the predefined regions of interests.
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined concurrently.
  • concurrently determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of determining both the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters.
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined in different threads, which may be scheduled to perform calculations alternately; for example, in various embodiments, only one thread may be active at the same time.
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined simultaneously.
  • simultaneously determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of determining at the same time both the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters.
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined in different threads, which may be performed on different processors and/or different cores; for example, in various embodiments, more than one thread may be active at the same time.
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined sequentially.
  • sequentially determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of first determining one or more two-dimensional feature parameters and after that determining one or more three-dimensional feature parameters.
  • sequentially determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of first determining one or more three-dimensional feature parameters and after that determining one or more two-dimensional feature parameters.
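  • A minimal sketch of the concurrent versus sequential determination described in the preceding items, assuming hypothetical helper functions compute_2d_features and compute_3d_features that operate on the stereoscopic image data; the names and the returned placeholder values are not taken from the embodiments.

```python
from concurrent.futures import ThreadPoolExecutor

def compute_2d_features(images):
    # placeholder 2D feature extraction (e.g. bond width, tail length)
    return {"bond_width_um": 32.0, "tail_length_um": 55.0}

def compute_3d_features(images):
    # placeholder 3D feature extraction (e.g. wire loop height, die tilt)
    return {"loop_height_um": 180.0, "die_tilt_deg": 0.4}

def determine_features(images, mode="concurrent"):
    if mode == "sequential":
        # one determination strictly after the other
        return compute_2d_features(images), compute_3d_features(images)
    # concurrent/simultaneous: both determinations run in separate threads,
    # which may be scheduled alternately or on different cores
    with ThreadPoolExecutor(max_workers=2) as pool:
        f2d = pool.submit(compute_2d_features, images)
        f3d = pool.submit(compute_3d_features, images)
        return f2d.result(), f3d.result()
```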
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined from the same one or more stereoscopic images.
  • one or more additional single images may be determined, and one or more two-dimensional feature parameters may be determined from the one or more additional single images.
  • all methods described above and below may be performed independently from a choice of a camera system.
  • each of the methods may be performed using a camera system chosen from the camera systems as described above.
  • each of the methods may be performed using any other camera system.
  • a substrate may be provided selected from a group of substrates consisting of: a semiconductor substrate; a polymer substrate; and an insulating substrate.
  • the bonding structure may be a wire bond structure.
  • the bonding structure may be or may include a wire bond structure selected from a group consisting of: a single die wire bond; a stacked die wire bond; a power device wire bond; a gold wire bond; an advanced ribbon bonding; and a microelectromechanical systems device.
  • the one or more stereoscopic images may be taken using a plurality of cameras, wherein the plurality of cameras may take one or more two- dimensional images from the substrate from respectively different perspectives. In various embodiments, the one or more stereoscopic images may be formed from the one or more two-dimensional images taken from at least two cameras of the plurality of cameras.
  • the one or more predefined regions of interests may be matched to the regions of interest in the reference model of the reference bonding structure of the reference substrate using alignment markings on the substrate.
  • the reference structure used for setting up the reference model and the bonding structure to be examined may not be at exactly the same position and orientation. Therefore, it may be desired to match the regions of interest in order to compensate for differences in position and orientation, as will be explained below.
  • the one or more two-dimensional feature parameters may include features selected from a group consisting of: a global reference point; a local reference point; a wire bond placement; a bonding width; a tail length; a die location; a die orientation; a bond pad location; a wire placement; a wire bond width; a die surface defect; a die placement; and a long tail.
  • the one or more three-dimensional feature parameters may include features selected from a group consisting of: a global reference height; a local reference height; a default die tilt; a die tilt; a wire loop height; bond heights for each wire; a component on the unknown quality leadframe; and a die component on the unknown quality leadframe.
  • the one or more images may include one or more images selected from a group, e.g. consisting of: grey scale image; multi-color image; and temperature profile image.
  • the reference model of the reference bonding structure of the reference substrate may be determined, and the reference model may be stored. In various embodiments, determining the reference bonding structure may be referred to as learning a product or training a model.
  • FIG. 16 shows a bonding structure inspection device 1600 in accordance with an embodiment.
  • the bonding structure inspection device 1600 may include a substrate receiving region 1602 configured to receive a substrate, the substrate may include the bonding structure to be examined; a stereoscopic image determining device 1604 configured to determine one or more stereoscopic images including one or a plurality of predefined regions of interests of portions of the bonding structure to be examined, wherein the predefined regions of interests may correspond to reference regions of interests in a stored predetermined reference model of a reference bonding structure of a reference substrate; a memory 1606 configured to store the predetermined reference model of a reference bonding structure of a reference substrate; a feature determiner 1608 configured to determine one or more two-dimensional feature parameters and one or more three-dimensional feature parameters of the bonding structure to be examined from one or more of a plurality of the predefined regions of interests; and a determiner 1610 configured to determine as to whether the determined feature parameters fulfill at least one predefined quality criterion with respect to the reference model.
  • the substrate receiving region 1602, the stereoscopic image determining device 1604, the memory 1606 , the feature determiner 1608 and the determiner 1610 may be coupled with each other, e.g. via an electrical connection 1612 such as e.g. a cable or a computer bus or via any other suitable electrical connection to exchange electrical signals.
  • the feature determiner 1608 may be configured to determine the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters concurrently.
  • concurrently determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of determining both the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters.
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined in different threads, which may be scheduled to perform calculations alternately; for example, in various embodiments, only one thread may be active at the same time.
  • the feature determiner 1608 may be configured to determine the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters simultaneously.
  • simultaneously determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of determining at the same time both the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters.
  • the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be determined in different threads, which may be performed on different processors and/or different cores; for example, in various embodiments, more than one thread may be active at the same time.
  • the feature determiner 1608 may be configured to determine the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters sequentially.
  • sequentially determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of first determining one or more two-dimensional feature parameters and after that determining one or more three-dimensional feature parameters.
  • sequentially determining the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters may be understood in a way of first determining one or more three-dimensional feature parameters and after that determining one or more two-dimensional feature parameters.
  • the feature determiner 1608 may be configured to determine the one or more two-dimensional feature parameters and the one or more three-dimensional feature parameters from the same one or more stereoscopic images.
  • an additional single image determiner may be provided and may be configured to determine one or more additional single images (for example non-stereoscopic images), and the feature determiner 1608 may be further configured to determine one or more two-dimensional feature parameters from the one or more additional single images.
  • each of the inspection devices described above and below may be applied independently from a choice of a camera system.
  • each of the inspection devices may be provided with a camera system chosen from the camera systems as described above.
  • each of the inspection devices may be applied using any other camera system.
  • the substrate may be selected from a group of substrates consisting of: a semiconductor substrate; a polymer substrate; and an insulating substrate.
  • the bonding structure may be a wire bond structure.
  • the bonding structure may include or may be a wire bond structure selected from a group consisting of: a single die wire bond; a stacked die wire bond; a power device wire bond; a gold wire bond; an advanced ribbon bonding; and a microelectromechanical systems device.
  • FIG. 17 shows a bonding structure inspection device 1700 in accordance with an embodiment. The bonding structure inspection device 1700, similar to the bonding structure inspection device 1600 of FIG. 16, may include a substrate receiving region 1602, a stereoscopic image determining device 1702, a memory 1606, a feature determiner 1608 and a determiner 1610.
  • the bonding structure inspection device 1700 may further include a regions of interest matcher 1710, as will be explained in more detail below.
  • the bonding structure inspection device 1700 may further include a model determiner 1712, as will be explained in more detail below.
  • the substrate receiving region 1602, the stereoscopic image determining device 1702, the memory 1606 , the feature determiner 1608, the determiner 1610, the regions of interest matcher 1710 and the model determiner 1712 may be coupled with each other, e.g. via a first electrical connection 1714 such as e.g. a cable or a computer bus or via any other suitable electrical connection to exchange electrical signals.
  • the stereoscopic image determining device 1702 may include a plurality of cameras (shown as block 1704 in FIG. 17), wherein the plurality of cameras may be arranged to take one or more two-dimensional images from the substrate from respectively different perspectives.
  • the stereoscopic image determining device 1702 may further include a stereoscopic image determiner 1706 configured to determine the one or more stereoscopic images from the one or more two- dimensional images taken from at least two cameras of the plurality of cameras 1704.
  • the plurality of cameras 1704 and the stereoscopic image determiner 1706 may be coupled with each other, e.g. via a second electrical connection 1708 such as e.g. a cable or a computer bus or via any other suitable electrical connection to exchange electrical signals.
  • the bonding structure inspection device 1700 may further include a regions of interest matcher 1710 configured to match the one or more predefined regions of interests to the regions of interest in the reference model of the reference bonding structure of the reference substrate using alignment markings on the substrate.
  • the one or more two-dimensional feature parameters may include features selected from a group, e.g. consisting of: a global reference point; a local reference point; a wire bond placement; a bonding width; a tail length; a die location; a die orientation; a bond pad location; a wire placement; a wire bond width; a die surface defect; a die placement; and a long tail.
  • the one or more three-dimensional feature parameters may include features selected from a group, e.g. consisting of: a global reference height; a local reference height; a default die tilt; a die tilt; a wire loop height; bond heights for each wire; a component on the unknown quality leadframe; and a die component on the unknown quality leadframe.
  • the plurality of cameras 1704 may be selected from a group of cameras, e.g. consisting of: a grey scale image camera; a multi-color image camera; and a temperature profile image camera.
  • the bonding structure inspection device 1700 may further include a model determiner 1712 configured to determine the reference model of the reference bonding structure of the reference substrate.
  • the method may include: calibrating multiple cameras for 2D and 3D camera parameters; training a model as to parameters of a good wire bond device via optical viewing of one or multiple known good wire bond devices, wherein the training may include taking a gray scale measurement of each pixel in a grid of pixels on each of the multiple known good wire bond devices, calculating the 2D features such as global reference points, local reference points, wire bond placements, bonding widths, tail lengths, die locations and orientations, and bond pad locations, calculating the 3D features such as global reference heights, local reference heights, and default die tilts, and then from all these 2D and 3D measurement results determining an upper and lower limit for each measurement and saving them as 2D and 3D recipes; and inspecting unknown quality wire bond devices using the model, where such inspecting may involve calculating the 2D measurements of wire placements, wire bond widths, die surface defects, die placements, and long tails, and the 3D measurements of die tilt, wire loop heights, bond heights for each wire, and die and/or other components on the unknown quality leadframe, and comparing the measured values against the upper and lower limits stored in the 2D and 3D recipes.
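  • A minimal sketch of the training step that derives an upper and lower limit for each 2D and 3D measurement from several known good devices; using mean ± k standard deviations as the limits is an assumed tolerance rule and is not prescribed by the description above.

```python
import statistics

def train_recipe(good_device_measurements, k=3.0):
    """good_device_measurements: list of dicts, one per known good device,
    mapping a feature name (e.g. 'wire_loop_height_um') to its measured value.
    Returns a recipe mapping each feature to (lower_limit, upper_limit)."""
    recipe = {}
    for name in good_device_measurements[0]:
        values = [m[name] for m in good_device_measurements]
        mean = statistics.fmean(values)
        spread = statistics.pstdev(values)
        recipe[name] = (mean - k * spread, mean + k * spread)
    return recipe

def inspect(measurements, recipe):
    """Compare an unknown quality device's measurements against the recipe."""
    return {name: lo <= measurements[name] <= hi
            for name, (lo, hi) in recipe.items()}
```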
  • the 3D calibration may include imaging one or multiple static images of one or more calibration targets and calculating localized camera calibration parameters as well as localized epipolar geometries.
  • the calibration target may be a two level or multiple level target where on each level, target points are of the same z-height, while on different levels, z-heights of the targets may be different.
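  • One textbook way to realize the mapping of 3D world coordinates into 2D image coordinates from such a two level (non-coplanar) target is a direct linear transform that estimates a 3x4 projection matrix per camera from at least six point correspondences; the sketch below is an assumed illustration of that idea, not the calibration procedure of the embodiments.

```python
import numpy as np

def estimate_projection_matrix(world_pts, image_pts):
    """Direct linear transform: estimate a 3x4 projection matrix from
    3D calibration target points (non-coplanar, e.g. from a two level target)
    and their 2D image locations. Needs at least 6 correspondences."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    # the solution is the right singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Map a 3D world point into 2D image coordinates with the estimated matrix."""
    x = P @ np.append(np.asarray(point_3d, dtype=float), 1.0)
    return x[:2] / x[2]
```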
  • the training model step may further include taking the gray scale measurement of each pixel on a 0 to 255 gray scale.
  • the automated inspection method may further include training a system as to inspection parameters.
  • the automated inspection method may further include creating an inspection 2D and 3D recipe that may be used during inspecting to perform at least one of the following: defining how substrates are selected from storage receptacles, defining how wire bond devices on a substrate are selected for defect inspection, and defining how an inspection map is imported or exported, if desired.
  • the automated inspection method may further include creating group ROIs (Region of Interest) whereby a set of wire ROIs may be categorized under one group to share the same inspection methods and parameters.
  • the inspection system may allow users to define one or more group ROIs.
  • automatic ROIs may be defined using localized epipolar constraints or other constraints such as assuming a default loop height.
  • the automated inspection method may further include determining die placement using three or more ROIs. By selecting three rectangular or other shape areas on the die surface with rich textures, the centres of these three ROIs may be found by matching the unknown die against a good die. The offset in x and y directions and the rotation angle relative to the good reference die or to a global reference point may be calculated from the three matched points.
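  • A possible sketch of the die placement computation from the three matched ROI centres: an SVD-based least-squares rigid fit yields the x/y offset and the rotation angle relative to the good reference die; the use of this particular fitting method is an assumption.

```python
import numpy as np

def die_placement(ref_centers, meas_centers):
    """Estimate die offset (x, y) and rotation angle from three (or more)
    matched ROI centres on a known good die and on the die under inspection."""
    P = np.asarray(ref_centers, dtype=float)
    Q = np.asarray(meas_centers, dtype=float)
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = q_mean - R @ p_mean
    angle_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return t[0], t[1], angle_deg      # offsets in x and y, rotation angle
```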
  • the automated inspection method may further include determining bonding widths of large wires using subpixel edge detection. Edges from both sides of a wire may be added together to best-fit a centre line, which may be used as a reference line. Edges along both sides of the wire bonding area may be filtered to remove noisy edges, and from each side the edge point with the biggest distance to the centre line may be considered for the bond width.
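  • A sketch of the bond width determination from subpixel edge points on both sides of the wire, reading the item above as: fit a centre line to all edge points, then add the largest perpendicular distance found on each side; that summation is one possible interpretation of the description.

```python
import numpy as np

def bond_width(left_edges, right_edges):
    """Estimate bond width from subpixel edge points (x, y) detected on the
    two sides of a wire bond. Noise filtering of the edges is omitted here."""
    pts = np.asarray(list(left_edges) + list(right_edges), dtype=float)
    mean = pts.mean(axis=0)
    # best-fit centre line through the mean point along the principal direction
    _, _, vt = np.linalg.svd(pts - mean)
    direction = vt[0]
    normal = np.array([-direction[1], direction[0]])
    def dist(p):
        return abs((np.asarray(p, dtype=float) - mean) @ normal)
    # widest excursion on each side, measured perpendicular to the centre line
    return max(dist(p) for p in left_edges) + max(dist(p) for p in right_edges)
```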
  • the automated inspection method may further include determining bonding widths with occlusion elimination.
  • bond width ROIs may be defined in a second view or more views.
  • image rectification may be performed for all views to bring all cameras to the same parallel optical direction.
  • view fusion may be done to merge multiple views into one to ' determine bond width in a similar way as described above from the merged view.
  • the automated inspection method may further include determining bonding widths using a back-projection method. In various embodiments, three or more ROIs may be defined in a second view to determine the bond pad surface by using a best-fit planar surface, such that the 3D coordinates of the bonding edges from the second view are known and thus may be transferred back to the first view by using the projective geometry determined in camera calibration. After that, the bond width may be determined in the first view.
  • the automated inspection method may further include determining die tilt in the 3D world coordinate system using three, four or more ROIs defined in each view to cover the four corners of a die.
  • a best-fit plane may be determined from the heights measured from the four matched ROIs.
  • die tilt angles alpha, beta and gamma in reference to the x, y and z axes of the world coordinate system may be determined and stored as die tilt angles.
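  • A brief sketch of the die tilt determination: fit a plane to the (x, y, height) values measured at the corner ROIs and convert its slopes into tilt angles; the exact angle convention used below is an assumption.

```python
import numpy as np

def die_tilt(corner_points):
    """corner_points: iterable of (x, y, z) heights measured at the die corner
    ROIs. Fits z = a*x + b*y + c and returns tilt angles about x and y."""
    pts = np.asarray(corner_points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    tilt_about_y = np.degrees(np.arctan(a))   # slope of the plane along x
    tilt_about_x = np.degrees(np.arctan(b))   # slope of the plane along y
    return tilt_about_x, tilt_about_y, c
```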
  • the automated inspection method may further include determining die tilt using four die tilt ROIs defined inside the die surface and four more ROIs on the leadframe surface or other reference surfaces.
  • die tilt may be formulated as four sets of differences between each die ROI height and its corresponding nearby leadframe height.
  • the automated inspection method may further include determining wire loop heights by using a subpixel edge detection algorithm to determine the transitions across a wire accurately.
  • the wire centroid may be traced with subpixel accuracy in each view.
  • one or more features, e.g. subpixel wire centroids, may be used for stereo matching.
  • the automated inspection method may further include using local camera calibration parameters and local epipolar geometry to define the search line in a second view or other views. In various embodiments, two nearest centroids to this epipolar line in the second view may be determined.
  • linear interpolation or other interpolation means may be used to get the interpolated centroid on the epipolar line. In various embodiments, this may be the matching point to the selected centroid in the first view. In various embodiments, after establishing the stereo correspondence, the 3D coordinate of a wire centroid point may be calculated through a process called triangulation. A least squares solution or other means may be used to triangulate for 3D reconstruction.
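  • A compact sketch of the matching and triangulation steps just described: interpolate between the two centroids nearest to the epipolar line in the second view, then reconstruct the 3D wire centroid by linear least-squares triangulation from the two projection matrices; variable names and the least-squares formulation are illustrative assumptions.

```python
import numpy as np

def match_on_epipolar_line(line, centroids):
    """line = (a, b, c) with a*x + b*y + c = 0; centroids: (x, y) points in the
    second view. Interpolates the two nearest centroids onto the line."""
    a, b, c = line
    pts = np.asarray(centroids, dtype=float)
    d = (pts @ np.array([a, b]) + c) / np.hypot(a, b)   # signed distances
    i, j = np.argsort(np.abs(d))[:2]                    # two nearest centroids
    t = d[i] / (d[i] - d[j])                            # linear interpolation factor
    return pts[i] + t * (pts[j] - pts[i])

def triangulate(P1, P2, x1, x2):
    """Least-squares triangulation of a 3D point from matched image points
    x1, x2 and the 3x4 projection matrices P1, P2 of the two cameras."""
    A = np.stack([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```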
  • the automated inspection method may further include determining the highest or the lowest point in a wire loop using a subpixel peak or valley interpolation scheme.
  • the automated inspection method may further include employing double resolution or any kind of superresolution to increase the resolution around the wire ROI.
  • double resolution may use different interpolation schemes for different wire regions: slow changing smooth area, wire rising edge, wire top area and wire descending edge.
  • the interpolation schemes may be knowledge based resulting from experiments using specified stereo light which may combine dome light and coaxial light in one.
  • linear or cubic interpolation may be used.
  • the automated inspection method may further include determining wire bond heights by defining a small predefined area around the widest bonding width and performing feature based stereo matching to get a sequence of bond heights at wire centroids.
  • statistical processing and filtering may be applied to get the bond height.
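  • The statistical processing and filtering mentioned above could, for example, be a simple outlier rejection around the median of the measured bond height sequence; the rejection rule below is an assumption, not taken from the description.

```python
import numpy as np

def filter_bond_height(height_samples, max_dev=2.0):
    """Reduce the sequence of bond heights measured at successive wire
    centroids to one value: drop samples further than max_dev standard
    deviations from the median, then average the remaining ones."""
    h = np.asarray(height_samples, dtype=float)
    med = np.median(h)
    spread = np.std(h) + 1e-9
    kept = h[np.abs(h - med) <= max_dev * spread]
    return float(np.mean(kept)) if kept.size else float(med)
```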
  • the automated inspection method may further include determining the bonding 3D profile at the widest bonding width location.
  • a fixed correlation window may be selected.
  • a dynamic correlation window may be selected.
  • correlations may be measured for each pixel along the profile line to get the best match. In various embodiments, the 3D profile of the wire bonding at the widest bonding width location may be reconstructed.
  • this profile may be compared (for example by using an absolute sum of differences or a correlation method, or any other method) with a reference bonding profile derived from measurement of a good wire. In various embodiments, if the difference is big, the device may be rejected as having a bad bonding wire.
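  • A minimal sketch of the profile comparison using the absolute sum of differences mentioned above; both profiles are assumed to be sampled at the same positions, and the rejection threshold is an assumed, recipe dependent parameter.

```python
import numpy as np

def profile_difference(profile, reference_profile):
    """Absolute sum of differences between the reconstructed bonding profile
    and the reference profile measured on a good wire (same sampling)."""
    p = np.asarray(profile, dtype=float)
    r = np.asarray(reference_profile, dtype=float)
    return float(np.sum(np.abs(p - r)))

def is_bad_bond(profile, reference_profile, threshold):
    # threshold is an assumed, recipe dependent rejection limit
    return profile_difference(profile, reference_profile) > threshold
```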
  • the automated inspection method may further include determining missing die by either template matching or histogram analysis.
  • the die surface of the unknown quality device may be matched against a learning die.
  • the second method may compare the histograms of the unknown quality die with those of the learning die and may declare the die missing if the deviation is too big.
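  • A small sketch of the histogram based missing die check: compare normalized grey-level histograms of the die region on the unknown device and on the learning die, and flag the die as missing if the deviation is too big; the distance measure and the threshold are assumptions.

```python
import numpy as np

def die_missing(unknown_region, reference_region, bins=64, threshold=0.5):
    """unknown_region, reference_region: 8-bit grey-level image arrays of the
    die area. Returns True if the histogram deviation exceeds the threshold."""
    h1, _ = np.histogram(unknown_region, bins=bins, range=(0, 256))
    h2, _ = np.histogram(reference_region, bins=bins, range=(0, 256))
    h1 = h1 / max(h1.sum(), 1)
    h2 = h2 / max(h2.sum(), 1)
    deviation = 0.5 * np.sum(np.abs(h1 - h2))   # total variation distance in [0, 1]
    return bool(deviation > threshold)
```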
  • an automated method of inspecting a substrate such as single die wire bond, stacked die wire bond, power device wire bond, gold wire bond and the advanced ribbon bonding, MEMS device and/or other package configurations for defects may be provided.
  • an automated method of inspecting a semiconductor wire bond device in any form including single die wire bond, stacked die wire bond, power device wire bond, gold wire bond and the advanced ribbon bonding, MEMS device and/or other package configurations for defects may be provided. In various embodiments, the method may include: calibrating multiple cameras for 2D and 3D camera parameters; training a model as to parameters of a good wire bond device via optical viewing of one or multiple known good wire bond devices using color CCD sensors, wherein the training may include taking a color measurement of each pixel in a grid of pixels on each of the multiple known good wire bond devices, calculating the 2D features such as global reference points, local reference points, wire bond placements, bonding widths, tail lengths, die locations and orientations, and bond pad locations, calculating the 3D features such as global reference heights, local reference heights, and default die tilts, and then from all these 2D and 3D measurement results determining an upper and lower limit for each measurement and saving them as 2D and 3D recipes; and inspecting unknown quality wire bond devices using the model.
  • the automated inspection method may further include color stereo illumination with dome light using one color spectrum and coaxial illuminations using another color spectrum.
  • the automated inspection method may further include color subpixel wire centroids determination and reconstruction of 3D wire loop height based on stereo matching of these subpixel features.
  • the automated inspection method may further include color double resolution and color superresolution to increase the resolution around wire area to achieve higher 3D reconstruction accuracy.
  • the training model step may further include taking the color measurement of each pixel on a 24 bits color space.
  • an automated method of inspecting section by section of a substrate such as single die wire bond, stacked die wire bond, power device wire bond, gold wire bond and the advanced ribbon bonding, MEMS device and/or other package configurations for defects may be provided. In various embodiments, the method may include: calibrating multiple cameras for 2D and 3D camera parameters, training a model as to parameters of a good substrate via optical viewing of one or multiple known good substrates, wherein the training may include taking a measurement of each pixel in a section of a substrate on each of the multiple known good substrate samples, calculating the 2D features such as global reference points, local reference points, wire bond placements, bonding widths, tail lengths, die locations and orientations, bond pad locations, calculating the 3D features such as global reference heights, local reference heights, default die tilts, and then from all these 2D and 3D measurement results determining an upper and lower limit for each measurement and saving them as 2D and 3D recipes, inspecting unknown quality wire bond devices using the model, where such
  • an apparatus and method may be provided for automatic optical inspection of wire bond devices.
  • the apparatus may include a vision head with multiple CCDs for imaging wire loops and bonding conditions from multiple views, an integrated stereo illumination optimized for both 3D triangulation and 2D wire feature measurement, a movable platform such as an X-Y-Z table integrated with the vision head to inspect objects such as wire bond devices, a calibration unit and method for mapping 3D world coordinates into 2D image coordinates, and an image processing unit for the measurement of 3D features such as wire loop height, die tilt and wire bond height, as well as 2D features such as die surface scratches, bond placements, bond widths, and long tails.
  • the apparatus and method may provide a rapid inspection of wires, bumps and other 3D features on leadframes, wafers, MEMS devices, and other semiconductor substrates.
  • an automated system for inspecting a substrate such as a wire bond device in any form including single die wire bond, stacked die wire bond, power device wire bond, gold wire bond and the advanced ribbon bonding, MEMS device and/or other package configurations for defects may be provided.
  • the system may include: a wire bond test plate, means for providing wire bond devices to the test plate; a stereo illumination system with dome and coaxial lights optimized for wire loop 3D reconstruction and wire and die surface 2D measurements; a visual inspection head for visual inputting of a plurality of known good quality wire bond devices during training and for visual inspection of other unknown quality wire bond devices during inspection; a laser thermal sensor based heat detection subsystem to detect the lifted wire, wherein the methods of heating may include placing heaters (for example a non contact heater or for example a contact heater) under the leadframe and heating up bonded leadframes to an acceptable temperature level, or projecting a focused heat source on the wires; a microprocessor having processing and memory capabilities for developing a model of a good quality wire bond device by taking a gray scale or color measurement for 2D and 3D features on a plurality of known good quality wire bond devices and determining an upper and a lower limit for each 2D or 3D feature from the plurality of known good quality wire bond devices, and the microprocessor further comparing other
  • the automated system may further include a focusing mechanism for focusing an image of at least a portion of the wire bond device as seen by the camera.
  • the visual inspection head may be or may include a dual camera or multiple cameras.
  • the visual inspection head may be or may include multiple high resolution black and white CCD cameras or color CCD cameras.
  • the visual inspection head may be or may include multiple microscopy lenses.
  • the magnifications of the lenses may be changed manually or automatically.
  • the automated system may further include a device for a 3D feature reconstruction using subpixel wire centroid based stereo matching.
  • the automated system may further include an autofocus feature.
  • the autofocus feature may be based upon a sharpness calculation using segmentation and statistical processing.
  • the unknown sharpness image of a wire bond device may be taken and then segmented into multiple sections to calculate sharpness of each segment based on either correlation, FFT (fast Fourier transform), or edge based criteria.
  • a sharpness calculation may be used to find the correct focus point by statistical processing of all segment sharpness data.
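The segmentation-and-statistics sharpness calculation could, for instance, be sketched as below using a simple gradient (edge based) criterion per segment and the median as the statistical summary; the grid size and the choice of the median are assumptions, not values taken from the source.

```python
import numpy as np

def focus_score(image, grid=(4, 4)):
    """Segment the image into a grid and score sharpness per segment with an
    edge-energy criterion; the median over segments gives a robust score."""
    img = np.asarray(image, dtype=float)
    rows = np.array_split(np.arange(img.shape[0]), grid[0])
    cols = np.array_split(np.arange(img.shape[1]), grid[1])
    scores = []
    for r in rows:
        for c in cols:
            seg = img[np.ix_(r, c)]
            gy, gx = np.gradient(seg)
            scores.append(np.mean(gx ** 2 + gy ** 2))  # edge energy of segment
    return float(np.median(scores))

def best_focus(z_positions, images):
    """Pick the z position whose image maximizes the statistical focus score."""
    scores = [focus_score(im) for im in images]
    return z_positions[int(np.argmax(scores))]
```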
  • Methods and apparatuses may be applied to: wire bond device 2D and 3D measurement and defect inspection; MEMS device or other substrates 2D and 3D inspection; 2D and 3D surface measurements; and camera calibration for precision measurement.
  • FIG. 18A shows a perspective view of a wire bond inspection system 1800 in accordance with an embodiment.
  • FIG. 18B shows a front view of the wire bond inspection system 1800 in accordance with an embodiment.
  • the automated wire bond inspection system 1800 in accordance with an embodiment may be used in one environment to find defects on wire bond devices but may be used for this and other uses including for inspecting other devices like BGA (Ball Grid Array) devices.
  • the system 1800 may inspect for many types of defects including, but not limited to, the following: bond placement, bond width, die placement, die scratches, long tail, 3D die tilt, bond height and wire loop height.
  • the system 1800 may for example perform the process described with reference to FIG. 19 below.
  • the system 1800 may include a magazine loader 1802 which may load a magazine of one or more leadframes with multiple wire bond devices; a leadframe indexer 1804; an inspection head 1808 (which in the following may also be referred to as a camera system, an optical imaging mechanism, inspection camera or cameras and optics, or a vision head), for example a 2D&3D vision head, for example with multiple CCD sensors and stereo light; a die location detection sensor (not shown) to determine if the die is under the inspection head 1808; a computation unit 1810, for example a computer; a parameter input device 1812, for example a keyboard and/or a mouse; a display 1814, for example a monitor, displaying the view being seen by the camera presently or at any previous saved period; a test plate 1806 as will be explained in more detail below; an xyz table 1810; a reject station 1816; and an output magazine loader 1818.
  • means for providing a leadframe including multiple wire bond devices to the test plate 1806 may be provided; the providing may be manual, in that the user may move the leadframe from a cassette or magazine to the test plate 1806, or automatic.
  • the wire bond leadframe providing means, which may also be referred to as the leadframe indexer 1804, may include a robotic arm or a leadframe indexer 1804 that may pivot from a first position where a leadframe may be initially grasped from a magazine or cassette to a second position where the leadframe may be positioned on the wire bond test plate 1806 for inspection. After inspection, the robotic arm may pivot the leadframe from the second position at the test plate 1806 to a third position where the leadframe may be transported to an output magazine or cassette.
  • the camera system 1808 or other visual inspection device may be provided for visual inputting of good wire bond devices during learning product and for visual inspection of other unknown quality wire bond devices during inspecting product as will be explained in more detail below.
  • the camera system 1808 may be any type of camera capable of high resolution inspection.
  • An example of one part of such a camera system may include one or multiple CCD inspection cameras used to capture images of a leadframe containing wire bond devices, or other images, from one or multiple views during defect analysis, for example but not limited to one of the camera systems described above.
  • the focusing mechanism may be an optical imaging mechanism 1808 integrated with an xyz table 1810 with multiple optics therein for using different inspection resolutions.
  • the camera system 1808 may be a two (2) inspection camera system where high resolution CCD cameras may be used to provide high resolution gray-scale images for inspection.
  • computer controlled optics may be provided that may use long working distance microscopic objectives so as to provide for low distortion images that may be desired for accurate defect detection.
  • multiple magnifications may be selected for inspection of different fields of view and accuracies.
  • computer controlled illumination may be integrated into and with the inspection camera or cameras and optics 1808 to complete the wire bond device imaging process.
  • the illumination system may be coupled to the camera or cameras and optics 1808 as long as the illumination system works in conjunction with the camera or cameras 1808.
  • in a strobing environment, the illumination may occur simultaneously or substantially simultaneously with the camera shuttering, which in accordance with an embodiment may be a high speed electronic shuttering mechanism.
  • a non-strobing environment may be provided, and the illumination may be continuous or as needed.
  • illumination may be by any known illumination device such as high intensity lights, lasers, fluorescent lights, arc discharge lamps, incandescent lamps, and LEDs.
  • the types of the illumination may be of a dome light only, dual axial lights only, or both dome light and dual axial lights variety.
  • an illumination system may include a dome illumination and two coaxial illuminations.
  • dome illumination may involve illuminating the substrates from above where the illumination system may be mounted directly above the substrate. This dome illumination may be very effective in illuminating specular objects such as leadframe surface and wires on a substrate as the diffused light is reflected back to the camera from all directions. Dome illumination may be optimal for visualization of both flat surface structures such as die surface features as well as 3D wire loops and bonds.
  • dual coaxial illumination may be used in conjunction with the dome light to increase the contrast of the 2D feature detection, or in an alternative way, to distinguish probe marks, which may cause false contours during wire bond width or displacement determination.
  • the coaxial lights may be aligned with optical axis of each camera.
  • the light from the left coaxial light, which may be aligned with the optical axis of the left camera, may illuminate the die surface with the reflected light being captured by the right camera.
  • the reflected light from right coaxial illumination may be captured by the left camera.
  • the vision head 1808 may include a dual coaxial illumination system that may be physically located adjacent to the camera so as to provide brightfield illumination from above the objects illuminated.
  • the vision head 1808 may include a dome illumination system that may be located peripherally around the leadframe in the test plate 1806.
  • both dual coaxial illumination from above the object and dome illumination from around the periphery of the object may be provided.
  • the illumination as provided by the coaxial and dome illumination systems may be provided by any known illumination source such as a white light source, LED, incandescent, fluorescent, or other similar gas envelope or similar electrical lights, or by lasers or similar devices.
  • the parameter input device 1812 may be configured for inputting parameters and other constraints or information. These parameters, constraints and information may include sensitivity parameters, geometry, wire bond device size, wire size, wire number, etc. Any form of input device may suffice including a keyboard, mouse, scanner, infrared or radio frequency transmitter and receiver, etc.
  • the display 1814 may be configured for displaying the view being seen by the camera presently or at any previous saved period. The display 1814 may be a color monitor or other device for displaying a color display format of the image being viewed by the vision head 1808 for the user's viewing, or alternatively viewing an image saved in memory.
  • This monitor, or another adjacent or other monitor may be used to view the gray-scale inspection image of the vision head 1808 that may be used by the system 1800.
  • the display 1814 may be used during inspection to show the image being viewed by the vision head 1808.
  • the display 1814 may also be available for displaying other information as desired by the user, such as system parameters.
  • the computer system 1810 or other computer having processing and memory capabilities may be configured for saving the inputted good wire bond devices, developing a model therefrom, and comparing or analyzing other wire bond devices in comparison to the model based upon defect filtering and sensitivity parameters to determine if defects exist.
  • the computer system 1810 may also be used to perform other mathematical and statistical functions as well as other operations.
  • the computer system 1810 may be of a parallel processing DSP (digital signal processing) environment.
  • the system 1800 may be based upon standard computer technology such as Pentium quad core.
  • the system 1800 may also incorporate use of an autofocus feature.
  • the autofocus feature may be based upon a sharpness calculation using segmentation and statistical processing.
  • the unknown sharpness image of a wire bond device may be taken and then segmented into multiple sections to calculate the sharpness of each segment based on either correlation, FFT, or edge based criteria. Thereafter, a sharpness calculation may be used to find the correct focus point by statistical processing of the sharpness data of all segments.
  • the sequence of operation may be as follows, and will be described in more detail below. The operator may calibrate the system to get accurate camera parameters for 2D and 3D measurement.
  • camera calibration may be desired to be performed for the system once a week.
  • the operator may then train the system as to what a "good wire bond device" is, and may thus for example create a good wire bond model, or choose an existing model.
  • the plurality of good wire bond devices may be viewed by the CCD camera such that the computer system then may form a "good wire bond” model by grouping all or some of the common characteristics.
  • the system 1800 may perform wire bond inspection by studying a user provided set of known good wire bond devices as well as by measuring features from digital images using 2D and 3D calibration results.
  • Inspection parameters may also be set to indicate how close an unknown quality wire bond may be desired to match specific characteristics of the "good wire bond" model to be considered a good device. These may include sensitivity parameters and defect filters.
  • the operator may also create or select previously stored inspection recipes. This may include information as to how wire bond devices may be selected from cassettes or other storage receptacles, how each wire bond device on a leadframe may be desired to be selected for defect inspection, how and where to inspect 2D and 3D defects and what the tolerances may be, how defect inspection map files may be imported and exported, etc.
  • the system 1800 may then be ready to inspect an unknown quality wire bond device. A leadframe containing multiple wire bond devices may be loaded onto the inspection area 1806, for example the test plate 1806.
  • the system may then be ready to collect an image of the selected area (the first device position) of the leadframe using the cameras of the vision head 1808 by moving the leadframe to align the camera with the selected area, such as a first wire bond device position, so as to take a first image thereof which may be the whole leadframe, a part of the leadframe, a wire bond device, or a part of a wire bond device and then viewing and recording that image.
  • Automatic defect inspection may be performed on the device's digital image. If a defect was found, then the system may collect and store detailed information about each defect such as defect location on the device, size, shape, etc.
  • the leadframe indexer 1804 may then be moved to align the camera with another selected wire bond device area, which may be the next adjacent area or not, to take an image thereof (the second device position) on the leadframe adjacent to the first image.
  • the leadframe may be indexed under the inspection cameras to the next device position. This second device position may then be viewed and recorded. These steps may be repeated until all of the images on the leadframe have been viewed and recorded. Simultaneous with these image viewing steps, 2D and 3D feature measurements, defect sensitivity and filtering may be used in conjunction with the "good wire bond" model viewing to determine if initial anomalies or differences between the "good wire bond" model and the image are actual defects or if they may be desired to be filtered out.
  • a defect map of the leadframe may then be created in the computer system from the collection of all of the defects found on all of the devices.
  • the leadframe may continuously be moved during strobe illumination thereof.
  • the sections of the wire bond device may then be scanned by synchronizing the camera with a strobe illumination so that when the camera is properly positioned over each section of the moving substrate, the strobe illumination may occur simultaneous with the image collection via the camera.
  • defect classification may be desired.
  • Each archived defect may be manually reviewed by the operator, where the leadframe indexer 1804 may be moved to the position on the wire bond device at which the particular defect is located so that the operator may view and classify the defect. This may then be repeated for all defects.
  • the classified defects may then be saved as a classified defect map.
  • That leadframe may then be removed and another leadframe may be loaded for inspection. This removal and loading of a new leadframe may either be manually performed or may be automatically performed.
  • stereo cameras may be calibrated with localized camera parameters (e.g. rotation angles, translations, distortions and image center etc) and localized fundamental matrix and/or epipolar geometry may be used.
  • a special design of a calibration target for camera 3D calibration without using a z-stage may be provided.
  • stereo illumination with integrated dome and dual coaxial lights for optimum 2D and 3D illumination may be provided.
  • methods for wire width measurement from two cameras with projection and fusion may be provided.
  • a three-point die placement measurement method may be provided.
  • a double resolution method for wire tracing may be provided.
  • a four point die tilt 3D measurement method may be provided.
  • an IR (infrared) detection using a laser sensor for lifted wires with active IR illumination may be provided.
  • wire bond 2D and 3D recipe creation methods may be provided.
  • a rotated ROI and automatic ROI method may be provided.
  • stereo & coaxial illumination may be provided.
  • stereo triangulation may be provided.
  • 2D/3D recipes may be provided.
  • apparatus and methods for automated optical inspection of wire bond devices may be provided.
  • a 2D/3D wirebond inspection system may be provided.
  • optic design and software development consultancy support may be provided.
  • a semiconductor device inspection method and apparatus may be provided.
  • wire bond 2D and 3D inspection and measurement method and apparatus may be provided.
  • a wire bond inspection system may integrate various of the above referenced embodiments.
  • the above described systems and parts may be part of system 1800 and may be used to perform the wire bond defect inspection as will be explained in more detail below.
  • FIG. 19 shows an overall flow chart 1900 of a process in accordance with an embodiment.
  • the process may be started.
  • the system and process may encompass a multiple step process of learning a product in 1904, where 2D and 3D recipes of the product may be created, inspecting the product in 1906, where a wire bond device may be measured using knowledge obtained from training and recipes, reviewing defects in 1908, and if desired, defect reporting in 1910.
  • the process may end in step
  • while the flowchart 1900 is shown to include several steps, not each of them may be required. For example, after having learnt a product in 1904, more than one product may be inspected in 1906. In other words: the learning step 1904 may be performed for a product, and then, the inspection step 1906 may be performed without requiring additional learning steps 1904.
  • in the review defects step 1908, defects may be visualized by superimposing defects onto a graphical display of the corresponding wire bond device.
  • the defect map may include indication of the defect type, location, and severity of defects against preset tolerances.
  • the classified defect map as well as alternatively or additionally the defect information in any of a number of other formats may be saved for database or other management and review.
  • a defect map may be generated from the inspection results in the product inspection step 1906.
  • the first step may be selecting a defect to review (or alternatively reviewing all or some of the defects for this wire bond device in order).
  • the second step may be moving the inspection head, for example the vision head 1808 shown with reference to FIG. 18 A, to the position of the wire bond device to be reviewed such that the particular defect is properly positioned under the cameras of the vision head 1808.
  • the third step may be a user viewing and classifying of the defect. Any number of classifications may be available and the classifications may be user defined.
  • the classification step may be repeated until all of the defects have been reviewed and classified.
  • the last step may be saving of the classified defect map as well as, alternatively or additionally, saving the defect information in any of a number of other formats for database or other management and review.
  • statistics may be generated, whereby defect statistics along with defect maps may be outputted to perform statistical or other analysis on the types of defects, frequency of defects, location of defects, etc. This may be useful to the wire bond manufacturers so as to allow them to focus on major defects and trace the source.
  • the data stored in a database format may be exported or printed out. This data may then be analyzed or otherwise used to perform statistical or other analysis on the types of defects, frequency of defects, location of defects, etc. which may be useful to the wire bond manufacturers so as to allow them to focus on defect laden areas. This step 1910 of generating statistics may provide for complete and effective data analysis as it may report data in multiple formats including graphical, tabular, and actual image displays.
  • the data that may be placed in tabular format may allow numerical values to be readily correlated with other values such as electrical formats.
  • the graphical data representation may quickly show trends that may otherwise be difficult to see.
  • FIG. 20 shows a flow diagram 2000 illustrating details of the learn product steps in accordance with an embodiment.
  • the flow diagram 2000 may be a more detailed flow chart of one step in the process as shown in FIG. 19, e.g. learn a new product.
  • flow diagram 2000 may illustrate details of the learn product step 1904 shown in FIG. 19.
  • the learn product step 1904 may be started.
  • parameters may be defined and/or trained (and/or stored), for example in the computer system 1810 shown in FIG. 18A, for inspection path planning, where wire bond device size, pitch and how to separate the device into multiple inspection sections may be learnt so that the inspection head may move to different parts of a wire bond component for 2D and 3D feature measurements.
  • leadframe scan path and other parameters for a substrate containing multiple wire bond devices may be defined and stored, for example in the computer system 1810 shown in FIG. 18 A, for use during learning and inspecting.
  • This path planning, when performed in conjunction with the create 2D recipe step 2006 and the create 3D recipe step 2008 as described below to define 2D and 3D inspection recipes, may be desired to guide an inspection head, for example the inspection head 1808 shown in FIG. 18A, to position itself during leadframe scanning at a particular x/y position to perform a specific 2D and/or 3D inspection based on procedures defined in the inspection recipes. In various embodiments, the path planning may define the z scan height associated with each x/y position to maintain in-focus imaging of wire loops and other features.
  • ROIs (regions of interest) may be defined for 2D inspections and the system may be trained as to what a "good wire bond" is; this may include aligning one or multiple wire bond device images and forming a model, for example within the computer system 1810 shown in FIG. 18A, to define what an ideal wire bond should look like based upon the common characteristics viewed, as will be explained in more detail below with reference to FIG. 21.
  • the system may be trained as to how to set the subpixel parameters for optimum wire feature detection and 3D triangulation, as will be explained in more detail below with reference to FIG. 23.
  • FIG. 21 shows a flow diagram 2100 illustrating a 2D wire bond recipe creation in accordance with an embodiment.
  • flow diagram 2100 may be a more detailed flow chart of one step in the process 2000 as shown in FIG. 20, e.g. 2D wire bond recipe creation 2006 of defining (and inputting, for example into a computer system, for example the computer system 1810 shown in FIG. 18A) the recipe for wire bond 2D inspection.
  • the create 2D recipe step 2006 may be started.
  • one or more global reference point(s) may be defined. Each 2D measurement may be linked to one of these reference points so that when a component changes position in terms of location and orientation, global reference point(s) may be used to re-align it with the reference sample during learning. In this way a comparison may be made between the wire bond device to be measured and the wire bond device learnt.
  • global references may be used to correct the relative position change between leadframe learnt and an incoming new leadframe. Any wire bond features dependent on the leadframe may be corrected using global reference points.
  • Normally leadframe indexing holes or leadframe corners may be selected as global reference points.
  • local reference point(s) may be defined, for example for the correction of offsets for global independent features.
  • the die position in a leadframe may be different from component to component. For any measurements relevant to a die, it may be desired to define local reference points on the die itself.
  • in 2108, small wire placement inspection Regions of Interest (ROIs) may be defined. Small wires may refer to those with wire diameter smaller than 100um. For each wire bond position, one wire placement ROI may be desired to be defined. Wire placement ROIs may be rotated to any angle to be in line with the actual wire bond orientation. Group ROIs may be introduced whereby a set of wire placement ROIs may be categorized under one group to share the same inspection methods and parameters. The inspection system may allow a user to define one or more group ROIs. Default small wire placements may be learnt and stored, for example in the computer system 1810 shown in FIG. 18A, and if needed, may be used by the 2D inspection system to determine the relative placement offset.
  • large wire placement inspection Regions of Interest may be defined. Large wires may refer to those with wire diameter bigger than 100um. For each wire bond position, one wire placement ROI may be desired to be defined. The angle of a wire placement ROI may be adjusted to be in line with the actual wire bond direction. Group ROIs may be introduced whereby a set of wire placement ROIs may be categorized under one group to share the same inspection methods and parameters. Different inspection strategies may be used for large and small wires (as will be explained with reference to a determine small wire placement step 2604 and a determine large wire placement step 2606 with reference to FIG. 26). Default large wire placements may be learnt and stored, for example in the computer system 1810 shown in FIG. 18A.
  • FIG. 22 shows an illustration 2200 of bond placement measurement regions of interest (ROIs) in accordance with an embodiment.
  • die placement ROIs may be defined.
  • three ROIs may be defined for each die.
  • One ROI may be desired to cover a reference point outside the die. Either a corner point or an indexing hole centre or other feature point may be selected as the die reference point.
  • Another two die placement ROIs may be desired to be defined inside the die area. They may be desired to cover texture rich areas. The bigger the distance between the two ROIs, the better the result and sensitivity of die offset and tilt determination may be.
  • die surface scratch and missing die inspection ROI may be defined.
  • One ROI may be defined for each die for the detection of surface defects such as scratches, dents, chip-off and missing die.
  • bond width measurement ROIs for small wires may be defined.
  • One ROI may be defined for each wire for wire width measurement.
  • the angle of a wire width ROI may be adjusted to be in line with actual wire bond direction.
  • Group ROIs may be defined to apply the same inspection methods and parameters to all ROIs in this group. If wire placement ROIs for small wires have been defined, this step may be omitted since both may share the same ROIs.
  • bond width measurement ROIs for large wires may be defined.
  • One ROI may be defined for each wire for wire width measurement.
  • the angle of a wire width ROI may be adjusted to be in line with actual wire bond direction.
  • Group ROIs may be defined to apply the same inspection methods and parameters to all ROIs in this group. If wire placement ROIs for large wires have been defined, this step may be omitted since both may share the same ROIs.
  • FIG. 23 shows a flow diagram 2300 illustrating 3D wire bond recipe creation in accordance with an embodiment.
  • the flow diagram 2300 may be a more detailed flow chart of one step in the process 2000 as shown in FIG. 20, e.g. 3D wire bond recipe creation 2008 of defining (and, in various embodiments, inputting into a computer system, for example the computer system 1810 shown in FIG. 18A) the recipe for wire bond 3D inspection.
  • the 3D wire bond recipe creation step 2008 may be started.
  • one or more global reference point(s) for multiple views may be defined. Each 3D measurement may be linked to one of these reference points so that when a component changes position in terms of location and orientation, global reference point(s) may be used to re-align it with the reference sample obtained during learning. In this way, a comparison may be made between the wire bond device to be measured and the wire bond device learnt.
  • global references may be used to correct the relative position change between the leadframe learnt and an incoming new leadframe. Any wire bond features dependent on the leadframe may be corrected using global reference points. In various embodiments, leadframe indexing holes or leadframe corners may be selected as global reference points.
  • ROIs for measurement of a reference height or a reference plane may be defined. This may be desired to be done in each multi-view image. All or some 3D measurements such as wire loop height and die tilt may be measured in the absolute world coordinate system defined by calibration, or relative to a reference height or a reference plane.
  • a reference plane may be determined by defining three ROIs on the leadframe so that subsequently wire loop heights may be calculated as the distance of the projection of the highest point (x,y,z) in a wire to the reference plane. In this way, benchmarking between a manual measurement resulting from a typical measurement scope and an automatic wire bond inspection system may be possible.
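A minimal sketch of building the three-ROI reference plane and measuring a wire point against it might look as follows; the interface is illustrative and assumes the three reference heights have already been triangulated to 3D points.

```python
import numpy as np

def reference_plane(p1, p2, p3):
    """Build a reference plane (unit normal n, point p0) from three 3D points
    measured in the leadframe reference ROIs."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n), p1

def loop_height(point, plane):
    """Signed distance of the highest wire point (x, y, z) to the plane."""
    n, p0 = plane
    return float(np.dot(np.asarray(point, dtype=float) - p0, n))
```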
  • small wire loop height Region of Interest may be defined.
  • one or more wire ROIs may be defined to help tracing the wire feature points for 3D calculation.
  • Group ROIs may be introduced whereby a set of wire loop height ROIs may be categorized under one group to share the same inspection methods and parameters.
  • automatic ROIs may be activated to position the corresponding wire loop height ROIs in all other views based on the well known epipolar constraint and other constraints like assuming a default loop height.
  • wire loop height ROIs may be defined manually in all other views.
  • large wire loop height Region of Interest may be defined.
  • one or more wire ROIs may be defined to help tracing the wire feature points for 3D calculation.
  • Group ROIs may be introduced whereby a set of wire placement ROIs may be categorized under one group to share the same inspection methods and parameters.
  • automatic ROIs may be activated to position the corresponding wire loop height ROIs in all other views based on the well known epipolar constraint and other constraints.
  • wire loop height ROIs may be defined manually in all other views.
  • die tilt Regions of Interest may be defined. For each die, three or four (which may be used as a default value in accordance with various embodiments) or more ROIs may be defined to determine die tilt in the world coordinate system. Die tilt may also be determined relatively by specifying a number of (for example 4 or more) die tilt reference ROIs near the corners of a die and associating each reference ROI with a nearest die tilt ROI.
  • FIG. 24 shows an illustration 2400 of die tilt ROIs in accordance with an embodiment.
  • a first die tilt ROI 2402, a second die tilt ROI 2404, a third die tilt ROI 2406, and a fourth die tilt ROI 2408 are shown.
  • bond height measurement ROIs for small wires may be defined.
  • One ROI may be defined for each wire bonding position.
  • the angle of a wire width ROI may be adjusted to the actual wire bond direction.
  • Group ROIs may be introduced whereby a set of wire bond width ROIs may be categorized under one group to share the same inspection methods and parameters.
  • FIG. 25 shows a flow diagram 2500 illustrating details of an inspect product step in accordance with an embodiment.
  • FIG. 25 may show a more detailed flow chart 2500 of one step in the process 1900 as shown in FIG. 19, e.g. inspect a new product 1906.
  • the inspect product step 1906 may be started.
  • one or more global reference point(s) may be localized to determine leadframe, die and pad positions and orientations.
  • all defined 2D and 3D regions of interest may be shifted based on offsets and rotations calculated in step 2504 so that the incoming device may be aligned with a reference device (for example training device).
  • a reference device for example training device
  • all or some 2D features may be inspected, including die placement, bond placement, bond width, die scratches and long tails within each individual region of interest.
  • all or some 3D features including reference heights, die tilts, bond heights, and loop heights within each individual region of interest may be measured.
  • FIG. 26 shows a flow diagram 2600 illustrating inspecting 2D features in accordance with an embodiment.
  • FIG. 26 may provide a more detailed flow chart 2600 of one step in the process 2500 as shown in FIG. 25, for example 2D wire bond inspection 2508, which may include determination of small and large wire placement on the corresponding bonding pad, determination of die offset and rotation, inspection of die surface scratches and long tails and measurement of bond widths for small and large wires as will be described in more detail below.
  • the step 2508 of performing wire bond device 2D inspection may be started.
  • small wire placement may be determined to determine how accurately a bonder bonds the wire onto the bond pad surface.
  • three methods may be used: a present or not present check, a reference based method by learning a good wire bond device or an absolute measurement method by determining bond position relative to its bond pad.
  • the system may check if a wire is available or not. Edge detection may be applied to wire placement ROIs and if sufficient edges are detected, a wire may be available.
  • subpixel edges along wire bond location may be calculated and may be best-fit to form three lines: two running in parallel with the wire and one in the perpendicular direction. Two subpixel corner points may be calculated from these three lines. The centre of these two corners may be used as bond placement position. If this centre is far away from the learnt one, bond placement error may occur.
  • the bond placement location may be determined in a similar way as in the reference based method. This position, however, may be compared with bond pad edges to determine its absolute location on the pad. Tolerances may be set to reject those bond placements too near the bond pad edges.
  • large wire placement may be determined.
  • the procedures for large bond placement determination may be the same as in small wire bond placement determination; only the parameters for subpixel edge determination and best-fitting may be changed for the best detection of large wires.
  • die placement may be determined.
  • a good die with accurate placement may be desired to be learnt first.
  • three ROIs may be defined in a create 2D recipe section (for example in step 2112 as explained above with reference to FIG. 21) to select three rectangular areas, one on the leadframe as reference point and two on the die surface with rich textures.
  • the centres of these three ROIs may be stored as the learning positions of a good die.
  • the patterns within these ROIs may be used as matching models later on to find corresponding patterns in an incoming wire bond device.
  • three bigger ROIs may be defined to allow for die shift from component to component.
  • Template matching may be performed for all these three ROIs using learning patterns as models.
  • the three matching positions may be determined. Offset in x and y directions and rotation angle relative to the good reference die may be calculated from three matched points.
  • two die corners may be detected by edge based approach.
  • the positions of these two corners, together with the reference point may be used to determine die offset and rotation.
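The x/y offset and rotation of an incoming die relative to the learnt one, from either the three matched template positions or the two corners plus the reference point, can be estimated with a least-squares rigid fit such as the sketch below; the SVD-based (Kabsch-style) fit is an implementation choice, not a method stated in the source.

```python
import numpy as np

def die_offset_and_rotation(ref_pts, new_pts):
    """Estimate x/y offset and rotation of an incoming die from matched points.

    ref_pts, new_pts : Nx2 arrays of matched (x, y) positions (N >= 2) on the
    learned and incoming device.  Returns (dx, dy, angle_deg).
    """
    ref = np.asarray(ref_pts, dtype=float)
    new = np.asarray(new_pts, dtype=float)
    ref_c, new_c = ref.mean(axis=0), new.mean(axis=0)
    # 2x2 cross-covariance of the centred point sets; its SVD gives the best
    # rotation mapping the reference points onto the incoming points.
    H = (ref - ref_c).T @ (new - new_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    angle = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    dx, dy = new_c - R @ ref_c
    return float(dx), float(dy), float(angle)
```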
  • missing die and die surface defects may be determined.
  • various methods may be used. In a template matching based approach, die surface of unknown quality device may be matched against a learning die. If the matching score is too low, there may be a high chance of missing die. The second method may compare the histograms of unknown quality die with learning die and may declare die missing if the deviation is too big.
  • a good die with good die surface condition may be desired to be learnt first. For an incoming die, offset and rotation against reference good die may be determined and translation and rotation corrections may be applied to bring the incoming die to the reference die position. A pixel to pixel comparison may be made between reference die and incoming die.
  • Defects may be those pixels with grey level value differences bigger than a preset threshold.
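A minimal sketch of this pixel-to-pixel comparison, assuming the incoming die has already been translated and rotated onto the reference die position, could be:

```python
import numpy as np

def surface_defect_mask(reference_die, incoming_die, threshold=40, dont_care=None):
    """Mark pixels whose grey-level difference exceeds `threshold` as defects.

    `threshold` is an illustrative value, not one given in the source;
    `dont_care` is an optional boolean mask of pattern-rich areas to ignore.
    """
    diff = np.abs(incoming_die.astype(np.int32) - reference_die.astype(np.int32))
    mask = diff > threshold
    if dont_care is not None:
        mask &= ~dont_care
    return mask
```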
  • good die surface grey level distribution may be learnt so that binarization may be performed on good die which may result in no defects (or no blobs found). The same binarization may be performed for an incoming die to screen out the surface defects such as scratches or foreign particles. Don't care areas may be defined for pattern rich areas on a die to avoid over-rejects.
  • the defect points may be analyzed using Hough transformation. Line segments may be detected by peak analysis in Hough space. Line segments longer than a certain value may be from scratches.
  • in step 2612, small wire bonding width may be determined.
  • a dynamic threshold may be first applied to the bonding area within wire bond width ROIs defined in create 2D recipe (for example step 2116 as described with reference to FIG. 21).
  • Object analysis may then be applied to find the biggest white object, which may be the wire surface of a small wire. The centroid of this object may be used as a reference line. Edges along both sides of the wire bonding area may be detected and, from each side, the edge point with the biggest distance to the centre line may be considered as defining the bond width.
  • in 2614, bonding widths of large wires may be determined.
  • Subpixel edge detection may be applied to the bonding area within wire bond ROIs defined in create 2D recipe (for example step 2118 as shown above with reference to Fig. 21).
  • Edges from both sides of a wire may be added together to best-fit a centre line, which may be used as a reference line. Edges along both sides of the wire bonding area may be filtered to remove noisy edges and, from each side, the edge point with the biggest distance to the centre line may be considered as defining the bond width.
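One plausible implementation of the centre-line-and-edge-distance width measurement is sketched below; taking the width as the sum of the largest perpendicular distances on the two sides is an interpretation of the description above, and the PCA-based line fit is an implementation choice rather than the source's stated method.

```python
import numpy as np

def bond_width(left_edges, right_edges):
    """Estimate bond width from subpixel edge points on both wire sides.

    left_edges, right_edges : Nx2 arrays of (x, y) subpixel edge points.
    A centre line is best-fit through all edge points; the width is the sum
    of the largest perpendicular distances found on each side.
    """
    pts = np.vstack([left_edges, right_edges]).astype(float)
    mean = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - mean)       # principal axis = line direction
    direction = vt[0]
    normal = np.array([-direction[1], direction[0]])

    def signed_dist(edges):
        return (np.asarray(edges, dtype=float) - mean) @ normal

    return float(np.max(np.abs(signed_dist(left_edges)))
                 + np.max(np.abs(signed_dist(right_edges))))
```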
  • bond width ROIs may be defined in a second view or more views, where the occluded bonding area may be visible.
  • Image rectification may be performed for all views to bring all cameras to the same parallel optical direction.
  • View fusion may be done to merge multiple views into one (for example for bond width area only) to determine bond width in a similar way as described above.
  • three more ROIs may be defined in a second view to determine the bond pad surface by using a best-fit planar surface. Assuming the bonding edge points are all on the same surface, the 3D coordinates of the points may be known and thus may be transferred back to the first view by using the projective geometry determined in camera calibration. After that, bond width may be determined in the first view.
  • In 2616, long tails, which only apply to large wires, may be determined. A reference wire bond device with good bonding tail lengths may be desired to be learnt first. Similar to bond width detection (step 2614), the location with the biggest bond width may be determined first.
  • the wire placement determined in step 2606 may be used to calculate the distance between wire placement position to the line representing biggest bond width.
  • the distance may be defined as tail length.
  • for an incoming device, the tail length may be calculated in the same way and if the difference between the reference one and the incoming one is bigger than a preset threshold, the system may reject the device as having a long tail.
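The tail length measurement reduces to a point-to-line distance; a minimal sketch, assuming the line of biggest bond width is given by two points on it, is:

```python
import numpy as np

def tail_length(placement_point, width_line_p0, width_line_p1):
    """Distance from the bond placement position to the line representing the
    location of the biggest bond width (defined here by two points on it)."""
    p = np.asarray(placement_point, dtype=float)
    a = np.asarray(width_line_p0, dtype=float)
    b = np.asarray(width_line_p1, dtype=float)
    d = b - a
    # Perpendicular distance of p to the infinite line through a and b.
    num = abs(d[0] * (p[1] - a[1]) - d[1] * (p[0] - a[0]))
    return float(num / np.linalg.norm(d))
```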
  • FIG. 27 shows an illustration 2700 of long tail ROIs and measurement in accordance with an embodiment.
  • a first long tail ROI 2702 and a second long tail ROI 2704 are shown.
  • FIG. 28 shows a flow diagram 2800 illustrating inspecting 3D features in accordance with an embodiment.
  • FIG. 28 may show a more detailed flow chart 2800 of one step in the process 2500 as shown in FIG. 25, e.g. 3D wire bond inspection 2510 of performing wire bond device 3D inspection, which may involve determination of a reference height for the reference point correction of all 3D measurements, die tilt measurement by a best-fit die tilt plane, wire loop height measurements for small and large wires using both a correction method and feature based stereo matching, and wire bond height measurement for lifted wire inspection, as will be explained in more detail below.
  • the inspect 3D features step 2510 may be started.
  • global reference height may be determined by matching small areas between first and second view or alternatively more views and by triangulating the matched views to get the reference height.
  • corresponding ROIs may be defined in multiple views.
  • Stereo matching may be performed to establish match candidates between the first view and the remaining views.
  • absolute sum of differences, normalized correlation and/or weighted correlation may be used.
  • the match may be found between two views when correlation reaches a peak.
  • the height information, e.g. x,y,z of the ROI centre may be derived by a least-squared solution.
  • a small search area may be defined within each reference height ROI to allow matching of multiple points and to calculate multiple reference height within a flat area.
  • Statistical processing may be applied to get the best height.
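A minimal sketch of the correlation-based matching used for the reference height might use zero-mean normalized cross correlation and a 1D search along a row of the second view, as below; the search interface is illustrative, and in practice the search bounds would come from the calibrated (local) epipolar geometry.

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross correlation of two equally sized patches."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else -1.0

def find_match(template, search_image, row, col_range):
    """Slide a reference-height template along one row of the second view and
    return the column where the correlation reaches its peak."""
    h, w = template.shape
    best_col, best_score = col_range[0], -1.0
    for c in range(col_range[0], col_range[1] - w):
        score = zncc(template, search_image[row:row + h, c:c + w])
        if score > best_score:
            best_col, best_score = c, score
    return best_col, best_score
```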
  • a global reference plane may be determined. By matching three small ROIs in multiple views, a 3D reference plane may be built. Other features calculated may be projected to this reference plane to make comparison between manual measurement and automatic measurement possible.
  • the leadframe surface or die surface may often be selected as reference plane.
  • die tilt in the 3D world coordinate system may be determined.
  • four ROIs may be defined in each view to cover four corners within a die.
  • at least three ROIs may be desired to be defined to determine die tilt and in various embodiments, four or more than four ROIs may be used.
  • a best-fit plane may be determined from the heights measured from the four matched ROIs. Die tilt angles alpha, beta and gamma in reference to the x, y, and z axes of the world coordinate system may be determined and stored as die tilt angles.
  • Average die height may also be calculated by averaging four ROI heights and with respect to the global reference height.
  • to determine die tilt relatively, in addition to the four die tilt ROIs defined inside the die surface, four ROIs on the leadframe surface or other reference surfaces may be defined, which may be desired to be as near the corresponding die tilt ROIs as possible. Die tilt may be formulated as four sets of differences between a die ROI height and its corresponding leadframe or other reference height nearby.
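A least-squares plane fit through the four (or more) measured corner heights could be sketched as follows. Reporting only two slope angles and the average height is a simplification of the alpha/beta/gamma description above, and the interface is illustrative.

```python
import numpy as np

def die_tilt(corner_points):
    """Best-fit plane z = a*x + b*y + c through the measured corner heights.

    corner_points : Nx3 array of (x, y, z) die tilt ROI centre positions.
    Returns the tilt angles (degrees) of the fitted plane along x and y, and
    the average die height.
    """
    pts = np.asarray(corner_points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    alpha = np.degrees(np.arctan(a))   # slope of the plane along the x axis
    beta = np.degrees(np.arctan(b))    # slope of the plane along the y axis
    return float(alpha), float(beta), float(np.mean(pts[:, 2]))
```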
  • wire loop heights for small wires may be determined.
  • one (or more) line shape ROI may be defined in each view.
  • Subpixel edge detection algorithms may be used to determine the transitions across the wire area accurately. For a white wire (a shiny wire in the multi-view images), a typical transition may be from black (background) to white (wire body) and then back to black (background). In some areas, black wires may occur. In that case the wire transition may follow the transition pattern from white to black to white.
  • wire centroid may be traced with subpixel accuracy in each view.
  • the procedure of feature based stereo matching may be as follows: after detecting subpixel wire centroid points in all views, the centroids may be ordered for each wire in a top-down or left-right manner. For each centroid in the first view (image), global or local epipolar geometry may be applied to the second view (image) to define the search line (for epipolar constraint, see the global fundamental matrix for epipolar line calculating step 3112 and the local fundamental matrix for local epipolar line calculating step 3114 in a 3D calibration session described with reference to FIG. 31 below). Two nearest centroids to this epipolar line in the second view may be determined. The intersection point between the epipolar line and the line determined by these two centroids may be considered as the matching point to the selected centroid in the first view.
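A minimal sketch of this epipolar matching step, assuming a fundamental matrix F from calibration and at least two ordered second-view centroids of the same wire, might be:

```python
import numpy as np

def epipolar_match(centroid, F, candidate_centroids):
    """Return the interpolated matching point on the epipolar line.

    centroid            : (u, v) subpixel centroid in the first view.
    F                   : 3x3 fundamental matrix (first -> second view).
    candidate_centroids : Nx2 ordered centroids of the same wire, second view.
    """
    x1 = np.array([centroid[0], centroid[1], 1.0])
    l = F @ x1                                   # epipolar line a*u + b*v + c = 0
    cand = np.asarray(candidate_centroids, dtype=float)
    d = np.abs(cand @ l[:2] + l[2]) / np.hypot(l[0], l[1])
    i, j = np.argsort(d)[:2]                     # two centroids nearest the line
    p, q = cand[i], cand[j]
    # Intersect the epipolar line with the line through the two centroids
    # (homogeneous cross products); degenerate if the lines are parallel.
    line2 = np.cross(np.append(p, 1.0), np.append(q, 1.0))
    inter = np.cross(l, line2)
    return inter[:2] / inter[2]
```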
  • 3D coordinate of a wire centroid point may be calculated through a process called triangulation. Least squared solution or other means may be used to triangulate for 3D reconstruction.
  • stereo matching may be established as follows: at first subpixel wire centroids in the first view may be calculated. For each valid wire centroid in the first view, global or local epipolar geometry may be applied to the second view (image) to define the search line. Transitions along the epipolar line may be determined to find the centroid, which may be the matching point of the centroid in the first view.
  • Continuity constraint may be applied to the reconstructed 3D wire centroids to filter out abnormal points.
  • a correction may be made on the reconstructed 3D wire loop. Mathematically, matching the first view with the second or further views to get the triangulated z height may yield the wire centre height, not the true wire top height. A correction may be calculated from the camera tilt angle and the wire radius.
  • double resolution may use different interpolation schemes for different wire regions: slow changing smooth area, wire rising edge, wire top area and wire descending edge.
  • the interpolation schemes may be knowledge based, resulting from experiments using a specified stereo light which may combine dome light and coaxial light in one.
  • linear or cubic interpolation may be used, though the result may be less repeatable and less accurate than the knowledge based double resolution.
  • Double resolution or superresolution may be used for small wires since some of the gold wires may have diameters as small as 18um. Depending on the microscope lens used, in some cases the wire thickness in an image (view) may be just 1.7 pixels. A double resolution to 3.4 pixels may make the subsequent subpixel edge detection much easier and more reliable.
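As a simple stand-in for the knowledge based scheme, plain linear interpolation (which the text notes is less repeatable and less accurate) can double the sampling of an intensity profile across a thin wire:

```python
import numpy as np

def double_resolution(profile):
    """Double the sampling of a 1D intensity profile by linear interpolation.

    This is only a simple substitute for the knowledge based interpolation
    described above, which treats smooth, rising-edge, top and descending-edge
    wire regions differently.
    """
    profile = np.asarray(profile, dtype=float)
    x = np.arange(len(profile))
    x2 = np.arange(0, len(profile) - 0.5, 0.5)   # twice as many samples
    return np.interp(x2, x, profile)
```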
  • FIG. 29 shows an illustration 2900 of wire loop height ROIs and measurement in accordance with an embodiment. For example, a first loop height ROI 2902, a second loop height ROI 2904, a third loop height ROI 2906, and a fourth loop height ROI 2908 are shown.
  • wire bond heights for small wires may be determined to check bonding conditions three dimensionally. As indicated in the create 3D recipe step 2314 with reference to FIG. 23, for each small wire, one or more line shape ROIs may be defined in each view.
  • Wire bond height measurement may serve two purposes: one may be to determine if there is any lifted wire and the other may be to determine bonding quality based on 3D profile analysis at the location where the bonding width is the biggest.
  • a small predefined area around widest bonding width may be selected and feature based stereo matching may be performed to get a sequence of bond heights at wire centroids.
  • Statistical processing and filtering may be applied to get the bond height.
  • a fixed (alternatively a dynamic) correlation window may be selected. Correlations may be measured for each pixel along the profile line to get the best match. In this way, the 3D profile of wire bonding at the widest bonding width location may be reconstructed. This profile may be compared (for example using absolute sum of difference or correlation method, or other method) with a reference bonding profile derived from measurement of a good wire. If the difference is big, the device may be rejected as a bad bonding wire.
  • b1 may be a constant that depends on the wire size, camera tilt angle and experimental results, and r may be the radius of the wire.
  • wire bond heights for large wires may be determined. As indicated in the create 3D recipe step 2316 with reference to FIG. 23, for each large wire, one or more line shape ROIs may be defined in each view. Similar to the small wire case, to determine the bonding height, a small predefined area around the widest bonding width may be selected and feature based stereo matching, or correlation based stereo matching, may be performed to get a sequence of bond heights around the wire centroids. Statistical processing and filtering may be applied to get the bond height.
  • a so called visibility constraint may be introduced to improve stereo matching accuracy and to minimize occlusion effect for a large wire.
  • the visibility constraint may require that occlusion in one image and disparity in the other are consistent.
  • the visibility constraint may be embedded within an energy minimization framework, resulting in a symmetric stereo model that may treat left and right images equally.
  • An iterative optimization algorithm may be used to approximate the minimum of the energy.
  • a dynamic correlation window may be selected. Correlations may be measured for each pixel along the profile line to get the best match. In this way, 3D profile of wire bonding at widest bonding width location may be reconstructed. This profile may be compared (for example using absolute sum of difference or correlation method, or other methods) with the reference bonding profile derived from measurement of a good wire. If the difference is big, the device may be rejected as bad bonding wire.
  • a laser thermal sensor based heat detection subsystem may be used to detect the lifted wire.
  • the methods of heating may include placing heaters (in various embodiments for example non contact; in various embodiments for example contact) under the leadframe and heating up the bonded leadframes to an acceptable temperature level, or projecting a focused heat source on the wires.
  • When applying a heat source under the leadframe, the heat may be transferred from the leadframe to the die and bonding places, if a wire is bonded well. Hence the temperature may be higher for a good bond than for a lifted or partially lifted wire.
  • When applying a heat source from above, heat may be dissipated from the heated spot of the wire through the wire to the die pads and leadframe, under the condition that the wire is bonded well. Thus the temperature may be lower for a good bond as compared to a lifted or partially lifted wire.
  • a wire bond 3D reconstruction camera system may include two or more single-image cameras, each of which may include a CCD sensor and a fixed focal length lens system or, alternatively, a CMOS image sensor and such a lens system.
  • the single-image cameras may simultaneously take one image each. These images may be referred to as an image pair. With the aid of the images taken, a direct distance measurement of objects visible in both images may be made.
  • each camera's parameters, such as camera (and lens) distortion, camera location and orientation, image centre, y/x ratio etc., may be derived from a process of camera calibration.
  • FIG. 30 shows a flow diagram 3000 illustrating calibrating 2D inspection in accordance with an embodiment.
  • one or more snap(s) of a static calibration target may be performed.
  • if more snaps of the calibration target are performed, an averaging process may be used to calculate an average image for later use.
  • a subpixel target centroid determination function may be applied to calculate all target centroids.
  • centroids may be calculated by using zero-crossing or multi-cut threshold methods. After target feature detection, world coordinates which may correspond to the image coordinates of the target features may be calculated.
  • global camera parameters may be calculated from derived target points with known world and image coordinates.
  • Various camera models such as direct linear transformation for 2D (DLT 2D), affine, or Tsai 2D may be used.
  • DLT 2D direct lineai- transformation for 2D
  • affine for instance, 9 camera parameters may be calculated from calibration targets.
  • local camera parameters may be calculated from derived target points with known world and image coordinates.
  • local calibration may calculate different camera parameters at different image locations.
  • 3x3 or 5x5 calibration targets nearest to the centre pixel may be used to determine the localized calibration parameters.
  • various camera models such as DLT (direct linear transform) 2D, affine, or Tsai 2D may be used.
  • 9 localized camera parameters may be calculated from calibration targets.
  • equations may be determined, based on the camera model used, for the image-to-world coordinate transformation and vice versa.
  • the calibration result may be stored.
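As a rough illustration of the 2D calibration flow above, the sketch below computes subpixel target centroids by intensity weighting and fits a 9-parameter DLT 2D model (a planar homography) that maps image coordinates to world coordinates; local calibration could repeat the same fit using only the 3x3 or 5x5 targets nearest a given pixel. The intensity-weighted centroid, the function names, and the least-squares formulation are assumptions for illustration; the patent equally allows zero-crossing or multi-cut threshold centroiding and other camera models such as affine or Tsai 2D.

```python
import numpy as np

def subpixel_centroid(patch):
    """Intensity-weighted centroid of a calibration-target patch (subpixel)."""
    patch = patch.astype(np.float64)
    total = patch.sum() + 1e-12
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    return (xs * patch).sum() / total, (ys * patch).sum() / total

def fit_dlt_2d(image_pts, world_pts):
    """Fit the 9-parameter DLT 2D model (homography H) with world ~ H * image.

    image_pts, world_pts: (N, 2) arrays of corresponding coordinates, N >= 4.
    H is recovered as the null vector of the standard homogeneous system.
    """
    rows = []
    for (u, v), (x, y) in zip(image_pts, world_pts):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=np.float64))
    return vt[-1].reshape(3, 3)

def image_to_world(H, u, v):
    """Apply the fitted model: image pixel (u, v) -> world plane (x, y)."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w
```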
  • FIG. 31 shows a flow diagram 3100 illustrating calibrating 3D inspection in accordance with an embodiment.
  • calibrating all or some cameras for 3D measurement may be started.
  • one or more snap(s) of a static calibration target in each view may be performed. If more snaps of the calibration target are performed, an averaging process may be used to calculate an average image for later use.
  • the calibration target may need to be moved to a different height, so that target features are imaged at different heights, in order to make the 3D calibration accurate.
  • a three dimensional calibration target with staircase design may be used. In this case the calibration target may have two (or more) levels. All feature points on level 1 may have z height value zl and all feature points on level 2 may have z height value z2.
  • 3D calibration may be performed without moving calibration target up and down.
  • a subpixel target centroid determination function may be applied to calculate all target centroids in all views. Centroids may be calculated by using zero-crossing or multi-cut threshold methods. After target feature detection, the world coordinates which may correspond to the image coordinates of the target features may be calculated.
  • in 3108, global camera parameters may be calculated from the derived target points with known world and image coordinates. Various camera models such as DLT (direct linear transformation), affine, Tsai coplanar, Tsai non-coplanar, or other models may be used. For the DLT model, for instance, 11 camera parameters may be calculated from the calibration targets.
  • local camera parameters may be calculated from derived target points with known world and image coordinates.
  • 3x3 or 5x5 calibration targets nearest to the centre pixel may be used to determine the localized calibration parameters.
  • equations may be determined, based on the camera model used, for the image-to-world coordinate transformation and vice versa.
  • a 3D calibration result may be stored.
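For the 3D calibration flow above, the sketch below fits the 11-parameter DLT camera model from target centroids whose world coordinates are known, for example at the two z-levels z1 and z2 of a staircase calibration target. It is a minimal least-squares formulation under an assumed convention (the projection matrix is scaled so that its last element is 1); the patent equally allows affine, Tsai coplanar, Tsai non-coplanar, or other models, and localized parameters could again be fitted from the nearest 3x3 or 5x5 targets.

```python
import numpy as np

def fit_dlt_3d(world_pts, image_pts):
    """Fit the 11-parameter DLT camera model from 3D-2D correspondences.

    world_pts: (N, 3) target centres with known (x, y, z); with a staircase
    target, z is z1 on level 1 and z2 on level 2.  image_pts: (N, 2) pixels.
    Returns a 3x4 projection matrix whose last element is fixed to 1
    (11 free parameters); requires N >= 6 non-coplanar points.
    """
    A, b = [], []
    for (x, y, z), (u, v) in zip(world_pts, image_pts):
        A.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z])
        b.append(u)
        A.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z])
        b.append(v)
    params, *_ = np.linalg.lstsq(np.asarray(A, dtype=np.float64),
                                 np.asarray(b, dtype=np.float64), rcond=None)
    return np.append(params, 1.0).reshape(3, 4)  # rows: p1x, p2x, [p31 p32 p33 1]
```

The projection matrices fitted for two or more cameras can then be used directly in the triangulation sketch given earlier to recover 3D wire points.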

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)

Abstract

One embodiment of the invention relates to a method for examining a bonding structure of a substrate. The method may include: providing a substrate including a bonding structure to be examined; determining one or more stereoscopic images including one or more predefined regions of interest of portions of the bonding structure to be examined, the predefined regions of interest corresponding to reference regions of interest in a stored predetermined reference model of a reference bonding structure of a reference substrate; determining one or more two-dimensional feature parameters and one or more three-dimensional feature parameters of the bonding structure to be examined from one or more of a plurality of the predefined regions of interest; and determining whether or not the determined feature parameters satisfy at least one predefined quality criterion with respect to the reference model.
PCT/SG2010/000042 2009-02-06 2010-02-08 Procédés d'examen d'une structure de liaison d'un substrat et dispositifs d'inspection de structure de liaison WO2010090605A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SG2011052479A SG173068A1 (en) 2009-02-06 2010-02-08 Methods for examining a bonding structure of a substrate and bonding structure inspection devices
CN201080015400.XA CN102439708B (zh) 2009-02-06 2010-02-08 检查基板的接合结构的方法和接合结构检验设备

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15068609P 2009-02-06 2009-02-06
US61/150,686 2009-02-06

Publications (1)

Publication Number Publication Date
WO2010090605A1 true WO2010090605A1 (fr) 2010-08-12

Family

ID=42542308

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2010/000042 WO2010090605A1 (fr) 2009-02-06 2010-02-08 Procédés d'examen d'une structure de liaison d'un substrat et dispositifs d'inspection de structure de liaison

Country Status (4)

Country Link
CN (1) CN102439708B (fr)
MY (1) MY169616A (fr)
SG (1) SG173068A1 (fr)
WO (1) WO2010090605A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103364398A (zh) * 2012-03-29 2013-10-23 株式会社高永科技 接头检查装置
CN104504386A (zh) * 2014-12-08 2015-04-08 深圳市浦洛电子科技有限公司 一种模块化aoi定位方法、系统及烧录ic设备
CN104538336A (zh) * 2015-01-07 2015-04-22 海太半导体(无锡)有限公司 一种用于半导体封装的设备的报警识别与处理系统及方法
WO2015088658A2 (fr) 2013-12-11 2015-06-18 Fairchild Semiconductor Corporation Ponteuse intégrée et système de mesurage 3d avec rejet de défauts
WO2015171459A1 (fr) * 2014-05-05 2015-11-12 Alcoa Inc. Appareil et procédés pour mesure de soudure
WO2017117566A1 (fr) * 2015-12-31 2017-07-06 Industrial Dynamics Company, Ltd. Système et procédé d'inspection de récipients à l'aide de multiples images de ceux-ci
US10776912B2 (en) 2016-03-09 2020-09-15 Agency For Science, Technology And Research Self-determining inspection method for automated optical wire bond inspection
CN112308073A (zh) * 2020-11-06 2021-02-02 中冶赛迪重庆信息技术有限公司 废钢火车装卸料转载状态识别方法、系统、设备及介质
US20220115253A1 (en) * 2020-10-14 2022-04-14 Emage Equipment Pte. Ltd. Loop height measurement of overlapping bond wires

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SG2013084975A (en) * 2013-11-11 2015-06-29 Saedge Vision Solutions Pte Ltd An apparatus and method for inspecting asemiconductor package
US9759547B2 (en) * 2014-08-19 2017-09-12 The Boeing Company Systems and methods for fiber placement inspection during fabrication of fiber-reinforced composite components
CN104483331A (zh) * 2014-12-03 2015-04-01 东莞市神州视觉科技有限公司 一种连接器插针三维检测方法、装置及系统
DE102015109431A1 (de) * 2015-06-12 2016-12-15 Witrins S.R.O. Inspektionssystem und Verfahren zur Fehleranalyse von Drahtverbindungen
CN105374700B (zh) * 2015-11-26 2018-01-16 北京时代民芯科技有限公司 一种高密度集成电路键合精度控制方法
JP6673268B2 (ja) * 2017-03-14 2020-03-25 オムロン株式会社 管理装置、管理装置の制御方法、情報処理プログラム、および記録媒体
CN110030923B (zh) * 2018-01-12 2021-09-28 联合汽车电子有限公司 连接器Pin针检测系统及其检测方法
CN109003911B (zh) * 2018-08-02 2021-07-20 安徽大华半导体科技有限公司 一种半导体芯片引脚成型缺陷检测的方法
CN111192233B (zh) * 2018-11-14 2022-04-12 长鑫存储技术有限公司 半导体结构的制备方法及其制备装置
CN110346381B (zh) * 2019-08-12 2022-03-08 衡阳师范学院 一种光学元件损伤测试方法及装置
JP6996699B2 (ja) * 2020-01-14 2022-01-17 トヨタ自動車株式会社 バルブ当たり面の検査方法及び検査装置
JP7373436B2 (ja) * 2020-03-09 2023-11-02 ファスフォードテクノロジ株式会社 ダイボンディング装置および半導体装置の製造方法
CN113611036B (zh) * 2021-07-15 2022-12-06 珠海市运泰利自动化设备有限公司 一种精密测试自动校准方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0466562B1 (fr) * 1990-07-04 1995-10-11 Fujitsu Limited Méthode et appareil pour mesurer la configuration, tridimentionnelle d'un objet filaire en temps réduit
WO2000026850A1 (fr) * 1998-11-05 2000-05-11 Cyberoptics Corporation Appareil d'assemblage electronique pourvu d'un capteur a balayage lineaire a vision stereo
US20020054299A1 (en) * 2000-09-22 2002-05-09 Visicon Inspection Technologies Llc, A Corporation Of The State Of Connecticut Three dimensional scanning camera

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6072898A (en) * 1998-01-16 2000-06-06 Beaty; Elwin M. Method and apparatus for three dimensional inspection of electronic components
SG73563A1 (en) * 1998-11-30 2000-06-20 Rahmonic Resources Pte Ltd Apparatus and method to measure three-dimensional data

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0466562B1 (fr) * 1990-07-04 1995-10-11 Fujitsu Limited Méthode et appareil pour mesurer la configuration, tridimentionnelle d'un objet filaire en temps réduit
WO2000026850A1 (fr) * 1998-11-05 2000-05-11 Cyberoptics Corporation Appareil d'assemblage electronique pourvu d'un capteur a balayage lineaire a vision stereo
US20020054299A1 (en) * 2000-09-22 2002-05-09 Visicon Inspection Technologies Llc, A Corporation Of The State Of Connecticut Three dimensional scanning camera

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Proceedings of the 17th International Conference on Production Research (ICPR-2003), Blacksburg, Virginia, August 2003", article PERNG ET AL.: "A new wire bonding inspection system by machine vision" *
ONG ET AL.: "3D visual inspection of IC bonding wires", PROCEEDINGS OF THE SPIE, vol. 3185, 25 June 1997 (1997-06-25), SINGAPORE, pages 68 - 77 *
PERNG ET AL.: "Illumination system for wire bonding inspection", APPLIED OPTICS, vol. 46, 6 February 2007 (2007-02-06), pages 845 - 854 *
YE ET AL.: "A stereo vision system for the inspection of IC bonding wires", INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, vol. 11, 2000, pages 254 - 262 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103364398A (zh) * 2012-03-29 2013-10-23 株式会社高永科技 接头检查装置
CN106908443B (zh) * 2012-03-29 2020-12-08 株式会社高迎科技 接头检查装置
CN106908443A (zh) * 2012-03-29 2017-06-30 株式会社高永科技 接头检查装置
WO2015088658A2 (fr) 2013-12-11 2015-06-18 Fairchild Semiconductor Corporation Ponteuse intégrée et système de mesurage 3d avec rejet de défauts
WO2015171459A1 (fr) * 2014-05-05 2015-11-12 Alcoa Inc. Appareil et procédés pour mesure de soudure
US9927367B2 (en) 2014-05-05 2018-03-27 Arconic Inc. Apparatus and methods for weld measurement
CN104504386A (zh) * 2014-12-08 2015-04-08 深圳市浦洛电子科技有限公司 一种模块化aoi定位方法、系统及烧录ic设备
CN104538336A (zh) * 2015-01-07 2015-04-22 海太半导体(无锡)有限公司 一种用于半导体封装的设备的报警识别与处理系统及方法
WO2017117566A1 (fr) * 2015-12-31 2017-07-06 Industrial Dynamics Company, Ltd. Système et procédé d'inspection de récipients à l'aide de multiples images de ceux-ci
US10776912B2 (en) 2016-03-09 2020-09-15 Agency For Science, Technology And Research Self-determining inspection method for automated optical wire bond inspection
US20220115253A1 (en) * 2020-10-14 2022-04-14 Emage Equipment Pte. Ltd. Loop height measurement of overlapping bond wires
US11721571B2 (en) * 2020-10-14 2023-08-08 Emage Vision Pte. Ltd. Loop height measurement of overlapping bond wires
CN112308073A (zh) * 2020-11-06 2021-02-02 中冶赛迪重庆信息技术有限公司 废钢火车装卸料转载状态识别方法、系统、设备及介质
CN112308073B (zh) * 2020-11-06 2023-08-25 中冶赛迪信息技术(重庆)有限公司 废钢火车装卸料转载状态识别方法、系统、设备及介质

Also Published As

Publication number Publication date
CN102439708B (zh) 2016-03-02
SG173068A1 (en) 2011-08-29
MY169616A (en) 2019-04-23
CN102439708A (zh) 2012-05-02

Similar Documents

Publication Publication Date Title
WO2010090605A1 (fr) Procédés d'examen d'une structure de liaison d'un substrat et dispositifs d'inspection de structure de liaison
US10876975B2 (en) System and method for inspecting a wafer
EP2387796B1 (fr) Système et procédé d'inspection de tranche
JP5672240B2 (ja) ウェーハを検査するためのシステム及び方法
EP3223001B1 (fr) Système et procédé de capture d'éclairage se reflétant dans plusieurs directions
JP7135418B2 (ja) 平坦度検出方法、平坦度検出装置及び平坦度検出プログラム
SG185301A1 (en) System and method for inspecting a wafer

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080015400.X

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10738831

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 12011501491

Country of ref document: PH

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10738831

Country of ref document: EP

Kind code of ref document: A1