EP1638050A2 - Method and apparatus for substrate inspection - Google Patents


Info

Publication number
EP1638050A2
Authority
EP
European Patent Office
Prior art keywords
image
substrate
standard
target
inspection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP05019260A
Other languages
German (de)
English (en)
Other versions
EP1638050B1 (fr)
EP1638050A3 (fr)
Inventor
Kiyoshi Murakami (Omron Corp.)
Yasunori Asano (Omron Corp.)
Takashi Kinoshita (Omron Corp.)
Teruhisa Yotsuya
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omron Corp
Original Assignee
Omron Corp
Omron Tateisi Electronics Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omron Corp, Omron Tateisi Electronics Co filed Critical Omron Corp
Publication of EP1638050A2 publication Critical patent/EP1638050A2/fr
Publication of EP1638050A3 publication Critical patent/EP1638050A3/fr
Application granted granted Critical
Publication of EP1638050B1 publication Critical patent/EP1638050B1/fr
Not-in-force legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/0008 Industrial image inspection checking presence/absence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30141 Printed circuit board [PCB]

Definitions

  • This invention relates to a method of and an apparatus for using an image taken of a component mounting substrate (herein referred to simply as a substrate) to inspect the presence or absence of components, their positional displacements and appropriateness of soldering.
  • the present assignee of this invention has developed a substrate inspection apparatus for automatically inspecting mounted conditions and soldering conditions of components.
  • This inspection apparatus is provided with a substrate stage for supporting a target substrate which is the object of inspection, a camera for taking an image of the substrate and a table part (including both an X-axis part and a Y-axis part) for moving the substrate stage and the camera on a horizontal plane.
  • each table part is controlled so as to match the field of vision of the camera with a specified area on the substrate.
  • An image obtained under this condition is processed in order to make measurements that are necessary for the inspection of target portions to be inspected inside this specified area and measured values thus obtained are compared with specified standard values.
  • a process referred to as the teaching process is usually carried out prior to the inspection wherein inspection data of various kinds are prepared and registered in a memory.
  • the so-called setting data of inspection areas that are set at each target inspection portion are included in the inspection data.
  • inspection areas are set on the image of the target substrate to be inspected (hereinafter referred to as the target image) based on the aforementioned setting data and the inspection of each target inspection portion is carried out by processing the image of each inspection area.
  • positioning marks are attached to appropriate positions on the substrate, these marks are extracted from the image of the substrate after it is transported into the image-taking position, and the X-axis and the Y-axis parts are controlled such that these extracted positions will match preliminarily specified reference positions.
  • Japanese Patent Publication Tokkai 9-15302 discloses a technology similar to the one described above, according to which two positioning patterns are extracted from an image of the substrate, the position of the substrate stage is adjusted such that the middle point of the line segment connecting these two extracted positions will match the middle point of the line segment connecting reference positions of these patterns and the substrate stage is further rotated by the angle between these two line segments.
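The two-mark positioning scheme described above can be sketched as follows. This is only an illustration of the geometry; the function name and the coordinate conventions are assumptions, not taken from the cited publication.

```python
import math

def two_mark_correction(marks, refs):
    """Given the extracted positions of two positioning marks and their
    reference positions, return the translation that brings the midpoint
    of the mark pair onto the reference midpoint and the rotation angle
    between the two connecting line segments."""
    (x1, y1), (x2, y2) = marks
    (r1x, r1y), (r2x, r2y) = refs
    # translation of the midpoint of the connecting line segment
    shift = ((r1x + r2x - x1 - x2) / 2, (r1y + r2y - y1 - y2) / 2)
    # angle between the extracted and the reference line segments
    angle = math.atan2(r2y - r1y, r2x - r1x) - math.atan2(y2 - y1, x2 - x1)
    return shift, angle
```

A stage controller would then translate the stage by `shift` and rotate it by `angle`.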
  • the invention therefore relates firstly to a method of inspecting a plurality of target portions on a substrate having components mounted thereon by using an image taken of this substrate by image taking means and this method is characterized as comprising preparation steps and inspection steps.
  • the preparation steps are what correspond to so-called teaching steps and include Steps A, B, C and D that are characterized as follows.
  • In Step A, a standard image, which is defined as an image that becomes obtainable when the field of vision of this image taking means matches a specified area on the substrate, is obtained.
  • This standard image may be obtained by dividing a standard substrate with a high quality into a plurality of areas to be photographed and joining together the images thus obtained for the individual areas to obtain a whole image of this standard substrate.
  • the image taking means may be position-matched at a specified position of the standard substrate to photograph it to obtain such an image. If the substrate is small enough such that its whole can be put within the field of vision of the image taking means, the whole image of this substrate may be set as the standard image.
  • In Step B, a positional relationship between the image taking means and the substrate that is necessary for position-matching the image taking means to the aforementioned specified area is obtained.
  • This positional relationship may be expressed as position data of the area (hereinafter referred to as the target area) on the substrate corresponding to the field of vision of the image taking means.
  • Such position data may be expressed also as the relative coordinates of a specified position within this target area as seen from a specified reference point on the substrate. In such a situation, if a whole image of the substrate is created, the relative coordinates with respect to the reference point on the whole image may be obtained. Since the size of the target area is determined by the field of vision of the image taking means, coordinates of any specified single point within the target area may be obtained but this is not necessary.
  • the position and the size of the target area may be expressed, for example, by obtaining the relative coordinates of the top left-hand and bottom right-hand corners of the target area. This positional relationship may be expressed also as the coordinates of a table part for adjusting these positions.
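As a sketch of the corner-coordinate representation just described (the function name and the coordinate convention are assumptions, not from the patent):

```python
def target_area_corners(ref_point, area_top_left, fov_size):
    """Express a target area by the relative coordinates of its top
    left-hand and bottom right-hand corners as seen from a reference
    point on the substrate; the area's size equals the camera's field
    of vision."""
    rx, ry = ref_point
    ax, ay = area_top_left
    w, h = fov_size
    return (ax - rx, ay - ry), (ax - rx + w, ay - ry + h)
```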
  • In Step C, an inspection area is set at a target portion on the standard image.
  • This setting step is preferably carried out by using design data (such as CAD data) of the substrate.
  • the positional relationship obtained in Step B may be used to extract, from the design data of the substrate, the data inside the area corresponding to the standard image and to determine the position and the size of an inspection area suitable to a target portion to be inspected shown by the extracted data.
  • Each inspection area may be set by displaying the standard image on a monitor to allow the user to carry out a setting operation.
  • In Step D, the standard image obtained in Step A, the positional relationship obtained in Step B between the image taking means and the substrate and the positional relationship set in Step C between the inspection area and the standard image are correlated and registered.
  • For the inspection area set in Step C, it is preferable to register data indicative of the size of the inspection area in addition to the positional relationship with the standard image.
  • the width of each side of the inspection area may be expressed as its size.
  • Relative coordinates of the top left-hand corner and the bottom right-hand corner of the inspection area may be obtained with respect to a specified point on the standard image and the position and the size of the inspection area may be expressed by these coordinates.
  • the inspection steps include the following first, second, third and fourth steps.
  • In the first step, a target image for inspection is created by position-matching the image taking means to a target substrate for inspection based on the positional relationship registered in Step D between the image taking means and the substrate.
  • the position-matching of the image taking means may be effected by moving a stage part of at least either one of the image taking means and the substrate with respect to the other but if there is an error in the amount of the movement, an image of the same area as the standard image cannot be obtained and there is a possibility of positional displacement of the target image with respect to the standard image.
  • In the second step, displacement values of the target image with respect to the standard image are detected by matching the target image created in the first step with the standard image registered in Step D.
  • This step may preferably be carried out by scanning the target image or the standard image with respect to the other, carrying out a matching process at each specified interval (preferably for each pixel) by a correlation calculation or a gradation difference calculation and detecting the displacement at which the degree of similarity between the two images becomes a maximum.
  • Although the image of the target substrate may contain defective portions, the remaining portions are close to the standard image.
  • the degree of similarity may be considered to assume a maximum value when the standard image and the target image are in a matching positional relationship and displacement values can be detected accurately.
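A minimal sketch of this matching step, using a brute-force correlation scan over candidate offsets; the search range, the function name and the NumPy-based formulation are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def detect_displacement(standard, target, max_shift=3):
    """Scan the target image against the standard image over a small
    range of offsets and return the (dx, dy) at which the normalized
    correlation of the overlapping regions is maximal."""
    best, best_score = (0, 0), -np.inf
    h, w = standard.shape
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # regions where standard[y, x] overlaps target[y + dy, x + dx]
            s = standard[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
            t = target[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            s0, t0 = s - s.mean(), t - t.mean()
            denom = np.sqrt((s0 ** 2).sum() * (t0 ** 2).sum())
            score = (s0 * t0).sum() / denom if denom else 0.0
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best
```

Because defects change only a small fraction of the pixels, the correlation peak still falls at the true offset, which is the point made in the bullet above.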
  • In the third step, the positional relationship registered in Step D between the inspection area and the standard image is corrected by the displacement values detected in the second step and an inspection area is set on the target image based on the corrected positional relationship.
  • the coordinates indicative of the position of the inspection area with respect to the standard image may be changed by the displacement values to set the inspection area at the position of the coordinates after they were moved.
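The correction in this third step amounts to translating each registered inspection area by the detected displacement; a minimal sketch, with the corner-pair representation assumed:

```python
def correct_inspection_area(area, displacement):
    """Shift an inspection area, registered as ((x1, y1), (x2, y2))
    relative to the standard image, by the displacement values detected
    for the target image."""
    (x1, y1), (x2, y2) = area
    dx, dy = displacement
    return (x1 + dx, y1 + dy), (x2 + dx, y2 + dy)
```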
  • In the fourth step, image data in the inspection area set in the third step are used to carry out image processing for the inspection.
  • This image processing may be carried out according to the prior art technology, say, as disclosed in Japanese Patent Publication Tokkai 2003-222598, or various inspection data for this fourth step, such as the kinds of programs to be carried out, parameters such as threshold values for binarization, reference values for measured data and threshold values for judgment processes, may be registered for each inspection area in the preparation steps.
  • Step A is characterized as including Steps A1, A2, A3 and A4 and Step C is characterized as including Steps C1 and C2
  • Step A1 is for dividing a standard substrate of a good quality into a plurality of target areas with a size according to the field of vision of the image taking means and carrying out an image taking process for each of these areas
  • Step A2 is for creating a whole image of the standard substrate by joining the images created for the individual areas in Step A1
  • Step A3 is for setting on this whole image an area that includes a specified number of target portions to be inspected and has a size according to the field of vision of the image taking means
  • Step A4 is for setting an image inside an area set in Step A3 as the standard image
  • Step C1 is for extracting substrate design data in an area corresponding to the standard image set in Step A4 from substrate design data corresponding to the standard substrate
  • Step C2 is for determining setting conditions for an inspection area corresponding to said standard image by using said extracted substrate design data.
  • Step A1 is preferably carried out by setting the target areas such that they overlap by a width corresponding to the error that takes place when the aforementioned stage part is being operated.
  • Step A2 is preferably carried out by creating the whole image by sequentially joining the individual images at their overlapping portions. A more appropriate whole image can be created if a pattern matching is carried out at the time of this overlapping process by using image data of these overlapping portions and the correspondence relationship among the images is adjusted based on the results of this matching process.
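The joining-with-overlap idea can be sketched for one horizontally adjacent pair of tiles. The nominal overlap, the tolerance and the correlation criterion here are illustrative assumptions, not values from the patent:

```python
import numpy as np

def stitch_pair(left, right, nominal_overlap, tol=2):
    """Join two horizontally adjacent tile images whose true overlap may
    differ from the nominal one by up to `tol` pixels because of stage
    positioning error; the actual overlap width is chosen where the
    overlapping columns match best."""
    best, best_score = nominal_overlap, -np.inf
    for ov in range(nominal_overlap - tol, nominal_overlap + tol + 1):
        a, b = left[:, -ov:], right[:, :ov]   # candidate overlapping strips
        a0, b0 = a - a.mean(), b - b.mean()
        denom = np.sqrt((a0 ** 2).sum() * (b0 ** 2).sum())
        score = (a0 * b0).sum() / denom if denom else 0.0
        if score > best_score:
            best_score, best = score, ov
    return np.hstack([left, right[:, best:]])
```

A full whole-image builder would apply the same matching to every horizontal and vertical seam.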
  • the substrate design data of the area corresponding to the standard image can be extracted based on the positional relationship of the standard image with respect to the whole image after the coordinate systems of the whole image created in Step A and the substrate design data are matched based on a specified reference point on the substrate (such as the center point of a positioning mark on the substrate).
  • a target portion to be inspected can be extracted from the extracted substrate design data and an inspection area suitable to this target portion may be set such that the position and the size of this area can be used as setting conditions of the inspection area.
  • In Step D of the first example, only the standard image set in Step A4 may be cut out from the whole image of the standard substrate and registered, but the whole image may be registered instead.
  • Step A3 may include setting on the whole image of the standard substrate areas corresponding to the areas used in the image taking processing in Step A1 and Step A4 may include setting images of individual ones of these areas set in Step A3 as individual standard images.
  • an image taking process can be carried out for the inspection also under the same conditions as at the time of creating the whole image such that the target substrate can be inspected as a whole.
  • the standard images obtained for the individual areas may be registered as synthesized into a whole image.
  • Step A3 may include determining setting conditions for an inspection area corresponding to a specified target portion on the substrate based on substrate design data corresponding to the standard substrate and setting an area that includes the inspection area according to these setting conditions.
  • an inspection becomes possible by creating an image including the whole of this target portion and hence the degree of freedom in inspection improves.
  • Efficiency of inspection can be further improved if a map image showing the distribution of inspection areas over the entire substrate is created from the substrate design data and a process of assigning target areas is carried out.
  • Step A is characterized as including Steps A1 and A2 and Step C is characterized as including Steps C1, C2 and C3
  • Step A1 is for determining setting conditions for an inspection area for a specified target portion on the substrate based on design data of this substrate
  • Step A2 is for obtaining an image of a standard substrate of a good quality by position-matching the image taking means such that the field of vision of the image taking means includes the inspection area according to the setting condition determined in Step A1 and carrying out an image taking process and using the obtained image as the standard image
  • Step C1 is for detecting a target portion on the standard image
  • Step C2 is for detecting displacement values of the target portion with respect to the target area when the target area is set based on the setting condition determined in Step A1
  • Step C3 is for correcting setting position of the inspection area based on the displacement values detected in Step C2.
  • the standard image can be obtained without creating any whole image but by using substrate design data to determine a target area at an arbitrary position on the substrate and position-matching the image taking means to this target area on the standard substrate. If this method is used, the field of vision of the image taking means may be displaced from the target area and a correct standard image may not be obtained but this problem can be coped with by the processes of Steps C1 and C2.
  • a target portion to be inspected can be extracted from the standard image by a process such as binarization and edge extraction according to its characteristics.
  • a target portion may be extracted from substrate design data or by displaying the standard image on the monitor and specifying a target portion on the display screen.
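A simple stand-in for the binarization-based extraction mentioned above; the threshold value and the assumption that the target portion is brighter than its surroundings are illustrative:

```python
import numpy as np

def extract_target_portion(gray, threshold):
    """Binarize a gray-scale image and return the bounding box and the
    centroid of the pixels above the threshold, as a stand-in for
    target-portion extraction."""
    ys, xs = np.nonzero(gray > threshold)
    bbox = (xs.min(), ys.min(), xs.max(), ys.max())  # (x1, y1, x2, y2)
    centroid = (xs.mean(), ys.mean())
    return bbox, centroid
```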
  • In Step C2, displacement values at the time when the positional relationship becomes suitable for creating the standard image (or when each extracted target portion on the standard image becomes contained in an inspection area) can be detected by creating a map image showing the distribution of inspection areas based on the setting conditions determined in Step A1 and scanning this image with respect to the standard image.
  • Efficiency and the degree of freedom of inspection can be improved also by the second example because it is made possible to carry out the inspection by setting a target area at any position on the substrate.
  • the invention relates secondly to an apparatus for inspecting a plurality of target portions on a substrate having components mounted thereon by using an image taken of the substrate, comprising a substrate stage for supporting a target substrate to be inspected with the target portions facing upward, image taking means for photographing a substrate supported by this substrate stage from above, moving means for changing the relative horizontal position between the image taking means and the substrate stage, a memory that registers a standard image obtained when the field of vision of the image taking means is matched to a specified area on the target substrate, the positional relationship between the image taking means and the substrate when the standard image is obtained and the positional relationship of each inspection area with respect to the standard image, position adjusting means for controlling motion of the moving means such that the positional relationship between the image taking means and the substrate is adjusted to be the same as the positional relationship registered in the memory, image creating means for creating a target image to be processed by operating the image taking means after adjustment is made by the position adjusting means, displacement detecting means for detecting displacement values of the target image with respect to the standard image by matching the target image with the standard image, area setting means for setting inspection areas on the target image by correcting the registered positional relationship between the inspection areas and the standard image by the detected displacement values, and image processing means for carrying out image processing for the inspection by using image data in the inspection areas thus set.
  • the moving means corresponds to the aforementioned table control parts and may be provided both to the substrate stage and to the image taking means.
  • the table control part for the image taking means may be adapted to move in one horizontal direction and that for the substrate stage in another horizontal direction perpendicular to it.
  • a table control part may be provided to only one of them and adapted to move in one of these two horizontal directions.
  • a rotary mechanism may further be provided to these table control parts.
  • the memory may be used to register the standard image created in the preparation steps. It is preferable that a non-volatile memory be used.
  • the position adjusting means, the image creating means, the displacement detecting means, the area setting means and the image processing means may each be formed as a computer having a program installed therein.
  • the aforementioned first, second, third and fourth steps of the method of this invention may be considered to be carried out by the program installed respectively in the position adjusting and image creating means, the displacement detecting means, the area setting means and the image processing means.
  • Programs each corresponding to one of these means may all be installed in one computer.
  • the aforementioned memory may be provided in such computer or externally to such a computer.
  • the apparatus of this invention may be further provided with means for creating data to be registered to the memory, as will be shown below.
  • the memory may be adapted to register a whole image or setting data of an inspection area created by another computer.
  • Apparatus of this invention for carrying out the aforementioned first example of the method of this invention may be characterized as further comprising image taking control means for controlling the moving means and the image taking means such that, when a standard substrate with a good quality is supported on the substrate stage, the standard substrate is photographed by being divided into a plurality of areas with a size corresponding to the field of vision of the image taking means, image synthesizing means for synthesizing images obtained by a control by the image taking control means to obtain a whole image of the standard substrate, standard image extracting means for setting on the whole image an area including a specified number of target portions to be inspected and having a size corresponding to the field of vision of the image taking means and extracting an image in the area as the standard image, condition determining means for determining setting conditions of inspection areas for the standard image based on substrate design data inside the area where the standard image is extracted, and registering means for registering the standard image in the memory, extracting the positional relationship between the inspection area based on the setting conditions determined by the condition determining means and the standard image, and registering this positional relationship in the memory together with the positional relationship between the standard image and the image taking means.
  • Each of the means described above may be realized by a computer having installed therein a program for carrying out the process for the corresponding means.
  • the image taking control means serves to carry out Step A1 for the method of the first example, the image synthesizing means serves to carry out Step A2 and the standard image extracting means serves to carry out Steps A3 and A4.
  • the condition determining means serves to carry out Step C1 and the registering means serves to carry out not only Step C2 but also the process of extracting the positional relationship between the standard image and the image taking means and registers the results of these processes in the memory.
  • the positional relationship between the standard image and the image taking means can be extracted as that between the target area set by the standard image extracting means and the whole image and may be obtained as the relative coordinates of a specified point in the target area with respect to a specified standard point on the whole image.
  • Apparatus of this invention for carrying out the aforementioned second example of the method of this invention may be characterized as further comprising condition determining means for determining setting conditions of an inspection area for a specified target portion on the substrate based on design data of the substrate, positional relationship determining means for determining the positional relationship between the image taking means and the substrate such that the inspection area according to the conditions determined by the condition determining means is included in the field of vision of the image taking means, image taking control means for controlling the moving means and the image taking means such that, when a standard substrate with a good quality is supported on the substrate stage, the substrate is photographed while being in the positional relationship determined by the positional relationship determining means with respect to the image taking means, detecting means for detecting the target portion on an image obtained by control by the image taking control means and displacement values of the target portion with respect to the inspection area when the inspection area is set on the image by the setting conditions determined by the condition determining means, correcting means for correcting the setting position of the inspection area based on the displacement values detected by the detecting means, and registering means for registering in the memory the image obtained of the standard substrate as the standard image, the setting position of the inspection area as corrected by the correcting means and the positional relationship determined by the positional relationship determining means.
  • Each of the means described above may also be realized by a computer having installed therein a program for carrying out the process for the corresponding means.
  • the condition determining means serves to carry out Step A1 for the method of the second example, the positional relationship determining means serves to carry out Step A2, the detecting means serves to carry out Steps C1 and C2 and the correcting means serves to carry out Step C3.
  • a target image for processing is matched with a preliminarily prepared whole image of the substrate to extract the relative positional relationship between the two images and an inspection area necessary for the target image is set based on the results of this extraction and the correspondence relationship between the whole image and the inspection area.
  • Fig. 1 shows a substrate inspection apparatus embodying this invention, adapted to process an image taken of a target substrate to be inspected and to judge appropriateness of the mounted conditions of the components mounted to the substrate as well as its soldering condition and comprising an image taking part 3, a light emitting part 4, a control part 5, an X-axis table control part 6 and a Y-axis table control part 7.
  • symbol 1T indicates the target substrate to be inspected
  • symbol 1S indicates a standard substrate with mounted components in good conditions and to be used for the teaching process prior to the inspection process.
  • the Y-axis table control part 7 is provided with a conveyer 7A for supporting the substrates 1T and 1S.
  • the conveyer 7A is moved by means of a motor (not shown) in order to transport the substrates 1T and 1S in the direction of the Y-axis (perpendicular to the paper in the figure).
  • the X-axis table control part 6 supports the image taking part 3 and the light emitting part 4 above the Y-axis table control part 7 and serves to move them in the direction of the X-axis (the left-right direction in the figure).
  • the light emitting part 4 has three circular ring-shaped light sources 8, 9 and 10 with different diameters and adapted to emit red light, green light and blue light, respectively. As their centers are positioned exactly above the observation position, they will be at different angles of elevation as seen from the substrate 1T or 1S.
  • the image taking part 3 comprises a CCD camera 3A (hereinafter referred to simply as the camera) for generating a color image, with its optical axis matching the center of the light sources 8, 9 and 10 and oriented vertically such that the reflected light from the substrate 1T or 1S will be made incident on the image taking part 3 and converted into color signals R, G and B of the three primary colors to be inputted to the control part 5.
  • the control part 5 has a computer inclusive of a CPU serving as a control unit 11 and includes an image input part 12, a memory 13, an image taking controller 14, an image processor 15, an illumination controller 16, an XY table controller 17, an inspection part 18, a teaching table 19, a data managing part 20, an input part 21, a CRT display part 22, a printer 23, a communication part 24 and an external memory device 25.
  • the image input part 12 is provided with an amplifier circuit for amplifying the image signals R, G and B from the image taking part 3 and an A/D converter circuit for converting these image signals into digital signals.
  • the memory 13 serves to store not only digital gradient image data R, G and B but also binarized image data and color phase data obtained by processing these gradient images.
  • the image taking controller 14 is provided with an interface for connecting the image taking part 3 to the control unit 11 and serves to carry out various controls such as driving the image taking part 3 based on commands from the control unit 11 and adjusting the output level of each color light.
  • the illumination controller 16 is for regulating the light quantity for each light source of the light emitting part 4. The light quantities for the light sources 8, 9 and 10 may be adjusted such that white light will be projected for illumination.
  • the XY table controller 17 includes an interface for connecting the X-axis and Y-axis table control parts 6 and 7 with the control unit 11 and serves to control the motions of these table control parts 6 and 7 based on commands from the control unit 11.
  • the teaching table 19 is a memory part for storing inspection data for substrates and is provided inside a non-volatile memory such as a hard disk device.
  • the inspection data include a standard image of the substrate to be explained below, setting data of inspection areas and target areas for inspection, kinds of programs to be executed at each inspection area, parameters corresponding to the image processing to be carried out (such as threshold values for binarization processes and filters for edge extraction) and judgment standard values for judging appropriateness of extracted characteristic quantities.
  • These inspection data are "taught" prior to an inspection by using an image taken of the standard substrate 1S or preliminarily registered standard inspection data. They are arranged as a judgment file for each kind of substrate.
  • the aforementioned data managing part 20 is a memory for storing the link data that correlate between the kinds of substrates and the judgment files. After the name of the target substrate 1T is inputted, the control unit 11 reads out the judgment file corresponding to it based on the link data in the data managing part 20 and sets it in the memory 13.
  • The image processor 15 and the inspection part 18 carry out their processing based on the inspection data in the judgment file which has been read out.
  • The image processor 15 processes the color images stored in the memory 13 for each inspection area and thereby measures characteristic quantities necessary for the inspection, such as the areas, the positions of the centers of gravity and the color patterns of the target portions to be inspected.
  • The inspection part 18 compares the characteristic quantities extracted by the image processor 15 with the standard data and thereby judges the appropriateness of each target portion.
  • The results of these judgment processes are gathered together and the control unit 11 judges whether the target substrate 1T is a good product or not. This final judgment result is outputted to the CRT display part 22 (hereinafter referred to simply as the display part) and the printer 23 or the communication part 24.
  • The input part 21 is for inputting various conditions and inspection data for an inspection and comprises a keyboard and a mouse.
  • The display part 22 receives image data and inspection results from the control unit 11 and displays them on a display screen.
  • The printer 23 serves to receive inspection results from the control unit 11 and to print them out in a predetermined format.
  • The communication part 24 is for exchanging data with other apparatus.
  • For a target substrate 1T determined to be defective, its ID data and a description of its defects may be transmitted to a repairing apparatus on the downstream side such that the defective part can be quickly repaired.
  • The external memory device 25 is for reading and writing data from and into a memory medium such as a flexible disk, a CD-R or a magneto-optical disk and is used for saving the inspection results and for taking in programs and setting data necessary for the inspection from outside.
  • The communication part 24 and the external memory device 25 may be used for introducing CAD data on a substrate when a test area or a target area for taking in an image is set during the teaching prior to the inspection.
  • The image processor 15 and the inspection part 18 may each comprise a dedicated processor having a program for the aforementioned processes installed, but it is not always necessary to provide dedicated processors.
  • The control unit 11 may be provided with the functions of the image processor 15 and the inspection part 18.
  • The setting conditions of inspection areas may be created by using CAD data.
  • This principle will be explained next with reference to Fig. 2 wherein numeral 100 indicates a standard image obtained from the standard substrate 1S and intended to represent the whole of the substrate 1S.
  • Numeral 101 indicates a so-called map image showing the distribution of inspection areas 102 on the standard image 100. Each of these inspection areas is set by matching with the position and size of the target inspection portion shown by the CAD data of the substrate.
  • The map image represents data on the positions and sizes of the inspection areas on the substrate, or the setting conditions of the inspection areas.
  • These setting conditions may comprise coordinate data (x n and y n ) representing the positions of the inspection areas and numbers representing their widths.
  • The setting conditions of inspection areas are hereinafter referred to as a map image.
  • The substrates 1T and 1S according to this example are provided with positioning marks 105 and 106 respectively at the lower right-hand corner and the upper left-hand corner.
  • The CAD data also include corresponding positioning marks 105a and 106a.
  • The positions and sizes of the inspection areas 102 on the map image 101 are identified as the setting data of the inspection areas 102.
  • Although inspection areas 102 are set here by considering a portion including the main body part of a component and its electrode part as one target inspection portion, this is not intended to limit the scope of the invention.
  • The main body portion, the electrode portion and the soldering portion (such as a fillet) may be considered as separate inspection portions when inspection areas 102 are set.
  • The inspection areas 102 can be set also for the image of the target substrate 1T by using the positioning marks of its image and under the same conditions as for the standard substrate 1S. If the position of each area is registered as relative coordinates with respect to the positioning mark 105, even when the image of the target substrate 1T is displaced with respect to the standard image 100, inspection areas corresponding to each target portion can be set at correct positions based on the positioning mark 105 on its image.
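The relative-coordinate scheme described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names, areas and mark coordinates are all hypothetical, and areas are written as (x, y, width, height) tuples in pixels.

```python
# Sketch: registering inspection areas relative to a positioning mark so
# that they can be re-anchored on a displaced target image.
# All names and coordinate values below are hypothetical illustrations.

def to_relative(areas_abs, mark_std):
    """Convert absolute area positions on the standard image into
    coordinates relative to the positioning mark."""
    mx, my = mark_std
    return [(x - mx, y - my, w, h) for (x, y, w, h) in areas_abs]

def to_absolute(areas_rel, mark_target):
    """Re-anchor the relative areas on a target image, using the
    positioning mark as detected on that image."""
    mx, my = mark_target
    return [(x + mx, y + my, w, h) for (x, y, w, h) in areas_rel]

# Inspection areas set on the standard image (x, y, width, height):
areas = [(120, 80, 30, 20), (200, 150, 40, 25)]
mark_on_standard = (100, 50)

rel = to_relative(areas, mark_on_standard)

# On the target image the mark is found 5 px right and 3 px down:
mark_on_target = (105, 53)
print(to_absolute(rel, mark_on_target))
# -> [(125, 83, 30, 20), (205, 153, 40, 25)]
```

Because only coordinates relative to the mark are stored, a uniform shift of the substrate in the image moves every inspection area by exactly the amount by which the detected mark has moved.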
  • When the substrate 1T or 1S is larger than the field of vision of the camera 3A and images need to be taken of a plurality of target areas, a standard image must be registered for each of the target areas and setting conditions of inspection areas must be determined for each standard image.
  • In that case, an image of the target area is extracted from the map image 101 of the whole substrate based on the relative coordinates of the target area with respect to the positioning mark, and this extracted map image and the standard image are "superposed"; that is, the position data and size data of the inspection areas obtained from the CAD data are converted to the coordinate system of the standard image.
  • In each of the examples described below, a substrate which requires a plurality of target areas to be set is used as the target of inspection, and a standard image is prepared for each of the target areas such that the position of the inspection area can be accurately determined corresponding to this standard image.
  • The target image obtained by the camera 3A is matched with the standard image in each of the examples in order to detect the amount of displacement of the field of vision of the camera 3A, and this amount of displacement is used to correct the setting data of the inspection area. The process of setting the inspection area on the target image is then carried out by using the corrected setting data.
  • In Example 1, the images of the individual target areas are connected together to produce a whole image of the standard substrate 1S, which is registered.
  • Overlapped areas with a width of a specified number of pixels are set between each pair of mutually adjacent areas, corresponding to the error generated by the motions of the X-axis and Y-axis table control parts 6 and 7 (referred to as the machine error).
  • A pattern matching process is carried out by using the image data of these overlapped portions and, after the corresponding relationships of pixels are adjusted based on the result of this matching process, corresponding pixels are overlapped to obtain a correct whole image.
  • Fig. 3 shows an example of the process for overlapping images, the order in which the images are connected being indicated by circled numerals.
  • One of the images in the center part is designated as the first image and the other images are sequentially overlapped in the clockwise direction.
  • The average of the density values which the pixels had before the overlapping is assigned to each of the pixels in the overlapped area.
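The joining of adjacent images through their overlapped portions can be sketched in one dimension. This is a minimal illustration under assumed values, not the patent's implementation: a sum-of-absolute-differences search over a few pixels stands in for the pattern matching process, and the strips, the nominal overlap width and the search range are all hypothetical.

```python
# Sketch of the overlap-and-average joining step in one dimension.
# Two image strips share a nominal overlap; a small search around the
# nominal value absorbs the machine error, and the overlapped pixels
# are replaced by their average. All values are hypothetical.

def join_strips(a, b, nominal_overlap, search=2):
    """Join strip b onto the right end of strip a. The true overlap is
    searched within +/-search pixels of the nominal overlap; the best
    overlap minimises the sum of absolute differences over the
    overlapped pixels."""
    best_ov, best_err = nominal_overlap, float("inf")
    for ov in range(max(1, nominal_overlap - search), nominal_overlap + search + 1):
        err = sum(abs(x - y) for x, y in zip(a[-ov:], b[:ov]))
        if err < best_err:
            best_ov, best_err = ov, err
    ov = best_ov
    merged = [(x + y) / 2 for x, y in zip(a[-ov:], b[:ov])]  # averaged overlap
    return a[:-ov] + merged + b[ov:]

a = [10, 20, 30, 40, 50]   # right edge of the first image
b = [40, 50, 60, 70, 80]   # left edge of the adjacent image
print(join_strips(a, b, nominal_overlap=2))
# -> [10, 20, 30, 40.0, 50.0, 60, 70, 80]
```

The same idea extends to two dimensions by scoring a rectangular overlap region instead of a run of pixels.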
  • Figs. 4-6 show a method of forming a whole image, using the character array "ABC" for convenience to represent a distribution pattern of components on a substrate.
  • The substrate to be processed is divided into six target areas to be photographed, to obtain six images g1-g6.
  • The image g2 at the lower center is set first, as shown in Fig. 5, to start the overlapping process for making a whole image. Thereafter, the method explained above with reference to Fig. 3 is used and the images g5, g6, g3, g1 and g4 are overlapped sequentially in this order until a whole image 103 as shown in Fig. 6 is completed.
  • Next, inspection areas are set on the whole image 103 by the method explained above with reference to Fig. 2, and a plurality of target areas are set on the substrate based on the distribution condition of the inspection areas.
  • The whole image 103 in this case may be considered to include the standard images of these target areas. Since the whole image 103 is obtained by properly joining the individual images g1-g6, with their overlapping parts prepared by taking into consideration the machine errors of the X-axis and Y-axis table control parts 6 and 7, an image may be cut out from this whole image 103 to obtain a standard image similar to the image that would be obtained if the field of vision of the camera 3A were positioned correctly for the corresponding target area.
  • At the time of an actual inspection, however, the field of vision of the camera 3A cannot always be set correctly to the target area which was set as explained above, because of the machine errors of the X-axis and Y-axis table control parts 6 and 7.
  • This displacement of the field of vision of the camera 3A can be detected as the displacement of the target image with respect to the target area on the whole image 103.
  • Fig. 7 is a drawing for explaining the process for detecting the displacement.
  • Numeral 31 in Fig. 7 indicates an inspection area set on the basis of a whole image 103 and numeral 30 indicates a target area on the whole image 103.
  • The image inside this target area 30 functions as the standard image for the inspection.
  • Although only one inspection area 31 is set in this example, a plurality of inspection areas may be contained within this single target area 30.
  • Numeral 40 in the figure indicates a target image obtained by the camera 3A positioned at the target area on the actual substrate.
  • Numeral 41 indicates the area (hereinafter referred to as the corresponding area) on the whole image 103 where an image corresponding to the target image 40 is obtained. If the field of vision of the camera 3A is correctly positioned, this corresponding area 41 should match the target area 30. In this illustrated example, however, the field of vision of the camera 3A is not correctly positioned due to machine errors of the X-axis and Y-axis table control parts 6 and 7 and hence there are resultant displacement values Δx and Δy from the target area 30.
  • The positions of the inspection area 31 and the target area 30 are registered as coordinates in the same coordinate system as that of the whole image 103 of the substrate. If the positional relationship between these areas 30 and 31 in this coordinate system is directly used on the target image 40, the inspection area 31 will be displaced from the proper position where it should be set (as shown by numeral 311 on the target image 40).
  • Thus, the aforementioned corresponding area 41 is extracted for the target image 40 by a pattern matching process (also referred to as a correlation matching process), thereafter the displacement values Δx and Δy of the corresponding area 41 from the target area 30 are calculated for both the horizontal direction and the perpendicular direction, and the setting position of the inspection area 31 on the target image 40 is corrected by the displacement values Δx and Δy.
  • In this manner, the inspection area 31 can be set at the same position relative to the target portion (the letter B in this example) as on the whole image 103, such that a correct inspection result can be obtained.
  • If the field of vision of the camera 3A and the target area were matched only at the beginning of an inspection, based on the displacement of the substrate extracted from the images of the positioning marks 105 and 106, and nothing more, it would be the image within the displaced inspection area 311 that is processed on the target image 40.
  • Although the accuracy of positioning of the inspection area can be improved by reducing the machine errors of the X-axis and Y-axis table control parts 6 and 7, this would require driving mechanisms with high accuracy, which would be costly.
  • By the correction described above, the inspection area can be set accurately, independently of the capability of the driving mechanisms for the X-axis and Y-axis table control parts 6 and 7, and the image processing necessary for the inspection can be carried out.
  • The target image 40 obtained by photographing the target substrate 1T is likely to include defective portions. Even if there are some defective portions, it may be expected that images similar to those obtained from the whole image 103 are obtained from the remaining portions. Thus, if the percentage of the area with defective portions on the target image 40 is small, the area where the correlation value with the target image 40 by a pattern matching process is the highest may be extracted as the corresponding area 41 to obtain the displacement values Δx and Δy.
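The extraction of the corresponding area and the displacement values can be sketched as follows. This is a minimal illustration, not the patent's implementation: a sum-of-absolute-differences score stands in for the correlation value, and the synthetic 6x6 "whole image", the search range and the defect pixel are all hypothetical.

```python
# Sketch of extracting the corresponding area by a matching search.
# The target image is slid over a region of the whole image around the
# registered target area; the position with the smallest dissimilarity
# is taken, and its offset from the target area gives dx, dy.
# The tiny images and the defect pixel are hypothetical.

def find_displacement(whole, target, area_x, area_y, search=2):
    h, w = len(target), len(target[0])
    best = (0, 0, float("inf"))
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x0, y0 = area_x + dx, area_y + dy
            if x0 < 0 or y0 < 0 or y0 + h > len(whole) or x0 + w > len(whole[0]):
                continue  # candidate window falls outside the whole image
            err = sum(abs(whole[y0 + i][x0 + j] - target[i][j])
                      for i in range(h) for j in range(w))
            if err < best[2]:
                best = (dx, dy, err)
    return best[0], best[1]

# 6x6 "whole image" of the standard substrate with a smooth gradient:
whole = [[x + 10 * y for x in range(6)] for y in range(6)]
# Target image = the block registered at (2, 2), but actually taken
# displaced by (+1, +1), with one "defective" pixel:
target = [row[3:6] for row in whole[3:6]]
target[0][0] = 99   # a small defect does not prevent the best match
print(find_displacement(whole, target, 2, 2))
# -> (1, 1)
```

Because only one pixel disagrees, the minimum-error position is still the true offset, which mirrors the remark that a small defective portion does not spoil the matching.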
  • Fig. 8 shows a routine for a teaching process and Fig. 9 shows a routine for an inspection process to be carried out in Example 1.
  • The teaching process of Fig. 8 is started as a substrate name and a substrate size are inputted and the standard substrate 1S is transported onto the Y-axis table control part 7 (Step ST1). Next, the number and positions of target areas for creating a whole image are determined and the camera 3A is positioned in a target area containing the positioning mark 105 or 106 (Step ST2).
  • The positioned camera 3A is operated to take an image (Step ST3), and the image thus taken is temporarily stored in the memory 13 through the image input part 12.
  • This temporarily stored image is displayed next on the display part 22 to cause the user to specify the position of the positioning mark 105 or 106.
  • Its specified coordinates are identified as the coordinates of the reference point (Step ST4).
  • Alternatively, the positioning mark 105 or 106 on the image may be automatically extracted and the coordinates of the extracted position may be identified as the reference point.
  • Next, the X-axis and Y-axis table control parts 6 and 7 are operated to position the camera 3A in the next target area (Step ST5), the camera 3A is operated, and the obtained image is temporarily stored in the memory 13 (Step ST6).
  • Steps ST5 and ST6 are repeated while the position of the camera 3A is changed with respect to the standard substrate 1S.
  • After all necessary images for creating a whole image have been obtained (YES in Step ST7), all of the images temporarily stored in the memory 13 are superposed sequentially to create a whole image 103 by the method shown in Fig. 3 (Step ST8).
  • Next, inspection areas are set for each of the target portions on the whole image 103 by the method of creating a map of inspection areas from the CAD data corresponding to the standard substrate 1S (Step ST9).
  • This map image is matched with the whole image 103 by means of the positioning marks 105 and 106, and the positions and the sizes of the inspection areas on the map image after this matching process are set as the setting data for the inspection areas.
  • In Step ST10, a window with a size corresponding to the field of vision of the camera 3A is scanned over the map image of the inspection areas and the setting positions of the target areas for the time of inspection are identified.
  • Next, inspection data are created for each of the components on the whole image, such as the setting position of the inspection window for each target portion to be inspected within the inspection area, the kinds of programs to be carried out, parameters such as threshold values for binarization processes, and standard values for judging the appropriateness of extracted characteristic quantities (Step ST11).
  • These inspection data may include both those created from standard inspection data preliminarily registered for each type of components (library data) and those created by using the whole image.
  • Finally, judgment files are created by correlating the coordinates of the reference point identified in Step ST4, the whole image created in Step ST8 and the inspection data of various kinds created in Steps ST9-ST11 with the substrate names, and are registered in the teaching table 19 (Step ST12). Thereafter the standard substrate 1S is transported out (Step ST13) and the process ends.
  • The inspection routine of Fig. 9 is started, after the teaching process of Fig. 8 has been carried out, in response to the input of the substrate name of the target substrate 1T to be inspected and a start command for an inspection.
  • The routine starts by reading out the judgment file of the corresponding substrate and setting it in the memory 13 (Step ST21).
  • After the target substrate 1T is transported in (Step ST22), the positioning mark is used to correct the initial position of the camera 3A with respect to the target substrate 1T (Step ST23).
  • This correction process is carried out by matching the field of vision of the camera 3A to an area including the positioning mark 105 or 106 corresponding to the aforementioned reference point to obtain an image, extracting the positioning mark on the created image and obtaining displacement values of the positioning mark with respect to the reference point registered by the aforementioned teaching process.
  • These displacement values are converted into distances of motion of the X-axis and Y-axis table control parts 6 and 7, and these table control parts 6 and 7 are moved by these distances to adjust the positional relationship between the camera 3A and the target substrate 1T. (The position of the camera 3A is adjusted in the same way also in the subsequent Examples.)
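The conversion of the measured displacement into table motion can be sketched as follows, assuming a hypothetical calibration constant (millimetres of table travel per image pixel); the patent gives no concrete numbers, so the pixel pitch below is purely illustrative.

```python
# Sketch: converting a displacement measured on the image (in pixels)
# into travel distances for the X-axis and Y-axis tables.
# MM_PER_PIXEL is a hypothetical calibration value, not from the patent.

MM_PER_PIXEL = 0.02  # assumed: 20 micrometres of substrate per pixel

def pixels_to_table_motion(dx_px, dy_px, mm_per_px=MM_PER_PIXEL):
    """Distances (in mm) by which the X and Y tables should move to
    cancel the displacement observed on the image."""
    return dx_px * mm_per_px, dy_px * mm_per_px

# A displacement of 15 px in x and -8 px in y on the image:
print(pixels_to_table_motion(15, -8))
```

In practice such a scale factor would come from a calibration of the camera and table, and the sign convention depends on how the axes of the image and the tables are oriented.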
  • Next, Steps ST24-ST31 are carried out for each of the target areas.
  • First, the X-axis and Y-axis table control parts 6 and 7 are operated to position the camera 3A at the target area under consideration (Step ST24) and a target image to be processed is obtained by the camera 3A (Step ST25).
  • Next, an area of a specified size including the target area under consideration and its surroundings is set on the whole image as the scanning object for the target image (Step ST26).
  • A correlation matching process is carried out at each scanning position by scanning the target image pixel by pixel over this scanning object area, and the area corresponding to the target image is extracted (Step ST27). Thereafter, the displacement values Δx and Δy of the corresponding area extracted in Step ST27 with respect to the target area under consideration are calculated (Step ST28).
  • In Step ST29, the setting position of the inspection area read out from the teaching table 19 and stored in the memory 13 is corrected by the displacement values Δx and Δy. This is done by first replacing the registered coordinates of the setting position of the inspection area (in the coordinate system of the whole image) by relative coordinates with respect to the registered target area (such as the coordinates as seen from the lower right-hand corner of the target area) and then correcting the coordinates after this conversion by the displacement values Δx and Δy.
  • In Step ST30, a series of processes related to the inspection is carried out, such as setting an inspection area on the target image based on the setting position corrected in Step ST29, carrying out measurement and judgment processes on the target portions by using the inspection data read out of the teaching table 19 for each inspection area, and storing the inspection results in the memory 13.
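The correction of Step ST29 can be sketched as follows. This is a minimal illustration with hypothetical coordinates; the sign convention assumes that the displacement (dx, dy) is measured as the shift of the camera's field of vision, so features on the target image appear shifted the opposite way.

```python
# Sketch of Step ST29: the inspection area's registered position (in the
# coordinate system of the whole image) is first converted to coordinates
# relative to the target area's corner, and then corrected by the
# measured displacement dx, dy. All numbers are hypothetical.

def corrected_area_position(area_xy, target_area_xy, dx, dy):
    ax, ay = area_xy             # registered corner of the inspection area
    tx, ty = target_area_xy      # registered corner of the target area
    rel_x, rel_y = ax - tx, ay - ty      # relative to the target area
    # the field of vision moved by (dx, dy), so the area shifts back:
    return rel_x - dx, rel_y - dy

# Inspection area registered at (250, 140) on the whole image; its target
# area begins at (200, 100); the camera's field of vision is found to be
# displaced by dx = 4, dy = -3 pixels:
print(corrected_area_position((250, 140), (200, 100), 4, -3))
# -> (46, 43)
```

With a displacement of zero the result is simply the registered relative position (50, 40), which is the uncorrected placement shown as area 311 in Fig. 7.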
  • In Step ST32, the results stored in the memory 13 over all of the inspection areas are considered together to determine whether the target substrate 1T under consideration is a good product or not, and the result of this judgment is outputted.
  • The inspected target substrate 1T is thereafter transported away (Step ST33) and Steps ST22-ST33 are repeated until all other substrates to be processed have been inspected (Step ST34).
  • Fig. 10 shows the whole image 103 of a substrate. Like the example shown in Figs. 4 and 5, this whole image 103 is also formed by superposing six images, that is, three images in the horizontal x-direction and two images in the perpendicular y-direction. For the sake of simplicity of description, the superposed parts between mutually adjacent images are not illustrated in Fig. 10.
  • In Example 2, images of target areas are obtained (photographed) under the same conditions as when the whole image 103 is obtained.
  • The whole image 103 is divided into area portions corresponding to the target areas (to be photographed) and the images G0-G5 of the six area portions are registered as standard images.
  • The reason for obtaining the standard images G0-G5 by cutting them out of the whole image 103 after it has been created is to eliminate the machine errors that might otherwise be caused if separate images were taken for obtaining the standard images.
  • Fig. 10 shows two components 50 only on one of the area portions (corresponding to image G4). These components 50 are shown as chip components each having electrodes on both end parts. Numeral 51 indicates a fillet and numeral 52 indicates an inspection area corresponding to this fillet 51.
  • As the setting position of each inspection area 52, the coordinates of the lower right-hand corner of that inspection area 52 are registered in this example. These coordinates (x n , y n ) are taken relative to the lower right-hand corner of the image that contains this inspection area 52. Although not shown in Fig. 10, the horizontal and perpendicular dimensions of the inspection areas 52 are also registered as data indicative of their sizes.
  • Fig. 11 is used next to explain how the setting position of an inspection area is identified.
  • First, data regarding the area corresponding to each standard image are extracted from the CAD data of the substrate; those of the extracted data representing the fillets 51 are identified, and the setting condition of each inspection area is determined according to the position and the size of the target portion to be inspected.
  • Fig. 11 shows a standard image G4 at the upper center of the whole image 103 and a map image M4 showing the distribution condition of the inspection areas on this standard image G4.
  • The map image M4 and the standard image G4 are position-matched so as to apply the positions and sizes of the inspection areas 52 on the map image M4, such that appropriate inspection areas 52 are set for each fillet 51 on the standard image G4.
  • The center point P of the positioning mark 105 at the lower right-hand corner of the substrate is used as the reference point for the purpose of creating map images corresponding to the standard images G0-G5, such that the coordinate systems of the whole image 103 and the CAD data can be matched.
  • Data on the positioning marks are also included in the CAD data and since the positioning marks are accurately formed on the substrate, it is justified to select the reference point P with reference to one of the positioning marks (105).
  • Thus, each point on the whole image 103 can be expressed in terms of relative coordinates with respect to the reference point P.
  • If the lower right-hand corner of the standard image G4 is at (x 0 , y 0 ) with respect to the reference point P, the coordinates of the lower right-hand corner of the aforementioned inspection area 52 on the standard image G4 become (x 0 +x n , y 0 +y n ).
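The composition of relative coordinates can be illustrated with a small sketch; the numbers are hypothetical and the function name is not from the patent.

```python
# Sketch of the coordinate composition of Figs. 10 and 11: an inspection
# area is stored as (x_n, y_n) relative to the corner of its standard
# image, the image corner itself lies at (x_0, y_0) from the reference
# point P, so the area's position from P is (x_0 + x_n, y_0 + y_n).
# The coordinate values below are hypothetical.

def area_from_reference(image_corner, area_in_image):
    x0, y0 = image_corner     # corner of the standard image, from P
    xn, yn = area_in_image    # corner of the inspection area, in-image
    return x0 + xn, y0 + yn

print(area_from_reference((320, 240), (45, 18)))
# -> (365, 258)
```

Storing the two offsets separately means that moving a standard image's registered corner shifts every inspection area inside it without touching the per-area data.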
  • At the time of an inspection, the camera 3A is positioned for the same target area as when the whole image 103 was obtained, to create target images corresponding to the standard images G0-G5. Because of machine errors, however, there is a possibility that the field of vision of the camera 3A may not correctly match the target area and there may be a displacement between the registered standard image and the target image to be processed. Thus, the positional displacement of the target image is extracted and the inspection area is corrected, as explained above regarding Example 1.
  • Fig. 12 shows a target image 54 corresponding to and displaced from the standard image G4. Moreover, one (51K) of the four target portions 51 on the target image 54 is shown to be defective.
  • The standard image G4 is scanned pixel by pixel with respect to the target image 54 and correlation values are calculated where the two images overlap, such that the displacement values Δx and Δy between the two images when the correlation value is the highest are judged to be the displacement values of the target image 54 with respect to the standard image G4. Even if there is a defective part, since the remaining parts other than the defective part 51K are similar to the standard image G4, it may be concluded that the correlation value becomes the largest when the standard image G4 has moved to the best matching position.
  • Fig. 13 shows how the positions of the inspection areas 52, which are set under the same conditions as on the standard image G4, are corrected by using the displacement values Δx and Δy. In this manner, appropriate inspection areas 52 can be set for all fillets 51 and 51K such that an accurate inspection is made possible.
  • In Example 2, too, a teaching process similar to the routine shown in Fig. 8 can be carried out, except that the individual images G0-G5, instead of the whole image, are registered as standard images. Thus, no separate flowchart will be shown for this example and Fig. 14 is referenced next to explain the inspection routine.
  • The inspection routine for Example 2 starts by reading out the inspection data of the desired target substrate from the teaching table 19 (Step ST41) and storing them in the memory 13.
  • After the target substrate 1T is transported in (Step ST42), the camera 3A is positioned for the target area including the positioning mark 105 corresponding to the reference point P (that is, the target area corresponding to the standard image G0) (Step ST43).
  • In Step ST44, an image is taken by the camera 3A, the center point of the positioning mark 105 is extracted on the obtained image and the displacement of the extracted position with respect to the reference point P is obtained.
  • The X-axis and Y-axis table control parts 6 and 7 are then operated according to this displacement to adjust the position of the camera 3A. Since there is a possibility of machine errors also at this time of adjustment, causing a displacement of the field of vision of the camera 3A with respect to the target area, another image is obtained by the camera 3A after this positional adjustment (Step ST45). After a target image corresponding to the standard image G0 has thus been obtained, Steps ST46 and ST47 are carried out to set an inspection area on the target image free of the effects of the machine errors.
  • In Step ST46, a correlation matching process is carried out between the created target image and the standard image G0 to detect the displacement values Δx and Δy. This process is similar to the process explained earlier with reference to Fig. 12 as the method of detecting the displacement values of the target image 54 with respect to the standard image G4.
  • After these displacement values Δx and Δy are used to correct the setting position of the inspection area in Step ST47, an inspection is carried out by using the image data of the corrected inspection area (Step ST48).
  • Next, the camera 3A is positioned for the next target area (Step ST50). If the camera 3A is to be set for the target area corresponding to the standard image G1, for example, the X-axis table control part 6 is moved by an amount corresponding to the distance x p . If the camera 3A is to be set for the target area corresponding to the standard image G5, the Y-axis table control part 7 is moved by an amount corresponding to the distance y p . Thereafter, Steps ST45-ST48 are repeated to sequentially obtain images corresponding to all the standard images, and detection of the displacement with respect to each standard image, correction of the inspection areas and inspection are carried out.
  • After all target areas have been processed (YES in Step ST49), the results of the inspection are outputted (Step ST51) and the target substrate 1T is transported out (Step ST52). Steps ST42-ST52 are repeated until all target substrates are processed (NO in Step ST53).
  • Example 2 presumed that the target portion to be examined is contained within any one of the images that comprise the whole image. There are situations as shown in Fig. 15, however, where a component (QFP) 56 to be examined is positioned so as to span over a plurality of images. In such a situation, it is desirable to be able to set a target area at an arbitrarily selectable position on the whole image 103 as in the case of Example 1.
  • In Example 3, after the whole image 103 of the standard substrate 1S has been created, a target area 57 is set on this whole image 103 so as to wholly contain the component 56 in question and the setting position of this target area 57 is registered. The image inside the target area 57 is cut out and registered as the standard image.
  • The setting of the target area 57 can be carried out by using the map image of the substrate, but another method is to display the whole image 103 on the display part 22 and to receive the user's setting operations on this display screen. It goes without saying that the dimensions of this target area 57 are also x p and y p .
  • Fig. 16 shows the standard image G11 cut out of the target area 57 on the whole image 103, as well as the map image M11 of the inspection areas distributed over this standard image G11.
  • Numeral 59 indicates a fillet corresponding to one of the lead lines of the component 56 and numeral 58 indicates the inspection area corresponding to this fillet 59.
  • The map image M11 of this example can be created, as in the case of Example 2, by matching the coordinate systems of the whole image 103 and the CAD data and thereafter extracting the data of the areas corresponding to the standard image G11 from the CAD data.
  • The setting position of each inspection area 58 on the map image M11 can also be expressed by the relative coordinates (x n , y n ) of the lower right-hand corner of the inspection area 58 as seen from the lower right-hand corner of the image.
  • The setting position of the target area 57 on the whole image 103 may be expressed by the relative coordinates (x T , y T ) of the lower right-hand corner of this area as seen from the reference point P, as shown in Fig. 15.
  • An area corresponding to the standard image G11 may be identified on the CAD data for determining the setting conditions of the inspection areas by matching the coordinate systems of the whole image and the CAD data based on the reference point P and thereafter setting an area with dimensions x p and y p with its lower right-hand corner at (x T , y T ).
  • The position of the camera 3A can be matched with the target area 57 on the substrate at the time of inspection by adjusting the positional relationship between the substrate and the camera 3A based on the reference point P, by a process similar to Steps ST43 and ST44 of Fig. 14, and thereafter moving the X-axis and Y-axis table control parts 6 and 7 respectively by distances corresponding to x T and y T added to the coordinates of the reference point P.
  • Thereafter, the displacement values Δx and Δy of the target image with respect to the standard image may be extracted similarly as in Example 2 and these values Δx and Δy can be used to correct the position of the inspection area.
  • The teaching and inspection routines for Example 3 are similar to those for Examples 1 and 2 and hence repetitious explanations will be omitted.
  • In Example 3, the whole image 103 of the standard substrate 1S is created first and thereafter an image corresponding to a target area is cut out of this whole image and registered as the standard image, but the whole image 103 itself may be registered instead, as in Example 1.
  • In that case, the displacement values of the target image with respect to the standard image may be detected at the time of inspection by cutting out the standard image from the registered whole image and carrying out the correlation matching process between the standard image thus cut out and the target image.
  • Example 4 also relates to a substrate as shown in Fig. 15: a target area 57 is set so as to wholly include the component 56 and the image in this target area 57 is registered as the standard image G11. Unlike Example 3, however, the whole image 103 is not created; the setting position of the target area 57 including the component 56 is identified on the CAD data and an image to become the standard image G11 is obtained by matching the position of the camera 3A with the target area of the standard substrate 1S. Since the effects of machine errors of the X-axis and Y-axis table control parts 6 and 7 must be taken into consideration when a standard image is created by such a method, a teaching process is carried out in Example 4 by the routine shown in Fig. 17.
  • The teaching routine of Fig. 17 starts by reading out the CAD data of the target substrate of inspection, from which a map image of inspection areas for the entire substrate is created (Step ST61). Next, a target area 57 is set at a position such that all inspection areas 58 corresponding to the component 56 are included (Step ST62). The setting position of this target area 57 is also expressed by the relative coordinates (x_T, y_T) of the lower right-hand corner of this area 57 with respect to the reference point P, as in the case of the whole image 103 in Example 3.
  • After the standard substrate 1S is transported in (Step ST63), the positional relationship between the standard substrate 1S and the camera 3A is adjusted such that the positioning mark 105 on the image of the standard substrate 1S comes to preliminarily determined coordinates (Step ST64). After this position matching is completed, the X-axis and Y-axis table control parts 6 and 7 are moved based on x_T and y_T to match the position of the camera 3A with the target area 57 set in Step ST62 (Step ST65). An image is then taken by the camera 3A to obtain the image that is to become the standard image (Step ST66).
  • The relative coordinates (x_T, y_T) are further used to extract the image in the target area 57 from the map image of the whole substrate, and this map image is superposed on the standard image.
  • The map image is then scanned pixel by pixel until the standard image and the map image come into a matching positional relationship, and the displacement values Δx and Δy between the standard image and the map image in this matching positional relationship are detected (Step ST67).
  • Fig. 18A shows the map image M11 of the target area 57 as initially set on the standard image G11. Since the standard image G11 of this example is displaced with respect to the target area 57, the fillets 59 on the image are also displaced from the corresponding inspection areas 58.
  • Fig. 18B shows the map image M11 after it has been moved to a position such that each fillet 59 is contained in the corresponding inspection area 58. In the condition of Fig. 18B, the standard image G11 and the map image M11 may be considered to be in a matching relationship.
  • The horizontal and vertical displacement values of the standard image with respect to the map image are Δx and Δy, respectively.
  • The matching positional relationship between the standard image G11 and the map image M11 is obtained by extracting the areas where the red, green, and blue colors are distributed on the standard image G11 and distinguishing these extracted areas from the rest of the image by binarization.
  • The map image is also binarized so as to distinguish the inspection areas 58 from the other areas. Thereafter, these two binarized images are moved with respect to each other to find the positional relationship where the correlation value becomes the largest.
  • This positional relationship corresponds to the aforementioned matching relationship between the standard image G11 and the map image M11, and the displacement values at this position are Δx and Δy.
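A minimal illustration of the binarization step follows; the threshold value and the any-channel rule are assumptions made for this sketch, since the patent does not specify them:

```python
def binarize(rgb_image, threshold=128):
    """Turn an RGB image (rows of (r, g, b) tuples) into a 0/1 image:
    a pixel is foreground if any color channel exceeds the threshold."""
    return [
        [1 if max(pixel) > threshold else 0 for pixel in row]
        for row in rgb_image
    ]

# A strong red, a dark pixel, and a strong green map to 1, 0, 1.
row = [(200, 10, 10), (10, 10, 10), (10, 180, 30)]
assert binarize([row]) == [[1, 0, 1]]
```

The two binarized images can then be shifted against each other, and the shift that maximizes the correlation value is taken as (Δx, Δy).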
  • After the displacement values Δx and Δy of the standard image G11 with respect to the map image M11 are obtained in Step ST67, the setting position of the inspection area 58 on the map image is corrected by using these values (Step ST68). Next, the setting position of the target area 57 set in Step ST62 is similarly corrected by using the displacement values Δx and Δy (Step ST69).
  • Next, inspection data other than those for the inspection area are created, as in Step ST11 of Fig. 8 (Step ST70).
  • A judgment file is then created, containing the standard image created in Step ST66, the corrected setting position of the target area, the corrected setting position and the size of the inspection area, and the other inspection data created in Step ST70, and is registered in the teaching table 13 (Step ST71).
  • This teaching routine is then ended (Step ST72).
  • The inspection routine carried out after this teaching routine is the same as explained for Example 2 and hence will not be repeated.
  • The setting data x_T and y_T of the target area 57 are relative coordinates with respect to the reference point P, whereas the setting data x_n and y_n of the inspection area 58 are relative coordinates with respect to the standard image G11, that is, the lower right-hand corner of the target area 57, as shown in Fig. 18A. Since the positional relationship between the substrate and the inspection area 58 is fixed, the relative coordinates of the inspection area with respect to the reference point P are always the constant values (x_T + x_n, y_T + y_n). Thus, if the values of x_n and y_n are changed, the values of x_T and y_T must also be changed. For example, if the coordinate x_n of the inspection area is changed to x_n + Δx, the coordinate x_T of the target area must be changed to x_T - Δx.
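This bookkeeping can be checked in a few lines; the numeric values here are made up purely for illustration:

```python
x_t, x_n = 50, 12      # target area and inspection area offsets
total = x_t + x_n      # fixed offset of the inspection area from P

dx = 3                 # adjustment applied to the inspection area
x_n_new = x_n + dx
x_t_new = x_t - dx     # compensating change keeps the sum constant

# The inspection area's position relative to P is unchanged.
assert x_t_new + x_n_new == total
```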
  • The position of the camera 3A will be matched to the corrected target area at the time of the inspection, but there is also a possibility of machine errors occurring in such a situation.
  • For this reason, not only must the inspection area on the target image be corrected; the setting data of the target area must also be corrected if the position of a target portion to be inspected in this inspection area is to be expressed by relative coordinates with respect to the reference point P.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Supply And Installment Of Electrical Components (AREA)
EP05019260A 2004-09-06 2005-09-05 Méthode et appareil d'inspection de substrat Not-in-force EP1638050B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004258488 2004-09-06
JP2005255009A JP3918854B2 (ja) 2004-09-06 2005-09-02 基板検査方法および基板検査装置

Publications (3)

Publication Number Publication Date
EP1638050A2 true EP1638050A2 (fr) 2006-03-22
EP1638050A3 EP1638050A3 (fr) 2010-07-21
EP1638050B1 EP1638050B1 (fr) 2012-06-20

Family

ID=35601895

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05019260A Not-in-force EP1638050B1 (fr) 2004-09-06 2005-09-05 Méthode et appareil d'inspection de substrat

Country Status (4)

Country Link
US (1) US7512260B2 (fr)
EP (1) EP1638050B1 (fr)
JP (1) JP3918854B2 (fr)
TW (1) TWI279741B (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2321778A2 (fr) * 2008-08-08 2011-05-18 Snap-on Incorporated Système de contrôle des stocks basé sur l image avec calibration automatique et correction d image
CN101765361B (zh) * 2008-12-22 2011-12-21 株式会社日立高新技术仪器 电子部件安装方法及其装置
CN103348376A (zh) * 2011-03-15 2013-10-09 欧姆龙株式会社 图像处理装置及图像处理程序
TWI484159B (zh) * 2008-09-25 2015-05-11 Photon Dynamics Inc 用以測定或識別平板顯示器或其像素之位置的方法及裝置
EP2990758B1 (fr) * 2013-04-25 2018-09-19 Bridgestone Corporation Dispositif de contrôle
EP3589105A4 (fr) * 2017-02-23 2020-02-26 Fuji Corporation Dispositif de travail sur substrat et procédé de traitement d'image

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7684891B2 (en) * 2006-08-04 2010-03-23 Hurco Companies, Inc. System and method for tool use management
DE102007060355A1 (de) * 2007-12-12 2009-06-25 Vistec Semiconductor Systems Gmbh Verfahren und Vorrichtung zur Verarbeitung der von mindestens einer Kamera aufgenommenen Bilddaten der Oberfläche eines Wafers
JP5254637B2 (ja) * 2008-02-19 2013-08-07 株式会社ブリヂストン タイヤの外観検査装置、及びタイヤの外観検査方法
KR101237497B1 (ko) * 2009-03-30 2013-02-26 주식회사 고영테크놀러지 검사영역의 설정방법
US8260030B2 (en) 2009-03-30 2012-09-04 Koh Young Technology Inc. Inspection method
JP5418093B2 (ja) 2009-09-11 2014-02-19 ソニー株式会社 表示装置および制御方法
ES2718471T3 (es) 2009-11-26 2019-07-02 Japan Tobacco Inc Dispositivo de inspección de cigarrillos
EP2550876B1 (fr) * 2010-03-24 2019-10-30 Japan Tobacco, Inc. Procédé et dispositif d'inspection de filtre
KR101642897B1 (ko) 2011-07-13 2016-07-26 주식회사 고영테크놀러지 검사방법
US9080855B2 (en) * 2011-09-23 2015-07-14 Mitutoyo Corporation Method utilizing image correlation to determine position measurements in a machine vision system
US8977035B2 (en) 2012-06-13 2015-03-10 Applied Materials Israel, Ltd. System, method and computer program product for detection of defects within inspection images
JP6177010B2 (ja) * 2013-06-03 2017-08-09 新電元工業株式会社 捺印シンボル検査方法、捺印シンボル検査装置、及び電子機器
JP5775132B2 (ja) * 2013-11-01 2015-09-09 株式会社ブリヂストン タイヤの検査装置
JP2018189387A (ja) * 2017-04-28 2018-11-29 セイコーエプソン株式会社 電子部品搬送装置および電子部品検査装置
WO2020105679A1 (fr) * 2018-11-21 2020-05-28 ソニー株式会社 Système, dispositif et procédé d'identification de pièce
JP7153127B2 (ja) * 2019-03-14 2022-10-13 株式会社Fuji 良否判定装置および良否判定方法
TWI700644B (zh) * 2019-04-02 2020-08-01 精英電腦股份有限公司 板件構件的即時定位裝置與方法
WO2023199572A1 (fr) * 2022-04-11 2023-10-19 パナソニックIpマネジメント株式会社 Dispositif d'enregistrement de modèle, procédé d'enregistrement de modèle et système d'enregistrement de modèle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002097535A2 (fr) * 2001-05-30 2002-12-05 Nptest, Inc. Alignement de sous-resolution des images
US20020181760A1 (en) * 2001-06-01 2002-12-05 Norio Asai Hole inspection apparatus and method
US20030179921A1 (en) * 2002-01-30 2003-09-25 Kaoru Sakai Pattern inspection method and its apparatus

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02503241A (ja) * 1988-02-22 1990-10-04 イーストマン・コダック・カンパニー Svdブロック変換を使用したディジタル画像雑音抑制方法
US5185811A (en) * 1990-12-27 1993-02-09 International Business Machines Corporation Automated visual inspection of electronic component leads prior to placement
US5187754A (en) * 1991-04-30 1993-02-16 General Electric Company Forming, with the aid of an overview image, a composite image from a mosaic of images
US5371690A (en) * 1992-01-17 1994-12-06 Cognex Corporation Method and apparatus for inspection of surface mounted devices
US5325449A (en) * 1992-05-15 1994-06-28 David Sarnoff Research Center, Inc. Method for fusing images and apparatus therefor
JP3250309B2 (ja) 1993-03-02 2002-01-28 オムロン株式会社 実装部品検査用データの教示方法
KR0168243B1 (ko) * 1994-12-19 1999-05-01 다떼이시 요시오 관측 영역 설정 방법 및 그 장치와, 이 관측 영역 설정 방법을 이용한 외관 검사 방법 및 그 장치
JP3381129B2 (ja) 1994-12-19 2003-02-24 オムロン株式会社 観測領域設定方法およびその装置、ならびにこの観測領域設定方法を用いた外観検査方法およびその装置
US5694481A (en) * 1995-04-12 1997-12-02 Semiconductor Insights Inc. Automated design analysis system for generating circuit schematics from high magnification images of an integrated circuit
JPH0915302A (ja) 1995-06-30 1997-01-17 Olympus Optical Co Ltd 回路基板検査機の位置決め装置および位置決め方法
US5768443A (en) * 1995-12-19 1998-06-16 Cognex Corporation Method for coordinating multiple fields of view in multi-camera
JP3524254B2 (ja) 1996-02-01 2004-05-10 日置電機株式会社 パターンマッチング方法による被検査基板の位置ずれ検出方法
US6075905A (en) * 1996-07-17 2000-06-13 Sarnoff Corporation Method and apparatus for mosaic image construction
JP3995030B2 (ja) * 1996-09-17 2007-10-24 コグネックス・テクノロジー・アンド・インベストメント・コーポレーション 半導体パッケージの検査装置
US6915006B2 (en) * 1998-01-16 2005-07-05 Elwin M. Beaty Method and apparatus for three dimensional inspection of electronic components
US6549222B1 (en) * 2000-06-27 2003-04-15 Chipworks Lock-step cursors for feature alignment
JP2002181729A (ja) 2000-12-12 2002-06-26 Saki Corp:Kk 外観検査装置および外観検査方法
JP3594026B2 (ja) 2001-11-26 2004-11-24 オムロン株式会社 曲面体の表面状態検査方法および基板検査装置
US6840666B2 (en) * 2002-01-23 2005-01-11 Marena Systems Corporation Methods and systems employing infrared thermography for defect detection and analysis
JPWO2005001456A1 (ja) * 2003-06-30 2006-08-10 株式会社東京精密 パターン比較検査方法およびパターン比較検査装置

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2321778A2 (fr) * 2008-08-08 2011-05-18 Snap-on Incorporated Système de contrôle des stocks basé sur l image avec calibration automatique et correction d image
EP2321778A4 (fr) * 2008-08-08 2011-10-19 Snap On Tools Corp Système de contrôle des stocks basé sur l image avec calibration automatique et correction d image
TWI484159B (zh) * 2008-09-25 2015-05-11 Photon Dynamics Inc 用以測定或識別平板顯示器或其像素之位置的方法及裝置
CN101765361B (zh) * 2008-12-22 2011-12-21 株式会社日立高新技术仪器 电子部件安装方法及其装置
CN103348376A (zh) * 2011-03-15 2013-10-09 欧姆龙株式会社 图像处理装置及图像处理程序
EP2667350A1 (fr) * 2011-03-15 2013-11-27 Omron Corporation Dispositif et programme de traitement d'images
EP2667350A4 (fr) * 2011-03-15 2014-12-24 Omron Tateisi Electronics Co Dispositif et programme de traitement d'images
CN103348376B (zh) * 2011-03-15 2016-11-09 欧姆龙株式会社 图像处理装置
US9571795B2 (en) 2011-03-15 2017-02-14 Omron Corporation Image processing device and image processing program
EP2990758B1 (fr) * 2013-04-25 2018-09-19 Bridgestone Corporation Dispositif de contrôle
EP3589105A4 (fr) * 2017-02-23 2020-02-26 Fuji Corporation Dispositif de travail sur substrat et procédé de traitement d'image

Also Published As

Publication number Publication date
EP1638050B1 (fr) 2012-06-20
US7512260B2 (en) 2009-03-31
EP1638050A3 (fr) 2010-07-21
TWI279741B (en) 2007-04-21
TW200620161A (en) 2006-06-16
JP3918854B2 (ja) 2007-05-23
US20060050267A1 (en) 2006-03-09
JP2006099758A (ja) 2006-04-13

Similar Documents

Publication Publication Date Title
EP1638050B1 (fr) Méthode et appareil d'inspection de substrat
CN100533132C (zh) 基板检查方法及基板检查装置
JP2007184589A (ja) 基板検査方法および基板検査装置
EP1619494B1 (fr) Procédé et dispositif pour inspecter un substrat
EP1388738B1 (fr) Méthode de production de données d'inspection et appareil d'inspection de plaquettes utilisant ce procédé
JP4596029B2 (ja) はんだ付け検査方法、はんだ付け検査用の検査データ作成方法、およびはんだ付け検査装置
US7114249B2 (en) Substrate inspecting method and substrate inspecting apparatus using the method
EP1675067A2 (fr) Procédé de traitement d'image procédé d'inspection de substrat, appareil d'inspection de substrat et procédé de génération de données d'inspection substrat
US5822449A (en) Teaching method and system for mounted component inspection
US7356176B2 (en) Mounting-error inspecting method and substrate inspecting apparatus using the method
JP3906780B2 (ja) 部品コード変換テーブルに対するデータ登録方法、基板検査データの作成装置、登録処理用のプログラムおよびその記憶媒体
JP2007005358A (ja) 基板検査結果の分析支援方法、およびこの方法を用いた基板検査結果の分析支援装置ならびにプログラム
JP4470659B2 (ja) 部品検査用のモデル登録方法およびこの方法を用いた検査データ作成装置
JP2536127B2 (ja) 基板検査装置
JP4026636B2 (ja) 部品実装状態の検査方法およびその方法を用いた部品実装検査装置
JP3264020B2 (ja) 検査用データ作成方法および実装部品検査装置
JP3289070B2 (ja) 実装部品検査装置
JP3189308B2 (ja) はんだ付検査結果の表示方法およびその装置,はんだ付不良の修正方法,ならびにはんだ付検査装置
JPH03202757A (ja) 基板検査装置
JPH09145334A (ja) 実装部品検査方法およびその装置
JP2790557B2 (ja) 検査データ教示方法およびこの方法を用いた実装基板検査装置
JPH0526815A (ja) 実装部品検査用データの教示方法
JP7523840B1 (ja) プログラム、コンピュータ、検査システムおよび検査方法
JPH06258244A (ja) 実装部品検査用データの教示方法
JP3273378B2 (ja) 基板検査装置における特徴パラメータ決定装置

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK YU

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK YU

17P Request for examination filed

Effective date: 20110120

AKX Designation fees paid

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20110324

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 563412

Country of ref document: AT

Kind code of ref document: T

Effective date: 20120715

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602005034805

Country of ref document: DE

Effective date: 20120816

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20120620

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 563412

Country of ref document: AT

Kind code of ref document: T

Effective date: 20120620

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

Effective date: 20120620

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120921

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 602005034805

Country of ref document: DE

Representative=s name: KILIAN KILIAN & PARTNER, DE

Ref country code: DE

Ref legal event code: R082

Ref document number: 602005034805

Country of ref document: DE

Representative=s name: KILIAN KILIAN & PARTNER MBB PATENTANWAELTE, DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121020

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121022

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20120930

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121001

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

26N No opposition filed

Effective date: 20130321

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20120920

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20130531

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602005034805

Country of ref document: DE

Effective date: 20130321

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20120930

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20120930

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120920

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20120920

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20120905

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20121001

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120620

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20120905

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20050905

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20220621

Year of fee payment: 18

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602005034805

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20240403