EP4405668A1 - Inspektionssystem und verfahren zur fehleranalyse - Google Patents

Inspektionssystem und verfahren zur fehleranalyse

Info

Publication number
EP4405668A1
Authority
EP
European Patent Office
Prior art keywords
product
inspection system
image
light beam
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22790499.2A
Other languages
German (de)
English (en)
French (fr)
Inventor
Roman Wieser
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wickon Hightech GmbH
Original Assignee
Wickon Hightech GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wickon Hightech GmbH filed Critical Wickon Hightech GmbH
Publication of EP4405668A1
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/06 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness, for measuring thickness, e.g. of sheet material
    • G01B11/0608 Height gauges
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8806 Specially adapted optical and illumination features
    • G01N2021/8845 Multiple wavelengths of illumination or detection
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing based on image processing techniques
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/9501 Semiconductor wafers
    • G01N21/9515 Objects of complex shape, e.g. examined with use of a surface follower device
    • G01N21/956 Inspecting patterns on the surface of objects
    • G01N2021/95638 Inspecting patterns on the surface of objects for PCB's
    • G01N2201/00 Features of devices classified in G01N21/00
    • G01N2201/12 Circuits of general importance; Signal processing
    • G01N2201/127 Calibration; base line adjustment; drift compensation

Definitions

  • The invention relates to an inspection system and a method for error analysis of a product, in particular a printed circuit board product, a semiconductor wafer or the like, the inspection system comprising a projection device, an optical detection device and a processing device. The projection device has at least one spectrometer device by means of which white light can be divided into its spectral components, so that a multichromatic light beam formed from monochromatic light beams can be projected onto a product at an angle of incidence β. The optical detection device has a detection unit with an area camera and a lens, wherein a multichromatic light beam reflected on the product can be detected by means of the area camera in a detection plane of the detection unit that runs transversely, preferably orthogonally, to a product surface of the product.
  • a height profile for example of a printed circuit board or of components arranged on a printed circuit board, can be determined using a spectroscopic method.
  • the product to be checked can be a so-called PCB (printed circuit board) or a photovoltaic cell.
  • PCB's can also be formed, among other things, by printing a printed circuit board substrate with a conductive paste to form conductor tracks or electronic components, such as resistors.
  • Various materials such as copper, gold, aluminum, titanium, etc. can be applied to the substrate.
  • the printed circuit board product can also be equipped with wired components or also so-called SMD components, coated with solder resist and soldered as part of the respective manufacturing process, for example by flow soldering.
  • a number of errors can occur, such as an incompletely printed conductor track, a faulty solder joint or a missing component, which can lead to the printed circuit board product not being able to function. It is therefore regularly necessary to analyze the printed circuit board product for any faults during its manufacture.
  • Such an error analysis is regularly carried out using an inspection system which has a camera for recording images of the printed circuit board product.
  • The camera can be a so-called line scan camera with optical sensors arranged in a single line or row, a camera with three rows or lines each carrying different sensors (RGB sensors), or an area camera with more than three rows or lines.
  • A disadvantage of the known inspection systems is that only products up to a certain size can be analyzed with sufficient image quality, since the line or row length of the optical sensors of the known cameras is limited for manufacturing reasons. For error analysis, it may therefore be necessary to scan a product in several passes in order to capture the entire product surface or its relevant sections.
  • EP 2 307 852 B1 discloses an inspection system and a method for fault analysis of PCBs with which a surface of the PCB can be measured optically. A polychromatic light beam from a white light source is passed through a prism to create a multichromatic light beam that is projected at an angle of incidence onto the printed circuit board product. Depending on the height of a surface of the printed circuit board product, the surface appears in that spectral color of the multichromatic light beam which is directed at this height. A monochromatic light beam reflected in this way by the surface in question is detected by means of an optical detection device, in particular a line scan camera or a line of an area camera. A processing device, or a device for data processing, can then calculate height information for the relevant surface of the printed circuit board product from the set angle of incidence of the multichromatic light beam relative to the detection plane and from the position of the printed circuit board product relative to the line scan camera.
  • the disadvantage of the inspection system or method described above is that a surface of the PCB cannot always be reliably detected. Depending on the degree of reflection or the color of the product surface, there may be deviations when the relevant light beam is reflected on the product surface.
  • several images of the printed circuit board product or the PCB are therefore regularly required, or the product is illuminated with different colored light or the RGB channels that appear most suitable for an error analysis are processed further by means of the processing device.
  • different lighting devices with respectively assigned detection units are regularly required, with these then being able to scan the product surface sequentially or in separate passes.
  • the error analysis itself and the structural design of the inspection system become complex as a result.
  • Another disadvantage is that the height information obtained for the relevant surface can be imprecise. Depending on the height of the surface, a blurred image is formed in an image plane of the optical detection device, i.e. on the line scan camera or the area scan camera. This makes the height measurement less accurate, and this blurriness cannot be compensated for by using a camera chip with a higher resolution.
  • the present invention is based on the object of proposing an inspection system and a method for fault analysis of a product, which enables reliable fault analysis using simple means. This object is achieved by an inspection system having the features of claim 1 and a method having the features of claim 13.
  • The inspection system for error analysis of a product, in particular a printed circuit board product, a semiconductor wafer or the like, comprises a projection device, an optical detection device and a processing device. The projection device has at least one spectrometer device by means of which white light can be divided into its spectral components, so that a multichromatic light beam formed from monochromatic light beams can be projected onto a product at an angle of incidence β. The optical detection device has a detection unit with an area camera and a lens, wherein a multichromatic light beam reflected on the product can be detected by means of the area camera in a detection plane of the detection unit running transversely, preferably orthogonally, to a product surface of the product. The detection unit has a dispersive or diffractive element by means of which the reflected multichromatic light beam can be projected onto an image plane of the area camera, and height information of the product surface can be derived by means of the processing device from a spatial distribution of saturation values of the reflected multichromatic light beam in the image plane.
  • Because the multichromatic light beam is aligned on the detection plane at the angle of incidence β, a reflection image of the multichromatic light beam on the product surface shifts relative to an optical axis of the lens or to the detection plane, its wavelength varying with the height in the detection plane of the detection unit or area scan camera.
  • the reflected image in question is also imaged offset in the image plane of the area camera relative to the optical axis or detection plane, depending on the wavelength or height.
  • Depending on the wavelength of the monochromatic light beam that represents the height information of an object surface, a longitudinal offset can occur, which can lead to a blurred image in the image plane for this light beam.
  • the correction device is designed to optically correct this longitudinal chromatic aberration of the multichromatic light beam or of the relevant monochromatic components for the image plane. It is thus possible to always obtain an essentially sharp image in the image plane, and thus clearer and more accurate height information, independently of height information or a wavelength of the relevant monochromatic light beam that is projected onto the image plane. A resolution of a height can be significantly improved in this way.
  • line images can be recorded by at least two sensor lines of the area camera with above-average saturation values.
  • a displacement of the reflection image in the image plane is then determined in that a spatial distribution or an imaging width of saturation values is evaluated by the processing device.
  • the location of the shift results from the spatial distribution or imaging range of the saturation values, since the sensor rows of the area scan camera with the above-average saturation values represent this. It is therefore sufficient to use the processing device alone to determine two sensor lines with above-average saturation values and to capture the line images imaged on these sensor lines in the image plane. This not only makes it possible to determine the height or a topography, but line images of the product surface can be recorded at the same time.
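  • As an illustration of this evaluation, the following minimal sketch (not part of the patent; the function name, array layout and use of NumPy are assumptions) determines the sensor rows with above-average saturation and the saturation-weighted position of the reflection image for one area-camera frame.

```python
import numpy as np

def locate_reflection_rows(saturation: np.ndarray):
    """saturation: 2D array (sensor rows x pixels) of saturation values of one frame.

    Returns the indices of the rows with above-average mean saturation and the
    saturation-weighted centroid row, i.e. the position of the reflection image
    in the image plane from which the height information is derived."""
    row_saturation = saturation.mean(axis=1)              # mean saturation per sensor row
    bright_rows = np.flatnonzero(row_saturation > row_saturation.mean())
    centroid_row = float(np.average(bright_rows, weights=row_saturation[bright_rows]))
    return bright_rows, centroid_row
```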
  • Due to the dispersive or diffractive element, which is arranged directly in front of the area camera or between the area camera and the lens, line images are obtained in different color gradations.
  • The image plane of the area camera can be inclined by an angle γ ≠ 0° relative to a main plane of the lens such that the image plane is adapted to the longitudinal chromatic aberration of the reflected multichromatic light beam.
  • the correction device can consequently already be designed in that the image plane of the area camera is inclined relative to the main plane of the lens.
  • the image plane and the main plane are then no longer arranged parallel relative to one another.
  • the image plane is then also inclined relative to an optical axis of the detection device or the lens. Due to this inclination, the longitudinal chromatic aberration that otherwise occurs can be corrected in such a way that a comparatively sharp image is always formed, regardless of the wavelength of the respective monochromatic light beams that are projected onto the image plane.
  • the correction device can be formed by a further optical element, which is arranged in the beam path between the lens and the area camera.
  • a further optical element can be an appropriately adapted achromatic lens, a prism or the like.
  • An analysis image of the product can be derived from a plurality of line images by means of the processing device.
  • the recorded line images can then be combined within the scope of a scanning process by the processing device to form an analysis image of the product surface.
  • height information and, at the same time, a high-quality analysis image can be obtained in a single scan of the product surface.
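  • A possible way of assembling such an analysis image during a scanning pass is sketched below; the helper names and the NumPy-based layout are illustrative only, not taken from the patent.

```python
import numpy as np

def scan_to_analysis_image(frames, reduce_frame):
    """frames: iterable of area-camera frames, one per motion step of the product.
    reduce_frame: function that turns a frame into a single line image, e.g. by
    superimposing the sensor rows with the highest saturation values.
    The row index of the result corresponds to the scan position."""
    return np.stack([reduce_frame(frame) for frame in frames], axis=0)
```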
  • A line image can be imaged from an object plane of the product surface onto an image plane of the area camera by means of the lens, and the area camera can be arranged transversely, preferably orthogonally, to a direction of movement of a product.
  • the area camera can have a rectangular sensor surface, as a result of which a detection width, based on a product surface, can be optimally utilized.
  • the lens then serves to image a line image in the image plane of the area camera, which corresponds to a multichromatic light beam that is reflected on the product and can be detected by the lens.
  • A beam path is divergent, or the imaging in the image plane occurs, due to the refraction of the dispersive element in accordance with the spectrum of the spectrometer device, so that the multichromatic light beam reflected on the product is imaged with a spectral distribution on the image plane.
  • The area camera can be formed by an RGB chip or a grayscale chip which has 32 to 128 sensor lines, preferably 32 to 64 sensor lines, running transversely, preferably orthogonally, to a direction of movement of a product.
  • the area scan camera can have 1024 pixels x 36 sensor lines, 2048 pixels x 72 sensor lines, 4096 pixels x 128 sensor lines, 16,384 pixels x 256 sensor lines or more.
  • a higher resolution can be achieved with a gray scale chip, since all sensor lines of the gray scale chip can be used independently of a wavelength of the light.
  • the projection device can emit light in the red, green, blue (RGB), infrared (IR) and/or ultraviolet (UV) wavelength ranges, preferably in a wavelength range of 400 to 700 nm, and the area camera can capture this light.
  • the dispersive or diffractive element can be a prism or a diffraction grating.
  • the prism can be a dispersion prism in the form of an isosceles triangle, for example.
  • the lens can be a telecentric lens with an image-side aperture. The beam path of the objective can then run parallel within the objective, at least in sections.
  • the diaphragm or aperture diaphragm can be arranged in an object-side focal plane of the lens. Consequently, the lens can also be a partially telecentric lens.
  • the aperture can be a slit aperture. Due to the fact that an area camera is used, it is already sufficient if the diaphragm is a slit diaphragm that is arranged or aligned in the direction of the longitudinal axis of the area camera.
  • the lens can have a front lens, which is designed as a converging lens and/or as an achromatic lens.
  • the achromatic lens can be composed of at least two lenses, the lenses of the achromatic lens being glued to one another or simply placed one inside the other to form a very thin air gap.
  • the air gap between the lenses can be up to 100 ⁇ m, for example. This makes it possible to form an achromatic lens in a particularly cost-effective manner and to dispense with gluing the lenses. By using an achromat, it is possible to correct color errors or distortion errors.
  • the front lens can be a cylindrical or spherical lens. It may be sufficient to use a cylindrical lens alone as the front lens, since only a light slit in the image plane or the line image needs to be imaged. By using the cylindrical lens, it becomes possible to save costs in manufacturing the inspection system.
  • the front lens can be in the form of a segment of a circle with two coaxial and parallel boundary surfaces.
  • The edges of the front lens can be cut off so that a rod-shaped lens containing a main axis of the cylindrical lens remains.
  • boundary surfaces on the outer ends of the front lens can also be formed parallel, in which case the front lens can then only approximately correspond to a segment of a circle.
  • A circular segment of the front lens can have a depth of up to 20 mm, preferably 10 mm, while the front lens can have a width of >250 mm, preferably 180 mm.
  • the width of the front lens can also be understood to mean a diameter of the front lens.
  • the depth of the front lens can correspond to a relative distance between the parallel boundary surfaces.
  • The optical detection device can have at least a second detection unit with a second area camera, a second dispersive or diffractive element and a second lens. The area cameras can be arranged in alignment relative to one another in the direction of their longitudinal axes and can be spaced apart in such a way that the respective line images of the area cameras, which can be detected in a common detection plane of the object plane, overlap in sections. The processing device can put together or combine the overlapping line images to form a combination line image, and an analysis image of the product can be derived from a plurality of combination line images by means of the processing device.
  • the second detection unit of the optical detection device can be arranged with the second area camera and the second objective in such a way that the area cameras are positioned in a row relative to their longitudinal axes and thus transversely to the direction of movement.
  • the area cameras can be arranged at a distance relative to each other, but adjacent.
  • The line images in the object plane are comparatively larger than the line images in the image plane of the area camera, so that it is possible to arrange the detection units or area cameras in such a way that the respective line images of the area cameras overlap in sections at their longitudinal ends, i.e. matching sections (b) of the product surface can each be detected by the area cameras.
  • the processing device can be set up in such a way that it can assemble or combine these overlapping line images into a combination line image.
  • the combination line image can have a detection width (E) in the object plane that is larger than a detectable width (B) of the line image in the object plane. Consequently, the respective line images do not overlap completely, but only in an overlapping section (b).
  • The overlapping section may be 1 mm to 3 mm.
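  • Purely as a worked example under assumed values (the widths below are not taken from the patent), the resulting detection width of several aligned detection units can be estimated as follows.

```python
def combined_detection_width(n_cameras: int, width_b_mm: float, overlap_b_mm: float) -> float:
    """Detection width E of n aligned cameras whose line images of detectable
    width B overlap pairwise by the overlapping section b."""
    return n_cameras * width_b_mm - (n_cameras - 1) * overlap_b_mm

# Illustrative values only: two cameras, 100 mm detectable width each, 2 mm overlap
print(combined_detection_width(2, 100.0, 2.0))  # -> 198.0 (mm)
```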
  • a detectable width (B) of the line image in the object plane can be greater than a diameter (D) of the lens or a length (L) of the area camera. This makes it possible for the first time to arrange area cameras next to one another with aligned longitudinal axes and still obtain an undivided combination line image.
  • the diameter of the objective means a maximum external dimension of the objective and the length of the area camera means a maximum external length of the area camera.
  • An optical axis of the lens can be arranged orthogonally relative to the object plane and transversely to the direction of movement, with a convergent beam path of the lens being able to run at an angle of more than 80° and less than 90° relative to the object plane.
  • the convergent beam path of the objective then makes it possible to bring the line images of the object plane into overlap in sections, and at the same time to arrange the detection units next to one another, possibly spaced apart with an intermediate space.
  • the optical axes of the respective lenses can then be arranged in parallel in the detection plane. It is particularly advantageous if the line images overlap by 1 mm to 3 mm.
  • the processing device can then be set up in such a way that the line images are assembled into the combination line image using matching pixels in the overlapping areas.
  • the addition of the line images in the overlapping area makes it possible to achieve a particularly good imaging quality in the overlapping area.
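  • A minimal sketch of such a combination step is given below, assuming grayscale line images as NumPy arrays; estimating the overlap by comparing matching pixels and then averaging them is one possible approach, not necessarily the one used by the processing device.

```python
import numpy as np

def combine_line_images(left: np.ndarray, right: np.ndarray, max_overlap: int) -> np.ndarray:
    """Join two line images whose facing ends cover the same product section.

    The number of overlapping pixels is estimated by comparing the tail of the
    left image with the head of the right image; within the overlap the two
    images are averaged, which improves the imaging quality there."""
    l, r = left.astype(np.float64), right.astype(np.float64)
    best_n, best_err = 1, np.inf
    for n in range(1, max_overlap + 1):
        err = np.mean((l[-n:] - r[:n]) ** 2)        # mismatch for an overlap of n pixels
        if err < best_err:
            best_n, best_err = n, err
    blended = (l[-best_n:] + r[:best_n]) / 2.0      # average the matching pixels
    return np.concatenate([l[:-best_n], blended, r[best_n:]])
```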
  • The lens can then also have aberrations in the edge area of its detectable width, which are then tolerable.
  • the projection device can have an illumination device, wherein diffuse light can be projected onto the product by means of the illumination device, wherein light of the diffuse light reflected on the product in the detection plane can be detected by means of the area camera.
  • the lighting device can have a diffuser, by means of which a homogeneous distribution of the light on the product surface can be achieved while at the same time avoiding strong contrasts.
  • the lighting device can emit light in the wavelength ranges red, green and blue (RGB), infrared (IR) and/or ultraviolet (UV). Provision can also be made to select any color component of the light in order to mix specific wavelength ranges.
  • the lighting device can each be formed from a number of light-emitting diodes (LED) in a series arrangement or a matrix arrangement. It can further be provided that the lighting device has a polarization filter.
  • the projection device can have a second illumination device, it being possible for the first and the second illumination device to be arranged coaxially relative to the optical detection device.
  • the lighting devices can also be arranged transversely or orthogonally to the direction of movement of the product. This avoids the possible formation of shadows on the product surface.
  • the structure of the second lighting device can correspond to that of the first lighting device.
  • line images can be recorded simultaneously from at least three, preferably five, sensor lines of the area camera with the highest saturation values. For example, it is then also possible to superimpose at least two of the sensor lines or their line images on one another in order to obtain a quality-optimized, individual line image. At the same time, the amount of data to be processed will then also be small, which enables rapid processing of the image data and thus rapid scanning of the product.
  • the superimposition of the line images makes it possible to capture only one line image of the reflected multichromatic light beam with the area camera.
  • the area scan camera is therefore used at the same time to determine the topography and to select line images that can be used easily.
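  • The superimposition of the strongest sensor rows into a single quality-optimized line image could look as follows; this is an illustrative sketch only, and such a function could also serve as the reduce_frame helper in the scanning sketch above.

```python
import numpy as np

def superimpose_rows(frame: np.ndarray, n_rows: int = 5) -> np.ndarray:
    """frame: 2D array (sensor rows x pixels) of one area-camera exposure.
    Averages the n_rows sensor rows with the highest mean saturation into a
    single, quality-optimized line image."""
    row_strength = frame.mean(axis=1)
    best = np.argsort(row_strength)[-n_rows:]     # indices of the strongest rows
    return frame[best].mean(axis=0)
```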
  • Line images can be recorded simultaneously in the red, green and blue (RGB) wavelength ranges.
  • the processing device can also be used to analyze the image information or RGB color information separately according to color value (hue), color saturation (saturation) and light value (value) in a color space (HSV, HSL, HSB).
  • the image plane or an image in the image plane of the area camera can therefore be analyzed by the processing device with regard to hue, brightness and/or saturation.
  • the image information can be used to analyze material type and distribution, as different materials have different H, S and V values.
  • a color space can be selected depending on the materials to be analyzed, with an RGB color space serving as the basis.
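  • A simple material classification in HSV space could be sketched as follows; the threshold ranges are placeholders that would have to be calibrated for the materials actually present (copper, gold, solder resist and so on) and are not taken from the patent.

```python
import colorsys

def classify_material(rgb_pixel, material_ranges):
    """Assign a pixel to a material class by its position in HSV space.

    rgb_pixel: (r, g, b) with components in 0..1.
    material_ranges: dict mapping a material name to (h_min, h_max, s_min, v_min);
    the ranges used here are illustrative placeholders, not calibrated values."""
    h, s, v = colorsys.rgb_to_hsv(*rgb_pixel)
    for name, (h_min, h_max, s_min, v_min) in material_ranges.items():
        if h_min <= h <= h_max and s >= s_min and v >= v_min:
            return name
    return "unknown"

# Illustrative, uncalibrated ranges:
ranges = {"copper": (0.02, 0.10, 0.3, 0.3), "solder_resist": (0.25, 0.45, 0.3, 0.2)}
print(classify_material((0.72, 0.45, 0.20), ranges))  # -> "copper" with these placeholder values
```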
  • a material, a material property and/or a geometric structure of the product can be determined from the analysis image and/or the analysis image can be compared with a reference image. Further material-structure information of a product can be obtained if line images of the analysis image are superimposed and evaluated by means of the processing device. In this case, for example, two line images of a matching product area can then be combined by means of image processing of the processing device.
  • the spectrometer device can also be used to illuminate in a sequence. Firstly, a first analysis image is recorded with only illumination in one wavelength range and then a second analysis image is recorded solely with illumination in a different wavelength range of a matching product surface. Finally, the analysis images are combined using image processing.
  • the processing device can be used to compare analysis image information or analysis images with reference images.
  • The reference images of the product may include CAD data and material distribution data of the product. The comparison can be done by image processing, in which case different structures of the product can be analyzed separately from each other. This material information can also be combined with height information, for example to clearly identify a conductor track.
  • the reference image information can therefore include all of the product's geometric data, material information, component information and also height information. If the analysis image information differs from the reference image information, an error can be signaled.
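  • A minimal comparison against reference image information might look like the following sketch; the tolerance value and the use of a mean absolute deviation are assumptions made for illustration.

```python
import numpy as np

def check_against_reference(analysis: np.ndarray, reference: np.ndarray,
                            tolerance: float = 0.05) -> bool:
    """analysis, reference: arrays of identical shape (e.g. height maps or
    grayscale analysis images, normalised to 0..1). The tolerance is an
    illustrative threshold on the mean absolute deviation."""
    deviation = np.mean(np.abs(analysis - reference))
    return deviation > tolerance  # True means an error should be signalled
```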
  • At least two or more line images of a matching product surface can be combined by image processing of the processing device, for example into an HDR image (High Dynamic Range Image).
  • A graded series of exposures can be achieved by means of adapted illumination using the spectrometer device and/or a correspondingly different readout of the sensor rows of the area camera using the processing device.
  • A 12-bit analysis image of the area scan camera, in which a plurality of 8-bit brightness levels are integrated, can be read out using the processing device.
  • a color locus of a color space of a pixel of the relevant analysis image can be determined even more precisely.
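  • One possible fusion of such a graded exposure series is sketched below; the hat-shaped weighting is a common exposure-fusion heuristic and is an assumption for illustration, not the patent's method.

```python
import numpy as np

def merge_exposures(images, exposure_factors):
    """Fuse a graded series of 8-bit images into one higher-dynamic-range image.

    images: list of uint8 arrays of identical shape, taken with different
    effective exposures; exposure_factors: relative exposure of each image.
    Saturated and very dark pixels get low weight, so each pixel is taken
    mainly from the exposure in which it is well resolved."""
    acc = np.zeros(images[0].shape, dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, factor in zip(images, exposure_factors):
        f = img.astype(np.float64)
        weight = 1.0 - np.abs(f / 255.0 - 0.5) * 2.0   # hat weight, peaks at mid-grey
        acc += weight * (f / factor)                   # radiance estimate per exposure
        weight_sum += weight
    return acc / np.maximum(weight_sum, 1e-9)
```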
  • the figure shows a simplified schematic representation of an embodiment of an inspection system 26 in a side view.
  • the inspection system 26 has an area camera 27 .
  • the projection device 32 has a light source 33 which emits white light, a diaphragm 34 and a further dispersive element 35 .
  • the further dispersive element 35 is designed as a further prism 36, by means of which a multichromatic light beam 37 is projected onto a product surface 38 transversely to a direction of movement, indicated by an arrow 39, of the product, not shown in detail here.
  • the objective 30 comprises a lens arrangement 40, which is shown schematically here, a front lens 41 and a diaphragm 42 arranged on the image side.
  • the diaphragm 42 is designed in particular as a slit diaphragm 43.
  • the front lens 41 is designed in the form of a segment of a circle and has two boundary surfaces 45 which are parallel and arranged coaxially with respect to an optical axis 44 .
  • the optical axis 44 runs through a detection plane 46 of the detection unit 29, the detection plane 46 being arranged orthogonally relative to the product surface 38, which corresponds to an object plane 47.
  • The multichromatic light beam 37 thus falls at an angle β relative to the detection plane 46 onto the product surface 38 or the object plane 47 and is reflected from there into the lens 30.
  • the dispersive element 31 which is designed as a prism 48 , is arranged between the lens 30 and the area camera 27 .
  • the prism 48 disperses the light emerging from the lens 30 and projects it onto the area camera 27 or its image plane 49.
  • a correction device is also provided here, by means of which a longitudinal chromatic aberration of the reflected multichromatic light beam in the image plane 49 is corrected .
  • The correction device can already be formed in that the image plane 49 is pivoted relative to a main plane 50 of the lens 30, i.e. is not parallel to it. An angle γ ≠ 0° is then formed between the image plane 49 and the main plane 50.
  • the inclination of the image plane 49 is selected in such a way that the light dispersed by the prism 48 is imaged sufficiently sharply in the image plane 49 independently of a wavelength.
  • A further optical element, which is not shown here, can also be arranged on the optical axis 44, alone or in addition, between the lens 30 and the image plane 49, and correspondingly corrects the longitudinal chromatic aberration.
  • Height information of the product surface 38 relative to the area camera 27 is derived by means of a processing device (not shown here).
  • sensor rows of the area camera 27 (not shown in more detail here), which run parallel to the detection plane 46, are evaluated, with five sensor rows with the highest or maximum saturation values being detected.
  • The height information can then be calculated from a position of the sensor rows relative to the detection plane 46 and the angle of incidence β.
  • This height information is always consistently accurate, regardless of the height of the product surface 38 , since the accuracy is independent of a wavelength of the light projected onto the detection plane 46 due to the use of the correction device.
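  • The conversion from the detected sensor rows to a height value can be illustrated with the following sketch, which assumes a simplified triangulation geometry (lateral shift equal to height times tan β) and illustrative parameter names; it is not taken literally from the patent.

```python
import math

def height_from_row_offset(row_offset_px: float, pixel_pitch_mm: float,
                           magnification: float, beta_deg: float) -> float:
    """row_offset_px: offset of the detected sensor rows from the reference row
    that corresponds to height zero; pixel_pitch_mm and magnification map this
    offset back into the detection plane. With the multichromatic beam hitting
    the surface at the angle of incidence beta, measured against the detection
    plane, a height change h shifts the reflection point laterally by
    h * tan(beta), so h = lateral_shift / tan(beta)."""
    lateral_shift_mm = row_offset_px * pixel_pitch_mm / magnification
    return lateral_shift_mm / math.tan(math.radians(beta_deg))
```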
  • the processing device is used to superimpose the sensor lines or their line images. These line images are in turn combined to form an analysis image of the product.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)
EP22790499.2A 2021-09-22 2022-09-21 Inspektionssystem und verfahren zur fehleranalyse Pending EP4405668A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021124504.4A DE102021124504A1 (de) 2021-09-22 2021-09-22 Inspektionssystem und Verfahren zur Fehleranalyse
PCT/EP2022/076257 WO2023046769A1 (de) 2021-09-22 2022-09-21 Inspektionssystem und verfahren zur fehleranalyse

Publications (1)

Publication Number Publication Date
EP4405668A1 true EP4405668A1 (de) 2024-07-31

Family

ID=83898386

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22790499.2A Pending EP4405668A1 (de) 2021-09-22 2022-09-21 Inspektionssystem und verfahren zur fehleranalyse

Country Status (6)

Country Link
EP (1) EP4405668A1 (de)
JP (1) JP2024533868A (ja)
KR (1) KR20240060654A (ko)
DE (1) DE102021124504A1 (de)
MX (1) MX2024003487A (es)
WO (1) WO2023046769A1 (de)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6573998B2 (en) * 1997-11-06 2003-06-03 Cynovad, Inc. Optoelectronic system using spatiochromatic triangulation
KR100406843B1 (ko) * 2001-04-06 2003-11-21 (주) 인텍플러스 색정보를 이용한 실시간 3차원 표면형상 측정방법 및 장치
JP5417723B2 (ja) 2008-03-18 2014-02-19 株式会社豊田中央研究所 方位測定方法及び方位測定装置
CZ2009133A3 (cs) 2009-03-03 2009-07-08 Witrins S.R.O. Zarízení, zpusob merení vnejších rozmeru testovaného výrobku a použití tohoto zarízení
JP5701837B2 (ja) * 2012-10-12 2015-04-15 横河電機株式会社 変位センサ、変位測定方法
CN207556477U (zh) * 2017-12-20 2018-06-29 北京卓立汉光仪器有限公司 一种表面形貌测量装置

Also Published As

Publication number Publication date
JP2024533868A (ja) 2024-09-12
WO2023046769A1 (de) 2023-03-30
MX2024003487A (es) 2024-06-11
DE102021124504A1 (de) 2023-03-23
KR20240060654A (ko) 2024-05-08

Similar Documents

Publication Publication Date Title
EP2801816B1 (de) Verfahren und Vorrichtung zur optischen Analyse eines PCBs
EP1314972B1 (de) Spektralphotometer und Verwendung desselben
EP3104117B1 (de) Verfahren zur fehleranalyse von drahtverbindungen
DE102004029012B4 (de) Verfahren zur Inspektion eines Wafers
EP2409125B1 (de) Verfahren und vorrichtung zur durchführung eines optischen vergleiches zwischen zumindest zwei mustern, vorzugsweise durch vergleich von auswählbaren ausschnitten
DE102011086417B4 (de) Verfahren zum Erkennen eines Brückenverbindungsfehlers
DE102006036504A1 (de) Vorrichtung und Verfahren zur Messung des Höhenprofils eines strukturierten Substrats
DE202016008950U1 (de) Dreidimensionale Formmessvorrichtung
DE102014115650B4 (de) Inspektionssystem und Verfahren zur Fehleranalyse
WO1998012543A1 (de) Verfahren und anordnung zur automatischen optischen qualitätskontrolle von flachen, ebenen produkten
DE102021124507B4 (de) Inspektionssystem und Verfahren zur Fehleranalyse
AT406528B (de) Verfahren und einrichtung zur feststellung, insbesondere zur visualisierung, von fehlern auf der oberfläche von gegenständen
EP4405668A1 (de) Inspektionssystem und verfahren zur fehleranalyse
EP4405669A1 (de) Inspektionssystem und verfahren zur fehleranalyse
WO2024120821A1 (de) Verfahren zur fehleranalyse und inspektionssystem
EP4400805A1 (de) Inspektionssystem und verfahren zur fehleranalyse
WO2024121216A1 (de) Vorrichtung und verfahren zum bestimmen einer abbildungsqualität zumindest einer abbildung für einen prüfling
DE112020006630T5 (de) Vorrichtung zur messung dreidimensionaler form, verfahren zur messung dreidimensionaler form und programm
DE112020006612T5 (de) Vorrichtung zur messung dreidimensionaler form, verfahren zur messung dreidimensionaler form und programm
EP1538430A1 (de) Farbsensor
AT15969U1 (de) Vorrichtung zum optischen Analysieren und Sortieren von Objekten
EP1078250A1 (de) Verfahren und vorrichtung zur automatisierten erfassung und prüfung von geometrischen und/oder texturellen merkmalen eines objektes
WO2005088271A1 (de) Abbildendes ellipsometer mit synchronisiertem probenvorschub und ellipsometrisches messverfahren

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240319

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR