WO2015004672A1 - A method and apparatus for inspection and quality assurance of material samples using qualified user definitions and data derived from images in a controlled environment - Google Patents

A method and apparatus for inspection and quality assurance of material samples using qualified user definitions and data derived from images in a controlled environment

Info

Publication number
WO2015004672A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
sample
image
master
classification
Prior art date
Application number
PCT/IL2014/050628
Other languages
French (fr)
Inventor
Igal Loevsky
Original Assignee
Igal Loevsky
Priority date
Filing date
Publication date
Application filed by Igal Loevsky
Publication of WO2015004672A1

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01N — INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 — Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 — Systems specially adapted for particular applications
    • G01N21/85 — Investigating moving fluids or granular solids
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01J — MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 — Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46 — Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/50 — Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors
    • G01J3/501 — Colorimeters using spectrally-selective light sources, e.g. LEDs
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01J — MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 — Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46 — Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/52 — Measurement of colour; Colour measuring devices, e.g. colorimeters using colour charts
    • G01J3/524 — Calibration of colorimeters
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01N — INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 — Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 — Systems specially adapted for particular applications
    • G01N21/88 — Investigating the presence of flaws or contamination
    • G01N21/8806 — Specially adapted optical and illumination features
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/10 — Image acquisition
    • G06V10/12 — Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 — Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 — Details of sensors, e.g. sensor lenses
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01J — MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 — Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46 — Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/50 — Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors
    • G01J2003/503 — Densitometric colour measurements
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01N — INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00 — Features of devices classified in G01N21/00
    • G01N2201/06 — Illumination; Optics
    • G01N2201/065 — Integrating spheres
    • G01N2201/0655 — Hemispheres
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 — Indexing scheme relating to image or video recognition or understanding
    • G06V2201/06 — Recognition of objects for industrial automation

Definitions

  • the present invention relates to the field of sample inspection.
  • the invention comprises a method for automatically checking samples in terms of correctness of color, shape, size, and other parameters using image processing technology. 'Correctness' here is defined in comparison to an instance called the Master which is used to define the desired characteristics.
  • the invention comprises a combination of hardware and data processing methods adapted to produce measurements, 'go/no-go' assessments, and discrimination into classes (such as 'fine', 'medium' and 'coarse' or 'low quality', 'medium quality', and 'high quality') of samples of materials such as: grains, seeds, flours, processed materials, spices, mechanical parts, printed circuits, pests, mites and the like.
  • the hardware comprises a closed box having a reflection chamber, described in provisional patent application US61/844956 and PCT/IL2014/050337 which are herein incorporated by reference.
  • This device is referred to in the figures to follow, and the data processing methods employed use routines described by the figures as well.
  • the invention provides a way to allow direct comparison of sample inspection results obtained from different machines having inevitable differences in lighting, optics, and the like.
  • Fig. 1 shows a flowchart for the material check routine performed by the user.
  • Fig. 2 shows a flowchart describing one possible embodiment of a method for configuring the system to produce a Master, against which subsequent samples under test will be compared.
  • Fig. 3 shows a flowchart for the possible main steps in the data processing (Preprocessing, Features Generation and Classification & Report), and the points in the data processing that allow convenient reprocessing of the calculation if the Master settings were changed by the expert user.
  • Fig. 4 shows a histogram indicating Normal and Exceptional values configuration by the user, using a PDF representation.
  • Fig. 5 shows a Cumulative Distribution Function (CDF) indicating Normal and Exceptional values configuration by the user.
  • Fig. 6 shows, among the rest, the use of Top Lighting, Specular Background, and Illumination Feedback for automatic gain correction.
  • Fig. 7 shows the components of the Master.
  • Fig. 8 shows a flowchart of Master configuration by the user denoted the "Master Configuration Flowchart”.
  • Fig. 9 shows applying vibration for spatial diffusion of particles in a sample.
  • Fig. 10 shows an embodiment of the device having light sources aimed through shadow masks having line patterns adapted to illuminate samples with lines for the purpose of 3D reconstruction using structured light.
  • Fig. 11 shows an alternative embodiment for projection of structured light using a mini projector.
  • Fig. 12 shows a definition of a spatiogram variant optimized for grains and particles.
  • Fig. 13 shows results of a sample analysis using a first instance of the device.
  • Fig. 14 shows results of a sample analysis using a second instance of the device at a different location having different ambient light and other environmental conditions.
  • Fig. 15 shows a typical Color Checker. Each rectangle of different color has its own standard (R,G,B) and (L*,a*,b*) values.
  • Fig. 16 shows an image from the device of a spice sample containing stems, and the foreground mask of this sample
  • Fig. 17 shows an illustration of the principle of association between pests and their motions on the image
  • 'Master' refers hereinafter to the definitions of a Positive sample of a certain kind, consisting of reference images and other data items which are stored in the device memory, and used by the device software when the automatic check is performed.
  • the data items include exposure settings, image correction calibration settings in terms of brightness normalization and gain, image processing parameters, vibration parameters, tolerances in terms of color, shape and size, user definitions, and more.
  • 'Positive sample' or 'positives' hereinafter refers to a sample acceptable by the user (pass).
  • the term 'classification' hereinafter refers to categorization of a sample or set of samples into a number of discrete categories, such as pass/fail, fine/medium/coarse, well-formed/deformed, and the like.
  • 'Particles sample' or 'particulate sample' hereinafter refers to a sample that consists of particles, which are small, separate pieces of material such as grains or screws, where the inspection of each particle is important.
  • 'Bulk sample' refers to a sample of a bulk of material. The inspection of the bulk properties is important, but each of the bulk's particles standing alone is not important for the inspection; examples include spices and flours.
  • the term 'Powder sample' hereinafter refers to a sample of a powder, usually having a homogeneous color.
  • 'Panel sample' hereinafter refers to a nearly flat panel such as a printed electronic circuit or a keyboard.
  • 'imaging means' hereinafter refers to any means for capturing an image, such as a camera having a lens, and especially digital imaging means such as a CMOS detector with lens and focusing apparatus adapted for connection to a computer.
  • the invention comprises methods for automatically analyzing samples. This analysis is performed in terms of categories of color, shape, size, and other parameters using image processing technology. The categories are defined in comparison to an instance called the Master sample which is used to define the desired characteristics for each category.
  • the methods are adapted to measure sample parameters, produce 'go/no-go' assessments of samples as well as discrimination into classes of samples of materials such as: grains, seeds, flours, processed materials, spices, mechanical parts, printed circuits, and the like.
  • the hardware of the invention comprises a closed box having a hemispherical reflection chamber, described in provisional patent application US 61/844956 and PCT/IL2014/050337.
  • the reflection chamber can also be cubical, or hemispherical with a cubic base.
  • the device consists of a portable, fully enclosed sample analysis chamber.
  • the chamber has two halves which may be opened and closed to allow insertion of a sample and then closing of the device to keep out ambient light.
  • the upper half has a downward-facing reflection chamber, attached by hinges to the lower half which supports a planar analysis surface.
  • the dome is provided with lighting means such as RGB LEDs disposed so as to provide uniform illumination of controllable color and intensity to the planar analysis surface when the device is closed.
  • the chamber can also be made of a light-emitting material such as LED paper, instead of reflecting light from separate LEDs.
  • the device is referred to in the figures to follow, and the data processing methods employed use routines described by the figures as well.
  • the reflection chamber is illuminated from below and possibly from above as well.
  • a camera at the top of the chamber is used to analyze samples resting on the floor of the chamber, as shown in Fig. 6.
  • Various other additions may be used, such as means for vibrating the sample, illuminating it with structured light, and so on.
  • the invention further provides a system and method for allowing direct comparison of sample inspection results obtained from different machines having inevitable differences in lighting, optics, and the like.
  • Fig. 1 shows a flowchart for the material check routine performed by the user. The user first opens the device, places the sample to be checked, and then closes the device, which triggers processing and subsequent generation of a report.
  • Fig. 2 shows a flowchart describing one such possible method for configuring the system. This procedure produces master images against which subsequent images of objects under test will be compared.
  • the background will generally be of a standard color such as blue and have specular reflectivity.
  • the brightness of illumination is then set using feedback, to arrive at a nominally illuminated sample (for example varying illumination until an average pixel value of half the maximum possible value is reached).
  • the exposure is set, and then the sample is removed.
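The feedback-driven illumination setting described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `capture(level)` function, the binary search over a normalized LED drive level, and the gamma-like simulated response are all illustrative assumptions; the target of half the 8-bit maximum follows the example in the text.

```python
import numpy as np

def tune_illumination(capture, target=127.5, tol=2.0, max_iters=50):
    """Binary-search the normalized LED drive level until the mean pixel
    value of the captured image is within `tol` of `target` (half of 255)."""
    lo, hi = 0.0, 1.0
    level = 0.5
    for _ in range(max_iters):
        level = (lo + hi) / 2.0
        mean = float(np.mean(capture(level)))
        if abs(mean - target) <= tol:
            break
        if mean < target:
            lo = level   # too dark: raise the drive level
        else:
            hi = level   # too bright: lower it
    return level

# Simulated camera: brightness responds nonlinearly (gamma-like) to drive level.
def fake_capture(level):
    return np.full((8, 8), 255.0 * level ** 1.8)

chosen = tune_illumination(fake_capture)
```

The binary search converges even though the LED response is nonlinear, because the mean brightness is still monotone in the drive level.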
  • the Brightness Normalization sheet is inserted into the device.
  • the white balance is then adjusted, either by varying illumination or changing the gain of measured values in software.
  • the brightness normalization procedure is performed, yielding the brightness normalization data.
  • the Brightness Normalization sheet is removed from the device and the bare background sheet covers the working area.
  • the background image may be captured at this point for background subtraction, if used.
  • a master sample is then inserted into the device.
  • the master is captured (imaged).
  • the color histogram of the foreground areas of the Master Sample is calculated and the algorithm calculates the color channel value thresholds using user definitions in the PDF or the CDF method, which are described further below.
  • the master sample is then removed and an example of a particular category, such as a 'fail' sample (one that does not pass inspection) is inserted into the device and imaged.
  • the resulting image and report are reviewed, at which stage various thresholds (for sample morphology, size, color variation, and the like) may be set to better define a passing vs. failing sample.
  • the master definition as determined above is then saved, and used for subsequent analyses.
  • Fig. 3 shows a flowchart for the possible main steps and the expert user decision points in the data processing. After a sample is inserted into the device, steps of brightness normalization and image undistortion are performed.
  • the background is then segmented out and various calculations made on discriminating features. As the user may update various thresholds and other parameters, these stages may occur repeatedly.
  • the sample is classified and a report generated. Based on the report, the expert user may again update parameters and restart the process from the step of background segmentation.
  • Fig. 4 shows a histogram indicating Normal and Exceptional color values configuration.
  • the algorithm calculates the color channel threshold values using a pre-defined density threshold given by the expert user. The user may afterwards move the scroll bars to determine upper and lower thresholds for color channel values manually.
  • Fig. 5 shows a cumulative distribution function (CDF) indicating Normal and Exceptional values using a CDF representation; here the expert user defines the number of 'exceptionals' (items outside the accepted range of color, size, or other parameter being set) in the sample.
  • the upper and lower thresholds are therefore defined based not on color channel or size value, but using percentages, e.g. labeling any pixel below the 5th percentile as below the lower threshold and any pixel above the 95th percentile as above the upper threshold, and deriving the threshold in this way automatically.
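The percentile-based derivation of thresholds can be sketched as follows; this is a minimal NumPy illustration, and the 5th/95th percentiles and the synthetic channel values are the example figures from the text rather than fixed parameters of the method.

```python
import numpy as np

def cdf_thresholds(values, low_pct=5.0, high_pct=95.0):
    """Derive lower/upper color-channel thresholds from percentiles of the
    empirical CDF, so the bounds adapt to the data rather than being fixed
    channel values."""
    lower = np.percentile(values, low_pct)
    upper = np.percentile(values, high_pct)
    return lower, upper

def label_exceptionals(values, lower, upper):
    """Mark any pixel outside [lower, upper] as 'exceptional'."""
    values = np.asarray(values)
    return (values < lower) | (values > upper)

# Synthetic channel: 90 nominal pixels, 5 too dark, 5 too bright.
channel = np.concatenate([np.full(90, 100.0), np.full(5, 10.0), np.full(5, 250.0)])
lo, hi = cdf_thresholds(channel)
mask = label_exceptionals(channel, lo, hi)
```

Because the thresholds are percentages of the distribution, the same user definition transfers between samples whose absolute channel values differ.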
  • the device used for the sample inspection may be one such as that shown in Fig. 6, which illustrates the use of Top Lighting, Specular Background and other features shown in the figure.
  • Fig. 7 shows the components of the contents of a Master, this being a set of master images Iml ...ImN, a background image, brightness normalization image, and a set of parameters.
  • a flowchart for determining the parameters of the Master is shown in Fig. 8.
  • a set of master images is taken for a set of samples (both pass and fail samples). For each such image, an automatic classification is run using the current set of parameters. If the classification is incorrect, the parameters are adjusted until correct classification is achieved on all images Im1...ImN. If the correct classification is given, and a sufficient number of both positive and negative samples has been processed, then the process ends. If insufficient positive and/or negative samples are represented, then additional samples of the missing variety are processed in the same fashion, until sufficient samples have been classified correctly, at which point the Master sample is fully defined. Both the correctness of the classification and the sufficient number of Master images are determined by the expert user, who is assumed to be familiar with the quality characteristics of the product in the visual domain.
  • Fig. 9 shows a method for applying vibration to a sample, to cause spatial diffusion of the particulate sample.
  • the repeatability of sample spreading can greatly improve the repeatability of the classification and of the parameter measurements by the device.
  • the device measures the percentage of the background in the sample image, and ensures that this number does not differ significantly from that of the Master images.
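The background-percentage consistency check described above can be sketched as follows; a minimal illustration in which the tolerance value and the function names are assumptions, not figures from the patent.

```python
import numpy as np

def background_fraction(foreground_mask):
    """Fraction of the image occupied by background (non-foreground) pixels."""
    mask = np.asarray(foreground_mask, dtype=bool)
    return 1.0 - mask.mean()

def spreading_ok(sample_mask, master_fraction, tolerance=0.05):
    """Accept the sample spreading only if its background fraction does not
    differ from the Master's by more than `tolerance`."""
    return abs(background_fraction(sample_mask) - master_fraction) <= tolerance

# Synthetic foreground mask: half the image is covered by particles.
m = np.zeros((10, 10), dtype=bool)
m[0:5, :] = True
ok = spreading_ok(m, master_fraction=0.52)
```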
  • Fig. 10 shows an embodiment of the device having light sources aimed through shadow masks having line patterns adapted to illuminate samples with lines. This will allow (for example) for partial 3D information to be obtained, if the illumination is varied over a series of sets of lines and images taken for each series.
  • Fig. 11 shows an alternative embodiment for projection of structured light using a mini projector.
  • the concept of the spatiogram is shown in Fig. 12.
  • the spatiogram variant proposed in the same figure is optimized for seeds, grains and other samples consisting of particles.
  • the method makes use of: (1) a 'normalization object', which is a standard sample having a uniform gray color and diffuse reflective properties as close to Lambertian reflectance as possible; (2) a 'target plate', a piece of gray material with Lambertian reflectance which is in the field of view of the camera, providing feedback on the illumination intensity.
  • the steps of the procedure are:
  • the normalization object is imaged, yielding a reference image B scaled so that the brightest pixel value is 100 and all the other pixels have values in the range [0..100].
  • the Brightness Normalization Operation on an image I is then: N = (I ./ B) * 100%, where ./ is an element-wise (pixel-by-pixel) division applied per R, G and B channel.
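The Brightness Normalization Operation above, N = (I ./ B) * 100%, can be sketched as follows. The interpretation of B as the image of the uniform gray normalization object follows the text; the guard against zero-valued reference pixels is an implementation assumption.

```python
import numpy as np

def brightness_normalize(image, reference):
    """Brightness Normalization: N = (I ./ B) * 100, applied element-wise
    per pixel (and per color channel), where B is the reference image of
    the uniform gray normalization object."""
    image = np.asarray(image, dtype=float)
    reference = np.asarray(reference, dtype=float)
    reference = np.clip(reference, 1e-6, None)  # avoid division by zero
    return (image / reference) * 100.0

# Toy 2x2 example: a reference with uneven illumination, and a sample image.
ref = np.array([[200.0, 100.0], [50.0, 200.0]])
img = np.array([[100.0, 100.0], [50.0, 200.0]])
norm = brightness_normalize(img, ref)
```

A pixel that matches the reference maps to 100, so spatial illumination non-uniformity is divided out before any color comparison.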
  • a brightness measurement of a control area called the "Target Plate" is taken in each captured image.
  • the target plate is situated near the working area. It is attached mechanically to the top half of the device, so when the device is open, there is no danger of sullying the target plate.
  • the multiple levels of feedback and control over measurement conditions (controlling camera temperature, lighting element temperature, and sample chamber temperature, calibrating for lens distortion, and calibrating for color aberrations) will allow for repeatable measurements in extremely well-characterized conditions.
  • the invention solves the problem of making different devices give the same output for a given sample, when variations in a given inspection machine's mechanics, lighting, imaging, and other characteristics cause unavoidable differences in the images obtained on the same sample in different machines.
  • 'Importer' hereinafter refers to the machine that uses a Master created in another machine for the purpose of sample inspection with the imported Master.
  • several lighting elements are used such as a set of bottom LEDs and a set of top LEDs.
  • the brightness of an image of the standard gray background will change nonlinearly with respect to the voltage, due to the well-known nonlinear relation between LED brightness and voltage.
  • This is just one example of a number of variations that will inevitably cause differences in the raw images obtained on different machines, even for the same sample or standard background.
  • Other reasons for differences in lighting are LED degradation and dimensional tolerances in LED installation. The simple derivation below explains the implementation of a calibration procedure to account for the aforementioned inevitable variations in raw images obtained on different machines.
  • a Master is created in device A, with certain voltages in the top and bottom LEDs.
  • the method may be implemented in software as follows:
  • Each machine is identified with a unique ID.
  • a Master is created on a given machine.
  • the gain correction parameter sM is calculated (by taking the sum of image values for a standard gray background, see description above)
  • Software of the invention is run on a given (possibly different) machine.
  • the user inserts a standard gray sample into the machine and an image is taken.
  • the software saves the Brightness Normalization results as Imported Brightness Normalization.
  • the gain correction parameter sO is calculated (by taking the sum of image values for a standard gray background, see description above)
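The import-time gain correction sketched by the steps above can be illustrated as follows. This is a simplified sketch under stated assumptions: a single global multiplicative correction with a linear sensor response, and illustrative function names; the patent's full procedure also involves the imported brightness normalization.

```python
import numpy as np

def gain_parameter(gray_image):
    """Gain correction parameter: the sum of pixel values of an image of the
    standard gray background (computed once per machine)."""
    return float(np.sum(gray_image))

def apply_imported_gain(image, s_master, s_own):
    """Scale an image captured on the importing machine so its overall gain
    matches the machine on which the Master was created."""
    return np.asarray(image, dtype=float) * (s_master / s_own)

# Machine A (Master creator) sees the gray background brighter than machine B.
master_gray = np.full((4, 4), 120.0)
own_gray = np.full((4, 4), 100.0)
sM, sO = gain_parameter(master_gray), gain_parameter(own_gray)
corrected = apply_imported_gain(own_gray, sM, sO)
```

After correction, the importer's view of the standard gray background coincides with the Master machine's, so thresholds defined in the Master remain meaningful.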
  • Fig. 13 shows an example of an image of given test sample in a first machine.
  • The test (carried out in "machine 5") shows 4.78% color abnormalities on the calibration object (the dark strips).
  • a Color Checker shown in Fig. 15 must be captured in the imaging conditions of the Master.
  • the Color Checker image, denoted I_cam, passes through the same geometry and color corrections as any other image captured under this Master. Then the transformation from the RGB space of I_cam, RGB_cam, to the standard RGB color space, RGB_std, has to be calculated.
  • the input is a set of sampled RGB values from each of the Color Checker colors.
  • the samples are of course de-noised by taking pixel values within some radius and performing on them an average or median operation. In this way the following table of correspondences is obtained:
      R1,cam G1,cam B1,cam → R1,std G1,std B1,std
      R2,cam G2,cam B2,cam → R2,std G2,std B2,std
      ...
  • the transformation f from RGB_cam to RGB_std is calculated by a least squares solution or by construction of a mesh grid for interpolation.
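The least-squares route can be sketched as follows; a minimal illustration in which the affine augmentation (a constant offset column) and the synthetic patch values are assumptions, since the text does not specify the exact model of f.

```python
import numpy as np

def fit_color_transform(rgb_cam, rgb_std):
    """Fit an affine map f: RGB_cam -> RGB_std by least squares, using
    de-noised Color Checker patch values as correspondences."""
    rgb_cam = np.asarray(rgb_cam, dtype=float)
    rgb_std = np.asarray(rgb_std, dtype=float)
    # Augment with a constant column so the fit includes an offset term.
    A = np.hstack([rgb_cam, np.ones((rgb_cam.shape[0], 1))])
    M, *_ = np.linalg.lstsq(A, rgb_std, rcond=None)
    return M  # shape (4, 3)

def apply_color_transform(M, rgb):
    rgb = np.asarray(rgb, dtype=float)
    A = np.hstack([rgb, np.ones((rgb.shape[0], 1))])
    return A @ M

# Synthetic check: camera values are a known per-channel distortion plus offset.
std = np.array([[50.0, 80.0, 120.0], [200.0, 40.0, 60.0],
                [10.0, 220.0, 90.0], [130.0, 130.0, 130.0]])
cam = std * np.array([0.9, 1.1, 0.8]) + 5.0
M = fit_color_transform(cam, std)
recovered = apply_color_transform(M, cam)
```

With 24 Color Checker patches the system is overdetermined and the least-squares fit averages out residual noise; here four synthetic patches suffice to recover the transform exactly.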
  • Fig 16.a shows an image of the device with a spice sample containing stems.
  • Fig 16.b shows the Foreground Mask obtained using the Master.
  • a morphological Open operator applied on this image yields the Leaves Foreground Mask containing the leaves only, while the thin areas of the stems are removed. Subtracting the Leaves Foreground Mask from the Foreground Mask yields the image of the stems only.
  • the Stems area proportion is obtained by the device.
  • the weight proportion is obtained from the area proportion.
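The stems/leaves separation by morphological opening described above can be sketched as follows; a pure-NumPy illustration in which the erosion/dilation implementation and the 3x3 square structuring element are assumptions (the patent does not specify the element).

```python
import numpy as np

def erode(mask, k=3):
    """Binary erosion with a k x k square structuring element."""
    pad = k // 2
    padded = np.pad(mask, pad, constant_values=False)
    out = np.ones_like(mask, dtype=bool)
    for dy in range(-pad, pad + 1):
        for dx in range(-pad, pad + 1):
            out &= padded[pad + dy: pad + dy + mask.shape[0],
                          pad + dx: pad + dx + mask.shape[1]]
    return out

def dilate(mask, k=3):
    """Binary dilation with a k x k square structuring element."""
    pad = k // 2
    padded = np.pad(mask, pad, constant_values=False)
    out = np.zeros_like(mask, dtype=bool)
    for dy in range(-pad, pad + 1):
        for dx in range(-pad, pad + 1):
            out |= padded[pad + dy: pad + dy + mask.shape[0],
                          pad + dx: pad + dx + mask.shape[1]]
    return out

def stems_mask(foreground, k=3):
    """Open the foreground mask to keep only the wide leaf regions, then
    subtract to recover the thin stem regions."""
    leaves = dilate(erode(foreground, k), k)  # morphological opening
    return foreground & ~leaves

fg = np.zeros((10, 10), dtype=bool)
fg[2:8, 2:8] = True   # a wide "leaf" blob survives opening
fg[5, 8:10] = True    # a 1-pixel-wide "stem" is removed by opening
stems = stems_mask(fg)
```

The stems area proportion then follows directly as `stems.sum() / fg.sum()`.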
  • a pest factory needs to check samples of pests for: (a) high purity of the sample, i.e. freedom from pests of the wrong species; (b) cleanliness from contamination by external bodies; (c) a high percentage of live pests, since live pests produce motion while dead pests do not move.
  • Check (c) involves particle separation first, in a similar way to granules. Motion estimation is then performed by subtracting subsequent images and applying morphological noise removal. The principle of association between motions and pest particles is shown in Fig. 17: the motion of a pest's leg forms a cone on the image, and the thin end of the motion cone is attached to the pest that produced the motion.
  • Pests that produced motions above a magnitude threshold are considered alive, and the others are considered dead. In this way the device computes the percentage of live pests in a sample.
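The liveness count above can be sketched as follows; a minimal illustration in which frame differencing stands in for the full motion estimation, the motion threshold is an assumed value, and `particle_labels` is a hypothetical labelled mask from the preceding particle-separation step.

```python
import numpy as np

def live_fraction(frame_a, frame_b, particle_labels, motion_threshold=10.0):
    """Estimate the percentage of live pests: difference two subsequent
    frames, attribute the motion magnitude inside each labelled particle
    region to that pest, and call the pest 'alive' if the motion exceeds
    the threshold. `particle_labels` assigns an integer id (>0) per pest."""
    motion = np.abs(frame_a.astype(float) - frame_b.astype(float))
    ids = [i for i in np.unique(particle_labels) if i > 0]
    alive = sum(1 for i in ids
                if motion[particle_labels == i].max() > motion_threshold)
    return 100.0 * alive / len(ids)

labels = np.zeros((6, 6), dtype=int)
labels[1:3, 1:3] = 1   # pest 1
labels[4:6, 4:6] = 2   # pest 2
f0 = np.zeros((6, 6))
f1 = np.zeros((6, 6))
f1[1, 1] = 50.0        # pest 1 moved a leg; pest 2 stayed still
pct_alive = live_fraction(f0, f1, labels)
```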


Abstract

A method is introduced for automatically inspecting samples using color, shape, size, and other parameters by means of a novel analysis system and image processing technology. The inspection is based on highly repeatable imaging in optimal imaging conditions for the samples. The imaging repeatability holds on a single device instance and across different device instances, at different times and environmental conditions. Feature measurements, 'Go/No-go' assessments, and discrimination into classes (such as 'fine', 'medium' and 'coarse' or 'low quality', 'medium quality', and 'high quality') are performed on samples of materials such as: grains, seeds, flours, processed materials, spices, mechanical parts, printed circuits, pests, mites and the like.

Description

A Method and Apparatus for inspection and quality assurance of material samples using qualified user definitions and data derived from images in a controlled environment
Field of the invention
The present invention relates to the field of sample inspection.
Cross-Reference to Related Application
This application claims the benefit of U.S. Provisional Application No. 61/844956, filed 11 July 2013.
Summary of the Invention
The invention comprises a method for automatically checking samples in terms of correctness of color, shape, size, and other parameters using image processing technology. 'Correctness' here is defined in comparison to an instance called the Master which is used to define the desired characteristics.
The invention comprises a combination of hardware and data processing methods adapted to produce measurements, 'go/no-go' assessments, and discrimination into classes (such as 'fine', 'medium' and 'course' or 'low quality', 'medium quality', and 'high quality') of samples of materials such as: grains, seeds, flours, processed materials, spices, mechanical parts, printed circuits, pests, mites and the like.
The hardware comprises a closed box having a reflection chamber, described in provisional patent application US61/844956 and PCT/IL2014/050337 which are herein incorporated by reference. This device is referred to in the figures to follow, and the data processing methods employed use routines described by the figures as well.
The invention provides a way to allow direct comparison of sample inspection results obtained from different machines having inevitable differences in lighting, optics, and the like.
Brief Description of the Drawings
Fig. 1 shows a flowchart for the material check routine performed by the user.
Fig. 2 shows a flowchart describing one possible embodiment of a method for configuring the system to produce a Master, against which subsequent samples under test will be compared.
Fig. 3 shows a flowchart for the possible main steps in the data processing (Preprocessing, Features Generation and Classification & Report), and the points in the data processing that allow convenient reprocessing of the calculation if the Master settings were changed by the expert user.
Fig. 4 shows a histogram indicating Normal and Exceptional values configuration by the user, using a PDF representation.
Fig. 5 shows a Cumulative Distribution Function (CDF) indicating Normal and Exceptional values configuration by the user.
Fig. 6 shows, among the rest, the use of Top Lighting, Specular Background, and Illumination Feedback for automatic gain correction.
Fig. 7 shows the components of the Master.
Fig. 8 shows a flowchart of Master configuration by the user denoted the "Master Configuration Flowchart".
Fig. 9 shows applying vibration for spatial diffusion of particles in a sample.
Fig. 10 shows an embodiment of the device having light sources aimed through shadow masks having line patterns adapted to illuminate samples with lines for the purpose of 3D reconstruction using structured light.
Fig. 11 shows an alternative embodiment for projection of structured light using a mini projector.
Fig. 12 shows a definition of a spatiogram variant optimized for grains and particles.
Fig. 13 shows results of a sample analysis using a first instance of the device.
Fig. 14 shows results of a sample analysis using a second instance of the device at a different location having different ambient light and other environmental conditions.
Fig. 15 shows a typical Color Checker. Each rectangle of different color has its own standard (R,G,B) and (L*,a*,b*) values.
Fig. 16 shows an image from the device of a spice sample containing stems, and the foreground mask of this sample
Fig. 17 shows an illustration of the principle of association between pests and their motions on the image
Detailed Description
The present invention will be understood from the following detailed description of preferred embodiments, which are meant to be descriptive and not limiting. For the sake of brevity, some well-known features, methods, systems, procedures, components, circuits, and so on, are not described in detail.
The term "Master" refers hereinafter to the definitions of a Positive sample of a certain kind, consisting of reference images and other data items which are stored in the device memory, and used by the device software when the automatic check is performed. The data items include exposure settings, image correction calibration settings in terms of brightness normalization and gain, image processing parameters, vibration parameters, tolerances in terms of color, shape and size, user definitions, and more.
The term 'Positive sample' or 'positives' hereinafter refers to a sample acceptable by the user (pass).
The term 'Negative sample' or 'negatives' hereinafter refers to a sample not acceptable by the user (fail).
The term 'classification' hereinafter refers to categorization of a sample or set of samples into a number of discrete categories, such as pass/fail, fine/medium/coarse, well-formed/deformed, and the like.
The terms 'Particles sample' or 'particulate sample' hereinafter refer to a sample that consists of particles, which are small, separate pieces of material such as grains or screws, where the inspection of each particle is important.
The term 'Bulk sample' hereinafter refers to a sample of a bulk of material. The inspection of the bulk properties is important, but each of the bulk's particles standing alone is not important for the inspection; examples include spices and flours.
The term 'Powder sample' hereinafter refers to a sample of a powder, usually having a homogeneous color.
The term 'Panel sample' hereinafter refers to a nearly flat panel such as a printed electronic circuit or a keyboard.
The term 'imaging means' hereinafter refers to any means for capturing an image, such as a camera having a lens, and especially digital imaging means such as a CMOS detector with lens and focusing apparatus adapted for connection to a computer.
The invention comprises methods for automatically analyzing samples. This analysis is performed in terms of categories of color, shape, size, and other parameters using image processing technology. The categories are defined in comparison to an instance called the Master sample which is used to define the desired characteristics for each category.
The methods are adapted to measure sample parameters, produce 'go/no-go' assessments of samples as well as discrimination into classes of samples of materials such as: grains, seeds, flours, processed materials, spices, mechanical parts, printed circuits, and the like.
The hardware of the invention comprises a closed box having a hemispherical reflection chamber, described in provisional patent application US/61844956 and PCT/IL2014/050337. The reflection chamber can be also cubical, or hemispherical with cubic basis. The device consists of a portable, fully enclosed sample analysis chamber. The chamber has two halves which may be opened and closed to allow insertion of a sample and then closing of the device to keep out ambient light. The upper half has a downward-facing reflection chamber, attached by hinges to the lower half which supports a planar analysis surface. The dome is provided with lighting means such as RGB LEDs disposed so as to provide uniform illumination of controllable color and intensity to the planar analysis surface when the device is closed. The chamber can also consist of lighting material such as LED paper instead of reflecting separate LEDs.
The device is referred to in the figures to follow, and the data processing methods employed use routines described by the figures as well. The reflection chamber is illuminated from below and possibly from above as well. A camera at the top of the chamber is used to analyze samples resting on the floor of the chamber, as shown in Fig. 6. Various other additions may be used, such as means for vibrating the sample, illuminating it with structured light, and so on.
The invention further provides a system and method for allowing direct comparison of sample inspection results obtained from different machines having inevitable differences in lighting, optics, and the like.
Fig. 1 shows a flowchart for the material check routine performed by the user. The user first opens the device, places the sample to be checked, and then closes the device, which triggers processing and subsequent generation of a report.
In order to nearly eliminate variability between devices and to achieve optimal results, a number of settings and internal calibrations are performed before the automated analysis of Fig. 1 can be carried out. Fig. 2 shows a flowchart describing one such possible method for configuring the system. This procedure produces master images against which subsequent images of objects under test will be compared.
First a background and a representative sample are inserted into the device. The background will generally be of a standard color, such as blue, and have specular reflectivity.
The brightness of illumination is then set using feedback to arrive at a nominally illuminated sample (for example, varying the illumination until an average pixel value of half the maximum possible value is reached).
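A minimal sketch of this feedback step is given below, assuming a hypothetical linear relation between LED drive level and mean pixel value; in the device, `capture_mean` would grab a real frame and average its pixels, and the damping factor is illustrative:

```python
TARGET_MEAN = 128.0   # half of the 8-bit maximum, as in the example above
TOLERANCE = 1.0

def capture_mean(drive_level, gain=0.9):
    """Stand-in for a real capture: in the device this would grab a
    frame and return its average pixel value; here the mean simply
    grows with the (hypothetical) LED drive level."""
    return min(255.0, gain * drive_level)

def set_illumination(drive_level=50.0, max_iters=50):
    """Damped proportional feedback: raise or lower the LED drive
    until the average pixel value reaches the target."""
    for _ in range(max_iters):
        error = TARGET_MEAN - capture_mean(drive_level)
        if abs(error) < TOLERANCE:
            break
        drive_level += 0.5 * error  # damped correction step
    return drive_level, capture_mean(drive_level)

drive, mean = set_illumination()
```

Any stable controller that drives the mean pixel value to the target would serve equally well; the linear capture model above is only a placeholder.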
The exposure is set, and then the sample is removed. The Brightness Normalization sheet is inserted into the device.
The white balance is then adjusted, either by varying the illumination or by changing the gain of the measured values in software. The brightness normalization procedure is performed, yielding the brightness normalization data. The Brightness Normalization sheet is then removed from the device, leaving the bare background sheet covering the working area.
The background image may be captured at this point for background subtraction, if used.
A master sample is then inserted into the device and captured (imaged). The color histogram of the foreground areas of the Master Sample is calculated, and the algorithm calculates the color channel value thresholds from the user definitions using the PDF or the CDF method, both described further below.
The master sample is then removed and an example of a particular category, such as a 'fail' sample (one that does not pass inspection) is inserted into the device and imaged.
The resulting image and report are reviewed, at which stage various thresholds (for sample morphology, size, color variation, and the like) may be set to better define a passing vs. failing sample. The master definition as determined above is then saved, and used for subsequent analyses.
Fig. 3 shows a flowchart for the possible main steps and the expert user decision points in the data processing. After a sample is inserted into the device, steps of brightness normalization and image undistortion are performed.
The background is then segmented out and various calculations made on discriminating features. As the user may update various thresholds and other parameters, these stages may occur repeatedly.
Once the thresholds and parameters are fixed, the sample is classified and a report generated. Based on the report, the expert user may again update parameters and restart the process from the step of background segmentation.
Fig. 4 shows a histogram indicating Normal and Exceptional color values configuration. The algorithm calculates the color channel threshold values using a pre-defined density threshold given by the expert user. The user may afterwards move the scroll bars to determine upper and lower thresholds for color channel values manually.
Fig. 5 shows a cumulative distribution function (CDF) indicating Normal and Exceptional values using a CDF representation; here the expert user defines the number of 'exceptionals' (items outside the accepted range of color, size, or other parameter being set) in the sample. These parameters may derive from accurate manual measurements, from intuition, or from requirements on the distribution width for a given parameter. The upper and lower thresholds are therefore defined based not on color channel or size value, but using percentages, e.g. labeling any pixel below the 5th percentile as below the lower threshold and any pixel above the 95th percentile as above the upper threshold, and deriving the thresholds in this way automatically.
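The percentile-based threshold derivation described above can be sketched as follows; the channel values are synthetic, and the 5th/95th percentiles stand in for the user-defined exceptional percentages:

```python
import numpy as np

def cdf_thresholds(channel_values, low_pct=5.0, high_pct=95.0):
    """Derive lower/upper thresholds from the empirical CDF: pixels
    below low_pct or above high_pct are labeled 'exceptional', so the
    thresholds follow automatically from the chosen percentages."""
    lower = np.percentile(channel_values, low_pct)
    upper = np.percentile(channel_values, high_pct)
    return lower, upper

# synthetic channel values standing in for a sample's color histogram
rng = np.random.default_rng(0)
pixels = rng.normal(120.0, 10.0, 10_000).clip(0, 255)
lo, hi = cdf_thresholds(pixels)
exceptional = np.count_nonzero((pixels < lo) | (pixels > hi)) / pixels.size
```

By construction, roughly 10% of the pixels fall outside the [lo, hi] band, matching the user's requested exceptional share.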
The device used for the sample inspection may be one such as that shown in Fig. 6, which illustrates the use of Top Lighting, Specular Background and other features shown in the figure.
Fig. 7 shows the components of the contents of a Master, this being a set of master images Im1...ImN, a background image, a brightness normalization image, and a set of parameters.
A flowchart for determining the parameters of the Master is shown in Fig. 8. A set of master images is taken for a set of samples (both pass and fail samples). For each such image, an automatic classification is run using the current set of parameters. If the classification is incorrect, the parameters are adjusted until correct classification is achieved on all images Im1...ImN. If the correct classification is given, and a sufficient number of both positive and negative samples has been processed, then the process ends. If insufficient positive and/or negative samples are represented, then additional samples of the missing variety are processed in the same fashion, until sufficient samples have been classified correctly, at which point the Master sample is fully defined. Both the correctness of the classification and the sufficiency of the number of Master images are determined by the expert user, who is assumed to be familiar with the quality characteristics of the product in the visual domain.
The sample may be manipulated in various ways to allow for fuller characterization thereof. For example, Fig. 9 shows a method for applying vibration to a sample, to cause spatial diffusion of the particulate sample.
The repeatability of sample spreading can greatly improve the repeatability of the classification and of the parameter measurements made by the device. To improve the repeatability of the spreading, the device measures the percentage of background in the sample image and verifies that this number does not differ significantly from that of the Master images.
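A minimal sketch of this spreading-repeatability check, assuming a binary foreground mask and an illustrative 5% tolerance (in the device, the tolerance would be a Master parameter):

```python
import numpy as np

def background_fraction(foreground_mask):
    """Fraction of the image classified as background (mask == 0)."""
    return 1.0 - foreground_mask.mean()

def spreading_ok(sample_mask, master_fraction, tolerance=0.05):
    """Accept the spread only if its background share stays within a
    tolerance band around the Master's background share."""
    return abs(background_fraction(sample_mask) - master_fraction) <= tolerance

# toy binary masks: 1 = particle pixel, 0 = background
master = np.zeros((100, 100)); master[20:60, 20:60] = 1.0  # 16% foreground
good = np.zeros((100, 100));   good[25:65, 25:65] = 1.0    # same coverage
bad = np.zeros((100, 100));    bad[0:10, 0:10] = 1.0       # clumped sample
ok_good = spreading_ok(good, background_fraction(master))
ok_bad = spreading_ok(bad, background_fraction(master))
```

A clumped spread covers far less area than the Master's spread, so its background share falls outside the band and the device can re-vibrate the sample.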
As another example, Fig. 10 shows an embodiment of the device having light sources aimed through shadow masks having line patterns adapted to illuminate samples with lines. This will allow (for example) for partial 3D information to be obtained, if the illumination is varied over a series of sets of lines and images taken for each series. Fig. 11 shows an alternative embodiment for projection of structured light using a mini projector.
The concept of the spatiogram is shown in Fig. 12. The spatiogram variant proposed in the same figure is optimized for seeds, grains and other samples consisting of particles.
A further method of the invention for Sample Inspection Devices Communication, allowing for Consistent Inter-Device Imaging (CIDI), is now described.
Brightness Normalization Algorithm in the Device
The algorithm below describes in detail the Brightness Normalization technique presented in the application US/61931729 which is hereby incorporated by reference.
The method makes use of: (1) a 'normalization object', which is a standard sample having a uniform gray color and diffuse reflective properties as close to Lambertian reflectance as possible; (2) a 'Target Plate', which is a piece of gray material with Lambertian reflectance situated in the field of view of the camera, providing feedback on the illumination intensity. The steps of the procedure are:
1. Take a sequence of n images im_1..im_n
2. Calculate the average image imAvg = mean(im_1..im_n) from the sequence, where n is the number of images
3. Check that the Target Plate is clean by performing variance checking in the Target Plate ROI of imAvg. Record the sum of brightness of the Target Plate ROI for future use in Gain Correction
4. Calculate a median image from the MultiGrab list, imMed = median(im_1..im_n)
5. Smooth the imMed image with a Gaussian blur kernel of size 15x15
6. Calculate the Brightness Normalization matrix B:
6.0 Warn on exceptional colors, so as to require imMed to be smooth and monotonic
6.1 Find the maximal values in the entire image imMed for each of the R, G, B channels:
mV = max(imMed) = [mV_R, mV_G, mV_B]
6.2 For each pixel v ∈ imMed, set the Brightness Normalization matrix value to be
B = (v ./ mV) * 100%
where ./ is an element-wise (pixel-by-pixel) division according to the R, G and B values
6.3 In the resulting matrix B, the brightest pixel value is 100, and all other pixels will have values in the range [0..100].
7. The Brightness Normalization Operation on an image I consists of:
N = (I ./ B) * 100%, where ./ is an element-wise (pixel-by-pixel) division operation.
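The procedure above (steps 4, 6 and 7) can be sketched as follows; the Target Plate check of step 3 and the Gaussian smoothing of step 5 are omitted for brevity, and the flat-field data is simulated:

```python
import numpy as np

def brightness_normalization_matrix(grabs):
    """Steps 4 and 6: median image over the grab list, then per-channel
    division by the channel maxima mV = [mVR, mVG, mVB], scaled so the
    brightest pixel becomes 100.  (The 15x15 Gaussian smoothing of
    step 5 is omitted in this toy version.)"""
    im_med = np.median(np.stack(grabs), axis=0)   # H x W x 3
    m_v = im_med.reshape(-1, 3).max(axis=0)       # [mVR, mVG, mVB]
    return im_med / m_v * 100.0

def normalize(image, B):
    """Step 7: the Brightness Normalization Operation N = (I ./ B) * 100%."""
    return image / B * 100.0

# simulated grabs of the uniform gray normalization object under
# non-uniform lighting (brighter in the center, dimmer at the corners)
rng = np.random.default_rng(1)
y, x = np.mgrid[0:32, 0:32]
falloff = 1.0 - 0.4 * ((x - 16) ** 2 + (y - 16) ** 2) / (2 * 16 ** 2)
flat = 180.0 * falloff[..., None] * np.ones(3)
grabs = [flat + rng.normal(0.0, 1.0, flat.shape) for _ in range(5)]
B = brightness_normalization_matrix(grabs)
corrected = normalize(flat, B)   # the lighting non-uniformity is divided out
```

After normalization the simulated gray object comes out nearly uniform, which is exactly the property the real procedure provides for subsequent sample images.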
Gain Correction using Target Plate
Variations in camera temperature, LED degradation, CMOS variations, and other uncontrolled changes create an unknown gain effect on the image, which must be compensated for in order to be able to compare brightness information between images of one device taken at different times, or to compare images from different devices.
For this purpose, it is within provision of the invention that a brightness measurement of a control area called the "Target Plate" is taken in each captured image. The target plate is situated near the working area. It is attached mechanically to the top half of the device, so that when the device is open there is no danger of sullying the target plate. The resulting gain correction works as follows: when capturing a reference image M, the sum of brightness in every channel R, G and B over the pixels v belonging to the target plate T is taken: M_Bc = Σ M_vc, c = R, G, B, {v | (v_x, v_y) ∈ T}. When capturing an image I of a sample S, the target plate area brightness is calculated in a similar way: S_Bc = Σ S_vc, c = R, G, B. The gain correction coefficient for each channel is found as δ_c = S_Bc / M_Bc, δ_c ∈ ℝ. Finally, the gain corrected image is obtained: I'_c = I_c / δ_c.
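A sketch of this per-channel gain correction, with a toy scene in which the whole image dims by 8% between the reference capture and the sample capture:

```python
import numpy as np

def target_plate_sums(image, plate_mask):
    """Per-channel sum of brightness over the Target Plate pixels:
    the quantities M_Bc and S_Bc of the text."""
    return image[plate_mask].sum(axis=0)   # -> [sum_R, sum_G, sum_B]

def gain_correct(sample_image, m_sums, s_sums):
    """delta_c = S_Bc / M_Bc per channel; corrected image I'_c = I_c / delta_c."""
    delta = s_sums / m_sums
    return sample_image / delta

# toy scene: the whole image dims by 8% between the two captures
plate = np.zeros((20, 20), dtype=bool)
plate[0:5, 0:5] = True                      # Target Plate region
reference = np.full((20, 20, 3), 200.0)     # reference image M
sample = reference * 0.92                   # sample image I, dimmed
corrected = gain_correct(sample,
                         target_plate_sums(reference, plate),
                         target_plate_sums(sample, plate))
```

Because the plate dims by the same factor as the rest of the scene, dividing by δ restores the sample image to the reference brightness.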
As will be appreciated by one skilled in the art, the multiple levels of feedback and control over measurement conditions (controlling camera temperature, lighting element temperature, and sample chamber temperature, calibrating for lens distortion, and calibrating for color aberrations) will allow for repeatable measurements in extremely well-characterized conditions.
Master Usage on Importer Machines
The invention solves the problem of making different devices give the same output for a given sample, when variations in a given inspection machine's mechanics, lighting, imaging, and other characteristics cause unavoidable differences in the images obtained on the same sample in different machines.
In the description below the following definitions will be used: Creator - the machine that created a Master
Importer - the machine that uses a Master created on another machine for the purpose of sample inspection with the imported Master
In one embodiment of the invention, several lighting elements are used, such as a set of bottom LEDs and a set of top LEDs. For slightly different voltages operating the top LEDs, for example, the brightness of an image of the standard gray background will change nonlinearly with respect to the voltage, due to the well-known nonlinear relation between LED brightness and voltage. This is just one example of a number of variations that will inevitably cause differences in the raw images obtained on different machines, even for the same sample or standard background. Other reasons for differences in lighting are LED degradation and dimensional tolerances in LED installation. The simple derivation below explains the implementation of a calibration procedure to account for the aforementioned inevitable variations in raw images obtained on different machines. Suppose that a Master is created in device A, with certain voltages in the top and bottom LEDs. It is of great interest to be able to compare information obtained on this machine with that obtained on another machine B. This is done by performing an Import operation of the Master to device B. Each Master created on a Creator device requires an import operation to any Importer device. The operation is performed once; after the import operation is performed on the Importer device, this Imported Master can be used freely on the Importer machine.
We denote the response image to lighting from the top LEDs on a standard gray background in machine A by FT,A(x, y), and the response to lighting from the bottom LEDs on a gray sheet in machine A by FB,A(x, y). The response to illumination from both top and bottom LED sources will then be given by FA(x, y) = FT,A(x, y) + FB,A(x, y). The normalization is then calculated by means of a normalization function NA(x, y) = CA / FA(x, y), by which a given raw image is multiplied to obtain a normalized image. The normalized image of a standard gray background will now be FA(x, y) * NA(x, y) = CA, i.e. uniform over the whole image.
As will be clear to one skilled in the art, if the same normalization procedure is carried out on machine B, then it is possible to bring the images from the two machines into agreement in the sense that they give the same uniform value C for a standard gray background.
The final step in this sequence is to perform an overall gain correction operation: one takes the sum of pixel values of the Brightness Normalization Object image on the Owner machine, sO, and the same measure on the Importer machine, sM. In the Sample Result image, SR, an Import Gain Correction is performed by multiplying by the gain correction factor sO/sM: SRc = SR * sO/sM. By this means, the constants on the different machines are made equal, CA = CB.
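The normalization and import gain correction can be sketched as follows, with toy flat fields for two hypothetical machines; C is the target uniform value for the standard gray background:

```python
import numpy as np

C = 128.0   # target uniform value for the standard gray background

def normalization_function(flat_field):
    """N(x, y) = C / F(x, y): multiplying a raw image by N flattens the
    machine's combined lighting response F."""
    return C / flat_field

def import_gain_correction(sample_result, s_owner, s_importer):
    """SRc = SR * sO / sM: equalize overall gain between machines."""
    return sample_result * (s_owner / s_importer)

# two hypothetical machines with different, spatially varying responses
y, x = np.mgrid[0:16, 0:16]
F_A = 100.0 + 0.5 * x     # machine A's flat field on the gray background
F_B = 110.0 + 0.3 * y     # machine B's flat field on the same background
gray_A = F_A * normalization_function(F_A)   # normalized background on A
gray_B = F_B * normalization_function(F_B)   # ...and on B
corrected_B = import_gain_correction(gray_B, gray_A.sum(), gray_B.sum())
```

Both machines produce the same uniform value C for the gray background after normalization, which is precisely the agreement the import procedure establishes.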
The method may be implemented in software as follows:
1. Each machine is identified with a unique ID.
2. A Master is created on a given machine.
3. The gain correction parameter sM is calculated (by taking the sum of image values for a standard gray background, see description above)
4. Software of the invention is run on a given (possibly different) machine.
5. If the machine running the software is the Creator of the Master Image, the Brightness
Normalization image created thereupon is used
6. If the machine running the software is not the creator of the Master Image, we are running an Imported Master. The software will check for an Import Brightness Normalization file according to the Running Machine ID
7. If an Imported Brightness Normalization file is found, it loads it and uses this file for Brightness Normalization.
8. If the Imported Brightness Normalization file isn't found, the software asks to perform
Brightness Normalization Operation.
9. A message is presented to the user, to the effect that Brightness Normalization must be performed.
10. The user inserts a standard gray sample into the machine and an image is taken.
11. The software saves the Brightness Normalization results as Imported Brightness Normalization.
12. The gain correction parameter sO is calculated (by taking the sum of image values for a standard gray background, see description above)
13. In the Sample Result image, SR, perform the Import Gain Correction: SRc = SR * sO/sM, where SRc is the Corrected Sample Result image. The above procedure ensures that the same image is obtained from the same sample when the same Master is loaded in different devices. As the Master in the devices is exactly the same, running the same checks, settings and algorithms, users of separate instances of the inventive machine will measure precisely the same statistics for a given sample, thus allowing for direct comparison between (for example) the color of textile from a Chinese production line and factory output from Bangladesh for the same product.
Fig. 13 shows an example of an image of given test sample in a first machine.
The test (carried out in "machine 5") shows 4.78% color abnormalities on the calibration object (the dark strips).
The same analysis, carried out on the same test object in "machine 7", is shown in Fig. 14, showing 4.89% color abnormalities on the calibration object; the analyses carried out using the inventive method in different machines are thus nearly identical.
Real Color Imaging and Comparability to Human Color Sensitivity
In order for the image of the discussed device to have colors corresponding to the true color observed from the sample, and to overcome biases such as the different color sensitivities of the R, G and B channels of the camera, it is required to perform a transformation to the standard RGB colors.
As the gain corrections and brightness normalization techniques applied in the device assure a repeatable image across the dynamic range, such an image can be effectively transformed to true colors.
For that, a Color Checker, shown in Fig. 15, must be captured in the imaging conditions of the Master. The Color Checker image, denoted Icam, passes through the same geometry and color corrections as any other image captured with this Master. Then the transformation from the RGB space of Icam, RGBcam, to the standard RGB color space, RGBstd, has to be calculated.
The input is a set of sampled RGB values from each of the Color Checker colors. The samples are of course de-noised by taking the pixel values within some radius and performing an average or median operation on them. In this way the following table is obtained:

R1,cam G1,cam B1,cam -> R1,std G1,std B1,std
R2,cam G2,cam B2,cam -> R2,std G2,std B2,std
...
Rk,cam Gk,cam Bk,cam -> Rk,std Gk,std Bk,std
Given this data, the transformation f: RGBcam -> RGBstd is calculated by a least squares solution or by construction of a mesh grid for interpolation.
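A least-squares version of this fit can be sketched as follows, here as an affine map fitted on synthetic Color Checker samples; the 24-patch count and the distortion matrix are illustrative:

```python
import numpy as np

def fit_color_transform(rgb_cam, rgb_std):
    """Least-squares affine map from camera RGB to standard RGB,
    fitted on the Color Checker patch table (k x 3 arrays)."""
    design = np.hstack([rgb_cam, np.ones((rgb_cam.shape[0], 1))])  # k x 4
    M, *_ = np.linalg.lstsq(design, rgb_std, rcond=None)           # 4 x 3
    return M

def apply_color_transform(M, rgb):
    return np.hstack([rgb, np.ones((rgb.shape[0], 1))]) @ M

# synthetic 'patches': camera values are an affine distortion of the
# standard values, so the least-squares fit should recover them exactly
rng = np.random.default_rng(2)
rgb_std = rng.uniform(0.0, 255.0, (24, 3))          # 24 checker patches
distortion = np.array([[0.90, 0.05, 0.00],
                       [0.10, 0.85, 0.05],
                       [0.00, 0.10, 0.95]])
rgb_cam = rgb_std @ distortion + np.array([2.0, -1.0, 3.0])
M = fit_color_transform(rgb_cam, rgb_std)
recovered = apply_color_transform(M, rgb_cam)
```

With real camera data the residual would of course be non-zero, and a mesh-grid interpolation (the alternative named in the text) can capture non-affine behavior the linear fit cannot.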
Given the image Istd, we obtain in a similar way (with a values table) a transformation from the RGBstd color space to the L*a*b* color space. This time the Color Checker values table provides standard values for RGBstd and L*a*b*:

R1,std G1,std B1,std -> L1* a1* b1*
...
Rk,std Gk,std Bk,std -> Lk* ak* bk*
In the L*a*b* color space, for two points (L1*, a1*, b1*) and (L2*, a2*, b2*), their second norm is defined as ΔE = sqrt((L1*-L2*)^2 + (a1*-a2*)^2 + (b1*-b2*)^2). ΔE = 2 is known to be approximately equal to the minimal distinguishable color difference for an average human.
In this way we obtain an image in standard RGB colors, and for any two colors in the image we are able to know whether an average human will distinguish the difference between them.
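The ΔE comparison can be sketched as:

```python
import numpy as np

def delta_e(lab1, lab2):
    """CIE76 color difference: the Euclidean norm in L*a*b* space."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

def human_distinguishable(lab1, lab2, jnd=2.0):
    """True if the pair exceeds the ~dE = 2 just-noticeable difference
    used by the device as the human sensitivity threshold."""
    return delta_e(lab1, lab2) >= jnd

subtle = human_distinguishable((50, 10, 10), (50.5, 10.5, 10))  # dE ~ 0.71
clear = human_distinguishable((50, 10, 10), (53, 12, 10))       # dE ~ 3.61
```

The example L*a*b* triples are illustrative; the point is that pairs under the threshold are treated as the same color to a human observer.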
Stems counting
Fig. 16a shows an image of the device with a spice sample containing stems. Fig. 16b shows the Foreground Mask obtained using the Master. A morphological Open operator applied to this image yields the Leaves Foreground Mask, containing the leaves only, while the thin areas of the stems are removed. Subtracting the Leaves Foreground Mask from the Foreground Mask yields the image of the stems only.
In this way the stems area proportion is obtained by the device. By multiplication by a constant specific to the material, the weight proportion is obtained from the area proportion.
Placing a smaller sample of spice with stems and using a vibrating plate yields a sparse spreading of the material, and a more accurate stems area proportion measurement is obtained.
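The Open-and-subtract procedure for stems can be sketched as follows, with a minimal numpy-only morphology (the device would use standard image-processing primitives) and a toy mask containing one fat 'leaf' blob and one thin 'stem':

```python
import numpy as np

def erode(mask, r=1):
    """Binary erosion with a (2r+1) x (2r+1) square structuring element."""
    out = mask.copy()
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out &= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

def dilate(mask, r=1):
    """Binary dilation with the same structuring element."""
    out = mask.copy()
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

def stems_mask(foreground, r=1):
    """Open (erode then dilate) removes thin stem pixels, leaving the
    Leaves Foreground Mask; subtracting it isolates the stems."""
    leaves = dilate(erode(foreground, r), r)
    return foreground & ~leaves

# toy Foreground Mask: one fat 'leaf' blob and one 1-pixel-wide 'stem'
fg = np.zeros((20, 20), dtype=bool)
fg[3:10, 3:10] = True        # leaf: survives the opening
fg[15, 2:18] = True          # stem: removed by the opening
stems = stems_mask(fg)
stems_area_proportion = stems.sum() / fg.sum()
```

The structuring-element radius plays the role of the stem-thickness cutoff: anything thinner than the element is classified as stem.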
Pests counting and classification
For the needs of natural pollination, natural pest control in agriculture, and natural medfly control, pests are produced in factories worldwide. A pest factory needs to check samples of pests for: (a) high purity of the sample, i.e. freedom from pests of the wrong species; (b) cleanliness from contamination by external bodies; (c) a high percentage of live pests, since live pests produce motion and dead pests do not move.
Using the discussed device, the mentioned requirements are achieved in the following manner: (a) and (b) are achieved in a similar way to granule sample inspection, in terms of color, shape and size; (c) involves particle separation at first, in a similar way to granules, after which motion estimation is performed by subtraction of subsequent images and morphological noise removal. The principle of association between motions and pest particles is shown in Fig. 17. In the figure, the motion of the pest's leg forms a cone on the image, and the thin part of the motion cone is attached to the pest that produced the motion.
Pests that produced motions above a magnitude threshold are considered alive, and the others are considered dead. In this way the device counts the percentage of live pests in a sample.
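The motion-based live/dead classification can be sketched as follows; the motion-cone association is simplified here to counting motion pixels inside each pest's region, and all regions and thresholds are illustrative:

```python
import numpy as np

def motion_mask(frame_prev, frame_next, diff_thresh=20):
    """Frame differencing: pixels whose value changed by more than the
    threshold between subsequent images are motion pixels."""
    diff = np.abs(frame_next.astype(int) - frame_prev.astype(int))
    return diff > diff_thresh

def count_live(pest_regions, motion, magnitude_thresh=5):
    """A pest counts as alive if enough motion pixels fall inside its
    region (a simplification of the motion-cone association)."""
    return sum(1 for region in pest_regions
               if motion[region].sum() >= magnitude_thresh)

# toy frames with two pests; only the first one moves its 'leg'
pest_a = (slice(5, 12), slice(5, 12))      # live pest
pest_b = (slice(20, 27), slice(20, 27))    # dead pest
f0 = np.zeros((30, 30), dtype=np.uint8)
f0[pest_a] = 100
f0[pest_b] = 100
f1 = f0.copy()
f1[6:9, 6:9] = 160                         # motion inside pest A only
live = count_live([pest_a, pest_b], motion_mask(f0, f1))
live_percentage = 100.0 * live / 2
```

In the device the regions would come from the particle-separation step and the morphological noise removal would precede the counting; this sketch only shows the differencing-and-association core.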
The foregoing embodiments of the invention have been described and illustrated in conjunction with systems and methods thereof, which are meant to be merely illustrative and not limiting. Furthermore, just as every particular embodiment may employ particular methods or systems yet not require them, such teaching is ultimately meant for all expressions thereof, notwithstanding the use of particular embodiments.
Any term that has been defined above and used in the claims, should be interpreted according to this definition.
The reference numbers in the claims are not a part of the claims, but rather used for facilitating the reading thereof. These reference numbers should not be interpreted as limiting the claims in any form.

Claims

1. A method for sample inspection in a device comprising a lighting chamber, feedback controlled illumination elements, imaging means, and computation means comprising steps of: a. setting the exposure of said imaging means; b. capturing a sequence of images using said imaging means, including a background
image and a brightness normalization image; c. performing white balancing of said imaging means; d. controlling the brightness of said illumination elements to achieve a predetermined average brightness level to correct for brightness non-uniformity; e. defining a Master comprising a set of images, said background image, said brightness normalization image, and a set of parameters comprising required parameters for samples to pass inspection, said parameters including brightness thresholds, color thresholds, and morphology thresholds; f. imaging a sample to be inspected; g. analyzing said sample in terms of said parameters to arrive at a classification thereof, wherein machine-independent classification of samples is achieved that is unaffected by variations in hardware.
2. The method of claim 1 wherein said Master is defined remotely.
3. The method of claim 1 wherein said imaging means comprises a lens, further performing steps of lens geometric undistortion adapted to correct the lens fisheye effect and other lens geometric distortions.
4. The method of claim 1 wherein said white balancing comprises color calibration adapted to correct the outputs of the different channels of the photometric sensor of said imaging means to produce balanced intensity values relative to each other corresponding to the true color of the object captured in the image.
5. The method of claim 4 wherein said color balance is accomplished mathematically in software, or through use of hardware such as RGB LEDs whose intensity in various parts of the spectrum may be adjusted individually.
6. The method of claim 1 wherein said analysis comprises segmentation of the background from the sample, classifying the pass and the fail samples of the material, and measuring color modalities, color distributions, geometry, shape, size and depth properties of the material.
7. The method of claim 6 wherein said background segmentation uses background model
information to subtract the background image from the foreground image, or applies functions on the values of different color channels of the pixel and its surroundings to discriminate between foreground and background pixels.
8. The method of claim 1 wherein the homography, which is a two-dimensional rigid body
transformation and scaling, estimated between the observed plane coordinates system and the image plane coordinates system is used to analyze images in a standard coordinate space.
9. The method of claim 1 further comprising steps adapted for panel sample inspection, by
generating a Master from a median image of multiple panel samples, applying an image alignment algorithm on the images to correct their offset relative to each other, defining a sensitivity mask to define sensitivity areas for differences of a sample from the master sample, applying a morphological filter, and the classification result based on the comparison of the sample against the Master.
10. The method of claim 1 further allowing the user to define parameters of color or size analysis for discriminating normal colors or sizes from exceptional colors or sizes for the sample, by defining density threshold values in the sample's color and/or size histogram, thereby separating the color intensity or size values into "pass" and "fail" regions, allowing automatic
determination of color channel value thresholds or the size thresholds from said density thresholds by the algorithm.
11. The method of claim 10 wherein said analysis uses the percentage of the exceptional color area in addition to the distance of the color histograms' modes from the Master's color histograms modes.
12. The method of claim 11 further using additional distance measures between the color histograms of the examined sample and the Master, including the Earth Mover's Distance (EMD) or Bhattacharyya distance.
13. The method of claim 1 wherein said analysis is based on quantitative features of said sample selected from the group consisting of: length, width, length-to-width ratio, area, morphology, convexity defects, shape similarity information for analysis, and combinations thereof.
14. The method of claim 1 wherein color classification for a powder is performed using methods selected from the group consisting of: the modes distance from said Master sample; a whole color histogram distance measure; edge information; intensity gradient information; and histogram entropy.
15. The method of claim 1 wherein said classification is performed using color uniformity-based classification employing entropy measures or variance of the color histograms.
16. The method of claim 1 further obtaining depth information of the observed objects obtained by means of illumination using structured light, or using two imagers to achieve the stereopsis effect of a calibrated camera pair.
17. The method of claim 16 wherein said classification is performed in the features of volume and height using the three-dimensional information.
18. The method of claim 16 further weighing said sample using weighing means to determine the unit mass of the inspected material.
19. The method of claim 1 wherein said parameters can be configured by the user at any stage using any means of input to the computational device, in particular, using text boxes, scroll bars, check boxes and other features of the graphic user interface.
20. The method of claim 1 employing several user-selectable color spaces including but not limited to: RGB, HSV, CIE Lab.
21. The method of claim 1 further automatically recommending to the user which color channels to use for the color segmentation or algorithms, or automatically setting those color channels using criteria of entropy minimization or other measures derived from the channel's color histogram, indicating ordered information in this color channel.
22. The method of claim 1 wherein said classification employs a threshold sensitivity difference of CIE Lab ΔΕ=2, the minimal distinguishable color difference of a human, or a multiple thereof.
23. The method of claim 1 further using backlight illumination with or without color filters.
24. The method of claim 1 further allowing the user to configure parameters of background
segmentation while adjusting the background subtraction threshold and further selecting color channels and the parameters for the background-foreground classification functions.
25. The method of claim 1 further using Open, Close, Erode and Dilate morphological operations for noise removal in the Foreground Mask or in the Exceptional Color areas mask.
26. The method of claim 1 wherein said illumination means is selected from the group consisting of IR, Red, Green, Blue, and UV illumination.
27. The method of claim 1 further allowing configuration of Bulk or Particles color or size analysis based on the representation of the Cumulative Distribution Function (CDF) derived from the Normalized 1D Color Histogram of all or any subset of the color channels of the Sample Foreground Image in the case of color, and derived from the sizes histogram otherwise.
28. The method of claim 1 wherein said background segmentation is performed using the ratio R12 = CH1/CH2 between two color channels CH1 and CH2 to discriminate effectively between background and foreground areas, and between accepted and rejected color for a given product, said ratio being user selectable, and further classifying pixels that satisfy binary relations relative to thresholds.
29. The method of claim 1 using a top LED ring and a specular background material of a distinguishable color, wherein the background reflects its color to the imaging means, allowing the background to be distinguished and segmented well.
30. The method of claim 1 wherein the Master is represented using: (a) a set of images containing variations of the acceptable product appearances, and some instances of product defects; (b) an image of a clear background panel; (c) the brightness normalization image from Claim 3; (d) parameters that define foreground segmentation, defects segmentation and classification, pass or fail definitions, user settings, image processing parameters, vibrating parameters, and tolerances in terms of color, shape and size.
31. The method of claim 1 wherein the Master sample is created using steps: a. a sample image is captured, and the acceptable color ranges are learned b. another sample image is captured, and the user examines the automatic classification; c. if the classification isn't correct, any subset of parameters can be adjusted by the user, until sufficient classification is achieved on all the Master images; d. if the classification is correct, but the captured images don't represent the possible
variations of the product sufficiently, images of the missing product variations have to be captured and added to the Master; e. when the classification is correct and the covered product appearance variations range is sufficient, the Master is ready for operational use.
32. The method of claim 1 further separating particles mechanically by applying vibration on the panel on which the sample is placed, applying a set of vibrations of similar or different duration, frequency and amplitude of up to 1 mm and frequency between 1 and 500 Hz to achieve optimal spreading, parameters of said vibration being added to the definition of said Master.
33. The method of claim 1 further allowing classification into quality classes based on defect
quantification as defined by the user.
34. The method of claim 33 wherein parameters defining the different classes are defined by the user either directly, by defining thresholds on selected parameters, or by indicating individual particles in a captured image corresponding to the various quality classes, thresholds for classification being calculated directly if the classes are separable, or calculated using any of a number of learning algorithms, such as SVM, K-means, or others as known in the art.
35. The method of claim 1 further employing a Metamaster allowing the unification of several Masters into a series of checks performed at once on a single product, comprising several Master checks performed serially. The series can require one or several spreads of samples in similar or different ways, and one or several images taken of each sample in similar or different imaging conditions.
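The serial composition of Master checks under a Metamaster can be sketched as follows. The `check` attribute on each Master and the all-checks-must-pass rule are assumptions made for the example, not details from the claim:

```python
def run_metamaster(sample, masters):
    """Metamaster sketch: run each Master's check serially on one product.

    `masters` is a sequence of objects exposing a hypothetical
    check(sample) -> bool. Here the product passes only if every
    individual Master check passes; per-check results are also returned
    so the caller can report which check failed.
    """
    results = [m.check(sample) for m in masters]
    return all(results), results
```

Each Master in the series could use its own spread of the sample and its own imaging conditions, as the claim allows.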
36. The method of claim 1 further employing Spatiogram and variants thereof to discriminate between different color-in-space distributions on the particles, bulk, or other samples.
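A spatiogram augments an ordinary color histogram with per-bin spatial statistics, which is what lets it separate samples whose color histograms match but whose color-in-space layouts differ. The sketch below is a first-order variant (count plus spatial mean per bin); the input format and bin count are assumptions for the example:

```python
from collections import defaultdict

def spatiogram(pixels, n_bins=8):
    """First-order spatiogram: per color bin, the pixel count plus the
    spatial mean of the contributing pixel positions.

    pixels: iterable of (x, y, value) with value in [0, 1). Two samples
    with identical histograms but different spatial color layouts yield
    different per-bin means, so they remain distinguishable.
    """
    bins = defaultdict(lambda: [0, 0.0, 0.0])  # count, sum_x, sum_y
    for x, y, v in pixels:
        b = min(int(v * n_bins), n_bins - 1)   # clamp v == 1.0 edge case
        bins[b][0] += 1
        bins[b][1] += x
        bins[b][2] += y
    return {b: (c, sx / c, sy / c) for b, (c, sx, sy) in bins.items()}
```

Higher-order variants would also store a per-bin spatial covariance, sharpening the discrimination further.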
37. The method of claim 1 wherein said Master is adapted to allow for direct comparison of samples analyzed at disparate locations, under disparate conditions of ambient lighting.
38. The method of claim 1 applied to stem counting as described in this patent.
39. The method of claim 1 applied to pest counting and classification by motion as described in this patent.
PCT/IL2014/050628 2013-07-11 2014-07-11 A method and apparatus for inspection and quality assurance of material samples using qualified user definitions and data derived from images in a controlled environment WO2015004672A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361844956P 2013-07-11 2013-07-11
US61/844,956 2013-07-11

Publications (1)

Publication Number Publication Date
WO2015004672A1 true WO2015004672A1 (en) 2015-01-15

Family

ID=52279423

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2014/050628 WO2015004672A1 (en) 2013-07-11 2014-07-11 A method and apparatus for inspection and quality assurance of material samples using qualified user definitions and data derived from images in a controlled environment

Country Status (1)

Country Link
WO (1) WO2015004672A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5253302A (en) * 1989-02-28 1993-10-12 Robert Massen Method and arrangement for automatic optical classification of plants
US20030072484A1 (en) * 2001-09-17 2003-04-17 Kokko Eric Gerard Method and apparatus for identifying and quantifying characteristics of seeds and other small objects
WO2007068056A1 (en) * 2005-12-14 2007-06-21 Grains Research And Development Corporation Stain assessment for cereal grains

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016001513A1 (en) * 2016-02-10 2017-08-10 Schölly Fiberoptic GmbH Correction method, image recording method and image pickup device for improved white balance when taking an image in changing light conditions
DE102016001513B4 (en) 2016-02-10 2024-02-29 Schölly Fiberoptic GmbH Correction method, image recording method and image recording device for improved white balance when recording an image in changing lighting conditions


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14822849

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 02/05/2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14822849

Country of ref document: EP

Kind code of ref document: A1