WO2022163421A1 - Method and apparatus for detecting foreign matter contained in an inspection target - Google Patents
Method and apparatus for detecting foreign matter contained in an inspection target
- Publication number
- WO2022163421A1 (PCT/JP2022/001492)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- foreign matter
- region
- image data
- pixel
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V8/00—Prospecting or detecting by optical means
- G01V8/10—Detecting, e.g. by using light barriers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10036—Multispectral image; Hyperspectral image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30128—Food products
Definitions
- the present disclosure relates to a method and apparatus for detecting foreign matter contained in an inspection target.
- A hyperspectral image contains spectral information for many wavelength bands, each of which is a narrow band; for example, it may contain spectral information of ten bands or more.
- a camera that acquires such multi-wavelength information is called a “hyperspectral camera”.
- Hyperspectral cameras are used in various fields such as food inspection, biopsy, pharmaceutical development, and mineral composition analysis.
- a wavelength band may be referred to as a band.
- Patent Document 2 discloses a method of recognizing an object using machine learning from a hyperspectral image generated using compressed sensing.
- a method is a method for detecting a specific foreign substance contained in an inspection target.
- the method includes: acquiring first image data of the inspection target, in which each pixel has a pixel value for a first band group including one or more wavelength bands; determining, from the first image data, one or more pixel regions satisfying a first condition as a first foreign matter region; acquiring second image data of a region including the first foreign matter region, in which each pixel has a pixel value for a second band group including more wavelength bands than the first band group; determining, from the second image data, one or more pixel regions satisfying a second condition different from the first condition as a second foreign matter region in which the specific foreign matter exists; and outputting information about the second foreign matter region.
- Computer-readable recording media include non-volatile recording media such as CD-ROMs (Compact Disc-Read Only Memory).
- a device may consist of one or more devices. When the device is composed of two or more devices, the two or more devices may be arranged in one device, or may be divided and arranged in two or more separate devices. As used herein and in the claims, a "device" can mean not only one device, but also a system of multiple devices.
- FIG. 1B is a diagram schematically showing another configuration example of a hyperspectral imaging system
- FIG. 1C is a diagram schematically showing still another configuration example of a hyperspectral imaging system
- FIG. 1D is a diagram schematically showing still another configuration example of a hyperspectral imaging system
- FIG. 2A is a diagram schematically showing an example of a filter array
- FIG. 2B is a diagram showing an example of a spatial distribution of light transmittance in each of a plurality of wavelength bands included in a target wavelength band
- FIG. 2C is a diagram showing an example of spectral transmittance of one region included in a filter array
- FIG. 2D is a diagram showing an example of spectral transmittance of another region included in the filter array
- FIG. 3A is a diagram for explaining the relationship between a target wavelength band W and a plurality of wavelength bands W1, W2, ..., WN included therein
- FIG. 3B is another diagram for explaining the relationship between the target wavelength band W and the plurality of wavelength bands W1, W2, ..., WN included therein
- FIG. 4A is a diagram for explaining spectral transmittance characteristics in a certain region of the filter array
- FIG. 4B is a diagram showing the result of averaging the spectral transmittance for each wavelength band
- FIG. 1 is a diagram schematically showing a configuration example of an inspection system according to Embodiment 1;
- FIG. 1 is a block diagram showing a configuration example of an inspection system;
- FIG. 5 is a flow chart showing an example of a two-step foreign object detection operation by a processing circuit;
- FIG. 11 is a flow chart showing details of a foreign object detection process based on a second condition in step S150;
- FIG. 2 is a diagram showing an example of information stored in a storage device
- FIG. 2 shows an example of the reflectance spectra of two metals and the background, and the first band group
- FIG. 10 is a diagram showing an example of an image generated for each band by restoration calculation using the restoration table corresponding to the first band group;
- FIG. 12 is a diagram showing an example of a first foreign matter region cut out in the example shown in FIG. 11;
- FIG. 2 shows an example of reflectance spectra of two metals and a background and a second group of bands;
- FIG. 11 is a block diagram showing a configuration example of an inspection system according to Embodiment 2;
- FIG. 10 is a flow chart showing an example of the operation of two-step foreign matter detection in the second embodiment
- a diagram showing coordinate axes and coordinates
- all or part of a circuit, unit, device, member, or section, or all or part of a functional block in a block diagram, may be implemented by one or more electronic circuits including, for example, a semiconductor device, a semiconductor integrated circuit (IC), or an LSI (large-scale integration).
- An LSI or IC may be integrated on one chip, or may be configured by combining a plurality of chips.
- functional blocks other than memory elements may be integrated into one chip.
- they are called LSIs or ICs, but they may be called system LSIs, VLSIs (very large scale integration), or ULSIs (ultra large scale integration) depending on the degree of integration.
- a Field Programmable Gate Array (FPGA), which is programmed after the LSI is manufactured, or a reconfigurable logic device that can reconfigure the connection relationships inside the LSI or set up the circuit partitions inside the LSI can also be used for the same purpose.
- the functions of all or part of circuits, units, devices, members, or parts can be implemented by software processing.
- the software is recorded on one or more non-transitory storage media, such as ROMs, optical discs, or hard disk drives, and when the software is executed by a processor, the functions specified in the software are performed by the processor and peripheral devices.
- a system or apparatus may comprise one or more non-transitory storage media on which software is recorded, a processing unit, and required hardware devices such as interfaces.
- the above method is executed by a computer.
- one or more first foreign matter regions satisfying the first condition are first determined from the first image data containing information of a relatively small number of bands.
- the first foreign matter region may be a region in which it is estimated that a specific foreign matter is highly likely to exist.
- second image data including information of more bands is acquired for the region including the first foreign matter region.
- one or more pixel regions satisfying the second condition are determined as the second foreign matter regions in which the specific foreign matter is present.
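The two-step flow described above can be sketched in Python as follows. This is only an illustrative sketch: the callables `first_condition`, `acquire_second`, and `second_condition` are hypothetical stand-ins for the operations described in this disclosure, not names used in it.

```python
def two_step_detection(first_image, first_condition, acquire_second, second_condition):
    """Sketch of two-step foreign matter detection.

    first_condition:  returns candidate pixel regions from few-band data.
    acquire_second:   returns many-band data for one candidate region.
    second_condition: decides whether a region contains the target foreign matter.
    """
    # Step 1: cheap screening on the first image data (few bands)
    candidates = first_condition(first_image)
    # Step 2: detailed check with many-band data, restricted to candidates
    detections = []
    for region in candidates:
        second_data = acquire_second(region)
        if second_condition(second_data):
            detections.append(region)
    return detections  # the "second foreign matter regions"
```

Because the expensive many-band acquisition and classification run only on the candidate regions from step 1, the overall processing load stays low even when the full image is large.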
- the first condition may be that the pixel region is composed of a plurality of consecutive pixels whose pixel values in the first band group satisfy a specific condition, and that the size of the pixel region exceeds a predetermined size.
- For example, when the first band group includes one band, if a group of consecutive pixels whose pixel values are within a predetermined range exceeds the predetermined size, the pixel region composed of those pixels may be detected as the first foreign matter region.
- When the first band group includes two bands, if a group of consecutive pixels for which the ratio or difference between the pixel values of those two bands is within a predetermined range exceeds the predetermined size, the pixel region composed of those pixels may be detected as the first foreign matter region.
- a plurality of pixels being “contiguous” means that those pixels are adjacent or close together in the image.
- the "predetermined size” can be, for example, a threshold regarding the number of pixels in the pixel region, the diameter of the circumscribed circle, or the diameter of the inscribed circle.
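The first condition (a connected group of pixels whose values satisfy a specific condition, with the region exceeding a size threshold) can be sketched as a flood fill over a 2D image. The value range `[lo, hi]` and the pixel-count threshold `min_pixels` are hypothetical parameters standing in for the "predetermined range" and "predetermined size" above.

```python
from collections import deque

def find_candidate_regions(img, lo, hi, min_pixels):
    """Label 4-connected groups of pixels whose value lies in [lo, hi];
    return the pixel lists of groups larger than min_pixels
    (candidate 'first foreign matter regions')."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if seen[y][x] or not (lo <= img[y][x] <= hi):
                continue
            # breadth-first flood fill from this seed pixel
            queue, component = deque([(y, x)]), []
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                component.append((cy, cx))
                for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and lo <= img[ny][nx] <= hi):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(component) > min_pixels:
                regions.append(component)
    return regions
```

Using a pixel count as the size criterion corresponds to one of the threshold choices mentioned above; a circumscribed- or inscribed-circle diameter could be computed from `component` instead.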
- the second condition may be that the pixel region is classified into one of the classes in a predetermined classification list based on its pixel values for the second band group. For example, if a combination (for example, a ratio) of the pixel values of a plurality of bands included in the second band group satisfies a predetermined condition, the region may be determined to belong to one of the classes in the predetermined classification list. Classification may also be performed by a model trained in advance using learning data.
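One way the ratio-based classification could look in code is sketched below. The labels, band wavelengths, reference ratios, and tolerance are all invented for illustration and do not come from this disclosure.

```python
def classify_by_band_ratio(pixel_values, classes, tolerance=0.1):
    """Classify a pixel region's second-band-group values.

    pixel_values: dict mapping band (e.g. wavelength in nm) -> value.
    classes: list of (label, band_a, band_b, expected_ratio) entries,
             playing the role of the 'predetermined classification list'.
    Returns the first label whose reference ratio matches within
    tolerance, or None if no class matches.
    """
    for label, band_a, band_b, expected in classes:
        denom = pixel_values.get(band_b, 0.0)
        if denom == 0.0:
            continue  # cannot form the ratio for this class
        ratio = pixel_values[band_a] / denom
        if abs(ratio - expected) <= tolerance:
            return label
    return None
```

A trained classifier, as mentioned above, would replace the fixed ratio test with a model evaluated on the full band vector.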
- Obtaining the first image data may include obtaining compressed image data in which image information for each of a plurality of wavelength bands including the second band group is compressed into one two-dimensional image, and generating the first image data from the compressed image data.
- Obtaining the second image data may include extracting the region including the first foreign matter region from the compressed image data and generating the second image data based on the data of the extracted region. The region including the first foreign matter region may be the same region as the first foreign matter region.
- Generating the first image data may include restoring the first image data from the compressed image data using a first restoration table corresponding to the first band group.
- Generating the second image data may include restoring the second image data from data of the extracted region using a second restoration table corresponding to the second band group.
- the compressed image may be generated by an imaging device comprising a filter array and an image sensor.
- the filter array may include multiple types of filters having different transmission spectra.
- the first restoration table and the second restoration table may be generated based on the spatial distribution of transmission spectra in the filter array.
- the first image data may be acquired by a first imaging operation by an imaging device
- the second image data may be acquired by a second imaging operation by the imaging device.
- the imaging device is not limited to the imaging device with the filter array described above, and may be any hyperspectral imaging device.
- the method may further include outputting a warning to an output device when the second foreign object area is detected.
- Said output device may be, for example, one or more devices selected from the group consisting of a display, a speaker, a buzzer, and a lamp.
- Alerts may include one or more information selected from the group consisting of, for example, light, sound, image, text, and vibration.
- the method may further include storing the location of the first foreign matter region and the location of the second foreign matter region in a storage device.
- the position is a position in the image and can be specified by two-dimensional coordinate values.
- a device detects foreign matter contained in an inspection target.
- the apparatus includes a processor and a storage medium storing a computer program.
- the processor executes the computer program to perform steps corresponding to (a) through (e) below, beginning with (a) acquiring first image data of the inspection target, in which each pixel has a pixel value for a first band group including one or more wavelength bands.
- a computer program is a computer program for detecting foreign matter contained in an inspection target.
- the computer program causes the computer to: (a) acquire first image data of the inspection target, in which each pixel has a pixel value for a first band group including one or more wavelength bands; (b) determine, from the first image data, one or more pixel regions that satisfy a first condition as a first foreign matter region; (c) acquire second image data of a region including the first foreign matter region, in which each pixel has a pixel value for a second band group including more wavelength bands than the first band group; (d) determine, from the second image data, one or more pixel regions satisfying a second condition as a second foreign matter region in which the specific foreign matter exists; and (e) output information about the second foreign matter region.
- FIG. 1A is a diagram schematically showing a configuration example of a hyperspectral imaging system.
- This system includes an imaging device 100 and a processing device 200 .
- the imaging device 100 has a configuration similar to that of the imaging device disclosed in Patent Document 2.
- The imaging device 100 includes an optical system 140, a filter array 110, and an image sensor 160.
- Optical system 140 and filter array 110 are arranged on the optical path of light incident from object 70, which is a subject.
- Filter array 110 is placed between optical system 140 and image sensor 160 .
- the filter array 110 is an array of a plurality of translucent filters arranged in rows and columns.
- the multiple filters include multiple types of filters having different spectral transmittances, ie, wavelength dependencies of light transmittances.
- the filter array 110 modulates the intensity of incident light for each wavelength and outputs the modulated light. This process by filter array 110 is referred to herein as "encoding.”
- the filter array 110 is arranged near or directly above the image sensor 160 .
- "near" means close enough that the image of the light from the optical system 140 is formed on the surface of the filter array 110 in a somewhat clear state.
- “Directly above” means that they are so close to each other that there is almost no gap. Filter array 110 and image sensor 160 may be integrated.
- the optical system 140 includes at least one lens. Although optical system 140 is shown as a single lens in FIG. 1A, it may be a combination of multiple lenses. Optical system 140 forms an image on the imaging surface of image sensor 160 via filter array 110.
- Filter array 110 may be located remotely from image sensor 160 .
- FIGS. 1B to 1D are diagrams showing configuration examples of the imaging device 100 in which the filter array 110 is arranged away from the image sensor 160.
- In the configuration of FIG. 1B, filter array 110 is positioned between optical system 140 and image sensor 160, at a distance from image sensor 160.
- In the configuration of FIG. 1C, filter array 110 is positioned between object 70 and optical system 140.
- In the configuration of FIG. 1D, imaging device 100 comprises two optical systems 140A and 140B, with filter array 110 positioned between them.
- an optical system including one or more lenses may be arranged between filter array 110 and image sensor 160 .
- the image corresponding to wavelength band W1, that is, the image of wavelength band W1, is denoted as image 250W1; the same notation is used for the other wavelength bands.
- the image sensor 160 is a monochrome photodetector having a plurality of two-dimensionally arranged photodetection elements (also referred to as "pixels" in this specification).
- the image sensor 160 can be, for example, a CCD (Charge-Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor, or an infrared array sensor.
- the photodetector includes, for example, a photodiode.
- Image sensor 160 does not necessarily have to be a monochrome type sensor. For example, color type sensors with R/G/B, R/G/B/IR, or R/G/B/W filters may be used.
- the wavelength range to be acquired may be arbitrarily determined, and is not limited to the visible wavelength range, and may be the ultraviolet, near-infrared, mid-infrared, or far-infrared wavelength ranges.
- the processing device 200 is a computer that includes a processor and a storage medium such as a memory. Based on the compressed image 120 acquired by the image sensor 160, the processing device 200 generates data for a plurality of images 250W1, 250W2, ..., 250WN.
- FIG. 2A is a diagram schematically showing an example of the filter array 110.
- Filter array 110 has a plurality of regions arranged two-dimensionally. In this specification, such a region may be referred to as a "cell".
- An optical filter having an individually set spectral transmittance is arranged in each region.
- the spectral transmittance is represented by a function T( ⁇ ), where ⁇ is the wavelength of incident light.
- the spectral transmittance T( ⁇ ) can take a value of 0 or more and 1 or less.
- the filter array 110 has 48 rectangular regions arranged in 6 rows and 8 columns. This is only an example and in actual applications more areas may be provided. The number may be about the same as the number of pixels of the image sensor 160, for example. The number of filters included in the filter array 110 is determined depending on the application, for example, within the range of tens to tens of millions.
- FIG. 2B is a diagram showing an example of spatial distribution of light transmittance in each of a plurality of wavelength bands W 1 , W 2 , . . . , W N included in the target wavelength range.
- the difference in shading in each region represents the difference in transmittance.
- a lighter area has a higher transmittance, and a darker area has a lower transmittance.
- the spatial distribution of light transmittance differs depending on the wavelength band.
- FIGS. 2C and 2D are diagrams respectively showing examples of spectral transmittance of area A1 and area A2 included in filter array 110 shown in FIG. 2A.
- the spectral transmittance of the area A1 and the spectral transmittance of the area A2 are different from each other.
- the spectral transmittance of filter array 110 differs depending on the region. However, it is not necessary that all regions have different spectral transmittances.
- Filter array 110 includes two or more filters having different spectral transmittances.
- the number of spectral transmittance patterns in the plurality of regions included in the filter array 110 can be equal to or greater than the number N of wavelength bands included in the wavelength range of interest.
- the filter array 110 may be designed such that more than half of the regions have different spectral transmittances.
- the target wavelength band W can be set in various ranges depending on the application.
- the target wavelength range W can be, for example, a visible light wavelength range from about 400 nm to about 700 nm, a near-infrared wavelength range from about 700 nm to about 2500 nm, or a near-ultraviolet wavelength range from about 10 nm to about 400 nm.
- the target wavelength range W may be a mid-infrared or far-infrared wavelength range.
- the wavelength range used is not limited to the visible light range.
- the term “light” refers to radiation in general, including not only visible light but also infrared rays and ultraviolet rays.
- let N be an arbitrary integer of 4 or more, and let the wavelength bands obtained by equally dividing the target wavelength band W into N parts be wavelength band W1, wavelength band W2, ..., wavelength band WN.
- a plurality of wavelength bands included in the target wavelength band W may be set arbitrarily. For example, different wavelength bands may have different bandwidths. There may be gaps or overlaps between adjacent wavelength bands. In the example shown in FIG. 3B, the wavelength bands have different bandwidths and there is a gap between two adjacent wavelength bands. In this way, the plurality of wavelength bands may be different from each other, and the method of determination is arbitrary.
- FIG. 4A is a diagram for explaining spectral transmittance characteristics in a certain region of the filter array 110.
- the spectral transmittance has multiple maxima P1 to P5 and multiple minima for wavelengths within the wavelength range W of interest.
- normalization is performed so that the maximum value of the light transmittance within the target wavelength range W is 1 and the minimum value is 0.
- the spectral transmittance has maximum values in wavelength bands such as the wavelength band W2 and the wavelength band WN-1.
- the spectral transmittance of each region has maximum values in at least two of the plurality of wavelength bands W1 to WN.
- local maxima P1, P3, P4 and P5 are greater than or equal to 0.5.
- the filter array 110 transmits many components of the incident light in certain wavelength bands and transmits fewer components in the other wavelength bands. For example, the transmittance can be greater than 0.5 for light in k wavelength bands out of the N wavelength bands, and less than 0.5 for light in the remaining N-k wavelength bands, where k is an integer satisfying 2 <= k < N. If the incident light is white light that evenly includes all wavelength components of visible light, the filter array 110 converts the incident light, region by region, into light having a plurality of discrete intensity peaks with respect to wavelength, and superimposes and outputs these multi-wavelength components.
- FIG. 4B is a diagram showing, as an example, the result of averaging the spectral transmittance shown in FIG. 4A for each wavelength band W 1 , wavelength band W 2 , . . . , wavelength band WN.
- the averaged transmittance is obtained by integrating the spectral transmittance T( ⁇ ) for each wavelength band and dividing by the bandwidth of that wavelength band.
- the transmittance value averaged for each wavelength band is defined as the transmittance in that wavelength band.
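The band averaging just described (integrate T(λ) over each band and divide by the bandwidth) can be sketched with trapezoidal integration over sampled transmittances. The sample grid and band edges below are hypothetical, and band edges are assumed to coincide with sample points.

```python
def band_average_transmittance(wavelengths, transmittances, band_lo, band_hi):
    """Average the sampled spectral transmittance T(lambda) over one band:
    trapezoidal integral of T over [band_lo, band_hi], divided by the
    bandwidth. Assumes wavelengths are sorted ascending and the band
    edges coincide with sample points."""
    samples = [(w, t) for w, t in zip(wavelengths, transmittances)
               if band_lo <= w <= band_hi]
    area = 0.0
    for (w0, t0), (w1, t1) in zip(samples, samples[1:]):
        area += 0.5 * (t0 + t1) * (w1 - w0)  # trapezoid on each interval
    return area / (band_hi - band_lo)
```

Repeating this for each of W1 through WN yields the per-band transmittance values illustrated in FIG. 4B.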
- the transmittance is remarkably high in the three wavelength regions having the maximum values P1, P3 and P5. In particular, the transmittance exceeds 0.8 in the two wavelength regions having the maximum values P3 and P5.
- a grayscale transmittance distribution is assumed in which the transmittance of each region can take any value between 0 and 1 inclusive.
- a binary-scale transmittance distribution may be employed in which the transmittance of each region can take either a value of approximately 0 or approximately 1.
- each region transmits most of the light in at least two of the plurality of wavelength bands included in the wavelength band of interest, and does not transmit most of the light in the remaining wavelength bands. Here, "most" refers to approximately 80% or more.
- Part of the total cells may be replaced with transparent areas.
- Such a transparent region transmits light in all wavelength bands W1 to WN contained in the wavelength range W of interest with a similarly high transmittance, eg, a transmittance of 80% or more.
- the plurality of transparent regions may be arranged in a checkerboard pattern, for example. That is, in the two directions in which the plurality of regions in the filter array 110 are arranged, regions whose light transmittance differs depending on the wavelength and transparent regions can be arranged alternately.
- Such data indicating the spatial distribution of the spectral transmittance of the filter array 110 is obtained in advance based on design data or actual measurement calibration, and stored in a storage medium included in the processing device 200. This data is used for arithmetic processing to be described later.
- the filter array 110 can be constructed using, for example, a multilayer film, an organic material, a diffraction grating structure, or a microstructure containing metal.
- As the multilayer film, for example, a dielectric multilayer film or a multilayer film containing a metal layer can be used.
- Each cell is formed so that at least one of the thickness, material, and stacking order of the multilayer film differs from cell to cell. Thereby, spectral characteristics that differ from cell to cell can be realized.
- With a multilayer film, sharp rises and falls in the spectral transmittance can be realized.
- A configuration using an organic material can be realized by varying the pigment or dye contained in each cell, or by laminating different kinds of materials.
- a configuration using a diffraction grating structure can be realized by providing diffraction structures with different diffraction pitches or depths for each cell.
- A configuration using a microstructure containing metal can be realized by exploiting spectral dispersion due to the plasmon effect.
- The processing device 200 generates a multi-wavelength hyperspectral image 250 based on the compressed image 120 output from the image sensor 160 and the spatial distribution characteristics of the transmittance of the filter array 110 for each wavelength.
- "Multi-wavelength" means, for example, more wavelength ranges than the three RGB color ranges acquired by an ordinary color camera.
- the number of wavelength bands may be on the order of 4 to 100, for example.
- the number of wavelength regions is called the number of bands. Depending on the application, the number of bands may exceed 100.
- Let f be the data of the hyperspectral image 250. If the number of bands is N, f is data obtained by integrating image data f1 of wavelength band W1, image data f2 of wavelength band W2, ..., and image data fN of wavelength band WN.
- The horizontal direction of the image is the x direction, and the vertical direction of the image is the y direction.
- If the number of pixels in the x direction is n and the number of pixels in the y direction is m, each of the image data f1, f2, ..., fN is two-dimensional data with n × m pixels. Therefore, the data f is three-dimensional data having n × m × N elements. This three-dimensional data is called "hyperspectral image data" or a "hyperspectral datacube".
- On the other hand, the number of elements of the data g of the compressed image 120 encoded and multiplexed by the filter array 110 is n × m.
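These dimensions can be illustrated with a short sketch (the sizes below are assumed for illustration, not taken from the text): the datacube f has n × m × N elements, while the compressed image g always has only n × m, regardless of the number of bands.

```python
import numpy as np

# Toy sizes (assumed for illustration): n x-pixels, m y-pixels, N bands.
n, m, N = 4, 3, 10

# Hyperspectral datacube f: one n x m image per band.
f = np.zeros((N, m, n))
f_vec = f.reshape(-1)        # stacked into a single n*m*N-element vector

# Compressed image g: a single n x m image, regardless of N.
g = np.zeros((m, n))

print(f_vec.size, g.size)    # n*m*N = 120 elements vs. n*m = 12 elements
```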
- Data g can be represented by the following equation (1), where f is the one-dimensional vector obtained by stacking the vectorized band images f1, f2, ..., fN:
  g = Hf ... (1)
- Here, each of f1, f2, ..., fN is data having n × m elements. Therefore, the vector f on the right side is a one-dimensional vector with n × m × N rows and 1 column.
- The vector g is likewise expressed as a one-dimensional vector with n × m rows and 1 column for the calculation.
- The matrix H represents a transformation that encodes and intensity-modulates each component f1, f2, ..., fN of the vector f with coding information that differs for each wavelength band, and then adds the results together. Therefore, H is a matrix with n × m rows and n × m × N columns.
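Equation (1) can be sketched with toy numbers (the sizes and mask values below are assumptions, not values from the text): each band image is weighted elementwise by that band's transmittance mask and the weighted bands are summed, so H is a row of diagonal blocks, one block per band.

```python
import numpy as np

# Toy sizes and random mask values (assumed for illustration).
n, m, N = 4, 3, 2
rng = np.random.default_rng(0)

# Per-band transmittance masks of the filter array, n*m values per band.
masks = rng.random((N, m * n))

# H = [diag(mask_1) | diag(mask_2) | ... | diag(mask_N)]:
# n*m rows and n*m*N columns, as stated above.
H = np.hstack([np.diag(masks[k]) for k in range(N)])

f = rng.random(m * n * N)    # vectorized datacube, band-major order
g = H @ f                    # compressed image with n*m elements
```

With this structure, each compressed pixel is simply the mask-weighted sum of that pixel's values across all bands.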
- Since the number of unknowns n × m × N is larger than the number of acquired values n × m, f cannot be obtained simply by inverting equation (1). Therefore, the processing device 200 utilizes the redundancy of the images included in the data f and obtains the solution using the method of compressed sensing. Specifically, the desired data f is estimated by solving the following equation (2):
  f' = argmin_f {‖g − Hf‖² + τΦ(f)} ... (2)
- f' represents the estimated data of f.
- The first term in parentheses in the above formula represents the amount of deviation between the estimation result Hf and the acquired data g, i.e., the so-called residual term.
- Although the sum of squares is used as the residual term here, an absolute value or a square-root-of-sum-of-squares value may be used as the residual term instead.
- The second term in parentheses is the regularization term or stabilization term. Equation (2) means finding the f that minimizes the sum of the first and second terms.
- The processing device 200 can converge on a solution by recursive iterative computation and calculate the final solution f'.
- the first term in parentheses in formula (2) means an operation for obtaining the sum of squares of the difference between the acquired data g and Hf obtained by transforming f in the estimation process using the matrix H.
- The second term, Φ(f), is a constraint in the regularization of f, and is a function that reflects the sparsity information of the estimated data. This function has the effect of smoothing or stabilizing the estimated data.
- the regularization term may be represented by, for example, the Discrete Cosine Transform (DCT), Wavelet Transform, Fourier Transform, or Total Variation (TV) of f. For example, when the total variation is used, it is possible to acquire stable estimated data that suppresses the influence of noise in the observed data g.
- The sparsity of the object 70 in the space of each regularization term depends on the texture of the object 70.
- A regularization term may be chosen in whose space the texture of the object 70 becomes sparser.
- multiple regularization terms may be included in the operation.
- τ is a weighting factor. The larger the weighting factor τ, the larger the reduction amount of redundant data and the higher the compression rate; the smaller the weighting factor τ, the weaker the convergence to the solution.
- The weighting factor τ is set to an appropriate value with which f converges to some extent and over-compression is avoided.
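A minimal sketch of solving equation (2) by recursive iteration, using an ℓ1 norm as a stand-in for the regularization term Φ(f) (the text also names DCT, wavelet, Fourier, and total-variation choices; the solver below and its parameter values are assumptions for illustration, not the document's implementation):

```python
import numpy as np

def reconstruct(g, H, tau=1e-3, n_iter=3000):
    """Proximal-gradient (ISTA) sketch of equation (2):
    minimize ||g - Hf||^2 + tau * ||f||_1 over f."""
    step = 1.0 / (2.0 * np.linalg.norm(H, 2) ** 2)  # 1 / Lipschitz constant
    f = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * H.T @ (H @ f - g)   # gradient of the residual term
        z = f - step * grad
        # soft threshold: proximal operator of tau * ||f||_1
        f = np.sign(z) * np.maximum(np.abs(z) - step * tau, 0.0)
    return f
```

With a sparse ground truth and an underdetermined H (fewer compressed pixels than unknowns), the iteration converges toward a vector that reproduces g while staying sparse; the weight tau plays the role of the factor τ described above, trading convergence against compression.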
- Even when the image formed on the image sensor is blurred, the hyperspectral image 250 can be generated by storing the blur information in advance and reflecting it in the matrix H described above.
- blur information is represented by a point spread function (PSF).
- The PSF is a function that defines the degree of spread of a point image to peripheral pixels. For example, when a point image corresponding to one pixel in an image spreads over a region of k × k pixels around that pixel due to blurring, the PSF can be defined as a matrix of coefficients indicating the effect of the blur on the brightness of each pixel in that region.
- The hyperspectral image 250 can be generated by reflecting the blurring effect of the coding pattern, expressed by the PSF, in the matrix H.
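A minimal sketch of a PSF as described (the 3 × 3 coefficient values are assumed for illustration): the coefficients sum to 1, and applying the PSF to a point image spreads the point's brightness over the k × k neighbourhood.

```python
import numpy as np

# Assumed 3 x 3 PSF: coefficients give each neighbour's share of the
# point's brightness and sum to 1, so total energy is preserved.
psf = np.array([[0.05, 0.10, 0.05],
                [0.10, 0.40, 0.10],
                [0.05, 0.10, 0.05]])

def apply_psf(image, psf):
    """Blur an image by the (symmetric) PSF via direct 2-D convolution
    with zero padding at the borders."""
    k = psf.shape[0]
    pad = k // 2
    padded = np.pad(image, pad)
    out = np.zeros_like(image, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += psf[dy, dx] * padded[dy:dy + image.shape[0],
                                        dx:dx + image.shape[1]]
    return out

point = np.zeros((5, 5))
point[2, 2] = 1.0            # a point image corresponding to one pixel
blurred = apply_psf(point, psf)
```

In a full pipeline, the same coefficients would be folded into the rows of the matrix H rather than applied as a separate convolution.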
- The position where the filter array 110 is placed is arbitrary, but a position can be selected at which the coding pattern of the filter array 110 does not become so diffuse that it disappears.
- the hyperspectral image 250 can be generated from the compressed image 120 acquired by the image sensor 160.
- the processing device 200 applies an algorithm using the principle of compressed sensing to generate the hyperspectral image 250 for all wavelength bands included in the wavelength band of interest. In this case, if the resolution of the compressed image 120 is high, the calculation load for generating the hyperspectral image 250 will be high, and the time required for inspection will be long.
- the following two stages of restoration and inspection are performed to reduce the computational load and time required for the entire inspection.
- First, the above restoration operation is performed only for a first band group containing a relatively small number of bands, instead of all bands, and an image for each of those bands is generated from the compressed image. Based on the images of those relatively few bands, a first foreign matter region, in which a particular foreign matter is likely to be present, is identified in the image.
- Next, the restoration operation is performed only for a relatively narrow region including the identified first foreign matter region, and for a second band group including more bands than the number of bands included in the first band group.
- Then, based on the restored images of the second band group, a second foreign matter region in which the specific foreign matter to be detected exists is identified from among the first foreign matter regions. Information about the identified second foreign matter region is output to an output device such as a display.
- Such a method makes it possible to detect specific foreign matter contained in the inspection target with a smaller amount of computation. As a result, the time required for the inspection process can be greatly reduced.
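The two stages above can be sketched as follows (the function and band names are placeholders, not the patent's API; `restore`, `find_candidates`, and `classify` stand in for the restoration operation and the two discrimination models):

```python
def inspect(compressed_image, restore, find_candidates, classify,
            first_bands, second_bands):
    """Two-stage inspection sketch: stage 1 restores a few bands over
    the whole image to find candidate regions; stage 2 restores more
    bands only inside those regions and classifies each one."""
    # Stage 1: full-image restoration for the small first band group.
    coarse = {b: restore(compressed_image, b, region=None)
              for b in first_bands}
    candidates = find_candidates(coarse)       # first foreign matter regions
    detections = []
    # Stage 2: per-region restoration for the larger second band group.
    for region in candidates:
        fine = {b: restore(compressed_image, b, region=region)
                for b in second_bands}
        label = classify(fine, region)         # second-condition discrimination
        if label is not None:
            detections.append((region, label)) # second foreign matter regions
    return detections
```

Because stage 2 runs only on the candidate regions, the costly many-band restoration is never applied to the full image.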
- FIG. 5 is a diagram schematically showing a configuration example of an inspection system according to this embodiment.
- This inspection system comprises an imaging device 100, a processing device 200, an output device 300, and an actuation device 400.
- Output devices 300 may include devices such as displays, speakers, and lamps.
- Actuator 400 may include devices such as conveyor belts and picking devices.
- An object 70 to be inspected is placed on a conveyor belt and conveyed.
- Object 70 is any article, such as an industrial product or a food product.
- the inspection system detects foreign matter mixed in the object 70 based on the compressed image of the object 70 .
- Foreign objects to be detected can be anything, such as certain metals, plastics, insects, debris, or hair.
- the foreign matter is not limited to these objects, and may be a portion of the object 70 whose quality has deteriorated. For example, if the object 70 is food, a rotten part of the food may be detected as a foreign object.
- the inspection system can output information indicating that the foreign object has been detected to the output device 300, or remove the object 70 including the foreign object by the picking device.
- the imaging device 100 is a camera capable of the aforementioned hyperspectral imaging.
- the imaging device 100 generates the above-mentioned compressed image by photographing the object 70 continuously flowing on the conveyor.
- Processing device 200 is any computer such as, for example, a personal computer, a server computer, or a laptop computer.
- the processing device 200 generates an image for each of a plurality of bands by performing the above-described restoration calculation based on the compressed image generated by the imaging device 100 .
- the processing device 200 detects foreign matter contained in the target object 70 based on the images of those bands, and outputs the detection result to the output device 300 .
- FIG. 6 is a block diagram showing a configuration example of an inspection system.
- the processing device 200 comprises processing circuitry 210 and a storage device 220 .
- Output device 300 includes display 310 , speaker 320 , and lamp 330 .
- Actuator 400 comprises a conveyor 410 and a picking device 420 .
- the imaging device 100 includes an image sensor, a filter array, and an optical system such as a lens, as described with reference to FIGS. 1A to 1D.
- the imaging device 100 captures an image of the object 70 to generate compressed image data, and sends the compressed image data to the processing device 200 .
- the processing device 200 generates an image for each band based on the compressed image generated by the imaging device 100.
- the processing circuitry 210 of the processing unit 200 includes a processor, such as a CPU or GPU.
- the processing circuit 210 determines whether or not the target object 70 contains a specific foreign substance based on the compressed image generated by the imaging device 100, and outputs information indicating the determination result.
- the processing circuit 210 performs two stages of restoration processing on the compressed image acquired from the imaging device 100 .
- In the first-stage restoration, only a relatively small number of bands among the multiple bands included in the target wavelength range are restored over the entire compressed image. This relatively small number of bands is referred to as the first band group.
- the number of bands included in the first band group is any number greater than or equal to 1, and in a certain example, greater than or equal to 2 and less than or equal to 5.
- processing circuit 210 uses a first reconstruction table that contains only information on matrix elements corresponding to the first band group in matrix H in equations (1) and (2) above.
- Alternatively, a synthesized restored image corresponding to a band different from the bands included in the first band group may be generated using a synthesized restoration table obtained by combining the information of the matrix elements corresponding to that band.
- the processing circuit 210 restores the image of each band in the first band group according to the above equation (2) based on the compressed image and the first restoration table.
- the processing circuit 210 first identifies a first foreign matter region in which foreign matter is highly likely to exist according to the discrimination model based on the first condition from the pixel values of the plurality of pixels included in the restored image corresponding to each band.
- the first condition is, for example, that the pixel region is composed of a plurality of consecutive pixels whose pixel values for the first band group satisfy a specific condition, and that the size of the pixel region exceeds a predetermined size.
- The processing circuit 210 causes the storage device 220 to store the first foreign matter region, and performs restoration of each band of the second band group, which includes more bands than the first band group, for a relatively narrow region including the first foreign matter region.
- processing circuitry 210 uses a second reconstruction table that contains only information on matrix elements corresponding to the second band group in matrix H in equations (1) and (2) above.
- the second band group may include all bands in the wavelength range of interest.
- Based on the compressed image and the second restoration table, the processing circuit 210 calculates, according to the above equation (2), the pixel values of the plurality of pixels in the region corresponding to the first foreign matter region included in the image of each band in the second band group.
- the processing circuit 210 identifies a second foreign matter region containing a specific foreign matter according to a discrimination model based on a second condition from pixel values of a plurality of pixels in a region corresponding to the first foreign matter region included in each band image, Information indicating the specified second foreign matter region is stored in the storage device 220 as a detection result.
- The second condition may be, for example, that the region is classified into one of a predetermined list of classes based on the pixel values for the second band group.
- the processing circuit 210 transmits a control signal to the external output device 300 and the actuation device 400 when a foreign object satisfying the second condition is detected.
- the output device 300 causes at least one of the display 310, the speaker 320, and the lamp 330 to output a warning such as light, image, text, beep, or sound in response to the received control signal.
- the actuation device 400 may switch the path of the conveyor 410 or remove from the conveyor 410 objects 70 for which foreign objects have been detected by the picking device 420 in response to the received control signal.
- the storage device 220 includes any storage medium such as semiconductor memory, magnetic storage device, optical storage device, and the like.
- Storage device 220 stores computer programs executed by processing circuitry 210, data used by processing circuitry 210 in the course of processing, and data generated by processing circuitry 210 in the course of processing.
- the storage device 220 stores, for example, compressed image data generated by the imaging device 100, a restoration table corresponding to each combination of bands such as a first band group and a second band group, a discrimination model corresponding to each combination of bands, detection Information indicating the positions on the image of the first foreign matter region and the second foreign matter region and information indicating the foreign matter determination result are stored.
- FIG. 7 is a flowchart showing an example of a two-step foreign object detection operation by the processing circuit 210.
- processing circuitry 210 performs the operations of steps S100 through S180.
- the processing circuit 210 acquires a compressed image of the object 70 generated by the compression sensing imaging by the imaging device 100 (step S100).
- Next, the processing circuit 210 performs the restoration calculation based on the above equation (2) using the first restoration table corresponding to the first band group, and generates one or more images of the object 70 for the first band group from the compressed image (step S110).
- the processing circuitry 210 applies the first discriminant model according to the first condition to the generated one or more images, and detects a first foreign matter region that satisfies the first condition (step S120).
- the processing circuitry 210 may detect two or more different types of foreign matter according to the first discriminant model.
- the processing circuitry 210 determines whether or not there is a first foreign matter region that satisfies the first condition (step S130). If the first foreign matter region exists, the process proceeds to step S140. If the first foreign matter region does not exist, the process ends.
- Processing circuit 210 causes storage device 220 to store information indicating the detected first foreign matter region (step S140).
- The processing circuit 210 performs foreign matter detection processing on the detected first foreign matter region under the second condition, based on the information of the second band group including more bands than the first band group (step S150).
- the number of types of detected foreign matter that satisfies the second condition is one or more, and may be a plurality of types.
- the number of types of foreign matter that satisfies the second condition is less than the number of types of foreign matter that satisfies the first condition.
- the number of types of foreign matter that satisfies the first condition but does not satisfy the second condition may be one or plural.
- Next, the processing circuit 210 determines whether or not there is a second foreign matter region that satisfies the second condition (step S160).
- If the second foreign matter region exists, the processing circuit 210 causes the storage device 220 to store information about the second foreign matter region (step S170). The processing circuit 210 then outputs control signals to the output device 300 and the actuation device 400 (step S180).
- the output device 300 receives a control signal from the processing circuit 210 and causes the display 310 or the like to output a warning display.
- Actuator 400 receives control signals from processing circuitry 210 to control conveyor 410, picking device 420, and the like.
- FIG. 8 is a flow chart showing the details of the foreign matter detection process based on the second condition in step S150.
- Step S150 includes steps S151 to S155 shown in FIG.
- One or more first foreign matter regions may have been detected from the compressed image according to the discrimination model based on the first condition.
- the processing circuit 210 cuts out the first foreign matter region from the compressed image (step S151).
- Next, the processing circuit 210 performs the restoration calculation based on the above formula (2) for the cut-out first foreign matter region using the second restoration table corresponding to the second band group, and generates an image for each band of the second band group (step S152).
- processing circuitry 210 selects one of the unprocessed regions from among the first foreign matter regions (step S153).
- the processing circuit 210 applies the second discrimination model based on the second condition to the selected first foreign matter region, and determines whether or not the second condition is satisfied (step S154).
- the processing circuit 210 causes the storage device 220 to store the determination result.
- the processing circuit 210 determines whether or not the processing has been completed for all of the first foreign matter regions (step S155). If an unprocessed area remains, the process returns to step S153.
- the processing of steps S153 to S155 is repeated until the processing is completed for all of the first foreign matter regions. When the processing for all the first foreign matter regions is completed, the process proceeds to step S160.
- FIG. 9 is a diagram showing an example of information stored in the storage device 220.
- the processing circuitry 210 detects particles of two metals (Metal 1 and Metal 2) as foreign objects and causes the output device 300 to issue an alert if a Metal 1 particle is detected.
- Compressed image data obtained by compressed sensing imaging is assigned a number (inspection sample No.) for each inspection sample, and recorded together with information on the date and time when the compressed image was acquired.
- metal 1 and metal 2 are detected as foreign matter based on the discrimination model based on the first condition.
- Metal 1 and metal 2 are detected separately based on the discrimination model based on the second condition. In this example, an alert is issued if Metal 1 is detected.
- FIG. 10 is a diagram showing an example of the first band group used in foreign object detection under the first condition.
- two wavelength bands representing the first band group are illustrated in gray.
- The reflectance spectra of the two types of foreign matter, i.e., metal 1 and metal 2, are also shown.
- processing circuitry 210 distinguishes between the two types of foreign matter and the background by comparing the reflectance in the two bands shown pixel by pixel.
- Although the first band group includes the two bands 350 nm ± 10 nm and 600 nm ± 10 nm in this example, this is only an example. Which bands to include in the first band group is appropriately selected according to the object to be inspected.
- the number of bands included in the first band group is any number of 1 or more, and may be 3 or more.
- FIG. 10 shows a wavelength range from 350 nm to 850 nm, other wavelength ranges may be used.
- FIG. 11 is a diagram showing an example of an image of the band near the center of the graph shown in FIG. 10 (600 nm ± 10 nm), among the images generated for each band by the restoration calculation using the restoration table corresponding to the first band group. This image reflects the reflectance distribution of the object, and the brightness varies according to the reflectance: areas with higher reflectance are displayed in white, and areas with lower reflectance are displayed in black. In this example, the black grainy areas in the image are the foreign matter, and the bright areas are the background.
- the discrimination model based on the first condition is a model for detecting a foreign object based on the size of a continuous pixel region in which the pixel values of the first band group satisfy a specific condition.
- the specific condition may be, for example, that the pixel value ratio of the two bands is within a predetermined range.
- In this example, the reflectance of the two types of foreign matter is similar in all bands, but the reflectance of the background differs significantly between the two bands. Therefore, whether the pixel value ratio of the two bands is within a range of, for example, 0.8 or more and 1.2 or less can be used to determine whether a pixel belongs to a foreign object or to the background.
- the specific condition may be whether pixel values in one band are above or below a certain threshold.
- Alternatively, when a reference band is determined from among three or more bands, the specific condition may be that the ratio of the pixel value of each band other than the reference band to the pixel value of the reference band is calculated, and each of the calculated ratio values falls within a defined range.
- In this way, a region of contiguous pixels in which a value (for example, a ratio) calculated based on the pixel values of corresponding pixels of each band included in the first band group falls within a predetermined range can be detected as a region in which a foreign substance may exist.
- The first condition may be, for example, that the diameter of the circumscribed circle of the contiguous pixel region thus detected is greater than or equal to a threshold. In the illustrated example, a pixel region whose circumscribed circle has a diameter of 1 mm or more is detected as a region in which a foreign substance (i.e., metal 1 or metal 2) exists.
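A minimal sketch of this first condition (the ratio bounds and the size threshold, here in pixels rather than millimetres, are assumed values): threshold the two-band ratio, flood-fill contiguous pixels into regions, and keep only regions whose circumscribed-circle diameter meets the minimum.

```python
import numpy as np

def first_condition_regions(band_a, band_b, ratio_lo=0.8, ratio_hi=1.2,
                            min_diameter_px=4):
    """Mark pixels whose two-band ratio lies in [ratio_lo, ratio_hi],
    group them into 4-connected regions, and keep regions whose
    circumscribed-circle diameter (in pixels) is large enough."""
    ratio = band_a / np.maximum(band_b, 1e-12)
    mask = (ratio >= ratio_lo) & (ratio <= ratio_hi)
    h, w = mask.shape
    seen = np.zeros_like(mask)
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                stack, pixels = [(y, x)], []
                seen[y, x] = True
                while stack:                    # flood fill one region
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                pts = np.array(pixels, dtype=float)
                # farthest pixel pair approximates the circumscribed diameter
                dists = np.sqrt(((pts[:, None] - pts[None]) ** 2).sum(-1))
                if dists.max() + 1.0 >= min_diameter_px:
                    regions.append(pixels)
    return regions
```

In a real system the pixel threshold would be derived from the 1 mm diameter and the imaging resolution.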
- The condition for determination is not limited to one based on the reflectance ratio between wavelength bands; it may be a condition based on the difference, or a condition derived by machine learning.
- the processing circuit 210 assigns an ID to each foreign matter area detected according to the first condition, and records the position on the image in the storage device 220 .
- the position of the foreign matter area recorded may be a representative position such as the center or the center of gravity of the foreign matter area.
- the foreign matter area detected in this way is called a "first foreign matter area”.
- the first foreign matter region can be detected at a plurality of locations according to the number of foreign matter.
- the processing circuit 210 cuts out one or more detected first foreign matter regions from the compressed image. Then, according to the discrimination model under the second condition, foreign matter determination processing for metal 1 and metal 2 is performed for each foreign matter region that is cut out, and the determination result (for example, the type of foreign matter) is stored in the storage device 220 .
- FIG. 13 is a diagram showing an example of the second band group in foreign object detection under the second condition.
- the wavelength range displayed in gray is the second band group.
- the second group of bands in this example includes nine bands.
- Metal 1 and metal 2 can be distinguished by comparing the reflectance in the nine bands.
- the condition for determination may be, for example, whether the pixel value ratio or difference between bands is within a predetermined range, or a condition derived by machine learning.
- For example, taking band 9 as a reference, the discrimination conditions can be that the band-2 ratios of metal 1 and metal 2 are equal, and that the ratio of metal 2 is lower than that of metal 1 in bands 4 to 7.
- Here, band 1 is the band on the shortest-wavelength side, and band 9 is the band on the longest-wavelength side.
- That is, the following may be used as the discrimination conditions:
  (band 2 pixel value of metal 1)/(band 9 pixel value of metal 1) = (band 2 pixel value of metal 2)/(band 9 pixel value of metal 2),
  (band 4 pixel value of metal 1)/(band 9 pixel value of metal 1) > (band 4 pixel value of metal 2)/(band 9 pixel value of metal 2),
  (band 5 pixel value of metal 1)/(band 9 pixel value of metal 1) > (band 5 pixel value of metal 2)/(band 9 pixel value of metal 2),
  (band 6 pixel value of metal 1)/(band 9 pixel value of metal 1) > (band 6 pixel value of metal 2)/(band 9 pixel value of metal 2),
  (band 7 pixel value of metal 1)/(band 9 pixel value of metal 1) > (band 7 pixel value of metal 2)/(band 9 pixel value of metal 2).
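The ratio conditions above can be sketched as follows (the tolerance for "equal" and the sample spectra in the usage below are assumptions; band index 0 corresponds to band 1 and index 8 to the reference band 9):

```python
import numpy as np

def check_discrimination(metal1, metal2, tol=0.05):
    """Check the stated conditions for two measured 9-band spectra:
    the band-2/band-9 ratios of the two metals are (approximately)
    equal, and for bands 4-7 the band/band-9 ratio of metal 1
    exceeds that of metal 2. tol is an assumed tolerance."""
    r1 = metal1 / metal1[8]          # ratios relative to band 9
    r2 = metal2 / metal2[8]
    band2_equal = abs(r1[1] - r2[1]) <= tol
    bands4to7_greater = all(r1[k] > r2[k] for k in range(3, 7))
    return bool(band2_equal) and bands4to7_greater
```

In practice the same ratios, computed per pixel of the first foreign matter region, would decide whether the region is labelled metal 1 or metal 2.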
- In the case of machine learning, a model for estimating the type of foreign matter from the pixel value information of each band is derived based on training data that includes pixel value information for each band together with information indicating the type of foreign matter.
- the processing circuit 210 causes the output device 300 to output a warning if metal 1 is detected as a result of the determination. As shown in FIG. 9, the processing circuitry 210 may cause the storage device 220 to store that the warning has been output.
- the foreign matter for which a warning is output is not limited to one type, and may be two or more types.
- In this way, among the first foreign matter regions detected based on the pixel values of each band of the first band group, a region that is classified into one of a plurality of predetermined classes based on a combination of pixel values for the second band group may be set as a second foreign matter region in which a specific foreign matter is present.
- As described above, one or more regions in which one or more kinds of foreign matter may exist are first detected as the first foreign matter regions by the first-stage detection operation with a narrowed number of bands. After that, a second foreign matter region containing a specific foreign matter is detected from among the first foreign matter regions by the second-stage detection operation based on more bands.
- Such an operation makes it possible to greatly reduce the computational load and the inspection time compared to the case of detecting a foreign object based on the information of all bands in the entire image.
- the method for determining the first foreign matter region may take into consideration the following.
- In the following, the first band group (that is, the one or more wavelength bands) is assumed to consist of the wavelength band W1 and the wavelength band WN.
- The processing circuit 210 computes image data f1 for wavelength band W1 from the compressed image 120 to produce an image 250W1 containing image data f1, and computes image data fN for wavelength band WN from the compressed image 120 to generate an image 250WN containing image data fN.
- the first image includes image 250W1 and image 250WN.
- The processing circuit 210 does not compute the image data f2, ..., fN−1 from the compressed image 120. That is, at S110, the processing circuit 210 does not generate the images 250W2, ..., 250WN−1.
- The image data f1 and the image data fN are obtained by solving the inverse problem of the following equation (3) instead of the above equation (1), where H1 and HN denote the submatrices of H corresponding to the wavelength bands W1 and WN (i.e., the first restoration table):
  g = [H1 HN][f1; fN] ... (3)
- The plurality of pixel values of the plurality of pixels included in the compressed image 120 are denoted P(g11), ..., P(gmn), where P(gij) is the pixel value of the pixel gij located at coordinates (i, j) in the compressed image 120. The coordinate axes and coordinates may be taken as shown in the figure.
- The plurality of pixel values of the plurality of pixels included in the image 250Wk are denoted P(fk11), ..., P(fkmn).
- The pixel value P(fpij) included in the image data fp and the pixel value P(fqij) included in the image data fq are pixel values at the same location on the subject. This can be ensured by determining H so that the pixel fpij and the pixel fqij correspond to the same position on the object.
- the pixel located at the coordinates (r, s) may be determined not to be included in a region of contiguous pixels.
- In the following, the first foreign matter region determined by the above method is assumed to be the region consisting of coordinates (r, s), (r+1, s), (r, s+1), and (r+1, s+1).
- The coordinates of the first foreign matter region, namely (r, s), (r+1, s), (r, s+1), and (r+1, s+1), are recorded in the storage device 220.
- a method for determining the second foreign matter region may be one that considers the following.
- the second band group (that is, the plurality of second wavelength bands) will be described as wavelength band W t , wavelength band W u , wavelength band W v , and wavelength band W w .
- the number of second plurality of wavelength bands (four in this example) is greater than the number of one or more wavelength bands (two in the example above). From compressed image 120 and matrix H, pixel values for some pixels of image 250Wt are computed, and pixel values for other pixels of image 250Wt are not computed.
- Of the image data ft of the image 250Wt, i.e., P(ft11), ..., P(ftmn), only P(ftrs), P(ft(r+1)s), P(ftr(s+1)), and P(ft(r+1)(s+1)) are calculated; the pixel values other than these four are not calculated.
- Likewise, of the image data fw of the image 250Ww, i.e., P(fw11), ..., P(fwmn), only P(fwrs), P(fw(r+1)s), P(fwr(s+1)), and P(fw(r+1)(s+1)) are calculated; the pixel values other than these four are not calculated.
- Foreign matter detection under the second condition is performed using the pixel values P(ftrs), P(ft(r+1)s), P(ftr(s+1)), and P(ft(r+1)(s+1)). In this detection, these four pixel values included in the image 250Wt are used, and the other (m × n − 4) pixel values included in the image 250Wt are not used.
- Foreign matter detection under the second condition is performed using the pixel values P(furs), P(fu(r+1)s), P(fur(s+1)), and P(fu(r+1)(s+1)). In this detection, these four pixel values included in the image 250Wu are used, and the other (m × n − 4) pixel values included in the image 250Wu are not used.
- Foreign matter detection under the second condition is performed using the pixel values P(fvrs), P(fv(r+1)s), P(fvr(s+1)), and P(fv(r+1)(s+1)). In this detection, these four pixel values included in the image 250Wv are used, and the other (m × n − 4) pixel values included in the image 250Wv are not used.
- Foreign matter detection under the second condition is performed using the pixel values P(fwrs), P(fw(r+1)s), P(fwr(s+1)), and P(fw(r+1)(s+1)). In this detection, these four pixel values included in the image 250Ww are used, and the other (m × n − 4) pixel values included in the image 250Ww are not used.
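The selective computation described above, restoring only the pixel values inside the first foreign matter region for each band of the second band group, can be sketched as follows. This is a simplified least-squares sketch under assumed shapes for the compressed image g and the system matrix H; it is not the reconstruction algorithm actually specified in this disclosure.

```python
import numpy as np

def restore_region(g, H, region_indices, num_bands):
    """Restore only the region pixels of each per-band image.

    g: flattened compressed image, shape (m*n,)
    H: system matrix relating per-band images to g, shape (m*n, m*n*num_bands)
    region_indices: flat pixel indices of the first foreign matter region
    """
    mn = g.size
    # Columns of H corresponding to the region pixels in every band.
    cols = np.concatenate([b * mn + np.asarray(region_indices)
                           for b in range(num_bands)])
    H_sub = H[:, cols]
    # Least-squares estimate of only those unknowns; the other pixel
    # values of the per-band images are simply never computed.
    x, *_ = np.linalg.lstsq(H_sub, g, rcond=None)
    return x.reshape(num_bands, len(region_indices))
```

For a 2x2 region and four second-group bands, the result has shape (4, 4): one row per band, one column per region pixel.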
- FIG. 14 is a block diagram showing a configuration example of an inspection system according to this embodiment.
- In the embodiment described above, the imaging device 100 and the processing circuit 210 restore the image of each band from a compressed image using compressed sensing.
- In this embodiment, by contrast, the imaging device 100 captures an image for each band, and the plurality of images obtained by this imaging together constitute image data containing the information of the desired band group.
- Instead of including the filter array 110 shown in FIGS. 1A to 1D, the imaging apparatus 100 in this embodiment has a mechanism for imaging band by band and generating an image of each band.
- the imaging device 100 directly generates an image of each band. Therefore, the storage device 220 does not store the reconstruction table used to generate the image of each band.
- the imaging device 100 captures images of each of the bands included in the first band group, and generates a plurality of images corresponding to those bands.
- the processing circuit 210 detects the first foreign matter region from the image of the first band group generated by the imaging device 100 according to the discrimination model based on the first condition.
- The imaging device 100 captures an image for each band of the second band group, which includes more bands than the first band group, and generates a plurality of images corresponding to those bands.
- the processing circuit 210 performs a foreign object detection process from those images according to the discrimination model under the second condition, and stores the result in the storage device 220 .
- processing circuitry 210 sends control signals to output device 300 and actuator device 400 .
- the output device 300 causes the display 310 or the like to output a warning.
- Actuator 400 controls conveyor 410 and the like upon receiving a control signal.
- FIG. 15 is a flow chart showing an example of the operation of two-step foreign object detection by photographing multiple images in this embodiment.
- the processing circuit 210 in this example instructs the imaging device 100 to perform imaging for each band belonging to the first band group (step S200). Thereby, the imaging device 100 generates image data of each band in the first band group. For example, when the first band group consists of a band around 350 nm and a band around 600 nm, imaging is performed with illumination light having a wavelength of 350 nm, and then imaging is performed with illumination light having a wavelength of 600 nm.
- the wavelength of the illumination light may be adjusted by the user for each imaging, or the imaging device 100 may be provided with an illumination device and the wavelength may be adjusted automatically.
- Processing circuitry 210 acquires image data for the generated first band group (step S210).
- the processing circuit 210 performs processing for detecting the first foreign matter region from the acquired image data according to the discrimination model based on the first condition (step S230).
- the processing circuitry 210 determines whether or not there is a first foreign matter region that satisfies the first condition (step S240). If the first foreign matter area is not detected, the process ends.
- If the first foreign matter region is detected, the processing circuit 210 causes the imaging device 100 to image a relatively narrow region including the first foreign matter region for each band belonging to the second band group, thereby generating image data (step S250).
- the processing circuitry 210 acquires image data for the second band group generated for the relatively narrow region including the first foreign matter region (step S260).
- the processing circuit 210 performs processing for detecting a second foreign matter region from the acquired image data according to the discrimination model based on the second condition (step S270).
- the processing circuitry 210 determines whether or not there is a second foreign matter region that satisfies the second condition (step S280). If the second foreign matter area is not detected, the process ends. If the second foreign object area is detected, processing circuit 210 causes storage device 220 to store information about the second foreign object area (step S290). Processing circuit 210 then outputs a control signal to output device 300 and actuator 400 (step S300).
- the output device 300 receives a control signal from the processing circuit 210 and causes the display 310 or the like to output a warning display.
- Actuator 400 receives control signals from processing circuitry 210 to control conveyor 410, picking device 420, and the like.
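The sequence of steps S200 through S300 described above can be sketched as the following control loop. The helpers capture_bands, find_regions, and notify are hypothetical stand-ins for the imaging device, the discrimination model, and the output/actuator devices; they are not names used in this disclosure.

```python
def inspect(capture_bands, find_regions, notify,
            first_band_group, second_band_group):
    # S200-S210: image each band of the small first band group.
    first_images = capture_bands(first_band_group, region=None)
    # S230-S240: look for a first foreign matter region under condition 1.
    region1 = find_regions(first_images, condition=1)
    if region1 is None:
        return None            # no candidate: inspection ends quickly
    # S250-S260: image only the narrow candidate region, for the larger
    # second band group.
    second_images = capture_bands(second_band_group, region=region1)
    # S270-S280: confirm under the stricter second condition.
    region2 = find_regions(second_images, condition=2)
    if region2 is not None:
        notify(region2)        # S290-S300: store result, warn, drive actuator
    return region2
```

Because the expensive second capture and detection run only when a candidate is found in S240, the common no-foreign-matter case exits after the first, cheap pass.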
- imaging is performed for each band by switching the wavelength of the illumination light and performing imaging a plurality of times, but imaging for each band may be performed by another method.
- the imaging device 100 may perform imaging a plurality of times while switching a plurality of filters having different transmission wavelength ranges.
- the imaging device 100 may be a line scan hyperspectral camera with a prism or diffraction grating.
- In this embodiment, imaging is performed multiple times, once for each band. If the number of bands to be imaged is large, the inspection takes a long time.
- In this embodiment, however, imaging is first performed for the relatively small number of bands in the first band group, and imaging for the larger number of bands in the second band group is performed only when a foreign object is detected. Therefore, in the majority of cases, where no foreign matter is detected, the inspection is completed without spending much time on imaging. Detailed inspection over the second band group is performed only when a foreign object is detected, so the time required for the entire inspection process can be greatly reduced.
- the technology of the present disclosure is useful, for example, for cameras and measurement equipment that acquire multi-wavelength images.
- the technology of the present disclosure can be used, for example, for detecting foreign matter mixed in articles such as industrial products or foods.
- 100 imaging device, 110 filter array, 120 image, 140 optical system, 160 image sensor, 200 processing device, 210 processing circuit, 220 storage device, 300 output device, 310 display, 320 speaker, 330 lamp, 400 actuator device, 410 conveyor, 420 picking device
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geophysics (AREA)
- General Life Sciences & Earth Sciences (AREA)
- Quality & Reliability (AREA)
- Health & Medical Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Mathematical Physics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
Abstract
Description
First, a configuration example of the hyperspectral imaging system used in the first embodiment of the present disclosure will be described. An inspection system using the hyperspectral imaging system will then be described.
FIG. 1A is a diagram schematically showing a configuration example of a hyperspectral imaging system. This system includes an imaging device 100 and a processing device 200. The imaging device 100 has a configuration similar to that of the imaging device disclosed in Patent Document 2. The imaging device 100 includes an optical system 140, a filter array 110, and an image sensor 160. The optical system 140 and the filter array 110 are arranged on the optical path of light incident from an object 70, which is the subject. The filter array 110 is arranged between the optical system 140 and the image sensor 160.
FIG. 5 is a diagram schematically showing a configuration example of the inspection system in this embodiment. This inspection system includes an imaging device 100, a processing device 200, an output device 300, and an actuator device 400. The output device 300 may include devices such as a display, a speaker, and a lamp. The actuator device 400 may include devices such as a belt conveyor and a picking device.
The discrimination condition may be that:
(pixel value of band 2 for metal 1) / (pixel value of band 9 for metal 1) = (pixel value of band 2 for metal 2) / (pixel value of band 9 for metal 2),
(pixel value of band 4 for metal 1) / (pixel value of band 9 for metal 1) > (pixel value of band 4 for metal 2) / (pixel value of band 9 for metal 2),
(pixel value of band 5 for metal 1) / (pixel value of band 9 for metal 1) > (pixel value of band 5 for metal 2) / (pixel value of band 9 for metal 2),
(pixel value of band 6 for metal 1) / (pixel value of band 9 for metal 1) > (pixel value of band 6 for metal 2) / (pixel value of band 9 for metal 2), and
(pixel value of band 7 for metal 1) / (pixel value of band 9 for metal 1) > (pixel value of band 7 for metal 2) / (pixel value of band 9 for metal 2).
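A check of band-ratio discrimination conditions of this kind can be sketched as follows. The dict representation, the helper name, and the equality tolerance are illustrative assumptions, not part of the disclosure; the band numbers follow the conditions above.

```python
# Decide whether a pixel's spectrum matches metal 1 rather than metal 2 by
# normalizing each band's pixel value by band 9 and comparing the ratios.
def looks_like_metal1(px, ref_metal2, tol=1e-6):
    """px, ref_metal2: dicts mapping band number -> pixel value."""
    r = lambda v, b: v[b] / v[9]          # ratio of band b to band 9
    # Band-2 ratios must match (within a tolerance, since measured values
    # are never exactly equal); band 4-7 ratios must be strictly greater.
    equal_b2 = abs(r(px, 2) - r(ref_metal2, 2)) <= tol
    greater = all(r(px, b) > r(ref_metal2, b) for b in (4, 5, 6, 7))
    return equal_b2 and greater
```

Normalizing by band 9 makes the comparison insensitive to overall brightness differences between the two measurements, which is the usual reason for expressing such conditions as ratios rather than raw pixel values.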
This may be expressed as
g = (P(g11)…P(g1n)…P(gm1)…P(gmn))T.
This may be expressed as
fk = (P(fk11)…P(fk1n)…P(fkm1)…P(fkmn))T.
(1) Of the image data ft = (P(ft11)…P(ft1n)…P(ftm1)…P(ftmn))T of image 250Wt, P(ftrs), P(ft(r+1)s), P(ftr(s+1)), and P(ft(r+1)(s+1)) are calculated, and among P(ft11), …, P(ftmn), the values other than these four are not calculated.
Next, an inspection system according to Embodiment 2 will be described. FIG. 14 is a block diagram showing a configuration example of the inspection system in this embodiment.
Claims (13)
- A method for detecting a specific foreign object included in an inspection target, the method comprising:
acquiring first image data of the inspection target, in which each pixel has pixel values for a first band group including one or more wavelength bands;
determining, from the first image data, one or more pixel regions satisfying a first condition as a first foreign matter region;
acquiring second image data of a region including the first foreign matter region, in which each pixel has pixel values for a second band group including more wavelength bands than the first band group;
determining, from the second image data, one or more pixel regions satisfying a second condition different from the first condition as a second foreign matter region in which the specific foreign object is present; and
outputting information about the second foreign matter region. - The method according to claim 1, wherein the first condition is that the pixel region is composed of a plurality of contiguous pixels whose pixel values for the first band group satisfy a specific condition, and that the size of the pixel region exceeds a predetermined size.
- The method according to claim 1 or 2, wherein the second condition is that the pixel region is classified into one of the classes of a predetermined classification list based on the pixel values for the second band group.
- Acquiring the first image data includes acquiring compressed image data in which image information for each of a plurality of wavelength bands including the second band group is compressed into a single two-dimensional image, and generating the first image data from the compressed image data; and
acquiring the second image data includes extracting the region including the first foreign matter region from the compressed image data, and generating the second image data based on the data of the extracted region.
The method according to any one of claims 1 to 3. - Generating the first image data includes restoring the first image data from the compressed image data using a first reconstruction table corresponding to the first band group; and
generating the second image data includes restoring the second image data from the data of the extracted region using a second reconstruction table corresponding to the second band group.
The method according to claim 4. - The compressed image is generated by an imaging device including a filter array and an image sensor;
the filter array includes a plurality of types of filters having mutually different transmission spectra; and
the first reconstruction table and the second reconstruction table are generated based on the distribution of the transmission spectra in the filter array.
The method according to claim 5. - The first image data is acquired by a first imaging operation by an imaging device, and
the second image data is acquired by a second imaging operation by the imaging device.
The method according to any one of claims 1 to 3. - The method according to any one of claims 1 to 7, further comprising outputting a warning to an output device when the second foreign matter region is detected.
- The method according to any one of claims 1 to 8, further comprising storing the position of the first foreign matter region and the position of the second foreign matter region in a storage device.
- An apparatus for detecting a foreign object included in an inspection target, the apparatus comprising:
a processor; and
a storage medium storing a computer program,
wherein the processor, by executing the computer program, performs:
acquiring first image data of the inspection target, in which each pixel has pixel values for a first band group including one or more wavelength bands;
determining, from the first image data, one or more pixel regions satisfying a first condition as a first foreign matter region;
acquiring second image data of a region including the first foreign matter region, in which each pixel has pixel values for a second band group including more wavelength bands than the first band group;
determining, from the second image data, one or more pixel regions satisfying a second condition different from the first condition as a second foreign matter region in which the specific foreign object is present; and
outputting information about the second foreign matter region. - A computer program for detecting a foreign object included in an inspection target, the program causing a computer to execute:
acquiring first image data of the inspection target, in which each pixel has pixel values for a first band group including one or more wavelength bands;
determining, from the first image data, one or more pixel regions satisfying a first condition as a first foreign matter region;
acquiring second image data of a region including the first foreign matter region, in which each pixel has pixel values for a second band group including more wavelength bands than the first band group;
determining, from the second image data, one or more pixel regions satisfying a second condition different from the first condition as a second foreign matter region in which the specific foreign object is present; and
outputting information about the second foreign matter region. - A method for detecting a foreign object included in an inspection target, the method comprising:
acquiring one or more first images, the one or more first images corresponding to one or more first wavelength bands;
determining, based on the one or more first images, a region satisfying a first condition as a first foreign matter region;
acquiring a plurality of second images, each of the plurality of second images including a region corresponding to the first foreign matter region, the plurality of second images corresponding to a plurality of second wavelength bands, the number of the plurality of second wavelength bands being greater than the number of the one or more first wavelength bands;
determining, based on the plurality of second images, one or more regions satisfying a second condition different from the first condition as a second foreign matter region in which the foreign object is present; and
outputting information about the second foreign matter region.
- The method according to claim 12, wherein:
a plurality of first pixel values of a plurality of first pixels included in each of the one or more first images are calculated based on a third image obtained by imaging the inspection target with a camera, the plurality of first pixels being included in regions corresponding to the first foreign matter region and to regions other than the first foreign matter region;
a plurality of pixel values of a plurality of pixels that are included in each of the plurality of second images and in the region corresponding to the first foreign matter region are calculated based on the third image; and
a plurality of pixel values of a plurality of pixels that are included in each of the plurality of second images and outside the region corresponding to the first foreign matter region are not calculated.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280009183.6A CN116710958A (zh) | 2021-01-26 | 2022-01-18 | 对检查对象中包含的异物进行检测的方法及装置 |
JP2022578255A JPWO2022163421A1 (ja) | 2021-01-26 | 2022-01-18 | |
US18/346,824 US20230419478A1 (en) | 2021-01-26 | 2023-07-04 | Method and apparatus for detecting foreign object included in inspection target |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-010386 | 2021-01-26 | ||
JP2021010386 | 2021-01-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/346,824 Continuation US20230419478A1 (en) | 2021-01-26 | 2023-07-04 | Method and apparatus for detecting foreign object included in inspection target |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022163421A1 true WO2022163421A1 (ja) | 2022-08-04 |
Family
ID=82653324
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/001492 WO2022163421A1 (ja) | 2021-01-26 | 2022-01-18 | 検査対象に含まれる異物を検出する方法および装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230419478A1 (ja) |
JP (1) | JPWO2022163421A1 (ja) |
CN (1) | CN116710958A (ja) |
WO (1) | WO2022163421A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230030308A1 (en) * | 2021-07-28 | 2023-02-02 | Panasonic Intellectual Property Management Co., Ltd. | Inspection method and inspection apparatus |
JP7375963B1 (ja) | 2023-01-06 | 2023-11-08 | 株式会社サタケ | トレーサビリティシステム |
WO2024053302A1 (ja) * | 2022-09-08 | 2024-03-14 | パナソニックIpマネジメント株式会社 | 情報処理方法および撮像システム |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013164338A (ja) * | 2012-02-10 | 2013-08-22 | Sumitomo Electric Ind Ltd | 植物または植物加工品の異物検出方法 |
JP2016105084A (ja) * | 2014-11-21 | 2016-06-09 | 和歌山県 | 食品検査装置 |
JP2016156801A (ja) * | 2014-11-19 | 2016-09-01 | パナソニックIpマネジメント株式会社 | 撮像装置および分光システム |
WO2020023213A1 (en) * | 2018-07-27 | 2020-01-30 | Ventana Medical Systems, Inc. | Systems for automated in situ hybridization analysis |
JP2020524328A (ja) * | 2017-06-19 | 2020-08-13 | インパクトビジョン インコーポレイテッド | 異物を識別するためのハイパースペクトル画像処理用のシステム及び方法 |
WO2021192891A1 (ja) * | 2020-03-26 | 2021-09-30 | パナソニックIpマネジメント株式会社 | 信号処理方法、信号処理装置、および撮像システム |
WO2021246192A1 (ja) * | 2020-06-05 | 2021-12-09 | パナソニックIpマネジメント株式会社 | 信号処理方法、信号処理装置、および撮像システム |
-
2022
- 2022-01-18 JP JP2022578255A patent/JPWO2022163421A1/ja active Pending
- 2022-01-18 WO PCT/JP2022/001492 patent/WO2022163421A1/ja active Application Filing
- 2022-01-18 CN CN202280009183.6A patent/CN116710958A/zh active Pending
-
2023
- 2023-07-04 US US18/346,824 patent/US20230419478A1/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013164338A (ja) * | 2012-02-10 | 2013-08-22 | Sumitomo Electric Ind Ltd | 植物または植物加工品の異物検出方法 |
JP2016156801A (ja) * | 2014-11-19 | 2016-09-01 | パナソニックIpマネジメント株式会社 | 撮像装置および分光システム |
JP2016105084A (ja) * | 2014-11-21 | 2016-06-09 | 和歌山県 | 食品検査装置 |
JP2020524328A (ja) * | 2017-06-19 | 2020-08-13 | インパクトビジョン インコーポレイテッド | 異物を識別するためのハイパースペクトル画像処理用のシステム及び方法 |
WO2020023213A1 (en) * | 2018-07-27 | 2020-01-30 | Ventana Medical Systems, Inc. | Systems for automated in situ hybridization analysis |
WO2021192891A1 (ja) * | 2020-03-26 | 2021-09-30 | パナソニックIpマネジメント株式会社 | 信号処理方法、信号処理装置、および撮像システム |
WO2021246192A1 (ja) * | 2020-06-05 | 2021-12-09 | パナソニックIpマネジメント株式会社 | 信号処理方法、信号処理装置、および撮像システム |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230030308A1 (en) * | 2021-07-28 | 2023-02-02 | Panasonic Intellectual Property Management Co., Ltd. | Inspection method and inspection apparatus |
US11991457B2 (en) * | 2021-07-28 | 2024-05-21 | Panasonic Intellectual Property Management Co., Ltd. | Inspection method and inspection apparatus |
WO2024053302A1 (ja) * | 2022-09-08 | 2024-03-14 | パナソニックIpマネジメント株式会社 | 情報処理方法および撮像システム |
JP7375963B1 (ja) | 2023-01-06 | 2023-11-08 | 株式会社サタケ | トレーサビリティシステム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022163421A1 (ja) | 2022-08-04 |
CN116710958A (zh) | 2023-09-05 |
US20230419478A1 (en) | 2023-12-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022163421A1 (ja) | 検査対象に含まれる異物を検出する方法および装置 | |
JP6952277B2 (ja) | 撮像装置および分光システム | |
Wu et al. | Advanced applications of hyperspectral imaging technology for food quality and safety analysis and assessment: A review—Part I: Fundamentals | |
Mollazade et al. | Analysis of texture-based features for predicting mechanical properties of horticultural products by laser light backscattering imaging | |
JP6945195B2 (ja) | 光学フィルタ、光検出装置、および光検出システム | |
WO2021192891A1 (ja) | 信号処理方法、信号処理装置、および撮像システム | |
US20230079297A1 (en) | Signal processing method, signal processing device, and imaging system | |
JP2016130727A (ja) | 撮像装置 | |
WO2022230640A1 (ja) | 画像処理装置、撮像システム、および復元画像の誤差を推定する方法 | |
JP7457952B2 (ja) | 光検出装置、光検出システム、およびフィルタアレイ | |
WO2021085014A1 (ja) | フィルタアレイおよび光検出システム | |
US11843876B2 (en) | Optical filter array, photodetection device, and photodetection system | |
Yoon et al. | Hyperspectral image processing methods | |
WO2021241171A1 (ja) | フィルタアレイおよび光検出システム | |
CN116806304A (zh) | 数据处理装置、方法及程序以及光学元件、摄影光学系统及摄影装置 | |
WO2023282069A1 (ja) | 信号処理装置および信号処理方法 | |
WO2023106142A1 (ja) | 信号処理方法、プログラム、およびシステム | |
WO2023106143A1 (ja) | 分光画像を生成するシステムに用いられる装置およびフィルタアレイ、分光画像を生成するシステム、ならびにフィルタアレイの製造方法 | |
WO2024053302A1 (ja) | 情報処理方法および撮像システム | |
JP7122636B2 (ja) | フィルタアレイおよび光検出システム | |
WO2024043009A1 (ja) | 信号処理方法および信号処理装置 | |
JP2024020922A (ja) | 復元画像の評価方法および撮像システム、 | |
WO2022270355A1 (ja) | 撮像システム、撮像システムに用いられる方法、および撮像システムに用いられるコンピュータプログラム | |
WO2023286613A1 (ja) | フィルタアレイ、光検出装置、および光検出システム | |
WO2023100660A1 (ja) | 撮像システムおよび撮像方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22745644 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022578255 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280009183.6 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22745644 Country of ref document: EP Kind code of ref document: A1 |