WO2023210619A1 - Imaging device, inspection device, imaging condition determination method, and imaging method - Google Patents


Info

Publication number
WO2023210619A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
light source
monochromatic light
imaging
Prior art date
Application number
PCT/JP2023/016219
Other languages
English (en)
Japanese (ja)
Inventor
透 渡邊
勝蘭 李
雄 吉田
邦光 豊島
Original Assignee
株式会社エヌテック
株式会社ヤクルト本社
東邦商事株式会社
Priority date
Filing date
Publication date
Application filed by 株式会社エヌテック, 株式会社ヤクルト本社, 東邦商事株式会社
Publication of WO2023210619A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10: Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11: Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13: Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements

Definitions

  • the present invention relates to an imaging device, an inspection device, an imaging condition determination method, and an imaging method that output images in which different types of similarly colored objects appearing in an image of a subject are easy to identify.
  • Patent Documents 1 to 3 disclose inspection devices that include an imaging device that images, for example, a colony as an imaging target.
  • the number of colonies of microorganisms cultured in a petri dish, down to minute colonies of various shapes, can be counted accurately and in a short time using a simple device configuration with a black-and-white CCD camera, without requiring complicated pretreatment of the culture medium.
  • Patent Documents 2 and 3 disclose colony detection systems that can accurately identify two or more types of colonies having different color features. These colony detection systems classify colony pixels into predetermined types based on the color features of the colony pixels.
  • An imaging device that solves the above problems is an imaging device that images a subject and outputs an image in which regions made of two different materials included in the subject can be identified. It includes a monochromatic light source that illuminates the subject with monochromatic light, an imaging unit having RGB pixels that captures the subject, and an image processing unit that performs predetermined processing on the output of the RGB pixels.
  • the RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region. The first wavelength region and the second wavelength region overlap in a partial overlapping wavelength region, and the monochromatic light has an emission wavelength range that includes at least part of the first wavelength region and at least part of the overlapping wavelength region.
  • the image processing unit includes a calculation processing unit that generates a three-band XYZ image, in which the RGB imaging characteristics are corrected, by performing linear matrix calculation processing on the three-band RGB image, and an amplifying section that amplifies, among the XYZ values constituting the XYZ image, each value corresponding to the first pixel and the second pixel by a gain set to a value that allows the output levels of the first pixel and the second pixel to be balanced between the two colors under the condition that monochromatic light is irradiated from the monochromatic light source.
  • the imaging device can output an image in which regions made of two different materials included in the subject can be identified by color or density. For example, when an image is used for inspection, two different types of materials included in the image can be identified or classified with relatively high accuracy.
  • density is a concept equivalent to pixel level, signal level, pixel value, etc.
  • the image processing unit may generate the three-band XYZ image by performing black balance processing on a reference portion of the subject other than the identification target. According to this configuration, since black balance processing is performed, even if the environment when photographing a subject changes, it is possible to output an image in which the material can be identified with appropriate color or density.
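As a concrete illustration of the black balance processing mentioned above, the sketch below subtracts the mean level of a black reference region (for example, the black background sheet) from each RGB channel. This is a minimal assumption-based sketch, not the patent's implementation; the function name and the clipping behavior are illustrative.

```python
import numpy as np

def black_balance(rgb_image, black_region_mask):
    """Subtract the per-channel mean level of a black reference region
    (a reference portion of the subject other than the identification
    target) from an RGB image, clipping negative results at zero.

    rgb_image: float array of shape (H, W, 3)
    black_region_mask: boolean array of shape (H, W)
    """
    # Per-channel mean over the black reference pixels, shape (3,)
    black_level = rgb_image[black_region_mask].mean(axis=0)
    return np.clip(rgb_image - black_level, 0.0, None)
```

Because the reference level is re-measured in each captured frame, a change in the shooting environment shifts the subtracted offset accordingly, which is the point of the black balance step described above.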
  • the monochromatic light source is a blue light source or a red light source
  • the first pixel is a B pixel
  • the second pixel is a G pixel
  • the first pixel may be an R pixel
  • the second pixel may be a G pixel
  • the G pixel also has sensitivity, in at least part of the overlapping wavelength region, to blue light, which is the light source color of the blue light source. Therefore, when capturing an image under blue illumination, the sensitive wavelength range of the G image can be narrowed compared to that of a general-purpose camera. Likewise, the G pixel has sensitivity, in at least part of the overlapping wavelength region, to red light, which is the light source color of the red light source, so when capturing an image under red illumination the sensitive wavelength range of the G image can also be narrowed compared to that of a general-purpose camera. By performing the linear matrix calculation on the RGB image, the narrowed sensitive wavelength region of the G image can be narrowed further or shifted in wavelength. Therefore, an image in which the materials can be identified with appropriate color or density can be output.
  • the monochromatic light source is a first monochromatic light source
  • the imaging device may further include a second monochromatic light source capable of emitting second monochromatic light having an emission wavelength range different from that of the first monochromatic light emitted by the first monochromatic light source
  • the RGB pixels include a third pixel in addition to the first pixel and the second pixel
  • the control unit simultaneously performs illumination with the first monochromatic light source and the second monochromatic light source
  • the gain setting section may set the gain given to the output level of the third pixel according to the gain given to the output level of the second pixel
  • An inspection apparatus for solving the above problem is an inspection apparatus equipped with the above imaging device, wherein the subject includes two different types of colonies as the two different materials. It includes an identification unit that identifies colony areas, which are the areas of the two different types of colonies, based on the three-band XYZ image input from the imaging device, the XYZ image including an X image, a Y image, and a Z image.
  • the identification unit defines the area whose pixel level in the X image is within a threshold setting range as a first colony area, the area whose pixel level in the Y image is within a threshold setting range as a second colony area, and the area whose pixel level in the Z image is within a threshold setting range as a third colony area. The colony area remaining after excluding the second colony area from the first colony area or the third colony area is a fourth colony area.
  • the inspection apparatus further includes a classification unit that classifies at least one of the first colony area, the third colony area, the second colony area, and the fourth colony area based on features and shapes, a counting unit that counts the number of each of the colony areas classified by the classification unit, and a specifying unit that specifies the numbers of the two types of colony regions to be identified based on the counting results of the counting unit.
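The identification and counting flow above can be sketched as follows. This is a minimal illustration under stated assumptions: the function names, threshold ranges, and the simple 4-connected region counting are hypothetical stand-ins for the identification unit and counting unit, not the patent's implementation.

```python
import numpy as np

def count_regions(mask):
    """Count 4-connected regions in a boolean mask (a stand-in for the
    counting section's connected-component counting)."""
    mask = mask.copy()
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j]:
                count += 1
                stack = [(i, j)]
                while stack:  # flood fill to clear this region
                    a, b = stack.pop()
                    if 0 <= a < h and 0 <= b < w and mask[a, b]:
                        mask[a, b] = False
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    return count

def identify_colony_areas(x_img, y_img, thr_x, thr_y):
    """First colony area: X-image pixel levels within a threshold range
    (all colonies). Second colony area: Y-image pixel levels within a
    threshold range (one colony type). Fourth colony area: the first
    area with the second area excluded (the other colony type)."""
    first = (x_img >= thr_x[0]) & (x_img <= thr_x[1])
    second = (y_img >= thr_y[0]) & (y_img <= thr_y[1])
    fourth = first & ~second
    return count_regions(second), count_regions(fourth)
```

The mask subtraction `first & ~second` is the set-difference step by which the fourth colony area is derived in the claim text.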
  • a method for determining imaging conditions that solves the above problem is a method for determining the imaging conditions of an imaging device that outputs an image in which regions made of two different materials included in a subject can be identified. The imaging device includes a monochromatic light source that illuminates the subject with monochromatic light and an imaging section having RGB pixels that images the subject, and corrects the RGB imaging characteristics by performing linear matrix calculation processing on the three-band RGB image output from the imaging section.
  • the emission wavelength of the monochromatic light source is changed so that the peak wavelengths of the spectral output characteristics of a first pixel and of a second pixel different from the first pixel change, and imaging is performed with the monochromatic light source after the emission wavelength change.
  • according to this method for determining imaging conditions, it is possible to determine appropriate imaging conditions that enable the imaging device to output an image in which regions made of two different materials included in the subject can be distinguished by color or density.
  • An imaging method that solves the above problems is an imaging method that images a subject and outputs an image in which regions made of two different materials included in the subject can be identified. The imaging unit that captures the subject includes an imaging element having RGB pixels, and a monochromatic light source illuminates the subject with monochromatic light. The RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region.
  • the first wavelength region and the second wavelength region overlap in a partial overlapping wavelength region, and the monochromatic light has an emission wavelength range including at least part of the first wavelength region and at least part of the overlapping wavelength region. The method includes an imaging step in which the imaging unit images the subject illuminated by the monochromatic light source, and a processing step performed on the three-band RGB image output from the imaging unit that has imaged the subject.
  • according to this imaging method, an image in which regions made of two different materials included in the subject can be identified by color or density can be output. For example, by using an image output by this method, two different types of materials included in the image can be identified or classified with relatively high accuracy.
  • FIG. 3 is a schematic plan view showing the subject.
  • A schematic diagram showing the schematic configuration of the camera, together with a graph showing the relative sensitivity of the camera for each RGB color.
  • A partial schematic diagram showing the detailed structure of the monochromatic light source.
  • FIG. 2 is a block diagram showing the functional configuration of the inspection device.
  • A graph illustrating a case where it is difficult to identify different types of imaging targets.
  • A graph illustrating the principle of making it easier to identify different types of imaging targets.
  • A graph illustrating a case where it is difficult to identify different types of imaging targets in images captured using a general-purpose camera.
  • A flowchart showing an initial setting routine.
  • A flowchart showing an imaging processing routine.
  • A flowchart showing an inspection processing routine.
  • A flowchart showing an imaging condition determination processing routine.
  • A graph showing the relationship between the wavelength of the light source and the output (light intensity) in a modified example.
  • A graph showing the relationship between wavelength and relative output after linear matrix calculation processing when illuminating with green and red monochromatic light in a modified example.
  • A graph explaining the principle by which different types of imaging targets are identified in the output image of an imaging device using a general-purpose camera in a modified example.
  • This embodiment is an example in which an imaging device 11 that captures an image of a subject 12 is included in an inspection apparatus 10 that inspects the subject 12. Note that the imaging device 11 may be included in a device other than the inspection device 10, or may be used alone.
  • the inspection device 10 includes an imaging device 11 that images the subject 12, and an inspection processing unit 60 that inspects the subject 12 using images input from the imaging device 11.
  • the inspection device 10 may include a display section 70 that displays the inspection results of the inspection processing section 60 and the like.
  • the display unit 70 may display an image output from the imaging device 11, an image obtained by processing the image, or the like.
  • the display unit 70 may be, for example, a monitor connected to a personal computer, or a display provided on an operation panel included in the inspection device 10.
  • the imaging device 11 includes a monochromatic light source 20 that illuminates the subject 12 with monochromatic light, an imaging unit 30 that captures an image of the subject 12, a control unit 40 that controls the monochromatic light source 20 and the imaging unit 30, and an image processing unit 50 that performs predetermined processing on the image signal that the imaging unit 30 outputs as the imaging result.
  • the imaging device 11 also includes a power source 45 that supplies power to the monochromatic light source 20 .
  • the imaging unit 30 is, for example, a color camera 31.
  • the color camera 31 (hereinafter also simply referred to as "camera 31") is electrically connected to the processing section 18.
  • the processing section 18 includes the aforementioned control section 40 and image processing section 50.
  • the processing unit 18 may include at least one of an electronic circuit and a computer.
  • the monochromatic light source 20 may be one type of monochromatic light source that emits one type of monochromatic light, or may include two types of monochromatic light sources that emit two types of monochromatic light; it may also include three types of monochromatic light sources that emit three types of monochromatic light. When a plurality of types of monochromatic light sources are provided, the control unit 40 may be configured to cause one or two types of monochromatic light sources to emit light.
  • one type of monochromatic light source refers to any one of a blue light source, a red light source, and a green light source.
  • the monochromatic light source is composed of, for example, an LED.
  • the monochromatic light source may be, for example, any one of a blue LED, a red LED, and a green LED.
  • the monochromatic light source 20 of this example includes at least a blue LED and a red LED. Therefore, in the example shown in FIG. 1, the power supply 45 includes at least a blue LED power supply 46 and a red LED power supply 47.
  • the control unit 40 includes an imaging control unit 41 that controls the imaging unit 30 and an illumination control unit 42 that controls the monochromatic light source 20.
  • the illumination control unit 42 controls turning on and off of the monochromatic light source 20 via a power source 45.
  • when the monochromatic light source 20 includes a plurality of types of monochromatic light sources (LEDs), the lighting control unit 42 individually controls turning each monochromatic light source on and off via the power source 45. The subject 12 is thus illuminated with monochromatic light by the monochromatic light source 20.
  • the monochromatic light source 20 may illuminate the subject 12 with two types of monochromatic light.
  • the inspection device 10 may include a robot that transports objects to be transported, such as a petri dish 13 as the subject 12 and a reference sheet for color gain adjustment, to a predetermined position.
  • the control unit 40 may control the robot, or a control unit different from the control unit 40 may control the robot.
  • the image processing section 50 performs predetermined image processing on the three-band RGB first imaging signal S1 output from the camera 31 constituting the imaging section 30, and generates a three-band XYZ second imaging signal S2 (see FIG. 5 for both).
  • RGB in the first image signal S1 refers to three colors determined from each wavelength range that can be received by the R pixel, G pixel, and B pixel that constitute the image sensor 33 of the camera 31.
  • the XYZ referred to in the second imaging signal S2 refers to three colors determined from the wavelength regions of the XYZ image obtained by converting the RGB image forming the first imaging signal S1 into an XYZ image whose wavelength regions differ from those of RGB.
  • the inspection processing unit 60 inspects the subject 12 using the XYZ image based on the second imaging signal S2 input from the image processing unit 50.
  • the subject 12 includes colonies WC and YC cultured in a Petri dish 13. That is, the subject 12 includes a petri dish 13, a medium 14 in the petri dish 13, and colonies WC and YC cultured in the medium 14. Note that colonies WC and YC do not occur at the beginning of the culture period or when there are no bacteria, so colonies WC and YC may not exist depending on the subject 12.
  • two or more different types of colonies WC and YC exist.
  • the different types of colonies WC and YC are two types having similar colors.
  • foreign matter FS as shown in FIG. 2 may exist in the culture medium 14.
  • the foreign substances FS include dietary fiber, food particles, seeds, skin, fruit pulp, etc. contained in the food or drinking water.
  • Foreign matter FS may be mistakenly identified as colonies WC and YC. Therefore, the inspection processing unit 60 distinguishes the colonies WC, YC from the foreign matter FS based on their features.
  • colonies WC and YC are distinguished from foreign matter FS based on feature factors such as shape, area, and color.
  • Colonies WC and YC generally have a circular or oval shape, and the size is within a predetermined range.
  • the foreign material FS is a fiber, its shape is elongated.
  • if the foreign substance FS is fruit pulp, seeds, skin, or the like, its shape differs from that of the colonies WC and YC.
  • the inspection processing unit 60 distinguishes colony candidates from colonies WC, YC and others based on a threshold value of area (size).
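The area and shape screening described above can be sketched as follows. The threshold values, the circularity measure, and the function name are illustrative assumptions, not values from the patent: colonies are roughly circular or oval and fall within a predetermined size range, while fibrous foreign matter is elongated and scores low on circularity.

```python
import math

def is_colony_candidate(area, perimeter, area_range=(5.0, 500.0),
                        min_circularity=0.6):
    """Screen a detected region using feature factors like those named
    above (shape and area). Circularity = 4*pi*area / perimeter**2,
    which is 1.0 for a perfect circle and near 0 for a long thin fiber.
    All thresholds here are illustrative placeholders.
    """
    if not (area_range[0] <= area <= area_range[1]):
        return False  # too small or too large to be a colony
    circularity = 4.0 * math.pi * area / (perimeter ** 2)
    return circularity >= min_circularity
```

An elongated fiber with the same area as a colony has a much larger perimeter, so the circularity test rejects it even when the area test passes.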
  • the petri dish 13 is placed on a background sheet 15 (background board).
  • the background sheet 15 serves as a background for the petri dish 13, which is the subject 12.
  • the background sheet 15 may be, for example, a black diffusion plate. By making the background sheet 15 black, it may be used as a black reference for black balance.
  • the petri dish 13 is made of a colorless and transparent material (for example, glass).
  • the culture medium 14 contained in the petri dish 13 is colored as necessary. In that case, the culture medium 14 is colored in a predetermined color and may be translucent, so in the image captured by the camera 31 the black of the background sheet 15 may show through the culture medium 14.
  • the culture medium 14 may be colored depending on the color of the food or drinking water.
  • the black background sheet 15 may be a black plate such as a black sheet that does not have a diffusion function of diffusing light. Further, when using the background sheet 15 as a reference for black balance, it is not limited to a black (achromatic color) sheet, but a chromatic color sheet may be used.
  • the two types of colonies schematically shown in FIGS. 1 and 2 are white colony WC and yellow colony YC.
  • the white colony WC and the yellow colony YC are different types of colonies. Therefore, for inspection, it is necessary to distinguish between white colony WC and yellow colony YC.
  • since the white of the white colony WC and the yellow of the yellow colony YC are similar colors, it is difficult to distinguish them in the RGB image of the subject 12 captured by the camera 31. Furthermore, in the R image, G image, and B image that constitute the RGB image, the white colony WC and the yellow colony YC have similar densities, so they are difficult to distinguish.
  • the image processing unit 50 performs predetermined image processing on the first imaging signal S1 from the camera 31, thereby generating an XYZ image that makes it easy to distinguish between the white colony WC and the yellow colony YC. Details of the XYZ image will be described later.
  • the camera 31 is a general-purpose color camera that captures RGB images.
  • the camera 31 includes a lens 32 assembled to a lens barrel 31a, a near-infrared light cut filter 35 (hereinafter also referred to as IR cut filter 35) that blocks near-infrared light, and an image sensor 33.
  • the image sensor 33 includes an R pixel 33R, a G pixel 33G, and a B pixel 33B, each of which is a light receiving element.
  • the R pixel 33R receives red light (R light) that has passed through the R filter 34R, and outputs an R pixel signal according to the amount of received light.
  • the G pixel 33G receives green light (G light) that has passed through the G filter 34G, and outputs a G pixel signal according to the amount of received light.
  • the B pixel 33B receives blue light (B light) that has passed through the B filter 34B, and outputs a B pixel signal according to the amount of received light.
  • an R pixel 33R, a G pixel 33G, and a B pixel 33B are arranged in a predetermined arrangement.
  • This image sensor 33 has RGB imaging characteristics in which near-infrared light is cut.
  • the R pixel 33R, the G pixel 33G, and the B pixel 33B are sensitive to light in respective wavelength bands shown in the graph in FIG.
  • the horizontal axis is wavelength and the vertical axis is relative sensitivity.
  • the R pixel 33R has high sensitivity to light in the red (R) wavelength band shown in the graph in FIG.
  • the G pixel 33G has high sensitivity to light in the green (G) wavelength band shown in the graph.
  • the B pixel 33B has high sensitivity to light in the blue (B) wavelength band shown in the graph.
  • the monochromatic light source 20 includes monochromatic light sources of three colors (three types): a blue light source 21, a green light source 22, and a red light source 23.
  • the blue light source 21 emits blue light (B light) as monochromatic light.
  • the green light source 22 emits green light (G light) as monochromatic light.
  • the red light source 23 emits red light (R light) as monochromatic light.
  • the illumination control unit 42 of this embodiment illuminates the subject 12 with blue light (B light) by causing the blue light source 21 to emit light as the monochromatic light source. Note that when the emission color (light source color) need not be distinguished, the blue light source 21, green light source 22, and red light source 23 that constitute the monochromatic light source 20 may be referred to as monochromatic light sources 21 to 23.
  • the monochromatic light source 20 may have the function of a white light source. That is, the monochromatic light source 20 also has a white light source function of emitting white light by emitting three types of monochromatic light sources 21 to 23 at the same time.
  • by causing any one or two types (colors) of the three monochromatic light sources 21 to 23, which can emit monochromatic light of the three RGB colors, to emit light, the control unit 40 can also make the white light source function as a monochromatic light source.
  • a white light source having the function of such a monochromatic light source is also included in the monochromatic light source 20.
  • the first imaging in which the subject 12 is imaged by illuminating with one type or two types of monochromatic light, and the second imaging in which the subject 12 is imaged by illuminating with white light may be performed in a time-sharing manner.
  • images captured using white light may also be used for the inspection.
  • the control unit 40 determines which monochromatic light source emits light based on the spectral reflectance characteristics of the colony to be inspected.
  • one type of two types of monochromatic light sources that are emitted simultaneously when photographing the subject 12 is assumed to be a first monochromatic light source 25 which is a main monochromatic light source, and the other type is assumed to be a second monochromatic light source 26 which is an auxiliary light source.
  • the blue light source 21 is used as a first monochromatic light source 25
  • the red light source 23 is used as a second monochromatic light source 26.
  • in this example, the blue light source 21 is the main first monochromatic light source 25, and the red light source 23, which can emit second monochromatic light (R light in this example) having an emission wavelength range different from that of the first monochromatic light (B light in this example) illuminating the subject 12, is the auxiliary second monochromatic light source 26.
  • the control unit 40 illuminates the subject 12 using blue light (B light) as main monochromatic light and illuminates the subject 12 using red light (R light) as auxiliary light. Which two of the three RGB colors are used as a combination of the first monochromatic light source 25 and the second monochromatic light source 26 is determined according to the spectral reflectance characteristics of the object to be inspected.
  • the monochromatic light sources 21 to 23 are, for example, LEDs.
  • the blue light source 21 is a blue LED
  • the green light source 22 is a green LED
  • the red light source 23 is a red LED.
  • the control unit 40 is capable of individually and independently controlling on/off the three types of monochromatic light sources 21 to 23.
  • the control unit 40 lights the blue LED, determined from an imaging condition search performed in advance, to identify the white colonies WC and yellow colonies YC to be inspected, and lights the red LED as auxiliary light. Note that a method for determining imaging conditions, including how the monochromatic light used for illumination is determined, will be described later.
  • in the image processing section 50 and the inspection processing section 60, the R image signal, G image signal, and B image signal are also simply referred to as the R signal, G signal, and B signal.
  • the image processing section 50 includes an RGB separation section 51, an arithmetic processing section 52, and an amplification section 53.
  • the RGB separation unit 51 separates the first image signal S1 input from the image sensor 33 into an R image signal, a G image signal, and a B image signal.
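A minimal sketch of such RGB separation, assuming an RGGB Bayer mosaic (the patent only states that the pixels are in "a predetermined arrangement", so the layout here is an assumption). Each plane is returned at quarter resolution; a real separation unit would typically demosaic to full resolution instead.

```python
import numpy as np

def separate_rgb(bayer):
    """Split a raw mosaic frame into R, G, B planes, assuming an RGGB
    Bayer arrangement: R at even rows/even columns, G at the two
    remaining diagonal sites, B at odd rows/odd columns."""
    r = bayer[0::2, 0::2]                               # R sites
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0   # average both G sites
    b = bayer[1::2, 1::2]                               # B sites
    return r, g, b
```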
  • the arithmetic processing unit 52 converts the R signal, G signal, and B signal input from the RGB separation unit 51 into an X signal, a Y signal, and a Z signal. Specifically, the arithmetic processing unit 52 converts the RGB values, which are the signal values of the R signal, G signal, and B signal, into the X signal, Y signal, and Z signal by performing a linear matrix calculation.
  • the arithmetic processing unit 52 is given matrix coefficients.
  • the matrix used for the linear matrix calculation is a 3 ⁇ 3 matrix.
  • the arithmetic processing unit 52 is given coefficients of a 3 ⁇ 3 matrix.
  • the arithmetic processing unit 52 performs a linear matrix operation that multiplies the RGB values of the first imaging signal S1 by the 3 × 3 matrix specified by the matrix coefficients, converting them into a second imaging signal S2 expressed in XYZ with spectral characteristics different from those of the RGB values of the first imaging signal S1.
  • the matrix coefficient is a coefficient for separating RGB of the first image signal S1 into XYZ bands of the second image signal S2.
  • in equation (1), a1 to a3, b1 to b3, and c1 to c3 are the matrix coefficients, and Gx, Gy, and Gz are the gains (amplification factors):

    X = Gx × (a1·R + a2·G + a3·B)
    Y = Gy × (b1·R + b2·G + b3·B)   …(1)
    Z = Gz × (c1·R + c2·G + c3·B)

    These matrix coefficients are set in the matrix coefficient setting section 54 shown in the figure, and the gains Gx, Gy, and Gz are set in the gain setting section 55 shown in the figure.
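The linear matrix calculation with gains of equation (1) can be sketched as follows. The coefficient and gain values below are placeholders for illustration only, not values from the patent; in practice they would come from the matrix coefficient setting section and gain setting section.

```python
import numpy as np

# Illustrative 3x3 matrix coefficients a1..a3, b1..b3, c1..c3 (placeholders)
M = np.array([[ 1.2, -0.1, -0.1],   # a1, a2, a3
              [-0.2,  1.3, -0.1],   # b1, b2, b3
              [-0.1, -0.2,  1.3]])  # c1, c2, c3
GAINS = np.array([1.0, 1.5, 2.0])   # Gx, Gy, Gz (placeholders)

def rgb_to_xyz(rgb_image, matrix=M, gains=GAINS):
    """Equation (1): multiply each RGB pixel vector by the 3x3 matrix
    (the arithmetic processing unit), then amplify each band by its
    gain (the amplification section).

    rgb_image: float array of shape (H, W, 3).
    """
    xyz = rgb_image @ matrix.T   # per-pixel 3x3 linear matrix operation
    return xyz * gains           # gain amplification Gx, Gy, Gz
```

Note the two steps mirror the text: the matrix multiplication produces the pre-gain XYZ values, and the gain multiplication is applied afterwards.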
  • the gain setting unit 55 sets gain values that allow the output levels of the first pixel and the second pixel among the RGB pixels of the image sensor 33 to be balanced between the two colors under the condition that the first monochromatic light is irradiated from the first monochromatic light source 25.
  • in the illustrated example, the gains are set to values that allow the output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel, to be balanced between the two colors (B and G) under the condition that B light, the first monochromatic light, is irradiated from the blue light source 21, which is the first monochromatic light source 25.
  • the arithmetic processing unit 52 performs a linear matrix arithmetic process of multiplying the RGB values by a 3 ⁇ 3 matrix in the above equation (1).
  • the arithmetic processing unit 52 outputs the XYZ values before being multiplied by a gain (amplification factor) to the amplification unit 53.
  • matrix coefficients that can improve the separation of the three bands are set in the 3 × 3 matrix.
  • the amplification unit 53 multiplies the pre-normalization XYZ values from the arithmetic processing unit 52 by the X gain Gx, Y gain Gy, and Z gain Gz set in the gain setting unit 55, respectively.
  • the amplification unit 53 multiplies the X value after XYZ conversion by the X gain Gx, the Y value by the Y gain Gy, and the Z value by the Z gain Gz.
  • the amplification unit 53 outputs the normalized XYZ values as the second image pickup signal S2.
  • the amplification unit 53 is capable of balancing the output levels of the B pixel 33B and the G pixel 33G between two colors under the condition that monochromatic light (B light) is irradiated from the blue light source 21.
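The two-color balancing gains can be computed as in this sketch. It assumes the B-pixel and G-pixel output levels are measured as mean values over a reference area (for example, a reference sheet) under blue monochromatic illumination, and that both channels are scaled to a common target level; both assumptions are illustrative, not from the patent.

```python
def balance_gains(mean_b_level, mean_g_level, target=200.0):
    """Compute per-channel gains that balance the B-pixel and G-pixel
    output levels under blue monochromatic illumination, as the
    amplification section does. `target` is an assumed common output
    level after amplification."""
    gain_b = target / mean_b_level
    gain_g = target / mean_g_level
    return gain_b, gain_g
```

After amplification, both channels reach the same level, so differences between the two colony types appear as a color/density contrast rather than being dominated by the illuminant.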
  • the image processing unit 50 sequentially performs RGB separation processing, linear matrix calculation processing (XYZ conversion processing), and normalization (gain amplification) processing on the input first imaging signal S1, thereby outputting the second imaging signal S2.
  • the three-band RGB image signal is converted into the three-band XYZ image signal.
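For illustration, the RGB-to-XYZ conversion described above (linear matrix calculation followed by per-band gains) can be sketched in Python; the 3 × 3 matrix coefficients and the gains Gx, Gy, Gz below are hypothetical placeholders, not the actual coefficients of equation (1):

```python
import numpy as np

# Hypothetical 3x3 linear matrix coefficients (equation (1) uses
# coefficients tuned to improve the separation of the three bands).
M = np.array([
    [1.2, -0.1, -0.1],
    [-0.2, 1.4, -0.2],
    [-0.1, -0.1, 1.2],
])

# Hypothetical gains Gx, Gy, Gz set by the gain setting unit 55.
GAINS = np.array([1.0, 1.5, 1.1])

def rgb_to_xyz(rgb):
    """Convert one RGB pixel value to a gain-normalized XYZ value."""
    xyz = M @ np.asarray(rgb, dtype=float)  # linear matrix calculation
    return xyz * GAINS                      # per-band gain (normalization)

xyz = rgb_to_xyz([100, 80, 120])
```

The same matrix multiplication is applied to every pixel of the 3-band RGB image to obtain the 3-band XYZ image.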
  • the inspection processing section 60 includes an identification section 61 and a determination section 62.
  • the identification section 61 includes a classification section 63, a counting section 64, and a specifying section 65.
  • the subject 12 includes two different types of colonies WC and YC as two different types of materials.
  • the identification unit 61 identifies colony areas, which are areas of two different types of colonies WC and YC, based on the three-band XYZ image input from the imaging device 11.
  • the XYZ image includes an X image, a Y image, and a Z image.
  • the identification unit 61 defines an area in which the pixel values constituting the X image are within the range set by the first threshold value as a first colony area. Further, the identification unit 61 defines an area in which the pixel values forming the Y image are within the range set by the second threshold value as a second colony area.
  • the identification unit 61 defines an area where the pixel values forming the Z image are within the range set by the third threshold value as a third colony area.
  • the identification unit 61 removes the second colony area from the first colony area or the third colony area and sets the remaining colony area as a fourth colony area.
  • the first colony area or the third colony area is an area that includes all of the colonies WC and YC in the subject 12.
  • the second colony region is a region of white colony WC or yellow colony YC.
  • the calculation formula for the fourth colony region described here is an example of the case of FIG. 12, and the calculation formula for the fourth colony region changes depending on the characteristics of the colony to be identified.
  • the first to third thresholds are thresholds for setting a specific range of pixel values; not only a single intermediate threshold, but also two thresholds (an upper threshold and a lower threshold), or four thresholds that set two intervals, may be used to set the range.
  • the identification unit 61 binarizes the area specified within the threshold range and the other areas using the threshold value, and specifies the area to be inspected extracted by the binarization.
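The threshold-based region extraction and binarization described above can be sketched as follows; the image values and the threshold range are hypothetical:

```python
import numpy as np

def extract_region(image, lower, upper):
    """Binarize: pixels whose value lies within [lower, upper] become 1
    (candidate colony region), all other pixels become 0."""
    img = np.asarray(image)
    return ((img >= lower) & (img <= upper)).astype(np.uint8)

# Toy single-band image: background level ~10, one bright blob ~200.
y_image = np.array([
    [10, 10, 10, 10],
    [10, 200, 210, 10],
    [10, 205, 198, 10],
    [10, 10, 10, 10],
])
mask = extract_region(y_image, 150, 255)  # upper and lower thresholds
```

The extracted binary mask is then passed on to classification and counting.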
  • the classification unit 63 classifies at least one of the first colony area and the third colony area, the second colony area, and the fourth colony area based on characteristics such as shape and size (area).
  • the counting unit 64 counts the number of each colony area classified by the classification unit 63.
  • the specifying unit 65 specifies the number of two types of colony areas to be identified based on the counting results of the counting unit 64.
  • the determination unit 62 determines the quality of the test object obtained as a result of culturing colonies in the Petri dish 13 of the subject 12 based on the counts of the two types of colony regions.
  • when the inspection device 10 is a colony counter, it may be provided with a determination function for determining the quality of the inspection object as an auxiliary function, or may be configured without the determination function of the determination section 62.
  • when the inspection target of the inspection device 10 is not a colony, the main function may be a determination function for determining the quality of the inspection target.
  • FIG. 6 shows an example of spectral reflectance characteristic lines LW and LY of white colony WC and yellow colony YC, and spectral sensitivity characteristic line B indicating the spectral sensitivity characteristic of B pixel as an example of RGB pixels.
  • the horizontal axis shows wavelength (nm), and the vertical axis shows relative sensitivity.
  • the spectral reflectance characteristic lines LW and LY of the two types of colonies shown in this graph are hypothetical waveforms for convenience of explanation.
  • a dedicated spectral reflectance measurement device is required to measure the spectral reflectance characteristics of an inspection target such as a colony.
  • the spectral reflectance characteristics of white colony WC and yellow colony YC are similar. Further, the culture medium 14 in the petri dish 13 may be colored in a predetermined color. Therefore, due to the coloring of the medium 14, it becomes even more difficult to distinguish between the white colony WC and the yellow colony YC.
  • the spectral reflectance characteristic line LW of the white colony WC and the spectral reflectance characteristic line LY of the yellow colony YC are similar over a wide wavelength range.
  • a spectral sensitivity characteristic line B of the B pixel shown by a broken line in FIG. 6 indicates the relative sensitivity of the B pixel, and has a peak near 430 nm.
  • the spectral reflectance characteristic lines LW and LY of each colony WC and YC both have peaks that overlap with the spectral sensitivity characteristic line B. Furthermore, both have a peak in a predetermined wavelength region of the near-infrared region NIRA.
  • the B pixel 33B is sensitive to B light (light in the blue to green region), which is a light component in the range of 300 to 600 nm, as shown by the broken line in FIG. 6. Therefore, the B pixel 33B receives the blue component of the B light of the reflected light from the white colony. Further, the B pixel 33B receives the blue component of the B light of the reflected light from the yellow colony.
  • the density of the white colony WC in the B image is proportional to the area Sw of the region LW*B, obtained by multiplying the spectral reflectance characteristic line LW of the white colony WC by the spectral sensitivity characteristic line B in FIG. 6.
  • the density of the yellow colony YC in the B image is proportional to the area Sy of the region LY*B, which is obtained by multiplying both the spectral reflectance characteristic line LY and the spectral sensitivity characteristic line B of the yellow colony YC in FIG. 6.
  • the two areas Sw and Sy have approximately the same value. Therefore, the densities of the two types of colonies WC and YC in the B image are approximately the same.
  • if the spectral sensitivity characteristic of the B pixel is limited to a narrow wavelength range (400 to 500 nm) in which the difference between the two spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC is relatively large, white colonies WC and yellow colonies YC can be distinguished in the B image. That is, the density of the white colony WC in the B image is proportional to the area Sw of the region LW*B, obtained by multiplying the spectral reflectance characteristic line LW of the white colony WC by the spectral sensitivity characteristic line B in FIG. 7.
  • the density of the yellow colony YC in the B image is proportional to the area Sy of the region LY*B, which is obtained by multiplying both the spectral reflectance characteristic line LY and the spectral sensitivity characteristic line B of the yellow colony YC in FIG. 7.
  • the two types of colonies WC and YC can be identified in the B image.
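The proportionality described above, namely image density proportional to the area of the product of a spectral reflectance characteristic and a spectral sensitivity characteristic, can be illustrated numerically. The Gaussian-shaped curves below are invented stand-ins for LW, LY, and the pixel sensitivities, not the measured characteristics of FIGS. 6 and 7; they merely show that narrowing the sensitivity to the wavelength range where LW and LY differ enlarges the density contrast:

```python
import numpy as np

wl = np.arange(300.0, 901.0, 1.0)  # wavelength axis in nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical spectral reflectance curves: the two colony types differ
# mainly around 450 nm and are identical at longer wavelengths.
LW = 0.8 * gauss(450, 40) + 0.5 * gauss(600, 80)  # white colony
LY = 0.3 * gauss(450, 40) + 0.5 * gauss(600, 80)  # yellow colony

def density(reflectance, sensitivity):
    # Image density ~ area of the product reflectance * sensitivity
    # (1-nm sampling, so a plain sum approximates the integral).
    return float(np.sum(reflectance * sensitivity))

S_wide = gauss(500, 100)   # broad pixel sensitivity (FIG. 6 situation)
S_narrow = gauss(450, 25)  # narrow sensitivity (FIG. 7 situation)

# Density ratio yellow/white: farther from 1 means easier to distinguish.
ratio_wide = density(LY, S_wide) / density(LW, S_wide)
ratio_narrow = density(LY, S_narrow) / density(LW, S_narrow)
```

With these assumed curves, the narrow sensitivity yields a ratio farther from 1 than the broad one, i.e. a larger density contrast between the two colony types.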
  • likewise, if the spectral sensitivity characteristic shown by the narrow broken line N is limited to the near-infrared wavelength region (800 to 900 nm) where the difference between the two spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC is relatively large, the two types of colonies WC and YC can be distinguished in the IR image.
  • strictly speaking, Sw indicates the area of the region where LW and B overlap, not LW*B; however, since the difference from LW*B is small, in these figures LW*B is substituted by the area of the region where LW and B overlap. Similarly, LY*B is substituted by the area of the region where LY and B overlap.
  • the spectral sensitivity characteristics of the pixels are set in wavelength regions where there is a difference between the spectral reflectance characteristic lines LW and LY.
  • the two types of colonies WC and YC can be identified from the image.
  • to realize such spectral sensitivity characteristics, a special camera is required, and a special light source that can emit light of a specific wavelength with a narrow emission wavelength range is also required.
  • these special cameras and special light sources are expensive, which increases the manufacturing cost of the imaging device.
  • in contrast, in this embodiment, pixel spectral sensitivity characteristics like those shown in FIG. 7 are generated by performing predetermined processing on the first imaging signal S1 captured using the general-purpose color camera 31.
<About images using the general-purpose color camera 31>
Next, a comparative example will be described with reference to FIG. 8.
  • the image sensor 33 mounted on the general-purpose color camera 31 has the RGB spectral sensitivity characteristics shown in the graph when illuminated with white LED.
  • the spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC are different from those in FIGS. 6 and 7.
  • the spectral sensitivity characteristic lines R, G, and B, which indicate the sensitivities of the respective RGB pixels shown in FIG. 8, are distributed over a wide wavelength range.
  • the spectral reflectance characteristic line LW of the white colony WC and the spectral reflectance characteristic line LY of the yellow colony YC have a difference of LW > LY in the wavelength range of about 480 to 520 nm, but, on the contrary, LW < LY in the wavelength range of about 530 to 580 nm. Therefore, the difference in density between the two types of colonies WC and YC becomes small in any of the R, G, and B images that have sensitivity in the range of 400 to 700 nm.
  • the density of the white colony WC in the G image is indicated by the area Sg1 of the region obtained by multiplying the spectral reflectance characteristic line LW (thick line) of the white colony WC in FIG. 8 by the spectral sensitivity characteristic line G of the G pixel shown by the dashed line.
  • the density of the yellow colony YC in the G image is indicated by the area Sg2 of the region obtained by multiplying the spectral reflectance characteristic line LY (thick one-dot chain line) of the yellow colony in FIG. 8 by the spectral sensitivity characteristic line G of the G pixel.
  • the ratio of the difference ΔSg (= |Sg1 − Sg2|) between the two areas Sg1 and Sg2 to the whole (total area of the G region) is small. As this ratio becomes smaller, it becomes more difficult to distinguish between the two types of colonies WC and YC in the G image. This is because the G pixel has sensitivity over a wide wavelength range.
  • similarly, in the B image and the R image, the differences ΔSb and ΔSr account for a small proportion of the total area (total area of the B region, total area of the R region), so the white colony WC and the yellow colony YC are hard to distinguish.
  • note that in FIG. 8, the area Sg1 of the region where LW and G overlap is substituted for LW*G, and the area Sg2 of the region where LY and G overlap is substituted for LY*G.
  • FIG. 9 shows the density of white colonies WC and yellow colonies YC in an RGB image captured by the general-purpose color camera 31 of this comparative example.
  • in any of the R, G, and B images, there is almost no difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC, or even if there is a difference, it is very small. For this reason, it is difficult to distinguish between the white colony WC and the yellow colony YC in an RGB image.
  • the RGB pixels of the image sensor 33 of the general-purpose color camera 31 have a sensitivity expressed as a relative output with respect to wavelength (nm) under white LED illumination.
  • the B pixel 33B, which is the first pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 440 to 540 nm, and has a spectral characteristic with a peak at approximately 460 nm. That is, the first wavelength region where the relative output is 0.1 or more is about 440 to 540 nm.
  • the G pixel 33G, which is the second pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 460 to 610 nm, and has a spectral characteristic with a peak at approximately 550 nm. That is, the second wavelength region where the relative output is 0.1 or more is about 460 to 610 nm.
  • the R pixel 33R, which is the third pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 570 to 650 nm, and has a spectral characteristic with a peak at approximately 600 nm.
  • when illuminated with B light whose emission wavelength includes at least part of the overlapping wavelength range of about 460 to 550 nm, not only the B pixel 33B, which is the first pixel, but also the G pixel 33G, which is the second pixel, has a relative output of 0.1 or more.
  • FIG. 10 shows the spectrum (emission distribution) of the monochromatic light source 20 when the blue light source 21 made of a blue LED and the red light source 23 made of a red LED emit light at the same time.
  • the horizontal axis is wavelength and the vertical axis is output power.
  • the blue light source 21, which is the main monochromatic light source, emits B light of about 430 to 530 nm.
  • the red light source 23 which is an auxiliary light source, emits R light of 500 to 660 nm. Note that in the graph of FIG. 10, the spectral characteristics of the white LED are shown by broken lines.
  • FIG. 11 shows the relative outputs of the RGB pixels 33R, 33G, and 33B when illuminated with B light and R light.
  • in the G region shown in FIG. 11, the G region is shifted to the lower wavelength side compared to the G region when illuminated with white light in FIG. 8.
  • in the B region shown in FIG. 11, the portion of the B region that had a relative output of about 0.05 to 0.2 in the 500 to 650 nm range when illuminated with white light in FIG. 8 has almost disappeared.
  • the R region is shifted to the higher wavelength side by illuminating with the R light shown in FIG. 10, compared to the R region when illuminated with white light in FIG. 8.
  • an XYZ image is obtained by performing linear matrix calculation on the RGB image.
  • the imaging device 11 of this example employs an imaging method that gives the color camera 31 equipped with the RGB color filter 34 new two-band imaging characteristics through monochromatic illumination and RGB linear matrix calculation processing.
  • the two new bands have different peak wavelengths from the original two bands, and have different imaging characteristics from the RGB imaging characteristics when using white LED lighting.
  • this imaging method enables color discrimination that is difficult to perform when imaging with a normal color camera 31.
  • the monochromatic illumination of this embodiment is blue illumination.
  • the monochromatic illumination used as an auxiliary light source is red illumination. That is, the blue light source 21 that illuminates with B light is used as the main monochromatic light source, and the red light source 23 that illuminates with R light is used as an auxiliary light source. By illuminating with R light using the red light source 23 as an auxiliary light source, the R image is also used for inspection processing.
  • the B light from the blue light source 21 causes the G pixel 33G to have a relative output of 0.1 or more in the region of about 450 to 520 nm.
  • the B light which is monochromatic light that illuminates the subject 12, is light in an emission wavelength range that includes at least part of the first wavelength range and at least part of the overlapping wavelength range.
  • the control unit 40 simultaneously illuminates the first monochromatic light source 25 and the second monochromatic light source 26 to cause the imaging unit 30 to image the subject 12.
  • the control unit 40 illuminates the second monochromatic light source 26 with a second illuminance set according to the illuminance of the first monochromatic light source 25, or the gain setting unit 55 sets the gain given to the output level of the R pixel 33R, which is the third pixel, according to the respective gains given to the output levels of the B pixel 33B, which is the first pixel among the RGB pixels 33R, 33G, and 33B, and the G pixel 33G, which is the second pixel.
  • the density of the white colony WC in the Y image is indicated by the area Sg1 of the region obtained by multiplying the spectral reflectance characteristic line LW (thick line) of the white colony WC in FIG. 13 by the spectral sensitivity characteristic line Y.
  • the density of the yellow colony YC in the Y image is indicated by the area Sg2 of the region obtained by multiplying the spectral reflectance characteristic line LY (thick one-dot chain line) of the yellow colony in FIG. 13 by the spectral sensitivity characteristic line Y of the Y image.
  • the difference ΔSg (= |Sg1 − Sg2|) between the two areas Sg1 and Sg2 accounts for a large proportion of the total (sum of Sg1 and Sg2). Since this ratio corresponds to the density difference ratio, the two types of colonies WC and YC have a large density difference ratio in the Y image and are easy to distinguish. Note that in the B image and the R image, the differences ΔSb and ΔSr (not shown) account for a small proportion of the total area (the total area of the B region, the total area of the R region), so the two types of colonies WC and YC are hard to distinguish. In addition, in FIG. 13, the area Sg1 of the region where LW and Y overlap is substituted for LW*Y, and the area Sg2 of the region where LY and Y overlap is substituted for LY*Y.
  • FIG. 14 shows the density of white colonies WC and yellow colonies YC in an XYZ image captured by the general-purpose color camera 31 of this embodiment.
  • in the X image and the Z image, there is almost no difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC, or even if there is a difference, the difference is very small. For this reason, it is difficult to distinguish between the two types of colonies WC and YC in the X and Z images.
  • in the Y image, there is a large difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC. Therefore, white colonies WC and yellow colonies YC can be easily distinguished in the Y image. From the above, it is also possible to identify the two types of colonies WC and YC by color in the XYZ image.
  • in step S11, the control unit 40 causes a reference sheet for color gain adjustment to be installed.
  • the control unit 40 controls, for example, a robot to have an arm grip the reference sheet, and places the gripped reference sheet at an inspection position, for example, which is in the imaging area of the camera 31.
  • the reference sheet may be placed manually by the operator.
  • in step S12, the control unit 40 turns on the blue LED.
  • in step S13, the control unit 40 sets the B gain and G gain so that the levels of the B image and the G image are the same. That is, the control unit 40 sets the B gain and G gain to values that allow the respective output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel, to be color-balanced between the two colors under the condition of illumination with B light from the blue LED, which is the blue light source 21.
  • in step S14, the control unit 40 turns on the red LED.
  • in step S15, the control unit 40 sets the manually specified R gain. That is, the operator manually sets the R gain by operating the input section, and the control unit 40 sets the R gain input from the input section by writing it into a predetermined storage area of the memory.
  • in step S16, the control unit 40 causes a background sheet to be installed.
  • the control unit 40 controls, for example, the arm of the transport robot, causes the arm to grip the background sheet, and places the gripped background sheet at the imaging area of the camera 31, for example, at an inspection position. Note that before placing the background sheet at the inspection position, the robot removes the reference sheet previously placed at the inspection position.
  • in step S17, the control unit 40 calculates and sets a black balance offset amount using the background sheet image.
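Steps S12 and S13 above amount to choosing gains that equalize the B and G output levels measured from the reference sheet under blue illumination; a minimal sketch (the measured level values are hypothetical):

```python
# Mean output levels of the B and G pixels measured from the reference
# sheet image under blue-LED illumination (hypothetical sample values).
b_level = 180.0
g_level = 90.0

# Choose gains so that gain * level is equal for both colors,
# using the larger measured level as the common target.
target = max(b_level, g_level)
b_gain = target / b_level
g_gain = target / g_level

# After applying the gains, the two color levels are balanced.
assert abs(b_gain * b_level - g_gain * g_level) < 1e-9
```

The same idea extends to step S17: the black balance offset would be derived from the background sheet image so that its black level matches a reference level.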
  • the control unit 40 executes the imaging processing routine shown in FIG. 17.
  • the control unit 40 causes the camera 31 to image the petri dish 13 placed at the inspection position, and also causes the image processing unit 50 (FIGS. 1 and 5) to perform predetermined image processing on the imaging signal output from the camera 31 as the imaging result, converting the 3-band RGB image forming the imaging signal into a 3-band XYZ image and outputting it.
  • in step S21, the control unit 40 arranges the petri dish 13 in which the colonies WC and YC are cultured on the background sheet 15.
  • the control unit 40 controls, for example, a robot to place the petri dish 13 held by the arm on the background sheet 15. Note that the operator may place the petri dish 13 on the background sheet 15.
  • in step S22, the control unit 40 turns on the blue and red LEDs.
  • in step S23, the control unit 40 causes the camera 31 to take an image. Note that the processing in steps S22 and S23 corresponds to an example of an imaging step.
  • in step S24, the control unit 40 performs linear matrix processing on the RGB image. Note that the processing in step S24 corresponds to an example of an arithmetic processing step.
  • in step S25, the control unit 40 performs white balance processing using the gains Gx, Gy, and Gz set in the gain setting unit 55.
  • the gains Gx and Gy are values obtained by color-balancing the output levels of the B pixel and the G pixel. Therefore, even if the areas Sg1 and Sg2 shown in FIG. 13 occupy a small area in the entire Y region, the two types of colonies WC and YC do not become too dark in the Y image and can be obtained with a sufficient density difference.
  • in step S26, the control unit 40 performs black balance processing using the set RGB offset amounts. That is, the control unit 40 performs black balance so that the black color of the background sheet 15 in the image has the same density as the black color of the background sheet 15 in the image set at the time of initial setting.
  • in step S27, the control unit 40 outputs an XYZ image.
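The processing of steps S24 to S27 can be sketched as a pipeline applied to every pixel of the captured RGB image; the matrix coefficients, gains, and offsets below are hypothetical placeholders, not the values of the embodiment:

```python
import numpy as np

M = np.array([[1.2, -0.1, -0.1],
              [-0.2, 1.4, -0.2],
              [-0.1, -0.1, 1.2]])      # hypothetical matrix coefficients
GAINS = np.array([1.0, 1.5, 1.1])      # hypothetical gains Gx, Gy, Gz
OFFSET = np.array([2.0, 2.0, 2.0])     # hypothetical black-balance offsets

def process(rgb_image):
    """S24-S27 sketch: linear matrix, white balance, black balance, output."""
    h, w, _ = rgb_image.shape
    xyz = rgb_image.reshape(-1, 3) @ M.T        # S24: linear matrix
    xyz = xyz * GAINS                           # S25: white balance
    xyz = xyz - OFFSET                          # S26: black balance
    return xyz.reshape(h, w, 3)                 # S27: output XYZ image

rgb = np.full((2, 2, 3), [100.0, 80.0, 120.0])  # uniform toy image
xyz = process(rgb)
```

Each pixel is converted independently, so the same function scales to a full-resolution camera frame.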
  • a distinguishable difference in density is obtained between the white colony WC and the yellow colony YC by predetermined image processing such as binarization processing. Therefore, two types of colonies WC and YC can be identified.
  • in the X image, the densities of the two types of colonies WC and YC are nearly the same and the two types are difficult to distinguish from each other, but all colony regions can be extracted, so the total number of colonies WC and YC can be counted; in the Y image, the difference in density between the two types of colonies WC and YC is large, so the two types can be identified.
  • the areas of the two types of colonies WC and YC in FIG. 15(b) correspond to the first colony area, which is an area where the pixel level is within the threshold setting range.
  • One of the two types of colonies WC and YC in FIG. 15(c) corresponds to the second colony area, which is an area where the pixel level is within the threshold setting range.
  • the areas of the two types of colonies WC and YC in FIG. 15(d) correspond to the third colony area, which is an area where the pixel level is within the threshold setting range.
  • the remaining colony area after excluding the second colony area, which is the area of one of the two types of colonies, from the first colony area or the third colony area, which is the area of the total number of colonies, corresponds to the fourth colony area.
  • for example, when the white colony WC area corresponds to the second colony area, the yellow colony YC area corresponds to the fourth colony area; conversely, when the yellow colony YC area corresponds to the second colony area, the white colony WC area corresponds to the fourth colony area.
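The derivation of the fourth colony area described above is a mask subtraction; a minimal sketch with hypothetical binary masks:

```python
import numpy as np

# Binary masks (True = pixel inside a colony region), toy 4x4 example.
first_area = np.array([[0, 1, 1, 0],
                       [0, 1, 1, 0],
                       [0, 0, 0, 0],
                       [1, 1, 0, 0]], dtype=bool)   # all colonies
second_area = np.array([[0, 1, 1, 0],
                        [0, 1, 1, 0],
                        [0, 0, 0, 0],
                        [0, 0, 0, 0]], dtype=bool)  # one colony type only

# Fourth colony area: remove the second colony area from the first.
fourth_area = first_area & ~second_area
```

The pixels remaining in `fourth_area` belong to the other colony type, which is then classified and counted.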
  • the computer constituting the inspection processing section 60 executes the inspection processing routine shown in FIG. 18.
  • the inspection processing unit 60 counts colonies for each image using the XYZ images from the imaging device 11, performs count correction that corrects the number of colonies by type using the colony counting result for each image, and performs a determination process to determine the test result based on the corrected count result.
  • the processing performed for each XYZ image includes area extraction processing to find the colony area to be inspected, classification processing to classify the extracted area according to the characteristics of the colony, and counting processing to count the number of inspection targets classified as colonies.
  • the area extraction process is a process of finding the area of the inspection target (colony candidate) for each of the X image, Y image, and Z image.
  • the classification process is a process of classifying whether or not a colony to be inspected exists in the extracted region, using the shape factor and size factor of the colony to be inspected as characteristics.
  • the counting process is a process of counting the number of test objects classified as colonies.
  • FIG. 18 shows parallel processing for determining regions for each XYZ image in one flow, and the processing order of step numbers is shown as an example, but the parallel processing may be performed in any order.
  • the feature classification and counting for each area may be performed in any order as long as the area is specified.
  • steps S31 to S34 are a series of processes (first process) for counting colonies using the X image.
  • steps S35 to S38 are a series of processes (second process) for counting colonies using the Y image.
  • steps S39 to S42 are a series of processes (third process) for counting colonies using the Z image.
  • steps S43 to S47 are a series of processes (fourth process) in which two or three of the areas extracted as colony candidates in the first to third processes are combined to determine a specific type of colony area.
  • the first process includes region extraction using a threshold value on the X image (step S31), specification of a first region as the region extraction result (step S32), feature classification for classifying the first region according to features (step S33), and counting of the feature-classified inspection targets (step S34).
  • the second process includes region extraction using a threshold value on the Y image (step S35), specification of a second region as the region extraction result (step S36), feature classification for classifying the second region according to features (step S37), and counting of the feature-classified inspection targets (step S38).
  • the third process includes region extraction using a threshold value on the Z image (step S39), specification of a third region as the region extraction result (step S40), feature classification for classifying the third region according to features (step S41), and counting of the feature-classified inspection targets (step S42).
  • the fourth process is a process of finding another area using the multiple areas obtained in each of the above processes (for example, three processes).
  • the fourth process includes a determination process (step S43) that determines whether the plurality of areas to be used have already been acquired, an area acquisition process (step S44) that determines a fourth area using, as an example, the first area and the second area, specification of a fourth region as the area acquisition result (step S45), feature classification for classifying the fourth region according to features (step S46), and counting of the feature-classified test objects (step S47).
  • for example, the fourth colony area may be determined by using the colony candidate area of the X image as the first area and the colony candidate area of the Y image as the second area.
  • in step S48, the inspection processing unit 60 (more specifically, the computer constituting the inspection processing unit 60) performs count correction by a predetermined calculation using the count results of the inspection targets for each region obtained in steps S34, S38, S42, and S47. Count values for each type of colony to be inspected are determined by the count correction; for example, the number of colonies of the second type is calculated by subtracting the number of colonies of the first type from the total number of colonies. Further, in step S49, the inspection processing unit 60 performs a determination process to obtain a final inspection result using the count correction results.
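The count correction of step S48 and the determination of step S49 reduce to simple arithmetic and threshold checks; a sketch with hypothetical counts and acceptance limits:

```python
def count_correction(total_count, first_type_count):
    # Step S48 sketch: the number of second-type colonies is obtained by
    # subtracting the number of first-type colonies from the total.
    return total_count - first_type_count

def judge(total, per_type_counts, total_limit, type_limits):
    # Step S49 sketch: the result is NG if the total number of colonies
    # or the count of any colony type exceeds its threshold (the limits
    # here are hypothetical acceptance criteria).
    if total > total_limit:
        return "NG"
    if any(c > lim for c, lim in zip(per_type_counts, type_limits)):
        return "NG"
    return "OK"

# Hypothetical counts: 25 colonies in total, 9 of the first type.
second_type = count_correction(25, 9)
result = judge(25, [9, second_type], total_limit=30, type_limits=[10, 20])
```

The determined result would then be passed to the display unit 70 together with the per-type counts.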
  • the determination processing includes inspection determination processing that determines whether the total number of colonies and the number of each type of colony exceed the corresponding number thresholds, and may also include highlighting processing that highlights colonies by surrounding them with a display frame or adding a predetermined color, and the like.
  • the test processing section 60 then displays the test results on the display section 70.
  • the display unit 70 displays, for example, the total number of colonies, the number of colonies by type, test result details, and images with colonies highlighted.
  • the inspection result content may include a determination result as to whether the product is good or defective.
  • although the inspection target is a colony here, the inspection target is not limited to this. For example, a configuration may be adopted in which a product is inspected for the presence or absence of foreign matter, or a configuration in which a product is inspected for the presence or absence of contamination may be adopted.
  • the identification section 61 and the determination section 62 are configured in software by the computer forming the inspection processing section 60 executing the inspection processing program shown in FIG. 18. The identification unit 61 identifies each region by region extraction (steps S31, S35, S39, S44) and region specification (steps S32, S36, S40, S45).
  • the classification unit 63 is configured by a computer that performs feature classification processing (steps S33, S37, S41, and S46).
  • the counting unit 64 is constituted by a computer that performs counting processing (steps S34, S38, S42, S47).
  • the specifying unit 65 is constituted by a computer that performs a process of specifying the number of two types of colony areas to be identified based on the count value that is the counting result of the counting unit 64.
  • the determination unit 62 is constituted by a computer that performs determination processing based on the numbers of the two types of colonies identified (step S49).
  • the control unit 40 executes the imaging condition determination processing routine shown in FIG. 19.
  • This flowchart is an example of searching and determining imaging conditions using three types of monochromatic light sources.
  • the types of monochromatic light sources used for the imaging condition search may be two types of monochromatic light sources among the blue light source 21, the green light source 22, and the red light source 23.
  • the processing in steps S51 to S57 is a first search process in which imaging conditions are searched for and determined using the first monochromatic light source. If appropriate imaging conditions cannot be determined even after repeatedly performing the processes in steps S51 to S57, the monochromatic light is changed in step S58, and the processes in steps S51 to S57 are performed again. Changing the monochromatic light refers to changing the peak wavelength of the monochromatic light. Furthermore, if all the peak wavelengths of monochromatic light available from one monochromatic light source have been tried, the monochromatic light source itself may be changed. Even if the monochromatic light source is changed, the processes in steps S51 to S57 are basically the same. Therefore, in the following, the process in which the control unit 40 searches for and determines imaging conditions using the blue light source, which is the first monochromatic light source, will be described in detail.
  • In step S51, the control unit 40 sets the coefficients for the linear matrix calculation used when illuminating with the first monochromatic light.
  • Step S51 corresponds to an example of the first step of setting the coefficients of the matrix by which the RGB image is multiplied, in the linear matrix calculation processing performed on the 3-band RGB image output from the imaging unit 30 to generate a 3-band XYZ image with corrected RGB imaging characteristics.
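The linear matrix calculation that these coefficients parameterize can be sketched as a per-pixel 3×3 multiply. The coefficient values below are purely illustrative and not the ones the patent determines; `rgb_to_xyz` is a hypothetical helper name.

```python
import numpy as np

# Illustrative coefficients only: the actual values set in step S51 are
# device-specific and found by the search routine described in the text.
M = np.array([
    [ 1.2, -0.1, -0.1],   # X row: mostly R, with crosstalk correction
    [-0.2,  1.3, -0.1],   # Y row: mostly G
    [ 0.0, -0.2,  1.2],   # Z row: mostly B
])

def rgb_to_xyz(rgb_image, matrix):
    """Apply the linear matrix to every pixel of an (H, W, 3) RGB image."""
    h, w, _ = rgb_image.shape
    flat = rgb_image.reshape(-1, 3).astype(float)
    return (flat @ matrix.T).reshape(h, w, 3)

rgb = np.full((2, 2, 3), [10.0, 20.0, 30.0])
xyz = rgb_to_xyz(rgb, M)
# each pixel: X = 1.2*10 - 0.1*20 - 0.1*30 = 7.0
```

Narrowing or shifting the effective sensitivity of a channel, as the description explains, amounts to choosing off-diagonal coefficients that subtract the crosstalk of the neighboring channels.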
  • In step S52, the control unit 40 illuminates the subject with the first monochromatic light source.
  • Step S52 corresponds to an example of the second step of illuminating the subject 12 with monochromatic light capable of changing the peak wavelengths of the spectral output characteristics of the first pixel among the RGB pixels of the camera 31 and of the second pixel, which is different from the first pixel.
  • In step S53, the control unit 40 images the subject. The process in step S53 corresponds to an example of the third step, in which the imaging unit 30 images the subject 12 illuminated with the monochromatic light in the second step.
  • In step S54, the control unit 40 multiplies the RGB image by the matrix to generate an XYZ image.
  • Step S54 corresponds to an example of the fourth step, in which the RGB image output from the imaging unit 30 as a result of the imaging in the third step is multiplied by the matrix to generate the XYZ image.
  • In step S55, the control unit 40 obtains the difference in density between regions of two different types in the Y image. Step S55 corresponds to an example of the fifth step, in which it is determined whether the difference in density between regions of two different materials in the second image, output from the second pixel among the XYZ images generated in the fourth step, is greater than or equal to a predetermined threshold.
  • In step S56, the control unit 40 determines whether the difference is greater than or equal to a threshold value. If the difference is less than the threshold, the process advances to step S57. If the difference is greater than or equal to the threshold, the process proceeds to step S59, where the monochromatic light and the coefficients at that time are determined. The determined coefficients are set in the matrix coefficient setting section 54.
  • In step S57, the control unit 40 determines whether the condition search using the first monochromatic light has finished. If it has not, the process returns to step S52 and steps S52 to S57 are repeated. If no imaging condition for which the difference is equal to or greater than the threshold can be found even after all condition searches using the first monochromatic light have been completed, the process moves to step S58.
  • In step S58, the control unit 40 changes the monochromatic light.
  • Specifically, the control unit 40 changes the peak wavelength of the monochromatic light.
  • Changing the peak wavelength of the monochromatic light may mean gradually shifting the peak wavelength while keeping the same monochromatic light source.
  • For example, the peak wavelength of a general-purpose blue LED is about 470 nm, but it may be changed in 10 nm steps to 480 nm and then 490 nm.
  • the peak wavelength of the monochromatic light (for example, B light) to be used is changed, and the imaging condition search processing in steps S51 to S57 is similarly executed using the monochromatic light after the peak wavelength has been changed.
  • the monochromatic light source to be used is changed from the blue light source 21 to the green light source 22 in step S58.
  • the peak wavelength of the monochromatic light (for example, G light) to be used is changed, and the imaging condition search processing of steps S51 to S57 is similarly executed using the monochromatic light after the peak wavelength has been changed.
  • If an imaging condition for which the difference is equal to or greater than the threshold still cannot be found in step S56, the green light source 22 is changed to the red light source 23 in step S58. Then, similarly, the peak wavelength of the monochromatic light to be used (for example, R light) is changed, and the imaging condition search processing of steps S51 to S57 is executed using the monochromatic light with the changed peak wavelength. In this way, the first to fifth steps (steps S51 to S58) are repeated, with changes to the coefficients in the first step (S51) and to the peak wavelength of the monochromatic light in the second step (S52, S58), until the difference in density becomes equal to or greater than the predetermined threshold in the fifth step (S56).
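The search of steps S51 to S58 can be summarized, under assumptions, as three nested loops over light sources, peak wavelengths, and coefficient sets. `search_imaging_conditions`, `capture_rgb`, and the default threshold are hypothetical stand-ins for hardware interfaces and settings the description leaves device-specific.

```python
import numpy as np

def search_imaging_conditions(light_sources, coefficient_sets, capture_rgb,
                              region_a, region_b, threshold=30.0):
    """Sketch of the S51-S58 loop: try each combination of monochromatic light,
    peak wavelength, and matrix coefficients until the density difference
    between two regions of the Y image reaches the threshold (step S56),
    then return the winning condition (step S59)."""
    for source in light_sources:                       # S58: change light source
        for peak_nm in source["peak_wavelengths_nm"]:  # S58: change peak wavelength
            for matrix in coefficient_sets:            # S51: set matrix coefficients
                rgb = capture_rgb(source["name"], peak_nm)  # S52/S53: illuminate, image
                h, w, _ = rgb.shape
                xyz = (rgb.reshape(-1, 3) @ matrix.T).reshape(h, w, 3)  # S54
                y = xyz[..., 1]                             # second image (Y)
                diff = abs(y[region_a].mean() - y[region_b].mean())     # S55
                if diff >= threshold:                       # S56
                    return {"source": source["name"], "peak_nm": peak_nm,
                            "matrix": matrix, "difference": diff}
    return None  # no imaging condition satisfied the threshold
```

The early return mirrors step S59: the first condition that clears the threshold is fixed, so later light sources are never tried unless the earlier ones fail.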
  • The target image from which the density difference is obtained among the XYZ images in the subsequent step S55 may be changed to an image different from the previous Y image (for example, the X image or the Z image). This image may be selected depending on the monochromatic light and the spectral reflectance characteristics of the colonies WC and YC.
  • the imaging device 11 images the subject 12 under the imaging conditions determined in this routine.
  • When an imaging condition is found for which the difference is equal to or greater than the threshold, the blue light source 21 used at that time is determined as the monochromatic light source. This blue light source 21 is determined as the first monochromatic light source 25.
  • In step S58, changing the peak wavelength of the monochromatic light includes changing the monochromatic light by changing the peak wavelength while keeping the same monochromatic light source. It is sufficient if the emission wavelength can be changed so that the peak wavelengths of the spectral output characteristics of the first pixel and of the second pixel, which is different from the first pixel, can be changed. For example, it is sufficient to simply switch the monochromatic light source. That is, in step S58, the control unit 40 may simply switch the blue light source 21, the green light source 22, and the red light source 23 in order. In a configuration in which the monochromatic light source is kept unchanged while its peak wavelength is changed, the monochromatic light source used for the imaging condition search may be just one of the blue light source 21, the green light source 22, and the red light source 23.
  • the imaging device 11 outputs an image in which regions made of two different materials included in the photographed subject 12 can be identified.
  • the imaging device 11 includes an imaging section 30, a monochromatic light source 20, a control section 40, and an image processing section 50.
  • The imaging unit 30 includes an image sensor 33 that images the subject 12 and has RGB pixels 33R, 33G, and 33B.
  • the monochromatic light source 20 illuminates the subject 12 with monochromatic light.
  • the control unit 40 controls the monochromatic light source 20 and the imaging unit 30.
  • the image processing unit 50 performs predetermined processing on the three-band RGB image output from the imaging unit 30 that has captured the image of the subject 12.
  • the RGB pixels include a first pixel 33B having sensitivity in a first wavelength region and a second pixel 33G having sensitivity in a second wavelength region.
  • the first wavelength region and the second wavelength region overlap in some overlapping wavelength regions.
  • Monochromatic light is light in an emission wavelength range that includes at least part of the first wavelength range and at least part of the overlapping wavelength range.
  • the image processing section 50 includes an arithmetic processing section 52 and an amplification section 53.
  • the arithmetic processing unit 52 generates a three-band XYZ image with corrected RGB imaging characteristics by performing linear matrix arithmetic processing on the three-band RGB image.
  • Under the condition that monochromatic light is irradiated from the monochromatic light source 20, the amplification unit 53 amplifies each of the values corresponding to the first pixel 33B and the second pixel 33G among the XYZ values forming the XYZ image, using gains set to values that color-balance the output levels of the first pixel 33B and the second pixel 33G between the two colors.
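As a rough sketch of this gain setting: under the monochromatic illumination, gains for the two channels are chosen so that both measured reference levels map to a common target level. The function names and the target value are illustrative assumptions, not from the patent; the gains correspond to the Gx, Gy, Gz of the reference list.

```python
import numpy as np

def balance_gains(first_level, second_level, target=100.0):
    """Choose gains that map both channel levels, measured under the
    monochromatic illumination, to a common target level (color balance)."""
    return target / first_level, target / second_level

def apply_gains(xyz_pixel, gx, gy, gz=1.0):
    """Amplify the values corresponding to the first and second pixels."""
    return xyz_pixel * np.array([gx, gy, gz])

gx, gy = balance_gains(50.0, 25.0)  # e.g. measured X and Y levels under blue light
balanced = apply_gains(np.array([50.0, 25.0, 10.0]), gx, gy)
# both amplified channels now sit at the common target level
```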
  • the image processing unit 50 generates a three-band XYZ image by performing black balance processing on a reference portion of the subject 12 other than the identification target. According to this configuration, since black balance processing is performed, even if the environment when photographing the subject 12 changes, it is possible to output an image in which the material can be identified with appropriate color or density.
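The patent does not spell out the arithmetic of the black balance processing; one minimal plausible form, shown here purely as an assumption, is a per-channel offset that maps the mean level of the reference portion to zero.

```python
import numpy as np

def black_balance(image, ref_mask):
    """Subtract the per-channel mean of the reference region (a portion of the
    subject other than the identification target) and clip at zero."""
    ref = image[ref_mask].reshape(-1, image.shape[-1])
    offset = ref.mean(axis=0)          # per-channel level of the reference
    return np.clip(image - offset, 0.0, None)
```

Re-deriving the offset from the reference portion at each capture is what makes the output robust to changes in the imaging environment, as the configuration above claims.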
  • The monochromatic light source 20 is a blue light source 21 or a red light source 23.
  • When the monochromatic light source 20 is the blue light source 21, the first pixel 33B is a B pixel and the second pixel is a G pixel 33G.
  • When the monochromatic light source 20 is the red light source 23, the first pixel is the R pixel 33R and the second pixel is the G pixel 33G.
  • The G pixel 33G has sensitivity, in at least part of the overlapping wavelength region, to blue light, which is the light source color of the blue light source 21. Therefore, when capturing an image under blue illumination, the sensitive wavelength range of the G image can be narrowed compared to that of a general-purpose camera. Likewise, the G pixel 33G has sensitivity, in at least part of the overlapping wavelength region, to red light, which is the light source color of the red light source 23. Therefore, when capturing an image under red illumination, the sensitive wavelength range of the G image can also be narrowed compared to that of the general-purpose camera 31. By performing the linear matrix calculation on the RGB image, the narrow sensitive wavelength region of the G image can be narrowed further or shifted in wavelength. It is therefore possible to output an image in which the materials (colonies WC, YC) can be identified with appropriate color or density.
  • The imaging device 11 further includes a second monochromatic light source 26 that can emit second monochromatic light in an emission wavelength range different from that of the first monochromatic light emitted by the first monochromatic light source 25.
  • the RGB pixels include a third pixel in addition to the first pixel 33B and the second pixel 33G.
  • the control unit 40 causes the imaging unit 30 to image the subject 12 by illuminating the first monochromatic light source 25 and the second monochromatic light source 26 .
  • The control unit 40 illuminates the second monochromatic light source 26 at a second illuminance set according to the illuminance of the first monochromatic light source 25, or the gain setting unit 55 sets the gain given to the output level of the third pixel according to the gains given to the output levels of the first pixel 33B and the second pixel 33G among the RGB pixels. According to this configuration, by adjusting the illuminance of the monochromatic light source 20 or by setting the gain, it is possible to output an image in which the material can be identified with an appropriate color or density.
  • the inspection device 10 includes an imaging device 11.
  • the object 12 includes two different colonies WC and YC as two different materials.
  • An identification unit 61 is provided that identifies colony areas, which are areas of two different types of colonies WC and YC, based on the three-band XYZ image input from the imaging device 11.
  • the XYZ image includes an X image, a Y image, and a Z image.
  • the identification unit 61 defines an area in the X image whose pixel level is within the threshold setting range as a first colony area.
  • the identification unit 61 defines an area in the Y image whose pixel level is within the threshold setting range as a second colony area.
  • the identification unit 61 defines an area in the Z image whose pixel level is within the threshold setting range as a third colony area.
  • the identification unit 61 removes the second colony area from the first colony area or the third colony area and sets the remaining colony area as a fourth colony area.
  • the identification section 61 includes a classification section 63 , a counting section 64 , and a specifying section 65 .
  • the classification unit 63 classifies at least one of the first colony area and the third colony area, the second colony area, and the fourth colony area based on characteristics and shapes.
  • the counting unit 64 counts the number of each colony area classified by the classification unit.
  • the specifying unit 65 specifies the number of two types of colony areas to be identified based on the counting results of the counting unit.
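The identification described above (first to fourth colony areas derived from the X, Y, and Z images) can be sketched with boolean masks. The threshold range is a hypothetical placeholder; the patent only says each area is where the pixel level falls within a threshold setting range.

```python
import numpy as np

def colony_areas(x_img, y_img, z_img, lo=50, hi=200):
    """Threshold each of the X, Y, Z images into candidate colony masks, then
    remove the second area from the first to obtain the fourth area."""
    first  = (x_img >= lo) & (x_img <= hi)   # area from the X image
    second = (y_img >= lo) & (y_img <= hi)   # area from the Y image
    third  = (z_img >= lo) & (z_img <= hi)   # area from the Z image
    fourth = first & ~second                 # remaining area after removal
    return first, second, third, fourth
```

The same subtraction could start from the third area instead of the first, matching the "first colony area or third colony area" wording above.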
  • A method for determining imaging conditions for the imaging device 11, which outputs an image in which regions made of two different materials included in the subject 12 can be identified.
  • the imaging device 11 includes a monochromatic light source 20 that illuminates the subject 12 and an imaging unit 30 that includes RGB pixels that captures an image of the subject 12.
  • This imaging condition determining method includes first to fifth steps.
  • In the first step, the coefficients of the matrix by which the RGB image is multiplied are set for the linear matrix calculation processing that is performed on the 3-band RGB image output from the imaging unit 30 to generate a 3-band XYZ image with corrected RGB imaging characteristics.
  • In the second step, the subject 12 is illuminated with monochromatic light capable of changing the peak wavelengths of the spectral output characteristics of the first pixel 33B and of the second pixel 33G, which is different from the first pixel 33B, among the RGB pixels forming the imaging unit 30.
  • In the third step, the imaging unit 30 images the subject 12 illuminated with the monochromatic light in the second step.
  • In the fourth step, the RGB image output from the imaging unit 30 as a result of the imaging in the third step is multiplied by the matrix to generate an XYZ image.
  • In the fifth step, it is determined whether the difference in density between regions of two different materials in the second image, output from the second pixel 33G among the XYZ images generated in the fourth step, is greater than or equal to a predetermined threshold. Until the difference in density becomes equal to or greater than the predetermined threshold in the fifth step, the first to fifth steps are repeated with at least one of a change to the coefficients in the first step and a change to the peak wavelength in the second step. According to this method of determining imaging conditions, appropriate imaging conditions can be determined under which the imaging device 11 can output an image in which regions made of two different materials (colonies WC, YC) included in the subject 12 can be identified by color or density.
  • the imaging unit 30 includes an image sensor 33 (image sensor) having RGB pixels.
  • the RGB pixels include a first pixel 33B having sensitivity in a first wavelength region and a second pixel 33G having sensitivity in a second wavelength region. The first wavelength region and the second wavelength region overlap in some overlapping wavelength regions.
  • Monochromatic light is light in an emission wavelength range that includes at least part of the first wavelength range and at least part of the overlapping wavelength range.
  • This imaging method includes an imaging step, an output step, an arithmetic processing step, and an amplification step.
  • In the imaging step, the imaging unit 30 images the subject 12 illuminated by the monochromatic light source 20.
  • In the output step, the imaging unit 30 that has imaged the subject 12 outputs a 3-band RGB image.
  • In the arithmetic processing step, a three-band XYZ image with corrected RGB imaging characteristics is generated by performing linear matrix arithmetic processing on the three-band RGB image output from the imaging unit 30.
  • In the amplification step, the two of the three-band images that correspond to the first pixel 33B and the second pixel 33G are amplified, using gains set so that the output levels of the first pixel 33B and the second pixel 33G are color-balanced between the two colors under the condition that monochromatic light is irradiated from the monochromatic light source 20.
  • With this imaging method, it is possible to output an image in which regions made of two different types of materials (colonies WC, YC) included in the subject 12 can be identified by color or density. For example, when the image is used for inspection, the two different types of materials included in the image can be identified or classified with relatively high accuracy.
  • the embodiment is not limited to the above, and may be modified in the following manner.
  • - Monochromatic light is not limited to blue light.
  • the combination of the first monochromatic light and the second monochromatic light is not limited to blue light and red light.
  • the monochromatic light emitted by the monochromatic light source 20 may be green light or red light. That is, the first monochromatic light may be green light, and the combination of the first monochromatic light and the second monochromatic light may be green light and red light.
  • In this case, an XYZ image as shown in FIG. 21 is obtained. Then, as shown in the figure, the difference ΔSg between the area Sg1 and the area Sg2 becomes relatively large. A relatively large difference therefore occurs in the ratio of the difference ΔSg to the total of the area Sg1 and the area Sg2, making it easier to distinguish between white colonies and yellow colonies in the XYZ image (particularly the B image) based on the density difference between the two.
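The ratio discussed here can be computed directly; the area values below are illustrative, not measurements from the patent.

```python
def area_difference_ratio(sg1: float, sg2: float) -> float:
    """Ratio of the area difference ΔSg to the total area Sg1 + Sg2: the
    larger this ratio, the easier the two colony types are to separate."""
    return abs(sg1 - sg2) / (sg1 + sg2)

ratio = area_difference_ratio(120.0, 40.0)  # = 80 / 160 = 0.5
```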
  • the IR cut filter 35 that blocks near-infrared light from the camera 31 may be removed, and the RGB pixels forming the image sensor may be configured to have sensitivity to near-infrared light.
  • the first monochromatic light source 25 is the blue light source 21, and B light is used as the first monochromatic light.
  • the second monochromatic light source 26 is a near-infrared LED, and near-infrared light is used as the second monochromatic light.
  • In this case, a linear matrix operation is performed on the RGB image to generate, instead of an XYZ image, for example an NYZ image containing an N image having a peak in the near-infrared region (for example, approximately 800 to 900 nm in the example of FIG. 7).
  • the inspection processing unit 60 may perform inspection processing using only two images, the Y image and the Z image, or only the two images, the X image and the Y image.
  • the monochromatic light source 20 may include a near-infrared light source that uses near-infrared light as monochromatic light as one type of monochromatic light source.
  • the B light is not limited to monochromatic light emitted by a blue LED, but may be other blue light having a peak in the blue wavelength region.
  • In the above embodiment, the main monochromatic light is B light, the first pixel is the B pixel, and the second pixel is the G pixel, but the present invention is not limited to this.
  • If the monochromatic light is R light, the first pixel may be the R pixel 33R and the second pixel may be the G pixel 33G.
  • If the monochromatic light is G light, the first pixel may be the G pixel 33G and the second pixel may be the B pixel 33B or the R pixel 33R.
  • the combination of the first monochromatic light and the second monochromatic light is not limited to blue light and red light as in the above embodiment. Moreover, it is not limited to green light and red light as shown in FIG.
  • the monochromatic light may be red light.
  • the first monochromatic light may be red light and the second monochromatic light may be blue light.
  • the number of monochromatic light sources is not limited to two types, and only one type may be used.
  • the monochromatic light source may be only a blue light source. Further, in FIG. 20, the monochromatic light source may be only a green light source.
  • the four colors of R, G1, G2, and B may be used.
  • a color camera having complementary color filters may be used, and the complementary colors may be four colors: yellow, cyan, magenta, and green.
  • Image data (for example, RGB image data) based on the first imaging signal S1 captured by the image sensor 33 of the camera 31 may be saved in a removable memory such as a USB memory.
  • the image data stored in the removable memory may be read by a personal computer, and the CPU (image processing unit 50) of the personal computer may perform conversion processing including linear matrix calculation to generate an XYZ image.
  • the device that performs the imaging step and the device that performs the conversion step may be separate devices. Also by such an imaging method, XYZ images of multiple bands can be acquired.
  • the configuration may be such that an inspector visually recognizes and inspects the XYZ image output by the imaging device 11.
  • Similar colors are not limited to white and yellow, but may also include white and light pink, white and light orange, white and light green, white and light blue, etc.
  • a combination of white and light color may be used.
  • a combination of different light colors may be used.
  • the object 12 is not limited to one that includes colonies, such as a Petri dish in which colonies are cultured, but may be an object that includes different types of identification targets. Further, the object to be identified may be a transparent object or a semi-transparent object. Furthermore, the identification target may be a food such as jelly or a processed food.
  • the subject 12 to be imaged or inspected is not particularly limited.
  • the object 12 may be, for example, a container such as a plastic bottle or a bottle, a food product, a beverage, an electronic component, an electric appliance, a daily necessities, a part, a member, or a raw material such as powder, granule, or liquid.
  • the inspection target may be scratches, dirt, printing defects, painting defects, etc.
  • the object within the subject 12 is not limited to the inspection object.
  • the objects may be of different types but exhibit similar colors.
  • the imaging device 11 may be configured as a separate device from the inspection processing section 60.
  • the imaging device 11 may be used for purposes other than inspection.
  • the arrangement pattern of the color filters 34 constituting the image sensor 33 is not limited to the RGB Bayer arrangement, but may be any arrangement pattern such as a stripe arrangement.
  • the imaging device 11 does not need to include a transport robot.
  • a configuration may be adopted in which an operator places the subject 12 on a mounting table for photographing, which serves as an imaging position.
  • At least one of the control unit 40, the image processing unit 50, and the inspection processing unit 60 may be partially or entirely configured by software, that is, by a computer that executes a program, or by hardware such as an electronic circuit.
  • 53... Amplification unit, 54... Matrix coefficient setting section, 55... Gain setting section, 60... Inspection processing section, 61... Identification section, 62... Judgment section, 63... Classification section, 64... Counting section, 65... Specification section, 70... Display section, WC... White colony, YC... Yellow colony, S1... First imaging signal, S2... Second imaging signal, VA... Visible light wavelength region, NIRA... Near-infrared wavelength region, LW... Spectral reflectance characteristic line of white colony, LY... Spectral reflectance characteristic line of yellow colony, B, G, R... Spectral sensitivity characteristic lines, X, Y, Z... Spectral sensitivity characteristic lines calculated from RGB spectral sensitivity characteristics, Gx, Gy, Gz... Gains.

Abstract

The RGB pixels of a camera (31) include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region. The first wavelength region and the second wavelength region overlap in a partially overlapping wavelength region. Monochromatic light is light in an emission wavelength region that includes at least part of the first wavelength region and at least part of the overlapping wavelength region. An image processing unit (50) includes an arithmetic processing unit (52) and an amplification unit (53). The arithmetic processing unit (52) performs linear matrix arithmetic processing on a three-band RGB image from the camera (31) to generate a three-band XYZ image with corrected RGB imaging characteristics. The gain of the amplification unit (53) is set to a value that allows the output levels of the first pixel and the second pixel to be color-balanced between two colors under monochromatic light illumination.
PCT/JP2023/016219 2022-04-28 2023-04-25 Imaging device, inspection device, imaging condition determination method, and imaging method WO2023210619A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-075219 2022-04-28
JP2022075219 2022-04-28

Publications (1)

Publication Number Publication Date
WO2023210619A1 true WO2023210619A1 (fr) 2023-11-02

Family

ID=88518956

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/016219 WO2023210619A1 (fr) Imaging device, inspection device, imaging condition determination method, and imaging method

Country Status (1)

Country Link
WO (1) WO2023210619A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017042146A (ja) * 2015-08-28 2017-03-02 株式会社エヌテック Microorganism detection method, microorganism detection device, and program
JP2021145185A (ja) * 2020-03-10 2021-09-24 株式会社エヌテック Multispectral image capturing device, inspection device, and multispectral image capturing method
JP2021190802A (ja) * 2020-05-28 2021-12-13 株式会社エヌテック Imaging device, inspection device, and imaging method

Similar Documents

Publication Publication Date Title
JP6945245B2 (ja) Appearance inspection device
KR101692115B1 (ko) Inspection apparatus and inspection method
US10393669B2 (en) Colour measurement of gemstones
US8049892B2 (en) Apparatus and method for camera-based color measurements
US20060132777A1 (en) Systems and methods for augmenting spectral range of an LED spectrophotometer
CN104062006B Light measurement device, printing device, and image display device
US11408819B2 (en) Process and system for identifying the gram type of a bacterium
JPH11510623A (ja) Apparatus and method for inspecting sheet material such as banknotes or securities
KR20080080998A (ko) Defect inspection device that performs defect inspection by image analysis
KR20070065228A (ко) Paper sheet discrimination device, paper sheet processing device, and paper sheet discrimination method
JP2008209211A (ja) Foreign matter inspection device and foreign matter inspection method
JP2021113744A (ja) Imaging system
US20120114218A1 (en) High fidelity colour imaging of microbial colonies
EP0660277B1 Method and device for characterizing and differentiating banknotes and legal documents
KR102240757B1 Real-time detection system for foreign matter in processed vegetables using LCTF-based multispectral imaging technology
WO2023210619A1 Imaging device, inspection device, imaging condition determination method, and imaging method
CN116148265A Defect analysis method and system based on high-quality image acquisition of synthetic leather
US20220343478A1 (en) Brightness and color correction of image data of a line camera
US10091443B2 (en) Camera system and method for inspecting and/or measuring objects
TW202409544A Imaging device, inspection device, imaging condition determination method, and imaging method
CN111886492B Color grading process and system for jadeite
JP2006300615A (ja) Appearance inspection device and appearance inspection method
JPH09185711A (ja) Shape recognition method and device
JP4270604B2 (ja) Image input device
JP2020005053A (ja) Method for measuring spectral sensitivity of an image sensor, method for inspecting a spectral sensitivity measurement device, and spectral sensitivity measurement device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23796359

Country of ref document: EP

Kind code of ref document: A1