WO2023210619A1 - Imaging device, inspection device, imaging condition determination method, and imaging method - Google Patents


Info

Publication number
WO2023210619A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
light source
monochromatic light
imaging
Prior art date
Application number
PCT/JP2023/016219
Other languages
French (fr)
Japanese (ja)
Inventor
透 渡邊
勝蘭 李
雄 吉田
邦光 豊島
Original Assignee
株式会社エヌテック
株式会社ヤクルト本社
東邦商事株式会社
Priority date
Filing date
Publication date
Application filed by 株式会社エヌテック, 株式会社ヤクルト本社, 東邦商事株式会社 filed Critical 株式会社エヌテック
Publication of WO2023210619A1 publication Critical patent/WO2023210619A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements

Definitions

  • The present invention relates to an imaging device, an inspection device, an imaging condition determination method, and an imaging method that output images in which different types of similarly colored objects appearing in an image of a subject are easy to identify.
  • Patent Documents 1 to 3 disclose inspection devices that include an imaging device that images, for example, a colony as an imaging target.
  • the number of colonies of microorganisms cultured in a petri dish, down to minute colonies of various shapes, can be counted accurately and in a short time with a simple device configuration using a black-and-white CCD camera, without requiring complicated pretreatment of the culture medium.
  • Patent Documents 2 and 3 disclose colony detection systems that can accurately identify two or more types of colonies having different color features. To do so, these colony detection systems classify colony pixels into predetermined types based on the color features of the colony pixels.
  • An imaging device that solves the above problems is an imaging device that images a subject and outputs an image in which regions made of two different materials included in the subject can be identified. The imaging device includes a monochromatic light source that illuminates the subject with monochromatic light, an imaging unit that includes RGB pixels and captures the subject, and an image processing unit that performs predetermined processing on the output of the RGB pixels. The RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region; the first wavelength region and the second wavelength region overlap in a partial overlapping wavelength region, and the monochromatic light has an emission wavelength range that includes at least a part of the first wavelength region and at least a part of the overlapping wavelength region.
  • the image processing unit includes a calculation processing unit that generates a three-band XYZ image with corrected RGB imaging characteristics by performing linear matrix calculation processing on the three-band RGB image, and an amplifying section that amplifies, among the XYZ values constituting the XYZ image, each value corresponding to the first pixel and the second pixel, with gains set to values that allow the output levels of the first pixel and the second pixel to be balanced between the two colors under the condition that monochromatic light is irradiated from the monochromatic light source.
  • the imaging device can output an image in which regions made of two different materials included in the subject can be identified by color or density. For example, when an image is used for inspection, two different types of materials included in the image can be identified or classified with relatively high accuracy.
  • density is a concept equivalent to pixel level, signal level, pixel value, etc.
  • the image processing unit may generate the three-band XYZ image by performing black balance processing on a reference portion of the subject other than the identification target. According to this configuration, since black balance processing is performed, even if the environment when photographing a subject changes, it is possible to output an image in which the material can be identified with appropriate color or density.
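The black balance processing described above can be sketched as follows. This is a minimal illustration, not the publication's implementation; the use of a background-sheet region as the black reference and the function name are assumptions.

```python
import numpy as np

def black_balance(rgb_image, black_region):
    """Subtract the mean level of a black reference region (e.g. the
    background sheet) from each channel, clipping at zero."""
    # mean RGB level of the reference region (a pair of row/column slices)
    black_level = rgb_image[black_region].reshape(-1, 3).mean(axis=0)
    balanced = rgb_image.astype(np.float64) - black_level
    return np.clip(balanced, 0.0, None)

# toy 4x4 RGB image: uniform dark background, one bright "colony" patch
img = np.full((4, 4, 3), 10.0)
img[2:, 2:] = 200.0
corrected = black_balance(img, (slice(0, 2), slice(0, 4)))
```

Because the reference level is re-measured on every capture, a drift in ambient light is removed from the whole image before any thresholding.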
  • the monochromatic light source is a blue light source or a red light source
  • the first pixel is a B pixel
  • the second pixel is a G pixel
  • the first pixel may be an R pixel
  • the second pixel may be a G pixel
  • the G pixel has sensitivity, in at least part of the overlapping wavelength region, to blue light, which is the light source color of the blue light source. Therefore, when capturing an image under blue illumination, the sensitive wavelength range of the G image can be narrowed compared to that of a general-purpose camera. Likewise, the G pixel has sensitivity, in at least part of the overlapping wavelength region, to red light, which is the light source color of the red light source, so when capturing an image under red illumination, the sensitive wavelength range of the G image can also be narrowed compared to that of a general-purpose camera. By performing the linear matrix calculation on the RGB image, the narrowed sensitive wavelength region of the G image can be narrowed further or shifted in wavelength. Therefore, it is possible to output an image in which the material can be identified with appropriate color or density.
  • the monochromatic light source is a first monochromatic light source, and the imaging device further includes a second monochromatic light source capable of emitting a second monochromatic light having an emission wavelength range different from the emission wavelength range of the first monochromatic light emitted by the first monochromatic light source
  • the RGB pixels include a third pixel in addition to the first pixel and the second pixel
  • the control unit performs illumination with the first monochromatic light source and the second monochromatic light source simultaneously.
  • the gain setting section may set a gain to be given to the output level of the third pixel according to each gain given to the output level of the second pixel.
  • An inspection apparatus for solving the above problem is an inspection apparatus equipped with the above-mentioned imaging device, wherein the subject includes two different types of colonies as the two different types of materials. The inspection apparatus includes an identification unit that identifies colony areas, which are areas of the two different types of colonies, based on the three-band XYZ image input from the imaging device. The XYZ image includes an X image, a Y image, and a Z image. The identification unit defines the area whose pixel level in the X image is within a threshold setting range as a first colony area, the area whose pixel level in the Y image is within a threshold setting range as a second colony area, and the area whose pixel level in the Z image is within a threshold setting range as a third colony area, and defines the colony area remaining after excluding the second colony area from the first colony area or the third colony area as a fourth colony area. The inspection apparatus further includes a classification unit that classifies at least one of the first and third colony areas, the second colony area, and the fourth colony area based on characteristics and shapes; a counting section that counts the number of each of the colony areas classified by the classification unit; and a specifying section that specifies the number of the two types of colony regions to be identified based on the counting results of the counting section.
  • A method for determining imaging conditions for solving the above problem is a method for determining imaging conditions for an imaging device that outputs an image in which regions made of two different materials included in a subject can be identified. The imaging device is equipped with a monochromatic light source that illuminates the subject with monochromatic light and an imaging section that includes RGB pixels and images the subject, and generates a three-band XYZ image with corrected RGB imaging characteristics by performing linear matrix calculation processing on the three-band RGB image output from the imaging section. In this method, the emission wavelength of the monochromatic light source is changed so that the peak wavelengths of the spectral output characteristics of a first pixel and of a second pixel different from the first pixel change, and imaging is performed using the monochromatic light source after the emission wavelength change.
  • According to this method for determining imaging conditions, it is possible to determine appropriate imaging conditions that enable the imaging device to output an image in which regions made of two different materials included in the subject can be distinguished by color or density.
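The wavelength search can be illustrated with a toy model. The Gaussian sensitivity curves, the candidate wavelengths, and the peak-separation criterion are all illustrative assumptions; only the idea of imaging with the emission wavelength that best separates the two pixels' spectral output peaks comes from the method above.

```python
import numpy as np

wl = np.arange(400.0, 701.0, 1.0)  # wavelength axis in nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# assumed spectral sensitivities of the first (B) and second (G) pixels
sens_b = gaussian(460.0, 30.0)
sens_g = gaussian(540.0, 40.0)

def peak_separation(emission_center):
    """Separation between the peak wavelengths of the two pixels'
    spectral output characteristics (sensitivity x source spectrum)."""
    source = gaussian(emission_center, 15.0)
    peak_b = wl[np.argmax(sens_b * source)]
    peak_g = wl[np.argmax(sens_g * source)]
    return abs(peak_g - peak_b)

# try candidate LED center wavelengths and keep the most separating one
candidates = range(430, 521, 10)
best = max(candidates, key=peak_separation)
```

In practice the candidates would correspond to available LEDs, and the chosen source would then be used for the actual capture.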
  • An imaging method that solves the above problems is an imaging method that images a subject and outputs an image in which regions made of two different materials included in the subject can be identified. The imaging method uses an imaging unit that captures the subject and a monochromatic light source that illuminates the subject with monochromatic light. The imaging unit includes an imaging element having RGB pixels, and the RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region. The first wavelength region and the second wavelength region overlap in a partial overlapping wavelength region, and the monochromatic light has an emission wavelength range that includes at least a part of the first wavelength region and at least a part of the overlapping wavelength region. The method includes an imaging step in which the imaging unit images the subject illuminated by the monochromatic light source, and a step of generating a three-band XYZ image by performing linear matrix calculation processing on the three-band RGB image output from the imaging unit that has imaged the subject.
  • According to this imaging method, it is possible to output an image in which regions made of two different materials included in the subject can be identified by color or density. For example, by using an image output by this method, two different types of materials included in the image can be identified or classified with relatively high accuracy.
  • FIG. 3 is a schematic plan view showing a subject.
  • FIG. 1 is a schematic diagram showing a schematic configuration of a camera, and a graph showing the relative sensitivity of the camera for each RGB color.
  • A partial schematic diagram showing the detailed structure of a monochromatic light source.
  • FIG. 2 is a block diagram showing the functional configuration of the inspection device.
  • FIG. 7 is a graph illustrating a case where it is difficult to identify different types of imaging targets.
  • FIG. 7 is a graph illustrating the principle of making it easier to identify different types of imaging targets.
  • FIG. 12 is a graph illustrating a case where it is difficult to identify different types of imaging targets in images captured using a general-purpose camera.
  • FIG. 3 is a flowchart showing an initial setting routine.
  • FIG. 3 is a flowchart showing an imaging processing routine.
  • FIG. 3 is a flowchart showing an inspection processing routine.
  • FIG. 3 is a flowchart showing an imaging condition determination processing routine.
  • A graph showing the relationship between the wavelength of the light source and the output power (light intensity) in a modified example.
  • A graph showing the relationship between wavelength and relative output after linear matrix calculation processing when illuminating with green and red monochromatic light in a modified example.
  • A graph explaining the principle by which different types of imaging targets are identified in the output image of the imaging device captured using a general-purpose camera in a modified example.
  • This embodiment is an example in which an imaging device 11 that captures an image of a subject 12 is included in an inspection apparatus 10 that inspects the subject 12. Note that the imaging device 11 may be included in a device other than the inspection device 10, or may be used alone.
  • the inspection device 10 includes an imaging device 11 that images the subject 12, and an inspection processing unit 60 that inspects the subject 12 using images input from the imaging device 11.
  • the inspection device 10 may include a display section 70 that displays the inspection results of the inspection processing section 60 and the like.
  • the display unit 70 may display an image output from the imaging device 11, an image obtained by processing the image, or the like.
  • the display unit 70 may be, for example, a monitor connected to a personal computer, or a display provided on an operation panel included in the inspection device 10.
  • the imaging device 11 includes a monochromatic light source 20 that illuminates the subject 12 with monochromatic light, an imaging unit 30 that captures an image of the subject 12, a control unit 40 that controls the monochromatic light source 20 and the imaging unit 30, and an image processing unit 50 that performs predetermined processing on the imaging signal that the imaging unit 30 outputs as its imaging result.
  • the imaging device 11 also includes a power source 45 that supplies power to the monochromatic light source 20.
  • the imaging unit 30 is, for example, a color camera 31.
  • the color camera 31 (hereinafter also simply referred to as "camera 31") is electrically connected to the processing section 18.
  • the processing section 18 includes the aforementioned control section 40 and image processing section 50.
  • the processing unit 18 may include at least one of an electronic circuit and a computer.
  • the monochromatic light source 20 may be one type of monochromatic light source that emits one type of monochromatic light, or may include two types of monochromatic light sources that emit two types of monochromatic light. Alternatively, the monochromatic light source 20 may include three types of monochromatic light sources that emit three types of monochromatic light. When a plurality of types of monochromatic light sources are provided, the control unit 40 may be configured to cause one or two of them to emit light.
  • one type of monochromatic light source refers to any one of a blue light source, a red light source, and a green light source.
  • the monochromatic light source is composed of, for example, an LED.
  • the monochromatic light source may be, for example, any one of a blue LED, a red LED, and a green LED.
  • the monochromatic light source 20 of this example includes at least a blue LED and a red LED. Therefore, in the example shown in FIG. 1, the power supply 45 includes at least a blue LED power supply 46 and a red LED power supply 47.
  • the control unit 40 includes an imaging control unit 41 that controls the imaging unit 30 and an illumination control unit 42 that controls the monochromatic light source 20.
  • the illumination control unit 42 controls turning on and off of the monochromatic light source 20 via a power source 45.
  • the monochromatic light source 20 includes a plurality of types of monochromatic light sources (LEDs)
  • the lighting control unit 42 individually controls turning on and off for each monochromatic light source via the power source 45. Therefore, the subject 12 is illuminated with monochromatic light by the monochromatic light source 20.
  • the monochromatic light source 20 may illuminate the subject 12 with two types of monochromatic light.
  • the inspection device 10 may include a robot that transports objects to be transported, such as a petri dish 13 as the subject 12 and a reference sheet for color gain adjustment, to a predetermined position.
  • the control unit 40 may control the robot, or a control unit different from the control unit 40 may control the robot.
  • the image processing section 50 performs predetermined image processing on the first imaging signal S1 of three RGB bands output from the camera 31 constituting the imaging section 30, and generates a second imaging signal S2 of three XYZ bands (see FIG. 5 for both).
  • RGB in the first image signal S1 refers to three colors determined from each wavelength range that can be received by the R pixel, G pixel, and B pixel that constitute the image sensor 33 of the camera 31.
  • the XYZ referred to in the second imaging signal S2 refers to three colors determined from the respective wavelength regions of the XYZ image obtained by converting the RGB image forming the first imaging signal S1 into an XYZ image having wavelength regions different from those of RGB.
  • the inspection processing unit 60 inspects the subject 12 using the XYZ image based on the second imaging signal S2 input from the image processing unit 50.
  • the subject 12 includes colonies WC and YC cultured in a Petri dish 13. That is, the subject 12 includes a petri dish 13, a medium 14 in the petri dish 13, and colonies WC and YC cultured in the medium 14. Note that colonies WC and YC do not occur at the beginning of the culture period or when there are no bacteria, so colonies WC and YC may not exist depending on the subject 12.
  • two or more different types of colonies WC and YC exist.
  • the different types of colonies WC and YC are two types having similar colors.
  • foreign matter FS as shown in FIG. 2 may exist in the culture medium 14.
  • the foreign substances FS include dietary fiber, food particles, seeds, skin, fruit pulp, etc. contained in the food or drinking water.
  • Foreign matter FS may be mistakenly identified as colonies WC and YC. Therefore, the inspection processing unit 60 identifies the colonies WC, YC and the foreign object FS based on their characteristics.
  • the colonies WC and YC are distinguished from the foreign matter FS based on characteristic factors such as shape, area, and color.
  • Colonies WC and YC generally have a circular or oval shape, and the size is within a predetermined range.
  • if the foreign matter FS is a fiber, its shape is elongated.
  • if the foreign matter FS is fruit pulp, seeds, skin, or the like, its shape differs from that of the colonies WC and YC.
  • the inspection processing unit 60 separates colony candidates into the colonies WC, YC and other objects based on an area (size) threshold.
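The area- and shape-based discrimination can be sketched as below. The circularity measure and every threshold value are illustrative assumptions, not values from the publication.

```python
import math

def classify_candidate(area, perimeter, min_area=20.0, max_area=500.0):
    """Classify a colony candidate by an area threshold and a shape test.
    All thresholds here are illustrative."""
    if not (min_area <= area <= max_area):
        return "other"  # too small or too large to be a colony
    # circularity: 1.0 for a perfect circle, near 0 for elongated fibers
    circularity = 4.0 * math.pi * area / (perimeter ** 2)
    return "colony" if circularity > 0.6 else "foreign matter"

# a round blob (circle of radius 5 px) vs an elongated fiber-like region
round_blob = classify_candidate(area=math.pi * 25, perimeter=2 * math.pi * 5)
fiber = classify_candidate(area=50.0, perimeter=102.0)  # ~1x50 px strip
```

Circles and ovals (the typical colony shapes noted above) score high on circularity, while fibers score very low, so a single ratio separates them.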
  • the petri dish 13 is placed on a background sheet 15 (background board).
  • the background sheet 15 serves as a background for the petri dish 13, which is the subject 12.
  • the background sheet 15 may be, for example, a black diffusion plate. By making the background sheet 15 black, it may be used as a black reference for black balance.
  • the petri dish 13 is made of a colorless and transparent material (for example, glass).
  • the culture medium 14 contained in the petri dish 13 is colored as necessary. In that case, the culture medium 14 is colored in a predetermined color and may be translucent. In the image captured by the camera 31, the black of the background sheet 15 may show through the culture medium 14.
  • the culture medium 14 may be colored depending on the color of the food or drinking water.
  • the black background sheet 15 may be a black plate such as a black sheet that does not have a diffusion function of diffusing light. Further, when using the background sheet 15 as a reference for black balance, it is not limited to a black (achromatic color) sheet, but a chromatic color sheet may be used.
  • the two types of colonies schematically shown in FIGS. 1 and 2 are white colony WC and yellow colony YC.
  • the white colony WC and the yellow colony YC are different types of colonies. Therefore, for inspection, it is necessary to distinguish between white colony WC and yellow colony YC.
  • since the white of the white colony WC and the yellow of the yellow colony YC are similar colors, it is difficult to distinguish them in the RGB image of the subject 12 captured by the camera 31. Further, in the R image, G image, and B image that constitute the RGB image, the white colony WC and the yellow colony YC have similar densities, so it is difficult to distinguish them.
  • the image processing unit 50 performs predetermined image processing on the first imaging signal S1 from the camera 31, thereby generating an XYZ image that makes it easy to distinguish between the white colony WC and the yellow colony YC. Details of the XYZ image will be described later.
  • the camera 31 is a general-purpose color camera that captures RGB images.
  • the camera 31 includes a lens 32 assembled to a lens barrel 31a, a near-infrared light cut filter 35 (hereinafter also referred to as IR cut filter 35) that blocks near-infrared light, and an image sensor 33.
  • the image sensor 33 includes an R pixel 33R, a G pixel 33G, and a B pixel 33B, each of which is a light receiving element.
  • the R pixel 33R receives red light (R light) that has passed through the R filter 34R, and outputs an R pixel signal according to the amount of received light.
  • the G pixel 33G receives green light (G light) that has passed through the G filter 34G, and outputs a G pixel signal according to the amount of received light.
  • the B pixel 33B receives blue light (B light) that has passed through the B filter 34B, and outputs a B pixel signal according to the amount of received light.
  • an R pixel 33R, a G pixel 33G, and a B pixel 33B are arranged in a predetermined arrangement.
  • This image sensor 33 has RGB imaging characteristics in which near-infrared light is cut.
  • the R pixel 33R, the G pixel 33G, and the B pixel 33B are sensitive to light in respective wavelength bands shown in the graph in FIG.
  • the horizontal axis is wavelength and the vertical axis is relative sensitivity.
  • the R pixel 33R has high sensitivity to light in the red (R) wavelength band shown in the graph in FIG.
  • the G pixel 33G has high sensitivity to light in the green (G) wavelength band shown in the graph.
  • the B pixel 33B has high sensitivity to light in the blue (B) wavelength band shown in the graph.
  • the monochromatic light source 20 includes monochromatic light sources of three colors (three types): a blue light source 21, a green light source 22, and a red light source 23.
  • the blue light source 21 emits blue light (B light) as monochromatic light.
  • the green light source 22 emits green light (G light) as monochromatic light.
  • the red light source 23 emits red light (R light) as monochromatic light.
  • the illumination control unit 42 of this embodiment illuminates the subject 12 with blue light (B light) by causing the blue light source 21 to emit light as a monochromatic light source. Note that when the emission color (light source color) need not be distinguished, the blue light source 21, green light source 22, and red light source 23 that constitute the monochromatic light source 20 may be referred to as monochromatic light sources 21 to 23.
  • the monochromatic light source 20 may have the function of a white light source. That is, the monochromatic light source 20 also has a white light source function of emitting white light by emitting three types of monochromatic light sources 21 to 23 at the same time.
  • the control unit 40 can also cause any two (two colors) of the three types of monochromatic light sources 21 to 23, which together can emit monochromatic light of the three RGB colors, to emit light, so that the white light source can also function as a monochromatic light source.
  • a white light source having the function of such a monochromatic light source is also included in the monochromatic light source 20.
  • the first imaging in which the subject 12 is imaged by illuminating with one type or two types of monochromatic light, and the second imaging in which the subject 12 is imaged by illuminating with white light may be performed in a time-sharing manner.
  • images captured using white light may also be used for the inspection.
  • the control unit 40 determines which monochromatic light source emits light based on the spectral reflectance characteristics of the colony to be inspected.
  • of the two types of monochromatic light sources that emit light simultaneously when photographing the subject 12, one is the first monochromatic light source 25, which is the main monochromatic light source, and the other is the second monochromatic light source 26, which is an auxiliary light source.
  • the blue light source 21 is used as a first monochromatic light source 25
  • the red light source 23 is used as a second monochromatic light source 26.
  • the blue light source 21 serves as the main first monochromatic light source 25, and the red light source 23, which can emit a second monochromatic light (R light in this example) having an emission wavelength range different from the emission wavelength range of the first monochromatic light (B light in this example) that illuminates the subject 12, serves as the second monochromatic light source 26, the auxiliary light source.
  • the control unit 40 illuminates the subject 12 using blue light (B light) as main monochromatic light and illuminates the subject 12 using red light (R light) as auxiliary light. Which two of the three RGB colors are used as a combination of the first monochromatic light source 25 and the second monochromatic light source 26 is determined according to the spectral reflectance characteristics of the object to be inspected.
  • the monochromatic light sources 21 to 23 are, for example, LEDs.
  • the blue light source 21 is a blue LED
  • the green light source 22 is a green LED
  • the red light source 23 is a red LED.
  • the control unit 40 is capable of individually and independently controlling on/off the three types of monochromatic light sources 21 to 23.
  • the control unit 40 lights, as the main light, the blue LED determined from the imaging condition search performed in advance so as to identify the white colonies WC and yellow colonies YC to be inspected, and lights the red LED as an auxiliary light. Note that a method for determining imaging conditions, including a method for determining the monochromatic light used for illumination, will be described later.
  • in the description of the image processing section 50 and the inspection processing section 60, the R image signal, G image signal, and B image signal are also simply referred to as the R signal, G signal, and B signal.
  • the image processing section 50 includes an RGB separation section 51, an arithmetic processing section 52, and an amplification section 53.
  • the RGB separation unit 51 separates the first image signal S1 input from the image sensor 33 into an R image signal, a G image signal, and a B image signal.
  • the arithmetic processing unit 52 converts the R signal, G signal, and B signal input from the RGB separation unit 51 into an X signal, a Y signal, and a Z signal. Specifically, the arithmetic processing unit 52 converts the RGB values, which are the signal values of the R signal, G signal, and B signal, into the X signal, Y signal, and Z signal by performing a linear matrix calculation.
  • the arithmetic processing unit 52 is given matrix coefficients.
  • the matrix used for the linear matrix calculation is a 3 ⁇ 3 matrix.
  • the arithmetic processing unit 52 is given coefficients of a 3 ⁇ 3 matrix.
  • the arithmetic processing unit 52 performs a linear matrix operation of multiplying the RGB values of the first imaging signal S1 by a 3 ⁇ 3 matrix specified using the matrix coefficient, and calculates the spectral characteristics different from the RGB values of the first imaging signal S1. It is converted into a second image pickup signal S2 expressed by XYZ with .
  • the matrix coefficient is a coefficient for separating RGB of the first image signal S1 into XYZ bands of the second image signal S2.
  • in equation (1), a1 to a3, b1 to b3, and c1 to c3 are matrix coefficients, and Gx, Gy, and Gz are gains (amplification factors). Equation (1) corresponds to:

    X = Gx(a1·R + a2·G + a3·B)
    Y = Gy(b1·R + b2·G + b3·B)
    Z = Gz(c1·R + c2·G + c3·B) … (1)

    These matrix coefficients are set in the matrix coefficient setting section 54 shown in FIG. Further, the gains Gx, Gy, and Gz are set in the gain setting section 55 shown in FIG.
  • the gain setting unit 55 sets gain values that allow the output levels of the first pixel and the second pixel among the RGB pixels of the image sensor 33 to be balanced between the two colors under the condition that the first monochromatic light is irradiated from the first monochromatic light source 25. In the example shown in the figure, the gains are set to values that allow the output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel, to be balanced between the two colors (B and G) under the condition that B light, which is the first monochromatic light, is irradiated from the blue light source 21, which is the first monochromatic light source 25.
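A sketch of how such gains might be derived from a reference capture under blue illumination. The mapping of the X, Y, Z bands to the R, G, B pixels and the target level are assumptions for illustration.

```python
import numpy as np

def balance_gains(xyz_ref, target=200.0):
    """Choose per-band gains so the mean levels of the bands corresponding
    to the G pixel (Y) and the B pixel (Z) reach a common target under
    monochromatic (blue) illumination. Mapping X~R, Y~G, Z~B is assumed."""
    means = xyz_ref.reshape(-1, 3).mean(axis=0)  # mean X, Y, Z levels
    return target / means  # Gx, Gy, Gz

# reference capture under blue light: Z (B pixel) strong, Y (G pixel) weaker
ref = np.zeros((2, 2, 3))
ref[..., 0] = 10.0   # X band
ref[..., 1] = 50.0   # Y band
ref[..., 2] = 250.0  # Z band
gx, gy, gz = balance_gains(ref)
balanced = ref * np.array([gx, gy, gz])
```

After the gains are applied, the two bands of interest sit at the same level, so any remaining level difference between regions comes from the subject itself.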
  • the arithmetic processing unit 52 performs a linear matrix arithmetic process of multiplying the RGB values by a 3 ⁇ 3 matrix in the above equation (1).
  • the arithmetic processing unit 52 outputs the XYZ values before being multiplied by a gain (amplification factor) to the amplification unit 53.
  • matrix coefficients that can improve the separation of the three bands are set in the 3×3 matrix.
  • the amplification unit 53 multiplies the XYZ values before normalization from the arithmetic processing unit 52 by the X gain Gx, Y gain Gy, and Z gain Gz set in the gain setting unit 55, respectively.
  • the amplification unit 53 multiplies the X value after XYZ conversion by the X gain Gx, the Y value by the Y gain Gy, and the Z value by the Z gain Gz.
  • the amplification unit 53 outputs the normalized XYZ values as the second image pickup signal S2.
  • thereby, the amplification unit 53 can balance the output levels of the B pixel 33B and the G pixel 33G between the two colors under the condition that monochromatic light (B light) is irradiated from the blue light source 21.
  • the image processing unit 50 sequentially performs RGB separation processing, linear matrix calculation processing (XYZ conversion processing), and normalization processing on the input first imaging signal S1, thereby outputting the second imaging signal S2.
  • the three-band RGB image signal is converted into the three-band XYZ image signal.
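The conversion of the three-band RGB signal into the three-band XYZ signal can be sketched as follows; the matrix coefficients and gains are arbitrary illustrative values, not those of the embodiment.

```python
import numpy as np

# assumed 3x3 matrix coefficients a1..a3, b1..b3, c1..c3 (illustrative only)
M = np.array([[ 1.6, -0.4, -0.2],   # a1 a2 a3
              [-0.3,  1.5, -0.2],   # b1 b2 b3
              [-0.1, -0.5,  1.6]])  # c1 c2 c3
GAINS = np.array([1.0, 1.2, 0.9])   # Gx, Gy, Gz

def rgb_to_xyz(rgb_image):
    """Linear matrix calculation (XYZ conversion) followed by per-band
    gain multiplication, mirroring the S1 -> S2 processing order."""
    flat = rgb_image.reshape(-1, 3).astype(np.float64)
    xyz = flat @ M.T          # multiply each RGB triplet by the 3x3 matrix
    xyz *= GAINS              # amplification (normalization) step
    return xyz.reshape(rgb_image.shape)

pixel = np.array([[[100.0, 50.0, 25.0]]])  # a single RGB pixel
out = rgb_to_xyz(pixel)
```

The negative off-diagonal coefficients are what narrow or shift each band's effective sensitive wavelength region relative to the raw RGB channels.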
  • the inspection processing section 60 includes an identification section 61 and a determination section 62.
  • the identification section 61 includes a classification section 63, a counting section 64, and a specifying section 65.
  • the subject 12 includes two different types of colonies WC and YC as two different types of materials.
  • the identification unit 61 identifies colony areas, which are areas of two different types of colonies WC and YC, based on the three-band XYZ image input from the imaging device 11.
  • the XYZ image includes an X image, a Y image, and a Z image.
  • the identification unit 61 defines an area in which the pixel values constituting the X image are within the range set by the first threshold value as a first colony area. Further, the identification unit 61 defines an area in which the pixel values forming the Y image are within the range set by the second threshold value as a second colony area.
  • the identification unit 61 defines an area where the pixel values forming the Z image are within the range set by the third threshold value as a third colony area.
  • the identification unit 61 removes the second colony area from the first colony area or the third colony area and sets the remaining colony area as a fourth colony area.
  • the first colony area or the third colony area is an area containing all of the colonies WC and YC in the subject 12.
  • the second colony region is a region of white colony WC or yellow colony YC.
  • the calculation formula for the fourth colony region described here is an example of the case of FIG. 12, and the calculation formula for the fourth colony region changes depending on the characteristics of the colony to be identified.
  • the first to third thresholds are thresholds for setting a specific range of pixel values; they are not limited to a single intermediate threshold, and two thresholds (an upper threshold and a lower threshold) or four thresholds setting two intervals may be used.
  • the identification unit 61 binarizes the area specified within the threshold range and the other areas using the threshold value, and specifies the area to be inspected extracted by the binarization.
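The threshold-based region extraction and the set operation that yields the fourth colony area can be sketched as follows. The one-dimensional "images" and threshold values are purely illustrative:

```python
import numpy as np

def extract_region(image, lower, upper):
    """Binarize an image: True where pixel values fall inside the
    threshold range (lower and upper thresholds)."""
    return (image >= lower) & (image <= upper)

# Toy pixel rows: the X/Z images show all colonies at a similar level,
# while the Y image separates one colony type from the other.
x_img = np.array([0, 80, 80, 0, 80])
y_img = np.array([0, 90, 20, 0, 90])

first  = extract_region(x_img, 60, 100)  # first colony area (all colonies)
second = extract_region(y_img, 60, 100)  # second colony area (one type)
fourth = first & ~second                 # fourth colony area (remaining type)
```

As noted in the text, the exact set operation depends on the characteristics of the colonies to be identified; the subtraction above matches the FIG. 12 case.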
  • the classification unit 63 classifies at least one of the first colony area and the third colony area, the second colony area, and the fourth colony area based on characteristics such as shape and size (area).
  • the counting unit 64 counts the number of each colony area classified by the classification unit 63.
  • the specifying unit 65 specifies the number of two types of colony areas to be identified based on the counting results of the counting unit 64.
  • the determination unit 62 determines the quality of the test object obtained as a result of culturing colonies in the Petri dish 13 of the subject 12 based on the counts of the two types of colony regions.
  • when the inspection device 10 is a colony counter, it may be provided with a determination function for determining the quality of the inspection object as an auxiliary function, or it may be configured without the determination function of the determination section 62.
  • when the inspection target of the inspection device 10 is not a colony, the main function may be a determination function for determining the quality of the inspection target.
  • FIG. 6 shows an example of spectral reflectance characteristic lines LW and LY of white colony WC and yellow colony YC, and spectral sensitivity characteristic line B indicating the spectral sensitivity characteristic of B pixel as an example of RGB pixels.
  • the horizontal axis shows wavelength (nm), and the vertical axis shows relative sensitivity.
  • the spectral reflectance characteristic lines LW and LY of the two types of colonies shown in this graph are hypothetical waveforms for convenience of explanation.
  • a dedicated spectral reflectance measurement device is required to measure the spectral reflectance characteristics of an inspection target such as a colony.
  • the spectral reflectance characteristics of white colony WC and yellow colony YC are similar. Further, the culture medium 14 in the petri dish 13 may be colored in a predetermined color. Therefore, due to the coloring of the medium 14, it becomes even more difficult to distinguish between the white colony WC and the yellow colony YC.
  • the spectral reflectance characteristic line LW of the white colony WC and the spectral reflectance characteristic line LY of the yellow colony YC are similar over a wide wavelength range.
  • a spectral sensitivity characteristic line B of the B pixel shown by a broken line in FIG. 6 indicates the relative sensitivity of the B pixel, and has a peak near 430 nm.
  • the spectral reflectance characteristic lines LW and LY of each colony WC and YC both have peaks that overlap with the spectral sensitivity characteristic line B. Furthermore, both have a peak in a predetermined wavelength region of the near-infrared region NIRA.
  • the B pixel 33B is sensitive to B light (light in the blue to green region), which is a light component in the range of 300 to 600 nm, as shown by the broken line in FIG. 6. Therefore, the B pixel 33B receives the blue component of the B light of the reflected light from the white colony. Further, the B pixel 33B receives the blue component of the B light of the reflected light from the yellow colony.
  • the density of the white colony WC in the B image is proportional to the area Sw of the region LW*B, which is obtained by multiplying the spectral reflectance characteristic line LW of the white colony WC in FIG. 6 by the spectral sensitivity characteristic line B.
  • the density of the yellow colony YC in the B image is proportional to the area Sy of the region LY*B, which is obtained by multiplying both the spectral reflectance characteristic line LY and the spectral sensitivity characteristic line B of the yellow colony YC in FIG. 6.
  • the two areas Sw and Sy have approximately the same value. Therefore, the densities of the two types of colonies WC and YC in the B image are approximately the same.
  • if the spectral sensitivity characteristic of the B pixel is limited to a narrow wavelength range (400 to 500 nm) in which the difference between the two spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC is relatively large, the white colony WC and the yellow colony YC can be distinguished in the B image. That is, the density of the white colony WC in the B image is proportional to the area Sw of the region LW*B, which is obtained by multiplying the spectral reflectance characteristic line LW of the white colony WC in FIG. 7 by the spectral sensitivity characteristic line B.
  • the density of the yellow colony YC in the B image is proportional to the area Sy of the region LY*B, which is obtained by multiplying both the spectral reflectance characteristic line LY and the spectral sensitivity characteristic line B of the yellow colony YC in FIG. 7.
  • the two types of colonies WC and YC can be identified in the B image.
  • if the spectral sensitivity characteristic of the pixel (narrow broken line N) is limited to the near-infrared wavelength region (800 to 900 nm) where the difference between the two spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC is relatively large, the two types of colonies WC and YC can be distinguished in the IR image.
  • strictly speaking, Sw indicates the area of the region where LW and B overlap, not LW*B; however, since the difference from LW*B is small, in these figures LW*B is substituted by the area of the region where LW and B overlap, and LY*B is likewise substituted by the area of the region where LY and B overlap.
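Numerically, the "area of the product region" that the densities are proportional to is just the wavelength-wise sum of reflectance times sensitivity. The sketch below uses arbitrary Gaussian stand-ins for LW, LY, and B, not the actual spectral data:

```python
import numpy as np

wavelengths = np.arange(400.0, 701.0, 10.0)  # nm

def gaussian(lam, peak, width):
    return np.exp(-((lam - peak) / width) ** 2)

# Arbitrary stand-ins for the spectral curves (not measured data).
LW = 0.9 * gaussian(wavelengths, 450.0, 60.0)  # white-colony reflectance
LY = 0.7 * gaussian(wavelengths, 470.0, 60.0)  # yellow-colony reflectance
B  = gaussian(wavelengths, 430.0, 30.0)        # B-pixel sensitivity

def product_area(refl, sens, d_lambda=10.0):
    """Area of the product region, e.g. Sw for LW*B."""
    return float(np.sum(refl * sens) * d_lambda)

Sw = product_area(LW, B)  # ∝ white-colony density in the B image
Sy = product_area(LY, B)  # ∝ yellow-colony density in the B image
```

With curves that differ inside the sensitivity band, Sw and Sy differ and the two colony types are separable; with similar curves the two areas converge, which is the failure mode described above.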
  • if the spectral sensitivity characteristics of the pixels are set in wavelength regions where there is a difference between the spectral reflectance characteristic lines LW and LY, the two types of colonies WC and YC can be identified from the image.
  • a special light source that can emit light of a specific wavelength with a narrow emission wavelength range is required.
  • these special cameras and special light sources are expensive, which increases the manufacturing cost of the imaging device.
  • the spectral sensitivity characteristics of pixels as shown in FIG. 7 are generated by performing predetermined processing on the first imaging signal S1 captured using the general-purpose color camera 31.
<About images using general-purpose color camera 31>
  • Next, a comparative example will be described with reference to FIG. 8.
  • the image sensor 33 mounted on the general-purpose color camera 31 has the RGB spectral sensitivity characteristics shown in the graph when illuminated with white LED.
  • the spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC are different from those in FIGS. 6 and 7.
  • the spectral sensitivity characteristic lines R, G, and B which indicate the sensitivity of each of the RGB pixels shown in FIG. 8, are distributed over a wide wavelength range.
  • the spectral reflectance characteristic line LW of the white colony WC and the spectral reflectance characteristic line LY of the yellow colony YC have a difference of LW > LY in the wavelength range of about 480 to 520 nm, but conversely LW < LY in the wavelength range of about 530 to 580 nm. Therefore, the difference in density between the two types of colonies WC and YC becomes small in any of the R, G, and B images, which have sensitivity in the range of 400 to 700 nm.
  • the density of the white colony WC in the G image is proportional to the area Sg1 of the region obtained by multiplying the spectral reflectance characteristic line LW (thick line) of the white colony WC in FIG. 8 by the spectral sensitivity characteristic line G of the G pixel shown by the dashed line.
  • the density of the yellow colony YC in the G image is proportional to the area Sg2 of the region obtained by multiplying the spectral reflectance characteristic line LY (thick one-dot chain line) of the yellow colony in FIG. 8 by the spectral sensitivity characteristic line G.
  • the ratio of the difference ΔSg between the two areas Sg1 and Sg2 to the whole (the total area of the G region) is small. As this ratio becomes smaller, it becomes more difficult to distinguish between the two types of colonies WC and YC in the G image. This is because the G pixel has sensitivity over a wide wavelength range.
  • similarly, since the differences ΔSb and ΔSr account for a small proportion of the total areas (the total area of the B region and the total area of the R region), it is difficult to distinguish the white colony WC from the yellow colony YC in the B and R images.
  • the area Sg1 of the region where LW and G overlap is substituted for LW*G, and the area Sg2 of the region where LY and G overlap is substituted for LY*G.
  • FIG. 9 shows the density of white colonies WC and yellow colonies YC in an RGB image captured by the general-purpose color camera 31 of this comparative example.
  • in any of the R, G, and B images, there is almost no difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC, or even if there is a difference, it is very small. For this reason, it is difficult to distinguish between the white colony WC and the yellow colony YC in an RGB image.
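The "proportion of the difference to the whole" used in this comparison can be written as a simple ratio (a hypothetical helper for illustration, not part of the device):

```python
def density_difference_ratio(s1, s2):
    """|Sg1 - Sg2| relative to the total area; the smaller this ratio,
    the harder the two colony types are to distinguish in that image."""
    return abs(s1 - s2) / (s1 + s2)
```

For example, two nearly equal areas give a ratio near zero (colonies indistinguishable), while strongly unequal areas give a ratio near one.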
  • the RGB pixels of the image sensor 33 of the general-purpose color camera 31 have a sensitivity expressed as a relative output with respect to wavelength (nm) under white LED illumination.
  • the B pixel 33B, which is the first pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 440 to 540 nm, and has a spectral characteristic with a peak at approximately 460 nm. That is, the first wavelength region where the relative output is 0.1 or more is about 440 to 540 nm.
  • the G pixel 33G, which is the second pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 460 to 610 nm, and has a spectral characteristic with a peak at approximately 550 nm. That is, the second wavelength region where the relative output is 0.1 or more is about 460 to 610 nm.
  • the R pixel 33R, which is the third pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 570 to 650 nm, and has a spectral characteristic with a peak at approximately 600 nm.
  • when illuminated with B light whose emission wavelength includes at least part of the overlapping wavelength range of about 460 to 550 nm, not only the B pixel 33B, which is the first pixel, but also the G pixel 33G, which is the second pixel, has a relative output of 0.1 or more.
  • FIG. 10 shows the spectrum (emission distribution) of the monochromatic light source 20 when the blue light source 21 made of a blue LED and the red light source 23 made of a red LED emit light at the same time.
  • the horizontal axis is wavelength and the vertical axis is output power.
  • the blue light source 21, which is the main monochromatic light source, emits B light of about 430 to 530 nm.
  • the red light source 23 which is an auxiliary light source, emits R light of 500 to 660 nm. Note that in the graph of FIG. 10, the spectral characteristics of the white LED are shown by broken lines.
  • FIG. 11 shows the relative outputs of the RGB pixels 33R, 33G, and 33B when illuminated with B light and R light.
  • by illuminating with the B light shown in FIG. 10, the G region is shifted to the lower wavelength side compared to the G region when illuminated with white light in FIG. 8.
  • in the B region shown in FIG. 11, the portion where the relative output was about 0.05 to 0.2 in the 500 to 650 nm range of the B region under white-light illumination in FIG. 8 has almost disappeared.
  • the R region is shifted to the higher wavelength side by illuminating with the R light shown in FIG. 10, compared to the R region when illuminating with white light in FIG.
  • an XYZ image is obtained by performing linear matrix calculation on the RGB image.
  • the imaging device 11 of this example employs an imaging method that gives a color camera 31 equipped with an RGB color filter 34 new two-band imaging characteristics through monochromatic illumination and RGB linear matrix calculation processing.
  • the two new bands have different peak wavelengths from the original two bands, and have different imaging characteristics from the RGB imaging characteristics when using white LED lighting.
  • this imaging method enables color discrimination that is difficult to perform when imaging with a normal color camera 31.
  • the monochromatic illumination of this embodiment is blue illumination.
  • the monochromatic illumination used as an auxiliary light source is red illumination. That is, the blue light source 21 that illuminates with B light is used as the main monochromatic light source, and the red light source 23 that illuminates with R light is used as an auxiliary light source. By illuminating with R light using the red light source 23 as an auxiliary light source, the R image is also used for inspection processing.
  • the B light from the blue light source 21 causes the G pixel 33G to have a relative output of 0.1 or more in the region of about 450 to 520 nm.
  • the B light which is monochromatic light that illuminates the subject 12, is light in an emission wavelength range that includes at least part of the first wavelength range and at least part of the overlapping wavelength range.
  • the control unit 40 simultaneously illuminates the first monochromatic light source 25 and the second monochromatic light source 26 to cause the imaging unit 30 to image the subject 12.
  • the control unit 40 illuminates the second monochromatic light source 26 with a second illuminance set according to the illuminance of the first monochromatic light source 25 or according to the output level of the B pixel, which is the first pixel among the RGB pixels 33R, 33G, and 33B.
  • the gain setting unit 55 sets the gain given to the output level of the R pixel 33R, which is the third pixel, according to the gains given to the output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel.
  • the density of the white colony WC in the Y image is proportional to the area Sg1 of the region obtained by multiplying the spectral reflectance characteristic line LW (thick line) of the white colony WC in FIG. 13 by the spectral sensitivity characteristic line Y.
  • the density of the yellow colony YC in the Y image is proportional to the area Sg2 of the region obtained by multiplying the spectral reflectance characteristic line LY (thick one-dot chain line) of the yellow colony in FIG. 13 by the spectral sensitivity characteristic line Y.
  • the ratio of the difference ΔSg between the two areas Sg1 and Sg2 to the whole (the sum of Sg1 and Sg2) is large. Since this ratio corresponds to the density difference ratio, the two types of colonies WC and YC have a large density difference in the Y image and are easy to distinguish. Note that in the B image and the R image, the differences ΔSb and ΔSr (not shown) account for a small proportion of the total areas (the total area of the B region and the total area of the R region), so the two types of colonies WC and YC are difficult to distinguish. In addition, in FIG. 13, the area Sg1 of the region where LW and Y overlap is substituted for LW*Y, and the area Sg2 of the region where LY and Y overlap is substituted for LY*Y.
  • FIG. 14 shows the density of white colonies WC and yellow colonies YC in an XYZ image captured by the general-purpose color camera 31 of this embodiment.
  • in the X image and the Z image, there is almost no difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC, or even if there is a difference, it is very small. For this reason, it is difficult to distinguish between the two types of colonies WC and YC in the X and Z images.
  • in the Y image, there is a large difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC. Therefore, the white colony WC and the yellow colony YC can be easily distinguished in the Y image. From the above, the two types of colonies WC and YC can also be identified by color in the XYZ image.
  • in step S11, the control unit 40 causes a reference sheet for color gain adjustment to be installed.
  • the control unit 40 controls, for example, a robot to have an arm grip a reference sheet, and places the gripped reference sheet at, for example, an inspection position, which is an imaging area of the camera 31 .
  • the reference sheet may be placed manually by the operator.
  • in step S12, the control unit 40 turns on the blue LED.
  • in step S13, the control unit 40 sets the B and G gains so that the levels of the B and G images are the same. That is, under the condition of illumination with B light from the blue LED serving as the blue light source 21, the control unit 40 sets the B gain and the G gain to values that allow the respective output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel, to be color-balanced between the two colors.
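One way to realize step S13 is to measure the mean B and G levels on the reference sheet under blue illumination and scale both toward a common target level. This is a sketch under that assumption, not the device's actual procedure:

```python
import numpy as np

def balance_bg_gains(ref_rgb):
    """Return (B gain, G gain) that equalize the mean B and G output
    levels measured on a reference sheet under blue-LED illumination."""
    mean_g = ref_rgb[..., 1].mean()   # G channel mean
    mean_b = ref_rgb[..., 2].mean()   # B channel mean
    target = (mean_b + mean_g) / 2.0  # one possible common target level
    return target / mean_b, target / mean_g

ref = np.array([[[0.0, 100.0, 200.0]]])  # toy reference-sheet pixel (R, G, B)
b_gain, g_gain = balance_bg_gains(ref)
```

After applying the returned gains, the scaled B and G levels coincide, which is the two-color balance condition described above.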
  • in step S14, the control unit 40 turns on the red LED.
  • in step S15, the control unit 40 sets the manually specified R gain. That is, the operator manually sets the R gain by operating the input section, and the control unit 40 sets the R gain input from the input section by writing it into a predetermined storage area of the memory.
  • in step S16, the control unit 40 causes a background sheet to be installed.
  • the control unit 40 controls, for example, the arm of the transport robot, causes the arm to grip the background sheet, and places the gripped background sheet at the imaging area of the camera 31, for example, at an inspection position. Note that before placing the background sheet at the inspection position, the robot removes the reference sheet previously placed at the inspection position.
  • in step S17, the control unit 40 calculates and sets a black balance offset amount using the background sheet image.
  • the control unit 40 executes the imaging processing routine shown in FIG. 17.
  • the control unit 40 controls the camera 31 to image the petri dish 13 placed at the inspection position, and causes the image processing unit 50 (FIGS. 1 and 5) to perform predetermined image processing on the imaging signal output from the camera 31 as the imaging result, thereby converting the 3-band RGB image forming the imaging signal into a 3-band XYZ image and outputting it.
  • in step S21, the control unit 40 arranges the petri dish 13 in which the colonies WC and YC are cultured on the background sheet 15.
  • the control unit 40 controls, for example, a robot to place the petri dish 13 held by the arm on the background sheet 15. Note that the operator may place the petri dish 13 on the background sheet 15.
  • in step S22, the control unit 40 turns on the blue and red LEDs.
  • in step S23, the control unit 40 causes the camera 31 to take an image. Note that the processing in steps S22 and S23 corresponds to an example of an imaging step.
  • in step S24, the control unit 40 performs linear matrix processing on the RGB image. Note that the processing in step S24 corresponds to an example of an arithmetic processing step.
  • in step S25, the control unit 40 performs white balance processing using the gains Gx, Gy, and Gz set in the gain setting unit 55.
  • the gains Gx and Gy are values obtained by color-balancing the output levels of the B pixel and the G pixel. Therefore, even if the areas Sg1 and Sg2 shown in FIG. 13 occupy only a small part of the entire Y region, the two types of colonies WC and YC can be obtained in the Y image with a density difference and without becoming too dark.
  • in step S26, the control unit 40 performs black balance processing using the set RGB offset amounts. That is, the control unit 40 performs black balance so that the black of the background sheet 15 in the image has the same density as the black of the background sheet 15 in the image set at the time of initial setting.
  • in step S27, the control unit 40 outputs an XYZ image.
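The processing of steps S24 to S27 can be strung together as a small pipeline. The matrix, gains, and offsets below are placeholders for the values set during initial setting:

```python
import numpy as np

def process_capture(rgb, matrix, gains, offsets):
    """S24: linear matrix processing; S25: white balance with Gx, Gy, Gz;
    S26: black balance with the set offsets; S27: output the XYZ image."""
    xyz = rgb @ matrix.T           # S24: linear matrix (XYZ conversion)
    xyz = xyz * gains              # S25: white balance
    xyz = xyz - offsets            # S26: black balance
    return np.clip(xyz, 0.0, None) # S27: output (clamped to non-negative)

out = process_capture(np.full((1, 1, 3), 100.0),
                      np.eye(3),                      # placeholder matrix
                      np.array([1.0, 2.0, 1.0]),      # placeholder gains
                      np.array([10.0, 10.0, 10.0]))   # placeholder offsets
```

Each stage is a per-pixel, per-band operation, so the whole pipeline vectorizes over the image array.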
  • a distinguishable difference in density is obtained between the white colony WC and the yellow colony YC by predetermined image processing such as binarization processing. Therefore, two types of colonies WC and YC can be identified.
  • in the X image and the Z image, the densities of the two types of colonies WC and YC are the same and they are difficult to distinguish from each other, so the total number of colonies WC and YC can be counted. Further, in the Y image, the difference in density between the two types of colonies WC and YC is large, so they can be identified.
  • the areas of the two types of colonies WC and YC in FIG. 15(b) correspond to the first colony area, which is an area where the pixel level is within the threshold setting range.
  • One of the two types of colonies WC and YC in FIG. 15(c) corresponds to the second colony area, which is an area where the pixel level is within the threshold setting range.
  • the areas of the two types of colonies WC and YC in FIG. 15(d) correspond to the third colony area, which is an area where the pixel level is within the threshold setting range.
  • the remaining colony area after excluding the second colony area, which is the area of one of the two types of colonies, from the first colony area or the third colony area, which is the area of the total number of colonies, corresponds to the fourth colony area.
  • when the white colony WC area corresponds to the second colony area, the yellow colony YC area corresponds to the fourth colony area; conversely, when the yellow colony YC area corresponds to the second colony area, the white colony WC area corresponds to the fourth colony area.
  • the computer constituting the inspection processing section 60 executes the inspection processing routine shown in FIG. 18.
  • the inspection processing unit 60 counts colonies for each image using the XYZ images from the imaging device 11, performs count correction that corrects the number of colonies by type using the colony counting result for each image, and performs a determination process to determine the test result based on the corrected count result.
  • the processing performed for each XYZ image includes area extraction processing that finds the colony area to be inspected, classification processing that classifies the extracted area according to the characteristics of the colony, and counting processing that counts the number of inspection targets classified as colonies.
  • the area extraction process is a process of finding the area of the inspection target (colony candidate) for each of the X image, Y image, and Z image.
  • the classification process is a process of classifying whether or not a colony to be inspected exists in the extracted region, using the shape factor and size factor of the colony to be inspected as characteristics.
  • the counting process is a process of counting the number of test objects classified as colonies.
  • FIG. 18 shows parallel processing for determining regions for each XYZ image in one flow, and the processing order of step numbers is shown as an example, but the parallel processing may be performed in any order.
  • the feature classification and counting for each area may be performed in any order as long as the area is specified.
  • steps S31 to S34 are a series of processes (first process) for counting colonies using the X image.
  • steps S35 to S38 are a series of processes (second process) for counting colonies using the Y image.
  • steps S39 to S42 are a series of processes (third process) for counting colonies using the Z image.
  • steps S43 to S47 are a series of processes (fourth process) in which two or three of the areas extracted as colony candidates in the first to third processes are combined to determine a specific type of colony area.
  • the first process includes region extraction using a threshold value of the X image (step S31), identification of a first region as the region extraction result (step S32), feature classification for classifying the first region according to features (step S33), and counting of the feature-classified inspection objects (step S34).
  • the second process includes region extraction using a threshold value of the Y image (step S35), identification of a second region as the region extraction result (step S36), feature classification for classifying the second region according to features (step S37), and counting of the feature-classified inspection objects (step S38).
  • the third process includes region extraction using a threshold value of the Z image (step S39), identification of a third region as the region extraction result (step S40), feature classification for classifying the third region according to features (step S41), and counting of the feature-classified inspection objects (step S42).
  • the fourth process is a process of finding another area using the multiple areas obtained in each of the above processes (for example, three processes).
  • the fourth process includes a determination process (step S43) that determines whether the plurality of areas to be used have already been acquired, an area acquisition process (step S44) that determines a fourth area using, as an example, the first area and the second area, identification of a fourth region as the region acquisition result (step S45), feature classification for classifying the fourth region according to features (step S46), and counting of the feature-classified test objects (step S47).
  • for example, the fourth colony area may be determined by using the colony candidate area of the X image as the first area and the colony candidate area of the Y image as the second area.
  • in step S48, the inspection processing unit 60 (more specifically, the computer configuring the inspection processing unit 60) performs count correction by a predetermined calculation using the count results of the inspection targets for each region obtained in steps S34, S38, S42, and S47. Count values for each type of colony to be inspected are determined by the count correction. For example, the number of colonies of the second type is calculated by subtracting the number of colonies of the first type from the total number of colonies. Further, in step S49, the inspection processing unit 60 performs a determination process to obtain a final inspection result using the count correction results.
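The count correction and the inspection judgment of step S49 reduce to simple arithmetic; the helpers below are illustrative, with hypothetical threshold parameters:

```python
def count_correction(total_count, first_type_count):
    """The number of second-type colonies is the total number of
    colonies minus the number of first-type colonies."""
    return first_type_count, total_count - first_type_count

def judge(total, per_type_counts, total_limit, type_limits):
    """Inspection judgment: NG if any count exceeds its threshold."""
    if total > total_limit:
        return "NG"
    for count, limit in zip(per_type_counts, type_limits):
        if count > limit:
            return "NG"
    return "OK"
```

For instance, a total of 12 colonies of which 5 are the first type yields (5, 7), and the judgment passes or fails depending on the configured number thresholds.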
  • the judgment processing includes inspection judgment processing that determines whether the total number of colonies and the number of each type of colony exceed the corresponding number thresholds, and may also include highlighting processing that highlights colonies by surrounding them with a display frame or adding a predetermined color, and the like.
  • the test processing section 60 then displays the test results on the display section 70.
  • the display unit 70 displays, for example, the total number of colonies, the number of colonies by type, test result details, and images with colonies highlighted.
  • the inspection result content may include a determination result as to whether the product is good or defective.
  • although the inspection target is a colony here, it is not limited to this. For example, a configuration may be adopted in which a product is inspected for the presence or absence of foreign matter, or a configuration in which the product is inspected for the presence or absence of contamination may be adopted.
  • the identification section 61 and the determination section 62 are configured in software by the computer forming the inspection processing section 60 executing the inspection processing program shown in FIG. 18. The identification unit 61 identifies each area by area extraction (steps S31, S35, S39, S44) and area specification (steps S32, S36, S40, S45).
  • the classification unit 63 is configured by a computer that performs feature classification processing (steps S33, S37, S41, and S46).
  • the counting unit 64 is constituted by a computer that performs counting processing (steps S34, S38, S42, S47).
  • the specifying unit 65 is constituted by a computer that performs a process of specifying the number of two types of colony areas to be identified based on the count value that is the counting result of the counting unit 64.
  • the determination unit 62 is constituted by a computer that performs determination processing based on the numbers of the two types of colonies identified (step S49).
  • the control unit 40 executes the imaging condition determination processing routine shown in FIG. 19.
  • This flowchart is an example of searching and determining imaging conditions using three types of monochromatic light sources.
  • the types of monochromatic light sources used for the imaging condition search may be two types of monochromatic light sources among the blue light source 21, the green light source 22, and the red light source 23.
  • the processing in steps S51 to S57 is a first search process in which imaging conditions are searched for and determined using the first monochromatic light source. If appropriate imaging conditions cannot be determined even after repeatedly performing the processes in steps S51 to S57, the monochromatic light is changed in step S58, and the processes in steps S51 to S57 are performed again. Changing the monochromatic light refers to changing the peak wavelength of the monochromatic light. Furthermore, if all the peak wavelengths of monochromatic light available from one monochromatic light source have been tried, the monochromatic light source itself may be changed. Even if the monochromatic light source is changed, the processes in steps S51 to S57 are basically the same. Therefore, in the following, the process in which the control unit 40 searches for and determines imaging conditions using the blue light source, which is the first monochromatic light source, will be described in detail.
  • In step S51, the control unit 40 sets the coefficients for the linear matrix calculation used when illuminating with the first monochromatic light.
  • This step S51 corresponds to an example of the first step, in which the coefficients of the matrix by which the RGB image is multiplied are set so that linear matrix calculation processing on the 3-band RGB image output from the imaging unit 30 generates a 3-band XYZ image with corrected RGB imaging characteristics.
  • In step S52, the control unit 40 illuminates the subject with the first monochromatic light source.
  • This corresponds to an example of the second step, in which the subject 12 is illuminated with monochromatic light capable of shifting the peak wavelengths of the spectral output characteristics of the first pixel and of the second pixel (different from the first pixel) among the RGB pixels of the camera 31.
  • In step S53, the control unit 40 images the subject. The process in step S53 corresponds to an example of the third step, in which the imaging unit 30 images the subject 12 illuminated with the monochromatic light in the second step.
  • In step S54, the control unit 40 multiplies the RGB image by the matrix to generate an XYZ image.
  • This step S54 corresponds to an example of the fourth step, in which the RGB image output from the imaging unit 30 as a result of the imaging in the third step is multiplied by the matrix to generate an XYZ image.
  • In step S55, the control unit 40 obtains the difference in density between regions of two different types in the Y image. This step S55 corresponds to an example of the fifth step, in which it is determined whether the difference in density between regions of two different materials in the second image, output from the second pixel among the XYZ images generated in the fourth step, is greater than or equal to a predetermined threshold.
  • In step S56, the control unit 40 determines whether the difference is greater than or equal to the threshold value. If the difference is less than the threshold, the process advances to step S57. If the difference is greater than or equal to the threshold, the process proceeds to step S59, where the monochromatic light and coefficients in effect at that time are determined. The determined coefficients are set in the matrix coefficient setting section 54.
  • In step S57, the control unit 40 determines whether the condition search using the first monochromatic light has ended. If it has not, the process returns to step S52 and steps S52 to S57 are repeated. If no imaging condition for which the difference is equal to or greater than the threshold is found even after all condition searches using the first monochromatic light are complete, the process moves to step S58.
  • In step S58, the control unit 40 changes the monochromatic light.
  • Specifically, the control unit 40 changes the peak wavelength of the monochromatic light.
  • Here, changing the peak wavelength of the monochromatic light can mean stepping the peak wavelength gradually while keeping the same monochromatic light source.
  • For example, the peak wavelength of a general-purpose blue LED is about 470 nm; it may be stepped in 10 nm increments, to 480 nm and then 490 nm.
  • the peak wavelength of the monochromatic light (for example, B light) to be used is changed, and the imaging condition search processing in steps S51 to S57 is similarly executed using the monochromatic light after the peak wavelength has been changed.
  • the monochromatic light source to be used is changed from the blue light source 21 to the green light source 22 in step S58.
  • the peak wavelength of the monochromatic light (for example, G light) to be used is changed, and the imaging condition search processing of steps S51 to S57 is similarly executed using the monochromatic light after the peak wavelength has been changed.
  • If an imaging condition for which the difference is equal to or greater than the threshold still cannot be found in step S56, the green light source 22 is changed to the red light source 23 in step S58. Then, likewise, the peak wavelength of the monochromatic light to be used (for example, R light) is changed, and the imaging condition search of steps S51 to S57 is executed with the changed monochromatic light. In this way, the first to fifth steps (steps S51 to S58) are repeated, with changes to the coefficients in the first step (S51) and to the peak wavelength of the monochromatic light in the second step (S52, S58), until the difference in density becomes equal to or greater than the predetermined threshold in the fifth step (S56).
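The repetition of steps S51 to S58 described above can be sketched as a nested search loop. This is a hypothetical illustration only: the function names (`capture_xyz`, `density_difference`) and the candidate lists are stand-ins, since the patent describes hardware control rather than a software API.

```python
# Hedged sketch of the imaging-condition search (steps S51-S58).
# All callables and candidate lists are hypothetical stand-ins.

def search_imaging_conditions(light_candidates, coefficient_sets,
                              capture_xyz, density_difference, threshold):
    """Return the (light, coefficients) pair whose density difference
    between the two colony regions meets the threshold, or None if the
    search is exhausted."""
    for light in light_candidates:          # S52 / S58: change monochromatic light
        for coeffs in coefficient_sets:     # S51: set linear-matrix coefficients
            xyz = capture_xyz(light, coeffs)    # S53-S54: image and convert
            diff = density_difference(xyz)      # S55: density difference in Y image
            if diff >= threshold:               # S56: threshold judgment
                return light, coeffs            # S59: conditions determined
    return None                             # search exhausted without success
```

In this sketch the candidate lights could, for example, be peak wavelengths stepped in 10 nm increments as in the text (470, 480, 490 nm), or different light sources altogether.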
  • when the monochromatic light source is changed, the target image from which the density difference is obtained in the subsequent step S55 may also be changed to an image other than the previously used Y image (the X image or the Z image). This image may be selected depending on the monochromatic light and on the spectral reflectance characteristics of the colonies WC and YC.
  • the imaging device 11 images the subject 12 under the imaging conditions determined in this routine.
  • when an imaging condition is found for which the difference is equal to or greater than the threshold, the blue light source 21 is determined as the monochromatic light source to use; this blue light source 21 becomes the first monochromatic light source 25.
  • in step S58, changing the monochromatic light is not limited to changing the peak wavelength while keeping the same monochromatic light source. It is sufficient if the emission wavelength can be changed so that the peak wavelengths of the spectral output characteristics of the first pixel and of the second pixel (different from the first pixel) change. For example, simply switching the monochromatic light source is sufficient: in step S58, the control unit 40 may switch the blue light source 21, the green light source 22, and the red light source 23 in order. Conversely, in a configuration in which the monochromatic light source is fixed but its peak wavelength is changed, a single type of light source, one of the blue light source 21, the green light source 22, and the red light source 23, can be used for the imaging condition search.
  • the imaging device 11 outputs an image in which regions made of two different materials included in the photographed subject 12 can be identified.
  • the imaging device 11 includes an imaging section 30, a monochromatic light source 20, a control section 40, and an image processing section 50.
  • the image sensor 33 (imaging element) that images the subject 12 has RGB pixels 33R, 33G, and 33B.
  • the monochromatic light source 20 illuminates the subject 12 with monochromatic light.
  • the control unit 40 controls the monochromatic light source 20 and the imaging unit 30.
  • the image processing unit 50 performs predetermined processing on the three-band RGB image output from the imaging unit 30 that has captured the image of the subject 12.
  • the RGB pixels include a first pixel 33B having sensitivity in a first wavelength region and a second pixel 33G having sensitivity in a second wavelength region.
  • the first wavelength region and the second wavelength region overlap in some overlapping wavelength regions.
  • Monochromatic light is light in an emission wavelength range that includes at least part of the first wavelength range and at least part of the overlapping wavelength range.
  • the image processing section 50 includes an arithmetic processing section 52 and an amplification section 53.
  • the arithmetic processing unit 52 generates a three-band XYZ image with corrected RGB imaging characteristics by performing linear matrix arithmetic processing on the three-band RGB image.
  • the amplification unit 53 amplifies, among the XYZ values forming the XYZ image, each value corresponding to the first pixel 33B and the second pixel 33G, using gains set to values that allow the output levels of the first pixel 33B and the second pixel 33G to be color-balanced between the two colors under irradiation with monochromatic light from the monochromatic light source 20.
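The two operations performed by the arithmetic processing unit 52 and the amplification unit 53 can be sketched as a per-pixel 3x3 linear matrix multiplication followed by per-channel gains. The matrix coefficients and gain values below are illustrative only; in the device they come from the matrix coefficient setting section 54 and the gain setting section 55.

```python
# Sketch of the linear matrix operation (RGB -> XYZ) and the
# gain amplification for color balance. Coefficient and gain
# values are illustrative, not taken from the patent.

def rgb_to_xyz(pixel, matrix):
    """Multiply one (R, G, B) pixel by a 3x3 linear matrix."""
    r, g, b = pixel
    return tuple(m[0] * r + m[1] * g + m[2] * b for m in matrix)

def apply_gains(xyz, gains):
    """Amplify the X, Y, Z values with per-channel gains (Gx, Gy, Gz)."""
    return tuple(v * gain for v, gain in zip(xyz, gains))

# Identity matrix leaves the RGB values unchanged (no correction).
IDENTITY = ((1.0, 0.0, 0.0),
            (0.0, 1.0, 0.0),
            (0.0, 0.0, 1.0))
```

For example, a matrix with small negative off-diagonal coefficients narrows a channel's effective sensitivity by subtracting part of a neighboring channel, which is the role the linear matrix calculation plays in the text.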
  • the image processing unit 50 generates a three-band XYZ image by performing black balance processing on a reference portion of the subject 12 other than the identification target. According to this configuration, since black balance processing is performed, even if the environment when photographing the subject 12 changes, it is possible to output an image in which the material can be identified with appropriate color or density.
  • the monochromatic light source 20 is a blue light source 21 or a red light source 23.
  • when the monochromatic light source 20 is the blue light source 21, the first pixel 33B is a B pixel and the second pixel is a G pixel 33G.
  • when the monochromatic light source 20 is the red light source 23, the first pixel is the R pixel 33R and the second pixel is the G pixel 33G.
  • the G pixel 33G also has sensitivity, in at least part of the overlapping wavelength region, to blue light, the light source color of the blue light source 21. Therefore, when illuminating with blue light for imaging, the sensitive wavelength range of the G image can be made narrower than that of a general-purpose camera. Likewise, the G pixel 33G has sensitivity in at least part of the overlapping wavelength region to red light, the light source color of the red light source 23, so when illuminating with red light for imaging, the sensitive wavelength range of the G image can again be made narrower than that of the general-purpose camera 31. By applying the linear matrix calculation to the RGB image, the narrowed sensitive wavelength region of the G image can be narrowed further or shifted in wavelength. Therefore, an image in which the materials (colonies WC, YC) can be identified with appropriate color or density can be output.
  • when the monochromatic light source 20 is taken as the first monochromatic light source 25, the imaging device 11 further includes a second monochromatic light source 26 capable of emitting second monochromatic light in an emission wavelength range different from that of the first monochromatic light emitted by the first monochromatic light source 25.
  • the RGB pixels include a third pixel in addition to the first pixel 33B and the second pixel 33G.
  • the control unit 40 performs illumination with the first monochromatic light source 25 and the second monochromatic light source 26 simultaneously and causes the imaging unit 30 to image the subject 12.
  • the control unit 40 either illuminates the second monochromatic light source 26 at a second illuminance set according to the illuminance of the first monochromatic light source 25, or the gain setting unit 55 sets the gain given to the output level of the third pixel according to the gains given to the output levels of the first pixel 33B and the second pixel 33G among the RGB pixels. According to this configuration, by adjusting the illuminance of the monochromatic light source 20 or setting the gain, an image in which the material can be identified with appropriate color or density can be output.
  • the inspection device 10 includes an imaging device 11.
  • the object 12 includes two different colonies WC and YC as two different materials.
  • An identification unit 61 is provided that identifies colony areas, which are areas of two different types of colonies WC and YC, based on the three-band XYZ image input from the imaging device 11.
  • the XYZ image includes an X image, a Y image, and a Z image.
  • the identification unit 61 defines an area in the X image whose pixel level is within the threshold setting range as a first colony area.
  • the identification unit 61 defines an area in the Y image whose pixel level is within the threshold setting range as a second colony area.
  • the identification unit 61 defines an area in the Z image whose pixel level is within the threshold setting range as a third colony area.
  • the identification unit 61 removes the second colony area from the first colony area or the third colony area and sets the remaining colony area as a fourth colony area.
  • the identification section 61 includes a classification section 63 , a counting section 64 , and a specifying section 65 .
  • the classification unit 63 classifies at least one of the first colony area and the third colony area, the second colony area, and the fourth colony area based on characteristics and shapes.
  • the counting unit 64 counts the number of each colony area classified by the classification unit.
  • the specifying unit 65 specifies the number of two types of colony areas to be identified based on the counting results of the counting unit.
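The identification logic above, where regions whose pixel level falls inside a threshold setting range form a colony area and the fourth colony area is obtained by removing the second area from the first (or third), can be sketched as simple mask operations. The images and threshold ranges below are hypothetical; the patent does not give numeric values.

```python
# Sketch of the identification unit 61's area logic: per-band
# threshold masks and per-pixel set subtraction. Images are plain
# nested lists; threshold values are illustrative only.

def mask_in_range(image, lo, hi):
    """Binary mask of pixels whose level is within [lo, hi]."""
    return [[lo <= px <= hi for px in row] for row in image]

def subtract_mask(mask_a, mask_b):
    """Remove mask_b from mask_a (per-pixel set difference),
    as when the second colony area is removed from the first."""
    return [[a and not b for a, b in zip(ra, rb)]
            for ra, rb in zip(mask_a, mask_b)]

def area(mask):
    """Number of pixels in the masked area."""
    return sum(px for row in mask for px in row)
```

In the device, the remaining connected regions would then be classified and counted by the classification unit 63 and counting unit 64; this sketch stops at the mask level.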
  • a method is provided for determining imaging conditions of the imaging device 11, which outputs an image in which regions made of two different materials included in the subject 12 can be identified.
  • the imaging device 11 includes a monochromatic light source 20 that illuminates the subject 12 and an imaging unit 30, having RGB pixels, that captures an image of the subject 12.
  • This imaging condition determining method includes first to fifth steps.
  • in the first step, the coefficients of the matrix by which the RGB image is multiplied are set, for the linear matrix calculation processing that is performed on the 3-band RGB image output from the imaging unit 30 to generate a 3-band XYZ image with corrected RGB imaging characteristics.
  • in the second step, the subject 12 is illuminated with monochromatic light capable of shifting the peak wavelengths of the spectral output characteristics of the first pixel 33B and of the second pixel 33G (different from the first pixel 33B) among the RGB pixels forming the imaging unit 30.
  • in the third step, the imaging unit 30 images the subject 12 illuminated with the monochromatic light in the second step.
  • in the fourth step, the RGB image output from the imaging unit 30 as a result of the imaging in the third step is multiplied by the matrix to generate an XYZ image.
  • in the fifth step, it is determined whether the difference in density between regions of two different materials in the second image, output from the second pixel 33G among the XYZ images generated in the fourth step, is greater than or equal to a predetermined threshold. The first to fifth steps are repeated, with at least one of a change of the coefficients in the first step and a change of the peak wavelengths in the second step, until the difference in density becomes equal to or greater than the predetermined threshold in the fifth step. According to this imaging condition determination method, appropriate imaging conditions can be determined under which the imaging device 11 can output an image in which regions made of two different materials (colonies WC, YC) included in the subject 12 can be identified by color or density.
  • the imaging unit 30 includes an image sensor 33 (image sensor) having RGB pixels.
  • the RGB pixels include a first pixel 33B having sensitivity in a first wavelength region and a second pixel 33G having sensitivity in a second wavelength region. The first wavelength region and the second wavelength region overlap in some overlapping wavelength regions.
  • Monochromatic light is light in an emission wavelength range that includes at least part of the first wavelength range and at least part of the overlapping wavelength range.
  • This imaging method includes an imaging step, an output step, an arithmetic processing step, and an amplification step.
  • in the imaging step, the imaging unit 30 images the subject 12 illuminated by the monochromatic light source 20.
  • in the output step, the imaging unit 30 that has imaged the subject 12 outputs a three-band RGB image.
  • in the arithmetic processing step, a three-band XYZ image with corrected RGB imaging characteristics is generated by performing linear matrix arithmetic processing on the three-band RGB image output from the imaging unit 30.
  • in the amplification step, the two images corresponding to the first pixel 33B and the second pixel 33G among the three bands are amplified, using gains set so that the output levels of the first pixel 33B and the second pixel 33G are balanced between the two colors under irradiation with monochromatic light from the monochromatic light source 20.
  • according to this imaging method, an image can be output in which regions made of two different materials (colonies WC, YC) included in the subject 12 can be identified by color or density. For example, when the image is used for inspection, the two different materials included in the image can be identified or classified with relatively high accuracy.
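One way the color-balance gains used in the amplification step could be derived is to equalize the first-pixel and second-pixel output levels measured on a reference region under monochromatic illumination. This is an assumption for illustration; the measured levels below and the choice of the first channel as the baseline are hypothetical, not specified by the patent.

```python
# Hedged sketch of deriving color-balance gains: under monochromatic
# illumination, choose the second-pixel gain so its output level
# matches the first-pixel level on a reference region. The reference
# levels used in the example are hypothetical.

def balance_gains(level_first, level_second):
    """Return (gain_first, gain_second) that equalize the two output
    levels, keeping the first channel as the unity-gain reference."""
    if level_second == 0:
        raise ValueError("second-pixel level must be nonzero")
    return 1.0, level_first / level_second
```

After applying these gains, a density difference between the two colony types shows up as a departure from the balanced level, which is what the inspection thresholds operate on.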
  • the embodiment is not limited to the above, and may be modified in the following manner.
  • - Monochromatic light is not limited to blue light.
  • the combination of the first monochromatic light and the second monochromatic light is not limited to blue light and red light.
  • the monochromatic light emitted by the monochromatic light source 20 may be green light or red light. That is, the first monochromatic light may be green light, and the combination of the first monochromatic light and the second monochromatic light may be green light and red light.
  • in this case, an XYZ image as shown in FIG. 21 is obtained. Then, as shown in the figure, the difference ΔSg between the area Sg1 and the area Sg2 becomes relatively large. Therefore, a relatively large difference occurs in the ratio of the difference ΔSg to the total of the areas Sg1 and Sg2, making it easier to distinguish white colonies from yellow colonies in the XYZ image (particularly the B image) based on the density difference between the two.
  • the IR cut filter 35 that blocks near-infrared light from the camera 31 may be removed, and the RGB pixels forming the image sensor may be configured to have sensitivity to near-infrared light.
  • the first monochromatic light source 25 is the blue light source 21, and B light is used as the first monochromatic light.
  • the second monochromatic light source 26 is a near-infrared LED, and near-infrared light is used as the second monochromatic light.
  • a linear matrix operation is performed on the RGB image to generate, for example, an NYZ image containing an N image having a peak in the near-infrared region (for example, approximately 800 to 900 nm in the example of FIG. 7) in place of an XYZ image.
  • the inspection processing unit 60 may perform inspection processing using only two images, the Y image and the Z image, or only the two images, the X image and the Y image.
  • the monochromatic light source 20 may include a near-infrared light source that uses near-infrared light as monochromatic light as one type of monochromatic light source.
  • the B light is not limited to monochromatic light emitted by a blue LED, but may be other blue light having a peak in the blue wavelength region.
  • in the above embodiment, the main monochromatic light is B light, the first pixel is the B pixel, and the second pixel is the G pixel, but the present invention is not limited to this.
  • when the monochromatic light is R light, the first pixel may be the R pixel 33R and the second pixel may be the G pixel 33G.
  • when the monochromatic light is G light, the first pixel may be the G pixel 33G and the second pixel may be the B pixel 33B or the R pixel 33R.
  • the combination of the first monochromatic light and the second monochromatic light is not limited to blue light and red light as in the above embodiment. Moreover, it is not limited to green light and red light as shown in FIG.
  • the monochromatic light may be red light.
  • the first monochromatic light may be red light and the second monochromatic light may be blue light.
  • the number of monochromatic light sources is not limited to two types, and only one type may be used.
  • the monochromatic light source may be only a blue light source. Further, in FIG. 20, the monochromatic light source may be only a green light source.
  • the four colors of R, G1, G2, and B may be used.
  • a color camera having complementary color filters may be used, and the complementary colors may be four colors: yellow, cyan, magenta, and green.
  • - Image data (for example, RGB image data) based on the first imaging signal S1 captured by the image sensor 33 of the camera 31 may be saved in a removable memory such as a USB memory.
  • the image data stored in the removable memory may then be read by a personal computer, and the CPU (image processing unit 50) of the personal computer may perform conversion processing, including the linear matrix calculation, to generate an XYZ image.
  • the device that performs the imaging step and the device that performs the conversion step may be separate devices. Also by such an imaging method, XYZ images of multiple bands can be acquired.
  • the configuration may be such that an inspector visually recognizes and inspects the XYZ image output by the imaging device 11.
  • Similar colors are not limited to white and yellow, but may also include white and light pink, white and light orange, white and light green, white and light blue, etc.
  • a combination of white and light color may be used.
  • a combination of different light colors may be used.
  • the object 12 is not limited to one that includes colonies, such as a Petri dish in which colonies are cultured, but may be an object that includes different types of identification targets. Further, the object to be identified may be a transparent object or a semi-transparent object. Furthermore, the identification target may be a food such as jelly or a processed food.
  • the subject 12 to be imaged or inspected is not particularly limited.
  • the object 12 may be, for example, a container such as a plastic bottle or other type of bottle, a food product, a beverage, an electronic component, an electric appliance, daily necessities, a part, a member, or a raw material such as powder, granules, or liquid.
  • the inspection target may be scratches, dirt, printing defects, painting defects, etc.
  • the object within the subject 12 is not limited to the inspection object.
  • the objects may be of different types but exhibit similar colors.
  • the imaging device 11 may be configured as a separate device from the inspection processing section 60.
  • the imaging device 11 may be used for purposes other than inspection.
  • the arrangement pattern of the color filters 34 constituting the image sensor 33 is not limited to the RGB Bayer arrangement, but may be any arrangement pattern such as a stripe arrangement.
  • the imaging device 11 does not need to include a transport robot.
  • a configuration may be adopted in which an operator places the subject 12 on a mounting table for photographing, which serves as an imaging position.
  • At least one of the control unit 40, the image processing unit 50, and the inspection processing unit 60 may be partially or entirely implemented in software, as a computer executing a program, or in hardware such as an electronic circuit.
  • 53...Amplification unit, 54...Matrix coefficient setting section, 55...Gain setting section, 60...Inspection processing section, 61...Identification section, 62...Judgment section, 63...Classification section, 64...Counting section, 65...Specification section, 70...Display section, WC...White colony, YC...Yellow colony, S1...First imaging signal, S2...Second imaging signal, VA...Visible light wavelength region, NIRA...Near-infrared wavelength region, LW...Spectral reflectance characteristic line of white colony, LY...Spectral reflectance characteristic line of yellow colony, B, G, R...Spectral sensitivity characteristic lines, X, Y, Z...Spectral sensitivity characteristic lines calculated from RGB spectral sensitivity characteristics, Gx, Gy, Gz...Gains.

Abstract

RGB pixels of a camera (31) include a first pixel having sensitivity to a first wavelength region and a second pixel having sensitivity to a second wavelength region. The first wavelength region and the second wavelength region overlap in a partial overlapping wavelength region. Monochromatic light is light in an emission wavelength region including at least a part of the first wavelength region and at least a part of the overlapping wavelength region. An image processing unit (50) comprises an arithmetic processing unit (52) and an amplification unit (53). The arithmetic processing unit (52) performs linear matrix arithmetic processing on a three-band RGB image from the camera (31) to generate a three-band XYZ image having corrected RGB imaging characteristics. The gain of the amplification unit (53) is set to a value that enables the output levels of the first pixel and the second pixel to be color-balanced between two colors under monochromatic light illumination.

Description

Imaging device, inspection device, imaging condition determination method, and imaging method
The present invention relates to an imaging device, an inspection device, an imaging condition determination method, and an imaging method that output, as an easily identifiable image, objects of different types but similar colors within an image of a subject.
For example, Patent Documents 1 to 3 disclose inspection devices that include an imaging device for imaging an object such as a colony. In Patent Document 1, the number of colonies of microorganisms cultured in a petri dish is counted accurately and quickly, down to minute colonies of various shapes, using a simple device configuration with a black-and-white CCD camera and without requiring complicated pretreatment of the culture medium.
Patent Documents 2 and 3 disclose colony detection systems capable of accurately identifying two or more types of colonies with different color features. To accurately identify two or more types of colonies with different color features, these systems classify colony pixels into predetermined types based on the color features of the pixels and thereby identify the colonies.
This makes it possible to determine the bacterium to which an intermediate color belongs when identifying two or more types of colonies with different color characteristics, such as reddish purple and navy blue, which cannot be distinguished by detecting colonies in grayscale alone, and to detect a target bacterium and bacteria similar to it even when their color characteristics are of similar hues.
Japanese Patent Application Publication No. 2006-345750; Japanese Patent Application Publication No. 2017-035042; Japanese Patent Application Publication No. 2016-26500
However, there are cases where multiple types of colonies of different kinds but similar colors are present. In such cases, even using the color features in the colony detection systems described in Patent Documents 2 and 3, it is difficult to identify different types of colonies having similar color features from an image captured by a camera. Note that the identification target is not limited to colonies; the same problem arises with any objects that are of different types but similar in color.
Hereinafter, means for solving the above problems and their effects will be described.

An imaging device that solves the above problems images a subject and outputs an image in which regions made of two different materials included in the subject can be identified. The imaging device includes an imaging unit whose image sensor has RGB pixels for imaging the subject; a monochromatic light source that illuminates the subject with monochromatic light; a control unit that controls the monochromatic light source and the imaging unit; and an image processing unit that performs predetermined processing on the 3-band RGB image output from the imaging unit that has imaged the subject. The RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region, and the first wavelength region and the second wavelength region overlap in a partial overlapping wavelength region. The monochromatic light is light in an emission wavelength region that includes at least part of the first wavelength region and at least part of the overlapping wavelength region. The image processing unit includes an arithmetic processing unit that generates a 3-band XYZ image with corrected RGB imaging characteristics by performing linear matrix arithmetic processing on the 3-band RGB image, and an amplification unit that amplifies, among the XYZ values forming the XYZ image, each value corresponding to the first pixel and the second pixel with gains set to values that allow the output levels of the first pixel and the second pixel to be color-balanced between the two colors under irradiation with monochromatic light from the monochromatic light source.
According to this configuration, the imaging device can output an image in which regions made of two different materials included in the subject can be identified by color or density. For example, when the image is used for inspection, the two different materials included in the image can be identified or classified with relatively high accuracy. Note that in this specification, "density" is a concept corresponding to pixel level, signal level, pixel value, and the like.
In the above imaging device, the image processing unit may generate the 3-band XYZ image after performing black balance processing on a reference portion of the subject other than the identification target.

According to this configuration, since black balance processing is performed, an image in which the material can be identified with appropriate color or density can be output even if the environment when photographing the subject changes.
In the imaging device, the monochromatic light source may be a blue light source or a red light source. When the monochromatic light source is the blue light source, the first pixel is a B pixel and the second pixel is a G pixel; when the monochromatic light source is the red light source, the first pixel is an R pixel and the second pixel is a G pixel.
According to this configuration, the G pixel has sensitivity in at least a part of the overlapping wavelength region to blue light, which is the light source color of the blue light source. Therefore, when imaging under blue illumination, the sensitivity wavelength region of the G image can be made narrower than that of a general-purpose camera. Likewise, the G pixel has sensitivity in at least a part of the overlapping wavelength region to red light, the light source color of the red light source, so the G image's sensitivity wavelength region is similarly narrowed when imaging under red illumination. By applying the linear matrix calculation to the RGB image, the narrowed sensitivity wavelength region of the G image can be narrowed further or shifted in wavelength. An image in which the material can be identified with an appropriate color or density can therefore be output.
In the imaging device, when the monochromatic light source is referred to as a first monochromatic light source, the device may further include a second monochromatic light source capable of emitting second monochromatic light whose emission wavelength region differs from that of the first monochromatic light emitted by the first monochromatic light source. The RGB pixels include a third pixel in addition to the first pixel and the second pixel. The control unit causes the imaging unit to image the subject while the first monochromatic light source and the second monochromatic light source illuminate it simultaneously. A gain setting unit holds the gains used when the amplifying unit amplifies the values corresponding to the first pixel and the second pixel among the XYZ values. Either the control unit illuminates the second monochromatic light source at a second illuminance set according to the illuminance of the first monochromatic light source, or the gain setting unit sets the gain applied to the output level of the third pixel according to the gains applied to the output levels of the first pixel and the second pixel.
According to this configuration, by adjusting the illuminance of the monochromatic light source or setting the gain, an image in which the material can be identified with an appropriate color or density can be output.
An inspection device that solves the above problems is an inspection device including the above imaging device, in which the subject includes two different types of colonies as the two different materials. The inspection device includes an identification unit that identifies colony regions, which are regions of the two different types of colonies, based on the three-band XYZ image input from the imaging device. The XYZ image includes an X image, a Y image, and a Z image. The identification unit takes a region whose pixel level is within a threshold setting range in the X image as a first colony region, a region whose pixel level is within a threshold setting range in the Y image as a second colony region, and a region whose pixel level is within a threshold setting range in the Z image as a third colony region, and takes the colony region remaining after excluding the second colony region from the first colony region or the third colony region as a fourth colony region. The inspection device further includes: a classification unit that classifies at least one of the first colony region and the third colony region, the second colony region, and the fourth colony region based on features and shapes; a counting unit that counts the number of each of the colony regions classified by the classification unit; and a specifying unit that specifies the respective numbers of the two types of colony regions to be identified based on the counting results of the counting unit.
According to this configuration, by adjusting the illuminance of the monochromatic light source or setting the gain, an image in which the material can be identified with an appropriate color or density can be output. Therefore, when the image is used for inspection, the two different materials included in the image can be identified or classified with relatively high accuracy.
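The derivation of the four colony region masks from the X, Y, and Z images can be sketched as below. The threshold setting ranges and the tiny sample bands are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

def in_range(band, lo, hi):
    """Binary mask of pixels whose level is within a threshold setting range."""
    return (band >= lo) & (band <= hi)

def identify_colony_regions(x_img, y_img, z_img, ranges):
    """Derive the first to fourth colony region masks described above.
    `ranges` maps 'x', 'y', 'z' to (low, high) threshold pairs."""
    first = in_range(x_img, *ranges['x'])    # first colony region (X image)
    second = in_range(y_img, *ranges['y'])   # second colony region (Y image)
    third = in_range(z_img, *ranges['z'])    # third colony region (Z image)
    # fourth region: the first region with the second region excluded
    fourth = first & ~second
    return first, second, third, fourth

# Tiny hypothetical X/Y/Z bands and threshold ranges for illustration.
x = np.array([[0.8, 0.2], [0.9, 0.1]])
y = np.array([[0.8, 0.8], [0.1, 0.1]])
z = np.array([[0.1, 0.9], [0.1, 0.1]])
ranges = {'x': (0.5, 1.0), 'y': (0.5, 1.0), 'z': (0.5, 1.0)}
first, second, third, fourth = identify_colony_regions(x, y, z, ranges)
```

In the actual device the masks would then be passed to the classification and counting stages, which operate on connected regions rather than raw pixels.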
An imaging condition determination method that solves the above problems is an imaging condition determination method for an imaging device that outputs an image in which regions made of two different materials included in a subject can be identified. The imaging device includes a monochromatic light source that illuminates the subject with monochromatic light and an imaging unit including RGB pixels that images the subject. The method performs: a first step of setting the coefficients of the matrix by which the RGB image is multiplied in the linear matrix calculation processing, in order to generate a three-band XYZ image with corrected RGB imaging characteristics from the three-band RGB image output from the imaging unit; a second step of changing the emission wavelength of the monochromatic light source so that the peak wavelengths of the spectral output characteristics of a first pixel among the RGB pixels and of a second pixel different from the first pixel change, and illuminating the subject with the monochromatic light source after the wavelength change; a third step in which the imaging unit images the subject illuminated with the monochromatic light in the second step; a fourth step of multiplying the RGB image output from the imaging unit as a result of the imaging in the third step by the matrix to generate an XYZ image; and a fifth step of determining whether the density difference between the regions of the two different materials in the second image output from the second pixel of the XYZ image generated in the fourth step is equal to or greater than a predetermined threshold. The first to fifth steps are repeated, with at least one of a change in the coefficients in the first step and a change in the emission wavelength in the second step, until the density difference in the fifth step becomes equal to or greater than the predetermined threshold.
According to this imaging condition determination method, appropriate imaging conditions can be determined that enable the imaging device to output an image in which regions made of two different materials included in the subject can be distinguished by color or density.
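The repeat-until-threshold loop over the first to fifth steps can be sketched as follows. All the callables and candidate lists are caller-supplied stand-ins; how candidate coefficients and wavelengths are enumerated is not specified in this disclosure.

```python
def determine_imaging_conditions(capture_rgb, apply_matrix, measure_density_diff,
                                 matrix_candidates, wavelength_candidates,
                                 threshold):
    """Search coefficient/wavelength combinations (steps 1-5) until the
    density difference between the two material regions meets the threshold."""
    for matrix in matrix_candidates:              # step 1: set matrix coefficients
        for wavelength in wavelength_candidates:  # step 2: change emission wavelength
            rgb = capture_rgb(wavelength)         # steps 2-3: illuminate and image
            xyz = apply_matrix(rgb, matrix)       # step 4: generate the XYZ image
            diff = measure_density_diff(xyz)      # step 5: density difference
            if diff >= threshold:
                return matrix, wavelength
    return None  # no combination satisfied the threshold

# Toy stand-ins: here the "density difference" is just matrix * wavelength.
found = determine_imaging_conditions(
    capture_rgb=lambda w: w,
    apply_matrix=lambda rgb, m: rgb * m,
    measure_density_diff=lambda xyz: xyz,
    matrix_candidates=[1, 2],
    wavelength_candidates=[3, 4],
    threshold=7,
)
```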
An imaging method that solves the above problems is an imaging method that images a subject and outputs an image in which regions made of two different materials included in the subject can be identified. The method uses an imaging unit that images the subject and a monochromatic light source that illuminates the subject with monochromatic light. The imaging unit includes an image sensor having RGB pixels, and the RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region. The first wavelength region and the second wavelength region partially overlap in an overlapping wavelength region, and the monochromatic light is light in an emission wavelength region that includes at least a part of the first wavelength region and at least a part of the overlapping wavelength region. The method includes: an imaging step in which the imaging unit images the subject illuminated by the monochromatic light source; a calculation processing step of generating a three-band XYZ image with corrected RGB imaging characteristics by performing linear matrix calculation processing on the three-band RGB image output from the imaging unit that has imaged the subject; and an amplification step of amplifying the two images corresponding to the first pixel and the second pixel among the three-band XYZ images, using gains set to color-balance the output levels of the first pixel and the second pixel between the two colors under the condition that the monochromatic light is irradiated from the monochromatic light source.
According to this imaging method, an image in which regions made of two different materials included in the subject can be identified by color or density can be output. For example, by using the image output by this method, the two different materials included in the image can be identified or classified with relatively high accuracy.
According to the present invention, it is possible to output an image in which regions made of two different materials included in a subject can be identified by color or density.
FIG. 1 is a schematic side view showing an inspection device including an imaging device in an embodiment.
FIG. 2 is a schematic plan view showing a subject.
FIG. 3 is a schematic diagram showing the configuration of a camera, together with a graph showing the relative sensitivity of the camera for each of the RGB colors.
FIG. 4 is a partial schematic diagram showing the detailed configuration of a monochromatic light source.
FIG. 5 is a block diagram showing the functional configuration of the inspection device.
FIG. 6 is a graph explaining a case where different types of imaging targets are difficult to distinguish.
FIG. 7 is a graph explaining the principle that makes different types of imaging targets easier to distinguish.
FIG. 8 is a graph explaining a case where an image captured with a general-purpose camera makes different types of imaging targets difficult to distinguish.
FIG. 9 is a graph showing the densities of white colonies and yellow colonies in an RGB image.
FIG. 10 is a graph showing the relationship between the wavelength and the output power (light intensity) of the light sources.
FIG. 11 is a graph showing the relationship between wavelength and relative output of an RGB image sensor when illuminated with blue and red monochromatic light.
FIG. 12 is a graph showing the relationship between wavelength and relative output after linear matrix calculation processing.
FIG. 13 is a graph explaining the principle by which the output image of the imaging device, captured using a general-purpose camera, distinguishes different types of imaging targets.
FIG. 14 is a graph showing the densities of white colonies and yellow colonies in an XYZ image.
FIG. 15 shows (a) an XYZ image, (b) an X image, (c) a Y image, and (d) a Z image.
FIG. 16 is a flowchart showing an initial setting routine.
FIG. 17 is a flowchart showing an imaging processing routine.
FIG. 18 is a flowchart showing an inspection processing routine.
FIG. 19 is a flowchart showing an imaging condition determination processing routine.
FIG. 20 is a graph showing the relationship between the wavelength and the output power (light intensity) of the light sources in a modified example.
FIG. 21 is a graph showing the relationship between wavelength and relative output after linear matrix calculation processing when illuminated with green and red monochromatic light in a modified example.
FIG. 22 is a graph explaining the principle by which the output image of the imaging device, captured using a general-purpose camera, distinguishes different types of imaging targets in a modified example.
Hereinafter, an imaging device and an inspection device according to an embodiment will be described with reference to the drawings.
In this embodiment, an imaging device 11 that images a subject 12 is provided in an inspection device 10 that inspects the subject 12. Note that the imaging device 11 may be provided in a device other than the inspection device 10, or may be used alone.
<Inspection Device>
As shown in FIG. 1, the inspection device 10 includes an imaging device 11 that images the subject 12 and an inspection processing unit 60 that inspects the subject 12 using the images input from the imaging device 11. The inspection device 10 may include a display unit 70 that displays the inspection results of the inspection processing unit 60 and the like. The display unit 70 may display an image output from the imaging device 11, an image obtained by processing that image, and so on. The display unit 70 may be, for example, a monitor connected to a personal computer, or a display provided on an operation panel of the inspection device 10.
The imaging device 11 includes: a monochromatic light source 20 that illuminates the subject 12 with monochromatic light; an imaging unit 30 that images the subject 12; a control unit 40 that controls the monochromatic light source 20 and the imaging unit 30; and an image processing unit 50 that performs predetermined processing on the imaging signal that the imaging unit 30 outputs as its imaging result. The imaging device 11 also includes a power supply 45 that supplies power to the monochromatic light source 20. The imaging unit 30 is, for example, a color camera 31. The color camera 31 (hereinafter also simply referred to as the "camera 31") is electrically connected to the processing unit 18. The processing unit 18 includes the aforementioned control unit 40 and image processing unit 50, and may be configured by at least one of an electronic circuit and a computer.
The monochromatic light source 20 may be a single type of monochromatic light source that emits one type of monochromatic light, or may include two types of monochromatic light sources that emit two types of monochromatic light; it may even include three types of monochromatic light sources that emit three types of monochromatic light. When a plurality of types of monochromatic light sources are provided, the control unit 40 may cause one or two types of them to emit light. Here, one type of monochromatic light source means any one of a blue light source, a red light source, and a green light source.
The monochromatic light source is configured by, for example, LEDs. The monochromatic light source may be, for example, any one of a blue LED, a red LED, and a green LED. The monochromatic light source 20 of this example includes at least a blue LED and a red LED. Therefore, in the example shown in FIG. 1, the power supply 45 includes at least a blue LED power supply 46 and a red LED power supply 47.
The control unit 40 includes an imaging control unit 41 that controls the imaging unit 30 and an illumination control unit 42 that controls the monochromatic light source 20. The illumination control unit 42 controls turning the monochromatic light source 20 on and off via the power supply 45. When the monochromatic light source 20 includes a plurality of types of monochromatic light sources (LEDs), the illumination control unit 42 individually controls turning each monochromatic light source on and off via the power supply 45. The subject 12 is thus illuminated with monochromatic light by the monochromatic light source 20. Note that the monochromatic light source 20 may illuminate the subject 12 with two types of monochromatic light. In short, as long as all three types of monochromatic light sources (blue light source, red light source, green light source) do not emit their three types of monochromatic light simultaneously, which would illuminate the subject 12 with white light, the subject 12 may be illuminated with two types of monochromatic light appropriately selected from blue light, red light, and green light. The inspection device 10 may also include a robot that transports objects such as the petri dish 13 serving as the subject 12 and a reference sheet for color gain adjustment to predetermined positions. In this case, the control unit 40 may control the robot, or a control unit separate from the control unit 40 may control the robot.
The image processing unit 50 performs predetermined image processing on the three-band RGB first imaging signal S1 output from the camera 31 constituting the imaging unit 30 to generate a three-band XYZ second imaging signal S2 (see FIG. 5 for both). Note that RGB in the first imaging signal S1 refers to the three colors determined by the wavelength regions that the R, G, and B pixels constituting the image sensor 33 of the camera 31 can each receive. XYZ in the second imaging signal S2 refers to the three colors determined by the wavelength regions of the XYZ image obtained by converting the RGB image constituting the first imaging signal S1 into an XYZ image whose wavelength regions differ from those of RGB.
The inspection processing unit 60 inspects the subject 12 using the XYZ image based on the second imaging signal S2 input from the image processing unit 50.
As shown in FIGS. 1 and 2, the subject 12 includes colonies WC and YC cultured in a petri dish 13. That is, the subject 12 includes the petri dish 13, a culture medium 14 in the petri dish 13, and the colonies WC and YC cultured on the medium 14. Note that the colonies WC and YC do not appear at the beginning of the culture period or when no bacteria are present, so depending on the subject 12 the colonies WC and YC may not exist.
In the examples shown in FIGS. 1 and 2, two or more different types of colonies WC and YC exist; here, the different colonies WC and YC are two types with similar colors. Foreign matter FS as shown in FIG. 2 may also be present in the medium 14. For example, when inspecting food or drinking water for the presence of bacteria, dietary fiber, pieces of food, seeds, skin, fruit pulp, and the like contained in the food or drinking water become foreign matter FS. Foreign matter FS may be mistakenly identified as colonies WC and YC. Therefore, the inspection processing unit 60 distinguishes the colonies WC, YC from the foreign matter FS based on features, for example characteristic factors such as shape, area, and color. Colonies WC and YC generally have a circular or elliptical shape, and their size is within a predetermined range. When the foreign matter FS is a fiber, its shape is elongated; when it is fruit pulp, a seed, skin, or the like, its shape differs from that of the colonies WC and YC. For example, if the aspect ratio of the circumscribed rectangle is used as the shape feature factor, a candidate whose aspect ratio is within a predetermined range close to 1 can be identified as a colony WC or YC, and one outside that range can be identified as foreign matter FS. The inspection processing unit 60 also uses an area (size) threshold to distinguish colony candidates that are colonies WC, YC from other candidates.
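The aspect ratio and area test just described can be sketched as a small classifier. The threshold values below are hypothetical examples only; the disclosure states that the actual ranges are predetermined for the inspection target.

```python
def classify_candidate(width, height, area,
                       aspect_range=(0.7, 1.4), area_range=(10, 500)):
    """Classify a candidate region as a colony or foreign matter from the
    aspect ratio of its circumscribed rectangle and its area.
    Threshold values here are hypothetical examples."""
    aspect = width / height if height else float('inf')
    round_enough = aspect_range[0] <= aspect <= aspect_range[1]  # near 1: roughly circular
    sized_right = area_range[0] <= area <= area_range[1]         # within colony size range
    return 'colony' if (round_enough and sized_right) else 'foreign matter'
```

An elongated fiber (e.g. a 40 x 5 bounding box, aspect ratio 8) or an undersized speck would fall outside the ranges and be rejected, while a roughly square bounding box of moderate area would be accepted as a colony.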
As shown in FIGS. 1 and 2, the petri dish 13 is placed on a background sheet 15 (background board). The background sheet 15 serves as the background of the petri dish 13, which is the subject 12. The background sheet 15 may be, for example, a black diffusion plate. By making the background sheet 15 black, it may be used as the black reference for black balance. The petri dish 13 is made of a colorless and transparent material (for example, glass). The culture medium 14 contained in the petri dish 13 is colored as necessary; in that case, the medium 14 may be translucent with a predetermined color, and in the image captured by the camera 31 the black of the background sheet 15 may show through the medium 14. When testing food or drinking water, the medium 14 may also be colored by the color of the food or drinking water. Note that the black background sheet 15 may be a black plate, such as a black sheet, that has no function of diffusing light. When the background sheet 15 is used as the reference for black balance, it is not limited to a black (achromatic) sheet; a chromatic sheet may also be used.
The two types of colonies schematically shown in FIGS. 1 and 2 are white colonies WC and yellow colonies YC. The white colonies WC and the yellow colonies YC are different types of colonies, so for inspection purposes they need to be distinguished. However, white and yellow are similar colors, so the two are difficult to distinguish in an RGB image of the subject 12 captured by the camera 31. In the R, G, and B images constituting the RGB image as well, the white colonies WC and the yellow colonies YC have similar densities and are hard to tell apart. Thus, in the RGB image obtained by imaging the subject 12 with the camera 31, the white colonies WC and the yellow colonies YC are difficult to distinguish. In this embodiment, the image processing unit 50 applies predetermined image processing to the first imaging signal S1 from the camera 31 to generate an XYZ image in which the white colonies WC and the yellow colonies YC are easy to distinguish. Details of the XYZ image will be described later.
<Configuration of Camera 31>
Next, the internal structure of the camera 31 constituting the imaging unit 30 will be described with reference to FIG. 3. As shown in FIG. 3, the camera 31 is a general-purpose color camera that captures RGB images. The camera 31 includes a lens 32 assembled in a lens barrel 31a, a near-infrared cut filter 35 (hereinafter also referred to as the IR cut filter 35) that blocks near-infrared light, and an image sensor 33.
The image sensor 33 includes an R pixel 33R, a G pixel 33G, and a B pixel 33B, each consisting of a light-receiving element. The R pixel 33R receives red light (R light) that has passed through the R filter 34R and outputs an R pixel signal according to the amount of received light. The G pixel 33G receives green light (G light) that has passed through the G filter 34G and outputs a G pixel signal according to the amount of received light. The B pixel 33B receives blue light (B light) that has passed through the B filter 34B and outputs a B pixel signal according to the amount of received light. In the image sensor 33, the R pixels 33R, G pixels 33G, and B pixels 33B are arranged in a predetermined array.
This image sensor 33 has RGB imaging characteristics with near-infrared light cut. The R pixel 33R, G pixel 33G, and B pixel 33B are sensitive to light in the respective wavelength bands shown in the graph in FIG. 3, in which the horizontal axis is wavelength and the vertical axis is relative sensitivity. The R pixel 33R has high sensitivity to light in the red (R) wavelength band shown in the graph, the G pixel 33G to light in the green (G) wavelength band, and the B pixel 33B to light in the blue (B) wavelength band.
<Configuration of Monochromatic Light Source 20>
Next, the configuration of the monochromatic light source 20 will be described with reference to FIG. 4.
As shown in FIG. 4, the monochromatic light source 20 includes monochromatic light sources of three colors (three types): a blue light source 21, a green light source 22, and a red light source 23. The blue light source 21 emits blue light (B light) as monochromatic light, the green light source 22 emits green light (G light), and the red light source 23 emits red light (R light). When imaging the subject 12, the illumination control unit 42 of this embodiment illuminates the subject 12 with blue light (B light) by causing the blue light source 21 to emit light as the monochromatic light source. Note that when the emission color (light source color) is not of particular concern, the blue light source 21, green light source 22, and red light source 23 constituting the monochromatic light source 20 may be referred to as monochromatic light sources 21 to 23.
 ここで、単色光源20は、白色光源の機能を備えてもよい。すなわち、単色光源20は、3種類の単色光源21~23を同時に発光させることで、白色光を発光させる白色光源機能も備える。この場合、制御部40は、白色光源においてRGB3色の単色光のそれぞれを発光可能な3種類の単色光源21~23のうちいずれか2種類(2色)を発光させることで、白色光源を単色光源として機能させる。本明細書では、このような単色光源の機能を備える白色光源も単色光源20に含まれる。なお、1種類又は2種類の単色光で照明して被写体12を撮像する第1撮像と、白色光で照明して被写体12を撮像する第2撮像とを時分割で行ってもよい。この場合、白色光で撮像した画像も検査に用いてもよい。 Here, the monochromatic light source 20 may have the function of a white light source. That is, the monochromatic light source 20 also has a white-light-source function of emitting white light by lighting the three types of monochromatic light sources 21 to 23 at the same time. In this case, the control unit 40 makes the white light source function as a monochromatic light source by lighting any two (two colors) of the three monochromatic light sources 21 to 23, which can each emit one of the three RGB monochromatic colors. In this specification, a white light source having such a monochromatic light source function is also included in the monochromatic light source 20. Note that first imaging, in which the subject 12 is imaged under illumination with one or two types of monochromatic light, and second imaging, in which the subject 12 is imaged under white-light illumination, may be performed in a time-division manner. In this case, the images captured under white light may also be used for the inspection.
 制御部40が、どの単色光源を発光させるかは、検査対象のコロニーの分光反射率特性から決定される。ここで、被写体12の撮像時に同時に発光させる2種類の単色光源のうち1種類を主となる単色光源である第1単色光源25、他の1種類を補助光源である第2単色光源26とする。本例では、図4に示すように、青色光源21を第1単色光源25とし、赤色光源23を第2単色光源26とする。すなわち、青色光源21を主となる第1単色光源25とし、第1単色光源25が被写体12を照明する第1単色光(本例ではB光)の発光波長領域とは異なる発光波長領域の第2単色光(本例ではR光)を発光可能な赤色光源23を補助光源である第2単色光源26とする。制御部40は、青色光(B光)を主たる単色光として被写体12を照明し、赤色光(R光)を補助光として被写体12を照明する。第1単色光源25と第2単色光源26との組み合わせを、RGB3色のうちどの2色とするかは、検査対象の分光反射率特性に応じて決定される。 The control unit 40 determines which monochromatic light sources to turn on based on the spectral reflectance characteristics of the colonies to be inspected. Here, of the two types of monochromatic light sources that emit simultaneously when imaging the subject 12, one is the first monochromatic light source 25 serving as the main light source, and the other is the second monochromatic light source 26 serving as the auxiliary light source. In this example, as shown in FIG. 4, the blue light source 21 is used as the first monochromatic light source 25 and the red light source 23 as the second monochromatic light source 26. That is, the blue light source 21 serves as the main first monochromatic light source 25, and the red light source 23, which can emit second monochromatic light (R light in this example) in an emission wavelength region different from that of the first monochromatic light (B light in this example) with which the first monochromatic light source 25 illuminates the subject 12, serves as the second monochromatic light source 26, an auxiliary light source. The control unit 40 illuminates the subject 12 with blue light (B light) as the main monochromatic light and with red light (R light) as auxiliary light. Which two of the three RGB colors are combined as the first monochromatic light source 25 and the second monochromatic light source 26 is determined according to the spectral reflectance characteristics of the object to be inspected.
 単色光源21~23は、例えば、LEDである。本例では、青色光源21は青色LEDであり、緑色光源22は緑色LEDであり、赤色光源23は、赤色LEDである。制御部40は、3種類の単色光源21~23を個々に独立してOn/Off制御することが可能である。制御部40は、被写体12を撮像するときは、図1に示す例では、検査対象の白色コロニーWCと黄色コロニーYCとを識別するために予め行った撮像条件探索結果から決定された青色LEDを点灯させ、これに加えて補助として赤色LEDを点灯させる。なお、照明に用いる単色光の決定方法を含む撮像条件決定方法については後述する。 The monochromatic light sources 21 to 23 are, for example, LEDs. In this example, the blue light source 21 is a blue LED, the green light source 22 is a green LED, and the red light source 23 is a red LED. The control unit 40 can turn the three monochromatic light sources 21 to 23 on and off individually and independently. When imaging the subject 12, in the example shown in FIG. 1, the control unit 40 lights the blue LED, which was determined from the results of an imaging-condition search performed in advance so as to distinguish the white colonies WC and the yellow colonies YC to be inspected, and additionally lights the red LED as an auxiliary light. A method for determining imaging conditions, including how the monochromatic light used for illumination is chosen, will be described later.
 <画像処理部50及び検査処理部60の構成>
 次に、図5を参照して、画像処理部50及び検査処理部60の詳細な構成を説明する。
 図5に示すように、レンズ32及びIRカットフィルタ35を通して被写体12の像がイメージセンサ33の撮像面に結像される。イメージセンサ33は、被写体12の撮像結果として第1撮像信号S1を画像処理部50に出力する。第1撮像信号S1は、各画素33R,33G,33BからのR撮像信号(レッド信号)、G撮像信号(グリーン信号)及びB撮像信号(ブルー信号)を含むシリアル信号である。なお、R撮像信号、G撮像信号及びB撮像信号を、単にR信号、G信号及びB信号ともいう。
<Configuration of image processing unit 50 and inspection processing unit 60>
Next, detailed configurations of the image processing section 50 and the inspection processing section 60 will be described with reference to FIG. 5.
As shown in FIG. 5, an image of the subject 12 is formed on the imaging surface of the image sensor 33 through the lens 32 and the IR cut filter 35. The image sensor 33 outputs the first imaging signal S1 as the imaging result of the subject 12 to the image processing unit 50. The first imaging signal S1 is a serial signal including an R imaging signal (red signal), a G imaging signal (green signal), and a B imaging signal (blue signal) from each pixel 33R, 33G, and 33B. Note that the R imaging signal, the G imaging signal, and the B imaging signal are also simply referred to as an R signal, a G signal, and a B signal.
 図5に示すように、画像処理部50は、RGB分離部51、演算処理部52及び増幅部53を備える。RGB分離部51は、イメージセンサ33から入力した第1撮像信号S1を、R撮像信号、G撮像信号及びB撮像信号に分離する。 As shown in FIG. 5, the image processing section 50 includes an RGB separation section 51, an arithmetic processing section 52, and an amplification section 53. The RGB separation unit 51 separates the first image signal S1 input from the image sensor 33 into an R image signal, a G image signal, and a B image signal.
 演算処理部52は、RGB分離部51から入力したR信号、G信号及びB信号を、X信号、Y信号及びZ信号に変換する。詳しくは、演算処理部52は、R信号、G信号及びB信号の信号値であるRGB値に対してリニアマトリックス演算を施すことにより、X信号、Y信号及びZ信号に変換する。演算処理部52には、マトリックス係数が与えられる。ここで、リニアマトリックス演算に用いられるマトリックスは、3×3マトリックスである。演算処理部52には、3×3マトリックスの係数が与えられる。 The arithmetic processing unit 52 converts the R signal, G signal, and B signal input from the RGB separation unit 51 into an X signal, a Y signal, and a Z signal. Specifically, the arithmetic processing unit 52 converts the RGB values, which are the signal values of the R signal, G signal, and B signal, into the X signal, Y signal, and Z signal by performing a linear matrix calculation. The arithmetic processing unit 52 is given matrix coefficients. Here, the matrix used for the linear matrix calculation is a 3×3 matrix. The arithmetic processing unit 52 is given coefficients of a 3×3 matrix.
 演算処理部52は、マトリックス係数を用いて特定される3×3マトリックスを、第1撮像信号S1のRGB値に対して乗算するリニアマトリックス演算を行い、第1撮像信号S1のRGBと異なる分光特性を持つXYZで表される第2撮像信号S2に変換する。マトリックス係数は、第1撮像信号S1のRGBを、第2撮像信号S2のXYZバンドに分光させるための係数である。 The arithmetic processing unit 52 performs a linear matrix operation that multiplies the RGB values of the first imaging signal S1 by the 3×3 matrix specified by the matrix coefficients, thereby converting them into the second imaging signal S2, expressed in XYZ, which has spectral characteristics different from the RGB of the first imaging signal S1. The matrix coefficients are coefficients for separating the RGB of the first imaging signal S1 into the XYZ bands of the second imaging signal S2.
 ここで、第1撮像信号S1であるRGB信号を、第2撮像信号S2であるXYZ信号に変換する計算式は、下記の(1)式で与えられる。 Here, the calculation formula for converting the RGB signal, which is the first imaging signal S1, into the XYZ signal, which is the second imaging signal S2, is given by the following equation (1).
  X = Gx*(a1*R + a2*G + a3*B)
  Y = Gy*(b1*R + b2*G + b3*B)   …(1)
  Z = Gz*(c1*R + c2*G + c3*B)
 ここで、a1~a3,b1~b3,c1~c3はマトリックス係数であり、Gx,Gy,Gzはゲイン(増幅率)である。これらのマトリックス係数は、図5に示すマトリックス係数設定部54に設定されている。また、ゲインGx,Gy,Gzは、図5に示すゲイン設定部55に設定されている。ゲイン設定部55には、第1単色光源25から第1単色光が照射された条件下で、イメージセンサ33のRGB画素のうち第1画素と第2画素との各出力レベルを2色間で色バランスをとることが可能な値のゲインが設定されている。図12に示す例では、ゲイン設定部55には、第1単色光源25である青色光源21から第1単色光であるB光が照射された条件下で、第1画素であるB画素33Bと第2画素であるG画素33Gとの各出力レベルを2色(B色とG色)間で色バランスをとることが可能な値のゲインが設定されている。 Here, a1 to a3, b1 to b3, and c1 to c3 are matrix coefficients, and Gx, Gy, and Gz are gains (amplification factors). The matrix coefficients are set in the matrix coefficient setting unit 54 shown in FIG. 5, and the gains Gx, Gy, and Gz are set in the gain setting unit 55 shown in FIG. 5. The gain setting unit 55 holds gain values that allow the output levels of a first pixel and a second pixel among the RGB pixels of the image sensor 33 to be color-balanced between the two colors under the condition that the first monochromatic light is irradiated from the first monochromatic light source 25. In the example shown in FIG. 12, the gain setting unit 55 holds gain values that allow the output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel, to be color-balanced between the two colors (B and G) under the condition that B light, the first monochromatic light, is irradiated from the blue light source 21, the first monochromatic light source 25.
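The two-color balancing held in the gain setting unit 55 can be illustrated with a minimal numeric sketch. The measured output levels below are hypothetical stand-ins for the raw B-pixel and G-pixel levels under B-light illumination; actual values would come from calibration measurements.

```python
# Hypothetical raw pixel output levels measured under B-light illumination.
measured_b = 0.80  # B pixel 33B (first pixel) - assumed value
measured_g = 0.32  # G pixel 33G (second pixel) - assumed value

# Choose gains so that the two channels reach the same level (color balance).
gain_b = 1.0
gain_g = measured_b / measured_g  # scale the weaker G channel up

balanced_b = measured_b * gain_b
balanced_g = measured_g * gain_g
print(gain_g, balanced_b, balanced_g)
```

After balancing, both channels sit at the same level, so any remaining B-G difference in a captured image reflects the subject rather than the illumination.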
 演算処理部52は、上記(1)式のうち、RGB値に対して3×3マトリックスを乗算するリニアマトリックス演算処理を行う。演算処理部52は、ゲイン(増幅率)が乗算される前のXYZ値を増幅部53に出力する。3×3マトリックスには、3バンドの分離性を高くできるマトリックス係数が設定されている。 The arithmetic processing unit 52 performs a linear matrix arithmetic process of multiplying the RGB values by a 3×3 matrix in the above equation (1). The arithmetic processing unit 52 outputs the XYZ values before being multiplied by a gain (amplification factor) to the amplification unit 53. Matrix coefficients are set in the 3×3 matrix that can improve the separation of the three bands.
 例えば、図10に示す単色光源である青色光源21と赤色光源23とが照明に使用される場合、図11から図12及び図13への変換のマトリックス演算に、次の3×3マトリックスの係数が与えられる。すなわち、3×3マトリックスの係数として、a1=2、a2=-0.04、a3=0、b1=-0.55、b2=1、b3=-0.25、c1=0、c2=-0.15、c3=1が与えられる。これらのマトリックス係数は、図5に示すマトリックス係数設定部54に設定されている。これは、一例であって、他のマトリックス係数を設定してよい。 For example, when the blue light source 21 and the red light source 23, the monochromatic light sources shown in FIG. 10, are used for illumination, the following 3×3 matrix coefficients are given for the matrix operation that converts FIG. 11 into FIG. 12 and FIG. 13. That is, the coefficients a1=2, a2=-0.04, a3=0, b1=-0.55, b2=1, b3=-0.25, c1=0, c2=-0.15, and c3=1 are given as the coefficients of the 3×3 matrix. These matrix coefficients are set in the matrix coefficient setting unit 54 shown in FIG. 5. This is only an example, and other matrix coefficients may be set.
 増幅部53は、演算処理部52からの正規化前のXYZ値に、ゲイン設定部55に設定されたXゲインGx,YゲインGy,ZゲインGzをそれぞれ乗算する。増幅部53は、XYZ変換後のX値にXゲインGxを乗算し、Y値にYゲインGyを乗算し、Z値にZゲインGzを乗算する。増幅部53は、正規化されたXYZ値を、第2撮像信号S2として出力する。本例では、増幅部53は、青色光源21から単色光(B光)が照射された条件下でB画素33BとG画素33Gとの各出力レベルを2色間で色バランスをとることが可能な値に設定される。こうして、画像処理部50は、入力した第1撮像信号S1に対して、RGB分離処理、リニアマトリックス演算処理(XYZ変換処理)及び正規化処理を順次行うことで、第2撮像信号S2を出力する。こうして3バンドのRGB画像信号が、3バンドのXYZ画像信号に変換される。 The amplification unit 53 multiplies the pre-normalization XYZ values from the arithmetic processing unit 52 by the X gain Gx, the Y gain Gy, and the Z gain Gz set in the gain setting unit 55: the X value after XYZ conversion is multiplied by the X gain Gx, the Y value by the Y gain Gy, and the Z value by the Z gain Gz. The amplification unit 53 outputs the normalized XYZ values as the second imaging signal S2. In this example, the gains of the amplification unit 53 are set to values that allow the output levels of the B pixel 33B and the G pixel 33G to be color-balanced between the two colors under illumination with monochromatic light (B light) from the blue light source 21. In this way, the image processing unit 50 sequentially performs RGB separation, the linear matrix operation (XYZ conversion), and normalization on the input first imaging signal S1 and outputs the second imaging signal S2. The three-band RGB image signal is thus converted into a three-band XYZ image signal.
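As a concrete illustration, the linear matrix operation and gain normalization of equation (1) can be sketched as follows. The matrix uses the example coefficients quoted in the text (a1=2, a2=-0.04, a3=0, b1=-0.55, b2=1, b3=-0.25, c1=0, c2=-0.15, c3=1); the gain values and the pixel values are hypothetical.

```python
import numpy as np

# Example 3x3 matrix coefficients from the text.
M = np.array([
    [2.0,  -0.04,  0.0 ],   # a1, a2, a3
    [-0.55, 1.0,  -0.25],   # b1, b2, b3
    [0.0,  -0.15,  1.0 ],   # c1, c2, c3
])
gains = np.array([1.0, 1.2, 1.0])  # Gx, Gy, Gz (hypothetical values)

def rgb_to_xyz(rgb):
    """Equation (1): linear matrix operation, then per-band gain multiplication."""
    xyz = rgb @ M.T       # arithmetic processing unit 52
    return xyz * gains    # amplification unit 53

pixel = np.array([0.5, 0.4, 0.1])  # one RGB pixel, illustrative values
print(rgb_to_xyz(pixel))
```

Applied to a whole image, the same function accepts an (H, W, 3) array unchanged, since the matrix multiply broadcasts over the pixel axes.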
 <検査処理部60の構成>
 次に、図5を参照して、検査処理部60の構成について説明する。
 検査処理部60は、識別部61及び判定部62を備える。識別部61は、分類部63、計数部64及び特定部65を備える。
<Configuration of inspection processing unit 60>
Next, the configuration of the inspection processing section 60 will be explained with reference to FIG. 5.
The inspection processing section 60 includes an identification section 61 and a determination section 62. The identification section 61 includes a classification section 63, a counting section 64, and a specifying section 65.
 被写体12は、2種類の異なる素材として2種類の異なるコロニーWC,YCを含む。識別部61は、撮像装置11から入力した3バンドのXYZ画像に基づいて2種類の異なるコロニーWC,YCの領域であるコロニー領域を識別する。XYZ画像は、X画像、Y画像及びZ画像を含む。識別部61は、X画像を構成する画素値が第1閾値で設定された範囲内にある領域を第1コロニー領域とする。また、識別部61は、Y画像を構成する画素値が第2閾値で設定された範囲内にある領域を第2コロニー領域とする。さらに、識別部61は、Z画像を構成する画素値が第3閾値で設定された範囲内にある領域を第3コロニー領域とする。識別部61は、第1コロニー領域又は第3コロニー領域から、第2コロニー領域を排除した残りのコロニー領域を第4コロニー領域とする。例えば、第1コロニー領域又は第3コロニー領域は、被写体12内の全てコロニーWC,YCの領域である。第2コロニー領域は、白色コロニーWC又は黄色コロニーYCの領域である。ここで述べた第4コロニー領域の演算式は、図12の場合の例であり、識別したいコロニーの特性によっては、第4コロニー領域の演算式は変わる。なお、第1閾値~第3閾値は、画素値の特定範囲を設定するための閾値であり、中間閾値の1種類の閾値だけではなく、上限閾値、下限閾値の2種類の閾値、あるいは2区間範囲を設定するための4種類の閾値でもよい。 The subject 12 includes two different types of colonies WC and YC as two different types of materials. The identification unit 61 identifies colony regions, which are the regions of the two different types of colonies WC and YC, based on the three-band XYZ image input from the imaging device 11. The XYZ image includes an X image, a Y image, and a Z image. The identification unit 61 takes, as a first colony region, a region in which the pixel values constituting the X image are within the range set by a first threshold. It likewise takes, as a second colony region, a region in which the pixel values constituting the Y image are within the range set by a second threshold, and, as a third colony region, a region in which the pixel values constituting the Z image are within the range set by a third threshold. The identification unit 61 then takes, as a fourth colony region, the colony regions remaining after excluding the second colony region from the first colony region or the third colony region. For example, the first or third colony region covers all the colonies WC and YC in the subject 12, while the second colony region covers either the white colonies WC or the yellow colonies YC. The formula for the fourth colony region described here is an example for the case of FIG. 12; the formula changes depending on the characteristics of the colonies to be identified. Note that the first to third thresholds are thresholds for setting a specific range of pixel values; instead of a single intermediate threshold, two thresholds (an upper and a lower threshold), or four thresholds for setting two ranges, may be used.
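The threshold-based region extraction described above can be sketched with NumPy boolean masks. The band values and threshold ranges below are invented for illustration; in practice the thresholds would be tuned to the colonies being identified, and the combination rule for the fourth region would change accordingly.

```python
import numpy as np

# Toy X, Y, Z band images (2x3 pixels); values are hypothetical.
X = np.array([[ 10, 200,  30], [220,  40, 210]])
Y = np.array([[  5, 180,  20], [ 60,  30,  50]])
Z = np.array([[ 15, 210,  25], [230,  45, 220]])

def in_range(img, lo, hi):
    """Pixels whose value lies in the specific range set by the thresholds."""
    return (img >= lo) & (img <= hi)

region1 = in_range(X, 150, 255)  # first colony region (all colonies)
region2 = in_range(Y, 150, 255)  # second colony region (one colony type)
region3 = in_range(Z, 150, 255)  # third colony region

# Fourth colony region: (first OR third) with the second region excluded.
region4 = (region1 | region3) & ~region2
print(region4.astype(int))
```

The resulting boolean masks correspond to the binarized images from which the inspection target regions are extracted.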
 識別部61は、閾値の範囲に特定した領域とそれ以外の領域とを閾値で二値化し、二値化で抽出された検査対象の領域を特定する。
 分類部63は、第1コロニー領域及び第3コロニー領域のうちの少なくとも一方と、第2コロニー領域と、第4コロニー領域とを、形状及びサイズ(面積)等の特徴に基づいて分類する。
The identification unit 61 binarizes the area specified within the threshold range and the other areas using the threshold value, and specifies the area to be inspected extracted by the binarization.
The classification unit 63 classifies at least one of the first colony area and the third colony area, the second colony area, and the fourth colony area based on characteristics such as shape and size (area).
 計数部64は、分類部63が分類したコロニー領域のそれぞれの数を計数する。
 特定部65は、計数部64の計数結果に基づいて識別対象である2種類のコロニー領域の数をそれぞれ特定する。
The counting unit 64 counts the number of each colony area classified by the classification unit 63.
The specifying unit 65 specifies the number of two types of colony areas to be identified based on the counting results of the counting unit 64.
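The counting performed by the counting unit 64 can be sketched as a connected-component count over a binarized colony mask. This pure-Python version uses 4-connectivity and is only a minimal stand-in; a production system would, like the classification unit 63, also filter regions by shape and area before counting.

```python
def count_regions(mask):
    """Count 4-connected regions of truthy cells in a 2D binary grid."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                count += 1          # new region found; flood-fill it
                seen[i][j] = True
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

mask = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 1],
]
print(count_regions(mask))  # a 2-pixel blob, a lone pixel, a vertical pair
```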
 判定部62は、2種類のコロニー領域の各計数値に基づいて被写体12のシャーレ13でコロニーを培養した結果として得られる検査対象の良否を判定する。なお、検査装置10が、コロニーカウンタである場合は、検査対象の良否を判定する判定機能を補助機能として備えたり、判定部62による判定機能を備えない構成でもよい。また、検査装置10の検査対象がコロニーでない場合は、検査対象の良否を判定する判定機能を主機能としてもよい。 The determination unit 62 determines the quality of the test object obtained as a result of culturing colonies in the Petri dish 13 of the subject 12 based on the counts of the two types of colony regions. In addition, when the inspection device 10 is a colony counter, it may be provided with a determination function for determining the quality of the inspection object as an auxiliary function, or may be configured without the determination function by the determination section 62. Furthermore, when the inspection target of the inspection device 10 is not a colony, the main function may be a determination function for determining the quality of the inspection target.
 <検査対象であるコロニーの光反射特性と撮像素子(イメージセンサ)の感度>
 次に、図6を参照して、汎用のカラーカメラを用いて撮像しても、類似した色の2種類のコロニーWC,YCを識別しにくい理由について説明する。また、図7を参照して、本実施形態では同じ汎用のカラーカメラ31を用いて撮像した画像でも、撮像信号に所定の処理を施すことで、2種類のコロニーWC,YCを識別可能な原理について説明する。
<Light reflection characteristics of the colony to be inspected and sensitivity of the imaging device (image sensor)>
Next, with reference to FIG. 6, the reason why it is difficult to distinguish the two similarly colored types of colonies WC and YC even when they are imaged with a general-purpose color camera will be explained. Then, with reference to FIG. 7, the principle by which, in this embodiment, the two types of colonies WC and YC become distinguishable, even in images captured with the same general-purpose color camera 31, by applying predetermined processing to the imaging signal will be explained.
 図6は、白色コロニーWCと黄色コロニーYCの分光反射率特性線LW、LYと、RGB画素のうち一例としてB画素の分光感度特性を示す分光感度特性線Bとの一例を示す。図6のグラフでは、横軸が波長(nm)、縦軸が相対感度を示す。なお、このグラフに示される2種類のコロニーの分光反射率特性線LW,LYは、説明の便宜上の仮定の波形である。一般に、コロニーなどの検査対象の分光反射率特性の測定には、専用の分光反射率測定装置が必要になる。 FIG. 6 shows an example of spectral reflectance characteristic lines LW and LY of white colony WC and yellow colony YC, and spectral sensitivity characteristic line B indicating the spectral sensitivity characteristic of B pixel as an example of RGB pixels. In the graph of FIG. 6, the horizontal axis shows wavelength (nm), and the vertical axis shows relative sensitivity. Note that the spectral reflectance characteristic lines LW and LY of the two types of colonies shown in this graph are hypothetical waveforms for convenience of explanation. Generally, a dedicated spectral reflectance measurement device is required to measure the spectral reflectance characteristics of an inspection target such as a colony.
 白色コロニーWCと黄色コロニーYCの分光反射率特性は近似している。また、シャーレ13内の培地14は所定の色に着色される場合がある。このため、培地14の着色等によって、白色コロニーWCと黄色コロニーYCはさらに識別しにくくなる。 The spectral reflectance characteristics of white colony WC and yellow colony YC are similar. Further, the culture medium 14 in the petri dish 13 may be colored in a predetermined color. Therefore, due to the coloring of the medium 14, it becomes even more difficult to distinguish between the white colony WC and the yellow colony YC.
 図6において、白色コロニーWCの分光反射率特性線LWと、黄色コロニーYCの分光反射率特性線LYは、広い波長範囲に亘り近似している。図6に破線で示すB画素の分光感度特性線Bは、B画素の相対感度を示し、430nm付近にピークをもつ。各コロニーWC,YCの分光反射率特性線LW,LYは、どちらも分光感度特性線Bに重なるピークをもつ。また、どちらも近赤外領域NIRAの所定波長領域にもピークをもつ。 In FIG. 6, the spectral reflectance characteristic line LW of the white colony WC and the spectral reflectance characteristic line LY of the yellow colony YC are similar over a wide wavelength range. A spectral sensitivity characteristic line B of the B pixel shown by a broken line in FIG. 6 indicates the relative sensitivity of the B pixel, and has a peak near 430 nm. The spectral reflectance characteristic lines LW and LY of each colony WC and YC both have peaks that overlap with the spectral sensitivity characteristic line B. Furthermore, both have a peak in a predetermined wavelength region of the near-infrared region NIRA.
 B画素33Bは、図6に破線で示される300~600nmの範囲の光成分のB光(青色~緑色の領域の光)に感度を有する。このため、B画素33Bは、白色コロニーからの反射光のうちの青色成分のB光を受光する。また、B画素33Bは、黄色コロニーからの反射光のうちの青色成分のB光を受光する。 The B pixel 33B is sensitive to B light (light in the blue to green region), which is a light component in the range of 300 to 600 nm, as shown by the broken line in FIG. Therefore, the B pixel 33B receives the blue component of the B light of the reflected light from the white colony. Further, the B pixel 33B receives the blue component of the B light of the reflected light from the yellow colony.
 このとき、B画像における白色コロニーWCの濃度は、図6において、白色コロニーWCの分光反射率特性線LWと分光感度特性線Bとの両方を乗算したLW*Bの領域の面積Swに比例する。また、B画像における黄色コロニーYCの濃度は、図6において、黄色コロニーYCの分光反射率特性線LYと分光感度特性線Bとの両方を乗算したLY*Bの領域の面積Syに比例する。この場合、2つの面積Sw,Syは同じ程度の値である。このため、B画像における2種類のコロニーWC,YCの濃度は同程度の値である。つまり、B画像において、2種類のコロニーWC,YCは識別しにくい。G画像やR画像についても、同様に2種類のコロニーWC,YCの濃度に差があまりない。G画像とR画像のどちらにおいても、2種類のコロニーWC,YCは識別しにくい。 At this time, the density of the white colony WC in the B image is proportional to the area Sw of the region LW*B in FIG. 6, obtained by multiplying the spectral reflectance characteristic line LW of the white colony WC by the spectral sensitivity characteristic line B. Likewise, the density of the yellow colony YC in the B image is proportional to the area Sy of the region LY*B in FIG. 6, obtained by multiplying the spectral reflectance characteristic line LY of the yellow colony YC by the spectral sensitivity characteristic line B. In this case, the two areas Sw and Sy have approximately the same value, so the densities of the two types of colonies WC and YC in the B image are approximately the same. That is, the two types of colonies WC and YC are difficult to distinguish in the B image. Similarly, in the G image and the R image there is little difference in density between the two types of colonies WC and YC, so they are difficult to distinguish in either image.
 一方、図7に示すように、白色コロニーWCと黄色コロニーYCとの2つの分光反射率特性線LW,LYの間の差が比較的大きい波長領域(400~500nm)に限る狭い波長範囲にB画素の分光感度特性があれば、B画像において白色コロニーWCと黄色コロニーYCは識別可能である。すなわち、B画像における白色コロニーWCの濃度は、図7において、白色コロニーWCの分光反射率特性線LWと分光感度特性線Bとの両方を乗算したLW*Bの領域の面積Swに比例する。また、B画像における黄色コロニーYCの濃度は、図7において、黄色コロニーYCの分光反射率特性線LYと分光感度特性線Bとの両方を乗算したLY*Bの領域の面積Syに比例する。この場合、2つの面積Sw,Syに所定値(所定比)以上の差があるので、B画像において、2種類のコロニーWC,YCは識別可能である。 On the other hand, as shown in FIG. 7, if the spectral sensitivity characteristic of the B pixel were confined to the narrow wavelength range (400 to 500 nm) in which the difference between the two spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC is relatively large, the white colony WC and the yellow colony YC would be distinguishable in the B image. That is, the density of the white colony WC in the B image is proportional to the area Sw of the region LW*B in FIG. 7, obtained by multiplying the spectral reflectance characteristic line LW of the white colony WC by the spectral sensitivity characteristic line B, and the density of the yellow colony YC in the B image is proportional to the area Sy of the region LY*B in FIG. 7, obtained by multiplying the spectral reflectance characteristic line LY of the yellow colony YC by the spectral sensitivity characteristic line B. In this case, since the two areas Sw and Sy differ by at least a predetermined value (predetermined ratio), the two types of colonies WC and YC are distinguishable in the B image.
 同様に、図7において、白色コロニーWCと黄色コロニーYCとの2つの分光反射率特性線LW,LYの間の差が比較的大きい近赤外波長領域(800~900nm)に限る狭い破線Nで示す分光感度特性があれば、IR画像において2種類のコロニーWC,YCは識別可能である。なお、図6、図7では、Swは、LW*Bではなく、LWとBとが重なった領域の面積を示すが、LW*Bとの差異は小さいので、これらの図では、LW*Bを、LWとBとが重なった領域の面積で代用する。Syについても同様に、LY*Bを、LYとBとが重なった領域の面積で代用する。 Similarly, in FIG. 7, with a spectral sensitivity characteristic confined to the near-infrared wavelength region (800 to 900 nm), indicated by the narrow broken line N, where the difference between the two spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC is relatively large, the two types of colonies WC and YC would be distinguishable in an IR image. Note that in FIGS. 6 and 7, Sw actually denotes the area of the region where LW and B overlap rather than LW*B; since the difference from LW*B is small, the overlap area is used in these figures as a substitute for LW*B. Similarly for Sy, the area of the region where LY and B overlap is used as a substitute for LY*B.
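The proportionality between image density and the area of the reflectance-sensitivity product can be checked numerically. The Gaussian curves below are invented stand-ins for LW, LY, and the narrowed B sensitivity of FIG. 7, not measured data; the point is only that a narrow sensitivity band concentrated where LW and LY differ yields clearly different areas Sw and Sy.

```python
import numpy as np

wl = np.arange(350.0, 700.0, 0.5)  # wavelength grid in nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical reflectance curves: the white colony reflects more strongly
# in the 400-500 nm window where the two curves differ.
LW = 0.9 * gauss(460, 40)  # white colony WC (assumed)
LY = 0.6 * gauss(480, 40)  # yellow colony YC (assumed)

# Narrow B-pixel sensitivity confined to the 400-500 nm region (FIG. 7 case).
B_narrow = gauss(450, 25)

dw = wl[1] - wl[0]
Sw = np.sum(LW * B_narrow) * dw  # density of WC in the B image is prop. to Sw
Sy = np.sum(LY * B_narrow) * dw  # density of YC in the B image is prop. to Sy
print(Sw / Sy)  # ratio well above 1: the colonies separate by density
```

Repeating the same integral with a broad sensitivity curve (as in FIG. 6) drives the ratio back toward 1, which is exactly why the wide RGB bands of a general-purpose camera fail to separate the colonies.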
 このように、画素の分光感度特性の幅(波長範囲)を狭くしたり分光感度特性の位置(波長)をシフトさせたりすることで、2種類のコロニーWC,YCの分光反射率特性線LW,LYに差がある領域に画素の分光感度特性を設定する。こうすれば、その画像は2種類のコロニーWC,YCを識別可能である。このためには、マルチスペクトルカメラやハイパースペクトルカメラが必要になる。あるいは、発光波長領域が狭い特定波長の光を発光できる特殊な光源が必要になる。しかしながら、これらの特殊なカメラ及び特殊な光源は高額であるため、撮像装置の製造コストは高くなる。 In this way, by narrowing the width (wavelength range) of the spectral sensitivity characteristic of the pixel or shifting the position (wavelength) of the spectral sensitivity characteristic, the spectral reflectance characteristic lines LW, Spectral sensitivity characteristics of pixels are set in areas where there is a difference in LY. In this way, the two types of colonies WC and YC can be identified from the image. This requires a multispectral or hyperspectral camera. Alternatively, a special light source that can emit light of a specific wavelength with a narrow emission wavelength range is required. However, these special cameras and special light sources are expensive, which increases the manufacturing cost of the imaging device.
 本実施形態では、図7に示すような画素の分光感度特性を、汎用のカラーカメラ31を用いて撮像した第1撮像信号S1に所定の処理を施すことで生成する。
 <汎用のカラーカメラ31を用いた画像について>
 次に、図8を参照して、比較例について説明する。
In this embodiment, the spectral sensitivity characteristics of pixels as shown in FIG. 7 are generated by performing predetermined processing on the first image signal S1 captured using a general-purpose color camera 31.
<About images using general-purpose color camera 31>
Next, a comparative example will be described with reference to FIG.
 図8に示すように、汎用のカラーカメラ31に搭載されているイメージセンサ33は、白色LED照明時には、グラフに示すRGBの分光感度特性を有する。このグラフでは、白色コロニーWCと黄色コロニーYCの分光反射率特性線LW,LYが、図6、図7とは異なる。 As shown in FIG. 8, the image sensor 33 mounted on the general-purpose color camera 31 has the RGB spectral sensitivity characteristics shown in the graph when illuminated with white LED. In this graph, the spectral reflectance characteristic lines LW and LY of the white colony WC and the yellow colony YC are different from those in FIGS. 6 and 7.
 図8に示す例では、白色コロニーWCの分光反射率特性線LWと黄色コロニーYCの分光反射率特性線LYは、波長が約480~520nmの領域DAで差がある。この差がある領域DAで、RGBのいずれかにこの領域DAの幅程度の狭い波長領域に感度のピークがあれば、その色の画像において、白色コロニーWCと黄色コロニーYCは濃度で識別可能になる。 In the example shown in FIG. 8, the spectral reflectance characteristic line LW of the white colony WC and the spectral reflectance characteristic line LY of the yellow colony YC differ in the region DA, where the wavelength is about 480 to 520 nm. If any of the R, G, and B pixels had a sensitivity peak in a wavelength region as narrow as the width of this region DA, the white colony WC and the yellow colony YC would become distinguishable by density in the image of that color.
 しかし、汎用のカラーカメラ31では、図8に示すRGB画素いずれも感度を示す分光感度特性線R,G,Bが広い波長領域に分布する。白色コロニーWCの分光反射率特性線LWと黄色コロニーYCの分光反射率特性線LYは、波長が約480~520nm領域でLW>LYの差があるが、約530~580nmの領域では、逆にLW<LYとなっている。そのため、400~700nmで感度を持つR画像、G画像、B画像のいずれにおいても、2種類のコロニーWC,YCの濃度差は小さくなる。 However, in the general-purpose color camera 31, the spectral sensitivity characteristic lines R, G, and B, which indicate the sensitivity of each of the RGB pixels shown in FIG. 8, are distributed over a wide wavelength range. The spectral reflectance characteristic line LW of the white colony WC and the spectral reflectance characteristic line LY of the yellow colony YC have a difference of LW>LY in the wavelength range of about 480 to 520 nm, but on the contrary, in the wavelength range of about 530 to 580 nm. LW<LY. Therefore, the difference in density between the two types of colonies WC and YC becomes small in any of the R, G, and B images that have sensitivity in the range of 400 to 700 nm.
 例えば、G画像における白色コロニーWCの濃度は、図8における白色コロニーWCの分光反射率特性線LW(太線)と、一点鎖線で示されるG画素の分光感度特性線Gとの乗算で算出される領域の面積Sg1で示される。また、G画像における黄色コロニーYCの濃度は、図8における黄色コロニーの分光反射率特性線LY(太い一点鎖線)と、G画像の分光感度特性線Gとの乗算で算出される領域の面積Sg2で示される。そして、2つの面積Sg1,Sg2の差ΔSg(=|Sg1-Sg2|)が全体(G領域全面積)に対して占める比率が小さい。この比率が小さくなると、G画像において2種類のコロニーWC,YCの識別が難しくなっていく。これは、G画素が広い波長範囲に亘って感度を有することが原因である。B画像とR画像においても、同様に、差ΔSb,ΔSr(図示略)が全体(B領域全面積,R領域全面積)に対して占める比率が小さいため、白色コロニーWCと黄色コロニーYCは識別しにくい。なお、図8では、LW*Gを、LWとGとが重なった領域の面積Sg1で代用し、LY*Gを、LYとGとが重なった領域の面積Sg2で代用する。 For example, the density of the white colony WC in the G image is represented by the area Sg1 of the region calculated by multiplying the spectral reflectance characteristic line LW of the white colony WC (thick line) in FIG. 8 by the spectral sensitivity characteristic line G of the G pixel (dash-dotted line). Likewise, the density of the yellow colony YC in the G image is represented by the area Sg2 of the region calculated by multiplying the spectral reflectance characteristic line LY of the yellow colony (thick dash-dotted line) in FIG. 8 by the spectral sensitivity characteristic line G. The difference ΔSg (=|Sg1-Sg2|) between the two areas Sg1 and Sg2 accounts for only a small fraction of the whole (the total area of the G region). The smaller this fraction, the harder it becomes to distinguish the two types of colonies WC and YC in the G image. This is because the G pixel has sensitivity over a wide wavelength range. Similarly, in the B image and the R image, the differences ΔSb and ΔSr (not shown) account for only small fractions of the whole (the total areas of the B and R regions), so the white colony WC and the yellow colony YC are difficult to distinguish. Note that in FIG. 8, the area Sg1 of the region where LW and G overlap is used as a substitute for LW*G, and the area Sg2 of the region where LY and G overlap as a substitute for LY*G.
 図9は、この比較例の汎用のカラーカメラ31で撮像したRGB画像における白色コロニーWCと黄色コロニーYCとの濃度を示す。R画像、G画像、B画像のいずれにおいても、白色コロニーWCの濃度Cwと黄色コロニーYCの濃度Cyは、ほとんど差がないか、差があっても非常に小さい。このため、RGB画像で白色コロニーWCと黄色コロニーYCは識別しにくい。 FIG. 9 shows the density of white colonies WC and yellow colonies YC in an RGB image captured by the general-purpose color camera 31 of this comparative example. In any of the R, G, and B images, there is almost no difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC, or even if there is a difference, it is very small. For this reason, it is difficult to distinguish between white colony WC and yellow colony YC in an RGB image.
 汎用のカラーカメラ31のイメージセンサ33のRGB画素は、図8に示すように、白色LED照明下では、波長(nm)に対する相対出力で示される感度を有する。第1画素であるB画素33Bは、波長約440~540nmの光に対して相対出力0.1以上あり且つ約460nmにピークをもつ分光特性を有する。すなわち、相対出力0.1以上となる第1波長領域は、約440~540nmである。また、第2画素であるG画素33Gは、波長約460~610nmの光に対して相対出力0.1以上あり且つ約550nmにピークをもつ分光特性を有する。すなわち、相対出力0.1以上となる第2波長領域は、約460~610nmである。第3画素であるR画素33Rは、波長約570~650nmの光に対して相対出力0.1以上あり且つ約600nmにピークをもつ分光特性を有する。 As shown in FIG. 8, the RGB pixels of the image sensor 33 of the general-purpose color camera 31 have a sensitivity expressed as a relative output with respect to wavelength (nm) under white LED illumination. The B pixel 33B, which is the first pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 440 to 540 nm, and has spectral characteristics having a peak at approximately 460 nm. That is, the first wavelength region where the relative output is 0.1 or more is about 440 to 540 nm. Further, the G pixel 33G, which is the second pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 460 to 610 nm, and has a spectral characteristic having a peak at approximately 550 nm. That is, the second wavelength region where the relative output is 0.1 or more is about 460 to 610 nm. The R pixel 33R, which is the third pixel, has a relative output of 0.1 or more for light having a wavelength of approximately 570 to 650 nm, and has spectral characteristics having a peak at approximately 600 nm.
 第1波長領域(約440~540nm)と第2波長領域(約460~610nm)とは一部の重複波長領域(約460~540nm)で重複する。重複波長領域である約460~550nmの範囲内の少なくとも一部を含む発光波長とするB光が照射されると、第1画素であるB画素33Bだけでなく、第2画素であるG画素33Gも、0.1以上の相対出力になる。 The first wavelength region (approximately 440 to 540 nm) and the second wavelength region (approximately 460 to 610 nm) partially overlap in an overlapping wavelength region (approximately 460 to 540 nm). When B light whose emission wavelength includes at least part of the overlapping wavelength range of about 460 to 550 nm is irradiated, not only the B pixel 33B, which is the first pixel, but also the G pixel 33G, which is the second pixel, produces a relative output of 0.1 or more.
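The band overlap described above can be expressed as a simple interval intersection. The band limits below are the approximate figures quoted in the text (the regions where relative output is 0.1 or more, and the B-LED emission range from FIG. 10).

```python
def overlap(band_a, band_b):
    """Intersection of two (low, high) wavelength bands in nm, or None."""
    lo, hi = max(band_a[0], band_b[0]), min(band_a[1], band_b[1])
    return (lo, hi) if lo < hi else None

b_pixel  = (440, 540)  # first wavelength region (B pixel, relative output >= 0.1)
g_pixel  = (460, 610)  # second wavelength region (G pixel)
blue_led = (430, 530)  # B-light emission band (FIG. 10)

print(overlap(b_pixel, g_pixel))   # region where both pixels respond
print(overlap(blue_led, g_pixel))  # B light also drives the G pixel here
```

Any LED whose emission band intersects both pixel bands will raise both pixel outputs at once, which is the premise for the two-color balancing and matrix separation described earlier.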
 FIG. 10 shows the spectrum (emission distribution) of the monochromatic light source 20 when the blue light source 21, which is a blue LED, and the red light source 23, which is a red LED, emit light at the same time. In the graph, the horizontal axis is wavelength and the vertical axis is output power. The blue light source 21, which is the main monochromatic light source, emits B light of approximately 430 to 530 nm. The red light source 23, which is an auxiliary light source, emits R light of 500 to 660 nm. Note that in the graph of FIG. 10, the spectral characteristics of the white LED are shown by broken lines.
 FIG. 11 shows the relative outputs of the RGB pixels 33R, 33G, and 33B when illuminated with B light and R light. As shown in FIG. 11, the G region is shifted toward shorter wavelengths compared to the G region under white-light illumination in FIG. 8. Furthermore, in the B region shown in FIG. 11, the portion of the 500 to 650 nm range that had a relative output of about 0.05 to 2 in the B region under white-light illumination in FIG. 8 has almost disappeared.
 Further, as shown in FIG. 11, by illuminating with the R light shown in FIG. 10, the R region is shifted toward longer wavelengths compared to the R region under white-light illumination in FIG. 8.
 Then, as shown in FIG. 12, an XYZ image is obtained by performing a linear matrix operation on the RGB image.
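As an illustration, the per-pixel linear matrix operation can be sketched as follows. The function name and the coefficient values are hypothetical stand-ins; the actual coefficients are those set in the matrix coefficient setting unit 54 by the condition search described later.

```python
import numpy as np

def apply_linear_matrix(rgb_image, matrix):
    """Multiply every pixel of an H x W x 3 RGB image by a 3x3 matrix,
    producing the corresponding 3-band XYZ image."""
    h, w, _ = rgb_image.shape
    flat = rgb_image.reshape(-1, 3).astype(np.float64)
    xyz = flat @ np.asarray(matrix, dtype=np.float64).T
    return xyz.reshape(h, w, 3)

# Illustrative coefficients only, not the values actually used.
M = [[ 1.2, -0.1, -0.1],
     [-0.2,  1.3, -0.1],
     [ 0.0, -0.3,  1.3]]

rgb = np.full((2, 2, 3), [100.0, 50.0, 25.0])  # uniform test image
xyz = apply_linear_matrix(rgb, M)
```

Each output band is a weighted sum of the three input bands, which is how the new band characteristics are derived from the RGB imaging characteristics.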
 The imaging device 11 of this example employs an imaging method that, in the color camera 31 equipped with the RGB color filter 34, provides new two-band imaging characteristics through monochromatic illumination and RGB linear matrix operation processing. The new two bands have peak wavelengths different from those of the original two bands and imaging characteristics different from the RGB imaging characteristics obtained under white LED illumination. In addition, by using, as an auxiliary light source, monochromatic illumination in a region different from the sensitivity wavelength regions of the new two bands, this imaging method enables color discrimination that is difficult in imaging with an ordinary color camera 31.
 The monochromatic illumination of this embodiment is blue illumination. The monochromatic illumination used as an auxiliary light source is red illumination. That is, the blue light source 21, which illuminates with B light, is used as the main monochromatic light source, and the red light source 23, which illuminates with R light, is used as an auxiliary light source. By also illuminating with R light using the red light source 23 as an auxiliary light source, the R image is likewise used for the inspection processing.
 Since the overlapping wavelength region is approximately 460 to 550 nm, as shown in FIG. 11, the B light from the blue light source 21 gives the G pixel 33G a relative output of 0.1 or more in the region of approximately 450 to 520 nm. The B light, which is the monochromatic light illuminating the subject 12, is light in an emission wavelength region that includes at least part of the first wavelength region and at least part of the overlapping wavelength region.
 The control unit 40 performs illumination with the first monochromatic light source 25 and the second monochromatic light source 26 at the same time, and causes the imaging unit 30 to image the subject 12. Either the control unit 40 illuminates with the second monochromatic light source 26 at a second illuminance set according to the illuminance of the first monochromatic light source 25, or the gain setting unit 55 sets the gain applied to the output level of the R pixel 33R, which is the third pixel, according to the gains applied to the output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel, among the RGB pixels 33R, 33G, and 33B.
 As shown in FIG. 13, the density of the white colony WC in the Y image is represented by the area Sg1 of the region calculated by multiplying the spectral reflectance characteristic line LW (thick line) of the white colony WC in FIG. 13 by the spectral sensitivity characteristic line Y (dash-dot line) of the Y image. Similarly, the density of the yellow colony YC in the Y image is represented by the area Sg2 of the region calculated by multiplying the spectral reflectance characteristic line LY (thick dash-dot line) of the yellow colony in FIG. 13 by the spectral sensitivity characteristic line Y of the Y image. The difference ΔSg (=|Sg1−Sg2|) between the two areas Sg1 and Sg2 accounts for a large proportion of the whole (the sum of Sg1 and Sg2). Since this proportion corresponds to the density difference ratio, the two types of colonies WC and YC have a large density difference ratio in the Y image and are easy to distinguish. In the B image and the R image, on the other hand, the differences ΔSb and ΔSr (not shown) account for a small proportion of the whole (the total area of the B region and the total area of the R region, respectively), so the two types of colonies WC and YC are difficult to distinguish. Note that in FIG. 13, the area Sg1 of the region where LW and Y overlap is substituted for LW*Y, and the area Sg2 of the region where LY and Y overlap is substituted for LY*Y.
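The area computation underlying Sg1 and Sg2 can be sketched numerically as below. All spectral curves here are invented placeholders for illustration, not the actual characteristics of FIG. 13.

```python
import numpy as np

def band_density(wavelengths, reflectance, sensitivity):
    """Approximate a colony's density in one band as the area under the
    product of its spectral reflectance and the band's spectral sensitivity
    (the areas Sg1 and Sg2 of FIG. 13), via trapezoidal integration."""
    y = np.asarray(reflectance) * np.asarray(sensitivity)
    return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(wavelengths)))

wl = np.linspace(400.0, 700.0, 301)                      # wavelength axis, nm
sens_y = np.exp(-((wl - 500.0) ** 2) / (2 * 25.0 ** 2))  # hypothetical Y-band sensitivity
refl_white = np.full_like(wl, 0.9)                       # white colony: flat, high reflectance
refl_yellow = np.where(wl > 550.0, 0.9, 0.2)             # yellow colony: reflects long wavelengths

sg1 = band_density(wl, refl_white, sens_y)   # white colony density in the Y band
sg2 = band_density(wl, refl_yellow, sens_y)  # yellow colony density in the Y band
ratio = abs(sg1 - sg2) / (sg1 + sg2)         # density difference ratio
```

With these placeholder curves the density difference ratio comes out large, which is the situation described above for the Y image.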
 FIG. 14 shows the densities of the white colony WC and the yellow colony YC in the XYZ image captured by the general-purpose color camera 31 of this embodiment. In the X image and the Z image, there is almost no difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC, and even where there is a difference, it is very small. For this reason, the two types of colonies WC and YC are difficult to distinguish in the X and Z images. In the Y image, however, the difference between the density Cw of the white colony WC and the density Cy of the yellow colony YC is large. The white colony WC and the yellow colony YC are therefore easy to distinguish in the Y image. It follows that the two types of colonies WC and YC can also be distinguished by color in the XYZ image.
 <Contents of control by the control unit 40>
 <Operation of the embodiment>
 Next, the operations of the imaging device 11 and the inspection device 10 will be described.
 <About the initial setting process>
 First, the initial setting process will be described with reference to FIG. 16. The control unit 40 executes the initial setting processing routine shown in FIG. 16.
 In step S11, the control unit 40 installs a reference sheet for color gain adjustment. For example, the control unit 40 controls a robot to have its arm grip a reference sheet and places the gripped reference sheet at the imaging area of the camera 31, for example, at the inspection position. Alternatively, an operator may place the reference sheet manually.
 In step S12, the control unit 40 turns on the blue LED.
 In step S13, the control unit 40 sets the B and G gains so that the levels of the B image and the G image are equal. That is, under the condition of illumination with B light from the blue LED that is the blue light source 21, the control unit 40 sets the B gain and the G gain to values that allow the output levels of the B pixel 33B, which is the first pixel, and the G pixel 33G, which is the second pixel, to be color-balanced between the two colors.
 In step S14, the control unit 40 turns on the red LED.
 In step S15, the control unit 40 sets the R gain manually. That is, since the operator manually sets the R gain by operating the input section, the control unit 40 sets the R gain input from the input section by writing it into a predetermined storage area of the memory.
 In step S16, the control unit 40 has a background sheet installed. For example, the control unit 40 controls the arm of the transport robot, causes the arm to grip a background sheet, and places the gripped background sheet at the imaging area of the camera 31, for example, at the inspection position. Note that before placing the background sheet at the inspection position, the robot removes the reference sheet previously placed there.
 In step S17, the control unit 40 calculates and sets a black balance offset amount using the background sheet image.
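A minimal sketch of the gain and offset calculations of steps S13 and S17 follows. The function names, the mean pixel levels, and the normalization rule (matching the B and G levels to the larger of the two means, with a fixed target black level) are assumptions made for illustration only.

```python
def balance_bg_gains(ref_b_mean, ref_g_mean):
    """Step S13 (sketch): choose B and G gains so that the B image and the
    G image of the reference sheet, taken under blue-LED illumination, come
    out at the same level (normalized here to the larger of the two means)."""
    target = max(ref_b_mean, ref_g_mean)
    return target / ref_b_mean, target / ref_g_mean

def black_offsets(background_means, target_black=10.0):
    """Step S17 (sketch): per-channel offsets that bring the background
    sheet in the image to a fixed black level."""
    return [target_black - m for m in background_means]

# Hypothetical mean levels measured from the reference and background sheets.
gain_b, gain_g = balance_bg_gains(ref_b_mean=180.0, ref_g_mean=120.0)
offsets = black_offsets([12.0, 15.0, 9.0])
```

After applying the gains, the B and G levels of the reference sheet coincide, which is the color-balance condition of step S13.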
 <About the imaging process>
 Next, the imaging process will be described with reference to FIG. 17. The control unit 40 executes the imaging processing routine shown in FIG. 17. The control unit 40 controls the camera 31 to image the petri dish 13 placed at the inspection position, and the image processing unit 50 (see FIGS. 1 and 5) performs predetermined image processing on the imaging signal output from the camera 31 as the imaging result, thereby converting the 3-band RGB image constituting the imaging signal into a 3-band XYZ image and outputting it.
 Specifically, in step S21, the control unit 40 places the petri dish 13, in which the colonies WC and YC have been cultured, on the background sheet 15. In this process, the control unit 40, for example, controls a robot to place the petri dish 13 gripped by its arm on the background sheet 15. Alternatively, an operator may place the petri dish 13 on the background sheet 15.
 In the next step S22, the control unit 40 turns on the blue and red LEDs.
 In step S23, the control unit 40 causes the camera 31 to capture an image. Note that the processing in steps S22 and S23 corresponds to an example of an imaging step.
 In step S24, the control unit 40 performs linear matrix processing on the RGB image. Note that the processing in step S24 corresponds to an example of an arithmetic processing step.
 In step S25, the control unit 40 performs white balance processing using the gains Gx, Gy, and Gz set in the gain setting unit 55. Here, the gains Gx and Gy are values that color-balance the output levels of the B pixel and the G pixel. Therefore, even if the areas Sg1 and Sg2 shown in FIG. 13 are small relative to the entire Y region, the two types of colonies WC and YC do not become too dark in the Y image of FIG. 15(c) and are obtained with appropriate densities and an appropriate density difference.
 In step S26, the control unit 40 performs black balance processing using the set RGB offset amounts. That is, the control unit 40 performs black balance so that the black of the background sheet 15 in the image has the same density as the black of the background sheet 15 in the image set at the time of the initial setting.
 In step S27, the control unit 40 outputs the XYZ image.
 As shown in FIG. 15(a), in the XYZ image, a density difference large enough to be distinguished by predetermined image processing such as binarization is obtained between the white colony WC and the yellow colony YC. The two types of colonies WC and YC are therefore distinguishable. Meanwhile, in the X image shown in FIG. 15(b), the densities of the two types of colonies WC and YC are comparable and hard to tell apart, but since the density difference from the culture medium 14 is large, the culture medium 14 can be distinguished from the colonies WC and YC. The total number of colonies WC and YC can therefore be counted. In the Y image shown in FIG. 15(c), a distinguishable density difference is obtained between the white colony WC and the yellow colony YC, so the two types of colonies WC and YC are distinguishable. In the Z image shown in FIG. 15(d), the densities of the two types of colonies WC and YC are comparable and hard to tell apart, but since the density difference from the culture medium 14 is large, the culture medium 14 can be distinguished from the colonies WC and YC, and the total number of colonies WC and YC can be counted. Furthermore, the X image and the Z image differ in the density relationship between the culture medium 14 and the colonies WC and YC. Therefore, depending on how the culture medium 14 is colored, by using both the X image and the Z image, the total number of colonies WC and YC can be counted from at least one of the two images.
 Note that the regions of the two types of colonies WC and YC in FIG. 15(b) correspond to a first colony region, that is, a region whose pixel levels are within a threshold setting range. The region of one of the two types of colonies WC and YC in FIG. 15(c) corresponds to a second colony region, a region whose pixel levels are within a threshold setting range. The regions of the two types of colonies WC and YC in FIG. 15(d) correspond to a third colony region, a region whose pixel levels are within a threshold setting range. The colony region remaining after excluding the second colony region, which is the region of one of the two colony types, from the first colony region or the third colony region, which covers all the colonies, corresponds to a fourth colony region. For example, if the region of the white colony WC is the second colony region, the region of the yellow colony YC corresponds to the fourth colony region; if the region of the yellow colony YC is the second colony region, the region of the white colony WC corresponds to the fourth colony region.
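The derivation of the fourth colony region described above can be sketched with binary masks; the mask contents here are purely illustrative.

```python
import numpy as np

# Binary masks of colony regions (True = pixel level inside the threshold
# setting range). first_region: all colonies found in the X image;
# second_region: the one colony type separable in the Y image.
first_region = np.array([[0, 1, 1, 0],
                         [0, 1, 1, 0],
                         [1, 1, 0, 0]], dtype=bool)
second_region = np.array([[0, 1, 1, 0],
                          [0, 1, 1, 0],
                          [0, 0, 0, 0]], dtype=bool)

# Fourth colony region: pixels of the first region that are not in the
# second region, i.e. the colonies of the remaining type.
fourth_region = first_region & ~second_region
```

The same subtraction works with the third colony region in place of the first, as the text notes.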
 <About the inspection process>
 Next, the inspection process will be described with reference to FIG. 18. The computer constituting the inspection processing unit 60 executes the inspection processing routine shown in FIG. 18. Using the XYZ images from the imaging device 11, the inspection processing unit 60 counts colonies for each image, performs a count correction that converts the per-image colony counts into counts for each colony type, and performs a determination process that judges the inspection result based on the corrected counts.
 The processing performed for each XYZ image includes a region extraction process for finding the regions of the colonies to be inspected, a classification process for classifying the extracted regions according to colony features, and a counting process for counting the inspection targets classified as colonies. Specifically, the region extraction process finds the regions of inspection targets (colony candidates) for each of the X image, the Y image, and the Z image. The classification process classifies each extracted region as being a colony to be inspected or not, using the shape factor and size factor of such colonies as features. The counting process counts the number of inspection targets classified as colonies.
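A minimal sketch of this per-image pipeline (threshold extraction, 4-connected region grouping, a size-factor classification, and counting) might look as follows. The function name, the threshold values, and the size-only classification rule are simplifying assumptions, not the actual implementation.

```python
import numpy as np
from collections import deque

def count_colonies(image, lo, hi, min_size=2):
    """Extract pixels whose level lies in the threshold range [lo, hi],
    group them into 4-connected regions, keep regions whose size passes a
    minimum (a stand-in for the shape/size-factor classification), and
    return the count of regions kept."""
    mask = (image >= lo) & (image <= hi)
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # Flood-fill one connected region and measure its size.
                size, queue = 0, deque([(y, x)])
                seen[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    size += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if size >= min_size:  # simple size-factor classification
                    count += 1
    return count

img = np.array([[50, 200, 200, 50],
                [50, 200, 200, 50],
                [50, 50, 50, 210],
                [220, 215, 50, 50]])
n = count_colonies(img, lo=190, hi=255, min_size=2)
```

In this toy image, two regions pass the size check and one isolated bright pixel is rejected by the classification step.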
 Then, counts for each colony type are obtained by performing a count correction, in which a predetermined calculation is applied to the plurality of colony count values obtained for the individual XYZ images. Furthermore, determination processing, such as judging whether the inspection target is acceptable based on the per-type colony counts, is performed to obtain the final inspection result. Note that FIG. 18 shows, in a single flow, the parallel processes for finding regions in the individual XYZ images; the step numbers indicate one example of the processing order, and the parallel processes may be performed in any order. The feature classification and counting for each region may likewise be performed in any order once the region has been identified.
 In FIG. 18, steps S31 to S34 are a series of processes (first process) for counting colonies using the X image. Steps S35 to S38 are a series of processes (second process) for counting colonies using the Y image. Steps S39 to S42 are a series of processes (third process) for counting colonies using the Z image. Steps S43 to S47 are a series of processes (fourth process) for determining the region of a specific type of colony by combining two or three of the regions extracted as colony candidates in the first to third processes.
 The first process includes region extraction using a threshold on the X image (step S31), identification of a first region as the region extraction result (step S32), feature classification for classifying the first region according to its features (step S33), and counting of the feature-classified inspection targets (step S34).
 The second process includes region extraction using a threshold on the Y image (step S35), identification of a second region as the region extraction result (step S36), feature classification for classifying the second region according to its features (step S37), and counting of the feature-classified inspection targets (step S38).
 The third process includes region extraction using a threshold on the Z image (step S39), identification of a third region as the region extraction result (step S40), feature classification for classifying the third region according to its features (step S41), and counting of the feature-classified inspection targets (step S42).
 The fourth process is a process of finding another region using the plurality of regions obtained in the above processes (for example, the three processes). The fourth process includes a determination process for judging whether the plurality of regions to be used have already been acquired (step S43), a region acquisition process for finding a fourth region using, as an example, the first region and the second region (step S44), identification of the fourth region as the region acquisition result (step S45), feature classification for classifying the fourth region according to its features (step S46), and counting of the feature-classified inspection targets (step S47). In this example, the fourth colony region is found by taking the colony candidate region of the X image as the first region and the colony candidate region of the Y image as the second region, but the fourth colony region may instead be found by taking the colony candidate region of the Z image as the first region and the colony candidate region of the Y image as the second region.
 In step S48, the inspection processing unit 60 (specifically, the computer constituting the inspection processing unit 60) performs a count correction by applying a predetermined calculation to the per-region counts of inspection targets obtained in steps S34, S38, S42, and S47. The count correction yields counts for each type of inspection target, for example for each colony type. For example, the number of colonies of the second type is calculated by subtracting the number of colonies of the first type from the total number of colonies. In step S49, the inspection processing unit 60 performs determination processing to obtain the final inspection result using the count correction results. The determination processing may include an inspection judgment that determines whether the total number of colonies and the number of each colony type exceed their respective count thresholds, and highlighting processing that emphasizes colonies by surrounding them with display frames or adding a predetermined color. The inspection processing unit 60 then displays the inspection results on the display unit 70. The display unit 70 displays, for example, the total number of colonies, the number of colonies by type, the contents of the inspection result, and images in which the colonies are highlighted.
 When the inspection target is food or drinking water, the inspection result contents may include a judgment of acceptable or defective. Although the inspection targets here are colonies, the invention is not limited to this. For example, foreign matter contained in a product may be the inspection target, with the inspection checking for the presence of foreign matter, or stains on a product may be the inspection target, with the inspection checking for the presence of stains.
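The count correction of step S48 and a simple threshold judgment in the spirit of step S49 can be sketched as below. The function names, counts, and limits are hypothetical; the actual judgment rule depends on the inspection standard applied.

```python
def correct_counts(total_count, first_type_count):
    """Step S48 (sketch): the count of the second colony type is the total
    colony count minus the count of the first type."""
    return total_count - first_type_count

def judge(total_count, type_counts, total_limit, type_limits):
    """Step S49 (sketch): pass only if the total count and every per-type
    count stay at or below their thresholds; the rule is illustrative."""
    if total_count > total_limit:
        return "NG"
    if any(c > lim for c, lim in zip(type_counts, type_limits)):
        return "NG"
    return "OK"

second_type = correct_counts(total_count=12, first_type_count=7)
result = judge(12, [7, second_type], total_limit=20, type_limits=[10, 10])
```

The same subtraction generalizes to other region combinations, such as deriving one type's count from the fourth-region count.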
 In this embodiment, the identification unit 61 and the determination unit 62 are implemented in software by the computer constituting the inspection processing unit 60 executing the inspection processing program shown in FIG. 18. The identification unit 61 identifies each region by the region extraction (steps S31, S35, S39, S44) and region identification (steps S32, S36, S40, S45). The classification unit 63 is constituted by the computer performing the feature classification processing (steps S33, S37, S41, S46). The counting unit 64 is constituted by the computer performing the counting processing (steps S34, S38, S42, S47). The specifying unit 65 is constituted by the computer performing the processing of specifying the numbers of the two types of colony regions to be identified, based on the count values that are the counting results of the counting unit 64. The determination unit 62 is constituted by the computer performing the determination processing based on the specified numbers of the two types of colonies (step S49).
 <About the imaging condition determination process>
 Next, the imaging condition determination process will be described with reference to FIG. 19. In this example, the control unit 40 executes the imaging condition determination processing routine shown in FIG. 19. This flowchart is an example of searching for and determining imaging conditions using three types of monochromatic light sources. Note that the monochromatic light sources used for the imaging condition search may instead be two of the blue light source 21, the green light source 22, and the red light source 23.
 In FIG. 19, the processing of steps S51 to S57 is a first search process that searches for and determines imaging conditions using the first monochromatic light source. If appropriate imaging conditions cannot be determined even after repeatedly executing steps S51 to S57, the monochromatic light is changed in step S58 and steps S51 to S57 are executed again. Changing the monochromatic light means changing the peak wavelength of the monochromatic light. When all the peak wavelength changes for one monochromatic light source have been exhausted, the monochromatic light source itself may be changed. Even if the monochromatic light source is changed, the processing of steps S51 to S57 is basically the same. Therefore, the following describes in detail the process in which the control unit 40 searches for and determines imaging conditions using the blue light source as the first monochromatic light source.
 First, in step S51, the control unit 40 sets the coefficients of the linear matrix operation for illumination with the first monochromatic light. This step S51 corresponds to an example of a first step of setting the coefficients of the matrix by which the RGB image is multiplied in the linear matrix operation processing, which generates a 3-band XYZ image with corrected RGB imaging characteristics by performing linear matrix operation processing on the 3-band RGB image output from the imaging unit 30.
 In step S52, the control unit 40 illuminates the subject with the first monochromatic light source. This step S52 corresponds to an example of a second step of illuminating the subject 12 with monochromatic light capable of changing the peak wavelengths of the spectral output characteristic of a first pixel among the RGB pixels of the camera 31 and of the spectral output characteristic of a second pixel different from the first pixel.
 In step S53, the control unit 40 images the subject. Note that the processing of step S53 corresponds to an example of a third step in which the imaging unit 30 images the subject 12 illuminated with the monochromatic light in the second step.
 In step S54, the control unit 40 multiplies the RGB image by the matrix to generate an XYZ image. Note that this step S54 corresponds to an example of a fourth step of generating an XYZ image by multiplying, by the matrix, the RGB image output from the imaging unit 30 as the result of the imaging in the third step.
 In step S55, the control unit 40 obtains the difference in density between two regions of different types in the Y image. Step S55 corresponds to an example of a fifth step of judging whether the difference in density between the regions of the two different materials, in the second image output from the second pixel among the XYZ images generated in the fourth step, is greater than or equal to a predetermined threshold.
 In step S56, the control unit 40 judges whether the difference is greater than or equal to the threshold. If the difference is less than the threshold, the process advances to step S57. If the difference is greater than or equal to the threshold, the process advances to step S59, where the current monochromatic light and the current coefficients are fixed as the imaging conditions. The fixed coefficients are set in the matrix coefficient setting section 54.
 In step S57, the control unit 40 judges whether the condition search with the first monochromatic light is complete. If it is not complete, the process returns to step S52 and steps S52 to S57 are repeated. If the search with the first monochromatic light has been exhausted without finding an imaging condition for which the difference reaches the threshold, the process moves to step S58.
 In step S58, the control unit 40 changes the monochromatic light; specifically, it changes the peak wavelength of the monochromatic light. Changing the peak wavelength may mean gradually shifting the peak wavelength while keeping the same monochromatic light source. For example, the peak wavelength of a general-purpose blue LED is about 470 nm, but it may be changed in 10 nm steps to 480 nm and then 490 nm. This can be done either by keeping the monochromatic light source (LED) and shifting the peak wavelength with an optical filter, or by substituting another monochromatic light source (LED) having a different peak wavelength.
 In this way, the peak wavelength of the monochromatic light in use (for example, B light) is changed, and the imaging-condition search of steps S51 to S57 is executed again with the shifted light. If step S56 still finds no imaging condition for which the difference reaches the threshold, the monochromatic light source in use is switched from the blue light source 21 to the green light source 22 in step S58; the peak wavelength of that light (for example, G light) is then varied and steps S51 to S57 are executed again. If the threshold is still not reached, the green light source 22 is switched to the red light source 23 in step S58, and the search is repeated in the same manner with varied peak wavelengths of the R light. Thus, until the difference in density reaches the predetermined threshold in the fifth step (S56), the first to fifth steps (steps S51 to S58) are repeated while varying the coefficients in the first step (S51) and the peak wavelength of the monochromatic light in the second step (S52, S58). When the monochromatic light source is changed in step S58, the image from which the density difference is obtained in the subsequent step S55 may be changed from the Y image used so far to a different image (the X image or the Z image); this image may be selected according to the monochromatic light in use and the spectral reflectance characteristics of the colonies WC and YC.
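The overall search of steps S51 to S59 amounts to a nested loop over light sources, peak wavelengths, and coefficient sets. A minimal sketch, in which the helper names (`capture`, `density_difference`) are hypothetical stand-ins for the illumination, imaging, matrix, and density processing described above:

```python
def search_conditions(light_sources, wavelengths, coefficient_sets,
                      capture, density_difference, threshold):
    """Return the first (source, wavelength, coeffs) whose image separates
    the two material regions by at least `threshold`, or None."""
    for source in light_sources:                  # S58: blue -> green -> red
        for wl in wavelengths[source]:            # S58: e.g. 470, 480, 490 nm
            for coeffs in coefficient_sets:       # S51: matrix coefficients
                xyz_image = capture(source, wl, coeffs)         # S52-S54
                if density_difference(xyz_image) >= threshold:  # S55-S56
                    return source, wl, coeffs                   # S59
    return None                                   # search exhausted

# Toy run: the stand-in density function "finds" a condition only at 480 nm.
found = search_conditions(
    ["blue"], {"blue": [470, 480]}, ["m1", "m2"],
    capture=lambda s, w, c: (s, w, c),
    density_difference=lambda img: 10.0 if img[1] == 480 else 1.0,
    threshold=5.0,
)
```

The loop order mirrors the flowchart: coefficients are exhausted first (S51 to S57), then the peak wavelength, then the light source itself (S58).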
 The imaging device 11 images the subject 12 under the imaging conditions determined by this routine. In the present example, when the subject 12, a petri dish 13 whose culture medium 14 contains two types of colonies WC and YC, was illuminated with the blue light source 21 and imaged, an imaging condition was found for which the difference reached the threshold; the blue light source 21 is therefore chosen as the monochromatic light source and serves as the first monochromatic light source 25.
 In the above description, changing the peak wavelength of the monochromatic light in step S58 includes changing the light while keeping the same monochromatic light source; however, it suffices that the emission wavelength can be changed so that the peak wavelengths of the spectral output characteristics of the first pixel among the RGB pixels of the camera 31 and of the second pixel different from the first pixel can be shifted. For example, only the monochromatic light source itself may be switched; that is, in step S58, the control unit 40 may simply switch among the blue light source 21, the green light source 22, and the red light source 23 in order. Conversely, in a configuration that shifts the peak wavelength while keeping the same source, the condition search may use only one of the blue light source 21, the green light source 22, and the red light source 23.
 <Effects of the embodiment>
 According to the embodiment described in detail above, the following effects are obtained.
 (1) The imaging device 11 outputs an image in which regions made of two different materials contained in the photographed subject 12 can be distinguished. The imaging device 11 includes an imaging unit 30, a monochromatic light source 20, a control unit 40, and an image processing unit 50. In the imaging unit 30, the image sensor 33 (imaging element) that images the subject 12 has RGB pixels 33R, 33G, 33B. The monochromatic light source 20 illuminates the subject 12 with monochromatic light. The control unit 40 controls the monochromatic light source 20 and the imaging unit 30. The image processing unit 50 applies predetermined processing to the 3-band RGB image output from the imaging unit 30. The RGB pixels include a first pixel 33B sensitive in a first wavelength region and a second pixel 33G sensitive in a second wavelength region, and the first and second wavelength regions partly overlap in an overlapping wavelength region. The monochromatic light is light in an emission wavelength region that includes at least part of the first wavelength region and at least part of the overlapping wavelength region. The image processing unit 50 includes an arithmetic processing section 52 and an amplification section 53. The arithmetic processing section 52 generates a 3-band XYZ image with corrected RGB imaging characteristics by applying linear matrix processing to the 3-band RGB image. The amplification section 53 amplifies, among the XYZ values forming the XYZ image, the values corresponding to the first pixel 33B and the second pixel 33G, using gains set so that the output levels of the two pixels can be color-balanced under illumination with the monochromatic light from the monochromatic light source 20.
 With this configuration, adjusting the illuminance of the monochromatic light source 20 or setting the gains makes it possible to output an image in which the materials (colonies WC, YC) can be distinguished by an appropriate color or density. Therefore, when the image is used for inspection, the two different materials contained in it can be identified or classified with relatively high accuracy.
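The two-color balancing performed by the amplification section 53 can be sketched as follows; the reference output levels are invented for illustration, and equalizing on the larger of the two levels is one possible convention, not necessarily the one used in the patent:

```python
def balance_gains(first_level: float, second_level: float):
    """Return (g1, g2) that equalize the two channel levels under the
    monochromatic illumination, scaling up to the larger level."""
    target = max(first_level, second_level)
    return target / first_level, target / second_level

# e.g. B-pixel and G-pixel output levels measured on a reference region
g1, g2 = balance_gains(80.0, 120.0)
balanced = (80.0 * g1, 120.0 * g2)
```

After amplification the two channels read the same level on the reference, so any remaining difference between them on the colonies reflects the materials rather than the illumination.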
 (2) The image processing unit 50 generates the 3-band XYZ image after performing black balance processing on a reference portion of the subject 12 other than the identification targets. Because black balance processing is performed, an image in which the materials can be distinguished by an appropriate color or density can be output even if the environment in which the subject 12 is imaged changes.
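A minimal illustration of black balance on a reference portion, under the assumption that the balance is taken by shifting each band so that the reference region reads as black; the region coordinates and pixel values are hypothetical:

```python
import numpy as np

def black_balance(image: np.ndarray, ref_slice) -> np.ndarray:
    """Shift each band so the reference region averages to zero (black)."""
    ref_mean = image[ref_slice].reshape(-1, image.shape[-1]).mean(axis=0)
    return np.clip(image - ref_mean, 0, None)

# 1x2 image: first pixel is the dark reference portion, second is a colony.
img = np.array([[[12.0, 8.0, 10.0], [112.0, 58.0, 60.0]]])
balanced = black_balance(img, np.s_[:, :1])   # reference = first column
```

Because the per-band offset of the reference is removed, drift in the imaging environment (stray light, sensor offset) does not shift the densities used for identification.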
 (3) The monochromatic light source 20 is the blue light source 21 or the red light source 23. When the monochromatic light source 20 is the blue light source 21, the first pixel is the B pixel 33B and the second pixel is the G pixel 33G. When the monochromatic light source 20 is the red light source 23, the first pixel is the R pixel 33R and the second pixel is the G pixel 33G.
 With this configuration, the G pixel 33G is sensitive, in at least part of the overlapping wavelength region, to the blue light that is the source color of the blue light source 21, so when imaging under blue illumination the sensitive wavelength region of the G image can be made narrower than that of a general-purpose camera. Likewise, the G pixel 33G is sensitive, in at least part of the overlapping wavelength region, to the red light that is the source color of the red light source 23, so when imaging under red illumination the sensitive wavelength region of the G image can also be made narrower than that of the general-purpose camera 31. Applying the linear matrix operation to the RGB image can further narrow, or shift, this narrowed sensitive wavelength region of the G image. An image in which the materials (colonies WC, YC) can be distinguished by an appropriate color or density can therefore be output.
 (4) When the blue light source 21 is the first monochromatic light source 25, the imaging device 11 further includes a second monochromatic light source 26 capable of emitting second monochromatic light in an emission wavelength region different from that of the first monochromatic light emitted by the first monochromatic light source 25. The RGB pixels include a third pixel in addition to the first pixel 33B and the second pixel 33G. The control unit 40 performs illumination with the first monochromatic light source 25 and with the second monochromatic light source 26 and causes the imaging unit 30 to image the subject 12. Either the control unit 40 illuminates the second monochromatic light source 26 at a second illuminance set according to the illuminance of the first monochromatic light source 25, or the gain setting section 55 sets the gain applied to the output level of the third pixel according to the gains applied to the output levels of the first pixel 33B and the second pixel 33G. With this configuration, adjusting the illuminance of the monochromatic light sources 20 or setting the gains makes it possible to output an image in which the materials can be distinguished by an appropriate color or density.
 (5) The inspection device 10 includes the imaging device 11. The subject 12 contains two different colonies WC and YC as the two different materials. The inspection device 10 includes an identification unit 61 that identifies colony regions, which are the regions of the two different colonies WC and YC, based on the 3-band XYZ image input from the imaging device 11. The XYZ image includes an X image, a Y image, and a Z image. The identification unit 61 takes a region of the X image whose pixel level is within a set threshold range as a first colony region, a region of the Y image whose pixel level is within a set threshold range as a second colony region, and a region of the Z image whose pixel level is within a set threshold range as a third colony region. The identification unit 61 takes, as a fourth colony region, the colony region remaining after excluding the second colony region from the first or third colony region. The identification unit 61 includes a classification section 63, a counting section 64, and a specifying section 65. The classification section 63 classifies at least one of the first and third colony regions, the second colony region, and the fourth colony region based on features and shapes. The counting section 64 counts the number of colony regions in each class produced by the classification section. The specifying section 65 specifies the numbers of each of the two types of colony regions to be identified, based on the counting results. With this configuration, adjusting the illuminance of the monochromatic light source 20 or setting the gains makes it possible to output an image in which the materials can be distinguished by an appropriate color or density, so that when the image is used for inspection, the two different materials contained in it can be identified or classified with relatively high accuracy.
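The region logic of effect (5), thresholding the X and Y images, excluding the second region from the first, and then counting, can be sketched as follows; the pixel values and threshold ranges are hypothetical, and connected-component analysis is omitted for brevity:

```python
import numpy as np

def region(image: np.ndarray, lo: int, hi: int) -> np.ndarray:
    """Binary mask of pixels whose level lies within the threshold range."""
    return (image >= lo) & (image <= hi)

x_img = np.array([[10, 200], [90, 90]])   # stands in for the X image
y_img = np.array([[10, 10], [90, 10]])    # stands in for the Y image

first  = region(x_img, 80, 255)           # first colony region (X image)
second = region(y_img, 80, 255)           # second colony region (Y image)
fourth = first & ~second                  # first minus second -> fourth

count_second = int(second.sum())          # per-class counts (section 64)
count_fourth = int(fourth.sum())
```

In a full implementation each mask would be split into connected components before classifying by feature and shape; here the pixel counts stand in for the per-class colony counts.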
 (6) An imaging-condition determination method for the imaging device 11, which outputs regions made of two different materials contained in the subject 12 as a distinguishable image. The imaging device 11 includes the monochromatic light source 20 that illuminates the subject 12 and the imaging unit 30, containing RGB pixels, that images the subject 12. The method includes first to fifth steps. In the first step, the coefficients of the matrix by which the RGB image is multiplied are set so that linear matrix processing of the 3-band RGB image output from the imaging unit 30 yields a 3-band XYZ image with corrected RGB imaging characteristics. In the second step, the subject 12 is illuminated with monochromatic light that makes it possible to shift the peak wavelengths of the spectral output characteristics of the first pixel 33B among the RGB pixels of the imaging unit 30 and of the second pixel 33G different from the first pixel 33B. In the third step, the imaging unit 30 images the subject 12 illuminated with the monochromatic light of the second step. In the fourth step, the RGB image output from the imaging unit 30 as a result of the imaging in the third step is multiplied by the matrix to generate the XYZ image. In the fifth step, it is judged whether the difference in density between the regions of the two different materials, in the second image output from the second pixel 33G among the XYZ images generated in the fourth step, is greater than or equal to a predetermined threshold. The first to fifth steps are repeated, with at least one of a change of the coefficients in the first step and a change of the peak wavelengths in the second step, until the density difference reaches the predetermined threshold in the fifth step. This imaging-condition determination method makes it possible to determine appropriate imaging conditions under which the imaging device 11 can output an image in which regions made of the two different materials (colonies WC, YC) contained in the subject 12 can be distinguished by color or density.
 (7) An imaging method that outputs regions made of two different materials contained in the photographed subject 12 as a distinguishable image. The method uses the imaging unit 30 that images the subject 12 and the monochromatic light source 20 that illuminates the subject 12 with monochromatic light. The imaging unit 30 includes the image sensor 33 (imaging element) having RGB pixels. The RGB pixels include the first pixel 33B sensitive in a first wavelength region and the second pixel 33G sensitive in a second wavelength region, and the two regions partly overlap in an overlapping wavelength region. The monochromatic light is light in an emission wavelength region that includes at least part of the first wavelength region and at least part of the overlapping wavelength region. The method includes an imaging step, an output step, an arithmetic processing step, and an amplification step. In the imaging step, the imaging unit 30 images the subject 12 illuminated by the monochromatic light source 20. In the output step, the imaging unit 30 outputs a 3-band RGB image of the subject 12. In the arithmetic processing step, linear matrix processing is applied to the 3-band RGB image output from the imaging unit 30 to generate a 3-band XYZ image with corrected RGB imaging characteristics. In the amplification step, the two of the three XYZ-image bands corresponding to the first pixel 33B and the second pixel 33G are amplified with gains set so as to color-balance the output levels of the first pixel 33B and the second pixel 33G under illumination with the monochromatic light from the monochromatic light source 20. With this imaging method, an image can be output in which regions made of the two different materials (colonies WC, YC) contained in the subject 12 can be distinguished by color or density; for example, when the image is used for inspection, the two different materials contained in it can be identified or classified with relatively high accuracy.
 The embodiment is not limited to the above and may be modified as follows.
 - The monochromatic light is not limited to blue light, and the combination of the first and second monochromatic lights is not limited to blue and red light. For example, as shown in FIG. 20, the monochromatic lights emitted by the monochromatic light source 20 may be green light and red light; that is, the first monochromatic light may be green light, and the combination of the first and second monochromatic lights may be green and red. In this case, applying the linear matrix operation to the RGB image yields an XYZ image such as that shown in FIG. 21. Then, as shown in FIG. 22, in the region DA the difference ΔSg between the areas Sg1 and Sg2 becomes relatively large. Because this produces a relatively large difference in the ratios of ΔSg to the areas Sg1 and Sg2, white colonies and yellow colonies become easier to distinguish by their density difference in the XYZ image (particularly the B image).
 - In FIG. 3, the IR cut filter 35 that blocks near-infrared light may be removed from the camera 31 so that the RGB pixels of the imaging element are sensitive to near-infrared light. The first monochromatic light source 25 is then the blue light source 21, using B light as the first monochromatic light, and the second monochromatic light source 26 is a near-infrared LED, using near-infrared light as the second monochromatic light. As shown in FIG. 7, a linear matrix operation is applied to the RGB image to generate, as the XYZ image, for example an NYZ image that includes an N image with a peak in the near-infrared region (approximately 800 to 900 nm in the example of FIG. 7). In the wavelength region overlapping this near-infrared region, the lines LW and LY representing the spectral characteristics of the white colonies WC and the yellow colonies YC differ, so the white colonies WC and the yellow colonies YC become easier to distinguish in the NYZ image (particularly the N image). In this way, one or two of the three XYZ colors may have a wavelength region within the near-infrared region.
 - The inspection processing unit 60 may perform the inspection using only two images: the Y image and the Z image, or the X image and the Y image.
 - The monochromatic light source 20 may include, as one of its monochromatic light sources, a near-infrared light source whose monochromatic light is near-infrared light.
 - The B light is not limited to the monochromatic light emitted by a blue LED and may be other blue light having a peak in the blue wavelength region.
 - In the embodiment, when the main monochromatic light is B light, the first pixel is the B pixel and the second pixel is the G pixel, but this is not limiting. When the monochromatic light is R light, the first pixel may be the R pixel 33R and the second pixel the G pixel 33G. When the monochromatic light is G light, the first pixel may be the G pixel 33G and the second pixel the B pixel 33B or the R pixel 33R.
 - The combination of the first and second monochromatic lights is not limited to blue and red light as in the embodiment, nor to green and red light as in FIG. 20. The monochromatic light may be red light; in that case, the first monochromatic light may be red light and the second monochromatic light blue light.
 - The monochromatic light sources are not limited to two types; a single type may be used. In the embodiment, the monochromatic light source may be only a blue light source, and in FIG. 20 it may be only a green light source.
 - In the case of a color camera having primary-color filters, four colors R, G1, G2, B may be used. A color camera having complementary-color filters may also be used, the complementary colors being the four colors yellow, cyan, magenta, and green.
 - Image data (for example, RGB image data) based on the first imaging signal S1 captured by the image sensor 33 of the camera 31 may be saved to removable storage such as a USB memory. A personal computer may then read the image data from the removable storage, and the CPU of the personal computer (image processing unit 50) may perform the conversion processing, including the linear matrix operation, to generate the XYZ image. That is, the device that performs the imaging step and the device that performs the conversion step may be separate devices. Multi-band XYZ images can also be acquired by such an imaging method.
 - An inspector may visually inspect the XYZ image output by the imaging device 11.
 - Similar colors are not limited to white and yellow; they may be white and light pink, white and light orange, white and light green, white and light blue, and so on. Other combinations of white and a pale color, or of two different pale colors, are also possible.
 - The subject 12 is not limited to something containing colonies, such as a petri dish in which colonies have been cultured, and may be a subject containing different types of identification targets. The identification target may be transparent or semi-transparent, and may be a food such as jelly or a processed food.
 - The subject 12 to be imaged or inspected is not particularly limited. It may be, for example, a container such as a PET bottle or a glass bottle, a food, a beverage, an electronic component, an electric appliance, a daily-use article, a part, a member, or a raw material such as a powder, granules, or a liquid. The inspection target may be scratches, dirt, printing defects, coating defects, and the like. Moreover, the target within the subject 12 is not limited to an inspection target; any targets that are of different types but exhibit similar colors will do.
 - The imaging device 11 may be configured as a device separate from the inspection processing unit 60, and may be used for purposes other than inspection.
 - The arrangement pattern of the color filters 34 of the image sensor 33 is not limited to the RGB Bayer arrangement and may be any arrangement pattern, such as a stripe arrangement.
 - The imaging device 11 need not include a transport robot. For example, an operator may place the subject 12 on a mounting table for photographing that serves as the imaging position.
 - At least one of the control unit 40, the image processing unit 50, and the inspection processing unit 60 may be partially or entirely implemented by software, that is, a computer executing a program, or by hardware such as an electronic circuit.
 DESCRIPTION OF REFERENCE SIGNS: 10... inspection device, 11... imaging device, 12... subject, 13... Petri dish, 14... culture medium, 15... background sheet, 18... processing unit, 20... monochromatic light source, 21... blue light source (blue LED) as an example of a monochromatic light source, 22... green light source (green LED) as an example of a monochromatic light source, 23... red light source (red LED) as an example of a monochromatic light source, 25... first monochromatic light source, 26... second monochromatic light source, 30... imaging unit, 31... color camera (camera), 31a... lens barrel, 32... lens, 33... color image sensor (image sensor), 35... infrared cut filter (IR cut filter), 33R... R pixel, 33G... G pixel as an example of a second pixel, 33B... B pixel as an example of a first pixel, 34... color filter, 34R... R filter, 34G... G filter, 34B... B filter, 40... control unit, 41... imaging control unit, 42... illumination control unit, 45... power supply, 46... blue LED power supply, 47... red LED power supply, 50... image processing unit, 51... RGB separation unit, 52... arithmetic processing unit, 53... amplification unit, 54... matrix coefficient setting unit, 55... gain setting unit, 60... inspection processing unit, 61... identification unit, 62... judgment unit, 63... classification unit, 64... counting unit, 65... specification unit, 70... display unit, WC... white colony, YC... yellow colony, S1... first imaging signal, S2... second imaging signal, VA... visible light wavelength region, NIRA... near-infrared wavelength region, LW... spectral reflectance characteristic line of a white colony, LY... spectral reflectance characteristic line of a yellow colony, B, G, R... spectral sensitivity characteristic lines, X, Y, Z... spectral sensitivity characteristic lines calculated from the RGB spectral sensitivity characteristics, Gx, Gy, Gz... gains.

Claims (7)

  1.  An imaging device that images a subject and outputs an image in which regions made of two different types of material included in the subject are identifiable, the imaging device comprising:
     an imaging unit whose imaging element for imaging the subject has RGB pixels;
     a monochromatic light source that illuminates the subject with monochromatic light;
     a control unit that controls the monochromatic light source and the imaging unit; and
     an image processing unit that performs predetermined processing on a three-band RGB image output from the imaging unit that has imaged the subject, wherein
     the RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region,
     the first wavelength region and the second wavelength region partially overlap in an overlapping wavelength region,
     the monochromatic light is light in an emission wavelength region including at least part of the first wavelength region and at least part of the overlapping wavelength region, and
     the image processing unit includes:
     an arithmetic processing unit that generates a three-band XYZ image in which RGB imaging characteristics are corrected, by performing linear matrix calculation processing on the three-band RGB image; and
     an amplification unit that amplifies the values corresponding to each of the first pixel and the second pixel among the XYZ values constituting the XYZ image, with gains set to values capable of balancing the output levels of the first pixel and the second pixel between the two colors under the condition that monochromatic light is emitted from the monochromatic light source.
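The amplification unit of claim 1 can be illustrated with a minimal sketch: gains are chosen so that the first-pixel and second-pixel bands reach the same output level under monochromatic illumination of a reference surface. The measured levels and the equal-level balancing rule below are assumptions for illustration only; the claim specifies the balancing goal, not a particular formula.

```python
import numpy as np

# Hypothetical mean output levels of the first (B) and second (G) pixel bands
# measured under blue monochromatic illumination of a reference surface.
level_first = 180.0
level_second = 45.0

# Gains set so that, after amplification, the two bands have equal output
# levels for the reference surface (color balance between the two colors).
target = max(level_first, level_second)
gain_first = target / level_first
gain_second = target / level_second

def amplify(band, gain):
    """Amplify one band of the XYZ image by its balancing gain."""
    return band * gain

band_first = np.full((2, 2), level_first)
band_second = np.full((2, 2), level_second)
balanced_first = amplify(band_first, gain_first)
balanced_second = amplify(band_second, gain_second)
```

After this step, a difference between the two bands at any pixel reflects a material difference rather than the unequal sensitivities of the two pixel types.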
  2.  The imaging device according to claim 1, wherein the image processing unit generates the three-band XYZ image in which black balance processing has been performed on a reference portion of the subject other than the identification target.
  3.  The imaging device according to claim 1, wherein
     the monochromatic light source is a blue light source or a red light source,
     when the monochromatic light source is the blue light source, the first pixel is a B pixel and the second pixel is a G pixel, and
     when the monochromatic light source is the red light source, the first pixel is an R pixel and the second pixel is a G pixel.
  4.  The imaging device according to claim 3, wherein, with the monochromatic light source as a first monochromatic light source,
     the imaging device further comprises a second monochromatic light source capable of emitting second monochromatic light in an emission wavelength region different from the emission wavelength region of the first monochromatic light emitted by the first monochromatic light source,
     the RGB pixels include a third pixel in addition to the first pixel and the second pixel,
     the control unit causes the imaging unit to image the subject while performing illumination with the first monochromatic light source and the second monochromatic light source simultaneously,
     the imaging device comprises a gain setting unit in which the gains used when the amplification unit amplifies the values corresponding to each of the first pixel and the second pixel among the XYZ values are set, and
     either the control unit performs illumination with the second monochromatic light source at a second illuminance set according to the illuminance of the first monochromatic light source, or the gain setting unit sets a gain to be applied to the output level of the third pixel according to the gains applied to the output levels of the first pixel and the second pixel among the RGB pixels.
  5.  An inspection device comprising the imaging device according to any one of claims 1 to 4, wherein
     the subject includes two different types of colonies as the two different types of material,
     the inspection device comprises an identification unit that identifies colony regions, which are regions of the two different types of colonies, based on the three-band XYZ image input from the imaging device,
     the XYZ image includes an X image, a Y image, and a Z image,
     the identification unit:
     sets a region of the X image whose pixel level is within a threshold setting range as a first colony region;
     sets a region of the Y image whose pixel level is within a threshold setting range as a second colony region;
     sets a region of the Z image whose pixel level is within a threshold setting range as a third colony region; and
     sets, as a fourth colony region, the colony region remaining after excluding the second colony region from the first colony region or the third colony region, and
     the inspection device comprises:
     a classification unit that classifies at least one of the first colony region and the third colony region, the second colony region, and the fourth colony region based on features and shapes;
     a counting unit that counts the number of each of the colony regions classified by the classification unit; and
     a specification unit that specifies the respective numbers of the two types of colony regions to be identified based on the counting results of the counting unit.
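The region logic of claim 5 — thresholding each of the X, Y, and Z images and forming a fourth region by excluding the second from the first (or third) — can be sketched with boolean masks. The threshold ranges and the sample pixel values below are placeholders, and for brevity the sketch counts pixels rather than classifying connected regions by feature and shape.

```python
import numpy as np

def region_mask(image, low, high):
    """Pixels whose level lies within the threshold setting range [low, high]."""
    return (image >= low) & (image <= high)

# Hypothetical X, Y, Z band images containing two small colony spots.
x_img = np.array([[200, 10], [180, 10]])
y_img = np.array([[190, 10], [20, 10]])
z_img = np.array([[195, 10], [185, 10]])

first = region_mask(x_img, 150, 255)   # first colony region (from X image)
second = region_mask(y_img, 150, 255)  # second colony region (from Y image)
third = region_mask(z_img, 150, 255)   # third colony region (from Z image)

# Fourth colony region: the first region with the second region excluded.
fourth = first & ~second

counts = {"first": int(first.sum()),
          "second": int(second.sum()),
          "fourth": int(fourth.sum())}
```

Because the second region is carved out of the first, a colony that responds in both the X and Y bands is counted once as the second type, and only the remainder is attributed to the fourth region.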
  6.  An imaging condition determination method for an imaging device that outputs an image in which regions made of two different types of material included in a subject are identifiable, wherein
     the imaging device comprises a monochromatic light source that illuminates the subject with monochromatic light, and an imaging unit including RGB pixels that images the subject,
     the method comprises:
     a first step of setting coefficients of a matrix by which the RGB image is multiplied in linear matrix calculation processing, in order to generate a three-band XYZ image in which RGB imaging characteristics are corrected by performing the linear matrix calculation processing on the three-band RGB image output from the imaging unit;
     a second step of changing the emission wavelength of the monochromatic light source so that the peak wavelengths of the spectral output characteristic of a first pixel among the RGB pixels constituting the imaging unit and of the spectral output characteristic of a second pixel different from the first pixel change, and illuminating the subject with the monochromatic light source after the emission wavelength change;
     a third step in which the imaging unit images the subject illuminated with the monochromatic light in the second step;
     a fourth step of multiplying the RGB image output from the imaging unit as a result of the imaging in the third step by the matrix to generate an XYZ image; and
     a fifth step of determining whether the difference in density between the regions of the two different types of material, in a second image output from the second pixel among the XYZ image generated in the fourth step, is equal to or greater than a predetermined threshold, and
     the first to fifth steps are repeated, with at least one of a change of the coefficients in the first step and a change of the emission wavelength in the second step, until the difference in density becomes equal to or greater than the predetermined threshold in the fifth step.
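The repetition of the first to fifth steps in claim 6 is essentially a search loop over the matrix coefficients and the emission wavelength, terminating when the density difference in the second image reaches the threshold. The sketch below is schematic: `capture_rgb` and `density_difference` are stand-ins for the real capture and evaluation (steps 2-5), the simulated contrast model and all numeric values are assumptions, and a single scalar stands in for the full matrix-coefficient change.

```python
import numpy as np

THRESHOLD = 50.0  # placeholder density-difference threshold

def capture_rgb(wavelength_nm):
    """Stand-in for steps 2-3: illuminate at the given wavelength and image.
    Here the simulated contrast between the two materials grows with wavelength."""
    contrast = (wavelength_nm - 440) * 2.0
    img = np.full((2, 2, 3), 100.0)
    img[0, 0] += contrast  # the material-A pixel diverges from the rest
    return img

def density_difference(rgb, matrix_scale):
    """Stand-in for steps 4-5: apply the (scalar-simplified) matrix and
    measure the density difference between the two regions in the second band."""
    second_band = rgb[..., 1] * matrix_scale
    return float(second_band.max() - second_band.min())

wavelength, scale = 440, 1.0
while density_difference(capture_rgb(wavelength), scale) < THRESHOLD:
    wavelength += 5   # change the emission wavelength (step 2)
    scale *= 1.05     # and/or change the matrix coefficients (step 1)

found = (wavelength, scale)
```

The loop stops at the first condition pair whose second-band contrast clears the threshold; that pair is then adopted as the imaging condition.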
  7.  An imaging method that images a subject and outputs an image in which regions made of two different types of material included in the subject are identifiable, wherein
     an imaging unit that images the subject and a monochromatic light source that illuminates the subject with monochromatic light are provided,
     the imaging unit includes an imaging element having RGB pixels,
     the RGB pixels include a first pixel having sensitivity in a first wavelength region and a second pixel having sensitivity in a second wavelength region,
     the first wavelength region and the second wavelength region partially overlap in an overlapping wavelength region,
     the monochromatic light is light in an emission wavelength region including at least part of the first wavelength region and at least part of the overlapping wavelength region, and
     the method comprises:
     an imaging step in which the imaging unit images the subject illuminated by the monochromatic light source;
     a calculation processing step of generating a three-band XYZ image in which RGB imaging characteristics are corrected, by performing linear matrix calculation processing on the three-band RGB image output from the imaging unit that has imaged the subject; and
     an amplification step of amplifying the two images corresponding to the first pixel and the second pixel among the three-band XYZ image, using gains set so as to color-balance the output levels of the first pixel and the second pixel between the two colors under the condition that the monochromatic light is emitted from the monochromatic light source.
PCT/JP2023/016219 2022-04-28 2023-04-25 Imaging device, inspection device, imaging condition determination method, and imaging method WO2023210619A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-075219 2022-04-28
JP2022075219 2022-04-28

Publications (1)

Publication Number Publication Date
WO2023210619A1 true WO2023210619A1 (en) 2023-11-02

Family

ID=88518956

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/016219 WO2023210619A1 (en) 2022-04-28 2023-04-25 Imaging device, inspection device, imaging condition determination method, and imaging method

Country Status (1)

Country Link
WO (1) WO2023210619A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017042146A (en) * 2015-08-28 2017-03-02 株式会社エヌテック Microorganism detection method, microorganism detection apparatus, and program
JP2021145185A (en) * 2020-03-10 2021-09-24 株式会社エヌテック Multispectral image pickup apparatus, inspection device and multispectral image pickup method
JP2021190802A (en) * 2020-05-28 2021-12-13 株式会社エヌテック Imaging device, inspection apparatus and imaging method


Similar Documents

Publication Publication Date Title
JP6629455B2 (en) Appearance inspection equipment, lighting equipment, photography lighting equipment
KR101692115B1 (en) Inspection apparatus and inspection method
US10393669B2 (en) Colour measurement of gemstones
US8049892B2 (en) Apparatus and method for camera-based color measurements
US20060132777A1 (en) Systems and methods for augmenting spectral range of an LED spectrophotometer
CN104062006B (en) Light measurement device, printing equipment and image display device
US11408819B2 (en) Process and system for identifying the gram type of a bacterium
JPH11510623A (en) Apparatus and method for inspecting sheet material such as bills or securities
KR20080080998A (en) Defect inspection device for inspecting defect by image analysis
KR20070065228A (en) Paper sheet discrimination apparatus, paper sheet processing apparatus, and paper sheet discrimination method
JP2008209211A (en) Foreign matter inspection apparatus and method
JP2021113744A (en) Imaging system
US20120114218A1 (en) High fidelity colour imaging of microbial colonies
EP0660277B1 (en) Method and apparatus for the characterization and discrimination of legal tender bank notes and documents
KR102240757B1 (en) Real-time detection system for foreign object contained in fresh-cut vegetable using LCTF-based multispectral imaging technology
WO2023210619A1 (en) Imaging device, inspection device, imaging condition determination method, and imaging method
CN116148265A (en) Flaw analysis method and system based on synthetic leather high-quality image acquisition
US10091443B2 (en) Camera system and method for inspecting and/or measuring objects
TW202409544A (en) Image capturing device, inspection device, image capturing condition determining method and image capturing method
CN111886492B (en) Color grading process and system for jadeite
JP2006300615A (en) Visual examination device and visual examination method
JPH09185711A (en) Shape recognizing method and its device
JP4270604B2 (en) Image input device
JP2020005053A (en) Spectral sensitivity measurement method for image sensors, inspection method for spectral sensitivity measurement devices, and spectral sensitivity measurement device
KR101646040B1 (en) Method of analyzing color alloy using reflectivity

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23796359

Country of ref document: EP

Kind code of ref document: A1