WO2022202694A1 - Inspection system and inspection method - Google Patents
Inspection system and inspection method
- Publication number
- WO2022202694A1 (PCT/JP2022/012761)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- microorganisms
- inspection system
- inspection
- image data
- fungi
- Prior art date
Classifications
- C12Q1/04—Determining presence or kind of microorganism; use of selective media for testing antibiotics or bacteriocides; compositions containing a chemical indicator therefor
- C12M41/36—Means for regulation, monitoring, measurement or control of concentration of biomass, e.g. colony counters or by turbidity measurements
- G01N1/38—Diluting, dispersing or mixing samples
- G01N15/0606—Investigating concentration of particle suspensions by collecting particles on a support
- G01N15/075—Investigating concentration of particle suspensions by optical means
- G01N15/1433—Optical investigation techniques, e.g. flow cytometry: signal processing using image recognition
- G01N15/1468—Optical investigation techniques with spatial resolution of the texture or inner structure of the particle
- G01N2015/1486—Counting the particles
- G06T7/0012—Biomedical image inspection
- G06T2207/30024—Cell structures in vitro; tissue sections in vitro
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V20/693—Microscopic objects, e.g. biological cells or cellular parts: acquisition
- G06V20/698—Microscopic objects, e.g. biological cells or cellular parts: matching; classification
Definitions
- This disclosure relates to an inspection system and an inspection method.
- Patent documents: JP 2007-195454 A, Japanese Patent Publication No. 2020-529869, WO 2019/074926
- the present disclosure provides an inspection system and inspection method that reduce the time required to inspect environmental microorganisms.
- A first aspect of the present disclosure is an inspection system for inspecting microorganisms or fungi occurring in an indoor environment or equipment, comprising: an imaging unit that directly captures a sample collected from the indoor environment or equipment; and an output unit that inspects for microorganisms or fungi in the image data captured by the imaging unit and outputs inspection results.
- A second aspect of the present disclosure is an inspection system for inspecting microorganisms or fungi occurring in an indoor environment or equipment, comprising: an imaging unit that captures the appearance of individual microorganisms or molds collected from the indoor environment or equipment; and an output unit that inspects for microorganisms or fungi in the image data captured by the imaging unit and outputs inspection results.
- A third aspect of the present disclosure is the inspection system according to the first or second aspect, wherein
- the object to be imaged by the imaging unit is a solution sampled from the indoor environment or equipment, and the imaging unit images the object through a lens under visible light or ultraviolet light.
- A fourth aspect of the present disclosure is the inspection system according to the third aspect, wherein
- the solution is obtained by solubilizing a sample taken from the indoor environment or equipment.
- A fifth aspect of the present disclosure is the inspection system according to the third aspect, wherein
- the sampling destination is any of an air conditioner, an air purifier, a humidifier, a ventilator, a blower, or a surface of the indoor environment.
- A sixth aspect of the present disclosure is the inspection system according to the third aspect, comprising a first trained model that determines attributes of each region in the image data.
- A seventh aspect of the present disclosure is the inspection system according to the sixth aspect, comprising a second trained model that determines the type of microorganisms or molds for regions determined by the first trained model to contain microorganisms or molds.
- An eighth aspect of the present disclosure is the inspection system according to any one of the first to seventh aspects,
- the output unit counts and outputs the number of microorganisms or fungi in the image data captured by the imaging unit for each type.
- A ninth aspect of the present disclosure is an inspection system for inspecting microorganisms or fungi occurring in an indoor environment or equipment, comprising: an imaging unit that directly captures a sample collected from the indoor environment or equipment; and an output unit that inspects for microorganisms or fungi in image data captured by the imaging unit and outputs inspection results, wherein
- the output unit displays, as inspection results, the type of microorganisms or molds, the number or ratio of microorganisms or molds, and the image data captured by the imaging unit, and additionally displays any one of: information indicating the indoor environment or equipment, a description of the microorganisms or molds, a contamination level, and a comparison with other test results.
- A tenth aspect of the present disclosure is an inspection system for inspecting microorganisms or fungi occurring in an indoor environment or equipment, comprising: an imaging unit that captures the appearance of individual microorganisms or molds collected from the indoor environment or equipment; and an output unit that inspects for microorganisms or fungi in image data captured by the imaging unit and outputs inspection results, wherein
- the output unit displays, as inspection results, the type of microorganisms or molds, the number or ratio of microorganisms or molds, and the image data captured by the imaging unit, and additionally displays any one of: information indicating the indoor environment or equipment, a description of the microorganisms or molds, a contamination level, and a comparison with other test results.
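The display items of the ninth and tenth aspects can be illustrated with a small formatting sketch. This is an illustrative example only; the function and its arguments are hypothetical and not part of the disclosure. The per-type counts are always shown, plus exactly one of the optional extra items.

```python
def format_report(counts, extra_kind, extra_value):
    """Assemble the inspection-result display: the type and count of each
    microorganism/mold, plus one extra item chosen from: information on
    the indoor environment or equipment, a description of the
    microorganisms or molds, a contamination level, or a comparison with
    other test results."""
    lines = [f"{kind}: {n}" for kind, n in sorted(counts.items())]
    lines.append(f"{extra_kind}: {extra_value}")
    return "\n".join(lines)
```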
- An eleventh aspect of the present disclosure is the inspection system according to any one of the third to seventh aspects,
- the object to be imaged is obtained by dispersing the collected sample into a solution.
- A twelfth aspect of the present disclosure is the inspection system according to the eleventh aspect, wherein
- the object to be imaged is obtained by dispersing a sample collected from an air conditioner into a solution.
- A thirteenth aspect of the present disclosure is the inspection system according to the eleventh or twelfth aspect, wherein
- the object to be imaged is obtained by dispersing the collected sample in a physiological saline solution containing a surfactant to form a solution.
- A fourteenth aspect of the present disclosure is the inspection system according to any one of the eleventh to thirteenth aspects, wherein
- the object to be imaged includes a first solution obtained by dispersing and dissolving the collected sample in a physiological saline solution in which a surfactant is dissolved, and a second solution obtained by diluting the first solution with a physiological saline solution containing a surfactant.
- A fifteenth aspect of the present disclosure is the inspection system according to the seventh aspect, wherein each region whose attributes are determined by the first trained model is 32 × 32 pixels or more.
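The 32 × 32 pixel lower bound of the fifteenth aspect amounts to a simple size check on candidate regions. A minimal sketch follows; the `(x1, y1, x2, y2)` box format is an assumption made for illustration, not something the disclosure specifies.

```python
MIN_SIZE = 32  # fifteenth aspect: each region is 32 x 32 pixels or more

def region_large_enough(box, min_size=MIN_SIZE):
    """Return True if the box spans at least min_size pixels on each axis."""
    x1, y1, x2, y2 = box
    return (x2 - x1) >= min_size and (y2 - y1) >= min_size
```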
- A sixteenth aspect of the present disclosure is the inspection system according to the seventh aspect, wherein
- the first trained model is generated by performing learning processing using learning data in which image data containing the same type of microorganisms or fungi is associated with position information that specifies, by a plurality of coordinates, the position of each region in the image data determined to contain the microorganisms or fungi.
- A seventeenth aspect of the present disclosure is the inspection system according to the seventh aspect, wherein, when the types of microorganisms or fungi are determined, the second trained model counts the microorganisms or fungi for each determined type.
- An eighteenth aspect of the present disclosure is the inspection system according to any one of the first to seventeenth aspects,
- the inspection system is a mobile terminal,
- the imaging unit is built into the mobile terminal.
- A nineteenth aspect of the present disclosure is the inspection system according to the seventh aspect, wherein
- the inspection system includes a portable imaging table on which the imaging target is placed, and a mobile terminal connected to the imaging table,
- the imaging table has the imaging unit, and
- the mobile terminal includes the first trained model, the second trained model, and the output unit.
- A twentieth aspect of the present disclosure is the inspection system according to any one of the seventh, fifteenth, and sixteenth aspects, wherein
- the first trained model is a trained YOLO.
- A twenty-first aspect of the present disclosure is the inspection system according to the seventh or seventeenth aspect, wherein the second trained model is generated by performing learning processing using learning data in which partial image data of each region determined to contain microorganisms or fungi, taken from image data containing the same type of microorganisms or fungi, is associated with the type of those microorganisms or fungi.
- A twenty-second aspect of the present disclosure is the inspection system according to any one of the seventh, seventeenth, and twenty-first aspects, wherein
- the second trained model is a trained DML.
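At inference time, a DML (deep metric learning) model typically classifies a region by comparing its embedding against reference embeddings of known types. The following is a minimal sketch of that final nearest-reference lookup only; the embeddings themselves are assumed to come from the trained network, and the function name is hypothetical.

```python
import math

def nearest_type(embedding, references):
    """Return the type label whose reference embedding is closest
    (in Euclidean distance) to the query embedding.

    references: {type label: reference embedding (list of floats)}
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(references, key=lambda label: dist(embedding, references[label]))
```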
- A twenty-third aspect of the present disclosure is an inspection method for an inspection system that inspects microorganisms or mold occurring in an indoor environment or equipment, the method comprising: a step of directly photographing a sample taken from the indoor environment or equipment; and a step of inspecting for microorganisms or fungi in the photographed image data and outputting the inspection results.
- A twenty-fourth aspect of the present disclosure is an inspection method for an inspection system that inspects microorganisms or mold occurring in an indoor environment or equipment, the method comprising: a step of photographing the appearance of individual microorganisms or molds collected from the indoor environment or equipment; and a step of inspecting for microorganisms or fungi in the photographed image data and outputting the inspection results.
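The two-stage arrangement described across the sixth, seventh, eighth, and seventeenth aspects (a first model that finds regions containing microorganisms, and a second model that types them so they can be counted per type) can be sketched as follows. This is an illustrative outline only; `detect_regions` and `classify_region` are hypothetical stand-ins for the trained YOLO and DML models named in the twentieth and twenty-second aspects.

```python
from collections import Counter

def inspect(image, detect_regions, classify_region):
    """Two-stage inspection: detect candidate regions with the first
    trained model, classify each region with the second model, and count
    microorganisms/fungi per type for the inspection result.

    detect_regions(image)        -> list of (x1, y1, x2, y2) boxes
    classify_region(image, box)  -> type label, e.g. "fungus A"
    """
    boxes = detect_regions(image)
    labels = [classify_region(image, box) for box in boxes]
    counts = Counter(labels)
    total = sum(counts.values())
    ratios = {t: n / total for t, n in counts.items()} if total else {}
    return {"counts": dict(counts), "ratios": ratios}
```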
- FIG. 1 is a first diagram showing an application example of the inspection service providing system in the learning phase.
- FIG. 2 is a first diagram showing an application example of the inspection service providing system in the inspection phase.
- FIG. 3 is a diagram illustrating an example of a hardware configuration of a mobile terminal and an image processing apparatus;
- FIG. 4 is a diagram illustrating an example of a functional configuration of functions related to learning data generation processing of the image processing apparatus.
- FIG. 5 is a diagram illustrating an example of a functional configuration of functions related to learning processing of the image processing apparatus.
- FIG. 6 is a diagram illustrating an example of a functional configuration of functions related to inspection processing of a mobile terminal.
- FIG. 7 is a diagram illustrating an example of provision of inspection results by a mobile terminal.
- FIG. 8 is a flowchart showing the flow of learning processing.
- FIG. 9 is a flowchart showing the flow of inspection processing.
- FIG. 10 is a second diagram showing an application example of the inspection service providing system in the learning phase.
- FIG. 11 is a second diagram showing an application example of the inspection service providing system in the inspection phase.
- FIG. 12 is a diagram showing an example of a report containing test results.
- FIG. 1 is a first diagram showing an application example of the inspection service providing system in the learning phase.
- the inspection service providing system 100 has a mobile terminal 120 (an example of the inspection system according to the first embodiment), an imaging table 130, and an image processing device 140.
- an object to be imaged using the mobile terminal 120 is acquired by the following procedure and directly imaged under visible light (for example, under fluorescent light) or under ultraviolet light.
- (1) An experimenter collects a sample (dust 160) from an experimental air conditioner (indoor unit) 150. The sample may be collected from inside the air conditioner (indoor unit) 150 or from its outer surface.
- (2) The collected sample is dispersed in a physiological saline solution 170 in which a surfactant is dissolved to form a solution.
- "Dispersing the sample to form a solution" as used herein may include "drying the sample after it has been dispersed and solubilized".
- (3) A preparation is produced by extracting a portion (for example, 10 μl) of the solubilized sample from the solution 170 and dropping it onto a slide glass.
- samples 181_1 to 181_6 are collected so as to contain only the same type of environmental microorganisms, and are dropped on different slide glasses for each type to generate preparations 180_1 to 180_6.
- FIG. 1 shows a case where a sample (dust 160) collected from an air conditioner (indoor unit) 150 contains six types of environmental microorganisms.
- (4) The generated preparations 180_1 to 180_6 are sequentially placed on the imaging table 130 as imaging targets and photographed using the mobile terminal 120 under visible light (for example, under fluorescent light) or under ultraviolet light.
- Here, "direct imaging" in the learning phase means that the procedure from sampling (1) to imaging (4) does not include "culturing of environmental microorganisms"; that is, photography is performed without increasing the number of environmental microorganisms.
- Here, photographing an environmental microorganism individual (in the case of fungi, a mold individual, specifically one unit of spores and hyphae; in the case of bacteria, one bacterial individual) means obtaining information about its appearance (color and shape).
- the imaging table 130 is configured to be portable, and has a lens 131, a mounting section 132, and a lens support section 133, as shown in FIG.
- the lens 131 magnifies the object to be imaged (the slide 180_1 in the example of FIG. 1) to a predetermined magnification.
- the placement section 132 is a member on which an object to be imaged (the slide 180_1 in the example of FIG. 1) is placed.
- the lens support portion 133 is a member that supports the lens 131 at a position separated by a predetermined distance from the imaging target placed on the placement portion 132, and is a member on which the portable terminal 120 is placed on the upper surface.
- the lens support section 133 may be provided with an elevating mechanism for varying the distance between the object to be imaged placed on the placement section 132 and the lens 131 .
- the mobile terminal 120 incorporates an imaging device 121 (an example of an imaging unit) and is placed on the lens support unit 133 so that the position of the lens 131 matches the position of the imaging device 121, whereby the object to be imaged (in the example of FIG. 1, the slide 180_1) is photographed under visible light.
- the mobile terminal 120 transmits captured image data 122_1 to the image processing device 140 .
- the image processing device 140 generates learning data based on the image data 122_1 transmitted from the mobile terminal 120. In addition, the image processing device 140 performs learning processing on the learning model using the generated learning data, and generates a trained model. A trained model generated in the learning phase is installed in the mobile terminal 120 in the inspection phase.
- FIG. 2 is a first diagram showing an application example of the inspection service providing system in the inspection phase.
- the inspection service providing system 100 has a mobile terminal 120 (an example of the inspection system according to the first embodiment) and an imaging table 130.
- an object to be imaged using the mobile terminal 120 is obtained by the following procedure and directly imaged under visible light (for example, under fluorescent light) or under ultraviolet light.
- (1) The service provider collects a sample (dust 220) from the air conditioner (indoor unit) 210 of the user to whom the test results are to be provided. The sample may be collected from inside the air conditioner (indoor unit) 210 or from its outer surface.
- (2) The collected sample is dispersed in a physiological saline solution 230 in which a surfactant is dissolved to form a solution.
- (3) A preparation 240 is generated by extracting a portion (for example, 10 μl) of the solubilized sample 241 from the solution 230 and dropping it onto a slide glass.
- (4) The prepared slide 240 is placed on the imaging table 130 as an imaging target and photographed using the mobile terminal 120 under visible light (for example, under fluorescent light) or under ultraviolet light.
- Here, "direct imaging" in the inspection phase means that the procedure from sampling (1) to imaging (4) does not include "culturing of environmental microorganisms"; that is, photography is performed without increasing the number of environmental microorganisms.
- Here, photographing an environmental microorganism individual (in the case of fungi, a mold individual, specifically one unit of spores and hyphae; in the case of bacteria, one bacterial individual) means obtaining information about its appearance (color and shape).
- the imaging table 130 has a lens 131, a placement section 132, and a lens support section 133. Note that the imaging table 130 used in the examination phase is the same as the imaging table 130 (FIG. 1) used in the learning phase, so detailed description of each unit will be omitted here.
- the portable terminal 120 incorporates an imaging device 121 and is mounted on the lens support portion 133 so that the position of the lens 131 matches the position of the imaging device 121, whereby the object to be imaged (in the example of FIG. 2, the slide 240) is photographed under visible light (for example, under fluorescent light) or under ultraviolet light.
- a learned model is installed in the mobile terminal 120, and an inspection result 260 is provided by processing the photographed image data 250 using the learned model.
- With the above configuration, the inspection service providing system 100 provides the following effects:
- Environmental microorganisms in captured image data can be accurately identified even when they are similar in shape regardless of type, such as molds (belonging to the kingdom Fungi, in particular yeasts and ascomycetes).
- Because the culture method is not required, the time needed to provide test results is shortened, and samples can be taken and test results provided on the spot. As a result, proposals for realizing the optimum air environment according to the inspection results (cleaning the air conditioner, replacing filters, etc.) can be made immediately.
- FIG. 3 is a diagram illustrating an example of a hardware configuration of a mobile terminal and an image processing apparatus
- FIG. 3A shows an example of the hardware configuration of the mobile terminal 120 .
- the mobile terminal 120 has a processor 301, a memory 302, an auxiliary storage device 303, a display device 304, an operation device 305, a communication device 306, and an imaging device 121.
- Each piece of hardware of the mobile terminal 120 is interconnected via a bus 307.
- the processor 301 has various computing devices such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
- the processor 301 reads various programs (for example, an inspection program in the inspection phase) onto the memory 302 and executes them.
- the memory 302 has main storage devices such as ROM (Read Only Memory) and RAM (Random Access Memory).
- the processor 301 and the memory 302 form a so-called computer, and the processor 301 executes various programs read onto the memory 302, thereby realizing various functions of the computer.
- the auxiliary storage device 303 stores various programs and various data used when the various programs are executed by the processor 301 .
- the display device 304 is a display device that displays captured image data 122_1 and 250, inspection results 260, and the like.
- the operation device 305 is an input device used when inputting various instructions to the mobile terminal 120 .
- the communication device 306 is, for example, a communication device for communicating with the image processing device 140 .
- the imaging device 121 photographs slides 180_1 to 180_6 and 240, which are objects to be imaged.
- Various programs installed in the auxiliary storage device 303 are installed, for example, by being downloaded from the network via the communication device 306.
- FIG. 3B shows an example of the hardware configuration of the image processing apparatus 140.
- the processor 321 reads a learning program or the like onto the memory 322 and executes it.
- the auxiliary storage device 323 implements, for example, a learning data storage unit (described later).
- The drive device 327 is a device in which a recording medium 329 is set.
- The recording medium 329 here includes media that record information optically, electrically, or magnetically, such as a CD-ROM, a flexible disk, or a magneto-optical disk. The recording medium 329 may also include semiconductor memory that records information electrically, such as a ROM or flash memory.
- Various programs to be installed in the auxiliary storage device 323 are installed, for example, by setting the distributed recording medium 329 in the drive device 327 and having the drive device 327 read the programs recorded on the recording medium 329. Alternatively, the programs may be installed by being downloaded from the network via the communication device 326.
- FIG. 4 is a diagram illustrating an example of a functional configuration of functions related to learning data generation processing of the image processing apparatus.
- the image processing apparatus 140 has an image data acquisition unit 410, a correct label acquisition unit 420, and a learning data generation unit 430 as functions related to learning data generation processing.
- the image data acquisition unit 410 acquires image data 122_1 to 122_6 captured by the mobile terminal 120 under visible light (for example, under fluorescence) or under ultraviolet light in the learning phase.
- the image data 122_1 to 122_6 are image data obtained by photographing the preparations 180_1 to 180_6 under visible light (for example, under fluorescent light) or under ultraviolet light, respectively.
- the correct label acquisition unit 420 acquires the correct label input by the experimenter 450 .
- the experimenter 450 inputs the types of environmental microorganisms contained in the samples 181_1 to 181_6 of the preparations 180_1 to 180_6 respectively to the image processing device 140 as correct data, and the correct label obtaining unit 420 obtains them.
- Here, it is assumed that the experimenter 450 inputs "fungus A" as the type of environmental microorganisms contained in the sample 181_1 and "fungus B" as the type contained in the sample 181_2. It is also assumed that the experimenter 450 inputs "fungus C" for the sample 181_3 and "fungus D" for the sample 181_4. Furthermore, the experimenter 450 inputs "fungus E" for the sample 181_5 and "fungus F" for the sample 181_6.
- the learning data generation unit 430 associates the image data 122_1 to 122_6 acquired by the image data acquisition unit 410 with the corresponding environmental microorganism types ("fungus A" to "fungus F") acquired by the correct label acquisition unit 420 to generate learning data. The learning data generation unit 430 also stores the generated learning data in the learning data storage unit 440.
- learning data 441 to 446 indicate learning data generated by the learning data generation unit 430 and stored in the learning data storage unit 440. As shown in FIG. 4, the learning data 441 to 446 have "image data”, “image area”, “partial image”, and "correct label” as information items.
- the file name of the image data acquired by the image data acquisition unit 410 is stored in "image data".
- "image data 1" to "image data 6" are the file names of the image data 122_1 to 122_6, respectively.
- "Image area" stores the position information of each area determined to contain one environmental microorganism in the image data (a plurality of (x coordinate, y coordinate) pairs; for example, the (x coordinate, y coordinate) of two diagonal vertices of the area).
- "Partial image" stores the file name of the image data (partial image data) of each area, specified by the "image area", that is determined to contain one environmental microorganism in the image data.
- including one environmental microorganism refers to including one spore unit when the environmental microorganism is a mold, for example.
- the "correct label” stores the type of environmental microorganisms in each area specified by the "image area”.
- "fungus A" to "fungus F" are stored in the "correct label" fields of the learning data 441 to 446, respectively.
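As a concrete illustration of the information items above, one learning-data record can be sketched as a simple data structure. This is a minimal sketch only; the field names and sample values below are assumptions for illustration and do not appear in the publication.

```python
from dataclasses import dataclass
from typing import List, Tuple

# A coordinate pair (x, y); two diagonal vertices describe one region.
Point = Tuple[int, int]

@dataclass
class LearningRecord:
    """One entry of the learning data (cf. learning data 441 to 446)."""
    image_data: str                          # file name, e.g. "image data 1"
    image_areas: List[Tuple[Point, Point]]   # two diagonal vertices per region
    partial_images: List[str]                # file names of the cropped regions
    correct_label: str                       # e.g. "fungus A"

# Sample record with one region (values are illustrative only).
record = LearningRecord(
    image_data="image data 1",
    image_areas=[((10, 20), (42, 52))],
    partial_images=["image 1-1"],
    correct_label="fungus A",
)
```

Each record thus ties one captured image to the regions determined to contain a single environmental microorganism and to the type label entered by the experimenter.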
- FIG. 5 is a diagram illustrating an example of a functional configuration of functions related to learning processing of the image processing apparatus.
- the image processing device 140 has a first learning section 510 and a second learning section 520 as functions related to learning processing.
- the first learning unit 510 learns the process of determining areas containing environmental microorganisms in the image data. Specifically, the first learning unit 510 has a YOLO 511 and a comparison/modification unit 512.
- the YOLO 511 is a learning model that receives image data with the file name specified by "image data" in the learning data and determines the position information and attribute information (whether it is an environmental microorganism, etc.) of each area in the image data.
- the comparison/modification unit 512 compares the position information and attribute information of each region determined by the YOLO 511 with the position information of each region determined to contain environmental microorganisms, specified by the "image region" of the learning data, and calculates the error.
- the comparison/modification unit 512 also back-propagates the calculated error to update the model parameters of the YOLO 511 .
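The publication does not specify the error measure used by the comparison/modification unit 512; a common choice when comparing predicted regions with ground-truth regions is intersection-over-union (IoU). The sketch below assumes, as in the "image area" item, that each region is an axis-aligned rectangle given by two diagonal vertices; it is an illustrative assumption, not the publication's method.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes.

    Each box is given by two diagonal vertices, ((xa, ya), (xb, yb)),
    matching the position information stored in the "image area" item.
    """
    (ax1, ay1), (ax2, ay2) = box_a
    (bx1, by1), (bx2, by2) = box_b
    # Normalize so the first vertex is the top-left one.
    ax1, ax2 = min(ax1, ax2), max(ax1, ax2)
    ay1, ay2 = min(ay1, ay2), max(ay1, ay2)
    bx1, bx2 = min(bx1, bx2), max(bx1, bx2)
    by1, by2 = min(by1, by2), max(by1, by2)
    # Width/height of the overlapping rectangle (zero if disjoint).
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1)
             - inter)
    return inter / union if union else 0.0
```

An IoU of 1.0 means the predicted region matches the ground-truth region exactly; the training error can then be derived from how far the IoU falls below 1.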
- the second learning unit 520 learns the process of determining the type of environmental microorganism from the image data (partial image data) of each area determined to contain environmental microorganisms in the image data. Specifically, the second learning unit 520 has a DML (Deep Metric Learning) 521 and a comparison/modification unit 522.
- the DML 521 is a learning model that receives as input the partial image data specified by "partial image" in the learning data and outputs the type of environmental microorganism (six types in this embodiment, output as six classification probabilities).
- the comparison/modification unit 522 compares the type of environmental microorganism determined by the DML 521 with the type specified by the "correct label" of the learning data (classification probability, e.g., 1.0), and calculates the error.
- the comparison/modification unit 522 also back-propagates the calculated error to update the model parameters of the DML 521 .
- FIG. 6 is a diagram showing an example of a functional configuration of functions related to inspection processing of a mobile terminal.
- the mobile terminal 120 has an image data acquisition unit 610, a first inference unit 620, a partial image extraction unit 630, a second inference unit 640, and an output unit 650 as functions related to inspection processing.
- the first inference unit 620, the partial image extraction unit 630, and the second inference unit 640 function as an identification unit that identifies environmental microorganisms in image data.
- the image data acquisition unit 610 acquires image data 660 captured under visible light (for example, under fluorescence) or under ultraviolet light in the inspection phase.
- the image data 660 is image data obtained by photographing the preparation 240 under visible light (for example, under fluorescence) or under ultraviolet light.
- the first inference unit 620 has a trained YOLO 621 (an example of a first trained model) generated by the first learning unit 510 performing learning processing on the YOLO 511 in the learning phase.
- the first inference unit 620 causes the learned YOLO 621 to execute by inputting the image data 660 to the learned YOLO 621 .
- the learned YOLO 621 determines position information and attribute information (whether or not environmental microbes are included, etc.) of each area in the image data 660 .
- determines the coordinates (x1a, y1a), (x1b, y1b) as the position information of region 1 and determines that the region contains environmental microorganisms;
- determines the coordinates (x2a, y2a), (x2b, y2b) as the position information of region 2 and determines that the region contains environmental microorganisms;
- determines the coordinates (x3a, y3a), (x3b, y3b) as the position information of region 3 and determines that the region contains environmental microorganisms;
- determines the coordinates (x4a, y4a), (x4b, y4b) as the position information of region 4 and determines that the region contains environmental microorganisms;
- determines the coordinates (x5a, y5a), (x5b, y5b) as the position information of region 5 and determines that the region contains environmental microorganisms;
- determines that regions other than regions 1 to 5 do not contain environmental microorganisms.
- the first inference unit 620 notifies the partial image extraction unit 630 and the output unit 650 of position information and attribute information of each area in the image data 660 determined by the learned YOLO 621 .
- the partial image extraction unit 630 extracts the image data (partial image data) of each region containing environmental microorganisms in the image data 660, based on the position information and attribute information of each region notified by the first inference unit 620.
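Extracting partial image data from a region given by two diagonal vertices amounts to a simple crop. The sketch below is a minimal illustration using a nested-list "image" rather than a real image library; the function name and toy values are assumptions, not part of the publication.

```python
def crop_region(image, top_left, bottom_right):
    """Extract the partial image for one region.

    `image` is a 2-D grid of pixel values (rows of columns); the region is
    described, as in the embodiment, by two diagonal vertices
    (x1, y1), (x2, y2), with x running along columns and y along rows.
    """
    (x1, y1), (x2, y2) = top_left, bottom_right
    return [row[x1:x2] for row in image[y1:y2]]

# A 4x4 toy "image" whose pixel value encodes its (row, column) position.
image = [[r * 10 + c for c in range(4)] for r in range(4)]
# Crop columns 1-2 of rows 0-1.
partial = crop_region(image, (1, 0), (3, 2))
# partial == [[1, 2], [11, 12]]
```

With a real image array the same slicing applies row- and column-wise; each crop becomes one piece of partial image data passed on to the second inference unit.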
- the example of FIG. 6 shows how the partial image extraction unit 630 extracts partial image data 631 to 635 of regions 1 to 5.
- the partial image extraction unit 630 notifies the second inference unit 640 of the extracted partial image data 631-635.
- the second inference unit 640 has a learned DML 641 (second learned model) generated by the second learning unit 520 performing learning processing on the DML 521 in the learning phase.
- the second inference unit 640 causes the learned DML 641 to be executed by inputting the partial image data 631 to 635 to the learned DML 641 .
- the learned DML 641 determines the types of environmental microorganisms included in each of the partial image data 631-635.
- the example of FIG. 6 shows how the learned DML 641 determines "mold C" for the partial image data 631, "mold A" for the partial image data 632 and 633, "mold B" for the partial image data 634, and "mold C" for the partial image data 635.
- the second inference unit 640 notifies the output unit 650 of the types of environmental microorganisms determined by the learned DML 641 .
- the output unit 650 performs visualization processing to generate the inspection results and displays them on the display device 304.
- the inspection results generated by the output unit 650 include, for example:
- image data in which each region determined to contain environmental microorganisms in the image data 660 is colored according to the type of environmental microorganism, and
- aggregated data in which the number (or ratio) of each type of environmental microorganism in the image data 660 is tallied.
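The aggregated data can be produced by a straightforward tally of the types determined for the partial images. The sketch below is a minimal illustration; the type labels are sample values, not results from the publication.

```python
from collections import Counter

def aggregate(types):
    """Tally the number and ratio of each determined type."""
    counts = Counter(types)
    total = sum(counts.values())
    return {t: (n, n / total) for t, n in counts.items()}

# Types determined for partial image data 631 to 635 (illustrative values).
result = aggregate(["mold C", "mold A", "mold A", "mold B", "mold C"])
# result["mold A"] == (2, 0.4)
```

The per-type counts feed the "number" column of the aggregated data, and the ratios support the alternative percentage-based presentation.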
- the "inspection" in the inspection phase includes at least the quantification of the types and numbers of environmental microorganisms.
- FIG. 7 is a diagram illustrating an example of provision of inspection results by a mobile terminal.
- 7a of FIG. 7 is an example of image data 710 in which each area determined to contain environmental microorganisms in image data 660 is colored according to the type of environmental microorganisms.
- the example of 7a in FIG. 7 shows "mold C" colored blue, "mold A" colored red, and "mold B" colored yellow (in 7a of FIG. 7, these are represented by different types of hatching for convenience).
- 7b of FIG. 7 is an example of aggregated data 720 in which the number of each type of environmental microorganisms in the image data 660 is aggregated.
- the aggregated data 720 includes "kind of mold” and "number” as information items.
- "type of fungi" stores the names of the six types of fungi and a category other than fungi ("others"), and "number" stores the count for the corresponding type of fungi or for "others".
- FIG. 8 is a flowchart showing the flow of learning processing.
- In step S801, the experimenter 450 collects dust 160 from the experimental air conditioner (indoor unit) 150.
- In step S802, the experimenter 450 disperses the collected sample into a solution.
- In step S803, the experimenter 450 classifies the solution by type of environmental microorganism.
- In step S804, the experimenter 450 takes a sample from each classified solution and drops it onto a slide glass to generate the preparations 180_1 to 180_6. The experimenter 450 then photographs the generated preparations 180_1 to 180_6 (imaging targets) with the mobile terminal 120 under visible light (for example, under fluorescent light) or under ultraviolet light.
- In step S805, the image processing device 140 generates the learning data 441 to 446 for each type of environmental microorganism based on the captured image data.
- In step S806, the image processing device 140 performs learning processing on the YOLO using the generated learning data 441 to 446, learning the position information and attribute information of each region in the image data.
- In step S807, the image processing device 140 performs learning processing on the DML using the generated learning data 441 to 446, learning the types of environmental microorganisms for the partial image data of each region in the image data.
- FIG. 9 is a flowchart showing the flow of inspection processing.
- In step S901, the service provider collects dust 220 from the air conditioner (indoor unit) 210 of the user to whom the inspection results will be provided.
- In step S902, the service provider disperses the collected sample into a solution.
- In step S903, the service provider takes a sample from the solution and drops it onto a slide glass to generate the preparation 240. The service provider then photographs the generated preparation 240 (imaging target) with the mobile terminal 120 under visible light or ultraviolet light.
- In step S904, the mobile terminal 120 determines the position information and attribute information of each area in the captured image data and extracts the partial image data of each area determined to contain environmental microorganisms.
- In step S905, the mobile terminal 120 determines the type of environmental microorganism in each of the extracted partial image data.
- In step S906, the mobile terminal 120 performs visualization processing based on the position information and attribute information of each area in the image data and the type information of the environmental microorganisms, and provides the inspection results to the user.
- As is clear from the above description, the mobile terminal 120, which is an example of the inspection system according to the first embodiment:
- photographs, as the imaging target, a preparation generated by turning the sampled dust into a solution and dropping the solution onto a slide glass; that is, it photographs the dust directly, or photographs the external appearance of individual environmental microorganisms;
- identifies the environmental microorganisms in the captured image data using a learned YOLO that determines the attribute of each area in the image data and a learned DML that determines the type of environmental microorganism in areas determined to contain environmental microorganisms.
- the above describes the case in which the mobile terminal 120 functions as an inspection system having the imaging device 121, the identification unit (the second inference unit 640 and the like), and the output unit 650.
- however, some of the functions of the mobile terminal 120 may be arranged on an imaging table so that the imaging table and the mobile terminal 120 together form the inspection system.
- FIG. 10 is a second diagram showing an application example of the inspection service providing system in the learning phase.
- the inspection service providing system 100 in the learning phase has an inspection system 1010 and an image processing device 140; the difference is that the inspection system 1010 includes an imaging table 1020 and the mobile terminal 120.
- the imaging stand 1020 has an imaging device 1021 and is connected to the mobile terminal 120 .
- FIG. 11 is a second diagram showing an application example of the inspection service providing system in the inspection phase.
- the inspection service providing system 100 in the inspection phase has an inspection system 1010; the difference is that the inspection system 1010 includes an imaging table 1020 and the mobile terminal 120.
- the imaging table 1020 has an imaging device 1021 and is connected to the mobile terminal 120 .
- even when the imaging device 1021 is arranged on the imaging table 1020 and the mobile terminal 120 functions as the identification unit and the output unit, the same effects as in the first embodiment can be obtained.
- in the above, the imaging device 1021 is arranged on the imaging table 1020, but other functions realized by the mobile terminal 120 may also be arranged on the imaging table 1020.
- the combination of functional assignments between the imaging table 1020 and the mobile terminal 120 is arbitrary.
- FIG. 12 is a diagram showing an example of a report containing test results.
- the report 1200 includes the following information items:
- "user information" (reference numeral 1210),
- "photograph showing the collection position" (reference numeral 1220),
- "level" (reference numeral 1230),
- "inspection results" (reference numeral 1240),
- "description of molds" (reference numeral 1250), and the like.
- "user information" includes, as the status of sample collection, the date of collection, the name of the location where the sample was collected, and the like.
- in "photograph showing the collection position", a photograph or the like showing the sampling position is inserted as information indicating the environment in which the sample was collected.
- "level" describes the level of contamination calculated based on the inspection results.
- in "inspection results", the captured image data, a graph showing the types and numbers (or ratios) of the fungi detected, and the like are inserted.
- in "description of molds", detailed descriptions of each type of fungus (in the example of FIG. 12, fungus A to fungus F) are given.
- the report 1200 may further include comparison results with past inspection results.
- the past test results here include either or both of the past test results of the same place and the past test results of different collection places.
- the inspection systems according to the first and second embodiments are inspection systems for inspecting microorganisms or fungi generated in indoor environments or equipment.
- in the above embodiments, the object to be imaged is acquired by collecting dust and turning it into a solution, but the method for acquiring the object to be imaged is not limited to this.
- the object to be imaged may be acquired by collecting dirt and turning it into a solution.
- the object to be imaged may be obtained by obtaining the solution as it is.
- the object to be imaged may be a solution obtained by sampling from an indoor environment or a device.
- in the above, the learning data 441 to 446 are used to perform the learning processing, but the learning data used is not limited to the learning data 441 to 446; for example, learning data may be added so that the accuracy of identifying environmental microorganisms is further improved.
- in the above, the number of types of environmental microorganisms is six, but it is not limited to six; image data may be added to generate learning data corresponding to seven or more types of environmental microorganisms.
- six types of learning data are generated based on the number of types of environmental microorganisms.
- since a DML is used to determine the types of environmental microorganisms, even if environmental microorganisms other than the six types are included, they can be classified with high accuracy; likewise, even if substances other than environmental microorganisms are included, those substances can be classified with high accuracy. That is, with a DML, classification can be performed with high accuracy even when unlearned substances are included.
- mold is given as an example of environmental microorganisms, but environmental microorganisms are not limited to mold.
- it may be a microorganism such as a bacterium.
- Bacteria as used herein include, for example, Legionella, Bacillus, Micrococcus, and the like.
- in the above, physiological saline in which a surfactant is dissolved is used to turn the collected sample into a solution; however, the solution is not limited to this, and a solution further diluted with physiological saline in which a surfactant is dissolved may also be used.
- the solutions used to turn the collected sample into a solution include:
- a first solution obtained by dispersing the collected sample in physiological saline in which a surfactant is dissolved, and
- a second solution obtained by further diluting the first solution with physiological saline in which a surfactant is dissolved.
- the dilution ratio at this time is, for example, about 1 to 1000 times, preferably about 5 to 20 times.
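As a minimal arithmetic illustration of this dilution step, the volume of surfactant-dissolved physiological saline needed to reach a given total dilution factor can be computed as follows; the helper name and the 10 μl figure are assumptions for illustration, not values from the publication.

```python
def diluent_volume(sample_volume_ul, dilution_factor):
    """Volume of surfactant-dissolved saline (in μl) to add to
    `sample_volume_ul` of the first solution to reach the given
    total dilution factor.

    A dilution factor of 10 means 1 part first solution in 10 parts total.
    """
    if dilution_factor < 1:
        raise ValueError("dilution factor must be >= 1")
    return sample_volume_ul * (dilution_factor - 1)

# 10 ul of the first solution diluted 10x needs 90 ul of diluent.
volume = diluent_volume(10, 10)
```

For the preferred range of about 5 to 20 times, 10 μl of the first solution would thus take roughly 40 to 190 μl of diluent.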
- the output unit 650 counts the number (or ratio) of each type of environmental microorganisms in the image data 660 .
- the number of each type of environmental microorganism may be counted by, for example, a learned DML.
- REFERENCE SIGNS LIST 100: Inspection service providing system, 120: Mobile terminal, 121: Imaging device, 130: Imaging table, 132: Mounting unit, 133: Lens support unit, 140: Image processing device, 150: Air conditioner (indoor unit), 160: Dust, 180_1 to 180_6: Preparations, 210: Air conditioner (indoor unit), 220: Dust, 240: Preparation, 410: Image data acquisition unit, 420: Correct label acquisition unit, 430: Learning data generation unit, 441 to 446: Learning data, 510: First learning unit, 520: Second learning unit, 610: Image data acquisition unit, 620: First inference unit, 630: Partial image extraction unit, 640: Second inference unit, 650: Output unit, 730: Aggregated data, 1010: Inspection system, 1020: Imaging table, 1021: Imaging device, 1200: Report
Abstract
Description
An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, having:
an imaging unit that directly photographs a sample collected from the indoor environment or equipment; and
an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result.
An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, having:
an imaging unit that photographs the external appearance of individual microorganisms or fungi collected from the indoor environment or equipment; and
an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result.
The imaging target photographed by the imaging unit is a solution collected from an indoor environment or equipment, and the imaging unit photographs the imaging target through a lens under visible light or ultraviolet light.
The solution is obtained by turning a sample collected from the indoor environment or equipment into a solution.
The collection source is any of an air conditioner, an air purifier, a humidifier, ventilation equipment, a blower, or a surface in the indoor environment.
The system has a first trained model that determines the attribute of each region in the image data.
The system has a second trained model that determines the type of microorganism or fungus for a region determined by the first trained model to contain microorganisms or fungi.
The output unit counts and outputs, by type, the number of microorganisms or fungi in the image data captured by the imaging unit.
An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, having an imaging unit that directly photographs a sample collected from the indoor environment or equipment, and an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result, wherein the output unit displays, as the inspection result, the types of microorganisms or fungi, the number or ratio of microorganisms or fungi, and the image data captured by the imaging unit, together with any one of: information indicating the indoor environment or equipment, a description of the microorganisms or fungi, the level of contamination, or a comparison with other inspection results.
An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, having an imaging unit that photographs the external appearance of individual microorganisms or fungi collected from the indoor environment or equipment, and an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result, wherein the output unit displays, as the inspection result, the types of microorganisms or fungi, the number or ratio of microorganisms or fungi, and the image data captured by the imaging unit, together with any one of: information indicating the indoor environment or equipment, a description of the microorganisms or fungi, the level of contamination, or a comparison with other inspection results.
The imaging target is obtained by dispersing a collected sample into a solution.
The imaging target is obtained by dispersing a sample collected from an air conditioner into a solution.
The imaging target is obtained by dispersing the collected sample in physiological saline in which a surfactant is dissolved.
The imaging target includes a first solution obtained by dispersing the collected sample in physiological saline in which a surfactant is dissolved, and a second solution obtained by further diluting the first solution with physiological saline in which a surfactant is dissolved.
Each region whose attribute is determined by the first trained model is at least 32 × 32 pixels.
The first trained model is generated by performing learning processing using learning data in which image data containing a single type of microorganism or fungus is associated with position information in which the position of each region in the image data determined to contain microorganisms or fungi is specified by a plurality of coordinates.
When the second trained model determines the type of the microorganisms or fungi, it counts the microorganisms or fungi for each determined type.
The inspection system is a mobile terminal, and
the imaging unit is built into the mobile terminal.
The inspection system has a portable imaging table on which the imaging target is placed and a mobile terminal connected to the imaging table;
the imaging table has the imaging unit, and
the mobile terminal has the first trained model, the second trained model, and the output unit.
The first trained model is a trained YOLO.
The second trained model is generated by performing learning processing using learning data in which partial image data of each region determined to contain microorganisms or fungi, in image data containing a single type of microorganism or fungus, is associated with the type of that microorganism or fungus.
The second trained model is a trained DML.
An inspection method for an inspection system that inspects microorganisms or fungi generated in an indoor environment or in equipment, comprising:
a step of directly photographing a sample collected from the indoor environment or equipment; and
a step of inspecting microorganisms or fungi in the captured image data and outputting an inspection result.
An inspection method for an inspection system that inspects microorganisms or fungi generated in an indoor environment or in equipment, comprising:
a step of photographing the external appearance of individual microorganisms or fungi collected from the indoor environment or equipment; and
a step of inspecting microorganisms or fungi in the captured image data and outputting an inspection result.
<Inspection service providing system in the learning phase>
First, of the inspection service providing system including the inspection system according to the first embodiment, an application example in the learning phase will be described. FIG. 1 is a first diagram showing an application example of the inspection service providing system in the learning phase. As shown in FIG. 1, in the learning phase the inspection service providing system 100 has a mobile terminal 120 (an example of the inspection system according to the first embodiment), an imaging table 130, and an image processing device 140.
(1) The experimenter collects a sample (dust 160) from an experimental air conditioner (indoor unit) 150 (the sample may be collected from the inside of the air conditioner (indoor unit) 150 or from its outer surface).
(2) The collected sample is dispersed in a solution 170 of physiological saline in which a surfactant is dissolved. Note that "dispersing the sample into a solution" here may include "drying the sample that has been dispersed into a solution".
(3) A portion of the solubilized sample (for example, 10 μl) is taken from the solution 170 and dropped onto a slide glass to generate a preparation. At this time, the samples 181_1 to 181_6 are taken so that each contains only a single type of environmental microorganism, and each type is dropped onto a different slide glass, generating the preparations 180_1 to 180_6. The example in FIG. 1 shows a case in which the sample (dust 160) collected from the air conditioner (indoor unit) 150 contained six types of environmental microorganisms.
(4) The generated preparations 180_1 to 180_6 are sequentially placed on the imaging table 130 as imaging targets and photographed with the mobile terminal 120 under visible light (for example, under fluorescent light) or under ultraviolet light.
Next, of the inspection service providing system including the inspection system according to the first embodiment, an application example in the inspection phase will be described. FIG. 2 is a first diagram showing an application example of the inspection service providing system in the inspection phase. As shown in FIG. 2, in the inspection phase the inspection service providing system 100 has a mobile terminal 120 (an example of the inspection system according to the first embodiment) and an imaging table 130.
(1) The service provider collects a sample (dust 220) from the air conditioner (indoor unit) 210 of the user to whom the inspection results will be provided (the sample may be collected from the inside of the air conditioner (indoor unit) 210 or from its outer surface).
(2) The collected sample is dispersed in a solution 230 of physiological saline in which a surfactant is dissolved. Note that "dispersing the sample into a solution" here may include "drying the sample that has been dispersed into a solution".
(3) A portion of the solubilized sample 241 (for example, 10 μl) is taken from the solution 230 and dropped onto a slide glass to generate the preparation 240.
(4) The generated preparation 240 is placed on the imaging table 130 as the imaging target and photographed with the mobile terminal 120 under visible light (for example, under fluorescent light) or under ultraviolet light.
- Even environmental microorganisms whose shapes are similar regardless of type, such as molds (organisms belonging to the kingdom Fungi, in particular yeasts and ascomycetes), can be identified with high accuracy in the captured image data.
- Since no culture method is required, the time needed to provide inspection results can be shortened; for example, when the service provider visits the user to whom the inspection results are provided, the provider can collect a sample from the user's air conditioner and provide the inspection results on the spot. As a result, proposals for realizing an optimal air environment according to the inspection results (cleaning the air conditioner, replacing filters, and so on) can be made immediately.
- Since no culture method is required, the cost of providing inspection results can be reduced (there is no need to commission a specialized contractor).
- Since no culture method is required, dead environmental microorganisms can also be identified (that is, dead cells as well as live ones), and there is no need to collect living environmental microorganisms.
Next, the hardware configurations of the mobile terminal 120 and the image processing device 140 will be described. FIG. 3 is a diagram showing an example of the hardware configurations of the mobile terminal and the image processing device.
3a of FIG. 3 is an example of the hardware configuration of the mobile terminal 120. As shown in 3a of FIG. 3, the mobile terminal 120 has a processor 301, a memory 302, an auxiliary storage device 303, a display device 304, an operation device 305, a communication device 306, and an imaging device 121. The hardware components of the mobile terminal 120 are interconnected via a bus 307.
3b of FIG. 3 is an example of the hardware configuration of the image processing device 140. Since the hardware configuration of the image processing device 140 is largely the same as that of the mobile terminal 120, the description here focuses on the differences from the mobile terminal 120.
Next, the functional configuration of the image processing device 140 will be described. As described above, a learning program is installed in the image processing device 140, and by executing the program the image processing device 140 realizes:
- functions related to learning data generation processing that generates learning data,
- functions related to learning processing that performs learning processing on the learning models using the learning data,
and the like. These functions realized by the image processing device 140 are described separately below.
First, the functions related to learning data generation processing will be described. FIG. 4 is a diagram showing an example of the functional configuration of the functions related to learning data generation processing of the image processing device. As shown in FIG. 4, the image processing device 140 has an image data acquisition unit 410, a correct label acquisition unit 420, and a learning data generation unit 430 as functions related to learning data generation processing.
Next, the functions related to learning processing will be described. FIG. 5 is a diagram showing an example of the functional configuration of the functions related to learning processing of the image processing device. As shown in FIG. 5, the image processing device 140 has a first learning unit 510 and a second learning unit 520 as functions related to learning processing.
- the position information and attribute information of each region output by the YOLO 511 when the image data 122_1 with file name "image data 1" is input to the YOLO 511, and
- the position information of each region determined to contain environmental microorganisms, specified by the image areas "area 1-1", "area 1-2", "area 1-3", ...,
are compared, and the error is calculated.
- the type of environmental microorganism determined by the DML 521 when the partial image data with file name "image 1-1" is input to the DML 521, and
- the correct label "fungus A",
are compared, and the error is calculated.
Next, the functional configuration of the mobile terminal 120 will be described. As described above, an inspection program is installed in the mobile terminal 120, and by executing the program the mobile terminal 120 realizes the functions related to inspection processing. These functions realized by the mobile terminal 120 are described below.
- The coordinates (x1a, y1a), (x1b, y1b) are determined as the position information of region 1, and the region is determined to contain environmental microorganisms;
- the coordinates (x2a, y2a), (x2b, y2b) are determined as the position information of region 2, and the region is determined to contain environmental microorganisms;
- the coordinates (x3a, y3a), (x3b, y3b) are determined as the position information of region 3, and the region is determined to contain environmental microorganisms;
- the coordinates (x4a, y4a), (x4b, y4b) are determined as the position information of region 4, and the region is determined to contain environmental microorganisms;
- the coordinates (x5a, y5a), (x5b, y5b) are determined as the position information of region 5, and the region is determined to contain environmental microorganisms;
- regions other than regions 1 to 5 are determined not to contain environmental microorganisms.
The figure shows how these determinations were made.
- image data in which each region determined to contain environmental microorganisms in the image data 660 is colored according to the type of environmental microorganism, and
- aggregated data in which the number (or ratio) of each type of environmental microorganism in the image data 660 is tallied,
and the like are included.
Next, an example of providing inspection results by the mobile terminal 120 will be described. FIG. 7 is a diagram showing an example of providing inspection results by the mobile terminal.
Next, the flow of learning processing by the inspection service providing system 100 will be described. FIG. 8 is a flowchart showing the flow of the learning processing.
Next, the flow of inspection processing by the inspection service providing system 100 will be described. FIG. 9 is a flowchart showing the flow of the inspection processing.
As is clear from the above description, the mobile terminal 120, which is an example of the inspection system according to the first embodiment:
- photographs, through a lens under visible light or ultraviolet light, a preparation generated by turning the collected dust into a solution and dropping the solution onto a slide glass as the imaging target; that is, it photographs the dust directly, or photographs the external appearance of individual environmental microorganisms;
- identifies the environmental microorganisms in the captured image data using a trained YOLO that determines the attribute of each region in the image data and a trained DML that determines the type of environmental microorganism for regions determined to contain environmental microorganisms;
- performs visualization processing on the identified environmental microorganisms in the image data and provides the inspection results.
The first embodiment above described the case in which the mobile terminal 120 functions as an inspection system having the imaging device 121, the identification unit (the second inference unit 640 and the like), and the output unit 650. However, some of the functions of the mobile terminal 120 may be arranged on an imaging table so that the imaging table and the mobile terminal 120 together form the inspection system.
In the first and second embodiments above, at least the types and numbers of environmental microorganisms and the image data are output as an example of providing inspection results; however, the method of providing inspection results is not limited to this, and the results may, for example, be compiled into a report in a form that is easy for the user to understand.
- "user information" (reference numeral 1210),
- "photograph showing the collection position" (reference numeral 1220),
- "level" (reference numeral 1230),
- "inspection results" (reference numeral 1240),
- "description of molds" (reference numeral 1250),
and the like are included.
In the first embodiment above, the size of the image data (partial image data) of each region determined to contain environmental microorganisms was not specifically mentioned; the size of the image data of each region is assumed to be, for example, at least 32 × 32 pixels, or at least 50 × 50 pixels.
- a first solution obtained by dispersing the collected sample in physiological saline in which a surfactant is dissolved, and
- a second solution obtained by further diluting the first solution with physiological saline in which a surfactant is dissolved,
are included.
120: Mobile terminal
121: Imaging device
130: Imaging table
132: Mounting unit
133: Lens support unit
140: Image processing device
150: Air conditioner (indoor unit)
160: Dust
180_1 to 180_6: Preparations
210: Air conditioner (indoor unit)
220: Dust
240: Preparation
410: Image data acquisition unit
420: Correct label acquisition unit
430: Learning data generation unit
441 to 446: Learning data
510: First learning unit
520: Second learning unit
610: Image data acquisition unit
620: First inference unit
630: Partial image extraction unit
640: Second inference unit
650: Output unit
730: Aggregated data
1010: Inspection system
1020: Imaging table
1021: Imaging device
1200: Report
Claims (24)
- An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, comprising:
an imaging unit that directly photographs a sample collected from the indoor environment or equipment; and
an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result.
- An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, comprising:
an imaging unit that photographs the external appearance of individual microorganisms or fungi collected from the indoor environment or equipment; and
an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result.
- The inspection system according to claim 1 or 2, wherein the imaging target photographed by the imaging unit is a solution collected from an indoor environment or equipment, and the imaging unit photographs the imaging target through a lens under visible light or ultraviolet light.
- The inspection system according to claim 3, wherein the solution is obtained by turning a sample collected from the indoor environment or equipment into a solution.
- The inspection system according to claim 3, wherein the collection source is any of an air conditioner, an air purifier, a humidifier, ventilation equipment, a blower, or a surface in the indoor environment.
- The inspection system according to claim 3, comprising a first trained model that determines the attribute of each region in the image data.
- The inspection system according to claim 6, comprising a second trained model that determines the type of microorganism or fungus for a region determined by the first trained model to contain microorganisms or fungi.
- The inspection system according to any one of claims 1 to 7, wherein the output unit counts and outputs, by type, the number of microorganisms or fungi in the image data captured by the imaging unit.
- An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, comprising:
an imaging unit that directly photographs a sample collected from the indoor environment or equipment; and
an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result,
wherein the output unit displays, as the inspection result, the types of microorganisms or fungi, the number or ratio of microorganisms or fungi, and the image data captured by the imaging unit, together with any one of: information indicating the indoor environment or equipment, a description of the microorganisms or fungi, the level of contamination, or a comparison with other inspection results.
- An inspection system for inspecting microorganisms or fungi generated in an indoor environment or in equipment, comprising:
an imaging unit that photographs the external appearance of individual microorganisms or fungi collected from the indoor environment or equipment; and
an output unit that inspects microorganisms or fungi in the image data captured by the imaging unit and outputs an inspection result,
wherein the output unit displays, as the inspection result, the types of microorganisms or fungi, the number or ratio of microorganisms or fungi, and the image data captured by the imaging unit, together with any one of: information indicating the indoor environment or equipment, a description of the microorganisms or fungi, the level of contamination, or a comparison with other inspection results.
- The inspection system according to any one of claims 3 to 7, wherein the imaging target is obtained by dispersing a collected sample into a solution.
- The inspection system according to claim 11, wherein the imaging target is obtained by dispersing a sample collected from an air conditioner into a solution.
- The inspection system according to claim 11 or 12, wherein the imaging target is obtained by dispersing the collected sample in physiological saline in which a surfactant is dissolved.
- The inspection system according to any one of claims 11 to 13, wherein the imaging target includes a first solution obtained by dispersing the collected sample in physiological saline in which a surfactant is dissolved, and a second solution obtained by further diluting the first solution with physiological saline in which a surfactant is dissolved.
- The inspection system according to claim 7, wherein each region whose attribute is determined by the first trained model is at least 32 × 32 pixels.
- The inspection system according to claim 7, wherein the first trained model is generated by performing learning processing using learning data in which image data containing a single type of microorganism or fungus is associated with position information in which the position of each region in the image data determined to contain microorganisms or fungi is specified by a plurality of coordinates.
- The inspection system according to claim 7, wherein the second trained model, when it determines the type of the microorganisms or fungi, counts the microorganisms or fungi for each determined type.
- The inspection system according to any one of claims 1 to 17, wherein the inspection system is a mobile terminal, and the imaging unit is built into the mobile terminal.
- The inspection system according to claim 7, wherein the inspection system has a portable imaging table on which the imaging target is placed and a mobile terminal connected to the imaging table, the imaging table has the imaging unit, and the mobile terminal has the first trained model, the second trained model, and the output unit.
- The inspection system according to any one of claims 7, 15, or 16, wherein the first trained model is a trained YOLO.
- The inspection system according to claim 7 or 17, wherein the second trained model is generated by performing learning processing using learning data in which partial image data of each region determined to contain microorganisms or fungi, in image data containing a single type of microorganism or fungus, is associated with the type of that microorganism or fungus.
- The inspection system according to any one of claims 7, 17, or 21, wherein the second trained model is a trained DML.
- An inspection method for an inspection system that inspects microorganisms or fungi generated in an indoor environment or in equipment, the method comprising:
a step of directly photographing a sample collected from the indoor environment or equipment; and
a step of inspecting microorganisms or fungi in the captured image data and outputting an inspection result.
- An inspection method for an inspection system that inspects microorganisms or fungi generated in an indoor environment or in equipment, the method comprising:
a step of photographing the external appearance of individual microorganisms or fungi collected from the indoor environment or equipment; and
a step of inspecting microorganisms or fungi in the captured image data and outputting an inspection result.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2022244737A AU2022244737A1 (en) | 2021-03-22 | 2022-03-18 | Inspection system and inspection method |
EP22775481.9A EP4317404A1 (en) | 2021-03-22 | 2022-03-18 | Inspection system and inspection method |
CN202280021147.1A CN116981933A (zh) | 2021-03-22 | 2022-03-18 | 检查系统及检查方法 |
US18/549,657 US20240169749A1 (en) | 2021-03-22 | 2022-03-18 | Inspection system and inspection method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-047960 | 2021-03-22 | ||
JP2021047960 | 2021-03-22 | ||
JP2021161942 | 2021-09-30 | ||
JP2021-161942 | 2021-09-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022202694A1 true WO2022202694A1 (ja) | 2022-09-29 |
Family
ID=83397241
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/012758 WO2022202693A1 (ja) | 2021-03-22 | 2022-03-18 | 検査システム及び検査方法 |
PCT/JP2022/012761 WO2022202694A1 (ja) | 2021-03-22 | 2022-03-18 | 検査システム及び検査方法 |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/012758 WO2022202693A1 (ja) | 2021-03-22 | 2022-03-18 | 検査システム及び検査方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240169749A1 (ja) |
EP (1) | EP4317404A1 (ja) |
JP (4) | JP7206533B2 (ja) |
AU (1) | AU2022244737A1 (ja) |
WO (2) | WO2022202693A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20240055503A (ko) * | 2022-10-20 | 2024-04-29 | CJ CheilJedang Corporation | Artificial intelligence-based method and apparatus for detecting contamination of a culture medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000258416A (ja) * | 1999-03-11 | 2000-09-22 | Matsushita Electric Ind Co Ltd | Method and apparatus for preparing a test solution for allergen measurement |
JP2007195454A (ja) | 2006-01-26 | 2007-08-09 | Kao Corp | Method for morphological observation of fungi |
WO2019074926A1 (en) | 2017-10-09 | 2019-04-18 | Pathspot Technologies Inc. | SYSTEMS AND METHODS FOR DETECTION OF CONTAMINANTS ON SURFACES |
WO2019204854A1 (en) * | 2018-04-24 | 2019-10-31 | First Frontier Pty Ltd | System and method for performing automated analysis of air samples |
US20200158603A1 (en) * | 2018-11-16 | 2020-05-21 | Particle Measuring Systems, Inc. | Particle Sampling Systems and Methods for Robotic Controlled Manufacturing Barrier Systems |
US10769501B1 (en) * | 2017-02-15 | 2020-09-08 | Google Llc | Analysis of perturbed subjects using semantic embeddings |
JP2021047960A (ja) | 2019-09-19 | 2021-03-25 | Kioxia Corporation | Semiconductor memory device |
JP2021161942A (ja) | 2020-03-31 | 2021-10-11 | Toyota Industries Corporation | Scroll compressor |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6120631B2 (ja) * | 2013-03-22 | 2017-04-26 | Azbil Corporation | Evaluation method for microorganism detection apparatus |
2022
- 2022-03-15 JP JP2022040545A patent/JP7206533B2/ja active Active
- 2022-03-16 JP JP2022041461A patent/JP7206534B2/ja active Active
- 2022-03-18 US US18/549,657 patent/US20240169749A1/en active Pending
- 2022-03-18 EP EP22775481.9A patent/EP4317404A1/en active Pending
- 2022-03-18 WO PCT/JP2022/012758 patent/WO2022202693A1/ja unknown
- 2022-03-18 WO PCT/JP2022/012761 patent/WO2022202694A1/ja active Application Filing
- 2022-03-18 AU AU2022244737A patent/AU2022244737A1/en active Pending
- 2022-12-28 JP JP2022211148A patent/JP2023058482A/ja active Pending
- 2022-12-28 JP JP2022211149A patent/JP2023061925A/ja active Pending
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000258416A (ja) * | 1999-03-11 | 2000-09-22 | Matsushita Electric Ind Co Ltd | Method and apparatus for preparing a test solution for allergen measurement |
JP2007195454A (ja) | 2006-01-26 | 2007-08-09 | Kao Corp | Method for morphological observation of fungi |
US10769501B1 (en) * | 2017-02-15 | 2020-09-08 | Google Llc | Analysis of perturbed subjects using semantic embeddings |
WO2019074926A1 (en) | 2017-10-09 | 2019-04-18 | Pathspot Technologies Inc. | SYSTEMS AND METHODS FOR DETECTION OF CONTAMINANTS ON SURFACES |
WO2019204854A1 (en) * | 2018-04-24 | 2019-10-31 | First Frontier Pty Ltd | System and method for performing automated analysis of air samples |
US20200158603A1 (en) * | 2018-11-16 | 2020-05-21 | Particle Measuring Systems, Inc. | Particle Sampling Systems and Methods for Robotic Controlled Manufacturing Barrier Systems |
JP2021047960A (ja) | 2019-09-19 | 2021-03-25 | Kioxia Corporation | Semiconductor memory device |
JP2021161942A (ja) | 2020-03-31 | 2021-10-11 | Toyota Industries Corporation | Scroll compressor |
Non-Patent Citations (4)
Title |
---|
CAO NAM, MEYER MATTHIAS, THIELE LOTHAR, SAUKH OLGA: "Dataset: Pollen Video Library for Benchmarking Detection, Classification, Tracking and Novelty Detection Tasks", PROCEEDINGS OF THE THIRD WORKSHOP ON DATA: ACQUISITION TO ANALYSIS, ACM, NEW YORK, NY, USA, 16 November 2020 (2020-11-16), New York, NY, USA, pages 23 - 25, XP055968973, ISBN: 978-1-4503-8136-9, DOI: 10.1145/3419016.3431487 * |
KHAN HALEEM A, MOHAN KARUPPAYIL S.: "Fungal pollution of indoor environments and its management", SAUDI JOURNAL OF BIOLOGICAL SCIENCES, ELSEVIER, AMSTERDAM, NL, vol. 19, no. 4, 1 October 2012 (2012-10-01), AMSTERDAM, NL , pages 405 - 426, XP055968968, ISSN: 1319-562X, DOI: 10.1016/j.sjbs.2012.06.002 * |
KING MARIA D., LACEY RONALD E., PAK HYOUNGMOOK, FEARING ANDREW, RAMOS GABRIELA, BAIG TATIANA, SMITH BROOKE, KOUSTOVA ALEXANDRA: "Assays and enumeration of bioaerosols-traditional approaches to modern practices", AEROSOL SCIENCE AND TECHNOLOGY., ELSEVIER SCIENCE PUBLISHING, NEW YORK, NY., US, vol. 54, no. 5, 3 May 2020 (2020-05-03), US , pages 611 - 633, XP055968967, ISSN: 0278-6826, DOI: 10.1080/02786826.2020.1723789 * |
MAKI TERUYA, KAZUNORI HARA, MAROMU YAMADA, FUMIHISA KOBAYASHI, HIROSHI HASEGAWA, YASUNOBU IWASAKA: "Epifluorescent Microscopic Observation of Aerosol", EAROZORU KENKYU - JOURNAL OF AEROSOL RESEARCH. JAPAN., EAROZORU KENKYU KYOGIKAI, KYOTO., JP, vol. 28, no. 3, 1 January 2013 (2013-01-01), JP , pages 201 - 207, XP055968971, ISSN: 0912-2834, DOI: 10.11203/jar.28.201 * |
Also Published As
Publication number | Publication date |
---|---|
US20240169749A1 (en) | 2024-05-23 |
JP7206534B2 (ja) | 2023-01-18 |
JP2023058482A (ja) | 2023-04-25 |
JP2022146910A (ja) | 2022-10-05 |
EP4317404A1 (en) | 2024-02-07 |
JP2023061925A (ja) | 2023-05-02 |
WO2022202693A1 (ja) | 2022-09-29 |
AU2022244737A1 (en) | 2023-10-05 |
JP7206533B2 (ja) | 2023-01-18 |
JP2022146913A (ja) | 2022-10-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2014230824B2 (en) | Tissue object-based machine learning system for automated scoring of digital whole slides | |
US11226280B2 (en) | Automated slide assessments and tracking in digital microscopy | |
CN107580715A (zh) | Method and system for automated counting of microbial colonies | |
JP2018180635A (ja) | Image processing apparatus, image processing method, and image processing program | |
Chieco et al. | Image cytometry: protocols for 2D and 3D quantification in microscopic images | |
KR20140045923A (ko) | Method and software for analysing microbial growth | |
WO2022202694A1 (ja) | Inspection system and inspection method | |
CN103140757A (zh) | Information processing apparatus, information processing system, information processing method, program, and recording medium | |
CN107209111A (zh) | Quality control of automated whole slide analysis | |
JP2014235494A (ja) | Image processing apparatus and program | |
CN110320158A (zh) | Mobile chemical analysis | |
Cannet et al. | Wing interferential patterns (WIPs) and machine learning, a step toward automatized tsetse (Glossina spp.) identification | |
JP6158967B1 (ja) | Environmental contamination prediction system and method | |
JP2023061925A5 (ja) | Inspection system, inspection method, and inspection program | |
US20200074628A1 (en) | Image processing apparatus, imaging system, image processing method and computer readable recoding medium | |
WO2022019110A1 (ja) | Program, information processing device, information processing method, and model generation method | |
Wilson et al. | Automated bacterial identification by angle resolved dark-field imaging | |
Rhoads et al. | Comparison of the diagnostic utility of digital pathology systems for telemicrobiology | |
CN116981933A (zh) | Inspection system and inspection method | |
Powless et al. | Evaluation of acridine orange staining for a semi-automated urinalysis microscopic examination at the point-of-care | |
Ramakrishna et al. | Smart Phone based Microscopic Image Acquisition and Quantifying System for Detecting Dengue | |
Doornewaard et al. | Reproducibility in double scanning of cervical smears with the PAPNET system | |
Back et al. | From Rocks to Pixels: A Protocol for Reproducible Mineral Imaging and its Applications in Machine Learning | |
Micklem | Developing Digital Photomicroscopy | |
Wang et al. | Deep Learning Technique and UAV Imagery Dataset for Paddy Rice Panicle Detection at Early Stage |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22775481 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18549657 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280021147.1 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022244737 Country of ref document: AU Ref document number: AU2022244737 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2301005914 Country of ref document: TH |
|
ENP | Entry into the national phase |
Ref document number: 2022244737 Country of ref document: AU Date of ref document: 20220318 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022775481 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022775481 Country of ref document: EP Effective date: 20231023 |