WO2024024434A1 - Program for pest inspection and pest inspection device - Google Patents


Info

Publication number
WO2024024434A1
WO2024024434A1 (PCT/JP2023/024998)
Authority
WO
WIPO (PCT)
Prior art keywords
insect
image
paper
classification result
species name
Prior art date
Application number
PCT/JP2023/024998
Other languages
French (fr)
Japanese (ja)
Inventor
敦 岡田
忠 谷元
英男 白井
勇太 笠原
和之 日置
仁志 掛野
龍一 井上
翔斗 田浦
Original Assignee
日本農薬株式会社
株式会社アグリマート
株式会社エヌ・ティ・ティ・データCcs
Priority date
Filing date
Publication date
Application filed by 日本農薬株式会社, 株式会社アグリマート, 株式会社エヌ・ティ・ティ・データCcs
Publication of WO2024024434A1

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M - CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 1/00 - Stationary means for catching or killing insects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning

Definitions

  • The present disclosure relates to a program for pest inspection and a pest inspection device.
  • AI (artificial intelligence) is used for analysis in pest inspection.
  • An object of the present disclosure is to provide a pest inspection program and a pest inspection device that have increased reliability in analysis for pest inspection using artificial intelligence.
  • A program for inspecting insect pests causes a computer to: input an image of insect-trapping paper into an inspection model configured to classify the species name of an insect body stuck to the insect-trapping paper based on the image; classify the species name of the insect body; determine whether the classification result of the species name is questionable; and perform a cutoff process that changes any classification result determined to be questionable to an indication that the insect could not be classified.
  • a pest inspection program and a pest inspection device are provided that have increased reliability in analysis for pest inspection using artificial intelligence.
  • FIG. 1 is a diagram showing the configuration of a system according to an embodiment.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the server.
  • FIG. 3 is a functional block diagram of the server.
  • FIG. 4 is a diagram showing an example of an image of insect-trapping paper.
  • FIG. 5 is a flowchart showing analysis processing by the server.
  • FIG. 6 is a flowchart showing frame detection processing.
  • FIG. 7 is a flowchart showing the insect body detection process.
  • FIG. 8 is a flowchart showing the insect body classification process.
  • FIG. 9 is a diagram showing the inspection implementation range.
  • FIG. 10 is a flowchart showing the cutoff process.
  • FIG. 1 is a diagram showing the configuration of a system according to an embodiment.
  • the system 1 includes a terminal 2 and a server 3.
  • Terminal 2 and server 3 are connected via network NW.
  • the terminal 2 can connect to the network NW by, for example, wireless communication.
  • the terminal 2 is a terminal device that can be carried by the user, such as a smartphone or a tablet terminal.
  • the terminal 2 is carried by a user specialized in pest inspection, such as a PCO, to a place where an insect trap IT is installed, such as a factory or a restaurant, and takes an image of the insect trap paper attached to the insect trap IT.
  • the terminal 2 cooperates with the server 3 to perform pest inspection, such as identification and counting of pests stuck to the insect trap IT, based on the photographed images.
  • the terminal 2 may be a terminal device such as a personal computer that cannot be carried by the user. In this case, the terminal 2 receives an image of the insect trapping paper taken with a smartphone or the like and performs the inspection process.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the server 3.
  • the server 3 includes a processor 31, a ROM 32, a RAM 33, a storage 34, an input interface 35, a display 36, and a communication module 37.
  • the server 3 does not have to be a single server.
  • the server 3 may be configured as a cloud server, for example.
  • the processor 31 is a processor configured to control the operation of the server 3.
  • the processor 31 is, for example, a CPU.
  • the processor 31 may be an MPU, a GPU, or the like instead of a CPU. Further, the processor 31 does not need to be composed of one CPU or the like, and may be composed of a plurality of CPUs or the like.
  • the ROM 32 is, for example, a nonvolatile memory, and stores a startup program for the server 3 and the like.
  • the RAM 33 is, for example, a volatile memory.
  • the RAM 33 is used, for example, as a working memory during processing in the processor 31.
  • the storage 34 is, for example, a hard disk drive or a solid state drive.
  • the storage 34 stores an OS 341, a pest inspection program 342, an inspection model 343, a database 344, and a captured image 345.
  • the storage 34 may store programs and data other than the OS 341, pest inspection program 342, inspection model 343, database 344, and photographed image 345.
  • the OS 341 is a program for realizing the basic functions of the server 3.
  • Various programs stored in the storage 34 are executed under the control of the OS.
  • the pest inspection program 342 is a program that causes a computer to execute a series of processes for inspecting pests in accordance with a request from the user of the terminal 2.
  • the pest inspection program 342 may be a program executed by the processor 31 of the server 3.
  • the pest inspection program 342 may be a program that is downloaded to the terminal 2 as a web application and executed on the web browser of the terminal 2.
  • the pest inspection program 342 may be a program that is installed on the terminal 2 and executed on the terminal 2.
  • The inspection model 343 is a trained AI model that detects pests in a selected image and is equipped with a discriminator that identifies each detected pest, thereby performing the analysis for inspection.
  • The inspection model 343 predicts into which class the pest in the input image falls, and performs the analysis for inspection of the input image based on the class with the highest predicted probability. This inspection includes pest species identification and counting.
  • The inspection model 343 will be explained in detail later.
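The highest-probability selection described above can be sketched in Python. The class labels and the probability dictionary format are illustrative assumptions; the description does not specify the model's output representation:

```python
def pick_top_class(probabilities):
    """Return the (species_name, probability) pair with the highest
    predicted probability; the inspection model's analysis is based on
    the result with the highest predicted probability."""
    species, prob = max(probabilities.items(), key=lambda kv: kv[1])
    return species, prob

# Hypothetical classifier output for one detected insect body.
probs = {"Drosophila melanogaster": 0.82, "Musca domestica": 0.11, "other": 0.07}
top = pick_top_class(probs)  # ("Drosophila melanogaster", 0.82)
```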
  • The database 344 stores various data used in the cutoff process when the inspection model 343 analyzes an image.
  • The cutoff process is a process that screens out questionable results from the inspection model 343 in advance so that they are not presented to the user. The database 344 will be explained in detail later.
  • the photographed image 345 is image data uploaded by the user of the terminal 2.
  • the captured image 345 is an image that can be analyzed using the inspection model 343.
  • the photographed image 345 may include, in addition to the image uploaded by the user of the terminal 2, other images such as an image used for learning the inspection model 343, for example.
  • the input interface 35 includes input devices such as a keyboard, mouse, and touch panel.
  • When the user operates one of these input devices, a signal corresponding to the content of the operation is input to the processor 31.
  • the processor 31 performs various processes in response to this signal.
  • the display 36 is a display device such as a liquid crystal display or an organic EL display.
  • the display 36 displays various images.
  • the communication module 37 is a module that includes an interface configured to perform processing when the server 3 communicates with the terminal 2.
  • the communication module 37 is configured to connect to the network NW using a mobile phone line, a wired LAN line, a wireless LAN line, or the like.
  • FIG. 3 is a functional block diagram of the server 3.
  • the processor 31 of the server 3 operates as the control unit 311 by operating according to the pest inspection program 342.
  • The control unit 311 can operate as a classification unit and a presentation unit using the inspection model 343 and the database 344.
  • the inspection model 343 includes an insect trapping paper frame detection engine 3431, an insect body detection engine 3432, and an insect body classification engine 3433.
  • the insect trapping paper frame detection engine 3431 is a trained AI model that extracts feature amounts in an input image and detects a frame indicating a section of insect paper in the image based on the extracted feature amounts.
  • FIG. 4 is a diagram showing an example of an image of insect-trapping paper. In FIG. 4, the insect-trapping paper has been cut into four pieces IP1, IP2, IP3, and IP4, which are lined up and photographed together in a single image.
  • The insect-trapping paper frame detection engine 3431 detects the frame line L indicating the sections in the image of the insect-trapping paper, and thereby detects the section frames bounded by the frame line L and the longitudinal sides of the insect-trapping paper.
  • In the embodiment, the section frame is square.
  • However, the section frame does not necessarily have to be square as long as its size is determined in advance.
  • the insect-trapping paper frame detection engine 3431 is configured by, for example, a convolutional neural network (CNN). For the learning of the insect-trapping paper frame detection engine 3431, for example, various insect-trapping paper images including frame lines and teacher data indicating the coordinates of the frame lines in each insect-trapping paper image can be used.
  • The insect-trapping paper frame detection engine 3431 learns, for example, by the error backpropagation method, based on the error between the coordinates of the section frame detected from the features of the input insect-trapping paper image and the coordinates of the section frame given as teacher data.
  • the insect-trapping paper is photographed in a state where it is cut into four insect-trapping papers IP1, IP2, IP3, and IP4. This is a process for making it easier to fit the insect paper within the photographing range of the terminal 2 and ensuring a resolution suitable for image analysis.
  • the insect paper may be photographed without being cut into pieces.
  • the insect detection engine 3432 is a trained AI model that extracts features from the input image and detects insects in the image based on the extracted features. In embodiments, detection of insects and classification of insects are performed by different AI models.
  • the insect detection engine 3432 is configured by, for example, CNN. For example, images of various insects and training data indicating the coordinates of the insect in each insect image can be used for learning of the insect detection engine 3432.
  • the image of the insect may be an image of the insect stuck to the insect paper, or may be an image of the insect not stuck to the insect paper.
  • The insect detection engine 3432 learns, for example, by the error backpropagation method, based on the error between the coordinates of an insect body detected from the features of the input insect image and the coordinates of the insect body given as training data.
  • the coordinates of the insect body may be, for example, the coordinates of a minute rectangular range including the insect body.
  • the insect classification engine 3433 is a trained AI model that classifies insects in an image based on the features of the insects in the image.
  • the insect classification engine 3433 classifies insects present in the image range of the insect detected by the insect detection engine 3432.
  • Classification by the insect classification engine 3433 may include classification by insect species name and group name.
  • An insect species name is a species name given to each insect.
  • the group name is a group name based on the characteristics of the insect. For example, when insect-trapping paper is installed in a factory, the group name includes group names such as inside the factory and outside the factory. Inside the factory represents a group of insects that can occur inside the factory. Outside the factory represents a group of insects that originate outside the factory and invade the factory.
  • the group names "inside the factory” and “outside the factory” may each include further subdivided groups.
  • For example, the groups may include groups divided according to walking characteristics, groups divided according to whether the insects prefer a humid environment, groups divided according to whether the insects are fungivorous and, for fungivorous insects, according to which class of fungivore they belong to, and groups divided according to whether they are drought-resistant insects.
  • the groups may include, for example, groups divided according to differences in walking characteristics and groups divided according to the presence or absence of flying.
  • Group names are also given to groups subdivided from these group names "inside the factory” and “outside the factory.”
  • The insect classification engine 3433 performs classification not only by broad group names such as "inside the factory" and "outside the factory," but also by subdivided group names such as "fungivorous" and "non-fungivorous."
  • the insect classification engine 3433 is configured by, for example, CNN.
  • For the learning of the insect classification engine 3433, images of various insects and training data indicating the species name and group name of each insect can be used.
  • the image of the insect may be an image of the insect stuck to the insect paper, or may be an image of the insect not stuck to the insect paper.
  • The insect classification engine 3433 learns the features of each insect by the error backpropagation method, based on the error between the insect species name and group name classified from the features of the input insect image and the species name and group name given as training data.
  • The database 344 stores at least a certainty threshold 3441, frame data 3442, and insect data 3443.
  • the database 344 may store data other than these.
  • the certainty threshold 3441 is a threshold for the certainty of insect detection by the insect detection engine 3432 of the inspection model 343, for example.
  • the insect object detection engine 3432 is configured to detect, as an insect object, an object in the image that is likely to be an insect object, that is, an object that has a high probability of being an insect object, based on the feature amount extracted from the image.
  • the certainty threshold is a threshold for the likelihood of being an insect, and is set, for example, between 0 and 1.
  • the certainty threshold 3441 can be arbitrarily determined by the administrator of the server 3. As will be explained later, the control unit 311 assumes that an insect object whose likelihood of being an insect object is less than a certainty threshold is not detected.
  • The certainty threshold 3441 may also include a threshold for the certainty of frame detection by the insect-trapping paper frame detection engine 3431 and/or a threshold for the certainty of insect body classification by the insect classification engine 3433.
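Filtering detections against the certainty threshold can be sketched as follows. The detection record format and the 0.5 default are illustrative assumptions; the description only says the threshold is set between 0 and 1:

```python
CERTAINTY_THRESHOLD = 0.5  # set arbitrarily by the server administrator, between 0 and 1

def filter_detections(detections, threshold=CERTAINTY_THRESHOLD):
    """Treat insect-body candidates whose certainty of being an insect
    body is less than the threshold as not detected."""
    return [d for d in detections if d["certainty"] >= threshold]

detections = [
    {"coords": (10, 10, 24, 30), "certainty": 0.91},
    {"coords": (52, 8, 60, 15), "certainty": 0.32},  # likely not an insect body
]
kept = filter_detections(detections)  # only the 0.91 detection remains
```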
  • the frame data 3442 is data regarding the frame of the section set in advance on the insect trapping paper.
  • the frame data 3442 includes, for example, data of the actual size of one section. For example, if one section is a square section measuring 5 cm x 5 cm, data such as 5 cm in height and 5 cm in width may be stored as the frame data 3442.
  • the frame data 3442 may include data other than the actual size data of one section.
  • the frame data 3442 may include data regarding the frames of various sections that can be used in the cutting process, which will be described in detail later.
  • the insect data 3443 is data related to insects to be classified.
  • the insect data 3443 includes data such as insect species name, distribution range, breeding season, and actual size.
  • the actual size of the insect body may be registered for each distribution range.
  • the insect data 3443 in the embodiment also includes data on group names for each insect.
  • the insect data 3443 may include data other than these.
  • the insect data 3443 may include data regarding various types of insects to be classified that can be used in leg cutting processing, which will be described in detail later.
  • FIG. 5 is a flowchart showing analysis processing by the server 3.
  • the process in FIG. 5 is started when a request for insect body analysis is made from the terminal 2.
  • The terminal 2 notifies the server 3 of information on the image of the insect-trapping paper to be analyzed.
  • the server 3 performs the process shown in FIG. 5 on the captured image 345 to be analyzed, which is notified from the terminal 2.
  • the image of the insect-trapping paper to be analyzed may be an image transmitted from the terminal 2 together with the analysis request.
  • In step S1, the processor 31 as the control unit 311 performs frame detection processing on the captured image 345 to be analyzed.
  • the process moves to step S2.
  • the processor 31 inputs the captured image 345 to be analyzed into the inspection model 343.
  • the inspection model 343 detects the frame of the section in the photographed image 345 using the insect trapping paper frame detection engine 3431. The frame detection process will be explained in detail later.
  • In step S2, the processor 31 performs insect body detection processing on the photographed image 345 to be analyzed.
  • the process moves to step S3.
  • the processor 31 inputs the photographed image 345 to be analyzed into the inspection model 343.
  • the inspection model 343 detects the insect body in the photographed image 345 using the insect body detection engine 3432.
  • the insect body detection process will be explained in detail later.
  • the frame detection process in step S1 and the insect body detection process in step S2 may be performed in parallel.
  • In step S3, the processor 31 performs insect body classification processing on the photographed image 345 to be analyzed.
  • the process moves to step S4.
  • the processor 31 inputs the photographed image 345 to be analyzed into the inspection model 343.
  • the inspection model 343 uses the insect classification engine 3433 to classify insects in the photographed image 345 .
  • the insect body classification process will be explained in detail later.
  • In step S4, the processor 31 performs the cutoff process on the result of the insect body classification process. After the cutoff process, the process moves to step S5.
  • The processor 31 receives the results of the frame detection process and the insect body classification process from the inspection model 343 and, based on the received results, changes any classification result that does not meet the criteria to "unclassified," indicating that the insect could not be classified.
  • In step S5, the processor 31 returns the analysis result for the photographed image 345 to the terminal 2.
  • the analysis result may be a list including the position of the insect in the photographed image 345 and the species name of the insect at that position.
  • the terminal 2 can receive this list and display the results of identifying and counting the insects in the image of the insect trapping paper, for example, on a display.
  • "unclassified" is counted independently of the species name of each insect.
  • FIG. 6 is a flowchart showing the frame detection process.
  • the processor 31 inputs the captured image 345 to be analyzed into the inspection model 343.
  • the inspection model 343 calls the insect trapping paper frame detection engine 3431 and detects the frame of the section in the input photographed image 345.
  • the inspection model 343 then returns the coordinates of the partition frame to the processor 31 as the control unit 311.
  • the coordinates of frames of a plurality of sections can be detected in the image of insect paper.
  • The inspection model 343 may output a list including the coordinates of the frame of each section, or may output only the coordinates of the frame of one representative section.
  • In step S102, the processor 31 calculates the actual size per pixel of the photographed image 345 from the frame data 3442 stored in the database 344 and the size, in pixels, that the section frame occupies in the photographed image 345. The processor 31 then stores the calculated actual size in, for example, the RAM 33. After that, the process in FIG. 6 ends.
  • A section frame whose aspect ratio deviates greatly from the aspect ratio of the section frame stored as the frame data 3442 is treated as not detected.
  • A section frame for which the calculated actual size d2 falls outside a predetermined threshold range is also treated as not detected.
  • When a threshold for the certainty of section frame detection is stored as the certainty threshold 3441, it is desirable that a section frame whose detection certainty is less than the threshold also be treated as not detected.
  • The actual size d2 may be calculated for each section frame, or may be calculated for the frame of one representative section.
  • When d2 is calculated for each section frame, their average value or the like may be stored in the RAM 33.
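The calculation in step S102, together with the rejection rules above, can be sketched as follows. The 5 cm x 5 cm section size matches the example given for the frame data 3442; the aspect-ratio tolerance and the plausible d2 range are illustrative assumptions:

```python
SECTION_MM = (50.0, 50.0)  # registered actual size of one section: 5 cm x 5 cm

def actual_size_per_pixel(frame_px, section_mm=SECTION_MM,
                          max_aspect_dev=0.2, d2_range=(0.05, 1.0)):
    """Compute the actual size d2 (mm per pixel) from a detected section frame.

    A frame whose aspect ratio deviates greatly from the registered frame,
    or whose d2 falls outside a predetermined range, is treated as not
    detected (None is returned).
    """
    w_px, h_px = frame_px
    expected_aspect = section_mm[0] / section_mm[1]
    if abs(w_px / h_px - expected_aspect) > max_aspect_dev:
        return None  # aspect ratio deviates too much: treat as not detected
    d2 = section_mm[0] / w_px  # millimetres represented by one pixel
    if not (d2_range[0] <= d2 <= d2_range[1]):
        return None  # implausible size: treat as not detected
    return d2

d2 = actual_size_per_pixel((500, 500))  # a 500-pixel-wide frame gives 0.1 mm/pixel
```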
  • FIG. 7 is a flowchart showing the insect body detection process.
  • the processor 31 divides the captured image 345 to be analyzed into a plurality of blocks.
  • the pixel size of the block may be determined as appropriate. For example, one block may be determined to have the same size as the frame of the partition detected in the frame detection process.
  • In step S202, the processor 31 inputs each block of the photographed image 345 to the inspection model 343. All blocks may be input to the inspection model 343 at once, or a predetermined number of blocks at a time may be input to the inspection model 343.
  • the inspection model 343 calls the same number of insect detection engines 3432 as the number of input blocks in parallel to detect insects in each block. Then, each insect detection engine 3432 returns the coordinates of the insect detected for each block to the processor 31 as the control unit 311 along with the certainty of insect detection.
  • In step S203, the processor 31 integrates the insect body detection results for all blocks. After that, the process in FIG. 7 ends.
  • the processor 31 integrates the detection results by converting the coordinates of the insect body expressed in a coordinate system set for each block into coordinates in a coordinate system based on the captured image 345.
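The coordinate integration in step S203 can be sketched as follows. The block indexing scheme and the detection tuple format are illustrative assumptions:

```python
def integrate_block_detections(block_results, block_size):
    """Convert insect-body coordinates expressed in each block's local
    coordinate system into the coordinate system of the captured image."""
    merged = []
    for (bx, by), detections in block_results.items():
        for x, y, certainty in detections:
            # Offset local coordinates by the block's position in the image.
            merged.append((bx * block_size + x, by * block_size + y, certainty))
    return merged

blocks = {(0, 0): [(12, 34, 0.9)], (1, 0): [(5, 7, 0.8)]}
all_insects = integrate_block_detections(blocks, block_size=256)
# [(12, 34, 0.9), (261, 7, 0.8)]
```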
  • FIG. 8 is a flowchart showing the insect classification process.
  • the processor 31 cuts out an image of the inspection range from the captured image 345 to be analyzed.
  • FIG. 9 is a diagram showing the inspection implementation range.
  • the inspection range is, for example, a rectangular frame set in the captured image 345.
  • the inspection scope may be specified by the user of the terminal 2 prior to the analysis request.
  • In FIG. 9, the inspection implementation range R is specified on the insect-trapping paper IP4.
  • the insect classification process is performed only on the inspection range.
  • the inspection range R can be set for multiple ranges of the photographed image 345.
  • When inspection implementation ranges R are set for multiple ranges of the photographed image 345, insect body classification processing is performed for each inspection implementation range R.
  • the inspection implementation range R does not necessarily have to be designated by the user.
  • the inspection implementation range R may be the frame of the section detected by the frame detection process.
  • the entire photographed image 345 may be the inspection implementation range R.
  • the inspection implementation range R may be a minute rectangular range that includes the coordinates of the insect body detected in the insect body detection process.
  • In step S302, the processor 31 inputs the image of the inspection implementation range to the inspection model 343 along with the coordinates of the insect bodies detected in the insect body detection process. Images of all inspection implementation ranges may be input to the inspection model 343 at once, or images of a predetermined number of inspection implementation ranges at a time may be input to the inspection model 343.
  • the inspection model 343 calls the same number of insect classification engines 3433 as the input images of the inspection range in parallel to classify the insects in the images of each inspection range. Then, each insect classification engine 3433 returns the species name and group name of the insect classified for each inspection range to the processor 31 serving as the control unit 311 as a classification result.
  • In step S303, the processor 31 integrates the insect body classification results for all inspection implementation ranges. After that, the process in FIG. 8 ends. For example, the processor 31 compiles the species names and group names of the insects classified for each inspection implementation range into one list in which they are associated with the coordinates of each insect body and the certainty of detecting the insect body.
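Compiling the per-range results into one list, as in step S303, can be sketched as follows. The field names and the one-to-one pairing of detections with classifications are illustrative assumptions:

```python
def compile_results(classifications, detections):
    """Associate each classified species name and group name with the
    coordinates and detection certainty of the corresponding insect body."""
    return [
        {"species": c["species"], "group": c["group"],
         "coords": d["coords"], "certainty": d["certainty"]}
        for c, d in zip(classifications, detections)
    ]

classifications = [{"species": "Musca domestica", "group": "outside the factory"}]
detections = [{"coords": (120, 88), "certainty": 0.93}]
results = compile_results(classifications, detections)
```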
  • FIG. 10 is a flowchart showing the cutoff process.
  • the processor 31 selects, for example, one classification result from the classification results compiled into a list.
  • In step S402, the processor 31 determines whether the certainty of insect body detection associated with the selected classification result is higher than the certainty threshold. If it is determined in step S402 that the certainty is higher than the certainty threshold, the process moves to step S403. If it is determined that the certainty is less than or equal to the certainty threshold, the process moves to step S407.
  • In step S403, the processor 31 determines whether the combination of the insect species name and group name included in the selected classification result is correct. The combination is determined to be correct if it matches a combination of species name and group name registered in the insect data 3443. If it is determined in step S403 that the combination is correct, the process moves to step S404. If it is determined that the combination is incorrect, the process moves to step S407.
  • In step S404, the processor 31 calculates the actual size of the insect body of the selected classification result.
  • the actual size d2 per pixel of the captured image 345 has been calculated by the frame detection process described above. Therefore, the actual size of the insect body can be calculated from the product of the number p2 of pixels of the insect body detected from the captured image 345 and the actual size d2 per pixel.
  • In step S405, the processor 31 determines whether the classified insect body is within the distribution range. For example, when the difference between the actual size of an insect existing in the distribution range registered in the insect data 3443 and the calculated actual size of the insect body is within a threshold, it is determined that the classified insect body is within the distribution range. If it is determined in step S405 that the classified insect body is within the distribution range, the process moves to step S406. If not, the process moves to step S407.
  • In step S406, the processor 31 determines to output the selected classification result as is. After that, the process moves to step S408.
  • In step S407, the processor 31 changes the selected classification result to "unclassified".
  • the process moves to step S408.
  • In this way, a classification result is marked as "unclassified" if the certainty of insect body detection is low, if the combination of the insect species name and group name is inappropriate, or if the size of the detected insect body does not match the distribution range.
  • the low certainty of insect body detection indicates that there is a high possibility that something that is not an insect body is detected as an insect body.
  • An inappropriate combination of insect species name and group name indicates that a classification error may have occurred, for example an insect classified as occurring inside a factory even though it actually occurs outside the factory.
  • A mismatch with the distribution range of the insect likewise indicates that a classification error may have occurred.
  • In these cases, the questionable classification results are changed to "unclassified."
  • In step S408, the processor 31 determines whether the selection of all classification results has been completed. If it is determined in step S408 that the selection of all classification results has been completed, the process in FIG. 10 ends. If not, the process returns to step S401.
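The cutoff process of FIG. 10 (steps S401 to S408) can be sketched end to end as follows. The thresholds, the insect-data registry format, and all field names are illustrative assumptions left open by the description:

```python
def cutoff(results, insect_data, certainty_threshold=0.5,
           d2=0.1, size_tolerance_mm=2.0):
    """Change questionable classification results to "unclassified".

    insect_data maps a species name to its registered group name and the
    actual body size (mm) of that insect within the distribution range.
    d2 is the actual size per pixel obtained by the frame detection process.
    """
    for r in results:  # S401/S408: select each classification result in turn
        registered = insect_data.get(r["species"])
        if r["certainty"] <= certainty_threshold:
            r["species"] = "unclassified"          # S402: detection certainty too low
        elif registered is None or registered["group"] != r["group"]:
            r["species"] = "unclassified"          # S403: combination incorrect
        elif abs(r["pixels"] * d2 - registered["size_mm"]) > size_tolerance_mm:
            r["species"] = "unclassified"          # S404-S405: size outside distribution range
        # S406: otherwise the result is output as is
    return results

insect_data = {"Musca domestica": {"group": "outside the factory", "size_mm": 7.0}}
results = cutoff(
    [{"species": "Musca domestica", "group": "outside the factory",
      "certainty": 0.9, "pixels": 70},
     {"species": "Musca domestica", "group": "inside the factory",
      "certainty": 0.9, "pixels": 70}],
    insect_data)
# results[0] is kept; results[1] has an unregistered combination and becomes "unclassified"
```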
  • As described above, when a classification result by the inspection model 343 is questionable, the classification result is changed to "unclassified". This prevents identification and counting results based on clearly incorrect classification results from being presented to the user. As a result, the reliability of the inspection model 343 is increased.
  • Whether a classification result is questionable is determined based on three different judgments: the certainty of detecting the insect body, the combination of the species name and group name of the insect, and the distribution range based on the actual size of the insect body. Using these three different judgments further suppresses the presentation of identification and counting results based on questionable classification results to the user. On the other hand, it is not necessary to use all three judgments. The determination may be made using only one of them, for example only the combination of species name and group name or only the distribution range based on the actual size of the insect body, or it may be made using a different judgment.
  • the inspection model 343 in the embodiment detects insects in the photographed image 345 and classifies the insects using separate AI models.
  • the accuracy of insect classification is improved compared to when detection of insect bodies in the photographed image 345 and classification of insect bodies are performed by one AI model.
  • the actual size of the section frame shown in the photographed image 345 is used to calculate the actual size of the insect body. This is possible because the insect-trapping paper has frame lines indicating the frame of each section, and the actual size of each section frame is known.
  • the actual size of the insect body does not necessarily have to be calculated from the actual size of the section frame.
  • the terminal 2 may prompt the user to take the photo together with an object whose size is known.
  • the actual size of the object may be input by the user.
  • in the embodiment, when the certainty of insect body detection exceeds the certainty threshold, the process proceeds to the subsequent determination.
  • alternatively, the process may be configured to proceed to the subsequent determination only when both the certainty of insect body detection and the certainty of insect body classification exceed their certainty thresholds.
  • the certainty threshold for insect detection and the certainty threshold for insect classification may be the same value or may be different values.
  • analysis of captured images for inspection is performed in the server 3.
  • analysis of captured images for inspection may be performed at the terminal 2.
  • each process according to the embodiment described above can be stored as a program that can be executed by the processor 31, which is a computer.
  • it can be stored and distributed in a storage medium of an external storage device such as a magnetic disk, an optical disk, or a semiconductor memory.
  • the processor 31 reads the program stored in the storage medium of this external storage device, and its operation is controlled by the read program, whereby it can execute the above-described processes.
  • the present invention is not limited to the above-described embodiments, and can be variously modified at the implementation stage without departing from the spirit thereof.
  • each embodiment may be implemented in combination as appropriate, and in that case, the combined effect can be obtained.
  • the embodiments described above include various inventions, and various inventions can be extracted by combinations selected from the plurality of constituent features disclosed. For example, if a problem can be solved and an effect can be obtained even if some constituent features are deleted from all the constituent features shown in the embodiment, the configuration from which these constituent features are deleted can be extracted as an invention.
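The cutoff logic summarized in the points above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, data shapes, and the species/group and size tables are hypothetical placeholders, and only the three judgments named in the description (detection certainty, species/group consistency, and size-versus-distribution-range plausibility) are modeled.

```python
# Hypothetical sketch of the three-judgment cutoff process described above.
# A classification result is kept only if it passes every judgment;
# otherwise it is changed to "unclassified".

UNCLASSIFIED = "unclassified"

def cutoff(result, certainty_threshold, species_to_group, size_ranges_mm):
    """result: dict with keys 'certainty', 'species', 'group', 'size_mm'."""
    # Judgment 1: the detection certainty must reach the threshold.
    if result["certainty"] < certainty_threshold:
        return UNCLASSIFIED
    # Judgment 2: the species name must be consistent with the group name.
    if species_to_group.get(result["species"]) != result["group"]:
        return UNCLASSIFIED
    # Judgment 3: the measured actual size must fall within the size range
    # known for the species (a mismatch suggests a misclassification).
    low, high = size_ranges_mm[result["species"]]
    if not (low <= result["size_mm"] <= high):
        return UNCLASSIFIED
    return result["species"]

# Hypothetical reference data (not taken from the source document).
SPECIES_TO_GROUP = {"drain fly": "inside the factory",
                    "housefly": "outside the factory"}
SIZE_RANGES_MM = {"drain fly": (1.0, 5.0), "housefly": (5.0, 9.0)}

kept = cutoff({"certainty": 0.9, "species": "drain fly",
               "group": "inside the factory", "size_mm": 2.0},
              0.5, SPECIES_TO_GROUP, SIZE_RANGES_MM)
dropped = cutoff({"certainty": 0.9, "species": "drain fly",
                  "group": "outside the factory", "size_mm": 2.0},
                 0.5, SPECIES_TO_GROUP, SIZE_RANGES_MM)
```

In this sketch, a result failing any one judgment is screened out, matching the description that doubtful results are changed to "unclassified" before being presented to the user.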

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Insects & Arthropods (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Catching Or Destruction (AREA)
  • Image Analysis (AREA)

Abstract

This program for pest inspection causes a computer to: classify the species names of insects stuck to insect-trapping paper by inputting an image of the insect-trapping paper into an inspection model constructed to classify the species names of the insects on the basis of the image; and execute a cutoff process that determines whether or not a classification result for the species name of an insect is doubtful, and changes a classification result determined to be doubtful to unclassifiable.

Description

Program for pest inspection and pest inspection device
The present disclosure relates to a program and a pest inspection device for pest inspection.
In food factories, restaurants, and the like, pest inspection is required as part of hygiene management. In a pest inspection, pests stuck to sticky traps or the like installed in advance at predetermined locations such as food factories are identified and counted, and countermeasures are taken according to the results. The work of identifying and counting the pests and preparing a report including the results is often performed by a Pest Control Operator (PCO), an external contractor commissioned by the user. Conventionally, the PCO visually identifies and counts the pests stuck to the sticky traps and the like, summarizes the results in a report, and submits it to the client. Such visual identification and counting of pests requires a great deal of effort. In response, attempts have been made to analyze photographed images with artificial intelligence (AI) and to identify and count the insect bodies based on the analysis results. However, because AI analysis is performed by statistical methods, a 100% correct analysis result cannot in theory be derived, and it is desired to improve the reliability of analysis using AI.
An object of the present disclosure is to provide a program for pest inspection and a pest inspection device with increased reliability in the analysis for pest inspection using artificial intelligence.
A program for pest inspection according to one aspect causes a computer to execute: classifying the species name of an insect body stuck to insect-trapping paper by inputting an image of the insect-trapping paper into an inspection model configured to classify, based on the image, the species names of the insect bodies stuck to the paper; and performing a cutoff process that determines whether the classification result of the species name of the insect body is doubtful and changes a classification result determined to be doubtful to "unclassified".
According to the present disclosure, a program for pest inspection and a pest inspection device with increased reliability in the analysis for pest inspection using artificial intelligence are provided.
FIG. 1 is a diagram showing the configuration of a system according to an embodiment.
FIG. 2 is a block diagram showing an example of the hardware configuration of the server.
FIG. 3 is a functional block diagram of the server.
FIG. 4 is a diagram showing an example of an image of insect-trapping paper.
FIG. 5 is a flowchart showing analysis processing by the server.
FIG. 6 is a flowchart showing frame detection processing.
FIG. 7 is a flowchart showing insect body detection processing.
FIG. 8 is a flowchart showing insect body classification processing.
FIG. 9 is a diagram showing the inspection implementation range.
FIG. 10 is a flowchart showing the cutoff process.
Hereinafter, embodiments will be described with reference to the drawings. FIG. 1 is a diagram showing the configuration of a system according to an embodiment. The system 1 includes a terminal 2 and a server 3, which are connected via a network NW. The terminal 2 can connect to the network NW by, for example, wireless communication.
The terminal 2 is a terminal device that can be carried by the user, such as a smartphone or a tablet terminal. The terminal 2 is carried by a user specialized in pest inspection, such as a PCO, to a place where an insect trap IT is installed, such as a factory or a restaurant, and photographs the insect-trapping paper attached to the insect trap IT. The terminal 2 then cooperates with the server 3 to perform a pest inspection, such as identification and counting of the pests stuck to the insect trap IT, based on the photographed image. The terminal 2 may instead be a terminal device that cannot be carried by the user, such as a personal computer; in this case, the terminal 2 receives an image of the insect-trapping paper taken with a smartphone or the like and performs the inspection processing. Although FIG. 1 shows one terminal 2, there may be two or more terminals 2.
FIG. 2 is a block diagram showing an example of the hardware configuration of the server 3. The server 3 includes a processor 31, a ROM 32, a RAM 33, a storage 34, an input interface 35, a display 36, and a communication module 37. The server 3 need not be a single server, and may be configured as, for example, a cloud server.
The processor 31 is a processor configured to control the operation of the server 3, and is, for example, a CPU. The processor 31 may instead be an MPU, a GPU, or the like. The processor 31 also need not consist of a single CPU or the like, and may consist of a plurality of CPUs or the like.
The ROM 32 is, for example, a nonvolatile memory, and stores a startup program for the server 3 and the like. The RAM 33 is, for example, a volatile memory, and is used as a working memory for processing in the processor 31.
The storage 34 is, for example, a hard disk drive or a solid state drive. The storage 34 stores an OS 341, a pest inspection program 342, an inspection model 343, a database 344, and photographed images 345, and may also store other programs and data.
The OS 341 is a program for realizing the basic functions of the server 3. The various programs stored in the storage 34 are executed under the control of the OS.
The pest inspection program 342 is a program that causes a computer to execute a series of processes for pest inspection in accordance with a request from the user of the terminal 2. The pest inspection program 342 may be a program executed by the processor 31 of the server 3. Alternatively, it may be a program downloaded to the terminal 2 as a web application and executed on the web browser of the terminal 2, or a program installed on and executed by the terminal 2.
The inspection model 343 is a trained AI model that detects the pests in a selected image and performs analysis for inspection with a classifier that identifies what each detected pest is. The inspection model 343 predicts what class the pest in the input image belongs to, and performs the analysis for inspection of the input image based on the results with the highest predicted probabilities. This inspection includes identification of the pest species names and counting. The inspection model 343 will be described in detail later.
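As a simple illustration of classifying "based on the result with the highest predicted probability", a classifier head typically produces one score per class, the scores are normalized into probabilities, and the class with the top probability is taken. This is a generic sketch, not the patented model; the class labels are hypothetical placeholders.

```python
# Minimal sketch: normalize per-class scores with softmax and pick the
# class with the highest predicted probability. Labels are placeholders.
import math

def softmax(scores):
    # Convert raw scores into probabilities that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def top_prediction(class_names, raw_scores):
    probs = softmax(raw_scores)
    best = max(range(len(probs)), key=lambda i: probs[i])
    return class_names[best], probs[best]

name, prob = top_prediction(["species A", "species B", "species C"],
                            [0.2, 2.1, -0.5])
```

Because the top probability is never 1.0 in practice, a downstream screening step such as the cutoff process described later remains necessary.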
The database 344 stores various data used in the cutoff process during image analysis by the inspection model 343. The cutoff process screens out questionable results from the inspection model 343 in advance so that they are not presented to the user. The database 344 will be described in detail later.
The photographed images 345 are image data uploaded by the user of the terminal 2, and can be subjected to analysis using the inspection model 343. The photographed images 345 may also include images other than those uploaded by the user of the terminal 2, for example images used for training the inspection model 343.
The input interface 35 includes input devices such as a keyboard, a mouse, and a touch panel. When the input interface 35 is operated, a signal corresponding to the operation is input to the processor 31, and the processor 31 performs various processes in response to this signal.
The display 36 is a display device such as a liquid crystal display or an organic EL display, and displays various images.
The communication module 37 is a module including an interface configured to handle the communication between the server 3 and the terminal 2. The communication module 37 is configured to connect to the network NW using a mobile phone line, a wired LAN line, a wireless LAN line, or the like.
FIG. 3 is a functional block diagram of the server 3. By operating according to the pest inspection program 342, the processor 31 of the server 3 operates as a control unit 311. The control unit 311 can operate as a classification unit and a presentation unit using the inspection model 343 and the database 344.
As shown in FIG. 3, the inspection model 343 includes an insect-trapping paper frame detection engine 3431, an insect body detection engine 3432, and an insect body classification engine 3433.
The insect-trapping paper frame detection engine 3431 is a trained AI model that extracts feature amounts from an input image and, based on the extracted feature amounts, detects the frames indicating the sections of the insect-trapping paper in the image. FIG. 4 is a diagram showing an example of an image of insect-trapping paper. FIG. 4 shows an image in which one or more sheets of insect-trapping paper have been cut into four pieces IP1, IP2, IP3, and IP4, which are lined up and photographed at the same time.
When a person visually identifies and counts the pests stuck to insect-trapping paper, the paper may be divided into a plurality of sections and the pests identified and counted for each section, or the counting result for the whole paper may be obtained by multiplying the result for one section by the number of sections. So that such sections can be identified by the human eye, frame lines L indicating the sections may be drawn on the insect-trapping paper perpendicular to its longitudinal sides at equal intervals. For example, if the short side of the paper is 5 cm and the frame lines are drawn at 5 cm intervals, one section is a 5 cm x 5 cm area. A person identifies a section by visually observing the frame lines L shown in FIG. 4. The insect-trapping paper frame detection engine 3431 detects the frame lines L indicating the sections in the image of the paper, and thereby detects the section frames surrounded by the frame lines L and the longitudinal sides of the paper. In FIG. 4 the section frames are square, but the section frames need not be square as long as their size is determined in advance.
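Because the actual size of a section frame (for example, 5 cm x 5 cm) is known in advance, the image scale, and from it the actual size of anything in the image, can be recovered from the frame's pixel dimensions. A minimal sketch, with illustrative numbers and function names:

```python
# Sketch: derive a cm-per-pixel scale from a detected section frame of
# known real size, then convert a measured pixel length to centimeters.

def scale_cm_per_px(frame_px_width, frame_real_cm=5.0):
    # frame_px_width: width in pixels of a detected section frame whose
    # real width (here assumed 5 cm) is known in advance.
    return frame_real_cm / frame_px_width

def px_to_cm(length_px, frame_px_width, frame_real_cm=5.0):
    return length_px * scale_cm_per_px(frame_px_width, frame_real_cm)

# If a 5 cm section frame spans 500 pixels in the image, an insect body
# measuring 30 pixels corresponds to 0.3 cm in actual size.
insect_cm = px_to_cm(30, 500)
```

This recovered actual size is what the cutoff process can later compare against the known size range of the classified species.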
The insect-trapping paper frame detection engine 3431 is configured by, for example, a convolutional neural network (CNN). For its training, images of various insect-trapping papers including frame lines and teacher data indicating the coordinates of the frame lines in each image can be used. The engine learns, for example, by error backpropagation based on the error between the coordinates of the section frame detected from the feature amounts of the input image and the coordinates of the section frame given as teacher data.
In FIG. 4, the insect-trapping paper is photographed after being cut into the four pieces IP1, IP2, IP3, and IP4. This makes it easier to fit the paper within the photographing range of the terminal 2 and secures a resolution suitable for image analysis. The paper may instead be photographed without being cut into pieces.
The insect body detection engine 3432 is a trained AI model that extracts feature amounts from an input image and detects the insect bodies in the image based on the extracted feature amounts. In the embodiment, detection of insect bodies and classification of insect bodies are performed by different AI models.
The insect body detection engine 3432 is configured by, for example, a CNN. For its training, images of various insects and teacher data indicating the coordinates of the insect body in each image can be used. The insect images may show insects stuck to insect-trapping paper or insects not stuck to it. The engine learns, for example, by error backpropagation based on the error between the coordinates of the insect body detected from the feature amounts of the input image and the coordinates given as teacher data. The coordinates of an insect body may be, for example, the coordinates of a small rectangular range containing the insect body.
The insect body classification engine 3433 is a trained AI model that classifies the insect bodies in an image based on their feature amounts. It classifies the insect bodies present in the image ranges detected by the insect body detection engine 3432. Classification by the insect body classification engine 3433 may include classification of the species name of the insect and classification of its group name. The species name is the name given to each insect species, and the group name is a name based on the characteristics of the insect. For example, when the insect-trapping paper is installed in a factory, the group names include names such as "inside the factory" and "outside the factory". "Inside the factory" represents the group of insects that can occur inside the factory, and "outside the factory" represents the group of insects that occur outside the factory and invade it. The group names "inside the factory" and "outside the factory" may each include further subdivided groups. For "inside the factory", the groups may include, for example, groups divided by differences in walking characteristics, groups divided by whether the insects prefer a humid environment, groups divided by whether the insects are fungivorous and, if so, which class of fungivorous insect they fall under, and groups divided by whether the insects are drought-resistant. For "outside the factory", the groups may include, for example, groups divided by differences in walking characteristics and groups divided by whether the insects fly. Group names are also given to the groups subdivided from "inside the factory" and "outside the factory". When the group names are subdivided, the insect body classification engine 3433 also classifies minor group names such as "fungivorous" and "non-fungivorous" in addition to major group names such as "inside the factory" and "outside the factory".
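The major/minor group structure just described can be represented as nested labels. The sketch below is purely illustrative; the group and subgroup names are invented placeholders, not the classification taxonomy of the source document.

```python
# Hypothetical representation of major group names and their subdivided
# (minor) group names, plus a consistency check on a classified pair.
GROUPS = {
    "inside the factory": ["fungivorous", "non-fungivorous",
                           "drought-resistant"],
    "outside the factory": ["flying", "non-flying"],
}

def is_valid_group(major, minor):
    # A classification result carries both a major and a minor group name;
    # the pair is consistent only if the minor group belongs to the major.
    return minor in GROUPS.get(major, [])

ok = is_valid_group("inside the factory", "fungivorous")
bad = is_valid_group("inside the factory", "flying")
```

A check of this kind is one way the species/group combination judgment in the cutoff process could detect an inconsistent classification.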
The insect body classification engine 3433 is configured by, for example, a CNN. For its training, images of various insects and teacher data indicating the species name and group name of each insect can be used. The insect images may show insects stuck to insect-trapping paper or insects not stuck to it. The engine learns the feature amounts of each insect, for example, by error backpropagation based on the error between the species name and group name classified from the feature amounts of the input image and the species name and group name given as teacher data.
As shown in FIG. 3, the database 344 stores at least a certainty threshold 3441, frame data 3442, and insect data 3443, and may store other data as well.
The certainty threshold 3441 is a threshold for the certainty of insect body detection by, for example, the insect body detection engine 3432 of the inspection model 343. The insect body detection engine 3432 is configured to detect, as an insect body, an object in the image with a high probability of being an insect body, based on the feature amounts extracted from the image. The certainty threshold is a threshold on this likelihood of being an insect body, and is set, for example, between 0 and 1. The certainty threshold 3441 can be set arbitrarily by the administrator of the server 3. As will be described later, the control unit 311 treats an insect body whose likelihood of being an insect body is below the certainty threshold as not having been detected. The certainty threshold 3441 may also include, in addition to the threshold for insect body detection by the insect body detection engine 3432, a threshold for the certainty of section frame detection by the insect-trapping paper frame detection engine 3431 and/or a threshold for the certainty of insect body classification by the insect body classification engine 3433.
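The rule that detections below the certainty threshold are regarded as not detected can be sketched as a simple filter. The data shapes here are illustrative, not the internal representation of the inspection model.

```python
# Sketch: keep only detections whose certainty (a value in 0..1) reaches
# the threshold; anything below is treated as not having been detected.

def filter_detections(detections, certainty_threshold):
    return [d for d in detections if d["certainty"] >= certainty_threshold]

detections = [
    {"bbox": (10, 10, 24, 18), "certainty": 0.92},  # confident detection
    {"bbox": (40, 55, 47, 61), "certainty": 0.31},  # below threshold
]
kept = filter_detections(detections, 0.5)
```

Because the threshold is a stored configuration value (the certainty threshold 3441), the administrator can tune the trade-off between missed insects and false detections without retraining the model.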
The frame data 3442 is data regarding the section frames set in advance on the insect-trapping paper, and includes, for example, the actual size of one section. For example, if one section is a 5 cm x 5 cm square, data such as 5 cm in height and 5 cm in width may be stored as the frame data 3442. The frame data 3442 may include data other than the actual size of one section, such as various data regarding the section frames that can be used in the cutoff process, which will be described in detail later.
The insect data 3443 is data regarding the insects to be classified, and includes data such as the species name, distribution range, breeding season, and actual size of each insect. The actual size of an insect body may be registered for each distribution range. The insect data 3443 in the embodiment also includes group name data for each insect, and may include other data regarding the insects to be classified that can be used in the cutoff process, which will be described in detail later.
Next, the image analysis processing by the server 3 will be described. FIG. 5 is a flowchart showing the analysis processing by the server 3. The processing in FIG. 5 is started when an insect body analysis request is made from the terminal 2. With the analysis request, the terminal 2 notifies the server of the information on the image of the insect-trapping paper to be analyzed, and the server 3 performs the processing in FIG. 5 on the notified photographed image 345 to be analyzed. The image of the insect-trapping paper to be analyzed may instead be an image transmitted from the terminal 2 together with the analysis request.
In step S1, the processor 31 as the control unit 311 performs frame detection processing on the photographed image 345 to be analyzed, after which the processing proceeds to step S2. In the frame detection processing, the processor 31 inputs the photographed image 345 into the inspection model 343, and the inspection model 343 detects the section frames in the photographed image 345 with the insect-trapping paper frame detection engine 3431. The frame detection processing will be described in detail later.
In step S2, the processor 31 performs insect body detection processing on the photographed image 345 to be analyzed, after which the processing proceeds to step S3. In the insect body detection processing, the processor 31 inputs the photographed image 345 into the inspection model 343, and the inspection model 343 detects the insect bodies in the photographed image 345 with the insect body detection engine 3432. The insect body detection processing will be described in detail later. The frame detection processing in step S1 and the insect body detection processing in step S2 may be performed in parallel.
In step S3, the processor 31 performs insect body classification processing on the photographed image 345 to be analyzed, after which the processing proceeds to step S4. In the insect body classification processing, the processor 31 inputs the photographed image 345 into the inspection model 343, and the inspection model 343 classifies the insect bodies in the photographed image 345 with the insect body classification engine 3433. The insect body classification processing will be described in detail later.
 ステップS4において、プロセッサ31は、虫体分類処理の結果に対する足切り処理を実施する。足切り処理の後、処理はステップS5に移行する。足切り処理において、プロセッサ31は、検査モデル343から枠検出処理の結果及び虫体分類処理の結果を受け取り、受け取った結果に基づいて、虫体分類処理の結果のうちの基準に満たない結果を分類できなかったことを示す「未分類」に分類する。 In step S4, the processor 31 performs a cut-off process on the result of the insect body classification process. After the leg cutting process, the process moves to step S5. In the cutting process, the processor 31 receives the results of the frame detection process and the insect classification process from the inspection model 343, and selects the results of the insect classification process that do not meet the criteria based on the received results. The item is classified as "Unclassified" indicating that it could not be classified.
 ステップS5において、プロセッサ31は、端末2に対して撮影画像345に対する解析結果を返却する。その後、図5の処理は終了する。解析結果は、撮影画像345における虫体の位置とその位置の虫体の種名とを含むリストであり得る。端末2は、このリストを受け取って、捕虫紙の画像における虫体の同定及び計数の結果を例えばディスプレイに表示し得る。ここで、「未分類」は、それぞれの虫体の種名とは独立して計数される。 In step S5, the processor 31 returns the analysis result for the photographed image 345 to the terminal 2. After that, the process in FIG. 5 ends. The analysis result may be a list including the position of the insect in the photographed image 345 and the species name of the insect at that position. The terminal 2 can receive this list and display the results of identifying and counting the insects in the image of the insect trapping paper, for example, on a display. Here, "unclassified" is counted independently of the species name of each insect.
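 The overall flow of steps S1 through S5 can be sketched as follows. This is a minimal illustration only: the patent defines no concrete API, so the model interface, the dictionary layout, and the simplified confidence check (the full cut-off processing of FIG. 10 also tests group names and actual size) are all assumptions.

```python
# Illustrative sketch of the analysis flow in FIG. 5 (steps S1-S5).
# All names are hypothetical; the patent does not define a concrete API.

def analyze(image, model, threshold=0.5):
    frames = model.detect_frames(image)                     # S1: frame detection
    detections = model.detect_insects(image)                # S2: insect body detection
    classified = model.classify_insects(image, detections)  # S3: classification
    results = filter_low_confidence(classified, threshold)  # S4: cut-off (simplified)
    return results  # S5: list of (position, species) entries for the terminal

def filter_low_confidence(classified, threshold):
    # Results below the criterion are relabeled "unclassified"; the full
    # cut-off processing additionally checks group names and actual size.
    out = []
    for item in classified:
        if item["confidence"] > threshold:
            out.append(item)
        else:
            out.append({**item, "species": "unclassified"})
    return out
```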
 FIG. 6 is a flowchart of the frame detection processing. In step S101, the processor 31 inputs the captured image 345 to be analyzed into the inspection model 343. The inspection model 343 calls the insect-trapping-paper frame detection engine 3431 and detects the frames of the sections in the input captured image 345. The inspection model 343 then returns the coordinates of the section frames to the processor 31 acting as the control unit 311. As shown in the example of FIG. 4, the coordinates of multiple section frames may be detected in the image of the insect-trapping paper. In this case, the inspection model 343 may output a list containing the frame coordinates of every section, or only the frame coordinates of one representative section.
 In step S102, the processor 31 calculates the actual size per pixel of the captured image 345 from the frame data 3442 stored in the database 344 and the size that the section frame occupies in the captured image 345. The processor 31 then stores the calculated actual size, for example in the RAM 33. The process of FIG. 6 then ends.
 For example, if d1 is the actual length of one side of the section frame stored as the frame data 3442 and p1 is the number of pixels spanned by one side of the section frame detected in the captured image 345, the actual size per pixel d2 of the captured image 345 can be calculated as d2 = d1/p1.
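 As a worked sketch of this calculation (and of the later use of d2 in step S404, where an insect body of p2 pixels measures p2 × d2), assuming a hypothetical function name and millimeter units:

```python
def actual_size_per_pixel(frame_side_mm: float, frame_side_px: int) -> float:
    """Per-pixel actual size d2 = d1 / p1 (step S102).

    frame_side_mm: known real-world length d1 of one side of the
        section frame (from the frame data 3442).
    frame_side_px: number of pixels p1 that side spans in the image.
    """
    if frame_side_px <= 0:
        raise ValueError("frame side must span at least one pixel")
    return frame_side_mm / frame_side_px

# Example: a 50 mm frame side spanning 500 pixels gives 0.1 mm/pixel,
# so an insect body 30 pixels long measures 3.0 mm (used in step S404).
d2 = actual_size_per_pixel(50.0, 500)
insect_mm = 30 * d2
```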
 Here, a section frame whose aspect ratio deviates greatly from the aspect ratio of the section frame stored as the frame data 3442 is desirably treated as not having been detected. In addition, since the actual size d2 can never exceed the size of the insect-trapping paper, a section frame for which the calculated actual size d2 falls outside a predetermined threshold range is also desirably treated as not having been detected. Furthermore, when a threshold for the certainty of section frame detection is stored as the certainty threshold 3441, a section frame whose detection certainty does not reach that threshold is also desirably treated as not having been detected.
 When multiple section frames are detected, the actual size d2 may be calculated for each section frame or for one representative section frame. When the actual size d2 is calculated for each section frame, the average or a similar statistic of those values may be stored in the RAM 33.
 FIG. 7 is a flowchart of the insect body detection processing. In step S201, the processor 31 divides the captured image 345 to be analyzed into multiple blocks. The pixel size of the blocks may be chosen as appropriate; for example, one block may be set to the same size as the section frame detected in the frame detection processing.
 In step S202, the processor 31 inputs each block of the captured image 345 into the inspection model 343. All blocks may be input into the inspection model 343 at once, or a predetermined number of blocks at a time. The inspection model 343 calls as many insect body detection engines 3432 in parallel as there are input blocks and detects the insect bodies in each block. Each insect body detection engine 3432 then returns the coordinates of the insect bodies detected in its block, together with the detection certainty, to the processor 31 acting as the control unit 311.
 In step S203, the processor 31 integrates the insect body detection results of all blocks. The process of FIG. 7 then ends. For example, the processor 31 integrates the detection results by converting the coordinates of the insect bodies, expressed in the coordinate system set for each block, into coordinates in the coordinate system of the captured image 345.
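 Steps S201 and S203 — tiling the image into blocks and translating per-block detection coordinates back into the image coordinate system — can be sketched as follows; the function names and tuple layout are illustrative assumptions:

```python
# Sketch of steps S201 (tiling) and S203 (coordinate integration).
# Names and the (x, y, w, h) tuple layout are hypothetical.

def split_into_blocks(width, height, block):
    """Yield (x_offset, y_offset, w, h) tiles covering the image."""
    for y in range(0, height, block):
        for x in range(0, width, block):
            yield (x, y, min(block, width - x), min(block, height - y))

def to_image_coords(block_origin, detections):
    """Translate block-local (x, y) detections into image coordinates."""
    ox, oy = block_origin
    return [(ox + x, oy + y) for (x, y) in detections]

blocks = list(split_into_blocks(1000, 600, 400))
# A detection at (25, 30) inside the block whose origin is (400, 0)
# lies at (425, 30) in the captured image 345.
merged = to_image_coords((400, 0), [(25, 30)])
```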
 FIG. 8 is a flowchart of the insect body classification processing. In step S301, the processor 31 cuts out images of the inspection ranges from the captured image 345 to be analyzed. FIG. 9 illustrates an inspection range. An inspection range is, for example, a rectangular frame set in the captured image 345, and may be specified by the user of the terminal 2 prior to the analysis request. In FIG. 9, for example, an inspection range R is specified on the insect-trapping paper IP4. When an inspection range R is specified, the insect body classification processing is performed only on that range. Inspection ranges R may be set for multiple areas of the captured image 345; in that case, the insect body classification processing is performed for each inspection range R. Note that the inspection range R does not necessarily have to be specified by the user. For example, a section frame detected in the frame detection processing may serve as the inspection range R, the entire captured image 345 may serve as the inspection range R, or a small rectangular area containing the coordinates of an insect body detected in the insect body detection processing may serve as the inspection range R.
 In step S302, the processor 31 inputs the images of the inspection ranges into the inspection model 343 together with the coordinates of the insect bodies detected in the insect body detection processing. All inspection-range images may be input into the inspection model 343 at once, or a predetermined number at a time. The inspection model 343 calls as many insect body classification engines 3433 in parallel as there are input inspection-range images and classifies the insect bodies in each image. Each insect body classification engine 3433 then returns the species names and group names of the insect bodies classified in its inspection range to the processor 31 acting as the control unit 311 as the classification result.
 In step S303, the processor 31 integrates the insect body classification results of all inspection ranges. The process of FIG. 8 then ends. For example, the processor 31 compiles the species names and group names of the insect bodies classified in each inspection range into a single list that associates them with the coordinates of each insect body and the detection certainty.
 FIG. 10 is a flowchart of the cut-off processing. In step S401, the processor 31 selects one classification result, for example from the classification results compiled into the list.
 In step S402, the processor 31 determines whether the insect body detection certainty associated with the selected classification result is higher than the certainty threshold. If the certainty is higher than the threshold, the process proceeds to step S403; if the certainty is at or below the threshold, the process proceeds to step S407.
 In step S403, the processor 31 determines whether the combination of species name and group name contained in the selected classification result is correct. The combination is determined to be correct when it matches a combination of species name and group name registered in the insect data 3443. If the combination is correct, the process proceeds to step S404; otherwise, the process proceeds to step S407.
 In step S404, the processor 31 calculates the actual size of the insect body in the selected classification result. The actual size per pixel d2 of the captured image 345 has already been calculated by the frame detection processing described above, so the actual size of the insect body can be calculated as the product of the number of pixels p2 of the insect body detected in the captured image 345 and the actual size per pixel d2.
 In step S405, the processor 31 determines whether the classified insect body falls within its distribution range. For example, when the difference between the actual size of insect bodies occurring in the distribution range registered in the insect data 3443 and the calculated actual size of the insect body is within a threshold, the classified insect body is determined to be within the distribution range. If the insect body is within the distribution range, the process proceeds to step S406; otherwise, the process proceeds to step S407.
 In step S406, the processor 31 decides to output the selected classification result as is. The process then proceeds to step S408.
 In step S407, the processor 31 changes the selected classification result to "unclassified". The process then proceeds to step S408. That is, in the embodiment, the classification result is set to "unclassified" when the insect body detection certainty is low, when the combination of species name and group name is inappropriate, or when the size of the detected insect body is inappropriate. A low detection certainty indicates a high likelihood that something other than an insect body has been detected as one. Similarly, an inappropriate combination of species name and group name indicates a possible classification error, for example an insect that by nature occurs outside a factory being classified as having occurred inside it. Likewise, an insect body falling outside its distribution range indicates that a classification error has occurred. In this way, in the embodiment, questionable classification results are changed to "unclassified".
 In step S408, the processor 31 determines whether all classification results have been selected. If the selection of all classification results is complete, the process of FIG. 10 ends; otherwise, the process returns to step S401.
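 A minimal sketch of the cut-off loop of FIG. 10 follows, under assumed data structures: the patent prescribes only the order of the three checks (detection certainty in S402, species/group combination in S403, actual size against the distribution range in S404 to S405), not any concrete representation.

```python
# Sketch of the cut-off loop in FIG. 10 (steps S401-S408).
# The dictionary layout and names are assumptions.

def cut_off(results, registered_pairs, size_ranges, d2, certainty_threshold):
    """Relabel questionable classification results as "unclassified".

    results: list of dicts with "species", "group", "certainty", and
        "pixels" (insect body length p2 in pixels).
    registered_pairs: set of valid (species, group) combinations
        (the insect data 3443).
    size_ranges: {species: (min_mm, max_mm)} actual-size ranges for
        insect bodies occurring in the registered distribution range.
    d2: actual size per pixel from the frame detection processing.
    """
    for r in results:                                          # S401/S408 loop
        if r["certainty"] <= certainty_threshold:              # S402
            r["species"] = "unclassified"                      # S407
            continue
        if (r["species"], r["group"]) not in registered_pairs:  # S403
            r["species"] = "unclassified"                      # S407
            continue
        size_mm = r["pixels"] * d2                             # S404
        lo, hi = size_ranges.get(r["species"], (0.0, float("inf")))
        if not (lo <= size_mm <= hi):                          # S405
            r["species"] = "unclassified"                      # S407
        # otherwise the result is output as is (S406)
    return results
```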
 As described above, according to the present embodiment, when a classification result from the inspection model 343 is questionable, that classification result is changed to "unclassified". This suppresses the presentation to the user of identification and counting results based on clearly incorrect classification results, and consequently increases the reliability of the inspection model 343.
 In the embodiment, whether a classification result is questionable is determined by three different tests: the insect body detection certainty, the combination of species name and group name, and the distribution range based on the actual size of the insect body. Using these three different tests further suppresses the presentation to the user of identification and counting results based on questionable classification results. On the other hand, the determination does not necessarily have to use all three tests; it may use only one of the combination of species name and group name and the distribution range based on the actual size of the insect body, or it may additionally use other tests.
 Furthermore, the inspection model 343 in the embodiment performs the detection of insect bodies in the captured image 345 and the classification of those insect bodies with separate AI models. This yields higher classification accuracy than when detection and classification are performed by a single AI model.
 [Modifications]
 Modifications of the embodiment are described below. In the embodiment, the actual size of the section frame shown in the captured image 345 is used to calculate the actual size of an insect body. This is possible because the insect-trapping paper carries ruled lines marking the section frames and the actual size of the section frame is known. On the other hand, when an object of known size appears in the captured image 345, the actual size of the insect body does not necessarily have to be calculated from the actual size of a section frame. In this case, when the user photographs the insect-trapping paper with the terminal 2, the terminal 2 may prompt the user to include an object of known size in the photograph, or the actual size of the object may be input by the user. With such a technique, the cut-off processing of the embodiment can also be applied to images of insect-trapping paper on which no section lines are ruled.
 In the embodiment, when the insect body detection certainty exceeds the certainty threshold, the process proceeds to the subsequent determinations. Alternatively, the process may be configured to proceed to the subsequent determinations only when the insect body detection certainty exceeds its certainty threshold and the insect body classification certainty also exceeds its certainty threshold. In this case, the certainty threshold for insect body detection and the certainty threshold for insect body classification may be the same value or different values.
 In the embodiment described above, the analysis of the captured image for inspection is performed in the server 3. Alternatively, the analysis of the captured image for inspection may be performed in the terminal 2.
 Each process according to the embodiment described above can also be stored as a program executable by the processor 31, which is a computer. The program can be stored and distributed on a storage medium of an external storage device such as a magnetic disk, an optical disc, or a semiconductor memory. The processor 31 can then read the program stored on the storage medium of the external storage device and, with its operation controlled by the read program, execute the processes described above.
 The present invention is not limited to the above embodiment and can be variously modified at the implementation stage without departing from its gist. The embodiments may also be combined as appropriate, in which case the combined effects are obtained. Furthermore, the above embodiment includes various inventions, and various inventions can be extracted by combinations selected from the multiple constituent features disclosed. For example, if the problem can be solved and the effects obtained even when some constituent features are removed from all the constituent features shown in the embodiment, the configuration from which those constituent features are removed can be extracted as an invention.
 1 system, 2 terminal, 3 server, 31 processor, 32 ROM, 33 RAM, 34 storage, 35 input interface, 36 display, 37 communication module, 341 OS, 342 pest inspection program, 343 inspection model, 344 database, 345 captured image, 3431 insect-trapping-paper frame detection engine, 3432 insect body detection engine, 3433 insect body classification engine, 3441 certainty threshold, 3442 frame data, 3443 insect data.

Claims (7)

  1.  A program for pest inspection for causing a computer to execute:
     inputting an image of insect-trapping paper into an inspection model configured to classify, based on the image of the insect-trapping paper, the species names of insect bodies stuck to the insect-trapping paper, and performing the classification of the species names of the insect bodies; and
     performing cut-off processing that determines whether a classification result of the species name of an insect body is questionable and changes a classification result determined to be questionable to indicate that the insect body could not be classified.
  2.  The program for pest inspection according to claim 1, wherein the inspection model further performs classification of a group name based on a characteristic of the species name of the insect body, and
     determining whether the classification result of the species name of the insect body is questionable includes determining that the classification result of the species name of the insect body is questionable when the combination of the classified species name and group name does not match a combination of species name and group name registered in advance.
  3.  The program for pest inspection according to claim 1, further causing the computer to calculate the actual size of the insect body from the image of the insect-trapping paper, wherein
     determining whether the classification result of the species name of the insect body is questionable includes determining that the classification result of the species name of the insect body is questionable when the distribution range of insect bodies having the calculated actual size is not included in a distribution range registered in advance.
  4.  The program for pest inspection according to claim 3, wherein calculating the actual size of the insect body from the image of the insect-trapping paper includes:
     detecting, from the image of the insect-trapping paper, a section frame of known actual size set in advance on the insect-trapping paper; and
     calculating the actual size of the insect body from the size that the section frame occupies in the image of the insect-trapping paper.
  5.  The program for pest inspection according to claim 4, wherein the inspection model is further configured to detect the section frame from the image of the insect-trapping paper.
  6.  The program for pest inspection according to claim 1, wherein determining whether the classification result of the species name of the insect body is questionable includes determining that the classification result of the species name of the insect body is questionable when the certainty of the detection result of the insect body in the image of the insect-trapping paper does not exceed a threshold.
  7.  A pest inspection device comprising:
     a classification unit that inputs an image of insect-trapping paper into an inspection model configured to classify, based on the image of the insect-trapping paper, the species names of insect bodies stuck to the insect-trapping paper, and performs the classification of the species names of the insect bodies; and
     a cut-off processing unit that determines whether a classification result of the species name of an insect body is questionable and performs cut-off processing that changes a classification result determined to be questionable to indicate that the insect body could not be classified.
PCT/JP2023/024998 2022-07-27 2023-07-05 Program for pest inspection and pest inspection device WO2024024434A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-119873 2022-07-27
JP2022119873A JP7400035B1 (en) 2022-07-27 2022-07-27 Programs and pest inspection equipment for pest inspection

Publications (1)

Publication Number Publication Date
WO2024024434A1 true WO2024024434A1 (en) 2024-02-01

Family

ID=89190372

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/024998 WO2024024434A1 (en) 2022-07-27 2023-07-05 Program for pest inspection and pest inspection device

Country Status (2)

Country Link
JP (1) JP7400035B1 (en)
WO (1) WO2024024434A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08263660A (en) * 1995-03-22 1996-10-11 Atr Onsei Honyaku Tsushin Kenkyusho:Kk Method and device for recognizing signal and learning method and device of signal recognizing device
JPH10187651A (en) * 1996-12-19 1998-07-21 Matsushita Electric Ind Co Ltd Learning type recognition and judgement device
JP2014142833A (en) * 2013-01-24 2014-08-07 Earth Kankyo Service Kk Captured insect identification method and identification system
JP2018014059A (en) * 2016-07-22 2018-01-25 株式会社トプコン Medical information processing system and medical information processing method
CN111507314A (en) * 2020-06-05 2020-08-07 张立 Artificial intelligence image data acquisition system of insect pest control facility
WO2021001957A1 (en) * 2019-07-03 2021-01-07 株式会社Luci Insect trap server and insect trap display system
JP2021114171A (en) * 2020-01-20 2021-08-05 キヤノンメディカルシステムズ株式会社 Medical image processing apparatus, program, and medical image processing system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6937438B2 (en) 2018-07-31 2021-09-22 オリンパス株式会社 Image diagnosis support system and image diagnosis support method
US20200117897A1 (en) 2018-10-15 2020-04-16 Walt Froloff Adaptive Artificial Intelligence Training Data Acquisition and Plant Monitoring System
EP3963476A1 (en) 2019-05-03 2022-03-09 Verily Life Sciences LLC Predictive classification of insects
US20210212305A1 (en) 2020-01-10 2021-07-15 Deborah Woods Insect Trap with Multi-textured Surface


Also Published As

Publication number Publication date
JP7400035B1 (en) 2023-12-18
JP2024017319A (en) 2024-02-08


Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23846167

Country of ref document: EP

Kind code of ref document: A1