WO2021173609A1 - Real-time monitoring and early detection system for insect activity in grains during storage - Google Patents

Real-time monitoring and early detection system for insect activity in grains during storage

Info

Publication number
WO2021173609A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
insects
trap
smart
collection chamber
Prior art date
Application number
PCT/US2021/019325
Other languages
French (fr)
Inventor
Zhongli Pan
Ragab KHIR
Original Assignee
The Regents Of The University Of California
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Regents Of The University Of California filed Critical The Regents Of The University Of California
Priority to US17/801,573 priority Critical patent/US20230129551A1/en
Priority to CA3169032A priority patent/CA3169032A1/en
Priority to CN202180026268.0A priority patent/CN115361865A/en
Priority to EP21759848.1A priority patent/EP4110049A4/en
Publication of WO2021173609A1 publication Critical patent/WO2021173609A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/02 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects
    • A01M1/026 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects combined with devices for monitoring insect presence, e.g. termites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/30 Noise filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects

Definitions

  • a system for real-time monitoring of insects includes a trap and an image processor.
  • the trap includes a chamber with perforations sized to admit insects into an interior of the smart trap, a collection chamber located within the interior of the smart trap, and an imaging system for capturing images that include the collection chamber of the smart trap.
  • the image processor is configured to receive images captured by the smart trap and to determine a count of insects within the collection chamber based on image analysis of the received image, wherein image analysis includes identifying a region within the received image corresponding with a boundary of the collection chamber, cropping the received image to the identified region to generate a cropped image, modifying at least one characteristic of the cropped image to generate a modified, cropped image, and determining a count of insects based on the modified, cropped image.
  • a trap for detecting insects includes a perforated chamber with openings sized to admit insects into an interior of the trap, a collection chamber located within the interior of the smart trap for collecting admitted insects, and a cap configured to cover the perforated chamber, the cap housing an electronic system including an imaging system to capture an image of the collection chamber.
  • a method of counting insects in a captured image includes cropping and masking the captured image to produce a first processed image containing only a region in the captured image that correlates to a collection chamber, modifying at least one characteristic of the first processed image to produce a second processed image, and determining a count of insects in the captured image by executing a particle detection algorithm on the second processed image.
  • FIG. 1 shows an embodiment of an insect detection system.
  • FIGS. 2A-B are views of a smart trap, according to some embodiments of this disclosure.
  • FIG. 2A is a schematic diagram of a smart trap.
  • FIG. 2B is a photo of a trap, according to some embodiments of this disclosure.
  • FIG. 3 is a block diagram of the systems of a smart trap, according to some embodiments of this disclosure.
  • FIG. 4 is a block diagram of modules of a main board of a smart trap, according to some embodiments of this disclosure.
  • FIG. 5 is a block diagram of modules of the shield board, according to some embodiments of this disclosure.
  • FIG. 6 is a block diagram of a camera board, according to some embodiments of this disclosure.
  • FIG. 7 shows an exemplary embodiment of an arrangement for the systems for a smart trap.
  • FIGS. 8A-B show an exemplary embodiment of the main board, showing (A) the first side of the main board and (B) the second side of the main board.
  • FIGS. 9A-C show an exemplary embodiment of the shield board, showing (A) the first side of the shield board; (B) the second side of the shield board; and (C) an exemplary connection schematic for the modules of the shield board.
  • FIGS. 10A-B show an exemplary embodiment of the camera board, showing (A) the first side of the camera board and (B) the second side of the camera board.
  • FIG. 11 is a flowchart of an algorithm to count the number of insects in an image, according to some embodiments of this disclosure.
  • FIG. 12 is a flowchart of an algorithm to count the number of insects in an image according to some embodiments of this disclosure.
  • FIG. 13 is a flowchart of an exemplary subroutine for determining the region in a captured image that corresponds to the collection chamber.
  • FIG. 14 is a flowchart of an exemplary subroutine for modifying one or more characteristics of the first processed image.
  • FIG. 15 is a flowchart of an exemplary particle detection subroutine, according to some embodiments of this disclosure. Arrows indicate that a step may be repeated at least one time before proceeding to the next step of the method.
  • FIG. 16 is a graph showing the time to detect the emergence of the first insect during lab and commercial tests of an insect system according to some embodiments of this disclosure.
  • An insect detection system 100 as described herein has high reliability and provides a highly accurate insect count. For example, in a laboratory test of an insect detection system 100 as described herein, the emergence of the first insect was detected within 19, 18, 20, and 20 minutes under infestation concentrations of 0.5/kg, 1/kg, 2/kg, and 3/kg, respectively. The average image counting accuracy rate of the insect detection system 100 was 94.3%. Additionally, in a commercial test of an insect detection system 100 as described herein, insect activity was detected within twelve minutes with a counting accuracy of 91.3%.
  • an insect detection system 100 described herein decreases labor cost, increases the efficacy of pest management practice, enables early intervention/corrective action to be taken before the damage becomes severe, improves the quality, grade, and/or safety of stored grains, and/or decreases financial losses to grain growers and processors.
  • the insect detection system 100 includes at least one smart trap 110, at least one server 120, and at least one user interface/display device 130 which are communicatively coupled to one another via communication channel 140.
  • the smart trap 110 is used to collect data in a facility storing grain.
  • Any suitable wireless communication protocol may be used for communication between the smart trap 110, the server 120, and/or the user device 130.
  • communication between the smart trap 110 and the server 120 and/or user device 130 is encrypted.
  • the hypertext transfer protocol (HTTP) is used for communication between components 110, 120, 130 of the insect detection system 100.
  • Communication of data and/or instructions may be done as a bundle or individually. Data may be saved to memory and/or processed by server 120 and/or user device 130.
  • Server 120 includes one or more devices capable of receiving, generating, storing, processing, and/or providing information.
  • server 120 may include a server, a data center, a workstation computer, a virtual machine (VM) implemented in a cloud computing environment, or a similar type of device.
  • the user device 130 includes one or more devices capable of receiving, generating, storing, processing, and/or providing information associated with interactions of a user of user device 130 with a user interface provided for display via a display associated with user device 130.
  • user device 130 may include a desktop computer, a mobile phone (e.g. a smartphone, a radiotelephone, etc.), a laptop computer, a tablet computer, a handheld computer, a gaming device, a virtual reality device, a wearable communication device (e.g. a smart wristwatch, a pair of smart eyeglasses, etc.), or a similar type of device.
  • server 120 and user device 130 may be a single component/device.
  • the insect detection system 100 further includes a user interface on server 120 and/or the user device 130 for remote operation of the insect detection system 100.
  • Any suitable user interface may be implemented that allows an operator to interact with the user interface.
  • the user interface may be a graphical user interface (GUI), a webpage, and/or an application for a smartphone or tablet. Interacting with user interface elements includes selecting a button on the user interface, inputting text via a text box, toggling a control, and/or the like.
  • At least one smart trap 110 is inserted into a grain mass.
  • the number of smart traps 110 used in the system 100 depends on the size of the grain mass to be monitored.
  • a system 100 implemented in a commercial grain storage facility utilizes 10-20 smart traps 110.
  • the smart trap 110 collects data relevant to monitoring insect infestation, e.g. images of an interior of the smart trap 110 and/or data about an ambient condition.
  • the user device 130 communicates one or more instructions to the trap 110. Instructions include an on-demand request to capture an image and/or a schedule for data collection. One exemplary schedule for data collection is capturing an image one time per hour. Instructions may also include assigning an unassigned trap to a registered user and/or adding a new trap to the system. An illustrative instruction payload is sketched below.
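The disclosure does not specify a wire format for these instructions. As a purely illustrative sketch, a user device could encode an on-demand capture request and an hourly collection schedule as a small JSON payload; every field name here is a hypothetical assumption, not a format disclosed in the patent:

```python
import json

# Hypothetical instruction payload; field names and structure are
# illustrative assumptions only.
instruction = {
    "trap_id": "trap-001",             # which smart trap should act
    "action": "capture_image",         # on-demand image request
    "schedule": {"interval_min": 60},  # e.g. one image per hour
}
message = json.dumps(instruction)      # serialized for the communication channel
```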
  • the collected data may be processed by a microcontroller 211 located in the smart trap 110, the server 120, and/or the user device 130.
  • Processed data may be stored in memory on server 120 and/or the user device 130.
  • Stored data may be retrieved and/or viewed by the user interface.
  • the user interface may be used to access a database or list of smart traps.
  • Information that may be displayed includes the trap ID, trap location, detected insects per timespan, and sensor readings.
  • the user device may be used to visualize data collected by the smart trap 110.
  • the collected data is analyzed to determine a correlation between insect emergence and the temperature and relative humidity.
  • FIG. 2A is a schematic of a smart trap 110 and Figure 2B is a photo of an exemplary embodiment of a smart trap 110.
  • the smart trap 110 has a cap 200 that covers the top opening of the perforated cylinder 300 and forms the top of the smart trap 110.
  • the cap 200 houses an electronic system 204 that includes a camera 242 with a lens.
  • the smart trap 110 has a power source that may be attached to the cap 200.
  • the power source is a battery 218, held by a battery clip, attached to the cap 200 (Figure 2B).
  • the perforated cylinder 300 forms the body of the smart trap 110.
  • an annulus connects the perforated chamber 300 to the cap 200.
  • the diameter and length of the perforated chamber 300 are selected based on an appropriate force required to insert the trap into the grain mass and/or to provide an appropriate depth so the smart trap 110 is placed where the insects are active in the grain mass.
  • the perforations are sized to admit insects into the smart trap 110 where they fall into the collection chamber 400. In one aspect, the size of the perforations allows insects to enter the smart trap 110 but prevents grain from filling the smart trap 110.
  • the collection chamber 400 has a base 410, a conical end 420, and is attached to the bottom of perforated chamber 300 to form the bottom of the smart trap 110.
  • an annulus connects collection chamber 400 to the perforated chamber 300.
  • the collection chamber 400 may be detachable.
  • the collection chamber 400 has a threaded connection. Camera 242 is directed towards, and has an unobstructed view of, the collection chamber 400.
  • the dimensions and shape of the smart trap 110 (e.g. diameter, length, perforation diameter, conical end) provide features for efficiently attracting insects and for easily inserting the smart trap 110 into a grain mass. Images of insects caught in the collection chamber 400 are captured by camera 242.
  • the base 410 of the collection chamber 400 is white to increase the contrast and/or brightness of a captured image.
  • the conical nose 420 reduces the amount of force required to insert the smart trap 110 into a grain mass.
  • the collection chamber 400 may be detached so that insects captured in the collection chamber 400 may be discarded. Placing the electronic system 204 in cap 200 allows the operator of the system 100 to easily repair and/or replace the entire electronic system 204 or one or more individual modules or boards associated with the electronic system 204.
  • Examples of some suitable materials for smart trap 110 include polyvinyl chloride (PVC) and stainless steel.
  • the cap 200 is a PVC cap fitting.
  • the cap 200 has a 2.4-inch inner diameter.
  • the perforated chamber 300 may be connected to cap 200 and collection chamber 400 by a PVC annulus that matches the inner diameter of the perforated chamber.
  • the perforated chamber 300 is made of stainless steel.
  • a male piece is sealed at the bottom with a solid PVC nose cut to shape on a lathe, and a rubber O-ring is added to the connection.
  • the collection chamber 400 is manufactured from a straight threaded polyvinyl chloride (PVC) connection.
  • each smart trap 110 has an independent power source.
  • 240 mAh of battery capacity is consumed every time the insect detection system 100 captures an image and sends the image to server 120.
  • Table 1 shows the results of tests measuring the lifespan of different battery types based on the frequency at which images are taken by the imaging system. Table 1. Lifespan of Battery According to Frequency of Image Capture
  • FIG 3 is a block diagram of the electronic system 204.
  • the electronic system 204 includes a main board 210, a shield board 220, and/or a camera board 240 that are communicatively coupled to one another via a communication channel 150.
  • Each board 210, 220, 240 is configured to execute at least one communication protocol. Examples of suitable communication protocols include the Inter-Integrated Circuit (I2C) protocol and the Serial Peripheral Interface (SPI) protocol.
  • I2C protocol is used for communication 150 between the main board 210 and the shield board 220
  • the SPI protocol is used for communication 150 between the main board 210 and the camera board 240.
  • each board 210, 220, and 240 is attached to another board.
  • the main board 210 is connected to the shield board 220, which is connected to the camera board 240 (see e.g. Figure 7).
  • the electronic system 204 is configured to capture one or more images, provide light during image capture, measure temperature and relative humidity with sensors, convert analog data into digital data, process images to count the number of insects in a captured image, process the ambient data, display/visualize the data, store data, and/or communicate information (e.g. data and/or instructions).
  • the main board 210 converts analog data into digital data, processes captured images, processes data, and/or communicates with the shield board 220, the camera board 240, server 120 and/or the user device 130.
  • the shield board 220 collects sensor data for ambient conditions, provides light during image capture, and/or communicates with the main board 210.
  • the camera board 240 captures one or more images and/or communicates with the main board 210.
  • FIG. 4 is a block diagram of the main board 210.
  • the main board 210 includes a microcontroller 211, a communication module 212, a clock module 214, and/or a power module 216 that includes a long-lasting power supply such as a battery.
  • the main board 210 is an MCU ESP8266 board.
  • the microcontroller 211 communicates instructions to other modules of the system 204. For example, after an image is captured, the microcontroller 211 communicates instructions via communication channel 150 to reset system 204 so that system 204 is ready to take another image. In another aspect, the microcontroller 211 processes data. For example, the microcontroller 211 converts the analog sensor readings to digital values. The microcontroller may also process an image captured by the imaging system 240, 242, 244, 246. In a further aspect, the microcontroller controls communication with server 120 and user device 130. In a further aspect, the microcontroller 211 may be programmed with instructions to power the system 204 only when a new command is received. In this example, a received instruction is added to a queue of instructions and, after the instruction is accepted, the imaging system 240, 242, 244, 246 and sensor module 224 are activated to collect the requested data.
  • clock module 214 assists in the implementation of time-dependent routines such as an image capture schedule or a power-saving routine, such as a deep sleep mode, to save power and increase the lifespan of the smart trap 110.
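The patent does not disclose firmware, but on an ESP8266-based main board like the one named above, a deep-sleep routine of the kind described could look like the following MicroPython sketch. The one-hour interval and the RTC-alarm wake mechanism (which requires GPIO16 wired to RST on the ESP8266) are assumptions:

```python
# MicroPython on ESP8266: sleep between scheduled image captures.
import machine

rtc = machine.RTC()
# Configure RTC.ALARM0 so its alarm can wake the chip from deep sleep.
rtc.irq(trigger=rtc.ALARM0, wake=machine.DEEPSLEEP)
rtc.alarm(rtc.ALARM0, 60 * 60 * 1000)  # fire after 60 minutes (milliseconds)
machine.deepsleep()                    # power down until the alarm fires
```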
  • FIG. 5 is a block diagram of the shield board 220.
  • the shield board 220 includes a lighting module 222, a sensor module 224, and a shield communication module 226.
  • the lighting module 222 directs light towards the collection chamber 400 when an image is being captured by the camera 242, 244.
  • the lighting module 222 has at least one light that is perpendicular to the collection chamber 400. The light may be flat.
  • One or more light-emitting diodes (LEDs) may be used for the lighting module 222.
  • the lighting module 222 may receive instructions from the microcontroller 211 to turn on while an image is captured and saved to a temporary file on the main board 210.
  • the sensor module 224 has at least one sensor for monitoring and/or recording an ambient condition such as temperature and/or relative humidity.
  • the shield communication module 226 is utilized to provide communication with the main board 210.
  • the shield communication module 226 may receive instructions from the microcontroller 211 to turn on the lighting module 222 or collect sensor data from the sensor module 224.
  • the shield communication module is a port expander.
  • the port expander is an MCP23008 port expander.
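As a hedged illustration of shield-board control, the sketch below toggles four LEDs through an MCP23008 port expander over I2C. The register addresses come from the MCP23008 datasheet; the device address, bus number, and pin mapping are assumptions, and the Linux smbus2 library is used here only for clarity (the trap's own microcontroller would use its native I2C calls):

```python
from smbus2 import SMBus

MCP23008_ADDR = 0x20  # default address with A0-A2 tied low (assumption)
IODIR = 0x00          # I/O direction register (0 = output)
GPIO = 0x09           # port register

with SMBus(1) as bus:  # I2C bus number is an assumption
    bus.write_byte_data(MCP23008_ADDR, IODIR, 0x00)  # all pins as outputs
    bus.write_byte_data(MCP23008_ADDR, GPIO, 0x0F)   # LEDs on GP0-GP3 on
    # ... capture the image while the chamber is lit ...
    bus.write_byte_data(MCP23008_ADDR, GPIO, 0x00)   # lights off
```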
  • FIG. 6 is a block diagram of the camera board 240.
  • the camera board 240 includes a camera interface 246, an image sensor 244, and a camera 242 with a lens and a shutter.
  • the camera board 240 is an ArduCam 2-megapixel OV2640 camera.
  • Camera 242 is connected to a camera board 240.
  • camera 242 is a high-resolution camera.
  • the camera interface 246 receives instructions from the main board 210 to keep the shutter open for a pre-determined amount of time.
  • the shield communication module 226 may provide confirmation of lights being turned on and/or sensor data being collected by the sensor module 224.
  • Camera 242 captures an image of the collection chamber 400.
  • the captured image is a color (RGB) image.
  • Figures 7-10B show exemplary embodiments of an insect detection system 100 as described herein.
  • Figure 7 shows an exemplary arrangement for the boards 210, 220, 240.
  • the shield board 220 is positioned between the main board 210 and the camera board 240.
  • the main board 210 is the top board and the camera board 240 is the bottom board.
  • the boards 210, 220, 240 are connected by a vertical support (see rear left).
  • the lighting module 222, located on the shield board 220 (as shown in Figure 5), and the lens of the camera 242 are oriented in the same direction so that when the shield board 220 is attached to the cap of a smart trap (as shown in Figure 2A), the lighting module 222 and camera 242 are directed towards the collection chamber 400.
  • Figures 8A-B show an exemplary main board 210.
  • the power module 216 for the main board 210 includes a battery socket 217 receiving power from at least one battery 218.
  • the battery socket 217 and WiFi module 212 are located on the first side of the main board 210 (Figure 8A).
  • the first side of the main board 210 may face upward when the main board 210 is housed in the cap 200.
  • Figures 9A-C show an exemplary embodiment of a shield board 220.
  • the lighting module is located on the first side of the shield board 220 and includes a plurality of light sources 223, e.g. LEDs (Figure 9A).
  • the communication module 226 and sensor module 224 are located on the second side of the shield board 220, as shown in Figure 9B.
  • four LED lights form the lighting module 222.
  • Figure 9C shows the trace for the exemplary shield board 220 embodiment.
  • Figures 10A-B show an exemplary camera board 240. Camera 242, 244 is attached to the first side (Figure 10A).
  • an insect detection system 100 as disclosed herein is configured to acquire high-resolution, high-quality images in a dark environment. Analyzing a high-resolution, high-quality image improves the accuracy of the insect count. First, a high-resolution camera produces a higher quality image. Furthermore, a white base 410 provides a higher contrast background for insects in a captured image, thereby producing a higher quality image. Also, uniform lighting provided by the lighting module 222 during the imaging process improves image quality. Additionally, instructions to keep the shutter open for a pre-determined amount of time are sent to the camera board 240 so that the image sensor 244 can absorb more light.
  • captured images are time stamped.
  • the filename for an image may include identifying information such as the trap ID and the date and time the image was captured.
  • a database is used to organize the captured images and/or processed images.
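A minimal sketch of such a naming convention, assuming a simple `trapID_date_time` pattern (the exact format is not specified in the disclosure):

```python
from datetime import datetime

def image_filename(trap_id: str) -> str:
    """Build a filename embedding the trap ID and the capture timestamp."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return f"{trap_id}_{stamp}.jpg"

print(image_filename("trap-001"))  # e.g. trap-001_20210224_131500.jpg
```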
  • FIG 11 is a flowchart that illustrates a method 500 of analyzing images captured by the smart trap 110.
  • algorithm 500 determines the number of insects in each captured image.
  • the algorithm 500 may be executed by an image processor.
  • the image processor may be the microcontroller 211 located in trap 110, server 120, and/or user device 130.
  • algorithm 500 is a macro/function that may be executed by the microcontroller 211.
  • the captured image is cropped and masked to form a first processed image.
  • cropping and masking the captured image (step 600) removes extraneous areas and details of the captured image so that only the region in the captured image corresponding to the collection chamber 400 undergoes further processing and analysis.
  • the first processed image for a circular collection chamber 400 is smaller than the captured image and includes only the region corresponding to the expected location of insects.
  • the cropped/masked image is processed to modify one or more characteristics of the image.
  • modifying at least one characteristic of the cropped/masked image reduces noise, minimizes or negates fine particles and/or extraneous details, and/or converts the cropped/masked image into a binary image.
  • Modifications include transforming the cropped/masked image into a single colored image (e.g. greyscale), adjusting the brightness and/or contrast of the image, binarizing the image, and/or reducing image noise and/or detail in the image.
  • Binarization converts a pixel image to a binary image and/or reduces noise in the image. Binarization may be conducted only on dark regions or on the entire image. Step 700 forms a second processed image that is a processed cropped image.
  • the processed cropped image is analyzed to determine the count of insects in the image.
  • the count of insects is a measure of grain infestation.
  • the insect count and sensor data can be analyzed to determine a correlation between insect emergence and ambient conditions (e.g. temperature and/or relative humidity).
  • Figure 12 is a flowchart that illustrates a method 502 of analyzing images captured by the smart trap 110.
  • the algorithm 502 determines the number of insects in each captured image.
  • the algorithm 502 may be executed on microcontroller 211, server 120, and/or user device 130.
  • algorithm 502 is a macro that may be executed by the microcontroller 211.
  • the captured image is analyzed to define a region in the image that corresponds to the collection chamber 400.
  • the captured image is cropped and masked to form a first processed image that contains only the region that corresponds to the collection chamber 400.
  • step 630 includes reloading the captured image, cropping the captured image to fit the bounding box, and masking the corners of the bounding box to produce a circular image.
  • the cropped/masked image is processed to modify one or more characteristics of the image.
  • modifying at least one characteristic of the cropped/masked image reduces noise, minimizes or negates fine particles and/or extraneous details, and/or converts the cropped/masked image into a binary image.
  • Modifications include transforming the cropped/masked image into a single colored image (e.g. greyscale), adjusting the brightness and/or contrast of the image, binarizing the image, and/or reducing image noise and/or detail in the image.
  • Binarization converts a pixel image to a binary image and/or reduces noise in the image. Binarization may be conducted only on dark regions or on the entire image.
  • Step 700 forms a second processed image that is a processed cropped image.
  • a particle detection algorithm is applied to the processed cropped image.
  • the processed cropped image is analyzed to determine the count of insects in the image.
  • the count of insects is a measure of grain infestation.
  • the insect count and sensor data can be analyzed to determine a correlation between insect emergence and ambient conditions (e.g. temperature and/or relative humidity).
  • steps 610, 630, 820, and 830 of algorithm 502 are subroutines of the algorithm 500 shown in Figure 11, with steps 610 and 630 being subroutines of cropping and masking the captured image (step 600) and steps 820 and 830 being subroutines of determining a count of insects (step 800).
  • FIG. 13 is a flowchart of an exemplary method 610 for determining a region in the captured image that corresponds to the collection chamber 400.
  • the captured image is converted to greyscale.
  • noise is removed from the captured image.
  • step 614 includes applying an averaging method that removes noise while preserving edges.
  • An example of a suitable averaging method is the median blur method.
  • the captured image is transformed into a binary image.
  • a transform method is applied to the captured image to define the region that corresponds to the collection chamber 400.
  • step 618 applies the Hough Circle transform.
  • the Hough Circle transform identifies at least one circular region in the captured image. Each identified region is delineated by a bounding box. An average of all identified bounding boxes is used to determine the center of the region in the captured image that corresponds to the collection chamber 400.
  • the Hough Circle transform is applied to a binary image.
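A sketch of subroutines 610 and 630 using OpenCV. The disclosure names the operations (greyscale conversion, median blur, binarization, Hough Circle transform, crop, corner masking) but not the library or parameter values, so cv2 and every threshold below are assumptions:

```python
import cv2
import numpy as np

def crop_to_chamber(image_bgr):
    """Locate the circular collection chamber, then crop and mask to it."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    denoised = cv2.medianBlur(gray, 5)  # edge-preserving averaging (step 614)
    _, binary = cv2.threshold(denoised, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Step 618: Hough Circle transform applied to the binary image.
    circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                               param1=50, param2=30, minRadius=50, maxRadius=400)
    if circles is None:
        return None
    # Average all detected circles to estimate the chamber center and radius.
    cx, cy, r = np.mean(circles[0], axis=0).astype(int)
    # Step 630: crop to the bounding box, then mask the corners to a circle.
    x0, y0 = max(cx - r, 0), max(cy - r, 0)
    cropped = image_bgr[y0:cy + r, x0:cx + r]
    mask = np.zeros(cropped.shape[:2], np.uint8)
    cv2.circle(mask, (cx - x0, cy - y0), r, 255, -1)
    return cv2.bitwise_and(cropped, cropped, mask=mask)
```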
  • FIG 14 is a flowchart of an exemplary method of modifying one or more characteristics of an image 700 to form a processed cropped image.
  • the cropped image is transformed into a single colored image (e.g. greyscale).
  • the brightness and/or contrast of the cropped image is adjusted.
  • dark regions of the cropped image are binarized.
  • At step 720, the amount of noise and/or fine detail in the cropped image is reduced.
  • step 720 includes applying Gaussian Blur and Unsharp masking. Gaussian Blur and Unsharp masking may be applied multiple times. The Gaussian Blur method reduces image noise/detail while the Unsharp Masking method increases the sharpness.
  • the brightness and/or contrast of the cropped image is adjusted again.
  • the entire cropped image is binarized.
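A hedged OpenCV sketch of modification subroutine 700. The gain/bias values, kernel size, thresholds, and number of blur/sharpen rounds are assumptions, and the unsharp mask uses the common Gaussian-difference formulation:

```python
import cv2

def enhance_cropped_image(cropped_bgr):
    """Modify the cropped image so dark insects stand out for counting."""
    gray = cv2.cvtColor(cropped_bgr, cv2.COLOR_BGR2GRAY)
    # Adjust brightness/contrast (alpha = gain, beta = bias; values assumed).
    adjusted = cv2.convertScaleAbs(gray, alpha=1.3, beta=10)
    # Push dark (insect-like) regions fully to black; one reading of
    # "binarizing dark regions" in the disclosure.
    _, dark = cv2.threshold(adjusted, 60, 255, cv2.THRESH_TOZERO)
    # One or more rounds of Gaussian blur plus unsharp masking.
    for _ in range(2):
        blurred = cv2.GaussianBlur(dark, (5, 5), 0)          # reduce noise/detail
        dark = cv2.addWeighted(dark, 1.5, blurred, -0.5, 0)  # restore sharpness
    # Final brightness/contrast tweak, then binarize the entire image
    # (inverted so dark insects become white foreground particles).
    final = cv2.convertScaleAbs(dark, alpha=1.2, beta=0)
    _, binary = cv2.threshold(final, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return binary
```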
  • FIG 15 is a flowchart of an exemplary method for the particle detection algorithm 820.
  • the insect number is determined by running Python code.
  • the code processes images using the ImageJ software to count the insects in each image and save the data. To count the insects, ImageJ is used to refine the image to eliminate any particles in the background and highlight the dark-colored insects via the following steps.
  • At step 822, at least one bounding box is identified.
  • identifying at least one bounding box (step 822) includes identifying regions of interest in the processed cropped image, delineating each region of interest with a bounding box, and placing the bounding box into a set of bounding boxes.
  • the set of bounding boxes is restricted/filtered to eliminate bounding boxes that are unlikely to contain an insect and/or keep bounding boxes that are likely to contain an insect.
  • a bounding box may surround an extraneous region that does not contain an insect.
  • Bounding boxes to be analyzed in the count of insects may include bounding boxes within a certain area band, bounding boxes that only contain black pixels, bounding boxes that include an object with a specified eccentricity, and/or bounding boxes with an oval-shaped object.
  • a count of the number of insects in the processed cropped image is determined by counting the insects in the subset of bounding boxes (step 830).
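A sketch of steps 822-830 using OpenCV contours as the particle detector; the area band and eccentricity cutoff are illustrative assumptions:

```python
import cv2

def count_insects(binary):
    """Box candidate regions in the binarized image, filter, count survivors."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    insects = 0
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)  # step 822: bounding box
        if not (50 < w * h < 2000):             # area-band filter (assumed band)
            continue
        if len(contour) >= 5:                   # fitEllipse needs >= 5 points
            (_, _), axes, _ = cv2.fitEllipse(contour)
            minor, major = min(axes), max(axes)
            if major > 0 and minor / major < 0.15:  # reject thin streak artifacts
                continue
        insects += 1                            # step 830: count surviving boxes
    return insects
```

Chained together, these sketches mirror method 502: `crop_to_chamber`, then `enhance_cropped_image`, then `count_insects` on the result.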
  • the insect count data may be saved to memory on server 120 and visualized by using the user interface for further analysis.
  • the user interface, including a mobile application, was designed to allow the user to easily see data related to insect number, temperature, and relative humidity. Additionally, a graph may be used to visualize the insect count over time. The mobile application also allows the user to view past data.
  • Any suitable programming language may be used to implement instructions for insect detection system 100, algorithm 500, and user interface. Some examples include Python, C++, HTML5, CSS3, and/or JavaScript.
  • the laboratory setting was a cylindrical container filled with rice infested with red flour beetles.
  • the cylindrical container had a diameter of 20 cm, a height of 48 cm, and contained 8 kg of infested rice.
  • the system was tested under different infestation concentrations (4 insects/8 kg, 8 insects/8 kg, 16 insects/8 kg, and 24 insects/8 kg of rough rice), which are equal to 0.5, 1, 2, and 3 insects/kg, respectively. Three tests were conducted for each concentration.
  • Table 2 shows the effectiveness and recovery rate of the system.
  • the system can detect the emergence of the first insect within 19, 18, 20, and 20 minutes under infestation concentrations of 0.5/kg, 1/kg, 2/kg, and 3/kg, respectively (Table 2).
  • the corresponding recovery rates of total insects were 83%, 75%, 73%, and 76% after 24 hours.
  • For an insect concentration of 0.5 insects/kg the system detected the emergence of the first insect within 12, 16, and 29 minutes for replicates 1, 2, and 3 respectively (Table 2).
  • the corresponding values for recovery rates of total insects were 75%, 100%, and 75%, respectively.
  • For an insect concentration of 1 insect/kg the system detected the emergence of the first insect within 33, 17, and 4 minutes, for replicates 1, 2, and 3 respectively (Table 2).
  • the corresponding values for recovery rates of total insects were 75%, 75%, and 75%, respectively.
  • the system detected the emergence of the first insect within 29, 18, and 13 minutes for replicate 1, 2, and 3, respectively.
  • the corresponding values for recovery rates of total insects were 80%, 71%, and 80%, respectively.
  • the recovery rate is the percentage of the insects detected after 24 hours compared to the total number of insects in the infested rice.
  • the recovery rate of total insects can be calculated using the following equation: RR = (NID24hr / TNI8kg) × 100, where RR is the recovery rate (%), NID24hr is the number of insects detected after 24 hours, and TNI8kg is the total number of insects infesting the 8 kg of rice. For example, if 6 of 8 introduced insects are detected within 24 hours, RR = (6/8) × 100 = 75%.
  • Table 3 shows the insect counting accuracy rate of the system during the laboratory test.
  • the insect counting accuracy rate is based on the difference between the number of insects visually counted and those counted by the system.
  • the system achieved high counting accuracy of 93% and 95.6% for 1/kg and 2/kg, respectively (Table 3).
  • the average image counting accuracy rate was 94.3%.
  • the insect counting accuracy rate can be calculated using the following equation: ICAR = (1 − AD/NIVC) × 100, where ICAR is the image counting accuracy rate (%), AD is the difference between the number of insects visually counted and those counted by the system, and NIVC is the number of insects visually counted.
  • Table 4 provides the averages and standard deviations of temperatures and relative humidity recorded by the system and thermometer during the laboratory test.
  • the overall average and standard deviation for temperature and relative humidity recorded by the system were 26.5 ± 0.6°C and 30 ± 0.7%, respectively.
  • the corresponding values recorded by the thermometer were 24.5 ± 1.0°C and 32.5 ± 0.8%, respectively (Table 4).
  • the commercial setting was a commercial storage facility with rice.
  • the rice in the commercial storage facility was visually inspected before and after three traps were installed.
  • Visual inspection of the commercial storage facility included taking samples from nine different locations and inspecting the samples using a screening method to determine that the rice was not infested.
  • the tests in the commercial setting were done in triplicate. For each replicate, the system was installed and left for one week. During that time, insect activity was remotely monitored. After 7 days, the traps were inspected, and the trapped insects were visually counted and compared with those detected by the system.
  • Table 5 provides data relevant to the effectiveness and early detection of insect activity by the insect system in a commercial setting. The system was able to detect the emergence of the first, second, and third insects within 10, 40, and 130 minutes for trap number 1 (Table 5). The corresponding values for trap numbers 2 and 3 were 11, 42, 120 minutes, and 15, 43, and 130 minutes, respectively (Table 5).
  • Table 6 shows the insect counting accuracy rate of the system in a commercial setting. Analysis of the data revealed that it took only 12 minutes to detect insect activity with a counting accuracy of 91.3%. For trap number 1, the results revealed that the counting accuracy was 100%, 91.7%, and 90% for the first, second, and third tests, respectively (Table 6). The corresponding values for trap numbers 2 and 3 were 75%, 100%, 88.9%, and 88.9%, 87.5%, and 100%, respectively (Table 6).
  • Table 7 shows the averages and standard deviations of temperatures and relative humidity recorded by the system and thermometer during the commercial storage tests.
  • the overall averages and standard deviations for the temperature recorded by the system sensors were 31.2 ± 4.5, 30.9 ± 5.0, and 31.7 ± 3.8°C for trap numbers 1, 2, and 3, respectively (Table 7).
  • the corresponding values recorded by the thermometer were 30.5 ± 4.0, 29.3 ± 2.1, and 30.1 ± 3.1°C, respectively.
  • the overall averages and standard deviations for the relative humidity recorded by the system sensors were 49.5 ± 11, 50 ± 10, and 50 ± 12% for trap numbers 1, 2, and 3, respectively (Table 7).
  • The corresponding values recorded by the thermometer were 49 ± 10, 48 ± 11, and 48 ± 10%, respectively.
  • the average ambient temperatures inside and outside the storage were 25.7 ± 4.6°C and 28.1 ± 8.6°C, respectively.
  • the corresponding values for relative humidity were 46.9 ± 5.3% and 45.4 ± 12.6%.
  • the average moisture content of the stored rice was 11.8 ± 0.5%.
  • Table 7. Averages and Standard Deviations of Temperatures and Relative Humidity Recorded by the System and Thermometer During the Commercial Storage Tests.
  • the system as described herein can detect insect activity during lab and commercial storage tests in less than 20 minutes with a counting accuracy of more than 90% (FIG. 16). Specifically, the system detected insect activity during the commercial setting test within 12 minutes with a counting accuracy of 91.3%.
  • a system for real-time monitoring of insects includes a trap and an image processor.
  • the trap includes a chamber with perforations sized to admit insects into an interior of the smart trap, a collection chamber located within the interior of the smart trap, and an imaging system for capturing images that include the collection chamber of the smart trap.
  • the image processor is configured to receive images captured by the smart trap and to determine a count of insects within the collection chamber based on image analysis of the received image, wherein image analysis includes identifying a region within the received image corresponding with a boundary of the collection chamber, cropping the received image to the identified region to generate a cropped image, modifying at least one characteristic of the cropped image to generate a modified, cropped image, and determining a count of insects based on the modified, cropped image.
  • the image processor is included as part of the smart trap.
  • the system further includes a server located remotely from the smart trap, wherein the server is communicatively coupled to the smart trap to receive data from the smart trap.
  • the image processor is located on the server, and wherein data received from the smart trap includes images captured by the smart trap.
  • identifying a region within the received image corresponding with a boundary of the collection chamber includes applying a Hough Circle transform to the captured image to identify the region in the received image corresponding with the boundary of the collection chamber.
  • modifying at least one characteristic of the cropped image includes one or more of converting the cropped image to greyscale, adjusting brightness/contrast of the cropped image, binarizing dark regions of the cropped image, reducing image noise/detail of the cropped image, and binarizing the cropped image.
  • determining a count of insects based on the modified, cropped image includes applying a particle detection algorithm to the modified, cropped image.
  • applying the particle detection algorithm includes identifying a bounding box for each region of interest in the second processed image, placing the bounding box into a set of bounding boxes, filtering the set of bounding boxes to a subset of bounding boxes, and counting the insects in the subset of bounding boxes to determine a count of insects in the captured image.
  • filtering the set of bounding boxes includes restricting bounding boxes in the set based on one or more of: location of the bounding box within a certain area band, presence of black pixels within the bounding box, presence of an object having a specified eccentricity, and/or presence of an oval-shaped object.
  • the particle detection algorithm determines a count of insects from a subset of bounding boxes, each bounding box identifying a region of interest in the modified, cropped image.
  • a trap for detecting insects includes a perforated chamber with openings sized to admit insects into an interior of the trap, a collection chamber located within the interior of the smart trap for collecting admitted insects, and a cap configured to cover the perforated chamber, the cap housing an electronic system including an imaging system to capture an image of the collection chamber.
  • the trap of the preceding paragraph can optionally include, additionally and/or alternatively any, one or more of the following features, configurations, and/or additional components.
  • the electronic system further including a lighting module for lighting the collection chamber when the imaging system captures an image.
  • the electronic system further including a sensor module for collecting data on one or more ambient conditions.
  • the imaging system includes a camera board, the system further comprising: a main board having a microcontroller and a communication module, and a shield board having a lighting module; wherein the main board, the shield board, and the camera board are stacked horizontally with the shield board positioned between the main board and the camera board.
  • the microcontroller is configured to provide instructions to the shield board and the camera board, wherein instructions include instructions to the lighting module to illuminate the interior of the smart trap and instructions to the camera board to capture an image of the interior of the smart trap.
  • a method of counting insects in a captured image includes cropping and masking the captured image to produce a first processed image containing only a region in the captured image that correlates to a collection chamber, modifying at least one characteristic of the first processed image to produce a second processed image, and determining a count of insects in the captured image by executing a particle detection algorithm on the second processed image.
  • cropping and masking the captured image includes applying a Hough Circle transform to define the region.
  • modifying at least one characteristic of the first processed image makes any insects in the first processed image more pronounced for easier identification by the particle detection algorithm.
  • modifying at least one characteristic of the first processed image includes one or more of converting the first processed image to greyscale, adjusting brightness/contrast of the first processed image, binarizing dark regions of the first processed image, reducing image noise/detail of the first processed image, and binarizing the first processed image.
  • the particle detection algorithm includes determining a count of insects from a subset of bounding boxes, each bounding box identifying a region of interest in the second processed image.
  • the particle detection algorithm further includes identifying a bounding box for each region of interest in the second processed image; placing the bounding box into a set of bounding boxes; filtering the set of bounding boxes to a subset of bounding boxes; and counting insects in the subset of bounding boxes to determine a count of insects in the captured image.
  • filtering the set of bounding boxes includes restricting bounding boxes in the set based on one or more of: location of the bounding box within a certain area band, presence of black pixels within the bounding box, presence of an object having a specified eccentricity, and/or presence of an oval-shaped object.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Environmental Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Zoology (AREA)
  • Wood Science & Technology (AREA)
  • Insects & Arthropods (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Catching Or Destruction (AREA)

Abstract

A system for real-time monitoring of insects includes a smart trap and an image processor. The smart trap includes a chamber with perforations sized to admit insects into an interior of the smart trap; a collection chamber located within the interior of the smart trap; and an imaging system for capturing images that include the collection chamber. The image processor is configured to receive images captured by the smart trap and to determine a count of insects within the collection chamber based on image analysis of the received image. Image analysis includes identifying a region within the received image corresponding with a boundary of the collection chamber, cropping the received image to the identified region to generate a cropped image, modifying at least one characteristic of the cropped image to generate a modified, cropped image, and determining a count of insects based on the modified, cropped image.

Description

REAL-TIME MONITORING AND EARLY DETECTION SYSTEM FOR INSECT ACTIVITY IN GRAINS DURING STORAGE
BACKGROUND
[0001] Insect infestation of stored grains negatively affects the grade of the stored grains, increases the grain temperature, and promotes the growth of microorganisms that cause spoilage and thereby further reduce grain quality. Consequently, an infestation can lead to significant financial losses for the grain growers and processors. The early detection of insect infestation is, therefore, an important need in the grain industry.
SUMMARY
[0002] According to one aspect, a system for real-time monitoring of insects includes a trap and an image processor. The trap includes a chamber with perforations sized to admit insects into an interior of the smart trap, a collection chamber located within the interior of the smart trap, and an imaging system for capturing images that include the collection chamber of the smart trap. The image processor is configured to receive images captured by the smart trap and to determine a count of insects within the collection chamber based on image analysis of the received image, wherein image analysis includes identifying a region within the received image corresponding with a boundary of the collection chamber, cropping the received image to the identified region to generate a cropped image, modifying at least one characteristic of the cropped image to generate a modified, cropped image, and determining a count of insects based on the modified, cropped image.
[0003] According to another aspect, a trap for detecting insects includes a perforated chamber with openings sized to admit insects into an interior of the trap, a collection chamber located within the interior of the smart trap for collecting admitted insects, and a cap configured to cover the perforated chamber, the cap housing an electronic system including an imaging system to capture an image of the collection chamber.
[0004] According to a further aspect, a method of counting insects in a captured image includes cropping and masking the captured image to produce a first processed image containing only a region in the captured image that correlates to a collection chamber, modifying at least one characteristic of the first processed image to produce a second processed image, and determining a count of insects in the captured image by executing a particle detection algorithm on the second processed image.
BRIEF DESCRIPTION OF DRAWINGS
[0005] This written disclosure describes illustrative embodiments that are non-limiting and non-exhaustive. Reference is made to illustrative embodiments that are depicted in the figures, in which:
[0006] FIG. 1 shows an embodiment of an insect detection system.
[0007] FIGS. 2A-B are views of a smart trap, according to some embodiments of this disclosure. FIG. 2A is a schematic diagram of a smart trap. FIG. 2B is a photo of a trap, according to some embodiments of this disclosure.
[0008] FIG. 3 is a block diagram of the systems of a smart trap, according to some embodiments of this disclosure.
[0009] FIG. 4 is a block diagram of modules of a main board of a smart trap, according to some embodiments of this disclosure.
[0010] FIG. 5 is a block diagram of modules of the shield board, according to some embodiments of this disclosure.
[0011] FIG. 6 is a block diagram of a camera board, according to some embodiments of this disclosure.
[0012] FIG. 7 shows an exemplary embodiment of an arrangement for the systems for a smart trap.
[0013] FIGS. 8A-B show an exemplary embodiment of the main board, showing (A) the first side of the main board and (B) the second side of the main board.
[0014] FIGS. 9A-C show an exemplary embodiment of the shield board, showing (A) the first side of the shield board; (B) the second side of the shield board; and (C) an exemplary connection schematic for the modules of the shield board.
[0015] FIGS. 10A-B show an exemplary embodiment of the camera board, showing (A) the first side of the camera board and (B) the second side of the camera board.
[0016] FIG. 11 is a flowchart of an algorithm to count the number of insects in an image, according to some embodiments of this disclosure.
[0017] FIG. 12 is a flowchart of an algorithm to count the number of insects in an image according to some embodiments of this disclosure.
[0018] FIG. 13 is a flowchart of an exemplary subroutine for determining the region in a captured image that corresponds to the collection chamber.
[0019] FIG. 14 is a flowchart of an exemplary subroutine for modifying one or more characteristics of the first processed image.
[0020] FIG. 15 is a flowchart of an exemplary particle detection subroutine, according to some embodiments of this disclosure. Arrows indicate that a step may be repeated at least one time before proceeding to the next step of the method.
[0021] FIG. 16 is a graph showing the time to detect the emergence of the first insect during lab and commercial tests of an insect system according to some embodiments of this disclosure.
DETAILED DESCRIPTION
[0022] Early insect detection is considered an effective technique to determine the optimal pest management practice to eliminate the infestation risk and maintain the storage longevity, quality, grade, and safety of grains. The current methods of insect detection in grains do not have the capability of real-time monitoring and early detection of insect activity in stored grains. Additionally, the current methods are inaccurate, time-consuming, and require trained personnel to identify the insect risk. Embodiments of the present disclosure describe systems and methods for early detection of insects in stored grains and/or for real-time detecting/monitoring of insect activity in stored grains.
[0023] An insect detection system 100 as described herein has high reliability and provides a highly accurate insect count. For example, in a laboratory test of an insect detection system 100 as described herein, the emergence of the first insect was detected within 19, 18, 20, and 20 minutes under infestation concentrations of 0.5/kg, 1/kg, 2/kg, and 3/kg, respectively. The average image counting accuracy rate of the insect detection system 100 was 94.3%. Additionally, in a commercial test of an insect detection system 100 as described herein, insect activity was detected within twelve minutes with a counting accuracy of 91.3%.
[0024] In addition to being a low-cost system, an insect detection system 100 described herein decreases labor cost, increases the efficacy of pest management practice, enables early intervention/corrective action to be taken before the damage becomes severe, improves the quality, grade, and/or safety of stored grains, and/or decreases financial losses to grain growers and processors.
[0025] Referring now to Figure 1, an insect detection system 100 according to some embodiments of this disclosure is depicted. The insect detection system 100 includes at least one smart trap 110, at least one server 120, and at least one user interface/display device 130 which are communicatively coupled to one another via communication channel 140. In one aspect, the smart trap 110 is used to collect data in a facility storing grain. Any suitable wireless communication protocol may be used for communication between the smart trap 110, the server 120, and/or the user device 130. In one aspect, communication between the smart trap 110 and the server 120 and/or user device 130 is encrypted. In one example, the hypertext transfer protocol (HTTP) is used for communication between components 110, 120, 130 of the insect detection system 100. Communication of data and/or instructions may be done as a bundle or individually. Data may be saved to memory and/or processed by server 120 and/or user device 130.
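The disclosure names HTTP but not an endpoint or payload format. As a minimal illustrative sketch, an upload of a captured image with its trap ID might look like the following; the URL and field names are hypothetical, and the CPython requests library is used for clarity (a MicroPython controller would use an analogous HTTP client):

```python
import requests

def upload_image(image_path: str, trap_id: str) -> None:
    """POST a captured image and trap metadata to the (hypothetical) server."""
    with open(image_path, "rb") as f:
        response = requests.post(
            "https://example-server/api/traps/images",  # hypothetical endpoint
            files={"image": f},
            data={"trap_id": trap_id},
            timeout=30,
        )
    response.raise_for_status()  # surface upload failures to the caller
```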
[0026] Server 120 includes one or more devices capable of receiving, generating, storing, processing, and/or providing information. For example, server 120 may include a server, a data center, a workstation computer, a virtual machine (VM) implemented in a cloud computing environment, or a similar type of device.
[0027] The user device 130 includes one or more devices capable of receiving, generating, storing, processing, and/or providing information associated with interactions of a user of user device 130 with a user interface provided for display via a display associated with user device 130. For example, user device 130 may include a desktop computer, a mobile phone (e.g. a smartphone, a radiotelephone, etc.), a laptop computer, a tablet computer, a handheld computer, a gaming device, a virtual reality device, a wearable communication device (e.g. a smart wristwatch, a pair of smart eyeglasses, etc.), or a similar type of device. Although the example insect detection system 100 shows server 120 and user device 130 as separate components, server 120 and user device 130 may be a single component/device.
[0028] The insect detection system 100 further includes a user interface on server 120 and/or the user device 130 for remote operation of the insect detection system 100. Any suitable user interface may be implemented that allows an operator to interact with the user interface. For example, the user interface may be a graphical user interface (GUI), a webpage, and/or an application for a smartphone or tablet. Interacting with user interface elements includes selecting a button on the user interface, inputting text via a text box, toggling a control, and/or the like.
[0029] To monitor insect infestation of stored grain in real-time, at least one smart trap 110 is inserted into a grain mass. In one aspect, the number of smart traps 110 used in the system 100 depends on the size of the grain mass to be monitored. In one example, a system 100 implemented in a commercial grain storage facility utilizes 10-20 smart traps 110. The smart trap 110 collects data relevant to monitoring insect infestation, e.g. images of an interior of the smart trap 110 and/or data about an ambient condition. In some embodiments, the user device 130 communicates one or more instructions to the trap 110. Instructions include an on-demand request to capture an image and/or a schedule for data collection. One exemplary schedule for data collection is capturing an image one time per hour. Instructions may also include assigning an unassigned trap to a registered user and/or adding a new trap to the system.
[0030] The collected data may be processed by a microcontroller 211 located in the smart trap 110, the server 120, and/or the user device 130. Processed data may be stored in memory on server 120 and/or the user device 130. Stored data may be retrieved and/or viewed by the user interface. For example, the user interface may be used to access a database or list of smart traps. When the operator selects a smart trap, information about the smart trap will be displayed. Information that may be displayed includes the trap ID, trap location, detected insects per timespan, and sensor readings. In one aspect, the user device may be used to visualize data collected by the smart trap 110. In another aspect, the collected data is analyzed to determine a correlation between insect emergence and the temperature and relative humidity.
[0031] Figure 2A is a schematic of a smart trap 110 and Figure 2B is a photo of an exemplary embodiment of a smart trap 110. The smart trap 110 has a cap 200 that covers the top opening of the perforated cylinder 300 and forms the top of the smart trap 110. The cap 200 houses an electronic system 204 that includes a camera 242 with a lens. In one aspect, the smart trap 110 has a power source that may be attached to the cap 200. In one example, the power source is a battery 218, held by a battery clip, attached to the cap 200 (Figure 2B).

[0032] The perforated cylinder 300 forms the body of the smart trap 110. In one example, an annulus connects the perforated chamber 300 to the cap 200. The diameter and length of the perforated chamber 300 are selected to limit the force required to insert the trap into the grain mass and/or to place the smart trap 110 at a depth where the insects are active in the grain mass. The perforations are sized to admit insects into the smart trap 110, where they fall into the collection chamber 400. In one aspect, the size of the perforations allows insects to enter the smart trap 110 but prevents grain from filling the smart trap 110.

[0033] The collection chamber 400 has a base 410 and a conical end 420, and is attached to the bottom of perforated chamber 300 to form the bottom of the smart trap 110. In one example, an annulus connects collection chamber 400 to the perforated chamber 300. The collection chamber 400 may be detachable. In one example, the collection chamber 400 has a threaded connection. Camera 242 is directed towards, and has an unobstructed view of, the collection chamber 400.
[0034] In one aspect, the dimensions and shape of the smart trap 110 (e.g. diameter, length, perforation diameter, conical end) provide efficient features for attracting insects and allow the smart trap 110 to be easily inserted into a grain mass. Images of insects caught in the collection chamber 400 are captured by camera 242. In another aspect, the base 410 of the collection chamber 400 is white to increase the contrast and/or brightness of a captured image. In a further aspect, the conical end 420 reduces the amount of force required to insert the smart trap 110 into a grain mass. The collection chamber 400 may be detached so that insects captured in the collection chamber 400 may be discarded. Placing the electronic system 204 in cap 200 allows the operator of the system 100 to easily repair and/or replace the entire electronic system 204 or one or more individual modules or boards associated with the electronic system 204.
[0035] Examples of some suitable materials for smart trap 110 include polyvinyl chloride (PVC) and stainless steel. In one example, the cap 200 is a PVC cap fitting. In one specific example, the cap 200 has a 2.4-inch inner diameter. The perforated chamber 300 may be connected to cap 200 and collection chamber 400 by a PVC annulus that matches the inner diameter of the perforated chamber. In another example, the perforated chamber 300 is made of stainless steel.
[0036] In one example, the collection chamber 400 is manufactured from a straight threaded PVC connection. In one exemplary method of forming the collection chamber 400, the male piece is sealed at the bottom with a solid PVC nose cut to shape on a lathe, and a rubber O-ring is added to the connection.
[0037] Using a battery as a power source provides the smart trap 110 with a long lifespan. Another advantage is that each smart trap 110 has an independent power source. In one specific example, 240 mAh of battery capacity is consumed each time the insect detection system 100 captures an image and sends the image to server 120. Table 1 shows the results of tests measuring the lifespan of different battery types based on the frequency at which images are taken by the imaging system.

Table 1. Lifespan of Battery According to Frequency of Image Capture
[Table 1 is published as an image in the source document; its values are not reproduced in the text.]
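Although Table 1 survives only as an image, the relationship it tabulates can be sketched as below, using the 240 mAh-per-capture figure from paragraph [0037]; the battery capacity and capture frequency are caller-supplied assumptions.

```python
def lifespan_days(battery_mah, captures_per_day, mah_per_capture=240.0):
    """Estimate trap lifespan in days from battery capacity and capture frequency.

    mah_per_capture defaults to the 240 mAh consumed per capture-and-send
    cycle reported above; battery_mah is an assumed pack capacity.
    """
    return battery_mah / (mah_per_capture * captures_per_day)

# Example: a hypothetical 10,000 mAh pack capturing one image per hour.
print(f"{lifespan_days(10_000, captures_per_day=24):.1f} days")
```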
[0038] Figure 3 is a block diagram of the electronic system 204. In some embodiments, the electronic system 204 includes a main board 210, a shield board 220, and/or a camera board 240 that are communicatively coupled to one another via a communication channel 150. Each board 210, 220, 240 is configured to execute at least one communication protocol. Examples of suitable communication protocols include the Inter-Integrated Circuit (I2C) protocol and the Serial Peripheral Interface (SPI) protocol. In one example, the I2C protocol is used for communication 150 between the main board 210 and the shield board 220, and the SPI protocol is used for communication 150 between the main board 210 and the camera board 240. In one aspect, each board 210, 220, and 240 is attached to another board. In one example, the main board 210 is connected to the shield board 220, which is connected to the camera board 240 (see e.g. Figure 7).
[0039] To provide real-time monitoring and early detection of insect activity in stored grains, the electronic system 204 is configured to capture one or more images, provide light during image capture, measure temperature and relative humidity with sensors, convert analog data into digital data, process images to count the number of insects in a captured image, process the ambient data, display/visualize the data, store data, and/or communicate information (e.g. data and/or instructions). In some embodiments, the main board 210 converts analog data into digital data, processes captured images, processes data, and/or communicates with the shield board 220, the camera board 240, server 120, and/or the user device 130. In some embodiments, the shield board 220 collects sensor data for ambient conditions, provides light during image capture, and/or communicates with the main board 210. In some embodiments, the camera board 240 captures one or more images and/or communicates with the main board 210.
[0040] Figure 4 is a block diagram of the main board 210. The main board 210 includes a microcontroller 211, a communication module 212, a clock module 214, and/or a power module 216 that includes a long-lasting power supply such as a battery. In one specific example, the main board 210 is an MCU ESP8266 board.
[0041] In one aspect, the microcontroller 211 communicates instructions to other modules of the system 204. For example, after an image is captured, the microcontroller 211 communicates 150 instructions to reset system 204 so that system 204 is ready to take another image. In another aspect, the microcontroller 211 processes data. For example, the microcontroller 211 converts the analog sensor readings to digital values. The microcontroller may also process an image captured by the imaging system 240, 242, 244, 246. In a further aspect, the microcontroller controls communication with server 120 and user device 130. In a further aspect, the microcontroller 211 may be programmed with instructions to power the system 204 only when a new command is received. In this example, a received instruction is added to a queue of instructions and, after the instruction is accepted, the imaging system 240, 242, 244, 246 and sensor module 224 are activated to collect the requested data.
[0042] In one aspect, clock module 214 assists in the implementation of time-dependent routines such as an image capture schedule or a power-saving routine, such as a deep sleep mode, to save power and increase the lifespan of the smart trap 110.
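A sketch of such a power-saving routine is shown below in MicroPython for an ESP8266-class board; MicroPython itself and the one-hour wake interval are assumptions, not part of the disclosure.

```python
# MicroPython sketch (assumed firmware) for an ESP8266-class main board.
import machine

def sleep_until_next_capture(interval_ms):
    """Arm the RTC alarm, then enter deep sleep to conserve the battery."""
    rtc = machine.RTC()
    rtc.irq(trigger=rtc.ALARM0, wake=machine.DEEPSLEEP)  # wake on RTC alarm
    rtc.alarm(rtc.ALARM0, interval_ms)
    machine.deepsleep()  # powers down; execution restarts from boot on wake

if machine.reset_cause() == machine.DEEPSLEEP_RESET:
    pass  # woke from deep sleep: capture an image here, then sleep again

sleep_until_next_capture(60 * 60 * 1000)  # e.g. hourly image capture schedule
```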
[0043] Figure 5 is a block diagram of the shield board 220. In some embodiments, the shield board 220 includes a lighting module 222, a sensor module 224, and a shield communication module 226. The lighting module 222 directs light towards the collection chamber 400 when an image is being captured by the camera 242, 244. The lighting module 222 has at least one light that is perpendicular to the collection chamber 400. The light may be flat. One or more light-emitting diodes (LEDs) may be used for the lighting module 222. The microcontroller 211 may issue instructions to turn on the lighting module 222, capture an image, and/or save the image to a temporary file on the main board 210. In some embodiments, the sensor module 224 has at least one sensor for monitoring and/or recording an ambient condition such as temperature and/or relative humidity. In some embodiments, the shield communication module 226 is utilized to provide communication with the main board 210. For example, the shield communication module 226 may receive instructions from the microcontroller 211 to turn on the lighting module 222 or collect sensor data from the sensor module 224. In one aspect, the shield communication module is a port expander. In one example, the port expander is an MCP23008 port expander.
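Because the shield communication module is described as an MCP23008 port expander, driving the lighting module might be sketched as follows; the I2C wiring, device address, and LED pin assignment are assumptions.

```python
# MicroPython sketch: switch the LEDs through an MCP23008 port expander.
from machine import Pin, SoftI2C

MCP23008_ADDR = 0x20       # assumed default address (A0-A2 tied low)
IODIR, GPIO = 0x00, 0x09   # MCP23008 direction and output-port registers

i2c = SoftI2C(scl=Pin(5), sda=Pin(4))            # assumed wiring
i2c.writeto_mem(MCP23008_ADDR, IODIR, b"\x00")   # configure all pins as outputs

def lights(on):
    """Drive the LEDs (assumed on GP0-GP3) during image capture."""
    i2c.writeto_mem(MCP23008_ADDR, GPIO, b"\x0f" if on else b"\x00")
```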
[0044] Figure 6 is a block diagram of the camera board 240. The camera board 240 includes a camera interface 246, an image sensor 244, and a camera 242 with a lens and a shutter. In one example, the camera board 240 is an ArduCam 2-megapixel OV2640 camera. Camera 242 is connected to the camera board 240. In one aspect, camera 242 is a high-resolution camera. In another aspect, the camera interface 246 receives instructions from the main board 210 to keep the shutter open for a pre-determined amount of time. In addition, the shield communication module 226 may provide confirmation of lights being turned on and/or sensor data being collected by the sensor module 224. Camera 242 captures an image of the collection chamber 400. In one aspect, the captured image is a color (RGB) image.
[0045] Figures 7-10B show exemplary embodiments of an insect detection system 100 as described herein. Figure 7 shows an exemplary arrangement for the boards 210, 220, 240. In this example, the shield board 220 is positioned between the main board 210 and the camera board 240; the main board 210 is the top board and the camera board 240 is the bottom board. The boards 210, 220, 240 are connected by a vertical support (see rear left). The lighting module 222, located on the shield board 220 (as shown in Figure 5), and the lens of the camera 242 are oriented in the same direction so that when the shield board 220 is attached to the cap of a smart trap (as shown in Figure 2A), the lighting module 222 and camera 242 are directed towards the collection chamber 400. Figures 8A-B show an exemplary main board 210. In this example, the power module 216 for the main board 210 includes a battery socket 217 receiving power from at least one battery 218. The battery socket 217 and WiFi communication module 212 are located on the first side of the main board 210 (Figure 8A). In some embodiments, the first side of the main board 210 may face upward when the main board 210 is housed in the cap 200. Figures 9A-C show an exemplary embodiment of a shield board 220. In this example, the lighting module 222 is located on the first side of the shield board 220 and includes a plurality of light sources 223, e.g. LEDs (Figure 9A); four LED lights form the lighting module 222 in this example. In some embodiments, the communication module 226 and sensor module 224 are located on the second side of the shield board 220, as shown in Figure 9B. Figure 9C shows the trace for the exemplary shield board 220 embodiment. Figures 10A-B show an exemplary camera board 240. Camera 242, 244 is attached to the first side of the camera board 240 (Figure 10A).
[0046] In summary, an insect detection system 100 as disclosed herein is configured to acquire high-resolution, high-quality images in a dark environment. Analyzing a high-resolution, high-quality image improves the accuracy of the insect count. First, a high-resolution camera produces a higher quality image. Furthermore, the white base 410 provides a higher contrast background for insects in a captured image, thereby producing a higher quality image. Also, uniform lighting provided by the lighting module 222 during the imaging process improves image quality. Additionally, instructions to keep the shutter open for a pre-determined amount of time are sent to the camera board 240 so that the image sensor 244 can absorb more light.
[0047] In an additional aspect, captured images are time-stamped. The filename for an image may include identifying information such as the trap ID and the date and time the image was captured. In one aspect, a database is used to organize the captured images and/or processed images.
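A filename convention of this kind might be generated as in the short sketch below; the exact format is an assumption, since the disclosure does not fix one.

```python
from datetime import datetime

def capture_filename(trap_id, when=None):
    """Build a time-stamped image filename that embeds the trap ID."""
    when = when or datetime.now()
    return f"{trap_id}_{when:%Y%m%d_%H%M%S}.jpg"

print(capture_filename("trap-07"))  # e.g. trap-07_20210224_143000.jpg
```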
[0048] Figure 11 is a flowchart that illustrates a method 500 of analyzing images captured by the smart trap 110. In some embodiments, algorithm 500 determines the number of insects in each captured image. In some embodiments, the algorithm 500 may be executed by an image processor. The image processor may be the microcontroller 211 located in trap 110, server 120, and/or user device 130. In one example, algorithm 500 is a macro/function that may be executed by the microcontroller 211. At step 600, the captured image is cropped and masked to form a first processed image.
[0049] In one aspect, cropping and masking the captured image 600 removes extraneous areas and details of the captured image so that only the region in the captured image corresponding to the collection chamber 400 undergoes further processing and analysis. Thus, the first processed image for a circular collection chamber 400 is smaller than the captured image and includes only the region corresponding to the expected location of insects.
[0050] At step 700, the cropped/masked image is processed to modify one or more characteristics of the image. In some embodiments, modifying at least one characteristic of the cropped/masked image reduces noise, minimizes or negates fine particles and/or extraneous details, and/or converts the cropped/masked image into a binary image. These modifications, alone or in combination, increase the accuracy of the count of insects. Modifications include transforming the cropped/masked image into a single-colored image (e.g. greyscale), adjusting the brightness and/or contrast of the image, binarizing the image, and/or reducing image noise and/or detail in the image. Binarization converts a pixel image into a binary image and may also reduce noise in the image. Binarization may be conducted only on dark regions or on the entire image. Step 700 forms a second processed image that is a processed cropped image.
[0051] At step 800, the processed cropped image is analyzed to determine the count of insects in the image. The count of insects is a measure of grain infestation. The insect count and sensor data can be analyzed to determine a correlation between insect emergence and ambient conditions (e.g. temperature and/or relative humidity).
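One simple way to quantify the correlation mentioned above is a Pearson coefficient over the logged readings; the sketch below uses NumPy and invented example data purely for illustration.

```python
import numpy as np

# Invented example log: hourly insect counts with matching sensor readings.
insect_counts = np.array([0, 1, 1, 3, 4, 6, 8, 9], dtype=float)
temperature_c = np.array([22.0, 23.5, 24.0, 26.0, 27.5, 29.0, 30.0, 31.0])
humidity_pct = np.array([35, 37, 38, 42, 45, 47, 50, 52], dtype=float)

# Pearson correlation of emergence against each ambient condition.
r_temp = np.corrcoef(insect_counts, temperature_c)[0, 1]
r_rh = np.corrcoef(insect_counts, humidity_pct)[0, 1]
print(f"count vs temperature r = {r_temp:.2f}; count vs RH r = {r_rh:.2f}")
```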
[0052] Figure 12 is a flowchart that illustrates a method 502 of analyzing images captured by the smart trap 110. In some embodiments, the algorithm 502 determines the number of insects in each captured image. In some embodiments, the algorithm 502 may be executed on microcontroller 211, server 120, and/or user device 130. In one example, algorithm 502 is a macro that may be executed by the microcontroller 211.
[0053] At step 610, the captured image is analyzed to define a region in the image that corresponds to the collection chamber 400. At step 630, the captured image is cropped and masked to form a first processed image that contains only the region that corresponds to the collection chamber 400. In one example, step 630 includes reloading the captured image, cropping the captured image to fit the bounding box, and masking the corners of the bounding box to produce a circular image. At step 700, the cropped/masked image is processed to modify one or more characteristics of the image. In some embodiments, modifying at least one characteristic of the cropped/masked image reduces noise, minimizes or negates fine particles and/or extraneous details, and/or converts the cropped/masked image into a binary image. These modifications, alone or in combination, increase the accuracy of the count of insects. Modifications include transforming the cropped/masked image into a single-colored image (e.g. greyscale), adjusting the brightness and/or contrast of the image, binarizing the image, and/or reducing image noise and/or detail in the image. Binarization converts a pixel image into a binary image and may also reduce noise in the image. Binarization may be conducted only on dark regions or on the entire image. Step 700 forms a second processed image that is a processed cropped image. At step 820, a particle detection algorithm is applied to the processed cropped image. At step 830, the processed cropped image is analyzed to determine the count of insects in the image. The count of insects is a measure of grain infestation. The insect count and sensor data can be analyzed to determine a correlation between insect emergence and ambient conditions (e.g. temperature and/or relative humidity).

[0054] In one example, steps 610, 630, 820, and 830 of algorithm 502 are subroutines of the algorithm 500 shown in Figure 11, with steps 610 and 630 being subroutines of cropping and masking the captured image 600 and steps 820 and 830 being subroutines of determining a count of insects 800.
[0055] Figure 13 is a flowchart of an exemplary method 610 for determining a region in the captured image that corresponds to the collection chamber 400. At step 612, the captured image is converted to greyscale. At step 614, noise is removed from the captured image. In one example, step 614 includes applying an averaging method that removes noise while preserving edges. An example of a suitable averaging method is the median blur method. At step 616, the captured image is transformed into a binary image. At step 618, a transform method is applied to the captured image to define the region that corresponds to the collection chamber 400. In one example, step 618 applies the Hough Circle transform. The Hough Circle transform identifies at least one circular region in the captured image. Each identified region is delineated by a bounding box. An average of all identified bounding boxes is used to determine the center of the region in the captured image that corresponds to the collection chamber 400. In one aspect, the Hough Circle transform is applied to a binary image.
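A sketch of method 610, together with the crop-and-mask of step 630, might look like the following OpenCV code; the blur kernel, thresholds, and HoughCircles parameters are tuning assumptions rather than values from the disclosure.

```python
import cv2
import numpy as np

def locate_and_crop_chamber(captured_bgr):
    """Steps 612-618: find the circular collection-chamber region;
    step 630: crop to its bounding box and mask the corners."""
    grey = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)       # step 612
    denoised = cv2.medianBlur(grey, 5)                          # step 614: edge-preserving averaging
    _, binary = cv2.threshold(denoised, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # step 616
    circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT,      # step 618
                               dp=1.5, minDist=50, param1=100,
                               param2=30, minRadius=80, maxRadius=400)
    if circles is None:
        raise ValueError("no circular region found")
    cx, cy, r = np.mean(circles[0], axis=0).astype(int)         # average the candidate circles

    h, w = grey.shape
    x0, y0 = max(cx - r, 0), max(cy - r, 0)
    x1, y1 = min(cx + r, w), min(cy + r, h)
    crop = captured_bgr[y0:y1, x0:x1]                           # crop to the bounding box
    mask = np.zeros(crop.shape[:2], dtype=np.uint8)
    cv2.circle(mask, (cx - x0, cy - y0), int(r), 255, -1)       # mask out the corners
    return cv2.bitwise_and(crop, crop, mask=mask)               # circular first processed image
```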
[0056] Figure 14 is a flowchart of an exemplary method of modifying one or more characteristics of an image 700 to form a processed cropped image. At step 714, the cropped image is transformed into a single-colored image (e.g. greyscale). At step 716, the brightness and/or contrast of the cropped image is adjusted. At step 718, dark regions of the cropped image are binarized. At step 720, the amount of noise and/or fine detail in the cropped image is reduced. In one embodiment, step 720 includes applying Gaussian Blur and Unsharp Masking; multiple passes of Gaussian Blur and Unsharp Masking may be applied. The Gaussian Blur method reduces image noise/detail while the Unsharp Masking method increases the sharpness. At step 722, the brightness and/or contrast of the cropped image is adjusted again. At step 724, the entire cropped image is binarized.
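The pipeline of Figure 14 might be sketched as follows with OpenCV; the gains, threshold, kernel size, and number of blur/unsharp passes are assumptions.

```python
import cv2

def enhance_for_counting(cropped_bgr):
    """Steps 714-724: produce a binary image in which dark insects stand out."""
    grey = cv2.cvtColor(cropped_bgr, cv2.COLOR_BGR2GRAY)          # step 714
    adjusted = cv2.convertScaleAbs(grey, alpha=1.4, beta=10)      # step 716
    # Step 718: binarize only the dark regions (inversion keeps them white).
    _, dark = cv2.threshold(adjusted, 60, 255, cv2.THRESH_BINARY_INV)
    for _ in range(2):  # step 720: blur/unsharp may be applied multiple times
        blurred = cv2.GaussianBlur(dark, (5, 5), 0)               # suppress fine particles
        dark = cv2.addWeighted(dark, 1.5, blurred, -0.5, 0)       # unsharp masking
    readjusted = cv2.convertScaleAbs(dark, alpha=1.2, beta=0)     # step 722
    _, final_binary = cv2.threshold(readjusted, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # step 724
    return final_binary
```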
[0057] Figure 15 is a flowchart of an exemplary method for the particle detection algorithm 820. The insect number is determined by running Python code. The code processes images using an insect counting routine (implemented with ImageJ) to count the insects in each image and save the data. To count the insects, ImageJ is used to refine the image to eliminate any particles in the background and highlight the dark-colored insects via the following steps. At step 822, at least one bounding box is identified. In one example, identifying at least one bounding box 822 includes identifying regions of interest in the processed cropped image, delineating each region of interest with a bounding box, and placing the bounding box into a set of bounding boxes. At step 824, the set of bounding boxes is restricted/filtered to eliminate bounding boxes that are unlikely to contain an insect and/or keep bounding boxes which are likely to contain an insect. For example, a bounding box may surround an extraneous region that does not contain an insect. Bounding boxes to be analyzed in the count of insects may include bounding boxes within a certain area band, bounding boxes that contain only black pixels, bounding boxes that include an object with a specified eccentricity, and/or bounding boxes with an oval-shaped object. A count of the number of insects in the processed cropped image is determined by counting the insects in the subset of bounding boxes 830. In a further aspect, the insect count data may be saved to memory on server 120 and visualized by using the user interface for further analysis. The user interface, including a mobile application, was designed to allow the user to easily see the data related to insect number, temperature, and relative humidity. Additionally, a graph may be used to visualize the insect count over time. The mobile application also has an additional feature which allows the user to view past data.

[0058] Any suitable programming language may be used to implement instructions for insect detection system 100, algorithm 500, and the user interface. Some examples include Python, C++, HTML5, CSS3, and/or JavaScript.
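The disclosure performs this particle analysis with ImageJ; the sketch below reproduces the same box-filter-count logic with OpenCV contours, with the area band and eccentricity cutoff as stand-in tuning values.

```python
import cv2

def count_insects(binary):
    """Steps 822-830: box candidate particles, filter the boxes, count survivors."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    kept_boxes = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if not (40 <= area <= 800):          # assumed area band for a single insect
            continue
        if len(contour) >= 5:                # fitEllipse needs at least five points
            _, axes, _ = cv2.fitEllipse(contour)
            minor, major = min(axes), max(axes)
            if major > 0 and minor / major < 0.15:
                continue                     # reject slivers failing the shape check
        kept_boxes.append(cv2.boundingRect(contour))  # subset of bounding boxes
    return len(kept_boxes), kept_boxes

# Usage with a hypothetical processed image file:
n, boxes = count_insects(cv2.imread("processed_crop.png", cv2.IMREAD_GRAYSCALE))
print(f"{n} insects detected")
```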
[0059] The effectiveness and accuracy of an insect detection system 100 as described herein were evaluated in a laboratory setting and a commercial setting. The effectiveness, recovery rate, and insect counting accuracy rate were examined. Additionally, the temperature and relative humidity of ambient air inside and outside of storage, and the rice moisture content, were measured during the tests.
[0060] The laboratory setting was a cylindrical container filled with rice infested with red flour beetles. The cylindrical container had a diameter of 20 cm, a height of 48 cm, and contained 8 kg of infested rice. The system was tested under different infestation concentrations (4, 8, 16, and 24 insects per 8 kg of rough rice, equivalent to 0.5/kg, 1/kg, 2/kg, and 3/kg). Three tests were conducted for each concentration.
[0061] Table 2 shows the effectiveness and recovery rate of the system. The system detected the emergence of the first insect within 19, 18, 20, and 20 minutes on average under infestation concentrations of 0.5/kg, 1/kg, 2/kg, and 3/kg, respectively (Table 2). The corresponding recovery rates of total insects were 83%, 75%, 73%, and 76% after 24 hours. For an insect concentration of 0.5 insects/kg, the system detected the emergence of the first insect within 12, 16, and 29 minutes for replicates 1, 2, and 3, respectively (Table 2). The corresponding recovery rates of total insects were 75%, 100%, and 75%, respectively. For an insect concentration of 1 insect/kg, the system detected the emergence of the first insect within 33, 17, and 4 minutes for replicates 1, 2, and 3, respectively (Table 2). The corresponding recovery rates of total insects were 75%, 75%, and 75%, respectively. For a concentration of 3 insects/kg, the system detected the emergence of the first insect within 29, 18, and 13 minutes for replicates 1, 2, and 3, respectively. The corresponding recovery rates of total insects were 80%, 71%, and 80%, respectively.
Table 2. Effectiveness and Recovery Rate of the System During Laboratory Test
[Table 2 is published as an image in the source document; its values are not reproduced in the text.]
[0062] The recovery rate is the percentage of the insects detected after 24 hours compared to the total number of insects in the infested rice. The recovery rate of total insects can be calculated using the following equation:
[0063]
$$RR = \frac{NID_{24hr}}{TNI_{8kg}} \times 100$$
[0064] where RR is the recovery rate (%), NID24hr is the number of insects detected after 24 hours, and TNI8kg is the total number of insects infesting the 8 kg of rice.
[0065] Table 3 shows the insect counting accuracy rate of the system during the laboratory test. The insect counting accuracy rate is based on the difference between the number of insects visually counted and those counted by the system. The system achieved high counting accuracies of 93% and 95.6% for 1/kg and 2/kg, respectively (Table 3). The average image counting accuracy rate was 94.3%.
Table 3. Insect Counting Accuracy Rate of the System During the Laboratory Test
[Table 3 is published as an image in the source document; its values are not reproduced in the text.]
[0066] The insect counting accuracy rate can be calculated using the following equation:
[0067]
$$ICAR = \frac{NIVC - AD}{NIVC} \times 100$$
[0068] where ICAR is the image counting accuracy rate (%), AD is the difference between the number of insects visually counted and those counted by the system, and NIVC is the number of insects visually counted.
[0069] Table 4 provides the averages and standard deviations of temperatures and relative humidity recorded by the system and thermometer during the laboratory test. The overall average and standard deviation for temperature and relative humidity recorded by the system were 26.5 ± 0.6°C and 30 ± 0.7%, respectively. The corresponding values recorded by the thermometer were 24.5 ± 1.0°C and 32.5 ± 0.8%, respectively (Table 4).
Table 4. Averages and Standard Deviations of Temperatures and Relative Humidity Recorded by the System and Thermometer During the Laboratory Test
[Table 4 is published as an image in the source document; its values are not reproduced in the text.]
[0070] The commercial setting was a commercial storage facility with rice. The rice in the commercial storage facility was visually inspected before and after three traps were installed. Visual inspection of the commercial storage facility included taking samples from nine different locations and inspecting the samples using a screening method to determine that the rice was not infested. The tests in the commercial setting were done in triplicate. For each replicate, the system was installed and left for one week. During that time, insect activity was remotely monitored. After 7 days, the traps were inspected, and the trapped insects were visually counted and compared with those detected by the system.
[0071] Table 5 provides data relevant to the effectiveness and early detection of insect activity by the insect detection system in a commercial setting. The system was able to detect the emergence of the first, second, and third insects within 10, 40, and 130 minutes for trap number 1 (Table 5). The corresponding values for trap numbers 2 and 3 were 11, 42, and 120 minutes, and 15, 43, and 130 minutes, respectively (Table 5).
Table 5. Effectiveness and Early Detection of Insect Activity During Commercial Storage Tests
[Table 5 is published as an image in the source document; its values are not reproduced in the text.]
[0072] Table 6 shows the insect counting accuracy rate of the system in a commercial setting. Analysis of the data revealed that it took only 12 minutes to detect insect activity, with a counting accuracy of 91.3%. For trap number 1, the results revealed that the counting accuracy was 100%, 91.7%, and 90% for the first, second, and third tests, respectively (Table 6). The corresponding values for trap numbers 2 and 3 were 75%, 100%, and 88.9%, and 88.9%, 87.5%, and 100%, respectively (Table 6).
Table 6. Insect Counting Accuracy of the System During Commercial Storage Tests
[Table 6 is published as an image in the source document; its values are not reproduced in the text.]
[0073] Table 7 shows the averages and standard deviations of temperatures and relative humidity recorded by the system and thermometer during the commercial storage tests. The overall averages and standard deviations for the temperature recorded by the system sensors were 31.2 ± 4.5, 30.9 ± 5.0, and 31.7 ± 3.8°C for trap numbers 1, 2, and 3, respectively (Table 7). The corresponding values recorded by the thermometer were 30.5 ± 4.0, 29.3 ± 2.1, and 30.1 ± 3.1°C, respectively. The overall averages and standard deviations for the relative humidity recorded by the system sensors were 49.5 ± 11, 50 ± 10, and 50 ± 12% for trap numbers 1, 2, and 3, respectively (Table 7). The corresponding values recorded by the thermometer were 49 ± 10, 48 ± 11, and 48 ± 10%, respectively. The average ambient temperatures inside and outside the storage were 25.7 ± 4.6°C and 28.1 ± 8.6°C, respectively. The corresponding values for relative humidity were 46.9 ± 5.3% and 45.4 ± 12.6%. The average moisture content of stored rice was 11.8 ± 0.5%.
Table 7. Averages and Standard Deviations of Temperatures and Relative Humidity Recorded by the System and Thermometer During the Commercial Storage Tests
[Table 7 is published as an image in the source document; its values are not reproduced in the text.]
[0074] As can be seen from the data, the results obtained from the commercial storage facility were consistent with those obtained from the laboratory test setting.
[0075] In summary, the system as described herein can detect insect activity during lab and commercial storage tests in less than 20 minutes with a counting accuracy of more than 90% (FIG. 16). Specifically, the system detected insect activity during the commercial setting test within 12 minutes with a counting accuracy of 91.3%.
[0076] Discussion of Possible Embodiments
[0077] According to one aspect, a system for real-time monitoring of insects includes a trap and an image processor. The trap includes a chamber with perforations sized to admit insects into an interior of the smart trap, a collection chamber located within the interior of the smart trap, and an imaging system for capturing images that include the collection chamber of the smart trap. The image processor is configured to receive images captured by the smart trap and to determine a count of insects within the collection chamber based on image analysis of the received image, wherein image analysis includes identifying a region within the received image corresponding with a boundary of the collection chamber, cropping the received image to the identified region to generate a cropped image, modifying at least one characteristic of the cropped image to generate a modified, cropped image, and determining a count of insects based on the modified, cropped image.

[0078] The system of the preceding paragraph can optionally include, additionally and/or alternatively, any one or more of the following features, configurations, and/or additional components.
[0079] For example, in some embodiments, the image processor is included as part of the smart trap.
[0080] In some embodiments, the system further includes a server located remotely from the smart trap, wherein the server is communicatively coupled to the smart trap to receive data from the smart trap.
[0081] In some embodiments, the image processor is located on the server, and data received from the smart trap includes images captured by the smart trap.
[0082] In some embodiments, identifying a region within the received image corresponding with a boundary of the collection chamber includes applying a Hough Circle transform to the captured image to identify the region in the received image corresponding with the boundary of the collection chamber.
[0083] In some embodiments, modifying at least one characteristic of the cropped image includes one or more of converting the cropped image to greyscale, adjusting brightness/contrast of the cropped image, binarizing dark regions of the cropped image, reducing image noise/detail of the cropped image, and binarizing the cropped image.
[0084] In some embodiments, determining a count of insects based on the modified, cropped image includes applying a particle detection algorithm to the modified, cropped image.
[0085] In some embodiments, applying the particle detection algorithm includes identifying a bounding box for each region of interest in the second processed image, placing the bounding box into a set of bounding boxes, filtering the set of bounding boxes to a subset of bounding boxes, and counting the insects in the subset of bounding boxes to determine a count of insects in the captured image.
[0086] In some embodiments, filtering the set of bounding boxes includes restricting bounding boxes in the set based on one or more of location of the bounding box within a certain area band, presence of black pixels within the bounding box, presence of an object having a specified eccentricity, and/or presence of an oval-shaped object.

[0087] In some embodiments, the particle detection algorithm determines a count of insects from a subset of bounding boxes, each bounding box identifying a region of interest in the modified, cropped image.
[0088] According to another aspect, a trap for detecting insects includes a perforated chamber with openings sized to admit insects into an interior of the trap, a collection chamber located within the interior of the trap for collecting admitted insects, and a cap configured to cover the perforated chamber, the cap housing an electronic system including an imaging system to capture an image of the collection chamber.
[0089] The trap of the preceding paragraph can optionally include, additionally and/or alternatively, any one or more of the following features, configurations, and/or additional components.
[0090] In some embodiments, the electronic system further includes a lighting module for lighting the collection chamber when the imaging system captures an image.
[0091] In some embodiments, the electronic system further includes a sensor module for collecting data on one or more ambient conditions.
[0092] In some embodiments, the imaging system includes a camera board, the system further comprising: a main board having a microcontroller and a communication module, and a shield board having a lighting module; wherein the main board, the shield board, and the camera board are stacked horizontally with the shield board positioned between the main board and the camera board.
[0093] In some embodiments, the microcontroller is configured to provide instructions to the shield board and the camera board, wherein instructions include instructions to the lighting module to illuminate the interior of the smart trap and instructions to the camera board to capture an image of the interior of the smart trap.
[0094] According to another aspect, a method of counting insects in a captured image includes cropping and masking the captured image to produce a first processed image containing only a region in the captured image that correlates to a collection chamber, modifying at least one characteristic of the first processed image to produce a second processed image, and determining a count of insects in the captured image by executing a particle detection algorithm on the second processed image.
[0095] The method of the preceding paragraph can optionally include, additionally and/or alternatively, any one or more of the following features, configurations, and/or additional components.

[0096] In some embodiments, cropping and masking the captured image includes applying a Hough Circle transform to define the region.
[0097] In some embodiments, modifying at least one characteristic of the first processed image makes any insects in the first processed image more pronounced for easier identification by the particle detection algorithm.
[0098] In some embodiments, modifying at least one characteristic of the first processed image includes one or more of converting the first processed image to greyscale, adjusting brightness/contrast of the first processed image, binarizing dark regions of the first processed image, reducing image noise/detail of the first processed image, and binarizing the first processed image.
[0099] In some embodiments, the particle detection algorithm includes determining a count of insects from a subset of bounding boxes, each bounding box identifying a region of interest in the second processed image.
[00101] In some embodiments, the particle detection algorithm further includes identifying a bounding box for each region of interest in the second processed image; placing the bounding box into a set of bounding boxes; filtering the set of bounding boxes to a subset of bounding boxes; and counting insects in the subset of bounding boxes to determine a count of insects in the captured image.
[00102] In some embodiments, filtering the set of bounding boxes includes restricting bounding boxes in the set based on one or more of location of the bounding box within a certain area band, presence of black pixels within the bounding box, presence of an object having a specified eccentricity, and/or presence of an oval-shaped object.
[00103] Other embodiments of the present disclosure are possible. Although the description above contains much specificity, these details should not be construed as limiting the scope of the disclosure, but as merely providing illustrations of some of the presently preferred embodiments of this disclosure. It is also contemplated that various combinations or sub-combinations of the specific features and aspects of the embodiments may be made and still fall within the scope of this disclosure. It should be understood that various features and aspects of the disclosed embodiments can be combined with, or substituted for, one another in order to form various embodiments. Thus, it is intended that the scope of at least some of the present disclosure should not be limited by the particular disclosed embodiments described above.

[00104] Thus, the scope of this disclosure should be determined by the appended claims and their legal equivalents. Therefore, it will be appreciated that the scope of the present disclosure fully encompasses other embodiments which may become obvious to those skilled in the art, and that the scope of the present disclosure is accordingly to be limited by nothing other than the appended claims, in which reference to an element in the singular is not intended to mean "one and only one" unless explicitly so stated, but rather "one or more." All structural, chemical, and functional equivalents to the elements of the above-described preferred embodiment that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. Moreover, it is not necessary for a device or method to address each and every problem sought to be solved by the present disclosure for it to be encompassed by the present claims. Furthermore, no element, component, or method step in the present disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims.
[00105] The foregoing description of various preferred embodiments of the disclosure has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise embodiments disclosed, and obviously many modifications and variations are possible in light of the above teaching. The example embodiments, as described above, were chosen and described in order to best explain the principles of the disclosure and its practical application, to thereby enable others skilled in the art to best utilize the disclosure in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the claims appended hereto.
[00106] Various examples have been described. These and other examples are within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A system for real-time monitoring of insects comprising: a smart trap comprising: a chamber with perforations sized to admit insects into an interior of the smart trap; a collection chamber located within the interior of the smart trap; and an imaging system for capturing images that include the collection chamber of the smart trap; and an image processor configured to receive images captured by the smart trap and to determine a count of insects within the collection chamber based on image analysis of the received image, wherein image analysis includes identifying a region within the received image corresponding with a boundary of the collection chamber, cropping the received image to the identified region to generate a cropped image, modifying at least one characteristic of the cropped image to generate a modified, cropped image, and determining a count of insects based on the modified, cropped image.
2. The system of claim 1, wherein the image processor is included as part of the smart trap.
3. The system of claim 1, further including a server located remotely from the smart trap, wherein the server is communicatively coupled to the smart trap to receive data from the smart trap.
4. The system of claim 3, wherein the image processor is located on the server, and wherein data received from the smart trap includes images captured by the smart trap.
5. The system of claim 1, wherein identifying a region within the received image corresponding with a boundary of the collection chamber includes applying a Hough Circle transform to the captured image to identify the region in the received image corresponding with the boundary of the collection chamber.
6. The system of claim 1, wherein modifying at least one characteristic of the cropped image includes one or more of converting the cropped image to greyscale, adjusting brightness/contrast of the cropped image, binarizing dark regions of the cropped image, reducing image noise/detail of the cropped image, and binarizing the cropped image.
7. The system of claim 1, wherein determining a count of insects based on the modified, cropped image includes applying a particle detection algorithm to the modified, cropped image.
8. The system of claim 7, wherein applying the particle detection algorithm includes identifying a bounding box for each region of interest in the second processed image, placing the bounding box into a set of bounding boxes, filtering the set of bounding boxes to a subset of bounding boxes, and counting the insects in the subset of bounding boxes to determine a count of insects in the captured image.
9. The system of claim 8, wherein filtering the set of bounding boxes includes restricting bounding boxes in the set based on one or more of location of the bounding box within a certain area band, presence of black pixels within the bounding box, presence of an object having a specified eccentricity, and/or presence of an oval-shaped object.
10. The system of claim 7, wherein the particle detection algorithm determines a count of insects from a subset of bounding boxes, each bounding box identifying a region of interest in the modified, cropped image.
11. A smart trap for detecting insects, comprising: a perforated chamber with openings sized to admit insects into an interior of the smart trap; a collection chamber located within the interior of the smart trap for collecting admitted insects; and a cap configured to cover the perforated chamber, the cap housing an electronic system including an imaging system to capture an image of the collection chamber.
12. The smart trap of claim 11, the electronic system further including a lighting module for lighting the collection chamber when the imaging system captures an image.
13. The smart trap of claim 11, the electronic system further including a sensor module for collecting data on one or more ambient conditions.
14. The smart trap of claim 11, wherein the imaging system includes a camera board, the smart trap further comprising: a main board having a microcontroller and a communication module; and a shield board having a lighting module; wherein the main board, the shield board, and the camera board are stacked horizontally with the shield board positioned between the main board and the camera board.
15. The smart trap of claim 14, wherein the microcontroller is configured to provide instructions to the shield board and the camera board, wherein instructions include instructions to the lighting module to illuminate the interior of the smart trap and instructions to the camera board to capture an image of the interior of the smart trap.
16. A method of counting insects in a captured image, comprising: cropping and masking the captured image to produce a first processed image containing only a region in the captured image that correlates to a collection chamber; modifying at least one characteristic of the first processed image to produce a second processed image; and determining a count of insects in the captured image by executing a particle detection algorithm on the second processed image.
17. The method of claim 16, wherein cropping and masking the captured image includes applying a Hough Circle transform to define the region.
18. The method of claim 16, wherein modifying at least one characteristic of the first processed image makes any insects in the first processed image more pronounced for easier identification by the particle detection algorithm.
19. The method of claim 16, wherein modifying at least one characteristic of the first processed image includes one or more of converting the first processed image to greyscale, adjusting brightness/contrast of the first processed image, binarizing dark regions of the first processed image, reducing image noise/detail of the first processed image, and binarizing the first processed image.
20. The method of claim 16, wherein the particle detection algorithm includes determining a count of insects from a subset of bounding boxes, each bounding box identifying a region of interest in the second processed image.
21. The method of claim 16, wherein the particle detection algorithm further includes: identifying a bounding box for each region of interest in the second processed image; placing the bounding box into a set of bounding boxes; filtering the set of bounding boxes to a subset of bounding boxes; and counting insects in the subset of bounding boxes to determine a count of insects in the captured image.
22. The method of claim 21, wherein filtering the set of bounding boxes includes restricting bounding boxes in the set based on one or more of location of the bounding box within a certain area band, presence of black pixels within the bounding box, presence of an object having a specified eccentricity, and/or presence of an oval-shaped object.
PCT/US2021/019325 2020-02-24 2021-02-24 Real-time monitoring and early detection system for insect activity in grains during storage WO2021173609A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US17/801,573 US20230129551A1 (en) 2020-02-24 2021-02-24 Real-time monitoring and early detection system for insect activity in grains during storage
CA3169032A CA3169032A1 (en) 2020-02-24 2021-02-24 Real-time monitoring and early detection system for insect activity in grains during storage
CN202180026268.0A CN115361865A (en) 2020-02-24 2021-02-24 Real-time monitoring and early detection system for insect activity in grain during storage
EP21759848.1A EP4110049A4 (en) 2020-02-24 2021-02-24 Real-time monitoring and early detection system for insect activity in grains during storage

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202062980952P 2020-02-24 2020-02-24
US62/980,952 2020-02-24

Publications (1)

Publication Number Publication Date
WO2021173609A1 true WO2021173609A1 (en) 2021-09-02

Family

ID=77492002

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/019325 WO2021173609A1 (en) 2020-02-24 2021-02-24 Real-time monitoring and early detection system for insect activity in grains during storage

Country Status (5)

Country Link
US (1) US20230129551A1 (en)
EP (1) EP4110049A4 (en)
CN (1) CN115361865A (en)
CA (1) CA3169032A1 (en)
WO (1) WO2021173609A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR102018072956B1 (en) * 2018-11-08 2024-02-20 Livefarm Tecnologia Agropecuaria Ltda ADAPTER FOR AUTOMATION OF DETECTION DEVICES, REMOTE, AUTOMATIC AND UNINTERRUPTED COUNTING OF TARGET PESTS AND PERIMETERAL CONTROLLER OF LEPIDOPTERA
US20210342713A1 (en) * 2020-05-04 2021-11-04 Bioverse Labs Corp Environmental and crop monitoring system
US20220217962A1 (en) * 2019-05-24 2022-07-14 Anastasiia Romanivna ROMANOVA Mosquito monitoring and counting system


Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5646404A (en) * 1995-02-17 1997-07-08 The United States Of America As Represented By The Secretary Of Agriculture Electronic grain probe insect counter (EGPIC)
US9664813B2 (en) * 2015-02-13 2017-05-30 Delta Five, Llc Automated insect monitoring system
CN105976502A (en) * 2016-04-26 2016-09-28 江苏茴香豆网络科技有限公司 Acceptance bill serial number identification method and acceptance bill serial number identification device
CN205624024U (en) * 2016-05-17 2016-10-12 河南工业大学 Grain is piled surperficial top layer pest and is traped monitoring devices
PT109433A (en) * 2016-06-07 2017-12-07 Filipe Pinheiro Pinto Sobreiro Luís MACHINE FOR INSECT CAPTURE, COUNTING AND MONITORING
CN107094729A (en) * 2017-05-22 2017-08-29 常州大学 The machine visual detection device and method of counting of insect inside silo
CN109726615A (en) * 2017-10-30 2019-05-07 北京京东尚科信息技术有限公司 A kind of recognition methods of road boundary and device
CN109684906B (en) * 2018-05-31 2021-04-30 北京林业大学 Method for detecting red fat bark beetles based on deep learning
CN209268438U (en) * 2018-12-18 2019-08-20 郑州源创智控有限公司 Integration lures worm to survey worm device
CN110250123B (en) * 2019-06-26 2021-12-21 江苏大学 Low-density grain storage pest real-time monitoring system based on image recognition

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050025357A1 (en) * 2003-06-13 2005-02-03 Landwehr Val R. Method and system for detecting and classifying objects in images, such as insects and other arthropods
US20100037512A1 (en) * 2006-06-15 2010-02-18 Woodstream Corporation Flying insect trapping device and flying insect trapping system
US20110067293A1 (en) * 2007-12-14 2011-03-24 Sterling International Inc. Multi-species insect trap with separated plumes
US20130223677A1 (en) * 2012-02-23 2013-08-29 Kirk Ots System for counting trapped insects
US20190034736A1 (en) * 2017-07-14 2019-01-31 Illumitex, Inc. System and Method for Identifying a Number of Insects in a Horticultural Area

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
See also references of EP4110049A4 *
WU ET AL.: "Multi-mosquito object detection and 2D pose estimation for automation of PfSPZ malaria vaccine production", 2019 IEEE 15TH INTERNATIONAL CONFERENCE ON AUTOMATION SCIENCE AND ENGINEERING (CASE), 22 August 2019 (2019-08-22), pages 411 - 417, XP033618339 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114679966A (en) * 2022-04-18 2022-07-01 杭州师范大学 Wisdom grain storage warehouse
EP4295681A1 (en) * 2022-06-21 2023-12-27 WITASEK Pflanzenschutz GmbH Insect trap

Also Published As

Publication number Publication date
US20230129551A1 (en) 2023-04-27
CA3169032A1 (en) 2021-09-02
EP4110049A1 (en) 2023-01-04
CN115361865A (en) 2022-11-18
EP4110049A4 (en) 2023-08-09

Similar Documents

Publication Publication Date Title
US20230129551A1 (en) Real-time monitoring and early detection system for insect activity in grains during storage
US10796161B2 (en) System and method for identifying a number of insects in a horticultural area
US11282181B2 (en) Methods of yield assessment with crop photometry
EP3264886A1 (en) System, device and method for observing piglet birth
CN108881710A (en) Image processing method, device and system and storage medium
US10984548B2 (en) Yield prediction for a cornfield
WO2019216755A1 (en) An apparatus for assessing quality of agarwood tree or agarwood essential oil
Jonckheere et al. Methods for leaf area index determination. Part I: Theories, techniques and instruments
Yao et al. Hyperspectral image classification and development of fluorescence index for single corn kernels infected with Aspergillus flavus
Kulyukin et al. Toward Sustainable Electronic Beehive Monitoring: Algorithms for Omnidirectional Bee Counting from Images and Harmonic Analysis of Buzzing Signals.
DE102018103509B3 (en) Mobile ingredient analysis system as well as procedures for sample-correct measurement and user guidance with this
CN114993965A (en) Automatic pollution source identification method and system
Benoit et al. On the value of the Kullback–Leibler divergence for cost-effective spectral imaging of plants by optimal selection of wavebands
AU2019300793B2 (en) Information processing apparatus, information processing method, and program
CN114667452A (en) Method for determining the concentration of an analyte in a body fluid
JP2016019505A (en) Pest insect generation estimation device
Vroegindeweij et al. Object discrimination in poultry housing using spectral reflectivity
CN207908379U (en) Potato disease detection device based on machine vision and spectrum
US20220172306A1 (en) Automated mobile field scouting sensor data and image classification devices
Wang et al. Automatic identification of asian rice plant-hopper based on image processing
CN114663341A (en) Quantitative detection method for glare defect of optical lens
Park et al. Real-time multispectral imaging system for online poultry fecal inspection using unified modeling language
Park et al. Line-scan hyperspectral imaging for real-time poultry fecal detection
JP6948032B2 (en) Image analyzer and inspection system
CN207379924U (en) open fluorescence detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21759848

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3169032

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021759848

Country of ref document: EP

Effective date: 20220926