WO2020188364A1 - Machine vision system and methods for inspecting aquatic species - Google Patents

Machine vision system and methods for inspecting aquatic species

Info

Publication number
WO2020188364A1
WO2020188364A1 (PCT/IB2020/050626)
Authority
WO
WIPO (PCT)
Prior art keywords
aquatic species
image capturing
data
videos
tray
Prior art date
Application number
PCT/IB2020/050626
Other languages
French (fr)
Inventor
Suryanarayana Raju Sagi
Original Assignee
Suryanarayana Raju Sagi
Priority date
Filing date
Publication date
Application filed by Suryanarayana Raju Sagi filed Critical Suryanarayana Raju Sagi
Publication of WO2020188364A1 publication Critical patent/WO2020188364A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00 Culture of aquatic animals
    • A01K61/90 Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/255 Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30128 Food products
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30242 Counting objects in image

Definitions

  • the disclosed subject matter relates generally to aquatic species inspection. More particularly, the present disclosure relates to a machine vision system and methods for inspecting aquatic species by using an image capturing device.
  • Aquaculture has become an important part of modern agriculture.
  • the farming of aquatic species (fish and shrimp, for example) is an important part of the aquaculture industry.
  • In aquaculture production, there is a need to check the growth conditions of the aquatic species periodically. Many factors affect the throughput, quality, and yield of the aquatic species.
  • Netting is the known practice for checking the count of aquatic species.
  • The present invention aims to improve the accuracy of image inspection by using controlled lighting to reduce the influence of disturbance from different light sources.
  • An objective of the present disclosure is directed towards identifying the size and count of the aquatic species.
  • Another objective of the present disclosure is directed towards identifying whether the aquatic species are undergoing any moulting process.
  • Another objective of the present disclosure is directed towards identifying whether the aquatic species are infected with white spot virus.
  • Another objective of the present disclosure is directed towards estimating the total number of aquatic species based on the images of the tray.
  • Another objective of the present disclosure is directed towards transmitting the images and videos captured by the image capturing device to the second computing device periodically for a clear understanding of the behavior, health, moulting process, and size of the aquatic species.
  • a system comprising a pole configured to secure a winch, an image capturing device, and a tray.
  • the winch configured to lift the tray in a predetermined direction and the tray configured to trap one or more aquatic species contained in a water storage tank, the image capturing device configured to capture one or more images and videos of one or more aquatic species trapped in the tray.
  • the system comprising a first computing device and a second computing device comprising a data analytics module configured to perform analytics on the one or more images and videos of one or more aquatic species to obtain the aquatic species monitoring data.
  • the aquatic species monitoring data transmitted to the first computing device and the second computing device over a network.
  • FIG. 1 is a diagram depicting a machine vision system for inspecting aquatic species, in accordance with one or more exemplary embodiments.
  • FIG. 2A is a diagram depicting a schematic representation of an image processing module 112, in accordance with one or more exemplary embodiments.
  • FIG. 2B is a diagram depicting a schematic representation of a data analytics module 110, in accordance with one or more exemplary embodiments.
  • FIG. 3 is a diagram depicting a schematic representation of a system for inspecting aquatic species, in accordance with one or more exemplary embodiments.
  • FIG. 4 is a diagram depicting a schematic representation of a system for inspecting aquatic species by connecting the Ethernet cable to the second computing device, in accordance with one or more exemplary embodiments.
  • FIG. 5 is a diagram depicting the image capturing device assembly, in accordance with one or more exemplary embodiments.
  • FIG. 6 is a diagram depicting the exploded view of the image capturing device, in accordance with one or more exemplary embodiments.
  • FIG. 7 is a flowchart depicting an exemplary method for analyzing the captured images and videos for inspecting the aquatic species, in accordance with one or more exemplary embodiments.
  • FIG. 8 is a flowchart depicting an exemplary method for analyzing the images and videos of the aquatic species by using data analytics module, in accordance with one or more exemplary embodiments.
  • FIG. 9 is a flowchart depicting an exemplary method for generating the aquatic species monitoring data, in accordance with one or more exemplary embodiments.
  • FIG. 10 is a block diagram illustrating the details of a digital processing system 1000 in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
  • FIG. 1 is a diagram depicting a machine vision system for inspecting aquatic species, in accordance with one or more exemplary embodiments.
  • the machine vision system 100 depicts a first computing device 102, a second computing device 104, an image capturing device 106, an image acquisition module 107, a network 108, an illuminating source 109, a data analytics module 110, an object 111, and an image processing module 112.
  • the first computing device 102 and the second computing device 104 may include, but not limited to, a computer workstation, an interactive kiosk, and a personal mobile computing device such as a digital assistant, a mobile phone, a laptop, and storage devices, backend servers hosting database and other software and so forth.
  • the first computing device 102 and the second computing device 104 may be operated by a farmer.
  • the image capturing device 106 may be configured to capture images and videos of the aquatic species present in the water storage tanks (not shown).
  • multiple image capturing devices may also be positioned to capture images and videos of the aquatic species present in the water storage tanks (not shown).
  • the aquatic species may include but not limited to, shrimps, prawns, crabs, snails, crustaceans and molluscs, and so forth.
  • the water storage tanks (not shown) may include but not limited to, ponds, lakes, rivers, and so forth.
  • the image capturing device 106 may also be configured to deliver the captured images and videos of the aquatic species to the first computing device and the second computing device over the network 108.
  • the image capturing device 106 may include but not limited to, near-infrared camera, infrared night-vision camera, thermal imaging camera, and so forth.
  • the image acquisition module 107 may be configured to transmit the images and videos of the aquatic species to the second computing device 104.
  • the illuminating source 109 may additionally be positioned between the image capturing device 106 and the tray (not shown) to capture clear images of the aquatic species.
  • the object 111 may be present inside the water storage tank (not shown).
  • the object 111 may include the aquatics species.
  • the machine vision technique may be programmed in the image acquisition module 107, the data analytics module 110, and the image processing module 112. Machine vision can be used for measuring size, shape, and color, aided by improvements in cameras, illumination settings, image processing, and analysis methods.
  • the machine vision technique may be configured to achieve high accuracy and efficiency, and wider application in visual quality detection of aquatic species.
  • the quality evaluation of shrimp may be completed automatically by using the machine vision.
  • the network 108 may include but not limited to, an Internet of Things (IoT) network, an Ethernet, a wireless local area network (WLAN), a wide area network (WAN), a Bluetooth Low Energy network, a ZigBee network, a Wi-Fi communication network (e.g., wireless high-speed internet), a cellular service such as 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, an RFID module, an NFC module, wired cables such as the world-wide-web based Internet, or a combination of such networks, using protocols such as Transport Control Protocol/Internet Protocol (TCP/IP) or device addresses.
  • the applications are mobile applications (e.g., Android applications, iOS applications) or software that offers the functionality of accessing mobile applications and viewing/processing interactive pages, implemented in the first computing device 102 and the second computing device 104, as will be apparent to one skilled in the relevant arts by reading the disclosure provided herein.
  • the image processing module 112 and the data analytics module 110 may be configured to perform analytics on the captured images and videos of the aquatic species to obtain the aquatic species monitoring data by considering the parameters.
  • the parameters may include but not limited to, size (length, width, area, and perimeter), shape (size dependent/size independent), color (mean and variance), contrast of object of interest to the background, length of object of interest, width of object of interest, and various patterns on the object of interest.
  • the aquatic species monitoring data may include but not limited to, biomass, health conditions, count, size, molting process, infections, health data, feeding habits, feeding percentage, stress, and so forth.
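The size, shape, and color parameters above can be measured directly from a segmented frame. The sketch below is illustrative only (the patent discloses no code): it assumes a binary mask isolating a single animal plus the matching grayscale image, and every function and field name here is invented for the example.

```python
import numpy as np

def extract_features(mask: np.ndarray, gray: np.ndarray) -> dict:
    """Measure size (length, width, area, perimeter) and color (mean,
    variance) parameters from a boolean mask of one animal and the
    corresponding grayscale frame. Assumes the mask is non-empty."""
    ys, xs = np.nonzero(mask)
    length = int(ys.max() - ys.min() + 1)   # bounding-box extent in pixels
    width = int(xs.max() - xs.min() + 1)
    area = int(mask.sum())                  # object area in pixels
    # A pixel is interior if all four 4-connected neighbours are foreground;
    # the remaining foreground pixels approximate the perimeter.
    p = np.pad(mask, 1)
    interior = (p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:])
    perimeter = int((mask & ~interior).sum())
    vals = gray[mask]
    return {"length_px": length, "width_px": width, "area_px": area,
            "perimeter_px": perimeter,
            "mean_intensity": float(vals.mean()),
            "intensity_var": float(vals.var())}
```

For a 4x4 solid square, this yields an area of 16 pixels and a perimeter of 12 boundary pixels; contour-based measures (e.g., from OpenCV) would be a natural production alternative.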
  • FIG. 2A is a diagram 200a depicting a schematic representation of an image processing module 112, in accordance with one or more exemplary embodiments.
  • the diagram 200a depicts a bus 201a, an image pre-treatment module 203, an image segmentation module 205, and a feature extraction module 207.
  • the bus 201a may include a path that permits communication among the components of the image processing module 112.
  • the image processing module 112 may be configured to pass the images and videos of the aquatic species to the image pre-treatment module 203.
  • the image pre-treatment module 203 may be configured to remove the noise and enhance the contrast of the image.
  • the image segmentation module 205 may be configured to perform the segmentation based on the threshold, region, gradient, and classification of the captured images.
  • the feature extraction module 207 may be configured to extract the feature of the images based on the parameters.
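In code, the pre-treatment and threshold-based segmentation stages might look like the following sketch. The patent does not name specific algorithms, so these are stand-in assumptions: a box blur and min-max stretch for noise removal and contrast enhancement, and Otsu's method for choosing the segmentation threshold.

```python
import numpy as np

def pretreat(gray: np.ndarray) -> np.ndarray:
    """Pre-treatment stand-in: 3x3 box blur (denoise), min-max stretch
    (contrast enhancement)."""
    p = np.pad(gray.astype(np.float64), 1, mode="edge")
    h, w = gray.shape
    blurred = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    lo, hi = blurred.min(), blurred.max()
    return ((blurred - lo) / max(hi - lo, 1e-9) * 255).astype(np.uint8)

def otsu_threshold(gray: np.ndarray) -> int:
    """Otsu's global threshold: maximize between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    sum_all = (np.arange(256) * hist).sum()
    best_t, best_var, cum, cum_mean = 0, -1.0, 0.0, 0.0
    for t in range(256):
        cum += hist[t]
        cum_mean += t * hist[t]
        if cum == 0 or cum == total:
            continue
        w0 = cum / total
        m0 = cum_mean / cum
        m1 = (sum_all - cum_mean) / (total - cum)
        var = w0 * (1 - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def segment(gray: np.ndarray) -> np.ndarray:
    """Threshold-based segmentation of the enhanced frame."""
    enhanced = pretreat(gray)
    return enhanced > otsu_threshold(enhanced)
```

Region-, gradient-, or classification-based segmentation, as the module contemplates, would slot in behind the same interface.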
  • FIG. 2B is a diagram 200b depicting a schematic representation of a data analytics module 110, in accordance with one or more exemplary embodiments.
  • the data analytics module 110 includes a bus 201b, a data capturing module 202, a data processing module 204, a data generating module 206, a network module 208, a cloud server 210, a data transmitting module 212, and a data conversion module 214.
  • the bus 201b may include a path that permits communication among the components of the data analytics module 110.
  • the data processing module 204 may be configured to identify the object of interest of the captured images and videos of the aquatic species.
  • the data processing module 204 may be configured to analyze the captured images and videos to obtain the aquatic species monitoring data.
  • the data processing module 204 may also be configured to threshold and isolate the object of interest and to calculate the dimensions of the object of interest.
  • the data conversion module 214 may be configured to convert the dimensions of the object of interest into a common unit of length (for example, centimeters or inches).
  • the data transmitting module 212 may be configured to transmit the dimensions of the aquatic species to the data generating module 206.
  • the data generating module 206 may be configured to generate and display the aquatic species monitoring data on the second computing device 104.
  • the network module 208 may be configured to deliver the aquatic species monitoring data to the first computing device 102 and the second computing device 104.
  • the cloud server 210 may be configured to update and store the monitoring data of the aquatic species.
  • the cloud server 210 may also be configured to retrieve the monitoring data of the aquatic species and deliver it to the first and second computing devices 102 and 104.
  • FIG. 3 is a diagram 300 depicting a schematic representation of a system for inspecting aquatic species, in accordance with one or more exemplary embodiments.
  • the schematic representation of a system 300 depicts a pole 302, a winch 304, a pulley 306, an Ethernet cable 308, a chain 310, an electrical wire 312, an image capturing device casing 314, a light source casing 316, connecting rods 318a-318c, a first light source 320, a tray 322, a pulley hook 323 and a mesh 324.
  • the pole 302 may be connected with the winch 304, the image capturing device 106, and the tray 322.
  • the winch 304 and the pulley 306 may be configured to allow moving the tray 322 into the water storage tank (not shown) and may also help to pull out the tray 322 from the water storage tank (not shown) in the desired direction.
  • the direction may include but not limited to, horizontal direction, vertical direction, linear direction, and so forth.
  • the image capturing device 106 and the second computing device 104 may be connected with the Ethernet cable 308 or with the universal serial bus port (not shown).
  • the Ethernet cable 308 is configured to deliver the images and videos of the aquatic species captured by the image capturing device 106.
  • the chain 310 may be configured to hold the image capturing device casing 314 along with the tray 322.
  • the image capturing device 106 may be positioned inside the image capturing device casing 314.
  • the image capturing device casing 314 may be configured to prevent the entry of water into the image capturing device 106.
  • the electrical wire 312 may be configured to provide electrical power to the first light source 320 and the second light source (not shown).
  • the first light source 320 and the second light source may include but not limited to, infrared light, visible light, ultraviolet light, x-rays and gamma radiation and so forth.
  • the light source casing 316 may be configured to prevent the entry of water into the second light source (not shown).
  • the connecting rods 318a-318c may be configured to hold the tray 322.
  • the first light source 320 may be positioned inside the tray 322 to provide lighting inside the tray 322 to capture clear images and videos for the inspection of the aquatic species.
  • the tray 322 may be configured to trap the aquatic species for the inspection.
  • the mesh 324 may serve as a reference for estimating the size of the aquatic species trapped in the tray 322.
  • the pulley hook 323 may be configured to hold the chain 310.
  • the mesh 324 may be positioned all over the tray 322 and it may also be positioned on the sides of the tray 322.
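Because the mesh has a known opening size, it can double as a calibration target for size estimation. The conversion below is an illustrative assumption (the disclosure does not state this method, and the numeric values in the usage note are invented):

```python
def pixels_per_cm(mesh_opening_px: float, mesh_opening_cm: float) -> float:
    """Image scale derived from the apparent size of one mesh opening
    whose physical size is known from the mesh specification."""
    return mesh_opening_px / mesh_opening_cm

def px_to_cm(length_px: float, scale: float) -> float:
    """Convert a pixel measurement to centimeters using that scale."""
    return length_px / scale
```

For example, if a 1.0 cm mesh opening spans 40 pixels, the scale is 40 px/cm, and a 480-pixel body length corresponds to 12 cm. This only holds for animals lying near the plane of the mesh.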
  • FIG. 4 is a diagram 400 depicting a schematic representation of a system for inspecting aquatic species by connecting the Ethernet cable to the second computing device, in accordance with one or more exemplary embodiments.
  • the schematic representation of a system 400 depicts a second light source 402, a cap 404, a first acrylic window 406, a second acrylic window 408, a POE switch 410, the light source casing 316, the image capturing device 106, the electrical wire 312, the Ethernet cable 308, and the second computing device 104.
  • the second light source 402 may be positioned inside the light source casing 316 to provide lighting on the tray 322.
  • the cap 404 may be configured to prevent water from getting inside the image capturing device 106.
  • the first acrylic window 406 may be configured to allow the lighting to fall on the tray 322.
  • the second acrylic window 408 may be configured to give the image capturing device 106 a clear view for capturing the images and videos in the water storage tanks (not shown).
  • the POE switch 410 may be configured to provide electrical power over Ethernet to the second computing device 104 by using the DC power source.
  • the second computing device 104 may be configured to receive the images and videos captured by the image capturing device 106 through the POE switch 410.
  • the second computing device 104 may be configured to inspect the received images and videos to provide the aquatic species monitoring data by using the data analytics module 110. For example, the data analytics module 110 may provide how much feed needs to be put in the tray 322 to feed the aquatic species.
  • FIG. 5 is a diagram 500 depicting the image capturing device assembly, in accordance with one or more exemplary embodiments.
  • the image capturing device assembly 500 depicts a cord-grip 502, retaining rings 504a-504c, the chain 310, the Ethernet cable 308, the image capturing device casing 314, the light source casing 316.
  • the cord-grip 502 may be configured to hold the Ethernet cable 308 and prevent water from entering the image capturing device casing 314.
  • the retaining rings 504a-504b may allow placing the image capturing device 106 into the light source casing 316.
  • the retaining ring 504c may be configured to hold the connecting rods 318a-318c and prevents the tray 322 from moving.
  • the image capturing device casing 314 may be configured to prevent the water from entering into the image capturing device 106.
  • the light source casing 316 may be configured to prevent the water from entering into the second light source 402.
  • FIG. 6 is a diagram 600 depicting the exploded view of the image capturing device, in accordance with one or more exemplary embodiments.
  • the exploded view of the image capturing device 600 depicts a spring 602, the cord-grip 502, the Ethernet cable 308, the image capturing device 106, the cap 404, and the image capturing device casing 314.
  • the spring 602 may be compressed by engaging the image capturing device 106 within the image capturing device casing 314 on one end and with the cap 404 on the other end.
  • the cord-grip 502 may be configured to hold the Ethernet cable 308 and prevent water from entering the image capturing device casing 314.
  • the cap 404 may be configured to secure the image capturing device 106 in the image capturing device casing 314.
  • FIG. 7 is a flowchart 700 depicting an exemplary method for analyzing the captured images and videos for inspecting the aquatic species, in accordance with one or more exemplary embodiments.
  • the method 700 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, and FIG. 6.
  • the method 700 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
  • the exemplary method 700 commences at step 702, immersing the tray along with the image capturing device into the water storage tank. Trapping the aquatic species in the tray, at step 704. Capturing the images and videos of the trapped aquatic species by using the image capturing device to inspect the aquatic species, at step 706. Analyzing the captured images and videos to obtain the aquatic species monitoring data by the data analytics module, at step 708. Displaying the obtained aquatic species monitoring data on the second computing device over the network, at step 710.
  • FIG. 8 is a flowchart 800 depicting an exemplary method for analyzing the images and videos of the aquatic species by using data analytics module, in accordance with one or more exemplary embodiments.
  • the method 800 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, and FIG. 7.
  • the method 800 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
  • the exemplary method 800 commences at step 802, acquiring the images and videos by using the data capturing module. Processing and analyzing the acquired images and videos by using the data processing module, at step 804. Obtaining the aquatic species monitoring data by analyzing the images and videos of the aquatic species by using the data generating module, at step 806. Delivering the obtained aquatic species monitoring data to the first and second computing devices by the network module, at step 808. Storing and updating the obtained aquatic species monitoring data in the cloud server, at step 810.
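The five steps of method 800 form a linear pipeline. A minimal sketch, with the patent's modules replaced by plain callables (all names here are hypothetical, not from the disclosure):

```python
def run_analytics_pipeline(frames, process, generate, deliver, store):
    """Acquire (802) -> process/analyze (804) -> generate monitoring
    data (806) -> deliver to computing devices (808) -> store in the
    cloud (810), one record per acquired frame."""
    records = []
    for frame in frames:                 # step 802: data capturing module
        measurements = process(frame)    # step 804: data processing module
        record = generate(measurements)  # step 806: data generating module
        deliver(record)                  # step 808: network module
        store(record)                    # step 810: cloud server
        records.append(record)
    return records
```

Keeping the stages as injected callables makes each module independently testable, which is one plausible way to structure the described system.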
  • FIG. 9 is a flowchart 900 depicting an exemplary method for generating the aquatic species monitoring data, in accordance with one or more exemplary embodiments.
  • the method 900 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, FIG. 7, and FIG. 8.
  • the method 900 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
  • the exemplary method 900 commences at step 902, identifying the object of interest in the captured images and videos by the data processing module. Thresholding and isolating the object of interest of the captured images and videos by the data processing module, at step 904. Calculating the dimensions (in pixels) of the object of interest by the data processing module, at step 906. Converting the dimensions into the predetermined length by the data conversion module, at step 908. Transmitting the converted dimensions to the data generating module, at step 910. Generating the aquatic species monitoring data by the data generating module, at step 912.
  • [0042] FIG. 10 is a block diagram illustrating the details of a digital processing system 1000 in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
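Steps 902 through 912 can be sketched compactly as follows, assuming a grayscale frame whose thresholded foreground is the object of interest and an externally supplied pixels-per-centimeter calibration (both are assumptions for illustration; the patent fixes neither):

```python
import numpy as np

def measure_object(gray: np.ndarray, threshold: int,
                   px_per_cm: float) -> dict:
    """Threshold/isolate the object of interest, take its pixel
    dimensions, and convert them to a common unit of length."""
    fg = gray > threshold                # steps 902/904: identify, threshold
    ys, xs = np.nonzero(fg)
    if ys.size == 0:
        return {}                        # nothing trapped in this frame
    h_px = int(ys.max() - ys.min() + 1)  # step 906: dimensions in pixels
    w_px = int(xs.max() - xs.min() + 1)
    return {"length_cm": h_px / px_per_cm,   # step 908: unit conversion
            "width_cm": w_px / px_per_cm}    # 910/912 would package this
```

A production version would isolate each connected component separately rather than treating the whole foreground as one object.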
  • the digital processing system 1000 may correspond to the first computing device 102 and the second computing device 104 (or any other system in which the various features disclosed above can be implemented).
  • Digital processing system 1000 may contain one or more processors such as a central processing unit (CPU) 1010, random access memory (RAM) 1020, secondary memory 1030, graphics controller 1060, display unit 1070, network interface 1080, and input interface 1090. All the components except display unit 1070 may communicate with each other over communication path 1050, which may contain several buses as is well known in the relevant arts. The components of Figure 10 are described below in further detail.
  • CPU 1010 may execute instructions stored in RAM 1020 to provide several features of the present disclosure.
  • CPU 1010 may contain multiple processing units, with each processing unit potentially being designed for a specific task.
  • CPU 1010 may contain only a single general-purpose processing unit.
  • RAM 1020 may receive instructions from secondary memory 1030 using communication path 1050.
  • RAM 1020 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 1025 and/or user programs 1026.
  • Shared environment 1025 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 1026.
  • Graphics controller 1060 generates display signals (e.g., in RGB format) to display unit 1070 based on data/instructions received from CPU 1010.
  • Display unit 1070 contains a display screen to display the images defined by the display signals.
  • Input interface 1090 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs.
  • Network interface 1080 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in Figure 1) connected to the network 108.
  • Secondary memory 1030 may contain hard drive 1035, flash memory 1036, and removable storage drive 1037. Secondary memory 1030 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 1000 to provide several features in accordance with the present disclosure.
  • Some or all of the data and instructions may be provided on removable storage unit 1040, and the data and instructions may be read and provided by removable storage drive 1037 to CPU 1010.
  • Floppy drive, magnetic tape drive, CD-ROM drive, DVD Drive, Flash memory, removable memory chip (PCMCIA Card, EEPROM) are examples of such removable storage drive 1037.
  • Removable storage unit 1040 may be implemented using medium and storage format compatible with removable storage drive 1037 such that removable storage drive 1037 can read the data and instructions.
  • removable storage unit 1040 includes a computer readable (storage) medium having stored therein computer software and/or data.
  • the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
  • the term "computer program product" is used to generally refer to removable storage unit 1040 or a hard disk installed in hard drive 1035. These computer program products are means for providing software to digital processing system 1000.
  • CPU 1010 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.
  • Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 1030.
  • Volatile media includes dynamic memory, such as RAM 1020.
  • Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus (communication path) 1050.
  • transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Zoology (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Animal Husbandry (AREA)
  • Multimedia (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

Exemplary embodiments of the present disclosure are directed towards a machine vision system for inspecting the aquatic species, comprising a pole configured to secure a winch, an image capturing device and a tray, the winch configured to lift the tray in a predetermined direction and the tray configured to trap aquatic species contained in a water storage tank, the image capturing device configured to capture images and videos of aquatic species trapped in the tray; a first computing device and a second computing device comprising a data analytics module configured to perform analytics on the images and videos of aquatic species to obtain the aquatic species monitoring data, the aquatic species monitoring data transmitted to the first computing device and the second computing device over a network.

Description

“MACHINE VISION SYSTEM AND METHODS FOR INSPECTING AQUATIC
SPECIES”
TECHNICAL FIELD
[001] The disclosed subject matter relates generally to aquatic species inspection. More particularly, the present disclosure relates to a machine vision system and methods for inspecting aquatic species by using an image capturing device.
BACKGROUND
[002] Aquaculture has become an important part of modern agriculture. The farming of aquatic species (fish and shrimp, for example) is an important part of the aquaculture industry. In aquaculture production, there is a need to check the growth conditions of the aquatic species periodically. Many factors affect the throughput, quality, and yield of the aquatic species. Netting is the known practice for checking the count of aquatic species. At present there is no solution other than doing everything manually using netting, which contaminates the ponds and the water storage tanks. The present invention aims to improve the accuracy of image inspection by using controlled lighting to reduce the influence of disturbance from different light sources.
[003] In the light of the aforementioned discussion, there exists a need for a system and method that would overcome or ameliorate the above-mentioned disadvantages.
SUMMARY
[004] The following presents a simplified summary of the disclosure in order to provide a basic understanding of the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
[005] An objective of the present disclosure is directed towards identifying the size and count of the aquatic species. [006] Another objective of the present disclosure is directed towards identifying whether the aquatic species are undergoing any moulting process.
[007] Another objective of the present disclosure is directed towards identifying whether the aquatic species are infected with white spot virus.
[008] Another objective of the present disclosure is directed towards estimating the total number of aquatic species based on the images of the tray.
[009] Another objective of the present disclosure is directed towards transmitting the images and videos captured by the image capturing device to the second computing device periodically for a clear understanding of the behavior, health, moulting process, and size of the aquatic species.
[0010] According to an exemplary aspect, a system comprises a pole configured to secure a winch, an image capturing device, and a tray.
[0011] According to another exemplary aspect, the winch configured to lift the tray in a predetermined direction and the tray configured to trap one or more aquatic species contained in a water storage tank, the image capturing device configured to capture one or more images and videos of one or more aquatic species trapped in the tray.
[0012] According to another exemplary aspect, the system comprising a first computing device and a second computing device comprising a data analytics module configured to perform analytics on the one or more images and videos of one or more aquatic species to obtain the aquatic species monitoring data.
[0013] According to another exemplary aspect, the aquatic species monitoring data transmitted to the first computing device and the second computing device over a network.
BRIEF DESCRIPTION OF THE DRAWINGS [0014] FIG. 1 is a diagram depicting a machine vision system for inspecting aquatic species, in accordance with one or more exemplary embodiments.
[0015] FIG. 2A is a diagram depicting a schematic representation of an image processing module 112, in accordance with one or more exemplary embodiments.
[0016] FIG. 2B is a diagram depicting a schematic representation of a data analytics module 110, in accordance with one or more exemplary embodiments.
[0017] FIG. 3 is a diagram depicting a schematic representation of a system for inspecting aquatic species, in accordance with one or more exemplary embodiments.
[0018] FIG. 4 is a diagram depicting a schematic representation of a system for inspecting aquatic species by connecting the Ethernet cable to the second computing device, in accordance with one or more exemplary embodiments.
[0019] FIG. 5 is a diagram depicting the image capturing device assembly, in accordance with one or more exemplary embodiments.
[0020] FIG. 6 is a diagram depicting the exploded view of the image capturing device, in accordance with one or more exemplary embodiments.
[0021] FIG. 7 is a flowchart depicting an exemplary method for analyzing the captured images and videos for inspecting the aquatic species, in accordance with one or more exemplary embodiments.
[0022] FIG. 8 is a flowchart depicting an exemplary method for analyzing the images and videos of the aquatic species by using data analytics module, in accordance with one or more exemplary embodiments.
[0023] FIG. 9 is a flowchart depicting an exemplary method for generating the aquatic species monitoring data, in accordance with one or more exemplary embodiments. [0024] FIG. 10 is a block diagram illustrating the details of a digital processing system 1000 in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0025] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
[0026] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of the terms “first”, “second”, “third”, and so forth, herein does not denote any order, quantity, or importance, but rather is used to distinguish one element from another.
[0027] Referring to FIG. 1, a machine vision system for inspecting aquatic species is depicted, in accordance with one or more exemplary embodiments. The machine vision system 100 depicts a first computing device 102, a second computing device 104, an image capturing device 106, an image acquisition module 107, a network 108, an illuminating source 109, a data analytics module 110, an object 111, and an image processing module 112. The first computing device 102 and the second computing device 104 may include, but are not limited to, a computer workstation, an interactive kiosk, and a personal mobile computing device such as a digital assistant, a mobile phone, a laptop, storage devices, backend servers hosting databases and other software, and so forth. The first computing device 102 and the second computing device 104 may be operated by a farmer. The image capturing device 106 may be configured to capture images and videos of the aquatic species present in the water storage tanks (not shown). For example, multiple image capturing devices may also be positioned to capture images and videos of the aquatic species present in the water storage tanks (not shown). The aquatic species may include, but are not limited to, shrimps, prawns, crabs, snails, crustaceans, molluscs, and so forth. The water storage tanks (not shown) may include, but are not limited to, ponds, lakes, rivers, and so forth. The image capturing device 106 may also be configured to deliver the captured images and videos of the aquatic species to the first computing device and the second computing device over the network 108. The image capturing device 106 may include, but is not limited to, a near-infrared camera, an infrared night-vision camera, a thermal imaging camera, and so forth. The image acquisition module 107 may be configured to transmit the images and videos of the aquatic species to the second computing device 104.
The illuminating source 109 may additionally be positioned between the image capturing device 106 and the tray (not shown) to capture clear images of the aquatic species. The object 111 may be present inside the water storage tank (not shown). The object 111 may include the aquatic species. The machine vision technique may be programmed in the image acquisition module 107, the data analytics module 110, and the image processing module 112. Machine vision may be used for measuring size, shape, and color, drawing on improvements in cameras, illumination settings, image processing, and analysis methods. The machine vision technique may be configured to achieve high accuracy and efficiency, and wider application in visual quality detection of aquatic species. The quality evaluation of shrimp may be completed automatically by using machine vision.
[0028] The network 108 may include, but is not limited to, an Internet of things (IoT network devices), an Ethernet, a wireless local area network (WLAN), or a wide area network (WAN), a Bluetooth low energy network, a ZigBee network, a WIFI communication network e.g., the wireless high speed internet, or a combination of networks, a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, a RFID module, a NFC module, wired cables, such as the world-wide-web based Internet, or other types of networks may include Transport Control Protocol/Internet Protocol (TCP/IP) or device addresses (e.g., network-based MAC addresses, or those provided in a proprietary networking protocol, such as Modbus TCP, or by using appropriate data feeds to obtain data from various web services, including retrieving XML data from an HTTP address, then traversing the XML for a particular node) and so forth, without limiting the scope of the present disclosure. [0029] The applications (for example, the image processing module 112 and the data analytics module 110) are mobile applications (for example, Android applications, iOS applications), or software that offers the functionality of accessing mobile applications and viewing/processing interactive pages, implemented in the first computing device 102 and the second computing device 104, as will be apparent to one skilled in the relevant arts by reading the disclosure provided herein. The image processing module 112 and the data analytics module 110 may be configured to perform analytics on the captured images and videos of the aquatic species to obtain the aquatic species monitoring data by considering the parameters.
The parameters may include, but are not limited to, size (length, width, area, and perimeter), shape (size-dependent/size-independent), color (mean and variance), contrast of the object of interest to the background, length of the object of interest, width of the object of interest, and various patterns on the object of interest. The aquatic species monitoring data may include, but is not limited to, biomass, health conditions, count, size, moulting process, infections, health data, feeding habits, feeding percentage, stress, and so forth.
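As an illustration of the color and contrast parameters listed above, the sketch below computes the mean, variance, and object-to-background contrast of a grayscale frame, given a binary mask marking the object of interest. The disclosure does not specify formulas or an implementation, so the function name, the formulas, and the sample values here are assumptions.

```python
def color_features(image, mask):
    """Mean and variance of the object's intensity, plus its contrast to the
    background, for a grayscale image and a binary foreground mask."""
    fg = [p for row, mrow in zip(image, mask) for p, m in zip(row, mrow) if m]
    bg = [p for row, mrow in zip(image, mask) for p, m in zip(row, mrow) if not m]
    mean_fg = sum(fg) / len(fg)
    var_fg = sum((p - mean_fg) ** 2 for p in fg) / len(fg)
    contrast = mean_fg - sum(bg) / len(bg)   # object vs. background brightness
    return {"mean": mean_fg, "variance": var_fg, "contrast": contrast}

# Tiny synthetic frame: bright object pixels on a dark background.
image = [[10, 10], [200, 220]]
mask = [[0, 0], [1, 1]]
feats = color_features(image, mask)
```

Shape parameters such as area and perimeter could be derived from the same mask in a similar fashion.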
[0030] Referring to FIG. 2A, a diagram 200a depicts a schematic representation of an image processing module 112, in accordance with one or more exemplary embodiments. The diagram 200a depicts a bus 201a, an image pre-treatment module 203, an image segmentation module 205, and a feature extraction module 207. The bus 201a may include a path that permits communication among the components of the image processing module 112. The image processing module 112 may be configured to pass the images and videos of the aquatic species to the image pre-treatment module 203. The image pre-treatment module 203 may be configured to remove noise and enhance the contrast of the image. The image segmentation module 205 may be configured to perform segmentation based on the threshold, region, gradient, and classification of the captured images. The feature extraction module 207 may be configured to extract features of the images based on the parameters.
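A minimal sketch of the pre-treatment and threshold-based segmentation stages described above, with the image modeled as a plain 2-D list of grayscale values (0-255). The contrast-stretch formula, the fixed threshold, and the synthetic frame are illustrative assumptions, not the actual implementation.

```python
def stretch_contrast(image):
    """Pre-treatment: linearly stretch intensities to the full 0-255 range."""
    lo = min(min(row) for row in image)
    hi = max(max(row) for row in image)
    if hi == lo:                       # flat image: nothing to stretch
        return [row[:] for row in image]
    return [[(p - lo) * 255 // (hi - lo) for p in row] for row in image]

def segment_by_threshold(image, threshold=128):
    """Segmentation: mark pixels brighter than the threshold as foreground."""
    return [[1 if p > threshold else 0 for p in row] for row in image]

# Tiny synthetic frame: a bright shrimp-like blob on a dark background.
frame = [
    [10, 12, 11, 10],
    [11, 90, 95, 12],
    [10, 92, 96, 11],
    [12, 11, 10, 10],
]
enhanced = stretch_contrast(frame)
mask = segment_by_threshold(enhanced)  # binary mask of the object of interest
```

The resulting binary mask is the input assumed by the counting and measurement sketches that a feature extraction stage would consume.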
[0031] Referring to FIG. 2B, a diagram 200b depicts a schematic representation of a data analytics module 110, in accordance with one or more exemplary embodiments. The data analytics module 110 includes a bus 201b, a data capturing module 202, a data processing module 204, a data generating module 206, a network module 208, a cloud server 210, a data transmitting module 212, and a data conversion module 214. The bus 201b may include a path that permits communication among the components of the data analytics module 110. The data processing module 204 may be configured to identify the object of interest in the captured images and videos of the aquatic species. The data processing module 204 may be configured to analyze the captured images and videos to obtain the aquatic species monitoring data. The data processing module 204 may also be configured to threshold and isolate the object of interest and to calculate the dimensions of the object of interest. The data conversion module 214 may be configured to convert the dimensions of the object of interest into common length units (for example, cm or in). The data transmitting module 212 may be configured to transmit the dimensions of the aquatic species to the data generating module 206. The data generating module 206 may be configured to generate and display the aquatic species monitoring data on the second computing device 104. The network module 208 may be configured to deliver the aquatic species monitoring data to the first computing device 102 and the second computing device 104. The cloud server 210 may be configured to update and store the monitoring data of the aquatic species. The cloud server 210 may also be configured to retrieve the monitoring data of the aquatic species and deliver it to the first and second computing devices 102 and 104.
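One monitoring datum named above is the count of aquatic species. A sketch of how a data processing stage might count trapped animals from a binary segmentation mask: each 4-connected foreground blob is treated as one object of interest, with a minimum-size filter to discard noise. The flood-fill approach, the mask, and the size filter are assumptions for illustration only.

```python
def count_objects(mask, min_pixels=2):
    """Count 4-connected foreground blobs with at least min_pixels pixels."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # flood-fill this blob and measure its size
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if size >= min_pixels:   # ignore single-pixel specks as noise
                    count += 1
    return count

# Two blobs plus a single-pixel speck (filtered out as noise).
mask = [
    [1, 1, 0, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [1, 0, 0, 0, 0],
]
species_count = count_objects(mask)
```

A production system would likely use a vision library's connected-component labeling instead of this hand-rolled flood fill; the structure of the computation is the same.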
[0032] Referring to FIG. 3, a diagram 300 depicts a schematic representation of a system for inspecting aquatic species, in accordance with one or more exemplary embodiments. The schematic representation of the system 300 depicts a pole 302, a winch 304, a pulley 306, an Ethernet cable 308, a chain 310, an electrical wire 312, an image capturing device casing 314, a light source casing 316, connecting rods 318a-318c, a first light source 320, a tray 322, a pulley hook 323, and a mesh 324. The pole 302 may be connected with the winch 304, the image capturing device 106, and the tray 322. The winch 304 and the pulley 306 may be configured to allow moving the tray 322 into the water storage tank (not shown) and may also help to pull the tray 322 out of the water storage tank (not shown) in the desired direction. The direction may include, but is not limited to, a horizontal direction, a vertical direction, a linear direction, and so forth. The image capturing device 106 and the second computing device 104 may be connected with the Ethernet cable 308 or with the universal serial bus port (not shown). The Ethernet cable 308 is configured to deliver the images and videos of the aquatic species captured by the image capturing device 106. The chain 310 may be configured to hold the image capturing device casing 314 along with the tray 322. The image capturing device 106 may be positioned inside the image capturing device casing 314. The image capturing device casing 314 may be configured to prevent the entry of water into the image capturing device 106. The electrical wire 312 may be configured to provide electrical power to the first light source 320 and the second light source (not shown). The first light source 320 and the second light source may include, but are not limited to, infrared light, visible light, ultraviolet light, x-rays, gamma radiation, and so forth. The light source casing 316 may be configured to prevent the entry of water into the second light source (not shown).
The connecting rods 318a-318c may be configured to hold the tray 322. The first light source 320 may be positioned inside the tray 322 to provide lighting inside the tray 322 to capture clear images and videos for the inspection of the aquatic species. The tray 322 may be configured to trap the aquatic species for the inspection. The mesh 324 may be configured to serve as a reference for estimating the size of the aquatic species trapped in the tray 322. The pulley hook 323 may be configured to hold the chain 310. The mesh 324 may be positioned all over the tray 322, and it may also be positioned on the sides of the tray 322.
[0033] Referring to FIG. 4, a diagram 400 depicts a schematic representation of a system for inspecting aquatic species by connecting the Ethernet cable to the second computing device, in accordance with one or more exemplary embodiments. The schematic representation of the system 400 depicts a second light source 402, a cap 404, a first acrylic window 406, a second acrylic window 408, a POE switch 410, the light source casing 316, the image capturing device 106, the electrical wire 312, the Ethernet cable 308, and the second computing device 104. The second light source 402 may be positioned inside the light source casing 316 to provide lighting on the tray 322. The cap 404 may be configured to prevent water from getting inside the image capturing device 106. The first acrylic window 406 may be configured to allow the light to fall on the tray 322. The second acrylic window 408 may be configured to give the image capturing device 106 a clear view to capture the images and videos in the water storage tanks (not shown). The POE switch 410 may be configured to provide electrical power over Ethernet to the second computing device 104 by using the DC power source. The second computing device 104 may be configured to receive the images and videos captured by the image capturing device 106 through the POE switch 410. The second computing device 104 may be configured to inspect the received images and videos to provide the aquatic species monitoring data by using the data analytics module 110. For example, the data analytics module 110 may indicate how much feed needs to be put in the tray 322 to feed the aquatic species.
[0034] Referring to FIG. 5, a diagram 500 depicts the image capturing device assembly, in accordance with one or more exemplary embodiments. The image capturing device assembly 500 depicts a cord-grip 502, retaining rings 504a-504c, the chain 310, the Ethernet cable 308, the image capturing device casing 314, and the light source casing 316. The cord-grip 502 may be configured to hold the Ethernet cable 308 and prevent water from entering the image capturing device casing 314. The retaining rings 504a-504b may allow placing the image capturing device 106 into the light source casing 316. The retaining ring 504c may be configured to hold the connecting rods 318a-318c and prevent the tray 322 from moving. The image capturing device casing 314 may be configured to prevent water from entering the image capturing device 106. The light source casing 316 may be configured to prevent water from entering the second light source 402.
[0035] Referring to FIG. 6, a diagram 600 depicts the exploded view of the image capturing device, in accordance with one or more exemplary embodiments. The exploded view of the image capturing device 600 depicts a spring 602, the cord-grip 502, the Ethernet cable 308, the image capturing device 106, the cap 404, and the image capturing device casing 314. The spring 602 may be compressed by engaging the image capturing device 106 within the image capturing device casing 314 on one end and with the cap 404 on the other end. The cord-grip 502 may be configured to hold the Ethernet cable and prevent water from entering the image capturing device casing 314. The cap 404 may be configured to secure the image capturing device 106 in the image capturing device casing 314.
[0036] Referring to FIG. 7, a flowchart 700 depicts an exemplary method for analyzing the captured images and videos for inspecting the aquatic species, in accordance with one or more exemplary embodiments. As an option, the method 700 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, and FIG. 6. However, the method 700 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
[0037] The exemplary method 700 commences at step 702, immersing the tray along with the image capturing device into the water storage tank. Trapping the aquatic species in the tray, at step 704. Capturing the images and videos of the trapped aquatic species by using the image capturing device to inspect the aquatic species, at step 706. Analyzing the captured images and videos to obtain the aquatic species monitoring data by the data analytics module, at step 708. Displaying the obtained aquatic species monitoring data on the second computing device over the network, at step 710.
[0038] Referring to FIG. 8, a flowchart 800 depicts an exemplary method for analyzing the images and videos of the aquatic species by using the data analytics module, in accordance with one or more exemplary embodiments. As an option, the method 800 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, and FIG. 7. However, the method 800 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
[0039] The exemplary method 800 commences at step 802, acquiring the images and videos by using the data capturing module. Processing and analyzing the acquired images and videos by using the data processing module, at step 804. Obtaining the aquatic species monitoring data by analyzing the images and videos of the aquatic species by using data generating module, at step 806. Delivering the obtained aquatic species monitoring data to the first and second computing devices by the network module, at step 808. Storing and updating the obtained aquatic species monitoring data in the cloud server, at step 810.
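The steps of method 800 can be sketched as a minimal in-memory pipeline: acquire a frame, segment it, derive simple monitoring figures, and store the result. All function names, the synthetic frame, the threshold, and the list standing in for the cloud server are illustrative assumptions; the disclosure does not specify an implementation.

```python
def acquire_frame():
    """Step 802: data capturing module returns one grayscale frame."""
    return [[10, 10, 10], [10, 200, 200], [10, 200, 200]]

def process_frame(frame, threshold=100):
    """Step 804: data processing module segments the frame into a binary mask."""
    return [[1 if p > threshold else 0 for p in row] for row in frame]

def generate_monitoring_data(mask):
    """Step 806: data generating module derives simple monitoring figures."""
    foreground = sum(sum(row) for row in mask)
    total = len(mask) * len(mask[0])
    return {"foreground_pixels": foreground,
            "coverage_pct": round(100 * foreground / total, 1)}

cloud_store = []                       # step 810: stand-in for the cloud server

record = generate_monitoring_data(process_frame(acquire_frame()))
cloud_store.append(record)             # steps 808-810: deliver and store
```

In a deployed system, step 808 would send the record to the first and second computing devices over the network before step 810 persists it.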
[0040] Referring to FIG. 9, a flowchart 900 depicts an exemplary method for generating the aquatic species monitoring data, in accordance with one or more exemplary embodiments. As an option, the method 900 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, FIG. 7, and FIG. 8. However, the method 900 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
[0041] The exemplary method 900 commences at step 902, identifying the object of interest in the captured images and videos by the data processing module. Thresholding and isolating the object of interest in the captured images and videos by the data processing module, at step 904. Calculating the dimensions (in pixels) of the object of interest by the data processing module, at step 906. Converting the dimensions into the predetermined length units by the data conversion module, at step 908. Transmitting the converted dimensions to the data generating module, at step 910. Generating the aquatic species monitoring data by the data generating module, at step 912. [0042] Referring to FIG. 10, a block diagram illustrates the details of a digital processing system 1000 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. The digital processing system 1000 may correspond to the first computing device 102 and the second computing device 104 (or any other system in which the various features disclosed above can be implemented).
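Steps 902-912 can be sketched as follows: measure the object of interest in pixels from its bounding box, then convert to centimetres using the known pitch of the tray mesh as a scale reference. The mesh-pitch calibration values, the mask, and the function names are assumptions for illustration; the disclosure only states that pixel dimensions are converted to a common length.

```python
def bounding_box_pixels(mask):
    """Steps 902-906: bounding-box height/width of the foreground, in pixels."""
    ys = [r for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    xs = [c for row in mask for c, v in enumerate(row) if v]
    return max(ys) - min(ys) + 1, max(xs) - min(xs) + 1

def pixels_to_cm(pixels, mesh_pitch_cm=1.0, mesh_pitch_pixels=20):
    """Step 908: convert pixels to cm using the mesh pitch seen in the image
    (assumed: one 1 cm mesh cell spans 20 pixels at this camera distance)."""
    return pixels * mesh_pitch_cm / mesh_pitch_pixels

# Binary mask of one isolated animal (from the segmentation stage).
mask = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0],
]
height_px, width_px = bounding_box_pixels(mask)   # 2 rows tall, 4 cols wide
length_cm = pixels_to_cm(width_px)                # longer axis, step 908
```

Steps 910-912 would then hand `length_cm` to the data generating module for inclusion in the monitoring data.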
[0043] Digital processing system 1000 may contain one or more processors such as a central processing unit (CPU) 1010, random access memory (RAM) 1020, secondary memory 1030, graphics controller 1060, display unit 1070, network interface 1080, and input interface 1090. All the components except display unit 1070 may communicate with each other over communication path 1050, which may contain several buses as is well known in the relevant arts. The components of Figure 10 are described below in further detail.
[0044] CPU 1010 may execute instructions stored in RAM 1020 to provide several features of the present disclosure. CPU 1010 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 1010 may contain only a single general-purpose processing unit.
[0045] RAM 1020 may receive instructions from secondary memory 1030 using communication path 1050. RAM 1020 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 1025 and/or user programs 1026. Shared environment 1025 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 1026.
[0046] Graphics controller 1060 generates display signals (e.g., in RGB format) to display unit 1070 based on data/instructions received from CPU 1010. Display unit 1070 contains a display screen to display the images defined by the display signals. Input interface 1090 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 1080 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in Figure 1) connected to the network 108. [0047] Secondary memory 1030 may contain hard drive 1035, flash memory 1036, and removable storage drive 1037. Secondary memory 1030 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 1000 to provide several features in accordance with the present disclosure.
[0048] Some or all of the data and instructions may be provided on removable storage unit 1040, and the data and instructions may be read and provided by removable storage drive 1037 to CPU 1010. Floppy drive, magnetic tape drive, CD-ROM drive, DVD Drive, Flash memory, removable memory chip (PCMCIA Card, EEPROM) are examples of such removable storage drive 1037.
[0049] Removable storage unit 1040 may be implemented using medium and storage format compatible with removable storage drive 1037 such that removable storage drive 1037 can read the data and instructions. Thus, removable storage unit 1040 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
[0050] In this document, the term "computer program product" is used to generally refer to removable storage unit 1040 or hard disk installed in hard drive 1035. These computer program products are means for providing software to digital processing system 1000. CPU 1010 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.
[0051] The term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 1030. Volatile media includes dynamic memory, such as RAM 1020. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge.
[0052] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus (communication path) 1050. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0053] Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0054] Furthermore, the described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the above description, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments of the disclosure.
[0055] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles and spirit of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.
[0056] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.

Claims

CLAIMS
I claim:
1. A machine vision system for inspecting the aquatic species, comprising: a pole configured to secure a winch, an image capturing device, and a tray, whereby the winch is configured to lift the tray in a predetermined direction and the tray is configured to trap one or more aquatic species contained in a water storage tank, and the image capturing device is configured to capture one or more images and videos of the one or more aquatic species trapped in the tray; and a first computing device and a second computing device comprising a data analytics module and an image processing module, whereby the data analytics module and the image processing module are configured to perform analytics on the one or more images and videos of the one or more aquatic species to obtain aquatic species monitoring data, the aquatic species monitoring data being transmitted to the first computing device and the second computing device over a network.
2. The system of claim 1, wherein the image capturing device comprises an image acquisition module configured to transmit the captured images and videos to the first and second computing devices.
3. The system of claim 1, wherein the second computing device is connected with an Ethernet cable configured to deliver the images and videos of the aquatic species captured by the image capturing device.
4. The system of claim 3, wherein the Ethernet cable comprises a POE switch configured to provide electrical power to the second computing device over the network.
5. The system of claim 1, wherein the image capturing device comprises an image capturing device casing configured to prevent the water from entering into the image capturing device.
6. The system of claim 1, wherein the image capturing device is secured in a light source casing configured to prevent the water from entering into a second light source.
7. The system of claim 6, wherein the second light source is positioned inside the light source casing and configured to provide lighting on the tray.
8. The system of claim 7, wherein the light source casing further comprises a retaining ring configured to connect with one or more connecting rods to hold the tray.
9. The system of claim 1, wherein the data analytics module comprises a data capturing module configured to acquire the one or more images and videos of the one or more aquatic species.
10. The system of claim 1, wherein the data analytics module comprises a data processing module configured to analyze the one or more images and videos of the one or more aquatic species.
11. The system of claim 10, wherein the data processing module is further configured to identify the object of interest in the captured images and videos of the aquatic species.
12. The system of claim 1, wherein the data analytics module comprises a data generating module configured to generate and display the aquatic species monitoring data on the second computing device.
13. The system of claim 1, wherein the data analytics module comprises a network module configured to transmit the aquatic species monitoring data to a cloud server.
14. The system of claim 13, wherein the cloud server is configured to store and update the aquatic species monitoring data and deliver it to the first computing device.
15. A method for inspecting aquatic species, comprising: immersing a tray together with an image capturing device into a water storage tank; trapping the aquatic species in the tray contained in the water storage tank; capturing images and videos of the trapped aquatic species using the image capturing device to inspect the aquatic species; analyzing the captured images and videos with a data analytics module and an image processing module to obtain aquatic species monitoring data, the aquatic species monitoring data being transmitted to a first computing device and a second computing device over a network; and displaying the obtained aquatic species monitoring data on a second end user device over the network.
PCT/IB2020/050626 2019-03-20 2020-01-28 Machine vision system and methods for inspecting aquatic species WO2020188364A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201941010984 2019-03-20
IN201941010984 2019-03-20

Publications (1)

Publication Number Publication Date
WO2020188364A1 true WO2020188364A1 (en) 2020-09-24

Family

ID=72520542

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/050626 WO2020188364A1 (en) 2019-03-20 2020-01-28 Machine vision system and methods for inspecting aquatic species

Country Status (1)

Country Link
WO (1) WO2020188364A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118052377A (en) * 2024-04-16 2024-05-17 中国环境监测总站 Water ecological comprehensive evaluation method and system based on automatic inversion of water habitat

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100940000B1 (en) * 2008-04-08 2010-02-03 엔엔티시스템즈(주) Underwater monitoring method and apparatus thereof
CN106165661A (en) * 2016-07-08 2016-11-30 太仓市新志杰电子科技有限公司 A kind of green Aquiculture Monitoring System
CN106659136A (en) * 2014-03-04 2017-05-10 绿玛瑙有限公司 Systems and methods for cultivating and distributing aquatic organisms

Similar Documents

Publication Publication Date Title
WO2020134255A1 (en) Method for monitoring growth situations of fishes based on machine vision
US11328525B2 (en) Method for calculating deviation relations of a population
WO2020006899A1 (en) Method and device for measuring weight of livestock
JP6823449B2 (en) How to operate image detection devices and computer program products (visual object and event detection and prediction systems using saccades)
US9536304B2 (en) Determining pathogens based on an image of somatic cells in a fluid sample
CN110287902B (en) Livestock and poultry survival detection method, device, equipment and computer program product
US11694331B2 (en) Capture and storage of magnified images
Majewski et al. Multipurpose monitoring system for edible insect breeding based on machine learning
US20180213753A1 (en) Systems and methods for larval fish enumeration and growth monitoring
WO2020188364A1 (en) Machine vision system and methods for inspecting aquatic species
Salvucci et al. Fast olive quality assessment through RGB images and advanced convolutional neural network modeling
CN112906488A (en) Security protection video quality evaluation system based on artificial intelligence
CN104297290B (en) The Sex in Silkworm Cocoons recognition methods of heated type infrared imaging and device thereof
Rakesh et al. An Overview on Machine Learning Techniques for Identification of Diseases in Aquaculture
US20230100268A1 (en) Quantifying biotic damage on plant leaves, by convolutional neural networks
KR20230117244A (en) Systems and Methods for Identifying Cancer in Companion Animals
Obu et al. Crop Disease Detection using Yolo V5 on Raspberry Pi
Paolillo et al. Automated image analysis to assess hygienic behaviour of honeybees
CN109934045B (en) Pedestrian detection method and device
Jaballah et al. A deep learning approach to detect and identify live freshwater macroinvertebrates
Muvva et al. Automatic identification of broiler mortality using image processing technology
Curlis et al. Batch-mask: An automated mask r-cnn workflow to isolate non-standard biological specimens for color pattern analysis
Pérez et al. High-resolution X-Ray imaging of small animal samples based on Commercial-Off-The-Shelf CMOS image sensors
KR102263544B1 (en) Artificial intelligence-based request information retrieval device that retrieves image information of heterogeneous imaging devices using artificial intelligence and retrieves user request information
Gunes et al. Identification of honey bee (Apis mellifera) larvae in the hive with faster R-CNN for royal jelly production

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20774257

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20774257

Country of ref document: EP

Kind code of ref document: A1