WO2017001971A1 - Method and system for measuring biomass volume and weight of a fish farming tank - Google Patents


Info

Publication number
WO2017001971A1
Authority
WO
WIPO (PCT)
Prior art keywords
tank
underwater
biomass
sls
fish
Prior art date
Application number
PCT/IB2016/053682
Other languages
French (fr)
Inventor
Paulo Jorge MOREIRA TORRES DE AZEVEDO
Flávio Wilson MOREIRA LOPES
Hugo Miguel GOMES DA SILVA
José Miguel SOARES DE ALMEIDA
Eduardo Alexandre PEREIRA DA SILVA
Original Assignee
Antípoda, Lda
Inesc Tec - Instituto De Engenharia De Sistemas E Computadores, Tecnologia E Ciência
Application filed by Antípoda, Lda and Inesc Tec - Instituto De Engenharia De Sistemas E Computadores, Tecnologia E Ciência
Publication of WO2017001971A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518: Projection by scanning of the object
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K: ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00: Culture of aquatic animals
    • A01K61/90: Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • A01K61/95: Sorting, grading, counting or marking live aquatic animals, e.g. sex determination specially adapted for fish
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques

Definitions

  • An embodiment further comprises using rotary encoders for encoding the displacement of the platform over the tank.
  • An embodiment further comprises displacing the structured light vision system, SLS, over the area of the tank where the biomass is to be estimated, in U-shaped movements.
  • An embodiment further comprises, when using the data processing system for processing the captured underwater images of the laser line beams:
  • An embodiment further comprises, when using the data processing system for calculating the underwater depth profile of the tank:
  • Figure 1 Schematic representation of an embodiment of the hardware architecture of the fish farming autonomous calibration system.
  • Figure 2a, 2b Schematic representation of the mechanical platform.
  • Figure 3 Schematic representation of the mechanical platform architecture.
  • Figure 4 Images of the structured light vision system, wherein (a) pictures the structured light vision system (camera and red lasers) and (b) pictures a snapshot from the structured light vision system, in this case a red laser line deformed by the presence of a fish.
  • Figure 5 Schematic representation of the SLS software architecture.
  • Figure 6 Images of the SLS Line laser detection.
  • Figure 7 Image of an experimental setup laboratory tank.
  • Figure 8 Point cloud 3D scan result of the fish farming calibration in the laboratory tank.
  • Figure 9 Point cloud 3D scan result of the fish farming in the laboratory tank.
  • Figure 10 Point cloud 3D scan result of a turbot fish.
  • Figure 11 Image of the experimental setup indoor tank.
  • Figure 12 Point Cloud 3D scan result of the fish farming indoor tank.
  • the present subject matter was designed to address the challenge of intensive biomass estimation in indoor RAS tanks.
  • the present subject matter discloses a robotic platform that is mechanically adaptable to the RAS tanks.
  • the robotic platform includes DC motors, sensors for providing odometry information and a processing unit.
  • the structured light vision system includes line lasers, a camera and also a processing unit. The two components communicate with each other over a Wi-Fi network.
  • the robotic platform may comprise the following components:
  • a PLC (for example, an OMRON(tm) PLC)
  • the tank reference frame is defined as follows: XX points to the right, YY to the front and ZZ down.
  • in Fig. 2 an image of the robotic platform is displayed.
  • the platform moves in the plane XY, parallel with the water plane.
  • the platform movement in the ZZ axis is manually calibrated and is adjustable to the water column.
  • the RAS tanks have a very shallow depth (between 15 and 25 cm).
  • the traction wheels are actuated by brushless DC motors that allow movement in the YY axis.
  • the SLS is placed on a mobile frame that moves in the XX axis, driven by a DC motor through a timing-belt mechanism.
  • the SLS contains a processing unit synchronized with the PLC that is controlling the robotic platform, and communicates using the Wi-Fi network.
  • position lasers were installed on the robotic platform and artificial targets are placed at the end of the tank. This sensing mechanism keeps track of the total distance travelled by the robotic platform in the tank; the lasers are placed on the sides of the robotic platform.
  • the position lasers allow traction control in the YY axis and detect situations such as wheel slippage. In such situations the system tries to compensate the motion or stops the scan.
  • the PLC is responsible for receiving the laser and encoder information, controlling the platform movement in the XX and YY axes, and sending odometry information to the SLS processing unit.
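The traction check described above can be sketched as a simple comparison between the encoder-integrated distance and the laser-measured distance to the fixed target. The function name and the 2 cm tolerance are illustrative assumptions, not values from the disclosure:

```python
def detect_slip(encoder_distance_m, laser_distance_m, tolerance_m=0.02):
    """Flag wheel slippage: the distance integrated from the wheel
    encoders should agree with the distance to the fixed target
    measured by the position lasers, up to a small tolerance."""
    return abs(encoder_distance_m - laser_distance_m) > tolerance_m

# When slip is detected the platform can compensate the motion or stop the scan.
if detect_slip(encoder_distance_m=1.50, laser_distance_m=1.43):
    print("slip detected: compensate motion or stop the scan")
```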
  • the robotic platform operation in automatic mode can be described as follows: the robotic platform is placed at an initial reference point with the SLS on one side of the tank. A scan is performed by moving the SLS along the XX axis until it reaches the other side of the tank. The SLS continuously captures and processes the images while receiving odometry information from the PLC. Then, a movement is performed in the YY axis with a predetermined step, which is constant throughout the length of the tank.
  • a scan is then performed while the SLS is moving in the opposite direction. This procedure, based on a U-shaped movement, is repeated until the entire tank is scanned.
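The automatic scanning procedure above can be sketched as a waypoint generator; the function name and the boustrophedon formulation are an illustrative assumption rather than the disclosure's exact control logic:

```python
def u_shaped_scan(tank_width_m, tank_length_m, step_m):
    """Generate (x, y) waypoints for the U-shaped scan: the SLS sweeps
    the tank width along XX, the platform advances one fixed step along
    YY, and the next sweep runs in the opposite direction."""
    n_rows = round(tank_length_m / step_m) + 1
    waypoints = []
    for i in range(n_rows):
        y = i * step_m
        if i % 2 == 0:                        # even row: left-to-right sweep
            waypoints += [(0.0, y), (tank_width_m, y)]
        else:                                 # odd row: right-to-left sweep
            waypoints += [(tank_width_m, y), (0.0, y)]
    return waypoints

# e.g. a 3.5 m wide, 2.8 m long tank section scanned with a 0.7 m YY step
path = u_shaped_scan(3.5, 2.8, 0.7)
```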
  • the robotic platform is mechanically built preferably using aluminium and/or stainless steel to prevent corrosion caused by the hostile environment in fish farming facilities.
  • the SLS unit is preferably built using waterproof housings.
  • the mobile structured light vision system may contain, in different embodiments, the following components:
  • a processing unit (for example, a NUC Intel i5(tm) 2.70 GHz), a camera (for example, an IDS UI-3240C 1.3 Mp) and a data communications module (for example, a 2.4 GHz Wi-Fi module);
  • a processing unit, a camera, a single line laser (for example, one Global Lyte MV laser, 635 nm wavelength (red), 5 mW power, with waterproof housing) and a data communications module;
  • a processing unit, N line lasers (for example, N Global Lyte MV lasers, 635 nm wavelength (red), 5 mW power, with waterproof housings) and a data communications module;
  • a processing unit, a camera, N line lasers of different colours (for example, N Z lasers, 532 nm wavelength) and a data communications module.
  • the structured light vision system, SLS, unit preferably consists of one camera and two red line lasers, both with waterproof housings, as seen in Fig. 4.
  • the processing unit is responsible for acquiring and processing the 2D images, receiving odometry information and computing the triangulation to obtain all points of the laser line in 3D reference coordinates.
  • the SLS allows us to obtain 3D information from the flatfish present in a rectangular indoor tank.
  • in Fig. 5 we can see the SLS software architecture.
  • the main purpose is to obtain synchronized time stamp images with dual line laser projection information.
  • flatfish species are not prone to sudden movements; the fish usually stay at the bottom of the tank in layers. Therefore, with an SLS constantly submerged performing U-shaped laser scans it is possible to capture the fish profile layer in the tank with high precision.
  • the method which allows obtaining the 3D point cloud is the following.
  • the next step concerns the detection of the laser lines in the 2D image, as shown in Fig. 6b.
  • using a Gaussian kernel, it is possible to obtain the pixel with the greatest colour intensity for each horizontal line of the image, as shown in Fig. 6c.
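The row-wise peak detection described above can be sketched in a few lines; this is a minimal pure-Python illustration in which the 3-tap Gaussian kernel and the border clamping are assumptions, not the disclosure's exact implementation:

```python
def detect_laser_peaks(red_channel, kernel=(0.25, 0.5, 0.25)):
    """For each horizontal image line, smooth the red-channel intensities
    with a small Gaussian kernel and return the column of the brightest
    pixel, i.e. the estimated position of the laser line in that row."""
    half = len(kernel) // 2
    peaks = []
    for row in red_channel:
        smoothed = []
        for c in range(len(row)):
            acc = 0.0
            for k, w in enumerate(kernel):
                idx = min(max(c + k - half, 0), len(row) - 1)  # clamp at borders
                acc += w * row[idx]
            smoothed.append(acc)
        peaks.append(max(range(len(smoothed)), key=smoothed.__getitem__))
    return peaks
```

Applied to every image row, this yields one laser-line pixel per row, which the triangulation step then converts into a depth sample.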
  • the weight of this sample is obtained using the equation M = ρ · V, where M represents the biomass weight, V the biomass volume measured by the system, and ρ the biomass density.
  • in Fig. 10 a result of a 3D scan of a turbot is presented. In this image, it is possible to see that the tank depth was about 125 mm and the fish's maximum thickness is approximately 40 mm.
  • the fish farming calibration system was tested in a live indoor aquaculture facility. For this purpose, a 2.80 × 3.50 m section of a large 40 × 3.5 m tank was selected. In Fig. 11 we can see the system operating in a RAS production tank.
  • the biomass weight was estimated using density values between 1200 kg/m³ and 1300 kg/m³. Although these are only indicative values for live fish, they allow us to get an approximate idea of the associated measurement errors, see Table 3. The results show that our approach has between 10% and 17% relative error in biomass volume in a real aquaculture environment. These results will have to be further validated in future work, taking into consideration the amount of fish present in a given tank and their growth during their life cycle. Also in the future, the mean value for biomass density will be estimated using statistical databases. The database will be generated by multiple scans of the same individuals in the same tank to estimate an approximate real biomass density value, and also by weighing all the fish in the tank prior to their sale.
  • Table 3 Comparison of the total biomass weight measurements between the manual calibration and the SCAN system in the fish farming facility.
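The weight estimate M = ρ · V and the relative-error figure discussed above can be reproduced with a short sketch; the volume and manual weight used in the example are hypothetical, only the 1200–1300 kg/m³ density range comes from the text:

```python
def biomass_weight_range(volume_m3, density_min=1200.0, density_max=1300.0):
    """Weight bounds M = rho * V, using the indicative live-fish
    density range of 1200-1300 kg/m^3 quoted in the text."""
    return volume_m3 * density_min, volume_m3 * density_max

def relative_error(estimated_kg, manual_kg):
    """Relative error of the scan estimate against a manual weighing."""
    return abs(estimated_kg - manual_kg) / manual_kg

# Hypothetical figures: a scanned volume of 0.1 m^3 bounded between
# 120 kg and 130 kg, compared against a hypothetical manual weighing.
low_kg, high_kg = biomass_weight_range(0.1)
```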

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Farming Of Fish And Shellfish (AREA)

Abstract

System for estimating biomass of a fish farming tank by obtaining a depth profile of the tank, comprising: a platform movable over the tank; an underwater structured light vision system, SLS, placed upon the platform, which comprises a camera for capturing underwater images and one or more line laser projectors, each for projecting an underwater laser line beam, such that the SLS is displaceable over an area of the tank where the biomass is to be estimated; and a data processing system configured for: processing the captured underwater images of the laser beam lines, calculating the underwater depth profile of the tank from the triangulation of the processed images of the laser beam lines, and subtracting, from the calculated underwater depth profile of the tank, an underwater depth profile of the tank previously obtained when the tank was empty of fish, for obtaining the biomass volume of the fish farming tank.

Description

D E S C R I P T I O N
METHOD AND SYSTEM FOR MEASURING BIOMASS VOLUME AND WEIGHT OF A FISH FARMING TANK
Technical field
[0001] The present disclosure relates to a robotic solution for fish farming biomass estimation, in particular a method and system for measuring biomass volume and weight of a fish farming tank.
Background
[0002] The fish farming industry is becoming widespread all over the world. By 2030, most of the fish we eat will come from the fish farming industry.
[0003] The world population grows at a rate of 1.14% per year [1], making a total of 7.3 billion people in the world today and projected to rise to 10 billion by 2050. This growth is putting great pressure on the renewal of the planet's natural food resources.
[0004] The United Nations FAO [2] issues regulations that address fisheries management and development, taking into account the knowledge and uncertainties about biotic and human components of marine ecosystems. The sole purpose of this approach is to plan, develop and manage fisheries in such a way that they cope with human society's needs while also keeping the full range of goods and services provided by marine ecosystems. Due to the increasing demand for fish and fish proteins, fish farming has become a widespread activity all over the world. Already considered a solid alternative to cover fish market demands, according to a World Bank report [3] it will account for two thirds of all world fish supplies by the year 2030.
[0005] We can roughly divide fish farming activities into two categories: extensive aquaculture and intensive aquaculture. While extensive aquaculture is usually related to outdoor fish farming, intensive and semi-intensive aquaculture is more dedicated to indoor fish tank systems. In these kinds of fish growth systems, fish production per unit of surface is key, and companies try to maximize it by mixing three fundamental elements: oxygen, fresh water and food. The way to do so is by setting up a very well structured fish tank production environment with controlled illumination and a continuous fresh water supply, and by conducting periodical biological/chemical tests of all elements present in the production environment. They also conduct regular biomass and fish size evaluation procedures in order to check fish growth, a process denoted as biomass estimation.
[0006] The biomass estimation process is very labour intensive and increases the cost of production. On the other hand, density and biomass estimates are crucial for evaluating fish growth during the growth cycle. These statistics are fundamental for fish farmers to estimate and adjust fish food dosage and medicine dosage, to detect fish loss early and, most importantly, to appraise growth rates and the food conversion factor in order to decide the best time to conduct financial transactions.
[0007] Nowadays, most biomass estimation procedures in the fish farming industry are performed manually. Individual fish samples are collected from the indoor tanks to be measured and weighed. This procedure induces high fish mortality and illness, and also causes fish stress that affects the fish biorhythm, thus slowing the growth rate.
[0008] These facts are disclosed in order to illustrate the technical problem addressed by the present disclosure.
General Description
[0009] The present disclosure relates to a robotic solution for fish farming biomass estimation, in particular a method and system for measuring biomass volume and weight of a fish farming tank, further in particular for measuring biomass volume and weight of flatfish lying on the bottom of a fish farming tank.
[0010] A flatfish is a member of the order Pleuronectiformes of ray-finned demersal fishes, also called the Heterosomata, sometimes classified as a suborder of Perciformes. In many species, both eyes lie on one side of the head, one or the other migrating through and around the head during development. In particular, the following fish are particularly suited to the present disclosure: flounders, soles, turbot, plaice, and halibut.
[0011] It is disclosed an autonomous robotic solution for fish farming biomass estimation, in particular indoor fish farming. The system preferably moves silently on top of the tank borders using differential wheels and a structured light vision system (SLS). The SLS can be constituted by a camera and one or more line lasers (projectors), each with a line beam, which allows obtaining the depth profile of the fish present in the tank to perform biomass estimation. In the embodiment, an example of an SLS with one camera and two line lasers is presented. The number of cameras/lasers is a parameter of the system, depending on the desired FOV relation and the need to cover the tank up to its borders. Thus, a system with 2 lasers according to an embodiment of the disclosure is particularly apt for scanning the whole surface of the bottom of a tank right to the border edges. A system with more than 2 lasers is also useful for illuminating and scanning larger areas at the same time and may, for example, dispense with movement along 2 axes by simply moving longitudinally.
[0012] Results in the laboratory and in a real aquaculture environment with live fish are presented throughout the present disclosure.
[0013] This system was tested in the laboratory and in an indoor fish farming production tank using live fish specimens. The obtained results allow us to conclude that the disclosed solution measures the biomass volume and the fish weight far better than current solutions, which are only based on manual samples collected by humans. Embodiments can be improved by lowering the system weight and by operating on a regular basis.
[0014] The present disclosure comprises two main physical components. The first is a robotic mechanical platform, which is placed on top of the borders of the tank. The platform moves autonomously on top of the tank using differential wheels with encoder information. This information is then coupled with position lasers that measure the distance travelled by the platform to a fixed target, usually placed at the end of the tank. The second component is a mobile platform that moves on top of the mechanical platform (i.e. transversally to the tank) and contains a structured light vision system (SLS), equipped with two lasers, a camera and a processing unit that performs 3D underwater mapping of the tank bottom using the triangulation principle [4], in a U-shaped movement.
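The triangulation principle [4] can be illustrated with a simplified 2D sketch. The geometry (a downward-looking camera with the laser sheet offset along XX) and all parameter names are assumptions made for illustration, not the disclosure's calibration model:

```python
import math

def depth_from_laser_pixel(u_px, cx_px, fx_px, baseline_m, laser_angle_rad):
    """2D laser-camera triangulation: the camera looks down along ZZ and
    the laser sheet, offset by `baseline_m` along XX, is tilted by
    `laser_angle_rad` toward the optical axis.  Intersecting the camera
    ray through pixel column `u_px` (ray: x/z = (u - cx)/fx) with the
    laser plane (x = baseline - z*tan(angle)) yields the depth z."""
    return baseline_m / ((u_px - cx_px) / fx_px + math.tan(laser_angle_rad))

# A laser pixel 100 columns right of the principal point, a 0.2 m
# baseline and a laser parallel to the optical axis map to z = 1.6 m.
z_m = depth_from_laser_pixel(u_px=420, cx_px=320, fx_px=800,
                             baseline_m=0.2, laser_angle_rad=0.0)
```

Deviations of the detected laser line in the image therefore map directly to depth variations caused by the fish profile on the tank bottom.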
[0015] The use of computer vision techniques for biomass estimation started in the mid 90s [5, 6, 7]. However, these systems still were equipped with low level of automation and required intense activity from operators. In early work, Foster et al. [5] used an underwater camera and image analysis tool to detect and count left over pellets. Petrell et al. [6] developed a system for estimating the mass of fish using stereo video cameras. In [7] the authors propose a video system for measuring the size and swimming speed in cages and tanks with a non-intrusive and offline system.
[0016] Most of these methods were applied to fixed structures and only average biomass estimates were obtained. Furthermore, at the time there was no real-time processing of the video images. Instead, video images were collected and their content analysed in post-processing. For instance, in [8] Lines et al. developed automatic techniques for identifying good images of fish in video frames and determining the outline of the fish in 3D space. To that end, they used a stereo vision system to extract linear dimensions of salmon and estimate their mass. Preliminary tests showed that the mean mass measurement error was 18% with a standard deviation of only 9%. In other work, Costa et al. [9] tested a dual underwater camera system for counting fish and estimating fish length while the fish were being transferred; the system reported length estimation calibration errors of less than 13% and a biomass estimation error of roughly 50%. The same authors, in [10], used a submersible dual camera module connected through two frame grabbers to a PC; the system filtered and segmented images with a fixed threshold to obtain binary images. Image segments were analysed for area, major axis length and circularity. Afterwards, feature points in each stereo image pair were used to obtain their geometry. The fish-length estimation error, based on a single measurement of a model fish, was approximately 2%.
[0017] In [11, 12] Martinez et al. developed a biomass estimation system using computer vision and robotic techniques, but the biomass estimation for indoor aquaculture was conducted over the water surface using purely visual methods. With the increase in automation and machine vision systems, the combination of laser systems with visual methods for tank inspection became widespread.
[0018] In [13] a method was proposed to evaluate the spatial distribution of flatfish in raceway tanks using a laser and a digital camera. The aim of the system was to improve tank design and fish management. In [14] a laser scanning system is used to perform the biomass inspection of a fish tank. That SLS does not process the line laser images in real-time and the platform is not fully automated; instead, it is manually pushed.
[0019] On the contrary, here a fully autonomous solution is disclosed, without a human in the loop, for obtaining 3D underwater maps of an indoor raceway tank system (RAS). With this solution, and based on conservative estimates of an average fish density, it is possible to obtain real-time estimates of the biomass in each tank. Moreover, the system can perform multiple scans and transform what is a painful manual process into a simple automated process that can be executed several times per day, with much less measurement uncertainty.
[0020] It is disclosed a system for estimating biomass of a fish farming tank by obtaining a depth profile of the tank, comprising:
a platform movable over the tank;
an underwater structured light vision system, SLS, placed upon the platform, which comprises a camera for capturing underwater images and one or more line laser projectors each for projecting an underwater laser line beam;
such that the SLS is displaceable over an area of the tank where the biomass is to be estimated;
and a data processing system configured for: processing the captured underwater images of the laser beam lines,
calculating the underwater depth profile of the tank from the triangulation of the processed images of the laser beam lines, and
subtracting, from the calculated underwater depth profile of the tank, an underwater depth profile of the tank previously obtained when the tank was empty of fish, for obtaining the biomass volume of the fish farming tank.
[0021] In an embodiment, the underwater depth of the SLS is adjustable, in particular wherein the full SLS is for submerging underwater or only the optical parts of the camera and line laser projectors are for submerging underwater.
[0022] In an embodiment, the captured underwater images are synchronised with the displacement of the SLS over the tank.
[0023] An embodiment further comprises laser position trackers for tracking the displacement of the SLS over the tank.
[0024] In an embodiment, the system further comprises a transversal arm for laterally crossing the tank, said arm being movable longitudinally along the tank, and wherein the platform is movable along the length of said arm in order to be movable over the area of the tank where the biomass is to be estimated.
[0025] In an embodiment, said arm comprises motorised wheels at its two ends for placing on top of lateral borders of the tank.
[0026] An embodiment further comprises rotary encoders and/or inertial sensors for encoding the displacement and/or inclination of the platform over the tank, in particular for encoding the displacement and/or inclination of the arm over the tank or in particular for encoding the displacement of the platform in respect of the arm.
[0027] It is also disclosed a method for estimating biomass of a fish farming tank by obtaining a depth profile of the tank, comprising:
displacing a structured light vision system, SLS, which is placed upon a platform, over an area of the tank where the biomass is to be estimated, wherein the SLS comprises a camera and two line laser projectors, and the camera comprises a camera lens which is kept underwater;
projecting an underwater laser line beam by each of the two line laser projectors;
capturing underwater images of the two projected laser beam lines;
using a data processing system for:
processing the captured underwater images of the laser beam lines, calculating the underwater depth profile of the tank from the processed images of the laser beam lines, and
subtracting, from the calculated underwater depth profile of the tank, an underwater depth profile of the tank previously obtained when the tank was empty of fish, for obtaining the biomass volume of the fish farming tank.
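The subtraction step above can be sketched as follows (a minimal illustration, assuming the two depth profiles have been resampled onto a common XY grid; the function name and the grid representation are hypothetical, not part of the disclosure):

```python
import numpy as np

def biomass_volume(depth_with_fish, depth_empty, cell_area):
    """Biomass volume (m^3) from two gridded depth profiles of the tank.

    depth_with_fish / depth_empty: 2D arrays of measured depth (m) on the
    same XY grid; cell_area: area of one grid cell (m^2).
    """
    # Fish lying on the bottom reduce the measured water depth, so the
    # difference (empty minus with-fish) is the local fish height.
    fish_height = np.clip(depth_empty - depth_with_fish, 0.0, None)
    return float(fish_height.sum() * cell_area)
```

Multiplying the result by an estimated fish density then yields the biomass weight, as in the embodiment of paragraph [0028].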
[0028] An embodiment further comprises multiplying the calculated biomass volume by an estimated density of the fish present in the fish farming tank for obtaining the biomass weight of the fish farming tank.
[0029] An embodiment further comprises adjusting the underwater depth of the SLS such that the camera lens is kept underwater across the area of the tank where the biomass is to be estimated.
[0030] An embodiment further comprises adjusting the underwater depth of the SLS such that the camera lens is kept as shallow as possible while keeping the camera lens constantly submerged across the area of the tank where the biomass is to be estimated.
[0031] An embodiment further comprises synchronising the captured underwater images with the displacement of the SLS over the tank.
[0032] An embodiment further comprises tracking the displacement of the SLS over the tank with laser position trackers.
[0033] An embodiment further comprises longitudinally moving an arm, said arm being a transversal arm laterally crossing the tank, and moving the platform along the length of said arm in order to displace the SLS over the area of the tank where the biomass is to be estimated.
[0034] An embodiment further comprises using rotary encoders for encoding the displacement of the platform over the tank.
[0035] An embodiment further comprises displacing the structured light vision system, SLS, over the area of the tank where the biomass is to be estimated, in U-shaped movements.
[0036] An embodiment further comprises using rotary encoders for encoding the displacement of the platform over the tank.
[0037] An embodiment further comprises, when using the data processing system for processing the captured underwater images of the laser line beams:
colour segmenting the captured image for obtaining the colour component corresponding to the colour of the selected laser;
using a Gaussian kernel for obtaining the pixel with the greatest colour intensity for each horizontal line of the captured image where the laser beam line is arranged vertically.
[0038] An embodiment further comprises, when using the data processing system for calculating the underwater depth profile of the tank:
detecting the points of the laser beam lines present in said processed images; triangulating the detected points in order to obtain the 3D coordinates for said points;
combining the 3D coordinates of the detected points in order to obtain the underwater depth profile of the tank.
Brief Description of the Drawings
[0039] The following figures provide preferred embodiments for illustrating the description and should not be seen as limiting the scope of the invention.
[0040] Figure 1: Schematic representation of an embodiment of the hardware architecture of the fish farming autonomous calibration system.
[0041] Figure 2a, 2b: Schematic representation of the mechanical platform.
[0042] Figure 3: Schematic representation of the mechanical platform architecture.
[0043] Figure 4: Images of the structured light vision system, wherein (a) pictures the Structured Light Vision System - camera and red lasers; and (b) pictures a snapshot from the Structured Light Vision System, in this case a red laser line deformed in the presence of a fish.
[0044] Figure 5: Schematic representation of the SLS software architecture.
[0045] Figure 6: Images of the SLS Line laser detection.
[0046] Figure 7: Image of an experimental setup laboratory tank.
[0047] Figure 8: Point cloud 3D scan result of the fish farming calibration in the laboratory tank.
[0048] Figure 9: Point cloud 3D scan result of the fish farming in the laboratory tank.
[0049] Figure 10: Point cloud 3D scan result of a turbot fish.
[0050] Figure 11: Image of the experimental setup indoor tank.
[0051] Figure 12: Point Cloud 3D scan result of the fish farming indoor tank.
Detailed Description
[0052] The present subject matter was designed to address the challenge of intensive biomass estimation in indoor RAS tanks. The present subject matter discloses a robotic platform that is mechanically adaptable to RAS tanks. The embodiment of Fig. 1 shows the hardware architecture of our fish farming calibration system. It has two main components: the robotic platform, which includes DC motors, sensors providing odometry information and a processing unit; and the structured light vision system, which includes line lasers, a camera and also a processing unit. The two components communicate with each other over a Wi-Fi network.
[0053] The robotic platform may comprise the following components:
PLC (for example, an OMRON(tm) PLC)
Mechanical Platform Position Lasers
Brushless DC Motors with encoder information
On-board Processing Unit
LiFePO4 batteries
Wi-Fi communications 2.4 GHz module
[0054] For the present disclosure, we will use the tank reference frame: XX points to the right, YY to the front and ZZ down.
[0055] In Fig. 2, an image of the robotic platform is displayed. The platform moves in the XY plane, parallel to the water plane. The platform movement in the ZZ axis is manually calibrated and is adjustable with the water column. The RAS tanks have a very shallow depth (between 15 and 25 cm). The traction wheels are actuated by brushless DC motors that allow movement in the YY axis.
[0056] The SLS is placed on a mobile frame that moves in the XX axis, through a system with a DC motor that operates a "timing belt" mechanism. The SLS contains a processing unit synchronized with the PLC that controls the robotic platform, and communicates using the Wi-Fi network.
[0057] In order to obtain the global platform position in the tank reference frame, position lasers were installed on the robotic platform and artificial targets are placed at the end of the tank. This sensing mechanism allows us to keep track of the total travel distance of the robotic platform in the tank; the lasers are placed on the sides of the robotic platform. The position lasers allow traction control in the YY axis and help avoid situations such as wheel slippage; in such a situation the system tries to compensate the motion or stops the scan. The PLC is responsible for receiving laser and encoder information, controlling the platform movement in the XX and YY axes, and sending odometry information to the SLS processing unit.
[0058] The robotic platform operation in automatic mode can be described as follows. The robotic platform is placed at an initial reference point with the SLS on the side of the tank. A scan is performed by moving the SLS in the XX axis until it reaches the other side of the tank. The SLS continuously captures and processes the images while receiving odometry information from the PLC. Then, a movement is performed in the YY axis with a predetermined step, which is constant throughout the length of the tank, followed by a scan while the SLS moves in the opposite direction. This procedure, based on a U-shaped movement, is repeated until the entire tank has been scanned.
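The automatic-mode trajectory described above can be sketched as a boustrophedon (U-shaped) waypoint generator. This is a minimal illustration with hypothetical names and units in metres, not the disclosed control code:

```python
def u_scan_waypoints(tank_width, tank_length, step):
    """Generate (x, y) waypoints for a U-shaped scan of the tank.

    The SLS sweeps the full tank width along XX, then the platform
    advances a fixed step along YY and the sweep direction reverses.
    """
    waypoints = []
    y = 0.0
    direction = 1  # +1: left-to-right sweep, -1: right-to-left sweep
    while y <= tank_length + 1e-9:
        x_start, x_end = (0.0, tank_width) if direction == 1 else (tank_width, 0.0)
        waypoints.append((x_start, y))   # start of the sweep at this YY row
        waypoints.append((x_end, y))     # end of the sweep at this YY row
        y += step
        direction = -direction
    return waypoints
```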
[0059] The robotic platform is mechanically built preferably using aluminium and/or stainless steel to prevent corrosion caused by the hostile environment on fish farming industries. As for the SLS unit it is preferably built using waterproof housings.
[0060] The mobile structured light vision system (SLS) may contain the following components:
processing unit (for example, a NUC Intel i5 (tm) 2.70 GHz)
a camera (for example, IDS UI-3240C 1.3Mp)
two lasers (for example, 2 Global Lyte Mv Lasers 635nm wavelength (red) 5mW power with waterproof Housing)
Batteries (for example, LiFePO4 batteries)
Data communications (for example, a Wi-Fi communications 2.4 GHz module) or a
processing unit (for example, a NUC Intel i5 (tm) 2.70 GHz)
a camera (for example, IDS UI-3240C 1.3Mp)
a single line laser (for example, 1 Global Lyte Mv Lasers 635nm wavelength (red) 5mW power with waterproof Housing )
Batteries (for example, LiFePO4 batteries)
Data communications (for example, a Wi-Fi communications 2.4 GHz module) or a
processing unit (for example, a NUC Intel i5 (tm) 2.70 GHz)
a camera (for example, IDS UI-3240C 1.3Mp)
N line lasers (for example, N Global Lyte Mv Lasers 635nm wavelength (red) 5mW power with waterproof housing)
Batteries (for example, LiFePO4 batteries)
Data communications (for example, a Wi-Fi communications 2.4 GHz module) or a
processing unit (for example, a NUC Intel i5 (tm) 2.70 GHz)
a camera (for example, IDS UI-3240C 1.3Mp)
N line lasers from different colours (for example, N Z Lasers 532nm wavelength (green) 5mW power)
Batteries (for example, LiFePO4 batteries)
Data communications (for example, a Wi-Fi communications 2.4 GHz module)
[0061] The Structured Light Vision System, SLS, unit preferably consists of one camera and two red line lasers, both with waterproof housings; see Fig. 4. The processing unit is responsible for acquiring and processing the 2D images, receiving odometry information and computing the triangulation for obtaining all points of the laser line in 3D reference coordinates.
[0062] One of the critical issues that needed to be solved in order for the SLS system to work was the calibration between the laser projectors and the camera. In [15] we proposed two SLS calibration methods, one based on the cross-ratio principle, the other based on the robust estimation of the laser line projection in the camera reference frame. The methods were tested both in dry and underwater environments, and the line projection method achieved errors twice as small as the cross-ratio method. The results obtained show that the error for the line projection method is about 1 mm; see the results section of [15].
[0063] The SLS allows us to obtain 3D information from the flatfish present in a rectangular indoor tank. In Fig. 5, we can see the SLS software architecture. The main purpose is to obtain synchronized, time-stamped images with dual line laser projection information. Flatfish species are not prone to sudden movements; the fish usually stay at the bottom of the tank in layers. Therefore, with an SLS constantly submerged underwater performing U-shaped laser scans, it is possible to capture the fish profile layer in the tank with high precision.
[0064] Before being able to use the SLS for obtaining accurate 3D underwater measurements, a global system calibration needs to be performed. In this step we load the camera intrinsic parameters and the plane of each projector position in the camera reference frame, obtained in the calibration step. Initially, it may be necessary to define configuration parameters such as frame rate, exposure time or a threshold value for the colour segmentation.
[0065] The method for obtaining the 3D point cloud is the following. First, we start the image acquisition procedure, followed by an image segmentation method where only the colour component corresponding to the laser is preserved. The next step concerns the detection of the laser lines in the 2D image, as shown in Fig. 6b. Using a Gaussian kernel, it is possible to obtain the pixel with the greatest colour intensity for each horizontal line of the image, as shown in Fig. 6c.
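A sketch of this per-row peak detection follows. The segmentation threshold, kernel width and the OpenCV-style BGR channel layout are assumptions for illustration, not values from the disclosure:

```python
import numpy as np

def detect_laser_line(image_bgr, threshold=80, sigma=2.0):
    """Per-row laser peak detection: red-channel segmentation followed by
    Gaussian smoothing and an intensity-peak search on each horizontal line.

    image_bgr: HxWx3 uint8 image; returns (row, col) pixel coordinates of
    the detected laser line, one per row where the laser is visible.
    """
    red = image_bgr[:, :, 2].astype(float)
    red[red < threshold] = 0.0                 # colour segmentation
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))    # 1D Gaussian kernel
    kernel /= kernel.sum()
    points = []
    for r, row in enumerate(red):
        smoothed = np.convolve(row, kernel, mode="same")
        c = int(np.argmax(smoothed))           # brightest pixel of the row
        if smoothed[c] > 0:                    # laser visible on this row
            points.append((r, c))
    return points
```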
[0066] After obtaining the laser line in the 2D image frame, and knowing the camera/laser calibration parameters beforehand, it is possible to obtain 3D point information using the triangulation principle between the camera viewpoint and the light projector. For more detail please see our previous work [16]. For computing the 3D point cloud in the tank reference frame it is necessary to use odometry information. The profile layer gives us the fish height in the tank reference frame, compared to a previously known measure of the tank profile height with water only. Having determined this profile, we can then estimate the biomass volume of the fish present in the tank.
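The triangulation principle can be illustrated as a ray/plane intersection in the camera reference frame. This is a simplified pinhole-camera sketch: the variable names and the plane parameterisation n·X = d are assumptions, and lens distortion and refraction at the waterproof housing are ignored:

```python
import numpy as np

def triangulate_point(pixel, K, plane_n, plane_d):
    """3D point where the camera ray through `pixel` meets the calibrated
    laser plane n . X = d, expressed in the camera reference frame.

    pixel: (u, v) image coordinates; K: 3x3 camera intrinsics matrix;
    plane_n, plane_d: laser-plane parameters from the calibration step.
    """
    u, v = pixel
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-projected viewing ray
    t = plane_d / (plane_n @ ray)                   # ray/plane intersection depth
    return t * ray                                  # 3D point (X, Y, Z)
```

Odometry then maps each such camera-frame point into the tank reference frame.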
[0067] For testing our fish farming autonomous calibration system we started by performing some laboratory experiments. A fish tank measuring 3x1 m, with depth variations between 12 cm and 15 cm, was used in the experiments. In Fig. 7, it is possible to see an image of the experimental setup.
[0068] For this experiment we used two different types of fish: flounder and turbot. The working principle consisted of first performing the 3D underwater mapping of the tank with water only. Then we placed dead fish in the tank and performed the 3D mapping of the fish using the SLS. Based on the comparison between both measurements, it is possible to extract the fish biomass volume contained in the tank.
[0069] In the first test, a sample of 4 flounder weighing 0,956 kg was used. In Fig. 8, it is possible to see the fish sample as well as the corresponding 3D point cloud result.
[0070] Our system measures the overall biomass volume present in a given tank. However, a potential client/producer is actually interested in knowing the overall fish biomass weight (M) present in the tank. Therefore, it is necessary to relate the actual fish weight to its density. To do so, we collected a fish sample and used a container with known dimensions and a known volume of the water column. We placed all the fish samples inside the container and measured the difference in water column height to calculate the density of the fish.
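The density measurement described above amounts to Archimedes' displacement. A minimal sketch with hypothetical names, in SI units (kg, m, m^2):

```python
def fish_density(sample_mass, container_base_area, height_before, height_after):
    """Biomass density (kg/m^3) of a fish sample via water displacement.

    The sample is submerged in a container of known base area; the rise of
    the water column gives the displaced volume, equal to the sample volume.
    """
    displaced_volume = container_base_area * (height_after - height_before)
    return sample_mass / displaced_volume
```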
[0071] Afterwards, the weight of this sample is obtained using the following equation, where M represents the biomass weight, V the biomass volume measured by the system, and ρ the biomass density:
M = ρ × V
[0072] The results presented in Table 1 show that our approach has about 5% relative error in estimating the biomass weight.
Table 1: Comparison of the total biomass weight measurements between the manual calibration and the SCAN system in INESC TEC laboratory - FLOUNDER
                     Weight (kg)   Absolute Error (kg)   Relative Error (%)
Manual Calibration   0,956         -                     -
SCAN                 0,907         0,049                 5,125
[0073] Analogously, the same tests were performed for another type of flatfish, the turbot. For this test, we used a sample of 5 turbot weighing about 4,850 kg. In Fig. 9, the fish sample used and the corresponding 3D point cloud result are shown. The results presented in Table 2 show that our approach has about 8% relative error in estimating the biomass weight.
Table 2: Comparison of the total biomass weight measurements between the manual calibration and the SCAN system in INESC TEC laboratory - TURBOT
[0074] In Fig. 10, a result of the 3D scan of the turbot is presented. In this image, it is possible to see that the tank depth was about 125 mm and the maximum fish thickness is approximately 40 mm.
[0075] The fish farming calibration system was tested in a live indoor aquaculture production facility. For this purpose, a 2.80x3.50 m section of a large 40x3.5 m tank was selected. In Fig. 11 we can see the system operating in a RAS production tank.
[0076] The amount of biomass available in the tank was manually calibrated for ground-truth estimation purposes. In this test, performed in an actual aquaculture environment, we used 209 flounders with 69,700 kg of total weight. With such a large quantity of fish, it was impossible to know the real mean value of their biomass density.
[0077] Therefore, the biomass weight was estimated using density values between 1200 kg/m3 and 1300 kg/m3. Although not totally rigorous, since these are indicative values for live fish individuals, this allows us to have an approximate idea of the associated measurement errors; see Table 3. The results show that our approach has between 10% and 17% relative error in biomass weight in a real aquaculture environment. These results will have to be further validated in future work, taking into consideration the amount of fish present in a given tank and their growth during their life cycle. Also in the future, the mean value for biomass density will be estimated using statistical databases. The database will be generated by multiple scans of the same individuals in the same tank to estimate an approximate real biomass density value, and also by weighing all the fish in the tank prior to their sale.
Table 3: Comparison of the total biomass weight measurements between the manual calibration and the SCAN system in the fish farming facility.
                     Weight (kg)   Absolute Error (kg)   Relative Error (%)
Manual Calibration   69,700        -                     -
SCAN                 57,749        11,950                17,146
Biomass density (kg/m3): 1300
[0078] The aquaculture industry is rapidly turning into a billion-dollar industry. It is therefore important to maintain standards that allow fish to grow in a sustainable manner. Currently there is great interest in optimizing growth through parameters such as food, water quality, temperature and oxygen in the tank. The biomass estimation procedure is usually performed manually, causing fish mortality, illness and stress that induces slow growth.
[0079] In this disclosure we have presented an autonomous system for indoor fish farming biomass estimation. Our robotic mechanical platform, which is placed on and moves along the borders of a RAS tank, carries a moving platform with our custom-designed structured light vision system. The 3D underwater laser scanning technique is a promising tool for estimating the biomass weight, and is an alternative to the common manual measurement system.
[0080] As a proof of concept, our system was tested in laboratory experiments, in which we used two different types of fish: flounder and turbot. In the laboratory experiments, the results show that our approach has a maximum relative error of 8% in estimating the biomass weight. The fish farming calibration system was also tested in a live indoor aquaculture facility, and the results show that our approach has about 15% error in overall biomass weight in a real aquaculture environment.
[0081] Optionally, integrating an inertial measurement unit in the SLS system is advantageous in order to reduce the position and attitude errors of the robotic mechanical platform. One of the error sources is not taking into consideration the robotic platform attitude in the tank. A mechanical redesign towards a weight reduction of the whole system is also a preferable embodiment, preferably allowing easy assembly and mobility inside the indoor fish farm.
[0082] The term "comprising" whenever used in this document is intended to indicate the presence of stated features, integers, steps, components, but not to preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
[0083] The disclosure should not be seen in any way restricted to the embodiments described and a person with ordinary skill in the art will foresee many possibilities to modifications thereof.
[0084] The above described embodiments are combinable.
[0085] The following claims further set out particular embodiments of the disclosure.
[0086] The following references, should be considered herewith incorporated in their entirety:
[1] http://www.worldometers.info/world-population/
[2] http://www.fao.org/docrep/005/Y4470E/Y4470E00.HTM
[3] http://documents.worldbank.org/curated/en/2013/12/18882045/fish-2030- prospects-fisheries-aquaculture
[4] Hartley, R. I. and Zisserman, A.," Multiple View Geometry in Computer Vision", Second Edition, 2004,Cambridge University Press, ISBN: 0521540518
[5] M. Foster, R. Petrell, M.R. Ito, R. Ward, "Detection and counting of uneaten food pellets in a sea cage using image analysis", Aquacultural Engineering, Volume 14, Issue 3, Pages 251-269, ISSN 0144-8609, (1995).
[6] R.J. Petrell, X. Shi, R.K. Ward, A. Naiberg, C.R. Savage, "Determining fish size and swimming speed in cages and tanks using simple video techniques", Aquacultural Engineering, Volume 16, Issues 1-2, Pages 63-84, ISSN 0144-8609, (March 1997).
[7] K.P. Ang, R.J. Petrell, "Control of feed dispensation in seacages using underwater video monitoring: effects on growth and food conversion". In: Aquacultural Engineering 16, pp. 45-62, (1997).
[8] J.A. Lines, R.D. Tillett, L.G. Ross, D.Chan, S. Hockaday, N.J.B McFarlane, "An automatic image-based system for estimating the mass of free-swimming fish". In: Computers and Electronics in Agriculture, pp.151-168, (April 2001)
[9] C. Costa, M. Scardi, V. Vitalini S. Cataudella, "A dual camera system for counting and sizing Northern Bluefin Tuna (Thunnus thynnus; Linnaeus, 1758) stock, during transfer to aquaculture cages, with a semi automatic Artificial Neural Network tool". In: Aquaculture 291, pp.161-167, (2009).
[10] C. Costa, A. Loy, S. Cataudella, D. Davis, M. Scardi, "Extracting fish size using dual underwater cameras". In: Aquaculture Engineering 35, pp.218-227, (2006).
[11] C. Serna , A. Ollero, "A stereo vision system for the estimation of biomass in fish farms". Proceedings of the 6th IFAC Symposium, pp.185-191. Berlin, Germany, 8,9 October, (2001).
[12] J. R. Martinez-de Dios, C. Serna, A. Ollero. "Computer vision and robotics techniques in fish farms". Robotica 21, 233-243, (June 2003).
[13] J. Oca, S. Duarte, L. Reig. "Evaluation of spatial distribution of flatfish by laser scanning", Aquaculture Europe 2007. Istanbul, Turkey: European Aquaculture Society, 24-27 p. 157, (2007).
[14] C. Almansa, L. Reig, J. Oca, "Use of laser scanning to evaluate turbot (Scophthalmus maximus) distribution in raceways with different water velocities", Aquacultural Engineering, Volume 51, Pages 7-14, ISSN 0144-8609, (2012).
[15] F. Lopes, H. Silva, J. Almeida, A. Martins, E. Silva, "Structured Light System Calibration for Perception in Underwater Tanks", IbPRIA 2015: 7th Iberian Conference on Pattern Recognition and Image Analysis, Santiago de Compostela, Spain, June 17-19 (2015).
[16] F. Lopes, H. Silva, J. Almeida, E. Silva, "Structured Light System for Underwater Inspection Operations", OCEANS'15 MTS/IEEE Genova, Italy, 18-21 May (2015)

Claims

1. System for estimating biomass of a fish farming tank by obtaining a depth profile of the tank, comprising:
a platform movable over the tank;
an underwater structured light vision system, SLS, placed upon the platform, which comprises a camera for capturing underwater images and one or more line laser projectors each for projecting an underwater laser line beam;
such that the SLS is displaceable over an area of the tank where the biomass is to be estimated;
a data processing system configured for:
processing the captured underwater images of the laser beam lines,
calculating the underwater depth profile of the tank from the triangulation of the processed images of the laser beam lines, and
subtracting, from the calculated underwater depth profile of the tank, an underwater depth profile of the tank previously obtained when the tank was empty of fish, for obtaining the biomass volume of the fish farming tank.
2. System according to claim 1 wherein the underwater depth of the SLS is adjustable, in particular wherein the full SLS is submergible underwater or only the optical parts of the camera and line laser projectors are submergible underwater.
3. System according to any of the previous claims wherein the captured underwater images of the laser beam lines are spatially synchronised with the displacement of the SLS over the tank.
4. System according to the previous claim further comprising laser position trackers for tracking the displacement of the SLS over the tank.
5. System according to any of the previous claims, wherein the system further comprises a transversal arm for laterally crossing the tank, said arm being movable longitudinally along the tank, and wherein the platform is movable along the length of said arm in order to be movable over the area of the tank where the biomass is to be estimated.
6. System according to the previous claim, wherein said arm comprises motorised wheels at its two ends for placing on top of lateral borders of the tank.
7. System according to any of the claims 5-6 comprising rotary encoders and/or inertial sensors for encoding the displacement and/or inclination of the platform over the tank, in particular for encoding the displacement and/or inclination of the arm over the tank or in particular for encoding the displacement of the platform in respect of the arm.
8. System according to any of the previous claims comprising two line laser projectors each arranged for projecting an underwater laser line beam such that the lateral edges of the bottom of the fish farming tank are illuminable by said laser projectors.
9. System according to any of the previous claims wherein the SLS comprises two or more cameras for capturing underwater images of the laser beam lines.
10. Method for estimating biomass of a fish farming tank by obtaining a depth profile of the tank, comprising:
displacing a structured light vision system, SLS, which is placed upon a platform, over an area of the tank where the biomass is to be estimated, wherein the SLS comprises a camera and two line laser projectors, and the camera comprises a camera lens which is kept underwater;
projecting an underwater laser line beam by each of the two line laser projectors; capturing underwater images of the two projected laser beam lines;
using a data processing system for:
processing the captured underwater images of the laser beam lines, calculating the underwater depth profile of the tank from the processed images of the laser beam lines, and
subtracting, from the calculated underwater depth profile of the tank, an underwater depth profile of the tank previously obtained when the tank was empty of fish, for obtaining the biomass volume of the fish farming tank.
11. Method according to the previous claim further comprising multiplying the calculated biomass volume by an estimated density of the fish present in the fish farming tank for obtaining the biomass weight of the fish farming tank.
12. Method according to claim 10 or 11 further comprising adjusting the underwater depth of the SLS such that the camera lens is kept underwater across the area of the tank where the biomass is to be estimated.
13. Method according to the previous claim further comprising adjusting the underwater depth of the SLS such that the camera lens is kept as shallow as possible while keeping the camera lens constantly submerged across the area of the tank where the biomass is to be estimated.
14. Method according to any of the claims 10-13, further comprising synchronising the captured underwater images with the displacement of the SLS over the tank.
15. Method according to the previous claim further comprising tracking the displacement of the SLS over the tank with laser position trackers.
16. Method according to any of the claims 10-15, further comprising longitudinally moving an arm, said arm being a transversal arm laterally crossing the tank, and moving the platform along the length of said arm in order to displace the SLS over the area of the tank where the biomass is to be estimated.
17. Method according to any of the claims 10-16, further comprising using rotary encoders for encoding the displacement of the platform over the tank.
18. Method according to any of the claims 10-17, further comprising displacing the structured light vision system, SLS, over the area of the tank where the biomass is to be estimated, in U-shaped movements.
19. Method according to any of the claims 10-18, further comprising using rotary encoders for encoding the displacement of the platform over the tank.
20. Method according to any of the claims 10-19, further comprising when using the data processing system for processing the captured underwater images of the laser line beams:
colour segmenting the captured image for obtaining the colour component corresponding to the colour of the selected laser;
using a Gaussian kernel for obtaining the pixel with greater colour intensity for each horizontal line of the captured image where the laser beam line is arranged vertically.
21. Method according to any of the claims 10-20, further comprising when using the data processing system for calculating the underwater depth profile of the tank: detecting the points of the laser beam lines present in said processed images; triangulating the detected points in order to obtain the 3D coordinates for said points;
combining the 3D coordinates of the detected points in order to obtain the underwater depth profile of the tank.
PCT/IB2016/053682 2015-06-30 2016-06-21 Method and system for measuring biomass volume and weight of a fish farming tank WO2017001971A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
PT10862715 2015-06-30
PT108627 2015-06-30
PT10920316 2016-03-01
PT109203 2016-03-01

Publications (1)

Publication Number Publication Date
WO2017001971A1 true WO2017001971A1 (en) 2017-01-05

Family

ID=56418562

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/053682 WO2017001971A1 (en) 2015-06-30 2016-06-21 Method and system for measuring biomass volume and weight of a fish farming tank

Country Status (1)

Country Link
WO (1) WO2017001971A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019180698A1 (en) * 2018-03-20 2019-09-26 Giliocean Technology Ltd Method and system for extraction of statistical sample of moving objects
US10599922B2 (en) 2018-01-25 2020-03-24 X Development Llc Fish biomass, shape, and size determination
ES2799975A1 (en) * 2019-06-21 2020-12-22 Univ Oviedo Biomass estimation system in aquaculture based on reconstructions of images in three dimensions (Machine-translation by Google Translate, not legally binding)
WO2020256566A1 (en) * 2019-06-19 2020-12-24 Subc3D As System and method for depiction and counting of external structures on a fish
CN112763487A (en) * 2020-12-09 2021-05-07 天津市水产研究所 Aquatic livestock high-throughput scale type acquisition device
CN112997937A (en) * 2021-02-20 2021-06-22 东营市阔海水产科技有限公司 Prawn feeding table observation equipment
WO2021135392A1 (en) * 2019-12-30 2021-07-08 科沃斯机器人股份有限公司 Structured light module and autonomous moving apparatus
WO2021222113A1 (en) * 2020-04-27 2021-11-04 Ecto, Inc. Dynamic laser system reconfiguration for parasite control
CN114046961A (en) * 2021-09-18 2022-02-15 浙江大学 Sediment erosion testing system based on digital imaging technology
US11615638B2 (en) 2020-11-10 2023-03-28 X Development Llc Image processing-based weight estimation for aquaculture
KR102563980B1 (en) * 2022-08-29 2023-08-03 유병자 Automatic measuring device and automatic measuring method of fish mass
KR102626586B1 (en) * 2022-08-29 2024-01-17 유병자 Automatic measuring device and automatic measuring method of fish mass

Citations (1)

Publication number Priority date Publication date Assignee Title
WO2014098614A1 (en) * 2012-12-20 2014-06-26 Ebtech As System and method for calculating physical dimensions for freely movable objects in water

Non-Patent Citations (18)

* Cited by examiner, † Cited by third party
Title
ALMANSA C ET AL: "The laser scanner is a reliable method to estimate the biomass of a Senegalese sole (Solea senegalensis) population in a tank", AQUACULTURAL ENGINEERING, ELSEVIER SCIENCE PUBLISHERS LTD, AMSTERDAM, NL, vol. 69, 22 October 2015 (2015-10-22), pages 78 - 83, XP029303255, ISSN: 0144-8609, DOI: 10.1016/J.AQUAENG.2015.10.003 *
ALMANSA C ET AL: "Use of laser scanning to evaluate turbot (Scophthalmus maximus) distribution in raceways with different water velocities", AQUACULTURAL ENGINEERING, vol. 51, 1 November 2012 (2012-11-01), pages 7 - 14, XP028944568, ISSN: 0144-8609, DOI: 10.1016/J.AQUAENG.2012.04.002 *
BRUNO F ET AL: "Experimentation of structured light and stereo vision for underwater 3D reconstruction", ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, AMSTERDAM [U.A.] : ELSEVIER, AMSTERDAM, NL, vol. 66, no. 4, 21 March 2011 (2011-03-21), pages 508 - 518, XP028224089, ISSN: 0924-2716, [retrieved on 20110303], DOI: 10.1016/J.ISPRSJPRS.2011.02.009 *
C. ALMANSA; L. REIG; J. OCA: "Use of laser scanning to evaluate turbot (Scophthalmus maximus) distribution in raceways with different water velocities", AQUACULTURAL ENGINEERING, vol. 51, 2012, pages 7 - 14, XP028944568, DOI: 10.1016/j.aquaeng.2012.04.002
C. COSTA; A. LOY; S. CATAUDELLA; D. DAVIS; M. SCARDI: "Extracting fish size using dual underwater cameras", AQUACULTURE ENGINEERING, vol. 35, 2006, pages 218 - 227, XP025087455, DOI: 10.1016/j.aquaeng.2006.02.003
C. COSTA; M. SCARDI; V. VITALINI; S. CATAUDELLA: "A dual camera system for counting and sizing Northern Bluefin Tuna (Thunnus thynnus; Linnaeus, 1758) stock, during transfer to aquaculture cages, with a semi automatic Artificial Neural Network tool", AQUACULTURE, vol. 291, 2009, pages 161 - 167, XP026097739, DOI: 10.1016/j.aquaculture.2009.02.013
C. SERNA; A. OLLERO: "A stereo vision system for the estimation of biomass in fish farms", PROCEEDINGS OF THE 6TH IFAC SYMPOSIUM, 8 October 2001 (2001-10-08), pages 185 - 191
F. LOPES; H. SILVA; J. ALMEIDA; A. MARTINS; E. SILVA: "Structured Light System Calibration for Perception in Underwater Tanks", IBPRIA 2015: 7TH IBERIAN CONFERENCE ON PATTERN RECOGNITION AND IMAGE ANALYSIS SANTIAGO DE COMPOSTELA, SPAIN, 17 June 2015 (2015-06-17)
F. LOPES; H. SILVA; J. ALMEIDA; E. SILVA: "Structured Light System for Underwater Inspection Operations", OCEANS'15 MTS/IEEE GENOVA, 18 May 2015 (2015-05-18)
HARTLEY, R. I.; ZISSERMAN, A.: "Multiple View Geometry in Computer Vision", 2004, CAMBRIDGE UNIVERSITY PRESS
J. OCA ET AL: "EVALUATION OF SPATIAL DISTRIBUTION OF FLATFISH BY LASER SCANNING", AQUACULTURE EUROPE 2007, 1 October 2007 (2007-10-01), pages 157, XP055303832, Retrieved from the Internet <URL:https://upcommons.upc.edu/handle/2117/1309> [retrieved on 20160919] *
J. OCA; S. DUARTE; L. REIG: "Aquaculture Europe 2007", vol. 24-27, 2007, ISTANBUL, TURKEY: EUROPEAN AQUACULTURE SOCIETY, article "Evaluation of spatial distribution of flatfish by laser scanning", pages: 157
J. R. MARTINEZ-DE DIOS; C. SERNA; A. OLLERO.: "Computer vision and robotics techniques in fish farms", ROBOTICA, vol. 21, June 2003 (2003-06-01), pages 233 - 243, XP009153523
J.A. LINES; R.D. TILLETT; L.G. ROSS; D. CHAN; S. HOCKADAY; N.J.B. MCFARLANE: "An automatic image-based system for estimating the mass of free-swimming fish", COMPUTERS AND ELECTRONICS IN AGRICULTURE, April 2001 (2001-04-01), pages 151 - 168, XP055287889, DOI: 10.1016/S0168-1699(00)00181-2
K.P. ANG; R.J. PETRELL: "Control of feed dispensation in seacages using underwater video monitoring: effects on growth and food conversion", AQUACULTURAL ENGINEERING, vol. 16, 1997, pages 45 - 62
M. FOSTER; R. PETRELL; M.R. ITO; R. WARD: "Detection and counting of uneaten food pellets in a sea cage using image analysis", AQUACULTURAL ENGINEERING, vol. 14, no. 3, 1995, pages 251 - 269
MARTINEZ-DE DIOS J R ET AL: "Computer vision and robotics techniques in fish farms", ROBOTICA, CAMBRIDGE, GB, vol. 21, no. 3, 1 June 2003 (2003-06-01), pages 233 - 243, XP009153523, ISSN: 0263-5747, [retrieved on 20030513] *
R.J. PETRELL; X. SHI; R.K. WARD; A. NAIBERG; C.R. SAVAGE: "Determining fish size and swimming speed in cages and tanks using simple video techniques", AQUACULTURAL ENGINEERING, vol. 16, no. 1-2, March 1997 (1997-03-01), pages 63 - 84

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11232297B2 (en) 2018-01-25 2022-01-25 X Development Llc Fish biomass, shape, and size determination
US10599922B2 (en) 2018-01-25 2020-03-24 X Development Llc Fish biomass, shape, and size determination
US11688196B2 (en) 2018-01-25 2023-06-27 X Development Llc Fish biomass, shape, and size determination
CN111868472A (en) * 2018-03-20 2020-10-30 Giliocean Technology Ltd System and method for extracting statistical samples of multiple moving objects
EP3769036B1 (en) * 2018-03-20 2023-11-22 Giliocean Technology Ltd Method and system for extraction of statistical sample of moving fish
WO2019180698A1 (en) * 2018-03-20 2019-09-26 Giliocean Technology Ltd Method and system for extraction of statistical sample of moving objects
GB2599532B (en) * 2019-06-19 2022-10-26 Subc3D As System and method for depiction and counting of external structures on a fish
GB2599532A (en) * 2019-06-19 2022-04-06 Subc3D As System and method for depiction and counting of external structures on a fish
WO2020256566A1 (en) * 2019-06-19 2020-12-24 Subc3D As System and method for depiction and counting of external structures on a fish
ES2799975A1 (en) * 2019-06-21 2020-12-22 Univ Oviedo Biomass estimation system in aquaculture based on reconstructions of images in three dimensions (Machine-translation by Google Translate, not legally binding)
WO2021135392A1 (en) * 2019-12-30 2021-07-08 Ecovacs Robotics Co., Ltd. Structured light module and autonomous moving apparatus
WO2021222113A1 (en) * 2020-04-27 2021-11-04 Ecto, Inc. Dynamic laser system reconfiguration for parasite control
US11615638B2 (en) 2020-11-10 2023-03-28 X Development Llc Image processing-based weight estimation for aquaculture
CN112763487A (en) * 2020-12-09 2021-05-07 Tianjin Fisheries Research Institute Aquatic livestock high-throughput scale type acquisition device
CN112997937A (en) * 2021-02-20 2021-06-22 Dongying Kuohai Aquatic Science and Technology Co., Ltd. Prawn feeding table observation equipment
CN114046961A (en) * 2021-09-18 2022-02-15 Zhejiang University Sediment erosion testing system based on digital imaging technology
KR102563980B1 (en) * 2022-08-29 2023-08-03 유병자 Automatic measuring device and automatic measuring method of fish mass
KR102626586B1 (en) * 2022-08-29 2024-01-17 유병자 Automatic measuring device and automatic measuring method of fish mass

Similar Documents

Publication Publication Date Title
WO2017001971A1 (en) Method and system for measuring biomass volume and weight of a fish farming tank
Costa et al. Extracting fish size using dual underwater cameras
Shortis et al. Design and calibration of an underwater stereo-video system for the monitoring of marine fauna populations
CA2744146C (en) Arrangement and method for determining a body condition score of an animal
Shuai et al. Research on 3D surface reconstruction and body size measurement of pigs based on multi-view RGB-D cameras
Martinez-de Dios et al. Computer vision and robotics techniques in fish farms
Lopes et al. Fish farming autonomous calibration system
Li et al. Estimation of pig weight by machine vision: A review
Long et al. Potato volume measurement based on RGB-D camera
EP3769036B1 (en) Method and system for extraction of statistical sample of moving fish
Lin et al. Three-dimensional location of target fish by monocular infrared imaging sensor based on a L–z correlation model
Pérez et al. Automatic measurement of fish size using stereo vision
Hansen et al. Non-intrusive automated measurement of dairy cow body condition using 3D video
Bianco et al. Plankton 3D tracking: the importance of camera calibration in stereo computer vision systems
Rochet et al. Precision and accuracy of fish length measurements obtained with two visual underwater methods
Minagawa et al. Determining the weight of pigs with image analysis
Odone et al. Visual Learning of Weight from Shape Using Support Vector Machines.
Sun et al. A practical system of fish size measurement
Savinov et al. Automatic contactless weighing of fish during experiments
Hwang et al. Machine vision based weight prediction for flatfish
Abdullah et al. Measuring fish length from digital images (FiLeDI)
Zong et al. Comparisons of non-contact methods for pig weight estimation
Rahim et al. A new approach in measuring fish length using FiLeDI framework
RU2769731C2 (en) Method of determining numerical values of animal exterior
Biswas et al. A Study on Artificial Intelligence Techniques for Automatic Fish-Size Estimation

Legal Events

Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16739570; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.04.2018))

122 Ep: pct application non-entry in european phase (Ref document number: 16739570; Country of ref document: EP; Kind code of ref document: A1)