WO2023108307A1 - System for inspecting farming netting in aquaculture cages - Google Patents


Info

Publication number
WO2023108307A1
WO2023108307A1 (PCT/CL2022/050128)
Authority
WO
WIPO (PCT)
Prior art keywords
culture
meshes
inspection
cage
mesh
Prior art date
Application number
PCT/CL2022/050128
Other languages
Spanish (es)
French (fr)
Inventor
Marcos David ZÚÑIGA BARRAZA
Gonzalo Andrés CARVAJAL BARRERA
Mohamed Abdelhamid AHMED ABDELHAMID
Nicolás Alonso JARA CARVALLO
Rodrigo Javier CARVAJAL GUERRA
Eduardo Ignacio GARCÍA LÓPEZ
Manuel Alejandro PEDRAZA GONZÁLEZ
Javier Andrés REBOLLEDO BUSTAMANTE
Karel TOLEDO DE LA GARZA
Original Assignee
Universidad Técnica Federico Santa María
Sociedad Comercial Aquarov Ltda.
Priority date
Filing date
Publication date
Application filed by Universidad Técnica Federico Santa María, Sociedad Comercial Aquarov Ltda. filed Critical Universidad Técnica Federico Santa María
Publication of WO2023108307A1 publication Critical patent/WO2023108307A1/en

Classifications

    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00Culture of aquatic animals
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00Culture of aquatic animals
    • A01K61/10Culture of aquatic animals of fish
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00Culture of aquatic animals
    • A01K61/60Floating cultivation devices, e.g. rafts or floating fish-farms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63CLAUNCHING, HAULING-OUT, OR DRY-DOCKING OF VESSELS; LIFE-SAVING IN WATER; EQUIPMENT FOR DWELLING OR WORKING UNDER WATER; MEANS FOR SALVAGING OR SEARCHING FOR UNDERWATER OBJECTS
    • B63C11/00Equipment for dwelling or working underwater; Means for searching for underwater objects
    • B63C11/48Means for searching for underwater objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63CLAUNCHING, HAULING-OUT, OR DRY-DOCKING OF VESSELS; LIFE-SAVING IN WATER; EQUIPMENT FOR DWELLING OR WORKING UNDER WATER; MEANS FOR SALVAGING OR SEARCHING FOR UNDERWATER OBJECTS
    • B63C11/00Equipment for dwelling or working underwater; Means for searching for underwater objects
    • B63C11/52Tools specially adapted for working underwater, not otherwise provided for
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/22Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/05Underwater scenes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A40/81Aquaculture, e.g. of fish

Definitions

  • the present invention is related to the aquaculture industry and to equipment aimed at measuring the condition of aquaculture cage meshes.
  • the present invention consists of a system for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of the cages.
  • the use of sensors and smart devices as tools for monitoring operations has gained great relevance in recent years, since it produces a large amount of information that supports the analysis of variables and provides tools for decision-making in the operation.
  • although there are various types of monitoring systems for different areas of the industry, there are few applications or technologies specifically aimed at the rearing and processing of fish, and more particularly, at monitoring submerged cages.
  • although this document describes a system that allows the monitoring of meshes and cages, the system described there still has limitations regarding the precision of the analyzed data, mainly concerning the determination of the location of the objects identified by the vehicles.
  • this type of solution is based on the use of images captured by tele-operated robotic vehicles, from which the position of the captured elements, whether fish or anomalies in the meshes, is calculated. The system therefore relies on visual evidence delivered by cameras on board the vehicle and on estimated speed and location data.
  • navigation conditions in this type of application commonly involve low visibility and currents that make precise piloting difficult. As a result, these operations have a high degree of imprecision in the location of each detected event, and the overall efficiency of the process is largely conditioned by the skill and experience of the vehicle operator.
  • the present invention provides a system for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of cages and the fish inside them.
  • the system comprises: a mobile detection device that is remotely operable and that comprises means for capturing images and means for emitting omnidirectional ultrasound signals; a plurality of ultrasound signal receiving means located in a fixed manner in different locations of a cage; a processing system that is configured to determine the position of the mobile detection device inside the cage by processing the ultrasound signals and identify the state of a cage mesh; and a user interface configured to generate automatic displays in a three-dimensional environment of the location of the mobile detection device and the state of the mesh.
  • the system of the present invention makes it possible to obtain images, preferably in the form of video, of the interior of an aquaculture cage through the use of a submerged mobile device.
  • the system makes it possible to accurately determine the location of the mobile device in relation to the cage itself by analyzing the ultrasound signals. In this way, the system allows the precise determination of the location of each of the anomalies and objects detected by the mobile device.
  • the present invention provides a method for the inspection of culture meshes in aquaculture cages, which comprises the steps of: having a mobile detection device that is remotely operable and that is configured to capture images and emit omnidirectional ultrasound signals; receiving the ultrasound signals emitted by the mobile detection device by means of a plurality of ultrasound signal receiving means located fixedly at different locations in a cage; processing the ultrasound signals, by means of a processing system, to determine the position of the mobile detection device within the cage; processing the received images, by the processing system, to identify the state of a mesh of the cage; and generating automatic visualizations in a three-dimensional environment, in a user interface, of the location of the mobile detection device and the state of the mesh.
  • the ability to identify the state of a cage mesh comprises the detection of anomalies by analyzing the captured images, where said anomalies may include the presence of breaks in the mesh or other critical conditions, such as, for example, the level of dirt in an area of the mesh.
  • the determination of the position of the mobile detection device can be complemented by using the processing of the images detected by the same device, in order to provide greater robustness to the ultrasound-based positioning system.
  • This configuration is based on a visual odometry process, where a position measurement is carried out by analyzing the images obtained by the same device.
  • the visualization in a three-dimensional environment is based on the use of the information obtained on the position of the mobile detection device inside the cage, whether through ultrasound, image processing, or a combination of both, processing this information in order to project said position onto a three-dimensional model of the cage.
  • a user can be provided with visual support of the inspection process by displaying the mobile detection device within the three-dimensional model, the status of the inspection process, anomalies in the mesh and the level of dirt (fouling).
  • FIG. 1 shows an exemplary representation of a system for the inspection of culture meshes in aquaculture cages in accordance with the present invention.
  • FIG. 2 shows a representative diagram of the architecture of the processing system of the present invention.
  • FIG. 3 shows a representative diagram of a method for detecting breaks in a cage mesh in accordance with the present invention.
  • FIGs. 4A and 4B show a representation of results obtained from the detection of breaks in a mesh of a cage.
  • FIGs. 5A to 5D show a sequence of images taken from a mesh in an underwater cage, where progressive levels of dirt can be seen around the elements of the mesh.
  • FIGs. 6A to 6F show an exemplary implementation of the invention in which detection of fish within the cage is carried out.
  • the FIGs. 6A, 6C and 6E show the real data captured (ground-truth data), and FIGs. 6B, 6D and 6F show the results of detection by the present invention.
  • FIG. 7 shows a representation of the location of the mobile detection device with respect to the cage in a three-dimensional environment.
  • the present invention consists of a system (100) for the inspection of culture meshes in aquaculture cages (110), which allows real-time monitoring of the state of the culture meshes inside, in terms of breakage and anomalies, and characterization of the level of dirt.
  • the system comprises: a mobile detection device (120) that is remotely operable and that includes means for capturing images and means for emitting omnidirectional ultrasound signals; a plurality of ultrasonic signal receiving means (130) fixedly located at different locations in a cage (110); a processing system (140) that is configured to determine the position of the mobile detection device inside the cage by processing the ultrasound signals and identify the state of a cage mesh; and a user interface (144) configured to generate automatic displays in a three-dimensional environment of the location of the mobile detection device, the status of the inspection process, anomalies in the mesh, and the level of soiling.
  • the mobile detection device (120) comprises ultrasound signal transmission means that are detectable by the plurality of ultrasound signal receiving means (130), which are preferably located around the cage, as shown in FIG. 1.
  • the omnidirectional ultrasound signal emitting means of the mobile detection device (120) preferably transmit signals periodically.
  • the signal captured by the receiving means (130) is transmitted to the processing system (140), which is in charge of digitizing the signals and retransmitting them to a local network (141).
  • the system preferably includes a LoRa ("Long Range") communication system due to the significant distances that can exist between one or more cages and the operations center.
  • the location of the mobile detection device is carried out using a multilateration technique, also known as "Time Difference of Arrival (TDoA)", which allows calculating the signal transmission time differences between the mobile detection device (120) and the plurality of receiving means (130).
  • TDoA Time Difference of Arrival
  • the processing system makes it possible to determine the difference in distances between the transmission means on board the mobile detection device (120) and each receiving means (130), which allows the position of the mobile detection device (120) inside the cage to be monitored at all times, complementing the inspection tasks.
  • the system contemplates a minimum of four receivers (130) to obtain a location in two dimensions and a minimum of five receivers for a location in three dimensions.
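The TDoA multilateration described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the receiver layout, the assumed sound speed of 1500 m/s in seawater, and the use of a generic least-squares solver are all assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 1500.0  # approximate speed of sound in seawater, m/s (assumption)

def locate_tdoa(receivers, arrival_times, guess=None):
    """Estimate the emitter position from time differences of arrival.

    receivers: (N, 3) array of fixed receiver coordinates, in metres.
    arrival_times: (N,) timestamps of the same ping at each receiver.
    The unknown emission time cancels out because only differences
    relative to receiver 0 are used.
    """
    receivers = np.asarray(receivers, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    # Measured range differences relative to receiver 0.
    dd = SPEED_OF_SOUND * (t[1:] - t[0])

    def residuals(p):
        d = np.linalg.norm(receivers - p, axis=1)
        return (d[1:] - d[0]) - dd

    if guess is None:
        guess = receivers.mean(axis=0)  # start from the receiver centroid
    return least_squares(residuals, guess).x
```

With five non-coplanar receivers there are four independent range-difference equations, enough to solve for the three position coordinates with some redundancy, consistent with the minimum receiver counts stated above.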
  • the processing system converts the received analog signals into digital signals through the use of an analog-to-digital converter (ADC); the digital signals are then further processed, for example, by a personal computer (PC) or an FPGA (Field-Programmable Gate Array).
  • ADC analog-to-digital converter
  • PC personal computer
  • FPGA Field-Programmable Gate Array
  • the system detects the instant of time at which each ultrasound signal arrives at the receiving means (130) and stores that time (timestamp), from which the position of the mobile detection device is determined by means of the TDoA technique.
  • the receiving means (130) can be complemented with other positioning systems, such as GPS, in order to synchronize this information with the processing of the ultrasound signals coming from the mobile detection device (120).
  • the processing system (140) is configured using a three-layer architecture: an access layer (142), a core layer (143), and a user interface layer (144).
  • the access layer (142) is in charge of communicating with the different means for detecting system variables, in order to receive the data they obtain and send the information to the central layer for further processing.
  • the system comprises fixed detection means (132) and mobile detection means (123) that are located in the mobile detection device (120).
  • the fixed detection means (132) comprise the ultrasound signal receiving means (130) and other alternative detection means, such as fixed cameras (131) at different locations of a cage.
  • the mobile detection means (123) preferably include image capture means (121) and omnidirectional ultrasound signal emission means (122), both on board the mobile detection device (120).
  • other types of detection means may also form part of other alternative implementations of the invention.
  • the central layer (143) is in charge of communicating with the access layer (142) for the reception of the data obtained by the different means of detection (123, 132), and carries out the administration and processing of all the information to subsequently send the output data to the user interface layer (144).
  • the user interface layer (144) is the intermediary between the user and the entire cage inspection system (100), from which it is possible to access the information processed by the central layer (143) and interact with the different components of the system.
  • This layer comprises one or more user interfaces, which may be located at a remote location from the cage(s) being inspected.
  • the user interfaces can include the use of computers, electronic tablets, mobile phone applications, or others.
  • the present invention also provides a method for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of cages and the fish inside, comprising the steps of: having a mobile, remotely operable detection device (120) that is configured to capture images and emit omnidirectional ultrasound signals; receiving the ultrasound signals emitted by the mobile detection device (120) by means of a plurality of ultrasound signal receiving means (130) fixedly located at different locations in a cage (110); processing the ultrasound signals, by means of a processing system, to determine the position of the mobile detection device within the cage; processing the received images, by the processing system, to identify the state of a mesh of the cage; and generating automatic visualizations in a three-dimensional environment, in a user interface, of the location of the mobile detection device, the status of the inspection process, the anomalies in the mesh and the level of dirt.
  • the step of identifying the state of a mesh in the cage comprises detecting anomalies, such as breaks in the mesh, and detecting other critical conditions, such as the level of dirt in an area of the mesh.
  • the invention is based on the use of different image processing techniques based on geometric characteristics, through the use of libraries dedicated to this type of application, such as OpenCV. Furthermore, the processing techniques are preferably selected for low computational cost, so that the processing can be executed with the hardware on board the mobile detection device.
  • a method for the detection of breaks comprises the steps of: capturing images (151), preferably in the form of video, using the mobile detection device; processing the images to detect and segment the mesh; developing a statistical model (156) from the processed images; selecting outliers as anomaly candidates (157); performing spatiotemporal filtering (158); and keeping an event log (159).
  • the step of processing the images in turn comprises the steps of image pre-processing (152), application of an adaptive threshold (153) to adjust the segmentation to various lighting and contrast conditions, connected-component analysis (154), and generation of a binary mask corresponding to the pixels of the mesh (155).
  • the purpose of the pre-processing stage (152) is to deal with the effects that underwater conditions generate in the image.
  • the pre-processing stage handles abrupt changes in light intensity, the effects caused by the need to use artificial light sources, and the high content of floating particles, all of which directly hinder the detection and classification of the pixels that correspond to the mesh.
  • the pre-processing of the images (152) comprises adjusting the size of the image to 1280x720 pixels in order to reduce the computational cost without considerable loss of information.
  • the images are converted to the CIE-LAB color space in order to carry out the processing only with the channel that contains the relevant light perception information, since the color components are considered not to provide much relevant information in the context of the problem.
  • the step of applying an adaptive threshold (153) is carried out to then generate a binary mask (155) that corresponds to the pixels of the mesh.
  • the method includes the step of inverting the binary mask generated by the adaptive threshold for those cases in which the background of the image is identified as being lighter than the pixels of the mesh.
  • An example of this underwater context corresponds to situations in which the mobile detection device is at a depth of over ten meters and pointing towards an illuminated surface.
  • the method contemplates the measurement of the proportion between light and dark pixels of the binary mask, allowing the correct mask to be generated.
  • morphological operation filters are applied to suppress the effects of floating particles and to connect gaps in the mesh that the segmentation may have left open.
  • FIGs. 4A and 4B show examples of results obtained by the above steps.
  • the method includes the step of analyzing connected components (154), through which the area in pixels corresponding to each of the holes in the mesh is determined.
  • the size of the holes is similar, as observed in FIGS. 4A and 4B.
  • the average and standard deviation of these areas are calculated in order to detect outliers, i.e. areas that deviate from the average by more than three standard deviations.
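The outlier selection over the hole areas can be sketched as follows; the per-hole areas would come, for example, from `cv2.connectedComponentsWithStats` applied to the hole regions of the binary mask:

```python
import numpy as np

def outlier_holes(areas, k=3.0):
    """Return indices of hole areas that exceed the mean by more than
    k standard deviations (the text above uses three), flagging them
    as break candidates."""
    areas = np.asarray(areas, dtype=float)
    mu, sigma = areas.mean(), areas.std()
    return np.flatnonzero(areas > mu + k * sigma)
```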
  • the method preferably includes the step of filtering possible false positives by associating the spatial information obtained by image processing with a spatiotemporal analysis (158) using video tracking techniques.
  • false positives can be caused by lack of lighting, fast camera movements, and very steep perspectives close to the mesh, for which a final stage of spatiotemporal filtering by tracking over time is added. This means that only those outliers that persist over time, without drastically changing their position in the image, are identified as breaks. To visually check that these conditions are met, each break candidate is assigned an identifier, which is displayed on the screen and should not vary between frames.
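A naive nearest-neighbour version of that persistence filter might look like this; the displacement threshold and the minimum number of frames are hypothetical parameters, not the patent's:

```python
def persistent_breaks(frames, max_shift=20.0, min_frames=5):
    """Keep only break candidates that persist across frames without
    drastically changing position (a simplified tracking sketch).

    frames: list of per-frame candidate lists, each candidate an (x, y).
    Returns the last position of candidates seen in >= min_frames frames.
    """
    tracks = []  # each track: {'pos': (x, y), 'count': int, 'seen': bool}
    for candidates in frames:
        for t in tracks:
            t['seen'] = False
        for (x, y) in candidates:
            best = None
            for t in tracks:
                dx, dy = x - t['pos'][0], y - t['pos'][1]
                d = (dx * dx + dy * dy) ** 0.5
                # Match the nearest unclaimed track within max_shift.
                if d <= max_shift and not t['seen']:
                    if best is None or d < best[0]:
                        best = (d, t)
            if best:
                best[1]['pos'] = (x, y)
                best[1]['count'] += 1
                best[1]['seen'] = True
            else:
                tracks.append({'pos': (x, y), 'count': 1, 'seen': True})
    return [t['pos'] for t in tracks if t['count'] >= min_frames]
```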
  • the detection of anomalies in the mesh of a cage includes the determination of the level of dirt in an area of the mesh.
  • the FIGs. 5A to 5D show a sequence of images taken from a mesh in an underwater cage in which progressive levels of dirt can be seen around the elements of the mesh, where FIG. 5A shows a mesh area with low levels of dirt and FIG. 5D shows a mesh area with a high degree of soiling. In these images it can be seen that the space between mesh elements is progressively covered by the growth of algae that adhere to the mesh elements.
  • the invention preferably addresses the problem through artificial intelligence, by means of a "transfer learning" process, in which previously trained neural networks that solve general tasks are reused and the last layers of the network are re-trained with images specific to the problem.
  • the problem is addressed by using shallower Convolutional Neural Networks (CNN), designing and processing the information with different architectures, hyperparameters and data sets in order to generate a model that can predict the level of dirt with the greatest possible precision.
  • CNN Convolutional Neural Networks
  • the method includes the detection of fish inside the cage, which comprises determining the location of the fish and their subsequent segmentation from a set of images obtained from the mobile detection device.
  • the method preferably contemplates the use of artificial intelligence for the processing of the images obtained, and more preferably the use of neural networks.
  • the Mask R-CNN algorithm can be implemented, which corresponds to a neural network capable of segmenting objects learned from a data set with the objects of interest.
  • the purpose of the detection and segmentation of fish in the cage is to be able to mask the presence of fish so that they do not significantly affect the analysis and characterization of the state of the mesh.
  • fish detection is also projected as a fundamental element for biomass estimation processes.
  • the method can be complemented with optimization tools, such as a Bayesian optimizer, in order to automatically choose the hyper-parameters that best minimize the segmentation error.
  • the optimizer works by generating a probabilistic map that determines in which part of the hyper-parameter space it is more convenient to search for new candidates, such that the variance of the model is minimized.
  • the searches become more precise and specialized because the process searches for combinations with the highest probability of maximizing performance.
  • FIGs. 6A to 6F show the result of implementing the previously described steps, wherein FIGs. 6A, 6C and 6E show reference images taken at different times inside a fish cage, while FIGs. 6B, 6D and 6F show the results obtained by the previously described detection and segmentation processes, showing erroneously identified objects in light gray and true positives in dark gray.
  • the determination of the position of the mobile detection device (120) can be complemented with the processing of the images detected by the same device, in order to provide greater robustness to the ultrasound-based positioning system.
  • in this configuration, the visual characteristics of the meshes in the culture cages are monitored, integrating the sensors of the mobile detection device.
  • a vision algorithm based on "Simultaneous Localization And Mapping" (SLAM) is preferably implemented for the complementary estimation of the position of the mobile detection device based on the visual characteristics of the meshes.
  • the method based on local descriptors comprises using a feature extraction method, such as SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features), ORB (Oriented-FAST and Rotated-BRIEF), BRISK (Binary Robust Invariant Scalable Keypoints), AKAZE (Accelerated-KAZE Features), etc., and tracking the features between frames to determine the rotational and translational movement of the camera.
  • SIFT Scale-Invariant Feature Transform
  • SURF Speeded Up Robust Features
  • ORB Oriented-FAST and Rotated-BRIEF
  • BRISK Binary Robust Invariant Scalable Keypoints
  • AKAZE Accelerated-KAZE Features
  • the aforementioned steps are applied to the image streams in the environment of the mobile detection device, and a comparison is carried out using qualitative criteria, such as computation time and the consistency of the feature tracking, in order to define which of the methods is best for a specific application.
  • the method further comprises displaying the mobile detection device (120) in a three-dimensional environment.
  • the method comprises using the previously obtained information on the position of the mobile detection device in relation to the cage, whether from the processing of ultrasound signals, from image analysis, or from a combination of both, and processing this information in order to project the position of the device in a three-dimensional model of the cage.
  • the extracted visual information is projected in terms of inspection coverage, the position of breaks in the mesh and the characterization of the level of dirt. In this way, the inspection process can be visually supported through visualization, allowing traceability of the process, modeled within a three-dimensional virtual environment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Environmental Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Animal Husbandry (AREA)
  • Multimedia (AREA)
  • Zoology (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Ocean & Marine Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mechanical Engineering (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention relates to a system and method for inspecting farming netting in aquaculture cages, which allows real-time monitoring of the condition of farming netting in cages, in terms of tears and anomalies, and allows the level of fouling to be assessed. The system comprises: a remotely operable mobile detection device, comprising means for capturing images and means for omnidirectional ultrasound signal emission; a plurality of means for receiving ultrasound signals, fixedly located in different positions in a cage; a processing system configured to determine the position of the device inside the cage by processing the ultrasound signals, and to identify the condition of netting in the cage; and a user interface configured to generate automatic visualisations, in a three-dimensional environment, of the position of the device and the condition of the netting.

Description

SISTEMA PARA LA INSPECCIÓN DE MALLAS DE CULTIVO EN JAULAS DE ACUICULTURA SYSTEM FOR THE INSPECTION OF CULTURE MESHES IN AQUACULTURE CAGES
CAMPO TÉCNICO DE LA INVENCIÓN TECHNICAL FIELD OF THE INVENTION
La presente invención se relaciona con la industria de la acuicultura y con equipos dirigidos a la medición del estado de mallas de jaulas de acuicultura. En particular, la presente invención consiste en un sistema para la inspección de mallas de cultivo en jaulas de acuicultura, lo que permite una monitorización en tiempo real del estado de las jaulas. The present invention is related to the aquaculture industry and to equipment aimed at measuring the condition of aquaculture cage meshes. In particular, the present invention consists of a system for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of the cages.
ANTECEDENTES BACKGROUND
El uso de sensores y dispositivos inteligentes como herramientas para la monitorización de operaciones ha adquirido gran relevancia en los últimos años, pues permite producir una gran cantidad de información que ayuda al análisis de variables y entrega herramientas para la toma de decisiones de la operación. Sin embargo, si bien existen diversos tipos de sistemas de monitorización para distintas áreas de la industria, existen pocas aplicaciones o tecnologías dirigidas específicamente a la crianza y procesamiento de peces, y más particularmente, a la monitorización de jaulas sumergidas. The use of sensors and smart devices as tools for monitoring operations has gained great relevance in recent years, since it allows the production of a large amount of information that helps the analysis of variables and provides tools for operational decision-making. However, while there are various types of monitoring systems for different areas of the industry, there are few applications or technologies aimed specifically at the farming and processing of fish and, more particularly, at the monitoring of submerged cages.
A pesar de que en el estado del arte se han desarrollado algunas tecnologías dirigidas a la monitorización en línea para jaulas sumergidas, en estas persisten algunos inconvenientes en la implementación que afectan directamente la eficiencia de la operación, como por ejemplo el extenso cableado, la corta distancia para la transmisión de datos y la dificultad en la extracción de datos debido a las condiciones del ambiente subacuático. Despite the fact that in the state of the art some technologies aimed at online monitoring for submerged cages have been developed, there are still some drawbacks in the implementation that directly affect the efficiency of the operation, such as extensive wiring, short distance for data transmission and the difficulty in data extraction due to the conditions of the underwater environment.
Una tecnología representativa de las soluciones del arte previo para los problemas antes mencionados se describe en el documento de patente CN 111047583, el cual describe un método para detectar daño en una red utilizando un robot subacuático operado de manera remota (ROV). En este documento se describe un robot subacuático que puede capturar imágenes de la red, en la forma de videos, mediante la selección de un área de interés, la aplicación de filtros en la imagen, transformando la imagen a escala de grises, obteniendo una imagen binaria a partir de la imagen en escala de grises, y finalmente dividiendo la imagen en escala de grises en dominios independientes. A partir de esta información se obtiene el área de cada dominio y se identifican los defectos como aquellos dominios con área fuera de la norma. A technology representative of the prior-art solutions to the aforementioned problems is described in patent document CN 111047583, which describes a method for detecting damage in a net using a remotely operated underwater vehicle (ROV). That document describes an underwater robot that can capture images of the net, in the form of video, by selecting an area of interest, applying filters to the image, transforming the image to grayscale, obtaining a binary image from the grayscale image, and finally dividing the grayscale image into independent domains. From this information, the area of each domain is obtained and defects are identified as those domains whose area falls outside the norm.
Sin embargo, si bien este documento describe un sistema que permite la monitorización de mallas y jaulas, en el sistema allí descrito persisten limitaciones en cuanto a la precisión de los datos analizados, fundamentalmente teniendo en consideración la determinación de la ubicación de los objetos identificados por los vehículos. En este sentido, en general este tipo de soluciones se basa en la utilización de imágenes captadas por los vehículos robóticos tele-operados, a partir de las cuales se calcula la posición de los elementos captados, ya sea peces o anomalías en las mallas, de modo que el sistema se apoya en la evidencia visual entregada por cámaras a bordo del vehículo y datos estimados de velocidad y ubicación. Sin embargo, comúnmente las condiciones de navegación en este tipo de aplicaciones son de baja visibilidad y bajo la influencia de corrientes que dificultan un pilotaje preciso. Debido a esto, los resultados de estas operaciones poseen un alto grado de imprecisión respecto a la ubicación de cada uno de los eventos detectados y la eficiencia completa del proceso queda condicionada en gran medida a la habilidad y experiencia del operador del vehículo. However, although this document describes a system that allows the monitoring of meshes and cages, the system described therein still has limitations regarding the precision of the analysed data, mainly with respect to determining the location of the objects identified by the vehicles. In general, this type of solution is based on images captured by tele-operated robotic vehicles, from which the position of the captured elements, whether fish or anomalies in the meshes, is calculated, so that the system relies on visual evidence delivered by on-board cameras together with estimated speed and location data. However, navigation conditions in this type of application commonly involve low visibility and currents that make precise piloting difficult. Because of this, the results of these operations have a high degree of imprecision regarding the location of each detected event, and the overall efficiency of the process is largely conditioned by the skill and experience of the vehicle operator.
En vista de lo anterior, se desprende del estado de la técnica que existe la necesidad de contar con sistemas de monitorización que permitan obtener mediciones más precisas y que permitan determinar con exactitud la localización de los elementos y objetos que son detectados dentro de la jaula. In view of the foregoing, it is clear from the state of the art that there is a need for monitoring systems that allow more precise measurements to be obtained and that allow the exact location of the elements and objects that are detected inside the cage to be determined.
BREVE DESCRIPCIÓN DE LA INVENCIÓN BRIEF DESCRIPTION OF THE INVENTION
Para subsanar las deficiencias mencionadas anteriormente se presenta un sistema para la inspección de mallas de cultivo en jaulas de acuicultura, que permite una monitorización en tiempo real del estado de jaulas y los peces en su interior. El sistema comprende: un dispositivo de detección móvil que es operable en forma remota y que comprende medios de captación de imágenes y medios de emisión de señales de ultrasonido omnidireccional; una pluralidad de medios receptores de señales de ultrasonido localizados en forma fija en distintas ubicaciones de una jaula; un sistema de procesamiento que está configurado para determinar la posición del dispositivo de detección móvil dentro de la jaula mediante el procesamiento de las señales de ultrasonido e identificar el estado de una malla de la jaula; y una interfaz de usuario configurada para generar visualizaciones automáticas en un entorno tridimensional de la ubicación del dispositivo de detección móvil y del estado de la malla. In order to correct the deficiencies mentioned above, a system for the inspection of culture meshes in aquaculture cages is presented, which allows real-time monitoring of the state of cages and the fish inside them. The system comprises: a mobile detection device that is remotely operable and that comprises means for capturing images and means for emitting omnidirectional ultrasound signals; a plurality of ultrasound signal receiving means located in a fixed manner in different locations of a cage; a processing system that is configured to determine the position of the mobile detection device inside the cage by processing the ultrasound signals and identify the state of a cage mesh; and a user interface configured to generate automatic displays in a three-dimensional environment of the location of the mobile detection device and the state of the mesh.
De esta manera, el sistema de la presente invención permite obtener imágenes, preferentemente en la forma de video, del interior de una jaula de acuicultura mediante la utilización de un dispositivo móvil sumergido. Además, al mismo tiempo el sistema permite determinar con exactitud la ubicación del dispositivo móvil en relación a la misma jaula mediante el análisis de las señales de ultrasonido. De esta forma, el sistema permite la determinación precisa de la ubicación de cada una de las anomalías y objetos que están siendo detectados por el dispositivo móvil. In this way, the system of the present invention makes it possible to obtain images, preferably in the form of video, of the interior of an aquaculture cage through the use of a submerged mobile device. At the same time, the system makes it possible to accurately determine the location of the mobile device in relation to the cage itself by analyzing the ultrasound signals. The system thus allows the precise determination of the location of each of the anomalies and objects detected by the mobile device.
Adicionalmente, la presente invención proporciona un método para la inspección de mallas de cultivo en jaulas de acuicultura, que comprende los pasos de: disponer de un dispositivo de detección móvil que es operable en forma remota y que está configurado para captar imágenes y emitir señales de ultrasonido omnidireccional; recibir las señales de ultrasonido emitidas por el dispositivo de detección móvil mediante una pluralidad de medios receptores de señales de ultrasonido localizados en forma fija en distintas ubicaciones de una jaula; procesar las señales de ultrasonido, mediante un sistema de procesamiento, para determinar la posición del dispositivo de detección móvil dentro de la jaula; procesar las imágenes recibidas, mediante el sistema de procesamiento, para identificar el estado de una malla de la jaula; y generar visualizaciones automáticas en un entorno tridimensional, en una interfaz de usuario, de la ubicación del dispositivo de detección móvil y del estado de la malla. Additionally, the present invention provides a method for the inspection of culture meshes in aquaculture cages, which comprises the steps of: having a mobile detection device that is remotely operable and configured to capture images and emit omnidirectional ultrasound signals; receiving the ultrasound signals emitted by the mobile detection device by means of a plurality of ultrasound signal receiving means fixedly located at different locations in a cage; processing the ultrasound signals, by means of a processing system, to determine the position of the mobile detection device within the cage; processing the received images, by means of the processing system, to identify the state of a mesh of the cage; and generating automatic visualizations, in a three-dimensional environment, in a user interface, of the location of the mobile detection device and the state of the mesh.
Más particularmente, la capacidad de identificar el estado de una malla de una jaula comprende la detección de anomalías mediante el análisis de las imágenes captadas, en donde dichas anomalías pueden incluir la presencia de roturas en la malla, o detectar otras condiciones críticas, como por ejemplo el nivel de suciedad en una zona de la malla. More particularly, the ability to identify the state of a cage mesh comprises the detection of anomalies by analyzing the captured images, where said anomalies may include the presence of breaks in the mesh, or the detection of other critical conditions, such as, for example, the level of fouling in an area of the mesh.
Además, en algunas implementaciones de la invención, la determinación de la posición del dispositivo de detección móvil puede ser complementada mediante el uso de procesamiento de las imágenes detectadas por el mismo dispositivo, con el fin de brindar mayor robustez al sistema de posicionamiento basado en ultrasonido. Esta configuración se basa en un proceso de odometría visual, en donde se lleva a cabo una medición de posiciones mediante el análisis de las imágenes obtenidas por el mismo dispositivo. In addition, in some implementations of the invention, the determination of the position of the mobile detection device can be complemented by processing the images captured by the same device, in order to provide greater robustness to the ultrasound-based positioning system. This configuration is based on a visual odometry process, in which position is measured by analyzing the images obtained by the device itself.
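As a purely illustrative sketch of such a complement (the patent does not specify a fusion rule; the function name, the weighting scheme and the value of `alpha` below are assumptions), a simple complementary filter can blend the absolute ultrasound fix with the relative visual-odometry estimate:

```python
# Hypothetical complementary filter for fusing an absolute ultrasound
# position fix with a visual-odometry estimate. `alpha` close to 1.0
# trusts the ultrasound fix; close to 0.0 trusts the odometry.
def fuse_position(p_ultrasound, p_odometry, alpha=0.7):
    """Blend two (x, y, z) position estimates, in metres."""
    return tuple(alpha * u + (1.0 - alpha) * o
                 for u, o in zip(p_ultrasound, p_odometry))

# Example: ultrasound places the device at (2.0, 3.0, -5.0) while
# odometry, drifting slightly, reports (2.4, 2.8, -5.2).
fused = fuse_position((2.0, 3.0, -5.0), (2.4, 2.8, -5.2))
```

In practice the weight would be adapted over time (e.g. with a Kalman filter), since odometry drift grows between fixes while ultrasound positions are absolute but noisier.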
La visualización en un entorno tridimensional se basa en la utilización de la información obtenida respecto de la determinación de la posición del dispositivo de detección móvil dentro de la jaula, ya sea mediante ultrasonido, procesamiento de imágenes, o una combinación de ambos, procesando esta información con el objeto de proyectar dicha posición en un modelo tridimensional de la jaula. De esta forma, se puede proporcionar a un usuario un apoyo visual del proceso de inspección mediante la visualización del dispositivo de detección móvil dentro del modelo tridimensional, el estado del proceso de inspección, anomalías en la malla y nivel de suciedad (fouling). The visualization in a three-dimensional environment is based on the information obtained regarding the position of the mobile detection device inside the cage, whether through ultrasound, image processing, or a combination of both, processing this information in order to project said position onto a three-dimensional model of the cage. In this way, a user can be provided with visual support for the inspection process by displaying, within the three-dimensional model, the mobile detection device, the status of the inspection process, anomalies in the mesh, and the fouling level.
BREVE DESCRIPCIÓN DE LAS FIGURAS BRIEF DESCRIPTION OF THE FIGURES
La FIG. 1 muestra una representación ejemplar de un sistema para la inspección de mallas de cultivo en jaulas de acuicultura de acuerdo con la presente invención. FIG. 1 shows an exemplary representation of a system for the inspection of culture meshes in aquaculture cages in accordance with the present invention.
La FIG. 2 muestra un diagrama representativo de la arquitectura del sistema de procesamiento de la presente invención. FIG. 2 shows a representative diagram of the architecture of the processing system of the present invention.
La FIG. 3 muestra un diagrama representativo de un método para la detección de roturas en una malla de la jaula de acuerdo con la presente invención. FIG. 3 shows a representative diagram of a method for detecting breaks in a cage mesh in accordance with the present invention.
Las FIGs. 4A y 4B muestran una representación de resultados obtenidos de la detección de roturas en una malla de una jaula. FIGs. 4A and 4B show a representation of results obtained from the detection of breaks in a mesh of a cage.
Las FIGs. 5A a 5D muestran una secuencia de imágenes tomadas de una malla en una jaula submarina, en donde se aprecian niveles progresivos de suciedad en torno a los elementos de la malla. FIGs. 5A to 5D show a sequence of images taken of a mesh in an underwater cage, where progressive levels of fouling can be seen around the elements of the mesh.
Las FIGs. 6A a 6F muestran una implementación ejemplar de la invención en que se lleva a cabo la detección de peces al interior de la jaula. Las FIGs. 6A, 6C y 6E muestran los datos reales captados (ground-truth data), y las FIGs. 6B, 6D y 6F muestran los resultados de la detección mediante la presente invención. FIGs. 6A to 6F show an exemplary implementation of the invention in which detection of fish inside the cage is carried out. FIGs. 6A, 6C and 6E show the captured ground-truth data, and FIGs. 6B, 6D and 6F show the detection results obtained with the present invention.
La FIG. 7 muestra una representación de la ubicación del dispositivo de detección móvil respecto de la jaula en un entorno tridimensional. FIG. 7 shows a representation of the location of the mobile detection device with respect to the cage in a three-dimensional environment.
DESCRIPCIÓN DETALLADA DE LA INVENCIÓN DETAILED DESCRIPTION OF THE INVENTION
De acuerdo con la configuración ejemplar que se muestra en la FIG. 1, la presente invención consiste en un sistema (100) para la inspección de mallas de cultivo en jaulas de acuicultura (110), que permite una monitorización en tiempo real del estado de las mallas de cultivo en su interior, en cuanto a roturas y anomalías, y caracterización del nivel de suciedad. El sistema comprende: un dispositivo de detección móvil (120) que es operable en forma remota y que comprende medios de captación de imágenes y medios de emisión de señales de ultrasonido omnidireccional; una pluralidad de medios receptores de señales de ultrasonido (130) localizados en forma fija en distintas ubicaciones de una jaula (110); un sistema de procesamiento (140) que está configurado para determinar la posición del dispositivo de detección móvil dentro de la jaula mediante el procesamiento de las señales de ultrasonido e identificar el estado de una malla de la jaula; y una interfaz de usuario (144) configurada para generar visualizaciones automáticas en un entorno tridimensional de la ubicación del dispositivo de detección móvil, el estado del proceso de inspección, anomalías en la malla y el nivel de suciedad. According to the exemplary configuration shown in FIG. 1, the present invention consists of a system (100) for the inspection of culture meshes in aquaculture cages (110), which allows real-time monitoring of the state of the culture meshes inside, in terms of breakage and anomalies, and characterization of the fouling level. The system comprises: a mobile detection device (120) that is remotely operable and that includes means for capturing images and means for emitting omnidirectional ultrasound signals; a plurality of ultrasound signal receiving means (130) fixedly located at different locations in a cage (110); a processing system (140) configured to determine the position of the mobile detection device inside the cage by processing the ultrasound signals and to identify the state of a cage mesh; and a user interface (144) configured to generate automatic displays, in a three-dimensional environment, of the location of the mobile detection device, the status of the inspection process, anomalies in the mesh, and the fouling level.
De esta manera, el dispositivo de detección móvil (120) comprende medios de transmisión de señales de ultrasonido que son detectables por la pluralidad de medios receptores de señales de ultrasonido (130), los cuales se ubican preferentemente en torno a la jaula, como se muestra en la FIG. 1. Además, preferentemente los medios de emisión de señales de ultrasonido omnidireccional del dispositivo de detección móvil (120) transmiten señales en forma periódica. In this way, the mobile detection device (120) comprises ultrasound signal transmission means that are detectable by the plurality of ultrasound signal receiving means (130), which are preferably located around the cage, as shown in FIG. 1. In addition, the omnidirectional ultrasound signal emitting means of the mobile detection device (120) preferably transmit signals periodically.
Como se muestra en la FIG. 1 , la señal captada por los medios receptores (130) es transmitida al sistema de procesamiento (140), el cual se encarga de digitalizar las señales y retransmitirlas a una red local (141). Para la transmisión de las señales entre los medios receptores (130) y la red local (141) el sistema comprende preferentemente un sistema LoRa (“Long Range") debido a las distancias significativas que pueden existir entre una o más jaulas y el centro de procesamiento. Además, en configuraciones preferentes, la localización del dispositivo de detección móvil se lleva a cabo mediante una técnica de multilateración, también conocida como “Time Difference of Arrival (TDoA)”, que permite calcular las diferencias de tiempo de transmisión de señales entre el dispositivo de detección móvil (120) y la pluralidad de medios receptores (130). De esta forma, el sistema de procesamiento permite determinar la diferencia de distancias entre los medios de transmisión a bordo del dispositivo de detección móvil (120) y cada medio receptor (130), lo cual permite dar seguimiento a la posición del dispositivo de detección móvil (120) dentro de la jaula en todo momento, para complementar las tareas de inspección. As shown in FIG. 1, the signal captured by the receiving means (130) is transmitted to the processing system (140), which is in charge of digitizing the signals and retransmitting them to a local network (141). 
For the transmission of signals between the receiving means (130) and the local network (141), the system preferably includes a LoRa system ("Long Range") due to the significant distances that can exist between one or more cages and the center of In addition, in preferred configurations, the location of the mobile detection device is carried out using a multilateration technique, also known as "Time Difference of Arrival (TDoA)", which allows calculating the signal transmission time differences between the mobile detection device (120) and the plurality of receiving means (130).In this way, the processing system makes it possible to determine the difference in distances between the transmission means on board the mobile detection device (120) and each means receiver (130), which allows monitoring the position of the mobile detection device (120) inside the cage at all times, to complement the inspection tasks.
Preferentemente, para llevar a cabo una localización basada en TDoA el sistema contempla un mínimo de cuatro receptores (130) para obtener una localización en dos dimensiones y un mínimo de cinco receptores para una localización en tres dimensiones. Una vez transmitidas las señales, el sistema de procesamiento convierte las señales analógicas recibidas en señales digitales mediante el uso de un conversor análogo-digital (ADC), las cuales son posteriormente procesadas, por ejemplo, mediante un computador personal (PC) o en un sistema embebido dedicado, como una Field-Programmable Gate Array (FPGA). El sistema detecta un instante de tiempo en que llega cada señal de ultrasonido al medio receptor (130) y almacena ese tiempo (timestamp), a partir de lo cual se determina la posición del dispositivo de detección móvil, por medio de la técnica de TDoA. Preferably, to carry out a TDoA-based location, the system contemplates a minimum of four receivers (130) for a two-dimensional location and a minimum of five receivers for a three-dimensional location. Once the signals have been transmitted, the processing system converts the received analog signals into digital signals using an analog-to-digital converter (ADC); these are subsequently processed, for example, on a personal computer (PC) or on a dedicated embedded system, such as a Field-Programmable Gate Array (FPGA). The system detects the instant at which each ultrasound signal arrives at the receiving means (130) and stores that timestamp, from which the position of the mobile detection device is determined by means of the TDoA technique.
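The TDoA localisation described above can be illustrated with a minimal, self-contained sketch. This is not the patented implementation: the square receiver layout, the assumed speed of sound in seawater, and the coarse-to-fine grid-search solver are all choices made for the example (a production system would use a closed-form or least-squares TDoA solver).

```python
import itertools
import math

SPEED_OF_SOUND = 1500.0  # m/s, approximate value for seawater (assumption)

# Four fixed receivers: the minimum the text gives for a 2-D location.
RECEIVERS = [(0.0, 0.0), (30.0, 0.0), (30.0, 30.0), (0.0, 30.0)]

def arrival_times(pos):
    """Time of flight from an emitter at `pos` to each receiver."""
    return [math.dist(pos, r) / SPEED_OF_SOUND for r in RECEIVERS]

def tdoa_residual(candidate, tdoas):
    """Squared error between the candidate position's arrival-time
    differences (relative to receiver 0) and the measured TDoAs."""
    t = arrival_times(candidate)
    model = [ti - t[0] for ti in t[1:]]
    return sum((m - d) ** 2 for m, d in zip(model, tdoas))

def locate(tdoas, span=30.0, steps=60, refinements=3):
    """Coarse-to-fine grid search for the emitter position."""
    best, half = (span / 2.0, span / 2.0), span / 2.0
    for _ in range(refinements):
        step = 2.0 * half / steps
        xs = [best[0] - half + i * step for i in range(steps + 1)]
        ys = [best[1] - half + i * step for i in range(steps + 1)]
        best = min(itertools.product(xs, ys),
                   key=lambda c: tdoa_residual(c, tdoas))
        half /= 4.0  # shrink the search window around the best candidate
    return best
```

For an emitter at, say, (12.0, 7.5) m, feeding `locate` the measured differences `[t1 - t0, t2 - t0, t3 - t0]` recovers the position to within the grid resolution.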
En algunas configuraciones de la invención, los medios receptores (130) se pueden complementar con otros sistemas de posicionamiento, como por ejemplo GPS, con el objeto de sincronizar esta información con el procesamiento de las señales de ultrasonido provenientes del dispositivo de detección móvil (120). In some configurations of the invention, the receiving means (130) can be complemented with other positioning systems, such as GPS, in order to synchronize this information with the processing of the ultrasound signals coming from the mobile detection device (120).
De acuerdo con la FIG. 2, preferentemente el sistema de procesamiento (140) está configurado mediante una arquitectura de tres capas: una capa de acceso (142), una capa central (143) y una capa de la interfaz de usuario (144). According to FIG. 2, preferably the processing system (140) is configured using a three-layer architecture: an access layer (142), a core layer (143), and a user interface layer (144).
La capa de acceso (142) es la encargada de comunicarse con los distintos medios de detección de variables del sistema, de modo de recibir los datos obtenidos por los distintos medios de detección y enviar la información a la capa central para su posterior procesamiento. Más particularmente, el sistema comprende medios de detección fijos (132) y medios de detección móviles (123) que se encuentran en el dispositivo de detección móvil (120). Entre los medios de detección fijos (132) se encuentra el uso de los medios receptores de señales de ultrasonido (130) y otros medios de detección alternativos, como por ejemplo el uso de cámaras fijas (131) en distintas ubicaciones de una jaula. Por su parte, los medios de detección móviles (123) incluyen preferentemente medios de captación de imágenes (121) y medios de emisión de señales de ultrasonido omnidireccional (122), ambos a bordo del dispositivo de detección móvil (120). Sin embargo, cabe tener en consideración que otro tipo de medios de detección pueden igualmente formar parte de otras implementaciones alternativas de la invención. The access layer (142) is in charge of communicating with the different means for detecting system variables, so as to receive the data obtained by the different detection means and send the information to the central layer for further processing. More particularly, the system comprises fixed detection means (132) and mobile detection means (123) located on the mobile detection device (120). The fixed detection means (132) include the ultrasound signal receiving means (130) and other alternative detection means, such as fixed cameras (131) at different locations in a cage. For their part, the mobile detection means (123) preferably include image capture means (121) and omnidirectional ultrasound signal emission means (122), both on board the mobile detection device (120). However, it should be taken into consideration that other types of detection means may also form part of other alternative implementations of the invention.
La capa central (143) es la encargada de comunicarse con la capa de acceso (142) para la recepción de los datos obtenidos por los distintos medios de detección (123, 132), y lleva a cabo la administración y procesamiento de toda la información para enviar posteriormente los datos de salida hacia la capa de interfaz de usuario (144). The central layer (143) is in charge of communicating with the access layer (142) for the reception of the data obtained by the different means of detection (123, 132), and carries out the administration and processing of all the information to subsequently send the output data to the user interface layer (144).
La capa de interfaz de usuario (144) es la intermediaria entre el usuario y todo el sistema para la inspección de jaulas (100), desde donde es posible acceder a la información procesada por la capa central (143) e interactuar con los distintos componentes del sistema. Esta capa comprende una o más interfaces de usuario, que pueden estar localizadas en una ubicación remota respecto de la o las jaulas que están siendo inspeccionadas. Las interfaces de usuario pueden comprender el uso de computadores, tableta electrónica, aplicaciones de teléfonos móviles, u otros. The user interface layer (144) is the intermediary between the user and the entire cage inspection system (100), from which it is possible to access the information processed by the central layer (143) and interact with the different components of the system. This layer comprises one or more user interfaces, which may be located at a location remote from the cage(s) being inspected. The user interfaces may comprise computers, electronic tablets, mobile phone applications, or others.
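The three-layer split can be sketched as follows. The class and method names are illustrative assumptions (the patent describes responsibilities, not interfaces); the point of the sketch is that each layer only talks to its immediate neighbour:

```python
# Hypothetical sketch of the access / core / user-interface layering.
class AccessLayer:
    """Talks to fixed and mobile sensing means; yields raw readings."""
    def __init__(self, sensors):
        self.sensors = sensors  # callables returning one raw reading each

    def poll(self):
        return [read() for read in self.sensors]

class CoreLayer:
    """Receives raw data from the access layer and processes it."""
    def __init__(self, access):
        self.access = access

    def snapshot(self):
        # Placeholder processing: tag each reading with its source index.
        return {i: r for i, r in enumerate(self.access.poll())}

class UserInterfaceLayer:
    """Presents processed data; in the real system, a 3-D visualisation."""
    def __init__(self, core):
        self.core = core

    def render(self):
        state = self.core.snapshot()
        return ", ".join(f"sensor {i}: {v}" for i, v in state.items())

# Wiring the layers together with two dummy sensors.
ui = UserInterfaceLayer(CoreLayer(AccessLayer([lambda: "ok", lambda: 42])))
```

Because the UI layer never touches the sensors directly, detection means can be added or swapped in the access layer without changing the other two layers.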
La presente invención proporciona además un método para la inspección de mallas de cultivo en jaulas de acuicultura, que permite una monitorización en tiempo real del estado de jaulas y los peces en su interior, el cual comprende los pasos de: disponer de un dispositivo de detección móvil y operable en forma remota (120), que está configurado para captar imágenes y emitir señales de ultrasonido omnidirectional; recibir las señales de ultrasonido emitidas por el dispositivo de detección móvil (120) mediante una pluralidad de medios receptores de señales de ultrasonido (130) localizados en forma fija en distintas ubicaciones de una jaula (110); procesar las señales de ultrasonido, mediante un sistema de procesamiento, para determinar la posición del dispositivo de detección móvil dentro de la jaula; procesar las imágenes recibidas, mediante el sistema de procesamiento, para identificar el estado de una malla de la jaula; y generar visualizaciones automáticas en un entorno tridimensional, en una interfaz de usuario, de la ubicación del dispositivo de detección móvil, del estado del proceso de inspección, de las anomalías en la malla y del nivel de suciedad. 
The present invention also provides a method for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of cages and the fish inside them, comprising the steps of: having a mobile, remotely operable detection device (120) configured to capture images and emit omnidirectional ultrasound signals; receiving the ultrasound signals emitted by the mobile detection device (120) by means of a plurality of ultrasound signal receiving means (130) fixedly located at different locations in a cage (110); processing the ultrasound signals, by means of a processing system, to determine the position of the mobile detection device within the cage; processing the received images, by means of the processing system, to identify the state of a mesh of the cage; and generating automatic visualizations, in a three-dimensional environment, in a user interface, of the location of the mobile detection device, the status of the inspection process, the anomalies in the mesh and the fouling level.
Preferentemente, el paso de identificar el estado de una malla de la jaula comprende la detección de anomalías, como roturas en la malla, y detectar otras condiciones críticas, como por ejemplo el nivel de suciedad en una zona de la malla. Preferably, the step of identifying the state of a mesh in the cage comprises detecting anomalies, such as breaks in the mesh, and detecting other critical conditions, such as the level of dirt in an area of the mesh.
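One simple way to turn a segmented mesh mask into a fouling indicator is to compare the fraction of the image occupied by mesh (plus adhering dirt) pixels against a clean-net baseline. The sketch below is an illustrative assumption rather than the patented measure; in particular, `clean_mesh_fraction` would have to be calibrated per net type and camera distance.

```python
# Hypothetical fouling-level estimate from a binary segmentation mask.
def fouling_level(mask, clean_mesh_fraction=0.25):
    """mask: 2-D list of 0/1 values, where 1 = mesh (or fouling) pixel.

    Returns a fraction in [0, 1]: 0 for a clean net whose mesh pixels
    occupy `clean_mesh_fraction` of the image, 1 for full occlusion.
    """
    total = sum(len(row) for row in mask)
    mesh = sum(sum(row) for row in mask)
    occupancy = mesh / total
    # Only occupancy beyond the clean baseline counts as fouling.
    extra = max(0.0, occupancy - clean_mesh_fraction)
    return min(1.0, extra / (1.0 - clean_mesh_fraction))
```

With the default baseline, a mask whose occupancy equals 0.25 scores 0.0 (clean) and a fully occluded mask scores 1.0, matching the progression seen in FIGs. 5A to 5D.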
En el caso de la detección de roturas en la malla de la jaula, la invención se basa en la utilización de distintas técnicas de procesamiento de imágenes, basadas en las características geométricas mediante el uso de bibliotecas dedicadas a este tipo de aplicaciones, como por ejemplo OpenCV. Además, preferentemente las técnicas de procesamiento se seleccionan considerando un bajo costo computacional con el objeto de que el procesamiento pueda ser ejecutado con el hardware a bordo del dispositivo móvil de detección. In the case of detecting breaks in the mesh of the cage, the invention is based on the use of different image processing techniques, based on geometric characteristics through the use of libraries dedicated to this type of application, such as OpenCV. Furthermore, preferably the processing techniques are selected considering a low computational cost in order that the processing can be executed with the hardware on board the mobile detection device.
According to FIG. 3, in order to detect breaks in the cage mesh, a break detection method (150) is provided which comprises the steps of: capturing images (151), preferably in the form of video, using the mobile detection device; processing the images to detect and segment the mesh; building a statistical model (156) from the processed images; selecting outliers as anomaly candidates (157); performing spatio-temporal filtering (158); and logging events (159).
More particularly, the image processing step in turn comprises stages of image pre-processing (152), application of an adaptive threshold (153) to adjust the segmentation to varying lighting and contrast conditions, connected component analysis (154), and generation of a binary mask corresponding to the mesh pixels (155).
The purpose of the pre-processing stage (152) is to deal with the effects that underwater conditions produce in the image. Among other things, pre-processing handles abrupt changes in light intensity, the effects caused by the need for artificial light sources, and the high content of floating particles, all of which directly hinder the detection and classification of the pixels belonging to the mesh.
Preferably, image pre-processing (152) comprises resizing the image to 1280x720 pixels in order to reduce the computational cost without significant loss of information. In addition, the images are preferably converted to the CIE-LAB color space so that processing is performed only on the channel containing all the relevant lightness information, since the color components are considered to provide little relevant information in the context of the problem. Subsequently, the adaptive threshold step (153) is applied in order to then generate a binary mask (155) corresponding to the mesh pixels.
Regarding the generation of a binary mask (155), in some configurations of the invention the method comprises the step of inverting the binary mask generated by the adaptive threshold, for those cases in which the background of the image is identified as lighter than the intensity of the mesh pixels. An example of this underwater context is a situation in which the mobile detection device is at a depth beyond ten meters and pointing towards an illuminated surface. To identify this scenario and similar ones, the method measures the proportion between light and dark pixels of the binary mask, which allows the correct mask to be generated. Finally, morphological operation filters are applied to suppress the effects of floating particles and to connect mesh gaps that might otherwise remain open. FIGs. 4A and 4B show examples of results obtained by the above steps.
After the mesh detection and segmentation stage, the method comprises the step of analyzing connected components (154), which determines the area in pixels of each of the mesh holes. Given the standard size of the mesh weave, it is observed that, even under deformation, the hole sizes remain similar, as seen in FIGs. 4A and 4B. Under these constraints, the mean and standard deviation of these areas are computed in order to flag as outliers those areas deviating by more than three standard deviations.
In order to make the detection process robust to the dynamic characteristics of the underwater context, the method preferably comprises the step of filtering possible false positives by associating the spatial information obtained from image processing with a spatio-temporal analysis (158) using video tracking techniques. In this configuration, false positives may be caused by poor lighting, fast camera movements, and very oblique perspectives close to the mesh, for which a final spatio-temporal filtering stage based on tracking over time is added. This means that only those outliers that persist over time, without drastic changes in their image position, are identified as breaks. To visually verify that these conditions are met, each break candidate is assigned an identifier, which is shown on screen and should not vary between frames.
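A minimal sketch of such a persistence filter; the distance and frame-count thresholds are illustrative assumptions:

```python
import numpy as np

class BreakTracker:
    """Sketch of the spatio-temporal filter: an anomaly candidate keeps a
    stable ID while it reappears near the same image position, and is only
    confirmed as a break after min_frames consecutive observations."""

    def __init__(self, max_dist=30.0, min_frames=5):
        self.max_dist, self.min_frames = max_dist, min_frames
        self.tracks = {}        # id -> (position, consecutive hit count)
        self.next_id = 0

    def update(self, centroids):
        """Feed this frame's candidate centroids; returns confirmed IDs."""
        new_tracks = {}
        for c in centroids:
            c = np.asarray(c, float)
            # associate with the nearest existing track, if close enough
            best = min(self.tracks.items(),
                       key=lambda kv: np.linalg.norm(kv[1][0] - c),
                       default=None)
            if best and np.linalg.norm(best[1][0] - c) < self.max_dist:
                tid, (_, hits) = best
                new_tracks[tid] = (c, hits + 1)
            else:
                new_tracks[self.next_id] = (c, 1)   # fresh candidate
                self.next_id += 1
        self.tracks = new_tracks                    # unseen tracks are dropped
        return [tid for tid, (_, hits) in self.tracks.items()
                if hits >= self.min_frames]
```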
In some configurations of the invention, the detection of anomalies in the mesh of a cage includes determining the level of dirt in an area of the mesh. FIGs. 5A to 5D show a sequence of images of a mesh in an underwater cage in which progressive levels of dirt can be seen around the mesh elements: FIG. 5A shows a mesh area with low levels of dirt, while FIG. 5D shows a mesh area with a high degree of dirt. In these images it can be seen that the space between mesh elements is progressively covered by the growth of algae adhering to the mesh elements.
To establish the degree of dirt, the invention preferably addresses the problem with artificial intelligence through a process of transfer learning, which leverages the effectiveness of neural networks previously trained on general tasks and re-trains the last layers of the network with images specific to the problem. In other configurations of the invention, the problem is addressed with shallower Convolutional Neural Networks (CNN), designing and processing the information with different architectures, hyperparameters and data sets in order to generate a model that can predict the level of dirt as accurately as possible.
Additionally, in some alternative configurations, the method includes the detection of fish inside the cage, which comprises determining the location of the fish and their subsequent segmentation from a set of images obtained by the mobile detection device. For this purpose, the method preferably uses artificial intelligence to process the images obtained, and more preferably neural networks. In this configuration, the Mask R-CNN algorithm, for example, can be implemented; it corresponds to a neural network capable of segmenting objects learned from a data set containing the objects of interest. The purpose of detecting and segmenting fish in the cage, in the context of this invention, is to mask the presence of fish so that they do not significantly affect the analysis and characterization of the state of the mesh. However, fish detection is also envisioned as a fundamental element for biomass estimation processes.
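Once instance masks for the fish are available (e.g. from Mask R-CNN), the masking step itself is straightforward; a sketch:

```python
import numpy as np

def exclude_fish(mesh_mask, fish_masks):
    """Sketch of the masking step: pixels covered by any detected fish
    instance are removed from the binary mesh mask so that fish bodies do
    not contaminate the hole-area statistics used for break detection."""
    combined = np.zeros_like(mesh_mask, dtype=bool)
    for m in fish_masks:                 # per-instance boolean/0-1 masks
        combined |= m.astype(bool)
    out = mesh_mask.copy()               # leave the input mask untouched
    out[combined] = 0
    return out
```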
Additionally, the method can be complemented with optimization tools, such as a Bayesian optimizer, in order to automatically choose the hyper-parameters that best minimize the segmentation error. In this configuration, the optimizer works by generating a probabilistic map that determines in which part of the hyper-parameter space it is most convenient to search for new candidates, such that the variance of the model is minimized. As the iterations increase, the searches become more precise and specialized, because the process looks for combinations with the highest probability of maximizing performance.
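A self-contained illustration of the Bayesian-optimization loop described above, here over a single hyper-parameter, using a Gaussian-process surrogate and the expected-improvement criterion; these are common choices, not ones specified by the patent:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def bayes_opt(objective, bounds, n_init=5, n_iter=20, seed=0):
    """Minimal 1-D Bayesian optimization: a Gaussian process models the
    objective, and expected improvement decides where to evaluate next,
    concentrating the search as iterations accumulate."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, n_init).reshape(-1, 1)      # random warm-up
    y = np.array([objective(float(x)) for x in X.ravel()])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    cand = np.linspace(lo, hi, 200).reshape(-1, 1)      # candidate grid
    for _ in range(n_iter):
        gp.fit(X, y)
        mu, sigma = gp.predict(cand, return_std=True)
        sigma = np.maximum(sigma, 1e-9)
        imp = y.min() - mu                               # we are minimizing
        z = imp / sigma
        ei = imp * norm.cdf(z) + sigma * norm.pdf(z)     # expected improvement
        x_next = float(cand[int(np.argmax(ei)), 0])
        X = np.vstack([X, [[x_next]]])
        y = np.append(y, objective(x_next))
    i = int(np.argmin(y))
    return float(X[i, 0]), float(y[i])
```

In practice `objective` would train and validate the segmentation pipeline for a given hyper-parameter setting and return the validation error.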
FIGs. 6A to 6F show the result of implementing the previously described steps: FIGs. 6A, 6C and 6E show reference images taken at different times inside a fish cage, while FIGs. 6B, 6D and 6F show the results obtained by the detection and segmentation processes described above, with erroneously identified objects shown in light gray and true positives in dark gray.
In some alternative configurations of the invention, the determination of the position of the mobile detection device (120) can be complemented with processing of the images captured by the device itself, in order to make the ultrasound-based positioning system more robust. In this configuration, visual features of the meshes in the culture cages are tracked, integrating the on-board sensors of the mobile detection device. To this end, a vision algorithm based on Simultaneous Localization And Mapping (SLAM) is preferably implemented for the complementary estimation of the position of the mobile detection device from the visual features of the meshes.
These additional steps of the method are based on a visual odometry process, in which position is estimated by analyzing the images obtained by the device itself. In this configuration, processing the obtained images makes it possible to find features of the culture cage in the underwater scene, so that they can be tracked as accurately as possible between frames, with a known reference point in the cage. For this purpose, two alternative implementations are considered: one based on the extraction of local descriptors, and a direct method. The method based on local descriptors uses a feature extraction technique, such as SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), ORB (Oriented FAST and Rotated BRIEF), BRISK (Binary Robust Invariant Scalable Keypoints) or AKAZE (Accelerated-KAZE Features), and tracks these features between frames to determine the motion, both rotational and translational, of the camera.
In this way, the aforementioned steps are applied to the image streams in the environment of the mobile detection device, and a comparison is carried out using qualitative criteria, such as computation time and how consistent the feature tracking is, in order to determine which of the methods is best for a specific application.
As shown in FIG. 7, the method further comprises visualizing the mobile detection device (120) in a three-dimensional environment. The method uses the information previously obtained regarding the position of the mobile detection device relative to the cage, whether by processing the ultrasound signals, by image analysis, or by a combination of both, and processes this information in order to project the position of the device onto a three-dimensional model of the cage. In addition, based on the information about the device's perspective, the extracted visual information is projected in terms of inspection coverage, the position of breaks in the mesh, and the characterization of the level of dirt. In this way it is possible to visually support the inspection process through the visualization, allowing traceability of the process, modeled within a three-dimensional virtual environment.
Finally, it should be noted that the dimensions, the choice of materials, and specific aspects of the preferred configurations described above may vary or be modified based on specific operational requirements. Accordingly, the description of the specific configurations above is not intended to be limiting, and such variations and/or modifications fall within the spirit and scope of the invention.

Claims

1. A system (100) for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of the culture meshes, CHARACTERIZED in that it comprises: a mobile detection device (120) that is remotely operable and comprises image capture means and omnidirectional ultrasound signal emission means; a plurality of ultrasound signal receiving means (130) fixedly located at different locations in a cage (110), which receive the emitted omnidirectional ultrasound signals; a processing system (140) that determines the position of the mobile detection device within the cage by processing the omnidirectional ultrasound signals and identifies the state of a mesh of the cage from the captured images; and a user interface (144) configured to generate automatic visualizations in a three-dimensional environment of the location of the mobile detection device, the status of the inspection process, anomalies in the mesh and the level of dirt.

2. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that the omnidirectional ultrasound signal emission means transmit signals periodically.

3. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that the processing system (140) digitizes the signals and retransmits them to a local network (141).

4. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that, for the transmission of the signals between the receiving means (130) and the local network (141), the system comprises a LoRa ("Long Range") system.
5. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that the localization of the mobile detection device (120) is carried out by means of a multilateration technique that computes the differences in signal transmission time between the mobile detection device (120) and the plurality of receiving means (130), allowing the position of the mobile detection device (120) within the cage to be tracked.

6. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 5, CHARACTERIZED in that, for localization, the system comprises a minimum of four receivers (130) to obtain a two-dimensional localization and a minimum of five receivers for a three-dimensional localization.

7. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that it further comprises other positioning systems, such as GPS, in order to synchronize this information with the processing of the ultrasound signals coming from the mobile detection device (120).

8. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that it comprises fixed detection means (132) located at different places in the cage, and mobile detection means (123) located on the mobile detection device (120).
9. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 8, CHARACTERIZED in that the fixed detection means (132) include ultrasound signal receiving means (130) and others, such as fixed cameras (131), at different locations in the cage, and the mobile detection means (123) include image capture means (121) and omnidirectional ultrasound signal emission means (122) on board the mobile detection device (120).

10. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 9, CHARACTERIZED in that the processing system (140) is configured with a three-layer architecture: an access layer (142), a core layer (143) and a user interface layer (144).

11. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 10, CHARACTERIZED in that the access layer (142) communicates with the various fixed and mobile detection means, so as to receive the data obtained and send the information to the core layer (143) for subsequent processing.

12. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 11, CHARACTERIZED in that the core layer (143) communicates with the access layer (142) to receive the data obtained, and manages and processes all the information in order to subsequently send the output data to the user interface layer (144).

13. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 12, CHARACTERIZED in that the user interface layer (144) comprises one or more devices that mediate between a user and the whole system, allowing access to the information processed by the core layer (143) and interaction with the various components.
14. A method for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of cages, CHARACTERIZED in that it comprises the steps of: providing a remotely operable mobile detection device (120) configured to capture images and emit omnidirectional ultrasound signals; receiving the omnidirectional ultrasound signals emitted by the mobile detection device (120) by means of a plurality of ultrasound signal receiving means (130) fixedly located at different locations in a cage (110); determining, by means of a processing system, the position of the mobile detection device within the cage from the omnidirectional ultrasound signals; identifying, by means of the processing system, the state of a mesh of the cage from the received images; and generating automatic visualizations in a three-dimensional environment, in a user interface, of the location of the mobile detection device, the status of the inspection process, the anomalies in the mesh and the level of dirt.

15. A method for the inspection of culture meshes in aquaculture cages according to claim 14, CHARACTERIZED in that the step of identifying the state of a mesh of the cage comprises detecting anomalies, such as breaks in the mesh, and detecting the level of dirt in areas of the mesh.
16. A method for the inspection of culture meshes in aquaculture cages according to claim 15, CHARACTERIZED in that the detection of breaks in the mesh of the cage comprises the steps of: capturing images (151) using the mobile detection device; processing the images to detect and segment the mesh; building a statistical model (156) from the processed images; selecting outliers as anomaly candidates (157); performing spatio-temporal filtering (158); and logging events (159).

17. A method for the inspection of culture meshes in aquaculture cages according to claim 16, CHARACTERIZED in that the step of processing the images comprises stages of: image pre-processing (152), application of an adaptive threshold (153), connected component analysis (154) and generation of a binary mask corresponding to the mesh pixels (155).

18. A method for the inspection of culture meshes in aquaculture cages according to claim 17, CHARACTERIZED in that the image pre-processing (152) comprises resizing the image and converting it to the CIE-LAB color space, in order to carry out the processing with a channel containing all the relevant light perception information.

19. A method for the inspection of culture meshes in aquaculture cages according to claim 18, CHARACTERIZED in that, after the image pre-processing, the step of applying an adaptive threshold (153) is carried out to then generate a binary mask (155).
20. A method for the inspection of culture meshes in aquaculture cages according to claim 19, CHARACTERIZED in that it further comprises the step of inverting the binary mask generated by the adaptive threshold, for those cases in which the background of the image is identified as lighter than the intensity of the mesh pixels, which comprises measuring a proportion between light and dark pixels of the binary mask, and subsequently applying morphological operation filters.

21. A method for the inspection of culture meshes in aquaculture cages according to claim 17, CHARACTERIZED in that the step of analyzing connected components (154) comprises determining the area in pixels of each of the mesh holes, and computing a mean and a standard deviation of these areas, in order to detect surfaces with outlier values as a function of the standard deviations.

22. A method for the inspection of culture meshes in aquaculture cages according to claim 16, CHARACTERIZED in that it further comprises the step of filtering possible false positives by associating spatial information obtained from the image processing with a spatio-temporal analysis (158) by means of video tracking, in which only those outliers that persist over time, without drastically changing their position in the image, are identified as breaks.

23. A method for the inspection of culture meshes in aquaculture cages according to claim 15, CHARACTERIZED in that the detection of the level of dirt in areas of the mesh comprises the use of artificial intelligence, through a transfer learning process.

24. A method for the inspection of culture meshes in aquaculture cages according to claim 15, CHARACTERIZED in that the detection of the level of dirt in areas of the mesh comprises the use of Convolutional Neural Networks (CNN).
Un método para la inspección de mallas de cultivo en jaulas de acuicultura de acuerdo con la reivindicación 14, CARACTERIZADO porque además incluye la detección de peces al interior de la jaula, que comprende determinar una localización de peces y su posterior segmentación a partir de imágenes obtenidas desde el dispositivo de detección móvil. Un método para la inspección de mallas de cultivo en jaulas de acuicultura de acuerdo con la reivindicación 25, CARACTERIZADO porque la detección de peces al interior de la jaula comprende la utilización de inteligencia artificial 21 para el procesamiento de las imágenes obtenidas y/o el uso de redes neuronales. Un método para la inspección de mallas de cultivo en jaulas de acuicultura de acuerdo con la reivindicación 14, CARACTERIZADO porque además comprende una determinación complementaria de la posición del dispositivo de detección móvil (120) mediante el procesamiento de las imágenes detectadas por el mismo dispositivo, que incluye un análisis y seguimiento de características visuales de las mallas en las jaulas de cultivo. Un método para la inspección de mallas de cultivo en jaulas de acuicultura de acuerdo con la reivindicación 27, CARACTERIZADO porque la determinación complementaria comprende llevar a cabo una identificación de posiciones de características de la malla de la jaula de cultivo, mediante el análisis de las imágenes obtenidas por el mismo dispositivo, efectuando un seguimiento entre cuadros con un punto de referencia conocido dentro de la jaula. 
Un método para la inspección de mallas de cultivo en jaulas de acuicultura de acuerdo con la reivindicación 14, CARACTERIZADO porque la visualization en un entorno tridimensional del dispositivo de detección móvil (120) comprende la utilización de la información obtenida previamente respecto de la posición del dispositivo de detección móvil en relación a la jaula, y procesar esta información para proyectar la posición del dispositivo en un modelo tridimensional de la jaula. Un método para la inspección de mallas de cultivo en jaulas de acuicultura de acuerdo con la reivindicación 29, CARACTERIZADO porque en base a la información de la perspectiva del dispositivo, el método comprende la proyección de la información visual extraída para visualizar la posición de roturas en la malla y la caracterización del nivel de suciedad. CLAIMS A system (100) for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of the culture meshes, CHARACTERIZED in that it comprises: a mobile detection device (120) that is operable in remote and comprising image capture means and omnidirectional ultrasound signal emission means; a plurality of ultrasound signal receiving means (130) fixedly located at different locations in a cage (110), which receive the emitted omnidirectional ultrasound signals; a processing system (140) that determines the position of the mobile detection device inside the cage by processing the omnidirectional ultrasound signals and identifies the state of a cage mesh by means of the captured images; and a user interface (144) configured to generate automatic displays in a three-dimensional environment of the location of the mobile detection device, the status of the inspection process, anomalies in the mesh, and the level of soiling. 
2. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that the omnidirectional ultrasound signal emission means transmit signals periodically.

3. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that the processing system (140) is responsible for digitizing the signals and retransmitting them to a local network (141).

4. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that, for the transmission of the signals between the receiving means (130) and the local network (141), the system comprises a LoRa ("Long Range") system.

5. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that the location of the mobile detection device (120) is carried out using a multilateration technique that calculates the differences in signal transmission time between the mobile detection device (120) and the plurality of receiving means (130), allowing the position of the mobile detection device (120) inside the cage to be tracked.

6. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 5, CHARACTERIZED in that, for the location, the system comprises a minimum of four receivers (130) to obtain a two-dimensional location and a minimum of five receivers for a three-dimensional location.

7. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that it also includes other positioning systems, such as GPS, in order to synchronize this information with the processing of the ultrasound signals from the mobile detection device (120).
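The multilateration of claims 5 and 6 can be illustrated with a minimal sketch. This is not the patented implementation: the receiver layout, the speed of sound and the brute-force least-squares solver are assumptions chosen for clarity, and the time differences are measured relative to the first receiver.

```python
import itertools
import math

# Hypothetical square receiver layout (metres); coordinates are illustrative only.
RECEIVERS = [(0.0, 0.0), (20.0, 0.0), (20.0, 20.0), (0.0, 20.0)]
SPEED_OF_SOUND = 1500.0  # approximate speed of sound in seawater, m/s

def tdoa_measurements(emitter):
    """Time-difference-of-arrival at each receiver, relative to receiver 0."""
    t = [math.dist(emitter, r) / SPEED_OF_SOUND for r in RECEIVERS]
    return [ti - t[0] for ti in t]

def locate(tdoas, step=0.1, bounds=(0.0, 20.0)):
    """Brute-force least-squares fit of the emitter position to the TDOAs."""
    best, best_err = None, float("inf")
    xs = [bounds[0] + i * step for i in range(int((bounds[1] - bounds[0]) / step) + 1)]
    for x, y in itertools.product(xs, xs):
        pred = tdoa_measurements((x, y))
        err = sum((p - m) ** 2 for p, m in zip(pred, tdoas))
        if err < best_err:
            best, best_err = (x, y), err
    return best

true_pos = (7.3, 12.8)  # simulated device position
est = locate(tdoa_measurements(true_pos))
```

Note that four receivers suffice to recover a two-dimensional position, consistent with claim 6; extending the fit to three dimensions would require a fifth receiver.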
8. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 1, CHARACTERIZED in that it comprises fixed detection means (132) located in different parts of the cage, and mobile detection means (123) located on the mobile detection device (120).

9. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 8, CHARACTERIZED in that the fixed detection means (132) include the ultrasound signal receiving means (130) and others, such as fixed cameras (131), in different locations of the cage, and the mobile detection means (123) include image capture means (121) and omnidirectional ultrasound signal emission means (122) on board the mobile detection device (120).

10. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 9, CHARACTERIZED in that the processing system (140) is configured using a three-layer architecture: an access layer (142), a core layer (143) and a user interface layer (144).

11. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 10, CHARACTERIZED in that the access layer (142) communicates with the different fixed and mobile detection means, in order to receive the data obtained and send the information to the core layer (143) for further processing.

12. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 11, CHARACTERIZED in that the core layer (143) communicates with the access layer (142) to receive the data obtained and carries out the administration and processing of all the information, subsequently sending the output data to the user interface layer (144).
13. A system (100) for the inspection of culture meshes in aquaculture cages according to claim 12, CHARACTERIZED in that the user interface layer (144) comprises one or more devices that mediate between a user and the entire system, allowing access to the information processed by the core layer (143) and interaction with the different components.

14. A method for the inspection of culture meshes in aquaculture cages, which allows real-time monitoring of the state of the cages, CHARACTERIZED in that it includes the steps of: providing a remotely operable mobile detection device (120), configured to capture images and emit omnidirectional ultrasound signals; receiving the omnidirectional ultrasound signals emitted by the mobile detection device (120) by means of a plurality of ultrasound signal receiving means (130) fixedly located at different locations in a cage (110); determining, by means of a processing system, the position of the mobile detection device inside the cage from the omnidirectional ultrasound signals; identifying, by means of the processing system, the state of a mesh of the cage from the captured images; and generating automatic visualizations, in a user interface, in a three-dimensional environment, of the location of the mobile detection device, the status of the inspection process, the anomalies in the mesh and the level of fouling.

15. A method for the inspection of culture meshes in aquaculture cages according to claim 14, CHARACTERIZED in that the step of identifying the state of a cage mesh comprises detecting anomalies, such as breaks in the mesh, and detecting the level of fouling in areas of the mesh.
16. A method for the inspection of culture meshes in aquaculture cages according to claim 15, CHARACTERIZED in that the detection of breaks in the cage mesh comprises the steps of: capturing images (151) using the mobile detection device; processing the images to carry out a detection and segmentation of the mesh; building a statistical model (156) from the processed images; selecting outliers as anomaly candidates (157); performing spatio-temporal filtering (158); and logging events (159).

17. A method for the inspection of culture meshes in aquaculture cages according to claim 16, CHARACTERIZED in that the step of processing the images comprises the stages of: pre-processing of the images (152), application of an adaptive threshold (153), analysis of connected components (154) and generation of a binary mask corresponding to the pixels of the mesh (155).

18. A method for the inspection of culture meshes in aquaculture cages according to claim 17, CHARACTERIZED in that the pre-processing of the images (152) comprises adjusting the size of the image and converting it to the CIE-LAB color space, so that processing is carried out on a single channel containing all the relevant light-perception information.

19. A method for the inspection of culture meshes in aquaculture cages according to claim 18, CHARACTERIZED in that, after the pre-processing of the images, the step of applying an adaptive threshold (153) is carried out in order to then generate a binary mask (155).

20. A method for the inspection of culture meshes in aquaculture cages according to claim 19, CHARACTERIZED in that it also comprises the step of inverting the binary mask generated by the adaptive threshold for those cases in which the background of the image is identified as lighter than the intensity of the mesh pixels, which involves measuring the ratio between light and dark pixels of the binary mask, and subsequently applying morphological operation filters.
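Claims 17 to 20 describe an adaptive-threshold segmentation with automatic polarity inversion. A minimal NumPy sketch of that idea follows — illustrative only: the window size, offset and inversion ratio are assumed values, and the CIE-LAB conversion and morphological filtering steps of the claims are omitted.

```python
import numpy as np

def local_mean(gray, block):
    """Local mean over a block x block window, via an integral image."""
    pad = block // 2
    p = np.pad(gray.astype(np.float64), pad, mode="edge")
    ii = np.pad(p, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    h, w = gray.shape
    s = (ii[block:block + h, block:block + w]
         - ii[:h, block:block + w]
         - ii[block:block + h, :w]
         + ii[:h, :w])
    return s / (block * block)

def mesh_mask(gray, block=15, c=8):
    """Binary mask of mesh pixels: a pixel is mesh when it is darker than
    its local neighbourhood mean minus an offset (adaptive threshold)."""
    mask = gray.astype(np.float64) < local_mean(gray, block) - c
    # Auto-inversion (claim 20 idea): the mesh threads should be a minority
    # of the pixels, so if the mask covers most of the image the polarity
    # was reversed (background darker than mesh) and we invert it.
    if mask.mean() > 0.5:
        mask = ~mask
    return mask
```

On a synthetic image of dark threads over a bright background, the mask recovers exactly the thread pixels; a real frame would additionally need the resizing, CIE-LAB lightness extraction and morphological clean-up that the claims describe.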
21. A method for the inspection of culture meshes in aquaculture cages according to claim 17, CHARACTERIZED in that the step of analyzing connected components (154) comprises determining the area in pixels corresponding to each of the holes in the mesh, and calculating an average and a standard deviation of these areas, in order to detect holes with outlying areas as a function of the standard deviations.

22. A method for the inspection of culture meshes in aquaculture cages according to claim 16, CHARACTERIZED in that it also comprises the step of filtering possible false positives by associating the spatial information obtained by the image processing with a spatio-temporal analysis (158) based on video tracking, where only those outliers that persist over time, without drastically changing their position in the image, are identified as breaks.

23. A method for the inspection of culture meshes in aquaculture cages according to claim 15, CHARACTERIZED in that the detection of the level of fouling in areas of the mesh comprises the use of artificial intelligence, through a process of "transfer learning".

24. A method for the inspection of culture meshes in aquaculture cages according to claim 15, CHARACTERIZED in that the detection of the level of fouling in areas of the mesh comprises the use of Convolutional Neural Networks (CNN).

25. A method for the inspection of culture meshes in aquaculture cages according to claim 14, CHARACTERIZED in that it also includes the detection of fish inside the cage, which comprises determining the location of fish and their subsequent segmentation from images obtained by the mobile detection device.

26. A method for the inspection of culture meshes in aquaculture cages according to claim 25, CHARACTERIZED in that the detection of fish inside the cage comprises the use of artificial intelligence for the processing of the images obtained and/or the use of neural networks.
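The connected-component statistics of claim 21 can be sketched as follows — an illustrative flood-fill labelling of the mesh holes and a mean/standard-deviation outlier test. The threshold k and the 4-connectivity are assumed parameters, not taken from the patent.

```python
from collections import deque

import numpy as np

def hole_areas(mask):
    """Areas (in pixels) of the holes (False regions) of a binary mesh mask,
    found by 4-connected component labelling with a BFS flood fill."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    areas = []
    for i in range(h):
        for j in range(w):
            if mask[i, j] or seen[i, j]:
                continue
            area, q = 0, deque([(i, j)])
            seen[i, j] = True
            while q:
                y, x = q.popleft()
                area += 1
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] and not seen[ny, nx]:
                        seen[ny, nx] = True
                        q.append((ny, nx))
            areas.append(area)
    return areas

def outlier_holes(areas, k=3.0):
    """Flag hole areas more than k standard deviations from the mean:
    an intact mesh has near-uniform holes, a break is an outlying area."""
    mu, sigma = np.mean(areas), np.std(areas)
    return [a for a in areas if sigma > 0 and abs(a - mu) > k * sigma]
```

Per claim 22, a candidate flagged this way would only be reported as a break after it persists across frames at a stable image position; that temporal filter is not sketched here.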
27. A method for the inspection of culture meshes in aquaculture cages according to claim 14, CHARACTERIZED in that it also comprises a complementary determination of the position of the mobile detection device (120) by processing the images detected by the same device, which includes an analysis and tracking of visual features of the meshes in the culture cages.

28. A method for the inspection of culture meshes in aquaculture cages according to claim 27, CHARACTERIZED in that the complementary determination comprises identifying the positions of features of the culture cage mesh by analyzing the images obtained by the same device, tracking them between frames with respect to a known reference point inside the cage.

29. A method for the inspection of culture meshes in aquaculture cages according to claim 14, CHARACTERIZED in that the visualization of the mobile detection device (120) in a three-dimensional environment comprises using the information previously obtained regarding the position of the mobile detection device relative to the cage, and processing this information to project the position of the device onto a three-dimensional model of the cage.

30. A method for the inspection of culture meshes in aquaculture cages according to claim 29, CHARACTERIZED in that, based on the information on the device's perspective, the method comprises projecting the extracted visual information in order to visualize the position of breaks in the mesh and the characterization of the fouling level.
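Claims 29 and 30 project the device position and its detections onto a three-dimensional cage model. A toy sketch of that projection, under the assumption of a cylindrical net (the radius and depth values are invented for illustration):

```python
import math

# Illustrative cylindrical cage model; dimensions in metres are assumptions.
CAGE_RADIUS, CAGE_DEPTH = 12.5, 15.0

def project_to_net(pos):
    """Project an estimated device position (x, y, z) onto the cylindrical
    net wall, returning the azimuth around the cage axis, the clamped depth,
    and the corresponding 3D point on the net — the coordinates at which a
    break or fouling annotation would be pinned on the cage model."""
    x, y, z = pos
    angle = math.atan2(y, x)               # azimuth around the cage axis
    depth = min(max(-z, 0.0), CAGE_DEPTH)  # clamp to the net's vertical extent
    wall_point = (CAGE_RADIUS * math.cos(angle),
                  CAGE_RADIUS * math.sin(angle),
                  -depth)
    return angle, depth, wall_point
```

A visualization layer could then texture the net wall at `wall_point` with the detected break position or the characterized fouling level for that mesh region.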
PCT/CL2022/050128 2021-12-16 2022-12-15 System for inspecting farming netting in aquaculture cages WO2023108307A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CL3370-2021 2021-12-16
CL2021003370A CL2021003370A1 (en) 2021-12-16 2021-12-16 System for the inspection of culture meshes in aquaculture cages

Publications (1)

Publication Number Publication Date
WO2023108307A1 (en) 2023-06-22

Family

ID=81535968

Country Status (2)

Country Link
CL (1) CL2021003370A1 (en)
WO (1) WO2023108307A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002084217A2 (en) * 2001-04-11 2002-10-24 Hafmynd Ehf. Underwater inspection
WO2018147746A1 (en) * 2017-02-08 2018-08-16 Hallgeir Solberg Apparatus for removal and collection of fouling from a dived structure and a method for using the apparatus
KR102234697B1 (en) * 2018-11-02 2021-04-02 광주과학기술원 Fish net surveillance apparatus using Remotely-Operated underwater Vehicle, controlling method of the same
KR20210104250A (en) * 2020-02-17 2021-08-25 (주)아이로 Position Recognition System of Underwater Moving Object and Its Operation Method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BETANCOURT GONZALEZ JOHN ALBERTO: "NetInspector: Visión por computador para la detección de daños y defectos en redes de piscicultura", THESIS, PONTIFICIA UNIVERSIDAD JAVERIANA, BOGOTA, COLOMBIA, 1 January 2016 (2016-01-01), XP093076673, Retrieved from the Internet <URL:http://hdl.handle.net/10554/21415> [retrieved on 20230828] *
SALVI, J ET AL.: "Visual SLAM for underwater vehicles using video velocity log and natural landmarks", OCEANS 2008, 2008, pages 1 - 6, XP031482881, Retrieved from the Internet <URL:https://www.researchgate.net/publication/224556799> [retrieved on 20230328], DOI: 10.1109/OCEANS.2008.5151887 *

Also Published As

Publication number Publication date
CL2021003370A1 (en) 2022-04-08

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22905610

Country of ref document: EP

Kind code of ref document: A1