WO2024108318A1 - System and method for identifying pelagic species outside the water - Google Patents

System and method for identifying pelagic species outside the water

Info

Publication number
WO2024108318A1
WO2024108318A1 (application PCT/CL2023/050100)
Authority
WO
WIPO (PCT)
Prior art keywords
species
pelagic
pelagic species
illuminated
database
Prior art date
Application number
PCT/CL2023/050100
Other languages
Spanish (es)
French (fr)
Inventor
Mauricio URBINA FONERON
Sergio Neftalí TORRES INOSTROZA
Ariel Patricio TORRES SÁNCHEZ
Vincenzo Alexo CARO FUENTES
Danny Alberto LUARTE CANTO
Sebastián Eugenio GODOY MEDEL
Jorge Edgardo PEZOA NUÑEZ
Original Assignee
Universidad De Concepcion
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universidad De Concepcion filed Critical Universidad De Concepcion
Publication of WO2024108318A1 publication Critical patent/WO2024108318A1/en

Classifications

    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00Culture of aquatic animals
    • A01K61/90Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02Food
    • G01N33/12Meat; Fish
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/02Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor using a plurality of sample containers moved by a conveyor system past one or more treatment or analysis stations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/02Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor using a plurality of sample containers moved by a conveyor system past one or more treatment or analysis stations
    • G01N35/04Details of the conveyor system
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/168Segmentation; Edge detection involving transform domain methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding

Definitions

  • the present invention is related to the fishing industry, more particularly, it is related to a system composed of an apparatus and a method that allows digitally acquiring, measuring, counting, identifying and classifying the different species that are unloaded from fishing vessels.
  • the analysis carried out on landings usually consists of the in-situ analysis of less than a dozen fishing samples taken in buckets of approx. 3 liters.
  • the samples are analyzed manually to count the individuals of each species within the buckets; once all the resources have been weighed, the percentage contribution of each species with respect to the total weight of the sample is calculated, and this value is assumed to be the distribution of species being landed.
  • This procedure is imprecise, not automated, lacks traceability, and does not allow the associated error to be estimated, since the catch in the volume analyzed from the buckets is not necessarily representative of the total catch of each landing.
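As a minimal sketch (with hypothetical bucket weights, since the document gives no worked numbers), the percentage-contribution calculation performed by the inspectors can be expressed as:

```python
def species_distribution(weights_by_species):
    """Percentage contribution of each species relative to total weight."""
    total = sum(weights_by_species.values())
    return {sp: 100.0 * w / total for sp, w in weights_by_species.items()}

# Hypothetical aggregated weights (kg) from a handful of 3-liter buckets.
buckets_total = {"anchovy": 7.9, "sardine": 2.1}
dist = species_distribution(buckets_total)
print(dist)  # anchovy ~ 79%, sardine ~ 21%
```

The weakness noted above is visible here: the result is taken as the distribution of the whole landing, although it is computed from a few liters of sample.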
  • the publication “Fish species recognition using computer vision and a neural network” by Storbeck et al. describes that fish species recognition systems have been developed based on a computer vision system and neural networks.
  • the vision system measures a number of characteristics, including width and height at various locations along the body of the fish as it is transported by a belt.
  • the model used for species classification is based on a traditional neural network architecture with backpropagation, which integrates length and width information from various areas of the fish as its input features.
  • the measurement of length and width is carried out based on the distortion measurement provided by a helium-neon laser integrated into the vision system (Frank Storbeck, Berent Daan, “Fish species recognition using computer vision and a neural network”, Fisheries Research, Volume 51, Issue 1, April 2001, Pages 11-15).
  • Patent application WO2008/056988(A1) describes a method and a system for the automatic classification of fish underwater, comprising a camera system that takes images of fish and an analysis unit arranged to receive and analyze each image to find the outline of each fish. Furthermore, this method involves dividing the fish into a certain number of segments and analyzing the color in each segment, where the characteristic features are compared with the fish data in a knowledge database for the classification of fish with the help of discriminant analysis.
  • however, this application uses basic analytical methods, such as discriminant analysis, which lag in performance behind current complex methods that use deep learning. It also makes no use of additional information such as that provided by specific spectral bands, and it is focused on underwater measurements.
  • Patent application US2012/306644(A1) describes a fish identification device consisting of a housing, a touch screen that allows the user to view and operate functions, an internal processor operated by software, an internal GPS, a camera with a lens and flash that photographs a fish, an internal memory and an operation software.
  • the device allows the user to determine local fishing regulations, photograph a fish, identify it, and then save the information to internal memory for later use.
  • this technology requires the manual use of a device to identify individual fish, therefore, it is not applicable to the problem of monitoring landings.
  • there are also phone applications, such as Google Lens, that use the phone's camera to recognize various objects when searching with the Google search engine.
  • This type of application can be used to recognize pelagic species, but the results it provides are not consistent because they are strongly affected by factors such as luminosity, camera focus, simultaneity of objects, among others.
  • a technology is required that allows analyzing the fish taken from fishing vessels, taking a greater number of samples in a shorter time, in addition to maintaining more objective and consistent comparison standards and that allows generating traceability and registration of the species analyzed with better quantification of the results.
  • the present technology refers to an apparatus, a method and a system for the identification and classification of pelagic species at the fishing landing point, which more particularly consists of an automated multi-camera vision apparatus, a method for the counting, recording, classification and biometric analysis of the catch, and a protocol for integrating both into a single system.
  • This system is composed of a device that includes a gantry with embedded computer vision technology for multimodal digital video acquisition and a method based on machine learning and image processing and analysis programs to discriminate different species of fish such as, for example, but not limited to, small pelagic species and their accompanying fauna in artisanal fishing and other forms of industrial fishing.
  • Figure 1 shows an internal diagram of the device and the system, whose main components are: a gantry where the acquisition system (100) is installed, containing an array of cameras (101) with optical spectral filters, and a remote communication router (102) that packages the information acquired by the module and transmits it securely over the Internet to the user.
  • the acquisition requires a controlled lighting system (103) and ventilation (104) for correct operation.
  • the acquired and transmitted information is received in an image processing system (200) where the method of identification, counting and classification of the samples is implemented.
  • Figure 2 shows a diagram of the implementation of the method within the system (200), whose main components are a module for receiving the images transmitted from a gantry (201) and a deep convolutional neural network that identifies and classifies the species in the captured image (202). A false-identification discard algorithm (203) automatically eliminates false positives from the sample; the detections that remain are recorded by a system (204) and stored in parallel (205). Finally, the video feed with the overlay of the identified relevant data is displayed to the user by a user interface (206).
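A minimal sketch of this processing chain follows. The confidence-threshold rule used for the false-identification discard step is an assumption for illustration only (the document does not specify the actual discard algorithm), and the detections are hypothetical:

```python
# Sketch of the Figure 2 processing chain (201-206).
from dataclasses import dataclass

@dataclass
class Detection:
    species: str
    confidence: float

def classify(frame):
    # Stand-in for the deep CNN (202); returns hypothetical detections.
    return [Detection("anchovy", 0.97), Detection("sardine", 0.55),
            Detection("anchovy", 0.21)]

def discard_false_ids(detections, threshold=0.5):
    # (203) Assumed rule: drop low-confidence identifications as
    # presumed false positives.
    return [d for d in detections if d.confidence >= threshold]

def process_frame(frame, log):
    detections = discard_false_ids(classify(frame))
    log.extend(detections)   # (204) record and (205) store in parallel
    return detections        # (206) overlay shown to the user

log = []
kept = process_frame(frame=None, log=log)
print([d.species for d in kept])  # the 0.21-confidence detection is dropped
```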
  • Figure 3 shows an illustration of the diagram of the operating process of the present technology applied to the example of fish landing.
  • the gantry captures broadband and narrowband images with the camera array. These images are processed to generate a report that is shown to the end user, where (a) corresponds to user interface software, reports and evidence; (b) to the classification algorithm; (c) to cameras and optical filters; (d) to preprocessing and control software; and (e) to controlled lighting.
  • Figure 4 shows a graph with spectral fingerprints obtained from complete-anatomy images of example fish whose species are of interest to the national fishery, where (a) corresponds to sardine, (b) to anchovy, (c) to mote, (d) to horse mackerel, (e) to hake and (f) to pippin.
  • Figure 4A shows a graph with spectral fingerprints from full anatomy images of species of interest
  • Figure 4B shows a graph with spectral fingerprints from images of the lateral zone of the species.
  • Figure 5 shows a schematic of the deep convolutional neural network architecture of the present technology to discriminate the species of interest, where (I) corresponds to the input with (a) narrow band segmented images for N bands; (II) feature extraction stage with (b) 2D convolutional layer and (c) repeating 2D Maxpooling sequentially; (III) classification stage with fully connected networks; and (IV) output stage that delivers classification results
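The spatial dimensions flowing through the repeated Conv2D/MaxPooling blocks of such an architecture can be checked with the standard output-size formulas. The input size, kernel size and number of blocks below are illustrative assumptions, not values stated in the document:

```python
# Shape walkthrough for a Figure 5-style architecture: N narrow-band input
# channels, repeated (Conv2D -> 2x2 MaxPooling) blocks, then fully
# connected layers over the flattened feature maps.

def conv2d_out(size, kernel=3, stride=1, pad=0):
    # Standard convolution output-size formula (valid padding by default).
    return (size + 2 * pad - kernel) // stride + 1

def maxpool_out(size, pool=2):
    return size // pool

def feature_map_size(input_size, blocks):
    size = input_size
    for _ in range(blocks):
        size = maxpool_out(conv2d_out(size))
    return size

# e.g. hypothetical 128x128 segmented crops and 3 conv/pool blocks:
print(feature_map_size(128, 3))  # 14
```

Each block roughly halves the spatial extent, so the fully connected classification stage (III) operates on a much smaller map than the input (I).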
  • Figure 6 shows a representation of the camera system and lighting system of an exemplary embodiment of the present technology. Specifically, in (a) the system of multiple cameras and optical filters is shown; (b) shows the controlled lighting system and (c) shows the ventilation system required for the operation of the embedded opto-electronics of the gantry.
  • Figure 7 shows a confusion matrix of the validation results carried out by certifiers.
  • the present technology refers to an integrated system that incorporates a specialized apparatus and a method that facilitates the identification and classification of objects that pass through the field of vision of the apparatus such as, for example, but not limited to the classification of pelagic species passing through a conveyor belt after disembarkation.
  • the integrated system comprises an automatic vision system that illuminates one or more pelagic species to be analyzed with controlled or structured light for the counting, registration, classification and biometric analysis of pelagic species, but which can be easily extended to new situations or configurations.
  • the system comprises computer vision technology, machine learning methodologies integrated with image processing techniques designed to discriminate and analyze different objects passing through their field of view, particularly but not exclusively, to analyze different pelagic species, especially, small pelagic species and their accompanying fauna in artisanal sardine and anchovy fishing in different regions of Chile and the world.
  • this technology automatically discriminates the objects or species of interest, generating a representative sampling of the composition of the analyzed species, for example, but not limited to fishing landings of pelagic species, in addition to providing statistics of relevant biometric estimates of the fishery such as , for example, but not limited to the sizes and weight of the classified specimens.
  • the specialized computer vision apparatus has a controlled lighting system, preferably a controlled arrangement of LED panels, an arrangement of one or more cameras at one or more wavelengths with fixed or mobile optical filters for the acquisition of multiple spectral bands of interest, and embedded computer vision systems.
  • the infrastructure of the device generates lighting conditions specifically selected for the application in which the device is being used, enabling the acquisition of broadband or narrowband images according to the application, through the use of optical filters that highlight the discriminatory spectral and morphological differences of the species to be classified, independently of the external lighting conditions. Depending on the conditions of the acquisition environment, the device can have uniform or structured lighting in order to optimize discrimination between the different classes to be identified.
  • the optical filters are selected in tune with the spectral bands that promote the contrast difference of the species in the images taken by the infrared- or visible-spectrum camera array, and may or may not be essential depending on the morphology of the species to be classified. More particularly, spectral filters in the 300 nm to 500 nm bands promote the difference in contrast between, but not limited to, sardine and anchovy.
  • the broadband or narrowband images are entered into an analysis unit connected to a database of pelagic species, where they are computationally analyzed to carry out the identification, counting, registration and biometric analysis of the size and weight of the species classified in the images.
  • This analysis unit comprises at least one processor configured to receive the images, compare discriminatory spectral and morphological particularities of the illuminated pelagic species with existing records in the pelagic species database, and identify the illuminated pelagic species according to criteria of similarity with the existing records in the pelagic species database.
  • the analysis unit is the component that implements the method of this technology. To do this, it uses previously trained machine-learning algorithms to identify objects (for example, fish) in real time from the multimodal and multispectral video feed of a conveyor belt; it determines relevant information (such as the species and size of pelagic species) and reports the results to a user. These reports are the complete record, with the level of detail that the user requires, and contain the digital evidence of the objects and species analyzed in images, together with the statistics associated with an analysis time window, including the number, relative proportion and estimates of relevant parameters (e.g. size and weight of the classified species).
  • the technology also comprises a user interface connected to the on-site device and to the back-end of an analysis unit to generate the aforementioned reports.
  • an illustration of the operation of the system in the example of classifying pelagic species is shown in Figure 3, where (a) corresponds to software, user interface, reports and evidence; (b) to classification software; (c) to cameras and narrow-band filters; (d) to preprocessing and control software; and (e) to LED lighting.
  • Figures 4 A and B show graphs with records of spectral fingerprints of species of interest obtained with the present technology.
  • the spectral fingerprint is defined as the ratio between the amount of light reflected by a pelagic species, as a function of wavelength, and the total light reflected by a known reference, where the spectral measurements are spatially averaged over all the corresponding pixels in an image.
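The spectral-fingerprint definition above can be sketched as follows; the pixel intensities and reference values are hypothetical illustration values, not measurements from the document:

```python
# Per wavelength band, the fingerprint is the spatial mean over the fish's
# pixels of (light reflected by the specimen / light reflected by a known
# reference, e.g. a white calibration target).

def spectral_fingerprint(band_images, reference):
    """band_images: {wavelength_nm: list of pixel intensities on the fish}
    reference: {wavelength_nm: intensity reflected by the reference}."""
    fingerprint = {}
    for wl, pixels in band_images.items():
        ratios = [p / reference[wl] for p in pixels]
        fingerprint[wl] = sum(ratios) / len(ratios)  # spatial average
    return fingerprint

bands = {380: [12.0, 14.0, 10.0], 450: [40.0, 44.0, 42.0]}
white_ref = {380: 100.0, 450: 100.0}
fp = spectral_fingerprint(bands, white_ref)
print(fp)  # roughly {380: 0.12, 450: 0.42}
```

Computing this either over the full anatomy or only over the lateral zone yields the two kinds of curves shown in Figures 4A and 4B.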
  • Figure 4A shows a graph referring to the complete anatomy of the pelagic species under analysis
  • Figure 4B shows a graph referring to the lateral zone of the pelagic species under analysis.
  • Figure 5 shows a diagram of the species classification system based on a deep convolutional neural network architecture used by this technology to discriminate the species of interest, where (I) corresponds to the input, (II) to the property extraction, (III) to the output, (a) to the narrow band segmented images, N bands, (b) to 2D convolutional layer and (c) to 2D max pooling.
  • the computational analysis system consists mainly of a set of pelagic species classifiers based on machine learning technology and image processing programs that operate in embedded computer vision systems.
  • Figure 6 shows a render of the designed, manufactured and installed gantry, where the gantry comprises an anchoring system that is installed on the conveyor belt, a controlled lighting system and an array of cameras with optical filters to improve discrimination. of species.
  • part of a fish landing, specifically 70 tons, was analyzed as it passed along a conveyor belt.
  • three 3-liter buckets of reference pelagic species were extracted, from which it was estimated that the landing comprised 79% anchovy and 21% sardine.
  • a lighting system, a camera system and a set of optical filters of the present technology were installed on the conveyor belt through which the fish were unloaded and broadband images of the pelagic species were recorded as they passed under the camera system.
  • the images were transmitted to an analysis unit connected to a database of pelagic species from the Biobío and Ñuble regions in Chile.
  • the analysis unit compared discriminatory morphological particularities of the illuminated pelagic species with the existing records in the database, identifying the illuminated pelagic species according to similarity criteria using a deep convolutional neural network algorithm; the result was that the landing included 82.5% anchovy and 17.5% sardine.
  • the results indicate a similar distribution of species between what was recorded by the inspector and what was obtained by the identification and classification method.
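Using the figures reported above, the agreement between the inspector's bucket-based estimate and the system's estimate can be quantified as a per-species absolute difference:

```python
# Comparison of the bucket-based inspector estimate and the automated
# system's estimate from the 70-ton landing example (values from the text).

inspector = {"anchovy": 79.0, "sardine": 21.0}
system    = {"anchovy": 82.5, "sardine": 17.5}

diff = {sp: abs(system[sp] - inspector[sp]) for sp in inspector}
print(diff)  # {'anchovy': 3.5, 'sardine': 3.5}
```

A 3.5 percentage-point gap for each species supports the statement that both methods yield a similar distribution.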
  • the main difference of this system is that, advantageously, it can achieve representative sampling by being able to analyze a greater number of specimens, distributed throughout the duration of the unloading.
  • another advantage is that the inspector need not be present in person at the unloading point, since the work can be carried out remotely by analyzing the results delivered by the recognition system.
  • a validation process of the results was carried out with certifying personnel.
  • In this exercise, a total of 240 different images were selected, and the identification system was used to identify all possible species. These already-identified images were then distributed to 5 expert certifiers so that they could indicate the system's wrong identifications.
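A confusion matrix such as that of Figure 7 can be assembled from (expert label, system label) pairs. The labels below are hypothetical, since the text only states that 240 images were reviewed by 5 certifiers:

```python
# Building a confusion matrix from expert (ground-truth) labels versus
# the identification system's labels, as in the certifier validation.
from collections import Counter

def confusion_matrix(expert_labels, system_labels):
    # Keys are (expert, system) pairs; values are occurrence counts.
    return Counter(zip(expert_labels, system_labels))

expert = ["anchovy", "anchovy", "sardine", "sardine", "anchovy"]
system = ["anchovy", "sardine", "sardine", "sardine", "anchovy"]
cm = confusion_matrix(expert, system)
print(cm[("anchovy", "anchovy")])  # anchovies the system got right
```

Diagonal entries (same expert and system label) count correct identifications; off-diagonal entries are the wrong identifications flagged by the certifiers.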

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biochemistry (AREA)
  • Pathology (AREA)
  • Analytical Chemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Food Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Multimedia (AREA)
  • Medicinal Chemistry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a system for identifying pelagic species outside the water that comprises (a) an illumination system that illuminates one or more pelagic species to be analysed with controlled light; (b) an optical filter set for selecting spectral bands; (c) an arrangement of cameras with one or more wavelength ranges that record broadband or narrowband images of the illuminated pelagic species by means of optical filters; (d) an analysis unit connected to a database of pelagic species, wherein said unit comprises at least one processor configured to receive the images, compare spectral and morphological particularities that distinguish the illuminated pelagic species with records available in the database, and identify the illuminated pelagic species according to similarity criteria with the records available in the database of pelagic species. The invention further relates to a method for identifying species.

Description

A SYSTEM AND METHOD FOR THE IDENTIFICATION OF PELAGIC SPECIES OUT OF WATER
Technical Sector
The present invention is related to the fishing industry; more particularly, it is related to a system composed of an apparatus and a method that allows digitally acquiring, measuring, counting, identifying and classifying the different species that are unloaded from fishing vessels.
Prior Art
Currently, in the fishing industry, many countries establish fishing quotas to better manage their marine resources; however, the supervision and control needed to ensure that these quotas are met is difficult to carry out.
Organizations in charge of quota compliance must supervise, control and register fisheries subject to catch quotas. In some cases, fishing landings must be inspected on-site by inspection personnel. The inspection process is not simple, because the fishing volumes extracted per boat can amount to hundreds or thousands of kilograms of pelagic species, the number of daily landings is high, and the number of on-site inspectors is limited.
The analysis carried out on landings usually consists of the in-situ analysis of fewer than a dozen fishing samples taken in buckets of approximately 3 liters. The samples are analyzed manually to count the individuals of each species within the buckets; once all the resources have been weighed, the percentage contribution of each species with respect to the total weight of the sample is calculated, and this value is assumed to be the distribution of species being landed. This procedure is imprecise, not automated, lacks traceability, and does not allow the associated error to be estimated, since the catch in the volume analyzed from the buckets is not necessarily representative of the total catch of each landing.
Below, the most relevant scientific and technical alternatives are reviewed that at first glance show a certain degree of similarity to the fish-species classification system described in this document, but that do not solve the same problem or do not do so with the same technique.
Regarding classification out of water, the publication “Automated measurement of species and length of fish by computer vision” by White et al. proposes computer vision methods to distinguish certain fish species and determine their size. For this, the orientation of the fish is determined using an invariant-moment method, and the typology of the fish, flat or round, is identified with high precision from relationships between the intensities of certain pixels in the image and the length of the fish. However, the study uses only fish with a high width-to-height (aspect) ratio and, therefore, its performance is not guaranteed for species with different morphological characteristics. The present technology solves this problem by considering global features in the training of the neural network. The earlier method is not applicable in situations where the belt conditions change constantly (lighting, dirt, cuts, etc.), and it is of little use when the fish arrive piled up, which hinders many of the more traditional segmentation methods such as edge detection plus thresholding or other morphological operations (D.J. White, C. Svellingen, N.J.C. Strachan, “Automated measurement of species and length of fish by computer vision”, Fisheries Research, Volume 80, Issues 2-3, September 2006, Pages 203-210).
On the other hand, the publication “Fish species recognition using computer vision and a neural network” by Storbeck et al. describes fish-species recognition systems developed on the basis of a computer vision system and neural networks. In this case, the vision system measures a number of characteristics, including the width and height at various locations along the body of the fish as it is transported on a belt. However, the model used for species classification is based on a traditional neural network architecture with backpropagation, which takes length and width information from various areas of the fish as its input features. The measurement of length and width is carried out on the basis of the distortion measured by a helium-neon laser integrated into the vision system (Frank Storbeck, Berent Daan, “Fish species recognition using computer vision and a neural network”, Fisheries Research, Volume 51, Issue 1, April 2001, Pages 11-15).
The publication “Classification of Fish Species with Augmented Data using Deep Convolutional Neural Network” by Montalbo et al. describes the use of convolutional neural networks (CNNs) for fish classification. The VGG16 architecture is used, and transfer learning is performed to adapt the classifiers to the fish species found on Green Island, Thailand. However, classifiers of this style are no longer used today, since they impact the processing throughput of the system (in FPS): the more detections there are, and without some method to parallelize the classification of each generated bounding box, the FPS drops too low for real-time processing. Additionally, the method is not applied in any industrial environment using conveyor belts or any particular vision system; the work is concerned only with the classification method (F. J. P. Montalbo and A. A. Hernandez, “Classification of Fish Species with Augmented Data using Deep Convolutional Neural Network,” 2019 IEEE 9th International Conference on System Engineering and Technology (ICSET), Shah Alam, Malaysia, 2019, pp. 396-401, doi: 10.1109/ICSEngT.2019.8906433).
Patent application WO2008/056988(A1) describes a method and a system for the automatic classification of fish underwater, comprising a camera system that takes images of the fish and an analysis unit arranged to receive and analyze each image to find the outline of each fish. The method further involves dividing the fish into a certain number of segments and analyzing the color of each segment, where the characteristic features are compared with fish data in a knowledge database to classify the fish with the aid of discriminant analysis. However, this patent uses basic analytical methods, such as discriminant analysis, which lag in performance compared with current, more sophisticated methods based on deep learning. Nor does it make use of additional information such as that provided by specific spectral bands, and it is focused on underwater measurements.
Patent application US2012/306644(A1) describes a fish identification device consisting of a housing, a touch screen that allows the user to view and operate its functions, an internal processor operated by software, an internal GPS, a camera with a lens and flash that photographs a fish, an internal memory, and operating software. The device allows the user to determine local fishing regulations, photograph a fish, identify it, and then save the information to internal memory for later use. However, this technology requires the manual use of a handheld device to identify fish individually; it is therefore not applicable to the problem of monitoring landings.
It should also be noted that phone applications such as Google Lens use a device's camera to recognize diverse objects by performing searches through the Google search engine. Applications of this type can be used to recognize pelagic species, but the results they deliver are not consistent, since they are strongly affected by factors such as luminosity, camera focus, and the simultaneous presence of multiple objects, among others.
Therefore, a technology is required that makes it possible to analyze the catch unloaded from fishing vessels by taking a greater number of samples in less time, while maintaining more objective and consistent comparison standards, and that enables traceability and registration of the analyzed species with better quantification of the results.
Brief description of the invention
The present technology refers to an apparatus, a method and a system for the identification and classification of pelagic species at the fishing landing point. More particularly, it consists of an automated multi-camera vision apparatus, a method for the counting, recording, classification and biometric analysis of the catch, and a protocol for integrating both into a single system.
This system is composed of an apparatus comprising a gantry with embedded computer vision technology for multimodal digital video acquisition, and a method based on machine learning and on image processing and analysis programs to discriminate different fish species such as, for example, but not limited to, small pelagic species and their accompanying fauna in artisanal fishing and other forms of industrial fishing.
Description of the figures
Figure 1 shows an internal diagram of the apparatus and the system, whose main components are: a gantry on which the acquisition system (100) is installed, containing an array of cameras (101) with spectral optical filters, and a remote communication router (102) that packages the information acquired by the module and transmits it securely over the Internet to the user. The acquisition requires a controlled lighting system (103) and must be ventilated (104) for correct operation. The acquired and transmitted information is received by an image processing system (200), where the method of identification, counting and classification of the samples is implemented.
Figure 2 shows a diagram of the implementation of the method within the system (200), whose main components are a module for receiving the images transmitted from a gantry (201) and a deep convolutional neural network that identifies and classifies the species in the captured image (202). A false-identification discard algorithm (203) automatically eliminates false positives from the sample; the detections that remain are recorded by a system (204) and stored in parallel (205). Finally, the video feed, with the relevant identified data superimposed, is displayed to the user through a user interface (206).
Figure 3 shows an illustration of the operating process of the present technology applied to the example of fish unloading. The gantry captures broadband and narrowband images with the camera array. These images are processed and generate a report that is shown to the end user, where (a) corresponds to user interface software, reports and evidence; (b) to the classification algorithm; (c) to cameras and optical filters; (d) to preprocessing and control software; and (e) to controlled lighting.
Figure 4 shows a graph with spectral fingerprints obtained from whole-anatomy images of example fish whose species are of interest to the national fishery, where (a) corresponds to sardine, (b) to anchovy, (c) to mote, (d) to horse mackerel, (e) to hake and (f) to reineta. Specifically, Figure 4A shows a graph with spectral fingerprints obtained from whole-anatomy images of the species of interest, and Figure 4B shows a graph with spectral fingerprints obtained from images of the lateral zone of the species.
Figure 5 shows a schematic of the deep convolutional neural network architecture of the present technology for discriminating the species of interest, where (I) corresponds to the input, with (a) narrowband segmented images for N bands; (II) to the feature extraction stage, with (b) a 2D convolutional layer and (c) 2D max pooling, repeated sequentially; (III) to the classification stage with fully connected networks; and (IV) to the output stage, which delivers the classification results.
Figure 6 shows a representation of the camera system and the lighting system of an exemplary embodiment of the present technology. Specifically, (a) shows the system of multiple cameras and optical filters; (b) shows the controlled lighting system; and (c) shows the ventilation system required for the operation of the gantry's embedded optoelectronics.
Figure 7 shows a confusion matrix of the validation results carried out by certifiers.
Detailed description of the invention
The present technology refers to an integrated system incorporating a specialized apparatus and a method that facilitates the identification and classification of objects passing through the apparatus's field of view, such as, for example, but not limited to, the classification of pelagic species passing along a conveyor belt after unloading. Under this scenario, the integrated system comprises an automatic vision system that illuminates one or more pelagic species to be analyzed with controlled or structured light for the counting, recording, classification and biometric analysis of the pelagic species, but it can easily be extended to new situations or configurations.
Specifically, the system comprises computer vision technology and machine learning methodologies integrated with image processing techniques designed to discriminate and analyze different objects passing through its field of view, particularly but not exclusively to analyze different pelagic species, especially small pelagic species and their accompanying fauna in the artisanal sardine and anchovy fisheries in different regions of Chile and the world.
Advantageously, this technology automatically discriminates the objects or species of interest, generating a representative sampling of the composition of the analyzed species, for example, but not limited to, fishing landings of pelagic species, in addition to delivering statistics of relevant biometric estimates of the catch, such as, for example, but not limited to, the sizes and weights of the classified specimens.
The specialized computer vision apparatus has a controlled lighting system, preferably a controlled array of LED panels; an array of one or more cameras operating at one or more wavelengths, with fixed or mobile optical filters for the acquisition of multiple spectral bands of interest; and embedded computer vision systems.
The infrastructure of the apparatus generates lighting conditions specifically selected for the application in which the apparatus is being used, enabling the acquisition of broadband or narrowband images according to the application through the use of optical filters that highlight the discriminatory spectral and morphological differences of the species to be classified, independently of the external lighting conditions. Depending on the conditions of the acquisition environment, the apparatus can use uniform or structured lighting so as to optimize discrimination between the different classes to be identified.
The optical filters are selected in tune with the spectral bands that enhance the contrast difference between the species in the images taken by the array of infrared-spectrum or visible-spectrum cameras, and they may or may not be indispensable depending on the morphology of the species to be classified. More particularly, spectral filters in the 300 nm to 500 nm bands enhance the contrast difference between, but not limited to, sardine and anchovy.
The broadband or narrowband images are fed into an analysis unit connected to a database of pelagic species, where they are computationally analyzed to carry out the identification, counting, recording and biometric analysis of size and weight of the species classified in the images. This analysis unit comprises at least one processor configured to receive the images, compare discriminatory spectral and morphological particularities of the illuminated pelagic species with existing records in the pelagic species database, and identify the illuminated pelagic species according to criteria of similarity with the existing records in the pelagic species database.
The analysis unit is what implements the method of the present technology. To do so, it uses previously trained machine learning algorithms to identify objects (for example, fish) in real time from the multimodal and multispectral video feed of a conveyor belt, determines relevant information (such as the species and size of pelagic species), and reports the results to a user.
These reports constitute the complete record, at the level of detail the user requires, and contain the digital evidence of the objects and species analyzed in images, together with the statistics associated with an analysis time window, including the number, relative proportion and estimates of relevant parameters (for example, size and weight of the classified species). The technology also comprises a user interface connected to the on-site apparatus and to the back-end of an analysis unit to generate the aforementioned reports. An illustration of the operation of the system in the example of classifying pelagic species is shown in Figure 3, where (a) corresponds to software, user interface, reports and evidence; (b) to classification software; (c) to the camera and narrowband filters; (d) to preprocessing and control software; and (e) to LED lighting.
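The real-time loop just described — receive frames from the video feed, classify the detections, discard false identifications, and accumulate a record for reporting — can be sketched roughly as follows. All names, the confidence threshold, and the stub classifier are hypothetical stand-ins for the trained network and are not taken from the patent:

```python
# Hypothetical pipeline sketch: frames in, per-species log out.
def process_feed(frames, classify, min_confidence=0.8):
    """Classify every frame and keep only confident detections."""
    log = []
    for frame in frames:
        for species, confidence in classify(frame):
            # False-identification discard: drop low-confidence detections
            if confidence >= min_confidence:
                log.append(species)
    return log

# Stub classifier standing in for the trained CNN: per frame it returns
# one confident detection and one that should be discarded.
fake_frames = [0, 1, 2]
def fake_classify(frame):
    return [("sardine", 0.95), ("anchovy", 0.60)]

print(process_feed(fake_frames, fake_classify))  # ['sardine', 'sardine', 'sardine']
```

From a log like this, the relative proportions and per-window statistics that appear in the reports can be aggregated directly.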
Figures 4A and 4B show graphs with records of the spectral fingerprints of species of interest obtained with the present technology. The spectral fingerprint is defined as the ratio between the amount of light reflected by a pelagic species, as a function of wavelength, and the total light reflected by a known reference, where the point measurements of the spectrum are spatially averaged using all the corresponding pixels in an image. Specifically, Figure 4A shows a graph referring to the complete anatomy of the pelagic species under analysis, while Figure 4B shows a graph referring to the lateral zone of the pelagic species under analysis.
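The definition above — per-band reflectance of the specimen, spatially averaged over its pixels, divided by the reflectance of a known reference — can be sketched numerically. This is a minimal illustration with synthetic data, not the patent's implementation; the function name and array layout are assumptions:

```python
import numpy as np

def spectral_fingerprint(cube, reference, mask):
    """Per-band ratio of specimen reflectance to a known reference.

    cube:      (H, W, N) stack of narrowband images, one slice per band
    reference: (N,) mean reflectance of the white reference in each band
    mask:      (H, W) boolean array selecting the specimen's pixels
    """
    pixels = cube[mask]                 # (P, N): specimen pixels, all bands
    mean_per_band = pixels.mean(axis=0)  # spatial average per wavelength
    return mean_per_band / reference     # (N,) fingerprint

# Tiny synthetic example: 4x4 image, 3 spectral bands
cube = np.full((4, 4, 3), [20.0, 40.0, 10.0])
reference = np.array([100.0, 100.0, 50.0])
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                    # specimen occupies the centre
print(spectral_fingerprint(cube, reference, mask))  # [0.2 0.4 0.2]
```

Restricting the mask to the lateral zone of the specimen, rather than the whole anatomy, would give the Figure 4B variant of the fingerprint.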
Figure 5 shows a diagram of the species classification system based on the deep convolutional neural network architecture employed by the present technology to discriminate the species of interest, where (I) corresponds to the input, (II) to the feature extraction, (III) to the output, (a) to the narrowband segmented images (N bands), (b) to a 2D convolutional layer and (c) to 2D max pooling. The computational analysis system consists mainly of a set of pelagic species classifiers based on machine learning technology and image processing programs that operate on embedded computer vision systems.
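The data flow of Figure 5 — an N-band segmented image passed through repeated 2D convolution and max-pooling stages, then a fully connected classification stage with a probability output — can be sketched as a forward pass in plain NumPy. The layer sizes, random weights, and four output classes are illustrative assumptions, not the patent's trained network:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, kernels):
    """Valid 2D convolution + ReLU: x is (H, W, C_in), kernels (k, k, C_in, C_out)."""
    k = kernels.shape[0]
    H, W = x.shape[0] - k + 1, x.shape[1] - k + 1
    out = np.empty((H, W, kernels.shape[3]))
    for i in range(H):
        for j in range(W):
            # Contract the (k, k, C_in) patch against every output kernel
            out[i, j] = np.tensordot(x[i:i+k, j:j+k, :], kernels, axes=3)
    return np.maximum(out, 0.0)

def maxpool2d(x, s=2):
    """2D max pooling with stride s (edges cropped to a multiple of s)."""
    H, W, C = x.shape
    x = x[:H - H % s, :W - W % s]
    return x.reshape(H // s, s, W // s, s, C).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# (I) Input: N = 4 narrow spectral bands of a segmented specimen, 32x32 pixels
x = rng.random((32, 32, 4))

# (II) Two sequential (Conv2D -> MaxPooling2D) feature extraction stages
x = maxpool2d(conv2d(x, rng.standard_normal((3, 3, 4, 8)) * 0.1))
x = maxpool2d(conv2d(x, rng.standard_normal((3, 3, 8, 16)) * 0.1))

# (III) Fully connected classification stage over 4 example classes
flat = x.ravel()
probs = softmax(rng.standard_normal((4, flat.size)) @ flat)
print(probs.shape)  # (4,) -- class probabilities summing to 1
```

In practice a framework such as TensorFlow or PyTorch would supply these layers with trained weights; the sketch only shows how the stages of the figure compose.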
Figure 6 shows a render of the gantry as designed, manufactured and installed, where the gantry comprises an anchoring system that is installed over the conveyor belt, a controlled lighting system and an array of cameras with optical filters to improve species discrimination.
Exemplary embodiments are described in detail below to illustrate the principles of the invention. The embodiments are provided to illustrate aspects of the invention, but the invention is not limited to any particular embodiment. The scope of the invention encompasses numerous alternatives, modifications and equivalents, limited only by the embodiments of the claims.
Application examples
Example 1.
In an exemplary embodiment of the present technology, part of a fish landing was analyzed as it passed along a conveyor belt; specifically, 70 tons were analyzed. For the comparison in this example, three 3-liter buckets of reference pelagic species were extracted, from which it was estimated that the landing comprised 79% anchovy and 21% sardine.
Subsequently, a lighting system, a camera system and a set of optical filters of the present technology were installed over the conveyor belt on which the fish were unloaded, and broadband images of the pelagic species were recorded as they passed under the camera system. The images were transmitted to an analysis unit connected to a database of pelagic species from the Biobío and Ñuble regions of Chile.
The analysis unit compared discriminatory morphological particularities of the illuminated pelagic species with the existing records in the database, identifying the illuminated pelagic species according to similarity criteria using a deep convolutional neural network algorithm, thereby determining that the landing comprised 82.5% anchovy and 17.5% sardine. The results indicate a similar distribution of species between what was recorded by the inspector and what was obtained by the identification and classification method. The main difference of this system is that, advantageously, it can achieve a representative sampling, since it can analyze a greater number of specimens, distributed over the entire duration of the unloading. A further advantage is that the inspector does not need to be present at the unloading point, since the work can be carried out remotely by analyzing the results delivered by the recognition system.
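The composition estimate reported in this example reduces to counting identified specimens per species over the unloading window and normalizing. A minimal sketch, with hypothetical detection counts chosen to reproduce the 82.5% / 17.5% split of the example:

```python
from collections import Counter

# Hypothetical accumulated detections over the unloading window
detections = ["anchovy"] * 33 + ["sardine"] * 7

counts = Counter(detections)
total = sum(counts.values())
proportions = {sp: 100 * n / total for sp, n in counts.items()}
print(proportions)  # {'anchovy': 82.5, 'sardine': 17.5}
```

Because the system samples continuously rather than from a few buckets, the counts (and hence the proportions) are based on far more specimens than a manual spot check.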
Example 2.
In another example, a validation process of the results was carried out with certifying personnel. In this exercise, a total of 240 different images were selected, and the identification system was used to identify all the species possible. These identified images were then distributed to 5 expert certifiers so that they could indicate the system's erroneous identifications.
Esta validación mostró que la precisión promedio del sistema para la identificación correcta de cada una de las 4 especies pelágicas disponibles correspondió a: 98,2% para jurel, 86,6% para caballa, 75,4% para anchoveta, y 93,4% para sardina (ver Figura 7 donde se muestra una matriz de confusión de los resultados de validación llevados a cabo por certificadores). Por lo que, por medio del sistema de análisis se logró una alta precisión en la identificación de especies descargadas desde embarcaciones pesqueras. This validation showed that the average precision of the system for the correct identification of each of the 4 pelagic species available corresponded to: 98.2% for horse mackerel, 86.6% for mackerel, 75.4% for anchovy, and 93.4 % for sardine (see Figure 7 where a confusion matrix of the validation results carried out by certifiers is shown). Therefore, through the analysis system, high precision was achieved in the identification of species unloaded from fishing vessels.
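The per-species figures above are derived from a confusion matrix such as the one in Figure 7. As a sketch of that computation: the matrix values below are invented for illustration (only the four species names come from the text), and precision for a species is the fraction of the system's predictions of that species that the certifiers confirmed as correct.

```python
# Sketch: per-class precision from a confusion matrix (values invented).
SPECIES = ["horse mackerel", "mackerel", "anchovy", "sardine"]

def per_class_precision(cm):
    # cm[i][j]: images of true species i that the system labelled as j.
    # Precision for class j = correct predictions of j / all predictions of j
    # (i.e. the diagonal entry divided by its column sum).
    n = len(cm)
    precisions = []
    for j in range(n):
        predicted = sum(cm[i][j] for i in range(n))
        precisions.append(cm[j][j] / predicted if predicted else 0.0)
    return precisions

cm = [
    [58, 1, 0, 1],   # true horse mackerel
    [1, 52, 4, 3],   # true mackerel
    [0, 5, 49, 6],   # true anchovy
    [0, 2, 3, 55],   # true sardine
]
for name, p in zip(SPECIES, per_class_precision(cm)):
    print(f"{name}: {100 * p:.1f}%")
```

Averaging these per-class values over the validation set yields the kind of summary reported above.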

Claims

1. A system for the identification of pelagic species out of water, CHARACTERIZED in that it comprises: a lighting system that illuminates one or more pelagic species to be analyzed with controlled light; a set of fixed or movable optical filters for the selection of spectral bands; a camera array comprising one or more cameras in one or more wavelength ranges that record broadband or narrowband images of the illuminated pelagic species through the optical filters; and an analysis unit connected to a database of pelagic species, wherein the analysis unit comprises at least one processor configured to receive the images, compare discriminatory spectral and morphological particularities of the illuminated pelagic species with existing records in the pelagic species database, and identify the illuminated pelagic species according to similarity criteria with respect to the existing records in the pelagic species database.
2. The system according to claim 1, CHARACTERIZED in that the lighting system comprises at least one uniform lighting source.
3. The system according to claim 1, CHARACTERIZED in that the lighting system comprises at least one controlled or structured lighting source.
4. The system according to claim 1, CHARACTERIZED in that the camera system comprises at least one infrared-spectrum camera or one visible-spectrum camera.
5. The system according to claim 1, CHARACTERIZED in that the set of optical filters filters in the 300 nm to 500 nm band.
6. The system according to claim 1, CHARACTERIZED in that the similarity criteria for identifying the illuminated pelagic species are determined by means of convolutional neural networks.
7. The system according to claim 6, CHARACTERIZED in that the convolutional neural networks are deep neural networks.
8. A method for the identification of pelagic species out of water, CHARACTERIZED in that it comprises the steps of: providing an analysis unit connected to a database of pelagic species; providing a controlled lighting system, a camera array system and a set of optical filters over one or more pelagic species to be analyzed; illuminating one or more pelagic species with controlled lighting; recording broadband and narrowband images of the pelagic species with the multispectral camera system through the set of optical filters; receiving the images of the illuminated fish in the analysis unit; comparing discriminatory spectral and morphological particularities of the illuminated fish with existing records in the pelagic species database; and identifying the illuminated pelagic species according to similarity criteria with respect to the existing records in the pelagic species database.
9. The method according to claim 8, CHARACTERIZED in that the similarity criteria are determined by means of convolutional neural networks.
10. The method according to claim 8, CHARACTERIZED in that the convolutional neural networks are deep neural networks.
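The method steps of claim 8 can be reduced to a toy pipeline. This is a minimal sketch under stated assumptions: the database records, feature names, and the `extract_features` stub are hypothetical, and the spectral/morphological comparison is collapsed into a nearest-record lookup as one possible similarity criterion (the patent's own criterion is the convolutional network of claims 9-10).

```python
# Hypothetical sketch of the claim-8 pipeline: compare extracted features
# against database records and identify the most similar species.
import math

DATABASE = {
    # species: invented (spectral_ratio, body_aspect) feature pairs
    "anchovy": (0.62, 5.8),
    "sardine": (0.71, 4.9),
}

def extract_features(image):
    # Stand-in for extracting discriminatory spectral and morphological
    # particularities from the filtered broadband/narrowband images.
    return image["features"]

def identify(image):
    f = extract_features(image)
    # Similarity criterion: smallest Euclidean distance to a database record.
    return min(DATABASE, key=lambda s: math.dist(f, DATABASE[s]))

print(identify({"features": (0.63, 5.7)}))  # prints "anchovy"
```

In the claimed system this comparison runs continuously on images captured as fish pass under the camera array, so each detected specimen receives a species label in turn.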
PCT/CL2023/050100 2022-11-24 2023-11-03 System and method for identifying pelagic species outside the water WO2024108318A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CL2022003307A CL2022003307A1 (en) 2022-11-24 2022-11-24 A system and method for the identification of pelagic species out of water.
CL3307-2022 2022-11-24

Publications (1)

Publication Number Publication Date
WO2024108318A1 true WO2024108318A1 (en) 2024-05-30

Family

ID=85172318


Country Status (2)

Country Link
CL (1) CL2022003307A1 (en)
WO (1) WO2024108318A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050025357A1 (en) * 2003-06-13 2005-02-03 Landwehr Val R. Method and system for detecting and classifying objects in images, such as insects and other arthropods
ES2552397A1 (en) * 2014-05-27 2015-11-27 Tecnología Marina Ximo, S.L. System and method for estimating tunas caught by species on board fishing vessels (Machine-translation by Google Translate, not legally binding)
JP6401411B1 (en) * 2018-02-13 2018-10-10 株式会社Aiハヤブサ Artificial intelligence catch identification system, management system and logistics system
US11134659B2 (en) * 2015-01-22 2021-10-05 Signify Holding B.V. Light unit for counting sea lice
US11727669B1 (en) * 2022-03-17 2023-08-15 Institute of Facility Agriculture, Guangdong Academy of Agricultural Science Intelligent recognition method of hyperspectral image of parasites in raw fish

Non-Patent Citations (1)

Title
Kolmann, M.; Kalacska, M.; Lucanus, O.; Sousa, L.; Wainwright, D.; Arroyo-Mora, J. P.; Andrade, M. C.: "Hyperspectral data as a biodiversity screening tool can differentiate among diverse Neotropical fishes", Scientific Reports, vol. 11, no. 1, ISSN 2045-2322, DOI: 10.1038/s41598-021-95713-0 *

Also Published As

Publication number Publication date
CL2022003307A1 (en) 2023-01-06
