WO2022258802A1 - Sensor apparatus and sensor system for fish farming - Google Patents

Sensor apparatus and sensor system for fish farming

Info

Publication number
WO2022258802A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
sensor
sensor apparatus
fish
functional
Prior art date
Application number
PCT/EP2022/065813
Other languages
English (en)
Inventor
Martin KONOLD
Simon HEUGEL
Chaitanya DHUMASKER
Original Assignee
Monitorfish Gmbh
Priority date
Filing date
Publication date
Application filed by Monitorfish Gmbh filed Critical Monitorfish Gmbh
Publication of WO2022258802A1 publication Critical patent/WO2022258802A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00Culture of aquatic animals
    • A01K61/90Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • A01K61/95Sorting, grading, counting or marking live aquatic animals, e.g. sex determination specially adapted for fish

Definitions

  • the present invention relates to a sensor apparatus for a fish farming system comprising at least two functional modules for measuring, recording and transmitting fish farming parameters and at least one connection module for connecting the functional modules.
  • the invention relates to a camera module as functional module of a sensor apparatus.
  • the invention relates to a sensor system comprising at least one sensor apparatus.
  • Aquaculture is the farming of aquatic organisms, including fish, in both coastal and inland areas, involving interventions in the rearing process to enhance production.
  • Fish farming requires fish to be fed the optimum amount of food. Fish are typically fed with food pellets that are dropped into the enclosed areas in which each shoal of fish is farmed.
  • WO 2019/121900 A1 discloses a sensor device for monitoring fish in an aquaculture for fish parasites.
  • the sensor device can be arranged by at least one fastening module in the form of cables below a water surface in a tank.
  • the sensor device has two lighting modules, a camera module and control modules.
  • the lighting modules and the camera module are connected to each other by connecting modules in the form of rods.
  • the image data of the fish captured by the camera module are processed by the control module with at least one trained neural network in order to determine the presence of parasites on the fish. If the presence of the parasites is determined, it is communicated to an operator of the fish farm via a user interface, so that the operator can react accordingly.
  • the device comprises a sensor module with a large number of different sensors, a communication module for transmitting the data recorded by the sensors and a control module for controlling these modules.
  • the various modules are connected to one another via wireless communication.
  • CN 101995875 A discloses a monitoring system for an aquaculture system having a central control module arranged to control a camera module and a sensor module with sensors which are wirelessly connected to the central control module.
  • WO 2018/222048 A1 discloses a device for hyperspectral imaging of fish in a fish farm.
  • the device comprises at least one illumination source and at least one hyperspectral imager attached by a mounting assembly to the fish farm.
  • the object of the invention is to reduce the disadvantages of known arrangements.
  • an apparatus allows any arrangement of various modules, e.g. camera modules and lighting modules, in space and enables an adaptation to a required arrangement space.
  • a sensor apparatus for a fish farming system comprises at least two functional modules for measuring, recording and transmitting fish farming parameters and at least one connection module for connecting the functional modules.
  • each functional module comprises at least one uniformly formed cover section and said connection module comprises a connection plate being functionally and mechanically complementary to said cover section, so that at least two of the functional modules are connectable with each other mechanically, electrically and communicatively by the at least one connection module via a connection of the cover section and the connection plate.
  • a sensor apparatus comprises several functional modules and every two functional modules are connected by a connection module.
  • the functional module is provided with a uniformly formed cover section and the connection module is provided with a connection plate, wherein the cover section and the connection plate are designed to be functionally and mechanically complementary to each other, preferably shape-complementary.
  • the connection module ensures mechanical stability and electrical as well as communicative connection between functional modules of the sensor apparatus.
  • the mechanical stability serves in particular to resist vibrations and mechanical impacts caused by aquatic life, objects and currents.
  • Due to the connection module, the functional modules can be connected with each other in any combination, wherein all mechanical connections and attachments are realized only by the connection module. Furthermore, the connection module allows any arrangement of the functional modules, so that the sensor apparatus can be flexibly arranged in an underwater space, in particular in a limited underwater space.
  • In the connection module, all connections necessary for network communication and power supply are arranged. Furthermore, the electrical and communicative connections can be integrated in the connection module, wherein a solution based on Power over Ethernet (PoE) technology can preferably be used.
  • network communication and power supply lines and cables are designed in such a way that the functional modules can be directly connected in series by means of the lines and cables arranged in the connection module.
  • the functional module is advantageously designed as a standardized arrangement comprising a cover section, a central electrical and communicative distribution unit, a central control and computation unit and a base set of sensors for measuring various fish farming parameters.
  • the functional modules can be provided with network interfaces for exchanging recorded data with each other and for transmitting them to cloud infrastructures as well as local data processing units.
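As an illustration of the modular composition described above, the following Python sketch models functional modules chained in series by connection modules; all class and attribute names are assumptions made for illustration and do not come from the patent.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FunctionalModule:
    """Functional module with a uniform cover section (names are illustrative only)."""
    name: str                         # e.g. "communication", "lighting", "camera"
    sensors: List[str] = field(default_factory=list)
    poe_powered: bool = True          # power and network share one cable, as with PoE

@dataclass
class ConnectionModule:
    """Mechanical, electrical and communicative link between two functional modules."""
    upper: FunctionalModule
    lower: FunctionalModule

def chain(modules: List[FunctionalModule]) -> List[ConnectionModule]:
    """Connect the functional modules in series, one connection module per adjacent pair."""
    return [ConnectionModule(a, b) for a, b in zip(modules, modules[1:])]

# Example: communication module on top, lighting in the middle, camera at the bottom.
apparatus = chain([
    FunctionalModule("communication", sensors=["status LED"]),
    FunctionalModule("lighting", sensors=["IR LED strip"]),
    FunctionalModule("camera", sensors=["stereo camera", "accelerometer"]),
])
print(len(apparatus), "connection modules")  # -> 2 connection modules
```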
  • the sensor apparatus can be used for remote-controlled operations by means of network connections and self-sufficient systems, wherein the data obtained can be read out remotely or displayed, stored, and processed on remote systems.
  • the measured and recorded data can be
  • the sensor apparatus is used for determining process and production parameters in production and farming environments, especially in the underwater area and in particular in the form of bio indicators for plant and fish growth.
  • the sensor apparatus can be therefore used for plant chronology, plant chromatography, environmental monitoring, observing biological indicators, climate change, underwater observation in farming systems and / or observation of growth of disturbing creatures.
  • all housing parts of the sensor apparatus are made from aluminium with a sufficient surface treatment to avoid any damage from the environment, in particular degradation of outer surfaces by mixtures of water, salt, acid and lye.
  • other types of housing material are also possible, as long as the strong environmental requirements of fish farming are met. All materials used are environmentally friendly and can be used in the area of fish farming in a water tank. It is further advantageous that the materials used withstand typical chemical stresses, such as salinity comparable to seawater, and are resistant to commercially available chemicals for water treatment as well as cleaning. Moreover, tightness is ensured at the connection areas between modules.
  • The connection module can comprise at least one module plate which is designed as a standardized connection plate, on which the functional module is attached via the cover section. At each connector end a module plate can be fixed, allowing the functional modules to be mounted to the connection module.
  • The connection module is designed as a standardized connection module, which can comprise a connector and at least one module plate, preferably two module plates, whereby an interface between the functional modules and the connection modules is formed. It is thus an elementary part of each functional module housing surface at which connections are made possible.
  • the standardized connection plate can comprise a mechanical interface section such that the connection module can be fixed permanently onto the cover section of the functional modules.
  • the interface section can include a network port, in particular an RJ-45 port which is designed to be waterproof.
  • the functional modules can be arranged along a straight line by the connection modules, so that the sensor apparatus is arrangeable at least partially in an underwater area of the fish farming system in a vertical direction. Users can assemble the vertical connections by themselves.
  • a camera module can be designed as functional module, wherein in the cover section of the camera module electrical and network ports are arranged, wherein the camera module comprises at least one stereoscopic camera sensor and a stereoscopic optical lens system.
  • a set of cameras can be used to take stereographic images. Image pairs taken with a short delay for detecting movement can include information about the velocity and direction of movements.
  • Both cameras can comprise a stereoscopic camera sensor and stereoscopic optical lenses respectively, wherein the sensors are controllable by the sensor apparatus, including synchronisation of the timing of image exposures. An accelerometer for event-based image exposures can also be integrated.
  • the lenses are also controllable with regards to all optical parameters such as zoom, focus and iris.
  • the camera module can have a still-image mode, a video recording mode, e.g. for analysis purposes, and a live video streaming mode.
  • the camera module can moreover have a monocular camera and / or a stereo camera with a corresponding monocular, binocular or multi-camera sensor setup. Additional cameras such as 3D cameras with light sensors can also be provided in the camera module.
  • At least one Real-Time Streaming Protocol (RTSP)-server can be provided for local video streaming and / or an online connection to remote RTSP- servers.
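Since the camera module can expose its live video via an RTSP server, a client could read frames with a standard library such as OpenCV; this is a minimal sketch, and the stream URL below is a hypothetical example rather than an address defined by the patent.

```python
import cv2

# Hypothetical RTSP endpoint; the actual address depends on the camera module's configuration.
STREAM_URL = "rtsp://192.168.1.50:554/stream1"

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise RuntimeError("could not open RTSP stream")

ok, frame = cap.read()                  # grab a single frame from the live stream
if ok:
    cv2.imwrite("snapshot.png", frame)  # keep a still image for later analysis
cap.release()
```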
  • at least one light source can be integrated.
  • an external light source can be used, wherein exposures and light flashes can be synchronised.
  • An internet-enabled control interface can be provided via a network interface, so that image data such as videos and individual images are made available via this interface. If a microphone is used, a sound track for a video recording can be generated.
  • the obtained images and data can be sent to a cloud infrastructure and / or a local data processing unit through the network interface of the camera module.
  • At least one acceleration sensor can be integrated in the camera module.
  • shock detection can be realized, enabling smart triggering of camera shots according to those environmental signals.
  • a communication module can be designed as functional module, wherein the communication module comprises preferably at least one data storage unit for storing measured and recorded data and at least one data transmission unit to transmit the data to an external network.
  • the communication module acts as an interface to the outside world and can comprise wired and / or wireless network connectivity, power supply and a light indicator, e.g. LED light source, for status indication.
  • the communication module can be designed to act as a Machine-to-Human interface above the water level for indication of operational state information to users.
  • the communication module can be designed to be waterproof.
  • the communication module can therefore act as an input for network communication and / or electrical power supply.
  • the communication module has a direct connection to both external power sources and to a wired network using the PoE port.
  • the communication module is the end point and can carry a cryptographic identity which can be stored securely in a secured, hardware-supported data storage unit such as a smart card. The cryptographic identity can also be used to secure the integrity of all recorded measurement and image data. If a plurality of communication modules are provided, they can send the collected data to processing units individually or to a main communication module which further sends the data to the processing units.
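One conceivable way to use such a cryptographic identity for securing the integrity of recorded data is a keyed digest. The sketch below uses HMAC-SHA-256 purely as an assumed example, not as a mechanism prescribed by the patent; the key literal is a placeholder for material that would live in the secured hardware.

```python
import hmac
import hashlib

# Assumed device key; in the described apparatus the cryptographic identity would
# live in secured hardware (e.g. a smart card) inside the communication module.
DEVICE_KEY = b"per-device-secret"

def sign(record: bytes) -> str:
    """Attach a keyed digest so tampering with stored measurement data is detectable."""
    return hmac.new(DEVICE_KEY, record, hashlib.sha256).hexdigest()

def verify(record: bytes, tag: str) -> bool:
    return hmac.compare_digest(sign(record), tag)

tag = sign(b"2022-06-10T12:00Z temp=14.2C ph=7.1")
assert verify(b"2022-06-10T12:00Z temp=14.2C ph=7.1", tag)
```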
  • the data transmission unit can comprise at least one wireless interface element which is preferably designed as a radio communication interface element, in particular as a WLAN, mobile telephone or satellite communication interface element.
  • a lighting module can be designed as functional module, wherein the lighting module comprises at least one light source, in particular an infrared light source, preferably a LED light source, in particular a LED light strip, for illuminating an underwater area as illumination space for stereoscopic inspection.
  • the lighting module is used for illumination within the underwater area to be imaged by the camera modules. All parameters related to underwater lighting, such as
  • the lighting module can be provided with an internet-enabled control interface via a network interface, so that timing between the camera module and the lighting module can be coordinated using the network connectivity with the camera module, without absolute time information being necessary.
  • the infrared light source has been shown to be efficient both in terms of imaging efficiency and in terms of avoiding disturbance of the behaviour of the living species. Using the infrared light source is further beneficial when it comes to the effect of water turbidity, where longer wavelengths are less affected. This nevertheless depends strongly on the size of the particles giving rise to the turbidity in the first place. If the angle of light emission needs to be sufficiently wide, an LED light source can be selected.
  • an infrared light with a wavelength between 700 and 900 nm is used as the light source. It is further advantageous that illumination modes of the lighting module, e.g. torch / flash, are selectable. With the lighting module, angles for optimized illumination can be fixed. It is further advantageous that the light sources are provided with passive cooling. As a functional component of the lighting module, a light source driver unit can be provided. The lighting module can further comprise all necessary electronic components to perform the above-mentioned functionalities. In a further advantageous embodiment the lighting module can be integrated in the camera module.
  • the combined camera and lighting module comprises all features of the camera module and the lighting module and can allow for direct activation of the light sources without the need for synchronisation between external entities. Low energy consumption can advantageously be achieved by non-continuous operation. For tight-space applications, e.g. small-diameter fish tanks, the combined camera and lighting module is beneficial. Furthermore, the electrical triggering of both light flashes and camera exposures allows for a very precise synchronisation, so that a decrease in energy consumption can be realized.
  • a sensor module can be designed as functional module, wherein the sensor module comprises at least one sensor unit which is preferably designed as temperature sensor, pressure sensor, pH-sensor and / or salinity sensor.
  • the sensor module serves to accommodate sensors for measuring physical parameters in the underwater area such as temperature, water pressure in different directions, water flow, electrical voltages, humidity, vibration and accelerations, sound and noise, electrical and magnetic fields and potentials, as well as chemical parameters such as salinity, conductivity and turbidity.
  • the sensor module can be provided with a control and readout interface via a network interface. All relevant operating parameters can be managed with it. Furthermore, a universal expansion interface with generic analog and digital I/O lines can be provided.
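A minimal sketch of a network read-out client for such a sensor module, assuming, purely for illustration, that the readings are exposed as JSON over HTTP; the patent only states that a control and readout interface exists via the network interface, not its concrete protocol, and the address and field names below are hypothetical.

```python
import requests

# Hypothetical address and payload; the patent does not specify the protocol.
SENSOR_URL = "http://192.168.1.60/api/readings"

def read_sensors() -> dict:
    response = requests.get(SENSOR_URL, timeout=5)
    response.raise_for_status()
    return response.json()  # e.g. {"temperature": 14.2, "pressure": 1.3, "ph": 7.1}
```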
  • At least one attachment module arranged on at least one of the functional and / or connection modules can be provided, so that the sensor apparatus is fastenable to the fish farming system, for example mounted to fish tank border walls.
  • the attachment module is a mechanical structure and can be designed for specific tank types and dimensions. With the attachment module the sensor apparatus can be fastened to any other kind of structure allowing for floating at sea level or under a water surface. With the rigid structure of the attachment module, vibrations and effects of water flow and turbulence on camera positions can be eliminated.
  • a camera module as functional module of a sensor apparatus comprising at least a stereoscopic camera sensor and a stereoscopic optical lens system is proposed.
  • the camera module further comprises at least one local AI-based analyzing unit for pre-evaluating images taken by the camera module.
  • the recorded images can be preselected before upload to a cloud infrastructure and further processing of these images for biomass estimation or any other analysis.
  • a software trigger for images can also be realized and visibility conditions including information about water quality can be classified.
  • a sensor system comprising at least one above-mentioned sensor apparatus is proposed.
  • One of the sensor apparatuses, or several of them, can serve as a direct interface to the overall system. This means that access, e.g. for user interfaces, is possible directly via such a sensor apparatus.
  • the analyzing apparatus can comprise at least one AI-based analyzing unit for evaluation and optimization of fish growth in the fish farming system.
  • at least one cloud infrastructure can be arranged. Due to cloud connections, several of the sensor apparatuses can also be connected logically, at different locations and with regard to data processing, so that analyses such as correlations can be carried out. All aspects of controlling the sensor system can be carried out by using the cloud connection or through network communication. This includes Human-Machine Interfaces (HMI) and Machine-to-Machine (M2M) interfaces.
  • fish diseases can be detected early by using fish motion, fish morphology and / or fish behaviour. Furthermore, it is possible to detect fish hunger by using fish behaviour. Moreover, identification of individual fish can be achieved. It is further possible to integrate other sensors offering smart farm controlling for a digital production optimization. Fish weight together with total fish count is the key factor in determining the correct amount of fish feed. Since both too much and too little feed are harmful to the fish and their growth, knowing the correct fish weight is key to efficient aquaculture management. With biomass estimation, the issue of not knowing the weight of the individual fish in a tank can be solved. Additionally, fish growth over time is an indicator of whether other parameters such as dissolved oxygen, water temperature and pH are at optimal levels or need to be adjusted.
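As a simple illustration of how an estimated biomass could translate into a feed amount, assuming feed is dosed as a percentage of total biomass (the specific feeding rate); the 1.5 % value below is an assumed example figure, not taken from the patent.

```python
# Illustrative only: daily feed is often dosed as a percentage of total biomass.
def daily_feed_kg(mean_fish_weight_kg: float, fish_count: int,
                  feeding_rate: float = 0.015) -> float:
    biomass_kg = mean_fish_weight_kg * fish_count
    return biomass_kg * feeding_rate

print(daily_feed_kg(0.4, 5000))  # 5000 fish of 400 g -> 2000 kg biomass -> 30.0 kg feed per day
```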
  • the biomass estimation relies on processing the images taken by the camera modules.
  • the images can be uploaded to the cloud infrastructure for processing. To get the biomass from the images, they are run through roughly the following steps in top-to-bottom order (a schematic code sketch of these steps is given after the list):
  • object detection: drawing a bounding box around each fish;
  • object coordinates: determining the object coordinates (XYZ) of each keypoint;
  • post-processing: performing outlier-based filtering and, in the future, sanity checks on relative distances and angles between keypoints;
  • biomass estimation: estimating weight based on distances between keypoints via a divergence factor.
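A schematic Python sketch of the four steps above; the callables passed in stand for the trained detection, keypoint, photogrammetry and regression components and are hypothetical placeholders, as is the simple plausibility check.

```python
import numpy as np

def estimate_biomass(image_pair, detect, find_keypoints, to_3d, weight_model):
    """Skeleton of the four listed steps. `detect`, `find_keypoints`, `to_3d` and
    `weight_model` are hypothetical placeholders for the trained models and the
    photogrammetric routine."""
    weights = []
    for box in detect(image_pair):                  # 1. bounding box per fish
        kp_2d = find_keypoints(image_pair, box)     # 2. keypoints in both images
        kp_3d = to_3d(kp_2d)                        #    -> XYZ coordinates per keypoint
        if is_plausible(kp_3d):                     # 3. outlier / sanity filtering
            weights.append(weight_model(kp_3d))     # 4. weight from keypoint distances
    return weights

def is_plausible(points_xyz: np.ndarray, max_length_m: float = 1.0) -> bool:
    """Very simple sanity check: reject a fish whose largest keypoint distance is implausible."""
    diffs = points_xyz[:, None, :] - points_xyz[None, :, :]
    return float(np.linalg.norm(diffs, axis=-1).max()) < max_length_m
```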
  • the transition from a single image to stereometry takes place by means of the so-called keypoints.
  • the image of an isolated individual fish is only useful for such a process if one or more of the defined keypoints (organs, extremities) are clearly recognizable. It helps to place the marker where the dot can actually be seen, not where it is "known" to be. It is advantageous if more keypoints are clearly visible in the image. Incorrect exposure, lack of contrast and / or sharpness result in an inability to distinguish the characteristic features of the point in question.
  • the keypoints should always correspond to directly visible characteristics of certain body parts, wherein no geometric operations, e.g. point vertically / centred / above / below body part X, should be required to find and label the keypoint. Moreover, no morphological or physiological concepts should need to be applied to identify an indirectly visible keypoint.
  • the keypoints should furthermore always be defined as points on the certain body parts. Some examples are:
  • tail fin: upper/lower tail fin end, fork of tail fin
  • the keypoints should not be defined for body parts that normally do not appear in real footage because they lie closely against the main body and are thus not clearly detectable against the background or the main body. This is often the case for ventral fins of adult individuals or for front sections of dorsal fins. There is no need for the keypoints to be directly related to target measures. In real use cases, body sizes are seldom derived from distances between detectable features.
  • the keypoints should be defined for all such detectable features, even if their relevance or usage are initially undetermined. For example, the eye may well be irrelevant for body mass estimation, but may sooner or later be useful for calibrations of the camera modules.
  • the species bred, or at best the genus, is one of the principal characteristics of a use case. This characteristic not only determines the detectability of features on the body parts of fish, but ultimately also determines their mode of detection.
  • Training of object and keypoint detection may initially start from some adequate species-independent model weights. Nevertheless, the results of such training are only valid for a certain species or genus. When breeding several species together, the training ground truth must assign individuals to corresponding non-trivial classes.
  • pipelines of object and keypoint detection must be organized accordingly: when detecting an object from class A, its cropped sub-image would be submitted to the keypoint detection for class A.
  • Each class considered has a separately developed keypoint detection.
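A minimal sketch of such class-routed keypoint detection; the species names and the placeholder detectors are illustrative assumptions, standing in for separately trained keypoint networks.

```python
from typing import Callable, Dict, List

KeypointDetector = Callable[[object], List[tuple]]

# Hypothetical per-class detectors; in practice each entry would be a separately
# trained keypoint network for that species (names are illustrative only).
keypoint_detectors: Dict[str, KeypointDetector] = {
    "trout":  lambda crop: [],   # placeholder for the trout keypoint model
    "salmon": lambda crop: [],   # placeholder for the salmon keypoint model
}

def detect_keypoints(species: str, crop) -> List[tuple]:
    """Route the cropped sub-image from object detection to the detector of its class."""
    return keypoint_detectors[species](crop)
```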
  • Since ventral fins of adult individuals are almost never well detectable, a requirement to always label a ventral fin base position would lead to noise and pseudo-features in the training data.
  • a network capable of classifying images (or pairs of images) into a useful class, for which it is possible to determine 3D coordinates of one or more points, and a useless class, which does not allow the photogrammetric calculations to be made.
  • in the useless class, at least one of the images is blank, contains errors (blurred, underexposed), shows the fish at an unfavorable angle, shows several fish close together, or shows only insufficient parts of fish.
  • a network capable of further analyzing an image that has been recognized as useful, wherein the network identifies one or more different individuals and encloses them with a rectangle, with which the relevant sub-image can later be extracted. In the sub-image only a single fish is shown.
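A minimal sketch of a useful/useless pre-classifier, assuming a standard CNN backbone (here torchvision's ResNet-18); the patent does not prescribe a particular architecture, and the model below is untrained, so this only shows the intended interface.

```python
import torch
from torchvision.models import resnet18

model = resnet18(num_classes=2)        # class 0: "useless", class 1: "useful" (untrained here)
model.eval()

image = torch.rand(1, 3, 224, 224)     # stand-in for a normalized camera frame
with torch.no_grad():
    logits = model(image)
    is_useful = logits.argmax(dim=1).item() == 1
print("useful" if is_useful else "useless")
```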
  • annotations consist of rectangular zones that enclose the fish visible in the image.
  • the training material then consists of thousands, tens of thousands or hundreds of thousands of images, together with corresponding numerical definitions of the rectangular zones that enclose the fish in the respective image.
  • a majority of input data is used directly for training.
  • a smaller part is used to evaluate the recognition quality achieved by the network at any point in time.
  • This evaluation or validation consists in applying the network to an image to get a set of predictions which can then be compared with those obtained from the annotations. A match of approximately 100% is considered optimal since the validation annotations are not taken into account when adjusting the parameters.
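Such a comparison of predicted boxes with the annotated rectangles is commonly expressed via intersection over union (IoU); the helper below is a generic sketch and is not taken from the patent.

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted box that matches the annotated box well scores close to 1.0.
print(round(iou((10, 10, 110, 60), (12, 12, 108, 58)), 2))  # 0.88
```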
  • the network can be applied to new images that have never been used for training.
  • the results are expected to prove fairly accurate in many cases.
  • a function may be integrated by which the network itself is able to give its own "estimate" of the confidence with which it outputs its results.
  • the parameter sets obtained can in turn serve as a starting point to resume the training process at any time. Normally, old training data judged as not contributing to increasing the accuracy of the network is deleted, while thousands of additional images and annotations are added to the training data.
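Resuming training from a previously obtained parameter set can be sketched as follows, assuming a PyTorch model and optimizer; the file name is arbitrary and the helpers are illustrative only.

```python
import torch

def save_checkpoint(model, optimizer, path="keypoint_net.pt"):
    """Persist the obtained parameter set so training can be resumed later."""
    torch.save({"model": model.state_dict(), "optimizer": optimizer.state_dict()}, path)

def resume_checkpoint(model, optimizer, path="keypoint_net.pt"):
    """Load a previously saved parameter set as the starting point for further training."""
    state = torch.load(path)
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
```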
  • the aim is to isolate the images of individual fish in order to be capable of analyzing them further separately. Any mixing of individuals in an ROI is disruptive. There is a risk of calculating meaningless results when mixing the head of one fish with part of the fins of another fish, or with the tail of a third fish in the images.
  • the criterion is that the image of a fish is better if its longitudinal axis is oriented more or less in the X-direction and worse if this axis approaches the Z-direction.
  • An image of a fish oriented along the X-axis makes keypoints easier to see in comparison with an image of a fish having an increasing Z-component.
  • the accuracy of the measured values is reduced with an increasing Z-component, because the error inherent in the calculations increases sharply.
  • fish near one of the camera lenses may not appear, or may appear poorly, in the image taken by the sister camera. Besides that, fish further away from the camera show only small differences between both images, which makes photogrammetric triangulation and obtaining 3D coordinates more difficult.
  • crops of fish located at a medium distance to the camera are preferred, which may appear on both images of the pair, but with noticeable differences that provide the basis for determining the 3D coordinates.
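A hedged sketch of the photogrammetric step: triangulating one matched keypoint from a calibrated stereo pair with OpenCV. The projection matrices, baseline and pixel coordinates are placeholder values, not calibration data from the patent; for a fixed disparity error the depth error grows roughly with the square of the distance, consistent with the accuracy loss described above.

```python
import numpy as np
import cv2

# Placeholder calibration: in practice P1/P2 come from a stereo calibration of the camera pair.
f, baseline = 1200.0, 0.10                     # focal length [px], baseline [m] (assumed values)
P1 = np.array([[f, 0, 640, 0],
               [0, f, 360, 0],
               [0, 0,   1, 0]], dtype=float)   # left camera: K [I | 0]
P2 = P1.copy()
P2[0, 3] = -f * baseline                       # right camera shifted by the baseline along X

# One matched keypoint in the left and right image (pixel coordinates, 2 x N arrays).
left_pt  = np.array([[700.0], [400.0]])
right_pt = np.array([[640.0], [400.0]])        # 60 px disparity

hom = cv2.triangulatePoints(P1, P2, left_pt, right_pt)   # homogeneous 4 x N result
xyz = (hom[:3] / hom[3]).ravel()
print(xyz)  # depth Z = f * baseline / disparity = 1200 * 0.10 / 60 = 2.0 m
```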
  • By using the AI-based analyzing unit, controlling interventions in the production and breeding processes and alarm states in critical conditions can be generated. Furthermore, it can provide the following advantages:
  • Fig. 1 shows a schematic layout of a sensor apparatus according to an embodiment of the invention
  • Fig. 2 shows a schematic layout of a sensor apparatus according to an embodiment of the invention
  • Fig. 3 shows a schematic layout of a modular structure of a sensor apparatus according to an embodiment of the invention
  • Fig. 4 shows a perspective view of a connection module of a sensor apparatus according to an embodiment of the invention
  • Fig. 5 shows a perspective view of a camera module of a sensor apparatus according to an embodiment of the invention
  • Fig. 6 shows an exploded view of a connection between a functional module and a connection module of a sensor apparatus according to an embodiment of the invention
  • Fig. 7 shows a schematic layout of a camera module of a sensor apparatus according to an embodiment of the invention
  • Fig. 8 shows a schematic layout of a lighting module of a sensor apparatus according to an embodiment of the invention
  • Fig. 9 shows a schematic layout of a camera module with an integrated lighting module of a sensor apparatus according to an embodiment of the invention.
  • Fig. 10 shows keypoints of fish for biomass estimation
  • Fig. 11 shows a process view of a method of determining fish biomass
  • Fig. 12 shows a diagrammatic view of an AI-based method for biomass estimation.
  • Fig. 1 shows a schematic layout of a sensor apparatus 10 according to an embodiment of the invention.
  • the sensor apparatus 10 comprises a communication module 24, a lighting module 34 and a camera module 20, wherein the communication module 24 is connected to the lighting module 34 by a connection module 14 and the lighting module 34 is connected to the camera module 20 by a connection module 14.
  • the communication, lighting and camera modules 24, 34, 20 are designed as functional modules 12.
  • Two network interface units 50 are designed as data transmission units 28 and arranged in the communication module 24 for transmitting recorded data to external networks, e.g. with cloud infrastructures and / or local data processing units, wherein the network interface units 50 can be designed as wireless network interfaces.
  • a signal light unit 52 is provided for indicating status of the communication module 24 and error information to users.
  • a sensor unit 42 is arranged.
  • a control unit 54 is provided for controlling these units 50, 52, 42.
  • a light source 38 and a sensor unit 42 are arranged and controlled by a control unit 54, wherein the light source 38 can be designed as an infrared light source and / or a LED light source.
  • the camera module 20 comprises a camera 22 and a sensor unit 42 controlled by a control unit 54, through which a synchronisation of timing for image exposures and all optical parameters such as zoom, focus and iris are controllable.
  • two attachment modules 16 are provided for fastening the functional modules 12 to a fish farming system.
  • Fig. 2 shows a schematic layout of a sensor apparatus 10 according to an embodiment of the invention.
  • the structural construction of the sensor apparatus 10 is essentially identical to that of the sensor apparatus 10 shown in Fig. 1.
  • the sensor apparatus 10 shown in Fig. 2 differs from the sensor apparatus 10 shown in Fig. 1 in that a lighting module 34 is integrated in a camera module 20.
  • a light source 38 is arranged in the camera module 20.
  • Fig. 3 shows a schematic layout of a modular structure of a sensor apparatus 10 according to an embodiment of the invention, in which a connection module 14 can have a standardized connecting plate for forming a standard module connection interface between the connection module 14 and a functional module 12.
  • the interface section can include a network port, in particular a RJ-45 port which can be designed to be waterproof.
  • PoE Power-over-Ethernet
  • a central control and computing as well as communication unit is provided for communication with external networks and transmitting data as well as images recorded by functional modules and sensors.
  • Fig. 4 shows a connection module 14, which comprises a module plate 18 designed as standardized connecting plate 30, on which a functional module 12 can be attached via a cover section 32. At each end of a connector 60 the module plate 30 is fixed. Furthermore, a sealant plate 56 allowing for permanent underwater operations can be arranged and fastened on the module plate 30, so that the connection module can be sealed and all cable connections are free from any water. The module plate 30 and the sealant plate 56 have an opening 58 respectively, so that cables for power supply and network communication can pass through the opening 58 and be inserted in the connection module 14.
  • Fig. 5 shows a camera module 20 designed as functional module 12 comprising a cover section 32 for mechanical, electrical and communicative connecting the camera module 20 with a connection module 14. Furthermore, the camera module 20 comprises a RJ-45 port 62 for power supply and network communication.
  • a functional module 12 such as a camera or lighting module 20, 34 in Fig. 6 can be mechanically, electrically and communicatively connected to a connection module 14 via a cover section 32 and a module plate 18 designed as standardized connecting plate 30.
  • the connection plate 30 is designed functionally and mechanically complementary to the cover section 32.
  • a sealant plate 56 can be inserted between the connection plate 30 and the cover section 32 so that all cable connections inside a sensor apparatus 10 are free from water.
  • a RJ-45 port 62 for power supply and network communication of the functional module 12 can pass through openings 58 of the connection plate and the sealant plate 56.
  • the connection plate 30 is fastened at the end of a connector 60 of the connection module 14.
  • a camera module 20 in Fig. 7 comprises a central control and computing as well as communication unit for processing images, videos, video streaming and giving controlling signals via a camera driver unit to a plurality of cameras #1 to #N and to an accelerometer for event based image exposures.
  • Fig. 8 shows a lighting module 34 in which a central control and computing as well as communication unit is integrated. Signals can be given and sent via this unit to a plurality of light sources #1 to #N.
  • Fig. 9 shows a combination of the camera module 20 shown in Fig. 7 with the lighting module 34 shown in Fig. 8, wherein the lighting module 34 is integrated in the camera module 20.
  • Fig. 10 shows keypoints P1 to P10 defined for fish biomass estimation.
  • Keypoints P1 and P3 are near the rostrum and an eye.
  • Keypoints P2, P5 and P9 are near the tail.
  • Keypoints P10, P4, P8, P6 and P7 are near the dorsal, ventral and anal fins.
  • Fig. 11 shows a method of determining fish biomass.
  • the final outcome, i.e. the biomass of an individual fish or a group, is based on two inputs.
  • An input part 1 is given through a camera module 20 designed as functional module 12 which is installed underwater, and an input part 2 is given through data science, i.e. through created fish libraries.
  • the images are firstly taken and collected by the camera module 20 and then annotated. These images are then used for training a computer vision model. Matching and measurement are subsequently carried out by the computer vision model.
  • the fish libraries are created through the data science which is built by manual fish sampling, image annotation, measurements and regression models.
  • Fig. 12 shows a diagrammatic view of an AI-based method for biomass estimation. Images are taken by camera modules 20 every 24 h and uploaded as a zip archive to cloud infrastructures and / or local data processing units (S3).
  • Said S3 system is an online/cloud-based storage system for storing, handling and processing sensor, audio and video data streams. The zip archive is then transformed to a "swapped" bucket, because the left and right image halves of the combined images are mixed up. The combined image pairs are annotated and the annotations are exported in two formats for the corresponding keypoint detection training and object detection training.
  • These training datasets can be combined and configured to create training projects which can then be run to experiment with different hyperparameters.
  • the trained models are then uploaded and made available for predictions.
  • An S3 bucket is an online/cloud-based storage system for storage and real-time processing of the monitored fish data. The data is then produced to an online/cloud-based AI service, such as Kinesis, an Amazon Web Service for collecting, processing and analysing streaming video and streaming data in real time. From said AI service, the predictions are pushed to a publication platform such as OpenSearch, where filtering and weight regression of the fish entries are carried out.
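A minimal sketch of pushing the daily image archive to an S3-style store with boto3, as one step of the pipeline described above; bucket and key names are hypothetical, and the downstream streaming and publication steps are not shown.

```python
import boto3

# Hypothetical bucket and key names; the patent only states that the daily zip
# archive is uploaded to an S3-style cloud store for further processing.
s3 = boto3.client("s3")
s3.upload_file(
    Filename="images_2022-06-10.zip",      # zip archive produced by the camera module
    Bucket="fish-images-raw",              # assumed bucket name
    Key="raw/2022-06-10/images.zip",
)
```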

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention relates to a sensor apparatus (10) for a fish farming system comprising at least two functional modules (12) for measuring, recording and transmitting fish farming parameters and at least one connection module (14) for connecting the functional modules (12). Each functional module (12) comprises at least one uniformly formed cover section (32) and said connection module (14) comprises a connection plate (30) which is functionally and mechanically complementary to said cover section (32), so that at least two of the functional modules (12) can be connected to each other mechanically, electrically and communicatively by said connection module (14) via a connection of the cover section (32) and the connection plate (30). A functional module (12) of such a sensor apparatus (10) can be a camera module (20). The invention further relates to a sensor system (44) comprising at least one such sensor apparatus (10).
PCT/EP2022/065813 2021-06-11 2022-06-10 Sensor apparatus and sensor system for fish farming (Appareil de type capteur et système de capteur pour la pisciculture) WO2022258802A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021115145.7 2021-06-11
DE102021115145 2021-06-11

Publications (1)

Publication Number Publication Date
WO2022258802A1 (fr)

Family

ID=82214460

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/065813 WO2022258802A1 (fr) 2021-06-11 2022-06-10 Sensor apparatus and sensor system for fish farming (Appareil de type capteur et système de capteur pour la pisciculture)

Country Status (1)

Country Link
WO (1) WO2022258802A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101995875A (zh) 2010-12-14 2011-03-30 重庆市科学技术研究院 水产养殖远程自动化监控系统及其监控方法
WO2014125419A1 (fr) 2013-02-13 2014-08-21 Sreeram Raavi Dispositif et procédé de mesure des paramètres chimiques et physiques de l'eau destinée à l'aquaculture
WO2018111124A2 (fr) * 2016-12-15 2018-06-21 University Of The Philippines Estimation d'une taille de poissons, d'une densité de population, d'une répartition d'espèces et d'une biomasse
WO2018222048A1 (fr) 2017-05-29 2018-12-06 Ecotone As Procédé et système d'imagerie hyperspectrale sous-marine de poissons
WO2019121900A1 (fr) 2017-12-20 2019-06-27 Intervet Inc. Procédé et système de surveillance de parasites externes de poisson en aquaculture
US20200394402A1 (en) * 2018-03-09 2020-12-17 Nec Corporation Object identification device, object identification system, object identification method, and program recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22733591

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/02/2024)