EP2936051A1 - System and method for calculating physical dimensions for freely movable objects in water - Google Patents

System and method for calculating physical dimensions for freely movable objects in water

Info

Publication number
EP2936051A1
Authority
EP
European Patent Office
Prior art keywords
light
objects
fish
light source
11a
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP13864165.9A
Other languages
German (de)
French (fr)
Other versions
EP2936051A4 (en)
Inventor
Even BRINGSDAL
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vard Aqua Sunndal As
Original Assignee
Ebtech As
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (source: Darts-ip "Global patent litigation dataset", CC BY 4.0)
Application filed by Ebtech As filed Critical Ebtech As
Publication of EP2936051A1
Publication of EP2936051A4
Legal status: Ceased

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00 Culture of aquatic animals
    • A01K61/90 Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • A01K61/95 Sorting, grading, counting or marking live aquatic animals, e.g. sex determination specially adapted for fish
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145 Illumination specially adapted for pattern recognition, e.g. using gratings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects
    • G06V20/653 Three-dimensional objects by matching three-dimensional models, e.g. conformal mapping of Riemann surfaces
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A40/81 Aquaculture, e.g. of fish

Definitions

  • the present invention concerns a method for calculating physical dimensions, such as one or more of: 3D model, size, weight and volume, for freely movable objects in water, in accordance with the preamble of claim 1.
  • the present invention concerns a system for calculation of physical dimensions, such as one or more of: 3D model, size, weight and volume, for freely movable objects in water, in accordance with the preamble of claim 18.
  • the present invention is directed to calculation of one or more of: 3D model, size, weight and volume of fish in fish farming net cages, fish hatcheries where tubs are used, and similar applications.
  • Reliable automatic systems for monitoring of fish in fish farms are desirable to optimize the operation and reduce the costs.
  • the size of the fish, average weight, size distribution in the net cage, total biomass, including the condition of the fish, behavior and actual feed consumption are factors which are desirable to control.
  • NO 332103 concerns a system and a method for calculation of size of marine organisms in water.
  • the system utilizes a distance meter to measure the distance to the fish of which a camera is going to take an image.
  • the camera is arranged to shoot images down into the water, and the images are taken when the distance meter has detected an object within a metering range.
  • the image and the information from the distance meter are transferred to an image analysis tool to determine the size and weight of the fish.
  • NO 330423 describes a device and a method for counting fish or determination of biomass.
  • a device for determination of the volume or mass of an object suspended in a medium is described.
  • the volume or mass is calculated by using a 3D camera in combination with a grayscale 2D camera.
  • NO 330863 describes a device and a method for measuring average weight and appetite feeding in fish farms, where a method and a system for recording substantially freely movable objects in a fish farming net cage are disclosed. Images from a given number of cameras, from respective angles, are transferred to a data processing unit for processing. The data processing unit detects whether there is fish, pellet, faeces or other foreign elements in the image. Information about weight and volume is connected with data about the fish feeding. From EP 1 217 328 a method is known for providing a 3D image by projecting a known pattern on an object at a certain distance and taking an image of the object with the projected pattern.
  • the peculiarity of EP 1 217 328 is that the given pattern is formed by arranging alternating areas of local maximum and minimum light density.
  • WO 2010/098954 describes a method for estimation of a physical dimension of an object, where a given light pattern is projected onto the object, whereupon the reflected light is detected and the collected data are processed in a computer to provide a three-dimensional presentation of the object; the physical dimension of the object can then be calculated on the basis of the three-dimensional structure.
  • EP 1 659 857 A1 describes a method for recording and estimating the weight of fish.
  • a number of cameras, particularly CCD cameras, record images of fish moving past the cameras in a transfer pipe. The fish is illuminated from different sides in the transfer pipe, and images of different parts of the fish are recorded by a sequence controller in such a manner that a combined image recording is produced, which is used as a basis for estimating the fish weight.
  • a unit for performing measurements on fish moving in a transfer pipe is described, having at least two light sources on the wall of the transfer pipe for illumination of fish, including two or more cameras, particularly CCD cameras, arranged uniformly in a cross-plane around the circumference, to record reflections from the fish or shadow images of fish.
  • EP 2 425 215 describes a contact-less system and a method for estimation of the mass or weight of a target object.
  • the target object is depicted and a spatial illustration of the target object (the animal) is derived from the images.
  • a virtual spatial model is provided from a characteristic object from a class of objects to which the target object belongs.
  • the virtual spatial model is transformed for optimally matching the spatial presentation of the single animal.
  • the mass or weight of the target object is estimated as a function of shape variables characterizing the transformed virtual object.
  • NO 20101736 describes a system and a method for calculation of size of marine organisms in water, such as fish, where at least one camera is used, and image analysis tool to analyze images taken by the camera.
  • the system comprises use of a distance meter to measure the distance to the fish of which the camera is to take images, where said camera is arranged to take images down into the water of single fish upon reception of a signal from the distance meter, and images of the fish, including information about the distance to the fish, are transferred to the image analysis tool for determination of the size and weight of the fish.
  • An obvious disadvantage of this is that the distance to the object is required in order to calculate the volume of the object.
  • the distance to the object is used, and from this, the size of the object is calculated by counting the number of pixels containing fish in the image together with the distance. Since the distance is only measured at one point, the distance measurement is quite inaccurate and depends on how the object moves or is positioned. It should also be noted that this solution is only arranged to take images of single fish based upon the distance meter, which means that fish to be evaluated must be within the metering range of the distance meter, resulting in a limited data basis for statistical analysis.
  • US 2004/0008259 A1 describes a system which records physical dimensions of an object, consisting of a light source which illuminates the object and a camera taking images of the object.
  • the light source illuminates the object with light having a known wavelength and pattern, and the camera has a filter for filtering out light not originating from the light source, so that light from the environment does not interfere with the measurements.
  • a processor calculates the physical size of the object from the images.
  • This publication is arranged for use in air and not for use in water, and is not able to handle back-scattering from particles, algae or the like in water.
  • Particles, algae, high fish density and the like substantially affect the light conditions in a net cage. This makes the use of computer-aided imaging challenging, since the system has to take these varying conditions into consideration, which the present systems fail to do satisfactorily.
  • the main object of the present invention is to provide a method and a system solving the disadvantages of the prior art mentioned above.
  • An object of the present invention is to provide a system and a method which are capable of estimating physical dimensions for freely movable objects in water, also when the light conditions and visibility in general are poor and/or varying.
  • the present invention provides a system and a method for measuring physical dimensions for freely movable objects in water, particularly of freely movable fish in fish farming net cages, fish hatcheries where tubs are being used, or similar applications by using at least one light source arranged to illuminate an object by emitting light or structured light at a given wavelength, including at least one recording means in the form of a 2D camera for recording still images or video of the illuminated objects.
  • the system comprises at least one light source arranged to project a predefined or pre-selected light pattern at a selected wavelength on freely movable objects to be observed, including at least one recording means in the form of a 2D camera arranged to depict (still image or video) the freely movable object being observed having the light pattern from the light source projected thereon.
  • the system comprises at least one light source being arranged to illuminate freely movable objects to be observed by light having a selected wavelength, including at least two recording means in the form of at least two 2D cameras arranged to provide a stereoscopic vision system.
  • the system comprises a control unit arranged to control the light source to emit light having given properties, such as selection of wavelength (colour), intensity, frequency, etc.
  • the control unit will be arranged to select actual light pattern when a light pattern is to be projected onto the object.
  • the system may comprise a separate image processing unit provided with means and/or software arranged to utilize the information from the illuminated object or objects, on which the light pattern is projected, recorded by the recording means to generate a 3D model of the freely movable object.
  • the image processing unit or its functions may be arranged in the control unit or in an external unit.
  • a light pattern is preferably used that covers as much of the object as possible. Speed calculation is difficult because the object may have varying speed; for example, a fish varies its speed since it uses its tail when swimming. By projecting a pattern onto the object, only one image is required to calculate the 3D model.
  • the 3D model can be used to estimate volume and weight on the basis of a function of shape variables characterizing the depicted object. Measured parameters in the 3D model, such as length and height, are inserted into a calculation formula for estimation of weight.
  • a substantial advantage of the present invention is that it enables calculation of weight of one or more fish at a time in every image depending on the fish density.
  • the system comprises static optical or programmable filters arranged at the recording means, said filters being arranged to accept selected wavelengths, thus obtaining homogenous quality of the recorded images.
  • selected wavelength will mainly be the wavelength which is sent from the light source to illuminate the object or to project the light pattern onto the freely movable object, which e.g. will be light in the infrared wavelength range.
  • the object is advantageously illuminated/projected by the light source at a defined angle in relation to the recording means.
  • since the recording means are provided with static optical or programmable filters, unwanted reflections are avoided, so that the recording means collect the correct light/light pattern and produce images which can be used in automatic image processing in the image processing unit. Since light is absorbed quickly in water, it is particularly important to use filter combinations to obtain images of sufficient quality. Additionally, the conditions in a net cage change rapidly. This makes presently known image processing difficult and is one of the reasons that there are no automatic optically based biomass meters on the market today.
  • filtering and correct angle are important with regard to minimizing reflections from, e.g., the shiny surface of the fish. Too much reflection will result in saturation of the image chip due to too much light.
  • the water absorption also makes this angle parameter a lot more important than in air, since it is particularly important to obtain as much reflection of the light as possible.
  • the anisotropic properties of the fish scales result in the light being scattered in different directions, so the energy in the light is spread and absorbed faster in the water, resulting in less light reaching the recording means.
  • Infrared light is absorbed quickly in water, which means that at a certain depth there is no noise from infrared light from the sun. For that reason, only infrared light from an artificial light source is visible in the images. Homogenous measurement conditions are obtained by filtering away all wavelengths shorter than that of the artificial light source. In order to ensure good measurements despite the light being absorbed quickly in water, a light source having the correct wavelength and density must be used.
  • the contrast in water decreases with increasing distance to the object. Since the contrast is low in underwater images, it is necessary to add information to the images, which in the present invention is done by projecting a pattern onto the object. Adding information to the image by the projected pattern also makes it possible to distinguish several objects from each other, since 3D models of the objects may be generated irrespective of the distance to them being different, which enables calculation of volume, weight and size therefrom.
  • the system must be calibrated in water to increase accuracy of the measurements. This must be done to take into consideration that the refraction index in air versus water is different.
  • the present invention will not suffer under conditions like these, since the light source can be arranged with a selected wavelength, intensity, frequency, etc., so that the present invention may take these conditions into consideration and thus provide images of homogenous quality for image processing.
  • the present invention is based on an optical system combined with an artificial light source, where the light source illuminates objects or projects a light pattern onto the object, such as fish, at a selected wavelength, and all remaining light is filtered away from the recording means by means of filters, or an image-producing sensor is used which only collects the selected wavelength.
  • a homogenous measuring environment is achieved. For example, by using a wavelength, such as in the infrared wavelength range, which is quickly absorbed in water, low interference is obtained from other light sources (such as sunlight or other artificially supplied light).
  • a homogeneous quality of the recorded images for use in image processing is obtained by filtering away/minimizing wavelengths other than the light supplied from the light source, or by utilizing an image-producing sensor that collects only the selected wavelength, i.e. the images are without interference from other light sources.
  • the homogeneous image quality makes it possible to perform a robust and accurate image processing and 3D modelling.
  • the 3D model is, in accordance with the present invention, calculated by triangulation between light and image, or between images from several cameras. This provides a "depth image" where the pixel value represents distance to the measured object.
  • When the object to be observed is fish, it is an advantage to arrange the present invention close to the feeding area for the fish in a net cage, since all fish pass by to eat during a feeding period, providing the best possible representative selection of measured fish.
  • the present invention may naturally be placed anywhere in the net cage. By placing the system in accordance with the present invention in the vicinity of the feeding area, it is not necessary to move the system around in the net cage to seek out the fish where it resides.
  • the goal of fish farmers is that all fish must appear at the feeding area to eat during the day, which means that both small and large fish will pass the present invention during a feeding cycle.
  • the present invention may also be provided with a winch or similar for measurement through a water column, and for elevation and lowering in the net cage, alternatively also arranged to be moved around within the net cage if desired by means of suitable means for this.
  • the present invention is also applicable in estimation of number of fish, by using data about fish density in a model for this.
  • Figure 1 shows a principle drawing of a system in accordance with a first embodiment of the present invention using structured light
  • Figure 2 shows a principle drawing of a system in accordance with a second embodiment of the present invention which is using a light source for illumination of the object to be observed
  • Figure 3 shows a principle drawing of a system in accordance with the present invention located in a net cage
  • Figure 4 shows a block diagram of a system in accordance with the present invention.
  • Figure 1 shows a principle drawing of a system in accordance with a first embodiment of the present invention, to be arranged in a net cage for observation of fish.
  • a system in accordance with the first embodiment of the present invention comprises at least one light source 10 arranged for emitting structured light at a selected wavelength, and at least one 2D camera 11 provided with an optical filter 12.
  • the 2D camera 11 may be arranged to take still images only, video only or both video and still images.
  • the light source 10 is, to enable emission of structured light, provided with means and/or software for emitting a known or selected light pattern 13 which is to be projected on fish 100 or other objects residing within the illumination range of the light source 10 at a selected wavelength.
  • By using structured light, basically only one 2D camera is required for image recognition, but using several cameras may also be an advantage, as described further below.
  • the light pattern 13 projected by the light source 10 may exhibit any selected pattern.
  • the pattern may be formed by one or more of (but not limited to the list): - either horizontal or vertical lines,
  • the pattern as a whole should preferably cover the entire object to be observed.
  • Figure 2 shows a principle drawing of a system in accordance with a second embodiment of the invention, to be arranged in a net cage for observation of fish.
  • a system in accordance with the second embodiment of the present invention comprises at least one light source 10 arranged to illuminate the object at a selected wavelength, and at least two 2D cameras 11a-b arranged to provide a stereoscopic vision system.
  • at least two 2D cameras are required to enable generation of a 3D model of the object.
  • Two or more cameras are required to enable calculation of the distance to the object and thereupon generating a 3D model.
  • the 2D cameras 11a-b may be arranged to take still images only, video only, or both video and still images.
  • the light source(s) 10 is/are, both in the first and second embodiment, arranged to emit light having desired parameters with regard to wavelength (color), optionally frequency, intensity, etc.
  • the light source is preferably of the type light emitting diode (LED), laser or another type arranged for emitting light at selected wavelength, including light sources where only selected wavelength is filtered out.
  • the 2D camera(s) is/are, both in the first and second embodiment, provided with optical filters 12, 12a-b arranged to accept light only at selected wavelengths.
  • the 2D cameras 11, 11a-b are provided with image-producing sensors which only collect the selected wavelength.
  • Use of several 2D cameras 11a-b provides more possibilities, optionally also in combination with several light sources or controllable light sources.
  • the 2D cameras 11, 11a-b may, for example, be provided with different filters 12a-b, i.e. filters having different properties.
  • one of the cameras may be provided with an IR filter which only accepts IR light.
  • if the light source 10 emits a light/light pattern with IR light, this camera will therefore accept the light/light pattern.
  • the other camera may, for example, be arranged with a filter which accepts only green light.
  • a 3D model may be generated on basis of the two images in combination.
  • several light sources may also be used, for example two light sources. Then, for example, both light sources may emit light with a pattern at different wavelengths, and the two cameras 11a-b collect separate patterns, whereupon the images are combined in the generation of a 3D model. Moreover, the light source may emit light at different wavelengths to obtain the best possible images or video for processing.
  • the system advantageously comprises a suspension device 20, e.g. formed of a suspension point 21 and a rod 22 arranged vertically from the suspension point 21, said rod 22 being adapted for attachment of the 2D cameras 11a-b and the light source 10.
  • the light source 10 is, in the example shown in Figure 2, arranged between the two 2D cameras 11a-b, at a given distance from the 2D cameras 11a-b, and the light source 10 exhibits an angle in relation to the fish to be observed, whereas in the first embodiment it is located under the 2D camera 11, but it could naturally also have been located above the camera.
  • the light source 10 may be located anywhere in relation to the 2D camera(s) depending on the light reflection from the object to be illuminated.
  • FIG. 3 shows a principle drawing of a system in accordance with the first embodiment of the invention arranged in a net cage 30 in the vicinity of a feeding station 40.
  • the system in accordance with the invention is in this example connected to a winch 50 for elevating and lowering the system in the net cage.
  • the system may be provided with means for moving it around within the net cage 30, if desirable. It may be seen from the figure that a light pattern 13 is projected onto a fish 100' by the light source 10, while the other fish 100 do not have a light pattern projected on them.
  • the system may be arranged stationary in the feeding area, so that the users don't have to move the system around, as all fish usually visit the feeding area during the course of a day.
  • the system comprises an image processing unit 60 provided with means and/or software to utilize the information from the light pattern 13 recorded by the 2D camera 11 to generate a 3D model of the fish 100' being illuminated (for the first embodiment) or provided with means and/or software to utilize the information from the stereoscopic vision created by the two 2D cameras 11a-b (for the second embodiment).
  • the image processing unit 60 may be a separate unit, a unit integrated in a control unit 70 for the system, or an external unit 80.
  • the system comprises a control unit 70 which may comprise the image processing unit 60, and is moreover provided with means and/or software for controlling the light source(s) 10, the recording means 11, 11a-b and optionally the filter(s) 12, 12a-b.
  • the control unit 70 will provide the filters 12, 12a-b with settings informing which wavelength the light source 10 is emitting light with, so that the filters 12, 12a-b are adjusted to receive light only at this wavelength.
  • the system is provided with communication means for communication with an external unit 80, which external unit 80 may also be the control unit of the system.
  • the control unit 70, the image processing unit 60 or the external unit 80 are provided with means and/or software for utilizing the generated 3D model for estimation of physical dimensions of the depicted fish, such as size, volume and weight, based on a function of shape variables characterizing the depicted object. Measured parameters (for example length and height) from the 3D model are used in a model for estimating weight.
  • the weight of singular fish is advantageously stored in a database, for example, in the external unit 80 or the control unit 70, so that the average weight may be calculated when the data basis has become large enough for this.
  • the total biomass in the net cage can be calculated from the average weight and the total number of fish, as illustrated in the calculation sketch following this list.
  • the size distribution may be found as well.
  • control unit 70 or the external unit 80 is arranged to present the calculated physical dimensions to the user.
  • control unit 70 may be arranged to control the winch 50 for elevating and lowering the system in the net cage 30, including any other means for moving the system in accordance with the present invention around within a net cage 30.
  • control unit 70 is also arranged to control the feeding station 40 on basis of image information from the recording means.
  • the present invention also has another field of use, in that the acquired images/videos may also be used to search for, e.g., salmon lice, deformations, injuries, illness and the like.
  • the present invention is not limited to net cages but may also be used in fish hatcheries, where tubs are used, and similar applications.
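The averaging, biomass and size-distribution steps referred to in the list above are simple arithmetic. The following is a minimal sketch of how per-fish weight estimates stored in a database could be aggregated; all function and variable names are illustrative assumptions, not taken from the patent.

```python
from collections import Counter

def aggregate_biomass(weights_g, total_fish_count, bin_size_g=500):
    """Aggregate per-fish weight estimates (grams) stored in the database.

    weights_g        -- weight estimates for individually measured fish
    total_fish_count -- number of fish in the net cage (known or estimated)
    bin_size_g       -- width of the size-distribution bins
    """
    if not weights_g:
        raise ValueError("no measurements yet - data basis too small")

    average_weight_g = sum(weights_g) / len(weights_g)
    total_biomass_kg = average_weight_g * total_fish_count / 1000.0

    # Size distribution: number of measured fish per weight bin.
    distribution = Counter((w // bin_size_g) * bin_size_g for w in weights_g)

    return average_weight_g, total_biomass_kg, dict(sorted(distribution.items()))

# Example: four measured fish, 50 000 fish assumed in the cage.
avg, biomass, dist = aggregate_biomass([4200, 3900, 4500, 4100], 50_000)
print(avg, biomass, dist)
```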

Abstract

Method and system for calculating physical dimensions for freely movable objects in water, by illuminating the object or projecting a known or selected light pattern (13) on the object by means of at least one light source (10) at selected wavelength or wavelengths, recording illuminated object or objects with the projected light pattern (13) by means of recording means (11, 11a-b) in the form of at least one 2D camera provided with filters (12, 12a-b) arranged to only accept light having selected wavelengths or provided with image producing sensors arranged to only collect light with selected wavelengths, including generation of a 3D model based on recorded images and/or video from the recording means (11, 11a-b) as basis for calculating the physical dimensions.

Description

System and method for calculating physical dimensions for freely movable objects in water
The present invention concerns a method for calculating physical dimensions, such as one or more of: 3D model, size, weight and volume, for freely movable objects in water, in accordance with the preamble of claim 1.
Moreover, the present invention concerns a system for calculation of physical dimensions, such as one or more of: 3D model, size, weight and volume, for freely movable objects in water, in accordance with the preamble of claim 18.
In particular, the present invention is directed to calculation of one or more of: 3D model, size, weight and volume of fish in fish farming net cages, fish hatcheries where tubs are used, and similar applications.
Background
Reliable automatic systems for monitoring of fish in fish farms are desirable to optimize the operation and reduce the costs. The size of the fish, average weight, size distribution in the net cage, total biomass, including the condition of the fish, behavior and actual feed consumption are factors which are desirable to control.
NO 332103 concerns a system and a method for calculation of the size of marine organisms in water. The system utilizes a distance meter to measure the distance to the fish of which a camera is going to take an image. The camera is arranged to shoot images down into the water, and the images are taken when the distance meter has detected an object within a metering range. The image and the information from the distance meter are transferred to an image analysis tool to determine the size and weight of the fish.
NO 330423 describes a device and a method for counting fish or determination of biomass. A device for determination of the volume or mass of an object suspended in a medium is described. The volume or mass is calculated by using a 3D camera in combination with a grayscale 2D camera.
NO 330863 describes a device and a method for measuring average weight and appetite feeding in fish farms, where a method and a system for recording substantially freely movable objects in a fish farming net cage are disclosed. Images from a given number of cameras, from respective angles, are transferred to a data processing unit for processing. The data processing unit detects whether there is fish, pellet, faeces or other foreign elements in the image. Information about weight and volume is connected with data about the fish feeding. From EP 1 217 328 a method is known for providing a 3D image by projecting a known pattern on an object at a certain distance and taking an image of the object with the projected pattern. Then, the pattern is detected from the image, whereupon the detected and the projected patterns are compared and the distance to the numerous parts of the object is calculated. The peculiarity of EP 1 217 328 is that the given pattern is formed by arranging alternating areas of local maximum and minimum light density.
WO 2010/098954 describes a method for estimation of a physical dimension of an object, where a given light pattern is projected onto the object, whereupon the reflected light is detected and the collected data are processed in a computer to provide a three-dimensional presentation of the object; the physical dimension of the object can then be calculated on the basis of the three-dimensional structure.
EP 1 659 857 A1 describes a method for recording and estimating the weight of fish. A number of cameras, particularly CCD cameras, record images of fish moving past the cameras in a transfer pipe. The fish is illuminated from different sides in the transfer pipe, and images of different parts of the fish are recorded by a sequence controller in such a manner that a combined image recording is produced, which is used as a basis for estimating the fish weight. A unit for performing measurements on fish moving in a transfer pipe is described, having at least two light sources on the wall of the transfer pipe for illumination of the fish, including two or more cameras, particularly CCD cameras, arranged uniformly in a cross-plane around the circumference, to record reflections from the fish or shadow images of the fish.
EP 2 425 215 describes a contact-less system and a method for estimation of the mass or weight of a target object. The target object is depicted and a spatial illustration of the target object (the animal) is derived from the images. A virtual spatial model is provided from a characteristic object from a class of objects to which the target object belongs. The virtual spatial model is transformed for optimally matching the spatial presentation of the single animal. Finally, the mass or weight of the target object is estimated as a function of shape variables characterizing the transformed virtual object.
NO 20101736 describes a system and a method for calculation of the size of marine organisms in water, such as fish, where at least one camera is used, together with an image analysis tool to analyze images taken by the camera. Moreover, the system comprises use of a distance meter to measure the distance to the fish of which the camera is to take images, where said camera is arranged to take images down into the water of single fish upon reception of a signal from the distance meter, and images of the fish, including information about the distance to the fish, are transferred to the image analysis tool for determination of the size and weight of the fish. An obvious disadvantage of this is that the distance to the object is required in order to calculate the volume of the object. In other words, the distance to the object is used, and from this, the size of the object is calculated by counting the number of pixels containing fish in the image together with the distance. Since the distance is only measured at one point, the distance measurement is quite inaccurate and depends on how the object moves or is positioned. It should also be noted that this solution is only arranged to take images of single fish based upon the distance meter, which means that fish to be evaluated must be within the metering range of the distance meter, resulting in a limited data basis for statistical analysis.
US 2004/0008259 A1 describes a system which records physical dimensions of an object, consisting of a light source which illuminates the object and a camera taking images of the object. The light source illuminates the object with light having a known wavelength and pattern, and the camera has a filter for filtering out light not originating from the light source, so that light from the environment does not interfere with the measurements. A processor calculates the physical size of the object from the images. This publication is arranged for use in air and not for use in water, and is not able to handle back-scattering from particles, algae or the like in water.
Disadvantages of the prior art are that it does not take the inhomogeneous light conditions in water into consideration, which makes it difficult to segment out the fish from the images in a robust, automatic and accurate manner. 3D modeling of the fish thus becomes inaccurate, resulting in a low number of measurements and an inaccurately estimated weight.
An increasingly industrialized farming industry, where the trend is larger and larger net cages with more fish, imposes limitations on the existing technologies for measurement of biomass. In order to obtain a good result from biomass measurement, it is important to provide a representative selection of measurements. Today, there are some systems on the market for measuring the biomass of fish. The most frequently used is a solution where the fish has to swim through a physical device to be measured. This is something the fish resist, and frequently a low number of measurements is obtained, resulting in a low statistically representative basis for calculation of the correct average weight in the net cage, with deviations from the actual weight as a result. Use of 2D cameras (arranged to obtain stereoscopic vision) to measure fish biomass is known per se, but these solutions are based on manual recognition of the fish in two images to find corresponding points; such systems face challenges with regard to repeatability, and they are labor demanding.
Particles, algae, high fish density and the like substantially affect the light conditions in a net cage. This makes the use of computer-aided imaging challenging, since the system has to take these varying conditions into consideration, which the present systems fail to do satisfactorily.
Object
The main object of the present invention is to provide a method and a system solving the disadvantages of the prior art mentioned above.
Moreover, it is an object to provide a system and a method that provides increased accuracy in calculation of physical dimensions for freely movable objects in water.
An object of the present invention is to provide a system and a method which are capable of estimating physical dimensions for freely movable objects in water, also when the light conditions and visibility in general are poor and/or varying.
It is also an object of the present invention to provide a system and a method that provides still images or video images with homogenous quality as a basis for generation of a 3D model for freely movable objects in water, since underwater images/video otherwise become of heterogeneous quality due to varying and bad light conditions.
Moreover, it is an object of the present invention to provide a system and a method which utilizes light of varying wavelength to illuminate the object, and at least two recording means arranged to provide a stereoscopic vision system that produces still images or video images of homogenous quality as a basis for generation of a 3D model for freely movable objects in water.
Moreover, it is an object of the present invention to provide a system and a method that utilizes structured light at selected wavelength which is projected as a light pattern on freely movable objects in water, which together with at least one recording means provides still images or video images of homogenous quality as basis for generation of a 3D model for freely movable objects in water.
Moreover, it is an object of the present invention to provide a system and a method that take interfering light sources into consideration, by using filters or image-producing sensors to record only light at selected wavelengths. Moreover, it is an object of the present invention to provide a system and a method that measure a sufficiently large number of objects to be able to calculate the average weight as accurately as possible, optionally including biomass.
The invention
A method in accordance with the invention is stated in claim 1. Advantageous features of the method are described in claims 2-17.
A system in accordance with the invention is stated in claim 18. Advantageous features of the system are described in the claims 19-27.
The present invention provides a system and a method for measuring physical dimensions for freely movable objects in water, particularly of freely movable fish in fish farming net cages, fish hatcheries where tubs are being used, or similar applications by using at least one light source arranged to illuminate an object by emitting light or structured light at a given wavelength, including at least one recording means in the form of a 2D camera for recording still images or video of the illuminated objects. In a first embodiment of the invention, the system comprises at least one light source arranged to project a predefined or pre-selected light pattern at a selected wavelength on freely movable objects to be observed, including at least one recording means in the form of a 2D camera arranged to depict (still image or video) the freely movable object being observed having the light pattern from the light source projected thereon. In a second embodiment, the system comprises at least one light source being arranged to illuminate freely movable objects to be observed by light having a selected wavelength, including at least two recording means in the form of at least two 2D cameras arranged to provide a stereoscopic vision system.
Moreover, the system comprises a control unit arranged to control the light source to emit light having given properties, such as selection of wavelength (colour), intensity, frequency, etc. Moreover, the control unit will be arranged to select actual light pattern when a light pattern is to be projected onto the object.
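As a rough illustration of the kind of coordinated settings such a control unit could manage (wavelength, intensity, optional modulation frequency, pattern selection, and a filter pass-band slaved to the light source), here is a minimal configuration sketch. All class names, fields and values are hypothetical and serve only to make the idea concrete.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LightSourceSettings:
    wavelength_nm: float                    # e.g. ~850 nm for near-infrared
    intensity: float                        # relative drive level, 0..1
    modulation_hz: Optional[float] = None   # optional flash/strobe frequency
    pattern: Optional[str] = None           # e.g. "horizontal_lines", "dots"; None = plain light

@dataclass
class FilterSettings:
    center_nm: float                        # pass-band centre, matched to the light source
    bandwidth_nm: float                     # pass-band width

def configure(projecting_pattern: bool) -> tuple[LightSourceSettings, FilterSettings]:
    """Pick coherent light-source and filter settings for one recording session."""
    light = LightSourceSettings(
        wavelength_nm=850.0,
        intensity=0.8,
        pattern="horizontal_lines" if projecting_pattern else None,
    )
    # The filter is slaved to the light source so the camera only sees this wavelength.
    flt = FilterSettings(center_nm=light.wavelength_nm, bandwidth_nm=20.0)
    return light, flt
```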
Moreover, the system may comprise a separate image processing unit provided with means and/or software arranged to utilize the information from the illuminated object or objects, on which the light pattern is projected, recorded by the recording means to generate a 3D model of the freely movable object. As an alternative, the image processing unit or its functions may be arranged in the control unit or in an external unit.
In order to avoid calculation of the object speed, a light pattern is preferably used that covers as much of the object as possible. Speed calculation is difficult because the object may have varying speed; for example, a fish varies its speed since it uses its tail when swimming. By projecting a pattern onto the object, only one image is required to calculate the 3D model.
Moreover, the 3D model can be used to estimate volume and weight on the basis of a function of shape variables characterizing the depicted object. Measured parameters in the 3D model, such as length and height, are inserted into a calculation formula for estimation of weight.
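The patent does not state the calculation formula itself. As one plausible form, fisheries work often uses a length-weight power law, so the sketch below shows how length and height measured from the 3D model could feed such a formula; the coefficients and the height-based shape correction are illustrative assumptions only.

```python
def estimate_weight_g(length_cm: float, height_cm: float,
                      a: float = 0.0105, b: float = 3.0,
                      ref_height_ratio: float = 0.22) -> float:
    """Estimate fish weight (grams) from 3D-model measurements.

    Uses the classic length-weight power law W = a * L**b and scales it by how
    the measured height/length ratio deviates from a reference body shape.
    The coefficients are illustrative, not calibrated values from the patent.
    """
    base_weight = a * length_cm ** b              # condition-factor style power law
    shape_correction = (height_cm / length_cm) / ref_height_ratio
    return base_weight * shape_correction

# Example: a 60 cm fish with 13 cm body height gives roughly 2.2 kg with these coefficients.
print(round(estimate_weight_g(60.0, 13.0)))
```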
Images/video that contain incomplete objects do not have to be discarded prior to further analysis; instead, images of incomplete objects may either be assembled into a 3D model from several images or from several 3D models, or alternatively the complete 3D model may be estimated on the basis of an image of an incomplete object. Moreover, it should be noted that a substantial advantage of the present invention is that it enables calculation of the weight of one or more fish at a time in every image, depending on the fish density.
Moreover, the system comprises static optical or programmable filters arranged at the recording means, said filters being arranged to accept selected wavelengths, thus obtaining homogenous quality of the recorded images. Typically, the selected wavelength will mainly be the wavelength which is sent from the light source to illuminate the object or to project the light pattern onto the freely movable object, which e.g. will be light in the infrared wavelength range.
Use of structured light, or light in combination with filters to provide homogeneously recorded images, solves some of the challenges which until now have been encountered when using computer-aided imaging (image processing) on freely movable objects in water, particularly fish, by being able to segment out the fish even though the contrast in the image is low and the surrounding conditions are varying.
In other words, in accordance with the present invention it will be possible to estimate physical dimensions of the object in a proper and accurate manner, even when the light conditions and visibility otherwise are bad, also during varying light conditions. In order to avoid unwanted reflections from the shiny surface of a fish, for example, the object (the fish) is advantageously illuminated/projected by the light source at a defined angle in relation to the recording means.
Because the recording means are provided with static optical or programmable filters, unwanted reflections are avoided, so that the recording means collect the correct light/light pattern and produce images which can be used in automatic image processing in the image processing unit. Since light is absorbed quickly in water, it is particularly important to use filter combinations to obtain images of sufficient quality. Additionally, the conditions in a net cage change rapidly. This makes presently known image processing difficult and is one of the reasons that there are no automatic optically based biomass meters on the market today.
It should additionally be noted that filtering and correct angle are important with regard to minimizing reflections from, e.g., the shiny surface of the fish. Too much reflection will result in saturation of the image chip due to too much light.
An example of the importance of the angle between the recording means and the light, particularly when using infrared light in an underwater application with fish as the object, is the following: if the angle is wrong, too much light is absorbed by the fish or reflected in the wrong direction, too little light reaches the recording means, and the images become useless. By changing the angle somewhat, more light is reflected and the images become good and can be used in the further image processing. Moreover, a correct angle between the recording means and the light is particularly important in this field of use, since the surface of the fish is not smooth, but anisotropic. To avoid back-scattering, i.e. that particles, algae etc. are illuminated, it is important to find the correct angle between the recording means and the light. These challenges are not encountered in air. The water absorption also makes this angle parameter a lot more important than in air, since it is particularly important to obtain as much reflection of the light as possible. The anisotropic properties of the fish scales result in the light being scattered in different directions, so the energy in the light is spread and absorbed faster in the water, resulting in less light reaching the recording means.
Infrared light is absorbed quickly in water, which means that at a certain depth there is no noise from infrared light from the sun. For that reason, only infrared light from an artificial light source is visible in the images. Homogenous measurement conditions are obtained by filtering away all wavelengths shorter than that of the artificial light source. In order to ensure good measurements despite the light being absorbed quickly in water, a light source having the correct wavelength and density must be used.
Since infrared light has a limited range in water, a black background can be obtained in the images in accordance with the present invention, which is an advantage in segmentation of the pattern for calculation of the 3D model from the image.
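Because a filtered infrared image ideally shows only the projected pattern against a near-black background, segmenting the pattern can be as simple as a global threshold followed by a little cleanup. A minimal sketch using OpenCV, where the file name is a placeholder:

```python
import cv2
import numpy as np

# Load one wavelength-filtered near-infrared frame (grayscale); the filename is illustrative.
frame = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)

# Otsu's threshold separates the bright projected pattern from the dark water background.
_, pattern_mask = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Remove small speckles from particles/backscatter before using the mask.
kernel = np.ones((3, 3), np.uint8)
pattern_mask = cv2.morphologyEx(pattern_mask, cv2.MORPH_OPEN, kernel)

# Pixel coordinates of the pattern, ready for triangulation / 3D modelling.
ys, xs = np.nonzero(pattern_mask)
print(f"{len(xs)} pattern pixels found")
```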
Another aspect which has to be taken into consideration is that the contrast in water decreases with increasing distance to the object. Since the contrast is low in underwater images, it is necessary to add information to the images, which in the present invention is done by projecting a pattern onto the object. Adding information to the image by the projected pattern also makes it possible to distinguish several objects from each other, since 3D models of the objects may be generated irrespective of the distance to them being different, which enables calculation of volume, weight and size therefrom.
Because of the above-mentioned challenges of using light in water, the system must be calibrated in water to increase the accuracy of the measurements. This is done to take into consideration that the refractive index of water differs from that of air.
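The calibration itself can follow standard checkerboard camera calibration, with the essential detail that the calibration images are captured with the equipment submerged, so the air/water refraction is absorbed into the estimated camera parameters. A minimal OpenCV sketch under those assumptions; the board size, square size and file paths are placeholders:

```python
import glob
import cv2
import numpy as np

BOARD = (9, 6)                                   # inner corners of the checkerboard
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * 25.0   # 25 mm squares

obj_points, img_points = [], []
for path in glob.glob("underwater_calib/*.png"):  # images taken in water, not in air
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# The intrinsics estimated here implicitly include the effect of the water's refractive index.
rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection error:", rms)
```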
Automatic measurement of average weight and biomass in fish farms is challenging. The fish often stay close together and move fast, which results in heterogeneous light conditions because the fish shadow incoming sunlight. Weather changes, daily variations, particles, algae and seasonal variations also result in heterogeneous light conditions. Heterogeneous light conditions bring about challenges in automatic image processing, since the varying conditions must be addressed during image processing.
The present invention will not suffer under conditions like these, since the light source can be arranged with a selected wavelength, intensity, frequency, etc., so that the present invention may take these conditions into consideration and thus provide images of homogenous quality for image processing.
The present invention is based on an optical system combined with an artificial light source, where the light source illuminates objects or projects a light pattern onto the object, such as fish, at a selected wavelength, and all remaining light is filtered away from the recording means by means of filters, or an image-producing sensor is used which only collects the selected wavelength. In this way, a homogenous measuring environment is achieved. For example, by using a wavelength, such as in the infrared wavelength range, which is quickly absorbed in water, low interference is obtained from other light sources (such as sunlight or other artificially supplied light).
Since marine organisms, such as fish, may be affected by artificial light, it is an advantage to utilize light having a wavelength which affects the organism as little as possible, or light which is completely invisible to the organism (for example light in the infrared wavelength range). By choosing wavelengths that are absorbed quickly in water, the result will be, at a certain depth, little or no influence from these wavelengths originating from, for example, sunlight, which results in minimal influence from interfering sources on the images registered by the recording means; this simplifies the image processing and provides increased accuracy in the calculation of the physical dimensions.
A homogeneous quality of the recorded images for use in image processing is obtained by filtering away/minimizing wavelengths other than the light supplied from the light source, or by utilizing an image-producing sensor that collects only the selected wavelength, i.e. the images are without interference from other light sources. The homogeneous image quality makes it possible to perform robust and accurate image processing and 3D modelling.
In addition to the above-mentioned, the 3D model is, in accordance with the present invention, calculated by triangulation between light and image, or between images from several cameras. This provides a "depth image" where the pixel value represents distance to the measured object.
In this connection, it should be mentioned that triangulation from a line projected on the object and calculation of a 3D model from several combined images can be performed, which will enable measurement of the height and thickness of the object, including estimation of the length.
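As a simplified illustration of triangulation between the light source and the camera, consider a single projected vertical line: for each image row, the horizontal pixel position of the detected line, together with the known camera-to-projector baseline and projection angle, gives the distance for that row. The geometry and names below are a simplification for illustration, not the patent's actual algorithm.

```python
import numpy as np

def line_depths(mask: np.ndarray, fx: float, cx: float,
                baseline_m: float, laser_angle_rad: float) -> np.ndarray:
    """Per-row distance (metres) to a projected vertical line.

    mask            -- binary image of the segmented line (rows x cols)
    fx, cx          -- camera focal length and principal point, in pixels
    baseline_m      -- horizontal camera-to-projector distance
    laser_angle_rad -- tilt of the projected plane relative to the optical axis
    Assumes the projector sits baseline_m to one side of the camera.
    """
    depths = np.full(mask.shape[0], np.nan)
    for row in range(mask.shape[0]):
        cols = np.nonzero(mask[row])[0]
        if cols.size == 0:
            continue
        u = cols.mean()                       # sub-pixel centre of the line in this row
        ray_angle = np.arctan((u - cx) / fx)  # viewing angle of that pixel column
        # Intersect the camera ray with the projected light plane.
        depths[row] = baseline_m / (np.tan(laser_angle_rad) - np.tan(ray_angle))
    return depths
```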
When the object to be observed is fish, it is an advantage to arrange the present invention close to the feeding area for the fish in a net cage, since all fish pass by to eat during a feeding period, providing the best possible representative selection of measured fish. The present invention may naturally be placed anywhere in the net cage. By placing the system in accordance with the present invention in the vicinity of the feeding area, it is not necessary to move the system around in the net cage to seek out the fish where it resides.
The goal of fish farmers is that all fish must appear at the feeding area to eat during the day, which means that both small and large fish will pass the present invention during a feeding cycle. The present invention may also be provided with a winch or similar for measurement through a water column and for elevation and lowering in the net cage, alternatively also arranged to be moved around within the net cage, if desired, by suitable means for this.
The present invention is also applicable in estimation of number of fish, by using data about fish density in a model for this.
Further beneficial features and details of the present invention will appear from the following example description.
Example
The present invention will be described below in more details with reference to the attached drawings, where
Figure 1 shows a principle drawing of a system in accordance with a first embodiment of the present invention using structured light,
Figure 2 shows a principle drawing of a system in accordance with a second embodiment of the present invention which is using a light source for illumination of the object to be observed,
Figure 3 shows a principle drawing of a system in accordance with the present invention located in a net cage, and
Figure 4 shows a block diagram of a system in accordance with the present invention.
Reference is now made to Figure 1, which shows a principle drawing of a system in accordance with a first embodiment of the present invention, to be arranged in a net cage for observation of fish.
A system in accordance with the first embodiment of the present invention comprises at least one light source 10 arranged for emitting structured light at a selected wavelength, and at least one 2D camera 11 provided with an optical filter 12. The 2D camera 11 may be arranged to take still images only, video only or both video and still images. The light source 10 is, to enable emission of structured light, provided with means and/or software for emitting a known or selected light pattern 13 which is to be projected on fish 100 or other objects residing within the illumination range of the light source 10 at a selected wavelength. By using structured light, basically only one 2D camera is required for image recognition, but using several cameras may also be an advantage, which will be described further below.
The light pattern 13 projected by the light source 10 may exhibit any selected pattern.
For example, the pattern may be formed by one or more of (but not limited to the list): - either horizontal or vertical lines,
- both horizontal and vertical lines,
- lines with different properties,
- lines per se forming a pattern suitable for optic recognition,
- by using visible and/or invisible light,
- a continuous pattern,
- sections or sectors having different patterns,
- point pattern,
- sinus pattern,
- a combination of these.
The pattern as a whole should preferably cover the entire object to be observed.
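To make the pattern options listed above concrete, the sketch below generates a few of them (horizontal lines, a dot grid and a sinusoidal fringe pattern) as grayscale images that could be fed to a projector; the resolution, spacings and periods are arbitrary illustrative choices.

```python
import numpy as np

H, W = 480, 640   # projector resolution, illustrative

def horizontal_lines(period=16, thickness=2):
    img = np.zeros((H, W), np.uint8)
    img[np.arange(H) % period < thickness, :] = 255   # bright rows every `period` pixels
    return img

def dot_grid(spacing=24, radius=2):
    img = np.zeros((H, W), np.uint8)
    yy, xx = np.mgrid[0:H, 0:W]
    # Small bright squares on a regular grid (stand-in for a point pattern).
    img[((yy % spacing) <= radius) & ((xx % spacing) <= radius)] = 255
    return img

def sinusoidal_fringes(period=32):
    x = np.arange(W)
    row = (127.5 * (1 + np.sin(2 * np.pi * x / period))).astype(np.uint8)
    return np.tile(row, (H, 1))   # vertical sinusoidal fringes

patterns = {"lines": horizontal_lines(), "dots": dot_grid(), "sinus": sinusoidal_fringes()}
```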
Reference is now made to Figure 2, which shows a principle drawing of a system in accordance with a second embodiment of the invention, to be arranged in a net cage for observation of fish.
A system in accordance with the second embodiment of the present invention comprises at least one light source 10 arranged to illuminate the object at a selected wavelength, and at least two 2D cameras 11a-b arranged to provide a stereoscopic vision system. When the object is illuminated by light without a pattern, at least two 2D cameras are required to enable generation of a 3D model of the object. Two or more cameras are required to enable calculation of the distance to the object and thereupon generating a 3D model. The 2D cameras 11a-b may be arranged to take still images only, video only, or both video and still images. The light source(s) 10 is/are, both in the first and second embodiment, arranged to emit light having the desired parameters with regard to wavelength (color), optionally frequency, intensity, etc. The light source is preferably of the light emitting diode (LED) type, a laser, or another type arranged for emitting light at a selected wavelength, including light sources where only the selected wavelength is filtered out.
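For this second embodiment, the 3D information comes from stereo matching between the two wavelength-filtered camera images. The sketch below uses OpenCV's semi-global block matcher on a rectified image pair as a stand-in for whatever matching the actual system performs; the file names, matcher parameters, focal length and baseline are assumptions.

```python
import cv2
import numpy as np

# Rectified, wavelength-filtered frames from the two 2D cameras (11a, 11b); names illustrative.
left = cv2.imread("cam_11a.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("cam_11b.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0   # SGBM returns fixed-point

# Depth per pixel from the calibrated focal length (pixels) and camera baseline (metres).
fx, baseline_m = 900.0, 0.30
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = fx * baseline_m / disparity[valid]   # the "depth image" used for the 3D model
```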
Moreover, the 2D camera(s) is/are, both in the first and second embodiment, provided with optical filters 12, 12a-b arranged to accept light only at selected wavelengths. In an alternative embodiment (not shown), the 2D cameras 11, 11a-b are instead provided with image producing sensors which only collect light at selected wavelengths. Use of several 2D cameras 11a-b provides more possibilities, optionally also in combination with several light sources or controllable light sources. The 2D cameras 11, 11a-b may, for example, be provided with different filters 12a-b, i.e. filters having different properties. For example, one of the cameras may be provided with an IR filter which only accepts IR light. If the light source 10 emits light or a light pattern in the IR range, this camera will therefore accept it. The other camera may, for example, be arranged with a filter which accepts only green light. By illuminating the fish with a green light or light pattern, one will be able to depict the fish in a second way and thus have two images which may be combined to segment the fish in as robust a way as possible. A 3D model may be generated on the basis of the two images in combination. By image processing of images of the fish (without light pattern), one may, e.g., search for salmon lice, injuries, deformations or similar.
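A minimal sketch of such a two-wavelength combination is given below, assuming one IR-filtered and one green-filtered view of the same fish; the threshold values are hypothetical and would have to be tuned to depth, turbidity and light intensity.

```python
import numpy as np

def segment_fish(ir_image, green_image, ir_thresh=60, green_thresh=60):
    """Combine an IR-filtered and a green-filtered view into one fish mask.

    Each camera only admits the wavelength its filter passes, so the
    illuminated fish stands out against the darker water background in both
    views; the union of the two thresholded views is kept as the segmentation.
    """
    ir_mask = np.asarray(ir_image) > ir_thresh
    green_mask = np.asarray(green_image) > green_thresh
    return np.logical_or(ir_mask, green_mask)
```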
Another example is that several light sources are used, for example two. Both light sources may then emit light with a pattern at different wavelengths, the two cameras 11a-b collect the separate patterns through their respective filters 12a-b, and the images are thereafter combined when generating a 3D model. Moreover, the light source may emit light at different wavelengths to obtain the best possible images or video for processing.
This shows that the combination of one or more controllable light sources and one or more cameras with filters provides many possibilities for different setups.
Moreover, the system advantageously comprises a suspension device 20, e.g. formed of a suspension point 21 and a rod 22 arranged vertically from the suspension point 21, said rod 22 being adapted for attachment of the 2D cameras 11a-b and the light source 10. In the example shown in Figure 2, the light source 10 is arranged between the two 2D cameras 11a-b, at a given distance from the 2D cameras 11a-b, and exhibits an angle in relation to the fish to be observed, whereas in the first embodiment it is located under the 2D camera 11, although it could naturally also have been located above the camera.
Moreover, the light source 10 may be located anywhere in relation to the 2D camera(s) depending on the light reflection from the object to be illuminated.
Reference is now made to Figure 3, which shows a principle drawing of a system in accordance with the first embodiment of the invention arranged in a net cage 30 in the vicinity of a feeding station 40. The system in accordance with the invention is in this example connected to a winch 50 for elevating and lowering the system in the net cage. Moreover, the system may be provided with means for moving it around within the net cage 30, if desirable. It may be seen from the figure that a light pattern 13 is projected onto a fish 100' by the light source 10, while the other fish 100 do not have a light pattern projected onto them. Alternatively, the system may be arranged stationary in the feeding area, so that the users do not have to move the system around, as all fish usually visit the feeding area during the course of a day.
Reference is now made to Figure 4, which shows a block diagram of the system in accordance with the present invention. The block diagram covers both the first and second embodiment, so that if only one camera is used in the first embodiment, the components 11b and 12b are omitted from the block diagram. In addition to the components mentioned above, the system comprises an image processing unit 60 provided with means and/or software to utilize the information from the light pattern 13 recorded by the 2D camera 11 to generate a 3D model of the fish 100' being illuminated (for the first embodiment), or provided with means and/or software to utilize the information from the stereoscopic vision created by the two 2D cameras 11a-b (for the second embodiment). The image processing unit 60 may be a separate unit, a unit integrated in a control unit 70 for the system, or an external unit 80.
Moreover, as mentioned, the system comprises a control unit 70 which may comprise the image processing unit 60, and which is moreover provided with means and/or software for controlling the light source(s) 10, the recording means 11, 11a-b and optionally the filter(s) 12, 12a-b. For example, the control unit 70 will provide the filters 12, 12a-b with settings stating at which wavelength the light source 10 is emitting light, so that the filters 12, 12a-b are adjusted to receive light only at this wavelength. Moreover, the system is provided with communication means for communication with an external unit 80, which external unit 80 may also be the control unit of the system.
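Purely as an illustration of this wavelength matching, the sketch below shows one way the control unit could derive filter settings from the wavelength at which the light source is set to emit; the data structure, field names and 10 nm half-width are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class FilterSetting:
    """One setting pushed by the control unit to a camera filter (illustrative)."""
    camera_id: str
    passband_nm: tuple  # (low, high) wavelengths the filter should admit

def settings_for_source(source_wavelength_nm, camera_ids, half_width_nm=10.0):
    """Centre every camera's passband on the wavelength the light source emits."""
    low = source_wavelength_nm - half_width_nm
    high = source_wavelength_nm + half_width_nm
    return [FilterSetting(camera_id=c, passband_nm=(low, high)) for c in camera_ids]

# Example: settings_for_source(850.0, ["11a", "11b"]) for an IR source at 850 nm.
```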
The control unit 70, the image processing unit 60 or the external unit 80 is provided with means and/or software for utilizing the generated 3D model for estimation of physical dimensions of the depicted fish, such as size, volume and weight, based on a function of shape variables characterizing the depicted object. Measured parameters (for example length and height) from the 3D model are used in a model for estimating weight. The weight of each fish is advantageously stored in a database, for example in the external unit 80 or the control unit 70, so that the average weight may be calculated once the data set has become large enough. The total biomass in the net cage can then be calculated from the average weight and the total number of fish. The size distribution may be found as well.
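The weight model itself is not specified by the invention; as one hedged example, the sketch below uses the common fisheries length-weight power law together with the biomass calculation described above, where the coefficients a and b are hypothetical placeholders that must be fitted per species and stock.

```python
def estimate_weight_g(length_cm, a=0.01, b=3.0):
    """Illustrative length-weight relation W = a * L**b (coefficients are placeholders)."""
    return a * length_cm ** b

def average_weight_g(sampled_weights_g):
    """Average weight once the database holds a large enough sample."""
    return sum(sampled_weights_g) / len(sampled_weights_g)

def total_biomass_kg(sampled_weights_g, total_fish_count):
    """Total biomass in the net cage from the average weight and total fish count."""
    return average_weight_g(sampled_weights_g) * total_fish_count / 1000.0
```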
Moreover, the control unit 70 or the external unit 80 is arranged to present the calculated physical dimensions to the user.
Moreover, the control unit 70 may be arranged to control the winch 50 for elevating and lowering the system in the net cage 30, including any other means for moving the system in accordance with the present invention around within a net cage 30.
In a further embodiment of the invention, the control unit 70 is also arranged to control the feeding station 40 on basis of image information from the recording means.
Moreover, the present invention also has another field of use in that the acquired images/videos may also be used to search for, e.g., salmon lice, deformations, injuries, illness and similar.
Moreover, the present invention is not limited to net cages but may also be used in fish hatcheries, where tubs are used, and similar applications.

Claims
1. A method for calculation of physical dimensions for freely movable objects in water, characterized in comprising:
a. illuminating at least one object, or projecting a known or selected light pattern (13) onto at least one object, by means of at least one light source (10) emitting light at a selected wavelength or wavelengths,
b. recording illuminated objects, or objects having a light pattern (13) projected thereon, by means of recording means (11, 11a-b) in the form of at least one 2D camera (11, 11a-b) provided with filters (12, 12a-b) arranged to only admit light having selected wavelengths or provided with image producing sensors arranged to collect light only at selected wavelengths,
c. generating 3D models of objects based on recorded images and/or video from the recording means (11, 11a-b) as a basis for calculation of the physical dimensions.
2. The method of claim 1, characterized in that it comprises generation of 3D models of objects based on triangulation between the recording means (11, 11a-b) and recorded images or between recorded images from several recording means (11, 11a-b).
3. The method of claim 1, characterized in that it comprises triangulation from one line projected on objects and calculation of 3D models from several combined recorded images.
4. The method of claim 1, characterized in that it comprises illuminating at least one object or projecting the light pattern (13) on at least one object with the light source (10) at a given angle in relation to the recording means (11, 11a-b).
5. The method of claim 1, characterized in that it comprises setting up parameters of the light source (10), including one or more of: wavelength, intensity, frequency or light pattern (13), for optimum recording by the recording means (11, 11a-b).
6. The method of claim 1, characterized in that it comprises using filters (12, 12a-b) which only admit light at wavelengths corresponding to wavelengths emitted by the light source (10).
7. The method of claim 1, characterized in that it comprises using image producing sensor(s) which only collects light at wavelengths corresponding to wavelengths emitted by the light source (10).
8. The method of claim 1, characterized in that it comprises using generated 3D models to calculate physical dimensions for objects, such as size, volume and/or weight, based on a function of shape variables characterizing depicted objects in the generated 3D models.
9. The method of claim 1, characterized in that it comprises using:
- only recordings of complete objects for generation of 3D models, or
- recordings of incomplete objects in the image, and either combining several images to generate 3D models or combining several 3D models of the same object.
10. The method of claim 1, characterized in that it comprises using recordings of incomplete objects in the image and estimating 3D models based on the partially depicted objects.
11. The method of claim 1, characterized in that it comprises estimating volume of the objects based on parts of the 3D models of the objects.
12. The method of claims 1-11, characterized in that it comprises moving the light source(s) (10) and recording means (11, 11a-b) in a net cage (30) by means of a winch (50).
13. The method of claims 1-12, characterized in that objects being observed are freely movable fish (100).
14. The method of claims 1-13, characterized in that it comprises arranging the light source(s) (10) and recording means (11, 11a-b) with filter (12, 12a-b) or image producing sensor(s) stationary in the feeding area for fish in a net cage (30) or a tub in a fish hatchery, to ensure a representative range of observed fish passing by to eat during a day.
15. The method of claims 1-14, characterized in that it further comprises recording the results/data in a database and using the stored results/data in further statistical analysis, including calculation of average weight and/or size distribution.
16. The method of claim 15, characterized in that it further comprises calculating total biomass of fish based on the average weight and the total number of fish in the net cage (30) or the tub.
17. The method of claim 15, characterized in that it further comprises using data about fish density to estimate the number of fish in the net cage (30) or tub.
18. System for calculation of physical dimensions of freely movable objects in water, said system comprising at least one light source (10) for illuminating objects and at least one recording means (11, 11a-b) for recording the illuminated objects, characterized in that:
- the light source(s) (10) is/are arranged for illuminating at least one object, or the light source (10) is provided with means and/or software to project a light pattern (13) on at least one object to be observed, at a selected wavelength or wavelengths,
- the recording means (11, 11a-b) comprise at least one 2D camera provided with filters (12, 12a-b) arranged to admit light at selected wavelengths or provided with image producing sensors arranged to only collect light with selected wavelengths, and
- an image processing unit (60) is provided with means and/or software for generating 3D models of objects based on recorded images and/or video from the recording means (11, 11a-b) as a basis for calculation of the physical dimensions.
19. The system of claim 18, characterized in that the light source(s) (10) are arranged for illuminating objects or projecting the light pattern (13) on objects at a given angle in relation to the recording means (11, 11a-b).
20. The system of claim 18, characterized in that it comprises a control unit (70) provided with means and/or software for controlling the light source(s) (10) and recording means (11, 11a-b), including the filters (12, 12a-b) or image producing sensors.
21. The system of claim 18, characterized in that the control unit (70) is provided with means and/or software for controlling a winch (50) for elevating and lowering of the light source(s) (10) and recording means (11, 11a-b) in a net cage (30).
22. The system of claim 19, characterized in that the system comprises communication means for communication with an external unit (80).
23. The system of any one of the claims 18-22, characterized in that the image processing unit (60), the control unit (70) or the external unit (80) are provided with means and/or software for calculating physical dimensions, including size, volume, weight or biomass, of objects being recorded.
24. The system of any one of the claims 18-23, characterized in that the objects being observed are fish.
25. The system of claim 18, characterized in that the light source(s) (10) is a light emitting diode, a laser or another type of light source arranged for emitting light at a selected wavelength.
26. The system of claim 25, characterized in that the light source (10) is arranged for emitting light in the infrared wavelength range.
27. The system of claims 18-26, characterized in that the system is arranged in the vicinity of a feeding area for fish in a net cage (30) or a tub in a fish hatchery.
Abstract
Method and system for calculating physical dimensions for freely movable objects in water, by illuminating the object or projecting a known or selected light pattern (13) on the object by means of at least one light source (10) at a selected wavelength or wavelengths, recording the illuminated object or objects with the projected light pattern (13) by means of recording means (11, 11a-b) in the form of at least one 2D camera provided with filters (12, 12a-b) arranged to only accept light having selected wavelengths or provided with image producing sensors arranged to only collect light with selected wavelengths, including generation of a 3D model based on recorded images and/or video from the recording means (11, 11a-b) as a basis for calculating the physical dimensions.
Figure 1
EP13864165.9A 2012-12-20 2013-12-20 System and method for calculating physical dimensions for freely movable objects in water Ceased EP2936051A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NO20121541A NO337305B1 (en) 2012-12-20 2012-12-20 System and method for calculating physical sizes for freely moving objects in water
PCT/NO2013/050231 WO2014098614A1 (en) 2012-12-20 2013-12-20 System and method for calculating physical dimensions for freely movable objects in water

Publications (2)

Publication Number Publication Date
EP2936051A1 true EP2936051A1 (en) 2015-10-28
EP2936051A4 EP2936051A4 (en) 2016-08-24

Family

ID=50978780

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13864165.9A Ceased EP2936051A4 (en) 2012-12-20 2013-12-20 System and method for calculating physical dimensions for freely movable objects in water

Country Status (5)

Country Link
EP (1) EP2936051A4 (en)
CA (1) CA2895758A1 (en)
CL (1) CL2015001722A1 (en)
NO (1) NO337305B1 (en)
WO (1) WO2014098614A1 (en)


Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2539495B (en) * 2015-06-19 2017-08-23 Ace Aquatec Ltd Improvements relating to time-of-flight cameras
WO2017001971A1 (en) * 2015-06-30 2017-01-05 Antípoda, Lda Method and system for measuring biomass volume and weight of a fish farming tank
CN105104278B (en) * 2015-08-20 2017-07-18 江苏大学 Circulating water cultivation floats bait automatic delivery method and device
CL2016002664A1 (en) 2015-10-22 2018-01-05 Intervet Int Bv A method for automatic monitoring of sea lice in salmon aquaculture
PT109333B (en) * 2016-04-16 2020-10-02 Fishmetrics. Lda. FISH MEASUREMENT SYSTEM USING A CAMERA AND A STRUCTURED LIGHT PROJECTOR
NO20160880A1 (en) * 2016-05-24 2017-11-27 Itecsolutions Systems & Services As Arrangement and method for measuring the biological mass of fish and use of the arrangement
GB201710705D0 (en) 2017-07-04 2017-08-16 Optoscale As Structured-Light Illumination
CN107576279A (en) * 2017-09-18 2018-01-12 三峡大学 A kind of device and method for determining fish body center line equation in motion
CA3093646C (en) * 2018-03-20 2021-03-30 Giliocean Technology Ltd Method and system for extraction of statistical sample of moving objects
US10534967B2 (en) * 2018-05-03 2020-01-14 X Development Llc Fish measurement station keeping
WO2019232247A1 (en) 2018-06-01 2019-12-05 Aquabyte, Inc. Biomass estimation in an aquaculture environment
US11659819B2 (en) 2018-10-05 2023-05-30 X Development Llc Sensor positioning system
NO20190203A1 (en) * 2019-02-13 2020-03-25 Stingray Marine Solutions As A cage observation system with a submerged observation unit
ES2786798B2 (en) * 2019-04-11 2022-02-08 Univ Oviedo Biomass estimation system in aquaculture based on optical sensors and neural networks
NO347348B1 (en) * 2019-06-19 2023-09-25 Subc3D As System and procedure for imaging and counting external structures on a fish
ES2799975A1 (en) * 2019-06-21 2020-12-22 Univ Oviedo Biomass estimation system in aquaculture based on reconstructions of images in three dimensions (Machine-translation by Google Translate, not legally binding)
US11089227B1 (en) 2020-02-07 2021-08-10 X Development Llc Camera winch control for dynamic monitoring
EP3915361A1 (en) * 2020-05-26 2021-12-01 Furuno Electric Co., Ltd. Device for calculating fish body depth
NO20201382A1 (en) 2020-12-16 2022-06-17 Createview As A system for monitoring of dead fish
US11864537B2 (en) 2021-03-07 2024-01-09 ReelData Inc. AI based feeding system and method for land-based fish farms

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7310431B2 (en) * 2002-04-10 2007-12-18 Canesta, Inc. Optical methods for remotely measuring objects
US7399220B2 (en) * 2002-08-02 2008-07-15 Kriesel Marshall S Apparatus and methods for the volumetric and dimensional measurement of livestock
NO330863B1 (en) * 2007-07-09 2011-08-01 Feed Control Norway As Apparatus and method for cutting weight milling and appetite lining in fish farms
CA2753249A1 (en) * 2009-02-27 2010-09-02 Body Surface Translations, Inc. Estimating physical parameters using three dimensional representations
NO332103B1 (en) * 2010-12-13 2012-06-25 Ocea As System and method for calculating the size of marine organisms in water

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111491508A (en) * 2017-12-20 2020-08-04 英特维特国际股份有限公司 System for fish ectoparasite monitoring in aquaculture
CN111511201A (en) * 2017-12-20 2020-08-07 英特维特国际股份有限公司 System for fish ectoparasite monitoring in aquaculture
CN111511203A (en) * 2017-12-20 2020-08-07 英特维特国际股份有限公司 Method and system for fish ectoparasite monitoring in aquaculture
CN111511202A (en) * 2017-12-20 2020-08-07 英特维特国际股份有限公司 System for fish ectoparasite monitoring in aquaculture
CN111491508B (en) * 2017-12-20 2022-05-24 英特维特国际股份有限公司 System for fish ectoparasite monitoring in aquaculture

Also Published As

Publication number Publication date
WO2014098614A1 (en) 2014-06-26
NO20121541A1 (en) 2014-06-23
EP2936051A4 (en) 2016-08-24
CA2895758A1 (en) 2014-06-26
CL2015001722A1 (en) 2016-05-20
NO337305B1 (en) 2016-03-07

Similar Documents

Publication Publication Date Title
EP2936051A1 (en) System and method for calculating physical dimensions for freely movable objects in water
US20200267947A1 (en) Arrangement and method for measuring the biological mass of fish, and use of the arrangement
US7853046B2 (en) Imaging system and method for body condition evaluation
CA2744146C (en) Arrangement and method for determining a body condition score of an animal
WO2019232247A1 (en) Biomass estimation in an aquaculture environment
US20170199122A1 (en) Method of determining a value of a variable of interest of a sample having organisms and system therefore
Pautsina et al. Infrared reflection system for indoor 3D tracking of fish
GB2539495A (en) Improvements relating to time-of-flight cameras
CN111127411B (en) Monitoring control method for fishery cultivation
CA3083984A1 (en) Method and system for external fish parasite monitoring in aquaculture
CN102854148A (en) Detection and grading system for tenderness of fresh beef based on multispectral imagery
WO2017001971A1 (en) Method and system for measuring biomass volume and weight of a fish farming tank
JP6200519B2 (en) System and method for counting zooplankton
Li et al. Estimation of pig weight by machine vision: A review
Livanos et al. Intelligent navigation and control of a prototype autonomous underwater vehicle for automated inspection of aquaculture net pen cages
CN105933652A (en) Apparatus and method for detecting sturgeon activity based on image identifying and positioning
CN103761565A (en) Underwater fry, young shrimp and young crab quantity estimating and behavior monitoring device and method based on computer vision
JP2019194573A (en) Body weight estimation device
CN114241031A (en) Fish body ruler measurement and weight prediction method and device based on double-view fusion
EP3769036B1 (en) Method and system for extraction of statistical sample of moving fish
CN204202563U (en) Fish morphological parameters self-operated measuring unit
TWI718572B (en) A computer-stereo-vision-based automatic measurement system and its approaches for aquatic creatures
TORISAWA et al. A technique of three-dimensional monitoring for free-swimming pacific bluefin tuna Thunnus orientalis cultured in a net cage using a digital stereo-video camera system
CN113780073B (en) Device and method for auxiliary estimation of chicken flock uniformity
Sun et al. A practical system of fish size measurement

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150720

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20160726

RIC1 Information provided on ipc code assigned before grant

Ipc: A01K 61/00 20060101ALI20160720BHEP

Ipc: G01B 11/25 20060101AFI20160720BHEP

Ipc: G01B 11/24 20060101ALI20160720BHEP

Ipc: G06K 9/20 20060101ALI20160720BHEP

Ipc: G01B 11/04 20060101ALI20160720BHEP

Ipc: G06K 9/00 20060101ALI20160720BHEP

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: STORVIK AQUA AS

17Q First examination report despatched

Effective date: 20171004

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: VARD AQUA SUNNDAL AS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20190607