WO2024137939A1 - Ground truth and training data generation for fish - Google Patents

Ground truth and training data generation for fish

Info

Publication number
WO2024137939A1
Authority
WO
WIPO (PCT)
Prior art keywords
fish
container
ground truth
images
data item
Application number
PCT/US2023/085348
Other languages
French (fr)
Inventor
Thomas Robert SWANSON
Harrison PHAM
Grace Calvert YOUNG
Original Assignee
X Development Llc
Application filed by X Development Llc
Publication of WO2024137939A1


Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K: ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 61/00: Culture of aquatic animals
    • A01K 61/90: Sorting, grading, counting or marking live aquatic animals, e.g. sex determination
    • A01K 61/95: Sorting, grading, counting or marking live aquatic animals, e.g. sex determination, specially adapted for fish
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/10: Image acquisition
    • G06V 10/12: Details of acquisition arrangements; Constructional details thereof
    • G06V 10/14: Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V 10/141: Control of illumination
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77: Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774: Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/05: Underwater scenes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

Definitions

  • the control unit 107 controls the lighting element 114 to vary an amount or type of light while the camera 116 obtains one or more images. For example, the control unit 107 can control the lighting element 114 to illuminate with a first intensity or with light of a first frequency range for a first period of time. The control unit 107 can then control the lighting element 114 to illuminate with a second intensity or with light of a second frequency range for a second period of time. The control unit 107 can control the camera 116 to capture images of the varying light. In some implementations, capturing images of varying light helps the resulting training data train a model that generalizes well to varying real world lighting scenarios.
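  • As a rough illustration of the varied-light capture described in the preceding bullet, the Python sketch below drives a schedule of lighting configurations and captures images under each one. The LightingConfig record and the light/camera controller methods (set, off, capture_for) are illustrative assumptions, not interfaces named in this disclosure.

```python
# Illustrative sketch of capturing images under a schedule of lighting
# configurations. LightingConfig and the light/camera controller methods
# are assumptions for illustration, not interfaces from this disclosure.
from dataclasses import dataclass

@dataclass
class LightingConfig:
    intensity: float      # relative intensity, 0.0 to 1.0
    wavelength_nm: float  # dominant wavelength of the emitted light
    duration_s: float     # how long to hold this configuration

def capture_under_varied_light(light, camera, schedule):
    """Capture one or more images under each lighting configuration."""
    images = []
    for cfg in schedule:
        light.set(intensity=cfg.intensity, wavelength_nm=cfg.wavelength_nm)
        images.extend(camera.capture_for(seconds=cfg.duration_s))
    light.off()
    return images

# Example schedule: a bright broad-spectrum pass, then a dimmer blue pass.
SCHEDULE = [
    LightingConfig(intensity=1.0, wavelength_nm=550.0, duration_s=0.5),
    LightingConfig(intensity=0.4, wavelength_nm=470.0, duration_s=0.5),
]
```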
  • the lighting element 114 includes powered lights, such as light emitting diodes (LEDs).
  • the lighting element 114 includes an opening to allow sunlight or other produced light into the first chamber 106.
  • Such an opening can be connected to a mechanism, e.g., shade, covering, shield, among others, to open and close the opening or to adjust or vary an amount of light, e.g., fully opaque, semi-translucent, clear, that illuminates the fish 112.
  • the control unit 107 determines a quality, e.g., frequency/wavelength, brightness, polarization, among others, of light with which to illuminate the fish 112. For example, the control unit 107 can determine that the fish 112 is at risk for a particular disease, e.g., by visual inspection using images captured by the camera 116, by an upstream processing element, or based on input from a user with knowledge of an origin of the fish 112. The control unit 107 can determine that, for the particular disease, one or more particular frequency ranges are useful in detecting certain conditions.
  • control unit 107 can control the lighting element 114 to alternate flashes of red light and blue light or other frequency ranges determined by the control unit 107.
  • the control unit 107 can configure the camera 116 to capture images of the fish 112 with the fish 112 alternately illuminated by the red light and the blue light.
  • the control unit 107 controls one or more lighting features of the lighting element 114 to pulse light that illuminates the fish 112.
  • the control unit 107 can pulse a white or colored light at a particular rate, e.g., pulses per second.
  • One or more colors of light e.g., wavelength ranges, can be pulsed such that the camera 116 captures images of the fish 112 illuminated by different colored light.
  • the pulses of multiple light colors can be interlaced so that a pulse of a first color is followed by a second color which is followed by either the first color or a third color. Such a pattern can be continued for one or more colored lights.
  • the control unit 107 determines a pulse rate for a light that satisfies a detection threshold of the fish 112. For example, if the control unit 107 determines that the fish 112 can detect light pulse rates below 30 hertz, the control unit 107 can increase the rate above 30 hertz to prevent the fish 112 from detecting the pulses. In general, if a fish does not detect light pulsing, corresponding stress hormone production can be reduced which thereby increases fish welfare. Fish welfare can be tied to infection rates, morbidity rates, production yield, among others.
  • the control unit 107 uses images from the camera 116 to determine a species of the fish 112. For example, the control unit 107 can determine that the fish 112 is of a first species with a pulse sensitivity below 10 hertz using a model trained to predict a species. In some implementations, the control unit obtains data provided by a user of the system 100 indicating a current species to be measured. Based on determining that the fish 112 is of the first species, the control unit 107 can control the lighting element 114 to illuminate the fish 112 with a pulse rate above 10 hertz.
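  • A minimal sketch of this species-dependent pulse-rate selection follows. The species-to-threshold table, the classify_species callable, and the 5 hertz safety margin are illustrative assumptions; the threshold values are not measured data.

```python
# Illustrative selection of a light pulse rate above the predicted
# species' flicker-detection threshold. The table values, the
# classify_species callable, and the 5 Hz margin are assumptions.
SPECIES_FLICKER_THRESHOLD_HZ = {
    "species_a": 10.0,   # illustrative, not measured values
    "species_b": 30.0,
}
DEFAULT_THRESHOLD_HZ = 30.0  # conservative fallback for unknown species

def choose_pulse_rate_hz(image, classify_species, margin_hz=5.0):
    species = classify_species(image)  # e.g., a model trained on images
    threshold = SPECIES_FLICKER_THRESHOLD_HZ.get(species, DEFAULT_THRESHOLD_HZ)
    # Pulse faster than the fish can perceive, with a safety margin.
    return threshold + margin_hz
```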
  • the lighting element 114 includes one or more lighting features.
  • the lighting element 114 can include one or more of: a first light that emits light of a first frequency or range of frequencies or a second light that emits light of a second frequency or range of frequencies.
  • the first light and the second light can emit light of the same or different color.
  • the first light and the second light can emit light of the same or different frequency ranges.
  • the first chamber 106 is attached to the second chamber 124.
  • the first chamber 106 and the second chamber 124 can be a part of the same chamber.
  • a single chamber including the first chamber 106 and the second chamber 124 includes a partition to control a flow of fish from the first chamber 106 to the second chamber 124.
  • a mesh netting can be used with an opening that is controlled by the control unit 107 to allow fish to flow between the first chamber 106 and the second chamber 124.
  • the fish 112 can move from the first chamber 106 to the second chamber 124.
  • a fish 126 is shown in the second chamber 124.
  • the fish 126 moved from the first chamber 106 through a connecting tube 122 to the second chamber 124.
  • a fish moving in the tube 102 moves into the second chamber 124 before the first chamber 106.
  • the first chamber 106 can be positioned after the second chamber 124 relative to a movement of fish.
  • the control unit 107 can similarly collect images and ground truth data as discussed.
  • the system 100 is configured to allow fish to move in either direction.
  • the control unit 107 can change a direction of water flow to control a direction of fish moving through the system, e.g., moving from the first chamber 106 to the second chamber 124, or from the second chamber 124 to the first chamber 106.
  • the second chamber 124 is shown using two views of the second chamber 124: (i) view 124a showing one set of light emitters 128 and light sensors 132 and (ii) view 124b showing another set of light emitters 134 with corresponding light sensors not shown.
  • the light emitters 128 and 134, together with the light sensors 132 and the corresponding sensors for the light emitters 134, form a light curtain along two axes.
  • in view 124a, light is shown traveling from the light emitters 128 to the light sensors 132 on the opposite side of the second chamber 124.
  • the fish 126 blocks a portion of the light traveling from the light emitters 128 to the light sensors 132.
  • the control unit 107 obtains sensor data 140.
  • the sensor data 140 includes data indicating how much, and where, light is sensed by the light sensors 132.
  • from the sensor data 140, the control unit 107 can determine physical dimensions of the fish 126, e.g., length and width, and can detect conditions such as diseases, lesions, or eye damage.
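  • One way the light-curtain readings could map to dimensions is sketched below, assuming each frame of sensor data is a boolean array (True where a beam is blocked), that the curtain spans the chamber, and that the beam pitch is a known calibration constant; the function names are illustrative, not taken from this disclosure.

```python
# Illustrative mapping from 2-axis light-curtain readings to fish
# dimensions. Each frame is a list of booleans, True where a beam is
# blocked; beam_pitch_m (spacing between adjacent beams) is an assumed
# calibration constant.
def blocked_extent_m(frame, beam_pitch_m):
    """Length in meters of the occluded span along one sensor axis."""
    hits = [i for i, blocked in enumerate(frame) if blocked]
    if not hits:
        return 0.0
    return (hits[-1] - hits[0] + 1) * beam_pitch_m

def fish_dimensions_m(frames_axis1, frames_axis2, beam_pitch_m=0.005):
    """Maximum occluded extent over time along each curtain axis."""
    length = max(blocked_extent_m(f, beam_pitch_m) for f in frames_axis1)
    width = max(blocked_extent_m(f, beam_pitch_m) for f in frames_axis2)
    return length, width
```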
  • the light emitters 128 and 134 emit light of one or more frequencies or brightness.
  • the light emitters 128 and 134 can include light emitted from one or more lasers (light amplification by stimulated emission of radiation).
  • the light emitters 128 and 134 can include LEDs.
  • the light emitters 128 and 134 can emit light of one or more frequency ranges.
  • the light emitters 128 and 134 can emit light of red color, blue color, or other colors.
  • the light emitters 128 and 134, similar to the lighting element 114, can emit light within specific frequency ranges within a specific general color of light.
  • control unit 107 can use the light emitters 128 and 134 to estimate fish size, e.g., length and width, or to detect conditions on a given fish, e.g., disease, lesions, or eye damage.
  • condition detection requires light of a specific type, e.g., frequency, brightness, pulse rate, among others.
  • light pulses of the light emitters 128 and 134 can be rapid enough so that the fish doesn’t perceive the pulses.
  • the pulse rate of the light emitters 128 and 134 can be greater than a given fish eye’s “shutter speed.”
  • Other sensors could also be mounted in the chamber (such as a sensor to detect fish stress hormone). In general, sensors to detect dimensions, weight, or health conditions of fish can be used in the second chamber 124.
  • the second chamber 124 is a known-volume container that includes doors at both ends.
  • the control unit 107 or connected processor controls a water filling device to fill the second chamber 124 with water.
  • Weight sensors of the second chamber 124 can then provide weight readings of the second chamber 124, included in the sensor data 140, to the control unit 107.
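  • As a hedged illustration of one way such readings could be converted into a fish weight: with both doors closed and the known-volume chamber completely filled, the scale reads tare plus water plus fish, so subtracting the tare and the water mass leaves the fish mass. The disclosure does not spell out this exact calculation; the constants and the fish-volume estimate (e.g., from the light curtain) are assumptions.

```python
# Hedged sketch of deriving fish mass from the known-volume chamber:
# total reading = tare + water mass + fish mass, where the water fills
# the chamber volume minus the fish volume. All values are illustrative.
WATER_DENSITY_KG_PER_M3 = 1000.0  # fresh water; sea water is ~1025

def fish_mass_kg(total_reading_kg, tare_kg, chamber_volume_m3,
                 fish_volume_m3, water_density=WATER_DENSITY_KG_PER_M3):
    # fish_volume_m3 could come from a light-curtain estimate.
    water_mass = water_density * (chamber_volume_m3 - fish_volume_m3)
    return total_reading_kg - tare_kg - water_mass

# Example: 0.25 m^3 chamber, 260.0 kg on the scale, 12.0 kg tare,
# estimated fish volume of 0.004 m^3 -> roughly 2.0 kg of fish.
mass = fish_mass_kg(260.0, 12.0, 0.25, 0.004)
```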
  • one or more processing elements of the second chamber 124 provide the sensor data 140 to the control unit 107.
  • the one or more processing elements can be computing components of one or more computers that control sensors, light emitters, among others, used to obtain the sensor data 140.
  • the control unit 107 directly controls sensors or emitters of the second chamber 124.
  • the control unit 107 can control a pulse rate, frequency, timing, among other parameters of the emitters 128 and 134.
  • the control unit 107 can control what data is obtained in the second chamber 124 either directly or by sending a signal configured to control elements of the second chamber 124.
  • the control unit 107 obtains the sensor data 140 and the fish images 118.
  • the control unit 107 stores the data in a database 120.
  • the control unit 107 can store the data, including the sensor data 140 and the fish images 118, using one or more keys.
  • the keys can indicate a specific fish.
  • the control unit 107 can obtain images and sensor data of the fish 112 and store the data for the fish 112 with a key identifying the fish 112.
  • control unit 107 generates separate data elements for each collection of images and sensor data.
  • the separate data elements can be used in multiple iterations of training. For example, images of a data element can be provided to a model that then outputs a prediction. The control unit 107 can compare the predicted output to the sensor data portion of the data element to determine what adjustments are to be made to the model being trained to improve the model.
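  • The keyed data elements described above might look like the following minimal sketch; the record fields and the in-memory store are illustrative assumptions rather than structures named in this disclosure.

```python
# Illustrative keyed training-data record: images plus ground-truth
# sensor values stored under a per-fish key. Field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class TrainingDataElement:
    fish_key: str  # e.g., tag/chip ID or a feature-derived identity
    image_paths: list[str] = field(default_factory=list)
    known_features: dict[str, float] = field(default_factory=dict)

database: dict[str, list[TrainingDataElement]] = {}

def store(element: TrainingDataElement) -> None:
    # All data for one fish is grouped under the same key.
    database.setdefault(element.fish_key, []).append(element)

store(TrainingDataElement(
    fish_key="fish-112",
    image_paths=["img_144.png", "img_146.png", "img_148.png"],
    known_features={"weight_kg": 2.0, "length_m": 0.55},
))
```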
  • the control unit 107 generates training data element 143.
  • the training data element 143 includes data representing the fish images 118 and sensor data obtained for the fish 112 similar to the sensor data 140 obtained for the fish 126.
  • the known features 150 indicate values to be predicted by a model being trained, such as the model 162.
  • the training data element 143 includes images 144, 146, and 148 of the fish 112.
  • the model trainer 160 of the control unit 107 uses the generated training data element 143 to train the model 162.
  • the model 162 can, in general, be any type of machine learning model trained to predict features of fish. Of course, in situations where the system 100 is used for non-fish purposes, the model can be configured to predict non-fish features, e.g., agriculture animals, manufactured objects, among others.
  • the model trainer 160 operated by the control unit 107 provides a portion of the data element 143 to the model 162.
  • the model trainer 160 provides one or more images, such as images 144, 146, or 148.
  • the model 162 can generate output predicting a known feature such as one or more of: a weight or biomass of a fish; a fish length, width, or other dimensions; deformities such as if a spine is straight or has a deformity, if there is a protruding belly, or if there is gill shortening; a number and location of lice or other ectoparasites on a fish; presence or absence of diseases in a fish, e.g., diseases that lead to bloated belly, or red-tint in scales; sizes and locations of lesions or wounds on a fish; levels of stress hormones in a fish; a fish’s eye condition, e.g., if the fish has cataracts or eye damage.
  • the model trainer 160 obtains predicted output of the model 162 and compares the output to a second portion of the data element that supplied the input, e.g., the known features 150 of the data element 143.
  • the model trainer 160 can compare one or more predictions of the model 162 with one or more of the known features 150. Based on a comparison, the model trainer 160 can adjust one or more weights or parameters of the model 162 to improve an accuracy of one or more predictions.
  • the model trainer 160 compares accuracy, based on a difference between a prediction and a known value, to a threshold. For example, the model trainer 160 can compare a known weight of a fish obtained in the second chamber 124 and included in the data element 143 with a predicted weight of the given fish by the model 162. If the prediction satisfies a threshold difference compared to the known value, the model trainer 160 can send a signal to the control unit 107 indicating that training data generation can be stopped.
  • the model trainer 160 determines whether an average accuracy of one or more iterations of training satisfies a threshold accuracy. For example, the model trainer 160 can provide one or more portions of input data and compare corresponding output data to known features. The model trainer 160 can generate difference values indicating an amount of difference between a predicted and known feature. The model trainer 160 can combine two or more difference values to generate a combined difference value and compare the combined difference value to a threshold. In some implementations, the combined difference value is an average difference across multiple predicted and known feature values. Based on the comparison, the model trainer 160 can determine if the model 162 satisfies a corresponding performance threshold or not.
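  • The combined-difference test described in the preceding bullet could be implemented along these lines; the use of relative error and the 5% threshold are illustrative choices, not values from this disclosure.

```python
# Illustrative stopping test: average the differences between predicted
# and known feature values and compare against a threshold. Relative
# error and the 5% threshold are example choices, not from the patent.
def mean_relative_error(predictions, known_values):
    diffs = [abs(p - k) / abs(k) for p, k in zip(predictions, known_values)]
    return sum(diffs) / len(diffs)

def satisfies_threshold(predictions, known_values, threshold=0.05):
    return mean_relative_error(predictions, known_values) <= threshold

# e.g., predicted vs. ground-truth weights for three fish:
done = satisfies_threshold([2.1, 1.9, 3.2], [2.0, 2.0, 3.0])
# done is False here: mean relative error is about 5.6%, above 5%.
```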
  • control unit 107 stops training data generation after the model 162 is trained sufficiently to satisfy one or more thresholds.
  • control unit 107 can stop a suction effect or close a door to the tube 102, the first chamber 106, the second chamber 124, or tube 166 for providing fish, such as fish 164, to a new location, e.g., back to a starting fish pen.
  • control unit 107 sends a signal to one or more automated processes of the system 100 after determining the model 162 satisfies one or more thresholds.
  • a given signal can be configured to control the camera 116 to stop obtaining images, the lighting element 114 to stop a given illumination, or sensors of the second chamber 124 to stop obtaining sensor data.
  • the system 100 stops capturing images and moves water through the tubes and chambers to move the fish out through the discharge tube 166.
  • the control unit 107 can increase a current flow through one or more of the tubes 102, 122, 166, or chambers 106 or 124.
  • tubes 102, 122, and 166 can be any shape sufficient to fit a given type of fish used in measurement.
  • FIG. 2 is a flow diagram illustrating an example of a process 200 for ground truth and training data generation.
  • the process 200 may be performed by one or more electronic systems, for example, the system 100 of FIG. 1.
  • the process 200 includes obtaining one or more images of a fish in a first container connected to a second container by a pipe that allows the fish to move from the first container to the second container (202).
  • the control unit 107 can control the camera 116 using the camera controller 110 to obtain the fish images 118 of the fish 112.
  • the process 200 includes obtaining sensor data from the second container representing features of the fish (204).
  • the control unit 107 can control one or more sensors of the second chamber 124 to obtain sensor data of the fish 112 similar to the sensor data 140 obtained for the fish 126.
  • the process 200 includes generating a ground truth data item that includes (i) the one or more images of the fish and (ii) the sensor data from the second container (206).
  • the ground truth engine 142 of the control unit 107 can generate the data element 143 that includes images of the fish images 118 and the known features 150 indicating sensor data of the fish 112 obtained in the second chamber 124 similar to the sensor data 140 obtained for the fish 126.
  • the process 200 includes providing a first portion of the ground truth data item to a machine learning model (208).
  • the model trainer 160 of the control unit 107 can provide a portion of the data element, such as any one or more of the images 144, 146, or 148, to the model 162.
  • the process 200 includes adjusting the machine learning model using a comparison of (i) output from the machine learning model processing the first portion of the ground truth data item and (ii) a second portion of the ground truth data item (210).
  • the model trainer 160 can train the model 162 based on generating a comparison of an output of the model 162 and data of the known features 150.
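  • Taken together, steps 202 through 210 of the process 200 could be orchestrated as in the sketch below. The chamber, model, and trainer objects and their methods are hypothetical stand-ins for the components of FIG. 1, not interfaces defined in this disclosure.

```python
# End-to-end sketch of process 200 (steps 202-210), under the assumption
# of hypothetical helpers for image capture, sensing, and model updates.
def run_process_200(first_chamber, second_chamber, model, trainer):
    images = first_chamber.capture_images()         # (202) obtain images
    sensor_data = second_chamber.read_sensors()     # (204) obtain sensor data
    ground_truth_item = {                           # (206) generate data item
        "images": images,               # first portion: model input
        "known_features": sensor_data,  # second portion: targets
    }
    output = model.predict(ground_truth_item["images"])   # (208) provide input
    trainer.adjust(model, output,                         # (210) adjust model
                   ground_truth_item["known_features"])
    return ground_truth_item
```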
  • FIG. 3 is a diagram illustrating an example of a computing system used for ground truth and training data generation.
  • the computing system includes computing device 300 and a mobile computing device 350 that can be used to implement the techniques described herein.
  • one or more components of the system 100 could be an example of the computing device 300 or the mobile computing device 350, such as a computer system implementing the control unit 107, devices that access information from the control unit 107, or a server that accesses or stores information regarding the operations performed by the control unit 107.
  • the computing device 300 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers.
  • the mobile computing device 350 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart-phones, mobile embedded radio systems, radio diagnostic computing devices, and other similar computing devices.
  • the components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to be limiting.
  • the computing device 300 includes a processor 302, a memory 304, a storage device 306, a high-speed interface 308 connecting to the memory 304 and multiple high-speed expansion ports 310, and a low-speed interface 312 connecting to a low-speed expansion port 314 and the storage device 306.
  • Each of the processor 302, the memory 304, the storage device 306, the high-speed interface 308, the high-speed expansion ports 310, and the low-speed interface 312, are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 302 can process instructions for execution within the computing device 300, including instructions stored in the memory 304 or on the storage device 306 to display graphical information for a GUI on an external input/output device, such as a display 316 coupled to the high-speed interface 308.
  • multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
  • multiple computing devices may be connected, with each device providing portions of the operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
  • the processor 302 is a single threaded processor.
  • the processor 302 is a multi-threaded processor.
  • the processor 302 is a quantum computer.
  • the memory 304 stores information within the computing device 300.
  • the memory 304 is a volatile memory unit or units.
  • the memory 304 is a non-volatile memory unit or units.
  • the memory 304 may also be another form of computer-readable medium, such as a magnetic or optical disk.
  • the storage device 306 is capable of providing mass storage for the computing device 300.
  • the storage device 306 may be or include a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid-state memory device, or an array of devices, including devices in a storage area network or other configurations.
  • Instructions can be stored in an information carrier.
  • the instructions when executed by one or more processing devices (for example, processor 302), perform one or more methods, such as those described above.
  • the instructions can also be stored by one or more storage devices such as computer- or machine-readable mediums (for example, the memory 304, the storage device 306, or memory on the processor 302).
  • the high-speed interface 308 manages bandwidth-intensive operations for the computing device 300, while the low-speed interface 312 manages lower bandwidth-intensive operations. Such allocation of functions is an example only.
  • the high-speed interface 308 is coupled to the memory 304, the display 316 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 310, which may accept various expansion cards (not shown).
  • the low-speed interface 312 is coupled to the storage device 306 and the low-speed expansion port 314, which may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
  • the computing device 300 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 320, or multiple times in a group of such servers. In addition, it may be implemented in a personal computer such as a laptop computer 322. It may also be implemented as part of a rack server system 324. Alternatively, components from the computing device 300 may be combined with other components in a mobile device, such as a mobile computing device 350. Each of such devices may include one or more of the computing device 300 and the mobile computing device 350, and an entire system may be made up of multiple computing devices communicating with each other.
  • the mobile computing device 350 includes a processor 352, a memory 364, an input/output device such as a display 354, a communication interface 366, and a transceiver 368, among other components.
  • the mobile computing device 350 may also be provided with a storage device, such as a micro-drive or other device, to provide additional storage.
  • Each of the processor 352, the memory 364, the display 354, the communication interface 366, and the transceiver 368, are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 352 can execute instructions within the mobile computing device 350, including instructions stored in the memory 364.
  • the processor 352 may be implemented as a chipset of chips that include separate and multiple analog and digital processors.
  • the processor 352 may provide, for example, for coordination of the other components of the mobile computing device 350, such as control of user interfaces, applications run by the mobile computing device 350, and wireless communication by the mobile computing device 350.
  • the processor 352 may communicate with a user through a control interface 358 and a display interface 356 coupled to the display 354.
  • the display 354 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology.
  • the display interface 356 may include appropriate circuitry for driving the display 354 to present graphical and other information to a user.
  • the control interface 358 may receive commands from a user and convert them for submission to the processor 352.
  • an external interface 362 may provide communication with the processor 352, so as to enable near area communication of the mobile computing device 350 with other devices.
  • the external interface 362 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
  • the memory 364 stores information within the mobile computing device 350.
  • the memory 364 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units.
  • An expansion memory 374 may also be provided and connected to the mobile computing device 350 through an expansion interface 372, which may include, for example, a SIMM (Single In Line Memory Module) card interface.
  • the expansion memory 374 may provide extra storage space for the mobile computing device 350, or may also store applications or other information for the mobile computing device 350.
  • the expansion memory 374 may include instructions to carry out or supplement the processes described above, and may include secure information also.
  • the expansion memory 374 may be provided as a security module for the mobile computing device 350, and may be programmed with instructions that permit secure use of the mobile computing device 350.
  • secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
  • the memory may include, for example, flash memory and/or NVRAM memory (nonvolatile random access memory), as discussed below.
  • instructions are stored in an information carrier such that the instructions, when executed by one or more processing devices (for example, processor 352), perform one or more methods, such as those described above.
  • the instructions can also be stored by one or more storage devices, such as one or more computer- or machine-readable mediums (for example, the memory 364, the expansion memory 374, or memory on the processor 352).
  • the instructions can be received in a propagated signal, for example, over the transceiver 368 or the external interface 362.
  • the mobile computing device 350 may communicate wirelessly through the communication interface 366, which may include digital signal processing circuitry in some cases.
  • the communication interface 366 may provide for communications under various modes or protocols, such as GSM voice calls (Global System for Mobile communications), SMS (Short Message Service), EMS (Enhanced Messaging Service), or MMS messaging (Multimedia Messaging Service), CDMA (code division multiple access), TDMA (time division multiple access), PDC (Personal Digital Cellular), WCDMA (Wideband Code Division Multiple Access), CDMA2000, or GPRS (General Packet Radio Service), LTE, 5G/6G cellular, among others.
  • a GPS (Global Positioning System) receiver module 370 may provide additional navigation- and location-related wireless data to the mobile computing device 350, which may be used as appropriate by applications running on the mobile computing device 350.
  • the mobile computing device 350 may also communicate audibly using an audio codec 360, which may receive spoken information from a user and convert it to usable digital information.
  • the audio codec 360 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of the mobile computing device 350.
  • Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, among others) and may also include sound generated by applications operating on the mobile computing device 350.
  • the mobile computing device 350 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 380. It may also be implemented as part of a smart-phone 382, personal digital assistant, or other similar mobile device.
  • A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. For example, various forms of the flows shown above may be used, with steps re-ordered, added, or removed.
  • Embodiments of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • Embodiments of the invention can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus.
  • the computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them.
  • data processing apparatus encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers.
  • the apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • a propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program does not necessarily correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
  • a computer need not have such devices.
  • a computer can be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few.
  • Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • embodiments of the invention can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • Embodiments of the invention can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of one or more such back end, middleware, or front end components.
  • the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • in each instance where an HTML file is mentioned, other file types or formats may be substituted; for example, an HTML file may be replaced by an XML, JSON, plain text, or other type of file.
  • where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Environmental Sciences (AREA)
  • Zoology (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Image Processing (AREA)

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer-storage media, for ground truth and training data generation. In some implementations, a method includes obtaining one or more images of a fish in a first container connected to a second container by a pipe configured to allow the fish to move from the first container to the second container; obtaining sensor data from the second container representing features of the fish; generating a ground truth data item that includes (i) the one or more images of the fish and (ii) the sensor data from the second container; providing a first portion of the ground truth data item to a machine learning model; and adjusting the machine learning model using a comparison of (i) output from the machine learning model processing the first portion of the ground truth data item and (ii) a second portion of the ground truth data item.

Description

GROUND TRUTH AND TRAINING DATA GENERATION FOR FISH
BACKGROUND
[0001] Systems for training machine learning models require training data and associated ground truths. Ground truth collection can be difficult, and inaccurate ground truths can make machine learning models perform inaccurately.
[0002] Traditional techniques for obtaining ground truths in the aquaculture domain, such as manual extraction and examination of fish by hand, can distress, harm, or even kill the fish in the process. In the aggregate, such damage can harm fragile ecosystems by disrupting naturally apportioned nutrients with decaying organic matter. The resulting ecosystem damage can cause algae blooms with knock-on effects that damage larger regions of an aquatic environment.
SUMMARY
[0003] In some implementations, a system generates ground truth data from fish enclosed within a chamber. The system can prevent harm to the fish by avoiding manual extraction and, instead, using waterways, such as pipes, to move fish within sensing ranges of sensors. In some implementations, the system uses suction force to move a fish within a fish pen into a pipe filled with water that is connected to a chamber for obtaining data, such as images or other sensor data.
[0004] The system can obtain the fish from an aquaculture fish pen. The fish can travel to a first chamber where images can be obtained. The fish can travel to a second chamber configured to obtain accurate ground truth measurements. The system can generate an element of training data, including ground truth data, that includes the obtained images and ground truth measurements. The system can use the training data to train a machine learning model to predict values indicated by the ground truth measurements based on input data similar to the obtained images.
[0005] Advantageous implementations can include one or more of the following features. For example, systems and methods described do not require that fish are anesthetized. The described system, apparatus, and corresponding methods improve fish welfare, e.g., by reducing movement of fish by hand, while increasing an amount of generated ground truths. By increasing the amount of ground truth and training data generated, systems can increase the accuracy of corresponding trained models.
[0006] In some implementations, systems and methods described enable faster, more automated ground-truth collection compared to manual observations or maneuvering of fish. Faster ground-truth collection allows for greater amounts of training data and corresponding greater accuracy in associated trained models, e.g., models trained to predict biomass of fish. Systems and methods can improve support for new species of fish or new conditions detected in fish by generating training data that depicts the new species.
[0007] In some implementations, weight and fish dimensions are collected. For example, a 2-axis light curtain can be used to collect weight and fish dimensions of a fish. In some cases, accuracy of a light curtain can enable models to predict more subtle features of fish, such as physical deformities, spinal deformities, among others. Typical images can be insufficient to detect such subtleties.
[0008] In some implementations, a fish is identified using a device with a unique identifier, e.g., a chip, tag, among others. For example, a fish can have a tag attached to its body or a chip embedded in the skin that can be read wirelessly as the fish swims. Data obtained in one or more chambers can be associated with an identifier of a chip, tag, or other physical identification of the fish.
[0009] In some implementations, features of a fish are used to determine a unique identity. For example, a system can detect features of a fish, such as truss lengths, spots, physical feature location, among others. Based on the detected features, a system can determine a unique identity of a fish, e.g., by providing data representing a fish to a model trained to identify fish, and associate data obtained that represents the fish with the unique identity of the fish.
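As one hedged illustration of resolving a unique identity from detected features such as truss lengths, the sketch below matches a feature vector against previously enrolled fish by nearest-neighbor distance. The distance metric, match threshold, and enrollment store are assumptions for illustration only, not techniques named in this disclosure.

```python
# Illustrative nearest-neighbor identity resolution from a fish feature
# vector (e.g., a fixed, ordered set of truss lengths). The distance
# metric, match threshold, and in-memory enrollment store are assumptions.
import math

enrolled: dict[str, list[float]] = {}  # fish_id -> reference feature vector

def identify(features: list[float], max_dist: float = 0.1) -> str:
    best_id, best_dist = None, float("inf")
    for fish_id, reference in enrolled.items():
        # Feature vectors must share the same length and ordering.
        dist = math.dist(features, reference)  # Euclidean distance
        if dist < best_dist:
            best_id, best_dist = fish_id, dist
    if best_id is not None and best_dist <= max_dist:
        return best_id                  # matched an existing identity
    new_id = f"fish-{len(enrolled) + 1}"
    enrolled[new_id] = features         # enroll as a new identity
    return new_id
```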
[0010] In some implementations, training data that is generated is used directly to train a model. For example, after a system generates an item of training data, the item of training data can be used to train a corresponding model until one or more thresholds of the model are satisfied. The system can determine if enough training data has been generated or the model is performing adequately. In some cases, stopping training data generation as soon as a model has satisfied one or more thresholds, e.g., prediction accuracy thresholds, can help reduce stress on fish or improve other operations of aquaculture by reducing movement of fish from in or out of fish pens.
[0011] In some implementations, increasing a rate of training data generation allows for quicker detection and mitigation of disease. For example, compared to traditional manual methods of obtaining training data, proposed systems and methods described can increase a rate of training data generation which, when used to train a model that identifies diseases in fish, can enable such a model to more quickly make detections. After detection, a system can treat the fish identified by the model as having the disease or sort the diseased fish from the non-diseased fish. Because diseases can grow more serious in a population over time, e.g., affecting more individuals or affecting individuals more severely, enabling more rapid detection through more rapid training of corresponding models can potentially save individuals from death by the given disease or others from contracting the disease if communicable.
[0012] In some implementations, ground truth and corresponding training data are generated only until a corresponding trained model satisfies one or more thresholds. For example, a system can train a model while generating training data for training the model. Once the system trains the model such that the performance of the model satisfies one or more thresholds, the system can stop generating training data and ground truth data. In this way, the system can reduce the amount of fish measuring and movement as well as corresponding energy and climate effects of running the system by only operating until sufficient training data has been generated.
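A minimal sketch of this generate-until-threshold loop, assuming hypothetical generate_item, train_step, and evaluate helpers and an illustrative error threshold, might look like:

```python
# Minimal sketch, under assumed helper functions, of generating training
# data only until the model satisfies a performance threshold.
def generate_until_threshold(generate_item, train_step, evaluate,
                             threshold=0.05, max_items=10_000):
    for _ in range(max_items):
        item = generate_item()       # move a fish through and measure it
        train_step(item)             # one training update on the new item
        if evaluate() <= threshold:  # e.g., mean relative error on a held-out set
            break                    # stop measuring and moving fish
```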
[0013] Ground truth data captured and included in training data can include one or more of: a weight or biomass of a fish; dimensions of a fish, such as length and width, spine straightness or deformity, protruding belly, gill shortening, among others; a number or location of lice, or other ectoparasites, on a fish; a presence or absence of diseases in the fish, e.g., diseases that lead to bloated belly, red-tint in scales, among others; sizes and locations of lesions or wounds on the fish; levels of stress hormones in the fish; a fish's eye condition, e.g., cataracts, eye damage, among others.
[0014] The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features and advantages of the invention will become apparent from the description, the drawings, and the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] FIG. 1 is a diagram showing an example of a system for ground truth and training data generation.
[0016] FIG. 2 is a flow diagram illustrating an example of a process for ground truth and training data generation.
[0017] FIG. 3 is a diagram illustrating an example of a computing system used for ground truth and training data generation.
[0018] Like reference numbers and designations in the various drawings indicate like elements.
DETAILED DESCRIPTION
[0019] FIG. 1 is a diagram showing an example of a system 100 for ground truth and training data generation. The system 100 includes a control unit 107, a first chamber 106, and a second chamber 124. The control unit 107 operates one or more modules for obtaining data, generating training data, and training a machine learning model, e.g., a light controller 108, a camera controller 110, a ground truth engine 142, and a model trainer 160. The control unit 107 can include one or more computing systems including one or more processing elements.
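As a hedged illustration only, the sketch below shows one way the modules of the control unit 107 might be organized in software. Every class name, method name, and return value is an assumption made for this example, not the actual implementation of the control unit 107.

```python
# Minimal sketch of one way the control unit 107 could compose its
# modules. All names and signatures below are illustrative assumptions.
from dataclasses import dataclass, field


class LightController:
    """Stands in for the light controller 108."""
    def set_pulse_rate(self, hz: float) -> None:
        print(f"lighting element pulsing at {hz} Hz")


class CameraController:
    """Stands in for the camera controller 110."""
    def capture_image(self) -> bytes:
        return b"raw-frame"  # placeholder for a captured frame


class GroundTruthEngine:
    """Stands in for the ground truth engine 142."""
    def build_item(self, images: list, sensor_data: dict) -> dict:
        return {"images": images, "known_features": sensor_data}


class ModelTrainer:
    """Stands in for the model trainer 160."""
    def train_on(self, item: dict) -> None:
        pass  # one training iteration would go here


@dataclass
class ControlUnit:
    """Stands in for the control unit 107, composing the four modules."""
    lights: LightController = field(default_factory=LightController)
    cameras: CameraController = field(default_factory=CameraController)
    ground_truth: GroundTruthEngine = field(default_factory=GroundTruthEngine)
    trainer: ModelTrainer = field(default_factory=ModelTrainer)
```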
[0020] In stage A, a fish 104 moves in a tube 102 towards the first chamber 106. The tube 102 can be sized to fit the fish 104. The tube 102 can include water such that the fish 104 is able to swim in the tube 102 or the fish 104 is pushed by a current of water flowing in the tube 102 to the first chamber 106. In some implementations, the control unit 107 controls a suction effect at an end of the tube 102. For example, the control unit 107 can control a waterjet or propeller that moves water in the tube 102 towards the first chamber 106.
[0021] By using the tube 102, which can simply be a water way that allows the fish 104 to move from a starting location to the first chamber 106, the system 100 can prevent harm to the fish 104. For example, in traditional methods, the fish 104 may be extracted by hand from a fish pen and put into a location for measurement. The manual extraction can cause lesions that kill the fish 104 outright, or that kill the fish 104 over time by impairing an ability of the fish 104 to feed or perform other necessary tasks.
[0022] The system 100, through harm reduction to the fish 104, can prevent corresponding ecological harm from, e.g., overabundance of organic decay from fish that would have been killed using traditional techniques. The system 100 can also prevent psychological distress of the fish 104 by reducing such manual extraction. Psychological distress can affect health and even lead to death in some cases.
[0023] In stage B, a fish 112 is shown in the first chamber 106. The first chamber 106 includes a lighting element 114 and a camera 116. In some implementations, the lighting element 114 and the camera 116 are controlled by the control unit 107. For example, modules of the control unit, e.g., the light controller 108 and the camera controller 110, can control the lighting element 114 and the camera 116.
[0025] In some implementations, the control unit 107 controls the camera 116 to capture fish images 118 of the fish 112. For example, the control unit 107 can send a signal to the camera 116, or to a processor that directly controls the camera 116, configured to operate the camera 116 and obtain the fish images 118. The fish images 118 can include still images or video segments of the fish 112.
[0026] In some implementations, the control unit 107 controls the lighting element 114. For example, the control unit 107 can send a signal to the lighting element 114, or to a processor that directly controls the lighting element 114, configured to operate the lighting element 114 and trigger a light, turn off a light, or start a specific lighting pattern, such as an on/off sequence or specific light frequency ranges, among others. In some implementations, the lighting element 114 illuminates the fish 112. In some implementations, the control unit 107 times the lighting element 114 to illuminate the fish 112 while the camera 116 is obtaining an image, enabling the camera 116 to capture images of the fish 112 while the fish 112 is illuminated by the lighting element 114.
[0027] In some implementations, the control unit 107 controls the lighting element 114 to vary an amount or type of light while the camera 116 obtains one or more images. For example, the control unit 107 can control the lighting element 114 to illuminate with a first intensity, or with light of a first frequency range, for a first period of time. The control unit 107 can then control the lighting element 114 to illuminate with a second intensity, or with light of a second frequency range, for a second period of time. The control unit 107 can control the camera 116 to capture images under the varying light. In some implementations, capturing images under varying lighting conditions helps the resulting training data train a model that generalizes well to varying real-world lighting scenarios.
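One hedged sketch of such a capture-under-varying-light loop follows. The controller classes, intensities, wavelength bands, and hold times are all assumptions made for illustration, not the disclosed hardware interface.

```python
# Illustrative sketch of capturing images under a varying lighting
# schedule. Controller classes and all numeric values are assumptions.
import time


class _Lights:
    def configure(self, intensity: float, band_nm: tuple) -> None:
        print(f"light: intensity={intensity}, band={band_nm} nm")


class _Camera:
    def capture(self) -> bytes:
        return b"frame"  # placeholder frame


# (intensity 0..1, wavelength band in nm, hold time in seconds)
LIGHTING_SCHEDULE = [
    (0.8, (620, 680), 0.5),  # brighter red interval
    (0.4, (450, 490), 0.5),  # dimmer blue interval
]


def capture_under_varying_light(lights: _Lights, camera: _Camera) -> list:
    frames = []
    for intensity, band, hold_s in LIGHTING_SCHEDULE:
        lights.configure(intensity=intensity, band_nm=band)
        time.sleep(hold_s)  # let the chamber settle into the new lighting
        frames.append(camera.capture())
    return frames


print(len(capture_under_varying_light(_Lights(), _Camera())))  # 2 frames
```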
[0027] In some implementations, the lighting element 114 includes powered lights, such as light emitting diodes (LEDs). In some implementations, the lighting element 114 includes an opening to allow sunlight or other produced light into the first chamber 106. Such an opening can be connected to a mechanism, e.g., shade, covering, shield, among others, to open and close the opening or to adjust, or vary an amount of light, e.g., fully opaque, semi-translucent, clear, that illuminates the fish 112.
[0028] In some implementations, the control unit 107 determines a quality, e.g., frequency/wavelength, brightness, polarization, among others, of light with which to illuminate the fish 112. For example, the control unit 107 can determine the fish 112 is at risk for a particular disease, e.g., by visual inspection of images captured by the camera 116, by an upstream processing element, or as indicated by a user with knowledge of an origin of the fish 112. The control unit 107 can determine that, for the particular disease, one or more particular frequency ranges are useful in detecting certain conditions. For example, if the fish 112 is determined to be at risk for sea lice, the control unit 107 can control the lighting element 114 to alternate flashes of red light and blue light, or other frequency ranges determined by the control unit 107, as sketched below. The control unit 107 can configure the camera 116 to capture images of the fish 112 with the fish 112 alternately illuminated by the red light and the blue light.
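A minimal lookup of the kind just described, mapping a suspected condition to a lighting program, might look like the following sketch. The condition names and wavelength choices are assumptions for illustration, not validated detection protocols.

```python
# Hedged sketch: choose a lighting program based on a suspected
# condition. This mapping is illustrative only; real frequency ranges
# for detecting a given condition would come from domain data.
LIGHTING_PROGRAMS = {
    # condition: list of (color name, wavelength band in nm) to alternate
    "sea_lice": [("red", (620, 680)), ("blue", (450, 490))],
    "default": [("white", (400, 700))],
}


def lighting_program_for(condition: str) -> list:
    return LIGHTING_PROGRAMS.get(condition, LIGHTING_PROGRAMS["default"])


print(lighting_program_for("sea_lice"))  # alternate red and blue flashes
```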
[0029] In some implementations, the control unit 107 controls one or more lighting features of the lighting element 114 to pulse light that illuminates the fish 112. For example, the control unit 107 can pulse a white or colored light at a particular rate, e.g., pulses per second. One or more colors of light, e.g., wavelength ranges, can be pulsed such that the camera 116 captures images of the fish 112 illuminated by different colored light. The pulses of multiple light colors can be interlaced so that a pulse of a first color is followed by a second color which is followed by either the first color or a third color. Such a pattern can be continued for one or more colored lights.
[0030] In some implementations, the control unit 107 determines a pulse rate for a light that satisfies a detection threshold of the fish 112. For example, if the control unit 107 determines that the fish 112 can detect light pulse rates below 30 hertz, the control unit 107 can increase the rate above 30 hertz to prevent the fish 112 from detecting the pulses. In general, if a fish does not detect light pulsing, corresponding stress hormone production can be reduced which thereby increases fish welfare. Fish welfare can be tied to infection rates, morbidity rates, production yield, among others.
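The pulse-rate selection described in the preceding paragraph could be sketched as follows, where the per-species flicker thresholds, the safety margin, and the default rate are all assumed placeholder values rather than measured data.

```python
# Hedged sketch of selecting a pulse rate above an assumed flicker
# detection threshold so a fish does not perceive the pulsing.
SPECIES_FLICKER_THRESHOLD_HZ = {
    "species_a": 10.0,  # assumed detection threshold
    "species_b": 30.0,  # assumed detection threshold
}

SAFETY_MARGIN = 1.5  # run comfortably above the threshold


def choose_pulse_rate(species: str, default_hz: float = 20.0) -> float:
    threshold = SPECIES_FLICKER_THRESHOLD_HZ.get(species)
    if threshold is None:
        return default_hz  # unknown species: fall back to the default
    return max(default_hz, threshold * SAFETY_MARGIN)


print(choose_pulse_rate("species_a"))  # 20.0 (already above 10 Hz * 1.5)
print(choose_pulse_rate("species_b"))  # 45.0 (pushed above 30 Hz * 1.5)
```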
[0031] In some implementations, the control unit 107 uses images from the camera 116 to determine a species of the fish 112. For example, the control unit 107 can determine that the fish 112 is of a first species with a pulse sensitivity below 10 hertz using a model trained to predict a species. In some implementations, the control unit 107 obtains data provided by a user of the system 100 indicating a current species to be measured. Based on determining that the fish 112 is of the first species, the control unit 107 can control the lighting element 114 or the camera 116 to illuminate the fish 112 with a pulse rate above 10 hertz.
[0032] In some implementations, the lighting element 114 includes one or more lighting features. For example, the lighting element 114 can include one or more of: a first light that emits light of a first frequency or range of frequencies or a second light that emits light of a second frequency or range of frequencies. The first light and the second light can emit light of the same or different color. The first light and the second light can emit light of the same or different frequency ranges.
[0033] In some implementations, the first chamber 106 is attached to the second chamber 124. For example, the first chamber 106 and the second chamber 124 can be a part of the same chamber. In some implementations, a single chamber including the first chamber 106 and the second chamber 124 includes a partition to control a flow of fish from the first chamber 106 to the second chamber 124. For example, a mesh netting can be used with an opening that is controlled by the control unit 107 to allow fish to flow between the first chamber 106 and the second chamber 124.
[0034] The fish 112 can move from the first chamber 106 to the second chamber 124. A fish 126 is shown in the second chamber 124. The fish 126 moved from the first chamber 106 through a connecting tube 122 to the second chamber 124.
[0035] In some implementations, a fish moving in the tube 102 moves into the second chamber 124 before the first chamber 106. For example, the first chamber 106 can be positioned after the second chamber 124 relative to a movement of fish. The control unit 107 can similarly collect images and ground truth data as discussed. In some implementations, the system 100 is configured to allow fish to move in either direction. For example, the control unit 107 can change a direction of water flow to control a direction of fish moving through the system, e.g., moving from the first chamber 106 to the second chamber 124, or from the second chamber 124 to the first chamber 106.
[0036] The second chamber 124 is shown using two views of the second chamber 124: (i) view 124a showing one set of light emitters 128 and light sensors 132 and (ii) view 124b showing another set of light emitters 134 with corresponding light sensors not shown.
[0037] The light emitters 128 and 134, together with the light sensors 132 and the corresponding sensors for the light emitters 134, form a light curtain along two axes. In view 124a, the light from the light emitters 128 is shown traveling from the light emitters 128 to the light sensors 132 on the opposite side of the second chamber 124. The fish 126 blocks a portion of the light traveling from the light emitters 128 to the light sensors 132.
[0038] The control unit 107 obtains sensor data 140. In some implementations, the sensor data 140 includes data indicating how much, and where, light is sensed by the light sensors 132. Using the sensor data 140, the control unit 107 can determine physical dimensions of the fish 126, e.g., length and width, as well as conditions such as diseases, lesions, or eye damage, among others.
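One hedged way to turn such occlusion readings into dimensions is sketched below: each axis of the light curtain reports which beams are blocked, and the blocked span multiplied by the beam spacing approximates the fish's extent along that axis. The beam spacing and the boolean "blocked" interface are assumptions for this example.

```python
# Hedged sketch: estimate a fish's extent along one axis of the light
# curtain from which beams are blocked. Beam spacing is an assumption.
def occluded_extent_mm(blocked: list, beam_pitch_mm: float) -> float:
    """Span between the first and last blocked beams, in millimeters."""
    hits = [i for i, is_blocked in enumerate(blocked) if is_blocked]
    if not hits:
        return 0.0
    return (hits[-1] - hits[0] + 1) * beam_pitch_mm


# Example: a row of 12 beams spaced 10 mm apart; 6 consecutive blocked
# beams suggest roughly 60 mm of fish along this axis. Running the same
# calculation on the second axis (emitters 134) yields the other
# dimension.
row = [False, False, True, True, True, True,
       True, True, False, False, False, False]
print(occluded_extent_mm(row, beam_pitch_mm=10.0))  # 60.0
```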
[0039] In some implementations, the light emitters 128 and 134 emit light of one or more frequencies or brightnesses. For example, the light emitters 128 and 134 can emit light from one or more lasers (light amplification by stimulated emission of radiation). The light emitters 128 and 134 can include LEDs. The light emitters 128 and 134 can emit light of one or more frequency ranges. For example, the light emitters 128 and 134 can emit light of red color, blue color, or other colors. The light emitters 128 and 134, similar to the lighting element 114, can emit light within specific frequency ranges within a specific general color of light.
[0040] In general, the control unit 107 can use the light emitters 128 and 134 to estimate fish size, e.g., length and width, or to detect conditions on a given fish, e.g., disease, lesions, or eye damage. In some implementations, condition detection requires light of a specific type, e.g., frequency, brightness, pulse rate, among others.
[0041] To avoid startling the fish and increase fish welfare, similar to the lighting element 114, light pulses of the light emitters 128 and 134 can be rapid enough so that the fish doesn’t perceive the pulses. The pulse rate of the light emitters 128 and 134 can be greater than a given fish eye’s “shutter speed.” Other sensors could also be mounted in the chamber (such as a sensor to detect fish stress hormone). In general, sensors to detect dimensions, weight, or health conditions of fish can be used in the second chamber 124.
[0042] In some implementations, the second chamber 124 is a known-volume container that includes doors at both ends. In some implementations, after the fish 126 moves into the second chamber 124, the control unit 107, or a processor connected to the second chamber 124, controls a water filling device to fill the second chamber 124 with water. Weight sensors of the second chamber 124 can then provide weight sensor data of the second chamber 124 in the sensor data 140 to the control unit 107.
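One way such a weight reading could be reduced to a fish mass is sketched below: with the chamber sealed and filled, the measured total exceeds the water-only baseline by the fish's mass minus the mass of the water the fish displaces, so an assumed fish density lets the fish mass be solved for. All densities and numbers here are illustrative assumptions, not part of the disclosure.

```python
# Hedged sketch: recover fish mass from a sealed, water-filled chamber
# of known volume. All density values are illustrative assumptions.
WATER_DENSITY_KG_PER_L = 1.025  # seawater (assumption)
FISH_DENSITY_KG_PER_L = 1.05    # rough teleost density (assumption)


def fish_mass_kg(total_kg: float, tare_kg: float,
                 chamber_volume_l: float) -> float:
    # Baseline: the empty chamber plus a full chamber's worth of water.
    baseline_kg = tare_kg + WATER_DENSITY_KG_PER_L * chamber_volume_l
    excess_kg = total_kg - baseline_kg
    # excess = m_fish * (1 - rho_water / rho_fish)
    return excess_kg / (1.0 - WATER_DENSITY_KG_PER_L / FISH_DENSITY_KG_PER_L)


# Example: 500 L chamber, 80 kg tare, 593.2 kg measured with fish inside.
print(round(fish_mass_kg(593.2, 80.0, 500.0), 1))  # 29.4 kg of fish
```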
[0043] In stage C, one or more processing elements of the second chamber 124 provide the sensor data 140 to the control unit 107. The one or more processing elements can be computing components of one or more computers that control sensors, light emitters, among others, used to obtain the sensor data 140. In some implementations, the control unit 107 directly controls sensors or emitters of the second chamber 124. For example, the control unit 107 can control a pulse rate, frequency, timing, among other parameters of the emitters 128 and 134. In general, the control unit 107 can control what data is obtained in the second chamber 124 either directly or by sending a signal configured to control elements of the second chamber 124.
[0044] The control unit 107 obtains the sensor data 140 and the fish images 118. In some implementations, the control unit 107 stores the data in a database 120. For example, the control unit 107 can store the data, including the sensor data 140 and the fish images 118, using one or more keys. The keys can indicate a specific fish. For example, the control unit 107 can obtain images and sensor data of the fish 112 and store the data for the fish 112 with a key identifying the fish 112.
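The keyed storage described above might look like the following sketch, with an in-memory dictionary standing in for the database 120; the key format and field names are assumptions.

```python
# Hedged sketch of keying stored data by fish, with an in-memory
# mapping standing in for the database 120.
from collections import defaultdict

database_120 = defaultdict(lambda: {"images": [], "sensor_data": []})


def store_for_fish(fish_key, images=(), sensor_data=()):
    database_120[fish_key]["images"].extend(images)
    database_120[fish_key]["sensor_data"].extend(sensor_data)


store_for_fish("fish-112", images=[b"img-144", b"img-146"])
store_for_fish("fish-112", sensor_data=[{"length_mm": 412}])
print(len(database_120["fish-112"]["images"]))  # 2
```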
[0045] In some implementations, the control unit 107 generates separate data elements for each collection of images and sensor data. The separate data elements can be used in multiple iterations of training. For example, images of a data element can be provided to a model that then outputs a prediction. The control unit 107 can compare the predicted output to the sensor data portion of the data element to determine what adjustments are to be made to the model being trained to improve the model.
[0046] The control unit 107 generates a training data element 143. The training data element 143 includes data representing the fish images 118 and sensor data obtained for the fish 112, similar to the sensor data 140 obtained for the fish 126. The training data element 143 includes images 144, 146, and 148 of the fish 112. The known features 150 indicate values to be predicted by a model being trained, such as the model 162.
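A training data element like the element 143 might be shaped as in this sketch; the field names and example values are assumptions for illustration.

```python
# Hedged sketch of the shape of a training data element such as
# element 143: images as model input, known features 150 as targets.
from dataclasses import dataclass


@dataclass
class TrainingDataElement:
    fish_key: str
    images: list          # e.g., images 144, 146, and 148
    known_features: dict  # e.g., the known features 150


element_143 = TrainingDataElement(
    fish_key="fish-112",
    images=[b"img-144", b"img-146", b"img-148"],
    known_features={"weight_kg": 3.1, "length_mm": 412, "lice_count": 2},
)
```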
[0047] The model trainer 160 of the control unit 107 uses the generated training data element 143 to train the model 162. The model 162 can, in general, be any type of machine learning model trained to predict features of fish. Of course, in situations where the system 100 is used for non-fish purposes, the model can be configured to predict non-fish features, e.g., features of agricultural animals, manufactured objects, among others.
[0048] The model trainer 160 operated by the control unit 107 provides a portion of the data element 143 to the model 162. In some implementations, the model trainer 160 provides one or more images, such as images 144, 146, or 148. The model 162 can generate output predicting a known feature such as one or more of: a weight or biomass of a fish; a fish length, width, or other dimensions; deformities such as if a spine is straight or has a deformity, if there is a protruding belly, or if there is gill shortening; a number and location of lice or other ectoparasites on a fish; presence or absence of diseases in a fish, e.g., diseases that lead to bloated belly, or red-tint in scales; sizes and locations of lesions or wounds on a fish; levels of stress hormones in a fish; a fish’s eye condition, e.g., if the fish has cataracts or eye damage.
[0049] The model trainer 160 obtains predicted output of the model 162 and compares the output to a portion of the data element used for input, e.g., the data element 143. The model trainer 160 can compare one or more predictions of the model 162 with one or more of the known features 150. Based on a comparison, the model trainer 160 can adjust one or more weights or parameters of the model 162 to improve an accuracy of one or more predictions.
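To make the compare-and-adjust loop concrete, the sketch below trains a deliberately tiny one-parameter-per-feature "model" on a single known feature (weight). The feature vector, learning rate, and model form are all assumptions and bear no resemblance to a production model.

```python
# Hedged sketch of one compare-and-adjust iteration: predict a known
# feature (weight here) from stand-in image features, measure the
# difference against the known value, and nudge the weights. The tiny
# linear "model" and all numbers are illustrative assumptions.
def train_step(weights, features, known_weight_kg, lr=0.05):
    prediction = sum(w * x for w, x in zip(weights, features))
    error = prediction - known_weight_kg
    for i, x in enumerate(features):  # gradient step on squared error
        weights[i] -= lr * 2.0 * error * x
    return abs(error)


weights = [0.0, 0.0, 0.0]
features = [0.9, 1.4, 0.3]  # stand-in for features extracted from images
for _ in range(200):
    residual = train_step(weights, features, known_weight_kg=3.1)
print(round(residual, 4))  # shrinks toward 0 as predictions approach 3.1
```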
[0050] In some implementations, the model trainer 160 compares accuracy, based on a difference between a prediction and a known value, to a threshold. For example, the model trainer 160 can compare a known weight of a fish, obtained in the second chamber 124 and included in the data element 143, with a weight of the given fish predicted by the model 162. If the prediction satisfies a threshold difference compared to the known value, the model trainer 160 can send a signal to the control unit 107 indicating that training data generation can be stopped.
[0051] In some implementations, the model trainer 160 determines whether an average accuracy of one or more iterations of training satisfies a threshold accuracy. For example, the model trainer 160 can provide one or more portions of input data and compare corresponding output data to known features. The model trainer 160 can generate difference values indicating an amount of difference between a predicted and a known feature. The model trainer 160 can combine two or more difference values to generate a combined difference value and compare the combined difference value to a threshold. In some implementations, the combined difference value is an average difference across multiple predicted and known feature values. Based on the comparison, the model trainer 160 can determine whether the model 162 satisfies a corresponding performance threshold.
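The combined-difference check described here might be implemented along these lines; the relative-difference formulation and the 5% threshold are assumptions made for the example.

```python
# Hedged sketch of the stopping check: average per-feature relative
# differences between predicted and known values, compare to a
# threshold. The threshold and formulation are assumptions.
def should_stop_training(predicted, known, threshold=0.05):
    diffs = [abs(p - k) / max(abs(k), 1e-9)
             for p, k in zip(predicted, known)]
    combined = sum(diffs) / len(diffs)  # combined (average) difference
    return combined <= threshold


# Predicted weight 3.0 kg vs. known 3.1 kg, and predicted length 410 mm
# vs. known 412 mm: about a 1.9% average difference, under 5%.
print(should_stop_training([3.0, 410.0], [3.1, 412.0]))  # True
```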
[0052] In some implementations, the control unit 107 stops training data generation after the model 162 is trained sufficiently to satisfy one or more thresholds. For example, the control unit 107 can stop a suction effect or close a door to the tube 102, the first chamber 106, the second chamber 124, or the tube 166 for providing fish, such as fish 164, to a new location, e.g., back to a starting fish pen.
[0053] In some implementations, the control unit 107 sends a signal to one or more automated processes of the system 100 after determining the model 162 satisfies one or more thresholds. For example, a given signal can be configured to control the camera 116 to stop obtaining images, the lighting element 114 to stop a given illumination, or sensors of the second chamber 124 to stop obtaining sensor data.
[0054] In some implementations, the system 100 stops capturing images and moves water through the tubes and chambers to move the fish out through the discharge tube 166. For example, the control unit 107 can increase a current flow through one or more of the tubes 102, 122, or 166, or the chambers 106 or 124. In general, the tubes 102, 122, and 166 can be any shape sufficient to fit a given type of fish used in measurement.
[0055] FIG. 2 is a flow diagram illustrating an example of a process 200 for ground truth and training data generation. The process 200 may be performed by one or more electronic systems, for example, the system 100 of FIG. 1.
[0056] The process 200 includes obtaining one or more images of a fish in a first container connected to a second container by a pipe that allows the fish to move from the first container to the second container (202). For example, the control unit 107 can control the camera 116 using the camera controller 110 to obtain the fish images 118 of the fish 112.
[0057] The process 200 includes obtaining sensor data from the second container representing features of the fish (204). For example, the control unit 107 can control one or more sensors of the second chamber 124 to obtain sensor data of the fish 112 similar to the sensor data 140 obtained for the fish 126.
[0058] The process 200 includes generating a ground truth data item that includes (i) the one or more images of the fish and (ii) the sensor data from the second container (206). For example, the ground truth engine 142 of the control unit 107 can generate the data element 143 that includes images from the fish images 118 and the known features 150 indicating sensor data of the fish 112 obtained in the second chamber 124, similar to the sensor data 140 obtained for the fish 126.

[0059] The process 200 includes providing a first portion of the ground truth data item to a machine learning model (208). For example, the model trainer 160 of the control unit 107 can provide a portion of the data element, such as any one or more of the images 144, 146, or 148, to the model 162.
[0060] The process 200 includes adjusting the machine learning model using a comparison of (i) output from the machine learning model processing the first portion of the ground truth data item and (ii) a second portion of the ground truth data item (210). For example, the model trainer 160 can train the model 162 based on generating a comparison of an output of the model 162 and data of the known features 150.
[0061] FIG. 3 is a diagram illustrating an example of a computing system used for ground truth and training data generation. The computing system includes computing device 300 and a mobile computing device 350 that can be used to implement the techniques described herein. For example, one or more components of the system 100 could be an example of the computing device 300 or the mobile computing device 350, such as a computer system implementing the control unit 107, devices that access information from the control unit 107, or a server that accesses or stores information regarding the operations performed by the control unit 107.
[0062] The computing device 300 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The mobile computing device 350 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart-phones, mobile embedded radio systems, radio diagnostic computing devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to be limiting.
[0063] The computing device 300 includes a processor 302, a memory 304, a storage device 306, a high-speed interface 308 connecting to the memory 304 and multiple high-speed expansion ports 310, and a low-speed interface 312 connecting to a low-speed expansion port 314 and the storage device 306. Each of the processor 302, the memory 304, the storage device 306, the high-speed interface 308, the high-speed expansion ports 310, and the low-speed interface 312, are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 302 can process instructions for execution within the computing device 300, including instructions stored in the memory 304 or on the storage device 306 to display graphical information for a GUI on an external input/output device, such as a display 316 coupled to the high-speed interface 308. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. In addition, multiple computing devices may be connected, with each device providing portions of the operations (e.g., as a server bank, a group of blade servers, or a multi-processor system). In some implementations, the processor 302 is a single threaded processor. In some implementations, the processor 302 is a multi-threaded processor. In some implementations, the processor 302 is a quantum computer.
[0064] The memory 304 stores information within the computing device 300. In some implementations, the memory 304 is a volatile memory unit or units. In some implementations, the memory 304 is a non-volatile memory unit or units. The memory 304 may also be another form of computer-readable medium, such as a magnetic or optical disk.
[0065] The storage device 306 is capable of providing mass storage for the computing device 300. In some implementations, the storage device 306 may be or include a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid-state memory device, or an array of devices, including devices in a storage area network or other configurations. Instructions can be stored in an information carrier. The instructions, when executed by one or more processing devices (for example, processor 302), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices such as computer- or machine-readable mediums (for example, the memory 304, the storage device 306, or memory on the processor 302). The high-speed interface 308 manages bandwidth-intensive operations for the computing device 300, while the low-speed interface 312 manages lower bandwidth-intensive operations. Such allocation of functions is an example only. In some implementations, the high-speed interface 308 is coupled to the memory 304, the display 316 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 310, which may accept various expansion cards (not shown). In the implementation, the low-speed interface 312 is coupled to the storage device 306 and the low-speed expansion port 314. The low-speed expansion port 314, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet) may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
[0066] The computing device 300 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 320, or multiple times in a group of such servers. In addition, it may be implemented in a personal computer such as a laptop computer 322. It may also be implemented as part of a rack server system 324. Alternatively, components from the computing device 300 may be combined with other components in a mobile device, such as a mobile computing device 350. Each of such devices may include one or more of the computing device 300 and the mobile computing device 350, and an entire system may be made up of multiple computing devices communicating with each other.
[0067] The mobile computing device 350 includes a processor 352, a memory 364, an input/output device such as a display 354, a communication interface 366, and a transceiver 368, among other components. The mobile computing device 350 may also be provided with a storage device, such as a micro-drive or other device, to provide additional storage. Each of the processor 352, the memory 364, the display 354, the communication interface 366, and the transceiver 368, are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
[0068] The processor 352 can execute instructions within the mobile computing device 350, including instructions stored in the memory 364. The processor 352 may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor 352 may provide, for example, for coordination of the other components of the mobile computing device 350, such as control of user interfaces, applications run by the mobile computing device 350, and wireless communication by the mobile computing device 350.

[0069] The processor 352 may communicate with a user through a control interface 358 and a display interface 356 coupled to the display 354. The display 354 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 356 may include appropriate circuitry for driving the display 354 to present graphical and other information to a user. The control interface 358 may receive commands from a user and convert them for submission to the processor 352. In addition, an external interface 362 may provide communication with the processor 352, so as to enable near area communication of the mobile computing device 350 with other devices. The external interface 362 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
[0070] The memory 364 stores information within the mobile computing device 350. The memory 364 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. An expansion memory 374 may also be provided and connected to the mobile computing device 350 through an expansion interface 372, which may include, for example, a SIMM (Single In Line Memory Module) card interface. The expansion memory 374 may provide extra storage space for the mobile computing device 350, or may also store applications or other information for the mobile computing device 350. Specifically, the expansion memory 374 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, the expansion memory 374 may be provided as a security module for the mobile computing device 350, and may be programmed with instructions that permit secure use of the mobile computing device 350. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
[0071] The memory may include, for example, flash memory and/or NVRAM memory (nonvolatile random access memory), as discussed below. In some implementations, instructions are stored in an information carrier such that the instructions, when executed by one or more processing devices (for example, processor 352), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices, such as one or more computer- or machine-readable mediums (for example, the memory 364, the expansion memory 374, or memory on the processor 352). In some implementations, the instructions can be received in a propagated signal, for example, over the transceiver 368 or the external interface 362.
[0072] The mobile computing device 350 may communicate wirelessly through the communication interface 366, which may include digital signal processing circuitry in some cases. The communication interface 366 may provide for communications under various modes or protocols, such as GSM voice calls (Global System for Mobile communications), SMS (Short Message Service), EMS (Enhanced Messaging Service), or MMS messaging (Multimedia Messaging Service), CDMA (code division multiple access), TDMA (time division multiple access), PDC (Personal Digital Cellular), WCDMA (Wideband Code Division Multiple Access), CDMA2000, or GPRS (General Packet Radio Service), LTE, 5G/6G cellular, among others. Such communication may occur, for example, through the transceiver 368 using a radio frequency. In addition, short-range communication may occur, such as using a Bluetooth, Wi-Fi, or other such transceiver (not shown). In addition, a GPS (Global Positioning System) receiver module 370 may provide additional navigation- and location-related wireless data to the mobile computing device 350, which may be used as appropriate by applications running on the mobile computing device 350.
[0073] The mobile computing device 350 may also communicate audibly using an audio codec 360, which may receive spoken information from a user and convert it to usable digital information. The audio codec 360 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of the mobile computing device 350. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, among others) and may also include sound generated by applications operating on the mobile computing device 350.
[0074] The mobile computing device 350 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 380. It may also be implemented as part of a smart-phone 382, personal digital assistant, or other similar mobile device.

[0075] A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. For example, various forms of the flows shown above may be used, with steps re-ordered, added, or removed.
[0076] Embodiments of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the invention can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
[0077] A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
[0078] The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
[0079] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer readable media suitable for storing computer program instructions and data include all forms of non volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
[0080] To provide for interaction with a user, embodiments of the invention can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
[0081] Embodiments of the invention can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
[0082] The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
[0083] While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the invention. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
[0084] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
[0085] In each instance where an HTML file is mentioned, other file types or formats may be substituted. For instance, an HTML file may be replaced by an XML, JSON, plain text, or other types of files. Moreover, where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.
[0086] Particular embodiments of the invention have been described. Other embodiments are within the scope of the following claims. For example, the steps recited in the claims can be performed in a different order and still achieve desirable results.
[0087] What is claimed is:
1. A method comprising:
obtaining one or more images of a fish in a first container connected to a second container by a pipe configured to allow the fish to move from the first container to the second container;
obtaining sensor data from the second container representing features of the fish;
generating a ground truth data item that includes (i) the one or more images of the fish and (ii) the sensor data from the second container;
providing a first portion of the ground truth data item to a machine learning model; and
adjusting the machine learning model using a comparison of (i) output from the machine learning model processing the first portion of the ground truth data item and (ii) a second portion of the ground truth data item.
2. The method of claim 1, wherein the second container includes one or more sensors for determining physical features or health conditions of fish.
3. The method of claim 2, wherein the physical features include one or more of fish dimensions or fish biomass.
4. The method of claim 1, wherein the second container includes a plurality of light sensors and a plurality of light emitters as a light curtain for measuring fish.
5. The method of claim 1, wherein the first container and the second container are included in a single container.
6. The method of claim 1, comprising: controlling one or more lights in the first container to illuminate the fish.
7. The method of claim 1, comprising: controlling one or more cameras in the first container to capture the one or more images of the fish while illuminated by one or more lights.
8. The method of claim 1, comprising: generating a suction at one end of the pipe connected to the first container that moves the fish into the first container.
9. A non-transitory computer-readable medium storing one or more instructions executable by a computer system to perform operations comprising:
obtaining one or more images of a fish in a first container connected to a second container by a pipe configured to allow the fish to move from the first container to the second container;
obtaining sensor data from the second container representing features of the fish;
generating a ground truth data item that includes (i) the one or more images of the fish and (ii) the sensor data from the second container;
providing a first portion of the ground truth data item to a machine learning model; and
adjusting the machine learning model using a comparison of (i) output from the machine learning model processing the first portion of the ground truth data item and (ii) a second portion of the ground truth data item.
10. The medium of claim 9, wherein the second container includes one or more sensors for determining physical features or health conditions of fish.
11. The medium of claim 10, wherein the physical features include one or more of fish dimensions or fish biomass.
12. The medium of claim 9, wherein the second container includes a plurality of light sensors and a plurality of light emitters as a light curtain for measuring fish.
13. The medium of claim 9, wherein the first container and the second container are included in a single container.
14. The medium of claim 9, wherein the operations comprise: controlling one or more lights in the first container to illuminate the fish.
15. The medium of claim 9, wherein the operations comprise: controlling one or more cameras in the first container to capture the one or more images of the fish while illuminated by one or more lights.
16. The medium of claim 9, wherein the operations comprise: generating a suction at one end of the pipe connected to the first container that moves the fish into the first container.
17. A system, comprising:
one or more processors; and
machine-readable media interoperably coupled with the one or more processors and storing one or more instructions that, when executed by the one or more processors, perform operations comprising:
obtaining one or more images of a fish in a first container connected to a second container by a pipe configured to allow the fish to move from the first container to the second container;
obtaining sensor data from the second container representing features of the fish;
generating a ground truth data item that includes (i) the one or more images of the fish and (ii) the sensor data from the second container;
providing a first portion of the ground truth data item to a machine learning model; and
adjusting the machine learning model using a comparison of (i) output from the machine learning model processing the first portion of the ground truth data item and (ii) a second portion of the ground truth data item.
18. The system of claim 17, wherein the second container includes one or more sensors for determining physical features or health conditions of fish.
19. The system of claim 18, wherein the physical features include one or more of fish dimensions or fish biomass.
20. The system of claim 17, wherein the second container includes a plurality of light sensors and a plurality of light emitters as a light curtain for measuring fish.