WO2024069898A1 - Action determination device, action determination method, and action determination program - Google Patents

Action determination device, action determination method, and action determination program Download PDF

Info

Publication number
WO2024069898A1
Authority
WO
WIPO (PCT)
Prior art keywords
behavior
target organism
feeding
data
target
Prior art date
Application number
PCT/JP2022/036590
Other languages
French (fr)
Japanese (ja)
Inventor
葉子 田内
亮史 服部
孝之 小平
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2022/036590 priority Critical patent/WO2024069898A1/en
Publication of WO2024069898A1 publication Critical patent/WO2024069898A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 29/00 Other apparatus for animal husbandry
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 61/00 Culture of aquatic animals
    • A01K 61/80 Feeding devices

Definitions

  • This disclosure relates to a behavior determination device, a behavior determination method, and a behavior determination program.
  • Patent Document 1 discloses a technology for automating a land-based aquaculture system with the aim of reducing the labor required for such manual tasks.
  • The technology disclosed in Patent Document 1 determines, from a captured image, the three-dimensional position of each aquatic organism in an aquarium, such as fish, together with individual characteristic data such as its size and body shape, tracks each organism based on the determined position and characteristic data, and monitors the growth state of each organism based on the tracking results.
  • The technology is further characterized by having a means for controlling the feeding amount and growth environment conditions, using statistically managed data, so as to achieve an optimal growth state.
  • Patent Document 1 does not take into consideration the timing of feeding. As a result, this technology has the problem that feeding is not always carried out at an appropriate time, and therefore efficient feeding may not be possible.
  • the present disclosure aims to provide data that enables an automatic feeding device to feed cultivated aquatic organisms at appropriate times.
  • the behavior determination device comprises: a behavior determination unit that determines the behavior of a target organism using behavior features that are calculated based on time transitions of at least one of the positions and bending of each joint of the target organism, the behavior features corresponding to the behavior of the target organism, and a reference behavior model that is a model that indicates a classification of the behavior of the target organism according to the behavior features;
  • the device includes a growth condition control unit that generates data for determining whether or not the device that feeds the target organism should feed the target organism based on the determined behavior of the target organism.
  • the behavior determination unit determines the behavior of the target organism in accordance with the behavioral features, and the growth condition control unit generates data in accordance with the behavior of the target organism for a device that feeds the target organism to decide whether or not to feed the target organism.
  • the device corresponds to an automatic feeding device, and the automatic feeding device can feed the target organism at an appropriate time by deciding whether or not to feed the target organism based on the data.
  • data can be provided that enables the automatic feeding device to feed the cultivated aquatic organism at an appropriate time.
  • FIG. 1 is a diagram showing an example of the configuration of a behavior determination system 9 according to a first embodiment.
  • FIG. 2 is a diagram for explaining the processing of a skeleton extraction unit 23 according to the first embodiment.
  • FIG. 3 is a diagram for explaining the processing of a behavior determination unit 27 according to the first embodiment, in which (a) shows a specific example when the target organism is a shrimp, and (b) shows a specific example when the target organism is a catfish.
  • FIG. 4 is a diagram for explaining a feeding behavior model according to the first embodiment.
  • FIG. 5 is a diagram for explaining a non-feeding behavior model according to the first embodiment.
  • FIG. 6 is a diagram for explaining a normal behavior model according to the first embodiment.
  • FIG. 7 is a diagram for explaining an abnormal behavior model according to the first embodiment.
  • FIG. 8 is a diagram showing an example of the hardware configuration of a behavior determination device 10 according to the first embodiment.
  • FIG. 9 is a flowchart showing the operation of the behavior determination device 10 according to the first embodiment.
  • FIG. 10 is a diagram showing an example of the hardware configuration of a behavior determination device 10 according to a modification of the first embodiment.
  • FIG. 11 is a diagram showing a configuration example of a behavior determination system 9 according to a second embodiment.
  • FIG. 12 is a diagram for explaining the processing of a learning unit 301 according to the second embodiment, in which (a) shows an example of creating a feeding behavior model, (b) shows an example of creating a non-feeding behavior model, and (c) shows an example of creating an abnormal behavior model.
  • FIG. 13 is a flowchart showing the operation of a remaining food amount acquisition unit 200 and the learning unit 301 according to the second embodiment.
  • FIG. 14 is a flowchart showing the operation of a corpse number acquisition unit 400 and the learning unit 301 according to the second embodiment.
  • the behavior determination system 9 includes a camera 1, a behavior determination device 10, an automatic feeding device 60, and a growth environment control device 70.
  • the behavior determination device 10 has a function of determining the behavior of an aquatic organism.
  • the aquatic organism typically refers to an aquatic organism cultivated in an aquarium.
  • the behavior determination device 10 includes an image acquisition unit 21, a target organism detection unit 22, a skeleton extraction unit 23, a target organism tracking unit 24, a motion feature amount calculation unit 25, a bending feature amount calculation unit 26, a behavior determination unit 27, and a growth condition control unit 28, and stores a reference behavior model 31.
  • the behavior determination device 10 is connected to a camera 1, an automatic feeding device 60, and a growth environment control device 70.
  • the camera 1 is a device for photographing aquatic organisms, and there may be a plurality of cameras.
  • the camera 1 outputs video data representing images photographed by the camera 1.
  • the video data is data representing a plurality of frames, and as a specific example, is data of a moving image.
  • the automatic feeding device 60 is a device that feeds target organisms, and is a device that automatically feeds aquatic organisms.
  • the target organisms are aquatic organisms having joints, and specific examples thereof include fish and crustaceans.
  • the growth environment control device 70 is a device that controls the growth environment of aquatic organisms.
  • the video acquisition unit 21 acquires the video data output by the camera 1.
  • the target organism detection unit 22 detects a target organism in each frame indicated by the video data acquired by the video acquisition unit 21.
  • the target organism detection unit 22 may detect the location of the target organism in each frame by using a trained model based on AI (Artificial Intelligence) technology.
  • the trained model is a model that receives data indicating a frame as input and outputs the result of detecting the target organism.
  • When there are multiple frames in which a target organism is detected, the target organism detection unit 22 generates, as detection information, information indicating the target organism detected in each frame in which the target organism is detected.
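  • As an illustration only (the disclosure does not specify a particular detector or programming interface), a minimal sketch of this per-frame detection step might look as follows; the detector callable, the confidence threshold, and all field names are assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Detection:
    frame_index: int                  # index of the frame in the video data
    box: Tuple[int, int, int, int]    # (x, y, width, height) of the detected organism
    score: float                      # detector confidence

def detect_target_organisms(frames, detector, score_threshold: float = 0.5) -> Dict[int, List[Detection]]:
    """Run a trained per-frame detector and keep confident detections.

    `detector(frame)` is assumed to return a list of (box, score) pairs;
    the threshold value is a placeholder, not a value from the patent.
    """
    detections_per_frame: Dict[int, List[Detection]] = {}
    for i, frame in enumerate(frames):
        hits = [Detection(i, box, score)
                for box, score in detector(frame)
                if score >= score_threshold]
        if hits:  # keep only frames in which a target organism was detected
            detections_per_frame[i] = hits
    return detections_per_frame
```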
  • the skeleton extraction unit 23 extracts the skeleton of the target organism indicated by the detection information generated by the target organism detection unit 22, and generates skeleton information indicating the extracted skeleton. At this time, typically, the skeleton extraction unit 23 extracts each joint of the target organism in each frame indicated by the video data, and generates information indicating each extracted joint as skeleton information. At this time, the video data indicates a plurality of frames in which the target organism is shown.
  • the skeleton information may be information indicating the position and bending of each joint, or may be information indicating parts such as the eyes, each fin, the tail, and the legs.
  • the skeleton extraction unit 23 may change the skeleton to be extracted depending on the type of the target organism.
  • Fig. 2 is a diagram for explaining the processing of the skeleton extraction unit 23.
  • When the target organism is a fish, as a specific example, the skeleton extraction unit 23 extracts, as the skeleton of the target organism, the left eye, the right eye, the left pectoral fin, the right pectoral fin, the pelvic fin, the center of the dorsal fin, the anal fin, the base of the caudal fin, and the tip of the caudal fin, as shown in Fig. 2.
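  • A minimal sketch of how such skeleton information could be represented for one frame follows; the container types, the (x, y) pixel-coordinate convention, and the snake_case part names are assumptions, while the nine parts follow the example above.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# The nine fish body parts named above; (x, y) pixel coordinates are assumed.
FISH_KEYPOINTS = (
    "left_eye", "right_eye", "left_pectoral_fin", "right_pectoral_fin",
    "pelvic_fin", "dorsal_fin_center", "anal_fin",
    "caudal_fin_base", "caudal_fin_tip",
)

@dataclass
class SkeletonInfo:
    frame_index: int
    keypoints: Dict[str, Tuple[float, float]]   # part name -> (x, y) in the frame

def make_skeleton_info(frame_index: int, coords) -> SkeletonInfo:
    """Pack extracted keypoint coordinates (one (x, y) pair per part, in the
    order of FISH_KEYPOINTS) into skeleton information for one frame."""
    return SkeletonInfo(frame_index, dict(zip(FISH_KEYPOINTS, coords)))
```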
  • the target organism tracking unit 24 tracks the target organism based on the skeletal information, and generates information indicating the tracking results as tracking information. Specifically, the target organism tracking unit 24 tracks the target organism based on multiple frames indicated by the video data acquired by the video acquisition unit 21 and the skeletal information generated by the skeletal extraction unit 23, and generates tracking information indicating the tracking results. At this time, the target organism tracking unit 24 sets, as a specific example, the reference point for tracking to a certain point within the head of the target organism. As a specific example, the certain point is a point corresponding to the mouth or eyes, etc.
  • the tracking information indicates, as a specific example, the distance and time traveled by the target organism.
  • the movement characteristic amount calculation unit 25 calculates the movement characteristic amount of the target organism based on the tracking information generated by the target organism tracking unit 24.
  • the movement characteristic amount is a characteristic amount related to the movement of the target organism, and corresponds to at least one of the movement, position, and posture of the target organism.
  • the movement characteristic amount is composed of at least one of a value indicating the position of the target organism, a value indicating the posture of the target organism, a value indicating the swimming distance of the target organism, a value indicating the swimming speed of the target organism, and a value indicating the acceleration of the target organism while swimming.
  • the movement characteristic amount calculation unit 25 may change the characteristic amount calculated depending on the items of the reference behavior model 31.
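  • The following is a minimal sketch of how some of these movement feature amounts could be computed from a tracked reference point; the frame rate, the pixel units, and this particular subset of features are assumptions.

```python
import math
from typing import List, Tuple

def movement_features(track: List[Tuple[float, float]], fps: float = 30.0) -> dict:
    """Compute simple movement feature amounts from a tracked reference point.

    `track` holds the (x, y) position of one target organism in consecutive
    frames (e.g. a point on its head); `fps` is the camera frame rate.
    """
    if len(track) < 2:
        return {"swim_distance": 0.0, "swim_speed": 0.0, "swim_acceleration": 0.0}

    dt = 1.0 / fps
    # Per-frame displacements, speeds, and accelerations.
    steps = [math.dist(track[i], track[i + 1]) for i in range(len(track) - 1)]
    speeds = [s / dt for s in steps]
    accels = [(speeds[i + 1] - speeds[i]) / dt for i in range(len(speeds) - 1)]

    return {
        "swim_distance": sum(steps),                          # total swimming distance
        "swim_speed": sum(speeds) / len(speeds),              # mean swimming speed
        "swim_acceleration": sum(accels) / len(accels) if accels else 0.0,
    }
```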
  • the bending feature amount calculation unit 26 calculates the bending feature amount of the target organism based on the skeleton information generated by the skeleton extraction unit 23.
  • the bending feature amount calculation unit 26 may use the tracking information generated by the target organism tracking unit 24 when calculating the bending feature amount.
  • the bending feature amount is a feature amount related to the bending of the joints of the target organism, and is a feature amount corresponding to at least one of the bending of the joints of the target organism and the joints of the target organism that have moved with the bending of the target joint.
  • the bending feature amount is composed of at least one of a value indicating each of the bending direction and degree of each of at least some of the joints of the target organism, a value indicating each of the angular velocity and angular acceleration at the time of bending of each of the bent joints of the joints of the target organism, and a value indicating each of the movement amount, movement speed, and movement direction of each of the joints that have moved with the bending of the joints of the target organism.
  • the bending feature amount calculation unit 26 may change the feature amount calculated according to the item of the reference behavior model 31.
  • The bending feature amount calculation unit 26 calculates the movement trajectory of each joint of the target organism, and calculates the movement distance and movement direction of each joint based on the calculated movement trajectory. Note that the nature of the movement trajectory calculated by the bending feature amount calculation unit 26 basically differs depending on the type, age, and the like of the target organism.
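  • A minimal sketch of computing one such bending feature amount, assuming two-dimensional keypoint coordinates and a known frame rate (both assumptions), is shown below.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def joint_angle(prev_pt: Point, joint: Point, next_pt: Point) -> float:
    """Angle in degrees formed at `joint` by the segments to its two
    neighbouring keypoints; 180 degrees means the body is straight there."""
    ax, ay = prev_pt[0] - joint[0], prev_pt[1] - joint[1]
    bx, by = next_pt[0] - joint[0], next_pt[1] - joint[1]
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    if norm == 0.0:
        return 180.0
    cos_angle = max(-1.0, min(1.0, (ax * bx + ay * by) / norm))
    return math.degrees(math.acos(cos_angle))

def bending_features(angles_over_time: List[float], fps: float = 30.0) -> dict:
    """Bending feature amounts for one joint: current degree of bending and
    angular velocity, an illustrative subset of the quantities listed above."""
    if not angles_over_time:
        return {"bend_degrees": 0.0, "angular_velocity": 0.0}
    dt = 1.0 / fps
    velocities = [(angles_over_time[i + 1] - angles_over_time[i]) / dt
                  for i in range(len(angles_over_time) - 1)]
    return {
        "bend_degrees": 180.0 - angles_over_time[-1],   # 0 when the joint is straight
        "angular_velocity": velocities[-1] if velocities else 0.0,
    }
```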
  • the behavior determination unit 27 determines the behavior of the target organism using the behavior feature amount and the reference behavior model 31.
  • the behavior feature amount is a feature amount calculated based on the time progression of at least one of the position and bending of each joint of the target organism, and corresponds to the behavior of the target organism.
  • the behavior feature amount is composed of at least one of a movement feature amount and a bending feature amount.
  • The behavior determination unit 27 determines what type of behavior the target organism is exhibiting by applying the calculated movement feature amount and bending feature amount to the reference behavior model 31.
  • As a specific example, the behavior determination unit 27 determines that the behavior of the target organism is feeding behavior when the swimming speed of the target organism is faster than a predetermined speed, the orientation of the target organism's head is within a predetermined angle from the vertically upward direction, and the target organism wags its tail at intervals shorter than a predetermined interval. Feeding behavior is the behavior the target organism exhibits when it desires food. As another specific example, the behavior determination unit 27 determines that the behavior of the target organism is abnormal when the swimming speed of the target organism is slower than a predetermined speed and the abdomen of the target organism is vertically above its dorsal fin. Abnormal behavior is behavior that is not normal; specific examples include the behavior of the target organism when it is in a weakened state, under high stress, or ill. Normal behavior is the behavior of the target organism when it is in good health.
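  • As an illustration of how the specific example above could be expressed in code (the feature names and every threshold value are placeholders chosen for the sketch, except that 15 cm/s also appears later in the text as a feeding behavior index):

```python
def classify_behavior(features: dict,
                      feeding_speed_cm_s: float = 15.0,   # threshold for "fast" swimming
                      head_up_angle_deg: float = 30.0,    # allowed deviation from vertical
                      tail_wag_interval_s: float = 0.5,   # "short" tail-wag interval
                      slow_speed_cm_s: float = 2.0) -> str:
    """Threshold rules mirroring the specific example in the text above.

    `features` is assumed to carry 'swim_speed', 'head_angle_from_vertical',
    'tail_wag_interval', and 'belly_above_dorsal_fin'.
    """
    if (features["swim_speed"] > feeding_speed_cm_s
            and features["head_angle_from_vertical"] <= head_up_angle_deg
            and features["tail_wag_interval"] < tail_wag_interval_s):
        return "feeding"
    if features["swim_speed"] < slow_speed_cm_s and features["belly_above_dorsal_fin"]:
        return "abnormal"
    return "normal"
```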
  • FIG. 3 is a diagram for explaining the process of the behavior determination unit 27.
  • FIG. 3(a) shows a specific example in which the target organism is a shrimp.
  • the behavior determination unit 27 determines the shrimp's behavior as a feeding behavior.
  • the behavior determination unit 27 determines the shrimp's behavior as a non-feeding behavior.
  • Non-feeding behavior is behavior that is not a feeding behavior.
  • FIG. 3(b) shows a specific example in which the target organism is a catfish. When the catfish frequently moves its tail, the behavior determination unit 27 determines that the behavior of the catfish is normal. On the other hand, when the catfish hardly moves its tail, the behavior determination unit 27 determines that the behavior of the catfish is abnormal.
  • the growth condition control unit 28 generates data for the device that feeds the target organism to determine whether or not to feed the target organism, according to the behavior of the target organism determined by the behavior determination unit 27.
  • the data may be data indicating whether or not to feed the target organism, data indicating the amount of feeding, data indicating whether or not there is a target organism that is performing a behavior determined to be a feeding behavior, data indicating each of the total number of target organisms performing a behavior determined to be a feeding behavior and the total number of target organisms in the aquarium, or data indicating the processing results of each unit included in the behavior determination device 10.
  • the automatic feeding device 60 determines not to feed the target organism when the amount of feeding is 0, and determines to feed the target organism when the amount of feeding is not 0.
  • the growth condition control unit 28 may generate data indicating the amount of adjustment of each of the automatic feeding device 60 and the growth environment control device 70 based on the determination result of the behavior determination unit 27, and output the generated data.
  • the growth condition control unit 28 calculates at least one of the feeding timing and feeding amount for the target organism according to the determined behavior of the target organism, and also generates data used to adjust the growth environment of the target organism according to the determined behavior of the target organism.
  • the reference behavior model 31 is a model that indicates a classification of the behavior of a target organism according to behavioral features, and is used to classify the behavior of aquatic organisms.
  • The reference behavior model 31 may be a model expressed in a table format, or may be a classifier based on machine learning. FIGS. 4 to 7 show specific examples of the reference behavior model 31.
  • the reference behavior model 31 will be described below with reference to Figs. 4 to 7. Note that the feeding behavior model, the non-feeding behavior model, the normal behavior model, and the abnormal behavior model are each a specific example of a model included in the reference behavior model 31.
  • FIG. 4 shows a specific example of a feeding behavior model.
  • the feeding behavior model is a model used to determine whether the behavior of a target organism is feeding behavior, and as a specific example, is a model that shows indices for classifying the behavior of the target organism as feeding behavior according to behavioral features.
  • the behavior determination unit 27 can determine whether the behavior of the target organism is feeding behavior based on the swimming speed, head angle, swimming depth, and caudal fin angle change of the target organism.
  • the swimming speed, head angle, and swimming depth correspond to the movement features
  • the caudal fin angle change corresponds to the bending feature.
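  • One possible way to hold and apply such a table-format feeding behavior model is sketched below; the index names follow the text, but every numeric value is a placeholder, since the contents of FIG. 4 are not reproduced here.

```python
# Placeholder table-format feeding behavior model; the four indices follow the
# text (swimming speed, head angle, swimming depth, caudal fin angle change),
# but the numeric values are assumptions, not values from FIG. 4.
FEEDING_BEHAVIOR_MODEL = {
    "swim_speed_min_cm_s": 15.0,               # movement feature
    "head_angle_max_deg": 30.0,                # movement feature
    "swim_depth_max_cm": 20.0,                 # movement feature
    "caudal_fin_angle_change_min_deg": 10.0,   # bending feature
}

def matches_feeding_model(features: dict, model: dict = FEEDING_BEHAVIOR_MODEL) -> bool:
    """Return True when the behavior features satisfy every index of the model."""
    return (features["swim_speed"] >= model["swim_speed_min_cm_s"]
            and features["head_angle_from_vertical"] <= model["head_angle_max_deg"]
            and features["swim_depth"] <= model["swim_depth_max_cm"]
            and features["caudal_fin_angle_change"] >= model["caudal_fin_angle_change_min_deg"])
```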
  • FIG. 5 shows a specific example of a non-feeding behavior model.
  • the non-feeding behavior model is a model used to determine whether the behavior of a target organism is non-feeding behavior, and as a specific example, is a model that shows indices for classifying the behavior of a target organism as non-feeding behavior according to behavioral features.
  • the behavior determination unit 27 can determine whether the behavior of a target organism is non-feeding behavior based on the target organism's swimming speed, head angle, swimming depth, and caudal fin angle change.
  • Non-feeding behavior is behavior that is not feeding behavior, and as a specific example, is behavior that a target organism takes when full.
  • FIG. 6 shows a specific example of a normal behavior model.
  • the normal behavior model is a model used to determine whether the behavior of a target organism is normal or not, and as a specific example, is a model that shows an index for classifying the behavior of the target organism as normal behavior according to the behavior feature.
  • the behavior determination unit 27 can determine whether the behavior of the target organism is normal or not based on the swimming speed, pectoral fin position, and number of caudal fin angle changes of the target organism.
  • the pectoral fin position corresponds to the movement feature
  • the number of caudal fin angle changes corresponds to the bending feature.
  • FIG. 7 shows a specific example of an abnormal behavior model.
  • the abnormal behavior model is a model used to determine whether the behavior of a target organism is abnormal or not, and as a specific example, is a model that shows an index for classifying the behavior of the target organism as abnormal depending on the behavioral features.
  • the behavior determination unit 27 can determine whether the behavior of the target organism is abnormal or not based on the swimming speed, pectoral fin position, and number of changes in caudal fin angle of the target organism.
  • FIG. 8 shows an example of the hardware configuration of the behavior determination device 10 according to this embodiment.
  • the behavior determination device 10 is made up of a computer, and includes hardware such as a processor 20, a storage device 30, a communication device 40, and an input/output interface 50.
  • the behavior determination device 10 may be made up of multiple computers.
  • the processor 20 is an integrated circuit (IC) that performs arithmetic processing and controls the hardware of the computer.
  • Specific examples of the processor 20 include a central processing unit (CPU), a digital signal processor (DSP), and a graphics processing unit (GPU).
  • the behavior determination device 10 may include a plurality of processors that replace the processor 20. The plurality of processors share the role of the processor 20.
  • the storage device 30 may be a volatile storage device, a non-volatile storage device, or a combination of these.
  • a specific example of a volatile storage device is RAM (Random Access Memory).
  • a specific example of a non-volatile storage device is ROM (Read Only Memory), HDD (Hard Disk Drive), or flash memory.
  • the communication device 40 is a receiver and a transmitter.
  • a specific example of the communication device 40 is a communication chip or a NIC (Network Interface Card).
  • the input/output interface 50 is a port to which an input device and an output device are connected.
  • a specific example of the input/output interface 50 is a USB (Universal Serial Bus) terminal.
  • a specific example of the input device is a keyboard and a mouse.
  • a specific example of the output device is a display.
  • Each part of the behavior determination device 10 may use the input/output interface 50 and the communication device 40 as appropriate when communicating with other devices.
  • the communication device 40 and the input/output interface 50 acquire signals output by the camera 1, etc.
  • the storage device 30 stores a reference behavior model 31 and a behavior determination program.
  • the behavior determination program is a program that causes a computer to realize the functions of each part of the behavior determination device 10.
  • the processor 20 reads out and executes the behavior determination program stored in the storage device 30, thereby operating as each part of the behavior determination device 10.
  • the functions of each part of the behavior determination device 10 are realized by software.
  • the data used when executing the behavior determination program and the data obtained by executing the behavior determination program are appropriately stored in the storage device 30.
  • Each part of the behavior determination device 10 uses the storage device 30 as appropriate.
  • The terms “data” and “information” may have the same meaning.
  • the storage device 30 may be independent of the computer.
  • the behavior determination program may be recorded on a computer-readable non-volatile recording medium.
  • Specific examples of the non-volatile recording medium include an optical disk or a flash memory.
  • the behavior determination program may be provided as a program product.
  • the operation procedure of the behavior determination device 10 corresponds to a behavior determination method, and the program that realizes the operation of the behavior determination device 10 corresponds to a behavior determination program.
  • FIG. 9 is a flowchart showing an example of the operation of the behavior determination device 10. The operation of the behavior determination device 10 will be explained with reference to FIG. 9.
  • Step S1 The video acquisition unit 21 acquires, as target video data, the video data output by the camera 1.
  • the video acquisition unit 21 may acquire, as target video data, only data within a certain time range from the video data output by the camera 1.
  • the target organism detection unit 22 detects a target organism from each frame represented by the target video data.
  • the target organism detection unit 22 may use only a portion of the frames represented by the target video data.
  • Step S3 If a target organism is detected from the frames represented by the target video data, the behavior determination device 10 proceeds to step S4; otherwise, the behavior determination device 10 returns to step S1.
  • Step S4 The behavior determination device 10 selects, as a selected target organism, one of the one or more target organisms that has not yet been selected in the iterative processing from step S4 to step S10. Note that the selected target organism is assumed to have been detected in multiple frames.
  • Step S5 The skeleton extraction unit 23 extracts the skeleton of the selected target organism in each frame in which the selected target organism is detected among the frames indicated by the target video data, and generates information indicating the extracted skeleton as target skeleton information.
  • Step S6 The target organism tracking unit 24 tracks the selected target organism based on the target skeletal information, and generates information indicating the tracking results as target tracking information.
  • Step S7 The motion characteristic amount calculation unit 25 calculates the motion characteristic amount of the selected target organism based on the target tracking information.
  • the motion characteristic amount calculation unit 25 calculates the swimming distance and swimming time of the target organism based on the target tracking information, and sets the calculated results as the motion characteristic amount.
  • The bending feature amount calculation unit 26 may calculate the bending feature amount of the selected target organism based on the target skeletal information.
  • Step S8 The behavior determination unit 27 compares the feature amount calculated in step S7 with a reference behavior model 31 prepared in advance.
  • Step S9 The behavior determination unit 27 determines the behavior of the selected target organism based on the result of the comparison in step S8.
  • Step S10 The behavior determination device 10 repeatedly executes steps S5 to S9 the number of times corresponding to the number of target organisms detected from the target video data.
  • Step S11 The growth condition control unit 28 generates data indicating the amount of adjustment of each of the automatic feeding device 60 and the growth environment control device 70 based on the judgment result of the behavior judgment unit 27 in step S9, and outputs the generated data to each of the automatic feeding device 60 and the growth environment control device 70.
  • When the data output in step S11 indicates that feeding should be performed, the automatic feeding device 60 determines that it is feeding time and executes feeding. At this time, the automatic feeding device 60 may change the amount of feeding depending on the proportion of target organisms exhibiting the behavior determined to be a feeding behavior in step S9.
  • As a specific example, the automatic feeding device 60 holds, in a database (not shown), a reference feeding amount, which is a reference amount of feed per target organism. The automatic feeding device 60 obtains the number of target organisms exhibiting the behavior determined to be a feeding behavior in step S9 and the number of target organisms detected in step S3, and calculates the proportion of target organisms exhibiting feeding behavior by dividing the former by the latter. When the calculated proportion is 80% or more, the automatic feeding device 60 sets the feeding amount to 1.2 times the reference feeding amount.
  • the automatic feeding device 60 has a feeding amount index that indicates a higher value than the value indicated by the feeding behavior index, and when the feature value corresponding to the target organism exhibiting behavior determined to be feeding behavior is equal to or greater than the value indicated by the feeding amount index, the feeding amount is set to 1.2 times the reference feeding amount.
  • the feeding behavior index indicates a swimming speed of 15 cm/s
  • the feeding amount index indicates a swimming speed of 25 cm/s.
  • the feeding behavior index is an index used to determine whether the behavior of the target organism is feeding behavior.
  • the feeding amount index is an index used to determine the feeding amount.
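  • The feeding-amount rules described above can be sketched as follows; the reference feeding amount of 100 g is a placeholder, while the 80% proportion, the 1.2 factor, and the 15 cm/s and 25 cm/s indices come from the text.

```python
def decide_feeding_amount(num_feeding: int,
                          num_detected: int,
                          reference_amount_g: float = 100.0) -> float:
    """When at least 80% of the detected target organisms exhibit feeding
    behavior, feed 1.2 times the reference feeding amount; otherwise feed the
    reference amount (the reference value itself is a placeholder)."""
    if num_detected == 0:
        return 0.0
    proportion = num_feeding / num_detected
    return reference_amount_g * (1.2 if proportion >= 0.8 else 1.0)

def boost_by_speed(swim_speed_cm_s: float,
                   reference_amount_g: float = 100.0) -> float:
    """Alternative rule: the feeding amount index (25 cm/s) is set above the
    feeding behavior index (15 cm/s); reaching it triggers the 1.2x amount."""
    return reference_amount_g * (1.2 if swim_speed_cm_s >= 25.0 else 1.0)
```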
  • As a specific example, the growth environment control device 70 first sets the water temperature or the oxygen supply amount in the aquarium to a high value when there is at least one target organism whose behavior is determined to be abnormal in step S9. After a certain time has elapsed, the behavior determination device 10 executes the behavior determination flow shown in FIG. 9 again, and when there is no longer any target organism whose behavior is determined to be abnormal in step S9, the growth environment control device 70 sets the water temperature or the oxygen supply amount in the aquarium to a low value.
  • The reason for first setting the water temperature or the oxygen supply amount in the aquarium to a high value is that, in general, the higher the water temperature and the higher the oxygen concentration, the more actively fish move.
  • Alternatively, the growth environment control device 70 may first set the water temperature or the oxygen supply amount in the aquarium to a low value, and then set it to a high value.
  • By determining the type of behavior of the target organism based on its swimming behavior, leg movements, and the like, the behavior of the target organism can be understood in more detail than by merely tracking the target organism.
  • the skeleton of the target organism is extracted according to the type of the target organism, and the behavior of the target organism is determined based on the behavior of the target organism tracked based on the extracted skeleton and a model for classifying the behavior of the target organism, thereby making it possible to understand the behavior according to the ecology of the target organism.
  • efficient feeding can be performed based on the results of understanding the behavior of the target organism.
  • Because feeding can be performed at an appropriate timing, it is possible to prevent the water quality from deteriorating due to feeding the target organism when it is full, to avoid failing to feed the target organism when it is hungry, and to adjust the amount of feeding according to the health condition of the target organism.
  • FIG. 10 shows an example of the hardware configuration of the behavior determining device 10 according to this modified example.
  • the behavior determination device 10 includes a processing circuit 19 instead of the processor 20 or instead of the processor 20 and the storage device 30 .
  • the processing circuitry 19 is hardware that realizes at least a portion of each unit of the behavior determination device 10 .
  • the processing circuitry 19 may be dedicated hardware, or may be a processor that executes a program stored in a storage device 30 .
  • When the processing circuitry 19 is dedicated hardware, the processing circuitry 19 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • the behavior determination device 10 may include a plurality of processing circuits that replace the processing circuit 19. The plurality of processing circuits share the role of the processing circuit 19.
  • the behavior determination device 10 some functions may be realized by dedicated hardware, and the remaining functions may be realized by software or firmware.
  • the processing circuitry 19 is illustratively implemented in hardware, software, firmware, or a combination thereof.
  • the processor 20, the storage device 30, and the processing circuit 19 are collectively referred to as the “processing circuitry.”
  • the functions of the functional components of the behavior determination device 10 are realized by the processing circuitry.
  • the behavior determining device 10 according to other embodiments may also have a similar configuration to this modified example.
  • Embodiment 2 The following mainly describes the differences from the above-described embodiment with reference to the drawings.
  • Fig. 11 shows an example of the configuration of a behavior determination system 9 according to this embodiment.
  • the behavior determination device 10 according to this embodiment further includes a remaining food amount acquisition unit 200, a learning unit 301, and a corpse number acquisition unit 400.
  • the remaining food amount acquisition section 200 comprises a video acquisition section 201 , a remaining food detection section 202 , and a remaining food amount calculation section 203 .
  • the corpse number acquisition unit 400 includes an image acquisition unit 401 , a corpse detection unit 402 , and a corpse number calculation unit 403 .
  • Video acquisition unit 201 is similar to video acquisition unit 21.
  • The remaining food detection unit 202 detects remaining food in the tank after feeding from each frame indicated by the video data acquired by the video acquisition unit 201. Here, "after feeding" refers to the time after the target organism has eaten the provided food; as a specific example, it is after the time normally required for the target organism to finish eating the food has elapsed from the time of feeding.
  • the remaining food detection unit 202 may determine remaining food using a background subtraction method, or may determine where remaining food is present in each frame using a trained model based on AI technology.
  • the trained model is a model that takes data showing a frame as input and outputs the result of detecting remaining food.
  • the remaining food amount calculation unit 203 calculates the amount of food detected by the remaining food detection unit 202.
  • Video acquisition unit 401 is similar to video acquisition unit 21.
  • the corpse detection unit 402 detects corpses in the tank from each frame of the video data acquired by the video acquisition unit 401.
  • the corpse detection unit 402 may determine the presence of a corpse using motion features and a background subtraction method, or may determine where in each frame the corpse is located using a trained model based on AI technology.
  • the trained model is a model that receives data indicating a frame as input and outputs the result of detecting a corpse.
  • The corpse number calculation unit 403 calculates the number of corpses detected by the corpse detection unit 402.
  • The corpse number calculation unit 403 calculates the number of corpses based on the total number of pixels estimated to show corpses when corpses are detected using the background subtraction method, and calculates the number of corpses based on the number of detection frames when corpses are detected using AI technology.
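  • A minimal sketch of the two counting strategies mentioned above follows; the average pixel area per corpse is an assumption that would have to be calibrated for the species and camera setup.

```python
def count_corpses_from_mask(foreground_pixel_count: int,
                            avg_pixels_per_corpse: int = 400) -> int:
    """Background-subtraction path: estimate the corpse count from the total
    number of foreground pixels judged to show corpses."""
    if avg_pixels_per_corpse <= 0:
        return 0
    return foreground_pixel_count // avg_pixels_per_corpse

def count_corpses_from_detections(detection_boxes: list) -> int:
    """AI-detection path: one detection frame (bounding box) per corpse, so the
    count is simply the number of boxes returned by the trained model."""
    return len(detection_boxes)
```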
  • The learning unit 301 generates a model based on the movement feature amount and the bending feature amount, and on data indicating the results of detection by the remaining food detection unit 202 or the corpse detection unit 402.
  • the learning unit 301 uses the non-feeding behavior data and the feeding behavior data to learn a model that classifies the behavior of the target organism into feeding behavior and non-feeding behavior according to the behavior feature of the target organism.
  • the non-feeding behavior data is data when the amount of remaining food is equal to or greater than a reference remaining food amount, data showing the behavior feature of the learning aquatic organism before feeding, and data labeled as data corresponding to non-feeding behavior.
  • The remaining food amount is the amount of food remaining after the eating time has elapsed from the time of feeding.
  • The eating time is a time set as the time required for the learning aquatic organism to finish eating the fed food.
  • the learning aquatic organism is an aquatic organism corresponding to the target organism.
  • the learning aquatic organism may be the target organism itself, or may be an aquatic organism having the same or similar properties as the target organism.
  • the feeding behavior data is data when the amount of remaining food is less than a reference remaining food amount, data showing the behavior feature of the learning aquatic organism before feeding, and data labeled as data corresponding to feeding behavior.
  • The learning unit 301 learns the feeding behavior of the target organism based on the movement features and bending features before feeding and the amount of remaining food after feeding, and may use a model that indicates the movement features and bending features corresponding to feeding behavior as at least a part of the reference behavior model 31.
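  • A minimal sketch of this labeling step, assuming the behavior features are held as dictionaries and the reference remaining food amount is supplied by the caller (both assumptions), is shown below.

```python
from typing import List, Tuple

def label_feeding_samples(pre_feeding_features: List[dict],
                          remaining_food_amount: float,
                          reference_remaining_amount: float) -> List[Tuple[dict, str]]:
    """Label pre-feeding behavior features using the post-feeding remaining food.

    If the remaining food amount is at or above the reference amount, the
    organisms were not hungry, so their pre-feeding behavior is labeled
    'non_feeding'; otherwise it is labeled 'feeding'.
    """
    label = ("non_feeding"
             if remaining_food_amount >= reference_remaining_amount
             else "feeding")
    return [(features, label) for features in pre_feeding_features]
```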
  • the learning unit 301 uses the abnormal behavior data and the normal behavior data to learn a model that classifies the behavior of the target organism into either normal behavior or abnormal behavior according to the behavior feature of the target organism.
  • an aquarium is used to raise a group of aquatic organisms composed of a plurality of aquatic organisms corresponding to the target organism.
  • Each of the plurality of aquatic organisms corresponding to the target organism may be an aquatic organism having the same or similar properties as the target organism.
  • The abnormal behavior data is data obtained when the corpses of an abnormal number or more of the aquatic organisms constituting the aquatic organism group are detected in the aquarium; it indicates the behavior features of each aquatic organism constituting the group before those corpses are detected, and is labeled as data corresponding to abnormal behavior.
  • The normal behavior data is data obtained when the corpses of fewer than an abnormal number of the aquatic organisms constituting the aquatic organism group are detected in the aquarium; it indicates the behavior features of each aquatic organism constituting the group before those corpses are detected, and is labeled as data corresponding to normal behavior.
  • The learning unit 301 learns the abnormal behavior of the target organism based on the movement features and bending features before corpse detection and the number of corpses, and may use a model indicating the movement features and bending features corresponding to abnormal behavior as at least a part of the reference behavior model 31.
  • FIG. 12 is a diagram for explaining the processing of the learning unit 301.
  • FIG. 12(a) shows a specific example of creating a feeding behavior model by the learning unit 301.
  • When the amount of remaining food detected after feeding is less than a certain amount, i.e., when the target organism has eaten the provided food, the learning unit 301 creates a feeding behavior model in which the behavior of the target organism before feeding is classified as feeding behavior.
  • FIG. 12(b) shows a specific example of creating a non-feeding behavior model by the learning unit 301.
  • When the amount of remaining food detected after feeding is greater than a certain amount, i.e., when the target organism has not eaten enough of the provided food, the learning unit 301 creates a non-feeding behavior model in which the behavior of the target organism before feeding is classified as non-feeding behavior.
  • FIG. 12(c) shows a specific example of creating an abnormal behavior model by the learning unit 301.
  • When the number of detected target organism corpses is greater than a certain number, the learning unit 301 creates an abnormal behavior model in which the behavior of the target organism before the detection of more than the certain number of corpses is classified as abnormal behavior.
  • The learning unit 301 may generate a normal behavior model when the number of detected target organism corpses is less than the certain number.
  • *** Operation Description *** FIG. 13 is a flowchart showing an example of the operation of the remaining food amount acquisition unit 200 and the learning unit 301. The operation of the remaining food amount acquisition unit 200 and the learning unit 301 will be described with reference to FIG. 13.
  • Step S21 The video acquisition unit 201 acquires, as target video data, the video data output by the camera 1. This step is the same as step S1. Note that the target video data represents video captured inside an aquarium in which the target organism is kept.
  • Step S22 If the target video data is data after feeding, the behavior determination apparatus 10 proceeds to step S23. Otherwise, the behavior determination apparatus 10 returns to step S21. Note that, in cases where the behavior determination apparatus 10 is aware that the target video data is data taken after feeding, the behavior determination apparatus 10 may skip this step.
  • Step S23 The remaining food detection unit 202 detects remaining food from each frame represented by the target video data.
  • Step S24 If remaining food is detected in step S23, the behavior determination apparatus 10 proceeds to step S25. Otherwise, the behavior determination apparatus 10 proceeds to step S26.
  • Step S25 The remaining food amount calculation unit 203 calculates the amount of the remaining food detected in step S23.
  • Step S26 The learning unit 301 acquires information indicating each of the movement feature amount and the bending feature amount before feeding.
  • Step S27, Step S28, Step S29 If the amount of remaining food calculated in step S25 is equal to or greater than a certain amount, the learning unit 301 labels the information acquired in step S26 as a non-feeding behavior. Otherwise, the learning unit 301 labels the information acquired in step S26 as a feeding behavior.
  • the learning unit 301 may label each piece of information about an individual.
  • the data labeled in steps S28 and S29 corresponds to learning data
  • the data labeled as non-feeding behavior corresponds to non-feeding behavior data
  • the data labeled as feeding behavior corresponds to feeding behavior data.
  • Step S30 If the amount of learning data is sufficient, the behavior determination apparatus 10 proceeds to step S31, otherwise, the behavior determination apparatus 10 returns to step S21.
  • Step S31 The learning unit 301 uses the learning data labeled in steps S28 and S29 to learn the relationship between the behavior of the target organism and the label.
  • Step S32 Based on the result of learning in step S31, the learning unit 301 creates a feeding behavior model as a model to be included in the reference behavior model 31. Note that the result of learning in step S31 may be the feeding behavior model itself.
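  • The disclosure does not name a learning algorithm or library, so the following sketch of steps S31 and S32 uses a scikit-learn decision tree purely as a stand-in; the feature ordering and the assumption that feature values are numeric are choices made for the sketch.

```python
# scikit-learn is assumed to be available; the patent does not specify any
# particular learning algorithm, so a shallow decision tree stands in here.
from sklearn.tree import DecisionTreeClassifier

def learn_feeding_behavior_model(labeled_samples):
    """Fit a classifier from (feature_dict, label) pairs produced in steps
    S28 and S29; feature values are assumed to be numeric. The returned object
    plays the role of a model included in the reference behavior model 31."""
    feature_names = sorted(labeled_samples[0][0].keys())
    X = [[sample[name] for name in feature_names] for sample, _ in labeled_samples]
    y = [label for _, label in labeled_samples]
    model = DecisionTreeClassifier(max_depth=3).fit(X, y)
    return model, feature_names
```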
  • FIG. 14 is a flowchart showing an example of the operation of the corpse number acquisition unit 400 and the learning unit 301. The operation of the corpse number acquisition unit 400 and the learning unit 301 will be described with reference to FIG. 14.
  • Step S41 The video acquisition unit 401 acquires, as target video data, the video data output by the camera 1. This step is similar to step S21.
  • Step S42 The corpse detection unit 402 detects the corpse of a target organism from each frame represented by the target video data.
  • Step S43 If a corpse is detected in step S42, the behavior determination apparatus 10 proceeds to step S44. Otherwise, the behavior determination apparatus 10 proceeds to step S45.
  • Step S44 The corpse number calculation unit 403 calculates the number of corpses detected in step S42.
  • Step S45 The learning unit 301 acquires information indicating each of the movement feature amount and the bending feature amount before the detection of a corpse.
  • the learning unit 301 may acquire only information regarding an individual estimated to be a corpse. Specific examples of an individual estimated to be a corpse include an individual floating on the water surface, or an individual whose surface color has changed.
  • Step S46, Step S47, Step S48 If the number of corpses calculated in step S44 is equal to or greater than a certain number, the learning unit 301 labels the information acquired in step S45 as abnormal behavior. Otherwise, the learning unit 301 labels the information acquired in step S45 as normal behavior.
  • the learning unit 301 may label each piece of information about an individual.
  • the data labeled in steps S47 and S48 corresponds to learning data, the data labeled as abnormal behavior corresponds to abnormal behavior data, and the data labeled as normal behavior corresponds to normal behavior data.
  • Step S49 If the amount of learning data is sufficient, the behavior determining apparatus 10 proceeds to step S50, otherwise, the behavior determining apparatus 10 returns to step S41.
  • Step S50 The learning unit 301 uses the learning data labeled in steps S47 and S48 to learn the relationship between the behavior of the target organism and the label.
  • Step S51 Based on the result of learning in step S50, the learning unit 301 creates an abnormal behavior model as a model to be included in the reference behavior model 31. Note that the result of learning in step S50 may be the abnormal behavior model itself.
  • a model used to classify the behavior of a target organism can be generated based on the behavior of the target organism and the state inside the aquarium.

Abstract

An action determination device (10) is provided with: an action determination unit (27) that determines the action of a subject creature, i.e. an aquatic life, using an action feature and a reference action model (31), said action feature being a feature corresponding to the action of the subject creature and calculated on the basis of the time transition of at least one of the position and the flexion of each joint of the subject creature, said reference action model comprising a model for indicating the action category of the subject creature on the basis of the action feature; and a rearing condition control unit (28) that, according to the determined action of the subject creature, generates data for determining whether to cause a device for feeding the subject creature to feed the subject creature.

Description

Behavior determination device, behavior determination method, and behavior determination program
This disclosure relates to a behavior determination device, a behavior determination method, and a behavior determination program.
In recent years, land-based aquaculture, which can stably produce aquatic organisms, has been attracting attention. In conventional land-based aquaculture, it was necessary for experienced personnel to grasp the health of the aquatic organisms, adjust the water quality, and operate the equipment for feeding, etc., relying on intuition while observing the aquatic organisms.
Patent Document 1 discloses a technology for automating a land-based aquaculture system with the aim of reducing the labor required for these manual tasks. The technology disclosed in Patent Document 1 determines, from a captured image, the three-dimensional position of each aquatic organism in an aquarium, such as fish, together with individual characteristic data such as its size and body shape, tracks each organism based on the determined position and characteristic data, and monitors the growth state of each organism based on the tracking results. Furthermore, the technology is characterized by having a means for controlling the feeding amount and growth environment conditions, using statistically managed data, so as to achieve an optimal growth state.
JP 2003-250382 A
The technology disclosed in Patent Document 1 does not take into consideration the timing of feeding. As a result, this technology has the problem that feeding is not always carried out at an appropriate time, and therefore efficient feeding may not be possible.
The present disclosure aims to provide data that enables an automatic feeding device to feed cultivated aquatic organisms at appropriate times.
The behavior determination device according to the present disclosure comprises:
a behavior determination unit that determines the behavior of a target organism, which is an aquatic organism, using a behavior feature amount, which is a feature amount calculated based on the time transition of at least one of the position and bending of each joint of the target organism and corresponds to the behavior of the target organism, and a reference behavior model, which is a model indicating a classification of the behavior of the target organism according to the behavior feature amount; and
a growth condition control unit that generates, according to the determined behavior of the target organism, data for determining whether or not a device that feeds the target organism should feed the target organism.
According to the present disclosure, the behavior determination unit determines the behavior of the target organism in accordance with the behavioral features, and the growth condition control unit generates data in accordance with the behavior of the target organism for a device that feeds the target organism to decide whether or not to feed the target organism. Here, the device corresponds to an automatic feeding device, and the automatic feeding device can feed the target organism at an appropriate time by deciding whether or not to feed the target organism based on the data. Thus, according to the present disclosure, data can be provided that enables the automatic feeding device to feed the cultivated aquatic organism at an appropriate time.
FIG. 1 is a diagram showing an example of the configuration of a behavior determination system 9 according to a first embodiment.
FIG. 2 is a diagram for explaining the processing of a skeleton extraction unit 23 according to the first embodiment.
FIG. 3 is a diagram for explaining the processing of a behavior determination unit 27 according to the first embodiment, in which (a) shows a specific example when the target organism is a shrimp, and (b) shows a specific example when the target organism is a catfish.
FIG. 4 is a diagram for explaining a feeding behavior model according to the first embodiment.
FIG. 5 is a diagram for explaining a non-feeding behavior model according to the first embodiment.
FIG. 6 is a diagram for explaining a normal behavior model according to the first embodiment.
FIG. 7 is a diagram for explaining an abnormal behavior model according to the first embodiment.
FIG. 8 is a diagram showing an example of the hardware configuration of a behavior determination device 10 according to the first embodiment.
FIG. 9 is a flowchart showing the operation of the behavior determination device 10 according to the first embodiment.
FIG. 10 is a diagram showing an example of the hardware configuration of a behavior determination device 10 according to a modification of the first embodiment.
FIG. 11 is a diagram showing a configuration example of a behavior determination system 9 according to a second embodiment.
FIG. 12 is a diagram for explaining the processing of a learning unit 301 according to the second embodiment, in which (a) shows an example of creating a feeding behavior model, (b) shows an example of creating a non-feeding behavior model, and (c) shows an example of creating an abnormal behavior model.
FIG. 13 is a flowchart showing the operation of a remaining food amount acquisition unit 200 and the learning unit 301 according to the second embodiment.
FIG. 14 is a flowchart showing the operation of a corpse number acquisition unit 400 and the learning unit 301 according to the second embodiment.
In the description of the embodiments and the drawings, the same elements and corresponding elements are given the same reference numerals. Descriptions of elements given the same reference numerals are omitted or simplified as appropriate. Arrows in the drawings primarily indicate data flow or processing flow. In addition, "part" may be interpreted as "circuit," "step," "procedure," "processing," or "circuitry" as appropriate.
Embodiment 1.
Hereinafter, the present embodiment will be described in detail with reference to the drawings.
***構成の説明***
 図1は、本実施の形態に係る行動判定システム9の構成例を示している。行動判定システム9は、図1に示すように、カメラ1と、行動判定装置10と、自動給餌装置60と、生育環境制御装置70とを備える。
 行動判定装置10は水生生物の行動を判定する機能を有する。本願において、水生生物は典型的には水槽を用いて養殖されている水生生物を指す。行動判定装置10は、図1に示すように、映像取得部21と、対象生物検知部22と、骨格抽出部23と、対象生物追跡部24と、運動特徴量算出部25と、屈曲特徴量算出部26と、行動判定部27と、生育条件制御部28とを備え、参照行動モデル31を記憶する。また、行動判定装置10には、カメラ1と、自動給餌装置60と、生育環境制御装置70との各々が接続されている。
 カメラ1は、水生生物を撮影する装置であり、複数台存在してもよい。カメラ1は、カメラ1が撮影した映像を示す映像データを出力する。映像データは、複数のフレームを示すデータであり、具体例として動画のデータである。
 自動給餌装置60は、対象生物に対して給餌を実行する装置に当たり、水生生物に対して自動的に給餌を実行する装置である。対象生物は、関節を有する水生生物であり、具体例として魚類又は甲殻類である。
 生育環境制御装置70は、水生生物の生育環境を制御する装置である。
***Configuration Description***
FIG. 1 shows an example of the configuration of a behavior determination system 9 according to the present embodiment. As shown in FIG. 1, the behavior determination system 9 includes a camera 1, a behavior determination device 10, an automatic feeding device 60, and a growth environment control device 70.
The behavior determination device 10 has a function of determining the behavior of an aquatic organism. In the present application, the aquatic organism typically refers to an aquatic organism cultivated in an aquarium. As shown in FIG. 1, the behavior determination device 10 includes an image acquisition unit 21, a target organism detection unit 22, a skeleton extraction unit 23, a target organism tracking unit 24, a motion feature amount calculation unit 25, a bending feature amount calculation unit 26, a behavior determination unit 27, and a growth condition control unit 28, and stores a reference behavior model 31. In addition, the behavior determination device 10 is connected to a camera 1, an automatic feeding device 60, and a growth environment control device 70.
The camera 1 is a device for photographing aquatic organisms, and there may be a plurality of cameras. The camera 1 outputs video data representing images photographed by the camera 1. The video data is data representing a plurality of frames, and as a specific example, is data of a moving image.
The automatic feeding device 60 is a device that feeds target organisms, and is a device that automatically feeds aquatic organisms. The target organisms are aquatic organisms having joints, and specific examples thereof include fish and crustaceans.
The growth environment control device 70 is a device that controls the growth environment of aquatic organisms.
 映像取得部21は、カメラ1が出力した映像データを取得する。 The video acquisition unit 21 acquires the video data output by the camera 1.
 対象生物検知部22は、映像取得部21が取得した映像データが示す各フレームにおいて対象生物を検知する。対象生物検知部22は、AI(Artificial Intelligence)技術に基づく学習済モデルを用いて各フレームにおいて対象生物が存在する場所を検知してもよい。当該学習済モデルは、具体例として、フレームを示すデータを入力とし、対象生物を検知した結果を出力とするモデルである。
 対象生物検知部22は、対象生物を検知したフレームが複数ある場合、対象生物を検知した各フレームにおいて検知した対象生物を示す情報を検知情報として生成する。
The target organism detection unit 22 detects a target organism in each frame indicated by the video data acquired by the video acquisition unit 21. The target organism detection unit 22 may detect the location of the target organism in each frame by using a trained model based on AI (Artificial Intelligence) technology. As a specific example, the trained model is a model that receives data indicating a frame as input and outputs the result of detecting the target organism.
When there are multiple frames in which a target organism is detected, the target organism detection unit 22 generates, as detection information, information indicating the target organism detected in each frame in which the target organism is detected.
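The per-frame detection step can be pictured with the following minimal Python sketch. It only illustrates the data flow described above: the detector function detect_organisms, the Detection record, and the bounding-box convention are hypothetical names introduced for this example, and any real trained model would take their place.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

import numpy as np

# A bounding box is (x, y, width, height) in pixel coordinates.
BBox = Tuple[int, int, int, int]

@dataclass
class Detection:
    frame_index: int   # index of the video frame the organism was found in
    bbox: BBox         # where in that frame the organism was found

def build_detection_info(frames: List[np.ndarray],
                         detect_organisms: Callable[[np.ndarray], List[BBox]]) -> List[Detection]:
    """Run the (hypothetical) trained detector on every frame and collect one
    Detection record per detected target organism."""
    detections: List[Detection] = []
    for i, frame in enumerate(frames):
        for bbox in detect_organisms(frame):
            detections.append(Detection(frame_index=i, bbox=bbox))
    return detections

# Dummy usage: a detector that "finds" one organism in every frame.
dummy_frames = [np.zeros((8, 8), dtype=np.uint8) for _ in range(3)]
print(build_detection_info(dummy_frames, lambda frame: [(0, 0, 4, 4)]))
```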
 骨格抽出部23は、対象生物検知部22が生成した検知情報が示す対象生物の骨格を抽出し、抽出した骨格を示す骨格情報を生成する。この際、典型的には、骨格抽出部23は、映像データが示す各フレームにおいて対象生物が有する各関節を抽出し、抽出した各関節を示す情報を骨格情報として生成する。このとき、映像データは対象生物が映っている複数のフレームを示すものとする。骨格情報は、各関節の位置及び屈曲を示す情報であってもよく、目と、各鰭と、尾と、脚等の部位を示す情報であってもよい。骨格抽出部23は、対象生物の種類に応じて抽出する骨格を変更してもよい。なお、対象生物が甲殻類である場合において、対象生物追跡部24は、骨格として対象生物の節を抽出する。
 図2は、骨格抽出部23の処理を説明する図である。対象生物が魚類である場合において、具体例として、骨格抽出部23は図2に示すように対象生物の骨格として、左目と、右目と、左胸鰭と、右胸鰭と、腹鰭と、背鰭中心と、臀鰭と、尾鰭の付け根と、尾鰭の先端との各部位を抽出する。
The skeleton extraction unit 23 extracts the skeleton of the target organism indicated by the detection information generated by the target organism detection unit 22, and generates skeleton information indicating the extracted skeleton. Typically, the skeleton extraction unit 23 extracts each joint of the target organism in each frame indicated by the video data, and generates information indicating each extracted joint as the skeleton information. Here, the video data is assumed to indicate a plurality of frames in which the target organism appears. The skeleton information may be information indicating the position and bending of each joint, or may be information indicating parts such as the eyes, each fin, the tail, and the legs. The skeleton extraction unit 23 may change the skeleton to be extracted depending on the type of the target organism. In addition, when the target organism is a crustacean, the skeleton extraction unit 23 extracts the body segments of the target organism as the skeleton.
Fig. 2 is a diagram for explaining the processing of the skeleton extraction unit 23. When the target organism is a fish, as a specific example, the skeleton extraction unit 23 extracts each part of the skeleton of the target organism, that is, the left eye, the right eye, the left pectoral fin, the right pectoral fin, the pelvic fin, the center of the dorsal fin, the anal fin, the base of the caudal fin, and the tip of the caudal fin, as shown in Fig. 2.
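As a purely illustrative sketch, and not the patent's data format, the nine parts listed for FIG. 2 could be carried per frame as a small skeleton-information record like the following; the keypoint names and the dictionary layout are assumptions made only for this example.

```python
# Keypoints extracted for a fish, following the parts named for FIG. 2.
FISH_KEYPOINTS = [
    "left_eye", "right_eye",
    "left_pectoral_fin", "right_pectoral_fin",
    "pelvic_fin", "dorsal_fin_center", "anal_fin",
    "caudal_fin_base", "caudal_fin_tip",
]

def make_skeleton_record(frame_index, keypoint_xy):
    """Pack one frame's extracted keypoints into a skeleton-information record.
    keypoint_xy maps a keypoint name to its (x, y) position in the frame."""
    missing = [k for k in FISH_KEYPOINTS if k not in keypoint_xy]
    if missing:
        raise ValueError(f"missing keypoints: {missing}")
    return {"frame": frame_index,
            "keypoints": {k: tuple(keypoint_xy[k]) for k in FISH_KEYPOINTS}}

# Example usage with dummy coordinates:
record = make_skeleton_record(0, {k: (i * 10.0, 5.0) for i, k in enumerate(FISH_KEYPOINTS)})
print(record["keypoints"]["caudal_fin_tip"])
```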
 対象生物追跡部24は、骨格情報に基づいて対象生物を追跡し、追跡した結果を示す情報を追跡情報として生成する。具体的には、対象生物追跡部24は、映像取得部21が取得した映像データが示す複数のフレームと、骨格抽出部23が生成した骨格情報とに基づいて対象生物を追跡し、追跡した結果を示す追跡情報を生成する。このとき、対象生物追跡部24は、具体例として追跡の基準点を対象生物の頭部内のある点とする。当該ある点は、具体例として口又は目等に対応する点である。追跡情報は、具体例として、対象生物の移動距離及び移動時間を示す。 The target organism tracking unit 24 tracks the target organism based on the skeletal information, and generates information indicating the tracking results as tracking information. Specifically, the target organism tracking unit 24 tracks the target organism based on multiple frames indicated by the video data acquired by the video acquisition unit 21 and the skeletal information generated by the skeletal extraction unit 23, and generates tracking information indicating the tracking results. At this time, the target organism tracking unit 24 sets, as a specific example, the reference point for tracking to a certain point within the head of the target organism. As a specific example, the certain point is a point corresponding to the mouth or eyes, etc. The tracking information indicates, as a specific example, the distance and time traveled by the target organism.
 運動特徴量算出部25は、対象生物追跡部24が生成した追跡情報に基づいて対象生物の運動特徴量を算出する。運動特徴量は、対象生物の運動に関する特徴量であり、対象生物の移動と位置と姿勢との少なくともいずれかに対応する特徴量である。運動特徴量は、具体例として、対象生物の位置を示す値と、対象生物の姿勢を示す値と、対象生物の遊泳距離を示す値と、対象生物の遊泳速度を示す値と、対象生物の遊泳時の加速度を示す値との少なくともいずれかから成る。運動特徴量算出部25は、参照行動モデル31の項目に応じて算出する特徴量を変更してもよい。 The movement characteristic amount calculation unit 25 calculates the movement characteristic amount of the target organism based on the tracking information generated by the target organism tracking unit 24. The movement characteristic amount is a characteristic amount related to the movement of the target organism, and corresponds to at least one of the movement, position, and posture of the target organism. As specific examples, the movement characteristic amount is composed of at least one of a value indicating the position of the target organism, a value indicating the posture of the target organism, a value indicating the swimming distance of the target organism, a value indicating the swimming speed of the target organism, and a value indicating the acceleration of the target organism while swimming. The movement characteristic amount calculation unit 25 may change the characteristic amount calculated depending on the items of the reference behavior model 31.
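A minimal sketch of how such motion features might be derived from a tracked head-point trajectory is shown below, assuming one position sample per frame at a known frame rate; the specific feature set, names, and units are illustrative and not the embodiment's definition.

```python
import numpy as np

def motion_features(head_xy, fps):
    """Compute illustrative motion features (distance, speed, acceleration)
    from the tracked head positions of one organism.
    head_xy: array of shape (n_frames, 2), one head position per frame."""
    head_xy = np.asarray(head_xy, dtype=float)
    step = np.linalg.norm(np.diff(head_xy, axis=0), axis=1)  # per-frame displacement
    dt = 1.0 / fps
    speed = step / dt                 # swimming speed per frame interval
    accel = np.diff(speed) / dt       # change of speed between intervals
    return {
        "total_distance": float(step.sum()),
        "mean_speed": float(speed.mean()),
        "max_speed": float(speed.max()),
        "mean_abs_accel": float(np.abs(accel).mean()) if accel.size else 0.0,
    }

# Example: an organism moving along a straight line, sampled at 30 frames per second.
print(motion_features([(0, 0), (1, 0), (3, 0), (6, 0)], fps=30))
```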
 屈曲特徴量算出部26は、骨格抽出部23が生成した骨格情報に基づいて対象生物の屈曲特徴量を算出する。屈曲特徴量算出部26は、屈曲特徴量を算出する際に対象生物追跡部24が生成した追跡情報を用いてもよい。屈曲特徴量は、対象生物が有する関節の屈曲に関する特徴量であり、対象生物の関節の屈曲と、対象生物の関節のうち対象関節の屈曲に伴って移動した関節との少なくともいずれかに対応する特徴量である。屈曲特徴量は、具体例として、対象生物の少なくとも一部の関節の各関節の屈曲の方向及び程度の各々を示す値と、対象生物が有する関節のうち屈曲した各関節の屈曲時における角速度及び角加速度の各々を示す値と、対象生物の一部の関節が屈曲したことに伴って移動した各関節の移動量と移動速度と移動方向との各々を示す値との少なくともいずれかから成る。屈曲特徴量算出部26は、参照行動モデル31の項目に応じて算出する特徴量を変更してもよい。
 屈曲特徴量算出部26は、具体例として、対象生物が有する関節の各関節の移動軌跡を算出し、算出した移動軌跡に基づいて各関節の移動距離及び移動方向を算出する。なお、屈曲特徴量算出部26が算出する移動軌跡の性質は、基本的には対象生物の種類及び年齢等に応じて異なる。
The bending feature amount calculation unit 26 calculates the bending feature amount of the target organism based on the skeleton information generated by the skeleton extraction unit 23. The bending feature amount calculation unit 26 may use the tracking information generated by the target organism tracking unit 24 when calculating the bending feature amount. The bending feature amount is a feature amount related to the bending of the joints of the target organism, and is a feature amount corresponding to at least one of the bending of the joints of the target organism and the joints of the target organism that have moved with the bending of the target joint. As a specific example, the bending feature amount is composed of at least one of a value indicating each of the bending direction and degree of each of at least some of the joints of the target organism, a value indicating each of the angular velocity and angular acceleration at the time of bending of each of the bent joints of the joints of the target organism, and a value indicating each of the movement amount, movement speed, and movement direction of each of the joints that have moved with the bending of the joints of the target organism. The bending feature amount calculation unit 26 may change the feature amount calculated according to the item of the reference behavior model 31.
As a specific example, the bend feature amount calculation unit 26 calculates the movement trajectory of each joint of the target organism, and calculates the movement distance and movement direction of each joint based on the calculated movement trajectory. Note that the nature of the movement trajectory calculated by the bend feature amount calculation unit 26 basically differs depending on the type, age, etc. of the target organism.
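The following sketch illustrates one way such bending features could be computed from extracted keypoint positions, assuming the angle at a joint is measured between the two adjacent body segments; the function names and the chosen feature set are illustrative only.

```python
import numpy as np

def joint_angle(p_prev, p_joint, p_next):
    """Angle (in degrees) at p_joint formed by the segments to p_prev and p_next."""
    v1 = np.asarray(p_prev, float) - np.asarray(p_joint, float)
    v2 = np.asarray(p_next, float) - np.asarray(p_joint, float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def bending_features(angles_per_frame, fps):
    """Illustrative bending features from one joint's angle over time:
    total angle change, angular velocity, and angular acceleration."""
    a = np.asarray(angles_per_frame, float)
    dt = 1.0 / fps
    ang_vel = np.diff(a) / dt
    ang_acc = np.diff(ang_vel) / dt
    return {
        "total_angle_change": float(np.abs(np.diff(a)).sum()),
        "mean_angular_velocity": float(np.abs(ang_vel).mean()) if ang_vel.size else 0.0,
        "mean_angular_acceleration": float(np.abs(ang_acc).mean()) if ang_acc.size else 0.0,
    }

# Example: the angle at the caudal-fin base sampled over four frames at 30 fps.
angles = [joint_angle((0, 0), (1, 0), (2, d)) for d in (0.0, 0.2, -0.2, 0.0)]
print(bending_features(angles, fps=30))
```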
 行動判定部27は、行動特徴量と、参照行動モデル31とを用いて対象生物の行動を判定する。行動特徴量は、対象生物が有する各関節の位置及び屈曲の少なくともいずれかの時間推移に基づいて算出された特徴量であって、対象生物の行動に対応する特徴量である。行動特徴量は、具体例として、運動特徴量と屈曲特徴量との少なくともいずれかから成る。具体的には、行動判定部27は、算出された運動特徴量及び変位特徴量を参照行動モデル31に当てはめることにより、対象生物の行動がどのような行動であるのかを判定する。
 行動判定部27は、具体例として、対象生物の遊泳速度が所定速度よりも速く、対象生物の頭の向きが鉛直方向上向きから所定の角度以内の方向であり、かつ、所定間隔よりも短い間隔で対象生物が尾を振る場合に対象生物の行動を摂餌行動と判定する。摂餌行動は、対象生物が餌を欲している場合における対象生物の行動である。
 行動判定部27は、別の具体例として、対象生物の遊泳速度が所定速度よりも遅く、かつ、対象生物の腹部が対象生物の背鰭よりも鉛直方向上側に存在する場合に対象生物の行動を異常行動と判定する。異常行動は、正常行動ではない行動であり、具体例として、対象生物が衰弱している状態、対象生物のストレス負荷が高い状態、又は病気を抱えている状態等である場合における対象生物の行動である。正常行動は、対象生物の健康状態が良好である場合における対象生物の行動である。
The behavior determination unit 27 determines the behavior of the target organism using the behavior feature amount and the reference behavior model 31. The behavior feature amount is a feature amount calculated based on the time progression of at least one of the position and bending of each joint of the target organism, and corresponds to the behavior of the target organism. As a specific example, the behavior feature amount is composed of at least one of a movement feature amount and a bending feature amount. Specifically, the behavior determination unit 27 determines what kind of behavior the target organism is exhibiting by applying the calculated movement feature amount and bending feature amount to the reference behavior model 31.
As a specific example, behavior determination unit 27 determines the behavior of the target organism as feeding behavior when the swimming speed of the target organism is faster than a predetermined speed, the orientation of the target organism's head is within a predetermined angle from the vertical upward direction, and the target organism wags its tail at intervals shorter than a predetermined interval. Feeding behavior is the behavior of the target organism when it desires food.
As another specific example, the behavior determination unit 27 determines the behavior of the target organism as abnormal when the swimming speed of the target organism is slower than a predetermined speed and the abdomen of the target organism is vertically above the dorsal fin of the target organism. Abnormal behavior is behavior that is not normal, and specific examples include behavior of the target organism when the target organism is in a weakened state, is under high stress, or is ill. Normal behavior is behavior of the target organism when the target organism is in good health.
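The two rule-based examples above can be condensed into a short sketch like the following; every numeric threshold is a placeholder assumption, since the text only refers to "predetermined" values.

```python
def classify_behavior(speed_cm_s, head_angle_deg_from_up, tail_beat_interval_s,
                      belly_above_dorsal_fin):
    """Rule-of-thumb classification mirroring the two examples in the text.
    All threshold values are placeholders chosen only for illustration."""
    FEEDING_SPEED = 15.0         # cm/s: "faster than a predetermined speed"
    FEEDING_HEAD_ANGLE = 30.0    # deg:  "within a predetermined angle from vertical"
    FEEDING_TAIL_INTERVAL = 0.5  # s:    "shorter than a predetermined interval"
    SLOW_SPEED = 3.0             # cm/s: "slower than a predetermined speed"

    if (speed_cm_s > FEEDING_SPEED
            and head_angle_deg_from_up < FEEDING_HEAD_ANGLE
            and tail_beat_interval_s < FEEDING_TAIL_INTERVAL):
        return "feeding"
    if speed_cm_s < SLOW_SPEED and belly_above_dorsal_fin:
        return "abnormal"
    return "other"

print(classify_behavior(20.0, 10.0, 0.3, False))  # -> "feeding"
print(classify_behavior(1.0, 90.0, 2.0, True))    # -> "abnormal"
```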
 図3は、行動判定部27の処理を説明する図である。
 図3の(a)は、対象生物がエビである場合における具体例を示している。エビが水面に浮上しており、かつ、エビの脚の動き量が多い場合に、行動判定部27はエビの行動を摂餌行動と判定する。一方、エビが水底に滞在しており、かつ、エビの動き量が少ない場合に、行動判定部27はエビの行動を非摂餌行動と判定する。非摂餌行動は摂餌行動ではない行動である。
 図3の(b)は、対象生物がナマズである場合における具体例を示している。ナマズが尾を頻繁に動かす場合に、行動判定部27はナマズの行動を正常行動と判定する。一方、ナマズが尾をほぼ動かさない場合に、行動判定部27はナマズの行動を異常行動と判定する。
FIG. 3 is a diagram for explaining the process of the behavior determination unit 27. As shown in FIG.
FIG. 3(a) shows a specific example in which the target organism is a shrimp. When the shrimp has risen to the water surface and its legs are moving actively, the behavior determination unit 27 determines the shrimp's behavior to be feeding behavior. On the other hand, when the shrimp stays on the bottom and moves little, the behavior determination unit 27 determines the shrimp's behavior to be non-feeding behavior. Non-feeding behavior is behavior that is not feeding behavior.
FIG. 3(b) shows a specific example in which the target organism is a catfish. When the catfish frequently moves its tail, the behavior determination unit 27 determines that the behavior of the catfish is normal behavior. On the other hand, when the catfish hardly moves its tail, the behavior determination unit 27 determines that the behavior of the catfish is abnormal behavior.
 生育条件制御部28は、対象生物に対して給餌を実行する装置が対象生物に対して給餌を実行するか否かを決定するためのデータを、行動判定部27によって判定された対象生物の行動に応じて生成する。当該データは、給餌を実行するか否かを示すデータであってもよく、給餌量を示すデータであってもよく、摂餌行動と判定された行動をとる対象生物がいるか否かを示すデータであってもよく、摂餌行動と判定された行動をとる対象生物の総数と、水槽内の対象生物の総数との各々を示すデータであってもよく、行動判定装置10が備える各部の処理結果を示すデータであってもよい。当該データが給餌量を示す場合において、自動給餌装置60は、給餌量が0である場合に対象生物に対して給餌を実行しないことを決定し、給餌量が0ではない場合に対象生物に対して給餌を実行することを決定する。
 また、生育条件制御部28は、行動判定部27の判定結果に基づいて自動給餌装置60及び生育環境制御装置70の各々の調節量等を示すデータを生成し、生成したデータを出力してもよい。具体例として、生育条件制御部28は、判定された対象生物の行動に応じて対象生物に対する給餌タイミング及び給餌量の少なくともいずれかを算出し、また、判定された対象生物の行動に応じて、対象生物の生育環境を調節することに用いられるデータを生成する。
The growth condition control unit 28 generates data for the device that feeds the target organism to determine whether or not to feed the target organism, according to the behavior of the target organism determined by the behavior determination unit 27. The data may be data indicating whether or not to feed the target organism, data indicating the amount of feeding, data indicating whether or not there is a target organism that is performing a behavior determined to be a feeding behavior, data indicating each of the total number of target organisms performing a behavior determined to be a feeding behavior and the total number of target organisms in the aquarium, or data indicating the processing results of each unit included in the behavior determination device 10. When the data indicates the amount of feeding, the automatic feeding device 60 determines not to feed the target organism when the amount of feeding is 0, and determines to feed the target organism when the amount of feeding is not 0.
Furthermore, the growth condition control unit 28 may generate data indicating the amount of adjustment of each of the automatic feeding device 60 and the growth environment control device 70 based on the determination result of the behavior determination unit 27, and output the generated data. As a specific example, the growth condition control unit 28 calculates at least one of the feeding timing and feeding amount for the target organism according to the determined behavior of the target organism, and also generates data used to adjust the growth environment of the target organism according to the determined behavior of the target organism.
 参照行動モデル31は、行動特徴量に応じた対象生物の行動の分類を示すモデルから成り、水生生物の行動を分類することに用いられるモデルである。参照行動モデル31は、テーブル形式で表現されるモデルであってもよく、機械学習に基づく分類器であってもよい。
 図4から7は参照行動モデル31の具体例を示している。以下、図4から7を用いて参照行動モデル31を説明する。なお、摂餌行動モデルと、非摂餌行動モデルと、正常行動モデルと、異常行動モデルとの各々は、参照行動モデル31に含まれるモデルの具体例である。
The reference behavior model 31 is a model that indicates a classification of the behavior of a target organism according to behavioral features, and is used to classify the behavior of aquatic organisms. The reference behavior model 31 may be a model expressed in a table format, or may be a classifier based on machine learning.
4 to 7 show specific examples of the reference behavior model 31. The reference behavior model 31 will be described below with reference to Figs. 4 to 7. Note that the feeding behavior model, the non-feeding behavior model, the normal behavior model, and the abnormal behavior model are each a specific example of a model included in the reference behavior model 31.
 図4は、摂餌行動モデルの具体例を示している。摂餌行動モデルは、対象生物の行動が摂餌行動であるか否かを判定することに用いられるモデルであり、具体例として行動特徴量に応じて対象生物の行動を摂餌行動に分類するための指標を示すモデルである。図4に示す摂餌行動モデルを用いることにより、行動判定部27は、対象生物の遊泳速度と頭角度と遊泳水深と尾鰭角度変化量とに基づいて、対象生物の行動が摂餌行動であるか否かを判定することができる。ここで、遊泳速度と頭角度と遊泳水深とは運動特徴量に当たり、尾鰭角度変化量は屈曲特徴量に当たる。 FIG. 4 shows a specific example of a feeding behavior model. The feeding behavior model is a model used to determine whether the behavior of a target organism is feeding behavior, and as a specific example, is a model that shows indices for classifying the behavior of the target organism as feeding behavior according to behavioral features. By using the feeding behavior model shown in FIG. 4, the behavior determination unit 27 can determine whether the behavior of the target organism is feeding behavior based on the swimming speed, head angle, swimming depth, and caudal fin angle change of the target organism. Here, the swimming speed, head angle, and swimming depth correspond to the movement features, and the caudal fin angle change corresponds to the bending feature.
 図5は、非摂餌行動モデルの具体例を示している。非摂餌行動モデルは、対象生物の行動が非摂餌行動であるか否かを判定することに用いられるモデルであり、具体例として行動特徴量に応じて対象生物の行動を非摂餌行動に分類するための指標を示すモデルである。図5に示す非摂餌行動モデルを用いることにより、行動判定部27は、対象生物の遊泳速度と頭角度と遊泳水深と尾鰭角度変化量とに基づいて、対象生物の行動が非摂餌行動であるか否かを判定することができる。非摂餌行動は、摂餌行動ではない行動であり、具体例として、対象生物が満腹時にとる行動である。 FIG. 5 shows a specific example of a non-feeding behavior model. The non-feeding behavior model is a model used to determine whether the behavior of a target organism is non-feeding behavior, and as a specific example, is a model that shows indices for classifying the behavior of a target organism as non-feeding behavior according to behavioral features. By using the non-feeding behavior model shown in FIG. 5, the behavior determination unit 27 can determine whether the behavior of a target organism is non-feeding behavior based on the target organism's swimming speed, head angle, swimming depth, and caudal fin angle change. Non-feeding behavior is behavior that is not feeding behavior, and as a specific example, is behavior that a target organism takes when full.
 図6は、正常行動モデルの具体例を示している。正常行動モデルは、対象生物の行動が正常行動であるか否かを判定することに用いられるモデルであり、具体例として行動特徴量に応じて対象生物の行動を正常行動に分類するための指標を示すモデルである。図6に示す正常行動モデルを用いることにより、行動判定部27は、対象生物の遊泳速度と胸鰭位置と尾鰭角度変化回数とに基づいて、対象生物の行動が正常行動であるか否かを判定することができる。ここで、胸鰭位置は運動特徴量に当たり、尾鰭角度変化回数は屈曲特徴量に当たる。 FIG. 6 shows a specific example of a normal behavior model. The normal behavior model is a model used to determine whether the behavior of a target organism is normal or not, and as a specific example, is a model that shows an index for classifying the behavior of the target organism as normal behavior according to the behavior feature. By using the normal behavior model shown in FIG. 6, the behavior determination unit 27 can determine whether the behavior of the target organism is normal or not based on the swimming speed, pectoral fin position, and number of caudal fin angle changes of the target organism. Here, the pectoral fin position corresponds to the movement feature, and the number of caudal fin angle changes corresponds to the bending feature.
 図7は、異常行動モデルの具体例を示している。異常行動モデルは、対象生物の行動が異常行動であるか否かを判定することに用いられるモデルであり、具体例として行動特徴量に応じて対象生物の行動を異常行動に分類するための指標を示すモデルである。図7に示す異常行動モデルを用いることにより、行動判定部27は、対象生物の遊泳速度と胸鰭位置と尾鰭角度変化回数とに基づいて、対象生物の行動が異常行動であるか否かを判定することができる。 FIG. 7 shows a specific example of an abnormal behavior model. The abnormal behavior model is a model used to determine whether the behavior of a target organism is abnormal or not, and as a specific example, is a model that shows an index for classifying the behavior of the target organism as abnormal depending on the behavioral features. By using the abnormal behavior model shown in FIG. 7, the behavior determination unit 27 can determine whether the behavior of the target organism is abnormal or not based on the swimming speed, pectoral fin position, and number of changes in caudal fin angle of the target organism.
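One possible table-format realization of the reference behavior model 31, consistent with the feature names of FIGS. 4 to 7, is sketched below; the numeric ranges are invented placeholders and the matching logic is an assumption made for illustration, not the patent's definition.

```python
# A table-format reference behavior model: each entry maps a behavior class to
# allowed (min, max) ranges of features. All numeric ranges are placeholders.
REFERENCE_BEHAVIOR_MODEL = {
    "feeding":     {"swim_speed": (15.0, None), "head_angle": (0.0, 30.0),
                    "swim_depth": (0.0, 0.3),   "tail_angle_change": (20.0, None)},
    "non_feeding": {"swim_speed": (0.0, 10.0),  "head_angle": (60.0, 120.0),
                    "swim_depth": (0.5, None),  "tail_angle_change": (0.0, 10.0)},
}

def in_range(value, bounds):
    lo, hi = bounds
    return (lo is None or value >= lo) and (hi is None or value <= hi)

def match_behavior(features, model=REFERENCE_BEHAVIOR_MODEL):
    """Return the first behavior class whose every feature range contains the
    observed feature value, or None if nothing matches."""
    for behavior, ranges in model.items():
        if all(in_range(features[name], bounds) for name, bounds in ranges.items()):
            return behavior
    return None

obs = {"swim_speed": 18.0, "head_angle": 12.0, "swim_depth": 0.1, "tail_angle_change": 35.0}
print(match_behavior(obs))  # -> "feeding"
```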
 図8は、本実施の形態に係る行動判定装置10のハードウェア構成例を示している。行動判定装置10は、コンピュータから成り、プロセッサ20と、記憶装置30と、通信装置40と、入出力インタフェース50等のハードウェアを備える。行動判定装置10は複数のコンピュータから成ってもよい。 FIG. 8 shows an example of the hardware configuration of the behavior determination device 10 according to this embodiment. The behavior determination device 10 is made up of a computer, and includes hardware such as a processor 20, a storage device 30, a communication device 40, and an input/output interface 50. The behavior determination device 10 may be made up of multiple computers.
 プロセッサ20は、演算処理を行うIC(Integrated Circuit)であり、かつ、コンピュータが備えるハードウェアを制御する。プロセッサ20は、具体例として、CPU(Central Processing Unit)、DSP(Digital Signal Processor)、又はGPU(Graphics Processing Unit)である。
 行動判定装置10は、プロセッサ20を代替する複数のプロセッサを備えてもよい。複数のプロセッサはプロセッサ20の役割を分担する。
The processor 20 is an integrated circuit (IC) that performs arithmetic processing and controls the hardware of the computer. Specific examples of the processor 20 include a central processing unit (CPU), a digital signal processor (DSP), and a graphics processing unit (GPU).
The behavior determination device 10 may include a plurality of processors that replace the processor 20. The plurality of processors share the role of the processor 20.
 記憶装置30は、揮発性の記憶装置、不揮発性の記憶装置、またこれらの組合せである。揮発性の記憶装置は、具体例としてRAM(Random Access Memory)である。不揮発性の記憶装置は、具体例として、ROM(Read Only Memory)、HDD(Hard Disk Drive)、又はフラッシュメモリである。 The storage device 30 may be a volatile storage device, a non-volatile storage device, or a combination of these. A specific example of a volatile storage device is RAM (Random Access Memory). A specific example of a non-volatile storage device is ROM (Read Only Memory), HDD (Hard Disk Drive), or flash memory.
 通信装置40は、レシーバ及びトランスミッタである。通信装置40は、具体例として、通信チップ又はNIC(Network Interface Card)である。 The communication device 40 is a receiver and a transmitter. A specific example of the communication device 40 is a communication chip or a NIC (Network Interface Card).
 入出力インタフェース50は、入力装置及び出力装置が接続されるポートである。入出力インタフェース50は、具体例として、USB(Universal Serial Bus)端子である。入力装置は、具体例として、キーボード及びマウスである。出力装置は、具体例としてディスプレイである。 The input/output interface 50 is a port to which an input device and an output device are connected. A specific example of the input/output interface 50 is a USB (Universal Serial Bus) terminal. A specific example of the input device is a keyboard and a mouse. A specific example of the output device is a display.
 行動判定装置10の各部は、他の装置等と通信する際に、入出力インタフェース50及び通信装置40を適宜用いてもよい。通信装置40及び入出力インタフェース50は、カメラ1等が出力した信号を取得する。 Each part of the behavior determination device 10 may use the input/output interface 50 and the communication device 40 as appropriate when communicating with other devices. The communication device 40 and the input/output interface 50 acquire signals output by the camera 1, etc.
 記憶装置30は、参照行動モデル31と行動判定プログラムとを記憶する。行動判定プログラムは、行動判定装置10が備える各部の機能をコンピュータに実現させるプログラムである。プロセッサ20は、記憶装置30に記憶されている行動判定プログラムを読み出して実行することにより、行動判定装置10が備える各部として動作する。行動判定装置10が備える各部の機能はソフトウェアにより実現される。 The storage device 30 stores a reference behavior model 31 and a behavior determination program. The behavior determination program is a program that causes a computer to realize the functions of each part of the behavior determination device 10. The processor 20 reads out and executes the behavior determination program stored in the storage device 30, thereby operating as each part of the behavior determination device 10. The functions of each part of the behavior determination device 10 are realized by software.
 行動判定プログラムを実行する際に用いられるデータと、行動判定プログラムを実行することによって得られるデータ等は、記憶装置30に適宜記憶される。行動判定装置10の各部は記憶装置30を適宜利用する。なお、データという用語と情報という用語とは同等の意味を有することもある。記憶装置30はコンピュータと独立したものであってもよい。 The data used when executing the behavior determination program and the data obtained by executing the behavior determination program are appropriately stored in the storage device 30. Each part of the behavior determination device 10 uses the storage device 30 as appropriate. Note that the terms "data" and "information" may have the same meaning. The storage device 30 may be independent of the computer.
 行動判定プログラムは、コンピュータが読み取り可能な不揮発性の記録媒体に記録されていてもよい。不揮発性の記録媒体は、具体例として、光ディスク又はフラッシュメモリである。行動判定プログラムは、プログラムプロダクトとして提供されてもよい。 The behavior determination program may be recorded on a computer-readable non-volatile recording medium. Specific examples of the non-volatile recording medium include an optical disk or a flash memory. The behavior determination program may be provided as a program product.
***動作の説明***
 行動判定装置10の動作手順は行動判定方法に相当する。また、行動判定装置10の動作を実現するプログラムは行動判定プログラムに相当する。
*** Operation Description ***
The operation procedure of the behavior determination device 10 corresponds to a behavior determination method, and the program that realizes the operation of the behavior determination device 10 corresponds to a behavior determination program.
 図9は、行動判定装置10の動作の一例を示すフローチャートである。図9を参照して行動判定装置10の動作を説明する。 FIG. 9 is a flowchart showing an example of the operation of the behavior determination device 10. The operation of the behavior determination device 10 will be explained with reference to FIG. 9.
(ステップS1)
 映像取得部21は、カメラ1が出力した映像データを対象映像データとして取得する。映像取得部21は、カメラ1が出力した映像データのうちある時間範囲内のデータのみを対象映像データとして取得してもよい。
(Step S1)
The video acquisition unit 21 acquires, as target video data, the video data output by the camera 1. The video acquisition unit 21 may acquire, as target video data, only data within a certain time range from the video data output by the camera 1.
(ステップS2)
 対象生物検知部22は、対象映像データが示す各フレームから対象生物を検知する。対象生物検知部22は、対象映像データが示すフレームのうち一部のフレームのみを利用してもよい。
(Step S2)
The target organism detection unit 22 detects a target organism from each frame represented by the target video data. The target organism detection unit 22 may use only a portion of the frames represented by the target video data.
(ステップS3)
 対象映像データが示す映像から対象生物が検知された場合、行動判定装置10はステップS4に進む。それ以外の場合、行動判定装置10はステップS1に戻る。
 なお、対象映像データが示す映像から対象生物が検知された場合において、1つ以上の対象生物が検知されたものとする。
(Step S3)
If a target organism is detected in the images represented by the target video data, the behavior determination device 10 proceeds to step S4. Otherwise, the behavior determination device 10 returns to step S1.
In addition, when a target organism is detected from the image represented by the target image data, it is assumed that one or more target organisms are detected.
(ステップS4)
 行動判定装置10は、1つ以上の対象生物のうちステップS4からステップS10までの反復処理においてまだ選択されていない対象生物を選択対象生物として選択する。なお、選択対象生物は複数のフレームにおいて検知されているものとする。
(Step S4)
The behavior determination device 10 selects, as a selected target organism, one of the one or more target organisms that has not yet been selected in the iterative processing from step S4 to step S10. Note that the selected target organism is assumed to have been detected in multiple frames.
(ステップS5)
 骨格抽出部23は、対象映像データが示すフレームのうち選択対象生物が検知された各フレームにおいて選択対象生物の骨格を抽出し、抽出した骨格を示す情報を対象骨格情報として生成する。
(Step S5)
The skeleton extraction unit 23 extracts the skeleton of the selected target organism in each frame in which the selected target organism is detected among the frames indicated by the target video data, and generates information indicating the extracted skeleton as target skeleton information.
(ステップS6)
 対象生物追跡部24は、対象骨格情報に基づいて選択対象生物を追跡し、追跡した結果を示す情報を対象追跡情報として生成する。
(Step S6)
The target organism tracking unit 24 tracks the selected target organism based on the target skeletal information, and generates information indicating the tracking results as target tracking information.
(ステップS7)
 運動特徴量算出部25は、対象追跡情報に基づいて選択対象生物の運動特徴量を算出する。運動特徴量算出部25は、具体例として、対象追跡情報に基づいて対象生物の遊泳距離及び遊泳時間等を算出し、算出した結果を運動特徴量とする。
 なお、本ステップにおいて、屈曲特徴量算出部26は対象骨格情報に基づいて選択対象生物の屈曲特徴量を算出してもよい。
(Step S7)
The motion characteristic amount calculation unit 25 calculates the motion characteristic amount of the selected target organism based on the target tracking information. As a specific example, the motion characteristic amount calculation unit 25 calculates the swimming distance and swimming time of the target organism based on the target tracking information, and sets the calculated results as the motion characteristic amount.
In this step, the curvature feature amount calculation section 26 may calculate the curvature feature amount of the selected target organism based on the target skeletal information.
(ステップS8)
 行動判定部27は、ステップS7において算出した特徴量を事前に用意してある参照行動モデル31と比較する。
(Step S8)
The behavior determination unit 27 compares the feature amount calculated in step S7 with a reference behavior model 31 prepared in advance.
(ステップS9)
 行動判定部27は、ステップS8における比較結果に基づいて選択対象生物の行動を判定する。
(Step S9)
The behavior determination section 27 determines the behavior of the selected target creature based on the comparison result in step S8.
(ステップS10)
 行動判定装置10は、ステップS5からステップS9までを対象映像データから検知した対象生物の数分繰り返し実行する。
(Step S10)
The behavior determination device 10 repeatedly executes steps S5 to S9 for the number of target organisms detected from the target video data.
(ステップS11)
 生育条件制御部28は、ステップS9における行動判定部27の判定結果に基づいて自動給餌装置60及び生育環境制御装置70の各々の調節量等を示すデータを生成し、生成したデータを自動給餌装置60及び生育環境制御装置70の各々に対して出力する。
 具体例として、自動給餌装置60は、ステップS9において摂餌行動と判定された行動をとる対象生物がいた場合に、給餌タイミングであると判定し、給餌を実行する。この際、自動給餌装置60は、ステップS9において摂餌行動と判定された行動をとる対象生物の割合によって給餌量を変更してもよい。具体例として、自動給餌装置60は、対象生物1体ごとに基準となる給餌量である基準給餌量を図示しないデータベースに有しており、ステップS9において摂餌行動と判定された行動をとる対象生物の数と、ステップS3において検知された対象生物の数とを取得し、摂餌行動と判定された行動をとる対象生物の数を検知された対象生物の数で割ることにより、摂餌行動と判定された行動をとる対象生物の割合を算出し、算出した割合が80%以上である場合に給餌量を基準給餌量の1.2倍にする。
 別の具体例として、自動給餌装置60は、摂餌行動指標が示す値よりも高い値を示す給餌量指標を有しており、摂餌行動と判定された行動をとる対象生物に対応する特徴量が給餌量指標が示す値以上である場合に給餌量を基準給餌量の1.2倍にする。具体例として、摂餌行動指標は遊泳速度15cm/sを示し、給餌量指標は遊泳速度25cm/sを示す。摂餌行動指標は、対象生物の行動が摂餌行動であるか否かを判定することに用いられる指標である。給餌量指標は、給餌量を決定することに用いられる指標である。
 また、具体例として、生育環境制御装置70は、ステップS9において異常行動と判定された行動をとる対象生物が1体以上存在する場合に、まず水槽内の水温又は酸素供給量を高く設定する。その後、一定時間経過後に行動判定装置10が図9に示す行動判定のフローを再度実施し、ステップS9において異常行動と判定された行動をとる対象生物がいない場合に、生育環境制御装置70は水槽内の水温又は酸素供給量を低く設定する。なお、水槽内の水温又は酸素供給量をまず高く設定する理由は、一般的に、水温が高いほど、また酸素濃度が高いほど魚類が活性化する(活発に動く)ためである。なお、生育環境制御装置70は、まず水槽内の水温又は酸素供給量を低く設定し、その後水槽内の水温又は酸素供給量を高く設定してもよい。
(Step S11)
The growth condition control unit 28 generates data indicating the amount of adjustment of each of the automatic feeding device 60 and the growth environment control device 70 based on the judgment result of the behavior judgment unit 27 in step S9, and outputs the generated data to each of the automatic feeding device 60 and the growth environment control device 70.
As a specific example, when there is a target organism exhibiting behavior determined to be feeding behavior in step S9, the automatic feeding device 60 determines that it is feeding time and executes feeding. At this time, the automatic feeding device 60 may change the amount of feeding depending on the proportion of target organisms exhibiting behavior determined to be feeding behavior in step S9. As a specific example, the automatic feeding device 60 holds, in a database (not shown), a reference feeding amount defined per target organism. It obtains the number of target organisms exhibiting behavior determined to be feeding behavior in step S9 and the number of target organisms detected in step S3, calculates the proportion of target organisms exhibiting feeding behavior by dividing the former by the latter, and sets the feeding amount to 1.2 times the reference feeding amount when the calculated proportion is 80% or more.
As another specific example, the automatic feeding device 60 has a feeding amount index that indicates a higher value than the value indicated by the feeding behavior index, and when the feature value corresponding to the target organism exhibiting behavior determined to be feeding behavior is equal to or greater than the value indicated by the feeding amount index, the feeding amount is set to 1.2 times the reference feeding amount. As a specific example, the feeding behavior index indicates a swimming speed of 15 cm/s, and the feeding amount index indicates a swimming speed of 25 cm/s. The feeding behavior index is an index used to determine whether the behavior of the target organism is feeding behavior. The feeding amount index is an index used to determine the feeding amount.
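The 80% / 1.2-times example above can be expressed as a small decision function, sketched below under the assumption that the counts of feeding and detected organisms are already available; the gram values in the usage example are arbitrary.

```python
def decide_feeding(num_feeding, num_detected, base_amount_g):
    """Decide whether to feed and how much, following the 80% / 1.2x example.
    base_amount_g is the per-feeding reference amount; all values are illustrative."""
    if num_detected == 0 or num_feeding == 0:
        return {"feed": False, "amount_g": 0.0}
    ratio = num_feeding / num_detected
    amount = base_amount_g * (1.2 if ratio >= 0.8 else 1.0)
    return {"feed": True, "amount_g": amount, "feeding_ratio": ratio}

print(decide_feeding(num_feeding=9, num_detected=10, base_amount_g=100.0))
# -> feed with 120.0 g because 90% of detected organisms show feeding behavior
```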
As a specific example, when there are one or more target organisms whose behavior is determined to be abnormal in step S9, the growth environment control device 70 first raises the water temperature or the oxygen supply amount in the aquarium. After a certain time has elapsed, the behavior determination device 10 executes the behavior determination flow shown in FIG. 9 again, and if no target organism exhibits behavior determined to be abnormal in step S9, the growth environment control device 70 lowers the water temperature or the oxygen supply amount again. The reason for first raising the water temperature or the oxygen supply amount is that, in general, fish become more active as the water temperature and the oxygen concentration increase. Alternatively, the growth environment control device 70 may first lower the water temperature or the oxygen supply amount and then raise it.
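A rough control-loop sketch of this two-step adjustment is shown below; count_abnormal, raise_setting, and lower_setting are hypothetical callables standing in for the behavior determination flow of FIG. 9 and the growth environment control device 70, and the re-check interval is an arbitrary value.

```python
import time

def adjust_environment(count_abnormal, raise_setting, lower_setting, recheck_after_s=600):
    """Sketch of the two-step adjustment described above: raise the water
    temperature / oxygen supply first, re-run the behavior determination after a
    while, and lower the setting again if no abnormal behavior remains."""
    if count_abnormal() == 0:
        return "no adjustment needed"
    raise_setting()                 # e.g. increase water temperature or O2 supply
    time.sleep(recheck_after_s)     # wait before re-running the FIG. 9 flow
    if count_abnormal() == 0:
        lower_setting()             # abnormal behavior resolved; restore the setting
        return "raised then lowered"
    return "still abnormal after adjustment"

# Dummy usage: no organism is abnormal, so nothing is changed.
print(adjust_environment(lambda: 0, lambda: None, lambda: None, recheck_after_s=0))
```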
***実施の形態1の効果の説明***
 以上のように、本実施の形態によれば、対象生物の遊泳行動及び脚の動き等に基づいて対象生物の行動の種類を判定することにより、対象生物の追跡のみと比較して対象生物の行動をより詳細に把握することができる。また、本実施の形態によれば、対象生物の種類に応じて対象生物の骨格を抽出し、抽出した骨格に基づいて追跡した対象生物の行動と、対象生物の行動を分類するモデルとに基づいて対象生物の行動を判定することにより対象生物の生態に応じた行動を把握することができる。
 また、本実施の形態によれば、対象生物の行動を把握した結果に基づいて効率的な給餌を実行することができる。即ち、本実施の形態によれば、適したタイミングにおいて給餌を実行することができるため、具体例として、対象生物が満腹状態である場合に給餌をすることによって水質が悪化することを防ぐことができ、対象生物が空腹状態である場合に給餌しないことを防ぐことができ、対象生物の健康状態に応じて給餌量を調節することができる。
***Description of Effect of First Embodiment***
As described above, according to this embodiment, the behavior of the target organism can be understood in more detail than by only tracking the target organism by determining the type of behavior of the target organism based on the swimming behavior and leg movements of the target organism, etc. Also, according to this embodiment, the skeleton of the target organism is extracted according to the type of the target organism, and the behavior of the target organism is determined based on the behavior of the target organism tracked based on the extracted skeleton and a model for classifying the behavior of the target organism, thereby making it possible to understand the behavior according to the ecology of the target organism.
Furthermore, according to this embodiment, efficient feeding can be performed based on the results of understanding the behavior of the target organism. That is, according to this embodiment, since feeding can be performed at an appropriate timing, it is possible to prevent the water quality from deteriorating by feeding the target organism when it is full, and to prevent not feeding the target organism when it is hungry, and it is possible to adjust the amount of feeding according to the health condition of the target organism.
***他の構成***
<変形例1>
 図10は、本変形例に係る行動判定装置10のハードウェア構成例を示している。
 行動判定装置10は、プロセッサ20、あるいはプロセッサ20と記憶装置30とに代えて、処理回路19を備える。
 処理回路19は、行動判定装置10が備える各部の少なくとも一部を実現するハードウェアである。
 処理回路19は、専用のハードウェアであってもよく、また、記憶装置30に格納されるプログラムを実行するプロセッサであってもよい。
***Other configurations***
<Modification 1>
FIG. 10 shows an example of the hardware configuration of the behavior determining device 10 according to this modified example.
The behavior determination device 10 includes a processing circuit 19 instead of the processor 20 or instead of the processor 20 and the storage device 30 .
The processing circuitry 19 is hardware that realizes at least a portion of each unit of the behavior determination device 10 .
The processing circuitry 19 may be dedicated hardware, or may be a processor that executes a program stored in a storage device 30 .
 処理回路19が専用のハードウェアである場合、処理回路19は、具体例として、単一回路、複合回路、プログラム化したプロセッサ、並列プログラム化したプロセッサ、ASIC(Application Specific Integrated Circuit)、FPGA(Field Programmable Gate Array)又はこれらの組み合わせである。
 行動判定装置10は、処理回路19を代替する複数の処理回路を備えてもよい。複数の処理回路は、処理回路19の役割を分担する。
When processing circuitry 19 is dedicated hardware, processing circuitry 19 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
The behavior determination device 10 may include a plurality of processing circuits that replace the processing circuit 19. The plurality of processing circuits share the role of the processing circuit 19.
 行動判定装置10において、一部の機能が専用のハードウェアによって実現されて、残りの機能がソフトウェア又はファームウェアによって実現されてもよい。 In the behavior determination device 10, some functions may be realized by dedicated hardware, and the remaining functions may be realized by software or firmware.
 処理回路19は、具体例として、ハードウェア、ソフトウェア、ファームウェア、又はこれらの組み合わせにより実現される。
 プロセッサ20と記憶装置30と処理回路19とを、総称して「プロセッシングサーキットリー」という。つまり、行動判定装置10の各機能構成要素の機能は、プロセッシングサーキットリーにより実現される。
 他の実施の形態に係る行動判定装置10についても、本変形例と同様の構成であってもよい。
The processing circuitry 19 is illustratively implemented in hardware, software, firmware, or a combination thereof.
The processor 20, the storage device 30, and the processing circuit 19 are collectively referred to as the “processing circuitry.” In other words, the functions of the functional components of the behavior determination device 10 are realized by the processing circuitry.
The behavior determining device 10 according to other embodiments may also have a similar configuration to this modified example.
 実施の形態2.
 以下、主に前述した実施の形態と異なる点について、図面を参照しながら説明する。
Embodiment 2.
The following mainly describes the differences from the above-described embodiment with reference to the drawings.
***構成の説明***
 図11は、本実施の形態に係る行動判定システム9の構成例を示している。本実施の形態に係る行動判定装置10は、図11に示すように、残餌量取得部200と、学習部301と、死骸数取得部400とをさらに備える。
 残餌量取得部200は、映像取得部201と、残餌検知部202と、残餌量算出部203とを備える。
 死骸数取得部400は、映像取得部401と、死骸検知部402と、死骸数算出部403とを備える。
***Configuration Description***
Fig. 11 shows an example of the configuration of a behavior determination system 9 according to this embodiment. As shown in Fig. 11, the behavior determination device 10 according to this embodiment further includes a remaining food amount acquisition unit 200, a learning unit 301, and a corpse number acquisition unit 400.
The remaining food amount acquisition section 200 comprises a video acquisition section 201 , a remaining food detection section 202 , and a remaining food amount calculation section 203 .
The corpse number acquisition unit 400 includes an image acquisition unit 401 , a corpse detection unit 402 , and a corpse number calculation unit 403 .
 映像取得部201は映像取得部21と同様である。 Video acquisition unit 201 is similar to video acquisition unit 21.
残餌検知部202は、映像取得部201が取得した映像データが示す各フレームから摂餌後における水槽内の残餌を検知する。摂餌後は、対象生物が給餌された餌を食べた後であり、具体例として、給餌のタイミングから、対象生物が餌を食べきることに通常要する時間が経過した後である。残餌検知部202は、背景差分方式を用いて残餌を判定してもよく、AI技術に基づく学習済モデルを用いて各フレーム中のどこに残餌が存在するのかを判定してもよい。当該学習済モデルは、具体例として、フレームを示すデータを入力とし、残餌を検知した結果を出力とするモデルである。 The remaining food detection unit 202 detects remaining food in the tank after feeding from each frame indicated by the video data acquired by the video acquisition unit 201. Here, "after feeding" refers to the time after the target organism has eaten the provided food; as a specific example, it is the time after the period normally required for the target organism to finish the food has elapsed since feeding. The remaining food detection unit 202 may determine remaining food using a background subtraction method, or may determine where remaining food is present in each frame using a trained model based on AI technology. As a specific example, the trained model is a model that takes data indicating a frame as input and outputs the result of detecting remaining food.
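A very simple background-subtraction sketch for remaining-food detection is shown below, using plain NumPy frame differencing against a food-free background image; the threshold value and the assumption that such a background image exists are illustrative only.

```python
import numpy as np

def remaining_food_mask(frame_after, background, diff_threshold=30):
    """Very simple background subtraction: pixels that differ strongly from a
    food-free background image are treated as candidate remaining-food pixels.
    Both images are grayscale uint8 arrays of the same shape."""
    diff = np.abs(frame_after.astype(np.int16) - background.astype(np.int16))
    return diff > diff_threshold

# Dummy example: a 4x4 background and a frame with one bright "pellet" pixel.
background = np.zeros((4, 4), dtype=np.uint8)
frame = background.copy()
frame[2, 2] = 200
mask = remaining_food_mask(frame, background)
print(int(mask.sum()), "candidate remaining-food pixels")  # -> 1
```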
 残餌量算出部203は、残餌検知部202が検知した餌の量を算出する。 The remaining food amount calculation unit 203 calculates the amount of food detected by the remaining food detection unit 202.
 映像取得部401は映像取得部21と同様である。 Video acquisition unit 401 is similar to video acquisition unit 21.
死骸検知部402は、映像取得部401が取得した映像データが示す各フレームから水槽内の死骸を検知する。死骸検知部402は、動き特徴と背景差分方式とを用いて死骸を判定してもよく、AI技術に基づく学習済モデルを用いて各フレーム中のどこに死骸が存在するのかを判定してもよい。当該学習済モデルは、具体例として、フレームを示すデータを入力とし、死骸を検知した結果を出力とするモデルである。 The corpse detection unit 402 detects corpses in the tank from each frame of the video data acquired by the video acquisition unit 401. The corpse detection unit 402 may determine the presence of a corpse using motion features and a background subtraction method, or may determine where in each frame the corpse is located using a trained model based on AI technology. As a specific example, the trained model is a model that receives data indicating a frame as input and outputs the result of detecting a corpse.
死骸数算出部403は、死骸検知部402が検知した死骸の数を算出する。具体例として、死骸数算出部403は、背景差分方式を用いて死骸が検知された場合において死骸を映していると推定される画素の総数に基づいて死骸の数を算出し、AI技術を利用して死骸が検知された場合において検知枠数に基づいて死骸の数を算出する。 Corpse number calculation unit 403 calculates the number of corpses detected by corpse detection unit 402. As a specific example, corpse number calculation unit 403 calculates the number of corpses based on the total number of pixels estimated to show corpses when corpses are detected using the background subtraction method, and calculates the number of corpses based on the number of detection frames when corpses are detected using AI technology.
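The two counting strategies mentioned above (foreground-pixel area versus detection boxes) might look like the following sketch; the average-area calibration value is an assumption introduced only for this example.

```python
import numpy as np

def estimate_corpse_count(foreground_mask, avg_pixels_per_corpse):
    """Estimate the number of corpses from a background-subtraction mask by
    dividing the number of foreground pixels by the assumed average area of one corpse."""
    n_pixels = int(np.count_nonzero(foreground_mask))
    return round(n_pixels / avg_pixels_per_corpse) if avg_pixels_per_corpse > 0 else 0

def count_from_detection_boxes(boxes):
    """When an AI detector is used instead, the count is simply the number of
    detection boxes returned for corpses."""
    return len(boxes)

mask = np.zeros((10, 10), dtype=bool)
mask[0:3, 0:3] = True                                         # 9 foreground pixels
print(estimate_corpse_count(mask, avg_pixels_per_corpse=9))   # -> 1
print(count_from_detection_boxes([(1, 1, 3, 3)]))             # -> 1
```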
 学習部301は、運動特徴量及び屈曲特徴量と、残餌検知部202又は死骸検知部402が検知した結果を示すデータとに基づいてモデルを生成する。
 具体例として、学習部301は、非摂餌行動データと摂餌行動データとを用いて、対象生物の行動特徴量に応じて対象生物の行動を摂餌行動と非摂餌行動とのいずれかに分類するモデルを学習する。非摂餌行動データは、残餌量が基準残餌量以上である場合におけるデータであって、給餌前における学習用水生生物の行動特徴量を示すデータであって、非摂餌行動に対応するデータであるとラベル付けされているデータである。残餌量は、摂餌時間が給餌時点から経過した後において残っている餌の量である。摂餌時間は、学習用水生生物が給餌された餌を摂餌することに要する時間として設定された時間である。学習用水生生物は対象生物に対応する水生生物である。学習用水生生物は、対象生物そのものであってもよく、対象生物の性質と同じ又は類似する性質を有する水生生物であってもよい。摂餌行動データは、残餌量が基準残餌量未満である場合におけるデータであって、給餌前における学習用水生生物の行動特徴量を示すデータであって、摂餌行動に対応するデータであるとラベル付けされているデータである。本例において、学習部301は、残餌量算出部203が算出した餌の量が一定量未満である場合において、給餌前における運動特徴量及び屈曲特徴量と、摂餌後における残餌量とに基づいて対象生物の摂餌行動を学習し、摂餌行動に対応する運動特徴量及び屈曲特徴量を示すモデルを参照行動モデル31の少なくとも一部としてもよい。
 別の具体例として、学習部301は、異常行動データと正常行動データとを用いて、対象生物の行動特徴量に応じて対象生物の行動を正常行動と異常行動とのいずれかに分類するモデルを学習する。ここで、本例において水槽を用いて対象生物に対応する複数の水生生物から構成される水生生物群を飼育しているものとする。対象生物に対応する複数の水生生物の各々は、対象生物の性質と同じ又は類似する性質を有する水生生物であってもよい。異常行動データは、水槽内において異常数以上の水生生物群を構成する水生生物の死骸が検知された場合におけるデータであって、水槽内において異常数以上の水生生物群を構成する水生生物の死骸が検知される前における水生生物群を構成する各水生生物の行動特徴量を示すデータであって、異常行動に対応するデータであるとラベル付けされているデータである。正常行動データは、水槽内において異常数未満の水生生物群を構成する水生生物の死骸が検知された場合におけるデータであって、水槽内において異常数未満の水生生物群を構成する水生生物の死骸が検知される前における水生生物群を構成する各水生生物の行動特徴量を示すデータであって、正常行動に対応するデータであるとラベル付けされているデータである。本例において、学習部301は、死骸数算出部403が算出した死骸の数が一定数未満である場合において、死骸検知前における運動特徴量及び屈曲特徴量と、死骸数とに基づいて対象生物の異常行動を学習し、異常行動に対応する運動特徴量及び屈曲特徴量を示すモデルを参照行動モデル31の少なくとも一部としてもよい。
The learning unit 301 generates a model based on the movement feature amount and the curvature feature amount, and data indicating the results of detection by the remaining food detection unit 202 or the carcass detection unit 402 .
As a specific example, the learning unit 301 uses the non-feeding behavior data and the feeding behavior data to learn a model that classifies the behavior of the target organism into feeding behavior and non-feeding behavior according to the behavior feature of the target organism. The non-feeding behavior data is data when the amount of remaining food is equal to or greater than a reference remaining food amount, data showing the behavior feature of the learning aquatic organism before feeding, and data labeled as data corresponding to non-feeding behavior. The remaining food amount is the amount of food remaining after the feeding time has elapsed from the feeding time. The feeding time is a time set as the time required for the learning aquatic organism to eat the fed food. The learning aquatic organism is an aquatic organism corresponding to the target organism. The learning aquatic organism may be the target organism itself, or may be an aquatic organism having the same or similar properties as the target organism. The feeding behavior data is data when the amount of remaining food is less than a reference remaining food amount, data showing the behavior feature of the learning aquatic organism before feeding, and data labeled as data corresponding to feeding behavior. In this example, when the amount of food calculated by the remaining food amount calculation unit 203 is less than a certain amount, the learning unit 301 learns the feeding behavior of the target organism based on the movement features and curvature features before feeding and the amount of remaining food after feeding, and may use a model that shows the movement features and curvature features corresponding to the feeding behavior as at least a part of the reference behavior model 31.
As another specific example, the learning unit 301 uses the abnormal behavior data and the normal behavior data to learn a model that classifies the behavior of the target organism into either normal behavior or abnormal behavior according to the behavior feature of the target organism. Here, in this example, it is assumed that an aquarium is used to raise a group of aquatic organisms composed of a plurality of aquatic organisms corresponding to the target organism. Each of the plurality of aquatic organisms corresponding to the target organism may be an aquatic organism having the same or similar properties as the target organism. The abnormal behavior data is data when the carcasses of aquatic organisms constituting an abnormal number or more of aquatic organisms in the aquarium are detected, and is data indicating the behavior feature of each aquatic organism constituting the aquatic organisms before the carcasses of aquatic organisms constituting an abnormal number or more of aquatic organisms in the aquarium are detected, and is labeled as data corresponding to abnormal behavior. The normal behavior data is data when the corpses of aquatic organisms constituting the aquatic organism group less than an abnormal number are detected in the aquarium, and is data indicating behavioral features of each aquatic organism constituting the aquatic organism group before the corpses of the aquatic organisms constituting the aquatic organism group less than an abnormal number are detected in the aquarium, and is data labeled as data corresponding to normal behavior. In this example, when the number of corpses calculated by the corpse number calculation unit 403 is less than a certain number, the learning unit 301 learns abnormal behavior of the target organism based on the movement features and curvature features before corpse detection and the number of corpses, and may use a model indicating the movement features and curvature features corresponding to abnormal behavior as at least a part of the reference behavior model 31.
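As an illustrative sketch only, the labeling-and-learning idea for the feeding / non-feeding case could look like the following, here using scikit-learn's DecisionTreeClassifier as one possible learner (the text does not specify a learning algorithm); the feature vectors, thresholds, and leftover amounts are toy values.

```python
from sklearn.tree import DecisionTreeClassifier

def label_feeding_samples(samples, reference_leftover):
    """samples: list of (feature_vector, leftover_amount) pairs recorded around one
    feeding. A sample is labeled 'feeding' when the leftover measured after feeding
    is below the reference amount, otherwise 'non_feeding'."""
    X, y = [], []
    for features, leftover in samples:
        X.append(features)
        y.append("feeding" if leftover < reference_leftover else "non_feeding")
    return X, y

# Toy features: [swim_speed, tail_angle_change] observed before feeding.
samples = [([20.0, 40.0], 2.0), ([18.0, 35.0], 1.0),   # food mostly eaten
           ([3.0, 5.0], 50.0), ([2.0, 4.0], 60.0)]     # much food left over
X, y = label_feeding_samples(samples, reference_leftover=10.0)

model = DecisionTreeClassifier(max_depth=2).fit(X, y)   # learned reference model
print(model.predict([[19.0, 38.0]]))                    # -> ['feeding']
```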
 図12は、学習部301の処理を説明する図である。
図12の(a)は、学習部301が摂餌行動モデルを作成する具体例を示している。摂餌後において検知された残餌が一定量よりも少ない場合、即ち、対象生物が給餌された餌を十分に食べた場合において、学習部301は、摂餌行動モデルとして、給餌前における対象生物の行動が摂餌行動に分類されるようなモデルを生成する。
図12の(b)は、学習部301が非摂餌行動モデルを作成する具体例を示している。摂餌後において検知された残餌が一定量よりも多い場合、即ち、対象生物が給餌された餌を十分に食べなかった場合において、学習部301は、非摂餌行動モデルとして、給餌前における対象生物の行動が非摂餌行動に分類されるようなモデルを生成する。
 図12の(c)は、学習部301が異常行動モデルを作成する具体例を示している。検知された対象生物の死骸の数が一定数よりも多い場合において、学習部301は、異常行動モデルとして、一定数よりも多い死骸が検知される前における対象生物の行動が異常行動に分類されるようなモデルを生成する。
 なお、学習部301は、検知された対象生物の死骸の数が一定数よりも少ない場合において正常行動モデルを生成してもよい。
FIG. 12 is a diagram for explaining the processing of the learning unit 301. As shown in FIG.
FIG. 12(a) shows a specific example of the learning unit 301 creating a feeding behavior model. When the amount of remaining food detected after feeding is less than a certain amount, i.e., when the target organism has eaten a sufficient amount of the provided food, the learning unit 301 creates, as the feeding behavior model, a model in which the behavior of the target organism before feeding is classified as feeding behavior.
FIG. 12(b) shows a specific example of the learning unit 301 creating a non-feeding behavior model. When the amount of remaining food detected after feeding is greater than a certain amount, i.e., when the target organism has not eaten enough of the provided food, the learning unit 301 creates, as the non-feeding behavior model, a model in which the behavior of the target organism before feeding is classified as non-feeding behavior.
FIG. 12(c) shows a specific example of the learning unit 301 creating an abnormal behavior model. When the number of detected target organism corpses is greater than a certain number, the learning unit 301 creates, as the abnormal behavior model, a model in which the behavior of the target organism before the detection of more than the certain number of corpses is classified as abnormal behavior.
The learning unit 301 may generate a normal behavior model when the number of detected target creature corpses is less than a certain number.
***動作の説明***
 図13は、残餌量取得部200及び学習部301の動作の一例を示すフローチャートである。図13を参照して残餌量取得部200及び学習部301の動作を説明する。
*** Operation Description ***
FIG. 13 is a flowchart showing an example of the operation of the remaining feed amount acquisition unit 200 and the learning unit 301. The operation of the remaining feed amount acquisition unit 200 and the learning unit 301 will be described with reference to FIG. 13.
(ステップS21)
 映像取得部201は、カメラ1が出力した映像データを対象映像データとして取得する。本ステップはステップS1と同様である。なお、対象映像データは対象生物を飼育している水槽内を撮影した映像を示すものとする。
(Step S21)
The image acquisition unit 201 acquires the image data output by the camera 1 as target image data. This step is the same as step S1. Note that the target image data indicates an image captured inside an aquarium in which a target organism is kept.
(ステップS22)
 対象映像データが摂餌後におけるデータである場合、行動判定装置10はステップS23に進む。それ以外の場合、行動判定装置10はステップS21に戻る。
 なお、対象映像データが摂餌後におけるデータであることを行動判定装置10が把握している場合等において、行動判定装置10は本ステップをスキップしてもよい。
(Step S22)
If the target video data is data after feeding, the behavior determination apparatus 10 proceeds to step S23. Otherwise, the behavior determination apparatus 10 returns to step S21.
Note that, in cases where the behavior determination apparatus 10 is aware that the target video data is data taken after feeding, the behavior determination apparatus 10 may skip this step.
(ステップS23)
 残餌検知部202は、対象映像データが示す各フレームから残餌を検知する。
(Step S23)
The remaining food detection unit 202 detects remaining food from each frame represented by the target video data.
(ステップS24)
 ステップS23において残餌が検知された場合、行動判定装置10はステップS25に進む。それ以外の場合、行動判定装置10はステップS26に進む。
(Step S24)
If remaining food is detected in step S23, the behavior determination apparatus 10 proceeds to step S25. Otherwise, the behavior determination apparatus 10 proceeds to step S26.
(ステップS25)
残餌量算出部203は、ステップS23において検知された残餌の量を算出する。
(Step S25)
The remaining food amount calculation section 203 calculates the amount of remaining food detected in step S23.
(ステップS26)
学習部301は、給餌前における運動特徴量と屈曲特徴量との各々を示す情報を取得する。
(Step S26)
The learning unit 301 acquires information indicating each of the movement feature amount and the bending feature amount before feeding.
(ステップS27、ステップS28、ステップS29)
 ステップS25において算出された残餌の量が一定量以上である場合、学習部301はステップS26において取得した情報に対して非摂餌行動とラベル付けする。それ以外の場合、学習部301はステップS26において取得した情報に対して摂餌行動とラベル付けする。学習部301は個体の情報毎にラベル付けしてもよい。
 なお、ステップS28及びステップS29においてラベル付けされたデータは学習データに当たる。非摂餌行動とラベル付けされたデータは非摂餌行動データに当たる。摂餌行動とラベル付けされたデータは摂餌行動データに当たる。
(Step S27, Step S28, Step S29)
If the amount of remaining food calculated in step S25 is equal to or greater than a certain amount, the learning unit 301 labels the information acquired in step S26 as a non-feeding behavior. Otherwise, the learning unit 301 labels the information acquired in step S26 as a feeding behavior. The learning unit 301 may label each piece of information about an individual.
The data labeled in steps S28 and S29 corresponds to learning data, the data labeled as non-feeding behavior corresponds to non-feeding behavior data, and the data labeled as feeding behavior corresponds to feeding behavior data.
(ステップS30)
 学習データの量が十分である場合、行動判定装置10はステップS31に進む。それ以外の場合、行動判定装置10はステップS21に戻る。
(Step S30)
If the amount of learning data is sufficient, the behavior determination apparatus 10 proceeds to step S31, otherwise, the behavior determination apparatus 10 returns to step S21.
(ステップS31)
 学習部301は、ステップS28及びステップS29においてラベル付けした学習データを用いて、対象生物の行動と、ラベルとの関係を学習する。
(Step S31)
The learning unit 301 uses the learning data labeled in steps S28 and S29 to learn the relationship between the behavior of the target organism and the label.
(ステップS32)
 学習部301は、ステップS31において学習した結果に基づいて、参照行動モデル31に含めるモデルとして摂餌行動モデルを作成する。なお、ステップS31において学習した結果が摂餌行動モデルそのものであってもよい。
(Step S32)
Based on the result of learning in step S31, the learning unit 301 creates a feeding behavior model as a model to be included in the reference behavior model 31. Note that the result of learning in step S31 may be the feeding behavior model itself.
 図14は、死骸数取得部400及び学習部301の動作の一例を示すフローチャートである。図14を参照して死骸数取得部400及び学習部301の動作を説明する。 FIG. 14 is a flowchart showing an example of the operation of the corpse number acquisition unit 400 and the learning unit 301. The operation of the corpse number acquisition unit 400 and the learning unit 301 will be described with reference to FIG. 14.
(ステップS41)
 映像取得部401は、カメラ1が出力した映像データを対象映像データとして取得する。本ステップはステップS21と同様である。
(Step S41)
The image acquisition unit 401 acquires, as target image data, the image data output by the camera 1. This step is similar to step S21.
(ステップS42)
 死骸検知部402は、対象映像データが示す各フレームから対象生物の死骸を検知する。
(Step S42)
The corpse detection unit 402 detects the corpse of a target organism from each frame represented by the target video data.
(ステップS43)
 ステップS42において死骸が検知された場合、行動判定装置10はステップS44に進む。それ以外の場合、行動判定装置10はステップS45に進む。
(Step S43)
If a corpse is detected in step S42, the behavior determination apparatus 10 proceeds to step S44. Otherwise, the behavior determination apparatus 10 proceeds to step S45.
(ステップS44)
死骸数算出部403は、ステップS42において検知された死骸の数を算出する。
(Step S44)
Corpse number calculation section 403 calculates the number of corpses detected in step S42.
(ステップS45)
 学習部301は、死骸が検知される前における運動特徴量と屈曲特徴量との各々を示す情報を取得する。学習部301は、死骸と推定される個体に関する情報のみを取得してもよい。死骸と推定される個体は、具体例として、水面に浮いている個体、又は表面の色が変色している個体である。
(Step S45)
The learning unit 301 acquires information indicating each of the motion feature and the curvature feature before the detection of a corpse. The learning unit 301 may acquire only information regarding an individual estimated to be a corpse. Specific examples of an individual estimated to be a corpse include an individual floating on the water surface, or an individual whose surface color has changed.
(ステップS46、ステップS47、ステップS48)
ステップS44において算出された死骸の数が一定数以上である場合、学習部301はステップS45において取得した情報に対して異常行動とラベル付けする。それ以外の場合、学習部301はステップS45において取得した情報に対して正常行動とラベル付けする。学習部301は個体の情報毎にラベル付けしてもよい。
 なお、ステップS47及びステップS48においてラベル付けされたデータは学習データに当たる。異常行動とラベル付けされたデータは異常行動データに当たる。正常行動とラベル付けされたデータは正常行動データに当たる。
(Step S46, Step S47, Step S48)
If the number of corpses calculated in step S44 is equal to or greater than a certain number, the learning unit 301 labels the information acquired in step S45 as abnormal behavior. Otherwise, the learning unit 301 labels the information acquired in step S45 as normal behavior. The learning unit 301 may label each piece of information about an individual.
The data labeled in steps S47 and S48 corresponds to learning data, the data labeled as abnormal behavior corresponds to abnormal behavior data, and the data labeled as normal behavior corresponds to normal behavior data.
(ステップS49)
 学習データの量が十分である場合、行動判定装置10はステップS50へ進む。それ以外の場合、行動判定装置10はステップS41へ戻る。
(Step S49)
If the amount of learning data is sufficient, the behavior determining apparatus 10 proceeds to step S50, otherwise, the behavior determining apparatus 10 returns to step S41.
(ステップS50)
 学習部301は、ステップS47及びステップS48においてラベル付けした学習データを用いて、対象生物の行動と、ラベルとの関係を学習する。
(Step S50)
The learning unit 301 uses the learning data labeled in steps S47 and S48 to learn the relationship between the behavior of the target organism and the label.
(ステップS51)
 学習部301は、ステップS50において学習した結果に基づいて、参照行動モデル31に含めるモデルとして異常行動モデルを作成する。なお、ステップS50において学習した結果が異常行動モデルそのものであってもよい。
(Step S51)
Based on the result of learning in step S50, the learning unit 301 creates an abnormal behavior model as a model to be included in the reference behavior model 31. Note that the result of learning in step S50 may be the abnormal behavior model itself.
***実施の形態2の効果の説明***
 以上のように、本実施の形態によれば、対象生物の行動と、水槽内の様子とに基づいて、対象生物の行動を分類することに用いられるモデルを生成することができる。
***Description of Effect of Second Embodiment***
As described above, according to this embodiment, a model used to classify the behavior of a target organism can be generated based on the behavior of the target organism and the state inside the aquarium.
***Other Embodiments***
The embodiments described above may be freely combined, any component of each embodiment may be modified, and any component of each embodiment may be omitted.
The embodiments are not limited to those shown in Embodiments 1 and 2, and various modifications may be made as necessary. The procedures described using the flowcharts and the like may be modified as appropriate.
1 camera, 9 behavior determination system, 10 behavior determination device, 19 processing circuit, 20 processor, 21 video acquisition unit, 22 target organism detection unit, 23 skeleton extraction unit, 24 target organism tracking unit, 25 motion feature calculation unit, 26 bending feature calculation unit, 27 behavior determination unit, 28 growth condition control unit, 30 storage device, 31 reference behavior model, 40 communication device, 50 input/output interface, 60 automatic feeding device, 70 growth environment control device, 200 remaining feed amount acquisition unit, 201 video acquisition unit, 202 remaining feed detection unit, 203 remaining feed amount calculation unit, 301 learning unit, 400 corpse number acquisition unit, 401 video acquisition unit, 402 corpse detection unit, 403 corpse number calculation unit.

Claims (10)

1. A behavior determination device comprising:
a behavior determination unit that determines the behavior of a target organism, the target organism being an aquatic organism, using behavior features, which are features calculated based on the time transition of at least one of the position and the bending of each joint of the target organism and which correspond to the behavior of the target organism, and a reference behavior model, which is composed of a model indicating a classification of the behavior of the target organism according to the behavior features; and
a growth condition control unit that generates, in accordance with the determined behavior of the target organism, data used by a device that feeds the target organism to decide whether or not to feed the target organism.
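Read as pseudocode, claim 1 amounts to "classify the behavior, then emit data telling the feeder whether to feed". The sketch below assumes a reference_model object with a classify(features) method and a FeedCommand record; both names are hypothetical and are not part of the claims.

```python
from dataclasses import dataclass
from typing import Sequence

@dataclass
class FeedCommand:
    """Data handed to the automatic feeding device 60 (fields are illustrative)."""
    feed: bool
    reason: str

def decide_feeding(behavior_features: Sequence[float], reference_model) -> FeedCommand:
    """Behavior determination unit plus growth condition control unit in miniature:
    classify the behavior against the reference behavior model, then generate the
    data used to decide whether or not to feed."""
    behavior = reference_model.classify(behavior_features)   # e.g. "feeding" or "non-feeding"
    return FeedCommand(feed=(behavior == "feeding"),
                       reason=f"determined behavior: {behavior}")
```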
2. The behavior determination device according to claim 1, wherein the behavior features comprise at least one of:
a motion feature, which is a feature corresponding to at least one of the movement, position, and posture of the target organism; and
a bending feature, which is a feature corresponding to at least one of the bending of a joint of the target organism and a joint, among the joints of the target organism, that moves in conjunction with the bending of a target joint.
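The motion and bending features of claim 2 can be computed from joint time series in many ways. The sketch below shows one plausible choice (mean speed and net displacement as the motion feature, mean inter-joint angles as the bending feature); it is an assumption, not the definition used in the claims.

```python
import numpy as np

def motion_feature(centroids: np.ndarray) -> np.ndarray:
    """Motion feature from a (T, 2) series of body centroids: mean per-frame
    speed and net displacement over the window."""
    steps = np.diff(centroids, axis=0)                  # per-frame displacement vectors
    speeds = np.linalg.norm(steps, axis=1)
    net_displacement = np.linalg.norm(centroids[-1] - centroids[0])
    return np.array([speeds.mean(), net_displacement])

def bending_feature(joints: np.ndarray) -> np.ndarray:
    """Bending feature from a (T, J, 2) series of joint positions: the mean
    angle at every interior joint over the window."""
    a, b, c = joints[:, :-2], joints[:, 1:-1], joints[:, 2:]      # consecutive joint triples
    v1, v2 = a - b, c - b
    cos = np.sum(v1 * v2, axis=-1) / (
        np.linalg.norm(v1, axis=-1) * np.linalg.norm(v2, axis=-1) + 1e-9)
    angles = np.arccos(np.clip(cos, -1.0, 1.0))                   # (T, J-2) angles in radians
    return angles.mean(axis=0)                                    # average bend per interior joint
```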
3. The behavior determination device according to claim 2, further comprising:
a skeleton extraction unit that extracts each joint of the target organism in each frame represented by video data showing a plurality of frames in which the target organism appears, and generates information indicating each of the extracted joints as skeleton information;
a target organism tracking unit that tracks the target organism based on the skeleton information and generates information indicating the result of the tracking as tracking information;
a motion feature calculation unit that calculates the motion feature of the target organism based on the tracking information; and
a bending feature calculation unit that calculates the bending feature of the target organism based on the skeleton information.
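Claim 3 fixes a data flow between the skeleton extraction, tracking, and feature calculation units. The sketch below wires that flow together with placeholder callables; the dictionary returned by the tracking step is an assumed shape, not something specified by the claims.

```python
from typing import Callable, Dict, Iterable, List

def behavior_feature_pipeline(frames: Iterable,
                              extract_skeleton: Callable,   # frame -> per-individual joint arrays
                              track: Callable,              # skeletons over time -> {track_id: trajectory}
                              motion_feature: Callable,
                              bending_feature: Callable) -> List[Dict]:
    """Data flow of claim 3: skeleton extraction per frame, tracking across
    frames, then motion features from the tracking information and bending
    features from the skeleton information."""
    skeletons_per_frame = [extract_skeleton(f) for f in frames]   # skeleton extraction unit 23
    trajectories = track(skeletons_per_frame)                     # target organism tracking unit 24
    return [{"track_id": tid,                                     # feature calculation units 25 and 26
             "motion": motion_feature(traj["centroids"]),
             "bending": bending_feature(traj["joints"])}
            for tid, traj in trajectories.items()]
```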
4. The behavior determination device according to any one of claims 1 to 3, wherein the growth condition control unit generates data used to adjust the growth environment of the target organism in accordance with the determined behavior of the target organism.
5. The behavior determination device according to any one of claims 1 to 4, wherein the reference behavior model includes at least one of a feeding behavior model indicating an index for classifying, according to the behavior features, the behavior of the target organism as feeding behavior, which is the behavior of the target organism when the target organism desires feed, and a non-feeding behavior model indicating an index for classifying, according to the behavior features, the behavior of the target organism as non-feeding behavior, which is behavior other than the feeding behavior.
6. The behavior determination device according to claim 5, further comprising:
a learning unit that learns a model for classifying, according to the behavior features of the target organism, the behavior of the target organism as either the feeding behavior or the non-feeding behavior, using:
non-feeding behavior data, which is data obtained when a remaining feed amount, being the amount of feed remaining after a feeding time has elapsed from the time of feeding, the feeding time being a time set as the time required for a learning aquatic organism, which is an aquatic organism corresponding to the target organism, to consume the fed feed, is equal to or greater than a reference remaining feed amount, the data indicating the behavior features of the learning aquatic organism before feeding and being labeled as data corresponding to the non-feeding behavior; and
feeding behavior data, which is data obtained when the remaining feed amount is less than the reference remaining feed amount, the data indicating the behavior features of the learning aquatic organism before feeding and being labeled as data corresponding to the feeding behavior.
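The labelling rule of claim 6 can be sketched as a single comparison of the remaining feed amount against the reference remaining feed amount, applied to the behavior features recorded before feeding; the field names and example values are illustrative only.

```python
from typing import Dict

def label_by_remaining_feed(pre_feeding_features: Dict,
                            remaining_feed: float,
                            reference_remaining_feed: float) -> Dict:
    """Claim 6 in miniature: after the configured feeding time has elapsed since
    feeding, a large amount of leftover feed marks the pre-feeding features as
    non-feeding behavior data, and a small amount marks them as feeding behavior data."""
    label = "non-feeding" if remaining_feed >= reference_remaining_feed else "feeding"
    return {**pre_feeding_features, "label": label}

# Example: most of the feed is left over, so the fish were presumably not hungry.
sample = label_by_remaining_feed({"motion": [0.1, 0.2], "bending": [0.4, 0.5]},
                                 remaining_feed=0.8, reference_remaining_feed=0.5)
```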
7. The behavior determination device according to any one of claims 1 to 4, wherein the reference behavior model includes at least one of a normal behavior model indicating an index for classifying, according to the behavior features, the behavior of the target organism as normal behavior, which is the behavior of the target organism when the target organism is in good health, and an abnormal behavior model indicating an index for classifying, according to the behavior features, the behavior of the target organism as abnormal behavior, which is behavior other than the normal behavior.
8. The behavior determination device according to claim 7, further comprising:
a learning unit that, when an aquatic organism group composed of a plurality of aquatic organisms corresponding to the target organism is reared in an aquarium, learns a model for classifying, according to the behavior features of the target organism, the behavior of the target organism as either the normal behavior or the abnormal behavior, using:
abnormal behavior data, which is data obtained when corpses of aquatic organisms constituting the aquatic organism group are detected in the aquarium in a number equal to or greater than an abnormal number, the data indicating the behavior features of each aquatic organism constituting the aquatic organism group before those corpses are detected and being labeled as data corresponding to the abnormal behavior; and
normal behavior data, which is data obtained when corpses of aquatic organisms constituting the aquatic organism group are detected in the aquarium in a number less than the abnormal number, the data indicating the behavior features of each aquatic organism constituting the aquatic organism group before those corpses are detected and being labeled as data corresponding to the normal behavior.
9. A behavior determination method comprising:
determining, by a computer, the behavior of a target organism, the target organism being an aquatic organism, using behavior features, which are features calculated based on the time transition of at least one of the position and the bending of each joint of the target organism and which correspond to the behavior of the target organism, and a reference behavior model, which is composed of a model indicating a classification of the behavior of the target organism according to the behavior features; and
generating, by the computer, in accordance with the determined behavior of the target organism, data used by a device that feeds the target organism to decide whether or not to feed the target organism.
10. A behavior determination program that causes a behavior determination device, which is a computer, to execute:
a behavior determination process of determining the behavior of a target organism, the target organism being an aquatic organism, using behavior features, which are features calculated based on the time transition of at least one of the position and the bending of each joint of the target organism and which correspond to the behavior of the target organism, and a reference behavior model, which is composed of a model indicating a classification of the behavior of the target organism according to the behavior features; and
a growth condition control process of generating, in accordance with the determined behavior of the target organism, data used by a device that feeds the target organism to decide whether or not to feed the target organism.
PCT/JP2022/036590 2022-09-29 2022-09-29 Action determination device, action determination method, and action determination program WO2024069898A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036590 WO2024069898A1 (en) 2022-09-29 2022-09-29 Action determination device, action determination method, and action determination program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036590 WO2024069898A1 (en) 2022-09-29 2022-09-29 Action determination device, action determination method, and action determination program

Publications (1)

Publication Number Publication Date
WO2024069898A1 true WO2024069898A1 (en) 2024-04-04

Family

ID=90476886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036590 WO2024069898A1 (en) 2022-09-29 2022-09-29 Action determination device, action determination method, and action determination program

Country Status (1)

Country Link
WO (1) WO2024069898A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3101938B2 (en) * 1996-03-27 2000-10-23 株式会社日立製作所 Automatic feeding device and method for aquatic organisms
JP3462412B2 (en) * 1999-01-18 2003-11-05 株式会社日立製作所 Automatic feeding device for aquatic organisms
CN108055479A (en) * 2017-12-28 2018-05-18 暨南大学 A kind of production method of animal behavior video
JP6739049B2 (en) * 2018-11-14 2020-08-12 株式会社 アイエスイー Automatic feeding method for farmed fish and automatic feeding system
JP2021511012A (en) * 2018-05-03 2021-05-06 エックス デベロップメント エルエルシー Fish measurement station management
JP6945663B2 (en) * 2020-01-23 2021-10-06 ソフトバンク株式会社 Estimator program, estimation method and information processing device
JP7077496B1 (en) * 2021-03-26 2022-05-31 株式会社マイスティア Feeding system, feeding method, and sound determination model
JP7101424B2 (en) * 2020-08-06 2022-07-15 曁南大学 Measurement and display method of muscle deformation in the motor process of aquatic animals
