WO2024069898A1 - Behavior determination device, behavior determination method, and behavior determination program

Behavior determination device, behavior determination method, and behavior determination program

Info

Publication number
WO2024069898A1
Authority
WO
WIPO (PCT)
Prior art keywords
behavior
target organism
feeding
data
target
Prior art date
Application number
PCT/JP2022/036590
Other languages
English (en)
Japanese (ja)
Inventor
葉子 田内
亮史 服部
孝之 小平
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2022/036590
Publication of WO2024069898A1

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K - ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00 - Other apparatus for animal husbandry
    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K - ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00 - Culture of aquatic animals
    • A01K61/80 - Feeding devices

Definitions

  • This disclosure relates to a behavior determination device, a behavior determination method, and a behavior determination program.
  • Patent Document 1 discloses a technology for automating a land-based aquaculture system with the aim of reducing the labor required for manual tasks.
  • The technology disclosed in Patent Document 1 determines, from a captured image, the three-dimensional position of each aquatic organism (such as a fish) in an aquarium and individual characteristic data of each organism (such as its size and body shape), tracks each organism based on the determined position and characteristic data, and monitors the growth state of the organism based on the tracking results.
  • the technology is characterized by having a means for controlling the amount of feed and growth environment conditions to achieve optimal growth state using statistically managed data.
  • Patent Document 1 does not take into consideration the timing of feeding. As a result, this technology has the problem that feeding is not always carried out at an appropriate time, and therefore efficient feeding may not be possible.
  • the present disclosure aims to provide data that enables an automatic feeding device to feed cultivated aquatic organisms at appropriate times.
  • the behavior determination device comprises: a behavior determination unit that determines the behavior of a target organism using behavior features that are calculated based on time transitions of at least one of the positions and bending of each joint of the target organism, the behavior features corresponding to the behavior of the target organism, and a reference behavior model that is a model that indicates a classification of the behavior of the target organism according to the behavior features;
  • the device includes a growth condition control unit that generates data for determining whether or not the device that feeds the target organism should feed the target organism based on the determined behavior of the target organism.
  • the behavior determination unit determines the behavior of the target organism in accordance with the behavioral features, and the growth condition control unit generates data in accordance with the behavior of the target organism for a device that feeds the target organism to decide whether or not to feed the target organism.
  • the device corresponds to an automatic feeding device, and the automatic feeding device can feed the target organism at an appropriate time by deciding whether or not to feed the target organism based on the data.
  • data can be provided that enables the automatic feeding device to feed the cultivated aquatic organism at an appropriate time.
  • FIG. 1 is a diagram showing an example of the configuration of a behavior determination system 9 according to a first embodiment.
  • FIG. 2 is a diagram for explaining the processing of a skeleton extraction unit 23 according to the first embodiment.
  • FIG. 3 is a diagram for explaining the processing of the behavior determination unit 27 in the first embodiment, in which (a) shows a specific example when the target organism is a shrimp, and (b) shows a specific example when the target organism is a catfish.
  • FIG. 4 is a diagram for explaining a feeding behavior model according to the first embodiment.
  • FIG. 5 is a diagram for explaining a non-feeding behavior model according to the first embodiment.
  • FIG. 6 is a diagram for explaining a normal behavior model according to the first embodiment.
  • FIG. 7 is a diagram for explaining an abnormal behavior model according to the first embodiment.
  • FIG. 8 is a diagram showing an example of the hardware configuration of the behavior determination device 10 according to the first embodiment.
  • FIG. 9 is a flowchart showing the operation of the behavior determination device 10 according to the first embodiment.
  • FIG. 10 is a diagram showing an example of the hardware configuration of the behavior determination device 10 according to a modification of the first embodiment.
  • FIG. 11 is a diagram showing a configuration example of a behavior determination system 9 according to a second embodiment.
  • FIG. 12 is a diagram for explaining the processing of the learning unit 301 in the second embodiment, where FIG. 12A shows an example of creating a feeding behavior model, FIG. 12B shows an example of creating a non-feeding behavior model, and FIG. 12C shows an example of creating an abnormal behavior model.
  • FIG. 13 is a flowchart showing the operation of the remaining food amount acquisition unit 200 and the learning unit 301 according to the second embodiment.
  • FIG. 14 is a flowchart showing the operation of the corpse number acquisition unit 400 and the learning unit 301 according to the second embodiment.
  • the behavior determination system 9 includes a camera 1, a behavior determination device 10, an automatic feeding device 60, and a growth environment control device 70.
  • the behavior determination device 10 has a function of determining the behavior of an aquatic organism.
  • the aquatic organism typically refers to an aquatic organism cultivated in an aquarium.
  • the behavior determination device 10 includes a video acquisition unit 21, a target organism detection unit 22, a skeleton extraction unit 23, a target organism tracking unit 24, a movement feature amount calculation unit 25, a bending feature amount calculation unit 26, a behavior determination unit 27, and a growth condition control unit 28, and stores a reference behavior model 31.
  • the behavior determination device 10 is connected to a camera 1, an automatic feeding device 60, and a growth environment control device 70.
  • the camera 1 is a device for photographing aquatic organisms, and there may be a plurality of cameras.
  • the camera 1 outputs video data representing images photographed by the camera 1.
  • the video data is data representing a plurality of frames, and as a specific example, is data of a moving image.
  • the automatic feeding device 60 is a device that feeds target organisms, and is a device that automatically feeds aquatic organisms.
  • the target organisms are aquatic organisms having joints, and specific examples thereof include fish and crustaceans.
  • the growth environment control device 70 is a device that controls the growth environment of aquatic organisms.
  • the video acquisition unit 21 acquires the video data output by the camera 1.
  • the target organism detection unit 22 detects a target organism in each frame indicated by the video data acquired by the video acquisition unit 21.
  • the target organism detection unit 22 may detect the location of the target organism in each frame by using a trained model based on AI (Artificial Intelligence) technology.
  • the trained model is a model that receives data indicating a frame as input and outputs the result of detecting the target organism.
  • When there are multiple frames in which a target organism is detected, the target organism detection unit 22 generates, as detection information, information indicating the target organism detected in each frame in which the target organism is detected.
  • the skeleton extraction unit 23 extracts the skeleton of the target organism indicated by the detection information generated by the target organism detection unit 22, and generates skeleton information indicating the extracted skeleton. At this time, typically, the skeleton extraction unit 23 extracts each joint of the target organism in each frame indicated by the video data, and generates information indicating each extracted joint as skeleton information. At this time, the video data indicates a plurality of frames in which the target organism is shown.
  • the skeleton information may be information indicating the position and bending of each joint, or may be information indicating parts such as the eyes, each fin, the tail, and the legs.
  • the skeleton extraction unit 23 may change the skeleton to be extracted depending on the type of the target organism.
  • Fig. 2 is a diagram for explaining the processing of the skeleton extraction unit 23.
  • the skeleton extraction unit 23 extracts each part of the skeleton of the target organism, that is, the left eye, the right eye, the left pectoral fin, the right pectoral fin, the pelvic fin, the center of the dorsal fin, the anal fin, the base of the caudal fin, and the tip of the caudal fin, as shown in Fig. 2.
  • the target organism tracking unit 24 tracks the target organism based on the skeletal information, and generates information indicating the tracking results as tracking information. Specifically, the target organism tracking unit 24 tracks the target organism based on multiple frames indicated by the video data acquired by the video acquisition unit 21 and the skeletal information generated by the skeletal extraction unit 23, and generates tracking information indicating the tracking results. At this time, the target organism tracking unit 24 sets, as a specific example, the reference point for tracking to a certain point within the head of the target organism. As a specific example, the certain point is a point corresponding to the mouth or eyes, etc.
  • the tracking information indicates, as a specific example, the distance and time traveled by the target organism.
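  • As an illustration only (not part of the disclosed implementation), the following Python sketch links head reference points across frames by nearest-neighbour matching to produce per-individual tracks; the data layout, the max_jump threshold, and the function name track are assumptions introduced here.

```python
# Minimal nearest-neighbour tracking sketch (illustrative assumption, not the
# patent's implementation). Each detection is the (x, y) position of a head
# reference point (e.g. the mouth or an eye) in one frame.
import math

def track(frames, max_jump=50.0):
    """frames: list of lists of (x, y) head points, one inner list per frame.
    Returns a dict track_id -> list of (frame_index, (x, y))."""
    tracks = {}
    next_id = 0
    previous = {}  # track_id -> last known (x, y) of that individual
    for t, detections in enumerate(frames):
        assigned = set()
        for point in detections:
            # Find the closest previously tracked head point within max_jump.
            best_id, best_dist = None, max_jump
            for tid, last in previous.items():
                if tid in assigned:
                    continue
                d = math.dist(point, last)
                if d < best_dist:
                    best_id, best_dist = tid, d
            if best_id is None:            # no close track: start a new one
                best_id = next_id
                next_id += 1
                tracks[best_id] = []
            tracks[best_id].append((t, point))
            previous[best_id] = point
            assigned.add(best_id)
    return tracks
```

  • From each returned track, the distance and time travelled by the target organism can then be read off, corresponding to the tracking information described above.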
  • The movement feature amount calculation unit 25 calculates the movement feature amount of the target organism based on the tracking information generated by the target organism tracking unit 24.
  • The movement feature amount is a feature amount related to the movement of the target organism, and corresponds to at least one of the movement, position, and posture of the target organism.
  • The movement feature amount is composed of at least one of a value indicating the position of the target organism, a value indicating the posture of the target organism, a value indicating the swimming distance of the target organism, a value indicating the swimming speed of the target organism, and a value indicating the acceleration of the target organism while swimming.
  • The movement feature amount calculation unit 25 may change the feature amount calculated depending on the items of the reference behavior model 31.
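  • For illustration, assuming the tracking information provides one (x, y) head position per frame at a known frame rate, the movement feature amounts named above could be computed as in the following sketch; the field names and frame rate are assumptions.

```python
# Illustrative computation of movement feature amounts from a tracked head
# position. Assumes at least two tracked points and a fixed frame rate (fps).
import math

def movement_features(points, fps=30.0):
    dt = 1.0 / fps
    distances = [math.dist(a, b) for a, b in zip(points, points[1:])]
    speeds = [d / dt for d in distances]                        # per-step speed
    accelerations = [(v2 - v1) / dt for v1, v2 in zip(speeds, speeds[1:])]
    return {
        "position": points[-1],                      # latest position
        "swimming_distance": sum(distances),         # total distance travelled
        "swimming_speed": sum(speeds) / len(speeds),           # mean speed
        "swimming_acceleration": max(accelerations, default=0.0),
    }
```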
  • the bending feature amount calculation unit 26 calculates the bending feature amount of the target organism based on the skeleton information generated by the skeleton extraction unit 23.
  • the bending feature amount calculation unit 26 may use the tracking information generated by the target organism tracking unit 24 when calculating the bending feature amount.
  • The bending feature amount is a feature amount related to the bending of the joints of the target organism, and corresponds to at least one of the bending of the joints of the target organism and the movement of joints that accompanies that bending.
  • the bending feature amount is composed of at least one of a value indicating each of the bending direction and degree of each of at least some of the joints of the target organism, a value indicating each of the angular velocity and angular acceleration at the time of bending of each of the bent joints of the joints of the target organism, and a value indicating each of the movement amount, movement speed, and movement direction of each of the joints that have moved with the bending of the joints of the target organism.
  • the bending feature amount calculation unit 26 may change the feature amount calculated according to the item of the reference behavior model 31.
  • As a specific example, the bending feature amount calculation unit 26 calculates the movement trajectory of each joint of the target organism, and calculates the movement distance and movement direction of each joint based on the calculated trajectory. Note that the movement trajectory calculated by the bending feature amount calculation unit 26 generally differs depending on the type, age, etc. of the target organism.
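  • As a non-limiting sketch, the bending feature amounts could be derived from the extracted keypoints as follows; the keypoint names mirror the parts listed for FIG. 2, while the dictionary layout and frame rate are assumptions.

```python
# Illustrative bending-feature sketch: joint angle at the caudal-fin base and
# its angular velocity over time. Keypoint names and layout are assumptions.
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by points a-b-c."""
    ang = math.degrees(math.atan2(c[1] - b[1], c[0] - b[0])
                       - math.atan2(a[1] - b[1], a[0] - b[0]))
    # Normalise to [0, 180] degrees.
    return abs(ang if -180 <= ang <= 180 else ang - math.copysign(360, ang))

def bending_features(skeleton_frames, fps=30.0):
    """skeleton_frames: per-frame dicts of keypoint name -> (x, y)."""
    dt = 1.0 / fps
    angles = [joint_angle(f["dorsal_fin_center"], f["caudal_fin_base"],
                          f["caudal_fin_tip"]) for f in skeleton_frames]
    angular_velocity = [(a2 - a1) / dt for a1, a2 in zip(angles, angles[1:])]
    return {"caudal_angle": angles, "caudal_angular_velocity": angular_velocity}
```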
  • the behavior determination unit 27 determines the behavior of the target organism using the behavior feature amount and the reference behavior model 31.
  • the behavior feature amount is a feature amount calculated based on the time progression of at least one of the position and bending of each joint of the target organism, and corresponds to the behavior of the target organism.
  • the behavior feature amount is composed of at least one of a movement feature amount and a bending feature amount.
  • the behavior determination unit 27 determines what type of behavior the target organism is performing by applying the calculated movement feature amount and bending feature amount to the reference behavior model 31.
  • As a specific example, the behavior determination unit 27 determines the behavior of the target organism as feeding behavior when the swimming speed of the target organism is faster than a predetermined speed, the orientation of the target organism's head is within a predetermined angle from the vertical upward direction, and the target organism wags its tail at intervals shorter than a predetermined interval. Feeding behavior is the behavior of the target organism when it desires food. As another specific example, the behavior determination unit 27 determines the behavior of the target organism as abnormal when the swimming speed of the target organism is slower than a predetermined speed and the abdomen of the target organism is vertically above the dorsal fin of the target organism. Abnormal behavior is behavior that is not normal, and specific examples include behavior of the target organism when the target organism is in a weakened state, is under high stress, or is ill. Normal behavior is behavior of the target organism when the target organism is in good health.
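  • A minimal rule-based sketch of this kind of decision is shown below; the 15 cm/s speed threshold mirrors the feeding behavior index example given later, while the angle, interval, and low-speed thresholds are placeholders rather than values from this disclosure.

```python
# Rule-of-thumb sketch of the behaviour decision described above. The threshold
# values are placeholders, not figures defined by the patent.
def classify_behavior(speed_cm_s, head_angle_from_vertical_deg,
                      tail_wag_interval_s, abdomen_above_dorsal_fin):
    if (speed_cm_s > 15.0                            # faster than a set speed
            and head_angle_from_vertical_deg < 30.0  # head pointing roughly up
            and tail_wag_interval_s < 0.5):          # tail wagging rapidly
        return "feeding"
    if speed_cm_s < 2.0 and abdomen_above_dorsal_fin:  # slow and upside-down
        return "abnormal"
    return "normal"
```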
  • FIG. 3 is a diagram for explaining the process of the behavior determination unit 27.
  • the target organism is a shrimp.
  • the behavior determination unit 27 determines the shrimp's behavior as a feeding behavior.
  • the behavior determination unit 27 determines the shrimp's behavior as a non-feeding behavior.
  • Non-feeding behavior is behavior that is not a feeding behavior.
  • FIG. 3(b) shows a specific example in which the target organism is a catfish. When the catfish frequently moves its tail, the behavior determination unit 27 determines that the behavior of the catfish is normal. On the other hand, when the catfish hardly moves its tail, the behavior determination unit 27 determines that the behavior of the catfish is abnormal.
  • the growth condition control unit 28 generates data for the device that feeds the target organism to determine whether or not to feed the target organism, according to the behavior of the target organism determined by the behavior determination unit 27.
  • the data may be data indicating whether or not to feed the target organism, data indicating the amount of feeding, data indicating whether or not there is a target organism that is performing a behavior determined to be a feeding behavior, data indicating each of the total number of target organisms performing a behavior determined to be a feeding behavior and the total number of target organisms in the aquarium, or data indicating the processing results of each unit included in the behavior determination device 10.
  • the automatic feeding device 60 determines not to feed the target organism when the amount of feeding is 0, and determines to feed the target organism when the amount of feeding is not 0.
  • the growth condition control unit 28 may generate data indicating the amount of adjustment of each of the automatic feeding device 60 and the growth environment control device 70 based on the determination result of the behavior determination unit 27, and output the generated data.
  • the growth condition control unit 28 calculates at least one of the feeding timing and feeding amount for the target organism according to the determined behavior of the target organism, and also generates data used to adjust the growth environment of the target organism according to the determined behavior of the target organism.
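  • A minimal sketch of the kind of data the growth condition control unit 28 could output is given below; the field names and the simple decision rule are assumptions introduced here for illustration.

```python
# Minimal sketch (assumed field names) of the data the growth condition control
# unit could hand to the automatic feeding device.
from dataclasses import dataclass

@dataclass
class FeedingControlData:
    feed: bool                 # whether to feed at all
    feeding_amount: float      # 0 means "do not feed" (see the rule above)
    num_feeding_behavior: int  # organisms judged to show feeding behaviour
    num_total: int             # organisms detected in the aquarium

def make_control_data(behaviors, reference_amount):
    feeding = sum(1 for b in behaviors if b == "feeding")
    amount = reference_amount if feeding > 0 else 0.0
    return FeedingControlData(feed=amount > 0.0, feeding_amount=amount,
                              num_feeding_behavior=feeding,
                              num_total=len(behaviors))
```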
  • the reference behavior model 31 is a model that indicates a classification of the behavior of a target organism according to behavioral features, and is used to classify the behavior of aquatic organisms.
  • The reference behavior model 31 may be a model expressed in a table format, or may be a classifier based on machine learning. FIGS. 4 to 7 show specific examples of the reference behavior model 31.
  • the reference behavior model 31 will be described below with reference to Figs. 4 to 7. Note that the feeding behavior model, the non-feeding behavior model, the normal behavior model, and the abnormal behavior model are each a specific example of a model included in the reference behavior model 31.
  • FIG. 4 shows a specific example of a feeding behavior model.
  • the feeding behavior model is a model used to determine whether the behavior of a target organism is feeding behavior, and as a specific example, is a model that shows indices for classifying the behavior of the target organism as feeding behavior according to behavioral features.
  • the behavior determination unit 27 can determine whether the behavior of the target organism is feeding behavior based on the swimming speed, head angle, swimming depth, and caudal fin angle change of the target organism.
  • the swimming speed, head angle, and swimming depth correspond to the movement features
  • the caudal fin angle change corresponds to the bending feature.
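  • Purely as an assumed illustration, a table-format feeding behavior model could be represented as a mapping from each index to a threshold and comparison direction; the concrete values below are placeholders, not figures from FIG. 4.

```python
# Hypothetical table-format rendering of a feeding-behaviour model: each entry
# pairs a behaviour feature with the index (threshold) used for classification.
# Values are illustrative only.
FEEDING_BEHAVIOR_MODEL = {
    "swimming_speed_cm_s":     ("greater_than", 15.0),   # movement feature
    "head_angle_deg":          ("less_than",    30.0),   # movement feature
    "swimming_depth_m":        ("less_than",     0.5),   # movement feature
    "caudal_fin_angle_change": ("greater_than", 20.0),   # bending feature
}

def matches_model(features, model):
    """features: dict of feature name -> value. True if every index is met."""
    for name, (op, threshold) in model.items():
        value = features[name]
        if op == "greater_than" and not value > threshold:
            return False
        if op == "less_than" and not value < threshold:
            return False
    return True
```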
  • FIG. 5 shows a specific example of a non-feeding behavior model.
  • the non-feeding behavior model is a model used to determine whether the behavior of a target organism is non-feeding behavior, and as a specific example, is a model that shows indices for classifying the behavior of a target organism as non-feeding behavior according to behavioral features.
  • the behavior determination unit 27 can determine whether the behavior of a target organism is non-feeding behavior based on the target organism's swimming speed, head angle, swimming depth, and caudal fin angle change.
  • Non-feeding behavior is behavior that is not feeding behavior, and as a specific example, is behavior that a target organism takes when full.
  • FIG. 6 shows a specific example of a normal behavior model.
  • the normal behavior model is a model used to determine whether the behavior of a target organism is normal or not, and as a specific example, is a model that shows an index for classifying the behavior of the target organism as normal behavior according to the behavior feature.
  • the behavior determination unit 27 can determine whether the behavior of the target organism is normal or not based on the swimming speed, pectoral fin position, and number of caudal fin angle changes of the target organism.
  • the pectoral fin position corresponds to the movement feature
  • the number of caudal fin angle changes corresponds to the bending feature.
  • FIG. 7 shows a specific example of an abnormal behavior model.
  • the abnormal behavior model is a model used to determine whether the behavior of a target organism is abnormal or not, and as a specific example, is a model that shows an index for classifying the behavior of the target organism as abnormal depending on the behavioral features.
  • the behavior determination unit 27 can determine whether the behavior of the target organism is abnormal or not based on the swimming speed, pectoral fin position, and number of changes in caudal fin angle of the target organism.
  • FIG. 8 shows an example of the hardware configuration of the behavior determination device 10 according to this embodiment.
  • the behavior determination device 10 is made up of a computer, and includes hardware such as a processor 20, a storage device 30, a communication device 40, and an input/output interface 50.
  • the behavior determination device 10 may be made up of multiple computers.
  • the processor 20 is an integrated circuit (IC) that performs arithmetic processing and controls the hardware of the computer.
  • Specific examples of the processor 20 include a central processing unit (CPU), a digital signal processor (DSP), and a graphics processing unit (GPU).
  • the behavior determination device 10 may include a plurality of processors that replace the processor 20. The plurality of processors share the role of the processor 20.
  • the storage device 30 may be a volatile storage device, a non-volatile storage device, or a combination of these.
  • a specific example of a volatile storage device is RAM (Random Access Memory).
  • a specific example of a non-volatile storage device is ROM (Read Only Memory), HDD (Hard Disk Drive), or flash memory.
  • the communication device 40 is a receiver and a transmitter.
  • a specific example of the communication device 40 is a communication chip or a NIC (Network Interface Card).
  • the input/output interface 50 is a port to which an input device and an output device are connected.
  • a specific example of the input/output interface 50 is a USB (Universal Serial Bus) terminal.
  • a specific example of the input device is a keyboard and a mouse.
  • a specific example of the output device is a display.
  • Each part of the behavior determination device 10 may use the input/output interface 50 and the communication device 40 as appropriate when communicating with other devices.
  • the communication device 40 and the input/output interface 50 acquire signals output by the camera 1, etc.
  • the storage device 30 stores a reference behavior model 31 and a behavior determination program.
  • the behavior determination program is a program that causes a computer to realize the functions of each part of the behavior determination device 10.
  • the processor 20 reads out and executes the behavior determination program stored in the storage device 30, thereby operating as each part of the behavior determination device 10.
  • the functions of each part of the behavior determination device 10 are realized by software.
  • the data used when executing the behavior determination program and the data obtained by executing the behavior determination program are appropriately stored in the storage device 30.
  • Each part of the behavior determination device 10 uses the storage device 30 as appropriate.
  • the terms "data" and "information" may have the same meaning.
  • the storage device 30 may be independent of the computer.
  • the behavior determination program may be recorded on a computer-readable non-volatile recording medium.
  • Specific examples of the non-volatile recording medium include an optical disk or a flash memory.
  • the behavior determination program may be provided as a program product.
  • the operation procedure of the behavior determination device 10 corresponds to a behavior determination method, and the program that realizes the operation of the behavior determination device 10 corresponds to a behavior determination program.
  • FIG. 9 is a flowchart showing an example of the operation of the behavior determination device 10. The operation of the behavior determination device 10 will be explained with reference to FIG. 9.
  • Step S1 The video acquisition unit 21 acquires, as target video data, the video data output by the camera 1.
  • the video acquisition unit 21 may acquire, as target video data, only data within a certain time range from the video data output by the camera 1.
  • the target organism detection unit 22 detects a target organism from each frame represented by the target video data.
  • the target organism detection unit 22 may use only a portion of the frames represented by the target video data.
  • Step S3 If a target organism is detected from the video represented by the target video data, the behavior determination device 10 proceeds to step S4; otherwise, the behavior determination device 10 returns to step S1.
  • Step S4 The behavior determination device 10 selects, as a selected target organism, one of the one or more target organisms that has not yet been selected in the iterative processing from step S4 to step S10. Note that the selected target organism is assumed to have been detected in multiple frames.
  • Step S5 The skeleton extraction unit 23 extracts the skeleton of the selected target organism in each frame in which the selected target organism is detected among the frames indicated by the target video data, and generates information indicating the extracted skeleton as target skeleton information.
  • Step S6 The target organism tracking unit 24 tracks the selected target organism based on the target skeletal information, and generates information indicating the tracking results as target tracking information.
  • Step S7 The movement feature amount calculation unit 25 calculates the movement feature amount of the selected target organism based on the target tracking information.
  • the movement feature amount calculation unit 25 calculates the swimming distance and swimming time of the target organism based on the target tracking information, and sets the calculated results as the movement feature amount.
  • the bending feature amount calculation unit 26 may calculate the bending feature amount of the selected target organism based on the target skeletal information.
  • Step S8 The behavior determination unit 27 compares the feature amount calculated in step S7 with the reference behavior model 31 prepared in advance.
  • Step S9 The behavior determination unit 27 determines the behavior of the selected target organism based on the result of the comparison in step S8.
  • Step S10 The behavior determination device 10 repeatedly executes steps S5 to S9 the number of times corresponding to the number of target organisms detected from the target video data.
  • Step S11 The growth condition control unit 28 generates data indicating the amount of adjustment of each of the automatic feeding device 60 and the growth environment control device 70 based on the determination result of the behavior determination unit 27 in step S9, and outputs the generated data to each of the automatic feeding device 60 and the growth environment control device 70.
  • When the data indicates that one or more target organisms are exhibiting behavior determined to be feeding behavior in step S9, the automatic feeding device 60 determines that it is feeding time and executes feeding. At this time, the automatic feeding device 60 may change the amount of feeding depending on the proportion of target organisms exhibiting the behavior determined to be a feeding behavior in step S9.
  • As a specific example, the automatic feeding device 60 holds, in a database (not shown), a reference feeding amount for each target organism. The automatic feeding device 60 obtains the number of target organisms exhibiting the behavior determined to be a feeding behavior in step S9 and the number of target organisms detected in step S3, calculates the proportion of target organisms exhibiting the behavior determined to be a feeding behavior by dividing the former by the latter, and sets the feeding amount to 1.2 times the reference feeding amount when the calculated proportion is 80% or more.
  • the automatic feeding device 60 has a feeding amount index that indicates a higher value than the value indicated by the feeding behavior index, and when the feature value corresponding to the target organism exhibiting behavior determined to be feeding behavior is equal to or greater than the value indicated by the feeding amount index, the feeding amount is set to 1.2 times the reference feeding amount.
  • the feeding behavior index indicates a swimming speed of 15 cm/s
  • the feeding amount index indicates a swimming speed of 25 cm/s.
  • the feeding behavior index is an index used to determine whether the behavior of the target organism is feeding behavior.
  • the feeding amount index is an index used to determine the feeding amount.
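  • The proportion-based adjustment described above can be summarised by the following sketch; the 80% threshold and the 1.2-times factor come from the example above, while the 1.0-times fallback is an assumption.

```python
# Worked example of the feeding-amount adjustment described above (the 80% and
# 1.2x figures come from the text; the 1.0x fallback is an assumption).
def feeding_amount(num_feeding, num_detected, reference_amount):
    proportion = num_feeding / num_detected if num_detected else 0.0
    factor = 1.2 if proportion >= 0.80 else 1.0
    return reference_amount * factor

# e.g. 45 of 50 detected organisms show feeding behaviour -> 90% >= 80%,
# so the device would feed 1.2 * reference_amount.
```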
  • the growth environment control device 70 first sets the water temperature or oxygen supply amount in the aquarium high when there is one or more target organisms whose behavior is determined to be abnormal in step S9. After that, the behavior determination device 10 executes the behavior determination flow shown in FIG. 9 again after a certain time has elapsed, and when there is no target organism whose behavior is determined to be abnormal in step S9, the growth environment control device 70 sets the water temperature or oxygen supply amount in the aquarium low.
  • the reason for first setting the water temperature or oxygen supply amount in the aquarium high is that, in general, the higher the water temperature and the higher the oxygen concentration, the more active (actively moving) the fish are.
  • the growth environment control device 70 may first set the water temperature or oxygen supply amount in the aquarium low, and then set the water temperature or oxygen supply amount in the aquarium high.
  • By determining the type of behavior of the target organism based on its swimming behavior, leg movements, and the like, the behavior of the target organism can be understood in more detail than by merely tracking the target organism.
  • the skeleton of the target organism is extracted according to the type of the target organism, and the behavior of the target organism is determined based on the behavior of the target organism tracked based on the extracted skeleton and a model for classifying the behavior of the target organism, thereby making it possible to understand the behavior according to the ecology of the target organism.
  • efficient feeding can be performed based on the results of understanding the behavior of the target organism.
  • Because feeding can be performed at an appropriate timing, it is possible to prevent the water quality from deteriorating due to feeding the target organism when it is full, to avoid failing to feed the target organism when it is hungry, and to adjust the amount of feeding according to the health condition of the target organism.
  • FIG. 10 shows an example of the hardware configuration of the behavior determining device 10 according to this modified example.
  • the behavior determination device 10 includes a processing circuit 19 instead of the processor 20 or instead of the processor 20 and the storage device 30 .
  • the processing circuitry 19 is hardware that realizes at least a portion of each unit of the behavior determination device 10 .
  • the processing circuitry 19 may be dedicated hardware, or may be a processor that executes a program stored in a storage device 30 .
  • When the processing circuitry 19 is dedicated hardware, the processing circuitry 19 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • the behavior determination device 10 may include a plurality of processing circuits that replace the processing circuit 19. The plurality of processing circuits share the role of the processing circuit 19.
  • In the behavior determination device 10, some functions may be realized by dedicated hardware, and the remaining functions may be realized by software or firmware.
  • the processing circuitry 19 is illustratively implemented in hardware, software, firmware, or a combination thereof.
  • the processor 20, the storage device 30, and the processing circuit 19 are collectively referred to as the “processing circuitry.”
  • the functions of the functional components of the behavior determination device 10 are realized by the processing circuitry.
  • the behavior determining device 10 according to other embodiments may also have a similar configuration to this modified example.
  • Embodiment 2 The following mainly describes the differences from the above-described embodiment with reference to the drawings.
  • Fig. 11 shows an example of the configuration of a behavior determination system 9 according to this embodiment.
  • the behavior determination device 10 according to this embodiment further includes a remaining food amount acquisition unit 200, a learning unit 301, and a corpse number acquisition unit 400.
  • The remaining food amount acquisition unit 200 comprises a video acquisition unit 201, a remaining food detection unit 202, and a remaining food amount calculation unit 203.
  • The corpse number acquisition unit 400 includes a video acquisition unit 401, a corpse detection unit 402, and a corpse number calculation unit 403.
  • Video acquisition unit 201 is similar to video acquisition unit 21.
  • The remaining food detection unit 202 detects remaining food in the tank after feeding from each frame shown in the video data acquired by the video acquisition unit 201. Here, "after feeding" means after the target organism has eaten the provided food; as a specific example, it is after the time normally required for the target organism to finish eating the food has elapsed from the time of feeding.
  • the remaining food detection unit 202 may determine remaining food using a background subtraction method, or may determine where remaining food is present in each frame using a trained model based on AI technology.
  • the trained model is a model that takes data showing a frame as input and outputs the result of detecting remaining food.
  • the remaining food amount calculation unit 203 calculates the amount of food detected by the remaining food detection unit 202.
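  • As one hedged illustration of the background subtraction route, the OpenCV sketch below flags foreground pixels after feeding and converts the pixel count into an amount; the grams_per_pixel conversion factor is an assumption introduced here.

```python
# Background-subtraction sketch for flagging leftover feed after feeding.
# The subtractor must be fed a stream of consecutive frames so that it can
# build a background model of the tank before the leftover feed is assessed.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

def remaining_food_pixels(frame):
    """frame: BGR image of the tank floor. Returns foreground pixel count."""
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)          # suppress isolated noise pixels
    return cv2.countNonZero(mask)

def remaining_food_amount(frame, grams_per_pixel=0.001):
    # Convert the foreground area into an approximate amount of feed.
    return remaining_food_pixels(frame) * grams_per_pixel
```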
  • Video acquisition unit 401 is similar to video acquisition unit 21.
  • the corpse detection unit 402 detects corpses in the tank from each frame of the video data acquired by the video acquisition unit 401.
  • the corpse detection unit 402 may determine the presence of a corpse using motion features and a background subtraction method, or may determine where in each frame the corpse is located using a trained model based on AI technology.
  • the trained model is a model that receives data indicating a frame as input and outputs the result of detecting a corpse.
  • The corpse number calculation unit 403 calculates the number of corpses detected by the corpse detection unit 402.
  • As a specific example, the corpse number calculation unit 403 calculates the number of corpses based on the total number of pixels estimated to reflect corpses when corpses are detected using the background subtraction method, and calculates the number of corpses based on the number of detection frames when corpses are detected using AI technology.
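  • Both counting routes can be sketched in a few lines; the average body area used in the pixel-based route is an assumed placeholder, not a value from this disclosure.

```python
# Two illustrative ways of turning detections into a corpse count, mirroring
# the description above.
def corpses_from_pixels(corpse_pixel_count, avg_body_area_px=1500):
    # Background-subtraction route: divide total corpse pixels by a typical
    # body area (placeholder value).
    return round(corpse_pixel_count / avg_body_area_px)

def corpses_from_detections(detection_boxes):
    # Detector route: count the detection frames (bounding boxes) returned.
    return len(detection_boxes)
```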
  • The learning unit 301 generates a model based on the movement feature amount, the bending feature amount, and data indicating the results of detection by the remaining food detection unit 202 or the corpse detection unit 402.
  • the learning unit 301 uses the non-feeding behavior data and the feeding behavior data to learn a model that classifies the behavior of the target organism into feeding behavior and non-feeding behavior according to the behavior feature of the target organism.
  • the non-feeding behavior data is data when the amount of remaining food is equal to or greater than a reference remaining food amount, data showing the behavior feature of the learning aquatic organism before feeding, and data labeled as data corresponding to non-feeding behavior.
  • The remaining food amount is the amount of food remaining after the feeding time has elapsed from the time at which food was provided.
  • the feeding time is a time set as the time required for the learning aquatic organism to eat the fed food.
  • the learning aquatic organism is an aquatic organism corresponding to the target organism.
  • the learning aquatic organism may be the target organism itself, or may be an aquatic organism having the same or similar properties as the target organism.
  • the feeding behavior data is data when the amount of remaining food is less than a reference remaining food amount, data showing the behavior feature of the learning aquatic organism before feeding, and data labeled as data corresponding to feeding behavior.
  • The learning unit 301 learns the feeding behavior of the target organism based on the movement features and bending features before feeding and the amount of remaining food after feeding, and may use a model that shows the movement features and bending features corresponding to the feeding behavior as at least a part of the reference behavior model 31.
  • the learning unit 301 uses the abnormal behavior data and the normal behavior data to learn a model that classifies the behavior of the target organism into either normal behavior or abnormal behavior according to the behavior feature of the target organism.
  • an aquarium is used to raise a group of aquatic organisms composed of a plurality of aquatic organisms corresponding to the target organism.
  • Each of the plurality of aquatic organisms corresponding to the target organism may be an aquatic organism having the same or similar properties as the target organism.
  • The abnormal behavior data is data obtained when the corpses of an abnormal number or more of the aquatic organisms constituting the aquatic organism group are detected in the aquarium, is data indicating the behavior features of each aquatic organism constituting the group before those corpses are detected, and is data labeled as data corresponding to abnormal behavior.
  • The normal behavior data is data obtained when fewer than the abnormal number of corpses of the aquatic organisms constituting the aquatic organism group are detected in the aquarium, is data indicating the behavior features of each aquatic organism constituting the group before those corpses are detected, and is data labeled as data corresponding to normal behavior.
  • The learning unit 301 learns abnormal behavior of the target organism based on the movement features and bending features before corpse detection and the number of corpses, and may use a model indicating the movement features and bending features corresponding to abnormal behavior as at least a part of the reference behavior model 31.
  • FIG. 12 is a diagram for explaining the processing of the learning unit 301.
  • FIG. 12A shows a specific example of creating a feeding behavior model by the learning unit 301.
  • When the amount of remaining food detected after feeding is less than a certain amount, the learning unit 301 creates a feeding behavior model in which the behavior of the target organism before feeding is classified as feeding behavior.
  • FIG. 12B shows a specific example of creating a non-feeding behavior model by the learning unit 301.
  • When the amount of remaining food detected after feeding is greater than a certain amount, i.e., when the target organism has not eaten enough of the provided food, the learning unit 301 creates a non-feeding behavior model in which the behavior of the target organism before feeding is classified as non-feeding behavior.
  • FIG. 12C shows a specific example of creating an abnormal behavior model by the learning unit 301.
  • When the number of detected target organism corpses is greater than a certain number, the learning unit 301 creates an abnormal behavior model in which the behavior of the target organism before the detection of more than a certain number of corpses is classified as abnormal behavior.
  • the learning unit 301 may generate a normal behavior model when the number of detected target creature corpses is less than a certain number.
  • *** Operation Description *** FIG. 13 is a flowchart showing an example of the operation of the remaining food amount acquisition unit 200 and the learning unit 301. The operation of the remaining food amount acquisition unit 200 and the learning unit 301 will be described with reference to FIG. 13.
  • Step S21 The video acquisition unit 201 acquires the video data output by the camera 1 as target video data. This step is the same as step S1. Note that the target video data shows images captured inside an aquarium in which the target organism is kept.
  • Step S22 If the target video data is data after feeding, the behavior determination apparatus 10 proceeds to step S23. Otherwise, the behavior determination apparatus 10 returns to step S21. Note that, in cases where the behavior determination apparatus 10 is aware that the target video data is data taken after feeding, the behavior determination apparatus 10 may skip this step.
  • Step S23 The remaining food detection unit 202 detects remaining food from each frame represented by the target video data.
  • Step S24 If remaining food is detected in step S23, the behavior determination apparatus 10 proceeds to step S25. Otherwise, the behavior determination apparatus 10 proceeds to step S26.
  • Step S25 The remaining food amount calculation unit 203 calculates the amount of remaining food detected in step S23.
  • Step S26 The learning unit 301 acquires information indicating each of the movement feature amount and the bending feature amount before feeding.
  • Step S27, Step S28, Step S29 If the amount of remaining food calculated in step S25 is equal to or greater than a certain amount, the learning unit 301 labels the information acquired in step S26 as a non-feeding behavior. Otherwise, the learning unit 301 labels the information acquired in step S26 as a feeding behavior.
  • the learning unit 301 may label each piece of information about an individual.
  • the data labeled in steps S28 and S29 corresponds to learning data
  • the data labeled as non-feeding behavior corresponds to non-feeding behavior data
  • the data labeled as feeding behavior corresponds to feeding behavior data.
  • Step S30 If the amount of learning data is sufficient, the behavior determination apparatus 10 proceeds to step S31, otherwise, the behavior determination apparatus 10 returns to step S21.
  • Step S31 The learning unit 301 uses the learning data labeled in steps S28 and S29 to learn the relationship between the behavior of the target organism and the label.
  • Step S32 Based on the result of learning in step S31, the learning unit 301 creates a feeding behavior model as a model to be included in the reference behavior model 31. Note that the result of learning in step S31 may be the feeding behavior model itself.
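  • The labelling-and-learning loop of steps S27 to S32 could, as an assumed sketch, be realised with a generic scikit-learn classifier standing in for the learned model; the feature layout, function name, and the remaining-food threshold are assumptions introduced here.

```python
# Sketch of the labelling-and-learning loop in steps S27-S32, using a generic
# scikit-learn classifier as a stand-in for the reference behaviour model.
from sklearn.tree import DecisionTreeClassifier

def build_feeding_behavior_model(samples, remaining_food_threshold=5.0):
    """samples: list of (pre_feeding_feature_vector, remaining_food_amount)."""
    X, y = [], []
    for features, leftover in samples:
        X.append(features)
        # Much food left over -> the pre-feeding behaviour was non-feeding.
        y.append("non_feeding" if leftover >= remaining_food_threshold
                 else "feeding")
    model = DecisionTreeClassifier(max_depth=3)
    model.fit(X, y)      # the fitted classifier plays the role of the model
    return model
```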
  • FIG. 14 is a flowchart showing an example of the operation of the corpse number acquisition unit 400 and the learning unit 301. The operation of the corpse number acquisition unit 400 and the learning unit 301 will be described with reference to FIG. 14.
  • Step S41 The video acquisition unit 401 acquires, as target video data, the video data output by the camera 1. This step is similar to step S21.
  • Step S42 The corpse detection unit 402 detects the corpse of a target organism from each frame represented by the target video data.
  • Step S43 If a corpse is detected in step S42, the behavior determination apparatus 10 proceeds to step S44. Otherwise, the behavior determination apparatus 10 proceeds to step S45.
  • Step S44 The corpse number calculation unit 403 calculates the number of corpses detected in step S42.
  • Step S45 The learning unit 301 acquires information indicating each of the movement feature amount and the bending feature amount before the detection of a corpse.
  • the learning unit 301 may acquire only information regarding an individual estimated to be a corpse. Specific examples of an individual estimated to be a corpse include an individual floating on the water surface, or an individual whose surface color has changed.
  • Step S46, Step S47, Step S48 If the number of corpses calculated in step S44 is equal to or greater than a certain number, the learning unit 301 labels the information acquired in step S45 as abnormal behavior. Otherwise, the learning unit 301 labels the information acquired in step S45 as normal behavior.
  • the learning unit 301 may label each piece of information about an individual.
  • the data labeled in steps S47 and S48 corresponds to learning data, the data labeled as abnormal behavior corresponds to abnormal behavior data, and the data labeled as normal behavior corresponds to normal behavior data.
  • Step S49 If the amount of learning data is sufficient, the behavior determining apparatus 10 proceeds to step S50, otherwise, the behavior determining apparatus 10 returns to step S41.
  • Step S50 The learning unit 301 uses the learning data labeled in steps S47 and S48 to learn the relationship between the behavior of the target organism and the label.
  • Step S51 Based on the result of learning in step S50, the learning unit 301 creates an abnormal behavior model as a model to be included in the reference behavior model 31. Note that the result of learning in step S50 may be the abnormal behavior model itself.
  • a model used to classify the behavior of a target organism can be generated based on the behavior of the target organism and the state inside the aquarium.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Zoology (AREA)
  • Image Analysis (AREA)

Abstract

A behavior determination device (10) comprises: a behavior determination unit (27) that determines the behavior of a target organism, which is an aquatic organism, using a behavior feature and a reference behavior model (31), the behavior feature being a feature corresponding to the behavior of the target organism and being calculated on the basis of the time transition of the position and/or bending of each joint of the target organism, and the reference behavior model being a model indicating the classification of the behavior of the target organism according to the behavior feature; and a growth condition control unit (28) that, according to the determined behavior of the target organism, generates data for a device that feeds the target organism to determine whether or not to feed the target organism.
PCT/JP2022/036590 2022-09-29 2022-09-29 Dispositif de détermination d'action, procédé de détermination d'action et programme de détermination d'action WO2024069898A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036590 WO2024069898A1 (fr) 2022-09-29 2022-09-29 Dispositif de détermination d'action, procédé de détermination d'action et programme de détermination d'action

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036590 WO2024069898A1 (fr) 2022-09-29 2022-09-29 Dispositif de détermination d'action, procédé de détermination d'action et programme de détermination d'action

Publications (1)

Publication Number Publication Date
WO2024069898A1 true WO2024069898A1 (fr) 2024-04-04

Family

ID=90476886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036590 WO2024069898A1 (fr) 2022-09-29 2022-09-29 Dispositif de détermination d'action, procédé de détermination d'action et programme de détermination d'action

Country Status (1)

Country Link
WO (1) WO2024069898A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3101938B2 (ja) * 1996-03-27 2000-10-23 株式会社日立製作所 水棲生物用自動給餌装置及び方法
JP3462412B2 (ja) * 1999-01-18 2003-11-05 株式会社日立製作所 水棲生物用自動給餌装置
CN108055479A (zh) * 2017-12-28 2018-05-18 暨南大学 一种动物行为视频的制作方法
JP6739049B2 (ja) * 2018-11-14 2020-08-12 株式会社 アイエスイー 養殖魚の自動給餌方法並びに自動給餌システム
JP2021511012A (ja) * 2018-05-03 2021-05-06 エックス デベロップメント エルエルシー 魚測定ステーション管理
JP6945663B2 (ja) * 2020-01-23 2021-10-06 ソフトバンク株式会社 推定プログラム、推定方法および情報処理装置
JP7077496B1 (ja) * 2021-03-26 2022-05-31 株式会社マイスティア 給餌システム、および給餌方法、ならびに音判定モデル
JP7101424B2 (ja) * 2020-08-06 2022-07-15 曁南大学 水生動物の運動過程における筋肉変形の測定及び表示方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22960957

Country of ref document: EP

Kind code of ref document: A1