WO2020145336A1 - Fish Monitoring System and Camera Unit - Google Patents
- Publication number: WO2020145336A1
- Application: PCT/JP2020/000414
- Authority: WO (WIPO/PCT)
- Prior art keywords: fish, unit, camera, dead, monitoring server
- Prior art date
Classifications
- G06Q50/02: ICT specially adapted for agriculture; fishing; forestry; mining
- A01K61/60: Floating cultivation devices, e.g. rafts or floating fish-farms
- A01K61/95: Sorting, grading, counting or marking live aquatic animals, specially adapted for fish
- G06T7/0004: Industrial image inspection
- G06T7/0012: Biomedical image inspection
- G06V10/10: Image acquisition
- G06V20/50: Context or environment of the image
- H04N23/57: Cameras or camera modules specially adapted for being embedded in other devices
- H04N23/687: Vibration or motion blur correction performed by shifting the lens or sensor position
- H04N23/698: Control of cameras for achieving an enlarged field of view, e.g. panoramic image capture
- H04N7/183: Closed-circuit television systems for receiving images from a single remote source
- G06T2207/20081: Training; Learning
- G06T2207/30128: Food products
- Y02A40/81: Aquaculture, e.g. of fish
Definitions
- the present invention relates to a system for monitoring fish kept in cages.
- Patent Document 1 describes an underwater monitoring system that includes an underwater monitoring unit that floats on the surface of the water and photographs the underwater, and a personal computer that remotely controls the underwater monitoring unit. According to this underwater monitoring system, the user can view the underwater image taken by the underwater monitoring unit through the personal computer.
- the present invention aims to monitor fish kept in cages.
- a fish monitoring system for monitoring fish raised in a cage includes a camera unit fixed to the bottom of the cage, a monitoring server connected to the camera unit via a communication network, and a user terminal connected to the monitoring server via the communication network, wherein the camera unit includes a camera having a lens directed toward the water surface of the cage.
- the detection unit may detect dead fish or moribund fish by applying an identification model generated by machine learning to the image.
- when the detection unit detects a dead fish, the notification unit notifies the user terminal that a dead fish has been detected, and when the detection unit detects a moribund fish, the user terminal may be notified that a moribund fish has been detected.
- the camera unit may have a flat plate shape
- the lens may be a fisheye lens
- the transmitting unit may transmit the fisheye image captured by the camera to the monitoring server.
- the monitoring server may further include a correction unit that performs distortion correction processing on the fisheye image to generate a rectangular image, and the detection unit may detect dead fish or moribund fish based on the rectangular image.
- the camera unit may further include a white light, a blue or red light, and a light controller that turns on either the white light or the blue or red light depending on the weather at the time the camera captures an image.
- the camera unit may further include a microphone, the transmitting unit may transmit the underwater sound picked up by the microphone to the monitoring server, and the monitoring server may further include a determination unit that determines whether or not the volume of the sound exceeds a predetermined threshold value, and a data acquisition unit that transmits a shooting instruction to the camera unit when the determination unit makes a positive determination.
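The volume-threshold determination above can be sketched as follows. This is an illustrative interpretation, not the patent's implementation: the sample format (floats in [-1, 1]), the RMS loudness measure, and the threshold value are all assumptions.

```python
import math

def rms_level(samples):
    """Root-mean-square level of a block of PCM samples (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def should_trigger_shooting(samples, threshold=0.2):
    """Determination-unit sketch: request a shot when the underwater sound
    volume exceeds the predetermined threshold (value here is illustrative)."""
    return rms_level(samples) > threshold

# A loud burst (e.g. splashing) exceeds the threshold; near-silence does not.
loud = [0.5, -0.5, 0.5, -0.5]
quiet = [0.01, -0.01, 0.01, -0.01]
```

On a positive determination, the data acquisition unit would then send the shooting instruction to the camera unit.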
- the detection unit may further detect a natural enemy fish of the raised fish, and when a natural enemy fish is detected by the detection unit, the notification unit may notify the user terminal that a natural enemy fish has been detected.
- when a natural enemy fish is detected by the detection unit, a conversion unit of the monitoring server converts the position of the natural enemy fish in the image into a position in the cage using a predetermined conversion table, and the notification unit may notify the user terminal of the in-cage position specified by the conversion unit.
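A conversion table of the kind described can be sketched as a calibrated grid of image-to-cage correspondences with interpolation between grid points. The grid spacing, the 0.1 m-per-pixel mapping, and the function names below are hypothetical illustrations, not taken from the patent.

```python
def make_conversion_table():
    # Hypothetical calibration: a coarse grid mapping image pixels (0..100)
    # to cage coordinates in metres. In practice the table would be built
    # from the camera's mounting position and lens geometry.
    return {(px, py): (px * 0.1, py * 0.1)
            for px in range(0, 101, 10)
            for py in range(0, 101, 10)}

def image_to_cage(table, x, y, step=10):
    """Convert an image position to a cage position by bilinear
    interpolation between the four nearest calibrated grid points."""
    x0, y0 = (x // step) * step, (y // step) * step
    x1, y1 = min(x0 + step, 100), min(y0 + step, 100)
    fx = 0 if x1 == x0 else (x - x0) / (x1 - x0)
    fy = 0 if y1 == y0 else (y - y0) / (y1 - y0)
    c00, c10 = table[(x0, y0)], table[(x1, y0)]
    c01, c11 = table[(x0, y1)], table[(x1, y1)]
    cx = (c00[0] * (1 - fx) + c10[0] * fx) * (1 - fy) \
       + (c01[0] * (1 - fx) + c11[0] * fx) * fy
    cy = (c00[1] * (1 - fx) + c10[1] * fx) * (1 - fy) \
       + (c01[1] * (1 - fx) + c11[1] * fx) * fy
    return cx, cy
```

The notification unit would then report the interpolated in-cage position to the user terminal.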
- the fish monitoring system may include a plurality of camera units, and when the detection unit detects a dead fish, the conversion unit converts the position of the dead fish in the image into a position in the cage using the conversion table.
- when the in-cage position of a dead fish detected from the image transmitted by a first camera unit among the plurality of camera units and the in-cage position of a dead fish detected from the image transmitted by a second camera unit fall within a predetermined range of each other, the notification unit may notify the user terminal that a dead fish has been detected.
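One reading of this multi-camera condition is cross-confirmation: a dead fish is reported only when the first and second camera units both place a detection at nearly the same in-cage position, which suppresses single-camera false positives. The sketch below assumes 2-D positions in metres and an illustrative 0.5 m confirmation radius.

```python
import math

def confirmed_dead_fish(dets_a, dets_b, radius=0.5):
    """Cross-confirmation sketch: a detection from camera unit A is kept
    only if some detection from camera unit B maps to an in-cage position
    within `radius` metres of it."""
    confirmed = []
    for a in dets_a:
        if any(math.dist(a, b) <= radius for b in dets_b):
            confirmed.append(a)
    return confirmed
```

Only the confirmed positions would trigger a dead fish detection notification.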
- a camera unit fixed to the bottom of a cage includes a camera having a lens directed toward the water surface of the cage, and a transmitting unit that transmits images captured by the camera to a monitoring server.
- Diagram showing an example of the configuration of the fish monitoring system
- Diagram showing an example of the external appearance of the camera unit 1
- Block diagram showing an example of the electrical configuration of the camera unit 1.
- Block diagram showing an example of the electrical configuration of the monitoring server 3.
- Diagram showing an example of a rectangular image
- Block diagram showing an example of the electrical configuration of the user terminal 4.
- Sequence diagram showing an example of real-time monitoring operation
- Sequence diagram showing an example of regular monitoring operation
- Embodiment 1-1: Configuration. A fish monitoring system according to an embodiment of the present invention will be described with reference to the drawings.
- This fish monitoring system is a system for monitoring fish kept in cages.
- FIG. 1 is a diagram showing an example of the configuration of this fish monitoring system.
- the fish monitoring system shown in the figure includes a camera unit 1 fixed to the bottom center of the cage LB, a communication unit 2 connected to the camera unit 1 via a cable C and installed above the water, a monitoring server 3 connected to the communication unit 2 via the communication network NW, and a user terminal 4 connected to the monitoring server 3 via the communication network NW.
- each component will be described.
- the camera unit 1 is a device for capturing underwater images.
- FIG. 2 is a diagram showing an example of the external appearance of the camera unit 1.
- FIG. 2A is a plan view, and FIG. 2B is a side view.
- the camera unit 1 shown in FIG. 2 has a disk shape and includes a housing 101 having an opening at the center of its upper surface, a lid portion 102 that closes the opening of the housing 101, four hole portions 103 formed at equal intervals along the outer periphery of the housing 101, and a cable outlet 104 that opens on the side surface of the housing 101.
- the upper surface of the housing 101 is tapered so that it gently bulges from its outer edge toward the opening.
- the lid 102 is made of a transparent resin and has a hemispherical shape, and accommodates a fish-eye lens described later.
- the housing 101 and the lid 102 are coated with an algae-proof coating.
- the hole portion 103 is a hole through which a hook for fixing the camera unit 1 to the aquaculture net N forming the cage LB is inserted.
- the cable C is inserted into the cable outlet 104.
- FIG. 3 is a block diagram showing an example of the electrical configuration of the camera unit 1.
- the camera unit 1 shown in the figure includes a camera 11, a light 12, a water temperature sensor 13, a water pressure sensor 14, a temperature sensor 15, an acceleration sensor 16, and a control unit 17.
- the camera 11 includes a fisheye lens.
- the fisheye lens is arranged so as to face the axial direction of the housing 101. Therefore, when the camera unit 1 is installed horizontally on the seabed, the fisheye lens faces the direction of the sea surface. However, since the fisheye lens has an angle of view of 180 degrees or more, it is also possible to capture an object located in the horizontal direction of the camera unit 1 (in other words, an object existing on the seabed).
- the camera 11 performs known shift blur correction based on the output value of the acceleration sensor 16 at the time of shooting. Therefore, even if the camera unit 1 swings due to the ocean current at the time of shooting, the image blur is reduced.
- the light 12 is specifically a white LED light and is used when the camera 11 takes an image.
- the temperature sensor 15 is a sensor for measuring the temperature inside the housing 101 of the camera unit 1.
- the control unit 17 includes a microprocessor and a memory. When the microprocessor executes the control program stored in the memory, the function of the data output unit 171 is provided.
- when the data output unit 171 receives a data output instruction transmitted from the monitoring server 3, it turns on the light 12, outputs a shooting instruction to the camera 11, and acquires image data representing the moving image shot by the camera 11.
- the data output unit 171 also acquires water temperature data, water pressure data, and in-housing temperature data from the water temperature sensor 13, the water pressure sensor 14, and the temperature sensor 15, and outputs sensor data including the acquired data to the communication unit 2 for transmission to the monitoring server 3.
- the communication unit 2 is a communication device that enables data communication between the camera unit 1 and the monitoring server 3.
- the cable C connecting the communication unit 2 and the camera unit 1 has a communication cable and a power supply cable.
- the communication unit 2 includes a battery and supplies power to the camera unit 1 via a power cable.
- the communication network NW that communicably connects the communication unit 2 and the monitoring server 3 is a mobile communication network, the Internet, or a combination thereof.
- FIG. 4 is a block diagram showing an example of the electrical configuration of the monitoring server 3.
- the monitoring server 3 shown in the figure includes a processor 31 such as a CPU, a volatile memory 32 such as a DRAM, a non-volatile memory 33 such as an HDD, and a network card 34.
- when the processor 31 of the monitoring server 3 executes the control program stored in the non-volatile memory 33, the functions of the data acquisition unit 311, the correction unit 312, the detection unit 313, and the notification unit 314 are provided.
- the data acquisition unit 311 transmits a data output instruction to the communication unit 2 upon receiving a data acquisition instruction from the user terminal 4, or periodically, and then acquires the sensor data output from the camera unit 1.
- the correction unit 312 performs a well-known distortion correction process on the circular fish-eye image represented by the image data among the sensor data acquired by the data acquisition unit 311 to generate a rectangular image.
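A minimal illustration of such a correction is a polar-to-rectangular unwrap of the circular fisheye image using nearest-neighbour sampling. A real system would use a calibrated lens model; the geometry, output size, and function name here are assumptions for the sketch.

```python
import math

def unwrap_fisheye(img, out_w=16, out_h=8):
    """Unwrap a circular fisheye image (square 2-D list) into a rectangular
    panorama. Columns sweep the 360-degree azimuth; rows sweep the radius
    from the image centre (zenith) out toward the circle edge."""
    size = len(img)
    cx = cy = (size - 1) / 2.0
    rmax = size / 2.0
    out = []
    for row in range(out_h):
        r = (row + 0.5) / out_h * rmax           # radius grows with row index
        line = []
        for col in range(out_w):
            theta = col / out_w * 2.0 * math.pi  # azimuth angle
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            x = min(max(x, 0), size - 1)         # clamp to image bounds
            y = min(max(y, 0), size - 1)
            line.append(img[y][x])
        out.append(line)
    return out
```

With this orientation, the top rows of the rectangular image come from the centre of the fisheye circle and the bottom rows from its periphery.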
- the detection unit 313 applies a discriminator (in other words, a discrimination model) generated by machine learning to the rectangular image generated by the correction unit 312 to detect dead fish or moribund fish.
- the discriminator is generated by learning feature amounts from positive case data of dead fish and moribund fish and negative case data of normal live fish.
- the feature amount to be learned is, for example, a Haar-like feature amount, an LBP feature amount, or a HOG feature amount.
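As an illustration of one of the named features, a basic LBP (local binary pattern) computation might look like the following. This is a generic textbook formulation of LBP, not the patent's actual training pipeline; the 8-neighbour variant and histogram feature are assumptions.

```python
def lbp_code(img, y, x):
    """Basic local binary pattern: compare the 8 neighbours of pixel (y, x)
    with the centre pixel and pack the comparison bits into one byte."""
    center = img[y][x]
    # Neighbours in clockwise order starting from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """Feature-vector sketch: histogram of LBP codes over interior pixels."""
    hist = [0] * 256
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            hist[lbp_code(img, y, x)] += 1
    return hist
```

Such histograms, computed over training windows, would feed a classifier of the kind the patent describes.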
- Dead fish generally lie motionless on the bottom, overturned or upside down. Moribund fish generally move slightly, opening and closing their gills, while rolling over or turning upside down near the bottom. Therefore, the detection unit 313 detects dead fish or moribund fish only in the upper and lower end regions of the rectangular image generated by the correction unit 312, which correspond to the vicinity of the sea bottom.
- FIG. 5 is a diagram showing an example of a rectangular image generated by the correction unit 312. In the rectangular image shown in the figure, dead fish or moribund fish is detected only from the regions R1 and R2 corresponding to the vicinity of the sea floor. By narrowing the detection range in this way, the possibility of erroneous detection is reduced.
- when the detection unit 313 detects a dead fish or a moribund fish, it performs object tracking on the detected object to calculate its movement amount. If the calculated movement amount is less than or equal to the first threshold value, the object is determined to be a dead fish; if it is greater than the first but less than or equal to the second threshold value, the object is determined to be a moribund fish.
- the first threshold value is a threshold value set in advance for determining dead fish, and is set in consideration of the amount of movement of dead fish due to ocean current.
- the second threshold value is a threshold value set in advance for distinguishing a dying fish, and is set to a value larger than the first threshold value.
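The two-threshold decision rule above can be sketched directly. The numeric threshold values below are illustrative only, since the patent does not specify them.

```python
def classify_by_movement(movement, first_threshold=0.05, second_threshold=0.3):
    """Tracking-based classification sketch. `movement` is the displacement
    measured by object tracking (e.g. in metres). The first threshold allows
    for drift of a dead fish in the current; the second, larger threshold
    covers the slight movements of a moribund fish."""
    if movement <= first_threshold:
        return "dead"
    if movement <= second_threshold:
        return "moribund"
    # Larger movement: treat as a live fish (a tracking false positive).
    return "live"
```

The "live" branch reflects the patent's goal of avoiding notifications for normally swimming fish that the discriminator flagged by mistake.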
- the notification unit 314 transmits to the user terminal 4 sensor data in which the fisheye image is replaced with the image data of the rectangular image generated by the correction unit 312.
- when the detection unit 313 detects a dead fish in a periodically acquired fisheye image, a dead fish detection notification is transmitted to the user terminal 4, and when a moribund fish is detected, a moribund fish detection notification is transmitted to the user terminal 4.
- the user terminal 4 is a communication terminal used by the manager of the cage LB, and is a communication terminal for receiving sensor data and various notifications from the monitoring server 3. Specifically, for example, it is a smartphone or a PC.
- FIG. 6 is a block diagram showing an example of the electrical configuration of the user terminal 4.
- the user terminal 4 shown in the figure includes a processor 41 such as a CPU, a volatile memory 42 such as a DRAM, a non-volatile memory 43 such as a flash memory, a network card 44, a display 45 such as a liquid crystal display, and a touch panel 46.
- when the processor 41 of the user terminal 4 executes the control program stored in the non-volatile memory 43, the functions of the data acquisition unit 411 and the screen generation unit 412 are provided.
- the data acquisition unit 411 transmits a data acquisition instruction to the monitoring server 3 and acquires the sensor data transmitted from the monitoring server 3.
- the data acquisition unit 411 also acquires a dead fish detection notification or a dying fish detection notification transmitted from the monitoring server 3.
- the screen generation unit 412 generates a sensor data display screen based on the sensor data acquired by the data acquisition unit 411, and displays it on the display 45.
- the underwater image of the cage LB, the water temperature and the water pressure, and the temperature inside the housing of the camera unit 1 are displayed.
- the displayed underwater image is a part of the rectangular image included in the sensor data, and the user can observe the entire rectangular image by performing the pan-tilt-zoom operation on the user terminal 4.
- the screen generation unit 412 generates a dead fish detection notification screen for notifying the detection of dead fish based on the dead fish detection notification acquired by the data acquisition unit 411, and displays the dead fish detection notification screen on the display 45.
- the screen generation unit 412 generates a moribund fish detection notification screen for notifying the detection of moribund fish based on the moribund fish detection notification acquired by the data acquisition unit 411, and displays it on the display 45.
- FIG. 7 is a sequence diagram showing an example of the real-time monitoring operation.
- the data acquisition unit 411 of the user terminal 4 transmits a data acquisition instruction to the monitoring server 3 (step Sa1).
- the data acquisition unit 311 of the monitoring server 3 transmits a data output instruction to the communication unit 2 (step Sa2).
- when the data output unit 171 of the camera unit 1 receives the data output instruction transmitted from the monitoring server 3 via the communication unit 2 (step Sa3), it turns on the light 12, outputs a photographing instruction to the camera 11, and acquires image data representing the moving image captured by the camera 11.
- it also acquires water temperature data, water pressure data, and in-housing temperature data from the water temperature sensor 13, the water pressure sensor 14, and the temperature sensor 15, generates sensor data including the acquired data (step Sa4), and outputs it to the communication unit 2 for transmission to the monitoring server 3 (step Sa5).
- the data acquisition unit 311 of the monitoring server 3 acquires the sensor data output from the camera unit 1 via the communication unit 2 (step Sa6).
- the correction unit 312 of the monitoring server 3 performs well-known distortion correction processing on the circular fisheye image represented by the image data in the acquired sensor data to generate a rectangular image (step Sa7).
- the notification unit 314 of the monitoring server 3 transmits to the user terminal 4 sensor data including the image data of the rectangular image generated by the correction unit 312, instead of the fisheye image (step Sa8).
- the data acquisition unit 411 of the user terminal 4 acquires the sensor data transmitted from the monitoring server 3.
- the screen generation unit 412 of the user terminal 4 generates a sensor data display screen based on the sensor data acquired by the data acquisition unit 411 and displays it on the display 45 (step Sa9).
- the underwater image of the cage LB, the water temperature and the water pressure, and the temperature inside the housing of the camera unit 1 are displayed.
- the user of the user terminal 4 can know the current state in the cage LB.
- the above is the description of the real-time monitoring operation.
- FIG. 8 is a sequence diagram showing an example of the regular monitoring operation.
- the data acquisition unit 311 of the monitoring server 3 periodically transmits a data output instruction to the communication unit 2 (step Sb1).
- when the data output unit 171 of the camera unit 1 receives the data output instruction transmitted from the monitoring server 3 via the communication unit 2 (step Sb2), it turns on the light 12, outputs a photographing instruction to the camera 11, and acquires image data representing the moving image captured by the camera 11.
- the water temperature data, the water pressure data, and the temperature data inside the housing are acquired from the water temperature sensor 13, the water pressure sensor 14, and the temperature sensor 15.
- the sensor data including the acquired data is generated (step Sb3) and output to the communication unit 2 for transmission to the monitoring server 3 (step Sb4).
- the data acquisition unit 311 of the monitoring server 3 acquires the sensor data output from the camera unit 1 via the communication unit 2 (step Sb5).
- the correction unit 312 of the monitoring server 3 performs well-known distortion correction processing on the circular fisheye image represented by the image data in the acquired sensor data to generate a rectangular image (step Sb6).
- the detection unit 313 of the monitoring server 3 applies a discriminator generated by machine learning to the rectangular image generated by the correction unit 312 to detect dead fish or moribund fish (step Sb7). When detecting the dead fish or the dying fish, the detecting unit 313 performs the moving body tracking of the detected dead fish or the dying fish to calculate the movement amount of the detected object (step Sb8).
- when the calculated movement amount is less than or equal to the first threshold value, the object is determined to be a dead fish, and when it is less than or equal to the second threshold value, the object is determined to be a moribund fish (step Sb9).
- when the detection unit 313 detects a dead fish, the notification unit 314 of the monitoring server 3 transmits a dead fish detection notification to the user terminal 4, and when a moribund fish is detected, a moribund fish detection notification is transmitted to the user terminal 4 (step Sb10).
- the data acquisition unit 411 of the user terminal 4 acquires the dead fish detection notification or the moribund fish detection notification transmitted from the monitoring server 3.
- when the data acquisition unit 411 acquires a dead fish detection notification, the screen generation unit 412 of the user terminal 4 displays a dead fish detection notification screen on the display 45, and when it acquires a moribund fish detection notification, a moribund fish detection notification screen is displayed on the display 45 (step Sb11).
- the user of the user terminal 4 who has received the dead fish detection notification can quickly remove the dead fish from the cage LB to prevent the spread of disease caused by dead fish.
- a moribund fish can be captured and sold before it dies.
- the above is the description of the regular monitoring operation.
- the camera unit 1 may be fixed to a place other than the center of the bottom of the cage LB. For example, it may be fixed to the side surface of the bottomed tubular aquaculture net N.
- Modification 4 Communication between the camera unit 1 and the communication unit 2 may be performed wirelessly using a known underwater communication technique.
- the camera unit 1 and the monitoring server 3 may be allowed to directly communicate with each other by using a known underwater communication technique.
- the lens mounted on the camera unit 1 is not necessarily limited to the fisheye lens. Instead of the fisheye lens, a standard lens having an angle of view of 40 degrees to 60 degrees may be mounted. In that case, the lens is arranged so that the vicinity of the sea bottom can be photographed when the camera unit 1 is fixed in the sea.
- the color of the light 12 of the camera unit 1 is not necessarily white.
- the color of the light 12 may be appropriately determined according to the environment of the cage LB, the habit of the fish to be raised, and the like.
- the camera unit 1 may be provided with a plurality of lights of different colors, and the lights used may be switched according to the situation.
- the camera unit 1 may further be provided with a blue or red LED light, which attenuates less in turbid water; the white LED light may be used when the weather at the time of shooting is clear or cloudy, and the blue or red LED light may be used when it is raining.
- in this case, the monitoring server 3 also transmits weather information when transmitting the data output instruction to the camera unit 1, and the camera unit 1 selects the color of the light to be used according to the received weather information.
- The camera unit 1 may also transmit the output value of the acceleration sensor 16 to the monitoring server 3 as sensor data.
- The monitoring server 3 detects the tilt of the camera unit 1 from the output value of the acceleration sensor 16 and determines whether the detected tilt exceeds a predetermined threshold value.
- The predetermined threshold value is set to a value suitable for detecting a fall of the camera unit 1 (for example, plus or minus 45 degrees).
- When the detected tilt exceeds the threshold value, a fall notification is transmitted to the user terminal 4. The user of the user terminal 4 who receives this fall notification can notice that the camera unit 1 has fallen over.
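As an illustration of the fall check, the tilt can be estimated from the angle between the measured gravity vector and the upright axis. The function name and the convention that the z axis points straight up when the unit stands upright are assumptions for this sketch, not details from the embodiment:

```python
import math

def tilt_exceeds_threshold(ax, ay, az, threshold_deg=45.0):
    """Estimate the unit's tilt (degrees) from a 3-axis accelerometer
    reading of gravity, and report whether it exceeds the fall threshold.

    Assumes the z axis points straight up when the unit is upright.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("zero acceleration vector")
    # Angle between the measured gravity vector and the upright z axis.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
    return tilt, tilt > threshold_deg
```

For an upright unit the reading is roughly (0, 0, 9.8) and the tilt is near zero; a reading dominated by a horizontal axis yields a tilt near 90 degrees and triggers the notification.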
- The detection unit 313 of the monitoring server 3 narrows the detection range within the rectangular image in order to improve detection accuracy, but the range does not necessarily have to be narrowed.
- The detection unit 313 may detect dead fish or dying fish from the entire rectangular image.
- Unlike dead fish, dying fish generally open and close their gills, so dead fish and dying fish may be distinguished by focusing on the movement of the fish's gills.
- Specifically, in order to improve the accuracy of distinguishing dead fish from dying fish, the detection unit 313 of the monitoring server 3 identifies, for an object determined to be a dead fish as a result of moving-body tracking, the change in the amount of movement of the gill area. When the identified change matches a predetermined pattern for identifying a dying fish, the object is determined to be a dying fish; when it does not match the pattern, the object is determined to be a dead fish.
- The detection unit 313 of the monitoring server 3 may detect dead fish and dying fish by applying a classifier (in other words, a classification model) generated by machine learning to the rectangular image generated by the correction unit 312.
- The classifier used here is generated by learning the features of positive example data of dead fish, positive example data of dying fish, and positive example data of normal live fish.
- The features to be learned are, for example, Haar-like features, LBP features, or HOG features.
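A real implementation would train a detector on Haar-like, LBP, or HOG features as described above. As a self-contained illustration only, the sketch below computes a crude HOG-like orientation histogram and classifies patches with nearest centroids; the simplified feature (no cells or blocks), the class labels, and all names are stand-ins for the learned discriminator, not the patent's method:

```python
import numpy as np

N_BINS = 9  # orientation bins, as in a typical HOG descriptor

def orientation_histogram(patch):
    """A minimal HOG-like feature: one 9-bin, magnitude-weighted histogram
    of gradient orientations over the whole patch."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0)
    hist, _ = np.histogram(ang, bins=N_BINS, range=(0, 180), weights=mag)
    n = np.linalg.norm(hist)
    return hist / n if n > 0 else hist

class NearestCentroidClassifier:
    """Toy stand-in for the learned classifier: one centroid per class
    (e.g. 'dead', 'dying', 'live'), predicting the nearest one."""
    def fit(self, feats, labels):
        self.classes_ = sorted(set(labels))
        self.centroids_ = {
            c: np.mean([f for f, l in zip(feats, labels) if l == c], axis=0)
            for c in self.classes_
        }
        return self

    def predict(self, feat):
        return min(self.classes_,
                   key=lambda c: np.linalg.norm(feat - self.centroids_[c]))
```

Training on the three classes of positive example data and sliding the classifier over the rectangular image would yield the detections described above.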
- The detection unit 313 of the monitoring server 3 does not necessarily have to be able to detect both dead fish and dying fish. Depending on the needs of the user of the fish monitoring system, only one of the two may be detectable.
- A fish that is a natural enemy of the fish being raised (for example, a shark) may invade the cage LB, and if left alone it may prey on the fish being raised.
- The monitoring server 3 may therefore detect the invading natural enemy fish and notify the user terminal 4 to that effect.
- In that case, specifically, the detection unit 313 of the monitoring server 3 detects dead or dying fish and natural enemy fish by applying a classifier (in other words, a classification model) generated by machine learning to the rectangular image generated by the correction unit 312.
- The classifier used here is generated by learning the features of positive example data of dead and dying fish, positive example data of natural enemy fish, and positive example data of normal live fish.
- The features to be learned are, for example, Haar-like features, LBP features, or HOG features.
- When a natural enemy fish is detected by the detection unit 313, the notification unit 314 of the monitoring server 3 transmits a natural enemy fish detection notification to that effect to the user terminal 4.
- The user of the user terminal 4 who receives this notification can notice the intrusion of the natural enemy fish.
- Natural enemy fish generally invade through a break in the aquaculture net N.
- A break in the aquaculture net N needs to be repaired in order to prevent further intrusion of natural enemy fish and escape of the fish being raised. Therefore, when a natural enemy fish is detected, the user of the user terminal 4 may be notified of the detection position of the natural enemy fish, which is considered to be close to the break through which it passed.
- In that case, the monitoring server 3 converts the position of the natural enemy fish in the rectangular image in which it was detected into a position in the cage LB, using a predetermined conversion table.
- The conversion table used here associates a pair of plane coordinates and a fish size value in the rectangular image with spatial coordinates in the cage LB.
- The notification unit 314 of the monitoring server 3 then transmits a natural enemy fish detection notification reporting the identified position to the user terminal 4.
- The user of the user terminal 4 who receives this notification can learn the position in the cage LB that is considered to be close to the break through which the natural enemy fish passed.
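The conversion table can be sketched as a lookup keyed by bucketed image coordinates plus an apparent-size bucket (size serving as a rough depth cue). The bucket widths, size cut-off, and every coordinate value below are illustrative placeholders, not calibrated data from the embodiment:

```python
# Hypothetical conversion table: (x_bucket, y_bucket, size_bucket) in the
# rectangular image -> (X, Y, Z) spatial coordinates in the cage LB, metres.
CONVERSION_TABLE = {
    (0, 0, "small"): (1.0, 1.0, 4.5),  # small apparent size => far from camera
    (0, 0, "large"): (1.0, 1.0, 2.0),  # large apparent size => near the camera
    (1, 0, "small"): (3.0, 1.0, 4.5),
}

def to_cage_position(x, y, fish_size_px, bucket_px=640, small_max_px=80):
    """Look up the cage position for a detection at image position (x, y)
    with apparent size fish_size_px. The bucketing scheme is illustrative."""
    key = (x // bucket_px, y // bucket_px,
           "small" if fish_size_px <= small_max_px else "large")
    return CONVERSION_TABLE.get(key)  # None outside the calibrated range
```

In practice the table would be populated during calibration for the fixed position of each camera unit, which is why Modification 15 uses a separate table per unit.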
- Natural enemy fish are generally larger than the fish being raised, and their swimming sound is accordingly louder. The camera unit 1 may therefore be provided with a microphone, and the detection process described in Modification 12 may be performed when the swimming sound of a natural enemy fish is picked up by the microphone. Reducing the frequency of the detection process makes it possible to suppress the power consumed by image capture in the camera unit 1.
- In that case, specifically, the data output unit 171 of the camera unit 1 receives an audio data output instruction periodically transmitted from the monitoring server 3 and transmits the underwater audio data picked up by the microphone to the monitoring server 3.
- The monitoring server 3 determines whether the volume of the audio data transmitted from the camera unit 1 exceeds a predetermined threshold value.
- The predetermined threshold value is set in advance so as to discriminate the swimming sound of natural enemy fish.
- When the volume of the audio data exceeds the threshold value, the data acquisition unit 311 of the monitoring server 3 acquires sensor data from the camera unit 1.
- The image data included in the acquired sensor data is then subjected to the detection process described above.
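The volume check can be sketched as an RMS level computed over a window of PCM samples and compared against the preset threshold. The 16-bit sample format and the -30 dBFS threshold are assumptions for the sketch; a deployed threshold would be tuned to the recorded swimming sound of the target species:

```python
import math

def rms_db(samples, full_scale=32768.0):
    """Root-mean-square level of a window of PCM samples, in dBFS
    (assuming signed 16-bit samples by default)."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return float("-inf")
    return 20.0 * math.log10(rms / full_scale)

def exceeds_threshold(samples, threshold_db=-30.0):
    """True when the window is loud enough to trigger image capture."""
    return rms_db(samples) > threshold_db
```

Only when this returns True would the server issue the data output instruction, keeping the camera and light off the rest of the time.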
- Modification 14: The arrangement of the functions of the camera unit 1, the monitoring server 3, and the user terminal 4 may be changed as appropriate according to the environment in which the present invention is implemented.
- For example, the screen generated by the screen generation unit 412 of the user terminal 4 may instead be generated in the monitoring server 3, and the generated screen may be displayed on the user terminal 4.
- Modification 15: A plurality of camera units 1 may be used in one cage LB. When a plurality of camera units 1 are used, in order to improve the accuracy of dead fish detection, the detection of a dead fish may be confirmed only when the dead fish is detected in the captured images of the plurality of camera units 1. If, for example, two camera units 1A and 1B are used, then when a dead fish is detected in a rectangular image from the camera unit 1A, the monitoring server 3 converts the position of the dead fish in that rectangular image into a position in the cage LB using the conversion table described in Modification 12. In this conversion table, the spatial coordinates of the cage LB are determined in advance according to the fixed position of the camera unit 1A.
- Likewise, when a dead fish is detected in a rectangular image from the camera unit 1B, the monitoring server 3 converts the position of the dead fish in that rectangular image into a position in the cage LB using the conversion table described in Modification 12.
- In that conversion table, the spatial coordinates of the cage LB are determined in advance according to the fixed position of the camera unit 1B.
- If the two identified positions fall within a predetermined range of each other, the monitoring server 3 determines that the detected dead fish are the same dead fish and confirms the detection of the dead fish.
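The agreement test between the two converted cage positions can be sketched as a plain Euclidean distance check. The 0.5 m tolerance is an illustrative assumption standing in for the "predetermined range":

```python
import math

def same_dead_fish(pos_a, pos_b, max_distance=0.5):
    """Confirm a dead-fish detection only when the cage positions derived
    from two camera units agree within max_distance (metres, assumed)."""
    return math.dist(pos_a, pos_b) <= max_distance
```

Positions from images of the same fish taken by units 1A and 1B should land close together in cage coordinates, while false positives from a single unit will usually have no nearby counterpart.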
- Modification 16: The distortion correction processing by the correction unit 312 may be omitted.
- the detection unit 313 may detect dead fish and/or dying fish by a method that does not use a classifier generated by machine learning.
Landscapes
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Environmental Sciences (AREA)
- Marine Sciences & Fisheries (AREA)
- Signal Processing (AREA)
- Animal Husbandry (AREA)
- Zoology (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biodiversity & Conservation Biology (AREA)
- Economics (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- Strategic Management (AREA)
- Primary Health Care (AREA)
- Marketing (AREA)
- Human Resources & Organizations (AREA)
- Mining & Mineral Resources (AREA)
- Agronomy & Crop Science (AREA)
- General Business, Economics & Management (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Studio Devices (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Closed-Circuit Television Systems (AREA)
- Farming Of Fish And Shellfish (AREA)
- Image Analysis (AREA)
- Mechanical Means For Catching Fish (AREA)
Abstract
Description
1-1. Configuration
A fish monitoring system according to one embodiment of the present invention will be described with reference to the drawings. This fish monitoring system is a system for monitoring fish raised in a cage. FIG. 1 is a diagram showing an example of the configuration of this fish monitoring system. The fish monitoring system shown in the figure comprises a camera unit 1 fixed at the center of the bottom of the cage LB, a communication unit 2 connected to the camera unit 1 via a cable C and installed above the water surface, a monitoring server 3 connected to the communication unit 2 via a communication network NW, and a user terminal 4 connected to the monitoring server 3 via the communication network NW. Each component is described below.
Next, the operation of the fish monitoring system will be described. Specifically, the real-time monitoring operation executed in response to an instruction from the user of the user terminal 4 and the periodic monitoring operation executed automatically without any instruction from the user of the user terminal 4 are described.
When the user of the user terminal 4 instructs display of sensor data, the data acquisition unit 411 of the user terminal 4 transmits a data acquisition instruction to the monitoring server 3 (step Sa1). Upon receiving the data acquisition instruction transmitted from the user terminal 4, the data acquisition unit 311 of the monitoring server 3 transmits a data output instruction to the communication unit 2 (step Sa2). Upon receiving, via the communication unit 2, the data output instruction transmitted from the monitoring server 3 (step Sa3), the data output unit 171 of the camera unit 1 turns on the light 12, outputs a shooting instruction to the camera 11, and acquires image data representing the moving image captured by the camera 11. It also acquires water temperature data, water pressure data, and in-housing temperature data from the water temperature sensor 13, the water pressure sensor 14, and the temperature sensor 15. It then generates sensor data including each of the acquired data (step Sa4) and outputs the sensor data to the communication unit 2 for transmission to the monitoring server 3 (step Sa5).
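The real-time monitoring sequence (steps Sa1 to Sa5) can be sketched as a chain of calls between the three components. All class and method names, and the sample sensor values, are hypothetical stand-ins for the units described above:

```python
class CameraUnit:
    """Stand-in for camera unit 1: on a data output instruction it turns on
    the light, shoots, reads the sensors, and bundles the result (Sa3-Sa5)."""
    def handle_data_output_instruction(self):
        return {
            "image": "<video frames>",    # from camera 11 (placeholder)
            "water_temp_c": 18.2,         # water temperature sensor 13 (placeholder)
            "water_pressure_kpa": 131.0,  # water pressure sensor 14 (placeholder)
            "housing_temp_c": 21.5,       # in-housing temperature sensor 15 (placeholder)
        }

class MonitoringServer:
    """Stand-in for monitoring server 3: forwards the instruction (Sa2)."""
    def __init__(self, camera):
        self.camera = camera

    def handle_data_acquisition_instruction(self):
        return self.camera.handle_data_output_instruction()

class UserTerminal:
    """Stand-in for user terminal 4: issues the acquisition request (Sa1)."""
    def __init__(self, server):
        self.server = server

    def show_sensor_data(self):
        return self.server.handle_data_acquisition_instruction()
```

In the actual system the communication unit 2 relays the instruction and the sensor data between the server and the submerged camera unit; it is collapsed into direct calls here for brevity.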
This concludes the description of the real-time monitoring operation.
The data acquisition unit 311 of the monitoring server 3 periodically transmits a data output instruction to the communication unit 2 (step Sb1). Upon receiving, via the communication unit 2, the data output instruction transmitted from the monitoring server 3 (step Sb2), the data output unit 171 of the camera unit 1 turns on the light 12, outputs a shooting instruction to the camera 11, and acquires image data representing the moving image captured by the camera 11. It also acquires water temperature data, water pressure data, and in-housing temperature data from the water temperature sensor 13, the water pressure sensor 14, and the temperature sensor 15. It then generates sensor data including each of the acquired data (step Sb3) and outputs the sensor data to the communication unit 2 for transmission to the monitoring server 3 (step Sb4).
This concludes the description of the periodic monitoring operation.
The above embodiment may be modified as follows. The following modifications may also be combined with one another.
The fish monitoring system described above targets fish for monitoring, but it may also be used to monitor aquatic animals other than fish.
The camera unit 1 may be fixed at a location other than the center of the bottom of the cage LB. For example, it may be fixed to the side surface of the aquaculture net N, which has the shape of a bottomed cylinder.
The appearance of the camera unit 1 is not necessarily limited to the example shown in FIG. 2. The shape, material, number, and arrangement of the components constituting the camera unit 1 may be changed as appropriate according to the environment in which it is used.
Communication between the camera unit 1 and the communication unit 2 may be performed wirelessly using a known underwater communication technique.
Alternatively, the camera unit 1 and the monitoring server 3 may be allowed to communicate directly with each other using a known underwater communication technique.
The lens mounted on the camera unit 1 is not necessarily limited to a fisheye lens. A standard lens with an angle of view of 40 to 60 degrees may be mounted instead of the fisheye lens. In that case, the lens is arranged so that the vicinity of the sea bottom can be photographed when the camera unit 1 is fixed in the sea.
The color of the light 12 of the camera unit 1 is not necessarily limited to white. The color of the light 12 may be determined as appropriate according to the environment of the cage LB, the habits of the fish being raised, and so on.
The camera unit 1 may also transmit the output value of the acceleration sensor 16 to the monitoring server 3 as sensor data. The monitoring server 3 detects the tilt of the camera unit 1 from the output value of the acceleration sensor 16 and determines whether the detected tilt exceeds a predetermined threshold value. Here, the predetermined threshold value is set to a value suitable for detecting a fall of the camera unit 1 (for example, plus or minus 45 degrees). When the detected tilt exceeds the predetermined threshold value, a fall notification is transmitted to the user terminal 4. The user of the user terminal 4 who receives this fall notification can notice that the camera unit 1 has fallen over.
The detection unit 313 of the monitoring server 3 narrows the detection range within the rectangular image in order to improve detection accuracy, but the range does not necessarily have to be narrowed. The detection unit 313 may detect dead fish or dying fish from the entire rectangular image.
Unlike dead fish, dying fish generally open and close their gills. Dead fish and dying fish may therefore be distinguished by focusing on the movement of the fish's gills. Specifically, in order to improve the accuracy of distinguishing dead fish from dying fish, the detection unit 313 of the monitoring server 3 may identify, for an object determined to be a dead fish as a result of moving-body tracking, the change in the amount of movement of the gill area; when the identified change matches a predetermined pattern for identifying a dying fish, the object is determined to be a dying fish, and when it does not match the pattern, the object is determined to be a dead fish.
If the fish raised in the cage LB are of a species for which dead fish and dying fish can be distinguished from still images alone, the moving-body tracking process may be omitted. In this case, the detection unit 313 of the monitoring server 3 detects dead fish and dying fish by applying a classifier (in other words, a classification model) generated by machine learning to the rectangular image generated by the correction unit 312. The classifier used here is generated by learning the features of positive example data of dead fish, positive example data of dying fish, and positive example data of normal live fish. The features to be learned are, for example, Haar-like features, LBP features, or HOG features.
The detection unit 313 of the monitoring server 3 does not necessarily have to be able to detect both dead fish and dying fish. Depending on the needs of the user of the fish monitoring system, only one of the two may be detectable.
A fish that is a natural enemy of the fish being raised (for example, a shark) may invade the cage LB. If an invading natural enemy fish is left alone, the fish being raised may be preyed upon. The monitoring server 3 may therefore detect the invading natural enemy fish and notify the user terminal 4 to that effect. In that case, specifically, the detection unit 313 of the monitoring server 3 detects dead or dying fish and natural enemy fish by applying a classifier (in other words, a classification model) generated by machine learning to the rectangular image generated by the correction unit 312. The classifier used here is generated by learning the features of positive example data of dead and dying fish, positive example data of natural enemy fish, and positive example data of normal live fish. The features to be learned are, for example, Haar-like features, LBP features, or HOG features. When a natural enemy fish is detected by the detection unit 313, the notification unit 314 of the monitoring server 3 transmits a natural enemy fish detection notification to that effect to the user terminal 4. The user of the user terminal 4 who receives this notification can notice the intrusion of the natural enemy fish.
Natural enemy fish are generally larger than the fish being raised, and their swimming sound is accordingly louder than that of the fish being raised. The camera unit 1 may therefore be provided with a microphone, and the detection process described in Modification 12 may be performed when the swimming sound of a natural enemy fish is picked up by the microphone. Reducing the frequency of the detection process makes it possible to suppress the power consumed by image capture in the camera unit 1. In this case, specifically, the data output unit 171 of the camera unit 1 receives an audio data output instruction periodically transmitted from the monitoring server 3 and transmits the underwater audio data picked up by the microphone to the monitoring server 3. The monitoring server 3 determines whether the volume of the audio data transmitted from the camera unit 1 exceeds a predetermined threshold value. Here, the predetermined threshold value is set in advance so as to discriminate the swimming sound of natural enemy fish. When the volume of the audio data exceeds the predetermined threshold value, the data acquisition unit 311 of the monitoring server 3 acquires sensor data from the camera unit 1. The image data included in the acquired sensor data is subjected to the detection process as described above.
The arrangement of the functions of the camera unit 1, the monitoring server 3, and the user terminal 4 may be changed as appropriate according to the environment in which the present invention is implemented. For example, the screen generated by the screen generation unit 412 of the user terminal 4 may instead be generated in the monitoring server 3, and the generated screen may be displayed on the user terminal 4.
A plurality of camera units 1 may be used in one cage LB. When a plurality of camera units 1 are used, in order to improve the accuracy of dead fish detection, the detection of a dead fish may be confirmed only when the dead fish is detected in the captured images of the plurality of camera units 1. If, for example, two camera units 1A and 1B are used, then when a dead fish is detected in a rectangular image from the camera unit 1A, the monitoring server 3 converts the position of the dead fish in that rectangular image into a position in the cage LB using the conversion table described in Modification 12. In this conversion table, the spatial coordinates of the cage LB are determined in advance according to the fixed position of the camera unit 1A. Likewise, when a dead fish is detected in a rectangular image from the camera unit 1B, the monitoring server 3 converts the position of the dead fish in that rectangular image into a position in the cage LB using the conversion table described in Modification 12, in which the spatial coordinates of the cage LB are determined in advance according to the fixed position of the camera unit 1B. If the two identified positions fall within a predetermined range of each other, the monitoring server 3 determines that the detected dead fish are the same dead fish and confirms the detection of the dead fish.
The above description concerns the use of a plurality of camera units 1 when detecting dead fish, but a plurality of camera units 1 may also be used to improve detection accuracy when detecting dying fish or natural enemy fish.
The distortion correction processing by the correction unit 312 may be omitted.
The detection unit 313 may detect dead fish and/or dying fish by a method that does not use a classifier generated by machine learning.
Claims (11)
- A fish monitoring system for monitoring fish raised in a cage, the system comprising:
a camera unit fixed to the bottom of the cage;
a monitoring server connected to the camera unit via a communication network; and
a user terminal connected to the monitoring server via the communication network,
wherein the camera unit comprises:
a camera having a lens directed toward the water surface of the cage; and
a transmission unit that transmits an image captured by the camera to the monitoring server,
and the monitoring server comprises:
a detection unit that detects a dead fish or a dying fish based on the image; and
a notification unit that, when a dead fish or a dying fish is detected by the detection unit, notifies the user terminal that a dead fish or a dying fish has been detected.
- The fish monitoring system according to claim 1, wherein the detection unit detects a dead fish or a dying fish by applying, to the image, a classification model generated by machine learning.
- The fish monitoring system according to claim 1, wherein the detection unit detects dead fish and dying fish, and the notification unit notifies the user terminal that a dead fish has been detected when a dead fish is detected by the detection unit, and notifies the user terminal that a dying fish has been detected when a dying fish is detected by the detection unit.
- The fish monitoring system according to any one of claims 1 to 3, wherein the camera unit has a flat plate shape, the lens is a fisheye lens, and the transmission unit transmits a fisheye image captured by the camera to the monitoring server.
- The fish monitoring system according to claim 4, wherein the monitoring server further comprises a correction unit that performs distortion correction processing on the fisheye image to generate a rectangular image, and the detection unit detects a dead fish or a dying fish based on the rectangular image.
- The fish monitoring system according to any one of claims 1 to 5, wherein the camera unit further comprises: a white light and a blue or red light; and a light control unit that turns on either the white light or the blue or red light according to the weather at the time of shooting by the camera.
- The fish monitoring system according to any one of claims 1 to 6, wherein the camera unit further comprises a microphone, the transmission unit transmits underwater sound picked up by the microphone to the monitoring server, the monitoring server further comprises: a determination unit that determines whether the volume of the sound exceeds a predetermined threshold value; and a data acquisition unit that transmits a shooting instruction to the camera unit when the determination unit makes an affirmative determination, and the transmission unit transmits an image captured by the camera to the monitoring server in response to the shooting instruction being received by the camera unit.
- The fish monitoring system according to any one of claims 1 to 7, wherein the detection unit further detects a natural enemy fish of the fish, and the notification unit notifies the user terminal that a natural enemy fish has been detected when a natural enemy fish is detected by the detection unit.
- The fish monitoring system according to claim 8, wherein the monitoring server further comprises a conversion unit that, when a natural enemy fish is detected by the detection unit, converts the position of the natural enemy fish in the image into a position in the cage using a predetermined conversion table, and the notification unit notifies the user terminal of the position in the cage identified by the conversion unit.
- The fish monitoring system according to claim 9, comprising a plurality of camera units, wherein the conversion unit, when a dead fish is detected by the detection unit, converts the position of the dead fish in the image into a position in the cage using the conversion table, and the notification unit notifies the user terminal that a dead fish has been detected when the position in the cage of a dead fish detected based on an image transmitted from a first camera unit of the plurality of camera units and the position in the cage of a dead fish detected based on an image transmitted from a second camera unit of the plurality of camera units fall within a predetermined range of each other.
- A camera unit fixed to the bottom of a cage, the camera unit comprising: a camera having a lens directed toward the water surface of the cage; and a transmission unit that transmits an image captured by the camera to a monitoring server.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MX2021008271A MX2021008271A (es) | 2019-01-11 | 2020-01-09 | Sistema de monitoreo de peces y unidad de camara. |
NO20210908A NO20210908A1 (en) | 2019-01-11 | 2020-01-09 | Fish Monitoring System and Camera Unit |
AU2020207707A AU2020207707A1 (en) | 2019-01-11 | 2020-01-09 | Fish monitoring system and camera unit |
EP20738472.8A EP3909424A4 (en) | 2019-01-11 | 2020-01-09 | FISH MONITORING SYSTEM AND CAMERA UNIT |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-003327 | 2019-01-11 | ||
JP2019003327A JP6530152B1 (ja) | 2019-01-11 | 2019-01-11 | 魚監視システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020145336A1 true WO2020145336A1 (ja) | 2020-07-16 |
Family
ID=66821720
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/000414 WO2020145336A1 (ja) | 2019-01-11 | 2020-01-09 | 魚監視システム及びカメラユニット |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP3909424A4 (ja) |
JP (1) | JP6530152B1 (ja) |
AU (1) | AU2020207707A1 (ja) |
MX (1) | MX2021008271A (ja) |
NO (1) | NO20210908A1 (ja) |
TW (1) | TW202032969A (ja) |
WO (1) | WO2020145336A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110250053B (zh) * | 2019-07-25 | 2021-09-14 | 广东中科英海科技有限公司 | 一种用于斑马鱼研究的视动反应测试系统 |
JP7243986B2 (ja) * | 2019-10-24 | 2023-03-22 | 中央電子株式会社 | 水産物識別方法、水産物識別プログラム、及び水産物識別システム |
US11475689B2 (en) | 2020-01-06 | 2022-10-18 | X Development Llc | Fish biomass, shape, size, or health determination |
CN112450146A (zh) * | 2020-11-26 | 2021-03-09 | 澜途集思生态科技集团有限公司 | 适于水产养殖的控制系统 |
KR102622793B1 (ko) * | 2021-11-05 | 2024-01-09 | 주식회사 부상 | 양식 어류의 실시간 질병 감지시스템 및 그 방법 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003037755A (ja) | 2001-07-26 | 2003-02-07 | Fuji Photo Optical Co Ltd | 水中監視システム |
JP2004173554A (ja) * | 2002-11-26 | 2004-06-24 | Nec Engineering Ltd | 養殖魚介類盗難防止装置 |
JP2007187575A (ja) * | 2006-01-13 | 2007-07-26 | Shikoku Res Inst Inc | 水質監視装置および水質監視方法 |
JP3160730U (ja) * | 2010-01-05 | 2010-07-08 | 山本 隆洋 | 魚類監視水槽 |
US20160119065A1 (en) * | 2014-10-24 | 2016-04-28 | Wahoo Technologies, LLC | System and method for providing underwater video |
CN106550223A (zh) * | 2017-01-13 | 2017-03-29 | 湖南理工学院 | 一种用于水产养殖的死鱼监控装置及监控方法 |
WO2018061926A1 (ja) * | 2016-09-30 | 2018-04-05 | 日本電気株式会社 | 計数システムおよび計数方法 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE602004020558D1 (de) * | 2003-08-26 | 2009-05-28 | Com E Ind Equa Ltda Puerto Mon | Verfahren zur Überwachung und Steuerung des unverzehrtes Futters in Fischzuchtanlagen |
CN205958008U (zh) * | 2016-06-01 | 2017-02-15 | 湖南理工学院 | 一种用于水产养殖的死鱼监控装置 |
CN206370898U (zh) * | 2017-01-13 | 2017-08-01 | 湖南理工学院 | 一种用于水产养殖的死鱼监控装置 |
CN107423745A (zh) * | 2017-03-27 | 2017-12-01 | 浙江工业大学 | 一种基于神经网络的鱼类活性分类方法 |
JP6401411B1 (ja) * | 2018-02-13 | 2018-10-10 | 株式会社Aiハヤブサ | 人工知能による漁獲物識別システム、管理システム及び物流システム |
-
2019
- 2019-01-11 JP JP2019003327A patent/JP6530152B1/ja active Active
-
2020
- 2020-01-06 TW TW109100282A patent/TW202032969A/zh unknown
- 2020-01-09 WO PCT/JP2020/000414 patent/WO2020145336A1/ja unknown
- 2020-01-09 EP EP20738472.8A patent/EP3909424A4/en not_active Withdrawn
- 2020-01-09 AU AU2020207707A patent/AU2020207707A1/en not_active Abandoned
- 2020-01-09 MX MX2021008271A patent/MX2021008271A/es unknown
- 2020-01-09 NO NO20210908A patent/NO20210908A1/en unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP3909424A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210319550A1 (en) * | 2020-04-10 | 2021-10-14 | X Development Llc | Multi-chamber lighting controller for aquaculture |
US11657498B2 (en) * | 2020-04-10 | 2023-05-23 | X Development Llc | Multi-chamber lighting controller for aquaculture |
US20220000079A1 (en) * | 2020-07-06 | 2022-01-06 | Ecto, Inc. | Acoustics augmentation for monocular depth estimation |
Also Published As
Publication number | Publication date |
---|---|
NO20210908A1 (en) | 2021-07-14 |
TW202032969A (zh) | 2020-09-01 |
EP3909424A1 (en) | 2021-11-17 |
JP6530152B1 (ja) | 2019-06-12 |
MX2021008271A (es) | 2021-09-30 |
EP3909424A4 (en) | 2022-10-19 |
AU2020207707A1 (en) | 2021-07-29 |
JP2020110079A (ja) | 2020-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020145336A1 (ja) | 魚監視システム及びカメラユニット | |
JP2020110138A (ja) | 魚監視システム及びカメラユニット | |
US9609206B2 (en) | Image processing apparatus, method for controlling image processing apparatus and storage medium | |
JP6849081B2 (ja) | 情報処理装置、計数システム、計数方法およびコンピュータプログラム | |
CN110934572A (zh) | 体温监测方法及装置、可读存储介质 | |
KR101858924B1 (ko) | 물고기의 종류를 감지하는 스마트 낚시찌 | |
AU2016421610B2 (en) | Feeding system and feeding method | |
CN106063460B (zh) | 智能喂食器、远程控制方法及智能家居通信系统 | |
EP1510125B1 (en) | Method for monitoring and controlling the non-consumed food in fish farms | |
JP6629055B2 (ja) | 情報処理装置および情報処理方法 | |
CN105959581A (zh) | 具有动态控制闪光灯的用于图像捕捉的电子设备及相关的控制方法 | |
US20230270078A1 (en) | Bird station | |
US10904448B2 (en) | Controlling flash behavior during capture of image data | |
JP6625786B2 (ja) | 異常検知システム、異常検知方法及びプログラム | |
JP6508314B2 (ja) | 撮像装置、撮像方法、及びプログラム | |
KR20230035509A (ko) | 관심 영역들을 구비한 음향 검출 디바이스 및 시스템 | |
JP4825909B2 (ja) | 顔検知装置 | |
JP6650739B2 (ja) | 発光デバイス調整装置および駆動電流調整方法 | |
KR20150000054A (ko) | 어류 양식 관리용 무인 잠수정 | |
JP2008186283A (ja) | 人体検出装置 | |
US20160073087A1 (en) | Augmenting a digital image with distance data derived based on acoustic range information | |
CN211883760U (zh) | 体温监测装置 | |
KR20220001580A (ko) | IoT를 이용한 수위측정 시스템 및 이를 이용한 수위 측정 방법 | |
CN111586352A (zh) | 多光电最佳适配联合调度系统及方法 | |
JP7353864B2 (ja) | 情報処理装置、情報処理装置の制御方法およびプログラム、撮像システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20738472 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020207707 Country of ref document: AU Date of ref document: 20200109 Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020738472 Country of ref document: EP Effective date: 20210811 |