CN113240650A - Fry counting system and method based on deep learning density map regression - Google Patents
- Publication number
- CN113240650A (application CN202110545082.8A)
- Authority
- CN
- China
- Prior art keywords
- fry
- counting
- image
- density map
- deep learning
- Prior art date
- Legal status: Pending
Classifications
- G06T7/0002—Image analysis; inspection of images, e.g. flaw detection
- G06N3/045—Neural networks; combinations of networks
- G06T5/40—Image enhancement or restoration using histogram techniques
- G06T5/70—Denoising; smoothing
- G06T2207/30242—Counting objects in image
Abstract
The invention discloses a fry counting system and method based on deep learning density map regression, belonging to the technical field of metering and counting equipment. The method comprises the following steps: step 1, placing fry into a fry box and remotely connecting to and controlling the embedded chip device through an industrial personal computer; step 2, acquiring a static image of the fry with a camera and displaying it on the industrial personal computer; step 3, passing the image obtained in step 2 to the image processing module in the embedded chip device for sharpening and denoising; step 4, passing the image processed in step 3 to the fry counting model; and step 5, viewing the counting image and counting result on the industrial personal computer and storing them in a database. By combining a fry counting model based on deep learning density map regression with an embedded device, the method copes well with adhesion (overlapping fry) and ensures accurate counting results.
Description
Technical Field
The invention relates to the technical field of metering equipment and counting, in particular to a fry counting system and method based on deep learning density map regression.
Background
In aquaculture, accurate fry counting is the basis of scientific management in links such as feed feeding, fry transport and sale, fry survival rate evaluation and culture density control. Traditional counting mostly relies on manual methods such as the cup method, the sieving method and the weighing method, which are time-consuming, labour-intensive and of low accuracy.
At present, some researchers have designed physical flow-splitting methods to replace traditional manual counting. For example, authorization publication number CN106204626A discloses a fry counter and counting method based on image processing [1]; the counter spreads the fry evenly over a layer of the water surface using the vertically upward water flow in the counting box, preventing the fry from gathering. Application publication number CN110973036A discloses a fry counting device and method based on machine vision [2], which counts the fry in batches through a revolving door to avoid the repeated counting caused by continuous-shooting counting. Compared with traditional counting, these physical flow-splitting approaches save labour, but their speed and accuracy are still insufficient and they damage the fry to some extent. Application publication number CN109509175A discloses a portable fry counter and counting method based on machine vision and deep learning [5]; the counter comprises a control box and a fry counting box, performs super-resolution reconstruction on the fry images after acquiring the image stream, performs fast coarse segmentation with a deep convolutional generative adversarial network, then performs accurate semantic segmentation, extracts the fry contours and counts them. Application publication number CN205263883U discloses an online automatic fry counting device based on machine vision [6], comprising a red backlight source, an industrial camera, a pipeline, a filter screen, a circular plane, an open cylindrical water tank, an electromagnetic valve and a liquid level meter connected by pipelines. Application publication number CN106339752A discloses a fry counting assembly [7], characterized in that it comprises a fry box for holding the fry, a counting box for counting them and several measuring tools for sampling them; a camera device is arranged above the counting box, the height of its lens above the water level of the counting box being adjustable; the camera device is connected to a processor comprising an image processing module, a comparison module and a memory, with a display screen and printer as output equipment. Application publication number CN112215798A discloses a fry counting and detection method and device based on machine vision [8], which locates and counts the fry with an algorithm based on machine vision and a convolutional neural network, predicts fry positions from the detected water flow speed to avoid repeated counting, uses a pulsed backflushing device that jets backflushing water at the channel port at intervals to avoid fry blockage, and uses two water injection devices to solve the missed detections caused by large numbers of overlapping fry. Wangshou of Taiyuan University of Science and Technology [9] proposed a turbot fry counting method based on computer vision: through computer vision and image processing technologies, the collected pictures are greyed, filtered and binarized to extract the fry objects, and single-frame pictures are then counted by an area method and a connected-domain method respectively. When the target fry are not adhered, the counting works well; when they are adhered, the counting is inaccurate, so a curve evolution method was adopted to improve the earlier direct counting.
In image processing systems and underwater observation systems, Lixu et al. [3] proposed a system for observing the appearance of underwater structures with a 40-camera array, and Wu Dehao et al. [4] studied target detection with a multi-camera-array image processing system. When a multi-camera array is applied, a host computer and display must be carried along in addition to the camera array and cables; although the precision improves on earlier methods, such systems are still time-consuming, the devices used are bulky, and field experiments are costly.
Therefore, to address the above problems, the invention takes fry counting as its research objective and designs an embedded fry counting device based on deep learning density map regression together with a visual interactive system. The device achieves better than 90% accuracy at near-real-time speed while reducing the equipment volume by about 80%, does not damage the fry, and is convenient to use in links such as stock enhancement and release and fry sales.
[1] Zhejiang Marine Aquatic Product Institute. A fry counter based on image processing and a counting method [P]. CN106204626A, 2016-12-07.
[2] A fry counting device and method based on machine vision [P]. Jiangsu Province: CN110973036A, 2020-04-10.
[3] Lixu. Implementation and application of an underwater structure appearance observation system with a 40-camera array [D]. Southeast University, 2018.
[4] Wu Dehao. Research on multi-camera-array image processing systems and target detection technology [D]. University of Electronic Science and Technology, 2017.
[5] Leafy stamp glume, Hang Chengyu, Zhao Jian, Cultural script, Zhuangming. A portable fry counter based on machine vision and deep learning and a counting method [P]. Zhejiang Province: CN109509175A, 2019-03-22.
[6] Wide plum blossom, Old love army, Shenxiayan, Baifenghe, Zhuxinming, Wangcheng. An online automatic fry counting device based on machine vision [P]. Zhejiang: CN205263883U, 2016-05-25.
[7] Zhouyongdong, Fengmeizi, Lipeng, Xukaidao, Jiang-Nian, Liulian. A fry counting device and counting method [P]. Zhejiang: CN106339752A, 2017-01-18.
[8] Zhangrong Biao, Chenjinlong. A fry counting detection method and device based on machine vision [P]. Jiangsu Province: CN112215798A, 2021-01-12.
[9] Wangshou. A turbot fry counting method based on computer vision [D]. Taiyuan University of Science and Technology, 2015.
Disclosure of Invention
The invention aims to provide a fry counting system and a fry counting method based on deep learning density map regression.
The fry counting system based on deep learning density map regression is characterized by comprising an embedded chip device (1), a camera (2), a fan (3), a wireless network card (4), a fry box (5), an industrial personal computer (6) and a USB interface (7). The camera (2), the fan (3), the wireless network card (4) and the USB interface (7) are mounted on the embedded chip device (1); the camera (2) is connected to the embedded chip device (1) through the USB interface (7), and the image information of the fry box (5) collected by the camera (2) is transmitted through the wireless network card (4) to the industrial personal computer (6) for display.
The counting method of the fry counting system based on deep learning density map regression is characterized by comprising the following steps of:
step 1: putting the fry into the fry box, and remotely connecting and controlling the embedded chip equipment through an industrial personal computer;
step 2: acquiring static image information of the fry by using a camera and displaying the static image information on an industrial personal computer;
step 3: transmitting the image obtained in step 2 into the image processing module in the embedded chip device for sharpening and denoising;
step 4: transferring the image processed in step 3 into the fry counting model MespNet;
step 41: inputting the image to be counted into the deep learning density map regression network and down-sampling it, the output resolution being 1/4 of the original; the formula for converting the fry image into a density map is as follows:

an image labelled with N fry is represented as

  H(x) = Σ_{i=1}^{N} δ(x − x_i);

so that the integral of the density map equals the number of fry, a density map based on a geometry-adaptive Gaussian kernel is adopted:

  F(x) = Σ_{i=1}^{N} δ(x − x_i) * G_{σ_i}(x),  with σ_i = β·d̄_i,

where x_i is the pixel coordinate of the i-th fry in the image, δ(x − x_i) is a delta function at that coordinate, N is the total number of fry, and d̄_i is the average distance from x_i to the d fry nearest to it; taking the head-centre distance as the distance between two fry, this average distance is approximately equal to the size of a fish head under dense conditions;
step 42: extracting more comprehensive information by a multi-scale method to improve the detection precision of the fry counting model MespNet: the front-end network adopts the Bottleneck blocks of MobileNet_v2, the back-end network adopts ESP Blocks, and at the end a 1 × 1 convolution converts the feature map into a density map, the counting result being obtained by integration;
step 5: viewing the counting image and the counting result on the industrial personal computer and storing them in a database.
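As an illustration of the density map construction in step 41, the sketch below builds a geometry-adaptive Gaussian density map in plain numpy. The neighbour count `d` and scale factor `beta` are illustrative assumptions, since their values are not fixed in the text.

```python
import numpy as np

def adaptive_density_map(points, shape, d=3, beta=0.3):
    """Density map whose integral equals the number of fry.

    points : iterable of (row, col) fry head coordinates x_i
    shape  : (H, W) of the output map
    d, beta: hypothetical defaults; sigma_i = beta * (mean distance
             from x_i to its d nearest fry), per the adaptive kernel.
    """
    H, W = shape
    pts = np.asarray(points, dtype=np.float64)
    density = np.zeros((H, W))
    rr, cc = np.mgrid[0:H, 0:W]
    for i, (r, c) in enumerate(pts):
        if len(pts) > 1:
            # sorted distances to the other fry; d_bar_i = mean of the d nearest
            dists = np.sort(np.linalg.norm(pts - pts[i], axis=1))[1:]
            sigma = beta * dists[:d].mean()
        else:
            sigma = beta * min(H, W) / 4.0  # fallback when only one fry is present
        g = np.exp(-((rr - r) ** 2 + (cc - c) ** 2) / (2.0 * sigma ** 2))
        density += g / g.sum()  # normalise so each fry contributes exactly 1
    return density
```

Summing the map recovers the fry count, which is why the predicted density map can be integrated to yield the counting result.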
The ESP Blocks are divided into a Reduce part and a Split part. The parameter count of the Reduce part is

  M·N / K,

where M is the number of input channels of the convolution kernels, N is the number of convolution kernels used, and K is the number of same-size feature maps (parallel branches);

the parameter count of the Split part is

  n²·N² / K,

where n is the size of the convolution kernel;

the total parameter count of the ESP Blocks is

  (M·N + n²·N²) / K.

The receptive field of the ESP Blocks is that perceived by the largest convolution kernel among the K dilated (atrous) convolutions, its size being

  [(n − 1)·2^(K−1) + 1]².
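The Reduce/Split parameter accounting and the receptive-field size of the ESP Blocks described above can be checked with a short calculation; the comparison with a standard n × n convolution is an addition for context, and the example values in the usage note are arbitrary.

```python
def esp_param_count(M, N, K, n):
    """Parameter accounting for an ESP block.

    M : input channels of the convolution kernels
    N : number of convolution kernels (output channels)
    K : number of parallel same-size feature maps (dilated branches)
    n : convolution kernel size
    """
    reduce_params = M * N // K           # Reduce: 1x1 conv, M -> N/K channels
    split_params = n * n * N * N // K    # Split: K dilated n x n convs on N/K channels
    total = reduce_params + split_params
    # receptive field of the largest of the K dilated convolutions
    receptive_field = ((n - 1) * 2 ** (K - 1) + 1) ** 2
    standard = n * n * M * N             # a plain n x n convolution, for comparison
    return reduce_params, split_params, total, receptive_field, standard
```

For example, with M = N = 128, K = 4 and n = 3, the ESP block needs 40960 parameters against 147456 for a standard convolution, with a 17 × 17 receptive field.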
The invention has the beneficial effects that:
according to the invention, the fry counting model regressed by the deep learning density map is combined with the embedded equipment, the size of the device is effectively reduced compared with a case that a multi-camera array needs to carry a host and the like, and the deep learning counting model regressed by the density map can well solve the adhesion problem and ensure the accurate counting result. In addition, the device can acquire images of the fry and display image information, if the image quality is poor, the image information can be acquired again by controlling the camera of the embedded equipment through the industrial personal computer, and an image processing module is attached to the image shot under the condition of poor environment, so that the integral operation is convenient and quick.
Drawings
FIG. 1 is a schematic diagram of a fry counting system based on deep learning density map regression;
in the figure: 1-embedded chip device, 2-camera, 3-fan, 4-wireless network card, 5-fry box, 6-industrial personal computer and 7-USB interface;
FIG. 2 is a flow chart of a counting method of a fry counting system based on deep learning density map regression;
FIG. 3 is a structural diagram of the deep learning density map regression network of the present invention;
FIG. 4 is an interface diagram of an interactive system of the present invention.
Detailed Description
The invention provides a fry counting system and a fry counting method based on deep learning density map regression, and the invention is further explained by combining the attached drawings and the specific embodiment.
Fig. 1 is a schematic diagram of the fry counting system based on deep learning density map regression. In this embodiment, the fry counting system includes an embedded device, an embedded camera, a small embedded fan, a wireless network card, a fry box for holding the fry, and an industrial personal computer (for remote control). The camera, the small fan and the wireless network card are embedded into the device to form a complete piece of equipment, matched with a controllable visual interactive system.
Fig. 2 is a flowchart of a counting method of a fry counting system based on deep learning density map regression. In this embodiment, the fry counting based on deep learning includes the following steps:
1) Putting the fry into the fry box, turning on the power supply of the embedded device, starting the fan to dissipate heat, providing the device with a network through the wireless network card, and remotely connecting to the embedded device through the controllable visual interactive system.
2) Clicking the photograph button of the controllable visual interactive system calls the embedded camera module, and the acquired image information is displayed in a pop-up window of the system interface. Repeated clicks acquire image information repeatedly, with each newly generated image overwriting the previous one to reduce memory usage.
3) Clicking the denoise button of the controllable visual interactive system denoises the picture shot in step 2) using OpenCV methods such as histogram processing and filtering to obtain a clear image; clicking multiple times performs the processing multiple times, each time starting again from the result of step 2).
4) Clicking the count button of the controllable visual interactive system passes the image obtained in step 3) into the fry counting model; the model processes the image and a pop-up window is shown when processing finishes. Clicking multiple times repeats the processing, and the counting results are the same.
5) And checking the final counting result on an interface of the visual interactive system, and storing the counting result in a database.
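Step 3) above mentions histogram and filtering methods from OpenCV. The sketch below reproduces those two operations in plain numpy (stand-ins for `cv2.equalizeHist` and `cv2.blur`, chosen here for illustration) so it runs without OpenCV installed.

```python
import numpy as np

def equalize_hist(img):
    """Histogram equalisation of an 8-bit grayscale image (numpy stand-in
    for cv2.equalizeHist)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    # map each grey level through the normalised cumulative histogram
    lut = np.round((cdf - cdf_min) / (img.size - cdf_min) * 255.0).astype(np.uint8)
    return lut[img]

def mean_filter(img, k=3):
    """k x k mean filter for denoising (edge-padded; stand-in for cv2.blur)."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dr in range(k):
        for dc in range(k):
            out += padded[dr:dr + img.shape[0], dc:dc + img.shape[1]]
    return np.round(out / (k * k)).astype(np.uint8)
```

In practice one would call the OpenCV functions directly; the numpy versions only make the processing explicit.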
The method of the step 4) comprises the following steps:
First, the image to be counted is input into the deep learning density map regression network of the invention and down-sampled, the output resolution being 1/4 of the original; more comprehensive information is extracted by a multi-scale method to improve the model's detection precision. The front-end network adopts the Bottleneck blocks of MobileNet_v2, the back-end network adopts ESP Blocks, and at the end a 1 × 1 convolution converts the feature map into a density map, the counting result being obtained by integration. The structure of the MespNet fry counting network is shown in Fig. 3.
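Because the predicted density map is produced at 1/4 of the input resolution, counting by integration still works as long as down-sampling preserves the map's total mass. A sum-pooling sketch illustrates this (an illustration of the property, not the network's own layers):

```python
import numpy as np

def sum_pool(density, factor=4):
    """Downsample a density map by `factor` while preserving its sum."""
    H, W = density.shape
    H2, W2 = H // factor, W // factor
    # group factor x factor cells and sum them, so the total mass is unchanged
    return density[:H2 * factor, :W2 * factor] \
        .reshape(H2, factor, W2, factor).sum(axis=(1, 3))

def count_from_density(density):
    """The counting result is the integral (sum) of the density map."""
    return float(density.sum())
```

The count read off the quarter-resolution map equals the count of the full-resolution map, which is what lets the network output a small map cheaply.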
Tests on the fry data set achieve good precision and speed; the detection results are shown in Table 1 and Table 2.
TABLE 1 count error and model weight comparison for each count model
TABLE 2 comparison of the parameters of the inventive method (MespNet) and FCNet
As the comparison in Table 1 and Table 2 shows, the MespNet method has a smaller error and a faster processing speed.
FIG. 4 is an interface diagram of the interactive system of the invention. The embedded device of this embodiment may be a Raspberry Pi, a DSP board, an Android phone, an iOS phone, a tablet, etc.; photographing, denoising and counting are realized through the controllable visual interactive system, and the counting result is stored in a database.
The present invention is not limited to the above embodiments, and any changes or substitutions that can be easily made by those skilled in the art within the technical scope of the present invention are also within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (3)
1. A fry counting system based on deep learning density map regression, characterized by comprising an embedded chip device (1), a camera (2), a fan (3), a wireless network card (4), a fry box (5), an industrial personal computer (6) and a USB interface (7). The camera (2), the fan (3), the wireless network card (4) and the USB interface (7) are mounted on the embedded chip device (1); the camera (2) is connected to the embedded chip device (1) through the USB interface (7), and the image information of the fry box (5) collected by the camera (2) is transmitted through the wireless network card (4) to the industrial personal computer (6) for display.
2. The counting method of the fry counting system based on deep learning density map regression as claimed in claim 1, characterized by comprising the following steps:
step 1: putting the fry into the fry box, and remotely connecting and controlling the embedded chip equipment through an industrial personal computer;
step 2: acquiring static image information of the fry by using a camera and displaying the static image information on an industrial personal computer;
step 3: transmitting the image obtained in step 2 into the image processing module in the embedded chip device for sharpening and denoising;
step 4: transferring the image processed in step 3 into the fry counting model MespNet;
step 41: inputting the image to be counted into the deep learning density map regression network and down-sampling it, the output resolution being 1/4 of the original; the formula for converting the fry image into a density map is as follows:

an image labelled with N fry is represented as

  H(x) = Σ_{i=1}^{N} δ(x − x_i);

so that the integral of the density map equals the number of fry, a density map based on a geometry-adaptive Gaussian kernel is adopted:

  F(x) = Σ_{i=1}^{N} δ(x − x_i) * G_{σ_i}(x),  with σ_i = β·d̄_i,

where x_i is the pixel coordinate of the i-th fry in the image, δ(x − x_i) is a delta function at that coordinate, N is the total number of fry, and d̄_i is the average distance from x_i to the d fry nearest to it; taking the head-centre distance as the distance between two fry, this average distance is approximately equal to the size of a fish head under dense conditions;
step 42: extracting more comprehensive information by a multi-scale method to improve the detection precision of the fry counting model MespNet: the front-end network adopts the Bottleneck blocks of MobileNet_v2, the back-end network adopts ESP Blocks, and at the end a 1 × 1 convolution converts the feature map into a density map, the counting result being obtained by integration;
step 5: viewing the counting image and the counting result on the industrial personal computer and storing them in a database.
3. The counting method of the fry counting system based on deep learning density map regression as claimed in claim 2, wherein the ESP Blocks are divided into a Reduce part and a Split part; the parameter count of the Reduce part is

  M·N / K,

where M is the number of input channels of the convolution kernels, N is the number of convolution kernels used, and K is the number of same-size feature maps (parallel branches);

the parameter count of the Split part is

  n²·N² / K,

where n is the size of the convolution kernel;

the total parameter count of the ESP Blocks is

  (M·N + n²·N²) / K;

the receptive field of the ESP Blocks is that perceived by the largest convolution kernel among the K dilated (atrous) convolutions, its size being

  [(n − 1)·2^(K−1) + 1]².
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110545082.8A | 2021-05-19 | 2021-05-19 | Fry counting system and method based on deep learning density map regression |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN113240650A (en) | 2021-08-10 |
Family
ID=77137558
Legal Events

- PB01: Publication
- SE01: Entry into force of request for substantive examination