CN115984145B - Precise fish-passing identification method for fishway - Google Patents

Precise fish-passing identification method for fishway

Info

Publication number
CN115984145B
CN115984145B (application CN202310186817.1A)
Authority
CN
China
Prior art keywords
pixel
fish
passing
image
fishway
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310186817.1A
Other languages
Chinese (zh)
Other versions
CN115984145A (en)
Inventor
曹艳敏 (Cao Yanmin)
王崇宇 (Wang Chongyu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan City University
Original Assignee
Hunan City University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan City University filed Critical Hunan City University
Priority to CN202310186817.1A priority Critical patent/CN115984145B/en
Publication of CN115984145A publication Critical patent/CN115984145A/en
Application granted granted Critical
Publication of CN115984145B publication Critical patent/CN115984145B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00: Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/60: Ecological corridors or buffer zones

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of fishway fish-passing counting and discloses a precise fishway fish-passing identification method comprising the following steps: performing image contour segmentation on the preprocessed fishway fish-passing image; splitting the segmented overlapping fish contour image based on contour curvature to obtain several single-fish contour images; and fitting the split single-fish contour images to obtain complete single-fish images, which are counted to give the number of fish passing through the fishway. According to the invention, the acquired image is filtered with an adaptive median filtering method to remove noise pixels; a pixel-similarity measurement objective function incorporating pixel similarity is proposed to achieve fish-contour segmentation in overlapping scenes; contour discontinuity points are determined from contour curvature and used to separate the individual fish-contour images; and contour fitting that incorporates pixel distance yields complete single-fish images that are then counted.

Description

Precise fish-passing identification method for fishway
Technical Field
The invention relates to the technical field of fish-passing counting of fishways, in particular to a precise fishway fish-passing identification method.
Background
Two navigation-and-hydropower complexes on the main stem of the Hunan (Xiang) River, the sixth and seventh cascades counted from upstream to downstream, have both been equipped with fish-passage channels. As connecting corridors between the reaches upstream and downstream of a dammed river, fish-passage facilities play an important role in assisting fish migration, promoting gene exchange among aquatic organisms above and below the dam, and maintaining river connectivity. Installing fish-monitoring equipment in the fishway enables dynamic, intelligent identification and tracking of the number and species of fish, overcoming the inaccuracy and labour of traditional manual counting and improving fish-identification efficiency and accuracy; such a system is paired with a Riverwatcher-type fishway fish-passage monitor. The instrument is fixed where fish must pass in the fishway pool; its scanning unit consists of two scanner plates (20 × 60 cm) mounted on a frame, with the spacing between the plates adjustable from 10 cm to 45 cm. LEDs on the scanning unit emit infrared beams toward the receiver on the opposite plate, and when a fish swims through the beam grid the system automatically captures its silhouette, from which its size is calculated. The image of each fish is stored in a control unit, which counts the fish as the images are stored. The system can be connected to a digital camera that records video or still images of fish as they pass through the scanning unit: the scanner triggers the camera to capture 1 to 5 digital pictures or a video clip of each fish, and the processing software automatically associates these with the other information the system acquires, such as the fish's size, passage time, speed, contour and water temperature. However, fish overlap one another as they migrate upstream, which makes the count inaccurate. Aiming at this problem, the invention provides a precise fishway fish-passing identification method that extracts and completes the overlapping portions so as to achieve an accurate fish-passing count.
Disclosure of Invention
In view of the above, the invention provides a precise fishway fish-passing identification method with the following aims: 1) an adaptive median filtering method identifies noise pixels in advance and filters only those pixels while leaving the others unchanged; during median filtering, local information from the neighbourhood of each noise pixel is incorporated, with closer pixels weighted more heavily, and the median-filter result is corrected accordingly; 2) based on image-segmentation theory, a pixel-similarity measurement objective function is proposed; solving it yields the segmentation parameter, pixels whose values are close to that parameter are selected as fish-contour pixels, fish contours are thereby segmented in overlapping scenes, contour discontinuity points are determined from contour curvature, and the individual fish-contour images are separated at those points; 3) contour fitting that incorporates pixel distance completes the unclosed fish-contour images, yielding complete single-fish images that are counted to give a real-time fish-passing count for the fishway.
The invention provides a precise fish-passing recognition method for a fishway, which comprises the following steps:
S1: collecting a fishway fish-passing image and preprocessing the fishway fish-passing image to obtain a preprocessed fishway fish-passing image, wherein the preprocessing mode is filtering and morphological processing;
s2: carrying out image contour segmentation on the pre-processed fish-passing image of the fishway to obtain an overlapped fish-passing contour image;
s3: splitting the segmented overlapping fish-passing contour images based on contour curvature to obtain a plurality of single fish-passing contour images, wherein each single fish-passing contour image represents a single fish contour image passing through a fishway;
s4: fitting the split multiple single fish profile images to obtain multiple complete single fish profile images, and counting to obtain the fish-passing number of the fishway.
As a further improvement of the present invention:
optionally, the step S1 of collecting the fish-passing image of the fishway and filtering the collected image includes:
arranging an imaging device at a fishway opening, and shooting at the fishway opening by using the imaging device to obtain a real-time fishway fish-passing image as an image acquisition result, wherein the fishway fish-passing image is the fishway opening image, and the shot fishway fish-passing image comprises a plurality of fishes;
filtering the collected fish-passing image of the fishway, wherein the filtering process flow is as follows:
S11: gray processing is carried out on any pixel I (I, j) in the fish-passing image of the fishway, wherein a gray processing formula is as follows:
g(i,j)=max{I R (i,j),I G (i,j),I B (i,j)}
wherein:
I R (i,j),I G (i,j),I B (I, j) are the color values of the pixel I (I, j) in the R, G, B color channels, respectively, the pixel I (I, j) represents the pixel of the ith row and jth column in the fish-passing image, I E [1, M],j∈[1,N]M represents the number of pixel rows of the fish-passing image of the fishway, and N represents the number of pixel columns of the fish-passing image of the fishway;
g (I, j) represents the gray value of pixel I (I, j);
s12: noise pixel marking is carried out on any pixel I (I, j) in the fish-passing image of the fishway:
wherein:
α (I, j) represents the noise marking result of the pixel I (I, j), α (I, j) =0 represents that the pixel I (I, j) is a noise pixel, and α (I, j) =1 represents that the pixel I (I, j) is a non-noise pixel;
s13: constructing a 5×5 filter matrix, wherein the constructed filter matrix is empty, the input of the filter matrix is an adjacent 5×5 pixel matrix centering on noise pixels, and the pixel values in the pixel matrix are pixel gray values;
s14: for arbitrary noise pixels I (I 0 ,j 0 ) Inputting a 5×5 pixel matrix adjacent to the pixel matrix into a filter matrix, calculating the Euclidean distance from any pixel in the filter matrix to a central noise pixel, and carrying out normalization processing on the Euclidean distance by using a z-score method, wherein in the embodiment of the invention, the pixel value of the pixel is set to be 0 for the non-existing pixel;
S15: calculating to obtain local information of the filter matrix:
wherein:
D(i 0 ,j 0 ) Represented as noise pixels I (I 0 ,j 0 ) Local information of the filter matrix which is the center;
Ω(i 0 ,j 0 ) Represented as noise pixels I (I 0 ,j 0 ) A non-center set of pixels of the filter matrix that is center;
d k representing pixel k to noise pixel I (I 0 ,j 0 ) G (k) represents the pixel value of pixel k;
s16: converting the local information of the filter matrix into median filter weights:
wherein:
represented as noise pixels I (I 0 ,j 0 ) Median filter weight, th, of a centered filter matrix 1 =30,Th 2 =60;
S17: center pixel I (I) of the filter matrix based on median filter weights 0 ,j 0 ) And performing median filtering, wherein the formula of the median filtering is as follows:
wherein:
g(i 0 ,j 0 ) Representing the center pixel I (I 0 ,j 0 ) Initial pixel value g of (2) (i 0 ,j 0 ) Representing the center pixel I (I 0 ,j 0 ) The pixel value after the filtering processing;
median(i 0 ,j 0 ) Representing the center-off pixel I (I) 0 ,j 0 ) Besides, the pixel value median of the rest pixels;
and (3) processing the steps S13 to S17 on all noise pixels in the fishway fish passing image to obtain a fishway fish passing image after filtering processing.
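For illustration only, a minimal NumPy sketch of this filtering flow is given below. The exact expressions for the noise test (S12), the local-information term D (S15), the weight mapping (S16) and the weighted median update (S17) appear only as figures in the original filing, so plausible stand-ins are used for them; the function names (to_gray, adaptive_median), the exponential distance weighting and the linear blend between Th_1 = 30 and Th_2 = 60 are assumptions of this sketch, not the patented formulas.

```python
import numpy as np

TH1, TH2 = 30.0, 60.0    # Th_1 and Th_2 from S16

def to_gray(img_rgb):
    """S11: grey value g(i, j) = max of the R, G, B channel values."""
    return img_rgb.max(axis=2).astype(np.float64)

def adaptive_median(gray, noise_mask):
    """S13-S17: correct only the pixels flagged as noise (noise_mask == True)."""
    out = gray.copy()
    rows, cols = gray.shape
    dy, dx = np.mgrid[-2:3, -2:3]
    dist = np.hypot(dy, dx)
    ring = np.ones((5, 5), dtype=bool)
    ring[2, 2] = False                                        # non-central pixels of the window
    d = (dist[ring] - dist[ring].mean()) / dist[ring].std()   # z-score-normalised distances
    w = np.exp(-d)                                            # assumed weights: closer pixels weigh more
    for i0, j0 in zip(*np.nonzero(noise_mask)):
        patch = np.zeros((5, 5))                              # out-of-image pixels read as 0
        for di in range(-2, 3):
            for dj in range(-2, 3):
                i, j = i0 + di, j0 + dj
                if 0 <= i < rows and 0 <= j < cols:
                    patch[di + 2, dj + 2] = gray[i, j]
        # S15 (assumed form): distance-weighted deviation of the neighbourhood from the centre
        D = np.sum(w * np.abs(patch[ring] - patch[2, 2])) / np.sum(w)
        # S16 (assumed form): map D to a median-filter weight in [0, 1] via Th1 and Th2
        lam = np.clip((D - TH1) / (TH2 - TH1), 0.0, 1.0)
        # S17 (assumed form): blend the original centre value with the neighbourhood median
        out[i0, j0] = (1.0 - lam) * patch[2, 2] + lam * np.median(patch[ring])
    return out
```

The design choice mirrors the stated behaviour: pixels flagged as non-noise are never touched, and the strength of the median correction grows with the local disagreement around the noise pixel.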
Optionally, in the step S1, morphological processing is performed on the filtered fish-passing image of the fishway, including:
morphological processing is carried out on the fish-passing image of the fishway after the filtering processing, wherein the morphological processing flow is as follows:
Constructing the morphological processing matrices:
wherein:
M_1 is the matrix constructed for morphological erosion and M_2 is the matrix constructed for morphological dilation;
comparing every pixel of the filtered fishway fish-passing image with a preset pixel threshold: if the pixel value is greater than the threshold, the pixel is marked as 1, otherwise it is marked as 0;
scanning every pixel of the filtered fishway fish-passing image with matrix M_1, the centre of M_1 coinciding with the pixel being scanned; the marking results of the pixel being scanned and its neighbours are ANDed with the corresponding entries of M_1; if all the resulting values are 1, the pixel value is left unchanged, otherwise it is set to the mean of its 3×3 neighbourhood, giving the morphologically eroded fishway fish-passing image;
scanning every pixel of the morphologically eroded fishway fish-passing image with matrix M_2, the centre of M_2 coinciding with the pixel being scanned; the marking results of the pixel being scanned and its neighbours are ANDed with the corresponding entries of M_2; if all the resulting values are 0, the pixel value is left unchanged, otherwise it is set to the mean of its 3×3 neighbourhood, giving the preprocessed fishway fish-passing image.
Optionally, in the step S2, image contour segmentation is performed on the pre-processed fish-pass image, including:
carrying out image contour segmentation on the pre-processed fish-passing image of the fishway to obtain an overlapped fish-passing contour image, wherein the image contour segmentation process comprises the following steps:
s21: calculating to obtain the pixel weight among any different pixels:
wherein:
f_{a,b} denotes the pixel weight between pixel I(a) and pixel I(b) in the preprocessed fishway fish-passing image;
g'(a) denotes the pixel value of preprocessed pixel I(a), and g'(b) denotes the pixel value of preprocessed pixel I(b);
dist(a,b) denotes the Euclidean distance between pixel I(a) and pixel I(b);
σ_a denotes the standard deviation of the Euclidean distances between pixel I(a) and the remaining pixels, and σ_b denotes that for pixel I(b);
S22: let the parameter to be solved be θ; pixels whose values are less than or equal to θ are marked as fish-body pixels, the remaining pixels are marked as background pixels, and pixels whose values are close to θ are the fish-contour pixels;
S23: construct the pixel-similarity measurement objective function F(θ):
wherein:
β(c) denotes the discriminant function of pixel I(c): if the pixel value of I(c) is less than or equal to θ, β(c) is set to 1, otherwise to -1;
f_{c,z} denotes the pixel weight between pixel I(c) and pixel I(z);
Ω(c) denotes the set of pixels in the preprocessed fishway fish-passing image other than pixel I(c);
S24: the parameter θ that minimises the pixel-similarity measurement objective function is taken as the solution, and the fish-contour pixels in the preprocessed fishway fish-passing image are then marked accordingly to obtain the fish-contour decomposition result.
Optionally, in the step S3, splitting the segmented overlapping fish-profile image based on the profile curvature includes:
splitting the segmented overlapping fish-passing profile images based on the profile curvature to obtain a plurality of single fish-passing profile images, wherein each single fish-passing profile image represents a single fish profile image passing through a fishway, and the splitting process of the overlapping fish-passing profile images based on the profile curvature comprises the following steps:
s31: taking any fish profile pixel as a starting point, forming a group of line segments by 10 continuous fish profile pixels, and dividing the fish profile pixels in the overlapped fish profile image into a plurality of line segments;
s32: for any pixel I (I' 1 ,j′ 1 ) As a starting point, pixel I (I ") 1 ,j″ 1 ) Line segment L as end point 1 And adjacent line segment L 2 Wherein line segment L 2 With pixel I (I' 2 ,j′ 2 ) As a starting point, pixel I (I ") 2 ,j″ 2 ) For the end point, the curvature S (L) 1 ,L 2 ):
Wherein:
Len(L 1 ) Representing line segment L 1 Is a length of (2);
ε(L 1 ) Representing line segment L 1 An included angle with the horizontal axis;
s33: and calculating to obtain the curvature average value of the adjacent line segments, setting the intersection point between the adjacent line segments lower than the area average value as a discontinuous point, and splitting the contour in the overlapping fish contour image based on the discontinuous point to obtain a plurality of groups of single fish contour images, wherein each group of single fish contour images comprises a group of continuous fish contours.
Optionally, in the step S4, fitting the split multiple single fish profile images includes:
fitting the split multiple single fish profile images, wherein the fitting process is as follows:
For any single-fish contour image, the pixel-coordinate sequence (h(1), h(2), …, h(T)) of the fish contour is marked, wherein h(t) denotes the pixel coordinates of the t-th fish-contour pixel in the single-fish contour image, T denotes the total number of fish-contour pixels in the single-fish contour image, and h(1) denotes the starting pixel of the fish contour;
a contour function G(t) of the fish contour is constructed:
G(t) = u_0 + u_1·t + u_2·t² + u_3·t³
wherein:
u_0, u_1, u_2, u_3 denote the parameters of the contour function;
G(t) denotes the distance between the coordinates of the t-th and the (t-1)-th fish-contour pixels;
a corresponding matrix calculation formula is constructed based on interpolation:
wherein:
R_1 denotes the Euclidean distance between the pixel coordinates h(1) of the first fish-contour pixel and h(2);
the equation is solved to obtain the matrix P, and the contour-function parameters are computed from the elements of P, wherein:
based on the constructed contour function, the distance between each subsequent unknown pixel coordinate and the previous coordinate is obtained; the angle between the line joining the unknown coordinate to the previous coordinate and the horizontal axis is set to the curvature of the adjacent line segment that ends at the previous coordinate; the specific coordinates of the subsequent unknown pixel are then obtained from the set angle and distance, and the pixel coordinates are iterated in this way until a closed fitting result is obtained, which is the complete single-fish image.
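A sketch of this completion step is given below. The filing recovers u_0 to u_3 from an interpolation matrix P that appears only as a figure, so an ordinary least-squares fit of the cubic to the observed step lengths is used instead; the turning angle is passed in as a parameter standing in for the curvature of the last contour segment, and the function names and tolerances are illustrative assumptions.

```python
import numpy as np

def fit_step_model(contour):
    """Fit G(t) = u0 + u1*t + u2*t^2 + u3*t^3 to the observed inter-pixel step lengths.
    contour: ordered (x, y) pixels of one open single-fish outline."""
    pts = np.asarray(contour, dtype=float)
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)   # G(t): distance to the previous pixel
    t = np.arange(2, len(pts) + 1, dtype=float)
    A = np.vander(t, 4, increasing=True)                   # columns [1, t, t^2, t^3]
    u, *_ = np.linalg.lstsq(A, steps, rcond=None)          # least-squares stand-in for matrix P
    return u                                               # u0, u1, u2, u3

def extend_contour(contour, u, turn_angle, max_new=200, close_tol=2.0):
    """Append predicted pixels until the outline closes on its start point.
    turn_angle stands in for the per-step heading change taken from the curvature
    of the last contour segment; max_new and close_tol are illustrative."""
    pts = [np.asarray(p, dtype=float) for p in contour]
    heading = np.arctan2(*(pts[-1] - pts[-2])[::-1])       # current direction of travel
    t = len(pts) + 1
    for _ in range(max_new):
        step = float(np.polyval(u[::-1], t))               # predicted G(t) for the next pixel
        heading += turn_angle
        nxt = pts[-1] + step * np.array([np.cos(heading), np.sin(heading)])
        pts.append(nxt)
        t += 1
        if np.linalg.norm(nxt - pts[0]) < close_tol:       # outline has closed
            break
    return pts
```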
Optionally, in the step S4, the counting of the single fish passing images to obtain the real-time fish passing number of the fishway includes:
Single-fish contour images whose number of fish-contour pixels is below a preset threshold, and which therefore cannot form a closed fitting result, are filtered out; the remaining single-fish images are counted, and the count is the real-time fish-passing number of the fishway.
In order to solve the above-described problems, the present invention provides an electronic apparatus including:
a memory storing at least one instruction;
the communication interface is used for realizing the communication of the electronic equipment; and
And the processor executes the instructions stored in the memory to realize the precise fishway fish passing identification method.
In order to solve the above-mentioned problems, the present invention also provides a computer-readable storage medium having stored therein at least one instruction that is executed by a processor in an electronic device to implement the above-mentioned precise fishway fish-passing identification method.
Compared with the prior art, the invention provides a precise fish passage identification method, which has the following advantages:
Firstly, the scheme provides an adaptive image-filtering flow that filters the collected fishway fish-passing image as follows. Grey processing is carried out on every pixel I(i,j) of the fishway fish-passing image using the formula:
g(i,j) = max{I_R(i,j), I_G(i,j), I_B(i,j)}
wherein: I_R(i,j), I_G(i,j), I_B(i,j) are the colour values of pixel I(i,j) in the R, G and B colour channels respectively; pixel I(i,j) denotes the pixel in the i-th row and j-th column of the fishway fish-passing image, with i ∈ [1, M], j ∈ [1, N], where M is the number of pixel rows and N the number of pixel columns; g(i,j) denotes the grey value of pixel I(i,j). Noise-pixel marking is then carried out on every pixel I(i,j):
wherein: α(i,j) denotes the noise-marking result of pixel I(i,j); α(i,j) = 0 indicates that I(i,j) is a noise pixel and α(i,j) = 1 that it is a non-noise pixel. A 5×5 filter matrix is constructed; the constructed filter matrix is initially empty, its input is the neighbouring 5×5 pixel matrix centred on a noise pixel, and the pixel values in that matrix are grey values. For every noise pixel I(i_0, j_0), its neighbouring 5×5 pixel matrix is fed into the filter matrix, the Euclidean distance from every pixel in the filter matrix to the central noise pixel is computed, and the distances are normalised with the z-score method. The local information of the filter matrix is then computed:
wherein: D(i_0, j_0) denotes the local information of the filter matrix centred on noise pixel I(i_0, j_0); Ω(i_0, j_0) denotes the set of its non-central pixels; d_k denotes the normalised Euclidean distance from pixel k to the noise pixel I(i_0, j_0), and g(k) the pixel value of pixel k. The local information is converted into a median-filter weight:
wherein: the median-filter weight of the filter matrix centred on noise pixel I(i_0, j_0) is obtained with Th_1 = 30 and Th_2 = 60. The central pixel I(i_0, j_0) of the filter matrix is then median-filtered based on this weight:
wherein: g(i_0, j_0) denotes the initial pixel value of the central pixel I(i_0, j_0), g'(i_0, j_0) its pixel value after filtering, and median(i_0, j_0) the median of the pixel values of all pixels other than the central pixel. The scheme thus adopts an adaptive median-filtering method that identifies noise pixels in advance and filters only those pixels while leaving the others unchanged; during median filtering the neighbouring local information of each noise pixel is incorporated, with closer pixels weighted more heavily, and the median-filter result is corrected accordingly.
Meanwhile, the scheme provides an image contour-segmentation method that performs contour segmentation on the preprocessed fishway fish-passing image to obtain the overlapping fish contour image. The segmentation proceeds as follows. The pixel weight between any two different pixels is computed:
wherein: f_{a,b} denotes the pixel weight between pixel I(a) and pixel I(b) in the preprocessed fishway fish-passing image; g'(a) and g'(b) denote the preprocessed pixel values of I(a) and I(b); dist(a,b) denotes the Euclidean distance between I(a) and I(b); σ_a and σ_b denote the standard deviations of the Euclidean distances between I(a), respectively I(b), and the remaining pixels. The parameter to be solved is θ: pixels whose values are less than or equal to θ are marked as fish-body pixels, the remaining pixels as background pixels, and pixels whose values are close to θ are the fish-contour pixels. The pixel-similarity measurement objective function F(θ) is constructed:
wherein: β(c) denotes the discriminant function of pixel I(c): if the pixel value of I(c) is less than or equal to θ, β(c) is 1, otherwise -1; f_{c,z} denotes the pixel weight between pixel I(c) and pixel I(z); Ω(c) denotes the set of pixels in the preprocessed fishway fish-passing image other than I(c). The parameter θ that minimises F(θ) is taken as the solution, the fish-contour pixels in the preprocessed image are marked accordingly, and the fish-contour decomposition result is obtained. The segmented overlapping fish contour image is then split based on contour curvature into several single-fish contour images, each representing the contour of one fish passing through the fishway. In this way, based on image-segmentation theory, the scheme proposes a pixel-similarity measurement objective function, solves it to obtain the segmentation parameter, selects pixels close to that parameter as fish-contour pixels, and thereby achieves fish-contour segmentation in overlapping scenes; contour discontinuity points are determined from contour curvature and used to separate the individual fish-contour images; and contour fitting that incorporates pixel distance completes the unclosed fish-contour images, yielding complete single-fish images that are counted to give the real-time fish-passing count of the fishway.
Drawings
Fig. 1 is a schematic flow chart of a precise fishway fish-passing identification method according to an embodiment of the invention;
fig. 2 is a schematic structural diagram of an electronic device for implementing a precise fishway fish-passing recognition method according to an embodiment of the invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The embodiment of the application provides a precise fish-passing identification method for a fishway. The execution main body of the precise fishway fish identification method comprises, but is not limited to, at least one of a server, a terminal and the like which can be configured to execute the method provided by the embodiment of the application. In other words, the precise fishway fish identification method may be performed by software or hardware installed in a terminal device or a server device, and the software may be a blockchain platform. The service end includes but is not limited to:
a single server, a server cluster, a cloud server or a cloud server cluster, and the like.
Example 1:
s1: and acquiring a fish-passing image of the fishway and preprocessing the fish-passing image of the fishway to obtain a preprocessed fish-passing image of the fishway, wherein the preprocessing mode is filtering and morphological processing.
The step S1 is to collect the fish-passing image of the fishway and filter the collected image, and comprises the following steps:
arranging an imaging device at a fishway opening, and shooting at the fishway opening by using the imaging device to obtain a real-time fishway fish-passing image as an image acquisition result, wherein the fishway fish-passing image is the fishway opening image, and the shot fishway fish-passing image comprises a plurality of fishes;
filtering the collected fish-passing image of the fishway, wherein the filtering process flow is as follows:
s11: gray processing is carried out on any pixel I (I, j) in the fish-passing image of the fishway, wherein a gray processing formula is as follows:
g(i,j)=max{I R (i,j),I G (i,j),I B (i,j)}
wherein:
I R (i,j),I G (i,j),I B (I, j) are the color values of the pixel I (I, j) in the R, G, B color channels, respectively, the pixel I (I, j) represents the pixel of the ith row and jth column in the fish-passing image, I E [1, M],j∈[1,N]M represents the number of pixel rows of the fish-passing image of the fishway, and N represents the number of pixel columns of the fish-passing image of the fishway;
g (I, j) represents the gray value of pixel I (I, j);
s12: noise pixel marking is carried out on any pixel I (I, j) in the fish-passing image of the fishway:
wherein:
α (I, j) represents the noise marking result of the pixel I (I, j), α (I, j) =0 represents that the pixel I (I, j) is a noise pixel, and α (I, j) =1 represents that the pixel I (I, j) is a non-noise pixel;
s13: constructing a 5×5 filter matrix, wherein the constructed filter matrix is empty, the input of the filter matrix is an adjacent 5×5 pixel matrix centering on noise pixels, and the pixel values in the pixel matrix are pixel gray values;
S14: for arbitrary noise pixels I (I 0 ,j 0 ) Inputting a 5×5 pixel matrix adjacent to the pixel matrix into a filter matrix, calculating the Euclidean distance from any pixel in the filter matrix to a central noise pixel, and carrying out normalization processing on the Euclidean distance by using a z-score method, wherein in the embodiment of the invention, the pixel value of the pixel is set to be 0 for the non-existing pixel;
s15: calculating to obtain local information of the filter matrix:
wherein:
D(i 0 ,j 0 ) Represented as noise pixels I (I 0 ,j 0 ) Local information of the filter matrix which is the center;
Ω(i 0 ,j 0 ) Represented as noise pixels I (I 0 ,j 0 ) A non-center set of pixels of the filter matrix that is center;
d k representing pixel k to noise pixel I (I 0 ,j 0 ) G (k) represents the pixel value of pixel k;
s16: converting the local information of the filter matrix into median filter weights:
wherein:
represented as noise pixels I (I 0 ,j 0 ) Median filter weight, th, of a centered filter matrix 1 =30,Th 2 =60;
S17: center pixel I (I) of the filter matrix based on median filter weights 0 ,j 0 ) And performing median filtering, wherein the formula of the median filtering is as follows:
wherein:
g(i 0 ,j 0 ) Representing the center pixel I (I 0 ,j 0 ) G' (i) 0 ,j 0 ) Representing the center pixel I (I 0 ,j 0 ) The pixel value after the filtering processing;
median(i 0 ,j 0 ) Representing the center-off pixel I (I) 0 ,j 0 ) Besides, the pixel value median of the rest pixels;
and (3) processing the steps S13 to S17 on all noise pixels in the fishway fish passing image to obtain a fishway fish passing image after filtering processing.
In the step S1, morphological processing is performed on the fish-passing image of the fishway after the filtering processing, including:
morphological processing is carried out on the fish-passing image of the fishway after the filtering processing, wherein the morphological processing flow is as follows:
Constructing the morphological processing matrices:
wherein:
M_1 is the matrix constructed for morphological erosion and M_2 is the matrix constructed for morphological dilation;
comparing every pixel of the filtered fishway fish-passing image with a preset pixel threshold: if the pixel value is greater than the threshold, the pixel is marked as 1, otherwise it is marked as 0;
scanning every pixel of the filtered fishway fish-passing image with matrix M_1, the centre of M_1 coinciding with the pixel being scanned; the marking results of the pixel being scanned and its neighbours are ANDed with the corresponding entries of M_1; if all the resulting values are 1, the pixel value is left unchanged, otherwise it is set to the mean of its 3×3 neighbourhood, giving the morphologically eroded fishway fish-passing image;
scanning every pixel of the morphologically eroded fishway fish-passing image with matrix M_2, the centre of M_2 coinciding with the pixel being scanned; the marking results of the pixel being scanned and its neighbours are ANDed with the corresponding entries of M_2; if all the resulting values are 0, the pixel value is left unchanged, otherwise it is set to the mean of its 3×3 neighbourhood, giving the preprocessed fishway fish-passing image.
S2: and performing image contour segmentation on the pre-processed fish-passing image of the fishway to obtain an overlapped fish-passing contour image.
In the step S2, image contour segmentation is carried out on the pre-processed fish-passing image of the fishway, and the method comprises the following steps:
carrying out image contour segmentation on the pre-processed fish-passing image of the fishway to obtain an overlapped fish-passing contour image, wherein the image contour segmentation process comprises the following steps:
s21: calculating to obtain the pixel weight among any different pixels:
wherein:
f_{a,b} denotes the pixel weight between pixel I(a) and pixel I(b) in the preprocessed fishway fish-passing image;
g'(a) denotes the pixel value of preprocessed pixel I(a), and g'(b) denotes the pixel value of preprocessed pixel I(b);
dist(a,b) denotes the Euclidean distance between pixel I(a) and pixel I(b);
σ_a denotes the standard deviation of the Euclidean distances between pixel I(a) and the remaining pixels, and σ_b denotes that for pixel I(b);
S22: let the parameter to be solved be θ; pixels whose values are less than or equal to θ are marked as fish-body pixels, the remaining pixels are marked as background pixels, and pixels whose values are close to θ are the fish-contour pixels;
S23: construct the pixel-similarity measurement objective function F(θ):
wherein:
β(c) denotes the discriminant function of pixel I(c): if the pixel value of I(c) is less than or equal to θ, β(c) is set to 1, otherwise to -1;
f_{c,z} denotes the pixel weight between pixel I(c) and pixel I(z);
Ω(c) denotes the set of pixels in the preprocessed fishway fish-passing image other than pixel I(c);
S24: the parameter θ that minimises the pixel-similarity measurement objective function is taken as the solution, and the fish-contour pixels in the preprocessed fishway fish-passing image are then marked accordingly to obtain the fish-contour decomposition result.
S3: splitting the segmented overlapping fish-passing contour images based on the contour curvature to obtain a plurality of single fish-passing contour images, wherein each single fish-passing contour image represents a single fish-passing contour image passing through the fishway.
In the step S3, splitting the segmented overlapped fish-passing contour image based on the contour curvature comprises the following steps:
splitting the segmented overlapping fish-passing profile images based on the profile curvature to obtain a plurality of single fish-passing profile images, wherein each single fish-passing profile image represents a single fish profile image passing through a fishway, and the splitting process of the overlapping fish-passing profile images based on the profile curvature comprises the following steps:
s31: taking any fish profile pixel as a starting point, forming a group of line segments by 10 continuous fish profile pixels, and dividing the fish profile pixels in the overlapped fish profile image into a plurality of line segments;
s32: for any pixel I (I' 1 ,j′ 1 ) As a starting point, pixel I (I ") 1 ,j″ 1 ) Line segment L as end point 1 And adjacent line segment L 2 Wherein line segment L 2 With pixel I (I' 2 ,j′ 2 ) As a starting point, pixel I (I ") 2 ,j″ 2 ) For the end point, the curvature S (L) 1 ,L 2 ):
Wherein:
Len(L 1 ) Representing line segment L 1 Is a length of (2);
ε(L 1 ) Representing line segment L 1 An included angle with the horizontal axis;
s33: and calculating to obtain the curvature average value of the adjacent line segments, setting the intersection point between the adjacent line segments lower than the area average value as a discontinuous point, and splitting the contour in the overlapping fish contour image based on the discontinuous point to obtain a plurality of groups of single fish contour images, wherein each group of single fish contour images comprises a group of continuous fish contours.
S4: fitting the split multiple single fish profile images to obtain multiple complete single fish profile images, and counting to obtain the fish-passing number of the fishway.
And in the step S4, fitting the split multiple single fish profile images, wherein the fitting comprises the following steps:
fitting the split multiple single fish profile images, wherein the fitting process is as follows:
For any single-fish contour image, the pixel-coordinate sequence (h(1), h(2), …, h(T)) of the fish contour is marked, wherein h(t) denotes the pixel coordinates of the t-th fish-contour pixel in the single-fish contour image, T denotes the total number of fish-contour pixels in the single-fish contour image, and h(1) denotes the starting pixel of the fish contour;
a contour function G(t) of the fish contour is constructed:
G(t) = u_0 + u_1·t + u_2·t² + u_3·t³
wherein:
u_0, u_1, u_2, u_3 denote the parameters of the contour function;
G(t) denotes the distance between the coordinates of the t-th and the (t-1)-th fish-contour pixels;
a corresponding matrix calculation formula is constructed based on interpolation:
wherein:
R_1 denotes the Euclidean distance between the pixel coordinates h(1) of the first fish-contour pixel and h(2);
the equation is solved to obtain the matrix P, and the contour-function parameters are computed from the elements of P, wherein:
based on the constructed contour function, the distance between each subsequent unknown pixel coordinate and the previous coordinate is obtained; the angle between the line joining the unknown coordinate to the previous coordinate and the horizontal axis is set to the curvature of the adjacent line segment that ends at the previous coordinate; the specific coordinates of the subsequent unknown pixel are then obtained from the set angle and distance, and the pixel coordinates are iterated in this way until a closed fitting result is obtained, which is the complete single-fish image.
In the step S4, counting the single fish passing images to obtain the real-time fish passing number of the fishway, which comprises the following steps:
Single-fish contour images whose number of fish-contour pixels is below a preset threshold, and which therefore cannot form a closed fitting result, are filtered out; the remaining single-fish images are counted, and the count is the real-time fish-passing number of the fishway.
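To show how the steps of this embodiment chain together, a hypothetical end-to-end driver is sketched below using the helper functions from the earlier sketches (to_gray, adaptive_median, preprocess, best_theta, split_overlapping, fit_step_model, extend_contour). The noise-detection and contour-extraction routines are injected as placeholders because their exact formulas appear only as figures in the filing; the threshold of 50 contour pixels and the closure tolerance are assumed values.

```python
import numpy as np

MIN_CONTOUR_PIXELS = 50   # assumed value of the preset pixel-count threshold in S4

def count_fish(frame_rgb, detect_noise, extract_contours, turn_angle=0.05):
    """detect_noise(gray) -> boolean noise mask (S12, formula given as a figure);
    extract_contours(mask) -> list of ordered (x, y) contours (segmentation output)."""
    gray = to_gray(frame_rgb)                              # S11: max of R, G, B
    gray = adaptive_median(gray, detect_noise(gray))       # S13-S17: adaptive median filtering
    pre = preprocess(gray)                                 # morphological erosion + dilation
    theta = best_theta(pre)                                # S21-S24: segmentation parameter
    fish_mask = pre <= theta                               # S22: fish-part pixels
    count = 0
    for contour in extract_contours(fish_mask):            # possibly overlapping outlines
        for piece in split_overlapping(contour):           # S31-S33: split at discontinuities
            if len(piece) < MIN_CONTOUR_PIXELS:
                continue                                   # too few pixels to close (S4 filter)
            u = fit_step_model(piece)                      # cubic step-length model
            outline = extend_contour(piece, u, turn_angle) # complete the open outline
            if np.linalg.norm(outline[-1] - outline[0]) < 2.0:   # closed outline, count one fish
                count += 1
    return count
```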
Example 2:
fig. 2 is a schematic structural diagram of an electronic device for implementing a precise fish-path fish-passing recognition method according to an embodiment of the invention.
The electronic device 1 may comprise a processor 10, a memory 11, a communication interface 13 and a bus, and may further comprise a computer program, such as program 12, stored in the memory 11 and executable on the processor 10.
The memory 11 includes at least one type of readable storage medium, including flash memory, a mobile hard disk, a multimedia card, a card memory (e.g., SD or DX memory, etc.), a magnetic memory, a magnetic disk, an optical disk, etc. The memory 11 may in some embodiments be an internal storage unit of the electronic device 1, such as a removable hard disk of the electronic device 1. The memory 11 may in other embodiments also be an external storage device of the electronic device 1, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the electronic device 1. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device 1. The memory 11 may be used not only for storing application software installed in the electronic device 1 and various types of data, such as codes of the program 12, but also for temporarily storing data that has been output or is to be output.
The processor 10 may be comprised of integrated circuits in some embodiments, for example, a single packaged integrated circuit, or may be comprised of multiple integrated circuits packaged with the same or different functions, including one or more central processing units (Central Processing unit, CPU), microprocessors, digital processing chips, graphics processors, combinations of various control chips, and the like. The processor 10 is a Control Unit (Control Unit) of the electronic device, connects respective parts of the entire electronic device using various interfaces and lines, executes or executes programs or modules (a program 12 for realizing precise fishway identification, etc.) stored in the memory 11, and invokes data stored in the memory 11 to perform various functions of the electronic device 1 and process data.
The communication interface 13 may comprise a wired interface and/or a wireless interface (e.g. WI-FI interface, bluetooth interface, etc.), typically used to establish a communication connection between the electronic device 1 and other electronic devices and to enable connection communication between internal components of the electronic device.
The bus may be a peripheral component interconnect standard (peripheral component interconnect, PCI) bus or an extended industry standard architecture (extended industry standard architecture, EISA) bus, among others. The bus may be classified as an address bus, a data bus, a control bus, etc. The bus is arranged to enable a connection communication between the memory 11 and at least one processor 10 etc.
Fig. 2 shows only an electronic device with components, it being understood by a person skilled in the art that the structure shown in fig. 2 does not constitute a limitation of the electronic device 1, and may comprise fewer or more components than shown, or may combine certain components, or may be arranged in different components.
For example, although not shown, the electronic device 1 may further include a power source (such as a battery) for supplying power to each component, and preferably, the power source may be logically connected to the at least one processor 10 through a power management device, so that functions of charge management, discharge management, power consumption management, and the like are implemented through the power management device. The power supply may also include one or more of any of a direct current or alternating current power supply, recharging device, power failure detection circuit, power converter or inverter, power status indicator, etc. The electronic device 1 may further include various sensors, bluetooth modules, wi-Fi modules, etc., which will not be described herein.
The electronic device 1 may optionally further comprise a user interface, which may be a Display, an input unit, such as a Keyboard (Keyboard), or a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch, or the like. The display may also be referred to as a display screen or display unit, as appropriate, for displaying information processed in the electronic device 1 and for displaying a visual user interface.
It should be understood that the embodiments described are for illustrative purposes only and are not limited to this configuration in the scope of the patent application.
The program 12 stored in the memory 11 of the electronic device 1 is a combination of instructions that, when executed in the processor 10, may implement:
collecting a fishway fish-passing image and preprocessing the fishway fish-passing image to obtain a preprocessed fishway fish-passing image;
carrying out image contour segmentation on the pre-processed fish-passing image of the fishway to obtain an overlapped fish-passing contour image;
splitting the segmented overlapping fish-passing contour image based on the contour curvature to obtain a plurality of single fish-passing contour images;
Fitting the split multiple single fish profile images to obtain multiple complete single fish profile images, and counting to obtain the fish-passing number of the fishway.
Specifically, the specific implementation method of the above instruction by the processor 10 may refer to descriptions of related steps in the corresponding embodiments of fig. 1 to 2, which are not repeated herein.
It should be noted that, the foregoing reference numerals of the embodiments of the present invention are merely for describing the embodiments, and do not represent the advantages and disadvantages of the embodiments. And the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, apparatus, article or method that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (5)

1. An accurate fish passage identification method, which is characterized by comprising the following steps:
s1: collecting a fishway fish-passing image and preprocessing the fishway fish-passing image to obtain a preprocessed fishway fish-passing image, wherein the preprocessing mode is filtering and morphological processing;
s2: carrying out image contour segmentation on the pre-processed fish-passing image of the fishway to obtain an overlapped fish-passing contour image;
the image contour segmentation process comprises the following steps:
s21: calculating to obtain the pixel weight among any different pixels:
wherein:
f_{a,b} denotes the pixel weight between pixel I(a) and pixel I(b) in the preprocessed fishway fish-passing image;
g'(a) denotes the pixel value of preprocessed pixel I(a), and g'(b) denotes the pixel value of preprocessed pixel I(b);
dist(a,b) denotes the Euclidean distance between pixel I(a) and pixel I(b);
σ_a denotes the standard deviation of the Euclidean distances between pixel I(a) and the remaining pixels, and σ_b denotes that for pixel I(b);
S22: let the parameter to be solved be θ; pixels whose values are less than or equal to θ are marked as fish-body pixels, the remaining pixels are marked as background pixels, and pixels whose values are close to θ are the fish-contour pixels;
S23: construct the pixel-similarity measurement objective function F(θ):
wherein:
β(c) denotes the discriminant function of pixel I(c): if the pixel value of I(c) is less than or equal to θ, β(c) is set to 1, otherwise to -1;
f_{c,z} denotes the pixel weight between pixel I(c) and pixel I(z);
Ω(c) denotes the set of pixels in the preprocessed fishway fish-passing image other than pixel I(c);
S24: the parameter θ that minimises the pixel-similarity measurement objective function is taken as the solution, and the fish-contour pixels in the preprocessed fishway fish-passing image are marked accordingly to obtain the fish-contour decomposition result;
s3: splitting the segmented overlapping fish-passing contour images based on contour curvature to obtain a plurality of single fish-passing contour images, wherein each single fish-passing contour image represents a single fish contour image passing through a fishway;
the overlapping fish profile image splitting process based on the profile curvature comprises the following steps:
s31: taking any fish profile pixel as a starting point, forming a group of line segments by 10 continuous fish profile pixels, and dividing the fish profile pixels in the overlapped fish profile image into a plurality of line segments;
S32: for line segment L_1, which starts at pixel I(i'_1, j'_1) and ends at pixel I(i″_1, j″_1), and its adjacent line segment L_2, which starts at pixel I(i'_2, j'_2) and ends at pixel I(i″_2, j″_2), compute the curvature S(L_1, L_2) of the two adjacent segments:
wherein:
Len(L_1) denotes the length of line segment L_1;
ε(L_1) denotes the angle between line segment L_1 and the horizontal axis;
s33: calculating to obtain the average value of the curvatures of all the adjacent line segments, setting the intersection point between the adjacent line segments lower than the average value of the curvatures as a discontinuous point, and splitting the contour in the overlapping fish-passing contour image based on the discontinuous point to obtain a plurality of groups of single fish-passing contour images, wherein each group of single fish-passing contour images comprises a group of continuous fish contours;
s4: fitting the split multiple single fish profile images to obtain multiple complete single fish profile images, and counting to obtain the fish-passing number of the fishway.
2. The precise fish-passing identification method according to claim 1, wherein the step S1 of acquiring the fish-passing image and filtering the acquired image comprises:
arranging an imaging device at a fishway opening, and shooting at the fishway opening by using the imaging device to obtain a real-time fishway fish-passing image as an image acquisition result, wherein the fishway fish-passing image is the fishway opening image, and the shot fishway fish-passing image comprises a plurality of fishes;
Filtering the collected fish-passing image of the fishway, wherein the filtering process flow is as follows:
s11: for any pixel in fish-passing image of fishwayAnd carrying out graying treatment, wherein the graying treatment formula is as follows:
wherein:
pixels respectively->Color values of R, G, B color channels, pixels>Pixels representing the ith row and jth column in the fish-pass image +.>,/>M represents the number of pixel rows of the fish-passing image of the fishway, and N represents the number of pixel columns of the fish-passing image of the fishway;
representing pixel +.>Gray values of (2);
s12: for any pixel in fish-passing image of fishwayNoise pixel labeling:
wherein:
representing pixel +.>Noise marking results,/-, of (2)>Representation imageSu->Is noise pixel +.>Representing pixel +.>Is a non-noise pixel;
s13: constructionWherein the constructed filter matrix is null and the input of the filter matrix is adjacent +.>A pixel matrix, wherein the pixel values in the pixel matrix are pixel gray values;
s14: for arbitrary noise pixelsAdjacent it->Inputting the pixel matrix into a filter matrix, calculating the Euclidean distance from any pixel in the filter matrix to a central noise pixel, and carrying out normalization processing on the Euclidean distance by using a z-score method;
S15: calculating to obtain local information of the filter matrix:
wherein:
expressed as noise pixels->Local information of the filter matrix which is the center;
expressed as noise pixels->A non-center set of pixels of the filter matrix that is center;
representing pixel k to noise pixel->Normalized Euclidean distance of>A pixel value representing pixel k;
s16: converting the local information of the filter matrix into median filter weights:
wherein:
expressed as noise pixels->Median filter weight of the filter matrix being centered, < ->
S17: center pixel of filter matrix based on median filter weightAnd performing median filtering, wherein the formula of the median filtering is as follows:
wherein:
representing the center pixel +.>Is +.>Representing the center pixel +.>The pixel value after the filtering processing;
representing the center-off pixel in the filter matrix>Besides, the pixel value median of the rest pixels;
and (3) processing the steps S13 to S17 on all noise pixels in the fishway fish passing image to obtain a fishway fish passing image after filtering processing.
3. The precise fish-passing identification method according to claim 2, wherein the step S1 of morphological processing of the filtered fish-passing image includes:
Morphological processing is carried out on the fish-passing image of the fishway after the filtering processing, wherein the morphological processing flow is as follows:
constructing a morphological processing matrix:
wherein:
M_1 is the matrix constructed for morphological erosion and M_2 is the matrix constructed for morphological dilation;
compare every pixel of the filtered fishway fish-passing image with a preset pixel threshold; if the pixel value is greater than the threshold, mark the pixel as 1, otherwise mark it as 0;
scan every pixel of the filtered fishway fish-passing image with matrix M_1, aligning the centre of M_1 with the pixel being scanned; AND the marking results of the pixel being scanned and its neighbours with the corresponding entries of M_1; if all the resulting values are 1, leave the pixel value unchanged, otherwise set it to the mean of its 3×3 neighbourhood, obtaining the morphologically eroded fishway fish-passing image;
scan every pixel of the morphologically eroded fishway fish-passing image with matrix M_2, aligning the centre of M_2 with the pixel being scanned; AND the marking results of the pixel being scanned and its neighbours with the corresponding entries of M_2; if all the resulting values are 0, leave the pixel value unchanged, otherwise set it to the mean of its 3×3 neighbourhood, obtaining the preprocessed fishway fish-passing image.
4. The precise fishway fish-passing identification method of claim 1, wherein the step S4 of fitting the split single fish-passing profile images comprises:
fitting the multiple split single fish profile images, the fitting process being as follows:
for any single fish profile image, marking the sequence of pixel coordinates of the fish profile {p_1, p_2, ..., p_T}, wherein p_t denotes the pixel coordinates of the t-th fish profile pixel in the single fish profile image, T denotes the total number of fish profile pixels in the single fish profile image, and p_1 denotes the starting pixel of the fish profile;
constructing a contour function of the fish contour, wherein:
the parameters of the contour function are the quantities to be determined;
d_t denotes the distance between the t-th fish profile pixel coordinate and the (t-1)-th fish profile pixel coordinate;
constructing the corresponding matrix equation by interpolation, wherein:
d_1 denotes the Euclidean distance between the pixel coordinates p_1 of the 1st fish profile pixel and the adjacent fish profile pixel coordinate;
solving the equation to obtain a matrix P, and calculating the contour function parameters from the elements of the matrix P;
obtaining from the constructed contour function the distance between each subsequent unknown pixel coordinate and the previous coordinate; setting the included angle between the horizontal axis and the line connecting the subsequent unknown pixel coordinate with the previous coordinate, the set angle being the curvature of the adjacent line segment whose end point is the previous coordinate; obtaining the specific coordinates of the subsequent unknown pixel from the set included angle and the distance; and iterating the pixel coordinates in this manner until a closed fitting result is obtained, the fitting result being the single fish-passing image.
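The iterative closing procedure at the end of claim 4 can be pictured with a short sketch: starting from the last known contour pixel, each new point is placed at a step distance and an included angle with the horizontal axis until the curve returns to the starting pixel. In the Python sketch below the step distance and the angle come from caller-supplied callables that merely stand in for the fitted contour function and the curvature-based angle setting; their interfaces, the closing tolerance and the iteration cap are assumptions, not values from the patent.

import math

def extend_contour(points, step_fn, angle_fn, close_tol=2.0, max_steps=500):
    # points   : list of (x, y) known contour pixel coordinates
    # step_fn  : callable t -> distance from the previous coordinate to the next
    #            unknown coordinate (stand-in for the fitted contour function)
    # angle_fn : callable t -> included angle (radians) between the new segment
    #            and the horizontal axis, i.e. the set curvature of the adjacent
    #            segment ending at the previous coordinate
    start = points[0]
    for t in range(len(points), len(points) + max_steps):
        x_prev, y_prev = points[-1]
        d = step_fn(t)       # distance to the subsequent unknown pixel
        theta = angle_fn(t)  # set included angle with the horizontal axis
        nxt = (x_prev + d * math.cos(theta), y_prev + d * math.sin(theta))
        points.append(nxt)
        # stop once the fitted curve returns close to the starting pixel
        if len(points) > 3 and math.dist(nxt, start) < close_tol:
            return points    # closed fitting result: one single fish-passing contour
    return points            # not closed within the budget; the caller may discard it

Candidate contours that never close within the iteration budget are precisely the ones claim 5 discards before counting.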
5. The precise fishway fish-passing identification method of claim 4, wherein the step S4 of counting the single fish-passing images to obtain the real-time fish-passing number comprises:
filtering out the single fish-passing profile images in which the number of fish profile pixels is less than a preset threshold and which cannot form a closed single fish-passing image, and counting the retained single fish-passing images, the counting result being the real-time fish-passing number of the fishway.
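Claim 5 then reduces to a filter-and-count over the fitted candidates. The sketch below assumes each candidate is summarized by its contour pixel count and a closed-contour flag; the 50-pixel threshold and the data layout are illustrative placeholders rather than values taken from the patent.

def count_fish(candidates, min_pixels=50):
    # candidates : iterable of (num_contour_pixels, is_closed) pairs, one per
    #              split single fish-passing profile image
    # returns    : the real-time fish-passing number after filtering out
    #              too-small or non-closed candidates
    return sum(
        1
        for num_pixels, is_closed in candidates
        if num_pixels >= min_pixels and is_closed
    )

For example, count_fish([(120, True), (30, True), (200, False)]) returns 1, since only the first candidate both exceeds the pixel threshold and forms a closed contour.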
CN202310186817.1A 2023-02-27 2023-02-27 Precise fish-passing identification method for fishway Active CN115984145B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310186817.1A CN115984145B (en) 2023-02-27 2023-02-27 Precise fish-passing identification method for fishway

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310186817.1A CN115984145B (en) 2023-02-27 2023-02-27 Precise fish-passing identification method for fishway

Publications (2)

Publication Number Publication Date
CN115984145A CN115984145A (en) 2023-04-18
CN115984145B true CN115984145B (en) 2024-02-02

Family

ID=85970501

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310186817.1A Active CN115984145B (en) 2023-02-27 2023-02-27 Precise fish-passing identification method for fishway

Country Status (1)

Country Link
CN (1) CN115984145B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104077773A (en) * 2014-06-23 2014-10-01 北京京东方视讯科技有限公司 Image edge detection method, and image target identification method and device
JP6702535B2 (en) * 2015-03-09 2020-06-03 Necソリューションイノベータ株式会社 Same fish judging device, fish counting device, portable terminal for fish counting, same fish judging method, fish counting method, fish number predicting device, fish number predicting method, same fish judging system, fish counting system and fish number predicting system

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200442A (en) * 2014-09-19 2014-12-10 西安电子科技大学 Improved canny edge detection based non-local means MRI (magnetic resonance image) denoising method
KR102129698B1 (en) * 2019-12-19 2020-07-02 김맹기 Automatic fish counting system
CN111199551A (en) * 2020-01-06 2020-05-26 北京农业信息技术研究中心 Target segmentation method and system for fish overlapped image
CN115526285A (en) * 2021-06-25 2022-12-27 中国农业大学 Fish counting device and counting method thereof, electronic equipment and storage medium
CN113870149A (en) * 2021-10-21 2021-12-31 重庆邮电大学 Non-local total variation image restoration method based on smooth structure tensor self-adaption
CN114419320A (en) * 2022-01-19 2022-04-29 中山大学 Fish shoal detection method, system and device based on image recognition
CN115147710A (en) * 2022-07-15 2022-10-04 杭州电子科技大学 Sonar image target processing method based on heterogeneous filtering detection and level set segmentation
CN115170535A (en) * 2022-07-20 2022-10-11 水电水利规划设计总院有限公司 Hydroelectric engineering fishway fish passing counting method and system based on image recognition
CN115546746A (en) * 2022-08-24 2022-12-30 湖南第一师范学院 Crack detection method and device for high-speed running rail vehicle
CN115512145A (en) * 2022-10-12 2022-12-23 中国第一汽车股份有限公司 Image segmentation method and device, vehicle and storage medium
CN116184377A (en) * 2022-12-07 2023-05-30 南京朴厚生态科技有限公司 Fishway fish-passing monitoring system using sonar technology

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Method for segmentation of overlapping fish images in aquaculture; Zhou C, et al.; Vol. 12 (No. 6); full text *
Research progress on fish identification based on machine vision technology; Yang Donghai; Zhang Shengmao; Tang Xianfeng; Fishery Information and Strategy (No. 02); full text *
Non-local means denoising based on fuzzy-metric visual features; Lyu Junrui; Luo Xuegang; Qi Shifeng; Peng Zhenming; Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition) (No. 03); full text *
Research on a computer-vision-based counting method for turbot fry; Wang Shuo; China Master's Theses Full-text Database, Information Science and Technology (No. 8); Chapters 2-3 *
Segmentation and reconstruction of overlapping citrus fruits based on contour curvature and distance analysis; Liu Yu, et al.; Journal of Agricultural Science and Technology (China); Vol. 22 (No. 8); Section 1 *

Also Published As

Publication number Publication date
CN115984145A (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN106960195B (en) Crowd counting method and device based on deep learning
EP3740897B1 (en) License plate reader using optical character recognition on plural detected regions
US11783572B2 (en) Method of automatically extracting information of a predefined type from a document
WO2021068178A1 (en) Systems and methods for image quality detection
CN113160257B (en) Image data labeling method, device, electronic equipment and storage medium
CN110287963B (en) OCR recognition method for comprehensive performance test
CN111860439A (en) Unmanned aerial vehicle inspection image defect detection method, system and equipment
CN111639629B (en) Pig weight measurement method and device based on image processing and storage medium
CN113324864B (en) Pantograph carbon slide plate abrasion detection method based on deep learning target detection
CN109271848B (en) Face detection method, face detection device and storage medium
CN114821014B (en) Multi-mode and countermeasure learning-based multi-task target detection and identification method and device
CN110807775A (en) Traditional Chinese medicine tongue image segmentation device and method based on artificial intelligence and storage medium
CN110580499B (en) Deep learning target detection method and system based on crowdsourcing repeated labels
CN109800659B (en) Action recognition method and device
CN111639704A (en) Target identification method, device and computer readable storage medium
CN112862703B (en) Image correction method and device based on mobile photographing, electronic equipment and medium
CN115984145B (en) Precise fish-passing identification method for fishway
CN116580326A (en) Aviation environment safety risk prevention and control detection and early warning system
CN114550069B (en) Piglet nipple counting method based on deep learning
CN116363655A (en) Financial bill identification method and system
CN112818987B (en) Method and system for identifying and correcting display content of electronic bus stop board
CN115147405A (en) Rapid nondestructive testing method for new energy battery
CN113205067A (en) Method and device for monitoring operator, electronic equipment and storage medium
CN112749731A (en) Bill quantity identification method and system based on deep neural network
CN117218162B (en) Panoramic tracking vision control system based on ai

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant