CN116030373A - Intelligent fishway monitoring system - Google Patents


Info

Publication number
CN116030373A
CN116030373A (application CN202210568008.2A)
Authority
CN
China
Prior art keywords: fish, fishway, passing, grating, intelligent
Prior art date
Legal status
Pending
Application number
CN202210568008.2A
Other languages
Chinese (zh)
Inventor
靳帅
田若朝
袁静
江杰
付彦伟
Current Assignee
Guoneng Dadu River Zhentouba Power Generation Co ltd
Original Assignee
Guoneng Dadu River Zhentouba Power Generation Co ltd
Priority date
Application filed by Guoneng Dadu River Zhentouba Power Generation Co ltd filed Critical Guoneng Dadu River Zhentouba Power Generation Co ltd
Priority claimed from application CN202210568008.2A
Published as CN116030373A
Legal status: Pending

Classifications

    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02A: Technologies for adaptation to climate change
    • Y02A 40/00: Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A 40/60: Ecological corridors or buffer zones

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of intelligent fishway monitoring, in particular to an intelligent fishway monitoring system, which comprises: a fishway comprehensive monitoring module comprising a swimming-fish real-time monitoring submodule, a fish-passing quantity counting submodule, a fishway flow-velocity trend analysis submodule, a fishway water-turbidity counting submodule and a fishway water-level and water-temperature change trend counting submodule. The fish-passing quantity counting submodule is implemented by an infrared grating and a camera: a set number of infrared gratings are arranged vertically, at equal height and in parallel at the fishway entrance; the blocking of a grating signal is judged to be the fish-passing start signal, and scanning is performed at a set frequency; the blocking-state data of the set number of grating points, read from top to bottom in each scan, are collected into one frame of grating data and reported; reporting stops once the fish has passed; and the frames of grating data are joined and combined into an image to obtain a complete image of the passing fish.

Description

Intelligent fishway monitoring system
Technical Field
The invention relates to the technical field of intelligent fishway monitoring, in particular to an intelligent fishway monitoring system.
Background
In the prior art, complicated monitoring devices are installed to monitor effectively the state of fish entering the fishway, but the following problems exist: on the one hand, the monitoring device is easily shifted or swayed by the water flow, making the monitoring information inaccurate; on the other hand, the degree of integration is low and the coordination between devices is not ideal, so the existing fishway serves merely as a channel for fish migration and its efficiency cannot be controlled. Important factors affecting ecological health, such as fish size, species, quantity and the fish-passing environment, cannot be monitored, which greatly limits the function and application of the fishway. On this basis, an intelligent fishway monitoring system was designed to overcome the above problems.
Disclosure of Invention
The invention aims to provide an intelligent fishway monitoring system which is used for solving the technical problems.
The embodiment of the invention is realized by the following technical scheme:
fishway intelligent monitoring system includes:
the fishway comprehensive monitoring module comprises a swimming fish real-time monitoring submodule, a fish passing quantity counting submodule, a fishway flow velocity trend analysis submodule, a fishway water turbidity counting submodule and a fishway water level water temperature change trend counting submodule;
the fish passing quantity counting submodule is specifically realized by an infrared grating and a camera, wherein the infrared grating is arranged at the entrance of a fish pond in a set number, vertical, equal-height and parallel mode, signals of the infrared grating are shielded, the infrared grating is judged to be fish passing starting signals, scanning is conducted at set frequency, shielding state data of grating points with the number set from top to bottom, obtained through each scanning, are collected to form one frame of grating data and reported, reporting is stopped after fish passing is finished, and each frame of grating data is connected and combined to form an image to obtain a complete image of fish passing.
Optionally, the reported grating data are joined and combined into an image, and the swimming speed and migration direction of the fish are measured.
Optionally, the time at which the fish reaches the first set of infrared gratings is recorded as T1 and the time at which the first set of infrared grating signals completely recovers as Ts. The fish length is calculated as:
L = V * (Ts - T1)
where T1 is the time the fish reaches the first grating set, Ts is the time at which the first set's signals completely recover, and V is the swimming speed of the passing fish.
Optionally, in the fish passing number counting sub-module, a fish passing identification method is further preset, and the method comprises the following steps:
inputting the complete fish image into the trained CNN model, performing target detection on it with the YoloV5 algorithm preset in the trained CNN model, identifying the fish species, obtaining the pixel size of the fish, and calculating the real size of the fish from the proportional relation between the camera's field-of-view size and the pixel size.
Optionally, the training process of the CNN model is as follows:
inputting a historical fish passing image, and extracting features of the historical fish passing image through a constructed CNN model to obtain features of the historical fish passing image;
processing the historical fish-passing image features through a region proposal network to generate candidate boxes for the historical fish-passing image features, which after cropping and filtering are normalized by softmax to obtain the classification probabilities of the historical fish-passing image features;
performing binary classification on the normalized candidate boxes and screening out the ROIs; generating fixed-size feature maps from the ROIs through the ROI pooling layer; mapping the proposal windows onto the feature map of the last layer of the CNN model; and jointly training the classification probabilities and bounding-box regression of the historical fish-passing image features with softmax Loss and Smooth L1 Loss to obtain the trained CNN model.
Optionally, when feature extraction is performed on the historical fish-passing images, the colour of the historical fish-passing image is used as a classification feature and is represented by hue and lightness.
Optionally, a multi-target tracking algorithm is preset in the trained CNN model to obtain speed, quantity and direction data for the fish, specifically:
acquiring the tracking-box ID and box position information;
setting a counting line BC and calculating the centre position A' of the current tracking box; the motion vector is defined as the vector AA' from A, the centre position of the tracking box with the same ID in the previous frame, to A'; if AA' crosses the counting line BC, the fish is judged to have swum past the counting line, and the direction of AA' is the swimming direction of the fish;
updating the number of frames since each tracking-box ID crossed the line; if a tracking-box ID crosses the counting line again after the frame-number threshold has been exceeded, the fish is considered to have passed again and is counted; otherwise it is considered to be loitering and is not counted repeatedly.
Optionally, an error back-propagation ANN classifier is preset in the fish-passing quantity counting sub-module to judge from the pixels of the complete fish image whether they belong to a fish: the label of a fish is set to 1 and the labels of other organisms or objects to -1, and the output of the back-propagation ANN classifier indicates whether a pixel belongs to a fish.
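As a rough illustration of the back-propagation classifier with +1/-1 labels described above, the following is a minimal sketch; the network size, the two-dimensional pixel features and the synthetic training data are all illustrative assumptions, not the patent's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyANN:
    """Two-layer tanh network trained by error back-propagation."""
    def __init__(self, n_in, n_hidden):
        self.w1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.w2 = rng.normal(0.0, 0.5, (n_hidden, 1))

    def forward(self, x):
        self.h = np.tanh(x @ self.w1)      # hidden activations
        return np.tanh(self.h @ self.w2)   # output in (-1, 1)

    def train(self, x, y, lr=0.2, epochs=1000):
        for _ in range(epochs):
            out = self.forward(x)
            g2 = (out - y) * (1 - out**2)            # back through output tanh
            g1 = (g2 @ self.w2.T) * (1 - self.h**2)  # back through hidden tanh
            self.w2 -= lr * self.h.T @ g2 / len(x)
            self.w1 -= lr * x.T @ g1 / len(x)

# Synthetic "fish" (+1) vs "non-fish" (-1) pixel features.
x = np.vstack([rng.normal(1.0, 0.3, (50, 2)), rng.normal(-1.0, 0.3, (50, 2))])
y = np.vstack([np.ones((50, 1)), -np.ones((50, 1))])
net = TinyANN(2, 8)
net.train(x, y)
acc = float((np.sign(net.forward(x)) == np.sign(y)).mean())
```

The sign of the network output then plays the role of the fish/non-fish decision.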
Optionally, the method further comprises:
the ecological environment monitoring module comprises an ecological environment real-time monitoring sub-module, an environment monitoring equipment management sub-module and an upstream and downstream data trend statistics sub-module;
the intelligent monitoring center comprises a fishway underwater real-time monitoring submodule and a historical swimming fish information query submodule;
the system configuration center comprises a user booklet module, a user authority management sub-module, a password modification sub-module, a data backup sub-module, an organization management sub-module and a system parameter configuration sub-module.
Optionally, the data acquired by the camera and the infrared grating are specifically received by the underwater connector and reported to the server.
The technical scheme of the embodiment of the invention has at least the following advantages and beneficial effects:
in the invention, the fish passing monitoring device is designed by simulating a real environment experiment and performing serious analysis and summarization, and the fish passing monitoring device comprises a main box body: caliber: width: 300mm; high: 400mm; and (3) length: 600mm. Maximum overall dimension: width x height x length = 1165x552x1372. The fish-passing monitoring device comprises a fish-passing monitoring device, a first infrared grating, a second infrared grating, a first image acquisition box, a second image acquisition box, an underwater camera and an underwater light source, wherein the first infrared grating and the second infrared grating are respectively arranged at the inlet end and the outlet end of the fish-passing monitoring device, and the first image acquisition box and the second image acquisition box are respectively arranged on a top plate and a side plate of the fish-passing monitoring device. The dynamic blocking net device is additionally arranged in the middle of the fish-passing monitoring device, the optimal position of fish shooting can be adjusted according to actual conditions, and the problem of inaccurate fish channel monitoring data when a water source is turbid and the swimming fish is overlapped in the prior art is effectively solved by combining the video with the infrared grating.
In the invention, compared with traditional methods, the computer-vision technology is intelligent and convenient and accurately reflects the real underwater conditions. The core algorithms for real-time target detection and multi-target tracking, together with a custom statistical algorithm, realize key indicators such as fish species, count, size, speed and swimming direction, overcoming the industry's difficulties in video monitoring of fish in different water environments.
According to the intelligent fishway monitoring system, technical means such as the Internet of Things, big data and artificial intelligence are brought fully into play: real-time video and hydrological data (turbidity, flow velocity, temperature, water level, etc.) are monitored, and the fish counting and swimming algorithm model and the intelligent infrared-grating fish detection model are realized through innovative vision technology, providing better service for the ecological protection and maintenance of the river basin based on the living conditions of fish and innovative applied analysis of real-time hydrological data.
In the invention, video data and water-environment data are collected underwater; because the underwater environment is affected by the natural environment and seasonal variation, acquisition is relatively difficult and the environment changeable. The currently mainstream wireless LoRa technology offers advantages such as self-networking, strong penetration, long range and low power consumption. Collecting the video data and water-environment data with LoRa technology ensures the stability and reliability of real-time data monitoring.
Drawings
FIG. 1 is a schematic diagram of a framework of the intelligent fishway monitoring system of the invention;
FIG. 2 is a schematic diagram of the construction of the fish passing imaging and recognition model provided by the invention;
FIG. 3 is a schematic diagram of fish passing imaging provided by the invention;
fig. 4 is a schematic diagram of a deep learning process of fish identification provided by the invention;
fig. 5 is a schematic diagram of a statistical shape model of fish according to the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
As shown in fig. 1, the present invention provides one embodiment: the intelligent fishway monitoring system matches the data acquired by the infrared grating with the data acquired by the camera system, so that the profile of each fish can be analysed; from the fish sizes and particular periods of the day, the migration pattern of the fish can be examined and upstream movement distinguished from migration; the spawning-environment requirements of the fish can be analysed against changes in water temperature; and the counting precision for the fish can be verified from the acquired fish profiles.
The fishway comprehensive monitoring module mainly has the following functions:
the latest dynamic reminding of the fishway monitoring information. Real-time reminding of fish monitoring information in the fishway when fish passes;
displaying upstream and downstream fish total number statistical data monitored in the last 24 hours;
and (5) fish passing quantity statistics. And counting the historical fish number, time, date and other information. Statistical information of the fish passing number of a certain day or a month or a year can be checked specifically and displayed in a trend chart mode;
a fishway flow velocity trend graph. Comparing the upstream and downstream flow velocity trend graphs of the fishway for about 24 hours;
and monitoring water turbidity information in the fishway. The water turbidity monitoring data display can be used for self-defining the statistical time;
and water level change information in the fishway. The water level change trend graphs at the upstream and downstream in the fishway can be used for self-defining and counting time;
and the change information of the water temperature in the fishway. And a temperature change trend chart of upstream water and downstream water in the fishway.
The ecological environment monitoring module mainly has the following functions:
displaying the working state of the environment monitoring equipment and displaying abnormal equipment reminding;
displaying real-time water environment monitoring information in the fishway;
trend analysis chart display of water environment monitoring;
has report and statistics functions.
In this embodiment, capture of fishway fish-passing events is implemented with infrared gratings. Each set of measurement gratings is divided into an emitter and a receiver: the emitter continuously emits light at a fixed frequency and period, and the receiver synchronously checks whether the light is received, thereby forming a cross-section scanner. With no fish present, the measurement-grating receiver receives the emitter's signal completely; when a fish passes, part of the emitted signal is blocked.
Therefore, as shown in figs. 2 and 3, two sets of measurement gratings are arranged vertically, at equal height and in parallel at the fishway entrance. From the moment a signal is blocked, the fish-passing start signal is triggered and the gratings scan at 100 Hz. The blocked/clear state of every point of the two grating sets forms one frame of data per scan and is reported; reporting of grating data stops once the fish has passed and all grating signals have recovered. Each scan yields the blocked/clear state of 50 grating points from top to bottom, the blocked portion being the projection of the fish's current cross-section; joining and combining each frame of grating data until the fish has completely passed yields a complete image of the fish.
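The frame-by-frame assembly described above can be sketched as follows; the frame format and all names are assumptions for illustration (1 = beam blocked, 0 = clear), not the patent's actual data protocol.

```python
import numpy as np

N_BEAMS = 50  # grating points, top to bottom

def assemble_silhouette(frames: list[list[int]]) -> np.ndarray:
    """Each frame holds the occlusion state of the 50 beams for one
    scan (1 = blocked, 0 = clear). Stacking scans side by side yields
    a 2-D binary silhouette: rows = beam height, columns = scan time,
    which is proportional to position along the fish's body."""
    img = np.array(frames, dtype=np.uint8).T   # shape (N_BEAMS, n_scans)
    assert img.shape[0] == N_BEAMS, "each frame must have 50 beam states"
    return img
```

The resulting array can then be passed on as the "complete image of the fish" for the downstream recognition steps.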
Record the time at which the fish reaches the first set of infrared gratings as T1 and the time at which the first set of infrared grating signals completely recovers as Ts. The fish length is calculated as:
L = V * (Ts - T1)
where T1 is the time the fish reaches the first grating set, Ts is the time at which the first set's signals completely recover, and V is the swimming speed of the passing fish. The length of the blocked portion at each moment is the width of the fish scanned at that moment; integrating this width over the pass gives the side surface area of the fish.
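Numerically, the length formula and the width integral above reduce to a few lines; the units (mm, seconds) and names below are illustrative assumptions, and V is taken as already estimated from the two grating sets.

```python
SCAN_HZ = 100.0  # grating scan frequency from the description above

def fish_length_mm(v_mm_s: float, t1_s: float, ts_s: float) -> float:
    """L = V * (Ts - T1): body length from the pass-through duration."""
    return v_mm_s * (ts_s - t1_s)

def side_area_mm2(widths_mm: list[float], v_mm_s: float) -> float:
    """Each scan covers v*dt of body length, so summing the blocked
    width per scan approximates the integral giving the side area."""
    dt = 1.0 / SCAN_HZ
    return sum(w * v_mm_s * dt for w in widths_mm)
```

For example, a fish swimming at 500 mm/s that blocks the first grating set for 0.6 s would be estimated at 300 mm long.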
In this embodiment, the underwater connector uniformly receives the collected data from each sensor, including the video data collected by the underwater camera equipment and the collected data of the infrared grating, and then transmits the data to a server on land; the server receives and caches all underwater data and forwards them to the big data center according to agreed protocols and rules; the big data center is composed of several subsystems and is responsible for converting, storing, processing and analysing the different types of data. For video data, processing is carried out picture by picture: after the observed video data are received, the video is first decomposed into frames; each frame then undergoes the necessary preprocessing to meet the recognition procedure's picture-quality requirements, after which the picture is recognized using a deep-learning-based method.
As shown in fig. 4, when training its own data set, this embodiment first needs to create an image list file: the fish-passing video is extracted frame by frame to obtain the image set and generate the image list; each image is then annotated with the category name and category number of the fish, and the category name, the test data of each category and the number of training data corresponding to each category are also recorded.
Defining a neural network, and training by using a Faster R-CNN, wherein the training comprises the following specific steps:
inputting a test image;
inputting the whole picture into CNN, and extracting the characteristics;
generating anchor boxes with the RPN (region proposal network), cropping and filtering them, then applying softmax;
classifying the anchor boxes (foreground/background) and screening out the ROIs;
mapping the proposal windows to the feature map of the last layer of the CNN;
generating a fixed-size feature map from each ROI through the ROI pooling layer;
training the classification probabilities and bounding-box regression with softmax Loss and Smooth L1 Loss.
After training, the weight file of the data set is obtained, target detection is performed with YoloV5 (a target-detection algorithm) to identify the species of the swimming fish and obtain its pixel size, and the real size of the fish is then calculated from the proportional relation between the actual camera's field-of-view size and the pixel size.
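The proportional relation mentioned above can be sketched as a one-line conversion; it assumes the camera's field of view spans a known width in millimetres at the plane where the fish swims (a fixed-distance assumption made for this sketch, not stated by the patent), and all names are illustrative.

```python
def real_size_mm(length_px: float, fov_mm: float, img_px: int) -> float:
    """Real size = pixel size * (field-of-view size / image size):
    a detection of length_px pixels, in an image img_px pixels wide
    whose field of view spans fov_mm millimetres, maps to millimetres."""
    return length_px * (fov_mm / img_px)
```

For example, a 320-pixel detection in a 1920-pixel-wide image covering a 600 mm field of view corresponds to 100 mm.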
Finally, the speed, direction and quantity of passing fish are counted with DeepSORT (a multi-target tracking algorithm). The specific calculation steps are as follows:
obtaining the tracking-box ID and box position information from DeepSORT;
setting a counting line BC and calculating the centre position A' of the current tracking box; the motion vector is defined as the vector AA' from A, the centre position of the tracking box with the same ID in the previous frame, to A';
if AA' crosses the counting line BC, the fish is judged to have swum past the counting line, and the direction of AA' is the swimming direction of the fish;
updating the number of frames since each tracking-box ID crossed the line; if a tracking-box ID crosses the counting line again after the frame-number threshold has been exceeded, the fish is considered to have passed again and is counted; otherwise it is considered to be loitering and is not counted repeatedly.
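The counting-line test above is plain 2-D geometry: the motion vector AA' crosses segment BC exactly when the endpoints lie on opposite sides of each other's supporting line, and the sign of the cross product gives the crossing direction. A minimal sketch (names illustrative):

```python
def cross(o, p, q):
    """2-D cross product of vectors o->p and o->q."""
    return (p[0] - o[0]) * (q[1] - o[1]) - (p[1] - o[1]) * (q[0] - o[0])

def crossed_line(a, a2, b, c):
    """Return +1 or -1 if the motion vector AA' (a -> a2) crosses the
    counting line BC, with the sign encoding the crossing direction
    (e.g. upstream vs downstream); return 0 if it does not cross."""
    d1, d2 = cross(b, c, a), cross(b, c, a2)   # sides of BC
    d3, d4 = cross(a, a2, b), cross(a, a2, c)  # sides of AA'
    if (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0):
        return 1 if d2 > 0 else -1
    return 0
```

Per-ID frame counters, as described above, would then suppress repeated counts from loitering fish.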
As shown in fig. 5, in this embodiment a statistical shape model (SSM) can distinguish real fish from other objects with similar colouring. In this model the shape of an object is described with landmark points, against which the fitness of the examples generated by the model is compared. The fish-identification results obtained from the ANN classifier fall into two classes: fish and non-fish objects. The statistical shape model can also be used to identify fish among the non-fish objects output by the ANN classifier. After multiple iterations the model can fit the shape approximately as shown, where the blue points are the mean fish shape and the red points the variability limits.
To obtain reliable results, this embodiment specifies that an object is identified as the corresponding fish if it overlaps 70% of the fitting result. The key technique of this algorithm is the synthesis of the fish shape, which can be described by a series of points in two or three dimensions in sequential order; these points are called feature points or landmarks. The aim is to find the feature points that best characterize the object in order to create a point distribution model (PDM) of the target shape, so feature points are typically selected manually by experts in the particular field. As one can imagine, such marking work is rather cumbersome and tedious, so automatic or semi-automatic landmark marking is itself a research hot spot. If a shape is described by n feature points in d-dimensional space, it can be represented as a vector of nd elements formed by concatenating the coordinates of the n feature points in order. For example, given a two-dimensional image and n feature-point coordinates (xi, yi), the shape can be represented by the 2n-element vector:
x = (x1, x2, ..., xn, y1, y2, ..., yn)^T
Assuming the training set contains N images of swimming fish, each labelled with n = 68 points, N vectors of 2n = 136 elements of the above form are obtained. Before statistically analysing these shape vectors, it is essential that all shapes lie in the same coordinate system, which depends largely on the similarity-transformation-based shape alignment process.
The similarity transformation is a basic geometric transformation comprising operations such as translation, scaling and rotation. Shape is invariant under similarity transformations: however the feature points are translated, scaled or rotated, the shape itself remains unchanged. Shape alignment therefore uses similarity transformations to filter out inconsistencies between images caused by factors such as position, size and angle.
Procrustes Analysis is a typical method of aligning two shapes x1 and x2. The Procrustes distance between x1 and x2 is minimized by finding the optimal similarity transformation T, comprising the scaling factor s, the rotation angle θ and the translation (tx, ty). The Procrustes distance measures two shapes composed of the same number of feature points as the sum of squared coordinate distances between corresponding feature points, a metric satisfying the minimum-variance property, calculated as:
P_d^2 = Σ_{i=1}^{n} [ (x_{1i} - x_{2i})^2 + (y_{1i} - y_{2i})^2 ]
where x1 = (x11, ..., x1n, y11, ..., y1n) and x2 = (x21, ..., x2n, y21, ..., y2n).
The method for aligning multiple shapes in the training set is based on the Procrustes Analysis alignment of two shapes and is called Generalized Procrustes Analysis (GPA). Each shape is aligned in turn with a reference shape (generally the mean shape); after each round of alignment the reference shape is recomputed, and pairwise alignment is repeated in the next round until there is no significant change between the previous mean shape and the current one, i.e. an iterative process run until convergence.
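The two-shape alignment that GPA iterates can be sketched with the classical orthogonal-Procrustes solution; this is a minimal numpy version operating on n × 2 landmark arrays, with all names illustrative.

```python
import numpy as np

def procrustes_align(x1: np.ndarray, x2: np.ndarray) -> np.ndarray:
    """Align shape x2 (n x 2 landmarks) to x1 by the similarity
    transform (translation, scale, rotation) that minimises the
    Procrustes distance between them."""
    mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
    a, b = x1 - mu1, x2 - mu2                 # remove translation
    u, s, vt = np.linalg.svd(b.T @ a)         # orthogonal Procrustes problem
    r = u @ vt                                # optimal rotation
    scale = s.sum() / (b ** 2).sum()          # optimal scale factor
    return scale * (b @ r) + mu1
```

GPA would call this in a loop against a running mean shape until the mean stops changing.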
After the N shapes in the training set have all been aligned to a common coordinate system with the Generalized Procrustes Analysis (GPA) method, they can be regarded as N points forming a distribution in nd-dimensional space. The aligned shapes, i.e. this point distribution in nd-dimensional space, are modelled with principal component analysis:
x ≈ x̄ + Φs·bs
where x̄ is the mean shape and Φs = (φ1, φ2, ..., φts) is the set of the first ts eigenvectors of the shape covariance matrix.
In general, in this embodiment, it can be considered that there is a subspace of the nd-dimensional space in which any point corresponds to a reasonable shape of the training-set target object, while the shape represented by any point outside that subspace is unrelated to the object; such a subspace is called the Allowable Shape Domain. On this basis, a new shape similar to the training set can be synthesized by a weighted linear combination of the average shape and the eigenvectors, where bs is the shape-parameter vector of weighting coefficients:
x = x̄ + Φs·bs
where the vector bs defines the parameter set of the deformable model; changing the value of an element bs[i], i = 1, 2, ..., ts, changes the final synthesized shape.
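The point-distribution model and the synthesis formula x = x̄ + Φs·bs above reduce to a short PCA sketch; the data layout (one aligned shape vector per row) and names are assumptions for illustration.

```python
import numpy as np

def build_pdm(shapes: np.ndarray, t_s: int):
    """Build the point-distribution model from aligned shape vectors
    (rows of `shapes`, each (x1..xn, y1..yn)): the mean shape plus the
    first t_s eigenvectors of the shape covariance matrix."""
    mean = shapes.mean(axis=0)
    cov = np.cov(shapes - mean, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]           # largest eigenvalues first
    return mean, vecs[:, order[:t_s]]        # x̄ and Φs

def synthesize(mean: np.ndarray, phi: np.ndarray, b_s: np.ndarray) -> np.ndarray:
    """x = x̄ + Φs · bs: a new shape from the shape parameters bs."""
    return mean + phi @ b_s
```

With bs = 0 the model reproduces the mean shape; varying one element bs[i] sweeps along one mode of shape variation.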
The infrared-grating-based intelligent fish detection method for species identification solves the poor accuracy of traditional fish identification based on Hu invariant moments. The method first coarsely screens the species that may be present in the fish-passing image based on the infrared grating and Hu invariant moments, and then determines the species by additionally taking the fish's swimming speed into account, addressing the inaccurate species determination caused by non-standard posture when a fish enters the infrared grating; compared with traditional identification based on Hu invariant moments, the accuracy is greatly improved.
In this embodiment, the fish-passing monitoring device was designed after simulated real-environment experiments and careful analysis and summarization. Main box: aperture width 300 mm, height 400 mm, length 600 mm; maximum overall dimensions (width × height × length): 1165 × 552 × 1372 mm. The device comprises a first infrared grating and a second infrared grating arranged at the inlet and outlet ends respectively, a first image acquisition box and a second image acquisition box arranged on the top plate and a side plate respectively, an underwater camera and an underwater light source. A dynamic blocking-net device is additionally arranged in the middle of the fish-passing monitoring device so that the optimal position for photographing fish can be adjusted to actual conditions; combining video with the infrared grating effectively solves the prior-art problem of inaccurate fishway monitoring data when the water source is turbid or swimming fish overlap.
In the embodiment, compared with the traditional method, the computer vision technology is intelligent and convenient, and accurately reflects the real underwater condition. The core algorithm is used for real-time target detection, multi-target tracking and the custom statistical algorithm is used for realizing key indexes such as fish species, count, size, speed, swimming direction and the like, so that the confusion of the industry in fish video monitoring under different water environments is overcome.
In the embodiment, the technical means such as the Internet of things, big data, artificial intelligence and the like can be fully exerted, real-time video and hydrologic data (turbidity, flow velocity, temperature, water level and the like) are monitored, the fish number and swimming algorithm model and the intelligent fish detection model of the infrared grating are realized through innovative visual technology, and better service is provided for ecological protection and ecological maintenance of the river basin based on living conditions of fish and innovative application analysis of the real-time hydrologic data.
In this embodiment, the video data and water environment data are collected underwater, where acquisition is relatively difficult owing to the natural environment and seasonal variation, and the environment itself is changeable. The currently mainstream wireless LoRa technology offers self-networking, strong penetration, long range and low power consumption. This embodiment therefore adopts LoRa to collect the video data and water environment data, guaranteeing the stability and reliability of real-time data monitoring.
The above is only a preferred embodiment of the present invention and is not intended to limit it; those skilled in the art may make various modifications and variations. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within its protection scope.

Claims (10)

1. An intelligent fishway monitoring system, characterized by comprising:
a fishway comprehensive monitoring module, which comprises a swimming-fish real-time monitoring sub-module, a fish-passing quantity counting sub-module, a fishway flow-velocity trend analysis sub-module, a fishway water-turbidity statistics sub-module and a fishway water-level and water-temperature change-trend statistics sub-module;
wherein the fish-passing quantity counting sub-module is implemented by an infrared grating and a camera: a set number of infrared gratings are arranged vertically, at equal height and in parallel at the entrance of the fishway; when a grating signal is blocked, a fish-passing start signal is judged and scanning is conducted at a set frequency; the blocking-state data of the set number of grating points, read from top to bottom at each scan, are collected into one frame of grating data and reported; reporting stops after the fish has passed, and the frames of grating data are concatenated and combined into an image to obtain a complete image of the passing fish.
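The frame-by-frame assembly described in claim 1 can be sketched as follows. This is a hypothetical minimal sketch: the function name, the 0/1 frame encoding and the beam count are assumptions, not part of the patent.

```python
import numpy as np

def assemble_grating_image(frames):
    """Combine successive infrared-grating scans into a 2-D fish silhouette.

    frames: list of equal-length 0/1 sequences, one per scan, giving the
    blocked (1) / clear (0) state of each beam from top to bottom.
    Returns an array of shape (n_beams, n_scans): each column is one
    reported frame, so the passing fish appears as a left-to-right
    silhouette as the scans are concatenated in time order.
    """
    return np.array(frames, dtype=np.uint8).T
```

Each reported frame becomes one column; the complete image of the passing fish is simply the time-ordered stack of columns.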
2. The intelligent fishway monitoring system of claim 1, wherein the reported grating data are used not only for concatenated combined imaging but also for measuring the swimming speed and migration direction of the fish.
3. The intelligent fishway monitoring system of claim 1, wherein the time at which the fish reaches the first set of infrared gratings is denoted T1 and the time at which the first set of infrared grating signals fully recovers is denoted Ts; the fish length is then calculated as:
L = V*(Ts - T1)
wherein T1 is the time the fish reaches the first set of infrared gratings, Ts is the time the first set of infrared grating signals fully recovers, and V is the swimming speed of the passing fish.
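The fish-length formula of claim 3 translates directly into code. A minimal sketch, assuming consistent units (e.g. speed in m/s and times in seconds); the function name and the sanity check are assumptions.

```python
def fish_length(v, t1, ts):
    """Fish length L = V * (Ts - T1).

    v:  swimming speed of the passing fish (e.g. m/s)
    t1: time the fish first blocks the first set of gratings
    ts: time the first set of grating signals fully recovers
    """
    if ts < t1:
        raise ValueError("signal recovery cannot precede arrival")
    return v * (ts - t1)
```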
4. The intelligent fishway monitoring system of claim 3, wherein the fish-passing quantity counting sub-module further comprises a fish identification method comprising the steps of:
inputting the complete image of the passing fish into a trained CNN model; performing target detection on the image through the YoloV5 algorithm preset in the trained CNN model; identifying the fish species; obtaining the pixel size of the fish; and calculating the real size of the fish from the proportional relation between the camera's field-of-view size and the pixel size.
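The pixel-to-real-size conversion at the end of claim 4 can be sketched as below. The parameter names and the use of frame width as the reference dimension are assumptions; the patent only states the proportional relation between field-of-view size and pixel size.

```python
def real_size_mm(pixel_length, frame_width_px, view_width_mm):
    """Scale a detected fish's length in pixels to millimetres.

    Uses the proportional relation between the camera's field-of-view
    width (view_width_mm) and the frame width in pixels: each pixel
    covers view_width_mm / frame_width_px millimetres at the fish's plane.
    """
    return pixel_length * view_width_mm / frame_width_px
```

For example, a fish spanning 480 px in a 1920 px frame whose field of view covers 600 mm measures 150 mm.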
5. The intelligent fishway monitoring system of claim 4, wherein the CNN model is trained as follows:
inputting historical fish-passing images and extracting features from them through the constructed CNN model to obtain historical fish-passing image features;
processing the historical fish-passing image features through a region proposal network to generate candidate frames for the features, and, after clipping and filtering, normalizing the frames through softmax to obtain the classification probability of the features;
performing binary classification on the normalized frames and screening to obtain ROIs; generating feature maps of a set size from the ROIs through an ROI pooling layer; mapping the proposal windows onto the feature map of the last layer of the CNN model; and jointly training the classification probability and the frame regression through softmax Loss and Smooth L1 Loss to obtain the trained CNN model.
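The joint softmax Loss + Smooth L1 Loss objective named in claim 5 can be sketched in NumPy. The loss weight `w` and the sum reduction are assumptions; in practice a framework such as PyTorch would supply these losses.

```python
import numpy as np

def softmax_cross_entropy(logits, label):
    # classification branch: softmax followed by negative log-likelihood
    z = logits - logits.max()               # subtract max for stability
    p = np.exp(z) / np.exp(z).sum()
    return -np.log(p[label])

def smooth_l1(pred, target):
    # box-regression branch: quadratic near zero, linear in the tails
    d = np.abs(pred - target)
    return np.where(d < 1.0, 0.5 * d ** 2, d - 0.5).sum()

def joint_detection_loss(cls_logits, cls_label, box_pred, box_target, w=1.0):
    # joint training objective: softmax Loss + w * Smooth L1 Loss
    return softmax_cross_entropy(cls_logits, cls_label) + w * smooth_l1(box_pred, box_target)
```

When the predicted box equals the target, only the classification term remains, so the loss reduces to the cross-entropy of the class logits.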
6. The intelligent fishway monitoring system of claim 5, wherein, during feature extraction from the historical fish-passing images, the colour of the image is also used as a classification feature, the colour being represented by hue and brightness.
7. The intelligent fishway monitoring system of claim 4, wherein the trained CNN model is further provided with a multi-target tracking algorithm to obtain speed, number and direction data of the fish, specifically:
acquiring the tracking-frame ID and frame position information;
setting a counting line BC; computing the centre position A' of the current tracking frame and, taking A as the centre position of the tracking frame with the same ID in the previous frame, defining the motion vector as the vector AA'; if the vector AA' crosses the counting line BC, the fish is judged to have swum past the counting line BC, and the direction of AA' gives the swimming direction of the fish;
and updating, for each tracking-frame ID, the number of frames since it last crossed the line: if a tracking-frame ID crosses the counting line again after the frame-count threshold has been exceeded, the fish is considered to have passed again and is counted; otherwise the fish is considered to be loitering and is not counted repeatedly.
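The crossing test and re-count threshold of claim 7 can be sketched with a cross-product segment-intersection check. The class and function names, the point format and the default threshold are assumptions for illustration.

```python
def _cross(o, a, b):
    # z-component of (a - o) x (b - o)
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def crosses_line(a, a2, b, c):
    """True if the motion vector A->A' properly intersects segment BC."""
    d1 = _cross(b, c, a)
    d2 = _cross(b, c, a2)
    d3 = _cross(a, a2, b)
    d4 = _cross(a, a2, c)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

class LineCounter:
    """Count fish passing the counting line BC, suppressing loitering."""

    def __init__(self, b, c, frame_gap=30):
        self.b, self.c = b, c
        self.frame_gap = frame_gap      # frame-count threshold per track ID
        self.last_cross = {}            # track ID -> frame of last counted pass
        self.count = 0

    def update(self, track_id, prev_center, center, frame_idx):
        if crosses_line(prev_center, center, self.b, self.c):
            last = self.last_cross.get(track_id)
            if last is None or frame_idx - last > self.frame_gap:
                self.count += 1                       # genuine new pass
                self.last_cross[track_id] = frame_idx
            # else: same ID re-crossing within the threshold -> loitering
```

A track that darts back and forth across BC within the threshold is counted once; a return well after the threshold counts again.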
8. The intelligent fishway monitoring system of claim 4, wherein an error back-propagation ANN classifier is further preset in the fish-passing quantity counting sub-module to judge, from the pixels of the complete fish-passing image, whether each pixel belongs to a fish: the fish label is set to 1 and the labels of other organisms or objects to -1, and the output of the error back-propagation ANN classifier indicates whether a pixel belongs to a fish.
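A minimal stand-in for the ±1-labelled classifier of claim 8 is sketched below. The patent does not specify the network layout, so this uses a single-layer perceptron rather than a multi-layer back-propagation net; the class name, learning rate and feature format are all assumptions.

```python
import numpy as np

class SignClassifier:
    """Minimal classifier mapping a pixel feature vector to +1 (fish)
    or -1 (other organism or object), per the labels in claim 8."""

    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        # sign of the linear score: 1 = fish, -1 = other
        return 1 if float(np.dot(self.w, x)) + self.b >= 0.0 else -1

    def fit(self, X, y, epochs=50):
        # perceptron rule: update weights only on misclassified samples
        for _ in range(epochs):
            for xi, yi in zip(np.asarray(X, dtype=float), y):
                if self.predict(xi) != yi:
                    self.w += self.lr * yi * xi
                    self.b += self.lr * yi
```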
9. The intelligent fishway monitoring system of claim 1, further comprising:
the ecological environment monitoring module comprises an ecological environment real-time monitoring sub-module, an environment monitoring equipment management sub-module and an upstream and downstream data trend statistics sub-module;
the intelligent monitoring center comprises a fishway underwater real-time monitoring submodule and a historical swimming fish information query submodule;
the system configuration center comprises a user registration sub-module, a user authority management sub-module, a password modification sub-module, a data backup sub-module, an organization management sub-module and a system parameter configuration sub-module.
10. The intelligent fishway monitoring system of claim 1, wherein the data acquired by the camera and the infrared gratings are received by an underwater connector and reported to the server.
CN202210568008.2A 2022-05-24 2022-05-24 Intelligent fishway monitoring system Pending CN116030373A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210568008.2A CN116030373A (en) 2022-05-24 2022-05-24 Intelligent fishway monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210568008.2A CN116030373A (en) 2022-05-24 2022-05-24 Intelligent fishway monitoring system

Publications (1)

Publication Number Publication Date
CN116030373A true CN116030373A (en) 2023-04-28

Family

ID=86080136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210568008.2A Pending CN116030373A (en) 2022-05-24 2022-05-24 Intelligent fishway monitoring system

Country Status (1)

Country Link
CN (1) CN116030373A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116452967A (en) * 2023-06-16 2023-07-18 青岛励图高科信息技术有限公司 Fish swimming speed identification method based on machine vision
CN116452967B (en) * 2023-06-16 2023-08-22 青岛励图高科信息技术有限公司 Fish swimming speed identification method based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination