CN114463675A - Underwater fish group activity intensity identification method and device - Google Patents


Info

Publication number
CN114463675A
Authority
CN
China
Prior art keywords
fish
behavior
images
target fish
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210028850.7A
Other languages
Chinese (zh)
Other versions
CN114463675B (en)
Inventor
周超
赵振锡
杨信廷
刘锦涛
冯双星
孙传恒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Center of Information Technology of Beijing Academy of Agriculture and Forestry Sciences
Original Assignee
Research Center of Information Technology of Beijing Academy of Agriculture and Forestry Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research Center of Information Technology of Beijing Academy of Agriculture and Forestry Sciences filed Critical Research Center of Information Technology of Beijing Academy of Agriculture and Forestry Sciences
Priority to CN202210028850.7A priority Critical patent/CN114463675B/en
Publication of CN114463675A publication Critical patent/CN114463675A/en
Application granted granted Critical
Publication of CN114463675B publication Critical patent/CN114463675B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A40/81Aquaculture, e.g. of fish

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a device for identifying the activity intensity of an underwater fish school, comprising the following steps: acquiring a plurality of underwater images of a target fish school; inputting the target fish school images into a behavior recognition model based on the time information they carry, and determining the individual behavior information and group behavior intensity information of the target fish school output by the behavior recognition model, where the behavior recognition model is constructed with a feature vector distribution correction module. Because the model is built around this correction module, a small number of expanded data sample features together with the base sample data set features can make each dimension of the activity feature vector obey a Gaussian distribution, effectively alleviating the mismatch between the distribution estimated from few samples and the true distribution, and thereby improving the recognition accuracy of fish school behaviors.

Description

Underwater fish group activity intensity identification method and device
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method and a device for identifying activity intensity of an underwater fish group.
Background
Analyzing the group behavior of a fish school not only helps evaluate its feeding activity and survival safety but also reflects its daily energy consumption. Real-time automatic recognition of fish school activity is therefore of great significance for understanding the living habits and growth condition of fish and for formulating effective management strategies.
At present, deep-learning-based human group activity recognition networks are widely applied to analyzing fish behaviors; the angles of analysis include texture, speed or direction, spatial position, and group interaction.
Because occlusion among individuals in an underwater fish school is severe, fish postures are varied and change rapidly, and the actions of individual fish are not salient, these methods cannot effectively handle the recognition of spontaneous and irregular fish school activity.
Disclosure of Invention
To address the problems in the prior art, embodiments of the invention provide a method and a device for identifying the activity intensity of an underwater fish school.
The invention provides a method for identifying activity intensity of an underwater fish group, which comprises the following steps: acquiring a plurality of target fish school images of a target fish school under water;
inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model;
the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label;
the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image;
the behavior recognition model is constructed based on the feature vector distribution correction module.
According to the underwater fish group activity intensity identification method provided by the invention, the step of acquiring a plurality of target fish group images of a target fish group underwater comprises the following steps:
acquiring video data of the target fish school;
determining a plurality of initial images of the target fish school according to the time information of each frame of image in the video data;
and performing size preprocessing on all the initial images to determine the target fish swarm images.
According to the underwater fish group activity intensity recognition method provided by the invention, before the inputting the target fish group images into the behavior recognition model, the method further comprises the following steps:
extracting multi-frame fish shoal images from an underwater fish shoal video;
preprocessing the multi-frame fish school images to obtain a plurality of sample fish school images;
determining a group activity intensity information label of each sample fish swarm image and an individual behavior information label of each fish in each sample fish swarm image;
taking each sample fish school image and the combination of the group activity intensity information label and the individual behavior information label corresponding to each sample fish school image as a training sample to obtain a plurality of training samples;
and training an initial recognition model by using the training samples based on the time information in each training sample to obtain the behavior recognition model.
According to the underwater fish group activity intensity recognition method provided by the invention, before the training of the initial recognition model by using the plurality of training samples, the method further comprises the following steps:
constructing an attention mechanism residual error network based on a backbone network, an individual action feature extraction network and a regional feature aggregation module;
constructing the initial identification model based on the attention mechanism residual error network, the graph convolution network and the feature vector distribution correction module;
the loss function of the initial recognition model is determined based on individual fish behavioral characteristics and fish group behavioral characteristics.
According to the underwater fish group activity intensity recognition method provided by the invention, the step of inputting the target fish group images into a behavior recognition model based on the time information in the target fish group images and determining the individual behavior information and the group behavior intensity information of the target fish group output by the behavior recognition model comprises the following steps:
based on the time information in the target fish swarm images, performing feature extraction and feature fusion on the target fish swarm images by using the attention mechanism residual error network, and determining individual action relation matrixes of the target fish swarm images;
performing action relation reasoning on the individual action characteristic vector of each target fish school image by using the graph convolution network, and determining the activity characteristic vector of each target fish school image;
and performing Gaussian distribution correction on each activity characteristic vector by using the characteristic vector distribution correction module, and determining the individual behavior information and the group behavior intensity information of the target fish school.
According to the underwater fish group activity intensity identification method provided by the invention, the step of performing feature extraction and feature fusion on the target fish group images by using the attention mechanism residual error network based on the time information in the target fish group images to determine the individual action relationship matrix of the target fish group images comprises the following steps:
based on the time information in the target fish swarm images, performing feature extraction and fusion on the target fish swarm images by using the backbone network to obtain a plurality of feature fusion images;
utilizing the individual motion feature extraction network to extract individual motion features of the feature fusion images, and determining individual motion feature vectors;
and performing action characteristic extraction on the individual action characteristic vector by utilizing the region characteristic aggregation module, and determining individual action relation matrixes of the plurality of target fish school images.
The invention also provides a device for identifying the activity intensity of the underwater fish group, which comprises: the acquisition module is used for acquiring a plurality of target fish school images of a target fish school under water;
the determining module is used for inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model;
the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label;
the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image;
the behavior recognition model is constructed based on the feature vector distribution correction module.
The invention also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to realize the steps of the underwater fish group activity intensity identification method.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method for identifying an activity intensity of an underwater fish herd as described in any one of the above.
The invention also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of the method for identifying activity intensity of an underwater fish group as described in any one of the above.
According to the method and device for identifying underwater fish school activity intensity provided by the invention, the behavior recognition model is constructed with a feature vector distribution correction module; using a small number of expanded data sample features and the base sample data set features, each dimension of the activity feature vector can be made to obey a Gaussian distribution, which effectively alleviates the mismatch between the distribution estimated from few samples and the true distribution and thereby improves the recognition accuracy of fish school behaviors.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic flow chart of a method for identifying activity intensity of an underwater fish group provided by the present invention;
FIG. 2 is a schematic structural diagram of a fish swarm activity intensity identification framework provided by the invention;
FIG. 3 is a second schematic flow chart of the method for identifying activity intensity of underwater fish group provided by the present invention;
FIG. 4 is a schematic structural diagram of an underwater fish group activity intensity recognition device provided by the present invention;
FIG. 5 is a second schematic structural diagram of an apparatus for identifying activity intensity of underwater fish group provided by the present invention;
fig. 6 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that in the description of the embodiments of the present invention, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element. The terms "upper", "lower", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience in describing the present invention and simplifying the description, but do not indicate or imply that the referred devices or elements must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. Unless expressly stated or limited otherwise, the terms "mounted," "connected," and "connected" are intended to be inclusive and mean, for example, that they may be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In a complex underwater environment, rapid and uncontrolled fish movement can blur individual action features and group activity features and cause data distributions to overlap. The invention therefore corrects the data distribution, which also alleviates the mismatch between the distribution estimated from only a few samples and the true distribution.
The method and apparatus for identifying activity intensity of underwater fish group provided by the embodiment of the invention are described below with reference to fig. 1 to 6.
Fig. 1 is a schematic flow chart of the underwater fish group activity intensity identification method provided by the present invention, as shown in fig. 1, including but not limited to the following steps:
first, in step S11, a plurality of target fish school images of a target fish school under water are acquired.
The target fish school images may be obtained by photographing the target fish school with an underwater camera at different times, or by recording a video of the target fish school and extracting multiple frames from the recording as target fish school images.
Specifically, images of the target fish school under study are captured at different time points, and each target fish school image carries its capture time information.
Further, in step S12, based on the time information in the target fish swarm images, the target fish swarm images are input to a behavior recognition model, and the individual behavior information and the group behavior intensity information of the target fish swarm output by the behavior recognition model are determined;
the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label;
the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image; the behavior recognition model is constructed based on the feature vector distribution correction module.
Because subgroups exist within a fish school and move along with certain fish in the group, conventional fish action recognition methods study group activity directly from surface-level action features and auxiliary information, ignoring the relational reasoning between individual fish actions and group activity; the GCN in the behavior recognition model is introduced to address this problem.
To address the problems of unclear individual action features and group activity features and overlapping data distributions in a complex underwater environment, a feature vector distribution correction module is added to the original graph convolutional network (GCN) in the behavior recognition model; this module corrects the data distribution of the activity feature vectors of the target fish school images so that each dimension of the activity feature vector obeys a Gaussian distribution.
Specifically, a plurality of target fish swarm images are input into a behavior recognition model, and feature extraction, fusion, action relation reasoning and vector data distribution correction are carried out on each target fish swarm image by the behavior recognition model, so that final individual behavior information and group behavior intensity information of the target fish swarm are obtained and output.
The individual behavior information is the action information of each fish in the target fish school and includes: acceleration, deceleration, swimming, resting, and feeding. The group behavior intensity information is the activity intensity of the whole target fish school and includes: none, strong, medium, and weak.
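To make the two label spaces concrete, the following sketch encodes the five individual-action classes and four group-intensity classes named above; the numeric ids are hypothetical, not specified in the patent.

```python
# Hypothetical id-to-label maps for the action and intensity classes in the
# text; the id ordering is illustrative only.
INDIVIDUAL_ACTIONS = {0: "accelerate", 1: "decelerate", 2: "swim", 3: "rest", 4: "feed"}
GROUP_INTENSITY = {0: "none", 1: "strong", 2: "medium", 3: "weak"}

def decode_prediction(action_id, intensity_id):
    """Map raw class ids back to human-readable labels."""
    return INDIVIDUAL_ACTIONS[action_id], GROUP_INTENSITY[intensity_id]
```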
According to the method for identifying underwater fish school activity intensity provided by the invention, the behavior recognition model is constructed with a feature vector distribution correction module; using a small number of expanded data sample features and the base sample data set features, each dimension of the activity feature vector can be made to obey a Gaussian distribution, effectively alleviating the mismatch between the distribution estimated from few samples and the true distribution and thereby improving the recognition accuracy of fish school behaviors.
Optionally, acquiring a plurality of underwater images of the target fish school, each carrying time information, comprises:
acquiring video data of the target fish school;
determining a plurality of initial images of the target fish school according to the time information of each frame of image in the video data;
and performing size preprocessing on all the initial images to determine the target fish swarm images.
The video data is obtained by shooting a target fish school.
Specifically, a plurality of frame images are extracted from the shot video data as initial images of the target fish school.
Each initial image is resized and enhanced; the resulting target fish school images are 1280 × 720 pixels in size.
Optionally, before the inputting the plurality of target fish swarm images into the behavior recognition model, the method further includes:
extracting multi-frame fish shoal images from an underwater fish shoal video;
preprocessing the multi-frame fish school images to obtain a plurality of sample fish school images;
determining a group activity intensity information label of each sample fish swarm image and an individual behavior information label of each fish in each sample fish swarm image;
taking each sample fish school image, and the combination of the group activity intensity information label corresponding to each sample fish school image and all individual behavior information labels as a training sample to obtain a plurality of training samples;
and training an initial recognition model by using the training samples based on the time information in each training sample to obtain the behavior recognition model.
S individual behavior information labels and K group activity intensity information labels are defined according to the individual action features and group activity features of the fish.
To facilitate shuffling, n groups of fish school activity sub-videos are divided from the m groups of underwater fish school videos that were captured. Since fish are not visible in every video frame, images containing fish are extracted from each sub-video for each segment of fish school activity; each image carries time information indicating when it was captured.
The extracted images are resized, limiting the long side H to 1280 and the short side W to 720, and flip-based image enhancement is applied to obtain a plurality of sample fish school images.
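A minimal sketch of this preprocessing step, assuming landscape/portrait inputs and row-major pixel grids (the patent only states the 1280/720 side limits and a flip enhancement):

```python
def target_size(w, h, long_side=1280, short_side=720):
    """Cap the long side at 1280 and the short side at 720, as in the text;
    orientation handling is an assumption."""
    return (long_side, short_side) if w >= h else (short_side, long_side)

def hflip(image):
    """Horizontal-flip enhancement on a row-major pixel grid."""
    return [row[::-1] for row in image]
```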
Each sample fish school image, combined with its corresponding group activity intensity information label and all individual behavior information labels, is taken as one training sample, yielding a plurality of training samples: each sample image corresponds to one group activity intensity label, and each fish in a sample image corresponds to one individual behavior label.
The training samples are divided into a training set and a test set at a ratio of 9:1.
The training samples in the training set and the test set are grouped so that the number of samples in each group does not exceed the number of GCNs in the initial recognition model; the samples in each group are ordered by capture time and then input into the initial recognition model, and the finally trained model is taken as the behavior recognition model.
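The shuffling, 9:1 split, and time-ordered grouping described above can be sketched as follows; `num_gcn_slots` and the seed are illustrative names, not from the patent.

```python
import random

def split_and_group(samples, num_gcn_slots, train_ratio=0.9, seed=0):
    """Shuffle (timestamp, image) samples, split 9:1 into train/test, chunk
    each split into groups no larger than the number of GCNs in the initial
    model, and order each group by capture time. A sketch of the procedure
    in the text, not the patent's exact pipeline."""
    rng = random.Random(seed)
    shuffled = list(samples)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)

    def grouped(split):
        chunks = [split[i:i + num_gcn_slots]
                  for i in range(0, len(split), num_gcn_slots)]
        return [sorted(chunk, key=lambda s: s[0]) for chunk in chunks]

    return grouped(shuffled[:cut]), grouped(shuffled[cut:])
```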
Optionally, before the training the initial recognition model by using the plurality of training samples, the method further includes:
constructing an attention mechanism residual error network based on a backbone network, an individual action feature extraction network and a regional feature aggregation module;
constructing the initial identification model based on the attention mechanism residual error network, the graph convolution network and the feature vector distribution correction module;
the loss function of the initial recognition model is determined based on individual fish behavioral characteristics and fish group behavioral characteristics.
The region feature aggregation module can be a RoI-Align module.
Optionally, the inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model includes:
based on the time information in the target fish swarm images, performing feature extraction and feature fusion on the target fish swarm images by using the attention mechanism residual error network, and determining individual action relation matrixes of the target fish swarm images;
performing action relation reasoning on the individual action characteristic vector of each target fish school image by using the graph convolution network, and determining the activity characteristic vector of each target fish school image;
and performing Gaussian distribution correction on each activity characteristic vector by using the characteristic vector distribution correction module, and determining the individual behavior information and the group behavior intensity information of the target fish school.
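The patent does not spell out the internals of the Gaussian distribution correction; the sketch below shows one standard few-shot distribution-calibration scheme (a power transform to make feature dimensions closer to Gaussian, then borrowing nearby base-class statistics). All names and constants here are assumptions for illustration.

```python
import math

def tukey(x, lam=0.5):
    """Tukey ladder-of-powers transform, commonly used to push feature
    dimensions toward a Gaussian shape (an illustrative stand-in; the
    patent does not specify its transform)."""
    return [v ** lam if lam != 0 else math.log(v) for v in x]

def calibrate_mean(support_feat, base_means, k=2):
    """Estimate a corrected class mean by averaging a few-shot feature with
    its k nearest base-class means (a generic calibration sketch, not the
    patent's exact module)."""
    nearest = sorted(
        base_means,
        key=lambda m: sum((a - b) ** 2 for a, b in zip(m, support_feat)),
    )[:k]
    dim = len(support_feat)
    return [(support_feat[i] + sum(m[i] for m in nearest)) / (k + 1)
            for i in range(dim)]
```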
Optionally, the determining the individual action relationship matrix of the target fish swarm images by performing feature extraction and feature fusion on the target fish swarm images by using the attention mechanism residual error network based on the time information in the target fish swarm images includes:
based on the time information in the target fish swarm images, performing feature extraction and fusion on the target fish swarm images by using the backbone network to obtain a plurality of feature fusion images;
utilizing the individual motion feature extraction network to extract individual motion features of the feature fusion images, and determining individual motion feature vectors;
and performing action characteristic extraction on the individual action characteristic vector by utilizing the region characteristic aggregation module, and determining individual action relation matrixes of the plurality of target fish school images.
To address the backbone network's insufficient capability to extract fish target feature information, the invention also improves the attention-based ResNeSt backbone into ResNeSt-tiny, making it better suited to extracting individual fish action features, providing more salient individual action features to the subsequent individual action feature extraction network, and reducing the number of network model parameters. The idea of action feature vector calibration is applied to the reasoning and learning of individual fish actions and group activity relations, further improving the underwater fish school recognition accuracy of existing algorithms.
The design of the loss function considers the large difference between the individual action labels and the group activity labels. Therefore, based on the individual fish behavior features and the fish school behavior features, a weight distribution coefficient λ is introduced into the activity recognition loss function and the individual action loss function. Moreover, in actual sampling of fish school data the individual action labels and the activity labels differ; to let every label be learned sufficiently, a cross-entropy loss function is used and an action weight coefficient and an activity weight coefficient are introduced.
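One plausible reading of this loss design, sketched below: λ balances the individual-action term against the group-activity term, and per-class weight coefficients compensate for label imbalance. The exact functional form and all values are assumptions; the patent gives none.

```python
import math

def weighted_ce(probs, target, weights):
    """Weighted cross-entropy for a single predicted distribution."""
    return -weights[target] * math.log(probs[target])

def total_loss(action_probs, action_label, activity_probs, activity_label,
               lam=0.5, action_w=(1.0,) * 5, activity_w=(1.0,) * 4):
    """Illustrative combined loss: lam * L_action + (1 - lam) * L_activity,
    with hypothetical per-class action and activity weight coefficients."""
    return (lam * weighted_ce(action_probs, action_label, action_w)
            + (1 - lam) * weighted_ce(activity_probs, activity_label, activity_w))
```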
For the constructed initial recognition network, setting network initial parameters and training the recognition network by using an Adam optimizer.
The attention-driven residual error network (ResNeSt-tiny) mainly comprises a backbone network and an individual action feature extraction network.
The backbone network, shared with ResNeSt, contains four local residual blocks: res1, res2, res3, and res4. In this backbone structure the feature output layer responsible for detecting small targets is pruned: instead of outputting features from all four residual layers, only the outputs of res2 and res3 are kept, i.e. W/8 × H/8 × 512 and W/16 × H/16 × 1024, forming feature channels at the two scales 90 × 160 × 512 and 45 × 80 × 1024; the feature maps at these two scales are output as the input to the individual action feature extraction network.
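The two retained scales follow directly from the 1280 × 720 input and strides 8 and 16; a small check, assuming the channel counts quoted in the text:

```python
def backbone_output_shapes(w=1280, h=720):
    """(H, W, C) of the two retained outputs: res2 at stride 8 with 512
    channels and res3 at stride 16 with 1024 channels, matching the
    90 x 160 x 512 and 45 x 80 x 1024 scales quoted in the text."""
    return (h // 8, w // 8, 512), (h // 16, w // 16, 1024)
```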
The res local residual network has four structures, res1, res2, res3 and res4, composed of 3, 4, 23 and 3 bottlenecks in series, respectively. Each bottleneck consists of four convolution layers with 1 × 1 convolution kernels and four Batch Normalization layers; a convolution layer with a 3 × 3 convolution kernel and stride 1; a ReLU function; and an rSoftMax activation layer.
Taking the motion feature information of a target individual as input, the input feature map first passes through a convolution layer with a 1 × 1 kernel and a Batch Normalization layer; then through a convolution layer with a 3 × 3 kernel and stride 1 and a Batch Normalization layer, output by a ReLU activation layer; then again through two convolution layers with 1 × 1 kernels and stride 1, a Batch Normalization layer and an rSoftMax activation layer. Finally, the feature map output of the bottleneck is completed once more through a convolution layer with a 1 × 1 kernel, a Batch Normalization layer and a ReLU activation layer.
The individual action feature extraction network is composed of an up-sampling convolution and a multi-scale feature information fusion structure. It is mainly responsible for action feature fusion and extraction, producing individual action feature vectors and providing the input from which the ROI-Align network crops action features.
The calculation formula A_ff for motion feature fusion and extraction in the individual motion feature extraction of ResNeSt50-tiny is:

A_ff = D(upconv(x2) + x3);

where A_ff is the individual action feature information obtained after fusing the two scales; x_l is the feature information output by the l-th res layer in the backbone network; upconv(·) represents a composite connection, consisting of a 1 × 1 convolution layer and a batch normalization layer, which performs the up-sampling operation; D(·) is the recombination, splicing and fusion conversion of the multi-scale feature information.
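A minimal NumPy sketch of the fusion idea: the coarser-scale map is upsampled and added element-wise to the finer-scale map. This is an interpretation under assumptions — upconv is approximated here by nearest-neighbour 2× upsampling (the real layer is a 1 × 1 convolution plus batch normalization), D(·) is omitted, and the upsampled input is assumed to be the coarser of the two so the spatial sizes match; all shapes are toy values:

```python
import numpy as np

def upconv(x):
    # toy stand-in for the up-sampling composite connection (real: 1x1 conv + BN)
    return x.repeat(2, axis=0).repeat(2, axis=1)

def fuse(x_fine, x_coarse):
    # A_ff = D(upconv(x_coarse) + x_fine); D(.) left out in this sketch
    return upconv(x_coarse) + x_fine

x_fine = np.ones((4, 4, 8))    # a res2-like map (toy size)
x_coarse = np.ones((2, 2, 8))  # a res3-like map (toy size)
a_ff = fuse(x_fine, x_coarse)
print(a_ff.shape)  # (4, 4, 8)
```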
The behavioral-action relationship inference structure between fish individuals can be defined as:

G_ij = h(f_a(x_i^a, x_j^a), f_s(x_i^s, x_j^s));

where the h(·) function represents the combination of the action feature value relation and the spatial position information relation between fish individuals; f_a(x_i^a, x_j^a) represents the action feature value relation between fish individuals; f_s(x_i^s, x_j^s) represents the spatial position information relation between fish individuals; x_i^a and x_j^a respectively denote the individual action features of different fish; x_i^s and x_j^s respectively denote the position information of different fish individuals.
The calculation formula for the above action feature value relation f_a(x_i^a, x_j^a) between fish individuals is:

f_a(x_i^a, x_j^a) = (W_θ x_i^a + b_θ)^T (W_φ x_j^a + b_φ) / √d_k;

where W_θ x + b_θ and W_φ x + b_φ represent linear transformation operations on the two individual actions; W_θ and W_φ are weights, b_θ and b_φ are bias values; d_k is a normalization factor.
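A minimal NumPy sketch of this embedded dot-product relation, with random matrices standing in for the learned weights W_θ and W_φ, zero biases, and toy dimensions (all sizes here are assumptions, not the network's real ones):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16
d_k = 16  # normalization factor (assumed equal to the feature dimension here)
W_theta = rng.standard_normal((d, d))
W_phi = rng.standard_normal((d, d))
b_theta = np.zeros(d)
b_phi = np.zeros(d)

def f_a(x_i, x_j):
    # scaled dot product between the two linearly transformed action features
    return (W_theta @ x_i + b_theta) @ (W_phi @ x_j + b_phi) / np.sqrt(d_k)

x_i = rng.standard_normal(d)
x_j = rng.standard_normal(d)
score = f_a(x_i, x_j)
```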
The calculation formula for the spatial position information relation f_s(x_i^s, x_j^s) between fish individuals is:

f_s(x_i^s, x_j^s) = I(d(x_i^s, x_j^s) ≤ μ);

where I(·) is the mask determination function; d(x_i^s, x_j^s) is the Euclidean distance between the centre points of different fish individuals; μ is the determination threshold.
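The distance mask can be sketched as follows; the threshold value μ = 50 is a toy assumption for illustration only:

```python
import numpy as np

def f_s(c_i, c_j, mu=50.0):
    # indicator mask: 1 if the Euclidean distance between the two fish
    # centre points is within the threshold mu, else 0
    dist = np.linalg.norm(np.asarray(c_i, float) - np.asarray(c_j, float))
    return 1.0 if dist <= mu else 0.0

print(f_s((0, 0), (30, 40)))   # distance 50 -> 1.0
print(f_s((0, 0), (60, 80)))   # distance 100 -> 0.0
```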
Finally, the GCN is used to learn the fish individual action relationship information and infer the fish group activities. The relationship between the fish individual action graph network and the GCN network can be defined as:

Z^(l+1) = σ(G Z^(l) W^(l));

where Z^(l) is the input of the l-th layer, and Z^(0) is the input appearance feature (of size N × d); G is the expression of the fish individual action relation matrix, of size N × N; W^(l) is a layer-specific learnable weight matrix of size d × d; N_G is the number of graphs, which can be 8; Z^(l+1) is the N × d activity feature vector after the GCN; σ(·) is an activation function.
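One GCN layer can be sketched in NumPy as plain matrix products; the uniform relation matrix G, the ReLU activation, and the toy sizes (N = 5 fish, d = 8) are assumptions for illustration:

```python
import numpy as np

def gcn_layer(Z, G, W):
    # Z: (N, d) node features, G: (N, N) relation matrix, W: (d, d) learnable weights
    return np.maximum(G @ Z @ W, 0.0)  # ReLU as the activation sigma(.)

rng = np.random.default_rng(1)
N, d = 5, 8
Z0 = rng.standard_normal((N, d))     # Z^(0): input appearance features
G = np.full((N, N), 1.0 / N)         # toy uniform relation matrix
W0 = rng.standard_normal((d, d))
Z1 = gcn_layer(Z0, G, W0)            # Z^(1), shape (N, d)
print(Z1.shape)  # (5, 8)
```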
First, the training set and validation set are taken as the basic sample data set, and its statistical information is acquired. The mean μ_i and variance Σ_i of the basic sample data set statistics are calculated as follows:

μ_i = (1 / n_i) Σ_{j=1}^{n_i} x_j;

Σ_i = (1 / (n_i − 1)) Σ_{j=1}^{n_i} (x_j − μ_i)(x_j − μ_i)^T;

where x_j is the activity feature vector after the GCN, i.e. Z^(l+1); n_i is the number of samples of each activity category.
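The per-category statistics above are the ordinary sample mean and (unbiased) covariance; a minimal NumPy sketch with toy data:

```python
import numpy as np

def class_statistics(features):
    # features: (n_i, d) activity feature vectors of one category
    mu = features.mean(axis=0)
    sigma = np.cov(features, rowvar=False, ddof=1)  # 1/(n_i - 1) denominator
    return mu, sigma

rng = np.random.default_rng(2)
x = rng.standard_normal((10, 4))  # toy: 10 samples, 4-dim features
mu_i, sigma_i = class_statistics(x)
```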
Further, Tukey's Ladder of Powers Transformation (TLPT) brings the feature distribution closer to Gaussian. TLPT is used to transform the support set and query set samples of the expanded data, reducing the distribution skewness:

x̃ = x^λ if λ ≠ 0, log(x) if λ = 0;

where x is a sample and λ is a hyperparameter of the transformation.
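The transformation is a one-liner; this sketch assumes strictly positive features (required for the λ = 0 logarithm branch):

```python
import numpy as np

def tukey_transform(x, lam=0.5):
    # Tukey's Ladder of Powers: x**lam if lam != 0, else log(x)
    x = np.asarray(x, dtype=float)
    return np.power(x, lam) if lam != 0 else np.log(x)

print(tukey_transform([4.0, 9.0]))       # [2. 3.]
print(tukey_transform([np.e], lam=0))    # [1.]
```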
Further, for statistical calibration of the data distribution, the Top-K base classes whose feature distance to the support sample x̃ is the closest are selected from the base class set for calibration; the specific calculation formula is:

S_d = {−‖μ_i − x̃‖² : i ∈ C_b}, S_N = {i : −‖μ_i − x̃‖² ∈ topk(S_d)};

where topk(·) selects the top elements from the input distance set S_d; C_b is the set of basic data set category samples; S_N is the set of the N base classes closest to the sample x̃. Finally, the mean and covariance of the new distribution are counted and calibrated from this closest basic sample data set.
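The selection can be sketched as follows; instead of negating distances and taking a topk, this equivalent formulation sorts plain squared distances ascending and takes the first k:

```python
import numpy as np

def nearest_base_classes(x_tilde, base_means, k=2):
    # squared Euclidean feature distance from the support sample to each
    # base-class mean, then indices of the k nearest base classes
    d = np.sum((base_means - x_tilde) ** 2, axis=1)
    return np.argsort(d)[:k]

base_means = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]])
x = np.array([0.9, 0.1])
print(nearest_base_classes(x, base_means))  # [1 0]
```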
Further, the mean μ_i and variance Σ_i are re-corrected; the specific calculation formula is:

μ' = (Σ_{i∈S_N} μ_i + x̃) / (k + 1), Σ' = (Σ_{i∈S_N} Σ_i) / k + α;

where μ' is the corrected mean; Σ' is the corrected variance; α is a hyperparameter between 0 and 1, for which 0.42 can be chosen; k is the number of activity label categories.
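The two correction formulas translate directly to NumPy; adding the scalar α to every covariance entry follows the formula above, and the toy inputs are assumptions:

```python
import numpy as np

def calibrate(x_tilde, base_mus, base_sigmas, alpha=0.42):
    # base_mus: (k, d) means and base_sigmas: (k, d, d) covariances of the
    # k selected base classes; x_tilde: (d,) transformed support sample
    k = len(base_mus)
    mu_prime = (base_mus.sum(axis=0) + x_tilde) / (k + 1)
    sigma_prime = base_sigmas.sum(axis=0) / k + alpha
    return mu_prime, sigma_prime

mus = np.array([[1.0, 1.0], [3.0, 3.0]])
sigmas = np.stack([np.eye(2), np.eye(2)])
mu_p, sigma_p = calibrate(np.array([2.0, 2.0]), mus, sigmas)
print(mu_p)  # [2. 2.]
```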
Further, the corrected mean μ' and variance Σ' are used to expand the support sample data set, generating a number of samples that are used to train a classifier; the specific calculation formula is:

D_y = {(Z^(l+1), y) | Z^(l+1) ~ N(μ', Σ') ∈ S_y};

where the total number of features generated per class is set as a hyperparameter, distributed equally over each calibrated distribution in S_y; D_y is the corrected data sample set; S_y is the support data set sample.
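Sampling the expanded features from the calibrated Gaussian is one call; the sample count, label value and dimensions here are toy assumptions:

```python
import numpy as np

def expand_samples(mu_prime, sigma_prime, label, n=100, seed=0):
    # draw n feature vectors from the calibrated Gaussian, paired with the label
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(mu_prime, sigma_prime, size=n)
    return [(zi, label) for zi in z]

D_y = expand_samples(np.zeros(2), np.eye(2), label=3)
print(len(D_y), D_y[0][0].shape)  # 100 (2,)
```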
Further, a classifier is selected. A multi-label classifier is trained using logistic regression with a one-vs-one strategy; thus, for a corrected input sample x̃, the posterior probability of the r-th class y_r is calculated as:

P(y_r | x̃) = exp(w_r^T x̃) / Σ_{k'=1}^{K} exp(w_{k'}^T x̃);

where K is the number of activity label categories; each class corresponds to a weight vector, K in total, with w_i the weight vector of the i-th class; and the dimensions of the corrected activity feature vector approach a Gaussian distribution.
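The posterior above is a softmax over the per-class scores w_r · x̃; a minimal sketch with assumed toy weights:

```python
import numpy as np

def posterior(x, W):
    # W: (K, d) one weight vector per activity class; softmax over w_r . x
    logits = W @ x
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()

W = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])  # K = 3 toy classes
p = posterior(np.array([2.0, 0.0]), W)
print(p.sum())  # 1.0
```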
The unification of the distribution of the fish activity characteristic data effectively reduces the mismatching between the estimated distribution and the true distribution learned by using only a few samples, thereby improving the identification accuracy of the fish group activities.
In addition, the activity feature information vectors S in the support sample data set and the feature vectors generated by expansion are used as data samples for training the classifier, using a cross-entropy loss function. The cross-entropy loss function calculation formula is:

ℒ_CE(ŷ, y) = −Σ_k y_k log(ŷ_k);
where the loss function in the regression classification uses the cross-entropy loss function and introduces action weight coefficients (W_action1, W_action2, …, W_actionN) and activity weight coefficients (W_activity1, W_activity2, …, W_activityM). The entire fish group activity recognition model (GCVC) is trained in an end-to-end fashion by back-propagation. Combined with the standard cross-entropy loss function, the final loss function calculation formula is:

ℒ = ℒ_CE(ŷ^G, y^G) + λ ℒ_CE(ŷ^a, y^a);

where λ is the weight distribution coefficient between the activity recognition loss function and the individual action loss function; ŷ^a is the individual action label predicted after the GCN network; y^a is the real individual action label. Similarly, ŷ^G is the group activity label predicted after the GCN network; y^G is the real group activity label. N is the number of individual action categories; M is the number of group activity categories.
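A minimal sketch of the combined weighted loss, assuming one-hot labels, uniform weight coefficients and λ = 0.5 (all toy values):

```python
import numpy as np

def weighted_ce(pred, target, w):
    # class-weighted cross entropy for a one-hot target
    return -np.sum(w * target * np.log(pred))

def total_loss(act_pred, act_true, w_action, grp_pred, grp_true, w_activity, lam=0.5):
    # group-activity loss plus lambda-weighted individual-action loss
    return (weighted_ce(grp_pred, grp_true, w_activity)
            + lam * weighted_ce(act_pred, act_true, w_action))

act_pred = np.array([0.7, 0.2, 0.1]); act_true = np.array([1.0, 0.0, 0.0])
grp_pred = np.array([0.6, 0.4]);      grp_true = np.array([0.0, 1.0])
loss = total_loss(act_pred, act_true, np.ones(3), grp_pred, grp_true, np.ones(2))
```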
The underwater fish group activity intensity identification method provided by the invention can construct the underwater fish group activity recognition model on a 64-bit Ubuntu 18.04 operating system platform, based on the PyTorch-GPU 1.5.0 deep learning framework and the Python 3.6.8 scripting language, completing model training on an NVIDIA GTX 2080ti GPU. The batch size on a single GPU is two images for all comparison models and the proposed model. The initial learning rate is set to 0.00002 in all cases, and the number of training rounds is uniformly 100 epochs. At test time, a single-scale test mode may be run in which the short edge of the input image is adjusted to 720 and the long edge is limited to 1280.
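The single-scale resize rule can be sketched as follows — scale so the short edge becomes 720, then cap the long edge at 1280 (the exact rounding behaviour is an assumption):

```python
def single_scale_resize(h, w, short=720, long_max=1280):
    # scale so the short edge becomes `short`, then cap the long edge at `long_max`
    scale = short / min(h, w)
    if scale * max(h, w) > long_max:
        scale = long_max / max(h, w)
    return round(h * scale), round(w * scale)

print(single_scale_resize(1080, 1920))  # (720, 1280)
```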
FIG. 2 is a schematic structural diagram of the framework for identifying fish group activity intensity provided by the present invention. As shown in FIG. 2, three images of the target fish school at times t1, t2 and t3, in shooting-time order, are obtained from the 3 × T × H × W target fish school images as input to the recognition network; the time intervals between t1, t2 and t3 are not limited. T is the number of video frames, 3 is the number of channels, and H × W is the size of the input target fish image.
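Assembling the three frames into the 3 × T × H × W input can be sketched in NumPy (the zero images are placeholders for the real frames at t1, t2, t3):

```python
import numpy as np

H, W, T = 720, 1280, 3
# three frames of the target fish school at times t1, t2, t3 (toy zero images),
# stacked into a 3 (channels) x T x H x W input clip
frames = [np.zeros((3, H, W)) for _ in range(T)]
clip = np.stack(frames, axis=1)
print(clip.shape)  # (3, 3, 720, 1280)
```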
Stem is a set of convolution layers used to extract shallow information from the target fish school image for the backbone network. The backbone network outputs feature maps at two different scales and feeds them to the action feature extraction module, which performs feature fusion and extraction on the feature maps of different scales to obtain the individual action feature vectors of the fish in the target fish school image. These are input to the RoI-Align module, which extracts action features from the individual action feature vectors to obtain an N × d individual action relationship matrix; the GCN then performs action relationship reasoning on the individual action relationship matrix to obtain the activity feature vector distribution. After correcting the activity feature vector distribution, the activity feature vector distribution correction module outputs the individual behavior information and group behavior intensity information of the target fish school.
Fig. 3 is a second schematic flow chart of the method for identifying activity intensity of an underwater fish group provided by the present invention, as shown in fig. 3, including:
First, in step S31, underwater fish images are acquired and a fish group activity data set is constructed; further, in step S32, an underwater fish individual action relationship inference network is constructed; further, in step S33, initial network parameters are set and the fish individual action relationship inference network is trained; further, in step S34, the activity feature vector distribution correction module corrects the data distribution; finally, in step S35, the individual behavior information and group behavior intensity information of the underwater fish school are output.
Fig. 4 is a schematic structural diagram of an underwater fish group activity intensity recognition device provided by the present invention, as shown in fig. 4, including:
the acquiring module 401 is configured to acquire a plurality of target fish school images of a target fish school under water;
a determining module 402, configured to input the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determine individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model; the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label; the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image; the behavior recognition model is constructed based on the feature vector distribution correction module.
First, the obtaining module 401 obtains a plurality of target fish swarm images of a target fish swarm under water.
The target fish school images can be obtained by shooting the target fish school with an underwater camera at different times, or by video-recording the target fish school and extracting multiple frames from the recorded video as the target fish school images.
Specifically, target fish swarm images shot at different time points are obtained for a target fish swarm which is a research object, and the target fish swarm images carry shooting time information.
Further, the determining module 402 inputs the target fish swarm images into a behavior recognition model based on the time information in the target fish swarm images, and determines the individual behavior information and the group behavior intensity information of the target fish swarm output by the behavior recognition model;
the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label;
the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image; the behavior recognition model is constructed based on the feature vector distribution correction module.
Because groups exist within a fish school and move following certain fish in the group, conventional fish action identification methods study group activity directly with only surface-level action features and other auxiliary information, neglecting the relationship reasoning between individual fish actions and group activity; the GCN network in the behavior recognition model solves this problem.
Aiming at problems such as unclear fish individual action features and group activity features and adhesion of data distributions in a complex underwater environment, a feature vector distribution correction module is added to the original GCN (graph convolutional network) in the behavior recognition model; the feature vector distribution correction module corrects the data distribution of the activity feature vectors of the target fish school images, so that the dimensions of the activity feature vectors obey a Gaussian distribution.
Specifically, a plurality of target fish swarm images are input into a behavior recognition model, and feature extraction, fusion, action relation reasoning and vector data distribution correction are carried out on each target fish swarm image by the behavior recognition model, so that final individual behavior information and group behavior intensity information of the target fish swarm are obtained and output.
The individual behavior information is the action behavior information of each fish in the target fish school, including: acceleration, deceleration, swimming, resting and ingestion; the group behavior intensity information is the activity intensity of the target fish school as a whole, including: none, strong, medium and weak.
According to the underwater fish group activity intensity identification device provided by the invention, the behavior recognition model is constructed based on the feature vector distribution correction module; using a small number of expanded data sample features and the basic sample data set features, the dimensions of the activity feature vector can be made to obey a Gaussian distribution, effectively resolving the mismatch between the estimated distribution learned from only a few samples and the real distribution, thereby improving the recognition accuracy of fish group behaviors.
Fig. 5 is a second schematic structural diagram of the underwater fish group activity intensity recognition device provided by the present invention, as shown in fig. 5, including: the underwater camera comprises an underwater camera 1, a light source 2, a light intensity transmitter 3 and a control processor 4, wherein the control processor 4 is respectively in communication connection with the underwater camera 1, the light source 2 and the light intensity transmitter 3.
The underwater camera 1 can collect underwater images of the fish school under the control of the control processor 4; the light source 2 is used to supplement light for the underwater camera 1; the illuminance transmitter 3 can sense the ambient light intensity and transmit the light intensity information to the control processor 4, which controls the switching and illumination intensity of the light source 2 according to the light intensity information; the control processor 4 can receive the images collected by the underwater camera 1 and perform real-time identification and localization of individual fish behaviors on the images.
The illuminance transmitter 3 comprises an illuminance sensor, a microcontroller and a communication interface; the microcontroller is connected to the illuminance sensor and the communication interface respectively, and can control the illuminance sensor to collect data and transmit the collected data to the control processor 4 through the communication interface.
With the underwater fish group activity intensity recognition device, an underwater camera and a light source, adjusted according to the illuminance data collected by the illuminance transmitter, obtain underwater images of the fish school; the control processor then identifies individual fish behaviors and group activity behaviors in the complex underwater environment and performs real-time recognition and localization on the underwater fish images, providing important data support for monitoring fish biodiversity, marine resource management, ecosystem monitoring, commercial fishing grounds and the like.
Furthermore, analysis of the group behavior of fish populations is useful not only for assessing the feeding activity and survival safety of fish populations, but also for reflecting their daily energy expenditure. Automated fish group activity recognition is of great significance for understanding the life habits, growth conditions and formation mechanisms of fish groups and for formulating efficient feeding strategies.
It should be noted that, when being specifically executed, the underwater fish swarm activity intensity identification device provided in the embodiment of the present invention may be implemented based on the underwater fish swarm activity intensity identification method described in any of the above embodiments, and details of this embodiment are not described herein.
Fig. 6 is a schematic structural diagram of an electronic device provided in the present invention, and as shown in fig. 6, the electronic device may include: a processor (processor)610, a communication Interface (Communications Interface)620, a memory (memory)630 and a communication bus 640, wherein the processor 610, the communication Interface 620 and the memory 630 communicate with each other via the communication bus 640. The processor 610 may invoke logic instructions in the memory 630 to perform a method of underwater fish swarm activity intensity identification, the method comprising: acquiring a plurality of target fish school images of a target fish school under water; inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model; the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label; the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image; the behavior recognition model is constructed based on the feature vector distribution correction module.
In addition, the logic instructions in the memory 630 may be implemented in software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In another aspect, the present invention also provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions, which when executed by a computer, enable the computer to execute the underwater fish group activity intensity identification method provided by the above methods, the method comprising: acquiring a plurality of target fish school images of a target fish school under water; inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model; the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label; the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image; the behavior recognition model is constructed based on the feature vector distribution correction module.
In yet another aspect, the present invention further provides a non-transitory computer-readable storage medium, on which a computer program is stored, the computer program being implemented by a processor to perform the underwater fish swarm activity intensity identification method provided in the foregoing embodiments, the method including: acquiring a plurality of target fish school images of a target fish school under water; inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model; the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label; the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image; the behavior recognition model is constructed based on the feature vector distribution correction module.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for identifying activity intensity of underwater fish groups is characterized by comprising the following steps:
acquiring a plurality of target fish school images of a target fish school under water;
inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model;
the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label; the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image; the behavior recognition model is constructed based on the feature vector distribution correction module.
2. The underwater fish swarm activity intensity identification method of claim 1, wherein the obtaining of the plurality of target fish swarm images of the target fish swarm underwater comprises:
acquiring video data of the target fish school;
determining a plurality of initial images of the target fish school according to the time information of each frame of image in the video data;
and performing size preprocessing on all the initial images to determine the target fish swarm images.
3. The underwater fish group activity intensity recognition method according to claim 1 or 2, further comprising, before the inputting the plurality of target fish group images into a behavior recognition model:
extracting multi-frame fish shoal images from an underwater fish shoal video;
preprocessing the multi-frame fish school images to obtain a plurality of sample fish school images;
determining a group activity intensity information label of each sample fish swarm image and an individual behavior information label of each fish in each sample fish swarm image;
taking each sample fish school image and the combination of the group activity intensity information label and the individual behavior information label corresponding to each sample fish school image as a training sample to obtain a plurality of training samples;
and training an initial recognition model by using the training samples based on the time information in each training sample to obtain the behavior recognition model.
4. The method of claim 3, wherein before the training the initial recognition model with the training samples, the method further comprises:
constructing an attention mechanism residual error network based on a backbone network, an individual action feature extraction network and a regional feature aggregation module;
constructing the initial identification model based on the attention mechanism residual error network, the graph convolution network and the feature vector distribution correction module;
the loss function of the initial recognition model is determined based on individual fish behavioral characteristics and fish group behavioral characteristics.
5. The underwater fish swarm activity intensity identification method of claim 4, wherein the step of inputting the target fish swarm images into a behavior identification model based on the time information in the target fish swarm images, and determining the individual behavior information and the swarm behavior intensity information of the target fish swarm output by the behavior identification model comprises the steps of:
based on the time information in the target fish swarm images, performing feature extraction and feature fusion on the target fish swarm images by using the attention mechanism residual error network, and determining individual action relation matrixes of the target fish swarm images;
performing action relation reasoning on the individual action characteristic vector of each target fish school image by using the graph convolution network, and determining the activity characteristic vector of each target fish school image;
and performing Gaussian distribution correction on each activity characteristic vector by using the characteristic vector distribution correction module, and determining the individual behavior information and the group behavior intensity information of the target fish school.
6. The underwater fish swarm activity intensity identification method of claim 5, wherein the performing feature extraction and feature fusion on the target fish swarm images by using the attention mechanism residual network based on the time information in the target fish swarm images, and determining the individual action relation matrices of the target fish swarm images comprises:
based on the time information in the target fish swarm images, performing feature extraction and fusion on the target fish swarm images by using the backbone network to obtain a plurality of feature fusion images;
utilizing the individual action feature extraction network to extract individual action features of the feature fusion images, and determining individual action feature vectors;
and performing action feature extraction on the individual action feature vectors by utilizing the region feature aggregation module, and determining the individual action relation matrices of the plurality of target fish swarm images.
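The first step of claim 6, fusing backbone features from time-ordered frames, can be sketched as a simple pairwise blend of adjacent frames. Treating fusion as an element-wise mean of consecutive frame feature vectors is an assumption for illustration; the patent does not specify the backbone or the fusion operator at this level, and `fuse_adjacent_frames` is a hypothetical name.

```python
def fuse_adjacent_frames(frame_features):
    """Fuse time-ordered per-frame feature vectors pairwise.

    frame_features: list of feature vectors, ordered by timestamp.
    Returns len(frame_features) - 1 fused vectors, each the element-wise
    mean of two adjacent frames, so inter-frame motion is blended in.
    """
    fused = []
    for prev, curr in zip(frame_features, frame_features[1:]):
        fused.append([(p + c) / 2.0 for p, c in zip(prev, curr)])
    return fused
```

Three frames thus yield two fused feature vectors, each carrying information from a pair of time-adjacent images.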
7. An underwater fish group activity intensity recognition device, comprising:
the acquisition module is used for acquiring a plurality of target fish school images of a target fish school under water;
the determining module is used for inputting the target fish swarm images into a behavior recognition model based on time information in the target fish swarm images, and determining individual behavior information and group behavior intensity information of the target fish swarm output by the behavior recognition model;
the behavior recognition model is obtained after training based on a sample fish school image with an individual behavior information label and a group behavior intensity information label;
the behavior recognition model is used for recognizing the behavior of the target fish school based on the target fish school image;
the behavior recognition model is constructed based on the feature vector distribution correction module.
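One plausible reading of the feature vector distribution correction module named in claim 7 is per-dimension standardisation of a batch of activity feature vectors toward zero mean and unit variance, i.e. toward a standard Gaussian. This interpretation, and the function `gaussian_correct`, are assumptions for illustration; the patent does not spell out the exact transform here.

```python
import math

def gaussian_correct(vectors, eps=1e-8):
    """Standardise each feature dimension across a batch of vectors."""
    dims = len(vectors[0])
    n = len(vectors)
    # Per-dimension batch mean.
    means = [sum(v[d] for v in vectors) / n for d in range(dims)]
    # Per-dimension batch standard deviation (population form).
    stds = [math.sqrt(sum((v[d] - means[d]) ** 2 for v in vectors) / n)
            for d in range(dims)]
    # Shift and scale; eps guards against zero-variance dimensions.
    return [[(v[d] - means[d]) / (stds[d] + eps) for d in range(dims)]
            for v in vectors]
```

After correction, each dimension is centred and scaled, which typically stabilises the downstream behavior classification head against shifts in feature magnitude.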
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the underwater fish swarm activity intensity identification method according to any one of claims 1 to 6.
9. A non-transitory computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the underwater fish swarm activity intensity identification method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the underwater fish swarm activity intensity identification method according to any one of claims 1 to 6.
CN202210028850.7A 2022-01-11 2022-01-11 Underwater fish group activity intensity identification method and device Active CN114463675B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210028850.7A CN114463675B (en) 2022-01-11 2022-01-11 Underwater fish group activity intensity identification method and device

Publications (2)

Publication Number Publication Date
CN114463675A true CN114463675A (en) 2022-05-10
CN114463675B CN114463675B (en) 2023-04-28

Family

ID=81409291

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210028850.7A Active CN114463675B (en) 2022-01-11 2022-01-11 Underwater fish group activity intensity identification method and device

Country Status (1)

Country Link
CN (1) CN114463675B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107423745A (en) * 2017-03-27 2017-12-01 Zhejiang University of Technology Fish activity classification method based on neural network
CN112070114A (en) * 2020-08-03 2020-12-11 Institute of Information Engineering, Chinese Academy of Sciences Scene character recognition method and system based on Gaussian constraint attention mechanism network
AU2020103130A4 (en) * 2020-10-30 2021-01-07 Xi’an University of Technology Habitat Identification Method Based on Fish Individual Dynamic Simulation Technology
CN112598713A (en) * 2021-03-03 2021-04-02 Zhejiang University Offshore submarine fish detection and tracking statistical method based on deep learning
CN112800994A (en) * 2021-02-03 2021-05-14 China Agricultural University Fish swarm feeding behavior identification method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN, Ming et al.: "Evaluation method of fish feeding activity intensity based on feature-weighted fusion", Transactions of the Chinese Society for Agricultural Machinery *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114637014A (en) * 2022-05-17 2022-06-17 Zhongkai University of Agriculture and Engineering Underwater robot-based unmanned fishing ground fish school behavior recognition system and method
CN115170942A (en) * 2022-07-25 2022-10-11 Dalian Ocean University Fish behavior identification method with multilevel fusion of sound and vision
CN115170942B (en) * 2022-07-25 2023-10-17 Dalian Ocean University Fish behavior recognition method with multi-stage fusion of sound and vision
CN116863322A (en) * 2023-07-28 2023-10-10 Jiangsu Zhongshui Dongze Agricultural Development Co., Ltd. AI-based adaptive illumination method, device and storage medium for fish breeding
CN116863322B (en) * 2023-07-28 2024-04-30 Jiangsu Zhongshui Dongze Agricultural Development Co., Ltd. AI-based adaptive illumination method, device and storage medium for fish breeding
CN117409368A (en) * 2023-10-31 2024-01-16 Dalian Ocean University Real-time analysis method for shoal gathering behavior and shoal starvation behavior based on density distribution

Also Published As

Publication number Publication date
CN114463675B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN114463675A (en) Underwater fish group activity intensity identification method and device
CN111523621B (en) Image recognition method and device, computer equipment and storage medium
CN110598029B (en) Fine-grained image classification method based on attention transfer mechanism
CN108830334B (en) Fine-grained target discrimination method based on antagonistic transfer learning
Nathan et al. Using tri-axial acceleration data to identify behavioral modes of free-ranging animals: general concepts and tools illustrated for griffon vultures
Reddy et al. Few-shot scene adaptive crowd counting using meta-learning
CN112016527B (en) Panda behavior recognition method, system, terminal and medium based on deep learning
US20210383149A1 (en) Method for identifying individuals of oplegnathus punctatus based on convolutional neural network
CN110414541B (en) Method, apparatus, and computer-readable storage medium for identifying an object
Binguitcha-Fare et al. Crops and weeds classification using convolutional neural networks via optimization of transfer learning parameters
Alsmadi et al. Fish recognition based on the combination between robust feature selection, image segmentation and geometrical parameter techniques using Artificial Neural Network and Decision Tree
KR20190004429A (en) Method and apparatus for determining training of unknown data related to neural networks
Bezamat et al. The influence of cooperative foraging with fishermen on the dynamics of a bottlenose dolphin population
Martins et al. Deep learning applied to the identification of fruit fly in intelligent traps
Cole et al. Spatial implicit neural representations for global-scale species mapping
CN117010971B (en) Intelligent health risk providing method and system based on portrait identification
Merkle et al. Likelihood‐based photograph identification: Application with photographs of free‐ranging bison
CN112215066A (en) Livestock face image recognition method and device
Khanam et al. Application of Deep CNN for image-based identification and classification of plant diseases
Sultana et al. A Deep CNN based Kaggle Contest Winning Model to Recognize Real-Time Facial Expression
US20240111924A1 (en) Distributed Invasive Species Tracking Network
Han Based on XGBoost Model and SAR Model to Identify New Species and Predict the Number Change and Distribution of New Species: Based on the Invasion and Spread of Asian Hornet in the United States
JP7184195B2 (en) LEARNING DEVICE, LEARNING METHOD AND PROGRAM
Kim et al. Devising a method for recognizing the causes of deviations in the development of the plant Aloe arborescens L. using machine learning capabilities
Sheng et al. Age-groups classification of Irrawaddy dolphins based on dorsal fin geometric morphological features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant