CN116721303B - Unmanned aerial vehicle fish culture method and system based on artificial intelligence
- Publication number
- CN116721303B (application CN202311007901.9A)
- Authority
- CN
- China
- Prior art keywords
- fish
- calculated
- image
- following formula
- calculating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
- G06T7/13—Edge detection
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/54—Extraction of image or video features relating to texture
- G06V10/56—Extraction of image or video features relating to colour
- G06V20/05—Underwater scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
- Y02A40/81—Aquaculture, e.g. of fish
Abstract
The invention discloses an unmanned aerial vehicle fish culture method and system based on artificial intelligence. The method comprises the following steps: data acquisition, texture feature extraction, color feature extraction, shape feature extraction, input parameter determination and multi-variety fish state classification. The invention belongs to the technical field of intelligent cultivation and in particular relates to an unmanned aerial vehicle fish cultivation method and system based on artificial intelligence. By improving the calculation formulas of the first and second edge detection operators, the final edge detection operator is improved and the accuracy of edge detection is raised, so that the extraction quality of shape features is improved; input parameters are determined with a gray relational analysis method, which strengthens the correlation between the input parameters and the experimental results and improves the convergence speed and prediction accuracy of the model; and by continuously adjusting the inertia weight, the particles move toward a better search region, avoiding the problem that a better global solution cannot be found because the search is trapped in a local minimum.
Description
Technical Field
The invention belongs to the technical field of intelligent cultivation, and particularly relates to an unmanned aerial vehicle fish cultivation method and system based on artificial intelligence.
Background
In existing practice, a classification model is generally established by extracting feature information from images with a combination of an edge detection algorithm and corner extraction, and the extracted feature information is then used to build the classification model. However, existing image processing methods suffer from three technical problems: image edge positioning is inaccurate when shape features are extracted; too many input parameters cause the classification model to overfit; and the classification algorithm easily falls into a local minimum and cannot find a better global solution.
Disclosure of Invention
In view of the above situation and to overcome the defects of the prior art, the invention provides an unmanned aerial vehicle fish culture method and system based on artificial intelligence. Aiming at the technical problem of inaccurate image edge positioning during shape feature extraction, the invention improves the calculation formulas of the first and second edge detection operators to improve the final edge detection operator, so that edge positioning is more accurate; aiming at the technical problem that too many input parameters cause the classification model to overfit, the invention determines the input parameters with a gray relational analysis method, strengthens the correlation between the input parameters and the experimental results, and improves the convergence speed and prediction accuracy of the model; aiming at the technical problem that the classification algorithm easily falls into a local minimum and cannot find a better global solution, the invention continuously adjusts the inertia weight so that the optimization parameters move toward a better search region, avoiding this problem.
The technical scheme adopted by the invention is as follows: the invention provides an unmanned aerial vehicle fish culture method based on artificial intelligence, which comprises the following steps:
step S1: collecting data, namely collecting fish images and corresponding tags, wherein the tags are the variety and growth state of fish, and taking the collected fish images as fish images;
step S2: extracting texture features, calculating corresponding gray values based on pixel values of RGB three channels of the fish image, further calculating gray co-occurrence matrix and probability of each center pixel, and finally obtaining the texture features by calculating contrast, energy, entropy and uniformity;
step S3: extracting color features, converting a fish image into an HSV color space, dividing the HSV color space into a plurality of intervals, further calculating a color histogram, and finally obtaining the color features by calculating a mean value, a variance, a median and a standard deviation;
step S4: extracting shape characteristics, calculating an improved edge detection operator by calculating a first edge detection operator and calculating a second edge detection operator, calculating a final image edge by calculating a small-scale image edge and calculating a large-scale image edge, performing polygon fitting, and finally obtaining shape characteristics by calculating contour length, contour area, center distance and eccentricity;
Step S5: determining input parameters, firstly constructing a classification data set, then constructing a comparison matrix by setting a reference sequence, obtaining a non-dimensionality matrix by non-dimensionality data, calculating gray correlation degree by calculating gray correlation coefficient, and finally determining the input parameters;
step S6: multi-variety fish state classification, which comprises the steps of firstly constructing a training data set and a testing data set, initializing optimization parameter positions and speeds, generating multi-variety fish state classification model parameters and training the multi-variety fish state classification model, initializing individual optimal positions and the global optimal position, updating optimization parameter speeds, positions and fitness values, updating the inertia weight, the individual optimal positions and the global optimal position, determining the final multi-variety fish state classification model based on an evaluation threshold, and then using the unmanned aerial vehicle to acquire fish images in real time and feeding according to the fish variety and growth state output by the model.
Further, in step S2, the extracting texture features specifically includes the following steps:
step S21: calculating gray values, calculating pixel values of three RGB channels of the fish image to obtain corresponding gray values, and assigning the obtained gray values to corresponding pixel points to obtain a gray image, wherein the formula is as follows:
A=0.299*R+0.587*G+0.114*B;
Wherein, A is the gray value of each pixel point, R, G, B is the pixel value of red, green and blue channels, and 0.299, 0.587 and 0.114 are the weighting coefficients corresponding to R, G, B respectively;
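For illustration, the weighted graying of step S21 can be sketched in Python as below; the NumPy array layout and the RGB channel ordering are assumptions made for the example, not requirements of the patent.

```python
import numpy as np

def to_gray(rgb_image: np.ndarray) -> np.ndarray:
    """Apply A = 0.299*R + 0.587*G + 0.114*B to an H x W x 3 array assumed to be in RGB order."""
    rgb = rgb_image.astype(np.float64)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
```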
step S22: the gray level co-occurrence matrix is calculated by the following formula:
G(i,j,δr,δc)=∑m∑n 1{I(m,n)=i and I(m+δr,n+δc)=j}, with m=1,…,Nr and n=1,…,Nc;
wherein G(i, j, δr, δc) is the gray level co-occurrence matrix, 1{·} equals 1 when the condition holds and 0 otherwise, i and j are gray levels, δr and δc are the offsets of the neighborhood pixel in the row and column directions, Nr and Nc are the numbers of rows and columns of the grayscale image, and I(m, n) is the gray value of the pixel in the m-th row and n-th column of the grayscale image;
step S23: the probability is calculated using the formula:
P(i,j)=Npq/N1;
wherein P(i, j) is the probability that gray level i appears as a neighborhood pixel of a center pixel with gray level j in the gray level co-occurrence matrix, Npq is the number of occurrences of the pixel pair whose neighborhood gray level is p and whose center gray level is q, and N1 is the sum of all elements in the gray level co-occurrence matrix;
step S24: calculating texture features, wherein the steps are as follows:
step S241: the contrast is calculated using the following formula:
C=∑i∑j (i-j)²*P(i,j);
wherein C is the contrast between pixels in the image;
step S242: the energy was calculated using the formula:
D=∑i∑j P(i,j)²;
wherein D is energy;
step S243: the entropy is calculated using the formula:
E=-∑i∑j P(i,j)*log(P(i,j));
Wherein E is entropy;
step S244: uniformity was calculated using the following formula:
F=∑i∑j P(i,j)/(1+(i-j)²);
wherein F is uniformity.
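A compact sketch of steps S22 to S24 follows; the number of gray levels, the single offset (δr, δc) = (0, 1), and the homogeneity-style form used for uniformity are illustrative assumptions rather than values fixed by the patent.

```python
import numpy as np

def glcm_features(gray: np.ndarray, levels: int = 32, dr: int = 0, dc: int = 1):
    """Return (contrast, energy, entropy, uniformity) from one normalized co-occurrence matrix."""
    q = np.clip((gray / 256.0 * levels).astype(int), 0, levels - 1)   # quantize gray values
    glcm = np.zeros((levels, levels), dtype=np.float64)
    rows, cols = q.shape
    for m in range(rows - dr):                 # accumulate pixel pairs for the offset (dr, dc)
        for n in range(cols - dc):
            glcm[q[m, n], q[m + dr, n + dc]] += 1
    p = glcm / glcm.sum()                      # P(i, j) of step S23
    i, j = np.indices(p.shape)
    contrast = np.sum((i - j) ** 2 * p)        # C, step S241
    energy = np.sum(p ** 2)                    # D, step S242
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))    # E, step S243
    uniformity = np.sum(p / (1.0 + (i - j) ** 2))     # F, step S244 (assumed form)
    return contrast, energy, entropy, uniformity
```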
Further, in step S3, the extracting color features specifically includes the steps of:
step S31: the fish image is converted into HSV color space, and the steps are as follows:
step S311: normalizing, namely normalizing RGB values in the RGB color image to be [0,1];
step S312: the hue is calculated using the formula:
H=0°, if Lmax=Lmin; H=60°×(G-B)/(Lmax-Lmin) mod 360°, if Lmax=R; H=60°×(B-R)/(Lmax-Lmin)+120°, if Lmax=G; H=60°×(R-G)/(Lmax-Lmin)+240°, if Lmax=B;
wherein H is the hue with value range [0°, 360°), and Lmax and Lmin are respectively the maximum and minimum values among the R, G, B color channels;
step S313: the saturation was calculated using the formula:
S=0, if Lmax=0; S=(Lmax-Lmin)/Lmax, otherwise;
wherein S is saturation and has a value range of [0,1];
step S314: the brightness was calculated using the following formula:
V=Lmax;
wherein V is brightness and has a value range of [0,1];
step S32: dividing an HSV color space into a plurality of sections, uniformly dividing a tone H into 24 sections in the HSV color space, and uniformly dividing saturation S and brightness V into 10 sections respectively;
step S33: calculating a color histogram, traversing each pixel in an image, and counting the number of the color space intervals to which the pixel belongs to obtain the color histogram;
Step S34: the color characteristics are calculated as follows:
step S341: the mean was calculated using the formula:
μ=(1/N2)*∑i i*mi;
wherein μ is a mean value, mi is the frequency of occurrence of the ith pixel value in the color histogram, and N2 is the total number of pixel values in the color histogram;
step S342: the variance is calculated using the formula:
σ²=(1/N2)*∑i (i-μ)²*mi;
wherein σ² is the variance;
step S343: calculating the median, sorting the pixel values in the color histogram according to ascending order, and taking the value arranged at the middle position as the median;
step S344: standard deviation was calculated using the following formula:
σ=sqrt(σ²);
where σ is the standard deviation.
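Steps S31 to S34 can be condensed as in the sketch below; OpenCV's HSV conversion replaces the explicit hue and saturation formulas, the 24/10/10 bin split follows step S32, and the bin-index-weighted moments and binned median are interpretations assumed for the example.

```python
import cv2
import numpy as np

def color_features(rgb_image: np.ndarray):
    """Histogram over 24 hue x 10 saturation x 10 value bins, then mean, variance, median, std."""
    hsv = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2HSV)    # 8-bit: H in [0,180), S and V in [0,256)
    h_bin = hsv[..., 0].astype(int) * 24 // 180
    s_bin = hsv[..., 1].astype(int) * 10 // 256
    v_bin = hsv[..., 2].astype(int) * 10 // 256
    idx = (h_bin * 10 + s_bin) * 10 + v_bin             # flatten the 3-D bin index (step S33)
    hist = np.bincount(idx.ravel(), minlength=24 * 10 * 10).astype(np.float64)
    bins = np.arange(hist.size)
    n2 = hist.sum()
    mean = np.sum(bins * hist) / n2
    var = np.sum((bins - mean) ** 2 * hist) / n2
    median = np.searchsorted(np.cumsum(hist), n2 / 2.0)  # bin holding the middle-ranked pixel
    return mean, var, median, np.sqrt(var)
```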
Further, in step S4, the extracting the shape feature specifically includes the steps of:
step S41: image denoising, namely performing image denoising based on the gray-scale image obtained in the step S21, wherein the following formula is used:
g(x,y)=(1/k)*∑i∑j ωij*f(x+i,y+j), i,j∈[-r,r], with k=∑i∑j ωij;
wherein g (x, y) is a denoised grey image, f (x, y) is an original grey image, k is a normalization coefficient, r is a radius of a gaussian filter, ωij is a weight of the gaussian filter;
step S42: the first edge detection operator is calculated using the following formula:
Y1i=(g(x,y)⊕bi)•bi-g(x,y);
wherein Y1i is the first edge detection operator, bi (i=1, 2, …, 8) are structuring elements in different directions, ⊕ is the exclusive-or operator, and • is the dot product operator between two vectors;
Step S43: the second edge detection operator is calculated using the following formula:
;
where Y2i is the second edge detection operator, Θ is the logical OR operator,is a bitwise product operator between two vectors;
step S44: the improved edge detection operator is calculated using the following formula:
Yi=Y1i+Y2i+Yimin;
where Yi is the improved edge detection operator, and Yimin is the edge minimum, Yimin=min{Y1i, Y2i};
step S45: the edges of the small-scale image were calculated and edge detection was performed using the structural element bi (i=1, 2,3, 4) of 3*3 using the following formula:
;
wherein Q1 is a small scale image edge;
step S46: the edges of the large-scale image were calculated and edge detection was performed using the structural element bi (i=5, 6,7, 8) of 5*5 using the following formula:
;
wherein Q2 is the edge of the large-scale image;
step S47: calculating the final image edge to obtain the edge information of the graphic area, wherein the formula is as follows:
;
where Q is the final image edge;
step S48: polygon fitting, namely, finding out the outline of the graphic area based on the edge information obtained in the step S47, and converting the outline into a polygon by using the polygon fitting;
step S49: calculating shape characteristics, wherein the steps are as follows:
Step S491: the contour length is calculated using the following formula:
R=∑i di, i=1,…,n;
wherein R is the contour length, n is the number of sides of the fitted polygon, and di is the length of the i-th side;
step S492: the contour area is calculated using the following formula:
S=(1/2)*|∑i (xi*y(i+1)-x(i+1)*yi)|, i=1,…,n, with (x(n+1),y(n+1))=(x1,y1);
where S is the contour area and (xi, yi) are the coordinates of the i-th vertex of the polygon;
step S493: the center distance is calculated using the following formula:
O=sqrt((xc-xm)²+(yc-ym)²);
wherein O is the center distance, (xc, yc) are the coordinates of the center of gravity of the contour, and (xm, ym) are the coordinates of the contour point closest to the center of gravity;
step S494: the eccentricity was calculated using the following formula:
U=sqrt(1-(z/e)²);
where U is the eccentricity, e is the length of the major axis of the smallest circumscribing ellipse of the fish image profile, and z is the length of the minor axis of the smallest circumscribing ellipse of the fish image profile.
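Once a polygon has been fitted, the four shape statistics of step S49 can be sketched as below; the vertex-mean centroid and the moment-based ellipse used for the eccentricity are simplifying assumptions, since the patent's own edge operators and smallest circumscribing ellipse are not reproduced here.

```python
import numpy as np

def shape_features(vertices: np.ndarray):
    """vertices: (n, 2) array of fitted polygon vertices (x, y)."""
    rolled = np.roll(vertices, -1, axis=0)
    sides = np.linalg.norm(rolled - vertices, axis=1)
    length = sides.sum()                                           # R: sum of the side lengths
    x, y = vertices[:, 0], vertices[:, 1]
    xr, yr = rolled[:, 0], rolled[:, 1]
    area = 0.5 * abs(np.sum(x * yr - xr * y))                      # shoelace area
    centroid = vertices.mean(axis=0)                               # approximate center of gravity
    center_distance = np.linalg.norm(vertices - centroid, axis=1).min()  # O: nearest contour point
    cov = np.cov((vertices - centroid).T)                          # second moments of the contour
    eig = np.sort(np.linalg.eigvalsh(cov))[::-1]
    eccentricity = np.sqrt(1.0 - eig[1] / eig[0])                  # from major/minor axis ratio
    return length, area, center_distance, eccentricity
```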
Further, in step S5, the determining the input parameter specifically includes the following steps:
step S51: constructing a classification data set based on the texture features calculated in the step S2, the color features calculated in the step S3, the shape features calculated in the step S4 and the fish images acquired in the step S1, wherein the feature variables of the texture features comprise contrast, energy, entropy and uniformity, the feature variables of the color features comprise mean, variance, median and standard deviation, and the feature variables of the shape features comprise contour length, contour area, center distance and eccentricity;
Step S52: setting a reference sequence, and selecting n standard data from the classified data set in advance as evaluation parameters by the following formula:
X0=(x0(1),x0(2),…,x0(n));
wherein X0 is a reference sequence and n is the number of evaluation parameters;
step S53: a comparison matrix is constructed, and a comparison sequence is set based on sample data in the classification data set, wherein the formula is as follows:
X=[x1(1) x1(2) … x1(n); x2(1) x2(2) … x2(n); …; xm(1) xm(2) … xm(n)];
wherein X is a comparison matrix, and m is the number of sample data in the classified data set;
step S54: non-dimensionalized data using the formula:
x'p(q)=(xp(q)-xmin)/(xmax-xmin);
wherein xp (q) is the original data of the p-th column and q-th row of the comparison matrix X, X' p (q) is the non-dimensionalized data of the original data xp (q) after the non-dimensionalization processing, xmin is the minimum value of the p-th column of the comparison matrix X, and xmax is the maximum value of the p-th column of the comparison matrix X;
step S55: a non-dimensionalized matrix using the formula:
X'=[x'1(1) x'1(2) … x'1(n); x'2(1) x'2(2) … x'2(n); …; x'm(1) x'm(2) … x'm(n)];
wherein X' is a non-dimensionalized matrix;
step S56: the gray correlation coefficients are calculated, and the gray correlation coefficients between the corresponding elements of each sample data sequence and the reference sequence are calculated using the following formula:
εp(q)=(min_p min_q|x0(q)-x'p(q)|+ρ*max_p max_q|x0(q)-x'p(q)|)/(|x0(q)-x'p(q)|+ρ*max_p max_q|x0(q)-x'p(q)|);
where εp (q) is the gray correlation coefficient of the p-th sample data sequence and the reference sequence between the q-th evaluation parameters, ρ is the resolution coefficient and 0< ρ <1;
Step S57: the gray correlation is calculated using the following formula:
rp=(1/n)*∑q εp(q), q=1,…,n;
where rp is the gray correlation of the p-th sample data sequence and the reference sequence over all evaluation parameters;
step S58: and determining an input parameter, presetting a gray correlation threshold, and determining a characteristic variable with gray correlation larger than the threshold as the input parameter.
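A small sketch of the grey relational screening in step S5 follows; the reference sequence is assumed to be supplied externally, and the resolution coefficient ρ = 0.5 is a common default rather than a value fixed by the patent.

```python
import numpy as np

def grey_relational_degree(X: np.ndarray, x0: np.ndarray, rho: float = 0.5) -> np.ndarray:
    """X: (m, n) comparison matrix of m compared sequences over n evaluation parameters."""
    xmin, xmax = X.min(axis=0), X.max(axis=0)
    Xn = (X - xmin) / (xmax - xmin)            # column-wise non-dimensionalization (step S54)
    x0n = (x0 - xmin) / (xmax - xmin)
    delta = np.abs(x0n - Xn)                   # |x0(q) - x'p(q)|
    dmin, dmax = delta.min(), delta.max()
    eps = (dmin + rho * dmax) / (delta + rho * dmax)   # grey relational coefficients (step S56)
    return eps.mean(axis=1)                    # grey relational degree per sequence (step S57)

# Hypothetical use: keep the characteristic variables whose degree exceeds the preset threshold.
```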
Further, in step S6, the classification of the states of the multiple species of fish specifically includes the following steps:
step S61: constructing a training data set and a test data set, deleting dimension information of feature variables which are not more than a gray correlation threshold in the classification data set, and acquiring data corresponding labels to obtain sample data, wherein the corresponding labels are labels acquired in the step S1, 70% of the sample data are randomly selected as the training data set, and the rest 30% of the sample data are selected as the test data set;
step S62: initializing the positions of the optimized parameters, and randomly generating an initial position for each optimized parameter by using the following formula:
Y(i,j)=rand(0,1)*(U(j)-L(j))+L(j);
where i is the number of the optimization parameter, j is the dimension of Y, Y (i, j) is the position of the ith optimization parameter in the jth dimension, rand (0, 1) is a random number generated between 0 and 1, U (j) is the upper bound limit of the jth dimension, and L (j) is the lower bound limit of the jth dimension;
Step S63: initializing the speed of optimized parameters, and randomly generating an initial speed for each optimized parameter by using the following formula:
V(i,j)=rand(0,1)*(Vmax(j)-Vmin(j))+Vmin(j);
where V (i, j) is the speed of the ith optimization parameter in the jth dimension, vmax (j) is the upper limit of the speed in the jth dimension, and Vmin (j) is the lower limit of the speed in the jth dimension;
step S64: generating multiple-variety fish state classification model parameters, and generating a group of multiple-variety fish state classification model parameters according to the current position for each optimization parameter, wherein the group of multiple-variety fish state classification model parameters consists of a punishment factor and a kernel function parameter, and the formula is as follows:
C(i)=2^Y(i,1);
G(i)=2^Y(i,2);
wherein, C (i) is a punishment factor of the state classification model of the multi-variety fish, and G (i) is a kernel function parameter of the state classification model of the multi-variety fish;
step S65: training a multi-variety fish state classification model, training the multi-variety fish state classification model based on the multi-variety fish state classification model parameters determined in the step S64 and the training data set constructed in the step S61, and calculating weight vectors and bias values of the multi-variety fish state classification model, wherein the formula is as follows:
wi=∑ai*ci*ei;
ti=ei-∑j aj*ej*g(ci,cj), j=1,…,n;
where wi is the weight vector of the ith classifier, ai is the Lagrangian multiplier of the ith sample, ci is the eigenvector of the ith sample, ei is the corresponding label of the ith sample, ti is the bias value of the ith classifier, n is the number of training samples, g (ci, cj) is the kernel function, cj is the eigenvector of the jth sample;
Step S66: calculating an optimization parameter fitness value, predicting the test data set constructed in the step S61 by using the multi-variety fish state classification model trained in the step S65, and calculating the fitness value by using the following formula:
f(i)=-∑j∑l yjl*log(pjl(i)), j=1,…,k, l=1,…,nj;
wherein f(i) is the fitness value of the ith optimization parameter, k is the number of label classes, nj is the sample size of the jth class, yjl is the true label indicator of the lth sample of the jth class, and pjl(i) is the probability predicted for the lth sample of the jth class by the multi-variety fish state classification model trained with the ith optimization parameter;
step S67: initializing an individual optimal position and a global optimal position, taking the initial position of each optimization parameter initialized in the step S62 as the individual optimal position of the corresponding optimization parameter, and taking the individual optimal position of the optimization parameter with the lowest fitness value in all the optimization parameters as the global optimal position;
step S68: updating the speed of the optimized parameters by the following formula:
V(i,j)=h*V(i,j)+d1*rand(0,1)*(T1(i,j)-Y(i,j))+d2*rand(0,1)*(T2(j)-Y(i,j));
where h is the inertial weight, T1 (i, j) is the individual optimal position of the ith optimization parameter in the jth dimension, T2 (j) is the value of the global optimal position in the jth dimension, d1 and d2 are learning factors, and rand (0, 1) is a random number generated between 0 and 1;
Step S69: updating the optimized parameter position by the following formula:
Y(i,j)=Y(i,j)+V(i,j);
where Y (i, j) is the position of the ith optimization parameter in the jth dimension;
step S610: updating the fitness value of the optimization parameter;
step S611: the inertial weights are updated using the following formula:
h=hmin+(hmax-hmin)*(f-fmin)/(favg-fmin), if f≤favg; h=hmax, if f>favg;
wherein hmin is the minimum inertial weight, hmax is the maximum inertial weight, f is the current fitness value, fmin is the minimum fitness value, favg is the average value of fitness of all optimization parameters;
step S612: updating the individual optimal position and the global optimal position, updating the individual optimal position of the optimization parameters according to the fitness value of the optimization parameters, and updating the global optimal position according to the individual optimal positions of all the optimization parameters;
step S613: determining a model, namely presetting an evaluation threshold value and the maximum iteration times, and establishing a multi-variety fish state classification model based on the current parameters when the fitness value of the optimized parameters is lower than the evaluation threshold value, and turning to step S614; if the maximum iteration number is reached, go to step S62; otherwise go to step S68;
step S614: classifying, wherein the unmanned aerial vehicle collects fish images in real time and inputs the fish images into a classification model of the states of the fishes of multiple varieties, and feeding is performed based on the fish varieties and the growth states output by the model.
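Step S6 reads as a particle swarm search wrapped around a kernel classifier; the sketch below substitutes scikit-learn's SVC for the weight and bias formulas of step S65, uses cross-entropy (log loss) as the fitness, and assumes learning factors d1 = d2 = 2 and the adaptive inertia rule described above, all as illustrative choices rather than the patent's exact construction.

```python
import numpy as np
from sklearn.metrics import log_loss
from sklearn.svm import SVC

def fitness(pos, X_tr, y_tr, X_te, y_te):
    """Lower is better: log loss of an SVC with C = 2**pos[0] and gamma = 2**pos[1] (step S64)."""
    model = SVC(C=2.0 ** pos[0], gamma=2.0 ** pos[1], probability=True).fit(X_tr, y_tr)
    return log_loss(y_te, model.predict_proba(X_te), labels=model.classes_)

def pso_svm(data, n_particles=20, n_iter=50, lo=-5.0, hi=5.0, h_min=0.4, h_max=0.9):
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, size=(n_particles, 2))            # step S62
    vel = rng.uniform(-1.0, 1.0, size=(n_particles, 2))         # step S63
    fit = np.array([fitness(p, *data) for p in pos])            # step S66
    pbest, pbest_fit = pos.copy(), fit.copy()                   # step S67
    gbest = pbest[pbest_fit.argmin()].copy()
    for _ in range(n_iter):
        # Adaptive inertia weight: smaller for particles already doing well (assumed rule).
        h = np.where(fit <= fit.mean(),
                     h_min + (h_max - h_min) * (fit - fit.min()) / (fit.mean() - fit.min() + 1e-12),
                     h_max)
        r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
        vel = h[:, None] * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)   # step S68
        pos = np.clip(pos + vel, lo, hi)                        # step S69
        fit = np.array([fitness(p, *data) for p in pos])        # step S610
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]                    # step S612
        gbest = pbest[pbest_fit.argmin()].copy()
    return 2.0 ** gbest[0], 2.0 ** gbest[1]                     # best penalty factor C and gamma

# Hypothetical use: C_best, g_best = pso_svm((X_train, y_train, X_test, y_test))
```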
The invention provides an unmanned aerial vehicle fish culture system based on artificial intelligence, which comprises a data acquisition module, a texture feature acquisition module, a color feature acquisition module, a shape feature acquisition module, an input parameter determination module and a multi-variety fish state classification module, wherein the data acquisition module is used for acquiring texture features;
taking fish images as an example, the data acquisition module acquires fish images and the corresponding tags under various production states, and sends the fish images to the texture feature acquisition module and the color feature acquisition module;
the texture feature acquisition module and the color feature acquisition module receive the fish images sent by the data acquisition module, extract texture features and color features by using the gray level co-occurrence matrix and the color histogram respectively, send the extracted texture features and color features to the input parameter determination module, and send the gray level images to the shape feature acquisition module;
the shape feature acquisition module receives the graying image sent by the texture feature acquisition module, improves a final edge detection operator by improving a calculation formula of the first edge detection operator and the second edge detection operator, improves the extraction quality of the shape feature, and sends the extracted shape feature to the input parameter determination module;
The input parameter determining module receives the texture features sent by the texture feature obtaining module, the color features sent by the color feature obtaining module and the shape features sent by the shape feature obtaining module, determines input parameters by adopting a gray relation analysis method, and sends the determined input parameters to the multi-variety fish state classification module;
the multi-variety fish state classification module receives the input parameters sent by the input parameter determination module and finally establishes the multi-variety fish state classification model by continuously adjusting the inertia weight.
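Seen as software, the module layout above maps to a simple pipeline; the class and function names below are illustrative and reuse the helper functions sketched in the earlier steps, so this is a wiring sketch rather than the patent's implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple
import numpy as np

@dataclass
class FishSample:
    image: np.ndarray                 # RGB image from the data acquisition module
    label: Tuple[str, str]            # (variety, growth state) tag

def build_feature_table(samples: List[FishSample]) -> Tuple[np.ndarray, List[Tuple[str, str]]]:
    """Texture and color features per sample; the shape module would add its polygon features."""
    rows, labels = [], []
    for s in samples:
        gray = to_gray(s.image)                        # graying step feeding the texture module
        tex = glcm_features(gray)                      # texture feature acquisition module
        col = color_features(s.image)                  # color feature acquisition module
        rows.append(np.hstack([tex, col]))
        labels.append(s.label)
    # The input parameter determination module then screens columns with grey relational analysis,
    # and the retained columns are passed to the PSO-SVM classifier sketched above.
    return np.vstack(rows), labels
```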
By adopting the scheme, the beneficial effects obtained by the invention are as follows:
(1) Aiming at the technical problem of inaccurate image edge positioning in the process of extracting shape features, the invention improves the calculation formulas of the final edge detection operators by improving the calculation formulas of the first edge detection operator and the second edge detection operator, so that the image edge positioning is more accurate, the accuracy of edge detection is improved, and the extraction quality of the shape features is improved.
(2) Aiming at the technical problem of excessive fitting of the classification model caused by excessive input parameters, the invention adopts a gray relation analysis method to determine the input parameters, strengthens the correlation between the input parameters and experimental results, and improves the convergence rate and the prediction precision of the model.
(3) Aiming at the technical problem that a better global solution cannot be found because a classification algorithm is easy to sink into a local minimum value, the invention continuously adjusts the size of the inertia weight to make the optimization parameters close to a better search area, so as to avoid the problem that the better global solution cannot be found because of sinking into the local minimum value.
Drawings
FIG. 1 is a schematic flow chart of an unmanned aerial vehicle fish culture method based on artificial intelligence;
FIG. 2 is a schematic diagram of an unmanned aerial vehicle fish culture system based on artificial intelligence;
FIG. 3 is a flow chart of step S2;
FIG. 4 is a flow chart of step S3;
FIG. 5 is a flow chart of step S4;
FIG. 6 is a flow chart of step S5;
fig. 7 is a flow chart of step S6;
FIG. 8 is a schematic diagram of an optimization parameter search location;
fig. 9 is a graph of an optimization parameter search.
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention; all other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In the description of the present invention, it should be understood that the terms "upper," "lower," "front," "rear," "left," "right," "top," "bottom," "inner," "outer," and the like indicate orientation or positional relationships based on those shown in the drawings, merely to facilitate description of the invention and simplify the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the invention.
Referring to fig. 1, the invention provides an unmanned aerial vehicle fish culture method based on artificial intelligence, which comprises the following steps:
step S1: collecting data, namely collecting fish images and corresponding tags, wherein the tags are the variety and growth state of fish, and taking the collected fish images as fish images;
step S2: extracting texture features, calculating corresponding gray values based on pixel values of RGB three channels of the fish image, further calculating gray co-occurrence matrix and probability of each center pixel, and finally obtaining the texture features by calculating contrast, energy, entropy and uniformity;
step S3: extracting color features, converting a fish image into an HSV color space, dividing the HSV color space into a plurality of intervals, further calculating a color histogram, and finally obtaining the color features by calculating a mean value, a variance, a median and a standard deviation;
Step S4: extracting shape characteristics, calculating an improved edge detection operator by calculating a first edge detection operator and calculating a second edge detection operator, calculating a final image edge by calculating a small-scale image edge and calculating a large-scale image edge, performing polygon fitting, and finally obtaining shape characteristics by calculating contour length, contour area, center distance and eccentricity;
step S5: determining input parameters, firstly constructing a classification data set, then constructing a comparison matrix by setting a reference sequence, obtaining a non-dimensionality matrix by non-dimensionality data, calculating gray correlation degree by calculating gray correlation coefficient, and finally determining the input parameters;
step S6: multi-variety fish state classification, which comprises the steps of firstly constructing a training data set and a testing data set, initializing optimization parameter positions and speeds, generating multi-variety fish state classification model parameters and training the multi-variety fish state classification model, initializing individual optimal positions and the global optimal position, updating optimization parameter speeds, positions and fitness values, updating the inertia weight, the individual optimal positions and the global optimal position, determining the final multi-variety fish state classification model based on an evaluation threshold, and then using the unmanned aerial vehicle to acquire fish images in real time and feeding according to the fish variety and growth state output by the model.
In a second embodiment, referring to fig. 1 and 3, the texture feature extraction in step S2 specifically includes the following steps:
step S21: calculating gray values, calculating pixel values of three RGB channels of the fish image to obtain corresponding gray values, and assigning the obtained gray values to corresponding pixel points to obtain a gray image, wherein the formula is as follows:
A=0.299*R+0.587*G+0.114*B;
wherein, A is the gray value of each pixel point, R, G, B is the pixel value of red, green and blue channels, and 0.299, 0.587 and 0.114 are the weighting coefficients corresponding to R, G, B respectively;
step S22: the gray level co-occurrence matrix is calculated by the following formula:
G(i,j,δr,δc)=∑m∑n 1{I(m,n)=i and I(m+δr,n+δc)=j}, with m=1,…,Nr and n=1,…,Nc;
wherein G(i, j, δr, δc) is the gray level co-occurrence matrix, 1{·} equals 1 when the condition holds and 0 otherwise, i and j are gray levels, δr and δc are the offsets of the neighborhood pixel in the row and column directions, Nr and Nc are the numbers of rows and columns of the grayscale image, and I(m, n) is the gray value of the pixel in the m-th row and n-th column of the grayscale image;
step S23: the probability is calculated using the formula:
P(i,j)=Npq/N1;
wherein P(i, j) is the probability that gray level i appears as a neighborhood pixel of a center pixel with gray level j in the gray level co-occurrence matrix, Npq is the number of occurrences of the pixel pair whose neighborhood gray level is p and whose center gray level is q, and N1 is the sum of all elements in the gray level co-occurrence matrix;
Step S24: calculating texture features, wherein the steps are as follows:
step S241: the contrast is calculated using the following formula:
C=∑i∑j (i-j)²*P(i,j);
wherein C is the contrast between pixels in the image;
step S242: the energy was calculated using the formula:
D=∑i∑j P(i,j)²;
wherein D is energy;
step S243: the entropy is calculated using the formula:
E=-∑i∑j P(i,j)*log(P(i,j));
wherein E is entropy;
step S244: uniformity was calculated using the following formula:
F=∑i∑j P(i,j)/(1+(i-j)²);
wherein F is uniformity.
In the third embodiment, referring to fig. 1 and 4, the color feature extraction in step S3 specifically includes the following steps:
step S31: the fish image is converted into HSV color space, and the steps are as follows:
step S311: normalizing, namely normalizing RGB values in the RGB color image to be [0,1];
step S312: the hue is calculated using the formula:
H=0°, if Lmax=Lmin; H=60°×(G-B)/(Lmax-Lmin) mod 360°, if Lmax=R; H=60°×(B-R)/(Lmax-Lmin)+120°, if Lmax=G; H=60°×(R-G)/(Lmax-Lmin)+240°, if Lmax=B;
wherein H is the hue with value range [0°, 360°), and Lmax and Lmin are respectively the maximum and minimum values among the R, G, B color channels;
step S313: the saturation was calculated using the formula:
S=0, if Lmax=0; S=(Lmax-Lmin)/Lmax, otherwise;
wherein S is saturation and has a value range of [0,1];
step S314: the brightness was calculated using the following formula:
V=Lmax;
wherein V is brightness and has a value range of [0,1];
Step S32: dividing an HSV color space into a plurality of sections, uniformly dividing a tone H into 24 sections in the HSV color space, and uniformly dividing saturation S and brightness V into 10 sections respectively;
step S33: calculating a color histogram, traversing each pixel in an image, and counting the number of the color space intervals to which the pixel belongs to obtain the color histogram;
step S34: the color characteristics are calculated as follows:
step S341: the mean was calculated using the formula:
μ=(1/N2)*∑i i*mi;
wherein μ is a mean value, mi is the frequency of occurrence of the ith pixel value in the color histogram, and N2 is the total number of pixel values in the color histogram;
step S342: the variance is calculated using the formula:
σ²=(1/N2)*∑i (i-μ)²*mi;
wherein σ² is the variance;
step S343: calculating the median, sorting the pixel values in the color histogram according to ascending order, and taking the value arranged at the middle position as the median;
step S344: standard deviation was calculated using the following formula:
σ=sqrt(σ²);
where σ is the standard deviation.
In the fourth embodiment, referring to fig. 1 and 5, the embodiment is based on the above embodiment, and in step S4, extracting the shape feature specifically includes the following steps:
step S41: image denoising, namely performing image denoising based on the gray-scale image obtained in the step S21, wherein the following formula is used:
g(x,y)=(1/k)*∑i∑j ωij*f(x+i,y+j), i,j∈[-r,r], with k=∑i∑j ωij;
Wherein g (x, y) is a denoised grey image, f (x, y) is an original grey image, k is a normalization coefficient, r is a radius of a gaussian filter, ωij is a weight of the gaussian filter;
step S42: the first edge detection operator is calculated using the following formula:
Y1i=(g(x,y)⊕bi)•bi-g(x,y);
wherein Y1i is the first edge detection operator, bi (i=1, 2, …, 8) are structuring elements in different directions, ⊕ is the exclusive-or operator, and • is the dot product operator between two vectors;
step S43: the second edge detection operator is calculated using the following formula:
;
where Y2i is the second edge detection operator, Θ is the logical OR operator,is a bitwise product operator between two vectors;
step S44: the improved edge detection operator is calculated using the following formula:
Yi=Y1i+Y2i+Yimin;
where Yi is the improved edge detection operator, and Yimin is the edge minimum, Yimin=min{Y1i, Y2i};
step S45: the edges of the small-scale image were calculated and edge detection was performed using the structural element bi (i=1, 2,3, 4) of 3*3 using the following formula:
;
wherein Q1 is a small scale image edge;
step S46: the edges of the large-scale image were calculated and edge detection was performed using the structural element bi (i=5, 6,7, 8) of 5*5 using the following formula:
;
Wherein Q2 is the edge of the large-scale image;
step S47: calculating the final image edge to obtain the edge information of the graphic area, wherein the formula is as follows:
;
where Q is the final image edge;
step S48: polygon fitting, namely, finding out the outline of the graphic area based on the edge information obtained in the step S47, and converting the outline into a polygon by using the polygon fitting;
step S49: calculating shape characteristics, wherein the steps are as follows:
step S491: the contour length is calculated using the following formula:
R=∑i di, i=1,…,n;
wherein R is the contour length, n is the number of sides of the fitted polygon, and di is the length of the i-th side;
step S492: the contour area is calculated using the following formula:
S=(1/2)*|∑i (xi*y(i+1)-x(i+1)*yi)|, i=1,…,n, with (x(n+1),y(n+1))=(x1,y1);
where S is the contour area and (xi, yi) are the coordinates of the i-th vertex of the polygon;
step S493: the center distance is calculated using the following formula:
O=sqrt((xc-xm)²+(yc-ym)²);
wherein O is the center distance, (xc, yc) are the coordinates of the center of gravity of the contour, and (xm, ym) are the coordinates of the contour point closest to the center of gravity;
step S494: the eccentricity was calculated using the following formula:
U=sqrt(1-(z/e)²);
where U is the eccentricity, e is the length of the major axis of the smallest circumscribing ellipse of the fish image profile, and z is the length of the minor axis of the smallest circumscribing ellipse of the fish image profile.
By executing the operation, aiming at the technical problem of inaccurate image edge positioning in the process of extracting the shape features, the invention improves the calculation formulas of the final edge detection operator by improving the calculation formulas of the first edge detection operator and the second edge detection operator, so that the image edge positioning is more accurate, the accuracy of edge detection is improved, and the extraction quality of the shape features is improved.
In a fifth embodiment, referring to fig. 1 and 6, based on the above embodiment, in step S5, determining the input parameters specifically includes the following steps:
step S51: constructing a classification data set based on the texture features calculated in the step S2, the color features calculated in the step S3, the shape features calculated in the step S4 and the fish images acquired in the step S1, wherein the feature variables of the texture features comprise contrast, energy, entropy and uniformity, the feature variables of the color features comprise mean, variance, median and standard deviation, and the feature variables of the shape features comprise contour length, contour area, center distance and eccentricity;
step S52: setting a reference sequence, and selecting n standard data from the classified data set in advance as evaluation parameters by the following formula:
X0=(x0(1),x0(2),…,x0(n));
wherein X0 is a reference sequence and n is the number of evaluation parameters;
step S53: a comparison matrix is constructed, and a comparison sequence is set based on sample data in the classification data set, wherein the formula is as follows:
X=[x1(1) x1(2) … x1(n); x2(1) x2(2) … x2(n); …; xm(1) xm(2) … xm(n)];
wherein X is a comparison matrix, and m is the number of sample data in the classified data set;
step S54: non-dimensionalized data using the formula:
x'p(q)=(xp(q)-xmin)/(xmax-xmin);
wherein xp (q) is the original data of the p-th column and q-th row of the comparison matrix X, X' p (q) is the non-dimensionalized data of the original data xp (q) after the non-dimensionalization processing, xmin is the minimum value of the p-th column of the comparison matrix X, and xmax is the maximum value of the p-th column of the comparison matrix X;
Step S55: a non-dimensionalized matrix using the formula:
X'=[x'1(1) x'1(2) … x'1(n); x'2(1) x'2(2) … x'2(n); …; x'm(1) x'm(2) … x'm(n)];
wherein X' is a non-dimensionalized matrix;
step S56: the gray correlation coefficients are calculated, and the gray correlation coefficients between the corresponding elements of each sample data sequence and the reference sequence are calculated using the following formula:
εp(q)=(min_p min_q|x0(q)-x'p(q)|+ρ*max_p max_q|x0(q)-x'p(q)|)/(|x0(q)-x'p(q)|+ρ*max_p max_q|x0(q)-x'p(q)|);
where εp (q) is the gray correlation coefficient of the p-th sample data sequence and the reference sequence between the q-th evaluation parameters, ρ is the resolution coefficient and 0< ρ <1;
step S57: the gray correlation is calculated using the following formula:
rp=(1/n)*∑q εp(q), q=1,…,n;
where rp is the gray correlation of the p-th sample data sequence and the reference sequence over all evaluation parameters;
step S58: and determining an input parameter, presetting a gray correlation threshold, and determining a characteristic variable with gray correlation larger than the threshold as the input parameter.
By executing the operation, aiming at the technical problem of excessive fitting of the classification model caused by excessive input parameters, the invention adopts a gray relation analysis method to determine the input parameters, strengthens the correlation between the input parameters and experimental results, and improves the convergence speed and the prediction precision of the model.
In a sixth embodiment, referring to fig. 1 and 7, based on the above embodiment, in step S6, the multi-variety fish status classification specifically includes the following steps:
Step S61: constructing a training data set and a test data set, deleting dimension information of feature variables which are not more than a gray correlation threshold in the classification data set, and acquiring data corresponding labels to obtain sample data, wherein the corresponding labels are labels acquired in the step S1, 70% of the sample data are randomly selected as the training data set, and the rest 30% of the sample data are selected as the test data set;
step S62: initializing the positions of the optimized parameters, and randomly generating an initial position for each optimized parameter by using the following formula:
Y(i,j)=rand(0,1)*(U(j)-L(j))+L(j);
where i is the number of the optimization parameter, j is the dimension of Y, Y (i, j) is the position of the ith optimization parameter in the jth dimension, rand (0, 1) is a random number generated between 0 and 1, U (j) is the upper bound limit of the jth dimension, and L (j) is the lower bound limit of the jth dimension;
step S63: initializing the speed of optimized parameters, and randomly generating an initial speed for each optimized parameter by using the following formula:
V(i,j)=rand(0,1)*(Vmax(j)-Vmin(j))+Vmin(j);
where V (i, j) is the speed of the ith optimization parameter in the jth dimension, vmax (j) is the upper limit of the speed in the jth dimension, and Vmin (j) is the lower limit of the speed in the jth dimension;
step S64: generating multiple-variety fish state classification model parameters, and generating a group of multiple-variety fish state classification model parameters according to the current position for each optimization parameter, wherein the group of multiple-variety fish state classification model parameters consists of a punishment factor and a kernel function parameter, and the formula is as follows:
C(i)=2^Y(i,1);
G(i)=2^Y(i,2);
Wherein, C (i) is a punishment factor of the state classification model of the multi-variety fish, and G (i) is a kernel function parameter of the state classification model of the multi-variety fish;
step S65: training a multi-variety fish state classification model, training the multi-variety fish state classification model based on the multi-variety fish state classification model parameters determined in the step S64 and the training data set constructed in the step S61, and calculating weight vectors and bias values of the multi-variety fish state classification model, wherein the formula is as follows:
wi=∑ai*ci*ei;
ti=ei-∑j aj*ej*g(ci,cj), j=1,…,n;
where wi is the weight vector of the ith classifier, ai is the Lagrangian multiplier of the ith sample, ci is the eigenvector of the ith sample, ei is the corresponding label of the ith sample, ti is the bias value of the ith classifier, n is the number of training samples, g (ci, cj) is the kernel function, cj is the eigenvector of the jth sample;
step S66: calculating an optimization parameter fitness value, predicting the test data set constructed in the step S61 by using the multi-variety fish state classification model trained in the step S65, and calculating the fitness value by using the following formula:
f(i)=-∑j∑l yjl*log(pjl(i)), j=1,…,k, l=1,…,nj;
wherein f(i) is the fitness value of the ith optimization parameter, k is the number of label classes, nj is the sample size of the jth class, yjl is the true label indicator of the lth sample of the jth class, and pjl(i) is the probability predicted for the lth sample of the jth class by the multi-variety fish state classification model trained with the ith optimization parameter;
Step S67: initializing an individual optimal position and a global optimal position, taking the initial position of each optimization parameter initialized in the step S62 as the individual optimal position of the corresponding optimization parameter, and taking the individual optimal position of the optimization parameter with the lowest fitness value in all the optimization parameters as the global optimal position;
step S68: updating the speed of the optimized parameters by the following formula:
V(i,j)=h*V(i,j)+d1*rand(0,1)*(T1(i,j)-Y(i,j))+d2*rand(0,1)*(T2(j)-Y(i,j));
where h is the inertial weight, T1 (i, j) is the individual optimal position of the ith optimization parameter in the jth dimension, T2 (j) is the value of the global optimal position in the jth dimension, d1 and d2 are learning factors, and rand (0, 1) is a random number generated between 0 and 1;
step S69: updating the optimized parameter position by the following formula:
Y(i,j)=Y(i,j)+V(i,j);
where Y (i, j) is the position of the ith optimization parameter in the jth dimension;
step S610: updating the fitness value of the optimization parameter;
step S611: the inertial weights are updated using the following formula:
h=hmin+(hmax-hmin)*(f-fmin)/(favg-fmin), if f≤favg; h=hmax, if f>favg;
wherein hmin is the minimum inertial weight, hmax is the maximum inertial weight, f is the current fitness value, fmin is the minimum fitness value, favg is the average value of fitness of all optimization parameters;
step S612: updating the individual optimal position and the global optimal position, updating the individual optimal position of the optimization parameters according to the fitness value of the optimization parameters, and updating the global optimal position according to the individual optimal positions of all the optimization parameters;
Step S613: determining a model, namely presetting an evaluation threshold value and the maximum iteration times, and establishing a multi-variety fish state classification model based on the current parameters when the fitness value of the optimized parameters is lower than the evaluation threshold value, and turning to step S614; if the maximum iteration number is reached, go to step S62; otherwise go to step S68;
step S614: classifying, wherein the unmanned aerial vehicle collects fish images in real time and inputs the fish images into a classification model of the states of the fishes of multiple varieties, and feeding is performed based on the fish varieties and the growth states output by the model.
By executing the operation, the method aims at solving the technical problem that the better global solution cannot be found because the classification algorithm is easy to sink into the local minimum, and the optimization parameters are close to a better search area by continuously adjusting the size of the inertia weight, so that the problem that the better global solution cannot be found because the classification algorithm is easy to sink into the local minimum is avoided.
Embodiment seven, referring to fig. 8 and 9, this embodiment is based on the above embodiment, and in fig. 8, a process of continuously updating the location of the optimization parameter until the global optimal location is found is shown; in fig. 9, the ordinate is the position of the optimal solution of the optimization parameter, the abscissa is the iteration number, and the variation process that the position of the optimization parameter continuously tends to the position of the optimal solution along with the variation of the iteration number is shown, so that the optimization parameter is close to a better search area, and the problem that a better global solution cannot be found due to the fact that a local minimum is trapped is avoided.
An embodiment eight, referring to fig. 2, based on the above embodiment, the unmanned aerial vehicle fish farming system based on artificial intelligence provided by the invention includes a data acquisition module, a texture feature acquisition module, a color feature acquisition module, a shape feature acquisition module, an input parameter determination module and a multi-variety fish state classification module;
taking fish images as an example, the data acquisition module acquires fish images and the corresponding tags under various production states, and sends the fish images to the texture feature acquisition module and the color feature acquisition module;
the texture feature acquisition module and the color feature acquisition module receive the fish images sent by the data acquisition module, extract texture features and color features by using the gray level co-occurrence matrix and the color histogram respectively, send the extracted texture features and color features to the input parameter determination module, and send the gray level images to the shape feature acquisition module;
the shape feature acquisition module receives the graying image sent by the texture feature acquisition module, improves a final edge detection operator by improving a calculation formula of the first edge detection operator and the second edge detection operator, improves the extraction quality of the shape feature, and sends the extracted shape feature to the input parameter determination module;
The input parameter determining module receives the texture features sent by the texture feature obtaining module, the color features sent by the color feature obtaining module and the shape features sent by the shape feature obtaining module, determines input parameters by adopting a gray relation analysis method, and sends the determined input parameters to the multi-variety fish state classification module;
the multi-variety fish state classification module receives the input parameters sent by the input parameter determination module and finally establishes the multi-variety fish state classification model by continuously adjusting the inertia weight.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
The invention and its embodiments have been described above without limitation, and the actual structure is not limited to the embodiments shown in the drawings. In summary, if a person of ordinary skill in the art, enlightened by this disclosure, adopts a structural manner and embodiments similar to this technical solution without creative effort and without departing from the gist of the present invention, they shall fall within the protection scope of the present invention.
Claims (7)
1. An unmanned aerial vehicle fish culture method based on artificial intelligence is characterized in that: the method comprises the following steps:
step S1: collecting data, namely collecting fish images and corresponding tags, wherein the tags are the variety and growth state of fish, and taking the collected fish images as fish images;
step S2: extracting texture features, and calculating contrast, energy, entropy and uniformity to obtain the texture features;
Step S3: extracting color characteristics, and calculating the mean, variance, median and standard deviation to obtain the color characteristics;
step S4: extracting shape characteristics, calculating an improved edge detection operator by calculating a first edge detection operator and calculating a second edge detection operator, calculating a final image edge by calculating a small-scale image edge and calculating a large-scale image edge, performing polygon fitting, and finally obtaining shape characteristics by calculating contour length, contour area, center distance and eccentricity;
step S5: determining input parameters, firstly constructing a classification data set, then constructing a comparison matrix by setting a reference sequence, obtaining a non-dimensionality matrix by non-dimensionality data, calculating gray correlation degree by calculating gray correlation coefficient, and finally determining the input parameters;
step S6: classifying the states of multiple fish varieties, namely constructing a training data set and a testing data set, initializing the positions and speeds of the optimization parameters, generating multi-variety fish state classification model parameters and training the multi-variety fish state classification model, initializing the individual optimal positions and the global optimal position, updating the speeds, positions and fitness values of the optimization parameters, updating the inertia weight, the individual optimal positions and the global optimal position, determining the final multi-variety fish state classification model based on an evaluation threshold, and finally acquiring fish images in real time by the unmanned aerial vehicle and feeding according to the fish variety and growth state output by the model;
In step S4, the extracting the shape feature specifically includes the steps of:
step S41: image denoising, namely performing image denoising based on the gray-scale image obtained in the step S21, wherein the following formula is used:
g(x,y)=(1/k)*∑i∑j ωij*f(x+i,y+j), i,j∈[-r,r];
wherein g(x,y) is the denoised gray image, f(x,y) is the original gray image, k is a normalization coefficient, r is the radius of the Gaussian filter, and ωij is the weight of the Gaussian filter at offset (i,j);
step S42: the first edge detection operator is calculated using the following formula:
Y1i=(g(x,y)⊕bi)•bi-g(x,y);
wherein Y1i is the first edge detection operator, bi (i=1,2,…,8) are structural elements in different directions, ⊕ is an exclusive-or operator, and • is a dot product operator between two vectors;
step S43: the second edge detection operator is calculated using the following formula:
;
where Y2i is the second edge detection operator, Θ is the logical OR operator, and the remaining operator in the formula is a bitwise product operator between two vectors;
step S44: the improved edge detection operator is calculated using the following formula:
Yi=Y1i+Y2i+Yimin;
where Yi is the improved edge detection operator, and Yimin is the edge minimum, Yimin=min{Y1i, Y2i};
step S45: calculating the small-scale image edge, and performing edge detection using the 3×3 structural elements bi (i=1,2,3,4), using the following formula:
;
Wherein Q1 is a small scale image edge;
step S46: calculating the large-scale image edge, and performing edge detection using the 5×5 structural elements bi (i=5,6,7,8), using the following formula:
;
wherein Q2 is the edge of the large-scale image;
step S47: calculating the final image edge to obtain the edge information of the graphic area, wherein the formula is as follows:
;
where Q is the final image edge;
step S48: polygon fitting, namely, finding out the outline of the graphic area based on the edge information obtained in the step S47, and converting the outline into a polygon by using the polygon fitting;
step S49: calculating shape characteristics, wherein the steps are as follows:
step S491: the contour length is calculated using the following formula:
R=∑i di, i=1,2,…,n;
wherein R is the contour length, n is the number of sides of the fitted polygon, and di is the length of the ith side;
step S492: the contour area is calculated using the following formula:
S=(1/2)*|∑i (xi*y(i+1)-x(i+1)*yi)|, with (x(n+1),y(n+1))=(x1,y1);
where S is the area of the contour and (xi, yi) are the coordinates of the ith vertex of the polygon;
step S493: the center distance is calculated using the following formula:
O=sqrt((xc-xm)^2+(yc-ym)^2);
wherein O is the center distance, (xc, yc) are the coordinates of the center of gravity of the contour, and (xm, ym) are the coordinates of the contour point closest to the center of gravity;
step S494: the eccentricity was calculated using the following formula:
U=sqrt(e^2-z^2)/e;
Where U is the eccentricity, e is the length of the major axis of the smallest circumscribing ellipse of the fish image profile, and z is the length of the minor axis of the smallest circumscribing ellipse of the fish image profile.
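Purely as an illustrative sketch of steps S491-S494, the following Python function computes the four shape features from a fitted polygon. The use of the shoelace formula for the contour area, the vertex mean as the center of gravity, and the standard ellipse eccentricity sqrt(e^2-z^2)/e are assumptions adopted here, not limitations of the claim.

```python
# Illustrative sketch of steps S491-S494 (shoelace area, Euclidean center
# distance and standard ellipse eccentricity are assumptions).
import numpy as np

def shape_features(polygon, ellipse_axes):
    """polygon: (n, 2) array of fitted vertices; ellipse_axes: (major e, minor z)."""
    pts = np.asarray(polygon, dtype=float)
    nxt = np.roll(pts, -1, axis=0)                      # vertex i+1, wrapping around

    # S491: contour length R = sum of side lengths di
    R = float(np.sum(np.linalg.norm(nxt - pts, axis=1)))

    # S492: contour area S via the shoelace formula over the polygon vertices
    S = 0.5 * abs(float(np.sum(pts[:, 0] * nxt[:, 1] - nxt[:, 0] * pts[:, 1])))

    # S493: center distance O = distance from the centroid to the nearest contour point
    centroid = pts.mean(axis=0)
    O = float(np.min(np.linalg.norm(pts - centroid, axis=1)))

    # S494: eccentricity U of the smallest circumscribing ellipse (major e, minor z)
    e, z = ellipse_axes
    U = float(np.sqrt(max(e ** 2 - z ** 2, 0.0)) / e)

    return {"contour_length": R, "contour_area": S,
            "center_distance": O, "eccentricity": U}

if __name__ == "__main__":
    square = [(0, 0), (4, 0), (4, 4), (0, 4)]
    print(shape_features(square, ellipse_axes=(4 * np.sqrt(2), 4)))
```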
2. The unmanned aerial vehicle fish farming method based on artificial intelligence according to claim 1, wherein: in step S6, the classification of the states of the multiple fish species specifically includes the following steps:
step S61: constructing a training data set and a test data set, namely deleting, from the classification data set, the feature variables whose gray correlation does not exceed the gray correlation threshold, and attaching the corresponding labels to the remaining data to obtain sample data, wherein the corresponding labels are the labels acquired in step S1; 70% of the sample data are randomly selected as the training data set, and the remaining 30% are used as the test data set;
step S62: initializing the positions of the optimized parameters, and randomly generating an initial position for each optimized parameter by using the following formula:
Y(i,j)=rand(0,1)*(U(j)-L(j))+L(j);
where i is the number of the optimization parameter, j is the dimension of Y, Y (i, j) is the position of the ith optimization parameter in the jth dimension, rand (0, 1) is a random number generated between 0 and 1, U (j) is the upper bound limit of the jth dimension, and L (j) is the lower bound limit of the jth dimension;
Step S63: initializing the speed of optimized parameters, and randomly generating an initial speed for each optimized parameter by using the following formula:
V(i,j)=rand(0,1)*(Vmax(j)-Vmin(j))+Vmin(j);
where V(i,j) is the speed of the ith optimization parameter in the jth dimension, Vmax(j) is the upper limit of the speed in the jth dimension, and Vmin(j) is the lower limit of the speed in the jth dimension;
step S64: generating multiple-variety fish state classification model parameters, and generating a group of multiple-variety fish state classification model parameters according to the current position for each optimization parameter, wherein the group of multiple-variety fish state classification model parameters consists of a punishment factor and a kernel function parameter, and the formula is as follows:
C(i)=2^Y(i,1);
G(i)=2^Y(i,2);
wherein, C (i) is a punishment factor of the state classification model of the multi-variety fish, and G (i) is a kernel function parameter of the state classification model of the multi-variety fish;
step S65: training a multi-variety fish state classification model, training the multi-variety fish state classification model based on the multi-variety fish state classification model parameters determined in the step S64 and the training data set constructed in the step S61, and calculating weight vectors and bias values of the multi-variety fish state classification model, wherein the formula is as follows:
wi=∑ai*ci*ei;
ti=ei-∑j aj*ej*g(ci,cj);
where wi is the weight vector of the ith classifier, ai is the Lagrangian multiplier of the ith sample, ci is the eigenvector of the ith sample, ei is the corresponding label of the ith sample, ti is the bias value of the ith classifier, n is the number of training samples, g (ci, cj) is the kernel function, cj is the eigenvector of the jth sample;
Step S66: calculating an optimization parameter fitness value, predicting the test data set constructed in the step S61 by using the multi-variety fish state classification model trained in the step S65, and calculating the fitness value by using the following formula:
f(i)=-(1/k)*∑j (1/nj)*∑l yjl*log(pjl(i));
wherein f(i) is the fitness value of the ith optimization parameter, k is the number of types of corresponding labels, nj is the sample size of the jth corresponding label, yjl is the true corresponding label of the lth sample under the jth corresponding label, and pjl(i) is the prediction probability given by the multi-variety fish state classification model of the ith optimization parameter for the lth sample under the jth corresponding label;
step S67: initializing an individual optimal position and a global optimal position, taking the initial position of each optimization parameter initialized in the step S62 as the individual optimal position of the corresponding optimization parameter, and taking the individual optimal position of the optimization parameter with the lowest fitness value in all the optimization parameters as the global optimal position;
step S68: updating the speed of the optimized parameters by the following formula:
V(i,j)=h*V(i,j)+d1*rand(0,1)*(T1(i,j)-Y(i,j))+d2*rand(0,1)*(T2(j)-Y(i,j));
where h is the inertial weight, T1 (i, j) is the individual optimal position of the ith optimization parameter in the jth dimension, T2 (j) is the value of the global optimal position in the jth dimension, d1 and d2 are learning factors, and rand (0, 1) is a random number generated between 0 and 1;
Step S69: updating the optimized parameter position by the following formula:
Y(i,j)=Y(i,j)+V(i,j);
where Y (i, j) is the position of the ith optimization parameter in the jth dimension;
step S610: updating the fitness value of the optimization parameter;
step S611: the inertial weights are updated using the following formula:
h=hmin+(hmax-hmin)*(f-fmin)/(favg-fmin), when f≤favg; h=hmax, when f>favg;
wherein hmin is the minimum inertial weight, hmax is the maximum inertial weight, f is the current fitness value, fmin is the minimum fitness value, favg is the average value of fitness of all optimization parameters;
step S612: updating the individual optimal position and the global optimal position, updating the individual optimal position of the optimization parameters according to the fitness value of the optimization parameters, and updating the global optimal position according to the individual optimal positions of all the optimization parameters;
step S613: determining a model, namely presetting an evaluation threshold value and the maximum iteration times, and establishing a multi-variety fish state classification model based on the current parameters when the fitness value of the optimized parameters is lower than the evaluation threshold value, and turning to step S614; if the maximum iteration number is reached, go to step S62; otherwise go to step S68;
step S614: classifying, wherein the unmanned aerial vehicle collects fish images in real time and inputs the fish images into a classification model of the states of the fishes of multiple varieties, and feeding is performed based on the fish varieties and the growth states output by the model.
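As a rough, non-authoritative sketch of steps S61-S614, the loop below searches for the penalty factor C=2^Y(i,1) and kernel parameter G=2^Y(i,2) with a particle-swarm-style update and the adaptive inertia weight of step S611. scikit-learn's SVC stands in for the multi-variety fish state classification model, the fitness is taken as the log loss on the test split, and the bounds, learning factors and evaluation threshold are assumed values.

```python
# Hedged sketch of steps S61-S614 (bounds, learning factors, fitness definition,
# and the use of scikit-learn's SVC are assumptions, not the claimed implementation).
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

def fitness(pos, Xtr, ytr, Xte, yte):
    # S64-S66: position -> (C, G), train the classifier, score by log loss on the test split
    C, G = 2.0 ** pos[0], 2.0 ** pos[1]
    clf = SVC(C=C, gamma=G, probability=True).fit(Xtr, ytr)
    return log_loss(yte, clf.predict_proba(Xte), labels=clf.classes_)

def pso_svm(X, y, n_particles=10, max_iter=20, seed=0,
            lo=(-5.0, -8.0), hi=(10.0, 3.0), hmin=0.4, hmax=0.9,
            d1=2.0, d2=2.0, eval_threshold=0.05):
    rng = np.random.default_rng(seed)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=seed)   # S61
    lo, hi = np.asarray(lo), np.asarray(hi)
    pos = rng.uniform(lo, hi, size=(n_particles, 2))        # S62: initial positions
    vel = rng.uniform(-1.0, 1.0, size=(n_particles, 2))     # S63: initial speeds
    fit = np.array([fitness(p, Xtr, ytr, Xte, yte) for p in pos])
    pbest, pbest_fit = pos.copy(), fit.copy()                # S67: individual optimal positions
    g = pbest[np.argmin(pbest_fit)].copy()                   # S67: global optimal position
    for _ in range(max_iter):
        favg, fmin = fit.mean(), fit.min()
        for i in range(n_particles):
            # S611: adaptive inertia weight (better particles get a smaller weight)
            if fit[i] <= favg and favg > fmin:
                h = hmin + (hmax - hmin) * (fit[i] - fmin) / (favg - fmin)
            else:
                h = hmax
            # S68-S69: velocity and position updates
            vel[i] = (h * vel[i]
                      + d1 * rng.random(2) * (pbest[i] - pos[i])
                      + d2 * rng.random(2) * (g - pos[i]))
            pos[i] = np.clip(pos[i] + vel[i], lo, hi)
            fit[i] = fitness(pos[i], Xtr, ytr, Xte, yte)      # S610: updated fitness
            if fit[i] < pbest_fit[i]:                         # S612: individual best
                pbest[i], pbest_fit[i] = pos[i].copy(), fit[i]
        g = pbest[np.argmin(pbest_fit)].copy()                # S612: global best
        if pbest_fit.min() < eval_threshold:                  # S613: evaluation threshold reached
            break
    C, G = 2.0 ** g[0], 2.0 ** g[1]
    return SVC(C=C, gamma=G, probability=True).fit(X, y), (C, G)   # S613-S614: final model
```

Log loss is chosen as the fitness here because step S66 scores each optimization parameter by the predicted probabilities of the true labels; the restart-on-maximum-iterations branch of step S613 is simplified to a plain stop in this sketch.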
3. The unmanned aerial vehicle fish farming method based on artificial intelligence according to claim 1, wherein: in step S5, the determining the input parameter specifically includes the following steps:
step S51: constructing a classification data set based on the texture features calculated in the step S2, the color features calculated in the step S3, the shape features calculated in the step S4 and the fish images acquired in the step S1, wherein the feature variables of the texture features comprise contrast, energy, entropy and uniformity, the feature variables of the color features comprise mean, variance, median and standard deviation, and the feature variables of the shape features comprise contour length, contour area, center distance and eccentricity;
step S52: setting a reference sequence, and selecting n standard data from the classified data set in advance as evaluation parameters by the following formula:
X0=(x0(1),x0(2),…,x0(n));
wherein X0 is a reference sequence and n is the number of evaluation parameters;
step S53: a comparison matrix is constructed, and a comparison sequence is set based on sample data in the classification data set, wherein the formula is as follows:
X=(xp(q)) m×n, where the pth row is Xp=(xp(1),xp(2),…,xp(n)), p=1,2,…,m;
wherein X is a comparison matrix, and m is the number of sample data in the classified data set;
step S54: non-dimensionalized data using the formula:
x'p(q)=(xp(q)-xmin)/(xmax-xmin);
wherein xp(q) is the original data of the pth column and qth row of the comparison matrix X, x'p(q) is the non-dimensionalized data obtained from the original data xp(q) after the non-dimensionalization processing, xmin is the minimum value of the pth column of the comparison matrix X, and xmax is the maximum value of the pth column of the comparison matrix X;
step S55: constructing the non-dimensionalized matrix, using the following formula:
X'=(x'p(q)), p=1,2,…,m, q=1,2,…,n;
wherein X' is a non-dimensionalized matrix;
step S56: the gray correlation coefficients are calculated, and the gray correlation coefficients between the corresponding elements of each sample data sequence and the reference sequence are calculated using the following formula:
εp(q)=(min p min q |x0(q)-x'p(q)|+ρ*max p max q |x0(q)-x'p(q)|)/(|x0(q)-x'p(q)|+ρ*max p max q |x0(q)-x'p(q)|);
where εp (q) is the gray correlation coefficient of the p-th sample data sequence and the reference sequence between the q-th evaluation parameters, ρ is the resolution coefficient and 0< ρ <1;
step S57: the gray correlation is calculated using the following formula:
rp=(1/n)*∑q εp(q);
where rp is the gray correlation of the p-th sample data sequence and the reference sequence over all evaluation parameters;
step S58: and determining an input parameter, presetting a gray correlation threshold, and determining a characteristic variable with gray correlation larger than the threshold as the input parameter.
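A minimal sketch of the gray relational selection of steps S51-S58 follows. The reference sequence is taken as the ideal value 1 of every feature after min-max normalization, ρ is set to 0.5, and the coefficients are averaged so that each feature variable (rather than each sample sequence) receives a gray correlation that can be compared against the threshold of step S58; these choices are interpretive assumptions of the sketch.

```python
# Hedged sketch of the gray relational feature selection in steps S51-S58
# (reference sequence = ideal value after normalization, rho = 0.5 are assumptions).
import numpy as np

def grey_relational_selection(X, rho=0.5, threshold=0.7):
    """X: (m samples, n feature variables). Returns gray correlations and kept columns."""
    X = np.asarray(X, dtype=float)

    # S54-S55: non-dimensionalize each column to [0, 1] (min-max normalization)
    xmin, xmax = X.min(axis=0), X.max(axis=0)
    Xn = (X - xmin) / np.where(xmax > xmin, xmax - xmin, 1.0)

    # S52: reference sequence x0 (here the ideal value 1 after normalization)
    x0 = np.ones(Xn.shape[1])

    # S56: gray correlation coefficients between every sample row and the reference
    delta = np.abs(x0 - Xn)                       # |x0(q) - x'p(q)|
    dmin, dmax = delta.min(), delta.max()
    eps = (dmin + rho * dmax) / (delta + rho * dmax)

    # S57: gray correlation of each feature variable, averaged over the samples
    r = eps.mean(axis=0)

    # S58: keep the feature variables whose gray correlation exceeds the threshold
    selected = np.flatnonzero(r > threshold)
    return r, selected

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.random((50, 12))      # 12 feature variables: texture, color, shape
    r, keep = grey_relational_selection(X)
    print(np.round(r, 3), keep)
```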
4. The unmanned aerial vehicle fish farming method based on artificial intelligence according to claim 1, wherein: in step S2, the extracting texture features specifically includes the following steps:
Step S21: calculating gray values, calculating pixel values of three RGB channels of the fish image to obtain corresponding gray values, and assigning the obtained gray values to corresponding pixel points to obtain a gray image, wherein the formula is as follows:
A=0.299*R+0.587*G+0.114*B;
wherein, A is the gray value of each pixel point, R, G, B is the pixel value of red, green and blue channels, and 0.299, 0.587 and 0.114 are the weighting coefficients corresponding to R, G, B respectively;
step S22: the gray level co-occurrence matrix is calculated by the following formula:
G(i,j,δr,δc)=∑m∑n 1{I(m,n)=i and I(m+δr,n+δc)=j}, m=1,2,…,Nr, n=1,2,…,Nc;
wherein G(i,j,δr,δc) is the gray level co-occurrence matrix, i and j are gray levels, δr and δc are the offsets of the neighboring pixel in the row and column directions, Nr and Nc are the number of rows and columns of the grayscale image, I(m,n) is the gray value of the pixel in the mth row and nth column of the grayscale image, and 1{·} equals 1 when the condition in braces holds and 0 otherwise;
step S23: the probability is calculated using the formula:
;
wherein P(i,j) is the probability of using gray level i as a neighborhood pixel and gray level j as a center pixel in the gray level co-occurrence matrix, Npq is the number of occurrences of the corresponding gray level pair, and N1 is the sum of all elements in the gray level co-occurrence matrix;
step S24: calculating texture features, wherein the steps are as follows:
step S241: the contrast is calculated using the following formula:
C=∑i∑j (i-j)^2*P(i,j);
Wherein C is the contrast between pixels in the image;
step S242: the energy was calculated using the formula:
D=∑i∑j P(i,j)^2;
wherein D is energy;
step S243: the entropy is calculated using the formula:
E=-∑i∑j P(i,j)*log(P(i,j));
wherein E is entropy;
step S244: uniformity was calculated using the following formula:
F=∑i∑j P(i,j)/(1+(i-j)^2);
wherein F is uniformity.
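The texture statistics of steps S21-S24 might be prototyped as below; the single offset (δr, δc)=(0, 1), the quantization to 8 gray levels, and reading "uniformity" as the homogeneity ∑∑P(i,j)/(1+(i-j)^2) are assumptions made for the sketch.

```python
# Hedged sketch of steps S21-S24: grayscale conversion, gray level co-occurrence
# matrix and the four texture statistics (offset (0,1) and 8 gray levels assumed).
import numpy as np

def glcm_features(rgb, levels=8, dr=0, dc=1):
    # S21: weighted grayscale conversion A = 0.299*R + 0.587*G + 0.114*B
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    q = np.clip((gray / 256.0 * levels).astype(int), 0, levels - 1)   # quantize gray levels

    # S22: count co-occurrences of gray level pairs at offset (dr, dc)
    glcm = np.zeros((levels, levels), dtype=float)
    rows, cols = q.shape
    a = q[:rows - dr, :cols - dc]
    b = q[dr:, dc:]
    np.add.at(glcm, (a.ravel(), b.ravel()), 1.0)

    # S23: normalize the counts into probabilities P(i, j)
    P = glcm / glcm.sum()

    # S24: contrast, energy, entropy, uniformity (homogeneity)
    i, j = np.indices(P.shape)
    contrast = np.sum((i - j) ** 2 * P)
    energy = np.sum(P ** 2)
    entropy = -np.sum(P[P > 0] * np.log(P[P > 0]))
    uniformity = np.sum(P / (1.0 + (i - j) ** 2))
    return np.array([contrast, energy, entropy, uniformity])

if __name__ == "__main__":
    img = np.random.default_rng(0).integers(0, 256, (64, 64, 3)).astype(float)
    print(np.round(glcm_features(img), 4))
```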
5. The unmanned aerial vehicle fish farming method based on artificial intelligence according to claim 1, wherein: in step S3, the extracting color features specifically includes the following steps:
step S31: the fish image is converted into HSV color space, and the steps are as follows:
step S311: normalizing, namely normalizing RGB values in the RGB color image to be [0,1];
step S312: the hue is calculated using the formula:
H=0, when Lmax=Lmin; H=60°*(G-B)/(Lmax-Lmin), when Lmax=R and G≥B; H=60°*(G-B)/(Lmax-Lmin)+360°, when Lmax=R and G<B; H=60°*(B-R)/(Lmax-Lmin)+120°, when Lmax=G; H=60°*(R-G)/(Lmax-Lmin)+240°, when Lmax=B;
wherein H is the hue and its value range is [0°, 360°), and Lmax and Lmin are respectively the maximum value and the minimum value among the R, G, B color channels;
step S313: the saturation was calculated using the formula:
S=(Lmax-Lmin)/Lmax, when Lmax≠0; S=0, when Lmax=0;
wherein S is saturation and has a value range of [0,1];
step S314: the brightness was calculated using the following formula:
V=Lmax;
wherein V is brightness and has a value range of [0,1];
step S32: dividing an HSV color space into a plurality of sections, uniformly dividing a tone H into 24 sections in the HSV color space, and uniformly dividing saturation S and brightness V into 10 sections respectively;
Step S33: calculating a color histogram, traversing each pixel in an image, and counting the number of the color space intervals to which the pixel belongs to obtain the color histogram;
step S34: the color characteristics are calculated as follows:
step S341: the mean was calculated using the formula:
μ=(1/N2)*∑i i*mi;
wherein μ is a mean value, mi is the frequency of occurrence of the ith pixel value in the color histogram, and N2 is the total number of pixel values in the color histogram;
step S342: the variance is calculated using the formula:
σ^2=(1/N2)*∑i mi*(i-μ)^2;
wherein σ^2 is the variance;
step S343: calculating the median, sorting the pixel values in the color histogram according to ascending order, and taking the value arranged at the middle position as the median;
step S344: standard deviation was calculated using the following formula:
σ=sqrt(σ^2);
where σ is the standard deviation.
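The color statistics of steps S31-S34 could be sketched as follows; the 24/10/10 quantization follows step S32, while treating the histogram bin index as the "pixel value" in the mean, variance, median and standard deviation of step S34 is an assumption of this sketch.

```python
# Hedged sketch of steps S31-S34: HSV conversion, 24x10x10 quantized color
# histogram and the mean/variance/median/standard deviation statistics.
import numpy as np

def hsv_color_features(rgb):
    x = np.asarray(rgb, dtype=float) / 255.0                  # S311: normalize RGB to [0, 1]
    r, g, b = x[..., 0], x[..., 1], x[..., 2]
    lmax, lmin = x.max(axis=-1), x.min(axis=-1)
    d = np.where(lmax > lmin, lmax - lmin, 1.0)

    # S312: hue in [0, 360)
    h = np.zeros_like(lmax)
    h = np.where(lmax == r, (60.0 * (g - b) / d) % 360.0, h)
    h = np.where(lmax == g, 60.0 * (b - r) / d + 120.0, h)
    h = np.where(lmax == b, 60.0 * (r - g) / d + 240.0, h)
    h = np.where(lmax == lmin, 0.0, h)

    s = np.where(lmax > 0, (lmax - lmin) / np.where(lmax > 0, lmax, 1.0), 0.0)   # S313
    v = lmax                                                                      # S314

    # S32-S33: quantize H into 24 bins, S and V into 10 bins each, then histogram
    hb = np.clip((h / 360.0 * 24).astype(int), 0, 23)
    sb = np.clip((s * 10).astype(int), 0, 9)
    vb = np.clip((v * 10).astype(int), 0, 9)
    bins = (hb * 100 + sb * 10 + vb).ravel()                  # single bin index per pixel
    hist = np.bincount(bins, minlength=2400).astype(float)

    # S34: statistics over the histogram (bin index used as the pixel value)
    n2 = hist.sum()
    idx = np.arange(hist.size)
    mean = np.sum(idx * hist) / n2
    var = np.sum(hist * (idx - mean) ** 2) / n2
    cdf = np.cumsum(hist)
    median = float(np.searchsorted(cdf, n2 / 2.0))
    return np.array([mean, var, median, np.sqrt(var)])

if __name__ == "__main__":
    img = np.random.default_rng(0).integers(0, 256, (32, 32, 3))
    print(np.round(hsv_color_features(img), 3))
```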
6. An artificial intelligence based unmanned aerial vehicle fish farming system for implementing an artificial intelligence based unmanned aerial vehicle fish farming method as defined in any one of claims 1-5, wherein: the system comprises a data acquisition module, a texture feature acquisition module, a color feature acquisition module, a shape feature acquisition module, an input parameter determination module and a multi-variety fish state classification module.
7. The artificial intelligence based unmanned aerial vehicle fish farming system according to claim 6, wherein: taking fish images as an example, the data acquisition module acquires fish images and the corresponding tags in various production states, and sends the acquired fish images to the texture feature acquisition module and the color feature acquisition module;
The texture feature acquisition module and the color feature acquisition module receive the fish images sent by the data acquisition module, extract texture features and color features by using the gray level co-occurrence matrix and the color histogram respectively, send the extracted texture features and color features to the input parameter determination module, and send the gray level images to the shape feature acquisition module;
the shape feature acquisition module receives the grayscale image sent by the texture feature acquisition module, improves the final edge detection operator by improving the calculation formulas of the first edge detection operator and the second edge detection operator, thereby improving the extraction quality of the shape features, and sends the extracted shape features to the input parameter determination module;
the input parameter determining module receives the texture features sent by the texture feature obtaining module, the color features sent by the color feature obtaining module and the shape features sent by the shape feature obtaining module, determines input parameters by adopting a gray relation analysis method, and sends the determined input parameters to the multi-variety fish state classification module;
the multi-variety fish state classification module receives the input parameters sent by the input parameter determination module, and finally establishes the multi-variety fish state classification model by continuously adjusting the magnitude of the inertia weight.