CN115457376A - Coral reef health condition intelligent assessment method - Google Patents


Publication number
CN115457376A
CN115457376A (application CN202210936081.0A)
Authority
CN
China
Prior art keywords
coral
image
color
health
convolution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210936081.0A
Other languages
Chinese (zh)
Inventor
文强
孙彤
王聿隽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Henghao Information Technology Co ltd
Original Assignee
Shandong Henghao Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Henghao Information Technology Co ltd filed Critical Shandong Henghao Information Technology Co ltd
Priority to CN202210936081.0A priority Critical patent/CN115457376A/en
Publication of CN115457376A publication Critical patent/CN115457376A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/05 Underwater scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393 Score-carding, benchmarking or key performance indicator [KPI] analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/30 Noise filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V10/765 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, using rules for classification or partitioning the feature space
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806 Fusion of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using neural networks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80 Adaptation technologies in fisheries management
    • Y02A40/81 Aquaculture, e.g. of fish

Abstract

The invention provides an intelligent coral reef health condition assessment method, which comprises: collecting color images of a coral reef area with an underwater camera and preprocessing the images to improve image quality; constructing an image enhancement classification model, improving classification accuracy through image enhancement, and identifying the coral species in the acquired images; and taking the coral coverage, the coral health index and the average color score of all corals in the area as evaluation indexes of the coral reef health condition to obtain a comprehensive score of the coral reef health condition. The invention addresses the problem that weak underwater light severely limits the quality of coral images collected by a camera, which greatly hinders health assessment of coral reef areas.

Description

Coral reef health condition intelligent assessment method
Technical Field
The invention relates to the technical field of intelligent computing, and in particular to an intelligent assessment method for the health condition of a coral reef.
Background
Coral reefs are among the richest and most biodiverse marine ecosystems. Under global climate change and the influence of human activities, corals are under enormous pressure, and coral reefs are currently in a state of continuous decline. With the rapid development of underwater robots, they are used to collect video of warm-water corals and to study coral distribution and health conditions. However, because light is strongly scattered and absorbed as it propagates through water, the available light is weak, the quality of coral images acquired by the camera is greatly limited, and a great obstacle is posed to health assessment of coral reef areas.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: because light in water is weak, the quality of coral images collected by a camera is greatly limited, which greatly hinders health assessment of coral reef areas. The invention therefore provides an intelligent coral reef health condition assessment method.
The invention relates to an intelligent evaluation method for the health condition of a coral reef, which comprises the following steps:
s1, collecting a coral reef area color image through an underwater camera, and preprocessing the image to improve the image quality;
s2, constructing an image enhancement classification model, improving the classification precision through image enhancement, and identifying the coral species in the acquired image;
and S3, taking the coral coverage rate, the coral health index and the average coral color scores in the area as evaluation indexes of the coral reef health condition to obtain a comprehensive score of the coral reef health condition.
Further, the step S1 includes:
s11, collecting images of the coral reef area by using an underwater camera, and preprocessing the images to improve the image quality; the coral image collected by the underwater camera is a color image; the coral image s(i, j) is analyzed, and the color image is regarded as a two-dimensional pixel lattice of three independent channels R, G, B, where (i, j) represents the rows and columns of the two-dimensional pixel lattice; the grayscale images of the three channels are s_R(i, j), s_G(i, j), s_B(i, j), and the color image is processed as a grayscale image in each color channel; in each color channel, each pixel corresponds to the response value of the corresponding photosensitive element to red, green or blue light, so the color image is converted into three grayscale images;
processing the original image by adopting a multi-scale homomorphic filtering algorithm with a color recovery factor:

S_l(i, j) = Σ_{n=1}^{N} W_n { log s_l(i, j) − log[ F_n(i, j) * s_l(i, j) ] }, l ∈ {R, G, B}

wherein S_l(i, j) represents the two-dimensional pixel lattice of the l-th channel of the processed image, s_l(i, j) represents the two-dimensional pixel lattice of the l-th channel of the original image, N represents the number of scales, W_n is the weight corresponding to the n-th scale, n ∈ {1, …, N}, F_n(i, j) represents the Gaussian kernel function corresponding to the n-th scale, and * denotes convolution;
s12, denoising the image by a diffusion filtering method, introducing a time operator t to express that denoising depends on the diffusion duration:

S_l(i, j) = I_0 = I(i, j, 0)

wherein I_0 represents the image signal at the initial time; calculating the partial derivative of the image signal with respect to t:

∂I/∂t = div( c(|∇I|) · ∇I )

wherein div represents the divergence operator, ∇ is the gradient operator, |∇I| represents the gradient magnitude, c(|∇I|) is the diffusion coefficient, c(|∇I|) = 1 / (1 + (|∇I|/k)^2), and k represents a gradient threshold; when the image gradient is far greater than the gradient threshold k, the diffusion coefficient tends to 0; when the image gradient is far smaller than the gradient threshold k, the diffusion coefficient tends to 1; the original image I_0 acts as the medium and diffuses over the image at a non-constant speed, and the region it diffuses through gradually forms a series of smooth images I(i, j, t).
Further, the step S2 includes:
equally dividing the collected image into M small squares of size U × V × 3 and constructing an image enhancement classification model; the classification precision is improved by enhancing the image, and the coral species in the acquired image are identified; the image enhancement classification model comprises an encoder, a decoder and a discriminator; the encoder consists of several convolutional layers, whose number is determined according to actual requirements; the decoder consists of deconvolutional layers mirroring the convolutional layers, and a 'solid state pointer fusion' operation on feature maps connects the convolutional and deconvolutional layers;

assuming the encoder and the decoder consist of m convolutional layers and m deconvolutional layers respectively, the 'solid state pointer fusion' operation convolutionally fuses, along the channel dimension, the feature map output by the q-th convolutional layer with the feature map output by the (m − q)-th deconvolutional layer;

the convolution kernel H_1 of the convolutional layers is 3 × 3 with 3 channels and stride 2; the convolution kernel H_2 of the deconvolutional layers is 3 × 3 with stride 1/2;
Initializing the encoder parameters μ_e, the decoder parameters μ_d and the discriminator parameters μ_a; the network input is denoted I = {I_1, I_2, …, I_M}; firstly, the encoder obtains image features through the convolution operation:

conv^{c_2}(i_0, j_0) = Σ_{c_1=1}^{3} Σ_{u=1}^{3} Σ_{v=1}^{3} H_1^{c_2,c_1}(u, v) · I_γ^{c_1}(2·i_0 + u, 2·j_0 + v)

wherein conv is the output convolution feature map, I_γ is the γ-th input image square, γ ∈ {1, …, M}, c_1 indicates the number of input channels, c_1 = 3, c_2 indicates the number of output channels, and (i_0, j_0) are the coordinates within a channel; the convolution operation of the encoder strengthens the image enhancement classification model's ability to extract image features;
after each downsampling convolutional layer of the encoder, the feature size of the obtained image is reduced to 1/4 of the original; the obtained feature map is input into the decoder, and the deconvolution operation enlarges the image size by a factor of 4; the deconvolution operation is the stride-1/2 transposed convolution:

deconv^{c_3}(i_0, j_0) = Σ_{c_2} Σ_{u,v} H_2^{c_3,c_2}(u, v) · conv^{c_2}( (i_0 − u)/2, (j_0 − v)/2 )

where the sum runs over the (u, v) for which the indices are integers, deconv is the output deconvolution feature map, and c_3 is the number of deconvolution output channels;

through the 'solid state pointer fusion' operation, the feature map output by the q-th convolutional layer and the feature map output by the (m − q)-th deconvolutional layer are convolutionally fused along the channel dimension:

F_q = Conv( Concat( conv_q, deconv_{m−q} ) )
the fused image features F_q are input into the discriminator; the 'solid state pointer fusion' operation alleviates problems such as artifacts and mosaics caused by deconvolution up-sampling, and improves the generation quality of the image;
the discriminator consists of two parts: the first part is a module composed of a convolution unit and a rectified linear unit, the second part a module composed of convolution and batch normalization;

the first part applies its convolution kernel H_a and outputs the rectified activation:

out_1 = ReLU( H_a * F )

the second part applies its convolution kernel H_b and outputs the batch-normalized result:

out_2 = ( H_b * out_1 − μ ) / √σ

wherein μ is the mean of the input samples and σ is the variance of the input samples; finally, a two-dimensional matrix is output, in which each element maps to a sub-region of the input image and is used to discriminate the coral species corresponding to each feature point of the input image.
Further, the step S3 includes:
comparing the identified coral color with a coral health color card to obtain a coral color score S_i, the specific method being:

converting the coral color values from RGB to HSV, and calculating the color distance from the coral color value in the HSV domain to each color square of the coral health color card:

d = √( (x_0 − x_i)² + (y_0 − y_i)² + (z_0 − z_i)² )

wherein (x_0, y_0, z_0) are the coordinates of the coral color value in the HSV domain and (x_i, y_i, z_i) are the coordinates of the color value of the i-th color square of the coral health color card in the HSV domain; the coral color score S_i = min d is obtained from the information of the color-card square corresponding to the minimum color distance;
and obtaining a comprehensive score W of the health condition of the coral reef according to the weight values of different evaluation indexes by taking the coral coverage rate, the coral health index and all coral color scores Si in the area as evaluation indexes of the health condition of the coral reef, and obtaining a coral reef health evaluation result according to the comprehensive score W of the health condition of the coral reef.
Further, the step S3 includes:
the diversity index of the corals is:

HI = − Σ_{i=1}^{s} (p_i / P) · ln( p_i / P )

wherein HI is the diversity index, p_i is the number of corals of the i-th species, P is the total number of corals, and s is the number of coral species; the coral health index is:

CI = (1/n) Σ_{i=1}^{n} H_i / H_{i0}

wherein H_i is the diversity index of the i-th biological group, H_{i0} is the healthy value of the diversity index of the i-th biological group, and n is the number of biological groups;
calculating the coverage rate of the coral:

TC = A / M

wherein A is the number of small squares occupied by coral in the collected image, which is equally divided into M small squares of size U × V;
calculating the average color score MI of all corals:

MI = (1/P) Σ_{i=1}^{P} S_i

wherein S_i is the color score of the i-th coral;
corresponding evaluation scores Y_1, Y_2 and Y_3 are obtained from the calculated coral coverage TC, the coral health index CI and the average color score MI of all corals:

Y_1 = TC × 100

Y_2 and Y_3 are obtained from CI and MI through piecewise scoring functions with parameters α_1, α_2, α_3, α_4, α_5 and b_1, b_2, b_3, b_4 and index thresholds δ_1, δ_2, δ_3, δ_4, all of which are obtained by experiment;
taking the coral coverage rate, the coral health indexes and the average coral color scores in the area as evaluation indexes of the coral reef health condition, and obtaining a comprehensive score W of the coral reef health condition according to the weight values of different evaluation indexes:
W = ω_1·Y_1 + ω_2·Y_2 + ω_3·Y_3

wherein ω_1 is the weight of the coral coverage score, ω_2 is the weight of the coral health index score, and ω_3 is the weight of the average color score of all corals.
The invention has the beneficial effects that:
1. The method realizes noise-reduction processing of the image; while reducing noise it effectively preserves feature information such as image boundaries and lines, increasing the accuracy of image recognition.
2. Constructing the 'solid state pointer fusion' operation enables low-order information sharing between input and output, alleviates problems such as artifacts and mosaics caused by deconvolution up-sampling, and helps improve the generation quality of the image.
3. The coral coverage rate, the coral health index and all coral color scores in the area are used as evaluation indexes of the coral reef health condition, and the evaluation accuracy is improved.
Drawings
FIG. 1 is a flow chart of an intelligent assessment method for coral reef health status according to the present invention;
FIG. 2 is a diagram of an image enhanced classification model according to the present invention.
Detailed Description
The following detailed description is provided with reference to the drawings of the present embodiment, so that how the technical means are applied to solve the technical problems and achieve the technical effects can be fully understood and implemented. It should be noted that, as long as there is no conflict, the features in the embodiments of the present invention may be combined with each other, and the resulting technical solutions are within the scope of the present invention.
Referring to fig. 1, the method for intelligently evaluating the health condition of the coral reef, provided by the invention, comprises the following steps:
s1, collecting a coral reef area color image through an underwater camera, and preprocessing the image to improve the image quality;
s2, constructing an image enhancement classification model, improving the classification precision through image enhancement, and identifying the coral species in the acquired image;
and S3, taking the coral coverage rate, the coral health index and the average coral color scores in the area as evaluation indexes of the coral reef health condition to obtain a comprehensive score of the coral reef health condition.
Further, the image preprocessing method in step S1 is:
s11, an underwater camera is used to collect images of the coral reef area, and the images are preprocessed to improve image quality. The coral image collected by the underwater camera is a color image; the coral image s(i, j) is analyzed, and the color image is regarded as a two-dimensional pixel lattice of three independent channels R, G, B, where (i, j) represents the rows and columns of the two-dimensional pixel lattice. The grayscale images of the three channels are s_R(i, j), s_G(i, j), s_B(i, j), and the color image is processed as a grayscale image in each color channel. In each color channel, each pixel corresponds to the response value of the corresponding photosensitive element to red, green or blue light, so the color image is converted into three grayscale images.
Processing the original image by adopting a multi-scale homomorphic filtering algorithm with a color recovery factor:

S_l(i, j) = Σ_{n=1}^{N} W_n { log s_l(i, j) − log[ F_n(i, j) * s_l(i, j) ] }, l ∈ {R, G, B}

wherein S_l(i, j) represents the two-dimensional pixel lattice of the l-th channel of the processed image, s_l(i, j) represents the two-dimensional pixel lattice of the l-th channel of the original image, N represents the number of scales, W_n is the weight corresponding to the n-th scale, n ∈ {1, …, N}, F_n(i, j) represents the Gaussian kernel function corresponding to the n-th scale, and * denotes convolution.
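The filtering step above can be sketched numerically. The sketch below assumes the common multi-scale retinex form S_l = Σ_n W_n [log s_l − log(F_n * s_l)] on a single channel; the scale count, sigmas and equal weights are illustrative choices, not values taken from the patent.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D Gaussian kernel, normalised to sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return k / k.sum()

def blur(channel, sigma):
    """Separable Gaussian blur with edge padding (stand-in for F_n * s_l)."""
    k = gaussian_kernel(sigma, radius=int(3 * sigma))
    pad = len(k) // 2
    conv = lambda v: np.convolve(np.pad(v, pad, mode="edge"), k, mode="valid")
    tmp = np.apply_along_axis(conv, 1, channel)   # horizontal pass
    return np.apply_along_axis(conv, 0, tmp)      # vertical pass

def multi_scale_retinex(channel, sigmas=(2, 4, 8), weights=None, eps=1e-6):
    """S_l = sum_n W_n [log s_l - log(F_n * s_l)] on one colour channel."""
    if weights is None:
        weights = [1.0 / len(sigmas)] * len(sigmas)
    out = np.zeros_like(channel, dtype=float)
    for w, sigma in zip(weights, sigmas):
        out += w * (np.log(channel + eps) - np.log(blur(channel, sigma) + eps))
    return out
```

On a uniform channel the log difference vanishes, so the output is numerically zero; what survives is structure relative to the local illumination estimate, which is the point of the homomorphic step.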
S12, denoising the image by a diffusion filtering method, introducing a time operator t to express that denoising depends on the diffusion duration:

S_l(i, j) = I_0 = I(i, j, 0)

wherein I_0 represents the image signal at the initial time. Calculating the partial derivative of the image signal with respect to t:

∂I/∂t = div( c(|∇I|) · ∇I )

wherein div represents the divergence operator, ∇ is the gradient operator, |∇I| represents the gradient magnitude, c(|∇I|) is the diffusion coefficient, c(|∇I|) = 1 / (1 + (|∇I|/k)^2), and k represents a gradient threshold. When the image gradient is far greater than the gradient threshold k, the diffusion coefficient tends to 0; when the image gradient is far smaller than the gradient threshold k, the diffusion coefficient tends to 1. The original image I_0 acts as the medium and diffuses over the image at a non-constant speed; the region it diffuses through gradually forms a series of smooth images I(i, j, t).
The diffusion filtering method has the advantage of realizing noise reduction of the image; while reducing noise it effectively preserves feature information such as image boundaries and lines, improving the accuracy of image recognition.
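A minimal numerical sketch of this diffusion step follows, assuming a Perona-Malik-style explicit scheme: the four-neighbour discretisation, the time step and the coefficient c(g) = 1/(1 + (g/k)^2) are assumptions consistent with the stated limits (c tends to 0 for gradients far above k, to 1 far below).

```python
import numpy as np

def perona_malik(img, n_iter=20, k=0.1, dt=0.2):
    """Anisotropic diffusion: dI/dt = div(c(|grad I|) * grad I).
    With c(g) = 1/(1 + (g/k)^2), smoothing nearly stops across strong
    edges (gradient >> k) and acts like ordinary blurring in flat areas."""
    I = img.astype(float).copy()
    c = lambda d: 1.0 / (1.0 + (d / k) ** 2)
    for _ in range(n_iter):
        # one-sided differences to the four neighbours (periodic boundary for brevity)
        dN = np.roll(I, -1, axis=0) - I
        dS = np.roll(I, 1, axis=0) - I
        dE = np.roll(I, -1, axis=1) - I
        dW = np.roll(I, 1, axis=1) - I
        I = I + dt * (c(dN) * dN + c(dS) * dS + c(dE) * dE + c(dW) * dW)
    return I
```

Each iteration advances the image one step dt in diffusion time, producing the increasingly smooth family of images I(i, j, t) that the text describes.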
Further, the coral species identification method in step S2 is:
the collected image is equally divided into M small squares with the size of U × V × 3, and an image enhancement classification model is constructed, as shown in fig. 2. By enhancing the images, the classification precision is improved, and the coral species in the collected images are identified. The image enhancement classification model comprises an encoder, a decoder and a discriminator, wherein the encoder consists of a plurality of convolution layers, and the number of the convolution layers is determined according to actual requirements; the decoder is composed of a plurality of deconvolution layers corresponding to the convolution layer mirror images, and a 'solid state pointer fusion' operation of a feature map exists between the convolution layers and the deconvolution layers.
The 'solid state pointer fusion' operation enables low-order information sharing between input and output, which helps improve the generation quality of the image. Assuming the encoder and the decoder consist of m convolutional layers and m deconvolutional layers respectively, the 'solid state pointer fusion' operation convolutionally fuses, along the channel dimension, the feature map output by the q-th convolutional layer with the feature map output by the (m − q)-th deconvolutional layer.

The convolution kernel H_1 of the convolutional layers is 3 × 3 with 3 channels and stride 2; the convolution kernel H_2 of the deconvolutional layers is 3 × 3 with stride 1/2.
Initializing the encoder parameters μ_e, the decoder parameters μ_d and the discriminator parameters μ_a; the network input is denoted I = {I_1, I_2, …, I_M}. Firstly, the encoder obtains image features through the convolution operation:

conv^{c_2}(i_0, j_0) = Σ_{c_1=1}^{3} Σ_{u=1}^{3} Σ_{v=1}^{3} H_1^{c_2,c_1}(u, v) · I_γ^{c_1}(2·i_0 + u, 2·j_0 + v)

wherein conv is the output convolution feature map, I_γ is the γ-th input image square, γ ∈ {1, …, M}, c_1 indicates the number of input channels, c_1 = 3, c_2 indicates the number of output channels, and (i_0, j_0) are the coordinates within a channel. The convolution operation of the encoder strengthens the image enhancement classification model's ability to extract image features.
After each downsampling convolutional layer of the encoder, the feature size of the obtained image is reduced to 1/4 of the original. The obtained feature map is input into the decoder, and the deconvolution operation enlarges the image size by a factor of 4. The deconvolution operation is the stride-1/2 transposed convolution:

deconv^{c_3}(i_0, j_0) = Σ_{c_2} Σ_{u,v} H_2^{c_3,c_2}(u, v) · conv^{c_2}( (i_0 − u)/2, (j_0 − v)/2 )

where the sum runs over the (u, v) for which the indices are integers, deconv is the output deconvolution feature map, and c_3 is the number of deconvolution output channels.

Through the 'solid state pointer fusion' operation, the feature map output by the q-th convolutional layer and the feature map output by the (m − q)-th deconvolutional layer are convolutionally fused along the channel dimension:

F_q = Conv( Concat( conv_q, deconv_{m−q} ) )
The fused image features F_q are input into the discriminator; the 'solid state pointer fusion' operation alleviates problems such as artifacts and mosaics caused by deconvolution up-sampling, and improves the generation quality of the image.
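The channel-wise fusion can be illustrated with plain arrays. The concatenate-then-1×1-convolution form below is an assumption about what "convolutional fusion on a channel" means; a 1×1 convolution over an H × W × C map is simply a matrix product on the channel axis.

```python
import numpy as np

def fuse(conv_feat, deconv_feat, kernel):
    """Sketch of the 'solid state pointer fusion' step: concatenate the
    q-th encoder feature map with the (m-q)-th decoder feature map along
    the channel axis, then mix channels with a 1x1 convolution.
    kernel has shape (c_q + c_d, c_out)."""
    stacked = np.concatenate([conv_feat, deconv_feat], axis=-1)  # H x W x (c_q + c_d)
    return stacked @ kernel                                      # H x W x c_out
```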
The discriminator consists of two parts: the first part is a module composed of a convolution unit and a rectified linear unit, the second part a module composed of convolution and batch normalization.

The first part applies its convolution kernel H_a and outputs the rectified activation:

out_1 = ReLU( H_a * F )

The second part applies its convolution kernel H_b and outputs the batch-normalized result:

out_2 = ( H_b * out_1 − μ ) / √σ

wherein μ is the mean of the input samples and σ is the variance of the input samples. Finally, a two-dimensional matrix is output, in which each element maps to a sub-region of the input image and is used to discriminate the coral species corresponding to each feature point of the input image.
Further, the coral reef health condition evaluation method in step S3 includes:
comparing the identified coral color with a coral health color card to obtain a coral color score S_i, the specific method being:

converting the coral color values from RGB to HSV, and calculating the color distance from the coral color value in the HSV domain to each color square of the coral health color card:

d = √( (x_0 − x_i)² + (y_0 − y_i)² + (z_0 − z_i)² )

wherein (x_0, y_0, z_0) are the coordinates of the coral color value in the HSV domain and (x_i, y_i, z_i) are the coordinates of the color value of the i-th color square of the coral health color card in the HSV domain; the coral color score S_i = min d is obtained from the information of the color-card square corresponding to the minimum color distance.
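The chart-matching step can be sketched with the standard library. The chart entries and scores below are hypothetical placeholders (the real coral health chart colours are not given in the text), and the plain Euclidean distance ignores the fact that hue is cyclic.

```python
import colorsys
import math

def color_score(coral_rgb, chart):
    """Score a coral's RGB colour by the nearest colour square of a
    health chart. chart: list of ((r, g, b), score) pairs, values 0-255."""
    def to_hsv(rgb):
        # colorsys expects components in [0, 1]
        return colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    target = to_hsv(coral_rgb)
    # pick the chart square with the minimum HSV distance d
    _, score = min(chart, key=lambda entry: math.dist(target, to_hsv(entry[0])))
    return score
```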
Taking the coral coverage rate, the coral health indexes and all coral color scores Si in the area as evaluation indexes of the coral reef health condition, obtaining a comprehensive score W of the coral reef health condition according to weight values of different evaluation indexes, and obtaining a coral reef health evaluation result according to the comprehensive score W of the coral reef health condition;
the method for calculating the comprehensive score W of the health condition of the coral reef specifically comprises the following steps:
the diversity index of the corals is:

HI = − Σ_{i=1}^{s} (p_i / P) · ln( p_i / P )

wherein HI is the diversity index, p_i is the number of corals of the i-th species, P is the total number of corals, and s is the number of coral species. The coral health index is:

CI = (1/n) Σ_{i=1}^{n} H_i / H_{i0}

wherein H_i is the diversity index of the i-th biological group, H_{i0} is the healthy value of the diversity index of the i-th biological group, and n is the number of biological groups.
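Both indices are direct to compute once per-species counts are available. HI below is the Shannon form implied by the variable list; the averaged-ratio form of CI is an assumption, since the patent's own CI formula survives only as an image placeholder.

```python
import math

def diversity_index(counts):
    """Shannon diversity HI = -sum_i (p_i/P) ln(p_i/P), where counts[i]
    is the number of corals of species i and P is their total."""
    P = sum(counts)
    return -sum((p / P) * math.log(p / P) for p in counts if p > 0)

def health_index(H, H0):
    """Assumed CI: mean ratio of each biological group's diversity index
    H_i to its healthy reference value H_i0, averaged over the n groups."""
    return sum(h / h0 for h, h0 in zip(H, H0)) / len(H)
```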
Calculating the coverage rate of the coral:

TC = A / M

wherein A is the number of small squares occupied by coral in the collected image, which is equally divided into M small squares of size U × V.
Calculating the average color score MI of all corals:

MI = (1/P) Σ_{i=1}^{P} S_i

wherein S_i is the color score of the i-th coral.
Corresponding evaluation scores Y_1, Y_2 and Y_3 are obtained from the calculated coral coverage TC, the coral health index CI and the average color score MI of all corals:

Y_1 = TC × 100

Y_2 and Y_3 are obtained from CI and MI through piecewise scoring functions with parameters α_1, α_2, α_3, α_4, α_5 and b_1, b_2, b_3, b_4 and index thresholds δ_1, δ_2, δ_3, δ_4, all of which are derived from experiments.
Taking the coral coverage rate, the coral health index and the average color scores of all corals in the area as evaluation indexes of the coral reef health condition, and obtaining the comprehensive score W of the coral reef health condition according to the weight values of different evaluation indexes:
W = ω_1·Y_1 + ω_2·Y_2 + ω_3·Y_3

wherein ω_1 is the weight of the coral coverage score, ω_2 is the weight of the coral health index score, and ω_3 is the weight of the average color score of all corals.
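The final aggregation is a weighted sum. In the sketch below the weights are illustrative placeholders (the patent leaves ω_1, ω_2, ω_3 unspecified), and Y_2, Y_3 are assumed to have already been produced by the experimentally determined scoring functions.

```python
def composite_score(TC, Y2, Y3, w=(0.5, 0.25, 0.25)):
    """W = w1*Y1 + w2*Y2 + w3*Y3, with Y1 = TC * 100 as in the text.
    The weights here are hypothetical; the patent assigns one weight
    per evaluation index."""
    Y1 = TC * 100.0
    return w[0] * Y1 + w[1] * Y2 + w[2] * Y3
```

With TC = 0.5, Y2 = 80 and Y3 = 60 under the placeholder weights, W = 0.5·50 + 0.25·80 + 0.25·60 = 60.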
The beneficial effects of the coral reef health evaluation method are as follows: the coral coverage rate, the coral health index and all coral color scores S_i in the area are used as evaluation indexes of the coral reef health condition, which improves the evaluation accuracy.
In conclusion, the method for intelligently evaluating the health condition of the coral reef is completed.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (5)

1. The method for intelligently evaluating the health condition of the coral reef is characterized by comprising the following steps of:
s1, collecting a coral reef area color image through an underwater camera, and preprocessing the image to improve the image quality;
s2, constructing an image enhancement classification model, enhancing the classification precision by enhancing the image, and identifying the coral type in the acquired image;
and S3, taking the coral coverage rate, the coral health index and the average color score of all corals in the area as evaluation indexes of the coral reef health condition, to obtain a comprehensive score of the coral reef health condition.
2. The method for intelligently evaluating the health condition of the coral reef as claimed in claim 1, wherein the step S1 comprises:
s11, collecting images of the coral reef area with an underwater camera and preprocessing them to improve image quality; the coral image s(i, j) collected by the underwater camera is a color image, regarded as three independent two-dimensional pixel lattices for the R, G and B channels, where (i, j) denotes the row and column of the lattice; the gray-scale images of the three channels are s_R(i, j), s_G(i, j) and s_B(i, j), so the color image is processed as gray-scale images in separate color channels; in each color channel, each pixel corresponds to the response of the corresponding photosensitive element to red, green or blue light, so the color image is converted into three gray-scale images;
processing the original image with a multi-scale homomorphic filtering algorithm with a color recovery factor:

S_l(i, j) = Σ_{n=1}^{N} W_n [ log s_l(i, j) − log( F_n(i, j) ∗ s_l(i, j) ) ]

wherein S_l(i, j) represents the two-dimensional pixel lattice of the l-th channel of the processed image, l ∈ {R, G, B}; s_l(i, j) represents the two-dimensional pixel lattice of the l-th channel of the original image; N represents the number of scales; W_n is the weight corresponding to the n-th scale, n ∈ {1, …, N}; F_n(i, j) represents the Gaussian kernel function corresponding to the n-th scale; and '∗' denotes convolution;
s12, denoising the image by a diffusion filtering method, introducing a time variable t to indicate that denoising depends on the diffusion duration:

S_l(i, j) = I_0 = I(i, j, 0)

wherein I_0 represents the image signal at the initial time; the partial derivative of the image signal with respect to time t is calculated:

∂I/∂t = div( c(|∇I|) ∇I ),  c(|∇I|) = exp( −(|∇I| / k)² )

wherein div represents the divergence operator, c(|∇I|) is the diffusion equation of the image gradient ∇I, ∇ is the gradient operator, |∇I| represents the gradient magnitude, and k represents a gradient threshold; when the image gradient is much greater than the gradient threshold k, the diffusion equation tends to 0; when the image gradient is much smaller than the gradient threshold k, the diffusion equation tends to 1; the original image I_0, as a medium, diffuses over the image at a non-constant speed, and the area through which it diffuses gradually forms a series of smooth images I(i, j, t).
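A minimal NumPy sketch of the diffusion denoising step of claim 2 can be written as follows (the exponential Perona-Malik diffusion coefficient, the iteration count and the step size dt are illustrative assumptions, not fixed by the claim; boundaries wrap for brevity):

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, k=0.1, dt=0.2):
    """Diffusion filtering: dI/dt = div(c(|grad I|) * grad I), with
    c(g) = exp(-(g/k)^2) -> 1 where the gradient is far below the
    threshold k (smooth areas diffuse) and -> 0 where it is far above
    (edges are preserved)."""
    I = img.astype(np.float64).copy()
    for _ in range(n_iter):
        # differences toward the four neighbours (wrap-around borders)
        dN = np.roll(I, -1, axis=0) - I
        dS = np.roll(I, 1, axis=0) - I
        dE = np.roll(I, -1, axis=1) - I
        dW = np.roll(I, 1, axis=1) - I
        # per-direction diffusion coefficients
        cN = np.exp(-(dN / k) ** 2)
        cS = np.exp(-(dS / k) ** 2)
        cE = np.exp(-(dE / k) ** 2)
        cW = np.exp(-(dW / k) ** 2)
        # explicit Euler update of the diffusion equation
        I += dt * (cN * dN + cS * dS + cE * dE + cW * dW)
    return I

noisy = np.ones((32, 32)) * 0.5
noisy += 0.05 * np.random.default_rng(0).standard_normal(noisy.shape)
smooth = anisotropic_diffusion(noisy, n_iter=10, k=0.1, dt=0.2)
```

Each iteration advances the smoothed image I(i, j, t) by one time step, so the output is one member of the series of progressively smoother images described above.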
3. The method for intelligently assessing the health of the coral reef according to claim 2, wherein the step S2 comprises:
equally dividing the collected image into M small squares of size U × V × 3 and constructing an image enhancement classification model; the classification precision is improved by enhancing the image, and the coral species in the collected image are identified; the image enhancement classification model comprises an encoder, a decoder and a discriminator, wherein the encoder is composed of a number of convolutional layers determined by actual requirements; the decoder is composed of deconvolution layers mirroring the convolutional layers, and a 'solid pointer fusion' operation on feature maps exists between the convolutional and deconvolution layers;
assuming the encoder and the decoder are composed of m convolutional layers and m deconvolution layers respectively, the 'solid pointer fusion' operation convolutionally fuses, along the channel dimension, the feature map output by the q-th convolutional layer with the feature map output by the (m−q)-th deconvolution layer;
the convolution kernel H_1 of the convolutional layers is 3 × 3 with 3 channels and stride 2; the convolution kernel H_2 of the deconvolution layers is 3 × 3 with stride 1/2;
initializing the encoder parameters μ_e, the decoder parameters μ_d and the discriminator parameters μ_a; the network input is denoted I = I^1, I^2, …, I^M; first, the encoder obtains image features through the convolution operation:

conv(i_0, j_0, c_2) = Σ_{c_1} Σ_{u=1}^{3} Σ_{v=1}^{3} H_1(u, v, c_1, c_2) · I^γ(2 i_0 + u, 2 j_0 + v, c_1)

wherein conv is the output convolution feature map, I^γ is the γ-th input image, γ ∈ {1, …, M}, c_1 denotes the number of input channels, c_1 = 3, c_2 denotes the number of output channels, and (i_0, j_0) denotes the coordinates within a channel; the convolution operation of the encoder enhances the capability of the image enhancement classification model to extract image features;
after each convolutional layer of the encoder, the size of the obtained image features is reduced to 1/4 of the original; the obtained feature map is input into the decoder, and the image size is increased by a factor of 4 through the deconvolution operation:

deconv(i_0, j_0, c_3) = Σ_{c_2} Σ_{u=1}^{3} Σ_{v=1}^{3} H_2(u, v, c_2, c_3) · conv( (i_0 + u)/2, (j_0 + v)/2, c_2 )

wherein deconv is the output deconvolution feature map and c_3 is the number of deconvolution output channels;
the feature map output by the q-th convolutional layer and the feature map output by the (m−q)-th deconvolution layer are convolutionally fused along the channel dimension through the solid pointer fusion operation:

fuse = Conv( [ conv^(q) ; deconv^(m−q) ] )

wherein [· ; ·] denotes concatenation along the channel dimension; the fused image features fuse are input into the discriminator; the solid pointer fusion operation alleviates problems such as artifacts and mosaics caused by deconvolution up-sampling and improves the quality of the generated image;
the discriminator is composed of two parts: the first part is a module composed of a convolution and a rectified linear unit, and the second part is a module composed of a convolution and batch normalization;
the convolution kernel of the first part is H_3, and its output is:

y_1 = max(0, H_3 ∗ fuse)

the convolution kernel of the second part is H_4, and its output is:

y_2 = ( H_4 ∗ y_1 − x̄ ) / √σ

wherein x̄ is the mean of the input samples and σ is the variance of the input samples; finally a two-dimensional matrix is output, each element of which maps to a sub-region of the input image and is used to distinguish the coral type corresponding to each feature point in the input image.
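The encoder's downsampling arithmetic of claim 3 can be sketched with plain NumPy (the kernel values and the channel count c_2 = 8 are hypothetical; only the shape behaviour — 3 × 3 kernel, stride 2, spatial size halved per axis so the feature-map area shrinks to 1/4 — mirrors the claim):

```python
import numpy as np

def conv2d_stride2(image, kernel):
    """3x3 convolution with stride 2 over an (H, W, C_in) image with a
    (3, 3, C_in, C_out) kernel; output is (H//2, W//2, C_out), i.e. the
    feature-map area is reduced to 1/4, as in the encoder layers."""
    H, W, c_in = image.shape
    _, _, _, c_out = kernel.shape
    out_h, out_w = H // 2, W // 2
    out = np.zeros((out_h, out_w, c_out))
    for i0 in range(out_h):
        for j0 in range(out_w):
            patch = image[2 * i0:2 * i0 + 3, 2 * j0:2 * j0 + 3, :]
            # zero-pad partial patches at the lower/right border
            padded = np.zeros((3, 3, c_in))
            padded[:patch.shape[0], :patch.shape[1], :] = patch
            # sum over kernel rows, columns and input channels
            out[i0, j0] = np.tensordot(padded, kernel,
                                       axes=([0, 1, 2], [0, 1, 2]))
    return out

rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))            # one U x V x 3 image square
H1 = rng.standard_normal((3, 3, 3, 8))   # hypothetical kernel, c2 = 8
feat = conv2d_stride2(img, H1)
print(feat.shape)  # (32, 32, 8)
```

The mirrored decoder would then run the inverse shape arithmetic, enlarging each spatial axis by 2 (area by 4) per deconvolution layer.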
4. The method for intelligently assessing the health of the coral reef according to claim 3, wherein the step S3 comprises:
comparing the identified coral color with a coral health color card to obtain the coral color score S_i, specifically as follows:
converting the coral color values from RGB to HSV, and calculating the color distance from the coral color value in the HSV domain to each color square of the coral health color card:

d_i = √( (x_0 − x_i)² + (y_0 − y_i)² + (z_0 − z_i)² )

wherein (x_0, y_0, z_0) are the coordinates of the coral color value in the HSV domain and (x_i, y_i, z_i) are the coordinates of the color value of the i-th color square of the coral health color card in the HSV domain; the coral color score S_i is given by the color-card square corresponding to the minimum color distance min_i d_i;
and taking the coral coverage rate, the coral health index and the color scores S_i of all corals in the area as evaluation indexes of the coral reef health condition, obtaining the comprehensive score W of the coral reef health condition from the weights of the different evaluation indexes, and obtaining the coral reef health evaluation result from the comprehensive score W.
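The color-card matching of claim 4 can be sketched with the standard library alone; the four card squares and their score values below are purely hypothetical stand-ins for a real coral health color card, and Euclidean distance in HSV is assumed:

```python
import colorsys
import math

# hypothetical coral health color card: (R, G, B in 0..1) -> color score
COLOR_CARD = [
    ((0.95, 0.95, 0.90), 1),  # bleached, near-white
    ((0.85, 0.75, 0.55), 2),  # pale
    ((0.70, 0.50, 0.30), 4),  # moderately pigmented
    ((0.45, 0.30, 0.15), 6),  # darkly pigmented, healthiest
]

def color_score(rgb):
    """Convert an RGB coral color to HSV and return the score of the
    color-card square at the minimum Euclidean distance in HSV space."""
    x0, y0, z0 = colorsys.rgb_to_hsv(*rgb)
    best_score, best_d = None, math.inf
    for card_rgb, score in COLOR_CARD:
        xi, yi, zi = colorsys.rgb_to_hsv(*card_rgb)
        d = math.sqrt((x0 - xi) ** 2 + (y0 - yi) ** 2 + (z0 - zi) ** 2)
        if d < best_d:
            best_d, best_score = d, score
    return best_score

print(color_score((0.94, 0.94, 0.91)))  # nearest the bleached square -> 1
```

In a full pipeline the RGB input would be the mean color of a coral region segmented by the classification model of claim 3.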
5. The method for intelligently assessing the health of the coral reef according to claim 4, wherein the step S3 comprises:
the diversity index of the coral is:

HI = − Σ_{i=1}^{s} (p_i / P) ln( p_i / P )

wherein HI is the diversity index, p_i is the number of corals of the i-th species, P is the total number of corals, and s is the number of coral species; the coral health index is:

CI = (1/n) Σ_{i=1}^{n} H_i / H_{i0}

wherein H_i is the diversity index of the i-th biological group, H_{i0} is the healthy value of the diversity index of the i-th biological group, and n is the number of biological groups;
calculating the coral coverage rate:

TC = A / M

wherein A is the number of small squares occupied by coral among the M small squares of size U × V into which the collected image is equally divided;
calculating the average color score MI of all corals:

MI = (1/P) Σ_{i=1}^{P} S_i
obtaining the corresponding evaluation scores Y_1, Y_2 and Y_3 from the calculated coral coverage rate TC, coral health index CI and average color score MI of all corals:

Y_1 = TC × 100

[piecewise formula for Y_2 in terms of CI]

[piecewise formula for Y_3 in terms of MI]

wherein α_1, α_2, α_3, α_4, α_5, b_1, b_2, b_3, b_4 are all parameters and δ_1, δ_2, δ_3, δ_4 are all index thresholds, obtained by experiment;
taking the coral coverage rate, the coral health index and the average color score of all corals in the area as evaluation indexes of the coral reef health condition, the comprehensive score W of the coral reef health condition is obtained from the weights of the different evaluation indexes:

W = ω_1 Y_1 + ω_2 Y_2 + ω_3 Y_3

wherein ω_1 is the weight of the coral coverage rate, ω_2 is the weight of the coral health index, and ω_3 is the weight of the average color score of all corals.
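The index arithmetic of claim 5 can be sketched as follows; the Shannon form of the diversity index, the mean-ratio form of CI, and the example weights are assumptions, and the piecewise score functions Y_2 and Y_3 are replaced by simple 0-100 rescalings because their experimental parameters (α, b, δ) are not published:

```python
import math

def diversity_index(counts):
    """HI = -sum (p_i/P) * ln(p_i/P) over the s coral species."""
    P = sum(counts)
    return -sum((p / P) * math.log(p / P) for p in counts if p > 0)

def health_index(groups):
    """CI: mean ratio of each group's diversity index H_i to its healthy
    reference value H_i0 (assumed form of the claim's formula)."""
    return sum(h / h0 for h, h0 in groups) / len(groups)

def composite_score(tc, ci, mi, w=(0.4, 0.3, 0.3)):
    """W = w1*Y1 + w2*Y2 + w3*Y3, with Y1 = TC*100; Y2 and Y3 here are
    placeholder rescalings standing in for the piecewise fits."""
    y1 = tc * 100
    y2 = min(ci, 1.0) * 100   # placeholder for the piecewise Y2(CI)
    y3 = mi / 6 * 100         # placeholder for Y3(MI), 1-6 color card
    return w[0] * y1 + w[1] * y2 + w[2] * y3

counts = [30, 20, 10]                  # p_i per coral species
hi = diversity_index(counts)
ci = health_index([(hi, 1.2), (0.9, 1.0)])
tc = 45 / 100                          # A = 45 of M = 100 squares
mi = 4.2                               # average color score of all corals
print(round(composite_score(tc, ci, mi), 1))
```

The weights ω and the healthy reference values H_i0 would in practice be fixed per reef region before scoring.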
CN202210936081.0A 2022-08-04 2022-08-04 Coral reef health condition intelligent assessment method Pending CN115457376A (en)

Publications (1)

CN115457376A (published 2022-12-09)

Family ID: 84296723

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination