CN114612788B - Urban landscape plant diversity monitoring method based on neural network


Info

Publication number
CN114612788B
CN114612788B (application CN202210283098.0A)
Authority
CN
China
Prior art keywords
visible light
layer
unmanned aerial vehicle
plant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210283098.0A
Other languages
Chinese (zh)
Other versions
CN114612788A (en)
Inventor
王蕾
贾佳
严俊鑫
姚允龙
翟雅琳
王佳轩
姚明辰
尚雅杰
荆忠伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeast Forestry University
Original Assignee
Northeast Forestry University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeast Forestry University filed Critical Northeast Forestry University
Priority to CN202210283098.0A
Publication of CN114612788A
Application granted
Publication of CN114612788B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A30/00 Adapting or protecting infrastructure or their operation
    • Y02A30/60 Planning or developing urban green infrastructure
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The invention discloses a neural-network-based method for monitoring the diversity of urban landscape plants. It aims to solve the problems that traditional urban landscape plant diversity monitoring methods cannot achieve large-scale dynamic monitoring and consume substantial manpower and material resources. The process is as follows: collect visible light images captured by an unmanned aerial vehicle; establish a neural network model; predict urban landscape plant diversity from the visible light images; register the visible light images with infrared remote sensing images; intercept sub-images of the infrared remote sensing images; obtain feature vectors of the visible light images and the sub-images; obtain the best matching result; obtain the coverage rate and water content of each plant species of the urban landscape plants in the visible light images; and monitor the coverage rate and water content of the nth plant species in the designated city and designated month over M consecutive years. The invention is used in the field of urban landscape plant diversity monitoring.

Description

Urban landscape plant diversity monitoring method based on neural network
Technical Field
The invention relates to a method for monitoring the diversity of urban landscape plants.
Background
Urban landscaping is an important component of the human living environment: it provides residents with space for communication, rest and entertainment, and integrates landscape plant design into people's lives. As living standards improve, people pay increasing attention to quality of life and to the living environment, and the ecological, social and economic benefits of urban landscape plants become ever more evident. Different climatic and topographic conditions give rise to different ecosystems. In terms of plants, pines and cypresses, which are cold-resistant and drought-resistant, are widely planted in the garden landscapes of northern cities, while the garden landscapes of southern cities consist mainly of tropical plants that flower and attract birdsong in all seasons. Urban landscape plants thus have distinct regional characteristics.
Urban landscape plants in each region are diverse; landscape plants include chrysanthemum, peony, Chinese rose and so on. Different plants have different water contents: the water content of moss is about 6%, while that of lotus is about 90%. Water content is an important criterion for evaluating plant growth. To maintain the diversity of urban landscape plants in each region, the coverage rates of the various landscape plants should also be distributed in a balanced manner.
the remote sensing technology has the characteristics of wide detection range, short updating period, quick information acquisition and the like, and leaves of plants are red due to the reflection of infrared rays on the color infrared image. However, the spectral characteristics of various vegetation types or plants are different in different growth stages or under the influence of different environments, so that the depth of red on the color infrared image is different, for example, the color of normally growing conifer forest is red to magenta, withered vegetation is dark red, and the withered vegetation is cyan.
Traditional urban landscape plant diversity monitoring methods require substantial manpower and material resources, update slowly, and cannot achieve large-scale dynamic monitoring. The advantages of remote sensing therefore make it a promising tool for urban landscape plant diversity monitoring.
Disclosure of Invention
The invention aims to provide a neural-network-based urban landscape plant diversity monitoring method that solves the problems that traditional urban landscape plant diversity monitoring methods cannot achieve large-scale dynamic monitoring and consume substantial manpower and material resources.
The urban landscape plant diversity monitoring method based on the neural network comprises the following specific processes:
step 1: collecting visible light images shot by an unmanned aerial vehicle to form a sample training set;
the visible light image shot by the unmanned aerial vehicle comprises the plant name of the urban landscape plant, the city to which the urban landscape plant belongs and the month to which the urban landscape plant belongs;
step 2: establishing a neural network model;
Step 3: input the sample training set data collected in step 1 into the established neural network model, and perform iterative optimization with the Adam algorithm to obtain the optimal network model;
Step 4: collect infrared remote sensing images and visible light images captured by the unmanned aerial vehicle of the same city and the same month; input the collected visible light images into the optimal network model to predict urban landscape plant diversity, obtaining the plant names of the urban landscape plants, the city to which they belong and the month to which they belong in the collected visible light images;
Step 5: register the visible light images captured by the unmanned aerial vehicle, whose plant names, city and month were determined in step 4, with the infrared remote sensing images collected in step 4;
Determine the area where the unmanned aerial vehicle's aerial image is located from the longitude, latitude and altitude of the imaging point recorded while the visible light image is captured; determine, from that area and the positioning accuracy of the unmanned aerial vehicle, the region of the infrared remote sensing image in which the aerial image may appear; and then crop that region as a sub-image of the infrared remote sensing image;
Step 6: input the visible light image captured by the unmanned aerial vehicle and the sub-image of the infrared remote sensing image from step 5 into a feature extraction network to obtain the feature vector of the visible light image to be detected and the feature vector of the sub-image of the infrared remote sensing image;
Step 7: compute the feature response scores between the feature vector of the infrared remote sensing sub-image and the feature vector of the visible light image with the QATM matching algorithm, and select the region with the highest score as the best matching result (an illustrative sketch follows);
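For illustration only (not part of the patent disclosure), the following minimal sketch shows QATM-style quality scoring between two feature maps; the function name, the flattened feature shapes and the softness parameter alpha are assumptions.

    import torch
    import torch.nn.functional as F

    def qatm_scores(feat_template, feat_search, alpha=25.0):
        # feat_template: (N_t, D) features of the visible light image,
        # feat_search:   (N_s, D) features of the infrared sub-image.
        # Pairwise cosine similarity between every feature pair.
        sim = F.normalize(feat_template, dim=1) @ F.normalize(feat_search, dim=1).T
        # Softmax over each axis approximates matching likelihood in both
        # directions; their product is the QATM quality of each pairing.
        quality = F.softmax(alpha * sim, dim=0) * F.softmax(alpha * sim, dim=1)
        # Best quality per search position; the highest-scoring region of
        # the sub-image is taken as the best matching result.
        return quality.max(dim=0).values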
Step 8: take the plant names, city and month of the visible light image corresponding to the best matching result as the plant names, city and month of the urban landscape plants at the matched pixel positions of the infrared remote sensing image;
Obtain the coverage rate and water content of the urban landscape plants at those pixel positions of the infrared remote sensing image from the plant names, city and month at those positions;
Take the coverage rate and water content at those pixel positions as the coverage rate and water content of each plant species of the urban landscape plants in the corresponding visible light image;
Step 9: monitor the coverage rate of the nth plant species in the designated city and designated month over M consecutive years, and the water content of the nth plant species in the designated city over the same M consecutive years. Record the monitored water contents of the nth plant species over the M years as W_m^n, m = 1, 2, …, M, n = 1, 2, …, N, where W_m^n denotes the water content of the nth plant species in the mth year; record the monitored coverage rates of the nth plant species over the M years as C_m^n, m = 1, 2, …, M, n = 1, 2, …, N, where C_m^n denotes the coverage rate of the nth plant species in the mth year;
Input the coverage rates of the nth plant species in the designated city over the M consecutive years and the water content images of the nth plant species over the same years into a prediction network to predict the future coverage rate change curve and water content images of the nth plant species; obtain the coverage rate and water content images of the other plant species by the same method;
In the next M_0 consecutive years, if the coverage rate of the nth plant species decreases year by year and its water content keeps falling, the planting for that month in the designated city needs to be adjusted and the nth plant species is replaced with a drought-resistant plant;
If the coverage rate of the nth plant species decreases year by year while the water content is greater than or equal to the water content threshold, monitor whether the nth plant species is affected by disease and take preventive measures;
If the coverage rate of the nth plant species is unchanged but the water content is below the water content threshold, irrigate the plants whose water content is below the threshold to maintain plant species diversity;
If the coverage rate of the nth plant species increases year by year and the water content is greater than or equal to the water content threshold, reduce the planting of the nth plant species;
In all other cases no optimization is required and monitoring continues; the rules are sketched below.
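For illustration only, the rules above can be encoded as a single decision function; the names, list inputs and the strict-monotonicity test are assumptions made for the sketch.

    def plant_action(coverage, water, water_threshold):
        # coverage, water: predicted yearly values over the next M0 years.
        decreasing = all(a > b for a, b in zip(coverage, coverage[1:]))
        increasing = all(a < b for a, b in zip(coverage, coverage[1:]))
        water_falling = all(a > b for a, b in zip(water, water[1:]))
        water_ok = all(v >= water_threshold for v in water)

        if decreasing and water_falling:
            return "replace species with a drought-resistant plant"
        if decreasing and water_ok:
            return "monitor for disease and take preventive measures"
        if not decreasing and not increasing and min(water) < water_threshold:
            return "irrigate plants below the water content threshold"
        if increasing and water_ok:
            return "reduce planting of this species"
        return "no optimization needed; continue monitoring"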
Further, a neural network model is established in step 2; the specific process is as follows:
The neural network model comprises: an input layer, a first three-dimensional convolution layer, a first batch normalization (BN) layer, a first ReLU activation layer, a first global average pooling layer, a second global average pooling layer, a first unit convolution, a third batch normalization (BN) layer, a first Sigmoid activation function layer, a second Sigmoid activation function layer, a third Sigmoid activation function layer, a fourth Sigmoid activation function layer, a second batch normalization (BN) layer, a second ReLU activation layer, a first Dropout layer, a first global max pooling layer, an FC fully connected layer, a Softmax function classifier and an output layer;
the connection relation of the neural network model is as follows:
The input layer is connected to the first three-dimensional convolution layer; the first three-dimensional convolution layer is connected to the first BN layer; the first BN layer is connected to the first ReLU activation layer; the first ReLU activation layer is connected to the first global average pooling layer and to the second global average pooling layer; the outputs of the first and second global average pooling layers are cascaded and fed into the first unit convolution; the first unit convolution is connected to the third BN layer; and the third BN layer is connected to the first, second, third and fourth Sigmoid activation function layers;
The outputs of the first, second, third and fourth Sigmoid activation function layers are multiplied together, and the product is fed into the second BN layer; the output of the second BN layer is connected to the second ReLU activation layer; the output of the second ReLU activation layer is connected to the first Dropout layer; the output of the first Dropout layer is connected to the first global max pooling layer; the output of the first global max pooling layer is connected to the FC fully connected layer; the output of the FC fully connected layer is connected to the Softmax function classifier; and the result of the Softmax function classifier is produced through the output layer.
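As a reading aid (not part of the disclosure), here is a minimal PyTorch sketch that wires the layers in the order just described; the channel counts, kernel sizes, dropout rate and number of classes are assumptions, since the patent does not specify them.

    import torch
    import torch.nn as nn

    class PlantDiversityNet(nn.Module):
        def __init__(self, in_ch=3, mid_ch=32, n_classes=10):
            super().__init__()
            self.conv3d = nn.Conv3d(in_ch, mid_ch, 3, padding=1)  # first 3-D conv
            self.bn1 = nn.BatchNorm3d(mid_ch)                     # first BN layer
            self.relu1 = nn.ReLU()                                # first ReLU
            self.gap1 = nn.AdaptiveAvgPool3d(1)                   # first global avg pool
            self.gap2 = nn.AdaptiveAvgPool3d(1)                   # second global avg pool
            self.unit_conv = nn.Conv3d(2 * mid_ch, mid_ch, 1)     # first unit convolution
            self.bn3 = nn.BatchNorm3d(mid_ch)                     # third BN layer
            self.bn2 = nn.BatchNorm3d(mid_ch)                     # second BN layer
            self.relu2 = nn.ReLU()                                # second ReLU
            self.drop = nn.Dropout(0.5)                           # first Dropout
            self.gmp = nn.AdaptiveMaxPool3d(1)                    # first global max pool
            self.fc = nn.Linear(mid_ch, n_classes)                # FC fully connected layer

        def forward(self, x):  # x: (batch, channels, depth, height, width)
            f = self.relu1(self.bn1(self.conv3d(x)))
            # Cascade (concatenate) the two global average pooling branches.
            cascade = torch.cat([self.gap1(f), self.gap2(f)], dim=1)
            g = self.bn3(self.unit_conv(cascade))
            # Product of the outputs of the four Sigmoid activation layers.
            gate = torch.sigmoid(g) * torch.sigmoid(g) * torch.sigmoid(g) * torch.sigmoid(g)
            y = self.gmp(self.drop(self.relu2(self.bn2(gate))))
            return torch.softmax(self.fc(y.flatten(1)), dim=1)   # Softmax classifier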
Furthermore, the number of times of training the neural network model is N, and the training of the neural network model is divided into three stages;
When the number of completed training iterations lies in [0, N/2), the loss function is:
L(y, f(x)) = max(0, 1 - y·f(x))
where f(x) is the predicted value, between -1 and 1, and y is the target value;
When the iteration count lies in [N/2, 3N/4), the loss function is:
L(y, f(x)) = max(0, -f(x))
When the iteration count lies in [3N/4, N], the loss function is:
L = (1/P) · Σ_{p=1}^{P} (ŷ_p - y_p)²
where ŷ_p is the prediction network input at time t+1, P is the total number of features, and y_p is the prediction network output at time t.
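For illustration, a staged selection of the three losses above might look as follows; the tensor shapes and the pairing of ŷ_p with y_p are assumptions based on the reconstruction above.

    import torch

    def staged_loss(step, n_total, y, f_x, y_hat=None, y_prev=None):
        # Stage 1, iterations [0, N/2): hinge loss max(0, 1 - y·f(x)).
        if step < n_total // 2:
            return torch.clamp(1 - y * f_x, min=0).mean()
        # Stage 2, iterations [N/2, 3N/4): perceptron-style loss max(0, -f(x)).
        if step < 3 * n_total // 4:
            return torch.clamp(-f_x, min=0).mean()
        # Stage 3, iterations [3N/4, N]: squared error (1/P)·Σ(ŷ_p - y_p)².
        return ((y_hat - y_prev) ** 2).mean()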
Further, in step 5, the visible light images captured by the unmanned aerial vehicle, whose plant names, city and month were determined in step 4, are registered with the infrared remote sensing images collected in step 4; the specific process is as follows:
step 51: carrying out direction registration on a visible light image shot by the unmanned aerial vehicle based on the infrared remote sensing image;
step 52: and carrying out scale space registration on the visible light image shot by the unmanned aerial vehicle after the direction registration and the infrared remote sensing image to obtain the visible light image shot by the unmanned aerial vehicle after the scale space registration.
Further, in the step 51, the direction registration is performed on the visible light image shot by the unmanned aerial vehicle based on the infrared remote sensing image; the specific process is as follows:
A pixel coordinate system HOW is established with the upper-left corner of the image as the origin O, where H is the vertical axis of the pixel coordinate system and equals the image height, and W is the horizontal axis and equals the image width;
The coordinates (X0', Y0') of a point (X0, Y0) in the image pixel coordinate system after a clockwise rotation by θ degrees about the image center are computed as:
X0' = (X0 - W/2)·cosθ - (Y0 - H/2)·sinθ + W/2
Y0' = (X0 - W/2)·sinθ + (Y0 - H/2)·cosθ + H/2
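For illustration, the rotation can be implemented directly; the y-down pixel coordinate convention used below is an assumption consistent with the HOW coordinate system above.

    import math

    def rotate_pixel(x0, y0, theta_deg, width, height):
        # Rotate pixel (x0, y0) clockwise by theta_deg about the image
        # center in a y-down pixel coordinate system (sketch of step 51).
        t = math.radians(theta_deg)
        cx, cy = width / 2.0, height / 2.0
        dx, dy = x0 - cx, y0 - cy
        x1 = dx * math.cos(t) - dy * math.sin(t) + cx
        y1 = dx * math.sin(t) + dy * math.cos(t) + cy
        return x1, y1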
further, in the step 52, performing scale space registration on the visible light image shot by the unmanned aerial vehicle after direction registration and the infrared remote sensing image to obtain a visible light image shot by the unmanned aerial vehicle after scale space registration; the specific process is as follows:
Suppose the ground coverage of the infrared remote sensing image has length L_aX and width L_aY, and the size of the infrared remote sensing image is H_a × W_a, where H_a is the number of pixel rows of the infrared remote sensing image and W_a is the number of pixel columns of the infrared remote sensing image;
The ground coverage of a single pixel of the infrared remote sensing image in the X and Y directions of the central coordinate system is:
d_aX = L_aX / W_a
d_aY = L_aY / H_a
The ground coverage of the visible light image captured by the unmanned aerial vehicle has length L_X and width L_Y; the size of the visible light image is h × w, where h is the number of pixel rows of the visible light image and w is the number of pixel columns of the visible light image;
The ground coverage of a single pixel of the visible light image in the X and Y directions of the central coordinate system is:
d_X = L_X / w
d_Y = L_Y / h
The scale relation between the visible light image captured by the unmanned aerial vehicle and the infrared remote sensing image, i.e. the relation between the ground coverage of a single pixel in the two images, is:
β_X = d_X / d_aX
β_Y = d_Y / d_aY
where β_X expresses the scale correspondence between the aerial image and the satellite image in the X direction of the central coordinate system, and β_Y expresses the scale correspondence in the Y direction;
The visible light image captured by the unmanned aerial vehicle is up-sampled to obtain the adjusted visible light image, achieving scale registration between the visible light image and the infrared remote sensing image and yielding the scale-space-registered aerial image.
Further, the visible light image captured by the unmanned aerial vehicle is up-sampled to obtain the adjusted visible light image; the expressions are:
w' = ⌊β_X · w⌋
h' = ⌊β_Y · h⌋
where ⌊·⌋ denotes rounding down, h is the number of pixel rows of the visible light image captured by the unmanned aerial vehicle, w is the number of pixel columns of that image, w' is the number of pixel columns of the adjusted image, and h' is the number of pixel rows of the adjusted image.
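For illustration, the scale registration under the reconstructed formulas above could be implemented as follows; the use of OpenCV's resize and the parameter names are assumptions.

    import cv2  # OpenCV

    def scale_register(uav_img, L_X, L_Y, L_aX, L_aY, W_a, H_a):
        # Resample the UAV visible light image so a single pixel covers the
        # same ground area as one infrared remote sensing pixel (step 52).
        h, w = uav_img.shape[:2]
        d_X, d_Y = L_X / w, L_Y / h            # UAV ground size per pixel
        d_aX, d_aY = L_aX / W_a, L_aY / H_a    # infrared ground size per pixel
        beta_X, beta_Y = d_X / d_aX, d_Y / d_aY
        new_w, new_h = int(beta_X * w), int(beta_Y * h)  # w' = ⌊β_X·w⌋, h' = ⌊β_Y·h⌋
        return cv2.resize(uav_img, (new_w, new_h))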
Further, the loss function of the prediction network is:
L = (1/P) · Σ_{p=1}^{P} (ŷ_p - y_p)²
where ŷ_p is the prediction network input at time t+1, P is the total number of features, and y_p is the prediction network output at time t.
Further, the coverage rate calculation process in step 9 is as follows:
The ratio of the number of pixels covered by the nth plant species to the total number of pixels covered by all plant species is taken as the coverage rate of the nth plant species;
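For illustration, this ratio is a short computation over a per-pixel species label map; the assumption that label 0 marks non-plant pixels is the sketch's, not the patent's.

    import numpy as np

    def coverage_rate(label_map, species_id):
        # Pixels labeled species_id over all plant-labeled pixels (0 = non-plant).
        plant_pixels = np.count_nonzero(label_map)
        if plant_pixels == 0:
            return 0.0
        return float(np.count_nonzero(label_map == species_id)) / plant_pixels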
further, the process of determining the moisture content threshold in step 9 is:
Partial samples of the nth plant species in the designated month of the designated city over a consecutive years are selected manually, and the average water content of these samples is taken as the water content threshold, where 1 ≤ a ≤ M/2.
The invention has the beneficial effects that:
the invention realizes the plant diversity monitoring based on the infrared remote sensing image and the visible light image of the urban landscape plant, firstly, the visible light image obtains the plant classification result through the neural network, various plants in the urban landscape form the plant diversity, and the plant names, the belonged cities and the belonged months of the various plants are monitored to ensure the plant diversity and the ecological balance;
due to the large difference between the infrared remote sensing image and the visible light image in scale, the infrared remote sensing image and the visible light image need to be registered in scale space. Because the visible light image shot by the unmanned aerial vehicle records the attitude information such as the pitch angle, the azimuth angle, the position and the like of the camera during image imaging in the shooting process, and meanwhile, the infrared remote sensing image also comprises latitude and longitude information, the size of the ground area actually covered by the image and the direction included angle between the images can be calculated according to the information, then the image is rotated, and the image is resampled by combining the size of the pixel size of the image, so that the ground coverage sizes of the infrared remote sensing image and the visible light image in the direction and the unit pixel representation are consistent, and the registration of the infrared remote sensing image and the visible light image in the dimension and the direction is realized.
The method obtains the coverage rate and water content of the urban landscape plants at each pixel position of the infrared remote sensing image from the plant names, city and month at those positions, and takes them as the coverage rate and water content of each plant species in the corresponding visible light image.
The coverage rate and water content of the nth plant species in the designated city and designated month are monitored over M consecutive years; the coverage rates and water content images over those years are fed into the prediction network to predict the future coverage rate change curve and water content images of the nth plant species, and likewise for the other species. If, over the next M_0 consecutive years, the coverage rate of the nth species decreases year by year and its water content keeps falling, the planting for that month in the designated city is adjusted and the nth species is replaced with a drought-resistant plant, preserving diversity and ecological balance. If the coverage rate decreases year by year while the water content stays at or above the threshold, the species is monitored for pest and disease damage and treated (e.g. sprayed with pesticide). If the coverage rate is unchanged but the water content falls below the threshold, the affected plants are irrigated to preserve species diversity. If the coverage rate increases year by year and the water content stays at or above the threshold, planting of the nth species is reduced. In all other cases no optimization is needed and monitoring continues. The method thus monitors the species, coverage rate and water content of urban landscape plants, and solves the problems that traditional monitoring methods cannot achieve large-scale dynamic monitoring and consume substantial manpower and material resources. Furthermore, the monitoring results can be used to predict future coverage rate and water content trends, and early intervention based on the predictions preserves urban landscape plant species diversity and keeps the ecosystem stable.
The training of the neural network model is divided into three stages. When the number of completed iterations lies in [0, N/2), the loss function is L(y, f(x)) = max(0, 1 - y·f(x)), which is comparatively robust. When the iteration count lies in [N/2, 3N/4), the loss function is L(y, f(x)) = max(0, -f(x)). When it lies in [3N/4, N], the loss function is:
L = (1/P) · Σ_{p=1}^{P} (ŷ_p - y_p)²
The loss max(0, -f(x)) is simpler than max(0, 1 - y·f(x)), but the corresponding model generalizes less strongly. The invention therefore uses max(0, 1 - y·f(x)), the loss with the stronger generalization ability, during iterations [0, N/2); once the model has acquired some generalization ability, it switches to the simpler loss max(0, -f(x)), which incurs no penalty whenever the sample is classified correctly, during [N/2, 3N/4); and during [3N/4, N], with the model already trained by the first two losses, the squared-error loss above refines the prediction and secures its accuracy.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
The first embodiment is as follows: the urban landscape plant diversity monitoring method based on the neural network comprises the following specific processes:
step 1: collecting visible light images shot by an unmanned aerial vehicle to form a sample training set;
the visible light image shot by the unmanned aerial vehicle comprises the plant name of the urban landscape plant, the city to which the urban landscape plant belongs and the month to which the urban landscape plant belongs;
step 2: establishing a neural network model;
Step 3: input the sample training set data collected in step 1 into the established neural network model, and perform iterative optimization with the Adam algorithm to obtain the optimal network model;
Step 4: collect infrared remote sensing images and visible light images captured by the unmanned aerial vehicle of the same city and the same month; input the collected visible light images into the optimal network model to predict urban landscape plant diversity, obtaining the plant names of the urban landscape plants, the city to which they belong and the month to which they belong in the collected visible light images;
Step 5: register the visible light images captured by the unmanned aerial vehicle, whose plant names, city and month were determined in step 4, with the infrared remote sensing images collected in step 4;
determining the area where the aerial image of the unmanned aerial vehicle is located based on longitude, latitude and height position information of an imaging point recorded in the process of shooting the visible light image by the unmanned aerial vehicle, determining the area where the aerial image is likely to appear in the infrared remote sensing image according to the area where the aerial image of the unmanned aerial vehicle is located and the positioning precision of the unmanned aerial vehicle, and then intercepting the area where the aerial image is likely to appear in the infrared remote sensing image as a sub-image of the infrared remote sensing image;
Step 6: input the visible light image captured by the unmanned aerial vehicle and the sub-image of the infrared remote sensing image from step 5 into a feature extraction network (a pre-trained CNN network) to obtain the feature vector of the visible light image to be detected and the feature vector of the sub-image of the infrared remote sensing image;
Step 7: compute the feature response scores between the feature vector of the infrared remote sensing sub-image and the feature vector of the visible light image with the QATM matching algorithm, and select the region with the highest score as the best matching result, thereby matching the infrared remote sensing image with the visible light image;
Step 8: take the plant names, city and month of the visible light image corresponding to the best matching result as the plant names, city and month of the urban landscape plants at the matched pixel positions of the infrared remote sensing image;
obtaining the coverage rate and the water content of the urban landscape plants at the pixel position of the infrared remote sensing image based on the plant names of the urban landscape plants at the pixel position of the infrared remote sensing image, the cities to which the urban landscape plants belong and the months to which the urban landscape plants belong;
taking the coverage rate and the water content of the urban landscape plants at the pixel position of the infrared remote sensing image as the coverage rate and the water content of various plants in the urban landscape plants in the visible light image corresponding to the pixel position;
Step 9: monitor the coverage rate of the nth plant species in the designated city and designated month over M consecutive years, and the water content of the nth plant species in the designated city over the same M consecutive years. Record the monitored water contents of the nth plant species over the M years as W_m^n, m = 1, 2, …, M, n = 1, 2, …, N, where W_m^n denotes the water content of the nth plant species in the mth year; record the monitored coverage rates of the nth plant species over the M years as C_m^n, m = 1, 2, …, M, n = 1, 2, …, N, where C_m^n denotes the coverage rate of the nth plant species in the mth year;
inputting the coverage rate of the nth plant of the designated city in the continuous M years and the water content image of the nth plant of the designated city in the continuous M years into a prediction network (an existing CNN network) to predict a coverage rate change curve and a water content image of the nth plant in the future; and obtaining coverage rate and water content images of other various plants by adopting the method;
In the next M_0 consecutive years, if the coverage rate of the nth plant species decreases year by year and its water content keeps falling, the planting for that month in the designated city needs to be adjusted, the nth plant species is replaced with a drought-resistant plant, and urban landscape plant diversity and ecological balance are safeguarded;
if the coverage rate of the nth plant is gradually reduced and the water content is larger than or equal to the water content threshold value, monitoring whether the nth plant is interfered by diseases and pests, and preventing (monitoring whether the plant is corroded by the diseases and the pests, and spraying pesticide);
if the coverage rate of the nth plant is unchanged but the water content is smaller than the water content threshold value, carrying out water irrigation on the plant with the water content smaller than the water content threshold value so as to ensure the variety of the plant;
if the coverage rate of the nth plant is increased in sequence and the water content is greater than or equal to the water content threshold value, the planting of the nth plant is reduced, and the diversity monitoring and ecological balance of the urban landscape plants are ensured;
In all other cases no optimization is required, and monitoring continues.
The second embodiment is as follows: the difference between the present embodiment and the first embodiment is that, in the step 2, a neural network model is established; the specific process is as follows:
The neural network model comprises: an input layer, a first three-dimensional convolution layer, a first batch normalization (BN) layer, a first ReLU activation layer, a first global average pooling layer, a second global average pooling layer, a first unit convolution, a third batch normalization (BN) layer, a first Sigmoid activation function layer, a second Sigmoid activation function layer, a third Sigmoid activation function layer, a fourth Sigmoid activation function layer, a second batch normalization (BN) layer, a second ReLU activation layer, a first Dropout layer, a first global max pooling layer, an FC fully connected layer, a Softmax function classifier and an output layer;
the connection relation of the neural network model is as follows:
The input layer is connected to the first three-dimensional convolution layer; the first three-dimensional convolution layer is connected to the first BN layer; the first BN layer is connected to the first ReLU activation layer; the first ReLU activation layer is connected to the first global average pooling layer and to the second global average pooling layer; the outputs of the first and second global average pooling layers are cascaded and fed into the first unit convolution; the first unit convolution is connected to the third BN layer; and the third BN layer is connected to the first, second, third and fourth Sigmoid activation function layers;
The outputs of the first, second, third and fourth Sigmoid activation function layers are multiplied together, and the product is fed into the second BN layer; the output of the second BN layer is connected to the second ReLU activation layer; the output of the second ReLU activation layer is connected to the first Dropout layer; the output of the first Dropout layer is connected to the first global max pooling layer; the output of the first global max pooling layer is connected to the FC fully connected layer; the output of the FC fully connected layer is connected to the Softmax function classifier; and the result of the Softmax function classifier is produced through the output layer.
Other steps and parameters are the same as those in the first embodiment.
The third embodiment: this embodiment differs from the first or second embodiment in that the neural network model is trained N times, and the training is divided into three stages;
When the number of completed training iterations lies in [0, N/2), the loss function is:
L(y, f(x)) = max(0, 1 - y·f(x))
If the classification is correct, the loss is 0; otherwise the loss is 1 - y·f(x);
where f(x) is the predicted value, between -1 and 1, and y is the target value (-1 or 1);
This loss is comparatively robust;
When the iteration count lies in [N/2, 3N/4), the loss function is:
L(y, f(x)) = max(0, -f(x))
The loss max(0, -f(x)) is simpler than max(0, 1 - y·f(x)): it incurs no penalty whenever the sample is classified correctly, but the corresponding model generalizes less strongly. The invention therefore uses max(0, 1 - y·f(x)), the loss with the stronger generalization ability, during iterations [0, N/2); once the model has acquired some generalization ability, it switches to the simpler loss max(0, -f(x)) during [N/2, 3N/4); and during [3N/4, N], with the model already trained by the first two losses, it uses the squared-error loss below to refine the prediction and secure its accuracy;
When the iteration count lies in [3N/4, N], the loss function is:
L = (1/P) · Σ_{p=1}^{P} (ŷ_p - y_p)²
where ŷ_p is the prediction network input at time t+1, P is the total number of features, and y_p is the prediction network output at time t.
Other steps and parameters are the same as those in the first or second embodiment.
The fourth embodiment: this embodiment differs from the first to third embodiments in that, in step 5, the visible light images captured by the unmanned aerial vehicle, whose plant names, city and month were determined in step 4, are registered with the infrared remote sensing images collected in step 4; the specific process is as follows:
step 51: carrying out direction registration on a visible light image shot by the unmanned aerial vehicle based on the infrared remote sensing image;
step 52: and carrying out scale space registration on the visible light image shot by the unmanned aerial vehicle after direction registration and the infrared remote sensing image to obtain the visible light image shot by the unmanned aerial vehicle after scale space registration.
Other steps and parameters are the same as those in one of the first to third embodiments.
The fifth concrete implementation mode is as follows: the difference between this embodiment and the first to the fourth embodiment is that, in the step 51, the direction registration is performed on the visible light image shot by the unmanned aerial vehicle based on the infrared remote sensing image; the specific process is as follows:
A pixel coordinate system HOW is established with the upper-left corner of the image as the origin O, where H is the vertical axis of the pixel coordinate system and equals the image height, and W is the horizontal axis and equals the image width;
The coordinates (X0', Y0') of a point (X0, Y0) in the image pixel coordinate system after a clockwise rotation by θ degrees about the image center are computed as:
X0' = (X0 - W/2)·cosθ - (Y0 - H/2)·sinθ + W/2
Y0' = (X0 - W/2)·sinθ + (Y0 - H/2)·cosθ + H/2
other steps and parameters are the same as in one of the first to fourth embodiments.
The sixth specific implementation mode: the difference between this embodiment and one of the first to fifth embodiments is that, in step 52, the visible light image shot by the unmanned aerial vehicle after direction registration and the infrared remote sensing image are subjected to scale space registration to obtain a visible light image shot by the unmanned aerial vehicle after scale space registration; the specific process is as follows:
Suppose the ground coverage of the infrared remote sensing image has length L_aX and width L_aY, and the size of the infrared remote sensing image is H_a × W_a, where H_a is the number of pixel rows of the infrared remote sensing image and W_a is the number of pixel columns of the infrared remote sensing image;
The ground coverage of a single pixel of the infrared remote sensing image in the X and Y directions of the central coordinate system is:
d_aX = L_aX / W_a
d_aY = L_aY / H_a
The ground coverage of the visible light image captured by the unmanned aerial vehicle has length L_X and width L_Y; the size of the visible light image is h × w, where h is the number of pixel rows of the visible light image and w is the number of pixel columns of the visible light image;
The ground coverage of a single pixel of the visible light image in the X and Y directions of the central coordinate system is:
d_X = L_X / w
d_Y = L_Y / h
The scale relation between the visible light image captured by the unmanned aerial vehicle and the infrared remote sensing image, i.e. the relation between the ground coverage of a single pixel in the two images, is:
β_X = d_X / d_aX
β_Y = d_Y / d_aY
where β_X expresses the scale correspondence between the aerial image and the satellite image in the X direction of the central coordinate system, and β_Y expresses the scale correspondence in the Y direction;
The visible light image captured by the unmanned aerial vehicle is up-sampled to obtain the adjusted visible light image, achieving scale registration between the visible light image and the infrared remote sensing image and yielding the scale-space-registered aerial image.
Other steps and parameters are the same as those in one of the first to fifth embodiments.
The seventh embodiment: this embodiment differs from the first to sixth embodiments in that the visible light image captured by the unmanned aerial vehicle is up-sampled to obtain the adjusted visible light image; the expressions are:
w' = ⌊β_X · w⌋
h' = ⌊β_Y · h⌋
where ⌊·⌋ denotes rounding down, h is the number of pixel rows of the visible light image captured by the unmanned aerial vehicle, w is the number of pixel columns of that image, w' is the number of pixel columns of the adjusted image, and h' is the number of pixel rows of the adjusted image.
Other steps and parameters are the same as those in one of the first to sixth embodiments.
The eighth embodiment: this embodiment differs from the first to seventh embodiments in that the loss function of the prediction network is:
L = (1/P) · Σ_{p=1}^{P} (ŷ_p - y_p)²
where ŷ_p is the prediction network input at time t+1, P is the total number of features, and y_p is the prediction network output at time t.
Other steps and parameters are the same as those in one of the first to seventh embodiments.
The ninth embodiment: this embodiment differs from the first to eighth embodiments in that, in step 9, the coverage rate is calculated as follows:
The ratio of the number of pixels covered by the nth plant species to the total number of pixels covered by all plant species is taken as the coverage rate of the nth plant species;
Other steps and parameters are the same as those in one of the first to eighth embodiments.
The tenth embodiment: this embodiment differs from the first to ninth embodiments in that the water content threshold in step 9 is determined as follows:
Partial samples of the nth plant species in the designated month of the designated city over a consecutive years are selected manually, and the average water content of these samples is taken as the water content threshold, where 1 ≤ a ≤ M/2.
Other steps and parameters are the same as those in one of the first to ninth embodiments.
The present invention is capable of other embodiments and its several details are capable of modifications in various obvious respects, all without departing from the spirit and scope of the present invention.

Claims (1)

1. The urban landscape plant diversity monitoring method based on the neural network is characterized by comprising the following steps: the method comprises the following specific processes:
step 1: collecting visible light images shot by an unmanned aerial vehicle to form a sample training set;
the visible light image shot by the unmanned aerial vehicle comprises the plant name of the urban landscape plant, the city to which the urban landscape plant belongs and the month to which the urban landscape plant belongs;
step 2: establishing a neural network model;
Step 3: input the sample training set data collected in step 1 into the established neural network model, and perform iterative optimization with the Adam algorithm to obtain the optimal network model;
Step 4: collect infrared remote sensing images and visible light images captured by the unmanned aerial vehicle of the same city and the same month; input the collected visible light images into the optimal network model to predict urban landscape plant diversity, obtaining the plant names of the urban landscape plants, the city to which they belong and the month to which they belong in the collected visible light images;
Step 5: register the visible light images captured by the unmanned aerial vehicle, whose plant names, city and month were determined in step 4, with the infrared remote sensing images collected in step 4;
determining the area of an aerial image of the unmanned aerial vehicle based on longitude, latitude and height position information of an imaging point recorded in the process of shooting a visible light image by the unmanned aerial vehicle, determining the area of the aerial image in which the infrared remote sensing image possibly appears according to the area of the aerial image of the unmanned aerial vehicle and the positioning precision of the unmanned aerial vehicle, and then intercepting the area of the aerial image in which the infrared remote sensing image possibly appears as a sub-image of the infrared remote sensing image;
Step 6: input the visible light image captured by the unmanned aerial vehicle and the sub-image of the infrared remote sensing image from step 5 into a feature extraction network to obtain the feature vector of the visible light image to be detected and the feature vector of the sub-image of the infrared remote sensing image;
Step 7: compute the feature response scores between the feature vector of the infrared remote sensing sub-image and the feature vector of the visible light image with the QATM matching algorithm, and select the region with the highest score as the best matching result, thereby matching the infrared remote sensing image with the visible light image;
Step 8: take the plant names, city and month of the visible light image corresponding to the best matching result as the plant names, city and month of the urban landscape plants at the matched pixel positions of the infrared remote sensing image;
obtaining the coverage rate and the water content of the urban landscape plants at the pixel position of the infrared remote sensing image based on the plant names of the urban landscape plants at the pixel position of the infrared remote sensing image, the cities to which the urban landscape plants belong and the months to which the urban landscape plants belong;
taking the coverage rate and the water content of the urban landscape plants at the pixel position of the infrared remote sensing image as the coverage rate and the water content of various plants in the urban landscape plants in the visible light image corresponding to the pixel position;
Step 9: monitor the coverage rate of the nth plant species in the designated city and designated month over M consecutive years and the water content of the nth plant species in the designated city and designated month over the same M consecutive years, recording the monitored water contents of the nth plant species over the M years as W_m^n, where W_m^n denotes the water content of the nth plant species in the mth year, and the monitored coverage rates of the nth plant species over the M years as C_m^n, where C_m^n denotes the coverage rate of the nth plant species in the mth year;
inputting the coverage rate of the nth plant of the designated city in the continuous M years and the water content image of the nth plant of the designated city in the continuous M years into a prediction network to predict the coverage rate change curve and the water content image of the nth plant in the future; and obtaining coverage rate and water content images of other various plants by adopting the method;
In the next M_0 consecutive years, if the coverage rate of the nth plant species decreases year by year and its water content keeps falling, the planting for that month in the designated city needs to be adjusted, the nth plant species is replaced with a drought-resistant plant, and urban landscape plant diversity and ecological balance are safeguarded;
if the coverage rate of the nth plant is decreased gradually and the water content is larger than or equal to the water content threshold value, monitoring whether the nth plant is interfered by diseases or not, and preventing;
if the coverage rate of the nth plant is unchanged but the water content is smaller than the water content threshold value, carrying out water irrigation on the plant with the water content smaller than the water content threshold value so as to ensure the variety of the plant;
if the coverage rate of the nth plant is increased in sequence and the water content is greater than or equal to the water content threshold value, the planting of the nth plant is reduced, and the diversity monitoring and ecological balance of the urban landscape plants are ensured;
the other conditions are not required to be optimized, and monitoring is continued;
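A minimal sketch of the decision rules above, assuming the predicted per-year series for one plant over the next $M_0$ years. The strict year-over-year trend tests and the use of the final year's moisture value are illustrative choices, not taken from the patent text.

```python
from typing import Sequence

def management_action(coverage: Sequence[float],
                      moisture: Sequence[float],
                      moisture_threshold: float) -> str:
    """Map predicted coverage/moisture trends to a management action."""
    decreasing = all(a > b for a, b in zip(coverage, coverage[1:]))
    increasing = all(a < b for a, b in zip(coverage, coverage[1:]))
    moisture_falling = all(a > b for a, b in zip(moisture, moisture[1:]))
    if decreasing and moisture_falling:
        return "replace with a drought-resistant species"
    if decreasing and moisture[-1] >= moisture_threshold:
        return "check for disease and take preventive measures"
    if not decreasing and not increasing and moisture[-1] < moisture_threshold:
        return "irrigate plants below the moisture threshold"
    if increasing and moisture[-1] >= moisture_threshold:
        return "reduce planting of this species"
    return "continue monitoring"
```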
the neural network model in step 2 is established as follows:

the neural network model comprises: an input layer, a first three-dimensional convolution layer, a first batch normalization (BN) layer, a first ReLU activation layer, a first global average pooling layer, a second global average pooling layer, a first unit convolution, a third batch normalization (BN) layer, a first Sigmoid activation function layer, a second Sigmoid activation function layer, a third Sigmoid activation function layer, a fourth Sigmoid activation function layer, a second batch normalization (BN) layer, a second ReLU activation layer, a first Dropout, a first global maximum pooling layer, an FC fully connected layer, a Softmax function classifier and an output layer;
the connection relationship of the neural network model is as follows:
the input layer is connected to the first three-dimensional convolution layer; the first three-dimensional convolution layer is connected to the first batch normalization (BN) layer; the first BN layer is connected to the first ReLU activation layer; the first ReLU activation layer is connected to the first and second global average pooling layers respectively; the outputs of the first and second global average pooling layers are concatenated and input into the first unit convolution; the first unit convolution is connected to the third BN layer; the third BN layer is connected to the first, second, third and fourth Sigmoid activation function layers respectively;

the outputs of the first, second, third and fourth Sigmoid activation function layers are multiplied together, and the product is input into the second BN layer; the second BN layer is connected to the second ReLU activation layer, the second ReLU activation layer to the first Dropout, the first Dropout to the first global maximum pooling layer, and the first global maximum pooling layer to the FC fully connected layer and the Softmax function classifier; the result of the Softmax function classifier is emitted through the output layer;
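Read literally, this dataflow can be sketched in PyTorch as below. All sizes (channels, kernel, dropout rate, class count) are assumptions, as is the reading of the "unit convolution" as a 1×1×1 convolution; note too that four parameter-free Sigmoid branches applied to the same tensor are identical, so their product equals one Sigmoid raised to the fourth power — the patent may intend distinct branches.

```python
import torch
import torch.nn as nn

class PlantNet(nn.Module):
    """Hedged sketch of the claimed architecture; sizes are illustrative."""
    def __init__(self, in_ch=3, ch=32, num_classes=10, p_drop=0.5):
        super().__init__()
        self.conv = nn.Conv3d(in_ch, ch, kernel_size=3, padding=1)  # first 3D conv
        self.bn1 = nn.BatchNorm3d(ch)                               # first BN layer
        self.relu1 = nn.ReLU()                                      # first ReLU
        self.gap1 = nn.AdaptiveAvgPool3d(1)     # first global average pooling
        self.gap2 = nn.AdaptiveAvgPool3d(1)     # second global average pooling
        self.unit = nn.Conv3d(2 * ch, ch, kernel_size=1)            # "unit" 1x1x1 conv
        self.bn3 = nn.BatchNorm3d(ch)                               # third BN layer
        self.bn2 = nn.BatchNorm3d(ch)                               # second BN layer
        self.relu2 = nn.ReLU()                                      # second ReLU
        self.drop = nn.Dropout(p_drop)                              # first Dropout
        self.gmp = nn.AdaptiveMaxPool3d(1)      # first global maximum pooling
        self.fc = nn.Linear(ch, num_classes)                        # FC layer

    def forward(self, x):
        f = self.relu1(self.bn1(self.conv(x)))
        # Concatenate (cascade) the two pooled branches along the channel axis.
        cat = torch.cat([self.gap1(f), self.gap2(f)], dim=1)
        a = self.bn3(self.unit(cat))
        prod = torch.sigmoid(a) ** 4            # product of the four Sigmoid branches
        h = self.drop(self.relu2(self.bn2(prod)))
        h = self.gmp(h).flatten(1)
        return torch.softmax(self.fc(h), dim=1)  # Softmax classifier output
```

For training with `nn.CrossEntropyLoss` one would return the logits `self.fc(h)` instead of the softmax probabilities.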
the neural network model is trained for N iterations, divided into three stages;
when the number of training iterations lies in $[0, N/2)$, the loss function used is:

$$L(y, f(x)) = \max\bigl(0,\ 1 - y\,f(x)\bigr)$$

where $f(x)$ is the predicted value and $y$ is the target value;
when the number of training iterations lies in $[N/2, 3N/4)$, the loss function used is:

$$L(y, f(x)) = \max\bigl(0,\ -f(x)\bigr)$$
when the number of training iterations lies in $[3N/4, N]$, the loss function used is:

$$L = \frac{1}{P} \sum_{p=1}^{P} \bigl(\hat{y}_p - y_p\bigr)^2$$

where $\hat{y}_p$ is the prediction network input at time $t+1$, $P$ is the total number of features, and $y_p$ is the prediction network output at time $t$;
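Put together, the schedule might look like the sketch below. The reduction by mean, labels $y \in \{-1, +1\}$ for the two hinge-style stages, and the reconstructed mean-squared-error form of the third stage are assumptions.

```python
import torch

def staged_loss(epoch: int, N: int, y: torch.Tensor, f_x: torch.Tensor) -> torch.Tensor:
    """Three-stage loss schedule over N training iterations (sketch)."""
    if epoch < N // 2:
        # Stage 1: hinge loss max(0, 1 - y*f(x)).
        return torch.clamp(1.0 - y * f_x, min=0.0).mean()
    elif epoch < 3 * N // 4:
        # Stage 2: perceptron-style loss max(0, -f(x)), as written in the claims.
        return torch.clamp(-f_x, min=0.0).mean()
    else:
        # Stage 3: mean squared error over the P features (reconstructed form).
        return ((f_x - y) ** 2).mean()
```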
In step 5, the visible light images captured by the unmanned aerial vehicle, annotated with the plant names, cities and months determined in step 4, are registered with the infrared remote sensing images collected in step 4; the specific process is as follows:
step 51: carrying out direction registration on a visible light image shot by the unmanned aerial vehicle based on the infrared remote sensing image;
step 52: carrying out scale space registration on the visible light image shot by the unmanned aerial vehicle after the direction registration and the infrared remote sensing image to obtain a visible light image shot by the unmanned aerial vehicle after the scale space registration;
in the step 51, the direction registration is performed on the visible light image shot by the unmanned aerial vehicle based on the infrared remote sensing image; the specific process is as follows:
establishing a pixel coordinate system HOW with the upper-left corner of the image as the origin O, where the vertical axis H runs along the image height and the horizontal axis W runs along the image width;
a point $(X_0, Y_0)$ in the image pixel coordinate system, rotated clockwise by $\theta$ degrees about the image centre, maps to coordinates $(X'_0, Y'_0)$ given by:

$$X'_0 = \left(X_0 - \tfrac{W}{2}\right)\cos\theta - \left(Y_0 - \tfrac{H}{2}\right)\sin\theta + \tfrac{W}{2}$$
$$Y'_0 = \left(X_0 - \tfrac{W}{2}\right)\sin\theta + \left(Y_0 - \tfrac{H}{2}\right)\cos\theta + \tfrac{H}{2}$$
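A small sketch of this rotation (the function name and degree-based input are illustrative):

```python
import math

def rotate_point_clockwise(x0: float, y0: float, theta_deg: float,
                           width: int, height: int) -> tuple[float, float]:
    """Rotate pixel (x0, y0) clockwise by theta_deg about the image centre."""
    cx, cy = width / 2.0, height / 2.0
    t = math.radians(theta_deg)
    dx, dy = x0 - cx, y0 - cy
    x1 = dx * math.cos(t) - dy * math.sin(t) + cx
    y1 = dx * math.sin(t) + dy * math.cos(t) + cy
    return x1, y1
```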
in the step 52, the visible light image shot by the unmanned aerial vehicle after the direction registration is subjected to scale space registration with the infrared remote sensing image to obtain a visible light image shot by the unmanned aerial vehicle after the scale space registration; the specific process is as follows:
suppose the ground coverage of the infrared remote sensing image has length $L_{aX}$ and width $L_{aY}$, and that the image size is $H_a \times W_a$, where $H_a$ is the number of pixel rows and $W_a$ is the number of pixel columns of the infrared remote sensing image;

the ground coverage of a single infrared remote sensing image pixel in the X and Y directions of the central coordinate system is:

$$d_{aX} = \frac{L_{aX}}{W_a}, \qquad d_{aY} = \frac{L_{aY}}{H_a}$$
let the ground coverage of the visible light image captured by the unmanned aerial vehicle have length $L_X$ and width $L_Y$, and let the image size be $h \times w$, where $h$ is the number of pixel rows and $w$ is the number of pixel columns of the visible light image;

the ground coverage of a single visible light image pixel in the X and Y directions of the central coordinate system is:

$$d_X = \frac{L_X}{w}, \qquad d_Y = \frac{L_Y}{h}$$
the scale relation between the visible light image captured by the unmanned aerial vehicle and the infrared remote sensing image is the ratio of the ground coverage of a single pixel in one image to that in the other:

$$\beta_X = \frac{d_{aX}}{d_X}, \qquad \beta_Y = \frac{d_{aY}}{d_Y}$$

where $\beta_X$ is the scale correspondence between the aerial image and the satellite image in the X direction of the central coordinate system, and $\beta_Y$ is the scale correspondence in the Y direction;
the visible light image captured by the unmanned aerial vehicle is resampled to obtain the adjusted visible light image, realizing scale registration between the visible light image and the infrared remote sensing image and yielding the scale-registered aerial image;

the size of the adjusted visible light image is:

$$h' = \left\lfloor \frac{h}{\beta_Y} \right\rfloor, \qquad w' = \left\lfloor \frac{w}{\beta_X} \right\rfloor$$

where $\lfloor \cdot \rfloor$ denotes rounding down, $h$ and $w$ are the numbers of pixel rows and columns of the visible light image captured by the unmanned aerial vehicle, and $h'$ and $w'$ are the numbers of pixel rows and columns of the adjusted visible light image; a sketch of the whole scale registration follows;
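A compact sketch of the scale registration under the reconstructed formulas above. OpenCV's `cv2.resize` and the interpolation choice are assumptions; variable names follow the text.

```python
import cv2  # pip install opencv-python

def scale_register(vis_img, L_X, L_Y, L_aX, L_aY, H_a, W_a):
    """Resample the visible light image to the infrared image's pixel scale."""
    h, w = vis_img.shape[:2]
    d_X, d_Y = L_X / w, L_Y / h            # visible-light footprint per pixel
    d_aX, d_aY = L_aX / W_a, L_aY / H_a    # infrared footprint per pixel
    beta_X, beta_Y = d_aX / d_X, d_aY / d_Y
    w_adj, h_adj = int(w / beta_X), int(h / beta_Y)  # floor via int() for positives
    return cv2.resize(vis_img, (w_adj, h_adj), interpolation=cv2.INTER_LINEAR)
```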
the loss function of the prediction network is:

$$L = \frac{1}{P} \sum_{p=1}^{P} \bigl(\hat{y}_p - y_p\bigr)^2$$

where $\hat{y}_p$ is the prediction network input at time $t+1$, $P$ is the total number of features, and $y_p$ is the prediction network output at time $t$;
the coverage rate calculation process in the step 9 is as follows:
the ratio of the number of pixels covered by the nth plant species to the number of pixels covered by all plant species is taken as the coverage rate of the nth plant species; a sketch follows;
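A minimal sketch of this ratio, assuming a per-pixel species label map where 0 encodes non-plant pixels and 1..N encode plant species (an assumed encoding):

```python
import numpy as np

def coverage_rate(label_map: np.ndarray, n: int) -> float:
    """Coverage rate of species n: its pixel count over all plant pixels."""
    plant_pixels = np.count_nonzero(label_map > 0)
    if plant_pixels == 0:
        return 0.0
    return np.count_nonzero(label_map == n) / plant_pixels
```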
the water content threshold in step 9 is determined as follows:

manually select partial samples of the nth plant in the designated month of the designated city over $a$ consecutive years, and take the mean water content of those samples as the water content threshold, where $1 \le a \le M/2$.