CN110827269B - Crop growth change condition detection method, device, equipment and medium - Google Patents

Crop growth change condition detection method, device, equipment and medium

Info

Publication number
CN110827269B
CN110827269B CN201911096106.5A
Authority
CN
China
Prior art keywords
preset
sub
growth
crop
crop growth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911096106.5A
Other languages
Chinese (zh)
Other versions
CN110827269A (en)
Inventor
汪飙
邹冲
李世行
张元梵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WeBank Co Ltd filed Critical WeBank Co Ltd
Priority to CN201911096106.5A priority Critical patent/CN110827269B/en
Publication of CN110827269A publication Critical patent/CN110827269A/en
Application granted granted Critical
Publication of CN110827269B publication Critical patent/CN110827269B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06N 3/04: Neural networks; architecture, e.g. interconnection topology
    • G06N 3/08: Neural networks; learning methods
    • G06T 7/11: Region-based segmentation
    • G06T 7/187: Segmentation; edge detection involving region growing, region merging or connected component labelling
    • G06T 2207/20081: Training; learning
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G06T 2207/30188: Vegetation; agriculture (under G06T 2207/30181, Earth observation)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a method, device, equipment and medium for detecting crop growth change conditions. The method comprises: receiving a picture of a crop growth area to be detected; segmenting the picture to obtain sub-area pictures; inputting each sub-area picture into a preset crop growth change condition detection model to obtain sub-area detection results; and combining the sub-area detection results to obtain a detection result for the crop growth area to be detected. The application addresses the technical problem of low accuracy in detecting crop growth conditions.

Description

Crop growth change condition detection method, device, equipment and medium
Technical Field
The application relates to the technical field of neural networks of financial science and technology, in particular to a method, a device, equipment and a medium for detecting growth change conditions of crops.
Background
With the continuous development of financial technology, especially internet finance, more and more technologies (such as distributed computing, blockchain, and artificial intelligence) are applied in the finance field. At the same time, the finance industry places higher requirements on these technologies, for example for the distribution of corresponding backlog items in the industry.
With the gradual development of modern agriculture, the growth condition of crops is usually monitored in real time, so that a higher crop yield can be obtained by mastering that condition. At present, crop growth is typically monitored either manually or through satellite remote sensing. Manual monitoring detects growth conditions by sampling, so the accuracy of its detection results is limited. Satellite remote sensing, in turn, has difficulty accurately classifying and counting the growth conditions of crops across different land areas, and is therefore unsatisfactory for overall statistical analysis, again yielding detection results of limited accuracy. The prior art thus suffers from the technical problem of low accuracy in detecting crop growth conditions.
Disclosure of Invention
The main aim of the application is to provide a crop growth change condition detection method, device, equipment and medium, which aim to solve the technical problem of low accuracy of crop growth condition detection in the prior art.
To achieve the above object, the present application provides a crop growth variation condition detection method applied to a crop growth variation condition detection apparatus, the crop growth variation condition detection method comprising:
Receiving a crop growth area picture to be detected, and dividing the crop growth area picture to be detected to obtain a sub-area picture;
inputting each sub-region picture into a preset crop growth change condition detection model to obtain a sub-region detection result;
and combining the detection results of the subregions to obtain the detection result of the crop growth region to be detected.
Optionally, the sub-region picture comprises a first time point sub-region picture and a second time point sub-region picture, the preset crop growth variation condition detection model comprises a convolutional neural network model,
inputting each sub-region picture into a preset crop growth change condition detection model, and obtaining a sub-region detection result comprises the following steps:
inputting each sub-region picture into a preset crop growth change condition detection model to perform frame difference processing on the first time point sub-region picture and the second time point sub-region picture so as to obtain a differential matrix;
and inputting the differential matrix into the convolutional neural network model to obtain the detection result of the subarea.
Optionally, the step of performing frame difference processing on the first time point sub-area picture and the second time point sub-area picture to obtain a differential matrix includes:
Acquiring a first pixel matrix and a second pixel matrix respectively corresponding to the first time point sub-area picture and the second time point sub-area picture;
and subtracting the first pixel matrix and the second pixel matrix to obtain a differential matrix.
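As a hedged illustration of the subtraction step above, a minimal pure-Python sketch of computing a differential matrix from two pixel matrices might look as follows (the function name, the subtraction order, and the toy pixel values are invented for illustration; the patent does not fix these details):

```python
def frame_difference(first_pixels, second_pixels):
    # Element-wise subtraction of the first-time-point pixel matrix
    # from the second-time-point pixel matrix gives the differential matrix.
    if len(first_pixels) != len(second_pixels):
        raise ValueError("sub-region pictures must be the same size")
    return [[b - a for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(first_pixels, second_pixels)]

# Toy 2x2 grayscale sub-region pictures at the two time points
first = [[10, 20], [30, 40]]
second = [[15, 20], [25, 60]]
diff = frame_difference(first, second)
```

Cells where the scene changed between the two time points carry non-zero values in the differential matrix, which is what the downstream model consumes.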
The step of inputting the differential matrix into the convolutional neural network model to obtain a sub-region detection result comprises the following steps:
inputting the differential matrix into the convolutional neural network model, and performing convolutional processing on the differential matrix to obtain a convolutional processing result;
carrying out pooling processing on the convolution processing result to obtain a pooling processing result;
and repeatedly carrying out convolution and pooling alternating processing on the pooling processing result based on the preset convolution pooling times to obtain the sub-region detection result.
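The alternating convolution-pooling steps above can be sketched in pure Python as follows; this is a minimal illustrative implementation with invented function names, a single channel, no padding, and a fixed 2x2 max pool, none of which are specified by the patent:

```python
def conv2d(matrix, kernel):
    # "Valid" 2-D convolution (cross-correlation) of a matrix with a kernel
    kh, kw = len(kernel), len(kernel[0])
    rows = len(matrix) - kh + 1
    cols = len(matrix[0]) - kw + 1
    return [[sum(matrix[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(cols)] for i in range(rows)]

def max_pool(matrix, size=2):
    # Non-overlapping max pooling over size x size windows
    rows = len(matrix) // size
    cols = len(matrix[0]) // size
    return [[max(matrix[i * size + u][j * size + v]
                 for u in range(size) for v in range(size))
             for j in range(cols)] for i in range(rows)]

def conv_pool(matrix, kernel, times):
    # Alternate convolution and pooling the preset number of times
    for _ in range(times):
        matrix = max_pool(conv2d(matrix, kernel))
    return matrix

# One convolution-pooling round on a 5x5 input with a 2x2 all-ones kernel
picture = [[1 if i == j else 0 for j in range(5)] for i in range(5)]
result = conv_pool(picture, [[1, 1], [1, 1]], times=1)
```

Each round shrinks the feature map, so the preset number of convolution-pooling repetitions bounds how far the input is reduced before the detection result is produced.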
Optionally, before the step of inputting each of the sub-region pictures into a preset crop growth change condition detection model to obtain a sub-region detection result, the method further includes:
acquiring a preset basic detection model and preset training data, wherein the preset training data comprises a training differential matrix and a theoretical output result corresponding to the training differential matrix;
inputting the training differential matrix into the preset basic detection model to obtain an actual output result;
Comparing the actual output result with the theoretical output result to obtain a regression loss value;
and adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold value, so as to obtain the preset crop growth change condition detection model.
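The train-until-threshold loop above can be sketched with a deliberately tiny stand-in model; a one-weight least-squares regression is used here purely to show the "compute loss, compare with threshold, adjust, retrain" control flow (the function name, model, learning rate, and threshold are all invented, not the patent's actual detection model):

```python
def train_until_threshold(xs, ys, lr=0.01, loss_threshold=1e-4, max_iters=10000):
    # Fit a single weight w by gradient descent on the mean squared
    # ("regression") loss: adjust the model and retrain until the
    # regression loss value falls below the preset threshold.
    w = 0.0
    loss = float("inf")
    for _ in range(max_iters):
        preds = [w * x for x in xs]
        loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
        if loss < loss_threshold:
            break  # converged: loss below the preset regression loss threshold
        grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
        w -= lr * grad
    return w, loss

# Training pairs whose theoretical output is 2x the input
w, final_loss = train_until_threshold([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

The actual output ("actual output result") is compared against the theoretical output through the loss at every iteration, mirroring the comparison step described above.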
Optionally, before the step of obtaining the preset crop growth variation condition detection model, the method further includes:
acquiring preset verification data, wherein the preset verification data comprise verification differential matrixes and theoretical verification results corresponding to the verification differential matrixes;
inputting each verification differential matrix into the converged preset basic detection model to obtain an actual verification result;
comparing each actual verification result with each theoretical verification result to obtain a plurality of error rates;
comparing each error rate with a preset standard error rate range, counting the number of error rates falling within the preset standard error rate range, and calculating the ratio of that number to the total number of the plurality of error rates;
if the number ratio is smaller than or equal to a preset number ratio threshold, adjusting the preset basic detection model and retraining based on the number ratio;
The step of obtaining the preset crop growth variation condition detection model comprises the following steps:
and if the number ratio is larger than the preset number ratio threshold, determining the preset basic detection model obtained by the corresponding training as the preset crop growth change condition detection model.
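The validation check above, counting how many per-sample error rates fall inside a standard range and comparing that share with a threshold, might look like the following sketch (the function name, the definition of error rate, and the numeric range and threshold are assumptions for illustration):

```python
def validate(actual, theoretical, standard_range=(0.0, 0.05), ratio_threshold=0.9):
    # Per-sample relative error rate of actual vs. theoretical results
    error_rates = [abs(a - t) / t for a, t in zip(actual, theoretical)]
    lo, hi = standard_range
    # Count error rates inside the preset standard error rate range
    in_range = sum(1 for e in error_rates if lo <= e <= hi)
    ratio = in_range / len(error_rates)
    # True -> accept the trained model; False -> adjust and retrain
    return ratio > ratio_threshold, ratio

accepted, ratio = validate([101.0, 99.0, 150.0], [100.0, 100.0, 100.0])
```

Here two of three error rates (1% each) fall inside the 0-5% range, so the ratio 2/3 does not exceed 0.9 and the model would be retrained.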
Optionally, the subarea detection result comprises the variation degree of the growth condition of the subarea crops,
the step of combining the detection results of all the subregions to obtain the detection result of the growth region of the crop to be detected comprises the following steps:
marking each subarea detection result based on the subarea crop growth condition change degree to obtain a subarea growth change degree label;
and obtaining a segmentation template corresponding to the picture of the crop growing area to be detected, and importing the detection result of each subarea and the label of the growth variation degree of each subarea into the segmentation template to obtain the detection result of the crop growing area to be detected.
The application also provides a crop growth change condition detection apparatus. The crop growth change condition detection apparatus is applied to crop growth change condition detection equipment, and comprises:
The segmentation module is used for receiving the crop growth area picture to be detected and segmenting the crop growth area picture to be detected to obtain a sub-area picture;
the detection module is used for inputting each sub-region picture into a preset crop growth change condition detection model to obtain a sub-region detection result;
and the merging module is used for merging the detection results of the subareas to obtain a detection result of the crop growth area to be detected.
Optionally, the detection module includes:
the first input unit is used for inputting each sub-region picture into a preset crop growth change condition detection model so as to perform frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a differential matrix;
and the second input unit is used for inputting the differential matrix into the convolutional neural network model to obtain the detection result of the subarea.
Optionally, the first input unit includes:
the acquisition subunit is used for acquiring a first pixel matrix and a second pixel matrix which correspond to the first time point sub-area picture and the second time point sub-area picture respectively;
And the calculating subunit is used for carrying out subtraction operation on the first pixel matrix and the second pixel matrix to obtain a differential matrix.
Optionally, the second input unit includes:
the convolution subunit is used for inputting the differential matrix into the convolution neural network model, and carrying out convolution processing on the differential matrix to obtain a convolution processing result;
chi Huazi unit, configured to perform pooling processing on the convolution processing result to obtain a pooling processing result;
and the repeated subunit is used for repeatedly carrying out convolution and pooling alternating processing on the pooling processing result based on the preset convolution pooling times to obtain the sub-area detection result.
Optionally, the crop growth variation condition detection apparatus further comprises:
the first acquisition unit is used for acquiring a preset basic detection model and preset training data, wherein the preset training data comprises a training differential matrix and a theoretical output result corresponding to the training differential matrix;
the alternating processing unit is used for inputting the training differential matrix into the preset basic detection model to obtain an actual output result;
the first comparison unit is used for comparing the actual output result with the theoretical output result to obtain a regression loss value;
And the first adjusting unit is used for adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold value, so as to obtain the preset crop growth change condition detection model.
Optionally, the crop growth variation condition detection apparatus further comprises:
the second acquisition unit is used for acquiring preset verification data, wherein the preset verification data comprise verification differential matrixes and theoretical verification results corresponding to the verification differential matrixes;
the third input unit is used for inputting each verification differential matrix into the converged preset basic detection model to obtain an actual verification result;
the second comparison unit is used for comparing each actual verification result with each theoretical verification result to obtain a plurality of error rates;
a statistics unit, configured to compare each error rate with the preset standard error rate range, count the number of error rates falling within the preset standard error rate range, and calculate the ratio of that number to the total number of the plurality of error rates;
the second adjusting unit, configured to adjust the preset basic detection model and retrain based on the number ratio if the number ratio is smaller than or equal to the preset number ratio threshold;
and the determining unit, configured to determine the preset basic detection model obtained by the corresponding training as the preset crop growth change condition detection model if the number ratio is larger than the preset number ratio threshold.
Optionally, the combining module includes:
the identification unit is used for identifying the detection results of each subarea based on the growth condition change degree of the crops in the subarea to obtain a subarea growth change degree label;
the importing unit is used for acquiring a segmentation template corresponding to the picture of the crop growing area to be detected, importing the detection result of each subarea and the growth variation degree label of each subarea into the segmentation template, and obtaining the detection result of the crop growing area to be detected.
The application also provides crop growth change condition detection equipment, comprising: a memory, a processor, and a program of the crop growth change condition detection method stored in the memory and executable on the processor, wherein the program, when executed by the processor, can realize the steps of the crop growth change condition detection method.
The application also provides a medium, which is a readable storage medium storing a program for realizing the crop growth change condition detection method, wherein the program, when executed by a processor, realizes the steps of the crop growth change condition detection method.
According to the application, a picture of the crop growth area to be detected is received and segmented to obtain sub-area pictures; each sub-area picture is input into a preset crop growth change condition detection model to obtain a sub-area detection result; and finally the sub-area detection results are combined to obtain the detection result of the crop growth area to be detected. In other words, the application divides the picture of the crop growth area to be detected into a plurality of sub-area pictures, detects the crop growth condition corresponding to each sub-area picture separately to obtain the corresponding sub-area detection results, and then combines these results into the detection result of the crop growth area to be detected. The application can therefore accurately detect both the overall crop growth condition of the crop growth area to be detected and the distribution of growth conditions across its sub-areas, which solves the technical problem of low accuracy in detecting crop growth conditions in the prior art.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic flow chart of a first embodiment of a method for detecting crop growth variation in the present application;
FIG. 2 is a schematic diagram of dividing a picture of a crop growth area to be detected in the crop growth variation condition detection method of the present application;
FIG. 3 is a schematic diagram showing the combination of the detection results of each sub-region in the crop growth variation detection method of the present application;
FIG. 4 is a flow chart of a second embodiment of a method for detecting crop growth variation according to the present application;
FIG. 5 is a schematic flow chart of obtaining a sub-region detection result in the crop growth change condition detection method of the present application;
FIG. 6 is a flow chart of a third embodiment of a method for detecting crop growth variation according to the present application;
FIG. 7 is a schematic flow chart of training a preset basic detection model in the crop growth variation condition detection method of the present application;
fig. 8 is a schematic device structure diagram of a hardware running environment according to an embodiment of the present application.
The realization, functional characteristics and advantages of the present application will be further described with reference to the embodiments, referring to the attached drawings.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In a first embodiment of the present application, referring to fig. 1, the crop growth variation condition detection method includes:
step S10, receiving a crop growth area picture to be detected, and dividing the crop growth area picture to be detected to obtain a sub-area picture;
in this embodiment, a picture of the crop growth area to be detected is received and segmented to obtain sub-area pictures. Specifically, the picture of the crop growth area to be detected, obtained through a preset shooting mode, is received, where the preset shooting mode includes satellite shooting, aerial shooting, camera shooting, and the like. Further, as shown in fig. 2, a1 to a9 are the sub-area pictures of the picture of the crop growth area to be detected. The picture of the crop growth area to be detected is segmented according to a preset segmentation ratio to obtain the sub-area pictures. For example, assume the picture of the crop growth area to be detected is a rectangle of size X×Y, where X is the picture length and Y is the picture width; if the preset segmentation ratio divides the length into a equal parts and the width into b equal parts, the size of each sub-area picture is (X/a)×(Y/b).
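The segmentation step can be sketched as follows, assuming the length X is divided into a equal parts and the width Y into b equal parts (the function name is invented, the picture is modeled as a 2-D list of pixel values, and exact divisibility is assumed for simplicity):

```python
def split_picture(pixels, a, b):
    # Divide an X-by-Y picture into a parts along its length (rows)
    # and b parts along its width (columns); each sub-area picture
    # then has size (X/a) x (Y/b). Sub-areas are returned row-major.
    X, Y = len(pixels), len(pixels[0])
    sx, sy = X // a, Y // b
    return [[row[j * sy:(j + 1) * sy] for row in pixels[i * sx:(i + 1) * sx]]
            for i in range(a) for j in range(b)]

# A 4x4 toy picture split into 2x2 = 4 sub-area pictures
# (a 3x3 split would likewise yield nine sub-areas, like a1..a9 in fig. 2)
picture = [[4 * i + j for j in range(4)] for i in range(4)]
subs = split_picture(picture, 2, 2)
```

Keeping the sub-areas in row-major order lets the later merging step reassemble them by simply reversing this layout.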
Step S20, inputting each sub-region picture into a preset crop growth change condition detection model to obtain a sub-region detection result;
in this embodiment, it should be noted that the preset crop growth change condition detection model is a trained and reliable model, and the sub-region detection result is the crop growth change condition within a preset time period, where the first time point of the preset time period is the initial time point and the last time point is the detection time point. The crop growth change condition includes changes in crop fruiting, crop yield, crop flowering, the degree of flourishing of crop growth, and other growth changes.
Each sub-region picture is input into the preset crop growth change condition detection model to obtain a sub-region detection result. Specifically, each sub-region picture is input into the preset crop growth change condition detection model; a frame difference is made between the picture of a sub-region corresponding to the initial time point and the picture of the same sub-region corresponding to the detection time point to obtain a differential matrix corresponding to each sub-region picture; and the differential matrix is then input into the convolutional neural network model in the preset crop growth change condition detection model to obtain the sub-region detection result.
And step S30, combining the detection results of all the subregions to obtain the detection result of the crop growth region to be detected.
In this embodiment, it should be noted that the detection result of the crop growth area to be detected is a crop growth variation condition within a preset time period, where a first time point of the preset time period is an initial time point, and a last time point of the preset time period is a detection time point.
The detection results of the sub-areas are combined to obtain the detection result of the crop growth area to be detected. Specifically, the sub-area detection results are combined in the reverse of the segmentation mode used to segment the picture of the crop growth area to be detected, yielding the detection result of the crop growth area to be detected, where this detection result comprises the change in growth condition of the crops in the whole area corresponding to the picture of the crop growth area to be detected as well as the changes in growth condition of the crops in the sub-areas corresponding to the sub-area pictures.
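The merging step, the reverse of the earlier segmentation, might be sketched as follows (the function name and the list-of-results representation are assumptions for illustration):

```python
def merge_results(sub_results, a, b):
    # Reverse of the segmentation: arrange the per-sub-area detection
    # results into an a-by-b grid matching the original split order.
    if len(sub_results) != a * b:
        raise ValueError("expected one detection result per sub-area")
    return [sub_results[i * b:(i + 1) * b] for i in range(a)]

# Nine sub-area results merged back into the 3x3 layout of fig. 3
grid = merge_results([f"out{k}" for k in range(1, 10)], 3, 3)
```

Because the sub-results are kept in the same row-major order produced by the split, no positional metadata beyond the grid shape (a, b) is needed to reassemble them.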
Wherein the detection result of the subarea comprises the change degree of the growth condition of crops in the subarea,
the step of combining the detection results of all the subregions to obtain the detection result of the growth region of the crop to be detected comprises the following steps:
Step S31, marking each subarea detection result based on the subarea crop growth condition change degree to obtain a subarea growth change degree label;
in this embodiment, it should be noted that the degree of change of the sub-area crop growth condition includes the degree of change of sub-area crop yield, of sub-area crop fruiting, of the flourishing of sub-area crops, and the like. The sub-area growth change degree label is used to mark the degree of change in the growth of the sub-area crops; for example, the greater the degree of change, the darker the color of the label may be set. The sub-area growth change degree label includes labels such as two-dimensional codes, bar codes, patterns, and text.
Each sub-area detection result is marked based on the degree of change of the sub-area crop growth condition to obtain a sub-area growth change degree label. Specifically, the degree of change of the sub-area crop growth condition is graded to obtain a grading level for each sub-area detection result, and each sub-area detection result is marked based on its grading level to obtain the corresponding sub-area growth change degree label. For example, assume the degree of change is the change in crop yield and is divided into 3 levels: for a yield change of more than 2000 jin, the sub-area growth change degree label corresponding to the sub-area detection result is set to red; for a change between 1500 and 2000 jin, the label is set to yellow; and for a change of less than 1500 jin, the label is set to white.
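Following the three-level yield example above, the grading could be sketched as a simple threshold mapping (the function name is invented, and the handling of values falling exactly on the 1500/2000 jin boundaries is an assumption, since the text only gives "more than 2000" and "less than 1500"):

```python
def growth_change_label(yield_change_jin):
    # Three-level grading from the example in the text:
    #   above 2000 jin -> red, 1500-2000 jin -> yellow, below 1500 jin -> white
    # Boundary handling at exactly 1500/2000 jin is assumed, not specified.
    if yield_change_jin > 2000:
        return "red"
    if yield_change_jin >= 1500:
        return "yellow"
    return "white"
```

A color ramp like this (darker label for larger change) matches the earlier remark that a greater degree of change may be marked with a darker label.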
Step S32, obtaining a segmentation template corresponding to the picture of the crop growing area to be detected, and importing the detection result of each subarea and the label of the growth variation degree of each subarea into the segmentation template to obtain the detection result of the crop growing area to be detected.
In this embodiment, as shown in fig. 3, the left graph in fig. 3 is the segmentation template after the sub-region detection results and the sub-region growth change degree labels have been imported, where out1 to out9 each include the sub-region detection result and the sub-region growth change degree label of the corresponding sub-region picture, that is, the change in the growth condition of the sub-region crops and its label. The right graph in fig. 3 is the detection result of the crop growth region to be detected, where outl includes out1 to out9 together with the change in growth condition of the crops in the whole region corresponding to the picture of the crop growth region to be detected; for example, assuming the right graph is a dynamic graph, clicking outl may show the left graph.
According to this embodiment, the picture of the crop growth area to be detected is received and segmented to obtain sub-area pictures; each sub-area picture is input into a preset crop growth change condition detection model to obtain a sub-area detection result; and finally the sub-area detection results are combined to obtain the detection result of the crop growth area to be detected. That is, the picture of the crop growth area to be detected is divided into a plurality of sub-area pictures, the crop growth condition corresponding to each sub-area picture is detected separately to obtain the corresponding sub-area detection result, and the sub-area detection results are then combined into the detection result of the crop growth area to be detected. The overall crop growth condition of the area and the distribution of growth conditions across its sub-areas can thus be detected accurately, solving the technical problem of low accuracy in detecting crop growth conditions in the prior art.
Further, referring to fig. 4, in another embodiment of the crop growth variation detection method according to the first embodiment of the present application, the sub-region picture includes a first time point sub-region picture and a second time point sub-region picture, the preset crop growth variation detection model includes a convolutional neural network model,
inputting each sub-region picture into a preset crop growth change condition detection model, and obtaining a sub-region detection result comprises the following steps:
step S21, inputting each sub-region picture into a preset crop growth change condition detection model, so as to perform frame difference processing on the first time point sub-region picture and the second time point sub-region picture, and obtaining a differential matrix;
in this embodiment, it should be noted that, the first time point sub-area picture and the second time point sub-area picture both correspond to the same sub-area of the crop growth area to be detected, the first time point is an initial time point of a preset time period, and the second time point is a detection time point of the preset time period.
Inputting each of the sub-region pictures into a preset crop growth variation condition detection model to perform frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a differential matrix specifically means inputting each of the sub-region pictures into the preset crop growth variation condition detection model to take the frame difference of the first time point sub-region picture and the second time point sub-region picture, that is, performing a subtraction operation on a first pixel matrix corresponding to the first time point sub-region picture and a second pixel matrix corresponding to the second time point sub-region picture to obtain the differential matrix. As shown in fig. 5, a1M and a1n are respectively the first time point sub-region picture and the second time point sub-region picture, Dc11j is the input of the preset crop growth variation condition detection model, CNN is the preset crop growth variation condition detection model, and out1 is the detection result.
In step S21, the step of performing frame difference processing on the first time point sub-area picture and the second time point sub-area picture to obtain a differential matrix includes:
step S211, acquiring a first pixel matrix and a second pixel matrix corresponding to the first time point sub-area picture and the second time point sub-area picture, respectively;
in this embodiment, it should be noted that, the first time point sub-area picture and the second time point sub-area picture may be represented by a digital matrix in the computer, that is, the first pixel matrix and the second pixel matrix corresponding to the first time point sub-area picture and the second time point sub-area picture respectively, where a numerical value in the digital matrix is a pixel value of the picture.
Step S212, subtracting the first pixel matrix and the second pixel matrix to obtain a differential matrix;
in this embodiment, it should be noted that the specifications of the first pixel matrix and the second pixel matrix are the same; for example, assuming that the first pixel matrix is an m×n matrix, the second pixel matrix is also an m×n matrix.
And subtracting the first pixel matrix and the second pixel matrix to obtain a differential matrix, and specifically subtracting corresponding pixel values in the first pixel matrix and the second pixel matrix to obtain the differential matrix.
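The element-wise subtraction of steps S211 and S212 can be sketched as follows. This is an illustrative helper under the assumption that pictures are represented as equally sized 2-D lists of pixel values; the subtraction direction (second time point minus first) is an assumption for the sketch.

```python
def frame_difference(first_pixels, second_pixels):
    """Step S212 sketch: element-wise subtraction of two equally sized pixel matrices,
    here taking the detection-time-point picture minus the initial-time-point picture."""
    if len(first_pixels) != len(second_pixels) or len(first_pixels[0]) != len(second_pixels[0]):
        raise ValueError("pixel matrices must have the same specification")
    return [[b - a for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(first_pixels, second_pixels)]
```

A positive entry then indicates a pixel that brightened between the two time points, a negative entry one that darkened.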
Step S22, inputting the differential matrix into the convolutional neural network model to obtain the detection result of the subarea;
in this embodiment, it should be noted that the convolutional neural network model is a model already trained based on deep learning, and the convolutional neural network model includes a convolutional layer, a pooling layer, a full-connection layer, and other data processing layers.
Inputting the differential matrix into the convolutional neural network model to obtain the sub-region detection result, specifically inputting the differential matrix into the convolutional neural network model, and performing data processing on the differential matrix based on the selected data processing layers such as a convolutional layer, a pooling layer or a full connection layer, wherein the data processing comprises convolution, pooling, full connection and the like, so as to obtain the sub-region detection result, and the number and the type of the data processing layers can be selected and used by a user independently.
The step of inputting the differential matrix into the convolutional neural network model to obtain a sub-region detection result comprises the following steps:
Step S221, inputting the differential matrix into the convolutional neural network model, and performing convolutional processing on the differential matrix to obtain a convolutional processing result;
in this embodiment, it should be noted that the convolution process can be understood as follows: the statistical characteristics of one part of an image are the same as those of other parts, that is, features learned from one part of the image can also appear in other parts, so the learned statistical features can be used as detectors and applied anywhere in the image. In other words, the statistical features learned from a small image patch are convolved with the image features of the original large-size image; mathematically, the convolution multiplies the feature matrix of the image element-wise with each of a number of detection matrices and sums the results, so as to obtain the convolution processing result.
In this embodiment, convolution processing is performed on the differential matrix based on the plurality of preset image features and the plurality of weight matrices to obtain a convolution processing result, specifically, dot multiplication is performed on an image matrix corresponding to the differential matrix and the weight matrix, and then weight summation is performed to obtain the convolution processing result.
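The multiply-and-sum operation described above can be illustrated with a minimal valid-mode 2-D convolution. This is a generic sketch of the standard operation, not the patent's network; the kernel values used in the test are illustrative.

```python
def convolve2d(matrix, kernel):
    """Valid-mode 2-D convolution sketch: slide the kernel over the matrix,
    multiply overlapping entries element-wise, and sum each window."""
    kh, kw = len(kernel), len(kernel[0])
    out_rows = len(matrix) - kh + 1
    out_cols = len(matrix[0]) - kw + 1
    return [[sum(matrix[r + i][c + j] * kernel[i][j]
                 for i in range(kh) for j in range(kw))
             for c in range(out_cols)]
            for r in range(out_rows)]
```

In a trained convolutional neural network the kernel entries are the learned weight matrix; here they would be applied to the differential matrix.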
Step S223, performing pooling processing on the convolution processing result to obtain a pooling processing result;
In this embodiment, the pooling processing includes modes such as maximum pooling and mean pooling. The convolution processing result is pooled to obtain a pooling processing result; specifically, the convolution processing result is first divided into a plurality of pixel matrices of a preset size, and, in the case of maximum pooling, each pixel matrix is replaced by its maximum pixel value, so as to obtain a new image matrix, that is, the pooling processing result.
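The maximum pooling just described can be sketched as follows; this is an illustrative helper that assumes the matrix dimensions divide evenly by the block size.

```python
def max_pool(matrix, size):
    """Maximum pooling sketch: divide the matrix into size x size blocks and
    replace each block by its maximum pixel value (assumes divisible dimensions)."""
    return [[max(matrix[r + i][c + j] for i in range(size) for j in range(size))
             for c in range(0, len(matrix[0]), size)]
            for r in range(0, len(matrix), size)]
```

Mean pooling would differ only in replacing `max` by the block average.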
Step S224, based on a preset convolution pooling frequency, repeatedly performing convolution and pooling alternating processing on the pooling processing result, so as to obtain the sub-region detection result.
In this embodiment, based on a preset number of convolution-pooling rounds, convolution and pooling are repeatedly and alternately applied to the pooling processing result to obtain the sub-region detection result; specifically, the convolution and pooling steps above are executed in a loop until the number of alternating convolution and pooling rounds reaches the preset number, so as to obtain the sub-region detection result. In addition, after the alternating convolution and pooling processing, a fully connected layer may optionally be connected in the convolutional neural network model to perform full connection so as to obtain the sub-region detection result. The full connection can be regarded as a special convolution processing whose result is a one-dimensional vector corresponding to the image; that is, the sub-region feature maps are converted into a one-dimensional vector through full connection, and this one-dimensional vector contains the combined information of all features of the differential matrix, including the crop growth change condition, from which the sub-region detection result is obtained.
According to the embodiment, each sub-region picture is input into a preset crop growth change condition detection model, frame difference processing is performed on the first time point sub-region picture and the second time point sub-region picture to obtain a differential matrix, and the differential matrix is input into the convolutional neural network model to obtain the sub-region detection result. That is, this embodiment provides a method for obtaining the sub-region detection result: each sub-region picture is input into the preset crop growth change condition detection model, which outputs the sub-region detection result, thereby laying a foundation for obtaining the detection result of the crop growth area to be detected and for solving the technical problem of low accuracy of crop growth condition detection in the prior art.
Further, referring to fig. 6, based on the first embodiment and the second embodiment in the present application, in another embodiment of the method for detecting a crop growth variation status, the step of inputting each of the sub-area pictures into a preset crop growth variation status detection model, and obtaining a sub-area detection result includes:
step A10, a preset basic detection model and preset training data are obtained, wherein the preset training data comprise a training differential matrix and theoretical output results corresponding to the training differential matrix;
in this embodiment, it should be noted that the preset basic detection model is a model that has not yet been trained. By photographing the area to be photographed at m different times, image data for the m times can be obtained, where the image data includes pictures of a plurality of sub-areas; a frame difference is then taken between the sub-area pictures of two times, so that the training differential matrix can be obtained. The crop growth conditions of each area in the images of the m times are obtained through websites such as that of the National Bureau of Statistics, so that the crop growth variation condition between any two of the m times can be obtained, that is, the theoretical output result corresponding to the training differential matrix.
Step A20, inputting the training differential matrix into the preset basic detection model to obtain an actual output result;
in this embodiment, the convolution refers to a process of multiplying and summing an image matrix corresponding to an image and a convolution kernel one by one to obtain an image feature value, where the convolution kernel refers to a weight matrix corresponding to a differential matrix feature, and the pooling refers to a process of integrating the image feature values obtained by convolution to obtain a new feature value.
Inputting the training differential matrix into the preset basic detection model to obtain an actual output result, specifically inputting the differential matrix into the preset basic detection model, and performing data processing on the training differential matrix based on the selected data processing layers such as a convolution layer, a pooling layer or a full connection layer, wherein the data processing comprises convolution, pooling, full connection and the like, so as to obtain the actual output result, and the number and the type of the data processing layers can be selected and used by a user independently.
Step A30, comparing the actual output result with the theoretical output result to obtain a regression loss value;
In this embodiment, the actual output result is compared with the theoretical output result to obtain a regression loss value; specifically, the difference value between the actual output result and the theoretical output result is calculated, and the ratio of the difference value to the theoretical output result is taken as the regression loss value.
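The regression loss of step A30 can be written as a one-line helper. Taking the absolute value of the difference is an assumption for the sketch (the text only speaks of "the difference value").

```python
def regression_loss(actual, theoretical):
    """Step A30 sketch: ratio of the (absolute) difference between actual and
    theoretical output results to the theoretical output result."""
    return abs(actual - theoretical) / theoretical
```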
And step A40, adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold value, so as to obtain the preset crop growth change condition detection model.
In this embodiment, based on the regression loss value, the preset basic detection model is adjusted and retrained until the regression loss value is smaller than the preset regression loss threshold. Specifically, the regression loss value is compared with a preset regression loss threshold; when the regression loss value is smaller than the preset regression loss threshold, a verification model corresponding to the preset basic detection model is obtained; when the regression loss value is greater than or equal to the preset regression loss threshold, the weight matrices in the preset basic detection model are adjusted and the model is retrained until the regression loss value is smaller than the preset regression loss threshold, and the preset basic detection model is then set as the preset crop growth change condition detection model. The preset regression loss threshold may be input by a user or a system default threshold may be used; the smaller the preset regression loss threshold, the more accurately the detection model detects.
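The adjust-and-retrain loop of step A40 can be sketched with a toy single-weight model. The linear model, the sign-based weight update, and the learning rate are all illustrative assumptions, not the patent's convolutional network; only the loop structure (compute loss, stop below the preset threshold, otherwise adjust weights and retrain) mirrors the step.

```python
def train_until_converged(weight, data, loss_threshold, lr=0.05, max_iters=10000):
    """Toy stand-in for step A40: repeatedly compute the regression loss of a
    one-weight model y = weight * x; stop once the loss falls below the preset
    regression loss threshold, otherwise adjust the weight and retrain."""
    for _ in range(max_iters):
        loss = sum(abs(weight * x - y) / y for x, y in data) / len(data)
        if loss < loss_threshold:
            break
        # Nudge the weight toward lower loss (sign-based update for the sketch).
        grad = sum((1 if weight * x > y else -1) * x / y for x, y in data) / len(data)
        weight -= lr * grad
    return weight
```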
The step of obtaining the preset crop growth variation condition detection model comprises the following steps:
step B10, acquiring preset verification data, wherein the preset verification data comprise verification differential matrixes and theoretical verification results corresponding to the verification differential matrixes;
in this embodiment, it should be noted that by photographing the region to be photographed at m different times, image data for the m times may be obtained, where the image data includes pictures of a plurality of sub-regions; a frame difference is then taken between the sub-region pictures of two times to obtain the verification differential matrix. The crop growth condition of each region in the images of the m times is obtained through websites such as that of the National Bureau of Statistics, so that the crop growth variation condition between any two of the m times may be obtained, that is, the theoretical verification result corresponding to the verification differential matrix.
Step B20, inputting each verification differential matrix into the converged preset basic detection model to obtain an actual verification result;
in this embodiment, it should be noted that the converged preset basic detection model refers to a preset basic detection model whose regression loss value is smaller than the preset regression loss threshold.
Inputting each verification differential matrix into the preset basic detection model which is converged to obtain an actual verification result, specifically inputting the verification differential matrix into the preset basic detection model which is converged, and carrying out data processing on the verification differential matrix based on a selected data processing layer such as a convolution layer, a pooling layer or a full connection layer, wherein the data processing comprises convolution, pooling, full connection and the like, and further obtaining the actual verification result, and the number and the type of the data processing layers can be selected and used by a user independently.
Step B30, comparing each actual verification result with each theoretical verification result to obtain a plurality of error rates;
in this embodiment, it should be noted that the number of actual verification results should be sufficient to ensure the reliability of the verification of the verification model. Each actual verification result is compared with each theoretical verification result to obtain a plurality of error rates; specifically, each actual verification result is compared with the theoretical verification result corresponding to it, the error value between each actual verification result and its theoretical verification result is calculated, and the ratio between the error value and the theoretical verification result is then calculated to obtain the plurality of error rates. For example, assuming that the actual verification result is a crop yield of 900 jin per mu and the theoretical verification result is a crop yield of 1000 jin per mu, the error value is 100 jin and the error rate is 10%.
Step B40, comparing each error rate with a preset standard error rate range, counting the number of error rates within the preset standard error rate range, and calculating the number duty ratio of that number relative to the plurality of error rates;
in this embodiment, each error rate is compared with a preset standard error rate range, the number of error rates falling within the preset standard error rate range is counted, and the ratio of that number to the total number of error rates is then calculated to obtain the number duty ratio.
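Steps B30 and B40 can be sketched together. Treating the preset standard error rate range as [0, max_error_rate] and taking the absolute error are assumptions for this illustration.

```python
def error_rate(actual, theoretical):
    """Step B30 sketch: error value over the theoretical verification result."""
    return abs(actual - theoretical) / theoretical

def number_duty_ratio(error_rates, max_error_rate):
    """Step B40 sketch: count error rates within the preset standard error rate
    range [0, max_error_rate] and return that count's share of all error rates."""
    within = sum(1 for e in error_rates if 0 <= e <= max_error_rate)
    return within / len(error_rates)
```

The resulting ratio would then be compared with the preset number duty ratio threshold in steps B50 and B60.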
Step B50, if the number duty ratio is smaller than or equal to a preset number duty ratio threshold value, adjusting and retraining the preset basic detection model based on the number duty ratio;
in this embodiment, if the number duty ratio is smaller than or equal to a preset number duty ratio threshold, the preset basic detection model is adjusted and retrained based on the number duty ratio; specifically, the weight matrices in the preset basic detection model are adjusted, and the model is retrained and verified until the number duty ratio is greater than the preset number duty ratio threshold.
The step of obtaining the preset crop growth change condition detection model comprises the following steps:
and step B60, if the number duty ratio is larger than a preset number duty ratio threshold value, determining a preset basic detection model obtained through corresponding training as the preset crop growth change condition detection model.
In this embodiment, it should be noted that, if the number ratio is greater than the preset number ratio threshold, it is determined that the prediction accuracy of the preset basic detection model has reached the requirement of the preset crop growth variation condition detection model, that is, the preset basic detection model may be used as the preset crop growth variation condition detection model.
According to the embodiment, a preset basic detection model and preset training data are obtained, wherein the preset training data include a training differential matrix and a theoretical output result corresponding to the training differential matrix; the training differential matrix is input into the preset basic detection model and subjected to a preset number of alternating convolution and pooling rounds to obtain an actual output result; the actual output result is compared with the theoretical output result to obtain a regression loss value; and finally, the preset basic detection model is adjusted and retrained based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold, so as to obtain the preset crop growth change condition detection model. That is, the application provides a method for training the preset crop growth change condition detection model, which can train the preset basic detection model into the preset crop growth change condition detection model, thereby laying a foundation for obtaining the sub-region detection results and for solving the technical problem of low accuracy of crop growth condition detection in the prior art.
Referring to fig. 8, fig. 8 is a schematic device structure diagram of a hardware running environment according to an embodiment of the present application.
As shown in fig. 8, the crop growth variation condition detection apparatus may include: a processor 1001, such as a CPU, memory 1005, and a communication bus 1002. Wherein a communication bus 1002 is used to enable connected communication between the processor 1001 and a memory 1005. The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
Optionally, the crop growth variation status detection apparatus may further include a user interface, a network interface, a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, and the like. The user interface may include a Display screen (Display) and an input sub-module such as a Keyboard (Keyboard); optionally, the user interface may also include a standard wired interface and a wireless interface. The network interface may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface).
It will be appreciated by those skilled in the art that the crop growth variation condition detection apparatus structure illustrated in fig. 8 does not constitute a limitation of the crop growth variation condition detection apparatus and may include more or fewer components than illustrated, or may combine certain components, or may be arranged in a different arrangement of components.
As shown in fig. 8, an operating system, a network communication module, and a crop growth variation condition detection program may be included in a memory 1005 as one type of computer storage medium. The operating system is a program for managing and controlling the hardware and software resources of the crop growth variation detection apparatus, and supports the operation of the crop growth variation detection program and other software and/or programs. The network communication module is used to enable communication between the components within the memory 1005 and with other hardware and software in the crop growth variation detection system.
In the crop growth variation situation detection apparatus shown in fig. 8, a processor 1001 is configured to execute a crop growth variation situation detection program stored in a memory 1005, implementing the steps of the crop growth variation situation detection method described in any one of the above.
The specific embodiment of the crop growth variation condition detection apparatus is basically the same as the embodiments of the crop growth variation condition detection method described above, and will not be described herein again.
The embodiment of the application also provides a crop growth change condition detection apparatus, the crop growth change condition detection apparatus includes:
The segmentation module is used for receiving the crop growth area picture to be detected and segmenting the crop growth area picture to be detected to obtain a sub-area picture;
the detection module is used for inputting each sub-region picture into a preset crop growth change condition detection model to obtain a sub-region detection result;
and the merging module is used for merging the detection results of the subareas to obtain a detection result of the crop growth area to be detected.
Optionally, the detection module includes:
the first input unit is used for inputting each sub-region picture into a preset crop growth change condition detection model so as to perform frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a differential matrix;
and the second input unit is used for inputting the differential matrix into the convolutional neural network model to obtain the detection result of the subarea.
Optionally, the first input unit includes:
the acquisition subunit is used for acquiring a first pixel matrix and a second pixel matrix which correspond to the first time point sub-area picture and the second time point sub-area picture respectively;
And the calculating subunit is used for carrying out subtraction operation on the first pixel matrix and the second pixel matrix to obtain a differential matrix.
Optionally, the second input unit includes:
the convolution subunit is used for inputting the differential matrix into the convolution neural network model, and carrying out convolution processing on the differential matrix to obtain a convolution processing result;
chi Huazi unit, configured to perform pooling processing on the convolution processing result to obtain a pooling processing result;
and the repeated subunit is used for repeatedly carrying out convolution and pooling alternating processing on the pooling processing result based on the preset convolution pooling times to obtain the sub-area detection result.
Optionally, the crop growth variation condition detection apparatus further comprises:
the first acquisition unit is used for acquiring a preset basic detection model and preset training data, wherein the preset training data comprises a training differential matrix and a theoretical output result corresponding to the training differential matrix;
the alternating processing unit is used for inputting the training differential matrix into the preset basic detection model to obtain an actual output result;
the first comparison unit is used for comparing the actual output result with the theoretical output result to obtain a regression loss value;
And the first adjusting unit is used for adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold value, so as to obtain the preset crop growth change condition detection model.
Optionally, the crop growth variation condition detection apparatus further comprises:
the second acquisition unit is used for acquiring preset verification data, wherein the preset verification data comprise verification differential matrixes and theoretical verification results corresponding to the verification differential matrixes;
the third input unit is used for inputting each verification differential matrix into the converged preset basic detection model to obtain an actual verification result;
the second comparison unit is used for comparing each actual verification result with each theoretical verification result to obtain a plurality of error rates;
a statistics unit, configured to compare each error rate with a preset standard error rate range, count the number of error rates within the preset standard error rate range, and calculate the number duty ratio of that number relative to the plurality of error rates;
The second adjusting unit is used for adjusting the preset basic detection model and retraining based on the quantity duty ratio if the quantity duty ratio is smaller than or equal to a preset quantity duty ratio threshold value;
and the determining unit is used for determining a preset basic detection model obtained by corresponding training as the preset crop growth change condition detection model if the number ratio is larger than a preset number ratio threshold value.
Optionally, the combining module includes:
the identification unit is used for identifying the detection results of each subarea based on the growth condition change degree of the crops in the subarea to obtain a subarea growth change degree label;
the importing unit is used for acquiring a segmentation template corresponding to the picture of the crop growing area to be detected, importing the detection result of each subarea and the growth variation degree label of each subarea into the segmentation template, and obtaining the detection result of the crop growing area to be detected.
The specific embodiment of the crop growth variation detection apparatus is substantially the same as the embodiments of the crop growth variation detection method described above, and will not be described herein.
Embodiments of the present application provide a medium, which is a readable storage medium, and which stores one or more programs, and the one or more programs are further executable by one or more processors to implement the steps of the crop growth variation condition detection method described in any one of the above.
The specific embodiments of the medium in the present application are basically the same as the above embodiments of the method for detecting the growth change condition of crops, and are not described herein again.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the claims, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application, or direct or indirect application in other related technical fields are included in the scope of the claims.

Claims (8)

1. A method for detecting a crop growth change condition, the method comprising:
receiving a picture of a crop growth area to be detected, and segmenting the picture of the crop growth area to be detected to obtain sub-area pictures, wherein each sub-area picture comprises an initial-time-point sub-area picture of a preset time period and a detection-time-point sub-area picture of the preset time period;
inputting each initial-time-point sub-area picture and each detection-time-point sub-area picture into a preset crop growth change condition detection model to obtain a sub-area detection result;
labeling each sub-area detection result according to its grading level to obtain a corresponding sub-area growth change degree label, wherein the sub-area growth change degree label marks the degree of change of the crop growth condition in the sub-area; and
obtaining a segmentation template corresponding to the picture of the crop growth area to be detected, and importing each sub-area detection result and each sub-area growth change degree label into the segmentation template to obtain a detection result for the crop growth area to be detected, wherein the detection result comprises an overall-area crop growth change condition corresponding to the picture of the crop growth area to be detected and a sub-area crop growth change condition corresponding to each sub-area picture, the overall-area crop growth change condition is obtained by merging the sub-area crop growth change conditions, and each sub-area crop growth change condition can be dynamically displayed by clicking on the overall-area crop growth change condition;
wherein the preset crop growth change condition detection model comprises a convolutional neural network model, and the step of inputting each initial-time-point sub-area picture and each detection-time-point sub-area picture into the preset crop growth change condition detection model to obtain a sub-area detection result comprises:
inputting each initial-time-point sub-area picture and each detection-time-point sub-area picture into the preset crop growth change condition detection model, and performing frame difference processing on the initial-time-point sub-area picture and the detection-time-point sub-area picture to obtain a differential matrix; and inputting the differential matrix into the convolutional neural network model to obtain the sub-area detection result.
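The segmentation step of claim 1 can be illustrated by tiling the area picture into a regular grid of sub-area pictures. The claim leaves the segmentation template unspecified, so the grid geometry and the return format below are illustrative assumptions only:

```python
import numpy as np

def split_into_subareas(picture: np.ndarray, rows: int, cols: int):
    """Divide an area picture into a rows x cols grid of sub-area pictures,
    returning the tiles together with a template recording each tile's
    grid position (used later when merging sub-area results back)."""
    h, w = picture.shape[:2]
    th, tw = h // rows, w // cols
    tiles, template = [], []
    for r in range(rows):
        for c in range(cols):
            tiles.append(picture[r * th:(r + 1) * th, c * tw:(c + 1) * tw])
            template.append((r, c))  # grid position of this sub-area
    return tiles, template

# A 6x6 stand-in for an area picture, split into six 3x2 sub-area pictures.
area = np.arange(36).reshape(6, 6)
tiles, template = split_into_subareas(area, 2, 3)
```

The same template can then be reused in the merging step, since it records where each sub-area detection result belongs in the overall area.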
2. The method for detecting a crop growth change condition according to claim 1, wherein the step of performing frame difference processing on the initial-time-point sub-area picture and the detection-time-point sub-area picture to obtain a differential matrix comprises:
acquiring a first pixel matrix and a second pixel matrix respectively corresponding to the initial-time-point sub-area picture and the detection-time-point sub-area picture; and
performing a subtraction between the first pixel matrix and the second pixel matrix to obtain the differential matrix.
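The frame-difference step of claim 2 amounts to an element-wise subtraction of the two pixel matrices. A minimal NumPy sketch follows; the array shapes and the signed-integer cast are illustrative assumptions, not specified by the claim:

```python
import numpy as np

def frame_difference(initial_pic: np.ndarray, detection_pic: np.ndarray) -> np.ndarray:
    """Subtract the initial-time-point pixel matrix from the
    detection-time-point pixel matrix to obtain the differential matrix."""
    # Cast to a signed type so negative changes are not wrapped by uint8 arithmetic.
    return detection_pic.astype(np.int16) - initial_pic.astype(np.int16)

# Two 4x4 grayscale sub-area pictures at the two time points.
initial = np.full((4, 4), 100, dtype=np.uint8)
detection = np.full((4, 4), 130, dtype=np.uint8)
diff = frame_difference(initial, detection)  # every entry is 30
```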
3. The method for detecting a crop growth change condition according to claim 1, wherein the step of inputting the differential matrix into the convolutional neural network model to obtain the sub-area detection result comprises:
inputting the differential matrix into the convolutional neural network model, and performing convolution processing on the differential matrix to obtain a convolution processing result;
performing pooling processing on the convolution processing result to obtain a pooling processing result; and
repeatedly performing alternating convolution and pooling processing on the pooling processing result, based on a preset number of convolution-pooling iterations, to obtain the sub-area detection result.
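The alternating convolution and pooling of claim 3 can be sketched in plain NumPy. The kernel size, pooling window, and number of iterations below are illustrative choices; the claim fixes only the alternation pattern and a preset iteration count:

```python
import numpy as np

def conv2d(x: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid 2-D convolution (cross-correlation) of a single-channel map."""
    kh, kw = kernel.shape
    h, w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x: np.ndarray, size: int = 2) -> np.ndarray:
    """Non-overlapping max pooling with the given window size."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

def conv_pool_stack(diff_matrix, kernels, pool_size=2):
    """Alternate convolution and pooling once per kernel; the number of
    kernels plays the role of the preset convolution-pooling count."""
    x = diff_matrix
    for k in kernels:
        x = max_pool(conv2d(x, k), pool_size)
    return x

diff = np.arange(64, dtype=float).reshape(8, 8)  # stand-in differential matrix
kernels = [np.ones((3, 3)) / 9.0]                # one 3x3 averaging kernel
feat = conv_pool_stack(diff, kernels)            # 8x8 -> conv 6x6 -> pool 3x3
```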
4. The method for detecting a crop growth change condition according to claim 1, wherein before the step of inputting each initial-time-point sub-area picture and each detection-time-point sub-area picture into a preset crop growth change condition detection model to obtain a sub-area detection result, the method further comprises:
acquiring a preset basic detection model and preset training data, wherein the preset training data comprise a training differential matrix and a theoretical output result corresponding to the training differential matrix;
inputting the training differential matrix into the preset basic detection model to obtain an actual output result;
comparing the actual output result with the theoretical output result to obtain a regression loss value; and
adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than a preset regression loss threshold, so as to obtain the preset crop growth change condition detection model.
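The training loop of claim 4 (compute a regression loss, adjust the model, and retrain until the loss falls below a preset threshold) can be sketched with a linear layer standing in for the preset basic detection model. The data, learning rate, and threshold are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 32 flattened 4x4 training differential
# matrices and the theoretical output result for each.
X = rng.normal(size=(32, 16))
true_w = rng.normal(size=16)
y = X @ true_w                        # theoretical output results

w = np.zeros(16)                      # parameters of the basic detection model
threshold, lr = 1e-4, 0.05            # preset regression loss threshold, step size

loss = np.mean((X @ w - y) ** 2)      # regression (mean squared error) loss
while loss >= threshold:              # retrain until loss < preset threshold
    grad = 2 * X.T @ (X @ w - y) / len(X)  # gradient of the MSE loss
    w -= lr * grad                    # adjust the model
    loss = np.mean((X @ w - y) ** 2)
```

On exit, `w` is the trained model that the claims then call the preset crop growth change condition detection model.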
5. The method for detecting a crop growth change condition according to claim 4, wherein before the step of obtaining the preset crop growth change condition detection model, the method further comprises:
acquiring preset verification data, wherein the preset verification data comprise verification differential matrices and theoretical verification results corresponding to the verification differential matrices;
inputting each verification differential matrix into the converged preset basic detection model to obtain an actual verification result;
comparing each actual verification result with the corresponding theoretical verification result to obtain a plurality of error rates;
comparing each error rate with a preset standard error rate range, counting the number of error rates falling within the preset standard error rate range, and computing the proportion of that number relative to the total number of error rates; and
if the proportion is smaller than or equal to a preset proportion threshold, adjusting and retraining the preset basic detection model based on the proportion;
wherein the step of obtaining the preset crop growth change condition detection model comprises:
if the proportion is larger than the preset proportion threshold, determining the correspondingly trained preset basic detection model as the preset crop growth change condition detection model.
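The validation criterion of claim 5 (the proportion of per-sample error rates falling inside a preset standard range) can be sketched as follows. The relative-error definition and all numbers are illustrative assumptions, since the claim does not define how an error rate is computed:

```python
def validation_pass_ratio(actual_results, theoretical_results, low, high):
    """Per claim 5: compute an error rate for each verification sample and
    return the proportion of error rates inside the preset standard range."""
    error_rates = [abs(a - t) / abs(t)
                   for a, t in zip(actual_results, theoretical_results)]
    in_range = sum(1 for e in error_rates if low <= e <= high)
    return in_range / len(error_rates)

# Hypothetical actual vs. theoretical verification results.
actual = [0.98, 1.01, 2.05, 3.50]
theory = [1.00, 1.00, 2.00, 3.00]
ratio = validation_pass_ratio(actual, theory, 0.0, 0.05)  # 3 of 4 in range

# The trained model is accepted only when the proportion exceeds the
# preset proportion threshold; otherwise it is adjusted and retrained.
accepted = ratio > 0.5
```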
6. A crop growth change condition detection apparatus, wherein the crop growth change condition detection apparatus is applied to a crop growth change condition detection device, the apparatus comprising:
a segmentation module, configured to receive a picture of a crop growth area to be detected, and segment the picture of the crop growth area to be detected to obtain sub-area pictures, wherein each sub-area picture comprises an initial-time-point sub-area picture of a preset time period and a detection-time-point sub-area picture of the preset time period;
a detection module, configured to input each initial-time-point sub-area picture and each detection-time-point sub-area picture into a preset crop growth change condition detection model to obtain a sub-area detection result;
a labeling module, configured to label each sub-area detection result according to its grading level to obtain a corresponding sub-area growth change degree label, wherein the sub-area growth change degree label marks the degree of change of the crop growth condition in the sub-area; and
a merging module, configured to acquire a segmentation template corresponding to the picture of the crop growth area to be detected, and import each sub-area detection result and each sub-area growth change degree label into the segmentation template to obtain a detection result for the crop growth area to be detected, wherein the detection result comprises an overall-area crop growth change condition corresponding to the picture of the crop growth area to be detected and a sub-area crop growth change condition corresponding to each sub-area picture, the overall-area crop growth change condition is obtained by merging the sub-area crop growth change conditions, each sub-area crop growth change condition can be dynamically displayed by clicking on the overall-area crop growth change condition, and the preset crop growth change condition detection model comprises a convolutional neural network model;
wherein the detection module is further configured to:
input each initial-time-point sub-area picture and each detection-time-point sub-area picture into the preset crop growth change condition detection model, perform frame difference processing on the initial-time-point sub-area picture and the detection-time-point sub-area picture to obtain a differential matrix, and input the differential matrix into the convolutional neural network model to obtain the sub-area detection result.
7. A crop growth change condition detection device, wherein the crop growth change condition detection device comprises a memory and a processor,
the memory being configured to store a program for implementing the crop growth change condition detection method; and
the processor being configured to execute the program to implement the steps of the crop growth change condition detection method according to any one of claims 1 to 5.
8. A medium having stored thereon a program for implementing the crop growth change condition detection method, wherein the program, when executed by a processor, implements the steps of the crop growth change condition detection method according to any one of claims 1 to 5.
CN201911096106.5A 2019-11-11 2019-11-11 Crop growth change condition detection method, device, equipment and medium Active CN110827269B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911096106.5A CN110827269B (en) 2019-11-11 2019-11-11 Crop growth change condition detection method, device, equipment and medium


Publications (2)

Publication Number Publication Date
CN110827269A CN110827269A (en) 2020-02-21
CN110827269B true CN110827269B (en) 2024-03-05

Family

ID=69553956

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911096106.5A Active CN110827269B (en) 2019-11-11 2019-11-11 Crop growth change condition detection method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN110827269B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111369152A (en) * 2020-03-06 2020-07-03 深圳前海微众银行股份有限公司 Agricultural land value evaluation optimization method, device and equipment and readable storage medium
CN112800929B (en) * 2021-01-25 2022-05-31 安徽农业大学 Bamboo shoot quantity and high growth rate online monitoring method based on deep learning
CN113418509A (en) * 2021-05-20 2021-09-21 中国农业科学院烟草研究所(中国烟草总公司青州烟草研究所) Automatic target-aiming detection device and detection method for agriculture
CN116228454A (en) * 2023-03-20 2023-06-06 广东七天牧草种养殖有限公司 Big data-based planting management control method, system and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201010435A (en) * 2008-08-22 2010-03-01 Hon Hai Prec Ind Co Ltd Video monitoring system and method thereof
CN108198239A (en) * 2017-12-27 2018-06-22 中山大学 A kind of three-dimensional visualization method for realizing blood vessel dynamic simulation
CN108776772A (en) * 2018-05-02 2018-11-09 北京佳格天地科技有限公司 Across the time building variation detection modeling method of one kind and detection device, method and storage medium
CN110163294A (en) * 2019-05-29 2019-08-23 广东工业大学 Remote Sensing Imagery Change method for detecting area based on dimensionality reduction operation and convolutional network
CN110378224A (en) * 2019-06-14 2019-10-25 香港理工大学深圳研究院 A kind of detection method of feature changes, detection system and terminal




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant