CN110827269A - Crop growth change condition detection method, device, equipment and medium - Google Patents

Crop growth change condition detection method, device, equipment and medium

Info

Publication number
CN110827269A
Authority
CN
China
Prior art keywords: preset, sub, crop growth, picture, detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911096106.5A
Other languages
Chinese (zh)
Other versions
CN110827269B (en)
Inventor
汪飙
邹冲
李世行
张元梵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WeBank Co Ltd filed Critical WeBank Co Ltd
Priority to CN201911096106.5A priority Critical patent/CN110827269B/en
Publication of CN110827269A publication Critical patent/CN110827269A/en
Application granted granted Critical
Publication of CN110827269B publication Critical patent/CN110827269B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T7/0002 Image analysis: Inspection of images, e.g. flaw detection
    • G06N3/04 Neural networks: Architecture, e.g. interconnection topology
    • G06N3/08 Neural networks: Learning methods
    • G06T7/11 Segmentation; Edge detection: Region-based segmentation
    • G06T7/187 Segmentation; Edge detection: involving region growing; involving region merging; involving connected component labelling
    • G06T2207/20081 Special algorithmic details: Training; Learning
    • G06T2207/20084 Special algorithmic details: Artificial neural networks [ANN]
    • G06T2207/30188 Subject of image: Vegetation; Agriculture

Abstract

The application discloses a crop growth change condition detection method, device, equipment and medium. The method comprises: receiving a picture of a crop growth area to be detected; segmenting the picture to obtain sub-region pictures; inputting each sub-region picture into a preset crop growth change condition detection model to obtain sub-region detection results; and combining the sub-region detection results to obtain the detection result of the crop growth area to be detected. The method solves the technical problem of low accuracy in crop growth condition detection.

Description

Crop growth change condition detection method, device, equipment and medium
Technical Field
The present application relates to the technical field of neural networks in financial technology, and in particular to a method, device, equipment and medium for detecting crop growth change conditions.
Background
With the continuous development of financial technology, especially internet finance, more and more technologies (such as distributed computing, blockchain and artificial intelligence) are applied in the financial field, and the financial industry in turn places higher requirements on these technologies.
With the gradual development of modern agriculture, crop growth conditions are usually monitored in real time so that higher crop yields can be obtained. At present, crop growth is typically monitored either manually or by technologies such as satellite remote sensing. Manual monitoring relies on sampling, so the accuracy of its detection results is not high. Satellite remote sensing struggles to accurately classify and count the crop growth conditions of different plots, so the overall statistical analysis is unsatisfactory and the accuracy of its detection results is likewise not high. The prior art therefore suffers from the technical problem of low accuracy in crop growth condition detection.
Disclosure of Invention
The main purpose of the present application is to provide a crop growth change condition detection method, device, equipment and medium, aiming to solve the technical problem of low accuracy of crop growth condition detection in the prior art.
To achieve the above object, the present application provides a crop growth change condition detection method applied to crop growth change condition detection equipment, the method comprising:
receiving a picture of a crop growth area to be detected, and segmenting the picture of the crop growth area to be detected to obtain a sub-area picture;
inputting each subregion picture into a preset crop growth change condition detection model to obtain subregion detection results;
and combining the detection results of the sub-areas to obtain the detection result of the crop growth area to be detected.
Optionally, the sub-region picture includes a first time point sub-region picture and a second time point sub-region picture, the preset crop growth change condition detection model includes a convolutional neural network model,
the step of inputting each subregion picture into a preset crop growth change condition detection model to obtain subregion detection results comprises the following steps:
inputting each subregion picture into a preset crop growth change condition detection model to perform frame difference processing on the first time point subregion picture and the second time point subregion picture to obtain a difference matrix;
and inputting the difference matrix into the convolutional neural network model to obtain the detection result of the sub-region.
Optionally, the step of performing frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a difference matrix includes:
acquiring a first pixel matrix and a second pixel matrix corresponding to the first time point sub-region picture and the second time point sub-region picture respectively;
and carrying out subtraction operation on the first pixel matrix and the second pixel matrix to obtain a difference matrix.
The step of inputting the difference matrix into the convolutional neural network model to obtain the detection result of the sub-region comprises:
inputting the difference matrix into the convolutional neural network model, and performing convolution processing on the difference matrix to obtain a convolution processing result;
performing pooling processing on the convolution processing result to obtain a pooling processing result;
and repeatedly performing alternating convolution and pooling on the pooling processing result, based on a preset number of convolution-pooling rounds, to obtain the sub-region detection result.
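The alternating convolution and pooling described in these steps can be sketched in plain Python. This is a toy illustration, not the patented model: the 3x3 all-ones kernel, 2x2 max pooling and single round are assumptions, since the application does not fix the kernel size, pooling type or number of rounds.

```python
# Toy sketch of alternating convolution and pooling on a difference matrix.
# Kernel, pooling window and round count are illustrative assumptions.

def conv2d_valid(mat, kernel):
    """'Valid' 2-D convolution (cross-correlation) of mat with kernel."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(mat) - kh + 1, len(mat[0]) - kw + 1
    return [[sum(mat[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(ow)] for i in range(oh)]

def max_pool2x2(mat):
    """Non-overlapping 2x2 max pooling."""
    return [[max(mat[i][j], mat[i][j + 1], mat[i + 1][j], mat[i + 1][j + 1])
             for j in range(0, len(mat[0]) - 1, 2)]
            for i in range(0, len(mat) - 1, 2)]

def conv_pool_repeat(diff_matrix, kernel, rounds):
    out = diff_matrix
    for _ in range(rounds):  # convolution and pooling alternate each round
        out = max_pool2x2(conv2d_valid(out, kernel))
    return out

# Usage: one round on a 6x6 difference matrix with a 3x3 all-ones kernel.
diff = [[float(r * 6 + c) for c in range(6)] for r in range(6)]
kernel = [[1.0] * 3] * 3
result = conv_pool_repeat(diff, kernel, rounds=1)
```

A real detector would also append fully connected layers after the final pooling; this sketch only shows the alternation the claim describes.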
Optionally, the step of inputting each of the sub-region pictures into a preset crop growth change condition detection model to obtain a sub-region detection result includes:
acquiring a preset basic detection model and preset training data, wherein the preset training data comprise a training difference matrix and a theoretical output result corresponding to the training difference matrix;
inputting the training difference matrix into the preset basic detection model to obtain an actual output result;
comparing the actual output result with the theoretical output result to obtain a regression loss value;
and adjusting the preset basic detection model and training again based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold value, and obtaining the preset crop growth change condition detection model.
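The train-compare-adjust loop above can be sketched as follows. As a hedged illustration, a one-parameter linear model with a mean-squared-error regression loss stands in for the actual preset basic detection model, which the application describes as a convolutional network; the learning rate and threshold are assumptions.

```python
# Sketch of training until the regression loss falls below a preset
# threshold. A toy one-parameter model replaces the real CNN.

def train_until_threshold(samples, lr=0.01, loss_threshold=1e-4, max_steps=10000):
    """samples: list of (training_input, theoretical_output) pairs."""
    w = 0.0  # the model's single adjustable parameter
    for _ in range(max_steps):
        # Forward pass: actual outputs for every training input.
        actual = [w * x for x, _ in samples]
        # Regression loss: mean squared error against theoretical outputs.
        loss = sum((a - y) ** 2 for a, (_, y) in zip(actual, samples)) / len(samples)
        if loss < loss_threshold:  # converged: accept the model
            return w, loss
        # Adjust the model based on the loss, then train again.
        grad = sum(2 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad
    return w, loss

# Usage: data generated by y = 2x, so training should recover w close to 2.
samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w, loss = train_until_threshold(samples)
```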
Optionally, before the step of obtaining the preset crop growth change condition detection model, the method further includes:
acquiring preset verification data, wherein the preset verification data comprise verification difference matrixes and theoretical verification results corresponding to the verification difference matrixes;
inputting each verification differential matrix into the converged preset basic detection model to obtain an actual verification result;
comparing each actual verification result with each theoretical verification result to obtain a plurality of error rates;
comparing each error rate with a preset standard error rate range, counting the number of error rates that fall within the preset standard error rate range, and computing the ratio of that number to the total number of error rates;
if the number ratio is smaller than or equal to a preset number ratio threshold value, adjusting the preset basic detection model based on the number ratio and training again;
the step of obtaining the preset crop growth change condition detection model comprises the following steps:
and if the number ratio is larger than a preset number ratio threshold value, determining a preset basic detection model obtained by corresponding training as the preset crop growth change condition detection model.
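The validation gate in these steps, counting the error rates that fall inside a standard range and comparing the resulting number ratio with a threshold, might look like this sketch. The 10% error range and 0.9 ratio threshold are illustrative assumptions.

```python
# Sketch of the validation check: the model is accepted only if the
# fraction of per-sample error rates within the standard range exceeds
# a preset number-ratio threshold. Range and threshold are assumptions.

def validation_passes(actual, theoretical, max_error_rate=0.1, ratio_threshold=0.9):
    # One error rate per validation sample.
    error_rates = [abs(a - t) / t for a, t in zip(actual, theoretical)]
    # Count error rates inside the preset standard range.
    in_range = sum(1 for e in error_rates if e <= max_error_rate)
    # Ratio of in-range error rates to all error rates.
    number_ratio = in_range / len(error_rates)
    return number_ratio > ratio_threshold

theoretical = [100.0, 200.0, 300.0, 400.0]
ok = validation_passes([98.0, 205.0, 290.0, 390.0], theoretical)   # all within 10%
bad = validation_passes([50.0, 205.0, 150.0, 390.0], theoretical)  # only 2 of 4 within 10%
```

When the gate fails, the training loop above resumes; when it passes, the converged model becomes the preset crop growth change condition detection model.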
Optionally, the sub-area detection result comprises the change degree of the growth condition of the crop in the sub-area,
the step of combining the detection results of the sub-areas to obtain the detection result of the growth area of the crop to be detected comprises the following steps:
identifying the detection result of each subregion based on the change degree of the crop growth condition of the subregion to obtain a subregion growth change degree label;
and acquiring a segmentation template corresponding to the picture of the crop growth area to be detected, and introducing the detection result of each sub-area and the growth change degree label of each sub-area into the segmentation template to obtain the detection result of the crop growth area to be detected.
The present application further provides a crop growth change condition detection device, which is applied to crop growth change condition detection equipment, the device comprising:
the dividing module is used for receiving the picture of the crop growth area to be detected, dividing the picture of the crop growth area to be detected and obtaining a sub-area picture;
the detection module is used for inputting each subregion picture into a preset crop growth change condition detection model to obtain a subregion detection result;
and the merging module is used for merging the detection results of the sub-areas to obtain the detection result of the growth area of the crop to be detected.
Optionally, the detection module includes:
the first input unit is used for inputting the sub-region pictures into a preset crop growth change condition detection model so as to perform frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a difference matrix;
and the second input unit is used for inputting the difference matrix into the convolutional neural network model to obtain the detection result of the sub-region.
Optionally, the first input unit includes:
an obtaining subunit, configured to obtain a first pixel matrix and a second pixel matrix corresponding to the first time point sub-region picture and the second time point sub-region picture, respectively;
and the calculating subunit is used for performing subtraction operation on the first pixel matrix and the second pixel matrix to obtain a difference matrix.
Optionally, the second input unit includes:
a convolution subunit, configured to input the difference matrix into the convolutional neural network model, perform convolution processing on the difference matrix, and obtain a convolution processing result;
the pooling subunit is used for pooling the convolution processing result to obtain a pooled processing result;
and the repeating subunit is used for repeatedly performing convolution and pooling alternative processing on the pooling processing result based on the preset convolution pooling times to obtain the sub-region detection result.
Optionally, the crop growth change condition detection device further includes:
the first acquisition unit is used for acquiring a preset basic detection model and preset training data, wherein the preset training data comprise a training difference matrix and theoretical output results corresponding to the training difference matrix;
the training unit is used for inputting the training difference matrix into the preset basic detection model to obtain an actual output result;
the first comparison unit is used for comparing the actual output result with the theoretical output result to obtain a regression loss value;
and the first adjusting unit is used for adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold value, and obtaining the preset crop growth change condition detection model.
Optionally, the crop growth change condition detection device further includes:
the second obtaining unit is used for obtaining preset verification data, wherein the preset verification data comprise verification difference matrixes and theoretical verification results corresponding to the verification difference matrixes;
a third input unit, configured to input each verification difference matrix into the converged preset basis detection model to obtain an actual verification result;
a second comparing unit, configured to compare each actual verification result with each theoretical verification result to obtain a plurality of error rates;
the statistical unit is used for comparing each error rate with a preset standard error rate range, counting the number of error rates within the preset standard error rate range, and computing the ratio of that number to the total number of error rates;
a second adjusting unit, configured to adjust and retrain the preset basic detection model based on the number ratio if the number ratio is smaller than or equal to a preset number ratio threshold;
and the determining unit is used for determining the preset basic detection model obtained by corresponding training as the preset crop growth change condition detection model if the number ratio is greater than a preset number ratio threshold value.
Optionally, the merging module includes:
the identification unit is used for identifying the detection results of the sub-regions based on the change degree of the crop growth conditions of the sub-regions to obtain a sub-region growth change degree label;
and the importing unit is used for acquiring the segmentation template corresponding to the picture of the crop growth area to be detected, and importing the detection result of each sub-area and the growth change degree label of each sub-area into the segmentation template to obtain the detection result of the crop growth area to be detected.
The present application further provides crop growth change condition detection equipment, comprising: a memory, a processor, and a program of the crop growth change condition detection method stored on the memory and executable on the processor, where the program, when executed by the processor, implements the steps of the crop growth change condition detection method described above.
The present application also provides a medium, which is a readable storage medium storing a program for implementing the crop growth change condition detection method, where the program, when executed by a processor, implements the steps of the crop growth change condition detection method described above.
According to the present application, a picture of the crop growth area to be detected is received and segmented into sub-region pictures; each sub-region picture is input into a preset crop growth change condition detection model to obtain a sub-region detection result; and the sub-region detection results are combined to obtain the detection result of the crop growth area to be detected. Because the picture is divided into a plurality of sub-region pictures whose crop growth conditions are detected separately and then recombined, both the overall crop growth condition of the area to be detected and the distribution of growth conditions across its sub-regions can be detected accurately, which solves the technical problem of low accuracy in detecting crop growth conditions in the prior art.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; other drawings can be obtained by those skilled in the art from these drawings without inventive effort.
FIG. 1 is a schematic flow chart illustrating a first embodiment of a method for detecting a change in crop growth according to the present application;
FIG. 2 is a schematic diagram illustrating a picture of a growing area of a crop to be detected being divided in the method for detecting a change in a growth condition of the crop according to the present application;
FIG. 3 is a schematic diagram illustrating combining the detection results of the sub-regions in the method for detecting the growth change status of crops according to the present application;
FIG. 4 is a schematic flow chart illustrating a second embodiment of a method for detecting a change in crop growth according to the present application;
FIG. 5 is a schematic flow chart illustrating the process of obtaining the detection result of the sub-region in the method for detecting the growth change condition of the crop of the present application;
FIG. 6 is a schematic flow chart illustrating a third exemplary embodiment of a method for detecting a change in crop growth according to the present application;
FIG. 7 is a schematic view illustrating a process of training a preset basic detection model in the method for detecting a crop growth variation condition according to the present application;
fig. 8 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In a first embodiment of the crop growth variation detection method of the present application, referring to fig. 1, the crop growth variation detection method includes:
step S10, receiving a picture of a crop growth area to be detected, and segmenting the picture of the crop growth area to be detected to obtain a sub-area picture;
in this embodiment, a picture of a crop growth area to be detected is received and divided to obtain a sub-area picture, and specifically, a picture of a crop growth area to be detected is received, wherein the preset shooting manner includes satellite shooting, aerial shooting, camera shooting, and the like, and further, as shown in fig. 2, the dividing manner is shown, wherein Sn is a picture of a crop growth area to be detected, a1 to a9 are sub-area pictures, and the picture of the crop growth area to be detected is further divided according to a preset dividing ratio to obtain a sub-area picture, for example, assuming that the size of the picture of the crop growth area to be detected is X Y, wherein the picture is rectangular, X is the picture length, Y is the picture width, the preset dividing ratio is that the picture length is equal to a, and the picture width is equal to b, the sub-region picture size is X/b X Y/a.
Step S20, inputting each subregion picture into a preset crop growth change condition detection model to obtain a subregion detection result;
in this embodiment, it should be noted that the preset crop growth variation detection model is a trained and reliable model, and the detection result of the sub-region is a crop growth variation within a preset time period, where a first time point of the preset time period is an initial time point, a last time point of the preset time period is a detection time point, and the crop growth variation includes growth variations such as a crop fruiting amount variation, a crop yield variation, a crop flowering amount variation, and a luxuriant crop growth degree.
Each sub-region picture is input into the preset crop growth change condition detection model to obtain a sub-region detection result. Specifically, each sub-region picture is input into the model so that a frame difference is taken between the sub-region picture corresponding to the initial time point and the same sub-region's picture corresponding to the detection time point, yielding a difference matrix for each sub-region picture; each difference matrix is then input into the convolutional neural network model within the preset crop growth change condition detection model to obtain the sub-region detection result.
And step S30, combining the detection results of the sub-regions to obtain the detection result of the growth region of the crop to be detected.
In this embodiment, it should be noted that the detection result of the crop growth area to be detected is a crop growth variation condition in a preset time period, where a first time point of the preset time period is an initial time point, and a last time point of the preset time period is a detection time point.
The sub-region detection results are combined to obtain the detection result of the crop growth area to be detected. Specifically, the sub-region detection results are merged by reversing the segmentation that was applied to the picture of the crop growth area to be detected. The resulting detection result includes the crop growth change of the whole area corresponding to the picture and the crop growth change of each sub-region corresponding to its sub-region picture.
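Reversing the segmentation to merge sub-region results might be sketched like this; aggregating the whole-area result as a mean of the sub-region values is an assumption, since the application does not specify how the overall change is derived from the parts.

```python
# Sketch of merging a*b sub-region detection results back into the
# original grid layout, plus an assumed whole-area aggregate (mean).

def merge_results(sub_results, a, b):
    """Arrange a*b sub-region results (row-major) back into the grid layout."""
    grid = [sub_results[i * b:(i + 1) * b] for i in range(a)]
    overall = sum(sub_results) / len(sub_results)  # assumed aggregation
    return {"grid": grid, "overall": overall}

# Usage: nine sub-region results merged back into a 3x3 grid.
merged = merge_results([1, 2, 3, 4, 5, 6, 7, 8, 9], 3, 3)
```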
Wherein the detection result of the subregion comprises the change degree of the crop growth condition of the subregion,
the step of combining the detection results of the sub-areas to obtain the detection result of the growth area of the crop to be detected comprises the following steps:
step S31, based on the change degree of the crop growth condition of the subareas, identifying the detection result of each subarea to obtain a subarea growth change degree label;
in this embodiment, it should be noted that the sub-area crop growth condition variation degree includes a sub-area crop yield variation condition, a sub-area crop result variation condition, a sub-area crop flourishing degree variation condition, and the like, and the sub-area growth variation degree label is used to identify the sub-area crop growth condition variation degree, for example, it may be set that the greater the sub-area crop growth condition variation degree is, the darker the color of the sub-area growth variation degree label is, and the sub-area growth variation degree label includes a two-dimensional code, a barcode, a pattern, a character, and the like.
Each sub-region detection result is identified based on the degree of change of the sub-region crop growth condition to obtain a sub-region growth change degree label. Specifically, the degrees of change are classified to obtain a classification grade for each sub-region detection result, and each result is labelled according to its grade. For example, assume the degree of change is a change in crop yield divided into 3 grades: more than 2000 jin per mu, 1500 to 2000 jin per mu, and less than 1500 jin per mu. The label for a sub-region detection result with a yield of more than 2000 jin per mu is set to red, the label for a yield of 1500 to 2000 jin per mu is set to yellow, and the label for a yield of less than 1500 jin per mu is set to white.
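The worked grading example can be expressed as a small labelling function. The function name is hypothetical; the thresholds and colours follow the example above, with the boundary at exactly 2000 jin per mu assumed to fall into the red grade.

```python
# Sketch of the 3-grade yield labelling from the example above.
# Boundary handling at exactly 2000 and 1500 jin per mu is assumed.

def growth_change_label(yield_per_mu):
    if yield_per_mu >= 2000:
        return "red"      # more than 2000 jin per mu
    if yield_per_mu >= 1500:
        return "yellow"   # 1500 to 2000 jin per mu
    return "white"        # less than 1500 jin per mu

labels = [growth_change_label(y) for y in (2300, 1700, 900)]
```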
Step S32, obtaining a segmentation template corresponding to the picture of the crop growth area to be detected, and importing each subregion detection result and each subregion growth change degree label into the segmentation template to obtain the crop growth area detection result to be detected.
In this embodiment, as shown in fig. 3, the left diagram of fig. 3 is the segmentation template after each sub-region detection result and each sub-region growth change degree label have been imported, where out1 to out9 each contain the sub-region detection result and the sub-region growth change degree label of their corresponding sub-region picture, that is, the sub-region crop growth change and its label. The right diagram of fig. 3 is the detection result of the crop growth area to be detected, where outall contains out1 to out9 together with the overall crop growth change of the whole area corresponding to the picture. For example, if the right diagram is a dynamic diagram, clicking it can display the left diagram.
In this embodiment, the picture of the crop growth area to be detected is received and segmented into sub-region pictures; each sub-region picture is input into the preset crop growth change condition detection model to obtain a sub-region detection result; and the sub-region detection results are combined to obtain the detection result of the crop growth area to be detected. Because the picture is divided into a plurality of sub-region pictures whose crop growth conditions are detected separately and then recombined, this embodiment can accurately detect both the overall crop growth condition of the area to be detected and the distribution of growth conditions across its sub-regions, thereby solving the technical problem of low accuracy in detecting crop growth conditions in the prior art.
Further, referring to fig. 4, in another embodiment of the crop growth variation condition detection method based on the first embodiment of the present application, the sub-region picture includes a first time point sub-region picture and a second time point sub-region picture, the preset crop growth variation condition detection model includes a convolutional neural network model,
the step of inputting each subregion picture into a preset crop growth change condition detection model to obtain subregion detection results comprises the following steps:
step S21, inputting each subregion picture into a preset crop growth change condition detection model to perform frame difference processing on the first time point subregion picture and the second time point subregion picture to obtain a difference matrix;
in this embodiment, it should be noted that the first time point sub-region picture and the second time point sub-region picture both correspond to the same sub-region of the crop growth region to be detected, the first time point is an initial time point of a preset time period, and the second time point is a detection time point of the preset time period.
Each sub-region picture is input into the preset crop growth change condition detection model, which performs frame difference processing on the first time point sub-region picture and the second time point sub-region picture. Specifically, the first pixel matrix corresponding to the first time point sub-region picture is subtracted from the second pixel matrix corresponding to the second time point sub-region picture to obtain the difference matrix. As shown in fig. 5, which is a schematic flow diagram of obtaining a sub-region detection result, a1m and a1n are the first time point sub-region picture and the second time point sub-region picture respectively, dc11j is the input of the preset crop growth change condition detection model, CNN M2 is the preset crop growth change condition detection model, and out1 is the sub-region detection result.
In step S21, the step of performing frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a difference matrix includes:
step S211, acquiring a first pixel matrix and a second pixel matrix corresponding to the first time point sub-region picture and the second time point sub-region picture, respectively;
in this embodiment, it should be noted that both the first time point sub-region picture and the second time point sub-region picture can be represented by a digital matrix in a computer, that is, the first time point sub-region picture and the second time point sub-region picture respectively correspond to a first pixel matrix and a second pixel matrix, where a numerical value in the digital matrix is a pixel value of the picture.
Step S212, performing subtraction on the first pixel matrix and the second pixel matrix to obtain a difference matrix;
in this embodiment, it should be noted that the specifications of the first pixel matrix and the second pixel matrix are the same, for example, assuming that the first pixel matrix is an m × n matrix, the second pixel matrix is also an m × n matrix.
And performing subtraction operation on the first pixel matrix and the second pixel matrix to obtain a difference matrix, specifically, subtracting corresponding pixel values in the first pixel matrix and the second pixel matrix to obtain the difference matrix.
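The subtraction of steps S211 and S212 can be sketched as follows. This is a minimal NumPy illustration with hypothetical pixel values, not the patent's implementation:

```python
import numpy as np

def frame_difference(first_pixels: np.ndarray, second_pixels: np.ndarray) -> np.ndarray:
    """Subtract the first time point pixel matrix from the second to get
    the difference matrix. Both matrices must have the same specification."""
    if first_pixels.shape != second_pixels.shape:
        raise ValueError("sub-region pictures must have identical dimensions")
    # Cast to a signed type so a pixel value that decreases between the two
    # time points does not wrap around under unsigned arithmetic.
    return second_pixels.astype(np.int16) - first_pixels.astype(np.int16)

t1 = np.array([[10, 20], [30, 40]], dtype=np.uint8)  # first time point picture
t2 = np.array([[12, 18], [35, 40]], dtype=np.uint8)  # second time point picture
diff = frame_difference(t1, t2)
```

The resulting matrix records per-pixel change between the two time points, which is what the detection model consumes.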
Step S22, inputting the difference matrix into the convolutional neural network model to obtain the detection result of the sub-region;
in this embodiment, it should be noted that the convolutional neural network model is a model that has been trained based on deep learning, and it includes data processing layers such as convolutional layers, pooling layers, and fully connected layers.
The difference matrix is input into the convolutional neural network model to obtain the sub-region detection result. Specifically, the difference matrix is processed by the selected data processing layers, such as convolutional, pooling, and fully connected layers, to obtain the sub-region detection result; the number and types of the data processing layers can be selected by the user.
Wherein, the step of inputting the difference matrix into the convolutional neural network model to obtain the sub-region detection result comprises:
step S221, inputting the difference matrix into the convolutional neural network model, and performing convolution processing on the difference matrix to obtain a convolution processing result;
in this embodiment, it should be noted that convolution exploits the fact that the statistical characteristics of one part of an image are the same as those of other parts: features learned on a small patch of the image can therefore be applied as a detector to any other part of the image. In other words, the features learned from a small image region are convolved with the features of the original full-size image. Mathematically, convolution multiplies the feature matrix of the image by one or more detection (weight) matrices to obtain the convolution processing result.
In this embodiment, based on the preset image features and the weight matrices, the difference matrix is convolved to obtain the convolution processing result. Specifically, each window of the image matrix corresponding to the difference matrix is element-wise multiplied by the weight matrix and the products are summed, yielding the convolution processing result.
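The window-wise multiply-and-sum described above can be illustrated with a naive valid-mode 2-D convolution; the image and kernel values below are hypothetical:

```python
import numpy as np

def convolve2d_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2-D convolution: slide the weight matrix (kernel) over the
    image, element-wise multiply each window by it, and sum the products."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.ones((3, 3))    # hypothetical image matrix
kernel = np.ones((2, 2))   # hypothetical weight matrix
result = convolve2d_valid(image, kernel)
```

Each output entry is one dot product between a window of the image and the weight matrix, which is the operation the convolutional layer performs on the difference matrix.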
Step S223, performing pooling processing on the convolution processing result to obtain a pooling processing result;
in this embodiment, the pooling includes maximum pooling, mean pooling, and the like. The convolution processing result is pooled to obtain the pooling processing result: specifically, the convolution processing result is first divided into a plurality of pixel matrices of a preset size, and, in the case of maximum pooling, each pixel matrix is replaced by its maximum pixel value, yielding a new image matrix, that is, the pooling processing result.
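The division into preset-size pixel matrices and replacement of each by its maximum can be sketched as follows, with hypothetical values:

```python
import numpy as np

def max_pool(feature: np.ndarray, size: int = 2) -> np.ndarray:
    """Maximum pooling: divide the matrix into size x size blocks and
    replace each block by its maximum pixel value."""
    h, w = feature.shape[0] // size, feature.shape[1] // size
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = feature[i*size:(i+1)*size, j*size:(j+1)*size].max()
    return out

conv_result = np.array([[1, 3, 2, 0],
                        [4, 2, 1, 1],
                        [0, 1, 5, 6],
                        [2, 2, 7, 3]])
pooled = max_pool(conv_result)  # 4x4 shrinks to 2x2
```

Mean pooling would differ only in replacing `.max()` with `.mean()` for each block.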
Step S224, repeatedly performing alternating convolution and pooling on the pooling processing result, based on a preset number of convolution-pooling rounds, to obtain the sub-region detection result.
In this embodiment, based on a preset number of convolution-pooling rounds, convolution and pooling are applied alternately and repeatedly to the pooling processing result until the number of rounds reaches the preset value, thereby obtaining the sub-region detection result. In addition, after the alternating convolution and pooling, a fully connected layer can be added to the convolutional neural network model. The full connection can be regarded as a special convolution whose result is a one-dimensional vector: through the full connection, the sub-region feature maps are converted into a one-dimensional vector that contains the combined information of all features of the corresponding difference matrix, including the crop growth change condition, so the sub-region detection result is obtained.
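A minimal end-to-end sketch of the alternating convolution/pooling rounds followed by a full connection, in NumPy. The kernel sizes, the input size, and the random values are hypothetical, since the patent does not specify these dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, k):
    # valid-mode convolution: slide the kernel, multiply and sum each window
    kh, kw = k.shape
    return np.array([[np.sum(x[i:i+kh, j:j+kw] * k)
                      for j in range(x.shape[1] - kw + 1)]
                     for i in range(x.shape[0] - kh + 1)])

def max_pool2(x):
    # 2x2 maximum pooling
    h, w = x.shape[0] // 2, x.shape[1] // 2
    return np.array([[x[2*i:2*i+2, 2*j:2*j+2].max() for j in range(w)]
                     for i in range(h)])

def forward(diff_matrix, kernels, fc_weights):
    # alternate convolution (with ReLU) and pooling, then fully connect
    x = diff_matrix
    for k in kernels:
        x = max_pool2(np.maximum(conv2d(x, k), 0.0))
    flat = x.ravel()                  # feature maps -> one-dimensional vector
    return float(flat @ fc_weights)   # full connection to a single score

diff = rng.standard_normal((14, 14))          # hypothetical difference matrix
kernels = [rng.standard_normal((3, 3)),       # two convolution-pooling rounds
           rng.standard_normal((3, 3))]
# shape trace: 14 -> conv 12 -> pool 6 -> conv 4 -> pool 2, so 2*2 = 4 inputs
fc_weights = rng.standard_normal(4)
score = forward(diff, kernels, fc_weights)
```

The final dot product plays the role of the fully connected layer: it collapses the pooled feature maps into a single detection value for the sub-region.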
In this embodiment, each sub-region picture is input into the preset crop growth change condition detection model, frame difference processing is performed on the first time point sub-region picture and the second time point sub-region picture to obtain the difference matrix, and the difference matrix is input into the convolutional neural network model to obtain the sub-region detection result. That is, this embodiment provides a method of obtaining the sub-region detection result, laying a foundation for obtaining the detection result of the crop growth area to be detected and thus for solving the technical problem of low accuracy in detecting the crop growth condition in the prior art.
Further, referring to fig. 6, based on the first embodiment and the second embodiment of the present application, in another embodiment of the crop growth variation condition detection method, the step of inputting each of the sub-region pictures into a preset crop growth variation condition detection model and obtaining the sub-region detection result includes:
step A10, acquiring a preset basic detection model and preset training data, wherein the preset training data comprises a training difference matrix and a theoretical output result corresponding to the training difference matrix;
in this embodiment, it should be noted that the preset basic detection model is a model that has not yet been confirmed as trained. By photographing the region to be photographed at m different times, image data at those m times can be obtained, each containing pictures of a plurality of sub-regions; taking frame differences between the sub-region pictures at any two of the m times yields the training difference matrices. The crop growth condition of each region in the images at the m times can be obtained from sources such as the National Bureau of Statistics website, so the crop growth change condition between any two of the m times can be obtained, that is, the theoretical output result corresponding to each training difference matrix.
Step A20, inputting the training difference matrix into the preset basic detection model to obtain an actual output result;
in this embodiment, it should be noted that convolution refers to the process of element-wise multiplying an image matrix by a convolution kernel and summing the products to obtain image feature values, where the convolution kernel is a weight matrix corresponding to a feature of the difference matrix; pooling refers to the process of aggregating the image feature values obtained by convolution into new feature values.
The training difference matrix is input into the preset basic detection model to obtain the actual output result. Specifically, the training difference matrix is processed by the selected data processing layers, such as convolutional, pooling, and fully connected layers, to obtain the actual output result; the number and types of the data processing layers can be selected by the user.
Step A30, comparing the actual output result with the theoretical output result to obtain a regression loss value;
in this embodiment, the actual output result is compared with the theoretical output result to obtain the regression loss value. Specifically, the difference between the actual output result and the theoretical output result is computed, and the ratio of that difference to the theoretical output result is the regression loss value.
Step A40, adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than a preset regression loss threshold, thereby obtaining the preset crop growth change condition detection model.
In this embodiment, the preset basic detection model is adjusted and retrained based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold, thereby obtaining the preset crop growth change condition detection model. Specifically, the regression loss value is compared with the preset regression loss threshold. When the regression loss value is smaller than the preset regression loss threshold, a verification model corresponding to the preset basic detection model is obtained; when the regression loss value is greater than or equal to the preset regression loss threshold, the weight matrices in the preset basic detection model are adjusted and training is performed again, until the regression loss value is smaller than the preset regression loss threshold, at which point the preset basic detection model is set as the preset crop growth change condition detection model. It should be noted that the preset regression loss threshold may be input by the user or may be a system default; the smaller the threshold, the higher the detection accuracy of the preset crop growth change condition detection model. Fig. 7 is a schematic diagram of the process of training the preset basic detection model: a1i and a1j are frame-differenced to obtain the training difference matrix da11j, which is the input to the preset basic detection model CNN M1, and dp11j is the theoretical output result. da11j is input into CNN M1, and the output of CNN M1 is compared with dp11j, that is, the actual output result is compared with the theoretical output result, to obtain the regression loss value.
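The adjust-and-retrain loop of step A40 can be sketched with a toy one-parameter model standing in for the CNN. The relative-error loss matches the regression loss described above, while the specific gradient update rule and the numeric values are illustrative assumptions:

```python
def train_until_threshold(x: float, y_theoretical: float,
                          loss_threshold: float = 0.01,
                          lr: float = 0.1, max_iter: int = 1000):
    """Toy stand-in for step A40: adjust a single weight until the
    regression loss (relative error) drops below the preset threshold."""
    w = 0.0  # stand-in for the model's weight matrix
    for _ in range(max_iter):
        y_actual = w * x                                      # actual output
        loss = abs(y_actual - y_theoretical) / abs(y_theoretical)
        if loss < loss_threshold:
            return w, loss                                    # converged
        w -= lr * (y_actual - y_theoretical) * x              # adjust, retrain
    raise RuntimeError("did not converge within max_iter iterations")

w_final, final_loss = train_until_threshold(2.0, 10.0)
```

The loop structure (compute output, compute relative loss, compare with threshold, adjust weights, repeat) mirrors steps A20 through A40; a real implementation would backpropagate through the convolutional layers instead of the scalar update shown here.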
Wherein, the step of obtaining the preset crop growth change condition detection model comprises the following steps:
step B10, acquiring preset verification data, wherein the preset verification data comprises verification difference matrixes and theoretical verification results corresponding to the verification difference matrixes;
in this embodiment, it should be noted that, by photographing the region to be photographed at m different times, image data at those m times can be obtained, each containing pictures of a plurality of sub-regions; taking frame differences between the sub-region pictures at any two of the m times yields the verification difference matrices. The crop growth condition of each region in the images at the m times can be obtained from sources such as the National Bureau of Statistics website, so the crop growth change condition between any two of the m times can be obtained, that is, the theoretical verification result corresponding to each verification difference matrix.
Step B20, inputting each verification differential matrix into the converged preset basic detection model to obtain an actual verification result;
in this embodiment, it should be noted that the converged preset basic detection model refers to a preset basic detection model whose regression loss value is smaller than the preset regression loss threshold.
Each verification difference matrix is input into the converged preset basic detection model to obtain an actual verification result. Specifically, the verification difference matrix is processed by the selected data processing layers, such as convolutional, pooling, and fully connected layers, to obtain the actual verification result; the number and types of the data processing layers can be selected by the user.
Step B30, comparing each actual verification result with each theoretical verification result to obtain a plurality of error rates;
in this embodiment, it should be noted that a sufficient number of actual verification results ensures the reliability of model verification. Each actual verification result is compared with its corresponding theoretical verification result: the error value between them is computed, and the ratio of each error value to the corresponding theoretical verification result gives the error rates. For example, if the actual verification result is a crop yield of 900 jin per mu (1 jin = 0.5 kg; 1 mu is approximately 667 square meters) and the theoretical verification result is 1000 jin per mu, the error value is 100 jin and the error rate is 10%.
Step B40, comparing each error rate with a preset standard error rate range, counting the number of error rates falling within the preset standard error rate range, and computing the ratio of that number to the total number of error rates;
in this embodiment, each error rate is compared with the preset standard error rate range, the number of error rates falling within the range is counted, and the number ratio, that is, the ratio of that count to the total number of error rates, is computed.
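Steps B30 and B40 can be sketched as follows, using the jin-per-mu yield example from the text; the sample values and the 10% standard range are hypothetical:

```python
# Actual and theoretical verification results (yield in jin per mu).
actual = [900, 1020, 950, 700]
theoretical = [1000, 1000, 1000, 1000]

# Step B30: error rate = |actual - theoretical| / theoretical per sample,
# e.g. |900 - 1000| / 1000 = 10%.
error_rates = [abs(a - t) / t for a, t in zip(actual, theoretical)]

# Step B40: count error rates within the preset standard error rate range
# (here, at most 10%) and compute the number ratio against the total count.
standard_range = 0.10
within = sum(1 for r in error_rates if r <= standard_range)
number_ratio = within / len(error_rates)
```

The resulting `number_ratio` is then compared against the preset number ratio threshold in steps B50 and B60 to decide whether the model passes verification.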
Step B50, if the number ratio is less than or equal to a preset number ratio threshold, adjusting the preset basic detection model based on the number ratio and training again;
in this embodiment, if the number ratio is less than or equal to the preset number ratio threshold, the preset basic detection model is adjusted and retrained based on the number ratio. Specifically, the weight matrices in the preset basic detection model are adjusted, and training and verification are performed again, until the number ratio is greater than the preset number ratio threshold.
Wherein, the step of obtaining the detection model of the growth change condition of the preset crops comprises the following steps:
Step B60, if the number ratio is greater than a preset number ratio threshold, determining the correspondingly trained preset basic detection model as the preset crop growth change condition detection model.
In this embodiment, it should be noted that, if the number ratio is greater than a preset number ratio threshold, it is determined that the prediction accuracy of the preset basic detection model has reached the requirement of the preset crop growth variation detection model, that is, the preset basic detection model may be used as the preset crop growth variation detection model.
In this embodiment, a preset basic detection model and preset training data are obtained, the preset training data including training difference matrices and their corresponding theoretical output results; each training difference matrix is input into the preset basic detection model to obtain an actual output result; the actual output result is compared with the theoretical output result to obtain a regression loss value; and the preset basic detection model is adjusted and retrained based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold, thereby obtaining the preset crop growth change condition detection model. That is, the present application provides a method of training the preset basic detection model into the preset crop growth change condition detection model, laying a foundation for obtaining the sub-region detection results and thus for solving the technical problem of low accuracy in detecting the crop growth condition in the prior art.
Referring to fig. 8, fig. 8 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present application.
As shown in fig. 8, the crop growth change condition detection apparatus may include: a processor 1001, such as a CPU, a memory 1005, and a communication bus 1002. The communication bus 1002 is used for realizing connection communication between the processor 1001 and the memory 1005. The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a memory device separate from the processor 1001 described above.
Optionally, the crop growth change condition detection device may further include a user interface, a network interface, a camera, a Radio Frequency (RF) circuit, a sensor, an audio circuit, a WiFi module, and the like. The user interface may comprise a display screen (Display) and an input sub-module such as a keyboard (Keyboard), and may optionally also comprise a standard wired interface and a wireless interface. The network interface may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface).
It will be appreciated by those skilled in the art that the crop growth change condition detection device configuration shown in fig. 8 does not constitute a limitation of the device, which may include more or fewer components than those shown, combine some components, or arrange the components differently.
As shown in fig. 8, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, and a crop growth change condition detection program. The operating system is a program that manages and controls the hardware and software resources of the crop growth change condition detection device, supporting the operation of the crop growth change condition detection program as well as other software and/or programs. The network communication module is used for realizing communication among the components in the memory 1005 and communication with other hardware and software in the crop growth change condition detection system.
In the crop growth variation detection apparatus shown in fig. 8, the processor 1001 is configured to execute a crop growth variation detection program stored in the memory 1005 to implement the steps of any of the crop growth variation detection methods described above.
The specific implementation of the crop growth change condition detection device of the present application is substantially the same as the embodiments of the crop growth change condition detection method, and is not described herein again.
An embodiment of the present application further provides a crop growth change condition detection apparatus, the apparatus including:
the dividing module is used for receiving the picture of the crop growth area to be detected, dividing the picture of the crop growth area to be detected and obtaining a sub-area picture;
the detection module is used for inputting each subregion picture into a preset crop growth change condition detection model to obtain a subregion detection result;
and the merging module is used for merging the detection results of the sub-areas to obtain the detection result of the growth area of the crop to be detected.
Optionally, the detection module includes:
the first input unit is used for inputting the sub-region pictures into a preset crop growth change condition detection model so as to perform frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a difference matrix;
and the second input unit is used for inputting the difference matrix into the convolutional neural network model to obtain the detection result of the sub-region.
Optionally, the first input unit includes:
an obtaining subunit, configured to obtain a first pixel matrix and a second pixel matrix corresponding to the first time point sub-region picture and the second time point sub-region picture, respectively;
and the calculating subunit is used for performing subtraction operation on the first pixel matrix and the second pixel matrix to obtain a difference matrix.
Optionally, the second input unit includes:
a convolution subunit, configured to input the difference matrix into the convolutional neural network model, perform convolution processing on the difference matrix, and obtain a convolution processing result;
the pooling subunit is used for pooling the convolution processing result to obtain a pooled processing result;
and the repeating subunit is used for repeatedly performing convolution and pooling alternative processing on the pooling processing result based on the preset convolution pooling times to obtain the sub-region detection result.
Optionally, the crop growth change condition detection device further includes:
the first acquisition unit is used for acquiring a preset basic detection model and preset training data, wherein the preset training data comprise a training difference matrix and theoretical output results corresponding to the training difference matrix;
the alternative processing unit is used for inputting the training difference matrix into the preset basic detection model to obtain an actual output result;
the first comparison unit is used for comparing the actual output result with the theoretical output result to obtain a regression loss value;
and the first adjusting unit is used for adjusting and retraining the preset basic detection model based on the regression loss value until the regression loss value is smaller than the preset regression loss threshold value, and obtaining the preset crop growth change condition detection model.
Optionally, the crop growth change condition detection device further includes:
the second obtaining unit is used for obtaining preset verification data, wherein the preset verification data comprise verification difference matrixes and theoretical verification results corresponding to the verification difference matrixes;
a third input unit, configured to input each verification difference matrix into the converged preset basis detection model to obtain an actual verification result;
a second comparing unit, configured to compare each actual verification result with each theoretical verification result to obtain a plurality of error rates;
the statistical unit is used for comparing each error rate with a preset standard error rate range, counting the number of error rates falling within the preset standard error rate range, and computing the ratio of that number to the total number of error rates;
a second adjusting unit, configured to adjust and retrain the preset basic detection model based on the number ratio if the number ratio is smaller than or equal to a preset number ratio threshold;
and the determining unit is used for determining the preset basic detection model obtained by corresponding training as the preset crop growth change condition detection model if the number ratio is greater than a preset number ratio threshold value.
Optionally, the merging module includes:
the identification unit is used for identifying the detection results of the sub-regions based on the change degree of the crop growth conditions of the sub-regions to obtain a sub-region growth change degree label;
and the importing unit is used for acquiring the segmentation template corresponding to the picture of the crop growth area to be detected, and importing the detection result of each sub-area and the growth change degree label of each sub-area into the segmentation template to obtain the detection result of the crop growth area to be detected.
The specific implementation of the crop growth variation detection apparatus of the present application is substantially the same as the embodiments of the crop growth variation detection method, and is not described herein again.
The present application provides a medium, which is a readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of the crop growth change condition detection method described in any one of the above.
The specific implementation of the medium of the present application is substantially the same as that of each embodiment of the method for detecting the growth change condition of the crops, and is not described herein again.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. A crop growth change condition detection method is characterized by comprising the following steps:
receiving a picture of a crop growth area to be detected, and segmenting the picture of the crop growth area to be detected to obtain a sub-area picture;
inputting each subregion picture into a preset crop growth change condition detection model to obtain subregion detection results;
and combining the detection results of the sub-areas to obtain the detection result of the crop growth area to be detected.
2. The method according to claim 1, wherein the sub-region picture comprises a first time point sub-region picture and a second time point sub-region picture, the predetermined crop growth variation detection model comprises a convolutional neural network model,
the step of inputting each subregion picture into a preset crop growth change condition detection model to obtain subregion detection results comprises the following steps:
inputting each subregion picture into a preset crop growth change condition detection model to perform frame difference processing on the first time point subregion picture and the second time point subregion picture to obtain a difference matrix;
and inputting the difference matrix into the convolutional neural network model to obtain the detection result of the sub-region.
3. The method for detecting the growth change condition of crops according to claim 2, wherein the step of performing frame difference processing on the first time point sub-region picture and the second time point sub-region picture to obtain a difference matrix comprises:
acquiring a first pixel matrix and a second pixel matrix corresponding to the first time point sub-region picture and the second time point sub-region picture respectively;
and carrying out subtraction operation on the first pixel matrix and the second pixel matrix to obtain a difference matrix.
4. The crop growth change condition detection method according to claim 2, wherein the step of inputting the difference matrix into the convolutional neural network model to obtain the sub-region detection result comprises:
inputting the difference matrix into the convolutional neural network model, and performing convolution processing on the difference matrix to obtain a convolution result;
performing pooling processing on the convolution result to obtain a pooling result;
and alternately performing convolution and pooling on the pooling result for a preset number of convolution-pooling iterations to obtain the sub-region detection result.
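A bare-bones numpy sketch of the alternating convolution/pooling stack in claim 4, under simplifying assumptions of my own (a single fixed kernel, no nonlinearity or learned weights, 2x2 max pooling):

```python
import numpy as np

def conv2d_valid(x: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Naive 'valid' 2-D convolution (cross-correlation, as in most CNN layers)."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def max_pool2x2(x: np.ndarray) -> np.ndarray:
    """2x2 max pooling with stride 2 (any trailing odd row/column is dropped)."""
    h, w = (x.shape[0] // 2) * 2, (x.shape[1] // 2) * 2
    x = x[:h, :w]
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def conv_pool_stack(diff: np.ndarray, kernel: np.ndarray, times: int) -> np.ndarray:
    """Alternate convolution and pooling `times` times over the difference matrix."""
    out = diff
    for _ in range(times):
        out = max_pool2x2(conv2d_valid(out, kernel))
    return out
```

Each convolution-pooling round shrinks the spatial extent, so the preset iteration count effectively controls how much the difference matrix is compressed before the final detection result is read out.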
5. The crop growth change condition detection method according to claim 1, wherein before the step of inputting each sub-region picture into a preset crop growth change condition detection model to obtain a sub-region detection result, the method further comprises:
acquiring a preset basic detection model and preset training data, wherein the preset training data comprise a training difference matrix and a theoretical output result corresponding to the training difference matrix;
inputting the training difference matrix into the preset basic detection model to obtain an actual output result;
comparing the actual output result with the theoretical output result to obtain a regression loss value;
and adjusting the preset basic detection model based on the regression loss value and training again until the regression loss value is smaller than a preset regression loss threshold value, thereby obtaining the preset crop growth change condition detection model.
6. The crop growth change condition detection method according to claim 5, wherein before the step of obtaining the preset crop growth change condition detection model, the method further comprises:
acquiring preset verification data, wherein the preset verification data comprise verification difference matrices and theoretical verification results corresponding to the verification difference matrices;
inputting each verification difference matrix into the converged preset basic detection model to obtain an actual verification result;
comparing each actual verification result with the corresponding theoretical verification result to obtain a plurality of error rates;
comparing each error rate with a preset standard error rate range, counting the number of error rates that fall within the preset standard error rate range, and calculating the ratio of that number to the total number of error rates;
if the number ratio is smaller than or equal to a preset number ratio threshold value, adjusting the preset basic detection model based on the number ratio and training again;
the step of obtaining the preset crop growth change condition detection model comprises:
and if the number ratio is larger than the preset number ratio threshold value, determining the correspondingly trained preset basic detection model as the preset crop growth change condition detection model.
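The validation gate of claim 6 reduces to counting how many per-sample error rates fall inside the standard range and comparing that count's share against the number-ratio threshold. A minimal sketch (an inclusive range check is assumed):

```python
def passes_validation(error_rates, standard_range, ratio_threshold):
    """Count error rates inside the preset standard error rate range and compare
    that count's share of all error rates against the preset number-ratio
    threshold. Returns (accept_model, number_ratio)."""
    lo, hi = standard_range
    within = sum(1 for e in error_rates if lo <= e <= hi)
    ratio = within / len(error_rates)
    # Ratio above the threshold: keep the trained model; otherwise retrain.
    return ratio > ratio_threshold, ratio
```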
7. The crop growth change condition detection method according to claim 1, wherein the sub-region detection result comprises a degree of change of the crop growth condition in the sub-region, and
the step of merging the sub-region detection results to obtain the detection result of the crop growth area to be detected comprises:
labeling each sub-region detection result based on the degree of change of the crop growth condition in the sub-region to obtain a sub-region growth change degree label;
and acquiring a segmentation template corresponding to the picture of the crop growth area to be detected, and filling each sub-region detection result and each sub-region growth change degree label into the segmentation template to obtain the detection result of the crop growth area to be detected.
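A sketch of the labeling-and-template step in claim 7; the label names and degree thresholds are illustrative assumptions, and the "segmentation template" is modeled as a grid with one cell per sub-region:

```python
import numpy as np

def label_degree(score: float) -> str:
    """Map a sub-region's change degree to a growth change degree label
    (label names and thresholds are illustrative)."""
    if score < 0.1:
        return "stable"
    return "growing" if score < 0.5 else "changed"

def fill_template(scores: dict, grid_shape: tuple) -> np.ndarray:
    """Write each sub-region's label into a segmentation template matching the
    original picture's division into sub-regions."""
    template = np.empty(grid_shape, dtype=object)
    for (row, col), score in scores.items():
        template[row, col] = label_degree(score)
    return template
```

Because the template mirrors the original division, the merged output can be rendered back over the source picture to show where growth changed and by how much.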
8. A crop growth change condition detection device, applied to crop growth change condition detection equipment, the device comprising:
a dividing module, used for receiving a picture of a crop growth area to be detected and dividing the picture of the crop growth area to be detected to obtain sub-region pictures;
a detection module, used for inputting each sub-region picture into a preset crop growth change condition detection model to obtain a sub-region detection result;
and a merging module, used for merging the sub-region detection results to obtain the detection result of the crop growth area to be detected.
9. A crop growth change condition detection device, comprising a memory, a processor, and a program stored on the memory for implementing a crop growth change condition detection method, wherein
the memory is used for storing the program for implementing the crop growth change condition detection method;
and the processor is used for executing the program for implementing the crop growth change condition detection method, so as to implement the steps of the crop growth change condition detection method according to any one of claims 1 to 7.
10. A medium having stored thereon a program for implementing a crop growth change condition detection method, wherein the program is executed by a processor to implement the steps of the crop growth change condition detection method according to any one of claims 1 to 7.
CN201911096106.5A 2019-11-11 2019-11-11 Crop growth change condition detection method, device, equipment and medium Active CN110827269B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911096106.5A CN110827269B (en) 2019-11-11 2019-11-11 Crop growth change condition detection method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN110827269A true CN110827269A (en) 2020-02-21
CN110827269B CN110827269B (en) 2024-03-05

Family

ID=69553956

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911096106.5A Active CN110827269B (en) 2019-11-11 2019-11-11 Crop growth change condition detection method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN110827269B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111369152A (en) * 2020-03-06 2020-07-03 深圳前海微众银行股份有限公司 Agricultural land value evaluation optimization method, device and equipment and readable storage medium
CN112800929A (en) * 2021-01-25 2021-05-14 安徽农业大学 On-line monitoring method for bamboo shoot quantity and high growth rate based on deep learning
CN113418509A (en) * 2021-05-20 2021-09-21 中国农业科学院烟草研究所(中国烟草总公司青州烟草研究所) Automatic target-aiming detection device and detection method for agriculture
CN116228454A (en) * 2023-03-20 2023-06-06 广东七天牧草种养殖有限公司 Big data-based planting management control method, system and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201010435A (en) * 2008-08-22 2010-03-01 Hon Hai Prec Ind Co Ltd Video monitoring system and method thereof
CN108198239A (en) * 2017-12-27 2018-06-22 中山大学 A kind of three-dimensional visualization method for realizing blood vessel dynamic simulation
CN108776772A (en) * 2018-05-02 2018-11-09 北京佳格天地科技有限公司 Across the time building variation detection modeling method of one kind and detection device, method and storage medium
CN110163294A (en) * 2019-05-29 2019-08-23 广东工业大学 Remote Sensing Imagery Change method for detecting area based on dimensionality reduction operation and convolutional network
CN110378224A (en) * 2019-06-14 2019-10-25 香港理工大学深圳研究院 A kind of detection method of feature changes, detection system and terminal

Also Published As

Publication number Publication date
CN110827269B (en) 2024-03-05

Similar Documents

Publication Publication Date Title
CN110827269A (en) Crop growth change condition detection method, device, equipment and medium
KR101967089B1 (en) Convergence Neural Network based complete reference image quality evaluation
CN110060237B (en) Fault detection method, device, equipment and system
CN108337505B (en) Information acquisition method and device
US11915430B2 (en) Image analysis apparatus, image analysis method, and storage medium to display information representing flow quantity
CN113159300B (en) Image detection neural network model, training method thereof and image detection method
CN110766007B (en) Certificate shielding detection method, device, equipment and readable storage medium
CN113642474A (en) Hazardous area personnel monitoring method based on YOLOV5
CN111814776B (en) Image processing method, device, server and storage medium
CN115331172A (en) Workshop dangerous behavior recognition alarm method and system based on monitoring video
CN115880260A (en) Method, device and equipment for detecting base station construction and computer readable storage medium
CN111950457A (en) Oil field safety production image identification method and system
CN111369152A (en) Agricultural land value evaluation optimization method, device and equipment and readable storage medium
CN111783528B (en) Method, computer and system for monitoring items on a shelf
CN116091874B (en) Image verification method, training method, device, medium, equipment and program product
CN112784494A (en) Training method of false positive recognition model, target recognition method and device
CN113052019A (en) Target tracking method and device, intelligent equipment and computer storage medium
CN112686851B (en) Image detection method, device and storage medium
CN113989632A (en) Bridge detection method and device for remote sensing image, electronic equipment and storage medium
CN117710756B (en) Target detection and model training method, device, equipment and medium
CN114764833A (en) Plant growth curve determination method and device, electronic equipment and medium
CN112330619B (en) Method, device, equipment and storage medium for detecting target area
CN114266749B (en) TridentNet-based image processing method
CN116630367B (en) Target tracking method, device, electronic equipment and storage medium
CN117670755B (en) Detection method and device for lifting hook anti-drop device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant