CN116897668A - Electric-drive crop sowing and fertilizing control method and system - Google Patents


Info

Publication number
CN116897668A
Authority
CN
China
Prior art keywords
seed
fertilizing
feature map
module
seeding
Legal status
Granted
Application number
CN202310870532.XA
Other languages
Chinese (zh)
Other versions
CN116897668B (en)
Inventor
董向辉
聂影
班春华
孙继东
安鹤峰
高健博
沈卓群
王帅
孟欣
王一凡
Current Assignee
Liaoning Agricultural Mechanization Research Institute
Original Assignee
Liaoning Agricultural Mechanization Research Institute
Application filed by Liaoning Agricultural Mechanization Research Institute
Priority to CN202310870532.XA
Publication of CN116897668A
Application granted
Publication of CN116897668B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/047: Probabilistic or stochastic networks
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C: PLANTING; SOWING; FERTILISING
    • A01C 21/00: Methods of fertilising, sowing or planting
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C: PLANTING; SOWING; FERTILISING
    • A01C 7/00: Sowing
    • A01C 7/06: Seeders combined with fertilising apparatus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/0464: Convolutional networks [CNN, ConvNet]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods

Abstract

The invention provides an electrically driven crop sowing and fertilizing control method and system, belonging to the technical field of crop automation. Firstly, sensor speed data in the sowing and fertilizing area are collected; secondly, images of the sowing and fertilizing area are acquired and processed, and a CNN model is used to identify and analyze information such as the sowing and fertilizing areas and the crop growth state in the images; the speed data are then combined with the image analysis results to calculate the sowing and fertilizing operation parameters of each row of areas, the quality of the seeds is analyzed, and the control signals are transmitted to each row of electrically driven sowing and fertilizing control units to perform the sowing and fertilizing operations; finally, the parameter information is uploaded to the host unit and the cloud server to support decision making, crop monitoring, production management and other functions. The invention accurately identifies and analyzes the sowing and fertilizing areas of crops and performs precise sowing and fertilizing control, thereby improving crop yield and quality and reducing labor cost and time waste.

Description

Electric-drive crop sowing and fertilizing control method and system
Technical Field
The invention belongs to the technical field of crop automation, and particularly relates to an electric-drive crop seeding and fertilizing control method and system.
Background
Crop sowing and fertilizing are vital links in agricultural production and directly affect the growth, development and yield of crops. Traditional crop sowing and fertilizing methods rely mainly on manual operation, have low operating efficiency and can hardly meet the demands of large-scale farmland. With the development of agricultural intelligence technology, sowing and fertilizing crops with technologies such as control devices and image recognition has gradually become a development trend.
In the prior art, Chinese patent grant number CN113079756B provides a field sowing operation simulation system and method. The patent comprises a soil covering mechanism, a sowing mechanism, an image acquisition mechanism, a sowing conveying mechanism and a control device. The soil covering mechanism conveys soil to the sowing conveying mechanism to form a soil layer on the conveying surface; the sowing mechanism sows on the soil layer; the image acquisition mechanism acquires image information of the seeds sown by the sowing mechanism and distributed on the soil layer. The patent simulates the real field sowing state, acquires the seed metering number, the sowing number and the sowing position of the sowing mechanism, and facilitates analysis of the sowing state, thereby achieving quality control of precision sowing operation.
However, the sowing mechanism mentioned in that patent does not perform quality detection on the seeds: it only analyzes the seed metering number, the sowing number and the sowing position after sowing, and does not consider the subsequent growth state of the seeds. Identifying the seeds makes it possible to exclude inferior seeds, ensure germination rate and growth potential, improve sowing quality, increase crop yield, and reduce labor cost and time waste. Therefore, the invention provides accurate identification and analysis of crop sowing and fertilizing areas and performs precise sowing and fertilizing control, so as to overcome the limitations of the prior art and improve crop yield and quality.
Disclosure of Invention
Based on the technical problems, the invention provides a seeding and fertilizing control method and system for electrically driven crops, which are characterized in that a CNN model is utilized to identify and analyze seeding and fertilizing areas in images, seeding and fertilizing operation parameters of each row are calculated according to speed characteristics and analysis results of the CNN model, precise seeding and fertilizing control is performed, crop yield and quality are improved, and labor cost and time waste are reduced.
The invention provides a control method for sowing and fertilizing of electrically driven crops, which comprises the following steps:
step S1: the method comprises the steps that a terminal collects sensor speed data in a seeding and fertilizing area, a host unit communicates with the terminal and processes control signals, the host unit obtains positioning and speed characteristics according to the sensor speed data, and the seeding and fertilizing area comprises a plurality of rows of seeding and fertilizing areas;
Step S2: the method comprises the steps of obtaining image data in a sowing and fertilizing area, performing preprocessing operation to obtain a segmentation feature map, inputting the segmentation feature map into a sowing and fertilizing network for prediction to obtain a fertilizer category result, wherein the fertilizer category result comprises: shallow fertilization or deep fertilization;
step S3: the host unit calculates parameters of each row of sowing and fertilizing areas according to the positioning and speed characteristics and the fertilizer category results to obtain parameter results;
step S4: the method comprises the steps of obtaining an image of a seed, performing preprocessing operation to obtain a seed feature map, inputting the seed feature map into a seed network for feature extraction to obtain a seed category result, wherein the seed category result comprises: inferior seeds and superior seeds;
step S5: transmitting the seed category result and the parameter result to each row of electric drive seed and fertilizer discharging control units, and executing corresponding operation on each row of sowing and fertilizer discharging areas;
step S6: and the parameter result is transmitted back to the host unit, and the host unit transmits the parameter result to the cloud server in real time.
Optionally, the acquiring the image data in the seeding and fertilizing area performs preprocessing operation to obtain a segmentation feature map, and inputs the segmentation feature map to a seeding and fertilizing network for prediction to obtain a fertilizer category result, which specifically includes:
Acquiring image data in the sowing and fertilizing area, and sequentially performing image enhancement, edge detection and image segmentation operation to obtain a segmentation feature map;
the segmentation feature map is sequentially input into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in a seeding and fertilizing network to conduct prediction, so that a fertilizer category result is obtained;
the seeding and fertilizing network specifically comprises: the system comprises a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full connection layer and a first Softmax classifier.
Optionally, the preprocessing operation is performed on the obtained image of the seed to obtain a seed feature map, and the seed feature map is input into a seed network to perform feature extraction to obtain a seed category result, which specifically includes:
acquiring an image of a seed, and sequentially performing image enhancement, normalization and denoising operations to obtain a seed feature map;
Sequentially inputting the seed feature map into a second standard convolution module, a fourth maximum pooling layer, a first standard convolution residual module, a second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier in a seed network for feature extraction, so as to obtain a seed class result;
the seed network specifically comprises: the system comprises a second standard convolution module, a first standard convolution residual module, a second standard convolution residual module, an average pooling layer, a flattening layer, a second full connection layer and a second Softmax classifier.
Optionally, the seed category result and the parameter result are transmitted to each row of electric drive seed and fertilizer discharging and applying control units, and corresponding operations are executed on each row of seeding and fertilizer applying areas, which specifically includes:
judging whether the fertilizer category result in the parameter result is deep fertilization; if the fertilizer type result in the parameter result is 'deep fertilization', sequentially performing seed sorting, seed sowing and deep fertilization according to the seed type result; and if the fertilizer type result in the parameter result is 'shallow fertilization', shallow fertilization operation is carried out on the sowing and fertilizing area.
Optionally, the step of sequentially inputting the segmentation feature map into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in the seeding and fertilizing network to predict, so as to obtain a fertilizer category result, which specifically includes:
inputting the segmentation feature map to the first standard convolution module to perform standard convolution operation to obtain a feature map S2;
inputting the feature map S2 to the first depth separable convolution module to perform depth separable convolution operation to obtain a feature map S6;
inputting the feature map S6 to the second depth separable convolution module to perform depth separable convolution operation to obtain a feature map S10;
inputting the feature map S10 to the third depth separable convolution module to perform depth separable convolution operation to obtain a feature map S14;
and sequentially inputting the feature map S14 into the global average pooling layer, the neuron discarding layer, the first full-connection layer and the first Softmax classifier for prediction to obtain a fertilizer category result.
Optionally, the step of sequentially inputting the seed feature map to a second standard convolution module, a fourth max pooling layer, and a first standard convolution residual module in the seed network, where feature extraction is performed by the second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer, and a second Softmax classifier to obtain a seed class result, and specifically includes:
sequentially inputting the seed feature map to the second standard convolution module and the fourth maximum pooling layer to perform standard convolution operation and maximum pooling operation to obtain a feature map Z3;
inputting the characteristic diagram Z3 into the first standard convolution residual module to perform standard convolution operation to obtain a characteristic diagram Z12;
inputting the characteristic diagram Z12 into the second standard convolution residual module to perform standard convolution operation to obtain a characteristic diagram Z21;
and sequentially inputting the feature map Z21 into the average pooling layer, the flattening layer, the second full-connection layer and the second Softmax classifier to perform feature extraction, so as to obtain a seed class result.
The invention also provides an electrically driven crop seeding and fertilizing control system, which comprises:
the speed sensing module is used for acquiring sensor speed data in a seeding and fertilizing area by the terminal, the host unit is communicated with the terminal and used for processing control signals, the host unit obtains positioning and speed characteristics according to the sensor speed data, and the seeding and fertilizing area comprises a plurality of rows of seeding and fertilizing areas;
The seeding and fertilizing analysis module is used for acquiring the image data in the seeding and fertilizing area to perform preprocessing operation to obtain a segmentation feature map, inputting the segmentation feature map into a seeding and fertilizing network to predict, and obtaining a fertilizer category result, wherein the fertilizer category result comprises: shallow fertilization or deep fertilization;
the parameter calculation module is used for calculating parameters of each row of sowing and fertilizing areas according to the positioning and speed characteristics and the fertilizer category result by the host unit to obtain a parameter result;
the seed analysis module is used for acquiring an image of a seed to perform preprocessing operation to obtain a seed feature map, inputting the seed feature map into a seed network to perform feature extraction to obtain a seed category result, wherein the seed category result comprises: inferior seeds and superior seeds;
the seeding and fertilizing area operation module is used for transmitting the seed category result and the parameter result to each row of electric drive seeding and fertilizing control units and executing corresponding operation on each row of seeding and fertilizing areas;
and the parameter feedback module is used for transmitting the parameter result back to the host unit, and the host unit transmits the parameter result to the cloud server in real time.
Optionally, the seeding and fertilizing analysis module specifically includes:
the seeding and fertilizing image processing sub-module is used for acquiring image data in the seeding and fertilizing area, and sequentially carrying out image enhancement, edge detection and image segmentation operation to obtain a segmentation feature map;
and the seeding and fertilizing network sub-module sequentially inputs the segmentation feature map into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in the seeding and fertilizing network to predict, so that a fertilizer category result is obtained.
Optionally, the seed analysis module specifically includes:
the seed image processing sub-module is used for acquiring images of seeds and sequentially carrying out image enhancement, normalization and denoising operations to obtain a seed characteristic image;
and the seed network sub-module is used for sequentially inputting the seed feature map into a second standard convolution module, a fourth maximum pooling layer, a first standard convolution residual module, a second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier in a seed network for feature extraction, so as to obtain a seed class result.
Optionally, the seeding and fertilizing area operation module specifically includes:
the judging submodule is used for judging whether the fertilizer type result in the parameter result is deep fertilization; if the fertilizer type result in the parameter result is 'deep fertilization', sequentially performing seed sorting, seed sowing and deep fertilization according to the seed type result; and if the fertilizer type result in the parameter result is 'shallow fertilization', shallow fertilization operation is carried out on the sowing and fertilizing area.
Compared with the prior art, the invention has the following beneficial effects:
According to the invention, by collecting the sensor speed data in the sowing and fertilizing areas, analyzing the positioning and speed characteristics, and combining them with the preprocessed segmentation feature map, different sowing and fertilizing areas can be accurately identified, and each row of sowing and fertilizing areas is accurately operated and controlled according to the parameter results, ensuring proper application of fertilizer and improving the precision and uniformity of sowing and fertilizing. By preprocessing the seed image data and extracting features, the seeds can be classified into high quality and low quality; removing inferior seeds and performing accurate sowing and fertilizing operations on high-quality seeds according to the seed category results improves sowing quality, promotes good germination and growth of the seeds, directly benefits the growth, development and yield of crops, and raises crop quality and agricultural production efficiency. Through sensor data acquisition at the terminal, image data processing and calculation control by the host unit, and communication and operation transmission with the electrically driven seed metering and fertilizing control units, the sowing and fertilizing process is automated and made intelligent; automated operation improves the efficiency of sowing and fertilizing and reduces labor cost and time waste. The intelligent control and decision-making capability makes sowing and fertilizing more accurate and adaptable to different farmland conditions, and raises the modernization level of agricultural production.
Drawings
FIG. 1 is a flow chart of a method for controlling sowing and fertilization of electrically driven crops;
FIG. 2 is a diagram of a seeding and fertilizing network of an electrically driven crop seeding and fertilizing control method of the invention;
FIG. 3 is a diagram of a seed network of an electrically driven crop seeding and fertilizing control method according to the present invention;
fig. 4 is a diagram showing a structure of an electrically driven crop seeding and fertilizing control system.
Detailed Description
The invention is further described below in connection with specific embodiments and the accompanying drawings, but the invention is not limited to these embodiments.
Example 1
As shown in fig. 1, the invention discloses an electrically driven crop seeding and fertilizing control method, which comprises the following steps:
step S1: the terminal collects sensor speed data in a seeding and fertilizing area, the host unit communicates with the terminal and processes control signals, the host unit obtains positioning and speed characteristics according to the sensor speed data, and the seeding and fertilizing area comprises a plurality of rows of seeding and fertilizing areas.
Step S2: the method comprises the steps of obtaining image data in a sowing and fertilizing area, performing preprocessing operation to obtain a segmentation feature map, inputting the segmentation feature map into a sowing and fertilizing network for prediction to obtain a fertilizer type result, wherein the fertilizer type result comprises: shallow fertilization or deep fertilization.
Step S3: and the host unit calculates parameters of each row of sowing and fertilizing areas according to the positioning and speed characteristics and the fertilizer category result to obtain a parameter result.
Step S4: the method comprises the steps of obtaining an image of a seed, performing preprocessing operation to obtain a seed feature map, inputting the seed feature map into a seed network to perform feature extraction to obtain a seed category result, wherein the seed category result comprises: inferior seeds and superior seeds.
Step S5: and transmitting the seed type result and the parameter result to each row of electric drive seeding and fertilizing control units, and executing corresponding operation on each row of seeding and fertilizing areas.
Step S6: and the parameter result is transmitted back to the host unit, and the host unit transmits the parameter result to the cloud server in real time.
The steps are discussed in detail below:
step S1: the terminal collects sensor speed data in a seeding and fertilizing area, the host unit communicates with the terminal and processes control signals, the host unit obtains positioning and speed characteristics according to the sensor speed data, and the seeding and fertilizing area comprises a plurality of rows of seeding and fertilizing areas.
The step S1 specifically comprises the following steps:
step S11: the system terminal tests the speed and provides the three-in-one mode of RTK satellite centimeter-level positioning and receiving integration, ground wheel speed measurement sensing and ground radar speed measurement sensing, can be separately installed according to operation requirements, can also be completely installed in 3 speed measurement modes, and under all installation states, the system mainly tests the speed of the ground radar, and the RTK and the ground wheel speed measurement are coordinated and verified and corrected, and under the ground radar speed measurement sensing failure state, the RTK is mainly tested and corrected, and under the RTK signal weak state in an electromagnetic signal interference region, the system is automatically switched to the ground wheel speed measurement function, so that effective seeding operation is ensured, and the motion state of a seeding and fertilizing region can be monitored in real time and corresponding speed data is recorded.
Step S12: the host unit establishes a communication connection with the terminal and transmits data through a wireless communication protocol; the host unit sends a control signal to the terminal to trigger the sensor to acquire speed data, and at the same time the host unit receives the acquired data from the terminal.
Step S13: the host unit analyzes and processes the received speed data, including analyzing the format and units of the sensor data, ensuring the accuracy and availability of the data. The host unit also performs data preprocessing, mainly to remove outliers or filtering, to ensure reliable positioning and speed characteristics.
Step S14: the host unit utilizes the analyzed and preprocessed speed data, performs position estimation based on the speed data of the sowing and fertilizing area by using an acceleration motion model, and estimates the change of the position by accumulating the speed data and the time interval; by calculating the average value of the speed data, the average speed characteristic of the sowing and fertilizing area can be obtained, which can be realized by summing the speed data and dividing the speed data by the number of data points; by finding the maximum value in the speed data, the maximum speed characteristics of the seeding and fertilizing area can be obtained, which can be achieved by traversing the speed data and comparing the value of each data point; by calculating the rate of change of the speed data, the rate characteristics of the seeding and fertilizing area can be obtained, which can be achieved by calculating the difference or slope between the speed data points; if the seeding and fertilizing area is equipped with a Global Navigation Satellite System (GNSS) receiver, such as GPS or GLONASS, a GNSS positioning algorithm may be used to obtain position information, and using satellite signal measurement and signal processing techniques, longitude, latitude, and altitude etc. parameters of the area are calculated; in some specific scenarios, positioning may be achieved by placing a fixed beacon or reference point, and the host unit may utilize the distance or signal strength information between the seeding and fertilizing area and the beacon, applying a triangulation or fingerprint-based positioning algorithm to estimate the position.
Step S2: and acquiring image data in the sowing and fertilizing area, performing preprocessing operation to obtain a segmentation feature map, and inputting the segmentation feature map into a sowing and fertilizing network for prediction to obtain a fertilizer category result.
The step S2 specifically comprises the following steps:
step S21: acquiring image data in a seeding and fertilizing area, sequentially performing image enhancement, edge detection and image segmentation operation to obtain a segmentation feature map, and specifically comprising:
in the sowing and fertilizing process, image data of farmlands are collected by using an image pickup device. The camera equipment can be arranged at a proper position to acquire comprehensive and accurate farmland images, so that the resolution and the image quality of the image acquisition equipment are ensured to be enough to provide clear images; removing noise in the image by using median filtering or Gaussian filtering; performing contrast enhancement, brightness adjustment and histogram equalization on the denoised image to improve the quality and the visual effect of the image; performing edge detection on the enhanced image, and detecting edge information in the image by adopting a Canny edge detection algorithm; the image is segmented to distinguish a foreground crop planting area from a background irrelevant area, and the target is separated from the background.
In fig. 2, Conv2D represents a standard convolutional layer; Strides represents the step size; a normalized activation layer comprises a batch normalization layer (BatchNormalization) and an activation function layer (Activation(Relu)), the ReLU activation function being selected; BatchNormalization alone represents a batch normalization layer; SepConv2D represents a depth separable convolutional layer; Dense represents a fully connected layer; Dropout represents the neuron discarding layer; GlobalAveragePooling2D represents a global average pooling layer; MaxPooling2D represents the maximum pooling layer; S denotes each feature map obtained in the seeding and fertilizing network, an integer in the range [1,14].
Step S22: the segmentation feature map is sequentially input into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in a seeding and fertilizing network to conduct prediction, and a fertilizer category result is obtained.
As shown in fig. 2, step S22 specifically includes:
i, inputting the segmentation feature map to a first standard convolution module to perform standard convolution operation to obtain a feature map S2, wherein the method specifically comprises the following steps of:
Inputting the segmentation feature map (256,256,3) into a first standard convolution layer for convolution operation to obtain a feature map S1, wherein the number of convolution kernels of the first standard convolution layer is 16, the size of the convolution kernels is 3 multiplied by 3, and the step length is 2; the feature map S1 is 128×128 of 16 channels; inputting the feature map S1 into a first normalized activation layer to perform batch normalization and activation operation to obtain a feature map S2; the feature map S2 is 128×128 of 16 channels.
In this embodiment, the first standard convolution module includes a first standard convolution layer and a first normalized activation layer.
II, inputting the feature map S2 into a first depth separable convolution module to perform depth separable convolution operation to obtain a feature map S6, which specifically comprises the following steps:
inputting the feature map S2 into a first depth-separable convolution layer to perform depth-separable convolution operation to obtain a feature map S3, wherein the number of convolution kernels of the first depth-separable convolution layer is 32, the size of the convolution kernels is 3 multiplied by 3, and the step length is 2; the feature map S3 is 64×64 for 32 channels; inputting the feature map S3 into a second normalized activation layer to perform batch normalization and activation operation to obtain a feature map S4; the feature map S4 is 64×64 for 32 channels; inputting the feature map S4 into a second depth-separable convolution layer to perform depth-separable convolution operation to obtain a feature map S5, wherein the number of convolution kernels of the second depth-separable convolution layer is 64, the size of the convolution kernels is 3 multiplied by 3, and the step length is 1; the feature map S5 is 64×64 of 64 channels; sequentially inputting the feature map S5 into a first batch normalization layer and a first maximum pooling layer to perform batch normalization operation and maximum pooling operation to obtain a feature map S6; the size of the pooling window of the first maximum pooling layer is 3 multiplied by 3, and the step length is 2; the feature map S6 is 32×32 of 64 channels.
In this embodiment, the first depth-separable convolution module includes a first depth-separable convolution layer, a second normalized activation layer, a second depth-separable convolution layer, a first batch of normalized layers, and a first maximum pooling layer.
III, inputting the feature map S6 into a second depth separable convolution module to perform depth separable convolution operation to obtain a feature map S10, wherein the method specifically comprises the following steps:
inputting the feature map S6 into a third depth separable convolution layer to perform depth separable convolution operation to obtain a feature map S7, wherein the number of convolution kernels of the third depth separable convolution layer is 128, the size of the convolution kernels is 3 multiplied by 3, and the step length is 2; feature map S7 is 16×16 for 128 channels; inputting the feature map S7 into a third normalized activation layer to perform batch normalization and activation operation to obtain a feature map S8; feature map S8 is 16×16 for 128 channels; inputting the feature map S8 into a fourth depth separable convolution layer to perform depth separable convolution operation to obtain a feature map S9, wherein the number of convolution kernels of the fourth depth separable convolution layer is 256, the size of the convolution kernels is 3 multiplied by 3, and the step length is 1; feature map S9 is 16×16 for 256 channels; sequentially inputting the feature map S9 into a second batch normalization layer and a second maximum pooling layer to perform batch normalization operation and maximum pooling operation to obtain a feature map S10; the size of the pooling window of the second maximum pooling layer is 3 multiplied by 3, and the step length is 2; the feature map S10 is 8×8 of 256 channels.
In this embodiment, the second depth-separable convolution module includes a third depth-separable convolution layer, a third normalized activation layer, a fourth depth-separable convolution layer, a second batch of normalized layers, and a second maximum pooling layer.
IV, inputting the feature map S10 into a third depth separable convolution module to perform depth separable convolution operation to obtain a feature map S14, wherein the method specifically comprises the following steps of:
inputting the feature map S10 into a fifth depth separable convolution layer to perform depth separable convolution operation to obtain a feature map S11, wherein the number of convolution kernels of the fifth depth separable convolution layer is 512, the size of the convolution kernels is 3 multiplied by 3, and the step length is 2; the feature map S11 is 4×4 of 512 channels; inputting the feature map S11 into a fourth normalized activation layer for batch normalization and activation operation to obtain a feature map S12; the feature map S12 is 4×4 of 512 channels; inputting the feature map S12 into a sixth depth separable convolution layer to perform depth separable convolution operation to obtain a feature map S13, wherein the number of convolution kernels of the sixth depth separable convolution layer is 1024, the size of the convolution kernels is 3 multiplied by 3, and the step length is 1; the feature map S13 is 4×4 of 1024 channels; sequentially inputting the feature map S13 into a third batch normalization layer and a third maximum pooling layer to perform batch normalization operation and maximum pooling operation to obtain a feature map S14; the size of a pooling window of the third maximum pooling layer is 3 multiplied by 3, and the step length is 2; the feature map S14 is 2×2 of 1024 channels.
In this embodiment, the third depth-separable convolution module includes a fifth depth-separable convolution layer, a fourth normalized activation layer, a sixth depth-separable convolution layer, a third batch of normalized layers, and a third maximum pooling layer.
V, sequentially inputting the feature map S14 into a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier for prediction to obtain a fertilizer category result, wherein the method specifically comprises the following steps of:
the feature map S14 is sequentially input into a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier for prediction, and fertilizer category results are obtained, wherein the fertilizer category results comprise: shallow fertilization or deep fertilization; shallow fertilization is the application of fertilizer to the surface layer of soil or to a shallower depth of soil, typically in areas where the root system of the plant is more concentrated. The fertilizing mode can enable the root system of the plant to be in more direct contact with the fertilizer so as to quickly absorb the needed nutrients. Deep fertilization is the application of fertilizer into deeper layers of soil, away from the soil surface and the root area of the plant. The fertilizing method aims to make the fertilizer more stable and durable in soil, reduce the risk of nutrient loss, promote plant root system to grow downwards, explore deeper nutrients and improve the utilization efficiency of the nutrients.
The shallow fertilization and the deep fertilization can be continuously classified into different grades according to the growth state of actual crops, and the method specifically comprises the following steps:
when the height of the crops is greater than or equal to a first threshold value, the current crop growth state grade is excellent, the growth health of the crops is indicated, and the nutrient supply is sufficient; proper amount of basal fertilization can be performed.
When the height of the crops is larger than or equal to the second threshold value and smaller than the first threshold value, the current crop growth state grade is general, and the crop growth is general, so that the nutrient supply is slightly insufficient; the application amount of the fertilizer is increased.
When the height of the crops is larger than or equal to the third threshold value and smaller than the second threshold value, the current crop growth state grade is poor, which means that the crops grow weakly and nutrient supply is insufficient; the application amount of the fertilizer is increased more.
When the crop height is smaller than the third threshold value, the current crop growth state grade is poor, which means that the crop growth is very weak and the nutrient supply is extremely insufficient; greatly increases the application amount of the fertilizer.
Crop growth status ratings include: excellent, general, poor and bad.
The first threshold, the second threshold and the third threshold can be set according to actual requirements.
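The grading rule above can be summarized in the following Python sketch; the grade labels and the fertilizer-amount descriptions paraphrase the text, and the threshold ordering t1 > t2 > t3 is assumed.

    def fertilization_grade(crop_height, t1, t2, t3):
        """Sketch of the crop-growth grading rule; t1, t2, t3 are the first,
        second and third height thresholds, set according to actual requirements."""
        if crop_height >= t1:
            return "excellent", "base application"           # sufficient nutrient supply
        if crop_height >= t2:
            return "general", "slightly increased amount"    # slightly insufficient
        if crop_height >= t3:
            return "poor", "increased amount"                # insufficient
        return "bad", "greatly increased amount"             # extremely insufficient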
In this embodiment, the seeding and fertilizing network specifically includes: the system comprises a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full connection layer and a first Softmax classifier.
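A minimal Keras-style sketch of this seeding and fertilizing network, following the layer sequence and hyperparameters described in step S22, is given below; the 'same' padding and the dropout rate are assumptions not specified in the embodiment.

    from tensorflow.keras import layers, models

    def build_seeding_fertilizing_network(input_shape=(256, 256, 3), num_classes=2):
        """Sketch: one standard convolution module, three depth-separable convolution
        modules, global average pooling, dropout, a fully connected layer and Softmax."""
        inputs = layers.Input(shape=input_shape)

        # First standard convolution module: Conv2D(16, 3x3, stride 2) + BN + ReLU -> S2
        x = layers.Conv2D(16, 3, strides=2, padding="same")(inputs)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)

        # Three depth-separable convolution modules with the kernel counts described above
        for f1, f2 in [(32, 64), (128, 256), (512, 1024)]:
            x = layers.SeparableConv2D(f1, 3, strides=2, padding="same")(x)
            x = layers.BatchNormalization()(x)
            x = layers.Activation("relu")(x)
            x = layers.SeparableConv2D(f2, 3, strides=1, padding="same")(x)
            x = layers.BatchNormalization()(x)
            x = layers.MaxPooling2D(pool_size=3, strides=2, padding="same")(x)

        # Global average pooling, neuron discarding (dropout), full connection, Softmax
        x = layers.GlobalAveragePooling2D()(x)
        x = layers.Dropout(0.5)(x)  # dropout rate is an assumption
        outputs = layers.Dense(num_classes, activation="softmax")(x)
        return models.Model(inputs, outputs)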
Step S3: the host unit calculates parameters of each row of sowing and fertilizing areas according to the positioning and speed characteristics and fertilizer category results to obtain parameter results, and specifically comprises the following steps:
the host unit performs shallow fertilization or deep fertilization operation according to the need of each row of sowing and fertilization areas, and controls the fertilization amount of fertilization to obtain the parameter result of each row.
Step S4: the method comprises the steps of obtaining an image of a seed, performing preprocessing operation to obtain a seed feature map, inputting the seed feature map into a seed network to perform feature extraction to obtain a seed category result, wherein the seed category result comprises: inferior seeds and superior seeds.
The step S4 specifically comprises the following steps:
step S41: the method comprises the steps of obtaining an image of a seed, sequentially carrying out image enhancement, normalization and denoising operations to obtain a seed feature map, and specifically comprises the following steps:
The device is configured with a high-resolution camera and image processing software for capturing images of seeds; different seed images have different sizes and resolutions. To keep the input consistent, the image is adjusted to a fixed size (256,256,3) by scaling, cropping or padding. Converting the seed image to a specific color space can provide better feature expression; the image is converted from the original color space to a target color space such as RGB, HSV or HSI. Histogram equalization, contrast enhancement and sharpening are applied to enhance the visual characteristics of the image. Before classification the input image is usually normalized so that the image data are statistically consistent; common normalization methods scale the pixel values to between 0 and 1 or standardize them (mean 0, standard deviation 1). Noise or interference present in the image can be reduced with filters to limit its effect on classification performance.
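The seed-image preprocessing of step S41 may be sketched as follows; the Gaussian filter, the YCrCb-based histogram equalization and the scaling to [0, 1] are illustrative choices among the options named above.

    import cv2
    import numpy as np

    def preprocess_seed_image(bgr_image):
        """Sketch of step S41: fixed-size resize, denoising, enhancement, normalization."""
        resized = cv2.resize(bgr_image, (256, 256))          # fixed input size (256, 256, 3)
        denoised = cv2.GaussianBlur(resized, (3, 3), 0)      # filter to reduce noise
        ycrcb = cv2.cvtColor(denoised, cv2.COLOR_BGR2YCrCb)  # enhance contrast on luminance
        ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
        enhanced = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
        return enhanced.astype(np.float32) / 255.0           # scale pixel values to [0, 1]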
In fig. 3, Conv2D represents a standard convolutional layer; Strides represents the step size; a normalized activation layer comprises a batch normalization layer (BatchNormalization) and an activation function layer (Activation(Relu)), the ReLU activation function being selected; BatchNormalization alone represents a batch normalization layer; a separate Activation(Relu) represents an activation function layer; Add(,) represents an element-wise addition layer; Dense represents a fully connected layer; Flatten represents a flattening layer; AveragePooling2D represents an average pooling layer; MaxPooling2D represents the maximum pooling layer; Z denotes each feature map obtained in the seed network, an integer in the range [1,22].
Step S42: The seed feature map is sequentially input into a second standard convolution module, a fourth maximum pooling layer, a first standard convolution residual module, a second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier in the seed network for feature extraction, so as to obtain a seed class result.
As shown in fig. 3, step S42 specifically includes:
(1) sequentially inputting the seed feature map to a second standard convolution module and a fourth maximum pooling layer to perform standard convolution operation and maximum pooling operation to obtain a feature map Z3, wherein the method specifically comprises the following steps:
Inputting the seed characteristic diagram (256,256,3) into a second standard convolution layer to carry out convolution operation to obtain a characteristic diagram Z1, wherein the number of convolution kernels of the second standard convolution layer is 32, the size of the convolution kernels is 3 multiplied by 3, and the step length is 2; feature map Z1 is 128×128 for 32 channels; inputting the feature map Z1 into a fifth normalized activation layer to perform batch normalization and activation operation to obtain a feature map Z2; feature map Z2 is 128×128 for 32 channels; inputting the feature map Z2 into a fourth maximum pooling layer to perform maximum pooling operation to obtain a feature map Z3; the window size of the fourth maximum pooling layer is 3 multiplied by 3, and the step length is 2; the signature Z3 is 64×64 for 32 channels.
In this embodiment, the second standard convolution module includes a second standard convolution layer and a fifth normalized activation layer.
(2) Inputting the feature map Z3 to a first standard convolution residual module to perform standard convolution operation to obtain a feature map Z12, wherein the method specifically comprises the following steps:
(1) Inputting the characteristic diagram Z3 into a third standard convolution layer for convolution operation to obtain a characteristic diagram Z4, wherein the number of convolution kernels of the third standard convolution layer is 64, the size of the convolution kernels is 3 multiplied by 3, and the step length is 2; the feature map Z4 is 32×32 of 64 channels; inputting the feature map Z4 into a sixth normalized activation layer to perform batch normalization and activation operation to obtain a feature map Z5; the feature map Z5 is 32×32 of 64 channels.
(2) Sequentially inputting the feature map Z3 into a fourth standard convolution layer and a fourth batch of normalization layers to carry out convolution operation and batch normalization operation to obtain a feature map Z6, wherein the number of convolution kernels of the fourth standard convolution layer is 64, the size of the convolution kernels is 1 multiplied by 1, and the step length is 2; the feature map Z6 is 32×32 of 64 channels.
(3) Inputting the feature map Z5 and the feature map Z6 into a first element-by-element addition layer for element addition operation to obtain a feature map Z7; the feature map Z7 is 32×32 of 64 channels; inputting the feature map Z7 into a first activation function layer to perform activation function operation to obtain a feature map Z8; the feature map Z8 is 32×32 of 64 channels.
(4) Inputting the feature map Z8 into a fifth standard convolution layer for convolution operation to obtain a feature map Z9, wherein the number of convolution kernels of the fifth standard convolution layer is 64, the size of the convolution kernels is 3 multiplied by 3, and the step length is 1; the feature map Z9 is 32×32 of 64 channels; inputting the feature map Z9 into a seventh normalized activation layer for batch normalization and activation operation to obtain a feature map Z10; the feature map Z10 is 32×32 of 64 channels.
(5) Inputting the feature map Z8 and the feature map Z10 into a second element-by-element addition layer to perform element addition operation to obtain a feature map Z11; the feature map Z11 is 32×32 of 64 channels; inputting the feature map Z11 into a second activation function layer to perform activation function operation, so as to obtain a feature map Z12; the feature map Z12 is 32×32 of 64 channels.
In this embodiment, the first standard convolution residual module includes a third standard convolution layer, a sixth normalized activation layer, a fourth standard convolution layer, a fourth normalized layer, a first element-by-element addition layer, a first activation function layer, a fifth standard convolution layer, a seventh normalized activation layer, a second element-by-element addition layer, and a second activation function layer.
(3) Inputting the feature map Z12 to a second standard convolution residual module to perform standard convolution operation to obtain a feature map Z21, which specifically comprises:
1) Inputting the feature map Z12 into a sixth standard convolution layer for convolution operation to obtain a feature map Z13, wherein the number of convolution kernels of the sixth standard convolution layer is 128, the size of the convolution kernels is 3 multiplied by 3, and the step length is 2; the feature map Z13 is 16×16 of 128 channels; inputting the feature map Z13 into an eighth normalized activation layer for batch normalization and activation operation to obtain a feature map Z14; the feature map Z14 is 16×16 of 128 channels.
2) Sequentially inputting the feature map Z12 into a seventh standard convolution layer and a fifth normalization layer to carry out convolution operation and batch normalization operation to obtain a feature map Z15, wherein the number of convolution kernels of the seventh standard convolution layer is 128, the size of the convolution kernels is 1 multiplied by 1, and the step length is 2; the feature map Z15 is 16×16 of 128 channels.
3) Inputting the feature map Z14 and the feature map Z15 into a third element-by-element addition layer for element addition operation to obtain a feature map Z16; feature map Z16 is 16×16 of 128 channels; inputting the feature map Z16 into a third activation function layer to perform activation function operation to obtain a feature map Z17; feature map Z17 is 16×16 of 128 channels.
4) Inputting the feature map Z17 into an eighth standard convolution layer for convolution operation to obtain a feature map Z18, wherein the number of convolution kernels of the eighth standard convolution layer is 128, the size of the convolution kernels is 3 multiplied by 3, and the step length is 1; feature map Z18 is 16×16 for 128 channels; inputting the feature map Z18 into a ninth normalized activation layer for batch normalization and activation operation to obtain a feature map Z19; the profile Z19 is 16×16 of 128 channels.
5) Inputting the feature map Z17 and the feature map Z19 into a fourth element-by-element addition layer for element addition operation to obtain a feature map Z20; feature map Z20 is 16×16 for 128 channels; inputting the feature map Z20 into a fourth activation function layer to perform activation function operation to obtain a feature map Z21; the profile Z21 is 16×16 of 128 channels.
In this embodiment, the second standard convolution residual module includes a sixth standard convolution layer, an eighth normalized activation layer, a seventh standard convolution layer, a fifth normalization layer, a third element-by-element addition layer, a third activation function layer, an eighth standard convolution layer, a ninth normalized activation layer, a fourth element-by-element addition layer, and a fourth activation function layer.
(4) And sequentially inputting the feature map Z21 into an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier to perform feature extraction, so as to obtain a seed class result.
In this embodiment, the seed network specifically includes: the system comprises a second standard convolution module, a first standard convolution residual module, a second standard convolution residual module, an average pooling layer, a flattening layer, a second full connection layer and a second Softmax classifier.
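A minimal Keras-style sketch of this seed network, following the module structure described in step S42, is given below; the 'same' padding and the average-pooling window size are assumptions not specified in the embodiment.

    from tensorflow.keras import layers, models

    def conv_residual_module(x, filters):
        """Standard convolution residual module: a strided 3x3 convolution branch,
        a 1x1 strided shortcut, two element-wise additions and ReLU activations."""
        main = layers.Conv2D(filters, 3, strides=2, padding="same")(x)
        main = layers.BatchNormalization()(main)
        main = layers.Activation("relu")(main)
        shortcut = layers.Conv2D(filters, 1, strides=2, padding="same")(x)
        shortcut = layers.BatchNormalization()(shortcut)
        y = layers.Add()([main, shortcut])
        y = layers.Activation("relu")(y)
        z = layers.Conv2D(filters, 3, strides=1, padding="same")(y)
        z = layers.BatchNormalization()(z)
        z = layers.Activation("relu")(z)
        out = layers.Add()([y, z])
        return layers.Activation("relu")(out)

    def build_seed_network(input_shape=(256, 256, 3), num_classes=2):
        inputs = layers.Input(shape=input_shape)
        # Second standard convolution module + fourth max pooling layer -> Z3
        x = layers.Conv2D(32, 3, strides=2, padding="same")(inputs)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)
        x = layers.MaxPooling2D(pool_size=3, strides=2, padding="same")(x)
        # First and second standard convolution residual modules -> Z12, Z21
        x = conv_residual_module(x, 64)
        x = conv_residual_module(x, 128)
        # Average pooling, flattening, full connection and Softmax classifier
        x = layers.AveragePooling2D(pool_size=2)(x)  # pooling window size is an assumption
        x = layers.Flatten()(x)
        outputs = layers.Dense(num_classes, activation="softmax")(x)
        return models.Model(inputs, outputs)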
Step S5: and transmitting the seed type result and the parameter result to each row of electric drive seeding and fertilizing control units, and executing corresponding operation on each row of seeding and fertilizing areas.
The step S5 specifically comprises the following steps:
judging whether the fertilizer type result in the parameter result is deep fertilization; if the fertilizer type result in the parameter result is 'deep fertilization', seed sorting and seed sowing operations are sequentially carried out according to the seed type result, and a sorting system is arranged to sort the seeds. The sorting system can separate inferior seeds from superior seeds by a mechanical arm, air jet or other automatic devices; the seed quality detection and sorting module is integrated with a seeding and fertilizing control system. Ensuring that the seed quality detection and sorting operations are performed simultaneously during the sowing process. If inferior seeds are detected, the seeding machine is adjusted by each row of electric drive seed metering control units in time, so that the seeding of the inferior seeds is avoided; by adding the seed quality detection and sorting module, the system can detect the quality of seeds in real time in the sowing process and sort the seeds in real time according to the detection result. Thus, the sowing quality of seeds can be improved, the yield and quality of crops can be improved, and the waste of resources can be reduced.
And if the fertilizer type result in the parameter result is 'shallow layer fertilization', the electric drive seed and fertilizer discharging control unit of each row performs fertilization operation on the sowing and fertilizing area.
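The per-row dispatch of step S5 can be illustrated with the following Python sketch; row_unit and its methods are hypothetical handles standing in for one row's electrically driven seed metering and fertilizing control unit.

    def execute_row_operation(fertilizer_class, seed_classes, row_unit):
        """Sketch of the per-row dispatch in step S5; row_unit is hypothetical."""
        if fertilizer_class == "deep":
            good_seeds = [s for s in seed_classes if s == "superior"]  # sort out inferior seeds
            row_unit.sow(good_seeds)                                   # seed sowing
            row_unit.fertilize(depth="deep")                           # deep fertilization
        else:
            row_unit.fertilize(depth="shallow")                        # shallow fertilization only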
In the embodiment, the host unit is arranged in the cab and is transmitted to each row of electric drive seed metering control units through control signals, and the machine tool concentrator end is connected with the host unit and the concentrators of each row of electric drive seed metering fertilization control units, so that the functions of data transmission and control signal forwarding are achieved; each row of electric drive seed and fertilizer discharging control units is a core component of the system, each row of planting areas is provided with a control unit, and each row of electric drive seed and fertilizer discharging control units is responsible for controlling the seeding and fertilizer discharging operations of the row.
Step S6: and the parameter result is transmitted back to the host unit, and the host unit transmits the monitoring data to the cloud for storage and analysis in real time so as to support the functions of decision making, crop monitoring, production management and the like. Through data recording and cloud support, farmers can know information such as the growth state, fertilization effect and the like of crops in real time, so that decision making and production management are carried out.
Example 2
As shown in fig. 4, the invention discloses an electrically driven crop seeding and fertilizing control system, which comprises:
And the speed sensing module 10 is used for collecting sensor speed data in a seeding and fertilizing area, the host unit is communicated with the terminal and used for processing control signals, and the host unit obtains positioning and speed characteristics according to the sensor speed data, and the seeding and fertilizing area comprises a plurality of rows of seeding and fertilizing areas.
The seeding and fertilizing analysis module 20 is configured to acquire image data in a seeding and fertilizing area, perform a preprocessing operation to obtain a segmentation feature map, input the segmentation feature map into a seeding and fertilizing network for prediction, and obtain a fertilizer category result, where the fertilizer category result includes: shallow fertilization or deep fertilization.
And the parameter calculation module 30 is used for calculating the parameters of the seeding and fertilizing areas of each row according to the positioning and speed characteristics and the fertilizer category result by the host unit to obtain a parameter result.
The seed analysis module 40 is configured to obtain an image of a seed, perform a preprocessing operation to obtain a seed feature map, input the seed feature map to a seed network, perform feature extraction, and obtain a seed category result, where the seed category result includes: inferior seeds and superior seeds.
The seeding and fertilizing area operation module 50 transmits the seed category result and the parameter result to each row of the electric drive seeding and fertilizing control units, and performs corresponding operation on each row of the seeding and fertilizing area.
The parameter feedback module 60 is configured to send the parameter result back to the host unit, and the host unit transmits the parameter result to the cloud server in real time.
As an alternative embodiment, the seeding and fertilizing analysis module 20 of the present invention specifically comprises:
and the seeding and fertilizing image processing sub-module is used for acquiring image data in a seeding and fertilizing area and sequentially carrying out image enhancement, edge detection and image segmentation operation to obtain a segmentation feature map.
The seeding and fertilizing network sub-module sequentially inputs the segmentation feature map into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in the seeding and fertilizing network for prediction, and a fertilizer category result is obtained.
As an alternative embodiment, the seed analysis module 40 of the present invention specifically includes:
The seed image processing sub-module is used for acquiring images of the seeds and sequentially carrying out image enhancement, normalization and denoising operations to obtain a seed feature map.
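A short, hedged OpenCV sketch of the seed-image preprocessing (enhancement, normalization, denoising) follows; the kernel size and gain values are assumed.

```python
# Illustrative preprocessing for a single-seed image.
import cv2
import numpy as np


def build_seed_feature_map(bgr_seed_image: np.ndarray) -> np.ndarray:
    enhanced = cv2.convertScaleAbs(bgr_seed_image, alpha=1.2, beta=5)  # image enhancement
    normalized = enhanced.astype(np.float32) / 255.0                   # normalization to [0, 1]
    return cv2.GaussianBlur(normalized, (3, 3), 0)                     # denoising
```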
The seed network sub-module is used for sequentially inputting the seed feature map into a second standard convolution module, a fourth maximum pooling layer and a first standard convolution residual module in the seed network, and then performing feature extraction through the second standard convolution residual module, the average pooling layer, the flattening layer, the second full-connection layer and the second Softmax classifier to obtain a seed category result.
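The seed network can likewise be sketched in PyTorch with the layer order named above (standard convolution, maximum pooling, two standard-convolution residual modules, average pooling, flattening, a full-connection layer and Softmax). Channel widths and the three-channel input are assumptions for the example.

```python
# Sketch of a small residual classifier separating inferior from superior seeds.
import torch
import torch.nn as nn


class ConvResidualBlock(nn.Module):
    """Two standard 3x3 convolutions with an identity or projected skip connection."""

    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch))
        self.skip = (nn.Identity() if in_ch == out_ch and stride == 1
                     else nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False))

    def forward(self, x):
        return torch.relu(self.body(x) + self.skip(x))


class SeedNet(nn.Module):
    def __init__(self, num_classes: int = 2):  # inferior vs superior seeds
        super().__init__()
        self.stem = nn.Sequential(                        # second standard convolution module
            nn.Conv2d(3, 32, 3, padding=1, bias=False),
            nn.BatchNorm2d(32), nn.ReLU(inplace=True))
        self.pool = nn.MaxPool2d(2)                       # maximum pooling layer
        self.res1 = ConvResidualBlock(32, 64, stride=2)   # first residual module
        self.res2 = ConvResidualBlock(64, 128, stride=2)  # second residual module
        self.avg = nn.AdaptiveAvgPool2d(1)                # average pooling layer
        self.flatten = nn.Flatten()                       # flattening layer
        self.fc = nn.Linear(128, num_classes)             # second full-connection layer

    def forward(self, x):
        x = self.res2(self.res1(self.pool(self.stem(x))))
        x = self.flatten(self.avg(x))
        return torch.softmax(self.fc(x), dim=1)           # second Softmax classifier
```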
As an alternative embodiment, the seeding and fertilizing area operation module 50 of the present invention specifically comprises:
The judging sub-module is used for judging whether the fertilizer category result in the parameter result is deep fertilization; if the fertilizer category result is 'deep fertilization', seed sorting, seed sowing and deep fertilization are carried out in sequence according to the seed category result; and if the fertilizer category result is 'shallow fertilization', a shallow fertilization operation is carried out on the sowing and fertilizing area.
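The per-row branching performed by the judging sub-module can be summarized in a short, hedged sketch; the actuator functions below are hypothetical stand-ins for the commands sent to each row's electric drive seeding and fertilizing control unit.

```python
# Hypothetical actuator commands for one row (placeholders for illustration).
def sort_seeds(row_id: int, keep: bool) -> None:
    print(f"row {row_id}: {'accept' if keep else 'reject'} seed")


def sow_row(row_id: int) -> None:
    print(f"row {row_id}: sow seed")


def fertilize(row_id: int, depth: str) -> None:
    print(f"row {row_id}: {depth} fertilization")


def operate_row(row_id: int, fertilizer_category: str, seed_category: str) -> None:
    """Branching logic of the judging sub-module for one row."""
    if fertilizer_category == "deep fertilization":
        if seed_category == "superior":      # seed sorting step
            sort_seeds(row_id, keep=True)
            sow_row(row_id)                  # seed sowing step
        else:
            sort_seeds(row_id, keep=False)   # inferior seeds are rejected
        fertilize(row_id, depth="deep")      # deep fertilization step
    else:
        fertilize(row_id, depth="shallow")   # shallow fertilization only
```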
The above description covers only the preferred embodiments of the present invention and is not intended to limit the present invention; those skilled in the art may make various modifications and variations to the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. An electrically driven crop seeding and fertilizing control method, characterized by comprising the following steps:
Step S1: a terminal collects sensor speed data in a seeding and fertilizing area, a host unit communicates with the terminal and processes control signals, and the host unit obtains positioning and speed characteristics according to the sensor speed data, wherein the seeding and fertilizing area comprises a plurality of rows of seeding and fertilizing areas;
Step S2: obtaining image data in the sowing and fertilizing area, performing a preprocessing operation to obtain a segmentation feature map, and inputting the segmentation feature map into a sowing and fertilizing network for prediction to obtain a fertilizer category result, wherein the fertilizer category result comprises: shallow fertilization or deep fertilization;
Step S3: the host unit calculates parameters of each row of the sowing and fertilizing area according to the positioning and speed characteristics and the fertilizer category result to obtain a parameter result;
Step S4: obtaining an image of the seeds, performing a preprocessing operation to obtain a seed feature map, and inputting the seed feature map into a seed network for feature extraction to obtain a seed category result, wherein the seed category result comprises: inferior seeds and superior seeds;
Step S5: transmitting the seed category result and the parameter result to each row of electric drive seeding and fertilizing control units, and executing the corresponding operation on each row of the sowing and fertilizing area;
Step S6: transmitting the parameter result back to the host unit, wherein the host unit transmits the parameter result to the cloud server in real time.
2. The method for controlling sowing and fertilizing of electrically driven crops according to claim 1, wherein the step of obtaining the image data in the sowing and fertilizing area, performing a preprocessing operation to obtain a segmentation feature map, and inputting the segmentation feature map into a sowing and fertilizing network for prediction to obtain a fertilizer category result specifically comprises:
acquiring image data in the sowing and fertilizing area, and sequentially performing image enhancement, edge detection and image segmentation operations to obtain a segmentation feature map;
sequentially inputting the segmentation feature map into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in the seeding and fertilizing network to perform prediction, so as to obtain the fertilizer category result;
the seeding and fertilizing network specifically comprises: a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier.
3. The method for controlling sowing and fertilizing of electrically driven crops according to claim 1, wherein the step of obtaining the image of the seeds, performing a preprocessing operation to obtain a seed feature map, and inputting the seed feature map into a seed network for feature extraction to obtain a seed category result specifically comprises:
acquiring an image of the seeds, and sequentially performing image enhancement, normalization and denoising operations to obtain a seed feature map;
sequentially inputting the seed feature map into a second standard convolution module, a fourth maximum pooling layer and a first standard convolution residual module in the seed network, and then performing feature extraction through the second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier to obtain the seed category result;
the seed network specifically comprises: a second standard convolution module, a fourth maximum pooling layer, a first standard convolution residual module, a second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier.
4. The method for controlling sowing and fertilizing of electrically driven crops according to claim 1, wherein the step of transmitting the seed category result and the parameter result to each row of electric drive seeding and fertilizing control units and executing the corresponding operation on each row of the sowing and fertilizing area specifically comprises:
judging whether the fertilizer category result in the parameter result is deep fertilization; if the fertilizer category result in the parameter result is 'deep fertilization', sequentially performing seed sorting, seed sowing and deep fertilization according to the seed category result; and if the fertilizer category result in the parameter result is 'shallow fertilization', performing a shallow fertilization operation on the sowing and fertilizing area.
5. The method for controlling sowing and fertilizing of electrically driven crops according to claim 2, wherein the step of inputting the segmentation feature map into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in sequence for prediction to obtain a fertilizer category result specifically comprises:
inputting the segmentation feature map to the first standard convolution module to perform standard convolution operation to obtain a feature map S2;
inputting the feature map S2 to the first depth separable convolution module to perform depth separable convolution operation to obtain a feature map S6;
inputting the feature map S6 to the second depth separable convolution module to perform depth separable convolution operation to obtain a feature map S10;
inputting the feature map S10 to the third depth separable convolution module to perform depth separable convolution operation to obtain a feature map S14;
and sequentially inputting the feature map S14 into the global average pooling layer, the neuron discarding layer, the first full-connection layer and the first Softmax classifier for prediction to obtain a fertilizer category result.
6. The method for controlling sowing and fertilizing of electrically driven crops according to claim 3, wherein the step of sequentially inputting the seed feature map into a second standard convolution module, a fourth maximum pooling layer and a first standard convolution residual module in a seed network, and performing feature extraction through the second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier to obtain a seed category result specifically comprises:
sequentially inputting the seed feature map to the second standard convolution module and the fourth maximum pooling layer to perform a standard convolution operation and a maximum pooling operation, so as to obtain a feature map Z3;
inputting the feature map Z3 into the first standard convolution residual module to perform a standard convolution operation, so as to obtain a feature map Z12;
inputting the feature map Z12 into the second standard convolution residual module to perform a standard convolution operation, so as to obtain a feature map Z21;
and sequentially inputting the feature map Z21 into the average pooling layer, the flattening layer, the second full-connection layer and the second Softmax classifier to perform feature extraction, so as to obtain the seed category result.
7. An electrically driven crop seeding and fertilizing control system, comprising:
The speed sensing module is used for the terminal to acquire sensor speed data in a seeding and fertilizing area; the host unit communicates with the terminal and processes control signals, and the host unit obtains positioning and speed characteristics according to the sensor speed data; the seeding and fertilizing area comprises a plurality of rows of seeding and fertilizing areas;
the seeding and fertilizing analysis module is used for acquiring the image data in the seeding and fertilizing area to perform preprocessing operation to obtain a segmentation feature map, inputting the segmentation feature map into a seeding and fertilizing network to predict, and obtaining a fertilizer category result, wherein the fertilizer category result comprises: shallow fertilization or deep fertilization;
the parameter calculation module is used for calculating parameters of each row of sowing and fertilizing areas according to the positioning and speed characteristics and the fertilizer category result by the host unit to obtain a parameter result;
the seed analysis module is used for acquiring an image of a seed to perform preprocessing operation to obtain a seed feature map, inputting the seed feature map into a seed network to perform feature extraction to obtain a seed category result, wherein the seed category result comprises: inferior seeds and superior seeds;
the seeding and fertilizing area operation module is used for transmitting the seed category result and the parameter result to each row of electric drive seeding and fertilizing control units and executing corresponding operation on each row of seeding and fertilizing areas;
And the parameter feedback module is used for transmitting the parameter result back to the host unit, and the host unit transmits the parameter result to the cloud server in real time.
8. The electrically driven crop seeding and fertilizing control system as in claim 7, wherein said seeding and fertilizing analysis module specifically comprises:
the seeding and fertilizing image processing sub-module is used for acquiring image data in the seeding and fertilizing area, and sequentially carrying out image enhancement, edge detection and image segmentation operations to obtain a segmentation feature map;
the seeding and fertilizing network sub-module sequentially inputs the segmentation feature map into a first standard convolution module, a first depth separable convolution module, a second depth separable convolution module, a third depth separable convolution module, a global average pooling layer, a neuron discarding layer, a first full-connection layer and a first Softmax classifier in the seeding and fertilizing network for prediction, so as to obtain a fertilizer category result.
9. The electrically driven crop seeding and fertilizing control system as in claim 7, wherein said seed analysis module specifically comprises:
the seed image processing sub-module is used for acquiring images of the seeds and sequentially carrying out image enhancement, normalization and denoising operations to obtain a seed feature map;
the seed network sub-module is used for sequentially inputting the seed feature map into a second standard convolution module, a fourth maximum pooling layer and a first standard convolution residual module in a seed network, and then performing feature extraction through the second standard convolution residual module, an average pooling layer, a flattening layer, a second full-connection layer and a second Softmax classifier to obtain a seed category result.
10. The electrically driven crop seeding and fertilizing control system as in claim 7, wherein said seeding and fertilizing area operation module specifically comprises:
the judging sub-module is used for judging whether the fertilizer category result in the parameter result is deep fertilization; if the fertilizer category result in the parameter result is 'deep fertilization', seed sorting, seed sowing and deep fertilization are performed in sequence according to the seed category result; and if the fertilizer category result in the parameter result is 'shallow fertilization', a shallow fertilization operation is performed on the sowing and fertilizing area.
CN202310870532.XA 2023-07-17 2023-07-17 Electric-drive crop sowing and fertilizing control method and system Active CN116897668B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310870532.XA CN116897668B (en) 2023-07-17 2023-07-17 Electric-drive crop sowing and fertilizing control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310870532.XA CN116897668B (en) 2023-07-17 2023-07-17 Electric-drive crop sowing and fertilizing control method and system

Publications (2)

Publication Number Publication Date
CN116897668A true CN116897668A (en) 2023-10-20
CN116897668B (en) 2024-01-23

Family

ID=88352512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310870532.XA Active CN116897668B (en) 2023-07-17 2023-07-17 Electric-drive crop sowing and fertilizing control method and system

Country Status (1)

Country Link
CN (1) CN116897668B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109121589A (en) * 2018-07-26 2019-01-04 辽宁省农业机械化研究所 A kind of crop seeding, fertilising automated job intelligent monitor system
CN110287869A (en) * 2019-06-25 2019-09-27 吉林大学 High-resolution remote sensing image Crop classification method based on deep learning
CN112119688A (en) * 2020-09-02 2020-12-25 北京农业信息技术研究中心 Layered accurate fertilizing and seeding machine and control method thereof
CN113508657A (en) * 2021-08-13 2021-10-19 中国农业大学 Wheat layered variable fertilizing device and method based on prescription chart
WO2022174561A1 (en) * 2021-02-22 2022-08-25 中国农业机械化科学研究院 Variable sowing and fertilizing method, system, and device
CN114997535A (en) * 2022-08-01 2022-09-02 联通(四川)产业互联网有限公司 Intelligent analysis method and system platform for big data produced in whole process of intelligent agriculture
CN115643874A (en) * 2022-11-02 2023-01-31 西南大学 Agricultural automatic accurate control variable rate fertilization method
CN115968614A (en) * 2022-12-09 2023-04-18 霍邱云农服农业服务有限公司 Seeder and intelligent detection system thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117252488A (en) * 2023-11-16 2023-12-19 国网吉林省电力有限公司经济技术研究院 Industrial cluster energy efficiency optimization method and system based on big data
CN117252488B (en) * 2023-11-16 2024-02-09 国网吉林省电力有限公司经济技术研究院 Industrial cluster energy efficiency optimization method and system based on big data

Also Published As

Publication number Publication date
CN116897668B (en) 2024-01-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant