CN111967523B - Data fusion agricultural condition detection system and method based on multi-rotor aircraft - Google Patents


Info

Publication number
CN111967523B
CN111967523B (application CN202010836910.9A)
Authority
CN
China
Prior art keywords
unit
data fusion
image
electrically connected
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010836910.9A
Other languages
Chinese (zh)
Other versions
CN111967523A (en)
Inventor
邱新伟
李亚芹
王俊发
刘向东
李彦沛
蒲岩岩
李志博
刘羽菲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiamusi University
Original Assignee
Jiamusi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiamusi University
Priority to CN202010836910.9A
Publication of CN111967523A
Application granted
Publication of CN111967523B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/882 Radar or analogous systems specially adapted for specific applications for altimeters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143 Sensing or illuminating at different wavelengths
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Image Processing (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Economics (AREA)
  • Evolutionary Computation (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Mining & Mineral Resources (AREA)

Abstract

The invention discloses a data fusion agricultural condition detection system based on a multi-rotor aircraft, which comprises: a data fusion unit; a digital imaging radar unit, electrically connected with the data fusion unit and used for providing a point cloud image for the data fusion unit; a multispectral camera unit, electrically connected with the data fusion unit and used for providing a visible light image for the data fusion unit; a millimeter wave radar unit, electrically connected with the data fusion unit and used for measuring the relative height between the unit and the ground; and an execution unit, bidirectionally electrically connected with the data fusion unit and used for adjusting the pesticide spraying amount. The system can evaluate the density of the crops and adjust the pesticide spraying amount accordingly. The invention further provides a data fusion agricultural condition detection method based on a multi-rotor aircraft.

Description

Data fusion agricultural condition detection system and method based on multi-rotor aircraft
Technical Field
The invention relates to a data fusion agricultural condition detection system and method based on a multi-rotor aircraft, and belongs to the field of agricultural informatization.
Background
In recent years, with the third wave of the information industry, Internet of Things (IoT) technology has developed vigorously and has been applied in many agricultural fields, such as sustainable use of soil and water resources, ecological environment monitoring, fine management of the agricultural production process, agricultural product and food safety traceability, and scheduling of large-scale agricultural machinery services. IoT technology processes and feeds back the various data collected by field-deployed sensors, achieving a degree of real-time agricultural monitoring and data collection and transmission.
For agricultural detection data acquisition and processing in the plant protection field, two methods are currently widespread on the market: one performs an unmanned aerial vehicle (UAV) aerial survey before the plant protection flight, using a multispectral camera to capture the approximate growth state of the crops for path planning; the other uses a sensor to sense the terrain during flight and adjusts only the flying height. From the product technology perspective, the data acquired by a single sensor undergoes no data fusion processing, and a single-variable control mode cannot realize complete agricultural monitoring and precise pesticide application; from the product performance perspective, existing products do not focus on comprehensive agricultural monitoring, and the functions of plant protection UAVs remain to be improved.
Disclosure of Invention
The invention designs and develops a data fusion agricultural condition detection system based on a multi-rotor aircraft, which solves the problem that a traditional plant protection UAV cannot adjust the spraying amount in real time according to the operation terrain, the plant distribution sparsity and the crop growth condition, causing seedling burning.
The invention also designs and develops a data fusion agricultural condition detection method based on a multi-rotor aircraft, which performs data fusion on the point cloud image and the multispectral image, evaluates the crop density, and adjusts the pesticide spraying amount.
The technical scheme provided by the invention is as follows:
a data fusion agricultural condition detection system based on multi-rotor aircraft comprises:
a data fusion unit;
the digital imaging radar unit is electrically connected with the data fusion unit and is used for providing a point cloud picture for the data fusion unit;
the multispectral camera unit is electrically connected with the data fusion unit and is used for providing a visible light image for the data fusion unit;
the millimeter wave radar unit is electrically connected with the data fusion unit and is used for measuring the relative height between the unit and the ground;
and the execution unit is bidirectionally electrically connected with the data fusion unit and is used for adjusting the pesticide spraying amount.
Preferably, the digital imaging radar unit includes:
an imaging radar;
the first storage unit is electrically connected with the imaging radar in a bidirectional mode;
and the first processing unit is bidirectionally electrically connected with the first storage unit and is electrically connected with the data fusion unit.
Preferably, the multispectral camera unit includes:
a multispectral camera;
a second storage unit which is electrically connected with the multispectral camera in a bidirectional way;
and the second processing unit is bidirectionally and electrically connected with the second storage unit and is electrically connected with the data fusion unit.
Preferably, the data fusion unit includes:
a third storage unit which is electrically connected with both the multispectral camera unit and the digital imaging radar unit;
and a third processing unit which is bidirectionally electrically connected with the third storage unit and is electrically connected with the execution unit.
A data fusion agricultural condition detection method based on a multi-rotor aircraft uses the data fusion agricultural condition detection system based on the multi-rotor aircraft and comprises the following steps:
determining and maintaining the flying height of the unmanned aerial vehicle above the ground, performing sector scanning below the unmanned aerial vehicle, and sending the point cloud image to the data fusion unit;
photographing and analyzing the lower part of the unmanned aerial vehicle, and sending a visible light image to the data fusion unit;
and performing data fusion on the point cloud image and the multispectral image, evaluating the density distribution of the crops, and sending the result to the execution mechanism to adjust the pesticide spraying amount in real time, thereby realizing real-time variable spraying.
Preferably, the data fusion process includes:
setting a point cloud picture as an image A and setting a visible light image as an image B;
respectively carrying out L-layer scale decomposition on the image A and the image B, and fusing the two groups of decomposition coefficients according to a fusion strategy to obtain fusion coefficients;
applying the multi-scale inverse transform to the fused decomposition coefficients to reconstruct the fused image;
and performing image enhancement, judging the crop density by evaluating the area of the highlighted region, and adjusting the chemical liquid flow accordingly.
Preferably,
the decomposition coefficient formula of the image A at the i scale is as follows:
[formula shown as an image in the original publication; not reproduced]
the decomposition coefficient formula of the image B at the i scale is as follows:
[formula shown as an image in the original publication; not reproduced]
the fusion coefficient formula is as follows:
[formulas shown as images in the original publication; not reproduced]
wherein i = 1, 2, …, L; F is the coefficient obtained after transformation, and L is the number of decomposition scales.
Preferably, the image enhancement comprises:
step 1, counting the pixel gray levels in the histogram of the fused image to obtain the number of idle gray levels with zero pixel count in the RGB gray range [0, 255];
modifying the image histogram to:
his′(r) = his(r), if his(r) ≥ δ; his′(r) = 0, if his(r) < δ
counting the gray levels for which his′(r) = 0 to obtain the number of idle gray levels L_r;
in the formula, his(r) is the data to be processed, δ is the threshold, and his′(r) is the processed data;
step 2, calculating the number of effective gray levels L_e in the whole gray range [0, 255]:
L_e = 255 − L_r;
and step 3, assigning the idle gray levels to the effective gray levels, and performing a nonlinear stretching transformation on the nonzero effective gray levels over the whole gray range, the transformation function being:
[transformation function shown as an image in the original publication; not reproduced]
in the formula, S_k is the enhanced gray level, T is the gray coefficient, r_k is the pixel region requiring processing, and L′_i is the gray level of the pixels of the image to be processed;
the liquid medicine flow L is adjusted in real time by calculating the area of the highlight area, and the empirical formula of the flow adjustment is as follows:
L=S k ×P;
wherein P is the basic operation flow.
The invention has the following beneficial effects: it solves the problem that a traditional plant protection unmanned aerial vehicle cannot adjust the spraying amount in real time according to the operation terrain, the plant distribution sparsity and the crop growth condition, which causes seedling burning. A traditional plant protection unmanned aerial vehicle uses a millimeter wave radar mounted below the aircraft: from the data gathered in flight, it judges the relative distance to the ground so that the relative height above the ground is kept constant. Through real-time sensing of the surrounding environment, the invention both ensures flight safety during operation and solves the difficult problem of adjusting the sprayed liquid amount in real time according to crop growth.
Drawings
Fig. 1 is a schematic overall structure diagram of a data fusion agricultural condition detection system based on a multi-rotor aircraft according to the present invention.
Fig. 2 is a schematic structural diagram of the digital imaging radar unit according to the present invention.
Fig. 3 is a schematic structural diagram of the multispectral camera unit according to the present invention.
Fig. 4 is a schematic structural diagram of a data fusion unit according to the present invention.
Fig. 5 is a schematic structural diagram of the actuator according to the present invention.
Fig. 6 is a schematic structural diagram of a millimeter wave radar unit according to the present invention.
FIG. 7 is a flow chart of a fusion algorithm according to the present invention.
Fig. 8 is a schematic diagram of a simplified area array beam combiner according to the present invention.
FIG. 9 is a block diagram of a neural network PID according to the present invention.
Fig. 10 is a diagram of a conventional PID step response according to the present invention.
FIG. 11 is a diagram of a neural network PID step response according to the invention.
FIG. 12 is a flow chart of a variable spray system according to the present invention.
Detailed Description
The present invention is described in further detail below with reference to the accompanying drawings, so that those skilled in the art can implement it with reference to the description.
As shown in fig. 1-12, the present invention provides a data fusion agricultural condition detection system based on multi-rotor aircraft, comprising: a data fusion unit; the device comprises a digital imaging radar unit, a multispectral camera unit, an execution unit and a millimeter wave radar unit.
The digital imaging radar unit is electrically connected with the data fusion unit and provides it with the point cloud image; the multispectral camera unit is electrically connected with the data fusion unit and provides it with the visible light image; the millimeter wave radar unit is electrically connected with the data fusion unit and measures the relative height between the system and the ground; and the execution mechanism is electrically connected with the data fusion unit and adjusts the pesticide spraying amount in real time.
In actual operation flight, the plant protection unmanned aerial vehicle serves as the execution mechanism. The millimeter wave radar unit sends millimeter wave signals toward the ground, the relative height above the ground is calculated from the returned signal, and the data are transmitted to the data fusion unit, which passes them back to the flight control system so as to keep the plant protection unmanned aerial vehicle at a relatively stable flying height above the ground. The digital imaging radar unit performs sector scanning below the aircraft in flight, collects the returned data, and finally generates a point cloud image that is provided to the data fusion unit. The multispectral camera photographs and analyzes the area below the aircraft in flight, generating visible light images in different bands that are provided to the data fusion unit. The data fusion unit performs data fusion on the point cloud image and the multispectral image, evaluates the crop density distribution, and sends the result to the execution mechanism to adjust the pesticide spraying amount in real time, thereby realizing real-time variable spraying of the pesticide.
The digital imaging radar unit includes: the imaging radar, the first storage unit and the first processing unit; the first storage unit is bidirectionally electrically connected with the imaging radar and the first processing unit, and the first processing unit is electrically connected with the data fusion unit. The first storage unit stores the raw data collected by the imaging radar and the control data issued to the radar by the control unit; the first processing unit processes the raw data collected by the imaging radar unit, generates the point cloud image, and transmits the image data to the data fusion unit.
As shown in fig. 8, the digital imaging radar unit uses a phased array radar as the transmission source; its main hardware is a high-speed signal processing board comprising 8 channels of 10-bit AD converters, an FPGA and a 10-bit DA conversion chip. The system combines hardware and software. In the hardware part, each channel performs data acquisition under FPGA control: the signal from the SHA analog intermediate frequency input interface is converted from analog to digital, forming 8 data streams that are sent to the FPGA; digital quadrature demodulation, 8-channel single-beam DBF processing and pulse compression are completed in the FPGA, forming two digital I/Q signal paths; a single digital video signal is then formed by modulus calculation and sent to the DA chip.
In the hardware structure of the digital imaging radar, the radio frequency (RF) signal received by each element of the array antenna is converted from analog to digital by the A/D converter in its respective processing module, then down-converted and synchronously detected; the resulting digital orthogonal baseband signals are sent to the DBF processor, where the complex baseband signal S is multiplied and accumulated with a preset complex weighting vector W in the DBF former, and the required point cloud data are obtained and output.
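The multiply-accumulate at the heart of the DBF former is compact enough to sketch. The following Python/NumPy code is an illustration only, not the disclosed hardware: it assumes a hypothetical 8-element uniform linear array with half-wavelength element spacing and forms one beam as y = Wᴴ·S; all names and parameters are assumptions.

```python
import numpy as np

# Hypothetical front end: 8 receive channels, matching the 8-way AD
# front end described above; uniform linear array, half-wavelength spacing.
N_CH = 8

def steering_vector(theta_rad, n=N_CH, d_over_lambda=0.5):
    """Array response of a uniform linear array toward direction theta."""
    k = np.arange(n)
    return np.exp(-2j * np.pi * d_over_lambda * k * np.sin(theta_rad))

def dbf(samples, theta_rad):
    """Digital beamforming: multiply-accumulate the per-channel complex
    baseband samples S with a preset weight vector W, i.e. y = W^H S.
    `samples` has shape (N_CH, n_samples)."""
    w = steering_vector(theta_rad)  # preset complex weighting vector W
    return w.conj() @ samples

# Usage: steer a beam 10 degrees off boresight over simulated noise.
rng = np.random.default_rng(0)
s = rng.standard_normal((N_CH, 1024)) + 1j * rng.standard_normal((N_CH, 1024))
y = dbf(s, np.deg2rad(10.0))
print(y.shape)  # (1024,): one beamformed output stream
```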
The multispectral camera unit includes: the multispectral camera, the second storage unit and the second processing unit; the second storage unit is bidirectionally electrically connected with the multispectral camera and the second processing unit. The second storage unit stores the raw data collected by the multispectral camera and the control data issued to the camera by the control unit; the second processing unit processes the raw data collected by the multispectral camera, generates images of the different visible light bands, and transmits the image data to the data fusion unit.
In operation, light first passes through the aperture; the aperture and focal length are adjusted to ensure proper luminous flux and image sharpness. The light then passes through the beam extender, and the LCTF liquid crystal splitter tunes the imaging band in 5 nm steps. Finally, the light is imaged through the adapter onto the focal plane of the CMOS area array detector; the CMOS detector transmits the acquired images in sequence to a computer for storage, and the computer analyzes and processes the acquired multispectral images.
The data fusion unit comprises a third storage unit and a third processing unit; the third storage unit is electrically connected with both the multispectral camera unit and the digital imaging radar unit, is bidirectionally electrically connected with the third processing unit, and transmits signals to the execution mechanism through the third processing unit. The third storage unit stores the point cloud image data generated by the imaging radar unit and the different visible light band image data generated by the multispectral camera unit; the third processing unit processes these data, combines them into a single image, evaluates the image with the algorithm, and finally transmits the evaluation result to the execution mechanism, which adjusts the spraying operation amount in real time according to the operation terrain, the plant distribution sparsity and the crop growth.
The invention also provides a data fusion agricultural condition monitoring method based on a multi-rotor aircraft, which performs data fusion on the point cloud image and the multispectral image, evaluates the crop density, and adjusts the pesticide spraying amount, comprising the following steps:
determining and maintaining the flying height of the unmanned aerial vehicle above the ground, performing sector scanning below the unmanned aerial vehicle, and sending the point cloud image to the data fusion unit;
photographing and analyzing the lower part of the unmanned aerial vehicle, and sending a visible light image to the data fusion unit;
and performing data fusion on the point cloud image and the multispectral image, evaluating the density distribution of the crops, and sending the result to the execution mechanism to adjust the pesticide spraying amount in real time, thereby realizing real-time variable spraying.
The image processing procedure is as follows. Image fusion can provide more effective information for image segmentation, target detection and recognition, image understanding, and the like. In view of the characteristics of the multispectral and point cloud images being processed, pixel-level fusion is adopted to fuse the collected raw image data, and the fused image data are then analyzed and processed. The fused image contains the distribution of the plant branches (from the point cloud image) and the density information of the plant leaves.
Image fusion based on multi-scale analysis is a research hotspot in the image fusion field; by mimicking the coarse-to-fine manner in which human vision perceives the objective world, it can achieve good fusion results. Multi-scale decomposition yields components of the image in different scale spaces, separating the high-frequency and low-frequency information in a manner similar to how the human eye processes visual signals. The method is widely applied across image processing, where the separation of image edge details from global approximate information brings computational convenience and reliability. In order to use the information at the different levels of the image to realize the evaluation, selection and fusion of image information flexibly and quickly, the fusion method used in this work is based on multi-scale decomposition, with the following flow:
as shown in fig. 7, the set point cloud is image a, and the visible light image is image B;
1) To obtain the respective multi-scale decomposition coefficients, the input image A and image B are each subjected to L-layer scale decomposition;
the decomposition coefficient formula of the image A at the i scale is as follows:
[formula shown as an image in the original publication; not reproduced]
the decomposition coefficient formula of the image B at the i scale is as follows:
[formula shown as an image in the original publication; not reproduced]
2) Fusing the two groups of decomposition coefficients according to a fusion strategy to obtain fusion coefficients;
the fusion coefficient formula is:
[formula shown as an image in the original publication; not reproduced]
3) The multi-scale inverse transform is then applied to the fused decomposition coefficients to reconstruct the fused image:
[formula shown as an image in the original publication; not reproduced]
wherein i = 1, 2, …, L; F is the coefficient obtained after transformation, and L is the number of decomposition scales;
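The patent text does not fix a particular multi-scale transform. As an illustrative sketch only, the following Python code uses a Laplacian pyramid as one concrete choice of L-layer decomposition, a max-absolute-value rule for the detail coefficients, and averaging for the coarse approximation; the function names, fusion rule and parameters are assumptions, not the disclosed method.

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels):
    """Decompose an image into `levels` detail layers plus a coarse residual."""
    pyr, cur = [], img.astype(np.float32)
    for _ in range(levels):
        down = cv2.pyrDown(cur)
        up = cv2.pyrUp(down, dstsize=(cur.shape[1], cur.shape[0]))
        pyr.append(cur - up)  # high-frequency (detail) coefficients
        cur = down
    pyr.append(cur)           # coarsest approximation
    return pyr

def fuse(img_a, img_b, levels=4):
    """Fuse two registered single-channel images (point cloud map A and
    visible image B): max-abs rule on details, average on the residual."""
    pa = laplacian_pyramid(img_a, levels)
    pb = laplacian_pyramid(img_b, levels)
    fused = [np.where(np.abs(la) >= np.abs(lb), la, lb)
             for la, lb in zip(pa[:-1], pb[:-1])]
    fused.append(0.5 * (pa[-1] + pb[-1]))
    # multi-scale inverse transform: collapse the pyramid from coarse to fine
    out = fused[-1]
    for detail in reversed(fused[:-1]):
        out = cv2.pyrUp(out, dstsize=(detail.shape[1], detail.shape[0])) + detail
    return np.clip(out, 0, 255).astype(np.uint8)
```

The max-abs rule keeps whichever source image has the stronger local detail at each scale, which matches the stated goal of retaining both branch structure and leaf density in one image.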
image enhancement methods can be divided into spatial and frequency domain based methods. The enhancement method based on the airspace mainly comprises a histogram stipulation method, a histogram equalization method, a gray scale conversion method, an unsharp masking method and the like. Histogram equalization enhances all pixels in an image, with the disadvantage that it is not easy to highlight objects in the image. While histogram specification can theoretically be targeted to enhance specific information in an image, it is difficult to select an optimal histogram in practical applications. The histogram stretching method can enhance the image contrast to a certain extent, but the variation function must be determined empirically or experimentally, and the method is greatly limited in application. The frequency domain-based enhancement method is mainly used for removing image noise and enhancing detail contents such as edges.
Here the image is enhanced with an adaptive image enhancement algorithm based on nonlinear stretching, which prevents gray levels of different layers from being merged. Compared with other linear and nonlinear stretching algorithms, the transformation function and adjustment parameters need not be determined empirically or experimentally; the algorithm has a small computational load and strong adaptivity, and is suitable for a real-time processing system.
The method specifically comprises the following steps:
Step 1, counting the pixel gray levels in the histogram of the original image to obtain the number of idle gray levels with zero pixel count in [0, 255], where [0, 255] is the gray range in RGB color from full black to full white. In theory, a gray level with an absolute frequency of zero is an idle gray level; however, imaging devices are not ideal, so a certain proportion of noise points exists in the image, and gray levels whose frequency of occurrence is below a certain threshold are therefore all classified as idle, which reduces the interference of low-frequency noise gray levels in the idle gray level statistics. This yields the highest image signal-to-noise ratio and improves contrast and visual effect; that is, the image histogram is modified to:
his′(r) = his(r), if his(r) ≥ δ; his′(r) = 0, if his(r) < δ
counting that the number of the gray levels of his' =0 is L r
Counting the number of the gray levels his' =0 to obtain the number L of the gray levels r
In the formula, his r is the data to be processed, delta is the threshold value, and his' r is the processed data
Step 2, calculating the number of effective gray levels L_e in the whole gray range [0, 255]:
L_e = 255 − L_r;
Step 3, assigning the idle gray levels to the effective gray levels: gray levels with a lower frequency of occurrence, i.e. fewer pixels, are assigned more idle gray levels, while gray values with a higher frequency are assigned fewer. This is equivalent to nonlinearly stretching the histogram along the gray axis: the stretching distance is larger where the frequency is smaller, and the gray level spacing is smaller where it is larger. The intervals of low-frequency target-detail gray levels are thus stretched and the detail parts enhanced, avoiding the swallowing of low-frequency target segments by high-frequency background segments that occurs in histogram equalization;
and 4, performing nonlinear stretching transformation on the effective gray level which is not zero in the whole gray level range, wherein the transformation function is as follows:
[transformation function shown as an image in the original publication; not reproduced]
in the formula, S_k is the enhanced gray level, T is the gray coefficient, r_k is the pixel region, and L′_i is the gray level of the pixels of the image to be processed;
the gray level after image processing is equal to the area gray level multiplied by the gray coefficient, and the operation result of each pixel in the area and the gray level coefficient. By assigning the idle gray levels to the effective gray levels, gray levels having a smaller frequency of appearance, i.e., a smaller number of pixels, are assigned to more idle gray levels, and gray values having a larger frequency of appearance are assigned to less gray levels. This corresponds to a non-linear stretching of the histogram in the gray axis direction, i.e. the stretching distance is greater where the frequency is smaller, and conversely the gray scale spacing is greater. Thus, the interval of the gray level of the target detail having a small frequency of appearance is stretched, and the detail portion is enhanced.
An enhanced image is thus obtained in which the branch and leaf parts are highlighted, and the chemical liquid flow is adjusted in real time by calculating the area of the highlighted region.
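Because the transformation function itself appears only as an image in the original, the following Python sketch is one plausible reading of steps 1-4, not the disclosed algorithm: histogram counts below δ are zeroed, and the remaining effective levels are re-spread over [0, 255] with spacing inversely proportional to their frequency, so rare detail levels stretch the most. The spacing rule and all names are assumptions.

```python
import numpy as np

def adaptive_stretch(img, delta=8):
    """Adaptive non-linear stretching sketch for an 8-bit grayscale image.
    Gray levels whose histogram count is below `delta` are treated as idle;
    effective levels are re-spread over [0, 255] with spacing inversely
    proportional to their frequency, so rare detail levels stretch the most."""
    his = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    his[his < delta] = 0                  # modified histogram his'
    effective = np.flatnonzero(his)       # L_e effective levels; rest are idle
    weights = 1.0 / his[effective]        # rarer level -> wider output spacing
    cum = np.cumsum(weights)
    new_levels = np.round(255.0 * cum / cum[-1]).astype(np.uint8)
    lut = np.zeros(256, dtype=np.uint8)
    lut[effective] = new_levels
    last = 0                              # idle levels follow the nearest
    for g in range(256):                  # lower effective level
        if his[g] > 0:
            last = lut[g]
        else:
            lut[g] = last
    return lut[img]
```

A look-up-table formulation keeps the per-frame cost to one histogram pass plus one indexing operation, consistent with the stated suitability for real-time processing.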
The chemical liquid flow L is adjusted in real time by calculating the area of the highlighted region; the empirical flow-adjustment formula is given below.
as shown in fig. 9-11, for the design structure of the liquid pump PID control algorithm based on neural network machine learning as shown in fig. 8, 2 neural network modules with different functions are used in the model using the neural network algorithm to assume different responsibilities, one is an NNI online identifier, and the other is an NNC adaptive PID control processor. The working principle of the flow controller of the liquid pump in agricultural plant protection is that the right system of the identification result of an NNC controlled sub-item in an algorithm is adjusted in real time by using an algorithm model, so that the controlled item generates adaptivity and stability. And the stability and the performance of the neural network PID control system are verified through Matlab operation software at the later stage, and a large amount of data simulation verification is performed. The experimental results shown in fig. 9-10 show that the PID control system based on the neural network has better control characteristics than the conventional control system.
For a complete control system, the hardware is only one part of the development; the quality of the software directly affects the realization of the whole system's functions (the overall flow is shown in fig. 12). The control system adopts modular programming, implemented in C language. FreeRTOS was selected as the software core, and the software modules were completed in the Keil integrated development environment.
The chemical liquid flow is adjusted in real time by calculating the area of the highlighted region:
L = S_k × P;
wherein P is the basic operation flow.
The flow equals the enhanced gray level of the current region multiplied by the basic operation flow for the current crop type. The basic operation flow P is set manually according to the crop type in the field; the system derives the enhanced gray level from the crop images captured over the operation area using the image enhancement algorithm. The denser the crops, the higher the gray level S_k and the larger the required operation flow, thereby realizing real-time variable spraying.
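Read literally, the empirical formula multiplies the enhanced gray level by the per-crop base flow. The toy Python helper below adds a normalization of S_k by the maximum gray level 255 (an assumption, so that S_k acts as a density factor in [0, 1]) and uses hypothetical base flow values.

```python
# Hypothetical base operation flows P by crop type (L/min); both the values
# and the normalization of S_k by 255 are illustrative assumptions.
BASE_FLOW_P = {"rice": 0.8, "soybean": 0.6}

def spray_flow(s_k: float, crop: str) -> float:
    """Empirical flow L = S_k x P, with S_k scaled to a [0, 1] density factor."""
    return (s_k / 255.0) * BASE_FLOW_P[crop]

print(spray_flow(180, "rice"))  # denser canopy (higher S_k) -> larger flow
```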
While embodiments of the invention have been described above, the invention is not limited to the applications set forth in the description and the embodiments; it is fully applicable in the various fields to which it pertains, and further modifications may readily be made by those skilled in the art. The invention is therefore not limited to the details shown and described herein, so long as there is no departure from the general concept defined by the appended claims and their equivalents.

Claims (5)

1. A data fusion agricultural condition detection method based on a multi-rotor aircraft is characterized in that a data fusion agricultural condition detection system based on the multi-rotor aircraft is used for detection, and the data fusion agricultural condition detection system based on the multi-rotor aircraft comprises:
a data fusion unit;
the digital imaging radar unit is electrically connected with the data fusion unit and is used for providing a point cloud picture for the data fusion unit;
the multispectral camera unit is electrically connected with the data fusion unit and is used for providing a visible light image for the data fusion unit;
the millimeter wave radar unit is electrically connected with the data fusion unit and is used for measuring the relative height between the unit and the ground;
the execution unit is electrically connected with the data fusion unit in a bidirectional way and is used for adjusting the spraying amount of the pesticide;
the data fusion agricultural condition detection method based on the multi-rotor aircraft comprises the following steps:
determining and maintaining the flying height of the unmanned aerial vehicle above the ground, performing sector scanning below the unmanned aerial vehicle, and sending the point cloud image to the data fusion unit;
photographing and analyzing the lower part of the unmanned aerial vehicle, and sending a visible light image to the data fusion unit;
performing data fusion on the point cloud image and the multispectral image, evaluating the density distribution of crops and sending the density distribution to an executing mechanism to adjust the spraying amount of pesticides in real time so as to realize real-time variable spraying;
the data fusion process comprises the following steps:
setting a point cloud picture as an image A and setting a visible light image as an image B;
respectively carrying out L-layer scale decomposition on the image A and the image B, and fusing the two groups of decomposition coefficients according to a fusion strategy to obtain fusion coefficients;
applying the multi-scale inverse transform to the fused decomposition coefficients to reconstruct the fused image;
performing image enhancement, judging the density of crops by evaluating the area of the highlight area, and further adjusting the flow of the liquid medicine;
the digital imaging radar unit includes:
an imaging radar;
a first storage unit which is electrically connected with the imaging radar in a bidirectional way;
and the first processing unit is bidirectionally electrically connected with the first storage unit and is electrically connected with the data fusion unit.
2. The multi-rotor aircraft-based data fusion agricultural condition detection method according to claim 1, wherein the multispectral camera unit comprises:
a multispectral camera;
a second storage unit which is electrically connected with the multispectral camera in a bidirectional way;
and the second processing unit is bidirectionally and electrically connected with the second storage unit and is electrically connected with the data fusion unit.
3. The multi-rotor aircraft-based data fusion agricultural condition detection method according to claim 2, wherein the data fusion unit comprises:
a third storage unit electrically connected to the multispectral camera unit and the digital imaging radar unit at the same time;
and the third processing unit is bidirectionally and electrically connected with the third storage unit and is electrically connected with the execution mechanism.
4. The multi-rotor aircraft-based data fusion agricultural condition detection method according to claim 3,
the decomposition coefficient formula of the image A at the i scale is as follows:
[formula shown as an image in the original publication; not reproduced]
the decomposition coefficient formula of the image B at i scale is as follows:
[formula shown as an image in the original publication; not reproduced]
the fusion coefficient formula is as follows:
[formulas shown as images in the original publication; not reproduced]
wherein i = 1, 2, …, L; F is the coefficient obtained after transformation, and L is the number of decomposition scales.
5. The multi-rotor aircraft-based data fusion agricultural condition detection method of claim 4, wherein the image enhancement comprises:
step 1, counting the pixel gray levels in the histogram of the fused image to obtain the number of idle gray levels with zero pixel count in the RGB gray range [0, 255];
modifying the image histogram to:
his′(r) = his(r), if his(r) ≥ δ; his′(r) = 0, if his(r) < δ
counting the gray levels for which his′(r) = 0 to obtain the number of idle gray levels L_r;
in the formula, his(r) is the data to be processed, δ is the threshold, and his′(r) is the processed data;
step 2, calculating the number of effective gray levels L_e in the whole gray range [0, 255]:
L_e = 255 − L_r;
Step 3, assigning the idle gray level to an effective gray level, and performing nonlinear stretching transformation on the effective gray level which is not zero in the whole gray level range, wherein the transformation function is as follows:
[transformation function shown as an image in the original publication; not reproduced]
in the formula, S_k is the enhanced gray level, T is the gray coefficient, r_k is the pixel region requiring processing, and L′_i is the gray level of the pixels of the image to be processed;
the chemical liquid flow L is adjusted in real time by calculating the area of the highlighted region, and the empirical formula of the flow adjustment is:
L = S_k × P;
wherein P is the basic operation flow.
CN202010836910.9A 2020-08-19 2020-08-19 Data fusion agricultural condition detection system and method based on multi-rotor aircraft Active CN111967523B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010836910.9A CN111967523B (en) 2020-08-19 2020-08-19 Data fusion agricultural condition detection system and method based on multi-rotor aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010836910.9A CN111967523B (en) 2020-08-19 2020-08-19 Data fusion agricultural condition detection system and method based on multi-rotor aircraft

Publications (2)

Publication Number Publication Date
CN111967523A (en) 2020-11-20
CN111967523B (en) 2022-11-15

Family

ID=73388977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010836910.9A Active CN111967523B (en) 2020-08-19 2020-08-19 Data fusion agricultural condition detection system and method based on multi-rotor aircraft

Country Status (1)

Country Link
CN (1) CN111967523B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104835130A (en) * 2015-04-17 2015-08-12 北京联合大学 Multi-exposure image fusion method
CA3024580A1 (en) * 2015-05-15 2016-11-24 Airfusion, Inc. Portable apparatus and method for decision support for real time automated multisensor data fusion and analysis
CN108519775A (en) * 2017-10-30 2018-09-11 北京博鹰通航科技有限公司 A kind of UAV system and its control method precisely sprayed

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Design of an aircraft navigation system based on multispectral fused images; Ma Hanfei; Electronic Design Engineering; 2019-12-31; pp. 161-166 *
Data fusion and its application and prospects in agricultural condition remote sensing monitoring; Qian Yonglan et al.; Transactions of the Chinese Society of Agricultural Engineering; 2004-07-31; pp. 286-290 *

Also Published As

Publication number Publication date
CN111967523A (en) 2020-11-20

Similar Documents

Publication Publication Date Title
Yang et al. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images
Zhu et al. Identification of grape diseases using image analysis and BP neural networks
EP4116840A1 (en) System and method of detection and identification of crops and weeds
CN106778888A (en) A kind of orchard pest and disease damage survey system and method based on unmanned aerial vehicle remote sensing
Rasti et al. Crop growth stage estimation prior to canopy closure using deep learning algorithms
US11562563B2 (en) Automatic crop classification system and method
CN111798539A (en) Adaptive camouflage online design method and system
Bagheri et al. An autonomous robot inspired by insect neurophysiology pursues moving features in natural environments
Dhanush et al. A comprehensive review of machine vision systems and artificial intelligence algorithms for the detection and harvesting of agricultural produce
CN112711900A (en) Crop digital twin modeling method
CN113822198B (en) Peanut growth monitoring method, system and medium based on UAV-RGB image and deep learning
CN116721389A (en) Crop planting management method
CN111967523B (en) Data fusion agricultural condition detection system and method based on multi-rotor aircraft
Moazzam et al. Crop and weeds classification in aerial imagery of sesame crop fields using a patch-based deep learning model-ensembling method
EP4034862B1 (en) Plant identification in the presence of airborne particulates
Harder et al. NightVision: generating nighttime satellite imagery from infra-Red observations
Backes et al. Classification of weed patches in Quickbird images: verification by ground truth data
Hong et al. Adaptive target spray system based on machine vision for plant protection UAV
Sun et al. 3D computer vision and machine learning based technique for high throughput cotton boll mapping under field conditions
Krestenitis et al. Overcome the Fear Of Missing Out: Active sensing UAV scanning for precision agriculture
Toh et al. Classification of oil palm growth status with L band microwave satellite imagery
Deka et al. UAV Sensing-Based Litchi Segmentation Using Modified Mask-RCNN for Precision Agriculture
Khidher et al. Automatic trees density classification using deep learning of unmanned aerial vehicles images
Kataev et al. Farm fields UAV images clusterization
CN112464688A (en) Unmanned aerial vehicle flies to prevent and pest and disease damage intelligent recognition system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant