CN110657891A - Technology for identifying plants by applying multispectral investigation camera by unmanned aerial vehicle - Google Patents


Info

Publication number: CN110657891A (application CN201910933748.XA); granted publication CN110657891B
Authority: CN (China)
Legal status: Granted, active
Original language: Chinese (zh)
Inventors: 陈小华, 谢惠, 徐琪
Applicant and assignee: Zhejiang A&F University (ZAFU)
Prior art keywords: images, camera, unmanned aerial vehicle, live

Classifications

    • G01J3/2823 Imaging spectrometer (G01J3/00 Spectrometry; G01J3/28 Investigating the spectrum)
    • G01S19/42 Determining position using a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS, GLONASS or GALILEO
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G06V20/188 Terrestrial scenes; vegetation
    • G06V20/194 Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
    • B64C27/08 Helicopters with two or more rotors
    • B64U2101/30 UAVs specially adapted for imaging, photography or videography
    • G01J2003/283 Investigating the spectrum, computer-interfaced
    • G01J2003/284 Spectral construction


Abstract

The invention discloses a technique by which an unmanned aerial vehicle identifies plants with a multispectral investigation camera, comprising the following steps: S1, using an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera to acquire paired spectral and live-action images of the plants in a wetland identification area in a flat-push scanning mode along a preset planned cruising path; S2, synchronously uploading the paired images; S3, splicing the spectral images in order and splicing the live-action images in order; S4, calling the spliced spectral images and spliced live-action images under each acquisition path, splicing them in sequence according to the preset planned cruising path, and associating the full-frame spectral image with the full-frame live-action image; and S5, calling a wetland plant spectral database to compare and identify the plant spectral features on the full-frame spectral image, and drawing the identification results on the area map in the form of identification points. The invention can survey and identify wetland plants flexibly and conveniently, and identify them quickly and accurately.

Description

Technology for identifying plants by applying multispectral investigation camera by unmanned aerial vehicle
Technical Field
The invention relates to the technical field of wetland plant spectrum identification, in particular to a technology for identifying plants by an unmanned aerial vehicle through a multispectral detection camera.
Background
There are two main methods of conventional vegetation investigation. In the first, an investigator visits the site and assesses its condition visually. In the second (remote sensing), a recognizer discriminates vegetation using photographs or images taken from a satellite, an aircraft, or the like. These methods are used individually or in combination. The sensors used for remote sensing were originally panchromatic (black and white) but have recently become multispectral (color), and an expert (recognizer) interprets the multispectral photograph or image to recognize vegetation. Recently, vegetation map creation with a GIS (Geographic Information System) has become mainstream. In a GIS, the plant type is identified by pattern matching of images captured by a camera or sensor, using a normalized difference vegetation index (NDVI) together with reference information, prepared in advance, indicating the shape and color of the crown of each plant type.
These two investigation methods are mainly applied to natural environments such as plains, hills and mountain forest land. Although wetlands are an important environment in natural ecology, few plant-investigation studies have so far been carried out on them. This is mainly due to the particularity of the wetland environment: wetlands are inconvenient to observe on the spot, and many wetland areas are small and located in cities, so it is inconvenient to call on satellites and aircraft for remote measurement.
The Hangzhou Xixi wetland lies in the western part of the Hangzhou urban area of Zhejiang province, spanning the Xihu and Yuhang districts; it is only 6 kilometers from the Wulin Gate of the Hangzhou main urban area and only 5 kilometers from West Lake. The total area of the park is about 11.5 square kilometers, the total length of its rivers exceeds 100 kilometers, and about 70 percent of the area is water (harbors, ponds, lakes, marshes and the like), among which waterways criss-cross like lanes, river branches spread like a net, fish ponds lie in close ranks and islets are scattered like pieces on a chessboard. The land greening rate exceeds 85 percent, and the soil mainly comprises three types (red soil, lithologic soil and paddy soil), of which red soil and paddy soil are the most widely distributed. The Xixi wetland contains 254 species of vascular plants belonging to 204 genera in 91 families: 9 species of ferns in 9 genera and 8 families; 7 species of gymnosperms in 6 genera and 4 families; 184 species of dicotyledons in 146 genera and 67 families; and 54 species of monocotyledons in 43 genera and 12 families. The major components are the Gramineae, Compositae, Leguminosae and Rosaceae. The plant species are abundant and varied, but endemic genera and species are lacking. Common plants include green bristlegrass, pennisetum, bermuda grass, achyranthes bidentata, persimmon, magnolia officinalis, mangosteen, dichroa henryi, honeysuckle, multiflora rose, chinaberry, broussonetia papyrifera, etc.
The Xixi wetland is an important component of the Hangzhou green-land ecosystem. The wetland regulates the atmospheric environment: its rich animal and plant communities absorb a large amount of carbon dioxide and release oxygen, and the wetland can also absorb harmful gases, purifying the air, countering urban air pollution, and providing cities with sufficient water sources and good climatic conditions. The wetland is also a typical diversified ecosystem: its complex and diverse plant communities provide a good habitat for wild animals and a place for birds and amphibians to breed, inhabit, migrate and overwinter, which is important for improving urban species diversity.
Research on the plant species and distribution of the Xixi wetland is therefore important for protecting its ecological health, and the present invention is proposed on the basis of this problem.
Disclosure of Invention
To remedy the shortcomings of the prior art, the invention provides a technique by which an unmanned aerial vehicle identifies plants using a multispectral detection camera.
The technical scheme adopted by the invention for realizing the technical effects is as follows:
a technology for identifying plants by an unmanned aerial vehicle through a multispectral investigation camera comprises the following steps:
s1, synchronously acquiring double images, namely acquiring double images of a spectrum image and a live-action image of plants in a wetland identification area by adopting an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera according to a preset planning cruising path in a flat push scanning mode;
s2, synchronously uploading the double images, and synchronously uploading the synchronously acquired spectral images and live-action images to a cloud server;
S3, splicing the double images, namely splicing the spectral images in order and splicing the live-action images in order, calling the GPS information carried by the live-action images, marking the GPS longitude and latitude information on the spliced live-action images in the form of a plane grid, and storing the spliced spectral images and the spliced live-action images respectively under the acquisition path;
s4, splicing the full picture, calling the spliced spectral images and spliced live-action images under all the acquisition paths, sequentially splicing according to a preset planning cruise path, and associating the spectral images and the live-action images of the full picture;
and S5, performing spectrum recognition, calling the wetland plant spectrum database to compare and recognize plant spectral features on the full-frame spectral image, and drawing the recognition result on the area map in the form of identification points.
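Step S5 compares plant spectral features against a wetland plant spectral database without naming a comparison metric. As an illustration only, a common choice for such comparisons in remote sensing is the spectral angle between reflectance vectors; the function names, the 4-band signatures and the 0.1-radian threshold below are assumptions, not taken from the patent.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two reflectance vectors."""
    cos_t = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos_t, -1.0, 1.0)))

def match_pixel(pixel, library, max_angle=0.1):
    """Return the library species with the smallest spectral angle,
    or None if no entry falls within max_angle radians."""
    best_name, best_angle = None, max_angle
    for name, ref in library.items():
        ang = spectral_angle(pixel, ref)
        if ang < best_angle:
            best_name, best_angle = name, ang
    return best_name

# hypothetical 4-band reflectance signatures for two wetland plants
library = {
    "reed":   np.array([0.05, 0.08, 0.45, 0.50]),
    "willow": np.array([0.04, 0.10, 0.30, 0.35]),
}
```

A pixel spectrum close to the "reed" signature, e.g. `np.array([0.05, 0.09, 0.44, 0.49])`, is matched to "reed"; a spectrum unlike every library entry returns None and would be left unlabeled on the area map.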
Preferably, in the method for identifying wetland plants by using the hyperspectral reconnaissance camera by the unmanned aerial vehicle, before the step S5 is executed, the method further includes a step S45 of performing deconvolution noise reduction processing on the full-frame spectral image, and the steps are as follows:
S451, performing a Gaussian fit on the spectral peaks whose characteristic points have been matched, to obtain an initial guess h0 of the deconvolution iteration;
S452, obtaining a filter function through Fourier transform (the Wiener form):
O = (H*·G)/(|H|^2 + 1/SNR),
where G, O and H are respectively the Fourier spectra of the measured spectral characteristic band, the noise-reduced band and the point spread function, and SNR is the signal-to-noise ratio;
S453, building the deconvolution iteration on the convolution model
g(x) = ∫o(x′)h(x-x′)dx′ = o(x)*h(x),
written in vector form as
g = [g(1), g(2), …, g(Ng)]^T,
o = [o(1), o(2), …, o(No)]^T,
h = [h(1), h(2), …, h(Nh)]^T,
g = F·o, F = [f1, f2, …, fNo], Ng = No + Nh - 1,
where fj (j = 1, …, No) is an Ng-dimensional column vector whose i-th element fj(i) satisfies
fj(i) = h(i - j + 1) for 1 ≤ i - j + 1 ≤ Nh, and fj(i) = 0 otherwise;
S454, substituting the initial guess and the increment Δh into the deconvolution iteration to obtain the noise-reduced full-frame spectral image. By the symmetry of convolution, g = F′·h also holds, with F′ the Ng × Nh matrix built from o in the same way as F is built from h; the best-fit value of the increment Δh then solves
Δh = (F′^T F′ + λI)^(-1) F′^T (g - F′·h),
where λ is a regularization parameter (the larger λ is, the more obvious the noise suppression and the lower the degree of detail restoration), taken as
λ = C/SNR, with constant coefficient C ∈ (0.6, 0.8);
S455, determining the iteration termination condition
|(σ(k+1) - σ(k))/σ(k)| < ε,
where ε is a small preset threshold and σ(k) is the half-height width of the Gaussian point spread function after k iteration steps, obtained from the fitted Gaussian peak as σ(k) = 2α3, where α1, α2 and α3 are respectively the peak height, center and half-width at half-height of the fitted Gaussian peak.
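The regularized increment Δh described above can be sketched numerically. The code below is a minimal illustration under the convolution-matrix reading of the steps: `conv_matrix` builds the convolution matrix from one factor, `update_psf` applies one regularized least-squares increment to the point-spread-function estimate, and `converged` encodes the relative-width stopping rule. The function names and test signal are assumptions, not from the patent.

```python
import numpy as np

def conv_matrix(o, n_h):
    """N_g x N_h convolution matrix F with F[i, j] = o[i - j]
    (zero outside range), so that F @ h equals the full discrete
    convolution of o and h, with N_g = N_o + N_h - 1."""
    n_o = len(o)
    F = np.zeros((n_o + n_h - 1, n_h))
    for i in range(F.shape[0]):
        for j in range(n_h):
            if 0 <= i - j < n_o:
                F[i, j] = o[i - j]
    return F

def update_psf(g, o, h, lam):
    """One regularized least-squares increment for the PSF estimate:
    dh = (F^T F + lam*I)^(-1) F^T (g - F h), with F built from o."""
    F = conv_matrix(o, len(h))
    dh = np.linalg.solve(F.T @ F + lam * np.eye(len(h)), F.T @ (g - F @ h))
    return h + dh

def converged(sigma_prev, sigma_new, eps=1e-3):
    """Stopping rule on the relative change of the fitted PSF width."""
    return abs((sigma_new - sigma_prev) / sigma_prev) < eps
```

With a small regularization parameter and a consistent measurement g, one update of a perturbed initial guess already drives the residual close to zero; a larger λ trades detail restoration for stronger noise suppression, as the text notes.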
Preferably, in the method for identifying wetland plants by the unmanned aerial vehicle through the hyperspectral investigation camera, the framing pictures of the hyperspectral investigation camera and the high-resolution camera are the same size; the two cameras are arranged in a straight line on the pan-tilt of the unmanned aerial vehicle along the length direction of the belly of the unmanned aerial vehicle, with the hyperspectral investigation camera positioned in front of the high-resolution camera, and the two cameras are arranged compactly.
Preferably, in the method for identifying wetland plants by the unmanned aerial vehicle using the hyperspectral investigation camera, the preset planned cruising path takes the current wetland identification area as a path pool, the path is formed by continuous straight-line segments, each straight-line segment path is overlapped with a central line of a view-finding picture of the hyperspectral investigation camera, and the central line corresponds to the advancing direction of the unmanned aerial vehicle when flying on the current straight-line segment path.
Preferably, in the method for identifying wetland plants by using the hyperspectral investigation camera by the unmanned aerial vehicle, the planned cruising path is a zigzag path, and the distance between two parallel straight-line-segment paths is the width of a framing picture of the hyperspectral investigation camera.
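A zigzag cruising path with adjacent legs spaced one view-frame width apart can be generated from the bounds of the identification area and the camera swath. The sketch below assumes simple planar coordinates; the function name and parameters are illustrative, not from the patent.

```python
def zigzag_path(x_min, x_max, y_min, y_max, swath):
    """Waypoints of a boustrophedon (zigzag) cruise path covering the
    rectangle, with adjacent straight legs one swath width apart."""
    waypoints, y, forward = [], y_min, True
    while y <= y_max:
        if forward:
            waypoints += [(x_min, y), (x_max, y)]   # fly left-to-right
        else:
            waypoints += [(x_max, y), (x_min, y)]   # fly right-to-left
        y += swath
        forward = not forward
    return waypoints
```

For example, `zigzag_path(0, 100, 0, 30, 10)` yields four straight legs (eight waypoints) whose parallel segments are separated by exactly the 10-unit swath, matching the spacing rule stated above.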
Preferably, in the method for identifying wetland plants by using the hyperspectral investigation camera by the unmanned aerial vehicle, the planned cruising path is a pulse-type back-and-forth path, and the distance between parallel straight-line segments of two peak lines of the same pulse peak is the width of a framing picture of the hyperspectral investigation camera.
Preferably, in the method for identifying wetland plants by the unmanned aerial vehicle using the hyperspectral investigation camera, the lenses of the hyperspectral investigation camera and the high-resolution camera are vertically downward for image acquisition.
Preferably, in the method for identifying wetland plants by the unmanned aerial vehicle through the hyperspectral investigation camera, the hyperspectral investigation camera and the high-resolution camera are controlled by a controller, the controller is connected with the GPS module of the unmanned aerial vehicle, and each time the unmanned aerial vehicle advances the distance of one lens frame, the controller controls the hyperspectral investigation camera and the high-resolution camera to shoot one spectral image and one live-action image of that lens frame.
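The controller logic just described (trigger both cameras each time the vehicle has advanced one lens-frame length, as measured from GPS fixes) can be sketched as follows. The haversine distance and the function names are standard/illustrative; the 20 m frame length used in the usage note is an assumption.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def capture_indices(fixes, frame_len_m):
    """Indices of GPS fixes at which both cameras are triggered:
    the first fix, then every time the travelled distance since the
    last trigger reaches one frame length."""
    triggers, acc = [0], 0.0
    for i in range(1, len(fixes)):
        acc += haversine_m(*fixes[i - 1], *fixes[i])
        if acc >= frame_len_m:
            triggers.append(i)
            acc = 0.0
    return triggers
```

With fixes spaced about 11 m apart along a straight leg and a 20 m frame length, every second fix fires the cameras, so consecutive frames tile the leg without fusion, consistent with the ordered-splicing scheme of steps S3 and S4.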
Preferably, in the method for identifying wetland plants by the unmanned aerial vehicle through the hyperspectral investigation camera, the double images are acquired in April to May, on windless sunny days, between 9:00 and 10:00 or between 15:00 and 16:00, and the flying height of the unmanned aerial vehicle is 30-50 meters.
Preferably, in the method for identifying wetland plants by the unmanned aerial vehicle using the hyperspectral reconnaissance camera, the number of the unmanned aerial vehicles is at least one, or the unmanned aerial vehicles are units flying in a straight-line formation manner.
The invention has the following beneficial effects. By adopting an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera to perform double-image acquisition of spectral images and live-action images of wetland plants, the method can survey and identify wetland plants flexibly and conveniently; by comparing and identifying the plant spectral features on the full-frame spectral image against the wetland plant spectral database, it can identify wetland plants quickly and accurately. The GPS longitude and latitude information is marked on the spliced live-action image in the form of a plane grid, and the identification results are drawn in the form of identification points on the matching spectral image on the area map, so the plant distribution at each position of the wetland area can be checked visually: both the live-action image of the real scene and the marked spectral image are provided, which is convenient for viewing and for calling up data. The hyperspectral investigation camera and the high-resolution camera are controlled by a controller connected with the GPS module of the unmanned aerial vehicle; each time the unmanned aerial vehicle advances the distance of one lens frame, the controller controls the two cameras to shoot one spectral image and one live-action image of that lens frame. This saves energy, and the captured spectral and live-action images correspond to each current lens frame, so no fusion of frame images is needed; the full-frame images only need to be spliced in order, and the image frames and image data are real and complete.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention;
FIG. 2 is a flow chart of another embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera according to the invention;
FIG. 4 is a schematic view of a zigzag planned cruising path according to an embodiment of the present invention;
FIG. 5 is a schematic view of a pulse-type back-and-forth planned cruising path according to an embodiment of the present invention;
FIG. 6 is a graph of reflectance before noise reduction of a spectral image according to the present invention;
FIG. 7 is a graph of the reflectance after noise reduction of a spectral image according to the present invention;
fig. 8 is a schematic diagram of the recognition result of the present invention plotted on the area map in the form of the identification point.
Detailed Description
For a further understanding of the invention, reference is made to the following description taken in conjunction with the accompanying drawings and specific examples, in which:
in the description of the present invention, it should be noted that the terms "vertical", "upper", "lower", "horizontal", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. Furthermore, "first," "second," "third," and "fourth" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be further noted that, unless otherwise specifically stated or limited, the terms "disposed," "mounted," "connected," and "connected" are to be construed broadly and may be, for example, fixedly connected, detachably connected, integrally connected, mechanically connected, electrically connected, directly connected, connected through an intermediate medium, or connected through the insides of two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
The first embodiment is as follows:
the invention provides a technology for identifying plants by an unmanned aerial vehicle by using a multispectral detection camera, which comprises the following steps of:
s1, synchronously acquiring double images, namely acquiring double images of a spectrum image and a live-action image of plants in a wetland identification area by adopting an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera according to a preset planning cruising path in a flat push scanning mode;
s2, synchronously uploading the double images, synchronously uploading the synchronously acquired spectral images and live-action images to a cloud server, and uploading by adopting a wireless transmission module in real time;
s3, splicing the two images, namely splicing the spectrogram images in order, splicing the live-action images in order, calling GPS information carried by the live-action images, identifying the GPS longitude and latitude information on the spliced live-action images in a plane woven net mode, and respectively storing the spliced spectrogram images and the spliced live-action images under the acquisition path;
s4, splicing the full picture, calling the spliced spectral images and spliced live-action images under all the acquisition paths, sequentially splicing according to a preset planning cruise path, and associating the spectral images and the live-action images of the full picture;
and S5, performing spectrum recognition, calling the wetland plant spectrum database to compare and recognize plant spectral features on the full-frame spectral image, and drawing the recognition result on the area map in the form of identification points.
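Steps S3 and S4 splice per-path strips in order and then assemble the full frame along the planned path; on a zigzag path, alternate legs are flown in opposite directions, so every second strip must be reversed before assembly. A minimal array-based sketch (the layout convention and function names are assumptions):

```python
import numpy as np

def stitch_strip(frames):
    """Concatenate successive equal-size frames from one straight leg
    along the flight direction (rows)."""
    return np.vstack(frames)

def stitch_full(strips):
    """Assemble a full-frame image from ordered strips of a zigzag path:
    every second strip was flown in the opposite direction, so its rows
    are reversed before the strips are laid side by side (columns)."""
    oriented = [s if k % 2 == 0 else s[::-1] for k, s in enumerate(strips)]
    return np.hstack(oriented)
```

Because spectral and live-action frames are captured at the same lens-frame positions, the same ordering can be applied to both image sets, keeping the two full frames associated pixel-for-pixel as step S4 requires.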
Specifically, as shown in fig. 3, the hyperspectral reconnaissance camera 2 and the high-resolution camera 3 have the same framing picture size, and are carried on the unmanned aerial vehicle pan-tilt 4 in a straight line along the length direction of the abdomen body of the unmanned aerial vehicle 1, and the hyperspectral reconnaissance camera 2 is located in front of the high-resolution camera 3 and is compactly arranged.
In the preferred embodiment of the invention, the lenses of the hyperspectral reconnaissance camera 2 and the high-resolution camera 3 point vertically downward for picture acquisition. The unmanned aerial vehicle is provided with a gyroscope and a stabilizer so that it can keep a stable flight attitude, and the unmanned aerial vehicle pan-tilt 4 is an anti-shake adaptive adjusting head, which prevents lens shake and corrects the framing and shooting windows of the lenses, so that the pictures shot by the hyperspectral reconnaissance camera 2 and the high-resolution camera 3 are clear and reliable. The anti-shake adaptive adjusting head is an existing three-axis anti-shake pan-tilt, and the gyroscope and stabilizer are likewise existing components, so they are not described in detail here.
Specifically, in a preferred embodiment of the present invention, the hyperspectral reconnaissance camera 2 and the high-resolution camera 3 are controlled by a controller inside the unmanned aerial vehicle, the controller is connected to the GPS module of the unmanned aerial vehicle, and each time the unmanned aerial vehicle moves forward the distance of one lens frame, the controller controls the hyperspectral reconnaissance camera 2 and the high-resolution camera 3 to shoot one spectral image and one live-action image of that lens frame.
In order to better perform dual-image acquisition of the wetland area, the acquisition time is chosen as April to May, on windless sunny days, between 9:00 and 10:00 or between 15:00 and 16:00, and the flying height of the unmanned aerial vehicle is 30-50 meters. In the embodiment of the invention, the investigated wetland area is the Hangzhou Xixi wetland; the wetland water-area map, with the relevant documents, drawings and data, is provided by the Hangzhou Surveying and Mapping Bureau, and a simulated wetland water-area map containing longitude and latitude information is drawn with computer simulation drawing software.
Further, as a preferred embodiment of the present invention, as shown in fig. 2, before performing step S5, a step S45 of performing deconvolution noise reduction processing on the full-frame spectral image is further included, and the specific steps are as follows:
S451, performing a Gaussian fit on the spectral peaks whose characteristic points have been matched, to obtain an initial guess h0 of the deconvolution iteration;
S452, obtaining a filter function through Fourier transform (the Wiener form):
O = (H*·G)/(|H|^2 + 1/SNR),
where G, O and H are respectively the Fourier spectra of the measured spectral characteristic band, the noise-reduced band and the point spread function, and SNR is the signal-to-noise ratio;
S453, building the deconvolution iteration on the convolution model
g(x) = ∫o(x′)h(x-x′)dx′ = o(x)*h(x),
written in vector form as
g = [g(1), g(2), …, g(Ng)]^T,
o = [o(1), o(2), …, o(No)]^T,
h = [h(1), h(2), …, h(Nh)]^T,
g = F·o, F = [f1, f2, …, fNo], Ng = No + Nh - 1,
where fj (j = 1, …, No) is an Ng-dimensional column vector whose i-th element fj(i) satisfies
fj(i) = h(i - j + 1) for 1 ≤ i - j + 1 ≤ Nh, and fj(i) = 0 otherwise;
S454, substituting the initial guess and the increment Δh into the deconvolution iteration to obtain the noise-reduced full-frame spectral image. By the symmetry of convolution, g = F′·h also holds, with F′ the Ng × Nh matrix built from o in the same way as F is built from h; the best-fit value of the increment Δh then solves
Δh = (F′^T F′ + λI)^(-1) F′^T (g - F′·h),
where λ is a regularization parameter (the larger λ is, the more obvious the noise suppression and the lower the degree of detail restoration), taken as
λ = C/SNR, with constant coefficient C ∈ (0.6, 0.8);
S455, determining the iteration termination condition
|(σ(k+1) - σ(k))/σ(k)| < ε,
where ε is a small preset threshold and σ(k) is the half-height width of the Gaussian point spread function after k iteration steps, obtained from the fitted Gaussian peak as σ(k) = 2α3, where α1, α2 and α3 are respectively the peak height, center and half-width at half-height of the fitted Gaussian peak.
By performing deconvolution noise-reduction processing on the full-frame spectral image, the spectral resolution of the noise-reduced spectral image is improved by at least 20% over that of the spectral image before noise reduction, providing a more reliable spectral comparison source for the later spectral identification, so that the plant categories identified from the spectra are more accurate and errors are reduced. The noise removed mainly comprises water-surface ripple reflections and water-vapor refraction produced by evapotranspiration. Choosing the acquisition time as April to May, on windless sunny days, between 9:00 and 10:00 or between 15:00 and 16:00, keeps the factors affecting the unmanned aerial vehicle's double-image acquisition small. A flying height of 30-50 meters avoids the influence of wetland shrubs and grasses while giving the hyperspectral investigation camera 2 and the high-resolution camera 3 a wider picture-acquisition surface. Fig. 6 and fig. 7 show, respectively, the reflectance curve before and after noise reduction of the spectral image; the purpose of noise reduction is mainly to highlight the characteristic information of the spectral image and improve its signal-to-noise ratio, so that the later spectrum identification is free of interference and the identification efficiency and accuracy are improved.
Further, in a preferred embodiment of the present invention, the preset planned cruising path takes the current wetland identification area as the path pool; the path is formed of continuous straight-line segments, each straight-line segment coinciding with the center line of the view-finding picture of the hyperspectral reconnaissance camera, the center line corresponding to the advancing direction of the unmanned aerial vehicle when flying on the current straight-line segment. In the first embodiment, as shown in fig. 4, the planned cruising path is preferably a zigzag path, and the distance between two parallel straight-line-segment paths is the width of the viewfinder frame of the hyperspectral reconnaissance camera. Fig. 8 is a schematic diagram of the recognition results drawn on the area map in the form of identification points: the marks are solid dot clusters, with a different color for each plant type, and coordinate position information is displayed at the same time; the generated map is a computer drawing showing information such as plant type, distribution, quantity and position, which is convenient for viewing and calling up.
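The identification-point rendering described above (solid dot clusters, one colour per plant type, with coordinate positions) reduces to attaching a colour and a latitude/longitude to each recognition result before drawing. A minimal sketch; the species names, colours and function name are hypothetical, not from the patent.

```python
# hypothetical species-to-colour table for the map legend
SPECIES_COLOURS = {"reed": "green", "willow": "olive", "lotus": "pink"}

def identification_points(detections):
    """Turn (species, lat, lon) detections into plottable identification
    points: one dict per solid-dot marker, coloured by species, with
    unknown species falling back to gray."""
    points = []
    for species, lat, lon in detections:
        points.append({
            "species": species,
            "lat": lat,
            "lon": lon,
            "colour": SPECIES_COLOURS.get(species, "gray"),
        })
    return points
```

The resulting list can be handed to any plotting layer to draw the dot clusters on the simulated wetland water-area map, with the colour field driving the per-species legend.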
Example two:
The second embodiment of the present invention provides a technology for identifying plants by an unmanned aerial vehicle using a multispectral investigation camera; as shown in fig. 1, it includes the following steps:
S1, synchronous dual-image acquisition: an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera acquires dual images, a spectral image and a live-action image, of the plants in the wetland identification area along a preset planned cruising path in a horizontal push-broom scanning mode;
S2, synchronous dual-image uploading: the synchronously acquired spectral images and live-action images are uploaded synchronously to a cloud server in real time through a wireless transmission module;
S3, dual-image stitching: the spectral images are stitched in order and the live-action images are stitched in order; the GPS information carried by the live-action images is called, GPS longitude and latitude information is marked on the stitched live-action image in the form of a planar grid, and the stitched spectral image and stitched live-action image are stored separately under their acquisition path;
S4, full-frame stitching: the stitched spectral images and stitched live-action images under all acquisition paths are called and stitched in sequence according to the preset planned cruising path, and the full-frame spectral image and live-action image are associated;
S5, spectral recognition: the wetland plant spectral database is called to compare and recognize plant spectral features on the full-frame spectral image, and the recognition result is drawn on the area map in the form of identification points.
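The ordered stitching of steps S3 and S4 can be sketched with numpy array concatenation: tiles from one straight segment are joined into a strip, and strips from all segments are joined into the full frame, reversing every other strip when the path is flown zigzag. This is a minimal sketch under the assumption of equally sized, exactly abutting tiles (no overlap blending); the function names are illustrative:

```python
import numpy as np

def stitch_strip(tiles):
    # S3: join equally sized tiles captured along one straight segment,
    # in acquisition order, into a single strip.
    return np.concatenate(tiles, axis=0)

def stitch_full_frame(strips, serpentine=True):
    # S4: join per-path strips side by side according to the planned
    # cruising path; on a zigzag path every other strip was flown in
    # the opposite direction, so it is reversed before joining.
    oriented = [s if (i % 2 == 0 or not serpentine) else s[::-1]
                for i, s in enumerate(strips)]
    return np.concatenate(oriented, axis=1)
```

A real pipeline would additionally use the GPS metadata carried by each live-action image to register the strips before concatenation.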
Specifically, as shown in fig. 3, the hyperspectral investigation camera 2 and the high-resolution camera 3 have viewfinder frames of the same size and are mounted on the unmanned aerial vehicle pan-tilt 4 in a straight line along the length direction of the belly of the unmanned aerial vehicle 1, with the hyperspectral investigation camera 2 positioned in front of the high-resolution camera 3 and the two arranged compactly.
In the preferred embodiment of the invention, the lenses of the hyperspectral investigation camera 2 and the high-resolution camera 3 point vertically downward for image acquisition. The unmanned aerial vehicle is provided with a gyroscope and a stabilizer to keep its flying attitude stable, and the unmanned aerial vehicle pan-tilt 4 is an anti-shake self-adaptive adjusting pan-tilt that prevents lens shake and corrects the framing and shooting windows of the lenses, so that the pictures shot by the hyperspectral investigation camera 2 and the high-resolution camera 3 are clear and reliable. The anti-shake self-adaptive adjusting pan-tilt is an existing three-axis anti-shake pan-tilt, and the gyroscope and stabilizer are likewise existing components, so they are not described in detail here.
Specifically, in a preferred embodiment of the present invention, the hyperspectral investigation camera 2 and the high-resolution camera 3 are controlled by a controller inside the unmanned aerial vehicle, and the controller is connected to the GPS module of the unmanned aerial vehicle; each time the unmanned aerial vehicle advances by the distance of one lens frame, the controller controls the hyperspectral investigation camera 2 and the high-resolution camera 3 to shoot the spectral image and the live-action image for that lens frame.
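The controller behaviour just described (fire both shutters once per lens-frame length of travel) can be sketched as a distance accumulator fed by GPS position updates. The class and method names below are illustrative, not from the patent:

```python
class FrameTrigger:
    """Fire both cameras each time the drone has advanced one
    lens-frame length along its segment. Distances would come from
    the drone's GPS module."""
    def __init__(self, frame_length_m):
        self.frame_length = frame_length_m
        self.travelled = 0.0
        self.shots = 0

    def on_gps_update(self, distance_moved_m):
        # Accumulate travelled distance; fire once per full frame length,
        # carrying any leftover distance into the next frame.
        self.travelled += distance_moved_m
        fired = False
        while self.travelled >= self.frame_length:
            self.travelled -= self.frame_length
            self.shots += 1   # here: trigger hyperspectral + RGB shutters
            fired = True
        return fired
```

Triggering by distance rather than by a fixed timer is what lets each dual-image pair correspond to exactly one lens frame, so full-frame stitching needs no inter-frame fusion.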
In order to better perform dual-image acquisition over the wetland area, the acquisition time is chosen in April to May, on windless sunny days, at 9:00-10:00 or 15:00-16:00, and the flying height of the unmanned aerial vehicle is 30-50 m. In the embodiment of the invention, the investigated wetland area is the Xixi Wetland in Hangzhou; the wetland water area map, together with related documents, drawings and data, is provided by the Hangzhou Surveying and Mapping Bureau, and a simulated wetland water area map containing longitude and latitude information is generated with computer drawing software.
Further, as a preferred embodiment of the present invention, as shown in fig. 2, a step S45 of performing deconvolution noise reduction on the full-frame spectral image is further included before step S5 is performed; the specific steps are as follows:
S451, Gaussian fitting is performed on the spectral peaks with matched characteristic points to obtain an initial guess h0 of the deconvolution iterative function;
S452, the filter function is obtained through Fourier transform:
(formula image not reproduced in text)
where G, O and H are respectively the Fourier spectra of the spectral characteristic band, the noise band and the point spread function, and SNR is the signal-to-noise ratio;
S453, the deconvolution iterative function:
g(x) = ∫ o(x′) h(x − x′) dx′ = o(x) ∗ h(x),
g = [g(1), g(2), …, g(N_g)]^T,
o = [o(1), o(2), …, o(N_o)]^T,
h = [h(1), h(2), …, h(N_h)]^T,
(formula image not reproduced in text)
N_g = N_o + N_h − 1;
let f_i be an N_g-dimensional column vector whose elements f_j(i) satisfy the following relation:
(formula image not reproduced in text)
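The matrix relation above (observed spectrum g, true spectrum o, point spread function h, with N_g = N_o + N_h − 1) is the standard "full" convolution written in matrix form. As a numerical sanity check, a minimal sketch with an illustrative 3-tap PSF (values are made up, not from the patent):

```python
import numpy as np
from scipy.linalg import convolution_matrix

h = np.array([0.25, 0.5, 0.25])   # illustrative 3-tap point spread function
N_o = 6
F = convolution_matrix(h, N_o, mode='full')   # columns play the role of the f_i vectors
o = np.arange(1.0, N_o + 1.0)                 # a toy "true" spectrum
g = F @ o                                     # observed spectrum g = F o
```

`F` has N_o + N_h − 1 rows, matching the dimension relation above, and `F @ o` equals the direct convolution of h with o.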
S453, the initial guess is substituted into the deconvolution iterative function and iterated with the increment Δh to obtain the noise-reduced full-frame spectral image, where the solving equation for the best-fit value of the increment Δh is:
(formula image not reproduced in text)
where λ is the regularization parameter: the larger λ is, the more obvious the noise suppression and the lower the degree of detail restoration,
(formula image not reproduced in text)
with the constant coefficient C ∈ (0.6, 0.8);
S454, the iteration termination condition is determined as |(σ^(k+1) − σ^(k)) / σ^(k)|, where σ^(k) is the full width at half maximum of the Gaussian point spread function after k iteration steps:
(formula image not reproduced in text)
where α_1, α_2 and α_3 are respectively the peak height, center and half-width at half maximum of the fitted Gaussian peak.
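The S452 filter appears only as an image in this text; filters described in terms of G, H and SNR like this are typically the frequency-domain Wiener filter, so the sketch below assumes that standard form, O = G·conj(H) / (|H|² + 1/SNR), rather than the patent's exact expression, and notes the S454 stopping rule in a comment:

```python
import numpy as np

def wiener_deconvolve(g, h, snr):
    # Assumed standard Wiener form (not necessarily the patent's exact
    # filter): G and H are the Fourier spectra of the observed band g
    # and the point spread function h, SNR the signal-to-noise ratio.
    # An iterative refinement (S453) would stop when the relative change
    # of the fitted Gaussian PSF width, |(sigma_(k+1) - sigma_k) / sigma_k|,
    # falls below a tolerance (S454).
    n = len(g)
    G = np.fft.fft(g)
    H = np.fft.fft(h, n)
    O = G * np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(O))
```

With a high SNR the filter approaches plain inverse filtering; lowering the SNR term suppresses noise at the cost of detail, the same trade-off the regularization parameter λ controls above.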
By applying deconvolution noise reduction to the full-frame spectral image, the spectral resolution of the noise-reduced spectral image is improved by at least 20% compared with the image before noise reduction, providing a more reliable spectral reference for the later spectral identification, so that plant categories are identified more accurately and errors are reduced. The noise removed mainly comprises water-surface ripple reflections and refraction caused by water-vapor transpiration. The acquisition time is chosen in April to May, on windless sunny days, at 9:00-10:00 or 15:00-16:00, when the factors interfering with dual-image acquisition by the unmanned aerial vehicle are small. The flying height of the unmanned aerial vehicle is 30-50 meters, which avoids interference from wetland shrubs and grasses while giving the hyperspectral investigation camera 2 and the high-resolution camera 3 a wider acquisition swath. Fig. 6 and fig. 7 show the reflectivity curves of the spectral image before and after noise reduction, respectively; the purpose of noise reduction is mainly to highlight the characteristic information of the spectral image and improve its signal-to-noise ratio, so that later spectral identification is free of interference and identification efficiency and accuracy are improved.
Further, in a preferred embodiment of the present invention, the preset planned cruising path takes the current wetland identification area as the path pool; the path is formed of continuous straight-line segments, each straight-line segment coinciding with the center line of the viewfinder frame of the hyperspectral investigation camera, the center line corresponding to the advancing direction of the unmanned aerial vehicle on that segment. In the second embodiment, as shown in fig. 5, the planned cruising path is preferably a pulsed back-and-forth path, and the distance between the parallel straight-line segments of two peak lines of the same pulse peak is the width of the viewfinder frame of the hyperspectral investigation camera. Fig. 8 is a schematic diagram of the recognition result drawn on the area map in the form of identification points: identification uses solid dot clusters, different plant types are marked in different colors, and coordinate position information is displayed alongside, so that the generated computer-drawn map conveys plant type, distribution, quantity, position and other information and is convenient to view and retrieve.
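The fig. 8 presentation (solid dot clusters, one color per plant type, coordinates displayed alongside) can be sketched with matplotlib; the detections, species names and coordinates below are made-up placeholders, not data from the patent:

```python
import matplotlib
matplotlib.use("Agg")           # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical recognition results: (species, longitude, latitude).
detections = [("reed", 120.061, 30.271),
              ("reed", 120.063, 30.272),
              ("lotus", 120.065, 30.270)]
colors = {"reed": "tab:green", "lotus": "tab:pink"}

fig, ax = plt.subplots()
for species, lon, lat in detections:
    ax.scatter(lon, lat, c=colors[species], s=40)          # solid dot marker
    ax.annotate(f"({lon:.3f}, {lat:.3f})", (lon, lat), fontsize=6)
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
fig.savefig("wetland_identification_map.png")
```

Drawing the markers over the stitched live-action image (e.g. via `ax.imshow` with a geographic extent) would give the combined map described in the summary below.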
In the embodiment of the invention, the number of unmanned aerial vehicles is at least one. Optionally, in other embodiments, a plurality of linked unmanned aerial vehicles may be provided, flying as a unit in a straight-line formation to perform push-broom dual-image acquisition over the wetland area; flying in formation greatly improves the acquisition efficiency of the spectral images and live-action images.
In summary, in the method for identifying wetland plants by an unmanned aerial vehicle using a hyperspectral investigation camera, an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera performs dual-image acquisition of spectral images and live-action images of the wetland plants, making the survey flexible and convenient, and the plant spectral features on the full-frame spectral image are compared and recognized against the wetland plant spectral database, so that the wetland plants are identified quickly and accurately. GPS longitude and latitude information is marked on the stitched live-action image in the form of a planar grid, and the recognition result is drawn on the area map in the form of identification points on the matching spectral image, so that the plant distribution at each position of the wetland area can be viewed intuitively: both a live-action image of the real scene and a marked spectral image are available, which is convenient for viewing and data retrieval. The hyperspectral investigation camera and the high-resolution camera are controlled by the controller, which is connected to the GPS module of the unmanned aerial vehicle; each time the unmanned aerial vehicle advances by the distance of one lens frame, the controller controls the two cameras to shoot the spectral image and the live-action image for that lens frame. This saves energy, and since each shot spectral image and live-action image corresponds to one lens frame, no fusion of frame images is needed: the full frame only needs to be stitched in order, and the image frames and image data are true and complete.
The foregoing shows and describes the general principles, essential features, and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are merely illustrative of the principles of the invention, but that various changes and modifications may be made without departing from the spirit and scope of the invention, which is defined by the appended claims and their equivalents.

Claims (10)

1. A technology for identifying plants by an unmanned aerial vehicle through a multispectral investigation camera is characterized by comprising the following steps:
S1, synchronous dual-image acquisition: an unmanned aerial vehicle carrying a hyperspectral investigation camera and a high-resolution camera acquires dual images, a spectral image and a live-action image, of the plants in the wetland identification area along a preset planned cruising path in a horizontal push-broom scanning mode;
S2, synchronous dual-image uploading: the synchronously acquired spectral images and live-action images are uploaded synchronously to a cloud server;
S3, dual-image stitching: the spectral images are stitched in order and the live-action images are stitched in order; the GPS information carried by the live-action images is called, GPS longitude and latitude information is marked on the stitched live-action image in the form of a planar grid, and the stitched spectral image and stitched live-action image are stored separately under their acquisition path;
S4, full-frame stitching: the stitched spectral images and stitched live-action images under all acquisition paths are called and stitched in sequence according to the preset planned cruising path, and the full-frame spectral image and live-action image are associated;
S5, spectral recognition: the wetland plant spectral database is called to compare and recognize plant spectral features on the full-frame spectral image, and the recognition result is drawn on the area map in the form of identification points.
2. The technology for identifying plants by an unmanned aerial vehicle using a multispectral investigation camera as claimed in claim 1, further comprising, before step S5 is performed, a step S45 of performing deconvolution noise reduction on the full-frame spectral image, with the following specific steps:
S451, Gaussian fitting is performed on the spectral peaks with matched characteristic points to obtain an initial guess h0 of the deconvolution iterative function;
S452, the filter function is obtained through Fourier transform:
(formula image not reproduced in text)
where G, O and H are respectively the Fourier spectra of the spectral characteristic band, the noise band and the point spread function, and SNR is the signal-to-noise ratio;
S453, the deconvolution iterative function:
g(x) = ∫ o(x′) h(x − x′) dx′ = o(x) ∗ h(x),
g = [g(1), g(2), …, g(N_g)]^T,
o = [o(1), o(2), …, o(N_o)]^T,
h = [h(1), h(2), …, h(N_h)]^T,
(formula image not reproduced in text)
N_g = N_o + N_h − 1;
let f_i be an N_g-dimensional column vector whose elements f_j(i) satisfy the following relation:
(formula image not reproduced in text)
S453, the initial guess is substituted into the deconvolution iterative function and iterated with the increment Δh to obtain the noise-reduced full-frame spectral image, where the solving equation for the best-fit value of the increment Δh is as follows:
where λ is the regularization parameter: the larger λ is, the more obvious the noise suppression and the lower the degree of detail restoration,
(formula image not reproduced in text)
with the constant coefficient C ∈ (0.6, 0.8);
S454, the iteration termination condition is determined as |(σ^(k+1) − σ^(k)) / σ^(k)|, where σ^(k) is the full width at half maximum of the Gaussian point spread function after k iteration steps:
(formula image not reproduced in text)
where α_1, α_2 and α_3 are respectively the peak height, center and half-width at half maximum of the fitted Gaussian peak.
3. The technology for identifying plants by an unmanned aerial vehicle using a multispectral investigation camera as claimed in claim 1 or 2, wherein the hyperspectral investigation camera and the high-resolution camera have viewfinder frames of the same size and are mounted on the pan-tilt of the unmanned aerial vehicle in a straight line along the length direction of the belly of the unmanned aerial vehicle, with the hyperspectral investigation camera positioned in front of the high-resolution camera and the two arranged compactly.
4. The technology as claimed in claim 3, wherein the preset planned cruising path takes the current wetland identification area as the path pool and is formed of continuous straight-line segments, each straight-line segment coinciding with the center line of the viewfinder frame of the hyperspectral investigation camera, the center line corresponding to the advancing direction of the unmanned aerial vehicle when flying on that segment.
5. The technology as claimed in claim 3, wherein the planned cruising path is a zigzag path, and the distance between two parallel straight-line segments is the width of the viewfinder frame of the hyperspectral investigation camera.
6. The technology as claimed in claim 3, wherein the planned cruising path is a pulsed back-and-forth path, and the distance between the parallel straight-line segments of two peak lines of the same pulse peak is the width of the viewfinder frame of the hyperspectral investigation camera.
7. The technology for identifying plants by an unmanned aerial vehicle using a multispectral investigation camera as claimed in claim 1, wherein the lenses of the hyperspectral investigation camera and the high-resolution camera point vertically downward for image acquisition.
8. The technology as claimed in claim 1, wherein the hyperspectral investigation camera and the high-resolution camera are controlled by a controller connected to the GPS module of the unmanned aerial vehicle, and each time the unmanned aerial vehicle advances by the distance of one lens frame, the controller controls the hyperspectral investigation camera and the high-resolution camera to shoot the spectral image and the live-action image for that lens frame.
9. The technology as claimed in claim 1, wherein the dual images are acquired in April to May, on windless sunny days, at 9:00-10:00 or 15:00-16:00, and the flying height of the unmanned aerial vehicle is 30-50 m.
10. The technology as claimed in claim 1, wherein the number of unmanned aerial vehicles is at least one; when two or more are used, they fly as a unit in a straight-line formation.
CN201910933748.XA 2019-09-29 2019-09-29 Technology for identifying plants by applying multispectral investigation camera by unmanned aerial vehicle Active CN110657891B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910933748.XA CN110657891B (en) 2019-09-29 2019-09-29 Technology for identifying plants by applying multispectral investigation camera by unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910933748.XA CN110657891B (en) 2019-09-29 2019-09-29 Technology for identifying plants by applying multispectral investigation camera by unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN110657891A true CN110657891A (en) 2020-01-07
CN110657891B CN110657891B (en) 2021-08-10

Family

ID=69039860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910933748.XA Active CN110657891B (en) 2019-09-29 2019-09-29 Technology for identifying plants by applying multispectral investigation camera by unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN110657891B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111556157A (en) * 2020-05-06 2020-08-18 中南民族大学 Crop distribution monitoring method, equipment, storage medium and device
CN111783539A (en) * 2020-05-30 2020-10-16 上海晏河建设勘测设计有限公司 Terrain measurement method, measurement device, measurement system and computer readable storage medium
CN111924101A (en) * 2020-08-31 2020-11-13 金陵科技学院 Unmanned aerial vehicle double-tripod-head camera and working method thereof
CN112540623A (en) * 2020-11-19 2021-03-23 华中农业大学 Method for realizing landscape patterns of field crops based on high-precision positioning unmanned aerial vehicle aerial seeding
CN113325868A (en) * 2021-05-31 2021-08-31 南通大学 Crop real-time identification system and method based on unmanned aerial vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102385694A (en) * 2010-09-06 2012-03-21 邬明权 Hyperspectral identification method for land parcel-based crop variety
CN103620381A (en) * 2011-06-29 2014-03-05 富士通株式会社 Plant species identification device, method, and program
CN104236710A (en) * 2014-09-29 2014-12-24 杭州彩谱科技有限公司 Spectrum super-resolution method of handheld light source color illumination photometry meter
CN108346143A (en) * 2018-01-30 2018-07-31 浙江大学 A kind of crop disease monitoring method and system based on the fusion of unmanned plane multi-source image
CN109447902A (en) * 2018-10-15 2019-03-08 广州地理研究所 A kind of image split-joint method, device, storage medium and equipment
CN109596533A (en) * 2018-12-18 2019-04-09 北京航天泰坦科技股份有限公司 A kind of potato planting management method based on unmanned plane high-spectral data

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102385694A (en) * 2010-09-06 2012-03-21 邬明权 Hyperspectral identification method for land parcel-based crop variety
CN103620381A (en) * 2011-06-29 2014-03-05 富士通株式会社 Plant species identification device, method, and program
CN104236710A (en) * 2014-09-29 2014-12-24 杭州彩谱科技有限公司 Spectrum super-resolution method of handheld light source color illumination photometry meter
CN108346143A (en) * 2018-01-30 2018-07-31 浙江大学 A kind of crop disease monitoring method and system based on the fusion of unmanned plane multi-source image
CN109447902A (en) * 2018-10-15 2019-03-08 广州地理研究所 A kind of image split-joint method, device, storage medium and equipment
CN109596533A (en) * 2018-12-18 2019-04-09 北京航天泰坦科技股份有限公司 A kind of potato planting management method based on unmanned plane high-spectral data

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111556157A (en) * 2020-05-06 2020-08-18 中南民族大学 Crop distribution monitoring method, equipment, storage medium and device
CN111783539A (en) * 2020-05-30 2020-10-16 上海晏河建设勘测设计有限公司 Terrain measurement method, measurement device, measurement system and computer readable storage medium
CN111924101A (en) * 2020-08-31 2020-11-13 金陵科技学院 Unmanned aerial vehicle double-tripod-head camera and working method thereof
CN111924101B (en) * 2020-08-31 2024-04-09 金陵科技学院 Unmanned aerial vehicle double-cradle head camera and working method thereof
CN112540623A (en) * 2020-11-19 2021-03-23 华中农业大学 Method for realizing landscape patterns of field crops based on high-precision positioning unmanned aerial vehicle aerial seeding
CN113325868A (en) * 2021-05-31 2021-08-31 南通大学 Crop real-time identification system and method based on unmanned aerial vehicle
CN113325868B (en) * 2021-05-31 2023-02-28 南通大学 Crop real-time identification system and method based on unmanned aerial vehicle

Also Published As

Publication number Publication date
CN110657891B (en) 2021-08-10

Similar Documents

Publication Publication Date Title
CN110657891B (en) Technology for identifying plants by applying multispectral investigation camera by unmanned aerial vehicle
JP2003009664A (en) Crop growth level measuring system, crop growth level measuring method, crop growth level measuring program, and computer-readable recording medium recorded with the program
CN111815014A (en) Crop yield prediction method and system based on unmanned aerial vehicle low-altitude remote sensing information
JP7229864B2 (en) REMOTE SENSING IMAGE ACQUISITION TIME DETERMINATION SYSTEM AND CROPT GROWTH ANALYSIS METHOD
CN108732129A (en) A kind of system and method with graphical representation agricultural land soil ingredient
CN112881294B (en) Unmanned aerial vehicle-based mangrove forest stand health degree evaluation method
US11800246B2 (en) Systems and methods for multispectral landscape mapping
CN113795846A (en) Method, device and computer storage medium for determining crop planting information
Meneses et al. Modelling heights of sparse aquatic reed (Phragmites australis) using Structure from Motion point clouds derived from Rotary-and Fixed-Wing Unmanned Aerial Vehicle (UAV) data
CN113610040B (en) Paddy field weed density real-time statistical method based on improved BiSeNetV2 segmentation network
Aldrich Detecting disturbances in a forest environment
Cermakova et al. Calculation of visible spectral indices from UAV-based data: small water bodies monitoring
Hoffmeister et al. 3D terres-trial laser scanning for field crop modelling
CN111412899B (en) Method for monitoring and evaluating river by using unmanned aerial vehicle surveying and mapping
KR20220106639A (en) Method and System for Detecting Aquatic Plants in Small Reservoirs using Multispectral UAV Imagery and Vegetation index
CN115797807A (en) Ocean garbage monitoring method, system and medium based on data of unmanned aerial vehicle-mounted spectrometer
Hosingholizade et al. Height estimation of pine (Pinus eldarica) single trees using slope corrected shadow length on unmanned aerial vehicle (UAV) imagery in a plantation forest
US20230095661A1 (en) Plant and/or vehicle locating
Pitt et al. Large-scale 35-mm aerial photographs for assessment of vegetation-management research plots in eastern Canada
Imam Aerial Photography and Photogrammetary
Sánchez-Azofeifa et al. Experiences in field data collection: In support of land use and land cover change classification in boreal and tropical environments
CN113743208B (en) Unmanned aerial vehicle array-based white dolphin number statistical method and system
CN111932622B (en) Device, method and system for determining flight altitude of unmanned aerial vehicle
Mancini et al. Unmanned Aerial System Applications to Coastal Environments
WO2023079063A1 (en) Method and system for collecting data on a field used for agriculture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant