CN115272314B - Agricultural low-altitude remote sensing mapping method and device - Google Patents
- Publication number: CN115272314B (application CN202211177670.1A)
- Authority
- CN
- China
- Prior art keywords
- full
- image
- color image
- preset
- remote sensing
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
- G06T7/0002 — Image analysis; inspection of images, e.g. flaw detection
- G01C11/00 — Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
- G01C15/00 — Surveying instruments or accessories not provided for in groups G01C1/00-G01C13/00
- G01J3/2823 — Imaging spectrometer
- G06V10/462 — Salient features, e.g. scale invariant feature transforms [SIFT]
- G06V10/761 — Proximity, similarity or dissimilarity measures
- G06V20/17 — Terrestrial scenes taken from planes or by drones
- G06V20/182 — Network patterns, e.g. roads or rivers
- G06V20/188 — Vegetation
- G06V20/194 — Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
- G06T2207/10032 — Satellite or aerial image; remote sensing
- G06T2207/10036 — Multispectral image; hyperspectral image
- G06T2207/10041 — Panchromatic image
- G06T2207/30168 — Image quality inspection
- G06T2207/30181 — Earth observation
- G06T2207/30188 — Vegetation; agriculture
Abstract
The invention relates to the technical field of scene imaging by unmanned aerial vehicles, and in particular to an agricultural low-altitude remote sensing mapping method and device. The agricultural low-altitude remote sensing mapping method comprises the following steps: dividing a target area into a grid to obtain a plurality of image acquisition points; controlling the unmanned aerial vehicle to visit the image acquisition points one by one to acquire a hyperspectral image and a full-color image at each point; when a full-color image is dissimilar to a preset reference image and is blurred, controlling the unmanned aerial vehicle to re-acquire the image during the return leg of the current flight operation; when a full-color image is dissimilar to the preset reference image but is not blurred, identifying whether the image contains black smoke and, if it does, controlling the unmanned aerial vehicle to re-acquire the image during the next flight operation. The agricultural low-altitude remote sensing mapping method provided by the invention saves energy and time otherwise consumed by workers.
Description
Technical Field
The invention relates to the technical field of scene imaging by unmanned aerial vehicles, and in particular to an agricultural low-altitude remote sensing mapping method and device.
Background
Because agricultural remote sensing mapping demands high precision and the required data are relatively complex, low-altitude remote sensing mapping is generally needed to collect and process farmland data.
High-precision low-altitude remote sensing mapping generally first divides a preset collection area into a grid to obtain a plurality of image acquisition points, and then controls an unmanned aerial vehicle carrying a panchromatic camera and a hyperspectral camera to perform fixed-point hovering photography at the image acquisition points one by one.
During fixed-point photography, a farmer may burn weeds or straw in the field and produce heavy black smoke over the fixed-point area, or animals such as foraging birds may collide with the hovering unmanned aerial vehicle. The captured image then shows heavy black smoke or is blurred; such a defective image does not meet the requirements of remote sensing mapping. Workers therefore have to inspect all acquired images after the unmanned aerial vehicle returns and, after inspection, control it to fly back to the corresponding image acquisition points to re-acquire the images, which consumes a great deal of the workers' energy and time.
Disclosure of Invention
To solve, or at least partially solve, the above technical problems, the invention provides an agricultural low-altitude remote sensing mapping method and device that can save the energy and time consumed by workers.
In a first aspect, the invention provides an agricultural low-altitude remote sensing mapping method, which comprises the following steps:
carrying out grid division on a target area to obtain a plurality of image acquisition points;
carrying out one flight operation: controlling an unmanned aerial vehicle carrying a hyperspectral camera and a panchromatic camera to move to the plurality of image acquisition points one by one and perform fixed-point hovering photography, obtaining a hyperspectral image and a full-color image of the fixed-point area corresponding to each image acquisition point;
selecting a captured full-color image and judging its similarity to a preset reference image;
when the similarity of the full-color image and the preset reference image is smaller than a preset similarity threshold value:
obtaining the fuzziness of the full-color image through a preset fuzziness judgment formula, and, when the fuzziness of the full-color image is greater than a preset fuzziness threshold, controlling the unmanned aerial vehicle to re-acquire the image at the corresponding image acquisition point during the return leg of the flight operation;
when the similarity of the panchromatic image and a preset reference image is smaller than a preset similarity threshold value and the fuzziness of the panchromatic image is smaller than a preset fuzziness threshold value:
identifying whether the obtained full-color image contains black smoke, and, when black smoke is present, controlling the unmanned aerial vehicle to re-acquire the image at the corresponding image acquisition point during the next flight operation.
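The branching logic of the steps above can be condensed into a short decision routine. The sketch below is illustrative only: the function name and return labels are ours, not the patent's, and the threshold defaults use the example values given later in this disclosure (similarity threshold 0.9, fuzziness threshold 15).

```python
def retake_schedule(similarity, fuzziness, has_black_smoke,
                    sim_threshold=0.9, fuzz_threshold=15):
    """Decide when, if at all, a fixed-point image must be re-acquired.

    similarity      -- similarity of the full-color image to the reference
    fuzziness       -- score from the fuzziness judgment formula
    has_black_smoke -- result of the HSV black-smoke identification
    """
    if similarity >= sim_threshold:
        return "keep"                 # image matches the historical reference
    if fuzziness > fuzz_threshold:
        return "retake_on_return"     # short-term factor, e.g. a bird strike
    if has_black_smoke:
        return "retake_next_flight"   # long-term factor, e.g. burning straw
    return "keep"                     # dissimilar but sharp and smoke-free

print(retake_schedule(0.95, 3, False))   # keep
print(retake_schedule(0.50, 40, False))  # retake_on_return
print(retake_schedule(0.50, 5, True))    # retake_next_flight
```

The final branch, an image that is dissimilar to the reference yet neither blurred nor smoky, is the case the disclosure later treats as a focal region during image fusion.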
Optionally, the preset ambiguity determination formula is as follows:
wherein D_i is the gray value of the i-th pixel of the full-color image, D_(i-1) is the gray value of the (i-1)-th pixel of the full-color image, H_i is the gray value of the i-th pixel of the preset reference image, and H_(i-1) is the gray value of the (i-1)-th pixel of the preset reference image.
Optionally, the preset ambiguity threshold is 15.
Optionally, identifying whether the obtained full-color image has black smoke comprises:
converting the obtained full-color image into the HSV channel, and judging that black smoke exists when the proportion of first characteristic pixels among all pixels of the full-color image exceeds a preset proportion;
the first characteristic pixel meets the following conditions in the HSV channel:
0.3<H<0.7,0.07<S<0.3,0.6<V≤1。
optionally, the preset proportion is 30%.
Optionally, the agricultural low-altitude remote sensing mapping method further includes:
fusing the full-color image and the hyperspectral image obtained in one flight operation, including: after the full-color image and the hyperspectral image each acquire their feature descriptors through the SIFT algorithm, matching the feature descriptors of the two images to complete the fusion of the full-color image and the hyperspectral image;
when the similarity of a full-color image and the preset reference image is smaller than the preset similarity threshold, its fuzziness is smaller than the preset fuzziness threshold, and it contains no black smoke, the numbers of feature descriptors obtained through the SIFT algorithm from that full-color image and from the hyperspectral image of the corresponding fixed-point area are each a first number; the numbers of feature descriptors obtained through the SIFT algorithm from the full-color images and hyperspectral images of the other fixed-point areas are each a second number; and the first number is larger than the second number.
Optionally, the first number is 256 and the second number is 128.
Optionally, selecting the captured full-color image and judging its similarity to the preset reference image includes:
converting the full-color image and the preset reference image into gray-scale images, acquiring their gray-level histograms, calculating the degree of coincidence between the gray-level histogram of the full-color image and that of the preset reference image to obtain the similarity of the two images, and comparing the similarity with the preset similarity threshold.
Optionally, the preset similarity threshold is 0.9.
In a second aspect, the present invention provides an agricultural low-altitude remote sensing mapping device, including an unmanned aerial vehicle, a hyperspectral camera, a panchromatic camera, a processor, and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the agricultural low-altitude remote sensing mapping method according to any one of the first aspects.
Compared with the prior art, the technical scheme provided by the invention has the following advantages:
the agricultural low-altitude remote sensing surveying and mapping method provided by the invention can screen and identify short-term influence factors of blurred pictures of shot images caused by factors such as bird strike, and immediately acquire the images again to image acquisition points corresponding to the blurred pictures when the flight mission returns, so that the time delay caused by blurred pictures due to the short-term influence factors is reduced.
Through differentiating short-term influence factor and long term influence factor to control unmanned aerial vehicle respectively and carry out the retake to bad image when this flight operation return or when the next flight operation, can reduce short-term influence factor and long term influence factor to the total length of whole remote sensing operation, and can practice thrift energy and the time that the staff consumed.
Drawings
FIG. 1 is an application scene diagram of an agricultural low-altitude remote sensing mapping method provided by an embodiment of the invention;
FIG. 2 is a flow chart of an agricultural low-altitude remote sensing mapping method provided by the embodiment of the invention;
fig. 3 is a structural block diagram of an agricultural low-altitude remote sensing and mapping device provided by the embodiment of the invention.
Detailed Description
In order that the above objects, features and advantages of the present invention may be more clearly understood, a solution of the present invention will be further described below. It should be noted that the embodiments of the present invention and features of the embodiments may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways than those described herein; it is to be understood that the embodiments described in this specification are only some embodiments of the invention, and not all embodiments.
Fig. 1 is an application scene diagram of the agricultural low-altitude remote sensing mapping method provided by the embodiment of the invention. As shown in Fig. 1, the agricultural mapping method provided by the invention regulates the workflow of low-altitude remote sensing mapping and is applicable to a target area 12 of a defined size, such as a farmland, lake, or forest land whose boundary has already been delimited. The unmanned aerial vehicle 11, carrying a hyperspectral camera and a panchromatic camera, collects images one by one at the fixed-point areas corresponding to the image acquisition points into which the target area 12 is divided, and the quality of the collected images is assessed during the flight operation itself. When an image is identified as defective, the method distinguishes short-term influence factors of brief duration, such as foraging birds striking the unmanned aerial vehicle 11 and blurring the captured picture, from long-term influence factors of extended duration, such as heavy black smoke in the picture caused by burning weeds and leftover straw after harvest. If the cause is determined to be a short-term influence factor, the unmanned aerial vehicle 11 is controlled to return to the image acquisition point corresponding to the defective image and re-acquire the image on the return leg of the flight operation.
If the cause is determined to be a long-term influence factor, the unmanned aerial vehicle 11 is controlled to return to the image acquisition point corresponding to the defective image and re-acquire the image during the next flight. After returning, the unmanned aerial vehicle 11 must have its battery charged and its stored image data exported, which generally takes 45-90 minutes; by then the long-term influence factor has usually subsided, so the unmanned aerial vehicle 11 can successfully re-acquire a qualified image on the next flight.
By distinguishing short-term from long-term influence factors, and controlling the unmanned aerial vehicle 11 to retake defective images either on the return leg of the current flight operation or during the next flight operation accordingly, the impact of both kinds of factors on the total duration of the whole remote sensing operation is reduced, and the energy and time consumed by workers are saved.
Fig. 2 is a flowchart of an agricultural low-altitude remote sensing mapping method provided by the embodiment of the invention. Referring to fig. 2, the agricultural low-altitude remote sensing mapping method provided by the embodiment of the invention comprises the following steps:
and S201, performing grid division on the target area to obtain a plurality of image acquisition points.
Specifically, the target area is first divided into a grid on a map to obtain a plurality of small grid cells; the center point of each cell is an image acquisition point. In this embodiment, the centers of adjacent cells are 50 meters apart, yielding a plurality of image acquisition points and their corresponding longitudes and latitudes.
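As a sketch of this gridding step, the routine below lays 50-meter cells over a rectangular area expressed in local meters and returns the cell centers as acquisition points. Converting the local offsets to longitude and latitude (e.g. from a reference corner of the plot) is assumed to be handled separately; the function name and the rectangular-area simplification are ours.

```python
def acquisition_points(width_m, height_m, spacing_m=50.0):
    """Return the centers of spacing_m x spacing_m grid cells covering a
    width_m x height_m rectangle, as (x, y) offsets in local meters."""
    nx = max(1, round(width_m / spacing_m))   # cells along the width
    ny = max(1, round(height_m / spacing_m))  # cells along the height
    return [(spacing_m * (i + 0.5), spacing_m * (j + 0.5))
            for j in range(ny) for i in range(nx)]

points = acquisition_points(200, 100)
print(len(points))   # 8 acquisition points for a 200 m x 100 m plot
print(points[0])     # (25.0, 25.0): center of the first 50 m cell
```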
And S202, controlling the unmanned aerial vehicle to carry out fixed point suspension shooting on the plurality of image acquisition points one by one.
Specifically, the unmanned aerial vehicle loaded with the hyperspectral camera and the panchromatic camera is controlled to navigate by GPS to the longitude and latitude of each image acquisition point in turn; the hyperspectral camera then acquires the hyperspectral image of the fixed-point area corresponding to the acquisition point, and the panchromatic camera acquires its full-color image. In this embodiment, the fixed-point area is the area covered by the corresponding grid cell.
And S203, selecting the shot full-color image and a preset reference image to judge the similarity.
Specifically, the full-color image and the preset reference image are converted into gray-scale images; the preset reference image is a historical full-color image of the corresponding fixed-point area.
Their gray-level histograms are then acquired, and the degree of coincidence between the histogram of the full-color image and that of the preset reference image is calculated to obtain the similarity of the two images; that is, the similarity is obtained through a gray-level histogram algorithm.
The similarity is compared with a preset similarity threshold, which is 0.9 in this embodiment.
In other embodiments, SSIM (structural similarity), three-channel histogram similarity, or other commonly used image similarity algorithms may also be used to judge the similarity between the full-color image and the preset reference image. The similarity threshold changes with the algorithm; in general it should ensure that the similarity between the full-color image and the preset reference image is no lower than 90%.
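The gray-level histogram comparison described above can be sketched as a normalized histogram intersection, with the 0.9 threshold playing the role of the preset similarity threshold. The intersection measure used here is a common choice for histogram overlap, not necessarily the exact coincidence measure of the patent.

```python
import numpy as np

def histogram_similarity(gray_a, gray_b, bins=256):
    """Overlap (intersection) of the normalized gray-level histograms of
    two 8-bit grayscale images; 1.0 means identical histograms."""
    ha, _ = np.histogram(gray_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(gray_b, bins=bins, range=(0, 256))
    ha = ha / ha.sum()                      # normalize to probability mass
    hb = hb / hb.sum()
    return float(np.minimum(ha, hb).sum())  # shared mass across all bins

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))
print(histogram_similarity(img, img))  # ~1.0 for identical images
```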
In this embodiment, because the grayed full-color image is insensitive to seasonal changes, simply judging the similarity between the grayed full-color image and the preset reference image reveals whether the image acquired at an acquisition point fails the picture-quality requirement because of a short-term or a long-term influence factor.
And S204, obtaining, through a preset fuzziness judgment formula, the fuzziness of each full-color image whose similarity is lower than the preset similarity threshold, to judge whether the image is blurred.
Specifically, a full-color image whose similarity is lower than the preset similarity threshold is judged through the preset fuzziness judgment formula, which is as follows:
wherein D_i is the gray value of the i-th pixel of the full-color image, D_(i-1) is the gray value of the (i-1)-th pixel of the full-color image, H_i is the gray value of the i-th pixel of the preset reference image, and H_(i-1) is the gray value of the (i-1)-th pixel of the preset reference image.
The full-color image and the preset reference image are grayed and substituted into the preset fuzziness judgment formula to obtain the fuzziness of the full-color image whose similarity is lower than the preset similarity threshold.
The fuzziness of that full-color image is compared with a preset fuzziness threshold: the image is judged blurred if its fuzziness is greater than the threshold, and not blurred otherwise. In this embodiment, the preset fuzziness threshold is 15. In other embodiments, workers may tune the preset fuzziness threshold according to the full-color images actually captured during flight operations.
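The fuzziness judgment formula itself appears as an image in the original and is not reproduced here. Purely for illustration, the sketch below uses a generic gradient-energy ratio between the reference and the candidate, consistent with the D_i and H_i variables defined above but not guaranteed to match the patented formula: a heavily blurred image loses its adjacent-pixel gray differences and therefore scores high against a sharp reference.

```python
import numpy as np

def fuzziness_score(gray, reference, eps=1e-6):
    """Illustrative blur score: total adjacent-pixel gray difference of the
    reference (H_i - H_(i-1)) divided by that of the candidate (D_i - D_(i-1)).
    The higher the score, the blurrier the candidate relative to the reference.
    """
    d = np.abs(np.diff(gray.astype(np.float64).ravel())).sum()
    h = np.abs(np.diff(reference.astype(np.float64).ravel())).sum()
    return h / max(d, eps)

sharp = np.tile([0, 255], 128)          # high-contrast 1-D test pattern
blurred = np.full(256, 128)             # flat image: gradients destroyed
print(fuzziness_score(sharp, sharp))    # 1.0 -> well below a threshold of 15
print(fuzziness_score(blurred, sharp))  # huge -> judged blurred
```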
And S205, when a full-color image is judged blurred, controlling the unmanned aerial vehicle to return to the corresponding image acquisition point during the return leg and re-acquire the image.
Specifically, when a full-color image is judged blurred, the image acquisition point at which it was captured is recorded; on the return leg, the unmanned aerial vehicle first returns to that acquisition point to re-acquire the image of the fixed-point area and then continues the return flight.
The scheme provided by this embodiment can distinguish short-term influence factors and re-acquire the affected images during the return leg, saving time for the whole remote sensing mapping task.
And S206, identifying whether black smoke exists in the non-blurred full-color image with the similarity lower than a preset similarity threshold value.
After the judgment in S204, if a full-color image has a similarity to the preset reference image smaller than the preset similarity threshold and a fuzziness smaller than the preset fuzziness threshold, it is determined to be a non-blurred full-color image with similarity lower than the preset similarity threshold.
Specifically, a full-color image with similarity lower than a preset similarity threshold and without blurring is converted into an HSV channel, and when the proportion of first characteristic pixels in the full-color image to pixels of the full-color image exceeds a preset proportion, black smoke is judged to exist;
wherein a first characteristic pixel is a pixel of the full-color image that satisfies the following conditions in the HSV channel:
0.3<H<0.7,0.07<S<0.3,0.6<V≤1。
In the present embodiment, the preset proportion is 30%.
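The HSV thresholding of S206 can be sketched directly on an HSV array whose channels are normalized to [0, 1]. Converting from RGB (e.g. via `matplotlib.colors.rgb_to_hsv` or OpenCV, bearing in mind that OpenCV scales H to 0-179) is assumed to have been done beforehand; the function name is ours.

```python
import numpy as np

def has_black_smoke(hsv, ratio_threshold=0.30):
    """Return True when the share of 'first characteristic pixels'
    (0.3 < H < 0.7, 0.07 < S < 0.3, 0.6 < V <= 1) exceeds the preset
    proportion of 30%.  `hsv` is an (..., 3) float array in [0, 1]."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    mask = ((0.3 < h) & (h < 0.7) &
            (0.07 < s) & (s < 0.3) &
            (0.6 < v) & (v <= 1.0))
    return float(mask.mean()) > ratio_threshold

smoke = np.full((8, 8, 3), [0.50, 0.15, 0.8])  # grayish, desaturated haze
field = np.full((8, 8, 3), [0.25, 0.60, 0.5])  # saturated green vegetation
print(has_black_smoke(smoke))  # True
print(has_black_smoke(field))  # False
```

The narrow saturation band (0.07-0.3) is what separates the gray haze of smoke from both saturated vegetation and near-white sky.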
And S207, when a non-blurred full-color image with similarity lower than the preset similarity threshold contains black smoke, controlling the unmanned aerial vehicle to re-acquire the image at the corresponding image acquisition point during the next flight.
Specifically, in this embodiment, the image is collected again at that image acquisition point during the next flight task.
The scheme provided by this embodiment can distinguish long-term influence factors and perform the supplementary photography after the long-term factor has, as far as possible, subsided; no manual intervention by workers is needed, saving their energy and time.
Furthermore, after the unmanned aerial vehicle returns, the acquired hyperspectral images and full-color images need to be exported for fusion processing.
Specifically, the full-color image and the hyperspectral image obtained in one flight operation are fused. The fusion process includes: after the full-color image and the hyperspectral image each acquire their feature descriptors through the SIFT algorithm, matching the feature descriptors of the two images to complete the fusion. Fusing a panchromatic image with a hyperspectral image according to feature descriptors is prior art and is not repeated here; for example, identical feature descriptors are mapped to each other, or a neural network is trained to match the feature descriptors of the two images.
When the similarity of a full-color image and the preset reference image is smaller than the preset similarity threshold, its fuzziness is smaller than the preset fuzziness threshold, and it contains no black smoke, the numbers of feature descriptors obtained through the SIFT algorithm from that full-color image and from the hyperspectral image of the corresponding fixed-point area are each a first number. For the fixed-point areas whose full-color images have a similarity to the preset reference image greater than the preset similarity threshold, the numbers of feature descriptors obtained from their full-color and hyperspectral images are each a second number, and the first number is larger than the second number. In this embodiment, the first number is 256 and the second number is 128.
In this embodiment, if a full-color image has a similarity to the preset reference image smaller than the preset similarity threshold, a fuzziness smaller than the preset fuzziness threshold, and no black smoke, that is, the image is dissimilar to the historical full-color image yet neither blurred nor smoky, then the image itself has no defect, but the appearance of the corresponding fixed-point area has changed considerably. That fixed-point area therefore belongs to a focal region of the image fusion, and more feature descriptors are allocated to the full-color and hyperspectral images obtained there, so that the fused image embodies more details of the change and facilitates subsequent comparison and analysis against historical fused images.
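The descriptor-budget rule can be sketched as follows. The selector reflects the embodiment's 256/128 split, while `fuse_pair` shows one conventional SIFT match using OpenCV, an assumed dependency: the patent does not prescribe a particular library, and its descriptor-count control may differ from OpenCV's `nfeatures` cap.

```python
def descriptor_budget(similarity, fuzziness, has_black_smoke,
                      sim_threshold=0.9, fuzz_threshold=15):
    """Focal regions (dissimilar to history, yet sharp and smoke-free)
    get the first number of SIFT descriptors; all others the second."""
    focal = (similarity < sim_threshold and
             fuzziness < fuzz_threshold and
             not has_black_smoke)
    return 256 if focal else 128

def fuse_pair(pan_gray, hyper_band, nfeatures):
    """Match SIFT descriptors between a panchromatic image and one
    hyperspectral band (both 8-bit, single-channel arrays)."""
    import cv2  # assumed available; SIFT ships with opencv-python >= 4.4
    sift = cv2.SIFT_create(nfeatures=nfeatures)
    _, des_pan = sift.detectAndCompute(pan_gray, None)
    _, des_hyp = sift.detectAndCompute(hyper_band, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    return matcher.match(des_pan, des_hyp)

print(descriptor_budget(0.5, 5, False))   # 256: focal region
print(descriptor_budget(0.95, 5, False))  # 128: unchanged region
```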
Fig. 3 is a structural block diagram of an agricultural low-altitude remote sensing mapping device provided by the embodiment of the invention. Referring to fig. 3, an agricultural low-altitude remote sensing mapping device provided by the embodiment of the invention includes: the system comprises a unmanned aerial vehicle 11, a hyperspectral camera 301, a panchromatic camera 302, a processor 304 and a memory 305, wherein at least one instruction, at least one program, a code set or an instruction set is stored in the memory 305, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor 304 to implement the functions of the agricultural low-altitude remote sensing mapping method in the above method embodiments, and the functions can be implemented by hardware or by hardware executing corresponding software.
In a specific application, the components of the agricultural low-altitude remote sensing and mapping device are coupled together through a bus system 303, wherein the bus system 303 may include a power bus, a control bus, a status signal bus and the like in addition to a data bus. For clarity of illustration, however, the various buses are illustrated in the figure as the bus system 303.
The method disclosed in the above embodiments of the present invention may be applied to the processor 304, or may be implemented by the processor 304. The processor 304 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 304. The processor 304 may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an off-the-shelf programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in ram, flash, rom, prom, or eprom, registers, etc. as is well known in the art. The storage medium is located in a memory, and a processor reads information in the memory and combines hardware thereof to complete the steps of the method.
For the agricultural low-altitude remote sensing mapping device, reference may be made to the description of the method embodiments above; details are not repeated here.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
The foregoing description is merely of exemplary embodiments of the present invention, provided to enable those skilled in the art to understand or practice the invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (9)
1. An agricultural low-altitude remote sensing mapping method, characterized by comprising the following steps:
carrying out grid division on a target area to obtain a plurality of image acquisition points;
carrying out one flight operation: controlling an unmanned aerial vehicle carrying a hyperspectral camera and a panchromatic camera to move to the plurality of image acquisition points one by one and perform fixed-point hover shooting, so as to obtain a hyperspectral image and a panchromatic image of the fixed-point area corresponding to each image acquisition point;
selecting a panchromatic image obtained by shooting and judging the similarity between the panchromatic image and a preset reference image;
when the similarity between the panchromatic image and the preset reference image is less than a preset similarity threshold:
obtaining the blurriness of the panchromatic image through a preset blurriness determination formula, and, when the blurriness of the panchromatic image is greater than a preset blurriness threshold, controlling the unmanned aerial vehicle to fly back to the image acquisition point corresponding to the panchromatic image and re-acquire the image during the return leg of the flight operation;
when the similarity between the panchromatic image and the preset reference image is less than the preset similarity threshold and the blurriness of the panchromatic image is less than the preset blurriness threshold:
identifying whether black smoke is present in the obtained panchromatic image, and, when black smoke is identified, controlling the unmanned aerial vehicle to return to the image acquisition point corresponding to the panchromatic image and re-acquire the image during the next flight operation;
fusing the panchromatic image and the hyperspectral image acquired in one flight operation, which comprises: extracting feature descriptors from the panchromatic image and the hyperspectral image respectively through the SIFT algorithm, and then matching the two sets of feature descriptors to complete the fusion of the panchromatic image and the hyperspectral image;
wherein, for a fixed-point area whose panchromatic image has a similarity to the preset reference image less than the preset similarity threshold, a blurriness less than the preset blurriness threshold, and no black smoke, the number of feature descriptors extracted by the SIFT algorithm from each of its panchromatic image and hyperspectral image is a first number; for the panchromatic images and hyperspectral images of the other fixed-point areas, the number of feature descriptors extracted by the SIFT algorithm is a second number; and the first number is greater than the second number.
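The fusion step of claim 1 pairs SIFT descriptors extracted from the two images. The matching stage can be sketched in pure numpy as below; the function name is hypothetical, descriptor extraction itself (e.g. via OpenCV's cv2.SIFT_create()) is assumed to have been done already, and Lowe's ratio test stands in for whatever acceptance rule the patent actually applies:

```python
import numpy as np

def match_descriptors(desc_pan, desc_hyper, ratio=0.75):
    """Nearest-neighbour matching of two feature-descriptor sets.

    desc_pan, desc_hyper: float arrays of shape (n, d) holding
    SIFT-style descriptors for the panchromatic and hyperspectral
    images respectively. Returns (i, j) index pairs into the two sets,
    keeping only unambiguous matches via Lowe's ratio test.
    """
    matches = []
    for i, d in enumerate(desc_pan):
        # Euclidean distance from this descriptor to every candidate.
        dists = np.linalg.norm(desc_hyper - d, axis=1)
        order = np.argsort(dists)
        if len(order) >= 2 and dists[order[0]] < ratio * dists[order[1]]:
            matches.append((i, int(order[0])))
        elif len(order) == 1:
            matches.append((i, int(order[0])))
    return matches
```

In practice OpenCV's cv2.BFMatcher performs the same nearest-neighbour search far faster; the loop above only makes the logic explicit.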
2. The agricultural low-altitude remote sensing mapping method according to claim 1, wherein the preset blurriness determination formula is as follows:
wherein D_i is the gray value of the i-th pixel of the panchromatic image, D_{i-1} is the gray value of the (i-1)-th pixel of the panchromatic image, H_i is the gray value of the i-th pixel of the preset reference image, and H_{i-1} is the gray value of the (i-1)-th pixel of the preset reference image.
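The formula of claim 2 is not reproduced in this text (it appears as an image in the original patent), so the following is purely an illustrative metric consistent with the variables D_i and H_i described above, not the patent's actual formula: it compares the total adjacent-pixel gray-value differences of the captured image against those of the reference, since a blurred image has weaker local gradients.

```python
import numpy as np

def blur_score(image, reference):
    """Hypothetical blur measure built from the claim's variables.

    image, reference: 2-D uint8 gray-scale arrays. The blurrier the
    captured image relative to the reference, the larger the score.
    """
    d = image.astype(np.float64).ravel()      # D_i in the claim
    h = reference.astype(np.float64).ravel()  # H_i in the claim
    grad_d = np.abs(np.diff(d)).sum()  # sum of |D_i - D_{i-1}|
    grad_h = np.abs(np.diff(h)).sum()  # sum of |H_i - H_{i-1}|
    # Guard against a perfectly flat captured image.
    return grad_h / max(grad_d, 1e-9)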
3. The agricultural low-altitude remote sensing mapping method according to claim 1, wherein the preset blurriness threshold is 15.
4. The agricultural low-altitude remote sensing mapping method according to claim 1, wherein identifying whether black smoke is present in the obtained panchromatic image comprises:
converting the obtained panchromatic image into the HSV color space, and judging that black smoke is present when the proportion of first characteristic pixels to the total number of pixels in the panchromatic image exceeds a preset proportion;
the first characteristic pixel meets the following conditions in the HSV channel:
0.3 < H < 0.7, 0.07 < S < 0.3, 0.6 < V ≤ 1.
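The HSV test of claim 4 can be sketched with Python's standard-library colorsys. The function name and the pixel-list interface are illustrative (a real implementation would vectorize over an image array), and the 30% proportion of claim 5 is used as the default:

```python
import colorsys

def has_black_smoke(rgb_pixels, min_fraction=0.3):
    """Flags black smoke when the share of 'first characteristic'
    pixels exceeds the preset proportion (30% per claim 5).

    rgb_pixels: iterable of (r, g, b) tuples with channels in 0..255.
    A pixel qualifies when, in HSV with all channels scaled to 0..1:
    0.3 < H < 0.7, 0.07 < S < 0.3, 0.6 < V <= 1.
    """
    pixels = list(rgb_pixels)
    if not pixels:
        return False
    hits = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        if 0.3 < h < 0.7 and 0.07 < s < 0.3 and 0.6 < v <= 1:
            hits += 1
    return hits / len(pixels) > min_fraction
```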
5. the agricultural low-altitude remote sensing mapping method according to claim 4, wherein the preset proportion is 30%.
6. The agricultural low-altitude remote sensing mapping method according to claim 1, wherein the first number is 256 and the second number is 128.
7. The agricultural low-altitude remote sensing mapping method according to claim 1, wherein judging the similarity between the panchromatic image obtained by shooting and the preset reference image comprises:
converting the panchromatic image and the preset reference image into gray-scale images and obtaining their respective gray-level histograms; calculating the degree of overlap between the gray-level histogram of the panchromatic image and that of the preset reference image to obtain the similarity between the two images; and comparing the similarity with the preset similarity threshold.
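The histogram-overlap similarity of claim 7 can be sketched with numpy as histogram intersection. The function names are illustrative, and the 0.9 threshold of claim 8 is used in the comparison helper:

```python
import numpy as np

def histogram_similarity(panchromatic, reference, bins=256):
    """Similarity as the overlap of normalized gray-level histograms.

    Both inputs are 2-D uint8 gray-scale arrays. Returns a value in
    [0, 1]; identical histograms give 1.0.
    """
    h1, _ = np.histogram(panchromatic, bins=bins, range=(0, 256))
    h2, _ = np.histogram(reference, bins=bins, range=(0, 256))
    h1 = h1 / h1.sum()  # normalize so image size does not matter
    h2 = h2 / h2.sum()
    # Histogram intersection: sum of the bin-wise minima.
    return float(np.minimum(h1, h2).sum())

def below_similarity_threshold(similarity, threshold=0.9):
    """True when the image differs enough from the reference to trigger
    the blur and black-smoke checks of claim 1 (threshold per claim 8)."""
    return similarity < threshold
```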
8. The agricultural low-altitude remote sensing mapping method according to claim 1, wherein the preset similarity threshold is 0.9.
9. An agricultural low-altitude remote sensing mapping apparatus, characterized by comprising an unmanned aerial vehicle, a hyperspectral camera, a panchromatic camera, a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the agricultural low-altitude remote sensing mapping method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211177670.1A CN115272314B (en) | 2022-09-27 | 2022-09-27 | Agricultural low-altitude remote sensing mapping method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115272314A CN115272314A (en) | 2022-11-01 |
CN115272314B true CN115272314B (en) | 2022-12-23 |
Family
ID=83757721
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211177670.1A Active CN115272314B (en) | 2022-09-27 | 2022-09-27 | Agricultural low-altitude remote sensing mapping method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115272314B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107784639A (en) * | 2017-11-02 | 2018-03-09 | 长安大学 | A kind of polygon filtering and noise reduction method of unmanned aerial vehicle remote sensing image improvement |
CN114881869A (en) * | 2022-03-24 | 2022-08-09 | 天津三源电力信息技术股份有限公司 | Inspection video image preprocessing method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110148147B (en) * | 2018-11-07 | 2024-02-09 | 腾讯大地通途(北京)科技有限公司 | Image detection method, image detection device, storage medium and electronic device |
WO2021152741A1 (en) * | 2020-01-29 | 2021-08-05 | 株式会社ナイルワークス | Crop-growing system |
CN112166911B (en) * | 2020-10-15 | 2022-05-24 | 安阳工学院 | Rapid accurate pest and disease damage detection pesticide application system and method |
CN113326752B (en) * | 2021-05-20 | 2024-04-30 | 淮阴工学院 | Unmanned aerial vehicle-based photovoltaic power station identification method and system |
CN114821375B (en) * | 2022-06-27 | 2022-09-06 | 江西省地矿测绘有限公司 | Mapping method and device based on multi-source remote sensing data, storage medium and equipment |
CN114885105B (en) * | 2022-07-12 | 2022-09-23 | 江苏奥派电气科技有限公司 | Image acquisition and adjustment method for photovoltaic power station inspection unmanned aerial vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN115272314A (en) | 2022-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230316555A1 (en) | System and Method for Image-Based Remote Sensing of Crop Plants | |
US10402692B1 (en) | Learning method and learning device for fluctuation-robust object detector based on CNN using target object estimating network adaptable to customers' requirements such as key performance index, and testing device using the same | |
CN110580428A (en) | image processing method, image processing device, computer-readable storage medium and electronic equipment | |
CN112560623B (en) | Unmanned aerial vehicle-based rapid mangrove plant species identification method | |
CN115861858B (en) | Small sample learning crop canopy coverage calculating method based on background filtering | |
CN111765974A (en) | Wild animal observation system and method based on miniature refrigeration thermal infrared imager | |
CN113850312A (en) | Forest ecological condition monitoring method and device, electronic equipment and storage medium | |
CN117456257A (en) | Agricultural pest identification method based on improved YOLOv5 | |
Gauci et al. | A Machine Learning approach for automatic land cover mapping from DSLR images over the Maltese Islands | |
CN115240168A (en) | Perception result obtaining method and device, computer equipment and storage medium | |
CN116612103A (en) | Intelligent detection method and system for building structure cracks based on machine vision | |
Varamesh et al. | Detection of land use changes in northeastern Iran by Landsat satellite data. | |
CN114693528A (en) | Unmanned aerial vehicle low-altitude remote sensing image splicing quality evaluation and redundancy reduction method and system | |
CN115272314B (en) | Agricultural low-altitude remote sensing mapping method and device | |
CN117292281B (en) | Open-field vegetable detection method, device, equipment and medium based on unmanned aerial vehicle image | |
CN117451012A (en) | Unmanned aerial vehicle aerial photography measurement method and system | |
Thorp et al. | Using aerial hyperspectral remote sensing imagery to estimate corn plant stand density | |
CN115655157A (en) | Fish-eye image-based leaf area index measuring and calculating method | |
CN110070513A (en) | The radiation correction method and system of remote sensing image | |
CN115061502A (en) | Method and system for determining optimal flight height of unmanned aerial vehicle, electronic device and medium | |
CN115311336A (en) | Image registration method, device and equipment of multiple cameras and storage medium | |
Miraki et al. | Using canopy height model derived from UAV imagery as an auxiliary for spectral data to estimate the canopy cover of mixed broadleaf forests | |
CN114821658A (en) | Face recognition method, operation control device, electronic device, and storage medium | |
CN112580510A (en) | Permeable ground area rate estimation method, permeable ground area rate estimation device, permeable ground area rate estimation equipment and storage medium | |
CN114092850A (en) | Re-recognition method and device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||