CN110399832B - TomoSAR vegetation pest and disease damage monitoring method and device based on coherence - Google Patents

Info

Publication number: CN110399832B (application number CN201910676471.7A)
Authority: CN (China)
Prior art keywords: vegetation, target, image data, data, coherence
Legal status: Active (granted)
Other versions: CN110399832A
Other languages: Chinese (zh)
Inventors: 徐伟, 曹琨坤, 谭维贤, 黄平平, 董亦凡, 李新武, 李秀娟
Current and original assignee: Inner Mongolia University of Technology
Application filed by Inner Mongolia University of Technology; priority to CN201910676471.7A

Classifications

    • G01S13/9021: SAR image post-processing techniques (under G01S13/90, radar for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR]; G01S13/89, radar specially adapted for mapping or imaging; G01S, radio direction-finding, radio navigation, determining distance or velocity by use of radio waves)
    • G01S13/9027: Pattern recognition for feature extraction
    • G01S13/90: Radar for mapping or imaging using synthetic aperture techniques
    • G06F18/24: Classification techniques (G06F18/00, pattern recognition; G06F18/20, analysing)
    • G06V20/13: Satellite images (G06V20/00, scenes and scene-specific elements; G06V20/10, terrestrial scenes)
    • G06V20/64: Three-dimensional objects (G06V20/60, type of objects)
    • G06F2218/12: Classification; Matching (G06F2218/00, aspects of pattern recognition specially adapted for signal processing)

Abstract

The embodiment of the application discloses a method and a device for monitoring TomoSAR vegetation pest and disease damage based on coherence. One specific embodiment of the monitoring method comprises the following steps: obtaining target image data based on the acquired synthetic aperture radar image data of the target vegetation; processing the target image data with a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation; performing coherence analysis on the three-dimensional structure data of the target vegetation and the sample vegetation data; and determining the pest and disease damage condition of the target vegetation according to the analysis result. This embodiment enables all-weather monitoring of vegetation, achieves high-accuracy measurement of the vegetation structure in the height direction, and helps improve the accuracy of vegetation pest and disease monitoring results.

Description

TomoSAR vegetation pest and disease damage monitoring method and device based on coherence
Technical Field
The embodiment of the application relates to the technical field of radar observation, in particular to a TomoSAR vegetation pest and disease damage monitoring method and device based on coherence.
Background
Synthetic aperture radar tomography (TomoSAR) is a new leading-edge technology developed over the last decade to acquire high-precision three-dimensional and four-dimensional information about a target. By changing the data processing algorithm after imaging, it can measure the distribution of scatterers in the height direction. Combined with polarization information, it can also retrieve the fine structure, physical composition and spatial distribution of a target, making it possible to distinguish multiple scatterers at different heights, monitor changes in the spatial position of scatterers, and so on. The technology has been applied to fields such as forest structure parameter estimation, urban three-dimensional reconstruction and urban surface subsidence, and shows great application potential in geology, glaciology and the detection of buried objects.
Disclosure of Invention
The embodiment of the application provides a TomoSAR vegetation pest and disease damage monitoring method and device based on coherence.
In a first aspect, an embodiment of the present application provides a method for monitoring TomoSAR vegetation pest and disease damage based on coherence, including: obtaining target image data based on the obtained synthetic aperture radar image data of the target vegetation; processing the target image data by adopting a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation; performing coherence analysis on the three-dimensional structure data of the target vegetation and the sample vegetation data; and determining the pest and disease damage condition of the target vegetation according to the analysis result.
In some embodiments, deriving target image data based on acquired synthetic aperture radar image data of the target vegetation comprises: and acquiring synthetic aperture radar image data of target vegetation at different monitoring heights, and performing reference correction and phase compensation on the acquired multiple image data to obtain target image data.
In some embodiments, acquiring synthetic aperture radar image data of target vegetation at different monitoring heights, and performing reference correction and phase compensation processing on the acquired multiple image data includes: monitoring target vegetation on different height surfaces by using the same synthetic aperture radar to obtain a plurality of pieces of image data; one of the plurality of pieces of image data is used as main image data, and reference correction and phase compensation processing are performed on the remaining image data.
In some embodiments, processing the target image data by using a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation includes: constructing a covariance matrix of target image data, and performing characteristic decomposition on the constructed covariance matrix to obtain a signal subspace matrix and a noise subspace matrix; and constructing a spatial spectrum function according to the signal subspace matrix and the noise subspace matrix, and searching a spectrum peak to obtain structural information of the target vegetation in the height direction.
In some embodiments, the performing the coherence analysis of the three-dimensional structure data of the target vegetation with the sample vegetation data comprises: and selecting pixel point data positioned on a vegetation canopy from the three-dimensional structure data of the target vegetation, and determining the coherence coefficient of the selected pixel point data and the pixel point data positioned at the same position of the sample vegetation.
In some embodiments, selecting pixel point data located in a vegetation canopy from three-dimensional structure data of a target vegetation includes: selecting a certain pixel point positioned on a vegetation canopy from the three-dimensional structure data of the target vegetation, taking the selected pixel point as a center, extracting data of all pixel points positioned in a preset size space, and generating vector data of the target vegetation.
In a second aspect, an embodiment of the present application provides a TomoSAR vegetation pest monitoring device based on coherence, including: a generating unit configured to obtain target image data based on the acquired synthetic aperture radar image data of the target vegetation; the processing unit is configured to process the target image data by adopting a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation; an analysis unit configured to perform coherence analysis on the three-dimensional structure data of the target vegetation and the sample vegetation data; a determination unit configured to determine a pest condition of the target vegetation according to the analysis result.
In some embodiments, the analyzing unit is further configured to select pixel point data located in a vegetation canopy in the three-dimensional structure data of the target vegetation, and determine a coherence coefficient of the selected pixel point data and the pixel point data located at the same position of the sample vegetation.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor; and a storage device having a computer program stored thereon; wherein the processor, when executing the computer program on the storage device, causes the electronic device to implement the coherence-based TomoSAR vegetation pest monitoring method as described in any of the embodiments of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable medium on which a computer program is stored, which when executed by a processor, implements the coherence-based TomoSAR vegetation pest monitoring method as described in any of the embodiments of the first aspect.
According to the TomoSAR vegetation pest and disease damage monitoring method and device based on coherence, firstly, target image data can be obtained based on the acquired synthetic aperture radar image data of the target vegetation. Then, the target image data is processed with a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation. Next, coherence analysis is performed on the three-dimensional structure data of the target vegetation and the sample vegetation data. Finally, the pest and disease damage condition of the target vegetation is determined according to the analysis result. By using synthetic aperture radar image data of the target vegetation, the method enables all-time, all-weather monitoring of vegetation. Moreover, through the processing of a multi-signal classification algorithm, high-precision measurement of the vegetation structure in the height direction can be achieved, which helps improve the accuracy of the vegetation pest monitoring result.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flow chart of one embodiment of the coherence-based TomoSAR vegetation pest monitoring method provided herein;
FIG. 3 is a schematic structural diagram of an embodiment of the coherence-based TomoSAR vegetation pest monitoring device provided herein.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which the coherency-based TomoSAR vegetation pest monitoring method or apparatus of embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include a terminal 101, a network 102, a server 103, and a synthetic aperture radar 104. Network 102 may be the medium used to provide a communication link between terminal 101 and server 103. Network 102 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may use the terminal 101 to interact with the server 103 via the network 102 to receive or send messages or the like. For example, the user may send a vegetation monitoring instruction to the server 103 through the terminal 101. Various client applications, such as vegetation hazard monitoring applications, image players, browsers, instant messaging tools, and the like, may be installed on the terminal 101. Vegetation herein may include, but is not limited to, tree forests, bushes, grasslands, and the like. Here, the disaster may include, but is not limited to, a pest disaster, a natural weather disaster (e.g., fire, freezing disaster), a man-made logging disaster, and the like.
Here, the terminal 101 may be hardware or software. When the terminal 101 is hardware, it may be various electronic devices with a display screen, including but not limited to a smart phone, a tablet computer, a desktop computer, and the like. When the terminal 101 is software, it can be installed in the electronic devices listed above. It may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. And is not particularly limited herein.
The server 103 may be a server that provides various services, and may be, for example, a background server that provides support for applications installed by the terminal 101. When receiving the monitoring instruction sent by the terminal 101, the background server may obtain image data of the target vegetation through the synthetic aperture radar 104. The data may then be analyzed and the results of the analysis (e.g., pest and disease conditions of the target vegetation) may be transmitted to the terminal 101.
Here, the server 103 may be hardware or software. When the server 103 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server 103 is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should be noted that the TomoSAR vegetation pest monitoring method based on coherence provided in the embodiment of the present application may be generally executed by the server 103 (or the terminal 101). Accordingly, the TomoSAR vegetation pest monitoring device based on coherence may also be generally disposed in the server 103 (or the terminal 101).
It should be understood that the number of terminals, networks, servers and synthetic aperture radars in fig. 1 is merely illustrative. There may be any number of terminals, networks, servers, and synthetic aperture radars, as desired for implementation.
Please refer to fig. 2, which shows a flow 200 of an embodiment of the coherency-based TomoSAR vegetation pest monitoring method provided herein. The method may comprise the steps of:
and step 201, obtaining target image data based on the obtained synthetic aperture radar image data of the target vegetation.
In this embodiment, an executing body (e.g., the server 103 shown in fig. 1) of the coherency-based TomoSAR vegetation pest monitoring method may acquire Synthetic Aperture Radar (SAR) image data of the target vegetation in various ways. For example, the execution body may receive the synthetic aperture radar image data of the target vegetation transmitted by the user using the terminal (e.g., the terminal 101 shown in fig. 1) through a wired connection manner or a wireless connection manner. As another example, the executing entity may obtain the synthetic aperture radar image data of the target vegetation from an online resource (e.g., cloud) or a database. As another example, the executing entity may actually observe the target vegetation through a synthetic aperture radar (e.g., synthetic aperture radar 104 shown in fig. 1) to obtain image data thereof.
It will be appreciated that when a synthetic aperture radar is used for observation, data are typically obtained in multiple polarization channels. In that case, a polarimetric SAR image classification algorithm may be employed to obtain an image of the target vegetation's position.
Specifically, the full polarization information contained in each pixel of a polarimetric SAR image can generally be expressed as a 3 × 3 polarimetric coherence matrix T:

    T = | T11   T12   T13 |
        | T12*  T22   T23 |
        | T13*  T23*  T33 |

where Tij denotes each element of the matrix T and the superscript * denotes the complex conjugate of the corresponding element.
In order to reduce the influence of random fluctuations of the scattering echo from complex ground objects, the matrix T is de-oriented to obtain a new coherence matrix T′:

    T′ = Q T Q^H

where

    Q = | 1    0        0      |
        | 0    cos 2θ   sin 2θ |
        | 0   -sin 2θ   cos 2θ |

is the orientation-angle rotation matrix, the superscript H denotes the conjugate transpose, and θ is the target rotation angle, with range (-π/2, π/2].
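For illustration, the following is a minimal numpy sketch of this de-orientation step, assuming the form of Q shown above; the function name is ours, not the patent's:

```python
import numpy as np

def deorient(T, theta):
    """De-orient a 3x3 polarimetric coherence matrix: T' = Q T Q^H.

    T: complex 3x3 coherence matrix; theta: target rotation angle in radians.
    """
    c2, s2 = np.cos(2 * theta), np.sin(2 * theta)
    Q = np.array([[1, 0, 0],
                  [0, c2, s2],
                  [0, -s2, c2]], dtype=complex)
    return Q @ T @ Q.conj().T
```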
The de-oriented matrix is decomposed into three components, in the following form:

    T′ = Ps·Ts + Pd·Td + Pv·Tv

where T′ij denotes each element of the de-oriented matrix; Ps, Pd and Pv are the power values of the surface-scattering, even-order scattering and volume-scattering components of a given pixel, computed from the elements T′ij; and Ts, Td and Tv are the polarimetric coherence matrix models corresponding to the three basic scattering mechanisms.
In order to suppress the influence of speckle noise on the experimental results, the original polarimetric SAR data are filtered.
Using the formula T′ = Ps·Ts + Pd·Td + Pv·Tv, a three-component decomposition of the coherence matrix is performed to calculate, for each pixel, the power values Ps, Pd and Pv of the surface-scattering, even-order scattering and volume-scattering components, together with the total power value Span:

    Span = Ps + Pd + Pv
The dominant scattering mechanism of each pixel is determined from the magnitudes of Ps, Pd and Pv, i.e. the scattering component corresponding to Pmax = max(Ps, Pd, Pv). The values Ps, Pd and Pv of each pixel form a vector P = [Ps, Pd, Pv] = [P1, P2, P3]. The initial cluster partition is as follows:

    CP = i,  if Pmax = Pi and Pi/Span > Th (i = 1, 2, 3);    CP = 4, otherwise

where CP = 1, 2, 3, 4 respectively denote the surface-scattering, even-order scattering, volume-scattering and mixed-scattering types, and Th is an empirical percentage threshold: the higher its value, the purer the pixels assigned to the three scattering-mechanism classes.
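A sketch of this initial partition (numpy assumed; the default threshold Th = 0.5 is illustrative, the patent does not fix its value):

```python
import numpy as np

def initial_cluster(Ps, Pd, Pv, Th=0.5):
    """Assign each pixel an initial cluster label CP in {1, 2, 3, 4}.

    Labels 1/2/3 mark pixels dominated by surface, even-order or volume
    scattering (dominant power fraction above Th); label 4 marks mixed pixels.
    """
    P = np.stack([np.asarray(Ps), np.asarray(Pd), np.asarray(Pv)], axis=-1)
    span = P.sum(axis=-1)             # Span = Ps + Pd + Pv
    frac = P.max(axis=-1) / span      # Pmax / Span
    return np.where(frac > Th, P.argmax(axis=-1) + 1, 4)
```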
Within each of the first three scattering types, pixels are sorted by the magnitude of their dominant-mechanism power value and divided into 30 subclasses of roughly equal size. The similarity between each pair of classes is then measured with the Wishart distance and classes are merged until pre-specified class counts N1, N2 and N3 (each smaller than 30) are reached. Merging rule: the two subclasses of the same scattering type with the shortest distance are merged, where the inter-class distance is the Wishart distance:

    Dij = (1/2) { ln(|Vi|) + ln(|Vj|) + Tr(Vi⁻¹Vj + Vj⁻¹Vi) }

where Vi and Vj are the average coherence matrices of the i-th and j-th classes, and Tr denotes the matrix trace.
The average coherence matrix of each cluster is then computed and taken as the class center, and the Wishart classifier is applied iteratively within each of the four initial clusters according to the distance from each pixel point to each class center. To obtain a stable convergence effect, two to four iterations of the Wishart classifier are usually applied.
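A sketch of the Wishart distance used by the merging rule and the classifier (numpy assumed; the function name is ours):

```python
import numpy as np

def wishart_distance(Vi, Vj):
    """Symmetric Wishart distance between two class-average coherence matrices:

    D_ij = 1/2 { ln|Vi| + ln|Vj| + Tr(Vi^-1 Vj + Vj^-1 Vi) }.
    """
    _, logdet_i = np.linalg.slogdet(Vi)   # log-determinant of Vi
    _, logdet_j = np.linalg.slogdet(Vj)   # log-determinant of Vj
    tr = np.trace(np.linalg.solve(Vi, Vj) + np.linalg.solve(Vj, Vi)).real
    return 0.5 * (logdet_i + logdet_j + tr)
```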
In addition, in order to represent the various ground features more clearly, different colors can be used for different feature types according to the actual situation; for example, blue can represent surface scattering (such as ocean and bare ground), red can represent even-order scattering (such as urban areas), and green can represent volume scattering (such as forest vegetation).
In this embodiment, the target vegetation may be any vegetation that needs to be monitored, such as a forest that needs to be monitored for pest conditions. The geographic location, footprint, vegetation type, etc. are not limited in this application. Here, the execution subject may derive the target image data based on the acquired synthetic aperture radar image data of the target vegetation. For example, the executing entity may pre-process the acquired synthetic aperture radar image data of the target vegetation to obtain target image data. The target image data may be image data required for a subsequent processing procedure. Whereas the pre-processing procedure is typically a correlation process performed to obtain the desired target image data. Here, the preprocessing method and the target image data may be set according to actual needs of the user.
As an example, the target image data may be image data of a particular region of the target vegetation (e.g., a tree canopy or tree branch and leaf region). At this time, the executing entity may screen the acquired synthetic aperture radar image data of the target vegetation, so as to obtain synthetic aperture radar image data including the specific area image. Furthermore, in order to improve the efficiency of the subsequent processing, the execution subject may cut out the image data that has been screened out, thereby removing unnecessary image data from the original image data and obtaining image data that includes only the image of the feature region. In some application scenes, the execution main body can also perform cloud and fog removal processing and the like on image data with poor definition, so that the influence of weather factors is reduced.
It should be noted that, in order to obtain structural information about the target vegetation in the height direction, synthetic aperture radar image data of the target vegetation must be acquired at different monitoring angles, in particular at different monitoring heights, that is, image data obtained by monitoring the target vegetation with the synthetic aperture radar at different heights. The executing body may therefore screen the acquired synthetic aperture radar image data of the target vegetation to obtain multiple pieces of image data (i.e. at different monitoring heights).
It is to be understood that the manner in which image data is acquired at different monitored elevations is not limited in this application. For example, the target vegetation may be monitored by using a plurality of synthetic aperture radars located at different height planes. For another example, in order to simplify the method, the target vegetation may be monitored on different height planes (e.g., parallel tracks with different heights) by using the same synthetic aperture radar. Or the target vegetation can be monitored by adopting a synthetic aperture radar provided with a plurality of antennas with different heights.
In some optional implementations, the execution subject may also perform reference correction, phase compensation, and the like on the image data at these different monitoring heights. Therefore, subsequent data processing can be conveniently carried out, and the processing efficiency is improved. As an example, the execution subject may perform processing such as correction, phase deviation compensation, and the like on image data at different monitoring heights according to a reference set by a human.
Alternatively, the executing entity may take one piece of image data among the multiple pieces as the main image data, i.e. the reference image data, and perform reference correction, phase compensation and other processing on the remaining image data (i.e. the image data other than the main image data) to obtain the target image data, i.e. the data used for subsequent tomographic SAR imaging. The specific steps are as follows:
After the synthetic aperture radar system receives the signals, a two-dimensional backscattering complex image can be formed through imaging processing. Here, x denotes the azimuth direction, r the range direction, and s the height direction. The azimuth resolution is ρx = (λ·r)/(2·Δx) and the range resolution is ρr = c/(2·B_W), where λ is the wavelength, Δx is the synthetic aperture length in azimuth, c is the propagation velocity of the wave, and B_W is the SAR system bandwidth. For a single pixel u(x′, r′) at range r′ and zero-Doppler position x′, the complex signal is expressed as:

    u(x′, r′) = ∫∫∫ γ(x, r, s) · f(x′ − x, r′ − r) · exp(−j4π·R(x, r, s)/λ) dx dr ds

where γ(x, r, s) is the reflectivity function of the three-dimensional scene; R(x, r, s) is the direct distance from the ground target to the sensor; and f(x′ − x, r′ − r) is the point spread function formed by the combined effect of the antenna pattern and the weighting applied during imaging, which, when weighting is not considered, generally takes the form

    f(x′ − x, r′ − r) = sinc((x′ − x)/ρx) · sinc((r′ − r)/ρr)
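As a quick numeric check of the two resolution formulas (the system parameters below are assumed for illustration and do not come from the patent):

```python
# rho_x = (lambda * r) / (2 * delta_x); rho_r = c / (2 * B_W)
wavelength = 0.24   # L-band wavelength, m (assumed)
r = 5000.0          # slant range, m (assumed)
delta_x = 200.0     # synthetic aperture length in azimuth, m (assumed)
bw = 100e6          # system bandwidth, Hz (assumed)
c = 3e8             # propagation velocity, m/s
rho_x = wavelength * r / (2 * delta_x)  # azimuth resolution: 3.0 m
rho_r = c / (2 * bw)                    # slant-range resolution: 1.5 m
```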
A single-platform SAR imaging system observes the same area (such as the target vegetation) M times on parallel tracks at different heights, so that M complex SAR images can be obtained. The M/2-th image may then be selected as the main image and the others as auxiliary images, after which all data are preprocessed by registration, phase correction and the like. The m-th acquired SAR complex image, m = 1, …, M, takes the same form as u(x′, r′) above, with the distance term replaced by Rm(s), the distance from the target at height s to the m-th sensor position, determined by the horizontal baseline b//m and the vertical baseline b⊥m of the m-th track relative to the main track.
For convenience, assuming the point spread function is a two-dimensional Dirac delta function, an M-dimensional vector can be obtained for a given pixel point (x′, r′):

    g = (g1, g2, …, gM)^T

where each element can be represented as:

    gm = ∫Δs γ(s) · exp(−j4π·Rm(s)/λ) ds

where Δs denotes the effective observation range in the height direction and Rm(s) = Rm(s, r′ = r, x′ = x).
Since the phase in the above equation contains a quadratic phase deviation related to the baseline, the received signal must be multiplied by the corresponding complex conjugate quadratic phase function to compensate this deviation; that is, the two-dimensional SAR image data must be deramped (de-skewed). After deramping, and after incorporating the residual phase term into the reflectivity function γ(s), one obtains:

    gm = ∫Δs γ(s) · exp(−j2π·ξm·s) ds

where

    ξm = 2·b⊥m / (λ·r)

is the spatial (height) frequency.
In practical applications, if the phase characteristics of the reflectivity function γ(s) need to be considered, γ(s) can be multiplied by the corresponding complex conjugate quadratic phase function to remove the phase offset, thereby preserving the phase information of γ(s).
It should be noted that when additive noise ε is present, the discrete expression of the above formula is:

    g = R·γ + ε

where g = (g1, g2, …, gM)^T is a column vector with M elements; R is an M × N steering matrix with elements

    Rm,n = exp(−j2π·ξm·sn)

whose n-th column

    r(sn) = [exp(−j2π·ξ1·sn), …, exp(−j2π·ξM·sn)]^T

is the steering vector; and γ is the N-dimensional discretized reflectivity vector with elements γn = γ(sn), where sn (n = 1, …, N) denotes the discretized height positions.
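As a minimal sketch of this discrete model (numpy assumed; the baselines, wavelength, range and scatterer heights below are illustrative values of ours, not taken from the patent), one can build the steering matrix R and simulate g = R·γ + ε as follows:

```python
import numpy as np

def steering_matrix(b_perp, heights, wavelength, r):
    """M x N steering matrix with elements R[m, n] = exp(-j 2 pi xi_m s_n),

    where xi_m = 2 * b_perp[m] / (wavelength * r) is the height spatial frequency.
    """
    xi = 2.0 * np.asarray(b_perp) / (wavelength * r)    # (M,) spatial frequencies
    return np.exp(-2j * np.pi * np.outer(xi, heights))  # (M, N)

# Simulate the observation vector for two scatterers at 5 m and 18 m (assumed).
rng = np.random.default_rng(0)
b_perp = np.linspace(-60.0, 60.0, 9)      # 9 parallel tracks (illustrative)
s = np.linspace(0.0, 30.0, 61)            # discretized height positions s_n
R = steering_matrix(b_perp, s, wavelength=0.24, r=5000.0)
gamma = np.zeros(len(s), dtype=complex)
gamma[np.searchsorted(s, 5.0)] = 1.0      # scatterer at 5 m
gamma[np.searchsorted(s, 18.0)] = 0.7     # scatterer at 18 m
noise = 0.01 * (rng.standard_normal(9) + 1j * rng.standard_normal(9))
g = R @ gamma + noise                     # discrete model g = R gamma + epsilon
```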
Step 202, processing the target image data by adopting a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation.
In this embodiment, the executing subject may adopt a multi-signal classification algorithm (MUSIC algorithm) to perform tomography processing on the target image data obtained in step 201, so as to obtain three-dimensional structure data of the target vegetation. As an example, the execution subject may construct a covariance matrix of the target image data, and perform feature decomposition on the constructed covariance matrix, thereby obtaining a signal subspace matrix and a noise subspace matrix; and then, according to the signal subspace matrix and the noise subspace matrix, a spatial spectrum function can be constructed, and spectrum peak searching is carried out to obtain structural information of the target vegetation in the height direction. The method comprises the following specific steps:
First, from the M observed sample values, the K-th order autocorrelation function can be determined as

    r̂(k) = (1/M) · Σn x(n)·x*(n − k)

and a sample covariance matrix R̂ is constructed from these autocorrelation values. Here, n is a variable with value range [1, M]; x*(n) denotes the complex conjugate of x(n); and x(n) denotes the received signal,

    x(n) = Σj Aj·exp(j·ωj·n) + v(n)

where Aj denotes the scattering coefficient of the j-th point target; ωj denotes the frequency corresponding to the height of each scattering target point; and v(n) denotes zero-mean noise.
The sample covariance matrix R̂ is then eigendecomposed, and the number of signal sources K̂ can be estimated using a maximum-likelihood criterion. The eigenvector space can then be represented by the matrices S and G. The signal subspace S is composed of the eigenvectors of R̂ corresponding to the K̂ largest eigenvalues (i.e., those corresponding to the signals), while the noise subspace G is composed of the eigenvectors corresponding to the smallest eigenvalues. The spatial spectrum function (i.e., the MUSIC spectrum search function) can thus be expressed as:

    P_MUSIC(ω) = 1 / ( β^H(ω) · G · G^H · β(ω) )

where β^H(ω) denotes the conjugate transpose of β(ω), and β(ω) = [1, e^(−iω), …, e^(−i(M−1)ω)]^T.
Thus, when β (ω) is at [0,2 π]When the global search is carried out in the range, the determination is carried out
Figure BDA00021434355500001015
The value of (a) enables a determination of the pseudo-power spectrum of the signal. The maximum value of the spectrum function is obtained in the space spectrum domain, and the angle corresponding to the spectrum peak is the estimated value of the incoming wave direction angle. That is, the position of the spectral peak is the value in the height direction, so that the structural information of the target vegetation in the height direction can be obtained.
It can be understood that the MUSIC algorithm uses the orthogonality between these two complementary subspaces to estimate the direction of spatial signals. All vectors of the noise subspace can be used to construct the spectrum, and the peak positions in the spatial spectrum correspond to the directions of arrival of the signals. The algorithm greatly improves direction-finding resolution and is applicable to antenna arrays of arbitrary shape.
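For illustration only, the following is a minimal numpy sketch of this spectrum estimation, under the assumption that several looks (e.g., neighboring pixels) are available to form the sample covariance matrix; the function name, look count and grid size are ours, not the patent's:

```python
import numpy as np

def music_spectrum(x_samples, K, n_omega=512):
    """MUSIC pseudo-spectrum from L snapshots of an M-element observation vector.

    x_samples: (M, L) complex array; K: assumed number of scatterers.
    Returns the omega grid on [0, 2*pi) and the pseudo-spectrum values;
    the spectral peaks give the estimated height frequencies.
    """
    M, L = x_samples.shape
    R_hat = (x_samples @ x_samples.conj().T) / L        # sample covariance matrix
    eigval, eigvec = np.linalg.eigh(R_hat)              # eigenvalues ascending
    G = eigvec[:, :M - K]                               # noise subspace (M-K smallest)
    omega = np.linspace(0.0, 2 * np.pi, n_omega, endpoint=False)
    beta = np.exp(-1j * np.outer(np.arange(M), omega))  # search vectors beta(omega)
    denom = np.sum(np.abs(G.conj().T @ beta) ** 2, axis=0)  # beta^H G G^H beta
    return omega, 1.0 / denom
```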
And 203, performing coherence analysis on the three-dimensional structure data of the target vegetation and the sample vegetation data.
In this embodiment, the executive body may perform coherence analysis on the three-dimensional structure data of the target vegetation obtained in step 202 and the sample vegetation data. Wherein the sample vegetation may generally be normal (i.e., not exposed to a disaster) vegetation. For example, the sample vegetation may generally be vegetation of the same or similar species as the target vegetation, and/or vegetation of similar geographic location as the target vegetation. And the sample vegetation data can be set according to actual conditions. For example, the sample vegetation data may be image data of the whole vegetation or image data of a specific area of vegetation. As another example, the sample vegetation data may be image data of the target vegetation for a particular period of time (e.g., mid-may, and no pest and disease condition). Here, the specific manner of coherence analysis is not limited.
In order to monitor disaster (for example, pest) conditions of target vegetation, it is often necessary to monitor branch and leaf regions of vegetation with emphasis. Therefore, in some embodiments, in order to improve the monitoring efficiency and the accuracy of the monitoring result, the executive body may select pixel point data located in a vegetation canopy from the three-dimensional structure data of the target vegetation. And may determine the coherence coefficient of the selected pixel point data and the pixel point data at the same location of the sample vegetation. The same position can refer to a vegetation canopy, and also can refer to the position of the selected pixel point on the vegetation canopy.
Here, the manner of selecting the pixel point data of the vegetation canopy is not limited in this application; the selection may be manual or based on image recognition. As an example, first, a pixel point located on the vegetation canopy can be selected from the three-dimensional structure data of the target vegetation; then, taking the selected pixel point as the center, the data of all pixel points within a preset-size neighborhood (such as 3 × 3 × 3) are extracted; finally, vector data of the target vegetation can be generated from the extracted pixel point data. For example, the 27 pixel values extracted as described above may be arranged in a fixed order to form the vector data X. The vector data Y of the sample vegetation can be obtained in the same way. Coherence detection is performed on the vector data X and Y to obtain a coherence coefficient ρ:

    ρ = |E[X·Y*]| / sqrt( E[|X|²] · E[|Y|²] )

where Y* denotes the complex conjugate of Y and E denotes the mathematical expectation.
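A minimal sketch of this coherence computation (numpy assumed), replacing the expectations with sample means over the extracted patch vectors X and Y:

```python
import numpy as np

def coherence_coefficient(X, Y):
    """rho = |E[X Y*]| / sqrt(E[|X|^2] E[|Y|^2]), estimated with sample means."""
    X, Y = np.asarray(X).ravel(), np.asarray(Y).ravel()
    num = np.abs(np.mean(X * np.conj(Y)))
    den = np.sqrt(np.mean(np.abs(X) ** 2) * np.mean(np.abs(Y) ** 2))
    return num / den
```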
And 204, determining the pest and disease damage condition of the target vegetation according to the analysis result.
In this embodiment, the executing body may determine the pest condition of the target vegetation from the analysis result of step 203. As an example, the executing body may determine the pest condition according to which preset value range the determined coherence coefficient falls in. For example, if the coherence coefficient ρ lies in [a1, 1), the branches and leaves of the target vegetation show slight shedding, with 0-30% leaf loss. If ρ lies in [a2, a1), the branches and leaves show serious shedding, with a leaf loss rate of about 30-50%. If ρ lies in [a3, a2), the shedding is severe, with a leaf loss rate of about 50-80%, and the target vegetation is seriously damaged by pests. If ρ lies in [0, a3), the shedding is very severe, the vegetation is seriously damaged by pests, and the leaf loss rate approaches 80-100%. For deciduous forest vegetation, typically a1 = 0.98, a2 = 0.95 and a3 = 0.85.
It will be appreciated that in order to improve the accuracy of the monitoring results, multiple sets of pixel point data are typically selected for analysis. At this time, the coherence coefficient may be calculated according to an average value of the selected groups of pixel point data and an average value of groups of pixel point data located at the same position in the sample vegetation; or the calculation can be performed according to the selected pixel point data of each group and the pixel point data of the corresponding group in the sample vegetation.
Optionally, the executing body may further determine the pest condition of the target vegetation according to the proportions of coherence coefficients falling in different value ranges. For example, if the number of coherence coefficients in [a1, 1) reaches 30% of the total, and/or the number of coherence coefficients in [a3, a2) and [0, a3) together reaches 50% of the total, it can be concluded that the branches and leaves of the target vegetation show serious shedding and that a serious pest and disease disaster is present. For deciduous forest vegetation, typically a1 = 0.98, a2 = 0.95 and a3 = 0.85.
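A sketch of the threshold mapping described above (the grade descriptions and the default a1, a2, a3 follow the text for deciduous forest vegetation; the function name is ours):

```python
def pest_severity(rho, a1=0.98, a2=0.95, a3=0.85):
    """Map a coherence coefficient to the defoliation grade described above."""
    if rho >= a1:
        return "slight shedding, 0-30% leaf loss"
    if rho >= a2:
        return "serious shedding, about 30-50% leaf loss"
    if rho >= a3:
        return "severe shedding, about 50-80% leaf loss, serious pest damage"
    return "very severe shedding, 80-100% leaf loss, serious pest damage"
```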
In the method for monitoring the TomoSAR vegetation pest and disease damage based on coherence, first, target image data can be obtained based on the acquired synthetic aperture radar image data of the target vegetation. And then, processing the target image data by adopting a multi-signal classification algorithm to obtain the three-dimensional structure data of the target vegetation. And then, carrying out coherence analysis on the three-dimensional structure data of the target vegetation and the sample vegetation data. And finally, determining the pest and disease damage condition of the target vegetation according to the analysis result. The method can realize all-time and all-weather monitoring of the vegetation by utilizing the synthetic aperture radar image data of the target vegetation. And through the processing of a multi-signal classification algorithm, the high-precision measurement of the height-oriented vegetation structure can be realized. Thus being beneficial to improving the accuracy of the vegetation pest monitoring result.
With further reference to fig. 3, as an implementation of the method shown in the above embodiments, the present application also provides an embodiment of a TomoSAR vegetation pest monitoring device based on coherence. This device embodiment corresponds to the method embodiment shown in the various embodiments described above. The device can be applied to various electronic equipment.
As shown in fig. 3, the monitoring device 300 of the present embodiment may include: a generating unit 301 configured to obtain target image data based on the acquired synthetic aperture radar image data of the target vegetation; a processing unit 302 configured to process target image data by using a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation; an analysis unit 303 configured to perform coherence analysis on the three-dimensional structure data of the target vegetation and the sample vegetation data; a determination unit 304 configured to determine a pest condition of the target vegetation from the analysis result.
In some embodiments, the generation unit 301 may be further configured to acquire synthetic aperture radar image data of target vegetation at different monitoring heights, and perform reference correction and phase compensation on the acquired multiple pieces of image data to obtain target image data.
Optionally, the generating unit 301 may be further configured to monitor the target vegetation on different height surfaces by using the same synthetic aperture radar, so as to obtain a plurality of pieces of image data; one of the plurality of pieces of image data is used as main image data, and reference correction and phase compensation processing are performed on the remaining image data.
In some embodiments, the processing unit 302 may be further configured to construct a covariance matrix of the target image data, and perform eigen decomposition on the constructed covariance matrix to obtain a signal subspace matrix and a noise subspace matrix; and constructing a spatial spectrum function according to the signal subspace matrix and the noise subspace matrix, and searching a spectrum peak to obtain structural information of the target vegetation in the height direction.
Optionally, the analyzing unit 303 may be further configured to select, from the three-dimensional structure data of the target vegetation, pixel point data located in a vegetation canopy and determine a coherence coefficient of the selected pixel point data and the pixel point data located at the same position of the sample vegetation.
Further, the analysis unit 303 may be further configured to select a certain pixel point located on a vegetation canopy in the three-dimensional structure data of the target vegetation, extract data of all pixel points located in a preset size space with the selected pixel point as a center, and generate vector data of the target vegetation.
It will be understood that the units described in the apparatus 300 correspond to the various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 300 and the units included therein, and are not described in detail herein.
It is to be noted that the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be located in the processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves. For example, the generation unit may also be described as a "unit that derives target image data based on acquired synthetic aperture radar image data of target vegetation".
As another aspect, the present application also provides a computer-readable medium. The computer readable medium herein may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer-readable medium may be included in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The above-mentioned computer-readable medium carries a computer program which, when executed by the electronic device, enables the electronic device to implement the coherence-based TomoSAR vegetation pest monitoring method as described in any of the above-mentioned embodiments.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. A TomoSAR vegetation pest and disease damage monitoring method based on coherence comprises the following steps:
obtaining target image data based on the obtained synthetic aperture radar image data of the target vegetation;
processing target image data by adopting a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation;
performing coherence analysis on the three-dimensional structure data of the target vegetation and the sample vegetation data;
determining the pest and disease damage condition of the target vegetation according to the analysis result;
wherein the obtaining of target image data based on the obtained synthetic aperture radar image data of the target vegetation comprises:
performing a three-component decomposition of the coherence matrix T′ using the formula T′ = Ps·Ts + Pd·Td + Pv·Tv to calculate the power values Ps, Pd and Pv of the surface-scattering, even-order scattering and volume-scattering components of each pixel, together with the total power value Span,

    Span = Ps + Pd + Pv;

determining the dominant scattering mechanism of each pixel from the magnitudes of Ps, Pd and Pv, i.e. the scattering component corresponding to Pmax = max(Ps, Pd, Pv), the values Ps, Pd and Pv of each pixel forming a vector P = [Ps, Pd, Pv] = [P1, P2, P3], with the initial clustering partitioned as follows:

    CP = i, if Pmax = Pi and Pi/Span > Th (i = 1, 2, 3);    CP = 4, otherwise

where CP = 1, 2, 3, 4 respectively denote the surface-scattering, even-order scattering, volume-scattering and mixed-scattering types, and Th is an empirical percentage threshold.
2. The method of claim 1, the deriving target image data based on the acquired synthetic aperture radar image data of the target vegetation comprising:
and acquiring synthetic aperture radar image data of target vegetation at different monitoring heights, and performing reference correction and phase compensation on the acquired multiple image data to obtain target image data.
3. The method of claim 2, wherein the acquiring the synthetic aperture radar image data of the target vegetation at different monitoring heights, and performing reference correction and phase compensation processing on the acquired image data comprises:
monitoring target vegetation on different height surfaces by using the same synthetic aperture radar to obtain a plurality of pieces of image data; one of the plurality of pieces of image data is used as main image data, and reference correction and phase compensation processing are performed on the remaining image data.
4. The method of claim 1, wherein processing the target image data using a multi-signal classification algorithm to obtain three-dimensional structural data of the target vegetation comprises:
constructing a covariance matrix of target image data, and performing characteristic decomposition on the constructed covariance matrix to obtain a signal subspace matrix and a noise subspace matrix;
and constructing a spatial spectrum function according to the signal subspace matrix and the noise subspace matrix, and searching a spectrum peak to obtain structural information of the target vegetation in the height direction.
5. The method of any one of claims 1-4, wherein the performing the coherence analysis of the three-dimensional structure data of the target vegetation with the sample vegetation data comprises:
and selecting pixel point data positioned on a vegetation canopy from the three-dimensional structure data of the target vegetation, and determining the coherence coefficient of the selected pixel point data and the pixel point data positioned at the same position of the sample vegetation.
6. The method of claim 5, wherein selecting pixel data located in a vegetation canopy from the three-dimensional structure data of the target vegetation comprises:
and selecting a certain pixel point positioned on a vegetation canopy from the three-dimensional structure data of the target vegetation, taking the selected pixel point as a center, extracting data of all pixel points positioned in a preset size space, and generating vector data of the target vegetation.
7. A TomoSAR vegetation pest monitoring device based on coherence includes:
the generating unit is configured to obtain target image data based on the acquired synthetic aperture radar image data of the target vegetation, and specifically includes:
performing a three-component decomposition of the coherence matrix T′ using the formula T′ = Ps·Ts + Pd·Td + Pv·Tv to calculate the power values Ps, Pd and Pv of the surface-scattering, even-order scattering and volume-scattering components of each pixel, together with the total power value Span,

    Span = Ps + Pd + Pv;

determining the dominant scattering mechanism of each pixel from the magnitudes of Ps, Pd and Pv, i.e. the scattering component corresponding to Pmax = max(Ps, Pd, Pv), the values Ps, Pd and Pv of each pixel forming a vector P = [Ps, Pd, Pv] = [P1, P2, P3], with the initial clustering partitioned as follows:

    CP = i, if Pmax = Pi and Pi/Span > Th (i = 1, 2, 3);    CP = 4, otherwise

where CP = 1, 2, 3, 4 respectively denote the surface-scattering, even-order scattering, volume-scattering and mixed-scattering types, and Th is an empirical percentage threshold;
the processing unit is configured to process target image data by adopting a multi-signal classification algorithm to obtain three-dimensional structure data of the target vegetation;
an analysis unit configured to perform coherence analysis on the three-dimensional structure data of the target vegetation and sample vegetation data;
a determination unit configured to determine a pest condition of the target vegetation according to the analysis result.
8. The apparatus of claim 7, the analysis unit further configured to select pixel point data located at a vegetation canopy in the three-dimensional structure data of the target vegetation and determine a coherence coefficient of the selected pixel point data and pixel point data located at a same location of a sample vegetation.
9. An electronic device, comprising:
a processor;
a storage device having a computer program stored thereon;
the computer program on the storage device, when executed by the processor, causes an electronic device to implement the coherence based TomoSAR vegetation pest monitoring method of one of claims 1-6.
10. A computer readable medium having stored thereon a computer program which, when executed by a processor, implements the coherence based TomoSAR vegetation pest monitoring method of one of claims 1-6.
CN201910676471.7A 2019-07-25 2019-07-25 TomoSAR vegetation pest and disease damage monitoring method and device based on coherence Active CN110399832B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910676471.7A CN110399832B (en) TomoSAR vegetation pest and disease damage monitoring method and device based on coherence


Publications (2)

Publication Number Publication Date
CN110399832A CN110399832A (en) 2019-11-01
CN110399832B true CN110399832B (en) 2021-08-13

Family

ID=68325098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910676471.7A Active CN110399832B (en) 2019-07-25 2019-07-25 TomosAR vegetation pest and disease damage monitoring method and device based on coherence

Country Status (1)

Country Link
CN (1) CN110399832B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115166741B (en) * 2022-09-08 2022-11-29 中国科学院空天信息创新研究院 Simplified model-based dual-phase central polarization chromatography decomposition method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102221697A (en) * 2011-03-25 2011-10-19 电子科技大学 Airborne multi-antenna SAR chromatography three dimensional imaging system and imaging method thereof
CN103969645A (en) * 2014-05-14 2014-08-06 中国科学院电子学研究所 Method for measuring tree heights by tomography synthetic aperture radar (SAR) based on compression multi-signal classification (CS-MUSIC)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108960295B (en) * 2018-06-13 2022-08-26 中国科学院空天信息创新研究院 Multi-temporal fully-polarized SAR image feature extraction method and classification method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102221697A (en) * 2011-03-25 2011-10-19 电子科技大学 Airborne multi-antenna SAR chromatography three dimensional imaging system and imaging method thereof
CN103969645A (en) * 2014-05-14 2014-08-06 中国科学院电子学研究所 Method for measuring tree heights by tomography synthetic aperture radar (SAR) based on compression multi-signal classification (CS-MUSIC)

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Xing Peng et al.; Three-Dimensional Structure Inversion of Buildings with Nonparametric Iterative Adaptive Approach Using; Remote Sensing; 2018-06-25; pp. 1-2 *
谭维贤 et al.; A polarimetric SAR image classification method combining scattering similarity and Wishart (一种结合散射相似性和Wishart的极化SAR图像分类方法); Journal of Signal Processing (信号处理); 2019-05-31; vol. 35, no. 5; pp. 1-2 *
薛娟 et al.; Monitoring the damage degree of Tomicus yunnanensis in Yunnan pine using multi-temporal Sentinel-1 InSAR images (基于Sentinel-1多时相InSAR影像的云南松切梢小蠹危害程度监测); Remote Sensing for Land & Resources (国土资源遥感); 2018-12-31; vol. 30, no. 4; pp. 108-114 *
陈钦; Research on multi-baseline tomographic SAR imaging methods (多基线层析SAR成像方法研究); China Master's Theses Full-text Database, Information Science and Technology; 2011-07-15; no. 07; pp. I136-775 *

Also Published As

Publication number Publication date
CN110399832A (en) 2019-11-01

Similar Documents

Publication Publication Date Title
Neumann et al. Estimation of forest structure, ground, and canopy layer characteristics from multibaseline polarimetric interferometric SAR data
Chen et al. PolInSAR complex coherence estimation based on covariance matrix similarity test
Ballester-Berman et al. Retrieval of biophysical parameters of agricultural crops using polarimetric SAR interferometry
Hariharan et al. A novel phenology based feature subset selection technique using random forest for multitemporal PolSAR crop classification
Sauer et al. Three-dimensional imaging and scattering mechanism estimation over urban scenes using dual-baseline polarimetric InSAR observations at L-band
Jin et al. Polarimetric scattering and SAR information retrieval
Jin et al. Assessing integration of intensity, polarimetric scattering, interferometric coherence and spatial texture metrics in PALSAR-derived land cover classification
Paloscia et al. The potential of C-and L-band SAR in estimating vegetation biomass: the ERS-1 and JERS-1 experiments
CN110378896B (en) TomosAR vegetation pest and disease damage monitoring method and device based on polarization coherence
Alonso-Gonzalez et al. Processing multidimensional SAR and hyperspectral images with binary partition tree
Melon et al. On the retrieving of forest stem volume from VHF SAR data: Observation and modeling
Aghababaee et al. Model-based target scattering decomposition of polarimetric SAR tomography
Dey et al. Novel clustering schemes for full and compact polarimetric SAR data: An application for rice phenology characterization
Biondi Multi-chromatic analysis polarimetric interferometric synthetic aperture radar (MCA-PolInSAR) for urban classification
Garestier et al. PolInSAR analysis of X-band data over vegetated and urban areas
Yuzugullu et al. Assessment of paddy rice height: Sequential inversion of coherent and incoherent models
CN110399832B (en) TomosAR vegetation pest and disease damage monitoring method and device based on coherence
Dingle Robertson et al. Monitoring crops using compact polarimetry and the RADARSAT constellation mission
Yang et al. A deep learning solution for height estimation on a forested area based on Pol-TomoSAR data
Lavalle Full and compact polarimetric radar interferometry for vegetation remote sensing
CN110133656A (en) A kind of sparse imaging method of three-dimensional S AR for decomposing with merging based on relatively prime array
Carcereri et al. A deep learning framework for the estimation of forest height from bistatic TanDEM-X data
CN110378894B (en) TomosAR vegetation pest and disease damage monitoring method and device based on correlation
Tebaldini et al. Forest structure from longer wavelength SARs
López-Dekker et al. BIOMASS end-to-end mission performance simulator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant