CN112287871B - Near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion - Google Patents

Near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion

Info

Publication number
CN112287871B
CN112287871B (application CN202011257631.3A)
Authority
CN
China
Prior art keywords
remote sensing
sensing image
feature
extraction
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011257631.3A
Other languages
Chinese (zh)
Other versions
CN112287871A (en)
Inventor
付东洋
钟雅枫
余果
黄浩恩
刘大召
徐华兵
罗亚飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Ocean University
Original Assignee
Guangdong Ocean University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Ocean University filed Critical Guangdong Ocean University
Priority to CN202011257631.3A priority Critical patent/CN112287871B/en
Publication of CN112287871A publication Critical patent/CN112287871A/en
Application granted granted Critical
Publication of CN112287871B publication Critical patent/CN112287871B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A40/81Aquaculture, e.g. of fish

Abstract

The invention discloses a remote sensing image extraction method for near-shore aquaculture areas based on the fusion of multiple features and spectra. An improved constrained energy minimization algorithm is used to enhance the target ground objects in the culture area and weaken the background spectral information; the Otsu method is then used to calculate a threshold, which is combined with a single-band threshold for a preliminary extraction of the target ground objects. Finally, according to the ground-object characteristics, interfering ground objects are removed from the preliminary result in a customized manner using the gray-level co-occurrence texture matrix or an object-oriented method, and the final extraction result for the culture area is output. Compared with traditional target detection methods, the method effectively overcomes interference from background ground objects even in culture areas where the 'different objects, same spectrum' phenomenon is obvious, obtains raft culture area extraction results of higher precision, and better meets the requirement for high-precision extraction of raft culture areas.

Description

Near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion
Technical Field
The invention belongs to the technical field of remote sensing technology and image processing, and particularly relates to a remote sensing image extraction method for an offshore aquaculture area based on multi-feature and spectrum fusion.
Background
Since the second industrial revolution, fishery resources have attracted growing national and regional attention as a supplement to terrestrial food resources. As an important component of fishery resources, aquaculture offers high autonomy and economic benefit, but its influence on the natural environment is also profound: with the continuous deposition of excrement and residual bait in the water, the ammonia nitrogen content of aquaculture water accumulates, eutrophication of the water is aggravated, and the natural environment of aquaculture areas deteriorates. How to cultivate reasonably and plan scientifically is therefore a central problem for the sustainable development of fishery resources; effectively grasping the spatial layout of aquaculture areas and improving their dynamic monitoring is a key link in the reasonable planning and scientific governance of fishery resources.
Compared with traditional monitoring techniques, remote sensing can observe surface objects macroscopically, continuously, and automatically through designed satellite orbits and revisit periods, overcoming the long cycles, time and labor costs, and strong human interference of traditional monitoring, and has become an important means for dynamic monitoring of aquaculture areas. Aquaculture extraction methods based on remote sensing mainly include visual interpretation, object-oriented methods, and pixel-level spectral-feature and texture-analysis methods; some scholars have obtained promising extraction results and application prospects in corresponding experiments, greatly advancing the technology. Visual interpretation is the most common, but its accuracy depends heavily on the interpreter's experience; it is subjective, labor-intensive, and time-consuming, and ill-suited to long-term dynamic monitoring of culture areas. Object-oriented extraction jointly considers the spatial, spectral, texture, and shape characteristics of classified objects, reducing the 'salt-and-pepper' noise that is hard to suppress in traditional pixel-based extraction, but the subjectivity of the segmentation scale and the 'different objects, same spectrum' problem of some pixels may lower the extraction accuracy of the culture area.
Pixel-based extraction can better exploit the spectral reflectance characteristics of aquaculture areas, highlight the difference between culture and non-culture areas, and extract culture areas automatically with a threshold. However, because of differences among sensors and their parameters, no single aquatic feature index applies to most satellite image data, and varying water quality within the same culture area differentiates its reflectance characteristics. This makes purely pixel-based extraction difficult to apply accurately on its own; combined with other methods, however, the spectral characteristics of the culture area can be better exploited and the accuracy of the extraction algorithm improved.
To address the shortcomings of existing algorithms, the invention provides a remote sensing image extraction method for near-shore aquaculture areas based on multi-feature and spectrum fusion, constructing a multi-feature analysis method that combines the spectral features of remote sensing image ground objects, a threshold method, and the gray-level co-occurrence texture matrix, thereby achieving accurate extraction of aquaculture areas.
Disclosure of Invention
The invention aims to provide a near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion, so as to solve the existing problems.
The technical solution for realizing the purpose of the invention is as follows:
the method for extracting the remote sensing image of the near-shore aquaculture area based on multi-feature and spectrum fusion is characterized by comprising the following steps of:
step 1: inputting an original remote sensing image;
step 2: carrying out image preprocessing on the input remote sensing image, extracting characteristic spectra with a feature index method according to the ground-object characteristics in the processed image, and constructing a target ground-object feature set from the obtained characteristic spectra;
step 3: constructing a finite impulse response linear filter, selecting target ground-object pixel spectral data for a constrained energy minimization algorithm based on a gradient-integral recurrent neural network, and enhancing the target ground-object spectral data with the finite impulse response linear filter to obtain an enhanced remote sensing image;
step 4: performing a preliminary extraction of the target ground object on the enhanced remote sensing image using an Otsu algorithm and a single-band threshold to obtain a preliminarily extracted remote sensing image of the target ground object;
step 5: based on the texture features and geometric features of the ground objects, removing interfering ground objects in the culture area from the preliminarily extracted remote sensing image in a customized manner by adopting an object-oriented method or the gray-level co-occurrence texture matrix;
step 6: outputting the final extracted remote sensing image of the culture area.
Further, the specific operation steps of step 3 include:
step 31: constructing a finite impulse response linear filter according to the prior information of the known target pixel spectrum;
step 32: the constrained energy minimization algorithm is expressed as a linearly constrained optimization mathematical model:

min_w w^T R w (1),

wherein w denotes the filter coefficient vector and R is the autocorrelation matrix of the remote sensing image:

R = (1/N) Σ_{i=1}^{N} r_i r_i^T (2),

d denotes the constraint condition vector (the known target spectral signature), and the constraint satisfied is:

d^T w = 1 (3);

step 33: under this constraint vector, the output y_i of the filter for the input r_i satisfies y_i = w^T r_i = r_i^T w, so the average output energy corresponding to the remote sensing image {r_1, r_2, ..., r_N} is:

(1/N) Σ_{i=1}^{N} y_i^2 = (1/N) Σ_{i=1}^{N} (w^T r_i)(r_i^T w) = w^T R w (4),

wherein {r_1, r_2, ..., r_N} is the set of pixel vectors representing the spectral information of the remote sensing image, N is the total number of pixels in the image, and each pixel r_i = [r_{i1}, r_{i2}, ..., r_{il}]^T is an l-dimensional column vector, l being the number of bands of the image and 1 ≤ i ≤ N;
step 34: converting formula (1) into an unconstrained optimization mathematical model by the Lagrange multiplier method:

F(w) = w^T R w + λ(d^T w - 1) (5),

wherein λ is the Lagrange multiplier;

step 35: converting equation (5) into the mathematical model of a system of linear equations:

G s(t) = b (6),

wherein the coefficient matrix G = [2R, d; d^T, 0] ∈ R^{(l+1)×(l+1)}; b = [0, ..., 0, 1]^T ∈ R^{l+1} is the constant vector; s(t) = [w(t)^T, λ(t)]^T ∈ R^{l+1} is the vector to be solved; w(t) = [w_1(t), w_2(t), ..., w_l(t)]^T is the l-dimensional vector of filter coefficients, and λ(t) ∈ R is the Lagrange multiplier;
step 36: defining the error function of equation (6) as:

e(t) = G s(t) - b (7);

step 37: according to equation (7), constructing the integral-enhanced gradient recurrence equation:

ds(t)/dt = -γ G^T ( e(t) + ∫_0^t e(τ) dτ ) (8),

wherein γ > 0 is a convergence gain;

step 38: performing recursive calculation according to formula (8) until the calculated error is smaller than the allowable error, obtaining the filter output coefficients w(t);

step 39: performing retrieval with the obtained filter output coefficients, and outputting the enhanced remote sensing image.
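A minimal numerical sketch of steps 35-38: the linear system G s = b of equation (6) is solved by Euler integration of a gradient recurrence with an integral term in the spirit of equation (8). The toy matrices, the gain gamma, the step size, and the iteration count are all illustrative assumptions, not values from the patent.

```python
import numpy as np

R = np.eye(2)                      # toy 2-band autocorrelation matrix
d = np.array([1.0, 0.0])           # toy target signature
l = d.size

# Assemble G = [[2R, d], [d^T, 0]] and b = [0, ..., 0, 1]^T from step 35.
G = np.zeros((l + 1, l + 1))
G[:l, :l] = 2 * R
G[:l, l] = d
G[l, :l] = d
b = np.zeros(l + 1)
b[l] = 1.0

gamma, dt = 2.0, 0.01              # illustrative gain and Euler step size
s = np.zeros(l + 1)                # s(t) = [w(t), lambda(t)]
integ = np.zeros(l + 1)            # running integral of the error
for _ in range(10000):
    e = G @ s - b                  # error function, eq. (7)
    s += dt * (-gamma * G.T @ (e + integ))   # gradient + integral term, eq. (8)
    integ += dt * e

w, lam = s[:l], s[l]
```

For R = I and d = [1, 0], the exact solution of G s = b is w = [1, 0] and lambda = -2, which matches the closed-form CEM filter w = R^-1 d / (d^T R^-1 d).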
Further, the specific operation steps of step 4 include:
step 41: for the enhanced remote sensing image, calculating the optimal threshold for ground-object extraction with the Otsu algorithm, threshold-segmenting the remote sensing image with the obtained threshold, and extracting part of the ground objects to obtain the Otsu threshold extraction result;
step 42: on a single band of the remote sensing image, extracting the spectral values at the same positions as the Otsu threshold extraction result, threshold-segmenting the single-band gray values, and removing some interfering objects to obtain the single-band threshold extraction result.
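Step 42 can be sketched as a mask intersection: pixels that survive the Otsu segmentation but are darker than the target in a chosen single band are discarded. The band values, mask, and threshold below are invented for the example.

```python
import numpy as np

# Hypothetical single-band gray values: target pixels ~0.9, interferers ~0.4.
band = np.array([
    [0.2, 0.2, 0.9, 0.9],
    [0.2, 0.4, 0.9, 0.9],
    [0.2, 0.4, 0.4, 0.2],
    [0.2, 0.2, 0.2, 0.2],
])
# Preliminary Otsu extraction: target plus some interfering objects.
otsu_mask = np.array([
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
], dtype=bool)

# Keep only mask pixels whose single-band gray value exceeds the threshold,
# exploiting that the interferers (0.4) are darker than the target (0.9).
single_band_threshold = 0.6
refined = otsu_mask & (band > single_band_threshold)
```

The same intersection works for the opposite case (interferers brighter than the target) by flipping the comparison.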
Further, the specific operation steps of step 5 include:
step 51: based on the texture features of the target ground object, constructing specific bands (such as band ratios) from the bands of the remote sensing image, selecting sensitive bands with the Bhattacharyya distance method, building the gray-level co-occurrence texture matrix on the sensitive bands, defining a threshold for target extraction, and removing interfering objects;
step 52: based on the spatial attributes of the target ground objects, extracting the target according to the area, elongation, perimeter, compactness, solidity, form factor, and roundness of the ground objects, and removing interfering objects.
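The gray-level co-occurrence texture matrix and its mean feature named in step 51 can be sketched as follows on a small quantized patch; the quantization level and pixel offset are illustrative choices, and production code would typically use a library implementation.

```python
import numpy as np

def glcm(img, levels, offset=(0, 1)):
    """Symmetric, normalized co-occurrence matrix for one pixel offset."""
    dr, dc = offset
    rows, cols = img.shape
    P = np.zeros((levels, levels))
    for r in range(rows - dr):
        for c in range(cols - dc):
            i, j = img[r, c], img[r + dr, c + dc]
            P[i, j] += 1
            P[j, i] += 1                       # symmetric counting
    return P / P.sum()

def glcm_mean(P):
    """Mean feature: sum over i, j of i * P[i, j]."""
    i = np.arange(P.shape[0])
    return float((i[:, None] * P).sum())

# 4-level quantized patch: a bright "raft" corner on dark "water".
quantized = np.array([
    [0, 0, 3, 3],
    [0, 0, 3, 3],
    [0, 0, 0, 0],
], dtype=int)

P = glcm(quantized, levels=4)
m = glcm_mean(P)
```

Regions whose GLCM mean falls below a defined threshold (homogeneous water, here contributing only low gray levels) can then be removed from the preliminary mask.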
Compared with the prior art, the method has the following beneficial effects:
first, the method uses the improved constrained energy minimization algorithm to enhance the target ground objects in the culture area and weaken the background spectral information, then applies the gray-level co-occurrence texture matrix or an object-oriented method to remove interfering ground objects in a customized manner after the preliminary extraction; even in culture areas where the 'different objects, same spectrum' phenomenon is obvious, it still effectively overcomes background interference and obtains higher-precision extraction results;
second, the spectral characteristics of the ground objects can be used to extract the culture area: threshold segmentation with the Otsu-derived threshold and a single-band threshold extracts part of the target region, and interfering objects are then removed in a customized manner according to the texture and geometric features of the culture-area ground objects, further improving extraction precision;
third, the method converts the constrained energy minimization algorithm into an unconstrained optimization mathematical model via the Lagrange multiplier method and drives the calculation error of each spectrum to zero with an error function and a recurrence equation, improving classification precision.
In conclusion, combining the practical performance of current satellite remote sensing imagery with the actual needs of offshore culture areas, the invention constructs a multi-feature analysis method that fuses the spectral features of remote sensing image ground objects, a threshold method, and the gray-level co-occurrence texture matrix, and uses it to achieve accurate extraction of culture areas.
Drawings
FIG. 1 is a schematic flow diagram of the process of the present invention;
FIG. 2 is a remote sensing image of a raft culture area in Zhanjiang estuary;
FIG. 3 is a diagram showing the result of the extraction in the raft culture area in Zhanjiang estuary;
FIG. 4 is a remote sensing image of a region of a culture pond in Zhanjiang estuary;
FIG. 5 is a diagram showing the extraction result for the culture pond area in Zhanjiang estuary.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the following further describes the technical solution of the present invention with reference to the drawings and the embodiments.
Referring to the attached figures 1-5, the method for extracting the remote sensing image of the near-shore aquaculture area based on the fusion of the multi-feature and the spectrum comprises the following steps:
step 1: inputting an original remote sensing image;
step 2: carrying out image preprocessing on the input remote sensing image, performing sea-land separation on the image of the culture area to be extracted, cropping and retaining the culture area to be extracted, constructing characteristic spectra according to the ground-object characteristics in the processed image, screening sensitive bands among the characteristic-spectrum bands and the original bands of the remote sensing image with the Bhattacharyya distance method, and constructing the target ground-object feature set from the obtained characteristic spectra;
step 3: constructing a finite impulse response linear filter, selecting target ground-object pixel spectral data for a constrained energy minimization algorithm based on a gradient-integral recurrent neural network, and enhancing the target ground-object spectral data with the finite impulse response linear filter while suppressing the image background information, obtaining an enhanced remote sensing image;
step 4: performing a preliminary extraction of the target ground object on the enhanced remote sensing image using an Otsu algorithm and a single-band threshold to obtain a preliminarily extracted remote sensing image of the target ground object;
step 5: based on the texture features and geometric features of the ground objects, removing interfering ground objects in the culture area from the preliminarily extracted remote sensing image in a customized manner by adopting an object-oriented method or the gray-level co-occurrence texture matrix;
step 6: outputting the final extracted remote sensing image of the culture area.
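The Bhattacharyya-distance band screening mentioned in step 2 can be sketched as follows under a 1-D Gaussian assumption for each band; the class samples here are synthetic, invented for the example.

```python
import numpy as np

def bhattacharyya_1d(x, y):
    """Bhattacharyya distance between two 1-D Gaussian sample sets."""
    m1, m2 = x.mean(), y.mean()
    v1, v2 = x.var(), y.var()
    return (0.25 * (m1 - m2) ** 2 / (v1 + v2)
            + 0.5 * np.log((v1 + v2) / (2.0 * np.sqrt(v1 * v2))))

rng = np.random.default_rng(1)
# Band A separates target from background well; band B barely at all.
target_A = rng.normal(0.8, 0.05, 200)
backgd_A = rng.normal(0.2, 0.05, 200)
target_B = rng.normal(0.50, 0.05, 200)
backgd_B = rng.normal(0.52, 0.05, 200)

dA = bhattacharyya_1d(target_A, backgd_A)
dB = bhattacharyya_1d(target_B, backgd_B)
# Rank candidate bands by distance and keep the most separable ones
# as the "sensitive bands" for the feature set.
```

The same scoring applies to derived bands (feature indices, band ratios) as well as to the original sensor bands.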
Further, the specific operation steps of step 3 include:
step 31: constructing a finite impulse response linear filter according to the prior information of the known target pixel spectrum;
step 32: the constrained energy minimization algorithm is expressed as a linearly constrained optimization mathematical model:

min_w w^T R w (1),

wherein w denotes the filter coefficient vector and R is the autocorrelation matrix of the remote sensing image:

R = (1/N) Σ_{i=1}^{N} r_i r_i^T (2),

d denotes the constraint condition vector (the known target spectral signature), and the constraint satisfied is:

d^T w = 1 (3);

step 33: under this constraint vector, the output y_i of the filter for the input r_i satisfies y_i = w^T r_i = r_i^T w, so the average output energy corresponding to the remote sensing image {r_1, r_2, ..., r_N} is:

(1/N) Σ_{i=1}^{N} y_i^2 = (1/N) Σ_{i=1}^{N} (w^T r_i)(r_i^T w) = w^T R w (4),

wherein {r_1, r_2, ..., r_N} is the set of pixel vectors representing the spectral information of the remote sensing image, N is the total number of pixels in the image, and each pixel r_i = [r_{i1}, r_{i2}, ..., r_{il}]^T is an l-dimensional column vector, l being the number of bands of the image and 1 ≤ i ≤ N;
step 34: converting formula (1) into an unconstrained optimization mathematical model by the Lagrange multiplier method:

F(w) = w^T R w + λ(d^T w - 1) (5),

wherein λ is the Lagrange multiplier;

step 35: converting equation (5) into the mathematical model of a system of linear equations:

G s(t) = b (6),

wherein the coefficient matrix G = [2R, d; d^T, 0] ∈ R^{(l+1)×(l+1)}; b = [0, ..., 0, 1]^T ∈ R^{l+1} is the constant vector; s(t) = [w(t)^T, λ(t)]^T ∈ R^{l+1} is the vector to be solved; w(t) = [w_1(t), w_2(t), ..., w_l(t)]^T is the l-dimensional vector of filter coefficients, and λ(t) ∈ R is the Lagrange multiplier;
step 36: defining the error function of equation (6) as:

e(t) = G s(t) - b (7);

step 37: according to equation (7), constructing the integral-enhanced gradient recurrence equation:

ds(t)/dt = -γ G^T ( e(t) + ∫_0^t e(τ) dτ ) (8),

wherein γ > 0 is a convergence gain;

step 38: performing recursive calculation according to formula (8) until the calculated error is smaller than the allowable error, obtaining the filter output coefficients w(t);

step 39: performing retrieval with the obtained filter output coefficients, and outputting the enhanced remote sensing image.
Further, the specific operation steps of step 4 include:
step 41: for the enhanced remote sensing image, calculating the optimal threshold for ground-object extraction with the Otsu algorithm, threshold-segmenting the remote sensing image with the obtained threshold, and extracting part of the ground objects to obtain the Otsu threshold extraction result;
step 42: exploiting the fact that some interfering objects have a lower gray value than the target ground object in a certain single band, extracting on that band the spectral values at the same positions as the Otsu threshold extraction result, threshold-segmenting the single-band gray values, and removing those interfering objects to obtain the single-band threshold extraction result.
further, the specific operation steps of step 5 include:
step 51: based on the texture features of the target ground object, constructing specific bands (such as band ratios) from the bands of the remote sensing image, selecting sensitive bands with the Bhattacharyya distance method, building the gray-level co-occurrence texture matrix on the sensitive bands, defining a threshold for target extraction, and removing interfering objects;
step 52: based on the spatial attributes of the target ground objects, extracting the target according to the area, elongation, perimeter, compactness, solidity, form factor, and roundness of the ground objects, and removing interfering objects.
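The geometric screening of step 52 can be sketched with two simple attributes: a rectangular-fit ratio and an elongation ratio. These approximate the Rect_Fit and Elongation attributes named in the embodiment; the exact definitions used by the original object-oriented software may differ.

```python
import numpy as np

def region_attributes(region):
    """region: boolean mask of one connected ground object."""
    rows, cols = np.nonzero(region)
    h = int(rows.max() - rows.min() + 1)
    w = int(cols.max() - cols.min() + 1)
    area = int(region.sum())
    rect_fit = area / (h * w)                      # 1.0 for a filled rectangle
    elongation = float(max(h, w)) / float(min(h, w))  # >> 1 for river-like shapes
    return rect_fit, elongation

pond = np.zeros((12, 12), dtype=bool)
pond[2:7, 2:8] = True                   # filled 5x6 rectangle: a culture pond
river = np.zeros((12, 12), dtype=bool)
river[5, 0:12] = True                   # 1-pixel-wide, 12-pixel-long strip: a river

pond_rf, pond_el = region_attributes(pond)
river_rf, river_el = region_attributes(river)

# Keep compact, roughly rectangular regions; discard elongated ones.
keep_pond = pond_rf > 0.8 and pond_el < 3
keep_river = river_rf > 0.8 and river_el < 3
```

The thresholds 0.8 and 3 are invented for the example; in practice they would be tuned per scene. Note that the straight strip also has a perfect rectangular fit, so the elongation test is what rejects it.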
Examples
1. Procedure of the test
Firstly, the original remote sensing images shown in FIG. 2 and FIG. 4 are acquired by the sensors and preprocessed, and the culture areas to be extracted are retained. Feature indices are then constructed using the NDAI feature index, the suspended sediment difference index TSM, and the chlorophyll-a concentration index CHL, and, combined with the 4 original bands of the remote sensing image, 7 bands are screened with the Bhattacharyya distance method to build the spectral feature set of the culture-area ground objects. The spectral feature set of the raft culture area mainly comprises the red band, the near-infrared band, and the CHL band, while that of the culture pond area comprises the blue band, the CHL band, and the red band.
Secondly, the target ground-object pixel values of the raft culture area and the culture pond area are input respectively, steps 31-39 are executed with the constrained energy minimization algorithm based on the gradient-integral recurrent neural network, and image enhancement is performed to obtain the enhanced remote sensing images.
After image enhancement of the raft culture area, the pixel gray levels of the raft frames differ obviously from the nearby water, so the raft frames are effectively distinguished from the surrounding water and their edge contours are enhanced. In the culture pond area, the enhanced pond water differs obviously from the seawater, so the ponds are easy to distinguish.
Thirdly, the target ground object is preliminarily extracted. Threshold segmentation is first applied to the enhanced image with the threshold calculated by the Otsu algorithm, extracting part of the ground objects as a coarsely extracted target region; after this segmentation the raft culture area still retains much seawater within the raft-frame region, and the culture pond area retains many small artificial ground objects. Then, exploiting the fact that some interfering objects have lower gray values than the target in a certain single band, the spectral values at the same positions as the Otsu result are extracted on that band and threshold-segmented to remove those interferers: the raft culture area is segmented in the blue band and the culture pond area in the green band.
Finally, interfering objects are removed in a customized manner according to the texture and geometric features of the culture-area ground objects. Since the texture of the target ground objects in the raft culture area differs obviously from that of the water body, the residual water in the raft culture area is removed using the mean feature of the gray-level co-occurrence texture matrix; and since the ponds in the culture pond area are mostly rectangular or square while rivers are conspicuously sinuous, the rivers are removed using their geometric features. Specifically, the raft culture area generates gray-level co-occurrence mean features from the green/red band ratio and the near-infrared/blue band ratio respectively, while the culture pond area is extracted with the Rect_Fit and Elongation attributes of the object-oriented method, after which the results are classified and filtered.
The extraction results obtained by the method are shown in FIG. 3 and FIG. 5.
2. Analysis of results
For accuracy verification, a stratified random sampling scheme is adopted: verification samples are established from field sampling or from visual interpretation of imagery with higher spatial resolution, and the extraction results are verified with a confusion matrix. Specifically, the attribute of each sample point is evaluated in ENVI software against the visual interpretation result from Google Earth, and a confusion matrix is built for accuracy evaluation.
As shown by the culture-area accuracy metrics in Table 1, for the extraction of the raft culture area from the GF-1 image, the method fully exploits the spectral and texture features of the ground objects, effectively overcomes the influence of the highly turbid water body, yields raft frames with clear contours, and reaches an overall accuracy of 98.74%. For the extraction of the culture pond area from the ZY-3 image, the multi-feature analysis method achieves high-precision extraction of the ponds by exploiting the geometric difference between rivers and ponds, with an overall accuracy of 96.74%.
TABLE 1 Accuracy metrics for each culture area
In conclusion, regardless of the culture type, the method shows a marked comparative advantage in overcoming the 'different objects, same spectrum' and 'same object, different spectra' phenomena; after the original remote sensing image is processed by the method, the extraction result is clear and highly accurate, so the method can serve as a preferred algorithm for extracting culture areas from high-resolution remote sensing imagery.
Matters not described in detail in this specification are within the common knowledge of those skilled in the art. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the described technical solutions or substitute equivalents for some of their features; any modification, equivalent replacement, or improvement made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (3)

1. The method for extracting the remote sensing image of the near-shore aquaculture area based on multi-feature and spectrum fusion is characterized by comprising the following steps of:
step 1: inputting an original remote sensing image;
step 2: performing image preprocessing on the input remote sensing image, extracting characteristic spectra with a feature-index method for the ground objects in the processed image, and constructing a target ground object feature set from the obtained characteristic spectra;
step 3: constructing a finite impulse response linear filter; based on a constrained energy minimization algorithm implemented with a gradient-integral recurrent neural network, inputting the target ground object pixel spectral data and enhancing it with the finite impulse response linear filter to obtain an enhanced remote sensing image;
step 4: performing preliminary extraction of the target ground object on the enhanced remote sensing image using the Otsu algorithm and a single-band threshold, to obtain a preliminarily extracted remote sensing image of the target ground object;
step 5: on the preliminarily extracted remote sensing image, removing interfering ground objects in the culture area in a customized manner, based on their texture and geometric features, using an object-oriented method or the gray-level co-occurrence matrix;
step 6: outputting the final remote sensing image of the extracted culture area;
the specific operation steps of the step 3 comprise:
step 31: constructing a finite impulse response linear filter according to the prior information of the known target pixel spectrum;
step 32: the constrained energy minimization algorithm is expressed as a linearly constrained optimization model:
min_w w^T R w, subject to d^T w = 1 (1),
wherein w represents the filter coefficient vector to be solved, and R is the autocorrelation matrix of the remote sensing image:
R = (1/N) Σ_{i=1}^{N} r_i r_i^T (2),
d represents the constraint vector (the known target spectral signature), and d satisfies the condition:
d^T w = 1,
step 33: under this constraint, the output y_i of the filter for input r_i satisfies:
y_i = w^T r_i = r_i^T w (3),
then the average output energy corresponding to the remote sensing image {r_1, r_2, ..., r_N} is:
E = (1/N) Σ_{i=1}^{N} y_i^2 = w^T R w (4),
wherein {r_1, r_2, ..., r_N} are the pixel vectors representing the spectral information of the remote sensing image, N is the total number of pixels in the image, and each pixel r_i = [r_i1, r_i2, ..., r_il]^T is an l-dimensional column vector, l being the number of bands of the image, with 1 ≤ i ≤ N;
step 34: converting the formula (1) into an unconstrained optimization mathematical model by using a Lagrange multiplier method, wherein the formula is as follows:
F(w) = w^T R w + λ(d^T w - 1) (5),
wherein λ is a Lagrange multiplier;
step 35: converting equation (5) into a mathematical model of a linear equation, wherein the equation is as follows:
Gs(t)=b (6),
wherein the coefficient matrix G = [2R, d; d^T, 0] ∈ R^{(l+1)×(l+1)}; b is the coefficient vector, b = [0, ..., 0, 1]^T ∈ R^{l+1}; s(t) = [w(t); λ(t)] ∈ R^{l+1} is the vector to be solved; w(t) = [w_1(t), w_2(t), ..., w_l(t)]^T is the l-dimensional vector of filter coefficients; and λ(t) ∈ R is the Lagrange multiplier;
step 36: the error function of equation (6) is defined as:
e(t) = G s(t) - b (7),
step 37: according to equation (7), the integral-enhanced gradient recurrence equation is constructed as:
ds(t)/dt = -γ G^T ( e(t) + κ ∫_0^t e(τ) dτ ) (8),
wherein γ > 0 is the convergence gain and κ > 0 weights the integral feedback term;
step 38: performing recursive calculation according to the formula (8) until the calculated error is smaller than the allowable error, and obtaining a filter output coefficient w (t);
step 39: performing inversion with the obtained filter output coefficients and outputting the enhanced remote sensing image.
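Steps 31-39 can be sketched numerically. The patent solves G s = b with an integral-enhanced gradient recurrent network; the numpy sketch below uses a plain discretized gradient iteration on the same linear system, with synthetic spectra, a hypothetical target signature d, and an illustrative step size γ, so it is a simplified stand-in rather than the claimed network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scene: N pixels with l bands; d is the known target spectrum (step 31 prior).
N, l = 500, 4
pixels = rng.random((N, l))          # row i is the pixel vector r_i
d = np.array([0.8, 0.6, 0.4, 0.2])

# Autocorrelation matrix R = (1/N) * sum_i r_i r_i^T  (step 32)
R = pixels.T @ pixels / N

# Linear system G s = b encoding the constrained minimum (step 35)
G = np.zeros((l + 1, l + 1))
G[:l, :l] = 2 * R
G[:l, l] = d
G[l, :l] = d
b = np.zeros(l + 1)
b[l] = 1.0

# Discretized gradient iteration s <- s - gamma * G^T (G s - b)  (steps 36-38)
s = np.zeros(l + 1)
gamma = 0.1
for _ in range(100000):
    e = G @ s - b                    # error function e(t) = G s(t) - b, eq. (7)
    if np.linalg.norm(e) < 1e-10:
        break
    s -= gamma * G.T @ e

w = s[:l]                            # filter coefficients w(t)
y = pixels @ w                       # filter outputs y_i = w^T r_i (step 33)
print("constraint d^T w =", d @ w)   # close to 1 by the CEM constraint
```

The recovered w matches the closed-form CEM solution R^{-1} d / (d^T R^{-1} d), confirming that the iteration converges to the constrained minimum.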
2. The near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion of claim 1, characterized in that the specific operation steps of step 4 comprise:
step 41: for the enhanced remote sensing image, computing the optimal threshold for ground object extraction with the Otsu algorithm, performing threshold segmentation of the remote sensing image with the obtained threshold, and extracting part of the ground objects to obtain the Otsu threshold extraction result;
step 42: on a single band of the remote sensing image, extracting the spectral values at the same positions as the Otsu threshold extraction result, performing threshold segmentation on the single-band gray values, and removing part of the interfering objects to obtain the single-band threshold extraction result.
3. The near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion of claim 1, characterized in that the specific operation steps of step 5 comprise:
step 51: constructing specific band combinations among the bands of the remote sensing image based on the texture features of the target ground object, selecting sensitive bands with the Bhattacharyya distance method, establishing the gray-level co-occurrence matrix on the sensitive bands, defining a threshold for target extraction, and removing interfering objects;
step 52: based on the spatial attributes of the target ground objects, extracting the target using the area, elongation, perimeter, compactness, solidity, form factor, and roundness features of the ground objects, and removing interfering objects.
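Rect_Fit and Elongation are attribute names from ENVI's object-oriented feature-extraction workflow. As an assumed moment-based equivalent of the Elongation attribute (not the patent's exact computation), the sketch below derives a region's elongation from the eigenvalues of its pixel-coordinate covariance, which separates square pond-like regions from long river-like strips.

```python
import numpy as np

def elongation(mask):
    """Elongation of a binary region: ratio of major to minor axis
    spread, derived from second-order central moments."""
    r, c = np.nonzero(mask)
    r = r - r.mean()                       # center the coordinates
    c = c - c.mean()
    cov = np.cov(np.stack([r, c]))         # 2x2 coordinate covariance
    eig = np.sort(np.linalg.eigvalsh(cov))
    return float(np.sqrt(eig[1] / max(eig[0], 1e-12)))

# A square pond-like region vs. a long river-like strip
pond = np.zeros((40, 40), dtype=bool)
pond[5:25, 5:25] = True
river = np.zeros((40, 40), dtype=bool)
river[10:13, 2:38] = True
print(elongation(pond), elongation(river))  # the river is far more elongated
```

Thresholding such an elongation value lets the narrow, sinuous river segments be discarded while the compact rectangular ponds are kept.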
CN202011257631.3A 2020-11-12 2020-11-12 Near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion Active CN112287871B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011257631.3A CN112287871B (en) 2020-11-12 2020-11-12 Near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion

Publications (2)

Publication Number Publication Date
CN112287871A CN112287871A (en) 2021-01-29
CN112287871B true CN112287871B (en) 2023-01-17

Family

ID=74398866

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011257631.3A Active CN112287871B (en) 2020-11-12 2020-11-12 Near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion

Country Status (1)

Country Link
CN (1) CN112287871B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112989940B (en) * 2021-02-08 2023-08-01 国家海洋环境监测中心 Raft culture area extraction method based on high-resolution third satellite SAR image
CN112907486B (en) * 2021-03-18 2022-12-09 国家海洋信息中心 Remote sensing image toning method based on deep learning and color mapping
CN113378679A (en) * 2021-06-01 2021-09-10 大连海事大学 Coastal culture pond extraction method based on improved geometric features and feature keeping sampling
CN113538559B (en) * 2021-07-02 2022-02-18 宁波大学 Extraction method of offshore aquaculture raft extraction index based on hyperspectral remote sensing image
CN113837123A (en) * 2021-09-28 2021-12-24 大连海事大学 Mid-resolution remote sensing image offshore culture area extraction method based on spectral-spatial information combination
CN114241336B (en) * 2021-12-30 2022-09-20 河南祥宇工程勘察设计有限公司 River and lake water area right-determining demarcation method based on dynamic low-resolution remote sensing image
CN114612387B (en) * 2022-02-16 2023-02-10 珠江水利委员会珠江水利科学研究院 Remote sensing image fusion method, system, equipment and medium based on characteristic threshold
CN116452901B (en) * 2023-06-19 2023-09-15 中国科学院海洋研究所 Automatic extraction method for ocean culture area of remote sensing image based on deep learning
CN117011555B (en) * 2023-10-07 2023-12-01 广东海洋大学 Mangrove forest ecological detection method based on remote sensing image recognition
CN117095299B (en) * 2023-10-18 2024-01-26 浙江省测绘科学技术研究院 Grain crop extraction method, system, equipment and medium for crushing cultivation area

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622607A (en) * 2012-02-24 2012-08-01 河海大学 Remote sensing image classification method based on multi-feature fusion
CN109840496A (en) * 2019-01-29 2019-06-04 青岛大学 Aquaculture area hierarchical classification extracting method, device, storage medium and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8280111B2 (en) * 2010-02-17 2012-10-02 The Boeing Company Advanced background estimation technique and circuit for a hyper-spectral target detection method
CN108875659B (en) * 2018-06-26 2022-04-22 上海海事大学 Sea chart cultivation area identification method based on multispectral remote sensing image
CN110135479A (en) * 2019-04-29 2019-08-16 中国地质大学(武汉) The high spectrum image object detection method and system of study are estimated based on random forest


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Target Detection Methods for Remote Sensing Images Based on Background Suppression; Cui Zhaobin; China Master's Theses Full-text Database, Information Science and Technology; 2018-04-15 (No. 04); I140-1075 *


Similar Documents

Publication Publication Date Title
CN112287871B (en) Near-shore aquaculture area remote sensing image extraction method based on multi-feature and spectrum fusion
CN109190538B (en) Sediment-laden river delta coastal zone evolution analysis method based on remote sensing technology
Rishikeshan et al. An automated mathematical morphology driven algorithm for water body extraction from remotely sensed images
Hou et al. Marine floating raft aquaculture extraction of hyperspectral remote sensing images based decision tree algorithm
Lathrop et al. A multi-scale segmentation approach to mapping seagrass habitats using airborne digital camera imagery
Qiao et al. An automatic active contour method for sea cucumber segmentation in natural underwater environments
CN109635765B (en) Automatic extraction method for remote sensing information of shallow sea coral reef
CN106815819B (en) More strategy grain worm visible detection methods
CN109410228A (en) Internal wave of ocean detection algorithm based on Method Based on Multi-Scale Mathematical Morphology Fusion Features
CN104318051B (en) The rule-based remote sensing of Water-Body Information on a large scale automatic extracting system and method
CN102176001A (en) Permeable band ratio factor-based water depth inversion method
CN111402169A (en) Method for repairing remote sensing vegetation index time sequence under influence of coastal tide
CN112037244B (en) Landsat-8 image culture pond extraction method combining index and contour indicator SLIC
CN110569733B (en) Lake long time sequence continuous water area change reconstruction method based on remote sensing big data platform
CN111339959A (en) Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion
Liu et al. Mapping China’s offshore mariculture based on dense time-series optical and radar data
KR101050067B1 (en) Aquaculture detection from satellite image
CN107392927B (en) A kind of sub-meter grade remote sensing image fishery net cage extracting method
Xu et al. Remote Sensing Mapping of Cage and Floating-raft Aquaculture in China's Offshore Waters Using Machine Learning Methods and Google Earth Engine
Tamim et al. Detection of Moroccan coastal upwelling fronts in SST images using the microcanonical multiscale formalism
Zhu et al. spectral characteristic analysis and remote sensing classification of coastal aquaculture areas based on GF-1 data
CN115761493A (en) Water body extraction method based on combined water body index frequency
CN113837123A (en) Mid-resolution remote sensing image offshore culture area extraction method based on spectral-spatial information combination
CN114565853A (en) Method for extracting offshore sea area culture pond area under cooperation of spectral characteristics and spatial convolution
CN112308024A (en) Water body information extraction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant