CN111339959A - Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion - Google Patents


Info

Publication number: CN111339959A
Application number: CN202010128207.2A
Authority: CN (China)
Legal status: Pending
Original language: Chinese (zh)
Inventors: 张瑞 (Zhang Rui), 刘国祥 (Liu Guoxiang), 于慧男 (Yu Huinan), 王晓文 (Wang Xiaowen), 张波 (Zhang Bo)
Assignee (original and current): Southwest Jiaotong University
Prior art keywords: image, fusion, SAR, resolution, transformation


Classifications

    • G06V20/13 Satellite images (G Physics → G06 Computing → G06V Image or video recognition or understanding → G06V20/00 Scenes → G06V20/10 Terrestrial scenes)
    • G06F18/25 Fusion techniques (G06F Electric digital data processing → G06F18/00 Pattern recognition → G06F18/20 Analysing)
    • G06Q50/02 Agriculture; fishing; mining (G06Q ICT specially adapted for specific business sectors → G06Q50/00 Systems or methods for specific business sectors)
    • G06V10/267 Segmentation of patterns by performing operations on regions, e.g. growing, shrinking or watersheds (G06V10/20 Image preprocessing → G06V10/26 Segmentation of patterns in the image field)
    • G06V10/56 Extraction of image or video features relating to colour (G06V10/40 Extraction of image or video features)

Abstract

The invention discloses a method for extracting offshore floating raft culture areas based on the fusion of SAR and optical images. The experimental results can accurately delineate the extent of marine floating raft culture, which helps government departments supervise offshore culture areas more effectively, supports regulating culture density and keeping culture waters clean, and eases the conflict between the growth of shipping traffic and the continuing expansion of culture waters; it prevents commercial ships from straying into culture areas and polluting the water, and also improves navigation safety.

Description

Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion
Technical Field
The invention relates to the field of offshore remote sensing extraction, in particular to an offshore buoyant raft culture area extraction method based on SAR and optical image fusion.
Background
Floating raft culture uses floating barrels, plastic floats and other materials assembled in shallow sea areas to cultivate macroalgae, shellfish and other marine organisms. As the scale of floating raft culture expands year by year, offshore sea space is continuously developed, and the environmental pressure that sustained high-density culture places on local sea areas is becoming increasingly prominent. In recent years, reduced shellfish yields and rising mortality in some shellfish culture areas of China have been caused by improper use of culture resources. Systematically monitoring the sea areas used for floating raft culture and accurately extracting the culture extent are therefore of great significance for the healthy development of the mariculture industry and the sustainable use of marine resources.
Because floating raft culture covers a wide range and the raft areas are scattered, the traditional monitoring method of in-situ GPS measurement is time-consuming, requires a large amount of manpower and material resources, and yields results of poor accuracy.
Remote sensing, with its wide detection range, high timeliness and low cost, has unique advantages for extracting marine floating raft culture areas. Optical remote sensing images carry rich spectral information, and culture areas can be extracted from RGB images composed of different band combinations. However, optical sensing is passive remote sensing: imaging is easily limited by weather conditions such as cloud, rain, fog and snow, so culture areas cannot always be extracted accurately and in time. Synthetic Aperture Radar (SAR) remote sensing is active remote sensing; it is not limited by weather, actively transmits microwaves and receives the echoes of ground objects, can image day and night in all weather, and carries rich texture information. However, a single SAR image has only one band and is a grayscale image, so it cannot provide rich spectral information, which increases the difficulty of extracting floating raft culture information.
The current research mainly takes a single optical image or a single SAR image to extract a floating raft culture area, and the advantages of the two images cannot be combined to extract the marine floating raft culture area. There is an obvious technical bottleneck in carrying out remote sensing extraction of aquaculture areas based on only a single data source.
Disclosure of Invention
Aiming at the defects in the prior art, the method for extracting the offshore buoyant raft culture area based on SAR and optical image fusion provided by the invention solves the problems of low precision and reliability of the existing technology for extracting the offshore buoyant raft culture area.
In order to achieve the purpose of the invention, the invention adopts the technical scheme that: an offshore buoyant raft culture area extraction method based on SAR and optical image fusion is characterized by comprising the following steps:
s1, preprocessing the optical image and the SAR image to obtain a preprocessed optical image and a preprocessed SAR image;
s2, fusing the preprocessed optical image and the preprocessed SAR image by adopting four fusion methods to obtain a fused image;
and S3, extracting the floating raft culture area from the fused image by adopting an object-oriented method.
Further, the preprocessing of step S1 includes: radiometric calibration, atmospheric correction, conversion of complex data to amplitude data, multi-look processing, filtering and geocoding.
Further, the four fusion methods of step S2 include: HIS transformation fusion, Brovey transformation fusion, G-S transformation fusion and principal component transformation fusion.
Further, the HIS transform fusion method comprises the following steps:
a1, identifying the lower-spatial-resolution multispectral image among the preprocessed optical image and the preprocessed SAR image, and registering and resampling its three band images with the high-resolution image so that they have the same pixel resolution and spatial geometric position;
a2, correspondingly transforming the multispectral image into an HIS space according to the wave bands and the RGB values to obtain three components of brightness I, chroma H and saturation S;
a3, performing contrast stretching on the image with high spatial resolution to make the image have the same mean and variance with the brightness component I;
and A4, replacing the I component with the stretched high-resolution image, and applying the inverse HIS transformation to it together with the gray-stretched chrominance component H' and saturation component S' to obtain the fused image.
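As a hedged illustration, steps A1–A4 can be sketched with a simplified linear-IHS substitution, in which the intensity I is approximated as the band mean; with that choice, substituting the stretched high-resolution image for I and inverting the transform reduces to adding a difference image to every band. The function name, array shapes and use of numpy are assumptions of this sketch, not part of the patent:

```python
import numpy as np

def his_fusion(ms, pan):
    """Simplified linear-IHS fusion sketch.

    ms  : (3, H, W) multispectral bands, already registered and
          resampled to the high-resolution grid (step A1).
    pan : (H, W) high-resolution (here: SAR) intensity image.
    """
    ms = ms.astype(float)
    pan = pan.astype(float)
    # Intensity component of the linear IHS model (step A2, simplified).
    intensity = ms.mean(axis=0)
    # Stretch pan to the mean/variance of the intensity component (step A3).
    pan_stretched = (pan - pan.mean()) / (pan.std() + 1e-12) \
        * intensity.std() + intensity.mean()
    # Substituting I' for I and inverting the linear IHS transform
    # reduces to adding the difference image to every band (step A4).
    return ms + (pan_stretched - intensity)
```

A convenient sanity check: if the supplied pan image already equals the intensity component, the bands come back unchanged.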
Further, the Brovey transform fusion method reassigns the RGB values of the preprocessed optical image and the preprocessed SAR image according to the following formulas:

R_{new} = \frac{R}{R+G+B} \cdot PAN, \quad G_{new} = \frac{G}{R+G+B} \cdot PAN, \quad B_{new} = \frac{B}{R+G+B} \cdot PAN

where R_{new}, G_{new} and B_{new} are the new values of the R, G and B components, and PAN is the panchromatic band value.
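The Brovey formulas translate directly into a few lines of numpy; the small `eps` term, added to avoid division by zero over dark water pixels, is an implementation assumption rather than part of the patent's formula:

```python
import numpy as np

def brovey_fusion(r, g, b, pan, eps=1e-12):
    """Brovey transform: scale each band by its share of the band sum,
    then multiply by the panchromatic (here: SAR) band."""
    total = r + g + b + eps
    return pan * r / total, pan * g / total, pan * b / total
```

A useful property to check is that the three fused bands sum to the panchromatic value at every pixel.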
Further, the G-S transformation fusion method comprises the following steps:
b1, dividing the preprocessed optical image and the preprocessed SAR image into a multispectral image with low spatial resolution and a high-resolution panchromatic image according to a spatial resolution threshold;
b2, generating a simulated low-spatial-resolution full-color image by using the multispectral image with low spatial resolution;
b3, stacking the simulated image as the first band onto the multispectral image, and then applying the forward G-S transformation to the recombined multispectral image;
and B4, replacing the first component after G-S transformation by the high-resolution full-color image, and performing G-S inverse transformation on the replaced multispectral image to obtain a final fusion image.
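Steps B1–B4 can be sketched as a Gram-Schmidt substitution in numpy. Taking the simulated panchromatic band as the plain band mean and variance-matching the real pan image to the first component are common simplifications assumed here, not the patent's exact procedure:

```python
import numpy as np

def gs_fusion(ms, pan):
    """Gram-Schmidt pansharpening sketch; ms is (N, H, W), pan is (H, W)."""
    n, h, w = ms.shape
    X = ms.reshape(n, -1).astype(float)
    pan_v = pan.reshape(-1).astype(float)
    # (B2) simulate a low-resolution panchromatic band as the band mean
    sim = X.mean(axis=0)
    # (B3) forward GS transform with the simulated pan as the first component
    comps = [sim - sim.mean()]
    means = [sim.mean()]
    coefs = []  # projection coefficients, reused by the inverse transform
    for k in range(n):
        v = X[k] - X[k].mean()
        c = []
        for g in comps:
            phi = (v @ g) / (g @ g)
            c.append(phi)
            v = v - phi * g
        comps.append(v)
        means.append(X[k].mean())
        coefs.append(c)
    # (B4) variance-match the real pan to the first component and substitute
    g1 = comps[0]
    comps[0] = (pan_v - pan_v.mean()) / (pan_v.std() + 1e-12) * g1.std()
    # inverse GS transform: rebuild each band from its stored coefficients
    out = np.empty_like(X)
    for k in range(n):
        v = comps[k + 1].copy()
        for phi, g in zip(coefs[k], comps[: k + 1]):
            v = v + phi * g
        out[k] = v + means[k + 1]
    return out.reshape(n, h, w)
```

If the supplied pan image equals the simulated one, the inverse transform reproduces the input bands exactly, which makes a convenient sanity check.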
Further, the principal component transformation fusion method comprises the following steps:
c1, dividing the preprocessed optical image and the preprocessed SAR image into a low-resolution image and a high-resolution image according to a spatial resolution threshold;
c2, performing principal component transformation on the low-resolution image;
c3, stretching the gray scale of the high-resolution image so that its mean and variance are consistent with those of the first principal component image of the transformed low-resolution image;
and C4, replacing the first component image of the low-resolution image with the stretched high-resolution image, and restoring the replaced low-resolution image to the original image space through principal component inverse transformation.
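The C1–C4 principal-component substitution can be sketched as follows. Treating each band as a variable and each pixel as a sample is the standard setup; the function and variable names are illustrative assumptions:

```python
import numpy as np

def pca_fusion(ms, pan):
    """Principal-component substitution sketch: replace PC1 of the
    low-resolution bands with the variance-matched high-resolution image."""
    n, h, w = ms.shape
    X = ms.reshape(n, -1).astype(float)
    mu = X.mean(axis=1, keepdims=True)
    Xc = X - mu
    cov = Xc @ Xc.T / (Xc.shape[1] - 1)
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    vecs = vecs[:, np.argsort(vals)[::-1]]
    pcs = vecs.T @ Xc                     # principal components (C2)
    # stretch pan to the mean/variance of the first PC (C3)
    p = pan.reshape(-1).astype(float)
    pcs[0] = (p - p.mean()) / (p.std() + 1e-12) * pcs[0].std() + pcs[0].mean()
    # substitute and invert the transform back to image space (C4)
    return (vecs @ pcs + mu).reshape(n, h, w)
```

Because the substituted first component keeps the original component's mean, the band means of the fused image match those of the input bands.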
The invention has the beneficial effects that: the marine culture area is extracted by fusing the optical image and the SAR image, and the fusion algorithm best suited to marine culture area extraction is determined by experiment, which effectively improves the extraction accuracy of floating raft culture areas in offshore waters. The experimental results can accurately delineate the extent of marine floating raft culture, which helps government departments supervise offshore culture areas more effectively, supports regulating culture density and keeping culture waters clean, and eases the conflict between the growth of shipping traffic and the continuing expansion of culture waters; it prevents commercial ships from straying into culture areas and polluting the water, and also improves navigation safety.
Drawings
FIG. 1 is a schematic flow chart of a method for extracting an offshore buoyant raft culture area based on SAR and optical image fusion;
FIG. 2 shows the image fusion results for the Changhai County culture area in Liaoning Province, wherein (a) is the preprocessed optical image of the test area, (b) is the preprocessed SAR image of the test area, (c) is the HIS transform fusion result, (d) is the Brovey transform fusion result, (e) is the G-S transform fusion result, and (f) is the K-L transform fusion result;
FIG. 3 shows the extraction results for the Changhai County area in Liaoning Province, wherein (a) is the extraction result from the Landsat-8 image, (b) from the GF-3 image, (c) from the HIS transform fusion image, (d) from the Brovey transform fusion image, (e) from the G-S transform fusion image, and (f) from the K-L transform fusion image;
FIG. 4 shows the image fusion results for the Sangou Bay culture area in Shandong Province, wherein (a) is the preprocessed optical image of the test area, (b) is the preprocessed SAR image of the test area, (c) is the HIS transform fusion result, (d) is the Brovey transform fusion result, (e) is the G-S transform fusion result, and (f) is the K-L transform fusion result;
FIG. 5 shows the extraction results for the Sangou Bay culture area in Shandong Province, wherein (a) is the extraction result from the Landsat-8 image, (b) from the GF-3 image, (c) from the HIS transform fusion image, (d) from the Brovey transform fusion image, (e) from the G-S transform fusion image, and (f) from the K-L transform fusion image;
FIG. 6 shows the image fusion results for the Sandu'ao culture area in Fujian Province, wherein (a) is the preprocessed optical image of the test area, (b) is the preprocessed SAR image of the test area, (c) is the HIS transform fusion result, (d) is the Brovey transform fusion result, (e) is the G-S transform fusion result, and (f) is the K-L transform fusion result;
FIG. 7 shows the extraction results for the Sandu'ao culture area in Fujian Province, wherein (a) is the extraction result from the Landsat-8 image, (b) from the GF-3 image, (c) from the HIS transform fusion image, (d) from the Brovey transform fusion image, (e) from the G-S transform fusion image, and (f) from the K-L transform fusion image.
Detailed Description
The following description of the embodiments of the present invention is provided to help those skilled in the art understand the invention, but it should be understood that the invention is not limited to the scope of the embodiments. To those skilled in the art, various changes that do not depart from the spirit and scope of the invention as defined in the appended claims remain within its protection, as does all matter produced using the inventive concept.
As shown in fig. 1, in an embodiment of the present invention, an offshore raft culture area extraction method based on SAR and optical image fusion includes the following steps:
s1, preprocessing the optical image and the SAR image to obtain a preprocessed optical image and a preprocessed SAR image;
s2, fusing the preprocessed optical image and the preprocessed SAR image by adopting four fusion methods to obtain a fused image;
and S3, extracting the floating raft culture area from the fused image by adopting an object-oriented method.
The step of S1 includes:
s11: the pre-processing of the optical image includes radiometric calibration and atmospheric correction.
(1) Radiometric calibration
Radiometric calibration is a process of converting a voltage or a digital quantized value recorded by a sensor into an absolute radiance value (radiance), or into a relative value related to a physical quantity such as a surface reflectivity, a surface temperature, and the like.
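As a hedged sketch of this step: for many optical sensors, calibration is an affine conversion from digital numbers (DN) to at-sensor radiance, optionally followed by conversion to top-of-atmosphere reflectance. The gain, offset and ESUN values used below are hypothetical placeholders, not coefficients of any specific sensor:

```python
import math

def dn_to_radiance(dn, gain, offset):
    """At-sensor spectral radiance from a digital number: L = gain * DN + offset."""
    return gain * dn + offset

def radiance_to_toa_reflectance(radiance, esun, sun_elev_deg, d=1.0):
    """Top-of-atmosphere reflectance from at-sensor radiance.

    esun : mean solar exoatmospheric irradiance for the band
    d    : Earth-Sun distance in astronomical units
    """
    theta = math.radians(90.0 - sun_elev_deg)  # solar zenith angle
    return math.pi * radiance * d ** 2 / (esun * math.cos(theta))
```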
(2) Atmospheric correction
The purpose of atmospheric correction is to eliminate the influence of the atmosphere and illumination on ground object reflectance and to obtain real physical parameters such as the reflectivity, radiance and surface temperature of ground objects. It removes the effects of atmospheric water vapor, oxygen, carbon dioxide, methane and ozone on ground object reflectance, as well as the effects of molecular and aerosol scattering. In most cases, atmospheric correction is also a process of inverting the true reflectivity of ground objects.
S12: the SAR image is L1A level single view slant range complex (SLC) data, and requires preprocessing such as radiometric calibration, data conversion, multi-view processing, filtering processing, and geocoding on raw data.
(1) Radiometric calibration
The SAR sensor measures the ratio of the transmitted pulse and the received signal strength, which is called backscatter. The backscattering intensity information subjected to radiometric calibration is not influenced by SAR data observation geometry (different SAR sensors or different receiving modes), is normalized to the same standard, and can be compared and analyzed.
(2) Conversion of complex data to amplitude data
The amplitude feature is one of the most important features of an SAR image, and ground object target information can be extracted from the SAR amplitude image, so SAR complex data must be converted to SAR amplitude data.
(3) Multi-look processing
Single look complex (SLC) SAR data is the original, highest-resolution data, but the coherent superposition of SAR echo signals scattered within a single resolution element leaves much noise in the intensity information. To improve the visual quality of the image and the estimation accuracy of the backscatter of each pixel, multi-look processing is required, i.e. averaging several independent looks of the target. Multi-look processing brings the geometric characteristics of the image closer to ground truth, and it also reduces speckle noise to a certain extent (at the cost of reduced spatial resolution).
Multi-look processing is currently the most effective method of suppressing speckle noise: after N-look processing, the variance of the speckle noise is reduced to 1/N of its original value, but at the expense of an N-fold reduction in spatial resolution.
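The N-look averaging described above amounts to block-averaging the single-look intensity image. A minimal sketch, assuming a 2-D numpy intensity array:

```python
import numpy as np

def multilook(intensity, n_az, n_rg):
    """Average n_az x n_rg blocks of a single-look intensity image."""
    h, w = intensity.shape
    h2, w2 = h - h % n_az, w - w % n_rg  # crop to a multiple of the look sizes
    img = intensity[:h2, :w2].astype(float)
    return img.reshape(h2 // n_az, n_az, w2 // n_rg, n_rg).mean(axis=(1, 3))
```

On simulated exponential speckle (unit mean and variance), 2x2 multilooking brings the variance close to 1/4, matching the 1/N reduction stated above.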
(4) Filtering process
Since the SAR system is a coherent system, speckle noise is an inherent phenomenon of SAR images. Due to the existence of speckle noise, the SAR image cannot correctly reflect the scattering characteristics of the ground object target, the image quality is seriously influenced, and the ground object interpretability of the SAR image is reduced, so that speckle noise filtering processing is required. The Frost filtering has good edge holding capability and meets the basic requirement of extraction in a culture area, so the SAR filter adopted in the research is the Frost filter.
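A simplified Frost filter can be sketched as below: each output pixel is a weighted window average whose weights decay exponentially with distance, scaled by the local squared coefficient of variation, so strongly varying (edge) regions are smoothed less than flat sea surface. The window size and damping factor are illustrative defaults, not values prescribed by the patent:

```python
import numpy as np

def frost_filter(img, win=5, damping=2.0):
    """Simplified Frost speckle filter sketch (edge-preserving window average)."""
    img = img.astype(float)
    r = win // 2
    pad = np.pad(img, r, mode='reflect')
    h, w = img.shape
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    dist = np.sqrt(yy ** 2 + xx ** 2)
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            window = pad[i:i + win, j:j + win]
            mu = window.mean()
            cv2 = window.var() / (mu * mu + 1e-12)  # squared coeff. of variation
            wgt = np.exp(-damping * cv2 * dist)
            out[i, j] = (wgt * window).sum() / wgt.sum()
    return out
```

On a homogeneous area the coefficient of variation vanishes, the weights become uniform, and the filter reduces to a plain moving average, which is the expected Frost behavior.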
(5) Geocoding
The SAR system observes the intensity and phase of the SAR pulse reflected (backscattered) after the electromagnetic wave strikes the Earth's surface. This information is encoded and recorded in the SAR coordinate system, i.e. the slant range coordinate system. In some applications, the SAR data needs to be transferred from the slant range coordinate system to a geographic coordinate system; this process is the geocoding of the SAR data.
further, four fusion methods are adopted to fuse the optical image and the SAR image:
s21: the specific process of the HIS transformation fusion method comprises the following steps:
(1) registering and resampling three band images of the multispectral image with lower spatial resolution and the high-resolution image to enable the multispectral image and the high-resolution image to have the same pixel resolution and spatial geometric position;
(2) converting the multispectral image into an HIS space according to wave bands and RGB values to obtain three components of brightness I, chroma H and saturation S;
(3) then, performing contrast stretching on the image with high spatial resolution to enable the image to have the same mean value and variance with the brightness component I;
(4) finally, replacing the component I with the stretched high-resolution image, and carrying out HIS inverse transformation on the stretched high-resolution image, the gray-scale chroma component H 'and the saturation component S' to obtain a fused image;
s22: the Brovey transform is essentially the multiplication of each band value by the panchromatic band after comparing the total band value:
Figure BDA0002395055670000081
wherein R isnewNew value of R component of RBG value, GnewNew value of G component, B, for RBG valuenewThe new value of the B component of the RBG value and PAN is the full-color band value.
S23: the main steps of G-S transformation:
the algorithm mainly comprises the following steps: (1) generating a simulated low spatial resolution panchromatic image from the low spatial resolution multispectral image; (2) superposing the analog image (serving as a first wave band) on the multispectral image, and then carrying out G-S forward transformation on the recombined multispectral image; (3) and replacing the first component after G-S transformation by using the high-resolution full-color image, and then performing G-S inverse transformation on the replaced multispectral image to obtain a final fusion image.
S24: the principal component transformation fusion is to carry out principal component transformation on the low-resolution images of N wave bands, and stretch the gray scale of the high-resolution images of single wave bands to make the mean value and the variance of the gray scale consistent with the first component image of the principal component transformation; and then replacing the first component image with the stretched high-resolution image, and restoring the high-resolution image to the original space through principal component inverse transformation.
The step S3 includes: first, image segmentation is performed to divide the remote sensing image into objects; then the features of the floating raft objects are statistically analyzed, and the classification result is obtained after the objects are identified by their spatial features.
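The object-oriented step is described only at a high level, so the following is a toy stand-in rather than the patent's segmentation: it thresholds the fused image into candidate raft pixels, labels connected components, and keeps only objects passing a simple area (spatial-feature) test. The threshold, connectivity and minimum area are all illustrative assumptions:

```python
import numpy as np
from collections import deque

def extract_rafts(img, threshold, min_area=4):
    """Toy object-based pipeline: segment, label components, filter by area."""
    mask = img > threshold
    labels = np.zeros(mask.shape, dtype=int)
    h, w = mask.shape
    nxt = 0
    for si in range(h):
        for sj in range(w):
            if mask[si, sj] and labels[si, sj] == 0:
                nxt += 1
                labels[si, sj] = nxt
                q = deque([(si, sj)])
                while q:  # 4-connected flood fill
                    i, j = q.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w \
                                and mask[ni, nj] and labels[ni, nj] == 0:
                            labels[ni, nj] = nxt
                            q.append((ni, nj))
    # object-level feature test: discard components smaller than min_area
    keep = [k for k in range(1, nxt + 1) if (labels == k).sum() >= min_area]
    return np.isin(labels, keep)
```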
After the extraction operation is finished, in order to verify the effect of the method, the fusion quality and the extraction precision are evaluated according to the following steps:
s41: the objective evaluation of image quality is to determine statistical parameters of the fused image. The objective evaluation can overcome the interference of subjective factors of people and judge the advantages and disadvantages of different fusion methods. Currently, there are several types of commonly used methods for objectively evaluating image quality. Wherein the mean, the average gradient and the standard deviation are evaluation indexes reflecting brightness information;
assuming that the image is Z and the image function is Z (x, y), the number of rows and columns of the image are M and N, respectively, the size of the image is M × N, and L is the total gray level of the image.
(1) Mean μ (Average Value):
the average value is an arithmetic average value of brightness values of all pixels in the image, can be used for reflecting the average reflection intensity of the ground objects in the remote sensing image, and represents the average reflectivity of the ground objects. The definition is as follows:
\mu = \frac{1}{M \times N} \sum_{x=1}^{M} \sum_{y=1}^{N} Z(x, y)
(2) Average gradient \bar{\nabla G}:
The average gradient sensitively reflects an image's ability to express contrast in fine detail and can be used to evaluate image sharpness.
A blurred image is one in which the regions near boundaries and on both sides of lines are indistinct, i.e. the rate of gray-level change is small. Since this rate of change can be represented by the gradient, image sharpness can be measured by the average gradient, calculated as:

\bar{\nabla G} = \frac{1}{(M-1)(N-1)} \sum_{x=1}^{M-1} \sum_{y=1}^{N-1} \sqrt{\frac{1}{2}\left[\left(\frac{\partial Z(x,y)}{\partial x}\right)^{2} + \left(\frac{\partial Z(x,y)}{\partial y}\right)^{2}\right]}
generally, the larger the image hierarchy, the clearer the image is, and therefore, the difference in the fine detail expression ability of the fused image can be reflected by using the index.
(3) Standard deviation δ (Standard deviation):
the standard deviation describes the degree of dispersion of the pixel value from the image mean. To some extent, the standard deviation can be used to evaluate the magnitude of the image contrast. The larger the standard deviation is, the more the distribution of the gray level of the image is dispersed, and the larger the contrast of the image is; the smaller the standard deviation, the smaller the image contrast. The definition is as follows:
\delta = \sqrt{\frac{1}{M \times N} \sum_{x=1}^{M} \sum_{y=1}^{N} \left(Z(x, y) - \mu\right)^{2}}
(4) entropy of information e (information entropy):
the entropy of the image is an important index for measuring the richness of information, and the size of the entropy can indicate the average amount of information contained in the image. For an individual image, the gray values of the pixels are considered as mutually independent samples, and let the gray distribution of an image be P ═ P0,P1,P2...Pn},PiRepresenting the probability of a pixel having a gray value i in the image, i.e. the number of pixels N having a gray value iiAnd the number of pixels of the image N. According to the principles of Shannon information theory, it defines:
Figure BDA0002395055670000103
the information entropy can objectively evaluate the change of the information amount of the image before and after fusion. The larger the entropy of the image is, the more the information amount of the fused image increases, and the richer the information contained in the fused image is, the better the fusion quality is.
(5) Correlation Coefficient ρ (Correlation Coefficient):
the correlation coefficient ρ reflects the similarity of the spectral features between the fused image F and the original image a, i.e., the ability to maintain the spectral characteristics. The definition is as follows:
Figure BDA0002395055670000111
wherein
Figure BDA0002395055670000112
And
Figure BDA0002395055670000113
the gray level mean values of the fused image F and the original image A are respectively, and the relationship between the fused image and the original image can be seen by comparing the image correlation coefficients before and after fusion. The larger the rho value is, the more information the fused image obtains from the original image is, and the better the fusion effect is.
(6) Deviation index DI (Difference index)
The deviation index reflects the degree of deviation between the fused image and the original multispectral image, and is defined as:

DI = \frac{1}{M \times N} \sum_{x=1}^{M} \sum_{y=1}^{N} \frac{\left|F(x,y) - A(x,y)\right|}{A(x,y)}

The deviation index is computed from the fused image and the original multispectral image according to this formula; the smaller it is, the better the fusion method preserves the spectral information of the image. The deviation index therefore describes the degree of spectral distortion between the fused image and the original image.
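The evaluation indices above can be computed in one small numpy helper. The finite-difference approximation of the gradient and the integer binning for the entropy are implementation assumptions:

```python
import numpy as np

def quality_metrics(fused, ref):
    """Objective fusion-quality indices for two equally sized gray images."""
    f = fused.astype(float)
    a = ref.astype(float)
    mean = f.mean()
    std = f.std()
    # average gradient from finite differences
    dx = np.diff(f, axis=0)[:, :-1]
    dy = np.diff(f, axis=1)[:-1, :]
    avg_grad = np.sqrt((dx ** 2 + dy ** 2) / 2).mean()
    # information entropy over integer gray levels
    _, counts = np.unique(f.astype(int), return_counts=True)
    p = counts / counts.sum()
    entropy = -(p * np.log2(p)).sum()
    # correlation coefficient with the reference image
    rho = np.corrcoef(f.ravel(), a.ravel())[0, 1]
    # deviation index relative to the reference image
    di = (np.abs(f - a) / (np.abs(a) + 1e-12)).mean()
    return dict(mean=mean, std=std, avg_grad=avg_grad,
                entropy=entropy, rho=rho, di=di)
```

For an image compared against itself, the correlation coefficient is 1 and the deviation index is 0, which serves as a basic consistency check.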
S42: the extraction precision is evaluated, and the evaluation indexes are as follows:
(1) The confusion matrix records, for each category, the total number of samples, the number of samples misclassified into other categories, and the number of samples omitted.
(2) Producer's accuracy (drawing accuracy) is the proportion of all reference samples of class i (a column of the confusion matrix) that are correctly extracted as class i. It corresponds to the omission error: omission error = 1 − producer's accuracy.
(3) User's accuracy is the proportion of all samples classified as class i (a row of the confusion matrix) that actually belong to class i. It corresponds to the commission error: commission error = 1 − user's accuracy.
(4) The overall extraction accuracy is the proportion of all samples that are correctly extracted.
(5) The KAPPA coefficient, unlike the overall extraction accuracy, uses the information of the entire error matrix and is generally believed to reflect the overall extraction accuracy more faithfully. It should be emphasized, however, that the KAPPA coefficient applies only when the test samples are randomly selected from the whole image.
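The accuracy measures above follow directly from the confusion matrix. A sketch, assuming (as the text does) that rows hold classified totals and columns hold reference totals:

```python
import numpy as np

def accuracy_from_confusion(cm):
    """Producer's/user's accuracy, overall accuracy and KAPPA coefficient
    from a confusion matrix cm[classified, reference]."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    diag = np.diag(cm)
    producers = diag / cm.sum(axis=0)  # per column (reference totals)
    users = diag / cm.sum(axis=1)      # per row (classified totals)
    overall = diag.sum() / n
    # chance agreement expected from the row/column marginals
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / (n * n)
    kappa = (overall - pe) / (1 - pe)
    return producers, users, overall, kappa
```

For a balanced two-class matrix with 45 correct and 5 wrong samples per class, the overall accuracy is 0.9 and KAPPA is 0.8.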
The invention will be further described with reference to the accompanying drawings in which:
Using Gaofen-3 (GF-3) SAR data and Landsat-8 OLI optical data, and taking the floating raft culture areas of Changhai County in Liaoning Province, Sangou Bay in Shandong Province, and Sandu'ao in Fujian Province as examples, the floating raft culture areas are extracted by the image fusion method:
firstly, preprocessing an optical image and an SAR image, then fusing the optical image and the SAR image by four fusion methods of HIS (high-intensity-localization) transformation, Brovey transformation G-S transformation and principal component transformation, evaluating the fusion quality, and finally extracting a floating raft culture area from the fusion result by an object-oriented method and evaluating the precision of the extraction result.
1. Changhai county culture area of Liaoning province
The results of the image fusion experiments are shown in FIG. 2.
First, the fused images are evaluated subjectively. None of the four fusion results shows ghosting, indicating good registration; in the color contrast between the culture area and the ocean, all four fusion results are superior to the original pre-fusion images; in spatial information and texture, the four fusion results inherit the clear texture of the SAR image well; the sharpness of each fusion result is well maintained, and the edges of the floating raft areas are clear. Overall, because the water in this region is clear and the visibility depth is high, the four fused images inherit the good textural characteristics of the SAR image, and because the ocean spectrum in the original optical image shows no differential distribution, the spectral characteristics of the ocean are uniform and consistent within the same region of the fused images.
The fused image is then objectively evaluated from three aspects, namely, an evaluation index (mean, average gradient, and standard deviation) reflecting luminance information, an evaluation index (entropy) reflecting spatial detail information, and an evaluation index (correlation coefficient and deviation index) reflecting spectral information, as shown in tables 1 and 2.
TABLE 1 index values for evaluation of brightness of four fusion results
TABLE 2 evaluation index values of spatial detail information and spectral information of four fusion results
In terms of the mean, the differences between bands of the fused images are reduced. In terms of the average gradient, the sharpness and gradation of the fused band images are greatly improved over the corresponding bands of the original optical image, with the G-S transform fusion showing the highest gradient. In terms of the standard deviation, only the HIS-transform-fused image shows an increase, indicating higher contrast that benefits subsequent extraction; the standard deviations of the images fused by the other methods differ little from the original image.
From the information entropy, all four fused images improve on the corresponding bands of the original multispectral image, so all four methods can increase the information content of each optical band. From the deviation index, all four fusion methods distort the spectral information of the original image, though to different degrees in each band: the HIS fusion shows large distortion in band B2, the methods are nearly equal in band B3, and in band B4 the HIS and Brovey fusions show the least distortion while the other two methods distort more. From the correlation coefficients with the optical image, except for the G-S transform fusion, whose correlation with the original optical image is lower, the B2 and B3 bands of the remaining three methods correlate strongly with the original optical image (all above 0.75), while their B4 bands correlate comparatively weakly. From the correlation coefficients with the SAR image, all four fused images correlate with the SAR image more strongly, to varying degrees, than the original optical image does.
Overall, combining subjective and objective evaluation, the HIS fusion image achieves the best fusion result, outperforming the other fusion results in richness of spectral information and in preservation of the brightness and texture information of the whole image.
The object-oriented extraction results of the Changhai county culture area in Liaoning province are shown in FIG. 3.
From the perspective of subjective evaluation, result (a) is poor owing to the limited resolution. Result (b) is clearly better than the optical-image extraction, but its floating rafts are very discontinuous: the extracted rafts are completely fragmented and their integrity is poorly preserved, mainly because only gray-level information is available, so the extraction basis is single. The extraction results of the four fusion methods are more accurate, but the raft interiors are not complete and coherent. Analysis of the pre-fusion optical and SAR images shows that parts of the raft areas in the original images share the same gray level as the ocean: the rafts show only bright edges, with interiors reflecting like the open sea. This is preliminarily attributed to rafts without suspended cages installed beneath the floats, whose interiors lack the corresponding volume scattering and therefore appear dark.
To evaluate the extraction results more precisely, a quantitative accuracy assessment is performed using indices commonly applied to object-oriented extraction: the confusion matrix, user's accuracy, producer's accuracy, overall accuracy, and KAPPA coefficient. The accuracy evaluation is shown in Tables 3 and 4.
From the confusion matrices, every extraction result contains misclassified samples. From the producer's accuracy, all four results show a larger omission error for the culture area than for the non-culture area, which is consistent with the subjective evaluation and is preliminarily attributed to floats without suspended cages installed beneath them. From the user's accuracy, the commission errors for the culture area in the original image, the HIS fusion image, and the G-S fusion image are smaller than those for the non-culture area, while the other two methods show the opposite. From the overall accuracy, the Brovey fusion image is the most accurate, the other three fusion results all exceed 90%, and the original optical image is the least accurate. From the KAPPA coefficient, the HIS fusion image performs best, the coefficients of the other three fusion results are also high (above 0.8), and those of the original images are low. In summary, for the Changhai County culture areas of Liaoning Province, where the water is clear and the visible depth is large, all four fusion methods extract more accurately than the original images; the Brovey fusion result has the highest comprehensive accuracy, with the HIS fusion result second.
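The producer's and user's accuracies, overall accuracy, and KAPPA coefficient used above can all be derived from the confusion matrix alone. A minimal NumPy sketch, assuming rows hold the classified result and columns the reference data (the function name is an illustrative choice):

```python
import numpy as np

def accuracy_from_confusion(cm):
    """Overall accuracy, producer's/user's accuracy, and KAPPA coefficient
    from a confusion matrix (rows = classified result, columns = reference)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    overall = np.trace(cm) / total
    producers = np.diag(cm) / cm.sum(axis=0)   # 1 - omission error per class
    users = np.diag(cm) / cm.sum(axis=1)       # 1 - commission error per class
    # chance agreement expected from the row/column marginals
    chance = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / total ** 2
    kappa = (overall - chance) / (1.0 - chance)
    return overall, producers, users, kappa
```

For example, a two-class matrix [[90, 10], [5, 95]] gives an overall accuracy of 0.925 and a KAPPA coefficient of 0.85.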
TABLE 3 confusion matrix extracted from the image floating raft culture area in Changhai county of Liaoning province
TABLE 4 Overall precision and KAPPA coefficient extracted from the image floating raft culture area in Changhai county of Liaoning province
2. Sanggou Bay culture area, Shandong Province
The results of the image fusion experiments are shown in FIG. 4.
First, the fused images are evaluated subjectively. None of the four fusion results shows ghosting, indicating good registration. In color contrast between the culture area and the ocean, the G-S transform and K-L transform fusion results perform poorly, while the HIS transform fusion result is the best and superior to the original pre-fusion image. In spatial information and texture, all four results inherit the clear texture of the SAR image, but the HIS fusion result retains the best sharpness, with distinct floating raft edges. It is conjectured that this region is affected by colored dissolved organic matter (CDOM) and suspended particulate matter (SPM), so the ground objects in the optical remote sensing image are differentially distributed, the ocean within the same region presents different spectral characteristics, and the G-S and K-L transform fusion results degrade.
The fused images were evaluated objectively in three respects: indices reflecting brightness information (mean, average gradient, and standard deviation), an index reflecting spatial detail information (information entropy), and indices reflecting spectral information (correlation coefficient and deviation index), as shown in Tables 5 and 6.
From the average gradient, the sharpness and gradation of the HIS-transformed image are superior to those of the original optical image, while the average gradients of the other fusion methods are lower than the original's. From the standard deviation, only the HIS-transform-fused image shows an increase; the standard deviations of the images fused by the other methods are lower than the original image's.
From the information entropy, only the HIS fusion image improves on the corresponding bands of the original multispectral image; the others decrease. From the deviation index, all four fusion methods distort the spectral information of the original image, to different degrees in each band: the K-L transform fusion distorts most and the Brovey transform fusion least. From the correlation coefficients with the optical image, except for the K-L transform fusion, whose correlation with the original optical image is lower, the B2 bands of the remaining three methods correlate strongly with the original optical image (all above 0.75), while their B3 and B4 bands correlate comparatively weakly. From the correlation coefficients with the SAR image, the HIS and Brovey transform fusion images correlate with the SAR image far more strongly than the original optical image does, the G-S transform fusion comes second, and the K-L transform fusion correlates least with the SAR image and performs worst.
Overall, combining subjective and objective evaluation, the HIS fusion image achieves the best fusion result, outperforming the other fusion results in richness of spectral information and in preservation of the brightness and texture information of the whole image.
TABLE 5 index values for evaluation of brightness of four fusion results
TABLE 6 evaluation index values of spatial detail information and spectral information of four fusion results
The object-oriented extraction results for Sanggou Bay, Shandong Province are shown in FIG. 5.
From the perspective of subjective evaluation, the extraction results of (a), (e), and (f) are poor, mainly because the region is affected by colored dissolved organic matter (CDOM) and suspended particulate matter (SPM): the ground objects in the optical remote sensing image are differentially distributed, the ocean within the same region presents different spectral characteristics, and no single brightness threshold can be set to extract the culture area. The G-S and K-L transforms accentuate this differential distribution in the fused images, so the extraction results of (e) and (f) are poor; the poor result of (a) is again due to resolution. In result (b), a large number of gaps between the floating rafts cannot be separated. The extraction results of (c) and (d) are better, but parts of the culture areas in (d) are severely fragmented, mainly because the fused image is dark: points that were already dim within the rafts become dimmer, their brightness difference from the surrounding ocean shrinks, and extraction suffers. Quantitative evaluation was then performed; the accuracy evaluation is shown in Tables 7 and 8.
TABLE 7 Confusion matrices extracted from each image for the floating raft culture area of Sanggou Bay, Shandong Province
TABLE 8 Overall accuracy and KAPPA coefficient extracted from each fused image for the floating raft culture area of Sanggou Bay, Shandong Province
Since the KAPPA coefficients of (a), (e), and (f) are all below 0.5, and in combination with the subjective evaluation, the G-S and K-L transforms are considered unsuitable for the low-visibility Sanggou Bay culture area. The extraction result of (b) is good, but its overall accuracy and KAPPA coefficient are still lower than those of (c) and (d). Except for the producer's accuracy, where the HIS fusion image exceeds the Brovey fusion image, the remaining evaluation indices are roughly equal; in summary, the HIS transform fusion result is better suited to sea areas of low visible depth.
3. Sandu'ao culture area, Fujian Province
The results of the image fusion experiments are shown in fig. 6.
First, the fused images are evaluated subjectively. None of the four fusion results shows ghosting, indicating good registration. In color contrast between the culture area and the ocean, the three fusion results other than the G-S transform are all superior to the original pre-fusion image. In spatial information and texture, only the HIS transform and Brovey fusion results inherit the clear texture of the SAR image well, with distinct floating raft edges; the G-S and K-L transform fusion results are poor, with a large number of raft areas not displayed, presumably because the visible depth of the ocean is low. Next, the fused images are evaluated objectively in three respects: indices reflecting brightness information (mean, average gradient, and standard deviation), an index reflecting spatial detail information (information entropy), and indices reflecting spectral information (correlation coefficient and deviation index), as shown in Tables 9 and 10.
From the information entropy, the HIS transform fusion image improves on the corresponding bands of the original multispectral image, while the other three methods fall below the information content of each optical band. From the deviation index, all four fusion methods distort the spectral information of the original image, to different degrees in each band: the HIS fusion distorts more in band B2, the G-S transform fusion more in band B3, and the Brovey transform fusion least in band B4. From the correlation coefficients with the optical image, except for the K-L transform fusion, whose correlation with the original optical image is low, the three bands of the remaining methods correlate strongly with the original optical image (all above 0.75). From the correlation coefficients with the SAR image, the HIS and Brovey transform fusion images correlate with the SAR image more strongly in bands B2 and B3, to varying degrees, than the original optical image does, while the G-S and K-L transform fusion images correlate less.
Overall, combining subjective and objective evaluation, the HIS fusion image achieves the best fusion result, outperforming the other fusion results in richness of spectral information and in preservation of the brightness and texture information of the whole image.
TABLE 9 index values for evaluation of brightness of four fusion results
TABLE 10 evaluation index values of spatial detail information and spectral information of four fusion results
The object-oriented extraction results for the Sandu'ao culture area, Fujian Province are shown in FIG. 7. The visible depth of the Sandu'ao culture area is better than that of the Sanggou Bay culture area of Shandong, though not greatly different, and the extraction results resemble those of Sanggou Bay: the results of (a), (e), and (f) are poor, because the low visible depth of the ocean interferes with extraction. In result (b), the rafts are not separated from the ocean, and the result is poor. The extraction results of (c) and (d) are better: the ocean gaps between raft areas are well separated, and the rafts are regular in shape and orderly arranged. The accuracy evaluation is shown in Tables 11 and 12.
Since the KAPPA coefficients of (e) and (f) are both below 0.1, their extraction results are very poor, and in combination with the subjective evaluation, the G-S and K-L transforms are considered unsuitable for the low-visibility Sandu'ao culture area. Result (a) suffers severe omission, with many rafts not assigned to the culture area, and result (b) severe commission, with many gaps between rafts assigned to the culture area. The results of (c) and (d) are roughly equivalent, but (c) outperforms (d) in overall accuracy and KAPPA coefficient; it is therefore concluded that the HIS transform fusion result is suitable for sea areas of low visible depth.
TABLE 11 Overall accuracy and KAPPA coefficient extracted from each fused image for the floating raft culture area of Sandu'ao, Fujian Province
TABLE 12 Confusion matrices extracted from each fused image for the floating raft culture area of Sandu'ao, Fujian Province
4. Algorithm suitability evaluation
The optical and SAR images of the three test areas were fused by the four methods, and all fusion results were evaluated subjectively and objectively. From the final evaluation, the best fusion result in all three areas is the HIS transform fusion image, whether in richness of spectral information or in the brightness and texture information of the whole image; among all fusion results, the HIS fusion of each of the three test areas also has the highest information entropy, confirming that the HIS fusion result carries the most information and the richest image content.
The fused image obtained by the HIS transform fusion method is little affected by suspended matter such as silt; the floating raft edges are clear and complete, and the rafts are regularly arranged. For extracting ocean floating raft culture areas, the HIS transform fusion method therefore combines the respective advantages of the optical and SAR images and offers a new idea and method for extracting ocean culture areas quickly and accurately.
Synthesizing the extraction results of the different fusion methods across the three areas, the quality of the fusion result directly affects the extraction of the culture area. In sea areas of higher visible depth, the fusion results of the different methods all yield good extractions, with the HIS transform fusion extracting best; in sea areas of low visible depth, the extraction from the HIS transform fusion is likewise highly accurate.
The invention effectively solves the problem of extracting floating raft culture areas in offshore regions that are frequently cloudy and foggy; it combines the advantages of SAR and optical images to extract offshore floating raft culture areas accurately and efficiently, and identifies the fusion algorithm with the highest accuracy for image-fusion-based extraction of floating raft culture areas.

Claims (7)

1. An offshore buoyant raft culture area extraction method based on SAR and optical image fusion is characterized by comprising the following steps:
s1, preprocessing the optical image and the SAR image to obtain a preprocessed optical image and a preprocessed SAR image;
s2, fusing the preprocessed optical image and the preprocessed SAR image by adopting four fusion methods to obtain a fused image;
and S3, extracting the floating raft culture area from the fused image by adopting an object-oriented method.
2. The SAR and optical image fusion based offshore buoyant raft culture zone extraction method of claim 1, wherein the preprocessing of step S1 comprises: radiometric calibration, atmospheric correction, conversion of complex data to amplitude data, multi-look processing, filtering, and geocoding.
3. The SAR and optical image fusion based offshore buoyant raft culture zone extraction method of claim 1, wherein the four fusion methods of the step S2 comprise: HIS transformation fusion, Brovey transformation fusion, G-S transformation fusion and principal component transformation fusion.
4. The SAR and optical image fusion based offshore buoyant raft culture zone extraction method of claim 3, characterized in that the HIS transformation fusion method comprises the following steps:
a1, taking the lower-spatial-resolution multispectral image from the preprocessed optical image and the preprocessed SAR image, and registering and resampling the three band images of the lower-resolution multispectral image with the high-resolution image so that they share the same pixel resolution and spatial geometric position;
a2, transforming the multispectral image into HIS space according to its bands and RGB values to obtain the three components intensity I, hue H, and saturation S;
a3, performing contrast stretching on the high-spatial-resolution image so that it has the same mean and variance as the intensity component I;
and a4, replacing the I component with the stretched high-resolution image and performing the inverse HIS transformation on it together with the gray-stretched hue component H' and saturation component S' to obtain the fused image.
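Steps A1 to A4 can be sketched as follows. The sketch uses the fast additive form of HIS substitution: with I = (R+G+B)/3, replacing I by a stretched high-resolution band and inverting the transform is mathematically equivalent to adding (pan' - I) to every band, which leaves hue and saturation untouched. The function name, array layout, and use of the SAR amplitude image as the high-resolution band are assumptions for illustration:

```python
import numpy as np

def his_fusion(rgb, pan):
    """Fast HIS fusion of a 3-band image with a co-registered
    high-resolution band (here assumed to be the SAR amplitude image)."""
    rgb = np.asarray(rgb, dtype=float)   # (H, W, 3), already resampled (A1)
    pan = np.asarray(pan, dtype=float)   # (H, W)
    intensity = rgb.mean(axis=-1)        # A2: I component of the HIS model
    # A3: stretch pan to the same mean and variance as I
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12) \
        * intensity.std() + intensity.mean()
    # A4: substitute I and invert the transform (additive shortcut)
    return rgb + (pan_matched - intensity)[..., None]
```

If the high-resolution band already equals the intensity component, the image is returned unchanged, which is a quick sanity check on the substitution.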
5. The SAR and optical image fusion based offshore buoyant raft culture area extraction method of claim 3, wherein the Brovey transform fusion method comprises: reassigning the RGB values of the preprocessed optical image and the preprocessed SAR image according to the following formula:
Rnew = R / (R + G + B) × PAN
Gnew = G / (R + G + B) × PAN
Bnew = B / (R + G + B) × PAN
wherein Rnew is the new value of the R component of the RGB values, Gnew is the new value of the G component, Bnew is the new value of the B component, and PAN is the panchromatic band value.
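The Brovey re-assignment of claim 5 can be sketched as below. The array layout and the small eps guard against division by zero are assumptions, and some formulations of the Brovey transform additionally scale the result by 3:

```python
import numpy as np

def brovey_fusion(rgb, pan, eps=1e-12):
    """Brovey transform: each band is weighted by its share of the band
    sum and modulated by the high-resolution (panchromatic/SAR) band."""
    rgb = np.asarray(rgb, dtype=float)             # (H, W, 3)
    pan = np.asarray(pan, dtype=float)[..., None]  # (H, W, 1)
    total = rgb.sum(axis=-1, keepdims=True)        # R + G + B
    return rgb / (total + eps) * pan
```

By construction the band proportions of each pixel are preserved, and the fused bands sum to the high-resolution band value.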
6. The SAR and optical image fusion based offshore buoyant raft culture zone extraction method of claim 3, characterized in that the G-S transformation fusion method comprises the following steps:
b1, dividing the preprocessed optical image and the preprocessed SAR image into a multispectral image with low spatial resolution and a high-resolution panchromatic image according to a spatial resolution threshold;
b2, generating a simulated low-spatial-resolution full-color image by using the multispectral image with low spatial resolution;
b3, superimposing the simulated panchromatic image as the first band onto the multispectral image, and performing the forward G-S transformation on the recombined multispectral image;
and B4, replacing the first component after G-S transformation by the high-resolution full-color image, and performing G-S inverse transformation on the replaced multispectral image to obtain a final fusion image.
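A minimal sketch of steps B1 to B4, assuming the multispectral image has already been resampled to the grid of the high-resolution image and simulating the low-resolution panchromatic band as the band mean (one common choice; the claim does not fix the simulation method). The function name and the histogram-matching by mean/variance are assumptions:

```python
import numpy as np

def gram_schmidt_fusion(ms, pan):
    """G-S fusion sketch. ms: (bands, H, W) multispectral image on the
    pan grid; pan: (H, W) high-resolution band (e.g. SAR amplitude)."""
    bands, h, w = ms.shape
    X = ms.reshape(bands, -1).astype(float)
    sim_pan = X.mean(axis=0)                 # B2: simulated low-res pan
    V = np.vstack([sim_pan[None, :], X])     # B3: simulated pan as band 1
    means = V.mean(axis=1, keepdims=True)
    Vc = V - means
    n = V.shape[0]
    GS = np.zeros_like(Vc)
    coef = np.zeros((n, n))
    for i in range(n):                       # forward G-S transform
        v = Vc[i].copy()
        for j in range(i):
            coef[i, j] = Vc[i] @ GS[j] / (GS[j] @ GS[j])
            v -= coef[i, j] * GS[j]
        GS[i] = v
    # B4: match the real pan to the 1st G-S component and substitute it
    p = pan.reshape(-1).astype(float)
    GS[0] = (p - p.mean()) / (p.std() + 1e-12) * GS[0].std()
    # inverse G-S transform: rebuild each band from the components
    out = np.zeros_like(Vc)
    for i in range(n):
        out[i] = GS[i] + sum(coef[i, j] * GS[j] for j in range(i))
    out += means
    return out[1:].reshape(bands, h, w)      # drop the simulated pan band
```

Feeding back the simulated panchromatic band itself reproduces the input image, which checks that the forward and inverse transforms are consistent.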
7. The SAR and optical image fusion based offshore buoyant raft culture zone extraction method of claim 3, wherein the principal component transformation fusion method comprises the following steps:
c1, dividing the preprocessed optical image and the preprocessed SAR image into a low-resolution image and a high-resolution image according to a spatial resolution threshold;
c2, performing principal component transformation on the low-resolution image;
c3, stretching the gray scale of the high-resolution image so that its mean and variance match those of the first principal component image of the transformed low-resolution image;
and C4, replacing the first component image of the low-resolution image with the stretched high-resolution image, and restoring the replaced low-resolution image to the original image space through principal component inverse transformation.
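Steps C1 to C4 can be sketched as follows, assuming the low-resolution multispectral image has already been resampled to the high-resolution grid. The function name and the eigendecomposition-based principal component transform are illustrative choices:

```python
import numpy as np

def pca_fusion(low_res, high_res):
    """Principal component (K-L) transform fusion sketch.
    low_res: (bands, H, W) multispectral image on the high-res grid;
    high_res: (H, W) high-resolution band (e.g. SAR amplitude)."""
    bands, h, w = low_res.shape
    X = low_res.reshape(bands, -1).astype(float)
    mean = X.mean(axis=1, keepdims=True)
    Xc = X - mean
    # C2: principal component transform of the low-resolution image
    cov = Xc @ Xc.T / (Xc.shape[1] - 1)
    vals, vecs = np.linalg.eigh(cov)
    vecs = vecs[:, np.argsort(vals)[::-1]]   # order so PC1 comes first
    pcs = vecs.T @ Xc
    # C3: stretch the high-res band to PC1's mean and variance
    p = high_res.reshape(-1).astype(float)
    p = (p - p.mean()) / (p.std() + 1e-12) * pcs[0].std() + pcs[0].mean()
    # C4: substitute PC1 and apply the inverse transform
    pcs[0] = p
    return (vecs @ pcs + mean).reshape(bands, h, w)
```

Substituting PC1 with itself recovers the original image, which checks that the forward and inverse principal component transforms are consistent.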
CN202010128207.2A 2020-02-28 2020-02-28 Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion Pending CN111339959A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010128207.2A CN111339959A (en) 2020-02-28 2020-02-28 Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010128207.2A CN111339959A (en) 2020-02-28 2020-02-28 Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion

Publications (1)

Publication Number Publication Date
CN111339959A true CN111339959A (en) 2020-06-26

Family

ID=71182111

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010128207.2A Pending CN111339959A (en) 2020-02-28 2020-02-28 Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion

Country Status (1)

Country Link
CN (1) CN111339959A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112906645A (en) * 2021-03-15 2021-06-04 山东科技大学 Sea ice target extraction method with SAR data and multispectral data fused
CN113076991A (en) * 2021-03-30 2021-07-06 中国人民解放军93114部队 Multi-target information comprehensive processing method and device based on nonlinear integral algorithm
CN114037902A (en) * 2021-10-28 2022-02-11 江苏海洋大学 Inversion method for extracting and identifying suspended sediment in porphyra yezoensis culture area

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2610636A1 (en) * 2011-12-29 2013-07-03 Windward Ltd. Providing near real-time maritime insight from satellite imagery and extrinsic data
CN108734171A (en) * 2017-04-14 2018-11-02 国家海洋环境监测中心 A kind of SAR remote sensing image ocean floating raft recognition methods of depth collaboration sparse coding network
CN109472304A (en) * 2018-10-30 2019-03-15 厦门理工学院 Tree species classification method, device and equipment based on SAR Yu optical remote sensing time series data
CN110097101A (en) * 2019-04-19 2019-08-06 大连海事大学 A kind of remote sensing image fusion and seashore method of tape sorting based on improvement reliability factor
CN110111259A (en) * 2019-05-15 2019-08-09 电子科技大学 A kind of multisource image anastomosing method based on regional guidance
CN110598748A (en) * 2019-08-13 2019-12-20 清华大学 Heterogeneous image change detection method and device based on convolutional neural network fusion

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2610636A1 (en) * 2011-12-29 2013-07-03 Windward Ltd. Providing near real-time maritime insight from satellite imagery and extrinsic data
CN108734171A (en) * 2017-04-14 2018-11-02 国家海洋环境监测中心 A kind of SAR remote sensing image ocean floating raft recognition methods of depth collaboration sparse coding network
CN109472304A (en) * 2018-10-30 2019-03-15 厦门理工学院 Tree species classification method, device and equipment based on SAR Yu optical remote sensing time series data
CN110097101A (en) * 2019-04-19 2019-08-06 大连海事大学 A kind of remote sensing image fusion and seashore method of tape sorting based on improvement reliability factor
CN110111259A (en) * 2019-05-15 2019-08-09 电子科技大学 A kind of multisource image anastomosing method based on regional guidance
CN110598748A (en) * 2019-08-13 2019-12-20 清华大学 Heterogeneous image change detection method and device based on convolutional neural network fusion

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
G JIE et al.: "Weighted Fusion-Based Representation Classifiers for Marine Floating Raft Detection of SAR Images", IEEE GEOSCIENCE AND REMOTE SENSING LETTERS *
LIU Zhijun: "Research on remote sensing identification methods for uninhabited islands and the characteristics of their human activities", China Doctoral Dissertations Full-text Database, Information Science and Technology *
ZHOU Lin: "Water body extraction from fused imagery based on an improved SLIC algorithm: a case study of GF-1 and GF-3", China Master's Theses Full-text Database, Basic Sciences *
WANG Tinggang: "GF-2-based remote sensing monitoring of offshore culture areas and assessment of environmental pollution load", China Master's Theses Full-text Database, Engineering Science and Technology I *
JIA Yonghong et al.: "Digital Image Processing Practice Tutorial", Wuhan University Press, 30 November 2016 *
GUO Yunkai et al., SinoMaps Press *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112906645A (en) * 2021-03-15 2021-06-04 山东科技大学 Sea ice target extraction method with SAR data and multispectral data fused
CN112906645B (en) * 2021-03-15 2022-08-23 山东科技大学 Sea ice target extraction method with SAR data and multispectral data fused
CN113076991A (en) * 2021-03-30 2021-07-06 中国人民解放军93114部队 Multi-target information comprehensive processing method and device based on nonlinear integral algorithm
CN113076991B (en) * 2021-03-30 2024-03-08 中国人民解放军93114部队 Nonlinear integration algorithm-based multi-target information comprehensive processing method and device
CN114037902A (en) * 2021-10-28 2022-02-11 江苏海洋大学 Inversion method for extracting and identifying suspended sediment in porphyra yezoensis culture area
CN114037902B (en) * 2021-10-28 2023-09-12 江苏海洋大学 Inversion method for extracting and identifying suspended sediment in Porphyra yezoensis cultivation area

Similar Documents

Publication Publication Date Title
Wang et al. Sea ice concentration estimation during melt from dual-pol SAR scenes using deep convolutional neural networks: A case study
CN105243367B (en) A kind of water body range monitoring method and device based on satellite remote sensing date
CN111339959A (en) Method for extracting offshore buoyant raft culture area based on SAR and optical image fusion
CN106886760B (en) A kind of EO-1 hyperion Ship Detection combined based on empty spectrum information
Dong et al. Coral reef geomorphology of the Spratly Islands: A simple method based on time-series of Landsat-8 multi-band inundation maps
Su et al. Improving MODIS sea ice detectability using gray level co-occurrence matrix texture analysis method: A case study in the Bohai Sea
CN109781073B (en) Shallow sea water depth remote sensing extraction method integrating sea wave characteristics and spectral characteristics
CN109781626A (en) A kind of offshore based on spectrum analysis uphangs husky water body green tide remote sensing recognition method
CN111008664B (en) Hyperspectral sea ice detection method based on space-spectrum combined characteristics
CN107688776B (en) Urban water body extraction method
Zhai Inversion of organic matter content in wetland soil based on Landsat 8 remote sensing image
Liang et al. Maximum likelihood classification of soil remote sensing image based on deep learning
CN112037244B (en) Landsat-8 image culture pond extraction method combining index and contour indicator SLIC
Kwon et al. ETVOS: An enhanced total variation optimization segmentation approach for SAR sea-ice image segmentation
CN112884029B (en) Collaborative classification method integrating fully-polarized SAR and hyperspectral remote sensing
CN117274831A (en) Offshore turbid water body depth inversion method based on machine learning and hyperspectral satellite remote sensing image
Deng et al. Mapping bathymetry from multi-source remote sensing images: A case study in the Beilun Estuary, Guangxi, China
CN115271560B (en) Quantitative evaluation system and evaluation method for offshore oil drilling spilled oil weathering
Liu et al. Water extraction on the hyperspectral images of gaofen-5 satellite using spectral indices
CN112906645B (en) Sea ice target extraction method with SAR data and multispectral data fused
Stumpf et al. Mapping water depths in clear water from space
CN115620133A (en) Mangrove extraction method considering phenology and water level time sequence characteristics
CN112052720B (en) High-space-time normalization vegetation index NDVI fusion model based on histogram clustering
Ķēniņš Land cover classification using very high spatial resolution remote sensing data and deep learning
CN113837123A (en) Mid-resolution remote sensing image offshore culture area extraction method based on spectral-spatial information combination

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200626

RJ01 Rejection of invention patent application after publication