US20190171862A1 - Method of extracting image of port wharf through multispectral interpretation - Google Patents

Method of extracting image of port wharf through multispectral interpretation

Info

Publication number
US20190171862A1
US20190171862A1
Authority
US
United States
Prior art keywords
image
values
grayscale
extracting
multispectral
Prior art date
Legal status
Granted
Application number
US16/205,251
Other versions
US10325151B1 (en)
Inventor
Yue Qi
Yaping Mao
Yun Feng
Jun Hao
Jun Huang
Hanbing Sun
Yi Yang
Wentao Ding
Haiyuan Yao
Chen Shen
Current Assignee
Transport Planning And Research Institute Ministry Of Transport
Original Assignee
Transport Planning And Research Institute Ministry Of Transport
Priority date
Filing date
Publication date
Application filed by Transport Planning And Research Institute Ministry Of Transport filed Critical Transport Planning And Research Institute Ministry Of Transport
Assigned to Transport Planning and Research Institute Ministry of Transport. Assignors: DING, WENTAO; FENG, YUN; HAO, JUN; HUANG, JUN; MAO, YAPING; QI, YUE; SHEN, CHEN; SUN, HANBING; YANG, YI; YAO, HAIYUAN
Publication of US20190171862A1 publication Critical patent/US20190171862A1/en
Application granted granted Critical
Publication of US10325151B1 publication Critical patent/US10325151B1/en
Expired - Fee Related

Classifications

    • G06V20/13: Satellite images (terrestrial scenes)
    • G06V20/194: Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
    • G06V10/30: Image preprocessing; noise filtering
    • G06V10/34: Smoothing or thinning of the pattern; morphological operations; skeletonisation
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. edges, contours, corners; connectivity analysis
    • G06V10/50: Extraction of image features by operations within image blocks, by histograms (e.g. HoG) or by projection analysis
    • G06V10/7715: Feature extraction, e.g. by transforming the feature space; mappings, e.g. subspace methods
    • G06F18/21342: Feature extraction based on separation criteria, e.g. independent component analysis using statistical independence
    • G06F18/2135: Feature extraction based on approximation criteria, e.g. principal component analysis
    • G06F18/217: Validation; performance evaluation; active pattern learning techniques
    • G06K9/0063, G06K9/40, G06K9/44, G06K9/4604, G06K9/6262, G06K2009/00644 (legacy classifications)


Abstract

A method of extracting an image of a port wharf through multispectral interpretation includes: first, extracting a blurred coastline by assigning values to grayscale values; then, performing a smoothing and noise removal processing on the remote sensing image in a targeted area to extract edge information; next, establishing a multispectral database of a targeted port wharf; and finally, extracting a port wharf using a projected eigenvector, namely performing an MAF transformation on the regularized kernel function, projecting multivariate observed values to original eigenvectors, identifying a remote sensing image area corresponding to an original eigenvector smaller than a transformation variance as the port wharf to be extracted, and then carrying out a validation operation.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application claims priority to Chinese Patent Application No. 201711264357.0, filed on Dec. 5, 2017, the entire content of which is incorporated herein by reference.
  • TECHNICAL FIELD
  • The present invention relates to the field of remote sensing recognition technologies, and particularly to a method of extracting an image of a port wharf through multispectral interpretation.
  • BACKGROUND
  • Along with the determination of marine economy development strategies and the rapid development of marine economy enterprises, sea port management becomes increasingly important; to develop and utilize these resources rationally, proper investigations and evaluations are necessary, and in the prior art an investigation based on a remote sensing image is an effective way to evaluate a sea port. An investigation method based on a remote sensing image achieves effective resource utilization and management mainly by extracting a spectral image of a port by means of satellite remote sensing, aerial photography and the like, and then extracting the characteristics of a ground object, for example a wharf in the port, using a remote sensing image interpretation technology.
  • Spectral characteristics are characteristics uniquely possessed by each ground object in a remote sensing image. Thus, recognizing a port wharf in a remote sensing image using multiple spectrums is a technical means commonly used at present; in the prior art, for example, in a method of segmenting a wharf and a ship by combining characteristics of multi-source remote sensing images, which is disclosed in Patent Application No. 201210591353.4, a wharf and a ship are segmented using multivariate characteristics of multi-source images, and more target information is acquired by the complementarity between different types of remote sensing images, thus increasing the accuracy of wharf and ship segmentation.
  • However, taking an overall view of the foregoing technical solution, actually existing problems and the currently widely used technical solutions, the following major defects are found:
  • (1) with the development of high-resolution optical imagery, desired information is frequently extracted directly from a high-resolution remote sensing image; however, the direct extraction of port wharf information is inadvisable, because an actual high-resolution image contains complicated sea conditions and abundant texture details, a simple direct recognition needs an extremely large database to provide a recognition basis, and the computation burden and correction algorithms involved are also considerable; and
  • (2) in a direct recognition process, a port wharf is detected using only geometrical information or spectral information, resulting in inadequate information utilization and lowered recognition accuracy, which reduces the reliability of a detection result; moreover, where a recognition is carried out using geometrical information alone, the recognition may fail because of an error.
  • SUMMARY
  • A technical solution adopted by the present invention to solve the technical problems is a method of extracting an image of a port wharf through multispectral interpretation, comprising the following steps:
  • S100: assigning values to grayscale values to divide a coastline; performing a grayscale processing on the original remote sensing image, assigning values to grayscale values, and extracting a blurred coastline according to the distribution of the grayscale values;
  • S200: smoothing the image and removing noise; performing a smoothing processing on an area of the blurred coastline in the original remote sensing image according to a range limited by the blurred coastline and removing interference noise to extract edge information;
  • S300: establishing a multispectral database of a targeted port: linearly combining multivariate observed values among different edge information, regularizing the linear combination to obtain a kernel function, that is, multispectral data, and repeating the foregoing process to obtain a multispectral database; and
  • S400: extracting a port wharf using a projected eigenvector, performing an MAF transformation on the regularized kernel function, projecting the multivariate observed values to original eigenvectors, and identifying a remote sensing image area corresponding to a validated original eigenvector as a port wharf to be extracted.
  • As a preferred technical solution of the present invention, in step S100: different values are assigned to remote sensing images of different grayscales, and a value ‘0’ is assigned to the grayscale value of a water area remote sensing image and a value ‘10’ to the part on land having the maximum grayscale value.
  • As a preferred technical solution of the present invention, in step S100: extracting a coastline by assigning values to grayscale values specifically includes the following steps:
  • S101: first, performing a uniform grayscale processing on the original remote sensing image and dividing areas having different grayscale values, then calculating grayscale value variances for the divided areas to obtain a grayscale distribution uniformity coefficient;
  • S102: checking the grayscale value variances calculated in S101, selecting and determining a variance contrast value, identifying an area where a grayscale value is smaller than the variance contrast value as a water area and an area where a grayscale value is greater than the variance contrast value as a land area; and
  • S103: restoring distinct boundary lines existing between the water area and the land area that are identified through the foregoing steps to a continuous boundary line using an interpolation method, wherein the continuous boundary line forms the blurred coastline.
  • As a preferred technical solution of the present invention, after the blurred coastline is extracted by assigning values to grayscale values, a preprocessing is carried out for the remote sensing image; the preprocessing includes geometric correction, atmospheric correction and radiation correction.
  • As a preferred technical solution of the present invention, in step S200, the image is smoothed using any one of mean value smoothing, median filtering or Gaussian blur filtering, wherein the median filtering and the Gaussian blur filtering both employ a normalized ratio method and use the following specific calculation formula:
  • NDWI=(Green−NIR)/(Green+NIR), where Green represents a green light waveband image, NIR represents a near-infrared waveband image, and NDWI represents a combination of wavebands.
  • As a preferred technical solution of the present invention, in step S300, the MAF transformation is performed on an image in an edge region before the linear combination operation is performed, so as to obtain an autocorrelation factor, and the specific algorithm is as follows:
  • α^T x(r) is set as an autocorrelation factor, x(r) as a multivariate observed value at a point r, x(r+δ) as a multivariate observed value at a point r+δ, and δ as a spatial displacement vector; then the auto-covariance R of the linear combination α^T x(r) of x(r) is calculated according to the following formula: R = Cov{α^T x(r), α^T x(r+δ)}, and the autocorrelation factor is obtained after an inverse operation is performed on the auto-covariance R.
  • As a preferred technical solution of the present invention, in step S300, the linear combination and the regularization thereof include the following steps:
  • S301: transforming the auto-covariance R to obtain the following equation: α^T C_δ α = α^T (C_δ + C_δ^T) α / 2, where C_δ is a transformation-related matrix;
  • S302: setting the autocorrelation coefficient ρ of the linear combination as follows: ρ = 1 − (α^T S_δ α)/(2 α^T S α), where the difference covariance matrix S_δ is calculated according to the following formula: S_δ = 2S − (C_δ + C_δ^T), and S = X^T X/(n−1) is set as the covariance matrix of x; and
  • S303: selecting the following optional form of the autocorrelation coefficient: ρ = 1 − ½[(α^T X^T A α)/(α^T [(1−k) X_δ^T X_δ + k I_p] α)]^(−1), and transforming the optional form to obtain the following kernel function form: ρ = 1 − ½[(b^T K^2 b)/(b^T [(1−k) K_δ K_δ^T + k K] b)]^(−1), where X^T b = α, A is a transformation factor, k is a transformation coefficient, I_p is a unit vector of the eigenvector P, and K is a correlation matrix of the transformation coefficient.
  • As a preferred technical solution of the present invention, in step S400, after the regularization is performed, the original eigenvector is set to be a_i; then a projection algorithm of the original eigenvector is as follows:

  • φ(x)^T a_i = φ(x)^T φ^T b_i = [k(x, x_1), k(x, x_2), …, k(x, x_N)] b_i.
  • As a preferred technical solution of the present invention, a validation method specifically includes: comparing the original eigenvector with a transformation variance, and determining that the original eigenvector smaller than the transformation variance meets a requirement, wherein a specific algorithm of the transformation variance is as follows:
  • a column of the matrix A is set as a_i and that of the matrix B is set as b_i, then φA = KB; in this case, the transformation variance is calculated according to the following formula:
  • Var{a_i^T φ(x)} = a_i^T φ^T φ a_i/(n−1) = b_i^T φφ^T φφ^T b_i/(n−1) = b_i^T K^2 b_i/(n−1) = 1/(n−1),
  • where n is the number of times extraction is performed.
  • As a preferred technical solution of the present invention, the method further includes a step S500 of validating a spatial correlation relationship, which specifically includes:
  • based on steps S100 and S200, obtaining a spatial correlation relationship of the blurred coastline obtained through the assignment of values to grayscales and an image processing operation, and validating, using the spatial relationship, a spatial correlation relationship for the port wharf recognized in the step S400.
  • Compared with the prior art, the present invention has the following beneficial effects: the method disclosed herein first performs a grayscale processing on the original remote sensing image to divide a water area from a land area and thus determine the approximate location of a port wharf, and then performs a multispectral processing on that location, switching the remote sensing image directly to a related data calculation based on an MAF transformation; it is therefore capable of recognizing a port wharf rapidly and accurately by comparing characteristic spectrums, improves the accuracy of a characteristic spectrum at the source by carrying out an error correction on the remote sensing image during recognition, and further improves recognition accuracy by validating a spatial correlation relationship during recognition.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic diagram illustrating a flow according to the present invention.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Technical solutions of the present invention will be described clearly and completely below in conjunction with the accompanying drawings; apparently, the embodiments described herein are merely a part, but not all, of the embodiments of the present invention. All other embodiments devised by those of ordinary skill in the art, without any creative work, based on the embodiments described herein should fall within the scope of the present invention.
  • As shown in FIG. 1, the present invention provides a method of extracting an image of a port wharf through multispectral interpretation, comprising the following steps:
  • S100: assigning values to grayscale values to divide a coastline; performing a grayscale processing on the original remote sensing image, assigning different values to remote sensing images of different grayscales, and extracting a blurred coastline according to the distribution of the grayscale values.
  • When the values are assigned to the grayscales on the basis of a guaranteed resolution in the step above, it should be noted that, in a remote sensing image, a water area generally has a relatively low grayscale value with a uniform, low-variance grayscale distribution, whereas land has a relatively high grayscale value with a large variance, so that a distinct boundary line exists between a water area and a land area. To represent the characteristics of this boundary line prominently on a digital map, a value ‘0’ is assigned to the grayscale value of a water area and a value ‘10’ to the part on land having the maximum grayscale value, and the other grayscale values are adjusted proportionally according to their magnitudes. It should further be noted that assigning the value ‘10’ to the part on land having the maximum grayscale value specifically refers to performing the normal value assignment after removing scattered, extremely high grayscale values, in order to effectively decrease processing errors and avoid the inclusion of ‘polluted data’.
  • Therefore, the values are assigned to the grayscale values and the coastline is extracted by executing the following steps (a minimal code sketch follows the list):
  • S101: first, performing a uniform grayscale processing on the original remote sensing image, then dividing areas having different grayscale values, and calculating a grayscale value variance for the divided areas to obtain a grayscale distribution uniformity coefficient;
  • S102: checking the grayscale value variance calculated in S101, selecting and determining a variance contrast value, identifying an area where a grayscale value is smaller than the variance contrast value as a water area and an area where a grayscale value is greater than the variance contrast value as a land area; and
  • S103: restoring distinct boundary lines existing between the water area and the land area identified in the foregoing steps to a continuous boundary line using an interpolation method, wherein the continuous boundary line forms the blurred coastline.
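  • As an illustration only, a minimal sketch of the value assignment of S100 and the variance-based division of S101-S102 is given below, assuming a single-band image held in a NumPy array; the outlier percentile, block size and variance contrast value are illustrative assumptions, and the S103 interpolation into a continuous coastline is omitted.

```python
import numpy as np

def assign_grayscale_values(gray, percentile=99.5):
    """Rescale grayscale values to the 0-10 range of S100, discarding
    scattered extremely high values ('polluted data') before assignment."""
    cap = np.percentile(gray, percentile)              # cap rare extreme values
    clipped = np.minimum(gray.astype(float), cap)
    lo = clipped.min()
    return 10.0 * (clipped - lo) / (cap - lo + 1e-12)

def classify_water_land(values, block=16, var_threshold=1.0):
    """S101-S102: compute a per-block grayscale variance and label uniform,
    low-variance blocks as water (0) and the rest as land (1)."""
    h, w = values.shape
    labels = np.zeros((h // block, w // block), dtype=np.uint8)
    for i in range(labels.shape[0]):
        for j in range(labels.shape[1]):
            tile = values[i*block:(i+1)*block, j*block:(j+1)*block]
            labels[i, j] = 1 if tile.var() > var_threshold else 0
    return labels
```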
  • After the blurred coastline is extracted by assigning the values to the grayscale values, a preprocessing is carried out for the remote sensing image; the preprocessing includes geometric correction, atmospheric correction and radiation correction.
  • The effect of the geometric correction is to make the coordinates of a ground object in a remote sensing image conform more closely to reality, correcting coordinate errors that occur in remote sensing so that the recognition result is closer to the actual result.
  • The atmospheric correction refers to eliminating, after the geometric correction is performed, the effect of cloud cover on a remote sensing image.
  • The effect of the radiation correction is to eliminate the effect caused by the radiation of a ground object to a remote sensing image.
  • The foregoing correction methods can be executed directly using existing image processing software. In a corrected remote sensing image, the texture characteristics of a ground object meet the extraction requirement.
  • S200: smoothing the image and removing noise; performing a smoothing processing on an area of the blurred coastline in the original remote sensing image according to a range limited by the blurred coastline and removing interference noise to extract edge information.
  • Sea spray is common at sea, and in a remote sensing image a breaking wave of sea spray has a grayscale value close to that of a wharf. If image smoothing and noise removal are not performed, subsequent recognition suffers badly: recognition errors are transferred and magnified in continuously repeated recognition calculations, and the amount of calculation conducted in the recognition process is greatly increased.
  • In step S200, the image is smoothed through any one of mean value smoothing, median filtering or Gaussian blur filtering, wherein the median filtering and the Gaussian blur filtering both employ a normalized ratio method and use the following specific calculation formula:
  • NDWI=(Green−NIR)/(Green+NIR), where Green represents a green light waveband image, NIR represents a near-infrared waveband image, and NDWI represents a waveband combination.
  • The median filtering, because of its capability of effectively preserving edge information while suppressing noise, is the preferred choice herein for the image smoothing processing; the Gaussian blur filtering is the second choice, and the mean value smoothing is not recommended.
  • It should further be emphasized that, during the image smoothing and noise removal process, the present invention mainly uses optical bands, based on the fundamental principle that water body information is extracted according to the characteristic difference between a water body and land in the reflection of green light and near-infrared waves; a threshold ‘0’ is set in the calculation, that is, a negative calculation result represents a water area, and all non-water areas are represented by positive values (see the sketch below).
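  • For illustration, a hedged sketch of the NDWI mask described above, assuming co-registered green and near-infrared bands as arrays of reflectance values (the band names and the small epsilon guard are assumptions):

```python
import numpy as np

def ndwi_water_mask(green, nir):
    """NDWI = (Green - NIR) / (Green + NIR); with the 0 threshold described
    above, negative values mark water and non-negative values mark land."""
    green = green.astype(float)
    nir = nir.astype(float)
    ndwi = (green - nir) / (green + nir + 1e-12)  # guard against zero division
    return ndwi < 0.0                             # True where water
```

  • The preferred median smoothing could then be applied, for example with scipy.ndimage.median_filter(image, size=3), before the edge information is extracted.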
  • Before performing the multispectral extraction, it should be explicated that the multispectral wavebands mentioned herein include four wavebands: a blue light waveband, a green light waveband, a red light waveband and a near-infrared waveband.
  • S300: establishing a multispectral database of a targeted port: linearly combining multivariate observed values among different edge information, regularizing the linear combination to obtain a kernel function, that is, multispectral data, and repeating the foregoing process to obtain a multispectral database.
  • An MAF transformation is performed on an image in an edge region before the linear combination is performed, so as to obtain an autocorrelation factor, and the specific algorithm is as follows:
  • α^T x(r) is set as an autocorrelation factor, x(r) as a multivariate observed value at a point r, x(r+δ) as a multivariate observed value at a point r+δ, and δ as a spatial displacement vector; then the auto-covariance R of the linear combination α^T x(r) of x(r) is calculated according to the following formula: R = Cov{α^T x(r), α^T x(r+δ)}, and the autocorrelation factor is obtained after an inverse operation is performed on the auto-covariance R.
  • What should be explicated in the description above is the MAF transformation, which refers to a maximum/minimum autocorrelation factor transformation focusing on the spatial characteristics of a remote sensing image. The use of spatial characteristics for recognition is necessary in multispectral recognition because the spatial characteristics of a ground object correspond to a unique characteristic spectrum; thus, by extracting a characteristic spectrum and applying it to recognition, the corresponding demarcated object can be recognized easily and accurately.
  • After the MAF transformation is performed on the image, it is necessary to take a covariance matrix of the image into consideration, and the covariance matrix of the difference between the original data and the offset data needs to be eliminated because the MAF transformation is based on the autocorrelation of the data.
  • In the description above, a remote sensing image should be regarded as an observed data set with n pixels and p spectral bands; in this case, the MAF maximizes a correlation factor of the linear combination α^T x(r) of the original variable x(r), as sketched below.
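  • A sketch of the auto-covariance computation above, assuming the remote sensing image is an (H, W, p) NumPy array and δ a (row, column) pixel shift; the names are illustrative:

```python
import numpy as np

def shifted_autocovariance(x, alpha, delta=(1, 0)):
    """R = Cov{alpha^T x(r), alpha^T x(r + delta)} for an (H, W, p)
    multispectral image x and a spatial displacement delta."""
    proj = x @ alpha                         # alpha^T x(r) at every pixel
    dr, dc = delta
    h, w = proj.shape
    a = proj[0:h - dr, 0:w - dc].ravel()     # projection at r
    b = proj[dr:h, dc:w].ravel()             # projection at r + delta
    return np.cov(a, b)[0, 1]                # the auto-covariance R
```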
  • Specifically, the linear combination and the regularization thereof include the following steps:
  • S301: transforming the auto-covariance R to obtain the following equation: α^T C_δ α = α^T (C_δ + C_δ^T) α / 2, where C_δ is a transformation-related matrix;
  • S302: setting the autocorrelation coefficient ρ of the linear combination as follows: ρ = 1 − (α^T S_δ α)/(2 α^T S α), where the difference covariance matrix S_δ is calculated according to the following formula: S_δ = 2S − (C_δ + C_δ^T), and S = X^T X/(n−1) is set as the covariance matrix of x; and
  • S303: selecting the following optional form of the autocorrelation coefficient: ρ = 1 − ½[(α^T X^T A α)/(α^T [(1−k) X_δ^T X_δ + k I_p] α)]^(−1), and transforming the optional form to obtain the following kernel function form: ρ = 1 − ½[(b^T K^2 b)/(b^T [(1−k) K_δ K_δ^T + k K] b)]^(−1), where X^T b = α, A is a transformation factor, k is a transformation coefficient, I_p is a unit vector of the eigenvector P, and K is a correlation matrix of the transformation coefficient.
  • In step S303, it should be noted that, in the transformation to the kernel function form, once a linear transformation is found through the MAF transformation, it is regularized first for the convenience of subsequent operations, so that each linear transformation has an optional form corresponding to the original mode, from which the kernel function form can be obtained through a transformation; a sketch of S301-S302 follows.
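  • A sketch of the autocorrelation coefficient of S301-S302 under stated assumptions: X is the (n, p) matrix of pixel observations, X_δ its spatially shifted counterpart, and S_δ is computed directly as the covariance of the difference X − X_δ, which matches S_δ = 2S − (C_δ + C_δ^T) when the shifted data share the same statistics; the kernelized form of S303 is omitted.

```python
import numpy as np

def autocorrelation_coefficient(X, X_delta, alpha):
    """rho = 1 - (alpha^T S_d alpha) / (2 alpha^T S alpha), with S the
    covariance matrix of X and S_d the difference covariance matrix."""
    n = X.shape[0]
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / (n - 1)          # covariance matrix S = X^T X / (n - 1)
    D = X - X_delta
    Dc = D - D.mean(axis=0)
    S_d = Dc.T @ Dc / (n - 1)        # difference covariance matrix S_d
    return 1.0 - (alpha @ S_d @ alpha) / (2.0 * alpha @ S @ alpha)
```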
  • S400: extracting a port wharf using a projected eigenvector: performing the MAF transformation on the regularized kernel function, projecting multivariate observed values to original eigenvectors, and determining a remote sensing image area corresponding to an original eigenvector smaller than the transformation variance as the port wharf to be extracted.
  • In step S400, after the regularization is performed, the original eigenvector is set to be a_i; then a projection algorithm of the original eigenvector is as follows:

  • φ(x)^T a_i = φ(x)^T φ^T b_i = [k(x, x_1), k(x, x_2), …, k(x, x_N)] b_i.
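  • As a sketch of this projection, assuming N training pixels x_1..x_N with kernel-space coefficients b_i; the RBF kernel is an assumption, since the patent does not fix a kernel:

```python
import numpy as np

def project_onto_eigenvector(x, train_pixels, b_i, kernel):
    """phi(x)^T a_i = [k(x, x_1), ..., k(x, x_N)] b_i: project one pixel's
    multivariate observation onto the i-th original eigenvector."""
    k_vec = np.array([kernel(x, xj) for xj in train_pixels])
    return float(k_vec @ b_i)

def rbf(u, v, gamma=0.5):
    """An illustrative radial basis function kernel."""
    return np.exp(-gamma * np.sum((u - v) ** 2))
```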
  • Additionally, a specific algorithm of the transformation variance is as follows:
  • a column of the matrix A is set as a_i and that of the matrix B is set as b_i, then φA = KB; in this case, the transformation variance is calculated according to the following formula (a sketch of the resulting validation check follows):
  • Var{a_i^T φ(x)} = a_i^T φ^T φ a_i/(n−1) = b_i^T φφ^T φφ^T b_i/(n−1) = b_i^T K^2 b_i/(n−1) = 1/(n−1),
  • where n is the number of times extraction is performed.
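  • The resulting validation test reduces to a simple comparison, sketched here under the assumption that projected values are compared directly against Var = 1/(n − 1):

```python
def passes_validation(projection, n):
    """Keep an image area as a wharf candidate when its projected
    eigenvector value is smaller than the transformation variance 1/(n-1)."""
    return projection < 1.0 / (n - 1)
```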
  • In the present invention, it should also be noted that although a port wharf can be effectively determined by recognizing the characteristics of multiple spectrums, in actual applications the spatial characteristic relationship of a port wharf cannot rely on multispectral recognition alone.
  • Thus, a step S500 of validating a spatial correlation relationship is also included here, which specifically includes:
  • based on steps S100 and S200, obtaining a spatial correlation relationship of the blurred coastline obtained through the assignment of values to grayscales and an image processing, and validating, using the spatial relationship, a spatial correlation relationship for the port wharf recognized in the step S400.
  • A detection can be carried out by making full use of the correlation of spatial relationships to further increase validation accuracy. The correlation of spatial relationships of a port wharf is relatively simple; for example, what needs to be taken into consideration merely includes a departing channel water system, transshipment roads, transshipment squares, warehouses and the like, whose characteristics can be recognized simply through remotely sensed spectral characteristics. In the recognition method provided herein, even just by assigning values to grayscale values, a port wharf can be recognized through the assigned values, and the recognized port wharf can be conveniently checked through a matching operation.
  • To sum up, the main features of the present invention lie in that:
  • (1) the present invention first performs a grayscale processing on the original remote sensing image to divide a water area from a land area and thus determine the approximate location of a port wharf, and then carries out a multispectral processing on that location and switches the remote sensing image directly to the calculation of related data based on an MAF transformation, so it is capable of recognizing a port wharf rapidly and accurately by comparing characteristic spectrums; and
  • (2) by carrying out an error correction for the remote sensing image during the recognition process, the present invention improves the accuracy of a characteristic spectrum at the source; moreover, by validating the recognition using a spatial correlation relationship during the recognition process, it improves recognition accuracy.
  • It is apparent to those skilled in the art that the present invention is not limited to the details of the foregoing exemplary embodiments and can be realized in other specific forms without departing from its spirit or basic characteristics. Thus, the embodiments should be regarded as exemplary and not limitative in any respect; because the scope of the present invention is defined by the appended claims rather than by the foregoing description, the present invention is intended to cover all variations falling within the meaning and scope of equivalents of the claims. Any reference symbol in the claims should not be construed as limiting the relevant claim.

Claims (10)

What is claimed is:
1. A method of extracting an image of a port wharf through multispectral interpretation, comprising the following steps:
S100: assigning values to grayscale values to divide a coastline: performing a grayscale processing on an original remote sensing image, assigning the values to the grayscale values, and extracting a blurred coastline according to a distribution of the grayscale values;
S200: smoothing the original remote sensing image and removing noise: performing a smoothing processing on an area of the blurred coastline in the original remote sensing image according to a range limited by the blurred coastline and removing interference noise to extract edge information;
S300: establishing a multispectral database of a targeted port wharf: linearly combining multivariate observed values among different edge information, regularizing a linear combination to obtain a kernel function, as multispectral data, and repeating to obtain the multispectral database,
wherein an MAF transformation is performed on an image in an edge region before an operation of the linear combination is performed, in order to obtain an autocorrelation factor, and a specific algorithm is as follows:
α^T x(r) is set as the autocorrelation factor, x(r) as a first multivariate observed value at a point r, x(r+δ) as a second multivariate observed value at a point r+δ, and δ as a spatial displacement vector; then an auto-covariance R of a linear combination α^T x(r) of x(r) is calculated according to the following formula: R = Cov{α^T x(r), α^T x(r+δ)}, and the autocorrelation factor is obtained after an inverse operation is performed on the auto-covariance R;
the linear combination and a regularization of the linear combination comprises the following steps:
S301: transforming the auto-covariance R to obtain the following equation: α^T C_δ α = α^T (C_δ + C_δ^T) α / 2, where C_δ is a transformation-related matrix;
S302: setting an autocorrelation coefficient ρ of the linear combination as follows: ρ = 1 − (α^T S_δ α)/(2 α^T S α), where a difference covariance matrix S_δ is calculated according to the following formula: S_δ = 2S − (C_δ + C_δ^T), and S = X^T X/(n−1) is set as a covariance matrix of x; and
S303: selecting an optional form of the autocorrelation coefficient as follows: ρ = 1 − ½[(α^T X^T A α)/(α^T [(1−k) X_δ^T X_δ + k I_p] α)]^(−1), and transforming the optional form to obtain a kernel function form as follows: ρ = 1 − ½[(b^T K^2 b)/(b^T [(1−k) K_δ K_δ^T + k K] b)]^(−1), where X^T b = α, A is a transformation factor, k is a transformation coefficient, I_p is a unit vector of an eigenvector P, and K is a correlation matrix of the transformation coefficient; and
S400: extracting a port wharf using a projected eigenvector: performing an MAF transformation on the kernel function after being regularized, projecting the multivariate observed values to original eigenvectors, and identifying a remote sensing image area corresponding to a validated original eigenvector as a port wharf to be extracted.
2. The method of extracting the image of the port wharf through multispectral interpretation according to claim 1, wherein
in step S100: different values are assigned to remote sensing images of different grayscales, and a value ‘0’ is assigned to a grayscale value of a water area in a remote sensing image and a value ‘10’ to a part on land having a maximum grayscale value.
3. The method of extracting the image of the port wharf through multispectral interpretation according to claim 1, wherein
in step S100: extracting a coastline by assigning the values to the grayscale values specifically comprises the following steps:
S101: first, performing a uniform grayscale processing on the original remote sensing image and dividing areas having different grayscale values, then calculating grayscale value variances for the areas after being divided to obtain a grayscale distribution uniformity coefficient;
S102: checking the grayscale value variances calculated in S101, selecting and determining a variance contrast value, identifying an area where a grayscale value is smaller than the variance contrast value as a water area and an area where a grayscale value is greater than the variance contrast value as a land area; and
S103: restoring distinct boundary lines existing between the water area and the land area to a continuous boundary line, as the blurred coastline, by an interpolation method, wherein the distinct boundary lines are identified through S101 and S102.
4. The method of extracting the image of the port wharf through multispectral interpretation according to claim 1, wherein
after the blurred coastline is extracted by assigning the values to the grayscale values, a preprocessing is carried out for the remote sensing image, and the preprocessing includes geometric correction, atmospheric correction and radiation correction.
5. The method of extracting the image of the port wharf through multispectral interpretation according to claim 1, wherein
in step S200, the original remote sensing image is smoothed by using any one of mean value smoothing, median filtering or Gaussian blur filtering, wherein, the median filtering and the Gaussian blur filtering both employ a normalized ratio method and use the following specific calculation formula:
NDWI=(Green−NIR)/(Green+NIR), where Green represents a green light waveband image, NIR represents a near-infrared waveband image, and NDWI represents a combination of wavebands.
6. The method of extracting the image of the port wharf through multispectral interpretation according to claim 1, wherein, in step S400, an original eigenvector is set to be a_i after the regularization is performed; then, a projection algorithm of the original eigenvector is as follows:

φ(x)^T a_i = φ(x)^T φ^T b_i = [k(x, x_1), k(x, x_2), …, k(x, x_N)] b_i.
7. The method of extracting the image of the port wharf through multispectral interpretation according to claim 1, wherein, a validation method comprises: comparing an original eigenvector with a transformation variance, and determining that the original eigenvector smaller than the transformation variance meets a requirement, wherein a specific algorithm of the transformation variance is as follows:
a column of a matrix A is set as a_i and that of a matrix B is set as b_i, then φA = KB; in this case, the transformation variance is calculated according to the following formula:
Var{a_i^T φ(x)} = a_i^T φ^T φ a_i/(n−1) = b_i^T φφ^T φφ^T b_i/(n−1) = b_i^T K^2 b_i/(n−1) = 1/(n−1),
where n is the number of times the extraction is performed.
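The chain of equalities can be checked numerically: once $b_i$ is scaled so that $b_i^{T} K^{2} b_i = 1$, the projected variance equals exactly $1/(n-1)$. A small sketch with invented helper names:

```python
import numpy as np

def transformation_variance(K, b_i):
    """b_i^T K^2 b_i / (n - 1), with n the number of observations (the
    number of times the extraction is performed, per claim 7)."""
    n = K.shape[0]
    return b_i @ K @ K @ b_i / (n - 1)

def normalize_dual(K, b_i):
    """Rescale b_i so that b_i^T K^2 b_i = 1 and the identity holds."""
    return b_i / np.sqrt(b_i @ K @ K @ b_i)
```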
8. The method of extracting the image of the port wharf through multispectral interpretation according to claim 1, further comprising:
a step S500 of validating a first spatial correlation relationship, which specifically comprises:
based on steps S100 and S200, obtaining a second spatial correlation relationship of the blurred coastline from the assignment of values to the grayscales and the image processing operation, and validating, by using the second spatial correlation relationship, the first spatial correlation relationship of the port wharf recognized in step S400.
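Claim 8 leaves the form of the spatial correlation validation open. One plausible, purely illustrative reading: accept an extracted wharf region only if enough of it lies within a buffer around the blurred coastline; the buffer width and overlap threshold are invented parameters.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def validate_wharf(wharf_mask, coast_mask, buffer_px=5, min_overlap=0.5):
    """True if >= min_overlap of the wharf pixels fall inside a
    buffer_px-wide zone dilated around the coastline mask."""
    coast_zone = binary_dilation(coast_mask, iterations=buffer_px)
    overlap = (wharf_mask & coast_zone).sum() / max(wharf_mask.sum(), 1)
    return overlap >= min_overlap
```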
9. (canceled)
10. (canceled)
US16/205,251 2017-12-05 2018-11-30 Method of extracting image of port wharf through multispectral interpretation Expired - Fee Related US10325151B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201711264357.0A CN108256419B (en) 2017-12-05 2017-12-05 Method of extracting a port and pier image using multispectral interpretation
CN201711264357.0 2017-12-05
CN201711264357 2017-12-05

Publications (2)

Publication Number Publication Date
US20190171862A1 true US20190171862A1 (en) 2019-06-06
US10325151B1 US10325151B1 (en) 2019-06-18

Family

ID=62721712

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/205,251 Expired - Fee Related US10325151B1 (en) 2017-12-05 2018-11-30 Method of extracting image of port wharf through multispectral interpretation

Country Status (2)

Country Link
US (1) US10325151B1 (en)
CN (1) CN108256419B (en)

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110310246A * 2019-07-05 2019-10-08 广西壮族自治区基础地理信息中心 Remote sensing information extraction method for cane-growing regions based on three-line-array imagery
CN110632007A (en) * 2019-09-25 2019-12-31 南宁师范大学 Rapid extraction method for exposed water surface tidal flat range
CN110852381A (en) * 2019-11-11 2020-02-28 四川航天神坤科技有限公司 Forest fire burned area extraction method and system
CN110991248A (en) * 2019-11-04 2020-04-10 同济大学 High-resolution noctilucent remote sensing image automatic change detection method based on feature fusion
CN111047570A (en) * 2019-12-10 2020-04-21 西安中科星图空间数据技术有限公司 Automatic cloud detection method based on texture analysis method
CN111104889A (en) * 2019-12-04 2020-05-05 山东科技大学 Water body remote sensing identification method based on U-net
CN111274968A (en) * 2020-01-20 2020-06-12 广州市城市规划自动化中心 Object-oriented road information extraction method and device and electronic equipment
CN111340755A (en) * 2020-02-11 2020-06-26 南阳理工学院 Remote sensing image processing method
CN111428627A (en) * 2020-03-23 2020-07-17 西北大学 Mountain landform remote sensing extraction method and system
CN111461033A (en) * 2020-04-07 2020-07-28 北京中科千寻科技有限公司 Local climate area classification structure and method based on branch CNN and using SAR and multispectral remote sensing data
CN111832502A (en) * 2020-07-20 2020-10-27 中国人民解放军战略支援部队航天工程大学 Remote sensing image visual salient region intelligent search method for satellite in-orbit application
CN111832575A (en) * 2020-07-16 2020-10-27 黄河勘测规划设计研究院有限公司 Water surface area extraction method and device based on remote sensing image
CN112861824A (en) * 2021-04-06 2021-05-28 中国科学院地理科学与资源研究所 Coastline extraction method and device, terminal device and readable storage medium
CN112906577A (en) * 2021-02-23 2021-06-04 清华大学 Fusion method of multi-source remote sensing image
CN112949657A (en) * 2021-03-09 2021-06-11 河南省现代农业大数据产业技术研究院有限公司 Forest land distribution extraction method and device based on remote sensing image texture features
CN112990066A (en) * 2021-03-31 2021-06-18 武汉大学 Remote sensing image solid waste identification method and system based on multi-strategy enhancement
CN113408460A (en) * 2021-06-30 2021-09-17 中国科学院东北地理与农业生态研究所 Method for detecting spartina alterniflora distribution based on remote sensing big data and cloud platform
CN113408615A (en) * 2021-06-16 2021-09-17 中国石油大学(华东) Automatic ship matching method based on optical satellite remote sensing image
CN113610940A (en) * 2021-08-10 2021-11-05 江苏天汇空间信息研究院有限公司 Ocean vector file and image channel threshold based coastal area color homogenizing method
CN113744249A (en) * 2021-09-07 2021-12-03 中国科学院大学 Marine ecological environment damage investigation method
WO2021258758A1 (en) * 2020-06-22 2021-12-30 大连海洋大学 Coastline change identification method based on multiple factors
CN114119769A (en) * 2021-11-22 2022-03-01 北京市遥感信息研究所 High-precision yaw relative radiation calibration method based on uniform field
CN114241302A (en) * 2021-12-01 2022-03-25 浙江大学德清先进技术与产业研究院 Regular house roof automatic extraction method based on remote sensing image spectral information
CN114612793A (en) * 2022-02-22 2022-06-10 中国自然资源航空物探遥感中心 Multi-temporal remote sensing coastline and tidal flat detection method based on high-frequency observation of water sideline
CN114663783A (en) * 2022-05-23 2022-06-24 自然资源部第二海洋研究所 Remote sensing identification method for water body pollution of river entering sea based on machine learning
CN114742854A (en) * 2022-04-02 2022-07-12 西安电子科技大学 SAR image sea-land segmentation method based on scene prior knowledge and region combination
CN114821355A (en) * 2022-04-27 2022-07-29 生态环境部卫星环境应用中心 Coastline automatic identification method and device
CN115326722A (en) * 2022-08-12 2022-11-11 宁波拾烨智能科技有限公司 Ocean red tide early warning method based on hyperspectral remote sensing data
CN115861824A (en) * 2023-02-23 2023-03-28 汕头大学 Remote sensing image identification method based on improved Transformer
CN116630811A (en) * 2023-06-07 2023-08-22 自然资源部国土卫星遥感应用中心 River extraction method, river extraction device, terminal equipment and readable storage medium
CN117671513A (en) * 2023-11-22 2024-03-08 中国人民公安大学 Optical remote sensing image recognition method, device, equipment and storage medium
CN118072165A * 2024-02-20 2024-05-24 生态环境部华南环境科学研究所(生态环境部生态环境应急研究所) Black and odorous water body risk zoning method and system for cities with dense river networks

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116883861B (en) * 2023-07-17 2024-01-26 中国人民解放军战略支援部队航天工程大学 Port large and medium-sized ship activity identification method and system for microsatellite on-orbit application
CN117690028B (en) * 2024-02-02 2024-04-09 江苏菲尔浦物联网有限公司 Target detection method and system based on remote sensing sensor

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4219805B2 (en) * 2001-06-19 2009-02-04 フェユル キム Extraction method of shape change descriptor for image sequence retrieval
US8238658B2 (en) * 2009-01-21 2012-08-07 The United States Of America, As Represented By The Secretary Of The Navy Boundary extraction method
US8676555B2 (en) * 2010-10-26 2014-03-18 The United States Of America, As Represented By The Secretary Of The Navy Tool for rapid configuration of a river model using imagery-based information
CN102663394B (en) * 2012-03-02 2013-09-25 北京航空航天大学 Method of identifying large and medium-sized objects based on multi-source remote sensing image fusion
US9224200B2 (en) * 2012-04-27 2015-12-29 Parasite Technologies A/S Computer vision based method for extracting features relating to the developmental stages of Trichuris spp. eggs
CN103020975A (en) * 2012-12-29 2013-04-03 北方工业大学 Wharf and ship segmentation method combining multi-source remote sensing image characteristics
CN103679138A (en) * 2013-11-15 2014-03-26 中国科学院遥感与数字地球研究所 Ship and port prior knowledge supported large-scale ship detection method
FR3013876B1 (en) * 2013-11-28 2016-01-01 Sagem Defense Securite ANALYSIS OF A MULTISPECTRAL IMAGE
CN104063870A (en) * 2014-07-04 2014-09-24 中国科学院大学 Automatic land and sea template segmentation method based on scanning line detection and application thereof
CN104966065B (en) * 2015-06-23 2018-11-09 电子科技大学 target identification method and device
CN105740794B * 2016-01-27 2019-07-05 中国人民解放军92859部队 Automatic coastline extraction and classification method based on satellite imagery
CN106407938A (en) * 2016-09-23 2017-02-15 交通运输部规划研究院 Method and system for extracting specific ground object of port by utilizing remote sensing image
CN106407940A (en) * 2016-09-23 2017-02-15 交通运输部规划研究院 Port water area image extraction method and system
CN107358161B (en) * 2017-06-08 2020-01-10 深圳先进技术研究院 Coastline extraction method and coastline extraction system based on remote sensing image classification

Also Published As

Publication number Publication date
CN108256419A (en) 2018-07-06
CN108256419B (en) 2018-11-23
US10325151B1 (en) 2019-06-18

Similar Documents

Publication Publication Date Title
US10325151B1 (en) Method of extracting image of port wharf through multispectral interpretation
CN109359602B (en) Lane line detection method and device
US9424486B2 (en) Method of image processing
US10922794B2 (en) Image correction method and device
US10699134B2 (en) Method, apparatus, storage medium and device for modeling lane line identification, and method, apparatus, storage medium and device for identifying lane line
CN107301661A (en) High-resolution remote sensing image method for registering based on edge point feature
CN108230376B (en) Remote sensing image processing method and device and electronic equipment
CN112950508A (en) Drainage pipeline video data restoration method based on computer vision
US11227367B2 (en) Image processing device, image processing method and storage medium
CN103871039B (en) Generation method for difference chart in SAR (Synthetic Aperture Radar) image change detection
CN103226832B (en) Based on the multi-spectrum remote sensing image change detecting method of spectral reflectivity mutation analysis
CN109829858B (en) Ship-borne radar image oil spill monitoring method based on local adaptive threshold
WO2017193414A1 (en) Image corner detection method based on turning radius
CN106327455A (en) Improved method for fusing remote-sensing multispectrum with full-color image
JP6958743B2 (en) Image processing device, image processing method and image processing program
CN110852207A (en) Blue roof building extraction method based on object-oriented image classification technology
US9077926B2 (en) Image processing method and image processing apparatus
CN104680536A (en) Method for detecting SAR image change by utilizing improved non-local average algorithm
Wang et al. Road detection based on illuminant invariance and quadratic estimation
CN110490848A (en) Infrared target detection method, apparatus and computer storage medium
CN103337080A (en) Registration technology of infrared image and visible image based on Hausdorff distance in gradient direction
CN109461171A (en) The small IR targets detection algorithm of DoG filtering is improved based on multichannel
CN111062384B (en) Vehicle window accurate positioning method based on deep learning
CN106204596B (en) Panchromatic waveband remote sensing image cloud detection method based on Gaussian fitting function and fuzzy mixed estimation
CN104200460A (en) Image registration method based on images characteristics and mutual information

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

AS Assignment

Owner name: TRANSPORT PLANNING AND RESEARCH INSTITUTE MINISTRY OF TRANSPORT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:QI, YUE;MAO, YAPING;FENG, YUN;AND OTHERS;REEL/FRAME:047701/0001

Effective date: 20181127

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230618