CN109858394A - Remote sensing image water area extraction method based on saliency detection - Google Patents

Remote sensing image water area extraction method based on saliency detection

Info

Publication number
CN109858394A
CN109858394A (application CN201910027907.XA)
Authority
CN
China
Prior art keywords
remote sensing
sensing images
water body
threshold value
candidate region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910027907.XA
Other languages
Chinese (zh)
Inventor
吕宁
陈晨
万春曼
刘佳凤
胡少哲
张旭东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201910027907.XA priority Critical patent/CN109858394A/en
Publication of CN109858394A publication Critical patent/CN109858394A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The present invention relates to a remote sensing image water area extraction method based on saliency detection, comprising: performing preliminary detection on an initial remote sensing image using a saliency detection method to obtain a candidate region; and performing water body classification on the candidate region using the normalized difference water index method to determine the water area. By combining the saliency detection method with the normalized difference water index method, the method of the present invention improves the efficiency of water body information extraction. It is applicable not only to obtaining water body information from low-resolution remote sensing images, improving the efficiency of water body detection, but also facilitates obtaining water body information from high-resolution remote sensing images.

Description

Remote sensing image water area extraction method based on saliency detection
Technical field
The present invention belongs to the technical field of remote sensing image information processing, and in particular relates to a remote sensing image water area extraction method based on saliency detection.
Background Art
A remote sensing image is a film or photograph that records the electromagnetic radiation of various ground objects, and is broadly divided into aerial images and satellite images. In recent years, remote sensing images have been widely used in many fields and have become an important means of safeguarding national security, promoting economic construction, and maintaining information services. Among these applications, quickly and accurately extracting water body information from remote sensing images is an important means for water resource surveys, wetland protection, measurement of flood-inundated areas, assessment of the resulting losses, and so on.
The basic principle of water body information extraction is to cluster ground objects according to their spectral characteristics in remote sensing images. Because different types of water bodies exhibit different spectral characteristics under different external conditions, water bodies can be classified using different band combinations of remote sensing data. At present, the main methods for extracting water body information from remote sensing images are the single-band threshold method and the multi-band spectral relation method. In areas with complex terrain, the spectral characteristics of most topographic shadows are similar to those of deep water, so the single-band threshold method often fails to extract water bodies effectively. The multi-band spectral relation method uses a simple water index combination and therefore loses a large amount of multispectral remote sensing information; it also treats the entire remote sensing image as a whole, so the statistics reflect the differences in spectral characteristics among different types of water bodies across the study area while masking detailed water body information, and the extraction result is mixed with much non-water information. Moreover, because it operates at the pixel level, it is also unsuitable for high-resolution remote sensing images.
Summary of the Invention
In order to solve the above problems in the prior art, the present invention provides a remote sensing image water area extraction method based on saliency detection. The technical problem to be solved by the present invention is achieved through the following technical solutions:
The remote sensing image water area extraction method based on saliency detection provided by the present invention comprises:
performing preliminary detection on an initial remote sensing image using a saliency detection method to obtain a candidate region; and
performing water body classification on the candidate region using a normalized difference water index method to determine the water area.
In an embodiment of the present invention, performing preliminary detection on the initial remote sensing image using the saliency detection method to obtain the candidate region comprises:
converting the initial remote sensing image into a remote sensing image luminance map;
applying Gaussian blur to the remote sensing image luminance map to obtain a Gaussian-blurred remote sensing image;
computing on the remote sensing image luminance map and the Gaussian-blurred remote sensing image to obtain a saliency map; and
determining the candidate region according to the saliency map.
In an embodiment of the present invention, computing on the remote sensing image luminance map and the Gaussian-blurred remote sensing image to obtain the saliency map comprises:
calculating the Euclidean distance between the luminance value of a pixel in the remote sensing image luminance map and the luminance value of the corresponding pixel in the Gaussian-blurred remote sensing image to obtain the saliency value of that pixel;
calculating the saliency values of all pixels in the remote sensing image luminance map; and
assembling all the saliency values into the saliency map.
In an embodiment of the present invention, determining the candidate region according to the saliency map comprises:
obtaining an optimal gray threshold according to the saliency map;
obtaining a salient region according to the optimal gray threshold; and
mapping the salient region onto the initial remote sensing image to obtain the candidate region.
In an embodiment of the present invention, obtaining the optimal gray threshold according to the saliency map comprises:
obtaining the optimal gray threshold from the saliency map using the maximum between-class variance method.
In an embodiment of the present invention, obtaining the salient region according to the optimal gray threshold comprises:
forming the salient region from the saliency values that are greater than or equal to the optimal gray threshold.
In an embodiment of the present invention, performing water body classification on the candidate region using the normalized difference water index method to determine the water area comprises:
constructing normalized difference water index data for the candidate region;
determining a water body threshold in the normalized difference water index data; and
determining the water area according to the water body threshold.
In an embodiment of the present invention, constructing the normalized difference water index data for the candidate region comprises:
constructing normalized difference water index (NDWI) data using the green band and the near-infrared band, where the NDWI is calculated as NDWI = (G - NIR) / (G + NIR), G denotes the green band, and NIR denotes the near-infrared band.
Compared with the prior art, the beneficial effects of the present invention are as follows:
The saliency detection method first performs preliminary detection on the water area in the initial remote sensing image and generates a small number of object bounding boxes as candidate regions; the candidate regions are then classified with the normalized difference water index method to determine the location of the water area. By combining the saliency detection method with the normalized difference water index method, the method of the present invention improves the efficiency of water body information extraction. It is applicable not only to obtaining water body information from low-resolution remote sensing images, improving the efficiency of water body detection, but also facilitates obtaining water body information from high-resolution remote sensing images.
Brief Description of the Drawings
Fig. 1 is a flow diagram of a remote sensing image water area extraction method based on saliency detection according to an embodiment of the present invention;
Fig. 2 is a remote sensing image luminance map converted from an initial remote sensing image cropped from the Landsat-8 dataset;
Fig. 3 is an enlarged view of region A in Fig. 2;
Fig. 4 is the saliency map of the image in Fig. 3;
Fig. 5 is the saliency map of Fig. 4 after the salient region has been determined.
Specific Embodiments
The content of the present invention is further described below in conjunction with specific embodiments, but the embodiments of the present invention are not limited thereto.
Referring to Fig. 1, which is a flow diagram of a remote sensing image water area extraction method based on saliency detection according to an embodiment of the present invention, the method of this embodiment comprises:
S1: performing preliminary detection on an initial remote sensing image using a saliency detection method to obtain a candidate region;
S2: performing water body classification on the candidate region using a normalized difference water index method to determine the water area.
Further, S1 comprises:
S11: converting the initial remote sensing image into a remote sensing image luminance map.
Specifically, in remote sensing images a pure water body mainly reflects the blue-green band of visible light; its reflectance in the other visible bands is very low, and its absorption in the near-infrared band is particularly strong, so water bodies usually appear dark in remote sensing images. Considering the computational cost of obtaining texture, structure, and boundary contour features, the image feature selected here is the luminance of the remote sensing image. The luminance value I of each pixel is calculated as I = (R + G + B) / 3, where R, G, and B are the red, green, and blue components of the pixel in the initial remote sensing image, and the luminance values of all pixels constitute the remote sensing image luminance map. As shown in Figs. 2 and 3, Fig. 2 is a remote sensing image luminance map converted from an initial remote sensing image cropped from the Landsat-8 dataset, and Fig. 3 is an enlarged view of region A in Fig. 2.
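As a concrete illustration of step S11 (a minimal sketch, not taken from the patent itself), the following Python snippet computes the luminance map I = (R + G + B) / 3 with NumPy; the function name and the H x W x 3 array layout are assumptions.

```python
import numpy as np

def to_luminance(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB remote sensing image to the luminance map I = (R + G + B) / 3."""
    rgb = rgb.astype(np.float64)
    return (rgb[..., 0] + rgb[..., 1] + rgb[..., 2]) / 3.0
```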
S12: applying Gaussian blur to the remote sensing image luminance map to obtain a Gaussian-blurred remote sensing image.
Specifically, the remote sensing image luminance map is convolved with the two-dimensional Gaussian filter function G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²)), thereby attenuating the luminance values in the remote sensing image luminance map.
In this embodiment, the remote sensing image luminance map is blurred with a Gaussian kernel of size 3σ × 3σ, where the standard deviation is σ = min(W, H) / σs, W and H are the width and height of the remote sensing image luminance map, and the parameter σs controls the blur strength and is set to 16.
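A minimal sketch of step S12, assuming SciPy's gaussian_filter is an acceptable stand-in for the explicit 3σ × 3σ convolution described above (truncate=1.5 limits the kernel radius to about 1.5σ, i.e. a window of roughly 3σ × 3σ); the function name is illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_blur(lum: np.ndarray, sigma_s: float = 16.0) -> np.ndarray:
    """Blur the luminance map with sigma = min(W, H) / sigma_s, as in this embodiment."""
    h, w = lum.shape
    sigma = min(w, h) / sigma_s          # standard deviation of the Gaussian
    # truncate=1.5 keeps the effective kernel close to the 3*sigma window described above
    return gaussian_filter(lum, sigma=sigma, truncate=1.5)
```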
S13: computing on the remote sensing image luminance map and the Gaussian-blurred remote sensing image to obtain a saliency map.
Refer to Fig. 4, which is the saliency map of the image in Fig. 3.
Further, S13 comprises:
S131: calculating the Euclidean distance between the luminance value of a pixel in the remote sensing image luminance map and the luminance value of the corresponding pixel in the Gaussian-blurred remote sensing image to obtain the saliency value of that pixel.
Specifically, the saliency value is calculated as ρ = ||I - Iω||₂, where I is the luminance value of a pixel in the remote sensing image luminance map and Iω is the luminance value of the corresponding pixel in the Gaussian-blurred remote sensing image.
S132: using the saliency formula in step S131, calculating the saliency values of all pixels in the remote sensing image luminance map.
S133: assembling all the saliency values into the saliency map.
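Steps S131 to S133 can be sketched as follows; for a single-channel luminance map the Euclidean distance ρ = ||I - Iω||₂ reduces to the absolute difference, and the function name is an assumption.

```python
import numpy as np

def saliency_map(lum: np.ndarray, lum_blur: np.ndarray) -> np.ndarray:
    """Per-pixel saliency rho = ||I - I_omega||_2 over the whole luminance map."""
    return np.abs(lum - lum_blur)
```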
S14: determining the candidate region according to the saliency map.
Further, S14 comprises:
S141: obtaining an optimal gray threshold according to the saliency map;
S142: obtaining a salient region according to the optimal gray threshold;
S143: mapping the salient region onto the initial remote sensing image to obtain the candidate region.
Further, S141 comprises:
obtaining the optimal gray threshold from the saliency map using the maximum between-class variance method.
Specifically, the maximum between-class variance method (Otsu's method) is a global binarization method based on the gray-level characteristics of an image: the image is divided into a foreground part and a background part according to its gray values, and the optimal threshold is the one at which the difference in gray-level characteristics between the two parts, i.e. the between-class variance, is the largest.
In this embodiment, the saliency map is divided into a foreground image and a background image according to its gray values, and the segmentation threshold between them is denoted as the gray threshold. Let the proportion of foreground pixels among all pixels of the saliency map be ω0 with average gray level u0, the proportion of background pixels be ω1 with average gray level u1, the overall average gray level of the image be u, and the between-class gray variance of the foreground and background be g. Then:
u = ω0 · u0 + ω1 · u1
g = ω0 · (u0 − u)² + ω1 · (u1 − u)²
which simplifies to g = ω0 · ω1 · (u0 − u1)². When the between-class variance g is at its maximum, the difference between the foreground image and the background image is considered to be the largest, so the gray threshold corresponding to the maximum between-class variance is the optimal gray threshold, denoted t.
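The maximum between-class variance search of S141 might look like the following sketch, assuming the saliency map has first been rescaled to integer gray levels 0-255; it is an illustration of Otsu's method, not the patent's reference implementation.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray, levels: int = 256) -> int:
    """Return the gray threshold that maximizes the between-class variance g = w0*w1*(u0 - u1)^2."""
    hist = np.bincount(gray.ravel(), minlength=levels).astype(np.float64)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, levels):
        w0, w1 = prob[:t].sum(), prob[t:].sum()            # proportions of the two classes
        if w0 == 0 or w1 == 0:
            continue
        u0 = (np.arange(t) * prob[:t]).sum() / w0          # mean gray level below t
        u1 = (np.arange(t, levels) * prob[t:]).sum() / w1  # mean gray level at or above t
        var_between = w0 * w1 * (u0 - u1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```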
Further, S142 comprises:
forming the salient region from the saliency values that are greater than or equal to the optimal gray threshold.
Specifically, the salient region Sa is determined as follows:
Sa = {pixels with saliency value ρ ≥ t}, n-Sa = {pixels with saliency value ρ < t}. That is, the saliency values ρ greater than or equal to the optimal gray threshold t constitute the salient region Sa, and the saliency values ρ less than the optimal gray threshold t constitute the non-salient region n-Sa. As shown in Fig. 5, which is the saliency map of Fig. 4 after the salient region has been determined, the white area in Fig. 5 is the salient region; the white area is mapped onto the initial remote sensing image to obtain the candidate region.
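A possible sketch of S142 and S143: thresholding the saliency map at t and using the resulting binary mask to pick the candidate region out of the initial image; variable names and the masking convention are assumptions.

```python
import numpy as np

def candidate_region(image: np.ndarray, sal: np.ndarray, t: float):
    """image: H x W x C initial image; sal: H x W saliency map; t: optimal gray threshold."""
    mask = sal >= t                                   # Sa: salient region; ~mask is n-Sa
    candidate = np.where(mask[..., None], image, 0)   # keep only pixels mapped from the salient region
    return mask, candidate
```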
Further, S2 comprises:
S21: constructing normalized difference water index data for the candidate region;
S22: determining a water body threshold in the normalized difference water index data;
S23: determining the water area according to the water body threshold.
Further, S21 comprises:
constructing normalized difference water index (NDWI) data using the green band and the near-infrared band, where the NDWI is calculated as NDWI = (G - NIR) / (G + NIR), G denotes the green band, and NIR denotes the near-infrared band.
Specifically, the reflected intensity of a water body gradually weakens from the visible to the shortwave-infrared bands; absorption by water is strongest in the near-infrared and shortwave-infrared ranges, where it reflects almost nothing, whereas vegetation reflects strongly in the near-infrared band. The ratio of the green band to the near-infrared band therefore suppresses vegetation information to the greatest extent and highlights water body information. Accordingly, normalized difference water index (NDWI) data are constructed from the green band and the near-infrared band; in the NDWI data the values of water bodies are higher than those of non-water ground objects, which highlights the water body information within the candidate region, and setting an appropriate threshold then determines the water area. The NDWI is calculated as NDWI = (G - NIR) / (G + NIR), where G denotes the green band and NIR denotes the near-infrared band. In this embodiment, the candidate region is processed with remote sensing image processing software (ENVI) to obtain the water area.
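The NDWI computation and water body thresholding of S21 to S23 could be sketched as below; the default water threshold of 0.0 is an illustrative assumption rather than a value stated in the patent, and in practice it would be tuned per scene (for example in ENVI).

```python
import numpy as np

def ndwi_water_mask(green: np.ndarray, nir: np.ndarray,
                    candidate_mask: np.ndarray, water_threshold: float = 0.0) -> np.ndarray:
    """Mark water pixels inside the candidate region using NDWI = (G - NIR) / (G + NIR)."""
    g = green.astype(np.float64)
    n = nir.astype(np.float64)
    ndwi = (g - n) / (g + n + 1e-12)      # small epsilon avoids division by zero
    return candidate_mask & (ndwi > water_threshold)
```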
The remote sensing image water area extraction method based on saliency detection of this embodiment first performs preliminary detection on the water area in the initial remote sensing image with a saliency detection method and generates a small number of object bounding boxes as candidate regions, and then classifies the candidate regions with the normalized difference water index method to determine the location of the water area. By combining the saliency detection method with the normalized difference water index method, the method of this embodiment improves the efficiency of water body information extraction; it is applicable not only to obtaining water body information from low-resolution remote sensing images, improving the efficiency of water body detection, but also facilitates obtaining water body information from high-resolution remote sensing images.
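Tying the sketches above together, a hypothetical end-to-end pipeline for the whole method might read as follows (all helper names come from the earlier sketches, not from the patent, and the gray-level rescaling of the saliency map is an added assumption).

```python
import numpy as np

def extract_water(rgb, green, nir, sigma_s=16.0, water_threshold=0.0):
    lum = to_luminance(rgb)                                # S11: luminance map
    lum_blur = gaussian_blur(lum, sigma_s)                 # S12: Gaussian blur
    sal = saliency_map(lum, lum_blur)                      # S13: saliency map
    # rescale saliency to integer gray levels 0-255 for the Otsu search
    sal255 = (255 * (sal - sal.min()) / (np.ptp(sal) + 1e-12)).astype(int)
    t = otsu_threshold(sal255)                             # S141: optimal gray threshold
    mask, _ = candidate_region(rgb, sal255, t)             # S142-S143: candidate region
    return ndwi_water_mask(green, nir, mask, water_threshold)  # S2: water area
```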
The above content is a further detailed description of the present invention in conjunction with specific preferred embodiments, and it should not be construed that the specific implementation of the present invention is limited to these descriptions. For those of ordinary skill in the art to which the present invention belongs, a number of simple deductions or substitutions can be made without departing from the concept of the present invention, all of which shall be regarded as falling within the protection scope of the present invention.

Claims (8)

1. A remote sensing image water area extraction method based on saliency detection, characterized by comprising:
performing preliminary detection on an initial remote sensing image using a saliency detection method to obtain a candidate region; and
performing water body classification on the candidate region using a normalized difference water index method to determine a water area.
2. The method according to claim 1, characterized in that performing preliminary detection on the initial remote sensing image using the saliency detection method to obtain the candidate region comprises:
converting the initial remote sensing image into a remote sensing image luminance map;
applying Gaussian blur to the remote sensing image luminance map to obtain a Gaussian-blurred remote sensing image;
computing on the remote sensing image luminance map and the Gaussian-blurred remote sensing image to obtain a saliency map; and
determining the candidate region according to the saliency map.
3. The method according to claim 2, characterized in that computing on the remote sensing image luminance map and the Gaussian-blurred remote sensing image to obtain the saliency map comprises:
calculating the Euclidean distance between the luminance value of a pixel in the remote sensing image luminance map and the luminance value of the corresponding pixel in the Gaussian-blurred remote sensing image to obtain the saliency value of that pixel;
calculating the saliency values of all pixels in the remote sensing image luminance map; and
assembling all the saliency values into the saliency map.
4. The method according to claim 3, characterized in that determining the candidate region according to the saliency map comprises:
obtaining an optimal gray threshold according to the saliency map;
obtaining a salient region according to the optimal gray threshold; and
mapping the salient region onto the initial remote sensing image to obtain the candidate region.
5. The method according to claim 4, characterized in that obtaining the optimal gray threshold according to the saliency map comprises:
obtaining the optimal gray threshold from the saliency map using the maximum between-class variance method.
6. The method according to claim 4, characterized in that obtaining the salient region according to the optimal gray threshold comprises:
forming the salient region from the saliency values that are greater than or equal to the optimal gray threshold.
7. The method according to claim 1, characterized in that performing water body classification on the candidate region using the normalized difference water index method to determine the water area comprises:
constructing normalized difference water index data for the candidate region;
determining a water body threshold in the normalized difference water index data; and
determining the water area according to the water body threshold.
8. The method according to claim 7, characterized in that constructing the normalized difference water index data for the candidate region comprises:
constructing normalized difference water index (NDWI) data using the green band and the near-infrared band, where the NDWI is calculated as NDWI = (G - NIR) / (G + NIR), G denotes the green band, and NIR denotes the near-infrared band.
CN201910027907.XA 2019-01-11 2019-01-11 Remote sensing image water area extraction method based on saliency detection Pending CN109858394A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910027907.XA CN109858394A (en) 2019-01-11 2019-01-11 Remote sensing image water area extraction method based on saliency detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910027907.XA CN109858394A (en) 2019-01-11 2019-01-11 Remote sensing image water area extraction method based on saliency detection

Publications (1)

Publication Number Publication Date
CN109858394A true CN109858394A (en) 2019-06-07

Family

ID=66894597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910027907.XA Pending CN109858394A (en) 2019-01-11 2019-01-11 A kind of remote sensing images water area extracting method based on conspicuousness detection

Country Status (1)

Country Link
CN (1) CN109858394A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020965A (en) * 2012-11-29 2013-04-03 奇瑞汽车股份有限公司 Foreground segmentation method based on significance detection
CN103729848A (en) * 2013-12-28 2014-04-16 北京工业大学 Hyperspectral remote sensing image small target detection method based on spectrum saliency
CN104700412A (en) * 2015-03-17 2015-06-10 苏州大学 Calculating method of visual salience drawing
CN104966085A (en) * 2015-06-16 2015-10-07 北京师范大学 Remote sensing image region-of-interest detection method based on multi-significant-feature fusion
CN108985307A (en) * 2018-07-16 2018-12-11 中国科学院东北地理与农业生态研究所 A kind of Clean water withdraw method and system based on remote sensing image

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110688990A (en) * 2019-10-31 2020-01-14 中国科学院地理科学与资源研究所 Rice planting candidate area determination method
CN110688990B (en) * 2019-10-31 2022-07-26 中国科学院地理科学与资源研究所 Rice planting candidate area determination method
CN111007039A (en) * 2019-11-29 2020-04-14 航天东方红卫星有限公司 Automatic extraction method and system for sub-pixel level water body of medium-low resolution remote sensing image
CN111007039B (en) * 2019-11-29 2022-07-29 航天东方红卫星有限公司 Automatic extraction method and system for sub-pixel level water body of medium-low resolution remote sensing image
CN111242965A (en) * 2020-01-10 2020-06-05 西安电子科技大学 Genetic algorithm-based breast tumor contour dynamic extraction method
CN111931709A (en) * 2020-09-17 2020-11-13 航天宏图信息技术股份有限公司 Water body extraction method and device for remote sensing image, electronic equipment and storage medium
CN111931709B (en) * 2020-09-17 2021-01-05 航天宏图信息技术股份有限公司 Water body extraction method and device for remote sensing image, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109858394A (en) Remote sensing image water area extraction method based on saliency detection
CN106022288B (en) The identification of marine oil spill information and extracting method based on SAR image
CN103763515B (en) A kind of video abnormality detection method based on machine learning
CN111753577B (en) Apple identification and positioning method in automatic picking robot
CN111027446B (en) Coastline automatic extraction method of high-resolution image
CN106650812B (en) A kind of urban water-body extracting method of satellite remote-sensing image
CN102867185B (en) Method and system for identifying automobile tire number
CN102750701B (en) Method for detecting spissatus and spissatus shadow based on Landsat thematic mapper (TM) images and Landsat enhanced thematic mapper (ETM) images
CN108596103A (en) High resolution ratio satellite remote-sensing image building extracting method based on optimal spectrum Index selection
Awrangjeb et al. Improved building detection using texture information
US11017507B2 (en) Image processing device for detection and correction of cloud cover, image processing method and storage medium
CN102842037A (en) Method for removing vehicle shadow based on multi-feature fusion
US11151377B2 (en) Cloud detection method based on landsat 8 snow-containing image
CN107992856B (en) High-resolution remote sensing building shadow detection method under urban scene
CN102855627B (en) City remote sensing image shadow detection method based on spectral characteristic and topological relation
CN107103295B (en) Optical remote sensing image cloud detection method
CN115082776A (en) Electric energy meter automatic detection system and method based on image recognition
CN102354388A (en) Method for carrying out adaptive computing on importance weights of low-level features of image
CN117456371B (en) Group string hot spot detection method, device, equipment and medium
CN107133958B (en) Optical remote sensing ship slice segmentation method based on block particle size pre-judging balance histogram
CN117994679A (en) Intelligent image analysis method for defects of wind power equipment
CN116758423A (en) Power transmission line foreign matter detection method based on white point rate method
CN110633705A (en) Low-illumination imaging license plate recognition method and device
Li et al. Detection and compensation of shadows based on ICA algorithm in remote sensing image
CN115376131A (en) Design and identification method of dot-shaped coding mark

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190607