CN107273803B - Cloud layer image detection method - Google Patents


Info

Publication number
CN107273803B
CN107273803B (application CN201710345541.1A)
Authority
CN
China
Prior art keywords
detection device
remote sensing
image
spectrum
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710345541.1A
Other languages
Chinese (zh)
Other versions
CN107273803A (en)
Inventor
王俊
刘延利
彭真明
王晓阳
杨春平
李霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Environmental Features
Original Assignee
Beijing Institute of Environmental Features
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Environmental Features filed Critical Beijing Institute of Environmental Features
Priority to CN201710345541.1A priority Critical patent/CN107273803B/en
Publication of CN107273803A publication Critical patent/CN107273803A/en
Application granted granted Critical
Publication of CN107273803B publication Critical patent/CN107273803B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a cloud layer image detection method comprising the following steps: the detection device receives a remote sensing image sequence containing N frames of remote sensing images; the detection device extracts energy features from the remote sensing image sequence to obtain an energy saliency map; the detection device calculates the brightness contrast of the energy saliency map to obtain a brightness contrast image; the detection device extracts texture features from the brightness contrast image to obtain a texture feature map; the detection device performs inter-frame association on the texture feature map using image motion information, extracts a region of interest, and obtains a cloud layer detection result. The method reduces the requirements on the sensor while accurately detecting the position and size of cirrus clouds; moreover, the algorithm is simple, computationally efficient, and can meet real-time requirements.

Description

Cloud layer image detection method
Technical Field
The invention relates to the technical field of remote sensing, in particular to a cloud layer image detection method based on energy characteristics, textural characteristics and motion characteristics.
Background
In recent years, remote sensing technology has been widely applied in fields such as modern military operations, space-based detection, and meteorological analysis, and remote sensing image interpretation is one of its key technologies. Remote sensing image interpretation refers to comprehensive analysis based on the geometric characteristics and physical properties of images, in order to reveal the qualitative and quantitative characteristics of objects or phenomena and the interrelations among those characteristics, and further to study their development processes and distribution rules; that is, the objects or phenomena are identified from the image features that represent them. False alarm sources have a great influence on remote sensing image interpretation. Various false alarm sources often exist in remote sensing images, characterized by high radiation intensity, time variation, and so on. For example, high-altitude cirrus clouds are an important source of false alarms. Between one third and one half of the Earth's surface is covered by clouds, and high-altitude clouds are a main source of clutter for remote sensing imaging and target detection systems. Because of their rapidly changing shape and variable motion, cirrus clouds bring considerable difficulty to remote sensing image interpretation. Research on a suitable cirrus cloud detection algorithm can improve the precision of remote sensing imaging and detection systems and benefits military and space applications.
Generally, cloud detection adopts a spectral analysis method: multi-channel data such as visible and infrared imagery are collected, and the radiation difference between the cloud layer and other ground objects is used for detection. However, this approach has poor real-time performance and places high demands on the imaging equipment, making cirrus detection difficult to implement.
Disclosure of Invention
The invention aims to provide a cloud layer image detection method based on energy, texture, and motion features, so as to solve the problem that high-altitude cirrus clouds are difficult to detect accurately in the prior art.
In order to solve the above technical problem, an embodiment of the present invention provides a cloud layer image detection method, including: the detection device receives a remote sensing image sequence f_n(x, y) containing N frames of remote sensing images, where n = 1, ..., N is the frame index and N is the total number of frames; the detection device extracts the energy features of the remote sensing images from the remote sensing image sequence to obtain an energy saliency map E_n(x, y); the detection device calculates the brightness contrast of the energy saliency map to obtain a brightness contrast image C_n(x, y); the detection device extracts the texture features of the brightness contrast image to obtain a texture feature map G_n(x, y); the detection device performs inter-frame association on the texture feature map using image motion information, extracts a region of interest (ROI), and obtains a cloud layer detection result.
Optionally, performing, by the detection device, the inter-frame association on the texture feature map using image motion information, extracting the region of interest (ROI), and obtaining the cloud layer detection result specifically comprises:
the detection device sets a brightness threshold T and performs threshold segmentation on the brightness contrast result to obtain a segmentation result Th_n(x, y): Th_n(x, y) = 1 where the contrast value is at least T, and Th_n(x, y) = 0 otherwise; the detection device applies a morphological opening to the segmentation result Th_n(x, y) to eliminate isolated bright spots, remove part of the clutter, and fill holes, obtaining a processed segmentation result Th′_n(x, y); the detection device marks the regions with pixel value 1 in Th′_n(x, y) as regions of interest (ROI) and computes the center coordinates (x_n, y_n) of the current ROI; the detection device sets a motion threshold Mov and calculates the distance between the ROI centers of the n-th and (n+1)-th frames: D_n = sqrt[(x_{n+1} − x_n)² + (y_{n+1} − y_n)²], n = 1, ..., N−1; if D_n < Mov, the detection device associates the ROI regions of the n-th and (n+1)-th frames; the detection device extracts the mutually associated ROI regions, which are the cloud layer detection result.
Optionally, extracting, by the detection device, the energy features of the remote sensing images from the remote sensing image sequence to obtain the energy saliency map E_n(x, y) specifically comprises:
the detection device performs a Fourier transform on the remote sensing image sequence: s_n(ω_x, ω_y) = F[f_n(x, y)], n = 1, ..., N, where F denotes the Fourier transform operator and (ω_x, ω_y) are the frequency-domain coordinates; the detection device computes the amplitude of the Fourier transform and takes its logarithm to obtain the log spectrum L_n(ω_x, ω_y) = log|s_n(ω_x, ω_y)|, where |·| denotes the magnitude operator, and computes the phase spectrum P_n(ω_x, ω_y) = φ[s_n(ω_x, ω_y)], where φ denotes the phase operator; the detection device convolves the log spectrum with an m × m mean-filter template to obtain a smoothed spectrum V(ω_x, ω_y) = L_n(ω_x, ω_y) * h_m(ω_x, ω_y), where the mean-filter template h_m is the m × m matrix whose entries are all 1/m²; the detection device subtracts the smoothed spectrum from the log spectrum to obtain the spectral residual R(ω_x, ω_y) = L_n(ω_x, ω_y) − V(ω_x, ω_y); the detection device applies a two-dimensional inverse discrete Fourier transform to the spectral residual R(ω_x, ω_y) and the phase spectrum P(ω_x, ω_y) to obtain the energy saliency map E_n(x, y) = |F⁻¹{exp[R(ω_x, ω_y) + i·P(ω_x, ω_y)]}|².
Optionally, extracting, by the detection device, the texture features of the brightness contrast image to obtain the texture feature map G_n(x, y) specifically comprises:
the detection device constructs a Gabor filter g(x, y; λ, θ, ψ, σ, γ) = exp[−(x′² + γ²·y′²)/(2σ²)] · cos(2π·x′/λ + ψ), where x′ = a⁻ᵐ(x·cosθ + y·sinθ), y′ = a⁻ᵐ(−x·sinθ + y·cosθ), a⁻ᵐ is the scale factor, θ denotes the orientation of the kernel, λ the wavelength of the sinusoid, ψ the phase offset, σ the standard deviation of the Gaussian, and γ the aspect ratio of the function; the detection device convolves the filter with the brightness contrast image to obtain a filtering result, which is the texture feature map: G_n(x, y) = C_n(x, y) * g(x, y).
the technical scheme of the invention has the following beneficial effects: in the scheme, the detection device adopts an image processing mode to detect the cloud layer in the remote sensing image, so that the requirement on the sensor is reduced, and the position and the size of the cloud can be accurately detected; and the adopted algorithm is simple, the calculation efficiency is high, and the real-time requirement can be met.
The foregoing is a summary of the present invention. In order that the technical means of the invention may be clearly understood and implemented in accordance with this specification, preferred embodiments of the invention are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a flowchart of a cloud image detection method according to the present invention.
FIG. 2 is a set of infrared images containing cirrus clouds according to an example of the invention.
Fig. 3 is an energy saliency map for the infrared image of fig. 2.
Fig. 4 is a luminance contrast image of the energy saliency map of fig. 3.
Fig. 5 is a texture feature map of the luminance contrast image of fig. 4.
Fig. 6 shows the cloud layer detection results.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
Aiming at the problem that high-altitude cirrus clouds are difficult to detect accurately, the invention provides a cloud layer image detection method based on energy, texture, and motion features.
As shown in fig. 1, an embodiment of the present invention provides a cloud image detection method based on energy, texture, and motion characteristics, where the method is applied to a detection device, and specifically includes:
step 1: the detection device receives N frames of remote sensing images, and the images are recorded as a remote sensing image sequence, and the remote sensing image sequence comprisesThe function f can be used for a sequence of remote sensing imagesn(x, y), where N is 1. For example: fig. 2 is a group of infrared remote sensing images containing cirrus clouds.
Step 2: the detection device extracts the energy features of each frame of the remote sensing image sequence and obtains from them an energy saliency map, denoted E_n(x, y), n = 1, ..., N. For example, fig. 3 is the energy saliency map obtained by performing energy feature extraction on the cirrus-cloud infrared image of fig. 2.
The energy feature of each frame of the remote sensing image is the spectral residual feature. The specific steps are as follows:
21. The detection device performs a Fourier transform on the remote sensing image sequence:

s_n(ω_x, ω_y) = F[f_n(x, y)],  n = 1, ..., N,

where F denotes the Fourier transform operator and (ω_x, ω_y) are the frequency-domain coordinates.
22. The detection device computes the amplitude of the Fourier transform and takes its logarithm to obtain the log spectrum:

L_n(ω_x, ω_y) = log|s_n(ω_x, ω_y)|,

where |·| denotes the magnitude operator. At the same time, the phase spectrum is computed:

P_n(ω_x, ω_y) = φ[s_n(ω_x, ω_y)],

where φ denotes the phase operator.
23. The detection device convolves the log spectrum obtained in the previous step with an m × m mean-filter template to obtain a smoothed spectrum:

V(ω_x, ω_y) = L_n(ω_x, ω_y) * h_m(ω_x, ω_y),

where the mean-filter template h_m is the m × m matrix whose entries are all 1/m².
24. The detection device subtracts the smoothed spectrum from the log spectrum to obtain the spectral residual:

R(ω_x, ω_y) = L_n(ω_x, ω_y) − V(ω_x, ω_y).
25. The detection device applies a two-dimensional inverse discrete Fourier transform to the spectral residual R(ω_x, ω_y) together with the phase spectrum P(ω_x, ω_y) to obtain the energy saliency map:

E_n(x, y) = |F⁻¹{exp[R(ω_x, ω_y) + i·P(ω_x, ω_y)]}|².
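Steps 21 through 25 describe the spectral-residual saliency computation; a minimal numpy sketch of one frame is given below. The function name, the edge-padding choice for the mean filter, and the small epsilon guarding log(0) are illustrative assumptions, not details given in the patent.

```python
import numpy as np

def spectral_residual_saliency(img, m=3):
    """Spectral-residual energy saliency map (steps 21-25) for one frame.

    img: 2-D float array (one frame f_n).
    m:   side length of the mean-filter template h_m.
    """
    # 21: Fourier transform of the frame
    s = np.fft.fft2(img)
    # 22: log-amplitude spectrum and phase spectrum
    L = np.log(np.abs(s) + 1e-12)   # epsilon avoids log(0); an assumption
    P = np.angle(s)
    # 23: smooth the log spectrum with an m x m mean filter (edge padding)
    h = np.ones((m, m)) / (m * m)
    pad = m // 2
    Lp = np.pad(L, pad, mode="edge")
    V = np.zeros_like(L)
    for i in range(L.shape[0]):
        for j in range(L.shape[1]):
            V[i, j] = np.sum(Lp[i:i + m, j:j + m] * h)
    # 24: spectral residual
    R = L - V
    # 25: inverse transform of exp(R + iP), squared magnitude
    return np.abs(np.fft.ifft2(np.exp(R + 1j * P))) ** 2
```

For a frame that is flat except for one bright pixel, the residual spectrum is nearly zero everywhere, so the saliency map peaks exactly at the anomalous pixel, which is the behaviour the method exploits to highlight cirrus structure against smooth background.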
Step 3: the detection device calculates the brightness contrast of the energy saliency map to obtain a brightness contrast image, denoted C_n(x, y), n = 1, ..., N. For example, fig. 4 is the brightness contrast image obtained by performing the brightness contrast calculation on the energy saliency map of fig. 3.
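The patent does not spell out the brightness-contrast formula, so the sketch below assumes a Weber-style local contrast (each pixel compared against the mean of its w × w neighbourhood); the function name, window size, and epsilon are hypothetical choices for illustration only.

```python
import numpy as np

def local_brightness_contrast(sal, w=5):
    """One plausible local-contrast measure for step 3 (an assumption).

    sal: 2-D energy saliency map E_n.
    w:   side length of the local window.
    """
    pad = w // 2
    padded = np.pad(sal, pad, mode="edge")
    out = np.zeros_like(sal, dtype=float)
    for i in range(sal.shape[0]):
        for j in range(sal.shape[1]):
            window = padded[i:i + w, j:j + w]
            mu = window.mean()
            # Weber contrast: deviation from the local mean, normalised
            out[i, j] = abs(sal[i, j] - mu) / (mu + 1e-12)
    return out
```

Under this assumption a uniform saliency map yields zero contrast everywhere, while isolated bright structures (cloud edges) stand out strongly.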
Step 4: the detection device extracts the texture features of the brightness contrast image to obtain a texture feature map, denoted G_n(x, y), n = 1, ..., N. For example, fig. 5 is the texture feature map obtained by extracting texture features from the brightness contrast image of fig. 4.
The texture features can be represented by oriented Gabor features. The specific process for obtaining the texture feature map is as follows:
41. The detection device constructs a Gabor filter:

g(x, y; λ, θ, ψ, σ, γ) = exp[−(x′² + γ²·y′²) / (2σ²)] · cos(2π·x′/λ + ψ),

where x′ = a⁻ᵐ(x·cosθ + y·sinθ) and y′ = a⁻ᵐ(−x·sinθ + y·cosθ). Here a⁻ᵐ is the scale factor, θ denotes the orientation of the Gabor kernel, λ the wavelength of the sinusoid, ψ the phase offset, σ the standard deviation of the Gaussian envelope, and γ the aspect ratio of the function. Selecting different orientations θ yields a multi-orientation Gabor filter bank.
42. The detection device convolves the Gabor filter with the brightness contrast image to obtain a filtering result, which is the texture feature map:

G_n(x, y) = C_n(x, y) * g(x, y).
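Steps 41 and 42 can be sketched as follows. The kernel size, the default parameter values, and the function names are illustrative choices of this sketch; only the Gabor formula itself follows the text above.

```python
import numpy as np

def gabor_kernel(ksize=15, a_m=1.0, theta=0.0, lam=4.0,
                 psi=0.0, sigma=2.0, gamma=0.5):
    """Real-valued Gabor kernel of step 41.

    a_m plays the role of the scale factor a^{-m}; theta is the
    orientation, lam the sinusoid wavelength, psi the phase offset,
    sigma the Gaussian standard deviation, gamma the aspect ratio.
    """
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xp = a_m * (x * np.cos(theta) + y * np.sin(theta))
    yp = a_m * (-x * np.sin(theta) + y * np.cos(theta))
    return (np.exp(-(xp**2 + (gamma * yp)**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xp / lam + psi))

def gabor_filter(img, kernel):
    """Step 42: convolve the brightness-contrast image with the kernel."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    flipped = kernel[::-1, ::-1]   # flip for true convolution
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out
```

A multi-orientation bank as described in step 41 would simply call `gabor_kernel` for several values of `theta` and combine the responses, e.g. by taking the per-pixel maximum.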
Step 5: the detection device performs inter-frame association on the texture feature map using the image motion information, extracts the region of interest, and obtains the cloud layer detection result. For example, fig. 6 shows the final cirrus cloud detection result.
The process of step 5 above is described in detail below:
51. The detection device sets a brightness threshold T and performs threshold segmentation on the brightness contrast result to obtain a segmentation result, denoted Th_n(x, y):

Th_n(x, y) = 1 where the contrast value is at least T, and Th_n(x, y) = 0 otherwise.
52. The detection device applies a morphological opening to the segmentation result to eliminate isolated bright spots, remove part of the clutter, and fill holes; the processed segmentation result is denoted Th′_n(x, y). The regions with pixel value 1 in Th′_n(x, y) are marked as regions of interest (ROI), and the center coordinates of the current ROI, denoted (x_n, y_n), are computed.
53. The detection device sets a motion threshold Mov and calculates the distance between the ROI centers of the n-th and (n+1)-th frames:

D_n = sqrt[(x_{n+1} − x_n)² + (y_{n+1} − y_n)²],  n = 1, ..., N−1.
If D_n < Mov, the detection device associates the ROI regions of the n-th and (n+1)-th frames; if D_n ≥ Mov, the association fails and the detection device does not associate the ROI regions of the n-th and (n+1)-th frames.
54. The detection device extracts the mutually associated ROI regions; these associated ROI regions constitute the cirrus cloud detection result.
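Steps 51 through 54 can be sketched in miniature as follows. For brevity this sketch keeps a single ROI per frame (the centroid of all above-threshold pixels) and omits the morphological opening and multi-region labelling of step 52, so it illustrates only the thresholding and association logic; the function name and this simplification are mine, not the patent's.

```python
import numpy as np

def segment_and_associate(contrast_maps, T, Mov):
    """Threshold each map (step 51), take the centroid of the bright
    region as the ROI centre (x_n, y_n), and associate consecutive
    frames whose centre distance D_n is below Mov (steps 53-54).
    """
    centers = []
    for cm in contrast_maps:
        th = (cm > T).astype(np.uint8)        # segmentation Th_n(x, y)
        ys, xs = np.nonzero(th)
        if len(xs) == 0:
            centers.append(None)              # no ROI in this frame
            continue
        centers.append((xs.mean(), ys.mean()))  # ROI centre (x_n, y_n)
    associated = []
    for n in range(len(centers) - 1):
        if centers[n] is None or centers[n + 1] is None:
            associated.append(False)
            continue
        d = np.hypot(centers[n + 1][0] - centers[n][0],
                     centers[n + 1][1] - centers[n][1])
        associated.append(bool(d < Mov))      # D_n < Mov => associate
    return centers, associated
```

A slowly drifting cloud region is associated across frames, while a region that jumps farther than Mov pixels between frames (typical of clutter or unrelated detections) fails the association test and is discarded.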
In summary, due to the adoption of the technical scheme, the invention has the beneficial effects that:
the detection device adopts an image processing mode to detect the cirrus cloud in the remote sensing image, so that the requirement on the sensor is reduced, and the position and the size of the cirrus cloud can be accurately detected; and the adopted algorithm is simple, the calculation efficiency is high, and the real-time requirement can be met.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (3)

1. A cloud layer image detection method, characterized by comprising the following steps:
the detection device receives a remote sensing image sequence f_n(x, y) containing N frames of remote sensing images, where n = 1, ..., N is the frame index and N is the total number of frames;
the detection device extracts the energy features of the remote sensing images from the remote sensing image sequence to obtain an energy saliency map E_n(x, y);
the detection device calculates the brightness contrast of the energy saliency map to obtain a brightness contrast image C_n(x, y);
the detection device extracts the texture features of the brightness contrast image to obtain a texture feature map G_n(x, y);
the detection device performs inter-frame association on the texture feature map using image motion information, extracts a region of interest (ROI), and obtains a cloud layer detection result, wherein performing the inter-frame association, extracting the ROI, and obtaining the cloud layer detection result specifically comprises:
the detection device sets a brightness threshold T and performs threshold segmentation on the brightness contrast result to obtain a segmentation result Th_n(x, y):
Th_n(x, y) = 1 where the contrast value is at least T, and Th_n(x, y) = 0 otherwise;
the detection device applies a morphological opening to the segmentation result Th_n(x, y) to eliminate isolated bright spots, remove part of the clutter, and fill holes, obtaining a processed segmentation result Th′_n(x, y);
the detection device marks the regions with pixel value 1 in Th′_n(x, y) as regions of interest (ROI) and obtains the center coordinates (x_n, y_n) of the current ROI;
the detection device sets a motion threshold Mov and calculates the distance between the ROI centers of the n-th and (n+1)-th frames:
D_n = sqrt[(x_{n+1} − x_n)² + (y_{n+1} − y_n)²],  n = 1, ..., N−1;
if D_n < Mov, the detection device associates the ROI regions of the n-th and (n+1)-th frames;
the detection device extracts the mutually associated ROI regions, and the mutually associated ROI regions are the cloud layer detection result.
2. The method of claim 1, wherein extracting, by the detection device, the energy features of the remote sensing images from the remote sensing image sequence to obtain the energy saliency map E_n(x, y) specifically comprises:
the detection device performs a Fourier transform on the remote sensing image sequence:
s_n(ω_x, ω_y) = F[f_n(x, y)],  n = 1, ..., N,
where F denotes the Fourier transform operator and (ω_x, ω_y) are the frequency-domain coordinates;
the detection device computes the amplitude of the Fourier transform and takes its logarithm to obtain the log spectrum:
L_n(ω_x, ω_y) = log|s_n(ω_x, ω_y)|,
where |·| denotes the magnitude operator, and computes the phase spectrum:
P_n(ω_x, ω_y) = φ[s_n(ω_x, ω_y)],
where φ denotes the phase operator;
the detection device convolves the log spectrum with an m × m mean-filter template to obtain a smoothed spectrum:
V(ω_x, ω_y) = L_n(ω_x, ω_y) * h_m(ω_x, ω_y),
where the mean-filter template h_m is the m × m matrix whose entries are all 1/m²;
the detection device subtracts the smoothed spectrum from the log spectrum to obtain the spectral residual R(ω_x, ω_y):
R(ω_x, ω_y) = L_n(ω_x, ω_y) − V(ω_x, ω_y);
the detection device applies a two-dimensional inverse discrete Fourier transform to the spectral residual R(ω_x, ω_y) and the phase spectrum P(ω_x, ω_y) to obtain the energy saliency map:
E_n(x, y) = |F⁻¹{exp[R(ω_x, ω_y) + i·P(ω_x, ω_y)]}|².
3. The method of claim 1, wherein extracting, by the detection device, the texture features of the brightness contrast image to obtain the texture feature map G_n(x, y) specifically comprises:
the detection device constructs a Gabor filter:
g(x, y; λ, θ, ψ, σ, γ) = exp[−(x′² + γ²·y′²) / (2σ²)] · cos(2π·x′/λ + ψ),
where x′ = a⁻ᵐ(x·cosθ + y·sinθ) and y′ = a⁻ᵐ(−x·sinθ + y·cosθ); a⁻ᵐ is the scale factor, θ denotes the orientation of the kernel, λ the wavelength of the sinusoid, ψ the phase offset, σ the standard deviation of the Gaussian, and γ the aspect ratio of the function;
the detection device convolves the filter with the brightness contrast image to obtain a filtering result, and the filtering result is the texture feature map:
G_n(x, y) = C_n(x, y) * g(x, y).
CN201710345541.1A 2017-05-16 2017-05-16 Cloud layer image detection method Active CN107273803B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710345541.1A CN107273803B (en) 2017-05-16 2017-05-16 Cloud layer image detection method


Publications (2)

Publication Number Publication Date
CN107273803A CN107273803A (en) 2017-10-20
CN107273803B (en) 2020-04-24

Family

ID=60065156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710345541.1A Active CN107273803B (en) 2017-05-16 2017-05-16 Cloud layer image detection method

Country Status (1)

Country Link
CN (1) CN107273803B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108648184A (en) * 2018-05-10 2018-10-12 电子科技大学 A kind of detection method of remote sensing images high-altitude cirrus
CN110667847B (en) * 2019-10-17 2020-08-18 安徽省徽腾智能交通科技有限公司泗县分公司 Unmanned aerial vehicle intelligent flying height control platform
CN111967508A (en) * 2020-07-31 2020-11-20 复旦大学 Time series abnormal point detection method based on saliency map
CN111812106B (en) * 2020-09-15 2020-12-08 沈阳风驰软件股份有限公司 Method and system for detecting glue overflow of appearance surface of wireless earphone

Citations (2)

Publication number Priority date Publication date Assignee Title
CN103093241A (en) * 2013-01-23 2013-05-08 北京理工大学 Optical remote sensing image non-homogeneous cloud layer discriminating method based on homogenization processing
CN105868745A (en) * 2016-06-20 2016-08-17 重庆大学 Weather identifying method based on dynamic scene perception

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN103052962B (en) * 2010-11-24 2016-01-27 印度统计学院 The classification of rough wavelet granular space and multi-spectral remote sensing image


Also Published As

Publication number Publication date
CN107273803A (en) 2017-10-20

Similar Documents

Publication Publication Date Title
CN107273803B (en) Cloud layer image detection method
US20210049769A1 (en) Vibe-based three-dimensional sonar point cloud image segmentation method
CN103325112B (en) Moving target method for quick in dynamic scene
WO2015180527A1 (en) Image saliency detection method
CN109919960B (en) Image continuous edge detection method based on multi-scale Gabor filter
US20120328161A1 (en) Method and multi-scale attention system for spatiotemporal change determination and object detection
CN105184804B (en) Small targets detection in sea clutter method based on Airborne IR camera Aerial Images
CN103871039B (en) Generation method for difference chart in SAR (Synthetic Aperture Radar) image change detection
CN108765327B (en) Image rain removing method based on depth of field and sparse coding
CN102360503B (en) SAR (Specific Absorption Rate) image change detection method based on space approach degree and pixel similarity
CN108257153B (en) Target tracking method based on direction gradient statistical characteristics
CN110110618B (en) SAR target detection method based on PCA and global contrast
CN111161222A (en) Printing roller defect detection method based on visual saliency
CN111145121B (en) Confidence term filter target tracking method for strengthening multi-feature fusion
CN107516322A (en) A kind of image object size based on logarithm pole space and rotation estimation computational methods
CN102175993A (en) Radar scene matching feature reference map preparation method based on satellite SAR (synthetic aperture radar) images
CN109800713A (en) The remote sensing images cloud detection method of optic increased based on region
CN104951765A (en) Remote sensing image target division method based on shape priori information and vision contrast ratio
CN116863357A (en) Unmanned aerial vehicle remote sensing dyke image calibration and intelligent segmentation change detection method
CN103646397B (en) Real-time synthetic aperture perspective imaging method based on multisource data fusion
CN109767442B (en) Remote sensing image airplane target detection method based on rotation invariant features
CN112884795A (en) Power transmission line inspection foreground and background segmentation method based on multi-feature significance fusion
Fu et al. A noise-resistant superpixel segmentation algorithm for hyperspectral images
CN106778822B (en) Image straight line detection method based on funnel transformation
Wu et al. Research on crack detection algorithm of asphalt pavement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant