CN115512231B - Remote sensing interpretation method suitable for homeland space ecological restoration - Google Patents

Remote sensing interpretation method suitable for homeland space ecological restoration

Info

Publication number
CN115512231B
CN115512231B CN202211420132.0A CN202211420132A CN115512231B CN 115512231 B CN115512231 B CN 115512231B CN 202211420132 A CN202211420132 A CN 202211420132A CN 115512231 B CN115512231 B CN 115512231B
Authority
CN
China
Prior art keywords
gray level
gray
interval
pixel point
remote sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211420132.0A
Other languages
Chinese (zh)
Other versions
CN115512231A (en)
Inventor
孙振喜
王萌
苏彬
李晋
生海迪
王军
孙文胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Institute Of Land And Spatial Data And Remote Sensing Technology Shandong Sea Area Dynamic Monitoring And Monitoring Center
Original Assignee
Shandong Institute Of Land And Spatial Data And Remote Sensing Technology Shandong Sea Area Dynamic Monitoring And Monitoring Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Institute Of Land And Spatial Data And Remote Sensing Technology Shandong Sea Area Dynamic Monitoring And Monitoring Center filed Critical Shandong Institute Of Land And Spatial Data And Remote Sensing Technology Shandong Sea Area Dynamic Monitoring And Monitoring Center
Priority to CN202211420132.0A priority Critical patent/CN115512231B/en
Publication of CN115512231A publication Critical patent/CN115512231A/en
Application granted granted Critical
Publication of CN115512231B publication Critical patent/CN115512231B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/36Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/50Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Nonlinear Science (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of image processing, and in particular to a remote sensing interpretation method suitable for ecological restoration of homeland space, comprising the following steps: acquiring gray level intervals; obtaining, for each gray level, the contrast between its corresponding pixel points and the pixel points in their 8-neighborhoods, according to the number of pixel points corresponding to the gray level and the gray values of the pixel points in those 8-neighborhoods; obtaining the contrast within each gray level interval; obtaining the expected enhancement effect of the gray levels in each gray level interval; obtaining an optimal division mode according to the expected enhancement effect of the gray levels in each divided gray level interval, performing histogram equalization on the gray level histogram of the remote sensing image of the ecological restoration area according to the optimal division mode to obtain an optimal enhanced image, and judging the ecological restoration effect of the homeland space according to the optimal enhanced image. The invention improves the efficiency and capability of natural resource supervision.

Description

Remote sensing interpretation method suitable for homeland space ecological restoration
Technical Field
The invention relates to the technical field of image processing, in particular to a remote sensing interpretation method suitable for ecological restoration of a homeland space.
Background
Existing telemetry instruments carried on artificial earth satellites sense and measure the earth's surface remotely for resource monitoring and management, which makes it possible to monitor and manage the ecological restoration of homeland space. However, because of the influence of the atmosphere, cloud and mist, remote sensing images suffer from low contrast and poor visual effect, which directly affects the efficiency and capability of natural resource supervision. Contrast enhancement therefore needs to be performed on the acquired remote sensing images. When the traditional histogram equalization algorithm is used for image enhancement, gray levels with a small number of pixel points in the remote sensing image are merged; this improves the contrast of the remote sensing image but greatly reduces the number of its gray levels, so the information entropy of the remote sensing image decreases, local details are lost, and the efficiency and capability of natural resource supervision are affected.
Disclosure of Invention
The invention provides a remote sensing interpretation method suitable for ecological restoration of homeland space, in order to solve the problem that local details of a remote sensing image are lost when existing histogram equalization is applied.
The remote sensing interpretation method suitable for the ecological restoration of the homeland space adopts the following technical scheme:
acquiring a gray level image and a gray level histogram of a remote sensing image of an ecological restoration area, and dividing the gray level histogram multiple times according to the number of peaks of the gray level histogram to obtain a plurality of gray level intervals after each division;
acquiring the number of pixel points corresponding to each gray level and the gray values of the pixel points in the 8-neighborhoods of the pixel points corresponding to each gray level, and obtaining the contrast between the pixel points corresponding to each gray level and the pixel points in their 8-neighborhoods according to the number of pixel points corresponding to the gray level and those 8-neighborhood gray values;
obtaining the contrast within each gray level interval according to all the gray levels in each gray level interval and the contrast between the pixel points corresponding to each gray level in the corresponding gray level interval and the pixel points in their 8-neighborhoods;
obtaining the expected enhancement effect of the gray levels in each gray level interval according to the contrast within the interval, the number of pixel points corresponding to each gray level in the interval, and the number of gray levels in the interval;
obtaining an optimal division mode according to the expected enhancement effect of the gray levels in each divided gray level interval, dividing the gray level histogram of the remote sensing image of the ecological restoration area according to the optimal division mode to obtain a plurality of target gray level intervals, enhancing each target gray level interval by histogram equalization to obtain an optimal enhanced image, and judging the ecological restoration effect of the homeland space according to the optimal enhanced image.
Further, the plurality of gray level intervals after each division are determined as follows:
taking 255 as the numerator and the number of peaks of the gray level histogram as the denominator, and obtaining the ratio of the numerator to the denominator;
multiplying the ratio of the numerator to the denominator by the division index to obtain the interval length of that division, and dividing the gray level histogram according to this interval length to obtain a plurality of divided gray level intervals, wherein the gray level histogram is divided ⌊n/2⌋ times, n represents the number of peak points in the gray level histogram, and the number of gray level intervals obtained by each division is different.
Further, the specific expression of the contrast between the pixel points corresponding to each gray level and the pixel points in their 8-neighborhoods is:

C_x = \frac{1}{N_x} \sum_{g=1}^{N_x} \left( \frac{1}{8} \sum_{h=1}^{8} \left| I_g - I_{g,h} \right| \right)

in the formula: C_x represents the contrast between the pixel points corresponding to the gray level x and the pixel points in their 8-neighborhoods, I_g represents the gray value of the g-th pixel point whose gray level is x, I_{g,h} represents the gray value of the h-th neighborhood pixel point in the 8-neighborhood of that pixel point, N_x represents the number of pixel points whose gray level is x, 8 represents the size of the 8-neighborhood, and g indexes the pixel points whose gray level is x.
Further, the contrast within a gray level interval is determined as follows:
accumulating the contrasts between the pixel points corresponding to each gray level in the gray level interval and the pixel points in their 8-neighborhoods, and calculating the average value, which is taken as a first average value;
acquiring the differences between adjacent gray levels in the gray level interval, accumulating all the differences obtained in the gray level interval, and calculating the average value, which is taken as a second average value;
and subtracting the second average value from the first average value in the gray level interval to obtain the contrast within the gray level interval.
Further, the expected enhancement effect of the gray levels in a gray level interval is determined as follows:
acquiring the variance of the numbers of pixel points corresponding to the gray levels in the gray level interval;
acquiring the differences between the numbers of pixel points corresponding to adjacent gray levels in the gray level interval, accumulating all the differences obtained in the gray level interval, and calculating the average value, which is taken as a third average value;
and taking the contrast within the gray level interval as the numerator, and the product of the third average value and the variance of the numbers of pixel points corresponding to the gray levels as the denominator, to obtain the expected enhancement effect of the gray levels in the gray level interval.
Further, an average value of expected enhancement effects of the gray levels in all the divided gray level intervals is obtained, and the division mode corresponding to the maximum average value is used as the optimal division mode.
Further, judging the ecological restoration effect of the homeland space according to the optimal enhanced image further comprises:
acquiring remote sensing images of the same ecological restoration area collected in the same period year by year;
obtaining the optimal enhanced image of each remote sensing image of the same ecological restoration area collected in the same period year by year;
and obtaining the ecological restoration effect of the homeland space according to the number of pixel points of the forest region in the optimal enhanced images of the remote sensing images of the same ecological restoration area in the same period year by year.
The invention has the beneficial effects that: according to the method, the initial length of the gray level intervals of the gray level histogram is obtained from the number of peak points of the gray level histogram of the remote sensing image. If the divided gray level intervals are too small in range, the merging and widening effect on the gray levels is poor and the contrast enhancement of the remote sensing image is insufficient; therefore, on the basis of the preliminary division by peak points, the highest peak point within each preliminary interval is counted and the intervals are divided again according to adjacent highest peak points, which avoids, to a certain extent, the problem of divided gray level intervals being too small. Furthermore, in order to make the final enhancement effect of the remote sensing image optimal, the overall expected enhancement effect of the gray level intervals after each division is calculated. This step realizes adaptive division of the gray level intervals and ensures that the final equalized effect of each remote sensing image is optimal: the gray level histogram of the gray level image is adaptively divided into optimal intervals, and equalization enhancement is then performed on each interval, which reduces the number of gray levels merged during equalization, protects the detailed parts of the image, and ensures the enhancement effect of the remote sensing image, so that the judgment of the ecological restoration effect made from the remote sensing image is more accurate;
meanwhile, because the optimal enhanced image is obtained, the judgment efficiency can be improved when the ecological restoration effect of the homeland space is judged from it, and the efficiency and capability of natural resource supervision are ultimately improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flowchart of an embodiment of the remote sensing interpretation method applicable to the ecological restoration of the homeland space.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the remote sensing interpretation method applicable to the ecological restoration of the homeland space, as shown in figure 1, comprises the following steps:
s1, obtaining a gray level image and a gray level histogram of a remote sensing image of an ecological restoration area, and dividing the gray level histogram for multiple times according to the number of wave crests of the gray level histogram to obtain multiple gray level intervals after each division.
Because the remote sensing image has the problems of low contrast and poor visual effect, image enhancement processing is required, and the enhancement of the remote sensing image is realized by improving the histogram equalization algorithm.
The method comprises the following specific steps of obtaining a gray level image and a gray level histogram of a remote sensing image of an ecological restoration area: firstly, collecting remote sensing images of the same ecological restoration area in the same period every year, then carrying out graying processing on the collected remote sensing images, then carrying out denoising processing by using self-adaptive median filtering, and then counting a gray level histogram of the remote sensing images to obtain the gray level image and the gray level histogram of the remote sensing images of the ecological restoration area. It should be noted that all the grayscale images and grayscale histograms appearing hereinafter refer to the grayscale images and grayscale histograms of the same year in the same ecological restoration area after the remote sensing image is denoised.
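As a concrete illustration of this preprocessing, the following minimal Python sketch (OpenCV and NumPy assumed; the file name is a hypothetical placeholder, and a plain 3x3 median filter stands in for the adaptive median filtering named in the text) produces the gray level image and gray level histogram used in the later steps:

import cv2
import numpy as np

def preprocess(path):
    # Read the remote sensing image and convert it to a gray level image.
    img = cv2.imread(path, cv2.IMREAD_COLOR)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Denoise; a fixed-size median filter is used here as a simple stand-in
    # for the adaptive median filtering described above.
    gray = cv2.medianBlur(gray, 3)
    # Gray level histogram: number of pixel points at each of the 256 levels.
    hist = np.bincount(gray.ravel(), minlength=256)
    return gray, hist

gray, hist = preprocess("restoration_area_2022.tif")  # hypothetical file name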
The basic principle of histogram equalization is to widen the gray levels with a large number of pixel points in the gray image and merge the gray levels with a small number of pixel points, so that the image contrast is increased and the image enhancement is achieved.
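A toy numerical illustration of this principle (the pixel counts below are invented purely for illustration): with the standard cumulative-distribution mapping, adjacent gray levels holding many pixel points are pushed apart, while runs of gray levels holding few pixel points collapse onto a single output level, which is exactly the merging described above.

import numpy as np

counts = np.array([4000, 3000, 500, 10, 10, 480], dtype=float)  # pixels at toy levels 0..5
cdf = counts.cumsum() / counts.sum()
mapped = np.round(cdf * 5).astype(int)  # equalized output levels, still in 0..5
print(mapped)  # [2 4 5 5 5 5]: crowded levels 0 and 1 spread apart, levels 2..5 merge into one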
Firstly, curve fitting is performed on the gray level histogram. Because the environment in the remote sensing image is complex and changeable, the curve fitted to the gray level histogram has a plurality of peaks. The abscissa value of each peak point, which is a gray level of the image, is counted, giving the gray level set X = {x_1, x_2, …, x_n}, where n represents the number of peak points in the gray level histogram.
Because the gray levels within each single local peak of the gray level histogram are adjacent and the numbers of pixel points corresponding to those gray levels are relatively similar before histogram equalization, dividing the gray level histogram into intervals that each contain a single local peak can effectively reduce the number of gray levels lost after histogram equalization. However, if the range of a gray level interval is too small, the merging and widening effect on the gray levels is poor and the contrast enhancement of the image is insufficient. Therefore, on the basis of the single-local-peak intervals, the gray level range of the divided intervals is gradually increased from small to large, a plurality of gray level interval division modes of the gray level histogram are obtained, and the division mode with the optimal histogram equalization enhancement effect is selected.
The specific steps for obtaining the plurality of gray level intervals after each division are as follows: taking 255 as the numerator and the number of peaks of the gray level histogram as the denominator, the ratio of the numerator to the denominator is obtained; the interval length of a division is obtained by multiplying this ratio by the division index, and the gray level histogram is divided according to this interval length to obtain a plurality of divided gray level intervals. The gray level histogram is divided ⌊n/2⌋ times, and the number of gray level intervals obtained by each division is different. The specific expression of the interval length is:

d_m = \left\lfloor \frac{255}{n} \right\rfloor \times m

in the formula: d_m represents the length of each gray level interval after the m-th division, m denotes the m-th division mode with m = 1, 2, …, ⌊n/2⌋, n represents the number of peak points in the gray level histogram, \lfloor \cdot \rfloor indicates rounding down, and 255 indicates the maximum of the gray level distribution range of the remote sensing image.
When the interval length d_m reaches ⌊255/2⌋, the gray levels of the gray level histogram are equally divided into two intervals and the number of divided gray level intervals is smallest; therefore the maximum value of m is ⌊n/2⌋. Taking m = 1 as an example, the interval length of the gray level intervals of the gray level histogram is d_1 = ⌊255/n⌋, and the gray levels of the gray level histogram are preliminarily divided into the intervals [0, d_1], [d_1, 2d_1], …, [(n-1)d_1, 255]. The gray levels corresponding to the peak points in each preliminary gray level interval are counted in turn from left to right; if there is no peak point in a gray level interval it is not counted, and if there are several peak points in a gray level interval, the gray level of the peak point with the largest ordinate, i.e. the largest number of pixel points, is taken. This gives the peak point set for m = 1, B = {b_1, b_2, …, b_{n_1}}, where n_1 indicates the number of selected peak points.

Thus, for m = 1, the gray level intervals of the gray level histogram are divided into [0, (b_1+b_2)/2], [(b_1+b_2)/2, (b_2+b_3)/2], …, [(b_{n_1-1}+b_{n_1})/2, 255], where (b_k+b_{k+1})/2 represents the mean of the gray levels corresponding to the selected k-th and (k+1)-th peak points, i.e. the trough position between the two peak points, so that each gray level interval of the divided gray level histogram contains one local peak point and the numbers of pixel points corresponding to the gray levels within each interval are similar.

In the same way, the gray level interval division of the gray level histogram is completed for m = 2, 3, …, ⌊n/2⌋.

To this end, ⌊n/2⌋ divisions of the gray levels of the gray level histogram are completed, and ⌊n/2⌋ gray level interval division modes are obtained.
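The division procedure above can be sketched in Python as follows (NumPy and SciPy assumed; simple moving-average smoothing plus scipy.signal.find_peaks stands in for the curve fitting and peak detection, whose exact form the text does not fix, and the helper names are illustrative):

import numpy as np
from scipy.signal import find_peaks

def peak_gray_levels(hist):
    # Smooth the histogram (stand-in for curve fitting) and take the abscissas
    # of its peaks as the peak gray levels x_1 .. x_n.
    smooth = np.convolve(hist, np.ones(5) / 5.0, mode="same")
    peaks, _ = find_peaks(smooth)
    return peaks

def division_modes(hist):
    peaks = peak_gray_levels(hist)
    n = len(peaks)
    modes = []
    for m in range(1, n // 2 + 1):
        d = (255 // n) * m                      # interval length of the m-th division
        edges = list(range(0, 256, d)) + [256]  # preliminary intervals [0,d], [d,2d], ...
        # In each preliminary interval keep only the peak with the most pixel points.
        selected = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            inside = [p for p in peaks if lo <= p < hi]
            if inside:
                selected.append(max(inside, key=lambda p: hist[p]))
        # Interval bounds are the troughs, i.e. midpoints of adjacent selected peaks.
        bounds = [0] + [(a + b) // 2 for a, b in zip(selected[:-1], selected[1:])] + [256]
        modes.append(list(zip(bounds[:-1], bounds[1:])))
    return modes

Each element of modes is one division mode, a list of (low, high) gray level interval bounds; all ⌊n/2⌋ modes are kept and scored in the following steps.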
S2, acquiring the number of pixel points corresponding to each gray level and the gray values of the pixel points in the 8-neighborhoods of the pixel points corresponding to each gray level, and obtaining the contrast between the pixel points corresponding to each gray level and the pixel points in their 8-neighborhoods according to the number of pixel points corresponding to the gray level and those 8-neighborhood gray values.
Given that the remote sensing image of the ecological restoration area is influenced by the atmosphere, cloud and mist, the remote sensing image suffers from low contrast. After the gray level histogram is divided into gray level intervals in step S1, if the collected remote sensing image is less affected by the atmosphere, cloud and mist and the contrast within a divided interval is already high, the numbers of pixel points corresponding to the gray levels in that interval change gradually; in this case the contrast should be improved after histogram equalization while the loss of gray levels is reduced and the image details are protected.
Therefore, the contrast between the pixel points corresponding to each gray level in the gray level histogram of the remote sensing image before enhancement (i.e. without histogram equalization) and the pixel points in their 8-neighborhoods needs to be calculated. The gray levels in the gray level histogram and the number of pixel points corresponding to each gray level are counted, giving a gray level set X = {x_1, x_2, …, x_t} and a set of corresponding pixel point numbers N = {N_{x_1}, N_{x_2}, …, N_{x_t}}, where t denotes the number of gray levels within the gray level image and a gray level x corresponds to N_x pixel points. It is known that the larger the difference between different pixel points, the more obvious the contrast; therefore, taking the gray level x as an example, the contrast C_x between the pixel points corresponding to this gray level and the pixel points in their 8-neighborhoods is calculated as:

C_x = \frac{1}{N_x} \sum_{g=1}^{N_x} \left( \frac{1}{8} \sum_{h=1}^{8} \left| I_g - I_{g,h} \right| \right)

in the formula: C_x represents the contrast between the pixel points corresponding to the gray level x and the pixel points in their 8-neighborhoods, I_g represents the gray value of the g-th pixel point whose gray level is x, I_{g,h} represents the gray value of the h-th neighborhood pixel point in the 8-neighborhood of that pixel point, N_x represents the number of pixel points whose gray level is x, 8 represents the size of the 8-neighborhood, and g indexes the pixel points whose gray level is x.

Wherein, \frac{1}{8} \sum_{h=1}^{8} \left| I_g - I_{g,h} \right| represents the mean gray difference between the g-th pixel point with gray level x and the pixel points in its 8-neighborhood; the larger this mean value, the larger the difference between the pixel point and its 8-neighborhood pixel points and the larger the contrast of that pixel point. The formula therefore expresses the mean of the gray differences between all the pixel points with gray level x and their 8-neighborhoods, so the larger C_x is, the greater the contrast between the pixel points corresponding to the gray level x and the pixel points in their 8-neighborhoods.
Similarly, the contrast between the pixel points corresponding to every other gray level in the gray level set and the pixel points in their 8-neighborhoods is calculated, so that the contrast between the pixel points corresponding to each gray level and the pixel points in their 8-neighborhoods is obtained.
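A minimal NumPy sketch of this per-gray-level contrast C_x (border pixels are skipped for simplicity, an implementation choice the text does not specify):

import numpy as np

def gray_level_contrast(gray):
    # For every gray level x, average over its pixel points the mean absolute
    # gray difference to the 8-neighborhood, i.e. C_x in the formula above.
    h, w = gray.shape
    g = gray.astype(np.int32)
    sums = np.zeros(256)
    counts = np.zeros(256)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            diff = sum(abs(g[i, j] - g[i + di, j + dj]) for di, dj in offsets) / 8.0
            sums[g[i, j]] += diff
            counts[g[i, j]] += 1
    return np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)

contrast_per_level = gray_level_contrast(gray)  # index x gives C_x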
And S3, obtaining the contrast within each gray level interval according to all the gray levels in each gray level interval and the contrast between the pixel points corresponding to each gray level in the corresponding gray level interval and the pixel points in their 8-neighborhoods.
The specific steps for obtaining the contrast within a gray level interval are as follows: taking m = 1 as an example, the contrasts between the pixel points corresponding to each gray level in each gray level interval after the first division and the pixel points in their 8-neighborhoods are accumulated and averaged, and the average value is taken as a first average value; the differences between adjacent gray levels in each gray level interval are acquired, all the differences obtained in each gray level interval are accumulated and averaged, and the average value is taken as a second average value; the second average value is subtracted from the first average value in each gray level interval to obtain the contrast within the gray level interval. The specific expression of the contrast within a gray level interval is:

D = \frac{1}{b} \sum_{q=1}^{b} C_{x_q} - \frac{1}{b-1} \sum_{q=1}^{b-1} \left( x_{q+1} - x_q \right)

in the formula: D represents the contrast within any gray level interval after the first division, b represents the number of gray levels in the gray level interval after the first division, C_{x_q} represents the contrast between the pixel points corresponding to the q-th gray level in the gray level interval and the pixel points in their 8-neighborhoods, x_{q+1} represents the (q+1)-th gray level in the gray level interval, and x_q represents the q-th gray level in the gray level interval.

Wherein, x_{q+1} - x_q represents the difference between adjacent gray levels in the gray level interval, so \frac{1}{b-1} \sum_{q=1}^{b-1} \left( x_{q+1} - x_q \right) represents the average spacing between the gray levels in the divided gray level interval; \frac{1}{b} \sum_{q=1}^{b} C_{x_q} represents the average contrast between the pixel points in the divided gray level interval and the pixel points in their 8-neighborhoods, and the larger this value, the greater the contrast within the gray level interval. Therefore, the larger the value of D, the greater the contrast within the interval.
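A small sketch of the interval contrast D under this reading (contrast_per_level is the C_x array from the previous sketch; treating a gray level interval as the set of occupied gray levels in [lo, hi) is an assumption):

import numpy as np

def interval_contrast(hist, contrast_per_level, lo, hi):
    # Gray levels actually present in the interval [lo, hi).
    levels = np.array([x for x in range(lo, hi) if hist[x] > 0])
    if len(levels) < 2:
        return 0.0
    first_mean = contrast_per_level[levels].mean()  # mean of C_{x_q} over the interval
    second_mean = np.diff(levels).mean()            # mean difference of adjacent gray levels
    return first_mean - second_mean                 # first average minus second average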
And S4, obtaining an expected enhancement effect of the gray level in each gray level interval according to the contrast in each gray level interval, the number of pixel points corresponding to each gray level in each gray level interval and the number of gray levels in each gray level interval.
The basic principle of histogram equalization is to widen the gray levels with a large number of pixel points in the gray level image and to merge the gray levels with a small number of pixel points, so that the image contrast is increased and image enhancement is achieved. Therefore, the larger the contrast within a selected gray level interval, the less the gray levels need to be merged in large quantities to achieve a contrast-enhancing effect; and the smaller the change of the numbers of pixel points corresponding to the gray levels in a gray level interval, the fewer the gray levels with small numbers of pixel points in that interval that are merged, so that the number of gray levels lost after histogram equalization is reduced, the image details are protected, and the enhancement effect of the gray level interval after histogram equalization is ultimately better.
Therefore, taking m = 1 as an example, according to the contrast within any gray level interval of the first division and the change of the numbers of pixel points corresponding to the gray levels in that interval, the expected enhancement effect E after histogram equalization of the gray level interval is calculated. The specific steps for obtaining the expected enhancement effect of the gray levels in each gray level interval are as follows: the variance of the numbers of pixel points corresponding to the gray levels in the gray level interval is acquired; the differences between the numbers of pixel points corresponding to adjacent gray levels in the gray level interval are acquired, all the differences obtained in the gray level interval are accumulated and averaged, and the average value is taken as a third average value; the contrast within the gray level interval is taken as the numerator, and the product of the third average value and the variance of the numbers of pixel points corresponding to the gray levels is taken as the denominator, to obtain the expected enhancement effect of the gray levels in the gray level interval. The specific expression of the expected enhancement effect of the gray levels in a gray level interval is:

E = \frac{D}{\sigma^2 \cdot \frac{1}{b-1} \sum_{q=1}^{b-1} \left| N_{x_{q+1}} - N_{x_q} \right|}

in the formula: D represents the contrast within any gray level interval after the first division, \sigma^2 represents the variance of the numbers of pixel points corresponding to the gray levels in the gray level interval, E represents the expected enhancement effect of the gray levels within the gray level interval, b represents the number of gray levels in the gray level interval after the first division, N_{x_{q+1}} represents the number of pixel points corresponding to the (q+1)-th gray level in the gray level interval, and N_{x_q} represents the number of pixel points corresponding to the q-th gray level in the gray level interval.

Wherein, \sigma^2 expresses the variance of the numbers of pixel points corresponding to the gray levels in the gray level interval; the variance expresses the uniformity of the data, so the smaller the variance, the smaller the overall change of the numbers of pixel points corresponding to the gray levels in the interval. \left| N_{x_{q+1}} - N_{x_q} \right| expresses the difference between the numbers of pixel points corresponding to two adjacent gray levels in the interval, so \frac{1}{b-1} \sum_{q=1}^{b-1} \left| N_{x_{q+1}} - N_{x_q} \right| expresses the overall difference of the numbers of pixel points corresponding to adjacent gray levels in the interval; the smaller this value, the smaller the change of the numbers of pixel points corresponding to adjacent gray levels in the interval. D expresses the contrast within the gray level interval. Therefore, the smaller the denominator and the larger the numerator in the formula, the larger the contrast within the gray level interval and the smaller the change of the numbers of pixel points corresponding to the gray levels, and the better the expected enhancement effect E of the gray levels in the interval after histogram equalization.

In the same way, the expected enhancement effect of the gray levels in each gray level interval after the first division is obtained, giving a set {E_1, E_2, …, E_{n_1}}, where n_1 represents the number of peak points selected on the gray level histogram when m = 1, that is, the number of gray level intervals after the first division of the gray level histogram. In the same way, the expected enhancement effect of the gray levels in each gray level interval after each division can be obtained.
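A corresponding sketch of the expected enhancement effect E of a single interval (it reuses interval_contrast from the previous sketch; the guard for intervals with fewer than two occupied gray levels is an added assumption, since the text does not treat that case):

import numpy as np

def expected_enhancement(hist, contrast_per_level, lo, hi):
    levels = np.array([x for x in range(lo, hi) if hist[x] > 0])
    if len(levels) < 2:
        return 0.0
    pixel_counts = hist[levels]
    d = interval_contrast(hist, contrast_per_level, lo, hi)  # contrast D of the interval
    variance = pixel_counts.var()                            # variance of the pixel counts
    third_mean = np.abs(np.diff(pixel_counts)).mean()        # mean |N_{x_{q+1}} - N_{x_q}|
    denom = variance * third_mean
    return d / denom if denom > 0 else 0.0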
And S5, obtaining an optimal division mode according to the expected enhancement effect of the gray level in each divided gray level interval, dividing a gray level histogram of the remote sensing image of the ecological restoration area according to the optimal division mode to obtain a plurality of target gray level intervals, enhancing each target gray level interval by utilizing histogram equalization to obtain an optimal enhancement image, and judging the ecological restoration effect of the homeland space according to the optimal enhancement image.
The specific steps for obtaining the optimal division mode are as follows: taking m = 1 as an example, the overall expected enhancement effect of the gray level intervals after the first division is calculated; specifically, the average value of the expected enhancement effects of the gray levels in all the gray level intervals after the first division is obtained and taken as the overall expected enhancement effect \bar{E}_1 of the gray level intervals after the first division. In the same way, the overall expected enhancement effect of the gray level intervals after each division is obtained, giving the set {\bar{E}_1, \bar{E}_2, …, \bar{E}_{⌊n/2⌋}}. The maximum value in this set is taken as \bar{E}_w; the w-th gray level histogram division mode corresponding to \bar{E}_w is the optimal division mode. Therefore, the gray level interval division mode of the gray level histogram when m = w in step S1 is selected, the target gray level intervals after the w-th division are respectively enhanced by the histogram equalization algorithm to obtain the enhanced target gray level intervals, and the optimal enhanced image is obtained from the enhanced target gray level intervals.
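Putting the pieces together, the following sketch scores every division mode produced by division_modes above by the mean expected enhancement effect of its intervals and then equalizes each target interval separately; equalizing only the pixels whose gray levels fall inside an interval is one reasonable reading of this step, not necessarily the patent's exact construction:

import numpy as np

def equalize_interval(gray, out, lo, hi):
    # Classic CDF mapping restricted to the gray levels of one target interval.
    mask = (gray >= lo) & (gray < hi)
    if not mask.any():
        return
    hist = np.bincount(gray[mask].ravel(), minlength=256)[lo:hi]
    cdf = hist.cumsum() / hist.sum()
    out[mask] = lo + np.round(cdf[gray[mask] - lo] * (hi - 1 - lo)).astype(gray.dtype)

def best_enhanced_image(gray, hist, contrast_per_level, modes):
    # Score each division mode by the mean E of its gray level intervals.
    scores = [np.mean([expected_enhancement(hist, contrast_per_level, lo, hi)
                       for lo, hi in mode]) for mode in modes]
    best = modes[int(np.argmax(scores))]
    out = gray.copy()
    for lo, hi in best:
        equalize_interval(gray, out, lo, hi)
    return out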
The specific steps of judging the ecological restoration effect of the homeland space according to the optimal enhanced image are as follows: the optimal enhanced images of the remote sensing images of the same ecological restoration area collected year by year in the same period are obtained by using steps S1-S5; the ecological restoration effect of the homeland space is then obtained according to the number of pixel points of the forest region in those optimal enhanced images; when the number of pixel points of the forest region in the optimal enhanced image of the remote sensing image of the same ecological restoration area in the same period increases year by year, the ecological restoration effect of the homeland space is good.
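Purely as an illustration of this year-by-year comparison (the patent does not state how forest-region pixel points are identified in the optimal enhanced image, so the caller-supplied is_forest function below is a hypothetical placeholder for that step):

def restoration_trend(images_by_year, is_forest):
    # images_by_year: {year: optimal enhanced gray image of the same area, same period}.
    # is_forest: hypothetical function returning a boolean mask of forest pixels.
    counts = {year: int(is_forest(img).sum()) for year, img in images_by_year.items()}
    years = sorted(counts)
    improving = all(counts[a] <= counts[b] for a, b in zip(years, years[1:]))
    return counts, improving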
The invention has the beneficial effects that: according to the method, the initial length of the gray level intervals of the gray level histogram is obtained from the number of peak points of the gray level histogram of the remote sensing image. If the divided gray level intervals are too small in range, the merging and widening effect on the gray levels is poor and the contrast enhancement of the remote sensing image is insufficient; therefore, on the basis of the preliminary division by peak points, the highest peak point within each preliminary interval is counted and the intervals are divided again according to adjacent highest peak points, which avoids, to a certain extent, the problem of divided gray level intervals being too small. Furthermore, in order to make the final enhancement effect of the remote sensing image optimal, the overall expected enhancement effect of the gray level intervals after each division is calculated. This step realizes adaptive division of the gray level intervals and ensures that the final equalized effect of each remote sensing image is optimal: the gray level histogram of the gray level image is adaptively divided into optimal intervals, and equalization enhancement is then performed on each interval, which reduces the number of gray levels merged during equalization, protects the detailed parts of the image, and ensures the enhancement effect of the remote sensing image, so that the judgment of the ecological restoration effect made from the remote sensing image is more accurate;
meanwhile, because the optimal enhanced image is obtained, the judgment efficiency can be improved when the ecological restoration effect of the homeland space is judged from it, and the efficiency and capability of natural resource supervision are ultimately improved.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (4)

1. The remote sensing interpretation method suitable for homeland space ecological restoration is characterized by comprising the following steps:
acquiring a gray level image and a gray level histogram of a remote sensing image of an ecological restoration area, and dividing the gray level histogram multiple times according to the number of peaks of the gray level histogram to obtain multiple gray level intervals after each division;
the method comprises the following specific steps of obtaining a gray level image and a gray level histogram of a remote sensing image of an ecological restoration area: acquiring remote sensing images of the same ecological restoration area at the same period every year;
graying and denoising the remote sensing image to obtain a grayscale image and a grayscale histogram of the remote sensing image;
acquiring the number of pixel points corresponding to each gray level, acquiring the gray value of each pixel point in 8 neighborhoods of the pixel points corresponding to each gray level, and acquiring the contrast ratio of the pixel point corresponding to each gray level and the pixel point in 8 neighborhoods of the pixel points according to the number of the pixel points corresponding to each gray level and the gray value of each pixel point in 8 neighborhoods of the pixel points corresponding to each gray level;
obtaining the contrast within each gray scale interval according to all gray scales in each gray scale interval and the contrast between the pixel points corresponding to each gray scale in the corresponding gray scale interval and the pixel points in their 8-neighborhoods;
the contrast within the gray scale interval is determined as follows:
accumulating the contrasts between the pixel points corresponding to each gray level in each gray level interval and the pixel points in their 8-neighborhoods, and calculating the average value, which is taken as a first average value;
acquiring the differences between adjacent gray levels in each gray level interval, accumulating all the differences obtained in each gray level interval, calculating the average value, and taking the average value as a second average value;
subtracting the second average value from the first average value in each gray scale interval to obtain the contrast within the gray scale interval;
obtaining an expected enhancement effect of the gray level in each gray level interval according to the contrast in each gray level interval, the number of pixel points corresponding to each gray level in each gray level interval and the number of gray levels in each gray level interval;
the desired enhancement effect of the grey levels in the grey scale interval is determined as follows:
acquiring the variance of the number of pixel points corresponding to each gray level in a gray level interval;
acquiring the quantity difference of the quantity of pixel points corresponding to the adjacent gray levels in the gray level interval, accumulating all the quantity differences obtained in the gray level interval to obtain an accumulated sum, calculating an average value, and taking the average value as a third average value;
taking the contrast in the gray scale interval as a numerator, and taking the product of a third mean value in the gray scale interval and the variance of the number of the pixel points corresponding to each gray scale as a denominator to obtain an expected enhancement effect of the gray scale in the gray scale interval;
obtaining an optimal division mode according to the expected enhancement effect of the gray level in each divided gray level interval, dividing a gray level histogram of the remote sensing image of the ecological restoration area according to the optimal division mode to obtain a plurality of target gray level intervals, enhancing each target gray level interval by utilizing histogram equalization to obtain an optimal enhancement image, and judging the ecological restoration effect of the homeland space according to the optimal enhancement image;
the optimal division mode is determined according to the following method:
and acquiring the average value of the expected enhancement effect of the gray levels in all the divided gray level intervals, and taking the corresponding division mode when the average value is maximum as the optimal division mode.
2. The remote sensing interpretation method for homeland space ecological restoration according to claim 1, wherein judging the homeland space ecological restoration effect according to the optimal enhanced image further comprises:
acquiring remote sensing images of the same ecological restoration area collected in the same period year by year;
obtaining the optimal enhanced image of each remote sensing image of the same ecological restoration area collected in the same period year by year;
and obtaining the ecological restoration effect of the homeland space according to the number of pixel points of the forest region in the optimal enhanced images of the remote sensing images of the same ecological restoration area in the same period year by year.
3. The remote sensing interpretation method suitable for homeland space ecological restoration according to claim 1, wherein the plurality of gray level intervals after each division are determined as follows:
taking 255 as the numerator and the number of peaks of the gray level histogram as the denominator, and obtaining the ratio of the numerator to the denominator;
multiplying the ratio of the numerator to the denominator by the division index to obtain the interval length of that division, and dividing the gray level histogram according to this interval length to obtain a plurality of divided gray level intervals, wherein the gray level histogram is divided ⌊n/2⌋ times, n represents the number of peak points in the gray level histogram, and the number of gray level intervals obtained by each division is different.
4. The remote sensing interpretation method suitable for homeland space ecological restoration according to claim 1, wherein the specific expression of the contrast between the pixel points corresponding to each gray level and the pixel points in their 8-neighborhoods is:

C_x = \frac{1}{N_x} \sum_{g=1}^{N_x} \left( \frac{1}{8} \sum_{h=1}^{8} \left| I_g - I_{g,h} \right| \right)

in the formula: C_x represents the contrast between the pixel points corresponding to the gray level x and the pixel points in their 8-neighborhoods, I_g represents the gray value of the g-th pixel point whose gray level is x, I_{g,h} represents the gray value of the h-th neighborhood pixel point in the 8-neighborhood of that pixel point, N_x represents the number of pixel points whose gray level is x, 8 represents the size of the 8-neighborhood, and g indexes the pixel points whose gray level is x.
CN202211420132.0A 2022-11-15 2022-11-15 Remote sensing interpretation method suitable for homeland space ecological restoration Active CN115512231B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211420132.0A CN115512231B (en) 2022-11-15 2022-11-15 Remote sensing interpretation method suitable for homeland space ecological restoration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211420132.0A CN115512231B (en) 2022-11-15 2022-11-15 Remote sensing interpretation method suitable for homeland space ecological restoration

Publications (2)

Publication Number Publication Date
CN115512231A CN115512231A (en) 2022-12-23
CN115512231B true CN115512231B (en) 2023-02-28

Family

ID=84513539

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211420132.0A Active CN115512231B (en) 2022-11-15 2022-11-15 Remote sensing interpretation method suitable for homeland space ecological restoration

Country Status (1)

Country Link
CN (1) CN115512231B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797798B (en) * 2023-02-10 2023-04-25 山东省地质矿产勘查开发局八〇一水文地质工程地质大队(山东省地矿工程勘察院) Ecological restoration effect evaluation method based on abandoned mine remote sensing image
CN115830459B (en) * 2023-02-14 2023-05-12 山东省国土空间生态修复中心(山东省地质灾害防治技术指导中心、山东省土地储备中心) Mountain forest grass life community damage degree detection method based on neural network
CN115861135B (en) * 2023-03-01 2023-05-23 铜牛能源科技(山东)有限公司 Image enhancement and recognition method applied to panoramic detection of box body
CN116188327B (en) * 2023-04-21 2023-07-14 济宁职业技术学院 Image enhancement method for security monitoring video
CN117333504B (en) * 2023-12-01 2024-03-01 山东省国土空间数据和遥感技术研究院(山东省海域动态监视监测中心) Precise segmentation method for remote sensing image of complex terrain
CN117593193B (en) * 2024-01-19 2024-04-23 山东海天七彩建材有限公司 Sheet metal image enhancement method and system based on machine learning

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0749362A1 (en) * 1994-03-07 1996-12-27 International Business Machines Corporation Improvements in image processing
CN101951523A (en) * 2010-09-21 2011-01-19 北京工业大学 Adaptive colour image processing method and system
CN109345491A (en) * 2018-09-26 2019-02-15 中国科学院西安光学精密机械研究所 Remote sensing image enhancement method fusing gradient and gray scale information
CN111462024A (en) * 2020-04-03 2020-07-28 中国科学院半导体研究所 Bilateral self-adaptive image visualization enhancement method and imaging system
CN112734654A (en) * 2020-12-23 2021-04-30 中国科学院苏州纳米技术与纳米仿生研究所 Image processing method, device, equipment and storage medium
CN114049283A (en) * 2021-11-16 2022-02-15 上海无线电设备研究所 Self-adaptive gray gradient histogram equalization remote sensing image enhancement method
CN114926466A (en) * 2022-07-21 2022-08-19 山东省土地发展集团有限公司 Land integrated monitoring and decision-making method and platform based on big data
CN114998209A (en) * 2022-04-28 2022-09-02 南通奕霖智慧医学科技有限公司 Foreign matter detection method for infusion medicine bottle lamp detection process
CN115082508A (en) * 2022-08-18 2022-09-20 山东省蓝睿科技开发有限公司 Ocean buoy production quality detection method
CN115311301A (en) * 2022-10-12 2022-11-08 江苏银生新能源科技有限公司 PCB welding spot defect detection method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7840066B1 (en) * 2005-11-15 2010-11-23 University Of Tennessee Research Foundation Method of enhancing a digital image by gray-level grouping
KR101426242B1 (en) * 2013-04-18 2014-08-05 삼성전자주식회사 Method and apparatus for converting gray level of color image
CN115311176B (en) * 2022-10-12 2023-03-07 江苏菲尔浦物联网有限公司 Night image enhancement method based on histogram equalization

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0749362A1 (en) * 1994-03-07 1996-12-27 International Business Machines Corporation Improvements in image processing
CN101951523A (en) * 2010-09-21 2011-01-19 北京工业大学 Adaptive colour image processing method and system
CN109345491A (en) * 2018-09-26 2019-02-15 中国科学院西安光学精密机械研究所 Remote sensing image enhancement method fusing gradient and gray scale information
CN111462024A (en) * 2020-04-03 2020-07-28 中国科学院半导体研究所 Bilateral self-adaptive image visualization enhancement method and imaging system
CN112734654A (en) * 2020-12-23 2021-04-30 中国科学院苏州纳米技术与纳米仿生研究所 Image processing method, device, equipment and storage medium
CN114049283A (en) * 2021-11-16 2022-02-15 上海无线电设备研究所 Self-adaptive gray gradient histogram equalization remote sensing image enhancement method
CN114998209A (en) * 2022-04-28 2022-09-02 南通奕霖智慧医学科技有限公司 Foreign matter detection method for infusion medicine bottle lamp detection process
CN114926466A (en) * 2022-07-21 2022-08-19 山东省土地发展集团有限公司 Land integrated monitoring and decision-making method and platform based on big data
CN115082508A (en) * 2022-08-18 2022-09-20 山东省蓝睿科技开发有限公司 Ocean buoy production quality detection method
CN115311301A (en) * 2022-10-12 2022-11-08 江苏银生新能源科技有限公司 PCB welding spot defect detection method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Yongji; "A Design of Dynamic Defective Pixel Correction for Image Sensor"; 2020 IEEE International Conference on Artificial Intelligence and Information Systems (ICAIIS); 2020-09-11; full text *
胡明; 葛俊锋. "Temperature-indicating paint interpretation method based on image processing and K-nearest neighbor algorithm" (基于图像处理和K近邻算法的示温漆判读方法). Aeroengine (航空发动机). 2021, full text. *

Also Published As

Publication number Publication date
CN115512231A (en) 2022-12-23

Similar Documents

Publication Publication Date Title
CN115512231B (en) Remote sensing interpretation method suitable for homeland space ecological restoration
CN115829883B (en) Surface image denoising method for special-shaped metal structural member
CN107704807B (en) Dynamic monitoring method based on multi-source remote sensing sequence image
CN103226832B (en) Based on the multi-spectrum remote sensing image change detecting method of spectral reflectivity mutation analysis
CN105335965B (en) Multi-scale self-adaptive decision fusion segmentation method for high-resolution remote sensing image
CN107705314A (en) A kind of more subject image dividing methods based on intensity profile
CN103377468A (en) Image processing device and image processing method
CN115797798A (en) Ecological restoration effect evaluation method based on abandoned mine remote sensing image
CN109613531B (en) Multi-threshold optimization deformation inversion method and system for micro-variation perception early warning radar
CN110070545B (en) Method for automatically extracting urban built-up area by urban texture feature density
CN104036485A (en) Method about image resampling tampering detection
CN117368920B (en) D-insar-based coal mining area subsidence monitoring method and system
CN115797473B (en) Concrete forming evaluation method for civil engineering
CN116883408B (en) Integrating instrument shell defect detection method based on artificial intelligence
CN116542972A (en) Wall plate surface defect rapid detection method based on artificial intelligence
CN116823824A (en) Underground belt conveyor dust fall detecting system based on machine vision
CN112307803A (en) Digital geological outcrop crack extraction method and device
CN114972370A (en) Remote sensing image self-adaptive segmentation method for neural network reasoning
CN105205816A (en) Method for extracting high-resolution SAR image building zone through multi-feature weighted fusion
CN114881960A (en) Feature enhancement-based cloth linear defect detection method and system
CN116740579B (en) Intelligent collection method for territorial space planning data
CN117237396A (en) Rail bolt rust area segmentation method based on image characteristics
CN105488798B (en) SAR image method for measuring similarity based on point set contrast
CN114266899A (en) Image target parallel detection method based on multi-core DSP
CN116503426B (en) Ultrasonic image segmentation method based on image processing

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant