CN112232119A - Remote sensing texture image segmentation method and device - Google Patents

Remote sensing texture image segmentation method and device

Info

Publication number
CN112232119A
CN112232119A (application number CN202010942317.2A)
Authority
CN
China
Prior art keywords
texture
value
image
remote sensing
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010942317.2A
Other languages
Chinese (zh)
Inventor
魏宝安
夏立福
郭慧宇
莫玉兵
张伟东
宋尚萍
王平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fourth Topographic Survey Team Of Ministry Of Natural Resources
Original Assignee
Fourth Topographic Survey Team Of Ministry Of Natural Resources
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fourth Topographic Survey Team Of Ministry Of Natural Resources filed Critical Fourth Topographic Survey Team Of Ministry Of Natural Resources
Priority to CN202010942317.2A priority Critical patent/CN112232119A/en
Publication of CN112232119A publication Critical patent/CN112232119A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N 3/006 Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V 10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biophysics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Biology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a remote sensing texture image segmentation method and device. The method comprises the following steps: extracting texture features from the obtained remote sensing texture image to be segmented to obtain texture feature values; normalizing the texture feature values to generate normalized texture feature values; setting an initial cluster number, an initial cluster population scale and an iteration stop condition; initializing the algorithm individuals of a grasshopper optimized clustering algorithm based on the initial cluster number and the initial cluster population scale to obtain initial parameters; calculating the fitness values of the initial grasshopper population based on the initial parameters; iteratively executing the steps from initializing the algorithm individuals to calculating the fitness values, based on the initial cluster number and the initial cluster population scale, until the iteration stop condition is met, and obtaining a target fitness value among the fitness values; and dividing the remote sensing texture image to be segmented based on the target fitness value to generate a segmented image. The invention can improve the efficiency of the whole image segmentation process and obtain an optimal image segmentation result.

Description

Remote sensing texture image segmentation method and device
Technical Field
The invention relates to the technical field of remote sensing image analysis and processing, in particular to a remote sensing texture image segmentation method and device.
Background
Image segmentation is a fundamental computer vision technique and a key step in the transition from image processing to image analysis. Segmentation divides an image into several meaningful regions according to certain features (such as gray scale, texture and color): pixels within the same region should have the same or similar characteristics, while pixels in different regions should differ. Texture is an important signature of remote sensing images, and remote sensing image segmentation can be completed well by using texture features, laying a foundation for subsequent high-quality remote sensing analysis. In essence, image segmentation is a process of clustering pixels according to their attributes, so many clustering algorithms have been applied to image segmentation with good results. The C-means algorithm is one of the most commonly used clustering algorithms. Given the number of categories, C-means clustering results are mainly influenced by the initial class centers and easily fall into a local optimum, and to date there is no good method for selecting the initial class centers. In addition, when the number of clusters C is large, the number of iterations of the algorithm increases greatly.
Disclosure of Invention
The technical problem solved by the invention is as follows: the defects of the prior art are overcome, and a remote sensing texture image segmentation method and a remote sensing texture image segmentation device are provided.
In order to solve the technical problem, an embodiment of the present invention provides a remote sensing texture image segmentation method, including:
performing texture feature extraction on the obtained remote sensing texture image to be segmented to obtain a texture feature value corresponding to the texture image to be segmented;
carrying out normalization processing on the texture characteristic value to generate a normalized texture characteristic value;
setting initial cluster number, initial cluster population scale and iteration stop conditions;
initializing an algorithm individual of a grasshopper optimized clustering algorithm based on the initial clustering number and the initial clustering population scale to obtain initial parameters;
calculating to obtain a fitness value of the initial grasshopper population based on the initial parameters;
iteratively executing, based on the initial cluster number and the initial cluster population scale, the steps from initializing the algorithm individuals to calculating the fitness values until the iteration stop condition is met, and obtaining a target fitness value among the fitness values;
and dividing the remote sensing texture image to be segmented based on the target fitness value to generate a segmented image.
Optionally, the performing texture feature extraction on the obtained remote sensing texture image to be segmented to obtain a texture feature value corresponding to the texture image to be segmented includes:
acquiring the remote sensing texture image to be segmented;
carrying out Gabor transformation processing on the remote sensing texture image to be segmented to generate a transformed remote sensing texture image;
and extracting texture features in the transformed remote sensing texture image to obtain the texture feature value.
Optionally, the performing Gabor transform processing on the remote sensing texture image to be segmented to generate a transformed remote sensing texture image includes:
carrying out Gabor transformation processing on the remote sensing texture image to be segmented by adopting the following formulas (1) and (2) to generate the transformed remote sensing texture image:
g(x, y) = (1 / (2π σ_x σ_y)) exp[ -(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]    (1)
G(u, v) = exp{ -(1/2)[ (u - W)²/σ_u² + v²/σ_v² ] }    (2)
In the above formulas (1) and (2), j is the complex operator, W denotes the bandwidth of the Gabor wavelet, σ_x and σ_y denote the constants of the Gaussian envelope along the x-axis and y-axis respectively, and σ_u = 1/(2πσ_x), σ_v = 1/(2πσ_y). For multi-scale, multi-direction Gabor features, a Gabor filter bank with 5 scales and 8 directions is adopted, and the multi-scale, multi-direction feature corresponding to each pixel z = (x, y) in the image is denoted G_{u,v}(z).
Optionally, the normalizing the texture feature value to generate a normalized texture feature value includes:
performing normalization processing on the texture characteristic value by adopting the following formula (3) to generate a normalized texture characteristic value:
x_tgnew = (x_tg - x_tmin) / (x_tmax - x_tmin)    (3)
In the above formula (3), x_tgnew denotes the texture feature value obtained after normalization, x_tg denotes the original value of the t-th dimension texture feature, x_tmax is the maximum value of the t-th dimension texture feature value, and x_tmin is the minimum value of the t-th dimension texture feature value.
Optionally, calculating a fitness value of the initial grasshopper population based on the initial parameters includes:
calculating the fitness value by adopting the following formula (4):
J = Σ_{i=1}^{k} Σ_{x∈c_i} || x - x̄_i ||²    (4)
In the above formula (4), k denotes the total number of categories contained in the image, x is the current individual, c_i is the set of all individuals belonging to the i-th class, and x̄_i denotes the mean of all individuals in the set c_i.
Optionally, after calculating the fitness value of the initial grasshopper population based on the initial parameters, the method further includes:
under the condition that the iteration times do not reach the iteration stop condition, updating the positions of the algorithm individuals based on the grasshopper algorithm positions, and resetting the individuals which are not in the preset upper and lower bound range;
and updating the fitness value, adding 1 to the iteration times, and executing the operation of judging whether the iteration times reach the iteration stop condition.
Optionally, the updating the position of the individual algorithm based on the grasshopper algorithm position update includes:
updating the position of the algorithm individual by adopting the following formula (5):
X_i^d = c ( Σ_{j=1, j≠i}^{N} c ((ub_d - lb_d)/2) s(|x_j^d - x_i^d|) (x_j - x_i)/d_ij ) + T̂_d    (5)
In the above formula (5), the position of the i-th grasshopper is denoted X_i, X_i^d denotes the updated value of the d-th dimensional component of the i-th grasshopper, x_i^d and x_j^d denote the current values of the d-th dimensional components of the i-th and j-th grasshopper individuals, ub_d and lb_d respectively denote the upper and lower bounds of the d-th dimension, T̂_d denotes the d-th dimensional component of the best grasshopper position (optimal solution) found so far, d_ij is the distance between the i-th grasshopper and the j-th grasshopper, N is the population size, and the S function represents the social strength.
In order to solve the above technical problem, an embodiment of the present invention further provides a remote sensing texture image segmentation apparatus, including:
the texture characteristic value acquisition module is used for extracting texture characteristics of the acquired remote sensing texture image to be segmented to obtain a texture characteristic value corresponding to the texture image to be segmented;
the normalized characteristic value generating module is used for carrying out normalization processing on the texture characteristic value to generate a normalized texture characteristic value;
the cluster number setting module is used for setting an initial cluster number, an initial cluster population scale and an iteration stopping condition;
the initial parameter acquisition module is used for initializing an algorithm individual of the grasshopper optimization clustering algorithm based on the initial clustering number and the initial clustering population scale to obtain initial parameters;
the fitness value calculation module is used for calculating and obtaining the fitness value of the initial grasshopper population based on the initial parameters;
the target fitness acquisition module is used for executing the initial parameter acquisition module and the fitness value calculation module in an iterative manner and acquiring a target fitness value in the fitness values;
and the segmentation image generation module is used for dividing the remote sensing texture image to be segmented based on the target fitness value to generate a segmentation image.
Optionally, the texture feature value obtaining module includes:
the texture image acquisition unit is used for acquiring the remote sensing texture image to be segmented;
the transformation image generation unit is used for carrying out Gabor transformation processing on the remote sensing texture image to be segmented to generate a transformation remote sensing texture image;
and the texture characteristic value acquisition unit is used for extracting texture characteristics in the transformed remote sensing texture image to obtain the texture characteristic value.
Optionally, the transformed image generating unit includes:
carrying out Gabor transformation processing on the remote sensing texture image to be segmented by adopting the following formulas (1) and (2) to generate the transformed remote sensing texture image:
g(x, y) = (1 / (2π σ_x σ_y)) exp[ -(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]    (1)
G(u, v) = exp{ -(1/2)[ (u - W)²/σ_u² + v²/σ_v² ] }    (2)
In the above formulas (1) and (2), j is the complex operator, W denotes the bandwidth of the Gabor wavelet, σ_x and σ_y denote the constants of the Gaussian envelope along the x-axis and y-axis respectively, and σ_u = 1/(2πσ_x), σ_v = 1/(2πσ_y). For multi-scale, multi-direction Gabor features, a Gabor filter bank with 5 scales and 8 directions is adopted, and the multi-scale, multi-direction feature corresponding to each pixel z = (x, y) in the image is denoted G_{u,v}(z).
Optionally, the normalized feature value generating module includes:
performing normalization processing on the texture characteristic value by adopting the following formula (3) to generate a normalized texture characteristic value:
x_tgnew = (x_tg - x_tmin) / (x_tmax - x_tmin)    (3)
In the above formula (3), x_tgnew denotes the texture feature value obtained after normalization, x_tg denotes the original value of the t-th dimension texture feature, x_tmax is the maximum value of the t-th dimension texture feature value, and x_tmin is the minimum value of the t-th dimension texture feature value.
Optionally, the fitness value calculating module includes:
calculating the fitness value by adopting the following formula (4):
J = Σ_{i=1}^{k} Σ_{x∈c_i} || x - x̄_i ||²    (4)
In the above formula (4), k denotes the total number of categories contained in the image, x is the current individual, c_i is the set of all individuals belonging to the i-th class, and x̄_i denotes the mean of all individuals in the set c_i.
Optionally, the method further comprises:
the individual position updating module is used for updating the position of the algorithm individual based on the grasshopper algorithm position under the condition that the iteration times do not reach the iteration stop condition, and resetting the individual which is not in the preset upper and lower boundary range;
and the stop condition operation execution module is used for updating the fitness value, adding 1 to the iteration times and executing the operation of judging whether the iteration times reach the iteration stop condition.
Optionally, the individual location update module comprises:
updating the position of the algorithm individual by adopting the following formula (5):
X_i^d = c ( Σ_{j=1, j≠i}^{N} c ((ub_d - lb_d)/2) s(|x_j^d - x_i^d|) (x_j - x_i)/d_ij ) + T̂_d    (5)
In the above formula (5), the position of the i-th grasshopper is denoted X_i, X_i^d denotes the updated value of the d-th dimensional component of the i-th grasshopper, x_i^d and x_j^d denote the current values of the d-th dimensional components of the i-th and j-th grasshopper individuals, ub_d and lb_d respectively denote the upper and lower bounds of the d-th dimension, T̂_d denotes the d-th dimensional component of the best grasshopper position (optimal solution) found so far, d_ij is the distance between the i-th grasshopper and the j-th grasshopper, N is the population size, and the S function represents the social strength.
Compared with the prior art, the invention has the advantages that:
According to the remote sensing texture image segmentation method and device provided by the embodiment of the invention, texture feature extraction is performed on the obtained remote sensing texture image to be segmented to obtain texture feature values corresponding to the texture image to be segmented, and the texture feature values are normalized to generate normalized texture feature values; an initial cluster number, an initial cluster population scale and an iteration stop condition are set; the algorithm individuals of the grasshopper optimized clustering algorithm are initialized based on the initial cluster number and the initial cluster population scale to obtain initial parameters; the fitness values of the initial grasshopper population are calculated based on the initial parameters; the steps from initializing the algorithm individuals to calculating the fitness values are executed iteratively, based on the initial cluster number and the initial cluster population scale, until the iteration stop condition is met, and a target fitness value among the fitness values is obtained; the remote sensing texture image to be segmented is then divided based on the target fitness value to generate a segmented image. The remote sensing texture image segmentation method based on grasshopper optimized clustering provided by the embodiment of the invention not only reduces the sensitivity of the traditional C-means clustering segmentation method to the initial cluster centers, but also allows the grasshopper optimization algorithm to find the optimal solution quickly; in terms of time complexity, the efficiency of the whole image segmentation process is greatly improved, and an optimal image segmentation result is obtained quickly.
Drawings
FIG. 1 is a flowchart illustrating steps of a method for segmenting a remote sensing texture image according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a remote sensing texture image segmentation apparatus according to an embodiment of the present invention.
Detailed Description
Referring to fig. 1, a flowchart illustrating steps of a remote sensing texture image segmentation method according to an embodiment of the present invention is shown, and as shown in fig. 1, the remote sensing texture image segmentation method may specifically include the following steps:
step 101: and extracting texture features of the obtained remote sensing texture image to be segmented to obtain texture feature values corresponding to the texture image to be segmented.
The embodiment of the invention can be applied to a scene for segmenting the remote sensing texture image by combining a grasshopper optimization clustering algorithm.
Image segmentation is a basic computer vision technique and is also a key step in the transition from image processing to image analysis. The segmentation is to divide the image into a plurality of meaningful regions according to certain characteristics (such as gray scale, texture, color, etc.), pixels contained in the same region should have the same or similar characteristics, and pixels contained in different regions should have different characteristics.
The texture is an important mark of the remote sensing image, and the remote sensing image segmentation can be well completed by utilizing the texture characteristics, so that a foundation is laid for subsequent high-quality remote sensing analysis.
The remote sensing texture image to be segmented refers to a texture image which is input by a user and needs to be segmented.
After the remote sensing texture image to be segmented is obtained, extracting texture features of the obtained remote sensing texture image to be segmented may be performed to obtain texture feature values corresponding to the texture image to be segmented, and specifically, the detailed description may be performed in combination with the following specific implementation manner.
In a specific implementation manner of the present invention, the step 101 may include:
substep A1: and acquiring the remote sensing texture image to be segmented.
Substep A2: and carrying out Gabor transformation processing on the remote sensing texture image to be segmented to generate a transformed remote sensing texture image.
Substep A3: and extracting texture features in the transformed remote sensing texture image to obtain the texture feature value.
In the embodiment of the invention, the remote sensing texture image to be segmented can be input by a user, and then the remote sensing texture image can be preprocessed, specifically, Gabor filter banks with 5 scales and 8 directions can be respectively adopted to perform Gabor transformation processing on the image, so that the transformed remote sensing texture image can be generated, and the texture feature of the transformed remote sensing texture image can be extracted.
The Gabor transformation processing is carried out on the remote sensing texture image to be segmented, and the generation of the transformed remote sensing texture image can be realized by combining the following formulas (1) and (2):
g(x, y) = (1 / (2π σ_x σ_y)) exp[ -(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]    (1)
G(u, v) = exp{ -(1/2)[ (u - W)²/σ_u² + v²/σ_v² ] }    (2)
In the above formulas (1) and (2), j is the complex operator, W denotes the bandwidth of the Gabor wavelet, σ_x and σ_y denote the constants of the Gaussian envelope along the x-axis and y-axis respectively, and σ_u = 1/(2πσ_x), σ_v = 1/(2πσ_y). For multi-scale, multi-direction Gabor features, a Gabor filter bank with 5 scales and 8 directions is adopted, and the multi-scale, multi-direction feature corresponding to each pixel z = (x, y) in the image is denoted G_{u,v}(z).
After the remote sensing texture image is generated, a texture characteristic value in the remote sensing texture image can be extracted, specifically, 40-dimensional characteristics can be extracted, and the multi-scale and multi-direction characteristic characterization based on the texture image in the invention is as follows:
{ G_{u,v}(z) : u ∈ (0, 1, ..., 4), v ∈ (0, 1, ..., 7) }
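As a rough illustration of this preprocessing step, the Python sketch below builds a 5-scale, 8-orientation bank of complex Gabor kernels with NumPy/SciPy and keeps the magnitude of each filter response as one texture dimension, giving a 40-dimensional vector per pixel. The kernel size, bandwidth W, envelope constant and dyadic scale factor (the defaults of gabor_kernel and gabor_features) are illustrative assumptions, not values taken from the patent.

```python
import numpy as np
from scipy.signal import fftconvolve


def gabor_kernel(scale, orientation, ksize=31, W=0.4, sigma=2.0):
    """Build one complex Gabor kernel for the given scale/orientation index.

    ksize, W, sigma and the dyadic scale factor are illustrative assumptions;
    the patent only fixes the 5-scale x 8-orientation layout of the bank."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    a = 2.0 ** (-scale)                      # assumed dyadic scaling per scale index u
    theta = orientation * np.pi / 8.0        # 8 evenly spaced orientations
    xr = a * (x * np.cos(theta) + y * np.sin(theta))
    yr = a * (-x * np.sin(theta) + y * np.cos(theta))
    envelope = np.exp(-0.5 * (xr ** 2 + yr ** 2) / sigma ** 2)   # Gaussian envelope
    carrier = np.exp(2j * np.pi * W * xr)                        # complex sinusoid, bandwidth W
    return a * envelope * carrier


def gabor_features(image, n_scales=5, n_orients=8):
    """Return an (H, W, n_scales * n_orients) stack of |G_{u,v}(z)| responses."""
    feats = []
    for u in range(n_scales):
        for v in range(n_orients):
            kernel = gabor_kernel(u, v)
            response = fftconvolve(image, kernel, mode="same")
            feats.append(np.abs(response))   # magnitude as the texture feature
    return np.stack(feats, axis=-1)          # 40 dimensions per pixel for 5 x 8
```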
after extracting the texture feature values in the remote sensing texture image, step 102 is performed.
Step 102: and carrying out normalization processing on the texture characteristic value to generate a normalized texture characteristic value.
After extracting the texture feature values from the remote sensing texture image, normalization processing is performed on the texture feature values to generate normalized texture feature values; specifically, the extracted texture feature values are normalized to [0, 1] using the following formula (3):
x_tgnew = (x_tg - x_tmin) / (x_tmax - x_tmin)    (3)
In the above formula (3), x_tgnew denotes the texture feature value obtained after normalization, x_tg denotes the original value of the t-th dimension texture feature, x_tmax is the maximum value of the t-th dimension texture feature value, and x_tmin is the minimum value of the t-th dimension texture feature value.
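A minimal sketch of this normalization step, assuming the (H, W, D) feature stack from the previous sketch; the small epsilon guard against constant dimensions is an added assumption, not part of formula (3).

```python
import numpy as np


def normalize_features(feats):
    """Min-max normalise every texture dimension to [0, 1] as in formula (3).

    feats: (H, W, D) stack of raw Gabor responses."""
    fmin = feats.min(axis=(0, 1), keepdims=True)     # x_tmin for each dimension t
    fmax = feats.max(axis=(0, 1), keepdims=True)     # x_tmax for each dimension t
    return (feats - fmin) / (fmax - fmin + 1e-12)    # eps guards constant dimensions
```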
After generating the normalized texture feature values, step 103 is performed.
Step 103: and setting an initial cluster number, an initial cluster population scale and an iteration stop condition.
At this time, an initial cluster number, an initial cluster population scale (i.e. the number of images to be segmented) and an iteration stop condition may be set in advance, where the initial cluster number may be set to K, the initial cluster population scale may be set to P, and the iteration stop condition may be, for example, that the results of two consecutive clustering rounds are almost identical.
After setting the initial cluster number, the initial cluster population size, and the iteration stop condition, step 104 is performed.
Step 104: initializing an algorithm individual of the grasshopper optimized clustering algorithm based on the initial clustering number and the initial clustering population scale to obtain initial parameters.
After setting the initial cluster number and the initial cluster population scale, the algorithm individuals of the grasshopper optimized clustering algorithm are initialized based on them to obtain the initial parameters, which can include: C (i.e. the number of divisions), the initial population G_P that is produced, and the positions and other parameters of the initial grasshopper population. In this method, since the population position vector represents the cluster centers of the texture image, each component of any individual in the population takes values in [0, 1]. Finally, the number of generations the grasshopper optimization algorithm has run is denoted by T, and T is set to 0 initially.
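One possible reading of this initialization, sketched below: each algorithm individual encodes K candidate cluster centres as a single flattened vector with every component drawn uniformly from [0, 1]. The function name, the flattened encoding and the fixed random seed are illustrative assumptions.

```python
import numpy as np


def init_population(pop_size, n_clusters, feat_dim, seed=0):
    """Each grasshopper is a flattened vector of n_clusters candidate centres;
    components lie in [0, 1] because the texture features are normalised."""
    rng = np.random.default_rng(seed)
    return rng.uniform(0.0, 1.0, size=(pop_size, n_clusters * feat_dim))
```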
After initializing the algorithm individual of the grasshopper optimization algorithm to obtain initial parameters, step 105 is executed.
Step 105: and calculating the fitness value of the initial grasshopper population based on the initial parameters.
After obtaining the initial parameters, a fitness value of the initial grasshopper population can be calculated based on the initial parameters, and specifically, the fitness value can be calculated based on the following formula (4):
J = Σ_{i=1}^{k} Σ_{x∈c_i} || x - x̄_i ||²    (4)
In the above formula (4), k denotes the total number of categories contained in the image, x is the current individual, c_i is the set of all individuals belonging to the i-th class, and x̄_i denotes the mean of all individuals in the set c_i.
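The sketch below is one way to evaluate formula (4) for a candidate individual: pixels are assigned to the nearest centre encoded by the individual, and the fitness is the within-cluster sum of squared deviations from each cluster mean (lower is better). The vectorised distance computation is purely an implementation choice.

```python
import numpy as np


def fitness(individual, pixels, n_clusters):
    """Within-cluster sum of squared distances (formula (4)); lower is better.

    pixels: (N, D) normalised texture vectors, individual: flattened centres."""
    centres = individual.reshape(n_clusters, -1)                          # (K, D)
    d2 = ((pixels[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)    # (N, K)
    labels = d2.argmin(axis=1)                                            # nearest centre
    total = 0.0
    for i in range(n_clusters):
        members = pixels[labels == i]
        if len(members):
            total += ((members - members.mean(axis=0)) ** 2).sum()
    return total
```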
In a specific implementation manner of the present invention, after the step 105, the method may further include:
step B1: under the condition that the iteration times do not reach the iteration stop condition, updating the positions of the algorithm individuals based on the grasshopper algorithm positions, and resetting the individuals which are not in the preset upper and lower bound range;
step B2: and updating the fitness value, adding 1 to the iteration times, and executing the operation of judging whether the iteration times reach the iteration stop condition.
In the embodiment of the invention, if the number of iterations has not reached the iteration stop condition, the positions of the individuals can be updated according to the grasshopper algorithm position update formula, and individuals that fall outside the upper and lower bounds of any dimension are reset. The position update formula of the grasshopper optimized clustering algorithm is as follows:
X_i^d = c ( Σ_{j=1, j≠i}^{N} c ((ub_d - lb_d)/2) s(|x_j^d - x_i^d|) (x_j - x_i)/d_ij ) + T̂_d    (5)
In the above formula (5), the position of the i-th grasshopper is denoted X_i, X_i^d denotes the updated value of the d-th dimensional component of the i-th grasshopper, x_i^d and x_j^d denote the current values of the d-th dimensional components of the i-th and j-th grasshopper individuals, ub_d and lb_d respectively denote the upper and lower bounds of the d-th dimension, T̂_d denotes the d-th dimensional component of the best grasshopper position (optimal solution) found so far, d_ij is the distance between the i-th grasshopper and the j-th grasshopper, N is the population size, and the S function represents the social strength.
The social strength is calculated as follows:
s(r) = f e^(-r/l) - e^(-r)
where r is the independent variable of the s function, f is the attraction intensity and l is the attraction length scale; here l is set to 1.5 and f to 0.5. c is a decreasing coefficient that shrinks the comfort (search) range as the iterations proceed, and the parameter c is updated according to the following formula:
c = c_max - t (c_max - c_min) / L
where c_max is the maximum value of c, c_min is the minimum value of c, t denotes the current iteration number, and L denotes the maximum number of iterations.
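A compact sketch of one update round under formula (5), the social-force function s(r) with f = 0.5 and l = 1.5, and the linearly decreasing coefficient c. Clipping out-of-range components back to [lb, ub] is used here as a stand-in for the patent's reset of out-of-bound individuals, and the c_max/c_min defaults are assumed values.

```python
import numpy as np


def s_func(r, f=0.5, l=1.5):
    """Social force s(r) = f * exp(-r / l) - exp(-r)."""
    return f * np.exp(-r / l) - np.exp(-r)


def update_positions(pop, best, t, max_iter, lb=0.0, ub=1.0,
                     c_max=1.0, c_min=1e-5):
    """One position update of the whole population (formula (5)).

    pop: (N, D) grasshopper positions, best: (D,) best position found so far."""
    c = c_max - t * (c_max - c_min) / max_iter       # linearly decreasing coefficient
    n, dim = pop.shape
    new_pop = np.empty_like(pop)
    for i in range(n):
        social = np.zeros(dim)
        for j in range(n):
            if j == i:
                continue
            diff = pop[j] - pop[i]
            dist = np.linalg.norm(diff) + 1e-12      # d_ij between grasshoppers i and j
            social += c * ((ub - lb) / 2.0) * s_func(np.abs(diff)) * (diff / dist)
        # clip stands in for the patent's reset of out-of-bound individuals
        new_pop[i] = np.clip(c * social + best, lb, ub)
    return new_pop
```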
After updating the positions of the algorithm individuals, the optimal individual G_T can be updated, the iteration count is incremented by 1, and the operation of judging whether the iteration count has reached the iteration stop condition is executed.
After calculating the fitness value of the initial grasshopper population based on the initial parameters, step 106 is performed.
Step 106: based on the initial cluster number and the initial cluster population scale, iteratively executing the steps from initializing the algorithm individuals to calculating the fitness values until the iteration stop condition is met, and obtaining a target fitness value among the fitness values.
And performing iteration, namely initializing an algorithm individual of the grasshopper optimized clustering algorithm based on the initial clustering number and the initial clustering population scale to obtain an initial parameter, and calculating the fitness value of the initial grasshopper population based on the initial parameter until an iteration stop condition is met, so that a target fitness value in the fitness value can be obtained.
After the target fitness value is acquired, step 107 is performed.
Step 107: and dividing the remote sensing texture image to be segmented based on the target fitness value to generate a segmented image.
After the target fitness is obtained, the remote sensing texture image to be segmented can be divided based on the target fitness value to generate a segmented image.
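Once the best individual corresponding to the target fitness value is available, the division itself reduces to a nearest-centre labelling of every pixel, roughly as sketched below; image_shape and the flattening convention for the pixel vectors are assumptions carried over from the earlier sketches.

```python
import numpy as np


def segment(pixels, best_individual, n_clusters, image_shape):
    """Assign every pixel to its nearest cluster centre and fold the labels
    back into the image plane (image_shape = (H, W) is an assumed convention)."""
    centres = best_individual.reshape(n_clusters, -1)
    d2 = ((pixels[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1).reshape(image_shape)
```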
According to the remote sensing texture image segmentation method provided by the embodiment of the invention, texture feature values corresponding to the texture image to be segmented are obtained by extracting texture features from the obtained remote sensing texture image to be segmented, and the texture feature values are normalized to generate normalized texture feature values; an initial cluster number, an initial cluster population scale and an iteration stop condition are set; the algorithm individuals of the grasshopper optimized clustering algorithm are initialized based on the initial cluster number and the initial cluster population scale to obtain initial parameters; the fitness values of the initial grasshopper population are calculated based on the initial parameters; the steps from initializing the algorithm individuals to calculating the fitness values are executed iteratively, based on the initial cluster number and the initial cluster population scale, until the iteration stop condition is met, and a target fitness value among the fitness values is obtained; the remote sensing texture image to be segmented is then divided based on the target fitness value to generate a segmented image. The remote sensing texture image segmentation method based on grasshopper optimized clustering provided by the embodiment of the invention not only reduces the sensitivity of the traditional C-means clustering segmentation method to the initial cluster centers, but also allows the grasshopper optimization algorithm to find the optimal solution quickly; in terms of time complexity, the efficiency of the whole image segmentation process is greatly improved, and an optimal image segmentation result is obtained quickly.
Referring to fig. 2, a schematic structural diagram of a remote sensing texture image segmentation apparatus according to an embodiment of the present invention is shown, and as shown in fig. 2, the remote sensing texture image segmentation apparatus 200 may specifically include the following modules:
the texture feature value obtaining module 210 is configured to perform texture feature extraction on the obtained remote sensing texture image to be segmented to obtain a texture feature value corresponding to the texture image to be segmented;
a normalized feature value generation module 220, configured to perform normalization processing on the texture feature value to generate a normalized texture feature value;
a cluster number setting module 230, configured to set an initial cluster number, an initial cluster population scale, and an iteration stop condition;
an initial parameter obtaining module 240, configured to initialize an algorithm individual of the grasshopper optimized clustering algorithm based on the initial clustering number and the initial clustering population scale to obtain an initial parameter;
a fitness value calculation module 250, configured to calculate a fitness value of the initial grasshopper population based on the initial parameters;
a target fitness obtaining module 260, configured to iteratively execute the initial parameter obtaining module and the fitness value calculating module, and obtain a target fitness value of the fitness values;
and the segmentation image generation module 270 is configured to divide the remote sensing texture image to be segmented based on the target fitness value to generate a segmentation image.
Optionally, the texture feature value obtaining module 210 includes:
the texture image acquisition unit is used for acquiring the remote sensing texture image to be segmented;
the transformation image generation unit is used for carrying out Gabor transformation processing on the remote sensing texture image to be segmented to generate a transformation remote sensing texture image;
and the texture characteristic value acquisition unit is used for extracting texture characteristics in the transformed remote sensing texture image to obtain the texture characteristic value.
Optionally, the transformed image generating unit includes:
carrying out Gabor transformation processing on the remote sensing texture image to be segmented by adopting the following formulas (1) and (2) to generate the transformed remote sensing texture image:
g(x, y) = (1 / (2π σ_x σ_y)) exp[ -(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]    (1)
G(u, v) = exp{ -(1/2)[ (u - W)²/σ_u² + v²/σ_v² ] }    (2)
In the above formulas (1) and (2), j is the complex operator, W denotes the bandwidth of the Gabor wavelet, σ_x and σ_y denote the constants of the Gaussian envelope along the x-axis and y-axis respectively, and σ_u = 1/(2πσ_x), σ_v = 1/(2πσ_y). For multi-scale, multi-direction Gabor features, a Gabor filter bank with 5 scales and 8 directions is adopted, and the multi-scale, multi-direction feature corresponding to each pixel z = (x, y) in the image is denoted G_{u,v}(z).
Optionally, the normalized feature value generating module 220 includes:
performing normalization processing on the texture characteristic value by adopting the following formula (3) to generate a normalized texture characteristic value:
x_tgnew = (x_tg - x_tmin) / (x_tmax - x_tmin)    (3)
In the above formula (3), x_tgnew denotes the texture feature value obtained after normalization, x_tg denotes the original value of the t-th dimension texture feature, x_tmax is the maximum value of the t-th dimension texture feature value, and x_tmin is the minimum value of the t-th dimension texture feature value.
Optionally, the fitness value calculating module 250 includes:
calculating the fitness value by adopting the following formula (4):
J = Σ_{i=1}^{k} Σ_{x∈c_i} || x - x̄_i ||²    (4)
In the above formula (4), k denotes the total number of categories contained in the image, x is the current individual, c_i is the set of all individuals belonging to the i-th class, and x̄_i denotes the mean of all individuals in the set c_i.
Optionally, the method further comprises:
the individual position updating module is used for updating the position of the algorithm individual based on the grasshopper algorithm position under the condition that the iteration times do not reach the iteration stop condition, and resetting the individual which is not in the preset upper and lower boundary range;
and the stop condition operation execution module is used for updating the fitness value, adding 1 to the iteration times and executing the operation of judging whether the iteration times reach the iteration stop condition.
Optionally, the individual location update module comprises:
updating the position of the algorithm individual by adopting the following formula (5):
X_i^d = c ( Σ_{j=1, j≠i}^{N} c ((ub_d - lb_d)/2) s(|x_j^d - x_i^d|) (x_j - x_i)/d_ij ) + T̂_d    (5)
In the above formula (5), the position of the i-th grasshopper is denoted X_i, X_i^d denotes the updated value of the d-th dimensional component of the i-th grasshopper, x_i^d and x_j^d denote the current values of the d-th dimensional components of the i-th and j-th grasshopper individuals, ub_d and lb_d respectively denote the upper and lower bounds of the d-th dimension, T̂_d denotes the d-th dimensional component of the best grasshopper position (optimal solution) found so far, d_ij is the distance between the i-th grasshopper and the j-th grasshopper, N is the population size, and the S function represents the social strength.
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.

Claims (14)

1. A remote sensing texture image segmentation method is characterized by comprising the following steps:
performing texture feature extraction on the obtained remote sensing texture image to be segmented to obtain a texture feature value corresponding to the texture image to be segmented;
carrying out normalization processing on the texture characteristic value to generate a normalized texture characteristic value;
setting initial cluster number, initial cluster population scale and iteration stop conditions;
initializing an algorithm individual of a grasshopper optimized clustering algorithm based on the initial clustering number and the initial clustering population scale to obtain initial parameters;
calculating to obtain a fitness value of the initial grasshopper population based on the initial parameters;
iteratively executing, based on the initial cluster number and the initial cluster population scale, the steps from initializing the algorithm individuals to calculating the fitness values until the iteration stop condition is met, and obtaining a target fitness value among the fitness values;
and dividing the remote sensing texture image to be segmented based on the target fitness value to generate a segmented image.
2. The method according to claim 1, wherein the extracting the texture features of the obtained remote sensing texture image to be segmented to obtain the texture feature value corresponding to the texture image to be segmented comprises:
acquiring the remote sensing texture image to be segmented;
carrying out Gabor transformation processing on the remote sensing texture image to be segmented to generate a transformed remote sensing texture image;
and extracting texture features in the transformed remote sensing texture image to obtain the texture feature value.
3. The method according to claim 2, wherein the Gabor transform processing is performed on the remote sensing texture image to be segmented to generate a transformed remote sensing texture image, and the Gabor transform processing comprises:
carrying out Gabor transformation processing on the remote sensing texture image to be segmented by adopting the following formulas (1) and (2) to generate the transformed remote sensing texture image:
g(x, y) = (1 / (2π σ_x σ_y)) exp[ -(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]    (1)
G(u, v) = exp{ -(1/2)[ (u - W)²/σ_u² + v²/σ_v² ] }    (2)
In the above formulas (1) and (2), j is the complex operator, W denotes the bandwidth of the Gabor wavelet, σ_x and σ_y denote the constants of the Gaussian envelope along the x-axis and y-axis respectively, and σ_u = 1/(2πσ_x), σ_v = 1/(2πσ_y). For multi-scale, multi-direction Gabor features, a Gabor filter bank with 5 scales and 8 directions is adopted, and the multi-scale, multi-direction feature corresponding to each pixel z = (x, y) in the image is denoted G_{u,v}(z).
4. The method according to claim 1, wherein the normalizing the texture feature value to generate a normalized texture feature value comprises:
performing normalization processing on the texture characteristic value by adopting the following formula (3) to generate a normalized texture characteristic value:
x_tgnew = (x_tg - x_tmin) / (x_tmax - x_tmin)    (3)
In the above formula (3), x_tgnew denotes the texture feature value obtained after normalization, x_tg denotes the original value of the t-th dimension texture feature, x_tmax is the maximum value of the t-th dimension texture feature value, and x_tmin is the minimum value of the t-th dimension texture feature value.
5. The method of claim 1, wherein calculating a fitness value for the initial grasshopper population based on the initial parameters comprises:
calculating the fitness value by adopting the following formula (4):
J = Σ_{i=1}^{k} Σ_{x∈c_i} || x - x̄_i ||²    (4)
In the above formula (4), k denotes the total number of categories contained in the image, x is the current individual, c_i is the set of all individuals belonging to the i-th class, and x̄_i denotes the mean of all individuals in the set c_i.
6. The method of claim 1, further comprising, after said calculating a fitness value for said initial grasshopper population based on said initial parameters:
under the condition that the iteration times do not reach the iteration stop condition, updating the positions of the algorithm individuals based on the grasshopper algorithm positions, and resetting the individuals which are not in the preset upper and lower bound range;
and updating the fitness value, adding 1 to the iteration times, and executing the operation of judging whether the iteration times reach the iteration stop condition.
7. The method of claim 6, wherein updating the location of the individual algorithms based on grasshopper algorithm location updates comprises:
updating the position of the algorithm individual by adopting the following formula (5):
X_i^d = c ( Σ_{j=1, j≠i}^{N} c ((ub_d - lb_d)/2) s(|x_j^d - x_i^d|) (x_j - x_i)/d_ij ) + T̂_d    (5)
In the above formula (5), the position of the i-th grasshopper is denoted X_i, X_i^d denotes the updated value of the d-th dimensional component of the i-th grasshopper, x_i^d and x_j^d denote the current values of the d-th dimensional components of the i-th and j-th grasshopper individuals, ub_d and lb_d respectively denote the upper and lower bounds of the d-th dimension, T̂_d denotes the d-th dimensional component of the best grasshopper position (optimal solution) found so far, d_ij is the distance between the i-th grasshopper and the j-th grasshopper, N is the population size, and the S function represents the social strength.
8. A remote sensing texture image segmentation apparatus, comprising:
the texture characteristic value acquisition module is used for extracting texture characteristics of the acquired remote sensing texture image to be segmented to obtain a texture characteristic value corresponding to the texture image to be segmented;
the normalized characteristic value generating module is used for carrying out normalization processing on the texture characteristic value to generate a normalized texture characteristic value;
the cluster number setting module is used for setting an initial cluster number, an initial cluster population scale and an iteration stopping condition;
the initial parameter acquisition module is used for initializing an algorithm individual of the grasshopper optimization clustering algorithm based on the initial clustering number and the initial clustering population scale to obtain initial parameters;
the fitness value calculation module is used for calculating and obtaining the fitness value of the initial grasshopper population based on the initial parameters;
the target fitness acquisition module is used for executing the initial parameter acquisition module and the fitness value calculation module in an iterative manner and acquiring a target fitness value in the fitness values;
and the segmentation image generation module is used for dividing the remote sensing texture image to be segmented based on the target fitness value to generate a segmentation image.
9. The apparatus according to claim 8, wherein the texture feature value obtaining module comprises:
the texture image acquisition unit is used for acquiring the remote sensing texture image to be segmented;
the transformation image generation unit is used for carrying out Gabor transformation processing on the remote sensing texture image to be segmented to generate a transformation remote sensing texture image;
and the texture characteristic value acquisition unit is used for extracting texture characteristics in the transformed remote sensing texture image to obtain the texture characteristic value.
10. The apparatus according to claim 9, wherein the transformed image generating unit includes:
carrying out Gabor transformation processing on the remote sensing texture image to be segmented by adopting the following formulas (1) and (2) to generate the transformed remote sensing texture image:
g(x, y) = (1 / (2π σ_x σ_y)) exp[ -(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]    (1)
G(u, v) = exp{ -(1/2)[ (u - W)²/σ_u² + v²/σ_v² ] }    (2)
In the above formulas (1) and (2), j is the complex operator, W denotes the bandwidth of the Gabor wavelet, σ_x and σ_y denote the constants of the Gaussian envelope along the x-axis and y-axis respectively, and σ_u = 1/(2πσ_x), σ_v = 1/(2πσ_y). For multi-scale, multi-direction Gabor features, a Gabor filter bank with 5 scales and 8 directions is adopted, and the multi-scale, multi-direction feature corresponding to each pixel z = (x, y) in the image is denoted G_{u,v}(z).
11. The apparatus of claim 8, wherein the normalized feature value generation module comprises:
performing normalization processing on the texture characteristic value by adopting the following formula (3) to generate a normalized texture characteristic value:
x_tgnew = (x_tg - x_tmin) / (x_tmax - x_tmin)    (3)
In the above formula (3), x_tgnew denotes the texture feature value obtained after normalization, x_tg denotes the original value of the t-th dimension texture feature, x_tmax is the maximum value of the t-th dimension texture feature value, and x_tmin is the minimum value of the t-th dimension texture feature value.
12. The apparatus of claim 8, wherein the fitness value calculation module comprises:
calculating the fitness value by adopting the following formula (4):
J = Σ_{i=1}^{k} Σ_{x∈c_i} || x - x̄_i ||²    (4)
In the above formula (4), k denotes the total number of categories contained in the image, x is the current individual, c_i is the set of all individuals belonging to the i-th class, and x̄_i denotes the mean of all individuals in the set c_i.
13. The apparatus of claim 8, further comprising:
the individual position updating module is used for updating the position of the algorithm individual based on the grasshopper algorithm position under the condition that the iteration times do not reach the iteration stop condition, and resetting the individual which is not in the preset upper and lower boundary range;
and the stop condition operation execution module is used for updating the fitness value, adding 1 to the iteration times and executing the operation of judging whether the iteration times reach the iteration stop condition.
14. The apparatus of claim 13, wherein the individual location update module comprises:
updating the position of the algorithm individual by adopting the following formula (5):
X_i^d = c ( Σ_{j=1, j≠i}^{N} c ((ub_d - lb_d)/2) s(|x_j^d - x_i^d|) (x_j - x_i)/d_ij ) + T̂_d    (5)
In the above formula (5), the position of the i-th grasshopper is denoted X_i, X_i^d denotes the updated value of the d-th dimensional component of the i-th grasshopper, x_i^d and x_j^d denote the current values of the d-th dimensional components of the i-th and j-th grasshopper individuals, ub_d and lb_d respectively denote the upper and lower bounds of the d-th dimension, T̂_d denotes the d-th dimensional component of the best grasshopper position (optimal solution) found so far, d_ij is the distance between the i-th grasshopper and the j-th grasshopper, N is the population size, and the S function represents the social strength.
CN202010942317.2A 2020-09-09 2020-09-09 Remote sensing texture image segmentation method and device Pending CN112232119A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010942317.2A CN112232119A (en) 2020-09-09 2020-09-09 Remote sensing texture image segmentation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010942317.2A CN112232119A (en) 2020-09-09 2020-09-09 Remote sensing texture image segmentation method and device

Publications (1)

Publication Number Publication Date
CN112232119A true CN112232119A (en) 2021-01-15

Family

ID=74116106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010942317.2A Pending CN112232119A (en) 2020-09-09 2020-09-09 Remote sensing texture image segmentation method and device

Country Status (1)

Country Link
CN (1) CN112232119A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570555A (en) * 2021-07-07 2021-10-29 温州大学 Two-dimensional segmentation method of multi-threshold medical image based on improved grasshopper algorithm
CN113570555B (en) * 2021-07-07 2024-02-09 温州大学 Two-dimensional segmentation method of multi-threshold medical image based on improved grasshopper algorithm
CN117476165A (en) * 2023-12-26 2024-01-30 贵州维康子帆药业股份有限公司 Intelligent management method and system for Chinese patent medicine medicinal materials
CN117476165B (en) * 2023-12-26 2024-03-12 贵州维康子帆药业股份有限公司 Intelligent management method and system for Chinese patent medicine medicinal materials

Similar Documents

Publication Publication Date Title
CN111753828B (en) Natural scene horizontal character detection method based on deep convolutional neural network
CN107529650B (en) Closed loop detection method and device and computer equipment
Montazer et al. An improved radial basis function neural network for object image retrieval
CN111695636B (en) Hyperspectral image classification method based on graph neural network
CN109978848B (en) Method for detecting hard exudation in fundus image based on multi-light-source color constancy model
CN112288761B (en) Abnormal heating power equipment detection method and device and readable storage medium
CN111476310B (en) Image classification method, device and equipment
CN112101364B (en) Semantic segmentation method based on parameter importance increment learning
Li et al. A fast level set algorithm for building roof recognition from high spatial resolution panchromatic images
CN112232119A (en) Remote sensing texture image segmentation method and device
CN113052185A (en) Small sample target detection method based on fast R-CNN
CN112348059A (en) Deep learning-based method and system for classifying multiple dyeing pathological images
CN111832642A (en) Image identification method based on VGG16 in insect taxonomy
CN111882554B (en) SK-YOLOv 3-based intelligent power line fault detection method
CN112132145A (en) Image classification method and system based on model extended convolutional neural network
CN116310466A (en) Small sample image classification method based on local irrelevant area screening graph neural network
CN111126169B (en) Face recognition method and system based on orthogonalization graph regular nonnegative matrix factorization
CN111639697A (en) Hyperspectral image classification method based on non-repeated sampling and prototype network
CN111666813A (en) Subcutaneous sweat gland extraction method based on three-dimensional convolutional neural network of non-local information
CN112329818B (en) Hyperspectral image non-supervision classification method based on graph convolution network embedded characterization
CN108876776A (en) A kind of method of generating classification model, eye fundus image classification method and device
CN116994060A (en) Brain texture analysis method based on LBP extraction and TCNN neural network
CN116030308B (en) Multi-mode medical image classification method and system based on graph convolution neural network
CN105844299B (en) A kind of image classification method based on bag of words
CN116597275A (en) High-speed moving target recognition method based on data enhancement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination