CN113269796B - Image segmentation method and device and terminal equipment - Google Patents


Info

Publication number: CN113269796B (grant of application publication CN113269796A)
Application number: CN202110649372.7A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: image, representing, gray level, weighted sum, gray
Inventors: 王军芬, 冯艳红, 朱占龙, 李明亮
Assignee (original and current): Hebei GEO University
Legal status: Active (granted)


Classifications

    • G06T 7/11 — Image analysis → Segmentation; edge detection → Region-based segmentation
    • G06F 18/22 — Pattern recognition → Analysing → Matching criteria, e.g. proximity measures
    • G06F 18/23213 — Pattern recognition → Clustering techniques → Non-hierarchical techniques using statistics or function optimisation, with fixed number of clusters, e.g. K-means clustering
    • G06T 5/20 — Image enhancement or restoration by the use of local operators
    • G06T 2207/10004 — Image acquisition modality → Still image; photographic image
    • G06T 2207/20024 — Special algorithmic details → Filtering details
    • Y02T 10/40 — Climate change mitigation technologies related to transportation → Engine management systems

Abstract

The application belongs to the technical field of image processing and provides an image segmentation method, an image segmentation apparatus, and a terminal device. The method comprises the following steps: calculating a local spatial operator and a local gray operator of an original gray image; performing nonlinear filtering on the original gray image according to the local spatial operator and the local gray operator to obtain a nonlinear weighted sum image; acquiring the gray level set of the nonlinear weighted sum image; and, based on a fuzzy clustering algorithm and that gray level set, segmenting the nonlinear weighted sum image to obtain the segmented regions. Because the original gray image undergoes nonlinear filtering, the influence of noise on the segmentation algorithm is reduced, the robustness of the algorithm is improved, and the segmentation quality of the image is improved.

Description

Image segmentation method and device and terminal equipment
Technical Field
The present application belongs to the technical field of image processing, and in particular, relates to an image segmentation method, an image segmentation device and a terminal device.
Background
Image segmentation is a key step in image processing and plays an important role in image processing technology. Its purpose is to divide an image into a number of regions with specific properties and to extract objects of interest.
At present, fuzzy C-means (FCM) clustering is a common image segmentation method, but the FCM algorithm is sensitive to noise and to cluster size when segmenting an image; once an image has been contaminated by noise, the segmentation results of the algorithm are often unsatisfactory.
Disclosure of Invention
In view of the above, the embodiments of the present application provide an image segmentation method, an image segmentation apparatus, and a terminal device, to address the prior-art problem that segmentation quality is unsatisfactory once an image has been contaminated by noise.
A first aspect of an embodiment of the present application provides an image segmentation method, including:
calculating a local space operator and a local gray operator of an original gray image;
carrying out nonlinear filtering processing on the original gray level image according to the local space operator and the local gray level operator to obtain a nonlinear weighted sum image;
acquiring a gray level set of the nonlinear weighted sum image;
and based on a fuzzy clustering algorithm and the gray level set of the nonlinear weighted sum image, carrying out image segmentation on the nonlinear weighted sum image to obtain a region after the nonlinear weighted sum image segmentation.
A second aspect of an embodiment of the present application provides an image segmentation apparatus, including:
the operator calculation module is used for calculating a local space operator and a local gray operator of the original gray image;
the filtering processing module is used for carrying out nonlinear filtering processing on the original gray image according to the local space operator and the local gray operator to obtain a nonlinear weighted sum image;
the gray level set acquisition module is used for acquiring the gray level set of the nonlinear weighted sum image;
and the image segmentation module is used for carrying out image segmentation on the nonlinear weighted sum image based on a fuzzy clustering algorithm and the gray level set of the nonlinear weighted sum image to obtain a region after the nonlinear weighted sum image is segmented.
A third aspect of the embodiments of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the image segmentation method as described above when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image segmentation method as described above.
Compared with the prior art, the embodiments of the present application have the following beneficial effects. First, the local spatial operator and the local gray operator of the original gray image are calculated; then the original gray image is nonlinearly filtered according to the two operators to obtain a nonlinear weighted sum image; the gray level set of the nonlinear weighted sum image is acquired; finally, based on a fuzzy clustering algorithm and that gray level set, the nonlinear weighted sum image is segmented to obtain the segmented regions. Because the original gray image undergoes nonlinear filtering, the influence of noise on the segmentation algorithm is reduced, the robustness of the algorithm is improved, and the segmentation quality of the image is improved.
Drawings
In order to illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed for the embodiments or the description of the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of an image segmentation method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an image segmentation apparatus according to an embodiment of the present application;
fig. 3 is a schematic diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
In one embodiment, as shown in fig. 1, an image segmentation method is provided; the process is described in detail as follows:
s101: and calculating a local space operator and a local gray operator of the original gray image.
In the present embodiment, the original gray image I_0 has size m×n, with N = m×n pixels in total; the pixel set is X = {x_1, x_2, …, x_N}, and the number of gray levels is G.
Specifically, the filter window size M×M, the scale factor λ_s of the local spatial operator, and the global scale factor λ_g of the local gray operator are set; the local spatial operator and the local gray operator are then calculated.
In one embodiment, the calculation formula of the local spatial operator is:
where S_s_kj denotes the local spatial operator; (L_k, Q_k) denotes the spatial coordinates of the k-th pixel in the original gray image; λ_s denotes the scale factor; and (L_j, Q_j) denotes the spatial coordinates of the j-th pixel in the original gray image;
the calculation formula of the local gray operator is as follows:
where S_g_kj denotes the local gray operator; N_k denotes the set of pixels in a window of size M×M centered on the k-th pixel of the original gray image; x_j denotes the gray value of the j-th pixel in the window centered on the k-th pixel; x_k denotes the gray value of the k-th pixel of the original gray image; λ_g denotes the global scale factor; and σ_g_k denotes the local window density function.
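The two operator formulas appear only as images in the patent and are absent from this text. The sketch below is a plausible Gaussian-kernel reading of the symbols defined above (λ_s, λ_g, σ_g_k); the function name and both kernel forms are assumptions, not the patent's exact definitions:

```python
import numpy as np

def local_operators(img, k_rc, j_rc, lam_s, lam_g, sigma_gk):
    """Hypothetical forms of the local spatial operator S_s_kj and the local
    gray operator S_g_kj (the patent's formulas are images, not reproduced
    here): Gaussian kernels over spatial distance and gray difference."""
    (Lk, Qk), (Lj, Qj) = k_rc, j_rc
    # spatial closeness of pixel j to the window center k, scale lambda_s
    S_s = np.exp(-((Lk - Lj) ** 2 + (Qk - Qj) ** 2) / lam_s)
    # gray similarity, scaled by the global factor lambda_g and the
    # local window density sigma_g_k
    x_k, x_j = float(img[Lk, Qk]), float(img[Lj, Qj])
    S_g = np.exp(-((x_k - x_j) ** 2) / (lam_g * sigma_gk ** 2))
    return S_s, S_g
```

Both operators lie in (0, 1] and decay with spatial distance and gray difference respectively, which is all the surrounding description requires of them.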
S102: and carrying out nonlinear filtering processing on the original gray level image according to the local space operator and the local gray level operator to obtain a nonlinear weighted sum image.
In one embodiment, the specific implementation procedure of S102 in fig. 1 includes:
s201: calculating local similarity measurement according to the local space operator and the local gray operator;
s202: filtering the original gray level image according to a first formula to obtain the nonlinear weighted sum image;
the local similarity measure is:
where S_kj denotes the local similarity measure between the k-th pixel and the j-th pixel of the original gray image; this measure serves as the weight of the j-th pixel.
The first formula is:
where I_k denotes the gray value of the k-th pixel of the nonlinear weighted sum image.
Specifically, I_k ∈ [0, G−1].
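The first formula is likewise an image not reproduced in this text. A common form consistent with the description — each output pixel I_k as the S_kj-weighted average of its M×M neighborhood, with the result kept in [0, G−1] — can be sketched as follows; the combination rule S_kj = S_s · S_g and all parameter values are assumptions:

```python
import numpy as np

def nonlinear_weighted_sum(img, lam_s=2.0, lam_g=1e5, M=3, G=256):
    """Sketch of the nonlinear filtering step: each output pixel I_k is the
    similarity-weighted average of its MxM neighborhood, using an assumed
    similarity S_kj (product of Gaussian spatial and gray kernels)."""
    rows, cols = img.shape
    r = M // 2
    out = np.zeros_like(img, dtype=float)
    for Lk in range(rows):
        for Qk in range(cols):
            num = den = 0.0
            for Lj in range(max(0, Lk - r), min(rows, Lk + r + 1)):
                for Qj in range(max(0, Qk - r), min(cols, Qk + r + 1)):
                    S_s = np.exp(-((Lk - Lj) ** 2 + (Qk - Qj) ** 2) / lam_s)
                    S_g = np.exp(-(float(img[Lk, Qk]) - float(img[Lj, Qj])) ** 2 / lam_g)
                    S_kj = S_s * S_g  # local similarity measure (assumed form)
                    num += S_kj * float(img[Lj, Qj])
                    den += S_kj
            out[Lk, Qk] = num / den
    return np.clip(np.rint(out), 0, G - 1).astype(int)  # I_k in [0, G-1]
```

A lone outlier pixel is pulled toward its neighborhood average, which is the noise-suppression behavior the description attributes to this step.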
S103: and acquiring a gray level set of the nonlinear weighted sum image.
In this embodiment, the gray level set of the nonlinear weighted sum image is {I_1, I_2, …, I_G}, where the number of gray levels G < N. The nonlinear weighted sum image is to be divided into C regions, giving C fuzzy subsets; each fuzzy subset corresponds to a cluster center v_i, and the cluster center set is V = {v_1, v_2, …, v_C}.
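Collecting the gray level set together with the per-level pixel counts γ_g (used by the later update formulas) can be sketched directly with NumPy; the function name is illustrative:

```python
import numpy as np

def gray_level_set(filtered_img):
    """Distinct gray levels {I_1, ..., I_G} of the nonlinear weighted sum
    image, plus the pixel count gamma_g of each level; clustering then runs
    over gray levels rather than pixels (G < N)."""
    levels, counts = np.unique(filtered_img, return_counts=True)
    return levels.astype(float), counts
```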
S104: and based on a fuzzy clustering algorithm and the gray level set of the nonlinear weighted sum image, carrying out image segmentation on the nonlinear weighted sum image to obtain a region after the nonlinear weighted sum image segmentation.
As can be seen from the above embodiments, in this embodiment, the local spatial operator and the local gray operator of the original gray image are calculated first; then nonlinear filtering processing is carried out on the original gray level image according to the local space operator and the local gray level operator, and nonlinear weighted sum images are obtained; acquiring a gray level set of the nonlinear weighted sum image; and finally, based on a fuzzy clustering algorithm and the gray level set of the nonlinear weighted sum image, carrying out image segmentation on the nonlinear weighted sum image to obtain a region after the nonlinear weighted sum image segmentation. According to the embodiment, the nonlinear filtering processing is carried out on the original gray level image, so that the influence of noise on an image segmentation algorithm can be avoided, the robustness of the algorithm is improved, and the segmentation effect of the image is improved.
In one embodiment, the specific implementation procedure of S104 in fig. 1 includes:
step one: initializing classification parameters and condition variables of the nonlinear weighted sum image, wherein the classification parameters comprise the number of regional segmentation and a cluster center set, and the cluster center set comprises the cluster centers of all the regions.
In this embodiment, the classification parameters include the number of region divisions, the cluster center set, the maximum iteration number, and the current iteration number.
Specifically, during initialization, the number of region divisions of the image is set first, with C ≥ 2; then the maximum iteration number Iter_max, the initial iteration number Iter = 0, and the iteration termination condition are set. Finally, the cluster centers are initialized with random values between I_0 and I_G of the gray level set, and the condition variables are initialized as f_g = 1/C, g = 1, 2, …, G, where f_g denotes the current condition variable of the g-th gray level, C denotes the number of region divisions, and G denotes the maximum gray level.
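The initialization of step one can be sketched as follows: random cluster centers drawn from the gray-level range and uniform condition variables f_g = 1/C. The function name and the seeded generator are illustrative choices, not part of the patent:

```python
import numpy as np

def init_params(levels, C=3, seed=0):
    """Step one: random cluster centers within the gray-level range and
    condition variables f_g = 1/C for every gray level."""
    rng = np.random.default_rng(seed)
    v = np.sort(rng.uniform(levels.min(), levels.max(), size=C))  # cluster centers
    f = np.full(len(levels), 1.0 / C)                             # condition variables
    return v, f
```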
Step two: based on a fuzzy C-means clustering algorithm, calculating a current membership matrix from the gray level set, the number of region divisions, the current condition variables, and the current cluster center set, where the element μ_ig of the current membership matrix denotes the fuzzy membership of the g-th gray level with respect to the i-th cluster center in the current cluster center set.
At the beginning of step two, the iteration number of the current loop is first updated: Iter = Iter + 1.
In one embodiment, the step two specifically includes:
s301: constructing an objective function based on a fuzzy C-means algorithm; the objective function is:
where μ_ig denotes the fuzzy membership of the g-th gray level with respect to cluster center v_i; I_g denotes the value of the g-th gray level; γ_g denotes the number of pixels with the g-th gray level; C denotes the number of region divisions; and G denotes the maximum gray level.
In this embodiment, in order to improve the clustering performance of the algorithm on unbalanced data, a condition variable f_g (0 ≤ f_g ≤ 1) is introduced during the clustering process; it assigns low condition values to objects of larger clusters and high condition values to objects of smaller clusters, to prevent small clusters from drifting toward large clusters.
In one embodiment, the updated condition variables are determined from a condition variable update formula;
the conditional variable updating formula is as follows:
where P_imin denotes the minimum of the prior probabilities of all regions in the current iteration. Each gray level, once assigned to a particular cluster, has a fixed condition value with respect to all clusters, and all gray levels in the same cluster have the same condition value. γ_g denotes the number of pixels with the g-th gray level; G_i denotes the set of gray levels belonging to the i-th region; N denotes the total number of pixels of the nonlinear weighted sum image; N_i denotes the total number of pixels belonging to the i-th region; and P_i denotes the prior probability of the i-th region in the current iteration.
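The condition-variable update formula is an image not reproduced in this text. One reading consistent with the listed symbols — priors P_i = N_i / N and f_g = P_imin / P_i for every gray level in region i, so that smaller clusters receive higher condition values — can be sketched as follows; this is an assumption, not the patent's exact formula:

```python
import numpy as np

def update_condition_variables(labels, counts, C):
    """Assumed condition-variable update: with P_i = N_i / N the prior of
    region i, each gray level in region i gets f_g = P_min / P_i, so small
    clusters receive high condition values (in [0, 1], and 1 for the
    smallest cluster)."""
    N = counts.sum()
    N_i = np.array([counts[labels == i].sum() for i in range(C)], dtype=float)
    P = np.maximum(N_i, 1) / N        # prior probability of each region
    f_region = P.min() / P            # high value for small clusters
    return f_region[labels]           # one f_g per gray level
```

Here `labels` holds the region index of each gray level and `counts` the pixel counts γ_g, so all gray levels in the same region share one condition value, as stated above.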
S302: is satisfied thatUnder the constraint condition of (1), optimizing an objective function by using a Lagrange multiplier method to obtain a membership calculation formula.
S303: calculating a current membership matrix according to a membership calculation formula;
the membership calculation formula is as follows:
where f_g denotes the current condition variable of the g-th gray level; i ∈ [1, 2, …, C]; C denotes the number of region divisions; and m denotes the fuzziness index (1 ≤ m ≤ ∞), which may take the value 2.
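The membership formula itself is an image not reproduced here. A standard conditional-FCM update consistent with the surrounding description — the usual FCM distance term, scaled so that the memberships of each gray level sum to its condition value f_g — might look like the following; the exact form is an assumption:

```python
import numpy as np

def update_membership(levels, v, f, m=2.0, eps=1e-10):
    """Assumed conditional-FCM membership update: mu_ig proportional to the
    usual FCM inverse-distance term, scaled so that sum_i mu_ig = f_g for
    each gray level g."""
    d2 = (levels[None, :] - v[:, None]) ** 2 + eps   # (C, G) squared distances
    inv = d2 ** (-1.0 / (m - 1.0))                   # FCM inverse-distance term
    mu = f[None, :] * inv / inv.sum(axis=0, keepdims=True)
    return mu
```

With m = 2 this reduces to inverse-squared-distance weighting; each column of the returned matrix sums to the corresponding f_g.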
Step three: and dividing pixels with the same gray level in the nonlinear weighted sum image into a region with the maximum fuzzy membership corresponding to the gray level based on the current membership matrix, and determining the region after the nonlinear weighted sum image is divided under the current circulation.
In this embodiment, the pixels with the same gray level in the nonlinear weighted sum image are divided into the regions with the largest fuzzy membership corresponding to the gray level according to the following formula.
k = argmax_i μ_ig,  i = 1, 2, …, C
As a concrete example, set the number of gray levels of the nonlinear weighted sum image to G = 8 and divide the image into C = 3 regions; the membership matrix is then computed as described above.
For gray level I_1, compare the fuzzy memberships u_11, u_21, and u_31 of I_1 with respect to the three regions. If u_11 is the largest, all image pixels with gray level I_1 are assigned to region 1; if u_21 is the largest, to region 2; if u_31 is the largest, to region 3.
For gray level I_2, compare the fuzzy memberships u_12, u_22, and u_32 of I_2 with respect to the three regions. If u_12 is the largest, all image pixels with gray level I_2 are assigned to region 1; if u_22 is the largest, to region 2; if u_32 is the largest, to region 3.
Similarly, the pixels corresponding to each of the 8 gray levels can be assigned to their corresponding regions.
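The hard assignment of step three is simply a per-gray-level argmax over the membership matrix:

```python
import numpy as np

def assign_regions(mu):
    """Step three: every gray level (and hence every pixel carrying that
    level) goes to the region with the largest fuzzy membership,
    k = argmax_i mu_ig."""
    return np.argmax(mu, axis=0)
```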
Step four: and updating the condition variable and the cluster center set based on the region corresponding to the current cycle.
In one embodiment, step four in the above process further includes:
determining an updated cluster center set according to a cluster center updating formula;
the cluster center updating formula is as follows:
where v_i denotes the cluster center corresponding to the i-th region in the cluster center set; γ_g denotes the number of pixels with the g-th gray level; μ_ig denotes the fuzzy membership of the g-th gray level with respect to cluster center v_i; I_g denotes the value of the g-th gray level; C denotes the number of region divisions; G denotes the maximum gray level; and m denotes the fuzziness index.
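The center update formula is also an image in the patent. A gray-level-weighted FCM form matching the listed symbols (γ_g, μ_ig, I_g, m) would be v_i = Σ_g γ_g μ_ig^m I_g / Σ_g γ_g μ_ig^m; the sketch below implements that assumed form:

```python
import numpy as np

def update_centers(levels, counts, mu, m=2.0):
    """Assumed gray-level-weighted FCM center update:
    v_i = sum_g gamma_g * mu_ig^m * I_g / sum_g gamma_g * mu_ig^m."""
    w = counts[None, :] * mu ** m                 # gamma_g * mu_ig^m, shape (C, G)
    return (w * levels[None, :]).sum(axis=1) / w.sum(axis=1)
```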
Step five: with the condition variables and the cluster center set updated in step four, return to step two and repeat steps two to five until the classification parameters meet the loop termination condition, then output the segmented regions of the nonlinear weighted sum image from the final iteration.
In this embodiment, the loop termination condition may be that the iteration count satisfies Iter ≥ Iter_max, or that ‖V(Iter) − V(Iter−1)‖ ≤ ε; while Iter < Iter_max and ‖V(Iter) − V(Iter−1)‖ > ε, steps two to five are repeated until the classification parameters meet the termination condition.
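Steps two to five, together with this termination test, can be strung into one loop. The sketch below uses assumed conditional-FCM update formulas throughout (the patent's own formulas are images not reproduced in this text), so it illustrates the control flow rather than the exact algorithm:

```python
import numpy as np

def segment(levels, counts, C=3, m=2.0, iter_max=100, eps=1e-4, seed=0):
    """Illustrative loop over assumed update rules: membership update,
    hard assignment, condition-variable and center update, repeated until
    Iter >= Iter_max or ||V(Iter) - V(Iter-1)|| <= eps."""
    rng = np.random.default_rng(seed)
    v = np.sort(rng.uniform(levels.min(), levels.max(), size=C))  # step one
    f = np.full(len(levels), 1.0 / C)
    labels = np.zeros(len(levels), dtype=int)
    for _ in range(iter_max):
        d2 = (levels[None, :] - v[:, None]) ** 2 + 1e-10
        inv = d2 ** (-1.0 / (m - 1.0))
        mu = f[None, :] * inv / inv.sum(axis=0, keepdims=True)    # step two
        labels = np.argmax(mu, axis=0)                            # step three
        N_i = np.array([max(counts[labels == i].sum(), 1) for i in range(C)],
                       dtype=float)
        P = N_i / counts.sum()
        f = (P.min() / P)[labels]                                 # step four: f_g
        w = counts[None, :] * mu ** m
        v_new = (w * levels[None, :]).sum(axis=1) / w.sum(axis=1)  # step four: v_i
        if np.linalg.norm(v_new - v) <= eps:                      # step five
            v = v_new
            break
        v = v_new
    return v, labels
```

On two well-separated groups of gray levels the loop settles quickly and the labels split the levels accordingly.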
The time complexity of the algorithm provided in this embodiment comprises two parts: image preprocessing and segmentation. Preprocessing mainly computes the nonlinear weighted sum image I, with time complexity O(N×M²); segmentation is the iterative part of the algorithm, with time complexity O(G×C²×Iter), where Iter is the number of iterations. The overall time complexity is therefore O(N×M² + G×C²×Iter). It can be seen that, because this embodiment clusters the gray levels of the nonlinear weighted sum image I, the segmentation time no longer depends on the size of the image but on its number of gray levels, which greatly reduces the time complexity of the algorithm. In addition, this embodiment introduces the condition variable f_g into the classification process: low condition values are assigned to objects of larger clusters and high condition values to objects of smaller clusters, preventing small clusters from drifting toward large clusters and thereby improving segmentation accuracy, the overall performance of the clustering algorithm, and its clustering performance on small clusters.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and does not limit the implementation of the embodiments of the present application.
In one embodiment, as shown in fig. 2, fig. 2 shows a structure of an image segmentation apparatus 100 provided in this embodiment, which includes:
an operator calculation module 110, configured to calculate a local spatial operator and a local gray operator of the original gray image;
the filtering processing module 120 is configured to perform nonlinear filtering processing on the original gray-scale image according to the local spatial operator and the local gray-scale operator, so as to obtain a nonlinear weighted sum image;
a gray level set acquisition module 130, configured to acquire a gray level set of the nonlinear weighted sum image;
the image segmentation module 140 is configured to perform image segmentation on the nonlinear weighted sum image based on a fuzzy clustering algorithm and a gray level set of the nonlinear weighted sum image, so as to obtain a region after the nonlinear weighted sum image segmentation.
As can be seen from the above embodiments, in this embodiment, the local spatial operator and the local gray operator of the original gray image are calculated first; then nonlinear filtering processing is carried out on the original gray level image according to the local space operator and the local gray level operator, and nonlinear weighted sum images are obtained; acquiring a gray level set of the nonlinear weighted sum image; and finally, based on a fuzzy clustering algorithm and the gray level set of the nonlinear weighted sum image, carrying out image segmentation on the nonlinear weighted sum image to obtain a region after the nonlinear weighted sum image segmentation. According to the embodiment, the nonlinear filtering processing is carried out on the original gray level image, so that the influence of noise on an image segmentation algorithm can be avoided, the robustness of the algorithm is improved, and the segmentation effect of the image is improved.
In one embodiment, the calculation formula of the local spatial operator is:
where S_s_kj denotes the local spatial operator; (L_k, Q_k) denotes the spatial coordinates of the k-th pixel in the original gray image; λ_s denotes the scale factor; and (L_j, Q_j) denotes the spatial coordinates of the j-th pixel in the original gray image;
the calculation formula of the local gray operator is as follows:
where S_g_kj denotes the local gray operator; N_k denotes the set of pixels in a window of size M×M centered on the k-th pixel of the original gray image; x_j denotes the gray value of the j-th pixel in the window centered on the k-th pixel; x_k denotes the gray value of the k-th pixel of the original gray image; λ_g denotes the global scale factor; and σ_g_k denotes the local window density function;
in one embodiment, the filter processing module 120 includes:
a similarity measure calculating unit, configured to calculate a local similarity measure according to the local spatial operator and the local gray operator;
the nonlinear weighted sum image acquisition unit is used for carrying out filtering processing on the original gray level image according to a first formula to obtain the nonlinear weighted sum image;
the local similarity measure is:
where S_kj denotes the local similarity measure between the k-th pixel and the j-th pixel of the original gray image;
the first formula is:
where I_k denotes the gray value of the k-th pixel of the nonlinear weighted sum image.
In one embodiment, the image segmentation module 140 is specifically configured to:
step one: initializing classification parameters and conditional variables of the nonlinear weighted sum image, wherein the classification parameters comprise the number of regional segmentation and a cluster center set, and the cluster center set comprises cluster centers of all regions;
step two: based on a fuzzy C-means clustering algorithm, calculating a current membership matrix from the gray level set, the number of region divisions, the current condition variables, and the current cluster center set, where the element μ_ig of the current membership matrix denotes the fuzzy membership of the g-th gray level with respect to the i-th cluster center in the current cluster center set;
step three: dividing pixels with the same gray level in the nonlinear weighted sum image into a region with the maximum fuzzy membership corresponding to the gray level based on the current membership matrix, and determining the region after the nonlinear weighted sum image is divided under the current circulation;
step four: updating the condition variable and the cluster center set based on the region corresponding to the current cycle;
step five: with the condition variables and the cluster center set updated in step four, returning to step two and repeating steps two to five until the classification parameters meet the loop termination condition, then outputting the segmented regions of the nonlinear weighted sum image from the final iteration.
In one embodiment, the second step of the image segmentation module 140 specifically includes:
constructing an objective function based on a fuzzy C-means algorithm; the objective function is:
where μ_ig denotes the fuzzy membership of the g-th gray level with respect to cluster center v_i; I_g denotes the value of the g-th gray level; γ_g denotes the number of pixels with the g-th gray level; C denotes the number of region divisions; and G denotes the maximum gray level;
under the given constraint condition, optimizing the objective function using the Lagrange multiplier method to obtain the membership calculation formula;
calculating a current membership matrix according to a membership calculation formula;
the membership calculation formula is as follows:
where f_g denotes the current condition variable of the g-th gray level; i ∈ [1, 2, …, C]; C denotes the number of region divisions; and m denotes the fuzziness index.
In one embodiment, the initialization formula for the condition variables is f_g = 1/C, g = 1, 2, …, G, where f_g denotes the current condition variable of the g-th gray level, C denotes the number of region divisions, and G denotes the maximum gray level. Step four of the image segmentation module 140 includes:
determining an updated condition variable according to the condition variable updating formula;
the conditional variable updating formula is as follows:
where P_imin denotes the minimum of the prior probabilities of all regions in the current iteration; γ_g denotes the number of pixels with the g-th gray level; G_i denotes the set of gray levels belonging to the i-th region; N denotes the total number of pixels of the nonlinear weighted sum image; N_i denotes the total number of pixels belonging to the i-th region; and P_i denotes the prior probability of the i-th region in the current iteration.
In one embodiment, step four of the image segmentation module 140 further includes:
determining an updated cluster center set according to a cluster center updating formula;
the cluster center updating formula is as follows:
where v_i denotes the cluster center corresponding to the i-th region in the cluster center set; γ_g denotes the number of pixels with the g-th gray level; μ_ig denotes the fuzzy membership of the g-th gray level with respect to cluster center v_i; I_g denotes the value of the g-th gray level; C denotes the number of region divisions; G denotes the maximum gray level; and m denotes the fuzziness index.
Fig. 3 is a schematic diagram of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device 3 of this embodiment includes: a processor 30, a memory 31 and a computer program 32 stored in said memory 31 and executable on said processor 30. The processor 30, when executing the computer program 32, implements the steps of the various image segmentation method embodiments described above, such as steps 101-104 shown in fig. 1. Alternatively, the processor 30 may perform the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules 110-140 of fig. 2, when executing the computer program 32.
The computer program 32 may be divided into one or more modules/units which are stored in the memory 31 and executed by the processor 30 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions for describing the execution of the computer program 32 in the terminal device 3.
The terminal device 3 may be a computing device such as a desktop computer, a notebook computer, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, a processor 30 and a memory 31. It will be appreciated by those skilled in the art that fig. 3 is merely an example of the terminal device 3 and does not constitute a limitation of the terminal device 3, which may include more or fewer components than illustrated, combine certain components, or use different components; for example, the terminal device may further include an input-output device, a network access device, a bus, etc.
The processor 30 may be a central processing unit (Central Processing Unit, CPU), another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 31 may be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 31 may alternatively be an external storage device of the terminal device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 3. Further, the memory 31 may include both an internal storage unit and an external storage device of the terminal device 3. The memory 31 is used to store the computer program as well as other programs and data required by the terminal device. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of each functional unit and module is illustrated, and in practical application, the above-described functional allocation may be performed by different functional units and modules, i.e. the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not detailed or illustrated in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment by instructing related hardware through a computer program, where the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable medium may be adjusted according to the requirements of legislation and patent practice in each jurisdiction; for example, in certain jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (8)

1. An image segmentation method, comprising:
calculating a local space operator and a local gray operator of an original gray image;
carrying out nonlinear filtering processing on the original gray level image according to the local space operator and the local gray level operator to obtain a nonlinear weighted sum image; acquiring a gray level set of the nonlinear weighted sum image;
based on a fuzzy clustering algorithm and a gray level set of the nonlinear weighted sum image, performing image segmentation on the nonlinear weighted sum image to obtain a segmented region of the nonlinear weighted sum image;
based on a fuzzy clustering algorithm and a gray level set of the nonlinear weighted sum image, image segmentation is carried out on the nonlinear weighted sum image to obtain a region after the nonlinear weighted sum image segmentation, and the method comprises the following steps:
step one: initializing classification parameters and condition variables of the nonlinear weighted sum image, wherein the classification parameters comprise the number of region divisions and a cluster center set, the cluster center set comprises the cluster centers of all regions, and the initialization calculation formula of the condition variables is: f_g = 1/C, g = 1, 2, …, G; wherein f_g represents the current condition variable of the g-th gray level, C represents the number of region divisions, and G represents the maximum gray level;
step two: based on a fuzzy C-means clustering algorithm, calculating a current membership matrix according to the gray level set, the number of region divisions, the current condition variable, and the current cluster center set, wherein the element μ_ig of the current membership matrix represents the fuzzy membership of the g-th gray level with respect to the i-th cluster center in the current cluster center set;
step three: based on the current membership matrix, dividing pixels with the same gray level in the nonlinear weighted sum image into the region with the maximum fuzzy membership for that gray level, and determining the segmented regions of the nonlinear weighted sum image in the current cycle;
step four: updating the condition variable and the cluster center set based on the region corresponding to the current cycle;
step five: returning the condition variable and the cluster center set updated in step four to step two, and repeatedly executing steps two to five until the classification parameters satisfy the loop termination condition, then outputting the segmented regions of the nonlinear weighted sum image from the last cycle;
the updating the condition variable based on the area corresponding to the current cycle comprises the following steps:
determining an updated condition variable according to the condition variable updating formula;
the conditional variable updating formula is as follows:
wherein P_imin represents the minimum of the prior probabilities of all regions in the current cycle; γ_g represents the number of pixels at the g-th gray level; G_i represents the set of gray levels belonging to the i-th region; N represents the total number of pixels of the nonlinear weighted sum image; N_i represents the number of pixels belonging to the i-th region; and P_i represents the prior probability of the i-th region in the current cycle;
wherein the membership calculation formula is:
wherein v_i represents the current cluster center, I_g represents the value of the g-th gray level, i ∈ [1, 2, …, C], and m represents the fuzziness index.
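Steps one through five of claim 1 can be sketched end to end on the gray-level histogram. The patent's membership formula (which involves the condition variable f_g) and the condition-variable update are shown only as images, so this sketch substitutes the plain fuzzy C-means membership as a stand-in; all names are illustrative:

```python
# Sketch of the five-step loop of claim 1, clustering gray levels of the
# histogram rather than individual pixels. The condition-variable update
# is omitted because its formula appears only as an image in the patent.

def segment_histogram_fcm(hist, C, m=2.0, max_iter=50, tol=1e-4):
    """hist[g] = pixel count of gray level g; C = number of regions.
    Returns (labels, centers): labels[g] is the region of level g."""
    G = len(hist)
    levels = list(range(G))
    centers = [(c + 0.5) * (G - 1) / C for c in range(C)]  # initial v_i
    labels = [0] * G
    for _ in range(max_iter):
        # step two: membership of each gray level to each cluster center
        mu = [[0.0] * G for _ in range(C)]
        for g in range(G):
            d = [abs(levels[g] - v) + 1e-12 for v in centers]
            for i in range(C):
                mu[i][g] = 1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                     for j in range(C))
        # step three: assign each gray level to its max-membership region
        labels = [max(range(C), key=lambda i: mu[i][g]) for g in range(G)]
        # step four: update the cluster centers, weighting each gray
        # level by its pixel count gamma_g
        new_centers = []
        for i in range(C):
            num = sum(hist[g] * mu[i][g] ** m * levels[g] for g in range(G))
            den = sum(hist[g] * mu[i][g] ** m for g in range(G))
            new_centers.append(num / den if den > 0 else centers[i])
        # step five: stop once the centers stop moving
        moved = max(abs(a - b) for a, b in zip(new_centers, centers))
        centers = new_centers
        if moved < tol:
            break
    return labels, centers
```

On a bimodal histogram the loop separates the low and high gray levels into two regions, which is the behavior the claim describes.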
2. The image segmentation method as set forth in claim 1, wherein the calculation formula of the local spatial operator is:
wherein S_s_kj represents the local spatial operator, (L_k, Q_k) represents the spatial coordinates of the k-th pixel in the original gray image, λ_s represents a scale factor, and (L_j, Q_j) represents the spatial coordinates of the j-th pixel in the original gray image;
the calculation formula of the local gray operator is as follows:
wherein S_g_kj represents the local gray operator; N_k represents the pixel set of a window of size m×m centered on the k-th pixel in the original gray image; x_j represents the gray value of the j-th pixel in the window centered on the k-th pixel; x_k represents the gray value of the k-th pixel in the original gray image; λ_g represents the global scale factor; and σ_g_k represents the local window density function.
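The operator formulas themselves are shown only as images. The symbols listed (spatial coordinates with scale factor λ_s; gray differences scaled by a global factor λ_g and a local window density σ_g_k) match the exponential similarity operators used in fast generalized FCM-style filtering, so one plausible reading is sketched below; the exact kernels are an assumption, and all names are illustrative:

```python
import math

def spatial_op(pk, pj, lam_s):
    """Assumed exponential spatial similarity between pixels at
    coordinates pk = (L_k, Q_k) and pj = (L_j, Q_j), scale factor lam_s."""
    return math.exp(-max(abs(pk[0] - pj[0]), abs(pk[1] - pj[1])) / lam_s)

def gray_op(xk, window, lam_g):
    """Assumed exponential gray similarity: window holds the gray values
    x_j in the neighborhood N_k of the center pixel with gray value xk;
    sigma2 plays the role of the local window density function."""
    sigma2 = sum((x - xk) ** 2 for x in window) / len(window)
    return [math.exp(-((x - xk) ** 2) / (lam_g * sigma2 + 1e-12))
            for x in window]
```

Both operators equal 1 for a pixel identical to the center and decay as the spatial or gray distance grows.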
3. The image segmentation method according to claim 2, wherein the performing nonlinear filtering processing on the original gray-scale image according to the local spatial operator and the local gray-scale operator to obtain a nonlinear weighted sum image includes:
calculating local similarity measurement according to the local space operator and the local gray operator;
filtering the original gray level image according to a first formula to obtain the nonlinear weighted sum image;
the local similarity measure is:
wherein S_kj represents the local similarity measure between the k-th pixel and the j-th pixel in the original gray image;
the first formula is:
wherein I_k represents the gray value of the k-th pixel of the nonlinear weighted sum image.
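The first formula is shown only as an image; given the symbols, a natural reading is that each output pixel I_k is the similarity-weighted average of its neighborhood's gray values, I_k = Σ_j S_kj x_j / Σ_j S_kj. A hedged sketch under that assumption (names are illustrative):

```python
# Assumed form of the first formula: the filtered value I_k is the
# similarity-weighted average of the neighborhood gray values,
# I_k = sum_j S_kj * x_j / sum_j S_kj. The exact formula is an image
# in the patent, so this is an interpretation, not the patent's text.

def weighted_sum_pixel(sims, grays):
    """sims[j]  : local similarity S_kj of neighbor j to pixel k
    grays[j] : gray value x_j of neighbor j
    Returns the nonlinear weighted sum value I_k."""
    num = sum(s * x for s, x in zip(sims, grays))
    den = sum(sims)
    return num / den
```

Equal weights give the plain mean; a neighbor with triple the similarity pulls the result toward its gray value.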
4. The image segmentation method as set forth in claim 1, wherein the second step includes:
constructing an objective function based on a fuzzy C-means algorithm; the objective function is:
under the membership constraint condition, optimizing the objective function using the Lagrange multiplier method to obtain the membership calculation formula;
and calculating the current membership matrix according to a membership calculation formula.
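The constraint and the resulting membership formula appear only as images. If, as in conditional fuzzy clustering, the memberships of each gray level are constrained to sum to the condition variable f_g, the Lagrange optimization yields μ_ig = f_g / Σ_j (|I_g − v_i| / |I_g − v_j|)^(2/(m−1)). That reading is an assumption; a sketch with illustrative names:

```python
# Assumed conditional-FCM membership from the Lagrange optimization:
# mu_ig = f_g / sum_j (|I_g - v_i| / |I_g - v_j|)^(2/(m-1)),
# so the memberships of gray level g sum to the condition variable f_g.
# The patent's exact formula is shown only as an image.

def membership(I_g, f_g, centers, m=2.0):
    """I_g     : value of the g-th gray level
    f_g     : condition variable of the g-th gray level
    centers : current cluster centers v_1..v_C
    Returns the memberships mu_ig for i = 1..C."""
    d = [abs(I_g - v) + 1e-12 for v in centers]  # guard zero distance
    C = len(centers)
    return [f_g / sum((d[i] / d[j]) ** (2.0 / (m - 1.0)) for j in range(C))
            for i in range(C)]
```

A gray level midway between two centers splits its condition variable evenly between them; a level sitting on a center gives that center nearly all of f_g.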
5. The image segmentation method as set forth in claim 1, wherein the updating the cluster center set based on the region corresponding to the current cycle comprises:
determining an updated cluster center set according to a cluster center updating formula;
the cluster center updating formula is as follows:
wherein v_i represents the cluster center of the i-th region in the cluster center set; γ_g represents the number of pixels at the g-th gray level; μ_ig represents the fuzzy membership of the g-th gray level with respect to the cluster center v_i; I_g represents the value of the g-th gray level; C represents the number of region divisions; G represents the maximum gray level; and m represents the fuzziness index.
6. An image dividing apparatus, comprising:
the operator calculation module is used for calculating a local space operator and a local gray operator of the original gray image;
the filtering processing module is used for carrying out nonlinear filtering processing on the original gray image according to the local space operator and the local gray operator to obtain a nonlinear weighted sum image;
the gray level set acquisition module is used for acquiring the gray level set of the nonlinear weighted sum image;
the image segmentation module is used for carrying out image segmentation on the nonlinear weighted sum image based on a fuzzy clustering algorithm and a gray level set of the nonlinear weighted sum image to obtain a region after the nonlinear weighted sum image is segmented;
the image segmentation module is specifically used for:
step one: initializing classification parameters and conditional variables of the nonlinear weighted sum image, wherein the classification parameters comprise the number of regional segmentation and a cluster center set, and the cluster center set comprises cluster centers of all regions;
step two: based on a fuzzy C-means clustering algorithm, calculating a current membership matrix according to the gray level set, the number of region divisions, the current condition variable, and the current cluster center set, wherein the element μ_ig of the current membership matrix represents the fuzzy membership of the g-th gray level with respect to the i-th cluster center in the current cluster center set;
step three: based on the current membership matrix, dividing pixels with the same gray level in the nonlinear weighted sum image into the region with the maximum fuzzy membership for that gray level, and determining the segmented regions of the nonlinear weighted sum image in the current cycle;
step four: updating the condition variable and the cluster center set based on the region corresponding to the current cycle;
step five: returning the condition variable and the cluster center set updated in step four to step two, and repeatedly executing steps two to five until the classification parameters satisfy the loop termination condition, then outputting the segmented regions of the nonlinear weighted sum image from the last cycle;
the updating the condition variable based on the area corresponding to the current cycle comprises the following steps:
determining an updated condition variable according to the condition variable updating formula;
the conditional variable updating formula is as follows:
wherein P_imin represents the minimum of the prior probabilities of all regions in the current cycle; γ_g represents the number of pixels at the g-th gray level; G_i represents the set of gray levels belonging to the i-th region; N represents the total number of pixels of the nonlinear weighted sum image; N_i represents the number of pixels belonging to the i-th region; and P_i represents the prior probability of the i-th region in the current cycle;
wherein the membership calculation formula is:
wherein v_i represents the current cluster center, I_g represents the value of the g-th gray level, i ∈ [1, 2, …, C], and m represents the fuzziness index.
7. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when the computer program is executed.
8. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 5.
CN202110649372.7A 2021-06-10 2021-06-10 Image segmentation method and device and terminal equipment Active CN113269796B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110649372.7A CN113269796B (en) 2021-06-10 2021-06-10 Image segmentation method and device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110649372.7A CN113269796B (en) 2021-06-10 2021-06-10 Image segmentation method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN113269796A CN113269796A (en) 2021-08-17
CN113269796B true CN113269796B (en) 2023-08-25

Family

ID=77234763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110649372.7A Active CN113269796B (en) 2021-06-10 2021-06-10 Image segmentation method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN113269796B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105741279A (en) * 2016-01-27 2016-07-06 西安电子科技大学 Rough set based image segmentation method for quickly inhibiting fuzzy clustering
CN107727010A (en) * 2017-10-31 2018-02-23 北京农业信息技术研究中心 A kind of method for measuring corps leaf surface product index
CN109145921A (en) * 2018-08-29 2019-01-04 江南大学 A kind of image partition method based on improved intuitionistic fuzzy C mean cluster
CN109360207A (en) * 2018-09-26 2019-02-19 江南大学 A kind of fuzzy clustering method merging neighborhood information
CN110211126A (en) * 2019-06-12 2019-09-06 西安邮电大学 Image partition method based on intuitionistic fuzzy C mean cluster
CN110634141A (en) * 2019-09-19 2019-12-31 南京邮电大学 Image segmentation method based on improved intuitionistic fuzzy c-means clustering and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research and Application of Image Segmentation Algorithms Based on Fuzzy Clustering; Lu Haiqing; China Master's Theses Full-text Database, Information Science and Technology Series; 2019-01-15; Chapter 2 *

Also Published As

Publication number Publication date
CN113269796A (en) 2021-08-17

Similar Documents

Publication Publication Date Title
CN111860398B (en) Remote sensing image target detection method and system and terminal equipment
CN111507993A (en) Image segmentation method and device based on generation countermeasure network and storage medium
CN111553215B (en) Personnel association method and device, graph roll-up network training method and device
CN111950408B (en) Finger vein image recognition method and device based on rule diagram and storage medium
CN111080654B (en) Image lesion region segmentation method and device and server
Alawad et al. Stochastic-based deep convolutional networks with reconfigurable logic fabric
CN110533632B (en) Image blurring tampering detection method and device, computer equipment and storage medium
CN111079764A (en) Low-illumination license plate image recognition method and device based on deep learning
CN110675334A (en) Image enhancement method and device
CN112328715A (en) Visual positioning method, training method of related model, related device and equipment
CN112183212A (en) Weed identification method and device, terminal equipment and readable storage medium
CN115147598A (en) Target detection segmentation method and device, intelligent terminal and storage medium
CN112819199A (en) Precipitation prediction method, device, equipment and storage medium
CN109961435B (en) Brain image acquisition method, device, equipment and storage medium
CN113920382B (en) Cross-domain image classification method based on class consistency structured learning and related device
Hossain et al. Anti-aliasing deep image classifiers using novel depth adaptive blurring and activation function
CN113269796B (en) Image segmentation method and device and terminal equipment
CN112465837B (en) Image segmentation method for sparse subspace fuzzy clustering by utilizing spatial information constraint
CN110647805B (en) Reticulate pattern image recognition method and device and terminal equipment
CN114065913A (en) Model quantization method and device and terminal equipment
CN112950652A (en) Robot and hand image segmentation method and device thereof
CN110232302B (en) Method for detecting change of integrated gray value, spatial information and category knowledge
CN113516275A (en) Power distribution network ultra-short term load prediction method and device and terminal equipment
CN109871867A (en) A kind of pattern fitting method of the data characterization based on preference statistics
CN112150484B (en) Super-pixel dirichlet mixing model image segmentation method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant