CN115131375B - Automatic ore segmentation method - Google Patents

Automatic ore segmentation method

Info

Publication number
CN115131375B
CN115131375B (application CN202211037087.0A)
Authority
CN
China
Prior art keywords
edge
ore
block
image
blocks
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211037087.0A
Other languages
Chinese (zh)
Other versions
CN115131375A (en)
Inventor
易小艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Fleck Fluid Equipment Co ltd
Original Assignee
Nantong Fleck Fluid Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Fleck Fluid Equipment Co ltd filed Critical Nantong Fleck Fluid Equipment Co ltd
Priority to CN202211037087.0A priority Critical patent/CN115131375B/en
Publication of CN115131375A publication Critical patent/CN115131375A/en
Application granted granted Critical
Publication of CN115131375B publication Critical patent/CN115131375B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/40 Analysis of texture
    • G06T 7/41 Analysis of texture based on statistical description of texture
    • G06T 7/45 Analysis of texture based on statistical description of texture using co-occurrence matrix computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V 10/763 Non-hierarchical techniques, e.g. based on statistics of modelling distributions

Abstract

The invention relates to the technical field of image processing, in particular to an automatic ore segmentation method, which is essentially a method for segmenting ores based on image processing, and comprises the following steps: obtaining an ore gray image, determining the size of a quantization block, and segmenting the ore gray image according to the size of the quantization block to obtain an ore segmentation image; calculating color information indexes of the quantization blocks according to the gray levels in the quantization blocks; calculating an edge information index according to the shape divergence coefficient, the category divergence coefficient and the space divergence coefficient of the quantization block; calculating texture information indexes of the quantization blocks according to entropy values corresponding to gray level co-occurrence matrixes of the pixel points in the quantization blocks on the ore segmentation image; calculating to obtain the complexity of the quantization block; and clustering the quantization blocks according to the complexity to obtain different classification areas, and respectively segmenting the different classification areas to obtain ore segmentation results. The invention improves the segmentation precision on the basis of reducing the segmentation calculation amount.

Description

Automatic ore segmentation method
Technical Field
The invention relates to the technical field of image processing, in particular to an automatic ore segmentation method.
Background
With the development of intelligent mining equipment in recent years, classification of ore types, ore granularity inspection and ore grade estimation can be realized intelligently. To improve the accuracy of classification, granularity inspection and grade estimation, the ore must first be segmented accurately. However, because the ore environment is complex, the collected ore image suffers from uneven illumination, adhesion and accumulation of ore blocks, low distinguishability from the background and the like, which makes accurate segmentation of the ore blocks in the ore image very difficult.
Meanwhile, because the scene in the ore image is complex, the image contains not only ore regions but also background regions; and because the illumination of the shooting environment is unbalanced, regions with uniform illumination and regions with non-uniform illumination both appear in the ore image, so a single segmentation method cannot achieve a good segmentation effect.
Disclosure of Invention
In order to solve the above technical problems, the present invention aims to provide an automatic ore segmentation method, which adopts the following technical scheme:
acquiring an ore gray image, performing edge detection on the ore gray image to obtain an ore edge image, and setting a shape coefficient and a size coefficient of a quantization block; determining the size of a quantization block according to the size of the ore gray level image, the shape coefficient and the size coefficient; segmenting the ore gray level image according to the size of the quantization block to obtain an ore segmentation image, wherein the ore segmentation image comprises a plurality of quantization blocks;
summing the differences between the frequency corresponding to each gray level in each quantization block on the ore segmentation image and the median of the frequencies corresponding to all the gray levels, and obtaining the color information index of each quantization block according to the ratio of the summation result to the total number of gray levels in the quantization block;
extracting edge sub-blocks from corresponding positions on the ore edge image according to the size of the quantization blocks, acquiring edge pixel points in the edge sub-blocks, and clustering the edge pixel points to obtain a plurality of edge line segments; calculating the average value of the correlation of the edge pixel points on all the edge line segments to obtain the shape divergence coefficient of each edge sub-block; acquiring the number of edge line segments in the edge sub-blocks, and recording the number as a category divergence coefficient of each edge sub-block; calculating a spatial divergence coefficient of each edge sub-block based on the position distance between edge line segments in the edge sub-blocks; obtaining an edge information index according to the shape divergence coefficient, the category divergence coefficient and the space divergence coefficient;
calculating texture information indexes of the quantization blocks according to entropy values corresponding to gray level co-occurrence matrixes of the pixel points in the quantization blocks on the ore segmentation image; weighting and summing the color information index, the edge information index and the texture information index to obtain the complexity of the quantization block; and clustering the quantized blocks according to the complexity of each quantized block on the ore segmentation image to obtain different classification areas, and segmenting the different classification areas respectively to obtain an ore segmentation result.
Preferably, the method for acquiring the ore edge image specifically comprises the following steps: and extracting edge information of the ore gray image by using a Canny operator, reassigning the pixel values of edge pixel points to be 1, reassigning the pixel values of other pixel points to be 0, obtaining a binary image, and recording the binary image as the ore edge image.
Preferably, the method for obtaining the size of the quantization block specifically includes:
[Block-size formulas rendered as images in the original publication.]
In these formulas, the quantization block size is expressed in terms of the size M×N of the ore gray image, the size coefficient zo of the quantization block, the shape coefficient fe of the quantization block, and min(M, N), the minimum of M and N.
Preferably, the method for acquiring the color information index specifically includes:
[Formulas rendered as images in the original publication.]
In these formulas, the color information index of the quantization block is expressed in terms of the frequency corresponding to the i-th gray level within the quantization block, the median of the frequencies of all gray levels, the total number l of gray levels in the quantization block, and the uniformity of the quantization block. The frequency corresponding to the i-th gray level is the ratio of the number of pixels at the i-th gray level to the total number of pixels in the quantization block.
Preferably, the method for obtaining the shape divergence coefficient specifically comprises:
acquiring coordinates of all edge pixel points on an edge line segment in an edge sub-block, respectively forming a row coordinate sequence and a column coordinate sequence corresponding to the edge line segment by using row coordinates and column coordinates of all edge pixel points, and calculating a Pearson correlation coefficient of the row coordinate sequence and the column coordinate sequence to obtain the correlation of the edge pixel points on the edge line segment; and obtaining the average value of the correlation of the edge pixel points on all the edge line segments to obtain the shape divergence coefficient of each edge sub-block.
Preferably, the method for obtaining the spatial divergence coefficient specifically includes:
acquiring coordinates of a central point on an edge line segment in the edge sub-block, wherein the row coordinates of the central point are mean values of all edge pixel point row coordinates on the edge line segment, and the column coordinates of the central point are mean values of all edge pixel point column coordinates on the edge line segment; calculating the row coordinate and the column coordinate of the whole density center of the edge sub-block, and expressing the row coordinate and the column coordinate as follows:
[Formulas rendered as images in the original publication.]
In these formulas, the row coordinate and the column coordinate of the overall density center of the edge sub-block are expressed in terms of the row coordinate and the column coordinate of the center point of the e-th edge line segment, the total number Q of edge line segments in the edge sub-block, the number of edge pixel points on the e-th edge line segment, and the number of edge pixel points contained in all edge line segments in the edge sub-block;
calculating the position distance between the edge line segments based on the row coordinates and the column coordinates of the whole density center of the edge sub-blocks to obtain the spatial divergence coefficient of the edge sub-blocks, and expressing the coefficient as follows by using a formula:
[Formula rendered as an image in the original publication.]
In this formula, the spatial divergence coefficient is expressed in terms of the row coordinate and the column coordinate of the overall density center of the edge sub-block, the row coordinate and the column coordinate of the center point of the e-th edge line segment, and the total number Q of edge line segments in the edge sub-block.
Preferably, the method for acquiring the edge information index specifically includes:
if the number of the edge line segments in the edge sub-block is more than 1, carrying out weighted summation on the shape divergence coefficient, the category divergence coefficient and the space divergence coefficient to obtain edge distribution divergence; if the number of the edge line segments in the edge sub-block is equal to 1, the edge distribution divergence is a shape divergence coefficient; if the number of the edge line segments in the edge sub-block is equal to 0, the value of the edge distribution divergence is 0; obtaining the edge density of the edge sub-block according to the ratio of the number of the edge pixel points in the edge sub-block to the number of all the pixel points in the edge sub-block; and obtaining an edge information index according to the edge density and the edge distribution divergence of the edge sub-blocks.
Preferably, the clustering the quantized blocks according to the complexity of each quantized block on the ore segmentation image to obtain different classification regions specifically comprises: segmenting the ore gray level image according to the quantized blocks with different sizes to obtain ore segmented images corresponding to the quantized blocks with different sizes; respectively calculating the complexity mean value of a quantization block on each ore segmentation image, and acquiring the ore segmentation image corresponding to the maximum value of the complexity mean value; and clustering the quantized blocks on the ore segmentation image according to the complexity of the quantized blocks on the ore segmentation image to obtain an area with uneven illumination of the ore blocks, an area with even illumination of the ore blocks and a mixed area.
Preferably, the obtaining of the ore segmentation result by respectively segmenting the different classification areas specifically comprises:
segmenting the area with uneven illumination of the ore blocks by using a locally adaptive threshold segmentation method to obtain an ore block part and a background part of the area; segmenting the area with uniform illumination of the ore blocks by using the Otsu threshold segmentation algorithm to obtain an ore block part and a background part of the area; calculating a gray level co-occurrence matrix of each pixel point in the mixed region, and calculating the corresponding energy, entropy value, contrast and inverse difference moment according to the gray level co-occurrence matrix; and forming texture vectors of all pixel points in the mixed region from the energy, the entropy, the contrast and the inverse difference moment, and clustering the mixed region according to the texture vectors of the pixel points to obtain an ore block part and a background part of the mixed region.
The embodiment of the invention at least has the following beneficial effects:
the invention essentially relates to a method for segmenting ores based on image processing, which particularly calculates the complexity of a single image block based on the color information index, the edge information index and the texture information index of an ore image, then divides the image block in the image into three regions based on the complexity of the image block in the image, and adopts a proper segmentation method for each region, thereby improving the segmentation precision on the basis of reducing the segmentation calculation amount. Meanwhile, a segmentation mode more fitting the shape and size of the ore blocks is selected to segment the image, and a better segmentation effect can be obtained.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a flow chart of a method of automatic ore segmentation according to the present invention.
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve the intended objects and their effects, a detailed description of an automatic ore segmentation method according to the present invention, its specific implementation, structure, features and effects, is given below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of an automatic ore segmentation method provided by the invention in detail with reference to the accompanying drawings.
Referring to Fig. 1, a flow chart of an automatic ore segmentation method according to an embodiment of the present invention is shown; the method includes the following steps:
the method comprises the steps of firstly, obtaining an ore gray image, carrying out edge detection on the ore gray image to obtain an ore edge image, and setting a shape coefficient and a size coefficient of a quantization block; determining the size of a quantization block according to the size of the ore gray level image, the shape coefficient and the size coefficient; and segmenting the ore gray level image according to the size of the quantization block to obtain an ore segmentation image, wherein the ore segmentation image comprises a plurality of quantization blocks.
Specifically, an ore image of size M×N is collected and converted from a color RGB image into a gray image; a Gaussian filter is used to filter noise from the image, and then a gamma transform is used to correct the gray values of the image, which improves the dark regions of the ore image and yields the ore gray image.
And extracting edge information of the ore gray image by using a Canny operator, reassigning the pixel values of edge pixel points to be 1, reassigning the pixel values of other pixel points to be 0, obtaining a binary image, and recording the binary image as the ore edge image.
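As an illustration only, this preprocessing and edge-extraction chain could be sketched in Python with OpenCV as follows; the Gaussian kernel size, the gamma value and the Canny thresholds are assumptions and are not taken from the patent.

```python
import cv2
import numpy as np

def preprocess_and_edges(bgr_image, gamma=1.5, canny_low=50, canny_high=150):
    """Return the ore gray image and a 0/1 ore edge image."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)                  # noise filtering
    # Gamma transform to brighten dark ore images: out = (in/255)^(1/gamma) * 255
    table = ((np.arange(256) / 255.0) ** (1.0 / gamma) * 255).astype(np.uint8)
    gray = cv2.LUT(gray, table)
    edges = cv2.Canny(gray, canny_low, canny_high)            # Canny edge map (0/255)
    ore_edge = (edges > 0).astype(np.uint8)                   # binary ore edge image (0/1)
    return gray, ore_edge
```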
It should be noted that, because the distribution of regions in the collected ore image is relatively complex, there is not only an ore region but also a background region; and because of the uneven illumination, there may be regions with uniform illumination and regions with non-uniform illumination in the collected ore image. A suitable segmentation method is therefore applied to each region, which improves the segmentation accuracy while reducing the amount of calculation.
Based on the idea of quantization blocks, a shape coefficient fe and a size coefficient zo of the quantization block are set, and the size of the quantization block is determined according to the size of the ore gray image together with the shape coefficient and the size coefficient. In this embodiment, the shape coefficient of the quantization block takes three values, namely 0.5, 1 and 2, and the size coefficient takes six values, namely 0.005, 0.01, 0.02, 0.04, 0.08 and 0.2. From the three shape coefficients and six size coefficients, 18 quantization block sizes can be obtained, expressed by the formulas:
[Block-size formulas rendered as images in the original publication.]
In these formulas, the quantization block size is expressed in terms of the size M×N of the ore gray image, the size coefficient zo of the quantization block, the shape coefficient fe of the quantization block, and min(M, N), the minimum of M and N.
The shape coefficient and the size coefficient can be set by the implementer according to the actual situation. In the acquired ore image, the ratio of the area of a single ore block to the area of the whole image may differ depending on the shooting distance of the ore blocks. The larger the average area of a single ore block in the image, the fewer blocks are needed and the larger the size coefficient zo that should be adopted; meanwhile, different shape coefficients suit different ore block shapes: if the length and width of the ore blocks are similar, a shape coefficient fe = 1 is suitable, and if the length and width differ greatly, fe = 0.5 or fe = 2 is suitable. The size coefficient zo and the shape coefficient fe are selected to fit the shape and size of the ore blocks in the ore image more accurately, that is, the ore image is "quantized" with the most appropriate "quantization scale", which reduces the amount of calculation while ensuring an accurate fit to the shape and size of the ore blocks.
And segmenting the ore gray level image according to the calculated sizes of the quantized blocks to obtain an ore segmentation image, wherein the ore segmentation image comprises a plurality of quantized blocks. It should be noted that, the present invention obtains a plurality of different quantization block sizes to segment the image through a plurality of different values of shape coefficients and size coefficients, and further analyzes the segmentation effect subsequently to find the quantization block size corresponding to the optimal segmentation result, which can ensure accurate segmentation of the ore image.
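A minimal sketch of the quantization-block idea is given below. Since the block-size formulas themselves are rendered as images in the source, the mapping used here (block height zo·min(M, N) and block width fe·zo·min(M, N)) is an assumption; only the coefficient values come from the text.

```python
SHAPE_COEFFS = (0.5, 1, 2)                           # fe values from the text
SIZE_COEFFS = (0.005, 0.01, 0.02, 0.04, 0.08, 0.2)   # zo values from the text

def candidate_block_sizes(M, N):
    """Return the 18 candidate (height, width) quantization block sizes."""
    sizes = []
    for zo in SIZE_COEFFS:
        for fe in SHAPE_COEFFS:
            h = max(1, round(zo * min(M, N)))        # assumed: height = zo * min(M, N)
            w = max(1, round(fe * zo * min(M, N)))   # assumed: width = fe * zo * min(M, N)
            sizes.append((h, w))
    return sizes

def split_into_blocks(gray, block_h, block_w):
    """Tile the ore gray image into quantization blocks (border blocks may be smaller)."""
    M, N = gray.shape
    return [gray[r:r + block_h, c:c + block_w]
            for r in range(0, M, block_h)
            for c in range(0, N, block_w)]
```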
And step two, summing the differences between the frequency corresponding to each gray level in each quantization block on the ore segmentation image and the median of the frequencies corresponding to all the gray levels, and obtaining the color information index of each quantization block according to the ratio of the summation result to the total number of gray levels in the quantization block.
Specifically, a color information index of the quantization block is calculated, formulated as:
[Formulas rendered as images in the original publication.]
In these formulas, the color information index of the quantization block is expressed in terms of the frequency corresponding to the i-th gray level within the quantization block, the median of the frequencies of all gray levels, the total number l of gray levels in the quantization block, and the uniformity of the quantization block. The frequency corresponding to the i-th gray level is the ratio of the number of pixels at the i-th gray level to the total number of pixels in the quantization block.
It should be noted that the color information index indicates how much color information a single quantization block contains, and in a grayscale image the gray levels carry that color information: the more gray levels a single quantization block contains and the closer the frequencies of the pixels at each gray level are to one another, the larger the color information index. If the pixel points in a single quantization block are distributed evenly over the different gray levels, the uniformity is smaller; at the same time, the larger the total number of gray levels, the more color information the quantization block contains, and the larger the color information index.
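The following sketch illustrates one plausible reading of the color information index. Because the exact formulas are images in the source, the specific combination below (uniformity as the mean absolute deviation of the gray-level frequencies from their median, and an index that grows with the number of gray levels and shrinks with the uniformity) is an assumption that only reproduces the qualitative behavior described above.

```python
import numpy as np

def color_information_index(block):
    """Color information index of one quantization block (assumed formula)."""
    levels, counts = np.unique(block, return_counts=True)
    freq = counts / counts.sum()                     # frequency of each gray level
    l = len(levels)                                  # total number of gray levels
    uniformity = np.abs(freq - np.median(freq)).sum() / l
    return l / (1.0 + uniformity)                    # grows with l, shrinks with uniformity
```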
Step three, extracting edge sub-blocks from corresponding positions on the ore edge image according to the size of the quantization blocks, acquiring edge pixel points in the edge sub-blocks, and clustering the edge pixel points to obtain a plurality of edge line segments; calculating the average value of the correlation of the edge pixel points on all the edge line segments to obtain the shape divergence coefficient of each edge sub-block; acquiring the number of edge line segments in each edge sub-block and recording it as the category divergence coefficient of the edge sub-block; calculating a spatial divergence coefficient of each edge sub-block based on the position distances between edge line segments in the edge sub-block; and obtaining an edge information index according to the shape divergence coefficient, the category divergence coefficient and the spatial divergence coefficient.
First, according to the size of the quantization block and the position in the ore segmentation image, the quantization blocks with the same size are extracted from the corresponding position on the ore edge image and are recorded as edge sub-blocks. And obtaining edge pixel points in the edge sub-block, and obtaining the edge density of the edge sub-block according to the ratio of the number of the edge pixel points in the edge sub-block to the number of all the pixel points in the edge sub-block.
It should be noted that the edge sub-blocks are image blocks which are obtained according to the sizes of the quantized blocks and the positions in the ore segmented image and contain edge information, the edge sub-blocks are essentially edge image blocks corresponding to the quantized blocks in the ore segmented image, and the edge sub-blocks correspond to the quantized blocks one by one, so that the related indexes obtained by analyzing the edge information of the edge sub-blocks can also be regarded as related indexes of the quantized blocks.
The edge pixel points in each edge sub-block are clustered with the density clustering algorithm DBSCAN to obtain a plurality of independent, mutually disjoint edge line segments. The relevant parameters of the clustering algorithm, namely the neighborhood radius and the minimum-points threshold, are set to fixed values (rendered as images in the original publication).
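A sketch of this clustering step with scikit-learn's DBSCAN is shown below; the eps and min_samples values are assumptions, since the patent's parameter values are not reproduced.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def edge_line_segments(edge_subblock, eps=1.5, min_samples=3):
    """Cluster the edge pixels of one edge sub-block into edge line segments."""
    pts = np.column_stack(np.nonzero(edge_subblock))           # (row, col) of edge pixels
    if len(pts) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)
    return [pts[labels == k] for k in set(labels) if k != -1]  # drop DBSCAN noise points
```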
Then, coordinates of all edge pixel points on an edge line segment in the edge sub-block are obtained, row coordinates and column coordinates of all the edge pixel points respectively form a row coordinate sequence and a column coordinate sequence corresponding to the edge line segment, and a Pearson correlation coefficient of the row coordinate sequence and the column coordinate sequence is calculated to obtain the correlation of the edge pixel points on the edge line segment; obtaining the average value of the correlation of the edge pixel points on all the edge line segments to obtain the shape divergence coefficient of each edge sub-block, and expressing the shape divergence coefficient as follows by using a formula:
[Formula rendered as an image in the original publication.]
In this formula, the shape divergence coefficient of the edge sub-block is expressed in terms of the total number Q of edge line segments in the edge sub-block and the correlation of the edge pixel points on the a-th edge line segment. In this embodiment the correlation is obtained by calculating the Pearson correlation coefficient and its value range is [0, 1]: the closer an edge line segment is to a straight line, the closer the correlation is to 1 and the smaller the shape divergence coefficient; conversely, the more curved an edge line segment is and the less it resembles a straight line, the closer the correlation is to 0 and the larger the shape divergence coefficient.
It should be noted that in this embodiment the Pearson correlation coefficient is used as the correlation of the edge pixel points to obtain the shape divergence of the edge sub-block; an implementer may select another suitable method according to the actual situation to calculate the similarity of the edge pixel points on each edge line segment, as long as it represents the degree to which each edge line segment resembles a straight line.
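A sketch of the shape divergence computation is given below; treating the coefficient as one minus the average absolute Pearson correlation is an assumption consistent with the qualitative description above.

```python
import numpy as np

def shape_divergence(segments):
    """Shape divergence of an edge sub-block from per-segment Pearson correlations."""
    corrs = []
    for seg in segments:                              # seg: (n, 2) array of (row, col)
        if len(seg) < 2 or seg[:, 0].std() == 0 or seg[:, 1].std() == 0:
            corrs.append(1.0)                         # axis-aligned segment: treat as straight
            continue
        r = np.corrcoef(seg[:, 0], seg[:, 1])[0, 1]   # Pearson correlation of row/col sequences
        corrs.append(abs(r))                          # correlation in [0, 1]
    return 1.0 - float(np.mean(corrs)) if corrs else 0.0
```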
Meanwhile, the number of independent, mutually disjoint edge line segments in the edge sub-block is obtained and recorded as the category divergence coefficient of the edge sub-block.
Further, coordinates of a central point on an edge line segment in the edge sub-block are obtained, wherein the row coordinates of the central point are mean values of row coordinates of all edge pixel points on the edge line segment, and the column coordinates of the central point are mean values of column coordinates of all edge pixel points on the edge line segment; calculating the row coordinate and the column coordinate of the whole density center of the edge sub-block, and expressing the row coordinate and the column coordinate as follows:
[Formulas rendered as images in the original publication.]
In these formulas, the row coordinate and the column coordinate of the overall density center of the edge sub-block are expressed in terms of the row coordinate and the column coordinate of the center point of the e-th edge line segment, the total number Q of edge line segments in the edge sub-block, the number of edge pixel points on the e-th edge line segment, and the number of edge pixel points contained in all edge line segments in the edge sub-block. The overall density center represents where the edge line segments are distributed within the edge sub-block as a whole; for example, if more edge pixel points appear in the upper left corner of the edge sub-block, the overall density center is, because of the weighting, more likely to lie in the upper left corner of the edge sub-block.
Calculating the position distance between the edge line segments based on the row coordinates and the column coordinates of the whole density center of the edge sub-blocks to obtain the spatial divergence coefficient of the edge sub-blocks, and expressing the coefficient as follows by using a formula:
[Formula rendered as an image in the original publication.]
In this formula, the spatial divergence coefficient is expressed in terms of the row coordinate and the column coordinate of the overall density center of the edge sub-block, the row coordinate and the column coordinate of the center point of the e-th edge line segment, and the total number Q of edge line segments in the edge sub-block. It should be noted that the center point of an edge line segment with a large number of edge pixel points has a larger influence on the position of the overall density center.
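The sketch below illustrates the overall density center (the pixel-count-weighted mean of the segment center points, as described above) and a spatial divergence coefficient taken here as the mean distance from each segment's center point to that density center; the exact aggregation used in the patent is an image in the source, so this form is an assumption.

```python
import numpy as np

def spatial_divergence(segments):
    """Spatial divergence of an edge sub-block (assumed: mean distance to density center)."""
    if not segments:
        return 0.0
    centers = np.array([seg.mean(axis=0) for seg in segments])      # per-segment center points
    weights = np.array([len(seg) for seg in segments], dtype=float) # pixel counts as weights
    density_center = (centers * weights[:, None]).sum(axis=0) / weights.sum()
    return float(np.linalg.norm(centers - density_center, axis=1).mean())
```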
Finally, if the number of edge line segments in the edge sub-block is greater than 1, the shape divergence coefficient, the category divergence coefficient and the spatial divergence coefficient are weighted and summed to obtain the edge distribution divergence. If the number of edge line segments in the edge sub-block is equal to 1, the edge distribution divergence is the shape divergence coefficient; if the number of edge line segments in the edge sub-block is equal to 0, the edge distribution divergence is 0. The edge information index is then obtained according to the edge density and the edge distribution divergence of the edge sub-block, expressed by a formula as follows:
[Formula rendered as an image in the original publication.]
In this formula, the edge information index is expressed in terms of the edge distribution divergence, the edge density, and a weight coefficient, which takes a value of 5 in this embodiment. The higher the edge density and the greater the edge distribution divergence, the larger the edge information index.
It should be noted that the edge distribution divergence indicates the spatial and shape structure characteristics of the edge pixel points in the edge sub-block: the more complex the shape structure of the edge pixel points, the greater the spatial dispersion of the edges. For example, among edge sub-blocks with the same edge density, edge pixel points that approximate a straight line have a smaller spatial dispersion than those forming a curve, which is captured by the shape divergence coefficient; two independent edge line segments have a larger spatial dispersion than a single edge line segment, which is captured by the category divergence coefficient; and two independent edge line segments that are far apart have a larger spatial dispersion than two that are close together, which is captured by the spatial divergence coefficient.
Meanwhile, the edge information index calculated from the edge distribution divergence can to some extent reflect whether the division into edge sub-blocks and quantization blocks is reasonable. Ideally, when quantization blocks and edge sub-blocks of a suitable size are used, a single quantization block or edge sub-block should contain the complete edge information of an ore block; if the quantization blocks and edge sub-blocks are too small, an edge line segment in such a block may be only part of a line segment of an ore edge and will more closely approximate a straight line.
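The sketch below combines the three coefficients into an edge information index, reusing the shape_divergence and spatial_divergence helpers sketched earlier. The equal weights on the three divergence coefficients and the way density and divergence are combined are assumptions; only the weight coefficient of 5 comes from the text.

```python
def edge_information_index(edge_subblock, segments,
                           w_shape=1/3, w_cat=1/3, w_space=1/3, w=5.0):
    """Edge information index of one edge sub-block (assumed combination)."""
    q = len(segments)
    if q == 0:
        divergence = 0.0
    elif q == 1:
        divergence = shape_divergence(segments)
    else:                                             # weighted sum of the three coefficients
        divergence = (w_shape * shape_divergence(segments)
                      + w_cat * q                     # category divergence = number of segments
                      + w_space * spatial_divergence(segments))
    density = float(edge_subblock.sum()) / edge_subblock.size   # edge pixels / all pixels
    return density * (1.0 + w * divergence)           # assumed form; w = 5 per the text
```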
Step four, calculating texture information indexes of the quantization blocks according to the entropy values corresponding to the gray level co-occurrence matrices of the pixel points in the quantization blocks on the ore segmentation image; weighting and summing the color information index, the edge information index and the texture information index to obtain the complexity of each quantization block; clustering the quantization blocks according to the complexity of each quantization block on the ore segmentation image to obtain different classification areas; and segmenting the different classification areas respectively to obtain the ore segmentation result.
Firstly, the gray level co-occurrence matrix of each pixel point in each quantization block on the ore segmentation image is calculated, the corresponding entropy value is calculated from the gray level co-occurrence matrix, and the mean of the entropy values corresponding to all pixel points in the quantization block is taken as the texture information index of the quantization block. The entropy value of the gray level co-occurrence matrix represents the texture complexity of the quantization block; the gray level co-occurrence matrix and the method for obtaining its entropy are well-known techniques and are not described in detail here.
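A simplified sketch of the texture information index is shown below: it builds a single gray level co-occurrence matrix per quantization block (whereas the patent builds one per pixel and averages the entropies), using a horizontal neighbor offset and 32 quantized gray levels, all of which are assumptions.

```python
import numpy as np

def texture_information_index(block, levels=32):
    """GLCM entropy of one quantization block (horizontal neighbor offset)."""
    q = (block.astype(np.float64) / 256 * levels).astype(int)   # quantize gray levels
    glcm = np.zeros((levels, levels), dtype=np.float64)
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()           # horizontally adjacent pairs
    np.add.at(glcm, (left, right), 1)
    if glcm.sum() == 0:
        return 0.0
    p = glcm / glcm.sum()
    nz = p[p > 0]
    return float(-(nz * np.log2(nz)).sum())                     # entropy of the GLCM
```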
The color information index, the edge information index and the texture information index are then weighted and summed to obtain the complexity of the quantization block. The three tuning parameters (the weights of the color information index, the edge information index and the texture information index) take the values 0.05, 0.05 and 1, respectively, in this embodiment.
And then, segmenting the ore gray level image according to the quantization blocks with different sizes to obtain ore segmentation images corresponding to the quantization blocks with different sizes, wherein each time one quantization block with one size is adopted for image segmentation, one ore segmentation image is obtained. And respectively calculating the complexity mean value of the quantization blocks on each ore segmentation image, and acquiring the ore segmentation image corresponding to the maximum value of the complexity mean value, namely taking the segmentation mode with the highest complexity of a single quantization block on the ore segmentation image as the optimal segmentation mode. After the optimal segmentation mode is selected, clustering the quantized blocks on the ore segmentation image according to the complexity of the quantized blocks on the ore segmentation image to obtain an area with uneven illumination of the ore blocks, an area with even illumination of the ore blocks and a mixed area. Wherein, a kmeans algorithm is adopted for clustering, k =3 is set, and the clustering algorithm is a known technology and is not described in detail herein.
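A sketch of the complexity computation and the region clustering follows; the weights 0.05, 0.05 and 1 and k = 3 come from the text, while representing each quantization block by its scalar complexity as the k-means feature is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

def block_complexity(color_idx, edge_idx, texture_idx):
    """Complexity = weighted sum of the three indexes (weights 0.05, 0.05, 1 from the text)."""
    return 0.05 * color_idx + 0.05 * edge_idx + 1.0 * texture_idx

def cluster_blocks(complexities, k=3):
    """Cluster quantization blocks into k = 3 classification areas by complexity."""
    x = np.asarray(complexities, dtype=float).reshape(-1, 1)
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(x)
```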
Finally, because of the uneven illumination, the gray value distributions of different areas in the ore gray image differ, but the gray values of the ore block parts differ greatly from those of their edge parts, so the area with uneven illumination of the ore blocks can be segmented directly with a locally adaptive threshold segmentation method to obtain the ore block part and the background part of that area: the pixel regions whose gray values are larger than the optimal threshold obtained by the locally adaptive threshold segmentation method are taken as the ore block part, and the remaining pixel regions as the background part.
Because the region with uniform illumination of the ore blocks mainly comprises the ore blocks, and the illumination is uniform, the gray value distribution of different ore block regions is relatively similar, and the gray value difference between the ore block part and the edge part is relatively large. Therefore, the ore block uniform illumination area can be directly segmented by using the Otsu threshold segmentation algorithm to obtain the ore block part and the background part of the area. Wherein, the pixel area which is larger than the optimal threshold value obtained by the Otsu threshold segmentation algorithm is used as the ore block part of the area, and the other pixel areas are used as the background part.
Because the gray value distribution of the ore block part and the background part contained in the mixed region is similar, the effect of threshold segmentation based on the gray value is poor, but the texture difference between the two parts is obvious, so that the texture information of each pixel point can be obtained based on the gray co-occurrence matrix.
The gray level co-occurrence matrix of each pixel point in the mixed region is calculated, and the corresponding energy, entropy, contrast and inverse difference moment are calculated from the gray level co-occurrence matrix; the energy, entropy, contrast and inverse difference moment form the texture vector of each pixel point in the mixed region, and the mixed region is clustered according to the texture vectors of the pixel points to obtain the ore block part and the background part of the mixed region. The clustering is performed with the k-means algorithm with k = 2. Meanwhile, because the texture of the background is more uniform and regular, its pixels have a higher mean energy (ASM), so the two region classes obtained by the k-means clustering are labeled as the background part and the ore block part according to their mean energy ASM.
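The per-region segmentation could be sketched as follows; the adaptive-threshold window size and the construction of the per-pixel texture vectors are assumptions, while the use of locally adaptive thresholding, Otsu thresholding, k-means with k = 2, and the relabeling of the higher-energy cluster as background follow the text.

```python
import cv2
from sklearn.cluster import KMeans

def segment_uneven(gray_region):
    """Locally adaptive threshold for the unevenly lit ore area (ore = 255, background = 0)."""
    return cv2.adaptiveThreshold(gray_region, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY, 31, 0)

def segment_even(gray_region):
    """Otsu threshold for the evenly lit ore area."""
    _, mask = cv2.threshold(gray_region, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

def segment_mixed(texture_vectors):
    """k-means (k = 2) on per-pixel [energy, entropy, contrast, inverse difference moment]."""
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(texture_vectors)
    return labels   # relabel afterwards: the cluster with higher mean energy (ASM) is background
```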
The above embodiments are only used to illustrate the technical solutions of the present application and not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart in substance from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. An automatic ore segmentation method is characterized by comprising the following steps:
acquiring an ore gray image, performing edge detection on the ore gray image to obtain an ore edge image, and setting a shape coefficient and a size coefficient of a quantization block; determining the size of a quantization block according to the size of the ore gray level image, the shape coefficient and the size coefficient; segmenting the ore gray level image according to the size of the quantization block to obtain an ore segmentation image, wherein the ore segmentation image comprises a plurality of quantization blocks;
summing the differences between the frequency corresponding to each gray level in each quantization block on the ore segmentation image and the median of the frequencies corresponding to all the gray levels, and obtaining the color information index of each quantization block according to the ratio of the summation result to the total number of gray levels in the quantization block;
extracting edge sub-blocks from corresponding positions on the ore edge image according to the size of the quantization blocks, acquiring edge pixel points in the edge sub-blocks, and clustering the edge pixel points to obtain a plurality of edge line segments; calculating the average value of the correlation of the edge pixel points on all the edge line segments to obtain the shape divergence coefficient of each edge sub-block; acquiring the number of edge line segments in the edge sub-blocks, and recording the number as a category divergence coefficient of each edge sub-block; calculating a spatial divergence coefficient of each edge sub-block based on the position distance between edge line segments in the edge sub-blocks; according to the shape divergence coefficient, the category divergence coefficient and the space divergence coefficient, obtaining an edge information index, specifically: if the number of the edge line segments in the edge sub-block is more than 1, carrying out weighted summation on the shape divergence coefficient, the category divergence coefficient and the space divergence coefficient to obtain edge distribution divergence; if the number of the edge line segments in the edge sub-blocks is equal to 1, the edge distribution divergence is a shape divergence coefficient; if the number of the edge line segments in the edge sub-block is equal to 0, the value of the edge distribution divergence is 0;
obtaining the edge density of the edge sub-block according to the ratio of the number of the edge pixel points in the edge sub-block to the number of all the pixel points in the edge sub-block; obtaining an edge information index according to the edge density and the edge distribution divergence of the edge sub-blocks, and expressing the edge information index by a formula as follows:
[Formula rendered as an image in the original publication.]
wherein the edge information index is expressed in terms of the edge distribution divergence, the edge density, and a weight coefficient;
calculating texture information indexes of the quantization blocks according to entropy values corresponding to gray level co-occurrence matrixes of the pixel points in the quantization blocks on the ore segmentation image; weighting and summing the color information index, the edge information index and the texture information index to obtain the complexity of the quantization block; and clustering the quantized blocks according to the complexity of each quantized block on the ore segmentation image to obtain different classification areas, and segmenting the different classification areas respectively to obtain an ore segmentation result.
2. The automatic ore segmentation method according to claim 1, wherein the method for obtaining the ore edge image specifically comprises: and extracting edge information of the ore gray image by using a Canny operator, reassigning the pixel values of edge pixel points to be 1, reassigning the pixel values of other pixel points to be 0, obtaining a binary image, and recording the binary image as the ore edge image.
3. The automatic ore segmentation method according to claim 1, wherein the size of the quantization block is obtained by a method specifically including:
[Formulas rendered as images in the original publication.]
wherein the size of the quantization block is expressed in terms of the size M×N of the ore gray image, the size coefficient zo of the quantization block, the shape coefficient fe of the quantization block, and min(M, N), the minimum of M and N.
4. The automatic ore segmentation method according to claim 1, wherein the color information index is obtained by a method specifically comprising:
[Formulas rendered as images in the original publication.]
wherein the color information index of the quantization block is expressed in terms of the frequency corresponding to the i-th gray level within the quantization block, the median of the frequencies of all gray levels, the total number l of gray levels in the quantization block, and the uniformity of the quantization block; the frequency corresponding to the i-th gray level is the ratio of the number of pixels at the i-th gray level to the total number of pixels in the quantization block.
5. The automatic ore segmentation method according to claim 1, wherein the shape divergence coefficient is obtained by a method specifically including:
acquiring coordinates of all edge pixel points on an edge line segment in an edge sub-block, respectively forming a row coordinate sequence and a column coordinate sequence corresponding to the edge line segment by using row coordinates and column coordinates of all edge pixel points, and calculating a Pearson correlation coefficient of the row coordinate sequence and the column coordinate sequence to obtain the correlation of the edge pixel points on the edge line segment; and obtaining the average value of the correlation of the edge pixel points on all the edge line segments to obtain the shape divergence coefficient of each edge sub-block.
6. The automatic ore segmentation method according to claim 1, wherein the method for obtaining the spatial divergence coefficient specifically comprises:
acquiring coordinates of a central point on an edge line segment in the edge sub-block, wherein the row coordinates of the central point are the mean values of the row coordinates of all edge pixel points on the edge line segment, and the column coordinates of the central point are the mean values of the column coordinates of all edge pixel points on the edge line segment; calculating the row coordinate and the column coordinate of the whole density center of the edge sub-block, and expressing the row coordinate and the column coordinate by a formula as follows:
[Formulas rendered as images in the original publication.]
wherein the row coordinate and the column coordinate of the overall density center of the edge sub-block are expressed in terms of the row coordinate and the column coordinate of the center point of the e-th edge line segment, the total number Q of edge line segments in the edge sub-block, the number of edge pixel points on the e-th edge line segment, and the number of edge pixel points contained in all edge line segments in the edge sub-block;
calculating the position distance between the edge line segments based on the row coordinates and the column coordinates of the whole density center of the edge sub-blocks to obtain the spatial divergence coefficient of the edge sub-blocks, which is expressed by a formula as follows:
[Formula rendered as an image in the original publication.]
wherein the spatial divergence coefficient is expressed in terms of the row coordinate and the column coordinate of the overall density center of the edge sub-block, the row coordinate and the column coordinate of the center point of the e-th edge line segment, and the total number Q of edge line segments in the edge sub-block.
7. The method according to claim 1, wherein the clustering the quantized blocks according to the complexity of each quantized block in the image of ore segmentation to obtain different classification regions is specifically:
segmenting the ore gray level image according to the quantization blocks with different sizes to obtain ore segmentation images corresponding to the quantization blocks with different sizes; respectively calculating the complexity mean value of a quantization block on each ore segmentation image, and acquiring the ore segmentation image corresponding to the maximum value of the complexity mean value; and clustering the quantization blocks on the ore segmentation image according to the complexity of the quantization blocks on the ore segmentation image to obtain an area with uneven illumination of the ore blocks, an area with even illumination of the ore blocks and a mixed area.
8. The automatic ore segmentation method according to claim 1, wherein the segmentation of the regions of different categories to obtain the ore segmentation result specifically comprises:
segmenting the area with uneven illumination of the ore blocks by using a locally adaptive threshold segmentation method to obtain an ore block part and a background part of the area; segmenting the area with uniform illumination of the ore blocks by using the Otsu threshold segmentation algorithm to obtain an ore block part and a background part of the area;
calculating a gray level co-occurrence matrix of each pixel point in the mixed region, and calculating the corresponding energy, entropy value, contrast and inverse difference moment according to the gray level co-occurrence matrix; and forming texture vectors of all pixel points in the mixed region from the energy, the entropy, the contrast and the inverse difference moment, and clustering the mixed region according to the texture vectors of the pixel points to obtain an ore block part and a background part of the mixed region.
CN202211037087.0A 2022-08-29 2022-08-29 Automatic ore segmentation method Active CN115131375B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211037087.0A CN115131375B (en) Automatic ore segmentation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211037087.0A CN115131375B (en) Automatic ore segmentation method

Publications (2)

Publication Number Publication Date
CN115131375A CN115131375A (en) 2022-09-30
CN115131375B true CN115131375B (en) 2022-11-18

Family

ID=83387309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211037087.0A Active CN115131375B (en) Automatic ore segmentation method

Country Status (1)

Country Link
CN (1) CN115131375B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115311278B (en) * 2022-10-11 2023-12-22 南通欧惠纺织科技有限公司 Yarn segmentation method for yarn detection
CN116129364B (en) * 2023-04-17 2023-06-30 山东山矿机械有限公司 Belt centralized control system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101710424B (en) * 2009-12-22 2012-04-25 中国矿业大学(北京) Method for segmenting ore image
CN109785330A (en) * 2018-11-22 2019-05-21 梁栋 A kind of water area extracting method based on linear partition
CN112365494B (en) * 2020-11-30 2022-12-13 北京理工大学 Ore material image segmentation method based on deep learning prediction edge
CN114842027A (en) * 2022-04-24 2022-08-02 南通真馨家纺有限公司 Fabric defect segmentation method and system based on gray level co-occurrence matrix

Also Published As

Publication number Publication date
CN115131375A (en) 2022-09-30

Similar Documents

Publication Publication Date Title
CN115131375B (en) Automatic ore segmentation method
CN115457041B (en) Road quality identification and detection method
CN104732227B (en) A kind of Location Method of Vehicle License Plate based on definition and luminance evaluation
CN110009638B (en) Bridge inhaul cable image appearance defect detection method based on local statistical characteristics
CN115311277B (en) Pit defect identification method for stainless steel product
CN115829883A (en) Surface image denoising method for dissimilar metal structural member
CN110428450B (en) Scale-adaptive target tracking method applied to mine tunnel mobile inspection image
CN102306307B (en) Positioning method of fixed point noise in color microscopic image sequence
CN111583279A (en) Super-pixel image segmentation method based on PCBA
CN111339924B (en) Polarized SAR image classification method based on superpixel and full convolution network
CN116630813B (en) Highway road surface construction quality intelligent detection system
CN113052859A (en) Super-pixel segmentation method based on self-adaptive seed point density clustering
CN115272319B (en) Ore granularity detection method
CN116805316B (en) Degradable plastic processing quality detection method based on image enhancement
CN114299051A (en) Leather material surface defect detection method based on feature modeling significance detection
CN115908154A (en) Video late-stage particle noise removing method based on image processing
CN115457551A (en) Leaf damage identification method suitable for small sample condition
CN115797607A (en) Image optimization processing method for enhancing VR real effect
CN112070717A (en) Power transmission line icing thickness detection method based on image processing
CN117292137B (en) Aerial remote sensing image optimization segmentation processing method
CN111476744A (en) Underwater image enhancement method based on classification and atmospheric imaging model
Kapoor et al. Capturing banding in images: Database construction and objective assessment
CN109801246B (en) Global histogram equalization method for adaptive threshold
CN115880181A (en) Method, device and terminal for enhancing image contrast
CN115033721A (en) Image retrieval method based on big data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant