CN115294379B - Flotation method foam identification method based on optical information - Google Patents


Info

Publication number: CN115294379B (application CN202211194724.5A)
Authority: CN (China)
Prior art keywords: bubble, coefficient, area, pixel point, central
Legal status: Active (the legal status is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN115294379A
Inventor: 黄小燕
Assignee (current and original): Nantong Ganyu Tempered Glass Products Co ltd
Application filed by Nantong Ganyu Tempered Glass Products Co ltd; priority to CN202211194724.5A; published as CN115294379A, granted as CN115294379B

Classifications

    • G — Physics
    • G06 — Computing; calculating or counting
    • G06V — Image or video recognition or understanding
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/70 — Arrangements using pattern recognition or machine learning
    • G06V10/762 — Arrangements using clustering, e.g. of similar faces in social networks
    • G06V10/763 — Non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • G06V10/20 — Image preprocessing
    • G06V10/22 — Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06V10/26 — Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region; detection of occlusion
    • G06V10/267 — Segmentation by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V10/30 — Noise filtering
    • G06V10/34 — Smoothing or thinning of the pattern; morphological operations; skeletonisation
    • G06V10/40 — Extraction of image or video features
    • G06V10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis
    • G06V2201/00 — Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 — Target detection


Abstract

The invention relates to the technical field of data processing, and in particular to a flotation froth identification method based on optical information. The method collects a flotation froth image, obtains the corresponding grayscale image, and applies threshold segmentation to the grayscale image to obtain a number of reflective white areas. Bubble reflective white areas are confirmed by their central highlight distribution and the uniform ring-shaped band of particles around them; straight lines and curves are then detected among the non-bubble reflective white areas, and the number of bubble reflective white areas on either side of each line is counted to determine the true bubble edge lines. Bubbles in the flotation froth image are identified and segmented from the position information of these true edge lines, which achieves accurate bubble identification and segmentation, enhances the accuracy of the segmentation result, and improves working efficiency.

Description

Flotation method foam identification method based on optical information
Technical Field
The invention relates to the technical field of data processing, and in particular to a flotation froth identification method based on optical information.
Background
Froth flotation is one of the most widely used beneficiation processes and can be applied to the separation of almost all ores. It is a mineral-processing method grounded in surface chemistry: differences in the surface hydrophobicity of mineral particles are exploited to separate different minerals effectively. The overall flotation working condition is directly reflected in the visual characteristics of the froth surface, such as bubble size and colour.
With the development of image-processing technology, the surface visual characteristics of the froth are mostly obtained by image segmentation. However, the collected flotation froth images suffer from complex froth structure, severe froth adhesion and similar problems, which greatly increase the difficulty of segmentation; severe over-segmentation or under-segmentation occurs easily, and the segmentation effect is unsatisfactory.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a flotation froth identification method based on optical information. The adopted technical scheme is as follows:
collecting a flotation froth image to obtain the corresponding grayscale image, and obtaining a number of reflective white areas in the grayscale image with the Otsu threshold method, a reflective white area being a white area caused by light reflection; performing connected-domain analysis on each reflective white area to obtain the target areas;
obtaining the central pixel point of the current target area, and computing the gray-value difference between the central pixel point and each pixel point in its surrounding area to obtain a central highlight coefficient; obtaining the annular region of the current target area from the distance between the central pixel point and the edge of the area, clustering the pixel points in the annular region into several clusters, and fitting an ellipse to the centre coordinates of the clusters to obtain a goodness of fit; computing the Euclidean distance from the current cluster to every other cluster, selecting the minimum as the target Euclidean distance of the current cluster, computing the variance of the target Euclidean distances of all clusters, and taking this variance as the granular chaos value of the annular region; combining the goodness of fit and the granular chaos value into the peripheral halo coefficient of the annular region; and taking the ratio of the peripheral halo coefficient to the central highlight coefficient as the bubble highlight coefficient of the current target area;
obtaining the bubble highlight coefficient of each target area and, when the bubble highlight coefficient exceeds the bubble brightness coefficient threshold, confirming the target area as a bubble reflective white area; performing edge detection on the non-bubble reflective white areas of the grayscale image to obtain the corresponding edge image, detecting the straight lines and curves in the edge image, and counting the number of bubble reflective white areas on either side of each straight line or curve to determine the true bubble edge lines; and identifying and segmenting the bubbles in the flotation froth image from the position information of the true bubble edge lines.
Further, the method for obtaining the central highlight coefficient comprises:
constructing a 5×5 window centred on the central pixel point, computing the gray-value difference between the central pixel point and each of the other 24 pixel points of the 5×5 window, and obtaining the central highlight coefficient of the current target area from these gray-value differences, the calculation formula of the central highlight coefficient being

K = (1/24) · Σ |g0 − gi|, summed over i = 1, …, 24

where K is the central highlight coefficient, g0 is the gray value of the central pixel point of the current target area, and gi is the gray value of the i-th of the other 24 pixel points of the 5×5 window.
Further, the method for obtaining the annular region of the current target area from the distance between the central pixel point and the edge of the area comprises:
obtaining the line connecting the central pixel point with each edge pixel point of the current target area, and taking the point at one half of the length of each line as a division point, the division points forming a division edge; removing the region enclosed by the division edge from the current target area leaves the annular region.
Further, the calculation formula of the peripheral halo coefficient is

H = a · N · P / (b · S + c)

where H is the peripheral halo coefficient; P is the goodness of fit; S is the granular chaos value; N is the number of clusters; and a, b and c are adjustment coefficients.
Further, the bubble highlight coefficient is negatively correlated with the central highlight coefficient and positively correlated with the peripheral halo coefficient.
Further, the method for counting the number of bubble reflective white areas on either side of each straight line or curve to determine the true bubble edge lines comprises:
referring to straight lines and curves collectively as lines, and obtaining, from the position of the current line, the target detection areas enclosed between the current line and the other lines on its two sides, such that no line lies inside a target detection area; counting the number of bubble reflective white areas in each of the two target detection areas of the current line;
when exactly one bubble reflective white area lies in each of the two target detection areas, confirming the current line as a true bubble edge; when two or more bubble reflective white areas appear in either target detection area, returning to line detection and raising the detection precision of the lines until fewer than two bubble reflective white areas remain in each target detection area; otherwise, confirming that the current line is a false bubble edge.
The embodiments of the invention have at least the following beneficial effects: the central reflective area of each bubble in the flotation froth image is identified accurately from the central highlight distribution and the uniform ring-shaped band of particles around it, which fixes the approximate position of each bubble and of its central reflective area; the edges in the flotation froth image are then analysed and judged with the bubble central reflective areas as the basis, so that the bubbles are identified and segmented accurately, the accuracy of the segmentation result is enhanced, and working efficiency is improved.
Drawings
In order to explain the embodiments of the invention and the technical solutions of the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart illustrating steps of a flotation froth identification method based on optical information according to an embodiment of the present invention.
Detailed Description
In order to further explain the technical means adopted by the invention to achieve its intended purpose and their effects, the specific embodiments, structure, features and effects of the flotation froth identification method based on optical information according to the invention are described in detail below, with reference to the accompanying drawings and the preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes the specific scheme of the flotation froth identification method based on optical information provided by the invention, with reference to the accompanying drawings.
Referring to fig. 1, a flow chart illustrating steps of a flotation froth identification method based on optical information according to an embodiment of the present invention is shown, where the method includes the following steps:
s001, acquiring a flotation froth image to obtain a corresponding gray image, and obtaining a plurality of reflective white areas in the gray image by using an Otsu threshold method, wherein the reflective white areas refer to white areas caused by reflection; and respectively carrying out connected domain analysis on each light-reflecting white area to obtain a target area.
Specifically, a visible-light camera captures an image of the large number of bubbles produced by the ore pulp in the flotation tank, giving the flotation froth image. Because of the environment, the camera and other influences, the acquired image contains noise that would disturb subsequent image processing and feature extraction and make the result inaccurate, so the flotation froth image is preprocessed first. In the embodiment of the invention, median filtering is applied to smooth the flotation froth image. Because of uneven illumination, the edges of some froth in the flotation froth image are indistinct, so the image is then enhanced by unsharp masking, which sharpens the edge regions and makes the froth edges in the processed flotation froth image clearer and more distinct.
Further, in the flotation froth image of the ore pulp, the centre of each bubble shows a very distinct white area caused by reflection, the bubble body is a dark area, and the bubble edge is a very dark or bright distinct boundary. The gray histogram built from the gray values of the pixels of the grayscale image therefore separates into three distinct peaks, corresponding to the reflective white areas, the dark areas and the very dark boundary areas. Accordingly, the preprocessed flotation froth image is converted to a grayscale image, the grayscale image is threshold-segmented with the Otsu method to obtain a number of reflective white areas (the white areas caused by reflection), and connected-domain analysis is performed on each reflective white area to obtain the target areas, one reflective white area corresponding to one target area.
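The Otsu segmentation and connected-domain analysis described above can be sketched in Python. The function names are illustrative, and the NumPy-only implementations below are plain stand-ins for what a library routine would normally provide:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu threshold of a uint8 grayscale image: maximise between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum = np.cumsum(hist)                         # pixels with value < t
    cum_mean = np.cumsum(hist * np.arange(256))   # cumulative intensity sum
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = cum[t - 1]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_mean[t - 1] / w0
        m1 = (cum_mean[-1] - cum_mean[t - 1]) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def connected_components(mask):
    """4-connected component labelling by flood fill; returns labels and count."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue
        current += 1
        stack = [(sy, sx)]
        labels[sy, sx] = current
        while stack:
            y, x = stack.pop()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    stack.append((ny, nx))
    return labels, current
```

Each connected component of the thresholded mask then corresponds to one candidate target area.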
Step S002: obtain the central pixel point of the current target area, and compute the gray-value difference between the central pixel point and each pixel point in its surrounding area to obtain a central highlight coefficient; obtain the annular region of the current target area from the distance between the central pixel point and the edge of the area, and cluster the pixel points in the annular region into several clusters; compute the peripheral halo coefficient of the annular region from the position distribution of the clusters, and take the ratio of the peripheral halo coefficient to the central highlight coefficient as the bubble highlight coefficient of the current target area.
Specifically, since a reflective white area is produced by reflected light, its centre has the highest brightness and a very uniform brightness distribution; the central highlight coefficient of each target area is therefore obtained from this characteristic: obtain the central pixel point of the current target area, and compute the gray-value difference between the central pixel point and each pixel point of its surrounding area to obtain the central highlight coefficient, the central highlight coefficient being positively correlated with the gray-value difference.
As an example, the embodiment of the invention constructs a 5×5 window centred on the central pixel point and takes the extent of this window as the surrounding area of the central pixel point; the gray-value differences between the central pixel point and the other 24 pixel points of the window are then computed, and the central highlight coefficient of the current target area is obtained from them. The calculation formula of the central highlight coefficient is

K = (1/24) · Σ |g0 − gi|, summed over i = 1, …, 24

where K is the central highlight coefficient, g0 is the gray value of the central pixel point of the target area, and gi is the gray value of the i-th of the other 24 pixel points of the 5×5 window.
It should be noted that the more uniform the central brightness distribution of the target region, i.e. the smaller the gray-value differences, the smaller the central highlight coefficient.
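Under the mean-absolute-difference form given above (the exact formula in the patent was an image, so that form is an assumption), the coefficient can be computed directly; the function name is illustrative:

```python
import numpy as np

def central_highlight_coefficient(gray, cy, cx):
    """Mean absolute gray-value difference between the central pixel (cy, cx)
    and the other 24 pixels of its 5x5 window (assumed form of coefficient K)."""
    win = gray[cy - 2:cy + 3, cx - 2:cx + 3].astype(float)
    centre = win[2, 2]
    # the centre's own difference is 0, so summing all 25 and dividing by 24 is safe
    return float(np.abs(win - centre).sum() / 24.0)
```

A perfectly uniform highlight gives a coefficient of 0; the less uniform the centre of the target area, the larger the value, in line with the note above.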
The brightness of a reflective white area decreases slightly from the centre outwards. The froth is formed from mineral particles dissolved in the solvent, so its surface is rough and shows a distinct granular texture under illumination; within a reflective white area, the strong illumination makes these particles appear as a ring-shaped band around the central highlight region. The peripheral halo coefficient of each target area is therefore constructed from this characteristic, in the following steps:
(1) Obtain the annular region of the current target area from the distance between the central pixel point and the edge of the area, cluster the pixel points in the annular region into several clusters, and fit an ellipse to the centre coordinates of the clusters to obtain the goodness of fit.
Specifically, the outer annular region of the current target area is selected as follows: obtain the line connecting the central pixel point with each edge pixel point of the current target area, take the point at one half of the length of each line as a division point, and let the division points form a division edge; after the region enclosed by the division edge is removed from the current target area, the remaining region is the annular region.
Because the annular region is the outer ring of the current target area, several clusters can be obtained by clustering the rough-surface particles inside it. The DBSCAN clustering algorithm is used, with clustering radius 1 and MinPts 3, giving N clusters, N a positive integer. The positions of the clusters corresponding to the particles form the outer ring, matching the reflective white area, of a slurry bubble that contains many mineral particles, so these clusters are distributed in an elliptical pattern; the centre coordinate of each cluster is selected, and an ellipse is fitted to all the centre coordinates to obtain the goodness of fit P, which measures whether the distribution of the clusters is elliptical.
It should be noted that the number of clusters N is the number of particles in the outer ring of the reflective white area, i.e. the number of particles in the annular region, and is a characteristic unique to the outer ring of a bubble beyond its central highlight region. The places where bubbles meet show the same central-highlight characteristic but not the same surrounding granularity, so, to prevent these parts from being mistaken for bubbles, the lower limit of the number of clusters is set to 6.
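A minimal, self-contained DBSCAN over the particle pixel coordinates, with the radius-1 / MinPts-3 parameters stated above; in practice a library implementation (e.g. scikit-learn's `DBSCAN`) would be used, and this sketch keeps only the parts needed here:

```python
import numpy as np

def dbscan(points, eps=1.0, min_pts=3):
    """Minimal DBSCAN over an (n, 2) array of coordinates.
    Returns one integer label per point (-1 = noise)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbours = [np.nonzero(d[i] <= eps)[0] for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbours[i]) < min_pts:
            continue                      # already labelled, or not a core point
        labels[i] = cluster
        seeds = list(neighbours[i])
        while seeds:                      # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbours[j]) >= min_pts:
                    seeds.extend(neighbours[j])
        cluster += 1
    return labels
```

Each resulting cluster stands for one granular bright spot in the annular region, and the number of clusters is the N used in the peripheral halo coefficient.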
(2) The granular bright spots in the annular region are distributed fairly uniformly, so the degree of particle disorder in the annular region is measured from this characteristic.
Specifically, compute the Euclidean distance from the current cluster to every other cluster and select the minimum as the target Euclidean distance of the current cluster; compute the variance of the target Euclidean distances of all clusters, and take this variance as the granular chaos value of the annular region.
It should be noted that the smaller the variance of the target Euclidean distances, the lower the particle disorder of the annular region, and the more uniform the ring-shaped band of particles that the annular region of the current target area shows under illumination.
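The granular chaos value, as defined above, is simply the variance of each cluster centre's distance to its nearest other centre; the function name is illustrative:

```python
import numpy as np

def granular_chaos(centres):
    """Variance of each cluster centre's nearest-neighbour distance.
    Small values mean the particles are spread evenly around the ring."""
    c = np.asarray(centres, float)
    d = np.linalg.norm(c[:, None] - c[None, :], axis=2)
    np.fill_diagonal(d, np.inf)          # exclude each centre's distance to itself
    nearest = d.min(axis=1)              # the "target Euclidean distance" per cluster
    return float(nearest.var())
```

Centres spread perfectly evenly (equal nearest-neighbour distances) give a chaos value of 0, the ideal case for a bubble's annular region.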
(3) The larger the number of detected clusters, the closer their distribution is to an ellipse (i.e. the larger the goodness of fit), and the smaller the granular chaos value S, the larger the peripheral halo coefficient of the annular region: the more granular parts are identified in the annular region of the current target area, and the more uniformly and closely they approach a ring, the larger the corresponding peripheral halo coefficient. The peripheral halo coefficient of the annular region is therefore obtained by combining the goodness of fit and the granular chaos value, with the calculation formula

H = a · N · P / (b · S + c)

where H is the peripheral halo coefficient; P is the goodness of fit; S is the granular chaos value; N is the number of clusters; and a, b and c are adjustment coefficients used to adjust the value range of the function.
Preferably, in the embodiment of the invention the adjustment coefficients a, b and c take empirical values.
It should be noted that, because the distribution of the cluster centre points only approximates an ellipse and the number of centre points is large, the goodness of fit will not be very high; its lower limit is therefore set to 0.65. To avoid the case where the goodness of fit is high only because very few clusters were detected, its upper limit is set to 0.9. The value range of the goodness of fit is therefore [0.65, 0.9].
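The patent's goodness-of-fit formula was lost in an image, so the sketch below is an assumed stand-in: a plain least-squares conic fit through the cluster centres, with the RMS residual mapped into (0, 1] so that centres lying exactly on an ellipse score near 1. Every name and the residual mapping are illustrative:

```python
import numpy as np

def conic_goodness(centres):
    """Fit the conic A x^2 + B xy + C y^2 + D x + E y = 1 through the cluster
    centres by least squares and return 1 / (1 + rms residual)."""
    pts = np.asarray(centres, float)
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x * x, x * y, y * y, x, y])
    coef, *_ = np.linalg.lstsq(M, np.ones(len(pts)), rcond=None)
    resid = M @ coef - 1.0
    return float(1.0 / (1.0 + np.sqrt(np.mean(resid ** 2))))
```

At least five centres are needed for the fit to be determined, consistent with the lower cluster-count limit of 6 set above.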
(4) The peripheral halo coefficient of each target area is obtained with the methods of steps (1) to (3).
For each target area, the central highlight coefficient K and the peripheral halo coefficient H both characterise the bubble, so the bubble highlight coefficient of a target area is obtained from its central highlight coefficient and peripheral halo coefficient: the ratio of the peripheral halo coefficient to the central highlight coefficient is taken as the bubble highlight coefficient of the target area, so that the bubble highlight coefficient is negatively correlated with the central highlight coefficient and positively correlated with the peripheral halo coefficient.
Step S003: obtain the bubble highlight coefficient of each target area and, when the bubble highlight coefficient exceeds the bubble brightness coefficient threshold, confirm the target area as a bubble reflective white area; perform edge detection on the non-bubble reflective white areas of the grayscale image to obtain the corresponding edge image, detect the straight lines and curves in the edge image, and count the number of bubble reflective white areas on either side of each straight line or curve to determine the true bubble edge lines; identify and segment the bubbles in the flotation froth image from the position information of the true bubble edge lines.
Specifically, the bubble highlight coefficient of each target area is obtained with the method of step S002, and a bubble brightness coefficient threshold is set; when the bubble highlight coefficient exceeds the threshold, the target area is confirmed as a bubble reflective white area, i.e. a white area formed by light reflecting off a bubble.
Preferably, the bubble brightness coefficient threshold in the embodiment of the invention is an empirical value of 0.231.
Further, edge detection is performed with the Canny operator on the non-bubble reflective white areas of the grayscale image, i.e. the regions remaining after the bubble reflective areas are removed, to obtain the corresponding edge image. Hough line detection and Hough ellipse detection are then used to detect the straight lines and curves in the edge image, and the detected straight lines and curves are marked in it; for convenience of expression, straight lines and curves are referred to collectively as lines. Several closing operations are applied at the marked positions to connect the lines.
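In practice the Canny and Hough steps would come from an image-processing library (e.g. OpenCV's `Canny` and `HoughLines`); the NumPy sketch below shows only the Hough accumulator idea for straight lines, with illustrative names and parameters:

```python
import numpy as np

def hough_lines(edges, n_theta=180, threshold=10):
    """Minimal Hough transform on a binary edge image.
    Returns (rho, theta) pairs whose vote count reaches `threshold`."""
    ys, xs = np.nonzero(edges)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*edges.shape)))        # max possible |rho|
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in zip(xs, ys):
        # each edge pixel votes for every line rho = x cos(theta) + y sin(theta)
        rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    peaks = np.argwhere(acc >= threshold)
    return [(int(r - diag), float(thetas[t])) for r, t in peaks]
```

Raising `threshold` (or `n_theta`) corresponds to the "raising the detection precision" step the method falls back on when too many candidate areas appear beside a line.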
According to Plateau's laws, a bubble is made up of complete smooth curved surfaces, and the junction of two bubbles is the curve along which their surfaces meet, so all bubble boundaries can be obtained by detecting the lines. However, because the bubbles in the acquired image are cluttered, the detected lines include not only bubble boundaries but also other clutter, so the lines belonging to bubble boundaries must be screened out. The specific method, taking any line as an example, is: obtain, from the position of the current line, the target detection areas enclosed between the current line and the other lines on its two sides, such that no line lies inside a target detection area, each side of the current line corresponding to one target detection area; count the number of bubble reflective white areas in each of the two target detection areas of the current line; when exactly one bubble reflective white area lies in each of the two target detection areas, confirm the current line as a true bubble edge; when two or more bubble reflective white areas appear in either target detection area, return to line detection and raise the line-recognition precision until fewer than two bubble reflective white areas remain in each target detection area; otherwise, confirm that the current line is a false bubble edge.
As an example, let the number of bubble reflective white areas in each target detection area be expressed as \(N = n_{1} + n_{2}i\), wherein \(n_{1}\) is the number of complete bubble reflective white areas, \(n_{2}\) is the number of incomplete bubble reflective white areas, and \(i\) is the imaginary unit. The single-side bubble characteristic value \(D\) of the current line is calculated from the specific value of \(N\): when \(N = 1\), exactly one complete bubble reflective white area lies on that side of the line and \(D = 1\); when \(n_{1} + n_{2} \geq 2\), not all bubble edges were screened out during line detection, so the step of detecting straight lines and curves is repeated with improved line detection precision and the single-side bubble characteristic value \(D\) is recalculated until its value no longer changes; otherwise, \(D = 0\).
The single-side bubble characteristic values of the two sides of the current line are obtained respectively and their product is calculated; when the product equals 1, the current line is confirmed to be a true bubble edge, and otherwise it is confirmed to be a false bubble edge.
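The screening rule described above can be illustrated with a small sketch. This is not the patent's code: bubble reflective white areas are reduced to labeled center points, a line is given by two endpoints, the side of each area is taken from the sign of a 2D cross product, and the "redo line detection" branch is modeled as an exception. The helper names and the mapping of the single-side characteristic value are assumptions consistent with the description.

```python
def side_counts(p1, p2, areas):
    """Count complete (n1) and incomplete (n2) bubble reflective white
    areas on each side of the line through p1 and p2."""
    counts = {+1: [0, 0], -1: [0, 0]}  # side -> [n1, n2]
    for (x, y, complete) in areas:
        cross = ((p2[0] - p1[0]) * (y - p1[1])
                 - (p2[1] - p1[1]) * (x - p1[0]))
        if cross == 0:          # area center lies on the line: ignore
            continue
        side = 1 if cross > 0 else -1
        counts[side][0 if complete else 1] += 1
    return counts

def single_side_value(n1, n2):
    """Single-side bubble characteristic value D from N = n1 + n2*i.
    D = 1: exactly one complete area on this side; None: two or more
    areas, so line detection must be redone; D = 0 otherwise."""
    if n1 + n2 >= 2:
        return None
    return 1 if (n1 == 1 and n2 == 0) else 0

def is_true_bubble_edge(p1, p2, areas):
    c = side_counts(p1, p2, areas)
    values = [single_side_value(*c[s]) for s in (+1, -1)]
    if None in values:
        raise RuntimeError("redo line detection with higher precision")
    # True bubble edge when the product of the two single-side
    # characteristic values equals 1.
    return values[0] * values[1] == 1
```

For a vertical line with one complete bubble area on each side, the product is 1 and the line is accepted; with a bubble area on only one side, the product is 0 and the line is rejected.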
Furthermore, all true bubble edges are confirmed through the above line screening method, the bubble edges are marked in the flotation froth image according to the position information of all the true bubble edges, and the bubbles are identified and segmented according to the marked flotation froth image.
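Once the true bubble edges are drawn into a binary mask, the individual bubbles can be separated by a connected-component pass over the non-edge pixels. The following is a minimal 4-connectivity labeling sketch, not the patent's own segmentation implementation:

```python
from collections import deque

def label_regions(edge_mask):
    """4-connected labeling of non-edge pixels; each nonzero label
    corresponds to one segmented bubble region."""
    h, w = len(edge_mask), len(edge_mask[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if edge_mask[sy][sx] or labels[sy][sx]:
                continue
            next_label += 1
            queue = deque([(sy, sx)])
            labels[sy][sx] = next_label
            while queue:  # breadth-first flood fill
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and not edge_mask[ny][nx] and not labels[ny][nx]):
                        labels[ny][nx] = next_label
                        queue.append((ny, nx))
    return labels, next_label
```

A mask with a single vertical edge column splits a 5×5 image into two labeled regions, while edge pixels keep label 0.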
In summary, the embodiment of the present invention provides a flotation froth identification method based on optical information. The method acquires a flotation froth image to obtain a corresponding gray image and performs threshold segmentation on the gray image to obtain a plurality of reflective white areas. Bubble reflective white areas are confirmed through the central highlight distribution and the uniform annular band of particles around it; straight-line and curve detection is performed on the non-bubble reflective white area of the gray image; the number of bubble reflective white areas on the two sides of each straight line or curve is counted to determine the true bubble edge lines; and the bubbles in the flotation froth image are identified and segmented according to the position information of the true bubble edge lines. Accurate identification and segmentation of the bubbles is thereby achieved, improving both the accuracy of the segmentation result and the working efficiency.
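The first stage of the summarized pipeline, threshold segmentation of the gray image into reflective white areas, uses the Otsu method named in claim 1. A generic NumPy implementation of Otsu's between-class-variance criterion (not the patent's code) can be sketched as:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the gray level maximizing between-class variance (Otsu)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    mu_total = np.dot(np.arange(256), hist) / total
    best_t, best_var = 0, -1.0
    cum_w = 0.0   # pixel count of the background class so far
    cum_mu = 0.0  # unnormalized gray-level sum of the background class
    for t in range(256):
        cum_w += hist[t]
        cum_mu += t * hist[t]
        w0 = cum_w / total
        if w0 in (0.0, 1.0):  # one class empty: variance undefined
            continue
        mu0 = cum_mu / cum_w
        mu1 = (mu_total * total - cum_mu) / (total - cum_w)
        var_between = w0 * (1 - w0) * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def reflective_mask(gray):
    # Reflective white areas are the pixels above the Otsu threshold.
    return gray > otsu_threshold(gray)
```

On a strongly bimodal image the threshold falls between the two modes, so the mask selects exactly the bright (reflective) pixels.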
It should be noted that the order of the above embodiments of the present invention is only for description and does not represent the merits of the embodiments. Specific embodiments have been described above. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible and may be advantageous.
All the embodiments in this specification are described in a progressive manner; for the same or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments.
The above description covers only preferred embodiments of the present invention and is not intended to limit the scope of the present invention; any modifications, equivalent replacements and improvements made within the spirit of the present invention shall fall within its protection scope.

Claims (6)

1. A flotation froth identification method based on optical information, characterized by comprising the following steps:
collecting a flotation froth image to obtain a corresponding gray level image, and obtaining a plurality of reflective white areas in the gray level image by using the Otsu threshold method, wherein a reflective white area refers to a white area caused by light reflection; performing connected domain analysis on each reflective white area respectively to obtain target areas;
acquiring a central pixel point of a current target area, and calculating the gray value difference between the central pixel point and each pixel point in the area around the central pixel point to obtain a central highlight coefficient; acquiring an annular area of the current target area according to the distance from the central pixel point to the edge of the current target area, clustering the pixel points in the annular area to obtain a plurality of clusters, and performing ellipse fitting on the center point coordinates of the clusters to obtain a goodness of fit; calculating the Euclidean distances between the current cluster and each other cluster, selecting the minimum Euclidean distance as the target Euclidean distance of the current cluster, calculating the variance of the target Euclidean distances of all the clusters, and taking this variance as the granular sensation chaos value of the annular area; combining the goodness of fit and the granular sensation chaos value to obtain the peripheral halo coefficient corresponding to the annular area; and taking the ratio of the central highlight coefficient to the peripheral halo coefficient as the bubble highlight coefficient of the current target area;
acquiring the bubble highlight coefficient of each target area, and determining a target area to be a bubble reflective white area when its bubble highlight coefficient is larger than a bubble brightness coefficient threshold value; performing edge detection on the non-bubble reflective white area in the gray level image to obtain a corresponding edge image, detecting straight lines and curves in the edge image, counting the number of bubble reflective white areas on the two sides of each straight line or curve to determine true bubble edge lines, and identifying and segmenting the bubbles in the flotation froth image according to the position information of the true bubble edge lines.
2. The flotation froth identification method based on optical information as claimed in claim 1, wherein the method for obtaining the central highlight coefficient comprises:
construction with central pixel point as window center
Figure DEST_PATH_IMAGE002
Window for calculating central pixel point and central pixel point respectively
Figure 516874DEST_PATH_IMAGE002
Obtaining the central highlight coefficient of the current target area according to the gray value difference value between other 24 pixel points in the window, and highlighting the centerThe coefficient is calculated by the formula:
Figure DEST_PATH_IMAGE004
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE006
the center highlight coefficient;
Figure DEST_PATH_IMAGE008
the gray value of the central pixel point in the current target area,
Figure DEST_PATH_IMAGE010
is composed of
Figure 417482DEST_PATH_IMAGE002
The first of the other 24 pixel points in the window
Figure DEST_PATH_IMAGE012
The gray value of each pixel point.
3. The flotation froth identification method based on optical information as claimed in claim 1, wherein the method for obtaining the annular region of the current target region according to the distance from the central pixel point to the edge of the current target region comprises:
acquiring a connecting line between the central pixel point and each edge pixel point of the current target area, and taking the point on the connecting line whose distance from the central pixel point is one half of the connecting line length as a dividing point, wherein the dividing points form a dividing edge; and removing the region surrounded by the dividing edge from the current target region to obtain the annular area.
4. The flotation froth identification method based on optical information as claimed in claim 1, wherein the calculation formula of the peripheral halo coefficient is:

\[ P = \alpha\left(aR + b\,e^{-\frac{S}{m}}\right) \]

wherein \(P\) is the peripheral halo coefficient; \(\alpha\) is an adjustment coefficient; \(R\) is the goodness of fit; \(S\) is the granular sensation chaos value; \(a\) and \(b\) are both adjustment coefficients; and \(m\) is the number of clusters.
5. The flotation froth identification method based on optical information as claimed in claim 1, wherein the bubble highlight coefficient is negatively correlated with the central highlight coefficient and positively correlated with the peripheral halo coefficient.
6. The flotation froth identification method based on optical information as claimed in claim 1, wherein the method for counting the number of the bubble reflection white areas at the two sides of each straight line or curve to determine the edge line of the real bubble comprises:
collectively referring to straight lines and curves as lines; according to the position of the current line, respectively obtaining target detection areas formed by the current line and other lines on the two sides of the current line, wherein no line exists inside a target detection area; and respectively counting the number of bubble reflective white areas in the two target detection areas of the current line;

when exactly one bubble reflective white area exists in each of the two target detection areas, confirming that the current line is a true bubble edge; when two or more bubble reflective white areas appear in either target detection area, returning to line detection and improving the line detection precision until the number of bubble reflective white areas in each target detection area is less than 2; otherwise, confirming that the current line is a false bubble edge.
CN202211194724.5A 2022-09-29 2022-09-29 Flotation method foam identification method based on optical information Active CN115294379B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211194724.5A CN115294379B (en) 2022-09-29 2022-09-29 Flotation method foam identification method based on optical information


Publications (2)

Publication Number Publication Date
CN115294379A CN115294379A (en) 2022-11-04
CN115294379B true CN115294379B (en) 2023-01-03

Family

ID=83834491

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211194724.5A Active CN115294379B (en) 2022-09-29 2022-09-29 Flotation method foam identification method based on optical information

Country Status (1)

Country Link
CN (1) CN115294379B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116385455B (en) * 2023-05-22 2024-01-26 北京科技大学 Flotation foam image example segmentation method and device based on gradient field label

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101339659A (en) * 2008-08-22 2009-01-07 北京矿冶研究总院 Region growing image segmentation method based on rules
CN107274403A (en) * 2017-06-30 2017-10-20 长安大学 A kind of evaluation method of flotation surface quality
CN111563410A (en) * 2020-03-27 2020-08-21 中信重工机械股份有限公司 Processing method for detecting motion speed of foam image
CN113567058A (en) * 2021-09-22 2021-10-29 南通中煌工具有限公司 Light source parameter adjusting method based on artificial intelligence and visual perception



Similar Documents

Publication Publication Date Title
CN109377485B (en) Machine vision detection method for instant noodle packaging defects
CN115311292B (en) Strip steel surface defect detection method and system based on image processing
CN115082683B (en) Injection molding defect detection method based on image processing
CN115082419B (en) Blow-molded luggage production defect detection method
CN115018828A (en) Defect detection method for electronic component
CN103971126A (en) Method and device for identifying traffic signs
EP3161787A1 (en) Detecting edges of a nucleus using image analysis
CN116703907A (en) Machine vision-based method for detecting surface defects of automobile castings
CN115294379B (en) Flotation method foam identification method based on optical information
CN115249246A (en) Optical glass surface defect detection method
CN111476246B (en) Robust and efficient intelligent reading method for pointer instrument applied to complex environment
CN109886168B (en) Ground traffic sign identification method based on hierarchy
CN110648330B (en) Defect detection method for camera glass
CN107516315B (en) Tunneling machine slag tapping monitoring method based on machine vision
CN110175556B (en) Remote sensing image cloud detection method based on Sobel operator
CN114972356A (en) Plastic product surface defect detection and identification method and system
CN115311283B (en) Glass tube drawing defect detection method and system
CN114119603A (en) Image processing-based snack box short shot defect detection method
CN116309577B (en) Intelligent detection method and system for high-strength conveyor belt materials
CN117237646B (en) PET high-temperature flame-retardant adhesive tape flaw extraction method and system based on image segmentation
CN116542968A (en) Intelligent counting method for steel bars based on template matching
Liang et al. Flotation froth image segmentation based on highlight correction and parameter adaptation
CN112101108A (en) Left-right-to-pass sign identification method based on pole position characteristics of graph
CN110110810B (en) Squid quality grade identification and sorting method
CN113989771A (en) Traffic signal lamp identification method based on digital image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant