CN104408724B - Depth-information-based froth flotation level monitoring and operating-condition recognition method and system - Google Patents


Info

Publication number
CN104408724B
CN104408724B (application CN201410699401.0A)
Authority
CN
China
Prior art keywords
depth, gray, foam, formula, value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410699401.0A
Other languages
Chinese (zh)
Other versions
CN104408724A (en)
Inventor
彭涛
赵永恒
赵林
蔡耀仪
宋彦坡
韩华
赵璐
彭霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central South University
Original Assignee
Central South University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central South University
Priority to CN201410699401.0A
Publication of CN104408724A
Application granted
Publication of CN104408724B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01FMEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
    • G01F23/00Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm
    • G01F23/22Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water
    • G01F23/28Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water by measuring the variations of parameters of electromagnetic or acoustic waves applied directly to the liquid or fluent solid material
    • G01F23/284Electromagnetic waves
    • G01F23/292Light, e.g. infrared or ultraviolet

Abstract

The present invention discloses a depth-information-based froth flotation level monitoring and operating-condition recognition method and system. First, a Kinect-based froth flotation condition monitoring system is built in both hardware and software. Second, flotation froth color and depth data are acquired by the Kinect sensor; the depth information is extracted from the depth data and filtered, and the color and depth data are aligned in time and position and stored correspondingly. Then, combining the color and depth data, stereoscopic (depth-bearing) features such as froth color, area, volume, speed and burst rate are extracted. Next, the liquid level is monitored by analysing the relation between the current froth-surface level features and the overflow launder edge height. Finally, the froth image features are cluster-analysed with an improved k-means algorithm to recognise the flotation operating condition online. The invention can be used for on-site condition monitoring and real-time condition recognition in froth flotation, enabling automatic control and optimised operation of flotation production and improving resource utilisation.

Description

Depth-information-based froth flotation level monitoring and operating-condition recognition method and system
Technical field
The present invention relates to the field of mineral processing, in particular to a depth-information-based froth flotation level monitoring and operating-condition recognition method and system.
Background technology
The flotation operating condition is the working state of the flotation production process, and identifying it timely and accurately is crucial for guiding flotation production. Froth flotation is the most widely used beneficiation method in mineral processing, and the physicochemical changes involved are extremely complex. It has always been up to experienced workers to observe with the naked eye the changes of the froth surface characteristics (color, size, speed, etc.) in the flotation cell, judge the current operating condition subjectively, and control flotation operations such as reagent dosage accordingly. But different workers have no unified quantitative criterion for judging froth surface characteristics, so condition recognition, and even operation, is subjective and rather arbitrary; flotation is hard to keep in an optimal state, the production process becomes unstable, and product quality suffers. With the rapid development of machine vision, image processing and related technologies, great progress has been made in intelligent condition recognition based on froth characteristics at the flotation site. By quickly and accurately identifying the operating-condition class at the flotation site, the flotation production control system can adjust production parameters in time and keep the flotation production process in an optimal state.
The mineral slurry level is an important indicator for operating-condition recognition at the flotation site, but it is difficult to measure because it is covered by the froth layer. Meanwhile, froth flotation condition monitoring and recognition systems and methods based on traditional industrial cameras (hereinafter, traditional systems and methods) acquire flotation froth images containing only planar image information, without depth information, so changes of the mineral slurry level cannot be monitored in real time and automatic recognition and control of the level are hard to realise. In addition, the lack of depth information also introduces a large gap between the determination of empirical image-feature values (from the workers' stereoscopic vision) and the extraction of monitored values (from the planar vision of traditional systems and methods), causing unstable and inaccurate condition-recognition results and severely limiting the application of traditional systems and methods.
Summary of the invention
The object of the present invention is to provide a depth-information-based froth flotation level monitoring and operating-condition recognition method and system. From the distances of all points of the froth surface in the field of view to the tank bottom, froth-surface level features related to the mineral slurry level are obtained, effectively achieving online, machine-vision-based level monitoring; the stereoscopic features of the flotation froth surface extracted by the Kinect sensor effectively break through the limitation of planar froth features, so the current operating condition can be recognised more accurately.
To achieve the above object, the present invention provides a depth-information-based froth flotation level monitoring and operating-condition recognition method, comprising the following steps:
Step 1: Build the Kinect-based froth flotation condition monitoring system in both hardware and software;
Step 2: Acquire, process and store the flotation froth color data and depth data obtained by the Kinect;
Step 3: Perform feature extraction on the color and depth data obtained in step 2; combining color and depth data, extract stereoscopic (depth-bearing) features such as froth color, area, volume, speed and burst rate;
Step 4: Statistically analyse the depth data obtained in step 2 with the distribution-fitting method; obtain the froth-surface level features with the parameter-estimation method; then obtain the relation between the froth-surface level features and the overflow launder edge height, for online monitoring of the froth flotation level and operating-condition recognition;
Step 5: Form feature vectors from the froth features obtained in step 3 and perform offline cluster analysis with the improved k-means algorithm to obtain several cluster centres; extract the froth features of the flotation site in real time and match them against the cluster centres online to obtain the current flotation operating condition.
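The online matching step in Step 5 amounts to a nearest-centre lookup once the cluster centres exist. A minimal sketch, assuming plain Euclidean distance and hypothetical two-dimensional centres; the improved k-means training itself and the full froth feature vector are not reproduced here:

```python
import numpy as np

def nearest_condition(feature_vec, centers):
    # Index of the closest cluster centre under Euclidean distance.
    # `centers` is a (k, d) array produced offline by the clustering step;
    # `feature_vec` is the d-dimensional froth feature vector.
    d = np.linalg.norm(np.asarray(centers, dtype=float)
                       - np.asarray(feature_vec, dtype=float), axis=1)
    return int(d.argmin())

# Hypothetical centres for three operating conditions (illustrative values).
centers = np.array([[0.2, 0.1], [0.8, 0.9], [0.5, 0.5]])
condition = nearest_condition([0.75, 0.85], centers)
```

In the real system the matched index would map to an operating-condition class learned offline from historical froth features.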
The present invention also provides a depth-information-based froth flotation level monitoring and operating-condition recognition system, characterised in that it comprises a Kinect sensor, a high-frequency light source and a computer. The Kinect sensor is fixed 1.2 m-3.5 m from the flotation cell liquid level, with the plane of its camera parallel to the cell bottom. The high-frequency light source provides illumination for the Kinect sensor's color acquisition; the collected color and depth data streams are sent to the computer over a USB data line, and the computer performs data processing, real-time condition monitoring and result display. The software system runs under the cross-platform framework OpenNI, using its interactive programming interface to realise real-time acquisition, processing, storage and feature extraction of the color and depth data streams, and operating-condition recognition.
Beneficial effects: the stereoscopic features of the flotation froth surface extracted by the present invention through the Kinect sensor effectively break through the limitation of planar froth features and, combined with the extracted froth-surface level features, allow the current operating condition to be recognised more accurately.
Brief description of the drawings
Fig. 1 is a hardware structure diagram of the depth-information-based froth flotation level monitoring and operating-condition recognition method and system of an embodiment of the present invention.
Fig. 2 is a software flow chart of the depth-information-based froth flotation level monitoring and operating-condition recognition method and system shown in Fig. 1.
Fig. 3 is the color image gathered by the Kinect in the embodiment of the present invention.
Fig. 4 is the depth image gathered by the Kinect in the embodiment of the present invention.
Embodiment
The Kinect is a novel sensor (camera) developed by Microsoft that can acquire two kinds of data, color and depth, of a target area in real time. It breaks through the limitation of the two-dimensional image features of traditional cameras and can obtain three-dimensional feature information closer to the real world. At present it is applied in the gaming field, but has not yet been widely used at industrial sites.
Therefore, we introduce the Kinect sensor to the froth flotation industrial site for the first time. From the depth data collected by the Kinect sensor, froth-surface level features related to the mineral slurry level are obtained for level and condition monitoring; combined with the color data collected by the Kinect sensor, stereoscopic (depth-bearing) features such as froth color, area, volume, speed and burst rate are extracted, realising real-time operating-condition recognition of the froth flotation production process.
The present invention is described in further details below with reference to the drawings and specific embodiments.
As shown in Fig. 2, the depth-information-based froth flotation level monitoring and operating-condition recognition method of the present embodiment includes the following steps:
Step 1: Build the Kinect-based froth flotation condition monitoring system in both hardware and software;
The depth-information-based froth flotation condition monitoring and recognition system is built as shown in Fig. 1. The hardware mainly consists of a Kinect sensor 1, a high-frequency light source 4 and a computer 5. The high-frequency light source 4 provides illumination for the color acquisition of the Kinect sensor 1. The Kinect sensor 1 is fixed 2 m from the liquid level of the flotation cell 6, with the plane of its camera 2 kept parallel to the bottom of the flotation cell 6; the color and depth data streams collected within the camera field of view 3 are sent to the computer 5 over a USB data line, and the computer 5 performs data processing, real-time condition monitoring and result display. In addition, dust covers (not drawn in the figure) are added outside the high-frequency light source 4 and the Kinect sensor 1 to weaken external influences (such as corrosion and lens contamination) and enhance the reliability of the system. One end of the flotation cell 6 is provided with an overflow launder 7, and a stirring device 8 and a scraper 9 are installed inside the flotation cell 6. In the figure, Hb denotes the distance from the camera plane to the bottom of the flotation cell; Hd the distance from a froth-surface point in the field of view to the camera plane; Hh the distance from a froth-surface point in the field of view to the tank bottom; Ht the froth-layer thickness (the distance from a point in the field of view to the slurry surface); Hk the mineral slurry level; and Hy the overflow launder edge height.
On the software side, the Kinect-based froth flotation condition monitoring system is built under the cross-platform framework OpenNI (Open Natural Interaction), using its interactive programming interface to realise real-time acquisition, processing, storage and feature extraction of the color and depth data streams, and operating-condition recognition.
Step 2: Acquire, process and store the flotation froth color data and depth data obtained by the Kinect;
This comprises the following sub-steps:
Step 1: Acquire data;
The color data stream is set to RGB888 (RGB) format, resolution 640×480, frame rate 30 FPS; the depth data stream is set to PIXEL_FORMAT_DEPTH_1_MM format (data precision one millimetre), resolution 640×480, frame rate 30 FPS. The "Alternative View" tool under the cross-platform framework OpenNI is used to correct the viewing angles of the color camera and the depth camera, so that the color and depth data streams are aligned on the observed field.
The Kinect device is started for image acquisition; the color data stream and depth data stream are created with the data formats set above, giving the two kinds of temporally aligned data shown in Fig. 3 and Fig. 4: the 640×480×3-order color matrix C and the 640×480-order depth matrix D. For example, in the color image the point P1(243,133) is labelled "RGB:242,241,239", meaning its values in the collected color matrix are C(243,133,1)=242, C(243,133,2)=241, C(243,133,3)=239. In the depth image the point P1(243,133) is labelled "Index:661", meaning its value in the collected depth matrix is D(243,133)=661; the label "RGB:0.918,0.918,0.918" is obtained by normalizing the depth value for display. A deeper color in the depth image means a smaller depth value, i.e. that part of the froth surface is nearer to the camera; comparing the Index value (661) at the froth top point P1 with the Index value (693) at the froth edge point P2 readily verifies this.
Step 2:Processing data;
(1) Obtain the depth information in the depth data;
The depth matrix D obtained by the Kinect sensor consists of 16-bit integer data, whose high 13 bits store the depth information and whose low 3 bits store index information. The depth information is obtained by a shift operation:
D1=D/2³  Formula 1
where D1 is the depth matrix storing only the depth information (without index information).
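The shift of Formula 1 can be illustrated on synthetic data. A minimal sketch, assuming hypothetical raw 16-bit depth words packed as described, high 13 bits depth in millimetres and low 3 bits index:

```python
import numpy as np

# Two hypothetical raw 16-bit words, built from the sample depths 661 and
# 693 (mm) of Figs. 3-4 with arbitrary 3-bit index values.
raw = np.array([[(661 << 3) | 5, (693 << 3) | 2]], dtype=np.uint16)

# Formula 1: dividing by 2**3 (a right shift by 3 bits) drops the index bits.
D1 = raw >> 3
```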
(2) Filter the depth data and fill in missing values with the help of the color data;
Using the joint bilateral filtering method, the depth matrix D1 is filtered and the depth pixels missing due to occlusion are filled in, giving the filtered depth matrix D2, whose depth value D2(x,y) at point (x,y) is:
D2(x,y)=(1/wp)·Σ(i1,j1)∈Ω1 w(i1,j1)·D1(i1,j1)  Formula 2
where Ω1 is the filter reference neighbourhood of the required point (x,y), x∈[1,a], y∈[1,b], (i1,j1)∈Ω1, i1∈[x−r1,x+r1], j1∈[y−r1,y+r1], r1 is the neighbourhood radius, D1(i1,j1) is the depth value of matrix D1 at point (i1,j1), and wp is the normalization parameter:
wp=Σ(i1,j1)∈Ω1 w(i1,j1)  Formula 3
where w(i1,j1) is the weight of each point (i1,j1) of the neighbourhood Ω1 with respect to the required point, composed of the depth-image spatial-domain weight ws and the color-image gray-domain weight wr, i.e.:
w(i1,j1)=ws(i1,j1)×wr(i1,j1)  Formula 4
where:
ws(i1,j1)=exp(−((x−i1)²+(y−j1)²)/(2σs²))  Formula 5
wr(i1,j1)=exp(−(Gray(x,y)−Gray(i1,j1))²/(2σr²))  Formula 6
where σs and σr are the Gaussian standard deviations of the weights ws and wr respectively, and Gray(x,y), Gray(i1,j1) are the gray values at pixel (x,y) and at each point (i1,j1) of its neighbourhood Ω1, computed as:
Gray(x,y)=0.2989×C(x,y,1)+0.5870×C(x,y,2)+0.1140×C(x,y,3)  Formula 7
where C(x,y,1), C(x,y,2), C(x,y,3) are the color values at point (x,y) of the a×b matrices of the R, G, B color components of the a×b×3 color matrix C; traversing all points gives the gray matrix Gray.
With suitably chosen Gaussian standard deviations σs and σr, and within a suitable filter reference neighbourhood Ω1, the missing depth information and noise points of occluded regions can be effectively eliminated.
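The joint bilateral filter of Formulas 2-6 can be written out directly. A minimal, unoptimized sketch; the radius and standard deviations used below are illustrative values, not the patent's:

```python
import numpy as np

def joint_bilateral(depth, gray, r=3, sigma_s=2.0, sigma_r=10.0):
    # Smooth (and fill) the depth matrix guided by the gray image: the
    # weight of each neighbour combines spatial closeness (ws) with
    # gray-level similarity in the guide image (wr), as in Formulas 4-6.
    h, w = depth.shape
    out = np.empty((h, w), dtype=float)
    for x in range(h):
        for y in range(w):
            i0, i1 = max(0, x - r), min(h, x + r + 1)
            j0, j1 = max(0, y - r), min(w, y + r + 1)
            ii, jj = np.mgrid[i0:i1, j0:j1]
            ws = np.exp(-((ii - x) ** 2 + (jj - y) ** 2) / (2 * sigma_s ** 2))
            wr = np.exp(-((gray[i0:i1, j0:j1] - gray[x, y]) ** 2)
                        / (2 * sigma_r ** 2))
            wgt = ws * wr  # Formula 4
            out[x, y] = (wgt * depth[i0:i1, j0:j1]).sum() / wgt.sum()
    return out

# On a constant depth plate the filter must act as the identity,
# whatever the guide image looks like.
depth = np.full((8, 8), 661.0)
gray = (np.arange(64.0).reshape(8, 8) % 7.0) * 10.0
smooth = joint_bilateral(depth, gray)
```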
(3) Obtain the depth matrix D′ reflecting the distance Hh from the froth surface to the tank bottom;
As shown in Fig. 1, Hh denotes the distance from a froth-surface point (x,y) in the field of view to the tank bottom, obtained by:
Hh=Hb−Hd  Formula 8
where Hb is the distance from the plane of the Kinect sensor's camera to the bottom of the flotation cell (a constant after installation), and Hd is the distance from a froth-surface point in the field of view to the plane of the Kinect sensor's camera. With the camera plane parallel to the cell bottom, Hd is simply the value of the corresponding point of the depth matrix D2 obtained from the Kinect sensor after shifting and filter completion, i.e. Hd=D2(x,y).
The depth value D′(x,y) reflecting Hh at point (x,y) is then:
D′(x,y)=Hb−D2(x,y)  Formula 9
In the present embodiment, the depth information D1 is obtained from the depth data by the shift operation of Formula 1. The filter reference neighbourhood Ω1 of the required point (x,y) has radius r1=3, with x∈[1,640], y∈[1,480], i1∈[x−r1,x+r1], j1∈[y−r1,y+r1]; the depth matrix D1 is filtered by Formulas 2-6 to obtain the filtered depth matrix D2. The color data are converted to gray by Formula 7 to obtain the gray matrix Gray. The depth matrix D′ reflecting the froth-surface distance Hh from the tank bottom is obtained by Formulas 8-9, where Hb=4 m.
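Formula 9 then reduces to one elementwise subtraction. A minimal sketch using the sample Index values 661 and 693 from Figs. 3-4 and the embodiment's Hb = 4 m:

```python
import numpy as np

Hb = 4000.0                      # camera plane to tank bottom, in mm (Hb = 4 m)
D2 = np.array([[661.0, 693.0]])  # filtered froth-surface-to-camera depths, mm

D_prime = Hb - D2                # Formula 9: froth-surface height above the bottom
```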
Step 3: Store data;
The color matrix C and the depth matrix D′ are stored: the color data with the Motion JPEG AVI coding scheme, the depth data with the Motion JPEG 2000 (16-bit) coding scheme.
Step 3: Perform feature extraction on the color and depth data obtained in step 2; combining color and depth data, extract stereoscopic (depth-bearing) features such as froth color, area, volume, speed and burst rate. This includes the following sub-steps:
Step 1: Extract color features;
Feature extraction is performed on the color and depth data obtained in step 2; combining color and depth data, the color features of the froth, namely the R mean, G mean, B mean and gray mean Graym, are extracted by Formulas 12-15.
Assuming that the depth values of objects in the field of view lie in the range [dmin,dmax], the reference distance is defined as:
db=(dmax+dmin)/2  Formula 10
Assume the color of a point at the reference distance takes its standard value. Under the same illumination intensity, the farther a point is from the lens (beyond the reference distance) the more its color decays, so its value must be strengthened; conversely, the color values of points nearer the lens (within the reference distance) must be weakened. Therefore a weight function is set to standardize the color data against the depth data:
Q(x,y)=(D′(x,y)−db)³/10⁶+1  Formula 11
The color features after standardizing the color values are defined as:
R mean: Rm=(1/(a×b))·ΣxΣy Q(x,y)×C(x,y,1)  Formula 12
G mean: Gm=(1/(a×b))·ΣxΣy Q(x,y)×C(x,y,2)  Formula 13
B mean: Bm=(1/(a×b))·ΣxΣy Q(x,y)×C(x,y,3)  Formula 14
From the gray matrix Gray obtained in step 2, the gray mean is calculated:
Graym=(1/(a×b))·ΣxΣy Gray(x,y)  Formula 15
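The depth-weighted color standardization can be sketched on a toy image. The midpoint reference distance and the plain weighted mean used below are assumptions of this sketch, since the patent's equation images are not reproduced:

```python
import numpy as np

# Toy 2x2 frame: D_prime holds froth-surface heights (mm), C_R the R channel.
D_prime = np.array([[3300.0, 3350.0], [3400.0, 3450.0]])
C_R = np.array([[200.0, 210.0], [190.0, 205.0]])

d_min, d_max = D_prime.min(), D_prime.max()
d_b = (d_max + d_min) / 2.0            # reference distance (assumed midpoint)
Q = (D_prime - d_b) ** 3 / 1e6 + 1.0   # Formula 11: depth-based weight
R_mean = (Q * C_R).mean()              # weighted R mean (assumed form)
```

Note that with the midpoint reference and millimetre depths, Q stays in a moderate band around 1, which is what makes the cubic correction of Formula 11 a gentle per-pixel gain rather than a blow-up.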
Step 2: Extract the area feature;
The gray-level distribution Gs of the gray matrix Gray of a frame obtained in step 2 is calculated, and the contrast-enhanced gray matrix Gray′ is obtained by Formulas 16-17. The matrix Gray′ is segmented with the watershed algorithm, and the froth-region binary matrix A is obtained by Formula 18. All pixels of the frame are traversed and the froth regions are numbered, giving the label information Psn of the marked regions. The areas of all N froth regions are counted to give the area distribution Area; the N area values in Area are re-sorted in ascending order to give Area′, and the froth area feature Aream is extracted by Formula 22.
For the gray matrix Gray of a frame obtained in step 2, the gray-level distribution Gs is calculated, where Gs(g) denotes the number of pixels satisfying Gray(x,y)=g, g∈[0,255].
Starting from e=0, take e=0, e=1, ..., e=e′, e′∈(0,255); when e=e′ the following is satisfied:
Formula 16
The contrast-enhanced gray matrix Gray′ is then obtained, whose gray value at point (x,y) is:
Formula 17
The matrix Gray′ is segmented with the watershed algorithm to obtain the froth-region binary matrix A, whose value at point (x,y) is denoted A(x,y); froth-region points are distinguished from boundary points by Formula 18.
All pixels of a frame are traversed and the froth regions are numbered, giving the label information of the marked regions:
Psn: {(x1,y1),(x2,y2),...,(xArea(n),yArea(n))}  Formula 19
where Psn denotes the set of all pixels of the n-th froth region, n=1,2,...,N, and N is the total number of froth regions. Area(n) denotes the number of pixels of the n-th froth region, i.e. its area. Counting the areas of all N froth regions gives the area distribution Area: {Area(1), Area(2), ... Area(n) ... Area(N)}. The total area of the N froth regions of a frame is calculated as:
Areasum=Σ(n=1..N) Area(n)  Formula 20
The N area values in the area distribution Area are re-sorted in ascending order to obtain Area′: {Area′(1), Area′(2), ... Area′(n), ... Area′(N)}, where Area′(1) is the minimum froth area and Area′(N) the maximum.
Starting from f1=1, take f1=1, f1=2, ..., f1=f1′, f1′∈(1,N); when f1=f1′ the following is satisfied:
Formula 21
The froth area feature is then defined as:
Formula 22
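The region numbering behind Formula 19 and the area distribution can be sketched with simple 4-connected labelling, standing in for the watershed segmentation, which is not reproduced here:

```python
import numpy as np
from collections import deque

def label_regions(A):
    # 4-connected labelling of the froth-region binary matrix A.
    # Returns (labels, areas): labels[x, y] = region number (0 = background),
    # areas = region pixel counts sorted ascending, i.e. the Area' ordering.
    h, w = A.shape
    labels = np.zeros((h, w), dtype=int)
    areas = []
    for sx in range(h):
        for sy in range(w):
            if A[sx, sy] and not labels[sx, sy]:
                n = len(areas) + 1
                labels[sx, sy] = n
                q, count = deque([(sx, sy)]), 0
                while q:
                    x, y = q.popleft()
                    count += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = x + dx, y + dy
                        if 0 <= u < h and 0 <= v < w and A[u, v] and not labels[u, v]:
                            labels[u, v] = n
                            q.append((u, v))
                areas.append(count)
    return labels, sorted(areas)

A = np.array([[1, 1, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 1]])
labels, areas = label_regions(A)   # two regions of three pixels each
```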
Step 3: Extract the volume feature;
The region centroid of the n-th froth region and the semi-major and semi-minor axes CBn, DBn of its approximating ellipse are calculated by Formulas 23-28; the pixel coordinates of the major-axis end points, c1(un1,vn1) and c2(un2,vn2), are calculated by Formulas 29-30. Combining the depth matrix D′ obtained in step 2, the depth values ln1, ln2, lnc corresponding to the major-axis end points and the region centroid are calculated by Formula 31. Assuming each bubble is approximately a standard spheroid, the real spatial height hn of the bubble is obtained by Formula 32. The actual major-axis length CRn is obtained by Formulas 33-36, and the actual minor-axis length DRn similarly. The volume V(n) of the n-th bubble is obtained by Formula 37.
The volume distribution Vs of the N froth regions is re-sorted in ascending order to give Vs′, and the froth volume feature Vm is extracted by Formulas 38-39.
For the region binary matrix A of a frame and the region labels Psn, the (p+q)-order regular moment of its n-th froth region is defined as:
mpq(n)=Σ(x,y)∈Psn x^p·y^q·A(x,y)  Formula 23
The region centroid (x̄n,ȳn) of the n-th froth region follows from its first-order regular moments:
x̄n=m10(n)/m00(n), ȳn=m01(n)/m00(n)  Formula 24
The (p+q)-order central moment of a frame is then:
μpq(n)=Σ(x,y)∈Psn (x−x̄n)^p·(y−ȳn)^q·A(x,y)  Formula 25
If only the set of second-order central moments of a frame is considered, each froth region can be approximated by an ellipse centred at the region centroid, giving:
CBn=sqrt(2·[μ20+μ02+sqrt((μ20−μ02)²+4μ11²)]/μ00)  Formula 26
DBn=sqrt(2·[μ20+μ02−sqrt((μ20−μ02)²+4μ11²)]/μ00)  Formula 27
θn=(1/2)·arctan(2μ11/(μ20−μ02))  Formula 28
where CBn is the semi-major axis of the approximating ellipse, DBn the semi-minor axis, and θn the inclination angle.
The pixel coordinates of the end points of the ellipse's major (minor) axis, c1(un1,vn1) and c2(un2,vn2), are then calculated as:
(un1,vn1)=(x̄n+CBn·cosθn, ȳn+CBn·sinθn)  Formula 29
(un2,vn2)=(x̄n−CBn·cosθn, ȳn−CBn·sinθn)  Formula 30
Combining the depth matrix D′ obtained in step 2, the depth values ln1, ln2, lnc corresponding to the major-axis end points and the region centroid are calculated:
ln1=D′(un1,vn1)
ln2=D′(un2,vn2), lnc=D′(x̄n,ȳn)  Formula 31
Assuming each bubble can be approximated by a standard spheroid, the real spatial height hn of the bubble is defined as:
hn=lnc−min(ln1,ln2)  Formula 32
Combining the view parameters of the Kinect sensor, the horizontal view angle δ1 and vertical view angle δ2, the three-dimensional space coordinates Cn1 and Cn2 corresponding to the major-axis end points are calculated:
Cn1:(δ1[2un1L+(a−1)(L−ln1)], δ2[2(b−vn1−1)L+(b−1)(L−ln1)], L−ln1)  Formula 33
Cn2:(δ1[2un2L+(a−1)(L−ln2)], δ2[2(b−vn2−1)L+(b−1)(L−ln2)], L−ln2)  Formula 34
where L denotes the distance from the depth camera to the reference image plane (space coordinate z=0).
The actual length of the major axis is then obtained:
CRn=||Cn1−Cn2||  Formula 35
Assuming the depth values at the two end points of the major axis are equal, the actual major-axis length can be simplified to:
Formula 36
The actual minor-axis length DRn is obtained similarly.
The volume of the n-th bubble (assuming the bubble is a standard spheroid) is finally obtained:
Formula 37
Following the same approach as for the area feature in sub-step 2 of step 3, the N region-volume distribution values Vs: {Vs(1), Vs(2), ... Vs(n), ... Vs(N)} are re-sorted in ascending order to obtain Vs′: {Vs′(1), Vs′(2), ... Vs′(n), ... Vs′(N)};
Starting from f2=1, take f2=1, f2=2, ..., f2=f2′, f2′∈(1,N); when f2=f2′ the following is satisfied:
Formula 38
The froth volume feature is then defined as:
Formula 39
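The second-moment ellipse fit of Formulas 26-27 can be checked on a synthetic disk, where the two semi-axes must agree. The half-ellipsoid volume at the end is an illustrative stand-in for Formula 37, whose equation image is not reproduced:

```python
import numpy as np

def ellipse_axes(mask):
    # Semi-axes of the ellipse approximating a region, from normalized
    # second-order central moments (the mu_pq / mu_00 of Formulas 26-27).
    ys, xs = np.nonzero(mask)
    xb, yb = xs.mean(), ys.mean()
    mu20 = ((xs - xb) ** 2).mean()
    mu02 = ((ys - yb) ** 2).mean()
    mu11 = ((xs - xb) * (ys - yb)).mean()
    root = np.sqrt((mu20 - mu02) ** 2 + 4.0 * mu11 ** 2)
    CB = np.sqrt(2.0 * (mu20 + mu02 + root))   # semi-major axis
    DB = np.sqrt(2.0 * (mu20 + mu02 - root))   # semi-minor axis
    return CB, DB

# Discrete disk of radius 10: both semi-axes should come out near 10.
yy, xx = np.mgrid[-15:16, -15:16]
disk = (xx ** 2 + yy ** 2) <= 100
CB, DB = ellipse_axes(disk)

h_n = 10.0                                # bubble height from the depth data
V = (2.0 / 3.0) * np.pi * CB * DB * h_n   # half-ellipsoid volume (illustrative)
```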
Step 4: Extract the speed feature;
(1) Harris corner detection;
First, suppose a point (x,y) in the enhanced gray matrix Gray′ is a corner, and take its neighbourhood Ω2 with radius r2=5, i2∈[x−r2,x+r2], j2∈[y−r2,y+r2]. The score value G of every point of the corner neighbourhood Ω2 is calculated by Formulas 40-42; the h1 points whose score exceeds the qualifying score are taken and sorted by score from high to low, giving the point set Hs1.
Suppose a point (x,y) in the enhanced gray matrix Gray′ is a corner. The mean intensity (gray value) change of all points of its neighbourhood Ω2 along an offset direction (Vx,Vy) is:
E(Vx,Vy)=Σ(i2,j2)∈Ω2 [Gray′(i2+Vx,j2+Vy)−Gray′(i2,j2)]²  Formula 40
where i2∈[x−r2,x+r2], j2∈[y−r2,y+r2], and r2 is the radius of the neighbourhood Ω2.
The mean intensity change is computed for all offset directions of the corner (x,y); expanding Formula 40 as a Taylor series and omitting higher-order terms, it can be described in matrix form as:
E(Vx,Vy)≈(Vx,Vy)·Q·(Vx,Vy)ᵀ  Formula 41
where Q is the covariance matrix; its two eigenvalues represent the maximum mean-intensity change of the point over all offset directions and the mean-intensity change in the perpendicular direction. The scoring function is defined as:
G=Det(Q)−k·Trace²(Q)  Formula 42
where Det(Q), the determinant of the matrix, is the product of the eigenvalues, and Trace(Q), the trace of the matrix, is their sum. When both eigenvalues are large, the score is high.
Define Gr as the qualifying score. Traversing all points of the corner neighbourhood Ω2, the h1 points whose score exceeds Gr are obtained and sorted by score from high to low, giving the point set Hs1.
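The score of Formula 42 can be checked on a synthetic quadrant image: large at the corner, zero in flat regions, negative on edges. A minimal sketch with an illustrative window radius and k = 0.04:

```python
import numpy as np

def harris_score(img, x, y, r=2, k=0.04):
    # Harris response G = Det(Q) - k * Trace(Q)^2 at pixel (x, y), with the
    # structure tensor Q accumulated over a (2r+1)^2 window (Formula 42).
    Ix = np.gradient(img, axis=1)
    Iy = np.gradient(img, axis=0)
    win = (slice(y - r, y + r + 1), slice(x - r, x + r + 1))
    a = (Ix[win] ** 2).sum()
    b = (Iy[win] ** 2).sum()
    c = (Ix[win] * Iy[win]).sum()
    return a * b - c * c - k * (a + b) ** 2

# Bright quadrant: a corner at (10, 10), edges along its two borders.
img = np.zeros((20, 20))
img[10:, 10:] = 1.0
g_corner = harris_score(img, 10, 10)
g_edge = harris_score(img, 16, 10)   # on the edge, away from the corner
g_flat = harris_score(img, 5, 5)     # uniform region
```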
Secondly, take the corner neighbourhood Ω3 with radius r3=7, i3∈[x−r3,x+r3], j3∈[y−r3,y+r3]. Local extremum processing of the gray values of the enhanced gray matrix Gray′ is performed by Formula 43; the local gray extremum points are determined by Formulas 44-45; the points satisfying the local gray extremum condition are filtered out of the point set Hs1, giving a point set Hs2 composed of h2 points.
Local extremum processing of the gray values is applied to the enhanced gray matrix Gray′:
Graymax (x, y)=max (Gray ' (i3,j3)),(i3,j3)∈Ω3
Formula 43
Graymin (x, y)=min (Gray ' (i3,j3)),(i3,j3)∈Ω3
where Graymax(x,y) is the value at point (x,y) of the matrix Graymax after local maximization of the gray values, Graymin(x,y) is the value at point (x,y) of the matrix Graymin after local minimization, and Ω3 is the neighbourhood of point (x,y) with radius r3, i3∈[x−r3,x+r3], j3∈[y−r3,y+r3].
The local gray extremum points are determined by Formulas 44-45. The points of the point set Hs1 satisfying the local gray extremum condition are filtered out, giving the point set Hs2 composed of h2 points.
Finally, the same local extremum operation applied to the gray values of Gray′ is applied to the depth matrix D′, giving the local depth extremum points; the points of the point set Hs2 satisfying the local depth extremum condition are filtered out, giving the corner point set Hs3 composed of h3 points.
(2) The points of the corner point set Hs3, determined by the three rounds of corner screening, are tracked to extract the speed feature.
Assume the intensity (gray value) of the same corner (xh,yh) is constant between two successive frames, i.e.:
Gray′t(xh,yh)=Gray′t+Vt(xh+Vu,yh+Vv)  Formula 46
where Gray′t(xh,yh) is the gray value at point (xh,yh) of the enhanced gray matrix Gray′ at time t (previous frame), Gray′t+Vt(xh+Vu,yh+Vv) is the gray value at point (xh+Vu,yh+Vv) at time t+Vt (next frame), (Vu,Vv) is the offset of the corner (xh,yh) during the time Vt, Vt is the interval between two adjacent frames, and h=1,2,...,h3.
This yields the basic optical-flow constraint equation:
Ix·Vu+Iy·Vv+It=0  Formula 47
where Ix, Iy, It denote the partial derivatives of the gray value with respect to x, y and t.
Assume the corner (xh,yh) and all points of its neighbourhood Ω4 (radius r4) share the same offset. Substituting the corner (xh,yh) and all (2r4+1)² points of its neighbourhood Ω4 into the optical-flow constraint equation gives more equations than unknowns (the two components Vu, Vv); solving iteratively yields the offset (Vu,Vv) of the corner (xh,yh), whose displacement magnitude is:
Vwh=sqrt(Vu²+Vv²)  Formula 48
Iterating the solution over all detected corners, the mean displacement is found as:
w=(1/h3)·Σ(h=1..h3) Vwh  Formula 49
Adjacent two field pictures interval time is Vt, then defines foam velocity characteristic and be:
Formula 50
In the present embodiment, corners (xh, yh), h = 1, 2, …, h3, are taken from the corner point set Hs3 determined by the three rounds of corner screening; the neighbourhood Ω4 has radius r4 = 1, with i4∈[x-r4,x+r4], j4∈[y-r4,y+r4]. The displacement Δw of each point in Hs3 and the average displacement w of all points are obtained by formulas 46–49; with the interval between adjacent frames Δt = 0.33 s, the foam velocity feature Rate is obtained by formula 50.
During tracking, corners that no longer need to be tracked must be continually deleted, e.g. corners whose displacement Δw is very small or which leave the field of view. When, after such deletions, fewer than 50 corners remain, corners must be re-detected by the aforementioned steps and tracked again.
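The tracking step of formulas 46–50 amounts to a Lucas-Kanade least-squares solve of the optical flow constraint over the corner's neighbourhood Ω4. A minimal non-iterative, non-pyramidal sketch (the patent solves iteratively; function and variable names are illustrative):

```python
import numpy as np

def lk_offset(prev, curr, i, j, r4=1):
    """Least-squares solution of Ix*du + Iy*dv = -It over the (2*r4+1)^2
    neighbourhood of the corner at row i, col j (a sketch of formulas
    46-48; real trackers iterate and often use image pyramids)."""
    Iy, Ix = np.gradient(prev)            # gradients along rows and columns
    It = curr - prev                      # temporal intensity change
    sl = (slice(i - r4, i + r4 + 1), slice(j - r4, j + r4 + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (du, dv), *_ = np.linalg.lstsq(A, b, rcond=None)
    return du, dv                         # offset in columns (x) and rows (y)
```

The displacement of one corner is then `np.hypot(du, dv)` (formula 48), and the velocity feature is the mean displacement over all tracked corners divided by the frame interval Δt (formulas 49–50).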
Step 5: Extract the percentage-of-damage feature;
The depth matrix D' obtained in step 3 is put in one-to-one correspondence with the N foam-region label information point sets Psn of a frame, giving the set PDn of depth values for each pixel of the n-th foam region: (d1, d2, …, dArea(n)). The minimum depth value dmin is found, and the height of each pixel of the foam region is obtained as PHn: (d1-dmin, d2-dmin, …, dArea(n)-dmin). The height sum PHnsum is computed by formula 51:
PHnsum = Σ PHn Formula 51
For the next frame, the depth data PD'n (d'1, d'2, …, d'Area(n)) corresponding to the previous frame's point set Psn are read, the height of each pixel of the foam region is obtained as PH'n: (d'1-dmin, d'2-dmin, …, d'Area(n)-dmin), and the height sum PH'nsum is computed by formula 52:
PH'nsum = Σ PH'n Formula 52
When PH'nsum/PHnsum < 0.75, the bubble labeled n is considered burst. All N regions are processed in the same way; counting a total of BS burst bubbles, the foam percentage-of-damage feature Break is extracted by formula 53:
Break = BS/N Formula 53
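Formulas 51–53 can be condensed into a short sketch, assuming the per-region depth values of two consecutive frames are available as dictionaries keyed by region label (an assumed data layout; the function name is illustrative):

```python
import numpy as np

def burst_rate(PD_prev, PD_curr, thresh=0.75):
    """Percentage-of-damage feature (formulas 51-53): a bubble is counted
    as burst when its height sum drops below `thresh` of the previous
    frame's height sum. Assumes each previous-frame height sum is > 0."""
    BS = 0
    for n, d_prev in PD_prev.items():
        d_min = d_prev.min()                  # reference depth of region n
        ph_prev = np.sum(d_prev - d_min)      # PHnsum  (formula 51)
        ph_curr = np.sum(PD_curr[n] - d_min)  # PH'nsum (formula 52)
        if ph_curr / ph_prev < thresh:        # bubble n considered burst
            BS += 1
    return BS / len(PD_prev)                  # Break = BS / N (formula 53)
```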
Step 4: The depth data obtained in step 2 are statistically analysed by the distribution fitting method; a parameter estimation method gives the distribution of the foam surface height, and hence the relation between the foam surface height and the overflow launder edge height, for on-line monitoring and working-condition recognition of the froth level. This comprises the following sub-steps:
Step 1: From the depth matrix D' obtained in step 2, extract the foam-surface level features by formulas 55–60: average level La, level smoothness Lu, minimum level Ll, maximum level Lh.
As shown in Figure 1, a point (x, y) on the foam surface in the kinect field of view satisfies:
Hh = Ht + Hk Formula 54
where Hk is the pulp level (the pulp surface is assumed parallel to the flotation cell bottom) and Ht denotes the froth-layer thickness at point (x, y), i.e. the distance from that point to the pulp surface in the field of view. Hk is an important indicator for working-condition recognition at the flotation site, but since the pulp level is covered by the froth layer it is difficult to measure directly. Therefore, from the distance of every foam-surface point in the field of view from the cell bottom, Hh = D'(x, y), foam-surface level features related to the pulp level are obtained and used for working-condition recognition.
A distribution fit applied to the data in the depth matrix D', using a large amount of off-line data, shows that they follow a normal distribution. The parameters of the normal distribution are estimated by the point-estimation method:
Mean of the normal distribution: μ = (1/(a·b))·ΣxΣy D'(x, y) Formula 55
Standard deviation of the normal distribution: σ = √((1/(a·b))·ΣxΣy (D'(x, y) − μ)²) Formula 56
Four foam-surface level features are defined:
Average level: La = μ Formula 57
Level smoothness: Lu = σ Formula 58
Minimum level: Ll = μ − 3σ Formula 59
Maximum level: Lh = μ + 3σ Formula 60
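A minimal sketch of formulas 55–60, using the sample mean and the maximum-likelihood standard deviation as the point estimates (the patent does not state which estimator it uses, so this is an assumption):

```python
import numpy as np

def level_features(D):
    """Foam-surface level features (formulas 55-60): fit a normal
    distribution to the depth matrix D' by point estimation and derive
    La, Lu, Ll, Lh. The threshold logic of formula 61 is not reproduced."""
    mu = D.mean()                # formula 55
    sigma = D.std()              # formula 56 (MLE point estimate)
    return {"La": mu,            # average level    (formula 57)
            "Lu": sigma,         # level smoothness (formula 58)
            "Ll": mu - 3 * sigma,  # minimum level  (formula 59)
            "Lh": mu + 3 * sigma}  # maximum level  (formula 60)
```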
Step 2: By statistical analysis, the relation between the foam-surface level features and the overflow launder edge height Hy is obtained, with Hy = 3 m. The current level condition is judged by formula 61; only when the identified condition persists for more than 10 minutes is the recognition result taken to be valid.
Step 5: The foam features extracted in step 3 form the feature vector; off-line cluster analysis with an improved k-means algorithm yields several cluster centres. The foam features of the flotation site are extracted in real time and matched against the cluster centres, identifying the current flotation working condition on line.
Step 5 comprises the following sub-steps:
Step 1: The 8 foam features extracted in step 3 form the feature vector ds:
ds = [Graym Rm Gm Bm Aream Vm Rate Break] Formula 62
Step 2: Obtain the cluster centres off line.
From a period of monitoring records, g groups are sampled to obtain the feature data set:
Ds = [ds(1); ds(2); …; ds(g)] Formula 63
The cluster number k is traversed over k ∈ {1, 2, …, g}; for each k, K-means clustering is performed and an evaluation function is defined to determine the optimal cluster number kb, giving kb cluster centres [cc1, cc2, … cckb] that form the cluster-centre set KB:
KB = [cc1, cc2, …, cckb] Formula 64
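The patent calls its clustering an "improved k-means" without specifying the improvement or the evaluation function, so the following sketch uses plain k-means with an assumed SSE-plus-complexity-penalty criterion for choosing kb; all names are illustrative:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means (the patent's 'improved' variant is unspecified)."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - centres) ** 2).sum(-1), axis=1)
        centres = np.array([X[lab == j].mean(0) if np.any(lab == j)
                            else centres[j] for j in range(k)])
    return centres, lab

def best_centres(X, k_max):
    """Traverse k and keep the clustering minimising within-cluster SSE
    plus an assumed complexity penalty (the patent's evaluation function
    is not named, so this criterion is an assumption)."""
    best = None
    for k in range(2, k_max + 1):
        c, lab = kmeans(X, k)
        sse = ((X - c[lab]) ** 2).sum()
        score = sse + k * X.var() * 0.5   # crude penalty against large k
        if best is None or score < best[0]:
            best = (score, c)
    return best[1]
```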
Step 3: On-line working-condition recognition.
With the depth-information-based froth-flotation working-condition monitoring system built in step 1, froth colour and depth data are acquired in real time, and the real-time froth image features extracted in step 3 form the feature vector:
cccurrent = [Graymcurrent Rmcurrent Gmcurrent Bmcurrent Areamcurrent Vmcurrent Ratecurrent Breakcurrent] Formula 65
The Euclidean distance between the real-time features and a cluster-centre feature is defined as the feature-data similarity:
KN(kb) = |cckb − cccurrent| Formula 66
where |cckb − cccurrent| denotes the modulus of the vector (cckb − cccurrent), i.e.:
|cckb − cccurrent| = √(Σi (cckb(i) − cccurrent(i))²) Formula 67
The optimal number of cluster centres is obtained from a large amount of off-line data by formulas 62–64. With the kinect-based froth-flotation working-condition monitoring system built in step 1, froth colour and depth data are acquired in real time, the real-time froth image feature vector cccurrent is extracted by step 3, and by formula 66 the cluster centre most similar to the real-time features is taken as the current working condition, completing recognition.
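The on-line matching of formulas 65–67 reduces to a nearest-centre lookup under the Euclidean distance; a minimal sketch with an illustrative function name:

```python
import numpy as np

def recognise(cc_current, centres):
    """On-line working-condition recognition (formulas 65-67): the
    Euclidean distance to each cluster centre is the similarity measure,
    and the index of the nearest centre is returned as the condition."""
    d = np.linalg.norm(centres - cc_current, axis=1)  # KN(kb), formulas 66-67
    return int(np.argmin(d))
```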

Claims (8)

1. A froth flotation level monitoring and working-condition recognition method based on depth information, characterized by comprising the following steps:
Step 1: Build the kinect-based froth-flotation working-condition monitoring system in hardware and software;
Step 2: Collect, process and save the flotation froth colour data and depth data obtained by kinect;
Step 3: Perform feature extraction on the colour data and depth data obtained in step 2; combining the colour data and depth data, extract the foam features: colour, area, volume, speed, and percentage of damage;
Step 4: Statistically analyse the depth data obtained in step 2 using the distribution fitting method; obtain the foam-surface level features using a parameter estimation method; and hence obtain the relation between the foam-surface level features and the overflow launder edge height, for on-line monitoring and working-condition recognition of the froth flotation level;
Step 5: Form the feature vector from the foam features obtained in step 3, and carry out off-line cluster analysis with an improved k-means algorithm to obtain several cluster centres; extract the foam features of the flotation site in real time and match them against the cluster centres, identifying the current flotation working condition on line.
2. The froth flotation level monitoring and working-condition recognition method based on depth information according to claim 1, characterized in that step 1 comprises the following sub-steps:
Step 1.1: Build the kinect-based froth-flotation working-condition monitoring system in hardware. The hardware comprises a kinect sensor, a high-frequency light source and a computer; the kinect sensor is fixed 1.2 m–3.5 m above the flotation cell liquid level, with the plane of its camera parallel to the cell bottom; the high-frequency light source provides illumination for the kinect sensor to collect colour data; the collected colour data and depth data streams are sent to the computer over a USB data line, and the computer performs data processing, real-time condition monitoring and display of the results;
Step 1.2: Build the kinect-based froth-flotation working-condition monitoring system in software. Under the cross-platform framework OpenNI, an interactive programming interface is used to realize real-time acquisition, processing and saving of the colour and depth data streams, feature extraction, and working-condition recognition.
3. The froth flotation level monitoring and working-condition recognition method based on depth information according to claim 1, characterized in that step 2 comprises the following sub-steps:
Step 2.1:Gathered data;
(1) data format is set;
The colour data stream is set to RGB888 with resolution a × b, where a × b denotes the number of pixels in the horizontal direction multiplied by the number of pixels in the vertical direction, and frame rate 30 FPS; the depth data stream is set to PIXEL_FORMAT_DEPTH_1_MM format, with millimetre data precision, resolution a × b and frame rate 30 FPS;
(2) hardware alignment color data and depth data;
Using the "Alternative View" tool under the cross-platform framework OpenNI, correct the viewing angles of the colour camera and the depth camera so as to align the colour data stream and the depth data stream over the field of view;
(3) software alignment color data and depth data;
Start the kinect device and begin image acquisition; using the data formats set in step (1), create the colour data stream and the depth data stream simultaneously to obtain two kinds of temporally aligned data: the a × b × 3 colour matrix C and the a × b depth matrix D;
Step 2.2:Processing data;
(1) depth information in depth data is obtained;
The depth matrix D obtained by the kinect sensor consists of 16-bit integer data; the high 13 bits store the depth information and the low 3 bits store index information. The depth information is obtained by a shift operation:
D1 = D/2³ Formula 1
where D1 is the depth matrix storing only the depth information;
(2) color combining data are filtered completion processing to depth data;
Using the joint bilateral filtering method, filter the depth matrix D1 to fill in the depth-image regions missing due to occlusion, giving the filtered depth matrix D2, whose depth value D2(x, y) at pixel (x, y) is:
where Ω1 is the filter reference neighbourhood of the pixel (x, y) in question, x ∈ [1, a], y ∈ [1, b], (i1, j1) ∈ Ω1, i1 ∈ [x-r1, x+r1], j1 ∈ [y-r1, y+r1], r1 is the neighbourhood radius, D1(i1, j1) is the depth value at pixel (i1, j1) of the depth matrix D1, and wp is the normalization parameter:
where w(i1, j1) is the weight of pixel (i1, j1) in the filter reference neighbourhood Ω1 of the pixel (x, y) in question, composed of a depth-image spatial-domain weight ws and a colour-image gray-domain weight wr, i.e.:
w(i1,j1)=ws(i1,j1)×wr(i1,j1) formula 4
Wherein:
where σs and σr are the Gaussian standard deviations corresponding to the spatial-domain weight ws and the gray-domain weight wr respectively, and Gray(x, y), Gray(i1, j1) are the gray values at pixel (x, y) and at pixel (i1, j1) of its filter reference neighbourhood Ω1 respectively, calculated by:
Gray(x, y) = 0.2989 × C(x, y, 1) + 0.5870 × C(x, y, 2) + 0.1140 × C(x, y, 3) Formula 7
where C(x, y, 1), C(x, y, 2), C(x, y, 3) are the colour values of pixel (x, y) in the a × b matrices of the R, G, B colour components of the a × b × 3 colour matrix C; traversing all pixels gives the gray matrix Gray;
By choosing the Gaussian standard deviations σr and σs, the missing depth information and noise points of occluded regions are eliminated within the filter reference neighbourhood Ω1;
(3) Obtain the depth matrix D' reflecting the distance Hh of the foam surface from the flotation cell bottom;
Hh denotes the distance of foam-surface pixel (x, y) in the field of view from the cell bottom, obtained by the following formula:
Hh = Hb - Hd Formula 8
where Hb is the distance from the plane of the kinect sensor camera to the cell bottom, and Hd is the distance from foam-surface pixel (x, y) in the field of view to the plane of the kinect sensor camera. With the plane of the kinect sensor camera parallel to the cell bottom, Hd is the value of the corresponding pixel (x, y) in the depth matrix D2 obtained after shifting and filter completion of the depth matrix D acquired by the kinect sensor, i.e. Hd = D2(x, y).
The depth value D'(x, y) of pixel (x, y) reflecting Hh is thus:
D'(x, y) = Hb - D2(x, y) Formula 9
Step 2.3:Preserve data;
Store the colour matrix C and the depth matrix D': the colour data with Motion JPEG AVI coding, and the depth data with Motion JPEG 2000 (16-bit) coding.
4. The froth flotation level monitoring and working-condition recognition method based on depth information according to claim 3, characterized in that step 3 comprises the following sub-steps:
Step 3.1:Extract color characteristic;
Perform feature extraction on the colour data and depth data obtained in step 2; combining the colour data and depth data, extract the colour features of the foam.
Assume the depth values of objects in the field of view lie in the range [dmin, dmax], where dmin is the minimum depth value and dmax the maximum depth value; the reference distance is defined as:
db = (dmax - dmin)/2 Formula 10
Assume the colour of a pixel at the reference distance takes its standard value. Under identical illumination intensity, a pixel farther from the lens than the reference distance is attenuated more, so its colour value must be enhanced, while a pixel nearer to the lens than the reference distance must have its colour value weakened. Therefore a weight function is set to standardize the colour data with the depth data; the weight q(x, y) is:
q(x, y) = (D'(x, y) - db)³/10⁶ + 1 Formula 11
The colour features after standardization of the colour values are defined as:
R averages:
G averages:
B averages:
From the gray matrix Gray obtained in sub-step (2) of step 2.2, calculate the gray mean:
Step 3.2:Extract area features;
Compute the gray-level distribution Gs of a frame from the gray matrix Gray obtained in sub-step (2) of step 2.2; Gs(g') denotes the number of pixels satisfying Gray(x, y) = g', where g' ∈ [0, 255].
Starting from e = 0, take e = 0, e = 1, …, e = e' in turn, e' ∈ (0, 255); when e = e', the following condition is met:
Obtain the contrast-enhanced gray matrix Gray', whose gray value at pixel (x, y) is:
Segment the matrix Gray' with the watershed algorithm to obtain the foam-region binary matrix A, whose gray value at pixel (x, y) is denoted A(x, y); foam-region interior points and boundary points are distinguished by the following formula:
Traverse all pixels of a frame and number the foam regions, obtaining the region label information point sets:
Psn: {(x1,y1),(x2,y2),…,(xArea(n),yArea(n))} Formula 19
where Psn denotes the region label information point set of the n-th foam region, i.e. the set of label information of all pixels in the region, n = 1, 2, …, N, N being the total number of foam regions; Area(n) denotes the number of pixels in the n-th foam region, i.e. the foam-region area. Count the areas of all N foam regions respectively to obtain the area distribution:
Area: {Area(1), Area(2), … Area(n), … Area(N)}; calculate the total region area of the N foam regions of the frame:
Re-sort the N area values of the area distribution Area in ascending order to obtain Area': {Area'(1), Area'(2), … Area'(n), … Area'(N)}, where Area'(1) is the minimum and Area'(N) the maximum foam-region area;
Starting from f1 = 1, take f1 = 1, f1 = 2, …, f1 = f1' in turn, f1' ∈ (1, N); when f1 = f1', the following condition is met:
The foam area feature is then defined as:
Step 3.3:Extract volume characteristic;
For the region binary matrix A and the region label information point sets Psn of a frame, the (p+q)-order regular moment of the n-th foam region is defined as:
The region centroid of the n-th foam region is obtained from its first-order regular moments:
The (p+q)-order central moment of the frame is then:
If only the set of second-order central moments of the frame is considered, a foam region is approximated by an ellipse centred on the region centroid, so that:
where CBn denotes the semi-major axis of the ellipse, DBn the semi-minor axis, and θn the inclination angle;
The pixel coordinates of the major-axis and minor-axis endpoints, c1(un1, vn1) and c2(un2, vn2), are then calculated by the formula:
Combining the depth matrix D' obtained in sub-step (3) of step 2.2, calculate the depth values ln1, ln2, lnc corresponding to the major-axis endpoint, the minor-axis endpoint and the region centroid;
where ln1, ln2, lnc are respectively the depth values corresponding to the major-axis endpoint, the minor-axis endpoint and the region centroid;
Assume the bubble is approximately a standard ellipsoid; the real-space height hn of the bubble is defined as:
hn = lnc - min(ln1, ln2) Formula 32
Combining the view parameters of the kinect sensor, the horizontal viewing angle δ1 and the vertical viewing angle δ2, the three-dimensional space coordinates Cn1 and Cn2 corresponding to the major-axis endpoints are calculated:
Cn1: (δ1·[2un1L+(a-1)(L-ln1)], δ2·[2(b-vn1-1)L+(b-1)(L-ln1)], L-ln1) Formula 33
Cn2: (δ1·[2un2L+(a-1)(L-ln2)], δ2·[2(b-vn2-1)L+(b-1)(L-ln2)], L-ln2) Formula 34
where L denotes the distance from the depth camera to the reference image plane of space coordinate z = 0;
The actual major-axis length is then obtained:
Formula 35
Assuming the depth values at the two major-axis endpoints are equal, the actual major-axis length simplifies to:
The actual minor-axis length DRn is obtained similarly;
Assuming the bubble is a standard ellipsoid, the volume of the n-th bubble is finally obtained:
Following the procedure for the area feature of a frame in sub-step 3.2, re-sort the N region volume distribution values Vs: {Vs(1), Vs(2), … Vs(n), … Vs(N)} in ascending order to obtain Vs': {Vs'(1), Vs'(2), … Vs'(n), … Vs'(N)};
Starting from f2 = 1, take f2 = 1, f2 = 2, …, f2 = f2' in turn, f2' ∈ (1, N); when f2 = f2', the following condition is met:
The foam volume feature is then defined as:
Step 3.4:Extraction rate feature;
(1) Harris Corner Detections;
Assume pixel (x, y) of the contrast-enhanced gray matrix Gray' is a corner; the mean intensity (gray-value) change of all pixels in its filter reference neighbourhood Ω2 along an offset direction (Δx, Δy) is:
where i2 ∈ [x-r2, x+r2], j2 ∈ [y-r2, y+r2], and r2 is the radius of the neighbourhood Ω2;
The mean intensity change is computed for all offset directions of pixel (x, y); expanding formula 40 as a Taylor series and omitting higher-order terms, it is written in matrix form:
where Q is the covariance matrix, whose two eigenvalues represent the maximum mean intensity change over all offset directions of the pixel and the mean intensity change in the perpendicular direction; the score function is defined as:
G = Det(Q) - k·Trace²(Q) Formula 42
where Det(Q) is the product of the eigenvalues, i.e. the determinant of the covariance matrix, and Trace(Q) is the sum of the eigenvalues, i.e. the trace of the covariance matrix;
Define Gr as the qualifying score; traversing all pixels in the neighbourhood Ω2 of pixel (x, y), the h1 pixels whose scores exceed the qualifying score are obtained and sorted in descending order of score to give the corner set Hs1: {(x1,y1),(x2,y2),…(xh1,yh1)};
Apply local gray-value extremum processing to the contrast-enhanced gray matrix Gray'; the formula is:
where Graymax(x, y) is the value at pixel (x, y) of the gray matrix Graymax obtained after local-maximum filtering of the gray values, Graymin(x, y) is the value at pixel (x, y) of the gray matrix Graymin obtained after local-minimum filtering of the gray values, Ω3 is the neighbourhood of pixel (x, y) with radius r3, and i3 ∈ [x-r3, x+r3], j3 ∈ [y-r3, y+r3];
The local gray-value extrema are determined using the following formula:
From the corner set Hs1, screen out the corners satisfying the local gray-value extremum condition, obtaining the corner set Hs2 composed of h2 corners;
Finally, by an operation analogous to the local gray-value extremum operation on the contrast-enhanced gray matrix Gray', obtain the local depth extrema of the depth matrix D'; from the corner set Hs2, screen out the corners satisfying the local depth extremum condition, obtaining the corner set Hs3: {(x1,y1),(x2,y2),…(xh3,yh3)} composed of h3 corners;
(2) Track the points in the corner set Hs3 determined by the three rounds of corner screening, and extract the velocity feature;
Assume the intensity of the same corner (xh, yh) is constant across two consecutive frames, i.e.:
Gray't(xh, yh) = Gray't+Δt(xh+Δu, yh+Δv) Formula 46
where Gray't(xh, yh) is the gray value of corner (xh, yh) in the contrast-enhanced gray matrix Gray' at time t, i.e. in the previous frame, Gray't+Δt(xh+Δu, yh+Δv) is the gray value of corner (xh+Δu, yh+Δv) in the contrast-enhanced gray matrix Gray' at time t+Δt, i.e. in the next frame, (Δu, Δv) is the offset of corner (xh, yh) within the time Δt, Δt is the interval between adjacent frames, and h = 1, 2, …, h3;
This yields the basic optical flow constraint equation:
Assume the offsets of corner (xh, yh) and of all points in its neighbourhood Ω4 (radius r4) are identical; substituting all (2r4+1)² points of the corner and its neighbourhood into the optical flow constraint equation gives more equations than the number of unknowns (the two unknowns Δu and Δv); solving iteratively, the offset (Δu, Δv) of corner (xh, yh) is obtained, and the displacement is then:
Iterating the solution over all detected corners, the average displacement is obtained:
With adjacent frames separated by Δt, the foam velocity feature is defined as:
During tracking, corners that no longer need to be tracked must be continually deleted; when some corners have been deleted and the number of remaining corners falls below a predefined threshold, corners must be re-detected by step (1) of the foregoing sub-step 3.4 and tracked again;
Step 3.5:Extract percentage of damage feature;
Put the depth matrix D' obtained in sub-step (3) of step 2.2 in one-to-one correspondence with the region label information point sets Psn of a frame, obtaining the set PDn of depth values for each pixel of the n-th foam region: (d1, d2, … dArea(n)); find the minimum depth value dmin and derive the height of each pixel of the foam region PHn: (d1-dmin, d2-dmin, … dArea(n)-dmin); compute the height sum:
PHnsum = Σ PHn Formula 51
For the next frame, read the depth data PD'n (d'1, d'2, … d'Area(n)) corresponding to the region label information point set Psn of the previous frame, derive the height of each pixel of the foam region PH'n (d'1-dmin, d'2-dmin, … d'Area(n)-dmin), and compute the height sum:
PH'nsum = Σ PH'n Formula 52
When PH'nsum/PHnsum < 0.75, the bubble labeled n is considered burst; all N regions are processed in the same way, and a total of BS burst bubbles are counted; the bubble burst rate is then:
5. The froth flotation level monitoring and working-condition recognition method based on depth information according to claim 4, characterized in that step 4 comprises the following sub-steps:
Step 4.1: From the depth matrix D' obtained in sub-step (3) of step 2.2, extract the foam-surface level features: average level La, level smoothness Lu, minimum level Ll, maximum level Lh;
A foam-surface pixel (x, y) in the kinect field of view satisfies:
Hh = Ht + Hk Formula 54
where Hk is the pulp level, the pulp surface being assumed parallel to the flotation cell bottom, and Ht denotes the froth-layer thickness at pixel (x, y), i.e. the distance from pixel (x, y) to the pulp surface in the field of view. Hk is an important indicator for working-condition recognition at the flotation site, but it is difficult to measure because the pulp surface is covered by the froth layer; therefore, from the distance of every foam-surface point in the field of view from the cell bottom, Hh = D'(x, y), the foam-surface level features related to the pulp level are obtained and used for working-condition recognition;
Apply a distribution fit to the data in the depth matrix D' using a large amount of off-line data, showing that they follow a normal distribution; estimate the parameters of the normal distribution by the point-estimation method:
The average of normal distribution:
The standard deviation of normal distribution:
Define 4 foam top layer liquid level features:
Average level:La=μ formula 57
Liquid level smoothness:Lu=σ formula 58
Minimum level: Ll = μ - 3σ Formula 59
Maximum level: Lh = μ + 3σ Formula 60
Step 4.2: Obtain, by statistical analysis, the relation between the foam-surface level features and the overflow launder edge height Hy:
Step 4.3: Extract the current foam-surface level features in real time and judge them by formula 61, obtaining the working-condition class reflected by the current foam-surface level features; only when the identified condition persists for more than 10 minutes is the recognition result taken to be valid.
6. The froth flotation level monitoring and working-condition recognition method based on depth information according to claim 5, characterized in that step 5 comprises the following sub-steps:
Step 5.1: Form the feature vector ds from the 8 foam features extracted in step 3:
ds = [Graym Rm Gm Bm Aream Vm Rate Break] Formula 62
Step 5.2: Obtain the cluster centres off line;
From a period of monitoring records, sample g groups to obtain the feature data set:
Traverse the cluster number k, k ∈ {1, 2, …, g}; for each k, perform K-means clustering and define an evaluation function to determine the optimal cluster number kb, giving kb cluster centres [cc1, cc2, … cckb] that form the cluster-centre set KB:
Step 5.3: On-line working-condition recognition;
With the kinect-based froth-flotation working-condition monitoring system built in step 1, acquire the froth flotation colour data and depth data in real time and, by the method of step 3, extract the real-time froth image features to form the feature vector:
cccurrent=[Graymcurrent Rmcurrent Gmcurrent Bmcurrent Areamcurrent Vmcurrent Ratecurrent Breakcurrent] formula 65
Define the Euclidean distance between the real-time froth image features and a cluster centre as the feature-data similarity:
where |cckb - cccurrent| denotes the modulus of the vector (cckb - cccurrent), i.e.:
Choose the working condition corresponding to the minimum value of the KN matrix as the current working condition, completing working-condition recognition.
7. A froth flotation level monitoring and working-condition recognition system based on depth information, characterized by comprising a kinect sensor, a high-frequency light source and a computer; the kinect sensor is fixed 1.2 m–3.5 m above the flotation cell liquid level, with the plane of its camera parallel to the cell bottom; the high-frequency light source provides illumination for the kinect sensor to collect colour data; the collected colour data and depth data streams are sent to the computer over a USB data line, and the computer performs data processing, real-time condition monitoring and display of the results; the software system, under the cross-platform framework OpenNI, uses an interactive programming interface to realize real-time acquisition, processing and saving of the colour and depth data streams, feature extraction, and working-condition recognition.
8. The froth flotation level monitoring and working-condition recognition system based on depth information according to claim 7, characterized in that the real-time acquisition, processing, saving, feature extraction and working-condition recognition of the colour and depth data streams proceed as follows: collect, process and save the flotation froth colour data and depth data obtained by kinect; perform feature extraction on the acquired colour data and depth data, extracting, by combining colour data and depth data, the foam colour, area, volume, speed and percentage of damage; statistically analyse the acquired depth data using the distribution fitting method; obtain the foam-surface level features using a parameter estimation method; and hence obtain the relation between the foam-surface level features and the overflow launder edge height, for on-line monitoring and working-condition recognition of the froth flotation level; form the feature vector from the obtained foam features and carry out off-line cluster analysis with an improved k-means algorithm to obtain several cluster centres; extract the foam features of the flotation site in real time and match them against the cluster centres in real time, identifying the current flotation working condition on line.
CN201410699401.0A 2014-11-27 2014-11-27 Froth flotation level monitoring and operating mode's switch method and system based on depth information Active CN104408724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410699401.0A CN104408724B (en) 2014-11-27 2014-11-27 Froth flotation level monitoring and operating mode's switch method and system based on depth information


Publications (2)

Publication Number Publication Date
CN104408724A CN104408724A (en) 2015-03-11
CN104408724B true CN104408724B (en) 2017-12-01

Family

ID=52646353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410699401.0A Active CN104408724B (en) 2014-11-27 2014-11-27 Froth flotation level monitoring and operating mode's switch method and system based on depth information

Country Status (1)

Country Link
CN (1) CN104408724B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105306909B * 2015-11-20 2018-04-03 中国矿业大学(北京) Vision-based overcrowding warning system for underground coal mine personnel
CN105488816B (en) * 2015-11-27 2018-09-11 中南大学 A kind of mineral floating foam flow velocity on-line measuring device and method based on 3D vision information
CN105430339B (en) * 2015-11-27 2018-04-06 中南大学 A kind of embedded mineral floating foam 3-D image monitoring device based on ARM+Kinect
CN105823704A (en) * 2016-03-21 2016-08-03 东莞美维电路有限公司 Method for testing PCB adhesive removing uniformity
CN107067431B (en) * 2017-01-16 2020-07-03 河海大学常州校区 Kinect-based object volume calculation method
CN108413864B (en) * 2017-02-10 2020-07-17 菜鸟智能物流控股有限公司 Object size measuring method and related equipment
CN107122754A (en) * 2017-05-09 2017-09-01 苏州迪凯尔医疗科技有限公司 Posture identification method and device
CN109410248B (en) * 2018-10-23 2021-07-20 湖南科技大学 Flotation froth motion characteristic extraction method based on r-K algorithm
CN109371633B (en) * 2018-11-29 2020-10-27 嘉兴市金鹭喷织有限公司 Foam spreading state detection system
CN109772593B (en) * 2019-01-25 2020-09-29 东北大学 Ore pulp liquid level prediction method based on flotation foam dynamic characteristics
CN110245601B (en) * 2019-06-11 2022-03-01 Oppo广东移动通信有限公司 Eyeball tracking method and related product
CN110288260B (en) * 2019-07-02 2022-04-22 太原理工大学 Coal slime flotation reagent addition amount evaluation method based on semi-supervised clustering
CN110728253B (en) * 2019-07-22 2021-03-02 中南大学 Texture feature measurement method based on particle roughness
CN111144221A (en) * 2019-11-29 2020-05-12 光大环境科技(中国)有限公司 Method and device for evaluating operation condition of aeration tank through image recognition technology
CN113436162B (en) * 2021-06-23 2022-12-09 湖南国天电子科技有限公司 Method and device for identifying weld defects on surface of hydraulic oil pipeline of underwater robot
CN113591654B (en) * 2021-07-22 2023-09-01 中南大学 Zinc flotation working condition identification method based on long-time-range depth characteristics
CN114519715B (en) * 2022-04-21 2022-07-15 矿冶科技集团有限公司 Foam overflow amount detection method and device, electronic equipment and medium
CN114988567A (en) * 2022-07-15 2022-09-02 南通仁源节能环保科技有限公司 Sewage treatment method and system based on activated sludge foam
CN117252826B (en) * 2023-09-12 2024-03-12 山东神力索具有限公司 Visual technology-based method for detecting cooling sewage of steel caps containing graphite die forging

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334844A * 2008-07-18 2008-12-31 中南大学 Key feature extraction method for flotation froth image analysis
CN103971379A * 2014-05-30 2014-08-06 中南大学 Froth stereoscopic feature extraction method based on a single-camera equivalent binocular stereo vision model

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2873232A1 (en) * 2012-05-14 2013-11-21 Technological Resources Pty. Limited Controlling froth flotation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334844A * 2008-07-18 2008-12-31 中南大学 Key feature extraction method for flotation froth image analysis
CN103971379A * 2014-05-30 2014-08-06 中南大学 Froth stereoscopic feature extraction method based on a single-camera equivalent binocular stereo vision model

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research progress of machine-vision-based monitoring technology for the mineral flotation process; Gui Weihua et al.; Acta Automatica Sinica; 2013-11-30; Vol. 39, No. 11; pp. 1879-1888 *
Intelligent optimal setting of flotation cell pulp level based on froth image features; Zhao Hongwei et al.; Acta Automatica Sinica; 2014-06-30; Vol. 40, No. 6; pp. 1086-1097 *

Also Published As

Publication number Publication date
CN104408724A (en) 2015-03-11

Similar Documents

Publication Publication Date Title
CN104408724B (en) Froth flotation liquid level monitoring and working condition recognition method and system based on depth information
CN110175576B (en) Driving vehicle visual detection method combining laser point cloud data
CN106650640B (en) Negative obstacle detection method based on laser radar point cloud local structure characteristics
US9846946B2 (en) Objection recognition in a 3D scene
CN104850850B (en) Binocular stereo vision image feature extraction method combining shape and color
CN105869178B (en) A kind of complex target dynamic scene non-formaldehyde finishing method based on the convex optimization of Multiscale combination feature
CN109460709 (en) RTG obstacle detection method based on the fusion of RGB and depth information
CN102867311A (en) Target tracking method and target tracking device
CN110379168B (en) Traffic vehicle information acquisition method based on Mask R-CNN
CN102542289A (en) Pedestrian volume statistical method based on plurality of Gaussian counting models
CN104616006B (en) Bearded human face detection method for surveillance video
CN103106659A (en) Open area target detection and tracking method based on binocular vision sparse point matching
CN107392929B (en) Intelligent target detection and size measurement method based on human eye vision model
CN104036488A (en) Binocular vision-based human body posture and action research method
CN106446785A (en) Passable road detection method based on binocular vision
CN105427345A (en) Three-dimensional pedestrian flow motion analysis method based on camera projection matrix
CN107808524A (en) Intersection vehicle detection method based on unmanned aerial vehicle
Babahajiani et al. Automated super-voxel based features classification of urban environments by integrating 3D point cloud and image content
CN108764338A (en) Pedestrian tracking algorithm applied to video analysis
CN116573017A (en) Urban rail train running clearance foreign matter sensing method, system, device and medium
Yin et al. Removing dynamic 3D objects from point clouds of a moving RGB-D camera
Giosan et al. Superpixel-based obstacle segmentation from dense stereo urban traffic scenarios using intensity, depth and optical flow information
CN113743265B (en) Depth camera-based automatic driving drivable area detection method and system
Illeperuma et al. Computer vision based object tracking as a teaching aid for high school physics experiments
Zhao et al. Feature Extraction of Flotation Froth Based on Equivalent Binocular Stereo Vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant