CN104408724A - Depth information method and system for monitoring liquid level and recognizing working condition of foam flotation

Info

Publication number: CN104408724A (granted as CN104408724B)
Application number: CN201410699401.0A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 彭涛, 赵永恒, 赵林, 蔡耀仪, 宋彦坡, 韩华, 赵璐, 彭霞
Original and current assignee: Central South University
Application filed by Central South University
Legal status: Granted; Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01F: MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
    • G01F 23/00: Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm
    • G01F 23/22: Indicating or measuring liquid level or level of fluent solid material by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water
    • G01F 23/28: Indicating or measuring liquid level or level of fluent solid material by measuring the variations of parameters of electromagnetic or acoustic waves applied directly to the liquid or fluent solid material
    • G01F 23/284: Electromagnetic waves
    • G01F 23/292: Light, e.g. infrared or ultraviolet

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Thermal Sciences (AREA)
  • Fluid Mechanics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a depth-information-based method and system for liquid-level monitoring and working-condition recognition in froth flotation. The method includes: (1) building a Kinect-based froth-flotation working-condition monitoring system in both hardware and software; (2) collecting flotation-froth color and depth data with a Kinect sensor, extracting and filtering the depth information, and aligning the color and depth data in time and position before storing them; (3) extracting three-dimensional froth features (incorporating depth information) such as color, area, volume, speed, and burst rate from the color and depth data; (4) monitoring the liquid level by analyzing the relation between the current froth-surface level features and the overflow-launder lip height; (5) clustering the froth image features with an improved k-means algorithm to recognize the flotation working condition online. The method and system are applicable to on-site monitoring and real-time working-condition recognition in froth flotation, enable automatic control and optimized operation of flotation production, and improve resource utilization.

Description

Froth-flotation liquid-level monitoring and working-condition recognition method and system based on depth information
Technical field
The present invention relates to the field of mineral processing, and in particular to a depth-information-based method and system for liquid-level monitoring and working-condition recognition in froth flotation.
Background technology
The working condition of the flotation process must be identified promptly and accurately to guide production. Froth flotation is the most widely used beneficiation method in mineral processing, and the physicochemical changes involved are extremely complex. Traditionally, experienced workers judge the current working condition by visually inspecting changes in froth surface characteristics (color, size, speed, etc.) in the flotation cell, and control operations such as reagent dosage accordingly. However, different workers apply no unified quantitative standard to froth surface characteristics, so working-condition recognition and the resulting operation are highly subjective and arbitrary; flotation is therefore rarely kept in an optimal state, production runs unstably, and product quality suffers. With the rapid development of machine vision and image processing, intelligent working-condition recognition based on on-site froth features has made great progress. By recognizing the working-condition class quickly and accurately, a flotation production control system can adjust process parameters in time and keep the flotation process in an optimal state.
Pulp level is an important indicator of the on-site working condition, but it is difficult to measure because it is covered by the froth bed. Moreover, the froth images captured by conventional industrial cameras in traditional monitoring and recognition systems and methods (hereafter, legacy systems and methods) contain only planar image information with no depth, so changes in the pulp level cannot be monitored in real time, making automatic level recognition and control difficult. The absence of depth information also creates a large gap between empirical judgments of image features (from workers' stereoscopic vision) and monitored values (from the planar vision of legacy systems and methods), which makes recognition results unstable and inaccurate and severely limits the application of legacy systems and methods.
Summary of the invention
The object of the present invention is to provide a depth-information-based method and system for froth-flotation liquid-level monitoring and working-condition recognition. By measuring the distance from every point of the froth surface in the field of view to the cell bottom, froth-surface level features related to the pulp level are obtained, effectively achieving online machine-vision level monitoring; the three-dimensional froth-surface features extracted with the Kinect sensor effectively overcome the limitations of planar froth features, so that the current working condition can be identified more accurately.
To achieve the above object, the invention provides a depth-information-based froth-flotation liquid-level monitoring and working-condition recognition method comprising the following steps:
Step 1: build a Kinect-based froth-flotation working-condition monitoring system in both hardware and software;
Step 2: collect, process, and store the flotation-froth color and depth data acquired by the Kinect;
Step 3: perform feature extraction on the color and depth data obtained in Step 2, combining them to extract three-dimensional (depth-aware) froth features such as color, area, volume, speed, and burst rate;
Step 4: statistically analyze the depth data obtained in Step 2 by distribution fitting; obtain the froth-surface level features by parameter estimation; then derive the relation between the froth-surface level features and the overflow-launder lip height, for online monitoring and working-condition recognition of the flotation level;
Step 5: form feature vectors from the froth features obtained in Step 3 and perform offline cluster analysis with an improved k-means algorithm to obtain several cluster centers; extract froth features on site in real time and match them against the cluster centers to recognize the current flotation working condition online.
The present invention also provides a depth-information-based froth-flotation liquid-level monitoring and working-condition recognition system, characterized by comprising a Kinect sensor, a high-frequency light source, and a computer. The Kinect sensor is fixed 1.2-3.5 m above the flotation-cell liquid level, with its camera plane parallel to the cell bottom; the high-frequency light source provides illumination for the Kinect sensor's color acquisition; the collected color and depth streams are sent to the computer over a USB cable, and the computer performs data processing, real-time condition monitoring, and result display. The software system runs under the cross-platform framework OpenNI, using its programming interface to acquire, process, and store the color and depth streams in real time and to perform feature extraction and working-condition recognition.
Beneficial effects: the three-dimensional froth-surface features extracted with the Kinect sensor effectively overcome the limitations of planar froth features and, combined with the extracted froth-surface level features, allow the current working condition to be identified more accurately.
Brief description of the drawings
Fig. 1 is a hardware structure diagram of the depth-information-based froth-flotation liquid-level monitoring and working-condition recognition method and system of an embodiment of the present invention.
Fig. 2 is a software flowchart of the depth-information-based froth-flotation liquid-level monitoring and working-condition recognition method and system shown in Fig. 1.
Fig. 3 is a color image captured by the Kinect in the embodiment of the present invention.
Fig. 4 is a depth image captured by the Kinect in the embodiment of the present invention.
Embodiment
Kinect is a novel sensor (camera) developed by Microsoft that acquires both color and depth data of the target area in real time. It breaks through the two-dimensional image limitation of conventional cameras and captures three-dimensional feature information closer to the real world. It is currently applied mainly in gaming and has not yet been widely used at industrial sites.
For this reason, we introduce the Kinect sensor to the froth-flotation industrial site for the first time. From the depth data collected by the Kinect sensor, we obtain froth-surface level features related to the pulp level, for level monitoring and working-condition analysis; combined with the color data collected by the Kinect sensor, we extract three-dimensional (depth-aware) froth features such as color, area, volume, speed, and burst rate, to recognize the working condition of the froth-flotation process in real time.
The present invention is described in further detail below with reference to the drawings and specific embodiments.
As shown in Fig. 2, the depth-information-based froth-flotation liquid-level monitoring and working-condition recognition method of this embodiment comprises the following steps:
Step 1: build a Kinect-based froth-flotation working-condition monitoring system in both hardware and software;
In the depth-information-based froth-flotation monitoring and recognition system built as shown in Fig. 1, the hardware consists mainly of a Kinect sensor 1, a high-frequency light source 4, and a computer 5. The high-frequency light source 4 provides illumination for color acquisition by the Kinect sensor 1. The Kinect sensor 1 is fixed 2 m above the liquid level of flotation cell 6, with the plane of its camera 2 parallel to the bottom of flotation cell 6; the color and depth data collected within the camera's field of view 3 are sent over a USB cable to computer 5, which performs data processing, real-time condition monitoring, and result display. In addition, dust covers (not shown in the figure) are fitted over the high-frequency light source 4 and the Kinect sensor 1 to weaken external influences (such as corrosion and lens contamination) and strengthen the reliability of the system. An overflow launder 7 is arranged at one end of flotation cell 6, inside which a stirring device 8 and a scraper 9 are installed. In the figure, Hb is the distance from the camera plane to the cell bottom; Hd is the distance from a froth-surface point in the field of view to the camera plane; Hh is the distance from a froth-surface point in the field of view to the cell bottom; Ht is the froth-bed thickness at a given point in the field of view (the distance from that point to the pulp surface); Hk is the pulp level; and Hy is the overflow-launder lip height.
On the software side, the Kinect-based froth-flotation working-condition monitoring system is built under the cross-platform framework OpenNI (Open Natural Interaction), using its programming interface to acquire, process, and store the color and depth streams in real time and to perform feature extraction and working-condition recognition.
Step 2: collect, process, and store the flotation-froth color and depth data acquired by the Kinect; this comprises the following steps:
Step 1: acquire data;
The color stream is set to RGB888 format, 640×480 resolution, 30 FPS; the depth stream is set to PIXEL_FORMAT_DEPTH_1_MM format (millimeter data precision), 640×480 resolution, 30 FPS. The "Alternative View" tool of the cross-platform framework OpenNI is used to correct the viewpoints of the color camera and depth camera, so that the color stream and depth stream are registered over the observed area.
The Kinect device is started and image acquisition begins. The color stream and depth stream are created simultaneously with the data formats set above, yielding two temporally aligned data sets, shown in Fig. 3 and Fig. 4: a 640×480×3 color matrix C and a 640×480 depth matrix D. For example, point P1(243,133) in the color image is labeled "RGB: 242,241,239", meaning its values in the collected color matrix are C(243,133,1)=242, C(243,133,2)=241, C(243,133,3)=239. In the depth image, point P1(243,133) is labeled "Index: 661", meaning its value in the collected depth matrix is D(243,133)=661; the label "RGB: 0.918,0.918,0.918" is the normalized depth value used for display. A darker color in the depth image represents a smaller depth value, i.e. a froth-surface point nearer to the camera; comparing the Index value at froth-top point P1 (661) with that at froth-edge point P2 (693) readily confirms this.
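As a concrete illustration, the two aligned matrices and the worked example of point P1 can be reproduced as below (a minimal sketch: only the P1 values are taken from Figs. 3-4; the coordinates used for the edge point P2 are hypothetical placeholders):

```python
import numpy as np

a, b = 640, 480
C = np.zeros((a, b, 3), dtype=np.uint8)   # 640x480x3 color matrix C
D = np.zeros((a, b), dtype=np.uint16)     # 640x480 depth matrix D (mm)

C[243, 133] = (242, 241, 239)  # C(243,133,1..3) from the color-image label
D[243, 133] = 661              # D(243,133) from the depth-image label
D[250, 140] = 693              # hypothetical coordinates for edge point P2

# A larger depth value means the surface point is farther from the camera,
# so the froth top (661) lies nearer than the froth edge (693).
top_is_nearer = D[243, 133] < D[250, 140]
```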
Step 2: process data;
(1) extract the depth information from the depth data;
The depth matrix D acquired by the Kinect sensor holds 16-bit integer data: the high 13 bits store the depth information and the low 3 bits store index information. The depth information is obtained by a shift operation:

D_1 = D / 2^3    (Formula 1)

where D_1 is the depth matrix storing only the depth information (without the index information).
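The shift of Formula 1 can be sketched as below (the packed raw values are hypothetical, constructed from the example depths 661 and 693 with arbitrary low index bits):

```python
import numpy as np

# Each raw 16-bit value packs the depth (mm) in its high 13 bits and
# index information in its low 3 bits; these raw values are hypothetical.
D_raw = np.array([[(661 << 3) | 5, (693 << 3) | 2]], dtype=np.uint16)

D1 = D_raw >> 3         # Formula 1: D1 = D / 2^3, dropping the index bits
index = D_raw & 0b111   # the low 3 index bits that Formula 1 discards
```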
(2) filter and complete the depth data using the color data;
A joint bilateral filtering method is applied to the depth matrix D_1 to complete the depth image where it is missing due to occlusion, yielding the filtered depth matrix D_2, whose depth value D_2(x, y) at point (x, y) is:

D_2(x, y) = \frac{1}{w_p} \sum_{(i_1, j_1) \in \Omega_1} w(i_1, j_1) \cdot D_1(i_1, j_1)    (Formula 2)
where Ω_1 is the filter reference neighborhood of the point (x, y) under consideration, x ∈ [1, a], y ∈ [1, b], (i_1, j_1) ∈ Ω_1, i_1 ∈ [x − r_1, x + r_1], j_1 ∈ [y − r_1, y + r_1], r_1 is the neighborhood radius, D_1(i_1, j_1) is the depth value of matrix D_1 at point (i_1, j_1), and w_p is the normalization parameter:

w_p = \sum_{(i_1, j_1) \in \Omega_1} w(i_1, j_1)    (Formula 3)
where w(i_1, j_1), the weight of each point (i_1, j_1) of the neighborhood Ω_1 with respect to the point under consideration, is composed of a depth-image spatial-domain weight w_s and a color-image gray-domain weight w_r, that is:

w(i_1, j_1) = w_s(i_1, j_1) × w_r(i_1, j_1)    (Formula 4)

where:

w_s(i_1, j_1) = \exp\left( -\frac{(i_1 - x)^2 + (j_1 - y)^2}{2\sigma_s^2} \right)    (Formula 5)

w_r(i_1, j_1) = \exp\left( -\frac{(Gray(i_1, j_1) - Gray(x, y))^2}{2\sigma_r^2} \right)    (Formula 6)
where σ_s and σ_r are the Gaussian standard deviations of the weights w_s and w_r respectively, and Gray(x, y) and Gray(i_1, j_1) are the gray values at pixel (x, y) and at each point (i_1, j_1) of its neighborhood Ω_1, computed as:

Gray(x, y) = 0.2989 × C(x, y, 1) + 0.5870 × C(x, y, 2) + 0.1140 × C(x, y, 3)    (Formula 7)

where C(x, y, 1), C(x, y, 2), C(x, y, 3) are the color values at point (x, y) of the a × b matrices of the R, G, B color components within the a × b × 3 color matrix C; traversing all points yields the gray matrix Gray.
By choosing suitable Gaussian standard deviations σ_s and σ_r and a suitable filter reference neighborhood Ω_1, noise points and the depth information missing in occluded regions can be effectively eliminated and completed.
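A direct, unoptimized sketch of the joint bilateral filter of Formulas 2-6 is given below (the function name and the parameter values r1, sigma_s, sigma_r are illustrative assumptions, not values given in the text):

```python
import numpy as np

def joint_bilateral(D1, gray, r1=3, sigma_s=2.0, sigma_r=10.0):
    """Joint bilateral filtering per Formulas 2-6: spatial weights w_s come
    from pixel distance in the depth image, range weights w_r from the gray
    values of the color image, so missing depth is completed along edges
    visible in the gray image."""
    a, b = D1.shape
    Dp = np.pad(D1.astype(float), r1, mode='edge')
    Gp = np.pad(gray.astype(float), r1, mode='edge')
    out = np.empty((a, b))
    # Spatial Gaussian over the (2*r1+1)^2 window (Formula 5).
    ii, jj = np.mgrid[-r1:r1 + 1, -r1:r1 + 1]
    w_s = np.exp(-(ii**2 + jj**2) / (2 * sigma_s**2))
    for x in range(a):
        for y in range(b):
            dwin = Dp[x:x + 2*r1 + 1, y:y + 2*r1 + 1]
            gwin = Gp[x:x + 2*r1 + 1, y:y + 2*r1 + 1]
            # Gray-domain (range) weight, Formula 6.
            w_r = np.exp(-(gwin - Gp[x + r1, y + r1])**2 / (2 * sigma_r**2))
            w = w_s * w_r                            # Formula 4
            out[x, y] = (w * dwin).sum() / w.sum()   # Formulas 2-3
    return out
```

On a constant depth field the filter is the identity, which is a quick sanity check of the normalization w_p.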
(3) obtain the depth matrix D′ reflecting the distance H_h from the froth surface to the cell bottom;
As shown in Fig. 1, H_h, the distance from a froth-surface point (x, y) in the field of view to the cell bottom, is obtained by:

H_h = H_b − H_d    (Formula 8)

where H_b is the distance from the plane of the Kinect sensor camera to the flotation-cell bottom (a constant after installation), and H_d is the distance from the froth-surface point in the field of view to the camera plane. When the camera plane is parallel to the cell bottom, H_d is the value of the corresponding point of the depth matrix D_2 obtained from the Kinect sensor after shifting, filtering, and completion, i.e. H_d = D_2(x, y).
The depth value D′(x, y) reflecting H_h at a point (x, y) is:

D′(x, y) = H_b − D_2(x, y)    (Formula 9)
In the present embodiment, the depth information D_1 is extracted from the depth data by the shift operation of Formula 1. The filter reference neighborhood Ω_1 of each point (x, y) uses radius r_1 = 3, with x ∈ [1, 640], y ∈ [1, 480], i_1 ∈ [x − r_1, x + r_1], j_1 ∈ [y − r_1, y + r_1]; the depth matrix D_1 is filtered by Formulas 2-6 to obtain the filtered depth matrix D_2. The color data are converted to gray by Formula 7 to obtain the gray matrix Gray. The depth matrix D′ reflecting the froth-surface distance H_h above the cell bottom is obtained by Formulas 8-9, with H_b = 4 m.
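Formulas 8-9 reduce to a single array subtraction; a minimal sketch with H_b = 4 m as in this embodiment (the D_2 values, in millimeters, are hypothetical):

```python
import numpy as np

H_b = 4000.0  # camera plane to cell bottom, mm (fixed after installation)

# Hypothetical filtered depth values D2: camera plane to froth surface, mm.
D2 = np.array([[2050.0, 2080.0],
               [2060.0, 2100.0]])

# Formulas 8-9: height of each froth-surface point above the cell bottom.
D_prime = H_b - D2
```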
Step 3: store data;
The color matrix C and depth matrix D′ are stored: the color data with Motion JPEG AVI coding, the depth data with Motion JPEG 2000 (16-bit) coding.
Step 3: perform feature extraction on the color and depth data obtained in Step 2, combining them to extract three-dimensional (depth-aware) froth features such as color, area, volume, speed, and burst rate; this comprises the following sub-steps:
Step 1: extract color features;
Feature extraction is performed on the color and depth data obtained in Step 2; combining them, the froth color features, namely the R mean, G mean, B mean, and gray mean Gray_m, are extracted by Formulas 12-15 below.
Suppose the depth values of objects in the field of view lie in the range [d_min, d_max]; the reference distance is defined as:

d_b = (d_max − d_min) / 2    (Formula 10)

Taking the color of points at the reference distance as the standard, under identical illumination the color of points farther from the lens (beyond the reference distance) is attenuated more and must be strengthened, while the color value of points nearer than the reference distance must be weakened. A weight function is therefore set to standardize the color data with the depth data:

q(x, y) = (D′(x, y) − d_b)^3 / 10^6 + 1    (Formula 11)
The color features after color-value standardization are defined as:

R mean: R_m = \frac{\sum_{x=1}^{a} \sum_{y=1}^{b} C(x, y, 1) \cdot q(x, y)}{a \times b}    (Formula 12)

G mean: G_m = \frac{\sum_{x=1}^{a} \sum_{y=1}^{b} C(x, y, 2) \cdot q(x, y)}{a \times b}    (Formula 13)

B mean: B_m = \frac{\sum_{x=1}^{a} \sum_{y=1}^{b} C(x, y, 3) \cdot q(x, y)}{a \times b}    (Formula 14)

From the gray matrix Gray obtained in Step 2, the gray mean is:

Gray_m = \frac{\sum_{x=1}^{a} \sum_{y=1}^{b} Gray(x, y) \cdot q(x, y)}{a \times b}    (Formula 15)
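The depth-compensated color means of Formulas 10-15 can be sketched as below (the function name and the test values are illustrative assumptions):

```python
import numpy as np

def color_features(C, D_prime, d_min, d_max):
    """Depth-compensated color means per Formulas 10-15: q(x,y) > 1 boosts
    points beyond the reference distance (whose light is attenuated more)
    and q(x,y) < 1 damps nearer points."""
    db = (d_max - d_min) / 2.0                    # Formula 10
    q = (D_prime - db)**3 / 1e6 + 1.0             # Formula 11
    ab = q.size                                   # a x b pixels
    Rm = (C[..., 0] * q).sum() / ab               # Formula 12
    Gm = (C[..., 1] * q).sum() / ab               # Formula 13
    Bm = (C[..., 2] * q).sum() / ab               # Formula 14
    gray = (0.2989 * C[..., 0] + 0.5870 * C[..., 1]
            + 0.1140 * C[..., 2])                 # Formula 7
    Gray_m = (gray * q).sum() / ab                # Formula 15
    return Rm, Gm, Bm, Gray_m
```

When every point sits exactly at the reference distance, q is identically 1 and the features reduce to plain means, which makes the weighting easy to verify.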
Step 2: extract area features;
From the gray matrix Gray obtained in Step 2, compute the gray distribution Gs of one frame and obtain the contrast-enhanced gray matrix Gray′ by Formulas 16-17; segment the matrix Gray′ with the watershed algorithm and obtain the binary foam-region matrix A by Formula 18; traverse all pixels of the frame and number the foam regions, obtaining the region label sets Ps_n; count the areas of all N foam regions to obtain the area distribution Area; sort the N area values of Area in ascending order to obtain Area′; and extract the froth area feature Area_m by Formula 22.
From the gray matrix Gray obtained in Step 2, the gray distribution Gs of one frame is computed, where Gs(g) is the number of pixels satisfying Gray(x, y) = g, g ∈ [0, 255].
Starting from e = 0, take e = 0, 1, …, e′ in turn, e′ ∈ (0, 255), where e = e′ is the first value satisfying:

\frac{\sum_{g \in [0, e'] \cup [255 - e', 255]} Gs(g)}{a \times b} > 0.01    (Formula 16)
The contrast-enhanced gray matrix Gray′ is then obtained; its gray value at point (x, y) is:

Gray′(x, y) =
  0, if Gray(x, y) ∈ (0, e′)
  \frac{255}{255 - 2e'} (Gray(x, y) − e′), if Gray(x, y) ∈ (e′, 255 − e′)
  255, if Gray(x, y) ∈ (255 − e′, 255)
(Formula 17)
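The contrast stretch of Formula 17 can be sketched as below (the function name is illustrative; gray values falling exactly on the band boundaries are assigned to the clipped bands here, a detail the text leaves open):

```python
import numpy as np

def contrast_stretch(gray, e_prime):
    """Formula 17: clip the darkest band (0, e') to 0 and the brightest
    band (255-e', 255) to 255, and stretch the middle band linearly."""
    g = gray.astype(float)
    out = np.empty_like(g)
    out[g <= e_prime] = 0.0
    out[g >= 255 - e_prime] = 255.0
    mid = (g > e_prime) & (g < 255 - e_prime)
    out[mid] = 255.0 / (255 - 2 * e_prime) * (g[mid] - e_prime)
    return out
```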
The watershed algorithm is applied to segment the matrix Gray′, yielding the binary foam-region matrix A, whose value at point (x, y) is denoted A(x, y); foam-region points and boundary points are distinguished by:

A(x, y) = 1 if (x, y) is a foam-region point, 0 if (x, y) is a boundary point    (Formula 18)
All pixels of the frame are traversed and the foam regions numbered, giving the label set of each marked region:

Ps_n: {(x_1, y_1), (x_2, y_2), …, (x_{Area(n)}, y_{Area(n)})}    (Formula 19)

where Ps_n is the set of all pixels of the n-th foam region, n = 1, 2, …, N, with N the total number of foam regions; Area(n), the number of pixels in the n-th foam region, is the region area. Counting the areas of all N regions gives the area distribution Area: {Area(1), Area(2), …, Area(n), …, Area(N)}, and the total area of the N foam regions of one frame is:

SArea = \sum_{n=1}^{N} Area(n)    (Formula 20)
Sorting the N area values of Area in ascending order gives Area′: {Area′(1), Area′(2), …, Area′(n), …, Area′(N)}, where Area′(1) is the minimum and Area′(N) the maximum froth area.
Starting from f_1 = 1, take f_1 = 1, 2, …, f_1′ in turn, f_1′ ∈ (1, N), where f_1 = f_1′ is the first value satisfying:

\frac{\sum_{n \in [1, f_1']} Area'(n)}{SArea} > 0.2    (Formula 21)
The froth area feature is then defined as:

Area_m = \frac{\sum_{n = f_1' + 1}^{N} Area'(n)}{N - f_1' - 1}    (Formula 22)
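The cumulative-share selection of Formulas 21-22 can be sketched as below; as an assumption, the sketch divides the tail sum by the number of retained regions (a plain mean over the large bubbles):

```python
import numpy as np

def area_feature(areas, frac=0.2):
    """Formulas 21-22: sort the region areas ascending (Area'), find the
    first f1' whose cumulative share of the total area exceeds 20%, then
    average the remaining large-bubble areas."""
    area_sorted = np.sort(np.asarray(areas, dtype=float))  # Area'
    s_area = area_sorted.sum()                             # Formula 20
    share = np.cumsum(area_sorted) / s_area
    # First 1-based index whose cumulative share strictly exceeds frac.
    f1 = int(np.searchsorted(share, frac, side='right')) + 1
    tail = area_sorted[f1:]   # regions f1'+1 .. N (large bubbles)
    return tail.sum() / len(tail)
```

Discarding the small-area fraction suppresses over-segmentation noise before the mean is taken.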
Step 3: extract volume features;
Compute the region centroid of the n-th foam region and the semi-major and semi-minor axes CB_n, DB_n of its approximating ellipse by Formulas 23-28; compute the pixel coordinates of the major-axis endpoints c_1(u_{n1}, v_{n1}), c_2(u_{n2}, v_{n2}) by Formulas 29-30; with the depth matrix D′ obtained in Step 2, compute the depth values l_{n1}, l_{n2}, l_{nc} at the major-axis endpoints and the region centroid by Formula 31; assuming each bubble is a standard ellipsoid, obtain its real-space height h_n by Formula 32; obtain the real major-axis length CR_n by Formulas 33-36, and likewise the real minor-axis length DR_n; and obtain the volume V(n) of the n-th bubble by Formula 37.
The volume values of the N foam regions, forming the distribution Vs, are sorted in ascending order to obtain Vs′, and the froth volume feature V_m is extracted by Formulas 38-39.
For the binary region matrix A of one frame and the region labels Ps_n, the (p+q)-order regular moment of the n-th foam region is defined as:

M_n(p, q) = \sum_{(x, y) \in Ps_n} x^p y^q A(x, y)    (Formula 23)

The region centroid of the n-th foam region follows from its first-order regular moments:

AC_n: (x̄_n, ȳ_n) = \left( \frac{M_n(1, 0)}{M_n(0, 0)}, \frac{M_n(0, 1)}{M_n(0, 0)} \right)    (Formula 24)
The (p+q)-order central moment is then:

U_n(p, q) = \sum_{(x, y) \in Ps_n} (x - x̄_n)^p (y - ȳ_n)^q A(x, y)    (Formula 25)

Considering only the central moments up to second order, a foam region can be approximated by an ellipse centered at its region centroid, giving:

CB_n = \left( \frac{U_n(2, 0) + U_n(0, 2) + [(U_n(2, 0) - U_n(0, 2))^2 + 4 U_n(1, 1)^2]^{1/2}}{U_n(0, 0) / 2} \right)^{1/2}    (Formula 26)

DB_n = \left( \frac{U_n(2, 0) + U_n(0, 2) - [(U_n(2, 0) - U_n(0, 2))^2 + 4 U_n(1, 1)^2]^{1/2}}{U_n(0, 0) / 2} \right)^{1/2}    (Formula 27)

θ_n = \frac{1}{2} \arctan\left( \frac{2 U_n(1, 1)}{U_n(2, 0) - U_n(0, 2)} \right)    (Formula 28)

where CB_n is the semi-major axis of the approximating ellipse, DB_n the semi-minor axis, and θ_n the inclination angle.
The pixel coordinates of the ellipse's major-axis endpoints c_1(u_{n1}, v_{n1}), c_2(u_{n2}, v_{n2}) are then:

u_{n1} = x̄_n − CB_n cos(θ_n), v_{n1} = ȳ_n − CB_n sin(θ_n)    (Formula 29)

u_{n2} = x̄_n + CB_n cos(θ_n), v_{n2} = ȳ_n + CB_n sin(θ_n)    (Formula 30)
With the depth matrix D′ obtained in Step 2, the depth values l_{n1}, l_{n2}, l_{nc} at the major-axis endpoints and the region centroid are:

l_{n1} = D′(u_{n1}, v_{n1}), l_{n2} = D′(u_{n2}, v_{n2}), l_{nc} = D′(x̄_n, ȳ_n)    (Formula 31)

Assuming the bubble can be approximated by a standard ellipsoid, its real-space height h_n is defined as:

h_n = l_{nc} − min(l_{n1}, l_{n2})    (Formula 32)
Combining the view parameters of the Kinect sensor, the horizontal view angle δ_1 and the vertical view angle δ_2, the three-dimensional space coordinates C_{n1} and C_{n2} corresponding to the major-axis endpoints are:

C_{n1}: (δ_1 [2 u_{n1} L + (a − 1)(L − l_{n1})], δ_2 [2 (b − v_{n1} − 1) L + (b − 1)(L − l_{n1})], L − l_{n1})    (Formula 33)

C_{n2}: (δ_1 [2 u_{n2} L + (a − 1)(L − l_{n2})], δ_2 [2 (b − v_{n2} − 1) L + (b − 1)(L − l_{n2})], L − l_{n2})    (Formula 34)

where L is the distance from the depth camera to the reference image plane (space coordinate z = 0).
The real major-axis length is then:

CR_n = |C_{n1} C_{n2}| = \sqrt{δ_1^2 [2L(u_{n1} - u_{n2}) - (a - 1)(l_{n1} - l_{n2})]^2 + δ_2^2 [2L(v_{n1} - v_{n2}) + (b - 1)(l_{n1} - l_{n2})]^2 + (l_{n1} - l_{n2})^2}    (Formula 35)

Assuming the depth values at the two major-axis endpoints are equal, the real major-axis length simplifies to:

CR_n = 2L \sqrt{δ_1^2 (u_{n1} - u_{n2})^2 + δ_2^2 (v_{n1} - v_{n2})^2}    (Formula 36)
The real minor-axis length DR_n is obtained in the same way.
Finally, the volume of the n-th bubble (assumed to be a standard ellipsoid) is obtained:

V(n) = \frac{4}{3} π \left( \frac{CR_n}{2} × \frac{DR_n}{2} × h_n \right)    (Formula 37)
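The moment-based ellipse fit of Formulas 23-28 and the ellipsoid volume of Formula 37 can be sketched as below, working in pixel units (the conversion to real lengths via Formulas 33-36 is omitted; function names are illustrative):

```python
import numpy as np

def ellipse_axes(points):
    """Formulas 23-28: centroid and equivalent-ellipse semi-axes of a
    binary region given as a list of (x, y) pixel coordinates."""
    pts = np.asarray(points, dtype=float)
    xc, yc = pts.mean(axis=0)                    # Formula 24 (centroid)
    dx, dy = pts[:, 0] - xc, pts[:, 1] - yc
    u20, u02 = (dx**2).sum(), (dy**2).sum()      # Formula 25
    u11, u00 = (dx * dy).sum(), float(len(pts))
    s = np.sqrt((u20 - u02)**2 + 4 * u11**2)
    CB = np.sqrt((u20 + u02 + s) / (u00 / 2))    # Formula 26, semi-major
    DB = np.sqrt((u20 + u02 - s) / (u00 / 2))    # Formula 27, semi-minor
    return CB, DB

def bubble_volume(CR, DR, h):
    """Formula 37, with full axis lengths CR, DR and bubble height h."""
    return 4.0 / 3.0 * np.pi * (CR / 2) * (DR / 2) * h
```

A filled disk of radius R should yield semi-axes close to R, which is a convenient check of the second-moment normalization.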
Following the approach used for the area feature in sub-step 2 of Step 3, the volume distribution of the N regions, Vs: {Vs(1), Vs(2), …, Vs(n), …, Vs(N)}, is sorted in ascending order to obtain Vs′: {Vs′(1), Vs′(2), …, Vs′(n), …, Vs′(N)}.
Starting from f_2 = 1, take f_2 = 1, 2, …, f_2′ in turn, f_2′ ∈ (1, N), where f_2 = f_2′ is the first value satisfying:

\frac{\sum_{n \in [1, f_2']} Vs'(n)}{\sum_{n \in [1, N]} Vs'(n)} > 0.2    (Formula 38)
The froth volume feature is then defined as:

V_m = \frac{\sum_{n = f_2' + 1}^{N} Vs'(n)}{N - f_2' - 1}    (Formula 39)
Step 4: extract speed features;
(1) Harris corner detection;
First, suppose a point (x, y) in the enhanced gray matrix Gray′ is a corner; take its neighborhood Ω_2 with radius r_2 = 5, i_2 ∈ [x − r_2, x + r_2], j_2 ∈ [y − r_2, y + r_2]. By Formulas 40-42, compute the score of every point of this corner neighborhood Ω_2, take the h_1 points whose scores exceed the qualifying score, and sort them by descending score to obtain the point set Hs_1: {(x_1, y_1), (x_2, y_2), …, (x_{h_1}, y_{h_1})}.
Suppose a point (x, y) in the enhanced gray matrix Gray′ is a corner; the mean intensity (gray-value) change of all points of its neighborhood Ω_2 along an offset direction (Δx, Δy) is:

I = \sum_{(i_2, j_2) \in \Omega_2} (Gray'(i_2 + Δx, j_2 + Δy) - Gray'(i_2, j_2))^2    (Formula 40)

where i_2 ∈ [x − r_2, x + r_2], j_2 ∈ [y − r_2, y + r_2], and r_2 is the radius of the neighborhood Ω_2.
To evaluate the averaged intensity change over all offset directions of corner (x, y), Formula 40 is expanded as a Taylor series; omitting the higher-order terms, it is written in matrix form as:

I ≈ (Δx, Δy) \begin{pmatrix} \sum (\partial Gray' / \partial h_x)^2 & \sum (\partial Gray' / \partial h_x)(\partial Gray' / \partial h_y) \\ \sum (\partial Gray' / \partial h_x)(\partial Gray' / \partial h_y) & \sum (\partial Gray' / \partial h_y)^2 \end{pmatrix} \begin{pmatrix} Δx \\ Δy \end{pmatrix} = (Δx, Δy) \, Q \begin{pmatrix} Δx \\ Δy \end{pmatrix}    (Formula 41)

where Q is the covariance matrix, whose two eigenvalues represent the maximum mean intensity change over all offset directions at a point and the mean intensity change in the perpendicular direction. The scoring function is defined as:

G = Det(Q) − k · Trace^2(Q)    (Formula 42)

where Det(Q), the determinant of the matrix, is the product of the eigenvalues, and Trace(Q), the trace of the matrix, is their sum. When both eigenvalues are large, the score is high.
Let G_r be the qualifying score. All points of the neighborhood Ω_2 of corner (x, y) are traversed, the h_1 points with scores above the qualifying score are taken, and sorting by descending score yields the point set Hs_1: {(x_1, y_1), (x_2, y_2), …, (x_{h_1}, y_{h_1})}.
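A per-pixel Harris score following Formulas 41-42 can be sketched as below (k is the conventional Harris constant, a value the text does not specify, and np.gradient stands in for the image derivatives):

```python
import numpy as np

def harris_score(gray, x, y, r2=5, k=0.04):
    """Formulas 41-42: build the covariance matrix Q of image gradients
    over the (2*r2+1)^2 neighborhood of (x, y), then score
    G = Det(Q) - k * Trace(Q)^2."""
    g = gray.astype(float)
    gx, gy = np.gradient(g)            # derivatives along the two axes
    wx = gx[x - r2:x + r2 + 1, y - r2:y + r2 + 1]
    wy = gy[x - r2:x + r2 + 1, y - r2:y + r2 + 1]
    q = np.array([[(wx * wx).sum(), (wx * wy).sum()],
                  [(wx * wy).sum(), (wy * wy).sum()]])   # Formula 41
    return float(np.linalg.det(q) - k * np.trace(q)**2)  # Formula 42
```

A flat patch scores zero (both eigenvalues vanish), while an L-shaped intensity corner scores positive, matching the "both eigenvalues large" criterion.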
Next, take the corner neighborhood Ω_3 with radius r_3 = 7, i_3 ∈ [x − r_3, x + r_3], j_3 ∈ [y − r_3, y + r_3]; apply the local gray-value extremum operation of Formula 43 to the enhanced gray matrix Gray′; determine the local gray-value extremum points by Formulas 44-45; and filter from the point set Hs_1 the points satisfying the local gray extremum condition, obtaining the point set Hs_2: {(x_1, y_1), (x_2, y_2), …, (x_{h_2}, y_{h_2})} of h_2 points.
Carry out the most value process in gray-scale value local to enhancing gray matrix Gray', formula is:
Graymax(x,y)=max(Gray′(i 3,j 3)),(i 3,j 3)∈Ω 3
Formula 43
Graymin(x,y)=min(Gray′(i 3,j 3)),(i 3,j 3)∈Ω 3
Wherein Graymax (x, y) is the value at gray matrix Graymax mid point (x, the y) place after gray-scale value suboptimize, Graymin (x, y) be the value at gray matrix Graymin mid point (x, y) place after gray-scale value local minimum, Ω 3for the neighborhood of point (x, y), its radius of neighbourhood is set to r 3, and i 3∈ [x-r 3, x+r 3], j 3∈ [y-r 3, y+r 3],
Following formula determination gray-scale value local is adopted to be worth most a little:
From point set Hs 1in filter out the point meeting local gray level and be worth condition most, obtain by h 2the point set of individual composition Hs 2 : { ( x 1 , y 1 ) , ( x 2 , y 2 ) . . . ( x h 2 , y h 2 ) } .
Finally, apply to the depth matrix D′ the same local-extremum operation used for the gray matrix Gray′ to obtain the local depth-extremum points; from the point set Hs2, keep the points satisfying the local depth extremum condition, giving the corner point set Hs3 of h3 points: {(x1, y1), (x2, y2) … (x_h3, y_h3)}.
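The three-stage screening above (Harris score threshold, local gray extremum, local depth extremum) can be sketched in a few lines of Python. This is only a minimal illustration under assumptions, not the patented implementation: the 3×3 gradient window, the constant k = 0.04 and a single shared radius for the extremum tests are choices made here, and the text additionally sorts Hs1 by score, which this sketch skips.

```python
import numpy as np

def harris_scores(gray, k=0.04):
    """Harris response G = det(Q) - k*trace(Q)^2 (Formula 42), with Q built
    from gradient products summed over a 3x3 window (cf. Formula 41)."""
    gx, gy = np.gradient(gray.astype(float))
    h, w = gray.shape
    def box(m):  # 3x3 windowed sum via edge padding
        p = np.pad(m, 1, mode='edge')
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    sxx, syy, sxy = box(gx * gx), box(gy * gy), box(gx * gy)
    return (sxx * syy - sxy ** 2) - k * (sxx + syy) ** 2

def screen_corners(gray, depth, g_r, r=1):
    """Three screenings: score above g_r (Hs1), local gray extremum within
    radius r (Hs2, cf. Formulas 43-45), local depth extremum (Hs3)."""
    scores = harris_scores(gray)
    h, w = gray.shape
    hs3 = []
    for x in range(r, h - r):
        for y in range(r, w - r):
            if scores[x, y] <= g_r:
                continue                                   # fails Hs1
            gwin = gray[x - r:x + r + 1, y - r:y + r + 1]
            if gray[x, y] not in (gwin.max(), gwin.min()):
                continue                                   # fails Hs2
            dwin = depth[x - r:x + r + 1, y - r:y + r + 1]
            if depth[x, y] not in (dwin.max(), dwin.min()):
                continue                                   # fails Hs3
            hs3.append((x, y))
    return hs3
```

A production implementation would use a dedicated corner detector (e.g. OpenCV's Harris) rather than this explicit loop.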
(2) Track the points in the corner set Hs3 determined by the three screenings above, and extract the velocity feature.

Suppose the same corner (x_h, y_h) keeps constant intensity (gray value) across two successive frames, that is:

Gray′_t(x_h, y_h) = Gray′_{t+Vt}(x_h + Vu, y_h + Vv)   Formula 46

where Gray′_t(x_h, y_h) is the gray value at point (x_h, y_h) of the enhanced gray matrix at time t (previous frame), Gray′_{t+Vt}(x_h + Vu, y_h + Vv) is the gray value at point (x_h + Vu, y_h + Vv) at time t + Vt (next frame), (Vu, Vv) is the offset of corner (x_h, y_h) over the time Vt, Vt is the interval between adjacent frames, and h = 1, 2, …, h3.
The basic optical flow constraint equation then holds:

(∂Gray′/∂x_h)·Vu + (∂Gray′/∂y_h)·Vv = −∂Gray′/∂t   Formula 47

Suppose corner (x_h, y_h) and all points in its neighborhood Ω4 (radius r4) share the same offset. Substituting the corner and all (2r4 + 1)² points of Ω4 into the optical flow constraint equation yields more equations than unknowns (the two unknowns Vu, Vv); solving iteratively gives the offset (Vu, Vv) of corner (x_h, y_h), whose displacement magnitude is:

Vw = √(Vu² + Vv²)   Formula 48

Solving iteratively for all detected corners, the average displacement is:

w = (Σ_{i=1}^{h3} Vw_i) / h3   Formula 49

With adjacent frame interval Vt, the foam velocity feature is defined as:

Rate = w / Vt   Formula 50
In the present embodiment, for each corner (x_h, y_h), h = 1, 2, …, h3, of the corner set Hs3 determined by the three screenings, with neighborhood Ω4 of radius r4 = 1, i4 ∈ [x − r4, x + r4], j4 ∈ [y − r4, y + r4], Formulas 46–49 give the displacement Vw of each point in Hs3 and the average displacement w over all points; with adjacent frame interval Vt = 0.33 s, Formula 50 gives the foam velocity feature Rate.
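As a rough sketch of this tracking step, the over-determined system of Formula 47 over a (2r4 + 1)² window can be solved in one shot by least squares rather than the iterative solution described above. numpy, the window radius r = 3 and the synthetic test pattern are assumptions for illustration only.

```python
import numpy as np

def lk_flow_at(prev, nxt, x, y, r=3):
    """Least-squares solution of the optical-flow constraint (Formula 47)
    over the (2r+1)^2 window centred on corner (x, y); returns (Vu, Vv)."""
    ix, iy = np.gradient(prev.astype(float))      # spatial gradients (axis 0, axis 1)
    it = nxt.astype(float) - prev.astype(float)   # temporal difference
    win = (slice(x - r, x + r + 1), slice(y - r, y + r + 1))
    a = np.stack([ix[win].ravel(), iy[win].ravel()], axis=1)
    b = -it[win].ravel()
    (vu, vv), *_ = np.linalg.lstsq(a, b, rcond=None)
    return vu, vv

def speed_feature(displacements, dt=0.33):
    """Formulas 49-50: average displacement w over the frame interval dt."""
    w = sum(displacements) / len(displacements)
    return w / dt
```

On a smooth synthetic pattern shifted by one pixel, the recovered offset lands close to one pixel along the shifted axis; the patent instead solves the system iteratively for every tracked corner.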
During tracking, corners that no longer need to be tracked must be continually deleted, in particular corners whose displacement Vw is very small and corners that move out of the field of view. When, after such deletions, fewer than 50 corners remain, corners must be re-detected by the preceding steps and tracking resumed.
Step 5: extract the breakage rate feature;

From the depth matrix D′ obtained in step 3, in one-to-one correspondence with the N foam-region label point sets Ps_n of one frame, obtain the set PD_n of depth values of the pixels of the n-th foam region: (d1, d2, … d_Area(n)); take the minimum depth value d_min; form the heights of the pixels of the foam region PH_n: (d1 − d_min, d2 − d_min, … d_Area(n) − d_min); and sum the heights by Formula 51:

PH_nsum = Σ PH_n   Formula 51

For the next frame, read the depth data PD′_n: (d′1, d′2, … d′_Area(n)) corresponding to the previous frame's point set Ps_n, form the pixel heights PH′_n: (d′1 − d_min, d′2 − d_min, … d′_Area(n) − d_min), and sum them by Formula 52:

PH′_nsum = Σ PH′_n   Formula 52

When PH′_nsum / PH_nsum < 0.75, the bubble labeled n is considered burst. Processing all N regions in the same way and counting BS burst bubbles in total, the foam breakage rate feature Break is extracted by Formula 53:

Break = BS / N   Formula 53
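The breakage-rate computation of Formulas 51–53 translates directly into code. The sketch below assumes numpy and regions given as lists of pixel coordinates (the labeled point sets Ps_n), and keeps d_min from the previous frame as the text specifies.

```python
import numpy as np

def breakage_rate(depth_prev, depth_next, regions, thresh=0.75):
    """Formulas 51-53: a region whose summed pixel heights drop below
    `thresh` of the previous frame's sum counts as burst; Break = BS / N."""
    burst = 0
    for pts in regions:                       # one list of (x, y) pixels per region (Ps_n)
        idx = tuple(np.array(pts).T)
        d_min = depth_prev[idx].min()         # d_min taken from the previous frame
        ph_sum = (depth_prev[idx] - d_min).sum()        # Formula 51
        ph_sum_next = (depth_next[idx] - d_min).sum()   # Formula 52
        if ph_sum > 0 and ph_sum_next / ph_sum < thresh:
            burst += 1                        # bubble labeled n considered burst
    return burst / len(regions)               # Formula 53
```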
Step 4: apply distribution fitting to the depth data obtained in step 2 for statistical analysis, use parameter estimation to obtain the distribution of the foam surface height, and thence the relation between the foam surface height and the overflow-launder lip height, for on-line monitoring and working-condition recognition of the froth level; this comprises the following sub-steps:
Step 1: from the depth matrix D′ obtained in step 2, extract the foam surface level features by Formulas 55–60: average level L_a, level smoothness L_u, minimum level L_l and maximum level L_h.
As shown in Figure 1, each foam surface point (x, y) in the kinect field of view satisfies:

H_h = H_t + H_k   Formula 54

where H_k is the pulp level (the pulp surface is assumed parallel to the flotation cell bottom) and H_t is the froth-layer thickness at point (x, y), i.e. the distance from that point to the pulp surface. H_k is an important indicator for working-condition recognition at the flotation site, but is hard to measure because the pulp surface is covered by the froth layer. Therefore, from the distance of every foam surface point in the field of view to the cell bottom, H_h = D′(x, y), foam surface level features correlated with the pulp level are obtained and used for working-condition recognition.
A large amount of off-line data is used to fit a distribution to the data in the depth matrix D′; the result is that they follow a normal distribution. The point estimation method is adopted to estimate the parameters of the normal distribution:

Mean of the normal distribution: μ = Σ_{(x,y)=(1,1)}^{(a,b)} D′(x, y) / (a × b)   Formula 55

Standard deviation of the normal distribution: σ = √( Σ_{(x,y)=(1,1)}^{(a,b)} (D′(x, y) − μ)² / (a × b) )   Formula 56

Define 4 foam surface level features:

Average level: L_a = μ   Formula 57
Level smoothness: L_u = σ   Formula 58
Minimum level: L_l = μ − 3σ   Formula 59
Maximum level: L_h = μ + 3σ   Formula 60
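Formulas 55–60 amount to the sample mean and population standard deviation of the depth matrix plus the μ ± 3σ band; a minimal numpy sketch, with the dictionary keys assumed for illustration:

```python
import numpy as np

def level_features(d):
    """Formulas 55-60: normal-distribution point estimates over the depth
    matrix D', then the four surface-level features."""
    mu = d.mean()        # Formula 55
    sigma = d.std()      # Formula 56 (population standard deviation, /(a*b))
    return {'L_a': mu,              # average level, Formula 57
            'L_u': sigma,           # level smoothness, Formula 58
            'L_l': mu - 3 * sigma,  # minimum level, Formula 59
            'L_h': mu + 3 * sigma}  # maximum level, Formula 60
```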
Step 2: by statistical analysis, obtain the relation between the foam surface level features and the overflow-launder lip height H_y, here H_y = 3 m; judge the current level condition by Formula 61, and when one condition class persists for more than 10 minutes, the recognition result is taken to be valid.
Step 5: form a feature vector from the foam features extracted in step 3, and use an improved k-means algorithm for off-line cluster analysis to obtain several cluster centres; extract the foam features of the flotation site in real time and match them against the cluster centres to obtain the current flotation working condition on line.

Step 5 comprises the following sub-steps:

Step 1: form the feature vector ds from the 8 foam features extracted in step 3:

ds = [Graym Rm Gm Bm Aream Vm Rate Break]   Formula 62
Step 2: obtain the cluster centres off-line;

From the monitoring records of a period of time, sample g groups to obtain the feature data set:

DS = [ds1; ds2; …; dsg] =
[Graym1 Rm1 Gm1 Bm1 Aream1 Vm1 Rate1 Break1;
 Graym2 Rm2 Gm2 Bm2 Aream2 Vm2 Rate2 Break2;
 …;
 Graymg Rmg Gmg Bmg Areamg Vmg Rateg Breakg]   Formula 63

Traverse the cluster number k, k ∈ {1, 2, …, g}; for each k, run K-means clustering and evaluate a defined evaluation function to determine the optimal cluster number kb, obtaining kb cluster centres [cc1, cc2, … cc_kb] that form the cluster-centre set KB:

KB = [cc1; cc2; …; cc_kb] =
[Graym_c1 Rm_c1 Gm_c1 Bm_c1 Aream_c1 Vm_c1 Rate_c1 Break_c1;
 Graym_c2 Rm_c2 Gm_c2 Bm_c2 Aream_c2 Vm_c2 Rate_c2 Break_c2;
 …;
 Graym_ckb Rm_ckb Gm_ckb Bm_ckb Aream_ckb Vm_ckb Rate_ckb Break_ckb]   Formula 64
Step 3: on-line working-condition recognition.

With the depth-information-based froth flotation monitoring system built in step one, acquire froth color and depth data in real time and extract the real-time froth image feature vector by step 3:

cc_current = [Graym_current Rm_current Gm_current Bm_current Aream_current Vm_current Rate_current Break_current]   Formula 65

Define the Euclidean distance between the real-time features and each cluster-centre feature as the feature similarity:

KN = [|cc1 − cc_current|; |cc2 − cc_current|; …; |cc_kb − cc_current|]   Formula 66

where |cc_kb − cc_current| denotes the modulus of the vector (cc_kb − cc_current), that is:

|cc_kb − cc_current| = √( (Graym_ckb − Graym_current)² + (Rm_ckb − Rm_current)² + … + (Break_ckb − Break_current)² )   Formula 67
From a large amount of off-line data, the optimal number of cluster centres is obtained by Formulas 62–64. With the kinect-based froth flotation monitoring system built in step one, froth color and depth data are acquired in real time, the real-time froth image feature vector cc_current is extracted by step 3, and by Formula 66 the cluster centre closest to the real-time features (i.e. with the highest similarity) is taken as the current working condition, completing working-condition recognition.
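The on-line matching of Formulas 66–67 is a nearest-centre lookup in Euclidean distance. A minimal numpy sketch; the cluster centres themselves would come from the off-line k-means of Formulas 63–64 (e.g. via scikit-learn's KMeans), which is assumed and not shown:

```python
import numpy as np

def match_condition(centres, current):
    """Formulas 66-67: Euclidean distances from the real-time feature
    vector to every cluster centre; the nearest centre is the condition."""
    kn = np.linalg.norm(centres - current, axis=1)   # one distance per centre
    return int(np.argmin(kn)), kn
```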

Claims (8)

1. A froth flotation level monitoring and working-condition recognition method based on depth information, characterized in that it comprises the following steps:
Step one: build the froth flotation working-condition monitoring system based on kinect from software and hardware aspect;
Step 2: gather, process and preserve the flotation froth color data and depth data that are obtained by kinect;
Step 3: perform feature extraction on the color and depth data obtained in step 2, combining color and depth data to extract stereoscopic foam features such as color, area, volume, speed and breakage rate;
Step 4: adopt distribution fitting method to carry out statistical study to the depth data that step 2 obtains; Method for parameter estimation is adopted to obtain foam top layer liquid level feature; And then the relation obtained between foam top layer liquid level feature and overflow groove edge height, for on-line monitoring and the operating mode's switch of froth flotation liquid level;
Step 5: foam characteristics constitutive characteristic vector step 3 obtained, adopts the k-means algorithm improved to carry out the cluster analysis of off-line, obtain several cluster centres; The foam characteristics of extract real-time flotation site, and carry out real-time matching with cluster centre, the online operating mode obtaining current flotation.
2. the froth flotation level monitoring based on depth information according to claim 1 and operating mode's switch method, it is characterized in that, described step one comprises following sub-step:
Step 1: build the kinect-based froth flotation working-condition monitoring system on the hardware side; the hardware comprises the kinect sensor, a high-frequency light source and a computer; the kinect sensor is fixed 1.2 m–3.5 m above the flotation cell liquid level with its camera plane parallel to the flotation cell bottom; the high-frequency light source provides illumination for the kinect sensor's color data acquisition; the collected color and depth data streams are sent to the computer over a USB data line, and the computer performs data processing, real-time working-condition monitoring and result display;
Step 2: build the froth flotation working-condition monitoring system based on kinect from software aspect, under cross-platform framework Openni, adopt interactive programming interface, to realize the Real-time Obtaining of color and depth data stream, process, preservation and feature extraction and operating mode's switch.
3. the froth flotation level monitoring based on depth information according to claim 1 and operating mode's switch method, it is characterized in that, described step 2 comprises following sub-step:
Step 1: image data;
(1) setting data form;
The color data stream is set to RGB888, with resolution a × b (the number of pixels in the horizontal direction multiplied by the number in the vertical direction) and a frame rate of 30 FPS; the depth data stream is set to the PIXEL_FORMAT_DEPTH_1_MM format, with millimetre precision, resolution a × b and a frame rate of 30 FPS;
(2) hardware alignment of color data and depth data;

Adopt the "Alternative View" tool under the cross-platform framework Openni to correct the viewing angles of the color camera and the depth camera, aligning the color data stream and the depth data stream over the field of view;

(3) software alignment of color data and depth data;
Start the kinect device and acquire images; create the color data stream and the depth data stream simultaneously with the data formats set above, obtaining two temporally aligned data sets: an a × b × 3 color matrix C, composed of three a × b matrices representing the R, G, B color components of the image, and an a × b depth matrix D;
Step 2: process data;
(1) depth information in depth data is obtained;
The depth matrix D obtained by the kinect sensor holds 16-bit integer data, of which the high 13 bits store the depth information and the low 3 bits store index information; the depth information is obtained by a shift operation:

D1 = D / 2³   Formula 1

where D1 is the depth matrix storing only depth information;
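Formula 1 is a 3-bit right shift of the 16-bit depth word; a small numpy sketch, with the function name and the returned low-bit index assumed for illustration:

```python
import numpy as np

def split_depth(raw):
    """Formula 1: the 16-bit kinect depth word stores depth in the high
    13 bits and an index in the low 3 bits; a right shift by 3 (i.e.
    division by 2^3) recovers the depth-only matrix D1."""
    raw = raw.astype(np.uint16)
    return raw >> 3, raw & 0b111   # (depth-only matrix D1, low-3-bit index)
```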
(2) combine the color data to filter and complete the depth data;

Apply the joint bilateral filtering method to the depth matrix D1 to fill in the depth image regions missing due to occlusion, obtaining the filtered depth matrix D2, whose depth value D2(x, y) at point (x, y) is:

D2(x, y) = (1 / w_p) · Σ_{(i1, j1)∈Ω1} w(i1, j1) × D1(i1, j1)   Formula 2

where Ω1 is the filter reference neighborhood of the target point (x, y), x ∈ [1, a], y ∈ [1, b], (i1, j1) ∈ Ω1, i1 ∈ [x − r1, x + r1], j1 ∈ [y − r1, y + r1], r1 is the neighborhood radius, D1(i1, j1) is the depth value of matrix D1 at point (i1, j1), and w_p is the normalization parameter:

w_p = Σ_{(i1, j1)∈Ω1} w(i1, j1)   Formula 3
where w(i1, j1) is the weight of each point (i1, j1) in the neighborhood Ω1 of the target point (x, y), composed of the depth-image spatial-domain weight w_s and the color-image gray-domain weight w_r, that is:

w(i1, j1) = w_s(i1, j1) × w_r(i1, j1)   Formula 4

where:

w_s(i1, j1) = exp( −((i1 − x)² + (j1 − y)²) / (2σ_s²) )   Formula 5

w_r(i1, j1) = exp( −(Gray(i1, j1) − Gray(x, y))² / (2σ_r²) )   Formula 6

where σ_s and σ_r are the Gaussian standard deviations of the weights w_s and w_r respectively, and Gray(x, y), Gray(i1, j1) are the gray values at pixel (x, y) and at each point (i1, j1) of its neighborhood Ω1, computed as:
Gray(x, y) = 0.2989 × C(x, y, 1) + 0.5870 × C(x, y, 2) + 0.1140 × C(x, y, 3)   Formula 7
where C(x, y, 1), C(x, y, 2), C(x, y, 3) are the values at point (x, y) of the a × b matrices of the R, G, B color components in the a × b × 3 color matrix C; traversing all points yields the gray matrix Gray;

With suitable Gaussian standard deviations σ_r and σ_s and a suitable filter reference neighborhood Ω1, the missing depth information of occluded regions and noise points can be effectively eliminated;
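A direct (unoptimized) transcription of the joint bilateral filter of Formulas 2–6, assuming numpy and small illustrative values for r1, σ_s and σ_r; a real implementation would vectorize this and handle the missing-depth mask explicitly:

```python
import numpy as np

def joint_bilateral(depth, gray, r=2, sigma_s=2.0, sigma_r=10.0):
    """Formulas 2-6: depth smoothed with weights from spatial distance
    (sigma_s) and gray-level difference in the color image (sigma_r)."""
    h, w = depth.shape
    out = np.zeros_like(depth, dtype=float)
    for x in range(h):
        for y in range(w):
            i0, i1 = max(0, x - r), min(h, x + r + 1)
            j0, j1 = max(0, y - r), min(w, y + r + 1)
            ii, jj = np.mgrid[i0:i1, j0:j1]
            ws = np.exp(-((ii - x) ** 2 + (jj - y) ** 2) / (2 * sigma_s ** 2))
            wr = np.exp(-(gray[i0:i1, j0:j1] - gray[x, y]) ** 2 / (2 * sigma_r ** 2))
            wgt = ws * wr                          # Formula 4
            out[x, y] = (wgt * depth[i0:i1, j0:j1]).sum() / wgt.sum()  # Formulas 2-3
    return out
```

Because the gray-domain weight collapses across strong gray edges, depth edges that coincide with color edges are preserved while flat regions are smoothed.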
(3) reflection foam top layer is obtained apart from bottom land distance H hmatrix of depths D ';
H_h denotes the distance from a foam surface point (x, y) in the field of view to the cell bottom, and can be obtained from:

H_h = H_b − H_d   Formula 8

where H_b is the distance from the plane of the kinect sensor camera to the flotation cell bottom and H_d is the distance from a foam surface point in the field of view to the plane of the kinect sensor camera; when the camera plane is parallel to the cell bottom, H_d is the value of the corresponding point of the depth matrix D obtained by the kinect sensor after shifting and filtering completion, i.e. H_d = D2(x, y);

The depth value D′(x, y) of a point (x, y) reflecting H_h is therefore:

D′(x, y) = H_b − D2(x, y)   Formula 9
Step 3: preserve data;
The color matrix C and the depth matrix D′ are stored, the color data with Motion JPEG AVI coding and the depth data with Motion JPEG 2000 (16-bit) coding.
4. the froth flotation level monitoring based on depth information according to claim 3 and operating mode's switch method, it is characterized in that, described step 3 comprises following sub-step:
Step 1: extract color characteristic;
Perform feature extraction on the color and depth data obtained in step 2, combining color and depth data to extract the color features of the foam.
Suppose the depth-value range of the objects in the field of view is [d_min, d_max]; the reference distance is defined as:

db = (d_max − d_min) / 2   Formula 10

Suppose the color of a point at the reference distance is the standard value. Then, under the same illumination intensity, the farther a point beyond the reference distance is from the lens, the more its color decays and the more its value must be strengthened; conversely, the closer a point within the reference distance is to the lens, the more its color value must be weakened. A weight function is therefore set to standardize the color data using the depth data:

q(x, y) = (D′(x, y) − db)³ / 10⁶ + 1   Formula 11
The color features after color-value standardization are defined as:

R mean: Rm = Σ_{x=1}^{a} Σ_{y=1}^{b} C(x, y, 1)·q(x, y) / (a × b)   Formula 12

G mean: Gm = Σ_{x=1}^{a} Σ_{y=1}^{b} C(x, y, 2)·q(x, y) / (a × b)   Formula 13

B mean: Bm = Σ_{x=1}^{a} Σ_{y=1}^{b} C(x, y, 3)·q(x, y) / (a × b)   Formula 14

From the gray matrix Gray obtained in step 2, the gray mean is computed:

Graym = Σ_{x=1}^{a} Σ_{y=1}^{b} Gray(x, y)·q(x, y) / (a × b)   Formula 15
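Formulas 10–15 can be transcribed compactly. The sketch below assumes numpy, a float color array c of shape a × b × 3 and a depth matrix d of shape a × b, and takes db = (d_max − d_min)/2 verbatim from Formula 10:

```python
import numpy as np

def color_features(c, d):
    """Formulas 10-15: weight q grows with distance beyond the reference
    distance db to compensate illumination decay; weighted means of
    R, G, B and gray follow."""
    db = (d.max() - d.min()) / 2.0                    # Formula 10, as written
    q = (d - db) ** 3 / 1e6 + 1                       # Formula 11
    ab = d.size                                       # a * b
    rm = (c[..., 0] * q).sum() / ab                   # Formula 12
    gm = (c[..., 1] * q).sum() / ab                   # Formula 13
    bm = (c[..., 2] * q).sum() / ab                   # Formula 14
    gray = 0.2989 * c[..., 0] + 0.5870 * c[..., 1] + 0.1140 * c[..., 2]
    graym = (gray * q).sum() / ab                     # Formula 15
    return rm, gm, bm, graym
```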
Step 2: extract area features;
From the gray matrix Gray obtained in step 2, compute the gray-level distribution Gs of one frame, where Gs(g) denotes the number of pixels satisfying Gray(x, y) = g, g ∈ [0, 255].

Starting from e = 0, take e = 0, e = 1, …, e = e′, e′ ∈ (0, 255), until at e = e′:

Σ_{g∈[0, e′]∪[255−e′, 255]} Gs(g) / (a × b) > 0.01   Formula 16

The contrast-enhanced gray matrix Gray′ is then obtained; its gray value at point (x, y) is:

Gray′(x, y) = 0, for Gray(x, y) ∈ (0, e′);
Gray′(x, y) = (255 / (255 − 2e′)) · (Gray(x, y) − e′), for Gray(x, y) ∈ (e′, 255 − e′);
Gray′(x, y) = 255, for Gray(x, y) ∈ (255 − e′, 255)   Formula 17
The watershed algorithm is adopted to segment the matrix Gray′, obtaining the binary foam-region matrix A, whose value at point (x, y) is denoted A(x, y); foam-region points and boundary points are distinguished by the following formula:
formula 18
Traverse all pixels of one frame and number the foam regions, obtaining the label information of the marked regions:

Ps_n: {(x1, y1), (x2, y2), …, (x_Area(n), y_Area(n))}   Formula 19

where Ps_n denotes the set of all pixels of the n-th foam region, n = 1, 2, …, N, N being the total number of foam regions, and Area(n) denotes the number of pixels in the n-th foam region, i.e. the region area. Counting the areas of all N foam regions gives the area distribution Area: {Area(1), Area(2), … Area(n), … Area(N)}, and the total area of the N foam regions of one frame is:

SArea = Σ_{n=1}^{N} Area(n)   Formula 20
Rearranging the N area values of the distribution Area in ascending order gives Area′: {Area′(1), Area′(2), … Area′(n), … Area′(N)}, where Area′(1) is the minimum foam area and Area′(N) the maximum;

Starting from f1 = 1, take f1 = 1, f1 = 2, …, f1 = f1′, f1′ ∈ (1, N), until at f1 = f1′:

Σ_{n∈[1, f1′]} Area′(n) / SArea > 0.2   Formula 21

The foam area feature is then defined as:

Aream = Σ_{n=f1′+1}^{N} Area′(n) / (N − f1′ − 1)   Formula 22
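The truncated mean of Formulas 21–22 (discard the smallest regions until their cumulative share of the total area exceeds 0.2, then average the rest) can be sketched as below; numpy and the 1-based index bookkeeping are assumptions, and the denominator N − f1′ − 1 is kept verbatim from Formula 22:

```python
import numpy as np

def area_feature(areas, frac=0.2):
    """Formulas 21-22: find the smallest 1-based count f1' of ascending
    areas whose cumulative share of the total exceeds `frac`, then average
    the remaining large regions (denominator N - f1' - 1 as written, which
    can degenerate to zero for extreme inputs)."""
    a = np.sort(np.asarray(areas, dtype=float))
    csum = np.cumsum(a) / a.sum()
    f1 = int(np.searchsorted(csum, frac, side='right')) + 1  # 1-based f1'
    return a[f1:].sum() / (len(a) - f1 - 1)
```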
Step 3: extract volume characteristic;
For the binary region matrix A and region labels Ps_n of one frame, the (p+q)-order regular moment of the n-th foam region is defined as:

M_n(p, q) = Σ_{(x,y)∈Ps(n)} x^p · y^q · A(x, y)   Formula 23

The region centroid of the n-th foam region is obtained from its first-order regular moments:

C_n: (x̄_n, ȳ_n) = ( M_n(1, 0) / M_n(0, 0), M_n(0, 1) / M_n(0, 0) )   Formula 24

and the (p+q)-order central moment of one frame is:

U_n(p, q) = Σ_{(x,y)∈Ps(n)} (x − x̄_n)^p · (y − ȳ_n)^q · A(x, y)   Formula 25
If only the set of second-order central moments of one frame is considered, the foam region can be approximated by an ellipse centred at the region centroid, with:

CB_n = ( ( U_n(2,0) + U_n(0,2) + [ (U_n(2,0) − U_n(0,2))² + 4·U_n(1,1)² ]^{1/2} ) / ( U_n(0,0) / 2 ) )^{1/2}   Formula 26

DB_n = ( ( U_n(2,0) + U_n(0,2) − [ (U_n(2,0) − U_n(0,2))² + 4·U_n(1,1)² ]^{1/2} ) / ( U_n(0,0) / 2 ) )^{1/2}   Formula 27

θ_n = (1/2) · arctan( 2·U_n(1,1) / (U_n(2,0) − U_n(0,2)) )   Formula 28

where CB_n is the semi-major axis of the approximating ellipse, DB_n the semi-minor axis and θ_n the inclination angle;
The pixel coordinates of the endpoints of the ellipse's major (and, likewise, minor) axis, c1(u_n1, v_n1) and c2(u_n2, v_n2), are then:

u_n1 = x̄_n − CB_n·cos(θ_n), v_n1 = ȳ_n − CB_n·sin(θ_n)   Formula 29

u_n2 = x̄_n + CB_n·cos(θ_n), v_n2 = ȳ_n + CB_n·sin(θ_n)   Formula 30

Combining the depth matrix D′ obtained in step two, the depth values corresponding to the major-axis endpoints and the region centroid, l_n1, l_n2, l_nc, are:

l_n1 = D′(u_n1, v_n1)
l_n2 = D′(u_n2, v_n2)   Formula 31
l_nc = D′(x̄_n, ȳ_n)
Supposing the foam can be approximated by a standard spheroid, the real-space foam height h_n is defined as:

h_n = l_nc − min(l_n1, l_n2)   Formula 32
Combining the view parameters of the kinect sensor, horizontal view angle δ1 and vertical view angle δ2, the three-dimensional space coordinates C_n1 and C_n2 corresponding to the major-axis endpoints are:

C_n1: ( δ1·[2·u_n1·L + (a − 1)(L − l_n1)], δ2·[2·(b − v_n1 − 1)·L + (b − 1)(L − l_n1)], L − l_n1 )   Formula 33

C_n2: ( δ1·[2·u_n2·L + (a − 1)(L − l_n2)], δ2·[2·(b − v_n2 − 1)·L + (b − 1)(L − l_n2)], L − l_n2 )   Formula 34

where L denotes the distance from the depth camera to the reference image plane of space coordinate z = 0;

The real length of the major axis is then:

CR_n = |C_n1C_n2| = √( δ1²·[2L(u_n1 − u_n2) − (a − 1)(l_n1 − l_n2)]² + δ2²·[2L(v_n1 − v_n2) + (b − 1)(l_n1 − l_n2)]² + (l_n1 − l_n2)² )   Formula 35

Supposing the depth values at the two major-axis endpoints are equal, the real major-axis length simplifies to:

CR_n = 2L·√( δ1²·(u_n1 − u_n2)² + δ2²·(v_n1 − v_n2)² )   Formula 36

The real minor-axis length DR_n is obtained in the same way;
Finally, the volume of the n-th foam bubble is obtained (supposing the foam is a standard spheroid):

V(n) = (4/3)·π·( (CR_n / 2) × (DR_n / 2) × h_n )   Formula 37
Following the same approach as for the area feature of one frame in sub-step 2 of step 3, the N region volume values Vs: {Vs(1), Vs(2), … Vs(n), … Vs(N)} are rearranged in ascending order into Vs′: {Vs′(1), Vs′(2), … Vs′(n), … Vs′(N)};

Starting from f2 = 1, take f2 = 1, f2 = 2, …, f2 = f2′, f2′ ∈ (1, N), until at f2 = f2′:

Σ_{n∈[1, f2′]} Vs′(n) / Σ_{n∈[1, N]} Vs′(n) > 0.2   Formula 38

The foam volume feature is then defined as:

Vm = Σ_{n=f2′+1}^{N} Vs′(n) / (N − f2′ − 1)   Formula 39
Step 4: extraction rate feature;
(1) Harris Corner Detection;
Suppose a point (x, y) in the enhanced gray matrix Gray′ is a corner; the mean intensity (gray-value) change of all points in its neighborhood Ω2 along an offset direction (Vx, Vy) is:

I = Σ_{(i2, j2)∈Ω2} (Gray′(i2 + Vx, j2 + Vy) − Gray′(i2, j2))²   Formula 40

where i2 ∈ [x − r2, x + r2], j2 ∈ [y − r2, y + r2], r2 being the radius of the neighborhood Ω2;

To evaluate the intensity change of corner (x, y) averaged over all offset directions, apply a Taylor series expansion to Formula 40 and omit the higher-order terms, giving the matrix form:

I ≈ [Vx Vy] [ Σ(∂Gray′/∂x)²            Σ(∂Gray′/∂x)(∂Gray′/∂y) ;
              Σ(∂Gray′/∂x)(∂Gray′/∂y)  Σ(∂Gray′/∂y)²           ] [Vx; Vy] = [Vx Vy] Q [Vx; Vy]   Formula 41

where Q is the covariance matrix; its two eigenvalues represent the maximum mean intensity change at the point over all offset directions and the mean intensity change in the perpendicular direction; the score function is defined as:

G = Det(Q) − k·Trace²(Q)   Formula 42

where Det(Q), the determinant of the matrix, equals the product of the two eigenvalues, and Trace(Q), the trace of the matrix, equals their sum; the score is high only when both eigenvalues are large;

Define a qualifying score G_r; traversing all points in the neighborhood Ω2 of corner (x, y), take the h1 points whose score exceeds G_r and sort them by score from high to low to obtain the point set Hs1: {(x1, y1), (x2, y2) … (x_h1, y_h1)};
Apply local gray-value extremum processing to the enhanced gray matrix Gray′:

Graymax(x, y) = max(Gray′(i3, j3)), (i3, j3) ∈ Ω3
Graymin(x, y) = min(Gray′(i3, j3)), (i3, j3) ∈ Ω3   Formula 43

where Graymax(x, y) is the value at point (x, y) of the gray matrix after local maximization, Graymin(x, y) is the value at point (x, y) after local minimization, and Ω3 is the neighborhood of point (x, y) with radius r3, i3 ∈ [x − r3, x + r3], j3 ∈ [y − r3, y + r3];

The local gray-value extremum points are determined by the following formulas:

formula 44

formula 45

From the point set Hs1, keep the points satisfying the local gray extremum condition, giving the point set Hs2 of h2 points: {(x1, y1), (x2, y2) … (x_h2, y_h2)};

Finally, apply to the depth matrix D′ the same local-extremum operation used for the gray matrix Gray′ to obtain the local depth-extremum points; from the point set Hs2, keep the points satisfying the local depth extremum condition, giving the corner point set Hs3 of h3 points: {(x1, y1), (x2, y2) … (x_h3, y_h3)};
(2) Track the points in the corner set Hs3 determined by the three screenings above, and extract the velocity feature;

Suppose the same corner (x_h, y_h) keeps constant intensity across two successive frames, that is:

Gray′_t(x_h, y_h) = Gray′_{t+Vt}(x_h + Vu, y_h + Vv)   Formula 46

where Gray′_t(x_h, y_h) is the gray value at point (x_h, y_h) of the enhanced gray matrix at time t (previous frame), Gray′_{t+Vt}(x_h + Vu, y_h + Vv) is the gray value at point (x_h + Vu, y_h + Vv) at time t + Vt (next frame), (Vu, Vv) is the offset of corner (x_h, y_h) over the time Vt, Vt is the interval between adjacent frames, and h = 1, 2, …, h3;

The basic optical flow constraint equation then holds:

(∂Gray′/∂x_h)·Vu + (∂Gray′/∂y_h)·Vv = −∂Gray′/∂t   Formula 47

Suppose corner (x_h, y_h) and all points in its neighborhood Ω4 (radius r4) share the same offset; substituting the corner and all (2r4 + 1)² points of Ω4 into the optical flow constraint equation yields more equations than unknowns (the two unknowns Vu, Vv); solving iteratively gives the offset (Vu, Vv) of corner (x_h, y_h), whose displacement magnitude is:

Vw = √(Vu² + Vv²)   Formula 48

Solving iteratively for all detected corners, the average displacement is:

w = (Σ_{i=1}^{h3} Vw_i) / h3   Formula 49

With adjacent frame interval Vt, the foam velocity feature is defined as:

Rate = w / Vt   Formula 50
During tracking, corners that no longer need to be tracked must be continually deleted, in particular corners whose displacement Vw is very small and corners that move out of the field of view; when, after such deletions, fewer corners remain than a predefined threshold, corners must be re-detected by the preceding steps and tracking resumed;
Step 5: extract the breakage rate feature;

From the depth matrix D′ obtained in step 3, in one-to-one correspondence with the region label point sets Ps_n of one frame, obtain the set PD_n of depth values of the pixels of the n-th foam region: (d1, d2, … d_Area(n)); take the minimum depth value d_min; form the heights of the pixels of the foam region PH_n: (d1 − d_min, d2 − d_min, … d_Area(n) − d_min); and sum the heights:

PH_nsum = Σ PH_n   Formula 51

For the next frame, read the depth data PD′_n: (d′1, d′2, … d′_Area(n)) corresponding to the previous frame's point set Ps_n, form the pixel heights PH′_n: (d′1 − d_min, d′2 − d_min, … d′_Area(n) − d_min), and sum them:

PH′_nsum = Σ PH′_n   Formula 52

When PH′_nsum / PH_nsum < 0.75, the bubble labeled n is considered burst; processing all N regions in the same way and counting BS burst bubbles in total, the bubble breakage rate is:

Break = BS / N   Formula 53.
5. the froth flotation level monitoring based on depth information according to claim 4 and operating mode's switch method, it is characterized in that, described step 4 comprises following sub-step:
Step 1: the matrix of depths D ' obtained step 2, extracts foam top layer liquid level feature: average level L a, liquid level smoothness L u, minimum liquid level L l, the highest liquid level L h;
For foam top layer point (x, y) in kinect visual field, meet:
H h=H t+ H kformula 54
Wherein H kfor mineral pulp level, suppose that mineral syrup liquid is parallel with flotation cell bottom surface, H trepresent certain point (x, y) place froth bed thickness, namely in visual field, certain point arrives the distance of mineral syrup liquid, H kbe an important indicator of flotation site operating mode's switch, but be difficult to because mineral syrup liquid is covered by froth bed measure, for this reason, by the foam top layer distance bottom land distance H a little of institute in visual field h=D ' (x, y), obtains the foam top layer liquid level feature relevant to mineral pulp level, carries out operating mode's switch;
A large amount of off-line data is used to fit a distribution to the data in the depth matrix D′; the result is that they follow a normal distribution, whose parameters are estimated by the point-estimation method:
Mean of the normal distribution: μ = Σ_{(x,y)=(1,1)}^{(a,b)} D′(x, y) / (a × b)   (Formula 55)
Standard deviation of the normal distribution: σ = sqrt( Σ_{(x,y)=(1,1)}^{(a,b)} (D′(x, y) − μ)² / (a × b) )   (Formula 56)
Define the four froth-surface level features:
Average level: L_a = μ   (Formula 57)
Level smoothness: L_u = σ   (Formula 58)
Minimum level: L_l = μ − 3σ   (Formula 59)
Maximum level: L_h = μ + 3σ   (Formula 60)
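A minimal numerical sketch of Formulas 55–60. The function name and the dictionary return type are illustrative; Formula 56 is taken here as the population standard deviation over all a × b entries of D′:

```python
import numpy as np

def level_features(depth_matrix):
    """Point estimates of the normal-distribution parameters of D'
    (Formulas 55-56) and the four froth-surface level features
    (Formulas 57-60). depth_matrix is the a x b depth matrix D'."""
    mu = depth_matrix.mean()                            # Formula 55
    sigma = np.sqrt(((depth_matrix - mu) ** 2).mean())  # Formula 56
    return {
        "L_a": mu,               # average level, Formula 57
        "L_u": sigma,            # level smoothness, Formula 58
        "L_l": mu - 3 * sigma,   # minimum level, Formula 59
        "L_h": mu + 3 * sigma,   # maximum level, Formula 60
    }
```
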
Step 2: Through statistical analysis, obtain the relationship between the froth-surface level features and the overflow-launder lip height H_y:
(Formula 61)
Step 3: Extract the current froth-surface level features in real time and apply Formula 61 to determine the working-condition class that they reflect; only when this class persists for more than 10 minutes is the recognition result taken to be valid.
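The 10-minute persistence rule in Step 3 can be sketched as a simple hold-off check. The function name, the timestamped-history representation and the 600 s default are assumptions; the claim only states that the class must persist for more than 10 minutes:

```python
def confirmed_condition(history, hold_seconds=600):
    """Return the recognised class once it has persisted for at least
    `hold_seconds`; otherwise return None.

    history: list of (timestamp_in_seconds, class_id) samples, oldest
    first, produced by the per-frame recognition of Step 3.
    """
    if not history:
        return None
    t_now, current = history[-1]
    for t, c in reversed(history):
        if c != current:
            return None          # the class changed inside the window
        if t_now - t >= hold_seconds:
            return current       # same class for the whole hold period
    return None                  # history does not yet span the window
```
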
6. The Kinect-based froth flotation level monitoring and working-condition recognition method according to claim 5, characterized in that said Step 5 comprises the following sub-steps:
Step 1: Form the feature vector ds from the eight froth features extracted in Step 3:
ds = [Graym, Rm, Gm, Bm, Aream, Vm, Rate, Break]   (Formula 62)
Step 2: Obtain the cluster centres off-line.
From the monitoring records of a period of time, sample g groups to obtain the feature data set:
DS = [ds_1; ds_2; …; ds_g] =
[Graym_1 Rm_1 Gm_1 Bm_1 Aream_1 Vm_1 Rate_1 Break_1;
 Graym_2 Rm_2 Gm_2 Bm_2 Aream_2 Vm_2 Rate_2 Break_2;
 …;
 Graym_g Rm_g Gm_g Bm_g Aream_g Vm_g Rate_g Break_g]   (Formula 63)
Traverse the cluster number k, k ∈ {1, 2, …, g}; for each value of k, perform K-means clustering; define an evaluation function and determine the optimal cluster number kb, obtaining kb cluster centres [cc_1, cc_2, …, cc_kb], which form the cluster-centre set KB:
KB = [cc_1; cc_2; …; cc_kb] =
[Graym_c1 Rm_c1 Gm_c1 Bm_c1 Aream_c1 Vm_c1 Rate_c1 Break_c1;
 Graym_c2 Rm_c2 Gm_c2 Bm_c2 Aream_c2 Vm_c2 Rate_c2 Break_c2;
 …;
 Graym_ckb Rm_ckb Gm_ckb Bm_ckb Aream_ckb Vm_ckb Rate_ckb Break_ckb]   (Formula 64)
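The off-line clustering of Step 2 can be sketched as follows. The claim does not name its evaluation function; the between/within scatter ratio (Calinski-Harabasz style) used here to pick kb is an assumption, as are the function names and the restriction of k to 2…g−1, where that ratio is defined:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's-algorithm K-means on the g x 8 feature matrix DS."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each sample to its nearest centre
        labels = np.argmin(((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)  # recompute centre
    return centres, labels

def ch_score(X, centres, labels):
    """Between/within scatter ratio used as the (assumed) evaluation function."""
    k, g = len(centres), len(X)
    overall = X.mean(axis=0)
    between = sum((labels == j).sum() * ((centres[j] - overall) ** 2).sum()
                  for j in range(k))
    within = sum(((X[labels == j] - centres[j]) ** 2).sum() for j in range(k))
    return np.inf if within == 0 else (between / (k - 1)) / (within / (g - k))

def cluster_centre_set(DS):
    """Traverse k, keep the best-scoring clustering, return KB (Formula 64)."""
    best_score, best_centres = -np.inf, None
    for k in range(2, len(DS)):
        centres, labels = kmeans(DS, k)
        score = ch_score(DS, centres, labels)
        if score > best_score:
            best_score, best_centres = score, centres
    return best_centres
```
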
Step 3: On-line working-condition recognition.
With the Kinect-based froth-flotation condition-monitoring system built in Step 1, acquire the froth colour and depth data in real time and extract the real-time froth-image feature vector as in Step 3:
cc_current = [Graym_current, Rm_current, Gm_current, Bm_current, Aream_current, Vm_current, Rate_current, Break_current]   (Formula 65)
Define the Euclidean distance between the real-time feature and each cluster-centre feature as the feature similarity:
KN = [|cc_1 − cc_current|; |cc_2 − cc_current|; …; |cc_kb − cc_current|]   (Formula 66)
where |cc_kb − cc_current| denotes the modulus of the vector (cc_kb − cc_current), that is:
|cc_kb − cc_current| = sqrt( (Graym_ckb − Graym_current)² + (Rm_ckb − Rm_current)² + … + (Break_ckb − Break_current)² )   (Formula 67)
Choose the working condition corresponding to the minimum value of the KN matrix as the current working condition, completing working-condition recognition.
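The nearest-centre matching of Formulas 65–67 reduces to an argmin over Euclidean distances; a short sketch (the function name and the row-per-condition layout of KB are assumptions):

```python
import numpy as np

def recognise_condition(KB, cc_current):
    """Return (index of the matched working condition, distance vector KN).

    KB: kb x 8 cluster-centre matrix (Formula 64), one row per condition.
    cc_current: the real-time 8-dimensional feature vector (Formula 65).
    """
    KN = np.linalg.norm(KB - cc_current, axis=1)  # Formulas 66-67
    return int(np.argmin(KN)), KN
```
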
7. A froth flotation level monitoring and working-condition recognition system based on depth information, characterized in that it comprises a Kinect sensor, a high-frequency light source and a computer. The Kinect sensor is fixed within a range of 1.2 m to 3.5 m above the liquid level of the flotation cell, with the plane of its cameras parallel to the bottom of the cell. The high-frequency light source provides illumination for the Kinect sensor's colour-data acquisition; the collected colour and depth data streams are sent to the computer through a USB data cable, and the computer carries out data processing, real-time condition monitoring and display of the results. The software system, built on the cross-platform framework OpenNI, adopts an interactive programming interface to realize real-time acquisition, processing, saving and feature extraction of the colour and depth data streams, as well as working-condition recognition.
8. The depth-information-based froth flotation level monitoring and working-condition recognition system according to claim 7, characterized in that the real-time acquisition, processing, saving and feature extraction of the colour and depth data streams and the working-condition recognition proceed as follows: collect, process and save the flotation-froth colour data and depth data obtained by the Kinect; perform feature extraction on the colour and depth data, combining them to extract stereoscopic froth features such as colour, area, volume, speed and collapse rate; apply distribution fitting to the depth data for statistical analysis; obtain the froth-surface level features by parameter estimation; then obtain the relationship between the froth-surface level features and the overflow-launder lip height, for on-line monitoring of the froth-flotation level and working-condition recognition; form the feature vector from the froth features obtained and perform off-line cluster analysis with an improved k-means algorithm to obtain several cluster centres; extract the froth features of the flotation site in real time and match them against the cluster centres in real time, obtaining the current flotation working condition on-line.
CN201410699401.0A 2014-11-27 2014-11-27 Froth flotation level monitoring and operating mode's switch method and system based on depth information Active CN104408724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410699401.0A CN104408724B (en) 2014-11-27 2014-11-27 Froth flotation level monitoring and operating mode's switch method and system based on depth information

Publications (2)

Publication Number Publication Date
CN104408724A true CN104408724A (en) 2015-03-11
CN104408724B CN104408724B (en) 2017-12-01

Family

ID=52646353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410699401.0A Active CN104408724B (en) 2014-11-27 2014-11-27 Froth flotation level monitoring and operating mode's switch method and system based on depth information

Country Status (1)

Country Link
CN (1) CN104408724B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334844A (en) * 2008-07-18 2008-12-31 中南大学 Critical characteristic extraction method for flotation foam image analysis
WO2013170296A1 (en) * 2012-05-14 2013-11-21 Technological Resources Pty. Limited Controlling froth flotation
CN103971379A (en) * 2014-05-30 2014-08-06 中南大学 Single-vidicon equivalent binocular stereoscopic vision model based foam stereoscopic feature extraction method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GUI Weihua et al.: "Research progress of machine-vision-based monitoring technology for the mineral flotation process", Acta Automatica Sinica *
ZHAO Hongwei et al.: "Intelligent optimal setting method for flotation cell pulp level based on froth image features", Acta Automatica Sinica *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105306909A (en) * 2015-11-20 2016-02-03 中国矿业大学(北京) Vision-based coal mine underground worker overcrowding alarm system
CN105306909B (en) * 2015-11-20 2018-04-03 中国矿业大学(北京) The overcrowding warning system of coal mine underground operators of view-based access control model
CN105488816B (en) * 2015-11-27 2018-09-11 中南大学 A kind of mineral floating foam flow velocity on-line measuring device and method based on 3D vision information
CN105430339A (en) * 2015-11-27 2016-03-23 中南大学 Embedded mineral flotation froth three-dimensional image monitoring device based on ARM (Advanced RISC Machine) and Kinect
CN105488816A (en) * 2015-11-27 2016-04-13 中南大学 On-line detection device and method of mineral flotation froth flow velocity on the basis of three-dimensional visual information
CN105430339B (en) * 2015-11-27 2018-04-06 中南大学 A kind of embedded mineral floating foam 3-D image monitoring device based on ARM+Kinect
CN105823704A (en) * 2016-03-21 2016-08-03 东莞美维电路有限公司 Method for testing PCB adhesive removing uniformity
CN107067431A (en) * 2017-01-16 2017-08-18 河海大学常州校区 A kind of object volume computational methods based on Kinect
CN107067431B (en) * 2017-01-16 2020-07-03 河海大学常州校区 Kinect-based object volume calculation method
CN108413864A (en) * 2017-02-10 2018-08-17 菜鸟智能物流控股有限公司 Object size measuring method and related equipment
CN108413864B (en) * 2017-02-10 2020-07-17 菜鸟智能物流控股有限公司 Object size measuring method and related equipment
CN107122754A (en) * 2017-05-09 2017-09-01 苏州迪凯尔医疗科技有限公司 Posture identification method and device
CN109410248A (en) * 2018-10-23 2019-03-01 湖南科技大学 A kind of flotation froth motion feature extracting method based on r-K algorithm
CN109410248B (en) * 2018-10-23 2021-07-20 湖南科技大学 Flotation froth motion characteristic extraction method based on r-K algorithm
CN109371633A (en) * 2018-11-29 2019-02-22 余姚市朗硕电器科技有限公司 Foam spreads condition detecting system
CN109772593A (en) * 2019-01-25 2019-05-21 东北大学 A kind of mineral pulp level prediction technique based on flotation froth behavioral characteristics
CN110245601A (en) * 2019-06-11 2019-09-17 Oppo广东移动通信有限公司 Eyeball tracking method and Related product
CN110288260A (en) * 2019-07-02 2019-09-27 太原理工大学 Coal slime flotation additive amount of medicament evaluation method based on semi-supervised clustering
CN110288260B (en) * 2019-07-02 2022-04-22 太原理工大学 Coal slime flotation reagent addition amount evaluation method based on semi-supervised clustering
CN110728253B (en) * 2019-07-22 2021-03-02 中南大学 Texture feature measurement method based on particle roughness
CN110728253A (en) * 2019-07-22 2020-01-24 中南大学 Texture feature measurement method based on particle roughness
CN111144221A (en) * 2019-11-29 2020-05-12 光大环境科技(中国)有限公司 Method and device for evaluating operation condition of aeration tank through image recognition technology
CN113436162A (en) * 2021-06-23 2021-09-24 湖南国天电子科技有限公司 Method and device for identifying weld defects on surface of hydraulic oil pipeline of underwater robot
CN113436162B (en) * 2021-06-23 2022-12-09 湖南国天电子科技有限公司 Method and device for identifying weld defects on surface of hydraulic oil pipeline of underwater robot
CN113591654A (en) * 2021-07-22 2021-11-02 中南大学 Zinc flotation working condition identification method based on long-term depth characteristics
CN113591654B (en) * 2021-07-22 2023-09-01 中南大学 Zinc flotation working condition identification method based on long-time-range depth characteristics
CN114519715A (en) * 2022-04-21 2022-05-20 矿冶科技集团有限公司 Foam overflow amount detection method, device, electronic device and medium
CN114988567A (en) * 2022-07-15 2022-09-02 南通仁源节能环保科技有限公司 Sewage treatment method and system based on activated sludge foam
CN117252826A (en) * 2023-09-12 2023-12-19 山东神力索具有限公司 Visual technology-based method for detecting cooling sewage of steel caps containing graphite die forging
CN117252826B (en) * 2023-09-12 2024-03-12 山东神力索具有限公司 Visual technology-based method for detecting cooling sewage of steel caps containing graphite die forging

Also Published As

Publication number Publication date
CN104408724B (en) 2017-12-01

Similar Documents

Publication Publication Date Title
CN104408724A (en) Depth information method and system for monitoring liquid level and recognizing working condition of foam flotation
CN110175576B (en) Driving vehicle visual detection method combining laser point cloud data
CN105260699B (en) A kind of processing method and processing device of lane line data
CN113160192B (en) Visual sense-based snow pressing vehicle appearance defect detection method and device under complex background
CN102324030B (en) Target tracking method and system based on image block characteristics
CN107392232B (en) Flotation working condition classification method and system
CN105574543B (en) A kind of vehicle brand type identifier method and system based on deep learning
CN111784657A (en) Digital image-based system and method for automatically identifying cement pavement diseases
CN110379168B (en) Traffic vehicle information acquisition method based on Mask R-CNN
CN102542289A (en) Pedestrian volume statistical method based on plurality of Gaussian counting models
CN103871062B (en) A kind of lunar surface rock detection method described based on super-pixel
CN103106659A (en) Open area target detection and tracking method based on binocular vision sparse point matching
CN105869178A (en) Method for unsupervised segmentation of complex targets from dynamic scene based on multi-scale combination feature convex optimization
CN103324936A (en) Vehicle lower boundary detection method based on multi-sensor fusion
CN104616006B (en) A kind of beard method for detecting human face towards monitor video
CN102073846A (en) Method for acquiring traffic information based on aerial images
CN107705283A (en) Particle and bubble hit detection method based on Otsu image segmentation
CN103020970A (en) Corn ear image grain segmentation method
CN102704215A (en) Automatic cutting method of embroidery cloth based on combination of DST file parsing and machine vision
CN106446785A (en) Passable road detection method based on binocular vision
CN104992429A (en) Mountain crack detection method based on image local reinforcement
CN102393902A (en) Vehicle color detection method based on H_S two-dimensional histogram and regional color matching
CN107314957A (en) A kind of measuring method of rock fragmentation Size Distribution
CN103198479A (en) SAR image segmentation method based on semantic information classification
CN105512618A (en) Video tracking method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant