WO2019049060A1 - Froth segmentation in flotation cells - Google Patents
- Publication number
- WO2019049060A1 (application PCT/IB2018/056802)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- bubble
- froth
- segmentation
- dnn
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B03—SEPARATION OF SOLID MATERIALS USING LIQUIDS OR USING PNEUMATIC TABLES OR JIGS; MAGNETIC OR ELECTROSTATIC SEPARATION OF SOLID MATERIALS FROM SOLID MATERIALS OR FLUIDS; SEPARATION BY HIGH-VOLTAGE ELECTRIC FIELDS
- B03D—FLOTATION; DIFFERENTIAL SEDIMENTATION
- B03D1/00—Flotation
- B03D1/02—Froth-flotation processes
- B03D1/028—Control and monitoring of flotation processes; computer models therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
- G06F18/24143—Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
Description
- This invention relates to a method, apparatus and system for analysing the froth phase of a flotation cell. More specifically, but not exclusively, the invention relates to the real-time measuring of the froth phase of a flotation cell with a view to estimating and/or predicting grade and recovery of the flotation process.
- Froth flotation is a process for separating minerals from gangue by taking advantage of differences in their hydrophobicity. Hydrophobicity differences between valuable minerals and waste gangue are increased through the use of surfactants and wetting agents.
- The flotation process typically takes place in an open cell and consists of the pulp phase, which can be described as the 'reactor', and the froth phase, which can be termed the 'separator'.
- In the pulp phase, hydrophobic particles preferentially attach to rising air bubbles, which form a froth at the top of the pulp phase and are recovered as concentrate.
- Sub-processes such as bubble-particle collision, attachment, detachment and entrainment dominate. These sub-processes have the overall effect of transporting particles, mostly hydrophobic particles, to the froth phase.
- The process of froth formation and transport determines the kind of sub-processes that take place.
- Froth phase sub-processes such as the thinning of bubble films, bubble coalescence and froth drainage result in an increase in bubble sizes and in particles detaching from bubbles and draining back into the pulp phase.
- Froth phase sub-processes may lead to a cleaning/separating action if there is preferential re-attachment of the draining particles to the available bubble surface area. This cleaning action determines the overall grade and recovery of the flotation process.
- The initial separating action of the froth phase starts at the pulp-froth interface. Having an accurate model of the flotation process and obtaining real-time measurements of the froth phase can aid in predicting the grade and recovery of the flotation cell as a whole. This in turn is useful for plant optimisation.
- One conventional way of measuring characteristics of the froth phase of a flotation cell is to place a camera above the surface of the froth phase which is configured to capture a video stream of the froth surface.
- Image analysis techniques are then used to calculate, amongst others, froth velocity, height, stability and bubble size.
- A conventional technique for calculating bubble size from images involves using a so-called "watershedding" algorithm to identify the boundaries of individual bubbles on the froth phase surface.
- Various bubble size metrics may then be calculated from these measurements including mean bubble size and complete bubble size distribution.
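The conventional watershed approach referred to above can be sketched as follows. This is an illustrative toy example on synthetic data, not the specific commercial implementation the disclosure mentions; the image, seed positions and thresholds are all invented for the demonstration.

```python
import numpy as np
from scipy import ndimage

# Synthetic stand-in for a froth image: two overlapping bright "bubbles".
yy, xx = np.mgrid[0:40, 0:40]
img = np.exp(-((yy - 14) ** 2 + (xx - 20) ** 2) / 80.0)
img += np.exp(-((yy - 26) ** 2 + (xx - 20) ** 2) / 80.0)

# Foreground mask and distance transform: bubble centres become peaks.
mask = img > 0.2
dist = ndimage.distance_transform_edt(mask)

# Seed markers: one per bubble apex, plus a background label.
markers = np.zeros(img.shape, dtype=np.int16)
markers[14, 20] = 1
markers[26, 20] = 2
markers[~mask] = 3

# Watershed on the inverted distance map: catchment basins grow from the
# seeds and meet along the ridge line, i.e. the boundary between bubbles.
inverted = (dist.max() - dist).astype(np.uint16)
labels = ndimage.watershed_ift(inverted, markers)
```

The ridge where the two basins meet corresponds to the bubble boundary that the watershed technique extracts from a froth image.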
- A computer-implemented method of generating a bubble segmentation image from a digital froth image of a froth phase of a flotation cell is provided, comprising:
- applying one or more deep learning networks to the froth image, the deep learning networks having been trained with one or more datasets of training images of labelled, predetermined bubble segmentation images to learn to identify features useful for identifying bubble boundaries automatically; and
- creating the bubble segmentation image utilising the deep learning networks, so that the bubble segmentation image includes identified boundary data representing boundaries between bubbles present in the froth image.
- Further features provide for the step of applying the deep learning networks to the froth image to include applying one or more deep neural network (DNN) algorithms to the froth image; for the method to include the step of creating the dataset of training images; for the method to include the step of training the DNN algorithms using the dataset; and for the method to include the step of determining whether the identified bubble boundary data includes false positives, and optionally retraining the DNN architecture if it does.
- Still further features provide for the method to include the steps of: training the DNN algorithms with a dataset comprising a plurality of labelled, segmentation boundary training images generated utilising commercially available bubble/rock segmentation products; and/or training the DNN algorithms with a dataset comprising a plurality of training images simulated using computer animation software.
- The DNN may be a convolutional deep neural network (CNN), and the CNN may utilise the U-Net convolutional network architecture.
- A system for generating a bubble segmentation image from a froth image of a froth phase of a flotation cell is also provided, including a memory for storing computer-readable program code and a processor for executing the computer-readable program code, the system comprising:
- an image receiving component arranged to receive a digital froth image from an image capturing component;
- a bubble boundary identification component arranged to receive the digital froth image as input, identify bubble boundaries between bubbles in the digital froth image utilising one or more deep learning networks, create the bubble segmentation image and provide it as output, the deep learning networks having been trained with one or more datasets of training images of labelled, predetermined bubble segmentation images to learn to identify features useful for identifying bubble boundaries automatically;
- an image output component arranged to output the bubble segmentation image received from the bubble boundary identification component.
- The system may include an image capturing component, which may form part of a measuring unit arranged to be secured above a flotation cell.
- Further features provide for the one or more deep learning networks to be one or more deep neural network (DNN) algorithms, and for the system to include a training component arranged to train the DNN algorithms with training datasets comprising bubble segmentation training images generated utilising commercially available bubble/rock segmentation products and/or simulated using computer animation software.
- The DNN may be a convolutional deep neural network (CNN), and the CNN may utilise the U-Net convolutional network architecture.
- Figure 1 is a schematic diagram of a flotation cell installation in which a method and system for generating a bubble segmentation image from a froth image of a froth phase of the flotation cell may be utilised;
- Figure 2 is an image of a froth phase of a flotation cell;
- Figure 3 is a bubble segmentation image created utilising a method and system according to the disclosure;
- Figure 4 is an image showing a bubble segmentation image created utilising a method and system according to the disclosure, superimposed over a corresponding froth image;
- Figure 5 is a block diagram which illustrates components of an exemplary system used for implementing a method according to the disclosure.
- Figure 6 illustrates an example of a computing device in which various aspects of the disclosure may be implemented.
- Deep learning representations are loosely based on interpretations of information processing and communication patterns in a biological nervous system, such as neural coding, which attempts to define a relationship between various stimuli and the associated neuronal responses in the brain. Research attempts to create efficient systems that learn these representations from large-scale, unlabelled data sets.
- Deep learning architectures such as deep neural networks, deep belief networks and recurrent neural networks have been applied to fields including computer vision, speech recognition, natural language processing, audio recognition, social network filtering, machine translation and bioinformatics where they produced results comparable to and in some cases superior to human experts.
- Deep learning is a class of machine learning algorithms that learn multiple levels of representation of the data. The algorithms may be supervised or unsupervised, and applications include pattern analysis (unsupervised) and classification (supervised).
- Deep learning adds the assumption that these layers of factors correspond to levels of abstraction or composition. Varying numbers of layers and layer sizes can provide different amounts of abstraction. Deep learning exploits this idea of hierarchical explanatory factors where higher level, more abstract concepts are learned from the lower level ones. Deep learning architectures are often constructed with a greedy layer-by-layer method. Deep learning helps to disentangle these abstractions and pick out which features are useful for improving performance. For supervised learning tasks, deep learning methods obviate feature engineering, by translating the data into compact intermediate representations akin to principal components, and derive layered structures that remove redundancy in representation.
- Deep learning algorithms can be applied to unsupervised learning tasks. This is an important benefit because unlabelled data are more abundant than labelled data. Examples of deep structures that can be trained in an unsupervised manner are neural history compressors and deep belief networks.
- A deep neural network (DNN) is an artificial neural network (ANN) with multiple hidden layers between the input and output layers. Like shallow ANNs, DNNs can model complex non-linear relationships. DNN architectures generate compositional models where the object is expressed as a layered composition of primitives. The extra layers enable the composition of features from lower layers, potentially modelling complex data with fewer units than a similarly performing shallow network.
- Deep architectures include many variants of a few basic approaches. Each architecture has found success in specific domains. It is not always possible to compare the performance of multiple architectures, unless they have been evaluated on the same data sets.
- DNNs are typically feedforward networks in which data flows from the input layer to the output layer without looping back.
- Recurrent neural networks, in which data can flow in any direction, are used for applications such as language modelling. Long short-term memory is particularly effective for this use.
- The present disclosure presents a method and system for conducting such digital analysis and automatically generating a bubble segmentation image (as shown in Figure 3) from a froth image (as shown in Figure 2).
- The bubble segmentation image is essentially made up of only the boundaries between the bubbles visible in the froth image.
- Figure 1 is a schematic diagram which illustrates a flotation cell installation in which an exemplary system (100) for conducting such digital bubble segmentation analysis is used with a flotation cell (107).
- While the system (100) may be sold as a standalone unit capable of receiving a digital froth image captured by an external imaging device, for present purposes it will be described as part of a measuring unit (102) arranged to be positioned above the flotation cell (107).
- The system (100) includes a digital camera (101), which could of course be a video camera, positioned at the top of a sun tube (103) which forms part of a housing (105) of the measuring unit (102).
- The housing (105) is positioned over the flotation cell (107) with the lens of the camera (not shown) directed toward the top of the flotation cell.
- A froth phase (109) made up of a collection of bubbles will form at the top of the flotation cell.
- The top (111) of the froth phase defines the froth level and the bottom (113) of the froth phase represents the top of the pulp phase (115) of the flotation cell (107).
- A laser sensor or laser pointer (117) is provided in the sun tube (103) and is also directed at an angle towards the froth phase (109). While not an essential part of the present invention, it bears mention that the laser pointer (117) may be used to measure the height of the froth.
- The digital camera (101) is in data communication with a processor, which may of course be incorporated in a suitable computer infrastructure (121).
- The computer infrastructure (121) also includes a memory (125) for storing computer-readable program code, and the processor is arranged to execute the computer-readable program code.
- The computer-readable program code includes one or more deep learning network models which have been trained with one or more datasets of training images. It will be appreciated by those skilled in the art that the training images may include previously generated bubble segmentation images with boundary data of boundaries between bubbles already defined and added to the images.
- The deep learning network is a deep neural network (DNN) algorithm, more specifically a convolutional deep neural network (CNN) of the U-Net architecture, an architecture that has been applied in industry to self-driving cars and medical imaging.
- The U-Net receives a froth image as input and outputs the bubble segmentation image.
- A typical froth image used as input to the system (100) is shown in Figure 2, and a typical bubble segmentation image received as output from the DNN algorithm is shown in Figure 3.
- Figure 4 shows the bubble segmentation image superimposed over the froth image. It will be clear to those skilled in the art that the segmentation image may represent an accurate estimation of the outlines or boundaries of the bubbles in the froth image.
- The U-Net model consists of a series of convolutional layers arranged in a "bottleneck" formation. Images, which can be considered high-dimensional data points, are "squeezed" through a narrow information bottleneck before expanding back to the original dimensions at the output. However, instead of the output image being the same as the input image, the algorithm is configured to let the output image be the bubble segmentation boundaries. During training of the DNN, the model therefore learns to identify features useful for identifying the bubble boundaries.
- The U-Net model combines the high-level abstractions output by the successive convolutional layers with the lower-level features that are fed through using "skip connections" from earlier layers in the network. It will, however, be clear to those skilled in the art that alternative or additional deep learning architectures may be used in the disclosure, including, but not limited to, FusionNet and DenseNet.
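The bottleneck-and-skip structure described above can be illustrated with a toy, untrained shape walk-through. This is a schematic sketch only: a real U-Net interleaves learned convolutions at every level, which are omitted here, and the pooling/upsampling helpers are invented for the illustration.

```python
import numpy as np

def pool2(x):
    # 2x2 average pooling: halves the spatial resolution (the "squeeze").
    h, w, c = x.shape
    return x.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def up2(x):
    # Nearest-neighbour upsampling: doubles the spatial resolution.
    return x.repeat(2, axis=0).repeat(2, axis=1)

x = np.random.rand(64, 64, 1)        # single-channel froth image

enc1 = x                             # 64x64, kept for a skip connection
enc2 = pool2(enc1)                   # 32x32
bottleneck = pool2(enc2)             # 16x16: the information bottleneck

# Decoder: upsample, then re-inject lower-level detail via skip connections.
dec2 = np.concatenate([up2(bottleneck), enc2], axis=-1)  # 32x32
dec1 = np.concatenate([up2(dec2), enc1], axis=-1)        # 64x64 again

# In a trained U-Net, a final 1x1 convolution would map dec1 to a
# one-channel boundary probability map the same size as the input image.
```

The skip connections are what let the output boundaries stay pixel-accurate even though the representation passes through the narrow bottleneck.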
- The system described above may be used to implement a method in accordance with the disclosure.
- Various components of the system (500) used to implement the method are shown in more detail in the block diagram of Figure 5.
- The method may include the creation of the DNN training datasets by a training component (501), as described in more detail below.
- The method is carried out by capturing a froth image using the digital camera or another image capturing component (503).
- The image is then communicated to the computer infrastructure, where it may be received by a suitable image receiving component (505).
- The DNN is applied to the froth image by a DNN application component (507), and boundaries between the various bubbles in the froth image are identified through the application of the DNN.
- An image construction component (509) then constructs the bubble segmentation image, including identified boundary data representing the boundaries between the bubbles visible in the froth image.
- The bubble segmentation image is then output by an output component (511) to further processing modules which fall outside the scope of this disclosure. It will, however, be appreciated by those skilled in the art that the further processing modules may use the bubble segmentation image with the identified bubble boundary data to determine bubble sizes and bubble size distributions.
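How such downstream modules might derive bubble sizes from a segmentation image can be sketched as follows. The tiny hand-made segmentation array is an invented example; the disclosure does not prescribe this particular computation.

```python
import numpy as np
from scipy import ndimage

# Toy bubble segmentation image: 1 = boundary pixel, 0 = bubble interior.
seg = np.zeros((9, 9), dtype=np.uint8)
seg[0, :] = seg[-1, :] = seg[:, 0] = seg[:, -1] = 1  # frame edge as boundary
seg[:, 4] = 1                                        # one internal boundary

# Each bubble is a connected interior region enclosed by boundaries.
interior = seg == 0
labelled, n_bubbles = ndimage.label(interior)

# Pixel area of each bubble: the bubble size distribution and its mean.
areas = ndimage.sum(interior, labelled, index=range(1, n_bubbles + 1))
mean_area = areas.mean()
```

With a camera calibration (pixels per millimetre), the pixel areas convert directly into physical bubble sizes and a full size distribution.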
- The method and system for generating bubble segmentation images may be used for continuous analysis of the froth phase of the flotation cell.
- Various parameters of the flotation cell's performance may be derived, for example froth velocity, froth height, froth stability, bubble size and bubble size distribution.
- The accurate measurement of these parameters in near real-time may be used to derive critical operational parameters of the flotation cell, which may serve as indicators for input adjustments to be made to the flotation cell to ensure efficient operation.
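One of the parameters listed above, froth velocity, is commonly estimated by measuring the shift between consecutive frames. A minimal sketch using phase correlation is given below; the disclosure does not specify this method, and the function name, frame size and synthetic frames are assumptions for the illustration.

```python
import numpy as np

def frame_shift(frame_a, frame_b):
    """Estimate the (dy, dx) pixel shift of frame_b relative to frame_a
    by phase correlation of two consecutive froth images."""
    F = np.fft.fft2(frame_a)
    G = np.fft.fft2(frame_b)
    spectrum = np.conj(F) * G
    # Normalising by the magnitude leaves only phase: a sharp peak at the shift.
    corr = np.fft.ifft2(spectrum / (np.abs(spectrum) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Shifts beyond half the frame wrap around; map them to negative values.
    if dy > frame_a.shape[0] // 2:
        dy -= frame_a.shape[0]
    if dx > frame_a.shape[1] // 2:
        dx -= frame_a.shape[1]
    return dy, dx

# Two synthetic frames: the second is the first moved 3 px down, 5 px right.
rng = np.random.default_rng(1)
a = rng.random((32, 32))
b = np.roll(a, shift=(3, 5), axis=(0, 1))
dy, dx = frame_shift(a, b)
speed = float(np.hypot(dy, dx))  # pixels per frame interval
```

Dividing the pixel displacement by the frame interval and the camera's pixel-to-metre scale yields a froth velocity in physical units.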
- The compiling of the DNN training datasets and the actual training of the DNN may be essential to the effective operation of the method and system disclosed.
- The DNN typically requires substantial amounts of labelled data for training.
- The DNN may therefore be trained with datasets of previously generated or measured training images, which will typically be labelled.
- Each training image may include defined bubble segmentation boundaries which, in combination with the other defined bubble segmentation boundary images that make up the dataset, effectively teach the DNN to identify features useful for identifying bubble boundaries in input images automatically. Since hand-segmenting thousands of images is an extremely time-consuming task, it may not be practical to use this technique to gather training image data.
- Already available, commercially used products, such as the Lynxx particle size analyser (the applicant's existing bubble/rock segmentation product), may be used to provide rough segmentation data that may then be used in the training dataset to train the DNN.
- Another technique that may be used to generate training datasets involves simulating froth images using computer animation software. Unlike the capturing of real froth images, with computer simulations the true geometry of the bubbles used to create the rendering is accessible. As such, perfect segmentation of the bubbles can automatically be generated. Since the textures used to generate the geometries and colours of the bubbles are essentially mathematical expressions, infinite varieties of froth conditions may be artificially created, which helps supplement the training datasets for froth conditions for which good segmentations are not available.
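The key advantage of simulation described above — that the true bubble geometry, and hence a perfect segmentation label, is known by construction — can be demonstrated with a much cruder stand-in than rendering software: a random Voronoi tessellation. All the details below (image size, bubble count, shading model) are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 64
n_bubbles = 12

# Random bubble centres; assigning each pixel to its nearest centre gives a
# Voronoi tessellation that crudely mimics a froth surface seen from above.
centres = rng.uniform(0, H, size=(n_bubbles, 2))
yy, xx = np.mgrid[0:H, 0:W]
d2 = (yy[..., None] - centres[:, 0]) ** 2 + (xx[..., None] - centres[:, 1]) ** 2
cell = d2.argmin(axis=-1)

# Perfect ground-truth label: boundary pixels are those whose right or lower
# neighbour belongs to a different cell - known exactly, unlike in real images.
boundary = np.zeros((H, W), dtype=np.uint8)
boundary[:, :-1] |= (cell[:, :-1] != cell[:, 1:]).astype(np.uint8)
boundary[:-1, :] |= (cell[:-1, :] != cell[1:, :]).astype(np.uint8)

# Synthetic "froth image": brightness decays with distance from the bubble
# centre, a stand-in for the shading of a rendered bubble dome.
dist = np.sqrt(d2.min(axis=-1))
img = 1.0 - dist / dist.max()
```

The pair `(img, boundary)` is one synthetic training example; varying the seed, bubble count and shading generates arbitrarily many froth conditions, as the description notes for renderer-based simulation.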
- Figure 6 illustrates an example of a computing device (600) in which various aspects of the disclosure may be implemented.
- The computing device (600) may be embodied as any form of data processing device, including a personal computing device (e.g. laptop or desktop computer), a server computer (which may be self-contained or physically distributed over a number of locations), a client computer, or a communication device such as a mobile phone (e.g. cellular telephone), satellite phone, tablet computer, personal digital assistant or the like.
- The computing device (600) may be suitable for storing and executing computer program code.
- The various participants and elements in the previously described system diagrams may use any suitable number of subsystems or components of the computing device (600) to facilitate the functions described herein.
- The computing device (600) may include subsystems or components interconnected via a communication infrastructure (605) (for example, a communications bus, a network, etc.).
- The computing device (600) may include one or more processors (610) and at least one memory component in the form of computer-readable media.
- The one or more processors (610) may include one or more of: CPUs, graphical processing units (GPUs), microprocessors, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs) and the like.
- A number of processors may be provided and may be arranged to carry out calculations simultaneously.
- Various subsystems or components of the computing device (600) may be distributed over a number of physical locations (e.g. in a distributed, cluster or cloud-based computing configuration) and appropriate software units may be arranged to manage and/or process data on behalf of remote devices.
- The memory components may include system memory (615), which may include read only memory (ROM) and random access memory (RAM).
- System software may be stored in the system memory (615) including operating system software.
- The memory components may also include secondary memory (620).
- The secondary memory (620) may include a fixed disk (621), such as a hard disk drive, and, optionally, one or more storage interfaces (622) for interfacing with storage components (623), such as removable storage components (e.g. magnetic tape, optical disk, flash memory drive, external hard drive, removable memory chip, etc.), network attached storage components (e.g. NAS drives), remote storage components (e.g. cloud-based storage) or the like.
- The computing device (600) may include an external communications interface (630) for operation of the computing device (600) in a networked environment, enabling transfer of data between multiple computing devices (600) and/or the Internet.
- Data transferred via the external communications interface (630) may be in the form of signals, which may be electronic, electromagnetic, optical, radio, or other types of signal.
- The external communications interface (630) may enable communication of data between the computing device (600) and other computing devices, including servers and external storage facilities. Web services may be accessible by and/or from the computing device (600) via the communications interface (630).
- The external communications interface (630) may be configured for connection to wireless communication channels (e.g. a cellular telephone network, a wireless local area network (e.g. using Wi-Fi™), a satellite-phone network, a satellite Internet network, etc.) and may include an associated wireless transfer element, such as an antenna and associated circuitry.
- The computer-readable media in the form of the various memory components may provide storage of computer-executable instructions, data structures, program modules, software units and other data.
- A computer program product may be provided by a computer-readable medium having stored computer-readable program code executable by the central processor (610).
- A computer program product may be provided by a non-transient computer-readable medium, or may be provided via a signal or other transient means via the communications interface (630).
- Interconnection via the communication infrastructure (605) allows the one or more processors (610) to communicate with each subsystem or component and to control the execution of instructions from the memory components, as well as the exchange of information between subsystems or components.
- Peripherals (such as printers, scanners, cameras, or the like) and input/output (I/O) devices (such as a mouse, touchpad, keyboard, microphone, touch-sensitive display, input buttons, speakers and the like) may couple to or be integrally formed with the computing device (600), either directly or via an I/O controller (635).
- One or more displays (645) (which may be touch-sensitive displays) may be coupled to or integrally formed with the computing device (600) via a display or video adapter (640).
- The functions of the components described may be provided by hardware or by software units executing on the computer infrastructure.
- The software units may be stored in a memory component and instructions may be provided to the processor to carry out the functionality of the described components.
- Software units arranged to manage and/or process data on behalf of the computer infrastructure may be provided remotely.
- A software unit may be implemented with a computer program product comprising a non-transient computer-readable medium containing computer program code, which can be executed by a processor for performing any or all of the steps, operations, or processes described.
- Software units or functions described in this application may be implemented as computer program code using any suitable computer language such as, for example, Java™, C++, or Perl™ using, for example, conventional or object-oriented techniques.
- The computer program code may be stored as a series of instructions or commands on a non-transitory computer-readable medium, such as a random access memory (RAM), a read-only memory (ROM), a magnetic medium such as a hard drive, or an optical medium such as a CD-ROM. Any such computer-readable medium may also reside on or within a single computational apparatus, and may be present on or within different computational apparatuses within a system or network.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2018327270A AU2018327270A1 (en) | 2017-09-08 | 2018-09-06 | Froth segmentation in flotation cells |
BR112020004522-5A BR112020004522B1 (en) | 2017-09-08 | 2018-09-06 | SEGMENTATION IN FLOTATION CELLS |
ZA2020/01710A ZA202001710B (en) | 2017-09-08 | 2020-03-18 | Froth segmentation in flotation cells |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ZA2017/06114 | 2017-09-08 | ||
ZA201706114 | 2017-09-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019049060A1 true WO2019049060A1 (en) | 2019-03-14 |
Family
ID=65634915
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2018/056802 WO2019049060A1 (en) | 2017-09-08 | 2018-09-06 | Froth segmentation in flotation cells |
Country Status (4)
Country | Link |
---|---|
AU (1) | AU2018327270A1 (en) |
CL (1) | CL2020000547A1 (en) |
WO (1) | WO2019049060A1 (en) |
ZA (1) | ZA202001710B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1997045203A1 (en) * | 1996-05-31 | 1997-12-04 | Baker Hughes Incorporated | Method and apparatus for controlling froth flotation machines |
CN101036904A (en) * | 2007-04-30 | 2007-09-19 | 中南大学 | Flotation froth image recognition device based on machine vision and the mine concentration grade forecast method |
CN104331714A (en) * | 2014-11-28 | 2015-02-04 | 福州大学 | Image data extraction and neural network modeling-based platinum flotation grade estimation method |
2018
- 2018-09-06 WO PCT/IB2018/056802 patent/WO2019049060A1/en active Application Filing
- 2018-09-06 AU AU2018327270A patent/AU2018327270A1/en active Pending
2020
- 2020-03-05 CL CL2020000547A patent/CL2020000547A1/en unknown
- 2020-03-18 ZA ZA2020/01710A patent/ZA202001710B/en unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1997045203A1 (en) * | 1996-05-31 | 1997-12-04 | Baker Hughes Incorporated | Method and apparatus for controlling froth flotation machines |
CN101036904A (en) * | 2007-04-30 | 2007-09-19 | 中南大学 | Flotation froth image recognition device based on machine vision and the mine concentration grade forecast method |
CN104331714A (en) * | 2014-11-28 | 2015-02-04 | 福州大学 | Image data extraction and neural network modeling-based platinum flotation grade estimation method |
Non-Patent Citations (1)
Title |
---|
ALDRICH C. ET AL.: "Online monitoring and control of froth flotation systems with machine vision: A review", INTERNATIONAL JOURNAL OF MINERAL PROCESSING, vol. 96, no. 1-4, 2010, pages 1 - 13, XP027206989, ISSN: 0301-7516, Retrieved from the Internet <URL:https://www.sciencedirect.com/science/article/pii/S0301751610000633> [retrieved on 20181106] * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110689020A (en) * | 2019-10-10 | 2020-01-14 | 湖南师范大学 | Segmentation method of mineral flotation froth image and electronic equipment |
CN111259972A (en) * | 2020-01-20 | 2020-06-09 | 北矿机电科技有限责任公司 | Flotation bubble identification method based on cascade classifier |
CN111259972B (en) * | 2020-01-20 | 2023-08-11 | 北矿机电科技有限责任公司 | Flotation bubble identification method based on cascade classifier |
CN111325281A (en) * | 2020-03-05 | 2020-06-23 | 新希望六和股份有限公司 | Deep learning network training method and device, computer equipment and storage medium |
CN111325281B (en) * | 2020-03-05 | 2023-10-27 | 新希望六和股份有限公司 | Training method and device for deep learning network, computer equipment and storage medium |
CN113837193A (en) * | 2021-09-23 | 2021-12-24 | 中南大学 | Zinc flotation froth image segmentation algorithm based on improved U-Net network |
CN113837193B (en) * | 2021-09-23 | 2023-09-01 | 中南大学 | Zinc flotation froth image segmentation method based on improved U-Net network |
Also Published As
Publication number | Publication date |
---|---|
AU2018327270A1 (en) | 2020-04-09 |
CL2020000547A1 (en) | 2020-09-04 |
ZA202001710B (en) | 2021-04-28 |
BR112020004522A2 (en) | 2020-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019049060A1 (en) | Froth segmentation in flotation cells | |
CN113168567A (en) | System and method for small sample transfer learning | |
CN112967243A (en) | Deep learning chip packaging crack defect detection method based on YOLO | |
CN110276402B (en) | Salt body identification method based on deep learning semantic boundary enhancement | |
CN111241989A (en) | Image recognition method and device and electronic equipment | |
CN112967248B (en) | Method, apparatus, medium and program product for generating defect image samples | |
CN111259812B (en) | Inland ship re-identification method and equipment based on transfer learning and storage medium | |
Mathias et al. | Underwater object detection based on bi-dimensional empirical mode decomposition and Gaussian Mixture Model approach | |
Zhou et al. | Convolutional neural networks–based model for automated sewer defects detection and classification | |
JP2024513596A (en) | Image processing method and apparatus and computer readable storage medium | |
Guo et al. | A novel transformer-based network with attention mechanism for automatic pavement crack detection | |
CN117011274A (en) | Automatic glass bottle detection system and method thereof | |
CN110472673B (en) | Parameter adjustment method, fundus image processing device, fundus image processing medium and fundus image processing apparatus | |
CN114882494B (en) | Three-dimensional point cloud feature extraction method based on multi-modal attention driving | |
CN115953394A (en) | Target segmentation-based detection method and system for mesoscale ocean vortexes | |
CN116977265A (en) | Training method and device for defect detection model, computer equipment and storage medium | |
CN115953506A (en) | Industrial part defect image generation method and system based on image generation model | |
DR | RECOGNITION OF SIGN LANGUAGE USING DEEP NEURAL NETWORK. | |
Kee et al. | Cracks identification using mask region-based denoised deformable convolutional network | |
CN114170625A (en) | Context-aware and noise-robust pedestrian searching method | |
CN114596442A (en) | Image identification method, device, equipment and storage medium | |
CN114187487A (en) | Processing method, device, equipment and medium for large-scale point cloud data | |
CN112749691A (en) | Image processing method and related equipment | |
BR112020004522B1 (en) | FROTH SEGMENTATION IN FLOTATION CELLS | |
CN113392924B (en) | Identification method of acoustic-electric imaging log and related equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18854608 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112020004522 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 2018327270 Country of ref document: AU Date of ref document: 20180906 Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 112020004522 Country of ref document: BR Kind code of ref document: A2 Effective date: 20200306 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18854608 Country of ref document: EP Kind code of ref document: A1 |