CN112070164A - Dry and wet sludge classification method and device


Info

Publication number
CN112070164A
CN112070164A
Authority
CN
China
Prior art keywords
image
sludge
dry
detected
wet
Prior art date
Legal status
Pending
Application number
CN202010940030.6A
Other languages
Chinese (zh)
Inventor
陈海波
段艺霖
Current Assignee
Shenlan Intelligent Technology Shanghai Co ltd
Original Assignee
Deep Blue Technology Shanghai Co Ltd
DeepBlue AI Chips Research Institute Jiangsu Co Ltd
Priority date
Filing date
Publication date
Application filed by Deep Blue Technology Shanghai Co Ltd, DeepBlue AI Chips Research Institute Jiangsu Co Ltd
Priority to CN202010940030.6A
Publication of CN112070164A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a dry and wet sludge classification method and a device, wherein the method comprises the following steps: acquiring a sample data set, wherein the sample data set comprises a plurality of sample images of dry sludge and a plurality of sample images of wet sludge; acquiring a sludge area in each sample image by a template matching method to update a sample data set; training a neural network through the updated sample data set to obtain a dry-wet sludge classification model, wherein the sample image is automatically subjected to environment transformation processing during training; acquiring an image to be detected, and acquiring a sludge area in the image to be detected by a template matching method to update the image to be detected; and inputting the updated image to be detected into the dry-wet sludge classification model so as to judge whether the sludge in the image to be detected is dry sludge or wet sludge. The invention has the advantages of high classification efficiency, low labor cost, high classification accuracy and wide application range.

Description

Dry and wet sludge classification method and device
Technical Field
The invention relates to the technical field of deep learning, in particular to a dry and wet sludge classification method, a dry and wet sludge classification device, computer equipment, a non-transitory computer readable storage medium and a computer program product.
Background
At present, identification and classification of dry sludge and wet sludge are mostly completed by manual visual observation, which has the defects of low efficiency, high labor cost, low accuracy and large fluctuation in results.
Disclosure of Invention
To solve the above technical problems, the invention provides a dry-wet sludge classification method and device that have high classification efficiency, low labor cost, high classification accuracy and a wide application range.
The technical scheme adopted by the invention is as follows:
a dry-wet sludge classification method comprises the following steps: acquiring a sample data set, wherein the sample data set comprises a plurality of sample images of dry sludge and a plurality of sample images of wet sludge; acquiring a sludge area in each sample image by a template matching method to update the sample data set; training a neural network through the updated sample data set to obtain a dry-wet sludge classification model, wherein the sample image is automatically subjected to environment transformation processing during training; acquiring an image to be detected, and acquiring a sludge area in the image to be detected by a template matching method to update the image to be detected; and inputting the updated image to be detected into the dry and wet sludge classification model so as to judge whether the sludge in the image to be detected is dry sludge or wet sludge.
The sample image is also subjected to translation, rotation and scaling during training.
Wherein, the template matching method comprises the following steps: decomposing a sample image or an image to be detected into a matrix form, and arranging the characteristics in the matrix form image according to coordinates; and acquiring a sludge area in the image through image comparison, and carrying out coordinate positioning.
The neural network is a VGG network (a deep convolutional neural network) or an Inception network.
A dry and wet sludge classification device comprises: a first obtaining module, configured to obtain a sample data set, where the sample data set includes a plurality of sample images of dry sludge and a plurality of sample images of wet sludge; the first matching module is used for acquiring a sludge area in each sample image through a template matching method so as to update the sample data set; the training module is used for training the neural network through the updated sample data set to obtain a dry-wet sludge classification model, wherein the sample image is automatically subjected to environment transformation processing during training; the second acquisition module is used for acquiring an image to be detected; the second matching module is used for acquiring a sludge area in the image to be detected through a template matching method so as to update the image to be detected; and the detection module is used for inputting the updated image to be detected into the dry-wet sludge classification model so as to judge whether the sludge in the image to be detected is dry sludge or wet sludge.
The training module also performs translation, rotation and scaling on the sample image during training.
Wherein, the template matching method comprises the following steps: decomposing a sample image or an image to be detected into a matrix form, and arranging the characteristics in the matrix form image according to coordinates; and acquiring a sludge area in the image through image comparison, and carrying out coordinate positioning.
A computer device comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, and when the processor executes the program, the method for classifying the dry and wet sludge is realized.
A non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described wet and dry sludge classification method.
A computer program product having instructions which, when executed by a processor, perform the above method of wet and dry sludge classification.
The invention has the beneficial effects that:
according to the invention, the sludge area of each sample image in the sample data set is obtained by a template matching method, the neural network is trained on the updated sample data set, the sample images are automatically subjected to environment transformation processing during training to obtain a dry-wet sludge classification model, and the dry-wet sludge classification model is used for classifying the image to be detected, so that the classification efficiency is high, the labor cost is low, the classification accuracy is high, and the application range is wide.
Drawings
FIG. 1 is a flow chart of a method of classifying wet and dry sludge according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a matrixed image according to an embodiment of the invention;
FIG. 3 is a block diagram schematically illustrating a dry-wet sludge classification apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the method for classifying wet and dry sludge according to the embodiment of the present invention includes the following steps:
and S1, acquiring a sample data set, wherein the sample data set comprises a plurality of sample images of dry sludge and a plurality of sample images of wet sludge.
In one embodiment of the invention, a batch of sample images of dry sludge and a batch of sample images of wet sludge can be collected, and the corresponding dry/wet labels can be stored together with the sample images to form a sample data set.
In one embodiment of the invention, the ratio of dry sludge sample images to wet sludge sample images in the sample data set may be 1:1 or close to 1:1.
And S2, acquiring a sludge area in each sample image through a template matching method to update the sample data set.
In an embodiment of the invention, a sludge area in a sample image can be positioned based on template matching, and screenshot is performed on the sludge area to obtain a sample data set containing the screenshot of the sludge area, which is used as input for subsequent neural network training.
Specifically, referring to fig. 2, the sample image may first be decomposed into a matrix form, and the features in the matrixed image may be arranged according to coordinates. After matrixing, the features in the image are easy to locate; for example, as shown in fig. 2, the pixel with the pixel value of 30 can be selected conveniently and quickly.
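As a minimal, non-limiting sketch (the file name and the grayscale reading are assumptions, not part of the patent), the matrixed representation can be thought of simply as the pixel array of the image, in which a feature value such as 30 can be looked up by its coordinates:

```python
# Illustrative sketch: an image read as a matrix (2-D array of pixel values),
# from which pixels with a given value (e.g. 30, as in fig. 2) are located
# by their coordinates. The file name is hypothetical.
import cv2
import numpy as np

img = cv2.imread("sample.jpg", cv2.IMREAD_GRAYSCALE)   # H x W matrix of pixel values
ys, xs = np.where(img == 30)                            # coordinates of all pixels whose value is 30
print(list(zip(xs.tolist(), ys.tolist()))[:5])          # first few (x, y) coordinates
```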
And then, acquiring a sludge area in the image through image comparison, and performing coordinate positioning.
In a specific embodiment of the invention, template matching can be realized through OpenCV: by running the function matchTemplate, the sludge template image is compared, position by position, against regions of the corresponding size in the whole image, so that the sludge pixel area and its pixel coordinates are obtained.
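A minimal sketch of this step, assuming OpenCV's Python bindings, a single sludge region per image, normalized cross-correlation as the matching score, and hypothetical file names (none of which are prescribed by the patent):

```python
# Illustrative sketch: locate and crop the sludge region via OpenCV template
# matching. File names and the matching method are assumptions.
import cv2

def crop_sludge_region(image_path, template_path):
    image = cv2.imread(image_path)                      # whole sample image / image to be detected
    template = cv2.imread(template_path)                # sludge template image
    h, w = template.shape[:2]

    # Score every placement of the template over the image.
    scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(scores)            # best-matching top-left corner
    x, y = max_loc

    print(f"sludge region at x={x}, y={y}, w={w}, h={h}")
    return image[y:y + h, x:x + w]                      # "screenshot" of the sludge area

crop = crop_sludge_region("sample.jpg", "sludge_template.jpg")
cv2.imwrite("sludge_crop.jpg", crop)
```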
And S3, training the neural network through the updated sample data set to obtain a dry-wet sludge classification model, wherein the sample image is automatically subjected to environment transformation processing during training.
In one embodiment of the present invention, the neural network is a convolutional neural network, which may be, for example, a VGG network or an Inception network.
The convolutional neural network includes an input layer, hidden layers, and an output layer, wherein the hidden layers include convolutional layers. At the beginning of training, the filters of the convolutional layers are completely random and do not activate on, i.e. detect, any features. A blank filter has its weights adjusted until it detects a specific pattern; the whole process is like a feedback loop in engineering. Through such feedback, the convolutional neural network can learn by itself the core features on which the judgement is to be based.
For each sample image, the training process may include image input, feature extraction, result prediction, result comparison, and feature memorization. Specifically, the convolutional neural network matches each feature with the corresponding sample label; correctly matched features are retained by the memory module, incorrectly matched features are suppressed via the loss term, and a large number of pictures are iterated continuously through multi-layer convolutional deep learning, so that the network finally learns the core features it needs to memorize and classifies the different core features. The finally trained neural network, namely the dry-wet sludge classification model, can classify the sludge in a new image according to these features.
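As an illustrative, non-limiting sketch (the deep-learning framework, folder layout, hyperparameters and file names are assumptions; the patent only specifies a VGG or Inception network), training such a two-class dry/wet classifier on the cropped sludge images could look like this:

```python
# Illustrative sketch: fine-tune a VGG16 backbone as a two-class (dry / wet)
# sludge classifier. Dataset path, epochs and learning rate are assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Assumed folder layout: sludge_dataset/dry/*.jpg and sludge_dataset/wet/*.jpg
dataset = datasets.ImageFolder("sludge_dataset", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

model = models.vgg16(weights="IMAGENET1K_V1")
model.classifier[6] = nn.Linear(4096, 2)        # replace the last layer: dry vs. wet

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

torch.save(model.state_dict(), "dry_wet_sludge_classifier.pt")
```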
In one embodiment of the present invention, the environment transformation processing may be implemented by changing at least one of the chromaticity, brightness, and saturation of the image; these changes alter the ambient brightness, sky color and surface shadows, simulating environmental changes such as natural light, weather and day/night lighting. Through the environment transformation processing, not only can a larger number of sample images be obtained to enrich the sample data set, but the trained model also gains the ability to automatically filter out the influence of environmental changes such as natural light, weather and day/night lighting, so that the judgement of dry/wet features is emphasized.
In addition, the sample images can be subjected to translation, rotation and scaling during training, which further enriches the sample data set and gives the trained model the ability to automatically filter out the influence of the shooting angle; a sketch combining both kinds of augmentation is given below.
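A minimal sketch of both the environment transformation and the geometric transformation, assuming torchvision transforms (the jitter ranges and affine limits below are illustrative values, not taken from the patent):

```python
# Illustrative sketch: random color ("environment transformation") and geometric
# augmentations applied to each sample image during training. Ranges are assumptions.
from torchvision import transforms

train_augment = transforms.Compose([
    # simulate natural light, weather and day/night lighting changes
    transforms.ColorJitter(brightness=0.4, saturation=0.4, hue=0.1),
    # simulate different shooting angles: translation, rotation and scaling
    transforms.RandomAffine(degrees=15, translate=(0.1, 0.1), scale=(0.8, 1.2)),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
```

Such a pipeline would replace the plain `transform` used in the training sketch above, so the augmentations are applied anew at every epoch.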
And S4, acquiring the image to be detected, and acquiring a sludge area in the image to be detected by a template matching method to update the image to be detected.
In an embodiment of the invention, the sludge to be detected can be photographed by a camera, for example a network fixed-focus camera with a real-time frame rate above 60 Hz, and a high-quality image to be detected can be obtained by illuminating the scene with an artificial or natural light source during photographing.
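A minimal sketch of grabbing one frame from such a camera with OpenCV; the stream address is hypothetical:

```python
# Illustrative sketch: capture one frame from a network camera as the image to
# be detected. The RTSP address is hypothetical.
import cv2

cap = cv2.VideoCapture("rtsp://192.168.1.10/stream")
ok, frame = cap.read()
if ok:
    cv2.imwrite("to_detect.jpg", frame)
cap.release()
```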
The way of obtaining the sludge region in the image to be detected by the template matching method is also the same as the way of obtaining the sludge region in the sample image, and is not described herein again.
And S5, inputting the updated image to be detected into the dry-wet sludge classification model to judge whether the sludge in the image to be detected is dry sludge or wet sludge.
The sludge image to be detected is input into the dry-wet sludge classification model, and the output result indicates whether the sludge in the image is dry sludge or wet sludge.
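Continuing the earlier sketches and their assumptions (saved model file, class ordering from ImageFolder, preprocessing), the detection step could look as follows:

```python
# Illustrative sketch: classify the cropped sludge region of the image to be
# detected with the trained model. With ImageFolder, classes sort as ("dry", "wet").
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

model = models.vgg16()
model.classifier[6] = nn.Linear(4096, 2)
model.load_state_dict(torch.load("dry_wet_sludge_classifier.pt"))
model.eval()

preprocess = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
image = preprocess(Image.open("sludge_crop.jpg").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    pred = model(image).argmax(dim=1).item()

print("dry sludge" if pred == 0 else "wet sludge")
```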
According to the embodiment of the invention, by training the neural network on a large number of sample images subjected to environment transformation processing, the dry-wet sludge classification model becomes insensitive to color changes and sensitive to the grayscale correspondence that distinguishes dry from wet sludge, so that dry and wet sludge can be identified and classified accurately.
In addition, when the classification result is obtained, corresponding classification result information can be output, for example an alarm, a high or low level signal, or an operation indication signal.
According to the dry-wet sludge classification method provided by the embodiment of the invention, the sludge area of each sample image in the sample data set is obtained through the template matching method, the neural network is used for training, the environment transformation processing is automatically carried out on the sample images during the training to obtain the dry-wet sludge classification model, and the dry-wet sludge classification model is used for classifying the images to be detected, so that the classification efficiency is higher, the labor cost is lower, the classification accuracy is higher, and the application range is wider.
Corresponding to the dry-wet sludge classification method of the embodiment, the invention also provides a dry-wet sludge classification device.
As shown in fig. 3, the wet sludge classification apparatus according to the embodiment of the present invention includes a first obtaining module 10, a first matching module 20, a training module 30, a second obtaining module 40, a second matching module 50, and a detecting module 60. The first acquiring module 10 is configured to acquire a sample data set, where the sample data set includes a plurality of sample images of dry sludge and a plurality of sample images of wet sludge; the first matching module 20 is configured to obtain a sludge area in each sample image by a template matching method to update the sample data set; the training module 30 is configured to train the neural network through the updated sample data set to obtain a dry-wet sludge classification model, wherein an environment transformation process is automatically performed on a sample image during training; the second obtaining module 40 is configured to obtain an image to be detected; the second matching module 50 is used for acquiring a sludge area in the image to be detected by a template matching method so as to update the image to be detected; the detection module 60 is configured to input the updated image to be detected into the dry-wet sludge classification model to determine whether the sludge in the image to be detected is dry sludge or wet sludge.
Further, the training module 30 performs translation, rotation, and indentation processing on the sample image during training.
The template matching method comprises the following steps: decomposing a sample image or an image to be detected into a matrix form, and arranging the characteristics in the matrix form image according to coordinates; and acquiring a sludge area in the image through image comparison, and carrying out coordinate positioning.
The specific implementation of the dry-wet sludge classification device according to the embodiment of the present invention can refer to the above-mentioned embodiment of the dry-wet sludge classification method, and is not described herein again.
According to the dry-wet sludge classification device provided by the embodiment of the invention, the sludge area of each sample image in the sample data set is obtained through the template matching method, the neural network is used for training, the environment transformation processing is automatically carried out on the sample images during the training to obtain the dry-wet sludge classification model, and the dry-wet sludge classification is carried out on the image to be detected through the dry-wet sludge classification model, so that the classification efficiency is higher, the labor cost is lower, the classification accuracy is higher, and the application range is wider.
The invention further provides a computer device corresponding to the embodiment.
The computer device of the embodiment of the invention comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, and when the processor executes the computer program, the method for classifying the dry and wet sludge according to the embodiment of the invention can be realized.
According to the computer equipment provided by the embodiment of the invention, when the processor executes the computer program stored on the memory, the sludge area of each sample image in the sample data set is obtained through the template matching method, the neural network is trained, the sample images are automatically subjected to environment transformation processing during training to obtain the dry-wet sludge classification model, and the dry-wet sludge classification model is used for classifying the image to be detected, so that the classification efficiency is high, the labor cost is low, the classification accuracy is high, and the application range is wide.
The invention also provides a non-transitory computer readable storage medium corresponding to the above embodiment.
A non-transitory computer-readable storage medium of an embodiment of the present invention, on which a computer program is stored, which when executed by a processor, can implement the method for classifying wet and dry sludge according to the above-described embodiment of the present invention.
According to the non-transitory computer-readable storage medium of the embodiment of the invention, when the processor executes the computer program stored on the medium, the sludge area of each sample image in the sample data set is obtained through the template matching method, the neural network is trained, the sample images are automatically subjected to environment transformation processing during training to obtain the dry-wet sludge classification model, and the dry-wet sludge classification model is used for classifying the images to be detected, so that the classification efficiency is high, the labor cost is low, the classification accuracy is high, and the application range is wide.
The present invention also provides a computer program product corresponding to the above embodiments.
When the instructions in the computer program product of the embodiment of the invention are executed by the processor, the method for classifying wet and dry sludge according to the above embodiment of the invention can be executed.
According to the computer program product provided by the embodiment of the invention, when the processor executes the instructions, the sludge area of each sample image in the sample data set is obtained through the template matching method, the neural network is trained, the sample images are automatically subjected to environment transformation processing during training to obtain the dry-wet sludge classification model, and the dry-wet sludge classification model is used for classifying the image to be detected, so that the classification efficiency is high, the labor cost is low, the classification accuracy is high, and the application range is wide.
In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. The meaning of "plurality" is two or more unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, the first feature "on" or "under" the second feature may be directly contacting the first and second features or indirectly contacting the first and second features through an intermediate. Also, a first feature "on," "over," and "above" a second feature may be directly or diagonally above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the first feature, or may simply mean that the first feature is at a lesser elevation than the second feature.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. A dry and wet sludge classification method, characterized by comprising the following steps:
acquiring a sample data set, wherein the sample data set comprises a plurality of sample images of dry sludge and a plurality of sample images of wet sludge;
acquiring a sludge area in each sample image by a template matching method to update the sample data set;
training a neural network through the updated sample data set to obtain a dry-wet sludge classification model, wherein the sample image is automatically subjected to environment transformation processing during training;
acquiring an image to be detected, and acquiring a sludge area in the image to be detected by a template matching method to update the image to be detected;
and inputting the updated image to be detected into the dry and wet sludge classification model so as to judge whether the sludge in the image to be detected is dry sludge or wet sludge.
2. The dry and wet sludge classification method according to claim 1, wherein the sample image is further subjected to translation, rotation and scaling during training.
3. The method for classifying dry and wet sludge according to claim 1, wherein the template matching method comprises:
decomposing a sample image or an image to be detected into a matrix form, and arranging the characteristics in the matrix form image according to coordinates;
and acquiring a sludge area in the image through image comparison, and carrying out coordinate positioning.
4. The dry-wet sludge classification method according to claim 3, wherein the neural network is a VGG network or an Inception network.
5. A dry and wet sludge classification device, characterized by comprising:
a first obtaining module, configured to obtain a sample data set, where the sample data set includes a plurality of sample images of dry sludge and a plurality of sample images of wet sludge;
the first matching module is used for acquiring a sludge area in each sample image through a template matching method so as to update the sample data set;
the training module is used for training the neural network through the updated sample data set to obtain a dry-wet sludge classification model, wherein the sample image is automatically subjected to environment transformation processing during training;
the second acquisition module is used for acquiring an image to be detected;
the second matching module is used for acquiring a sludge area in the image to be detected through a template matching method so as to update the image to be detected;
and the detection module is used for inputting the updated image to be detected into the dry-wet sludge classification model so as to judge whether the sludge in the image to be detected is dry sludge or wet sludge.
6. The dry-wet sludge classification device according to claim 5, wherein the training module further performs translation, rotation and scaling on the sample image during training.
7. The dry-wet sludge classification apparatus according to claim 5, wherein the template matching method comprises:
decomposing a sample image or an image to be detected into a matrix form, and arranging the characteristics in the matrix form image according to coordinates;
and acquiring a sludge area in the image through image comparison, and carrying out coordinate positioning.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the method of classifying wet and dry sludge according to any one of claims 1-4.
9. A non-transitory computer readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the method for classifying wet and dry sludge according to any one of claims 1-4.
10. A computer program product, characterized in that instructions in the computer program product, when executed by a processor, perform the method for wet and dry sludge classification according to any of claims 1-4.
CN202010940030.6A 2020-09-09 2020-09-09 Dry and wet sludge classification method and device Pending CN112070164A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010940030.6A CN112070164A (en) 2020-09-09 2020-09-09 Dry and wet sludge classification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010940030.6A CN112070164A (en) 2020-09-09 2020-09-09 Dry and wet sludge classification method and device

Publications (1)

Publication Number Publication Date
CN112070164A true CN112070164A (en) 2020-12-11

Family

ID=73663126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010940030.6A Pending CN112070164A (en) 2020-09-09 2020-09-09 Dry and wet sludge classification method and device

Country Status (1)

Country Link
CN (1) CN112070164A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116958503A (en) * 2023-09-19 2023-10-27 广东新泰隆环保集团有限公司 Image processing-based sludge drying grade identification method and system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109284780A (en) * 2018-09-10 2019-01-29 中山大学 Ore mineral image automatic identification and classification method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109284780A (en) * 2018-09-10 2019-01-29 中山大学 Ore mineral image automatic identification and classification method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
随心漂流: "OpenCV template matching: matching multiple targets in the same image", https://blog.csdn.net/weixin_42899088/article/details/106568375 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116958503A (en) * 2023-09-19 2023-10-27 广东新泰隆环保集团有限公司 Image processing-based sludge drying grade identification method and system
CN116958503B (en) * 2023-09-19 2024-03-12 广东新泰隆环保集团有限公司 Image processing-based sludge drying grade identification method and system

Similar Documents

Publication Publication Date Title
CN109871895B (en) Method and device for detecting defects of circuit board
CN111612763B (en) Mobile phone screen defect detection method, device and system, computer equipment and medium
CN111325713A (en) Wood defect detection method, system and storage medium based on neural network
CN105574550A (en) Vehicle identification method and device
CN111257341B (en) Underwater building crack detection method based on multi-scale features and stacked full convolution network
CN112070747A (en) LED lamp bead defect detection method and device
CN108764082A (en) A kind of Aircraft Targets detection method, electronic equipment, storage medium and system
CN111929327A (en) Cloth defect detection method and device
CN112070749A (en) Paper defect detection method and device
CN108921099A (en) Moving ship object detection method in a kind of navigation channel based on deep learning
CN112070746A (en) Steel strip defect detection method and device
CN111126393A (en) Vehicle appearance refitting judgment method and device, computer equipment and storage medium
CN111199194A (en) Automobile intelligent cabin instrument testing method based on machine vision and deep learning
CN110599453A (en) Panel defect detection method and device based on image fusion and equipment terminal
CN115439411A (en) Method and device for detecting polarity of circuit board component, medium and electronic equipment
CN113158969A (en) Apple appearance defect identification system and method
CN113780484B (en) Industrial product defect detection method and device
CN112070164A (en) Dry and wet sludge classification method and device
CN112070750A (en) Leather product defect detection method and device
CN110334703B (en) Ship detection and identification method in day and night image
CN116245882A (en) Circuit board electronic element detection method and device and computer equipment
CN116152191A (en) Display screen crack defect detection method, device and equipment based on deep learning
CN114419037B (en) Workpiece defect detection method and device
CN111079807A (en) Ground object classification method and device
CN112750113B (en) Glass bottle defect detection method and device based on deep learning and linear detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220330

Address after: Building C, No.888, Huanhu West 2nd Road, Lingang New District, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after: Shenlan Intelligent Technology (Shanghai) Co.,Ltd.

Address before: 213000 No.103, building 4, Chuangyan port, Changzhou science and Education City, No.18, middle Changwu Road, Wujin District, Changzhou City, Jiangsu Province

Applicant before: SHENLAN ARTIFICIAL INTELLIGENCE CHIP RESEARCH INSTITUTE (JIANGSU) Co.,Ltd.

Applicant before: DEEPBLUE TECHNOLOGY (SHANGHAI) Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20201211