CN111974704A - Garbage classification detection system and method based on computer vision - Google Patents

Garbage classification detection system and method based on computer vision

Info

Publication number
CN111974704A
Authority
CN
China
Prior art keywords
garbage
robot
classification
model
module
Prior art date
Legal status
Pending
Application number
CN202010816611.9A
Other languages
Chinese (zh)
Inventor
刘浩强
闫冬梅
Current Assignee
Northeastern University Qinhuangdao Branch
Original Assignee
Northeastern University Qinhuangdao Branch
Priority date
Filing date
Publication date
Application filed by Northeastern University Qinhuangdao Branch filed Critical Northeastern University Qinhuangdao Branch
Priority to CN202010816611.9A priority Critical patent/CN111974704A/en
Publication of CN111974704A publication Critical patent/CN111974704A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34 Sorting according to other particular properties
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C2501/00 Sorting according to a characteristic or feature of the articles or material to be sorted
    • B07C2501/0054 Sorting of waste or refuse

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a garbage classification detection system and method based on computer vision, relating to robot technology and deep learning in artificial intelligence. The system comprises a garbage picking robot and a garbage classification detection device. The garbage picking robot is connected with a binocular camera, a laser radar and an ultrasonic sensor, and embeds the SSD-MobileNet target detection algorithm and SLAM technology; during inspection it senses the surrounding environment to construct an incremental map and avoids obstacles in real time. When garbage is detected, the robot moves to it and controls a manipulator to collect it. The detection device embeds a garbage classification algorithm based on a depthwise separable convolutional neural network and cooperates with the robot to throw the garbage into the correct bin. The system is mainly applied to garbage detection and classification tasks in daily life; it greatly shortens detection time, improves classification accuracy, and has practical application value.

Description

Garbage classification detection system and method based on computer vision
Technical Field
The invention relates to the technical field of deep learning in robot technology application and artificial intelligence, in particular to a garbage classification detection system and method based on computer vision.
Background
China has a large population. With the advance of urbanization, people are pursuing a greener and more civilized life, yet the amount of household garbage keeps growing and most of it is not strictly sorted, so the problems of garbage disposal and treatment have become increasingly important and garbage classification has been promoted accordingly. In brief, garbage classification means storing, throwing and transporting different types of garbage separately according to corresponding regulations or standards. Its aim is to improve the resource value and economic value of garbage, make the best use of it, and support sustainable development and the circular economy. Garbage is mainly classified into kitchen waste, recyclables, hazardous garbage and other garbage. Developed countries such as Germany and Japan have reached a high level of garbage classification: not only is social participation high, but the concept of garbage classification is deeply rooted among their citizens. At present most Chinese cities still adopt mixed collection followed by centralized sorting, the garbage treatment system is imperfect, and the classification and recovery link lacks effective measures, so new garbage classification policies such as Shanghai's face difficulties in execution. The intelligent garbage can industry is continuously developing and maturing, but traditional garbage classification based on machine vision suffers from complex operation and low accuracy. A more intelligent and faster garbage classification system therefore needs to be designed, and automatic garbage detection and classification has very broad application value and market prospects.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a garbage classification detection system and method based on computer vision;
the technical scheme adopted by the invention is as follows:
in one aspect, the invention provides a computer vision-based garbage classification detection system, which comprises a garbage picking robot and a garbage classification detection device;
the garbage picking robot comprises a first upper computer module (1), a first lower computer module (2), a binocular camera (3), a laser radar (4), a six-degree-of-freedom mechanical arm (5), a motor driving module (6), a first mobile power supply module (7), a first wireless module (8) and an ultrasonic sensor (9);
the first upper computer module (1) is connected with the binocular camera (3) and the laser radar (4) through a USB interface, and receives the processed road surface environment visual information collected by the binocular camera (3) and the obstacle position information collected by the laser radar (4); the first upper computer module is connected with a remote PC (personal computer) end through an SSH (secure Shell) protocol and transmits acquired visual information and laser radar information to the PC end, and the PC end monitors the motion of the robot in real time; the first lower computer module (2) is respectively connected with a six-degree-of-freedom mechanical arm (5), a motor driving module (6), a first wireless module (8) and a first mobile power supply module (7), controls the speed and the direction of the robot and receives a working instruction from the garbage classification detection device;
the garbage classification detection device comprises a second upper computer module (10), a second lower computer module (11), a USB camera (12), 4 LED signal lamps (13), a second mobile power supply module (14), a second wireless module (15), a detection table and four classification garbage boxes;
the second upper computer module (10) is connected with the USB camera (12) and the detection table, the second lower computer module (11) is connected with the 4 LED signal lamps (13), the second wireless module (15) and the second mobile power supply module (14) respectively and used for lighting the signal lamps and sending work instructions to the garbage picking robot, and the 4 signal lamps are placed above the garbage can respectively.
On the other hand, a computer vision-based garbage classification detection method is realized based on the computer vision-based garbage classification detection system, and comprises the following steps:
s1, collecting different garbage pictures, constructing a garbage classification model, and identifying garbage types;
step S11, making a garbage classification model data set, acquiring garbage sample data pictures through web crawlers and photographing, and unifying the formats and sizes of all the garbage sample data pictures;
step S12, increasing the quantity of garbage sample data;
step S13, respectively using the garbage sample data pictures as a training set, a verification set and a test set, training a model on the training set, adjusting the hyper-parameters on the verification set, and testing the effect of the model on the test set;
s14, preprocessing the data of the junk pictures, reducing the sizes of the pictures, removing the mean value, and subtracting the mean value of the dimensional data from the original data of each dimension of the sample to replace the original data;
step S15, manually labeling by using a target detection labeling tool, wherein each picture corresponds to a label which is a 4-dimensional vector and contains category and position information, and converting the label into one-hot codes;
s16, selecting a high-order deep learning API framework Keras to build a neural network, and adopting a Sequential model;
step S17, constructing the model network layers, and adding layers to the model, wherein the layers comprise an input layer, a convolution layer, a pooling layer, a fully connected layer and a flattening layer;
step S18, model compiling is carried out, and an optimizer and a loss function are selected to determine a method for calculating back propagation;
step S19, performing model training: calling the fit function to provide data to the model and adjusting the parameters, namely grouping the total number of samples, randomly shuffling all data before training, initializing the weights, and applying L2 regularization to even out the feature weights, so that the garbage classification model is constructed;
step S2, importing the garbage classification model into a software system of the garbage classification detection device, and enabling the garbage picking robot and the garbage classification detection device to work in a combined mode;
step S21, debugging the hardware equipment of the garbage classification detection system, and starting the robot to perform inspection work after all the hardware is debugged;
s22, constructing an incremental map by using an SLAM technology based on an ROS robot system in the inspection process of the robot;
s23, the robot carries out real-time avoidance on the obstacle through multi-sensor fusion and comprises a binocular camera (3), a laser radar (4) and an ultrasonic sensor (9), a real-time picture of the surrounding environment is uploaded through the camera, and garbage on the ground is identified through a target detection algorithm SSD-Mobilenet, namely position information of the garbage is obtained;
step S24, the robot moves to the position of the garbage, and the robot controls the six-degree-of-freedom mechanical arm (5) to collect the garbage into the storage box;
step S25, after working for a preset time, the garbage picking robot moves to a specified place, namely the position of the garbage classification detection device;
s26, the garbage picking robot controls a six-degree-of-freedom mechanical arm (5) to place garbage on a detection table, a USB camera (12) transmits garbage pictures in real time, and the system identifies the garbage category through the garbage classification model constructed in the step 1;
step S27, after the system identifies the garbage, the signal lamp on the corresponding garbage can flickers, and meanwhile, the wireless module (15) transmits a moving instruction to the mobile robot, wherein the instruction is in a four-dimensional vector form;
and S28, finally, controlling the six-degree-of-freedom mechanical arm (5) by the mobile robot to throw the garbage into the corresponding garbage can, and finishing the work.
The beneficial effects produced by the above technical scheme are as follows:
the invention provides a computer vision-based garbage classification detection system and method, which are characterized in that a garbage picking robot and a garbage classification detection device are used for completing the whole work, so that the labor is saved, and the work efficiency is improved; the garbage picking robot and the garbage classification detection device use the upper computer module and the lower computer module which have the same model and the wireless communication module, so that the garbage picking robot and the garbage classification detection device are convenient to debug and high in matching degree; the garbage classification model has the advantages of high compiling speed and simple structure, the weight parameter is 5% of that of the common CNN model, and the classification accuracy is 91.33%.
Drawings
Fig. 1 is an external view of a garbage collection robot according to an embodiment of the present invention;
In the figure: (1) first upper computer module; (2) first lower computer module; (3) binocular camera; (4) laser radar; (5) six-degree-of-freedom mechanical arm; (6) motor driving module; (7) first mobile power supply module; (8) first wireless module; (9) ultrasonic sensor;
FIG. 2 is a schematic external view of a garbage classification detecting apparatus according to an embodiment of the present invention;
In the figure: (10) second upper computer module; (11) second lower computer module; (12) USB camera; (13) LED signal lamp; (14) second mobile power supply module; (15) second wireless module;
FIG. 3 is a flowchart of an embodiment of a garbage classification detection method;
FIG. 4 is a flowchart of garbage classification model construction according to an embodiment of the present invention;
FIG. 5 is a flowchart of a garbage target detection algorithm according to an embodiment of the present invention;
FIG. 6 is a graph of loss functions for a training set and a validation set of data according to an embodiment of the present invention;
FIG. 7 is a graph of accuracy for a training set and a validation set of data according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings.
In one aspect, the invention provides a computer vision-based garbage classification detection system, which comprises a garbage picking robot and a garbage classification detection device;
as shown in fig. 1, the garbage picking robot in this embodiment is characterized in that a first upper computer module (1) adopts a low-power-consumption embedded development platform Nvidia Jetson Tx2, a laser radar (4) adopts radium spirit N30101B, a first lower computer module (2) adopts an Arduino Mega 2560 single chip microcomputer, a motor driving module (6) adopts L298N, and a wireless module (8) adopts NRF24L 01. The first upper computer module is connected with the binocular camera (3) and the laser radar (4) and is responsible for processing visual information, laser radar information and multi-dimensional array calculation in an algorithm, the upper computer is connected with a remote PC (personal computer) end through an SSH (secure Shell) protocol and transmits acquired images of the road environment (garbage and obstacles) to the PC end, and the remote PC monitors the motion of the robot in real time; the first lower computer module (2) is connected with a six-degree-of-freedom mechanical arm (5) through a PWM (pulse width modulation) port, an I/O (input/output) port is connected with a motor driving module (6) and a first wireless module (8) through a DuPont wire, and a USB (universal serial bus) interface is connected with a first mobile power supply module (7), controls the speed and the direction of the robot, and receives a working instruction from the garbage classification detection device.
Multi-sensor fusion distance measurement uses the binocular camera (3), the laser radar (4) and the ultrasonic sensor (9); the robot adjusts its speed and calculates the response time to achieve better obstacle avoidance. The robot is provided with a six-degree-of-freedom mechanical arm (5) with three kinds of motion: telescoping, rotation and pitching. The motions of the mechanical arm are kept uniform during operation; the telescoping motion is driven by a linear hydraulic cylinder, the rotation by a rotary cylinder or a rack cylinder, and the pitching by a single piston rod.
In the garbage classification detection device of this embodiment, as shown in fig. 2, the second upper computer module (10) adopts an NVIDIA Jetson TX2, the second lower computer module (11) adopts an Arduino Mega 2560 single chip microcomputer, and the second wireless module (15) adopts an NRF24L01. The second upper computer module (10) is connected with the USB camera (12) and the detection table and is responsible for processing the visual information and the multi-dimensional array calculations in the algorithm; the second lower computer module (11) is connected with the 4 LED signal lamps (13) and the second wireless module (15) through I/O ports and DuPont wires, and with the second mobile power supply module (14) through a USB interface, and is used for lighting the signal lamps and sending working instructions to the garbage picking robot.
On the other hand, a computer vision-based garbage classification detection method is implemented based on the aforementioned computer vision-based garbage classification detection system, as shown in fig. 3, and includes the following steps:
s1, collecting different garbage pictures, constructing a garbage classification model, and identifying garbage types as shown in figure 4;
s11, making a data set, obtaining a certain number of samples through web crawlers and photographing, screening to obtain 480 plastic bottle samples, 400 cardboard samples, 300 storage battery samples and 320 cabbage leaf samples, wherein the cardboard and the plastic bottles belong to recyclable objects, the storage batteries belong to harmful garbage, the cabbage leaves belong to kitchen garbage, all picture formats are unified into jpg, and the sizes are unified into 384 multiplied by 512.
And S12, enhancing the data: the number of samples is increased by flipping, cropping, color transformation and similar methods, and each class is expanded to 500 images.
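One plausible way to realize this augmentation step is sketched below with Keras' ImageDataGenerator; the specific flip, shift and brightness parameters are assumptions, since the patent only names flipping, cropping and color transformation:

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

augmenter = ImageDataGenerator(
    horizontal_flip=True,         # flipping
    width_shift_range=0.1,        # shifts as a simple stand-in for random cropping
    height_shift_range=0.1,
    brightness_range=(0.8, 1.2),  # a basic color/brightness transformation
)

def expand_class(images, target=500):
    """Augment one class (array of shape (N, H, W, 3)) until it holds `target` samples."""
    out = list(images)
    flow = augmenter.flow(images, batch_size=1, shuffle=True)
    while len(out) < target:
        sample = np.clip(next(flow)[0], 0, 255)   # keep pixel values in the valid range
        out.append(sample.astype(images.dtype))
    return np.stack(out)
```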
S13, dividing the pictures into 3 parts in the ratio 7:2:1 to serve as the training set, verification set and test set respectively; the model is trained on the training set, the hyper-parameters are adjusted on the verification set to avoid overfitting and to decide when training is complete, and the effect of the model is tested on the test set.
S14, preprocessing the data: the pictures are first reduced to 48 × 64; the mean is then removed by subtracting, for each dimension of the sample, the mean of that dimension from the original data and replacing the original data with the result; normalization scales the pixel values (0-255) of each channel to [0, 1].
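The preprocessing just described could look roughly like the sketch below; the order of scaling and mean removal is one reasonable reading of the step rather than a verbatim reproduction of the patent's code:

```python
import cv2
import numpy as np

def preprocess(images):
    """images: uint8 array of shape (N, H, W, 3); returns float32 array of shape (N, 48, 64, 3)."""
    resized = np.stack([cv2.resize(im, (64, 48)) for im in images])  # 48 x 64 (height x width)
    x = resized.astype("float32") / 255.0     # scale pixel values from 0-255 to [0, 1]
    x -= x.mean(axis=0, keepdims=True)        # subtract each dimension's mean over the data set
    return x
```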
And S15, labeling the data manually with the LabelImg tool; each picture corresponds to a label containing category and position information. The label is converted into a one-hot code, a 4-dimensional vector: the plastic bottle label is coded as [1,0,0,0], the cardboard label as [0,1,0,0], the storage battery label as [0,0,1,0], and the cabbage leaf label as [0,0,0,1].
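A small sketch of this label encoding, assuming the class index order plastic bottle, cardboard, storage battery, cabbage leaf given above:

```python
from tensorflow.keras.utils import to_categorical

CLASS_INDEX = {"plastic_bottle": 0, "cardboard": 1, "battery": 2, "cabbage_leaf": 3}

labels = ["plastic_bottle", "battery", "cabbage_leaf"]   # example annotations
y = to_categorical([CLASS_INDEX[c] for c in labels], num_classes=4)
# y[0] -> [1, 0, 0, 0], y[1] -> [0, 0, 1, 0], y[2] -> [0, 0, 0, 1]
```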
S16, selecting the TensorFlow-based high-level deep learning API Keras to build the neural network, and adopting a Sequential model;
s17, constructing a network layer, and adding layers in the model, wherein the layers comprise an input layer, a convolution layer, a pooling layer, a full-connection layer, a flattening layer and the like, and the network input is a tensor of 48 multiplied by 64 multiplied by 3; the operation in the convolutional layer is mainly that the image is filtered by a plurality of different convolutional kernels, local features are extracted after bias is added, each convolutional kernel can map out a new 2D image, each channel follows a weight sharing principle, the depth separable convolutional layer is selected to replace the traditional convolutional layer, and the number of the convolutional layers is 3; the pooling can play a role in reducing the dimension, the maximum pooling is selected, the maximum value is found in each area, and the number of pooling layers is 3; the flattening layer can be used for realizing the multidimensional input unidimensional and transitioning from the convolution layer to the full connection layer; the over-fitting can be effectively relieved by random inactivation (dropout), the complex co-adaptation relation among neurons is reduced, and the regularization effect is achieved to a certain extent; the full link layer connects all the characteristics, the output value is sent to the classifier, the full convolutional layer is selected to replace the full link layer, the network parameters are greatly reduced, the nonlinear activation function selects the Softmax function, and the output is converted into the probability.
S18, compiling the model: an optimizer and a loss function are selected to determine how back propagation is calculated. The Adam optimizer is chosen: it is simple to implement, computationally efficient, has low memory requirements, is unaffected by gradient rescaling of the parameter updates, has good interpretability and automatically adjusts the learning rate. The multi-class cross entropy loss function is selected as the loss function to calculate the error between the model's predicted values and the true values.
And S19, training the model: the fit function is called to provide data to the model and the parameters are adjusted; batch_size groups the total number of samples, with 32 samples per group, the number of iterations is 100, and the remaining parameters are left at their defaults. Before training, all data are randomly shuffled, the weights are initialized, and L2 regularization is applied to even out the feature weights, completing the construction of the garbage classification model;
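Steps S18 and S19 then map onto compile and fit calls roughly as follows; x_train, y_train, x_val and y_val are assumed to come from the earlier preprocessing and 7:2:1 split, and only the batch size of 32 and the 100 iterations are taken from the text:

```python
from tensorflow.keras.optimizers import Adam

# x_train, y_train, x_val, y_val: arrays prepared in steps S13-S15
model = build_model(num_classes=4)
model.compile(optimizer=Adam(),                    # Adam optimizer, default learning rate
              loss="categorical_crossentropy",     # multi-class cross entropy
              metrics=["accuracy"])

history = model.fit(x_train, y_train,
                    validation_data=(x_val, y_val),
                    batch_size=32,                 # 32 samples per group
                    epochs=100,                    # 100 iterations
                    shuffle=True)                  # reshuffle the data each epoch
```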
after 100 iterations, as shown in fig. 6, the loss of the model on the training set was reduced to 0.16, and the accuracy was improved to 93.57%, as shown in fig. 7, the loss on the validation set was reduced to 0.20, the accuracy was improved to 91.82%, and the accuracy of the final test was 91.33%.
S2, importing the garbage classification model into a software system of the garbage classification detection device, and enabling the garbage picking robot and the garbage classification detection device to work in a combined mode;
step 21, debugging hardware equipment, and after all hardware is debugged, performing outdoor work on the robot;
step 22, in the inspection process of the robot, an incremental map is constructed through an SLAM technology based on an ROS robot system;
in SLAM technology, a robot recognizes a feature marker in an unknown environment by using a sensor equipped in the robot, and then calculates a relative position between the robot and the feature marker and coordinates in the global environment to construct an incremental map. The sensors used for avoiding the obstacle comprise an ultrasonic sensor, a binocular vision sensor, a laser radar, a sensor for distance measurement, a robot for adjusting the speed and calculating the response time to realize better obstacle avoidance. The robot is provided with a cantilever type mechanical arm which mainly comprises the actions of stretching, rotating, pitching and the like, various motions of the mechanical arm are constant in a normal state, the acceleration during starting and the speed before termination cannot be too large, and the phenomenon of impact or vibration is prevented. In general, the telescopic motion is driven by a linear hydraulic cylinder, the rotary motion is driven by a rotary cylinder or a rack cylinder, and the pitching motion is driven by a single piston rod. In the inspection process, obstacles can be avoided in real time, the camera detects garbage on the ground, the position information of the garbage is obtained, the garbage is moved to a corresponding position, and the manipulator control system starts analysis to collect the garbage.
Step 23, the robot avoids obstacles in real time through multi-sensor fusion of the binocular camera (3), the laser radar (4) and the ultrasonic sensor (9). A real-time picture of the surrounding environment is uploaded through the camera, and garbage on the ground is identified through the SSD-MobileNet target detection algorithm. The SSD algorithm detects on multi-scale feature maps: large targets are detected from high-level feature information with a large receptive field, and small targets from low-level feature information with a small receptive field, yielding the position information of the garbage, as shown in FIG. 5;
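As an illustration of running an SSD-MobileNet detector of this kind on the camera stream, the sketch below uses OpenCV's DNN module with a TensorFlow frozen graph; the model and config file names, the 300 × 300 input size, the confidence threshold and the returned box format are assumptions rather than details given in the patent:

```python
import cv2

# hypothetical exported SSD-MobileNet files; not provided by the patent
net = cv2.dnn.readNetFromTensorflow("ssd_mobilenet_frozen.pb", "ssd_mobilenet.pbtxt")

def detect_garbage(frame, conf_threshold=0.5):
    """Return a list of (class_id, confidence, (x1, y1, x2, y2)) for one BGR frame."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, size=(300, 300), swapRB=True)
    net.setInput(blob)
    detections = net.forward()              # shape (1, 1, N, 7)
    results = []
    for det in detections[0, 0]:
        _, class_id, conf, x1, y1, x2, y2 = det
        if conf >= conf_threshold:
            results.append((int(class_id), float(conf),
                            (int(x1 * w), int(y1 * h), int(x2 * w), int(y2 * h))))
    return results
```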
and 24, moving the robot to the position of the garbage, and collecting the garbage into a containing box by using a six-degree-of-freedom mechanical arm (5) of the robot.
Step 25, moving the garbage picking robot to a specified place, namely the position of the garbage classification detection device, after the garbage picking robot works for 2 hours;
26, placing the garbage on a detection table by a six-degree-of-freedom mechanical arm (5) of the garbage picking robot, transmitting garbage pictures by a USB camera (12) in real time, and identifying the garbage category by the system through the garbage classification model constructed in the step 1;
step 27, after the system identifies the garbage, the signal lamp on the corresponding garbage can flashes, and meanwhile the second wireless module (15) transmits a moving instruction to the mobile robot in the form of a four-dimensional vector: kitchen waste corresponds to [1,0,0,0], recyclables to [0,1,0,0], hazardous garbage to [0,0,1,0], and other garbage to [0,0,0,1];
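The mapping from the recognized category to the four-dimensional instruction, and its hand-off toward the lower computer, could be sketched as follows; pyserial is used purely as an illustrative stand-in for the link to the NRF24L01-equipped lower computer, and the port name, baud rate and message framing are assumptions:

```python
import serial  # pyserial, assumed to bridge the second upper and lower computer modules

INSTRUCTION = {
    "kitchen_waste": [1, 0, 0, 0],
    "recyclable":    [0, 1, 0, 0],
    "hazardous":     [0, 0, 1, 0],
    "other":         [0, 0, 0, 1],
}

def send_move_instruction(category, port="/dev/ttyUSB0"):
    """Forward the 4-dimensional move instruction for `category` to the lower computer."""
    vector = INSTRUCTION[category]
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        link.write(bytes(vector))   # e.g. b'\x00\x01\x00\x00' for a recyclable item
    return vector
```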
and step 28, finally, controlling the six-degree-of-freedom mechanical arm (5) by the mobile robot to throw the garbage into a corresponding garbage can, and finishing the whole work.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit of the corresponding technical solutions and scope of the present invention as defined in the appended claims.

Claims (4)

1. A computer vision-based garbage classification detection system is characterized in that: comprises a garbage picking robot and a garbage classification detection device;
the garbage picking robot comprises a first upper computer module (1), a first lower computer module (2), a binocular camera (3), a laser radar (4), a six-degree-of-freedom mechanical arm (5), a motor driving module (6), a first mobile power supply module (7), a first wireless module (8) and an ultrasonic sensor (9);
the first upper computer module (1) is connected with the binocular camera (3) and the laser radar (4) through a USB interface, and receives the processed road surface environment visual information collected by the binocular camera (3) and the obstacle position information collected by the laser radar (4); the first upper computer module is connected with a remote PC (personal computer) end through an SSH (secure Shell) protocol and transmits acquired visual information and laser radar information to the PC end, and the PC end monitors the motion of the robot in real time; the first lower computer module (2) is respectively connected with a six-degree-of-freedom mechanical arm (5), a motor driving module (6), a first wireless module (8) and a first mobile power supply module (7), controls the speed and the direction of the robot and receives a working instruction from the garbage classification detection device;
the garbage classification detection device comprises a second upper computer module (10), a second lower computer module (11), a USB camera (12), 4 LED signal lamps (13), a second mobile power supply module (14), a second wireless module (15), a detection table and four classification garbage boxes;
the second upper computer module (10) is connected with the USB camera (12) and the detection table, the second lower computer module (11) is connected with the 4 LED signal lamps (13), the second wireless module (15) and the second mobile power supply module (14) respectively and used for lighting the signal lamps and sending work instructions to the garbage picking robot, and the 4 signal lamps are placed above the garbage can respectively.
2. The computer vision-based garbage classification detection system of claim 1, wherein the four classification garbage boxes are for kitchen waste, recyclables, hazardous garbage and other garbage respectively.
3. A computer vision-based garbage classification detection method, which is realized by the computer vision-based garbage classification detection system of claim 1, and is characterized by comprising the following steps:
s1, collecting different garbage pictures, constructing a garbage classification model, and identifying garbage types;
step S2, importing the garbage classification model into a software system of the garbage classification detection device, and enabling the garbage picking robot and the garbage classification detection device to work in a combined mode;
step S21, debugging the hardware equipment of the garbage classification detection system, and starting the robot to perform inspection work after all the hardware is debugged;
s22, constructing an incremental map by using an SLAM technology based on an ROS robot system in the inspection process of the robot;
s23, the robot carries out real-time avoidance on the obstacle through multi-sensor fusion and comprises a binocular camera (3), a laser radar (4) and an ultrasonic sensor (9), a real-time picture of the surrounding environment is uploaded through the camera, and garbage on the ground is identified through a target detection algorithm SSD-Mobilenet, namely position information of the garbage is obtained;
step S24, the robot moves to the position of the garbage, and the robot controls the six-degree-of-freedom mechanical arm (5) to collect the garbage into the storage box;
step S25, after working for a preset time, the garbage picking robot moves to a specified place, namely the position of the garbage classification detection device;
s26, the garbage picking robot controls a six-degree-of-freedom mechanical arm (5) to place garbage on a detection table, a USB camera (12) transmits garbage pictures in real time, and the system identifies the garbage category through the garbage classification model constructed in the step 1;
step S27, after the system identifies the garbage, the signal lamp on the corresponding garbage can flickers, and meanwhile, the wireless module (15) transmits a moving instruction to the mobile robot, wherein the instruction is in a four-dimensional vector form;
and S28, finally, controlling the six-degree-of-freedom mechanical arm (5) by the mobile robot to throw the garbage into the corresponding garbage can, and finishing the work.
4. The computer vision-based trash classification detecting method of claim 3, wherein the step S1 specifically comprises the following steps:
step S11, making a garbage classification model data set, acquiring garbage sample data pictures through web crawlers and photographing, and unifying the formats and sizes of all the garbage sample data pictures;
step S12, increasing the quantity of garbage sample data;
step S13, respectively using the garbage sample data pictures as a training set, a verification set and a test set, training a model on the training set, adjusting the hyper-parameters on the verification set, and testing the effect of the model on the test set;
s14, preprocessing the data of the junk pictures, reducing the sizes of the pictures, removing the mean value, and subtracting the mean value of the dimensional data from the original data of each dimension of the sample to replace the original data;
step S15, manually labeling by using a target detection labeling tool, wherein each picture corresponds to a label which is a 4-dimensional vector and contains category and position information, and converting the label into one-hot codes;
s16, selecting a high-order deep learning API framework Keras to build a neural network, and adopting a Sequential model;
step S17, constructing the model network layers, and adding layers to the model, wherein the layers comprise an input layer, a convolution layer, a pooling layer, a fully connected layer and a flattening layer;
step S18, model compiling is carried out, and an optimizer and a loss function are selected to determine a method for calculating back propagation;
and S19, performing model training: calling the fit function to provide data to the model and adjusting the parameters, namely grouping the total number of samples, randomly shuffling all data before training, initializing the weights, and applying L2 regularization to even out the feature weights, completing the construction of the garbage classification model.
CN202010816611.9A 2020-08-14 2020-08-14 Garbage classification detection system and method based on computer vision Pending CN111974704A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010816611.9A CN111974704A (en) 2020-08-14 2020-08-14 Garbage classification detection system and method based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010816611.9A CN111974704A (en) 2020-08-14 2020-08-14 Garbage classification detection system and method based on computer vision

Publications (1)

Publication Number Publication Date
CN111974704A true CN111974704A (en) 2020-11-24

Family

ID=73434437

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010816611.9A Pending CN111974704A (en) 2020-08-14 2020-08-14 Garbage classification detection system and method based on computer vision

Country Status (1)

Country Link
CN (1) CN111974704A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112620165A (en) * 2020-12-11 2021-04-09 江西理工大学 Garbage classification method
CN112801039A (en) * 2021-03-03 2021-05-14 广西广播电视技术中心 Inferior digital television picture identification method based on improved MobilenetV2 network
CN112947202A (en) * 2021-02-26 2021-06-11 厦门理工学院 Intelligent cleaning system for water surface floating garbage and control method
CN113183138A (en) * 2021-04-26 2021-07-30 上海锵玫人工智能科技有限公司 Garbage carrying and sorting robot and control method thereof
CN113343838A (en) * 2021-06-03 2021-09-03 安徽大学 Intelligent garbage identification method and device based on CNN neural network
CN113351631A (en) * 2021-07-05 2021-09-07 北京理工大学 Photoelectric intelligent garbage sorting trolley system
CN113996543A (en) * 2021-10-09 2022-02-01 西安石油大学 Intelligent garbage sorting robot
CN115026015A (en) * 2022-06-10 2022-09-09 东北大学 Ground rubbish detection system based on image processing
CN115049914A (en) * 2022-07-04 2022-09-13 通号智慧城市研究设计院有限公司 Garbage classification method and device and terminal
CN116000895A (en) * 2023-03-28 2023-04-25 浙江大学 Quality detection robot and method for traditional Chinese medicine pharmacy process based on deep learning
CN116872233A (en) * 2023-09-07 2023-10-13 泉州师范学院 Campus inspection robot and control method thereof
CN117095242A (en) * 2023-10-18 2023-11-21 中交一公局第六工程有限公司 Intelligent building rubbish classification method and system based on machine vision

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2015124673A * 2014-12-11 2017-03-24 Xiaomi Inc. Method and device for garbage collection
CN108908373A (en) * 2018-08-20 2018-11-30 佛山信卓派思机械科技有限公司 A kind of garbage classification recycler device people
CN110436093A (en) * 2019-09-16 2019-11-12 福建工程学院 A kind of rubbish cleaning classification vehicle and rubbish clear up classification method
CN110884791A (en) * 2019-11-28 2020-03-17 石家庄邮电职业技术学院(中国邮政集团公司培训中心) Vision garbage classification system and classification method based on TensorFlow
CN111368895A (en) * 2020-02-28 2020-07-03 上海海事大学 Garbage bag target detection method and detection system in wet garbage

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2015124673A * 2014-12-11 2017-03-24 Xiaomi Inc. Method and device for garbage collection
CN108908373A (en) * 2018-08-20 2018-11-30 佛山信卓派思机械科技有限公司 A kind of garbage classification recycler device people
CN110436093A (en) * 2019-09-16 2019-11-12 福建工程学院 A kind of rubbish cleaning classification vehicle and rubbish clear up classification method
CN110884791A (en) * 2019-11-28 2020-03-17 石家庄邮电职业技术学院(中国邮政集团公司培训中心) Vision garbage classification system and classification method based on TensorFlow
CN111368895A (en) * 2020-02-28 2020-07-03 上海海事大学 Garbage bag target detection method and detection system in wet garbage

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Zhang Xiangrong: "Pattern Recognition", 30 September 2019, Xidian University Press *
Transwarp Technology Intelligent Platform Team: "Machine Learning in Action: Machine Learning Theory and Practice Based on the Sophon Platform", 31 January 2020, China Machine Press *
Tang Jiamin: "Fundamentals of Intelligent Robots", 31 August 2019, Shanghai Education Publishing House *
Wang Junbiao: "Digital Manufacturing Technology for Sheet Metal Parts", 31 August 2015, National Defense Industry Press *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112620165A (en) * 2020-12-11 2021-04-09 江西理工大学 Garbage classification method
CN112620165B (en) * 2020-12-11 2022-09-13 江西理工大学 Garbage classification method
CN112947202A (en) * 2021-02-26 2021-06-11 厦门理工学院 Intelligent cleaning system for water surface floating garbage and control method
CN112947202B (en) * 2021-02-26 2022-06-14 厦门理工学院 Intelligent cleaning system for water surface floating garbage and control method
CN112801039A (en) * 2021-03-03 2021-05-14 广西广播电视技术中心 Inferior digital television picture identification method based on improved MobilenetV2 network
CN113183138A (en) * 2021-04-26 2021-07-30 上海锵玫人工智能科技有限公司 Garbage carrying and sorting robot and control method thereof
CN113343838A (en) * 2021-06-03 2021-09-03 安徽大学 Intelligent garbage identification method and device based on CNN neural network
CN113351631A (en) * 2021-07-05 2021-09-07 北京理工大学 Photoelectric intelligent garbage sorting trolley system
CN113996543B (en) * 2021-10-09 2023-11-10 西安石油大学 Intelligent garbage sorting robot
CN113996543A (en) * 2021-10-09 2022-02-01 西安石油大学 Intelligent garbage sorting robot
CN115026015A (en) * 2022-06-10 2022-09-09 东北大学 Ground rubbish detection system based on image processing
CN115049914A (en) * 2022-07-04 2022-09-13 通号智慧城市研究设计院有限公司 Garbage classification method and device and terminal
CN116000895A (en) * 2023-03-28 2023-04-25 浙江大学 Quality detection robot and method for traditional Chinese medicine pharmacy process based on deep learning
CN116872233A (en) * 2023-09-07 2023-10-13 泉州师范学院 Campus inspection robot and control method thereof
CN117095242A (en) * 2023-10-18 2023-11-21 中交一公局第六工程有限公司 Intelligent building rubbish classification method and system based on machine vision
CN117095242B (en) * 2023-10-18 2023-12-26 中交一公局第六工程有限公司 Intelligent building rubbish classification method and system based on machine vision

Similar Documents

Publication Publication Date Title
CN111974704A (en) Garbage classification detection system and method based on computer vision
CN109389161A (en) Rubbish identification evolutionary learning method, apparatus, system and medium based on deep learning
CN109606991B (en) Intelligent garbage can and garbage classification method based on deep learning
CN110210635A (en) A kind of intelligent classification recovery system that can identify waste
CN110116415A (en) A kind of Bottle & Can class rubbish identification sorting machine people based on deep learning
CN108182455A (en) A kind of method, apparatus and intelligent garbage bin of the classification of rubbish image intelligent
CN203782622U (en) Intelligent garbage-collecting-sorting educating robot
CN109261539A (en) A kind of garbage sorting system and method for view-based access control model identification and convolutional neural networks
CN110516625A (en) A kind of method, system, terminal and the storage medium of rubbish identification classification
CN111709333A (en) Traceability early warning system and health monitoring method based on abnormal excrement of cage-raised chickens
CN108672316A (en) A kind of micro parts quality detecting system based on convolutional neural networks
Che et al. Intelligent robotic control system based on computer vision technology
CN112528979B (en) Transformer substation inspection robot obstacle distinguishing method and system
CN110110752A (en) A kind of identification of rubbish and classification method, device and terminal device
CN113469264A (en) Construction method of automatic garbage classification model, garbage sorting method and system
CN111652214A (en) Garbage bottle sorting method based on deep learning
CN105303162A (en) Target proposed algorithm-based insulator recognition algorithm for aerial images
CN113569971B (en) Image recognition-based catch target classification detection method and system
Mitra Detection of waste materials using deep learning and image processing
CN116128879B (en) Lightweight transmission line defect detection method and device
Mittal et al. Trash classification: classifying garbage using deep learning
CN112975970A (en) Vision grabbing mechanical arm system
Gill et al. Garbage Classification Utilizing Effective Convolutional Neural Network
Cai et al. Research on Computer Vision-Based Waste Sorting System
Qing et al. Multi-Class on-Tree Peach Detection Using Improved YOLOv5s and Multi-Modal Images.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20201124)