CN111268317B - Garbage classification processing method and device, terminal and storage medium - Google Patents


Info

Publication number
CN111268317B
Authority
CN
China
Prior art keywords
garbage
type
article
target
voice information
Prior art date
Legal status
Active
Application number
CN202010140615.XA
Other languages
Chinese (zh)
Other versions
CN111268317A (en)
Inventor
田金戈
徐国强
Current Assignee
OneConnect Financial Technology Co Ltd Shanghai
Original Assignee
OneConnect Financial Technology Co Ltd Shanghai
Priority date
Filing date
Publication date
Application filed by OneConnect Financial Technology Co Ltd Shanghai
Priority to CN202010140615.XA
Publication of CN111268317A
Priority to PCT/CN2020/105943 (WO2021174759A1)
Application granted
Publication of CN111268317B

Classifications

    • B65F1/14 Refuse receptacles; Accessories therefor; Other constructional features; Accessories
    • B65F1/0033 Refuse receptacles specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
    • B65F2001/008 Means for automatically selecting the receptacle in which refuse should be placed
    • B65F2210/138 Equipment of refuse receptacles; Identification means
    • B65F2210/168 Equipment of refuse receptacles; Sensing means
    • B65F2210/176 Equipment of refuse receptacles; Sorting means
    • B65F2210/178 Equipment of refuse receptacles; Steps
    • G06F16/245 Information retrieval; Querying; Query processing
    • G06F18/214 Pattern recognition; Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/24 Pattern recognition; Classification techniques
    • G10L25/60 Speech or voice analysis specially adapted for measuring the quality of voice signals
    • Y02W30/10 Technologies for solid waste management; Waste collection, transportation, transfer or storage, e.g. segregated refuse collecting

Abstract

The invention provides a garbage classification processing method in which a first article type of a target article in an image is identified by an article classification model; second voice information is received; when the second voice information indicates that the first article type is wrong, a second article type input by the user is received; a garbage classification database is queried for a known article type corresponding to the second article type; when no such known article type exists, the intelligent garbage can is controlled to open all garbage throwing inlets, and whether any garbage throwing inlet senses an article throwing signal is detected; when the signal is detected, that garbage throwing inlet is marked as the target throwing inlet and the garbage type corresponding to the target throwing inlet is acquired; and the garbage type and the second article type of the target article are stored in the garbage classification database in association, and the garbage classification model is updated. The invention further provides a garbage classification processing device, a terminal and a computer-readable storage medium. The invention can improve the accuracy of garbage classification and automatically update the garbage classification model in real time.

Description

Garbage classification processing method and device, terminal and storage medium
Technical Field
The invention relates to the technical field of intelligent classification, and in particular to a garbage classification processing method and device, a terminal and a storage medium.
Background
In recent years, with the improvement of living standards, people have paid more attention to garbage classification and treatment, and classified garbage treatment is gradually being popularized in large and medium-sized cities in China. Because education on garbage classification is still insufficient, in actual garbage classification practice the garbage is often classified simply by integrating an object detection algorithm into an intelligent garbage can product.
However, because of the wide variety of garbage articles, it is difficult for a conventional object detection algorithm to classify garbage accurately and effectively; moreover, a conventional object detection algorithm is relatively rigid, and its judgment strategy cannot be improved automatically according to the actual situation.
Therefore, it is necessary to provide a method capable of accurately classifying garbage and automatically improving its judgment strategy according to the actual situation.
Disclosure of Invention
In view of the above, there is a need for a garbage classification processing method, apparatus, terminal and computer-readable storage medium that can solve the following problems: given the wide variety of actual garbage items, a conventional object detection algorithm has difficulty classifying garbage accurately and effectively, is relatively rigid, and cannot automatically improve its judgment strategy according to the actual situation.
The first aspect of the embodiments of the present invention provides a garbage classification processing method, which is applied to an intelligent garbage can, and the garbage classification processing method includes:
acquiring first voice information of a user, and detecting whether the first voice information meets a preset activation condition or not through a voice detection model;
when the detection result is that the first voice information meets the preset activation condition, acquiring an image of a target object, and identifying a first object type of the target object in the image through an object classification model;
outputting the first article type to a user in a voice prompt mode, and receiving second voice information input by the user, wherein the second voice information is used for indicating whether the first article type is correct or not;
when the second voice information shows that the first article type is correct, obtaining a garbage type corresponding to the first article type through a garbage classification model, and controlling the intelligent garbage can to open a garbage throwing inlet corresponding to the garbage type for a user to throw;
when the second voice information shows that the first article type is wrong, receiving a second article type of the target article input by a user;
according to the second article type, traversing a garbage classification database, and querying whether a known article type corresponding to the second article type exists;
when no known article type corresponding to the second article type exists, controlling the intelligent garbage can to open all garbage throwing inlets, and detecting whether any garbage throwing inlet senses and generates an article throwing signal;
when it is detected that a garbage throwing inlet generates an article throwing signal, marking that garbage throwing inlet as a target throwing inlet and acquiring the garbage type corresponding to the target throwing inlet;
and storing the garbage type and the second article type of the target article in the garbage classification database in an associated manner, and updating the garbage classification model according to the updated garbage classification database.
Further, in the above garbage classification processing method provided by the embodiment of the present invention, before the step of identifying the first item type of the target item in the image by the item classification model, the method further includes:
calculating a quality evaluation value of the image;
detecting whether the quality evaluation value meets a preset quality threshold value;
when the detection result is that the quality evaluation value does not meet the preset quality threshold, the image of the target object is collected again until the quality evaluation value of the collected image meets the preset quality threshold;
segmenting a target area in the image meeting the preset quality threshold to obtain a target image;
post-processing the target image, the post-processing including one or more of: white balance processing and equalization processing.
Further, in the above garbage classification processing method provided in an embodiment of the present invention, the step of querying whether there is a known article type corresponding to the second article type by traversing the garbage classification database according to the second article type includes:
inputting the second article type into a pre-trained name similarity acquisition model to obtain a synonym set related to the second article type;
detecting whether a target synonym exists in the synonym set, wherein the target synonym is consistent with the article type name of a known article type in the garbage classification database;
and when the detection result is that such a target synonym exists in the synonym set, determining that a known article type corresponding to the second article type exists.
Further, in the above garbage classification processing method provided in the embodiment of the present invention, the method further includes:
when the detection result is that a known article type corresponding to the second article type exists, inputting the known article type into a garbage classification model to obtain the garbage type corresponding to the target article;
controlling the intelligent garbage can to open a garbage throwing inlet corresponding to the garbage type;
detecting whether the garbage throwing inlet senses and generates an article throwing signal;
and when it is detected that the garbage throwing inlet generates an article throwing signal, controlling the intelligent garbage can to close the garbage throwing inlet.
Further, in the above garbage classification processing method provided in the embodiment of the present invention, the method further includes:
when the second voice information shows that the first article type is wrong, storing the image with the wrong first article type identification and the corresponding second article type into the garbage classification database;
calculating the number of images stored in a preset time period;
judging whether the number exceeds a preset number threshold value;
and when the number exceeds the preset number threshold value as a judgment result, taking the stored images and the corresponding second article types as a new training data set, and updating the article classification model based on the new training data set.
Further, in the above garbage classification processing method provided in the embodiment of the present invention, before the step of detecting, by the speech detection model, whether the first speech information satisfies a preset activation condition, the method further includes:
calculating the clarity of the first voice information;
judging whether the clarity of the first voice information exceeds a preset clarity threshold;
and when the clarity of the first voice information does not exceed the preset clarity threshold, re-acquiring the first voice information of the user.
Further, in the above garbage classification processing method provided in an embodiment of the present invention, the training method of the article classification model includes:
crawling a plurality of article images of different article types to obtain an original data set;
dividing the original data set into a training set and a verification set;
inputting the training set into a preset deep learning model for training to obtain an article classification model;
inputting the verification set into an article classification model for verification to obtain a verification passing rate;
detecting whether the verification passing rate exceeds a preset passing rate threshold value or not;
and when the detection result is that the verification passing rate exceeds a preset passing rate threshold value, determining that the training of the article classification model is finished.
The second aspect of the embodiments of the present invention further provides a garbage classification processing apparatus, which is applied to an intelligent garbage can, the garbage classification processing apparatus including:
the condition detection module is used for acquiring first voice information of a user and detecting whether the first voice information meets a preset activation condition or not through a voice detection model;
the type determining module is used for acquiring an image of a target article when the detection result shows that the first voice information meets the preset activation condition, and identifying a first article type of the target article in the image through an article classification model;
the information confirmation module is used for outputting the first article type to a user in a voice prompt mode and receiving second voice information input by the user, wherein the second voice information is used for indicating whether the first article type is correct or not;
the garbage throwing module is used for obtaining the garbage type corresponding to the first article type through a garbage classification model when the second voice information shows that the first article type is correct, and controlling the intelligent garbage can to open a garbage throwing inlet corresponding to the garbage type for a user to throw;
the category receiving module is used for receiving a second category of the target item input by a user when the second voice information shows that the first item category is wrong;
the type detection module is used for traversing the garbage classification database according to the second article type and inquiring whether a known article type corresponding to the second article type exists;
the entrance opening module is used for controlling the intelligent garbage bin to open all the garbage throwing entrances and detecting whether the garbage throwing entrances sense and generate an article throwing signal or not when a known article type corresponding to the second article type does not exist;
the category determining module is used for marking the garbage throwing inlet as a target throwing inlet and acquiring the garbage category corresponding to the target throwing inlet when detecting that the garbage throwing inlet generates an article throwing signal;
and the model updating module is used for storing the garbage type and the second article type of the target article in the garbage classification database in an associated manner, and updating the garbage classification model according to the updated garbage classification database.
A third aspect of an embodiment of the present invention further provides a terminal, where the terminal includes a processor, and the processor is configured to implement any one of the foregoing garbage classification processing methods when executing a computer program stored in a memory.
The fourth aspect of the embodiments of the present invention further provides a computer-readable storage medium, where a computer program is stored, where the computer program is executed by a processor to implement the garbage classification processing method according to any one of the above-mentioned embodiments.
The embodiments of the present invention provide a garbage classification processing method, a garbage classification processing device, a terminal and a computer-readable storage medium, in which first voice information of a user is acquired, and whether the first voice information meets a preset activation condition is detected through a voice detection model; when the detection result is that the first voice information meets the preset activation condition, an image of a target article is acquired, and a first article type of the target article in the image is identified through an article classification model; the first article type is output to the user in a voice prompt mode, and second voice information input by the user is received, the second voice information being used for indicating whether the first article type is correct; when the second voice information indicates that the first article type is correct, the garbage type corresponding to the first article type is obtained through a garbage classification model, and the intelligent garbage can is controlled to open the garbage throwing inlet corresponding to the garbage type for the user to throw; when the second voice information indicates that the first article type is wrong, a second article type of the target article input by the user is received; a garbage classification database is traversed according to the second article type to query whether a known article type corresponding to the second article type exists; when no known article type corresponding to the second article type exists, the intelligent garbage can is controlled to open all garbage throwing inlets, and whether any garbage throwing inlet senses and generates an article throwing signal is detected; when it is detected that a garbage throwing inlet generates an article throwing signal, that garbage throwing inlet is marked as the target throwing inlet, and the garbage type corresponding to the target throwing inlet is acquired; and the garbage type and the second article type of the target article are stored in the garbage classification database in association, and the garbage classification model is updated according to the updated garbage classification database. In the embodiments of the invention, human-machine interaction is completed using a camera unit and a voice unit, which improves the accuracy of garbage classification; a two-stage classification method is adopted in which the article type is determined first and the garbage type is then determined from the article type, which further improves the accuracy of garbage classification; the garbage throwing inlet corresponding to the garbage type of each identified target article is opened through the control unit, so that random throwing of garbage caused by users not following the rules is avoided; and when the article type of the target article is an unknown article type, the garbage type corresponding to the thrown article is detected through the infrared scanning unit, so that the garbage type corresponding to the unknown article type is obtained and the garbage classification model is automatically updated in real time.
Drawings
Fig. 1 is a flowchart of a garbage classification processing method according to a first embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Fig. 3 is an exemplary functional block diagram of the terminal shown in fig. 2.
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention; the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Fig. 1 is a flowchart of a garbage classification processing method according to a first embodiment of the present invention. The garbage classification processing method is applied to an intelligent garbage can, and the intelligent garbage can comprises: a camera unit, a voice unit, an infrared scanning unit, a control unit and a garbage throwing box. The camera unit is used for collecting images of a target article; the voice unit is used for receiving voice information output by a user, or for outputting voice information to the user; the infrared scanning unit is used for detecting whether the target article is thrown into the garbage throwing box; the garbage throwing box contains garbage throwing inlets, each garbage throwing inlet is used for throwing garbage into the garbage throwing box, and each preset garbage category has a corresponding garbage throwing box. In an embodiment, the preset categories of garbage throwing boxes can include a dry garbage box, a wet garbage box, a harmful garbage box and a recyclable garbage box, which are not particularly limited herein.
As shown in fig. 1, an embodiment of the present invention provides a garbage classification processing method, where the garbage classification processing method may include the following steps:
s11, first voice information of a user is obtained, whether the first voice information meets a preset activation condition or not is detected through a voice detection model, and when the detection result shows that the first voice information meets the preset activation condition, the step S12 is executed.
In at least one embodiment of the invention, first voice information input by a user is collected through the voice unit, and whether the first voice information meets a preset activation condition is detected through a voice detection model. The voice detection model is a model for voice detection obtained by training a deep learning network. The preset activation condition is a voice instruction preset by a system user and may include preset keywords, for example keywords such as "garbage classification", "garbage identification" and "garbage throwing".
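As an illustration of the activation check, the following minimal Python sketch tests a transcribed utterance against preset activation keywords; the keyword list and the assumption that an upstream speech recognizer has already produced a transcript are illustrative, not part of the patented voice detection model.

    # Minimal sketch of the preset-activation-condition check (keywords are assumed;
    # the transcript is presumed to come from an upstream speech recognizer).
    ACTIVATION_KEYWORDS = ("garbage classification", "garbage identification", "garbage throwing")

    def meets_activation_condition(transcript: str) -> bool:
        """Return True when the first voice information contains any preset activation keyword."""
        text = transcript.lower()
        return any(keyword in text for keyword in ACTIVATION_KEYWORDS)

    print(meets_activation_condition("start garbage classification, please"))  # True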
In the process of actually inputting the first voice information, the voice information input by the user may be unclear for various reasons (for example, the user speaks at a low volume or the surrounding environment is noisy). In this case, the user can be prompted to re-input the voice information (for example, by outputting the voice prompt "I did not hear clearly, please input again"), which avoids the situation where the voice detection model cannot recognize the voice information correctly because it is unclear. Thus, before the step of detecting whether the first voice information satisfies the preset activation condition through the voice detection model, the method further includes: calculating the clarity of the first voice information; judging whether the clarity of the first voice information exceeds a preset clarity threshold; and when the clarity of the first voice information does not exceed the preset clarity threshold, re-acquiring the first voice information of the user. Here, speech clarity describes the intelligibility of speech in a noisy environment. The preset clarity threshold is a threshold preset by a system user; for example, the preset clarity threshold may be 95%. In one embodiment, the step of calculating the clarity of the first voice information may include: performing voice activity detection on the first voice information to obtain the environmental noise information carried by the first voice information; calculating the ratio of the environmental noise information to the first voice information; and obtaining the clarity of the first voice information according to the ratio. The environmental noise information may be represented by a physical quantity such as the amplitude or energy of the noise.
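A minimal sketch of this clarity estimate is given below, assuming the raw audio samples are available as a NumPy array; the energy-based frame rule stands in for a real voice activity detector, and the 0.2 noise criterion and 95% threshold are assumed values.

    import numpy as np

    def speech_clarity(samples: np.ndarray, frame_len: int = 400) -> float:
        """Estimate clarity as one minus the ratio of ambient-noise energy to total energy."""
        usable = samples[: len(samples) // frame_len * frame_len]
        frames = usable.reshape(-1, frame_len).astype(np.float64)
        energy = (frames ** 2).mean(axis=1)
        noise_energy = energy[energy < 0.2 * energy.mean()].sum()  # frames treated as ambient noise
        return 1.0 - noise_energy / (energy.sum() + 1e-12)

    # Re-acquire the first voice information when clarity falls below the preset
    # threshold, e.g. speech_clarity(samples) < 0.95.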
When the judgment result is that the first voice information meets the preset activation condition, the image of the target article is acquired, and the article type of the target article in the image is identified through an article classification model. In at least one embodiment of the present invention, before the step of acquiring the image of the target article, the method further comprises: starting the camera unit and initializing the trained article classification model. Preferably, when the voice unit has not received a voice instruction meeting the preset activation condition, the camera unit is kept in a closed state, so as to save electric energy.
In at least one embodiment of the present invention, the training method of the trained object classification model may include: constructing an original data set; dividing the original data set into a training set and a verification set; inputting the training set into a preset deep learning model for training to obtain an article classification model, and verifying the generated article classification model by using the verification set; and if the verification passing rate is greater than the preset passing rate threshold value, finishing the training, otherwise, increasing the number of the images of the target object in the verification set so as to carry out the training and verification again.
The preset passing rate threshold is preset by a user; for example, the preset passing rate threshold is 99%. The method for constructing the original data set comprises: crawling a plurality of images of related articles using web crawler technology; labelling the plurality of images of related articles with article types; and constructing the original data set based on the plurality of images of the related articles and the corresponding article types. The plurality of images of related articles can be acquired from mainstream image search engines and image sharing websites using web crawler technology. The mainstream image search engine may be, for example, Baidu or Google, and the image sharing website may be, for example, Flickr or Instagram.
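The sketch below illustrates the split into a training set and a verification set and the verification pass-rate check; the 80/20 ratio, the 99% threshold and the predict callable standing in for the trained article classification model are assumptions made only for illustration.

    import random

    def split_dataset(samples, train_ratio=0.8, seed=42):
        """Shuffle labelled (image, article_type) pairs and split them into training and verification sets."""
        rng = random.Random(seed)
        shuffled = list(samples)
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * train_ratio)
        return shuffled[:cut], shuffled[cut:]

    def verification_pass_rate(predict, verification_set):
        """Fraction of verification samples whose predicted article type matches the label."""
        correct = sum(1 for image, label in verification_set if predict(image) == label)
        return correct / max(len(verification_set), 1)

    # Training is considered finished once the pass rate exceeds the preset threshold,
    # e.g. verification_pass_rate(model_predict, val_set) > 0.99; otherwise more images
    # are added and training and verification are repeated.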
In an optional embodiment, after the step of constructing the original data set based on the plurality of images of the related articles and the corresponding article types, the method further comprises: calculating the number of images of the target article for each article type; judging whether the number is smaller than a preset number threshold; and when the judgment result is that the number is smaller than the preset number threshold, increasing the number of images of that article type by a perturbation method. The images of the target article of that article type are perturbed to increase their number, which avoids the situation where the trained article classification model generalizes poorly to images of a certain article type because there are too few sample images of that type. The perturbation method is prior art and is not described in detail herein.
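A minimal sketch of one possible perturbation is shown below; horizontal flips, brightness jitter and light Gaussian noise are assumed stand-ins, since the concrete perturbation method is left unspecified.

    import numpy as np

    def perturb_image(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
        """Return one perturbed copy of an H x W x 3 uint8 image to enlarge an under-represented article type."""
        out = image.astype(np.float32)
        if rng.random() < 0.5:
            out = out[:, ::-1]                        # horizontal flip
        out = out * rng.uniform(0.8, 1.2)             # brightness jitter
        out = out + rng.normal(0.0, 5.0, out.shape)   # light Gaussian noise
        return np.clip(out, 0, 255).astype(np.uint8)

    # Example: keep adding perturbed copies until the article type reaches the preset number threshold.
    rng = np.random.default_rng(0)
    sample = np.zeros((64, 64, 3), dtype=np.uint8)
    augmented = [perturb_image(sample, rng) for _ in range(5)]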
S12, collecting an image of the target object, and identifying a first object type of the target object in the image through an object classification model.
In at least one embodiment of the present invention, an image of a target item is captured and a first item type of the target item in the image is identified by an item classification model. The article classification model is a model which is preset by a user and is used for classifying target articles.
In order to improve the identification accuracy of the item classification model, before the step of identifying the item type of the target item in the image by the item classification model, the method further comprises: calculating a quality evaluation value of the image; detecting whether the quality evaluation value meets a preset quality threshold value; when the detection result is that the quality evaluation value does not meet the preset quality threshold, the image of the target object is collected again until the quality evaluation value of the collected image meets the preset quality threshold; segmenting a target area in the image meeting the preset quality threshold value to obtain a target image; post-processing the target image, the post-processing including one or more of: white balance processing and equalization processing.
In one embodiment, the step of calculating the quality evaluation value of the image includes: acquiring the average brightness, the noise intensity and the sharpness of the image; obtaining an image brightness evaluation value, an image noise evaluation value and an image sharpness evaluation value from the average brightness, the noise intensity and the sharpness, respectively; and performing a weighted calculation on the image brightness evaluation value, the image noise evaluation value and the image sharpness evaluation value to obtain the quality evaluation value of the image. The average brightness, noise intensity and sharpness of the image can be calculated through preset neural networks, which are not described in detail here. The average brightness, the noise intensity and the sharpness each correspond to the image brightness evaluation value, the image noise evaluation value and the image sharpness evaluation value, respectively. For example, boundary thresholds a, b, c and d can be set for the average luminance L of an image, and when the average luminance L falls within different boundary ranges, there is a corresponding image brightness evaluation value.
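The sketch below shows the weighted quality score; the boundary thresholds, per-range evaluation values and weights are illustrative assumptions rather than values taken from the patent.

    def brightness_evaluation(avg_luminance: float, bounds=(40, 80, 170, 220)) -> float:
        """Map the average luminance L to an evaluation value using assumed boundary thresholds a < b < c < d."""
        a, b, c, d = bounds
        if b <= avg_luminance <= c:        # well-exposed range
            return 1.0
        if a <= avg_luminance < b or c < avg_luminance <= d:
            return 0.5
        return 0.0

    def image_quality_score(brightness_eval: float, noise_eval: float, sharpness_eval: float,
                            weights=(0.3, 0.3, 0.4)) -> float:
        """Weighted combination of the brightness, noise and sharpness evaluation values."""
        w_b, w_n, w_s = weights
        return w_b * brightness_eval + w_n * noise_eval + w_s * sharpness_eval

    # Re-acquire the image when the score is below the preset quality threshold,
    # e.g. image_quality_score(brightness_evaluation(130), 0.8, 0.9) < 0.7.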
Specifically, the image quality of the target article may be assessed by, for example, the variance or the mean, and images of the target article whose variance is smaller than a preset variance threshold, or whose mean is smaller than a preset mean threshold, are rejected. In an actual scene, the proportion of the image occupied by the region containing features may be relatively small; for example, in the image of the target article, the article features may be located only in the middle of the image while the remaining positions are blank. Selecting the useful region containing the article features and segmenting it from the image of the target article as the target region helps to accelerate feature extraction by the article classification model during training. In one embodiment, a YOLO object detection algorithm may be used to detect the target region in the image of the target article, and the target region is then segmented from the image. In one embodiment, the post-processing may be white balance processing, equalization processing and the like. The target image may be white-balanced using an open-source white balance tool, and may also be equalized using an open-source equalization tool.
S13, outputting the first article type to a user in a voice prompt mode, and receiving second voice information input by the user, wherein the second voice information is used for indicating whether the first article type is correct, when the second voice information indicates that the first article type is correct, executing a step S14, and when the second voice information indicates that the first article type is wrong, executing a step S15.
The first article type is output to a user through a voice unit in a voice prompt mode, the user is prompted to confirm whether the first article type is correct, and second voice information input by the user is received through the voice unit to confirm whether the first article type is correct. The second voice message may be "correct" or "wrong", which is not limited herein.
S14, obtaining the garbage type corresponding to the first article type through a garbage classification model, and controlling the intelligent garbage can to open a garbage throwing inlet corresponding to the garbage type for a user to throw.
In at least one embodiment of the present invention, when the second voice information indicates that the first article type is correct, the first article type is input into the garbage classification model to obtain the garbage type corresponding to the target article, and the intelligent garbage can is controlled to open the garbage throwing inlet corresponding to that garbage type for the user to throw the article in. It can be understood that, after the step of controlling the intelligent garbage can to open the garbage throwing inlet corresponding to the garbage type, the method further comprises: detecting whether the garbage throwing inlet senses and generates a throwing signal of the target article; and when the garbage throwing inlet senses and generates a throwing signal of the target article, controlling the intelligent garbage can to close the garbage throwing inlet.
The garbage classification model is a model trained in advance with a large amount of training data and used for determining the garbage type of an article of a determined article type. The article type is input into the garbage classification model to obtain the garbage type corresponding to the target article. For example, when the article type of the target article is a beverage bottle, the corresponding garbage type is recyclable garbage. The garbage throwing inlets of the intelligent garbage can may include: a dry garbage throwing inlet, a wet garbage throwing inlet, a harmful garbage throwing inlet and a recyclable garbage throwing inlet. An infrared scanning unit can be arranged at each type of garbage throwing inlet; preferably, one infrared scanning unit may serve several types of garbage throwing inlets, thereby saving cost.
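For illustration only, the second-stage mapping from article type to garbage type can be pictured as a lookup, as in the sketch below; the table entries are assumed examples standing in for the trained garbage classification model.

    # Illustrative stand-in for the trained garbage classification model (entries are assumptions).
    ITEM_TO_GARBAGE = {
        "beverage bottle": "recyclable garbage",
        "banana peel": "wet garbage",
        "battery": "harmful garbage",
        "used tissue": "dry garbage",
    }

    def garbage_type_for(item_type: str):
        """Return the garbage type for a known article type, or None when the type is unknown."""
        return ITEM_TO_GARBAGE.get(item_type)

    print(garbage_type_for("beverage bottle"))  # recyclable garbage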
In at least one embodiment of the invention, the control unit controls the intelligent garbage can to open the corresponding garbage throwing inlet and starts the infrared scanning unit; the infrared scanning unit senses whether the target article is thrown into the garbage throwing inlet, and a corresponding article throwing signal is generated when it senses that the target article has been thrown in. Preferably, after the step of controlling the intelligent garbage can to open the corresponding garbage throwing inlet according to the garbage type, the method further comprises: outputting a voice prompt through the voice unit to prompt the user to throw the target article into the opened garbage throwing inlet of the intelligent garbage can.
It can be understood that the garbage type corresponding to the first article type is identified according to the garbage classification model, the corresponding garbage throwing inlet of the intelligent garbage can is opened through the control unit, and the target article is thrown into that garbage throwing inlet by the user, so that random throwing of garbage caused by users not following the rules is avoided.
In at least one embodiment of the present invention, the step of detecting whether the garbage throwing inlet senses and generates an article throwing signal further includes: detecting whether the garbage throwing inlet senses and generates a throwing signal of an article other than the target article; when the garbage throwing inlet senses and generates a throwing signal of another article, obtaining the identity information of the user according to the first voice information; and updating a label corresponding to the identity information, wherein the label includes a garbage-throwing-error count label. In other embodiments, the user identity information containing the label can be output to the relevant garbage disposal department periodically, and the department can educate users whose garbage-throwing errors are excessive, so that users' awareness of throwing garbage correctly is fundamentally improved.
And S15, receiving a second item type of the target object input by a user.
In at least one embodiment of the present invention, when the second voice message indicates that the first item type is wrong, a voice unit prompts a user to output a second item type of the target item. For example, the content of the prompt may be: "please say the kind of the article".
And S16, according to the second item type, traversing the garbage classification database, inquiring whether a known item type corresponding to the second item type exists, and if the known item type corresponding to the second item type does not exist, executing the step S17.
In at least one embodiment of the present invention, the garbage classification database is traversed according to the second article type to query whether a known article type corresponding to the second article type exists. The garbage classification database comprises data such as the names of related articles, images of related articles, article type names of related articles and the garbage type names corresponding to related articles, and these data are stored in association with one another.
Specifically, the step of querying whether a known article type corresponding to the second article type exists by traversing the garbage classification database according to the second article type includes: inputting the second article type into a pre-trained name similarity acquisition model to obtain a synonym set related to the second article type; detecting whether a target synonym exists in the synonym set, the target synonym being consistent with the article type name of a known article type in the garbage classification database; and when the detection result is that such a target synonym exists, determining that a known article type corresponding to the second article type exists. For example, when the input article type name of the second article type is "mineral water bottle", the synonym set of "mineral water bottle" obtained by the name similarity acquisition model includes "plastic bottle", "drinking water bottle", "beverage bottle" and the like. Assume that the article type name "beverage bottle" exists in the garbage classification database as a known article type. Since "beverage bottle" exists in the synonym set and is consistent with the article type name of a known article type, it is determined that "mineral water bottle" corresponds to a known article type.
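The sketch below mirrors the "mineral water bottle" example; the dictionary-backed synonym model is a trivial stand-in for the pre-trained name similarity acquisition model, whose interface is assumed.

    class DictSynonymModel:
        """Trivial stand-in for the pre-trained name similarity acquisition model (assumed interface)."""

        def __init__(self, table):
            self.table = table

        def synonyms(self, name):
            return set(self.table.get(name, ()))

    def find_known_type(second_item_type, synonym_model, known_item_types):
        """Return a known article type whose name matches the input or one of its synonyms, else None."""
        candidates = {second_item_type} | synonym_model.synonyms(second_item_type)
        matches = candidates & set(known_item_types)
        return next(iter(matches), None)

    model = DictSynonymModel({"mineral water bottle": ("plastic bottle", "drinking water bottle", "beverage bottle")})
    known_types = {"beverage bottle", "battery", "banana peel"}
    print(find_known_type("mineral water bottle", model, known_types))  # beverage bottle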
In at least one embodiment of the invention, the method further comprises: when the detection result is that a known article type corresponding to the second article type exists, inputting the known article type into the garbage classification model to obtain the garbage type corresponding to the target article; controlling the intelligent garbage can to open the garbage throwing inlet corresponding to that garbage type; detecting whether the garbage throwing inlet senses and generates an article throwing signal; and when it is detected that the garbage throwing inlet generates an article throwing signal, controlling the intelligent garbage can to close the garbage throwing inlet.
In at least one embodiment of the invention, the method further comprises: when the second voice information shows that the first article type is wrong, storing the image with the wrong first article type identification and the corresponding second article type into the garbage classification database; calculating the number of images stored in a preset time period; judging whether the number exceeds a preset number threshold value; and when the number exceeds the preset number threshold value as a judgment result, taking the stored images and the corresponding second article types as a new training data set, and updating the article classification model based on the new training data set. The preset number threshold is preset, for example, the preset number threshold is 10.
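A minimal sketch of this retraining trigger is given below; the record layout (timestamp, image path, second article type) and the 7-day window are assumptions made for illustration.

    from datetime import datetime, timedelta

    def should_retrain(stored_records, window_days=7, count_threshold=10):
        """Count misidentified images stored within the preset time period and compare against the threshold.

        `stored_records` is assumed to be a list of (timestamp, image_path, second_item_type) tuples.
        Returns whether a new training data set should be built, together with the recent records.
        """
        cutoff = datetime.now() - timedelta(days=window_days)
        recent = [record for record in stored_records if record[0] >= cutoff]
        return len(recent) > count_threshold, recent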
S17, controlling the intelligent garbage can to open all garbage throwing inlets, detecting whether the garbage throwing inlets generate an article throwing signal in an induction mode, and executing the step S18 when detecting that the garbage throwing inlets generate the article throwing signal.
In at least one embodiment of the present invention, when there is no known article type corresponding to the second article type, the garbage type corresponding to the target article cannot be identified accurately by the garbage classification model, so the control unit cannot open a single corresponding garbage throwing inlet; instead, the control unit controls the intelligent garbage can to open all garbage throwing inlets so that the user completes the classified throwing manually. While all garbage throwing inlets are open, whether the infrared scanning unit senses and generates a throwing signal of the target article is detected; when the detection result is that the infrared scanning unit generates no throwing signal of the target article within a preset time interval, the garbage throwing inlets are closed. The preset time interval is preset; for example, the preset time interval is 10 seconds. When the detection result is that the infrared scanning unit generates a throwing signal of the target article, step S18 is executed.
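The polling loop below sketches this open-all-inlets step; read_inlet_signals is an assumed callable wrapping the infrared scanning unit, and the 10-second interval follows the example in the text.

    import time

    def wait_for_deposit(read_inlet_signals, timeout_s=10.0, poll_s=0.2):
        """Poll the infrared scanning unit until an inlet senses a deposit or the preset interval expires.

        `read_inlet_signals` is assumed to return a dict mapping inlet name to a boolean signal.
        Returns the name of the inlet that sensed the article, or None on timeout (all inlets are then closed).
        """
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            for inlet, triggered in read_inlet_signals().items():
                if triggered:
                    return inlet
            time.sleep(poll_s)
        return None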
And S18, marking the garbage throwing inlet as a target throwing inlet and acquiring the garbage type corresponding to the target throwing inlet.
In at least one embodiment of the present invention, when it is detected that the trash input inlet generates an item input signal, the control unit closes the infrared scanning unit and closes all trash input inlets corresponding to the intelligent trash can. After the step of controlling the intelligent trash can to close all the trash input inlets, the method further comprises: and marking the garbage throwing inlet as a target throwing inlet and acquiring the garbage type corresponding to the target throwing inlet.
And S19, storing the garbage type and the second type of the target object in the garbage classification database in an associated manner, and updating the garbage classification model according to the updated garbage classification database.
In at least one embodiment of the present invention, the garbage type and the second article type of the target article are stored in the garbage classification database in association with each other, and the garbage classification model is updated according to the updated garbage classification database. After the step of storing the garbage type in association with the second article type of the target article in the garbage classification database, the method further comprises: determining, for a preset time period, the number of garbage type entries newly stored in the garbage classification database; detecting whether the number exceeds a preset number threshold; and when the detection result is that the number exceeds the preset number threshold, taking the newly added garbage types and the corresponding second article types of the target articles as a newly added training set, and inputting the newly added training set into the garbage classification model for retraining to obtain the updated garbage classification model. The preset time period and the preset number threshold are both preset by a system user; for example, the preset time period is 5 days and the preset number threshold is 10.
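The sketch below shows one way the association could be stored and the newly added entries counted; the SQLite table layout is an assumption, not a structure described by the patent.

    import sqlite3

    def store_association(db_path, item_type, garbage_type):
        """Store the garbage type in association with the second article type of the target article."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS garbage_classification ("
                "item_type TEXT, garbage_type TEXT, added_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP)"
            )
            conn.execute(
                "INSERT INTO garbage_classification (item_type, garbage_type) VALUES (?, ?)",
                (item_type, garbage_type),
            )

    def newly_added_count(db_path, days=5):
        """Number of entries stored within the preset time period (5 days in the example above)."""
        with sqlite3.connect(db_path) as conn:
            row = conn.execute(
                "SELECT COUNT(*) FROM garbage_classification WHERE added_at >= datetime('now', ?)",
                (f"-{days} days",),
            ).fetchone()
        return row[0]

    # Retrain the garbage classification model once newly_added_count(db_path) exceeds the preset threshold (e.g. 10).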
The embodiment of the invention provides a garbage classification processing method that uses a camera unit and a voice unit to complete human-machine interaction, thereby improving the accuracy of garbage classification; a two-stage classification method is adopted in which the article type is determined first and the garbage type is then determined from the article type, which further improves the accuracy of garbage classification; the garbage throwing inlet of the intelligent garbage can corresponding to the garbage type of the identified target article is opened through the control unit, so that random throwing of garbage caused by users not following the rules is avoided; and when the article type of the target article is an unknown article type, the garbage type corresponding to the thrown article is detected through the infrared scanning unit, so that the garbage type corresponding to the unknown article type is obtained and the garbage classification model is automatically updated in real time.
The above is a detailed description of the method provided by the embodiments of the present invention. The order of execution of the blocks in the flowcharts shown may be changed, and some blocks may be omitted, according to various needs.
Fig. 2 is a schematic structural diagram of a terminal according to an embodiment of the present invention. As shown in fig. 2, the terminal 1 includes a memory 10, and the garbage classification processing apparatus 100 is stored in the memory 10. The garbage classification processing apparatus 100 can acquire first voice information of a user and detect whether the first voice information satisfies a preset activation condition through a voice detection model; when the detection result is that the first voice information meets the preset activation condition, acquire an image of a target article and identify a first article type of the target article in the image through an article classification model; output the first article type to the user in a voice prompt mode and receive second voice information input by the user, the second voice information being used for indicating whether the first article type is correct; when the second voice information indicates that the first article type is correct, obtain the garbage type corresponding to the first article type through a garbage classification model and control the intelligent garbage can to open the garbage throwing inlet corresponding to the garbage type for the user to throw; when the second voice information indicates that the first article type is wrong, receive a second article type of the target article input by the user; traverse a garbage classification database according to the second article type and query whether a known article type corresponding to the second article type exists; when no known article type corresponding to the second article type exists, control the intelligent garbage can to open all garbage throwing inlets and detect whether any garbage throwing inlet senses and generates an article throwing signal; when it is detected that a garbage throwing inlet generates an article throwing signal, mark that garbage throwing inlet as the target throwing inlet and acquire the garbage type corresponding to the target throwing inlet; and store the garbage type and the second article type of the target article in the garbage classification database in association, and update the garbage classification model according to the updated garbage classification database. In the embodiment of the invention, human-machine interaction is completed using the camera unit and the voice unit, which improves the accuracy of garbage classification; a two-stage classification method is adopted in which the article type is determined first and the garbage type is then determined from the article type, which further improves the accuracy of garbage classification; the garbage throwing inlet of the intelligent garbage can corresponding to the garbage type of the identified target article is opened through the control unit, so that random throwing of garbage caused by users not following the rules is avoided; and when the article type of the target article is an unknown article type, the garbage type corresponding to the thrown article is detected through the infrared scanning unit, so that the garbage type corresponding to the unknown article type is obtained and the garbage classification model is automatically updated in real time.
In this embodiment, the terminal 1 may further include a display 20 and a processor 30. The memory 10 and the display screen 20 may be electrically connected to the processor 30.
The memory 10 may be of different types of memory devices for storing various types of data. For example, the memory or internal memory of the terminal 1 may be used, or a memory Card that can be externally connected to the terminal 1, such as a flash memory, an SM Card (Smart Media Card), an SD Card (Secure Digital Card), and the like. In addition, the memory 10 may include a non-volatile memory such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), at least one magnetic disk storage device, a Flash memory device, or other non-volatile solid state storage device. The memory 10 is used for storing various types of data, such as various types of application programs (Applications) installed in the terminal 1, data set and acquired by applying the garbage classification processing method, and the like.
A display 20 is mounted to the terminal 1 for displaying information.
The processor 30 is configured to execute the garbage classification processing method and various types of software installed in the terminal 1, such as an operating system and application display software. The processor 30 includes, but is not limited to, a Central Processing Unit (CPU), a Micro Controller Unit (MCU), and other devices for interpreting computer instructions and Processing data in computer software.
The garbage classification processing apparatus 100 may include one or more modules stored in the memory 10 of the terminal 1 and configured to be executed by one or more processors (in this embodiment, one processor 30) to implement the embodiment of the present invention. For example, referring to fig. 3, the garbage classification processing apparatus 100 may include a condition detection module 101, a category determination module 102, an information confirmation module 103, a garbage placement module 104, a category reception module 105, a category detection module 106, an entry opening module 107, a category determination module 108, and a model update module 109. A module referred to in the embodiments of the present invention is a program segment that performs a specific function and is better suited than a whole program to describing the execution process of the software in the processor 30.
It is understood that, corresponding to the above-mentioned embodiments of the garbage classification processing method, the garbage classification processing apparatus 100 may include some or all of the functional modules shown in fig. 3, and the functions of the modules will be described in detail below. It should be noted that the same terms, related terms, and specific explanations thereof in the above embodiments of the garbage classification processing method can also be applied to the following functional descriptions of the modules. For the sake of brevity and to avoid repetition, further description is omitted herein.
The condition detection module 101 may be configured to obtain first voice information of a user, and detect whether the first voice information satisfies a preset activation condition through a voice detection model.
The type determining module 102 may be configured to, when the detection result is that the first voice information satisfies the preset activation condition, acquire an image of a target article and identify a first article type of the target article in the image through an article classification model.
The information confirmation module 103 may be configured to output the first article type to a user through a voice prompt, and receive second voice information input by the user, where the second voice information is used to indicate whether the first article type is correct.
The garbage throwing module 104 may be configured to, when the second voice information indicates that the first article type is correct, obtain the garbage type corresponding to the first article type through a garbage classification model, and control the intelligent garbage can to open the garbage throwing inlet corresponding to that garbage type for the user to throw in the article.
The type receiving module 105 may be configured to receive a second article type of the target article input by the user when the second voice information indicates that the first article type is wrong.
The type detection module 106 may be configured to traverse the garbage classification database according to the second article type and query whether a known article type corresponding to the second article type exists.
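The lookup performed by the type detection module 106 (detailed as the synonym-based matching in claim 1 below) could be sketched as follows. This is a minimal sketch under stated assumptions: the similarity_model interface and the dictionary-shaped database are placeholders invented for illustration, not the pre-trained name similarity acquisition model itself.

```python
from typing import Optional

# Hedged sketch of the database lookup performed by the type detection module.
# `similarity_model` stands in for the pre-trained name similarity acquisition
# model referred to in the claims; its API here is an assumption.

def lookup_known_type(second_type: str, garbage_db: dict, similarity_model) -> Optional[str]:
    """Return the known article type matching second_type, or None if absent."""
    # Obtain the synonym set related to the user-supplied second article type.
    synonyms = set(similarity_model.synonyms(second_type)) | {second_type}

    # Traverse the garbage classification database: a match is a target synonym
    # that is consistent with the name of a known article type.
    for known_name in garbage_db:
        if known_name in synonyms:
            return known_name
    return None
```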
The inlet opening module 107 may be configured to, when no known article type corresponding to the second article type exists, control the intelligent garbage can to open all garbage throwing inlets and detect whether any garbage throwing inlet senses an article throwing signal.
The category determining module 108 may be configured to, when it is detected that a garbage throwing inlet has generated an article throwing signal, mark that garbage throwing inlet as the target throwing inlet and obtain the garbage type corresponding to the target throwing inlet.
The model updating module 109 may be configured to store the garbage type and the second article type of the target article in the garbage classification database in an associated manner, and update the garbage classification model according to the updated garbage classification database.
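As a rough illustration of the model updating module 109, the sketch below stores the new association and rebuilds a lookup-table style garbage classification model from the updated database. The dict-based "model" is a deliberate simplification chosen for illustration, not the training procedure disclosed elsewhere in this document.

```python
# Illustrative sketch of the model update step: the new (article type, garbage
# type) pair is stored in the database and the classification "model" is
# rebuilt from the updated database. All names are assumptions.

def update_after_unknown_item(garbage_db: dict, second_type: str, garbage_type: str):
    garbage_db[second_type] = garbage_type          # associated storage
    return rebuild_garbage_classifier(garbage_db)   # refresh the model

def rebuild_garbage_classifier(garbage_db: dict):
    # In the simplest case the "model" is a lookup table over known article
    # types; a learned classifier could be retrained here instead.
    table = dict(garbage_db)
    return lambda article_type: table.get(article_type)
```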
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by the processor 30, implements the steps of the garbage classification processing method in any one of the above embodiments.
The garbage classification processing apparatus 100, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the method in the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and executed by the processor 30 to implement the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable storage medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a Read-Only Memory (ROM), and the like.
The processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor. The processor 30 is the control center of the garbage classification processing apparatus 100/terminal 1 and connects the various parts of the whole garbage classification processing apparatus 100/terminal 1 through various interfaces and lines.
The memory 10 is used for storing the computer programs and/or modules, and the processor 30 implements various functions of the garbage classification processing apparatus 100/terminal 1 by running or executing the computer programs and/or modules stored in the memory 10 and calling data stored in the memory 10. The memory 10 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to the use of the terminal 1, and the like.
In the embodiments provided by the present invention, it should be understood that the disclosed terminal and method may be implemented in other manners. For example, the system embodiments described above are merely illustrative; the division of the modules is only a logical functional division, and other divisions may be adopted in actual implementation.
It will be evident to those skilled in the art that the embodiments of the present invention are not limited to the details of the foregoing illustrative embodiments, and that the embodiments of the present invention are capable of being embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the embodiments being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Several units, modules or means recited in the system, device or terminal claims may also be implemented by one and the same unit, module or means in software or hardware.
Although the embodiments of the present invention have been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the embodiments of the present invention.

Claims (9)

1. A garbage classification processing method, applied to an intelligent garbage can, characterized by comprising the following steps:
acquiring first voice information of a user, and detecting whether the first voice information meets a preset activation condition or not through a voice detection model;
when the detection result is that the first voice information meets the preset activation condition, acquiring an image of a target article, and identifying a first article type of the target article in the image through an article classification model;
outputting the first article type to a user in a voice prompt mode, and receiving second voice information input by the user, wherein the second voice information is used for indicating whether the first article type is correct or not;
when the second voice information shows that the first article type is correct, obtaining a garbage type corresponding to the first article type through a garbage classification model, and controlling the intelligent garbage can to open a garbage throwing inlet corresponding to the garbage type for a user to throw;
when the second voice information shows that the first article type is wrong, receiving a second article type of the target article input by a user;
according to the second article type, traversing a garbage classification database and querying whether a known article type corresponding to the second article type exists, which comprises: inputting the second article type into a pre-trained name similarity acquisition model to obtain a synonym set related to the second article type; detecting whether a target synonym exists in the synonym set, wherein the target synonym is consistent with the article type name of a known article type in the garbage classification database; and when the detection result is that a target synonym consistent with the article type name of a known article type in the garbage classification database exists in the synonym set, determining that a known article type corresponding to the second article type exists;
when no known article type corresponding to the second article type exists, controlling the intelligent garbage can to open all garbage throwing inlets, and detecting whether any garbage throwing inlet senses an article throwing signal;
when detecting that the garbage throwing inlet generates an article throwing signal, marking the garbage throwing inlet as a target throwing inlet and acquiring the garbage type corresponding to the target throwing inlet;
and storing the garbage type and the second object type of the target object in the garbage classification database in an associated manner, and updating the garbage classification model according to the updated garbage classification database.
2. The method of claim 1, wherein before the step of identifying a first article type of a target article in the image through an article classification model, the method further comprises:
calculating a quality evaluation value of the image;
detecting whether the quality evaluation value meets a preset quality threshold value;
when the detection result is that the quality evaluation value does not meet the preset quality threshold, re-acquiring the image of the target article until the quality evaluation value of the newly acquired image meets the preset quality threshold;
segmenting a target area in the image meeting the preset quality threshold value to obtain a target image;
post-processing the target image, the post-processing including one or more of: white balance processing and equalization processing.
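Claim 2 above adds an image-quality gate, re-acquisition, segmentation, and post-processing before classification. One plausible way to implement the quality gate and post-processing is sketched below using OpenCV; the Laplacian-variance sharpness score, the threshold value, and the luminance equalization are assumptions chosen for illustration rather than the specific metric the claim prescribes, and the segmentation step is omitted.

```python
import cv2
import numpy as np

# Illustrative quality gate and post-processing, loosely following claim 2.
# The Laplacian-variance metric and the threshold of 100.0 are assumptions.

QUALITY_THRESHOLD = 100.0  # hypothetical preset quality threshold

def quality_score(image: np.ndarray) -> float:
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()    # higher variance = sharper image

def capture_good_image(capture_fn) -> np.ndarray:
    image = capture_fn()
    while quality_score(image) < QUALITY_THRESHOLD:
        image = capture_fn()                         # re-acquire until acceptable
    return image

def postprocess(target_image: np.ndarray) -> np.ndarray:
    # Histogram equalization on the luminance channel as a stand-in for the
    # white-balance and equalization post-processing named in the claim.
    ycrcb = cv2.cvtColor(target_image, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
```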
3. The garbage classification processing method according to claim 1, further comprising:
when a known article type corresponding to the second article type exists, inputting the known article type into a garbage classification model to obtain a garbage type corresponding to the target article;
and controlling the intelligent garbage can to open a garbage throwing inlet corresponding to the garbage type for a user to throw.
4. The garbage classification processing method according to claim 1, further comprising:
when the second voice information shows that the first article type is wrong, storing the image for which the first article type was incorrectly identified, together with the corresponding second article type, in the garbage classification database;
calculating the number of images stored in a preset time period;
judging whether the number exceeds a preset number threshold value;
and when the judgment result is that the number exceeds the preset number threshold, taking the stored images and the corresponding second article types as a new training data set, and updating the article classification model based on the new training data set.
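Claim 4 above describes a count-based trigger for updating the article classification model. The sketch below is one hedged reading of that trigger; the one-week window, the threshold of 50 images, and the train_fn callback are illustrative assumptions, not values given in the claim.

```python
import time

# Sketch of the retraining trigger described in claim 4. The window length,
# threshold, and storage layout are illustrative assumptions.

WINDOW_SECONDS = 7 * 24 * 3600   # hypothetical "preset time period": one week
COUNT_THRESHOLD = 50             # hypothetical preset number threshold

def maybe_update_item_classifier(misclassified, train_fn):
    """`misclassified` is a list of (timestamp, image, second_article_type)."""
    now = time.time()
    recent = [(img, label) for ts, img, label in misclassified
              if now - ts <= WINDOW_SECONDS]
    if len(recent) > COUNT_THRESHOLD:
        images, labels = zip(*recent)
        return train_fn(images, labels)   # retrain on the new training data set
    return None
```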
5. The method according to claim 1, wherein before the step of detecting whether the first voice information satisfies the preset activation condition through the voice detection model, the method further comprises:
calculating the clarity of the first voice information;
judging whether the clarity of the first voice information exceeds a preset clarity threshold;
and when the clarity of the first voice information does not exceed the preset clarity threshold, re-acquiring the first voice information of the user.
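Claim 5 above gates the pipeline on how clear the first voice information is before it is passed to the voice detection model. The claim does not specify how clarity is measured; the sketch below uses a simple energy-to-noise-floor ratio in decibels purely as an illustrative stand-in, with a made-up threshold.

```python
import numpy as np

# Illustrative clarity check, loosely following claim 5. Using the ratio of
# signal energy to an assumed noise floor is not the metric the claim
# prescribes; it is an assumption made for the sketch.

CLARITY_THRESHOLD_DB = 10.0   # hypothetical preset clarity threshold

def clarity_db(samples: np.ndarray, noise_floor: float = 1e-4) -> float:
    rms = np.sqrt(np.mean(np.square(samples.astype(np.float64))))
    return 20.0 * np.log10(max(rms, 1e-12) / noise_floor)

def get_clear_voice(record_fn) -> np.ndarray:
    samples = record_fn()
    while clarity_db(samples) <= CLARITY_THRESHOLD_DB:
        samples = record_fn()     # re-acquire the first voice information
    return samples
```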
6. The method of claim 1, wherein the training of the article classification model comprises:
crawling a plurality of article images of different article types to obtain an original data set;
dividing the original data set into a training set and a verification set;
inputting the training set into a preset deep learning model for training to obtain an article classification model;
inputting the verification set into an article classification model for verification to obtain a verification passing rate;
detecting whether the verification passing rate exceeds a preset passing rate threshold value or not;
and when the detection result is that the verification passing rate exceeds a preset passing rate threshold value, determining that the training of the article classification model is finished.
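Claim 6 above outlines the training of the article classification model: split the crawled data, train, validate, and accept the model only when the verification passing rate exceeds a preset threshold. The framework-agnostic sketch below follows that outline; the split ratio, pass-rate threshold, retry loop, and model_factory interface are assumptions added for illustration.

```python
import random

# Framework-agnostic sketch of the training procedure in claim 6. The split
# ratio, pass-rate threshold, and `model_factory` interface are assumptions.

PASS_RATE_THRESHOLD = 0.95   # hypothetical preset passing rate threshold

def train_item_classifier(dataset, model_factory, val_ratio=0.2, max_rounds=10):
    """`dataset` is a list of (image, article_type) pairs, e.g. from a crawler."""
    random.shuffle(dataset)
    split = int(len(dataset) * (1 - val_ratio))
    train_set, val_set = dataset[:split], dataset[split:]

    for _ in range(max_rounds):
        model = model_factory()
        model.fit(train_set)                                # train on the training set
        correct = sum(model.predict(img) == label for img, label in val_set)
        pass_rate = correct / max(len(val_set), 1)          # verification passing rate
        if pass_rate > PASS_RATE_THRESHOLD:
            return model                                     # training is considered complete
    raise RuntimeError("validation pass rate never exceeded the preset threshold")
```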
7. A garbage classification processing apparatus, applied to an intelligent garbage can, characterized in that the garbage classification processing apparatus comprises:
the condition detection module is used for acquiring first voice information of a user and detecting whether the first voice information meets a preset activation condition or not through a voice detection model;
the type determining module is used for acquiring an image of a target article when the detection result shows that the first voice information meets the preset activation condition, and identifying a first article type of the target article in the image through an article classification model;
the information confirmation module is used for outputting the first article type to a user in a voice prompt mode and receiving second voice information input by the user, wherein the second voice information is used for indicating whether the first article type is correct or not;
the garbage throwing module is used for obtaining the garbage type corresponding to the first article type through a garbage classification model when the second voice information shows that the first article type is correct, and controlling the intelligent garbage can to open a garbage throwing inlet corresponding to the garbage type for a user to throw;
the type receiving module is used for receiving a second article type of the target article input by the user when the second voice information shows that the first article type is wrong;
the type detection module is used for traversing a garbage classification database according to the second article type to query whether a known article type corresponding to the second article type exists, which comprises: inputting the second article type into a pre-trained name similarity acquisition model to obtain a synonym set related to the second article type; detecting whether a target synonym exists in the synonym set, wherein the target synonym is consistent with the article type name of a known article type in the garbage classification database; and when the detection result is that a target synonym consistent with the article type name of a known article type in the garbage classification database exists in the synonym set, determining that a known article type corresponding to the second article type exists;
the inlet opening module is used for, when no known article type corresponding to the second article type exists, controlling the intelligent garbage can to open all garbage throwing inlets and detecting whether any garbage throwing inlet senses an article throwing signal;
the category determining module is used for marking the garbage throwing inlet as a target throwing inlet and acquiring the garbage type corresponding to the target throwing inlet when it is detected that the garbage throwing inlet generates an article throwing signal;
and the model updating module is used for storing the garbage type and the second article type of the target article in the garbage classification database in an associated manner, and updating the garbage classification model according to the updated garbage classification database.
8. A terminal, characterized in that the terminal comprises a processor for implementing the method of garbage classification processing according to any one of claims 1 to 6 when executing a computer program stored in a memory.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the garbage classification processing method according to any one of claims 1 to 6.
CN202010140615.XA 2020-03-03 2020-03-03 Garbage classification processing method and device, terminal and storage medium Active CN111268317B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010140615.XA CN111268317B (en) 2020-03-03 2020-03-03 Garbage classification processing method and device, terminal and storage medium
PCT/CN2020/105943 WO2021174759A1 (en) 2020-03-03 2020-07-30 Garbage classification processing method and apparatus, terminal, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010140615.XA CN111268317B (en) 2020-03-03 2020-03-03 Garbage classification processing method and device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111268317A CN111268317A (en) 2020-06-12
CN111268317B true CN111268317B (en) 2023-02-03

Family

ID=70995562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010140615.XA Active CN111268317B (en) 2020-03-03 2020-03-03 Garbage classification processing method and device, terminal and storage medium

Country Status (2)

Country Link
CN (1) CN111268317B (en)
WO (1) WO2021174759A1 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111268317B (en) * 2020-03-03 2023-02-03 深圳壹账通智能科技有限公司 Garbage classification processing method and device, terminal and storage medium
CN111709866B (en) * 2020-06-22 2024-02-20 上海陇宇电子科技有限公司 Intelligent garbage classification method based on mobile equipment
CN112364758A (en) * 2020-11-10 2021-02-12 湖北惠立网络科技有限公司 Garbage classification recovery method and system based on multi-target image recognition
CN112861615B (en) * 2020-12-30 2024-04-09 江苏大学 Garbage random throwing behavior identification method based on depth target detection
CN112758567A (en) * 2021-01-11 2021-05-07 江苏地风环卫有限公司 Garbage throwing behavior analysis and control method and system
CN113625646B (en) * 2021-08-20 2022-12-13 北京云迹科技股份有限公司 Scheduling system of intelligent garbage robot
CN113816038A (en) * 2021-09-29 2021-12-21 夏日阳光智能科技(苏州)有限公司 Intelligent voice garbage can sound pickup and low-power-consumption control method
CN113859813A (en) * 2021-10-12 2021-12-31 朱雨昕 Artificial intelligence garbage classification system
CN114399104A (en) * 2022-01-11 2022-04-26 北京林业大学 Garbage recycling and scheduling method
CN114493351A (en) * 2022-02-16 2022-05-13 平安国际智慧城市科技股份有限公司 Point-of-delivery sampling method and device based on artificial intelligence, terminal device and medium
CN114529021B (en) * 2022-02-16 2022-09-23 浙江联运环境工程股份有限公司 Traceable garbage delivery method
CN114772104A (en) * 2022-04-25 2022-07-22 武汉绿净缘环境科技工程有限公司 Wisdom urban garbage classification system
CN115890699A (en) * 2022-11-04 2023-04-04 爱凤环环保科技有限公司 Interactive garbage classification supervises and leads robot
CN115830545B (en) * 2022-12-13 2023-10-03 苏州市伏泰信息科技股份有限公司 Intelligent supervision method and system for garbage classification
CN116342895B (en) * 2023-05-31 2023-08-11 浙江联运知慧科技有限公司 Method and system for improving sorting efficiency of renewable resources based on AI (advanced technology attachment) processing
CN117671497A (en) * 2023-12-04 2024-03-08 广东筠诚建筑科技有限公司 Engineering construction waste classification method and device based on digital images

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8266215B2 (en) * 2003-02-20 2012-09-11 Sonicwall, Inc. Using distinguishing properties to classify messages
CN103778226A (en) * 2014-01-23 2014-05-07 北京奇虎科技有限公司 Method for establishing language information recognition model and language information recognition device
CN204660534U (en) * 2015-05-07 2015-09-23 广西壮族自治区环境保护科学研究院 One way of life intelligent garbage classification system
US20190166024A1 (en) * 2017-11-24 2019-05-30 Institute For Information Industry Network anomaly analysis apparatus, method, and non-transitory computer readable storage medium thereof
JP6944908B2 (en) * 2018-08-08 2021-10-06 Kddi株式会社 Garbage separation support system, garbage separation support method, and program
JP6959197B2 (en) * 2018-08-10 2021-11-02 Kddi株式会社 Garbage separation support system, terminal device, garbage separation support method, and program
CN109051405A (en) * 2018-08-31 2018-12-21 深圳市研本品牌设计有限公司 A kind of intelligent dustbin and storage medium
CN109389161B (en) * 2018-09-28 2021-08-31 广州大学 Garbage identification evolutionary learning method, device, system and medium based on deep learning
CN110482072B (en) * 2019-07-02 2022-09-30 上海净收智能科技有限公司 Garbage classification method, system, medium, garbage storage device and cloud platform
CN110427896A (en) * 2019-08-07 2019-11-08 成都理工大学 A kind of garbage classification intelligence system based on convolutional neural networks
CN110598034A (en) * 2019-08-24 2019-12-20 深圳市奥芯博电子科技有限公司 Garbage classification and identification method and system
CN110498152B (en) * 2019-09-18 2023-10-20 福州大学 Intelligent classification garbage can based on AI and method thereof
CN111268317B (en) * 2020-03-03 2023-02-03 深圳壹账通智能科技有限公司 Garbage classification processing method and device, terminal and storage medium

Also Published As

Publication number Publication date
WO2021174759A1 (en) 2021-09-10
CN111268317A (en) 2020-06-12

Similar Documents

Publication Publication Date Title
CN111268317B (en) Garbage classification processing method and device, terminal and storage medium
CN103034836B (en) Road sign detection method and road sign checkout equipment
CN101447082B (en) Detection method of moving target on a real-time basis
CN108921083B (en) Illegal mobile vendor identification method based on deep learning target detection
US11335086B2 (en) Methods and electronic devices for automated waste management
CN107808659A (en) Intelligent sound signal type recognition system device
CN104881675A (en) Video scene identification method and apparatus
CN109508664A (en) A kind of vegetable identification pricing method based on deep learning
Bertrand et al. Bark and leaf fusion systems to improve automatic tree species recognition
CN110458082A (en) A kind of city management case classification recognition methods
CN112298844B (en) Garbage classification supervision method and device
CN101996308A (en) Human face identification method and system and human face model training method and system
CN111274886B (en) Deep learning-based pedestrian red light running illegal behavior analysis method and system
CN112488021A (en) Monitoring video-based garbage delivery violation detection method and system
CN112707058B (en) Detection method, system, device and medium for standard actions of kitchen waste
CN109271523A (en) A kind of government document subject classification method based on information retrieval
CN110598604A (en) Intelligent supervision method and device for classified garbage putting and computer storage medium
CN110245544A (en) A kind of method and device of determining dead ship condition
Zhang Vehicle target detection methods based on color fusion deformable part model
CN108664943B (en) Indoor daily activity recognition method under multi-occupant scene
CN115062725A (en) Hotel income abnormity analysis method and system
CN112215257A (en) Multi-person multi-modal perception data automatic marking and mutual learning method
Dutta et al. Smart usage of open source license plate detection and using iot tools for private garage and parking solutions
Tan et al. Person re-identification across non-overlapping cameras based on two-stage framework
CN113536298B (en) Deep learning model bias poisoning attack-oriented defense method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant