CN110991271B - Garbage classification processing method and related products

Garbage classification processing method and related products

Info

Publication number
CN110991271B
CN110991271B (application CN201911119137.8A)
Authority
CN
China
Prior art keywords
target
garbage
weight
determining
loading area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911119137.8A
Other languages
Chinese (zh)
Other versions
CN110991271A (en)
Inventor
Tian Dai (田岱)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wanyi Technology Co Ltd
Original Assignee
Wanyi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wanyi Technology Co Ltd filed Critical Wanyi Technology Co Ltd
Priority to CN201911119137.8A priority Critical patent/CN110991271B/en
Publication of CN110991271A publication Critical patent/CN110991271A/en
Application granted granted Critical
Publication of CN110991271B publication Critical patent/CN110991271B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/14 Image acquisition
    • G06V30/148 Segmentation of character regions
    • G06V30/153 Segmentation of character regions using recognition of characters or words
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02W CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
    • Y02W30/00 Technologies for solid waste management
    • Y02W30/10 Waste collection, transportation, transfer or storage, e.g. segregated refuse collecting, electric or hybrid propulsion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiment of the application discloses a garbage classification processing method and related products, applied to an intelligent garbage can. The method comprises the following steps: when a user delivers garbage, capturing a target image through at least one camera; performing target identification on the target image to obtain the target garbage type corresponding to the garbage delivered by the user; determining the target garbage loading area corresponding to the target garbage type, wherein the intelligent garbage can comprises a plurality of garbage loading areas and each garbage loading area corresponds to one garbage type; and prompting the user to deliver the garbage to the target garbage loading area. By adopting the embodiment of the application, users can be helped to improve garbage classification efficiency.

Description

Garbage classification processing method and related products
Technical Field
The application relates to the technical field of image recognition, in particular to a garbage classification processing method and related products.
Background
Garbage classification processing is a general term for a series of activities in which garbage is stored, put out and carried in categories according to a certain rule or standard, so that the garbage is converted into a public resource. The aim of classification is to improve the resource value and the economic value of the garbage and to put it to the best possible use.
With the popularization of garbage classification policies, garbage classification has entered the public's field of view, so the problem of how to quickly help users classify their garbage urgently needs to be solved.
Disclosure of Invention
The embodiment of the application provides a garbage classification processing method and related products, which can help users to improve garbage classification efficiency.
In a first aspect, an embodiment of the present application provides a garbage classification processing method, which is applied to an intelligent garbage can, where the intelligent garbage can includes at least one camera, and the method includes:
when the user delivers garbage, shooting is carried out through the at least one camera to obtain a target image;
performing target identification on the target image to obtain a target garbage type corresponding to garbage delivered by the user;
determining a target garbage loading area corresponding to the target garbage type, wherein the intelligent garbage can comprises a plurality of garbage loading areas, and each garbage loading area corresponds to one garbage type;
and prompting the user to deliver the garbage to the target garbage loading area.
In a second aspect, embodiments of the present application provide a garbage classification device, applied to an intelligent garbage can, the intelligent garbage can including at least one camera, the device including: a shooting unit, an identification unit, a determination unit and a prompting unit, wherein,
The shooting unit is used for shooting through the at least one camera when the user delivers garbage to obtain a target image;
the identification unit is used for carrying out target identification on the target image to obtain a target garbage type corresponding to the garbage delivered by the user;
the determining unit is used for determining a target garbage loading area corresponding to the target garbage type, and the intelligent garbage can comprises a plurality of garbage loading areas, and each garbage loading area corresponds to one garbage type;
the prompting unit is used for prompting a user to deliver the garbage to the target garbage loading area.
In a third aspect, an embodiment of the present application provides an intelligent garbage can, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing the steps in the first aspect of the embodiment of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
By implementing the embodiment of the application, the following beneficial effects are achieved:
It can be seen that the garbage classification processing method and related products described in the embodiments of the present application are applied to an intelligent garbage can that includes at least one camera. When a user delivers garbage, the at least one camera captures a target image; target identification is performed on the target image to obtain the target garbage type corresponding to the garbage delivered by the user; the target garbage loading area corresponding to the target garbage type is determined, the intelligent garbage can comprising a plurality of garbage loading areas, each corresponding to one garbage type; and the user is prompted to deliver the garbage to the target garbage loading area. In this way, the garbage a user delivers can be identified and its corresponding garbage type determined, so that the user is prompted to perform the corresponding garbage classification, which improves garbage classification efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1A is a schematic structural diagram of an intelligent garbage can according to an embodiment of the present application;
fig. 1B is a schematic flow chart of a garbage classification processing method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of another garbage classification processing method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an intelligent garbage can according to an embodiment of the present application;
fig. 4A is a functional unit composition block diagram of a garbage classification processing apparatus according to an embodiment of the present application;
fig. 4B is a functional unit block diagram of another garbage classification processing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the present application solution better understood by those skilled in the art, the following description will clearly and completely describe the technical solution in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms first, second and the like in the description and in the claims of the present application and in the above-described figures, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The embodiments of the present application are described in detail below.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of an intelligent garbage can according to an embodiment of the present application. The intelligent garbage can may include at least one camera, and may further include a plurality of garbage loading areas, for example: a dry waste area, a wet waste area, a hazardous waste area, a recyclable waste area, and the like, without limitation herein.
In the embodiment of the application, the intelligent garbage can further comprises a control circuit, a storage circuit and a processing circuit. The storage circuitry may be a memory, such as a hard drive memory, a non-volatile memory (e.g., flash memory or other electronically programmable read-only memory used to form a solid state drive, etc.), a volatile memory (e.g., static or dynamic random access memory, etc.), and the like, as embodiments of the present application are not limited. The processing circuitry may be used to control the operation of the intelligent garbage can. The processing circuitry may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio codec chips, application specific integrated circuits, display driver integrated circuits, and the like.
Referring to fig. 1B, fig. 1B is a flow chart of a garbage classification processing method provided in an embodiment of the present application, which is applied to the intelligent garbage can shown in fig. 1A, where the intelligent garbage can includes at least one camera, as shown in the figure, the garbage classification processing method includes:
101. When the user delivers garbage, shooting is carried out through the at least one camera, and a target image is obtained.
The intelligent garbage can be applied to public places, such as families, communities, campuses, hospitals, museums, bus stops, parks, tourist attractions, train stations, airports, office buildings, pedestrian streets, markets, supermarkets and the like, and is not limited herein.
In this embodiment of the application, the intelligent garbage can may be provided with at least one camera, which may be a visible-light camera or an infrared camera; likewise, it may be a single camera or multiple cameras. In a specific implementation, the intelligent garbage can may monitor its surroundings through the at least one camera.
In a specific implementation, the intelligent garbage can may include a plurality of garbage loading areas, for example: a dry waste loading area, a wet waste loading area, a hazardous waste loading area, and the like. These may be further divided; for example, the dry waste loading area may comprise a waste paper loading area, a plastic waste loading area, a glass waste loading area, a metal waste loading area, a cloth waste loading area, and the like, without limitation herein. The waste paper loading area may be used to load newspapers, journals, books, various wrapping papers, and the like; the plastic waste loading area may be used to load various plastic bags, plastic foams, plastic packaging, disposable plastic cutlery boxes and tableware, hard plastics, plastic toothbrushes, plastic cups, mineral water bottles, and the like; the glass waste loading area may be used to load various glass bottles, broken glass, mirrors, thermos bottles, and the like; the metal waste loading area may be used to load pop cans and the like; and the cloth waste loading area may be used to load waste clothes, tablecloths, face tissues, schoolbags, shoes, and the like.
In one possible example, the intelligent garbage can includes a plurality of cameras, the step 101 of capturing a target image by using the at least one camera may include the following steps:
11. determining the distance and the shooting angle between each camera of the plurality of cameras and the garbage to obtain a plurality of distance values and a plurality of shooting angle values, wherein each camera corresponds to one distance value and one shooting angle value;
12. determining a target first evaluation value corresponding to each distance value in the plurality of distance values according to a mapping relation between a preset distance and the first evaluation value, so as to obtain a plurality of target first evaluation values;
13. determining a target second evaluation value corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between a preset shooting angle and the second evaluation value, so as to obtain a plurality of target second evaluation values;
14. determining a target weight pair corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between the preset shooting angle and the weight pair to obtain a plurality of target weight pairs, wherein each target weight pair comprises a target first weight and a target second weight, the target first weight is a weight corresponding to the first evaluation value, the target second weight is a weight corresponding to the second evaluation value, and the sum of the target first weight and the target second weight is 1;
15. Weighting operation is carried out according to the plurality of target first evaluation values, the plurality of target second evaluation values and the plurality of target weight pairs to obtain a plurality of target evaluation values, and each camera in the plurality of cameras corresponds to one target evaluation value;
16. and controlling a camera corresponding to the maximum value in the plurality of target evaluation values to shoot, so as to obtain the target image.
The mapping relation between the preset distance and the first evaluation value, the mapping relation between the preset shooting angle and the second evaluation value, and the mapping relation between the preset shooting angle and the weight pair can be stored in the intelligent garbage can in advance, wherein the weight pair comprises a first weight corresponding to the first evaluation value and a second weight corresponding to the second evaluation value, and the sum of the first weight and the second weight is 1.
In a specific implementation, the intelligent garbage can may determine the distance and the shooting angle between each of the plurality of cameras and the garbage to obtain a plurality of distance values and a plurality of shooting angle values, each camera corresponding to one distance value and one shooting angle value. Further, according to the mapping relation between the preset distance and the first evaluation value, the target first evaluation value corresponding to each of the plurality of distance values may be determined to obtain a plurality of target first evaluation values; according to the mapping relation between the preset shooting angle and the second evaluation value, the target second evaluation value corresponding to each of the plurality of shooting angle values may be determined to obtain a plurality of target second evaluation values; and according to the mapping relation between the preset shooting angle and the weight pair, the target weight pair corresponding to each of the plurality of shooting angle values may be determined to obtain a plurality of target weight pairs, where each target weight pair comprises a target first weight and a target second weight, the target first weight being the weight corresponding to the first evaluation value, the target second weight being the weight corresponding to the second evaluation value, and the sum of the target first weight and the target second weight being 1.
Further, the intelligent garbage can performs a weighted operation according to the plurality of target first evaluation values, the plurality of target second evaluation values and the plurality of target weight pairs to obtain a plurality of target evaluation values, each camera corresponding to one target evaluation value, namely:
target evaluation value = target first weight × target first evaluation value + target second weight × target second evaluation value
Further, the intelligent garbage can control the camera corresponding to the maximum value in the multiple target evaluation values to shoot, and a target image is obtained.
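The camera-selection scheme above can be sketched in a few lines. The mapping tables and weight pairs below are illustrative assumptions for the "preset mapping relations" the patent leaves unspecified, not values from the patent itself:

```python
# Hypothetical sketch of the weighted camera-selection scheme (steps 11-16).
# All lookup tables and weight pairs are assumed example values.

def first_eval(distance_cm):
    """Map a camera-to-garbage distance to a first evaluation value (closer is better)."""
    if distance_cm <= 30:
        return 100
    if distance_cm <= 60:
        return 80
    return 50

def second_eval(angle_deg):
    """Map a shooting angle to a second evaluation value (head-on is better)."""
    if angle_deg <= 15:
        return 100
    if angle_deg <= 45:
        return 70
    return 40

def weight_pair(angle_deg):
    """Map a shooting angle to a (first_weight, second_weight) pair summing to 1."""
    return (0.6, 0.4) if angle_deg <= 30 else (0.4, 0.6)

def select_camera(readings):
    """readings: list of (camera_id, distance_cm, angle_deg).
    Returns the camera id with the highest weighted target evaluation value."""
    best_id, best_score = None, float("-inf")
    for cam_id, dist, angle in readings:
        w1, w2 = weight_pair(angle)
        # target evaluation value = w1 * first evaluation + w2 * second evaluation
        score = w1 * first_eval(dist) + w2 * second_eval(angle)
        if score > best_score:
            best_id, best_score = cam_id, score
    return best_id
```

A close, head-on camera thus outscores a distant, oblique one and is the one triggered to shoot.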
102. And carrying out target identification on the target image to obtain a target garbage type corresponding to the garbage delivered by the user.
In a specific implementation, the intelligent garbage can may perform target recognition on the target image, recognizing the garbage type of the garbage shown in the target image to obtain the target garbage type corresponding to the garbage delivered by the user. In this embodiment of the present application, the garbage type may be at least one of the following: a dry waste type, a wet waste type, a hazardous waste type, a recyclable waste type, and the like; of course, these may be further subdivided, without limitation herein.
In a possible example, the step 102 of performing target recognition on the target image to obtain a target garbage type corresponding to the garbage delivered by the user may include the following steps:
21. extracting a target from the target image to obtain a target area;
22. performing character extraction on the target area to obtain a plurality of characters;
23. performing keyword extraction on the plurality of characters to obtain a plurality of keywords;
24. determining the target object type corresponding to the plurality of keywords;
25. and determining the target garbage type corresponding to the target object type according to a mapping relation between the preset object type and the target garbage type.
Because much household garbage carries character identifiers that indicate what the item is, and in a specific implementation the target image may comprise not only the garbage but also a background, the intelligent garbage can may perform target extraction on the target image to obtain a target area, which is an image containing only the garbage. Character extraction may then be performed on the target area to obtain a plurality of characters; further, keyword extraction (for example, through semantic recognition and semantic segmentation) may be performed on the plurality of characters to obtain a plurality of keywords, and the object type corresponding to the garbage may be determined according to the plurality of keywords, for example: ice cream, toilet paper, wet tissue, milk powder, and the like, without limitation herein. The intelligent garbage can may also pre-store a mapping relation between preset object types and target garbage types, and may further determine the target garbage type corresponding to the target object type according to this mapping relation.
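The two-stage lookup in steps 21-25 (keywords → object type → garbage type) can be sketched as a pair of mapping tables. All table entries below are assumed examples; the patent does not specify the contents of the preset mapping relations:

```python
# Hypothetical sketch of steps 21-25: keywords from on-package text are
# mapped to an object type, which is mapped to a garbage type.
# Both tables are illustrative assumptions.

ITEM_TYPE_BY_KEYWORD = {
    "milk powder": "food packaging",
    "ice cream": "food packaging",
    "toilet paper": "tissue",
}

GARBAGE_TYPE_BY_ITEM = {
    "food packaging": "recyclable waste",
    "tissue": "dry waste",
}

def classify_by_keywords(keywords):
    """Return the garbage type implied by the first recognised keyword, or None
    when no keyword matches (the texture-based path A1-A6 would then apply)."""
    for kw in keywords:
        item = ITEM_TYPE_BY_KEYWORD.get(kw)
        if item is not None:
            return GARBAGE_TYPE_BY_ITEM.get(item)
    return None
```

Returning `None` when no keyword matches corresponds to the fallback case the description handles next, where the object type cannot be determined from the keywords.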
Further, the step 102 may further include the following steps:
a1, dividing the target area into a plurality of areas when the plurality of keywords cannot determine the type of the article or the target area does not comprise characters;
a2, evaluating the image quality of each of the plurality of areas to obtain a plurality of image quality evaluation values;
a3, selecting an image quality evaluation value larger than a preset image quality threshold from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and obtaining a region corresponding to the at least one target image quality evaluation value to obtain at least one first region;
a4, extracting lines from each region in the at least one first region to obtain a plurality of lines;
a5, determining a target characteristic parameter set corresponding to the multi-stripe road;
and A6, inputting the target characteristic parameter set into a preset neural network model to obtain the target garbage type corresponding to the target object type.
In a specific implementation, in an embodiment of the present application, the preset neural network model may be at least one of the following: a convolutional neural network model or a recurrent neural network model. The preset neural network model may be obtained by training on characteristic parameter sets extracted from garbage images of different garbage types, and a characteristic parameter set may be at least one of the following: the average width of the lines, the distribution density of the lines, the average length of the lines, and the like. The preset image quality threshold may be set by the user or defaulted by the system. The intelligent garbage can may divide the target area into a plurality of areas when the type of the article cannot be determined from the plurality of keywords or the target area does not include characters; further, image quality evaluation may be performed on each of the plurality of areas to obtain a plurality of image quality evaluation values. Specifically, at least one image quality evaluation index may be used to evaluate each area, where the image quality evaluation index may be at least one of the following: information entropy, average gray level, edge retention, sharpness, and the like, without limitation herein.
Further, the intelligent garbage can may select image quality evaluation values greater than the preset image quality threshold from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and obtain the area corresponding to each of the at least one target image quality evaluation value to obtain at least one first area. Line extraction may then be performed on each of the at least one first area to obtain a plurality of lines, where the specific line-extraction method may be at least one of the following: Hough transform, Canny operator, Sobel operator, and the like, without limitation herein. Furthermore, the target characteristic parameter set corresponding to the plurality of lines may be determined; in this embodiment of the present application, the target characteristic parameter set may be at least one of the following: the average line width, the line distribution density, the average line length, and the like, without limitation herein. Finally, the target characteristic parameter set may be input into the preset neural network model to obtain the target garbage type corresponding to the target object type.
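Steps A2-A3, scoring each sub-region and keeping only those above the preset quality threshold, can be sketched using information entropy, one of the quality indices the description lists. The threshold value and region representation are assumptions for illustration:

```python
# Hypothetical sketch of steps A2-A3: evaluate image quality per sub-region
# via Shannon entropy (one of the listed indices), then keep only regions
# whose score exceeds a preset threshold. The threshold is an assumed value.
import math
from collections import Counter

def entropy_score(pixels):
    """Shannon entropy of a flat list of grey-level values (0-255).
    A flat, featureless region scores 0; a varied region scores higher."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def filter_regions(regions, threshold=1.0):
    """regions: list of (region_id, pixel_list).
    Returns the ids of regions scoring above the quality threshold."""
    return [rid for rid, px in regions if entropy_score(px) > threshold]
```

The surviving regions would then feed the line-extraction and neural-network steps A4-A6.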
103. And determining a target garbage loading area corresponding to the target garbage type, wherein the intelligent garbage can comprises a plurality of garbage loading areas, and each garbage loading area corresponds to one garbage type.
In a specific implementation, different garbage loading areas can be used for loading different garbage, the intelligent garbage can comprise a plurality of garbage loading areas, each garbage loading area corresponds to one garbage type, and further, a target garbage loading area corresponding to a target garbage type can be determined.
In a possible example, the determining, in step 103, of the target garbage loading area corresponding to the target garbage type may be implemented as follows:
and determining a target garbage loading area corresponding to the target garbage type according to a mapping relation between the preset garbage type and the garbage loading area.
The intelligent garbage can may pre-store a mapping relation between preset garbage types and garbage loading areas; in a specific implementation, the intelligent garbage can may determine the target garbage loading area corresponding to the target garbage type according to this mapping relation. In this embodiment of the present application, the garbage type may be at least one of the following: a dry waste type, a wet waste type, a hazardous waste type, a recyclable waste type, and the like, without limitation herein; specifically, each garbage loading area is used to load garbage of its corresponding garbage type.
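The preset type-to-area mapping amounts to a simple lookup table. The area labels below are assumptions for illustration; the patent only requires that each type map to one loading area:

```python
# Hypothetical sketch of the preset garbage-type -> loading-area mapping
# used in step 103. Area labels are assumed example values.

LOADING_AREA_BY_TYPE = {
    "dry waste": "area 1 (dry waste loading area)",
    "wet waste": "area 2 (wet waste loading area)",
    "hazardous waste": "area 3 (hazardous waste loading area)",
    "recyclable waste": "area 4 (recyclable waste loading area)",
}

def target_loading_area(garbage_type):
    """Return the loading area preset for the recognised garbage type."""
    return LOADING_AREA_BY_TYPE[garbage_type]
```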
104. And prompting the user to deliver the garbage to the target garbage loading area.
In a specific implementation, the intelligent garbage can prompts a user to deliver garbage to a target garbage loading area in a voice or display mode.
In one possible example, after the step 104, the following steps may be further included:
b1, detecting the bearing capacity of each garbage loading area in the garbage loading areas to obtain a plurality of bearing capacities;
and B2, when any bearing capacity of the plurality of bearing capacities exceeds a preset threshold, sending cleaning request information to an administrator, wherein the cleaning request information carries state information of a garbage loading area to be cleaned, and the garbage loading area to be cleaned is a garbage loading area with the bearing capacity exceeding the preset threshold.
The preset threshold value can be set by the user or default by the system. In a specific implementation, the amount of garbage that each garbage loading area can bear is limited, so that the intelligent garbage can detect the bearing capacity of each garbage loading area in a plurality of garbage loading areas to obtain a plurality of bearing capacities, when any bearing capacity in the plurality of bearing capacities exceeds a preset threshold, the intelligent garbage can send cleaning request information to a corresponding administrator, the cleaning request information can carry state information of the garbage loading area to be cleaned, and the state information can include: the garbage carrying capacity, the maximum carrying capacity, the last cleaning time and the like of each garbage carrying area are not limited herein, and the garbage carrying area to be cleaned is a garbage carrying area with the carrying capacity exceeding a preset threshold.
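Steps B1-B2, comparing each area's measured load against the preset threshold and building a cleaning request for over-threshold areas, can be sketched as follows. The field names and kilogram units are illustrative assumptions; the patent specifies only that the request carries state information for the areas to be cleaned:

```python
# Hypothetical sketch of steps B1-B2: detect the bearing capacity of each
# loading area and build cleaning requests for areas over the preset
# threshold. Field names and units (kg) are assumed for illustration.

def cleaning_requests(loads, threshold):
    """loads: dict of area name -> current load (kg).
    Returns one cleaning-request dict per area exceeding the threshold."""
    return [
        {"area": area, "load_kg": load, "threshold_kg": threshold}
        for area, load in loads.items()
        if load > threshold
    ]
```

In a full implementation, each request would be sent to the administrator together with further state information such as maximum capacity and last cleaning time, as the description notes.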
Furthermore, the intelligent garbage can may pack up the garbage in the garbage loading area to be cleaned, unload the packed garbage, and apply a new garbage bag to the garbage loading area to be cleaned, the new garbage bag being used to contain garbage of the garbage type corresponding to that area.
Further, in one possible example, the step 104 may further include the following steps:
C1, detecting whether the garbage is delivered to an incorrect garbage loading area;
C2, when the garbage is delivered to an incorrect garbage loading area, reclassifying the garbage through a mechanical arm of the intelligent garbage can.
In a specific implementation, when a user delivers garbage to an incorrect garbage loading area, the intelligent garbage can may reclassify the garbage and deliver it to the correct garbage loading area. Specifically, a mechanical arm may be integrated in the intelligent garbage can, and the mechanical arm may integrate at least one of the following tools: pliers, forceps, hammers, screwdrivers, rails, blades, pipettes, and the like, which is not limited herein. Furthermore, the intelligent garbage can may classify garbage through the mechanical arm.
Further, in this embodiment of the application, the intelligent garbage can may simultaneously receive garbage delivered by multiple users and classify the garbage delivered by the multiple users. Specifically, the intelligent garbage can may adopt multithreading or multiprocessing to implement garbage classification, wherein each thread or process is used to process the garbage delivered by one user.
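The one-worker-per-delivery scheme can be sketched as below. This is a minimal illustration, not the patent's implementation: `classify` is a stand-in for the full capture-recognize-map pipeline, and all names are invented for the example.

```python
# Illustrative sketch of the multithreaded scheme: each delivery is handled
# by its own worker, so several users can be served concurrently.
from concurrent.futures import ThreadPoolExecutor

def classify(delivery):
    # placeholder for: capture image -> recognize garbage type -> map to area
    return {"user": delivery["user"], "area": delivery["item"] + "_area"}

deliveries = [{"user": "u1", "item": "bottle"},
              {"user": "u2", "item": "peel"},
              {"user": "u3", "item": "battery"}]

with ThreadPoolExecutor(max_workers=3) as pool:  # one worker per delivery
    results = list(pool.map(classify, deliveries))
print(results)
```

`ThreadPoolExecutor.map` preserves the input order, so each result can be matched back to the user who made the delivery; a process pool (`ProcessPoolExecutor`) would correspond to the multiprocessing variant mentioned above.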
It can be seen that the garbage classification processing method described in this embodiment of the application is applied to an intelligent garbage can that includes at least one camera. When a user delivers garbage, the at least one camera captures a target image; target recognition is performed on the target image to obtain the target garbage type corresponding to the delivered garbage; a target garbage loading area corresponding to the target garbage type is determined, wherein the intelligent garbage can includes a plurality of garbage loading areas and each garbage loading area corresponds to one garbage type; and the user is prompted to deliver the garbage to the target garbage loading area. In this way, the garbage can be identified at delivery time and its garbage type determined, the user is prompted to perform the corresponding classification, and garbage classification efficiency is improved.
In accordance with the embodiment shown in fig. 1B, please refer to fig. 2, fig. 2 is a flow chart of a garbage classification processing method provided in the embodiment of the present application, which is applied to an intelligent garbage can, wherein the intelligent garbage can includes at least one camera, and as shown in the figure, the garbage classification processing method includes:
201. When the user delivers garbage, shooting is carried out through the at least one camera, and a target image is obtained.
202. And carrying out target identification on the target image to obtain a target garbage type corresponding to the garbage delivered by the user.
203. And determining a target garbage loading area corresponding to the target garbage type, wherein the intelligent garbage can comprises a plurality of garbage loading areas, and each garbage loading area corresponds to one garbage type.
204. And prompting the user to deliver the garbage to the target garbage loading area.
205. And detecting the bearing capacity of each garbage loading area in the garbage loading areas to obtain a plurality of bearing capacities.
206. When any bearing capacity of the plurality of bearing capacities exceeds a preset threshold, sending cleaning request information to an administrator, wherein the cleaning request information carries state information of a garbage loading area to be cleaned, and the garbage loading area to be cleaned is a garbage loading area with the bearing capacity exceeding the preset threshold.
The specific description of the steps 201 to 206 may refer to the corresponding steps of the garbage classification processing method described in fig. 1B, and will not be repeated herein.
It can be seen that the garbage classification processing method described in this embodiment of the application is applied to an intelligent garbage can that includes at least one camera. When a user delivers garbage, the at least one camera captures a target image; target recognition is performed on the target image to obtain the target garbage type corresponding to the delivered garbage; a target garbage loading area corresponding to the target garbage type is determined, wherein the intelligent garbage can includes a plurality of garbage loading areas and each garbage loading area corresponds to one garbage type; and the user is prompted to deliver the garbage to the target garbage loading area. In addition, the bearing capacity of each of the plurality of garbage loading areas is detected to obtain a plurality of bearing capacities; when any of the plurality of bearing capacities exceeds a preset threshold, cleaning request information is sent to an administrator, wherein the cleaning request information carries state information of the garbage loading area to be cleaned, and the garbage loading area to be cleaned is the garbage loading area whose bearing capacity exceeds the preset threshold. In this way, the garbage can be identified at delivery time and its garbage type determined, the user is prompted to perform the corresponding classification, and garbage classification efficiency is improved.
In accordance with the foregoing embodiments, referring to fig. 3, fig. 3 is a schematic structural diagram of an intelligent garbage can provided in an embodiment of the present application, where the intelligent garbage can includes a processor, a memory, a communication interface, and one or more programs, and may further include at least one camera, where the one or more programs are stored in the memory and configured to be executed by the processor, and in the embodiment of the present application, the programs include instructions for performing the following steps:
when the user delivers garbage, shooting is carried out through the at least one camera to obtain a target image;
performing target identification on the target image to obtain a target garbage type corresponding to garbage delivered by the user;
determining a target garbage loading area corresponding to the target garbage type, wherein the intelligent garbage can comprises a plurality of garbage loading areas, and each garbage loading area corresponds to one garbage type;
and prompting the user to deliver the garbage to the target garbage loading area.
It can be seen that the intelligent garbage can described in this embodiment of the application includes at least one camera. When a user delivers garbage, the at least one camera captures a target image; target recognition is performed on the target image to obtain the target garbage type corresponding to the delivered garbage; a target garbage loading area corresponding to the target garbage type is determined, wherein the intelligent garbage can includes a plurality of garbage loading areas and each garbage loading area corresponds to one garbage type; and the user is prompted to deliver the garbage to the target garbage loading area. In this way, the garbage can be identified at delivery time and its garbage type determined, the user is prompted to perform the corresponding classification, and garbage classification efficiency is improved.
In one possible example, in terms of determining the target garbage loading area corresponding to the target garbage type, the program includes instructions for performing the following steps:
and determining a target garbage loading area corresponding to the target garbage type according to a mapping relation between the preset garbage type and the garbage loading area.
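The preset mapping relation between garbage type and loading area could be as simple as a lookup table; the sketch below is illustrative, with invented type and area names.

```python
# Minimal sketch of a preset mapping relation between garbage type and
# garbage loading area. The keys and area identifiers are illustrative.
AREA_BY_TYPE = {
    "recyclable": "area_1",
    "kitchen": "area_2",
    "hazardous": "area_3",
    "other": "area_4",
}

def target_loading_area(garbage_type):
    """Return the loading area corresponding to the recognized garbage type."""
    return AREA_BY_TYPE[garbage_type]

print(target_loading_area("hazardous"))  # area_3
```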
In one possible example, the above-described program further includes instructions for performing the steps of:
detecting the bearing capacity of each garbage loading area in the garbage loading areas to obtain a plurality of bearing capacities;
when any bearing capacity of the plurality of bearing capacities exceeds a preset threshold, sending cleaning request information to an administrator, wherein the cleaning request information carries state information of a garbage loading area to be cleaned, and the garbage loading area to be cleaned is a garbage loading area with the bearing capacity exceeding the preset threshold.
In one possible example, the intelligent garbage can includes a plurality of cameras, and the program includes instructions for performing the following steps in terms of capturing a target image by the at least one camera:
determining the distance and the shooting angle between each camera of the plurality of cameras and the garbage to obtain a plurality of distance values and a plurality of shooting angle values, wherein each camera corresponds to one distance value and one shooting angle value;
Determining a target first evaluation value corresponding to each distance value in the plurality of distance values according to a mapping relation between a preset distance and the first evaluation value, so as to obtain a plurality of target first evaluation values;
determining a target second evaluation value corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between a preset shooting angle and the second evaluation value, so as to obtain a plurality of target second evaluation values;
determining a target weight pair corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between the preset shooting angle and the weight pair to obtain a plurality of target weight pairs, wherein each target weight pair comprises a target first weight and a target second weight, the target first weight is a weight corresponding to the first evaluation value, the target second weight is a weight corresponding to the second evaluation value, and the sum of the target first weight and the target second weight is 1;
weighting operation is carried out according to the plurality of target first evaluation values, the plurality of target second evaluation values and the plurality of target weight pairs to obtain a plurality of target evaluation values, and each camera in the plurality of cameras corresponds to one target evaluation value;
And controlling a camera corresponding to the maximum value in the plurality of target evaluation values to shoot, so as to obtain the target image.
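The camera-selection steps above can be sketched as follows. Note this is a hedged illustration: the patent only specifies that distance maps to a first evaluation value, shooting angle maps to a second evaluation value and to a weight pair summing to 1, and the camera with the highest weighted score is chosen; the concrete mapping functions and numbers below are invented for the example.

```python
# Sketch of the camera-selection scoring. Each camera's distance and angle
# are mapped to evaluation values; the angle also selects a weight pair
# (w1 + w2 = 1); the camera with the highest weighted score shoots.
def first_eval(distance):          # preset distance -> first evaluation value
    return max(0.0, 1.0 - distance / 100.0)

def second_eval(angle):            # preset angle -> second evaluation value
    return max(0.0, 1.0 - abs(angle) / 90.0)

def weight_pair(angle):            # preset angle -> (w1, w2), summing to 1
    w1 = 0.6 if abs(angle) < 30 else 0.4
    return w1, 1.0 - w1

def best_camera(cameras):
    """cameras: list of (camera_id, distance, angle); return the id with the
    maximum target evaluation value."""
    def score(cam):
        _, d, a = cam
        w1, w2 = weight_pair(a)
        return w1 * first_eval(d) + w2 * second_eval(a)
    return max(cameras, key=score)[0]

print(best_camera([("cam0", 40, 10), ("cam1", 20, 60), ("cam2", 80, 5)]))  # cam0
```

Here "cam0" wins because its moderate distance and small angle give the best weighted combination, even though "cam1" is closer.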
In one possible example, in terms of the target recognition on the target image to obtain a target garbage type corresponding to the garbage delivered by the user, the program includes instructions for performing the following steps:
extracting a target from the target image to obtain a target area;
extracting the characters from the target area to obtain a plurality of characters;
performing keyword extraction on the plurality of characters to obtain a plurality of keywords;
determining the types of the target objects corresponding to the keywords;
and determining the target garbage type corresponding to the target object type according to a mapping relation between the preset object type and the target garbage type.
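The text-based recognition path above (characters, then keywords, then object type, then garbage type) can be sketched as below; both lookup tables and all names are invented for illustration and are not from the patent.

```python
# Illustrative sketch: OCR keywords from the target area map to an object
# type, and the object type maps to a garbage type via preset tables.
KEYWORD_TO_OBJECT = {"cola": "beverage bottle", "juice": "beverage bottle",
                     "battery": "battery", "chips": "snack wrapper"}
OBJECT_TO_GARBAGE = {"beverage bottle": "recyclable", "battery": "hazardous",
                     "snack wrapper": "other"}

def garbage_type_from_keywords(keywords):
    """Return the garbage type implied by the keywords, or None if the
    object type cannot be determined (triggering the image-based fallback)."""
    for kw in keywords:
        obj = KEYWORD_TO_OBJECT.get(kw.lower())
        if obj is not None:
            return OBJECT_TO_GARBAGE[obj]
    return None

print(garbage_type_from_keywords(["Cola", "500ml"]))  # recyclable
```

Returning `None` corresponds to the case handled next, where the keywords cannot determine the object type and the method falls back to region-based image analysis.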
In one possible example, the above-described program further includes instructions for performing the steps of:
dividing the target area into a plurality of areas when the object type cannot be determined from the plurality of keywords or the target area does not include characters;
performing image quality evaluation on each of the plurality of areas to obtain a plurality of image quality evaluation values;
Selecting an image quality evaluation value greater than a preset image quality threshold from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and obtaining a region corresponding to the at least one target image quality evaluation value to obtain at least one first region;
extracting lines from each region in the at least one first region to obtain a plurality of lines;
determining a target characteristic parameter set corresponding to the plurality of lines;
and inputting the target characteristic parameter set into a preset neural network model to obtain the target garbage type corresponding to the target object type.
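The image-based fallback path can be sketched as follows. This is a stand-in sketch only: the patent does not define the image quality metric or the line features, so mean intensity and per-region contrast are used here purely as placeholders, and the final neural-network classification step is omitted.

```python
# Sketch of the fallback path: divide the target area into regions, score
# each region's image quality, keep regions above a preset quality
# threshold, and extract simple line/texture features from those regions
# to form the characteristic parameter set fed to a classifier.
def split_regions(area, n):
    """Divide a list of pixel rows into n contiguous regions."""
    step = max(1, len(area) // n)
    return [area[i:i + step] for i in range(0, len(area), step)][:n]

def quality(region):
    # stand-in image quality evaluation value: mean pixel intensity
    flat = [p for row in region for p in row]
    return sum(flat) / len(flat)

def feature_set(regions, q_threshold):
    kept = [r for r in regions if quality(r) > q_threshold]  # first regions
    # stand-in line feature: per-region contrast (max - min intensity)
    return [max(p for row in r for p in row) - min(p for row in r for p in row)
            for r in kept]

area = [[10, 200], [20, 220], [5, 8], [7, 9]]
regions = split_regions(area, 2)
print(feature_set(regions, q_threshold=50))  # [210]: only the bright region survives
```

The resulting parameter set would then be input to the preset neural network model to obtain the target garbage type.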
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that the intelligent garbage can, in order to implement the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
According to the embodiment of the application, the functional units of the intelligent garbage can be divided according to the method, for example, each functional unit can be divided corresponding to each function, and two or more functions can be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
Fig. 4A is a functional block diagram of a garbage classification processing apparatus 400 according to an embodiment of the present application. The garbage classification processing apparatus 400 is applied to an intelligent garbage can that includes at least one camera, and the apparatus includes: a photographing unit 401, an identifying unit 402, a determining unit 403, and a prompting unit 404, wherein,
the shooting unit 401 is configured to obtain a target image by shooting through the at least one camera when the user delivers the garbage;
the identifying unit 402 is configured to perform target identification on the target image, so as to obtain a target garbage type corresponding to the garbage delivered by the user;
The determining unit 403 is configured to determine a target garbage loading area corresponding to the target garbage type, where the intelligent garbage can includes a plurality of garbage loading areas, and each garbage loading area corresponds to a garbage type;
the prompting unit 404 is configured to prompt a user to deliver the garbage to the target garbage loading area.
In one possible example, in the aspect of determining the target garbage loading area corresponding to the target garbage type, the determining unit 403 is specifically configured to:
and determining a target garbage loading area corresponding to the target garbage type according to a mapping relation between the preset garbage type and the garbage loading area.
In one possible example, as shown in fig. 4B, fig. 4B is a modified structure of the garbage classification apparatus described in fig. 4A. Compared with fig. 4A, the apparatus may further include a detecting unit 405 and a sending unit 406, which are specifically as follows:
the detecting unit 405 is configured to detect a bearing capacity of each of the plurality of garbage loading areas, to obtain a plurality of bearing capacities;
the sending unit 406 is configured to send, when any one of the multiple bearing capacities exceeds a preset threshold, cleaning request information to an administrator, where the cleaning request information carries status information of a garbage loading area to be cleaned, and the garbage loading area to be cleaned is a garbage loading area with a bearing capacity exceeding the preset threshold.
In one possible example, the intelligent garbage can includes a plurality of cameras, and in the aspect of capturing by the at least one camera, the capturing unit 401 is specifically configured to:
determining the distance and the shooting angle between each camera of the plurality of cameras and the garbage to obtain a plurality of distance values and a plurality of shooting angle values, wherein each camera corresponds to one distance value and one shooting angle value;
determining a target first evaluation value corresponding to each distance value in the plurality of distance values according to a mapping relation between a preset distance and the first evaluation value, so as to obtain a plurality of target first evaluation values;
determining a target second evaluation value corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between a preset shooting angle and the second evaluation value, so as to obtain a plurality of target second evaluation values;
determining a target weight pair corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between the preset shooting angle and the weight pair to obtain a plurality of target weight pairs, wherein each target weight pair comprises a target first weight and a target second weight, the target first weight is a weight corresponding to the first evaluation value, the target second weight is a weight corresponding to the second evaluation value, and the sum of the target first weight and the target second weight is 1;
Weighting operation is carried out according to the plurality of target first evaluation values, the plurality of target second evaluation values and the plurality of target weight pairs to obtain a plurality of target evaluation values, and each camera in the plurality of cameras corresponds to one target evaluation value;
and controlling a camera corresponding to the maximum value in the plurality of target evaluation values to shoot, so as to obtain the target image.
In one possible example, in terms of performing target recognition on the target image to obtain a target garbage type corresponding to the garbage delivered by the user, the recognition unit 402 is specifically configured to:
extracting a target from the target image to obtain a target area;
extracting the characters from the target area to obtain a plurality of characters;
performing keyword extraction on the plurality of characters to obtain a plurality of keywords;
determining the types of the target objects corresponding to the keywords;
and determining the target garbage type corresponding to the target object type according to a mapping relation between the preset object type and the target garbage type.
In one possible example, the identification unit 402 is further specifically configured to:
dividing the target area into a plurality of areas when the object type cannot be determined from the plurality of keywords or the target area does not include characters;
Performing image quality evaluation on each of the plurality of areas to obtain a plurality of image quality evaluation values;
selecting an image quality evaluation value greater than a preset image quality threshold from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and obtaining a region corresponding to the at least one target image quality evaluation value to obtain at least one first region;
extracting lines from each region in the at least one first region to obtain a plurality of lines;
determining a target characteristic parameter set corresponding to the plurality of lines;
and inputting the target characteristic parameter set into a preset neural network model to obtain the target garbage type corresponding to the target object type.
It may be understood that the functions of each program module of the garbage classification processing apparatus of the present embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the relevant description of the foregoing method embodiment, which is not repeated herein.
The embodiment of the application also provides a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute part or all of the steps of any one of the method embodiments, the computer including an intelligent garbage can.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the methods described in the method embodiments above. The computer program product may be a software installation package, the computer including an intelligent garbage can.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, such as the above-described division of units, merely a division of logic functions, and there may be additional manners of dividing in actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, or may be in electrical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the above-mentioned method of the various embodiments of the present application. And the aforementioned memory includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be implemented by a program that instructs associated hardware, and the program may be stored in a computer readable memory, which may include: flash disk, read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), magnetic disk or optical disk.
The foregoing has outlined rather broadly the more detailed description of embodiments of the present application, wherein specific examples are provided herein to illustrate the principles and embodiments of the present application, the above examples being provided solely to assist in the understanding of the methods of the present application and the core ideas thereof; meanwhile, as those skilled in the art will have modifications in the specific embodiments and application scope in accordance with the ideas of the present application, the present description should not be construed as limiting the present application in view of the above.

Claims (7)

1. A garbage classification processing method, characterized in that the method is applied to an intelligent garbage can, the intelligent garbage can comprising at least one camera, the method comprising:
when the user delivers garbage, shooting is carried out through the at least one camera to obtain a target image;
Performing target identification on the target image to obtain a target garbage type corresponding to garbage delivered by the user;
determining a target garbage loading area corresponding to the target garbage type, wherein the intelligent garbage can comprises a plurality of garbage loading areas, and each garbage loading area corresponds to one garbage type;
prompting a user to deliver the garbage to the target garbage loading area;
detecting whether the refuse is delivered to an incorrect refuse loading area;
reclassifying the garbage by a robotic arm of the intelligent garbage can when the garbage is delivered to an incorrect garbage loading area;
wherein the intelligent garbage can is further configured to: receive garbage delivered by multiple users and classify the garbage delivered by the multiple users, specifically: implement garbage classification by multithreading or multiprocessing, wherein each thread or process is used to process the garbage delivered by one user;
wherein the intelligent garbage can includes a plurality of cameras, and the shooting through the at least one camera to obtain a target image includes:
determining the distance and the shooting angle between each camera of the plurality of cameras and the garbage to obtain a plurality of distance values and a plurality of shooting angle values, wherein each camera corresponds to one distance value and one shooting angle value;
Determining a target first evaluation value corresponding to each distance value in the plurality of distance values according to a mapping relation between a preset distance and the first evaluation value, so as to obtain a plurality of target first evaluation values;
determining a target second evaluation value corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between a preset shooting angle and the second evaluation value, so as to obtain a plurality of target second evaluation values;
determining a target weight pair corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between the preset shooting angle and the weight pair to obtain a plurality of target weight pairs, wherein each target weight pair comprises a target first weight and a target second weight, the target first weight is a weight corresponding to the first evaluation value, the target second weight is a weight corresponding to the second evaluation value, and the sum of the target first weight and the target second weight is 1;
weighting operation is carried out according to the plurality of target first evaluation values, the plurality of target second evaluation values and the plurality of target weight pairs to obtain a plurality of target evaluation values, and each camera in the plurality of cameras corresponds to one target evaluation value;
Controlling a camera corresponding to the maximum value in the multiple target evaluation values to shoot, so as to obtain the target image;
the target recognition is performed on the target image to obtain a target garbage type corresponding to garbage delivered by the user, which comprises the following steps:
extracting a target from the target image to obtain a target area;
extracting the characters from the target area to obtain a plurality of characters;
performing keyword extraction on the plurality of characters to obtain a plurality of keywords;
determining the types of the target objects corresponding to the keywords;
determining a target garbage type corresponding to a target object type according to a mapping relation between the preset object type and the target garbage type;
wherein the method further comprises:
dividing the target area into a plurality of areas when the object type cannot be determined from the plurality of keywords or the target area does not include characters;
performing image quality evaluation on each of the plurality of areas to obtain a plurality of image quality evaluation values;
selecting an image quality evaluation value greater than a preset image quality threshold from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and obtaining a region corresponding to the at least one target image quality evaluation value to obtain at least one first region;
Extracting lines from each region in the at least one first region to obtain a plurality of lines;
determining a target characteristic parameter set corresponding to the plurality of lines;
and inputting the target characteristic parameter set into a preset neural network model to obtain the target garbage type corresponding to the target object type.
2. The method of claim 1, wherein the determining the target garbage loading area corresponding to the target garbage type comprises:
and determining a target garbage loading area corresponding to the target garbage type according to a mapping relation between the preset garbage type and the garbage loading area.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
detecting the bearing capacity of each garbage loading area in the garbage loading areas to obtain a plurality of bearing capacities;
when any bearing capacity of the plurality of bearing capacities exceeds a preset threshold, sending cleaning request information to an administrator, wherein the cleaning request information carries state information of a garbage loading area to be cleaned, and the garbage loading area to be cleaned is a garbage loading area with the bearing capacity exceeding the preset threshold.
4. A garbage classification processing apparatus, characterized in that the apparatus is applied to an intelligent garbage can, the intelligent garbage can comprising at least one camera, the apparatus comprising: a shooting unit, an identification unit, a determining unit and a prompting unit, wherein,
the shooting unit is used for shooting through the at least one camera when the user delivers garbage to obtain a target image;
the identification unit is used for carrying out target identification on the target image to obtain a target garbage type corresponding to the garbage delivered by the user;
the determining unit is used for determining a target garbage loading area corresponding to the target garbage type, wherein the intelligent garbage can comprises a plurality of garbage loading areas, and each garbage loading area corresponds to one garbage type;
the prompting unit is used for prompting a user to deliver the garbage to the target garbage loading area;
wherein the device is further used for:
detecting whether the garbage is delivered to an incorrect garbage loading area;
when the garbage is delivered to an incorrect garbage loading area, reclassifying the garbage by a robotic arm of the intelligent garbage can;
wherein the intelligent garbage can is further used for receiving and classifying garbage delivered by multiple persons, specifically: the garbage classification is implemented by multiple threads or multiple processes, each thread or process processing the garbage delivered by one person;
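The one-thread-per-person arrangement described above can be sketched with a thread pool; `classify` here is a hypothetical placeholder for the full per-user recognition pipeline:

```python
from concurrent.futures import ThreadPoolExecutor

def classify(delivery):
    # Placeholder for the per-user recognition pipeline; the delivery
    # record is assumed to already carry its garbage type.
    return delivery["garbage_type"]

def classify_deliveries(deliveries):
    """Process each person's delivery on its own worker thread,
    mirroring the claim's one-thread-per-person arrangement."""
    with ThreadPoolExecutor(max_workers=max(1, len(deliveries))) as pool:
        return list(pool.map(classify, deliveries))
```

`Executor.map` preserves input order, so each result still corresponds to the person who delivered it.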
wherein the intelligent garbage can comprises a plurality of cameras, and the shooting through the at least one camera to obtain the target image comprises:
determining the distance and the shooting angle between each camera of the plurality of cameras and the garbage to obtain a plurality of distance values and a plurality of shooting angle values, wherein each camera corresponds to one distance value and one shooting angle value;
determining a target first evaluation value corresponding to each distance value in the plurality of distance values according to a mapping relation between a preset distance and the first evaluation value, so as to obtain a plurality of target first evaluation values;
determining a target second evaluation value corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between a preset shooting angle and the second evaluation value, so as to obtain a plurality of target second evaluation values;
determining a target weight pair corresponding to each shooting angle value in the plurality of shooting angle values according to a mapping relation between the preset shooting angle and the weight pair to obtain a plurality of target weight pairs, wherein each target weight pair comprises a target first weight and a target second weight, the target first weight is a weight corresponding to the first evaluation value, the target second weight is a weight corresponding to the second evaluation value, and the sum of the target first weight and the target second weight is 1;
performing a weighting operation according to the plurality of target first evaluation values, the plurality of target second evaluation values and the plurality of target weight pairs to obtain a plurality of target evaluation values, wherein each camera in the plurality of cameras corresponds to one target evaluation value;
controlling a camera corresponding to the maximum value in the multiple target evaluation values to shoot, so as to obtain the target image;
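The camera-selection steps above reduce to a weighted score per camera: score = w1·f(distance) + w2·g(angle) with w1 + w2 = 1, after which the highest-scoring camera shoots. The mapping functions and weight pair below are illustrative assumptions, since the patent only states that preset mapping relations exist:

```python
def select_camera(distances, angles, dist_eval, angle_eval, weight_pair):
    """Score each camera as w1 * first_evaluation + w2 * second_evaluation
    and return the index of the camera with the maximum target evaluation
    value; that camera is the one controlled to shoot."""
    best_index, best_score = 0, float("-inf")
    for i, (d, a) in enumerate(zip(distances, angles)):
        w1, w2 = weight_pair(a)  # the weight pair depends on the shooting angle
        score = w1 * dist_eval(d) + w2 * angle_eval(a)
        if score > best_score:
            best_index, best_score = i, score
    return best_index

# Illustrative preset mappings: closer and more head-on scores higher.
dist_eval = lambda d: 1.0 / (1.0 + d)
angle_eval = lambda a: 1.0 - abs(a) / 90.0
weight_pair = lambda a: (0.6, 0.4)  # w1 + w2 = 1, as the claim requires
```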
wherein the performing target recognition on the target image to obtain the target garbage type corresponding to the garbage delivered by the user comprises:
extracting a target from the target image to obtain a target area;
extracting the characters from the target area to obtain a plurality of characters;
extracting keywords from the plurality of characters to obtain a plurality of keywords;
determining a target object type corresponding to the plurality of keywords;
determining the target garbage type corresponding to the target object type according to a preset mapping relation between object types and garbage types;
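The two chained mapping relations above (keywords → object type → garbage type) can be sketched with lookup tables; every entry below is an illustrative assumption, as the patent does not disclose concrete mappings:

```python
# Hypothetical preset mappings from OCR keywords to object types and on
# to garbage types.
KEYWORD_TO_OBJECT_TYPE = {"cola": "beverage_bottle", "battery": "dry_cell"}
OBJECT_TYPE_TO_GARBAGE_TYPE = {
    "beverage_bottle": "recyclable",
    "dry_cell": "hazardous",
}

def garbage_type_from_keywords(keywords):
    """Resolve OCR keywords to an object type, then map the object type
    to a garbage type; return None when no keyword matches, in which
    case the texture-based fallback branch would run."""
    for keyword in keywords:
        object_type = KEYWORD_TO_OBJECT_TYPE.get(keyword)
        if object_type is not None:
            return OBJECT_TYPE_TO_GARBAGE_TYPE[object_type]
    return None
```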
wherein the device is further used for:
when the object type cannot be determined from the plurality of keywords, or the target area does not include characters, dividing the target area into a plurality of areas;
performing image quality evaluation on each of the plurality of areas to obtain a plurality of image quality evaluation values;
selecting image quality evaluation values greater than a preset image quality threshold from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and obtaining the region corresponding to the at least one target image quality evaluation value to obtain at least one first region;
extracting lines from each region in the at least one first region to obtain a plurality of lines;
determining a target characteristic parameter set corresponding to the plurality of lines;
and inputting the target characteristic parameter set into a preset neural network model to obtain the target garbage type corresponding to the target object type.
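The quality-filtering step in the fallback branch above keeps only sub-regions scoring above the preset image quality threshold. A minimal sketch, with the quality metric left as a caller-supplied function because the patent does not specify one:

```python
def select_first_regions(regions, quality_of, threshold):
    """Keep the sub-regions whose image quality evaluation value exceeds
    the preset threshold; these 'first regions' then feed the line
    extraction and the neural-network classification."""
    return [region for region in regions if quality_of(region) > threshold]
```

In practice the line extraction over each first region could use a standard detector such as a Hough transform, though the patent does not name one.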
5. The apparatus according to claim 4, wherein in the determining the target garbage loading area corresponding to the target garbage type, the determining unit is specifically configured to:
and determining a target garbage loading area corresponding to the target garbage type according to a mapping relation between the preset garbage type and the garbage loading area.
6. An intelligent garbage can, comprising a processor and a memory, the memory being used for storing one or more programs configured to be executed by the processor, the one or more programs comprising instructions for performing the steps in the method of any one of claims 1-3.
7. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-3.
CN201911119137.8A 2019-11-15 2019-11-15 Garbage classification processing method and related products Active CN110991271B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911119137.8A CN110991271B (en) 2019-11-15 2019-11-15 Garbage classification processing method and related products


Publications (2)

Publication Number Publication Date
CN110991271A CN110991271A (en) 2020-04-10
CN110991271B true CN110991271B (en) 2023-05-23

Family

ID=70084377

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911119137.8A Active CN110991271B (en) 2019-11-15 2019-11-15 Garbage classification processing method and related products

Country Status (1)

Country Link
CN (1) CN110991271B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112849815A (en) * 2020-12-30 2021-05-28 深兰人工智能芯片研究院(江苏)有限公司 Control method and device of manipulator, intelligent garbage can and storage medium
CN112949509A (en) * 2021-03-08 2021-06-11 三一智造(深圳)有限公司 Garbage classification method based on artificial intelligence
CN113362333A (en) * 2021-07-07 2021-09-07 李有俊 Garbage classification box management method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104074151A (en) * 2013-03-25 2014-10-01 刘永飞 Robot for collecting garbage
CN107022965B (en) * 2017-05-17 2018-09-28 巢湖学院 A kind of rubbish is picked up and sorter
CN107585482A (en) * 2017-10-23 2018-01-16 西北农林科技大学 A kind of road garbage case and garbage truck
CN108182455A (en) * 2018-01-18 2018-06-19 齐鲁工业大学 A kind of method, apparatus and intelligent garbage bin of the classification of rubbish image intelligent
CN108861188B (en) * 2018-05-22 2021-02-23 广州云景环保科技有限公司 Intelligent garbage classification and recovery robot
CN109684979B (en) * 2018-12-18 2021-11-30 深圳云天励飞技术有限公司 Image recognition technology-based garbage classification method and device and electronic equipment
CN110087193A (en) * 2019-03-05 2019-08-02 采之翼(北京)科技有限公司 Information uploading method, device, electronic equipment and the readable storage medium storing program for executing of dustbin


Similar Documents

Publication Publication Date Title
CN110991271B (en) Garbage classification processing method and related products
CN110795999B (en) Garbage delivery behavior analysis method and related product
CN105564864B (en) Dustbin, the refuse classification method of dustbin and system
CN110294236B (en) Garbage classification monitoring device and method and server system
CN109684979B (en) Image recognition technology-based garbage classification method and device and electronic equipment
CN102930265B (en) A kind of many I.D.s scan method and device
CN111008571B (en) Indoor garbage treatment method and related product
CN110087193A (en) Information uploading method, device, electronic equipment and the readable storage medium storing program for executing of dustbin
CN113128397B (en) Monitoring method, system, device and storage medium for garbage classification delivery
CN111814517B (en) Garbage delivery detection method and related product
CN112499017A (en) Garbage classification method and device and garbage can
CN110929693A (en) Intelligent garbage classification method and system
CN107123076A (en) A kind of waste management system and waste management method
CN112926431A (en) Garbage detection method, device, equipment and computer storage medium
CN111832749B (en) Garbage bag identification method and related device
KR20210074929A (en) deep learning based garbage distribution system
CN112241651A (en) Data display method and system, data processing method, storage medium and system
KR20180097421A (en) Apparatus and method for collecting bottle
CN112241747A (en) Object sorting method, device, sorting equipment and storage medium
CN109178706A (en) A kind of intelligent garbage bin
CN112263189B (en) Sweeping robot and method for distinguishing and cleaning garbage
CN113705638A (en) Mobile vehicle-mounted intelligent garbage information management method and system
CN212048939U (en) Intelligent garbage recycling device for community
CN108257297A (en) Recovery method, device and system
CN105446183B (en) Recycle terminal and its control method in smart city

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant