CN113362333A - Garbage classification box management method and device - Google Patents


Info

Publication number
CN113362333A
Authority
CN
China
Prior art keywords
image, garbage, tree, value, pixel
Prior art date
Legal status
Pending
Application number
CN202110770167.6A
Other languages
Chinese (zh)
Inventor
李有俊
王宏
Current Assignee
Individual
Original Assignee
Individual
Priority date
Application filed by Individual
Priority to CN202110770167.6A
Publication of CN113362333A
Status: Pending

Classifications

    • G06T 7/10: Image analysis; Segmentation; Edge detection
    • G06F 18/22: Pattern recognition; Analysing; Matching criteria, e.g. proximity measures
    • G06T 7/136: Image analysis; Segmentation; Edge detection involving thresholding
    • G06T 7/90: Image analysis; Determination of colour characteristics

Abstract

The application provides a management method and device for a garbage classification box. The method comprises the following steps: acquiring a target image of a target garbage can captured by an image acquisition module; identifying the target garbage can in the target image, determining a first image corresponding to the target garbage can, and counting the pixel points of the first image; segmenting the first image based on its pixel points to generate segmented images of each piece of garbage in the target garbage can, wherein each segmented image is a geometric image of one piece of garbage; and comparing the segmented images with a preset image library corresponding to the target garbage can to obtain the geometric feature matching degree between each segmented image and a matching image in the preset image library. In this way, it can be effectively identified whether the garbage discarded by a user belongs to the target garbage can, which makes collection more convenient for managers and reduces their workload.

Description

Garbage classification box management method and device
Technical Field
The application relates to the technical field of garbage classification, in particular to a garbage classification box management method and device.
Background
Garbage is solid waste generated in human life and production. It is discharged in large quantities, its composition is complex and varied, and it is at once a pollutant, a potential resource, and a social concern.
In recent years, more and more cities have begun to implement garbage classification. Garbage classification generally refers to the series of activities of storing, disposing of and transporting garbage according to certain rules or standards so that it can be turned into a public resource. Classification aims to increase the resource and economic value of garbage and make the best use of it, to reduce the amount of garbage to be treated and the treatment equipment required, to lower treatment costs, and to reduce the consumption of land resources; it therefore brings social, economic, ecological and other benefits.
However, existing garbage classification boxes are only labelled with the category of each garbage can, and users often fail to discard garbage according to its category. This makes collection very inconvenient for managers and increases their workload.
Disclosure of Invention
An object of the embodiments of the present application is to provide a garbage classification box management method and device, so as to address the problem that existing garbage classification boxes are only labelled with the category of each garbage can, users often fail to discard garbage according to its category, collection is therefore very inconvenient for managers, and their workload is increased.
The invention is realized by the following steps:
In a first aspect, an embodiment of the present application provides a garbage classification box management method applied to a controller of a garbage classification box. The garbage classification box includes an image acquisition module, a kitchen garbage can, a recyclable garbage can, a harmful garbage can and a dry garbage can; the kitchen garbage can, the recyclable garbage can, the harmful garbage can and the dry garbage can are transparent garbage cans; an image acquisition module is respectively disposed outside each garbage can and is electrically connected with the controller. The method includes: acquiring a target image of a target garbage can captured by the image acquisition module; identifying the target garbage can in the target image, determining a first image corresponding to the target garbage can, and counting the pixel points of the first image; segmenting the first image based on its pixel points to generate segmented images of each piece of garbage in the target garbage can, wherein each segmented image is a geometric image of one piece of garbage; and comparing the segmented images with a preset image library corresponding to the target garbage can to obtain the geometric feature matching degree between each segmented image and a matching image in the preset image library. When a first segmented image whose geometric feature matching degree is lower than a preset threshold exists, this indicates that the garbage corresponding to the first segmented image does not belong to the target garbage can.
In the embodiment of the present application, an image acquisition module is disposed outside each garbage can, and each garbage can is transparent, so that the image acquisition module can capture images of the garbage inside. The captured image is segmented at the pixel level to generate segmented images of each piece of garbage in the target garbage can, and the segmented images are then compared with a preset image library corresponding to the target garbage can to obtain the geometric feature matching degree between each segmented image and a matching image in the preset image library. In this way, it can be effectively identified whether the garbage discarded by a user belongs to the target garbage can, which makes collection more convenient for managers and reduces their workload.
With reference to the technical solution provided by the first aspect, in some possible implementations, segmenting the first image based on its pixel points to generate segmented images of each piece of garbage in the target garbage can includes: defining each pixel point on the first image as a tree and determining the gray value of each tree; acquiring the similarity distance between two adjacent trees on the first image, where the similarity distance represents the similarity between the two adjacent trees as determined from their gray values; when the similarity distance between two adjacent trees is smaller than a preset threshold, merging the two adjacent trees into one tree and re-determining the gray value of the merged tree, until no trees can be merged any further; and segmenting the image along the trees that can no longer be merged to generate segmented images of each piece of garbage in the target garbage can.
In the embodiment of the present application, when the image is segmented, each pixel point on the first image is first defined as a tree and the gray value of each tree is determined; then the similarity distance between two adjacent trees on the first image is acquired, two adjacent trees are merged into one tree whenever their similarity distance is smaller than a preset threshold, and the gray value of the merged tree is re-determined, until no trees can be merged any further; finally, the image is segmented along the trees that can no longer be merged to generate segmented images of each piece of garbage in the target garbage can. In this way, each piece of garbage can be effectively segmented based on the gray values of the first image, which improves the accuracy of image segmentation.
With reference to the technical solution provided by the first aspect, in some possible implementations, determining the gray value of each tree includes: performing RGB mean graying to obtain the gray value of each tree. The formula for RGB mean graying is: I = (R + G + B)/3, where I represents the gray value of a tree, R represents the pixel value of the tree's red channel, G represents the pixel value of its green channel, and B represents the pixel value of its blue channel.
In the embodiment of the present application, RGB mean graying is used when determining the gray value of each tree, so that a balanced average value is obtained that can subsequently be used to produce reasonable segmented images.
With reference to the technical solution provided by the first aspect, in some possible implementations, segmenting the first image based on its pixel points to generate segmented images of each piece of garbage in the target garbage can includes: defining each pixel point on the first image as a tree; and acquiring the color difference values of the three RGB channels between two adjacent trees on the first image, wherein the color difference value is calculated by the following formula:
[Color difference formula, reproduced as image BDA0003152673910000031 in the original publication]
wherein VALUE represents the color difference value of any one of the three RGB channels between adjacent trees a and b, Cia represents the number of pixel points with pixel value i in tree a, and Cib represents the number of pixel points with pixel value i in tree b; when the color difference values of all three RGB channels between two adjacent trees are smaller than a preset threshold, merging the two adjacent trees into one tree and re-determining the color values of the three RGB channels of the merged tree, until no trees can be merged any further; and segmenting the image along the trees that can no longer be merged to generate segmented images of each piece of garbage in the target garbage can.
In the embodiment of the present application, when the image is segmented, each pixel point on the first image is defined as a tree; the color difference values of the three RGB channels between two adjacent trees on the first image are then acquired; when the color difference values of the three RGB channels between two adjacent trees are smaller than a preset threshold, the two trees are merged into one tree and the color values of the three RGB channels of the merged tree are re-determined, until no trees can be merged any further; and the image is segmented along the trees that can no longer be merged to generate segmented images of each piece of garbage in the target garbage can. In this way, each piece of garbage can be effectively segmented based on the colors of the three RGB channels of the pixels of the first image, which further improves the accuracy of the segmented images.
With reference to the technical solution provided by the first aspect, in some possible implementation manners, the re-determining color values of three RGB channels of the merged tree includes: acquiring a pixel value of a red channel of each pixel point in the merged tree; taking the average value of the pixel values of the red channels of each pixel point in the merged tree as the pixel value of the red channel of the merged tree; acquiring a pixel value of a green channel of each pixel point in the merged tree; taking the average value of the pixel values of the green channels of each pixel point in the merged tree as the pixel value of the green channel of the merged tree; acquiring a pixel value of a blue channel of each pixel point in the merged tree; and taking the average value of the pixel values of the blue channel of each pixel point in the combined tree as the pixel value of the blue channel of the combined tree.
In the embodiment of the present application, the average of the pixel values of each channel over all pixel points in the merged tree is used as the pixel value of the corresponding channel of the merged tree. In this way, a representative average color value of the merged tree is obtained, which can subsequently be used to produce reasonable segmented images.
With reference to the technical solution provided by the first aspect, in some possible implementation manners, when the number of the pixel points in the merged tree is greater than two, the re-determining the color values of the three RGB channels of the merged tree includes: acquiring a pixel value of a red channel of each pixel point in the merged tree; removing the maximum value and the minimum value of the pixel values of the red channels in the combined tree, and taking the average value of the pixel values of the remaining red channels as the pixel value of the red channel of the combined tree; acquiring a pixel value of a green channel of each pixel point in the merged tree; removing the maximum value and the minimum value of the pixel values of the green channels in the combined tree, and taking the average value of the pixel values of the remaining green channels as the pixel value of the green channel of the combined tree; acquiring a pixel value of a blue channel of each pixel point in the merged tree; and removing the maximum value and the minimum value of the pixel values of the blue channel in the combined tree, and taking the average value of the pixel values of the residual blue channels as the pixel value of the blue channel of the combined tree.
The color values of the three RGB channels of the merged tree need to be re-determined. When the number of pixel points in the merged tree is greater than two, the embodiment of the present application first removes the maximum and minimum pixel values in each channel of the merged tree and then takes the average of the remaining pixel values as the pixel value of that channel of the merged tree. In this way, the influence of extreme values on the segmentation is effectively removed, and the accuracy of the segmented images is improved.
With reference to the technical solution provided by the first aspect, in some possible implementations, when a first segmented image whose geometric feature matching degree is lower than the preset threshold exists, the method further includes comparing the first segmented image with the matching images of the preset image libraries corresponding to the remaining three garbage cans; and when the geometric feature matching degree between the first segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, controlling a manipulator to move the garbage corresponding to the first segmented image into the garbage can corresponding to that matching image.
In the embodiment of the present application, when the geometric feature matching degree of a segmented image of garbage is lower than the preset threshold, the image is compared with the matching images of the preset image libraries corresponding to the other three garbage cans, and when the geometric feature matching degree between the first segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, the manipulator is controlled to move the garbage corresponding to the first segmented image into the garbage can corresponding to that matching image. In this way, the garbage classification box can automatically recognize and sort garbage: by identifying the category of an individual piece of garbage and then controlling the manipulator to place it in the corresponding garbage can, the burden on managers is further reduced.
With reference to the technical solution provided by the first aspect, in some possible implementations, different types of garbage cans correspond to garbage bags of different colors; correspondingly, the method further includes: acquiring the color values of the segmented images; comparing the segmented images with the preset image library corresponding to the target garbage can to obtain the color matching degree between each segmented image and a matching image in the preset image library; and when a second segmented image whose color matching degree is lower than a preset threshold exists, this indicates that the garbage corresponding to the second segmented image does not belong to the target garbage can.
In the embodiment of the present application, each type of garbage can corresponds to a garbage bag of a different color. Garbage bags of different colors are used to hold different types of garbage, and when discarding garbage a user should place a bag of a given color into the garbage can of the corresponding color; in practice, however, users also discard bags of different colors at random. The embodiment of the present application therefore acquires the color values of the segmented images, compares the segmented images with the preset image library corresponding to the target garbage can to obtain the color matching degree between each segmented image and a matching image in the preset image library, and, when a second segmented image whose color matching degree is lower than the preset threshold exists, determines that the garbage corresponding to the second segmented image does not belong to the target garbage can. In this way, it can be effectively identified from its color whether the garbage discarded by a user belongs to the target garbage can, which makes collection more convenient for managers and reduces their workload.
With reference to the technical solution provided by the first aspect, in some possible implementations, when a second segmented image whose color matching degree is lower than the preset threshold exists, the method further includes comparing the second segmented image with the matching images of the preset image libraries corresponding to the remaining three garbage cans; and when the color matching degree between the second segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, controlling the manipulator to move the garbage corresponding to the second segmented image into the garbage can corresponding to that matching image.
In the embodiment of the present application, when the color matching degree of a segmented image of garbage is lower than the preset threshold, the image is compared with the matching images of the preset image libraries corresponding to the other three garbage cans, and when the color matching degree between the second segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, the manipulator is controlled to move the garbage corresponding to the second segmented image into the garbage can corresponding to that matching image. In this way, the garbage classification box can automatically recognize and sort garbage: by identifying the color category of an individual piece of garbage and then controlling the manipulator to place it in the corresponding garbage can, the burden on managers is further reduced.
In a second aspect, an embodiment of the present application provides a garbage classification box management device applied to a controller of a garbage classification box. The garbage classification box includes an image acquisition module, a kitchen garbage can, a recyclable garbage can, a harmful garbage can and a dry garbage can; the kitchen garbage can, the recyclable garbage can, the harmful garbage can and the dry garbage can are transparent garbage cans; an image acquisition module is respectively disposed outside each garbage can and is electrically connected with the controller. The device includes: a first acquisition module, configured to acquire a target image of the target garbage can captured by the image acquisition module; a determining module, configured to identify the target garbage can in the target image, determine a first image corresponding to the target garbage can, and count the pixel points of the first image; a generating module, configured to segment the first image based on its pixel points to generate segmented images of each piece of garbage in the target garbage can, wherein each segmented image is a geometric image of one piece of garbage; and a second acquisition module, configured to compare the segmented images with a preset image library corresponding to the target garbage can to obtain the geometric feature matching degree between each segmented image and a matching image in the preset image library. When a first segmented image whose geometric feature matching degree is lower than a preset threshold exists, this indicates that the garbage corresponding to the first segmented image does not belong to the target garbage can.
In a third aspect, an embodiment of the present application provides a garbage classification bin, including: the garbage classification box comprises a controller and an image acquisition module connected with the controller, wherein the image acquisition module is respectively arranged outside a kitchen garbage can, a recyclable garbage can, a harmful garbage can and a dry garbage can of the garbage classification box; the kitchen waste bin, the recyclable bin, the harmful bin and the dry bin are all transparent bins; the controller is configured to perform a method as provided in the above-described first aspect embodiment and/or in combination with some possible implementations of the above-described first aspect embodiment.
In a fourth aspect, embodiments of the present application provide a storage medium having stored thereon a computer program, which, when executed by a processor, performs a method as provided in the above-described first aspect embodiment and/or in connection with some possible implementations of the above-described first aspect embodiment.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic structural diagram of a garbage classification bin according to an embodiment of the present application.
Fig. 2 is a flowchart illustrating steps of a garbage classification bin management method according to an embodiment of the present application.
Fig. 3 is a flowchart illustrating steps of another garbage bin management method according to an embodiment of the present application.
Fig. 4 is a block diagram of a garbage classification bin management device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
In view of the fact that existing garbage classification boxes are only labelled with the category of each garbage can, that users often fail to discard garbage according to its category, and that managers are therefore greatly inconvenienced when collecting the garbage and their workload is increased, the embodiments of the present application provide the following garbage classification box management method and device.
Referring to fig. 1, an embodiment of the present application provides a garbage classification bin applying a garbage classification bin management method. The garbage classification box comprises a kitchen garbage can, a recyclable garbage can, a harmful garbage can and a dry garbage can. Wherein, the kitchen garbage can, the recyclable garbage can, the harmful garbage can and the dry garbage can are transparent garbage cans. In the circuit structure, the garbage classification box comprises a controller and an image acquisition module. The image acquisition modules are respectively arranged outside the garbage cans and are electrically connected with the controller.
The controller may be an integrated circuit chip having signal processing capabilities. The controller may also be a general-purpose Processor, for example, a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a discrete gate or transistor logic device, or a discrete hardware component, which may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present Application. Further, a general purpose processor may be a microprocessor or any conventional processor or the like.
It should be understood that the configuration shown in fig. 1 is merely illustrative, and the trash sorting bin provided by the embodiments of the present application may have fewer or more components than those shown in fig. 1, or may have a different configuration than that shown in fig. 1. Further, the components shown in fig. 1 may be implemented by software, hardware, or a combination thereof.
Referring to fig. 2, fig. 2 is a flowchart illustrating the steps of a garbage classification box management method according to an embodiment of the present application; the method is applied to the controller of the garbage classification box shown in fig. 1. It should be noted that the method is not limited to the order of the steps shown in fig. 2 and described below. The specific procedure is described below with reference to fig. 2. The method comprises steps S101 to S104.
Step S101: and acquiring a target image of the target garbage can acquired by the image acquisition module.
Step S102: and identifying a target garbage can in the target image, determining a first image corresponding to the target garbage can, and counting pixel points of the first image.
Step S103: based on the pixel points of the first image, the first image is segmented, and segmented images of all the garbage in the target garbage can are generated; wherein the segmented image is a geometric image of each garbage.
Step S104: comparing the segmented image with a preset image library corresponding to the target garbage can to obtain the geometric feature matching degree of the segmented image and a matching image in the preset image library; when a first segmentation image with the geometric feature matching degree lower than a preset threshold exists, representing that the garbage corresponding to the first segmentation image does not belong to the target garbage can.
To sum up, in the embodiment of the present application, an image acquisition module is disposed outside each garbage can, and each garbage can is transparent, so that the image acquisition module can capture images of the garbage inside. The controller segments the captured image based on its pixel points to generate segmented images of each piece of garbage in the target garbage can, and then compares the segmented images with a preset image library corresponding to the target garbage can to obtain the geometric feature matching degree between each segmented image and a matching image in the preset image library. In this way, it can be effectively identified whether the garbage discarded by a user belongs to the target garbage can, which makes collection more convenient for managers and reduces their workload.
The above steps are described in detail with reference to specific examples.
Referring to fig. 3, optionally, segmenting the first image based on its pixel points in step S103 to generate segmented images of each piece of garbage in the target garbage can specifically includes steps S201 to S204.
Step S201: defining each pixel point on the first image as a tree, and determining the gray value of each tree.
Step S202: acquiring the similarity distance between two adjacent trees on the first image, where the similarity distance represents the similarity between the two adjacent trees as determined from their gray values.
Step S203: when the similarity distance between two adjacent trees is smaller than a preset threshold, merging the two adjacent trees into one tree and re-determining the gray value of the merged tree, until no trees can be merged any further.
Step S204: segmenting the image along the trees that can no longer be merged to generate segmented images of each piece of garbage in the target garbage can.
That is, in the embodiment of the present application, when the image is segmented, each pixel point on the first image is first defined as a tree and the gray value of each tree is determined; then the similarity distance between two adjacent trees on the first image is acquired, two adjacent trees are merged into one tree whenever their similarity distance is smaller than a preset threshold, and the gray value of the merged tree is re-determined, until no trees can be merged any further; finally, the image is segmented along the trees that can no longer be merged to generate segmented images of each piece of garbage in the target garbage can. In this way, each piece of garbage can be effectively segmented based on the gray values of the first image, which improves the accuracy of image segmentation.
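As a concrete illustration, the following Python sketch implements this gray-value merging with a union-find structure. It is a minimal sketch rather than the patented implementation: it assumes 4-connected adjacency, takes the similarity distance to be the absolute difference of the trees' mean gray values, and expects the first image as an H x W x 3 RGB array; none of these choices are fixed by the text beyond the gray-value criterion itself.

import numpy as np

def segment_by_gray(first_image, threshold=10.0):
    """Merge 4-connected pixel 'trees' whose mean gray values differ by less than threshold."""
    h, w, _ = first_image.shape
    gray = first_image.astype(np.float64).mean(axis=2).ravel()  # I = (R + G + B) / 3 per pixel
    parent = np.arange(h * w)           # every pixel starts as its own tree
    tree_sum = gray.copy()              # running sum of gray values per tree root
    tree_count = np.ones(h * w)         # pixel count per tree root

    def find(x):                        # union-find root lookup with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    # adjacent pixel pairs: right neighbours and lower neighbours (4-connectivity)
    pairs = [(y * w + x, y * w + x + 1) for y in range(h) for x in range(w - 1)]
    pairs += [(y * w + x, (y + 1) * w + x) for y in range(h - 1) for x in range(w)]

    changed = True
    while changed:                      # repeat until no two adjacent trees can merge
        changed = False
        for a, b in pairs:
            ra, rb = find(a), find(b)
            if ra == rb:
                continue
            mean_a = tree_sum[ra] / tree_count[ra]
            mean_b = tree_sum[rb] / tree_count[rb]
            if abs(mean_a - mean_b) < threshold:   # similarity distance below threshold
                parent[rb] = ra                    # merge tree b into tree a
                tree_sum[ra] += tree_sum[rb]
                tree_count[ra] += tree_count[rb]
                changed = True

    labels = np.array([find(i) for i in range(h * w)]).reshape(h, w)
    return labels                       # one label per final tree / candidate garbage region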
Optionally, the determining the gray value of each tree comprises: and graying by adopting the RGB mean value to obtain the gray value of each tree.
The formula for RGB mean graying is: I = (R + G + B)/3. In this formula, I represents the gray value of a tree; R represents the pixel value of the tree's red channel; G represents the pixel value of its green channel; and B represents the pixel value of its blue channel.
In the embodiment of the present application, RGB mean graying is used when determining the gray value of each tree, so that a balanced average value is obtained that can subsequently be used to produce reasonable segmented images.
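A minimal numpy expression of this equal-weight graying is shown below; it assumes the input is an H x W x 3 image array. Note that the equal-weight mean stated here differs from the weighted luminance conversion many imaging libraries apply by default.

import numpy as np

def rgb_mean_gray(image):
    # I = (R + G + B) / 3 for every pixel (each single-pixel "tree")
    return image.astype(np.float64).mean(axis=2)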
As another embodiment, segmenting the first image based on its pixel points to generate segmented images of each piece of garbage in the target garbage can may also include: defining each pixel point on the first image as a tree; acquiring the color difference values of the three RGB channels between two adjacent trees on the first image; when the color difference values of the three RGB channels between two adjacent trees are smaller than a preset threshold, merging the two adjacent trees into one tree and re-determining the color values of the three RGB channels of the merged tree, until no trees can be merged any further; and segmenting the image along the trees that can no longer be merged to generate segmented images of each piece of garbage in the target garbage can.
In the above step, the color difference value is calculated by the following formula:
[Color difference formula, reproduced as image BDA0003152673910000111 in the original publication]
wherein VALUE represents the color difference value of any one of the three RGB channels between adjacent trees a and b, Cia represents the number of pixel points with pixel value i in tree a, and Cib represents the number of pixel points with pixel value i in tree b.
in the embodiment of the application, when the image is segmented, each pixel point on the first image is defined as a tree; then acquiring color difference values of three channels of RGB between two adjacent trees on the first image; when the color difference value of the RGB three channels between two adjacent trees is smaller than a preset threshold value, merging the three channels into one tree, and re-determining the color numerical value of the RGB three channels of the merged tree until all the trees can not be merged; and segmenting all the trees which can not be combined any more to generate segmented images of all the garbage in the target garbage can. Through this mode, can realize effectively cutting apart to each rubbish based on the colour of the three passageway of RGB of the pixel of first image, further improvement cut apart the degree of accuracy of image.
As a first implementation, in the above step, the re-determining the color values of the three RGB channels of the merged tree includes: acquiring a pixel value of a red channel of each pixel point in the combined tree; taking the average value of the pixel values of the red channels of each pixel point in the combined tree as the pixel value of the red channel of the combined tree; acquiring a pixel value of a green channel of each pixel point in the combined tree; taking the average value of the pixel values of the green channels of all the pixel points in the combined tree as the pixel value of the green channel of the combined tree; acquiring a pixel value of a blue channel of each pixel point in the combined tree; and taking the average value of the pixel values of the blue channels of each pixel point in the combined tree as the pixel value of the blue channel of the combined tree.
It should be noted that the color values of the three RGB channels of the merged tree need to be re-determined. In this embodiment of the present application, the average of the pixel values of each channel over all pixel points in the merged tree is used as the pixel value of the corresponding channel of the merged tree. In this way, a representative average color value of the merged tree is obtained, which can subsequently be used to produce reasonable segmented images.
As a second implementation manner, in the above step, when the number of the pixel points in the merged tree is greater than two, re-determining the color values of three channels of RGB of the merged tree includes: acquiring a pixel value of a red channel of each pixel point in the combined tree; removing the maximum value and the minimum value of the pixel values of the red channels in the combined tree, and taking the average value of the pixel values of the remaining red channels as the pixel value of the red channel of the combined tree; acquiring a pixel value of a green channel of each pixel point in the combined tree; removing the maximum value and the minimum value of the pixel values of the green channels in the combined tree, and taking the average value of the pixel values of the remaining green channels as the pixel value of the green channel of the combined tree; acquiring a pixel value of a blue channel of each pixel point in the combined tree; the maximum value and the minimum value of the pixel values of the blue channel in the merged tree are removed, and the average value of the pixel values of the remaining blue channels is taken as the pixel value of the blue channel of the merged tree.
It should be noted that the color values of the three RGB channels of the merged tree need to be re-determined. When the number of pixel points in the merged tree is greater than two, this embodiment of the present application first removes the maximum and minimum pixel values in each channel of the merged tree and then takes the average of the remaining pixel values as the pixel value of that channel of the merged tree.
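The sketch below combines the two rules just described: a plain per-channel mean for small trees, and the drop-one-maximum, drop-one-minimum trimmed mean when the merged tree holds more than two pixel points. It assumes the tree's pixels are given as an N x 3 array and that removing the maximum and minimum means removing one occurrence of each, which the text does not make explicit.

import numpy as np

def merged_tree_color(pixels):
    # pixels: N x 3 array with the RGB values of all pixel points in the merged tree
    values = pixels.astype(np.float64)
    color = []
    for c in range(3):                      # red, green and blue channels in turn
        channel = np.sort(values[:, c])
        if len(channel) > 2:
            channel = channel[1:-1]         # drop one minimum and one maximum value
        color.append(channel.mean())
    return color                            # [R, G, B] of the merged tree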
Optionally, when a first segmented image whose geometric feature matching degree is lower than the preset threshold exists, the method further includes comparing the first segmented image with the matching images of the preset image libraries corresponding to the remaining three garbage cans; and when the geometric feature matching degree between the first segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, controlling the manipulator to move the garbage corresponding to the first segmented image into the garbage can corresponding to that matching image.
It should be noted that a manipulator is an automatic operating device that can imitate certain movements of the human hand and arm to grasp and carry objects or operate tools according to a fixed program. It can carry out a variety of expected operations through programming and combines, in structure and performance, the advantages of both humans and machines. The manipulator is arranged inside the garbage classification box.
That is, in the embodiment of the present application, when the geometric feature matching degree of a segmented image of garbage is found to be lower than the preset threshold, the image is compared with the matching images of the preset image libraries corresponding to the other three garbage cans, and when the geometric feature matching degree between the first segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, the manipulator is controlled to move the garbage corresponding to the first segmented image into the garbage can corresponding to that matching image. In this way, the garbage classification box can automatically recognize and sort garbage: by identifying the category of an individual piece of garbage and then controlling the manipulator to place it in the corresponding garbage can, the burden on managers is further reduced.
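The publication does not specify how the geometric feature matching degree is computed. As one hedged possibility, the sketch below scores shapes with OpenCV's Hu-moment comparison (cv2.matchShapes) and converts its dissimilarity into a 0-to-1 matching degree; the surrounding decision logic (check the target bin's library first, then the other three) follows the text, while matching_degree, libraries and move_to_bin are illustrative names, with move_to_bin standing in for the manipulator control that is not detailed here.

import cv2

def matching_degree(mask_a, mask_b):
    # mask_a, mask_b: binary uint8 masks (segmented garbage vs. library image)
    d = cv2.matchShapes(mask_a, mask_b, cv2.CONTOURS_MATCH_I1, 0)
    return 1.0 / (1.0 + d)                  # 1.0 for identical shapes, toward 0 as they diverge

def reassign_if_misplaced(segment_mask, bin_id, libraries, threshold, move_to_bin):
    # libraries: dict mapping bin id -> list of reference masks; move_to_bin: manipulator callback
    best_here = max(matching_degree(segment_mask, ref) for ref in libraries[bin_id])
    if best_here >= threshold:
        return bin_id                        # the garbage belongs to the target garbage can
    for other_id, refs in libraries.items():  # compare with the other three bins' libraries
        if other_id == bin_id:
            continue
        if max(matching_degree(segment_mask, ref) for ref in refs) > threshold:
            move_to_bin(segment_mask, other_id)   # drive the manipulator to relocate the garbage
            return other_id
    return None                              # no library matched above the threshold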
Optionally, the embodiment of the present application also provides a scheme in which different types of garbage cans correspond to garbage bags of different colors: for example, the kitchen garbage can corresponds to red garbage bags, the recyclable garbage can to green garbage bags, the harmful garbage can to purple garbage bags, and the dry garbage can to black garbage bags.
Correspondingly, the garbage classification box management method provided by the embodiment of the present application further includes: acquiring the color values of the segmented images; comparing the segmented images with the preset image library corresponding to the target garbage can to obtain the color matching degree between each segmented image and a matching image in the preset image library; and when a second segmented image whose color matching degree is lower than a preset threshold exists, determining that the garbage corresponding to the second segmented image does not belong to the target garbage can.
That is, in the embodiment of the present application, each type of garbage can corresponds to a garbage bag of a different color. Garbage bags of different colors are used to hold different types of garbage, and when discarding garbage a user should place a bag of a given color into the garbage can of the corresponding color; in practice, however, users also discard bags of different colors at random. The embodiment of the present application therefore acquires the color values of the segmented images, compares the segmented images with the preset image library corresponding to the target garbage can to obtain the color matching degree between each segmented image and a matching image in the preset image library, and, when a second segmented image whose color matching degree is lower than the preset threshold exists, determines that the garbage corresponding to the second segmented image does not belong to the target garbage can. In this way, it can be effectively identified from its color whether the garbage discarded by a user belongs to the target garbage can, which makes collection more convenient for managers and reduces their workload.
Optionally, when a second segmented image whose color matching degree is lower than the preset threshold exists, the method further includes comparing the second segmented image with the matching images of the preset image libraries corresponding to the remaining three garbage cans; and when the color matching degree between the second segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, controlling the manipulator to move the garbage corresponding to the second segmented image into the garbage can corresponding to that matching image.
That is, in the embodiment of the present application, when the color matching degree of a segmented image of garbage is found to be lower than the preset threshold, the image is compared with the matching images of the preset image libraries corresponding to the other three garbage cans, and when the color matching degree between the second segmented image and a matching image in one of those preset image libraries is higher than the preset threshold, the manipulator is controlled to move the garbage corresponding to the second segmented image into the garbage can corresponding to that matching image. In this way, the garbage classification box can automatically recognize and sort garbage: by identifying the color category of an individual piece of garbage and then controlling the manipulator to place it in the corresponding garbage can, the burden on managers is further reduced.
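The color matching degree is likewise not spelled out. As an assumption, the sketch below compares a segmented region's mean RGB color against a reference color, for instance the bag color expected for the bin, and maps the Euclidean distance into a 0-to-1 degree; color_matching_degree and reference_rgb are illustrative names.

import numpy as np

def color_matching_degree(segment_pixels, reference_rgb):
    # segment_pixels: N x 3 array of the segmented region's RGB values
    mean_rgb = segment_pixels.astype(np.float64).mean(axis=0)
    distance = np.linalg.norm(mean_rgb - np.asarray(reference_rgb, dtype=np.float64))
    max_distance = np.linalg.norm([255.0, 255.0, 255.0])   # largest possible RGB distance
    return 1.0 - distance / max_distance                   # 1.0 means an exact color match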
Referring to fig. 4, based on the same inventive concept, an embodiment of the present application further provides a garbage classification bin management device 200, including: a first obtaining module 201, a determining module 202, a generating module 203 and a second obtaining module 204.
The first obtaining module 201 is configured to obtain a target image of the target trash can, which is collected by the image collecting module.
A determining module 202, configured to identify a target trash can in the target image, determine a first image corresponding to the target trash can, and count pixel points of the first image.
A generating module 203, configured to segment the first image based on pixel points of the first image, and generate a segmented image of each trash in the target trash can; wherein the segmented image is a geometric image of each garbage.
A second obtaining module 204, configured to compare the segmented image with a preset image library corresponding to the target trash can, and obtain a geometric feature matching degree between the segmented image and a matching image in the preset image library; when a first segmentation image with the geometric feature matching degree lower than a preset threshold exists, representing that the garbage corresponding to the first segmentation image does not belong to the target garbage can.
It should be noted that, as those skilled in the art can clearly understand, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Based on the same inventive concept, the present application further provides a storage medium, on which a computer program is stored, and when the computer program is executed, the computer program performs the method provided in the foregoing embodiments.
The storage medium may be any available medium that can be accessed by a computer or a data storage device including one or more integrated servers, data centers, and the like. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A garbage classification box management method, applied to a controller of a garbage classification box, wherein the garbage classification box comprises an image acquisition module, a kitchen garbage can, a recyclable garbage can, a harmful garbage can and a dry garbage can, the kitchen garbage can, the recyclable garbage can, the harmful garbage can and the dry garbage can are transparent garbage cans, the image acquisition module is respectively disposed outside each garbage can, and the image acquisition module is electrically connected with the controller, the method comprising:
acquiring a target image of a target garbage can acquired by the image acquisition module;
identifying a target garbage can in the target image, determining a first image corresponding to the target garbage can, and counting pixel points of the first image;
based on the pixel points of the first image, the first image is segmented, and segmented images of all the garbage in the target garbage can are generated; wherein the segmentation image is a geometric image of each garbage;
comparing the segmented image with a preset image library corresponding to the target garbage can to obtain the geometric feature matching degree of the segmented image and a matching image in the preset image library; when a first segmentation image with the geometric feature matching degree lower than a preset threshold exists, representing that the garbage corresponding to the first segmentation image does not belong to the target garbage can.
2. The garbage classification box management method of claim 1, wherein the segmenting the first image based on the pixel points of the first image to generate segmented images of each garbage in the target garbage can comprises:
defining each pixel point on the first image as a tree; and determining the gray value of each tree;
acquiring a similarity distance between two adjacent trees on the first image, wherein the similarity distance represents the similarity between the two adjacent trees determined based on the gray value;
when the similarity distance between two adjacent trees is smaller than a preset threshold value, merging the two adjacent trees into one tree, and re-determining the gray value of the merged tree until all the trees can not be merged;
and segmenting all the trees which can not be combined any more to generate segmented images of all the garbage in the target garbage can.
3. The garbage classification box management method according to claim 2, wherein the determining the gray value of each tree comprises:
graying by using the RGB mean value to obtain the gray value of each tree;
the formula of the RGB mean graying is: I = (R + G + B)/3; wherein I represents the gray value of a tree; R represents the pixel value of the tree's red channel; G represents the pixel value of the tree's green channel; and B represents the pixel value of the tree's blue channel.
4. The garbage classification box management method of claim 1, wherein the segmenting the first image based on the pixel points of the first image to generate segmented images of each garbage in the target garbage can comprises:
defining each pixel point on the first image as a tree;
acquiring color difference values of three RGB channels between two adjacent trees on the first image; wherein the color difference value is calculated by the following formula:
[Color difference formula, reproduced as image FDA0003152673900000021 in the original publication]
wherein VALUE represents the color difference value of any one of the three RGB channels between adjacent trees a and b, Cia represents the number of pixel points with a pixel value of i in the tree a; Cib represents the number of pixel points with the pixel value of i in the tree b;
when the color difference values of the three RGB channels between two adjacent trees are smaller than a preset threshold value, merging the two adjacent trees into one tree, and re-determining the color values of the three RGB channels of the merged tree until all the trees can not be merged;
and segmenting all the trees which can not be combined any more to generate segmented images of all the garbage in the target garbage can.
5. The garbage classification box management method of claim 4, wherein the re-determining color values of the three RGB channels of the merged tree comprises:
acquiring a pixel value of a red channel of each pixel point in the merged tree;
taking the average value of the pixel values of the red channels of each pixel point in the merged tree as the pixel value of the red channel of the merged tree;
acquiring a pixel value of a green channel of each pixel point in the merged tree;
taking the average value of the pixel values of the green channels of each pixel point in the merged tree as the pixel value of the green channel of the merged tree;
acquiring a pixel value of a blue channel of each pixel point in the merged tree;
and taking the average value of the pixel values of the blue channel of each pixel point in the combined tree as the pixel value of the blue channel of the combined tree.
6. The garbage classification box management method according to claim 4, wherein when the number of the pixel points in the merged tree is greater than two, the re-determining the color values of the three RGB channels of the merged tree comprises:
acquiring a pixel value of a red channel of each pixel point in the merged tree;
removing the maximum value and the minimum value of the pixel values of the red channels in the combined tree, and taking the average value of the pixel values of the remaining red channels as the pixel value of the red channel of the combined tree;
acquiring a pixel value of a green channel of each pixel point in the merged tree;
removing the maximum value and the minimum value of the pixel values of the green channels in the combined tree, and taking the average value of the pixel values of the remaining green channels as the pixel value of the green channel of the combined tree;
acquiring a pixel value of a blue channel of each pixel point in the merged tree;
and removing the maximum value and the minimum value of the pixel values of the blue channel in the combined tree, and taking the average value of the pixel values of the residual blue channels as the pixel value of the blue channel of the combined tree.
7. The garbage classification box management method according to claim 1, wherein when there is a first segmented image in which the geometric feature matching degree is lower than a preset threshold, the method further comprises:
comparing the first segmentation image with matching images of preset image libraries corresponding to the other three target garbage cans;
and when the matching degree of the geometric features of the first segmentation image and the matching image in one preset image library is higher than a preset threshold value, controlling the manipulator to move the garbage corresponding to the first segmentation image into a garbage can corresponding to the matching image with the matching degree of the geometric features higher than the preset threshold value.
8. The garbage classification box management method of claim 1, wherein garbage bags of different colors correspond to different types of garbage cans; correspondingly, the method further comprises:
acquiring color values of the segmented image;
comparing the segmented image with a preset image library corresponding to the target garbage can to obtain the color matching degree of the segmented image and a matching image in the preset image library; and when a second segmentation image with the color matching degree lower than a preset threshold exists, representing that the garbage corresponding to the second segmentation image does not belong to the target garbage can.
9. The garbage classification box management method of claim 8, wherein when there is a second segmented image in which the color matching degree is lower than a preset threshold, the method further comprises:
comparing the second segmentation image with matching images of preset image libraries corresponding to the other three target garbage cans;
and when the color matching degree of the second segmentation image and a matching image in one preset image library is higher than a preset threshold value, controlling the manipulator to move the garbage corresponding to the second segmentation image into a garbage can corresponding to the matching image with the color matching degree higher than the preset threshold value.
10. A garbage classification box management device, applied to a controller of a garbage classification box, wherein the garbage classification box comprises an image acquisition module, a kitchen garbage can, a recyclable garbage can, a harmful garbage can and a dry garbage can, the kitchen garbage can, the recyclable garbage can, the harmful garbage can and the dry garbage can are transparent garbage cans, the image acquisition module is respectively disposed outside each garbage can, and the image acquisition module is electrically connected with the controller, the device comprising:
the first acquisition module is used for acquiring a target image of the target garbage can acquired by the image acquisition module;
the determining module is used for identifying a target garbage can in the target image, determining a first image corresponding to the target garbage can, and counting pixel points of the first image;
the generation module is used for segmenting the first image based on pixel points of the first image to generate segmented images of all the garbage in the target garbage can; wherein the segmentation image is a geometric image of each garbage;
the second acquisition module is used for comparing the segmentation image with a preset image library corresponding to the target garbage can to acquire the geometric feature matching degree of the segmentation image and a matching image in the preset image library; when a first segmentation image with the geometric feature matching degree lower than a preset threshold exists, representing that the garbage corresponding to the first segmentation image does not belong to the target garbage can.
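The following Python sketch arranges the four modules of claim 10 as methods of a single controller-side class; the capture, segmentation and matching callables are injected placeholders, not components disclosed in the application, so this is an illustrative assumption rather than the claimed device itself.

```python
from typing import Callable, Dict, List

class GarbageSortingBoxManager:
    """Controller-side sketch of the four modules of claim 10."""

    def __init__(self, capture: Callable, segment: Callable,
                 match_degree: Callable, image_library: Dict[str, List],
                 threshold: float = 0.8):
        self.capture = capture              # image acquisition module, one per bin
        self.segment = segment              # yields a geometric image per piece of garbage
        self.match_degree = match_degree    # geometric feature matching degree of two images
        self.image_library = image_library  # preset image library for each bin type
        self.threshold = threshold

    def acquire(self, bin_id: str):
        # first acquisition module: grab the target image of the target garbage can
        return self.capture(bin_id)

    def determine(self, target_image):
        # determining module: first image of the bin and its pixel-point count
        # (placeholder: assumes the transparent bin already fills the frame)
        return target_image, getattr(target_image, "size", len(target_image))

    def generate(self, first_image):
        # generation module: segmentation images of all the garbage in the bin
        return self.segment(first_image)

    def check(self, segmentation_images, bin_id: str):
        # second acquisition module: flag garbage whose best library match is too low
        library = self.image_library[bin_id]
        return [seg for seg in segmentation_images
                if max(self.match_degree(seg, img) for img in library) < self.threshold]
```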
CN202110770167.6A 2021-07-07 2021-07-07 Garbage classification box management method and device Pending CN113362333A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110770167.6A CN113362333A (en) 2021-07-07 2021-07-07 Garbage classification box management method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110770167.6A CN113362333A (en) 2021-07-07 2021-07-07 Garbage classification box management method and device

Publications (1)

Publication Number Publication Date
CN113362333A (en) 2021-09-07

Family

ID=77538808

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110770167.6A Pending CN113362333A (en) 2021-07-07 2021-07-07 Garbage classification box management method and device

Country Status (1)

Country Link
CN (1) CN113362333A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204507883U (en) * 2015-04-05 2015-07-29 西安航空学院 A kind of garbage classification system
CN107054936A (en) * 2017-03-23 2017-08-18 广东数相智能科技有限公司 A kind of refuse classification prompting dustbin and system based on image recognition
CN107092914A (en) * 2017-03-23 2017-08-25 广东数相智能科技有限公司 Refuse classification method, device and system based on image recognition
CN108182455A (en) * 2018-01-18 2018-06-19 齐鲁工业大学 A kind of method, apparatus and intelligent garbage bin of the classification of rubbish image intelligent
CN110991271A (en) * 2019-11-15 2020-04-10 万翼科技有限公司 Garbage classification processing method and related products
CN111753844A (en) * 2020-06-30 2020-10-09 烟台艾睿光电科技有限公司 Dry and wet garbage classification method, classification box and classification system
CN111784698A (en) * 2020-07-02 2020-10-16 广州信瑞医疗技术有限公司 Image self-adaptive segmentation method and device, electronic equipment and storage medium
CN112766096A (en) * 2021-01-06 2021-05-07 上海净收智能科技有限公司 Recoverable garbage abnormal delivery identification method, system, terminal and throwing device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHENG Zengmu (ed.): 《智能网联汽车技术入门一本通 彩色版》 [Intelligent Connected Vehicle Technology Primer, Color Edition], Beijing: China Machine Press, pages: 137 - 139 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114077877A (en) * 2022-01-19 2022-02-22 人民中科(济南)智能技术有限公司 Newly added garbage identification method and device, computer equipment and storage medium
CN115170935A (en) * 2022-09-09 2022-10-11 南通商翼信息科技有限公司 Trash can state identification method and device understood according to images
CN115170935B (en) * 2022-09-09 2022-12-27 南通商翼信息科技有限公司 Trash can state identification method and device understood according to images

Similar Documents

Publication Publication Date Title
CN110854807B (en) New energy relay protection intelligent checking and early warning system based on OCR technology
CN113362333A (en) Garbage classification box management method and device
JP4139615B2 (en) Event clustering of images using foreground / background segmentation
CN110322438B (en) Training method and automatic detection system for automatic detection model of mycobacterium tuberculosis
CN107480643B (en) Intelligent garbage classification processing robot
CN109948641A (en) Anomaly groups recognition methods and device
CN104408429A (en) Method and device for extracting representative frame of video
JP2001256244A (en) Device and method for sorting image data
EP2956891A2 (en) Segmenting objects in multimedia data
CN104252570A (en) Mass medical image data mining system and realization method thereof
CN107808126A (en) Vehicle retrieval method and device
CN108564579A (en) A kind of distress in concrete detection method and detection device based on temporal and spatial correlations
CN108492296B (en) Wheat ear intelligent counting system and method based on superpixel segmentation
WO2015107722A1 (en) Detection control device, program, detection system, storage medium and detection control method
CN112070135A (en) Power equipment image detection method and device, power equipment and storage medium
CN105404657A (en) CEDD feature and PHOG feature based image retrieval method
CN111223079A (en) Power transmission line detection method and device, storage medium and electronic device
CN110163092A (en) Demographic method, device, equipment and storage medium based on recognition of face
CN111898613A (en) Semi-supervised semantic segmentation model training method, recognition method and device
CN114661810A (en) Lightweight multi-source heterogeneous data fusion method and system
CN109213886A (en) Image search method and system based on image segmentation and Fuzzy Pattern Recognition
KR20140026393A (en) Method for improving classification results of a classifier
CN112966687B (en) Image segmentation model training method and device and communication equipment
Kimura et al. Evaluating retrieval effectiveness of descriptors for searching in large image databases
CN111199228B (en) License plate positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20210907)