CN111241967B - Garbage classification method, garbage classification device, terminal equipment and storage medium - Google Patents


Info

Publication number
CN111241967B
CN111241967B (application CN202010010769.7A)
Authority
CN
China
Prior art keywords
food
information
garbage
storage
food material
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010010769.7A
Other languages
Chinese (zh)
Other versions
CN111241967A (en)
Inventor
宋德超
冼海鹰
杨洋
秦萍
罗大旺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202010010769.7A priority Critical patent/CN111241967B/en
Publication of CN111241967A publication Critical patent/CN111241967A/en
Application granted granted Critical
Publication of CN111241967B publication Critical patent/CN111241967B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01G WEIGHING
    • G01G 19/00 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/53 Querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/55 Clustering; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/26 Government or public services

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application relates to a garbage classification method, a garbage classification device, terminal equipment and a storage medium. The garbage classification method comprises the following steps: acquiring food information corresponding to the food placed in each area; searching the garbage information corresponding to the food in a preset garbage database according to the food information, wherein the food information and the garbage information corresponding to the food are stored in the preset garbage database; and classifying and judging the garbage corresponding to the food based on the garbage information to obtain classification and judgment information of the garbage corresponding to the food, so that the user can be helped to judge in advance the type of garbage the food will generate.

Description

Garbage classification method, garbage classification device, terminal equipment and storage medium
Technical Field
The application relates to the technical field of garbage disposal, and in particular to a garbage classification method, a garbage classification device, terminal equipment and a storage medium.
Background
At present, garbage classification has become a trend in environmental management, and pilot garbage classification programs have been launched in cities across China. In a user's home, kitchen waste is the largest source of household garbage. In daily life, however, people often lack the relevant expertise to determine in advance which category the garbage corresponding to various food materials belongs to.
Disclosure of Invention
The main purpose of the application is to provide a garbage classification method, a garbage classification device, terminal equipment and a storage medium, which effectively solve the problem in the prior art that people cannot classify the garbage generated by various food materials because they lack the relevant professional knowledge.
A first aspect of the application provides a garbage classification method, which comprises the following steps: acquiring food information corresponding to the food placed in each area; searching the garbage information corresponding to the food in a preset garbage database according to the food information, wherein the food information and the garbage information corresponding to the food are stored in the preset garbage database; and classifying and judging the garbage corresponding to the food based on the garbage information to obtain classification and judgment information of the garbage corresponding to the food.
Optionally, acquiring the food information corresponding to the food placed in each area includes: acquiring a plurality of food material images corresponding to the food placed in each area; detecting the definition of each food material image and selecting the food material images whose definition meets a preset standard; and performing data processing on the food material images meeting the preset standard to obtain the food information.
Optionally, searching the garbage information corresponding to the food in a preset garbage database according to the food information includes: determining the area to which each food material image belongs; and querying the garbage information in the preset garbage database corresponding to that area according to the food information.
Optionally, after the classification and judgment information is obtained, the method further includes: parsing the classification and judgment information to obtain the preservation environment information and preservation time information of the garbage corresponding to the food; and determining garbage storage information from the preservation environment information and the preservation time information, wherein the garbage storage information is used for characterizing a garbage storage that satisfies the preservation environment and preservation time required by the garbage.
Optionally, the method further includes: acquiring the initial weight of the garbage storage; judging, based on the initial weight, whether the weight of the garbage storage has changed; and when the weight of the garbage storage has changed, recording the changed weight of the garbage storage, and calculating the weight of the garbage in the garbage storage from the changed weight and the initial weight.
Optionally, after the weight of the garbage storage has changed, the method further includes: counting the time periods in which the weight of the garbage storage changes, and determining the time period in which the weight of the garbage storage changes the most.
Optionally, the method further includes: periodically and/or at fixed times generating an information report from the classification and judgment information of the garbage corresponding to the food, and/or the weight of the garbage in the garbage storage, and/or the time period in which the weight of the garbage storage changes the most, and sending the information report to a user side.
A second aspect of the application provides a garbage classification device, the device comprising: a food information acquisition module, used for acquiring food information corresponding to the food placed in each area; a searching module, used for searching the garbage information corresponding to the food in a preset garbage database according to the food information, wherein the food information and the garbage information corresponding to the food are stored in the preset garbage database; and a classification module, used for classifying and judging the garbage corresponding to the food based on the garbage information to obtain classification and judgment information of the garbage corresponding to the food.
A third aspect of the present application provides a terminal device, comprising a processor and a memory; the memory is used for storing computer instructions, and the processor is used for running the computer instructions stored in the memory so as to implement the garbage classification method.
A fourth aspect of the present application provides a storage medium storing one or more programs executable by one or more processors to implement the garbage classification method described above.
The application has the following beneficial effects: when food is to be used for cooking, the food information corresponding to the food is acquired, the garbage information corresponding to the food can then be searched in the preset garbage database, and the garbage to be generated by the food is classified and judged according to the garbage information to obtain the classification and judgment information of the garbage corresponding to the food. In this way, the user can be helped to judge in advance which type of garbage the food will generate.
Drawings
FIG. 1 is a diagram of an application environment for a garbage classification method in one embodiment;
FIG. 2 is a flow diagram of a garbage classification method according to one embodiment;
FIG. 3 is a schematic diagram of a garbage classification method according to an embodiment;
FIG. 4 is a block diagram of a garbage classification device according to an embodiment;
FIG. 5 is an internal structure diagram of a terminal device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The garbage classification method provided by the application can be applied to the application environment shown in FIG. 1, in which the terminal 102 communicates with the server 104 via a network. The terminal 102 acquires the food information corresponding to the food and transmits the food information to the server 104. The server 104 searches for the garbage information generated by the food in a preset garbage database according to the food information, wherein the preset garbage database stores the food and the garbage information generated by the food; the server 104 then classifies and judges the garbage generated by the food based on the garbage information to obtain the classification and judgment information of the garbage generated by the food.
The terminal 102 may be, but is not limited to, various personal computers, notebook computers, smartphones, tablet computers and portable wearable devices, and the server 104 may be implemented by a stand-alone server or a server cluster composed of a plurality of servers.
In one embodiment, as shown in FIG. 2 and FIG. 3, a garbage classification method is provided. The method is described here, by way of illustration, as being applied to the server 104 in FIG. 1, and includes the following steps:
Step S201: acquiring food information corresponding to the food placed in each area;
Step S202: searching the garbage information corresponding to the food in a preset garbage database according to the food information;
wherein the food information and the garbage information corresponding to the food are stored in the preset garbage database;
Step S203: classifying and judging the garbage corresponding to the food based on the garbage information to obtain classification and judgment information of the garbage corresponding to the food.
In steps S201 to S203, the food information corresponding to the food in the different areas first needs to be acquired, for example: ham, still in its packaging, in the refrigerator; cut or washed cucumber on the cooktop; and stir-fried ham on the dining table. Of course, the food that will produce garbage includes, but is not limited to, food raw materials that have not yet been unpacked, cut food materials and cooked dishes.
in addition, when the food is ham in the refrigerator, the corresponding garbage includes packaging bags and ham residues; when the food is detected to be the ham-fried cucumber on the dining table, the corresponding garbage comprises the leftovers of the ham-fried cucumber. Moreover, the region includes, but is not limited to: refrigerator, cooktop or dining table.
Therefore, in this embodiment, the food information corresponding to the food placed in each area is acquired. The acquisition mode is not limited; for example, food material images of the food may be captured by a camera. After the food information is obtained, the garbage information corresponding to the food can be searched in the preset garbage database, and the garbage corresponding to the food can then be classified and judged according to the garbage information to obtain the classification and judgment information of the garbage corresponding to the food. In this way, the user can be helped to judge the type of garbage corresponding to the food in advance, before the food has actually produced any garbage.
Of course, in this embodiment, after the classification and judgment information of the garbage corresponding to the food is obtained, the classification and judgment information may be sent to the user side.
For example: the refrigerator is provided with a camera with image recognition capability, and when food is stored the camera can intelligently detect and recognize the food and analyze what garbage the food will generate.
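To make the lookup concrete, the following is a minimal Python sketch of the S201 to S203 flow. The in-memory dictionary standing in for the preset garbage database, its entries and the category labels are illustrative assumptions; the application does not prescribe a database layout.

```python
# Minimal sketch of steps S201-S203. PRESET_GARBAGE_DB is a hypothetical
# stand-in for the preset garbage database described above.
PRESET_GARBAGE_DB = {
    "ham": {"garbage": ["packaging bag", "ham residue"], "category": "kitchen waste / recyclable"},
    "cucumber": {"garbage": ["peel", "stem"], "category": "kitchen waste"},
}

def classify_garbage(food_info: str) -> dict:
    # S202: search the garbage information recorded for this food.
    record = PRESET_GARBAGE_DB.get(food_info)
    if record is None:
        return {"food": food_info, "classification": "unknown"}
    # S203: derive the classification and judgment information.
    return {"food": food_info,
            "garbage": record["garbage"],
            "classification": record["category"]}

print(classify_garbage("ham"))
```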
In another embodiment, one implementation manner of the step S201 is:
step S2011: acquiring a plurality of food material images corresponding to the foods placed in each area;
step S2012: detecting the definition of each food material image, judging whether the definition of each food material image meets a preset standard, if so, executing the following step S2013, otherwise, executing the following step S2014;
step S2013: selecting food material images with definition meeting preset standards, and performing data processing on the food material images meeting the preset standards to obtain the food information;
step S2014: and deleting the food material images with the definition not meeting the preset standard.
In steps S2011 to S2014, in order to obtain the food information corresponding to the food, a plurality of food material images of the food are captured by a camera; the definition of each food material image is then detected, it is judged whether the definition of each food material image meets the preset standard, the food material images whose definition meets the preset standard are selected, and data processing is performed on the selected food material images to obtain the food information. Of course, the food material images whose definition does not meet the preset standard are deleted.
Therefore, in this embodiment, definition detection can be performed on the food material images corresponding to the food, and the food material images whose definition meets the preset standard can be screened out according to the definition, for example, the food material images whose definition is higher than a preset threshold. When data processing is performed on the screened food material images, food information with high accuracy can be identified.
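As a simple illustration of this screening step, the sketch below keeps only the frames whose score meets a threshold; the variance-based sharpness() proxy and the threshold value are placeholders, not the full metric defined later in this description.

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    # Placeholder proxy score (plain image variance); the fuller
    # gradient/mean/variance score is sketched after the Laplacian discussion.
    return float(np.var(img))

def screen_images(images: list, threshold: float = 100.0) -> list:
    # Keep the frames whose definition meets the preset standard (step S2013);
    # the remaining frames are simply discarded (step S2014).
    return [img for img in images if sharpness(img) >= threshold]
```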
For example: a camera in the refrigerator can intelligently detect and recognize food when it is stored, and analyze what garbage the food will generate. When the user takes food out of the refrigerator, the camera in the refrigerator can detect what garbage the food will generate, and a report on the type of garbage can be generated and sent to the user's APP. Some foods do not need to be placed in the refrigerator; in that case a camera can be placed in the space where food is stacked in the kitchen to perform image recognition and collect food images. A camera can also be arranged on the dining table to monitor leftovers, and another on the cooktop to monitor the remaining garbage, and so on.
In addition, on the basis of steps S2011 to S2014, one implementation of step S202 includes:
Step S2021: determining the area to which each food material image belongs;
Step S2022: querying the garbage information in the preset garbage database corresponding to that area according to the food information.
In this embodiment, the manner of determining the area to which each food material image belongs is not limited, as long as it meets the requirements of this embodiment. For example: the camera in each area may attach an area identifier when sending food material images, so that after the server 104 obtains the food material images, it can quickly determine the area to which each food material image belongs and the area in which the food shown in each image is located. After the area to which each food material image belongs has been determined, the garbage information is queried, according to the food information, in the preset garbage database corresponding to that area. For example: the garbage information in the preset garbage database corresponding to the refrigerator area includes, but is not limited to, spoiled food and expired food; the garbage information in the preset garbage database corresponding to the cooktop area includes, but is not limited to, packaging bags and fruit and vegetable peels; and the garbage information in the preset garbage database corresponding to the dining table area includes, but is not limited to, leftovers.
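A minimal sketch of this per-area query follows; the area identifiers and the contents of the per-area databases are assumptions made only for illustration.

```python
# Hypothetical per-area preset garbage databases (steps S2021-S2022).
REGION_GARBAGE_DBS = {
    "refrigerator": {"ham": ["packaging bag", "spoiled or expired food"]},
    "cooktop": {"cucumber": ["peel", "packaging bag"]},
    "dining_table": {"ham-fried cucumber": ["leftovers"]},
}

def query_garbage(area_id: str, food_info: str) -> list:
    db = REGION_GARBAGE_DBS.get(area_id, {})   # preset database for this area
    return db.get(food_info, [])

print(query_garbage("dining_table", "ham-fried cucumber"))
```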
In another embodiment, one implementation of the definition detection of each food material image in step S2012 includes:
step S20121: calculating gradient differences of gray features between adjacent pixels of each food material image, a mean value of each food material image on a gray scale image and a variance of each food material image on the gray scale image;
step S20122: and obtaining the definition corresponding to each food material image based on the calculation results of the gradient difference, the mean value and the variance.
Specifically, in steps S20121 to S20122, the mean value of the food material image on the grayscale image and the variance of the food material image on the grayscale image need to be calculated. In this embodiment, the specific way of calculating the mean and the variance is not limited, as long as it meets the requirements of this embodiment; for example, the variance and the mean can be calculated with MATLAB. In particular, the neighborhood contrast of the food material image, that is, the gradient difference of the grayscale features between adjacent pixels, can be considered. The gradient difference can, for example, be calculated with the following Laplacian-based measure: D(f) = Σ_y Σ_x |G(x, y)|, for G(x, y) > T, where G(x, y) is the convolution of the Laplacian operator with the image at pixel (x, y) and T is a threshold.
In this embodiment, after the gradient difference, the mean value and the variance have been calculated, corresponding weights are assigned to the gradient difference, the mean value and the variance respectively, so as to obtain the definition corresponding to the food material image.
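The sketch below shows one way the weighted definition score could be computed with NumPy, under the assumptions that the image is already a grayscale array, that the Laplacian is the plain 4-neighbour version, and that the weights and threshold T take the illustrative values shown; none of these values come from the application.

```python
import numpy as np

def laplacian_response(gray: np.ndarray) -> np.ndarray:
    """4-neighbour Laplacian of a grayscale image (border pixels ignored)."""
    g = gray.astype(np.float64)
    return (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
            - 4.0 * g[1:-1, 1:-1])

def definition_score(gray: np.ndarray, t: float = 10.0,
                     w_grad: float = 0.6, w_mean: float = 0.2,
                     w_var: float = 0.2) -> float:
    lap = laplacian_response(gray)
    # D(f) = sum over pixels of |G(x, y)| where G(x, y) > T
    grad_term = float(np.abs(lap[lap > t]).sum())
    mean_term = float(gray.mean())   # mean on the grayscale image
    var_term = float(gray.var())     # variance on the grayscale image
    return w_grad * grad_term + w_mean * mean_term + w_var * var_term
```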
In another embodiment, one implementation of performing data processing on the selected food material images in step S2013 to obtain the food information includes:
step S20131: fusing the food material images meeting the preset standard, and extracting the image characteristics of the fused food material images;
step S20132: searching the food information in a food feature library according to the extracted image features;
wherein the food information includes all extracted image features, and the food feature library includes the food information and all image features included in the food information.
Specifically, in steps S20131 to S20132, the selected food material images are fused, that is, a plurality of food material images are synthesized into a new image through a preset algorithm. By exploiting the spatio-temporal correlation and the information complementarity of the food material images, the fused food material image can describe the scene more comprehensively, clearly and accurately.
In this embodiment, the preset algorithm is not limited, as long as it meets the requirements of this embodiment. For example, the preset algorithm includes, but is not limited to, an image fusion algorithm. Image fusion has great application value in fields such as remote sensing, safe navigation, medical image analysis, environmental protection, traffic monitoring, clear image reconstruction, and disaster detection and forecasting, and especially in computer vision. Fusion of infrared and visible-light images, for instance, is increasingly mature and displays multiple kinds of information in one image so as to highlight the target.
After the fused food material image is obtained, image features are extracted from it; then, image features identical to the extracted image features are searched for in the food material feature library, so as to obtain the food information containing the extracted image features. For example: image features such as the colour of each part of the image subject, the shape of each part and the overall structure are obtained from the fused food material image and compared against the food material feature library; the food information that contains all of these image features is the query result, and the food information represented by the query result is the food information obtained in step S201.
In this embodiment, through steps S20131 to S20132, the obtained food information can be more accurate and comprehensive, which helps to improve the accuracy of garbage classification.
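The sketch below illustrates the idea under strong simplifications: "fusion" is a plain pixel-wise average of the selected frames, the "image feature" is a grayscale histogram, and the food feature library is a small in-memory dictionary; the application itself leaves the fusion algorithm and the feature type open.

```python
import numpy as np

def fuse(images: list) -> np.ndarray:
    # Naive fusion: pixel-wise average of the selected food material images.
    return np.mean(np.stack(images, axis=0), axis=0)

def histogram_feature(img: np.ndarray, bins: int = 16) -> np.ndarray:
    hist, _ = np.histogram(img, bins=bins, range=(0, 255))
    return hist / max(hist.sum(), 1)

# Hypothetical food feature library: food name -> reference feature.
FOOD_FEATURE_LIBRARY = {
    "cucumber": histogram_feature(np.full((64, 64), 90.0)),
    "ham": histogram_feature(np.full((64, 64), 180.0)),
}

def lookup_food(images: list) -> str:
    feature = histogram_feature(fuse(images))
    # Return the library entry whose feature is closest to the fused image.
    return min(FOOD_FEATURE_LIBRARY,
               key=lambda name: float(np.linalg.norm(feature - FOOD_FEATURE_LIBRARY[name])))

print(lookup_food([np.full((64, 64), 92.0), np.full((64, 64), 88.0)]))
```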
In another embodiment, the above classification and judgment information includes, but is not limited to: the preservation environment and preservation time of the garbage generated by the food.
therefore, after the above step S203, the garbage classification method further includes:
step S204: analyzing the classification discrimination information to obtain the corresponding garbage preservation environment information and preservation time information of the food;
step S205: and determining the garbage storage information through the storage environment information and the storage time information, wherein the garbage storage information is used for characterizing the garbage storage meeting the storage environment and the storage time required by garbage.
In steps S204 to S205, the classification and judgment information can be parsed to obtain the preservation environment information and preservation time information of the garbage corresponding to the food, and the garbage storage information is then determined from the preservation environment information and the preservation time information, where the garbage storage information is used for characterizing a garbage storage that satisfies the preservation environment and preservation time required by the garbage.
The garbage storage information is then sent to the user side, so as to prompt the user to put the garbage generated by the food into the corresponding garbage storage and to remove the garbage generated by the food from that garbage storage within the preservation time.
In this way, once the food material image of the food has been obtained, the user can be helped to put the garbage into the appropriate sorted garbage storage as soon as the food produces it, and to take the garbage out of the garbage storage and dispose of it no later than the end of the preservation time. The user is thus helped to know how to sort and place the garbage, and can arrange the garbage disposal time more freely and in advance.
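A small sketch of how steps S204 to S205 might map the parsed information onto a concrete storage is given below; the record fields, the storage list and the units are illustrative assumptions, since the application does not specify them.

```python
# Hypothetical list of available garbage storages and their conditions.
STORAGES = [
    {"id": "bin-kitchen-sealed", "environment": "sealed", "max_hours": 24},
    {"id": "bin-recyclable-dry", "environment": "dry", "max_hours": 168},
]

def determine_storage(classification: dict) -> dict:
    env = classification["preservation_environment"]    # e.g. "sealed"
    hours = classification["preservation_time_hours"]   # allowed dwell time
    # Pick a storage that satisfies both the preservation environment
    # and the preservation time required by the garbage (step S205).
    for s in STORAGES:
        if s["environment"] == env and s["max_hours"] >= hours:
            return {"storage": s["id"], "remove_within_hours": hours}
    return {"storage": None, "remove_within_hours": hours}

print(determine_storage({"preservation_environment": "sealed",
                         "preservation_time_hours": 12}))
```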
In another embodiment, the garbage classification method further comprises:
Step S206: acquiring the initial weight of the garbage storage;
Step S207: judging, based on the initial weight, whether the weight of the garbage storage has changed;
Step S208: when the weight of the garbage storage has changed, recording the changed weight of the garbage storage, and calculating the weight of the garbage in the garbage storage from the changed weight of the garbage storage and the initial weight.
In steps S206 to S208, the initial weight of the garbage storage can be acquired, and it is then judged, based on the initial weight, whether the weight of the garbage storage has changed. In this embodiment, one implementation is to weigh the garbage in each garbage storage, or to store the initial weight of each sorted storage in advance. The garbage storage is provided with a weighing device, so that the garbage in each garbage storage is weighed periodically or at fixed times, or even every time garbage is put into it, in order to acquire the weight information. In this way, it can be judged whether the weight of the garbage storage has changed; when it has changed, the initial weight is subtracted from the weighed weight to obtain the weight of the garbage in the garbage storage, which is then sent to the user side periodically or at fixed times.
Further, after it is determined that the weight of the garbage storage has changed, the garbage classification method further includes: counting, based on the initial weight, the time periods in which the weight of the garbage storage changes; and sending, for the time period in which the weight of the garbage storage changes the most, the classification and judgment information corresponding to the garbage stored in the garbage storage to the user side. In this way, the user can know in which time period the most garbage is generated and what kind of garbage it is.
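The bookkeeping in steps S206 to S208, together with the busiest-period statistic, could look like the sketch below; the reading format (hour of day, total weighed weight) is an assumption made for illustration.

```python
def garbage_statistics(initial_weight: float, readings: list):
    """readings: list of (hour_of_day, weighed_weight) pairs for one storage."""
    current = initial_weight
    changes = {}                                   # hour -> weight added in that hour
    for hour, weighed in readings:
        delta = weighed - current
        if abs(delta) > 1e-6:                      # the weight has changed
            changes[hour] = changes.get(hour, 0.0) + delta
            current = weighed
    garbage_weight = current - initial_weight      # weighed weight minus initial weight
    busiest_period = max(changes, key=changes.get) if changes else None
    return garbage_weight, busiest_period

print(garbage_statistics(2.0, [(8, 2.3), (12, 3.1), (19, 4.0)]))
```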
In another embodiment, the application further comprises the following step:
Step S209: periodically and/or at fixed times generating an information report from the classification and judgment information of the garbage corresponding to the food, and/or the weight of the garbage in the garbage storage, and/or the time period in which the weight of the garbage storage changes the most, and sending the information report to the user side.
Further, the information report is sent to the user side periodically and/or at fixed times, wherein the information report includes one or more of the following: the classification and judgment information of the garbage generated by the food, the weight of the garbage in the garbage storage, and the time period in which the weight of the garbage storage changes the most. In this way, the user can clearly quantify how much garbage is generated. For example: the weight of the garbage in the garbage storage is sent to the user side once a day, so that the user can clearly quantify how much garbage is generated every day. In addition, in this embodiment, the weights of the garbage in the garbage storages can be compared regularly and the user scored, which increases interaction, lets the user know how much garbage is generated in a day or a week, and indirectly reminds the user to care for the environment.
Of course, in this embodiment, the information report may also include some prompt information, where the prompt information includes, but is not limited to, one or more of the following: a note that the garbage in question causes more pollution or takes a long time to degrade, so as to call on the user to save resources and care for the environment.
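As an illustration only, a daily report could be assembled as in the sketch below; the field names, the JSON format and the prompt text are assumptions, and the delivery to the user's APP is left out because the application does not specify the channel.

```python
import json

def build_daily_report(judgment_info, garbage_weight_kg, busiest_period):
    # Bundle the contents listed in step S209 into one report payload.
    return json.dumps({
        "classification_and_judgment": judgment_info,
        "garbage_weight_kg": round(garbage_weight_kg, 2),
        "period_with_largest_change": busiest_period,
        "tip": "Sorting and reducing garbage saves resources and protects the environment.",
    }, ensure_ascii=False)

# send_to_user_app(build_daily_report(...))  # delivery mechanism not specified here
```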
It should be understood that, although the steps in the flowchart of FIG. 2 are shown in the sequence indicated by the arrows, these steps are not necessarily performed in that sequence. Unless explicitly stated herein, the order of execution of the steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in FIG. 2 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed in sequence but may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps.
In another embodiment, as shown in FIG. 4, a garbage classification device is provided, the device comprising: a food information acquisition module 110, used for acquiring the food information corresponding to the food placed in each area; a searching module 120, used for searching the garbage information corresponding to the food in a preset garbage database according to the food information, wherein the preset garbage database stores the food information and the garbage information corresponding to the food; and a classification module 130, used for classifying and judging the garbage corresponding to the food based on the garbage information to obtain the classification and judgment information of the garbage corresponding to the food.
Optionally, the food information obtaining module 110 includes: a food material image acquisition unit for acquiring a plurality of food material images corresponding to the food placed in each region; the detection unit is used for detecting the definition of each food material image and selecting food material images with the definition meeting the preset standard; and the processing unit is used for carrying out data processing on the food material images meeting the preset standard to obtain the food information.
Optionally, the searching module 120 is specifically configured to: determine the area to which each food material image belongs; and query the garbage information in the preset garbage database corresponding to that area according to the food information.
Optionally, the detection unit is specifically configured to: calculating gradient differences of gray features between adjacent pixels of each food material image, a mean value of each food material image on a gray scale image and a variance of each food material image on the gray scale image; and obtaining the definition corresponding to each food material image based on the calculation results of the gradient difference, the mean value and the variance.
Optionally, the processing unit is specifically configured to: fusing the food material images meeting the preset standard, and extracting the image characteristics of the fused food material images; and searching the food information in a food feature library according to the extracted image features, wherein the food information comprises all the extracted image features, and the food feature library comprises the food information and all the image features included in the food information.
Optionally, the device further includes: a garbage storage module, used for parsing the classification and judgment information, after the classification and judgment information of the garbage generated by the food is obtained, to obtain the preservation environment information and preservation time information of the garbage corresponding to the food; and for determining the garbage storage information from the preservation environment information and the preservation time information, wherein the garbage storage information is used for characterizing a garbage storage that satisfies the preservation environment and preservation time required by the garbage.
Optionally, the apparatus further includes: the initial weight module is used for acquiring the initial weight of the garbage storage; the weight judging module is used for judging whether the weight of the garbage storage is changed or not based on the initial weight; when the weight of the garbage storage is changed, counting the changed weight of the garbage storage, and calculating the weight of the garbage in the garbage storage according to the changed weight of the garbage storage and the initial weight.
Optionally, the device further includes: a statistics module, used for counting, after it is determined that the weight of the garbage storage has changed, the time periods in which the weight of the garbage storage changes, and for determining the time period in which the weight of the garbage storage changes the most.
Optionally, the device further includes: a report module, used for periodically and/or at fixed times generating an information report from the classification and judgment information of the garbage corresponding to the food, and/or the weight of the garbage in the garbage storage, and/or the time period in which the weight of the garbage storage changes the most, and for sending the information report to the user side.
For the specific limitations of the garbage classification device, reference may be made to the limitations of the garbage classification method above, which are not repeated here. Each module in the above device may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware in, or independent of, a processor in the terminal device, or may be stored as software in a memory in the terminal device, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a terminal device is provided, which may be a server, and whose internal structure may be as shown in FIG. 5. The terminal device includes a processor, a memory, a network interface and a database connected by a system bus. The processor of the terminal device is used to provide computing and control capabilities. The memory of the terminal device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the terminal device is used for storing related data. The network interface of the terminal device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements the garbage classification method described above.
It will be appreciated by those skilled in the art that the structure shown in FIG. 5 is merely a block diagram of part of the structure relevant to the solution of the present application and does not limit the terminal device to which the solution of the present application is applied; a particular terminal device may include more or fewer components than shown, may combine certain components, or may have a different arrangement of components.
In one embodiment, a terminal device is provided, including a memory and a processor, where the memory stores a computer program, and the processor implements a garbage classification method as described above when executing the computer program.
For the terms and implementation principles of the terminal device in this embodiment, reference may be made to the garbage classification method in the embodiments of the present application, which is not described again here.
In one embodiment, a storage medium is provided having a computer program stored thereon, which when executed by a processor implements a garbage classification method as described above.
Those skilled in the art will appreciate that implementing all or part of the above methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium which, when executed, may include the procedures of the embodiments of the methods described above. Any reference to memory, storage, a database or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered to be within the scope of this description.
The above examples illustrate only several embodiments of the application; the description thereof is relatively detailed, but it should not therefore be construed as limiting the scope of the application. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the spirit of the application, all of which fall within the scope of protection of the application. Accordingly, the scope of protection of the present application shall be determined by the appended claims.

Claims (7)

1. A garbage classification method, the method comprising:
acquiring a plurality of food material images corresponding to the food placed in each area, and performing definition detection on each food material image, including: calculating the gradient difference of grayscale features between adjacent pixels of each food material image, the mean value of each food material image on the grayscale image, and the variance of each food material image on the grayscale image; and configuring corresponding weights respectively based on the calculation results of the gradient difference, the mean value and the variance to obtain the definition corresponding to each food material image;
selecting the food material images whose definition meets a preset standard, and performing data processing on the food material images meeting the preset standard to obtain food information, including: fusing the food material images meeting the preset standard, extracting the image features of the fused food material image, and searching the food information in a food feature library according to the extracted image features;
deleting the food material images whose definition does not meet the preset standard;
searching the garbage information corresponding to the food in a preset garbage database according to the food information, including: determining the area to which each food material image belongs, and querying the garbage information in the preset garbage database corresponding to that area according to the food information;
wherein the food information and the garbage information corresponding to the food are stored in the preset garbage database;
classifying and judging the garbage corresponding to the food based on the garbage information to obtain classification and judgment information of the garbage corresponding to the food;
parsing the classification and judgment information to obtain the preservation environment information and preservation time information of the garbage corresponding to the food; and
determining garbage storage information from the preservation environment information and the preservation time information, wherein the garbage storage information is used for characterizing a garbage storage that satisfies the preservation environment and preservation time required by the garbage.
2. The method according to claim 1, wherein the method further comprises:
acquiring the initial weight of the garbage storage;
judging whether the weight of the garbage storage is changed or not based on the initial weight;
when the weight of the garbage storage is changed, counting the changed weight of the garbage storage, and calculating the weight of the garbage in the garbage storage according to the changed weight of the garbage storage and the initial weight.
3. The method of claim 2, wherein after the weight of the garbage storage has changed, the method further comprises:
counting the time periods in which the weight of the garbage storage changes, and determining the time period in which the weight of the garbage storage changes the most.
4. The method according to claim 3, characterized in that the method further comprises:
periodically and/or at fixed times generating an information report from the classification and judgment information of the garbage corresponding to the food, and/or the weight of the garbage in the garbage storage, and/or the time period in which the weight of the garbage storage changes the most, and sending the information report to a user side.
5. A garbage classification device, the device comprising:
a food information acquisition module, used for acquiring a plurality of food material images corresponding to the food placed in each area; performing definition detection on each food material image, including: calculating the gradient difference of grayscale features between adjacent pixels of each food material image, the mean value of each food material image on the grayscale image, and the variance of each food material image on the grayscale image; and configuring corresponding weights respectively based on the calculation results of the gradient difference, the mean value and the variance to obtain the definition corresponding to each food material image;
selecting the food material images whose definition meets a preset standard, and performing data processing on the food material images meeting the preset standard to obtain food information, including: fusing the food material images meeting the preset standard, extracting the image features of the fused food material image, and searching the food information in a food feature library according to the extracted image features;
and deleting the food material images whose definition does not meet the preset standard;
a searching module, used for searching the garbage information corresponding to the food in a preset garbage database according to the food information, including: determining the area to which each food material image belongs, and querying the garbage information in the preset garbage database corresponding to that area according to the food information;
wherein the food information and the garbage information corresponding to the food are stored in the preset garbage database;
a classification module, used for classifying and judging the garbage corresponding to the food based on the garbage information to obtain classification and judgment information of the garbage corresponding to the food; and
a garbage storage module, used for parsing the classification and judgment information to obtain the preservation environment information and preservation time information of the garbage corresponding to the food;
and for determining garbage storage information from the preservation environment information and the preservation time information, wherein the garbage storage information is used for characterizing a garbage storage that satisfies the preservation environment and preservation time required by the garbage.
6. A terminal device comprising a processor and a memory;
the memory is configured to store computer instructions and the processor is configured to execute the computer instructions stored by the memory to implement the garbage classification method of any of claims 1 to 4.
7. A storage medium storing one or more programs executable by one or more processors to implement the garbage classification method of any one of claims 1-4.
CN202010010769.7A 2020-01-06 2020-01-06 Garbage classification method, garbage classification device, terminal equipment and storage medium Active CN111241967B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010010769.7A CN111241967B (en) 2020-01-06 2020-01-06 Garbage classification method, garbage classification device, terminal equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010010769.7A CN111241967B (en) 2020-01-06 2020-01-06 Garbage classification method, garbage classification device, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111241967A CN111241967A (en) 2020-06-05
CN111241967B (en) 2023-08-15

Family

ID=70870811

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010010769.7A Active CN111241967B (en) 2020-01-06 2020-01-06 Garbage classification method, garbage classification device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111241967B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006035714A1 (en) * 2004-09-28 2006-04-06 Matsushita Electric Industrial Co., Ltd. Food management method and system
CN107054937A (en) * 2017-03-23 2017-08-18 广东数相智能科技有限公司 A kind of refuse classification suggestion device and system based on image recognition
CN107697501A (en) * 2017-09-25 2018-02-16 深圳传音通讯有限公司 Rubbish recovering method and system
CN108182455A (en) * 2018-01-18 2018-06-19 齐鲁工业大学 A kind of method, apparatus and intelligent garbage bin of the classification of rubbish image intelligent
CN110406839A (en) * 2019-08-30 2019-11-05 湘潭大学 Method, apparatus, intelligent garbage bin and the storage medium of Waste sorting recycle
CN110569874A (en) * 2019-08-05 2019-12-13 深圳大学 Garbage classification method and device, intelligent terminal and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006020223A2 (en) * 2004-07-19 2006-02-23 Lean Path, Inc. Systems and methods for food waste monitoring


Also Published As

Publication number Publication date
CN111241967A (en) 2020-06-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant