WO2022026818A1 - Waste receptacle with sensing and interactive presentation system - Google Patents
- Publication number
- WO2022026818A1 (PCT/US2021/043881)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- item
- waste receptacle
- waste
- media content
- items
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F1/00—Refuse receptacles; Accessories therefor
- B65F1/0033—Refuse receptacles; Accessories therefor specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
- B65F1/0053—Combination of several receptacles
- B65F1/006—Rigid receptacles stored in an enclosure or forming part of it
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/30—Administration of product recycling or disposal
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F1/00—Refuse receptacles; Accessories therefor
- B65F1/14—Other constructional features; Accessories
- B65F1/16—Lids or covers
- B65F1/1623—Lids or covers with means for assisting the opening or closing thereof, e.g. springs
- B65F1/1638—Electromechanically operated lids
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F1/00—Refuse receptacles; Accessories therefor
- B65F1/0033—Refuse receptacles; Accessories therefor specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
- B65F2001/008—Means for automatically selecting the receptacle in which refuse should be placed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/128—Data transmitting means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/138—Identification means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/139—Illuminating means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/144—Level detecting means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/152—Material detecting means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/152—Material detecting means
- B65F2210/1525—Material detecting means for metal
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/16—Music playing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/168—Sensing means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/176—Sorting means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/184—Weighing means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65F—GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
- B65F2210/00—Equipment of refuse receptacles
- B65F2210/20—Temperature sensing means
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02W—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
- Y02W90/00—Enabling technologies or technologies with a potential or indirect contribution to greenhouse gas [GHG] emissions mitigation
Definitions
- the present disclosure relates to receptacles. More specifically, the present disclosure relates to implementing systems and methods for identifying types of items placed in a receptacle and causing a changeable medium to react based on the identified types.
- the waste may be too contaminated to be processed as anything other than landfill waste.
- an empty plastic beverage bottle may generally be considered to be recyclable, but a plastic beverage bottle filled with a certain amount of liquid may not be acceptable as a recyclable item at the downstream recycling plant.
- corrugated cardboard may generally be considered to be recyclable, but cardboard materials contaminated with food, grease, etc. may no longer be acceptable as a recyclable item.
- many users still place these contaminated items in a corresponding recycling bin, resulting in the need to sort and dispose of the contaminated item at the downstream recycling plant, which adds significant time and expense to the recycling process.
- contaminated recyclable products that are placed in recycling bins may lead to the contamination of commingled, otherwise-recyclable items, further reducing the efficiency of reclaiming reusable materials.
- the present document relates to implementing systems and methods for operating a system.
- the methods comprise: generating sensor data by at least one sensor of a waste receptacle that specifies at least one characteristic of at least one item; and performing operations by the waste receptacle to identify first media content that is associated with the at least one characteristic of the at least one item.
- the waste receptacle or an external device is caused to output the first media content.
- second media content is selected.
- the second media content is different than the first media content.
- the waste receptacle or the external device is then caused to output the second media content.
- the external device may comprise (i) a mobile communication device of an individual who has possession of the at least one item or (ii) a presentation device located near the waste receptacle.
- the characteristic(s) of the item include, but are not limited to, a type of item, a type of waste that the at least one item comprises, a physical appearance of the at least one item, a condition of the at least one item, a source of the at least one item, a tradename for the at least one item, a product identifier for the at least one item, label information for the at least one item, contents of the at least one item, a group to which the at least one item belongs, a ranking or importance of the at least one item relative to other items, and/or a ranking or importance of the source of the at least one item relative to a source of at least one other item.
- the methods may also comprise performing operations by the waste receptacle to detect when a first item and a second item are disposed therein within a period starting from when the first item was detected in proximity to the waste receptacle.
- the sensor data specifies characteristics of the first item and the second item.
- the characteristics of the first and second items are used to obtain information comprising at least one of an identifier for a group to which the first and second items belong, an identification of a common source for the first and second items, relative importance for the first and second items, and a weighted value.
- the second media content may be selected randomly.
- the second media content is selected based on a time of day, a geographic location of the waste receptacle, a total number of individuals detected in proximity to the waste receptacle at a given time, physical characteristics of individuals proximate to the waste receptacle, types of electronic devices in possession of the individuals proximate to the waste receptacle, a given degree of the at least one item’s popularity amongst a population, or an average number of items processed by the waste receptacle in a given period.
- the sensor data is analyzed to obtain individual-based information that is used to select the second media content.
- the individual-based information can include, but is not limited to, an identifier for an individual proximate to the waste receptacle, information indicating a popularity of an item type amongst persons having at least one trait in common with the individual proximate to the waste receptacle, and/or information indicating a total or average number of items of the same or similar type that has historically been processed by the waste receptacle during a given period.
- Implementing systems of the above-described methods can include, but are not limited to, a processor and a non-transitory computer-readable storage medium comprising programming instructions that are configured to cause the processor to implement a method for operating a system.
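- For illustration only, the media-selection flow summarized above might be sketched in Python as follows; the content tables, context fields, and function names are hypothetical placeholders assumed for this sketch rather than part of the disclosure.

```python
import random
import time
from typing import Optional

# Hypothetical lookup table mapping a sensed item characteristic (e.g., item type)
# to the first media content to present.
FIRST_CONTENT = {
    "plastic_bottle": "thank_you_recycling.mp4",
    "coffee_cup": "compost_reminder.mp4",
}

# Hypothetical pool of second media content, different from the first.
SECOND_CONTENT_POOL = [
    "brand_promotion.mp4",
    "recycling_stats.mp4",
    "local_program_info.mp4",
]

def select_first_content(characteristics: dict) -> Optional[str]:
    """Identify first media content associated with the sensed characteristics."""
    return FIRST_CONTENT.get(characteristics.get("item_type"))

def select_second_content(characteristics: dict, context: dict) -> str:
    """Select second media content, e.g., randomly or based on time of day or the
    number of individuals detected in proximity (per the disclosure)."""
    hour = time.localtime().tm_hour
    if context.get("nearby_individuals", 0) > 5:
        return "recycling_stats.mp4"           # busy location: group-oriented content
    if 6 <= hour < 11 and characteristics.get("item_type") == "coffee_cup":
        return "brand_promotion.mp4"           # morning coffee traffic
    return random.choice(SECOND_CONTENT_POOL)  # otherwise select randomly

if __name__ == "__main__":
    sensed = {"item_type": "plastic_bottle", "condition": "empty"}
    first = select_first_content(sensed)
    second = select_second_content(sensed, {"nearby_individuals": 2})
    print("output first:", first, "then second:", second)
```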
- FIG. 1 provides an illustration showing an actuated waste container.
- FIG. 2 provides an illustration showing another actuated waste container.
- FIG. 3 provides an illustration showing yet another actuated waste container.
- FIG. 4 provides a cross-sectional view of the waste container shown in FIG. 1.
- FIG. 5 provides an illustration showing an actuated sorting compartment.
- FIG. 6 provides an illustration showing another sorting compartment.
- FIGS. 7A-7C (collectively referred to as “FIG. 7”) provide illustrations showing yet another sorting compartment.
- FIG. 8 provides a flow diagram of an illustrative method for processing a piece of waste within a waste container.
- FIG. 9 provides a flow diagram of an illustrative method for determining a type of waste placed within a waste container.
- FIG. 10 provides a flow diagram of an illustrative method for determining a type of waste placed within a waste container.
- FIG. 11 provides an illustration of an architecture for a circuit of a waste container.
- FIG. 12 provides an illustration of a waste disposal system.
- FIGS. 13A-13B (collectively referred to as “FIG. 13”) provide a flow diagram of an illustrative method for controlling a system.
- FIG. 14 provides a flow diagram of an illustrative method for controlling a system.
- FIG. 15 provides an illustration of an illustrative computing device.
- the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to.”
- the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements.
- the coupling or connection between the elements can be physical, logical, or a combination thereof.
- two devices may be coupled directly, or via one or more intermediary channels or devices.
- devices may be coupled in such a way that information can be passed therebetween, while not sharing any physical connection with one another.
- Each device may have its own processor and/or memory, or the processor and/or memory may be shared with other devices as in a virtual machine or container arrangement.
- the memory will contain or receive programming instructions that, when executed by the processor, cause the electronic device to perform one or more operations according to the programming instructions.
- Examples of electronic devices include personal computers, servers, mainframes, virtual machines, containers, gaming systems, televisions, digital home assistants and mobile electronic devices such as smartphones, fitness tracking devices, wearable virtual reality devices, Internet-connected wearables such as smart watches and smart eyewear, personal digital assistants, cameras, tablet computers, laptop computers, media players and the like.
- Electronic devices also may include appliances and other devices that can communicate in an Internet-of-Things arrangement, such as smart thermostats, refrigerators, connected light bulbs and other devices.
- Electronic devices also may include components of vehicles such as dashboard entertainment and navigation systems, as well as on-board vehicle diagnostic and operation systems.
- the client device and the server are electronic devices, in which the server contains instructions and/or data that the client device accesses via one or more communications links in one or more communications networks.
- a server may be an electronic device, and each virtual machine or container also may be considered an electronic device.
- a client device, server device, virtual machine or container may be referred to simply as a “device” for brevity. Additional elements that may be included in electronic devices are discussed above in the context of FIG. 1.
- the terms “processor” and “processing device” refer to a hardware component of an electronic device that is configured to execute programming instructions. Except where specifically stated otherwise, the singular term “processor” or “processing device” is intended to include both single-processing device embodiments and embodiments in which multiple processing devices together or collectively perform a process.
- the terms “memory,” “memory device,” “data store,” “data storage facility” and the like each refer to a non-transitory device on which computer-readable data, programming instructions or both are stored. Except where specifically stated otherwise, these terms are intended to include single device embodiments, embodiments in which multiple memory devices together or collectively store a set of data or instructions, as well as individual sectors within such devices.
- the terms “communication link” and “communication path” mean a wired or wireless path via which a first device sends communication signals to and/or receives communication signals from one or more other devices.
- Devices are “communicatively connected” if the devices are able to send and/or receive data via a communication link.
- Electrical communication refers to the transmission of data via one or more signals between two or more electronic devices, whether through a wired or wireless network, and whether directly or indirectly via one or more intermediary devices.
- the term “imaging device” refers generally to a hardware sensor that is configured to acquire digital images.
- An imaging device may capture still and/or video images, and optionally may be used for other imagery-related applications.
- an imaging device can be a device held by a user, such as a DSLR (digital single lens reflex) camera, cell phone camera, or video camera.
- the imaging device may be part of an image capturing system that includes other hardware components.
- an imaging device can be mounted on an accessory such as a monopod or tripod.
- the imaging device can also be mounted on a transporting vehicle such as an aerial drone, a robotic vehicle, or on a piloted aircraft such as a plane or helicopter having a transceiver that can send captured digital images to, and receive commands from, other components of the system.
- the term “connection,” when referring to two physical structures, means that the two physical structures touch each other.
- Devices that are connected may be secured to each other, or they may simply touch each other and not be secured.
- the term “electrically connected”, when referring to two electrical components, means that a conductive path exists between the two components.
- the path may be a direct path, or an indirect path through one or more intermediary components.
- a first component may be an “upper” component and a second component may be a “lower” component when a device of which the components are a part is oriented in a first direction.
- the relative orientations of the components may be reversed, or the components may be on the same plane, if the orientation of the structure that contains the components is changed.
- the claims are intended to include all orientations of a device containing such components.
- the term “waste” or “trash” refers to a discarded item or object that is placed into a waste container (such as a trash bin), a recycling container (such as a bin into which objects to be recycled are placed), or a refurbishment/reuse container (such as a container to receive spent ink cartridges or other items that can be cleaned, refilled, or otherwise refurbished and reused).
- the technology presented here includes systems and methods to: (i) automatically determine the type of waste deposited into a waste container and dispense the waste into an appropriate waste bin (e.g., a recyclable waste bin, a non-recyclable waste bin (e.g., a landfill or compost bin), and/or other waste bins (e.g., a recyclable paper bin or a recyclable plastics bin)) without the need for the user to decipher complicated pictures or read lengthy instructions before throwing a piece of waste away; and (ii) provide users of the waste container with feedback to, for example, dynamically encourage use of given products.
- Referring now to FIG. 1, there is provided an illustration of an actuated waste container.
- the waste container 100 receives a piece of waste through an opening 102.
- the waste container can alternatively be provided with two or more openings 302, 304 as shown in FIG. 3.
- the openings 302, 304 can be disposed adjacent to each other as shown in FIG. 3 or in another arrangement (e.g., one behind the other, one diagonal with respect to the other, in a vertically stacked arrangement, etc.).
- the opening 102 can optionally be provided with a cover 104.
- Cover 104 can include, but is not limited to, a flap, a hinged lid, and/or a sliding door.
- the cover 104 is configured to toggle or otherwise transition between an open position shown in FIG. 1 and a closed position shown in FIG. 2. Further, the cover 104 can partially close the opening 102.
- the cover can be opened/closed manually by an individual, responsive to a voice command by an individual via a microphone (e.g., microphone 202 of FIG. 2) and/or automatically based on sensor data generated by sensors (e.g., a proximity sensor) of the waste container (e.g., a detection of an individual or mobile communication device in proximity thereto).
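- As a non-limiting illustration of the cover-control logic described above, the following Python sketch decides whether the cover should be open based on manual, voice, and proximity inputs; the data fields and behavior choices are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CoverInputs:
    proximity_detected: bool      # e.g., a proximity sensor detects a person or mobile device nearby
    voice_command: Optional[str]  # e.g., "open" / "close" from a microphone + speech recognition
    manual_override: bool         # manual actuation by an individual

def cover_should_open(inputs: CoverInputs, currently_open: bool) -> bool:
    """Decide whether cover 104 should be open, per the triggers described above."""
    if inputs.manual_override:
        return not currently_open        # manual actuation toggles the cover
    if inputs.voice_command == "open":
        return True
    if inputs.voice_command == "close":
        return False
    # Automatic mode: open while an individual or mobile device is detected nearby.
    return inputs.proximity_detected

if __name__ == "__main__":
    state = cover_should_open(CoverInputs(True, None, False), currently_open=False)
    print("cover open:", state)  # True: proximity detected
```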
- the waste container 100 is not limited to the single free-standing configurations shown in FIGS. 1-3.
- a plurality of waste containers can be provided proximate to each other in a room, wall or cabinet.
- These waste containers can comprise separate waste bins with their own openings and/or garbage chutes in a building. In this case, it may be considered a waste receiving system rather than a single waste container.
- the waste container 100 can include a circuit 106 (e.g., a processor, a datastore, a transceiver and/or sensor(s)).
- the circuit 106 is configured to facilitate the communication of data to/from the waste container.
- the data can be received from and/or sent to external computing device(s) of, for example, remote data storage facilities via a communication network.
- the external computing device(s) can include, but is(are) not limited to, a server, a desktop computer, a laptop computer, a mobile device, a personal digital assistant, and/or a smart phone.
- the communication network can include, but is not limited to, a data network, a wireless network, a telephony network or any combination thereof.
- the data network may be any Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), a public data network (e.g., the Internet), short range wireless network (such as a Wi-Fi network), a packet-switched network (e.g., a proprietary cable or fiber-optic network and the like) or any combination thereof.
- the wireless network may be, for example, a cellular network and may employ various technologies including enhanced data rates for global evolution (EDGE), General Packet Radio Service (GPRS), Global System for Mobile (GSM) communications, Internet protocol Multimedia Subsystem (IMS), Universal Mobile Telecommunications System (UMTS), etc., as well as any other suitable wireless medium (e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), wireless fidelity (Wi-Fi), Wireless LAN (WLAN), Bluetooth®, Internet Protocol (IP) data casting, satellite, Mobile Ad hoc Network (MANET)), or any combination thereof.
- the data can include, but is not limited to, sensor data generated by one or more sensor(s) of circuit 106.
- the sensor(s) can include, but are not limited to, inductive sensor(s), capacitive sensor(s), photoelectric sensor(s), load sensor(s), camera(s), temperature sensor(s), infrared sensor(s), near-infrared sensor(s), spectral imaging sensor(s) (such as spectrometer(s)), audio sensor(s), fluorescence sensor(s), millimeter-wave radar sensor(s), electronic scale(s), and/or odor sensor(s).
- the sensor data can include, for example, image(s) captured by camera(s), scent(s) captured by odor detection sensor(s), weight(s) determined by electronic scale(s), temperature(s) determined by infrared sensor(s) or temperature sensor(s), and/or an amount of fluid or other substance contained in the item.
- the circuit 106 of the waste container 100 and/or the external computing device(s) can: process the sensor data to detect when an item is being disposed in the waste container 100, identify a manufacturer of the item, identify a trademark associated with the item, classify the type of item based on the sensor data, and/or identify a condition of the item (e.g., clean or dirty); and/or provide feedback to the individual who disposed of the item based on the results of the sensor data processing.
- the feedback can include auditory feedback, visual feedback and/or tactile feedback (e.g., via a mobile communication device in possession of the individual).
- the waste container 100 can include output devices such as a display screen 200, a speaker 204 and/or light system 206 (e.g., LEDs or laser lighting system) shown in FIG. 2.
- Waste container 100 includes a frame 410 with sorting compartment(s) 402 provided therein.
- the frame 410 can be formed of any kind of structural material (e.g., plastic rods, aluminum bars and/or other metals).
- the frame 410 structurally supports the sorting compartment(s) 402 above one or more waste bins 406, 408.
- Each waste bin 406, 408 can be designated for use in collecting items of a given type (e.g., recyclables, landfill garbage, metal, compost, or hazardous waste).
- the waste bins 406, 408 can be provided for collecting items of the same or different type.
- the sorting compartment 402 includes electronic circuitry and mechanical components to sort items and/or controllably direct the items to a given waste bin 406, 408 via a bottom valve or doorway 412 thereof.
- the size of the waste container, the number of waste bins and/or the size(s) of the waste bin(s) may be customizable based on local waste disposal regulations. For example, a municipality allowing mixed recycling (i.e., the mixing of most or all recyclable materials in one container or bin) may only necessitate two waste bins (e.g., one for recyclables and one for landfill waste). On the other hand, a municipality requiring sorted recycling (e.g., individual containers or bins for most or all recyclables) may necessitate more than two waste bins.
- the sensor(s) of the circuit 106 can be disposed in or proximate to the sorting compartment 402.
- the sensor(s) are configured to generate sensor data that the waste container 100 and/or a remote computing device can use to identify, sort and direct pieces of waste when it is received in the sorting compartment 402.
- the sensor(s) can be disposed within, underneath, and/or above the sorting compartment 402 at location(s) that allow the contents within the sorting compartment 402 to be detected and/or identified.
- FIG. 5 provides an illustration that is useful for understanding operations of the sorting compartment 402.
- Sorting compartment 402 is generally configured to receive items 500 and selectively move the items 500 from opening 102 of the waste container 100 into a waste bin 406 or 408 based on the type of waste associated with the items.
- the sorting compartment 402 comprises a first set of walls 508 that are oriented in a first direction and a second set of walls 510 that are oriented in a second direction perpendicular to the first direction.
- One or more of the walls 510 may be actuated to move the items 500 to a given waste bin 406 or 408.
- if the item is a recyclable item, wall(s) 510 may be actuated to move the same to the left such that it falls into waste bin 406, which is provided to collect recyclable items. If the item is not a recyclable item, wall(s) 510 may be actuated to move the same to the right such that it falls into waste bin 408, which is provided to collect landfill items. Wall(s) 508 can also be actuated to move the item in a third direction towards a front of the waste container and/or in a fourth direction towards a back of the waste container.
- Movement of walls 508 can facilitate the disposition of items into additional waste bins (not visible in FIG. 5).
- the present solution is not limited to the particulars of this example. Any number of walls can be configured to move to the left, right, forwards, backwards and/or diagonally. Movement of the walls can be achieved using a servomotor or a direct drive system.
- FIG. 6 shows another illustrative sorting compartment 600.
- the sorting compartment 600 comprises an opening 400 through which waste can enter the compartment, actuatable exit door(s) 604, 608 through which waste can exit the sorting compartment, and sensors for sensing the types of waste that enter the compartment.
- the exit doors 604, 608 can be opened/closed via motors 610, 612 and hinges 614, 616. Sensors can be disposed within, underneath, and/or above the one or more actuated sorting openings 602 to sense the contents entering and/or within the sorting compartment 600.
- the sensors can include, but are not limited to, inductive sensor(s), capacitive sensor(s), photoelectric sensor(s), load sensor(s), camera(s), temperature sensor(s), odor sensor(s), infrared sensor(s), near-infrared sensor(s), spectral imaging device(s), audio sensor(s), fluorescence sensor(s), and/or millimeter-wave radar sensor(s).
- the exit door(s) 604, 608 may contain additional sensors such as load sensor(s) and metal detection sensor(s) that are housed underneath a layer of waste-resistant material (e.g., a waterproof lining).
- the sorting compartment 600 is configured to move relative to the waste bins.
- a pivot member 618 is provided to facilitate rotatable movement of the sorting compartment.
- Pivot member 618 can include a post that is rotated or otherwise actuated by a motor (e.g., a servomotor or a direct drive system).
- the motor/post can cause the sorting compartment 600 to rotate about a horizontal axis 622 and/or rotate about a vertical axis 620.
- Other mechanical means can be provided in addition to or alternative to the pivot member such that the sorting compartment can move linearly relative to the waste bin(s).
- a circuit of the waste container processes sensor data to (i) determine that an item has been placed inside the sorting compartment 600 and (ii) determine that the item should be disposed of in a landfill.
- the waste bin for collecting landfill items is located to the right of the sorting compartment 600.
- the sorting compartment 600 rotates counterclockwise around the pivot member 618 to position the exit doors 604, 608 above the waste bin.
- the motors 610, 612 actuate the hinges 614, 616 to cause the exit doors 604, 608 to transition from their closed position to their open position, whereby the item is released or otherwise falls into the waste bin.
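- For illustration only, the rotate-and-release sequence described above might look like the following Python sketch; the motor interface, bin positions, and timing are stand-ins assumed for this sketch, not an actual motor-controller API.

```python
import time

class MotorStub:
    """Stand-in for a servomotor or direct-drive driver; a real system would
    issue commands over a motor controller, not print them."""
    def __init__(self, name: str):
        self.name = name
    def move_to(self, position: str) -> None:
        print(f"{self.name} -> {position}")

pivot_motor = MotorStub("pivot 618")
door_motors = [MotorStub("motor 610"), MotorStub("motor 612")]

# Hypothetical mapping from a waste bin to the pivot position that places the
# exit doors 604, 608 above that bin.
BIN_POSITIONS = {"recyclable": "rotate_left", "landfill": "rotate_right"}

def dispose(item_bin: str, settle_time_s: float = 1.0) -> None:
    """Rotate the sorting compartment over the selected bin, open the exit
    doors to release the item, then return to the rest position."""
    pivot_motor.move_to(BIN_POSITIONS[item_bin])
    for m in door_motors:
        m.move_to("open")
    time.sleep(settle_time_s)      # allow the item to fall into the bin
    for m in door_motors:
        m.move_to("closed")
    pivot_motor.move_to("rest")

if __name__ == "__main__":
    dispose("landfill")
```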
- FIG. 7 there is provided illustrations of another sorting compartment 700.
- the sorting compartment 700 is disposed on top of a liquid collection compartment 750.
- the liquid collection compartment 750 is stationary, while the sorting compartment 700 is movable (e.g., rotatable about pivot member 706).
- the sorting compartment 700 does not have a bottom wall such that an item deposited therein will come to rest on a top surface 716 of the liquid collection compartment 750.
- a load sensor 704 can be attached to the liquid collection compartment 750 to measure the weight of the item as it rests on surface 716.
- Surface 716 may be concave and have a drainage hole 722 formed therein.
- the drainage hole 722 may be located at the lowest point of the concave surface 716 so that any contents (e.g., liquid, powder, granulate or other substance) drains from the item into the liquid collection compartment 750 at least partially due to gravity.
- the bottom edges 708, 718 of the walls 710, 714 respectively follow the curvature of the edges 712, 720 associated with the concave surface 716.
- the sorting compartment 700 can rotate around the pivot 706 without rubbing against the edges 712, 720 associated with concave surface 716.
- Method 800 begins with 802 and continues with 804 where a circuit (e.g., circuit 106 of FIG. 1) performs operations to detect when an item (e.g., item 500 of FIG. 5) has been received in the sorting compartment (e.g., sorting compartment 402 of FIG. 4, 600 of FIG. 6 or 700 of FIG. 7) of the waste container. This detection can be made using sensor data generated by one or more sensors provided with the waste container.
- the sensor data can include, but is not limited to, image(s), video(s), beam break sensor data, motion sensor data, light detection sensor data, odor sensor data, and/or cover position sensor data.
- the circuit optionally performs operations to cause actuation of a safety mechanism of the waste container to secure the cover (e.g., cover 104 of FIG. 1) in a closed position for preventing an individual from interfering with or experiencing injury from the item’s processing by the waste container.
- the safety mechanism can include, but is not limited to, a latch and/or a lock.
- the circuit of the waste container performs operations to (i) determine a type of waste for the detected item based on sensor data and (ii) determine the type of material that the item comprises.
- the type of waste may include a recyclable type of waste or a non-recyclable type of waste.
- Recyclable waste includes any item that can be recycled (e.g., a metal can, a plastic object, a paper cup, a cardboard sheet, a piece of cloth, and/or a glass container).
- Non-recyclable waste may be subcategorized as a landfill type of waste, a compost type of waste, or other type of waste. If the type of waste is determined to be a recyclable type of waste, then a material category is determined for the item.
- the material category can include, but is not limited to, a plastic category, a paper category, a metal category and/or other recyclable material category.
- Each material category can have a subcategory.
- a plastic category can be associated with a P1 plastic subcategory, a P2 plastic subcategory, etc.
- the paper category can be associated with an office paper subcategory, a cardboard subcategory, etc.
- the circuit causes the sorting compartment to deposit the item into a certain waste bin (e.g., waste bin 406 or 408 of FIG. 4), as shown by 810.
- This deposition can be achieved via activation of a motor (e.g., motor 610 and/or 612 of FIG. 6) so that the sorting compartment releases the item into the waste bin provided for collecting recyclable items.
- the circuit When a determination is made that the item is of a non-recyclable type, the circuit performs operations to cause activation of the motor so that the sorting compartment releases the item into the waste bin provided for collecting non-recyclable items or a sub category of non-recyclable items (e.g., landfill types of items).
- step 814 the sorting compartment returns to its rest position or state.
- step 816 the security mechanism is optionally actuated so that the cover is released/unlocked/unlatched and another item can be disposed in the waste container.
- step 818 is performed where method 800 ends or other operations are performed (e.g., return to 804).
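- As a non-limiting summary of method 800, the following Python sketch walks through steps 804-816 with stubbed hardware and classification helpers; all names, fields, and readings are hypothetical placeholders for this sketch.

```python
def detect_item(sensor_data: dict) -> bool:
    """804: hypothetical detection, e.g., a beam break or a non-zero load reading."""
    return sensor_data.get("beam_broken", False) or sensor_data.get("weight_g", 0) > 0

def classify_waste(sensor_data: dict) -> str:
    """808: placeholder for the classification detailed in FIGS. 9-10."""
    return "recyclable" if sensor_data.get("looks_recyclable") else "landfill"

def process_item(sensor_data: dict, hardware) -> None:
    """One pass through method 800: detect (804), lock cover (806), classify (808),
    deposit (810/812), return to rest (814), unlock (816)."""
    if not detect_item(sensor_data):           # 804
        return
    hardware.lock_cover()                      # 806 (optional safety latch)
    waste_type = classify_waste(sensor_data)   # 808
    hardware.deposit(waste_type)               # 810 / 812
    hardware.return_to_rest()                  # 814
    hardware.unlock_cover()                    # 816

class HardwareStub:
    def lock_cover(self): print("cover locked")
    def unlock_cover(self): print("cover unlocked")
    def deposit(self, bin_name): print("deposit into", bin_name, "bin")
    def return_to_rest(self): print("compartment at rest")

if __name__ == "__main__":
    process_item({"beam_broken": True, "looks_recyclable": True}, HardwareStub())
```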
- Method 900 begins with 902 and continues with 904 where sensor data is generated and used to detect the presence of an item in the sorting compartment.
- a motion detection sensor may be employed to detect that an individual is proximate to the system
- a position sensor may be employed to facilitate detection that someone has opened a cover (e.g., cover 104 of FIG. 1) of the system to place an item therein
- a load sensor may be employed to facilitate a detection that a weight has been received.
- Two or more sensors will capture data points that the system will use (i) to assign a type of waste to the item and (ii) select a waste bin of a plurality of waste bins that is associated with the assigned type.
- a camera may capture a digital image of the item placed within the waste container as shown by 906.
- a load sensor may weigh the item as shown by 908.
- a metal detector may be used in 910 to detect whether the item is made of metal, and if so what type of metal.
- the metal detector can include, but is not limited to, an inductance sensor and a capacitive sensor.
- one or more other sensors can be used to generate sensor data that is useful for determining a type of waste category for the item.
- These other sensors can include, but are not limited to, temperature sensors, infrared sensors, spectroscopy sensors, audio sensors, capacitive sensors, fluorescence sensors, millimeter-wave radar sensors, and/or depth cameras.
- the system will perform comparison operations using the sensor data to identify a predefined data set to which the sensor data matches by a certain amount (e.g., 70%). Once a predefined data set has been identified, the system accesses a datastore to obtain a type of waste category associated therewith (e.g., a recyclable type of waste or a non-recyclable type of waste), as shown by 912. For example, if the first sensor is a camera that captured a digital image of the item, the system will perform image processing to determine whether content of the digital image corresponds to one or more characteristics of a known recyclable object.
- the image processing may be performed using any now or hereafter known image processing techniques, including but not limited to edge detection, object recognition, feature extraction, and other techniques.
- the characteristics of known recyclable objects may be stored in a data set that is accessible to the processor. Examples of characteristics of known objects that the data set may store include a logo (such as a product trademark), text of a known product name, a shape, a size and/or a color scheme.
- the system will classify the item as a recyclable type of waste in 914.
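- For illustration only, matching sensed characteristics against a data set of known recyclable objects (with, e.g., a 70% match threshold) might be sketched as follows; the stored characteristics, feature names, and example objects are invented for this sketch.

```python
# Hypothetical data set of known recyclable objects and stored characteristics
# (logo, shape, color scheme, material), as described above.
KNOWN_OBJECTS = {
    "soda_bottle": {"logo": "cola", "shape": "bottle", "color": "clear", "material": "plastic"},
    "soup_can":    {"logo": "soup", "shape": "cylinder", "color": "silver", "material": "metal"},
}

def match_score(observed: dict, reference: dict) -> float:
    """Fraction of reference characteristics that the observed features match."""
    if not reference:
        return 0.0
    hits = sum(1 for k, v in reference.items() if observed.get(k) == v)
    return hits / len(reference)

def identify(observed: dict, threshold: float = 0.7):
    """Return (name, category) of the best-matching known object if the match
    meets the threshold (e.g., 70%), otherwise None."""
    best_name, best_score = None, 0.0
    for name, ref in KNOWN_OBJECTS.items():
        score = match_score(observed, ref)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, "recyclable"
    return None

if __name__ == "__main__":
    features = {"logo": "cola", "shape": "bottle", "color": "clear", "material": "plastic"}
    print(identify(features))  # ('soda_bottle', 'recyclable')
```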
- the sorting compartment either (i) deposits the item into a recyclable waste bin as shown by 920 or (ii) performs additional analysis in 918, 922, 924 which will be described below.
- the system will classify the piece of waste as a non-recyclable type of waste as shown by 916.
- the sorting compartment directs the item to a non-recyclable waste bin as shown by 922.
- the system may analyze the sensed data to determine whether this material is compostable as shown by 918. If so [918:YES], the sorting compartment directs the item to a compostable waste bin in 924.
- the system may determine that the waste is formed of a recyclable material if the inductance sensor indicates that the piece of waste is metallic.
- the system may analyze the image produced by the camera, the weight measurement produced by the load sensor, and the inductance measured by the inductance sensor in combination to determine the type of waste, using any suitable algorithms, or comparison of the sensed data to characteristics of material types as stored in a data set.
- the system may use vision algorithms to produce a list of categories and a corresponding list of probabilities associated with the object.
- the system may utilize a machine learning method (e.g., a Bayesian Classification algorithm) which may compare the image or other captured sensor data against a trained data set to determine the probability that the item in the image represents a particular known item.
- the vision algorithm may produce the list of categories: {a bottle, a cup, a stick, a ball} and the corresponding list of probabilities: {0.6, 0.3, 0.1, 0.2}.
- the list of probabilities indicates that the image is a bottle with the probability of 0.6, a cup with the probability of 0.3, a stick with the probability of 0.1 and a ball with the probability of 0.2.
- the present solution is not limited to the particulars of this example.
- the system may use additional sensed data to determine whether the waste meets a recyclability threshold in 916. For example, if the classification operations determined that the piece of waste is a food or beverage container, the recyclability threshold may be a requirement that the container contain no more than a threshold amount of liquid or other material. The system may then determine the weight of the object and determine whether the weight meets or exceeds a threshold beyond which recyclability is not possible. As another example, if the classification operation determined that the piece of waste is a cardboard box, the system may use a sensor (such as a camera) to determine whether the box is a pizza box. If so, the sorting compartment can move the box to a non-recyclable waste bin based on a rule that pizza boxes are typically contaminated with food particles and thus expected to be non-recyclable.
- the system also may take weight of the object into account when determining whether or not a particular piece of waste is recyclable in 912.
- the system may store a table of weights and corresponding pieces of waste in a memory.
- the table may include information that landfill paper typically weighs in the range of 11g-23g, a metal can typically weighs in the range of 13g-15g, a plastic container typically weighs in the range of 14g-19g, etc.
- the weight categories can correspond to the categories produced by the vision algorithm, can be overlapping, or can be disjointed. Based on the weight of the piece of waste placed in the waste container, the system may assign a probability to each category.
- the system can assign the probability to a category in various ways. If the weight falls into several categories, the system can evenly distribute the probability among the several categories. For example, if the weight of the piece of waste is 16g, the system may determine that the probability that the piece of waste is a metal can is 0.0, the probability that the piece of waste is a plastic container is 0.5, and probability that the piece of waste is landfill paper is 0.5. Additionally or alternatively, the system assigns a greater probability the closer the measured weight is to the average weight associated with a particular category. For example, the average weight of the landfill paper is 17g, the average weight of the metal can is 14g, and the average weight of the plastic container is 16.5g. The weight of the measured piece of waste is determined to be 16g.
- the system may determine that the probability that the piece of waste is a plastic container is 0.49, while the probability that the piece of waste is a landfill paper is 0.51.
- the system may select the higher probability as the material to which the item will be classified.
- the present solution is not limited to the particulars of this example.
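- The weight-based probability assignment described above might be sketched in Python as follows; the table values are illustrative, and this sketch simply weights categories inversely with their distance from the measured weight, so it will not exactly reproduce the 0.49/0.51 example figures given above.

```python
# Illustrative weight table (grams): typical ranges and average weights per category.
WEIGHT_TABLE = {
    "landfill_paper":    {"range": (11, 23), "avg": 17.0},
    "metal_can":         {"range": (13, 15), "avg": 14.0},
    "plastic_container": {"range": (14, 19), "avg": 16.5},
}

def weight_probabilities(weight_g: float) -> dict:
    """Assign a probability to each category whose range contains the measured
    weight, giving more weight to categories whose average is closer to it."""
    candidates = {name: row for name, row in WEIGHT_TABLE.items()
                  if row["range"][0] <= weight_g <= row["range"][1]}
    if not candidates:
        return {name: 0.0 for name in WEIGHT_TABLE}
    # Closer to the category average -> larger (unnormalized) score.
    scores = {name: 1.0 / (abs(weight_g - row["avg"]) + 1e-6)
              for name, row in candidates.items()}
    total = sum(scores.values())
    probs = {name: 0.0 for name in WEIGHT_TABLE}
    probs.update({name: s / total for name, s in scores.items()})
    return probs

if __name__ == "__main__":
    # 16g falls into both the paper and plastic ranges, so the probability is split.
    print(weight_probabilities(16.0))
```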
- the system also may take inductance of the object into account when determining whether or not a particular piece of waste is recyclable in 912, or to determine the recyclability threshold in 916.
- the system may store a table of inductances and corresponding pieces of waste in the memory. The inductance for most materials is generally equal, approximately 137H, except for pieces of trash containing metal, for which the inductance varies within 140H-154H.
- the recyclability threshold may be a particular inductance level (such as 140H).
- the system may assign a probability that the piece of waste is metal.
- the system combines the probabilities associated with various categories received from image analysis, weight analysis, and inductance analysis to determine the type of waste deposited in the waste container.
- the system may automatically divert the waste to a bin designated for landfill disposal. For example, if, after assigning probabilities that a piece of waste is formed of a particular material based on the various sensor input described above, the system still does not recognize the type of waste with at least 80% certainty, the system may divert the waste to a landfill bin.
- the actual waste-type recognition threshold needed to divert waste to a particular recycling bin is not limited to 80%, and may be customizable and/or adjustable within the processor based on recycling plant preferences, local regulations, etc.
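- For illustration only, combining per-sensor category probabilities and applying the (customizable) recognition threshold might look like the following Python sketch; the sensor weights and example probabilities are assumptions made for the sketch.

```python
from __future__ import annotations

def combine_probabilities(per_sensor: dict[str, dict[str, float]],
                          weights: dict[str, float] | None = None) -> dict[str, float]:
    """Fuse per-sensor category probabilities into one distribution using a
    weighted average; `weights` lets one sensor be prioritized over others."""
    weights = weights or {s: 1.0 for s in per_sensor}
    categories = {c for probs in per_sensor.values() for c in probs}
    total_w = sum(weights.get(s, 1.0) for s in per_sensor)
    return {c: sum(weights.get(s, 1.0) * probs.get(c, 0.0)
                   for s, probs in per_sensor.items()) / total_w
            for c in categories}

def choose_bin(combined: dict[str, float], threshold: float = 0.8) -> str:
    """Divert to landfill unless some recyclable category meets the
    (customizable) recognition threshold, e.g. 80%."""
    best = max(combined, key=combined.get)
    return best if best != "landfill" and combined[best] >= threshold else "landfill"

if __name__ == "__main__":
    fused = combine_probabilities({
        "camera":     {"plastic": 0.85, "landfill": 0.15},
        "load":       {"plastic": 0.80, "landfill": 0.20},
        "inductance": {"plastic": 0.90, "landfill": 0.10},
    })
    print(fused, "->", choose_bin(fused))  # plastic wins at ~0.85 >= 0.8
```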
- the system can be connected to a database via a wired or a wireless network as described herein.
- the system may store, in a data set in computer-readable memory, information regarding the type of waste received. Further, the data set can associate the type of waste with an identifier (ID) associated with the waste.
- the system can also retrieve the information regarding the type of waste and the ID associated with the waste, and output information and/or statistics to a user of the system on a display device or via an audio output, if the waste container is equipped with a display or speaker.
- Such information can be used in utilizing game mechanics to incentivize consuming items that lead to recyclable waste, educating students in schools about what is recyclable and what is not, etc. For example, when two different waste containers are placed on two different floors in an office building, the processor can keep track of which floor discarded more recyclable waste, and award incentives, such as game points, to the floor that discarded more recyclable waste.
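- A minimal sketch of the per-container statistics and game-point tracking described above follows; the container identifiers and the scoring rule are hypothetical choices made for this sketch.

```python
from collections import defaultdict

class WasteStats:
    """Track, per container ID, how many items of each waste type were received,
    and rank locations by the number of recyclable items they discarded."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def record(self, container_id: str, waste_type: str) -> None:
        self.counts[container_id][waste_type] += 1

    def leaderboard(self):
        """Return (container_id, recyclable_count) pairs, best first."""
        return sorted(((cid, types.get("recyclable", 0))
                       for cid, types in self.counts.items()),
                      key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    stats = WasteStats()
    stats.record("floor_2", "recyclable")
    stats.record("floor_2", "recyclable")
    stats.record("floor_3", "landfill")
    print(stats.leaderboard())  # floor_2 leads in recyclable items
```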
- circuit 106 is configured to assess whether a piece of waste corresponds to a particular item (e.g., a plastic bottle, metallic object, or paper product). Such an assessment can be performed in the manner discussed above in relation to FIGS. 8-9. While the method described above with respect to FIG. 9 details the utilization of multiple sensors working in conjunction to classify a piece of waste placed within a waste container and appropriately dispose of that waste, FIG. 10 details a method of determining a type of waste, again utilizing multiple different sensor inputs, but prioritizing classification based on particular sensor inputs.
- method 1000 begins with 1002 and continues with 1004 where a camera takes an image of the piece of waste placed within the waste container.
- a load sensor weighs the piece of waste.
- the inductance sensor measures conductivity of the piece of waste. It is to be understood that operations 1004-1008 may be performed in any order, and more than one of each applicable sensor may be utilized. Alternatively and/or additionally, one or more other sensors, such as temperature sensors, infrared sensors, spectroscopy sensors, audio sensors, capacitive sensors, fluorescence sensors, millimeter-wave radar sensors, depth cameras, etc., may be utilized in lieu of, or in addition to, one or more of a camera, load sensor, or inductance sensor.
- step 1012 it is determined, by a processor executing programming instructions, whether the camera has detected the possible presence of a plastic bottle.
- the system may utilize a machine learning method such as, for example, Bayesian Classification, which may compare the image captured by the camera against a trained data set to determine the probability that the item in the image represents a particular, known item, such as a plastic bottle. If yes [1012:YES], method continues with 1018 where the system prioritizes the data received from the camera in classifying the waste as recyclable or non-recyclable.
- the system may be configured to weight the probability determinations related to the camera data more heavily than data received from other sensors (e.g., load sensor(s), inductance sensor(s), etc.).
- in this scenario, the data received from the camera is likely the most accurate in determining the type of object.
- the method continues with 1014 where the system determines whether the inductance sensor has detected the possible presence of a metallic object (e.g., an aluminum beverage can). Again, a machine learning method may be used to compare conductivity data captured by the inductance sensor against a trained data set to determine the probability that an object having a particular conductivity represents a known item, such as, e.g., an aluminum beverage can. If yes [1014:YES], the method continues with 1020 where the system prioritizes data received from the induction sensor in classifying the waste as recyclable or non-recyclable.
- the processor may be configured to weigh the probability determinations related to the conductivity data more heavily than data received from other sensors, as a metallic object may be more readily and positively identifiable by an inductance sensor than, for example, a camera or load sensor.
- the method continues with 1016 wherein the system may determine whether the load sensor indicates the possible presence of a recyclable paper product.
- paper products typically fall within a specific weight range (e.g., 11g-23g), such that the data received from the load sensor may be used to determine if an object that is not a plastic bottle and is not a metallic object may be a paper product. If yes [1016:YES], then the system may prioritize data received from the load sensor in classifying the waste as recyclable or non-recyclable as shown by 1022.
- the processor may be configured to weigh the probability determinations related to the weight data more heavily than data received from other sensors, as a paper product may be more readily and positively identifiable by a load sensor than, for example, a camera or inductance sensor.
- while steps 1004-1008 are described using a camera, load sensor, and inductance sensor respectively and in that order, it is again to be understood that the steps may be performed in any order, and the determination steps 1012-1016 may similarly be performed in any order. Also, while a camera, inductance sensor, and load sensor are described in the method set forth in FIG. 10, one or more other sensors, such as temperature sensors, infrared sensors, spectroscopy sensors, audio sensors, capacitive sensors, fluorescence sensors, millimeter-wave radar sensors, depth cameras, etc., may be utilized in lieu of, or in addition to, one or more of those sensors.
- the processor may prioritize data received from any one of the above sensors based on relative confidence in a given sensor's accuracy in determining the presence of a particular object.
- if none of the determinations at 1012-1016 is affirmative, the system may execute instructions to deposit the inserted waste into a bin designated for landfill waste. Such instructions are issued because none of the sensors utilized determined that the inserted object may contain (at least partially) a recyclable material.
- other sensors may be utilized in lieu of, or in addition to, one or more of a camera, load sensor, or inductance sensor, in making the determination of whether a recyclable product is potentially present.
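- The prioritized decision cascade of FIG. 10 (camera check at 1012, inductance check at 1014, load check at 1016, landfill default otherwise) could be sketched as follows; the probability cutoff, paper weight range, and function names are assumptions made for illustration only.

```python
# A simplified sketch of the prioritized decision cascade of FIG. 10;
# thresholds and names are assumed example values, not the actual implementation.
PAPER_WEIGHT_RANGE_G = (11.0, 23.0)

def prioritize_sensor(camera_probs, conductivity_detected, weight_g):
    """Return which sensor's data to prioritize, mirroring steps 1012-1016."""
    if camera_probs.get("plastic_bottle", 0.0) > 0.5:   # 1012: camera detects a likely plastic bottle
        return "camera"
    if conductivity_detected:                            # 1014: inductance detects a metallic object
        return "inductance"
    low, high = PAPER_WEIGHT_RANGE_G
    if low <= weight_g <= high:                          # 1016: weight consistent with a paper product
        return "load"
    return None                                          # no match -> landfill bin
```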
- the system may also utilize input from other sensors (e.g., a load sensor, millimeter-wave radar sensor, etc.) in decision block 1026 to determine not only if the object is indeed a plastic bottle, but also if the plastic bottle is recyclable. That is, secondary sensor input from, for example, a load sensor, may indicate that the plastic bottle weighs more than a known threshold range of substantially empty, recyclable plastic bottles, as determined from the trained data set accessed by the processor. A plastic bottle weighing more than a known threshold range may indicate that the bottle still contains liquid.
- a plastic bottle containing more than a predetermined amount of liquid may be considered contaminated and, therefore, unrecyclable.
- if the plastic bottle is determined to be recyclable, the processor may provide instructions to deposit the waste in a plastics recyclable bin in step 1038, and the waste is automatically deposited into an appropriate bin at step 1044.
- if the plastic bottle is determined to be contaminated or otherwise non-recyclable, the processor may provide instructions to deposit the waste in a landfill bin at step 1032, and the waste is automatically deposited into an appropriate bin at step 1044.
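- A minimal sketch of this secondary contamination check follows, assuming a weight threshold derived from the trained range of substantially empty bottles; the threshold value and bin names are illustrative assumptions.

```python
# A bottle heavier than the "substantially empty" range is presumed to still
# contain liquid and is treated as contaminated; values are assumptions.
EMPTY_BOTTLE_MAX_GRAMS = 30.0

def route_plastic_bottle(measured_weight_g, empty_max_g=EMPTY_BOTTLE_MAX_GRAMS):
    """Route a detected plastic bottle to recycling (step 1038) or landfill (step 1032)."""
    if measured_weight_g <= empty_max_g:
        return "plastics_recycling_bin"   # step 1038 -> deposited at 1044
    return "landfill_bin"                 # step 1032 -> deposited at 1044
```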
- the system may prioritize data received from the inductance sensor if the presence of a metallic object is detected, but in step 1028, the processor also utilizes input from other sensors (e.g., a camera, a load sensor, etc.) to determine not only if the object contains a metallic material, but if that metallic object is recyclable.
- secondary sensor input from, for example, a load sensor may indicate that the metallic object (e.g., an aluminum beverage can) still contains liquid therein and is, therefore, too contaminated to be properly recycled according to local regulations.
- if the metallic object is determined to be recyclable, the processor may provide instructions to deposit the waste in a metallic recyclable bin in step 1040, and the waste is automatically deposited into an appropriate bin at step 1044.
- if the metallic object is determined to be contaminated or otherwise non-recyclable, the processor may provide instructions to deposit the waste in a landfill bin at step 1034, and the waste is automatically deposited into an appropriate bin at step 1044.
- the processor may prioritize data received from the load sensor if the presence of a paper product or object is detected, but in step 1030, the processor also utilizes input from other sensors (e.g., a camera, a millimeter-wave radar sensor, etc.) to determine not only if the object is made of paper, but also if that paper product is recyclable.
- secondary sensor input from a camera or millimeter-wave radar sensor may indicate that the paper product may be too contaminated to be properly recycled according to local regulations.
- while a corrugated box may typically be recyclable, a corrugated box having food or grease contamination on any surface thereof may not be recyclable.
- if the paper product is determined to be recyclable, the processor may provide instructions to deposit the waste in a paper recyclable bin in block 1042, and the waste is automatically deposited into an appropriate bin at step 1044.
- if the paper product is determined to be contaminated or otherwise non-recyclable, the processor may provide instructions to deposit the waste in a landfill bin at step 1036, and the waste is automatically deposited into an appropriate bin at step 1044.
- the thresholds used by the processor to determine whether or not an object is recyclable may be dynamically adjusted based upon local codes and preferences at the location of the waste container. For example, one recycling plant may accept plastic bottles with up to one ounce of liquid still contained therein, whereas a recycling plant in another location may not accept plastic bottles with any liquid remaining therein.
- such thresholds may be dynamically programmed into the processor to provide for location-specific sorting.
- the processor may be programmed to determine (via a variety of sensors, as described above) the type of plastics and sort accordingly. If, in the future, other types of plastic are considered recyclable in that area, the system can be locally or remotely updated to allow for such sorting.
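- One way to realize such location-specific, updatable thresholds is sketched below; the rule fields, values, and plant identifiers are assumptions for illustration, not requirements of the system.

```python
# Hedged sketch of location-specific sorting rules; values are illustrative only.
LOCATION_RULES = {
    "plant_a": {"max_residual_liquid_oz": 1.0, "accepted_plastics": {"PET", "HDPE"}},
    "plant_b": {"max_residual_liquid_oz": 0.0, "accepted_plastics": {"PET"}},
}

def is_bottle_recyclable(location, residual_liquid_oz, plastic_type):
    """Apply the rules in force at the waste container's location."""
    rules = LOCATION_RULES[location]
    return (residual_liquid_oz <= rules["max_residual_liquid_oz"]
            and plastic_type in rules["accepted_plastics"])

def update_rules(location, **changes):
    """Local or remote update, e.g., when a new plastic type becomes recyclable in that area."""
    LOCATION_RULES.setdefault(location, {}).update(changes)
```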
- one or more additional sensors may be present within the waste container so as to determine the fullness and/or remaining capacity of one or more of the waste bins therein.
- one or more sonar sensors or infrared sensors may be utilized within the waste container to determine the relative fullness and/or remaining capacity of one or more of the bins.
- the waste container may have an external indicator, such as a visual and/or audible indicator, which notifies a user or responsible party that a particular waste bin is at or near capacity and should therefore be emptied.
- one or more light-emitting diodes or other visual indicators viewable directly at the waste container may be illuminated when a waste bin is at or near capacity.
- a graphical user interface located on the waste bin itself may provide a visual and/or audible indication that a waste bin is full, or a user interface on a web-enabled device (e.g., a smartphone, tablet computer, etc.) in communication with the waste container may provide a similar visual and/or audible indication that a waste bin is full.
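- As one hedged illustration, a reading from a downward-facing sonar or infrared sensor could be converted to a fill percentage and compared against an alert threshold as follows; the bin geometry and threshold are assumed values.

```python
# Illustrative fullness estimation from a top-mounted distance sensor.
def fill_percentage(distance_to_waste_cm, bin_depth_cm=80.0):
    """The sensor measures the distance down to the top of the waste pile."""
    filled_cm = max(0.0, bin_depth_cm - distance_to_waste_cm)
    return min(100.0, 100.0 * filled_cm / bin_depth_cm)

def should_alert(distance_to_waste_cm, threshold_pct=85.0):
    """True when an LED, on-bin display, or paired device should signal 'nearly full'."""
    return fill_percentage(distance_to_waste_cm) >= threshold_pct
```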
- the waste bin may comprise a display or user interface thereon which may provide, for example, instructions on proper disposal of waste, information on what is (or is not) considered recyclable waste, advertisements, etc.
- the display or user interface may provide an indication to the user as to why the waste was placed in a landfill waste bin as opposed to a recyclable waste bin. For example, if the user places a plastic bottle into the waste bin that is still filled with an impermissible amount of liquid, the display or user interface may provide visual and/or audible feedback to the user, explaining why the plastic bottle was not recyclable.
- the display or user interface may provide the user with one or more recommendations as to how to properly dispose of the waste in the future (e.g., “Please empty all liquids from plastic container prior to disposal”).
- the display or user interface may not be located on the waste bin itself, but may instead be located on a web-enabled device (e.g., a smartphone, tablet computer, etc.) in communication with the waste container.
- Circuit 106 may comprise a machine in the example form of a computer system 800 within which a set of instructions, for causing the machine to perform any one or more of the methodologies or modules discussed herein, may be executed.
- the circuit 106 can include elements such as a processor 1102, a main memory 1104, a non-volatile memory 1106, and an interface device 1108. Various common components (e.g., cache memory) are omitted for illustrative simplicity.
- the circuit 106 is intended to illustrate a hardware device on which any of the components and/or methods described (and any other components described in this specification) can be implemented.
- the circuit 106 can be of any applicable known or convenient type.
- the components of the circuit 106 can be coupled together via a bus or through some other known or convenient device.
- circuit 106 may be an embedded computer system, a System-On-Chip (SOC), a Single-Board Computer (SBC) system (e.g., a Computer-On- Module (COM) or a System-On-Module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a Personal Digital Assistant (PDA), a Server, or a combination of two or more of these.
- circuit 106 may include one or more computer systems; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks.
- one or more computer systems may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein.
- one or more computer systems may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein.
- One or more computer systems may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
- a bus 1010 also couples the processor 1102 to the non-volatile memory 1106 and drive unit 1112.
- the non-volatile memory 1106 is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a Read-Only Memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software in the circuit.
- the non-volatile storage can be local, remote, or distributed.
- the non-volatile memory is optional because systems can be created with all applicable data available in memory.
- a typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor.
- Programming instructions are typically stored in the non-volatile memory and/or the drive unit. Indeed, storing an entire large program in memory may not even be possible. Nevertheless, it should be understood that for software to run, if necessary, it is moved to a computer readable location appropriate for processing, and for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and local cache that, ideally, serves to speed up execution.
- a software program is assumed to be stored at any known or convenient location (from non-volatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable medium.”
- a processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
- the bus also couples the processor 1102 to the network interface device 1106.
- the network interface device 1106 can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of the circuit.
- the interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g. "direct PC"), or other interfaces for coupling a computer system to other computer systems.
- the interface can include one or more input and/or output devices.
- the I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other input and/or output devices, including a display device.
- the display device can include, by way of example but not limitation, a Cathode Ray Tube (CRT), Liquid Crystal Display (LCD), or some other applicable known or convenient display device.
- controllers of any devices not depicted in the example of FIG. 11 reside in the interface.
- the circuit can be controlled by operating system software that includes a file management system, such as a disk operating system.
- one example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Washington, and their associated file management systems. Another example is the Linux™ operating system and its associated file management system.
- the file management system is typically stored in the non-volatile memory and/or drive unit and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the nonvolatile memory and/or drive unit.
- the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- while the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the terms "machine-readable medium" and "machine-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- the terms "machine-readable medium" and "machine-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies or modules of the presently disclosed technique and innovation.
- routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as "computer programs.”
- the computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer, and that, when read and executed by one or more processing units or processors in a computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
- further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD-ROMs), Digital Versatile Disks (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.
- operation of a memory device may comprise a transformation, such as a physical transformation.
- a physical transformation may comprise a physical transformation of an article to a different state or thing.
- a change in state may involve an accumulation and storage of charge or a release of stored charge.
- a change of state may comprise a physical change or transformation in magnetic orientation or a physical change or transformation in molecular structure, such as from crystalline to amorphous or vice versa.
- a storage medium typically may be non-transitory or comprise a non-transitory device.
- a non-transitory storage medium may include a device that is tangible, meaning that the device has a concrete physical form, although the device may change its physical state.
- non-transitory refers to a device remaining tangible despite this change in state.
- System 1200 comprises waste receptacle(s) 1202, computing device(s) 1206, datastore(s) 1208, mobile communication device(s) 1210 and optional presentation device(s) 1212.
- the listed components 1202, 1206, 1210 and/or 1212 can communicate with each other via wired communications or wireless communications across network 1204. Any communication technology can be used to facilitate such wired and/or wireless communications.
- Network 1204 can include, but is not limited to, the Internet, a LAN, a WAN, a cellular network and/or a radio network.
- Waste receptacle(s) can include waste container 100 of FIG. 1.
- waste container 100 comprises (i) an opening 102 through which items can be disposed by individuals and (ii) a circuit 106 configured to detect characteristic(s) of the items deposited through the opening 102 and into the sorting compartment (e.g., sorting compartment 402 of FIG. 4, 600 of FIG. 6 or 700 of FIG. 7).
- the detected characteristic(s) can be used for various purposes such as providing individuals with feedback prior to, during and/or after depositing item(s) into the waste receptacle(s).
- the feedback can include, but is not limited to, auditory feedback (e.g., a sound, a song and/or speech), visual feedback (e.g., image(s), video(s) and/or a light show) and/or tactile feedback (e.g., vibration of an electronic device).
- Each of the listed types of feedback is defined or otherwise specified in a library 1220 that is accessible to the waste receptacle(s).
- the library 1220 can be stored in the datastore 1208 and/or in memory (e.g., main memory 1104 of FIG. 11) of the waste receptacle(s) 1202.
- the library 1220 can include, but is not limited to, media content stored in association with pre-defined tags and/or other information.
- the media content can include audio (e.g., a sound, a song and/or speech), image(s), video(s), advertisement(s), graphic(s), icon(s), instructions for light show(s), and/or instructions for providing tactile feedback (e.g., a single vibration or a series of vibrations).
- the pre-defined tags provide a means to search the library 1220 for media content associated with item(s) proximate to (e.g., outside of but in close distance to) or inside of waste receptacle(s).
- the pre-defined tags can specify an item type (e.g., a bottle, a plate, a piece of silverware, a straw, a box, a newspaper, a magazine, a shirt, etc.), a waste type (e.g., recyclable or non-recyclable), item material(s) (e.g., glass, plastic, paper, cardboard and/or cloth), a manufacturer (e.g., the Coca-Cola Company), a product name (e.g., Coca-Cola Soft drink), a product code or identifier (e.g., a barcode, a unique product code, trademark, etc.), an item marking (e.g., a trademark, a slogan, a label design), an item shape, an item size, item color(s), and/or item contents (e.g., liquid, solid, powder, perishable food, hazardous liquid chemical, etc.).
- the other information can include, but is not limited to, demographics of individuals that use and/or purchase items, popularities of items, geographic areas where items are typically used or purchased, and/or manners in which items are used (e.g., consumption, power source for electronic devices, food storage, entertainment, etc.).
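- A hedged sketch of how such tagged library entries could be organized and searched follows; the field names, tag values, and file names are illustrative assumptions, not the actual schema of library 1220.

```python
# Illustrative tagged media library and a simple tag-overlap search.
LIBRARY = [
    {"media": "soda_ad.mp4",
     "tags": {"item_type": "bottle", "material": "plastic", "manufacturer": "the Coca-Cola Company"}},
    {"media": "paper_recycling_tip.png",
     "tags": {"item_type": "newspaper", "waste_type": "recyclable", "material": "paper"}},
]

def find_media(item_info, library=LIBRARY):
    """Return media whose pre-defined tags overlap the detected item-based information."""
    return [entry["media"]
            for entry in library
            if any(item_info.get(key) == value for key, value in entry["tags"].items())]

# Example: a detected plastic bottle matches the soda advertisement.
print(find_media({"item_type": "bottle", "material": "plastic"}))
```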
- the media content can be selectively output from the waste receptacle(s) 1202, mobile communication device(s) 1210 and/or presentation device(s) 1212 based on sensor data generated by sensor(s) of the waste receptacle(s).
- the manner in which the media content is selectively output will become evident as the discussion progresses.
- the presentation device(s) 1212 can include, but are not limited to, display(s), speaker(s) and/or a light system (e.g., an LED panel and/or laser light system).
- the presentation device(s) 1212 can be mounted on a wall or otherwise placed near the waste receptacle(s) 1202.
- the mobile communication device(s) 1210 can be in the possession of individual(s) in proximity to the waste receptacle(s) 1202.
- the mobile communication device(s) 1210 can include, but are not limited to, smart phones, smart watches, personal computers, personal digital assistants, and/or tablets.
- sensor(s) (e.g., sensor(s) 108 of FIG. 1) of the waste receptacle(s) 1202 perform operations to generate sensor data specifying characteristic(s) of item(s) in proximity to (e.g., outside of) and/or disposed in the waste receptacle(s) 1202.
- the sensor data can also specify characteristics of the individual who is or was in possession of the item(s) prior to being disposed in the waste receptacle(s) 1202.
- the sensor data can include, but is not limited to, image(s), video(s), audio signal(s), odor identifier(s), temperature(s), weight(s), size(s), shape(s), color(s), marking(s), barcode information, condition(s) (e.g., empty, full, damaged, cap on, cap off, etc.), item distance(s) from waste receptacle(s), item location(s) relative to a reference location (e.g., the location of a waste receptacle 1202 or presentation device 1212), item movement(s), location(s) of individual(s) relative to the waste receptacle(s), position of individuals relative to a reference object (e.g., the individual is or is not facing a waste receptacle 1202 or presentation device 1212), movement of individuals (e.g., an individual’s path of travel is towards, passing or away from the waste receptacle), and/or mobile communication device identifier(s).
- a circuit (e.g., circuit 106 of FIG. 1) of the waste receptacle(s) 1202 and/or another device (e.g., computing device(s) 1206) receive and process the sensor data to determine or otherwise obtain item-based information.
- the item-based information can include, but is not limited to: (1) the type of item (e.g., a bottle, a can, a pizza box, etc.), (2) the type of waste that the item(s) comprises (e.g., recyclable or non-recyclable), (3) characteristics of the item (e.g., size, weight, shape, color(s), etc.) (4) a condition of the item (e.g., empty, partially full, full, damaged, etc.), (5) a manufacturer of the item, (6) a tradename for the item, (7) a product identifier for the item, (8) label information for the item (e.g., a price), and/or (9) other information associated with the item (e.g., contents of the item (e.g., a liquid, a solid, a powder, a hazardous liquid chemical)).
- Individual-based information can also be obtained using the sensor data.
- the individual-based information can include, but is not limited to, an identifier for the individual (e.g., a Media Access Control (MAC) address for a mobile communication device in the possession of the individual), information indicating the popularity of the item type amongst individuals, and/or information indicating an average number of items of the same or similar type that has historically been processed by system 1200 during a given time of day (e.g., breakfast time, lunch time, dinner time, etc.) or year (e.g., a holiday season).
- the MAC address can be used to obtain social media content associated with the individual in possession of the mobile communication device (e.g., indicating the individual’s likes and dislikes for products) and/or item popularity in a group to which the individual belongs (e.g., an age group, a country group, etc.).
- the system can obtain the item-based information and/or the individual-based information utilizing one or more computer vision algorithms including: object detection, image classification, semantic segmentation, or object recognition.
- the system may use deep learning and random forest models to create the classifier algorithm, but other classifiers can be used within the scope of the invention.
- These computer vision algorithms are trained by utilizing an existing data set of characteristics that are processed through statistical means to create a model that can predict characteristics of items in novel images.
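- As a hedged illustration of the training step, the snippet below fits a random-forest model on an existing labeled feature set, consistent with the random-forest models mentioned above; scikit-learn is assumed only for illustration, and the per-image feature extraction (e.g., from a deep network) is omitted.

```python
# Illustrative training of an item-characteristic classifier on pre-extracted features.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def train_item_classifier(features, labels):
    """features: N x D array of per-image descriptors; labels: known item types."""
    X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2)
    clf = RandomForestClassifier(n_estimators=200)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))
    return clf

# clf.predict(new_features) can then predict characteristics of items in novel images.
```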
- the circuit of the waste receptacle(s) 1202 and/or the other device access(es) and search(es) the library 1220 using the item-based information.
- the library 1220 is searched to identify media content (e.g., a sound, a song, an image, a video, a light pattern, a vibration pattern, etc.) that is associated with the item-based information (e.g., a given type of item, a given type of waste, given item characteristics, given item condition(s), a given manufacturer, a given tradename, a given product identifier, given label information (e.g., price range), given item content(s), and/or other information).
- the identified media content is then retrieved from the datastore 1208 and/or local memory of the waste receptacle(s) 1202.
- the media content is then caused to be output from the waste receptacle(s) 1202, mobile communication device(s) 1210 and/or presentation device(s) 1212.
- image(s)/video(s)/advertisement(s) is(are) displayed on a display (e.g., display 200 of FIG. 2) of the waste receptacle(s) 1202, a display of the mobile communication device(s) 1210 in proximity to the waste receptacle(s) 1202, and/or a display of the presentation device(s) 1212 in proximity to the waste receptacle(s) 1202.
- Audio may also be output from speaker(s) of the devices 1202, 1210 and/or 1212.
- Light may also be output from the devices 1202, 1210 and/or 1212.
- the present solution is not limited to the particulars of this example.
- the identified media content is streamed from the datastore 1208 to devices 1202, 1210 and/or 1212.
- the system can select media content in accordance with the individual-based information (e.g., the item’s relationship to other products/items liked or disliked by the individual as indicated, for example, by social media posts) and/or other information (e.g., a time of day, a geographic location of the waste receptacle, a total number of individuals detected in proximity to the waste receptacle at a given time, sizes of all individuals detected in proximity to the waste receptacle (e.g., indicating a child and/or an adult), types of electronic devices in the possession of the proximate individuals (e.g., smart phone, smart watch, tablet, portable e-book reader device, a portable gaming device, etc.), a given degree of the item(s) popularity amongst individuals with or without a common attribute of the individual, an average number of such items processed by the system, etc.).
- the other criteria can be obtained using, for example, clocks.
- Method 1300 begins with 1302 and continues with 1304 where a waste receptacle (e.g., waste receptacle 1202 of FIG. 12) is placed in a power save mode operation or an idle mode of operation.
- sensor(s) of the waste receptacle can detect when an individual is in proximity thereto (e.g., within 10 feet thereof), as shown by 1306.
- the sensor(s) can include, but are not limited to, camera(s), proximity sensor(s) (e.g., beam brake sensor) and/or wireless communication device(s) that communicate with external devices (e.g., mobile communication device(s) 1210 of FIG. 12).
- the waste receptacle transitions to a waste processing mode of operation in 1308.
- sensor(s) of the waste receptacle detect when item(s) is(are) proximate to and/or being disposed in the waste receptacle as shown by 1310 and generate sensor data specifying characteristic(s) of the item(s) and/or individual(s) proximate to the waste receptacle (e.g., within 10 feet of the waste receptacle) as shown by 1312.
- the individual(s) can include (i) those that have or had possession of the item(s) and/or (ii) others who do or did not have possession of the item(s).
- the sensor data is provided in 1314 to a circuit (e.g., circuit 106 of FIG. 6) of the waste receptacle and/or to an external device (e.g., computing device 1206 of FIG. 12).
- the sensor data is analyzed in 1316 to obtain item-based information.
- the item-based information can include, but is not limited to, (1) the type of item (e.g., a bottle, a can, a pizza box, etc.), (2) the type of waste that the item(s) comprises (e.g., recyclable or non-recyclable), (3) characteristics of the item (e.g., size, weight, shape, color(s), etc.), (4) a condition of the item (e.g., empty, partially full, full, damaged, etc.), (5) a manufacturer of the item, (6) a tradename for the item, (7) a product identifier for the item, (8) label information for the item (e.g., a price), and/or (9) other information associated with the item (e.g., contents of the item (e.g., a liquid, a solid, a powder, a hazardous liquid chemical)).
- the item-based information is obtained for two or more items that were disposed in the waste receptacle within a threshold period of time starting from the time a first one of the items was detected by the waste receptacle (e.g., within 1 minute of each other).
- the two sets of item-based information are used to generate or otherwise obtain additional item-based information.
- the additional item-based information includes, but is not limited to, item classification(s), an identifier for a group to which both items belong, a common source identifier for both items (e.g., a given store, manufacturer or distributor), a ranking for the items relative to each other; an importance of the items relative to each other, and/or a value computed using weights selected for the items based on characteristic(s) thereof.
- the item-based information is used in 1318 to search a library (e.g., library 1220 of FIG. 12) to identify media content associated therewith.
- the media content can be identified when the item-based information matches (e.g., by a certain degree or amount) a pre-defined tag associated with the media content. If media content was identified [1320:YES], then the media content is output (e.g., streamed) in 1322 from the waste receptacle or another device (e.g., mobile communication device 1210 and/or presentation device 1212 of FIG. 12). Subsequently, 1324 is performed where method 1300 ends or other operations are performed (e.g., return to 1302).
- if no media content was identified [1320:NO], method 1300 continues with 1326 of FIG. 13B.
- 1326 involves analyzing the sensor data to obtain individual-based information.
- the individual-based information can include, but is not limited to, an identifier for the individual (e.g., a MAC address for a mobile communication device in the possession of the individual), information indicating the popularity of the item type amongst individuals, and/or information indicating an average number of items of the same or similar type that has historically been processed by the waste receptacle during a given time of day (e.g., breakfast time, lunch time, dinner time, etc.) or year (e.g., a holiday season).
- the MAC address can be used to obtain social media content associated with the individual in possession of the mobile communication device (e.g., indicating the individual’s likes and dislikes for products) and/or item popularity in a group to which the individual belongs (e.g., an age group, a country group, etc.).
- the library is searched using the individual-based information. If media content was identified during this search [1330:YES], then the media content is output from the waste receptacle or another device (e.g., mobile communication device 1210 and/or presentation device 1212 of FIG. 12). Subsequently, 1338 is performed where method 1300 ends or other operations are performed (e.g., return to 1302).
- media content is selected in 1334 randomly from the library or based on other information (e.g., a time of day, a geographic location of the waste receptacle, a total number of individuals detected in proximity to the waste receptacle at a given time, sizes of all individuals detected in proximity to the waste receptacle (e.g., indicating a child and/or an adult), types of electronic devices in the possession of the proximate individuals (e.g., smart phone, smart watch, tablet, portable e-book reader device, a portable gaming device, etc.), a given degree of the item(s) popularity amongst individuals with or without a common attribute of the individual, an average number of such items processed by the system, etc.).
- the selected media content is then output from the waste receptacle or another device (e.g., mobile communication device 1210 and/or presentation device 1212 of FIG. 12) as shown by 1336. Subsequently, 1338 is performed where method 1300 ends or other operations are performed (e.g., return to 1302).
- Method 1400 begins with 1402 and continues with 1404 where item-based information is obtained for two or more items. These items can include those that were disposed in a waste receptacle (e.g., waste receptacle 1202 of FIG. 12) within a threshold period of each other or those that are both in proximity to the waste receptacle at the same time.
- when the items are determined to share a common source, method 1400 continues with 1408-1410.
- 1408-1410 involve: selecting media content from a library (e.g., library 1220 of FIG. 12) that is associated with the common source; and causing the selected media content to be output from waste receptacle or another device (e.g., mobile communication device 1210 and/or presentation device 1212 of FIG. 12). Subsequently, 1412 is performed where method 1400 ends or other operations are performed (e.g., return to 1402).
- when the items are determined to have different sources, method 1400 continues with 1414-1418.
- 1414-1418 involve: optionally determining a ranking of the sources; selecting media content from a library (e.g., library 1220 of FIG. 12) randomly or based on the source ranking; and causing the selected media content to be output from waste receptacle or another device (e.g., mobile communication device 1210 and/or presentation device 1212 of FIG. 12). Subsequently, 1412 is performed where method 1400 ends or other operations are performed (e.g., return to 1402).
- an individual may approach a waste receiving container and deposit a beverage can.
- the lid closes, and the sensors capture data (e.g., an image) about the object and send the information to a cloud server hosting a computer vision algorithm that is trained to detect the logos of brand name consumer product goods and return a value with the brand name.
- the computer vision algorithm processes the data received and returns a label for the item.
- the label may be a specific label, such as a value with the beverage company’s brand name, or a general category such as beverage can.
- the system may then select a digital media file containing content that corresponds to the identified label by utilizing a script that, when run by the circuit of the waste container or a remote server, searches through a library database, hosted either locally or in the cloud.
- each piece of media is associated with one or more tags that are related to potential discarded object characteristics such as brand names, object materials, object types (aluminum cans, plastic bottles, glass bottles etc.).
- as the script searches through this library, it compares the initial output value of the label to the tags attached to each piece of media; when the algorithm finds a piece of media with a tag related to the identified label, it triggers the system to play that piece of media content.
- the tag may be specific to a brand (e.g., media may be tagged as candidates for presentation if a Coke can is detected), or it may be associated with a category (such as non-alcoholic beverage cans). Any content item may be associated with multiple tags.
- the system may then play the digital file and output the file’s content via the presentation device by streaming the media from a cloud server or playing the media from a local device. For example, if the digital media file picked by running the script is an advertisement for a beverage product, the system may display an image for the beverage product.
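- The label-to-media lookup described in this example could look roughly like the sketch below; the library entries, tag strings, and function name are assumptions made for illustration.

```python
# Illustrative label-to-media lookup against tagged library entries.
def select_media_for_label(label, library):
    """Compare the classifier's output label against each media item's tags
    and return the first piece of media with a related tag."""
    for entry in library:
        if label in entry["tags"]:
            return entry["media"]
    return None

library = [
    {"media": "brand_ad.mp4", "tags": {"Coke can", "non-alcoholic beverage cans"}},
    {"media": "can_recycling_tip.png", "tags": {"beverage can", "aluminum cans"}},
]

# A brand-specific label matches the advertisement; a generic label such as
# "beverage can" would instead match the recycling tip.
print(select_media_for_label("Coke can", library))
print(select_media_for_label("beverage can", library))
```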
- an individual may deposit a coffee cup into a waste container.
- the system will recognize the object as a coffee cup using methods such as those described above.
- the system may then select a media file from the library that includes content about the recyclability of coffee cups. This information can include whether the object is recyclable or not, if it needs to be cleaned before being tossed, and common waste statistics.
- the system may output the content via the display or other presentation device.
- the system may use additional rules to determine which media content to present to the consumer. For example, the system may implement a multi-step process that considers not only a single discarded item, but multiple items that are discarded within a short threshold period of time from each other (for example, within 10 seconds, 20 seconds, or 30 seconds of each other). The system may identify each contemporaneously discarded item and determine whether the items are collectively associated with a “group label” or category. If so, then the system may select and present media content having a tag that is relevant to the group label.
- for example, if a diaper and a wipe are discarded together, the system may presume that the wipe was related to the diaper, and that the group label may be “parent with baby.”
- similarly, if both items originate from the same fast food restaurant, the system may presume that both items may be associated with a group label for the fast food restaurant (or with fast food restaurants in general).
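- A hedged sketch of this group-label idea follows: items discarded within a short time window are checked against group definitions. The group names, item labels, and window length are illustrative assumptions drawn loosely from the examples above.

```python
# Illustrative inference of a group label from contemporaneously discarded items.
GROUP_LABELS = {
    frozenset({"diaper", "wipe"}): "parent with baby",
    frozenset({"burger wrapper", "drink cup"}): "fast food restaurant",
}

def group_label(discarded, window_s=30.0):
    """discarded: list of (timestamp_seconds, item_label) tuples."""
    discarded = sorted(discarded)
    for i, (t1, label1) in enumerate(discarded):
        for t2, label2 in discarded[i + 1:]:
            if t2 - t1 <= window_s and frozenset({label1, label2}) in GROUP_LABELS:
                return GROUP_LABELS[frozenset({label1, label2})]
    return None

print(group_label([(0.0, "diaper"), (12.0, "wipe")]))   # -> "parent with baby"
```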
- the system may use other systems to detect whether multiple objects were discarded by the same person.
- An example may include a beacon or other wireless system that detects an identifier of mobile electronic devices (such as mobile phones) that are proximate to the waste receptacle. For example, if only one electronic device is detected during a time period in which multiple items were discarded, the system may presume that the items are related and were discarded by the same individual.
- multiple media content items may have tags that are associated with the discarded item(s) label(s). If this happens, the system may use any suitable process to select content from the candidate media content items. Such processes may include applying a randomization algorithm, optionally in which weights are applied so that certain content items have a greater likelihood of being selected. For example, a content item that is most recent, or a content item for which the advertiser paid a premium, may be given greater weight.
- other selection processes may include rank order (i.e., play the item that is next in a queue), or may consider additional factors such as an amount of time since an item has been previously played, exclusionary factors (such as only playing certain content items at certain days or times), or other criteria.
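- A minimal sketch of such a weighted, rule-constrained selection is shown below; the field names, boost factors, and time windows are illustrative assumptions only.

```python
# Illustrative weighted selection among candidate media items with simple
# recency, premium, and day/time exclusion rules.
import random
import time

def pick_media(candidates, now=None):
    now = time.time() if now is None else now
    current_hour = time.localtime(now).tm_hour
    eligible, weights = [], []
    for c in candidates:
        # Exclusionary factor: only play certain content at certain hours.
        if c.get("allowed_hours") and current_hour not in c["allowed_hours"]:
            continue
        w = c.get("base_weight", 1.0)
        if now - c.get("last_played", 0.0) > 3600:   # favor items not played recently
            w *= 1.5
        if c.get("premium"):                          # advertiser paid a premium
            w *= 2.0
        eligible.append(c)
        weights.append(w)
    return random.choices(eligible, weights=weights, k=1)[0] if eligible else None
```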
- Referring now to FIG. 15, there is provided an illustration of an illustrative architecture for a computing device 1500.
- the waste receptacle(s) 1202, computing device(s) 1206, mobile computing device(s) 1210 and/or presentation device(s) 1212 of FIG. 12 is/are the same as or similar to computing device 1500.
- the discussion of computing device 1500 is sufficient for understanding the devices 1202, 1206, 1210, 1212 of FIG. 12.
- Computing device 1500 may include more or fewer components than those shown in FIG. 15. However, the components shown are sufficient to disclose an illustrative solution implementing the present solution.
- the hardware architecture of FIG. 15 represents one implementation of a representative computing device configured to implement the present solution, as described herein. As such, the computing device 1500 of FIG. 15 implements at least a portion of the method(s) described herein.
- the hardware includes, but is not limited to, one or more electronic circuits.
- the electronic circuits can include, but are not limited to, passive components (e.g., resistors and capacitors) and/or active components (e.g., amplifiers and/or microprocessors).
- the passive and/or active components can be adapted to, arranged to and/or programmed to perform one or more of the methodologies, procedures, or functions described herein.
- the computing device 1500 comprises a user interface 1502, a Central Processing Unit (CPU) 1506, a system bus 1510, a memory 1512 connected to and accessible by other portions of computing device 1500 through system bus 1510, a system interface 1560, and hardware entities 1514 connected to system bus 1510.
- the user interface can include input devices and output devices, which facilitate user-software interactions for controlling operations of the computing device 1500.
- the input devices include, but are not limited to, a physical and/or touch keyboard 1550.
- the input devices can be connected to the computing device 1500 via a wired or wireless connection (e.g., a Bluetooth® connection).
- the output devices include, but are not limited to, a speaker 1552, a display 1554, and/or light emitting diodes 1556.
- System interface 1560 is configured to facilitate wired or wireless communications to and from external devices (e.g., network nodes such as access points, etc.).
- Hardware entities 1514 perform actions involving access to and use of memory 1512, which can be a RAM, a disk drive, flash memory, a CD-ROM and/or another hardware device that is capable of storing instructions and data.
- Hardware entities 1514 can include a disk drive unit 1516 comprising a computer-readable storage medium 1518 on which is stored one or more sets of instructions 1520 (e.g., software code) configured to implement one or more of the methodologies, procedures, or functions described herein.
- the instructions 1520 can also reside, completely or at least partially, within the memory 1512 and/or within the CPU 1506 during execution thereof by the computing device 1500.
- the memory 1512 and the CPU 1506 also can constitute machine- readable media.
- machine-readable media refers to a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 1520.
- machine-readable media also refers to any medium that is capable of storing, encoding or carrying a set of instructions 1520 for execution by the computing device 1500 and that cause the computing device 1500 to perform any one or more of the methodologies of the present disclosure.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- Marketing (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Life Sciences & Earth Sciences (AREA)
- Tourism & Hospitality (AREA)
- Sustainable Development (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Processing Of Solid Wastes (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Refuse Receptacles (AREA)
- Sorting Of Articles (AREA)
- Refuse Collection And Transfer (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21849407.8A EP4188838A4 (en) | 2020-07-31 | 2021-07-30 | WASTE BIN WITH TACTILE AND INTERACTIVE PRESENTATION SYSTEM |
CN202180067164.4A CN116802129A (zh) | 2020-07-31 | 2021-07-30 | Waste container with sensing and interactive presentation system |
AU2021315687A AU2021315687A1 (en) | 2020-07-31 | 2021-07-30 | Waste receptacle with sensing and interactive presentation system |
CA3186563A CA3186563A1 (en) | 2020-07-31 | 2021-07-30 | Waste receptacle with sensing and interactive presentation system |
JP2023506576A JP2023537337A (ja) | 2020-07-31 | 2021-07-30 | Waste receptacle with sensing and interactive presentation system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063059523P | 2020-07-31 | 2020-07-31 | |
US63/059,523 | 2020-07-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022026818A1 true WO2022026818A1 (en) | 2022-02-03 |
Family
ID=80036103
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2021/043881 WO2022026818A1 (en) | 2020-07-31 | 2021-07-30 | Waste receptacle with sensing and interactive presentation system |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP4188838A4 (ja) |
JP (1) | JP2023537337A (ja) |
CN (1) | CN116802129A (ja) |
AU (1) | AU2021315687A1 (ja) |
CA (1) | CA3186563A1 (ja) |
WO (1) | WO2022026818A1 (ja) |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107792561A (zh) * | 2017-09-30 | 2018-03-13 | 深圳利万联科技有限公司 | 一种分类垃圾智能收集系统及其方法 |
US20200010270A1 (en) * | 2018-07-06 | 2020-01-09 | Garbi Inc. | Smart waste receptacle |
- 2021-07-30 WO PCT/US2021/043881 patent/WO2022026818A1/en active Application Filing
- 2021-07-30 CA CA3186563A patent/CA3186563A1/en active Pending
- 2021-07-30 AU AU2021315687A patent/AU2021315687A1/en active Pending
- 2021-07-30 EP EP21849407.8A patent/EP4188838A4/en not_active Withdrawn
- 2021-07-30 CN CN202180067164.4A patent/CN116802129A/zh active Pending
- 2021-07-30 JP JP2023506576A patent/JP2023537337A/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060042197A1 (en) * | 2004-05-03 | 2006-03-02 | Jody Langston | Apparatus, system, and method for condensing, separating and storing recyclable material |
KR20120118226A (ko) * | 2011-04-18 | 2012-10-26 | 윤원섭 | 영상 처리 기술을 적용한 쓰레기 분리 수거 장치 |
EP2948740B1 (en) * | 2013-01-28 | 2016-10-19 | Enevo OY | Sensor device for remote monitoring |
US20170178066A1 (en) * | 2015-03-06 | 2017-06-22 | Wal-Mart Stores, Inc. | Shopping facility discarded item sorting systems, devices and methods |
US20180016096A1 (en) * | 2016-07-15 | 2018-01-18 | CleanRobotics, Inc. | Automatic sorting of waste |
Non-Patent Citations (1)
Title |
---|
See also references of EP4188838A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023229538A1 (en) * | 2022-05-27 | 2023-11-30 | National University Of Singapore | Waste management system |
Also Published As
Publication number | Publication date |
---|---|
EP4188838A4 (en) | 2024-08-21 |
CN116802129A (zh) | 2023-09-22 |
AU2021315687A1 (en) | 2023-02-16 |
JP2023537337A (ja) | 2023-08-31 |
CA3186563A1 (en) | 2022-02-03 |
EP4188838A1 (en) | 2023-06-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210354911A1 (en) | Waste receptacle with sensing and interactive presentation system | |
US20210371196A1 (en) | Automatic sorting of waste | |
US20230136451A1 (en) | Systems and methods for waste item detection and recognition | |
US20220005002A1 (en) | Closed Loop Recycling Process and System | |
Vanderroost et al. | The digitization of a food package’s life cycle: Existing and emerging computer systems in the logistics and post-logistics phase | |
CN110155572A (zh) | 一种智慧社区垃圾分类系统及方法 | |
KR20130140285A (ko) | 모바일 단말기를 통한 재활용품 수거 방법 및 그 시스템 | |
Lugo et al. | Approaching a zero-waste strategy by reuse in New York City: Challenges and potential | |
JP7384483B2 (ja) | ペットボトル減容装置及びペットボトル減容方法 | |
AU2021315687A1 (en) | Waste receptacle with sensing and interactive presentation system | |
US20240232821A1 (en) | Systems and methods for item management | |
US20230281575A1 (en) | Systems and methods for enhancing waste disposal and energy efficiency using sensor and alternative power technologies | |
JP2019003429A (ja) | 物品回収システム及びプログラム | |
Zoumpoulis et al. | Smart bins for enhanced resource recovery and sustainable urban waste practices in smart cities: A systematic literature review | |
US20230174308A1 (en) | Apparatus, system, and method for a drive-through grocery service | |
Lam et al. | Using artificial intelligence and IoT for constructing a smart trash bin | |
Waltner et al. | An intelligent scanning vehicle for waste collection monitoring | |
US20220097958A1 (en) | Waste management based on context conditions | |
KR20230064680A (ko) | 개인별 재활용 폐기물 배출에 따른 보상 지급 시스템 | |
Ríos-Zapata et al. | Can the attributes of a waste bin improve recycling? A literature review for sensors and actuators to define product design objectives | |
Sarafraz et al. | Towards Eliminating Recycling Confusion: Mixed Plastics and Electronics Case Study | |
Senturk et al. | A Low-Cost Intelligent Metal Recycling Machine | |
Krishnaprasad et al. | Geo Mapping of Waste: A Bottom-Up Approach using Android and SVR. | |
GB2627923A (en) | Waste sorting device | |
Colaljo et al. | Development of an Image-Based Reverse Vending Machine Using Raspberry Pi |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21849407; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 3186563; Country of ref document: CA |
ENP | Entry into the national phase | Ref document number: 2023506576; Country of ref document: JP; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 2021315687; Country of ref document: AU; Date of ref document: 20210730; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 2021849407; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 2021849407; Country of ref document: EP; Effective date: 20230228 |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 202180067164.4; Country of ref document: CN |