CA3123155A1 - Food waste detection method and system - Google Patents

Food waste detection method and system

Info

Publication number
CA3123155A1
Authority
CA
Canada
Prior art keywords
related products
food related
cameras
scale
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CA3123155A
Other languages
French (fr)
Inventor
Bart VAN ARNHEM
Olaf Egbert VAN DER VEEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wastiq BV
Original Assignee
Wastiq BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Wastiq BV
Publication of CA3123155A1
Legal status: Pending

Classifications

    • G06V 20/00 - Image or video recognition or understanding: scenes; scene-specific elements
    • B65F 1/14 - Refuse receptacles: other constructional features; accessories
    • G01B 11/22 - Measuring arrangements characterised by the use of optical techniques, for measuring depth
    • G01B 17/00 - Measuring arrangements characterised by the use of infrasonic, sonic or ultrasonic vibrations
    • G01G 19/52 - Weighing apparatus combined with other objects, e.g. furniture
    • G06F 18/41 - Pattern recognition: interactive pattern learning with a human teacher
    • G06Q 10/30 - Administration of product recycling or disposal
    • G06T 7/62 - Image analysis: analysis of geometric attributes of area, perimeter, diameter or volume
    • G08B 7/06 - Combined signalling systems using electric transmission, e.g. audible and visible signalling
    • H04N 13/204 - Stereoscopic video systems: image signal generators using stereoscopic image cameras
    • H04N 7/188 - Closed-circuit television [CCTV] systems: capturing isolated or intermittent images triggered by the occurrence of a predetermined event
    • B65F 2210/138 - Equipment of refuse receptacles: identification means
    • B65F 2210/168 - Equipment of refuse receptacles: sensing means
    • B65F 2210/184 - Equipment of refuse receptacles: weighing means
    • G06T 2207/10004 - Image acquisition modality: still image; photographic image
    • G06T 2207/10012 - Image acquisition modality: stereo images
    • G06V 20/68 - Scene-specific elements: food, e.g. fruit or vegetables
    • H04N 2013/0074 - Stereoscopic image analysis
    • Y02W 90/00 - Climate change mitigation technologies related to waste management: enabling technologies with a potential or indirect contribution to greenhouse gas [GHG] emissions mitigation


Abstract

A system (1) for detecting food related products (2) before being thrown away, the system comprising: one or more cameras (11); a display unit (12); a computing device (13) that is communicatively connected to the cameras and the display unit; and a scale (3) that is communicatively connected to the computing device, the scale holding a trash bin (31), wherein the cameras obtain an image or a video of the products when the products are within a field of view of the cameras and before the products are in the trash bin, the scale is configured to weigh the products in the trash bin, and wherein the computing device obtains information about the products from the obtained image or video by applying an image recognition algorithm, receives the weight from the scale, and generates and outputs data on the display unit, the data being based on the information about the products and the weight.

Description

FOOD WASTE DETECTION METHOD AND SYSTEM
TECHNICAL FIELD
[0001] The present invention relates to a system and a method for detecting food related products, and to a display unit for use in the system.
BACKGROUND ART
[0002] Venues that work with food are often faced with food waste, having to throw away food that has passed its expiration date or is left over after consumption or preparation. An example of such a venue is a restaurant, where food waste may be generated by customers leaving food on their plates, in the kitchen by leftovers after preparing dinners, or in the inventory by food passing its expiry date.
[0003] There is a need to reduce food waste. Insight into food waste may be used by a restaurant, for example, to optimize planning, proportioning and inventory management, resulting in more efficient purchasing of food and more environmentally friendly operation. Other examples of venues that may benefit from insight into food waste are caterers, the catering industry, hospitals, healthcare institutions, and generally any venue involved in food preparation.
SUMMARY
[0004] According to an aspect of the invention, a system is proposed for detecting food related products before being thrown away. The system can comprise one or more cameras.
The system can further comprise a display unit. The system can further comprise a computing device that is communicatively connected to the one or more cameras and the display unit.
The system can further comprise a scale that is communicatively connected to the computing device. The scale can be configured to hold a trash bin. The scale can be separable from the trash bin, e.g. by simply placing any trash bin on the scale. The scale can be integrated in the trash bin. The trash bin may be a recycle bin. The one or more cameras can be configured to obtain an image or a video of the food related products when the food related products are within a field of view of the one or more cameras and before the food related products are in the trash bin. Advantageously, this enables food left-overs to be detected before being intermixed with other food waste in the trash bin. The scale can be configured to obtain weight information of the food related products when the food related products are in the trash bin. The computing device can be configured to obtain information about the food related products from the obtained image or video by applying an image recognition algorithm. This image recognition algorithm can run locally on the computing device or remote on a remote server to which the computing device may be communicatively connected. The computing device can be configured to receive the weight information from the scale. The computing device can be configured to generate and output data on the display unit, wherein the data is based on the information about the food related products and the weight information.
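By way of illustration only, and not forming part of the claimed subject matter, the following Python sketch shows one way the described components could interact; all class and method names (FoodWasteSystem, capture, read_weight_change, and so on) are hypothetical.

```python
# Minimal sketch of the proposed system wiring; all names are hypothetical
# illustrations, not part of the patent disclosure.
from dataclasses import dataclass

@dataclass
class DetectionResult:
    labels: list[str]    # food related products recognized in the image
    ratios: list[float]  # relative share of each product (sums to ~1.0)

class FoodWasteSystem:
    def __init__(self, camera, scale, display, recognizer):
        self.camera = camera          # obtains images before disposal
        self.scale = scale            # weighs products once in the trash bin
        self.display = display        # shows the generated output data
        self.recognizer = recognizer  # image recognition, local or remote

    def register_disposal(self):
        image = self.camera.capture()               # product in field of view
        result = self.recognizer.detect(image)      # apply recognition algorithm
        weight_g = self.scale.read_weight_change()  # weight once in the bin
        # The output data is based on both the recognition result and the weight.
        self.display.show(labels=result.labels,
                          weights=[r * weight_g for r in result.ratios])
```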
[0005] The food related products are typically food leftovers but can also include other objects that are to be thrown away such as plastics, paper, napkins, cardboard and (disposable) cutlery. The food related products may include a bin, plate or other tray item on which the disposables are placed, which may be detected together with the disposables and input to the image recognition algorithm to improve the detection of the food left-overs or other disposables.
[0006] In an embodiment the computing device can be communicatively connected to a remote server. The computing device can be configured to transmit the obtained image or video to the remote server for applying the image recognition algorithm. The computing device can be configured to receive the information about the food related products from the remote server. The remote server can be implemented as a cloud computing server or cloud computing service.
[0007] In an embodiment the computing device can be further configured to store one or more of the information about the food related products, the weight information, the output data, and a time stamp in a data storage of the remote server. This enables food waste to be analyzed or mapped over time. This also enables recommendations to be generated regarding minimizing the food waste as detected over time.
[0008] In an embodiment the computing device can be configured to present one or more questions on the display unit about one or more objects in the obtained image or video in case the image recognition algorithm is unable to identify one or more of the food related products from the image or the video. The display unit can comprise a user interface, preferably in the form of a touch screen interface, for receiving user input in response to the one or more questions. The response can be used by the image recognition algorithm to improve detection of the one or more objects.
[0009] In an embodiment the one or more cameras can be configured to automatically obtain the image or the video when the food related products are within the field of view of the one or more cameras or depth sensors, at a substantially fixed position for a dynamic minimal amount of time necessary for successful detection. The user can be provided audiovisual feedback upon successful ingredient detection. The fixed position can be any position within the field of view and is typically defined by the location at which a user holds the food related products under the one or more cameras before throwing it into the trash bin.
[0010] In an embodiment the output data can comprise a ratio of different food related products. The different food related products can be detected by the image recognition algorithm. The ratio can be based on the weight information. Thus, by combining the weight information and the image recognition algorithm, the ratio of the different food related products as presented to the camera and the scale can be obtained.
[0011] In an embodiment the one or more cameras can comprise a stereoscopic imaging camera for obtaining 3D information about the food related products from the image or the video.
[0012] In an embodiment the image recognition algorithm can be configured to obtain volumetric information from the 3D information. The computing device can be configured to obtain a weight estimation of the food related products based on the volumetric information.
The stereoscopic camera can replace the scale. The weight estimation can be used instead of the weight information. Thus, the system can be realized without a scale when using a stereoscopic camera.
[0013] In an embodiment the one or more cameras can comprise a hyperspectral imaging camera for obtaining substance information about the food related products from the image or the video. Non-limiting examples of substance information are levels of fat, protein and sugar in food left-overs.
[0014] In an embodiment the system can further comprise a depth sensor, for example an ultrasonic depth sensor or laser-based depth sensor, for detecting when the food related products are within the field of view of the one or more cameras. The depth sensor may be used in conjunction with the one or more cameras or stand alone to detect when the food related products are within a field of view of the one or more cameras to thereby trigger the one or more cameras to obtain the image or a video of the food related products. The depth sensor is typically located next to the one or more cameras.
[0015] In an embodiment the field of view can be located in an area around a line of sight from the one or more cameras in a substantially downwards direction.
[0016] In an embodiment the display unit can comprise a housing for accommodating the one or more cameras. The housing can comprise an outer surface side that is placed at an angle from a horizontal plane. The cameras can be located within the housing at the outer surface side, resulting in the line of sight being vertically angled at the angle. The line of sight is perpendicular to the outer surface side. The angle can be in a range of 15 to 45 degrees, preferably in a range of 15 to 30 degrees, more preferably in a range of 15 to 25 degrees.
[0017] In an embodiment the housing of the display unit can further comprise the computing device.
[0018] In an embodiment the housing can further comprise a visual indicator indicating where the food related products are to be presented to the one or more cameras.
[0019] In an embodiment the visual indicator can change its color when the food related products have been registered by the one or more cameras.
[0020] In an embodiment the housing can further comprise an audible indicator providing audible feedback.
[0021] In an embodiment the audible indicator can produce a sound when the food related products have been registered by the one or more cameras.
[0022] In an embodiment the scale can comprise at least one sloped side wall allowing the trash bin to be rolled on and off the scale.
[0023] In an embodiment the sloped side wall can form an integral part with a top part of the scale.
[0025] According to an aspect of the invention, a display unit in a housing is proposed, the housing further comprising one or more cameras and a computing device, for use in a system having one or more of the above described features.
[0026] According to an aspect of the invention, a method is proposed for detecting food related products before being thrown away. The method can comprise obtaining an image or a video of the food related products using one or more cameras when the food related products are within a field of view of the one or more cameras and before the food related products are thrown in a trash bin. The method can further comprise obtaining weight information of the food related products using a scale when the food related products are in the trash bin, wherein the scale is configured to hold the trash bin. The method can further comprise obtaining information in a computing device about the food related products from the obtained image or video by applying an image recognition algorithm. The method can further comprise generating and outputting data by the computing device on the display unit, wherein the data can be based on the information about the food related products and the weight information.
[0027] In an embodiment the method can further comprise transmitting the obtained image or video from the computing device to the remote server for applying the image recognition algorithm. The method can further comprise receiving the information about the food related products from the remote server in the computing device.
[0028] In an embodiment the method can further comprise presenting one or more questions on the display unit about one or more objects in the obtained image or video in case the image recognition algorithm is unable to identify one or more of the food related products from the image or the video. The method can further comprise receiving user input from a user interface of the display unit in response to the one or more questions, the response for use by the image recognition algorithm to improve detection of the one or more objects.
[0029] Hereinafter, embodiments will be described in further detail. It should be appreciated, however, that these embodiments are not to be construed as limiting the scope of protection for the present disclosure.
BRIEF DESCRIPTION OF DRAWINGS
[0030] Embodiments will now be described, by way of example only, with reference to the accompanying schematic drawings in which corresponding reference symbols indicate corresponding parts, and in which:
[0031] FIG. 1 shows a system of an exemplary embodiment of the invention;
[0032] FIG. 2 shows a schematic side view of a housing and camera of an exemplary embodiment of the invention;
[0033] FIG. 3 shows an elevated side view of a display unit and camera in a housing of an exemplary embodiment of the invention;
[0034] FIG. 4 shows a block diagram of steps of a method of an exemplary embodiment of the invention;
[0035] FIG. 5 shows another exemplary embodiment of a display unit in a housing;
[0036] FIG. 6A shows an elevated side view of an exemplary scale; and
[0037] FIG. 6B shows a side view of an exemplary scale.
[0038] The figures are meant for illustrative purposes only, and do not serve as restriction of the scope or the protection as laid down by the claims.
DESCRIPTION OF EMBODIMENTS
[0039] FIG. 1 shows an exemplary system 1 for detecting food related products 2 before being thrown away in a trash bin 31. The system 1 preferably includes a housing 100 that includes one or more cameras 11, a display unit 12, a computing device 13 and a communications module 14. Although less preferred, it is possible to have the one or more cameras 11 separated from the housing 100 and/or the computing device 13 separated from the housing 100. The system 1 may include a scale 3 that is configured for holding a trash bin 31 and weighing the food related products 2 when placed in the trash bin 31. The scale 3 may be integrated within the trash bin 31. Preferably, the trash bin 31 is a recycle bin allowing the food related products 2 to be recycled after being thrown away.
The scale 3 may include a communications module 32 for communicating with the computing device 13, typically via the communications module 14.
[0040] The display unit 12 is typically capable of presenting full color bitmap images representative of food detected amongst the food related products 2. The display unit 12 may be configured to display a graphical user interface, for example in the form of selectable button objects or any other user interface elements selectable through a touch screen interface of the display unit 12. The computing device 13 may be any suitable CPU, GPU and/or NPU based computer, for example in the form of a Raspberry Pi™ computer. Preferably the computing device 13 is a small form factor or single board computer to minimize the size requirements of the computing device 13. The communications module 14 may be integrated with the computing device 13. The communications module 14 may be any suitable wireless or wired communications module. The communications module 14 may include multiple different communication interfaces, for example a Bluetooth™ interface for short range communication with the communications module 32 of the scale 3 and a Wi-Fi or LAN interface for communication with a remote server 4. In the example of a Wi-Fi or LAN interface, the communication typically further involves a router (not shown) for connecting to the Internet 5, a local area network or any other suitable network.
[0041] In an exemplary embodiment the housing 100 and the scale 3 may be connected by a pole or other vertically aligned support structure for fixing the housing 100 at a vertical distance from the scale 3. This allows the housing 100 and the scale 3 to be moved around or placed at a desired location as a single unit. The vertically aligned support structure may be used to guide or accommodate electrical cabling and/or data cables for electrical or data/signal connections between the scale 3 and components in the housing 100.
[0042] The remote server 4 typically includes a data storage 41 and a communications module 42. The remote server 4 may be a stand-alone server, implemented as cloud computing server, implemented as a cloud computing service, or any other computer system.
The data network 5 may be a local area network, a wide area network, the Internet, or any other suitable network.
[0043] FIG. 2 shows an exemplary embodiment of a camera setup in a housing 100. In FIG. 2 a side view of the housing 100 including a camera 11 is shown. The housing 100 may be attached to a wall 6. In FIG. 2 the housing 100 has an outer surface side 102 at the underside of the housing 100. The outer surface side 102 may be placed at an angle α from a horizontal plane 103; thus the underside of the housing 100 may be placed at the angle α. The camera 11 may be installed in the outer surface side 102. As the line of sight 101 from the camera is typically in a direction perpendicular to the outer surface side 102, the line of sight 101 may thus be vertically angled at the angle α. Particularly when the housing 100 is attached to the wall 6, the angled line of sight 101 may improve the field of view by avoiding coverage of a large part of the wall 6, where no food related products 2 can be presented under the camera 11. Furthermore, the angled line of sight 101 enables a slight view of the products from the side, thereby improving the detectability of the food related products 2.
[0044] FIG. 3 shows an exemplary display unit 12 that may be installed in a housing 100. FIG. 3 further shows a camera 11 that may be installed in the housing 100. In the example of FIG. 3 the underside of the housing where the camera 11 is installed, is placed under an angle to allow the line of sight of camera 11 to be vertically angled, as explained with FIG. 2.
[0045] In an embodiment, food related products - preferably everything - that end up in the trash bin 31 may be first captured by a smart camera 11 above the trash bin 31 and then captured by a digital scale 3 located underneath the bin 31. When something is moved within the field of view of the camera, the camera may automatically take a picture or shoot a video as soon as it detects that an object is fully within view and is kept stable (stopped moving) at a fixed location within the field of view. The object may include a plate or a container onto which the food related products 2 are located. The object may be a human hand holding the food related products 2. The user can be provided audiovisual feedback upon successful ingredient detection. The captured image may be sent to the cloud 4, where an ingredient detection may be performed by an image recognition algorithm. Alternatively, the image recognition algorithm may be performed locally, for example in the computing device 13. The model used by the image recognition algorithm may detect/recognize one or more of the food related products 2 in the image and may send the results back to the computing device 13 for local feedback on the display unit 12. When the waste 2 is thrown into the bin 31, the digital scale 3 may capture the weight and send this information to the computing device 13 and/or the cloud 4. The weight and image processing result may be linked to each other in the ratio at which the ingredients were recognized, and the thus obtained results may be sent back to the computing device, where the results may be displayed on the display unit 12. The results may also be stored in a data storage 41, such as a cloud database. Preferably, the results are stored together with a time stamp or any other indication of a date and/or time. This data may then be used at any point in time to generate dashboards that show actionable results.
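As a minimal sketch of the linking step described above, the following Python snippet shows how one registration record might be assembled before being stored in the cloud data storage 41; the field names and example values are illustrative assumptions, not a disclosed schema.

```python
# Hedged sketch: one registration record as it might be stored for later
# dashboarding; the schema below is illustrative, not the disclosed format.
from datetime import datetime, timezone

def make_record(labels, ratios, weight_g):
    # Link the weighed mass to the recognized ingredients in the ratio
    # at which they were detected in the image.
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "total_weight_g": weight_g,
        "ingredients": [
            {"name": n, "ratio": r, "weight_g": r * weight_g}
            for n, r in zip(labels, ratios)
        ],
    }

record = make_record(["rice", "chicken"], [0.7, 0.3], 420.0)
# e.g. store `record` in the cloud data storage 41 for dashboard queries
```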
[0046] The possible items that are captured in the field of view of the camera are limited and therefore relatively easy to detect per waste stream. We currently focus on food waste and have identified four different types of food waste streams in a restaurant. In practice, however, we see that not all restaurants split their waste, with the result that the tool also detects other types of waste, for example plastics, paper, cardboard and cutlery. This gives our detection model an opportunity that goes beyond solely food waste.
[0047] Different types of food waste streams may be captured by the system 1. Examples hereof are: (i) expired products from an inventory; (ii) processed kitchen waste, which may be detectably presented to the camera 11 on metal containers, in bins or pans or big plastic containers; (iii) cutting waste, which may be detectably presented to the camera 11 on cutting plates or bigger plastic or metal bins; and (iv) waste in a restaurant, which may be detectably presented to the camera 11 on smaller plates or bowls (e.g. porcelain).
[0048] The state or condition of the food related products 2 may be detected in the image as well and may be used in the analysis of the waste stream. For example, in a restaurant environment, the following detection criteria may be used: (i) expired products from an inventory may be untouched and therefore easier to detect; (ii) processed food from a kitchen may be finely chopped, mashed or may include liquid side dishes; (iii) cutting waste from the kitchen may include inedible peels and bones; and (iv) plate waste from the restaurant may include left-overs from plates or parts of products that may be more difficult to detect because they are partly eaten and mixed. The system 1 may be capable of detecting the state or condition to further improve the detection of the food related products 2 and/or to generate recommendations about minimizing food waste.
[0049] In an exemplary embodiment the camera 11 may be placed approximately 50 cm above the bin 31 or another base platform. In another exemplary embodiment the camera 11 may be placed approximately 70 cm above the bin 31 or another base platform. Registration of the food related products 2 may take place between this platform and the camera 11, for example at 40 cm under the camera 11, which may be detected by a depth sensor. The depth sensor may thus be used to trigger the camera 11 to start the registration of the food related products 2. As shown in FIG. 3, the camera 11 may be placed at a slight angle α to the surface 102 on which the camera 11 may be mounted. For example, with reference to FIG. 2, the angle α may be any angle between 15 and 45 degrees from a horizontal plane 103 to give the camera 11 a better perspective, both by taking an image that is unobstructed by the surface on which the unit is mounted and by getting a slight view of the products from the side.
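Purely as an illustration of the depth-sensor trigger described above, the sketch below polls a distance reading and fires the camera once an object is held steady near the 40 cm registration distance; the sensor and camera interfaces, band and thresholds are hypothetical.

```python
# Illustrative trigger loop; `depth_sensor` and `camera` are hypothetical
# interfaces. Thresholds follow the ~40 cm registration example in the text.
import time

REGISTRATION_BAND_CM = (30.0, 50.0)  # band around the 40 cm registration distance
STABLE_READINGS = 5                  # object must be held still briefly

def wait_and_capture(depth_sensor, camera, tolerance_cm=1.0):
    history = []
    while True:
        d = depth_sensor.read_cm()   # distance below the camera in cm
        if REGISTRATION_BAND_CM[0] <= d <= REGISTRATION_BAND_CM[1]:
            history = (history + [d])[-STABLE_READINGS:]
        else:
            history = []             # object left the band: start over
        if (len(history) == STABLE_READINGS
                and max(history) - min(history) <= tolerance_cm):
            return camera.capture()  # object held at a fixed position
        time.sleep(0.05)
```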
[0050] In an embodiment, the camera 11 may be completely detached from the display unit 12 so as to make the system more suited when space is a limitation. This may also enable the camera 11 to be placed so as to provide an optimal perspective of the food waste that will be registered.
[0051] For the detection of ingredients in the food related products 2, the classification of dishes and the waste stream origin (e.g. kitchen, restaurant), computer vision technology may be used. At the core of such computer vision technology are neural networks and deep learning. This is a so-called semi-supervised machine learning approach. The terminology "supervised" means that the image recognition algorithm is typically trained to incrementally become better at the tasks it should perform (the detection). The training is typically done by giving the computer a lot of examples in which a human - through a user interface such as a graphical user interface - has manually labeled the ingredients, the type of dish and the type of waste stream.
[0052] There are different types of image recognition strategies that may be applied. The most used strategies are: (i) classification, to classify and assign a label to an image as a whole; (ii) detection, to detect and label possibly multiple objects within an image; and/or (iii) segmentation, a fine-grained approach where each individual pixel of an image may be assigned a label.
[0053] For dish and waste stream classification, the first two types of image recognition strategies are most suitable, i.e. classification and detection. For ingredient detection, the third strategy is most suitable, i.e. the more powerful segmentation strategy. With the segmentation strategy, a per-pixel labeling may be used to compute the ratio in which ingredients occur within the image. The ratio may be used to improve the weight estimate that may be assigned to each individual ingredient. The input images used to train the ingredient detection may require a lot of detail, meaning that for each pixel or group of pixels the name of the ingredient may be assigned.
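A minimal sketch of this per-pixel ratio computation, assuming a segmentation mask of integer class indices (NumPy); the class names and the 200 g weighing are invented for the example:

```python
# Sketch: derive ingredient ratios from a segmentation mask and split a
# weighed mass accordingly. Class names and weights are example values.
import numpy as np

def ingredient_ratios(mask, class_names, background_idx=0):
    counts = np.bincount(mask.ravel(), minlength=len(class_names))
    counts[background_idx] = 0              # ignore non-food pixels
    total = counts.sum()
    return {class_names[i]: counts[i] / total
            for i in range(len(class_names)) if counts[i] > 0}

names = ["background", "potato", "carrot"]
mask = np.array([[0, 1, 1],
                 [0, 1, 2]])                # toy 2x3 segmentation result
ratios = ingredient_ratios(mask, names)     # {'potato': 0.75, 'carrot': 0.25}
weights = {k: v * 200.0 for k, v in ratios.items()}  # split a 200 g weighing
```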
[0054] Once trained, the model may be used to independently recognize ingredients in new images captured by the camera 11 and fed into the image recognition algorithm. Hence the term "semi"-supervised is applicable: as soon as the model is trained, it may be used to automatically recognize ingredients in images without any required manual actions.
[0055] Additional domain knowledge may be used - such as the physical position of the camera 11, housing 100 and/or scale 3 within a venue such as a restaurant, and/or the menu of the restaurant in question - to improve accuracy by limiting the scope in which the detection algorithm has to operate. For example, the physical position may be used to determine that only certain waste streams will be recorded by the particular system 1, and the menu may be used to limit the variety in ingredients the system 1 may encounter.
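To illustrate how such domain knowledge could limit the detection scope, the sketch below filters and renormalizes a model's predictions against a menu vocabulary; the function name, scores and menu are hypothetical examples, not a disclosed algorithm.

```python
# Illustrative use of domain knowledge: restrict the label space to the
# venue's menu. Function name, scores and menu are invented examples.
def filter_by_menu(predictions, menu_vocabulary, min_score=0.2):
    """predictions: list of (label, score) pairs from the detection model."""
    allowed = [(label, score) for label, score in predictions
               if label in menu_vocabulary and score >= min_score]
    # Renormalize the remaining scores so downstream ratio logic still works.
    total = sum(score for _, score in allowed) or 1.0
    return [(label, score / total) for label, score in allowed]

menu = {"tomato soup", "fries", "salmon"}
preds = [("fries", 0.5), ("mashed potato", 0.3), ("salmon", 0.2)]
print(filter_by_menu(preds, menu))  # "mashed potato" dropped, rest renormalized
```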
[0056] To improve the quality and accuracy of the detection algorithms, the one or more cameras 11 may include a stereoscopic camera. A stereoscopic camera is capable of obtaining a 3D depth image that may provide more information on the volume of food that is thrown away and help improve the weight estimate. Compared to a single top-down camera, which may have a problem of occlusion where certain ingredients may be invisible to the camera when covered by other materials, the stereoscopic camera may use two slightly differently angled cameras to provide a better view. Computer vision techniques such as Siamese neural networks can use stereoscopic images as input and may be used to better detect the food related products 2 when presented to a stereoscopic camera.
[0057] A stereoscopic camera may be used to obtain volumetric information of the food related products 2. Together with the identification of the food itself, the volumetric information may provide an indication of the weight of the detected food. The stereoscopic camera may then be used instead of the scale 3, in which case the system 1 does not need to include the scale 3.
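The following sketch illustrates one way such a volumetric weight estimate could be computed, under the simplifying assumptions that the depth camera looks straight down, that an empty reference depth map is available, and that a per-ingredient density table exists; the densities, names and values are placeholders.

```python
# Hedged sketch of a volumetric weight estimate: integrate the height of the
# food above an empty reference surface and multiply by an assumed density.
import numpy as np

DENSITY_G_PER_CM3 = {"rice": 0.80, "soup": 1.00, "bread": 0.25}  # placeholders

def estimate_weight_g(depth_empty_cm, depth_food_cm, pixel_area_cm2, ingredient):
    height_cm = np.clip(depth_empty_cm - depth_food_cm, 0.0, None)
    volume_cm3 = float(height_cm.sum()) * pixel_area_cm2
    return volume_cm3 * DENSITY_G_PER_CM3[ingredient]

empty = np.full((4, 4), 50.0)   # empty scene: 50 cm from camera to surface
food = np.full((4, 4), 48.0)    # food raises the surface by 2 cm everywhere
print(estimate_weight_g(empty, food, pixel_area_cm2=1.0, ingredient="rice"))
# 32 cm3 of rice at 0.80 g/cm3 -> 25.6 g
```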
[0058] To improve the quality and accuracy of the detection algorithms, the one or more cameras 11 may include a hyperspectral camera. A hyperspectral camera is capable of obtaining a spectrum per pixel, resulting in a lot more information than a standard RGB camera. This information may be used to detect, for example, levels of fat, protein and sugar, and may simplify and improve the quality of ingredient detection.
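Speculatively, per-pixel spectra could be regressed onto such substance levels with a calibrated linear model; in the sketch below both the spectral cube and the regression weights are random placeholders, included only to show the shape of the computation.

```python
# Speculative sketch: map per-pixel spectra to fat/protein/sugar levels with
# a linear model. Both the data cube and the weights are random placeholders;
# a real system would require a calibrated model.
import numpy as np

rng = np.random.default_rng(0)
cube = rng.random((64, 64, 30))         # H x W x spectral bands (toy data)
W = rng.random((30, 3)) * 0.1           # placeholder band -> substance weights

substances = cube.reshape(-1, 30) @ W   # per-pixel [fat, protein, sugar]
mean_levels = substances.mean(axis=0)   # scene-average estimate
print(dict(zip(["fat", "protein", "sugar"], mean_levels.round(3))))
```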
[0059] The weight of what ends up in the bin 31 may be registered to the respective food waste picture, possibly in the ratio in which the ingredients are detected in the image. This process may be performed in a short time frame, e.g. within seconds, after the picture is taken, and the image and weight data may be sent together to the remote server 4.
[0060] In case the image recognition algorithm cannot determine/detect the food related products 2 from the image, the image may be sent to the remote server 4 or to another remote server or cloud for redetection with the most up-to-date detection model, which may result in a multiple-choice set of images being sent to the display device 12 as a feedback screen. The end user may then select one or multiple images in the feedback screen to identify the product(s). If (parts of) the image are not among the multiple-choice options, the user may be offered the option to provide further feedback, for example in the form of a selectable "explain" button to write down which product(s) were not detected. This feedback from the user may be directly added to the detection model of the image recognition algorithm.
[0061] Feedback from the end user may be provided in various manners. For example, the display device 12 may include a touch screen interface for providing the feedback. Alternatively or additionally, a speech recognition interface may be installed in the housing 100, allowing the end user to interact with the system. Alternatively or additionally, one or more buttons on the housing 100 may enable the end user to provide feedback to the system.
[0062] The system 1 may be used in various use cases. Non-limiting examples of use cases are: (i) disposal during cleanup by staff in a kitchen or restaurant; (ii) discarding during cleanup by a guest in a kitchen or restaurant; (iii) assembly line detection, where the food related products 2 are presented to the one or more cameras 11 without human involvement, for example in self-service restaurants; and (iv) detection of food related products in tray carts, where multiple trays are collected before throwing away leftovers and trash and cleaning the trays and cutlery, in self-service restaurants and health care institutions.
[0063] Typically, the scale 3 and trash bin 31 will be located underneath the camera 11, but it is possible to place the scale 3 and trash bin 31 at another location. Preferably, the scale 3 and trash bin 31 are located in the vicinity of the camera 11 to ease the handling of the waste from holding it underneath the camera to throwing it away in the bin.
[0064] The system 1 may be used to stimulate waste reduction by sharing performance data, possibly anonymized, between neighboring and/or peer restaurants for comparison.
[0065] The system may alternatively or additionally detect other types of waste, besides food related products 2. Examples hereof are plastics, paper, cardboard and cutlery.
[0066] The system 1 may be integrated with 3rd party vendors providing for example stock management solutions or point of sale solutions.
[0067] FIG. 4 shows an exemplary block diagram of steps that may be performed by parts of a system 1 as shown in FIG. 1. Comparing FIG. 4 with FIG. 1, the smart camera may be similar to the one or more cameras 11, the smart scale may be similar to the scale 3, the touchscreen terminal may be similar to the display unit 12, and the cloud storage may be similar to the data storage 41. Starting from the smart camera, in FIG. 4 the camera may automatically take a picture of a container or plate that is in view of the camera or depth sensor. Direct intermediate image feedback may be transmitted to the touch screen terminal, where the obtained image may be presented as a full color bitmap image. The picture that has been taken may be input to a computer vision program, which may be running on a local computing device 13 and/or in a remote server 4 such as a cloud. The computer vision program may apply the image recognition algorithm to the image to obtain detection results. After the picture has been taken, the waste may be thrown in the bin. Starting from the smart scale, the scale may then detect a weight change from waste thrown in the bin. The weight change may be indicative of the weight of the last waste thrown in the bin. Direct intermediate weight feedback may be transmitted to the touch screen terminal, where the obtained weight information may be presented. The detection results and the weight information may be matched, and a final registration feedback indicative of the waste that has been thrown away may be presented on the touch screen terminal. The detection results and weight information may additionally or alternatively be stored in the cloud storage, from where statistical information may be generated and actionable insights may be presented, for example on a dashboard output on a website or in a dashboard application. The touchscreen terminal may be used to obtain reinforcement learning feedback provided by the user, which may be fed into the computer vision program to improve the detectability of the waste.
[0068] FIG. 5 shows an exemplary display unit 12 in a housing 100, which is a non-limiting alternative to the display unit and housing shown in FIG. 3. In this example the camera 11 may be located at a higher location than the display 12. The housing 100 may further include one or more light sources 15 for illuminating the food related products 2, enabling better detection by the camera 11. The other two circles shown at the top of the housing indicate the possibility of having further sensors, cameras and/or further light sources. These further sensors, cameras and/or light sources may differ in functionality from the camera 11 and light source 15, e.g. a depth sensor or infrared light source.
[0069] The housing 100 may include a visual indicator 16 indicating the height at which the food related products are to be presented to the camera before being thrown in the trash bin 31. This helps the user place the products at an optimal position with respect to the camera, so that the system can detect what will be thrown in the bin. The visual indicator 16 may be implemented in various manners, for example as an illuminated LED strip. When using a light emitting visual indicator, the color of the light may be changed to indicate a status. For example, the color of the visual indicator 16 may turn green when the products have been registered by the camera, thus indicating that the products may be thrown into the bin.
[0070] The housing 100 may include an audible indicator 17 for providing feedback to the user, e.g. to indicate that the products have been registered by the camera thus indicating that the products may be thrown into the bin.
[0071] FIGs. 6A and 6B show an exemplary scale 3, which may be used in the system of FIG. 1. The weighing scale 3 of FIGs. 6A and 6B is designed to be as low as physically possible to allow more space for the bin 31 and more space between the bin 31 and the camera 11. Moreover, the scale 3 of FIGs. 6A and 6B includes at least one - e.g. two - ramps 33, i.e. sloped side walls, to allow a trash bin to be rolled on and off the scale 3. The ramps 33 may form a single piece of material with the top part of the scale, i.e. be part of the moving part of the scale 3 when measuring weight.
[0072] One or more embodiments may be implemented as a computer program product for use with a computer system. The program(s) of the program product may define functions of the embodiments (including the methods described herein) and can be contained on a variety of computer-readable storage media. The computer-readable storage media may be non-transitory storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information may be permanently stored; and (ii) writable storage media, e.g., hard disk drive or any type of solid-state random-access semiconductor memory, flash memory, on which alterable information may be stored.

Claims (24)

1. A system (1) for detecting food related products (2) before being thrown away, the system comprising:
one or more cameras (11);
a display unit (12);
a computing device (13) that is communicatively connected to the one or more cameras and the display unit; and
a scale (3) that is communicatively connected to the computing device,
wherein the scale is configured to hold a trash bin (31),
wherein the one or more cameras are configured to obtain an image or a video of the food related products when the food related products are within a field of view of the one or more cameras and before the food related products are in the trash bin,
wherein the scale is configured to obtain weight information of the food related products when the food related products are in the trash bin, and
wherein the computing device is configured to:
obtain information about the food related products from the obtained image or video by applying an image recognition algorithm;
receive the weight information from the scale; and
generate and output data on the display unit, wherein the data is based on the information about the food related products and the weight information.
2. The system according to claim 1, wherein the computing device is communicatively connected to a remote server (4), and wherein the computing device is configured to:
transmit the obtained image or video to the remote server for applying the image recognition algorithm; and
receive the information about the food related products from the remote server.
3. The system according to claim 2, wherein the computing device is further configured to store one or more of the information about the food related products, the weight information, the output data, and a time stamp in a data storage (41) of the remote server.
4. The system according to any one of the preceding claims, wherein the computing device is configured to present one or more questions on the display unit about one or more objects in the obtained image or video in case the image recognition algorithm is unable to identify one or more of the food related products from the image or the video, and wherein the display unit comprises a user interface for receiving user input in response to the one or more questions, the response for use by the image recognition algorithm to improve detection of the one or more objects.
5. The system according to any one of the preceding claims, wherein the one or more cameras are configured to automatically obtain the image or the video when the food related products are within the field of view at a substantially fixed position for a dynamic minimal amount of time necessary for successful detection.
6. The system according to any one of the preceding claims, wherein the output data comprises a ratio of different food related products, wherein the different food related products are detected by the image recognition algorithm, and wherein the ratio is based on the weight information and the detected different food related products.
7. The system according to any one of the preceding claims, wherein the one or more cameras comprises a stereoscopic imaging camera for obtaining 3D information about the food related products from the image or the video.
8. The system according to claim 7, wherein the image recognition algorithm is configured to obtain volumetric information from the 3D information, wherein the computing device is configured to obtain a weight estimation of the food related products based on the volumetric information, wherein the stereoscopic camera replaces the scale, and wherein the weight estimation is used instead of the weight information.
9. The system according to any one of the preceding claims, wherein the one or more cameras comprises a hyperspectral imaging camera for obtaining substance information about the food related products from the image or the video.
10. The system according to any one of the preceding claims, further comprising a depth sensor for detecting when the food related products are within the field of view of the one or more cameras.
11. The system according to any one of the preceding claims, wherein the field of view is located in an area around a line of sight from the one or more cameras in a substantially downwards direction.
12. The system according to claim 11, further comprising a housing (100), wherein the housing comprises the display unit, wherein the housing accommodates the one or more cameras, wherein the housing comprises an outer surface side (102) that is placed at an angle (α) from a horizontal plane (103), and wherein the cameras are located within the housing at the outer surface side resulting in the line of sight (101) being vertically angled at the angle (α), the line of sight being perpendicular to the outer surface side, wherein the angle (α) is in a range of 15 to 45 degrees, preferably in a range of 15 to 30 degrees, more preferably in a range of 15 to 25 degrees.
13. The system according to claim 12, wherein the housing further comprises the computing device.
14. The system according to claim 11 or 12, wherein the housing further comprises a visual indicator (16) indicating where the food related products are to be presented to the one or more cameras.
15. The system according to claim 14, wherein the visual indicator changes its color when the food related products have been registered by the one or more cameras.
16. The system according to any one of the claims 12-15, wherein the housing further comprises an audible indicator (17) providing audible feedback.
17. The system according to claim 16, wherein the audible indicator produces a sound when the food related products have been registered by the one or more cameras.
18. The system according to any one of the preceding claims, wherein the scale (3) comprises at least one sloped side wall (33) allowing the trash bin to be rolled on and off the scale.
19. The system according to claim 18, wherein the sloped side wall forms an integral part with a top part of the scale.
20. The system according to any one of the preceding claims, wherein the housing (100) and the scale (3) are connected by a vertically aligned support structure for fixing the housing at a vertical distance from the scale.
21. A housing comprising a display unit, the housing further comprising one or more cameras and a computing device, wherein the housing is configured for use in the system according to claim 12 or 13.
22. A method for detecting food related products before being thrown away, the method comprising:
obtaining an image or a video of the food related products using one or more cameras when the food related products are within a field of view of the one or more cameras and before the food related products are thrown in a trash bin;
obtaining weight information of the food related products using a scale when the food related products are in the trash bin, wherein the scale is configured to hold the trash bin;
obtaining information in a computing device about the food related products from the obtained image or video by applying an image recognition algorithm; and
generating and outputting data by the computing device on the display unit, wherein the data is based on the information about the food related products and the weight information.
23. The method according to claim 22, further comprising:
transmitting the obtained image or video from the computing device to the remote server for applying the image recognition algorithm; and receiving the information about the food related products from the remote server in the computing device.
24. The method according to claim 22 or 23, further comprising:
presenting one of more questions on the display unit about one or more objects in the obtained image or video in case the image recognition algorithm is unable to identify one or more of the food related products from the image or the video; and receiving user input from a user interface of the display unit in response to the one or more questions, the response for use by the image recognition algorithm to improve detection of the one or more objects.
CA3123155A 2018-12-14 2019-12-13 Food waste detection method and system Pending CA3123155A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
NL2022213A NL2022213B1 (en) 2018-12-14 2018-12-14 Food waste detection method and system
NL2022213 2018-12-14
PCT/EP2019/085143 WO2020120757A1 (en) 2018-12-14 2019-12-13 Food waste detection method and system

Publications (1)

Publication Number Publication Date
CA3123155A1 true CA3123155A1 (en) 2020-06-18

Family

ID=65496945

Family Applications (1)

Application Number Title Priority Date Filing Date
CA3123155A Pending CA3123155A1 (en) 2018-12-14 2019-12-13 Food waste detection method and system

Country Status (10)

Country Link
US (1) US20220058388A1 (en)
EP (1) EP3895085A1 (en)
JP (1) JP2022513897A (en)
CN (1) CN113302633A (en)
AU (1) AU2019396665A1 (en)
BR (1) BR112021011397A2 (en)
CA (1) CA3123155A1 (en)
NL (1) NL2022213B1 (en)
WO (1) WO2020120757A1 (en)
ZA (1) ZA202103911B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7299187B2 (en) * 2020-03-23 2023-06-27 株式会社日立製作所 Work support system and work support method
BR112023016954A2 (en) * 2021-02-23 2023-11-07 Orchard Holding SYSTEM, DEVICE, PROCESS AND METHOD FOR MEASUREMENT OF FOOD, FOOD CONSUMPTION AND FOOD WASTE
KR102354307B1 (en) * 2021-05-06 2022-01-24 주식회사 스토랑 Ingredient extraction device and method

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004139278A (en) * 2002-10-16 2004-05-13 Matsushita Electric Ind Co Ltd Liver support system
JP2005293395A (en) * 2004-04-02 2005-10-20 Bosupuresuto:Kk Meal information management system
EP2416131A1 (en) * 2010-08-05 2012-02-08 Mettler-Toledo (Albstadt) GmbH Casing to install electronic components in weighing scales
JP2012093962A (en) * 2010-10-27 2012-05-17 Seiko Epson Corp Print processor and print processing system
KR20140094761A (en) * 2013-01-22 2014-07-31 한양대학교 산학협력단 Method for checkweighing food and apparatus thereof
US20160078414A1 (en) * 2013-05-03 2016-03-17 Ecowastehub Corp. Solid waste identification and segregation system
WO2015162417A1 (en) * 2014-04-21 2015-10-29 Winnow Solutions Limited A system and method for monitoring food waste
CN104991684A (en) * 2015-07-23 2015-10-21 京东方科技集团股份有限公司 Touch control device and working method therefor
US10467584B2 (en) * 2016-01-25 2019-11-05 Sun Kyong Lee Food inventory systems and method
JP2018147415A (en) * 2017-03-09 2018-09-20 株式会社ブレイン Meal identification system and program therefor
CN207113999U (en) * 2017-07-17 2018-03-16 上海镭慎光电科技有限公司 Intelligent nutrition scale system
CN107238427A (en) * 2017-07-17 2017-10-10 上海镭慎光电科技有限公司 Intelligent nutrition scale system and the method that diet suggestion is provided
WO2019056102A1 (en) * 2017-09-19 2019-03-28 Intuitive Robotics, Inc. Systems and methods for waste item detection and recognition
CN107640480A (en) * 2017-10-19 2018-01-30 广东拜登网络技术有限公司 The method and apparatus and storage medium and terminal device of refuse classification retrospect
EP3483780A1 (en) * 2017-11-10 2019-05-15 Skidata Ag Classification and identification systems and methods
US20190197278A1 (en) * 2017-12-13 2019-06-27 Genista Biosciences Inc. Systems, computer readable media, and methods for retrieving information from an encoded food label
CN108072914B (en) * 2017-12-30 2019-11-12 南京陶特思软件科技有限公司 Hard analyte detection device for food waste treatment equipment
GB201802022D0 (en) * 2018-02-07 2018-03-28 Winnow Solutions Ltd A method and system for classifying food items

Also Published As

Publication number Publication date
EP3895085A1 (en) 2021-10-20
AU2019396665A1 (en) 2021-06-24
ZA202103911B (en) 2022-10-26
BR112021011397A2 (en) 2021-08-31
WO2020120757A1 (en) 2020-06-18
JP2022513897A (en) 2022-02-09
US20220058388A1 (en) 2022-02-24
NL2022213B1 (en) 2020-07-03
CN113302633A (en) 2021-08-24

Similar Documents

Publication Publication Date Title
US20220058388A1 (en) Food Waste Detection Method and System
US20240312600A1 (en) Meal service management system and operating method therefor
KR102606359B1 (en) Cafeteria management system
CN108416703A (en) Kitchen support system
US11335078B2 (en) System, method and computer program
US11763437B2 (en) Analyzing apparatus and method, and image capturing system
US20210397648A1 (en) A method and system for classifying food items
JP2018147416A (en) Meal identification system, identification method and identification program
EP3833242B1 (en) Portable scanning device for ascertaining attributes of sample materials
US20220020471A1 (en) Secure, automated, system and computer implemented process for monitoring care provided to a consumer according to a pharmacological or nutritional regimen of the consumer
WO2017130431A1 (en) Information processing device, information processing system, reception device, server device, information processing method, and program
KR20190104980A (en) Management system of cafeteria and operation method thereof
US12063948B2 (en) Method and system for foodservice with instant feedback
US20230298730A1 (en) Secure, automated, system and computer implemented process for monitoring care provided to a consumer according to a pharmacological or nutritional regimen of the consumer
CN109857880B (en) Model-based data processing method and device and electronic equipment
WO2021085369A1 (en) Meal amount measuring device and method
KR20200123069A (en) Management system of cafeteria and operation method thereof
WO2018008686A1 (en) Management system for managing nutritional component in meal
JP2016081407A (en) Calorie information acquisition apparatus, server, intake calorie monitoring system and calorie information acquisition method
JP2018147414A (en) Meal identification system and program therefor
JP2017150823A (en) Apparatus for determining appearance grade of granular substances

Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20231207
