IE87265B1 - System and method for monitoring food waste - Google Patents

System and method for monitoring food waste

Info

Publication number
IE87265B1
IE87265B1 IE20210094A
Authority
IE
Ireland
Prior art keywords
food
controller
scanning area
range finding
classification
Prior art date
Application number
IE20210094A
Other versions
IE20210094A1 (en)
Inventor
Kirwan Mark
Original Assignee
Positive Carbon Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Positive Carbon Ltd filed Critical Positive Carbon Ltd
Priority to IE20210094A priority Critical patent/IE87265B1/en
Publication of IE20210094A1 publication Critical patent/IE20210094A1/en
Publication of IE87265B1 publication Critical patent/IE87265B1/en

Landscapes

  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The present disclosure is directed towards a system for monitoring food waste. Preferably, the system comprises a controller; a classification system coupled to the controller; a range finding device coupled to the controller; and an optical sensor coupled to the controller, wherein: the controller is configured to control the range finding device to determine a plurality of distances, wherein each distance is a distance between the range finding device and a point on a surface in a scanning area; the controller is configured to determine a volume based on the distances; the controller is configured to control the optical sensor to capture an image of the scanning area; the classification system is configured to identify a food classification based on the captured image; and the controller is configured to determine a food weight of a food item based on the food classification and the volume. <Figure 1>

Description

System and Method for Monitoring Food Waste

Field of the Invention

The present application is directed towards a system and a method for monitoring waste, and in particular food waste.
Background to the Invention

A third of all food produced or prepared is wasted - i.e. it does not make it from farm or factory to fork. To produce food waste is to waste a host of related resources - for example seeds, water, energy, land, fertilizer, hours of labour, financial capital, etc.
As well as wasting resources, producing food waste has an environmental cost as it generates greenhouse gases at every stage. The environmental cost of food waste is higher than that of food that is not wasted - food waste continues to produce greenhouse gases, including methane, as the organic matter decomposes after it lands in a rubbish bin. Food waste emits 4.4 gigatons of CO2, or 10% of all global greenhouse gas emissions. To put this into scale, if food waste were a country it would be the third biggest greenhouse gas emitter after China and the US.
Greenhouse gases are the largest driver of global warming. With limited time available to significantly mitigate the effects of global warming, we need to not only do everything we can to individually limit its impact, but also to develop and be part of systematic approaches.
According to the UN, stopping food waste is the greatest singular action we can take to fight climate change. The UN is working toward this by setting the SDG Goal 12.3: “By 2030, halve per capita global food waste at the retail and consumer levels”.
Considering one third of all food produced globally is wasted, there is an enormous need for an effective and efficient food waste monitoring solution in all food businesses. The average kitchen spends 200,000 euro on food that ends up in the bin. Aside from the environmental cost, this is a significant financial cost in an industry famed for its tight margins. Thus, it is a problem that needs immediate attention.
A key problem in reducing food waste is one of measurement - a large number of food businesses have no idea which food products they are wasting. If a company knows that they are wasting a given quantity of a given food product in a given ordering period, they can simply reduce their order so that they are not ordering a quantity of food product that will be wasted. As a result, less food will go off and become unfit for human consumption.

Currently, most food companies do not record their food waste losses at all. Of the small number of companies that do measure their food waste, most still use pen and paper - this is a step in the right direction; however, it is still very time consuming and inefficient.
Given these problems, it is unsurprising that companies are keen to track food waste (e.g. 88% of businesses in Ireland were found to be interested in knowing how to measure and track food waste). However, existing systems that are used to track food waste (e.g. US 7,415,375, WO 2015/162417, and US 2020/0108428) have significant drawbacks which have limited their uptake. These systems require the use of cumbersome scales or large cameras which need to be placed underneath and on top of bins. They can be a tripping hazard, large, difficult to install, and they often get in the way in a busy kitchen. Further, they can require the use of large touch screens for complex user interaction - i.e. a user needs significant training to use such systems accurately. Kitchens typically do not have the time and resources for this training. Thus, pre-existing solutions have not significantly addressed the problem because they are expensive, impractical, and inaccurate.
Object of the Invention

The present application is directed towards addressing the above problems by providing a system and method for monitoring food waste that is simple to use and that does not require the use of bulky or expensive components, such as a weighing scale.
Summary of the Invention

The present disclosure is directed towards a system for monitoring food waste. Preferably, the system comprises a controller; a classification system coupled to the controller; a range finding device coupled to the controller; and an optical sensor coupled to the controller, wherein: the controller is configured to control the range finding device to determine a plurality of distances, wherein each distance is a distance between the range finding device and a point on a surface in a scanning area; the controller is configured to determine a volume based on the distances; the controller is configured to control the optical sensor to capture an image of the scanning area; the classification system is configured to identify a food classification based on the captured image; and the controller is configured to determine a food weight of a food item based on the food classification and the volume.
Preferably, the classification system comprises a machine learning system.

Preferably, the system is housed in a housing, wherein the housing comprises a means for mounting the system, whereby the range finding device and optical sensor are mountable above a bin.
Preferably, the controller and the optical sensor are configured to detect fiducials, wherein the fiducials define the scanning area.
Preferably, the system comprises a storage medium for storing a database of food densities; the controller is configured to look up a food density in the database using the food classification; and the controller is configured to calculate the food weight by multiplying the volume by the food density.
Preferably, the controller is configured to monitor the total volume of waste in the scanning area and provide an alert when the total volume is above a pre-determined threshold.
Preferably, the system comprises a means for wireless communication.
Preferably, the controller is configured to provide the image captured by the optical sensor to a remote server.
The present disclosure is also directed towards a method for monitoring food waste.
Preferably, the method comprises: controlling a range finding device to determine a plurality of distances, wherein each distance is a distance between the range finding device and a point on a surface in a scanning area; determining a volume based on the distances; capturing an image of the scanning area with an optical sensor; identifying a food classification based on the captured image with a classification system; and determining a food weight of a food item based on the food classification and the volume.
Preferably, the step of identifying a food classification comprises classifying the captured image using a machine learning system.
Preferably, the method comprises providing a range finding device (e.g. a lidar sensor) and an optical sensor above a bin.
Preferably, the method comprises providing fiducials, wherein the fiducials define the scanning area and the method comprises determining the scanning area based on the position of the fiducials.
Preferably, the method comprises monitoring the total volume of waste in the scanning area and providing an alert when total volume recorded is above a pre-determined threshold.
Preferably, the method comprises providing the image captured by the optical sensor to a remote server.

The present disclosure also includes a computer readable storage medium. The computer readable storage medium comprises instructions which, when executed by a processor coupled to a range finding device and an optical sensor, cause the processor to perform a method as set out above.
Brief Description of the Drawings

The invention will be more clearly understood from the following description of an embodiment thereof, given by way of example only, with reference to the accompanying drawings, in which:

Figure 1 shows an exemplary embodiment of a system in accordance with the present disclosure;
Figure 2a shows a point cloud obtained by a range finding device;
Figure 2b shows a volume calculated from the point cloud;
Figure 3a shows the system mounted over a bin;
Figure 3b shows the system mounted over a conveyer belt;
Figure 4 shows total volume over time;
Figure 5a shows an example of a fiducial marker;
Figure 5b shows the use of fiducial markers to define a scanning area; and
Figure 6 shows a method in accordance with the present disclosure.
Detailed Description of the Drawings

With reference to figure 1, a system 1000 in accordance with the present disclosure comprises a range finding device (which is also known as a depth sensor) coupled to a controller. The range finding device is used to determine a distance between the range finding device and a point on a surface. The range finding device can be any suitable device for measuring distance. For example, the range finding device can comprise one or more of a laser sensor, a lidar sensor, a radar sensor, a sonar sensor, an ultrasonic range finding device, a photogrammetry system (e.g. a stereo imaging system), a single-photon avalanche diode (SPAD), etc. Preferably, the range finding device comprises a lidar sensor. A lidar sensor is particularly suitable given its low cost, high speed, and high accuracy.
Further, lidar measures distances to a much higher density of points on a surface than other methods of data collection such as photogrammetry. Preferably, the system 1000 is housed in a housing 1010.

Lidar was developed as a method for determining ranges (variable distance) by targeting an object with a laser and measuring the time for the reflected light to return to the receiver. Lidar can also be used to make digital 3-D representations of areas on the earth's surface and ocean bottom, due to differences in laser return times, and by varying laser wavelengths. It has terrestrial, airborne, and mobile applications. Lidar is an acronym of light detection and ranging or laser imaging, detection, and ranging. Lidar is sometimes called 3-D laser scanning, a special combination of 3-D scanning and laser scanning. Lidar is commonly used to make high-resolution maps. It is also used in applications in the fields of surveying, geodesy, geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, atmospheric physics, laser guidance, airborne laser swath mapping (ALSM), laser altimetry, and autonomous transport (for the control and navigation of some autonomous vehicles).
In the present disclosure, the lidar sensor comprises a laser configured to emit pulsed electromagnetic radiation into the surrounding environment. Preferably, the light has a wavelength in the range of 600 nm - 1000 nm. Ideally, a wavelength of 860 nm is used. Preferably, the lidar is eye-safe - e.g. the maximum power of the laser may be limited to avoid damaging a user's eye.
The pulses of electromagnetic radiation reflect off objects in their flight path and are returned to the lidar sensor. The sensor measures the time of flight of the received electromagnetic pulses (i.e. the time between when an electromagnetic pulse was emitted by the sensor and when it was received by the sensor) to calculate the distance travelled by the electromagnetic pulses.
The controller is configured to control the range finding device to generate a 3D profile of a scanning area. Where the range finding device comprises a lidar sensor, the controller controls the direction in which electromagnetic radiation is transmitted. The controller is further configured to cause the lidar to repeatedly transmit and receive electromagnetic pulses. As a result, a 3D profile of the scanning area can be created. Preferably, millions of pulses are transmitted per second to create a precise real-time 3D profile.
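As an illustration only (not part of the disclosure), the time-of-flight relationship described above can be sketched as follows, assuming the sensor reports the round-trip time and the beam direction for each pulse; the function names are hypothetical.

```python
import math

SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in a vacuum

def tof_to_distance(round_trip_time_s: float) -> float:
    """Convert a lidar round-trip time of flight into a one-way distance.

    The pulse travels to the surface and back, so the one-way distance is
    half the total path length: d = c * t / 2.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

def beam_to_point(distance_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert a distance plus the beam direction into an (x, y, z) point.

    This assumes the sensor reports the azimuth/elevation of each pulse,
    which is how each point of the point cloud can be located in 3D space.
    """
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

# Example: a pulse returning after ~6.67 nanoseconds corresponds to ~1 m.
print(tof_to_distance(6.67e-9))          # ~1.0 m
print(beam_to_point(1.0, 0.1, -1.4))     # a point roughly 1 m below the sensor
```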
With reference to figures 2a and 2b, the range finding device scans the scanning area 2000 and the results are provided to the controller. The controller generates a 3D profile 2500 based on the results from the range finding device. The 3D profile 2500 is compared with a previously generated 3D profile to identify any changes in the dimensions of the scanned area. In this way, changes in the 3D profile of matter in the scanned area 2000 can be monitored. As a result, new items can be detected when they are placed in the scanned area 2000. Further, whenever a new item is placed in the scanned area 2000, the volume of that item can be calculated by converting the topography of the 3D profile of matter in the scanned area 2000 into a volume 2500. The volume (OV) of matter calculated from a previous 3D profile can be subtracted from the volume (NV) of matter calculated from a new 3D profile to obtain the change in volume (ΔV) of matter in the scanned area (i.e. the volume of an item added to the scanned area) - e.g. ΔV = NV - OV.
Volume is preferably calculated based on a point cloud 2010 of the scanned area 2000. In particular, when a large number of electromagnetic pulses are directed at a scanned area, a cloud of points 2010 on the surface of the matter in the scanned area 2000 can be determined. The points in the point cloud represent the geometric co-ordinates of where the electromagnetic pulse hit the surface of the matter before returning to the range finding device.
The point cloud 2010 is preferably converted into a mesh object 2510. A point cloud 2010 stores the location of millions of points. Converting a point cloud 2010 into a mesh object 2510 converts the points of the point cloud into the triangles of a mesh object 2510. Preferably, trilinear interpolation is used to convert the point cloud 2010 into a mesh object 2510. Trilinear interpolation is a method for multivariate interpolation on a 3D regular grid. This produces a 3D mesh 2510 of the surface of the matter in the scanned area 2000. The 3D mesh 2510 is a 3D approximation of the surface of the matter in the scanned area 2000. Trilinear interpolation is a known technique and is discussed in more detail in the paper entitled 'Inverse geometry: from the raw point cloud to the 3D Surface: theory and algorithms' by Julie Digne. Once the 3D mesh object 2510 for the surface of the matter in the scanned area has been obtained, it is then possible to calculate the volume of matter in the scanned area 2000.
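A minimal sketch of one way to turn a point cloud into a surface mesh, assuming the points can be treated as a height field z = f(x, y) over the scanning area and interpolated onto a regular grid. The disclosure itself uses trilinear interpolation; the simpler linear interpolation provided by SciPy is used here purely for illustration, and the grid resolution is an assumption.

```python
import numpy as np
from scipy.interpolate import griddata

def point_cloud_to_mesh(points: np.ndarray, grid_res: int = 50):
    """Interpolate a lidar point cloud onto a regular grid and triangulate it.

    points: (N, 3) array of x, y, z coordinates from the range finding device.
    Returns (vertices, triangles) where each triangle indexes into vertices.
    """
    xs, ys, zs = points[:, 0], points[:, 1], points[:, 2]

    # Regular grid spanning the scanning area.
    gx, gy = np.meshgrid(
        np.linspace(xs.min(), xs.max(), grid_res),
        np.linspace(ys.min(), ys.max(), grid_res),
    )
    # Interpolate the surface height at each grid node.
    gz = griddata((xs, ys), zs, (gx, gy), method="linear", fill_value=zs.min())

    vertices = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])

    # Split each grid cell into two triangles to form the mesh.
    triangles = []
    for i in range(grid_res - 1):
        for j in range(grid_res - 1):
            a = i * grid_res + j
            b = a + 1
            c = a + grid_res
            d = c + 1
            triangles.append((a, b, c))
            triangles.append((b, d, c))
    return vertices, np.array(triangles)
```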
The volume of matter in the scanned area is preferably calculated using a 3D version of the shoelace algorithm (the shoelace algorithm is also known as Gauss's area formula and the surveyor's formula). The shoelace algorithm breaks the 3D mesh into its constituent triangles. The determinant is calculated for the three vertices of each triangle using the coordinates of each of the vertices and the Rule of Sarrus. As used in this document, the term 'determinant' is intended to have its usual meaning in the field of mathematics - i.e. the determinant is a scalar value that is a function of the entries of a square matrix. In the case of a 2 x 2 matrix A = [[a, b], [c, d]], the determinant can be defined as: |A| = ad - bc.
The determinants are then added together. The sum of the determinants is divided by 6 to give the volume of matter in the scanned area. This algorithm has been tested and has proven robust as well as not consuming undue processing power.

As shown in figures 3a and 3b, the system 1000 is, in use, mounted above a scanning area 2000. When an item is provided in the scanning area 2000, the range finding device scans the scanning area 2000 to measure the increase in volume. It may take time for an item to be scraped off a plate. Further, after an item has been placed in the scanning area 2000, it may take time to settle. Thus, there may be a time delay between when an item is added to the scanning area 2000 and when the volume of matter in the scanning area 2000 can be accurately measured. Thus, the controller of the range finding device is configured to monitor the measurements obtained by the range finding device. When the measurements are stable for a predetermined time period (e.g. 3-10 seconds), the volume of matter in the scanning area 2000 is logged. When a further item is entered into the scanning area 2000, this process is repeated to calculate the volume of the further item.
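The determinant-based volume calculation described above (sum the per-triangle determinants, then divide by 6) can be sketched as follows. This is an illustration only: it assumes the mesh is consistently oriented and closed with respect to a reference surface; for an open height-field mesh the same idea is applied against the bin floor or conveyor plane.

```python
import numpy as np

def mesh_volume(vertices: np.ndarray, triangles: np.ndarray) -> float:
    """Volume enclosed by a triangulated mesh via summed 3x3 determinants.

    For each triangle with vertices v1, v2, v3, the determinant of the 3x3
    matrix [v1; v2; v3] is six times the signed volume of the tetrahedron
    formed with the origin. Summing over every triangle of a closed,
    consistently oriented mesh and dividing by 6 gives the enclosed volume
    (the 3D analogue of the shoelace formula).
    """
    total = 0.0
    for a, b, c in triangles:
        m = np.vstack([vertices[a], vertices[b], vertices[c]])  # 3x3 matrix
        total += np.linalg.det(m)  # expandable by the Rule of Sarrus
    return abs(total) / 6.0

def volume_change(new_volume: float, old_volume: float) -> float:
    """Volume of a newly added item: delta V = NV - OV, as described above."""
    return new_volume - old_volume
```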
In the embodiment shown in figure 3a, the system 1000 is mounted over a container 3000 for collecting waste (which is referred to in this disclosure as a bin). Alternatively, as shown in figure 3b, the system 1000 can be mounted over a conveyer belt 3500, wherein the conveyer belt 3500 is configured to move waste from one point to another. For example, the conveyer belt may be configured to move waste from a collection point to a bin via the scanning area 2000.
As shown in figure 4, as well as calculating the volume of an item, the system can also store a total volume 4000. Each time the volume of an item 4100 is calculated it can be added to the total volume 4000. Whenever the bin is emptied, the total volume 4000 reading is reset to zero and the process is then started over - i.e. when a new item is entered into the scanning area its volume is measured and added to the total volume.
Over time 4200, this automatically produces an item-by-item record 4500 of the matter in the bin at a second-by-second granularity. By tracking the total volume 4000, it is also possible to determine how full the bin is. As a result, an alert can be provided to warn users when the bin needs to be emptied. The alert can be an audio alert or a visual alert (such as a flashing light). Providing the alert to warn users that the bin needs to be emptied prevents the bin from overflowing.
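A minimal sketch of the running-total and fill-alert behaviour described above. The class name, bin capacity and 90% threshold are assumptions for illustration; the disclosure only states that items are accumulated into a total volume, an alert is raised above a pre-determined threshold, and the total is reset when the bin is emptied.

```python
class FillLevelMonitor:
    """Illustrative running total of logged item volumes with a fill alert."""

    def __init__(self, bin_capacity_l: float, threshold: float = 0.9):
        self.bin_capacity_l = bin_capacity_l
        self.threshold = threshold        # hypothetical pre-determined threshold
        self.total_volume_l = 0.0
        self.item_log: list[float] = []   # item-by-item record of volumes

    def log_item(self, item_volume_l: float) -> None:
        """Add the volume of a newly detected item to the running total."""
        self.item_log.append(item_volume_l)
        self.total_volume_l += item_volume_l
        if self.total_volume_l >= self.threshold * self.bin_capacity_l:
            self.raise_alert()

    def raise_alert(self) -> None:
        # Stand-in for an audio, visual or wireless alert.
        print(f"Bin at {self.total_volume_l:.1f} L of {self.bin_capacity_l} L - empty soon")

    def bin_emptied(self) -> None:
        """Reset the running total when the bin is emptied."""
        self.total_volume_l = 0.0

monitor = FillLevelMonitor(bin_capacity_l=200.0)
monitor.log_item(1.2)  # e.g. plate scrapings of ~1.2 litres
```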
In addition, the system 1000 can comprise a means for wireless communication. For example, the system 1000 can comprise a Wi-Fi or cellular chip coupled to one or more antennas. Where the system 1000 is configured for wireless communication, the warning can be a wireless alert - e.g. the alert can be provided to a computing system remote from the system 1000 such as, e.g., a user's mobile phone.

A wireless alert can also allow improvements in the scheduling of emptying bins and the collection of waste. Commonly, waste collection is priced volumetrically. As a result, a user can gauge how efficient their waste collection is. For example, if the waste from a 200 litre bin is collected 12 times a month, a user is paying for 2,400 litres of waste to be taken away. However, by measuring the volume of waste they can accurately find out how much waste they are producing. If, for example, they find that they are only producing 800 litres of waste, they can cut how often they get their bin collected by a factor of 3 (i.e. the 200 litre bin need only be collected 4 times a month). This enables the user to cut their waste costs by a factor of 3. There is also a positive environmental effect because the fuel used, and the emissions caused, by a rubbish collection truck are reduced by 66%.
Further, if the wireless alert is sent to a refuse collection company, the refuse collection company can also use the system's 1000 volumetric readings 4500 to optimise their pickup routes. By informing the refuse collection company of how much waste users are creating, or when a user needs waste collection, the refuse collection company can plan more efficient collection routes designed to only pick up bins that are full. Incidentally, the word 'or' is intended to be read as an inclusive or - i.e. 'a or b' means one of: 'a', 'b', or 'a and b'. The word 'xor' is used when an exclusive or is intended - i.e. 'a xor b' means one of 'a' or 'b'.
Traditionally, for any tracking of food waste, commercial kitchens would attempt to keep a food waste journal. However, this proves tedious, inefficient and often ineffective as staff lose interest in having to monitor food each and every day. To overcome this, the system 1000 is also preferably provided with an optical sensor coupled to the controller. The optical sensor is configured to detect electromagnetic radiation in the visual spectrum (i.e. 380 nm - 700 nm). The system 1000 comprises an image classification system.
In one embodiment, the image classification system is coupled to a memory configured to store a plurality of image classifications. When, as described above, the measurements from the range finding device stabilise, the optical sensor captures an image of the scanned area. The captured image is provided to the image classification system. The image classification system classifies the captured image based on the stored image classifications. This is done by determining the image classification that matches most closely with the captured image. Each image classification corresponds to a type of food. The image classification system then provides a first identifier of the food type to the controller.

Alternatively, or in conjunction with the previous embodiment, the image classification system can be coupled to a user interface comprising a display. An image captured by the optical sensor can be provided by the image classification system to the user interface. The image is then displayed on the display for viewing. The user interface may be configured to receive a label input by the user. Optionally, the label is selected from a number of predetermined labels. For example, the label can be selected using a menu, such as a drop-down menu. The labels are identifiers used to identify one or more items in the captured image. The label entered for an image can be used to classify the image. Optionally, the label applied to one image can be used to help identify another image - e.g. through calibration or machine learning (e.g. AI training) as described below in more detail.
The controller is connected to a database of food densities. Each entry in the database comprises an identifier of a particular food type and the density of that food type. Preferably, the database is stored in a memory located in the system. Using the first identifier determined by the image classification system, the controller is configured to look up the density of the item in the database. The controller can then calculate the weight of the item placed in the bin using the volume of the item and the density of the item, e.g. using the formula M = DV (where M is mass, D is density, and V is volume). As a result, the weight of the food placed in the bin can be calculated without the use of scales or any weighing devices.
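A minimal sketch of the density lookup and the M = DV step, using an in-memory dictionary as a stand-in for the food density database. The food names and density values below are illustrative placeholders, not figures from the disclosure.

```python
# Illustrative density database: food classification -> density in kg per litre.
# The entries below are placeholders, not data from the disclosure.
FOOD_DENSITIES_KG_PER_L = {
    "cooked_rice": 0.85,
    "potato_peelings": 0.45,
    "soup": 1.00,
}

def estimate_weight_kg(food_classification: str, volume_l: float) -> float:
    """Weight of a food item from its classification and measured volume.

    Implements M = D * V: the density D is looked up using the classification
    returned by the image classifier, and V is the volume from the range
    finding device, so no weighing scale is needed.
    """
    density = FOOD_DENSITIES_KG_PER_L[food_classification]
    return density * volume_l

print(estimate_weight_kg("cooked_rice", 1.5))  # 1.5 L of cooked rice -> ~1.275 kg
```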
Preferably, to improve the accuracy of the classification provided by the classification system, a machine learning system is used. In particular, the image classification system preferably comprises a deep convolutional neural network (CNN). Using this machine learning system, the image classification system can calculate at least one of a variant, position or type of dish that has been placed in the bin. Preferably, the image classification system is configured to use models in the CNN. For example, the models in the CNN can pass the captured image through a series of convolution layers with filters or fully connected layers. Preferably, a softmax function (which is also known as softargmax or the normalized exponential function) is then applied to classify images with a probability between 0 and 1. The image classification having the highest probability is selected as the closest match for the captured image.
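As an illustration of the convolution-then-softmax pipeline described above: the disclosure does not name a framework, so PyTorch is used here purely as an example, and the layer sizes, input resolution and class list are assumptions rather than the patented model.

```python
import torch
import torch.nn as nn

# Hypothetical food classes; the real label set would come from training data.
CLASSES = ["cooked_rice", "potato_peelings", "soup", "bread"]

class FoodWasteCNN(nn.Module):
    """Tiny convolutional classifier: convolution layers with filters,
    then a fully connected layer, then a softmax over the food classes."""

    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, num_classes),  # assumes 224x224 input images
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.classifier(self.features(x))
        # Softmax turns the logits into class probabilities between 0 and 1.
        return torch.softmax(logits, dim=1)

model = FoodWasteCNN()
image = torch.rand(1, 3, 224, 224)           # stand-in for a captured image
probs = model(image)
best = CLASSES[int(probs.argmax(dim=1))]     # classification with highest probability
```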
Convolution is typically an initial step used to extract features from the captured image. Convolution is a mathematical operation that takes an image matrix and a filter or kernel and multiplies them to produce a feature map.

A training set of images for the machine learning system may be stored on a remote server. Alternatively, the machine learning system may be trained using captured images that have been labelled by a user as described above.
To reduce power consumption, the system 1000 is preferably provided with a motion sensor. The motion sensor is coupled to a power controller. The power controller can be configured to place the system 1000 into a low power or sleep mode by switching components within the system 1000 off or into a low power mode after a predetermined period of time after a volume or weight of an item has been determined. When the motion sensor detects motion, it is configured to send a signal to the power controller. On receipt of this signal, the power controller can place the system 1000 into an active mode by switching all components within the system 1000 on. In this way the system 1000 can be activated when an item is placed in the bin and deactivated when the system 1000 is idle.
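A minimal sketch of the sleep/active behaviour as a small state machine, assuming the motion sensor and measurement events call into it; the idle timeout value and method names are hypothetical, since the disclosure only specifies a predetermined period and a wake-on-motion signal.

```python
import time

class PowerController:
    """Illustrative sleep/active state machine driven by a motion sensor."""

    IDLE_TIMEOUT_S = 60.0  # hypothetical predetermined period of time

    def __init__(self):
        self.mode = "active"
        self.last_measurement_time = time.monotonic()

    def on_motion_detected(self) -> None:
        # Signal from the motion sensor: switch all components back on.
        self.mode = "active"

    def on_measurement_logged(self) -> None:
        # Called once a volume or weight has been determined.
        self.last_measurement_time = time.monotonic()

    def tick(self) -> None:
        # Called periodically; sleeps the system once it has been idle long enough.
        idle_for = time.monotonic() - self.last_measurement_time
        if self.mode == "active" and idle_for > self.IDLE_TIMEOUT_S:
            self.mode = "sleep"  # e.g. switch the lidar and camera into low power
```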
The system 1000 is preferably configured to determine when a bin enters the scanning area 2000. The controller is configured to detect entry of a bin into the scanning area 2000, e.g. by detecting when there is an increase in height indicative of the presence of a bin detected by the range finding device. The flat surface of the bottom of the bin then becomes the starting point for the volume measurement 4500, i.e. zero.
To identify the scanning area, the optical sensor and controller can be configured to detect fiducial markers (which are referred to herein as fiducials). An example of a fiducial 5000 is shown in figure 5a. The fiducial 5000 is an image placed in the field of view of an imaging system which appears in the captured image. The fiducial 5000 can then be used as a point of reference. The fiducial 5000 may be either something placed into or on the imaging subject, or a mark or set of marks in the reticule of an optical instrument. The fiducial 5000 can be provided on a mat or plate which sits below the bin. Alternatively, a plurality of fiducials 5000 can be provided as stickers.
Some fiducial systems are specifically designed to allow rapid, low-latency detection of 6D position estimation (3D location and 3D orientation) and the identity of hundreds of unique fiducial markers. Examples include the WhyCon marker, WhyCode markers, amoeba reacTIVision fiducials, the d-touch fiducials, and the TRIP circular barcode tags (ringcodes).
As shown in figure 5b, preferably, four predetermined fiducials 5000a, 5000b, 5000c, and 5000d are provided on the corners of each bin 3000 to identify where the bin is and where the volume and contents of the bin 3000 are being examined. As the size and shape of the fiducials 5000 are known, the system 1000 can identify the scanning area 2000. Further, the fiducials 5000 provide reference points for calculating the coordinates of points in the point cloud 2010 detected by the range finding device. Alternatively, the four predetermined fiducials 5000a, 5000b, 5000c, and 5000d can be placed in the corners of a rectangle 3510 to define a scanning area 2000 containing a conveyer belt 3500.
Preferably, AR tags are used as the fiducials 5000. AR tags are particularly suitable because their high contrast, square geometry makes them relatively easy to detect and track in an image. Each tag has its own “right way up”, so they encode orientation. The squares that comprise the tag also identify it uniquely via a binary string, so they also encode their own identity.
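A sketch of how four AR-tag style fiducials could be detected and used to bound the scanning area. The disclosure does not name a detection library; this example assumes OpenCV's ArUco module (the OpenCV 4.7+ API) and an arbitrary marker dictionary, both of which are assumptions made purely for illustration.

```python
import numpy as np
import cv2  # opencv-contrib-python; ArUco API as in OpenCV >= 4.7

def find_scanning_area(image_bgr: np.ndarray):
    """Detect four AR-tag style fiducials and return the scanning-area polygon.

    Returns the four marker centres ordered into a quadrilateral (pixel
    coordinates), or None if fewer than four markers are visible.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)

    if ids is None or len(ids) < 4:
        return None

    # One centre point per detected marker (each entry of `corners` is 1x4x2).
    centres = np.array([c.reshape(4, 2).mean(axis=0) for c in corners[:4]])

    # Order the centres around their centroid so they form a polygon
    # (the corners of the bin or of the conveyor-belt rectangle).
    centroid = centres.mean(axis=0)
    angles = np.arctan2(centres[:, 1] - centroid[1], centres[:, 0] - centroid[0])
    return centres[np.argsort(angles)]
```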
The system 1000 can also comprise a temperature sensor. Preferably, the temperature sensor is an infrared sensor. The temperature sensor can detect if the food waste has started to generate heat - the generation of heat is indicative of the food waste decomposing. Thus, based on the output of the temperature sensor, the controller can determine if food waste is generating unpleasant smells before they become detectable to a user. As a result, the system 1000 can warn a user when a bin needs to be emptied to avoid it generating unpleasant smells.
The system 1000 may also comprise a positional sensor e.g. a gyroscope or accelerometer.
If the system 1000 moves - e.g. it is moved to a position where it can no longer scan the scanning area, or if it falls - the positional sensor is configured to detect the movement. Based on the output of the positional sensor, an alert can be sent to either a user or a remote system administrator to alert them that the system 1000 needs to be repositioned. Where the positional sensor is a gyroscope, the controller can determine from the output of the gyroscope whether the system 1000 has been correctly orientated. If the system 1000 has not been correctly orientated, an alert can be sent to a user or a remote system administrator.
Preferably, the information collected by the system 1000 is sent to a remote server. The system 1000 preferably also sends a unique identifier to the remote server. The unique identifier is used by the remote server to identify the source of the collected information. As a result, the server can monitor the performance of the system 1000. The information received by the remote server can be used to improve the performance of the system 1000. For example, the images captured by the optical sensors can be used to improve the image set used to train the classification system.
Thus, the system 1000 enables a business to monitor the volume of the food waste that it creates. By giving kitchens full visibility of their food waste, it allows them to see what they're wasting and how to make changes to their production and purchasing habits to avoid this waste. With increased staff engagement, KPIs to measure success and stock reporting, kitchens can make significant savings.
Figure 6 shows a method in accordance with the present disclosure. In step 6100, a range finding device and a controller are configured to determine a volume present in a scanning area.
When a user places food 6200 into the scanning area, the system 1000 may optionally perform a step 6300 of detecting food being placed in the scanning area. In step 6400, a volume is detected using the range finding device and the controller. In step 6500, the volume of an item is logged. Optionally, where the scanning area is the inside of the bin, the volume of the item may be calculated by subtracting a previous volume measurement from the present volume measurement. In step 6600, an optical image is taken of the scanning area. In step 6700, the image is classified using a classification system. In step 6800, the weight of the food item is calculated. This includes obtaining the density of the food item using the classification. Preferably, the density is obtained from a database. Optionally, in step 6900, all the data collected by the system is sent, preferably wirelessly, to a remote server where it can be used for diagnostics.
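Purely as an illustration of how the steps of figure 6 fit together, the following sketch chains the measurements described above into one pass. The sensor, camera and classifier objects and their method names are hypothetical stand-ins for the controller-driven components, not an implementation of the claimed method.

```python
def monitor_food_waste_once(range_finder, camera, classifier, densities, previous_volume_l):
    """Illustrative end-to-end pass over steps 6400-6900 of figure 6."""
    # Step 6400: measure the volume currently in the scanning area.
    current_volume_l = range_finder.measure_volume_l()

    # Step 6500: log the volume of the newly added item (delta V = NV - OV).
    item_volume_l = current_volume_l - previous_volume_l

    # Steps 6600-6700: capture an image and classify the food item.
    image = camera.capture()
    food_class = classifier.classify(image)

    # Step 6800: weight from density and volume, M = D * V.
    weight_kg = densities[food_class] * item_volume_l

    # Step 6900: the record could then be sent (e.g. wirelessly) to a remote server.
    return {"class": food_class, "volume_l": item_volume_l, "weight_kg": weight_kg}
```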
The above description provides exemplary embodiments in accordance with the present disclosure. Those skilled in the art can readily recognize that numerous variations and substitutions may be made to the system, its use, and its configuration to achieve substantially the same results as achieved by the embodiments described herein.
Accordingly, there is no intention to limit the scope of the present disclosure to the exemplary embodiments. Many variations, modifications, and alternative constructions and equivalent elements fall within the scope and spirit of the disclosure as set out in the following claims.

Claims (15)

Claims
1. A system for monitoring food waste comprising: a controller; a classification system coupled to the controller; a range finding device coupled to the controller; and an optical sensor coupled to the controller, wherein: the controller is configured to control the range finding device to determine a plurality of distances, wherein each distance is a distance between the range finding device and a point on a surface in a scanning area; the controller is configured to determine a volume based on the distances; the controller is configured to control the optical sensor to capture an image of the scanning area; the classification system is configured to identify a food classification based on the captured image; and the controller is configured to determine a food weight of a food item based on the food classification and the volume.
2. The system of claim 1, wherein the classification system comprises a machine learning system.
3. The system of claim 1 or 2, further comprising a housing, wherein the housing comprises a means for mounting the system, whereby the range finding device and optical sensor are mountable above a bin.
4. The system of any preceding claim, wherein the controller and the optical sensor are configured to detect fiducials, wherein the fiducials define the scanning area.
5. The system of any preceding claim, wherein: the system comprises a storage medium for storing a database of food densities; the controller is configured to look up a food density in the database using the food classification; and the controller is configured to determine the food weight by multiplying the volume by the food density.
6. The system of any preceding claim, wherein the controller is configured to monitor the total volume of waste in the scanning area and provide an alert when total volume is above a pre-determined threshold.
7. The system of any preceding claim, further comprising a means for wireless communication.
8. The system of claim 7, wherein the controller is configured to provide the image captured by the optical sensor to a remote server.
9. A method for monitoring food waste comprising: controlling a range finding device to determine a plurality of distances, wherein each distance is a distance between the range finding device and a point on a surface in a scanning area; determining a volume based on the determined distances; capturing an image of the scanning area with an optical sensor; identifying a food classification based on the captured image with a classification system; and determining a food weight of a food item based on the food classification and the volume.
10. The method of claim 9, wherein the step of identifying a food classification comprises classifying the captured image using a machine learning system.
11. The method of claim 9 or 10, further comprising providing a range finding device and an optical sensor above a bin.
12. The method of any one of claims 9 to 11, further comprising providing fiducials, wherein the fiducials define the scanning area and the method comprises determining the scanning area based on the position of the fiducials.
13. The method of any one of claims 9 to 12, comprising monitoring the total volume of waste in the scanning area and providing an alert when total volume recorded is above a pre-determined threshold.
14. The method of any one of claims 9 to 13, comprising providing the image captured by the optical sensor to a remote server.
15. A computer readable storage medium comprising instructions which, when executed by a processor coupled to a range finding device and an optical sensor, cause the processor to perform a method according to any one of claims 9 to 14.
IE20210094A 2021-04-29 2021-04-29 System and method for monitoring food waste IE87265B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
IE20210094A IE87265B1 (en) 2021-04-29 2021-04-29 System and method for monitoring food waste

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
IE20210094A IE87265B1 (en) 2021-04-29 2021-04-29 System and method for monitoring food waste

Publications (2)

Publication Number Publication Date
IE20210094A1 IE20210094A1 (en) 2021-10-13
IE87265B1 true IE87265B1 (en) 2021-10-13

Family

ID=78205499

Family Applications (1)

Application Number Title Priority Date Filing Date
IE20210094A IE87265B1 (en) 2021-04-29 2021-04-29 System and method for monitoring food waste

Country Status (1)

Country Link
IE (1) IE87265B1 (en)

Also Published As

Publication number Publication date
IE20210094A1 (en) 2021-10-13

Similar Documents

Publication Publication Date Title
US11610185B2 (en) System and method for waste management
Muhadi et al. The use of LiDAR-derived DEM in flood applications: A review
US11593753B2 (en) Multi-phase consolidation optimization tool
Grimaldi et al. Remote sensing-derived water extent and level to constrain hydraulic flood forecasting models: Opportunities and challenges
Coren et al. Radiometric correction in laser scanning
Liu et al. LiDAR-derived high quality ground control information and DEM for image orthorectification
Otepka et al. Georeferenced point clouds: A survey of features and point cloud management
Vauhkonen Estimating crown base height for Scots pine by means of the 3D geometry of airborne laser scanning data
Mehendale et al. Review on lidar technology
Næsset et al. The effects of field plot size on model-assisted estimation of aboveground biomass change using multitemporal interferometric SAR and airborne laser scanning data
Carr et al. Individual tree segmentation from a leaf-off photogrammetric point cloud
Thieme et al. Detection of small single trees in the forest–tundra ecotone using height values from airborne laser scanning
US20210018611A1 (en) Object detection system and method
Forsman et al. Bias of cylinder diameter estimation from ground-based laser scanners with different beam widths: A simulation study
Pfeiffer et al. Derivation of three-dimensional displacement vectors from multi-temporal long-range terrestrial laser scanning at the Reissenschuh landslide (Tyrol, Austria)
Dorigo et al. An application-oriented automated approach for co-registration of forest inventory and airborne laser scanning data
Holmgren et al. Tree crown segmentation based on a tree crown density model derived from Airborne Laser Scanning
Vazirabad et al. Lidar for biomass estimation
King et al. Evaluation of LiDAR-derived snow depth estimates from the iPhone 12 Pro
US20230379068A1 (en) Location Sensing Technology For Detecting Asset Location
Buján et al. Classification of rural landscapes from low-density lidar data: is it theoretically possible?
Soilán et al. Road marking degradation analysis using 3D point cloud data acquired with a low-cost Mobile Mapping System
Räty et al. A comparison of linear-mode and single-photon airborne LiDAR in species-specific forest inventories
IE87265B1 (en) System and method for monitoring food waste
Pashaei et al. Terrestrial lidar data classification based on raw waveform samples versus online waveform attributes

Legal Events

Date Code Title Description
MM4A Patent lapsed