GB2616522A - Apparatus and method for determining food characteristics


Info

Publication number
GB2616522A
GB2616522A (application GB2302940.8A)
Authority
GB
United Kingdom
Prior art keywords
food
dish
image
characteristic
representative
Prior art date
Legal status
Pending
Application number
GB2302940.8A
Other versions
GB202302940D0 (en)
Inventor
Maclachlan Douglas
Wannan John
Current Assignee
AFE Group Ltd
Original Assignee
AFE Group Ltd
Priority date
Filing date
Publication date
Application filed by AFE Group Ltd
Publication of GB202302940D0
Publication of GB2616522A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/60 - Type of objects
    • G06V20/68 - Food, e.g. fruit or vegetables
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 - Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02 - Food
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects


Abstract

An apparatus for determining a characteristic of food on a dish comprises at least two imaging devices comprising an optical camera and a thermal camera. Each camera is configured to capture an image of the dish. The apparatus also comprises a processor configured to receive an image from the cameras, determine the characteristic of the food on the dish in dependence on the image, and output an indicator indicative of the characteristic, the characteristic being representative of a suitability of the food on the dish for consumption. The two cameras may be configured to capture images of the dish from different angles. At least one of the imaging devices may be configured to capture a first image of the dish at a first time, and a second image of the dish at a second time different to the first time, and the processor may be configured to receive the second image from the imaging device and determine a second characteristic of the food on the dish in dependence on the second image or on a comparison between the first image and the second image.

Description

APPARATUS AND METHOD FOR DETERMINING FOOD CHARACTERISTICS
Field of the invention
The invention relates to apparatuses for assessing the suitability of food for consumption and related methods.
Background to the invention
In catering settings, it is sometimes the case that meals are partially or wholly rejected by a consumer. Apart from this being wasteful, in some settings such as hospitals, it can mean that people do not consume the nutrients that they need.
In examples, even where meals are consumed, it may be that the quality of the meal is not as desired, perhaps because the meal is undercooked or overcooked, meaning that the expected effects of the meal (e.g. the nutritional benefits) are not realised.
It is in this context that the present disclosure has been devised.
Summary of the invention
According to a first aspect of the invention, there is provided apparatus for determining at least one first characteristic of food on a dish, the apparatus comprising:
an imaging device configured to capture a first image of the dish; and
a processor configured to:
receive the first image from the imaging device;
determine the at least one first characteristic of the food on the dish in dependence on the first image; and
output an indicator indicative of the at least one first characteristic,
wherein the at least one first characteristic is representative of a suitability of the food on the dish for consumption.
The apparatus may be an apparatus configured to determine the at least one first characteristic of food on the dish. The processor may be configured to receive the first image from the imaging device and determine whether the image is an image of a dish.
The processor may be configured to receive the first image from the imaging device and determine whether the image is an image of food on a dish. It will be understood that determining whether the image is an image of a dish (optionally an image of food on a dish) can be performed using known image-processing techniques. For example, determining whether the image is an image of a dish (optionally an image of food on a dish) may comprise providing the image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of dishes, and/or food on dishes, and/or images which do not contain dishes or food on dishes).
The processor may be configured to receive the first image from the imaging device and determine where on the dish food is located in the image. It will be understood that determining where on a dish food is located in an image can also be performed using known image-processing techniques. For example, determining where on a dish food is located in an image may comprise providing the image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of dishes with food on the dishes in known locations).
It will further be understood that determining at least one first characteristic of the food on the dish in dependence on the first image can be performed using known image-processing techniques. For example, determining at least one first characteristic of the food on the dish in dependence on the first image may comprise providing the first image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of food on dishes known to have the at least one first characteristic, and/or food on dishes known not to have the at least one first characteristic).
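By way of illustration only, a minimal sketch of such a machine-learning classification step is given below. The framework, the model file name, and the label set are assumptions made for the purposes of the example; the disclosure does not prescribe any particular implementation.

```python
# A minimal sketch, not taken from the disclosure: classifying whether an
# image shows a dish, an empty dish, or food on a dish, using a pre-trained
# image classifier. The model file and label set are hypothetical.
import torch
from torchvision import transforms
from PIL import Image

LABELS = ["no_dish", "empty_dish", "food_on_dish"]  # hypothetical label set

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),  # input size expected by common backbones
    transforms.ToTensor(),
])

def classify_image(path: str, model: torch.nn.Module) -> str:
    """Return a label indicating whether the image shows (food on) a dish."""
    image = Image.open(path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return LABELS[int(logits.argmax(dim=1))]

# Usage, assuming a fine-tuned three-class model saved beforehand:
# model = torch.load("dish_classifier.pt")
# model.eval()
# print(classify_image("tray_photo.jpg", model))
```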
The apparatus may comprise at least two imaging devices, optionally wherein the at least two imaging devices comprise an optical camera and a thermal camera. It will be understood that an optical camera is a camera configured to capture images in the visible light spectrum. The optical camera may be configured to capture colour images.
The optical camera may be configured to capture greyscale images. The optical camera may be configured to capture black and white images. The optical camera may be a multispectral camera. The optical camera may be a hyperspectral camera. The apparatus may comprise at least two optical cameras. The apparatus may comprise exactly two optical cameras. The processor may be configured to receive at least one (e.g. first, second, further, etc.) image from at least one of the at least two imaging devices. The processor may be configured to receive at least one (e.g. first, second, further, etc.) image from each imaging device.
Thus, there is provided an apparatus able to efficiently obtain an indicator of food suitability, for example ahead of serving a meal. By using an imaging device to determine the at least one first characteristic of the dish, the use of a temperature probe or other contact-based checks can be reduced or even completely avoided.
Furthermore, the checking process can be performed quickly, thereby enabling a user of the apparatus to provide a meal to a person, without having to carry out further checks, if the at least one characteristic is indicative that the food on the dish is suitable.
Conversely, if the food on the dish is not suitable, the apparatus can quickly determine this, enabling a user (and/or the apparatus) to subsequently investigate the reasons why the food is not suitable. As a result, a greater proportion of the food which is served is consumed by the people being served the food, and less food is rejected, because any issues with the suitability of the food can be addressed prior to serving.
Furthermore, the food which is consumed is more likely to provide nutritional benefits which more closely match the expected nutritional benefits, for example because the food does not contain allergens, and/or because the food is not overcooked or undercooked.
At least one of the imaging devices may be configured to capture images of the dish from a first angle. At least one imaging device may be configured to capture images of the dish from a second angle, different to the first angle. Advantageously, this allows for images to be captured from multiple angles, which can be helpful in determining characteristics of food on the dish, for example volume.
The (e.g. at least one of the at least two) imaging device(s) may be configured to capture the (e.g. at least one) first image of the dish at a first time. The imaging device may be configured to capture a (e.g. at least one) second image of the dish (optionally at the first time). The imaging device may be configured to capture the (e.g. at least one) second image of the dish at a second time different to the first time. The second time may be subsequent to the first time. The processor may be configured to receive the (e.g. at least one) second image from the imaging device. The processor may be configured to determine at least one second characteristic of the food on the dish in dependence on the (e.g. at least one) second image. The processor may be configured to output an indicator indicative of the at least one second characteristic. The imaging device may be configured to capture one or more further images at one or more further times (optionally different to the first time, optionally different to the second time). The processor may be configured to receive the one or more further images from the imaging device. The processor may be configured to determine one or more further characteristics of the food on the dish in dependence on the one or more further images. The processor may be configured to output an indicator indicative of the one or more further characteristics.
It is therefore possible to capture multiple images and to determine multiple characteristics at multiple times, for example before and after a meal. This provides greater flexibility in the assessment of the suitability of meals.
The apparatus may be an apparatus configured to determine the at least one second characteristic of food on the dish. The apparatus may be an apparatus configured to determine the one or more further characteristics of food on the dish.
The processor may be configured to determine the at least one first characteristic of the food on the dish (e.g. further) in dependence on a comparison between the first image and the second image. The processor may be configured to determine the at least one second characteristic of the food on the dish (e.g. further) in dependence on a comparison between the first image and the second image. The processor may be configured to determine one or more of the one or more further characteristics of food on the dish (e.g. further) in dependence on a comparison between any two or more images. Thus, the apparatus can be used to determine a characteristic that requires images from multiple time-points.
It will be understood that determining the at least one first characteristic of the food on the dish and/or the at least one second characteristic of the food on the dish (and/or one or more further characteristics of the food on the dish) in dependence on the first image (optionally in dependence on the second image and/or one or more further images, optionally wherein the first, second, and/or further images are considered individually or in combination) can be performed using known image-processing techniques. For example, determining at least one first (and/or second and/or further) characteristic of the food on the dish in dependence on the first (and/or second and/or further) image may comprise providing the first image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of food on dishes known to have the at least one first (and/or second and/or further) characteristic, and/or food on dishes known not to have the at least one first (and/or second and/or further) characteristic).
The at least one second characteristic may be representative of a suitability of the food on the dish for consumption.
An imaging device configured to capture a second image and a processor configured to determine a second characteristic provide the further advantage of further opportunities to provide a suitable dish of food. For example, the user could serve food onto the dish, capture a first image, and receive an output indicator of a first characteristic which is representative of the food not being suitable. They could then make some changes to the food (e.g. heating or re-heating it if the food is not suitable because it is too cold), capture a second image, and receive an output of a second characteristic. This second characteristic might be representative of the food now being suitable for consumption, or it might indicate that the food is not suitable, either for the same reason (e.g. the food is still too cold because it was not heated for long enough) or for a different reason (e.g. the food is now burned because it was heated for too long). Again, this allows the user to quickly assess whether their actions have resulted in food that is suitable for consumption and, if not, to address any issues.
Because the second characteristic (and optionally further characteristics) can be determined in dependence on a comparison between the first image and the second image, this also provides the advantage to the user that they can determine the suitability of the food for consumption at two (or more) different times. For example, the user could use the apparatus a first time and receive an output indicator indicative of the first characteristic representative of food that is suitable for consumption, and therefore serve the food to a person as part of a meal. After the meal, the user could use the apparatus a second time, and receive an output indicator indicative of a (e.g. second) characteristic determined based on a comparison between the first image and the second image. This comparison may show that all of the food was consumed.
Alternatively, it might show that only some of the food was consumed, or that only certain types of the food were consumed.
The first image may be an image of the dish after food has been served onto the dish.
The first image may be an image of the dish before the dish has been provided to a person for consumption of the food. The second image may be an image of the dish after the person has had an opportunity to consume the food. The second image may be an image of the dish after the person has finished eating. The food on the dish may be (e.g. at least part of) a meal. The first image may be an image of the dish before the dish is served as (e.g. at least part of) a meal. The second image may be an image of the dish after the meal has been finished, for example after a person has had an opportunity to eat the food.
The at least one first characteristic may be a plurality of first characteristics. The at least one second characteristic may be a plurality of second characteristics.
The first characteristic may be representative of food type. The first characteristic may be representative of temperature, e.g. food temperature, e.g. the temperature of one or more (e.g. each) type(s) of food. The first characteristic may be representative of food quantity, e.g. the quantity of one or more (e.g. each) type(s) of food. The first characteristic may be representative of food quality, e.g. the quality of one or more (e.g. each) type(s) of food. The first characteristic may be representative of food weight, e.g. the weight of one or more (e.g. each) type(s) of food. The first characteristic may be representative of food volume, e.g. the volume of one or more (e.g. each) type(s) of food. The first characteristic may be representative of food nutritional information, e.g. the nutritional information of one or more (e.g. each) type(s) of food. The first characteristic may be representative of food calorific information, e.g. how many calories are in the food on the dish, e.g. how many calories are in one or more (e.g. each) type(s) of food on the dish. The first characteristic may be representative of presence of non-food items. The first characteristic may be representative of presence of undercooked food. The first characteristic may be representative of presence of overcooked food. The first characteristic may be representative of presence of allergens. The first characteristic may be representative of whether at least one suitability criterion is met. The determination of the first characteristic may be performed with known data analysis techniques, such as the use of machine learning algorithms. The suitability criterion may be indicative of a suitability of food on the dish for consumption.
Where the first characteristic is representative of temperature, it may be representative of the temperature of the dish. Where the first characteristic is representative of temperature, it may be representative of the temperature of a region of the dish. Where the first characteristic is representative of temperature, it may be representative of the temperature of food on the dish. Where the first characteristic is representative of temperature, it may be representative of the temperature of a type of food on the dish.
The temperature may be an average (e.g. weighted average, e.g. mean, e.g. mode, e.g. median) temperature of one or more (e.g. each) type(s) of food on the dish and/or the temperature of the dish. The temperature may be an average (e.g. weighted average, e.g. mean, e.g. mode, e.g. median) temperature of one or more regions of the dish. The first characteristic may be an estimate of temperature.
The second characteristic may be representative of food type. The second characteristic may be representative of temperature, e.g. food temperature, e.g. the temperature of one or more (e.g. each) type(s) of food. The second characteristic may be representative of food quantity, e.g. the quantity of one or more (e.g. each) type(s) of food. The second characteristic may be representative of food quality, e.g. the quality of one or more (e.g. each) type(s) of food. The second characteristic may be representative of food weight, e.g. the weight of one or more (e.g. each) type(s) of food.
The second characteristic may be indicative of food volume, e.g. the volume of one or more (e.g. each) type(s) of food. The second characteristic may be representative of food nutritional information, e.g. the nutritional information of one or more (e.g. each) type(s) of food. The second characteristic may be representative of food calorific information, e.g. how many calories are in the food on the dish, e.g. how many calories are in one or more (e.g. each) type(s) of food on the dish. The second characteristic may be representative of presence of non-food items. The second characteristic may be representative of presence of undercooked food. The second characteristic may be representative of presence of overcooked food. The second characteristic may be representative of presence of allergens. The second characteristic may be representative of whether at least one suitability criterion is met. The determination of the second characteristic may be performed with known data analysis techniques, such as the use of machine learning algorithms.
Where the second characteristic is representative of temperature, it may be representative of the temperature of the dish. Where the second characteristic is representative of temperature, it may be representative of the temperature of a region of the dish. Where the second characteristic is representative of temperature, it may be representative of the temperature of food on the dish. Where the second characteristic is representative of temperature, it may be representative of the temperature of one or more (e.g. each) type(s) of food on the dish. The second characteristic may be an estimate of temperature. The estimate of temperature may be accurate to within at least 5 °C, e.g. at least 3 °C, e.g. at least 1 °C, e.g. at least 0.5 °C.
It will be understood that an estimate of temperature is accurate to within at least 5 °C, e.g. at least 3 °C, e.g. at least 1 °C, if a temperature sensor (e.g. a food thermometer) would provide a temperature measurement within 5 °C (e.g. within 3 °C, e.g. within 1 °C) of the estimate.
Where the first or second characteristic is indicative of any of food type, food quantity, food quality, food weight, food volume, food nutritional information, and/or food calorific information, this is helpful in any setting where such parameters are monitored. For example, this is helpful to people who are following a particular diet, to healthcare professionals who are monitoring the effects of a particular diet on a patient or a group of patients, and to kitchen and catering professionals. Where the first or second characteristic is indicative of food temperature, this eliminates the requirement for a food thermometer, and thus reduces the risk of contamination. Where the first or second characteristic is representative of the presence of non-food items, undercooked food, or allergens, the apparatus improves the safety of food being served, as the risk of providing items which should not be eaten is reduced. Where the first or second characteristic is representative of the presence of overcooked food, this reduces the risk of the food not being enjoyed by the consumer. As food which is not enjoyed is more likely to be rejected, this can help to reduce food waste.
The first characteristic may be representative of which types of food have been consumed. The first characteristic may be representative of a change in temperature, e.g. a change in food temperature. The first characteristic may be representative of how much food has been consumed (e.g. the weight or volume of food that has been consumed, optionally the weight or volume of each type of food that has been consumed). The first characteristic may be representative of nutrients of the food that has been consumed (e.g. the nutrients of each type of food that has been consumed).
The first characteristic may be representative of how many calories have been consumed (e.g. how many calories have been consumed from each type of food).
The second characteristic may be representative of which types of food have been consumed. The second characteristic may be representative of a change in temperature, e.g. a change in food temperature (e.g. the change in temperature of one or more types of food). The second characteristic may be representative of how much food has been consumed (e.g. the weight or volume of food that has been consumed, optionally the weight or volume of each type of food that has been consumed). The second characteristic may be representative of nutrients of the food that has been consumed (e.g. the nutrients of each type of food that has been consumed). The second characteristic may be representative of how many calories have been consumed (e.g. how many calories have been consumed from each type of food).
The processor may be configured to determine a region of the first image corresponding to the food on the dish. The processor may be configured to determine the at least one first characteristic in dependence thereon. The processor may be configured to segregate the region of the first image into a plurality of sub-regions, indicative of a plurality of different types of food. The processor may be configured to determine the at least one first characteristic for each of the plurality of sub-regions.
The processor may be configured to determine a region of the second image corresponding to the food on the dish. The processor may be configured to determine the at least one second characteristic in dependence thereon. The processor may be configured to segregate the region of the second image into a plurality of sub-regions, indicative of a plurality of different types of food. The processor may be configured to determine the at least one second characteristic for each of the plurality of sub-regions.
It will be understood that determination of regions and/or segregation of the images into regions based on the type of food can be performed using known image-processing techniques. For example, segregation of the images into regions based on the type of food may comprise providing the images as inputs to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of known types of food on dishes). Alternatively or additionally, segregation of the images into regions based on the type of food may comprise providing the images as inputs to an edge detection algorithm.
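As an illustration of the edge-detection route, a minimal sketch is given below. The thresholds and minimum region area are illustrative assumptions; as the preceding paragraph notes, a trained segmentation model could equally be used.

```python
# A sketch of segregating the dish image into candidate food sub-regions
# using edge detection. Threshold values are illustrative, not prescribed.
import cv2
import numpy as np

def food_subregions(image_path: str, min_area: float = 500.0) -> list:
    """Return one binary mask per candidate food region found by edge detection."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                      # illustrative thresholds
    edges = cv2.dilate(edges, np.ones((5, 5), np.uint8))  # close small gaps
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    masks = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue  # ignore specks and glare highlights
        mask = np.zeros(gray.shape, dtype=np.uint8)
        cv2.drawContours(mask, [contour], -1, 255, thickness=cv2.FILLED)
        masks.append(mask)
    return masks
```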
In an example, the processor may be configured to:
segregate the region of the first image into a plurality of sub-regions indicative of a plurality of different types of food;
determine a first volume of each of the plurality of different types of food;
segregate the region of the second image into a plurality of sub-regions indicative of the plurality of different types of food;
determine a second volume of each of the plurality of different types of food; and
compare the first volume of each of the plurality of different types of food and the second volume of each of the plurality of different types of food to thereby determine a food consumption estimate.
The food consumption estimate may be an estimate comprising the volume of each of the different types of food that have been consumed. The processor may be configured to output a (e.g. the) food consumption estimate.
It will be understood that determination of volumes of food can be performed using known image-processing techniques. For example, determining volumes of food may comprise combining information from two images of food on the dish taken from two different angles to thereby determine a food surface and subsequently determining (e.g. by summing, optionally by integrating) the volume under that surface. Additionally or alternatively, determining volumes of food may comprise providing the images as inputs to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of known volumes of food on dishes).
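A sketch of this volume comparison is given below, under the assumption that a height map (food height above the dish, per pixel) has already been recovered from the two camera angles, together with a per-pixel food-type label map; both of these inputs, and the unit choices, are assumptions for the example.

```python
# Summing the volume under the food surface per food type, and comparing
# before/after volumes to produce a food consumption estimate.
import numpy as np

def volumes_by_type(height_map: np.ndarray, labels: np.ndarray,
                    pixel_area_cm2: float) -> dict:
    """Sum the volume under the food surface separately for each food type."""
    volumes = {}
    for food_type in np.unique(labels):
        if food_type == 0:
            continue  # 0 = background / bare dish
        mask = labels == food_type
        volumes[int(food_type)] = float(height_map[mask].sum() * pixel_area_cm2)
    return volumes

def consumption_estimate(before: dict, after: dict) -> dict:
    """Volume of each food type consumed between the first and second image."""
    return {t: before[t] - after.get(t, 0.0) for t in before}
```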
Where the first or second characteristic is representative of a change in temperature, it may be representative of a change in the temperature of the dish. Where the first or second characteristic is representative of a change in temperature, it may be representative of a change in the temperature of a region of the dish. Where the second characteristic is representative of a change in temperature, it may be representative of a change in the temperature of food on the dish. Where the second characteristic is representative of a change in temperature, it may be representative of a change in the temperature of one or more (e.g. each) type(s) of food on the dish. The first or second characteristic may be an estimate of the change in temperature.
Such information is also helpful to people who are following a particular diet, to healthcare professionals who are monitoring the effects of a particular diet on a patient or a group of patients, and to kitchen and catering professionals. In addition, this information can be used to improve the apparatus. For example, if an allergen was present but the first characteristic was representative of the food being suitable, the apparatus could be improved to ensure that the said allergen is detected in future and that, where it is detected, the first characteristic is representative of the food not being suitable.
The apparatus may comprise one or more lights. The one or more lights may be configured to illuminate food on the dish. Advantageously, this helps to ensure that it is possible for high quality images to be recorded with the one or more imaging devices and limits the possibility of parts of the dish being obscured by shadow. Furthermore, imaging the food on the dish when illuminated with the one or more lights of the apparatus helps to ensure that one image recorded with the one or more imaging devices will have similar lighting to other images recorded with the one or more imaging devices. This is thus preferable to recording images where food on the dish is illuminated only with ambient lighting, e.g. ceiling lighting or daylight. It will be understood that the one or more lights of the apparatus are typically not provided by ceiling lights, lights which do not form part of the apparatus, or other ambient lighting.
The one or more lights may comprise one or more bulbs. The one or more lights may comprise one or more LEDs.
The apparatus may comprise a trolley. The trolley may comprise (e.g. house) the at least one imaging device (optionally at least one of the at least two imaging devices, optionally each imaging device). It will be understood that a trolley is a portable structure including a platform onto which one or more dishes may be placed for serving food onto the one or more dishes. A trolley may have movement means, such as in the form of one or more wheels.
A trolley provides a convenient platform on which to rest dishes before and during serving food onto the dishes. Because trolleys typically have wheels, they are also portable, making the apparatus particularly suited to serving food in settings where meals must be brought to people in different locations (for example in a hospital or care home). Where a trolley is provided which has the at least one imaging device, the dish can be imaged at the site where it is to be served, rather than in a kitchen, for example.
This reduces the risk of the suitability of the food changing between imaging and serving.
The or each imaging device may comprise a camera. One or more of the imaging devices may comprise an optical camera. It will be understood that an optical camera is a camera configured to capture images in the human-visible light spectrum (e.g. a light spectrum including wavelengths in the range of approximately 310 to 1,100 nanometres, e.g. 380 to 700 nanometres, e.g. 400 to 680 nanometres).
Typically, optical cameras are not able to capture images outside the human-visible light spectrum; however, this is not required. One or more of the imaging devices may comprise a multispectral camera. It will be understood that a multispectral camera is typically receptive to electromagnetic radiation both within and outside the human-visible light spectrum. One or more of the imaging devices may comprise a hyperspectral camera. One or more of the imaging devices may comprise a thermal camera. The thermal camera may be configured to capture images comprising pixels containing information indicative of temperature variations (e.g. in degrees Kelvin). It will be understood that a thermal camera is a camera configured to capture images in the longwave infrared light spectrum. The thermal camera may be configured to capture images comprising pixels containing information indicative of the wavelengths of light received by the camera, wherein the wavelengths of light are in the range of 7,500 nanometres to 14,500 nanometres (e.g. 7,800 nanometres to 14,200 nanometres, e.g. 8,000 nanometres to 14,000 nanometres).
One or more of the imaging devices may be an infrared (IR) camera. It will be understood that an IR camera is a camera configured to capture images in the (near) infrared light spectrum. The IR camera may be configured to capture images comprising pixels containing information indicative of the wavelength of light received by the camera, wherein the wavelengths of light are wavelengths in the range of 0.3 to 280 μm (e.g. 0.5 to 290 μm, e.g. 0.5 to 300 μm).
The imaging device may comprise at least one optical camera and at least one thermal camera (e.g. two thermal cameras).
Where the first or second characteristic is representative of a temperature or a change in temperature, this may be determined on the basis of an image recorded by a thermal or IR camera. Determination of a temperature or a change in temperature on the basis of an image recorded by a thermal camera (optionally an IR camera) may be performed using known image-processing techniques. Determination of a temperature or a change in temperature on the basis of an image recorded by a thermal camera (optionally an IR camera) may be carried out on the basis of pixel information of the pixels in the image. For example, it may be that each pixel comprises data indicative of a temperature measurement, and determination of a temperature or a change in temperature may comprise receiving the data indicative of a temperature measurement (e.g. the data of each pixel). Determination of a temperature or a change in temperature may comprise receiving the data indicative of a temperature measurement (e.g. the data of each pixel) and determining an average (e.g. mean) temperature across two or more adjacent pixels (e.g. from the data of two or more adjacent pixels).
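A minimal sketch of this per-pixel temperature processing is given below, assuming the thermal camera already delivers a temperature reading per pixel; the units follow the camera (e.g. kelvin) and the 3x3 smoothing window is an illustrative choice.

```python
# Deriving a temperature characteristic from a thermal image in which each
# pixel encodes a temperature reading, including the averaging over two or
# more adjacent pixels mentioned in the text.
import numpy as np
from scipy.ndimage import uniform_filter

def smoothed(thermal: np.ndarray, size: int = 3) -> np.ndarray:
    """Average each pixel with its neighbours to suppress sensor noise."""
    return uniform_filter(thermal.astype(float), size=size)

def mean_region_temperature(thermal: np.ndarray, mask: np.ndarray) -> float:
    """Mean temperature over a region of interest, e.g. one food sub-region."""
    return float(smoothed(thermal)[mask].mean())
```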
Where the first or second characteristic is representative of the quality of the food on the dish, this may be determined on the basis of an image recorded by a multispectral camera, optionally on the basis of an image recorded by a hyperspectral camera.
Determination of the quality of the food on the dish may comprise providing an image recorded by a multispectral camera (optionally a hyperspectral camera) as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of food on dishes where the food is known to be of the quality and images of food on dishes where the food is known not to be of the quality). It will be understood that the food quality may, for example, be an indicator that the food has been cooked correctly (e.g. not undercooked, overcooked, or burned).
The apparatus comprises an imaging device. The apparatus may comprise at least one imaging device. The apparatus may comprise a plurality of imaging devices. The apparatus may comprise at least two imaging devices. The apparatus may comprise exactly two imaging devices. The imaging device may comprise (e.g. be) a camera.
The imaging device may comprise (e.g. be) at least one camera. The imaging device may comprise (e.g. be) a plurality of cameras. The imaging device may comprise (e.g. be) at least two cameras. The imaging device may comprise (e.g. be) exactly two cameras. The imaging device may comprise (e.g. be) exactly two optical cameras and one thermal camera. The or each imaging device may (e.g. each) be configured to capture a (e.g. first, second, further, and/or additional) image of food on a dish. The or each imaging device may (e.g. each) be configured to capture a (e.g. first, second, further and/or additional) image of a dish in a predetermined location. The apparatus may comprise an imaging area. The or each imaging device may (e.g. each) be configured to capture a (e.g. first, second, further and/or additional) image of the imaging area.
The or each imaging device may (e.g. each) be configured to capture a (e.g. first, second, further and/or additional) image of a dish when the dish is in the imaging area (e.g. in use).
The provision of at least two imaging devices (e.g. at least two cameras) provides the advantage that the apparatus can be used to capture (e.g. at least) two images of a dish (e.g. food on a dish) from two different angles. The apparatus may comprise a first imaging device configured to capture an image of food on a dish from a first angle. The apparatus may comprise a second imaging device configured to capture an image of food on a dish from a second angle different to the first angle (optionally at the same time as the first image is captured). The apparatus may comprise one or more further imaging devices configured to capture an image of food on a dish from one or more further angles different to the first angle and different to the second angle (optionally at the same time as the first image is captured, optionally at the same time as the second image is captured). In an example, the first imaging device may be configured to capture the first image at an angle at least 10° from the angle at which the second imaging device is configured to capture the second image, optionally at least 20°, optionally at least 30°. The first imaging device may be configured to capture the first image at an angle less than 200° from the angle at which the second imaging device is configured to capture the second image, optionally less than 190°, optionally less than 180°.
Alternatively or additionally, the apparatus may be configured to allow the first imaging device to be moved to one or more positions relative to the dish so that one or more images may be captured from one or more (e.g. known) angles and distances. For example, the imaging device may be mounted on a track or a gantry.
An apparatus which allows for images to be captured from two or more different angles makes it easier to determine volumes of food (and optionally volumes of different types of food) on the dish. The use of at least two imaging devices (e.g. at least two cameras) also improves the accuracy when a characteristic of the food on the dish is determined.
In addition, the use of at least two cameras makes it easier to determine when a first food is positioned partially over a second food, different to the first food. The use of at least two cameras makes it easier to determine that a relatively small amount of food on the dish is not suitable, for example because the relatively small amount of food is burned.
The trolley may comprise a heating chamber. The heating chamber may be a cooking chamber. The heating chamber may be an oven. The heating chamber may be a microwave oven. The trolley may comprise a cooling chamber. The cooling chamber may be a refrigerator. The cooling chamber may be a freezer. The trolley may comprise one or more dishes. The trolley may comprise a container (e.g. a cupboard) for housing food items (for example one or more containers of food). The trolley may comprise a user interface. The trolley may comprise a container (e.g. a drawer) for housing items of cutlery. Typically, the user interface allows output of the indicator to the user. The indicator may be output to the user interface. The apparatus (e.g. the trolley) may comprise one or more containers of food. For example, the apparatus (e.g. the trolley) may comprise at least 10 containers of food. The apparatus may comprise at least 10 portions of food. The apparatus may comprise at least enough food to provide 10 meals. The apparatus may comprise no more than 100 containers of food. The apparatus may comprise no more than 100 portions of food. The apparatus may comprise no more than enough food to provide 100 meals.
The processor may be configured to output the indicator indicative of the first characteristic to the user interface. The processor may be configured to cause the user interface to display the indicator indicative of the first characteristic. The processor may be configured to output the indicator indicative of the second characteristic to the user interface. The processor may be configured to cause the user interface to display the indicator indicative of the second characteristic.
The user interface may comprise a visual output device, such as a display. For example, the user interface may comprise a display screen. The display screen may have a display area of at least 25 cm², e.g. at least 100 cm², e.g. at least 600 cm², typically less than 50,000 cm², e.g. less than 10,000 cm². The processor may be configured to prompt a user to input one or more security details via the user interface before using the apparatus. The one or more security details may comprise a username. The one or more security details may comprise a password. The one or more security details may comprise a PIN. The one or more security details may comprise a piece of biometric information (e.g. a fingerprint).
The user interface may comprise an audio output device, such as a speaker.
A type of food may comprise (e.g. be) a food group, e.g. meat, fish, vegetable, fruit, fungus, dairy, eggs, cereal, legume, confection, baked goods, etc. A type of food may comprise (e.g. be) a macronutrient, e.g. protein, fat, carbohydrate. A type of food may comprise (e.g. be) a food menu item, e.g. macaroni and cheese, lamb dhansak, ham sandwich, etc. A type of food may comprise (e.g. be) a specific food type, for example an ingredient, e.g. chicken, broccoli, pasta, etc. A type of food may comprise (e.g. be) a dietary requirement food type, e.g. vegan, vegetarian, kosher, halal, gluten free, dairy free, liquid-based, texture-modified, etc. In some examples, it may be that the food has multiple food types associated with the same meal, such as all of ham and cheese sandwich, meat, dairy, carbohydrate, protein, and fat.
Such a trolley is advantageous where meals must be brought to people in different locations because it reduces the requirement for food to be moved back and forth between a kitchen and a location where the food will be eaten. For example, where a trolley comprises a heating chamber, the food can be heated (optionally re-heated) and served onto a dish, and the dish can then immediately be imaged. If the first characteristic is indicative of food that is not suitable because it is too cold, the food can be heated further without having to first move the food to a kitchen.
The processor may be configured to output the indicator indicative of the first characteristic to a device, for example a mobile phone or a tablet. The processor may be configured to output the indicator indicative of the first characteristic to a database.
The processor may be configured to output the indicator indicative of the second characteristic to a device, for example a mobile phone or a tablet. The processor may be configured to output the indicator indicative of the second characteristic to a database. The information in the database may be accessible via a user's device, such as a laptop, desktop computer, smartphone, or tablet. The information in the database may be accessible via a computer program, e.g. an app. It may be that the apparatus comprises a transmitter configured to be in data communication with a further device and to transmit the indicator to the further device. The database may be stored on a further device. The apparatus may comprise a wired communication link. The apparatus may comprise a wireless communication link.
The processor may be configured to output the indicator indicative of the first characteristic to a database at a first time, when the first characteristic has been determined. Alternatively, the processor may be configured to output the indicator indicative of the first characteristic to a database at a second time, following a delay after the first characteristic has been determined. The processor may be configured to output the indicator indicative of the second characteristic to a database at a first time, when the second characteristic has been determined. Alternatively, the processor may be configured to output the indicator indicative of the second characteristic to a database at a second time, following a delay after the second characteristic has been determined.
Where a trolley comprises a user interface, the indicator indicative of the at least one characteristic can be conveniently output to this user interface. This means that the user does not need an additional device in order to receive the indicator.
The processor may be configured to output an instruction in dependence on the at least one characteristic. The processor may be configured to output the instruction to the user interface. The processor may be configured to cause the user interface to display the instruction. The processor may be configured to output the instruction to a device, for example a mobile phone or a tablet. The processor may be configured to output the instruction to a database.
A user of a trolley having a user interface to which instructions are output can straightforwardly follow these instructions without having to refer to another source of instructions, or to move elsewhere to receive instructions.
The instruction may be an instruction to serve the food. For example, the at least one characteristic may be representative of whether at least one suitability criterion has been met, and the instruction may be an instruction to serve the food on the condition that the at least one characteristic is representative of the at least one suitability criterion having been met. The instruction may be an instruction to heat the food. The instruction may be an instruction to re-heat the food. The instruction may be an instruction to cool the food. The instruction may be an instruction to remove a non-food item from the dish. The instruction may be an instruction to reject the food, for example if the at least one characteristic is representative of at least one suitability criterion not being met (e.g. if the at least one characteristic is representative of food that is burned).
Because an instruction is output, the user knows whether any steps need to be taken before the food is suitable to be served. For example, if an instruction is output to reject the food, the user can reject the food and start again, without the need to carry out any other tests or actions first. If an instruction is output to heat the food (optionally to re-heat the food), the user can try heating the food before they try other steps which may lead to the food being suitable to be served, in which case it may be that the food does not need to be rejected, and thus food waste is reduced.
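A sketch of how a determined characteristic might be mapped to one of these instructions is given below; the characteristic fields and the serving temperature threshold are illustrative assumptions, not values taken from the disclosure.

```python
# Mapping a determined characteristic to an instruction for the user.
SERVE_MIN_C = 63.0  # illustrative hot-serving threshold, an assumption

def instruction(characteristic: dict) -> str:
    """Return an instruction in dependence on the at least one characteristic."""
    if characteristic.get("burned") or characteristic.get("allergen_detected"):
        return "reject the food"
    if characteristic.get("non_food_item"):
        return "remove the non-food item from the dish"
    if characteristic.get("temperature_c", SERVE_MIN_C) < SERVE_MIN_C:
        return "re-heat the food"
    return "serve the food"

# Usage:
# print(instruction({"temperature_c": 48.5}))  # -> "re-heat the food"
```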
The apparatus may further comprise computer readable memory (e.g. a non-transitory computer-readable storage medium) comprising (e.g. storing) a database. The database may comprise dietary requirement data indicative of the dietary requirements of one or more people. The database may comprise food selection information of one or more people. The processor may be configured to compare the at least one characteristic with the dietary requirements of (e.g. the dietary requirement data associated with) a respective person. The processor may be configured to compare the at least one characteristic with the food selection information of a respective person. The database may comprise the indicator. The database may comprise (e.g. data or information associated with) the at least one first characteristic. The database may comprise (e.g. data or information associated with) the at least one second characteristic. The database may comprise (e.g. data or information associated with) the one or more further characteristics.
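A minimal sketch of such a dietary-requirements comparison is given below; the data layout and field names are assumptions made for the example.

```python
# Comparing a determined characteristic with stored dietary requirement data.
DIETARY_DB = {
    "person_17": {"excluded_allergens": {"gluten", "peanut"},
                  "diet": "low_sugar"},
}

def suitable_for(person_id: str, detected_allergens: set, food_tags: set) -> bool:
    """True if the imaged dish matches the person's dietary requirements."""
    requirements = DIETARY_DB[person_id]
    if detected_allergens & requirements["excluded_allergens"]:
        return False  # an excluded allergen was detected on the dish
    if requirements["diet"] == "low_sugar" and "high_sugar" in food_tags:
        return False  # dish conflicts with the person's low-sugar diet
    return True
```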
The trolley may comprise the computer readable storage medium; however, this is not required, and the computer readable storage medium may be remote from the trolley.
The computer readable storage medium may be a (e.g. cloud-based) computer readable storage medium in data communication with the apparatus via a wireless communication link (e.g. a Wi-Fi link). The computer readable storage medium may be a (e.g. cloud-based) computer readable storage medium in data communication with the apparatus via a wired communication link (e.g. an Ethernet link). The computer readable storage medium may be an Internet of Things (IoT) computer readable storage medium. The computer readable storage medium may be a component within an IoT environment, optionally within a local area network.
The provision of a database including dietary information allows for personalised (and thus improved) nutrition to be provided to the people to whom the food is served. For example, where a respective person is following a low-sugar diet, this information may be included in such a database and the at least one characteristic may then only be indicative of food that is suitable for consumption (e.g. by the respective person) if the food is low in sugar.
The dish may comprise a dish identifier (e.g. a dish identification means). It will be understood that a dish identifier is substantially any additional component or marking provided with the dish which can be observed or otherwise analysed (e.g. read) by an electronic reader, providing a unique or semi-unique identifier of the dish. For example, the dish identifier may comprise (e.g. be) a barcode. The dish identifier may comprise (e.g. be) a quick response (QR) code. The dish identifier may comprise (e.g. be) a radiofrequency identification (RFID) tag. The apparatus may comprise a sensor for reading the dish identifier (e.g. means for reading the dish identification means). For example, the sensor may comprise (e.g. be) a barcode scanner. The sensor may comprise (e.g. be) a QR code reader. The sensor may comprise (e.g. be) an RFID reader. The containers of food may each comprise a container identifier readable by the sensor. The containers of food may each comprise a batch code. The apparatus may comprise one or more items of cutlery. For example, the apparatus may comprise one or more knives. The apparatus may comprise one or more forks. The apparatus may comprise one or more spoons. The items of cutlery may each comprise a cutlery identifier readable by the sensor.
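For the optical identifiers (barcode or QR code), a minimal reading sketch is given below. The use of the pyzbar library is an assumption for the example; an RFID tag would be read via hardware-specific means instead.

```python
# Reading an optical dish identifier (barcode or QR code) from an image.
from typing import Optional
from PIL import Image
from pyzbar.pyzbar import decode

def read_dish_identifier(image_path: str) -> Optional[str]:
    """Return the first decoded dish identifier in the image, if any."""
    results = decode(Image.open(image_path))
    return results[0].data.decode("utf-8") if results else None

# Reading the identifier before and after the meal allows the two images to
# be matched to the same dish, as described in the following paragraphs.
```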
The dish identifier may be suitable for washing. For example, the dish identifier (e.g. tag) may be water-resistant. The dish identifier (e.g. tag) may be heat-resistant. The dish identifier (e.g. tag) may be suitable for washing in a dishwasher. The dish identifier (e.g. tag) may be suitable for washing at least 1,000 times, e.g. at least 5,000 times, e.g. at least 10,000 times. The dish identifier (e.g. tag) may be a re-writable dish identifier. The sensor may be suitable for use in a kitchen. For example, the sensor may be water-resistant. The sensor may be heat-resistant.
Where the dish comprises a dish identifier (e.g. a dish identification means) and the apparatus comprises means for reading the dish identification means, this can be used to track the dish. For example, the dish identifier can be read before the food is served to a person as a part of a meal and can be read again after the person has finished their meal, to ensure that the same dish is being imaged before and after the meal.
This reduces the risk of misleading comparisons being carried out. The database may comprise (e.g. contain) dish identification data.
The dish may be a plate. The dish may be a bowl. The dish may be a tray. The dish may comprise an anti-slip surface. The dish may comprise feet. The imaging device may be configured to capture a (e.g. first and/or second and/or further) image of the dish within a range of distances from the imaging device. The imaging device may be configured to capture a (e.g. first and/or second and/or further) image of the dish that is no further than 100 cm from the imaging device, e.g. no further than 80 cm, e.g. no further than 50 cm. The imaging device may be configured to capture a (e.g. first and/or second and/or further) image of the dish that is at least 1 cm from the imaging device, e.g. at least 5 cm, e.g. at least 10 cm.
The apparatus may comprise a temperature sensor. The apparatus may comprise a temperature probe. The apparatus may comprise a weight sensor.
A suitability of the food on the dish for consumption may be a parameter indicative of food type. A suitability of the food on the dish for consumption may be a parameter indicative of temperature, e.g. food temperature (optionally the temperature of one or more types of food). A suitability of the food on the dish for consumption may be a parameter indicative of food quantity (optionally the quantity of one or more types of food). A suitability of the food on the dish for consumption may be a parameter indicative of food volume (optionally the volume of one or more types of food). For example, the food may not be suitable if the volume of food on the dish is (e.g. significantly) lower or higher than expected. A suitability of the food on the dish for consumption may be a parameter indicative of food quality (optionally the quality of one or more types of food). A suitability of the food on the dish for consumption may be a parameter indicative of preference (e.g. preference of a person to whom the food is to be served). A suitability of the food on the dish for consumption may be a parameter indicative of selection (e.g. whether the food matches a food selection made by a person to whom the food is to be served). A suitability of the food on the dish for consumption may be a parameter indicative of presence or absence of undercooked food. A suitability of the food on the dish for consumption may be a parameter indicative of presence or absence of overcooked food. A suitability of the food on the dish for consumption may be a parameter indicative of presence or absence of non-food items.
A suitability of the food on the dish for consumption may be a parameter indicative of presence or absence of allergens.
9 Although the apparatus may comprise a trolley and the trolley may comprise at least one imaging device, this need not necessarily be the case. For example, the apparatus 11 may comprise at least one imaging device that is not housed in a or the trolley. The at 12 least one imaging device may be remote from a or the trolley. The trolley may comprise 13 at least one imaging device and the apparatus may comprise at least one further 14 imaging device. The at least one further imaging device may be remote from a or the trolley.
The at least one further imaging device may be an imaging device in a kitchen. The apparatus may comprise a kitchen pass (e.g. a chef's pass). The kitchen pass may comprise at least one (e.g. at least one further) imaging device. A kitchen pass will be understood to be an area where food that is ready to be served is placed for collection by a person who will serve it. The apparatus may comprise a kitchen gantry, for example a kitchen gantry comprising one or more heaters (e.g. one or more heat lamps). The kitchen gantry may comprise at least one (e.g. at least one further) imaging device.
Although the apparatus may comprise a trolley, this is not required. The apparatus may be wholly or partly integrated with the kitchen pass. The apparatus may be wholly or partly provided within a unit, for example a tabletop unit. The apparatus may comprise (e.g. be) a unit.
The unit may comprise a base portion. The base portion may comprise a (e.g. the) imaging area. The base portion may comprise (e.g. retain) a (e.g. the) sensor for reading a dish identifier. The base portion may comprise (e.g. retain) a (e.g. the) controller. The base portion may comprise (e.g. retain) a power source.
The imaging area may comprise a flat surface configured to receive a dish. The unit may comprise a support structure, optionally a support structure that is mechanically coupled to the base portion. The unit may comprise (e.g. the) one or more imaging devices (e.g. one or more cameras). The one or more cameras may be mounted or mountable to the support structure. The unit may comprise one or more housings configured to retain the one or more imaging devices. The one or more housings may be mounted or mountable to the support structure. The unit may comprise a display (optionally a screen). The display may be mounted or mountable to the support structure.
The support structure may be configured to allow one or more of the imaging devices (optionally housings) and/or display to be fixedly positioned at a range of distances relative to the imaging area. The support structure may be configured to allow one or more of the imaging devices (optionally housings and/or display) to be moved, e.g. slidably, relative to the imaging area. The unit may be a modular unit.
Accordingly, it may be that the first image is captured with a first imaging device and the second image is captured with a second imaging device. For example, where the apparatus comprises a trolley comprising an imaging device and a kitchen comprising an imaging device, the first image may be captured with an imaging device of the trolley and optionally the second image may be captured with an imaging device of the kitchen. In other words, the at least one first characteristic may be determined in dependence on a first image captured by a first imaging device and the at least one second characteristic may be determined in dependence on a second image captured by a second imaging device.
The at least one imaging device may be mounted to observe a serving area at which food is served onto the dish.
The trolley may comprise suspension. The trolley may be configured to withstand vibration and/or sudden impacts, e.g. during use. The imaging device may be configured to withstand vibration and/or sudden impacts, e.g. during use. The processor may be configured to withstand vibration and/or sudden impacts, e.g. during use. For example, the trolley, and/or the imaging device, and/or the processor may be configured to withstand impacts of at least 5 newtons. A trolley and/or an imaging device and/or a processor will be understood to have withstood an impact if it still functions after the impact.
Although the apparatus typically comprises the processor, the processor may be remote from the apparatus. For example, where the apparatus comprises a trolley, the trolley may comprise (e.g. house) the processor; however, in an alternative embodiment the processor may be remote from the trolley.
The at least one first characteristic may comprise a date and/or time stamp. The at least one second characteristic may comprise a date and/or time stamp. The one or more further characteristics may comprise a date and/or time stamp. The database may comprise (e.g. contain) date and/or time stamps (e.g. date and/or time stamp data or information) associated with the at least one first, at least one second and/or one or more further characteristics.
According to a second aspect of the invention, there is provided a method of using an apparatus as described hereinbefore, the method comprising:
    providing a dish;
    serving food onto the dish;
    capturing a first image of the dish with the imaging device at a first time;
    receiving the first captured image from the imaging device;
    determining at least one first characteristic of the food on the dish in dependence on the first image; and
    outputting an indicator indicative of the at least one first characteristic,
wherein the at least one first characteristic is representative of a suitability of the food on the dish for consumption. The steps of the method may be performed in the order set out hereinbefore.
It will be understood that determining at least one first characteristic of the food on the dish in dependence on the first image can be performed using known image-processing techniques. For example, determining at least one first characteristic of the food on the dish in dependence on the first image may comprise providing the first image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of food on dishes known to have the at least one first characteristic, and/or food on dishes known not to have the at least one first characteristic).
This is an efficient method of obtaining an indicator of food suitability, for example, ahead of serving a meal.
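As a purely illustrative sketch of such a machine-learning step, the following Python fragment trains a toy scikit-learn classifier on synthetic arrays standing in for labelled images; the image size, labels and choice of model are assumptions for demonstration only, not the algorithm of the invention:

```python
# Toy stand-in for a classifier trained on images of food known to have
# (label 1) or not have (label 0) the at least one first characteristic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for flattened 8x8 grayscale images of food on dishes.
has_characteristic = rng.normal(0.3, 0.1, size=(50, 64))
lacks_characteristic = rng.normal(0.7, 0.1, size=(50, 64))
X = np.vstack([has_characteristic, lacks_characteristic])
y = np.array([1] * 50 + [0] * 50)

model = LogisticRegression(max_iter=1000).fit(X, y)

first_image = rng.normal(0.3, 0.1, size=64)  # the captured first image
characteristic = model.predict(first_image.reshape(1, -1))[0]
print("suitable for consumption" if characteristic else "not suitable")
```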
The method may comprise heating the food before serving the food onto the dish. The method may comprise re-heating the food before serving the food onto the dish. The method may comprise cooling the food before serving the food onto the dish.
The method may comprise capturing at least one first image from a first angle with a first imaging device. The method may comprise capturing at least one second image from a second angle, different to the first angle, with a second imaging device.
The method may comprise receiving the first image from the imaging device and determining whether the image is an image of a dish. The method may comprise receiving the first image from the imaging device and determining whether the image is an image of food on a dish. It will be understood that determining whether the image is an image of a dish (optionally an image of food on a dish) can be performed using known image-processing techniques. For example, determining whether the image is an image of a dish (optionally an image of food on a dish) may comprise providing the image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of dishes, and/or food on dishes, and/or images which do not contain dishes or food on dishes).
The method may comprise receiving the first image from the imaging device and determining where on the dish food is located in the image. It will be understood that determining where on a dish food is located in an image can also be performed using known image-processing techniques. For example, determining where on a dish food is located in an image may comprise providing the image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of dishes with food on the dishes in known locations).
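As an aside, a simple pre-check that a captured image actually contains a dish need not use machine learning at all; since a plate rim is approximately circular, a classical circle detector can act as a stand-in. The following OpenCV sketch is illustrative only, and its detector parameters are assumptions:

```python
# Illustrative dish pre-check using OpenCV's Hough circle transform.
import cv2
import numpy as np

frame = np.zeros((200, 200), dtype=np.uint8)
cv2.circle(frame, (100, 100), 70, 255, 3)  # synthetic plate rim

circles = cv2.HoughCircles(
    frame, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
    param1=100, param2=30, minRadius=40, maxRadius=90,
)
if circles is not None:
    print("image appears to contain a dish")
else:
    print("no dish detected; an alert could be generated")
```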
In some embodiments, the method may comprise receiving a food selection before serving food onto the dish. The method may comprise serving food onto the dish in dependence on the received food selection. The food selection may be a food selection made by a person to whom the food will be served. The food selection may be a food selection made by a healthcare professional. The food selection may be a food selection made by a kitchen or catering professional. The method may comprise receiving the food selection via a user interface, for example a user interface of a trolley.
The at least one first characteristic may be representative of whether at least one suitability criterion has been met. The method may comprise serving the food to a person if the at least one first characteristic is representative of at least one suitability criterion having been met. The method may comprise rejecting the food if the at least one first characteristic is representative of at least one suitability criterion not having been met. Where meals are only served when at least one suitability criterion is met, the meals are more likely to be eaten, with less rejected food or food wastage. The meals are also more likely to be safe for consumption, for example because they are less likely to include allergens or non-food items.
The method may comprise generating an alert if the at least one first characteristic is representative of at least one suitability criterion not having been met. The alert may comprise (e.g. be) a sound, for example an alarm. The alert may comprise (e.g. be) a light. For example, the alert may comprise (e.g. take the form of) a flashing light. The alert may comprise (e.g. take the form of) a light of a first colour (e.g. green) being switched off. The alert may comprise (e.g. take the form of) a light of a second colour (e.g. red) being switched on. The alert may comprise (e.g. be) a text-based alert. The alert may be displayed on a user interface. The method may comprise outputting the alert to a user interface (e.g. a user interface of a trolley, or of a device such as a tablet or a mobile phone). Where an alert is generated, it is less likely that a user of the method will fail to notice that the food is not suitable for consumption. This improves the safety of the method and reduces the risk of unsuitable food being served.
The alert may prompt a user to reject the food (for example, if the food is not suitable). The method may comprise rejecting the food in response to the alert. The alert may prompt a user to heat the food. The alert may prompt a user to re-heat the food. The method may comprise heating the food (for example in a heating chamber) in response to the alert. The method may comprise re-heating the food (for example in a heating chamber) in response to the alert. The alert may prompt a user to cool the food. The method may comprise cooling the food (for example in a cooling chamber) in response to the alert. The alert may prompt a user to add or remove one or more items from the dish. The method may comprise adding or removing one or more items from the dish in response to the alert. The method may comprise acknowledging the alert. The method may comprise overriding the alert.
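A minimal sketch of such alert generation is given below; the criterion names, threshold value and text-based alert channel are illustrative assumptions rather than requirements of the apparatus:

```python
# Sketch: derive alerts from characteristics when suitability criteria fail.
def check_suitability(characteristics: dict) -> list[str]:
    """Return one alert message per unmet suitability criterion."""
    alerts = []
    if characteristics.get("temperature_c", 0.0) < 63.0:  # assumed hot-holding threshold
        alerts.append("Food too cool. Re-heat before serving.")
    if characteristics.get("allergen_detected", False):
        alerts.append("Possible allergen detected. Reject the dish.")
    return alerts


for alert in check_suitability({"temperature_c": 48.0, "allergen_detected": False}):
    # A real apparatus might also sound an alarm or switch a red light on.
    print(f"ALERT: {alert}")
```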
The method may comprise capturing two or more (e.g. additional) images (e.g. with two or more imaging devices, optionally from two or more angles) at the first time. The method may comprise determining an indicator indicative of a (e.g. the) first characteristic (e.g. further) in dependence on the two or more (e.g. additional) images captured at the first time. The method may comprise determining an indicator of whether a first type of food is positioned (e.g. at least partially) over a second type of food, different to the first type of food, in dependence on the two or more (e.g. additional) images captured at the first time.
The two or more (e.g. additional) images may be captured from at least two different angles. It will be understood that determining an indicator of whether a first food type is positioned (e.g. at least partially) over a second type of food, different to the first type of food, in dependence on the two or more (e.g. additional) images captured at the first time can be performed using known image-processing techniques. For example, determining an indicator of whether a first food type is positioned (e.g. at least partially) over a second type of food, different to the first type of food, in dependence on the two or more (e.g. additional) images may comprise (see the sketch after this list):
    determining that a first type of food is present (e.g. by providing the images as inputs to a machine learning algorithm, for example a machine learning algorithm trained on training data comprising images of known types of food on dishes);
    determining that a second type of food is present (e.g. by providing the images as inputs to a machine learning algorithm, for example a machine learning algorithm trained on training data comprising images of known types of food on dishes);
    determining the locations of the first type of food and the second type of food (e.g. by providing the images as inputs to a machine learning algorithm, for example a machine learning algorithm trained on training data comprising images of foods on dishes in known locations);
    determining whether the first type of food is in contact with the second type of food in dependence on the determined locations of the first type of food and the second type of food; and if so
    determining a food surface of the first type of food and a food surface of the second type of food on the basis of the two or more images and determining on the basis of the food surfaces whether a first food type is positioned (e.g. at least partially) over a second type of food (optionally by providing the images as inputs to a machine learning algorithm, for example a machine learning algorithm trained on training data comprising images of foods known to be positioned (e.g. at least partially) over other foods on dishes).
In some embodiments, one or more of the steps described hereinabove may be an optional step. For example, in some embodiments, only the final step described hereinabove may be used.
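The following Python sketch illustrates the final two steps of the above sequence using synthetic per-pixel label masks and height maps in place of the machine-learning outputs; the mask values, heights and adjacency test are illustrative assumptions:

```python
# Sketch: contact test and surface comparison for "food type 1 over type 2".
import numpy as np

# Per-pixel labels: 0 = no food, 1 = first food type, 2 = second food type.
labels = np.zeros((8, 8), dtype=int)
labels[2:6, 2:6] = 2          # second food type
labels[3:5, 3:5] = 1          # first food type placed within that area

# Food-surface heights (mm above the dish) that a depth step might produce;
# the second food type is assumed to extend underneath the first.
height_type1 = np.where(labels == 1, 30.0, 0.0)
height_type2 = np.where(labels >= 1, 15.0, 0.0)


def in_contact(labels: np.ndarray) -> bool:
    """True if any type-1 pixel touches the type-2 region."""
    ys, xs = np.nonzero(labels == 1)
    for y, x in zip(ys, xs):
        window = labels[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
        if (window == 2).any():
            return True
    return False


def type1_over_type2(h1: np.ndarray, h2: np.ndarray) -> bool:
    """True where both surfaces exist and the type-1 surface is higher."""
    overlap = (h1 > 0) & (h2 > 0)
    return bool(overlap.any() and (h1[overlap] > h2[overlap]).all())


if in_contact(labels) and type1_over_type2(height_type1, height_type2):
    print("first food type appears to be positioned over the second")
```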
The method may comprise capturing a (e.g. at least one) second image of the dish with the or each imaging device. The method may comprise receiving the (e.g. at least one) second captured image from the or each imaging device. The method may comprise determining at least one second characteristic of the food on the dish in dependence on the (e.g. at least one) second image.
It will be understood that determining at least one second characteristic of the food on the dish in dependence on the second image can be performed using known image-processing techniques. For example, determining at least one second characteristic of the food on the dish in dependence on the second image may comprise providing the second image as an input to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of food on dishes known to have the at least one second characteristic, and/or food on dishes known not to have the at least one second characteristic).
The method may comprise comparing the (e.g. at least one) first image with the second (e.g. further) image. The method may comprise determining a second characteristic of the food on the dish in dependence on the comparison between the (e.g. at least one) first image and the second (e.g. further) image. The method may comprise outputting an indicator indicative of the second characteristic. The at least one second characteristic may be representative of whether at least one suitability criterion has been met.

The method may comprise moving the dish away from the imaging device between capturing the first image and capturing the second (e.g. further) image. The method may comprise providing the dish to a consumer (e.g. as at least part of a meal) after capturing the first image. The method may comprise receiving the dish from a consumer (e.g. after the consumer has had an opportunity to consume the food on the dish, e.g. as at least part of a meal) before capturing the second (e.g. further) image.
Because the second characteristic can be determined in dependence on a comparison between the first image and the second image, this also provides the advantage to the user that they can determine the suitability of the food for consumption at two (or more) different times. For example, this may be carried out before and after a meal to determine whether food initially determined to be suitable was in fact suitable.
The method may comprise capturing two or more (e.g. additional) images (e.g. with two or more imaging devices, optionally from two or more angles) at the second time. The method may comprise determining an indicator indicative of a (e.g. the) first characteristic (e.g. further) in dependence on the two or more (e.g. additional) images captured at the second time. The method may comprise determining an indicator indicative of whether a first type of food is positioned (e.g. at least partially) over a second type of food, different to the first type of food, in dependence on the two or more (e.g. additional) images captured at the second time.
For example, the method may comprise capturing a (e.g. at least one) second (e.g. further) image of the dish with the or each imaging device at a second time different to the first time. The method may comprise receiving the (e.g. at least one) second (e.g. further) captured image from the or each imaging device. The method may comprise determining at least one second characteristic of the food on the dish in dependence on the (e.g. at least one) second (e.g. further) image. The method may comprise comparing the first image with the (e.g. at least one) second (e.g. further) image. The method may comprise determining at least one second characteristic of the food on the dish (e.g. further) in dependence on the comparison between the (e.g. at least one) first image and the second (e.g. further) image.
The method may comprise determining a region of the first image corresponding to the food on the dish. The method may comprise determining the at least one first characteristic in dependence thereon. The method may comprise segregating the region of the first image into a plurality of sub-regions, indicative of a plurality of different types of food. The method may comprise determining the at least one first characteristic for each of the plurality of sub-regions.
The method may comprise determining a region of the second image corresponding to the food on the dish. The method may comprise determining the at least one second characteristic in dependence thereon. The method may comprise segregating the region of the second image into a plurality of sub-regions, indicative of a plurality of different types of food. The method may comprise determining the at least one second characteristic for each of the plurality of sub-regions.
It will be understood that segregation of the images into regions based on the type of food can be performed using known image-processing techniques. For example, segregation of the images into regions indicative of a plurality of different types of food may comprise providing the images as inputs to a machine learning algorithm (e.g. a machine learning algorithm trained on training data comprising images of known types of food on dishes).
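Purely by way of illustration, the sketch below segregates a tiny synthetic image into food-type sub-regions using a colour threshold as a stand-in for a trained segmentation model; the colours, thresholds and food-type labels are assumptions:

```python
# Toy segmentation of an RGB image into food-type sub-regions.
import numpy as np

rgb = np.zeros((4, 4, 3), dtype=float)
rgb[:2, :2] = [0.8, 0.7, 0.2]  # e.g. a rice-coloured region
rgb[2:, 2:] = [0.2, 0.6, 0.2]  # e.g. a vegetable-coloured region


def segment(rgb: np.ndarray) -> np.ndarray:
    """Assign a food-type label to each pixel (0 = background)."""
    labels = np.zeros(rgb.shape[:2], dtype=int)
    labels[(rgb[..., 0] > 0.5) & (rgb[..., 2] < 0.5)] = 1  # "rice"
    labels[(rgb[..., 1] > 0.5) & (rgb[..., 0] < 0.5)] = 2  # "vegetables"
    return labels


sub_regions = segment(rgb)
for food_type in (1, 2):
    print(f"type {food_type}: {np.count_nonzero(sub_regions == food_type)} pixels")
```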
The processor may be one or more processors. The apparatus may comprise a controller. The controller may comprise the one or more processors (e.g. the processor) and a memory configured to store processing instructions which, when executed by the one or more processors, cause the apparatus to carry out the functions of the processor described herein. The memory may be non-transitory, computer-readable memory. The memory may have the processing instructions stored thereon. The present invention extends to a non-transitory computer-readable medium (e.g. memory) having the processing instructions stored thereon to control the apparatus as described herein. The memory may be solid-state memory. The controller may be provided in a single device. In other examples, the controller may be distributed, having a plurality of processors (e.g. including the processor). A first processor (e.g. the processor) may be separated from a second processor in a distributed manner.
The method may comprise providing the dish to a respective consumer as at least part of a meal. The method may comprise receiving the dish from the respective consumer after the meal. The method may comprise providing the dish to a respective consumer as at least part of a meal and/or receiving the dish from the respective consumer after the meal, before capturing the (e.g. at least one) second image of the dish with the imaging device. It will be understood that "after the meal" includes occasions where the dish has been provided to a respective consumer and the respective consumer has had an opportunity to consume the meal, e.g. even if they do not consume the food or if they consume only some of the food. In some embodiments "after the meal" includes occasions where a predetermined mealtime has elapsed, e.g. even if the respective consumer was absent during the predetermined mealtime. The method may comprise disposing of leftover food after a meal.
Accordingly, the method can be used to provide a comparison of food on the dish before and after a meal. This comparison may be indicative of all of the food having been consumed. Alternatively, it may be indicative of only some of the food having been consumed, or of only certain types of the food having been (e.g. partially) consumed. This is useful to, for example, medical professionals who may wish to assess how much of a meal is consumed by a patient. For example, the comparison may be a comparison of the determined volume of food (optionally the volume of a type of food) on the dish before the meal, and a determined volume of food (optionally the volume of a type of food) on the dish after the meal (e.g. where the determination of volume of food and/or type(s) of food on the dish is carried out as described hereinabove) and may thereby provide an indicator of the total volume of food (optionally the total volume of the types of food) consumed. The indicator of the total volume of food consumed may be a food consumption estimate. The indicator of the total volume of food (optionally the total volume of the types of food) consumed may be used to determine further nutritional indicators, for example an estimate of the calorie content of the food consumed. This determination may be carried out with the use of a database containing information including the weight of food types and the nutritional content of known weights of food types, for example.
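A hedged sketch of this before/after comparison is given below; the per-type volumes would come from the image-processing steps described above, and the density and calorie figures are placeholder values standing in for the database, not real nutritional data:

```python
# Sketch: per-type volume comparison -> weight -> calorie estimate.
CALORIES_PER_GRAM = {"curried chicken": 1.5, "rice": 1.3, "vegetables": 0.4}
DENSITY_G_PER_ML = {"curried chicken": 1.0, "rice": 0.8, "vegetables": 0.5}

volume_before_ml = {"curried chicken": 220.0, "rice": 180.0, "vegetables": 150.0}
volume_after_ml = {"curried chicken": 190.0, "rice": 90.0, "vegetables": 10.0}

total_kcal = 0.0
for food, before in volume_before_ml.items():
    consumed_ml = max(before - volume_after_ml.get(food, 0.0), 0.0)
    grams = consumed_ml * DENSITY_G_PER_ML[food]
    kcal = grams * CALORIES_PER_GRAM[food]
    total_kcal += kcal
    print(f"{food}: {consumed_ml:.0f} ml consumed (~{kcal:.0f} kcal)")
print(f"estimated total: {total_kcal:.0f} kcal")
```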
The method may comprise outputting the indicator indicative of the at least one first characteristic to a user interface. The method may comprise outputting the indicator indicative of the at least one first characteristic to a user interface of a trolley. The method may comprise outputting the indicator indicative of the at least one first characteristic to a user interface of a device (e.g. a device owned by a user), for example a tablet or a mobile phone. The method may comprise outputting the indicator indicative of the at least one first characteristic to a database. The method may comprise accepting the at least one first characteristic, for example via the user interface. The method may comprise correcting the at least one first characteristic, for example via the user interface. The method may comprise outputting the indicator indicative of the at least one first characteristic at a first time when the at least one first characteristic has been determined. The method may comprise outputting the indicator indicative of the at least one first characteristic at a second time following a delay after the at least one first characteristic has been determined.
The method may comprise outputting the indicator indicative of the at least one second characteristic to a user interface. The method may comprise outputting the indicator indicative of the at least one second characteristic to a user interface of a trolley. The method may comprise outputting the indicator indicative of the at least one second characteristic to a user interface of a device, for example a tablet or a mobile phone. The method may comprise outputting the indicator indicative of the at least one second characteristic to a database. The method may comprise accepting the at least one second characteristic. The method may comprise correcting the at least one second characteristic, for example via the user interface. The method may comprise outputting the indicator indicative of the at least one second characteristic at a first time when the at least one second characteristic has been determined. The method may comprise outputting the indicator indicative of the at least one second characteristic at a second time following a delay after the at least one second characteristic has been determined.
The indicator may be output within 20 seconds of capturing the first image, e.g. within 10 seconds, e.g. within 2 seconds, e.g. within 1 second. The method may comprise outputting the indicator within 20 seconds of capturing the first image, e.g. within 10 seconds, e.g. within 2 seconds, e.g. within 1 second.
The method may comprise outputting an instruction in dependence on the at least one first characteristic. The method may comprise outputting an instruction in dependence on the at least one second characteristic. The method may comprise outputting the instruction to the user interface. The method may comprise causing the user interface to display the instruction. The method may comprise outputting the instruction to a device, for example a mobile phone or a tablet. The method may comprise outputting the instruction to a database. The method may comprise acknowledging the instruction. The method may comprise carrying out one or more steps in response to the instruction. The method may comprise overriding the instruction.
The dish may comprise a dish identifier. The apparatus may comprise a sensor for reading the dish identifier. The method may comprise identifying the dish with the sensor. For example, the dish identifier may comprise (e.g. be) a barcode, the sensor may comprise (e.g. be) a barcode scanner, and the method may comprise scanning the barcode with a barcode scanner. The dish identifier may comprise (e.g. be) a QR code, the sensor may comprise (e.g. be) a QR code reader, and the method may comprise reading the QR code with a QR code reader. The dish identifier may comprise (e.g. be) an RFID tag, the sensor may comprise (e.g. be) an RFID reader, and the method may comprise reading the RFID tag with an RFID reader. The one or more containers of food may each comprise a container identifier readable by the sensor. The method may comprise reading a container identifier with the sensor. The one or more items of cutlery may each comprise a cutlery identifier readable by the sensor. The method may comprise removing data from the dish identifier, e.g. after use. The method may comprise writing data to the dish identifier, e.g. before use.
The method may comprise an initial identification step, for example an initial identification step wherein the dish is identified before a meal. The method may comprise a subsequent identification step, for example a subsequent identification step wherein the dish is identified after a meal. This helps to ensure that the same dish is imaged before and after the meal. The or each identification step may comprise identifying the dish with the sensor (e.g. via the dish identifier).
The method may comprise determining a food consumption estimate in dependence on the comparison between the at least one first characteristic and the at least one second characteristic. The food consumption estimate may be an estimate of the quantity of food consumed (e.g. as a percentage by weight or volume of the food which was served onto the dish, or as a weight or volume of food). The food consumption estimate may be an estimate of the quantities of one or more types of food consumed (e.g. as a percentage by weight of those foods which were served onto the dish, or as a weight of each type of food consumed). The food consumption estimate may be an estimate of the weight of food consumed (e.g. the weight of each of one or more types of food that have been consumed). The food consumption estimate may be an estimate of the nutritional content of food consumed. The food consumption estimate may be an estimate of the nutritional content of each type of food consumed. The food consumption estimate may be a tiered estimate of the quantity of food consumed (e.g. including tiers indicative of "all or most of the food consumed"; "approximately half of the food consumed"; and/or "all or most of the food not consumed"). The food consumption estimate may be a tiered estimate of the quantity of each type of food consumed. The food consumption estimate may be an estimate of the calorific content of food consumed (e.g. an estimate of how many calories have been consumed in total). The food consumption estimate may be an estimate of the calorific content of each type of food consumed. The food consumption estimate may be an estimate of the types of food consumed. The food consumption estimate may be an estimate of the types of food not consumed. The method may comprise outputting the food consumption estimate.
In an example, the method may comprise:
    segregating the region of the first image into a plurality of sub-regions indicative of a plurality of different types of food;
    determining a first volume of each of the plurality of different types of food;
    segregating the region of the second image into a plurality of sub-regions indicative of the plurality of different types of food;
    determining a second volume of each of the plurality of different types of food; and
    comparing the first volume and the second volume to determine a food consumption estimate, wherein the food consumption estimate is an estimate comprising the volume of each of the different types of food that have been consumed.

The method may comprise outputting the food consumption estimate.
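The example above might be sketched as follows, here also mapping the result onto the tiered wording described earlier; the volumes and tier boundaries are illustrative assumptions:

```python
# Sketch: per-type volume comparison producing a tiered consumption estimate.
def tiered_estimate(fraction_consumed: float) -> str:
    if fraction_consumed >= 0.75:
        return "all or most of the food consumed"
    if fraction_consumed >= 0.25:
        return "approximately half of the food consumed"
    return "all or most of the food not consumed"


first_volume_ml = {"curried chicken": 220.0, "rice": 180.0}   # from first image
second_volume_ml = {"curried chicken": 190.0, "rice": 90.0}   # from second image

for food, v1 in first_volume_ml.items():
    fraction = (v1 - second_volume_ml[food]) / v1 if v1 else 0.0
    print(f"{food}: {tiered_estimate(fraction)}")
```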
Where the food consumption estimate is an estimate of the quantity of food consumed, and the estimate is indicative that less than 10%, e.g. less than 5%, e.g. less than 2%, e.g. 0% of the food was consumed, the method may comprise inputting a statement. For example, the statement may be a statement of a reason why more food was not consumed. The statement may be "patient absent"; "patient asleep"; "patient not hungry"; "food not suitable"; or "food not what was ordered", for example.
Such information is helpful to, for example, people who are following a particular diet, to healthcare professionals who are monitoring the effects of a particular diet on a patient or a group of patients, and to kitchen and catering professionals.
The apparatus may comprise computer-readable memory (e.g. a non-transitory computer-readable storage medium) comprising a database. The method may comprise writing the food consumption estimate to the database. The database may be stored on a further device.
The apparatus may comprise a wired communication link. The apparatus may comprise a wireless communication link. The apparatus may comprise a computer-readable storage medium. The apparatus may be configured to transmit and/or receive data (e.g. information) via a wired communication link. The method may comprise transmitting and/or receiving data (e.g. information) via a wired communication link. The apparatus may be configured to transmit and/or receive data (e.g. information) via a wireless communication link. The method may comprise transmitting and/or receiving data (e.g. information) via a wireless communication link. The apparatus may be configured to transmit and/or receive data (e.g. information) via Wi-Fi (TM). The method may comprise transmitting and/or receiving data (e.g. information) via Wi-Fi (TM). Wi-Fi (TM) is a family of wireless network protocols based on the IEEE 802.11 family of standards. The apparatus may be configured to transmit and/or receive data (e.g. information) via LoRa (TM). The method may comprise transmitting and/or receiving data (e.g. information) via LoRa (TM). LoRa (TM) is a low-power wide-area network modulation technique. The apparatus may be configured to transmit and/or receive data (e.g. information) via Bluetooth (TM). The method may comprise transmitting and/or receiving data (e.g. information) via Bluetooth (TM). Bluetooth (TM) is a short-range wireless technology standard. The apparatus may be an internet of things (IoT) apparatus. The apparatus may comprise an IoT platform, optionally having a connection to at least one server computer. The apparatus may be a component within an IoT environment, optionally within a local area network.
The data (e.g. information) may comprise the indicator indicative of the at least one first characteristic. The data (e.g. information) may comprise the indicator indicative of the at least one second characteristic. The data (e.g. information) may comprise the at least one first characteristic. The data (e.g. information) may comprise the at least one second characteristic. The data may comprise (e.g. be) raw (e.g. unprocessed) data. The data may comprise (e.g. be) information derived from raw (e.g. unprocessed) data. The data (e.g. information) may comprise a food consumption estimate.
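For illustration, such data might be serialised and posted over a wireless link as follows; the endpoint URL and payload fields are hypothetical, and a real apparatus could equally use LoRa (TM) or Bluetooth (TM) transports:

```python
# Sketch: transmit an indicator and consumption estimate as JSON over HTTP.
import json
import urllib.request

payload = {
    "dish_id": "DISH-0042",
    "first_characteristic": "suitable",
    "food_consumption_estimate": {"rice": 0.5, "vegetables": 0.9},
}

request = urllib.request.Request(
    "http://example.invalid/api/indicators",  # placeholder endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(request)  # disabled here: no real server exists
```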
The method may comprise processing the food consumption estimate to determine an estimate of food wastage. The estimate of food wastage may comprise (e.g. be) an estimate indicative of the types of food on the dish post-meal (e.g. after a consumer has had an opportunity to consume the food). The estimate of food wastage may comprise (e.g. be) an estimate indicative of the quantity of food left on the dish post-meal (e.g. the quantity of one or more (e.g. each) type(s) of food, e.g. after a consumer has had an opportunity to consume the food). The estimate of food wastage may comprise (e.g. be) an estimate indicative of the weight of food left on the dish post-meal (e.g. the weight of one or more (e.g. each) type(s) of food, e.g. after a consumer has had an opportunity to consume the food). The estimate of food wastage may comprise (e.g. be) an estimate indicative of the volume of food left on the dish post-meal (e.g. the volume of one or more (e.g. each) type(s) of food, e.g. after a consumer has had an opportunity to consume the food). The estimate of food wastage may comprise (e.g. be) an estimate indicative of the value of food left on the dish post-meal. The estimate of food wastage may comprise (e.g. be) an estimate indicative of the carbon footprint of the food left on the dish post-meal. In an example, the estimate of food wastage may comprise (e.g. be) an estimate indicative of the sum of a plurality of quantities of food left on a respective plurality of dishes post-meal (e.g. the quantity of one or more (e.g. each) type(s) of food, e.g. after a respective consumer has had an opportunity to consume the food on each dish).
The method may comprise determining a combined estimate of food wastage comprising a combination of two or more estimates of food wastage indicative of the food left on two or more dishes post-meal. For example, the combined estimate of food wastage may comprise a combination of estimates of food wastage indicative of food left on every dish served within a period (e.g. a period comprising a day, or a week, or a month). The combined estimate of food wastage may comprise a combination of estimates of food wastage indicative of food left on every dish served to a respective consumer within a period (e.g. a period comprising a day, or a week, or a month).
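A minimal sketch of combining per-dish wastage estimates over a period follows; the dish records and grouping keys are illustrative assumptions:

```python
# Sketch: aggregate leftover-food estimates across dishes and consumers.
from collections import defaultdict

# (consumer, date, leftover grams by food type) for each dish post-meal.
dish_wastage = [
    ("patient-A", "2023-03-01", {"rice": 40.0, "vegetables": 5.0}),
    ("patient-A", "2023-03-02", {"rice": 10.0, "vegetables": 0.0}),
    ("patient-B", "2023-03-01", {"rice": 90.0, "vegetables": 60.0}),
]

by_food = defaultdict(float)
by_consumer = defaultdict(float)
for consumer, _date, leftovers in dish_wastage:
    for food, grams in leftovers.items():
        by_food[food] += grams
        by_consumer[consumer] += grams

print(dict(by_food))      # combined wastage per food type over the period
print(dict(by_consumer))  # combined wastage per consumer over the period
```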
The method may comprise processing a plurality of food consumption estimates associated with a plurality of meals to determine the estimate of food wastage.
Information relating to food wastage is helpful to, for example, healthcare professionals and kitchen and catering professionals. For a healthcare professional, it is helpful to know whether prescribed food is being consumed, since, if it is not, this may affect patient recovery.
The database may comprise dietary requirement data indicative of the dietary requirements of one or more people. The database may comprise food selection information of one or more people. The method may comprise retrieving dietary requirement information of a respective consumer from the database. The method may comprise retrieving food selection information for a respective consumer from the database. The food selection information may be food selection information input (e.g. to the database) by a consumer; however, the food selection information may be food selection information input (e.g. to the database) by a user who is not the consumer, such as a medical professional. The food selection information may be input (e.g. to the database) via the apparatus, e.g. via the user interface. However, the food selection information may be input (e.g. to the database) in some other way, for example via a device separate from the apparatus, optionally remotely from the apparatus. The method may comprise processing the first image to determine whether the dish contains any food items incompatible with the dietary requirement information of the respective consumer. The method may comprise processing the first image to determine whether the food on the dish corresponds to the food selection information of the respective consumer. The method may comprise processing the second image to determine whether the dish contains any food items incompatible with the dietary requirement information of the respective consumer. The at least one first characteristic may be representative of whether the dish contains any food items incompatible with the dietary requirement information of the respective consumer. The at least one second characteristic may be representative of whether the dish contains any food items incompatible with the dietary requirement information of the respective consumer.
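By way of a hedged sketch, an incompatibility check against such dietary requirement data might look as follows; the database rows, food types and exclusion sets are illustrative assumptions:

```python
# Sketch: flag determined food types that a consumer must not receive.
dietary_requirements = {"patient-A": {"exclude": {"nuts", "shellfish"}}}


def incompatible_items(consumer: str, foods_on_dish: set[str]) -> set[str]:
    """Return any food types on the dish excluded for this consumer."""
    excluded = dietary_requirements.get(consumer, {}).get("exclude", set())
    return foods_on_dish & excluded


foods = {"curried chicken", "rice", "nuts"}  # e.g. from image segmentation
problems = incompatible_items("patient-A", foods)
if problems:
    print(f"ALERT: dish contains excluded items: {sorted(problems)}")
```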
The provision of a database including dietary information allows for personalised (and thus improved) nutrition to be provided to the people to whom the food is served. The method is thus a safer way of providing better nutrition to the people to whom the food is to be served.
The method may comprise one or more phases. For example, the method may comprise an outgoing-dishes phase. The outgoing-dishes phase may comprise outgoing-dishes steps of:
    providing a dish;
    serving food onto the dish;
    capturing a first image of the dish with the imaging device;
    receiving the first captured image from the imaging device;
    determining at least one first characteristic of the food on the dish in dependence on the first image;
    optionally outputting an indicator indicative of the at least one first characteristic; and
    optionally serving the food on the dish to a respective person (e.g. if the at least one characteristic is indicative of food that is suitable for serving).
The outgoing-dishes steps may be repeated. The outgoing-dishes phase may comprise the step of identifying the dish with the sensor (e.g. via the dish identifier).
The method may comprise an incoming-dishes phase. The incoming-dishes phase may comprise incoming-dishes steps of:
    receiving a dish from a respective person after a meal;
    capturing a second image of the dish with the imaging device;
    receiving the second captured image from the imaging device; and
    determining at least one second characteristic of the food on the dish in dependence on the second image.
The incoming-dishes steps may be repeated. The incoming-dishes phase may comprise the step of identifying the dish with the sensor (e.g. via the dish identifier).
The method may comprise outputting an indicator that the outgoing-dishes phase has started. The method may comprise acknowledging the indicator that the outgoing-dishes phase has started. The method may comprise outputting an indicator that the outgoing-dishes phase has ended. The method may comprise acknowledging the indicator that the outgoing-dishes phase has ended. The method may comprise outputting an indicator that the incoming-dishes phase has started. The method may comprise acknowledging the indicator that the incoming-dishes phase has started. The method may comprise outputting an indicator that the incoming-dishes phase has ended. The method may comprise acknowledging the indicator that the incoming-dishes phase has ended.
The method may be repeated at one or more food service times. The outgoing-dishes phase may be repeated at one or more food service times. The incoming-dishes phase may be repeated at one or more food service times. For example, the method may be carried out at a first food service time and at a second food service time. A food service may be a breakfast service. A food service may be a luncheon service. A food service may be a dinner service. The method may comprise determining the number of containers of food remaining after each food service time. The method may comprise determining the number of portions of food remaining after each food service time. The method may comprise writing the number of containers of food remaining after each food service time to a database. The method may comprise writing the number of portions of food remaining after each food service time to a database. The method may comprise loading containers onto the trolley. The method may comprise unloading containers from the trolley.
The method may comprise one or more calibration steps. For example, the method may comprise calibrating the imaging device. The method may comprise focussing the imaging device. The method may comprise calibrating the imaging device using a test card. The method may comprise calibrating the imaging device using a checkerboard test. The method may comprise calibrating the imaging device by imaging a known item (e.g. a reference item) and optionally correcting the imaging device and/or the processor. The method may comprise calibrating the imaging device by imaging an item having a known volume (e.g. a reference item having a reference volume) and optionally correcting the imaging device and/or the processor. The method may comprise calibrating the imaging device by imaging an item having a known temperature (e.g. a reference item having a reference temperature) and optionally correcting the imaging device and/or the processor.
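As one illustration of a checkerboard calibration step, the sketch below uses OpenCV's calibration API; the pattern size and image filename are assumptions, and a deployment would typically capture several views of a physical test card rather than one:

```python
# Sketch: checkerboard camera calibration with OpenCV.
import cv2
import numpy as np

PATTERN = (9, 6)  # inner corners of the assumed checkerboard test card
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
gray = cv2.imread("checkerboard_view.png", cv2.IMREAD_GRAYSCALE)  # hypothetical capture

if gray is not None:
    found, corners = cv2.findChessboardCorners(gray, PATTERN, None)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        error, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None
        )
        print("reprojection error:", error)  # used to correct the imaging device
```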
The method may comprise one or more checking steps. For example, the method may comprise checking that a captured image is an image of a dish. The method may comprise generating an alert if the captured image is not an image of a dish.
The method may comprise cleaning the dish. The method may comprise cleaning the one or more items of cutlery. The method may comprise cleaning the trolley. The method may comprise cleaning the apparatus.
The skilled person will appreciate that where an image or other information is provided as an input to a machine learning algorithm, an output will be generated. This output may be further processed. This output may be provided to a further machine learning algorithm. This output may be output to a display. This output may be recorded in a database.
It will be understood that food may include solid food (optionally partially solid food). It will be understood that food may include liquid (e.g. liquid food). In some embodiments food may include drinks.
It will be understood that any features described above in relation to the apparatus may also be optional features of the other aspects of the invention.
Description of the Drawings
An example embodiment of the present invention will now be illustrated with reference to the following Figures in which:

Figure 1 is a diagram of an example apparatus according to the invention, wherein the apparatus is provided in the form of a trolley;

Figure 2 is a diagram of a dish of food;

Figure 3A is a plot of an image of food on a dish before a meal; Figure 3B is a plot of a representation of food types appearing in the image of Figure 3A; Figure 3C is a plot of an image of food on a dish after a meal; Figure 3D is a plot of a representation of food types appearing in the image of Figure 3C;

Figure 4 is a flow chart of steps in an example embodiment of a method according to the invention;

Figure 5 is a flow chart of further steps in an example embodiment of a method according to the invention;

Figure 6 is a flow chart of further steps in an example embodiment of a method according to the invention;

Figure 7 is a flow chart of steps in an example embodiment of the invention; and

Figure 8 is a schematic illustration of an apparatus according to an example embodiment of the invention.
Detailed Description of an Example Embodiment
It will be understood by those skilled in the art that any dimensions and relative orientations such as lower and higher, above and below, and directions such as vertical, horizontal, upper, lower, longitudinal, axial, radial, lateral, circumferential, etc. referred to in this description refer to, and are within expected structural tolerances and limits for, the technical field and the apparatus described, and these should be interpreted with this in mind.
Referring to Figures 1 and 2, an example embodiment of an apparatus 1 according to the invention takes the form of a trolley 2. The trolley has a control module 4 having a user interface 6 in the form of a touch-sensitive display screen and an imaging device 8 in the form of a multispectral camera. The control module 4 also houses a processor (not shown). The control module is configured to transmit and receive information via Wi-Fi (TM).
The trolley 2 also has a heating chamber 10 in the form of an oven, and a cupboard 12 for holding food items 28 (here 40 containers of food, where each container contains enough food to provide one meal to one person), such as food items within trays of food, before the food items 28 are heated (if appropriate) and served. The trolley 2 provides a food preparation platform 14 including a serving area 16 and an imaging area 18 (though in some examples, it will be understood that the serving area 16 can be combined with the imaging area 18, such that imaging of the food items occurs in the same area in which the food is served onto the dish). Adjacent to the food preparation platform 14, the trolley 2 has an RFID reader 24. The trolley 2 is portable because it has wheels 20 (two of which are shown in Figure 1).
In use, a dish 22 is placed in the serving area 16 on the food preparation platform 14. The dish 22 has an RFID tag 26 and this is read by the RFID reader 24 to identify the dish. Food items 28 are removed from cupboard 12 and, if appropriate, are heated in the heating chamber 10 before being served onto the dish 22. The dish 22 and the food 28 on the dish 22 are then moved to the imaging area 18 of the food preparation platform 14 and are imaged with the imaging device 8. The processor receives the resulting first image, determines at least one characteristic of the food 28 on the dish 22 in dependence on the image, and outputs an indicator of the characteristic to the user interface 6. The indicator is output to the user interface 6 within 3 seconds of the dish 22 being moved to the imaging area. In this example embodiment, the determination of the at least one characteristic of the food 28 on the dish 22 is performed by providing the first image to a machine learning algorithm which has been trained on training data including images of food which is known to have the at least one characteristic and images of food which is known to not have the at least one characteristic.
If the characteristic is indicative of food 28 that is suitable to be served to a respective person, the user interface 6 displays a message to indicate that this is the case, and the user of the apparatus serves the food to the person. If the characteristic is indicative of food 28 that is not suitable to be served to a respective person, the user interface 6 displays a warning message to indicate that this is the case.
For example, if the characteristic is indicative of undercooked food, the warning message may be "Food undercooked. Heat further." The user of the apparatus then returns the food 28 to the heating chamber 10 for further heating or re-heating. After (re-)heating the food 28, the user images the dish 22 and the food 28 on the dish 22 a second time, to gain a second characteristic of the food 28 on the dish 22. If the second characteristic is indicative of food 28 that is suitable to be served to the person, the user interface 6 will display a message to indicate that this is the case. The user of the apparatus can then serve the food to the person. In some cases, if the second characteristic is indicative of food 28 that is still not suitable to be served to a respective person, the message may be a warning message and may indicate that the food 28 should be rejected.
After the respective person has had an opportunity to consume the food 28, the dish 22 and any remaining food is retrieved from the person. The RFID tag 26 of the dish 22 is read by the RFID reader 24 to identify the dish 22 and check that it is the same dish 22 that was provided to the person. The dish 22 and any remaining food on the dish 22 are then imaged with the imaging device 8. The processor receives the resulting further image and determines at least one further characteristic of the food 28 on the dish 22 in dependence on a comparison between the first image and the further image. The at least one further characteristic is indicative of the volume of each type of food 28 that has been consumed. In this example embodiment, the determination of the at least one further characteristic of the food 28 on the dish 22 is performed by providing the further image to a machine learning algorithm which has been trained on training data including images of food which is known to have the at least one further characteristic and images of food which is known to not have the at least one further characteristic.
In some cases, for example where less than 5% of the food 28 served has been consumed when the further image is captured, the at least one further characteristic is indicative of this. In that case, the processor outputs the at least one further characteristic to the user interface, along with a prompt for a statement. This prompt is intended to obtain an explanation for why relatively little of the food 28 was consumed. The statement may be, for example, "patient asleep", "patient absent", "patient not hungry", or "food not what was ordered".
The food preparation platform 14 has a wipe-clean surface. The wipe-clean surface is water-resistant and heat-resistant. The dish 22 is a ceramic plate. The cutlery is stainless steel cutlery. The RFID tag 26 is water-resistant and heat-resistant. The RFID tag 26 is suitable for washing.
Figures 3A to 3D are plots of images (here photographs) 50, 60 of food 28 on a dish 22 before and after a meal, and corresponding plots of representations 52, 62 of food types appearing in the images. Here, the imaging device 8 is used to capture a first image 50 of the food 28 on the dish 22 before the food 28 is served to a person. The processor processes this image 50 to determine the types of food on the dish 22 and produces a representation 52 of the types of food at each location in the image. In this example, the foods identified are curried chicken 54, rice 56, and vegetables 58. In this example embodiment, the determination of the types of food 28 on the dish 22 is performed by providing the first image to a machine learning algorithm which has been trained on training data including images of known types of food.
After the person has had an opportunity to eat the food 28, the dish 22 is retrieved and the imaging device 8 is used to capture a second image 60 of the dish 22 and any remaining food. The processor processes the second image 60 to determine the types of food remaining on the dish 22 and produces a second representation 62 of the types of food at each location in the image. In this example, the majority of the food identified as vegetables 58 is no longer on the dish 22, whereas the majority of the food identified as curried chicken 54 and some of the food identified as rice 56 remains on the dish 22. Accordingly, based on the comparison between the first 50 and second 60 images, the processor determines a second characteristic which is indicative of how much of each food 28 was eaten. This second characteristic is output to a database.
Figure 4 is a flow chart of steps in a method according to an example embodiment of the invention. In this method the first step is to provide 80 a dish 22. Food 28 is then served 82 onto the dish 22. An image is captured 84 of the food 28 on the dish 22. In the example embodiment the image is captured 84 after the food is served 82 onto the dish. The image is received 86 by the processor. The processor determines 88 a characteristic representative of the suitability of the food 28 on the dish 22 for consumption. The processor outputs 90 the characteristic (e.g. to the user interface 6).
Figure 5 is a flow chart of steps in a method according to an example embodiment of the invention, wherein these steps would typically follow those of Figure 4. Here, the processor determines 92 whether the characteristic is indicative of a suitability criterion being met. If the suitability criterion is met, the user serves 94 the food to a respective person. If the suitability criterion is not met, an additional action is taken 96. The nature of the additional action will depend on why the characteristic is indicative of a suitability criterion not being met. For example, if the characteristic is indicative of the suitability criterion not being met because the food is burned, the additional action may be to reject the food (e.g. to dispose of the food) and not to serve it to the person. In another example, if the characteristic is indicative of the suitability criterion not being met because the food is too cool, the additional action may be to (re-)heat the food. In each case, it is typical for an indicator of the additional action to be displayed on the user interface 6. The steps in the method of Figure 4 may then be repeated, followed by the steps in the method of Figure 5.
Figure 6 is a flow chart of steps in a method according to an example embodiment of the invention, including the steps shown in Figure 4, as well as steps which would follow those shown in Figure 5 if the characteristic is indicative of the suitability criterion being met. Here, as in Figure 4, the first step is to provide 80 a dish 22. Food 28 is then served 82 onto the dish 22. An image is captured 84 of the food 28 on the dish 22. The image is received 86 by the processor. The processor determines 88 a characteristic representative of the suitability of the food 28 on the dish 22 for consumption. The processor outputs 90 the characteristic (e.g. to the user interface 6). As the characteristic is indicative of the suitability criterion being met, the food 28 is then served 94 to a person. After the person has had an opportunity to consume the food 28, the dish 22 and any remaining food 28 is received 100 from the person. A second image is then captured 102 and the second image is received 104 by the processor. The processor determines 106 a second characteristic in dependence on the second image. In this example embodiment, the determination of the characteristic representative of the suitability of the food 28 on the dish 22 for consumption is performed by providing the first image to a machine learning algorithm which has been trained on training data including images of food which is known to have a characteristic indicative of food that is suitable for consumption and images of food which is known to have a characteristic indicative of food that is not suitable for consumption.
In some example embodiments, the second characteristic is output, for example to the user interface 6 and/or to a database.
Figure 7 is a flow chart of steps in an example embodiment of the invention, indicating some of the more detailed steps in an example of how information may be determined about the food 28 on the dish 22. In this example, the dish has an RFID tag 26 and the first steps are to scan 120 the RFID tag 26 and to receive 122 a meal order. When the food 28 has been served onto the dish 22 (step not shown in Figure 7), a first image 84 is captured from a first angle. This first image is then divided into regions 124 corresponding to different food types and the different food types are determined 126 (in this example, both of these steps are carried out by providing the image to a machine learning algorithm trained on training data including images of known foods on dishes).
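Assuming the trained model yields a per-pixel class map, the two steps of dividing the image into regions 124 and determining the food types 126 might reduce to the following post-processing. The class indices and names are illustrative.

```python
import numpy as np

CLASS_NAMES = {0: "background", 1: "curried chicken", 2: "rice", 3: "vegetables"}

def food_regions(class_map: np.ndarray) -> dict:
    """Return the food types present in the class map and their pixel areas."""
    regions = {}
    for cls, area in zip(*np.unique(class_map, return_counts=True)):
        name = CLASS_NAMES.get(int(cls), "unknown")
        if name != "background":
            regions[name] = int(area)
    return regions

class_map = np.array([[0, 1, 1],
                      [2, 2, 3]])
print(food_regions(class_map))
# {'curried chicken': 2, 'rice': 2, 'vegetables': 1}
```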
On the basis of the determined different food types, a meal is identified 128 (in this example, this is performed with reference to a database containing information about which food types are present in a plurality of known meals). The identified meal is compared with the received 122 meal order, to confirm whether the food 28 served onto the dish 22 corresponds with the meal that was ordered.
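One possible matching rule, shown only as a sketch, is an exact set match between the detected food types and a table of known meals; the table contents and the matching rule are assumptions.

```python
KNOWN_MEALS = {
    "chicken curry": {"curried chicken", "rice", "vegetables"},
    "fish and chips": {"fried fish", "chips", "peas"},
}

def identify_meal(detected_types: set) -> str:
    """Return the known meal whose food types exactly match, else ''."""
    for meal, foods in KNOWN_MEALS.items():
        if foods == detected_types:
            return meal
    return ""

order = "chicken curry"                          # received meal order
meal = identify_meal({"curried chicken", "rice", "vegetables"})
print(meal == order)   # True: the served food corresponds with the order
```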
At the same time as capturing 84 a first image from a first angle, a second image is captured 130 from a second angle. The first and second images are used to estimate 132 the depth of food on the dish (e.g. by first determining a food surface and comparing the height of this surface to the height of the surface of the dish). From the depth estimation, a volume 134 is calculated (e.g. by integrating). The identified 128 meal and the volume calculation 134 are used to determine the calorie contents of food 28 on the dish 22 (in this example, this is performed with reference to a database containing information about the weights of foods as a function of volume, and the calorie contents of foods as a function of weight).
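Taking the two-angle depth estimation as given, the volume and calorie steps might look as follows. The grid spacing, density, and calorie figures are illustrative assumptions standing in for the database entries.

```python
import numpy as np

DENSITY_G_PER_ML = {"rice": 0.75}   # weight of food as a function of volume
KCAL_PER_G = {"rice": 1.3}          # calories of food as a function of weight

def calories(depth_mm: np.ndarray, cell_area_mm2: float, food: str) -> float:
    """Integrate a per-cell food depth map to a volume, then convert to kcal."""
    volume_ml = float(depth_mm.sum()) * cell_area_mm2 / 1000.0  # mm^3 -> ml
    weight_g = volume_ml * DENSITY_G_PER_ML[food]
    return weight_g * KCAL_PER_G[food]

# 20 mm of rice over a 100 x 100 mm patch sampled on a 1 mm grid.
depth = np.full((100, 100), 20.0)
print(round(calories(depth, cell_area_mm2=1.0, food="rice")))   # 195 (kcal)
```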
Optionally, a thermal image may also be captured, and the method may include the step of checking that the food is at the correct temperature (e.g. by referring to a database indicating the correct serving temperature of meals and/or foods).
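A sketch of such a check is given below; the stored serving temperature and the use of a 95th-percentile reading (to ignore the cooler dish rim) are assumptions.

```python
import numpy as np

SERVING_TEMP_C = {"chicken curry": 63.0}   # assumed database entry

def temperature_ok(thermal_c: np.ndarray, meal: str) -> bool:
    """Compare a robust hot-region reading against the stored minimum."""
    measured = float(np.percentile(thermal_c, 95))
    return measured >= SERVING_TEMP_C[meal]

thermal = np.random.normal(loc=68.0, scale=2.0, size=(120, 160))
print(temperature_ok(thermal, "chicken curry"))   # True: hot enough to serve
```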
Optionally, the process of Figure 7 may be repeated after the meal has been served and a person has had an opportunity to consume the meal. A comparison may be made between the food types, volume, and calorie contents determined the first time the process is performed and those determined the second time, and thus an estimate of food consumed (e.g. type of food consumed, volume of food consumed, calories consumed, etc.) may be determined.
Optionally, data from any step in the method may be stored locally or remotely and/or may be further processed. Optionally, information relating to data from any step in the method may be displayed, for example on the user interface 6 or on another device.
Figure 8 is a schematic illustration of an apparatus 1 according to an example embodiment of the invention. The apparatus 1 comprises a user interface 6 and a controller 202. The controller 202 is configured to exchange signals 204 with the user interface 6. The controller 202 also typically transmits data externally, for example to further components of the apparatus 1, and/or to devices external to the apparatus, via a wireless data connection. The signals 204 include data received by the controller 202, for example from user inputs by a user via the user interface 6. The controller 202 in this example is realised by one or more processors 206 and a computer-readable memory 208. The memory 208 stores instructions which, when executed by the one or more processors 206, cause the apparatus 1 to operate as described herein.
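As a rough sketch only, the controller's signal exchange with the user interface could be modelled with a queue; the class and method names are hypothetical and nothing here reflects a required implementation.

```python
import queue

class Controller:
    """Controller 202: processor(s) plus memory, exchanging signals 204."""

    def __init__(self) -> None:
        self.to_ui: queue.Queue = queue.Queue()   # outbound signals 204

    def send_to_ui(self, payload: dict) -> None:
        self.to_ui.put(payload)            # e.g. a characteristic indicator

    def on_user_input(self, data: dict) -> None:
        print("received from UI:", data)   # data from user inputs

controller = Controller()
controller.send_to_ui({"characteristic": "temperature", "value_c": 68.0})
controller.on_user_input({"action": "serve"})
```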
Although the controller 202 is shown as being part of the apparatus 1, it will be understood that one or more components of the controller 202, or even the whole controller 202, can be provided separately from the apparatus 1, for example remotely from the apparatus 1, to exchange signals with the user interface 6 by wireless communication.
In summary, there is provided an apparatus (1) for determining at least one first characteristic of food (28) on a dish (22), the apparatus comprising: an imaging device (8) configured to capture a first image of the dish; and a processor configured to: receive the first image from the imaging device; determine the at least one first characteristic of the food on the dish in dependence on the first image; and output an indicator indicative of the at least one first characteristic, wherein the at least one first characteristic is representative of a suitability of the food on the dish for consumption.
Throughout the description and claims of this specification, the words "comprise" and "contain" and variations of them mean "including but not limited to", and they are not intended to and do not exclude other components, integers, or steps. Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.
Features, integers, characteristics, or groups described in conjunction with a particular aspect, embodiment, or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive. The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (24)

Claims

1. Apparatus for determining at least one first characteristic of food on a dish, the apparatus comprising:
at least two imaging devices, the at least two imaging devices comprising an optical camera and a thermal camera, each imaging device configured to capture at least one first image of the dish; and
a processor configured to:
receive the at least one first image from at least one of the at least two imaging devices;
determine the at least one first characteristic of the food on the dish in dependence on the first image; and
output an indicator indicative of the at least one first characteristic,
wherein the at least one first characteristic is representative of a suitability of the food on the dish for consumption.
2. Apparatus according to claim 1, wherein at least one of the at least two imaging devices is configured to capture images of the dish from a first angle, and wherein at least one of the at least two imaging devices is configured to capture images of the dish from a second angle, different to the first angle.
3. Apparatus according to claim 1 or claim 2, wherein at least one of the at least two imaging devices is configured to capture the at least one first image of the dish at a first time, and at least one second image of the dish at a second time different to the first time, and wherein the processor is configured to:
receive the at least one second image from the imaging device; and
determine at least one second characteristic of the food on the dish in dependence on the at least one second image.
4. Apparatus according to claim 1 or claim 2, wherein at least one of the at least two imaging devices is configured to capture the at least one first image at a first time, and at least one second image of the dish at a second time different to the first time, and wherein the processor is configured to:
receive the at least one second image from the imaging device; and
determine the at least one first characteristic of the food on the dish further in dependence on a comparison between the at least one first image and the at least one second image.
5. Apparatus according to claim 3 or claim 4, wherein the at least one second characteristic is representative of a suitability of the food on the dish for consumption.
6. Apparatus according to any preceding claim, wherein the first characteristic is representative of one or more of: food type; temperature; food quantity; food quality; food weight; food volume; food nutritional information; food calorific information; presence of non-food items; presence of undercooked food; presence of overcooked food; and presence of allergens.
7. Apparatus according to any of claims 3 to 5, wherein the second characteristic is representative of one or more of: which types of food have been consumed; a change in temperature; how much food has been consumed; nutrients of the food that has been consumed; and how many calories have been consumed.
8. Apparatus according to any preceding claim, wherein the apparatus comprises a trolley, the trolley housing at least one of the at least two imaging devices.
9. Apparatus according to claim 8, wherein the trolley comprises a heating chamber, one or more dishes, and a user interface.
10. Apparatus according to any preceding claim, wherein the processor is further configured to output an instruction in dependence on the at least one characteristic.
11. Apparatus according to claim 10, wherein the at least one characteristic is representative of whether at least one suitability criterion has been met, and wherein the instruction is an instruction to serve the food on the condition that the at least one characteristic is representative of the at least one suitability criterion having been met.
12. Apparatus according to any preceding claim, wherein the apparatus further comprises computer readable memory comprising a database, the database comprising dietary requirement data indicative of the dietary requirements of one or more people, and wherein the processor is configured to compare the at least one characteristic with the dietary requirements of a respective person.
13. Apparatus according to any preceding claim, wherein the dish comprises a dish identifier and the apparatus comprises a sensor for reading the dish identifier.
14. A method of using the apparatus according to any preceding claim, the method comprising:
providing a dish;
serving food onto the dish;
capturing at least one first image of the dish with the or each imaging device at a first time;
receiving the at least one first captured image from the or each imaging device;
determining at least one first characteristic of the food on the dish in dependence on the at least one first image; and
outputting an indicator indicative of the at least one first characteristic,
wherein the at least one first characteristic is representative of a suitability of the food on the dish for consumption.
15. A method according to claim 14, comprising:
capturing at least one first image from a first angle with a first imaging device, and
capturing at least one second image from a second angle, different to the first angle, with a second imaging device.
16. A method according to claim 14 or claim 15, wherein the at least one first characteristic is representative of whether at least one suitability criterion has been met, and wherein the method comprises serving the food to a person if the at least one first characteristic is representative of at least one suitability criterion having been met.
17. A method according to any one of claims 14 to 16, wherein the at least one first characteristic is representative of whether at least one suitability criterion has been met, and wherein the method comprises rejecting the food if the at least one first characteristic is representative of at least one suitability criterion not having been met.
18. A method according to any one of claims 14 to 17, wherein the at least one first characteristic is representative of whether at least one suitability criterion has been met, and wherein the method comprises generating an alert if the at least one first characteristic is representative of at least one suitability criterion not having been met.
19. A method according to any of claims 14 to 18, comprising:
capturing at least one second image of the dish with the or each imaging device at a second time different to the first time;
receiving the at least one second captured image from the imaging device; and
determining at least one second characteristic of the food on the dish in dependence on the at least one second image.
20. A method according to any of claims 14 to 19, comprising:
capturing at least one second image of the dish with the or each imaging device at a second time different to the first time;
receiving the at least one second captured image from the or each imaging device;
comparing the at least one first image with the at least one second image; and
determining at least one second characteristic of the food on the dish in dependence on the comparison between the at least one first image and the at least one second image.
21. A method according to claim 19 or claim 20, wherein before capturing the at least one second image, the method comprises:
providing the dish to a respective consumer as at least part of a meal; and
receiving the dish from the respective consumer after the meal.
22. A method according to any of claims 14 to 21, wherein the dish comprises a dish identifier and the apparatus comprises a sensor for reading the dish identifier, and wherein the method comprises identifying the dish with the sensor.
23. A method according to any of claims 19 to 22, wherein the apparatus comprises computer readable memory comprising a database, and wherein the method comprises:
determining a food consumption estimate in dependence on the comparison between the at least one first characteristic and the at least one second characteristic; and
writing the food consumption estimate to the database.
24. A method according to any of claims 14 to 23, wherein the apparatus comprises computer readable memory comprising a database, the database comprising dietary requirement data indicative of the dietary requirements of one or more people, and wherein the method comprises:
retrieving dietary requirement information of a respective consumer from the database; and
processing the first image to determine whether the dish contains any food items incompatible with the dietary requirement information of the respective consumer.
GB2302940.8A 2022-03-03 2023-02-28 Apparatus and method for determining food characteristics Pending GB2616522A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB2202984.7A GB202202984D0 (en) 2022-03-03 2022-03-03 Apparatus and method for determining food characteristics

Publications (2)

Publication Number Publication Date
GB202302940D0 GB202302940D0 (en) 2023-04-12
GB2616522A true GB2616522A (en) 2023-09-13

Family

ID=81175280

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB2202984.7A Ceased GB202202984D0 (en) 2022-03-03 2022-03-03 Apparatus and method for determining food characteristics
GB2302940.8A Pending GB2616522A (en) 2022-03-03 2023-02-28 Apparatus and method for determining food characteristics

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB2202984.7A Ceased GB202202984D0 (en) 2022-03-03 2022-03-03 Apparatus and method for determining food characteristics

Country Status (1)

Country Link
GB (2) GB202202984D0 (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5949427A (en) * 1982-09-14 1984-03-22 Toshiba Corp High-frequency heater
JP2000337639A (en) * 1999-05-26 2000-12-08 Sharp Corp Heating cooking appliance
JP2003106995A (en) * 2001-09-28 2003-04-09 Takai Seisakusho:Kk Quality determining method for gel forming food
US20050265423A1 (en) * 2004-05-26 2005-12-01 Mahowald Peter H Monitoring system for cooking station
US20070114224A1 (en) * 2004-03-17 2007-05-24 Sachio Nagamitsu Ingredient cooking-operation recognition system and ingredient cooking-operation recognition program
US20150289324A1 (en) * 2014-04-07 2015-10-08 Mark Braxton Rober Microwave oven with thermal imaging temperature display and control
WO2017121713A1 (en) * 2016-01-11 2017-07-20 Teknologisk Institut A method and device for scanning of objects using a combination of spectral ranges within vision, nir and x-rays
US20170262973A1 (en) * 2016-03-14 2017-09-14 Amazon Technologies, Inc. Image-based spoilage sensing refrigerator
US20170290095A1 (en) * 2016-03-30 2017-10-05 The Markov Corporation Electronic oven with infrared evaluative control
EP3232733A1 (en) * 2016-04-15 2017-10-18 Panasonic Intellectual Property Management Co., Ltd. System that emits light to overheated portion of cooking container
US20190234617A1 (en) * 2015-05-05 2019-08-01 June Life, Inc. Connected food preparation system and method of use
US20200139554A1 (en) * 2018-11-07 2020-05-07 Miso Robotics, Inc. Modular robotic food preparation system and related methods
US10819905B1 (en) * 2019-09-13 2020-10-27 Guangdong Media Kitchen Appliance Manufacturing Co., Ltd. System and method for temperature sensing in cooking appliance with data fusion
EP3742052A1 (en) * 2019-05-21 2020-11-25 Whirlpool Corporation Cooking assistance appliance
US20210385917A1 (en) * 2018-11-08 2021-12-09 BSH Hausgeräte GmbH Method for operating a domestic cooking appliance and domestic cooking appliance
US20220030677A1 (en) * 2018-12-10 2022-01-27 BSH Hausgeräte GmbH Method for operating a domestic cooking appliance and domestic cooking appliance

Also Published As

Publication number Publication date
GB202302940D0 (en) 2023-04-12
GB202202984D0 (en) 2022-04-20
