WO2023187573A1 - Eating rate estimation through a mobile device camera - Google Patents

Eating rate estimation through a mobile device camera

Info

Publication number
WO2023187573A1
Authority
WO
WIPO (PCT)
Prior art keywords
food intake
rate
user
food
mobile device
Prior art date
2022-03-28
Application number
PCT/IB2023/052924
Other languages
French (fr)
Other versions
WO2023187573A4 (en)
Inventor
Ioannis IOAKEIMIDIS
Kosmas DIMITROPOULOS
Petros DARAS
Dimitrios KONSTANTINIDIS
Original Assignee
Centre For Research And Technology Hellas
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2022-03-28
Filing date
2023-03-24
Publication date
2023-10-05
Application filed by Centre For Research And Technology Hellas
Publication of WO2023187573A1 publication Critical patent/WO2023187573A1/en
Publication of WO2023187573A4 publication Critical patent/WO2023187573A4/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B 5/1114 Tracking parts of the body
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/60 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/48 Other medical applications
    • A61B 5/486 Bio-feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/48 Other medical applications
    • A61B 5/4866 Evaluating metabolism
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74 Details of notification to user or communication with user or patient; user input means
    • A61B 5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • Physiology (AREA)
  • Evolutionary Computation (AREA)
  • Signal Processing (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Nutrition Science (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Obesity (AREA)
  • Databases & Information Systems (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention concerns a method for estimating an individual's food intake rate during a meal. It is based on a computerized program that can be executed on a handheld device, using the device's camera. It provides a user interface through which the user can monitor the estimated rate of food intake, with visual and auditory information presented simply and clearly to help the user adjust and maintain the rate of food consumption within normal limits. The method relies on lightweight deep learning networks trained to process video data and accurately detect food intake events. Its advantages include accurate estimation of the food intake rate, operation without additional or specialized sensors that would reduce usability and portability, and estimation of the rate of food consumption in near real time.

Description

DESCRIPTION
EATING RATE ESTIMATION THROUGH A MOBILE DEVICE CAMERA
[001] The present invention concerns a method for estimating the rate of food intake by means of a camera of a mobile device.
Technical field to which the invention relates
[0002] The invention relates to a method of estimating an individual's food intake rate via a mobile device, a suitable computerized program and highly discriminative deep learning networks.
[0003] In particular, this invention allows recording an individual with a mobile camera during a meal, processing the video in near real time, and estimating the individual's food intake rate, calculated as the number of food intake events per minute.
[0004] It has been shown in the literature that food intake rate is directly related to the development or existence of diet-related medical problems such as obesity, type 2 diabetes, gastroesophageal reflux, and metabolic syndrome.
[0005] Therefore, the purpose of the present invention is to be applied to people with such medical problems, or to people in general who are careful about their diet and metabolic rate so that they do not develop medical problems in the future. By means of visual and acoustic signals from the mobile device, the present invention may intervene during a meal and inform the user that he or she is exceeding the normal and permissible limits of food intake rate, thereby succeeding in slowing down or even preventing the development of diet-related medical problems.
Level of prior art and evaluation thereof
[0006] The need for an accurate assessment of food intake rate has been recognized in the literature as part of the more general problem of recording eating behaviour, and there have been several attempts in this direction, each with its own advantages and disadvantages.
[0007] Some inventions use the camera as the dominant element to identify the type and quantity of food consumed, but the camera is placed on the person, causing discomfort during the person's movements and thus limiting its usefulness. Additionally, such inventions cannot estimate the rate of food intake and therefore cannot provide users with valuable feedback to alter their eating behaviour. Examples of such technologies include EP1179799 (Method for monitoring food intake, Feb. 13, 2002, Karnieli Eddie, et al), US2015306771 (Apparatus for monitoring food consumption by an individual, Oct. 29, 2015, Dekar Jonathan), US2016012749 (Eyewear system for monitoring and modifying nutritional intake, Jan. 16, 2016, Connor Robert) and US2018348187 (Method, apparatus and system for food intake and physical activity assessment, Dec. 6, 2018, Fernstrom John et al).
[0008] Other disadvantages of the aforementioned technologies include the high cost and complexity of developing and maintaining these systems, as well as the need for powerful computing systems to process the collected data, further limiting their outdoor use and thus their portability and range of application.
[0009] To overcome the high computational needs of camera-based eating detection and rate estimation systems, methods such as KR20180116779 (An artificial intelligence based image and speech recognition nutritional assessment method, Oct. 26, 2018, Hae-Jeung Lee et al) resort to mobile devices equipped with a camera and processor, taking pictures of food at regular intervals or at the beginning and end of the meal, calculating the differences between the pictures and estimating the rate and volume of food consumption. However, such inventions either estimate only an overall rate of food intake once the meal is over, or have serious accuracy problems due to the great difficulty of estimating food volume from photographs.
[0010] Many attempts to estimate the changing rate of food intake have also been made using smart watches or wristbands. Such applications exploit the time signals generated by the accelerometer and gyroscope of these devices to recognize hand movements and correlate them with the corresponding movements made during food intake. In addition, several inventions employ jaw sensors to study chewing intensity and cameras to recognize the type of food consumed. Major drawbacks of these applications include the obtrusiveness of the wearable devices and the complexity of requiring multiple sensors to collect complementary data. In addition, such applications face accuracy problems, because movements of the individual to wipe or touch his/her face can be misidentified as food intake events. Examples of such inventions include WO2015046673 (Head Mounted Display and Method of Controlling the Same, Apr. 2, 2015, Kim Yongsin, et al), US2016073953 (Food intake monitor, Mar. 17, 2016, Sazonov) and US2017156634 (Wearable device and method for monitoring eating, Jun. 8, 2017, Li Mubing et al).
[0011] Finally, some recent studies have relied on portable scales that allow accurate estimation of both the changing rate of food intake and the volume of food per bite. A major drawback of such an application is the inherent limitation of the scale to provide weight information from only one food plate, which means that it is not possible to accurately estimate the rate of food intake when an individual consumes food from two or more plates. Furthermore, another disadvantage of such an application is the need to acquire and carry the portable scale wherever necessary, burdening the budget of each person concerned and limiting its usefulness.
Advantages of the invention
[0012] The invention addresses the problem of estimating a person's rate of food intake in near real time using a mobile camera and therefore offers specific advantages which are listed below:
A) Estimation of food intake rate with high accuracy thanks to the processing of video sequences, which provide information about both the body (including the hands) and the face of the user.
B) Estimation of food intake rate without the use of specialized or additional sensors (more than one camera, weight scales, jaw sensors or wristbands).
C) Estimation of food intake rate in near real time during a meal.
[0013] A detailed explanation and description of each of the advantages of the invention separately will be given below.
Advantage A: Accuracy in estimating the rate of food intake
[0014] The invention estimates food intake rate through rapid video processing and estimation of the number of food intake events per minute (bites/min). The invention is based on the use of highly discriminative deep learning networks trained on a database of labelled videos of individuals recorded during various meals, achieving high accuracy in estimating food consumption rate. The fact that the networks have been trained on different individuals and meals allows the invention to be used with high accuracy and robustness on new videos. Moreover, the fact that it is based on the combined processing of visual data of the user's movement (body, hands, face) provides the invention with the discriminating ability to avoid false detections that may be due to movements of the person to wipe, touch his/her face or simply raise his/her hand, thereby circumventing the drawbacks of applications based on non-visual data. Finally, the invention is not limited by the number of plates a person can consume from, thus avoiding the serious drawback of applications based on portable scales.
Advantage B: Portability and cost
[0015] The invention does not require additional or specialized sensors, and therefore the cost of use is minimal, since it operates as a computerized program that can run on a mobile device using only its camera, without forcing the user to acquire additional sensors or processing units. Since the majority of the general population has a mobile device, which is available wherever they go, it allows the invention to be used in order to estimate the rate of food intake without restrictions under any conditions, either indoors or outdoors. This fact increases the portability, utility and scope of the invention unlike existing applications or inventions.
Advantage C: Near real-time estimation
[0016] The invention is based on lightweight deep learning networks that allow rapid execution even on devices of small computational power, such as a mobile device. In this way, the estimation of the rate of food intake during a meal is achieved, allowing the invention to intervene through visual and auditory information and enabling the person to modify in situ the rate of food intake when a deviation from normal limits is observed. This fact has the effect of increasing the usefulness of the invention over other applications that assess the rate of food intake after the meal is over.
Disclosure of the invention
[0017] The invention relates to a novel method of estimating the rate of food intake during a meal using video data from a mobile device. Its main parts are as follows:
A system for recording and estimating individual food intake rate (Figure 1)
[0018] The system for recording and estimating individual food intake rate includes a user (1), who sits on a chair and consumes food from one or more plates (2) placed on a table. The user's mobile device (3) must be placed on the table at a distance of at least 60 cm from the user, so that the upper part of the user's body, including the hands and face, is visible to the camera of the mobile device throughout the meal. Furthermore, the distance of the mobile device from the user shall not exceed 100 cm, so that the user's movements remain clearly visible to the camera of the mobile device. Objects that interfere with the field of view of the camera and obscure the user's movements shall be removed. To achieve the above, a mounting bracket (4) for the mobile device may be used to record the user better. At this point, it should be noted that the conditions and distances mentioned above are ideal and ensure the proper functioning of the method for estimating the individual rate of food consumption. The system has also been tested at shorter or longer distances and/or with interfering objects, but in such cases a loss of accuracy may be observed. When ready, the user can press the start recording button of the computerized program and start eating. The deep learning networks are executed automatically and calculate the rate of food intake, which is received by the computerized program and presented to the user via visual and auditory signals. When the user finishes his/her meal, he/she can press the button again to stop the recording and the estimation of the food intake rate.
Neural networks for estimating the rate of food intake (Figure 2)
[0019] The neural networks are executed when the user presses the corresponding button to start recording and stop when the user presses the corresponding button to stop recording. Sophisticated deep learning techniques are employed to process the video data and estimate the rate of food intake. Initially, the video being recorded is divided into separate image frames (Step 1). Then, a neural network of high discrimination ability (Neural Network 1) is applied, extracting spatial feature vectors that describe the content depicted in each frame (Step 2). The extracted feature vectors from consecutive image frames are collected using a sliding window of 2 seconds to form temporal information for each equal-duration video sequence (Step 3). This information is then fed to a second neural network (Neural Network 2), which analyzes the temporal sequence of the information in order to identify the presence or absence of a food intake event (Step 4). Finally, an accumulation variable is used to aggregate the total number of food intake events per minute and thereby estimate the rate of food consumption (Step 5). Due to the sliding window, the food intake rate is updated approximately every 10 seconds, providing the user with valuable information about the current food intake rate and enabling the user to alter his/her eating behaviour appropriately during the meal.
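For illustration only, the following is a minimal PyTorch sketch of the five-step pipeline described above. The patent does not disclose the network architectures, camera frame rate, or feature dimensionality; the small convolutional backbone, the GRU, FPS = 25 and FEAT_DIM = 128 below are assumptions chosen to make the data flow concrete, not the disclosed implementation.

```python
# Sketch of the Figure 2 pipeline under assumed parameters (not the
# architectures disclosed in the patent).
import collections
import torch
import torch.nn as nn

FPS = 25                  # assumed camera frame rate
WINDOW = 2 * FPS          # 2-second sliding window (Step 3)
FEAT_DIM = 128            # assumed spatial feature size

class SpatialNet(nn.Module):
    """Neural Network 1 (Step 2): one spatial feature vector per frame."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, FEAT_DIM))
    def forward(self, frames):            # frames: (B, 3, H, W)
        return self.backbone(frames)      # (B, FEAT_DIM)

class TemporalNet(nn.Module):
    """Neural Network 2 (Step 4): classifies a 2 s feature sequence."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(FEAT_DIM, 64, batch_first=True)
        self.head = nn.Linear(64, 1)
    def forward(self, seq):               # seq: (B, WINDOW, FEAT_DIM)
        _, h = self.rnn(seq)
        return torch.sigmoid(self.head(h[-1]))   # P(food intake event)

def estimate_rate(frames, nn1, nn2, threshold=0.5):
    """Steps 1-5: frames -> features -> 2 s windows -> events -> bites/min."""
    window = collections.deque(maxlen=WINDOW)
    events = 0
    with torch.no_grad():
        for i, frame in enumerate(frames):            # Step 1: frame stream
            window.append(nn1(frame.unsqueeze(0)))    # Step 2: spatial features
            # One window per 2 s of video (non-overlapping here for
            # simplicity; the patent slides the window continuously).
            if (i + 1) % WINDOW == 0:                 # Step 3: temporal window
                seq = torch.cat(list(window)).unsqueeze(0)
                if nn2(seq).item() > threshold:       # Step 4: event detection
                    events += 1                       # Step 5: accumulation
    minutes = len(frames) / (FPS * 60)
    return events / max(minutes, 1 / 60)              # bites/min
```

In use, estimate_rate would be re-evaluated on the frames captured so far, so that the displayed bites/min value refreshes approximately every 10 seconds as stated above.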
Computerized program (Figure 3] [0020] Part of the invention is the computerized program, which has a simple and user-friendly interface that enables the user to operate easily the invention, in order to estimate the rate of food intake. The user interface consists of the video display frame (5); the recording button (6); and the food intake rate display bar (7). More specifically, the user can observe in real-time what the camera of his mobile device is recording, so that he/she can position it in a suitable place to capture him/her from the waist up during the meal without interference from other objects or obstacles. The record button allows the user to start and pause recording at will. On the other hand, the food intake rate display bar informs the user of the estimated number of food intake events per minute by changing the size and color of the bar accordingly. The color of the display bar is also used to indicate whether the rate of food consumption is within or outside normal and predefined limits, which are defined in the literature and which may vary depending on the individual's condition. In the event that the user exceeds the normal and predefined limits of the food intake rate, the color of the bar changes accordingly and the computerized program plays an audible alert (tone) through the speaker (8) of the mobile device as an additional alert to guide the user towards reducing the food intake rate and returning it to within normal and predefined limits.
Implementation of the invention
[0021] Example 1 (Figures 1 and 3) shows a user (1) sitting on a chair, with a plate of food (2) located on a table at a distance of 60-100 cm from the user. On the table, there is also a mobile device (3) placed on a mounting bracket (4) to record the user while consuming the food. On the mobile device, the user can see the video that is recorded (5) and initiate the eating rate estimation method by pressing the recording button (6). While the eating rate estimation runs, a display bar (7) allows the user to see whether the estimated food intake rate is within normal levels, while an auditory signal is played through the speaker (8) of the mobile device when the estimated food intake rate is above normal levels.

Claims

1. A method for estimating the rate of food intake via a mobile device camera in near real time, comprising the following stages:
A. Video data of a user consuming food are recorded via a mobile device camera.
B. The video is divided into separate image frames.
C. A neural network of high discrimination ability is applied to the image frames to extract spatial feature vectors describing the content depicted in each frame.
D. The extracted feature vectors from consecutive image frames are collected using a sliding window of 2 seconds to form temporal information for each video sequence of equal duration.
E. This temporal information is fed to a second neural network, which analyzes the temporal sequence to identify the presence or absence of a food intake event.
F. An accumulation variable is used to aggregate the total number of food intake events per minute and thereby estimate the rate of food intake.
2. A method according to claim 1, characterized in that the food intake rate is updated every 10 seconds.
3. A method according to previous claims, characterized in that the food intake rate is displayed on a screen.
4. A method according to previous claims, characterized in that the user is informed about the food intake rate with auditory information.
5. A system comprising the means for implementing the food intake rate estimation method as described in previous claims, characterized by a user (1) sitting on a chair and consuming food from one or more plates (2) placed on a table, where a portable device (3) with a camera is positioned.
6. A system according to claim 5, characterized in that the portable device may be either a mobile phone or another type of device (portable computer, notebook, tablet).
7. A system according to claims 5 and 6, characterized in that the portable device may be supported on a mounting bracket (4).
8. A system according to claims 5, 6 and 7, characterized in that the portable device must be more than sixty (60) and less than one hundred (100) centimetres from the user.
9. A computerised program that implements the method for estimating the rate of food intake using a mobile device camera in near real time as described in claims 1, 2, 3 and 4, via a system described in claims 5, 6, 7 and 8, characterized by a user interface consisting of a frame that displays the recording video (5), a button that starts and stops the recording process (6), a display bar that depicts the estimated rate of food intake (7) and auditory signals generated through the speaker (8) of the mobile device when the estimated food intake rate exceeds predefined limits.
PCT/IB2023/052924 2022-03-28 2023-03-24 Eating rate estimation through a mobile device camera WO2023187573A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GR20220100272A GR1010356B (en) 2022-03-28 2022-03-28 Food ingestion rate estimation method via portable device camera
GR20220100272 2022-03-28

Publications (2)

Publication Number Publication Date
WO2023187573A1 (en) 2023-10-05
WO2023187573A4 WO2023187573A4 (en) 2023-12-21

Family

ID=85112958

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/052924 WO2023187573A1 (en) 2022-03-28 2023-03-24 Eating rate estimation through a mobile device camera

Country Status (2)

Country Link
GR (1) GR1010356B (en)
WO (1) WO2023187573A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9198621B2 (en) * 2007-06-18 2015-12-01 University of Pittsburgh—of the Commonwealth System of Higher Education Method, apparatus and system for food intake and physical activity assessment
US20160143582A1 (en) * 2014-11-22 2016-05-26 Medibotics Llc Wearable Food Consumption Monitor

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1179799A2 (en) 2000-08-08 2002-02-13 Eddie Prof. Karnieli Method for monitoring food intake
US20150306771A1 (en) 2011-10-10 2015-10-29 Desin Llc Apparatus for monitoring food consumption by an individual
US20180348187A1 (en) 2011-11-14 2018-12-06 University Of Pittsburgh - Of The Commonwealth System Of Higher Education Method, Apparatus and System for Food Intake and Physical Activity Assessment
US20160012749A1 (en) 2012-06-14 2016-01-14 Robert A. Connor Eyewear System for Monitoring and Modifying Nutritional Intake
WO2015046673A1 (en) 2013-09-30 2015-04-02 Lg Electronics Inc. Head mounted display and method of controlling the same
US20160073953A1 (en) 2014-09-11 2016-03-17 Board Of Trustees Of The University Of Alabama Food intake monitor
US20170156634A1 (en) 2015-06-03 2017-06-08 Boe Technology Group Co., Ltd. Wearable device and method for monitoring eating
KR20180116779A (en) 2017-04-17 2018-10-26 가천대학교 산학협력단 An artificial intelligence based image and speech recognition nutritional assessment method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
KONG F ET AL: "DietCam: Automatic dietary assessment with mobile camera phones", PERVASIVE AND MOBILE COMPUTING, ELSEVIER, NL, vol. 8, no. 1, 18 July 2011 (2011-07-18), pages 147 - 163, XP028448045, ISSN: 1574-1192, [retrieved on 20110726], DOI: 10.1016/J.PMCJ.2011.07.003 *
KONSTANTINIDIS D ET AL: "Validation of a Deep Learning System for the Full Automation of Bite and Meal Duration Analysis of Experimental Meal Videos", NUTRIENTS, vol. 12, no. 1, 13 January 2020 (2020-01-13), pages 209, XP093078105, DOI: 10.3390/nu12010209 *
RAJU V ET AL: "Processing of Egocentric Camera Images from a Wearable Food Intake Sensor", 2019 SOUTHEASTCON, IEEE, 11 April 2019 (2019-04-11), pages 1 - 6, XP033733586, DOI: 10.1109/SOUTHEASTCON42311.2019.9020284 *
ROBINSON E ET AL: "A systematic review and meta-analysis examining the effect of eating rate on energy intake and hunger", AMERICAN JOURNAL OF CLINICAL NUTRITION, vol. 100, no. 1, 1 July 2014 (2014-07-01), pages 123 - 151, XP093078154, ISSN: 0002-9165, Retrieved from the Internet <URL:https://www.sciencedirect.com/science/article/pii/S0002916523046816/pdfft?md5=037a1399f6d27bf5ae14d5e89a8c88ae&pid=1-s2.0-S0002916523046816-main.pdf> [retrieved on 20230901], DOI: 10.3945/ajcn.113.081745 *
ROUAST P V ET AL: "Learning deep representations for video-based intake gesture detection", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 24 September 2019 (2019-09-24), XP081480896, DOI: 10.1109/JBHI.2019.2942845 *

Also Published As

Publication number Publication date
GR1010356B (en) 2022-12-13
WO2023187573A4 (en) 2023-12-21

Similar Documents

Publication Publication Date Title
US11929167B2 (en) Method and apparatus for tracking of food intake and other behaviors and providing relevant feedback
US11948401B2 (en) AI-based physical function assessment system
US11728024B2 (en) Method and apparatus for tracking of food intake and other behaviors and providing relevant feedback
EP3705032A1 (en) Open api-based medical information providing method and system
Liu et al. An intelligent food-intake monitoring system using wearable sensors
CN108778097A (en) Device and method for assessing heart failure
CN108052079A (en) Apparatus control method, device, plant control unit and storage medium
CN103238311A (en) Electronic device and electronic device control program
CN107066778A (en) The Nounou intelligent guarding systems accompanied for health care for the aged
US20210249116A1 (en) Smart Glasses and Wearable Systems for Measuring Food Consumption
TWI772762B (en) Care system and method of automated care
US9465981B2 (en) System and method for communication
JP2018005512A (en) Program, electronic device, information processing device and system
CN110706784A (en) Calorie intake amount calculation method, device, system, apparatus, and storage medium
CN107016224A (en) The Nounou intelligent monitoring devices accompanied for health care for the aged
WO2023187573A1 (en) Eating rate estimation through a mobile device camera
CN106303939A (en) The method and device of healthalert
GB2593931A (en) Person monitoring system and method
CN103186701A (en) Method, system and equipment for analyzing eating habits
CN205286342U (en) Remote monitoring medical treatment and health protection robot
CN113995395A (en) Household movement guiding and recognizing system and method for chronic pain
EP4434440A1 (en) Heat consumption estimation method and device and storage medium
WO2021254091A1 (en) Method for determining number of motions and terminal
JP2022139993A (en) Determination device and determination method
Lobo et al. A review of devices using modern dietary assessment methods for reducing obesity

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23736456

Country of ref document: EP

Kind code of ref document: A1