WO2019063762A1 - Systems and methods for nutrition support - Google Patents

Systems and methods for nutrition support

Info

Publication number
WO2019063762A1
WO2019063762A1 (application no. PCT/EP2018/076403)
Authority
WO
WIPO (PCT)
Prior art keywords
data
food
individual
user device
food intake
Prior art date
Application number
PCT/EP2018/076403
Other languages
English (en)
Inventor
Shrutin ULMAN
Prasad RAGHOTHAM VENKAT
Amogh HIREMATH
Original Assignee
Koninklijke Philips N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2017-09-28
Filing date: 2018-09-28
Publication date: 2019-04-04
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2019063762A1

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/60: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance, relating to nutrition control, e.g. diets

Definitions

  • the present subject matter relates, in general, to nutrition support and, in particular, to providing nutrition support based on sensorial data.
  • Nutrition support includes measuring and monitoring a person's dietary intake to ensure that proper nutrition is provided to the person.
  • Nutrition support may be used by individuals for preventive healthcare or during convalescence. For example, treatment of certain diseases and disorders, such as cancer, cardiac diseases, and the like, is associated with modifications to the diet consumed by the person.
  • High-calorie and high-fat foods are restricted during treatments, such as chemotherapy, radiotherapy, immunotherapy, hormone therapy, surgery, and the like.
  • the person may need to cook and store meals for a prolonged period of time to reduce frequency of visits to the market.
  • Consumption of stored food or prolonged treatment can lead to change in taste patterns over a period of time.
  • solid foods may not be consumable.
  • a liquid diet may be prescribed.
  • Some of these treatment techniques are also associated with side-effects, such as loss of appetite, changes in taste and smell, dry mouth, sore throat, nausea, and the like.
  • administering proper food with adequate calories is important for recovery and maintenance of a good quality of life. Due to the above-mentioned reasons, it is important that a nutritionist and the person or a caregiver plan the diet and that the diet is monitored accurately.
  • monitoring a person's dietary intake can be done by using food journaling applications on devices, such as mobile phones, laptops or computers, which require the person to manually enter and track the type and amount of food consumed at regular intervals.
  • Such applications utilize a database that contains nutrient and caloric information for a number of food items to monitor the nutrition intake.
  • the journaling of food and portion entry and management of nutrition using such applications can be a time-consuming and cumbersome process for an individual and for the caregivers.
  • the present disclosure relates to a network environment for providing nutrition support, the network environment comprising a feeding tool that includes sensors to determine food intake data from food consumed by an individual using the feeding tool.
  • a user device records the food intake data received from the feeding tool and a system analyzes the food intake data received from the user device to determine consumption patterns and food preferences of the individual and to provide a recommendation for food items to be provided to the individual based on the determination.
  • the present disclosure relates to a system for providing nutrition support, the system being communicatively coupled to a user device.
  • the system comprises a data aggregation engine to receive food intake data related to an individual from the user device, wherein the food intake data comprises at least taste data, smell data, and composition data determined by sensors from food consumed by the individual, and to process the food intake data to determine consumption data and sensorial data, wherein the consumption data is indicative of food consumption patterns and the sensorial data is indicative of food preference patterns.
  • a recommendation engine determines a recommendation for food items to be provided to the individual based on the consumption data, the sensorial data, and food prescription data, wherein the food prescription data is indicative of the nutrition requirements of the individual, and provides the recommendation for food items to the user device for being displayed to a user.
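The summary above distinguishes three kinds of records: raw food intake data produced by the feeding tool's sensors, consumption data derived from it, and sensorial (preference) data. The sketch below models these records purely for illustration; the field names, units, and types are assumptions, not identifiers from the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, Optional


@dataclass
class FoodIntakeSample:
    """One reading transmitted by a feeding tool's sensors (illustrative fields)."""
    tool_id: str                          # e.g. "spoon-106-1"
    taste: Dict[str, float]               # taste-sensor channels, e.g. {"sweet": 0.7}
    smell: Dict[str, float]               # gas/smell-sensor channels
    composition: Dict[str, float]         # e.g. {"kcal": 120.0, "fat_g": 4.2}
    weight_g: Optional[float] = None      # present when the tool has a weight sensor
    temperature_c: Optional[float] = None
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class ConsumptionRecord:
    """Aggregated view of what was eaten, derived from intake samples."""
    food_form: str                        # "liquid", "semi-solid" or "solid"
    amount_g: float
    nutrients: Dict[str, float]
    eaten_at: datetime


@dataclass
class SensorialRecord:
    """Preference information, user-reported or inferred from consumption."""
    food_item: str
    liked: bool
    taste_profile: Dict[str, float]
```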
  • Fig. 1 illustrates an example network environment for providing nutrition support, in accordance with principles of the present subject matter.
  • Fig. 2 illustrates a method for providing nutrition support, in accordance with principles of the present subject matter.
  • food journaling applications are, typically, based on databases that contain nutrient and caloric information for specific food items.
  • the caloric information for each food item available in the database is fixed based on the ingredients used and portion sizes. While suggesting food to the person being treated or to a caregiver, food journaling applications do not accurately provide the calorific value. This could be, for example, due to variation in ingredients and portion sizes as compared to what is available in the database. Moreover, in case any erroneous entry is made or an entry is missed, the applications will cease to serve their purpose.
  • aspects of the present subject matter are directed to providing nutrition support to an individual.
  • the individual to whom the nutrition support may be provided may be a healthy individual or a patient undergoing medical treatment.
  • the systems, devices, and methods of the present subject matter may be utilized by the individual or a care giver of the individual, such as a nurse or attendant, to ensure proper nutrition is provided to the individual.
  • the term 'user' as used herein can refer to either the individual or the care giver.
  • a system for nutrition support is provided.
  • the system can be associated with a nutrition database that includes nutritional data for different food ingredients and food items.
  • the system receives food prescription data that indicates the nutrition to be provided to the individual.
  • the food prescription data includes the number of calories prescribed to the person by a medical professional, such as a doctor, nutritionist, or dietician, the requirement of liquid feeding, the frequency of feeding, other nutrients to be provided, such as vitamins and minerals, and the like.
  • the system also aggregates consumption data for the individual.
  • the consumption data can include an amount and type of food consumed, time at which food was consumed, etc.
  • the consumption data can be obtained from sensors provided in feeding tools, such as straw, spoon, cup, plate, etc.
  • the consumption data is sent to a user device from the sensors and is then transmitted to the system by the user device for further analysis.
  • the system also aggregates sensorial data, such as food preferences, changes in taste and smell of the individual, and the like.
  • the sensorial data can be received from a user through the user device and may also be analytically determined from the consumption data. For example, if the individual consumed certain food items in a larger quantity, then it may be determined that the individual preferred the taste or texture of those food items.
  • the system can analyze the food prescription data, the consumption data, and the sensorial data, in conjunction with the nutritional data, and provide suggestions regarding food items and portions that can be eaten by the individual.
  • the system can automatically monitor the food consumed and nutrition intake and can provide recommendations for food to be provided to the individual.
  • by taking into account the sensorial data while providing the recommendation, the likelihood of the individual adhering to the recommendation and receiving the required nutrition is higher.
  • the recommendation can be dynamically altered based on changes in food prescription or preferences over time, thus ensuring better nutrition outcome.
  • Fig. 1 illustrates an example network environment 100 for providing nutrition support, in accordance with one implementation of the present subject matter.
  • the network environment 100 includes a system 102, a user device 104, and feeding tools 106-1, 106-2...106-n, also referred to as feeding tool 106.
  • multiple user devices 104 may be present in the network environment 100 and each user device 104 can communicate with multiple feeding tools 106.
  • for simplicity, only one user device 104 has been illustrated in the figure.
  • the system 102 can be a desktop computer, a server, a laptop computer, and the like.
  • the system 102 may be communicatively coupled to one or more user devices 104.
  • the user device 104 can be a mobile device, a desktop computer, a laptop computer, a notebook, and the like.
  • the user device 104 may be communicatively coupled to one or more feeding tools 106.
  • the feeding tools can include, for example, cutlery or utensils using which food may be served or fed to an individual.
  • the feeding tools 106 may include straws, spoons, cups, plates, bowls, etc.
  • Each feeding tool 106 may include one or more sensors 108.
  • the feeding tool 106-1 may include sensor(s) 108-1
  • the feeding tool 106-2 may include sensor(s) 108-2
  • the feeding tool 106-n may include sensors 108-n.
  • the sensors 108 may be selected from a taste sensor, a smell sensor, a calorific value sensor, a density sensor, a viscosity sensor, a temperature sensor, and the like.
  • a straw may include a taste sensor, a smell sensor, a temperature sensor, and a chemical composition sensor.
  • a bowl may include a taste sensor, a smell sensor, a temperature sensor, a density sensor, a weight sensor, and a chemical composition sensor. It will be understood that any combination of sensors 108 can be used on each of the feeding tools 106 and is not limited to the examples provided herein.
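As a concrete illustration of the point above, a per-tool sensor configuration could be expressed as a simple lookup table. The tool names and sensor sets below are assumptions for the sketch; only the straw and bowl combinations are taken from the examples in the description.

```python
# Illustrative mapping of feeding tools 106 to their sensor combinations.
FEEDING_TOOL_SENSORS = {
    "straw": {"taste", "smell", "temperature", "chemical_composition"},
    "bowl": {"taste", "smell", "temperature", "density", "weight", "chemical_composition"},
    "cup": {"taste", "smell", "temperature"},     # assumed combination
    "plate": {"weight", "chemical_composition"},  # assumed combination
}


def sensors_for(tool: str) -> set:
    """Return the sensor set configured for a feeding tool (empty if unknown)."""
    return FEEDING_TOOL_SENSORS.get(tool, set())


print(sensors_for("bowl"))
```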
  • the user device 104 includes a display 110 and a nutrition assistance engine 112.
  • the nutrition assistance engine 112 can receive information from and/or send information to the system 102 and the feeding tools 106 through various communication interfaces in the user device 104.
  • the information received from and/or sent to the system 102 or the feeding tools 106 can be displayed on the user device 104 through the display 110.
  • the system 102 can include a registration engine 114, a data aggregation engine 116, and a recommendation engine 118.
  • the system 102 can also include data 120, such as personal data 122, food prescription data 124, consumption data 126, and sensorial data 128. While the data 120 is shown as a part of the system 102, it will be understood that a part or all of the data 120 may be stored on a separate data storage device, which may communicate with the system 102 directly or over a network.
  • the system 102 may also be associated with a nutritional database 130, for example, over a communication network. In other implementations, the nutritional database 130 may be, in part or completely, stored on the system 102.
  • the data 120 and the nutritional database 130 may serve as repositories for storing data that may be fetched, processed, received, or created by system 102.
  • the system 102 and the user device 104 may include respective processors, interfaces, memories, modules, other data, and the like, which are not shown for brevity.
  • the processors may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • the processors fetch and execute computer-readable instructions stored in a memory.
  • the interfaces may include a variety of computer-readable instructions-based interfaces and hardware interfaces that allow interaction with other communication, storage, and computing devices, such as network entities, web servers, databases, and external repositories, and peripheral devices.
  • the memories may include any non-transitory computer-readable medium including, for example, volatile memory (e.g., RAM), and/or non-volatile memory (e.g., EPROM, flash memory, etc.).
  • the memories may include an external memory unit, such as a flash drive, a compact disk drive, an external hard disk drive, or the like.
  • the modules may include operating system and applications that may be executed on the system 102 or the user device 104. Other data may include data used, retrieved, stored, or in any way manipulated by the system 102 or the user device 104.
  • the various engines such as the nutrition assistance engine 112, the registration engine 114, the data aggregation engine 116, and the recommendation engine 118 may be coupled to processor(s), and may include, amongst other things, routines, programs, objects, components, data structures, and the like, which perform particular tasks or implement particular abstract data types.
  • the engines may be implemented as hardware, software, or a combination of the two.
  • the system 102 may be integrated with the user device 104 as a single computing device.
  • the system 102 may be implemented in a cloud network environment and may communicate with the user device 104 over a first communication network.
  • the first communication network can be a wireless network, a wired network, or a combination thereof.
  • the first communication network can also be an individual network or a collection of many such individual networks, interconnected with each other and functioning as a single large network, e.g., the Internet or an intranet.
  • the first communication network can include different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and such.
  • the first communication network may also include individual networks, such as, but not limited to, Global System for Mobile Communications (GSM) network, Universal Mobile Telecommunications System (UMTS) network, Long Term Evolution (LTE) network, etc. Accordingly, the first communication network includes various network entities, such as base stations, gateways, servers, and routers; however, such details have been omitted to maintain the brevity of the description.
  • the user device 104 may communicate with the feeding tools 106 over a second communication network.
  • the second communication network may be different from or a part of the first communication network.
  • the second communication network is a short-range wireless communication network that uses network protocols, such as Bluetooth, Near Field Communication (NFC), Wi-Fi, etc. Accordingly, the user device 104 and the feeding tools 106 may include communication interfaces to allow them to communicate over the second communication network.
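For the device-to-system leg of this two-network arrangement, a plain HTTPS upload from the user device 104 to the system 102 is one possibility. The sketch below assumes a hypothetical REST endpoint and uses the third-party `requests` library; the short-range link that delivers the sample to the device is out of scope here.

```python
import json
from datetime import datetime, timezone

import requests  # third-party HTTP client, used only for this illustrative upload

SYSTEM_URL = "https://nutrition-system.example.com/api/intake"  # hypothetical endpoint


def forward_intake(sample: dict, individual_id: str) -> bool:
    """Time-stamp one intake sample and forward it from the user device to the system."""
    payload = {
        "individual_id": individual_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "sample": sample,
    }
    try:
        response = requests.post(SYSTEM_URL, json=payload, timeout=5)
        return response.ok
    except requests.RequestException:
        # If the first network is unreachable, keep the sample locally for a later retry.
        with open("pending_intake.jsonl", "a") as backlog:
            backlog.write(json.dumps(payload) + "\n")
        return False
```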
  • one or more individuals may be registered with the system 102 for receiving nutrition support.
  • an individual may register with the system 102 using the nutrition assistance engine 112 of the user device 104 and the registration engine 114 of the system 102.
  • the nutrition assistance engine 112 may be an application, such as a mobile app, installed on the user device 104.
  • the nutrition assistance engine 112 may provide a graphical user interface through the display 110 for registration and subsequent editing of a registered profile.
  • the nutrition assistance engine 112 may receive profile details, such as name, age, health history, food prescription, food preferences, etc., for the individual to be registered from a user.
  • the user may be the individual themselves or a care giver.
  • the user may provide a link to other data sources, such as hospital medical records, for obtaining some of the profile details, such as food prescription and health history.
  • the nutrition assistance engine 112 can provide the profile details to the registration engine 114, which may process and store the profile details in the data 120 for each registered individual. Further, the nutrition assistance engine 112 may also store a local profile of the individual on the user device 104 for ease of access. In one example, the registration engine 114 may download some of the profile details from the other data sources using the link provided by the user. In such cases, the registration engine 114 may also periodically check for changes to the profile details in the other data sources to ensure that the data 120 present in the system 102 is automatically updated.
  • the registration engine 114 can create an account for each individual to be registered, assign an identifier (ID), such as a name or number, and store the corresponding profile details as data along with the ID.
  • the name, age, and health history may be stored as personal data 122
  • the food prescription may be stored in food prescription data 124
  • the food preferences may be stored in sensorial data 128.
  • the food prescription data 124 is thus indicative of the nutritional requirements of the individual.
  • the registration engine 114 may standardize the profile details prior to storing in the data 120 to help with subsequent search and analysis. For example, standard codes may be used for denoting various diseases, disorders, medication, food and nutrients prescribed, food preferences, etc. It will be understood that in cases where the individual is already registered, the registration engine 114 can update the data 120 based on the received profile details.
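A minimal sketch of the registration step described above: an account is created, an ID is assigned, and free-text profile details are mapped to standard codes before storage. The code tables and field names are assumptions; the disclosure only says that standard codes may be used.

```python
import uuid

# Example lookup tables standing in for whichever code systems an implementation adopts.
DISEASE_CODES = {"breast cancer": "C50", "type 2 diabetes": "E11"}
DIET_CODES = {"liquid": "DIET-L", "semi-solid": "DIET-SS", "solid": "DIET-S"}


def register_individual(profile: dict, store: dict) -> str:
    """Create an account, assign an ID, and store standardized profile details.

    `profile` holds the details entered through the nutrition assistance engine 112;
    `store` stands in for the data 120 repository.
    """
    individual_id = str(uuid.uuid4())
    store[individual_id] = {
        "personal": {
            "name": profile.get("name"),
            "age": profile.get("age"),
            "health_history": [DISEASE_CODES.get(d.lower(), d)
                               for d in profile.get("health_history", [])],
        },
        "food_prescription": {
            "kcal_per_day": profile.get("kcal_per_day"),
            "diet_form": DIET_CODES.get(profile.get("diet_form", "solid"), "DIET-S"),
            "nutrients": profile.get("nutrients", {}),
        },
        "sensorial": {"preferences": profile.get("food_preferences", [])},
    }
    return individual_id


data_120 = {}
print(register_individual({"name": "A. Patient", "age": 58,
                           "health_history": ["breast cancer"],
                           "kcal_per_day": 1800, "diet_form": "liquid"}, data_120))
```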
  • the nutrition assistance engine 112 can help monitor the nutrition intake and can provide recommendations for the food to be provided to the individual to ensure that appropriate nutrition, as prescribed, is taken by the individual.
  • the nutrition assistance engine 112 can receive food intake data, also referred to as intake data, from the sensors 108 of the feeding tools 106.
  • the corresponding sensors 108 can transmit intake data, such as amount of food, temperature of food, taste of food, smell of food, chemical composition, etc., to the nutrition assistance engine 112.
  • the sensors 108 include a gas sensor, which works like an electrochemical human nose, and a taste sensor, which has an artificial lipid membrane that consistently produces a potential in response to a similar taste.
  • such taste and smell sensors can be used to accurately simulate human taste and smell using machine learning algorithms and pattern classification algorithms.
  • the taste and smell sensors can be calibrated, for example, using the sensor values and the food preferences of the individual, to determine which tastes and smells are preferred by the individual. Further, the food preferences can be updated over time based on the received food intake data as will be discussed below.
  • the taste and smell sensors can be used to optimize food provided to the user. For example, based on the smell sensors, the food ingredients may be modified to provide an aroma preferred by the user.
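The calibration described above can be framed as a supervised pattern-classification problem: taste/smell channel readings are the features and the individual's recorded preference is the label. The description does not name an algorithm, so the sketch below uses a k-nearest-neighbours classifier from scikit-learn purely as a stand-in, with made-up readings and labels.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Calibration samples: taste/smell sensor channels for dishes the individual has already rated.
X_train = np.array([
    [0.80, 0.05, 0.10, 0.02],   # sweet, mild dish
    [0.10, 0.70, 0.05, 0.30],   # bitter dish
    [0.60, 0.10, 0.40, 0.05],   # sweet-and-sour dish
    [0.05, 0.60, 0.10, 0.50],   # bitter, pungent dish
])
y_train = ["liked", "disliked", "liked", "disliked"]

calibrator = KNeighborsClassifier(n_neighbors=3)
calibrator.fit(X_train, y_train)


def predicted_preference(sensor_reading):
    """Map a new taste/smell sensor vector to the individual's likely preference."""
    return calibrator.predict(np.array(sensor_reading).reshape(1, -1))[0]


print(predicted_preference([0.75, 0.08, 0.20, 0.03]))  # expected: "liked"
```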
  • the sensors 108 also include a chemical composition sensor, such as a molecular sensor, that uses spectrometry, such as Near Infra-Red (NIR) spectroscopy, to determine the composition, such as chemical content, calorific value, fat content, water content, etc., of food items.
  • since these sensors directly measure the composition, they can account for differences in portion size, methods of preparation, and food ingredients used, and are hence more accurate in comparison to conventional techniques that solely rely upon standard databases to determine the composition.
  • the sensors 108 may also include a temperature sensor that can be used to determine the temperature of the food.
  • the sensors 108 may also include a weight sensor that can be used to determine the amount of food consumed using a difference in the weight of food in the feeding tool 106, such as a bowl or plate, before and after consumption. Further, based on the feeding tool 106 used, the nutrition assistance engine 112 can determine the type of food consumed. For example, if a straw is used, the food consumed is likely to be a liquid, or if a plate is used, the food consumed is likely to be a solid.
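Two of the derivations just described are simple enough to show directly: the amount consumed from the weight drop measured by the feeding tool, and the likely food form from the kind of tool that reported it. The tool-to-form mapping below extends the straw and plate examples with assumed entries.

```python
TOOL_FOOD_FORM = {
    "straw": "liquid",
    "cup": "liquid",        # assumed
    "bowl": "semi-solid",   # assumed
    "spoon": "semi-solid",  # assumed
    "plate": "solid",
}


def amount_consumed(weight_before_g: float, weight_after_g: float) -> float:
    """Amount eaten is the drop in the feeding tool's measured weight."""
    return max(weight_before_g - weight_after_g, 0.0)


def likely_food_form(tool: str) -> str:
    """Infer the likely form of the food from the feeding tool that reported it."""
    return TOOL_FOOD_FORM.get(tool, "unknown")


print(amount_consumed(420.0, 165.0))  # 255.0 g eaten
print(likely_food_form("straw"))      # liquid
```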
  • Other types of sensors known in the art may be used in conjunction with or alternatively to the example sensors described above. It will be understood that use of all such sensors is intended to be covered in the scope of the present subject matter. The examples are provided only for illustration and not as a limitation.
  • a user can also provide intake data to the nutrition assistance engine 112.
  • the user may provide textual input describing the food item consumed.
  • the user may take an image of the food prior to consumption, for example, using a camera of the user device 104.
  • the nutrition assistance engine 112 can apply image analysis techniques on the image to determine the type of food, texture of food, and quantity of food consumed.
  • the nutrition assistance engine 112 may provide a menu of food items from which the user can select the food consumed. Further, as a part of the input, the user may indicate whether the user liked the food or not. It will be understood that any combination of the above methods can be used to provide additional intake data to the nutrition assistance engine 112.
  • the nutrition assistance engine 112 can process and send the intake data to the system 102.
  • the nutrition assistance engine 112 may add a time stamp and send the intake data to the system 102.
  • the nutrition assistance engine 112 may process the image or text input or may send the image or text input without processing to the system 102.
  • the data aggregation engine 116 may further process the intake data and the image or text input and store it as consumption data 126 and/or sensorial data 128.
  • the consumption data 126 is indicative of food consumption patterns of the individual and the sensorial data 128 is indicative of food preference patterns of the individual.
  • the amount and type of food consumed, time at which food was consumed, etc. can be stored as consumption data 126.
  • the data aggregation engine 116 may standardize the metrics and codes used and store the standardized data as consumption data 126.
  • the food preferences, changes in taste and smell of the individual, and the like can be stored as sensorial data 128.
  • the sensorial data 128 can also be determined from the consumption data by the data aggregation engine 116. For example, if the individual consumed certain food items in a larger quantity then it may be determined that the individual preferred the taste or texture of those food items.
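The inference just described, preference from the relative quantity consumed, can be reduced to a small rule. The thresholds in the sketch are illustrative values, not figures from the disclosure.

```python
def infer_preferences(servings, liked_threshold=0.8, disliked_threshold=0.3):
    """Derive sensorial (preference) data from consumption data.

    `servings` maps a food item to (amount_served_g, amount_consumed_g). Items that
    were mostly finished are treated as liked, items mostly left as disliked.
    """
    preferences = {}
    for item, (served, consumed) in servings.items():
        if served <= 0:
            continue
        fraction = consumed / served
        if fraction >= liked_threshold:
            preferences[item] = "liked"
        elif fraction <= disliked_threshold:
            preferences[item] = "disliked"
        else:
            preferences[item] = "neutral"
    return preferences


print(infer_preferences({"lentil soup": (300, 290), "steamed fish": (200, 40)}))
# {'lentil soup': 'liked', 'steamed fish': 'disliked'}
```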
  • monitoring of food consumed by the individual can be made automatic with minimal manual intervention. Also, likelihood of erroneous or missed entries decreases substantially and thus better nutrition support can be provided.
  • the system 102 can also determine whether sufficient nutrition, as per the food prescription, is being taken by the individual and can also provide recommendations for palatable food based on the food preferences of the individual.
  • the recommendation engine 118 can analyze the food prescription data 124, the consumption data 126, and the sensorial data 128, in conjunction with nutritional data from the nutritional database 130, and can provide the recommendations to the user.
  • the nutritional database 130 may be a collection of public or private databases, for example, databases accessible through web servers. In another example, the nutritional database 130 may be developed as a proprietary database. The nutritional database 130 includes information regarding the nutrient content of different food items and ingredients, their taste and smell attributes, chemical composition, etc.
  • the recommendation engine 118 can use a pattern classification algorithm which analyses the consumption patterns and preferences of the individual. For example, based on the consumption data 126 and the sensorial data 128, the recommendation engine 118 can determine what nutrients were consumed in what quantity and which of the food items were liked by the individual and which were not liked. Further, based on the food prescription data 124, the recommendation engine can determine whether there is any deficiency in the nutrient intake or whether there are any nutrients of which the intake needs to be reduced or whether a liquid or semi-solid or solid diet is to be provided. Subsequently, based on the sensorial data 128 and the nutritional database 130, the recommendation engine 118 can provide recommendations for palatable food items to meet the food prescription requirements while taking into account the preferences of the individual.
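Putting the recommendation step into code helps make the data flow concrete. The sketch below computes the remaining deficit against the food prescription, ranks candidate items from the nutritional database by the individual's preferences, and sizes portions toward the calorie gap. It is only a sketch under assumed data shapes; the disclosure does not prescribe this particular scoring or portioning rule.

```python
def recommend(prescription, consumed, preferences, nutrition_db, top_n=3):
    """Suggest palatable food items and portions to cover the remaining prescription.

    prescription: daily targets, e.g. {"kcal": 1800, "protein_g": 70}
    consumed:     nutrients already taken, same keys
    preferences:  item -> "liked" / "neutral" / "disliked" (from the sensorial data)
    nutrition_db: item -> nutrients per 100 g, e.g. {"kcal": 120, "protein_g": 4}
    """
    # 1. What is still missing relative to the food prescription?
    deficit = {k: max(prescription[k] - consumed.get(k, 0.0), 0.0) for k in prescription}

    # 2. Rank candidates: liked items first, disliked items excluded.
    rank = {"liked": 0, "neutral": 1}
    candidates = [i for i in nutrition_db if preferences.get(i, "neutral") != "disliked"]
    candidates.sort(key=lambda i: rank[preferences.get(i, "neutral")])
    chosen = candidates[:top_n]

    # 3. Size a portion of each chosen item toward its share of the calorie deficit.
    share = deficit.get("kcal", 0.0) / max(len(chosen), 1)
    suggestions = []
    for item in chosen:
        kcal_per_100g = nutrition_db[item].get("kcal", 0.0)
        portion_g = round(100.0 * share / kcal_per_100g) if kcal_per_100g else 0
        suggestions.append((item, portion_g))
    return deficit, suggestions


deficit, menu = recommend(
    {"kcal": 1800, "protein_g": 70},
    {"kcal": 1200, "protein_g": 40},
    {"vegetable khichdi": "liked", "boiled egg": "neutral", "bitter gourd": "disliked"},
    {"vegetable khichdi": {"kcal": 120, "protein_g": 4},
     "boiled egg": {"kcal": 155, "protein_g": 13},
     "bitter gourd": {"kcal": 20, "protein_g": 1}},
)
print(deficit)  # {'kcal': 600.0, 'protein_g': 30.0}
print(menu)     # [('vegetable khichdi', 250), ('boiled egg', 194)]
```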
  • the recommendation engine 118 can provide the recommendations to the nutrition assistance engine 112, which can display the recommendations on the display 110 of the user device 104. Accordingly, the user can change the diet plan of the individual. In an example, the user has the flexibility to choose from the recommendations provided based on taste and smell that the individual prefers at that point of time. Based on the food items chosen by the user, the recommendation engine 118 can change portion sizes of the recommendations to adjust to the nutrition requirements of the individual.
  • any changes in food preferences or food prescription or health conditions get automatically reflected in the recommendations provided to the user.
  • the food provided can also be made more palatable and personalized based on changes in dietary patterns, taste and smell preferences, and the like, while providing the necessary calories. This helps in faster convalescence and in maintaining a healthier lifestyle.
  • since the present subject matter leverages the ubiquitous presence of user devices, such as mobile devices, to record intake data and communicate with the feeding tools and the system 102, it is easy and cost-effective to implement.
  • a greater amount of intelligence and processing capability can be built into the system 102, and the system 102 can continually learn over time to improve the nutrition support outcomes.
  • effective nutrient support can be provided.
  • The methods used for providing nutrition support will now be further described with reference to Fig. 2. While the method illustrated in Fig. 2 may be implemented in any system, for discussion, the method is described with reference to the implementations illustrated in Fig. 1.
  • Fig. 2 illustrates an example method 200 for providing nutrition support implemented by a user device, in accordance with principles of the present subject matter.
  • the order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method or an alternative method. Additionally, individual blocks may be deleted from the method 200 without departing from the scope of the subject matter described herein.
  • the method 200 may be implemented in any suitable hardware, computer readable instructions, firmware, or combination thereof.
  • steps of the method 200 can be performed by programmed computing devices.
  • the method 200 may also be embodied in program storage devices and non-transitory computer readable media, for example, digital data storage media, which are computer readable and encode computer-executable instructions, where said instructions perform some or all of the steps of the described methods.
  • the program storage devices may be, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
  • the food intake data includes, for example, data determined by sensors 108 in the feeding tool 106.
  • the food intake data includes at least taste data, smell data, and composition data.
  • the sensors 108 include at least an artificial-lipid-membrane-based taste sensor, a gas-based smell sensor, and a spectrometry-based chemical composition sensor. Further, the sensors 108 may be calibrated initially to ensure that the data captured by the sensors 108 is in accordance with the way the individual senses taste and smell.
  • the food intake data determined by the sensors 108 may be sent from the feeding tool 106 to the user device 104.
  • a time stamp may be added to the food intake data.
  • at least one of image data and text data may be added to the food intake data. Subsequently, the food intake data may be sent to the system 102 for processing.
  • the food intake data is processed to determine consumption data 126 and sensorial data 128.
  • the consumption data 126 is indicative of food consumption patterns of the individual and the sensorial data 128 is indicative of food preference patterns of the individual;
  • a recommendation for food items is determined based on the consumption data 126, the sensorial data 128, and food prescription data 124.
  • the food prescription data 124 is indicative of nutritional requirements of the individual.
  • the food prescription data 124 may be dynamically updated based on changes in the medical records.
  • the recommendation for food items is provided to the user device 104 for being displayed to a user.
  • the method can dynamically capture changes in the consumption patterns and the food preference patterns.
  • nutrition support can be provided to an individual in accordance with prescription provided by medical professionals, such as doctors, dieticians, and nutritionists, and the individual's palate preferences.

Landscapes

  • Health & Medical Sciences (AREA)
  • Nutrition Science (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The present invention relates to a network environment (100) for providing nutrition support that comprises a feeding tool (106). The feeding tool (106) comprises sensors (108) for determining food intake data from food consumed by an individual using the feeding tool (106). A user device (104) can record the food intake data received from the feeding tool (106). A system (102) can analyze the food intake data received from the user device (104) to determine consumption patterns and food preferences of the individual and to provide a recommendation for food items to be provided to the individual based on the determination.
PCT/EP2018/076403 2017-09-28 2018-09-28 Systems and methods for nutrition support WO2019063762A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP17193624.8 2017-09-28
EP17193624 2017-09-28

Publications (1)

Publication Number Publication Date
WO2019063762A1 (fr) 2019-04-04

Family

ID=59997156

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/076403 WO2019063762A1 (fr) 2017-09-28 2018-09-28 Systems and methods for nutrition support

Country Status (1)

Country Link
WO (1) WO2019063762A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110318717A1 (en) * 2010-06-23 2011-12-29 Laurent Adamowicz Personalized Food Identification and Nutrition Guidance System
US20160260352A1 (en) * 2015-03-02 2016-09-08 Fitly Inc. Apparatus and method for identifying food nutritional values
US9146147B1 (en) * 2015-04-13 2015-09-29 Umar Rahim Bakhsh Dynamic nutrition tracking utensils

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116486097A (zh) * 2023-04-10 2023-07-25 深圳市前海远为科技有限公司 Remote automatic feeding method and system for rodent feeding scenarios
CN116486097B (zh) * 2023-04-10 2023-10-24 深圳市前海远为科技有限公司 Remote automatic feeding method and system for rodent feeding scenarios

Similar Documents

Publication Publication Date Title
CN103562921B (zh) Location-enabled food database
CN107423549B (zh) Health tracking device
TWI658813B (zh) Dietary restriction compliance system
Drincic et al. Evidence-based mobile medical applications in diabetes
US20130157232A1 (en) System and methods for monitoring food consumption
CN108141714B (zh) 移动健康应用的个性化、同伴衍生消息的自动构建的装置和方法
US20150379226A1 (en) Health management system, health management apparatus, and display method
US11348479B2 (en) Accuracy of measuring nutritional responses in a non-clinical setting
WO2012047940A1 (fr) Personal nutrition and physical wellness advisory system
US20190006040A1 (en) Cognitive diabetic regulator
CN115769303A (zh) Systems, devices, and methods for meal information collection, meal assessment, and analyte data correlation
US20190267121A1 (en) Medical recommendation platform
US20200075152A1 (en) Fitness nutrition tracking and recommendation service
US20220238038A1 (en) Nutrition management and kitchen appliance
KR20220158477A (ko) Server and client for providing a dietary management service, and method of implementing the same
WO2019063762A1 (fr) Systems and methods for nutrition support
CN113436738A (zh) Method, apparatus, device, and storage medium for managing at-risk users
TWI668664B (zh) Method for dynamically analyzing blood sugar level, system thereof and computer program product
US20230207100A1 (en) Nutritional information exchange system
Roy et al. OBESEYE: Interpretable Diet Recommender for Obesity Management using Machine Learning and Explainable AI
KR101968965B1 (ko) Atopic dermatitis management device, and management system and method including the same
KR102426924B1 (ko) Health management system and health management device using a bioelectrical impedance measurement apparatus
Holst et al. Nutritional assessment, diagnosis, and treatment in geriatrics
KR102395631B1 (ko) Smart tray-based personal dietary management system
JP2009003905A (ja) Recording medium storing a health condition measurement program using menu information, and health condition measurement method and system using the program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18773472

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18773472

Country of ref document: EP

Kind code of ref document: A1