KR20140103738A - Method for calculating and assessment of nutrient intake - Google Patents

Method for calculating and assessment of nutrient intake

Info

Publication number
KR20140103738A
Authority
KR
South Korea
Prior art keywords
food
nutrient
database
food material
intake
Prior art date
Application number
KR1020130017575A
Other languages
Korean (ko)
Inventor
박혜숙
Original Assignee
이화여자대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이화여자대학교 산학협력단 filed Critical 이화여자대학교 산학협력단
Priority to KR1020130017575A priority Critical patent/KR20140103738A/en
Publication of KR20140103738A publication Critical patent/KR20140103738A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/12 Hotels or restaurants
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/60 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Theoretical Computer Science (AREA)
  • Nutrition Science (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The present invention provides a nutrient calculation and assessment method that calculates the nutrients of foods consumed by a user with a mobile terminal such as a smartphone and outputs them to the user, calculates the total nutrients over a predetermined period, and outputs any excessive or insufficient nutrients to the user, thereby enabling balanced nutrient intake.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a nutrient intake calculation and assessment method.

The present invention provides a nutrient calculation and evaluation method that calculates the nutrients of foods consumed by a user with a mobile terminal such as a smartphone and outputs them to the user, calculates the total nutrients over a predetermined period, and outputs any excessive or insufficient nutrients to the user, thereby enabling balanced nutrient intake.

It is important to have balanced nutrition for a healthy life.

Balanced nutrition is especially important for infants and children, who need it for growth, for energy to support active physical activity, and for resistance to pathogens and infections.

However, many foods, including instant foods, are nutritionally unbalanced, and unbalanced eating habits make it difficult to achieve balanced nutrition.

The related art is described as follows.

Japanese Laid-Open Patent Application No. 2008-217702 discloses a camera that stores the color, brightness, surface shape, contrast, and the like of foods in a database and can output the calories of a photographed food.

Korean Patent Laid-Open Publication No. 2011-0098409 discloses a method for managing obesity in which calorie intake is calculated from the type and amount of food entered on a mobile terminal and calorie consumption is calculated from the entered exercise time and amount.

Korean Laid-Open Patent No. 2011-0122006 discloses a system in which a food is photographed with a smartphone and the price is settled automatically by identifying the photographed food.

In other words, although the related art provides technology for photographing and identifying food, it only assists with calorie calculation or payment and does not help the user achieve balanced nutrient intake.

In addition, even when the calories of a food are calculated, the related art does not account for food that is left uneaten or for unbalanced eating, and, even for the same dish, it cannot account for ingredients added by different household recipes, so its accuracy is low.

(Patent Document 1) JP2008-217702 A

(Patent Document 2) KR2011-0098409 A

(Patent Document 3) KR2011-0122006 A

SUMMARY OF THE INVENTION The present invention has been made in order to solve the above problems.

In other words, the present invention proposes a method that uses the foods identified from images not merely for calorie calculation but to help the user achieve balanced nutrient intake.

In addition, the present invention proposes a nutrient calculation and assessment method that improves accuracy by reflecting each user's eating habits in the nutrient calculation and by accounting for variations between recipes.

In order to solve the above problems, the present invention provides a nutrient intake calculation and assessment method comprising the steps of: (a) capturing an image before food intake and an image after food intake, and transmitting the captured images to a control unit 150; (b) identifying, by the food identification module 151 of the control unit 150, the food from the pre-intake image using a food database 220 in which foods and images are mapped and stored; (c) identifying, by the food material identification module 152 of the control unit 150, the food materials contained in the identified food using a food material database 230 in which foods, their food materials, and the nutrients per unit mass of each food material are mapped and stored; (d) receiving, at the control unit 150, a user-added material as an additional food material; (e) calculating, by the food quantity calculation module 154 of the control unit 150, the intake amount of each food material using the pre-intake and post-intake images; and (f) calculating, by the nutrient calculation module 153 of the control unit 150, the amount of intake nutrients using the food material database 230 and the calculated intake amount of each food material.

In addition, the method preferably further comprises, after the step (d), a step in which the nutrient calculation module 153 sets the amount of one serving of the food identified in the step (b) and calculates the amount of expected intake nutrients using the food materials and the food material database 230.

In addition, the steps (a) to (f) are preferably performed repeatedly, the user-added material input in the step (d) is stored in the user database 210, and in the step (c) the food material identification module 152 of the control unit 150 preferably identifies the food materials contained in the identified food using both the food material database 230 and the user database 210.

In addition, it is preferable that, after the step (f), (g) the calculated amount of intake nutrients is stored in the user database 210 and the steps (a) to (g) are performed repeatedly, and that the method further comprises: (h) calculating, by the statistical module 155 of the control unit 150, the cumulative amount of intake nutrients stored in the user database 210 over a predetermined period; and (i) checking, by the statistical module 155, for excessive or deficient nutrients by comparing the cumulative amount of intake nutrients with a statistical database 240 in which the recommended intake per nutrient for the predetermined period is stored.

In addition, it is preferable that the method further comprises, after the step (i), when an excessive nutrient is found: (j) identifying, by the statistical module 155, the food material containing the excessive nutrient; and (k) identifying, by the statistical module 155, the food containing the identified food material from the food material database 230, and selecting and outputting it as a caution food.

Preferably, the step (k) comprises: (k1) checking, by the statistical module 155, whether the identified food material is the food material additionally input in the step (d); and (k2) identifying, by the statistical module 155, the food containing the food material from the food material database 230, and selecting and outputting it as a caution food.

In addition, it is preferable that the method further comprises, after the step (i), when a deficient nutrient is found: (j') identifying, by the statistical module 155, the food material containing the deficient nutrient; and (k') identifying, by the statistical module 155, the food containing the identified food material from the food material database 230, and selecting and outputting it as a recommended food.
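For illustration only, the core flow of steps (a) through (f) might be organized as in the following Python sketch. The function names, database shapes, and nutrient figures are assumptions made for the example, not the patent's implementation, and the image-recognition and pixel-counting logic described later in the specification is only stubbed out.

```python
# A rough sketch of steps (a)-(f). All names, data, and figures here are
# illustrative assumptions; image processing is stubbed.

from typing import Dict

# Food material database (230): food -> ingredient -> nutrients per gram.
FOOD_MATERIAL_DB: Dict[str, Dict[str, Dict[str, float]]] = {
    "tteokbokki": {
        "rice cake": {"kcal": 2.3, "protein_g": 0.04},
        "fish cake": {"kcal": 1.4, "protein_g": 0.10},
    }
}

# User database (210): user-added materials per food, e.g. cheese in tteokbokki.
USER_DB: Dict[str, Dict[str, Dict[str, Dict[str, float]]]] = {
    "user1": {"tteokbokki": {"cheese": {"kcal": 4.0, "protein_g": 0.25}}}
}


def identify_food(pre_intake_image: str) -> str:
    """(b) Stub for matching the pre-intake image against the food database."""
    return "tteokbokki"


def identify_ingredients(user_id: str, food: str) -> Dict[str, Dict[str, float]]:
    """(c)+(d) Base ingredients plus the user's own additions."""
    ingredients = dict(FOOD_MATERIAL_DB.get(food, {}))
    ingredients.update(USER_DB.get(user_id, {}).get(food, {}))
    return ingredients


def intake_grams(pre_intake_image: str, post_intake_image: str,
                 ingredients: Dict[str, Dict[str, float]]) -> Dict[str, float]:
    """(e) Stub: grams eaten per ingredient, derived from the two images."""
    return {name: 50.0 for name in ingredients}  # placeholder amounts


def ingested_nutrients(user_id: str, pre_img: str, post_img: str) -> Dict[str, float]:
    """(f) Grams eaten x nutrients per gram, summed over all ingredients."""
    food = identify_food(pre_img)
    ingredients = identify_ingredients(user_id, food)
    eaten = intake_grams(pre_img, post_img, ingredients)
    totals: Dict[str, float] = {}
    for name, per_gram in ingredients.items():
        for nutrient, value in per_gram.items():
            totals[nutrient] = totals.get(nutrient, 0.0) + eaten[name] * value
    return totals


print(ingested_nutrients("user1", "before.jpg", "after.jpg"))
```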

According to the present invention, by capturing two images, one before and one after eating a food, with a user terminal such as a smartphone, the nutrient intake can be calculated with high accuracy, taking into account whether an individual eats unevenly and whether different ingredients are added in each household's recipe.

In addition, the calculated nutrient intake can be linked to an electronic medical record (EMR), so that the nutrient intake of seriously ill patients living at home can be transmitted directly to experts such as doctors, which can also enable effective medical care for outpatients.

FIG. 1 is a conceptual diagram illustrating the ingestion nutrient calculation and evaluation system according to the present invention.
FIG. 2 is a flowchart illustrating the ingestion nutrient calculation method according to the present invention.
FIG. 3 is a flowchart illustrating the ingestion nutrient evaluation method according to the present invention.

Hereinafter, the ingestion nutrient calculation and evaluation system and the calculation and evaluation method according to the present invention will be described with reference to the drawings.

Explanation of ingestion nutrient calculation and evaluation system

The system according to the present invention includes a user terminal 100 and a plurality of databases 210, 220, 230, and 240.

Here, the user terminal 100 has a photographing function and is capable of data computation, transmission, and reception. In one embodiment, it may be a smartphone having a camera and communication capability.

The user terminal 100 includes a photographing unit 110, an input unit 120, an output unit 130, and a control unit 150.

The photographing unit 110 photographs images before and after the ingestion of food. In one embodiment, it may be the camera provided in a smartphone.

The images before and after food intake are referred to as the pre-intake image and the post-intake image, respectively, and the images taken by the photographing unit 110 are transmitted to the control unit 150.

The user can input data using the input unit 120.

The input information may include a user ID and any user-added material that the user adds to a food.

The output unit 130 functions to output data to a user.

The output information may be the food materials contained in the food, the expected intake nutrients, the actual intake nutrients, any excessive or deficient nutrients, or the caution foods or recommended foods calculated accordingly.

In one embodiment, the input unit 120 and the output unit 130 may be integrated or may be a touch screen of a smartphone.

The control unit 150 includes a food identification module 151, a food material identification module 152, a nutrient calculation module 153, a food quantity calculation module 154 and a statistics module 155.

The food identification module 151 receives the pre-intake image and identifies the food by comparing it with the information stored in the food database 220.

The food material identification module 152 receives the food identified by the food identification module 151 and determines which materials are contained in the food by referring to the foods and materials mapped and stored in the food material database 230. In addition, it identifies user-added materials by referring to the foods and additional materials mapped and stored per user in the user database 210.

The nutrient calculation module 153 has a first function for calculating the expected intake nutrients and a second function for calculating the actual intake nutrients.

In relation to the first function, the nutrient calculation module 153 can calculate the expected intake nutrients, that is, the nutrients that would be consumed if one serving of the food identified by the food identification module 151 were eaten, based on the amounts of the food materials identified by the food material identification module 152.

In relation to the second function, once the actual intake amount of each food material has been calculated by the food quantity calculation module 154 described later, the amount of actually ingested nutrients is determined by referring to the materials and their nutrients mapped and stored in the food material database 230.

The food quantity calculation module 154 calculates the quantity of food actually consumed using the pre-intake and post-intake images taken by the photographing unit 110, as described in detail below.

The statistical module 155 calculates the cumulative amount of intake nutrients stored in the user database 210 over a preset period, determines which nutrients are excessive or deficient, identifies the food materials containing those nutrients, and determines the corresponding caution foods or recommended foods.

The user database 210 is created for each user ID and stores the user's login/logout history, age, and user-added materials for specific foods. In addition, the actually ingested nutrients determined by the control unit 150 are accumulated and stored in it.

In the food database 220, images are mapped to various foods and stored. Using this database, a food can be identified from the photograph taken by the photographing unit 110.

In the food material database 230, mappings of food to food materials to nutrients are stored. For example, "Tteokbokki" is mapped to "rice cake, fish cake, cabbage, and kochujang", and the nutrients per unit mass of rice cake, fish cake, cabbage, and kochujang are stored.
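As a concrete picture of this food-to-material-to-nutrient mapping, the sketch below uses nested Python dictionaries keyed by food and ingredient. The structure and the per-100 g nutrient figures are assumptions made for the example, not the actual contents of the food material database 230.

```python
# Hypothetical shape of the food material database (230):
# food -> ingredient -> nutrients per 100 g. Figures are illustrative only.
FOOD_MATERIAL_DB = {
    "tteokbokki": {
        "rice cake": {"kcal": 230, "carbohydrate_g": 52, "protein_g": 4},
        "fish cake": {"kcal": 140, "carbohydrate_g": 14, "protein_g": 10},
        "cabbage":   {"kcal": 25,  "carbohydrate_g": 6,  "protein_g": 1},
        "kochujang": {"kcal": 180, "carbohydrate_g": 40, "protein_g": 5},
    },
}

def nutrients_for(food: str, ingredient: str, grams: float) -> dict:
    """Scale the stored per-100 g values to an arbitrary mass."""
    per_100g = FOOD_MATERIAL_DB[food][ingredient]
    return {k: v * grams / 100.0 for k, v in per_100g.items()}

print(nutrients_for("tteokbokki", "rice cake", 150))  # 150 g of rice cake
```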

The recommended intake amount per nutrient is stored in the statistical database 240 according to a preset period.

Explanation of ingestion nutrient calculation method

The nutrient calculation method according to the present invention will be described with reference to FIG.

The user logs in to the system with a user ID using the input unit 120 of the user terminal 100. At this time, the corresponding user database 210 is selected (S201).

The user uses the photographing unit 110 to photograph an image before food is consumed (S202).

The photographed pre-intake image is transmitted to the control unit 150. The food identification module 151 of the control unit 150 identifies the food using the information stored in advance in the food database 220 and stores the result in the user database 210 (S203). The method of identifying a food from an image is conventional, and a detailed description thereof is omitted.

The food material identification module 152 confirms the materials included in the food identified in step S203 using the food material database 230 (S204). As described above, the food material database 230 maps each food to its materials, so the materials are identified automatically once the food has been identified.

Next, the food material identification module 152 checks for user-added materials associated with the food identified in step S203 using the user database 210 (S205).

A user-added material is a material that is not normally included in the food but that the user often adds.

For example, when the food identified in step S203 is "Tteokbokki", it can be confirmed from the data stored in the food material database 230 that its food materials are rice cake, fish cake, cabbage, and kochujang. If it has been confirmed that the user often eats "cheese" with this food, "cheese" is stored in the user database 210 as a user-added material and is also identified.

The step of adding a user-added material is explained in steps S207 to S208; naturally, no user-added material exists the first time a food is identified.
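A minimal sketch of how steps S204 to S208 could combine the two databases, assuming simple in-memory dictionaries; the data and the helper names identified_materials and add_user_material are illustrative only.

```python
# Illustrative sketch of steps S204-S208: base materials come from the food
# material database, user additions from the user database, and any newly
# entered addition is stored back for future meals.

FOOD_MATERIAL_DB = {"tteokbokki": ["rice cake", "fish cake", "cabbage", "kochujang"]}
USER_DB = {"user1": {"added_materials": {"tteokbokki": ["cheese"]}}}

def identified_materials(user_id: str, food: str) -> list:
    base = FOOD_MATERIAL_DB.get(food, [])
    added = USER_DB.get(user_id, {}).get("added_materials", {}).get(food, [])
    return base + [m for m in added if m not in base]

def add_user_material(user_id: str, food: str, material: str) -> None:
    """S207-S208: map a newly entered material to the food in the user DB."""
    per_food = USER_DB.setdefault(user_id, {}).setdefault("added_materials", {})
    per_food.setdefault(food, [])
    if material not in per_food[food]:
        per_food[food].append(material)

print(identified_materials("user1", "tteokbokki"))   # includes "cheese"
add_user_material("user1", "tteokbokki", "boiled egg")
print(identified_materials("user1", "tteokbokki"))
```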

Next, the output unit 130 outputs the identified food materials (i.e., the basic material identified in step S204 and the user added material identified in step S205) (S206).

After reviewing the output food materials, the user can input an additional material (S207) if a material he or she adds is not among those listed. The input user-added material is mapped to the corresponding food and stored in the user database 210 (S208).

In the food material database 230, nutrients per unit mass are stored in advance for each food material. The nutrient calculation module 153 calculates the expected intake nutrients using the food materials identified in step S206 (S209). Here, it is assumed that one serving of the food identified in step S203 is consumed.
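A small worked example of the expected-nutrient calculation in step S209, assuming hypothetical one-serving masses and per-gram nutrient values for the identified materials.

```python
# Sketch of S209: expected intake nutrients for one serving of the identified
# food. Serving masses and per-gram nutrient values are illustrative.

SERVING_GRAMS = {"rice cake": 150, "fish cake": 50, "cabbage": 30, "kochujang": 30}
NUTRIENTS_PER_GRAM = {
    "rice cake": {"kcal": 2.3, "protein_g": 0.04},
    "fish cake": {"kcal": 1.4, "protein_g": 0.10},
    "cabbage":   {"kcal": 0.25, "protein_g": 0.01},
    "kochujang": {"kcal": 1.8, "protein_g": 0.05},
}

def expected_nutrients(materials):
    totals = {}
    for name in materials:
        grams = SERVING_GRAMS.get(name, 0)
        for nutrient, per_gram in NUTRIENTS_PER_GRAM.get(name, {}).items():
            totals[nutrient] = totals.get(nutrient, 0.0) + grams * per_gram
    return totals

print(expected_nutrients(["rice cake", "fish cake", "cabbage", "kochujang"]))
```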

The output unit 130 outputs the expected intake nutrients calculated in step S209 (S210).

Next, the user consumes the food. After eating, the user photographs the remaining food with the photographing unit 110 (S211).

The photographed post-intake image is transmitted to the control unit 150. The food quantity calculation module 154 of the control unit 150 calculates the intake amount of each food material using the pre-intake image taken in step S202 and the post-intake image (S212).

Specifically, the intake amount of each food material can be calculated automatically through the steps of: identifying one or more containers common to the pre-intake image and the post-intake image; resizing one of the two images so that an identified container occupies the same number of pixels in both images; assigning each pixel of the pre-intake image to one of the food materials according to a preset criterion; and checking, in the post-intake image, how the pixels assigned to each food material have changed.

The criteria for assigning pixels to each food material follow the conventional art, and a detailed description is omitted. The criteria may also be updated continuously, which can increase the accuracy of the calculated intake.

Here, the container can be identified by its general shape, such as a circle or a rectangle. If no container is identified, a reference object on the table, such as a spoon or chopsticks, may instead be used to synchronize the sizes of the two images.
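Under the description above, the per-material intake of step S212 could be estimated roughly as in the sketch below; the pixel-assignment rule and the grams-per-pixel calibration factors are stand-ins for the preset criteria that the specification leaves to conventional techniques, and the numbers are invented.

```python
# Rough sketch of S212: scale the two photos so a common container covers the
# same number of pixels, count pixels assigned to each material before and
# after the meal, and convert the decrease into grams.

from typing import Dict

def scale_factor(container_pixels_before: int, container_pixels_after: int) -> float:
    """Area ratio that normalises the post-intake counts to the pre-intake image."""
    return container_pixels_before / container_pixels_after

def intake_per_ingredient(pixels_before: Dict[str, int],
                          pixels_after: Dict[str, int],
                          container_before: int,
                          container_after: int,
                          grams_per_pixel: Dict[str, float]) -> Dict[str, float]:
    s = scale_factor(container_before, container_after)
    eaten = {}
    for name, before in pixels_before.items():
        after = pixels_after.get(name, 0) * s          # normalised post-intake count
        eaten[name] = max(before - after, 0) * grams_per_pixel[name]
    return eaten

# Example with made-up pixel counts and calibration factors.
print(intake_per_ingredient(
    pixels_before={"rice cake": 12000, "fish cake": 4000},
    pixels_after={"rice cake": 3000, "fish cake": 4000},   # fish cake left over
    container_before=50000, container_after=40000,
    grams_per_pixel={"rice cake": 0.01, "fish cake": 0.01},
))
```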

Once the intake amount of each food material has been calculated, the nutrient calculation module 153 calculates the actually ingested nutrients in the same manner as in step S209. The calculated nutrients are stored in the user database 210 (S213).

The calculated intake nutrients are output to the user through the output unit 130 (S214).

Through this method, the user can check which nutrients, and how much of each, he or she has actually consumed after eating the food.

In particular, even if a certain food material is left uneaten because of picky eating, this is captured accurately because an image is taken after the meal, and the change in nutrients can be confirmed even for food materials added by different recipes, making the method more advanced than the conventional technology.

In addition, the nutrients thus calculated can be transmitted directly to an expert such as a doctor in cooperation with the EMR, so that they can not only help experts make decisions but also allow the user to receive advice on unbalanced eating or on recipes, which can also constitute effective medical care.

Explanation of ingestion nutrient evaluation method

Referring to FIG. 3, the nutrient assessment method according to the present invention will be described.

As noted above, the ingested nutrients may be delivered directly to an expert for evaluation; here, however, a method in which they are evaluated automatically is described.

As in step S201, the user logs in to the system with a user ID using the input unit 120 of the user terminal 100, and the corresponding user database 210 is selected (S301).

The statistical module 155 calculates the cumulative amount of the intake nutrients stored in the user database 210 in step S213 over a preset period (S302). The preset period may be, for example, one day, one week, or one month, but is not limited thereto.

In the statistical database 240, the recommended intake per nutrient is stored as a range for a predetermined period such as one day, one week, or one month. The statistical module 155 compares the recommended intake stored in advance in the statistical database 240 with the user's actual cumulative nutrient amounts stored in the user database 210 to determine which nutrients are excessive or deficient (S303).

In another embodiment of the present invention, the period over which intake nutrients are accumulated in the user database 210 may differ from the period for which the recommended intake stored in the statistical database 240 is defined.

For example, the intake nutrients accumulated in the user database 210 may cover one meal, that is, 1/3 of a day, while the recommended intake stored in the statistical database 240 may be defined per day. In this case, the statistical module 155 matches the two periods before determining excess or deficiency; in this example, it compares the nutrients accumulated for the meal in the user database 210 against one third of the daily recommended intake stored in the statistical database 240.
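A minimal sketch of the comparison in steps S302 and S303 with the period matching just described, assuming the recommended intake is stored as a per-day range; all figures are illustrative.

```python
# Sketch of S302-S303: the user's cumulative intake covers 1/3 of a day, so the
# daily recommended ranges are scaled by 1/3 before checking for excess or
# deficiency. Ranges and intake values are illustrative assumptions.

RECOMMENDED_DAILY = {"sodium_mg": (1000, 2300), "protein_g": (50, 90)}  # (min, max)

def assess(cumulative: dict, period_fraction: float):
    excess, deficient = [], []
    for nutrient, (low, high) in RECOMMENDED_DAILY.items():
        value = cumulative.get(nutrient, 0.0)
        if value > high * period_fraction:
            excess.append(nutrient)
        elif value < low * period_fraction:
            deficient.append(nutrient)
    return excess, deficient

# One meal (~1/3 day) with too much sodium and too little protein.
print(assess({"sodium_mg": 1500, "protein_g": 10}, period_fraction=1/3))
```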

If there is an excessive nutrient (S304), the statistical module 155 identifies the food material containing that nutrient from the food material database 230 (S305).

Next, the statistical module 155 checks whether the corresponding food material is a user-added material (S306). This is because a user-added ingredient, added by a household's particular recipe, is more likely to be consumed in above-average amounts.

If the material is confirmed to be a user-added material, the food the user has been consuming is set as a caution food (S307).

If it is not a user-added material, the foods containing the food material are identified from the food material database 230 and set as caution foods (S308).

The caution food thus set is output to the user (S309), so that the user can see which food intake should be reduced.

If there is a deficient nutrient (S310), the statistical module 155 identifies the food material containing that nutrient from the food material database 230 (S311), identifies the foods containing the identified food material, and sets them as recommended foods (S312).

The recommended food thus set is output to the user (S313), so that the user can see which food consumption should be increased.
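The selection of caution foods and recommended foods in steps S304 to S313 amounts to an inverse lookup from ingredient to food, as in the following sketch; the database contents and the thresholds are illustrative assumptions, not part of the specification.

```python
# Sketch of S304-S313: invert the food material database to find foods that
# contain the ingredient responsible for an excess (caution foods) or for a
# deficient nutrient (recommended foods). Data are illustrative.

FOOD_MATERIAL_DB = {
    "tteokbokki": {"rice cake": {"sodium_mg": 1}, "kochujang": {"sodium_mg": 20}},
    "ramen":      {"noodles": {"sodium_mg": 5}, "soup base": {"sodium_mg": 30}},
    "salad":      {"cabbage": {"sodium_mg": 0.1}, "chicken": {"protein_g": 0.25}},
}

def ingredients_rich_in(nutrient: str, per_gram_threshold: float):
    """S305/S311: ingredients whose per-gram content of the nutrient is high."""
    found = set()
    for ingredients in FOOD_MATERIAL_DB.values():
        for name, nutrients in ingredients.items():
            if nutrients.get(nutrient, 0.0) >= per_gram_threshold:
                found.add(name)
    return found

def foods_containing(ingredients):
    """S308/S312: foods mapped to any of the given ingredients."""
    return {food for food, ing in FOOD_MATERIAL_DB.items()
            if set(ing) & set(ingredients)}

excess_ingredients = ingredients_rich_in("sodium_mg", per_gram_threshold=2.0)
print("caution foods:", foods_containing(excess_ingredients))       # excess sodium
deficient_ingredients = ingredients_rich_in("protein_g", per_gram_threshold=0.2)
print("recommended foods:", foods_containing(deficient_ingredients))  # protein deficit
```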

In another embodiment, steps S307, S308, and S312 are omitted, and in steps S309 and S313 only the caution food material or the recommended food material is output instead of a caution food or a recommended food. From the output food materials, the user can see which materials to avoid and which to add when cooking at home.

In this way, the method can automatically check whether nutrients are excessive or deficient as a result of individual eating habits or each household's different recipes, and at the same time indicate which foods should be reduced or increased, or which food materials should be reduced or increased during cooking, to correct the imbalance.

Although preferred embodiments of the present invention have been described, the present invention is not limited to the specific embodiments described above. It will be apparent to those skilled in the art that numerous modifications and variations can be made without departing from the spirit or scope of the appended claims, and such modifications and their equivalents should be regarded as falling within the scope of the present invention.

100: User terminal
110: Photographing unit
120: Input unit
130: Output unit
150: Control unit
151: food identification module
152: Food ingredient identification module
153: Nutrient calculation module
154: food quantity calculation module
155: Statistics module
210: user database
220: Food database
230: Food ingredient database
240: Statistics database

Claims (7)

1. A method for calculating and evaluating ingested nutrients, comprising the steps of:
(a) capturing an image before food intake and an image after food intake, and transmitting the captured images to a control unit 150;
(b) identifying, by a food identification module 151 of the control unit 150, the food from the pre-intake image using a food database 220 in which foods and images are mapped and stored;
(c) identifying, by a food material identification module 152 of the control unit 150, the food materials contained in the identified food using a food material database 230 in which foods, their food materials, and the nutrients per unit mass of each food material are mapped and stored;
(d) receiving, at the control unit 150, a user-added material as an additional food material;
(e) calculating, by a food quantity calculation module 154 of the control unit 150, the intake amount of each food material using the pre-intake and post-intake images; and
(f) calculating, by a nutrient calculation module 153 of the control unit 150, the amount of ingested nutrients using the food material database 230 and the calculated intake amount of each food material.
2. The method according to claim 1, further comprising, after the step (d), the step of setting, by the nutrient calculation module 153, the amount of one serving of the food identified in the step (b) and calculating the amount of expected intake nutrients using the food materials and the food material database 230.
3. The method according to claim 1, wherein the steps (a) to (f) are performed repeatedly, the user-added material input in the step (d) is stored in a user database 210, and the step (c) comprises identifying, by the food material identification module 152 of the control unit 150, the food materials contained in the identified food using the food material database 230 and the user database 210.
4. The method according to any one of claims 1 to 3, further comprising, after the step (f):
(g) storing the calculated amount of ingested nutrients in the user database 210, the steps (a) to (g) being performed repeatedly; and, after the step (g),
(h) calculating, by a statistical module 155 of the control unit 150, the cumulative amount of ingested nutrients stored in the user database 210 over a preset period; and
(i) checking, by the statistical module 155, for excessive or deficient nutrients by comparing the cumulative amount of ingested nutrients with a statistical database 240 in which the recommended intake per nutrient for the preset period is stored.
5. The method of claim 4, further comprising, after the step (i), when an excessive nutrient is found:
(j) identifying, by the statistical module 155, the food material containing the excessive nutrient; and
(k) identifying, by the statistical module 155, the food containing the identified food material from the food material database 230, and selecting and outputting it as a caution food.
6. The method of claim 5, wherein the step (k) comprises:
(k1) checking, by the statistical module 155, whether the identified food material is the food material input in the step (d); and
(k2) identifying, by the statistical module 155, the food containing the food material from the food material database 230, and selecting and outputting it as a caution food.
7. The method of claim 4, further comprising, after the step (i), when a deficient nutrient is found:
(j') identifying, by the statistical module 155, the food material containing the deficient nutrient; and
(k') identifying, by the statistical module 155, the food containing the identified food material from the food material database 230, and selecting and outputting it as a recommended food.
KR1020130017575A 2013-02-19 2013-02-19 Method for calculating and assessment of nutrient intake KR20140103738A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130017575A KR20140103738A (en) 2013-02-19 2013-02-19 Method for calculating and assessment of nutrient intake

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130017575A KR20140103738A (en) 2013-02-19 2013-02-19 Method for calculating and assessment of nutrient intake

Publications (1)

Publication Number Publication Date
KR20140103738A (en) 2014-08-27

Family

ID=51747951

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130017575A KR20140103738A (en) 2013-02-19 2013-02-19 Method for calculating and assessment of nutrient intake

Country Status (1)

Country Link
KR (1) KR20140103738A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160120131A (en) * 2015-04-07 2016-10-17 엘지전자 주식회사 Wearalbe terminal and display device wireless communicating with the same
KR20180116779A (en) * 2017-04-17 2018-10-26 가천대학교 산학협력단 An artificial intelligence based image and speech recognition nutritional assessment method
KR20200003590A (en) * 2018-07-02 2020-01-10 이화여자대학교 산학협력단 Method and apparatus for managing dietary habits of hemodialysis patients
KR20200009710A (en) * 2018-07-20 2020-01-30 주식회사 가온앤 Method and system for recommending baby food through nutritional analysis
KR20200072444A (en) * 2018-12-12 2020-06-22 주식회사 누비랩 Cafeteria management system


Similar Documents

Publication Publication Date Title
CN104809164A (en) Healthy diet recommendation method based on mobile terminal and mobile terminal
US20030076983A1 (en) Personal food analyzer
CN111480202B (en) Apparatus and method for personalized meal plan generation
KR20140103738A (en) Method for calculating and assessment of nutrient intake
CN105793887A (en) Method and system for capturing food consumption information of a user
WO2012047940A1 (en) Personal nutrition and wellness advisor
KR20120058294A (en) Meal plate for appropriate feeding behavior and meal quantity management
CN107763958A (en) A kind of intelligent refrigerator
JP2010033326A (en) Diet-health management system, method, and program
KR102310493B1 (en) Diet care service method
CN112289407A (en) Catering management method, system, device and storage medium based on health management
CN110853732A (en) Digital menu generation method and electronic equipment
KR20220158477A (en) Server and client of providing dietary management service and method implementing thereof
CN107705837B (en) Method, device and system for recommending selected food
JP2016173658A (en) Health management system, health management method, program, and recording medium
CN110806697B (en) Method and device for determining prompt mode based on smart home operating system
JP2011203799A (en) Meal management system
CN112349385A (en) Recipe recommendation method, device, equipment and storage medium based on refrigeration system
EP3557587A1 (en) An apparatus and method for personalized meal plan generation
KR101580016B1 (en) System for providing meal valuation and method thereof
CN115985470A (en) Intelligent nutrition management method and intelligent management system
JP2022530263A (en) Food measurement methods, equipment and programs
JP6083661B1 (en) Health management server, health management server control method, and health management program
CN114207732A (en) Apparatus and method for providing dietary recommendations
KR20190048922A (en) Smart table and controlling method thereof

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application