CN111652044A - Dietary nutrition analysis method based on convolutional neural network target detection - Google Patents


Info

Publication number
CN111652044A
Authority
CN
China
Prior art keywords
dish
data
neural network
convolutional neural
target detection
Prior art date
Legal status
Pending
Application number
CN202010298450.9A
Other languages
Chinese (zh)
Inventor
罗飞宏
张静
孙浛
尚永豪
周怡瑶
陈婕妤
孙成君
程偌倩
奚立
Current Assignee
East China University of Science and Technology
Childrens Hospital of Fudan University
Original Assignee
East China University of Science and Technology
Childrens Hospital of Fudan University
Priority date
Filing date
Publication date
Application filed by East China University of Science and Technology, Childrens Hospital of Fudan University filed Critical East China University of Science and Technology
Priority to CN202010298450.9A priority Critical patent/CN111652044A/en
Publication of CN111652044A publication Critical patent/CN111652044A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G06N 3/08 - Learning methods
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/60 - ICT specially adapted for therapies or health-improving plans relating to nutrition control, e.g. diets

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Nutrition Science (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The invention provides a dietary nutrition analysis method based on convolutional neural network target detection, which comprises the following steps. Step one, collect data and label it manually: photograph a dish picture or a physical cooked dish, store the photograph as dish picture data, manually label the dish picture with dish name data, and calculate the dish nutrient component data from the ingredients of the cooked dish and their mass. Step two, perform data enhancement on the data. Step three, train the neural network. Step four, calculate the nutrient content of the food. The dish picture data, the dish name data and the dish nutrient component data are obtained through dietary nutrition analysis software installed on a mobile terminal. The aim of the invention is to provide a dietary nutrition analysis method based on convolutional neural network target detection that identifies a picture or a physical serving of a cooked dish and obtains the nutritional components of the cooked dish.

Description

Dietary nutrition analysis method based on convolutional neural network target detection
Technical Field
The invention belongs to the field of nutrition analysis, and particularly relates to a dietary nutrition analysis method based on convolutional neural network target detection.
Background
As living conditions have gradually improved, people's diets have improved greatly as well, but a new problem has appeared: excessive nutrient intake places a burden on the body. Most software currently on the market can only record the nutrients a user ingests each day; the intake has to be entered manually by the user, and the software cannot automatically calculate the nutrient content of the dishes the user actually eats.
Target detection (object detection) means that a computer, imitating the human eye, retrieves and localizes objects of interest in an image. Deep-learning-based target detection methods can learn features from large amounts of data; the deep learning models currently applied to image recognition and analysis mainly include the convolutional neural network (CNN), the deep belief network (DBN) and the stacked auto-encoder (SAE). The feature information obtained through a convolutional neural network comprises shallow information and deep information: shallow information refers to the feature maps produced by the earlier convolutional layers, whose receptive fields attend more to details such as fine image texture, while deep information refers to the feature maps produced by the later convolutional layers.
Summary of the invention
In view of the above, the present invention provides a dietary nutrition analysis method based on convolutional neural network target detection, which identifies a picture or a physical serving of a cooked dish and obtains the nutritional components of the cooked dish.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
a diet nutrition analysis method based on convolutional neural network target detection comprises the following steps,
step one, collecting data and labelling it manually: photographing a dish picture or a physical cooked dish and storing the photograph as dish picture data, manually labelling the dish picture with dish name data, calculating the dish nutrient component data from the ingredients of the cooked dish and their mass, and storing the dish picture data, the dish name data and the dish nutrient component data in a database;
step two, data enhancement is carried out on the data: expanding the dish picture data through a data enhancement technology;
step three, training a neural network: training a convolutional neural network target detection system with the dish picture data, the dish name data and the dish nutrient component data, so that the convolutional neural network target detection system can output the dish nutrient component data when the dish name data and/or the dish picture data are input;
step four, calculating the nutrient content of the food: the nutrient content of the cooked dish is calculated by aggregating and summing the dish nutrient component data.
Further, in step four, the carbohydrate, fat and protein contents of the dish are calculated from the nutrient content per 100 g of each food ingredient, and the calories are calculated from the carbohydrate, fat and protein contents.
Further, the following steps are also included between the third step and the fourth step,
identifying the type of food and acquiring the position coordinates of the dishes by using a neural network: identifying the type of food and acquiring the position coordinates of dishes by using a convolutional neural network-based target detection technology;
predicting depth information of the food and estimating the volume using a neural network: depth information of the food is predicted and the volume is estimated using convolutional neural network based object detection techniques.
Furthermore, the dish picture data, the dish name data, the dish nutrient component data and the convolutional neural network target detection system are stored on the mobile phone terminal.
Furthermore, the dish picture data, the dish name data and the dish nutritional ingredient data are obtained through meal nutritional analysis software installed on the mobile terminal.
Furthermore, the meal nutrition analysis software comprises a data input program, an identification program and a data display program, wherein the data input program is used for reading dish picture data and dish name data input by a user; the identification program is used for identifying dishes and calculating nutrient content data of the dishes; and the data display program is used for displaying the nutrient content data of the dishes.
Furthermore, the dietary nutrition analysis software also calculates a recommended nutrient intake from the height and weight data entered by the user, and the recommended nutrient intake is read by the data display program.
Compared with the prior art, the dietary nutrition analysis method based on convolutional neural network target detection has the following advantages:
according to the invention, the target detection system can output the nutrient composition data of the dishes by inputting the name data of the dishes and/or the picture data of the dishes, the name data of the dishes and the nutrient composition data of the dishes are obtained by meal nutrient analysis software installed on the mobile terminal, when the user takes food, the name data of the dishes and/or the picture data of the dishes are input by the user, preferably, the user can take a picture of the eaten dishes to form the picture data of the dishes, and by using the meal nutrient analysis software, the name data of the dishes can be automatically identified and the nutrient composition data of the dishes can be output; in the whole process, the user does not need to manually measure the weight of the dishes or input the nutritional ingredients by himself, so that the user can feel easier and more convenient.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the invention without limitation. In the drawings:
FIG. 1 is a schematic diagram of a dietary nutrition analysis method based on convolutional neural network target detection according to the present invention;
fig. 2 is a flow chart of the dietary nutrition analysis software according to the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in the orientation or positional relationship indicated in the drawings, which are merely for convenience in describing the invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are therefore not to be construed as limiting the invention. Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," etc. may explicitly or implicitly include one or more of that feature. In the description of the invention, the meaning of "a plurality" is two or more unless otherwise specified.
In the description of the invention, it is to be noted that, unless otherwise explicitly specified or limited, the terms "mounted", "connected" and "connected" are to be construed broadly, e.g. as being fixed or detachable or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the creation of the present invention can be understood by those of ordinary skill in the art through specific situations.
The invention will be described in detail below with reference to embodiments and the attached drawings.
As shown in fig. 1, a method for analyzing dietary nutrition based on convolutional neural network target detection includes the following steps,
step one, collecting data and labelling it manually: photographing a dish picture or a physical cooked dish and storing the photograph as dish picture data, manually labelling the dish picture with dish name data, calculating the dish nutrient component data from the ingredients of the cooked dish and their mass, and storing the dish picture data, the dish name data and the dish nutrient component data in a database;
step two, data enhancement is carried out on the data: expanding the dish picture data through a data enhancement technology;
step three, training a neural network: the convolutional neural network target detection system is trained with the dish picture data, the dish name data and the dish nutrient component data, so that the system can output the dish nutrient component data when the dish name data and/or the dish picture data are input.
step four, calculating the nutrient content of the food: the nutrient content of the cooked dish is calculated by aggregating and summing the dish nutrient component data.
It should be further noted that data enhancement (data augmentation) means creating additional training samples from the existing data, for example by flipping, translating or rotating the images, so that the neural network generalizes better. Common data enhancement techniques include flipping, rotation, scaling, cropping and translation; a small illustrative sketch follows.
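A minimal augmentation sketch, assuming the dish photographs are processed in Python with the torchvision library; the specific transforms, magnitudes and the function name expand_dataset are illustrative assumptions, not values prescribed by the patent:

```python
# Illustrative data-augmentation pipeline for dish photographs.
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),                      # flipping
    transforms.RandomRotation(degrees=15),                       # rotation
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),         # scaling + cropping
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1)),    # translation
])

def expand_dataset(image_paths, copies_per_image=5):
    """Create several augmented copies of every labelled dish photograph."""
    augmented = []
    for path in image_paths:
        img = Image.open(path).convert("RGB")
        for _ in range(copies_per_image):
            augmented.append(augment(img))
    return augmented
```

For detection training, the bounding-box annotations would have to be transformed together with the images (for example with torchvision.transforms.v2); this sketch only expands the image data itself.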
Further, in step four, the carbohydrate, fat and protein contents of the dish are calculated from the nutrient content per 100 g of each food ingredient, and the calories are calculated from the carbohydrate, fat and protein contents.
The calculation of the nutrient content of the food as described in step four is illustrated as follows:
For example, suppose a large bowl of beef stewed with potatoes is photographed and the first three steps yield the following result:
[Table image RE-GDA0002613714000000051: identified ingredients and estimated amounts; not reproduced in this text]
The nutrient content of each corresponding food is then looked up in the food composition table:
[Table images RE-GDA0002613714000000052 and RE-GDA0002613714000000061: nutrient content of the corresponding foods; not reproduced in this text]
According to the energy yielded by each nutrient:
1 gram of carbohydrate produces 4kcal of energy
1 gram of fat produces 9kcal of energy
1 gram of protein produces 4kcal of energy
1 gram of alcohol produces 7kcal of energy
The calorie count of the large bowl of stewed beef obtained from this calculation is:
60 × 4 + 56.6 × 9 + 87.4 × 4 + 17.6 = 1116.6 kcal
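The calorie computation in step four reduces to the energy-yield factors listed above. A minimal sketch, assuming the per-dish macronutrient totals (in grams) have already been obtained from the food composition table; the example amounts in the last line are hypothetical, not the values from the patent's tables:

```python
# Energy yield per gram of each nutrient, as stated in the description (kcal/g).
ENERGY_PER_GRAM = {"carbohydrate": 4, "fat": 9, "protein": 4, "alcohol": 7}

def calories(grams_by_nutrient):
    """Total calories of a dish from its macronutrient masses in grams."""
    return sum(ENERGY_PER_GRAM[name] * grams
               for name, grams in grams_by_nutrient.items())

# Hypothetical totals for one serving (illustrative only).
print(calories({"carbohydrate": 50.0, "fat": 20.0, "protein": 30.0}))  # 500 kcal
```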
The dietary nutrition analysis method based on convolutional neural network target detection further comprises the following steps.
Identifying the type of food and acquiring the position coordinates of the dishes with a neural network: the food type is identified and the dish position coordinates are acquired using a target detection technique based on a convolutional neural network; the technique recognizes and localizes target features by outputting the coordinates (x, y) of feature points in the image.
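A sketch of how a convolutional neural network detector can return both the food category and the dish position. The patent does not name a particular detection architecture; a torchvision Faster R-CNN, assumed here to be fine-tuned on the labelled dish photographs, is used only as an example, and detect_dishes and the score threshold are illustrative:

```python
# Illustrative inference with a generic CNN detector returning class + box coordinates.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")  # would be fine-tuned on the dish classes
model.eval()

def detect_dishes(image_path, score_threshold=0.5):
    """Return (label_id, box) pairs; each box is (x1, y1, x2, y2) in pixels."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]
    return [(int(label), box.tolist())
            for label, box, score in zip(output["labels"],
                                         output["boxes"],
                                         output["scores"])
            if score >= score_threshold]
```

In practice the classification head would be replaced with the dish categories collected in step one and the model re-trained on the augmented dataset from step two.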
Predicting the depth information of the food and estimating its volume with a neural network: the depth information of the food is predicted and the volume is estimated using a target detection technique based on a convolutional neural network. The feature information obtained through the convolutional neural network comprises shallow information and deep information: shallow information refers to the feature maps produced by the earlier convolutional layers, whose receptive fields attend more to details such as fine image texture, while deep information refers to the feature maps produced by the later convolutional layers.
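The patent does not spell out how volume is derived from the predicted depth. One plausible sketch, assuming a monocular depth network produces a per-pixel depth map in metres and the camera intrinsics are known, is to integrate the height of the food above a reference plate plane over the detected region; the function name, the plate-plane reference and the pinhole-camera approximation are all assumptions:

```python
import numpy as np

def estimate_volume(depth_map, food_mask, fx, fy, plate_depth):
    """Approximate food volume (in litres) from a predicted depth map.

    depth_map   : HxW array of predicted depths in metres (camera to surface)
    food_mask   : HxW boolean array marking pixels inside the detected dish region
    fx, fy      : camera focal lengths in pixels
    plate_depth : depth of the empty plate / table plane in metres
    """
    depths = depth_map[food_mask]
    heights = np.clip(plate_depth - depths, 0.0, None)   # food rises towards the camera
    # Each pixel's footprint on the plate plane is roughly (Z/fx) * (Z/fy).
    pixel_areas = (depths / fx) * (depths / fy)
    volume_m3 = float(np.sum(heights * pixel_areas))
    return volume_m3 * 1000.0                             # cubic metres -> litres
```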
Furthermore, the dish picture data, the dish name data, the dish nutrient component data and the convolutional neural network target detection system are stored on the mobile phone terminal.
As shown in fig. 2, further, the dish picture data, the dish name data and the dish nutrient component data are obtained through meal nutrient analysis software installed on the mobile terminal.
Furthermore, the meal nutrition analysis software comprises a data input program, an identification program and a data display program, wherein the data input program is used for reading dish picture data and dish name data input by a user; the identification program is used for identifying dishes and calculating nutrient content data of the dishes; and the data display program is used for displaying the nutrient content data of the dishes.
When the dietary nutrition analysis software is used, the user photographs the dishes 3 being eaten with the mobile terminal to obtain dish picture data; the dishes 3 are identified and/or their volume is estimated by the convolutional neural network target detection system, so the nutrient components of the dishes are calculated and recorded automatically, which makes it easier for the user to control his or her daily intake. A high-level sketch of this flow appears below.
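A high-level sketch of how the three programs of the dietary nutrition analysis software could fit together on the mobile terminal. The function names, the stand-in values and the small nutrient database are placeholders introduced for illustration only, not part of the patent:

```python
# Placeholder end-to-end flow: data input -> identification -> data display.
NUTRIENTS_PER_100G = {          # hypothetical per-100 g database entries
    "potato_stewed_beef": {"carbohydrate": 8.0, "fat": 7.5, "protein": 11.6},
}

def data_input_program(image_path, dish_name=None):
    """Read the dish photograph and any dish name typed by the user."""
    return {"image": image_path, "name": dish_name}

def identification_program(entry):
    """Identify the dish and compute its nutrient content.

    In a full system the dish name would come from the CNN detector and the
    mass from the volume estimate; fixed stand-in values are used here.
    """
    dish_name = entry["name"] or "potato_stewed_beef"   # stand-in for CNN output
    grams = 450.0                                       # stand-in for volume-based mass estimate
    per100 = NUTRIENTS_PER_100G[dish_name]
    return {k: v * grams / 100.0 for k, v in per100.items()}

def data_display_program(nutrients):
    """Show the computed nutrient content to the user."""
    for name, grams in nutrients.items():
        print(f"{name}: {grams:.1f} g")

data_display_program(identification_program(data_input_program("dinner.jpg")))
```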
Furthermore, the dietary nutrition analysis software also calculates a recommended nutrient intake from the height and weight data entered by the user, and the recommended nutrient intake is read by the data display program.
The recommended nutrient intake can be calculated from the Ideal Body Weight (IBW), which follows different national standards for different age groups, as described below:
1) adult stage: body Mass Index (BMI) ═ body weight (in kg)/bodyHeight of2(unit is m)2)
,BMI<18.5kg/m2The daily nutrient ratio is that carbohydrate accounts for 50-60% of the total calorie, fat accounts for 25-30% and protein accounts for 10-15%, the adult user selects the current labor intensity from the following table, the program automatically judges the calorie requirement corresponding to the weight state according to the height and the weight input by the user, the recommended daily nutrient content is the calorie requirement of × units of weight, if a male in the age of 30, a computer software engineer has the current height of 1.68m, the weight of 80kg and the body activity level of low, and after the program is input, the program automatically calculates the BMI of 80/1.682=28.34(kg/m2) Automatically judging the obesity according to the internal standard, and recommending the daily calorie requirement range to be 80 × (20-25), namely 1600-2000 kilocalories;
standard of physical activity level in adult stage
[Table image RE-GDA0002613714000000081: physical-activity-level standard for adults; not reproduced in this text]
Caloric requirement per unit body weight in the adult stage
[Table image RE-GDA0002613714000000082: caloric requirement per unit body weight for adults; not reproduced in this text]
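A minimal sketch of the adult-stage recommendation described above, assuming the per-kilogram calorie requirement has already been read from the two tables; the (20, 25) kcal/kg range is the one quoted in the worked example, and the function names are illustrative:

```python
def bmi(weight_kg, height_m):
    """Body Mass Index = weight (kg) / height^2 (m^2)."""
    return weight_kg / height_m ** 2

def daily_calorie_range(weight_kg, kcal_per_kg_range):
    """Recommended daily calories = per-kg requirement x body weight.

    kcal_per_kg_range comes from the adult tables (physical-activity level and
    weight status); (20, 25) is the range quoted in the worked example for an
    obese adult with low physical activity.
    """
    low, high = kcal_per_kg_range
    return weight_kg * low, weight_kg * high

print(round(bmi(80, 1.68), 2))            # 28.34 kg/m^2, judged obese in the example
print(daily_calorie_range(80, (20, 25)))  # (1600, 2000) kcal
```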
2) Child and adolescent stage: the three nutrients are allocated according to age. In view of the importance of nutrition in early childhood and in line with international consensus, diagnoses such as obesity and overweight are not made for children under 2 years of age, so the nutritional requirements of children under 2 are not covered by the software of the present application. The body-weight categories of children considered in this application include wasting*, overweight and obesity*, as well as normal body weight in all cases. The nutrient requirements for each age group over 2 years old are as follows. Ages 2-4: carbohydrate 50% of total calories, fat 35%, protein 15%. Ages 4-18: carbohydrate 55% of total calories, fat 30%, protein 15%. Judging the ideal body weight of children and adolescents and calculating calories and nutrients: the corresponding height and weight are checked against the growth curves for each age published in China, and the calorie requirement and nutrient allocation are generated according to the children's nutrient recommendation table below. For example, a 9-year-old boy weighs 45 kg and is 1.354 m tall; the ideal-weight median on the growth curve is 30.46 kg, so he exceeds the ideal body weight by 45 - 30.46 = 14.54 kg, his recommended amount is taken from the 10 kg < IBW ≤ 20 kg row of the table, and the calories (kcal) = 900 + 50 × (IBW - 10 kg) = 900 + 50 × (30.46 - 10) = 1923 kcal.
The nutrient component suggestion table for children is as follows:
Ideal Body Weight (IBW) and recommended calorie intake:
IBW ≤ 10 kg: calories (kcal) = 90 × IBW
10 kg < IBW ≤ 20 kg: calories (kcal) = 900 + 50 × (IBW - 10 kg)
IBW > 20 kg: calories (kcal) = 1500 + 20 × (IBW - 20 kg)
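A minimal sketch of the children's calorie recommendation, implementing the piecewise table above directly; the sample IBW values in the last line are illustrative, one per band of the table:

```python
def recommended_kcal(ibw_kg):
    """Recommended calories for a child, from the ideal-body-weight table."""
    if ibw_kg <= 10:
        return 90 * ibw_kg
    if ibw_kg <= 20:
        return 900 + 50 * (ibw_kg - 10)
    return 1500 + 20 * (ibw_kg - 20)

# Illustrative values, one per band of the table.
print(recommended_kcal(8), recommended_kcal(15), recommended_kcal(25))  # 720 1150 1600
```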
The user enters his or her height, weight and other information, and the dietary nutrition analysis software calculates the user's recommended nutrient intake. At mealtime, the user only needs to photograph the dishes being eaten; the software automatically identifies the dish types and calculates their volume, from which the nutrient content of the dishes is computed, and the user can adjust the result manually. Throughout the process the user never has to weigh the dishes or enter the nutritional components manually, which makes the experience easier and more convenient.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the invention, so that any modifications, equivalents, improvements and the like, which are within the spirit and principle of the present invention, should be included in the scope of the present invention.

Claims (7)

1. A dietary nutrition analysis method based on convolutional neural network target detection, characterized by comprising the following steps:
step one, collecting data and labelling it manually: photographing a dish picture or a physical cooked dish and storing the photograph as dish picture data, manually labelling the dish picture with dish name data, calculating the dish nutrient component data from the ingredients of the cooked dish and their mass, and storing the dish picture data, the dish name data and the dish nutrient component data in a database;
step two, data enhancement is carried out on the data: expanding the dish picture data through a data enhancement technology;
step three, training a neural network: training a convolutional neural network target detection system with the dish picture data, the dish name data and the dish nutrient component data, so that the convolutional neural network target detection system can output the dish nutrient component data when the dish name data and/or the dish picture data are input;
step four, calculating the nutrient content of the food: calculating the nutrient content of the cooked dish by aggregating and summing the dish nutrient component data.
2. A method of dietary nutrition analysis based on convolutional neural network target detection as claimed in claim 1, wherein: in step four, the carbohydrate, fat and protein contents of the dish are calculated from the nutrient content per 100 g of each food ingredient, and the calories are calculated from the carbohydrate, fat and protein contents.
3. A method of dietary nutrition analysis based on convolutional neural network target detection as claimed in claim 1, wherein: the following steps are also included between the third step and the fourth step,
identifying the type of food and acquiring the position coordinates of the dishes by using a neural network: identifying the type of food and acquiring the position coordinates of dishes by using a convolutional neural network-based target detection technology;
predicting depth information of the food and estimating the volume using a neural network: depth information of the food is predicted and the volume is estimated using convolutional neural network based object detection techniques.
4. A method of dietary nutrition analysis based on convolutional neural network target detection as claimed in claim 1, wherein: the dish picture data, the dish name data, the dish nutrient component data and the convolutional neural network target detection system are stored on the mobile phone terminal.
5. A method of dietary nutrition analysis based on convolutional neural network target detection as claimed in claim 1, wherein: the dish picture data, the dish name data and the dish nutritional ingredient data are acquired through meal nutritional analysis software installed on the mobile terminal.
6. A dietary nutrition analysis method based on convolutional neural network target detection as claimed in claim 5, characterized in that: the meal nutrition analysis software comprises a data input program, an identification program and a data display program, wherein the data input program is used for reading dish picture data and dish name data input by a user; the identification program is used for identifying dishes and calculating nutrient content data of the dishes; and the data display program is used for displaying the nutrient content data of the dishes.
7. A method of dietary nutrition analysis based on convolutional neural network target detection as claimed in claim 6, wherein: the dietary nutrition analysis software also comprises a step of calculating corresponding nutrient component recommended amount according to height and weight data input by a user, and the nutrient component recommended amount is read by the data display program.
Application CN202010298450.9A, priority date 2020-04-16, filing date 2020-04-16: Dietary nutrition analysis method based on convolutional neural network target detection. Status: Pending. Publication: CN111652044A (en).

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010298450.9A CN111652044A (en) 2020-04-16 2020-04-16 Dietary nutrition analysis method based on convolutional neural network target detection


Publications (1)

Publication Number Publication Date
CN111652044A true CN111652044A (en) 2020-09-11

Family

ID=72346444

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010298450.9A Pending CN111652044A (en) 2020-04-16 2020-04-16 Dietary nutrition analysis method based on convolutional neural network target detection

Country Status (1)

Country Link
CN (1) CN111652044A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108320786A (en) * 2018-02-06 2018-07-24 华南理工大学 A kind of Chinese meal vegetable recommendation method based on deep neural network
CN108831530A (en) * 2018-05-02 2018-11-16 杭州机慧科技有限公司 Vegetable nutrient calculation method based on convolutional neural networks
CN110059654A (en) * 2019-04-25 2019-07-26 台州智必安科技有限责任公司 A kind of vegetable Automatic-settlement and healthy diet management method based on fine granularity identification

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114359299A (en) * 2022-03-18 2022-04-15 天津九安医疗电子股份有限公司 Diet segmentation method and diet nutrition management method for chronic disease patients
CN114360690A (en) * 2022-03-18 2022-04-15 天津九安医疗电子股份有限公司 Method and system for managing diet nutrition of chronic disease patient

Similar Documents

Publication Publication Date Title
CN109346153A (en) It is a kind of to digitize system and method for having dinner
CN107731278A (en) A kind of food recognition methods, nutrient health analysis method, system and device
CN104778374A (en) Automatic dietary estimation device based on image processing and recognizing method
CN112017756B (en) Dietary nutrition analysis method based on face recognition self-service meal-making system
US20090288887A1 (en) Electronic scale
CN111652044A (en) Dietary nutrition analysis method based on convolutional neural network target detection
Tay et al. Current developments in digital quantitative volume estimation for the optimisation of dietary assessment
JP2022091962A (en) Health management system
CN110853735A (en) Method and device for analyzing intake of dietary component, computer device and storage medium
Karikome et al. A system for supporting dietary habits: planning menus and visualizing nutritional intake balance
Pouladzadeh et al. Intelligent SVM based food intake measurement system
CN106843044A (en) A kind of health diet accessory system
CN106642970A (en) Nutrition judgment system and judgment method of intelligent refrigerator
CN106339574A (en) Refrigerator food-based personalized catering method and system as well as refrigerator
CN108630293A (en) A kind of nutrient diet method and apparatus
CN111584039A (en) Chinese and western medicine combined diet scheme generation method, system and terminal
CN110910987A (en) Diet advice generation method and apparatus, computer apparatus, and storage medium
CN114359299B (en) Diet segmentation method and diet nutrition management method for chronic disease patients
CN114360690B (en) Method and system for managing diet nutrition of chronic disease patient
CN111524576B (en) Food weight estimation learning system for weight control
CN114388102A (en) Diet recommendation method and device and electronic equipment
Nakamoto et al. Prediction of mental state from food images
CN114023419A (en) Recipe recommendation method and device and nonvolatile storage medium
CN117078955B (en) Health management method based on image recognition
CN111584037A (en) Nutritional data analysis guidance method for chronic diseases

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination