CN116825286A - Food ingredient identification and nutrition recommendation system - Google Patents
- Publication number
- CN116825286A CN116825286A CN202311109888.8A CN202311109888A CN116825286A CN 116825286 A CN116825286 A CN 116825286A CN 202311109888 A CN202311109888 A CN 202311109888A CN 116825286 A CN116825286 A CN 116825286A
- Authority
- CN
- China
- Prior art keywords
- food
- information
- detected
- determining
- area image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/60—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/68—Food, e.g. fruit or vegetables
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0442—Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention provides a food ingredient identification and nutrition recommendation system, which relates to the field of nutrition control and comprises: a map building module for obtaining information about a plurality of foods and a plurality of diseases and building a knowledge graph; an information acquisition module for acquiring information about the food to be detected, including its image information and weight information, and for acquiring environment-related information; a component identification module for identifying, through a food identification model based on the knowledge graph, the information about the food to be detected and determining identification information that includes at least ingredient information, weight information and the cooking mode; and a nutrition control module for generating diet suggestion information based on the ingredient information and weight information of the food to be detected, the user-related information, the knowledge graph and the environment-related information. The system provides users with a more convenient, accurate and efficient nutrition control service.
Description
Technical Field
The invention relates to the field of nutrition control, in particular to a food ingredient identification and nutrition recommendation system.
Background
With social and economic development, living standards have improved greatly, and a nutritious, healthy diet has become a goal of modern life. A balanced diet meets the needs of normal development and the various physiological activities of the human body, and also reduces the risk of many diseases, helping people stay strong and energetic and live happily. At present, however, owing to a general lack of nutritional health knowledge, the limited channels for obtaining professional health guidance, the scarcity of professional nutritionists and similar factors, health problems caused by an improper diet continue to test and challenge people's bodies.
Accordingly, there is a need for a food ingredient identification and nutrition recommendation system that provides users with more convenient, accurate and efficient nutrition control services.
Disclosure of Invention
The present specification provides a food ingredient identification and nutrition recommendation system comprising: a map establishing module for acquiring information about a plurality of foods and a plurality of diseases and establishing a knowledge graph based on that information; an information acquisition module for acquiring information about the food to be detected, including its image information and weight information, and for acquiring environment-related information; a component identification module for identifying, through a food identification model based on the knowledge graph, the information about the food to be detected and determining identification information that includes at least ingredient information, weight information and the cooking mode; and a nutrition control module for generating diet suggestion information based on the ingredient information and weight information of the food to be detected, the user-related information, the knowledge graph and the environment-related information.
Further, the information about the plurality of foods includes at least, for each food, its name, calories, nutrients, candidate cooking modes, incompatible foods and matched foods; the information about the plurality of diseases includes at least, for each disease, its name, its symptoms and the diet information of a plurality of sample users corresponding to it.
Still further, the map establishing module establishes the knowledge graph based on the information about the plurality of foods and the plurality of diseases by: for each disease, determining a disease association degree between each food and the disease based on the diet information of the plurality of sample users corresponding to the disease; determining, based on the disease association degrees, a plurality of associated foods from the plurality of foods and a plurality of associated users from the sample users corresponding to the disease; determining a food association degree between any two associated foods based on the diet information of the plurality of associated users; and establishing the knowledge graph based on the names, calories, nutrients, candidate cooking modes, incompatible foods and matched foods of the foods, the disease names, the disease symptoms, the disease association degree between each food and each disease, and the food association degree between any two associated foods.
Still further, the information acquisition module acquires information related to food to be detected, including: acquiring point cloud information of the food to be detected; acquiring image information of the food to be detected; and acquiring weight information of the food to be detected.
Still further, the component recognition module recognizes the information about the food to be detected based on the knowledge graph through a food recognition model and determines the identification information of the food to be detected by: extracting, by the food recognition model and based on an SSD target detection algorithm, at least one dish area image from the image information of the food to be detected; for each dish area image, extracting food material sub-area images from the dish area image by the food recognition model based on the SSD target detection algorithm; for each food material sub-area image, extracting food color features from the sub-area image, determining food morphological features from the sub-area point cloud information corresponding to it, and determining at least one candidate food material corresponding to the sub-area image based on the food color features and the food morphological features; determining the target food material corresponding to each food material sub-area image based on its at least one candidate food material and the knowledge graph; and determining the main food materials, auxiliary food materials and seasoning food materials corresponding to each dish area image based on the target food materials of its food material sub-area images.
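The candidate food material step can be sketched as a nearest-reference lookup over color and morphological features. This is a simplified stand-in: the feature encodings, reference values and names below are assumptions, and the patent's actual model is a trained recognizer rather than this lookup.

```python
import math

def candidate_food_materials(color_feature, morph_feature, reference_features, top_k=3):
    """Rank known food materials by the combined Euclidean distance between
    the observed (color, morphology) features and stored reference features."""
    def dist(v, w):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, w)))
    scored = sorted(
        reference_features.items(),
        key=lambda kv: dist(color_feature, kv[1]["color"]) + dist(morph_feature, kv[1]["morph"]),
    )
    return [name for name, _ in scored[:top_k]]

# Hypothetical reference library: color as (R, G, B) in [0, 1],
# morphology as (size, elongation) derived from point cloud data.
REFERENCES = {
    "bean curd": {"color": (0.90, 0.90, 0.80), "morph": (1.0, 0.2)},
    "fish":      {"color": (0.70, 0.60, 0.50), "morph": (2.0, 0.5)},
    "chili":     {"color": (0.90, 0.10, 0.10), "morph": (0.5, 0.1)},
}
```

The knowledge graph would then be consulted to pick the target food material from the ranked candidates, e.g. preferring candidates that plausibly co-occur with materials already recognized in the same dish.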
Still further, the component recognition module recognizes the relevant information of the food to be detected based on the knowledge graph through a food recognition model, and determines the recognition information of the food to be detected, including: determining a plurality of candidate cooking modes based on the main food material, the auxiliary food material and the seasoning food material of the dish area image; and determining the cooking mode corresponding to the dish area image based on the feature label corresponding to each candidate cooking mode and the color feature and the morphological feature of the dish area image.
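The cooking mode selection can likewise be sketched as choosing the candidate whose feature label is closest to the dish area image's observed color and morphological features. The labels and feature encodings below are hypothetical illustrations, not the patent's actual representation.

```python
def select_cooking_mode(candidate_modes, dish_color, dish_morph):
    """Pick the candidate cooking mode whose feature label best matches
    the dish area image's color and morphological features."""
    def score(label):
        color_diff = sum(abs(a - b) for a, b in zip(label["color"], dish_color))
        morph_diff = sum(abs(a - b) for a, b in zip(label["morph"], dish_morph))
        return color_diff + morph_diff
    return min(candidate_modes, key=lambda m: score(m["label"]))["name"]

# Hypothetical feature labels for two candidate cooking modes.
CANDIDATE_MODES = [
    {"name": "steamed", "label": {"color": (0.80, 0.80), "morph": (0.9,)}},
    {"name": "fried",   "label": {"color": (0.40, 0.20), "morph": (0.3,)}},
]
```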
Still further, the component recognition module recognizes the relevant information of the food to be detected based on the knowledge graph through a food recognition model, and determines the recognition information of the food to be detected, including: determining a candidate weight range corresponding to the dish area image based on the area point cloud information corresponding to the dish area image, the main food material, the auxiliary food material, the seasoning food material and the cooking mode corresponding to the dish area image; and determining the weight range corresponding to each dish area image based on the candidate weight range corresponding to each dish area image and the weight information of the food to be detected.
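One plausible reading of this weight step, shown under stated assumptions (the per-material, per-cooking-mode density ranges are hypothetical), is to bound each dish's weight by its point-cloud volume times a density range, and then clip the summed ranges against the total weight measured by the scale:

```python
def candidate_weight_range(volume_cm3, density_range_g_per_cm3):
    """Candidate weight range for one dish from its point-cloud volume and a
    density range chosen from its food materials and cooking mode."""
    lo, hi = density_range_g_per_cm3
    return (volume_cm3 * lo, volume_cm3 * hi)

def reconcile_with_scale(candidate_ranges, total_weight_g):
    """Clip the sum of per-dish candidate ranges against the measured total
    weight of the food to be detected."""
    lo = sum(r[0] for r in candidate_ranges)
    hi = sum(r[1] for r in candidate_ranges)
    return (min(lo, total_weight_g), min(hi, total_weight_g))
```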
Still further, the nutrition control module generates the diet suggestion information based on the ingredient information and weight information of the food to be detected, the user-related information, the knowledge graph and the environment-related information by: for each food material sub-area image, determining food material restriction information corresponding to the sub-area image based on the main food materials, auxiliary food materials and seasoning food materials of its dish area image; determining dish incompatibility information corresponding to the food to be detected based on the main food materials, auxiliary food materials and seasoning food materials corresponding to each dish area image; determining the restricted food materials of the user based on the user-related information; and generating user food material restriction information based on the main food materials, auxiliary food materials and seasoning food materials corresponding to each dish area image and the restricted food materials of the user.
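These restriction checks reduce to set lookups against the incompatible pairs recorded in the knowledge graph and the user's restricted foods. A minimal sketch, with assumed data shapes and message wording:

```python
def restriction_warnings(dish_materials, incompatible_pairs, user_restricted):
    """Collect dish incompatibility and user restriction warnings for the
    food materials recognized across all dish area images."""
    materials = set()
    for dish in dish_materials:  # main + auxiliary + seasoning materials per dish
        materials |= set(dish)
    warnings = []
    for a, b in incompatible_pairs:
        if a in materials and b in materials:
            warnings.append(f"incompatible: {a} + {b}")
    for food in sorted(materials & set(user_restricted)):
        warnings.append(f"restricted for user: {food}")
    return warnings
```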
Still further, the nutrition control module generates the diet suggestion information based on the ingredient information and weight information of the food to be detected, the user-related information, the knowledge graph and the environment-related information by: determining a calorie range corresponding to each dish area image based on the weight range, cooking mode, main food materials, auxiliary food materials and seasoning food materials corresponding to that image; calculating a calorie sum from the calorie ranges corresponding to the dish area images; and generating a calorie prompt based on the calorie sum and a calorie threshold determined from the user-related information.
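The calorie prompt amounts to interval arithmetic over the per-dish calorie ranges; a minimal sketch under assumed message wording:

```python
def calorie_prompt(dish_calorie_ranges, calorie_threshold):
    """Sum the per-dish calorie ranges and compare the result against the
    user's calorie threshold, returning a prompt string."""
    low = sum(r[0] for r in dish_calorie_ranges)
    high = sum(r[1] for r in dish_calorie_ranges)
    if low > calorie_threshold:
        return f"over threshold: at least {low} kcal > {calorie_threshold} kcal"
    if high > calorie_threshold:
        return f"possibly over threshold: up to {high} kcal"
    return "within threshold"
```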
Still further, the nutrition control module generates the diet suggestion information based on the ingredient information and weight information of the food to be detected, the user-related information, the knowledge graph and the environment-related information by: acquiring the diet information of the user at a plurality of historical time points; determining the degree to which the dish corresponding to each dish area image matches the environment, based on the cooking mode, main food materials, auxiliary food materials and seasoning food materials corresponding to each dish area image, the environment-related information and the diet information of the user at the plurality of historical time points; and generating an environment matching prompt based on the matching degree between the dish corresponding to each dish area image and the environment.
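The environment matching degree could, for example, penalize recently repeated dishes and cooking modes poorly suited to the current temperature. The scoring rules below are illustrative assumptions only; the patent does not fix a particular scoring function.

```python
def environment_match_degree(dish, environment, recent_dishes):
    """Toy matching score in [0, 1]: start from a full match and subtract
    penalties for repetition and for heavy dishes in hot weather."""
    score = 1.0
    if dish["name"] in recent_dishes:      # eaten at a recent historical time point
        score -= 0.4
    hot_weather = environment["temperature_c"] >= 28
    if hot_weather and dish["cooking_mode"] in ("fried", "stewed"):
        score -= 0.3                       # heavy cooking modes match hot days poorly
    return max(score, 0.0)
```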
Compared with the prior art, the food ingredient identification and nutrition recommendation system provided by the specification has the following beneficial effects:
1. By establishing a knowledge graph, acquiring information about the food to be detected, identifying its ingredient information, weight information and cooking mode, and further combining the environment-related information, the system judges the nutrition and health of the diet and generates diet suggestion information, thereby providing users with a more convenient, accurate and efficient nutrition control service;
2. A user may place at least one dish on a dinner plate. The food recognition model first performs image segmentation on the image of the food to be detected based on the SSD target detection algorithm and extracts at least one dish area image, where one dish area image corresponds to one dish, and one dish may be composed of multiple food materials, including main food materials, auxiliary food materials and seasoning food materials; the food recognition model then extracts food material sub-area images from each dish area image based on the SSD target detection algorithm, improving the accuracy of dish and food material recognition.
Drawings
The present specification will be further elucidated by way of example embodiments, which will be described in detail by means of the accompanying drawings. The embodiments are not limiting, in which like numerals represent like structures, wherein:
FIG. 1 is a block diagram of a food ingredient identification and nutrition recommendation system, in accordance with some embodiments of the present description;
FIG. 2 is a schematic diagram of a process for creating a knowledge-graph, according to some embodiments of the present disclosure;
FIG. 3 is a schematic flow chart of determining identification information of a food to be detected, according to some embodiments of the present disclosure;
FIG. 4 is a schematic flow diagram of generating dietary advice information according to some embodiments of the present disclosure;
fig. 5 is a schematic diagram of a portion of a knowledge-graph, shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
Fig. 1 is a schematic block diagram of a food ingredient identification and nutrition recommendation system according to some embodiments of the present disclosure, and as shown in fig. 1, a food ingredient identification and nutrition recommendation system may include a profile creation module, an information acquisition module, an ingredient identification module, and a nutrition control module.
The map establishing module can be used for acquiring the related information of various foods and the related information of various diseases and establishing a knowledge map based on the related information of various foods and the related information of various diseases.
In some embodiments, the information about the plurality of foods includes at least, for each food, its name, calories, nutrients, candidate cooking modes, incompatible foods and matched foods. An incompatible food is a food that can produce an adverse effect when eaten together with the given food. For example, fish, shrimp and algae among seafood are rich in nutrients such as protein and calcium; if they are eaten with fruits containing tannins, such as persimmon, grape, pomegranate, hawthorn and olive, the calcium in the seafood readily combines with the tannins into a new, indigestible substance that irritates the stomach and causes discomfort, with symptoms such as abdominal pain, vomiting and nausea. Such fruits are therefore not suitable to be eaten together with seafood, and an interval of at least two hours is preferable; thus fish, shrimp and algae are mutually incompatible with tannin-containing fruits. A matched food is a food that can promote nutrient absorption when eaten together with the given food. For example, fish and bean curd complement each other: the combination is delicious, supplements calcium and can help prevent bone diseases in children such as rickets and osteoporosis. Bean curd contains a large amount of calcium whose absorption rate is low when bean curd is eaten alone, but when it is eaten together with fish rich in vitamin D, the absorption and utilization of calcium improve, so fish and bean curd are matched foods.
In some embodiments, the information about the plurality of diseases includes at least a disease name, a disease symptom, and diet information of a plurality of sample users corresponding to each disease. The sample user may be a user suffering from the disease, and the diet information of the sample user may include information such as dishes and components of dishes consumed by the sample user at a plurality of history time points of a certain sample history time period (for example, the past year, the past two years, etc.).
Fig. 2 is a schematic flow chart of establishing a knowledge graph according to some embodiments of the present disclosure, as shown in fig. 2, and in some embodiments, the graph establishing module establishes the knowledge graph based on related information of multiple foods and related information of multiple diseases, including:
for each disease, determining a degree of disease association between each food and the disease based on diet information of a plurality of sample users corresponding to the disease;
based on the disease association degree, determining a plurality of associated foods from the plurality of foods, and determining a plurality of associated users from a plurality of sample users corresponding to the disease;
determining a food association degree between any two associated foods based on diet information of a plurality of associated users;
a knowledge graph is established based on the names, calories, nutrients, candidate cooking modes, incompatible foods and matched foods of the foods, the disease names, the disease symptoms, the disease association degree between each food and each disease, and the food association degree between any two associated foods.
Specifically, for each disease, the eating frequency of each food can be determined for each sample user based on the diet information of the plurality of sample users corresponding to the disease; when a sample user's eating frequency of a food is greater than a first preset eating frequency threshold, that sample user is taken as a target sample user corresponding to the food, and the disease association degree between the food and the disease is then determined based on the number of target sample users corresponding to the food.
Taking disease A and food B as an example: for each sample user corresponding to disease A, the eating frequency of food B during the sample history period is determined based on that user's diet information, and if the eating frequency of food B is greater than the first preset eating frequency threshold, the sample user is taken as a target sample user corresponding to food B.
For example only, the frequency of consumption of food B may be calculated based on the following formula:
f_i = n_i / N

wherein f_i is the eating frequency of the i-th food, n_i is the number of times the sample user consumed the i-th food during the sample history period, and N is the total number of times the sample user ate during the sample history period.
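As a minimal illustration of this frequency calculation (the meal representation and example data are assumptions, not part of the patent), each meal can be modeled as the set of foods eaten at one historical time point:

```python
def eating_frequency(meal_history, food):
    """f_i = n_i / N: the fraction of the sample user's meals in the
    history period that included the given food."""
    if not meal_history:
        return 0.0
    n_i = sum(1 for meal in meal_history if food in meal)
    return n_i / len(meal_history)

# Hypothetical diet information: one set of foods per historical time point.
MEALS = [{"rice", "fish"}, {"rice", "bean curd"}, {"fish", "vegetables"}, {"rice"}]
```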
For example only, the disease association between food and disease may be calculated based on the following formula:
R = m / M

wherein R is the disease association degree between the food and the disease, m is the number of target sample users corresponding to the food, and M is the total number of sample users corresponding to the food. It can be understood that if the disease association degree between a food and the disease is greater than a preset disease association degree threshold, the food is taken as an associated food of the disease.
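Combining the two formulas, the disease association degree can be sketched as follows (the histories and names are hypothetical examples; thresholds are illustrative):

```python
def eating_frequency(meal_history, food):
    """Fraction of the sample user's meals that included the food."""
    if not meal_history:
        return 0.0
    return sum(1 for meal in meal_history if food in meal) / len(meal_history)

def disease_association(sample_histories, food, freq_threshold):
    """R = m / M: the share of a disease's sample users whose eating
    frequency of the food exceeds the first preset threshold."""
    if not sample_histories:
        return 0.0
    m = sum(1 for history in sample_histories
            if eating_frequency(history, food) > freq_threshold)
    return m / len(sample_histories)

# Hypothetical histories of three sample users for one disease.
SAMPLE_HISTORIES = [
    [{"persimmon"}, {"persimmon"}],   # eating frequency 1.0
    [{"persimmon"}, {"rice"}],        # eating frequency 0.5
    [{"rice"}, {"rice"}],             # eating frequency 0.0
]
```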
The associated user may be a sample user whose eating frequency of at least one associated food in the plurality of sample users corresponding to the disease is greater than a preset eating frequency threshold.
In some embodiments, for any two associated foods, the frequency with which each associated user eats the two foods at the same time is calculated, and an associated user whose frequency of eating the two foods together is greater than a second preset eating frequency threshold is taken as a target associated user. If the proportion of target associated users is greater than a preset user proportion, the two associated foods are judged to be associated, i.e., eating the two foods together is likely to contribute to the disease, and the proportion of target associated users is taken as the food association degree between the two associated foods.
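This co-consumption rule can be sketched directly (data shapes and thresholds assumed; returning 0.0 stands in for "no association"):

```python
def food_association(assoc_histories, food_a, food_b,
                     co_freq_threshold, user_proportion_threshold):
    """Food association degree between two associated foods: the proportion of
    associated users who frequently eat both together, or 0.0 when that
    proportion does not exceed the preset user proportion."""
    def co_frequency(history):
        if not history:
            return 0.0
        both = sum(1 for meal in history if food_a in meal and food_b in meal)
        return both / len(history)
    targets = sum(1 for h in assoc_histories
                  if co_frequency(h) > co_freq_threshold)
    proportion = targets / len(assoc_histories)
    return proportion if proportion > user_proportion_threshold else 0.0

# Hypothetical histories of three associated users.
HISTORIES = [
    [{"fish", "persimmon"}, {"fish", "persimmon"}],  # co-frequency 1.0
    [{"fish"}, {"persimmon"}],                       # co-frequency 0.0
    [{"fish", "persimmon"}, {"rice"}],               # co-frequency 0.5
]
```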
Fig. 5 is a schematic diagram of a portion of a knowledge graph according to some embodiments of the present disclosure. As shown in fig. 5, the knowledge graph may contain two types of nodes, food nodes and disease nodes, and records the matching and incompatibility relationships between different foods, the association relationships between foods and diseases, the food association degrees between any two associated foods, and so on.
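The two node types and edge types just described can be represented, for instance, as plain dictionaries (all entries below are hypothetical; a real system would hold far more data):

```python
# A minimal in-memory sketch of the two-node-type knowledge graph.
knowledge_graph = {
    "foods": {
        "fish":      {"calories_per_100g": 120, "nutrients": ["protein", "vitamin D"]},
        "bean curd": {"calories_per_100g": 80,  "nutrients": ["calcium", "protein"]},
        "persimmon": {"calories_per_100g": 70,  "nutrients": ["tannin"]},
    },
    "diseases": {
        "osteoporosis": {"symptoms": ["bone pain", "fractures"]},
    },
    "matched":      [("fish", "bean curd")],   # promote nutrient absorption together
    "incompatible": [("fish", "persimmon")],   # clash when eaten together
    "disease_association": {("persimmon", "osteoporosis"): 0.12},
    "food_association":    {("fish", "persimmon"): 0.6},
}

def are_incompatible(graph, food_a, food_b):
    """Check the undirected incompatibility edge between two food nodes."""
    pairs = graph["incompatible"]
    return (food_a, food_b) in pairs or (food_b, food_a) in pairs
```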
The information acquisition module can be used for acquiring the related information of the food to be detected.
The related information of the food to be detected comprises image information and weight information of the food to be detected.
In some embodiments, the information acquisition module acquires information about the food to be detected, including:
acquiring point cloud information of food to be detected;
acquiring image information of food to be detected;
weight information of food to be detected is obtained.
Specifically, the information acquisition module may include an information acquisition device. The information acquisition device may include a frame and a food placing table arranged in the frame. A weighing assembly is arranged on the food placing table to obtain the weight information of the food to be detected. A scanning assembly and an image acquisition assembly are arranged on the frame: the scanning assembly scans the food to be detected placed on the food placing table to obtain its point cloud information, and the image acquisition assembly obtains its image information.
It can be understood that after a user places food on a dinner plate, the dinner plate with the food is placed on the food placing table, and the information acquisition device acquires the point cloud information, image information, and weight information of the food to be detected.
The information acquisition module may also be used to acquire context-related information.
The environment-related information may include information such as temperature, humidity, sunlight intensity, etc. of the environment in which the user is located at a plurality of historical time points, current time points, and a plurality of future time points.
The component recognition module can be used for recognizing the related information of the food to be detected based on the knowledge graph through the food recognition model, and determining the recognition information of the food to be detected.
The identification information at least comprises component information, weight information and cooking modes.
Fig. 3 is a schematic flow chart of determining identification information of food to be detected according to some embodiments of the present disclosure, as shown in fig. 3, in some embodiments, the component identification module identifies relevant information of food to be detected based on a knowledge graph through a food identification model, and determines identification information of food to be detected, including:
the food recognition model extracts at least one dish area image from image information of food to be detected based on an SSD target detection algorithm;
for each dish area image, the food recognition model extracts food material sub-region images from the dish area image based on the SSD target detection algorithm;
for each food material sub-region image, extracting food color features based on the food material sub-region image, determining food morphology features based on sub-region point cloud information corresponding to the food material sub-region image, and determining at least one candidate food material corresponding to the food material sub-region image based on the food color features and the food morphology features;
determining target food materials corresponding to each food material subarea image based on at least one candidate food material and a knowledge graph corresponding to each food material subarea image;
and determining main food materials, auxiliary food materials and seasoning food materials corresponding to the dish area images based on the target food materials corresponding to each food material subarea image.
Specifically, the user may place more than one dish on the dinner plate, so the image of the food to be detected needs to be segmented by the food recognition model based on the SSD target detection algorithm to extract at least one dish area image, wherein one dish area image corresponds to one dish.
It will be appreciated that a dish may be composed of a variety of food materials, including main food materials, auxiliary food materials, and seasoning food materials. To achieve more accurate dish identification, food material sub-region images need to be extracted from the dish area image by the food recognition model based on the SSD target detection algorithm, where each food material sub-region image contains the image of one food material in the dish, such as the image of the potato pieces in an image of braised spareribs with potatoes.
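The two-stage extraction described above (dish areas first, then food material sub-regions within each dish) can be sketched as below. The detector callables are stand-ins for the SSD target-detection passes; their names and the string-based "images" are assumptions for illustration only.

```python
def extract_regions(plate_image, detect_dishes, detect_materials):
    """Stage 1: detect_dishes crops one dish area image per dish from the
    full plate image. Stage 2: detect_materials crops one food material
    sub-region image per food material from each dish area image."""
    return [(dish_img, detect_materials(dish_img))
            for dish_img in detect_dishes(plate_image)]
```

With toy detectors that return labels instead of real crops, a plate with two dishes yields one (dish, materials) pair per dish.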
In some embodiments, food similarity between the food color feature and the food morphology feature corresponding to the food material sub-region image and the food color feature and the food morphology feature corresponding to each food may be calculated, and at least one candidate food material corresponding to the food material sub-region image may be determined based on the food similarity.
For example, food similarity may be calculated based on the following formula:
For example, the food similarity may be calculated as a weighted sum:

S = w1 * S_c + w2 * S_m

where S is the food similarity, S_c is the similarity between the food color features corresponding to the food material sub-region image and the food color features of a certain food, S_m is the similarity between the food morphological features corresponding to the food material sub-region image and the food morphological features of that food, and w1 and w2 are both preset weights. It can be appreciated that a food whose food similarity is greater than a preset food similarity threshold may be used as a candidate food material corresponding to the food material sub-region image.
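The weighted-sum similarity and the threshold-based candidate selection can be sketched as follows. The default weight values are illustrative assumptions; the patent only states that the weights are preset.

```python
def food_similarity(color_sim: float, morph_sim: float,
                    w_color: float = 0.6, w_morph: float = 0.4) -> float:
    """S = w1 * S_c + w2 * S_m: weighted sum of color-feature similarity
    and morphological-feature similarity against a reference food."""
    return w_color * color_sim + w_morph * morph_sim

def candidate_materials(sims: dict[str, tuple[float, float]],
                        threshold: float) -> list[str]:
    """Keep every food whose combined similarity exceeds the preset food
    similarity threshold; sims maps food name -> (color_sim, morph_sim)."""
    return [name for name, (c, m) in sims.items()
            if food_similarity(c, m) > threshold]
```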
In some embodiments, the component identification module may determine, based on the knowledge graph, the candidate dishes that can be formed by combining the at least one candidate food material corresponding to each food material sub-region image. If no candidate dish can be determined, a dish information request may be sent directly to the user terminal to obtain the dish information, thereby determining the target food material corresponding to each food material sub-region image, and then the main food material, auxiliary food material, and seasoning food material corresponding to each dish area image. If exactly one candidate dish can be determined, the target food material corresponding to each food material sub-region image is determined based on the food materials included in that candidate dish, and the main, auxiliary, and seasoning food materials corresponding to each dish area image are then determined. If more than one candidate dish is determined, confirmation information may be sent to the user terminal; the confirmation information includes all candidate dishes, the user selects one via the user terminal, and the target food material corresponding to each food material sub-region image, and then the main, auxiliary, and seasoning food materials corresponding to each dish area image, are determined according to the candidate dish selected by the user.
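The three-way branching above (zero, one, or multiple candidate dishes) can be sketched as follows; the `ask_user` callback is an assumed stand-in for the request/confirmation exchange with the user terminal.

```python
def resolve_dish(candidates: list[str], ask_user) -> str:
    """Resolve the target dish from knowledge-graph candidates:
    0 candidates  -> send a dish information request to the user terminal;
    1 candidate   -> use it directly;
    >1 candidates -> send confirmation info listing all candidates and
                     let the user choose."""
    if not candidates:
        return ask_user([])          # dish information request
    if len(candidates) == 1:
        return candidates[0]
    return ask_user(candidates)      # confirmation with all candidates
```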
In some embodiments, the component recognition module recognizes related information of the food to be detected based on the knowledge graph through the food recognition model, determines recognition information of the food to be detected, and further includes:
determining a plurality of candidate cooking modes based on the main food material, the auxiliary food material and the seasoning food material of the dish area image;
and determining the cooking mode corresponding to the dish area image based on the feature label corresponding to each candidate cooking mode and the color feature and the morphological feature of the dish area image.
Specifically, after the main food material, the auxiliary food material and the seasoning food material of the dish area image are determined, sample images corresponding to the main food material, the auxiliary food material and the seasoning food material under different cooking modes and color features and morphological features of the sample images can be obtained, and the feature labels comprise the color features and the morphological features of the corresponding sample images. The method can calculate the dish similarity between the color feature and the morphological feature corresponding to the dish area image and the color feature and the morphological feature of each sample image, determine the target sample image based on the dish similarity, and further take the cooking mode corresponding to the target sample image as the cooking mode corresponding to the dish area image. For example, the color feature similarity between the color feature corresponding to the dish area image and the color feature of the sample image may be calculated, the morphological feature similarity between the morphological feature corresponding to the dish area image and the morphological feature of the sample image may be calculated, the color feature similarity and the morphological feature similarity may be weighted and summed, the dish similarity may be calculated, and the sample image with the largest dish similarity may be used as the target sample image.
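The cooking-mode selection above (weighted sum of color and morphological similarity against sample images, then take the most similar) can be sketched as follows; the equal default weights are an assumption, and the similarities are taken as precomputed inputs.

```python
def select_cooking_mode(samples: list[tuple[str, float, float]],
                        w_color: float = 0.5, w_morph: float = 0.5) -> str:
    """samples: (cooking_mode, color_sim, morph_sim) per sample image,
    where the similarities are precomputed against the dish area image.
    Returns the cooking mode of the sample image with the largest
    weighted dish similarity."""
    best_mode, best_sim = "", float("-inf")
    for mode, c_sim, m_sim in samples:
        sim = w_color * c_sim + w_morph * m_sim   # weighted dish similarity
        if sim > best_sim:
            best_mode, best_sim = mode, sim
    return best_mode
```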
In some embodiments, the component recognition module recognizes related information of the food to be detected based on the knowledge graph through the food recognition model, determines recognition information of the food to be detected, and further includes:
determining a candidate weight range corresponding to the dish area image based on the area point cloud information corresponding to the dish area image, the main food material, the auxiliary food material, the seasoning food material and the cooking mode corresponding to the dish area image;
and determining the weight range corresponding to each dish area image based on the candidate weight range corresponding to each dish area image and the weight information of the food to be detected.
It can be appreciated that when at least one dish is placed on the dinner plate, the obtained weight information of the food to be detected includes both the weight of the dinner plate and the weight of each dish. To determine the eating weight of each dish more accurately, a candidate weight range corresponding to each dish area image needs to be determined first, and then the weight range corresponding to each dish area image is determined based on the candidate weight ranges and the weight information of the food to be detected.
Specifically, the component identification module may store weight information of a plurality of dinner plates in advance. Based on the image information and point cloud information of the food to be detected, the color features and size features of the dinner plate used this time are determined, so the specific dinner plate can be identified and its weight looked up from the pre-stored weight information. Subtracting the weight of this dinner plate from the obtained weight of the food to be detected gives the weight of the dishes eaten this time.
In some embodiments, the component identification module may determine volume information of the dish based on the area point cloud information corresponding to the dish area image, and further predict, through a weight prediction model, a candidate weight range corresponding to the dish area image based on the volume information of the dish, the main food material, the auxiliary food material, the seasoning food material and the cooking mode corresponding to the dish area image, where the weight prediction model may be a machine learning model such as an artificial neural network (Artificial Neural Network, ANN) model, a recurrent neural network (Recurrent Neural Networks, RNN) model, a Long Short-Term Memory network (LSTM) model, a bi-directional recurrent neural network (BRNN) model, and the like.
In some embodiments, the component identification module may determine the weight range corresponding to each dish area image based on the candidate weight range corresponding to each dish area image and the weight information of the dish that is consumed this time. Specifically, the candidate weight range corresponding to each dish area image is adjusted and reduced, the weight range corresponding to each dish area image is determined, wherein the absolute value of the difference between the sum of the minimum values of the weight ranges corresponding to each dish area and the weight information of the dishes eaten at this time is smaller than a preset difference threshold, and the absolute value of the difference between the sum of the maximum values of the weight ranges corresponding to each dish area and the weight information of the dishes eaten at this time is smaller than a preset difference threshold.
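The plate subtraction and range narrowing above can be sketched as follows. Proportional scaling is one possible adjustment rule satisfying the stated constraint (sums of minima and maxima close to the measured eaten weight); the patent states only the constraint, not the adjustment procedure, so the scaling scheme here is an assumption.

```python
def eaten_weight(total_weight: float, plate_weight: float) -> float:
    """Weight of the dishes eaten this time: measured total minus the
    pre-registered weight of the identified dinner plate."""
    return total_weight - plate_weight

def narrow_ranges(candidates: list[tuple[float, float]],
                  eaten: float) -> list[tuple[float, float]]:
    """Scale each candidate (lo, hi) range so that the sum of the minima
    and the sum of the maxima both land on the measured eaten weight."""
    lo_sum = sum(lo for lo, _ in candidates)
    hi_sum = sum(hi for _, hi in candidates)
    return [(lo * eaten / lo_sum, hi * eaten / hi_sum)
            for lo, hi in candidates]
```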
The nutrition control module may be configured to generate diet advice information based on ingredient information and weight information of the food to be detected, user-related information, a knowledge graph, and environment-related information.
Fig. 4 is a schematic flow chart of generating dietary advice information according to some embodiments of the present disclosure, and as shown in fig. 4, the nutrition control module generates dietary advice information based on the composition information and weight information of the food to be detected, user related information, a knowledge graph, and environment related information, including:
for each food material subarea image, determining food material restriction information corresponding to the food material subarea image based on the main food material, the auxiliary food material and the seasoning food material corresponding to the dish area image;
determining dish-pairing incompatibility information corresponding to the food to be detected based on the main food materials, auxiliary food materials and seasoning food materials corresponding to each dish area image;
determining the food materials the user should avoid based on the user-related information;
and generating user food material restriction information based on the main food material, the auxiliary food material, the seasoning food material and the user restriction food material corresponding to each dish area image.
For example, when the dishes eaten this time contain meat and chestnut together, dish-pairing incompatibility information may be generated; the dish-pairing incompatibility information indicates which food materials are incompatible with each other and the corresponding handling suggestions.
In some embodiments, the user-related information may include at least the user's illness information, which may include historical illness information and current illness information. Based on the illness information and the knowledge graph, the food materials the user should avoid eating, e.g., the associated food materials of the illness, are determined. When the dishes eaten this time contain food materials the user should avoid, user food material restriction information is generated, which includes the foods that should be avoided in this meal.
As shown in fig. 4, in some embodiments, the nutrition control module generates diet advice information based on the ingredient information and weight information of the food to be detected, the user-related information, the knowledge graph, and the environment-related information, and further includes:
determining a calorie range corresponding to each dish area image based on the weight range, cooking mode, main food material, auxiliary food material and seasoning food material corresponding to each dish area image;
calculating a calorie sum based on the calorie range corresponding to each dish area image;
generating a calorie prompt based on the calorie sum and a calorie threshold determined based on the user-related information.
For example, when the calorie sum is greater than the calorie threshold and the difference between them is greater than a preset difference threshold, a calorie prompt may be generated. The calorie prompt may include the calories of each dish eaten this time and a dish recommended to be given up, which may be a dish whose calories exceed a per-dish calorie threshold or the dish with the highest calories.
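The prompt logic above can be sketched as follows; the dict-based prompt format is an illustrative assumption, and the "drop the highest-calorie dish" rule is the second of the two recommendation rules just described.

```python
def calorie_prompt(dish_calories: dict[str, float], threshold: float,
                   diff_threshold: float):
    """Generate a calorie prompt when the calorie sum exceeds the user's
    calorie threshold by more than diff_threshold. The prompt lists the
    per-dish calories and recommends giving up the highest-calorie dish;
    otherwise returns None (no prompt needed)."""
    total = sum(dish_calories.values())
    if total > threshold and total - threshold > diff_threshold:
        drop = max(dish_calories, key=dish_calories.get)
        return {"per_dish": dish_calories, "recommend_drop": drop}
    return None
```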
In some embodiments, the user-related information may include the user's body fat rate information, historical illness information, and the like, and the calorie threshold may be adjusted based on this information. For example, a more obese user has a correspondingly lower calorie threshold.
As shown in fig. 4, in some embodiments, the nutrition control module generates diet advice information based on the ingredient information and weight information of the food to be detected, the user-related information, the knowledge graph, and the environment-related information, and further includes:
acquiring diet information of a user at a plurality of historical time points;
determining the matching degree of dishes corresponding to each dish area image and the environment based on the cooking mode, main food materials, auxiliary food materials, seasoning food materials, environment related information and diet information of a user at a plurality of historical time points corresponding to each dish area image;
and generating an environment matching prompt based on the matching degree of the dishes corresponding to each dish area image and the environment.
In some embodiments, the nutrition control module may determine, through an environment matching prediction model, a degree of matching between dishes corresponding to each dish area image and the environment based on a cooking mode, main food, auxiliary food, and seasoning food corresponding to each dish area image, environment related information, and diet information of a user at a plurality of historical time points, where the environment matching prediction model may be a machine learning model such as an artificial neural network (Artificial Neural Network, ANN) model, a recurrent neural network (Recurrent Neural Networks, RNN) model, a Long Short-Term Memory (LSTM) model, a bi-directional recurrent neural network (BRNN) model, and the like.
When the matching degree between the dishes corresponding to a certain dish area image and the environment is smaller than a preset dish-environment matching degree threshold, an environment matching prompt is generated, prompting the user that the dish is not recommended in the current environment and providing recommended alternative dishes.
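The thresholding step above can be sketched as follows; the matching degrees are taken as already predicted by the environment matching prediction model, and the `recommend` callback and prompt format are illustrative assumptions.

```python
def environment_prompts(match_degrees: dict[str, float], threshold: float,
                        recommend) -> list[dict]:
    """For each dish whose predicted dish-environment matching degree falls
    below the preset threshold, emit a prompt advising against the dish and
    attaching recommended alternatives from the `recommend` helper."""
    return [{"dish": dish,
             "advice": "not recommended in current environment",
             "alternatives": recommend(dish)}
            for dish, degree in match_degrees.items() if degree < threshold]
```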
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.
Claims (10)
1. A food ingredient identification and nutritional recommendation system, comprising:
the map establishing module is used for acquiring the related information of various foods and the related information of various diseases and establishing a knowledge map based on the related information of various foods and the related information of various diseases;
the information acquisition module is used for acquiring the related information of the food to be detected, wherein the related information of the food to be detected comprises the image information and the weight information of the food to be detected;
the information acquisition module is also used for acquiring environment-related information;
the component identification module is used for identifying the related information of the food to be detected based on the knowledge graph through a food identification model and determining the identification information of the food to be detected, wherein the identification information at least comprises component information, weight information and cooking modes;
and the nutrition control module is used for generating diet proposal information based on the component information and weight information of the food to be detected, the user related information, the knowledge graph and the environment related information.
2. The food ingredient identification and nutrition recommendation system of claim 1, wherein the information related to the plurality of foods includes at least names of foods, calories, nutrients, candidate cooking modes, food ingredients and matching foods;
the related information of the plurality of diseases at least comprises the name of the disease, symptoms of the disease and diet information of a plurality of sample users corresponding to each disease.
3. The food ingredient identification and nutrition recommendation system of claim 2, wherein the profile creation module creates a knowledge profile based on information related to a plurality of foods and information related to a plurality of diseases, comprising:
for each of the diseases, determining a degree of disease association between each food and the disease based on diet information of a plurality of sample users corresponding to the disease;
based on the disease association degree, determining a plurality of associated foods from a plurality of foods, and determining a plurality of associated users from a plurality of sample users corresponding to the disease;
determining the food association degree between any two kinds of associated foods based on diet information of the plurality of associated users;
the knowledge graph is established based on names, calories, nutrients, candidate cooking modes, food ingredients, matched foods, disease names, disease symptoms, disease association between each food ingredient and the disease, and food association between any two associated foods.
4. A food ingredient identification and nutrition recommendation system according to any of claims 1-3 wherein said information acquisition module acquires information related to food to be detected, comprising:
acquiring point cloud information of the food to be detected;
acquiring image information of the food to be detected;
and acquiring weight information of the food to be detected.
5. The system of claim 4, wherein the component recognition module recognizes the relevant information of the food to be detected based on the knowledge graph through a food recognition model, and determines the recognition information of the food to be detected, comprising:
the food recognition model extracts at least one dish area image from the image information of the food to be detected based on an SSD target detection algorithm;
for each of the dish area images, extracting, by the food recognition model, food material sub-area images from the dish area image based on the SSD target detection algorithm;
for each food material sub-region image, extracting food color features based on the food material sub-region image, determining food morphology features based on sub-region point cloud information corresponding to the food material sub-region image, and determining at least one candidate food material corresponding to the food material sub-region image based on the food color features and the food morphology features;
determining target food materials corresponding to each food material subarea image based on at least one candidate food material corresponding to each food material subarea image and the knowledge graph;
and determining main food materials, auxiliary food materials and seasoning food materials corresponding to the dish area images based on the target food materials corresponding to each food material subarea image.
6. The system for identifying and recommending food ingredients according to claim 5, wherein the ingredient identification module identifies the relevant information of the food to be detected based on the knowledge graph through a food identification model, and determines the identification information of the food to be detected, further comprising:
determining a plurality of candidate cooking modes based on the main food material, the auxiliary food material and the seasoning food material of the dish area image;
and determining the cooking mode corresponding to the dish area image based on the feature label corresponding to each candidate cooking mode and the color feature and the morphological feature of the dish area image.
7. The system of claim 6, wherein the component recognition module recognizes the relevant information of the food to be detected based on the knowledge graph through a food recognition model, and determines the recognition information of the food to be detected, and further comprises:
determining a candidate weight range corresponding to the dish area image based on the area point cloud information corresponding to the dish area image, the main food material, the auxiliary food material, the seasoning food material and the cooking mode corresponding to the dish area image;
and determining the weight range corresponding to each dish area image based on the candidate weight range corresponding to each dish area image and the weight information of the food to be detected.
8. The food ingredient identification and nutrition recommendation system of claim 7, wherein the nutrition control module generates diet recommendation information based on ingredient information and weight information of the food to be detected, user-related information, the knowledge-graph, and the environment-related information, comprising:
for each food material subarea image, determining food material restriction information corresponding to the food material subarea image based on main food materials, auxiliary food materials and seasoning food materials corresponding to the dish area image;
determining dish-pairing incompatibility information corresponding to the food to be detected based on main food materials, auxiliary food materials and seasoning food materials corresponding to each dish area image;
determining the food materials the user should avoid based on the user-related information;
and generating user food material restriction information based on the main food material, the auxiliary food material, the seasoning food material and the user restriction food material corresponding to each dish area image.
9. The food ingredient identification and nutrition recommendation system of claim 8, wherein the nutrition control module generates diet recommendation information based on ingredient information and weight information of the food to be detected, user-related information, the knowledge-graph, and the environment-related information, further comprising:
determining a calorie range corresponding to each dish area image based on the weight range, cooking mode, main food materials, auxiliary food materials and seasoning food materials corresponding to each dish area image;
calculating a calorie sum based on the calorie range corresponding to each dish area image;
and generating a calorie prompt based on the calorie sum and a calorie threshold determined based on the user-related information.
10. The food ingredient identification and nutrition recommendation system of claim 9 wherein the nutrition control module generates diet recommendation information based on ingredient information and weight information of the food to be detected, user related information, the knowledge graph, and the environment related information, further comprising:
acquiring diet information of a user at a plurality of historical time points;
determining the matching degree of dishes corresponding to each dish area image and the environment based on the cooking mode, main food materials, auxiliary food materials, seasoning food materials, the environment related information and diet information of a user at a plurality of historical time points corresponding to each dish area image;
and generating an environment matching prompt based on the matching degree of the dishes corresponding to each dish area image and the environment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311109888.8A CN116825286B (en) | 2023-08-31 | 2023-08-31 | Food ingredient identification and nutrition recommendation system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311109888.8A CN116825286B (en) | 2023-08-31 | 2023-08-31 | Food ingredient identification and nutrition recommendation system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116825286A true CN116825286A (en) | 2023-09-29 |
CN116825286B CN116825286B (en) | 2023-11-14 |
Family
ID=88139623
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311109888.8A Active CN116825286B (en) | 2023-08-31 | 2023-08-31 | Food ingredient identification and nutrition recommendation system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116825286B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112053428A (en) * | 2020-08-07 | 2020-12-08 | 联保(北京)科技有限公司 | Method and device for identifying nutritional information contained in food |
CN114388102A (en) * | 2021-12-27 | 2022-04-22 | 阿里健康科技(中国)有限公司 | Diet recommendation method and device and electronic equipment |
CN114445614A (en) * | 2020-10-16 | 2022-05-06 | 技嘉科技股份有限公司 | Catering content identification method and catering content identification system |
WO2022133190A1 (en) * | 2020-12-17 | 2022-06-23 | Trustees Of Tufts College | Food and nutrient estimation, dietary assessment, evaluation, prediction and management |
CN115602289A (en) * | 2022-10-13 | 2023-01-13 | 中南大学(Cn) | Dynamic diet recommendation method based on crowd attribute-diet knowledge graph |
CN116153466A (en) * | 2023-03-17 | 2023-05-23 | 电子科技大学 | Food ingredient identification and nutrition recommendation system |
WO2023111668A1 (en) * | 2021-12-17 | 2023-06-22 | Evyd研究私人有限公司 | Dietary recommendation method, apparatus and system, and storage medium, and electronic device |
2023-08-31: application CN202311109888.8A filed; granted as CN116825286B (status: active)
Also Published As
Publication number | Publication date |
---|---|
CN116825286B (en) | 2023-11-14 |
Similar Documents
Publication | Title |
---|---|
US9977980B2 (en) | Food logging from images |
Pouladzadeh et al. | Food calorie measurement using deep learning neural network |
US9314206B2 (en) | Diet and calories measurements and control |
Friedenreich et al. | Influence of methodologic factors in a pooled analysis of 13 case-control studies of colorectal cancer and dietary fiber |
CN107341340A | Recipe recommendation method, system and terminal |
CN104778374A | Automatic dietary estimation device and recognition method based on image processing |
US20120179665A1 | Health monitoring system |
US20150279235A1 | Nutrition management system and nutrition management program |
CN111261260B | Diet recommendation system |
CN109147935A | Health data platform based on human-body feature acquisition and identification technology |
CN112017756A | Dietary nutrition analysis method based on a face-recognition self-service meal-making system |
CN112786154A | Recipe recommendation method and device, electronic device, and storage medium |
CN112289407A | Catering management method, system, device and storage medium based on health management |
CN109509539A | Eating-habit health risk assessment method |
CN106339574A | Personalized catering method and system based on refrigerator food, and refrigerator |
CN116825286B | Food ingredient identification and nutrition recommendation system |
CN109637626A | APP method for photographing and identifying food and recommending a diet based on artificial intelligence |
CN109102861A | Diet monitoring method and device based on an intelligent terminal |
CN111652044A | Dietary nutrition analysis method based on convolutional neural network object detection |
CN116453651A | Diet evaluation method, diet evaluation device, computer device and computer-readable storage medium |
CN114359299B | Diet segmentation method and diet nutrition management method for chronic disease patients |
CN111951928A | Method of controlling calorie intake, mobile terminal and computer storage medium |
CN111696647A | Method and device for recommending a menu for a refrigerator |
CN114023419A | Recipe recommendation method and device, and non-volatile storage medium |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |