CN110858279A - Food material identification method and device - Google Patents
- Publication number
- CN110858279A (application number CN201810959222.4A)
- Authority
- CN
- China
- Prior art keywords
- image information
- data
- food material
- menu
- food
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0631—Item recommendations
Abstract
The invention discloses a food material identification method and device. The method includes: acquiring image information of a purchase label of a food material; and inputting the image information into a recognition model, which outputs the food material corresponding to the image information. The recognition model is obtained by machine learning training using multiple groups of data, each group comprising image information and the food material corresponding to that image information, and the model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network. The invention solves the technical problem in the related art that food materials with high visual similarity cannot be automatically identified.
Description
Technical Field
The invention relates to the field of intelligent refrigerators, and in particular to a food material identification method and device.
Background
As users expect appliances to be multifunctional, a refrigerator is expected to do more in daily use than frozen and refrigerated storage. For example, a food identification system inside the refrigerator can automatically collect and identify the stored food materials and, based on that identification, offer the user menu recommendations and food material purchase recommendations. However, many food materials, such as leafy green vegetables or different cuts of meat, look very similar to one another, so identification based on images alone frequently misidentifies them, which degrades the user experience.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a food material identification method and device, which at least solve the technical problem in the related art that food materials with high visual similarity cannot be automatically identified.
According to an aspect of the embodiments of the present invention, there is provided a food material identification method, including: acquiring image information of a purchase label of a food material; and inputting the image information into a recognition model, and outputting, by the recognition model, the food material corresponding to the image information, wherein the recognition model is obtained by machine learning training using multiple groups of data, and each group of data in the multiple groups of data comprises image information and the food material corresponding to that image information; the recognition model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network.
Optionally, inputting the image information into the recognition model and outputting, by the recognition model, the food material corresponding to the image information includes: inputting the image information into the multi-layer long short-term memory network; adding the output of the first-layer LSTM network to the output of the third-layer LSTM network, inputting the sum into the convolutional neural network, and determining result data; and identifying the food material corresponding to the image information according to the result data.
Optionally, after the recognition model outputs the food material corresponding to the image information, the method further includes: acquiring a menu requirement of a user; generating a recommended menu according to the identified food materials and the menu requirement; and sending the recommended menu.
Optionally, acquiring the menu requirement of the user includes: acquiring health data and historical dining data of the user; and generating the menu requirement according to the health data and the historical dining data.
Optionally, after acquiring the menu requirement of the user, the method further includes: generating a food material purchase recommendation according to the menu requirement and the identified food materials; and purchasing food materials online according to the food material purchase recommendation.
According to another aspect of the embodiments of the present invention, there is provided a food material recognition apparatus, including: a first acquisition module, configured to acquire image information of a purchase label of a food material; and an identification module, configured to input the image information into a recognition model and output, by the recognition model, the food material corresponding to the image information, wherein the recognition model is obtained by machine learning training using multiple groups of data, and each group of data in the multiple groups of data comprises image information and the food material corresponding to that image information; the recognition model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network.
Optionally, the identification module includes: an input unit, configured to input the image information into the multi-layer long short-term memory network; a summing unit, configured to add the output of the first-layer LSTM network to the output of the third-layer LSTM network, input the sum into the convolutional neural network, and determine result data; and an identification unit, configured to identify the food material corresponding to the image information according to the result data.
Optionally, the apparatus further includes: a second acquisition module, configured to acquire the menu requirement of the user; a first generation module, configured to generate a recommended menu according to the identified food materials and the menu requirement; and a sending module, configured to send the recommended menu.
Optionally, the second acquisition module includes: an acquisition unit, configured to acquire health data and historical dining data of the user; and a generation unit, configured to generate the menu requirement according to the health data and the historical dining data.
Optionally, the apparatus further includes: a second generation module, configured to generate a food material purchase recommendation according to the menu requirement and the identified food materials; and a purchase module, configured to purchase food materials online according to the food material purchase recommendation.
In the embodiments of the invention, the image information of the purchase label of the food material is acquired and input into a recognition model, which outputs the food material corresponding to the image information; the recognition model is obtained by machine learning training using multiple groups of data, each group comprising image information and the food material corresponding to that image information, and it combines a multi-layer long short-term memory network with a multi-layer convolutional neural network. Because it is the purchase label on the food material that is recognized by the model, the food material is identified accurately, which achieves the technical effect of identifying food materials with high visual similarity and solves the technical problem in the related art that such food materials cannot be automatically identified.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a flowchart of a food material identification method according to an embodiment of the invention;
fig. 2 is a schematic structural diagram of a food material identification device according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, a method embodiment of a food material identification method is provided. It should be noted that the steps shown in the flowchart of the drawings may be executed in a computer system, such as by a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from the one given here.
Fig. 1 is a flowchart of a food material identification method according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, acquiring image information of a purchase label of the food material;
step S104, inputting the image information into a recognition model, and outputting, by the recognition model, the food material corresponding to the image information, wherein the recognition model is obtained by machine learning training using multiple groups of data, and each group of data in the multiple groups of data comprises image information and the food material corresponding to that image information; the recognition model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network.
Through the above steps, the image information of the purchase label of the food material is acquired and input into the recognition model, which outputs the food material corresponding to the image information; the recognition model is obtained by machine learning training using multiple groups of data, each group comprising image information and the food material corresponding to that image information, and it combines a multi-layer long short-term memory network with a multi-layer convolutional neural network. Because it is the purchase label on the food material that is recognized, the food material is identified accurately even when food materials are visually similar to one another, which solves the technical problem in the related art that food materials with high visual similarity cannot be automatically identified.
The image information of the purchase label of the food material may be acquired by an image acquisition device disposed inside the refrigerator, such as a video camera or a still camera. There may be one or more image acquisition devices; when there are multiple devices, they are disposed on the inner walls of the refrigerator. Alternatively, the image information may be entered before the food material is placed in the refrigerator: an entry image acquisition device, disposed on the outer shell of the refrigerator, captures the food material together with the purchase label attached to it and generates the image information.
The recognition model is obtained by machine learning training using multiple groups of data. For example, a neural-network recognition model is trained on the multiple groups of data until the model converges, at which point it has learned the mapping between input data and output data. Each group of data in the multiple groups of data comprises image information and the food material corresponding to that image information.
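The training setup just described, multiple groups of (image information, food material) pairs used until convergence, can be sketched abstractly. The model below is a trivial nearest-prototype stand-in written purely to illustrate the input/output pairing of the training data; it is not the patent's LSTM/CNN model, and all names and feature values here are illustrative assumptions:

```python
# Toy stand-in model: learns one prototype feature vector per food material
# from (feature vector, label) pairs, then predicts the nearest prototype.
def train(groups):
    """groups: iterable of (features, food_material) pairs."""
    sums, counts = {}, {}
    for features, label in groups:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    # Average the accumulated features per label to get a prototype.
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def predict(model, features):
    def sq_dist(label):
        return sum((a - b) ** 2 for a, b in zip(features, model[label]))
    return min(model, key=sq_dist)

# Two food materials, three labelled groups of data.
model = train([([0.0, 1.0], "greens"), ([1.0, 0.0], "pork"), ([0.1, 0.9], "greens")])
label = predict(model, [0.05, 0.95])
```

The real recognition model replaces the prototype table with trained network weights, but the shape of the supervision, paired image information and food material labels, is the same.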
The recognition model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network. A long short-term memory (LSTM) network is a recurrent neural network suited to processing and predicting events in a time series that are separated by long and variable intervals. A convolutional neural network (CNN) is a feedforward neural network whose artificial neurons respond to surrounding units, which makes it well suited to large-scale image processing. Combining the multi-layer LSTM network with the multi-layer CNN gives the recognition model better recognition performance.
Optionally, inputting the image information into the recognition model and outputting, by the recognition model, the food material corresponding to the image information includes: inputting the image information into the multi-layer long short-term memory network; adding the output of the first-layer LSTM network to the output of the third-layer LSTM network, inputting the sum into the convolutional neural network, and determining result data; and identifying the food material corresponding to the image information according to the result data.
In the recognition model, the multi-layer long short-term memory network and the multi-layer convolutional neural network can be combined in various ways. In this embodiment, the recognition model comprises three LSTM layers and one convolutional neural network: the input image information is processed by the first LSTM layer, which outputs a first result; the first result is processed by the second LSTM layer, which outputs a second result; the second result is processed by the third LSTM layer, which outputs a third result; the first result and the third result are then added together and input into the convolutional neural network, which outputs a final result. The remainder of the recognition model processes the final result to identify the food material corresponding to the image information.
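The data flow just described can be sketched in plain Python. The layer functions below are hypothetical stand-ins (simple element-wise scalings, not trained LSTM or convolutional layers); the sketch only demonstrates the wiring the patent describes, three stacked LSTM layers with a skip connection adding the first layer's output to the third layer's output before the CNN:

```python
def make_layer(scale):
    # Stand-in for a trained layer: element-wise transform of a feature vector.
    return lambda xs: [scale * x for x in xs]

# Hypothetical placeholder layers; in the patent these would be trained
# LSTM layers and a convolutional neural network.
lstm1, lstm2, lstm3 = make_layer(0.5), make_layer(0.9), make_layer(1.1)
cnn = make_layer(2.0)

def recognize_features(image_features):
    """Wire the layers as described in the embodiment: three stacked LSTM
    layers, with the first layer's output added to the third layer's output
    (a skip connection) before the sum enters the CNN."""
    out1 = lstm1(image_features)                   # first result
    out2 = lstm2(out1)                             # second result
    out3 = lstm3(out2)                             # third result
    summed = [a + b for a, b in zip(out1, out3)]   # skip connection
    return cnn(summed)                             # final result data

result = recognize_features([1.0, 2.0])
```

The skip connection is the distinctive point: it lets the CNN see both the shallow first-layer features and the deep third-layer features of the label image.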
Optionally, after the recognition model outputs the food material corresponding to the image information, the method further includes: acquiring a menu requirement of a user; generating a recommended menu according to the identified food materials and the menu requirement; and sending the recommended menu.
The menu requirement of the user may be received through a human-computer interaction device, such as a touch screen or a voice input device. The menu requirement may specify a cuisine, such as Sichuan, Shandong (Lu), Cantonese, or Huaiyang cuisine; a taste, such as sweet-and-sour, spicy, or halal; a meat-or-vegetable preference; an energy limit, for example within 200 calories; or a type of dish, for example dishes made with chicken. The menu requirements of a wide variety of users can thus be met.
A recommended menu is generated from the menu requirement and the food materials identified in the refrigerator: dishes satisfying the menu requirement are selected and combined with the identified food materials to form the recommended menu. When selecting dishes according to the menu requirement, dishes can be preferentially recommended or excluded based on the menus the user has selected before. The generated menu is then sent to the user for selection.
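The selection step above can be sketched as a simple filter over an in-memory recipe table. The dish names, cuisine tags, and field names here are illustrative assumptions, not from the patent:

```python
# Illustrative recipe table: each dish lists its cuisine tag and ingredients.
RECIPES = [
    {"name": "stir-fried greens", "cuisine": "Cantonese", "ingredients": {"greens", "garlic"}},
    {"name": "twice-cooked pork", "cuisine": "Sichuan", "ingredients": {"pork", "leek"}},
    {"name": "mapo tofu", "cuisine": "Sichuan", "ingredients": {"tofu", "pork"}},
]

def recommend_menu(requirement, identified_foods, history_rejected=()):
    """Select dishes matching the cuisine requirement whose ingredients are
    all among the identified food materials, excluding dishes the user has
    previously rejected."""
    return [
        dish["name"]
        for dish in RECIPES
        if dish["cuisine"] == requirement
        and dish["ingredients"] <= set(identified_foods)   # subset test
        and dish["name"] not in history_rejected
    ]

menu = recommend_menu("Sichuan", {"pork", "leek", "tofu"}, history_rejected={"mapo tofu"})
```

A production system would rank rather than merely filter, but the intersection of menu requirement, identified food materials, and selection history is the core of the step.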
Optionally, acquiring the menu requirement of the user includes: acquiring health data and historical dining data of the user; and generating the menu requirement according to the health data and the historical dining data.
The health data of the user may include height, weight, blood glucose, blood lipids, and the like. The menu requirement is generated from the health data and the historical dining data; for example, for a user who is overweight or has high blood glucose, greasy and high-sugar dishes are avoided as far as possible.
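A rule of the kind just described can be sketched as follows. The thresholds, units, and tag names are illustrative assumptions only, not medical guidance or values from the patent:

```python
def menu_requirement(health, history):
    """Derive simple dietary constraints from health data and dining history.
    `health` maps measurement names to values; `history` is a list of past
    meals, each a list of dish names. Thresholds are illustrative."""
    avoid = set()
    if health.get("blood_glucose", 0.0) > 7.0:   # mmol/L, assumed cutoff
        avoid.add("high-sugar")
    if health.get("bmi", 0.0) > 28.0:            # assumed overweight cutoff
        avoid.add("greasy")
    recently_eaten = {dish for meal in history for dish in meal}
    return {"avoid_tags": avoid, "recently_eaten": recently_eaten}

req = menu_requirement({"blood_glucose": 8.1, "bmi": 24.0}, [["congee"], ["noodles"]])
```

The returned constraints can then feed directly into the dish-selection step when generating the recommended menu.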
Optionally, after acquiring the menu requirement of the user, the method further includes: generating a food material purchase recommendation according to the menu requirement and the identified food materials; and purchasing food materials online according to the food material purchase recommendation.
Before the recommended menu is generated, if there is no food material in the refrigerator, or if the identified food materials are too few to form a recommended menu that satisfies the menu requirement, a food material purchase recommendation is generated from the menu requirement and the identified food materials. If the identified food materials are plentiful and yet no recommended menu satisfying the menu requirement can be formed, the situation can be explained to the user: both the identified food materials and the menu requirement are presented, and the user chooses either to change the menu requirement or to purchase additional food materials.
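At its core, the purchase recommendation is a set difference between the ingredients the required dishes need and the food materials already identified in the refrigerator. A minimal sketch, with illustrative recipe data:

```python
def purchase_recommendation(required_dishes, identified_foods):
    """List the ingredients needed by the required dishes that are not
    among the food materials identified in the refrigerator.
    `required_dishes` maps each dish name to its set of ingredients."""
    needed = set()
    for ingredients in required_dishes.values():
        needed |= ingredients
    return sorted(needed - set(identified_foods))

# Illustrative data: one dish still missing two of its ingredients.
to_buy = purchase_recommendation(
    {"kung pao chicken": {"chicken", "peanuts", "dried chili"}},
    {"chicken"},
)
```

The sorted list can then be shown to the user for confirmation before any online order is placed, as described below.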
Before food materials are purchased online according to the purchase recommendation, the user is asked to confirm whether the purchase is needed. This prevents an order, and hence a loss, from being placed without the user's knowledge. During the purchase, feedback on the purchasing progress is sent to the user.
Fig. 2 is a schematic structural diagram of a food material recognition apparatus according to an embodiment of the invention, and as shown in fig. 2, the food material recognition apparatus 20 includes: a first obtaining module 22 and a recognition module 24, which will be described in detail below.
A first acquisition module 22, configured to acquire image information of a purchase label of a food material; and a recognition module 24, connected to the first acquisition module 22 and configured to input the image information into a recognition model and output, by the recognition model, the food material corresponding to the image information, wherein the recognition model is obtained by machine learning training using multiple groups of data, and each group of data in the multiple groups of data comprises image information and the food material corresponding to that image information; the recognition model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network.
With this apparatus, the first acquisition module 22 acquires the image information of the purchase label of the food material, and the recognition module 24 inputs the image information into the recognition model, which outputs the corresponding food material. Because it is the purchase label on the food material that is recognized, the food material is identified accurately, which achieves the technical effect of identifying food materials with high visual similarity and solves the technical problem in the related art that such food materials cannot be automatically identified.
Optionally, the recognition module 24 includes: an input unit, configured to input the image information into the multi-layer long short-term memory network; a summing unit, configured to add the output of the first-layer LSTM network to the output of the third-layer LSTM network and input the sum into the convolutional neural network; and an identification unit, configured to identify the food material corresponding to the image information from the output of the convolutional neural network and the remainder of the recognition model.
Optionally, the recognition apparatus 20 further includes: a second acquisition module, configured to acquire the menu requirement of the user; a first generation module, configured to generate a recommended menu according to the identified food materials and the menu requirement; and a sending module, configured to send the recommended menu.
Optionally, the second acquisition module includes: an acquisition unit, configured to acquire health data and historical dining data of the user; and a generation unit, configured to generate the menu requirement according to the health data and the historical dining data.
Optionally, the recognition apparatus 20 further includes: a second generation module, configured to generate a food material purchase recommendation according to the menu requirement and the identified food materials; and a purchase module, configured to purchase food materials online according to the food material purchase recommendation.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.
Claims (10)
1. A food material identification method is characterized by comprising the following steps:
acquiring image information of a purchase label of food material;
inputting the image information into a recognition model, and outputting, by the recognition model, the food material corresponding to the image information, wherein the recognition model is obtained by machine learning training using multiple groups of data, and each group of data in the multiple groups of data comprises image information and the food material corresponding to that image information; the recognition model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network.
2. The method of claim 1, wherein inputting the image information into the recognition model and outputting, by the recognition model, the food material corresponding to the image information comprises:
inputting the image information into the multi-layer long short-term memory network;
summing the output of the first LSTM layer and the output of the third LSTM layer, inputting the sum into the convolutional neural network, and determining result data; and
identifying the food material corresponding to the image information according to the result data.
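Claim 2 describes a specific wiring: the image information passes through a stack of LSTM layers, the outputs of the first and third layers are summed (a skip connection), and the sum is fed to a convolutional network to produce the result data. A minimal NumPy sketch of that skip connection follows; the layer sizes, random weights, and the simplified tanh recurrence standing in for a true LSTM cell are all illustrative assumptions, not the patented implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_layer(x_seq, hidden):
    """Toy recurrent layer (tanh RNN stand-in for an LSTM cell).
    x_seq: (T, d_in) -> returns (T, hidden)."""
    d_in = x_seq.shape[1]
    Wx = rng.normal(0, 0.1, (d_in, hidden))
    Wh = rng.normal(0, 0.1, (hidden, hidden))
    h = np.zeros(hidden)
    out = []
    for x in x_seq:
        h = np.tanh(x @ Wx + h @ Wh)
        out.append(h)
    return np.stack(out)

def conv1d(x_seq, kernel):
    """Valid 1-D convolution over time, summed across features."""
    T, _ = x_seq.shape
    k = kernel.shape[0]
    return np.array([(x_seq[t:t + k] * kernel).sum() for t in range(T - k + 1)])

# Image of a purchase label, flattened into a sequence of feature rows.
label_image = rng.normal(size=(12, 8))    # 12 time steps, 8 features each

h1 = lstm_layer(label_image, hidden=16)   # layer 1
h2 = lstm_layer(h1, hidden=16)            # layer 2
h3 = lstm_layer(h2, hidden=16)            # layer 3

skip = h1 + h3                            # sum of layer-1 and layer-3 outputs (claim 2)
kernel = rng.normal(size=(3, 16))
result = conv1d(skip, kernel)             # "result data" for identifying the food material

print(result.shape)                       # (10,)
```

The skip connection lets the convolutional stage see both low-level features (layer 1) and higher-level features (layer 3), which is a common motivation for residual-style sums in deep sequence models.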
3. The method of claim 1, further comprising:
acquiring a menu requirement of a user;
generating a recommended menu according to the identified food materials and the menu requirements;
and sending the recommended menu.
4. The method of claim 3, wherein acquiring the menu requirement of the user comprises:
acquiring health data and historical meal data of the user; and
generating the menu requirement according to the health data and the historical meal data.
5. The method of claim 3, further comprising:
generating a food material purchase recommendation according to the menu requirement and the identified food material; and
purchasing food materials online according to the food material purchase recommendation.
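Claims 3 through 5 outline a recommendation flow: a menu requirement is derived from the user's health data and meal history, recipes are recommended from the identified food materials under that requirement, and missing materials become a purchase recommendation. A toy sketch of that flow, where the recipe records, field names, and filtering rules are all hypothetical illustrations (the patent specifies none of them):

```python
# Hypothetical recipe data; the patent does not specify formats or rules.
RECIPES = [
    {"name": "tomato egg stir-fry", "materials": {"tomato", "egg"}, "sugar": "low"},
    {"name": "braised pork",        "materials": {"pork", "soy sauce"}, "sugar": "high"},
    {"name": "cucumber salad",      "materials": {"cucumber"}, "sugar": "low"},
]

def menu_requirement(health_data, history):
    """Claim 4: derive a menu requirement from health data and meal history."""
    req = {"sugar": "low"} if health_data.get("blood_sugar") == "high" else {}
    # Avoid materials eaten in the last three meals (assumed heuristic).
    req["avoid"] = {m for meal in history[-3:] for m in meal}
    return req

def recommend(identified, req):
    """Claim 3: recommend recipes matching on-hand materials and the requirement."""
    out = []
    for r in RECIPES:
        if "sugar" in req and r["sugar"] != req["sugar"]:
            continue                              # violates health constraint
        if r["materials"] & req.get("avoid", set()):
            continue                              # repeats a recent meal
        if r["materials"] <= identified:          # all materials on hand
            out.append(r["name"])
    return out

def purchase_list(recipe_name, identified):
    """Claim 5: materials still needed for a chosen recipe."""
    recipe = next(r for r in RECIPES if r["name"] == recipe_name)
    return recipe["materials"] - identified

have = {"tomato", "egg", "cucumber"}
req = menu_requirement({"blood_sugar": "high"}, history=[{"cucumber"}])
print(recommend(have, req))    # ['tomato egg stir-fry']
```

Here high blood sugar forces low-sugar recipes, the cucumber salad is filtered out because cucumber appears in the recent meal history, and `purchase_list` would report the materials an online order still needs to cover.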
6. A food material identification apparatus, characterized by comprising:
a first acquisition module, configured to acquire image information of a purchase label of a food material; and
an identification module, configured to input the image information into a recognition model and output, by the recognition model, the food material corresponding to the image information, wherein the recognition model is obtained through machine learning training using multiple groups of data, each group of data comprising image information and the food material identified in that image information; and the recognition model comprises a multi-layer long short-term memory network and a multi-layer convolutional neural network.
7. The apparatus of claim 6, wherein the identification module comprises:
an input unit, configured to input the image information into the multi-layer long short-term memory network;
a summing unit, configured to sum the output of the first LSTM layer and the output of the third LSTM layer, input the sum into the convolutional neural network, and determine result data; and
an identification unit, configured to identify the food material corresponding to the image information according to the result data.
8. The apparatus of claim 7, further comprising:
a second acquisition module, configured to acquire a menu requirement of a user;
a first generation module, configured to generate a recommended menu according to the identified food material and the menu requirement; and
a sending module, configured to send the recommended menu.
9. The apparatus of claim 8, wherein the second acquisition module comprises:
an acquisition unit, configured to acquire health data and historical meal data of the user; and
a generation unit, configured to generate the menu requirement according to the health data and the historical meal data.
10. The apparatus of claim 8, further comprising:
a second generation module, configured to generate a food material purchase recommendation according to the menu requirement and the identified food material; and
a purchase module, configured to purchase food materials online according to the food material purchase recommendation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810959222.4A CN110858279A (en) | 2018-08-22 | 2018-08-22 | Food material identification method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110858279A true CN110858279A (en) | 2020-03-03 |
Family
ID=69635834
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810959222.4A Pending CN110858279A (en) | 2018-08-22 | 2018-08-22 | Food material identification method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110858279A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160327279A1 (en) * | 2015-05-05 | 2016-11-10 | June Life, Inc. | Connected food preparation system and method of use |
CN107392865A (en) * | 2017-07-01 | 2017-11-24 | 广州深域信息科技有限公司 | A kind of restored method of facial image |
CN107798277A (en) * | 2016-09-05 | 2018-03-13 | 合肥美的智能科技有限公司 | Food materials identifying system and method, food materials model training method, refrigerator and server |
US9965798B1 (en) * | 2017-01-31 | 2018-05-08 | Mikko Vaananen | Self-shopping refrigerator |
CN108335731A (en) * | 2018-02-09 | 2018-07-27 | 辽宁工程技术大学 | A kind of invalid diet's recommendation method based on computer vision |
Non-Patent Citations (1)
Title |
---|
CHEN ZONGHAI (ed.): "System Simulation Technology and Its Applications, Vol. 19", University of Science and Technology of China Press, pages 264-273 *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021210230A1 (en) * | 2020-04-14 | 2021-10-21 | 株式会社 ゼンショーホールディングス | Heating state identification device, heating control device, heating control method, heating state identification system, and heating control system |
JP2021169875A (en) * | 2020-04-14 | 2021-10-28 | 株式会社 ゼンショーホールディングス | Heating state identification device, heating control device, heating control method, heating state identification system, and heating control system |
CN111553304A (en) * | 2020-05-09 | 2020-08-18 | 北京小狗智能机器人技术有限公司 | Information processing method, terminal and device |
CN111680570A (en) * | 2020-05-13 | 2020-09-18 | 珠海格力电器股份有限公司 | Augmented reality image data processing method, device, equipment and storage medium |
CN111797719A (en) * | 2020-06-17 | 2020-10-20 | 武汉大学 | Food component identification method |
CN111797719B (en) * | 2020-06-17 | 2022-09-02 | 武汉大学 | Food component identification method |
CN113488142A (en) * | 2021-07-28 | 2021-10-08 | 珠海格力电器股份有限公司 | Menu recommendation method and device, storage medium and equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110858279A (en) | Food material identification method and device | |
KR101798616B1 (en) | Method of providing recipe and server performing the same | |
CN109166614A (en) | A kind of system and method for recommending personal health menu | |
CN103810284A (en) | Kitchen management method and device | |
CN108280729A (en) | A kind of food preparation method and device | |
KR20110035380A (en) | System and method for providing recipe information based on network | |
CN108230075A (en) | The treating method and apparatus of food materials information | |
CN101398805A (en) | Electronic dish ordering system with intelligent recommendation function | |
CN106453545A (en) | Recipe showing and interacting method and system, and intelligent device | |
CN110135646A (en) | The method, apparatus quickly served and storage medium are estimated in a kind of dining room | |
WO2020027633A2 (en) | Cooking recipe service providing method for creating and sharing recipe | |
CN108334606A (en) | Voice interactive method, device and server for smart home | |
CN112464013A (en) | Information pushing method and device, electronic equipment and storage medium | |
CN108172273A (en) | A kind of refrigerator food materials based on visitor recommend method | |
CN109741125A (en) | Recommend method and device, the storage medium, electronic device of vegetable | |
KR20160087622A (en) | Server and method for managing and providing menu information | |
CN115599890B (en) | Product recommendation method and related device | |
CN110021402A (en) | A kind of menu recommended method and menu recommender system based on image recognition | |
CN111476103A (en) | Food material identification and health assessment method, device, equipment and computer readable medium | |
CN109784131B (en) | Object detection method, device, storage medium and processor | |
CN110400197A (en) | Data processing method, device, medium and electronic equipment | |
CN115587245A (en) | Menu list recommendation method and device, storage medium and electronic device | |
CN115082149A (en) | Electronic equipment, server and cooking equipment recommendation method | |
CN111667082A (en) | Feedback method and apparatus, storage medium, and electronic apparatus | |
CN110762943B (en) | Article display method and device and household appliance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200303 |