CN110619371A - Food cooking time length recommendation method, storage medium and electric chafing dish - Google Patents


Info

Publication number
CN110619371A
CN110619371A (application CN201910903261.7A)
Authority
CN
China
Prior art keywords
food material
cooking time
neural network
convolutional neural
food
Prior art date
Legal status
Pending
Application number
CN201910903261.7A
Other languages
Chinese (zh)
Inventor
叶朝虹
宋德超
陈翀
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN201910903261.7A
Publication of CN110619371A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/60 - ICT specially adapted for therapies or health-improving plans relating to nutrition control, e.g. diets


Abstract

The invention provides a food material cooking time recommendation method applied to an electric chafing dish, a storage medium, and the electric chafing dish, and relates to the technical field of intelligent household appliances. In the method, the food material cooking time recommendation working mode of the electric chafing dish is started and a food material image is acquired; the food material in the image is identified by a convolutional neural network, a time parameter associated with the food material is obtained according to the identification result, and the recommended cooking time of the food material is determined according to the time parameter, so that the electric chafing dish is controlled to cook the food material for the recommended cooking time. The scheme provided by the invention not only keeps the food material at its best taste and nutrition, but also makes it easy for the user to control the cooking process.

Description

Food cooking time length recommendation method, storage medium and electric chafing dish
Technical Field
The invention belongs to the technical field of intelligent household appliances, and particularly relates to a method for recommending the cooking time of food materials, a storage medium and an electric chafing dish.
Background
At present, food identification technology has entered the intelligent household appliance industry and provides users with richer man-machine interaction functions, for example the Changhong CHiQ cloud image-recognition refrigerator and the Midea smart refrigerator, so applications based on food identification will be one of the important man-machine interaction functions of the household appliance industry in the future.
The electric chafing dish is a popular cooking appliance in real life and is commonly used in chafing dish shops and families.
When a prior-art electric chafing dish is used in a hot-pot restaurant, the menu usually indicates a cooking time for each food material, but the resulting doneness does not suit every diner. When it is used at home, there is no reference cooking time at all and the user relies on subjective judgment, so food materials are easily overcooked or left undercooked, which causes nutrition and health problems.
Disclosure of Invention
In view of the above, an embodiment of the present invention provides a method for recommending the cooking time of a food material, so as to solve the problem that, in the prior art, when an electric chafing dish is used to cook food materials, the cooking time cannot be determined and recommended according to both user requirements and the characteristics of the food material itself.
To solve this problem, the technical scheme provided by the embodiments of the invention is as follows:
in a first aspect, an embodiment of the present invention provides a method for recommending cooking time of a food material, including the following steps:
acquiring a food material image;
identifying the food material in the food material image by using a convolutional neural network, obtaining a time parameter associated with the food material according to the identification result, determining the recommended cooking time of the food material according to the time parameter, and pushing the recommended cooking time so that the food material is cooked accordingly.
Preferably, in the embodiment of the present invention, before identifying the food material in the food material image by using the convolutional neural network, the method further includes the following steps:
and constructing a convolutional neural network model, and training model parameters of the convolutional neural network model by using a food material training set to obtain the trained convolutional neural network.
Preferably, in the embodiment of the present invention, the convolutional neural network model is constructed as follows; the convolutional neural network model includes:
at least one convolutional layer, the input end of which receives the food material image, used for performing convolution processing on the received food material image so as to extract and output a corresponding convolutional feature image;
at least one pooling layer, the input end of which is connected with the output end of the convolutional layer, used for pooling the convolutional feature image output by the convolutional layer and outputting the pooled convolutional feature image;
and a full link layer, the input end of which is connected with the output end of the pooling layer, used for classifying and identifying the pooled convolutional feature images and outputting an identification result.
Preferably, in the embodiment of the present invention, the food material training set is used to train the model parameters of the convolutional neural network model to obtain the trained convolutional neural network, which specifically includes:
step S10, initializing the model parameters of the convolutional neural network model;
step S20, inputting the food material training set images into a convolutional neural network model, wherein the convolutional layer, the pooling layer and the full link layer of the convolutional neural network model sequentially process the food material training set images, and output the classification recognition output values of the food material training set images;
step S30, determining an error value between the classification recognition output value of the food material training set image and a preset classification recognition target value;
when the error value is greater than the preset threshold value, back-propagating the error value into the convolutional neural network to correct the model parameters, and then returning to step S20;
and when the error value is less than or equal to the preset threshold value, outputting the trained convolutional neural network.
Preferably, in the embodiment of the present invention, a time parameter associated with the food material is obtained according to the identification result of the food material, and the recommended cooking time of the food material is determined according to the time parameter, specifically:
the time parameter associated with the food material comprises an empirical cooking time and a current average cooking time;
acquiring the empirical cooking time of the food material according to the type of the food material;
acquiring the current average cooking time of the food material according to the cooking history of the food material, wherein the current average cooking time is the average value of the last actual cooking time and the last obtained average cooking time;
and carrying out weighted summation on the empirical cooking time and the current average cooking time to obtain the recommended cooking time of the food material.
Preferably, in the embodiment of the present invention, the method further includes the following steps:
pushing the current recommended cooking time to the user so that the user can set the current actual cooking time according to the current recommended cooking time, and cooking the food material according to the current actual cooking time set by the user.
Preferably, in the embodiment of the present invention, the method further includes the following steps:
and automatically setting the current recommended cooking time to be the current actual cooking time, and cooking the food material according to the current actual cooking time.
Preferably, in an embodiment of the present invention, before acquiring the food material image, the method further includes the following step: starting the food material cooking time recommendation working mode, the food material image being acquired while this working mode is active.
In a second aspect, an embodiment of the present invention further provides a storage medium, which stores a program for implementing the method for recommending cooking time of food material.
In a third aspect, an embodiment of the present invention further provides an electric chafing dish, including a processor and a memory, wherein the processor executes a program stored in the memory so as to implement the food material cooking time recommendation method.
The invention has the following beneficial effects: in the technical scheme, the food material category is identified by a convolutional neural network, a time parameter associated with the food material is obtained according to the identification result, and the optimal recommended cooking time of the food material is calculated from that parameter, so that the food material is cooked to its optimal state. By contrast, when a prior-art electric chafing dish is used, the cooking time is often set only according to the user's subjective feeling, the food material easily loses nutrition, and over a long time this can harm the user's health.
The scheme provided by the invention not only enables the food material to keep the best taste and nutrition, but also facilitates the user to control the cooking process.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and drawings.
Drawings
In order to illustrate the technical solutions of the embodiments more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered as limiting the scope; those skilled in the art can obtain other related drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of a method for identifying a food material image by using a convolutional neural network according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a method for training a convolutional neural network according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating a method for calculating a recommended cooking time of a food material according to an embodiment of the present invention;
fig. 4 is a flowchart illustrating a method for identifying an image of a food material by using a convolutional neural network and calculating a recommended cooking length of the food material according to an identification result according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Before describing the specific embodiments of the present invention, a brief description will be given of the application scenarios of the present invention.
At present, when food materials are cooked with an electric chafing dish, the optimal cooking time is not easy to control. In a hot-pot restaurant, the menu often indicates a cooking time for each food material, but the resulting doneness does not suit every diner. At home there is no reference cooking time at all and the user relies on subjective judgment, so food materials are easily overcooked or left undercooked, which causes nutrition and health problems.
In order to better understand the scheme provided by the embodiment of the invention, the following embodiments take the situation that a user cooks food materials with an electric hot pot at home as an application scenario, and describe the idea provided by the embodiment of the invention in detail.
Example one
Referring to fig. 1 and fig. 2, fig. 1 is a schematic flow chart of a method for recognizing food material images by using a convolutional neural network according to an embodiment of the present invention, fig. 2 is a schematic flow chart of a method for training a convolutional neural network according to an embodiment of the present invention, and a method for recognizing food material images by using a convolutional neural network will be described in detail with reference to steps S110 to S140 of fig. 1 and steps S210 to S260 of fig. 2.
Step S110, starting the food material cooking time recommendation working mode of the electric chafing dish.
Specifically, the user starts the food material cooking time recommendation working mode through a working mode selection key arranged on the panel of the electric chafing dish.
Step S120, acquiring a food material image.
The method specifically comprises the step of acquiring an image of the food material by using a camera arranged on the electric chafing dish.
Step S130, identifying the food materials in the food material image by using a convolutional neural network.
Before the food materials in the food material image are identified by the convolutional neural network, a convolutional neural network model is constructed.
Further, a specific method for constructing the convolutional neural network model is as follows: the convolutional neural network model includes:
the input end of the convolutional layer receives the food material image, and the convolutional layer is used for performing convolution processing on the received food material image so as to extract and output a corresponding convolution characteristic image;
the input end of the pooling layer is connected with the output end of the convolution layer, and the pooling layer is used for pooling the convolution characteristic image output by the convolution layer and outputting the pooled convolution characteristic image;
the input end of the full link layer is connected with the output end of the pooling layer, and the full link layer is used for classifying and identifying the convolution characteristic images after pooling processing and outputting the classification and identification results.
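The three layers just described can be sketched as a minimal NumPy forward pass. This is an illustrative stand-in rather than the patent's implementation: the kernel values, pooling window and number of food material labels are assumptions made for the example.

```python
import numpy as np

def conv2d(image, kernel):
    """Convolutional layer: slide the kernel over the image (valid padding)
    and output the convolutional feature image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature, size=2):
    """Pooling layer: non-overlapping max pooling of the feature image."""
    h2, w2 = feature.shape[0] // size, feature.shape[1] // size
    return feature[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def full_link(feature, weights, bias):
    """Full link layer: flatten the pooled features and emit per-label scores
    via a softmax, i.e. the classification recognition result."""
    logits = feature.ravel() @ weights + bias
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# A 6x6 grayscale "food material image" flows through the three layers:
image = np.arange(36, dtype=float).reshape(6, 6)
feat = conv2d(image, np.ones((3, 3)) / 9.0)                # 4x4 feature image
pooled = max_pool(feat)                                    # 2x2 after pooling
scores = full_link(pooled, np.zeros((4, 3)), np.zeros(3))  # 3 assumed labels
```

With the zero weights used here the softmax output is uniform over the three assumed labels; a trained network would output a peaked distribution whose argmax is the food material label.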
Step S210, initializing the model parameters of the convolutional neural network model.
Initializing the model parameters of the convolutional neural network model specifically means initializing the model parameters of its convolutional layers, pooling layers and full link layers.
Step S220, inputting the food material training set images, labeled with food material labels, into the initialized convolutional neural network model, and outputting the output values of the image classification recognition results.
The food material training set images must first be labeled: a food material label is attached to each image of the training set, and only then are the images input into the initialized convolutional neural network model.
Further, the output value of this step is obtained as follows: the labeled food material training set images are propagated forward through the convolutional layer, the pooling layer and the full link layer of the convolutional neural network model in turn, which output the classification recognition result value for each training image.
In step S230, an error value between an output value of the food material training set image classification recognition result and a target value is determined.
Step S240, comparing the error value with a preset error threshold, specifically:
when the error value is larger than the preset error threshold value, propagating the error value back into the initialized convolutional neural network, updating the weights of the network according to the returned error value, and then executing step S220 again;
and when the error value is smaller than or equal to the preset error threshold value, outputting the model parameters of the trained convolutional neural network model to obtain the trained convolutional neural network.
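Steps S210 to S240 form a standard threshold-terminated training loop: initialize, forward-propagate, measure the error, and back-propagate while the error exceeds the threshold. The skeleton below is a hedged sketch in which a single linear layer with squared error stands in for the full convolutional network; the learning rate, random seed and toy data are assumptions, not values from the patent.

```python
import numpy as np

def train_until_threshold(x, y, threshold=1e-3, lr=0.1, max_epochs=10000):
    rng = np.random.default_rng(0)
    w = rng.normal(size=x.shape[1])             # S210: initialize model parameters
    error = np.inf
    for _ in range(max_epochs):
        pred = x @ w                            # S220: forward pass (conv/pool/full link in the patent)
        error = np.mean((pred - y) ** 2)        # S230: error between output and target values
        if error <= threshold:                  # S240: stop once error <= threshold
            break
        grad = 2.0 * x.T @ (pred - y) / len(y)  # back-propagate the error, update the weights
        w -= lr * grad
    return w, error

# Toy "training set": targets generated by the true weights (2, 3).
x = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = x @ np.array([2.0, 3.0])
w, err = train_until_threshold(x, y)
```

Because the error is re-checked on every pass, the loop terminates exactly as in step S240: either the threshold is met or the epoch budget runs out.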
And step S140, identifying the food materials by using the trained convolutional neural network and outputting food material labels.
In summary, in the embodiments of the present invention, the trained convolutional neural network is obtained by constructing the convolutional neural network model and training the convolutional neural network model, and the trained convolutional neural network is used to identify the food material category.
The scheme provided by the invention not only enables the food material to keep the best taste and nutrition, but also facilitates the user to control the cooking process.
Example two
Referring to fig. 3, fig. 3 is a flowchart illustrating a method for calculating a recommended cooking length of a food material according to a food material tag according to an embodiment of the present invention, and a method for calculating the recommended cooking length of the food material according to the food material tag will be described in detail with reference to steps S150-S160.
Step S110, starting the food material cooking time recommendation working mode of the electric chafing dish.
Specifically, the user starts the food material cooking time recommendation working mode through a working mode selection key arranged on the panel of the electric chafing dish.
Step S120, acquiring a food material image.
The method specifically comprises the step of acquiring an image of the food material by using a camera arranged on the electric chafing dish.
Step S130, identifying the food materials in the food material image by using a convolutional neural network.
The method specifically comprises the following steps:
and S131, constructing a convolutional neural network model.
Before the food materials in the food material image are identified by the convolutional neural network, a convolutional neural network model is constructed.
Further, a specific method for constructing the convolutional neural network model is as follows: the convolutional neural network model includes:
the input end of the convolutional layer receives the food material image, and the convolutional layer is used for performing convolution processing on the received food material image so as to extract and output a corresponding convolution characteristic image;
the input end of the pooling layer is connected with the output end of the convolution layer, and the pooling layer is used for pooling the convolution characteristic image output by the convolution layer and outputting the pooled convolution characteristic image;
the input end of the full link layer is connected with the output end of the pooling layer, and the full link layer is used for classifying and identifying the convolution characteristic images after pooling processing and outputting the classification and identification results.
Step S210, initializing the model parameters of the convolutional neural network model.
Initializing the model parameters of the convolutional neural network model specifically means initializing the model parameters of its convolutional layers, pooling layers and full link layers.
Step S220, inputting the food material training set images, labeled with food material labels, into the initialized convolutional neural network model, and outputting the output values of the image classification recognition results.
The food material training set images must first be labeled: a food material label is attached to each image of the training set, and only then are the images input into the initialized convolutional neural network model.
Further, the output value of this step is obtained as follows: the labeled food material training set images are propagated forward through the convolutional layer, the pooling layer and the full link layer of the convolutional neural network model in turn, which output the classification recognition result value for each training image.
In step S230, an error value between an output value of the food material training set image classification recognition result and a target value is determined.
Step S240, comparing the error value with a preset error threshold, specifically:
when the error value is larger than the preset error threshold value, propagating the error value back into the initialized convolutional neural network, updating the weights of the network according to the returned error value, and then executing step S220 again;
and when the error value is smaller than or equal to the preset error threshold value, outputting the model parameters of the trained convolutional neural network model to obtain the trained convolutional neural network.
And step S140, identifying the food materials by using the trained convolutional neural network and outputting food material labels.
Step S150, acquiring a time parameter associated with the food material according to the food material tag.
Wherein the time parameters associated with the food material include an empirical cooking duration and a current average cooking duration;
acquiring the empirical cooking time of the food material according to the type of the food material;
and acquiring the current average cooking time of the food material according to the cooking history of the food material, wherein the current average cooking time is the average value of the last actual cooking time and the last obtained average cooking time.
Step S160, calculating a recommended cooking time of the food material according to the time parameter associated with the food material tag.
The method comprises the following specific steps: weighting and summing the empirical cooking time and the current average cooking time to obtain the recommended cooking time of the food material, specifically, calculating the recommended cooking time of the food material according to the following formula:
C=a*A+b*B,
wherein:
C is the current recommended cooking time;
A is the empirical cooking time;
B is the current average cooking time;
a and b are weights, which are user-defined or take default values.
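As a sketch, the running average of step S150 and the weighted sum of step S160 translate directly into code. The equal default weights a = b = 0.5 and all the example numbers are assumptions; the patent only states that the weights are user-defined or take default values.

```python
def current_average(last_actual, last_average):
    """Step S150: the current average cooking time is the mean of the last
    actual cooking time and the previously stored average cooking time."""
    return (last_actual + last_average) / 2.0

def recommended_time(empirical, current_avg, a=0.5, b=0.5):
    """Step S160: C = a*A + b*B, with A the empirical cooking time and
    B the current average cooking time from the cooking history."""
    return a * empirical + b * current_avg

# Illustrative numbers only: the last cook took 50 s, the stored average was
# 70 s, and the empirical time for this hypothetical food material is 80 s.
avg = current_average(50.0, 70.0)   # (50 + 70) / 2 = 60.0
rec = recommended_time(80.0, avg)   # 0.5*80 + 0.5*60 = 70.0
```

Because the running average folds each new actual cooking time into the stored value, the recommendation drifts toward the user's habits while the empirical term anchors it to the food material type.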
Step S170, setting an actual cooking time according to the recommended cooking time, so as to cook the food material according to the actual cooking time.
Further, the actual cooking time is set according to the recommended cooking time, specifically: pushing the current recommended cooking time to the user so that the user can set the current actual cooking time according to the current recommended cooking time, and cooking the food material according to the current actual cooking time set by the user.
Further, the actual cooking time period is set according to the recommended cooking time period, and the following steps can be further included: and automatically setting the current recommended cooking time to be the current actual cooking time, and cooking the food material according to the current actual cooking time.
In summary, in the embodiments of the present invention, the trained convolutional neural network is obtained by constructing the convolutional neural network model and training the convolutional neural network model, the categories of the food materials are identified by using the trained convolutional neural network, the time parameter associated with the food material is obtained according to the identification result, and the optimal recommended cooking time of the food material is calculated according to the time parameter, so that the food material is cooked in the optimal state.
The scheme provided by the invention not only enables the food material to keep the best taste and nutrition, but also facilitates the user to control the cooking process.
EXAMPLE III
A third embodiment of the present invention describes in detail the complete process of cooking a food material with an electric chafing dish according to the recommended cooking time. Referring to fig. 4, a flowchart of a method for identifying a food material image with a convolutional neural network and calculating the recommended cooking time of the food material from the identification result, the process is described below with reference to steps S410 to S460.
Step S410, after the electric chafing dish is started, a user starts a food material cooking time length recommending mode of the electric chafing dish.
It should be noted that the user can selectively turn on the cooking time period recommendation mode as needed, and when the user turns on the food cooking time period recommendation mode of the electric chafing dish, step S420 is executed.
And step S420, collecting the food material image by using a camera.
It should be noted that the camera is mounted on the body of the electric chafing dish; after the food material images are collected by the camera, they are sent to the cloud over a wireless network, and step S430 is executed.
And step S430, identifying the food material image through a convolutional neural network at the cloud.
Before the food material image is identified through the cloud convolutional neural network, firstly, a convolutional neural network model is built, model parameters of the convolutional neural network model are trained through a food material training set, and the trained convolutional neural network is obtained.
It should be noted that the convolutional neural network model is constructed as follows.
The convolutional neural network model includes:
a convolutional layer, the input end of which receives the food material image, for performing convolution processing on the received food material image so as to extract and output a corresponding convolution characteristic image;
a pooling layer, the input end of which is connected with the output end of the convolutional layer, for pooling the convolution characteristic image output by the convolutional layer and outputting the pooled convolution characteristic image;
and a full link layer, the input end of which is connected with the output end of the pooling layer, for classifying and identifying the pooled convolution characteristic image and outputting the classification and identification result.
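As an illustrative sketch rather than the patented implementation, the convolution, pooling and full-link pipeline described above can be traced with a minimal NumPy forward pass; the image size, kernel, class count and random weights are made-up placeholders:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution of a single-channel image with one kernel."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature, size=2):
    """Non-overlapping max pooling over size x size windows."""
    h, w = feature.shape
    h, w = h - h % size, w - w % size
    return feature[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def fully_connected(features, weights, bias):
    """Linear layer followed by softmax over food material classes."""
    logits = features.flatten() @ weights + bias
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# Forward pass: image -> convolution -> pooling -> full link -> class scores
rng = np.random.default_rng(0)
image = rng.random((8, 8))             # stand-in for a food material image
kernel = rng.random((3, 3))
feat = max_pool(np.maximum(conv2d(image, kernel), 0))  # ReLU then pooling
n_classes = 3                          # e.g. pork, beef, mutton
w = rng.random((feat.size, n_classes))
b = np.zeros(n_classes)
probs = fully_connected(feat, w, b)    # class probabilities sum to 1
```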
It should be noted that the convolutional neural network model is trained as follows to obtain the trained convolutional neural network.
A food material training set is obtained to train the convolutional neural network model. The training set comes from network images or from images shot by users, and each image in the set is marked with a food material label before the set is used for training.
The food material labels may be coarse categories such as pork, beef and mutton, and may be further refined into specific names such as pig trotters and beef tripe.
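A minimal illustration of such labeling, using a hypothetical label-to-index table that follows the coarse and fine-grained names mentioned above (the `LABELS` mapping, the `annotate` helper and the file names are assumptions, not part of the patent):

```python
# Hypothetical label index for the food material training set; names follow
# the examples in the text (coarse labels plus finer-grained sub-labels).
LABELS = {
    "pork": 0,
    "beef": 1,
    "mutton": 2,
    "pig_trotter": 3,   # finer-grained sub-label of pork
    "beef_tripe": 4,    # finer-grained sub-label of beef
}

def annotate(image_paths, label_name):
    """Pair every image path with the integer index of its food material label."""
    return [(path, LABELS[label_name]) for path in image_paths]

samples = annotate(["img_001.jpg", "img_002.jpg"], "beef")
print(samples)  # [('img_001.jpg', 1), ('img_002.jpg', 1)]
```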
Step S431: initialize the model parameters of the convolutional neural network model, specifically the model parameters of the convolutional layer, the pooling layer and the full link layer.
Step S432: input the images of the food material training set, marked with food material labels, into the initialized convolutional neural network model.
Step S433: output the output value of the classification recognition result for the food material training set images.
The output value in this step is obtained as follows: the food material training set images marked with food material labels are propagated forward through the convolutional layer, the pooling layer and the full link layer of the convolutional neural network model in turn, and the output value of the classification recognition result is produced.
Step S434: determine the error value between the output value of the classification recognition result and the preset target value.
Step S435: compare the error value with a preset error threshold, specifically:
when the error value is larger than the preset error threshold, the error value is propagated back through the convolutional neural network, the weights of the network are updated according to the back-propagated error, and step S432 is executed again;
when the error value is smaller than or equal to the preset error threshold, the model parameters of the trained convolutional neural network model are output, and the trained convolutional neural network is obtained.
Step S436: test the trained convolutional neural network with a food material test set to evaluate its accuracy.
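The control flow of steps S431 to S435 (initialize parameters, forward pass, measure the error, and propagate the error back until it falls below the threshold) can be sketched on a toy model. A single sigmoid layer stands in for the full convolutional network here, and the data, learning rate and error threshold are illustrative only:

```python
import numpy as np

def train(X, y, lr=0.5, error_threshold=0.05, max_iters=10000):
    """Training loop following steps S431-S435, with one sigmoid layer
    standing in for the convolutional network."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])   # S431: initialize parameters
    for _ in range(max_iters):
        out = 1.0 / (1.0 + np.exp(-X @ w))       # S432/S433: forward pass
        error = np.mean((out - y) ** 2)          # S434: error vs. target value
        if error <= error_threshold:             # S435: stop when small enough
            return w, error
        grad = X.T @ ((out - y) * out * (1 - out)) / len(y)
        w -= lr * grad                           # propagate error back, update weights
    return w, error

# Toy, linearly separable data standing in for food material features
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
y = np.array([1.0, 1.0, 0.0, 0.0])
w, err = train(X, y)
print(err <= 0.05)
```

A held-out test set, as in step S436, would then be passed through the trained model to estimate its accuracy.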
Step S440: identify the food material with the trained convolutional neural network and output the food material label.
For example, when the electric chafing dish collects a food material image whose label is beef, the trained convolutional neural network identifies the image and outputs the beef food material label.
Step S450: acquire the time parameters associated with the food material according to the food material label.
Specifically: the empirical cooking time of the food material is obtained according to the food material label, where the empirical cooking time is the preset cooking time of the food material;
and the current average cooking time of the food material is obtained according to its cooking history, where the current average cooking time is the average of the last actual cooking time and the previously obtained average cooking time.
For example, a time parameter associated with "beef" is obtained from the "beef" food material tag.
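The running average described in step S450, that is, the current average cooking time as the mean of the last actual cooking time and the previously stored average, can be written as a one-line update (the function name and example durations are illustrative):

```python
def update_average(last_actual, last_average):
    """Current average cooking time: the mean of the most recent actual
    cooking time and the previously stored average (per step S450)."""
    return (last_actual + last_average) / 2.0

# e.g. beef was last cooked for 90 s while the stored average was 80 s
print(update_average(90.0, 80.0))  # 85.0
```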
Step S460: calculate the recommended cooking time of the beef according to the time parameters associated with the beef label, and determine the actual cooking time according to the recommended cooking time.
The empirical cooking time and the current average cooking time are weighted and summed to obtain the recommended cooking time of the food material.
Specifically: the recommended cooking time of the beef is calculated from the empirical cooking time of the beef, the current average cooking time of the beef, and the weights assigned to each, according to the following formula:
C=a*A+b*B,
wherein:
C is the current recommended cooking time of the beef;
A is the empirical cooking time of the beef;
B is the current average cooking time of the beef;
a and b are the respective weights, which are user-defined or default.
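A minimal sketch of the weighted sum C = a*A + b*B, assuming default equal weights (the function name and the example durations are illustrative, not values from the patent):

```python
def recommended_cooking_time(empirical, current_average, a=0.5, b=0.5):
    """C = a*A + b*B: weighted sum of the empirical cooking time (A) and the
    current average cooking time (B); a and b are user-defined or default."""
    return a * empirical + b * current_average

# e.g. empirical beef time 100 s, current average 85 s, default equal weights
print(recommended_cooking_time(100.0, 85.0))  # 92.5
```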
It should be noted that the actual cooking time may be determined according to the recommended cooking time as follows: the current recommended cooking time is pushed to the user, the user sets the current actual cooking time according to it, and the food material is cooked for the actual cooking time set by the user.
Alternatively, the actual cooking time may be determined by automatically setting the current recommended cooking time as the current actual cooking time and cooking the food material accordingly. For example, after the current recommended cooking time is pushed to the user, if the user does not operate the electric chafing dish within a set period, the electric chafing dish automatically sets the recommended cooking time as the current actual cooking time.
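The two ways of determining the actual cooking time described above, the user's own setting or the recommended value as a fallback when the user does not respond within the set period, reduce to a simple selection (a sketch; the function name and the None convention are assumptions):

```python
def actual_cooking_time(recommended, user_setting=None):
    """Fallback logic from the text: use the time the user sets after the
    push; if the user did not respond within the set period (modeled here as
    user_setting being None), fall back to the recommended cooking time."""
    return user_setting if user_setting is not None else recommended

print(actual_cooking_time(92.5))         # no user input within the period: 92.5
print(actual_cooking_time(92.5, 120.0))  # user override: 120.0
```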
In summary, in the embodiments of the present invention, a convolutional neural network model is constructed and trained to obtain a trained convolutional neural network; the trained network identifies the category of the food material, a time parameter associated with the food material is obtained according to the identification result, and the optimal recommended cooking time of the food material is calculated from that parameter, so that the food material is cooked to its optimal state. By contrast, when a prior-art electric chafing dish is used to cook food, the cooking time is often determined only by the user's subjective feeling, nutrition is easily lost from the food material, and long-term accumulation can harm the user's health.
The scheme provided by the invention not only preserves the best taste and nutrition of the food material, but also makes it easier for the user to control the cooking process.
Example four
A fourth embodiment of the present invention provides an electric chafing dish, including a processor and a memory. The processor executes a program, stored in the memory, of the food material cooking time recommendation method applied to the electric chafing dish, so as to implement that method. For convenience and brevity of description, for the specific working process of the electric chafing dish, reference may be made to the corresponding process in the foregoing method, which is not described in detail here.
In summary, in the embodiments of the present invention, a convolutional neural network model is constructed and trained to obtain a trained convolutional neural network; the trained network identifies the category of the food material, a time parameter associated with the food material is obtained according to the identification result, and the optimal recommended cooking time of the food material is calculated from that parameter, so that the food material is cooked to its optimal state. By contrast, when a prior-art electric chafing dish is used to cook food, the cooking time is often determined only by the user's subjective feeling, nutrition is easily lost from the food material, and long-term accumulation can harm the user's health.
The scheme provided by the invention not only preserves the best taste and nutrition of the food material, but also makes it easier for the user to control the cooking process.
The above description is only a preferred embodiment of the present invention and is not intended to limit it; those skilled in the art may make various modifications and changes. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within its protection scope. It should be noted that like reference numbers and letters refer to like items in the figures, so that once an item is defined in one figure it need not be further defined or explained in subsequent figures.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto; any change or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed herein shall be covered by the protection scope of the present invention.

Claims (10)

1. A method for recommending cooking time of food materials is characterized by comprising the following steps:
acquiring a food material image;
identifying the food material in the food material image by using a convolutional neural network; acquiring a time parameter associated with the food material according to the identification result of the food material; determining the recommended cooking time of the food material according to the time parameter; and pushing the recommended cooking time so as to cook the food material.
2. The method of claim 1, wherein prior to identifying the food material in the food material image using the convolutional neural network, further comprising the steps of:
and constructing a convolutional neural network model, and training model parameters of the convolutional neural network model by using a food material training set to obtain the trained convolutional neural network.
3. The method according to claim 2, characterized in that the convolutional neural network model includes:
at least one convolutional layer, the input end of which receives the food material image, for performing convolution processing on the received food material image so as to extract and output a corresponding convolution characteristic image;
at least one pooling layer, the input end of which is connected with the output end of the convolutional layer, for pooling the convolution characteristic image output by the convolutional layer and outputting the pooled convolution characteristic image;
and a full link layer, the input end of which is connected with the output end of the pooling layer, for classifying and identifying the pooled convolution characteristic image and outputting an identification result.
4. The method according to claim 2, wherein the model parameters of the convolutional neural network model are trained using a food material training set to obtain a trained convolutional neural network, specifically:
step S10, initializing the model parameters of the convolutional neural network model;
step S20, inputting the food material training set images into a convolutional neural network model, wherein the convolutional layer, the pooling layer and the full link layer of the convolutional neural network model sequentially process the food material training set images, and output the classification recognition output values of the food material training set images;
step S30, determining an error value between the classification recognition output value of the food material training set image and a preset classification recognition target value;
when the error value is greater than the preset threshold value, propagating the error value back through the convolutional neural network to correct the model parameters of the convolutional neural network, and then returning to step S20;
and when the error value is less than or equal to the preset threshold value, outputting the trained convolutional neural network.
5. The method according to claim 1, wherein a time parameter associated with the food material is obtained according to the identification result of the food material, and the recommended cooking time of the food material is determined according to the time parameter, specifically:
the time parameters associated with the food material comprise an empirical cooking duration and a current average cooking duration;
acquiring the empirical cooking time of the food material according to the type of the food material;
acquiring the current average cooking time of the food material according to the cooking history of the food material, wherein the current average cooking time is the average value of the last actual cooking time and the last obtained average cooking time;
and carrying out weighted summation on the empirical cooking time and the current average cooking time to obtain the current recommended cooking time of the food material.
6. The method of claim 5, further comprising the steps of:
pushing the current recommended cooking time to the user so that the user can set the current actual cooking time according to the current recommended cooking time, and cooking the food material according to the current actual cooking time set by the user.
7. The method of claim 5, further comprising the steps of:
and automatically setting the current recommended cooking time to be the current actual cooking time, and cooking the food material according to the current actual cooking time.
8. The method of claim 1, further comprising, before acquiring the food material image, the steps of: starting the food material cooking time recommendation working mode, and acquiring the food material image when this working mode is on.
9. A storage medium, characterized by: which stores a program for implementing the method for recommending cooking time of food material according to any of claims 1-8.
10. An electric chafing dish, characterized by comprising: a processor and a memory, the processor being configured to execute a program, stored in the memory, of the method of any one of claims 1-8, so as to implement the food material cooking time recommendation method.
CN201910903261.7A 2019-09-23 2019-09-23 Food cooking time length recommendation method, storage medium and electric chafing dish Pending CN110619371A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910903261.7A CN110619371A (en) 2019-09-23 2019-09-23 Food cooking time length recommendation method, storage medium and electric chafing dish


Publications (1)

Publication Number Publication Date
CN110619371A true CN110619371A (en) 2019-12-27

Family

ID=68923973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910903261.7A Pending CN110619371A (en) 2019-09-23 2019-09-23 Food cooking time length recommendation method, storage medium and electric chafing dish

Country Status (1)

Country Link
CN (1) CN110619371A (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109343394A (en) * 2018-10-16 2019-02-15 珠海格力电器股份有限公司 A kind of control method and cooking apparatus of cooking apparatus


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mohammed A. Subhi et al., "A deep convolutional neural network for food detection and recognition," IEEE-EMBS Conference on Biomedical Engineering and Sciences. *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112528941A (en) * 2020-12-23 2021-03-19 泰州市朗嘉馨网络科技有限公司 Automatic parameter setting system based on neural network
CN112528941B (en) * 2020-12-23 2021-11-19 芜湖神图驭器智能科技有限公司 Automatic parameter setting system based on neural network
CN113139436A (en) * 2021-03-31 2021-07-20 哈尔滨端点科技发展有限公司 Image recognition-based automatic control method, system and medium for hot pot temperature
CN113139436B (en) * 2021-03-31 2023-12-15 哈尔滨端点科技发展有限公司 Automatic control method, system and medium for temperature of chafing dish based on image recognition


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191227