CN110826574A - Food material maturity determination method and device, kitchen electrical equipment and server - Google Patents
- Publication number
- CN110826574A CN110826574A CN201910918784.9A CN201910918784A CN110826574A CN 110826574 A CN110826574 A CN 110826574A CN 201910918784 A CN201910918784 A CN 201910918784A CN 110826574 A CN110826574 A CN 110826574A
- Authority
- CN
- China
- Prior art keywords
- maturity
- image
- food material
- frame image
- cooking
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
Abstract
The application relates to the technical field of intelligent equipment, and discloses a food material maturity determination method and device, kitchen electrical equipment, and a server. The method comprises the following steps: acquiring a current frame image and a starting frame image of a food material being cooked in the electric equipment, wherein the starting frame image is an image of the food material when cooking starts; performing fusion processing on the current frame image and the starting frame image to obtain a current fusion-processed image; and inputting the current fusion-processed image into a configured maturity machine learning training algorithm model to determine the maturity of the food material. The maturity of the food material is thus determined through feature extraction and model prediction on the fused dual image, which improves the reliability of the maturity determination.
Description
Technical Field
The application relates to the technical field of intelligent equipment, for example, to a method and a device for determining food material maturity, kitchen electrical equipment and a server.
Background
With the progress of science and technology and the development of artificial intelligence, intelligent algorithms are increasingly applied to smart home appliances such as refrigerators, air conditioners, and ovens. Among these, ovens, microwave ovens, air fryers, and similar appliances can intelligently determine cooking time, cooking power, and other parameters according to information such as the type and weight of the food material.
At present, some electric cooking appliances can judge the maturity of food materials: a temperature sensor detects the temperature inside the food material, and the maturity is judged from that temperature. However, during roasting the temperature of the food material is unstable, so this maturity determination is not reliable enough. Moreover, detecting the temperature with a contact sensor may damage the appearance of the food material and may contaminate it.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description presented later.
The embodiment of the disclosure provides a method and a device for determining food material maturity, kitchen electrical equipment and a server, and aims to solve the technical problem that the reliability of determining the food material maturity is not high.
In some embodiments, the method comprises:
acquiring a current frame image and a starting frame image of a food material being cooked in the electric equipment, wherein the starting frame image is an image of the food material when cooking starts;
performing fusion processing on the current frame image and the starting frame image to obtain a current fusion-processed image;
and inputting the current fusion-processed image into a configured maturity machine learning training algorithm model to determine the maturity corresponding to the food material.
In some embodiments, the apparatus comprises:
an information acquisition module, configured to acquire a current frame image and a starting frame image of a food material being cooked in the kitchen electrical equipment, wherein the starting frame image is an image of the food material when cooking starts;
a feature extraction module, configured to perform fusion processing on the current frame image and the starting frame image to obtain a current fusion-processed image;
and a prediction determination module, configured to input the current fusion-processed image into a configured maturity machine learning training algorithm model and determine the maturity corresponding to the food material.
In some embodiments, the kitchen appliance comprises: the food maturity determining device.
In some embodiments, the server comprises: the food maturity determining device.
The method and the device for determining the maturity of food materials, the kitchen electrical equipment and the server provided by the embodiment of the disclosure can achieve the following technical effects:
A current frame image and a starting frame image captured during cooking of the food material are acquired, the two images are fused to obtain the corresponding current fusion-processed image, and that image is input into a configured maturity machine learning training algorithm model to determine the maturity of the food material. The maturity is thus determined from the degree of difference between the two images within the fused image, through feature extraction and model prediction on the fusion-processed image. This improves the reliability of the maturity determination, allows the maturity to be determined without contacting the food material, and reduces the probability of damaging the appearance of the food material or contaminating it.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, which are not limiting; in the drawings, elements bearing the same reference numerals denote like elements.
fig. 1 is a schematic flow chart of a food material maturity determination method provided by the embodiment of the disclosure;
FIG. 1-1 is a schematic diagram of a fusion processed image provided by an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a food material maturity determination method provided by the embodiment of the disclosure;
fig. 3 is a schematic structural diagram of a food material maturity determination system provided in an embodiment of the present disclosure;
fig. 4 is a flowchart illustrating a food material maturity determination method according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a food material maturity determination apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a food material maturity determination apparatus according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a food material maturity determination apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a food material maturity determination apparatus provided by the embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the embodiments, briefly summarized above, may be had by reference to the drawings. In the following description, numerous details are set forth for purposes of explanation in order to provide a thorough understanding of the disclosed embodiments; however, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices are shown in simplified form in order to simplify the drawings.
In the embodiments of the disclosure, a current frame image and a starting frame image captured during cooking of a food material are obtained, the two images are fused to obtain the corresponding current fusion-processed image, and that image is input into a configured maturity machine learning training algorithm model to determine the maturity of the food material. The maturity is thus determined from the degree of difference between the two images within the fused image, through feature extraction and model prediction, which improves the reliability of the determination. Moreover, the dual-image fusion processing used when configuring the model effectively avoids the non-convergence of the maturity regression network that differences in shape, color, texture, and the like between food materials would otherwise cause. In addition, the maturity can be determined without contacting the food material, reducing the probability of damaging its appearance or contaminating it.
Fig. 1 is a flowchart illustrating a food material maturity determination method according to an embodiment of the disclosure. As shown in fig. 1, the process of determining the maturity of food material includes:
step 101: acquiring a current frame image and a starting frame image of a cooked food material in the kitchen electrical equipment.
In an embodiment of the present disclosure, a kitchen appliance may include: ovens, microwave ovens, air fryers, and the like, which can be enclosed and cook food items. Generally, an image acquisition device may be configured in a kitchen electrical device, for example: the top is provided with a camera, and the image or video information in the kitchen electrical equipment can be collected through the image collecting equipment.
The image acquisition device may be started periodically to capture images inside the kitchen electrical equipment, or to record video for a set duration. Alternatively, it may be turned on or off under the control of a trigger signal, acquiring the corresponding image or video information. The trigger signal may be sent by the kitchen electrical equipment or by the server. For example, when the kitchen electrical equipment is determined to have started cooking, the equipment or the server sends a start instruction to turn on the configured image acquisition device and begin capturing video; when the door of the kitchen electrical equipment is opened, or the set time is reached, the equipment or the server sends a stop instruction, the image acquisition device is turned off, and video capture stops.
Therefore, in some embodiments, when the method is applied to the kitchen electrical equipment itself, acquiring the current frame image and the starting frame image of the food material being cooked may include: when cooking starts, controlling the camera to begin recording the cooking video, obtaining and saving the starting frame image corresponding to the food material, and obtaining the current frame image of the food material through the camera.
In some embodiments, a server in communication with the kitchen electrical appliance obtains the current frame image and the start frame image of the food material in the kitchen electrical appliance, that is, in the case that the method is applied to the server, the current frame image of the food material transmitted by the kitchen electrical appliance and the stored start frame image of the food material are received.
Step 102: and performing fusion processing on the current frame image and the initial frame image to obtain a current fusion processing image.
In this embodiment, the current frame image and the starting frame image are fused into a single frame image by dual-image fusion processing. This may include: fusing the current frame image and the starting frame image by interleaving the pixel rows of the two images to obtain the current fusion-processed image; or fusing the current frame image and the starting frame image by splicing the two images vertically, one above the other, to obtain the current fusion-processed image.
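The two fusion modes just described can be sketched in NumPy. The patent does not specify the exact interleaving pattern, so the alternating-row scheme and the function names below are illustrative assumptions:

```python
import numpy as np

def fuse_interleaved(current, start):
    """Fuse two same-shaped frames by alternating pixel rows:
    even rows from the current frame, odd rows from the start frame.
    (One plausible reading of the 'staggered pixel rows' mode.)"""
    assert current.shape == start.shape
    fused = current.copy()
    fused[1::2] = start[1::2]  # replace every odd row with the start-frame row
    return fused

def fuse_stacked(current, start):
    """Fuse two frames by splicing them vertically (current on top)."""
    return np.concatenate([current, start], axis=0)

# Toy 4x4 grayscale frames: a brighter "cooked" frame and a darker start frame.
cur = np.full((4, 4), 200, dtype=np.uint8)
sta = np.full((4, 4), 50, dtype=np.uint8)

interleaved = fuse_interleaved(cur, sta)  # same size as one frame
stacked = fuse_stacked(cur, sta)          # twice the height
```

Either output is a single image, so the downstream CNN sees both frames in one input and can compare them positionally.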
Fig. 1-1 is a schematic diagram of a fusion-processed image according to an embodiment of the present disclosure. In this embodiment, the kitchen electrical equipment may be an oven and the food material may be a cake. The current frame image of the cake during baking is obtained and then spliced vertically with the corresponding starting frame image captured when the cake was placed in the oven and baking started, yielding the current fusion-processed image shown in Fig. 1-1. Of course, the disclosure is not limited to this; other ways of fusing two frame images into one frame image are also applicable.
Step 103: and inputting the current fusion processing image into a configured maturity machine learning training algorithm model, and determining the maturity corresponding to the food material.
A convolutional neural network (CNN) can automatically learn image features at multiple levels through convolution and pooling operations, which mirrors the way people understand images. When a person recognizes an image, it is abstracted hierarchically: first color and brightness are understood, then local detail features such as edges, corners, and lines, then more complex information and structures such as textures and geometric shapes, and finally the concept of the whole object is formed. Each convolutional layer contains multiple convolution kernels; each kernel scans the whole image from left to right and top to bottom, and the output data it produces is called a feature map, i.e. image feature information. In a CNN, the earlier convolutional layers capture local, detailed information and have a small receptive field; that is, each pixel of the output image draws on only a small region of the input image. The receptive fields of later convolutional layers grow layer by layer and capture more complex, abstract information. After several convolutional layers, abstract representations of the image at different scales are obtained.
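The feature-map computation described above, a kernel scanning the image left to right and top to bottom, can be illustrated with a minimal NumPy convolution. The image and kernel here are toy examples for illustration only, not values from the patent:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide a kernel over the image left-to-right, top-to-bottom
    ('valid' padding) and return the resulting feature map."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A horizontal-gradient kernel responds where brightness changes left to
# right, e.g. at the boundary of a browning region on the food material.
image = np.zeros((5, 6))
image[:, 3:] = 1.0                   # right half of the toy image is brighter
kernel = np.array([[-1.0, 1.0]])     # simple 1x2 gradient detector

fmap = conv2d_valid(image, kernel)   # nonzero only at the brightness edge
```

Each output pixel uses only a 1x2 patch of the input, the "small receptive field" of an early layer; stacking more layers widens that patch.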
In a convolutional neural network, each convolution kernel and its convolutional layer can be viewed as a small system that judges one particular feature of the image. When each of these single-feature judgment systems performs its judgment task effectively, the complex system composed of a large number of such convolutional layers, the CNN as a whole, can complete the complex tasks that human beings assign to it.
Therefore, in the embodiment of the present disclosure, a maturity machine learning training algorithm model may be configured based on the convolutional neural network CNN and through a large amount of sample data.
In some embodiments, the process of configuring the maturity machine learning training algorithm model comprises: extracting feature image information from a number of sample images based on a convolutional neural network (CNN), wherein each sample image is generated by fusing a food material image with calibrated maturity and the corresponding starting image of that food material; and performing supervised training on the feature image information through a regression network to generate the maturity machine learning training algorithm model.
The maturity of the food material images is calibrated in advance, and each food material image has a corresponding starting image, so the same dual-image fusion processing is required: each food material image with calibrated maturity is fused with its corresponding starting image, either by interleaving pixel rows or by splicing the two images vertically, to obtain a fusion-processed image, i.e. a sample image. Feature image information can then be extracted from each sample image based on the CNN, and supervised training is performed on that information through a regression network to generate the maturity machine learning training algorithm model.
After the dual-image fusion processing, when the CNN extracts the feature information of the fused image, the convolution operation takes the positional relationship between pixels of the two component images into account. The judged maturity result therefore fuses information from the food material image and the starting image and measures the degree of difference between them: the larger the difference between the food material image and the starting image, the higher the maturity value; the smaller the difference, the lower the maturity value.
In addition, performing supervised training through the regression network on feature information extracted from the fused dual images effectively avoids the non-convergence of the maturity regression network that differences in shape, color, texture, and the like between food materials would otherwise cause. Differences between the same food at different maturities are amplified, while differences between different foods at the same maturity are reduced, so the features that determine maturity become more salient and the network converges more easily.
In some embodiments, to make the regression network converge better under supervised training, a loss function corresponding to equation (1) may be used, wherein ε is a convergence coefficient and Δx is the difference between the predicted value and the calibrated value.
In this embodiment, the regression network is used to predict the maturity of the food material, so the predicted value is the predicted maturity and the calibrated value is the calibrated maturity. The loss functions currently adopted in regression networks have no convergence coefficient ε. Without one, when the absolute value of Δx is greater than 1, convergence is fast and the linear derivative lets the network converge quickly; when the absolute value of Δx is less than 1, the smooth-L1 term is insensitive to outliers and abnormal values, its gradient change is relatively small, and training does not easily diverge, but convergence is slow, especially for large networks trained on large samples. In this embodiment, ε is therefore introduced into the smooth-L1 term as a dynamic coefficient: in the stage where the absolute value of Δx is less than 1, ε can be adjusted so that training convergence combines the benefit of the mean absolute error (MAE, also called L1 loss), whose gradient is not excessively small, with the prevention of the network divergence associated with the mean squared error (MSE, also called L2 loss).
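Equation (1) itself is not reproduced in this text. As an illustration only, the sketch below assumes a smooth-L1 form in which the convergence coefficient ε scales the quadratic branch; the actual placement of ε in the patented loss may differ:

```python
import numpy as np

def smooth_l1_eps(delta, eps=1.0):
    """Hedged reconstruction of a smooth-L1 loss with a convergence
    coefficient eps scaling the quadratic (|dx| < 1) branch. With
    eps = 1 this reduces to the standard smooth-L1 loss. The exact
    form of the patent's equation (1) is an assumption here."""
    delta = np.asarray(delta, dtype=float)
    quad = 0.5 * eps * delta ** 2   # |dx| < 1: quadratic, gentle gradient
    lin = np.abs(delta) - 0.5       # |dx| >= 1: linear, robust to outliers
    return np.where(np.abs(delta) < 1.0, quad, lin)
```

Raising ε steepens the gradient for small errors (countering slow convergence near zero), while the linear branch still caps the gradient for large errors.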
In configuring the maturity machine learning training algorithm model, the sample images are fused from food material images with calibrated maturity, i.e. each food material image carries a calibrated maturity value. Maturity calibration is a difficult problem: generally only an experienced cook can accurately judge the maturity of a given food among many kinds of food, yet such labels are a precondition for supervised network training in deep learning. In some embodiments, the maturity calibration process comprises: when cooking of the food material starts, controlling a camera to begin recording the cooking video; when the maturity of the food material reaches the set maturity, controlling the camera to stop recording; determining the total number of frames of the recorded cooking video; and determining the maturity corresponding to a given frame image according to that image's frame number, the total frame number, and the set maturity.
Whether the maturity of the food material has reached the set maturity may be determined by a cook observing the food material, although the determination is not limited to this; it may also be made by image comparison or similar means.
For example, the food material to be calibrated is placed in the oven and video recording starts; the critical maturity state of the baking food is observed manually, and recording stops once that state is determined to have been reached. The total number of frames n_all of the video is thereby obtained, and the maturity of the nth frame can be calculated as n / n_all.
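The calibration rule above can be written directly. The `set_maturity` scaling factor reflects the earlier statement that the set maturity enters the calculation, and assumes recording stops exactly at that maturity:

```python
def calibrate_maturity(frame_index, total_frames, set_maturity=1.0):
    """Label the n-th frame of a calibration video with a maturity value.
    The video runs from raw (frame 0) to the set maturity (last frame),
    so maturity is assigned proportionally: n / n_all, scaled by the
    maturity reached when recording stopped."""
    return (frame_index / total_frames) * set_maturity

# A 200-frame video recorded up to full maturity (1.0):
labels = [calibrate_maturity(n, 200) for n in (0, 50, 100, 200)]
```

Every frame of one recording thus becomes a labeled training sample without per-frame manual annotation.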
After the maturity machine learning training algorithm model is configured, the current fusion-processed image obtained in step 102 can be input into it. The CNN extracts the current image feature information of the fused image, the regression network trained under supervision predicts the maturity of the food material from that information, and the prediction result determines the maturity corresponding to the food material.
Therefore, in the embodiments of the disclosure, the current frame image and the starting frame image captured during cooking of the food material are obtained, the two images are fused, and the resulting fusion-processed image is input into the configured maturity machine learning training algorithm model to determine the maturity of the food material. Model prediction on the fusion-processed image, which combines the features of both images, determines the maturity from the degree of difference between the two images and improves the reliability of the determination; the dual-image fusion processing used when configuring the model also effectively avoids the non-convergence of the maturity regression network caused by differences in shape, color, texture, and the like between food materials. In addition, the maturity can be determined without contacting the food material, reducing the probability of damaging its appearance or contaminating it.
In the embodiments of the disclosure, the kitchen electrical equipment may determine the maturity of the food material locally, or may send the current frame image and the starting frame image to the server so that the server determines the maturity. After the maturity is determined, the reminder may take at least one of the following forms: displaying the maturity information on a display screen of the kitchen electrical equipment or playing it through a voice broadcasting device; and sending the maturity information to a terminal for display and reminding. Determining the food material maturity locally on the kitchen electrical equipment speeds up the determination and reduces the occupation of network resources.
In some embodiments, in the case that the method is applied to a server, after determining the maturity corresponding to the food material, the maturity is sent to the kitchen electrical equipment for information reminding of the maturity. The specific reminding manner of the kitchen electric equipment can be as described above. In this embodiment, the server determines the maturity of food materials, so that the occupation of resources of the kitchen electrical equipment is reduced, the memory is saved, and the multi-control function of the kitchen electrical equipment is improved.
The following operation flows are integrated into specific embodiments to illustrate the food material maturity determination process provided by the embodiments of the present disclosure.
In an embodiment of the present disclosure, the kitchen electrical equipment may be an oven, and a camera is configured on the top of the oven. And the kitchen electrical equipment is configured with a maturity machine learning training algorithm model through machine learning.
Fig. 2 is a flowchart illustrating a food material maturity determination method according to an embodiment of the disclosure. As shown in fig. 2, the process of determining the maturity of the food material includes:
step 201: acquiring a current frame image of the cooked food material in the oven.
Step 202: is the current frame image determined to be the start frame image? If so, go to step 203, otherwise, go to step 204.
Step 203: the current frame image is saved as the initial frame image, and the process returns to step 201.
Step 204: and fusing the current frame image and the initial frame image in a mode of arranging pixel rows in a double-image staggered manner to obtain a current fusion processing image.
Step 205: and inputting the current fusion processing image into a configured maturity machine learning training algorithm model, and determining the maturity corresponding to the food material.
Step 206: and displaying the maturity information on a display screen of the oven.
In this embodiment, the oven acquires the current frame image and the starting frame image during baking of the food material, fuses the two images to obtain the corresponding current fusion-processed image, and inputs that image into the configured maturity machine learning training algorithm model to determine the maturity of the food material. The maturity is thus determined from the degree of difference between the two images within the fused image, through feature extraction and model prediction, which improves the reliability of the determination; the maturity can be determined without contacting the food material, reducing the probability of damaging its appearance or contaminating it. Moreover, determining the maturity locally in the oven speeds up the determination and reduces the occupation of network resources.
In an embodiment of the present disclosure, the determination of the maturity of the food material may be performed by a server.
Fig. 3 is a schematic structural diagram of a food material maturity determination system provided by an embodiment of the disclosure. As shown in fig. 3, the system includes: kitchen electrical equipment 100, image acquisition equipment 200 arranged on the kitchen electrical equipment, and a server 300 communicating with the kitchen electrical equipment 100.
The kitchen electrical equipment 100 may obtain, through the image acquisition equipment 200, the current frame image and the start frame image of the food material being cooked inside it, and may send both images to the server 300. The server 300 is configured, through machine learning, with a maturity machine learning training algorithm model, so that it can perform feature extraction and prediction on the current fusion processing image obtained by fusing the current frame image with the start frame image, obtain the maturity of the food material, and send the maturity to the kitchen electrical equipment 100 for prompting.
Fig. 4 is a flowchart illustrating a food material maturity determination method according to an embodiment of the disclosure. With the system shown in fig. 3, the process of determining the food material maturity, shown in fig. 4, includes:
Step 401: the kitchen electrical equipment acquires, through the image acquisition equipment, the current frame image of the food material being cooked inside it.
Step 402: the kitchen electrical equipment determines whether the current frame image is the start frame image. If yes, go to step 403; otherwise, go to step 404.
Step 403: the kitchen electrical equipment saves the current frame image as the start frame image, and returns to step 401.
Step 404: the kitchen electrical equipment sends the current frame image and the start frame image to the server.
Step 405: the server fuses the current frame image and the start frame image by splicing the two images one above the other to obtain a current fusion processing image.
Step 406: the server inputs the current fusion processing image into the configured maturity machine learning training algorithm model to determine the maturity of the food material.
Step 407: the server sends the maturity to the kitchen electrical equipment.
Step 408: the kitchen electrical equipment displays the maturity information on a display screen and performs a voice broadcast when the maturity is greater than a set value.
In this embodiment, the server may obtain, via the kitchen electrical equipment, the current frame image and the start frame image captured during cooking of the food material, fuse the two images to obtain the corresponding current fusion processing image, and input that image into the configured maturity machine learning training algorithm model to determine the maturity of the food material. Because feature extraction and model prediction operate on the fused image, the maturity is determined from the degree of difference between the two images, which improves the reliability of the determination. The maturity can also be determined without contacting the food material, reducing the probability of damaging its appearance or contaminating it. In addition, because the server determines the maturity, resource occupation on the kitchen electrical equipment is reduced and its multi-control function is improved.
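The top-and-bottom splicing of step 405 is even simpler to sketch. Again, this is a hedged NumPy illustration with a hypothetical function name, since the patent gives no implementation details:

```python
import numpy as np

def stack_frames(start_frame: np.ndarray, current_frame: np.ndarray) -> np.ndarray:
    """Fuse two equal-sized frames by placing the start frame above the
    current frame, doubling the image height. A downstream model can
    then compare the two halves of the single fused input."""
    if start_frame.shape != current_frame.shape:
        raise ValueError("frames must have the same shape")
    return np.concatenate([start_frame, current_frame], axis=0)

# Toy frames at a typical thumbnail resolution.
start = np.zeros((120, 160, 3), dtype=np.uint8)
current = np.full((120, 160, 3), 180, dtype=np.uint8)
fused = stack_frames(start, current)
```

Compared with row interleaving, vertical stacking keeps each frame spatially intact, at the cost of a taller model input.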
According to the above processes for determining the maturity of the food material, a food material maturity determination apparatus can be constructed.
Fig. 5 is a schematic structural diagram of a food material maturity determination apparatus provided by the embodiment of the present disclosure. As shown in fig. 5, the food material maturity determination apparatus includes: an information acquisition module 510, an image fusion module 520, and a prediction determination module 530.
The information acquisition module 510 is configured to acquire the current frame image and the start frame image of the food material being cooked in the kitchen electrical equipment, where the start frame image is an image of the food material when cooking is started.
The image fusion module 520 is configured to fuse the current frame image and the start frame image to obtain a current fusion processing image.
The prediction determination module 530 is configured to input the current fusion processing image into the configured maturity machine learning training algorithm model and determine the maturity corresponding to the food material.
In some embodiments, the apparatus further includes: a model configuration module configured to extract feature image information of a plurality of sample images based on a convolutional neural network (CNN), where each sample image is obtained by fusing a food material image with calibrated maturity and the corresponding food material start image, and to perform supervised training on the feature image information through a regression network to generate a maturity machine learning training algorithm model. The loss function in the regression network includes:
where ε is a convergence coefficient and Δx is the difference between the predicted value and the calibrated value.
In some embodiments, the apparatus further includes: a calibration module configured to control the camera to start recording a cooking video when cooking of the food material starts, and to stop recording when the maturity of the food material reaches the set maturity; to determine the total number of frames of the recorded cooking video; and to determine the maturity corresponding to a first image according to the frame number of that image, the total frame number, and the set maturity.
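The patent does not give the calibration formula, but the quantities it names (frame number, total frame number, set maturity) suggest a linear interpolation over the recorded video. The sketch below is written under that assumption, with hypothetical names:

```python
def calibrate_maturity(frame_number: int, total_frames: int, set_maturity: float) -> float:
    """Assumed linear calibration: a frame's maturity label grows in
    proportion to its position in the recorded cooking video, reaching
    the set maturity at the final frame."""
    if total_frames <= 0:
        raise ValueError("total_frames must be positive")
    if not 0 <= frame_number <= total_frames:
        raise ValueError("frame_number out of range")
    return set_maturity * frame_number / total_frames

# With 1200 recorded frames and a set maturity of 100 (fully cooked),
# frame 300 would be labeled 25 under this linear assumption.
label = calibrate_maturity(300, 1200, 100.0)
```

Such an automatic labeling rule is what makes it cheap to produce the large calibrated sample sets the training step requires.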
The food material maturity determination apparatus provided by the embodiment of the present disclosure is exemplified below.
Fig. 6 is a schematic structural diagram of a food material maturity determination apparatus provided by an embodiment of the present disclosure. As shown in fig. 6, the apparatus may be applied to kitchen electrical equipment and includes: the information acquisition module 510, the image fusion module 520, and the prediction determination module 530, and further includes a model configuration module 540, a calibration module 550, and a processing module 560.
The calibration module 550 can calibrate the maturity of the sample food material: when cooking of the food material starts, the camera is controlled to start recording the cooking video, and when the maturity of the food material reaches the set maturity, the camera is controlled to stop recording; the total number of frames of the recorded video is determined; and the maturity corresponding to a first image is determined according to the frame number of that image, the total frame number, and the set maturity.
In this way, the model configuration module 540 may extract feature image information of a plurality of sample images based on the convolutional neural network CNN, where each sample image is obtained by fusing a food material image with calibrated maturity and the corresponding food material start image, and may perform supervised training on the feature image information through a regression network to generate the maturity machine learning training algorithm model. The loss function in the regression network includes:
where ε is a convergence coefficient and Δx is the difference between the predicted value and the calibrated value.
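The supervised regression training described above can be illustrated end to end with a deliberately tiny stand-in: one scalar "feature" per fused sample and a plain squared-error loss minimized by gradient descent. The patent's actual model is a CNN, and its loss (parameterized by the convergence coefficient ε and the error Δx) is not reproduced in this text, so everything below is an assumption-laden sketch of the training structure only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for CNN feature extraction: one scalar per fused sample image
# (e.g. a normalized brightness difference between its two halves).
features = rng.uniform(0.0, 1.0, size=64)
# Calibrated maturity labels, here generated from a known linear rule.
labels = 0.8 * features + 0.1

# Regression head: maturity = w * feature + b, trained with squared error.
w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    pred = w * features + b
    delta = pred - labels            # delta_x: predicted minus calibrated value
    w -= lr * np.mean(2.0 * delta * features)
    b -= lr * np.mean(2.0 * delta)

# After training, the head recovers the generating rule closely.
```

The real system replaces the scalar feature with CNN feature maps and the squared error with the ε-parameterized loss, but the supervise-against-calibrated-labels loop is the same shape.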
When cooking starts, the camera begins recording the cooking video; the information acquisition module 510 can acquire and store the start frame image corresponding to the food material, and can subsequently acquire the current frame image of the food material through the camera.
The image fusion module 520 can then fuse the current frame image and the start frame image to obtain a current fusion processing image.
The prediction determination module 530 may input the current fusion processing image into the maturity machine learning training algorithm model configured by the model configuration module 540 and determine the maturity corresponding to the food material, and the processing module 560 may display the maturity information on the display interface.
As can be seen, in this embodiment, the food material maturity determination apparatus applied to the kitchen electrical equipment can obtain the current frame image and the start frame image during cooking of the food material, fuse the two images to obtain the corresponding current fusion processing image, and input that image into the configured maturity machine learning training algorithm model to determine the maturity of the food material. Because feature extraction and model prediction operate on the fused image, the maturity is determined from the degree of difference between the two images, which improves the reliability of the determination. The maturity can also be determined without contacting the food material, reducing the probability of damaging its appearance or contaminating it. Moreover, because the kitchen electrical equipment determines the maturity locally, the determination is faster and network resource occupation is reduced.
Fig. 7 is a schematic structural diagram of a food material maturity determination apparatus provided by an embodiment of the present disclosure. As shown in fig. 7, the apparatus may be applied in a server and includes: the information acquisition module 510, the image fusion module 520, the prediction determination module 530, the model configuration module 540, and the calibration module 550, together with a sending module 570.
The calibration module 550 can calibrate the maturity of the sample food material: when cooking of the food material starts, the camera is controlled to start recording the cooking video, and when the maturity of the food material reaches the set maturity, the camera is controlled to stop recording; the total number of frames of the recorded video is determined; and the maturity corresponding to a first image is determined according to the frame number of that image, the total frame number, and the set maturity.
In this way, the model configuration module 540 may extract feature image information of a plurality of sample images based on the convolutional neural network CNN, where each sample image is obtained by fusing a food material image with calibrated maturity and the corresponding food material start image, and may perform supervised training on the feature image information through a regression network to generate the maturity machine learning training algorithm model. The loss function in the regression network includes:
where ε is a convergence coefficient and Δx is the difference between the predicted value and the calibrated value.
When cooking starts, the camera begins recording the cooking video, and the kitchen electrical equipment can capture the start frame image and the current frame image corresponding to the food material; the information acquisition module 510 can therefore receive, from the kitchen electrical equipment, the current frame image of the food material together with the stored start frame image.
Similarly, the image fusion module 520 can fuse the current frame image and the start frame image to obtain a current fusion processing image.
The prediction determination module 530 may input the current fusion processing image into the maturity machine learning training algorithm model configured by the model configuration module 540 and determine the maturity corresponding to the food material, and the sending module 570 can send the maturity to the kitchen electrical equipment for a maturity reminder.
As can be seen, in this embodiment, the food material maturity determination apparatus applied to the server may obtain, through the kitchen electrical equipment, the current frame image and the start frame image during cooking of the food material, fuse the two images to obtain the corresponding current fusion processing image, and input that image into the configured maturity machine learning training algorithm model to determine the maturity of the food material. Because feature extraction and model prediction operate on the fused image, the maturity is determined from the degree of difference between the two images, which improves the reliability of the determination. The maturity can also be determined without contacting the food material, reducing the probability of damaging its appearance or contaminating it. In addition, because the server determines the maturity, resource occupation on the kitchen electrical equipment is reduced and its multi-control function is improved.
An embodiment of the present disclosure provides a device for determining food material maturity, the structure of which is shown in fig. 8, including:
a processor 1000 and a memory 1001, and optionally a communication interface 1002 and a bus 1003. The processor 1000, the communication interface 1002, and the memory 1001 may communicate with each other through the bus 1003. The communication interface 1002 may be used for information transfer. The processor 1000 may call the logic instructions in the memory 1001 to execute the food material maturity determination method of the above embodiment.
In addition, when sold or used as an independent product, the logic instructions in the memory 1001 may be implemented in the form of software functional units and stored in a computer-readable storage medium.
The memory 1001 is a computer readable storage medium and can be used for storing software programs, computer executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 1000 executes the functional application and data processing by executing the program instructions/modules stored in the memory 1001, that is, implements the method for determining the maturity of food material in the above method embodiment.
The memory 1001 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required for at least one function, and the data storage area may store data created according to the use of the terminal device. In addition, the memory 1001 may include a high-speed random access memory, and may also include a nonvolatile memory.
The embodiment of the disclosure provides a kitchen electrical appliance, which comprises the food material maturity determining device.
The embodiment of the disclosure provides a server, which comprises the food material maturity determining device.
The embodiment of the disclosure provides a computer-readable storage medium, which stores computer-executable instructions configured to execute the food material maturity determination method.
An embodiment of the present disclosure provides a computer program product, including a computer program stored on a computer-readable storage medium, where the computer program includes program instructions, and when the program instructions are executed by a computer, the computer executes the food material maturity determination method.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The technical solution of the embodiments of the present disclosure may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes one or more instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present disclosure. The aforementioned storage medium may be a non-transitory storage medium, including: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or various other media capable of storing program code; it may also be a transitory storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims. Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms; the terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but they may not be the same element. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items.
Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method or apparatus that comprises the element. In this document, each embodiment may be described with emphasis on differences from other embodiments, and the same and similar parts between the respective embodiments may be referred to each other. For methods, products, etc. of the embodiment disclosures, reference may be made to the description of the method section for relevance if it corresponds to the method section of the embodiment disclosure.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be merely a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Claims (10)
1. A method for determining food material maturity is characterized by comprising the following steps:
acquiring a current frame image and a starting frame image of a cooked food material in kitchen electrical equipment, wherein the starting frame image is an image of the food material when cooking is started;
performing fusion processing on the current frame image and the initial frame image to obtain a current fusion processing image;
and inputting the current fusion processing image into a configured maturity machine learning training algorithm model, and determining the maturity corresponding to the food material.
2. The method of claim 1, wherein before the obtaining the current frame image and the starting frame image of the food material cooked in the kitchen electrical appliance, further comprising:
extracting feature image information of a plurality of sample images based on a convolutional neural network (CNN), wherein the sample images are obtained by fusing food material images with calibrated maturity and corresponding food material start images;
and performing supervision training on each feature image information through a regression network to generate the maturity machine learning training algorithm model.
4. The method of claim 2, wherein the maturity calibration process comprises:
under the condition that the cooking of food materials is started, controlling a camera to start recording of cooking videos, and under the condition that the maturity of the food materials reaches the set maturity, controlling the camera to stop recording of the cooking videos;
determining the total frame number of the recorded cooking videos;
and determining the maturity corresponding to the first image according to the frame number corresponding to the first image, the total frame number and the set maturity.
5. The method of claim 1, wherein the obtaining the current frame image and the starting frame image of the cooked food material in the kitchen appliance comprises:
when the method is applied to kitchen electrical equipment, when cooking is started, a camera is controlled to start recording of a cooking video, a starting frame image corresponding to the food material is obtained and stored, and a current frame image of the food material is obtained through the camera;
in the case that the method is applied to a server, receiving a current frame image of the food material sent by the kitchen electrical equipment and a stored starting frame image of the food material.
6. The method of claim 1 or 5, wherein after determining the maturity corresponding to the food material, further comprising:
when the method is applied to kitchen electrical equipment, information reminding of maturity is carried out;
and under the condition that the method is applied to a server, sending the maturity to the kitchen electric equipment for information reminding of the maturity.
7. An apparatus for determining the ripeness of a food material, comprising:
the cooking system comprises an information acquisition module, a processing module and a display module, wherein the information acquisition module is configured to acquire a current frame image and a starting frame image of a cooked food material in the kitchen electrical equipment, and the starting frame image is an image of the food material when cooking is started;
the image fusion module is configured to perform fusion processing on the current frame image and the initial frame image to obtain a current fusion processing image;
and the prediction determining module is configured to input the current fusion processing image into a configured maturity machine learning training algorithm model, and determine the maturity corresponding to the food material.
8. The apparatus of claim 7, further comprising:
the model configuration module is configured to extract feature image information of a plurality of sample images based on a convolutional neural network CNN, wherein the sample images are obtained by performing fusion processing on the food material images with the calibrated maturity and corresponding food material initial images; performing supervision training on each feature image information through a regression network to generate the maturity machine learning training algorithm model; the loss function in the regression network includes:
wherein epsilon is a convergence coefficient, and deltax is a difference value between a predicted value and a calibration value;
the calibration module is configured to control the camera to start recording of the cooking video when the cooking of the food materials is started, and control the camera to stop recording of the cooking video when the maturity of the food materials reaches the set maturity; determining the total frame number of the recorded cooking videos; and determining the maturity corresponding to the first image according to the frame number corresponding to the first image, the total frame number and the set maturity.
9. A kitchen appliance, characterized in that it comprises a device according to any one of claims 7 to 8.
10. A server, characterized in that it comprises a device according to any one of claims 7 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910918784.9A CN110826574A (en) | 2019-09-26 | 2019-09-26 | Food material maturity determination method and device, kitchen electrical equipment and server |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910918784.9A CN110826574A (en) | 2019-09-26 | 2019-09-26 | Food material maturity determination method and device, kitchen electrical equipment and server |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110826574A true CN110826574A (en) | 2020-02-21 |
Family
ID=69548599
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910918784.9A Pending CN110826574A (en) | 2019-09-26 | 2019-09-26 | Food material maturity determination method and device, kitchen electrical equipment and server |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110826574A (en) |
2019-09-26: Application CN201910918784.9A filed; publication of CN110826574A (status: pending)
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021110066A1 (en) * | 2019-12-06 | 2021-06-10 | 广东美的白色家电技术创新中心有限公司 | Food maturity level identification method and device, and computer storage medium |
US20220262143A1 (en) * | 2019-12-06 | 2022-08-18 | Guangdong Midea White Home Appliance Technology Innovation Center Co., Ltd. | Method of Identifying Level of Doneness of Food, Device, and Computer Storage Medium |
US12094228B2 (en) * | 2019-12-06 | 2024-09-17 | Guangdong Midea White Home Appliance Technology Innovation Center Co., Ltd | Method of identifying level of doneness of food, device, and computer storage medium |
CN112741508A (en) * | 2021-01-27 | 2021-05-04 | 九阳股份有限公司 | Control method of cooking equipment and cooking equipment |
CN113283447A (en) * | 2021-05-28 | 2021-08-20 | 青岛海尔科技有限公司 | Food baking method and device, storage medium and electronic device |
CN113283447B (en) * | 2021-05-28 | 2023-12-19 | 青岛海尔科技有限公司 | Food baking method and device, storage medium and electronic device |
CN113436159A (en) * | 2021-06-21 | 2021-09-24 | 青岛海尔科技有限公司 | Method and device for detecting maturity of food material, storage medium and electronic device |
CN113793314A (en) * | 2021-09-13 | 2021-12-14 | 河南丹圣源农业开发有限公司 | Pomegranate maturity identification equipment and use method |
CN116594367A (en) * | 2023-07-19 | 2023-08-15 | 烟台金潮果蔬食品有限公司 | Cooking degree control system of sweet potato juice spiral precooking machine |
CN116594367B (en) * | 2023-07-19 | 2023-09-19 | 烟台金潮果蔬食品有限公司 | Cooking degree control system of sweet potato juice spiral precooking machine |
CN117237939A (en) * | 2023-11-16 | 2023-12-15 | 沈阳东方和利厨业有限公司 | Image data-based detection method and device for food maturity of young cooker |
CN117237939B (en) * | 2023-11-16 | 2024-01-30 | 沈阳东方和利厨业有限公司 | Image data-based detection method and device for food maturity of young cooker |
Similar Documents
Publication | Title
---|---|
CN110826574A (en) | Food material maturity determination method and device, kitchen electrical equipment and server |
KR102329592B1 (en) | Food preparation methods and systems based on ingredient recognition |
CN107752794B (en) | Baking method and device |
WO2021110066A1 (en) | Food maturity level identification method and device, and computer storage medium |
CN111752170B (en) | Intelligent cooking method and device |
EP4028591A1 (en) | Method and system for controlling machines based on object recognition |
CN108197635B (en) | Cooking mode display method and device and range hood |
CN111199249A (en) | Food material identification and update control method and device and refrigeration equipment |
CN114266959A (en) | Food cooking method and device, storage medium and electronic device |
CN113239780A (en) | Food material determining method and device, electronic equipment, refrigerator and storage medium |
CN111435229A (en) | Method and device for controlling cooking mode and cooking appliance |
CN114092806A (en) | Recognition method and device thereof, cooking equipment and control method thereof and storage medium |
CN111435541A (en) | Method, device and cooking utensil for obtaining chalkiness of rice grains |
CN111323551A (en) | Food freshness prompting method and device and household appliance |
CN111419096B (en) | Food processing method, controller and food processing equipment |
CN112750158A (en) | Method and device for detecting volume of food material and kitchen electrical equipment |
CN116129334A (en) | Cleaning judging method and device, storage medium and cooking equipment |
CN111159831A (en) | Method and device for predicting freshness of food materials and household appliance |
KR20230011181A (en) | Cooking apparatus and controlling method thereof |
CN113723498A (en) | Food maturity identification method, device, system, electric appliance, server and medium |
CN114485020A (en) | Thawing control method and device and thawing equipment |
CN113283447A (en) | Food baking method and device, storage medium and electronic device |
CN112558490A (en) | Food material baking control method and device and kitchen electrical equipment |
US20230389578A1 (en) | Oven appliances and methods of automatic reverse sear cooking |
WO2021082284A1 (en) | Baking mold specification detection method and apparatus, and kitchen appliance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |