CN115177159A - Control system of cooking utensil - Google Patents


Info

Publication number
CN115177159A
Authority
CN
China
Prior art keywords
cooking
image
module
cooking appliance
food
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110361195.2A
Other languages
Chinese (zh)
Inventor
杨晖
李超
杜群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BSH Electrical Appliances Jiangsu Co Ltd
BSH Hausgeraete GmbH
Original Assignee
BSH Electrical Appliances Jiangsu Co Ltd
BSH Hausgeraete GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BSH Electrical Appliances Jiangsu Co Ltd, BSH Hausgeraete GmbH filed Critical BSH Electrical Appliances Jiangsu Co Ltd
Priority to CN202110361195.2A
Publication of CN115177159A
Legal status: Pending

Classifications

    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47JKITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J37/00Baking; Roasting; Grilling; Frying
    • A47J37/06Roasters; Grills; Sandwich grills
    • A47J37/0623Small-size cooking ovens, i.e. defining an at least partially closed cooking cavity
    • A47J37/0664Accessories
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47JKITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J27/00Cooking-vessels
    • A47J27/04Cooking-vessels for cooking food in steam; Devices for extracting fruit juice by means of steam ; Vacuum cooking vessels
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47JKITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J36/00Parts, details or accessories of cooking-vessels
    • A47J36/32Time-controlled igniting mechanisms or alarm devices
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47JKITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J27/00Cooking-vessels
    • A47J27/04Cooking-vessels for cooking food in steam; Devices for extracting fruit juice by means of steam ; Vacuum cooking vessels
    • A47J2027/043Cooking-vessels for cooking food in steam; Devices for extracting fruit juice by means of steam ; Vacuum cooking vessels for cooking food in steam

Landscapes

  • Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • Electric Ovens (AREA)

Abstract

A control system of a cooking appliance comprises the cooking appliance and a server in communication connection with it. The server identifies a received real-time image according to a monitoring cooking state model to obtain an identification result, the model comprising at least standard cooking state images representing the cooked object at different cooking nodes. It then obtains a corresponding control instruction according to the identification result and sends it to the cooking appliance, which receives the instruction and operates accordingly. With this embodiment, cooking abnormalities can be monitored during the cooking process through image recognition, and the cooking appliance can be controlled in real time according to the cooking state of the food, realizing dynamic and intelligent control of the appliance, optimizing the cooking result, and improving the user experience.

Description

Control system of cooking utensil
Technical Field
The embodiment of the invention relates to the technical field of household appliances, in particular to a cooking appliance control system.
Background
In recent years, cooking apparatuses, including microwave ovens and the like, have come into ever wider use, becoming indispensable home appliances in many households. Generally, an existing cooking apparatus can only heat food mechanically according to parameters such as time and temperature set by the user. As a result, the food is easily overcooked or even burnt, or it remains undercooked under the set parameters, requiring the user to increase the cooking time or temperature. Such cooking equipment is therefore insufficiently intelligent, the cooking effect is poor, and the user experience suffers.
In view of this situation, one prior-art cooking device acquires an image of its interior through a camera and presents it to the user, so that the user can observe the cooking state of the food. However, such a scheme of presenting the internal cooking state in real time still requires the user to adjust the cooking parameters according to the observed image; without user intervention, the cooking effect may suffer.
In addition, existing cooking devices can only heat food mechanically according to the time and temperature parameters set by the user. In this case the food is easily overcooked or even burnt, or it fails to reach maturity under the set parameters, and the user must cook it again. Moreover, these situations are discovered only when the user opens the cooking device: if overcooking has already occurred, there is no remedy; if the food is undercooked, it must be reheated with a newly set time and temperature, and the taste of reheated food cannot be guaranteed. The cooking equipment is therefore insufficiently intelligent, and the user experience is poor.
Disclosure of Invention
It is an object of embodiments of the present invention to provide an improved cooking appliance control system.
To this end, an embodiment of the invention provides a cooking appliance control system comprising a cooking appliance and a server in communication connection with it. The cooking appliance comprises an image acquisition module for acquiring at least a real-time image of the cooked object corresponding to a cooking node. The server comprises a receiving module, a processing module and a sending module: the receiving module receives the real-time image sent by the cooking appliance, the image containing the cooking state of the cooked object at at least one cooking node; the processing module identifies the received real-time image according to a monitoring cooking state model to obtain an identification result, the model comprising at least standard cooking state images representing the cooked object at different cooking nodes; the sending module obtains a corresponding control instruction according to the identification result and sends it to the cooking appliance. The cooking appliance receives the control instruction and operates according to it.
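The patent describes this flow (capture a real-time image, recognize it against a node's standard image, map the result to a control instruction) only functionally, not as an algorithm. As an illustration under stated assumptions, a minimal server-side sketch might look like this; all function names and the single "doneness" feature are hypothetical:

```python
# Hypothetical sketch of the server-side flow described above.
# The patent specifies no algorithm; this only illustrates the data path:
# real-time image -> recognition against a node's standard image -> instruction.

def identify(real_time_image, standard_image, tolerance=10):
    """Compare a scalar 'doneness' feature; return 'over', 'under' or 'ok'."""
    diff = real_time_image["doneness"] - standard_image["doneness"]
    if diff > tolerance:
        return "over"       # cooking state exceeds the node's expected state
    if diff < -tolerance:
        return "under"      # cooking state has not reached the expected state
    return "ok"

def control_instruction(result):
    """Map a recognition result to a control instruction for the appliance."""
    return {"over": "reduce_time_or_temperature",
            "under": "increase_time_or_temperature",
            "ok": "continue"}[result]

# One monitoring step at a cooking node:
standard = {"doneness": 50}   # feature of the node's standard image
image = {"doneness": 65}      # feature extracted from the real-time image
instruction = control_instruction(identify(image, standard))
print(instruction)            # -> reduce_time_or_temperature
```

In a real system the "feature" would be the output of a trained image model rather than a single number; the point is only the recognition-to-instruction mapping.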
By adopting the embodiment, the abnormal cooking can be monitored in the cooking process through the image acquisition module and the image recognition module, the cooking appliance is controlled in real time according to the cooking state of food in the cooking process, the dynamic control and the intelligent control of the cooking appliance are realized, the cooking result is optimized, and the user experience is improved.
Wherein the real-time image comprises a first image and a second image; the processing module is further used for respectively identifying the first image and the second image according to the cooking state standard image so as to obtain corresponding identification results; the processing module also comprises a priority determining module which is used for determining the priority of the identification result according to a preset priority standard; the sending module is further used for obtaining a corresponding control instruction according to the identification result with the higher priority and sending the control instruction to the cooking appliance.
Optionally, the priority criterion is determined according to a degree of deviation of the real-time image from a standard image, and the more the real-time image deviates from the standard image, the higher the priority of the identification result corresponding to the real-time image is.
Optionally, the image acquisition module includes a first camera and a second camera, and the acquisition viewing angles of the first camera and the second camera are different.
Optionally, the first camera and the second camera are respectively arranged on the door body of the cooking appliance and the top of the inner cavity of the cooking appliance.
Monitoring the cooking state of the food from different angles through a plurality of cameras makes it possible to detect abnormal cooking states sooner, so that the cooking process can be intervened in early and the cooking result can reach the expected effect. Further, when cooking state results for several current nodes are obtained and identified, the identification result differing most from the standard image is output as the priority result and the cooking appliance is controlled based on it, so that local over-cooking or under-cooking can be avoided and the cooking result better kept in line with expectations. For example, when a camera detects that a portion of the food (e.g., the rear portion near the heating device) is already burnt, the cooking time may be shortened or the cooking temperature lowered based on the monitoring result.
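The priority rule above (the image deviating most from its standard wins) can be sketched as follows; the camera names and the scalar deviation measure are illustrative assumptions, not from the patent:

```python
# Hypothetical illustration of the priority rule: the real-time image that
# deviates most from its standard image yields the higher-priority result.

def deviation(real_time_feature, standard_feature):
    """Scalar deviation between image features (a single channel here)."""
    return abs(real_time_feature - standard_feature)

def prioritized_result(results):
    """results: list of (camera_name, real_time_feature, standard_feature).
    Return the camera whose image deviates most from its standard."""
    return max(results, key=lambda r: deviation(r[1], r[2]))[0]

# First camera (on the door) vs second camera (top of the cavity):
readings = [("door_camera", 55, 50), ("top_camera", 80, 50)]
print(prioritized_result(readings))   # -> top_camera (largest deviation)
```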
Optionally, the processing module is further configured to compare the image parameters of the real-time image with those of the standard image to determine the recognition result. Monitoring the cooking state of the food at preset cooking nodes during cooking ensures that the cooking process can be intervened in at the key nodes, so that the cooking parameters are adjusted in time and any deviation from the expected state is corrected promptly, making the doneness of the food come out as desired.
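One concrete form such an image-parameter comparison could take is a mean-color distance between the real-time image and the standard image; this is only a hedged illustration (the patent mentions color and shape parameters but prescribes no specific metric, and the threshold is invented):

```python
# Sketch of comparing image parameters (here mean color) against the
# standard image. Pixel values and the threshold are illustrative.

def mean_color(pixels):
    """Average (R, G, B) over a list of pixels."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def color_distance(a, b):
    """Euclidean distance between two mean colors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

standard_pixels = [(180, 120, 80)] * 4          # expected browning at this node
live_pixels = [(120, 90, 70), (125, 92, 72),
               (118, 88, 69), (121, 91, 71)]    # paler: likely under-cooked

dist = color_distance(mean_color(live_pixels), mean_color(standard_pixels))
deviates = dist > 30   # large distance -> cooking state deviates from standard
print(deviates)        # -> True
```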
Optionally, the identification result includes: a first recognition result, in which the current cooking state of the cooked object exceeds the cooking state corresponding to the current cooking node; or a second recognition result, in which the current cooking state of the cooked object has not reached the cooking state corresponding to the current cooking node. The sending module is further used for obtaining a control instruction to reduce the cooking time or cooking temperature according to the first recognition result, or to increase the cooking time or cooking temperature according to the second recognition result, and sending it to the cooking appliance. At a given node during cooking, when the food's cooking state is found to exceed or fall short of expectations, the cooking appliance automatically reduces or increases the cooking parameters, so that the doneness of the food meets expectations and overcooking or undercooking is avoided.
Optionally, the real-time image of the food at the current cooking node and its corresponding standard image are shown on a display device, presenting the cooking state of the food at the current node. Displaying both images lets the user visually judge the current cooking state and whether it has reached the expected state, improving the user experience. The display device may be disposed on the cooking appliance, or may be a user terminal, such as a mobile phone, communicatively connected to it.
Optionally, the processing module is further configured to identify the cooked object according to a food identification model and determine the corresponding monitoring cooking state model according to the identification result. Identifying the food type through image recognition and retrieving the matching monitoring cooking state model realizes automatic identification of the food and monitoring of its cooking process, so that the cooking appliance works intelligently.
Optionally, the processing module is further configured to identify the maturity of the cooked object in the real-time image at a late stage of cooking according to a maturity-judging model, which includes standard images representing the maturity state of the cooked object; the sending module is further used for obtaining a corresponding control instruction according to the identification result and sending it to the cooking appliance. Near or after the end of the cooking process, it can thus be checked again whether the food is close to maturity, and the cooking appliance controlled based on this determination. The maturity state of the food is thereby identified before cooking ends and, if necessary, cooking interventions are taken automatically to obtain the desired result; in particular, undercooking can be avoided. This also prevents food remaining immature after cooking finishes, or already mature food being cooked further, which wastes resources and spoils its taste.
Optionally, the processing module is further configured to identify the cleanliness of the image acquisition module or of the cavity according to a cleaning model, which includes standard images representing a blocked view angle of the image acquisition module or showing internal oil contamination; the sending module is further used for obtaining a self-cleaning or cleaning-reminder control instruction according to the identification result. Identifying the degree of soiling inside the cooking appliance through image recognition, in particular whether the image acquisition module's view is blocked by grease, allows the internal environment to be assessed accurately and prevents the module from failing to capture an accurate image of the cooked object and thereby disrupting the cooking control process; it also reminds the user promptly when cleaning is needed.
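The patent does not say how a blocked view would be detected; as one hedged illustration, a lens heavily occluded by grease tends to produce a low-contrast image, so a simple contrast (variance) test could flag it. The threshold and pixel values below are invented:

```python
# Hypothetical cleanliness check: flag a blocked or greasy camera view
# when the captured image has too little contrast (threshold invented).

def variance(values):
    """Population variance of grayscale pixel values."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def needs_cleaning(gray_pixels, min_contrast=50.0):
    """True when image contrast is too low, suggesting an occluded lens."""
    return variance(gray_pixels) < min_contrast

clear_view = [30, 200, 90, 160, 45, 210]      # varied scene: lens is clear
greasy_view = [128, 130, 127, 131, 129, 128]  # uniform blur: lens occluded
print(needs_cleaning(clear_view), needs_cleaning(greasy_view))  # -> False True
```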
Optionally, the cooking appliance further comprises a sensor module for acquiring at least one of the weight of the food, the internal temperature of the food, and the temperature in the cavity; the receiving module is also used for acquiring at least one of these detection parameters from the sensor module; and the sending module combines the identification result with the detection parameters to obtain a corresponding control instruction and sends it to the cooking appliance.
Optionally, the sensor module includes a temperature detection unit disposed in the container of the cooking appliance to detect at least the internal temperature of the food.
Controlling the cooking appliance by combining the internal temperature of the food with image-based monitoring of the cooking state allows the cooking parameters to be controlled better and the expected cooking effect obtained. For example, if the internal temperature has not reached the expected value at a certain cooking node, the food may fail to cook through; by monitoring the internal temperature in real time, the cooking time and temperature can be adjusted accordingly. Conversely, if the internal temperature is too high at a node, the food may be overcooked; monitoring the internal temperature in real time and intervening early avoids this.
Furthermore, for some foods the appearance and color change little during certain stages of cooking and cannot be effectively identified through images; combining the internal temperature then gives better control of the cooking effect.
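The fusion of the image result with the probe temperature described above might be sketched as follows; the 5-degree margin and all names are illustrative assumptions, not values from the patent:

```python
# Illustrative combination of the image recognition result with the
# food's internal (probe) temperature. Thresholds are hypothetical.

def fused_decision(image_result, core_temp_c, target_temp_c, margin_c=5):
    """The image result alone may miss cases where appearance barely
    changes; the internal temperature breaks the tie."""
    if image_result == "over" or core_temp_c > target_temp_c + margin_c:
        return "reduce_time_or_temperature"
    if image_result == "under" or core_temp_c < target_temp_c - margin_c:
        return "increase_time_or_temperature"
    return "continue"

# Appearance looks on-track, but the core is still too cold:
print(fused_decision("ok", core_temp_c=48, target_temp_c=60))
# -> increase_time_or_temperature
```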
The above cooking appliance may be an oven or a combined steam-and-bake machine. In this way, the intelligence of the oven is improved: the cooking process is monitored automatically, the oven's operating parameters are adjusted promptly when an abnormality is found, and the appliance can also recognize the doneness of the cooking result and issue cleaning reminders.
Drawings
Fig. 1 is a schematic diagram of a cooking appliance control system according to an embodiment of the present invention;
FIG. 2 is a schematic view of a cooking appliance in accordance with an embodiment of the present invention;
FIG. 3 is a side schematic view of a cooking appliance in accordance with an embodiment of the present invention;
fig. 4 is a schematic diagram of a cooking control system architecture based on image recognition according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a cooking control system architecture based on image recognition according to another embodiment of the present invention;
FIG. 6 is a schematic diagram of a cooking control system architecture based on image recognition according to another embodiment of the present invention;
in the drawings:
1-a cooking appliance; 10-an image acquisition module; 11-a body; 12-a cavity; 13-a door; 14-an input-output module; 15-a control module; 16-a communication module; 17-a sensor module; 101-a first camera; 102-a second camera; 2-a server; 20-a receiving module; 21-a processing module; 22-a sending module; 210-priority determination module.
Detailed Description
As described in the background section, in the prior art the cooking appliance is not intelligent enough: the user cannot observe the cooking state of the food inside in real time and cannot intervene promptly when a cooking abnormality occurs, so the cooking result may fall short of expectations. In addition, if the food is not mature when cooking finishes, the user must cook it again.
An embodiment of the invention provides a cooking appliance control system comprising a cooking appliance and a server in communication connection with it. The cooking appliance comprises an image acquisition module for acquiring at least a real-time image of the cooked object corresponding to a cooking node. The server comprises a receiving module, a processing module and a sending module: the receiving module receives the real-time image sent by the cooking appliance, the image containing the cooking state of the cooked object at at least one cooking node; the processing module identifies the received real-time image according to a monitoring cooking state model to obtain an identification result, the model comprising at least standard cooking state images representing the cooked object at different cooking nodes; the sending module obtains a corresponding control instruction according to the identification result and sends it to the cooking appliance. The cooking appliance receives the control instruction and operates according to it.
By adopting the embodiment, through the combination of image recognition and the control process of the cooking appliance, the cooking appliance can be controlled in real time according to the cooking state of food in the cooking process, so that the dynamic control and the intelligent control of the cooking appliance are realized, the cooking result is optimized, and the use experience of a user is improved.
Specifically, the cooking states of a plurality of cooking nodes in the cooking process can be monitored according to a preset monitoring cooking state model, so that cooking parameters are dynamically adjusted, the automatic advanced intervention of equipment in the cooking process is realized, the maturity of food can be ensured to meet expectations, and the situation of over-cooking or under-cooking of the food is avoided.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Fig. 1 is a schematic diagram of a cooking appliance control system according to an embodiment of the present invention; figs. 2 and 3 are schematic views of the cooking appliance of fig. 1 according to an embodiment of the present invention; figs. 4-6 are schematic diagrams of cooking control system architectures based on camera image recognition in embodiments of the present invention.
specifically, referring to fig. 1, a control system of a cooking appliance 1 includes the cooking appliance 1 and a server 2, wherein the cooking appliance 1 and the server 2 are communicably connected.
The cooking appliance 1 comprises an image acquisition module 10 for acquiring real-time images of the interior of the cooking appliance and sending the images to the server 2.
The server 2 is used for receiving the real-time image, identifying and processing the real-time image, and then acquiring a corresponding control instruction according to an identification result so as to control the cooking appliance 1 to work.
Specifically, referring to figs. 2 and 3, the cooking appliance 1 of this embodiment may be an oven or a combined steam-and-bake machine, and generally includes a body 11 and a cavity 12 arranged in the body 11 for holding the cooked object. For example, the cooking appliance 1 may have a door 13 capable of opening or closing the cavity 12. When the door 13 is opened, the cavity 12 is exposed so the user can place or remove the cooked object (i.e., the food); when the door 13 is closed, the cavity 12 is sealed, and the cooked object inside can be heated, baked and so on.
The cooking appliance 1 may further include an input-output module 14, and the operation state of the cooking appliance 1 may be adjusted by operating the input-output module 14. The operation state of the cooking appliance 1 may include heating power, heating direction, heating time, steam delivery amount, steam delivery direction, and the like. The adjustment of these operating states may be achieved by adjusting the operating states of specific functional modules in the cooking appliance 1.
Further, the cooking appliance 1 may further include a control module 15, configured to adjust the operation state of the corresponding function module according to the instruction fed back by the input/output module 14, so that the operation state of the cooking appliance 1 conforms to the user instruction. For example, the cooking parameters or cooking settings selected by the user are obtained through the input-output module 14, and the control module 15 controls the cooking appliance 1 to communicate with the server 2 according to the parameters, and further controls the cooking appliance to work.
For example, the cooking appliance 1 may comprise a heating module (not shown) for heating the object to be cooked. The cooking appliance 1 may include a plurality of heating modules dispersed in different areas of the appliance so as to heat the cavity 12 from different angles and heat the object as uniformly as possible. The control module 15 can independently adjust the heating power of each heating module to adjust the amount of heat it radiates into the cavity 12, independently adjust each module's heating direction to adjust the angle at which it radiates heat into the cavity 12, and adjust the duration for which a particular module operates at a particular power, so that the food is heated to the desired effect.
As another example, the cooking appliance 1 may include a spray module (not shown) for delivering water vapor into the cavity 12 to adjust the humidity within the cavity 12. By adjusting the humidity in the cavity 12, the surface humidity of the cooked object can be adjusted, so that the water content of the cooked object is moderate, and the surface of the cooked object is prevented from being heated too dry or too wet. The cooking appliance 1 may include a plurality of spraying modules, and the plurality of spraying modules are dispersedly disposed in different areas of the cooking appliance 1 to deliver water vapor into the cavity 12 from different angles, so that the humidity distribution on the surface of the object to be cooked is uniform. The control module 15 can independently adjust the vapor delivery of each spray module. The control module 15 may also adjust the steam delivery direction of each spray module independently. The control module 15 may also adjust the spray duration for a particular spray module operating at a particular steam delivery rate.
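The independently adjustable modules described above (per-module heating power, direction and duration) can be sketched as a small data structure; the class, wattages and module names are hypothetical illustrations, not from the patent:

```python
# Hypothetical sketch of independently adjustable heating modules, echoing
# the control module's per-module power adjustment described above.

class HeatingModule:
    def __init__(self, name, power_w=0):
        self.name = name
        self.power_w = power_w

    def set_power(self, power_w):
        """Clamp so power cannot go negative."""
        self.power_w = max(0, power_w)

modules = {"top": HeatingModule("top", 1200),
           "bottom": HeatingModule("bottom", 1200)}

# If the top of the food browns too fast, only the top element is reduced:
modules["top"].set_power(800)
print(modules["top"].power_w, modules["bottom"].power_w)  # -> 800 1200
```

The same pattern would apply to the spray modules, with steam delivery rate in place of power.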
It should be noted that the input/output device 14 shown in fig. 2 resembles a knob or adjustment keys; in practical applications, the input/output device 14 may take other forms, such as a touch screen or a voice control module. It may also be a touch display panel that both accepts data input and displays image information.
It should be noted that fig. 2 only shows an exemplary possible setting position of the control module 15 in the cooking appliance 1, and the specific setting position of the control module 15 may be adjusted as needed in practical applications.
Referring to fig. 1, the cooking appliance 1 has a communication module 16, and the server 2 communicates with the control module 15 through the communication module 16 to transmit a control command to the cooking appliance 1, the control command being used to control an operation state of the cooking appliance 1.
Further, referring to fig. 1, 2 and 3, the cooking appliance 1 may include an image capturing module 10 disposed in the cavity 12 or the door 13 to capture the monitoring data of at least one angle. The control module 15 controls the image capturing module 10 to capture an image and transmits the image to the server 2 through the communication module 16.
For example, the image capturing module 10 may be a camera, which is disposed on the top of the cavity 12 and covers at least the tray in the overhead range, and the camera may be disposed in a plurality of positions to photograph the cooked object from different angles.
For example, the camera may be disposed inside the door 13 and the shooting range at least includes the area where the cooked object is located.
Further, the control module 15 interacts with the image capturing module 10 and the server 2, so as to monitor the cooking state of the cooked object and automatically control the operating parameters of the cooking appliance 1.
In one implementation, referring to fig. 1, the server 2 includes a receiving module 20, a processing module 21 and a sending module 22, which interact with each other to identify and process the real-time image acquired from the cooking appliance 1, and acquire a control instruction for controlling the cooking appliance 1 based on the identification result.
Specifically, the receiving module 20 is configured to receive a real-time image sent by the cooking appliance 1, where the real-time image at least includes a cooking state of the object to be cooked corresponding to at least one cooking node; the processing module 21 is used for identifying the received real-time image according to the monitored cooking state model so as to obtain an identification result; wherein the monitoring cooking state model at least comprises cooking state standard images representing that the cooked object corresponds to different cooking nodes; and the sending module 22 is configured to obtain a corresponding control instruction according to the identification result and send the control instruction to the cooking appliance 1.
The cooking node may be a key node in the cooking process at which a change in the cooking state of the food is monitored. Cooking nodes may be preset in memory; for example, if a medium-well (seven-tenths done) steak takes 8 minutes to cook, the cooking nodes may be set at the 4th, 5th and 6th minutes. It should be noted that cooking nodes may be determined according to the cooking time, or set according to changes in the heating temperature, and so on.
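Using the example above (an 8-minute steak with nodes at minutes 4, 5 and 6), a preset node table might look like this; the dish key and structure are illustrative assumptions:

```python
# Sketch of preset cooking nodes for the example above. The patent also
# allows temperature-based nodes; only time-based nodes are shown here.

COOKING_NODES = {"steak_medium_well": {"total_minutes": 8,
                                       "monitor_at_minutes": [4, 5, 6]}}

def is_cooking_node(dish, elapsed_minutes):
    """True when the elapsed time hits one of the dish's monitoring nodes."""
    return elapsed_minutes in COOKING_NODES[dish]["monitor_at_minutes"]

# Which minutes of the 8-minute cook trigger monitoring:
print([m for m in range(1, 9) if is_cooking_node("steak_medium_well", m)])
# -> [4, 5, 6]
```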
The monitoring cooking state model may be stored in memory in advance and comprises standard images of the food's cooking state at a plurality of cooking nodes. A standard image can be understood as the state the food should have reached at that cooking node, and may include color parameters, shape parameters and the like characterizing the food's state. The standard images are obtained by training on a large amount of cooking experiment data.
In a specific implementation, different cooking nodes correspond to different standard images; the monitoring cooking state model may correspond one-to-one to the type of food.
For example, the server 2 includes two monitoring cooking state models: model A for steak and model B for sweet potato. Model A comprises standard steak images for 4 cooking nodes, and model B comprises standard sweet potato images for 3 cooking nodes. While a steak is being cooked, the real-time images at the 4 cooking nodes can be compared with the corresponding standard images; when an abnormality is found at one of the nodes, a corresponding control instruction is obtained, and the cooking parameters are adjusted through it to control the cooking appliance 1.
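The per-food model lookup in this example might be organized as below; the node counts follow the example (4 for steak, 3 for sweet potato), while the single "doneness" feature standing in for a standard image is an invented simplification:

```python
# Hypothetical per-food monitoring models matching the example above:
# model A (steak) with 4 nodes, model B (sweet potato) with 3 nodes.
# Each standard image is reduced to one feature for brevity.

MONITORING_MODELS = {
    "steak": {1: {"doneness": 20}, 2: {"doneness": 45},
              3: {"doneness": 70}, 4: {"doneness": 90}},
    "sweet_potato": {1: {"doneness": 30}, 2: {"doneness": 60},
                     3: {"doneness": 95}},
}

def standard_image_for(food, node):
    """Select the monitoring model by food type, then the node's standard image."""
    return MONITORING_MODELS[food][node]

print(len(MONITORING_MODELS["steak"]), standard_image_for("sweet_potato", 2))
```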
The server 2 performs data processing such as recognition remotely by means of the monitoring cooking state model. On the one hand, this facilitates training, learning, and updating of the model and related data, improving the accuracy of the processing; on the other hand, it reduces the complexity of the cooking appliance itself. In this way, the degree of intelligence with which the cooking appliance is controlled is improved.
It should be noted that the above is merely an example. In practical applications, the memory may be provided with monitoring cooking state models corresponding to a plurality of food types so as to expand the range of recipes that the cooking appliance 1 can cook intelligently.
The control instruction is used for controlling the operating state and the operating mode of the cooking appliance 1 according to the control parameter. The control instructions may be stored in the memory and may include instructions to control the heating module, instructions to control the spray module, instructions to control the image acquisition module, and the like.
Specifically, when cooking is about to start, the prepared food is placed into the cooking appliance 1. When the door 13 of the cooking appliance 1 is closed, the image acquisition module 10 starts working and captures images in real time; alternatively, the image acquisition module 10 starts working when the user sets cooking parameters such as heating time and heating temperature and starts cooking.
The image acquisition module 10 may store the acquired images in a local memory. The control module 15 sends the images to the server 2 through the communication module 16; the processing module 21 of the server 2 monitors the cooking state of the food according to the collected images, identifies the real-time images using the preset monitoring cooking state model, and adjusts the cooking parameters according to the identification result. In this way, the cooking process is intelligently monitored, and the cooking parameters are adjusted promptly when a cooking abnormality is found, so that the cooking result meets expectations.
Further, the image acquisition module 10 may include a plurality of cameras for capturing images from different viewing angles. Monitoring the cooking state of the food from different angles through the plurality of image acquisition modules 10 allows abnormal cooking states to be detected earlier and more reliably, so that the cooking process can be intervened in ahead of time and the expected cooking result achieved.
In one embodiment, the cooking appliance 1 comprises a first camera 101 disposed at the upper part of the cavity 12 and a second camera 102 disposed on the inner side of the door 13, each facing the object to be cooked, so as to capture images of the food from different angles.
Specifically, the first camera 101 acquires a first image and the second camera 102 acquires a second image. The processing module 21 respectively identifies the first image and the second image according to the cooking state standard image to obtain corresponding identification results.
In a specific embodiment, the processing module 21 further includes a priority determining module 210 for determining the priority of the plurality of recognition results according to a preset priority standard. The sending module 22 is further configured to obtain a corresponding control parameter according to the recognition result with the higher priority to control the cooking appliance 1.
The priority standard may be determined according to the degree to which the real-time image deviates from the standard image: the more a real-time image deviates from the standard image, the higher the priority of its corresponding identification result.
Specifically, when a plurality of identification results of the current cooking state are obtained at a node, the identification result that differs most from the standard image is output as the priority result, and the corresponding instruction for controlling the cooking appliance is obtained based on that result, so that the cooking appliance works according to the instruction. Local over-cooking or under-cooking of the food can thus be avoided, and the cooking result better controlled to meet expectations.
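A minimal sketch of this priority rule, assuming each camera's identification result is paired with a numeric deviation score from the standard image (the score and the tuple layout are assumptions for illustration, not part of this disclosure):

```python
def pick_priority_result(results):
    """Given (identification_result, deviation_from_standard) pairs from
    multiple cameras, return the pair with the largest deviation: per the
    priority rule, the more a real-time image deviates from the standard
    image, the higher the priority of its identification result."""
    return max(results, key=lambda r: r[1])
```

The control instruction would then be derived from the selected result, as in the scorched-rear-portion example below.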
For example, when the first camera 101 detects that a portion of the food (e.g., the rear portion of the food near the heating device) has been scorched, the cooking time may be reduced or the cooking temperature lowered based on the monitoring and recognition result.
In one implementation, cooking nodes for acquiring the real-time images may be preset, and the control module 15 controls the image acquisition module 10 to acquire real-time images according to these cooking nodes. During cooking, the cooking state of the food is monitored at the preset cooking nodes, which ensures timely intervention in the cooking process at the key cooking nodes: the cooking parameters can be adjusted in time, and the cooking state corrected promptly when it deviates from the expected state, so that the cooking result of the food meets expectations.
For example, time nodes at which images are acquired may be set in the memory, and the control module 15 may control the image acquisition module 10 to acquire images at those time nodes. Considering how differently various foods change during cooking, each food type may be provided with its own time nodes. It should be noted that the time nodes should correspond to the cooking nodes in the monitoring cooking state model, so that the real-time image and the standard image can be compared at the same cooking node.
In one embodiment, the processing module 21 is further configured to compare the image parameters of the real-time image with the image parameters of the standard image to determine the recognition result.
Further, the sending module 22 may obtain a control instruction for adjusting parameters of the cooking appliance, such as heating time, heating temperature, and humidification mode, according to the comparison and identification result between the real-time image and the standard image, and send the control instruction to the cooking appliance 1.
Wherein, the recognition result includes: a first recognition result, namely that the current cooking state of the food exceeds the cooking state corresponding to the current cooking node; or, the second recognition result, that is, the current cooking state of the food does not reach the cooking state corresponding to the current cooking node.
Further, the sending module 22 is further configured to obtain a control instruction for reducing the cooking time or the cooking temperature according to the first identification result; or acquiring a control instruction for increasing the cooking time or the cooking temperature according to the second recognition result.
Specifically, when the cooking state of the food is determined to exceed the cooking state corresponding to the current cooking node according to the identification result, the cooking time or the cooking temperature is controlled to be reduced; and when the cooking state of the food is determined not to reach the cooking state corresponding to the current cooking node, controlling to increase the cooking time or the cooking temperature.
For example, when cooking a steak using the cooking appliance 1, the doneness selected by the user is seven-tenths (medium well), and the expected cooking time is 8 minutes. During cooking, the server 2 determines at the third cooking node, for example at the 6th minute of the cooking time, that the comparison between the real-time image collected by the image acquisition module 10 and the standard image shows the current cooking state of the steak exceeding that of the steak in the corresponding standard image. If heating were to continue for the preset period of 8 minutes, the final steak would very likely be overdone and not as expected. Therefore, the control module reduces the total heating time, for example by 20 seconds, i.e., the originally planned heating time of 8 minutes is adjusted to 7 minutes and 40 seconds. Such fine tuning provides good protection against over-cooking, so that the cooking result is as desired.
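The fine-tuning in this example amounts to simple arithmetic on the planned heating time. In the sketch below the state labels and the 20-second step follow the example, while the function name and label strings are hypothetical:

```python
def adjust_heating_seconds(planned_s: int, state: str, step_s: int = 20) -> int:
    """Adjust the planned total heating time based on the node's recognition
    result: 'ahead' (cooking state exceeds the node's standard image)
    shortens it, 'behind' lengthens it, anything else leaves it unchanged."""
    if state == "ahead":
        return planned_s - step_s
    if state == "behind":
        return planned_s + step_s
    return planned_s

# Steak example from the text: 8 minutes planned, state ahead at minute 6.
new_total = adjust_heating_seconds(8 * 60, "ahead")  # 460 s = 7 min 40 s
```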
In a variation, the real-time image of the food at the current cooking node and its corresponding standard image may be displayed through a display device to present the cooking state of the food at that node. The display device may be the input/output device 14, or a remote mobile device such as a mobile phone, tablet, or computer.
For example, when a certain cooking node is reached during cooking, the control module 15 may retrieve the real-time image and the standard image of that node from the memory through the communication module 16 and present them to the user through the display device. In this way, not only can the cooking appliance intelligently adjust the cooking process, but the user can also intuitively see the cooking state and cooking progress of the food, giving a better user experience.
In a specific embodiment, the processing module 21 is further configured to identify the object to be cooked according to a food recognition model and determine the corresponding monitoring cooking state model according to the identification result. The food recognition model is used to recognize the type of food and is obtained through training on a large amount of experimental data. Since each food type corresponds to a monitoring cooking state model, the model corresponding to the identified food can be obtained, and the cooking process then monitored based on that model.
In a variation, the cooking appliance 1 may be provided with a plurality of image acquisition modules 10 to obtain real-time images of the cooking food from different angles, with the identification result then determined according to priority to improve the accuracy of food identification. Here, the priority criterion is determined according to the accuracy with which the food recognition model identifies food: the higher the accuracy, the higher the priority of the identification result corresponding to the food image. In a specific implementation, each image acquisition module 10 may correspond to its own food recognition model. When the plurality of image acquisition modules 10 capture food images at the same time from different angles, the priority of the recognition results can be determined according to the food-recognition accuracy of each module, further improving the recognition accuracy.
For example, the cooking appliance 1 includes two cameras, namely a first camera 101 disposed at the upper part of the cavity 12 and a second camera 102 disposed on the inner side of the door 13. The two cameras are respectively directed at the object to be cooked so as to acquire images of the food from different angles. The two cameras acquire two images at the same acquisition time, yielding two identification results. If the results are the same, that result is output; if they differ, the food is re-identified, or the priority result is determined according to each camera's recognition accuracy during training.
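A sketch of this two-camera agreement check, assuming each camera's result is accompanied by the recognition accuracy its model achieved during training (the function name and signature are illustrative, not from this disclosure):

```python
def resolve_food_type(result_a: str, result_b: str,
                      accuracy_a: float, accuracy_b: float) -> str:
    """If the two cameras' food-recognition results agree, output that
    result directly; otherwise fall back to the result from the camera
    whose model achieved the higher recognition accuracy in training."""
    if result_a == result_b:
        return result_a
    return result_a if accuracy_a >= accuracy_b else result_b
```

The re-identification branch mentioned in the text could precede this fallback; the sketch shows only the accuracy-based tie-break.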
In one embodiment, the processing module 21 is further configured to identify the maturity of the cooked object in the real-time image of the cooking node according to a maturity judging model, wherein the maturity judging model includes a standard image for representing the maturity state of the cooked object. And then, the sending module acquires a corresponding control instruction according to the identification result and sends the control instruction to the cooking appliance.
Specifically, the maturity judging model may be pre-stored in the memory. The model includes standard images of the done states of a plurality of foods, and may further include standard images of a single food at different doneness levels. A standard image may be understood as the state the food should reach when done, and may include color parameters, shape parameters, etc. that characterize the done state of the food. The standard images are obtained through training on a large amount of cooking experiment data. As used herein, the term "maturity" includes the done state of food in the conventional sense, and also the different stages of doneness required by differences in personal taste, such as a steak cooked to five-tenths doneness (medium) or seven-tenths doneness (medium well).
Specifically, the server 2 includes a maturity judging model for at least one food, such as standard images representing the doneness states of steak at different levels, e.g., medium rare, medium, and medium well, each doneness state having its own set of standard images, where each set may include multiple images. The processing module 21 then identifies whether the food is done based on the model. The sending module 22 obtains a control instruction for controlling the operating parameters of the cooking appliance according to the identification result and sends it to the cooking appliance to automatically control the operation of the cooking appliance 1. In this way the doneness of the food is identified at the end of cooking, and cooking intervention measures are automatically taken if necessary to obtain the expected cooking effect; in particular, undercooking can be avoided.
In a variation, the processing module 21 may further obtain the cooking doneness selected by the user through the input/output module, and obtain the corresponding standard image in the maturity judging model according to that doneness parameter; the real-time image is compared with the standard image, and the sending module obtains a corresponding control instruction according to the result and sends it to the cooking appliance 1. Different users may have different requirements or preferences for the doneness of food. By obtaining the corresponding standard images according to the doneness selected by the user and performing image recognition on that basis, diversified cooking control can be performed for different user selections, realizing intelligent cooking and a better user experience.
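Selecting a doneness reference from the user's input could be sketched as below. The doneness keys and numeric threshold scores are hypothetical: the specification describes standard images rather than scores, so a scalar "doneness score" stands in for the image comparison:

```python
# Hypothetical mapping from user-selected doneness to a reference score;
# keys and thresholds are illustrative, not values from this document.
DONENESS_REFS = {"medium": 0.5, "medium_well": 0.7, "well_done": 0.9}

def maturity_decision(selected: str, measured_score: float) -> str:
    """Compare an image-derived doneness score against the reference for
    the doneness level the user selected."""
    if measured_score >= DONENESS_REFS[selected]:
        return "done"
    return "continue_heating"
```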
The degree of contamination inside a cooking appliance cannot be accurately known in the prior art. Specifically, a user may not use the cooking appliance for a long time, or, although the appliance is used frequently, the cooked food may generate little grease in most cases. In such cases, issuing cleaning reminders uniformly at fixed time points results in reminders when no cleaning is needed. The cooking appliance is therefore not intelligent enough, and the user experience suffers.
Thus, in one embodiment, the processing module 21 is further configured to identify, according to a cleaning model, the cleanliness within the cavity or the cleanliness of the viewing angle of the image acquisition module 10; the sending module 22 is further configured to obtain a corresponding instruction according to the identification result to control the cooking appliance to perform self-cleaning or issue a cleaning reminder. In this embodiment, the degree of contamination inside the cooking appliance is identified by image recognition, in particular whether the viewing angle of the image acquisition module is blocked by oil stains. The environment inside the cooking appliance is thereby accurately identified, preventing the image acquisition module from failing to acquire accurate images of the object to be cooked and thus affecting the cooking control process, and the user is reminded promptly when cleaning is needed.
The cleaning model comprises at least a standard image characterizing a blocked viewing angle of the image acquisition module 10. The information in the standard image may include at least one of the distribution position of the oil stains, the coverage area of the oil stains, the amount of the oil stains, and the like. The distribution position can be understood as where the oil stains fall within an image: generally, oil stains near the middle of the image interfere with the image acquisition module 10 acquiring images of the object to be cooked, while oil stains near the edge of the acquired image have little influence. Likewise, the oil-stain coverage area and amount may also affect the acquisition of images of the object to be cooked. A plurality of standard images in which the viewing angle is blocked can therefore be acquired through experiments and training.
The cleaning model may also be used to identify cleanliness within the cavity, and the model may include standard images for characterizing internal oil contamination conditions.
Specifically, when the door 13 of the cooking appliance 1 is closed, the control module 15 controls the image acquisition module 10 to start capturing images. At this time, if there are oil stains in front of the lens of the image acquisition module 10, the captured image includes both an image of the interior of the cavity 12 and an image of the oil stains.
The processing module 21 identifies whether the lens is blocked based on the acquired image and the cleaning model, and the sending module 22 obtains an instruction for a cleaning reminder or other reminder according to the identification result. If the lens is identified as blocked, the user is reminded to clean, or the cooking appliance is controlled to self-clean. If the lens is not blocked, the cooking process proceeds, including identifying the food, identifying its cooking state, identifying whether it is done, and so on. The position of the oil stains is thereby accurately acquired, and a cleaning reminder is issued when necessary.
Furthermore, after the viewing angle of the image acquisition module 10 is judged to be blocked, the type of the blocking substance can be further judged, and the cleaning mode of the cooking appliance controlled according to that type, making the cooking appliance more intelligent. For example, the blockage may be grease or a baked-on stain: if grease, the user may be reminded to wipe it manually; if a stain, the cooking appliance can be controlled to perform high-temperature ashing treatment.
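The branching on occlusion type described above might be sketched as follows, with hypothetical action names standing in for the appliance's actual reminder and self-cleaning instructions:

```python
def cleaning_action(view_occluded: bool, occlusion_kind=None) -> str:
    """Map the cleaning-model result to an action: grease needs manual
    wiping, baked-on stains can be ashed at high temperature, and a clear
    lens lets the normal cooking flow proceed."""
    if not view_occluded:
        return "proceed_to_cooking"
    if occlusion_kind == "grease":
        return "remind_manual_wipe"
    if occlusion_kind == "stain":
        return "self_clean_high_temp"
    return "remind_cleaning"
```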
Further, the image acquisition module 10 may be triggered to capture an image only when the closing of the door 13 is detected or when the cooking process ends, with the processing module 21 and the control module 15 identifying contamination in the cavity 12. A detected door-closing action indicates that the user is about to cook; triggering the image acquisition module 10 and the cleanliness check first avoids the situation where the module cannot acquire complete or valid food images after cooking has started. Contamination identification and reminders can also be performed after the cooking program finishes, so as not to affect the next use of the cooking appliance.
Furthermore, the image acquisition module in a cooking appliance is generally used to observe and recognize the cooking state of the food, and the cooking appliance operates according to the captured images. If the field of view of the image acquisition module is blocked, the accuracy of the appliance's food image recognition is affected, and with it the subsequent cooking control and cooking effect. Judging through image recognition whether the field of view is blocked, and reminding the user to clean based on the result, allows the oil-stain situation to be judged accurately and the stains cleaned in time; this in turn improves the accuracy of the cooking appliance's control, and thus the cooking effect and the user experience.
Further, the cooking appliance 1 may comprise a sensor module 17 for acquiring the food weight, temperature, etc. For example, the cooking appliance 1 may include a temperature detecting device (not shown in fig. 2) for detecting the temperature of the food surface, the food interior, and the like.
For example, a temperature sensor may be disposed within the chamber 12 to collect the temperature within the chamber 12. For another example, the tray may be provided with a temperature sensor or probe to collect the surface temperature or the internal temperature of the object being cooked.
The sensor module 17 is connected to the control module 15. The control module 15 may be used to control the cooking appliance in combination with the recognition result and the sensor module detection parameter.
In one particular embodiment, the control module 15 may also adjust cooking parameters in conjunction with the real-time internal temperature of the food. The real-time internal temperature of the food may be acquired by a probe or other sensor, and the cooking appliance 1 is controlled according to the real-time internal temperature and the current cooking state of the food.
Specifically, when the real-time internal temperature is lower than the standard temperature, the cooking time or the cooking temperature is increased; otherwise, the cooking time is reduced or the cooking temperature is lowered. On the basis of monitoring the cooking state through image recognition, the cooking appliance 1 is controlled by combining the internal temperature of food, so that the cooking parameters can be better controlled to obtain the expected food cooking effect.
For example, if during cooking the internal temperature of the food has not reached the expected temperature at a certain cooking node, the food may end up undercooked; by monitoring the internal temperature in real time, the cooking time and cooking temperature can be adjusted. Conversely, if the internal temperature of the food is too high at a cooking node, the food may be overcooked; real-time monitoring of the internal temperature allows early intervention to avoid overcooking.
For another example, when cooking certain foods (e.g., steak), the difference in image between seven-tenths and nine-tenths doneness is very small; or the appearance, color, etc. of certain foods change little during a certain stage of cooking and cannot be effectively identified through images. In such cases the cooking effect can be better controlled by also taking the internal temperature into account.
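A sketch of combining the probe temperature with the image-based state, per the rule above (a reading below the standard temperature extends cooking, one above it shortens cooking); the 30-second step and the tie-breaking on the image result are illustrative assumptions, not values fixed by this disclosure:

```python
def combine_with_temperature(image_state: str, internal_c: float,
                             standard_c: float, step_s: int = 30) -> int:
    """Return a signed adjustment (in seconds) to the cooking time,
    combining the image-based state with the probe reading: below the
    standard temperature extends cooking, above it shortens cooking, and
    the image result breaks the tie when the temperatures match."""
    if internal_c < standard_c:
        return step_s
    if internal_c > standard_c:
        return -step_s
    return step_s if image_state == "behind" else 0
```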
Referring to figs. 4-6, three detailed schematics of the operation of the control system of an oven are shown.
In fig. 4, the oven is communicatively connected to the cloud server.
The server 2 performs cooking control of the oven through image recognition technology; the server 2 includes three recognition models, namely a food recognition model for recognizing food, a monitoring cooking state model for monitoring the cooking state, and a maturity judging model for judging doneness.
After the oven door is closed, the camera in the oven collects images and sends the images to the cloud server. The oven can display the real-time state of food cooked in the oven through the display screen; the user may also input cooking parameters or select a cooking recipe through the touch panel.
The cloud server identifies and monitors the acquired images through the recognition models.
Specifically, the type of food is determined at the beginning of cooking; the cooking parameters (cooking time, cooking temperature, cooking mode, etc.) set in the food cooking management of the cloud server are then retrieved according to the identified food type and sent to the oven. The oven obtains the cooking parameters and cooks according to them.
During cooking, the camera acquires images of the inside of the oven in real time. The cloud server identifies the acquired images through the monitoring cooking state model and monitors for abnormal states in the cooking process. When an abnormal state is identified, a control instruction for controlling the oven is obtained. For example, when it is found that the food has not reached the cooking state corresponding to a certain cooking node, a control parameter increasing the cooking time by 30 s is obtained and sent to the oven. The oven automatically extends the cooking time by 30 s according to the control instruction.
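The control instruction sent in this example could take the form of a small structured message; the field names below are hypothetical, while the 30-second extension follows the example above:

```python
def make_control_instruction(node: int, result: str, step_s: int = 30) -> dict:
    """Build the control instruction the cloud server sends to the oven
    when a node's cooking state is abnormal; 'behind' extends the cooking
    time, 'ahead' shortens it, anything else leaves it unchanged."""
    if result == "behind":
        return {"node": node, "action": "extend_time", "seconds": step_s}
    if result == "ahead":
        return {"node": node, "action": "reduce_time", "seconds": step_s}
    return {"node": node, "action": "none", "seconds": 0}
```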
When cooking is about to end, or after it has ended, the camera continues to acquire internal images. The cloud server identifies, through the maturity judging model, whether the current food is fully cooked. If the food is recognized as not yet fully cooked, control instructions for increasing the cooking time (e.g., by 30 seconds) and raising the cooking temperature are obtained, so that heating continues until the expected effect is achieved. The added time and temperature may be preset in the cloud server or locally in the oven.
The whole cooking process is automatically monitored by the system; no human intervention is needed unless an unexpected problem arises, and operation is simpler. On the one hand, the intelligent oven is controlled so that the cooking effect meets expectations; on the other hand, when an emergency occurs, the oven can also warn the user, improving the safety of the oven.
Fig. 5 shows another embodiment, which differs from the embodiment of fig. 4 in that the control system includes two cameras, each corresponding to the 3 recognition models. Correspondingly, the control system also includes a priority criterion for determining the priority of the recognition results corresponding to the images collected by the two cameras. The cloud server obtains the corresponding control instruction according to the recognition result with the higher priority, and the oven works according to that instruction. The determination of priority is as described above.
In this embodiment, the two cameras together cover a relatively comprehensive set of observation positions. In the cooking monitoring stage in particular, when some part of the food is scorched or its color changes too fast, for example at the rear closest to the oven's hot-air fan, a single camera may miss it, whereas two cameras can find the abnormality immediately and make the corresponding adjustment.
Fig. 6 shows another embodiment, which differs from the embodiment of fig. 4 in that the control system further includes a cleaning recognition model for identifying the cleanliness of the camera's viewing angle or the cleanliness in the cavity. The cloud server obtains a control instruction for self-cleaning or a cleaning reminder according to the identification result.
When the oven is started or its door is closed, the camera captures a picture and uploads it to the server, and the cloud server identifies the environmental condition inside the oven. Generally, the oil stains of concern are those in front of the camera: the contamination of the inner cavity visible to the camera reflects the oil-stain condition in front of the lens. The cloud server makes a judgment according to the recognition model. When the oven environment is clean, or the stains do not affect recognition, the oven continues normal operation. When the contamination is serious, the model can judge whether it is mainly grease or baked-on stains: stains can be treated by high-temperature ashing, while grease is best removed by manual wiping, so when heavy grease is detected the user is reminded to clean manually via a message to the oven's interactive interface and to the mobile phone.
Observing and identifying the oil-stain condition of the oven's inner cavity through the camera, and automatically prompting the user to clean it, improves the degree of intelligence of the oven and keeps the oven environment more sanitary and healthy.
While specific embodiments have been described above, these embodiments are not intended to limit the scope of the disclosure, even if only a single embodiment is described with respect to a particular feature. The characteristic examples provided in the present disclosure are intended to be illustrative, not limiting, unless differently expressed. In particular implementations, features from one or more dependent claims may be combined with features of the independent claims and features from respective independent claims may be combined in any appropriate manner and not merely in the specific combinations enumerated in the claims.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. A control system of a cooking appliance, comprising a cooking appliance and a server communicatively connected with the cooking appliance, wherein
the cooking appliance comprises an image acquisition module and a control module, the image acquisition module being used for acquiring at least a real-time image of the object to be cooked corresponding to a cooking node;
the server comprises a receiving module, a processing module and a sending module, wherein,
the receiving module is used for receiving a real-time image sent by the cooking appliance, wherein the real-time image at least comprises a cooking state of the object to be cooked corresponding to at least one cooking node;
the processing module is used for identifying the received real-time image according to the monitoring cooking state model so as to obtain an identification result; wherein the monitoring cooking state model at least comprises cooking state standard images representing that the cooked object corresponds to different cooking nodes;
the sending module is used for acquiring a corresponding control instruction according to the identification result and sending the control instruction to the cooking appliance;
and the cooking appliance is also used for receiving the control instruction and working according to the instruction.
2. The control system of claim 1, wherein the real-time image comprises a first image and a second image;
the processing module is further used for respectively identifying the first image and the second image according to the cooking state standard image so as to obtain corresponding identification results;
the processing module also comprises a priority determining module which is used for determining the priority of the identification result according to a preset priority standard; the sending module is further used for obtaining a corresponding control instruction according to the identification result with the higher priority and sending the control instruction to the cooking appliance.
3. The control system according to claim 2, wherein the priority criterion is determined according to a degree of deviation of the real-time image from a standard image, and the higher the deviation from the standard image, the higher the priority of the recognition result corresponding to the real-time image.
4. The control system according to any one of claims 1 to 3, wherein the recognition result comprises: a first recognition result, indicating that the current cooking state of the object being cooked exceeds the cooking state corresponding to the current cooking node; or a second recognition result, indicating that the current cooking state of the object being cooked has not yet reached the cooking state corresponding to the current cooking node;
the sending module is further configured to obtain a control instruction for reducing the cooking time or the cooking temperature according to the first recognition result and send it to the cooking appliance, or to obtain a control instruction for increasing the cooking time or the cooking temperature according to the second recognition result and send it to the cooking appliance.
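The mapping in claim 4 from recognition result to control instruction can be sketched as follows. The claim fixes only the direction of the adjustment (reduce vs. increase cooking time or temperature); the instruction format, result labels, and step size below are assumptions:

```python
# Hypothetical instruction encoding; only the reduce/increase direction
# comes from the claim itself.
def control_instruction(result, step_s=30):
    if result == "exceeds_node_state":   # first recognition result
        return {"action": "reduce_cooking_time", "delta_s": -step_s}
    if result == "below_node_state":     # second recognition result
        return {"action": "increase_cooking_time", "delta_s": step_s}
    return {"action": "continue", "delta_s": 0}
```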
5. The control system of claim 1, wherein the processing module is further configured to identify the object being cooked according to a food identification model and to determine the corresponding cooking-state monitoring model according to the identification result.
6. The control system of claim 1, wherein the processing module is further configured to identify the doneness of the object being cooked in the real-time image of the cooking node according to a doneness determination model, wherein the doneness determination model comprises standard images characterizing doneness states of the object being cooked;
the sending module is further configured to obtain a corresponding control instruction according to the recognition result and send the control instruction to the cooking appliance.
7. The control system of claim 1, wherein the processing module is further configured to identify, according to a cleaning model, the cleanliness of the image acquisition module or the cleanliness of the cavity region corresponding to its viewing angle, wherein the cleaning model comprises standard images characterizing an obstructed viewing angle of the image acquisition module or standard images showing an oil-contamination condition inside the cavity;
the sending module is further configured to obtain, according to the recognition result, a control instruction for self-cleaning or for issuing a cleaning reminder.
8. The control system of any one of claims 1-7, wherein the cooking appliance further comprises a sensor module; the receiving module is further configured to acquire at least one of the following detection parameters collected by the sensor module: the weight of the object being cooked, the internal temperature of the object being cooked, and the temperature inside the cavity of the cooking appliance;
and the sending module is configured to obtain a corresponding control instruction by combining the recognition result with the detection parameter and to send the control instruction to the cooking appliance.
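Claim 8's point is that the image result alone can mislead: a roast can look done on the surface while the core is still below target. A hedged sketch of such a fusion rule, with all names and thresholds invented for illustration (the claim only requires combining the recognition result with a detection parameter):

```python
def fuse(recognition, core_temp_c, target_temp_c):
    """Combine the image recognition result with a temperature-probe
    reading. If the surface looks done but the core is still below
    target, continue heating at reduced temperature instead of stopping.
    """
    if recognition == "looks_done":
        if core_temp_c >= target_temp_c:
            return "stop_heating"
        return "lower_temperature_and_continue"
    return "continue"
```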
9. The control system of claim 1, wherein the image capture module comprises a first camera and a second camera, and wherein the first camera and the second camera have different capture perspectives.
10. The control system of claim 9, wherein the first camera and the second camera are respectively disposed on a door of the cooking appliance and a top of an inner cavity of the cooking appliance.
11. The control system of claim 1, wherein the cooking appliance comprises an oven.
CN202110361195.2A 2021-04-02 2021-04-02 Control system of cooking utensil Pending CN115177159A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110361195.2A CN115177159A (en) 2021-04-02 2021-04-02 Control system of cooking utensil

Publications (1)

Publication Number Publication Date
CN115177159A true CN115177159A (en) 2022-10-14

Family

ID=83511516

Country Status (1)

Country Link
CN (1) CN115177159A (en)

Similar Documents

Publication Publication Date Title
US11300300B2 (en) Dynamic quality management/monitoring system of a commercial cooking appliance
US11867411B2 (en) Cooking appliance with a user interface
CN107468048B (en) Cooking appliance and control method thereof
CN110780628B (en) Control method and device of cooking equipment, cooking equipment and storage medium
CN110664259B (en) Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium
CN108253484B (en) Range hood and control device and control method thereof
CN110824942B (en) Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium
CN107095120A (en) Cooking methods and device
CN114305153B (en) Food heating temperature control method and device of intelligent oven and storage medium
CN112741508A (en) Control method of cooking equipment and cooking equipment
CN112426060A (en) Control method, cooking appliance, server and readable storage medium
CN114224189A (en) Cooking equipment control method and device and cooking equipment
CN114305139B (en) Meat baking method and oven
CN110916533B (en) Baking control method and device for baking equipment, baking equipment and toaster
CN115177159A (en) Control system of cooking utensil
CN111241921B (en) Message reminding method and device of Internet of things operating system
CN112789447A (en) Method for preparing a culinary item, cooking appliance and cooking appliance system
CN110939950A (en) Dry burning prevention system and method for kitchen range with data interconnection function and kitchen range
CN115177158A (en) Cooking utensil
WO2019127650A1 (en) Smart oven
CN115153312A (en) Cooking appliance and control method thereof
CN115187827A (en) Cooking appliance cleaning reminding method and system and cooking appliance
CN115191838A (en) Display device of cooking utensil and cooking utensil
WO2023105023A1 (en) Method for displaying cooking state of cooking appliance and cooking appliance and control system therefor
US11838994B2 (en) Determination device and heating cooking apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination