CN115191839A - Method for displaying cooking state of cooking appliance, cooking appliance and control system thereof - Google Patents
Method for displaying cooking state of cooking appliance, cooking appliance and control system thereof
- Publication number
- CN115191839A (Application CN202111505009.4A)
- Authority
- CN
- China
- Prior art keywords
- cooking
- food
- node
- state
- appliance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47J—KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
- A47J37/00—Baking; Roasting; Grilling; Frying
- A47J37/06—Roasters; Grills; Sandwich grills
- A47J37/0623—Small-size cooking ovens, i.e. defining an at least partially closed cooking cavity
- A47J37/0664—Accessories
Landscapes
- Engineering & Computer Science (AREA)
- Food Science & Technology (AREA)
- Electric Ovens (AREA)
- Cookers (AREA)
Abstract
A method for displaying a cooking state of a cooking appliance, a cooking appliance and a control system thereof. Throughout the entire cooking process, the method presents the appliance's dynamic recognition of the cooking state and its intervention measures to the user in an intuitive way, so that the user understands how the appliance's intelligent control works and what it does to precisely control the cooking process. The user thereby perceives the degree of intelligence and the value of the cooking appliance directly, and the user experience is improved.
Description
Technical Field
Embodiments of the invention relate to the technical field of household appliances, and in particular to a method for displaying the cooking state of a cooking appliance, a cooking appliance and a control system thereof.
Background
In recent years, cooking appliances such as microwave ovens have come into increasingly wide use and have become indispensable household appliances. Existing cooking appliances, however, generally heat food rigidly according to parameters such as time and temperature set by the user. As a result, food may be overcooked or even scorched, or it may remain undercooked under the set parameters, requiring the user to extend the cooking time or raise the temperature. Such appliances are therefore not intelligent enough, the cooking results are poor, and the user experience suffers.
To address this, a prior-art oven captures an image of its interior with a camera and presents it to the user, so that the user can observe the cooking state of the food. However, this scheme still requires the user to adjust the cooking parameters based on the observed images; if the user does not intervene in time the cooking result is still affected, and the cooking process is not intelligently controlled.
Even when an oven offers intelligent control, the user usually only sees the oven's food recognition result and the parameter settings derived from it at the start. Once cooking begins, the oven feeds back only the camera's real-time picture, the temperature change inside the oven and the remaining time. From these alone an ordinary user cannot judge the doneness of the food or tell what the oven's intelligent control is actually doing; the intelligent control is a black box to the user, who may doubt whether the intelligence is genuine, and the experience is poor.
Disclosure of Invention
The embodiment of the invention provides a method for displaying a cooking state of a cooking appliance, which comprises the following steps: presetting N cooking nodes and N cooking node identification models of food to be cooked at different time points in a cooking process, wherein the cooking node identification models respectively comprise standard cooking state information of the corresponding cooking nodes, and N is a positive integer greater than or equal to 1; acquiring a cooking instruction of food, and cooking the food according to the received cooking instruction; acquiring actual cooking state information of a cooking node; comparing and identifying the acquired actual cooking state information with the standard cooking state information of the corresponding cooking node through a cooking node identification model; and sending an identification result corresponding to the at least one cooking node to a user terminal interface or a cooking appliance interface to be displayed to a user, wherein the identification result at least comprises normal cooking state information or abnormal cooking state information.
With this embodiment, the recognition models of the different cooking nodes within one cooking process are used to monitor the state of the food being cooked and to intervene and control intelligently, and the intelligent control process is then displayed to the user according to the recognition results, so that the user can directly perceive how the cooking appliance is being controlled and the user experience is improved.
Optionally, the method further includes displaying the recognition result corresponding to each cooking node to the user as a graphic mark placed according to the cooking progress. The user can thus see more intuitively how the appliance's intelligent recognition compares, judges and controls the cooking process.
Optionally, the method further includes: acquiring a control signal for operating the graphic mark by a user; and displaying the cooking state information of the cooking node corresponding to the graphic mark through a user terminal interface or a cooking appliance interface according to the control signal, wherein the cooking state information comprises that the current cooking state of the cooking node is a normal cooking state or an abnormal cooking state. Therefore, interaction with a user can be achieved through the display unit, and the cooking state of the corresponding cooking node can be checked.
Optionally, when the current cooking state of a cooking node is abnormal, information on the control parameters adjusted during cooking is also displayed through the user terminal interface or the cooking appliance interface in response to the control signal. On the one hand, monitoring the state of the food at preset cooking nodes ensures that the cooking process can be intervened in at key nodes in time, cooking parameters can be adjusted promptly, and deviations from the expected cooking state can be corrected, so that the doneness of the food matches expectations; on the other hand, this intelligent intervention process is shown to the user through the display unit, improving the user's perception of the intelligent control.
Optionally, the adjusted control parameter information includes control parameter information that is automatically adjusted by the cooking appliance.
For example, the adjusted control parameter information includes at least one of: cooking time, cooking temperature, cooking power.
Optionally, the standard cooking state information includes a cooking state standard image corresponding to a cooking node of food to be cooked in a cooking process; the actual cooking state information comprises an actual cooking state image of food to be cooked corresponding to the cooking node; the cooking state information of the cooking node corresponding to the graphic mark comprises: and the actual cooking state image of the food to be cooked at the cooking node and the cooking state standard image corresponding to the cooking node. The display device displays the real-time image and the corresponding standard image, so that a user can visually observe the current cooking state of food and whether the current cooking state reaches an expected cooking state, and the use experience is better.
Furthermore, the graphic marks corresponding to different recognition results are different, so that the user can read the cooking state of the appliance at each node from the mark itself.
Optionally, the abnormal cooking state information includes at least one of: the actual cooking state does not reach the standard cooking state corresponding to the cooking node, and the actual cooking state exceeds the standard cooking state corresponding to the cooking node.
Optionally, before the cooking instruction for the food is obtained, the method further comprises: presetting a food identification model for food identification; identifying the food through the preset food identification model, and obtaining a corresponding cooking instruction based on the identified food. Identifying the food type through image recognition and retrieving the corresponding cooking-state monitoring models allows the food to be identified automatically and its cooking process monitored, so that the cooking appliance works intelligently.
Optionally, the method further includes: and determining N cooking node identification models in a cooking process corresponding to the food according to the identification result of the food.
Optionally, the standard cooking state information includes a standard cooking state image of the food; the actual cooking state information includes an actual cooking state image of the food.
The embodiment of the invention also provides a cooking appliance, which comprises a display unit, an information acquisition unit and a control unit connected with the display unit and the information acquisition unit, wherein the information acquisition unit is configured to acquire actual cooking state information of food to be cooked in a cooking process; the control unit is configured to perform the method of any one of the preceding claims; the display unit is configured to receive the identification result sent by the control unit and display the identification result to a user.
Optionally, the control unit is further configured to adjust a cooking control parameter of the cooking appliance according to the identification result.
Optionally, the control unit is further configured to control the display unit to display the adjusted cooking control parameter.
Optionally, the control unit is wirelessly connected with the display unit and the information acquisition unit.
Optionally, the information acquiring unit includes an image capturing device.
The embodiment of the invention also provides a cooking appliance control system, which comprises a cooking appliance and a server in communication connection with the cooking appliance, wherein the cooking appliance is used for acquiring the actual cooking state information of food to be cooked in the cooking process; the server is used for executing the method of any one of the preceding claims; the cooking appliance is also used for receiving the identification result sent by the server and displaying the identification result to a user through a display unit.
Optionally, the cooking appliance is further configured to adjust a cooking control parameter of the cooking appliance for cooking food according to the recognition result.
Optionally, the cooking appliance is further configured to display the adjusted cooking control parameter.
An embodiment of the present invention further provides a computer-readable storage medium, which is a non-volatile storage medium or a non-transitory storage medium, and on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the above method.
Drawings
Fig. 1 is a schematic view of a cooking appliance according to an embodiment of the present invention;
fig. 2 is a schematic view of a frame of a cooking appliance according to an embodiment of the present invention;
fig. 3 is a flowchart of a method for displaying a cooking state of a cooking appliance according to an embodiment of the present invention;
fig. 4 is a schematic view of the cooking progress displayed by the display unit of fig. 2 according to an embodiment of the present invention;
fig. 5 is a schematic view of one display example based on fig. 4;
fig. 6 is a schematic view of another display example based on fig. 4;
fig. 7 is a schematic view of a display interface of the display unit of fig. 2;
fig. 8 is a schematic structural diagram of a control system of a cooking appliance according to an embodiment of the invention;
fig. 9 is a flowchart of the operation of the cooking appliance control system of fig. 8;
in the drawings:
1-a cooking appliance; 10-a body; 101-an input-output module; 1011-a display unit; 20-a cavity; 11-a control unit; 12-a heating module; 13-an information acquisition unit; 14-a tray; 2-a server.
Detailed Description
When using an intelligent cooking device such as an oven, the user hopes it can operate fully automatically so that manual intervention is reduced, but at the same time worries that the result produced by the intelligent control will not be what was expected. The user therefore wants appropriate information feedback so that the oven's automatic cooking can be monitored and a satisfactory result obtained.
In existing approaches, the user only sees the oven's recognition result for the food at the start, and the oven sets its parameters solely according to that result. Once cooking begins, the data the oven feeds back to the user may include only the camera's real-time picture, the temperature change inside the oven and the remaining time.
From these parameters alone, however, an ordinary user cannot judge the doneness of the food during cooking or tell what the oven's intelligent control is doing. The intelligent control process is therefore a black box to the user, who may doubt whether the intelligence is genuine, and the experience is poor.
Embodiments of the invention provide a method for displaying the cooking state of a cooking appliance, a display device for a cooking appliance, a cooking appliance and a control system thereof, which present the appliance's dynamic recognition of the cooking state and its intervention measures to the user in an intuitive way throughout the entire cooking process. The user thus understands how the appliance's intelligent control works and what it does to precisely control cooking, directly perceives the degree of intelligence and the value of the appliance, and has a better experience.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Fig. 1 and 2 are schematic views of a cooking appliance according to an embodiment of the present invention; fig. 3 is a flowchart of a method for displaying a cooking state of a cooking appliance according to an embodiment of the present invention. The cooking appliance 1 shown in fig. 1 and 2 may perform the method shown in fig. 3 to monitor the cooking state of the food at different cooking nodes while cooking, and to present the appliance's working process dynamically and intuitively to the user according to the recognition results.
Specifically, referring to fig. 1, the cooking appliance 1 of the present embodiment may be an oven and may include a body 10 and a cavity 20 arranged in the body 10, the cavity 20 being used to hold the food to be cooked. For example, the cooking appliance 1 may have a door capable of opening or closing the cavity 20; when the door is open the cavity 20 is exposed so that the user can put in or take out the food, and when the door is closed the cavity 20 is sealed and the food placed inside can be heated, roasted and so on.
The cooking appliance 1 may further include an input-output module 101, and the operating state of the cooking appliance 1 may be adjusted by operating the input-output module 101; the cooking state information may also be displayed through the input-output module 101. The operation state of the cooking appliance 1 may include heating power, heating direction, heating time, steam delivery amount, steam delivery direction, and the like. The adjustment of these operating states may be achieved by adjusting the operating states of specific functional modules in the cooking appliance 1.
Further, the cooking appliance 1 may further include a control unit 11, configured to adjust the operation state of the corresponding function module according to the instruction fed back by the input/output module 101, so that the operation state of the cooking appliance 1 conforms to the user instruction; the control unit 11 may also control an output mode of the input/output module 101, such as a screen display content or a display mode, through an input signal or an instruction.
For example, the cooking appliance 1 may comprise a heating module 12 for heating the food 2 to be cooked. The cooking appliance 1 may comprise a plurality of heating modules 12, and the plurality of heating modules 12 are dispersedly disposed in different areas of the cooking appliance 1 to heat the cavity 20 from different angles so that the cooked food 2 is heated as uniformly as possible. The control unit 11 can independently adjust the heating power of each heating module 12 to adjust the amount of heat radiated into the cavity 20 by each heating module 12. The control unit 11 can also independently adjust the heating direction of each heating module 12 to adjust the radiation angle of the heat radiated into the cavity 20 by each heating module 12. The control unit 11 may also adjust the heating time period for which a particular heating module 12 is operated at a particular heating power to heat the food to a desired effect.
As another example, the cooking appliance 1 may include a spray module (not shown) for delivering water vapor into the cavity 20 to adjust the humidity within the cavity 20. By adjusting the humidity in the cavity 20, the surface humidity of the cooked food can be adjusted, so that the water content of the cooked food is moderate, and the surface of the cooked food is prevented from being heated too dry or too wet. The cooking appliance 1 may include a plurality of spraying modules, and the plurality of spraying modules are dispersedly disposed in different areas of the cooking appliance 1 to deliver water vapor into the cavity 20 from different angles, so that the humidity distribution on the surface of the cooked food is uniform. The control unit 11 can independently adjust the vapor delivery amount of each spray module. The control unit 11 can also adjust the steam delivery direction of each spray module independently. The control unit 11 may also adjust the spray duration for which a particular spray module operates at a particular delivery rate of steam.
Note that the input-output module 101 in fig. 1 includes an input device that provides the input function and a display device that provides the output function. The input device and the display device may be integrated, for example as a touch display panel used both for data input and for displaying image information; the input device may also be a knob or adjustment keys, or take other forms such as a touch screen or a voice control module, and the display device may be a display screen or the like.
The display device may be disposed on the cooking appliance 1, or may be detachably connected to the cooking appliance 1.
In this embodiment, the display device may include a display unit 1011 and a control unit 11 connected thereto, wherein the control unit 11 may be configured to display a progress pattern reflecting the current cooking progress of the cooking appliance 1 through the display unit 1011. The cooking states of the cooking appliance 1 at different cooking nodes, including real-time cooking state information and intervention information of the cooking appliance 1, are thus presented to the user in a visual manner.
In a specific embodiment, the control unit 11 connected to the display unit 1011 may be a control module of the cooking appliance itself, through which the control of the cooking appliance 1 and the control of the display unit 1011 are realized.
Further, the control unit 11 may execute the method shown in fig. 3 to monitor the cooking state of the cooked food, and display the control parameters or intervention behaviors of the cooking appliance 1 during the cooking process through the display unit 1011 according to the recognition result. For example, the control unit 11 may comprise or be externally connected to a memory (not shown) storing a computer program which, when executed by the processor, performs the steps of the method shown in fig. 3.
It should be noted that fig. 1 only shows one possible position of the control unit 11 in the cooking appliance 1; in practice its position can be adjusted as needed. In a variation, the control unit 11 may be arranged outside the cooking appliance 1 (for example on the server 2), the cooking appliance 1 having a communication module (not shown) with which the control unit 11 communicates to send control instructions that control the operating state of the cooking appliance 1.
Further, the cooking appliance 1 may include an information acquisition unit 13 disposed in the cavity 20 or at the door to collect actual cooking state information of the food from at least one angle, and the control unit 11 communicates with the information acquisition unit 13 to obtain the data it collects.
For example, the information acquisition unit 13 may be a camera disposed at the top of the cavity 20 whose field of view covers at least the tray 14 below it; several cameras may be provided to photograph the food from different angles.
Alternatively, the camera may be disposed on the inside of the door, with a shooting range that at least covers the area where the food is located.
Further, the cooking appliance 1 may include a temperature detecting means (not shown) for detecting the temperature of the surface of the food, the inside of the body, or the like.
For example, a temperature sensor (not shown) may be disposed within the cavity 20 to measure the temperature inside the cavity 20. As another example, the tray 14 may be provided with a temperature sensor or probe (not shown) to measure the surface temperature or the internal temperature of the food being cooked.
In a specific embodiment, referring to fig. 3, the method for displaying the cooking state of the cooking appliance 1 according to the embodiment may include the following steps:
step S100, presetting N cooking nodes and N cooking node identification models of food to be cooked at different time points in a cooking process, wherein the cooking node identification models respectively comprise standard cooking state information of the corresponding cooking nodes, and N is a positive integer greater than or equal to 1;
step S102, obtaining a cooking instruction of food, and cooking the food according to the received cooking instruction;
step S104, acquiring actual cooking state information of the cooking node;
step S106, comparing and identifying the acquired actual cooking state information with the standard cooking state information of the corresponding cooking node through a cooking node identification model;
step S108, sending the recognition result corresponding to at least one cooking node to a user terminal interface or a cooking appliance interface to be displayed to the user, wherein the recognition result includes at least normal cooking state information or abnormal cooking state information. The cooking appliance interface here can be understood as the display device or display unit of the cooking appliance; the user terminal interface may be a mobile terminal or the like wirelessly connected to the cooking appliance. A minimal sketch of this flow is given below.
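The following Python sketch, included purely for illustration, shows one way steps S100 to S108 could be organized. The names CookingNodeModel, run_cooking, acquire_state, send_result and the doneness field are assumptions made for this example and do not come from the patent.

```python
from dataclasses import dataclass

@dataclass
class CookingNodeModel:
    """Hypothetical recognition model holding the standard state of one cooking node."""
    node_time_s: int       # time point of this node within the cooking process
    standard_state: dict   # e.g. colour/shape parameters of the node's standard image

    def recognize(self, actual_state: dict) -> str:
        """Compare the actual state with the standard state (placeholder logic)."""
        if actual_state["doneness"] < self.standard_state["doneness"]:
            return "below_standard"    # abnormal: food has not reached the standard state
        if actual_state["doneness"] > self.standard_state["doneness"]:
            return "beyond_standard"   # abnormal: food has gone past the standard state
        return "normal"

def run_cooking(node_models, cook, acquire_state, send_result):
    """Steps S100-S108: cook, check each preset node, report each recognition result."""
    cook()                                          # S102: start cooking per the instruction
    for model in node_models:                       # S100: N preset node models
        actual = acquire_state(model.node_time_s)   # S104: actual state info at the node
        result = model.recognize(actual)            # S106: compare with the standard state
        send_result(model.node_time_s, result)      # S108: display on appliance/terminal UI
```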
In this embodiment, a recipe for a food item generally corresponds to at least one cooking program, and the program may include different cooking nodes used to monitor the doneness or cooking state of the food during different periods of the cooking process. Recognition models for the different nodes are used to monitor the state of the food and to intervene and control intelligently, and the intelligent control process is then displayed to the user according to the recognition results, so that the user can directly perceive how the appliance is being controlled and the user experience is improved.
Specifically, when cooking is about to start, the prepared food is placed into the cooking appliance 1. When the door of the cooking appliance 1 is closed, the information acquisition unit 13 starts working and captures images in real time; alternatively, the information acquisition unit 13 starts working when the user sets cooking parameters such as heating time and heating temperature and starts cooking.
The information acquisition unit 13 may store the captured images in the appliance's own memory or transmit them to the memory of a remote server. The control unit 11 monitors the cooking state of the food from the captured images, recognizes the actual cooking state information at each cooking node with the preset cooking node recognition model, and sends the recognition result for the node to the user terminal interface or the display unit of the cooking appliance for display to the user.
In one embodiment, the cooking appliance 1 may adjust the cooking parameters according to the recognition result and display the adjusted parameters to the user through the display unit 1011, either during cooking or when the cooking process has finished. The cooking process is thus monitored intelligently, the parameters are adjusted promptly when an abnormal state is found, and the intelligent control process is displayed to the user visually.
The cooking node recognition models, each containing the standard cooking state information of its node, may be stored in the memory in advance. For example, the standard cooking state information may be standard food-state images corresponding to the individual cooking nodes. A standard image can be understood as the state the food should have reached at that node and may include colour parameters, shape parameters and the like that characterize the state of the food; it is obtained by training on a large amount of cooking experiment data. A crude illustrative comparison of such features is sketched below.
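As a purely illustrative sketch of what comparing an actual image with a standard image might look like using simple colour features: the choice of mean RGB, the tolerance value and the darker-means-further-cooked heuristic are assumptions for this example, whereas a real model would be trained on cooking experiment data as described above.

```python
def mean_rgb(image_rows):
    """Average RGB of an image given as rows of (r, g, b) pixels."""
    pixels = [px for row in image_rows for px in row]
    n = len(pixels)
    return tuple(sum(px[i] for px in pixels) / n for i in range(3))

def compare_to_standard(actual_rows, standard_rows, tolerance=12.0):
    """Crude colour-distance comparison between the actual and the standard image."""
    a, s = mean_rgb(actual_rows), mean_rgb(standard_rows)
    distance = sum((ai - si) ** 2 for ai, si in zip(a, s)) ** 0.5
    if distance <= tolerance:
        return "normal"
    # assumption: a darker surface than the standard suggests the food is ahead of schedule
    return "beyond_standard" if sum(a) < sum(s) else "below_standard"
```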
In a particular implementation, different cooking nodes correspond to different standard images.
In a specific implementation, the cooking node identification model corresponds to the type of food one to one.
For example, the memory stores a cooking node recognition model A for cooking steak, a cooking node recognition model B for cooking sweet potato, and so on.
For another example, for a steak-cooking program with 3 cooking nodes, the corresponding model A comprises 3 cooking node recognition submodels A1, A2 and A3, each containing the standard steak image for its node. While the steak is being cooked, the real-time images at the 3 nodes are compared with the corresponding standard images, the cooking parameters are adjusted according to the recognition results, and the results are displayed to the user. A recognition result may indicate that the steak is in a normal cooking state at a node, or that it is in an abnormal cooking state.
It should be noted that the above is only an example; in practice the memory may hold cooking node recognition models for many food types, extending the range of recipes the cooking appliance 1 can cook intelligently.
In a specific embodiment, the display device of the cooking appliance 1 may display the identification result corresponding to the cooking node to the user in a manner of graphical marking according to the cooking progress under the control of the control unit 11.
In one embodiment, the control unit 11 may control the display unit 1011 to display a progress pattern reflecting a complete cooking process, wherein the progress pattern includes a plurality of cooking nodes spaced apart in time, each representing different cooking state information of the food during the cooking process and each visually distinguishable. This display mode may be used after cooking has finished or during cooking.
Preferably, the control unit 11 may control the display unit 1011 to display a progress pattern reflecting the current cooking progress, wherein the progress pattern likewise includes a plurality of cooking nodes spaced apart in time, each representing different cooking state information of the food and each visually distinguishable.
For example, the progress pattern may include a bar pattern along which the current cooking progress extends over time in the form of brightness or color of light, so that the cooking progress of the cooking appliance can be intuitively known through the change of the light or color.
For another example, the progress pattern may include a background pattern that characterizes the complete cooking progress during a cooking process, and the current cooking progress gradually covers or replaces the background pattern over time in a manner that distinguishes the brightness or color of the light from the background pattern.
Further, the cooking node may be located on the background pattern to visually present key nodes of the cooking process, facilitating interactive operation of the cooking apparatus or the display device.
With the display device configured as described above, during intelligent cooking the user can directly perceive the doneness of the food being cooked and see what role the image-recognition-based intelligence plays throughout the process, what it controls and how it works. Displaying the dynamic recognition process of the cooking appliance 1 in this way increases the user's confidence in both the experience and the technology.
For example, fig. 4 shows a complete cooking process on the display unit as a time axis on which key state points, i.e. cooking nodes, are placed; here the three cooking nodes of a steak are shown as bubble graphics.
Further, the time information of the node may be displayed on the corresponding node. For example, the first node in fig. 4 represents the result of recognizing the cooking state at 10 minutes and 30 seconds.
Further, whether the recognition result of the current cooking node is the normal cooking state or the abnormal cooking state may be represented by a bubble image of a different color. In fig. 4, one recognition result is represented by black filled bubbles, and the other recognition result is represented by blank bubbles. The above description is only exemplary, and different recognition results can be distinguished in different ways in practical application.
Further, the user can view the recognition result of the corresponding node by operating the graphic mark, for example, clicking a bubble graphic in the display unit.
In a specific embodiment, referring to figs. 5 and 6, the user can tap the graphic mark to be viewed, for example a bubble mark, on the display unit 1011; the cooking appliance 1 obtains, via the control unit 11, the control signal produced by the user operating the mark, and in response displays through the user terminal interface or the cooking appliance interface the cooking state information of the node corresponding to the mark, including whether the current cooking state of that node is normal or abnormal. The display unit thus also serves as a means of interaction, letting the user inspect the state of each cooking node.
Preferably, the control unit 11 responds to a user operation on a cooking node only after the current cooking progress has reached or passed that node, and then displays the node's cooking state information on the display unit 1011. Before the progress reaches a node, the cooking appliance 1 has not yet recognized, intervened in or adjusted the cooking process at that node, so there is no state information to view; only once the node has been reached or passed does operating it produce a response and show the state information on the display unit.
Further, the cooking state information of the node corresponding to a graphic mark may include the actual cooking state image of the food at that node and the corresponding standard cooking state image, so that the user can see at a glance whether the actual progress at that node matches expectations.
Further, when the current cooking state of the cooking node is an abnormal cooking state, the control parameter information adjusted in the cooking process is simultaneously displayed through a user terminal interface or a cooking appliance interface according to the control signal, so that intelligent intervention measures of the cooking appliance 1 on the node are intuitively observed, and the cooking result is as expected as possible.
For example, as shown in fig. 5, when the user taps the first bubble, i.e. the first cooking node, the display unit presents the screen shown on the right of the figure. The screen contains the standard cooking state image, the actual cooking state image, and the note that the cooking time has been increased by 10 seconds. This shows the user that at this node the actual state of the steak deviated from the standard state, i.e. the state was abnormal, and that the appliance adjusted the cooking time based on the comparison; the adjustment is displayed to the user through the display unit.
For another example, as shown in fig. 6, when the user taps the third bubble, i.e. the third cooking node, the display unit presents the screen shown on the right of the figure. The screen contains the standard cooking state image and the actual cooking state image and indicates that the current cooking state is the standard state. At this node the actual state of the steak matches the standard state, i.e. the state the food should have reached, so this is a normal cooking state, and the display unit simply shows "current cooking state is the standard state". A sketch of this node-tap interaction is given below.
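The sketch below is purely illustrative of the interaction described above; the NodeResult fields, the reached-node check and the message strings are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NodeResult:
    time_s: int                   # node position on the time axis
    standard_image: str           # path of the standard image for this node
    actual_image: Optional[str]   # path of the captured image, None until reached
    abnormal: Optional[bool]      # None until the node has been recognized
    adjustment: str = ""          # e.g. "cooking time +10 s"

def on_node_tapped(node: NodeResult, progress_s: int) -> Optional[dict]:
    """Build the screen content for a tapped bubble mark (figs. 5 and 6)."""
    if progress_s < node.time_s:
        return None                                   # node not reached yet: no response
    screen = {"standard": node.standard_image, "actual": node.actual_image}
    if node.abnormal:
        screen["message"] = f"abnormal state, {node.adjustment}"
    else:
        screen["message"] = "current cooking state is the standard state"
    return screen
```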
In a common embodiment, the abnormal cooking state information includes at least one of: the actual cooking state does not reach the standard cooking state corresponding to the cooking node, and the actual cooking state exceeds the standard cooking state corresponding to the cooking node.
For example, when the abnormal state is that the actual cooking state has not reached the standard state for the node, the cooking appliance increases at least one control parameter, such as the cooking time, the cooking temperature or the cooking power.
When the abnormal state is that the actual cooking state has exceeded the standard state for the node, the cooking appliance reduces at least one control parameter, such as the cooking time, the cooking temperature or the cooking power. A sketch of this adjustment rule is given below.
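The following sketch illustrates this rule under simple assumptions; the 20-second step and the parameter names are invented for the example, and a real appliance would derive the magnitudes from its own control logic.

```python
def adjust_parameters(result: str, params: dict, step_s: int = 20) -> dict:
    """Raise or lower control parameters depending on the node recognition result.

    result is "below_standard", "beyond_standard" or "normal";
    params holds e.g. {"time_s": 480, "temperature_c": 200, "power_w": 1600}.
    """
    adjusted = dict(params)
    if result == "below_standard":
        adjusted["time_s"] += step_s      # e.g. extend the planned cooking time
    elif result == "beyond_standard":
        adjusted["time_s"] -= step_s      # e.g. 8 min planned becomes 7 min 40 s
    return adjusted                       # "normal" leaves the parameters unchanged
```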
In an embodiment, the adjusted control parameter information is control parameter information adjusted automatically by the cooking appliance: the correspondence between recognition results and control parameters may be preset, so that the cooking appliance 1 automatically obtains the appropriate control parameters for a given recognition result.
For example, when cooking a steak with the cooking appliance 1, the user selects a doneness of 7 and the expected cooking time is 8 minutes. During cooking, by comparing the real-time image from the information acquisition unit 13 with the standard image, the control unit 11 finds at the third cooking node, say at the 6th minute, that the steak's state already exceeds the state in the corresponding standard image. If heating continued for the full preset 8 minutes, the steak would very likely end up overdone and not as expected. The control unit therefore reduces the total heating time, for example by 20 seconds, adjusting the planned 8 minutes to 7 minutes 40 seconds. Such fine-tuning avoids over-cooking, so that the result matches expectations.
In one implementation, the N cooking nodes of the food at different time points in a cooking process are preset based on cooking experiment data for different foods. Experiments show that most ingredients change in colour and form during cooking in comparable ways, which can be used to judge the cooking state. The whole cooking process can therefore be divided into different state points, i.e. cooking nodes, according to the characteristics of the food. The number of state points differs between foods: experiments identify, for example, 3 state points for steak and 2 for chicken wings, and these become the key reference points the appliance uses to judge the cooking state.
During cooking, recognizing the state of the food at the preset cooking nodes ensures that the process can be intervened in at the key nodes in time, parameters can be adjusted promptly, and deviations from the expected state can be corrected. On the one hand the cooking result matches expectations; on the other hand, marking the cooking states on the time-progress axis shows the user the important nodes where the artificial intelligence intervened, so the user knows what the intelligence did for the cooking and perceives the control process directly.
In one embodiment, the real-time image of the food under the current cooking node and the corresponding standard image thereof may be displayed through the display unit 1011 to present the cooking state of the food under the current cooking node.
For example, when a cooking node is reached during cooking, the control unit 11 may call up the stored actual cooking state image and standard image for that node and present them to the user through the display unit 1011; likewise, when the user taps the graphic mark of a node, the control unit 11 presents that node's actual and standard images. The appliance thus adjusts the cooking process intelligently while the user can see the food's state, the cooking progress and the appliance's interventions on the cooking parameters, giving a better experience.
In one embodiment, the control unit 11 may also control the display unit 1011 to display a first area including an image of the cooked food displayed in real time and a second area including the progress pattern through the display unit 1011. For example, as shown in fig. 7, a display interface of the display unit 1011 includes a real-time cooking status image of the cooked food, i.e., steak, and an intelligent recognition pattern of the cooking process presented in a progress bar. The display interface can also comprise cooking control parameter information of the cooking appliance and the like.
In one variation, the method may also adjust cooking parameters in conjunction with the real-time internal temperature of the food. The real-time internal temperature of the food may be acquired by a probe or other sensor, and the cooking appliance 1 is controlled according to the real-time internal temperature and the current cooking state of the food.
Specifically, when the real-time internal temperature is lower than the standard temperature, the cooking time or the cooking temperature is increased; otherwise, the cooking time is reduced or the cooking temperature is lowered. On the basis of monitoring the cooking state through image recognition, the cooking appliance 1 is controlled by combining the internal temperature of food, so that the cooking parameters can be better controlled to obtain the expected food cooking effect.
For example, during cooking, the internal temperature of the food does not reach the expected temperature at a certain cooking node, which may lead to the problem that the food does not cook well, and the cooking time and the cooking temperature are adjusted by monitoring the internal temperature in real time. Conversely, when the internal temperature of the food is too high at the cooking node, the food is possibly overcooked, and the overcooked food can be avoided by monitoring the internal temperature in real time and intervening in advance.
For another example, for some foods (e.g. steak) the difference between the images at a doneness of 7 and a doneness of 9 is very small, or the appearance and colour of a food change little during part of the cooking process and cannot be distinguished reliably from images alone; combining the internal temperature then gives better control over the cooking result. A sketch combining the two signals is given below.
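A minimal sketch of combining image-based node recognition with the probe's internal temperature, as described above; the 30-second step, the field names and the decision order are assumptions for illustration only.

```python
def combined_decision(image_result: str, internal_temp_c: float,
                      standard_temp_c: float, params: dict) -> dict:
    """Adjust parameters using both the image result and the internal temperature."""
    adjusted = dict(params)
    if internal_temp_c < standard_temp_c or image_result == "below_standard":
        # food is behind schedule: extend the time (or raise the temperature)
        adjusted["time_s"] += 30
    elif internal_temp_c > standard_temp_c or image_result == "beyond_standard":
        # food is ahead of schedule: trim the time (or lower the temperature)
        adjusted["time_s"] = max(0, adjusted["time_s"] - 30)
    return adjusted
```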
In one implementation, the method may be preceded by food recognition, with the cooking node recognition models matched automatically to the recognition result. Specifically, a food identification model is preset; an image of the food is captured and identified by this model, a corresponding cooking instruction is obtained for the identified food, and the N cooking node recognition models of the corresponding cooking process are then selected according to the recognition result, as in the sketch below.
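An illustrative sketch of this pre-cooking step; classify_food stands in for the preset food identification model, and the registry contents (parameters and submodel names) are invented for the example.

```python
# Hypothetical registry: food type -> (cooking instruction, node recognition submodels)
COOKING_REGISTRY = {
    "steak":        ({"temperature_c": 200, "time_s": 480},  ["A1", "A2", "A3"]),
    "sweet_potato": ({"temperature_c": 180, "time_s": 2400}, ["B1", "B2"]),
}

def prepare_cooking(image, classify_food):
    """Identify the food from an image and pick its instruction and node models."""
    food_type = classify_food(image)          # preset food identification model
    if food_type not in COOKING_REGISTRY:
        raise ValueError(f"no cooking program for {food_type!r}")
    instruction, node_models = COOKING_REGISTRY[food_type]
    return food_type, instruction, node_models
```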
In a variation, the cooking appliance 1 may be provided with several information acquisition units 13 that capture images of the food from different angles and produce separate recognition results, one of which is selected by priority to improve the accuracy of food recognition.
In one implementation, the method may further check at the end of cooking whether the food has matured as expected. Specifically, as the cooking process approaches its end, the cooking state image of the last cooking node is acquired and it is judged whether the food at that node is done; if so, the cooking appliance is controlled to stop, and if not, the cooking time or temperature is increased. This avoids ending with undercooked food, or continuing to cook food that is already done, which would waste energy and spoil its taste. A sketch of this final check is given below.
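A minimal sketch of the end-of-cooking check, under the assumption that the last node's model exposes an is_done predicate (an invented name for this example) and that stopping or extending is exposed as callables.

```python
def finish_cooking(last_node_image, is_done, stop_appliance, extend_cooking,
                   extra_time_s: int = 60):
    """At the end of the program: stop if the food is done, otherwise cook a bit longer."""
    if is_done(last_node_image):      # judged by the last cooking node's model
        stop_appliance()
    else:
        extend_cooking(extra_time_s)  # e.g. add time (or raise the temperature)
```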
An embodiment of the invention further discloses a control system for the cooking appliance 1, comprising the cooking appliance 1 and a server 2 that interact with each other; the control unit 11 may also be located on the server 2 and configured to perform the method of fig. 3 described above. Specifically, the control unit 11 is communicatively connected with the cooking appliance 1, and the cooking appliance 1 receives the recognition result sent by the server 2 and displays it to the user through the display unit 1011. Placing the control unit 11 on the server 2 and performing the recognition and other data processing there makes it easier to train, update and improve the models and the accuracy of the processing on the one hand, and reduces the complexity of the cooking appliance itself on the other, so the intelligence of the appliance's control is enhanced.
The cooking appliance 1 can work according to the cooking control parameters sent by the control unit 11; specifically, the cooking appliance 1 may adjust the cooking control parameter of its cooking food according to the aforementioned recognition result.
Further, the cooking appliance 1 is also used for displaying the adjusted cooking control parameter. Specifically, the cooking appliance controls the display unit 1011 through the control unit 11 to display the adjusted control parameters thereof, so that the intelligent control process of the cooking appliance is visually displayed for the user, and the user experience is improved.
An embodiment of the control system is described in detail below with reference to fig. 9, taking a smart oven as an example.
When the oven is started, it captures video of the food in real time and uploads it to the server as a video stream. The server identifies the food with the food identification model and retrieves the corresponding cooking parameters. It then sends the parameters to the oven as a cooking instruction, and the oven displays the instruction information through the display unit.
The user confirms the displayed instruction and feeds back the user response information to the oven to indicate the oven to start cooking.
The oven starts to cook, and in the cooking process, the oven uploads the video stream of the food to be cooked to the server in real time through the camera device.
The server receives the video stream and recognizes the cooking state at each cooking node with the corresponding preset cooking node recognition model.
In fig. 9, if the cooking node 1 recognition model recognizes that the node is an abnormal node, an intervention measure for the abnormal node, i.e., an increase or decrease in cooking time, and a real-time cooking state image of the cooking node are transmitted to the oven.
The oven receives the recognition result and controls the display unit to display the recognition result, wherein the display mode of the recognition result can be a mode of graphic marks, including an actual cooking state image and a standard image, and parameter information of intervention measures of the oven for the node, such as increasing or decreasing cooking time, and the like, as shown in fig. 4 and 5.
If the cooking node 1 identification model identifies that the node is a normal node, the identification result is sent to the oven, and the display unit is controlled to display the identification result, wherein the displayed content comprises an actual cooking state image and a standard image, as shown in fig. 4 and 6.
A cooking process includes a plurality of cooking nodes for the food, each corresponding to a preset cooking node recognition model. The recognition and display for the other cooking nodes, nodes 2 to N, are similar to those for cooking node 1 and are not repeated here.
When cooking is finished, a cooking-finished instruction is sent to the oven, and the oven displays the finished state through the display unit. A sketch of this exchange between oven and server is given below.
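The following sketch outlines the exchange of fig. 9 from the server's side; the message format, function names, adjustment values and transport layer are all assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class NodeModel:
    node_time_s: int
    recognize: Callable[[bytes], str]  # frame -> "normal" / "below_standard" / "beyond_standard"

def server_loop(receive_frame, node_models, send_to_oven):
    """Per-node recognition on the server: consume the video stream, report each node."""
    for model in node_models:                      # cooking nodes 1..N
        frame = receive_frame(model.node_time_s)   # frame taken from the oven's video stream
        result = model.recognize(frame)            # compare with the node's standard state
        message = {"node": model.node_time_s, "result": result, "image": frame}
        if result != "normal":
            # intervention measure, e.g. increase or decrease the cooking time
            sign = "+" if result == "below_standard" else "-"
            message["adjustment"] = f"cooking time {sign}10 s"
        send_to_oven(message)                      # the oven displays it as a bubble mark
    send_to_oven({"event": "cooking_finished"})    # final instruction; oven shows the done state
```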
The embodiment of the present invention further provides a computer-readable storage medium, which is a non-volatile storage medium or a non-transitory storage medium, and a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the control method provided in any of the above embodiments.
Although specific embodiments have been described above, they are not intended to limit the scope of the disclosure, even where only a single embodiment is described for a particular feature. The examples of features provided in this disclosure are intended to be illustrative rather than limiting unless stated otherwise. In particular implementations, features of one or more dependent claims may be combined with the features of the independent claims, and features of the respective independent claims may be combined in any appropriate manner, not merely in the specific combinations enumerated in the claims.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (20)
1. A method of demonstrating a cooking state of a cooking appliance, the method comprising:
presetting N cooking nodes and N cooking node identification models of food to be cooked at different time points in a cooking process, wherein the cooking node identification models respectively comprise standard cooking state information of the corresponding cooking nodes, and N is a positive integer greater than or equal to 1;
acquiring a cooking instruction of food, and cooking the food according to the received cooking instruction;
acquiring actual cooking state information of a cooking node;
comparing and identifying the acquired actual cooking state information with the standard cooking state information of the corresponding cooking node through a cooking node identification model;
and sending an identification result corresponding to the at least one cooking node to a user terminal interface or a cooking appliance interface to be displayed to a user, wherein the identification result at least comprises normal cooking state information or abnormal cooking state information.
2. The method of claim 1, wherein the sending the identification result corresponding to the at least one cooking node to a user terminal interface or a cooking appliance interface for presentation to a user comprises: and displaying the identification result corresponding to the cooking node to a user in a graphic marking mode according to the cooking progress.
3. The method of claim 2, wherein the method further comprises:
acquiring a control signal for operating the graphic mark by a user;
and displaying the cooking state information of the cooking node corresponding to the graphic mark through a user terminal interface or a cooking appliance interface according to the control signal, wherein the cooking state information comprises that the current cooking state of the cooking node is a normal cooking state or an abnormal cooking state.
4. The method of claim 3, wherein when the current cooking state of the cooking node is an abnormal cooking state, information of control parameters adjusted during a cooking process is simultaneously displayed through a user terminal interface or a cooking appliance interface according to the control signal.
5. The method of claim 4, wherein the adjusted control parameter information comprises automatically adjusted control parameter information for the cooking appliance.
6. The method of claim 4, wherein the adjusted control parameter information includes at least one of: cooking time, cooking temperature, cooking power.
7. The method of claim 3, wherein the standard cooking state information includes a cooking state standard image corresponding to a cooking node of the food to be cooked during one cooking process; the actual cooking state information comprises an actual cooking state image of the food to be cooked corresponding to the cooking node;
the cooking state information of the cooking node corresponding to the graphic mark comprises: and the actual cooking state image of the food to be cooked at the cooking node and the cooking state standard image corresponding to the cooking node.
8. The method of claim 2, wherein the graphic marks corresponding to different recognition results are different.
9. The method of claim 1, wherein the abnormal cooking state information includes at least one of: the actual cooking state does not reach the standard cooking state corresponding to the cooking node, and the actual cooking state exceeds the standard cooking state corresponding to the cooking node.
10. The method of claim 1, wherein before the obtaining of the cooking instruction of the food, the method further comprises: presetting a food identification model for food identification; and identifying the food through the preset food identification model, and obtaining a corresponding cooking instruction based on the identified food.
11. The method of claim 10, wherein the method further comprises: and determining N cooking node identification models in a cooking process corresponding to the food according to the identification result of the food.
12. The method of claim 1, wherein the standard cooking state information includes a standard cooking state image of the food; the actual cooking state information includes an actual cooking state image of the food.
13. A cooking appliance, comprising a display unit, an information acquisition unit, and a control unit connected with the display unit and the information acquisition unit, wherein
the information acquisition unit is configured to acquire actual cooking state information of food to be cooked in a cooking process;
the control unit is configured to perform the method of any one of claims 1-12;
the display unit is configured to receive the identification result sent by the control unit and display the identification result to a user.
14. The cooking appliance of claim 13, wherein the control unit is further configured to adjust a cooking control parameter of the cooking appliance according to the identification result.
15. The cooking appliance of claim 14 wherein the control unit is further configured to control the display unit to display the adjusted cooking control parameter.
16. The cooking appliance of claim 13, wherein the control unit is wirelessly connected to the display unit and the information acquisition unit.
17. The cooking appliance of claim 13, wherein the information acquisition unit comprises a camera.
18. A cooking appliance control system, comprising a cooking appliance and a server in communication connection with the cooking appliance, characterized in that the cooking appliance is used for acquiring actual cooking state information of food to be cooked in a cooking process;
the server is configured to perform the method of any one of claims 1-12;
and the cooking appliance is also used for receiving the identification result sent by the server and displaying the identification result to a user through a display unit.
19. The cooking appliance control system of claim 18, wherein the cooking appliance is further configured to adjust a cooking control parameter for the food it cooks according to the identification result.
20. The cooking appliance control system of claim 19, wherein the cooking appliance is further configured to display the adjusted cooking control parameter.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2022/085111 WO2023105023A1 (en) | 2021-12-10 | 2022-12-09 | Method for displaying cooking state of cooking appliance and cooking appliance and control system therefor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110362519 | 2021-04-02 | ||
CN2021103625194 | 2021-04-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115191839A true CN115191839A (en) | 2022-10-18 |
Family
ID=83574179
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111505007.5A Pending CN115191838A (en) | 2021-04-02 | 2021-12-10 | Display device of cooking utensil and cooking utensil |
CN202111505009.4A Pending CN115191839A (en) | 2021-04-02 | 2021-12-10 | Method for displaying cooking state of cooking appliance, cooking appliance and control system thereof |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111505007.5A Pending CN115191838A (en) | 2021-04-02 | 2021-12-10 | Display device of cooking utensil and cooking utensil |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN115191838A (en) |
- 2021-12-10: CN202111505007.5A filed (published as CN115191838A, status pending)
- 2021-12-10: CN202111505009.4A filed (published as CN115191839A, status pending)
Also Published As
Publication number | Publication date |
---|---|
CN115191838A (en) | 2022-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110664259B (en) | Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium | |
CN110780628B (en) | Control method and device of cooking equipment, cooking equipment and storage medium | |
CN110824942B (en) | Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium | |
CN107468048B (en) | Cooking appliance and control method thereof | |
CN111481049B (en) | Cooking equipment control method and device, cooking equipment and storage medium | |
US11867411B2 (en) | Cooking appliance with a user interface | |
CN110123149B (en) | Cooking control method of cooking equipment and cooking equipment | |
CN110806699A (en) | Control method and device of cooking equipment, cooking equipment and storage medium | |
CN106037448A (en) | Cooking control method and equipment and cooking device | |
CN110234040B (en) | Food material image acquisition method of cooking equipment and cooking equipment | |
CN110222720A (en) | A kind of cooking equipment with short video acquisition function | |
CN115981141A (en) | Control method, device, equipment and medium based on adaptive matching | |
CN114893946B (en) | Food storage device and intelligent cooking method | |
CN112426060A (en) | Control method, cooking appliance, server and readable storage medium | |
CN110275456A (en) | Cooking control method, system and computer readable storage medium | |
CN111131855A (en) | Cooking process sharing method and device | |
CN209733642U (en) | Intelligent cooking equipment | |
US12066193B2 (en) | Method for preparing a cooking product, cooking device, and cooking device system | |
US20220273138A1 (en) | Cooking Device, Control Method Therefor, Control System Thereof and Computer-Readable Storage Medium | |
CN111419096B (en) | Food processing method, controller and food processing equipment | |
CN115191839A (en) | Method for displaying cooking state of cooking appliance, cooking appliance and control system thereof | |
CN115177159A (en) | Control system of cooking utensil | |
WO2023105033A1 (en) | Display apparatus of cooking appliance and cooking appliance | |
WO2023105023A1 (en) | Method for displaying cooking state of cooking appliance and cooking appliance and control system therefor | |
CN115177158A (en) | Cooking utensil |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |