CN113729500A - Control method of cooking equipment, control device and storage medium - Google Patents


Info

Publication number
CN113729500A
CN113729500A (application CN202110987483.9A)
Authority
CN
China
Prior art keywords
cooking
food
information
user
cooked
Prior art date
Legal status (assumed; not a legal conclusion)
Withdrawn
Application number
CN202110987483.9A
Other languages
Chinese (zh)
Inventor
杨翔
杨士葶
宁建
张岩
Current Assignee (the listed assignees may be inaccurate)
Shenzhen Shuying Technology Co ltd
Original Assignee
Shenzhen Shuying Technology Co ltd
Priority date (assumed; not a legal conclusion)
Application filed by Shenzhen Shuying Technology Co ltd filed Critical Shenzhen Shuying Technology Co ltd
Priority claimed from CN202110987483.9A
Publication of CN113729500A
Legal status: Withdrawn

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J: KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J36/00: Parts, details or accessories of cooking-vessels
    • A47J36/32: Time-controlled igniting mechanisms or alarm devices
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L17/00: Speaker identification or verification techniques
    • G10L17/22: Interactive procedures; Man-machine interfaces


Abstract

The application discloses an image-recognition-based control method for a cooking device, the cooking device, a control device, and a storage medium. The control method comprises the following steps: acquiring an image of the interior of the cooking device through a camera; extracting, from the image, a food image of the food being cooked; processing the food image to obtain the surface color of the food; determining cooking information of the cooked food and controlling the cooking state of the cooked food according to a pre-stored food color template; and prompting the cooking information of the cooked food. In this way, the state of the food being cooked can be controlled automatically according to its surface color, and suitable cooking parameters can be configured accordingly, so the food turns out better and the user experience is improved. In addition, the cooking device can prompt the cooking information of the cooked food, keeping the user informed.

Description

Control method of cooking equipment, control device and storage medium
Technical Field
The application relates to the field of intelligent household appliances, in particular to a control method of a cooking device, the cooking device, a control device and a storage medium.
Background
The optimal heating durations of different kinds of food differ, and a user may find it difficult to judge the heating time: heating for too long spoils the taste of the food, heating for too short a time leaves the food undercooked, and repeated reheating both affects the taste of the food and wastes electricity.
Disclosure of Invention
The embodiments of the application provide an image-recognition-based control method for a cooking device, the cooking device, a control device, an electronic device, and a storage medium.
The control method of the cooking device provided by the embodiment of the application comprises the following steps:
acquiring an image of the interior of the cooking device through a camera;
extracting, from the image, a food image of the food being cooked;
processing the food image to obtain the surface color of the food;
determining cooking information of the cooked food and controlling the cooking state of the cooked food according to a pre-stored food color template;
and prompting the cooking information of the cooked food.
In this way, the state of the food being cooked can be controlled automatically according to its surface color, and suitable cooking parameters can be configured accordingly, so the food turns out better and the user experience is improved. In addition, the cooking device can also prompt the cooking information of the cooked food, keeping the user informed.
In some embodiments, prompting the cooking information of the cooked food comprises at least one of:
prompting the degree of doneness of the cooked food;
prompting the cooked time of the cooked food;
prompting the remaining cooking time of the cooked food.
In some embodiments, prompting the cooking information of the cooked food comprises:
controlling a display screen to display the cooking information of the cooked food; and/or
controlling the cooking device to emit a voice prompt of the cooking information.
In certain embodiments, the control method comprises:
acquiring an image of the pre-cooked food through the camera before the cooking device starts cooking;
identifying the type of the pre-cooked food according to the image of the pre-cooked food;
and controlling the cooking device, according to the user's identity information, to cook the pre-cooked food with a cooking method corresponding to the user.
In some embodiments, the user identity information is confirmed by:
collecting sound information;
judging whether the sound information contains human voice information;
when the sound information contains human voice information, processing the sound information to obtain voiceprint information;
and confirming the identity information of the user according to the voiceprint information.
In some embodiments, controlling the cooking device, according to the user's identity information, to cook with a cooking method corresponding to the user includes:
judging whether the cooking device prestores the dietary preferences of the user;
if so, controlling the cooking device to cook according to the dietary preferences of the user;
and if not, controlling the cooking device to cook according to the age information of the user.
In some embodiments, the control method of the cooking apparatus further includes:
establishing wireless connection between a mobile terminal and the cooking equipment, and enabling the mobile terminal to pop up a control interface;
acquiring control data information on a control interface of the mobile terminal;
and controlling the cooking equipment to cook according to the control data information.
The control device provided by the embodiment of the application comprises:
the acquisition module is used for acquiring images in the cooking equipment through a camera;
an extraction module for extracting a food image of the food being cooked in the image;
the processing module is used for processing the food image to obtain the surface color of the food;
the determining module is used for determining cooking information of the cooked food and controlling the cooking state of the cooked food according to a pre-stored food color template;
and the prompting module is used for prompting the cooking information of the cooked food.
The cooking device provided by the embodiments of the application comprises a body, a memory, a controller, and a computer program stored in the memory and executable on the controller. The memory and the controller are arranged on the body, and the controller implements the control method of the cooking device described above when executing the computer program.
The embodiments of the application further provide a non-transitory computer-readable storage medium storing a computer-executable program which, when executed by one or more processors, implements the control method of the cooking device described above.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart illustrating a control method of a cooking apparatus according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a cooking apparatus and a mobile terminal according to an embodiment of the present application;
FIG. 3 is a block schematic diagram of a control device according to an embodiment of the present application;
fig. 4 is a further flowchart illustrating a control method of a cooking apparatus according to an embodiment of the present application;
fig. 5 is another flowchart illustrating a control method of a cooking apparatus according to an embodiment of the present application;
fig. 6 is a further flowchart of a control method of a cooking apparatus according to an embodiment of the present application;
fig. 7 is a further flowchart of a control method of a cooking apparatus according to an embodiment of the present application;
fig. 8 is a further flowchart of a control method of a cooking apparatus according to an embodiment of the present application;
fig. 9 is still another flowchart of a control method of a cooking apparatus according to an embodiment of the present application.
Description of the main element symbols:
the cooking device comprises a cooking device 100, a body 11, a memory 12, a controller 13, a camera 14, a display screen 15, a recording device 16, a control device 20, an acquisition module 21, an extraction module 22, a processing module 23, a determination module 24 and a prompt module 25.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
The following disclosure provides many different embodiments or examples for implementing different features of the application. In order to simplify the disclosure of the present application, specific example components and arrangements are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to fig. 1, an embodiment of the present application provides a method for controlling a cooking apparatus 100 based on image recognition. The control method comprises the following steps:
S10: acquiring an image of the interior of the cooking apparatus 100 through the camera 14;
S20: extracting a food image of the food being cooked in the image;
S30: processing the food image to obtain the surface color of the cooked food;
S40: determining cooking information of the cooked food and controlling the cooking state of the cooked food according to a pre-stored food color template;
S50: and prompting the cooking information of the cooked food.
In this way, the state of the food being cooked can be controlled automatically according to its surface color, and suitable cooking parameters can be configured accordingly, so the food turns out better and the user experience is improved. In addition, the cooking apparatus 100 can also prompt the cooking information of the cooked food for the user's reference.
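As a minimal illustration (not part of the patent), the S10 to S50 flow can be sketched in Python; the color template, the capture stub, and the doneness labels below are all illustrative assumptions:

```python
# Hypothetical doneness labels keyed by dominant surface color, standing in
# for the pre-stored food color template of step S40.
COLOR_TEMPLATE = {"red": "raw", "pink": "medium", "brown": "well done"}

def capture_image():
    # S10: placeholder for the in-cavity capture; a real device would
    # return a frame from camera 14.
    return {"food_region": {"surface_color": "pink"}}

def extract_food_image(image):
    # S20: isolate the part of the image showing the food being cooked.
    return image["food_region"]

def surface_color(food_image):
    # S30: derive the dominant surface color of the food.
    return food_image["surface_color"]

def determine_cooking_info(color):
    # S40: look the color up in the stored template to obtain the doneness.
    return COLOR_TEMPLATE.get(color, "unknown")

def control_step():
    # One pass through S10 to S50; returns the prompt text of step S50.
    color = surface_color(extract_food_image(capture_image()))
    return f"The food is currently {determine_cooking_info(color)}."
```

A real implementation would run this loop repeatedly and also adjust the heating power in step S40; only the prompt path is shown here.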
Referring to fig. 2, an embodiment of the present application provides a cooking apparatus 100. The cooking apparatus 100 includes a body 11, a memory 12, a controller 13, and a computer program stored in the memory 12 and executable on the controller 13, the memory 12 and the controller 13 being arranged on the body 11. The controller 13 is configured to acquire images of the interior of the cooking apparatus 100 through the camera 14; to extract the food image of the food being cooked from the image; to process the food image to obtain the surface color of the cooked food; to determine cooking information of the cooked food and control the cooking state of the cooked food according to a pre-stored food color template; and to prompt the cooking information of the cooked food.
Referring to fig. 3, the present embodiment provides a control device 20, where the control device 20 includes an acquisition module 21, an extraction module 22, a processing module 23, a determination module 24, and a prompt module 25. The collecting module 21 is used for collecting images in the cooking device 100 through the camera 14; the extracting module 22 is used for extracting the food image of the food being cooked in the image; the processing module 23 is used for processing the food image to obtain the surface color of the cooked food; the determining module 24 is used for determining cooking information of cooking food and controlling a cooking state of the cooking food according to a pre-stored food color template; the prompt module 25 is used for prompting cooking information of the cooked food.
Specifically, the outer case of the body 11 of the cooking apparatus 100 may be made of a metal material such as stainless steel. The body 11 is provided with a cavity therein for holding food, wherein the cavity can be made of insulating material. The cooking apparatus 100 may perform a cooking process on food using heat energy. The body 11 is further provided with a camera 14 inside, and the camera 14 can be used for shooting food inside the cooking apparatus 100. The body 11 of the cooking apparatus 100 may further have a display screen 15 mounted thereon.
In step S10, after the food is placed inside, the cooking apparatus 100 starts cooking, and the camera 14 can photograph the interior of the cavity at the same time. The camera 14 captures images of the interior of the cooking apparatus 100, which may include the food being cooked, the container carrying the food, and the cavity itself. The cooking apparatus 100 may extract a food image of the food being cooked from these images.
The cooking apparatus 100 may further include an intelligent module with a networking function. The intelligent module can compare the image of the food being cooked with food images in a tool library; the tool library already contains an operable prediction model generated from files produced by machine-learning training, so the type of the food being cooked can be identified and determined from its image.
The food image is then processed to extract the surface color of the food being cooked, and the intelligent module can compare the kind and surface color of the food being cooked against the food kinds and color templates in the tool library, so as to judge the current degree of doneness of the food being cooked.
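As a hedged sketch of this comparison (the color sequences and doneness labels are invented for illustration and are not taken from the patent), the kind and surface color can be matched against per-kind templates:

```python
# Illustrative per-kind color templates: surface colors observed at
# increasing degrees of doneness (all entries are assumptions).
FOOD_COLOR_TEMPLATES = {
    "steak": ["red", "pink", "light brown", "brown"],
    "toast": ["white", "golden", "dark brown"],
}

DONENESS_LABELS = {
    "steak": ["raw", "medium rare", "medium well", "well done"],
    "toast": ["untoasted", "done", "burnt"],
}

def judge_doneness(kind, color):
    """Map an observed surface color to a doneness label for the given kind."""
    try:
        index = FOOD_COLOR_TEMPLATES[kind].index(color)
    except (KeyError, ValueError):
        return "unknown"  # kind not in the template, or color not listed
    return DONENESS_LABELS[kind][index]
```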
In one example, the cooking apparatus 100 cooks a steak. The camera 14 may capture an image of the steak every 2 minutes; the cooking apparatus 100 recognizes that the food currently being cooked is a steak and extracts its surface color. When the extracted surface of the steak is red, it can be determined that the steak is still raw, and the steak is heated over medium heat for 2 minutes. The camera 14 then continues to capture an image of the steak every 2 minutes, and each time the cooking duration of the cooking apparatus 100 is controlled according to the steak's color. Meanwhile, the cooking apparatus 100 can also announce, for example, "the steak has been cooking for 6 minutes; another 4 minutes are needed to reach medium".
It should be noted that the food image may be captured once every 5 seconds or once every 10 seconds; many other intervals are possible, such as 3 seconds, 20 seconds, or 1 minute, and the interval can be adjusted according to the user's habits, the type of food, and so on, without specific limitation here. The interval should not be too long, however, to avoid the food overshooting its target doneness: without an up-to-date image of the food, the cooking apparatus 100 would keep cooking it, leaving it overdone and affecting its taste.
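A small sketch of how the capture interval might be chosen per food kind; the interval values simply echo the examples in the text and are not prescribed by the patent:

```python
# Capture intervals in seconds per food kind; values are illustrative.
SAMPLING_INTERVAL_S = {"steak": 120, "toast": 5}

def next_capture_delay(kind, default=10):
    """Seconds to wait before the next image capture for this food kind."""
    return SAMPLING_INTERVAL_S.get(kind, default)
```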
The above examples and specific numerical values are provided to facilitate the description of the practice of the present application and should not be construed as limiting the scope of the present application.
The cooking apparatus 100 may automatically control the cooking state of the cooked food according to its kind and degree of doneness, for example by automatically adjusting the cooking power and the cooking duration. In this way, the cooking apparatus 100 sets appropriate cooking parameters in real time based on the food's kind and current doneness, and cooks accordingly. Suitable heating parameters can thus be configured automatically for different food types: an appropriate cooking power ensures that the food is cooked through while retaining a good taste, and an appropriate cooking time avoids the food being cooked for too long or too short a time, which would affect its taste. This greatly improves the intelligence of the cooking apparatus 100 and the user experience.
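The parameter configuration described above can be sketched as a lookup keyed by (kind, doneness); the power levels and durations below are placeholders, not values from the patent:

```python
# Illustrative cooking parameters per (food kind, current doneness):
# (power level, additional minutes). All numbers are assumptions.
COOKING_PARAMS = {
    ("steak", "raw"): ("medium", 2),
    ("steak", "medium rare"): ("medium", 2),
    ("steak", "medium"): ("low", 1),
    ("steak", "medium well"): ("off", 0),
}

def configure(kind, doneness):
    # Fall back to a gentle default when the pair is not in the table.
    power, minutes = COOKING_PARAMS.get((kind, doneness), ("low", 1))
    return {"power": power, "minutes": minutes}
```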
Meanwhile, the cooking apparatus 100 may also prompt the cooking information of the cooked food; for example, it may announce by voice that "the beef currently being thawed needs 2 more minutes to finish". The user can then reasonably plan the next cooking operation according to the prompted information.
In some embodiments, the control of the cooking state of the food by the cooking apparatus 100 may be operated by the user. In one embodiment, the user may control the cooking apparatus 100 through a mobile terminal wirelessly connected with it; the mobile terminal includes, but is not limited to, a smartphone, a smart wearable device, a notebook, a tablet computer, and the like. In another embodiment, the user can control the cooking apparatus 100 through the display screen 15; for example, the display screen 15 is touch-sensitive, and the user can touch the words or patterns on it with a finger to operate the cooking apparatus 100. In a further embodiment, the user can also control the cooking apparatus 100 by voice input.
Referring to fig. 4, prompting the cooking information of the cooked food includes at least one of the following:
S501: prompting the degree of doneness of the cooked food;
S502: prompting the cooked time of the cooked food;
S503: prompting the remaining cooking time of the cooked food.
In this way, the user learns the degree of doneness, the cooked time, or the remaining cooking time of the cooked food from the prompt given by the cooking apparatus 100.
In some embodiments, the prompting module 25 is used for prompting the degree of doneness of the cooked food; it is also used for prompting the cooked time of the cooked food; and it can also prompt the remaining cooking time of the cooked food.
In some embodiments, the controller 13 is used for prompting the degree of doneness of the cooked food; the controller 13 is also used for prompting the cooked time of the cooked food; and the controller 13 can also prompt the remaining cooking time of the cooked food.
For example, the cooking apparatus 100 may prompt only the degree of doneness of the cooked food, only its cooked time, or only its remaining cooking time. It may also prompt any pair of these, such as the doneness and the cooked time, the doneness and the remaining cooking time, or the cooked time and the remaining cooking time; or it may prompt all three: the doneness, the cooked time, and the remaining cooking time.
In one example, after the user puts a steak into the cooking apparatus 100 and it has been heated for a period of time, the cooking apparatus 100 may announce by voice that "the current steak is medium rare, and another 14 minutes of heating are needed for it to reach medium".
Referring to fig. 5, prompting the cooking information of the cooked food includes:
S504: controlling the display screen 15 to display the cooking information of the cooked food; and/or controlling the cooking apparatus 100 to emit a voice prompt of the cooking information.
In this way, the user can learn the cooking information of the cooked food by viewing the display screen 15 or by listening to the voice prompt emitted by the cooking apparatus 100.
In some embodiments, the prompting module 25 is configured to control the display screen 15 to display the cooking information of the cooked food, and/or to control the cooking apparatus 100 to emit a voice prompt of the cooking information.
In some embodiments, the controller 13 is configured to control the display screen 15 to display the cooking information of the cooked food, and/or to control the cooking apparatus 100 to emit a voice prompt of the cooking information.
Specifically, the cooking information may be presented as text and/or as voice. The device may control only the display screen 15 to display the cooking information of the cooked food; control only the cooking apparatus 100 to emit the voice prompt of the cooking information; or control the display screen 15 to display the cooking information while the cooking apparatus 100 emits the voice prompt.
In one example, the display screen 15 is controlled to display the text "the beef currently being thawed needs 2 more minutes to finish", and the cooking apparatus 100 may simultaneously sound the same voice prompt. The voice prompt lets the user know the cooking information of the food currently being cooked even more clearly.
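The display-and/or-voice behavior can be sketched as follows; the action tuples are an illustrative stand-in for driving the display screen 15 and a speaker, not an API defined by the patent:

```python
def prompt(info, use_display=True, use_voice=False):
    """Return the prompt actions to perform for a piece of cooking info:
    display it, speak it, or both, per the and/or combinations above."""
    actions = []
    if use_display:
        actions.append(("display", info))
    if use_voice:
        actions.append(("speak", info))
    return actions
```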
Referring to fig. 6, in some embodiments, the control method further includes:
S60: acquiring an image of the pre-cooked food through the camera 14 before the cooking apparatus 100 starts cooking;
S70: identifying the type of the pre-cooked food according to the image of the pre-cooked food;
S80: controlling the cooking apparatus 100, according to the user's identity information, to cook the pre-cooked food with a cooking method corresponding to the user.
In this way, a corresponding cooking method can be selected for the pre-cooked food according to the type of food and the needs of different users. Using different cooking methods for different food types makes cooking more flexible without requiring manual setting by the user, and setting different cooking methods for different users makes the food produced by the cooking apparatus 100 better suit each user's taste, improving the user experience.
In certain embodiments, the control device 20 may also include an identification module and a control module. The acquisition module 21 is used for acquiring an image of the pre-cooked food through the camera 14 before the cooking device 100 cooks; the identification module is used for identifying the type of the pre-cooked food according to the image of the pre-cooked food; the control module is used for controlling the cooking apparatus 100 to cook the pre-cooked food according to the user identity information in a cooking method corresponding to the user.
In some embodiments, the controller 13 is configured to acquire an image of the pre-cooked food through the camera 14 before the cooking apparatus 100 starts cooking; to identify the type of the pre-cooked food from that image; and to control the cooking apparatus 100, according to the user's identity information, to cook the pre-cooked food with a cooking method corresponding to the user.
Specifically, before cooking starts, the user may place the pre-cooked food under the camera 14, and the camera 14 captures an image of it. The intelligent module of the cooking apparatus 100 has a networking function and can compare the image of the pre-cooked food with images in the tool library; the tool library already contains an operable prediction model generated from files produced by machine-learning training, so the type of the pre-cooked food can be identified and confirmed from its image.
The types of pre-cooked food may include finished foods, such as various convenience foods, and semi-finished ingredients; the semi-finished ingredients may be meat, root vegetables, leafy vegetables, bean products, and the like, and each category can be further subdivided without limitation here. After the cooking apparatus 100 determines the kind of the pre-cooked food, it may cook the pre-cooked food with the cooking method corresponding to that kind.
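As an illustrative sketch (the feature strings and method names are invented; a real device would run the prediction model from the tool library), type identification followed by method selection might look like:

```python
# Toy stand-ins for the prediction model and the per-kind cooking methods.
FEATURE_TO_KIND = {"marbled red": "steak", "leafy green": "spinach"}
KIND_TO_METHOD = {"steak": "sear", "spinach": "steam"}

def plan_cooking(image_feature):
    """Identify the pre-cooked food kind, then pick its cooking method."""
    kind = FEATURE_TO_KIND.get(image_feature, "unknown")
    return kind, KIND_TO_METHOD.get(kind, "manual")
```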
In some embodiments, the cooking apparatus 100 may confirm the user's identity information, such as age and sex, by collecting the user's voice or image. In some embodiments, the user may also enter and store identity information such as taste preferences, eating habits, and health conditions in the mobile terminal; after the mobile terminal is wirelessly connected with the cooking apparatus 100, the cooking apparatus 100 can obtain the user's identity information from it.
In step S80, after the user's identity information is confirmed, the cooking apparatus 100 is controlled to cook with the cooking method corresponding to that user. It will be appreciated that different users have different eating habits: some prefer softer food, some prefer crisper food, and some cannot eat spicy food. Once the user's identity is confirmed, the cooking apparatus 100 cooks with a method matched to that user, so the cooked food better meets the user's needs and the cooking apparatus 100 is more intelligent.
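The preference-first, age-fallback selection described above can be sketched as below; the age threshold and the method names are illustrative assumptions, not values from the patent:

```python
def select_method(user, preferences, age_defaults):
    """Cook by the user's stored dietary preference when available;
    otherwise fall back to a default chosen from the user's age."""
    if user["id"] in preferences:
        return preferences[user["id"]]
    group = "senior" if user["age"] >= 65 else "adult"  # assumed threshold
    return age_defaults[group]
```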
In some embodiments, one or more recommended recipes may be generated from the user's identity information for the user to choose from; after the user selects a recipe, the cooking apparatus 100 can be controlled by voice to cook according to the cooking method of that recipe. In this way, the user can pick a recipe they like, and food cooked by that recipe's method better meets their eating preferences.
Referring to fig. 7, in some embodiments, the user identity information is confirmed by:
S71: collecting sound information;
S72: judging whether the sound information contains human voice information;
S73: when the sound information contains human voice information, processing the sound information to obtain voiceprint information;
S74: and confirming the identity information of the user according to the voiceprint information.
Thus, the cooking apparatus 100 collects sound information and, upon confirming that it contains human voice, processes it to obtain voiceprint information. Since a given user's voiceprint is relatively stable, the identities of different users can be determined from their voiceprints. The cooking apparatus 100 can then cook with different methods according to the preferences and habits of different users, so the cooked food meets each user's needs and the user experience is improved.
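A minimal sketch of the S71 to S74 flow, with segment dicts standing in for recorded audio and tuples standing in for voiceprints (both are assumptions; a real device would use voice-activity detection and a speaker model):

```python
def contains_voice(sound):
    # S72: stand-in voice-activity check over labeled audio segments.
    return any(seg["kind"] == "speech" for seg in sound)

def extract_voiceprint(sound):
    # S73: reduce the speech segments to a feature tuple ("voiceprint").
    return tuple(seg["feature"] for seg in sound if seg["kind"] == "speech")

def identify_user(sound, enrolled):
    # S74: match against enrolled voiceprints; None when no speech or no match.
    if not contains_voice(sound):
        return None
    return enrolled.get(extract_voiceprint(sound))
```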
In certain embodiments, the control device 20 may further include a judging module. The acquisition module 21 is used for collecting sound information; the judging module is used for judging whether the sound information contains human voice information; the processing module 23 is configured to process the sound information to obtain voiceprint information when human voice is present; and the determining module 24 is configured to confirm the identity information of the user according to the voiceprint information.
In some embodiments, the controller 13 is used to collect sound information; to judge whether the sound information contains human voice information; to process the sound information to obtain voiceprint information when human voice is present; and to confirm the identity information of the user from the voiceprint information.
Specifically, in step S71, the sound information that the controller 13 controls the cooking apparatus 100 to collect may include the sound of the cooking apparatus 100 itself in operation, various environmental sounds, animal sounds, and the voices of different users. For example, the environmental sounds may be rain, running water, or a refrigerator door opening and closing; the animal sounds may be the calls of cats, dogs, or birds; and the users' voices may be voice commands issued to the cooking apparatus 100 or words spoken by users in other situations.
In this case, the cooking apparatus 100 may be provided with a recording device 16 connected to the controller 13, for example a microphone, for collecting sound information. The collection may be implemented by software written into the controller 13, so that the controller 13 controls collection of sound information only when the cooking apparatus 100 is powered on, saving the energy consumption of the relevant modules of the cooking apparatus 100. Alternatively, after the cooking apparatus 100 is powered on, it may be controlled to continuously collect surrounding sound information to improve collection accuracy. The collection radius may be 1 m to 3 m, but is not limited to this range and depends on the performance of the recording device 16 built into the cooking apparatus 100.
In step S72, the cooking apparatus 100 further includes a networkable smart module, which is likewise connected to the controller 13. The smart module can communicate with the recording device 16 via WIFI or Bluetooth, so that it can acquire the sound information collected by the recording device 16 in step S71 and then perform noise reduction on it, making it easier to identify and judge whether the sound information contains human voice information.
As for the recognition algorithm, after the smart module is networked it may send the acquired sound information to the cloud, which then recognizes the sound information to judge whether a human voice is present. Recognition may be performed by importing the sound information into a tool library that already contains files generated through machine-learning training, producing an operable prediction model capable of distinguishing the human voice spectrum from animal sounds, noise, and other non-human sounds, so as to judge whether the sound information contains human voice information.
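As an illustrative sketch only (the patent does not disclose the actual prediction model), the judgment of whether sound information contains a human voice can be approximated with a short-time energy and zero-crossing-rate check; the function name and threshold values here are assumptions, not part of the disclosure:

```python
import numpy as np

def contains_human_voice(samples, rate=16000, frame_ms=25,
                         energy_thresh=0.01, zcr_lo=0.02, zcr_hi=0.35):
    """Flag frames whose short-time energy and zero-crossing rate fall in a
    range typical of speech; report a voice if enough frames qualify."""
    frame_len = int(rate * frame_ms / 1000)
    n_frames = len(samples) // frame_len
    voiced = 0
    for i in range(n_frames):
        frame = samples[i * frame_len:(i + 1) * frame_len]
        energy = np.mean(frame ** 2)                         # short-time energy
        zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2   # crossings per sample
        if energy > energy_thresh and zcr_lo < zcr < zcr_hi:
            voiced += 1
    return n_frames > 0 and voiced / n_frames > 0.2
```

A trained classifier, as the description suggests, would replace this heuristic with a model that also rejects animal calls and other voiced non-speech sounds.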
In step S73, it should be explained that a voiceprint is the sound spectrum, carrying verbal information, displayed by an electro-acoustic instrument. Modern research shows that a voiceprint is not only specific to an individual but also relatively stable. Human vocal organs differ in size, form, and function: the sound-producing organs include the vocal cords, soft palate, tongue, teeth, and lips, while the resonating cavities include the pharyngeal, oral, and nasal cavities. Even small differences in these organs change the air flow during speech, producing differences in sound quality and timbre.
In addition, speaking habits differ in pace, and differences in vocal force produce differences in sound intensity and duration. These differences can be subdivided into dozens of features representing the wavelength, frequency, intensity, and rhythm of different voices. Therefore, the voiceprint of every speaker is different, and the identity of a user can be judged from the voiceprint.
The processing of the voice information may be voiceprint recognition processing, in which the sound signal is converted into an electrical signal and a corresponding voiceprint image, i.e., the voiceprint information, is generated from the voice information.
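A minimal sketch of generating such a voiceprint image from a sampled signal is a short-time Fourier transform; the window and hop sizes below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def voiceprint_spectrogram(samples, rate=16000, win=256, hop=128):
    """Short-time Fourier magnitudes: each column is the spectrum of one
    windowed frame, together forming the time-frequency voiceprint image."""
    window = np.hanning(win)
    frames = []
    for start in range(0, len(samples) - win + 1, hop):
        frame = samples[start:start + win] * window
        frames.append(np.abs(np.fft.rfft(frame)))  # one spectral column
    return np.array(frames).T  # shape: (win // 2 + 1, n_frames)
```

In practice the image (or features derived from it) would then be compared against stored voiceprints rather than inspected directly.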
It should be noted that the cloud may store a dedicated voiceprint left when the user first starts using the cooking apparatus 100 and personalizes its settings. For example, the cooking apparatus 100 may prompt the user to save a voiceprint; specifically, the user may read a string of random numbers and some common sentences several times, so that a voiceprint specific to each user is obtained and uploaded to the cloud.
In step S74, because the voiceprint is specific and stable as described above, it can in theory identify an individual just as a fingerprint does. Once the voiceprint information is obtained, it is compared with the voiceprint information stored in the cloud, so as to confirm the identity information of the user.
In particular, the user may input and store his or her own identity information in the mobile terminal; the identity information may include, but is not limited to, the user's age, sex, taste preferences, eating habits, and physical health.
The mobile terminal can communicate with the cooking apparatus 100 via WIFI, Bluetooth, NFC, or the like, so that the cooking apparatus 100 can acquire the user's identity information, store it locally or upload it to the cloud for storage, and bind it to the voiceprint information. In this way, the identity information of the user can be confirmed from the voiceprint information.
It will be appreciated that in some embodiments, the owner of the cooking apparatus 100 may pre-record his or her own voice and/or the voices of others the owner authorizes to use the apparatus; these recordings are processed into one or more pieces of voiceprint information that serve as the preset voiceprint information. After sound is collected and processed to obtain the current voiceprint information, it can be compared with the preset voiceprint information: when the current voiceprint information matches the preset voiceprint information, the user is judged to be an authorized user and may control and use the cooking apparatus 100; when it does not match, the user is judged to be an unauthorized user, the cooking apparatus 100 is placed in a locked state, and the user cannot use the cooking apparatus 100.
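The matching of current voiceprint information against preset voiceprint information could be sketched as follows, assuming voiceprints are reduced to fixed-length feature vectors compared by cosine similarity; the function names and threshold are hypothetical, not taken from the disclosure:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two voiceprint feature vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def authorize(current_vp, preset_vps, threshold=0.85):
    """Compare the current voiceprint against each preset one; unlock the
    appliance only when the best match clears the threshold."""
    best = max((cosine_similarity(current_vp, vp) for vp in preset_vps),
               default=-1.0)
    return "unlocked" if best >= threshold else "locked"
```

A real system would use learned speaker embeddings rather than raw vectors, but the accept/reject decision against a threshold follows the same pattern.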
In some embodiments, after the identity information of the user is confirmed, the cooking apparatus 100 is controlled to cook with a cooking method corresponding to that user. It will be appreciated that different users have different eating habits: some are used to softer food, some to crispier food, and some cannot eat spicy food, for example. Controlling the cooking apparatus 100 to cook with a method matched to the user after confirming the user's identity makes the cooked food better meet the user's requirements and makes the cooking apparatus 100 more intelligent.
In one example, after the user says "start heating" to the cooking apparatus 100, the recording device 16 collects the sound information; after the user's voiceprint information is extracted and compared against all saved voiceprint information, the user is determined to be a young child, and the cooking apparatus 100 is then controlled to heat the food with an extended heating time so that the food is soft enough for the child to eat.
Referring to fig. 8, in some embodiments, controlling the cooking apparatus 100 to cook according to the cooking method corresponding to the user according to the identity information of the user includes:
s81: judging whether the cooking apparatus 100 prestores the dietary preference of the user;
s82: if yes, controlling the cooking equipment 100 to cook according to the diet preference of the user;
s83: if not, the cooking device 100 is controlled to cook according to the age information of the user.
In this way, when the cooking apparatus 100 has the user's dietary preference pre-stored, cooking according to that preference makes the cooked food better match the user's taste, improving user experience; when no dietary preference is pre-stored in the cooking apparatus 100, cooking according to the user's age information still brings the cooked food closer to the user's likely taste, improving user experience to a certain extent.
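The fallback logic of steps S81 to S83 can be sketched as follows; the preference names, times, and powers are illustrative assumptions drawn loosely from the examples in this description, not values fixed by the disclosure:

```python
def choose_cooking_params(profile):
    """S81-S83 sketch: prefer a stored dietary preference; otherwise fall
    back to age-based defaults."""
    presets = {
        "soft":      {"time_min": 30, "power_w": 600},   # longer, gentler heating
        "crispy":    {"time_min": 12, "power_w": 1200},  # shorter, hotter
        "well_done": {"time_min": 25, "power_w": 900},
    }
    pref = profile.get("diet_preference")
    if pref in presets:              # S82: pre-stored preference wins
        return presets[pref]
    age = profile.get("age", 30)     # S83: fall back to age information
    if age < 10 or age > 50:
        return presets["soft"]       # extended heating, softer food
    return presets["crispy"]         # e.g. 20-30-year-olds, crispier food
```

The profile dictionary stands in for the identity information bound to the voiceprint in the embodiments above.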
In some embodiments, the control module is configured to judge whether the cooking apparatus 100 prestores the user's dietary preference, to control the cooking apparatus 100 to cook according to the user's dietary preference if so, and to control the cooking apparatus 100 to cook according to the user's age information if not.
In some embodiments, the controller 13 is configured to judge whether the cooking apparatus 100 prestores the user's dietary preference, to control the cooking apparatus 100 to cook according to the user's dietary preference if so, and to control the cooking apparatus 100 to cook according to the user's age information if not.
Specifically, the dietary preference may be "likes light and sweet food", "likes spicy food", "likes salty food", "likes softer food", and the like. The user may add the dietary preference to the identity information when filling in the identity information on the mobile terminal, so that the dietary preference can be obtained and stored along with the identity information through communication between the mobile terminal and the cooking apparatus 100.
Of course, the user may also input the dietary preference directly on the panel of the cooking apparatus 100, and the cooking apparatus 100 binds it to the user's identity information and voiceprint information. It can be understood that entering it on the mobile terminal better suits the user's operating habits and allows it to be changed anytime and anywhere; it need only be synchronized with the cooking apparatus 100 when communication is connected.
Similarly, the age information may be filled in, as a required field, when the user fills in the identity information on the mobile terminal, so that the user's specific age can be obtained and stored through communication between the mobile terminal and the cooking apparatus 100.
Thus, in one embodiment, after the user says "start heating" to the cooking apparatus 100, the recording device 16 collects the sound information; after extracting the user's voiceprint information, the apparatus determines which specific user is speaking by comparison with all stored voiceprint information, finds the bound identity information from the voiceprint information, and then checks whether the identity information carries a dietary preference.
If a dietary preference is found, for example "likes food well done", the cooking apparatus 100 is controlled to heat the food with an extended heating time so that the food is well done, matching the user's dietary habits.
If no dietary preference is found, the food is cooked predictively according to the age information in the identity information and big-data statistics on the dietary preferences of different age groups. For example, when the age information indicates under 10 or over 50 years old, the cooking apparatus 100 is controlled to heat the food with an extended heating time to soften it, matching such users' eating habits; when the age information indicates 20 to 30 years old and a frying process is performed, the cooking apparatus 100 is controlled to cook the food crispier, and so on.
Referring to fig. 9, in some embodiments, the method for controlling the cooking apparatus 100 further includes:
s90: establishing wireless connection between the mobile terminal and the cooking equipment 100, and enabling the mobile terminal to pop up a control interface;
s100: acquiring control data information on a control interface of the mobile terminal;
s110: and controlling the cooking apparatus 100 to cook according to the control data information.
In this way, after the wireless connection between the mobile terminal and the cooking apparatus 100 is established, the user can send data information carrying the relevant instructions to the cooking apparatus 100 through the mobile terminal; the cooking apparatus 100 performs the corresponding cooking processing after receiving the data information, so the user can control the operation of the cooking apparatus 100 through the mobile terminal.
In some embodiments, the control device 20 further includes a connection module, which is configured to establish a wireless connection between the mobile terminal and the cooking apparatus 100 and enable the mobile terminal to pop up the control interface; the acquisition module 21 is used for acquiring control data information on a control interface of the mobile terminal; the control module is used for controlling the cooking device 100 to cook according to the control data information.
In some embodiments, the controller 13 is configured to establish a wireless connection between the mobile terminal and the cooking apparatus 100, to cause the mobile terminal to pop up the control interface, to acquire the control data information on the control interface of the mobile terminal, and to control the cooking apparatus 100 to cook according to the control data information.
The cooking apparatus 100 connects to the mobile terminal after receiving the mobile terminal's connection information, which may be Bluetooth configuration information, NFC configuration information, or the like. The cooking apparatus 100 can then communicate with the mobile terminal and cause it to pop up a control interface, on which the user can operate, for example by filling in control data information and sending it to the cooking apparatus 100. After the cooking apparatus 100 obtains the control data information, the controller 13 controls the cooking apparatus 100 to cook the food according to it; the control data information may be recipe information.
In this way, the user can control the operating state of the cooking apparatus 100 through the control interface of the mobile terminal. For example, the user may switch the cooking apparatus 100 on and off, or schedule it to turn on or off at set times. Control buttons can therefore be omitted from the housing of the cooking apparatus 100, giving it a cleaner appearance and improving user experience. In some embodiments, the user may operate the control interface to control the cooking time and cooking power that the cooking apparatus 100 applies to the food.
In one embodiment, after the cooking apparatus 100 is connected to the mobile terminal, the user puts the food to be cooked into the cooking apparatus 100. The user can set different recipe information according to the food type and eating requirements, where the recipe information may include the cooking time and cooking power for the food; the mobile terminal converts the recipe information into corresponding control data information and sends it to the cooking apparatus 100, so the user can complete the cooking of the food by controlling the cooking apparatus 100 with the mobile terminal.
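The conversion of recipe information into control data information could, for example, be a simple serialization such as the following; the JSON message format and field names are assumptions for illustration only, as the disclosure does not specify a wire format:

```python
import json

def recipe_to_control_data(recipe):
    """Mobile-terminal side: serialize the recipe fields mentioned in the
    description (cooking time and power) into a control-data message."""
    msg = {
        "cmd": "cook",
        "time_min": int(recipe["time_min"]),
        "power_w": int(recipe["power_w"]),
    }
    return json.dumps(msg)

def apply_control_data(payload):
    """Appliance side: decode the message and return the settings to apply."""
    msg = json.loads(payload)
    assert msg["cmd"] == "cook"
    return msg["time_min"], msg["power_w"]
```

The payload would travel over whatever transport the embodiment uses (WIFI, Bluetooth, or NFC), with the controller 13 applying the decoded settings.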
In certain embodiments, the present application provides a non-transitory computer-readable storage medium containing a computer-executable program which, when executed by one or more processors, implements the control method of any one of the embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
In the description herein, references to the description of the terms "one embodiment," "certain embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: numerous changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.

Claims (10)

1. A control method of a cooking apparatus based on image recognition, the control method comprising:
acquiring an image in the cooking equipment through a camera;
extracting a food image of the food being cooked in the image;
processing the food image to derive a surface color of the cooked food;
comparing the surface color with a pre-stored food color template to determine cooking information of the cooked food and control a cooking state of the cooked food;
and prompting cooking information of the cooked food.
2. The control method of the cooking apparatus according to claim 1, wherein the prompting of the cooking information of the cooking food includes at least one of:
prompting the maturity degree of the cooked food;
prompting a cooked time of the cooked food;
prompting the remaining cooking time of the cooked food.
3. The control method of the cooking apparatus according to claim 1, wherein the prompting of the cooking information of the cooking food includes:
controlling a display screen to display cooking information of the cooking food; and/or
And controlling the cooking equipment to send out voice prompt information of the cooking information.
4. The control method of the cooking apparatus according to claim 1, further comprising:
acquiring an image of pre-cooked food through the camera before the cooking equipment cooks;
identifying the type of the pre-cooked food according to the image of the pre-cooked food;
and controlling the cooking equipment to cook the pre-cooked food according to the user identity information by using a cooking method corresponding to the user.
5. The control method of a cooking apparatus according to claim 4, wherein the user identity information is confirmed by:
collecting sound information;
judging whether the sound information contains human voice information;
processing the voice information to obtain voiceprint information under the condition that the sound information contains human voice information;
and confirming the identity information of the user according to the voiceprint information.
6. The method for controlling the cooking apparatus according to claim 4, wherein the controlling the cooking apparatus to cook according to the cooking method corresponding to the user according to the identity information of the user comprises:
judging whether the cooking equipment prestores the dietary preference of the user;
if yes, controlling the cooking equipment to cook according to the dietary preference of the user;
and if not, controlling the cooking equipment to cook according to the age information of the user.
7. The control method of a cooking apparatus according to claim 1, wherein the control method of a cooking apparatus comprises:
establishing wireless connection between a mobile terminal and the cooking equipment, and enabling the mobile terminal to pop up a control interface;
acquiring control data information on a control interface of the mobile terminal;
and controlling the cooking equipment to cook according to the control data information.
8. A cooking apparatus, characterized in that the cooking apparatus comprises: a body, a memory, a controller and a computer program stored on the memory and operable on the controller, the memory and the controller being provided on the body, the controller implementing a control method of a cooking apparatus according to any one of claims 1 to 7 when executing the computer program.
9. A control device, comprising:
the acquisition module is used for acquiring images in the cooking equipment through a camera;
an extraction module for extracting a food image of the food being cooked in the image;
a processing module for processing the food image to obtain a surface color of the cooked food;
the determining module is used for determining cooking information of the cooked food and controlling the cooking state of the cooked food according to a pre-stored food color template;
and the prompting module is used for prompting the cooking information of the cooking food.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements a control method of a cooking apparatus according to any one of claims 1 to 7.
CN202110987483.9A 2021-08-26 2021-08-26 Control method of cooking equipment, control device and storage medium Withdrawn CN113729500A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110987483.9A CN113729500A (en) 2021-08-26 2021-08-26 Control method of cooking equipment, control device and storage medium


Publications (1)

Publication Number Publication Date
CN113729500A true CN113729500A (en) 2021-12-03

Family

ID=78733034


Country Status (1)

Country Link
CN (1) CN113729500A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20211203