CN112754301A - Intelligent cooking control system and method - Google Patents

Intelligent cooking control system and method

Info

Publication number
CN112754301A
Authority
CN
China
Prior art keywords
cooking
user
information
feature information
food material
Prior art date
Legal status
Pending
Application number
CN202011617453.0A
Other languages
Chinese (zh)
Inventor
许琦
潘叶江
Current Assignee
Vatti Co Ltd
Original Assignee
Vatti Co Ltd
Priority date
Filing date
Publication date
Application filed by Vatti Co Ltd
Priority to CN202011617453.0A
Publication of CN112754301A

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J: KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J 27/00: Cooking-vessels
    • A47J 27/04: Cooking-vessels for cooking food in steam; devices for extracting fruit juice by means of steam; vacuum cooking vessels
    • A47J 2027/043: Cooking-vessels for cooking food in steam
    • A47J 36/00: Parts, details or accessories of cooking-vessels
    • A47J 36/32: Time-controlled igniting mechanisms or alarm devices
    • A47J 36/321: Time-controlled igniting mechanisms or alarm devices, the electronic control being performed over a network, e.g. by means of a handheld device
    • A47J 37/00: Baking; Roasting; Grilling; Frying
    • A47J 37/06: Roasters; Grills; Sandwich grills
    • A47J 37/0623: Small-size cooking ovens, i.e. defining an at least partially closed cooking cavity
    • A47J 37/0629: Small-size cooking ovens with electric heating elements
    • A47J 37/0664: Accessories

Landscapes

  • Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • General Preparation And Processing Of Foods (AREA)

Abstract

The invention belongs to the technical field of intelligent cooking equipment, and particularly relates to an intelligent cooking control system and method. The control system comprises cooking equipment, an acquisition module and a control module; the cooking equipment and the acquisition module are both connected with the control module. The acquisition module acquires food material information and user information; the control module determines the face feature information of the user from the user information, determines the actual health condition of the user by comparing the user's face feature information with standard face feature information, and determines a cooking mode according to that health condition; finally, it determines actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition. The cooking equipment then cooks the food material in the determined cooking mode with the actual cooking parameters until cooking is finished. With this control system and method, the user does not need to set the cooking mode or cooking parameters, which solves the problem that existing cooking equipment is not intelligent enough.

Description

Intelligent cooking control system and method
Technical Field
The invention belongs to the technical field of intelligent cooking equipment, and particularly relates to an intelligent cooking control system and method.
Background
Cooking equipment such as steam-and-bake combination machines and steam ovens produces little oil smoke and does not require continuous attention from the user during the cooking process; as typical environmentally friendly and healthy cooking appliances, such equipment has gradually entered ordinary households. However, the cooking equipment products currently on the market are not intelligent enough: when a user wants to eat a certain food material, the equipment cannot automatically provide a green and healthy cooking mode for the user, which is inconvenient and does not necessarily achieve the effect of a healthy diet.
Disclosure of Invention
The invention provides an intelligent cooking control system, aiming to solve the problem that existing cooking equipment is not intelligent enough.
Another object of the present invention is to provide a control method of the above intelligent cooking control system.
The invention is realized by adopting the following scheme:
the invention provides an intelligent cooking control system, which comprises cooking equipment, an acquisition module and a control module, wherein the acquisition module is used for acquiring cooking data;
the cooking equipment and the acquisition module are both connected with the control module, the acquisition module is used for acquiring food material information and user information, the control module determines the face feature information of a user according to the user information, determines the actual health condition of the user according to the face feature information of the user and standard face feature information, and determines a cooking mode according to the actual health condition of the user; finally, determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition; and the cooking equipment cooks food materials according to the cooking mode and the actual cooking parameters until the cooking is finished.
The intelligent cooking control system is further improved in that the acquisition module comprises a first camera and a second camera, the first camera is arranged in the inner container of the cooking equipment and used for shooting food materials to acquire food material photos, and the second camera is arranged on the cooking equipment and used for shooting users to acquire user photos.
The intelligent cooking control system is further improved in that the cooking equipment comprises a blanking assembly used for adding seasonings to food materials, and the blanking assembly is arranged in the inner container and electrically connected with the control module.
The intelligent cooking control system is further improved in that the blanking assembly comprises a plurality of measuring cups for containing seasonings, a mixing part with a mixing cavity and a stirring part;
the mixing part is arranged below the measuring cups, a first through hole communicated with the mixing cavity is formed in the bottom of each measuring cup, and a first valve used for opening or closing the first through hole is arranged at the first through hole; the bottom of the mixing part is provided with a second through hole for communicating the mixing cavity and the inner container, and a second valve for opening or closing the second through hole is arranged at the second through hole;
the stirring part is arranged in the mixing cavity and electrically connected with the control module, and the first valve and the second valve are also electrically connected with the control module.
The intelligent cooking control system is further improved in that the blanking assembly further comprises a spray head, and the spray head is communicated with the mixing cavity through the second through hole and is used for spraying the mixed seasoning to food materials.
The intelligent cooking control system is further improved in that the cooking equipment further comprises a guide assembly, the guide assembly is arranged above food materials, the blanking assembly is connected with the guide assembly in a sliding fit mode, and the guide assembly is further electrically connected with the control module.
A further improvement to the intelligent cooking control system of the present invention is that the guide assembly comprises a guide rail and a drive member; the guide rail is arranged above the food material and is connected with the blanking assembly in a sliding fit mode through the driving piece, and the driving piece is electrically connected with the control module.
The intelligent cooking control system is further improved in that the intelligent cooking control system further comprises a communication module, and the control module is in communication connection with the cooking equipment through the communication module.
The invention provides an intelligent cooking control method, which comprises the following steps:
acquiring food material information in the cooking equipment and user information;
determining face feature information of a user according to the user information;
determining the actual health condition of the user according to the face feature information of the user and the standard face feature information;
determining a cooking mode according to the actual health condition of the user;
determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition;
and the cooking equipment cooks in the cooking mode and the actual cooking parameters until the cooking is finished.
The intelligent cooking control method of the present invention is further improved in that the acquiring of the food material information and the user information in the cooking device includes:
the method comprises the steps of obtaining a food material photo as food material information by photographing a food material, and obtaining a user photo as user information by photographing a user.
In a further improvement of the intelligent cooking control method of the present invention, the determining the facial feature information of the user according to the user information includes:
acquiring characteristic information of each part of a user face based on a CMYK-H-CbCr skin color segmentation model and combined with an Adaboost algorithm; the characteristic information comprises CMYK characteristic values, HSV characteristic values and/or YCbCr characteristic values.
The intelligent cooking control method of the invention is further improved in that the acquiring of the characteristic information of each part of the face of the user comprises:
feature information of the eyebrows, eyes, ears, nose, mouth, hair volume and/or other facial regions of a user is acquired.
The intelligent cooking control method is further improved in that the standard face feature information comprises healthy face feature information and unhealthy face feature information;
the healthy face feature information comprises healthy face CMYK feature values, healthy face HSV feature values and/or healthy face YCbCr feature values;
the unhealthy face feature information comprises CMYK feature values of unhealthy faces, HSV feature values of the unhealthy faces and/or YCbCr feature values of the unhealthy faces.
The intelligent cooking control method is further improved in that if the face feature information of the user is unhealthy face feature information, the cooking mode is determined to be a steaming mode.
In a further improvement of the intelligent cooking control method of the present invention, the determining the actual cooking parameters according to the cooking mode, the food material information and the preset standard cooking parameters corresponding to the health condition includes:
and if the cooking mode is a steaming mode, determining steaming duration and steaming temperature according to the food material information and preset steaming standard cooking parameters corresponding to the health condition.
Compared with the prior art, the invention adopting the scheme has the beneficial effects that:
in the control system, a user places food materials in an inner container of cooking equipment, an acquisition module acquires food material information and user information, and feeds the food material information and the user information back to the control module; the control module determines face feature information of a user according to the user information; determining the actual health condition of the user according to the face feature information of the user and the standard face feature information, and determining a cooking mode according to the actual health condition of the user; finally, determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition; the cooking equipment cooks the food materials according to the cooking mode and the actual cooking parameters issued by the control module until the cooking is finished, and the whole cooking process does not need a user to set the cooking mode and the cooking parameters, so that the problem that the intelligent degree of the existing cooking equipment is not high enough is solved; in addition, because the cooking mode and the cooking parameters of the embodiment are determined according to the actual health condition and the food material information of the user, the intelligent cooking effect is achieved by combining the health of the user.
In the intelligent cooking control method, because the whole cooking process does not need a user to set a cooking mode and cooking parameters, the problem that the intelligent degree of the existing cooking equipment is not high enough is solved; in addition, because the cooking mode and the cooking parameters of the embodiment are determined according to the actual health condition and the food material information of the user, the intelligent cooking effect is achieved by combining the health of the user.
Drawings
Fig. 1 is a system diagram of an intelligent cooking control system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an intelligent cooking control system according to an embodiment of the present invention;
fig. 3 is a schematic top view of a measuring cup of an intelligent cooking control system according to an embodiment of the present invention;
fig. 4 is a schematic bottom view of a mixing part of an intelligent cooking control system according to an embodiment of the present invention;
fig. 5 is a left side view schematically illustrating a mixing part of an intelligent cooking control system according to an embodiment of the present invention;
fig. 6 is a flowchart of an intelligent cooking control method according to an embodiment of the present invention.
In the figure: 1. a cooking device; 2. an acquisition module; 3. a control module; 4. a communication module; 11. a blanking assembly; 12. a guide assembly; 21. a first camera; 22. a second camera; 111. a measuring cup; 1111. a first through hole; 112. a mixing part; 1121. a second through hole; 113. a spray head.
Detailed Description
So that the manner in which the above recited objects, features and advantages of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to the embodiments thereof which are illustrated in the appended drawings. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those specifically described herein, and therefore the scope of the present invention is not limited by the specific embodiments disclosed below.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "fixed," and the like are to be construed broadly and can include, for example, fixed connections, detachable connections or integral connections; the connection can be mechanical or electrical; it can be direct, or indirect through an intermediate medium, or an internal communication between two elements or an interaction between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
Example 1
As shown in fig. 1 and fig. 2, the present embodiment provides an intelligent cooking control system, which includes a cooking device 1, an acquisition module 2 and a control module 3;
the cooking equipment 1 and the acquisition module 2 are both connected with the control module 3, the acquisition module 2 is used for acquiring food material information and user information, the control module 3 determines the face feature information of a user according to the user information, then determines the actual health condition of the user according to the face feature information of the user and standard face feature information, and determines a cooking mode according to the actual health condition of the user; finally, determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition; the cooking device 1 cooks the food material in the cooking mode and the actual cooking parameters until the cooking is finished.
When the intelligent cooking control system is used, a user places food materials in an inner container of the cooking equipment 1, the acquisition module 2 acquires food material information and user information, and feeds the food material information and the user information back to the control module 3; the control module 3 determines the face feature information of the user according to the user information; determining the actual health condition of the user according to the face feature information of the user and the standard face feature information, and determining a cooking mode according to the actual health condition of the user; finally, determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition; the cooking equipment 1 cooks the food materials according to the cooking mode and the actual cooking parameters issued by the control module 3 until the cooking is finished, and the whole cooking process does not need a user to set the cooking mode and the cooking parameters, so that the problem that the intelligent degree of the existing cooking equipment is not high enough is solved; in addition, because the cooking mode and the cooking parameters of the embodiment are determined according to the actual health condition and the food material information of the user, the intelligent cooking effect is achieved by combining the health of the user.
In this embodiment, the control module 3 may be independent from the cooking apparatus 1, or may be provided in the cooking apparatus 1. The control module 3 is preferably a controller, a single-chip microcomputer or a similar device capable of storing data, retrieving data, receiving data, making judgments and controlling the other components.
As shown in fig. 2, in this embodiment, the acquisition module 2 includes a first camera 21 and a second camera 22; the first camera 21 is disposed in the inner container of the cooking apparatus 1 for photographing the food material to obtain a food material photo, and the second camera 22 is disposed on the cooking apparatus 1 for photographing the user to obtain a user photo.
As shown in fig. 2, in the present embodiment, the cooking apparatus 1 includes a blanking assembly 11 for adding seasoning to food materials, the blanking assembly 11 is disposed in the inner container and electrically connected to the control module 3; the control module 3 controls the blanking assembly 11 to add seasoning to the food material according to the cooking mode and the cooking parameters.
As shown in fig. 2-4, the blanking assembly 11 comprises a plurality of measuring cups 111 for containing seasonings, a mixing part 112 with a mixing cavity and a stirring part;
the mixing part 112 is arranged below the measuring cups 111, a first through hole 1111 communicated with the mixing cavity is formed in the bottom of each measuring cup 111, and a first valve used for opening or closing the first through hole 1111 is arranged at the first through hole 1111; a second through hole 1121 communicating the mixing cavity with the inner container is formed in the bottom of the mixing part 112, and a second valve for opening or closing the second through hole 1121 is disposed at the second through hole 1121;
the stirring part is arranged in the mixing cavity and is electrically connected with the control module 3, and the first valve and the second valve are also electrically connected with the control module 3.
The number of the measuring cups 111 is at least two, and preferably one of the measuring cups is filled with water, and the other measuring cup is filled with salt. Before the cooking device 1 cooks the food, the control module 3 controls the first valve to open, and the seasoning in the measuring cup 111 falls into the mixing cavity of the mixing part 112 through the first through hole 1111; when the seasonings added into the mixing cavity meet the cooking requirement, the control module 3 controls the first valve to be closed and controls the stirring part to start stirring the seasonings in the mixing cavity so as to uniformly mix the seasonings; after the seasonings are uniformly mixed, the control module 3 controls the stirring part to stop, and simultaneously controls the second valve to open, so that the seasonings in the mixing cavity fall onto the food material through the second through hole 1121, and finally, the food material is cooked until the cooking is finished.
As shown in fig. 5, the blanking assembly 11 further includes a spray head 113, and the spray head 113 communicates with the mixing cavity through the second through hole 1121 for spraying the mixed seasoning onto the food material. Spraying the mixed seasoning onto the food material through the second through hole 1121 and the spray head 113 prevents the seasoning from accumulating in one area of the food material.
As shown in fig. 1 and 2, the cooking apparatus 1 further includes a guiding assembly 12, the guiding assembly 12 is disposed above the food material, the blanking assembly 11 is connected to the guiding assembly 12 in a sliding manner, and the guiding assembly 12 is further electrically connected to the control module 3.
Wherein the guide assembly 12 comprises a guide rail 121 and a driving member; the guide rail 121 is arranged above the food material and is connected with the blanking assembly 11 in a sliding fit manner through the driving member, and the driving member is electrically connected with the control module 3. The control module 3 controls the driving member to start, and the driving member drives the blanking assembly 11 to slide along the guide rail 121 so as to spread the seasoning evenly on the food material. The driving member is preferably an electric motor.
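The dispensing sequence described above can be summarised in a short control sketch. The following Python snippet is illustrative only: the Actuator class, the function name and the fixed delays are hypothetical stand-ins for the first valve, second valve, stirring part and driving member of this embodiment, not the actual firmware of the control module 3.

```python
# Minimal sketch of the seasoning-dispensing sequence, assuming hypothetical actuator interfaces.
import time
from dataclasses import dataclass


@dataclass
class Actuator:
    """Stand-in for a valve, stirrer or rail drive switched by the control module."""
    name: str

    def on(self) -> None:
        print(f"{self.name}: on")

    def off(self) -> None:
        print(f"{self.name}: off")


def dispense_seasoning(first_valve: Actuator, stirrer: Actuator,
                       second_valve: Actuator, rail_drive: Actuator,
                       fill_seconds: float, mix_seconds: float,
                       spray_seconds: float) -> None:
    # 1. Open the first valve so seasoning falls from the measuring cups 111
    #    into the mixing cavity of the mixing part 112; then close it.
    first_valve.on()
    time.sleep(fill_seconds)      # stands in for "the amount reaches the preset threshold"
    first_valve.off()

    # 2. Stir until the seasonings are uniformly mixed.
    stirrer.on()
    time.sleep(mix_seconds)
    stirrer.off()

    # 3. Open the second valve and move the blanking assembly 11 along the
    #    guide rail 121 so the spray head 113 distributes the mixture evenly.
    second_valve.on()
    rail_drive.on()
    time.sleep(spray_seconds)
    rail_drive.off()
    second_valve.off()
```

In a real appliance the fixed delays would be replaced by sensor feedback, for example a level sensor in the mixing cavity reporting that the preset threshold has been reached.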
In this embodiment, the intelligent cooking control system further includes a communication module 4, and the control module 3 is in communication connection with the cooking device 1 through the communication module 4. The communication module 4 may be a Wi-Fi module or a Bluetooth module. The control module 3 issues the cooking mode and the cooking parameters to the cooking device 1 through the communication module 4, and the cooking device 1 performs the cooking operation after receiving them, until the cooking process is finished.
Example 2
As shown in fig. 6, the present embodiment discloses an intelligent cooking control method, which includes:
s1, acquiring food material information and user information in the cooking equipment;
s2, determining the face feature information of the user according to the user information;
s3, determining the actual health condition of the user according to the face feature information of the user and the standard face feature information;
s4, determining a cooking mode according to the actual health condition of the user;
s5, determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition;
and S6, the cooking equipment cooks in the cooking mode and the cooking parameters until the cooking is finished.
In the control method in this embodiment, the actual health condition of the user is determined according to the user information, the cooking mode is determined according to the actual health condition of the user, the cooking parameter is determined according to the cooking mode and the food material information, and then the cooking device performs cooking in the cooking mode and the cooking parameter until the cooking is finished. The whole cooking process of the embodiment does not need a user to set a cooking mode and cooking parameters, and the problem that the intelligent degree of the existing cooking equipment is not high enough is solved; in addition, because the cooking mode and the cooking parameters of the embodiment are determined according to the actual health condition and the food material information of the user, the intelligent cooking effect is achieved by combining the health of the user.
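For clarity, the overall flow of steps S1 to S6 can be sketched as follows. This is a minimal illustration under stated assumptions, not the claimed implementation: the device and camera objects and the four helper callables are hypothetical placeholders for the modules described in this embodiment.

```python
# Illustrative sketch of steps S1-S6; every name below is a placeholder.
def run_cooking_cycle(device, food_camera, user_camera,
                      extract_face_features, assess_health,
                      classify_food, select_parameters):
    food_photo = food_camera.capture()                 # S1: food material information
    user_photo = user_camera.capture()                 # S1: user information

    features = extract_face_features(user_photo)       # S2: per-region CMYK/HSV/YCbCr values
    unhealthy = assess_health(features)                 # S3: set of regions flagged unhealthy
    mode = "steam" if unhealthy else "default"          # S4: steaming mode if any region is unhealthy
    food_type = classify_food(food_photo)
    params = select_parameters(mode, food_type, len(unhealthy))   # S5: actual cooking parameters
    device.cook(mode, params)                           # S6: cook until finished
```

The patent only specifies the steaming mode for an unhealthy result; the "default" branch above is an assumption added to make the sketch complete.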
In this embodiment, the acquiring of the food material information in the cooking device and the user information in S1 includes:
the method comprises the steps of obtaining a food material photo as food material information by photographing a food material, and obtaining a user photo as user information by photographing a user.
In this embodiment, the determining the facial feature information of the user according to the user information in S2 includes:
acquiring characteristic information of each part of a user face based on a CMYK-H-CbCr skin color segmentation model and combined with an Adaboost algorithm; the feature information includes CMYK feature values, HSV feature values, and/or YCbCr feature values.
The method for acquiring the feature information of each part of the face of the user comprises the following steps: feature information of the eyebrows, eyes, ears, nose, mouth, hair volume and/or other facial regions of the user is acquired.
That is, feature information of the following portions of the face of the user is determined from the acquired user photograph:
CMYK eigenvalues, HSV eigenvalues and/or YCbCr eigenvalues of the user's eyebrows;
CMYK eigenvalues, HSV eigenvalues and/or YCbCr eigenvalues of the user's eyes;
CMYK eigenvalues, HSV eigenvalues and/or YCbCr eigenvalues of the user's ears;
CMYK eigenvalues, HSV eigenvalues and/or YCbCr eigenvalues of the user's nose;
CMYK eigenvalues, HSV eigenvalues and/or YCbCr eigenvalues of the user's mouth;
CMYK eigenvalues, HSV eigenvalues and/or YCbCr eigenvalues of the user's hair volume;
CMYK eigenvalues, HSV eigenvalues and/or YCbCr eigenvalues of other areas of the user's face.
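As a rough illustration of how such feature values could be obtained, the sketch below detects a face with OpenCV's AdaBoost-trained Haar cascade and averages CMYK, HSV and YCbCr values over the detected region. It is a simplification under stated assumptions: the CMYK-H-CbCr segmentation model and the per-part localisation (eyebrows, eyes, ears, nose, mouth, hair volume) of the described method are not reproduced, and all function names are illustrative.

```python
# Sketch only: whole-face colour statistics as a stand-in for per-part feature values.
import cv2
import numpy as np


def rgb_to_cmyk(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image (0-255) to CMYK values in [0, 1]."""
    rgb = rgb.astype(np.float64) / 255.0
    k = 1.0 - rgb.max(axis=2)
    denom = np.where(k < 1.0, 1.0 - k, 1.0)   # avoid division by zero on pure black pixels
    c = (1.0 - rgb[..., 0] - k) / denom
    m = (1.0 - rgb[..., 1] - k) / denom
    y = (1.0 - rgb[..., 2] - k) / denom
    return np.dstack([c, m, y, k])


def face_feature_values(bgr_image: np.ndarray) -> dict:
    """Mean CMYK / HSV / YCbCr values over the first face found by a Haar (AdaBoost) cascade."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found in the user photo")
    x, y, w, h = faces[0]
    roi = bgr_image[y:y + h, x:x + w]

    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    ycrcb = cv2.cvtColor(roi, cv2.COLOR_BGR2YCrCb)   # OpenCV channel order: Y, Cr, Cb
    cmyk = rgb_to_cmyk(cv2.cvtColor(roi, cv2.COLOR_BGR2RGB))

    return {
        "CMYK": cmyk.reshape(-1, 4).mean(axis=0),
        "HSV": hsv.reshape(-1, 3).mean(axis=0),
        "YCbCr": ycrcb.reshape(-1, 3).mean(axis=0),
    }
```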
In this embodiment, the standard face feature information includes healthy face feature information and unhealthy face feature information;
the healthy face feature information comprises healthy face CMYK feature values, healthy face HSV feature values and/or healthy face YCbCr feature values;
the unhealthy face feature information comprises CMYK feature values of unhealthy faces, HSV feature values of the unhealthy faces and/or YCbCr feature values of the unhealthy faces.
In a specific embodiment, the standard face feature information consists of values preset in the control system in advance; when it is necessary to judge whether the actual face feature information matches the standard face feature information, the standard face feature information is called directly from the database. The specific standard face feature information is shown in Table 1.
TABLE 1 Standard human face characteristic information Table
(Table 1 is reproduced only as drawing images in the original publication; its interval bounds are preset values and are not reproduced here.)
Wherein [X_M1, X_M2] in Table 1 denotes X_M1 ≤ CMYK_M ≤ X_M2, where X_M1 and X_M2 are arbitrary numbers; [X_M5, X_M6] denotes X_M5 ≤ CMYK_M ≤ X_M6, where X_M5 and X_M6 are arbitrary numbers; the other face feature information is understood in the same way.
In this embodiment, if the facial feature information of the user is unhealthy facial feature information, the cooking mode is determined to be a steaming mode.
For example, based on the CMYK-H-CbCr skin color segmentation model combined with the Adaboost algorithm, the CMYK feature value of the user's eyebrows obtained from the user photo is X_M, the HSV feature value is Y_M, and the YCbCr feature value is Z_M; if X_M5 < X_M < X_M6, Y_M5 < Y_M < Y_M6 and Z_M5 < Z_M < Z_M6, this indicates that the actual health condition of the user is unhealthy and cooking in a steaming mode is required.
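A compact way to express this comparison is sketched below. The interval bounds are made-up placeholder numbers (the patent leaves them as preset values in Table 1), and each region is assumed to have already been reduced to one representative value per colour space; a region is flagged unhealthy only when all three of its values fall inside the corresponding unhealthy intervals, as in the rules above.

```python
# Placeholder thresholds; the real bounds are the preset values of Table 1.
UNHEALTHY_RANGES = {
    # region: {"CMYK": (low, high), "HSV": (low, high), "YCbCr": (low, high)}
    "eyebrows": {"CMYK": (0.40, 0.60), "HSV": (20.0, 40.0), "YCbCr": (120.0, 150.0)},
    # ... entries for eyes, ears, nose, mouth, hair volume and other face areas
}


def unhealthy_regions(face_features: dict) -> set:
    """Return the face regions whose feature values all fall inside the unhealthy intervals."""
    flagged = set()
    for region, ranges in UNHEALTHY_RANGES.items():
        values = face_features.get(region)   # e.g. {"CMYK": 0.52, "HSV": 31.0, "YCbCr": 133.0}
        if values is None:
            continue
        if all(ranges[space][0] < values[space] < ranges[space][1]
               for space in ("CMYK", "HSV", "YCbCr")):
            flagged.add(region)
    return flagged
```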
In this embodiment, determining the actual cooking parameters according to the cooking mode, the food material information and the preset standard cooking parameters corresponding to the health condition includes:
and if the cooking mode is a steaming mode, determining steaming duration and steaming temperature according to the food material information and preset steaming standard cooking parameters corresponding to the health condition.
Wherein the standard cooking parameters corresponding to the actual health condition of the user are preset in the control system in advance.
For example, if the type of the food material is determined from the food material photo analysis, the standard cooking parameter corresponding to the type of the food material and corresponding to the health condition is called as the actual cooking parameter.
Preferably, when the food material is fish and shrimp:
if any one of the feature information of the eyebrows, eyes, ears, nose, mouth, hair volume or other facial areas of the user is unhealthy feature information, the actual cooking parameters are as follows: steaming at 100 deg.C for 60 min;
if two, three or four pieces of feature information in the feature information of the eyebrows, eyes, ears, nose, mouth, hair volume or other facial areas of the user are unhealthy feature information, the actual cooking parameters are: steaming temperature is 120 ℃, and steaming time is 60min, 90min and 120min respectively;
if five, six or seven pieces of feature information in the feature information of the eyebrows, eyes, ears, noses, mouths, hair volume or other facial areas of the user are matched with unhealthy feature information, the actual cooking parameters are as follows: steaming time is 120min, and steaming temperature is 200 deg.C, 180 deg.C and 160 deg.C respectively.
Preferably, when the food material is vegetables:
if any one of the feature information of the eyebrows, eyes, ears, nose, mouth, hair volume or other facial areas of the user is unhealthy feature information, the actual cooking parameters are as follows: steaming at 100 deg.C for 13 min;
if two, three or four pieces of feature information in the feature information of the eyebrows, eyes, ears, nose, mouth, hair volume or other facial areas of the user are unhealthy feature information, the actual cooking parameters are: steaming temperature is 120 ℃, and steaming time is 10min, 15min and 20min respectively;
if five, six or seven pieces of feature information in the feature information of the eyebrows, eyes, ears, noses, mouths, hair volume or other facial areas of the user are matched with unhealthy feature information, the actual cooking parameters are as follows: steaming time is 10min, and steaming temperature is 200 deg.C, 180 deg.C and 160 deg.C respectively.
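The preferred parameters listed above for fish and shrimp and for vegetables can be captured in a simple lookup keyed by the food material type and by how many face regions were flagged as unhealthy (1 to 7). The values below are taken directly from this description; the dictionary layout and function name are only an illustrative sketch.

```python
# (steaming temperature in degrees C, steaming duration in minutes), as listed above.
STEAM_PARAMETERS = {
    "fish_and_shrimp": {
        1: (100, 60),
        2: (120, 60), 3: (120, 90), 4: (120, 120),
        5: (200, 120), 6: (180, 120), 7: (160, 120),
    },
    "vegetables": {
        1: (100, 13),
        2: (120, 10), 3: (120, 15), 4: (120, 20),
        5: (200, 10), 6: (180, 10), 7: (160, 10),
    },
}


def steaming_parameters(food_type: str, unhealthy_count: int) -> tuple:
    """Look up the (temperature, duration) pair for the steaming mode."""
    return STEAM_PARAMETERS[food_type][unhealthy_count]
```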
The following describes the intelligent cooking control method of the present embodiment with reference to embodiment 1, and the method includes the following steps:
S1, the first camera 21 of the acquisition module 2 photographs the food material placed in the inner container of the cooking equipment 1 to obtain a food material photo, the second camera 22 photographs the user to obtain a user photo, and the acquisition module 2 feeds the food material photo and the user photo back to the control module 3;
s2, the control module 3 analyzes the user photo to obtain the feature information of each part of the user face based on the CMYK-H-CbCr skin color segmentation model by combining the Adaboost algorithm, wherein the feature information of each part of the user face comprises:
the CMYK feature value X_M, HSV feature value Y_M and YCbCr feature value Z_M of the user's eyebrows;
the CMYK feature value X_Y, HSV feature value Y_Y and YCbCr feature value Z_Y of the user's eyes;
the CMYK feature value X_E, HSV feature value Y_E and YCbCr feature value Z_E of the user's ears;
the CMYK feature value X_B, HSV feature value Y_B and YCbCr feature value Z_B of the user's nose;
the CMYK feature value X_K, HSV feature value Y_K and YCbCr feature value Z_K of the user's mouth;
the CMYK feature value X_F, HSV feature value Y_F and YCbCr feature value Z_F of the user's hair volume;
the CMYK feature value X_O, HSV feature value Y_O and YCbCr feature value Z_O of other areas of the user's face.
S3, the control module 3 compares the acquired face feature information with the standard face feature information to determine the actual health condition of the user; wherein, the standard characteristic information is shown in table 1;
for the feature information of the user's eyebrows: if X_M5 < X_M < X_M6, Y_M5 < Y_M < Y_M6 and Z_M5 < Z_M < Z_M6, the feature information of the user's eyebrows is unhealthy; otherwise, the feature information of the user's eyebrows is healthy;
for the feature information of the user's eyes: if X_Y5 < X_Y < X_Y6, Y_Y5 < Y_Y < Y_Y6 and Z_Y5 < Z_Y < Z_Y6, the feature information of the user's eyes is unhealthy; otherwise, the feature information of the user's eyes is healthy;
for the feature information of the user's ears: if X_E5 < X_E < X_E6, Y_E5 < Y_E < Y_E6 and Z_E5 < Z_E < Z_E6, the feature information of the user's ears is unhealthy; otherwise, the feature information of the user's ears is healthy;
for the feature information of the user's nose: if X_B5 < X_B < X_B6, Y_B5 < Y_B < Y_B6 and Z_B5 < Z_B < Z_B6, the feature information of the user's nose is unhealthy; otherwise, the feature information of the user's nose is healthy;
for the feature information of the user's mouth: if X_K5 < X_K < X_K6, Y_K5 < Y_K < Y_K6 and Z_K5 < Z_K < Z_K6, the feature information of the user's mouth is unhealthy; otherwise, the feature information of the user's mouth is healthy;
for the feature information of the user's hair volume: if X_F5 < X_F < X_F6, Y_F5 < Y_F < Y_F6 and Z_F5 < Z_F < Z_F6, the feature information of the user's hair volume is unhealthy; otherwise, the feature information of the user's hair volume is healthy;
for the feature information of other areas of the user's face: if X_O5 < X_O < X_O6, Y_O5 < Y_O < Y_O6 and Z_O5 < Z_O < Z_O6, the feature information of other areas of the user's face is unhealthy; otherwise, the feature information of other areas of the user's face is healthy;
s4, the control module 3 determines a cooking mode according to the actual health condition of the user;
specifically, if the feature information of any part of the user's face is unhealthy face feature information, the cooking mode is determined to be a steaming mode;
s5, the control module 3 determines the actual cooking parameters according to the cooking mode, the food material information and the preset standard cooking parameters corresponding to the health condition, which specifically includes:
the control module 3 determines the food material type by analyzing the food material photo; determining actual cooking parameters according to the food material types, the food material weights and pre-stored standard cooking parameters corresponding to the health conditions; the control module 3 also issues the cooking mode and the actual cooking parameters to the cooking equipment 1 through the communication module 4;
S6, the cooking device 1 receives the steaming mode and the actual cooking parameters through the communication module 4. The control module 3 controls the first valve of the blanking assembly 11 to open, so that the water, seasonings and the like in the measuring cups 111 fall into the mixing cavity of the mixing part 112; when the amount of water and seasoning reaches a preset threshold, the first valve is closed and the stirring part is started to stir the water and seasoning evenly. The second valve is then opened and the driving member is started to move the blanking assembly 11 along the guide rail 121, so that the mixed seasoning is sprayed onto the food material through the spray head 113. After the mixed seasoning has been sprayed, the blanking assembly 11 is reset, and the cooking device 1 starts to work in the received steaming mode with the received steaming parameters until the cooking is finished.
In the description herein, references to the terms "one embodiment," "some embodiments," "specific embodiments," and the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (15)

1. An intelligent cooking control system is characterized by comprising a cooking device (1), an acquisition module (2) and a control module (3);
the cooking equipment (1) and the acquisition module (2) are both connected with the control module (3), the acquisition module (2) is used for acquiring food material information and user information, the control module (3) determines the face feature information of a user according to the user information, then determines the actual health condition of the user according to the face feature information of the user and standard face feature information, and determines a cooking mode according to the actual health condition of the user; finally, determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition; the cooking equipment (1) cooks food materials according to the cooking mode and the actual cooking parameters until the cooking is finished.
2. The intelligent cooking control system according to claim 1, wherein the obtaining module (2) comprises a first camera (21) and a second camera (22), the first camera (21) is arranged in a liner of the cooking device (1) and used for shooting food materials to obtain food material photos, and the second camera (22) is arranged on the cooking device (1) and used for shooting users to obtain user photos.
3. The intelligent cooking control system according to claim 1, wherein the cooking device (1) comprises a blanking assembly (11) for adding seasoning to food materials, the blanking assembly (11) is arranged in the inner container and is electrically connected with the control module (3).
4. The intelligent cooking control system according to claim 3, wherein the blanking assembly (11) comprises a plurality of measuring cups (111) for containing seasonings, a mixing part (112) with a mixing cavity and a stirring part;
the mixing part (112) is arranged below the measuring cups (111), a first through hole (1111) communicated with the mixing cavity is formed in the bottom of each measuring cup (111), and a first valve used for opening or closing the first through hole (1111) is arranged at the first through hole (1111); a second through hole (1121) communicating the mixing cavity with the inner container is formed in the bottom of the mixing part (112), and a second valve used for opening or closing the second through hole (1121) is arranged at the second through hole (1121);
the stirring part is arranged in the mixing cavity and is electrically connected with the control module (3), and the first valve and the second valve are also electrically connected with the control module (3).
5. The intelligent cooking control system according to claim 4, wherein the blanking assembly (11) further comprises a spray head (113), and the spray head (113) is communicated with the mixing cavity through the second through hole (1121) for spraying the mixed seasoning to the food material.
6. The intelligent cooking control system according to claim 3, wherein the cooking device (1) further comprises a guiding assembly (12), the guiding assembly (12) is arranged above food materials, the blanking assembly (11) is connected with the guiding assembly (12) in a sliding fit manner, and the guiding assembly (12) is further electrically connected with the control module (3).
7. The intelligent cooking control system according to claim 6, wherein the guide assembly (12) comprises a guide rail (121) and a drive; the guide rail (121) is arranged above the food material, and is connected with the blanking assembly (11) in a sliding fit mode through the driving piece, and the driving piece is electrically connected with the control module (3).
8. The intelligent cooking control system according to any one of claims 1-7, further comprising a communication module (4), wherein the control module (3) is in communication connection with the cooking device (1) through the communication module (4).
9. An intelligent cooking control method is characterized by comprising the following steps:
acquiring food material information in the cooking equipment and user information;
determining face feature information of a user according to the user information;
determining the actual health condition of the user according to the face feature information of the user and the standard face feature information;
determining a cooking mode according to the actual health condition of the user;
determining actual cooking parameters according to the cooking mode, the food material information and preset standard cooking parameters corresponding to the health condition;
and the cooking equipment cooks in the cooking mode and the actual cooking parameters until the cooking is finished.
10. The intelligent cooking control method according to claim 9, wherein the acquiring of the food material information in the cooking device and the user information comprises:
the method comprises the steps of obtaining a food material photo as food material information by photographing a food material, and obtaining a user photo as user information by photographing a user.
11. The intelligent cooking control method according to claim 9, wherein the determining the facial feature information of the user according to the user information includes:
acquiring characteristic information of each part of a user face based on a CMYK-H-CbCr skin color segmentation model and combined with an Adaboost algorithm; the characteristic information comprises CMYK characteristic values, HSV characteristic values and/or YCbCr characteristic values.
12. The intelligent cooking control method according to claim 11, wherein the obtaining of feature information of each part of the face of the user comprises:
feature information of the eyebrows, eyes, ears, nose, mouth, hair volume and/or other facial regions of a user is acquired.
13. The intelligent cooking control method according to any one of claims 9 to 12, wherein the standard facial feature information includes healthy facial feature information and unhealthy facial feature information;
the healthy face feature information comprises healthy face CMYK feature values, healthy face HSV feature values and/or healthy face YCbCr feature values;
the unhealthy face feature information comprises CMYK feature values of unhealthy faces, HSV feature values of the unhealthy faces and/or YCbCr feature values of the unhealthy faces.
14. The intelligent cooking control method according to claim 13, wherein if the facial feature information of the user is unhealthy facial feature information, the cooking mode is determined to be a steaming mode.
15. The intelligent cooking control method according to claim 14, wherein the determining of the actual cooking parameters according to the cooking mode, the food material information and the preset standard cooking parameters corresponding to the health condition comprises:
and if the cooking mode is a steaming mode, determining steaming duration and steaming temperature according to the food material information and preset steaming standard cooking parameters corresponding to the health condition.
CN202011617453.0A 2020-12-30 2020-12-30 Intelligent cooking control system and method Pending CN112754301A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011617453.0A CN112754301A (en) 2020-12-30 2020-12-30 Intelligent cooking control system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011617453.0A CN112754301A (en) 2020-12-30 2020-12-30 Intelligent cooking control system and method

Publications (1)

Publication Number Publication Date
CN112754301A true CN112754301A (en) 2021-05-07

Family

ID=75697871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011617453.0A Pending CN112754301A (en) 2020-12-30 2020-12-30 Intelligent cooking control system and method

Country Status (1)

Country Link
CN (1) CN112754301A (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104298983A (en) * 2013-07-15 2015-01-21 清华大学 Tongue fur image acquisition and analysis system with distributed user terminals
CN105030039A (en) * 2015-06-27 2015-11-11 广东天际电器股份有限公司 Intelligent cooking system capable of recognizing user geographic position and collecting user health information and application thereof
CN105573133A (en) * 2015-12-24 2016-05-11 小米科技有限责任公司 Cooking control method and apparatus
CN107367959A (en) * 2016-05-13 2017-11-21 佛山市顺德区美的电热电器制造有限公司 The control method and device of intelligent electric cooker
CN109805736A (en) * 2019-01-31 2019-05-28 珠海优特智厨科技有限公司 Baiting method and device
CN111984210A (en) * 2019-05-22 2020-11-24 佛山市顺德区美的电热电器制造有限公司 Cooking utensil
CN110797105A (en) * 2019-10-08 2020-02-14 珠海格力电器股份有限公司 Menu recommendation method and device, storage medium and cooking equipment
CN211130751U (en) * 2019-11-26 2020-07-31 成都林炉科技有限公司 Kebab condiment sprinkler
CN212117979U (en) * 2019-12-13 2020-12-11 广州富港万嘉智能科技有限公司 Intelligent kitchen ware
CN111163544A (en) * 2019-12-30 2020-05-15 广东美的厨房电器制造有限公司 Control method, cooking apparatus, and computer-readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
方承志; 袁海峰: "Face Detection Based on CMYK-H-CbCr Skin Color Detection and an Improved AdaBoost Algorithm", Computer Applications and Software *

Similar Documents

Publication Publication Date Title
CN107806656B (en) Food heating control method and food heating device
CN110824942B (en) Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium
CN109445485A (en) A kind of control method and cooking apparatus of cooking apparatus
CN110664259A (en) Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium
CN108309021A (en) A kind of intelligent control automatic dish cooking machine and its intelligent control method
CN109213015B (en) A kind of control method and cooking apparatus of cooking apparatus
CN110806699A (en) Control method and device of cooking equipment, cooking equipment and storage medium
CN206596942U (en) A kind of intelligent cooking system
CN108552936A (en) The cooking methods of cooking machine and cooking machine
JP2022511916A (en) Beverage preparation machine that recognizes capsules
CN111839247A (en) Control method and control device of cooking equipment and cooking equipment
CN109124293A (en) Cooking appliance, control method and system thereof and server
CN111084541A (en) Cooking equipment and nutrition management method thereof
CN110222720A (en) A kind of cooking equipment with short video acquisition function
CN108937554B (en) Steaming and baking equipment and method for reminding diet by using terminal
CN111700508A (en) Control method of cooking appliance, cooking appliance and computer readable storage medium
CN112754301A (en) Intelligent cooking control system and method
CN110934508A (en) Oven control method and device
CN110236363A (en) The control method and control device of electric cooking apparatus and its breath light
JP2021034223A (en) Heating cooker
CN114092806A (en) Recognition method and device thereof, cooking equipment and control method thereof and storage medium
CN106551633A (en) A kind of heating cooking apparatus and its oil temperature control method
CN102200298A (en) Stove and intelligent control method thereof
CN204378944U (en) Upper cover subassembly and food processor suitable for cooking machine
CN109472510A (en) A kind of intelligent scoring method, cooking system and cooking apparatus for cooking effect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210507