CN113378637A - Kitchen electrical equipment control method based on user cooking action prediction - Google Patents


Info

Publication number: CN113378637A
Authority: CN (China)
Prior art keywords: user, cooking action, cooking, output, dimension
Legal status: Granted (the status listed is an assumption and is not a legal conclusion)
Application number: CN202110510069.9A
Other languages: Chinese (zh)
Other versions: CN113378637B (en)
Inventor: 秦臻
Current Assignee: Ningbo Fotile Kitchen Ware Co Ltd
Original Assignee: Ningbo Fotile Kitchen Ware Co Ltd
Application filed by Ningbo Fotile Kitchen Ware Co Ltd
Priority to CN202110510069.9A
Publication of CN113378637A
Application granted
Publication of CN113378637B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Cookers (AREA)

Abstract

The invention relates to a kitchen electrical equipment control method based on user cooking action prediction. A convolutional neural network for predicting the user's cooking action is constructed in advance, and its parameters are determined by training on a pre-collected data set of user cooking actions. The trained network is then used as a model for predicting the user's cooking action at the next moment: the cooking action data collected at the current moment is input into the model, the model's output is taken as the predicted next cooking action, and the kitchen electrical equipment executes the work corresponding to that prediction. Because the next cooking action can be predicted from the action at the previous moment, before the user actually performs it, the responsiveness of the kitchen electrical equipment to the user's next cooking action is improved, the lag in responding to cooking actions is eliminated, and response efficiency is increased.

Description

Kitchen electrical equipment control method based on user cooking action prediction
Technical Field
The invention relates to the field of kitchen electrical equipment, in particular to a kitchen electrical equipment control method based on user cooking action prediction.
Background
As intelligence is increasingly integrated into people's daily lives, smart kitchen appliances are gradually entering their kitchens, helping users control kitchen electrical equipment such as range hoods and cookers in a more intelligent way.
Chinese patent application CN105825195A discloses a method for intelligently identifying cooking behaviors, comprising: 1) entering a learning/training mode; 2) selecting in turn the training mode for each cooking action, with a demonstrator repeatedly performing the corresponding operation; 3) a motion sensor collecting the motion signals of each of the demonstrator's cooking actions; 4) a microprocessor analyzing the motion signal of each cooking action so as to build a feature-signal library of cooking actions; 5) entering a learning/identification mode; 6) the motion sensor collecting the learner's motion signals; 7) the microprocessor receiving and analyzing the learner's behavior signals; 8) the microprocessor comparing the learner's behavior signals with the demonstrator's feature information in the library and, when the learner's signals simultaneously match all the features of a certain cooking action in the library, determining that cooking action as the learner's current behavior.
However, the method of CN105825195A has a drawback: it identifies the learner's specific cooking behavior by matching the learner's motion signals against the feature information in the pre-built feature-signal library. Because recognition only happens after the learner's motion signal has been acquired, any operation executed according to the recognized cooking action lags behind the action itself; that is, the device responding to the user's cooking action cannot predict the user's action at the next moment in advance and execute the corresponding response ahead of time.
Disclosure of Invention
In view of the above prior art, the technical problem to be solved by the present invention is to provide a kitchen electrical appliance control method based on user cooking action prediction.
The technical scheme adopted by the invention for solving the technical problems is as follows: the kitchen electrical equipment control method based on user cooking action prediction is characterized by comprising the following steps of:
step S1, a convolutional neural network for predicting the user's cooking action is constructed in advance; the input of the convolutional neural network is the user's cooking action data at the previous moment, and its output is the predicted cooking action of the user at the next moment;
step S2, collecting a cooking action data set of a user at the previous cooking time in advance, and taking the cooking action data set as a cooking action training set;
step S3, inputting the cooking action training set into a convolutional neural network for training, determining each parameter in the convolutional neural network, and taking the convolutional neural network with each determined parameter as a convolutional neural network model for predicting the cooking action of the user at the next moment;
step S4, inputting the collected cooking action data of the user at the current moment into the convolutional neural network model, and predicting the cooking action of the user at the next moment;
in step S5, the kitchen electrical appliance is controlled to execute a corresponding operation based on the predicted cooking operation of the user at the next time.
In an improvement, the kitchen appliance control method based on user cooking action prediction further includes:
pre-constructing a relation list between each cooking action of the user and the corresponding kitchen electrical equipment control instruction;
and, in step S5, obtaining the control instruction corresponding to the predicted cooking action from the predicted cooking action of the user at the next moment and the pre-constructed relation list, and using the obtained control instruction to make the kitchen electrical equipment execute the corresponding work.
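As a concrete illustration, the relation list described above can be held as a simple mapping from predicted cooking actions to control instructions. The sketch below is hypothetical: the action names, the command strings and the `command_for` helper are illustrative assumptions, not identifiers taken from the patent.

```python
# Hypothetical cooking action -> control instruction relation list.
# Both the action names and the command strings are illustrative
# assumptions; the patent does not specify concrete identifiers.
ACTION_TO_COMMAND = {
    "lid covering": "hob:reduce_heat",
    "lid lifting": "hood:increase_fan",
    "dish adding": "hob:hold_heat",
    "fast stir-frying": "hood:max_fan",
    "slow stir-frying": "hob:medium_heat",
    "pan lifting": "hob:turn_off",
}

def command_for(predicted_action: str) -> str:
    """Look up the control instruction for a predicted cooking action;
    fall back to a no-op for unrecognised (non-cooking) actions."""
    return ACTION_TO_COMMAND.get(predicted_action, "no_op")
```

In step S5 the appliance would then simply execute `command_for(predicted_action)`.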
Still further, in the kitchen electrical appliance control method based on user cooking action prediction, the user's cooking action data includes lid covering, lid lifting, dish adding, fast stir-frying, slow stir-frying and pan lifting.
In the kitchen electrical appliance control method based on user cooking action prediction, the convolutional neural network has a first input branch, a first output branch corresponding to the first input branch, a second input branch and a second output branch corresponding to the second input branch. The first input branch takes as input the N1 consecutive image frames before a preset moment, and the output result of the first output branch is the user's cooking action at the next moment after the preset moment; the second input branch takes as input the N2 consecutive image frames before the preset moment, and the output result of the second output branch is likewise the user's cooking action at the next moment after the preset moment, with N2 > N1 > 2.
Further, the kitchen electrical appliance control method based on user cooking action prediction further comprises the following steps:
carrying out fusion processing on an output result of a first output branch and an output result of a second output branch of the convolutional neural network;
and predicting the cooking action of the user at the later moment according to the result after the fusion processing, and controlling the kitchen electric equipment to execute corresponding work according to the predicted subsequent cooking action of the user.
Still further, in the kitchen appliance control method based on user cooking action prediction, the fusion processing employs a parameter-free concatenation method comprising the following steps:
setting the output dimension corresponding to the output result of the first output branch as A1 × B1 × C1 × D1;
setting the output dimension corresponding to the output result of the second output branch as A1 × B2 × C1 × D1;
splicing the two four-dimensional matrices along their 2nd dimension to obtain a new matrix, the length of whose 2nd dimension is the sum of the lengths of the 2nd dimensions of the two matrices before splicing; that is, the new output dimension is A1 × B3 × C1 × D1 with B3 = B1 + B2;
applying a flattening operation to the new output to obtain an output vector of dimension 1 × D, where D = A1·B3·C1·D1;
performing full-connection processing twice on the 1 × D output vector to obtain an output vector of dimension 1 × M, where each value in the 1 × M vector characterizes the probability that the output user action is a given cooking action cook_m; M is the total number of cooking action types in the data set collected in step S2, cook_m is the m-th of the M cooking action types, and 1 ≤ m ≤ M;
and taking the cooking action corresponding to the maximum of all the values in the 1 × M output vector as the predicted cooking action of the user at the next moment.
In another improvement, in the kitchen electrical appliance control method based on user cooking motion prediction, when the maximum value of all values corresponding to the output vector with the dimension of 1 × M is smaller than a preset threshold value, the cooking motion of the user at the next moment is determined as a non-cooking motion.
Preferably, in the kitchen appliance control method based on the prediction of the cooking action of the user, the preset threshold is 0.8.
Compared with the prior art, the invention has the advantages that:
firstly, the kitchen electrical equipment control method of the invention constructs in advance a convolutional neural network for predicting the user's cooking action, trains its parameters on a pre-collected data set of user cooking actions, and uses the trained network as a model for predicting the user's cooking action at the next moment; the cooking action data collected at the current moment is then input into the model, the model's output is taken as the predicted next cooking action, and the kitchen electrical equipment executes the work corresponding to that prediction. Because the next cooking action can be predicted from the action at the previous moment, before the user actually performs it, the responsiveness of the kitchen electrical equipment to the user's next cooking action is improved, the lag in responding to cooking actions is eliminated, and response efficiency is increased;
secondly, exploiting the fact that different actions last for different lengths of time, the convolutional neural network is further given two input branches and two corresponding output branches, and the user's next cooking action is judged more accurately by fusing the output results of the two output branches, which improves the detection accuracy for the user's cooking action at the next moment.
Drawings
Fig. 1 is a flowchart illustrating a kitchen appliance control method based on user cooking motion prediction according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the accompanying examples.
The embodiment provides a kitchen electrical equipment control method based on user cooking action prediction, in particular to a kitchen range control method based on user cooking action prediction. Referring to fig. 1, the kitchen appliance control method based on user cooking motion prediction according to this embodiment includes the following steps:
step S1, a convolutional neural network for predicting the user's cooking actions is constructed in advance, together with a cooking action-control instruction relation list between each cooking action of the user and the corresponding kitchen electrical equipment control instruction; the input of the convolutional neural network is the user's cooking action data at the previous moment, and its output is the predicted cooking action of the user at the next moment. In this embodiment, the pre-constructed convolutional neural network is denoted ConvNet, its input is denoted x and its output y; the input x is the user's cooking action data at the previous time t1, and the output y is the user's cooking action at the next time t2;
step S2, a data set of the user's cooking actions at the previous cooking moment is collected in advance and used as the cooking action training set. The data set of the user's actions at the previous time t1 is denoted X = {x_{t1,m}}, where x_{t1,m} is the m-th cooking action data item collected for the previous time t1, M is the total number of cooking action data items collected for time t1, and the user's cooking action at the next time t2 corresponding to x_{t1,m} is denoted y_{t2,m}, with m ≤ M and M ≥ 1;
For example, in this embodiment, the pre-collected cooking action data of the user includes lid covering, lid lifting, dish adding, fast stir-frying, slow stir-frying and pan lifting;
step S3, the cooking action training set is input into the convolutional neural network for training, each parameter in the network is determined, and the network with all parameters determined is taken as the convolutional neural network model for predicting the user's cooking action at the next moment;
because the pre-constructed convolutional neural network ConvNet still has a number of parameters to be determined, each cooking action data item x_{t1,m} in the data set X is input into ConvNet, and ConvNet is trained against the next-moment cooking action y_{t2,m} corresponding to each x_{t1,m} so as to determine every undetermined parameter; the network with the determined parameters is then used as the convolutional neural network model for predicting the user's cooking action at the next moment, and this model is denoted CONVNET;
step S4, the collected cooking action data of the user at the current moment is input into the convolutional neural network model, and the user's cooking action at the next moment is predicted;
since the convolutional neural network model CONVNET with determined parameters has been obtained through the training of step S3, the newly collected cooking action data x_{ta} of the user at the current time ta can be input into CONVNET, which then predicts the user's cooking action y_{ta+1} at the next time ta+1;
step S5, the kitchen electrical equipment is controlled to execute the corresponding work according to the predicted cooking action of the user at the next moment and the pre-constructed cooking action-control instruction relation list. Since the user's cooking action y_{ta+1} at the next time ta+1 has already been predicted in step S4, the cooking range serving as the kitchen electrical equipment is now controlled to execute the work corresponding to y_{ta+1}.
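The train-then-freeze-then-predict flow of steps S1-S5 can be sketched in miniature. This is only a stand-in, assuming a linear softmax classifier and synthetic (x_{t1,m}, y_{t2,m}) pairs in place of the patent's ConvNet and real image data; it shows the parameters being fitted on the training set (step S3) and the frozen model then scoring a current-moment sample (step S4).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the training set: 300 previous-moment feature
# vectors x_{t1,m} with next-moment action labels y_{t2,m} in 6 classes.
n_samples, n_features, n_actions = 300, 8, 6
X = rng.normal(size=(n_samples, n_features))
true_W = rng.normal(size=(n_features, n_actions))
y = X.dot(true_W).argmax(axis=1)

# "Step S3": determine the model parameters by gradient descent on a
# softmax cross-entropy loss (a linear model, not the patent's CNN).
W = np.zeros((n_features, n_actions))
for _ in range(500):
    logits = X.dot(W)
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    W -= 0.5 * X.T.dot(p - np.eye(n_actions)[y]) / n_samples

train_accuracy = float((X.dot(W).argmax(axis=1) == y).mean())

# "Step S4": with the parameters frozen, score a current-moment sample.
x_now = X[0]
predicted_class = int(x_now.dot(W).argmax())
```

The same shape of loop applies to the real CNN; only the model and its gradient computation change.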
In order to predict the user's cooking action at the next moment more accurately from the collected cooking action data of the previous moment, the convolutional neural network of this embodiment has a first input branch, a first output branch corresponding to the first input branch, a second input branch and a second output branch corresponding to the second input branch. The first input branch takes as input the N1 consecutive image frames before a preset time T1, and the output result of the first output branch is the user's cooking action at the time T2 after T1; the second input branch takes as input the N2 consecutive image frames before T1, and the output result of the second output branch is likewise the user's cooking action at T2, with T2 > T1 > 0 and N2 > N1 > 2. For example, in this embodiment the frame count N1 is set to 16 and N2 is set to 64.
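The two input branches therefore consume different amounts of temporal context: the last N1 = 16 and the last N2 = 64 frames before the preset moment. A minimal sketch of slicing those two windows out of a frame buffer follows; the buffer and the 28 × 28 frame size are assumptions, since the patent does not specify the image dimensions.

```python
import numpy as np

N1, N2 = 16, 64  # frame counts of the two input branches (from the embodiment)

def branch_inputs(frames: np.ndarray, t: int):
    """Return the N1-frame and N2-frame windows that end at preset
    time index t; the caller must supply at least N2 earlier frames."""
    if t < N2:
        raise ValueError("need at least N2 frames before the preset time")
    return frames[t - N1:t], frames[t - N2:t]

# Hypothetical buffer of 100 grayscale 28x28 frames.
buffer = np.zeros((100, 28, 28))
short_clip, long_clip = branch_inputs(buffer, t=80)
```

The shorter window suits brief actions (e.g. lid lifting), the longer one sustained actions (e.g. slow stir-frying), which is the motivation the text gives for using two branches.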
Of course, the kitchen electrical appliance control method in this embodiment will perform fusion processing on the output result of the first output branch and the output result of the second output branch of the convolutional neural network; and predicting the cooking action of the user at the later moment according to the result after the fusion processing, and controlling the kitchen electric equipment to execute corresponding work according to the predicted subsequent cooking action of the user. The fusion processing adopts a connection method without parameters, and the connection method without parameters comprises the following steps:
step 1, set the output dimension corresponding to the output result of the first output branch to A1 × B1 × C1 × D1; correspondingly, with N1 set to 16 and N2 set to 64 in this embodiment, the output dimension of the first output branch is 2048 × 2 × 28 × 28;
step 2, set the output dimension corresponding to the output result of the second output branch to A1 × B2 × C1 × D1; correspondingly, the output dimension of the second output branch is 2048 × 8 × 28 × 28;
step 3, splice the two four-dimensional matrices along their 2nd dimension to obtain a new matrix, the length of whose 2nd dimension is the sum of the lengths of the 2nd dimensions of the two matrices before splicing; the new output dimension is A1 × B3 × C1 × D1 with B3 = B1 + B2;
that is, for the output dimensions 2048 × 2 × 28 × 28 and 2048 × 8 × 28 × 28 of the two output branches, splicing along the 2nd dimension yields the new output dimension 2048 × 10 × 28 × 28;
step 4, apply a flattening operation to the new output to obtain an output vector of dimension 1 × D, where D = A1·B3·C1·D1; that is, flattening the new output of dimension 2048 × 10 × 28 × 28 by conventional means yields an output vector of dimension 1 × 16056320;
step 5, perform full-connection processing twice on the 1 × D output vector to obtain an output vector of dimension 1 × M, where each value in the 1 × M vector characterizes the probability that the output user action is a given cooking action cook_m; M is the total number of cooking action types in the data set collected in step S2, cook_m is the m-th of the M cooking action types, and 1 ≤ m ≤ M;
specifically, in this embodiment the 1 × 16056320 output vector is put through two full-connection passes to obtain an output vector of dimension 1 × 6, the "6" being the total number of cooking action types in step S2 (lid covering, lid lifting, dish adding, fast stir-frying, slow stir-frying and pan lifting); the full-connection processing itself is a conventional technique in the field and is not described again here;
assume this 1 × 6 output vector is labeled (P1, P2, P3, P4, P5, P6), where P1 is the probability that the output user action is "lid covering", P2 the probability that it is "lid lifting", P3 "dish adding", P4 "fast stir-frying", P5 "slow stir-frying" and P6 "pan lifting";
and 6, taking the cooking action corresponding to the maximum value in all the numerical values corresponding to the output vector with the dimension of 1 multiplied by M as the predicted cooking action of the user at the next moment.
After the probabilities P1 to P6 for the six cooking actions have been obtained in step 5, suppose that P5, the probability for "slow stir-frying", is the largest of the six probabilities P1 to P6; "slow stir-frying" is then determined as the predicted cooking action the user will perform at the next moment, and the cooker executes the work corresponding to "slow stir-frying".
Of course, if the maximum of all the values in the 1 × 6 output vector is smaller than the preset threshold of 0.8, the user's action at the next moment is determined to be a non-cooking action. That is, if the maximum value P5 of the probabilities P1 to P6 is less than the preset threshold of 0.8, the user's action at the next moment is determined to be a non-cooking action.
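Steps 1 to 6 of the parameter-free fusion can be traced numerically. The sketch below uses deliberately small placeholder shapes and random placeholder weights (the embodiment's real branch outputs are 2048 × 2 × 28 × 28 and 2048 × 8 × 28 × 28, whose fused, flattened length is 2048·10·28·28 = 16056320); only the splice-flatten-fully-connect-threshold flow follows the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small stand-ins for the two branch outputs (real: 2048x2x28x28, 2048x8x28x28).
out1 = rng.normal(size=(8, 2, 4, 4))
out2 = rng.normal(size=(8, 8, 4, 4))

fused = np.concatenate([out1, out2], axis=1)  # splice along the 2nd dimension
vec = fused.reshape(1, -1)                    # flatten to a 1 x D vector

M = 6                                          # six cooking action types
W1 = rng.normal(size=(vec.shape[1], 32))       # placeholder FC weights, pass 1
W2 = rng.normal(size=(32, M))                  # placeholder FC weights, pass 2
logits = np.tanh(vec @ W1) @ W2                # two full-connection passes

probs = np.exp(logits - logits.max())          # softmax -> (P1, ..., P6)
probs = (probs / probs.sum()).ravel()

ACTIONS = ["lid covering", "lid lifting", "dish adding",
           "fast stir-frying", "slow stir-frying", "pan lifting"]
# Argmax with the 0.8 threshold: below it, report a non-cooking action.
if probs.max() >= 0.8:
    predicted = ACTIONS[int(probs.argmax())]
else:
    predicted = "non-cooking action"
```

With trained weights, `probs` would be the vector (P1, ..., P6) of the embodiment and `predicted` the action handed to step S5.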
It should be noted that the kitchen appliance control method based on user cooking action prediction in this embodiment may also be applied to other kitchen appliances such as a range hood or a dishwasher according to actual needs.

Claims (8)

1. The kitchen electrical equipment control method based on user cooking action prediction is characterized by comprising the following steps of:
step S1, a convolutional neural network for predicting the user's cooking action is constructed in advance; the input of the convolutional neural network is the user's cooking action data at the previous moment, and its output is the predicted cooking action of the user at the next moment;
step S2, collecting a cooking action data set of a user at the previous cooking time in advance, and taking the cooking action data set as a cooking action training set;
step S3, inputting the cooking action training set into a convolutional neural network for training, determining each parameter in the convolutional neural network, and taking the convolutional neural network with each determined parameter as a convolutional neural network model for predicting the cooking action of the user at the next moment;
step S4, inputting the collected cooking action data of the user at the current moment into the convolutional neural network model, and predicting the cooking action of the user at the next moment;
in step S5, the kitchen electrical appliance is controlled to execute a corresponding operation based on the predicted cooking operation of the user at the next time.
2. The kitchen electrical appliance control method based on user cooking action prediction according to claim 1, further comprising:
pre-constructing a relation list between each cooking action of a user and a corresponding kitchen electric equipment control instruction;
and in step S5, obtaining a cooking appliance control command corresponding to the predicted cooking action according to the predicted cooking action of the user at the later time and the relationship list constructed in advance, and controlling the cooking appliance to execute the corresponding work by using the obtained cooking appliance control command.
3. The kitchen electrical appliance control method based on user cooking action prediction according to claim 2, characterized in that the user's cooking action data includes lid covering, lid lifting, dish adding, fast stir-frying, slow stir-frying and pan lifting.
4. The kitchen appliance control method based on user cooking action prediction according to claim 2, characterized in that said convolutional neural network has a first input branch, a first output branch corresponding to the first input branch, a second input branch and a second output branch corresponding to the second input branch; wherein the first input branch takes as input the N1 consecutive image frames before a preset moment, and the output result of the first output branch is the user's cooking action at the next moment after the preset moment; the second input branch takes as input the N2 consecutive image frames before the preset moment, and the output result of the second output branch is the user's cooking action at the next moment after the preset moment, with N2 > N1 > 2.
5. The kitchen electrical appliance control method based on user cooking action prediction according to claim 4, further comprising:
carrying out fusion processing on an output result of a first output branch and an output result of a second output branch of the convolutional neural network;
and predicting the cooking action of the user at the later moment according to the result after the fusion processing, and controlling the kitchen electric equipment to execute corresponding work according to the predicted subsequent cooking action of the user.
6. The kitchen electrical equipment control method based on user cooking action prediction according to claim 5, characterized in that the fusion processing uses a parameter-free concatenation method comprising the following steps:
setting the output dimension corresponding to the output result of the first output branch to A1 × B1 × C1 × D1;
setting the output dimension corresponding to the output result of the second output branch to A1 × B2 × C1 × D1;
concatenating the two four-dimensional matrices along the 2nd dimension to obtain a new matrix and a new output dimension; wherein the 2nd dimension of the new matrix is the sum of the lengths of the 2nd dimensions of the two four-dimensional matrices before concatenation, that is, the new output dimension is A1 × B3 × C1 × D1, with B3 = B1 + B2;
performing a flattening operation on the new output dimension to obtain an output vector of dimension 1 × D, where D = A1 · B3 · C1 · D1;
performing fully-connected processing twice on the output vector of dimension 1 × D to obtain an output vector of dimension 1 × M; wherein each value in the output vector of dimension 1 × M characterizes the probability that the output user action is the cooking action cook_m, M is the total number of cooking action types in the cooking action data set collected in step S2, cook_m denotes the m-th of the M cooking action types, and 0 ≤ m ≤ M;
and taking the cooking action corresponding to the maximum of all values in the output vector of dimension 1 × M as the predicted cooking action of the user at the next moment.
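The parameter-free concatenation of claim 6 can be sketched with NumPy. All shapes, the hidden width, the ReLU between the two fully-connected layers, and the softmax at the end are illustrative assumptions not stated in the claim, and the random weights stand in for weights that would be learned during training:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative branch-output shapes: A1 x B1 x C1 x D1 and A1 x B2 x C1 x D1
A1, B1, B2, C1, D1 = 1, 3, 5, 4, 16
out1 = rng.standard_normal((A1, B1, C1, D1))
out2 = rng.standard_normal((A1, B2, C1, D1))

# Concatenate along the 2nd dimension (axis=1): A1 x (B1+B2) x C1 x D1
fused = np.concatenate([out1, out2], axis=1)

# Flatten to a 1 x D row vector, D = A1 * B3 * C1 * D1
vec = fused.reshape(1, -1)
D = A1 * (B1 + B2) * C1 * D1

# Two fully-connected layers down to M class scores (random weights for
# illustration; the ReLU between them is an assumption)
M, H = 7, 64
W1 = rng.standard_normal((D, H)) / np.sqrt(D)
W2 = rng.standard_normal((H, M)) / np.sqrt(H)
scores = np.maximum(vec @ W1, 0.0) @ W2

# Numerically stable softmax over the M cooking-action types
e = np.exp(scores - scores.max())
probs = e / e.sum()

# The predicted cooking action is the index of the maximum value
predicted = int(np.argmax(probs))
print(vec.shape, probs.shape, predicted)
```

Because the fusion is pure concatenation, it adds no trainable parameters; all learned capacity sits in the two fully-connected layers that follow.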
7. The kitchen electrical equipment control method based on user cooking action prediction according to claim 6, wherein when the maximum of all values in the output vector of dimension 1 × M is smaller than a preset threshold, the cooking action of the user at the next moment is determined to be a non-cooking action.
8. The kitchen electrical equipment control method based on user cooking action prediction according to claim 7, characterized in that the preset threshold is 0.8.
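The rejection rule of claims 7 and 8 can be sketched as a thin wrapper over the 1 × M probability vector. The action labels below are made-up placeholders; only the 0.8 threshold comes from claim 8:

```python
import numpy as np

ACTIONS = ["stir", "flip", "season"]  # placeholder cooking-action labels

def decide(probs, threshold=0.8):
    """Return the predicted cooking action, or 'non-cooking action' when
    the maximum probability falls below the threshold (claims 7 and 8)."""
    probs = np.asarray(probs, dtype=float)
    m = int(np.argmax(probs))
    return ACTIONS[m] if probs[m] >= threshold else "non-cooking action"

print(decide([0.05, 0.90, 0.05]))  # confident -> "flip"
print(decide([0.40, 0.35, 0.25]))  # max 0.40 < 0.8 -> "non-cooking action"
```

The threshold keeps the appliance from reacting to low-confidence predictions, trading a missed action for fewer spurious device activations.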
CN202110510069.9A 2021-05-11 2021-05-11 Kitchen electrical equipment control method based on user cooking action prediction Active CN113378637B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110510069.9A CN113378637B (en) 2021-05-11 2021-05-11 Kitchen electrical equipment control method based on user cooking action prediction

Publications (2)

Publication Number Publication Date
CN113378637A true CN113378637A (en) 2021-09-10
CN113378637B CN113378637B (en) 2022-05-17

Family

ID=77572412

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110510069.9A Active CN113378637B (en) 2021-05-11 2021-05-11 Kitchen electrical equipment control method based on user cooking action prediction

Country Status (1)

Country Link
CN (1) CN113378637B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040172380A1 (en) * 2001-09-29 2004-09-02 Xiaolin Zhang Automatic cooking method and system
US20150290795A1 (en) * 2014-02-20 2015-10-15 Mark Oleynik Methods and systems for food preparation in a robotic cooking kitchen
CN105825195A (en) * 2016-03-25 2016-08-03 广州美易来智能电器有限公司 Intelligent cooking behavior identification device and method
CN110609481A (en) * 2019-08-13 2019-12-24 深圳市享往科技有限公司 Cooking method, system and controller
CN110848774A (en) * 2019-10-17 2020-02-28 佛山市云米电器科技有限公司 Kitchen range and air conditioning equipment linkage method and system
CN110989430A (en) * 2019-11-25 2020-04-10 重庆特斯联智慧科技股份有限公司 Smart home linkage method and system and readable storage medium
CN111928308A (en) * 2020-08-13 2020-11-13 宁波方太厨具有限公司 Cooking control method, system, electronic device and readable storage medium
EP3748570A1 (en) * 2018-04-13 2020-12-09 Samsung Electronics Co., Ltd. Refrigerator and method for displaying user interface on refrigerator, user terminal, and method for performing function in user terminal
CN112401630A (en) * 2020-10-23 2021-02-26 华帝股份有限公司 Auxiliary cooking method and device


Similar Documents

Publication Publication Date Title
CN108829723B (en) Interactive intelligent refrigerator health service terminal based on complex network and deep learning
US20200019887A1 (en) Data-driven activity prediction
CN107752794B (en) Baking method and device
EP3730005B1 (en) Method for generating at least one recipe suggestion, kitchen appliance and system for preparing food
CN107991939A (en) Cooking control method and culinary art control device, storage medium and cooking equipment
CN109961102A (en) Image processing method, device, electronic equipment and storage medium
CN109918522A (en) A kind of vegetable cooking learning method and device based on cooking platform
CN111077786A (en) Intelligent household equipment control method and device based on big data analysis
CN111008643A (en) Image classification method and device based on semi-supervised learning and computer equipment
CN113378637B (en) Kitchen electrical equipment control method based on user cooking action prediction
CN110456851A (en) Cooking apparatus control method, device, equipment and storage medium
WO2019196488A1 (en) Method and device for controlling household appliance to execute control instruction
CN107682236A (en) Smart home interactive system and method based on computer picture recognition
EP4029417A1 (en) Method for controlling cooker by using artificial intelligence and system therefor
CN110269605A (en) A kind of electrocardiosignal noise recognizing method based on deep neural network
US10004112B2 (en) Machine learning apparatus and coil electric heating apparatus
CN109700434A (en) Meta-learning model training method, system and equipment based on electrocardiographic diagnosis
CN111539558A (en) Power load prediction method adopting optimized extreme learning machine
CN114720129B (en) Rolling bearing residual life prediction method and system based on bidirectional GRU
CN110456648A (en) Cooking apparatus control method, device, equipment and storage medium
CN115577934A (en) Emergency scheme updating method and device based on multi-agent reinforcement learning
Madhuravani et al. Prediction exploration for coronary heart disease aid of machine learning
CN112401630A (en) Auxiliary cooking method and device
Bhanu et al. Adaptive image segmentation using multi-objective evaluation and hybrid search methods
Mott State classification of cooking objects using a VGG CNN

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant