CN108197635B - Cooking mode display method and device and range hood - Google Patents
- Publication number
- CN108197635B (application CN201711236578.7A)
- Authority
- CN
- China
- Prior art keywords
- food material
- determining
- dish
- image information
- range hood
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F18/24155 — Pattern recognition; classification techniques based on parametric or probabilistic models; Bayesian classification
- G06F18/24147 — Pattern recognition; classification based on distances to training or reference patterns; nearest-neighbor classification
- G06N3/045 — Neural networks; architectures; combinations of networks
- G06Q50/12 — ICT specially adapted for specific business sectors; services; hotels or restaurants
Abstract
The invention discloses a cooking mode display method and device, and a range hood. The method comprises: acquiring first image information of a current food material; determining the type of the food material based on the first image information; determining at least one dish corresponding to the food material according to the type of the food material; and determining the cooking mode of the food material according to the dish, and displaying the cooking mode on the range hood. The invention solves the technical problem that range hoods in the related art have only a single function.
Description
Technical Field
The invention relates to the field of range hoods, in particular to a cooking mode display method and device and a range hood.
Background
At present, the range hood serves a single function: extracting cooking smoke. As users' demands become increasingly personalized, this single function degrades the product experience and also limits the application scenarios of the range hood.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a cooking mode display method and device, and a range hood, which at least solve the technical problem that range hoods in the related art have only a single function.
According to an aspect of an embodiment of the present invention, there is provided a method for displaying a cooking mode, comprising: acquiring first image information of a current food material; determining the type of the food material based on the first image information; determining at least one dish corresponding to the food material according to the type of the food material; and determining the cooking mode of the food material according to the dish, and displaying the cooking mode on the range hood.
Optionally, determining the type of the food material based on the first image information comprises: taking the first image information as input of a first model and determining the type of the food material corresponding to the first image information, wherein the first model is obtained through machine learning training on multiple groups of data in a first database, and each group of data comprises food material image information and the food material type corresponding to that image information.
Optionally, determining at least one dish corresponding to the food material according to the type of the food material comprises: determining a dish list according to the type of the food material, wherein the food material composition of each dish in the dish list comprises a food material of that type; and receiving a selection instruction and selecting a dish corresponding to the food material from the dish list according to the selection instruction.
Optionally, determining at least one dish corresponding to the food material according to the type of the food material comprises: determining a dish list according to the type of the food material, wherein the food material composition of each dish in the dish list comprises a food material of that type; acquiring a user image at the range hood; determining the identity information of the user using the range hood according to the user image; and selecting a dish corresponding to the food material from the dish list based on the historical diet record corresponding to the identity information.
Optionally, determining the identity information according to the user image comprises: taking the user image as input of a second model and determining the identity information corresponding to the user image, wherein the second model is obtained through machine learning training on multiple groups of data in a second database, and each group of data comprises a user image and the identity information corresponding to that user image.
Optionally, the method further includes: acquiring second image information in the cooking process of the food material, wherein the second image information is used for reflecting the cooking state of the food material; and sending the second image information to the terminal equipment.
Optionally, the method further includes: acquiring the cooking state of the food material based on the second image information; and when the cooking state is the designated state, sending a notification message to the terminal device, wherein the notification message is used for notifying the terminal device that the food material is in the designated state.
According to another aspect of the embodiments of the present invention, there is also provided a range hood comprising a control panel and a smoke exhaust tube, the smoke exhaust tube being disposed above the control panel, the range hood further comprising: an image acquisition device, disposed at the bottom of the control panel, for acquiring first image information of a current food material; a processor, for determining the type of the food material based on the first image information, determining at least one dish corresponding to the food material according to the type of the food material, and determining the cooking mode of the food material according to the dish; and a display device, disposed on the control panel, for displaying the cooking mode on the range hood.
An embodiment of the invention further provides a display device for a cooking mode, applied to a range hood, the display device comprising: an acquisition module, for acquiring first image information of a current food material; a first determining module, for determining the type of the food material based on the first image information; a second determining module, for determining at least one dish corresponding to the food material according to the type of the food material; a third determining module, for determining the cooking mode of the food material according to the dish; and a display module, for displaying the cooking mode on the range hood.
Optionally, the first determining module is configured to use the first image information as an input of a first model, and determine a category of a food material corresponding to the first image information, where the first model is obtained through machine learning training by using multiple sets of data in a first database, and each set of data in the multiple sets of data in the first database includes: the food material image information and the food material type corresponding to the food material image information.
According to still another aspect of the embodiments of the present invention, there is provided a storage medium comprising a stored program, wherein, when the program runs, a device on which the storage medium is located is controlled to execute the above method for displaying a cooking mode.
According to another aspect of the embodiments of the present invention, there is provided a processor for running a program, wherein the program, when run, executes the above method for displaying a cooking mode.
In the embodiments of the invention, corresponding dishes are determined according to the type of the food material, and the cooking mode corresponding to the food material is determined from the dish and displayed on the range hood. This expands the application scenarios of the range hood and improves the user experience, thereby solving the technical problem that range hoods in the related art have only a single function.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a schematic structural diagram of a range hood according to an embodiment of the present application;
FIG. 2 is a flowchart of an optional method for displaying a cooking mode according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an optional display device for a cooking mode according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, in order to facilitate understanding of the embodiments of the present invention, some terms or nouns referred to in the present invention will be explained as follows:
pixel: the minimum unit which can be displayed on a computer screen is used for representing the unit of an image, and the unit refers to an array of horizontal and vertical pixels which can be displayed, the more pixel points in the screen are, the higher the resolution of the picture is, and the finer and more vivid the image is; pixel point: refers to the value of the pixel.
Binarization: the method is characterized in that most of pictures shot by a camera are color images, the information content of the color images is huge, the contents of the pictures can be simply divided into foreground and background, the color images are processed firstly, the pictures only have foreground information and background information, the foreground information can be simply defined to be black, the background information is defined to be white, and the pictures are binary images.
CNN: a convolutional neural network, which is used for describing an operation on an input image and outputting a group of probabilities for describing classification or classification of image contents, namely, identifying the input image to output the probability of an object in the image; a more abstract concept is constructed through a series of convolution levels, wherein the concept comprises the steps of establishing a plurality of neurons and establishing corresponding input layers and output layers, so that input nodes are continuously associated through the neurons to obtain an optimized object, generally comprising a convolution layer and a filter layer, a learning cycle is obtained through forward conduction, a loss function, backward conduction and function updating, and a program repeats a fixed number of cycle processes for each training picture to continuously optimize a training learning result.
Search by image: results are ranked through deep learning. The model's ranking loss function is trained on triplet data recorded from users (query picture, clicked picture, non-clicked picture) to obtain a ranking; after an image is input, the model automatically detects the main subject and then outputs results for related objects ordered by ranking score.
Transfer learning: the method is essentially image matching, a model is applied to various fields through migration learning, specifically, a vector representation X of pictures in a database is migrated to an image X1 of other fields through linear transformation, the migration transformation is converted into a nonlinear function through introducing a random Fourier function, and then a required image is obtained.
Naive bayes: the method is characterized in that one picture can be given, object classification can be returned, and the picture is identified as a simple attitude so as to obtain a corresponding object.
Dependency grammar: constructs the relations between a head word and the words that modify it. Dependency grammar has no phrase level; each node corresponds to a word in the sentence, and the relations between words can be processed directly, which facilitates analysis and information extraction.
Decision tree: the method is characterized in that classification is carried out according to characteristics, each node puts forward a question, data are divided into two types, and questions are asked continuously, wherein the questions are learned and trained on the existing data, so that when new data are input, the data are divided into corresponding leaves according to the questions on the tree where the data are located.
Deep learning: the method is a method for learning based on data representation in machine learning, the concept is derived from the research of artificial neural network, and the motivation is to establish and simulate the neural network for analyzing and learning of human brain, which simulates the mechanism of human brain to interpret data such as image, sound and text. By combining low-level features to form more abstract high-level representation attribute categories or features to find distributed feature representations of data, the multi-layer perceptron with multiple hidden layers is a deep learning structure.
KNN algorithm: if a sample belongs to a certain class in the majority of the k most similar samples in feature space (i.e. the nearest neighbors in feature space), then the sample also belongs to this class. In the KNN algorithm, the selected neighbors are all objects that have been correctly classified.
Fig. 1 is a schematic structural diagram of a range hood according to an embodiment of the present application. As shown in fig. 1, the range hood comprises: a control panel 1 and a smoke exhaust tube 2, the smoke exhaust tube 2 being disposed above the control panel 1; an image acquisition device, disposed at the bottom of the control panel 1, for acquiring first image information of a current food material; a processor 3, for determining the type of the food material based on the first image information, determining at least one dish corresponding to the food material according to the type of the food material, and determining the cooking mode of the food material according to the dish; and a display device 4, disposed on the control panel 1, for displaying the cooking mode on the range hood. Optionally, the processor 3 may be disposed inside the control panel 1; it is drawn on the outer surface of the control panel 1 in fig. 1 only for convenience, and those skilled in the art will understand that it is in fact hidden inside the control panel.
According to an embodiment of the present invention, there is provided an embodiment of a method for displaying a cooking mode, which can run in the structure shown in fig. 1 but is not limited thereto. It should be noted that the steps illustrated in the flowcharts may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated, in some cases the steps may be performed in an order different from that presented herein.
Fig. 2 is a flowchart of an optional method for displaying a cooking mode according to an embodiment of the present application. As shown in fig. 2, the method comprises:
step S202, acquiring first image information of the current food material;
in the embodiment of the present application, one or more image acquisition devices (e.g., cameras) may be disposed in designated areas of the room where the range hood is installed to acquire image information of food materials; the position of the camera is not limited in the present application. For example, in an ordinary residence a camera may be placed on the ceiling above the area of the kitchen where the wok sits, and other household appliances (e.g., a refrigerator) in the same area (e.g., the same room) as the range hood may each be provided with a camera.
In an alternative embodiment, the image acquisition device may also be provided on the range hood, for example at a control panel of the range hood, or at an edge position of the range hood.
Step S204, determining the type of the food material based on the first image information;
optionally, there are various implementations of this step. For example, the type of the food material may be determined based on the first image information as follows: taking the first image information as input of a first model and determining the type of the food material corresponding to the first image information, wherein the first model is obtained through machine learning training on multiple groups of data in a first database, and each group of data comprises food material image information and the food material type corresponding to that image information.
The cameras arranged at different positions can each acquire images of their respective areas; when acquiring images, a picture may be taken once every preset period (for example, every minute), and the type of the food material is then analyzed from the images. In an optional embodiment, the operating state of the range hood can be controlled according to the type of the food material or dish: for example, for food materials or dishes that generate more oil smoke or a pungent odor (such as hot pepper), the suction power of the range hood is increased.
It should be noted that the category of the captured image is not limited in this application, and includes but is not limited to black-and-white (grayscale) images and color (RGB) images. When analyzing an image, the information in it can be analyzed by binarized image processing: specifically, the pixels of the image are compared, position by position, with the pixels of a historical image to determine which pixels differ, and the differing pixels are then segmented to obtain image information indicating whether oil smoke is present in the image.
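The pixel-by-pixel comparison with a historical image described above amounts to frame differencing. A minimal sketch, assuming grayscale frames as lists of rows and an illustrative change threshold of 30:

```python
def changed_pixels(current, previous, threshold=30):
    """Compare a grayscale frame with a historical frame pixel by pixel
    and return the (row, col) coordinates whose intensity changed by
    more than `threshold` -- a rough indicator that smoke or food has
    appeared in that part of the scene."""
    return [(i, j)
            for i, row in enumerate(current)
            for j, px in enumerate(row)
            if abs(px - previous[i][j]) > threshold]

prev = [[10, 10], [10, 10]]
curr = [[10, 90], [10, 10]]   # one pixel brightened, e.g., by rising smoke
print(changed_pixels(curr, prev))  # → [(0, 1)]
```

The set of changed pixels can then be binarized and segmented as the passage describes to decide whether oil smoke is present.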
Optionally, the identity information of the user may be determined by, but is not limited to, the following: determining the identity information of the user using the range hood according to a user image, by taking the user image as input of a second model and determining the identity information corresponding to the user image, wherein the second model is obtained through machine learning training on multiple groups of data in a second database, and each group of data comprises a user image and the identity information corresponding to that user image.
Step S206, determining at least one dish corresponding to the food material according to the type of the food material;
in an optional embodiment, step S206 may be implemented in one of the following ways, but is not limited thereto: 1) determining a dish list according to the type of the food material, wherein the food material composition of each dish in the dish list comprises a food material of that type; and receiving a selection instruction and selecting a dish corresponding to the food material from the dish list according to the selection instruction. 2) determining a dish list according to the type of the food material, as in 1); acquiring a user image at the range hood; determining the identity information of the user using the range hood according to the user image; and selecting a dish corresponding to the food material from the dish list based on the historical diet record corresponding to the identity information.
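The two selection paths above (build a dish list from the food-material type, then pick a dish, here by the user's historical diet record) can be sketched as follows. The recipe table, the dish names, and ranking by cook-count are illustrative assumptions:

```python
def dish_list_for(food_type, recipes):
    """All dishes whose food material composition contains the given type."""
    return [dish for dish, ingredients in recipes.items()
            if food_type in ingredients]

def pick_dish(candidates, history):
    """Prefer the candidate the user has cooked most often according to
    the historical diet record; fall back to the first candidate."""
    ranked = sorted(candidates, key=lambda d: -history.get(d, 0))
    return ranked[0] if ranked else None

recipes = {"tomato and egg stir-fry": {"tomato", "egg"},
           "tomato soup": {"tomato"},
           "fried rice": {"rice", "egg"}}
history = {"tomato soup": 5, "fried rice": 2}

candidates = dish_list_for("tomato", recipes)
print(pick_dish(candidates, history))  # → tomato soup
```

Path 1) would simply replace `pick_dish` with the index carried by the user's selection instruction.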
And S208, determining the cooking mode of the food material according to the dishes, and displaying the cooking mode on the range hood.
In order to monitor the cooking process, in an optional embodiment the method may further comprise one of the following: 1) acquiring second image information during cooking of the food material, wherein the second image information reflects the cooking state of the food material, and sending the second image information to the terminal device; 2) acquiring the cooking state of the food material based on the second image information and, when the cooking state is a designated state, sending a notification message to the terminal device to notify it that the food material is in the designated state. Specifically, the camera on the range hood monitors the cooking process of the food material in real time, so the user can check the cooking progress from anywhere in the house through a terminal device such as a mobile phone and prevent the food from burning. Optionally, when the currently detected food image indicates that the dish is ready, the user is notified through a terminal device such as a mobile phone, sparing the user the trouble of repeatedly entering and leaving the kitchen.
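The notify-when-done flow of option 2) reduces to a small check. The state label `"done"` and the `send` callback are stand-ins for the real state classifier and push channel, both assumptions for illustration:

```python
DONE = "done"  # illustrative designated state inferred from the second image information

def check_and_notify(cooking_state, send):
    """If the cooking state inferred from the images is the designated
    state, push a notification message to the user's terminal device.
    `send` is a stand-in for the real push channel (app notification,
    SMS, etc.). Returns True when a notification was sent."""
    if cooking_state == DONE:
        send("The dish is ready.")
        return True
    return False

sent = []
check_and_notify("done", sent.append)
print(sent)  # → ['The dish is ready.']
```

The same hook can forward the second image information itself (option 1) instead of a text message.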
An embodiment of the invention further provides a display device for a cooking mode, applied to a range hood. As shown in fig. 3, the display device comprises: an obtaining module 30, configured to obtain first image information of a current food material; a first determining module 32, configured to determine the type of the food material based on the first image information; a second determining module 34, configured to determine at least one dish corresponding to the food material according to the type of the food material; a third determining module 36, configured to determine the cooking mode of the food material according to the dish; and a display module 38, configured to display the cooking mode on the range hood.
Optionally, the first determining module 32 is configured to use the first image information as an input of a first model, and determine a category of the food material corresponding to the first image information, where the first model is obtained through machine learning training by using multiple sets of data in a first database, and each set of data in the multiple sets of data in the first database includes: the food material image information and the food material type corresponding to the food material image information.
The embodiment of the invention also provides a storage medium comprising a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute the above method for displaying a cooking mode.
The embodiment of the invention also provides a processor, wherein the processor is used for running the program, and the program executes the display method of the cooking mode when running.
In the related art, the user experience is poor in this respect. In addition, when people look up the cooking method of a food material, they often use a terminal device such as a computer or a mobile terminal to query a search engine over the network; if the cooking process of the dish is complex, they must repeatedly return to the terminal device to check it, which wastes the user's time.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also fall within the protection scope of the present invention.
Claims (11)
1. A display method of a cooking mode, characterized by comprising the following steps:
acquiring first image information of a current food material;
determining a category of the food material based on the first image information;
determining at least one dish corresponding to the food material according to the category of the food material;
determining a cooking mode of the food material according to the dish, and displaying the cooking mode on the range hood;
wherein determining at least one dish corresponding to the food material according to the category of the food material comprises the following steps:
determining a dish list according to the category of the food material, wherein the food material composition of each dish in the dish list comprises the food material corresponding to the category of the food material; acquiring a user image of the range hood; determining, according to the user image, identity information of the user using the range hood; and selecting a dish corresponding to the food material from the dish list based on a historical diet record corresponding to the identity information.
2. The method of claim 1, wherein determining the food material category based on the first image information comprises:
the method comprises the steps of taking the first image information as input of a first model, and determining the type of food materials corresponding to the first image information, wherein the first model is obtained by using multiple groups of data in a first database through machine learning training, and each group of data in the multiple groups of data in the first database comprises: the food material image information and the food material type corresponding to the food material image information.
3. The method of claim 1, wherein determining at least one dish corresponding to the food material according to the food material category comprises:
determining a dish list according to the category of the food material, wherein the food material composition of each dish in the dish list comprises the food material corresponding to the category of the food material;
and receiving a selection instruction, and selecting a dish corresponding to the food material from the dish list according to the selection instruction.
4. The method of claim 1, wherein determining identity information of a user using the range hood from the user image comprises:
determining, by taking the user image as input to a second model, the identity information corresponding to the user image, wherein the second model is obtained by machine learning training on multiple sets of data in a second database, and each set of data in the multiple sets of data in the second database comprises: a user image and identity information corresponding to the user image.
5. The method of claim 1, further comprising:
acquiring second image information in the cooking process of the food material, wherein the second image information is used for reflecting the cooking state of the food material; and sending the second image information to a terminal device.
6. The method of claim 5, further comprising:
acquiring the cooking state of the food material based on the second image information;
and when the cooking state is a designated state, sending a notification message to the terminal device, wherein the notification message is used for notifying the terminal device that the food material is in the designated state.
7. A range hood, comprising a control panel and a smoke duct, the smoke duct being arranged above the control panel, characterized by further comprising:
the image acquisition device is arranged at the bottom of the control panel and used for acquiring first image information of the current food material;
a processor, configured to determine a category of the food material based on the first image information, determine at least one dish corresponding to the food material according to the category of the food material, and determine a cooking mode of the food material according to the dish;
a display device, arranged on the control panel and used for displaying the cooking mode on the range hood;
wherein the processor determines the at least one dish corresponding to the food material according to the category of the food material in the following manner: determining a dish list according to the category of the food material, wherein the food material composition of each dish in the dish list comprises the food material corresponding to the category of the food material; acquiring a user image of the range hood; determining, according to the user image, identity information of the user using the range hood; and selecting a dish corresponding to the food material from the dish list based on a historical diet record corresponding to the identity information.
8. A display device of cooking modes is applied to a range hood, and is characterized by comprising:
the acquisition module is used for acquiring first image information of the current food material;
a first determining module, configured to determine a category of the food material based on the first image information;
a second determining module, configured to determine at least one dish corresponding to the food material according to the category of the food material;
a third determining module, configured to determine a cooking mode of the food material according to the dish;
a display module, configured to display the cooking mode on the range hood;
wherein the second determining module determines the at least one dish corresponding to the food material according to the category of the food material in the following manner: determining a dish list according to the category of the food material, wherein the food material composition of each dish in the dish list comprises the food material corresponding to the category of the food material; acquiring a user image of the range hood; determining, according to the user image, identity information of the user using the range hood; and selecting a dish corresponding to the food material from the dish list based on a historical diet record corresponding to the identity information.
9. The apparatus of claim 8, wherein the first determining module is configured to take the first image information as input to a first model and determine the category of the food material corresponding to the first image information, wherein the first model is obtained by machine learning training on multiple sets of data in a first database, and each set of data in the multiple sets of data in the first database comprises: food material image information and the food material category corresponding to that image information.
10. A storage medium comprising a stored program, wherein the program, when executed, controls a device on which the storage medium is located to perform the display method of a cooking mode of any one of claims 1 to 6.
11. A processor, characterized in that the processor is configured to run a program, wherein the program, when running, executes the display method of a cooking mode according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711236578.7A CN108197635B (en) | 2017-11-29 | 2017-11-29 | Cooking mode display method and device and range hood |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711236578.7A CN108197635B (en) | 2017-11-29 | 2017-11-29 | Cooking mode display method and device and range hood |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108197635A CN108197635A (en) | 2018-06-22 |
CN108197635B true CN108197635B (en) | 2020-05-29 |
Family
ID=62573492
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711236578.7A Active CN108197635B (en) | 2017-11-29 | 2017-11-29 | Cooking mode display method and device and range hood |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108197635B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109434844B (en) * | 2018-09-17 | 2022-06-28 | 鲁班嫡系机器人(深圳)有限公司 | Food material processing robot control method, device and system, storage medium and equipment |
CN109446915B (en) * | 2018-09-29 | 2020-12-29 | 口碑(上海)信息技术有限公司 | Dish information generation method and device and electronic equipment |
CN110972346B (en) * | 2018-09-30 | 2021-06-04 | 珠海格力电器股份有限公司 | Data processing method and device |
CN109726685A (en) * | 2018-12-29 | 2019-05-07 | 珠海优特智厨科技有限公司 | A kind of detection food materials launch the method, apparatus and cooking equipment of state |
CN109875381A (en) * | 2019-02-28 | 2019-06-14 | 秒针信息技术有限公司 | Cooking methods and its equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103034675A (en) * | 2011-10-07 | 2013-04-10 | 索尼公司 | Information processing device, information processing server, information processing method, information extracting method and program |
CN103458027A (en) * | 2013-09-02 | 2013-12-18 | 四川长虹电器股份有限公司 | Serving method of range hood |
CN104461501A (en) * | 2014-11-04 | 2015-03-25 | 百度在线网络技术(北京)有限公司 | Cloud intelligent cooking method, cloud intelligent cooking device and cloud server |
CN105627384A (en) * | 2016-03-21 | 2016-06-01 | 黄一宸 | Gas stove based on biological recognition |
CN205447921U (en) * | 2016-03-03 | 2016-08-10 | 沈阳英林特电器有限公司 | Intelligent cooking system |
CN106055648A (en) * | 2016-05-31 | 2016-10-26 | 北京小米移动软件有限公司 | Cookbook determining method and device, and device for determining cookbook |
CN106304453A (en) * | 2016-08-01 | 2017-01-04 | 广东美的厨房电器制造有限公司 | Heat foods control method, equipment and comprise the cooking apparatus of this equipment |
CN106503442A (en) * | 2016-10-21 | 2017-03-15 | 广州视源电子科技股份有限公司 | The recommendation method and apparatus of menu |
CN106773859A (en) * | 2016-12-28 | 2017-05-31 | 九阳股份有限公司 | A kind of intelligent cooking control method |
CN106897661A (en) * | 2017-01-05 | 2017-06-27 | 合肥华凌股份有限公司 | A kind of Weigh sensor method of food materials image, system and household electrical appliance |
Also Published As
Publication number | Publication date |
---|---|
CN108197635A (en) | 2018-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108197635B (en) | Cooking mode display method and device and range hood | |
CN107862018B (en) | Recommendation method and device for food cooking method | |
US11229311B2 (en) | Food preparation system | |
CN107273106B (en) | Object information translation and derivative information acquisition method and device | |
CN109176535B (en) | Interaction method and system based on intelligent robot | |
CN108107886B (en) | Driving control method and device of sweeping robot and sweeping robot | |
CN108052199A (en) | Control method, device and the smoke exhaust ventilator of smoke exhaust ventilator | |
CN108416314B (en) | Picture important face detection method | |
WO2016154598A1 (en) | System and method for adaptive, rapidly deployable, human-intelligent sensor feeds | |
CN111643017B (en) | Cleaning robot control method and device based on schedule information and cleaning robot | |
CN103810284A (en) | Kitchen management method and device | |
CN111063419A (en) | Intelligent healthy diet management system | |
EP3941023A1 (en) | Method for recommending personalized content, graphical user interface and system thereof | |
CN114821236A (en) | Smart home environment sensing method, system, storage medium and electronic device | |
CN109872214A (en) | One key ordering method of food materials, system, electronic equipment and storage medium | |
Wang et al. | Long video question answering: A matching-guided attention model | |
CN108052858A (en) | The control method and smoke exhaust ventilator of smoke exhaust ventilator | |
CN107883520B (en) | Reminding method and device based on air conditioning equipment and terminal | |
CN111062780A (en) | Household appliance recommendation method, storage medium and electronic equipment | |
CN111435541A (en) | Method, device and cooking utensil for obtaining chalkiness of rice grains | |
KR102095592B1 (en) | Method for providing choosing meun service per meal using survival game | |
CN108006902B (en) | Air conditioner control method and device | |
CN116484083A (en) | Dish information display method and device, storage medium and electronic device | |
CN110762943B (en) | Article display method and device and household appliance | |
KR100873373B1 (en) | User Intention Recognizing System and Method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||