CN112084825A - Cooking evaluation method, cooking recommendation method, computer device and storage medium - Google Patents


Info

Publication number
CN112084825A
Authority
CN
China
Prior art keywords
cooking
image
evaluation
evaluated
sample
Prior art date
Legal status
Granted
Application number
CN201910517269.XA
Other languages
Chinese (zh)
Other versions
CN112084825B (en)
Inventor
黄源甲
龙永文
周宗旭
陈必东
黄宇华
Current Assignee
Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co Ltd
Original Assignee
Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co Ltd
Priority date
Filing date
Publication date
Application filed by Foshan Shunde Midea Electrical Heating Appliances Manufacturing Co Ltd
Priority to CN201910517269.XA
Publication of CN112084825A
Application granted
Publication of CN112084825B
Legal status: Active (Current)
Anticipated expiration of legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the invention provide a cooking evaluation method, a cooking recommendation method, a computer device and a storage medium. The method comprises: acquiring a cooking image to be evaluated; when the display parameters of the cooking image to be evaluated meet a preset value condition, evaluating the cooking image through a trained cooking evaluation model, the training data of which comprise cooking sample images and corresponding evaluation marks; and returning the evaluation to the corresponding terminal. By scoring the cooking result through image recognition of the cooking result image, the evaluation of the cooking result can be quantified, and the quantified evaluation result objectively reflects the quality of the cooking result.

Description

Cooking evaluation method, cooking recommendation method, computer device and storage medium
Technical Field
The present invention relates to cooking processing technology, and more particularly, to a cooking evaluation method, a cooking recommendation method, a computer device, and a storage medium.
Background
After traditional cooking is finished, there is no effective way to evaluate the quality of the cooking result; it can only be judged manually, by observing or tasting the food, so the resulting evaluation is often subjective.
Disclosure of Invention
The embodiment of the invention provides a cooking evaluation method, a cooking recommendation method, computer equipment and a storage medium, which can objectively evaluate a cooking result of a user.
The technical scheme of the embodiment of the invention is realized as follows:
a cooking evaluation method comprising: acquiring a cooking image to be evaluated; when the display parameters of the cooking image to be evaluated meet preset value conditions, evaluating the cooking image to be evaluated through a trained cooking evaluation model; the training data of the cooking evaluation model comprise cooking sample images and corresponding evaluation marks; and returning the evaluation to the corresponding terminal.
A cooking recommendation method comprising: acquiring a cooking image and corresponding cooking parameters; when the display parameters of the cooking image meet preset value conditions, evaluating the cooking image through a trained cooking evaluation model; the training data of the cooking evaluation model comprise cooking sample images and corresponding evaluation marks; and sequencing according to the evaluation result of the cooking images, and recommending the cooking parameters corresponding to the target cooking images which accord with the set conditions in the sequencing to the corresponding terminals.
A cooking evaluation device comprising: the first acquisition module is used for acquiring a cooking image to be evaluated; the first evaluation module is used for evaluating the cooking image to be evaluated through the trained cooking evaluation model when the display parameter of the cooking image to be evaluated meets the preset value-taking condition; the training data of the cooking evaluation model comprise cooking sample images and corresponding evaluation marks; and the sending module is used for returning the evaluation to the corresponding terminal.
A cooking recommendation device comprising: the second acquisition module is used for acquiring the cooking image and the corresponding cooking parameter; the second evaluation module is used for evaluating the cooking image through the trained cooking evaluation model when the display parameter of the cooking image meets the preset value-taking condition; the training data of the cooking evaluation model comprise cooking sample images and corresponding evaluation marks; and the recommending module is used for sequencing according to the evaluation result of the cooking images and recommending the cooking parameters corresponding to the target cooking images which meet the set conditions in the sequencing to the corresponding terminal.
A computer device, comprising: a processor and a memory for storing a computer program capable of running on the processor; when the computer program is run, the processor is configured to implement the cooking evaluation method according to any embodiment of the present invention or the cooking recommendation method according to any embodiment of the present invention.
A storage medium storing a computer program which, when executed by a processor, implements a cooking evaluation method according to any one of the embodiments of the present invention or a cooking recommendation method according to any one of the embodiments of the present invention.
In the embodiments of the invention, the cooking image to be evaluated is acquired and, when its display parameters are determined to meet the preset value condition, evaluated through the trained cooking evaluation model. Scoring the cooking result image by image recognition quantifies the evaluation of the cooking result, and the quantified evaluation result objectively reflects the quality of the cooking result.
Drawings
FIG. 1 is a schematic view of an application scenario of a cooking evaluation method according to an embodiment of the present invention;
FIG. 2 is a timing diagram of a cooking evaluation method in an embodiment of the present invention;
FIG. 3 is a timing diagram of a cooking evaluation method according to another embodiment of the present invention;
FIG. 4 is a flow chart of a cooking evaluation method in an embodiment of the present invention;
FIG. 5 is a schematic diagram of a neuron model of a BP neural network according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a neural network model according to an embodiment of the present invention;
FIG. 7 is a schematic view of a cooking evaluation device according to an embodiment of the present invention;
FIG. 8 is a flowchart illustrating a cooking recommendation method according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a cooking recommendation device in an embodiment of the present invention;
FIG. 10 is a frame diagram of a training model for a cooking evaluation model in an embodiment of the present invention;
fig. 11 is a flowchart of a cooking evaluation and recommendation method in an embodiment of the invention.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Embodiments of the present invention provide a cooking evaluation method, a cooking recommendation method, a cooking evaluation device implementing the cooking evaluation method, a cooking recommendation device implementing the cooking recommendation method, a computer device, and a storage medium storing an executable computer program for implementing the cooking evaluation method or the cooking recommendation method. The cooking evaluation method and the cooking recommendation method in the embodiments of the present invention are implemented in the same environment; for both methods, the embodiments provide schemes implemented on the terminal side and/or the server side, described below through an exemplary implementation scenario of the cooking evaluation method.
As shown in fig. 1, an optional application scenario for implementing the cooking evaluation method provided by the embodiment of the present invention, in which the cooking image to be evaluated is evaluated on a server, includes terminals and a server 300 connected by network communication, where the terminals include a cooking device terminal 200 and a user terminal 100. As an alternative embodiment, referring to fig. 2, with a client of a cooking evaluation application installed on the user terminal, the cooking evaluation method may include the following steps: S11, the user captures an image of the cooked food in the cooking device terminal through the user terminal as the cooking image to be evaluated; S12, the user uploads the cooking image to be evaluated in the cooking evaluation application; S13, the server acquires the cooking image to be evaluated and, when the display parameter of the image meets the preset value condition, evaluates it through the trained cooking evaluation model; S14, the evaluation is returned to the client corresponding to the user.
As another alternative embodiment, referring to fig. 3, the cooking evaluation method may include the following steps: S21, the cooking device terminal captures an image of the cooked food as the cooking image to be evaluated and sends it to the server; S22, the server acquires the cooking image to be evaluated and, when the display parameter of the image meets the preset value condition, evaluates it through the trained cooking evaluation model; S23, the server returns the evaluation result to the user terminal corresponding to the cooking device terminal.
Referring to fig. 4, a flowchart of a cooking evaluation method according to an embodiment of the present invention is shown. The method is applicable to the server shown in fig. 1; it should be noted that it may also be executed by other computer devices, such as a terminal. The steps are described below.
Step 101, obtaining a cooking image to be evaluated.
Here, the cooking image to be evaluated refers to an image of the cooked food. Acquiring the cooking image to be evaluated may mean that, after the cooking device terminal finishes cooking, an image acquisition device installed on the cooking device terminal captures an image of the cooked food as the cooking image to be evaluated, and the server acquires the image sent by the cooking device terminal; or that, after cooking is finished, the user captures an image of the cooked food as the cooking image to be evaluated through an image acquisition device on a user terminal, and the server acquires the image uploaded by the user terminal.
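As a minimal illustration of this acquisition step, the sketch below shows how a server might receive an uploaded cooking image over HTTP. The framework (Flask), the endpoint name, and the downstream evaluate_cooking_image helper are assumptions for illustration only; the patent does not specify an interface.

```python
# Minimal sketch of server-side image acquisition (assumed Flask endpoint; names are illustrative).
from flask import Flask, request, jsonify

app = Flask(__name__)

def evaluate_cooking_image(image_bytes):
    # Placeholder: the real implementation would run the trained cooking
    # evaluation model described in steps 103 and 105.
    return 0.0

@app.route("/cooking-images", methods=["POST"])
def receive_cooking_image():
    # The cooking device terminal or the user app posts the cooked-food image.
    image_file = request.files.get("image")
    if image_file is None:
        return jsonify({"error": "no image provided"}), 400
    score = evaluate_cooking_image(image_file.read())
    # Return the evaluation to the corresponding terminal (step 105).
    return jsonify({"score": score})

if __name__ == "__main__":
    app.run()
```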
Step 103, evaluating the cooking image to be evaluated through a trained cooking evaluation model when determining that the display parameter of the cooking image to be evaluated meets a preset value-taking condition; the training data of the cooking evaluation model comprises cooking sample images and corresponding evaluation marks.
Here, the cooking evaluation model refers to a model that determines an evaluation result for the cooked food in the cooking image to be evaluated by performing image recognition on that image. The cooking evaluation model may be a trained neural network model whose training data include cooking sample images and corresponding evaluation marks. The server acquires the cooking image to be evaluated, performs image recognition on it through the trained cooking evaluation model, extracts its image features, and determines its score from the similarity between the extracted features and the features that the sample images corresponding to different evaluations were mapped to in the sample label space during training. As an alternative embodiment, the evaluation mark is the score value of the cooked food in the cooking sample image. Using a score value as the evaluation result, determined by image recognition through the trained neural network model, quantifies the evaluation of the cooked food and yields a more objective and accurate result.
Step 105, returning the evaluation to the corresponding terminal.
Here, the terminal may refer to a cooking device terminal or a user terminal, and may also include both the cooking device terminal and the user terminal. The cooking equipment terminal refers to equipment for cooking food, such as an electric cooker, an electric stewpan, a microwave oven and other cooking equipment. The user terminal may be a mobile phone terminal, a tablet computer, a personal computer, etc. The user receives the evaluation result of the server on the cooked food through the user terminal, so that the user can conveniently know whether the cooked food is good or not objectively according to the evaluation result.
In the embodiment of the invention, the cooking image to be evaluated is acquired and, when its display parameter meets the preset value condition, evaluated through the trained cooking evaluation model. Scoring the cooking result image by image recognition quantifies the evaluation of the cooking result, so that the quality of the cooking result is objectively reflected by the quantified evaluation result and the user can readily judge from it whether improvement is needed.
In some embodiments, before evaluating the cooking image to be evaluated through the trained cooking evaluation model, the method further includes:
acquiring a first cooking sample image obtained through a standard cooking program and a second cooking sample image obtained through user cooking determined by setting a screening condition;
training an initial neural network model based on a training data set which is formed by combining the first cooking sample image and the second cooking sample image carrying corresponding evaluation marks;
and obtaining the trained cooking evaluation model until the loss function of the neural network model meets the convergence condition.
Here, before the cooking image to be evaluated is evaluated by the trained cooking evaluation model, the method further comprises training the cooking evaluation model. The training data for the cooking evaluation model include sample images from two sources: first, first cooking sample images obtained through a standard cooking program; second, second cooking sample images obtained from user cooking and selected by a set screening condition. The standard cooking program may be a cooking program predetermined by professional cooks based on professional cooking experience. The second cooking sample images may be obtained by screening collected images of food cooked by users against preset screening conditions and taking the images that meet those conditions as second cooking sample images.
The initial neural network model is trained on a training data set that combines the first cooking sample images obtained through the standard cooking program with the second cooking sample images obtained from user cooking and selected by the screening condition. The second cooking sample images more accurately reflect the cooking results obtained in the real scenarios of ordinary users, so adding user-cooked images as sample images in the training further improves the image-recognition accuracy of the trained neural network model.
The initial neural network model may be a neural network model pre-trained on a known image data set, and it may be a BP neural network model, a convolutional neural network model, or a variant of either. The basic building block of a BP neural network is the neuron. Fig. 5 shows a typical neuron model, where x1, x2, ..., xm denote the inputs, ω1, ω2, ..., ωm denote the synaptic weights, Σ denotes the summing node, f(·) denotes the activation function, and y denotes the output. As shown in fig. 6, a plurality of such neurons connected according to certain rules form the neural network model, where n corresponds to the input layer, n1 to ns to the intermediate layers, and m to the output layer. As can be seen from fig. 5 and 6, a BP neural network model mainly comprises an input layer, hidden (intermediate) layers, and an output layer: the number of input-layer neurons equals the dimension of the input data, the number of output-layer neurons equals the number of values to be fitted, and the number and size of the hidden layers are set according to the actual training target. A convolutional neural network mainly comprises convolutional layers, pooling layers, and fully connected layers. A convolutional layer performs the image convolution operation: a convolution kernel is convolved with the corresponding image region to obtain a value, and the kernel is then moved across the image to convolve the whole image. A pooling layer may, for example, represent each 2×2 region by its maximum value. The fully connected layers carry out the learning: the distributed feature vectors learned from the training set are mapped into a unified sample label space to obtain the weights of the neural network model.
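The following sketch shows one way a network of this kind could be laid out. The use of PyTorch, the layer sizes, the 224×224 input, and treating each score value as an output class are illustrative assumptions, not the patent's specification.

```python
# Sketch of a small convolutional scoring network (assumed architecture; PyTorch for illustration).
import torch
import torch.nn as nn

class CookingScoreNet(nn.Module):
    def __init__(self, num_score_classes=10):
        super().__init__()
        # Convolution + pooling layers extract image features, as described above.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # keep the max of each 2x2 region
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Fully connected layers map the learned features to the sample label space
        # (here: one output class per score value).
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 128), nn.ReLU(),
            nn.Linear(128, num_score_classes),
        )

    def forward(self, x):  # x: (batch, 3, 224, 224)
        return self.classifier(self.features(x))
```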
Training the neural network model mainly involves loading the training set and training the model parameters. Loading the training set means inputting the training data set, constructed by combining the first cooking sample images and the second cooking sample images carrying the corresponding evaluation marks, into the initial neural network model for iterative training: the cost is computed by forward propagation using the label information and the cost function, and the gradient of the cost function is propagated backward to update the parameters of each layer and adjust the weights of the initial neural network model, until the loss function of the neural network model meets the convergence condition and the trained neural network model is obtained.
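A minimal training loop matching this description might look as follows; the optimizer choice, learning rate, and convergence tolerance are illustrative assumptions.

```python
# Sketch of the training loop: forward pass, cost computation, backward propagation,
# parameter update, repeated until the loss meets a convergence condition.
import torch
import torch.nn as nn

def train_cooking_model(model, train_loader, max_epochs=50, tol=1e-3):
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    prev_loss = float("inf")
    for epoch in range(max_epochs):
        epoch_loss = 0.0
        for images, score_labels in train_loader:   # first + second cooking sample images
            optimizer.zero_grad()
            logits = model(images)                   # forward propagation
            loss = criterion(logits, score_labels)   # cost from the evaluation marks
            loss.backward()                          # back-propagate the gradient
            optimizer.step()                         # adjust layer weights
            epoch_loss += loss.item()
        epoch_loss /= max(len(train_loader), 1)
        if abs(prev_loss - epoch_loss) < tol:        # convergence condition on the loss
            break
        prev_loss = epoch_loss
    return model
```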
The loss function, also called the cost function, is the objective function of neural network optimization; training or optimizing a neural network is the process of minimizing the loss function, and the smaller the loss value, the closer the predicted results are to the true results. In a specific embodiment, the loss function of the neural network model may be formed from a cross-entropy cost function, which approaches zero as the actual output approaches the expected output; when the sigmoid function is used as the neuron activation function, using the cross-entropy cost as the loss function can speed up training.
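For reference, the cross-entropy cost mentioned above has the standard textbook form (not quoted from the patent):

```latex
% Cross-entropy cost over n training samples, with label y and sigmoid output a = \sigma(z)
C = -\frac{1}{n}\sum_{x}\bigl[\, y \ln a + (1 - y)\ln(1 - a) \,\bigr]
```

Its gradient with respect to the weighted input is simply a - y, so the weight update is not damped by the sigmoid derivative; this is why the pairing of sigmoid activation and cross-entropy cost tends to speed up training.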
In some embodiments, the acquiring a second cooking sample image obtained by the user cooking determined by setting the filtering condition includes:
acquiring a cooking image obtained by cooking of a user and feedback data corresponding to the cooking image;
determining a cooking image with a feedback result meeting the requirement as a second cooking sample image according to the feedback data; wherein the feedback result meeting the requirement comprises at least one of the following: the score value is higher than a preset score, the recommendation number is higher than a corresponding threshold, and the like number is higher than a corresponding threshold.
Here, the set screening condition means that the feedback result determined from the feedback data meets the requirement. The feedback data corresponding to a cooking image may include the user's rating of the cooking image, the number of likes the cooking image received, the user's score for the cooking image, and so on. The feedback data may include evaluation feedback from the user who cooked the corresponding food, and may also include evaluation feedback from other users after the cooking image has been uploaded to a public platform. A feedback result that meets the requirement includes at least one of the following: the score value is higher than a preset score, the recommendation count is higher than a corresponding threshold, or the like count is higher than a corresponding threshold. The preset score and the thresholds for the recommendation count and the like count can be adjusted according to the actual situation, so that a sufficient quantity of sample image data within a reasonable recognition-accuracy range is obtained.
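A minimal sketch of this screening step is shown below; the record layout and the concrete threshold values are assumptions, since the patent only requires them to be configurable.

```python
# Sketch of screening user-cooked images by feedback data (threshold values are illustrative).
MIN_SCORE = 4.0        # preset score
MIN_RECOMMENDS = 50    # threshold for the recommendation count
MIN_LIKES = 100        # threshold for the like count

def select_second_sample_images(user_records):
    """user_records: iterable of dicts like
    {"image": ..., "score": 4.5, "recommends": 80, "likes": 120}."""
    samples = []
    for record in user_records:
        meets_requirement = (
            record.get("score", 0) > MIN_SCORE
            or record.get("recommends", 0) > MIN_RECOMMENDS
            or record.get("likes", 0) > MIN_LIKES
        )
        if meets_requirement:  # at least one feedback condition is satisfied
            samples.append(record["image"])
    return samples
```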
In some embodiments, the determining that the display parameter of the cooking image to be evaluated satisfies a preset value condition includes:
and detecting an imaging area of the cooked food in the cooking image to be evaluated, and determining that the size of the cooking image to be evaluated meets the size requirement.
Here, size is used as the display parameter of the image, and the size of the imaging area of the cooked food is used as the value condition. The cooking image to be evaluated is resized so that the imaging area of the cooked food it contains meets the size requirement. Taking a rice picture as the cooking image and rice as the cooked food, the imaging area of the cooked food is the imaging area of the rice, and the cooking images are resized so that the rice imaging areas in different original images have the same size, for example 300 pixels. Resizing the cooking images so that the imaging areas of the cooked food are consistent unifies the size of the cooked food across different images to be evaluated, prevents the recognition accuracy of the subsequent neural network model from being degraded by an imaging area that is too small, and avoids recognition errors caused by cooked food of differing sizes.
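A minimal sketch of this normalization step, assuming OpenCV and an already-detected bounding box for the food region (the detection itself is out of scope here); the 300-pixel target follows the example above:

```python
# Sketch of normalizing the imaging area of the cooked food before evaluation.
import cv2

TARGET_FOOD_SIZE = 300  # desired side length of the food region, in pixels

def normalize_food_region(image, food_bbox):
    """image: BGR array; food_bbox: (x, y, w, h) of the detected cooked-food region."""
    x, y, w, h = food_bbox
    scale = TARGET_FOOD_SIZE / max(w, h)
    # Scale the whole image so the food region reaches the target size.
    resized = cv2.resize(image, None, fx=scale, fy=scale,
                         interpolation=cv2.INTER_LINEAR)
    return resized
```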
In some embodiments, the evaluating the cooking image to be evaluated through a trained cooking evaluation model includes:
extracting feature data of corresponding dimensionality of the cooking image to be evaluated through the trained cooking evaluation model;
and judging the probability that the cooking food contained in the cooking image to be evaluated is the corresponding cooking sample image according to the similarity of the extracted feature data and the feature data of the corresponding dimensionality mapped to the sample mark space by the sample images corresponding to different evaluation marks in the model training process, and determining the score value of the cooking image to be evaluated.
Here, the extracted feature data may include image features of multiple dimensions, such as color features, texture features, shape features, and HOG features. When the trained neural network model recognizes the cooking image to be evaluated, it extracts image features of multiple dimensions from the image and judges, from the similarity between the extracted feature data and the feature data of the corresponding dimensions that the sample images with different evaluation marks were mapped to in the sample label space during training, the probability that the cooked food contained in the image corresponds to each cooking sample image. That is, the cooking evaluation model combines the similarity between the multi-dimensional image features of the cooking image to be evaluated and those of each sample image with the weights obtained from training the neural network model, determines the matching sample image from the maximum of these probabilities, and thus determines the score value of the cooking image to be evaluated.
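A minimal sketch of this scoring step, assuming the PyTorch-style classification network sketched earlier and a softmax over score classes (both assumptions, not the patent's prescribed implementation):

```python
# Sketch of scoring a cooking image with the trained model: the class with the highest
# probability is taken as the matching sample category, and its evaluation mark is the score.
import torch
import torch.nn.functional as F

def score_cooking_image(model, image_tensor, score_values):
    """image_tensor: (1, 3, H, W) preprocessed image;
    score_values: list mapping each output class to its evaluation mark, e.g. [1, 2, ..., 10]."""
    model.eval()
    with torch.no_grad():
        logits = model(image_tensor)
        probs = F.softmax(logits, dim=1)              # probability per sample category
        best_class = int(torch.argmax(probs, dim=1))  # maximum-probability match
    return score_values[best_class]
```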
In another aspect, an embodiment of the present invention further provides a cooking evaluation apparatus, which may be implemented by a terminal side or a server side, and as for a hardware structure of the cooking evaluation apparatus, please refer to fig. 7, which is a schematic diagram of an alternative hardware structure of the cooking evaluation apparatus according to the embodiment of the present invention, and the cooking evaluation apparatus may be a computer device, a server, or the like. The cooking evaluation device includes: at least one first processor 101, a first memory 102. Optionally, the cooking evaluation device further comprises at least one network interface 104 and a user interface 106, the various components contained in the device being coupled together by a bus system 105. As will be appreciated, the bus system 105 is used to enable communications among the components. The bus system 105 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as a bus system in fig. 7.
The user interface 106 may include, among other things, a display, a keyboard, a mouse, a trackball, a click wheel, a key, a button, a touch pad, or a touch screen.
It will be appreciated that the first memory 102 can be volatile memory or nonvolatile memory, and can include both. The nonvolatile memory may be a Read-Only Memory (ROM) or a Programmable Read-Only Memory (PROM); the volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM) and Synchronous Static Random Access Memory (SSRAM). The memory described in the embodiments of the present invention is intended to comprise, without being limited to, these and any other suitable types of memory.
The first memory 102 in the embodiment of the present invention is used to store various kinds of data to support the operation of the cooking evaluation device. Examples of such data include: any executable program for operating on the cooking evaluation device, such as an operating system and application programs; a training data set; a cooking image to be evaluated, and so on. The operating system includes various system programs, such as a framework layer, a core library layer, and a driver layer, used to implement various basic services and handle hardware-based tasks. The application programs may include various applications, such as a Media Player and a Browser, used to implement various application services. The cooking evaluation method provided by the embodiment of the present invention can be included in an application program.
The method disclosed in the above embodiments of the present invention may be applied to, or implemented by, the first processor 101. In an exemplary embodiment, the cooking evaluation device may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), or Complex Programmable Logic Devices (CPLDs) to perform the aforementioned methods.
In an exemplary embodiment, please refer to fig. 7, which is a schematic structural diagram of a cooking evaluation device according to an embodiment of the present invention, the cooking evaluation device includes: the first acquisition module 11 is used for acquiring a cooking image to be evaluated; the first evaluation module 12 is configured to evaluate the cooking image to be evaluated through a trained cooking evaluation model when it is determined that the display parameter of the cooking image to be evaluated meets a preset value-taking condition; the training data of the cooking evaluation model comprise cooking sample images and corresponding evaluation marks; and a sending module 13, configured to return the evaluation to the corresponding terminal.
In some embodiments, the cooking evaluation device further comprises a training module including a sample unit for acquiring a first cooking sample image obtained by a standard cooking procedure and a second cooking sample image obtained by user cooking determined by setting a filtering condition; the training unit is used for training an initial neural network model based on a training data set which is formed by combining the first cooking sample image and the second cooking sample image carrying corresponding evaluation marks; and obtaining the trained cooking evaluation model until the loss function of the neural network model meets the convergence condition.
In some embodiments, the sample unit is further configured to obtain a cooking image obtained by cooking by a user and feedback data corresponding to the cooking image; determining a cooking image with a feedback result meeting the requirement as a second cooking sample image according to the feedback data; wherein the feedback result meeting the requirement comprises at least one of the following: the score value is higher than a preset score, the recommendation number is higher than a corresponding threshold, and the like number is higher than a corresponding threshold.
In some embodiments, the first evaluation module 12 is further configured to detect an imaging area of the cooking food in the cooking image to be evaluated, and determine that the size of the cooking image to be evaluated meets the size requirement.
In some embodiments, the first evaluation module 12 is further configured to extract feature data of corresponding dimensions of the cooking image to be evaluated through the trained cooking evaluation model; and judging the probability that the cooking food contained in the cooking image to be evaluated is the corresponding cooking sample image according to the similarity of the extracted feature data and the feature data of the corresponding dimensionality mapped to the sample mark space by the sample images corresponding to different evaluation marks in the model training process, and determining the score value of the cooking image to be evaluated.
It should be noted that: the cooking evaluation device provided in the above embodiment is only exemplified by the division of the program modules when evaluating the cooking image to be evaluated, and in practical applications, the processing distribution may be completed by different program modules according to needs, that is, the internal structure of the device may be divided into different program modules to complete all or part of the processing described above. In addition, the cooking evaluation device and the cooking evaluation method provided by the embodiment belong to the same concept, and specific implementation processes are described in the method embodiment and are not described again.
In another aspect of the embodiments of the present invention, please refer to fig. 8, which further provides a cooking recommendation method, including:
Step 201, acquiring a cooking image and corresponding cooking parameters.
here, the cooking image may be an image of cooked food. By collecting the image of the cooked food of the user and the corresponding cooking parameter during cooking the food, the cooking parameter corresponding to the cooking image with a good evaluation result is recommended to the user after the cooking image is evaluated, and the user can learn the operation mode of the cooked food conveniently. As an optional embodiment, the acquiring the cooking image and the corresponding cooking parameter may include: after a user starts a cooking equipment terminal, the cooking equipment terminal collects cooking parameters corresponding to cooking operation of the user and sends the cooking parameters to a server; after cooking is finished, the cooking equipment terminal collects cooking images of cooked food through the image collecting device and sends the cooking images to the server.
Step 203, evaluating the cooking image through a trained cooking evaluation model when determining that the display parameters of the cooking image meet preset value-taking conditions; the training data of the cooking evaluation model comprises cooking sample images and corresponding evaluation marks.
Here, the evaluation mark may be the score value of the cooked food in the sample image. The cooking evaluation model refers to a model that determines an evaluation result for the cooked food in a cooking image by performing image recognition on that image. The cooking evaluation model may be a trained neural network model whose training data include cooking sample images and corresponding evaluation marks. The server acquires the cooking image, performs image recognition on it through the trained cooking evaluation model, extracts its image features, and determines its score from the similarity between the extracted features and the features that the sample images corresponding to different evaluations were mapped to in the sample label space during training. Using a score value as the evaluation result, determined by image recognition through the trained neural network model, quantifies the evaluation of the cooked food and yields a more objective and accurate result.
Step 205, sorting according to the evaluation results of the cooking images, and recommending the cooking parameters corresponding to the target cooking images meeting the set conditions in the sorting to the corresponding terminals.
The server ranks the cooking images according to their evaluation results, which may mean that the server sorts the cooking images by score within each category of cooked food, so that the cooking parameters corresponding to the top-ranked cooking images can be recommended to the user for reference when the user cooks food of the corresponding category.
In the embodiment of the invention, the cooking image and the cooking parameters are acquired and, when the display parameters of the cooking image meet the preset value condition, the cooking image is evaluated through the trained cooking evaluation model. Scoring the cooking result image by image recognition quantifies the evaluation of the cooking result; the quantified result objectively reflects the quality of the cooking result, and the cooking parameters corresponding to the cooked food can then be determined from the evaluation results and recommended to the user for reference.
In some embodiments, the ranking according to the evaluation result of the cooking images, and recommending the cooking parameters corresponding to the target cooking images meeting the setting conditions in the ranking to a corresponding terminal, includes:
respectively sequencing according to different cooking food types according to the evaluation result of the cooking image;
when a cooking request sent by a terminal is acquired, the type of currently cooked food is determined according to the cooking request, a target cooking image which corresponds to the type of the cooking food and meets set conditions in the sequence is determined, and cooking parameters corresponding to the target cooking image are recommended to the client.
Acquiring the cooking request sent by the terminal may mean that, after the cooking device terminal is started, it determines the type of food to be cooked according to the cooking mode selected by the user and sends a cooking request carrying that food type to the server; or that, after being started, the cooking device terminal collects images of the food materials placed into it by the user together with its operating parameters, determines the type of food to be cooked by image analysis of the food material images, and sends a cooking request carrying that food type to the server.
In the embodiment of the invention, the server can acquire cooking images and their corresponding cooking parameters, perform image recognition on the cooking images to determine their scores, and store the correspondence between cooking images, cooking parameters, and scores. The cooking images belonging to each type of cooked food are sorted by score to form a ranking for that type. When a cooking request sent by a terminal is acquired, the server determines the type of food to be cooked from the request, determines the target cooking image that matches that food type and meets the set condition in the ranking, and recommends the cooking parameters corresponding to the target cooking image to the terminal.
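A minimal sketch of this ranking-and-recommendation logic; the record layout and the per-food-type grouping key are assumptions for illustration.

```python
# Sketch of grouping scored cooking records by food type, ranking them by score,
# and recommending the cooking parameters of the top-ranked image(s) for a requested type.
from collections import defaultdict

def build_rankings(records):
    """records: iterable of dicts like
    {"food_type": "rice", "score": 8.5, "cooking_params": {...}}."""
    rankings = defaultdict(list)
    for record in records:
        rankings[record["food_type"]].append(record)
    for food_type in rankings:
        rankings[food_type].sort(key=lambda r: r["score"], reverse=True)
    return rankings

def recommend_params(rankings, requested_food_type, top_k=1):
    ranked = rankings.get(requested_food_type, [])
    # Return the cooking parameters of the best-scored image(s) for this food type.
    return [r["cooking_params"] for r in ranked[:top_k]]
```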
In some embodiments, the evaluating the cooking image to be evaluated through a trained cooking evaluation model includes:
extracting feature data of corresponding dimensionality of the cooking image through the trained cooking evaluation model;
and judging the probability that the cooking food contained in the cooking image to be evaluated is the corresponding cooking sample image according to the similarity of the extracted feature data and the feature data of the corresponding dimensionality mapped to the sample mark space by the sample images corresponding to different evaluation marks in the model training process, and determining the score value of the cooking image to be evaluated.
The extracted feature data may include image features of multiple dimensions, such as color features, texture features, shape features, and HOG features. When the trained neural network model recognizes the cooking image, it extracts image features of multiple dimensions from the image and judges, from the similarity between the extracted feature data and the feature data of the corresponding dimensions of the sample images with different evaluation marks used during training, the probability that the cooked food contained in the cooking image corresponds to each cooking sample image. That is, the cooking evaluation model combines the similarity between the multi-dimensional image features of the cooking image and those of each sample image with the weight parameters obtained from training the neural network model, determines the matching sample image from the maximum of these probabilities, and thus determines the score value of the cooking image.
In another aspect, an embodiment of the present invention further provides a cooking recommendation device, which may be implemented on the terminal side or the server side. For the hardware structure of the cooking recommendation device, please refer to fig. 9, a schematic diagram of an optional hardware structure of the cooking recommendation device provided by the embodiment of the present invention; the cooking recommendation device may be a computer device, a server, or the like. The cooking recommendation device includes: at least one second processor 201 and a second memory 202. Optionally, the cooking recommendation device further comprises at least one network interface 204 and a user interface 206, and the components of the device are coupled together by a bus system 205. As will be appreciated, the bus system 205 is used to enable communication among these components. In addition to a data bus, the bus system 205 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are labeled as the bus system in fig. 9.
The user interface 206 may include, among other things, a display, a keyboard, a mouse, a trackball, a click wheel, keys, buttons, a touch pad, or a touch screen.
It will be appreciated that the second memory 202 can be volatile memory or nonvolatile memory, and can include both. The nonvolatile memory may be a Read-Only Memory (ROM) or a Programmable Read-Only Memory (PROM); the volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM) and Synchronous Static Random Access Memory (SSRAM). The memory described in the embodiments of the present invention is intended to comprise, without being limited to, these and any other suitable types of memory.
The second memory 202 in the embodiment of the present invention is used to store various kinds of data to support the operation of the cooking recommendation device. Examples of such data include: any executable program for operating on the cooking recommendation device, such as an operating system and an application program; a training data set; cooking images, etc.; the operating system includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application programs may include various application programs such as a Media Player (Media Player), a Browser (Browser), etc. for implementing various application services. The cooking recommendation method provided by the embodiment of the invention can be contained in an application program.
The method disclosed in the above embodiments of the present invention may be applied to, or implemented by, the second processor 201. In an exemplary embodiment, the cooking recommendation device may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), or Complex Programmable Logic Devices (CPLDs) to perform the aforementioned methods.
In an exemplary embodiment, please refer to fig. 9, which is a schematic structural diagram of a cooking recommendation device according to an embodiment of the present invention, the cooking recommendation device includes: a second obtaining module 21, configured to obtain a cooking image and corresponding cooking parameters; the second evaluation module 23 is configured to evaluate the cooking image through the trained cooking evaluation model when it is determined that the display parameter of the cooking image meets a preset value-taking condition; the training data of the cooking evaluation model comprise cooking sample images and corresponding evaluation marks; and the recommending module 25 is used for sequencing according to the evaluation result of the cooking images and recommending the cooking parameters corresponding to the target cooking images meeting the set conditions in the sequencing to the corresponding terminals.
In some embodiments, the recommending module 25 is further configured to sort the cooking foods according to different cooking food types according to the evaluation result of the cooking image; when a cooking request sent by a terminal is acquired, the type of currently cooked food is determined according to the cooking request, a target cooking image which corresponds to the type of the cooking food and meets set conditions in the sequence is determined, and cooking parameters corresponding to the target cooking image are recommended to the client.
In some embodiments, the second evaluation module 23 is further configured to extract feature data of corresponding dimensions of the cooking image through the trained cooking evaluation model; and judging the probability that the cooking food contained in the cooking image to be evaluated is the corresponding cooking sample image according to the similarity of the extracted feature data and the feature data of the corresponding dimensionality mapped to the sample mark space by the sample images corresponding to different evaluation marks in the model training process, and determining the score value of the cooking image to be evaluated.
It should be noted that: in the cooking recommendation device provided in the above embodiment, when the cooking image is evaluated and recommended, only the division of the program modules is illustrated, and in practical applications, the processing distribution may be completed by different program modules according to needs, that is, the internal structure of the device may be divided into different program modules to complete all or part of the processing described above. In addition, the cooking recommendation device and the cooking recommendation method provided by the embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments in detail and are not described herein again.
In an exemplary embodiment, embodiments of the present invention also provide a storage medium, such as a memory, including an executable program, which is executable by a processor to perform the steps of the foregoing method. The readable storage medium can be FRAM, ROM, PROM, EPROM, EEPROM, Flash Memory, magnetic surface Memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories, such as mobile phones, computer devices, tablet devices, personal digital assistants, medical devices, and the like.
To aid understanding of the cooking evaluation method and the cooking recommendation method provided by the embodiments of the present invention, they are described below with reference to fig. 10 and 11. In the training framework diagram of the cooking evaluation model, the sources of the training data mainly include cooking result images obtained through experiments and cooking result images obtained from users' own cooking operations; the rating corresponding to a cooking result image may be an expert rating or a rating constructed from user feedback. Expert rating means that professionals score real cooking results, while data such as user feedback are collected during the users' cooking. The method comprises the following steps:
S31, collecting training data: the cooking results from user operations and/or the cooking results fed back by users are scored and labeled to form sample data.
S32, training the cooking evaluation model. The cooking evaluation model is established from the collected sample data and an image feature algorithm. For example, in a laboratory, thousands of experiments are carried out on different rice varieties, the final rice images are collected, and each result is scored and labeled, yielding an image-score mapping table that forms the sample data. To train the cooking evaluation model, an image feature extraction algorithm can be used to extract feature values from the sample data, and the similarity between each cooking result and the sample data is then computed with an SVM algorithm to obtain the corresponding score.
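A minimal sketch of this feature-plus-SVM scoring, using scikit-learn and a color histogram as a simplified stand-in for the feature set F1 to F6 listed below; treating each score value as a class is one common realization and an assumption here, not necessarily the patent's exact algorithm.

```python
# Sketch of training an SVM on image features labeled with scores, then scoring new images.
import cv2
import numpy as np
from sklearn.svm import SVC

def extract_features(image):
    # Simplified stand-in for the feature set F1-F6: a normalized color histogram.
    hist = cv2.calcHist([image], [0, 1, 2], None, [8, 8, 8],
                        [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def train_scoring_svm(sample_images, sample_scores):
    X = np.array([extract_features(img) for img in sample_images])
    y = np.array(sample_scores)          # labels from the image-score mapping table
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(X, y)
    return clf

def score_image(clf, image):
    features = extract_features(image).reshape(1, -1)
    return clf.predict(features)[0]      # predicted score value
```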
S33, starting cooking: the parameters set for the cooking session are written into a cloud database.
S34, finishing cooking: a picture of the cooked rice is captured by the camera of the cooking device or by the mobile terminal app and uploaded to the cloud.
S35, storing the scoring result: the picture of the cooked result is input into the cooking scoring model for calculation, and the resulting score value is written into the database.
The cooking scoring model performs feature extraction on the cooking image through an image feature extraction algorithm:
Feature quantity    Description
F1                  Edge features
F2                  Light transmittance
F3                  Color features
F4                  Texture features
F5                  Shape features
F6                  HOG features
Then the similarity between the cooking image and the sample images is calculated with an SVM classifier to obtain the score value of the corresponding cooking image.
S36, displaying the result: the recognition result is read out and shown on the device side or the app side.
S37, processing the scoring result: the results are sorted by score and the corresponding setting method is recommended to the user, so that a better experience is obtained for the next cooking.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. A cooking evaluation method, comprising:
acquiring a cooking image to be evaluated;
when the display parameters of the cooking image to be evaluated meet preset value conditions, evaluating the cooking image to be evaluated through a trained cooking evaluation model; the training data of the cooking evaluation model comprise cooking sample images and corresponding evaluation marks;
and returning the evaluation to the corresponding terminal.
2. The cooking evaluation method of claim 1, wherein before evaluating the cooking image to be evaluated by the trained cooking evaluation model, the method further comprises:
acquiring a first cooking sample image obtained through a standard cooking program and a second cooking sample image obtained through user cooking determined by setting a screening condition;
training an initial neural network model based on a training data set which is formed by combining the first cooking sample image and the second cooking sample image carrying corresponding evaluation marks;
and obtaining the trained cooking evaluation model until the loss function of the neural network model meets the convergence condition.
3. The cooking evaluation method according to claim 2, wherein the acquiring of the second cooking sample image obtained by the user's cooking determined by setting the filtering condition includes:
acquiring a cooking image obtained by cooking of a user and feedback data corresponding to the cooking image;
determining a cooking image with a feedback result meeting the requirement as a second cooking sample image according to the feedback data; wherein the feedback result meeting the requirement comprises at least one of the following: the score value is higher than a preset score, the recommendation number is higher than a corresponding threshold, and the like number is higher than a corresponding threshold.
4. The cooking evaluation method according to claim 1, wherein determining that the display parameter of the cooking image to be evaluated meets the preset value condition comprises:
detecting an imaging area of the cooked food in the cooking image to be evaluated, and determining that the size of the cooking image to be evaluated meets a size requirement.
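One possible reading of the check in claim 4, sketched with OpenCV: segment the image with an Otsu threshold, take the largest contour as the imaging area of the food, and require it to occupy a minimum fraction of the frame. The segmentation method and the area-ratio threshold are assumptions made for illustration.

import cv2

def display_parameter_ok(image_bgr, min_area_ratio=0.2):
    """Detect the imaging area of the food and check it meets a size requirement."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    food_area = max(cv2.contourArea(c) for c in contours)   # largest region, taken as the food
    frame_area = image_bgr.shape[0] * image_bgr.shape[1]
    return food_area / frame_area >= min_area_ratio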
5. The cooking evaluation method according to any one of claims 1 to 4, wherein the evaluating of the cooking image to be evaluated through the trained cooking evaluation model comprises:
extracting feature data of corresponding dimensions from the cooking image to be evaluated through the trained cooking evaluation model;
and determining, according to the similarity between the extracted feature data and the feature data of corresponding dimensions that the sample images corresponding to different evaluation marks were mapped to in the sample mark space during model training, the probability that the cooked food contained in the cooking image to be evaluated corresponds to each cooking sample image, and thereby determining the score value of the cooking image to be evaluated.
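A hedged sketch of the similarity step in claims 5 and 8: compare the extracted feature vector with per-mark prototype vectors in the sample mark space, convert the similarities into probabilities, and take a probability-weighted score. Cosine similarity, the softmax, and numeric evaluation marks are assumptions; the claim fixes none of these choices.

import numpy as np

def score_from_similarity(features, mark_prototypes):
    """mark_prototypes maps a numeric evaluation mark to a prototype feature
    vector in the sample mark space (assumed to be built during training)."""
    marks = np.array(list(mark_prototypes.keys()), dtype=float)
    protos = np.stack([mark_prototypes[m] for m in mark_prototypes])
    sims = protos @ features / (
        np.linalg.norm(protos, axis=1) * np.linalg.norm(features) + 1e-8)
    probs = np.exp(sims) / np.exp(sims).sum()    # similarity -> probability per mark
    return float(probs @ marks)                  # probability-weighted score value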
6. A cooking recommendation method, comprising:
acquiring a cooking image and corresponding cooking parameters;
when a display parameter of the cooking image meets a preset value condition, evaluating the cooking image through a trained cooking evaluation model, wherein training data of the cooking evaluation model comprises cooking sample images and corresponding evaluation marks;
and ranking the cooking images according to their evaluation results, and recommending the cooking parameters corresponding to a target cooking image that meets a set condition in the ranking to the corresponding terminal.
7. The cooking recommendation method according to claim 6, wherein the ranking according to the evaluation results of the cooking images and the recommending of the cooking parameters corresponding to the target cooking image that meets the set condition in the ranking to the corresponding terminal comprises:
ranking the cooking images separately for different types of cooked food according to their evaluation results;
and when a cooking request sent by a terminal is acquired, determining the type of food currently to be cooked according to the cooking request, determining a target cooking image that corresponds to that type of food and meets the set condition in the ranking, and recommending the cooking parameters corresponding to the target cooking image to the terminal.
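A minimal sketch of the per-food-type ranking and request handling in claim 7, with plain Python dicts standing in for the evaluated records and the cooking request; all field names are hypothetical.

from collections import defaultdict

def rank_by_food_type(evaluated_sessions):
    """evaluated_sessions: [{"food_type": "rice", "score": 4.7, "params": {...}}, ...]"""
    ranking = defaultdict(list)
    for s in evaluated_sessions:
        ranking[s["food_type"]].append(s)
    for sessions in ranking.values():
        sessions.sort(key=lambda s: s["score"], reverse=True)   # best first per food type
    return ranking

def recommend_for_request(ranking, cooking_request):
    food_type = cooking_request["food_type"]        # type of food currently to be cooked
    candidates = ranking.get(food_type, [])
    return candidates[0]["params"] if candidates else None      # top-ranked parameters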
8. The cooking recommendation method according to claim 6, wherein the evaluating of the cooking image through the trained cooking evaluation model comprises:
extracting feature data of corresponding dimensions from the cooking image through the trained cooking evaluation model;
and determining, according to the similarity between the extracted feature data and the feature data of corresponding dimensions that the sample images corresponding to different evaluation marks were mapped to in the sample mark space during model training, the probability that the cooked food contained in the cooking image corresponds to each cooking sample image, and thereby determining the score value of the cooking image.
9. A cooking evaluation device, comprising:
the first acquisition module is used for acquiring a cooking image to be evaluated;
the first evaluation module is used for evaluating the cooking image to be evaluated through a trained cooking evaluation model when a display parameter of the cooking image to be evaluated meets a preset value condition, wherein training data of the cooking evaluation model comprises cooking sample images and corresponding evaluation marks;
and the sending module is used for returning the evaluation result to the corresponding terminal.
10. A cooking recommendation device, comprising:
the second acquisition module is used for acquiring the cooking image and the corresponding cooking parameter;
the second evaluation module is used for evaluating the cooking image through a trained cooking evaluation model when a display parameter of the cooking image meets a preset value condition, wherein training data of the cooking evaluation model comprises cooking sample images and corresponding evaluation marks;
and the recommendation module is used for ranking the cooking images according to their evaluation results and recommending the cooking parameters corresponding to a target cooking image that meets a set condition in the ranking to the corresponding terminal.
11. A computer device, comprising: a processor and a memory for storing a computer program capable of running on the processor;
wherein the processor is configured to implement the cooking evaluation method of any one of claims 1 to 5 or the cooking recommendation method of any one of claims 6 to 8 when running the computer program.
12. A storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the cooking evaluation method of any one of claims 1 to 5 or implements the cooking recommendation method of any one of claims 6 to 8.
CN201910517269.XA 2019-06-14 2019-06-14 Cooking evaluation method, cooking recommendation method, computer device and storage medium Active CN112084825B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910517269.XA CN112084825B (en) 2019-06-14 2019-06-14 Cooking evaluation method, cooking recommendation method, computer device and storage medium

Publications (2)

Publication Number Publication Date
CN112084825A (en) 2020-12-15
CN112084825B (en) 2023-03-24

Family

ID=73734093

Country Status (1)

Country Link
CN (1) CN112084825B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09257711A (en) * 1996-03-21 1997-10-03 Matsushita Electric Ind Co Ltd Cooking evaluation apparatus and surface evaluation apparatus
CN106037448A (en) * 2016-07-29 2016-10-26 广东美的厨房电器制造有限公司 Cooking control method and equipment and cooking device
CN106889898A (en) * 2016-10-25 2017-06-27 佛山市顺德区美的电热电器制造有限公司 The control method of cooking apparatus, control device, cooking apparatus and control device
CN107819812A (en) * 2016-09-14 2018-03-20 佛山市顺德区美的电热电器制造有限公司 The evaluation method and device of cooking quality
CN107862018A (en) * 2017-10-30 2018-03-30 珠海格力电器股份有限公司 The recommendation method and smoke exhaust ventilator of food materials cooking methods
CN109118470A (en) * 2018-06-26 2019-01-01 腾讯科技(深圳)有限公司 A kind of image quality evaluating method, device, terminal and server
CN109145956A (en) * 2018-07-26 2019-01-04 上海慧子视听科技有限公司 Methods of marking, device, computer equipment and storage medium
CN109191180A (en) * 2018-08-06 2019-01-11 百度在线网络技术(北京)有限公司 The acquisition methods and device of evaluation

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112633397A (en) * 2020-12-29 2021-04-09 江苏惟妙纺织科技有限公司 Embroidery customization method and system
CN112633397B (en) * 2020-12-29 2021-12-14 江苏惟妙纺织科技有限公司 Embroidery customization method and system
CN112801161A (en) * 2021-01-22 2021-05-14 桂林市国创朝阳信息科技有限公司 Small sample image classification method and device, electronic equipment and computer storage medium
CN112861946A (en) * 2021-01-29 2021-05-28 广州富港万嘉智能科技有限公司 Neural network training method, cooking inspection method and system and intelligent cooking equipment
CN112861946B (en) * 2021-01-29 2024-04-02 广州富港生活智能科技有限公司 Neural network training method, cooking inspection method, system and intelligent cooking equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant