CN109434844B - Food material processing robot control method, device and system, storage medium and equipment


Info

Publication number: CN109434844B
Application number: CN201811081931.3A
Authority: CN (China)
Legal status: Active
Other versions: CN109434844A (Chinese)
Inventors: 何德裕, 朱文飞, 何国斌
Assignee (original and current): Robotics Robotics Shenzhen Ltd
Application filed by Robotics Robotics Shenzhen Ltd
Priority to CN201811081931.3A
Publication of CN109434844A
Priority to PCT/CN2019/105912 (WO2020057455A1)
Application granted
Publication of CN109434844B

Classifications

    • B: Performing operations; transporting
    • B25: Hand tools; portable power-driven tools; manipulators
    • B25J: Manipulators; chambers provided with manipulation devices
    • B25J 11/00: Manipulators not otherwise provided for
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J 9/1697: Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • General Preparation And Processing Of Foods (AREA)

Abstract

The application relates to a food material processing robot control method, device, system, storage medium and equipment. An image of the food material to be processed is acquired; according to this image, food material type information of the food material to be processed and corresponding food material processing information are obtained through a neural network model, and a corresponding food material processing instruction is output, the neural network model having been trained on food material processing images and/or videos. When food materials are processed, the neural network model identifies the food material type and supplies the corresponding food material processing information, and the robot processes the food materials accordingly; different processing procedures are thus executed for different food materials, making the food material processing more flexible. In addition, because the neural network model can learn the processing information of many different food materials from different food material processing images and/or videos, the application range is wider and the food material processing is more scientific and reasonable.

Description

Food material processing robot control method, device and system, storage medium and equipment
Technical Field
The present application relates to the field of device control technologies, and in particular, to a method, an apparatus, a system, a storage medium, and a device for controlling a food material processing robot.
Background
With the improvement of the technological level, society as a whole is developing toward intelligence and automation. In food material processing, more and more automatic devices (such as food material processing robots) have appeared, making the processing of food materials increasingly convenient.
In the conventional technology, when an automatic device processes food materials, a fixed food material processing program must be written into the device, and the device then processes the food materials according to the steps of that fixed program. However, because different food materials are processed in different ways, fixed processing steps cannot be applied to all food materials, so the processing results for some food materials are not ideal; in addition, when there are many kinds of food materials and processing manners, a large number of food material processing programs must be written into the device, which is a heavy workload.
Disclosure of Invention
In view of the above, it is necessary to provide a food material processing robot control method, device, system, storage medium, and apparatus with wider application range and more flexible processing manner in order to solve the problems in the conventional technology.
A food material processing robot control method comprising:
acquiring an image of food material to be processed;
according to the image of the food material to be processed, obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used for instructing the food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction.
In one embodiment, the training data of the neural network model further comprises at least one of:
force information in the food material processing process;
temperature information in the food material processing process;
time information in the food material processing process.
In one embodiment, after acquiring the image of the food material to be processed, the method further includes:
acquiring current state information of the food material to be processed, wherein the current state information comprises at least one of the following three items: current force information, current temperature information, and current time information.
In one embodiment, the obtaining of the food material category information of the food material to be processed and the corresponding food material processing information through a neural network model according to the image of the food material to be processed includes:
obtaining the food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed;
and obtaining the food material processing information of the food material to be processed through a neural network model according to the food material category information of the food material to be processed and the current state information of the food material to be processed.
In one embodiment, the food material processing information includes a food material processing mode and a food material processing parameter.
In one embodiment, the food material processing parameters comprise at least one of time parameters, temperature parameters and force parameters.
In one embodiment, after the step of outputting the corresponding food material processing instruction, the method further includes:
acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of actual processing time, actual processing force and actual processing temperature;
and determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information.
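The comparison of planned and actual processing information described above can be pictured with a small sketch. Everything below (the field names, the dict layout, and the 10% relative tolerance) is an illustrative assumption, not part of the patent:

```python
# Illustrative sketch only: judge a food material processing result by
# comparing the planned processing parameters with the measured actual
# values. Field names and the 10% tolerance are assumptions.

def evaluate_processing(planned, actual, tolerance=0.1):
    """Return 'done' if every measured value lies within a relative
    tolerance of its planned value, else 'needs rework'."""
    for key, target in planned.items():
        if key not in actual:
            continue  # that sensor is absent; the claims allow any subset
        if abs(actual[key] - target) > tolerance * target:
            return "needs rework"
    return "done"

planned = {"time_s": 180, "temperature_c": 160, "force_n": 5.0}
actual = {"time_s": 175, "temperature_c": 158}  # no force sensor fitted
result = evaluate_processing(planned, actual)
```

Because the claim allows any subset of the three actual measurements, the sketch simply skips parameters for which no sensor reading is available.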
A food material processing robot control apparatus comprising:
an image acquisition module, used for acquiring an image of the food material to be processed;
the control module is used for obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model according to the image of the food material to be processed, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used for instructing the food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction.
In one embodiment, the control module is further configured to:
acquiring current state information of the food material to be processed, wherein the current state information comprises at least one of the following three items: current force information, current temperature information, and current time information.
In one embodiment, the control module is further configured to:
obtaining the food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed;
and obtaining the food material processing information of the food material to be processed through a neural network model according to the food material category information of the food material to be processed and the current state information of the food material to be processed.
In one embodiment, the control module is further configured to:
acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of actual processing time, actual processing force and actual processing temperature;
and determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information.
A food material processing system comprising:
the control device is used for acquiring an image of the food material to be processed, obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model according to the image, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos;
and the food material processing robot is used for processing the food material to be processed according to the food material processing information in the food material processing instruction.
In one embodiment, the food material processing system further comprises at least one of the following three items:
the timer is used for acquiring the actual processing time of the food material to be processed after the control device outputs a food material processing instruction, and sending the actual processing time to the control device;
the force sensor is used for acquiring the current force information of the food material to be processed before the control device outputs a food material processing instruction and sending the current force information to the control device, and for acquiring the actual processing force of the food material to be processed after the control device outputs a food material processing instruction and sending the actual processing force to the control device;
the temperature sensor is used for acquiring the current temperature information of the food material to be processed before the control device outputs a food material processing instruction and sending the current temperature information to the control device, and for acquiring the actual processing temperature of the food material to be processed after the control device outputs a food material processing instruction and sending the actual processing temperature to the control device.
A computer device comprising a memory storing a computer program and a processor which, when executing the computer program, implements the following steps:
acquiring an image of the food material to be processed;
according to the image of the food material to be processed, obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used for instructing a food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring an image of food material to be processed;
according to the image of the food material to be processed, obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used for instructing a food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction.
The food material processing robot control method, device, system, storage medium and equipment acquire the image of the food material to be processed; according to the image, food material type information of the food material to be processed and corresponding food material processing information are obtained through a neural network model, and a corresponding food material processing instruction is output, wherein the neural network model is trained on food material processing images and/or videos, and the instruction instructs a food material processing robot to process the food material according to the food material processing information it contains. When food materials are processed, the neural network model identifies the food material type and supplies the corresponding processing information, and the robot processes the food materials accordingly; different processing procedures are thus executed for different food materials, making the food material processing more flexible. In addition, because the neural network model can learn the processing information of many different food materials from different food material processing images and/or videos, the application range is wider and the food material processing is more scientific and reasonable.
Drawings
FIG. 1 is a schematic flowchart of a food material processing robot control method according to an embodiment;
FIG. 2 is a schematic structural diagram of a food material processing robot according to an embodiment;
FIG. 3 is a schematic structural diagram of a food material processing robot in another embodiment;
FIG. 4 is a schematic structural diagram of a food material processing robot in another embodiment;
FIG. 5 is a schematic structural diagram of a food material processing robot in a further embodiment;
FIG. 6 is a schematic structural diagram of a food material processing robot in a further embodiment;
FIG. 7 is a schematic view of a process of frying eggs in one embodiment;
FIG. 8 is a schematic flowchart of a food material processing robot control method according to another embodiment;
FIG. 9 is a schematic structural diagram of a food material processing robot control device according to an embodiment;
FIG. 10 is a schematic structural diagram of a food material processing system in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a food material processing robot control method including the steps of:
Step S100, acquiring an image of the food material to be processed. When there is a food material to be processed, an image of it is first acquired by a device that can capture images, such as one equipped with an image sensor. The acquired image can be a single image, multiple images, or a video consisting of multiple frames: a single image saves acquisition time and speeds up image processing, while multiple images or a video can improve the accuracy of image processing.
Step S200, obtaining the food material type information of the food material to be processed and the corresponding food material processing information through a neural network model according to the image of the food material to be processed, and outputting a corresponding food material processing instruction.
After the image of the food material to be processed is obtained, the image is processed by a neural network model for image recognition, yielding the food material type information of the food material contained in the image and the corresponding food material processing information. The food materials to be processed include food materials commonly used in daily life, and their categories include: grains and oils, for example millet, wheat, barley, corn, mung bean, peanut and the like; vegetables, for example celery, spinach, Chinese cabbage, towel gourd, cucumber, white gourd, balsam pear, eggplant, tomato and the like; meats, for example various fishes, various poultry meats and the like; fruits, for example pear, grapefruit, mango, kiwi, banana, orange, strawberry, watermelon and the like; and other categories. Correspondingly, the food material processing information includes: chopping (such as chopping vegetables), slicing (such as slicing fruits), peeling (such as peeling potatoes), shelling (such as removing pomelo peel), stirring (such as beating egg liquid), wrapping (such as wrapping dumplings, steamed stuffed buns and steamed bread), stir-frying (such as stir-frying dishes), pan-frying (such as frying eggs), deep-frying (such as frying french fries), boiling (such as boiling noodles and dumplings), steaming (such as steaming steamed bread), and roasting (such as roasting chicken and duck).
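The association between a recognized food material category and its processing information can be pictured as a lookup table. In the patent this mapping is produced by the trained neural network model; the hand-written table below is only a hypothetical stand-in, and every name and value in it is invented for illustration:

```python
# Hypothetical stand-in for the learned mapping from food material category
# to food material processing information; in the patent this association
# comes from the trained neural network model. All values are invented.

PROCESSING_INFO = {
    "cucumber": {"mode": "slice",   "params": {"force_n": 8.0}},
    "potato":   {"mode": "peel",    "params": {"force_n": 4.0}},
    "egg":      {"mode": "pan-fry", "params": {"temperature_c": 150, "time_s": 120}},
    "noodles":  {"mode": "boil",    "params": {"temperature_c": 100, "time_s": 300}},
}

def lookup_processing(category):
    """Return the processing mode and parameters for a recognized category."""
    info = PROCESSING_INFO.get(category)
    if info is None:
        raise KeyError(f"no processing information for category {category!r}")
    return info
```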
Specifically, the neural network model used in this embodiment includes, but is not limited to, a convolutional neural network (CNN) model, which may take various network structures, such as LeNet, AlexNet, ZFNet, VGG, GoogLeNet, ResNet, DenseNet, R-CNN, SPP-Net, Fast R-CNN, FCN, Mask R-CNN, YOLO, SSD, YOLOv2, and other network model structures now known or later developed. The neural network model used in this embodiment may also be another type of neural network model. The model is trained on food material processing images and/or videos; the training methods include supervised learning (training on labeled samples), reinforcement learning (defining a reward function), and imitation learning. The trained neural network model is used to identify the food material type information and obtain the corresponding food material processing information.
Further, when the neural network model is trained on food material processing images and/or videos, training can be completed by using the images and/or videos directly as the input of the neural network model; alternatively, the pose information of the food material processing robot at each image/time point in the images and/or videos can be extracted, and that pose information used as the input of the neural network model. It can be understood that the training methods of this embodiment are not limited to these two; other methods that train the neural network model directly or indirectly from food material processing images and/or videos may also be used.
After the food material type information of the food material to be processed and the corresponding food material processing information are obtained, a corresponding food material processing instruction is generated according to the food material type information and the food material processing information and is sent to the food material processing robot, and the food material processing instruction is used for instructing the food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction.
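Packaging the recognized type information and processing information into one instruction, as described above, might look like the following sketch; the plain-dict instruction layout is an assumption made for illustration, not a format specified by the patent:

```python
# Sketch of assembling a food material processing instruction from the
# recognized type information and processing information; the plain-dict
# instruction layout is an assumption for illustration.

def build_instruction(category, processing_info):
    """Package category and processing information into one instruction
    that a food material processing robot could consume."""
    return {
        "category": category,
        "mode": processing_info["mode"],
        "params": processing_info.get("params", {}),
    }

instruction = build_instruction("egg", {"mode": "pan-fry", "params": {"time_s": 120}})
```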
Further, if the food material type information of the food material to be processed cannot be identified according to the acquired image of the food material to be processed by the neural network model, the image of the food material to be processed is acquired again, and image identification processing is performed again according to the acquired image until the food material type information of the food material to be processed is identified.
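The re-acquisition loop above can be sketched as follows. `capture_image` and `classify` are hypothetical stand-ins for the image sensor and the neural network model, and the attempt cap is an added safeguard not stated in the patent (which simply retries until recognition succeeds):

```python
# Sketch of the re-acquisition loop: keep capturing images until the model
# recognizes a food material category. Both callables are hypothetical
# stand-ins; the attempt cap is an added safeguard.

def recognize_with_retry(capture_image, classify, max_attempts=5):
    """Acquire images repeatedly until a food material category is found."""
    for _ in range(max_attempts):
        image = capture_image()
        category = classify(image)  # None means "not recognized"
        if category is not None:
            return category
    raise RuntimeError("food material category could not be identified")

# Demo: recognition succeeds on the third captured frame.
_attempts = iter([None, None, "egg"])
category = recognize_with_retry(lambda: "frame", lambda image: next(_attempts))
```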
It should be noted that the type of the food material processing robot is not limited in this embodiment and may be chosen according to the food material to be processed. For example, as shown in FIG. 2, the food material processing robot may be a structure such as a manipulator, which can fix the food material to be processed. The food material processing robot may also comprise a manipulator and an actuator: as shown in FIGS. 3 and 4, the actuator may hold other utensils, or, as shown in FIGS. 5 and 6, may itself be a structure capable of processing the food material, such as a cutter (e.g., a cutting knife) or a peeler (e.g., a peeling knife), or a structure that cooperates with other food material processing devices (e.g., a bowl or a pot) to complete the processing, such as a stirrer cooperating with a bowl or a spatula cooperating with a pot. The food material processing robot may also be a device for controlling the operating state of other food material processing equipment, such as a device controlling the baking time of an oven or the frying temperature of a wok.
Further, when the food material handling robot at least comprises a manipulator and an actuator, the actuator may hold other appliances (such as a knife, etc.), or the other appliances may be directly fixed on the manipulator.
The embodiment provides a control method of a food material processing robot, when processing food materials, the type of the food materials is identified through a neural network model, and corresponding food material processing information is obtained, the robot processes the food materials according to the food material processing information, that is, different processing processes are executed for different food materials, so that the food material processing mode is more flexible; in addition, the neural network model can learn the food processing information of various different food materials according to different food material processing images and/or videos, so that the application range is wider, and the food material processing is more scientific and reasonable.
In one embodiment, the training data of the neural network model further comprises at least one of: force information in the food material processing process; temperature information in the food material processing process; time information in the food material processing process. When the neural network model is trained only on food material processing images and/or videos, the information the model can obtain is limited; for example, it may obtain only the operation trajectory and operation mode. In the food material processing process, besides the operation trajectory and mode, specific operation parameters such as force, temperature, and time are also required to complete the processing.
In this embodiment, the operation parameters include at least one of food material processing force information, food material processing temperature information, and food material processing time information. Specifically, the force information refers to the force applied to the food material to be processed; for example, striking an egg with sufficient force cracks its shell. The temperature information is the temperature required when processing the food material; for example, a high enough temperature cooks an egg. The time information is the time required for processing the food material; for example, boiling for a sufficiently long time cooks an egg.
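One way to carry these operation parameters as structured training data is sketched below; the field names, units and dataclass layout are assumptions, and any subset of the three optional fields may be present, matching the "at least one of" wording:

```python
# One way to carry the three operation parameters as structured data.
# Field names and units are assumptions; any subset may be present.

from dataclasses import dataclass
from typing import Optional

@dataclass
class OperationParams:
    force_n: Optional[float] = None        # food material processing force
    temperature_c: Optional[float] = None  # food material processing temperature
    time_s: Optional[float] = None         # food material processing time

    def provided(self):
        """Names of the parameters actually present in this sample."""
        return [name for name, value in vars(self).items() if value is not None]

sample = OperationParams(temperature_c=150.0, time_s=120.0)
```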
When the neural network model of this embodiment is trained, its training data thus further includes at least one of food material processing force information, food material processing temperature information, and food material processing time information. Because these parameters are all very important in the food material processing process and are decisive for whether the processing is completed, training the neural network model with them makes the food material processing information contained in the output instruction more comprehensive and specific, so the food material processing robot can be controlled better.
In one embodiment, after step S100, the food material processing robot control method further includes: the method comprises the steps of obtaining current state information of food materials to be processed, wherein the obtained current state information is beneficial to obtaining food material processing information required for processing the food materials.
The current state information includes at least one of the following three items: current force information, current temperature information, and current time information. The current force information refers to the force currently applied to the food material to be processed, the current temperature information refers to the current temperature of the food material, and the current time information refers to the various time information related to the food material. For example, referring to FIG. 3, the current force information is the force with which the manipulator fixes the food material; since the food material needs to be cut, whether it slips during cutting must be determined from the current force information, and the acquired current force information also helps to obtain the cutting force required. As another example, when processing frozen food materials (such as meat), a thawing step is needed that normal-temperature food materials do not require, so the processing is more complicated and the food material processing strategy changes accordingly.
The current state of the food material to be processed can be judged by acquiring the current state information of the food material to be processed, so that a better food material processing strategy can be obtained, and the output food material processing instruction is more scientific and reasonable.
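The frozen-food example can be made concrete with a short sketch: if the current temperature information indicates the food material is frozen, a thawing step is prepended to the processing plan. The 0 degrees-Celsius threshold and the step names are illustrative assumptions:

```python
# Sketch of adjusting the processing strategy from current state
# information: frozen food material gets a thawing step first.
# The threshold and step names are assumptions.

def plan_steps(mode, current_temperature_c):
    """Return an ordered list of processing steps, prepending a thaw
    step when the food material is frozen."""
    steps = []
    if current_temperature_c is not None and current_temperature_c < 0:
        steps.append("thaw")
    steps.append(mode)
    return steps

frozen_plan = plan_steps("slice", -18)  # frozen meat: thaw first
normal_plan = plan_steps("slice", 20)   # room-temperature food material
```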
In one embodiment, in the step S200, the step of obtaining the food material category information of the food material to be processed and the corresponding food material processing information through the neural network model according to the image of the food material to be processed specifically includes:
obtaining food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; and obtaining the food material processing information of the food material to be processed through a neural network model according to the food material category information of the food material to be processed and the current state information of the food material to be processed.
After the type information of the food materials to be processed is obtained, the current state of the food materials to be processed can be judged by combining the current state information of the food materials to be processed, so that a better food material processing strategy can be obtained, and the output food material processing instruction is more scientific and reasonable.
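The two-stage flow above (category from the image, then processing information from category plus current state) can be sketched as a pipeline; `classify` and `decide` are hypothetical callables standing in for the neural network model(s), not APIs from the patent:

```python
# The two-stage flow as a pipeline: first recognize the category from the
# image, then decide the processing information from category plus current
# state. `classify` and `decide` are hypothetical stand-ins for the model(s).

def control_step(image, current_state, classify, decide):
    """Run recognition, then decision, and return the final instruction."""
    category = classify(image)
    processing_info = decide(category, current_state)
    return {"category": category, **processing_info}

# Demo with trivial stand-in callables.
instruction = control_step(
    image=None,
    current_state={"temperature_c": 20},
    classify=lambda image: "egg",
    decide=lambda category, state: {"mode": "pan-fry", "params": {"time_s": 120}},
)
```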
In one embodiment, the food material processing information comprises a food material processing mode and a food material processing parameter. The food material processing information obtained through the neural network model comprises a food material processing mode and food material processing parameters, wherein the food material processing mode is a mode for processing food materials, and the food material processing parameters are conditions/environment parameters in the food material processing process.
Specifically, different kinds of food materials may share the same processing mode or use different ones, and a single food material may have multiple processing modes. For example, the processing modes for eggs include egg breaking, stirring, frying, steaming, and so on. Correspondingly, the food material processing parameters for egg breaking indicate how to break the eggshell; the parameters for stirring indicate how to stir so that the egg white and yolk mix evenly; the parameters for frying indicate how to fry the egg until it is done; and the parameters for steaming indicate how to steam the egg until it is done.
In this embodiment, the food material processing information includes a food material processing mode and a food material processing parameter, and the food material processing robot can process the food material according to the food material processing mode and the food material processing parameter, so that the food material processing process is more intelligent.
In an embodiment, the food material processing parameter comprises at least one of a time parameter, a temperature parameter and a force parameter. The time parameter represents the time for processing the food material to be processed, the temperature parameter represents the environment temperature for processing the food material to be processed, and the strength parameter represents the strength for processing the food material to be processed.
Specifically, different food material processing modes use different food material processing parameters: a mode may use only one of the three parameters, two of them, or all three. It can be understood that these three parameters are the ones most commonly used in general food material processing; besides them, the food material processing parameters may also include other parameters used during processing, such as the number of processing repetitions.
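The optional use of these parameters can be pictured with a small record type in which any subset of fields may be set; the field names and example values are illustrative assumptions, not values from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProcessingParams:
    """Any subset of the common parameters may be present for a given
    food material processing mode; unused ones stay None."""
    time_min: Optional[float] = None       # time parameter
    temperature_c: Optional[float] = None  # temperature parameter
    force_n: Optional[float] = None        # force parameter
    repetitions: Optional[int] = None      # e.g. number of processing passes

    def used(self):
        # Names of the parameters this mode actually uses.
        return [k for k, v in self.__dict__.items() if v is not None]

egg_breaking = ProcessingParams(force_n=3.0)            # force only
stirring = ProcessingParams(force_n=1.5, time_min=2.0)  # force + time
```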
For example, referring to fig. 2, the egg breaking process requires a force parameter, i.e. how much force is used to hold the egg and how much force is used to break its shell. Referring to fig. 5, stirring requires a force parameter and a time parameter, i.e. how much force is used to stir the egg white and yolk and how long it takes to mix them evenly. Steaming eggs requires a temperature parameter and a time parameter, i.e. at what temperature and for how long the eggs are steamed until done. As shown in fig. 7, frying eggs requires a time parameter, a temperature parameter, and a force parameter, i.e. at what temperature the eggs are fried, how much force is used to turn them over during frying, and how long they are fried until done.
It should be noted that when two or more parameters are used in the food material processing process, one parameter may change as another parameter or the other two parameters change, and the corresponding relationships between parameters can be added when training the neural network model. For example, when a food material is boiled in water, the temperature parameter is generally 100 ℃ and the time parameter is 30 minutes; at high altitude, however, the boiling point of water drops, so the temperature parameter is lower than 100 ℃ and the required time parameter exceeds 30 minutes.
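A minimal sketch of this coupling: when the achievable temperature stays below the nominal boiling point, the time parameter grows. The linear rate used here is an invented placeholder, not a relationship disclosed in the patent:

```python
def adjust_boil_time(base_time_min, base_temp_c, actual_temp_c,
                     minutes_per_degree=1.5):
    """Lengthen the boiling time when the actual temperature stays below
    the nominal boiling point (e.g. at high altitude). The per-degree
    rate is a hypothetical placeholder."""
    if actual_temp_c >= base_temp_c:
        return base_time_min
    deficit_c = base_temp_c - actual_temp_c
    return base_time_min + deficit_c * minutes_per_degree

# Nominal: 30 min at 100 C. At a 90 C boiling point the time grows.
longer = adjust_boil_time(30, 100, 90)  # 45.0 with the placeholder rate
```

In the patent's scheme this relationship would be learned by the neural network model rather than hard-coded.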
In this embodiment, the food material processing parameters include at least one of a time parameter, a temperature parameter, and a force parameter, and the food material processing robot can process the food material according to one or more of the parameters, so that the food material processing process is more scientific and reasonable.
In one embodiment, as shown in fig. 8, after the step of outputting the corresponding food material processing instruction, the food material processing robot control method further includes step S300 and step S400.
Step S300, acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of actual processing time, actual processing force and actual processing temperature. Which actual processing information is acquired is selected according to the food material processing parameters contained in the food material processing information of the output food material processing instruction. For example, if the food material processing parameters include only a force parameter, only the actual processing force of the food material is acquired; if they include both a force parameter and a time parameter, the actual processing force and the actual processing time are both acquired.
Step S400, determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information. After the actual processing information is acquired, it is compared against the food material processing information to judge whether the food material processing is finished. For example, for a certain food material whose processing mode is boiling, the food material processing parameters include a time parameter of 30 minutes and a temperature parameter of 100 ℃. If the acquired actual processing temperature is 100 ℃ and the actual processing time is 20 minutes, it is judged that processing is not yet finished; if the actual processing temperature is 100 ℃ and the actual processing time is 30 minutes, it is judged that processing is finished and the food material is done.
Further, if the actual processing environment of the food material cannot meet the food material processing parameters in the food material processing information, the corresponding food material processing parameters can be re-acquired through the neural network model and the food material processing instruction re-output. For example, for a certain food material whose processing mode is boiling, the food material processing parameters include a time parameter of 30 minutes and a temperature parameter of 100 ℃. If the acquired actual processing temperature is only 90 ℃ (because altitude has lowered the boiling point of water), the corresponding food material processing parameters are re-acquired through the neural network model, i.e. the boiling time required at 90 ℃ (for example, 45 minutes) is obtained, and the food material processing instruction is re-output with a temperature parameter of 90 ℃ and a time parameter of 45 minutes.
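The check-and-replan behaviour of steps S300/S400 can be sketched as follows; `replan_fn` stands in for re-querying the neural network model, and all names are hypothetical:

```python
def processing_complete(target, actual):
    """S400: processing is finished only when every target parameter
    (time, temperature, ...) has been reached."""
    return all(actual.get(k, 0) >= v for k, v in target.items())

def maybe_replan(target, actual_temp_c, replan_fn):
    """If the environment cannot reach the target temperature, obtain
    new parameters (replan_fn stands in for re-querying the model)."""
    if actual_temp_c < target["temperature_c"]:
        return replan_fn(actual_temp_c)
    return target

target = {"temperature_c": 100, "time_min": 30}
# At 100 C / 20 min the food is not done yet; at 100 C / 30 min it is.
not_done = processing_complete(target, {"temperature_c": 100, "time_min": 20})
done = processing_complete(target, {"temperature_c": 100, "time_min": 30})
# Boiling point only reaches 90 C, so replan to 90 C / 45 min.
new_target = maybe_replan(target, 90,
                          lambda t: {"temperature_c": t, "time_min": 45})
```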
After the food material processing instruction is output, the method further comprises the steps of acquiring actual processing information of the food material to be processed, and determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information, so that the food material processing result is ensured to meet requirements, and the food material processing process is more scientific and reasonable.
In one embodiment, taking a food material processing robot for cutting vegetables as an example, the robot comprises a manipulator and an actuator, the actuator holds a cutter, the manipulator fixes the food material to be processed, and the actuator controls the cutter to cut the food material. When training the neural network model, the training data include images/videos of cutting the food material and force information fed back by force sensors arranged on the chopping board; the model can learn information such as the operation trajectory for cutting from the cutting images/videos, and can learn the specific force required to cut the food material from the force information fed back by the force sensors.
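Assembling a training sample for this cutting example might look like the following; the pairing of frames, force readings, and robot poses is an assumed data layout, since the patent does not specify a format:

```python
def make_training_samples(frames, board_forces, poses):
    """Pair each cutting video frame with the simultaneous chopping-board
    force reading and robot pose, yielding (input, target) tuples for
    training. Layout is a hypothetical illustration."""
    assert len(frames) == len(board_forces) == len(poses)
    return [({"frame": f, "force_n": g}, {"pose": p})
            for f, g, p in zip(frames, board_forces, poses)]

samples = make_training_samples(["frame0", "frame1"], [2.0, 4.5],
                                [(0, 0, 0), (0, 0, 1)])
```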
In an embodiment, the number of neural network models used when executing the steps of the food material processing robot control method of the foregoing embodiments is not limited: a single neural network model may execute the food material processing process, or multiple neural network models may cooperate to execute it. For example, egg-beating processing may be implemented with a single neural network model, while for egg frying, neural network model 1 recognizes the egg, neural network model 2 turns the egg over, neural network model 3 controls the frying time, neural network model 4 controls the frying temperature, and so on. Through the cooperation of different neural network models, the food material processing process can be more accurate.
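The cooperating-models example for egg frying can be sketched as a coordinator chaining one callable per sub-task; the stand-in models and their outputs are hypothetical:

```python
def fry_egg_pipeline(image, models):
    """models: dict of callables, one per sub-task, each standing in for
    a separate neural network model (recognition, flip pose, time,
    temperature), as in the egg-frying example."""
    result = {}
    result["category"] = models["recognize"](image)
    result["flip_pose"] = models["flip"](image)
    result["fry_time_min"] = models["time"](result["category"])
    result["fry_temp_c"] = models["temperature"](result["category"])
    return result

stub_models = {
    "recognize": lambda img: "egg",
    "flip": lambda img: (0.10, 0.20),  # hypothetical flip target pose
    "time": lambda cat: 3.0,
    "temperature": lambda cat: 160.0,
}
out = fry_egg_pipeline(None, stub_models)
```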
It should be understood that although the steps in the flowcharts of fig. 1 and 8 are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in fig. 1 and 8 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turns or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 9, there is provided a food material handling robot control device comprising an image acquisition module 100 and a control module 200.
The image obtaining module 100 is configured to obtain an image of a food material to be processed. The image acquired by the image acquisition module 100 may be a single image, multiple images, or a video composed of multiple frames of images, wherein the single image can save the image acquisition time and can accelerate the image processing speed; multiple images or a video consisting of multiple frames of images can improve the accuracy of image processing.
The control module 200 is configured to obtain food material type information of the food material to be processed and corresponding food material processing information through a neural network model according to the image of the food material to be processed, and to output a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used to instruct the food material processing robot to process the food material to be processed according to the food material processing information in the instruction.
In addition, the image acquiring module 100 is further configured to perform positioning processing and the like on food materials to be processed in the food material processing process performed by the food material processing robot. For example: referring to fig. 7, in the egg frying process, the image acquiring module 100 is further configured to acquire the position of the egg, so as to facilitate the food material robot to turn over the egg.
The embodiment provides a food material processing robot control device, when processing food materials, the type of the food materials is identified through a neural network model, and corresponding food material processing information is obtained, and a food material processing robot processes the food materials according to the food material processing information, that is, different processing processes are executed for different food materials, so that a food material processing mode is more flexible; in addition, the neural network model can learn the food processing information of various different food materials according to different food material processing images and/or videos, so that the application range is wider, and the food material processing is more scientific and reasonable.
In one embodiment, the control module 200 is further configured to: acquiring current state information of food materials to be processed, wherein the current state information comprises at least one of the following three items: current force information, current temperature information, and current time information.
In one embodiment, the control module 200 is further configured to: obtaining food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; and obtaining the food material processing information of the food material to be processed through a neural network model according to the food material category information of the food material to be processed and the current state information of the food material to be processed.
In one embodiment, the control module 200 is further configured to: acquire actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of actual processing time, actual processing force and actual processing temperature; and determine a food material processing result of the food material to be processed according to the food material processing information and the actual processing information.
In this embodiment, after outputting the food material processing instruction, the control module 200 is further configured to acquire actual processing information of the food material to be processed, and determine a food material processing result of the food material to be processed according to the food material processing information and the actual processing information, so that the food material processing result is ensured to meet the requirement, and the food material processing process is more scientific and reasonable.
For specific limitations of the food material processing robot control device, reference may be made to the above limitations of the food material processing robot control method, which are not repeated here. All or part of the modules in the food material processing robot control device can be implemented by software, hardware, or a combination thereof. The modules can be embedded in or independent of a processor in the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, as shown in fig. 10, there is provided a food material processing system comprising a control apparatus 300 and a food material processing robot 400.
The control device 300 is used for acquiring an image of the food material to be processed, obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model according to the image, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos;
the food material processing robot 400 is configured to process the food material to be processed according to the food material processing information in the food material processing instruction.
When processing food materials, a control device identifies the types of the food materials through a neural network model and obtains corresponding food material processing information, and a food material processing robot processes the food materials according to the food material processing information, that is, different processing processes are executed for different food materials, so that a food material processing mode is more flexible; in addition, the neural network model can learn the food processing information of various different food materials according to different food material processing images and/or videos, so that the application range is wider, and the food material processing is more scientific and reasonable.
In one embodiment, the food material processing system further comprises at least one of:
the timer is used for acquiring the actual processing time of the food material to be processed after the control device outputs the food material processing instruction, and sending the actual processing time to the control device;
the force sensor is used for acquiring current force information of the food material to be processed before the control device outputs the food material processing instruction, and sending the current force information to the control device; and for acquiring the actual processing force of the food material to be processed after the control device outputs the food material processing instruction, and sending the actual processing force to the control device;
the temperature sensor is used for acquiring current temperature information of the food material to be processed before the control device outputs the food material processing instruction, and sending the current temperature information to the control device; and for acquiring the actual processing temperature of the food material to be processed after the control device outputs the food material processing instruction, and sending the actual processing temperature to the control device.
In the embodiment, before the food materials are processed, the current state information of the food materials is obtained, which can help to obtain more scientific food material processing information; in the process of actually processing the food materials, the actual processing information of the food materials to be processed is acquired through one or more of the timer, the force sensor and the temperature sensor, and the food material processing result of the food materials to be processed is determined according to the food material processing information and the actual processing information, so that the food material processing result is ensured to meet the requirement, and the food material processing process is more scientific and reasonable.
For specific limitations of the control device, reference may be made to the above limitations of the food material processing robot control method, which are not repeated here. All or part of the modules in the above control device can be implemented by software, hardware, or a combination thereof. The modules can be embedded in or independent of a processor in the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, there is provided a computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program: acquiring an image of the food material to be processed; obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model according to the image, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used for instructing the food material processing robot to process the food material to be processed according to the food material processing information in the instruction.
In one embodiment, the processor when executing the computer program further performs the steps of: acquiring current state information of food materials to be processed, wherein the current state information comprises at least one of the following three items: current force information, current temperature information, and current time information.
In one embodiment, the processor when executing the computer program further performs the steps of: obtaining food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; and obtaining the food material processing information of the food material to be processed through a neural network model according to the food material category information of the food material to be processed and the current state information of the food material to be processed.
In one embodiment, the processor when executing the computer program further performs the steps of: acquiring actual processing information of food materials to be processed, wherein the actual processing information comprises at least one of actual processing time, actual processing force and actual processing temperature; and determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information.
In one embodiment, a computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, performs the steps of: acquiring an image of the food material to be processed; obtaining food material type information of the food material to be processed and corresponding food material processing information through a neural network model according to the image, and outputting a corresponding food material processing instruction, wherein the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used for instructing the food material processing robot to process the food material to be processed according to the food material processing information in the instruction.
In one embodiment, the computer program when executed by the processor further performs the steps of: acquiring current state information of food materials to be processed, wherein the current state information comprises at least one of the following three items: current force information, current temperature information, and current time information.
In one embodiment, the computer program when executed by the processor further performs the steps of: obtaining food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; and obtaining the food material processing information of the food material to be processed through a neural network model according to the food material category information of the food material to be processed and the current state information of the food material to be processed.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of actual processing time, actual processing force and actual processing temperature; and determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus DRAM (RDRAM), and direct Rambus DRAM (DRDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations are described, but any combination of these technical features should be considered within the scope of this disclosure as long as it contains no contradiction.
The above embodiments express only several implementations of the present invention, and their description is specific and detailed, but they should not be understood as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, and these fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A food material processing robot control method, comprising:
acquiring an image of food material to be processed;
obtaining the food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; obtaining food material processing information of the food material to be processed through a neural network model according to the food material type information of the food material to be processed and the current state information of the food material to be processed, wherein the food material processing information comprises a food material processing mode and food material processing parameters, the food material processing mode is the mode adopted to process the food material, the food material processing parameters are condition/environment parameters of the food material processing process and comprise at least one of a time parameter, a temperature parameter and a force parameter, and the neural network model is trained on food material processing images and/or videos;
outputting a corresponding food material processing instruction, wherein the food material processing instruction is generated according to the food material type information and the food material processing information, and the food material processing instruction is used for instructing the food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction;
acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of actual processing time, actual processing force and actual processing temperature; determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information, wherein the food material processing result comprises whether the food material processing is finished; and if the actual processing environment of the food material cannot meet the food material processing parameters in the food material processing information, re-acquiring the corresponding food material processing parameters through the neural network model and re-outputting the food material processing instruction;
when the neural network model is trained, pose information of the food material processing robot at each image/time point in the food material processing images and/or videos is extracted, and the pose information of the food material processing robot is then used as input of the neural network model to complete the training; the training data further comprise at least one of force information in the food material processing process, temperature information in the food material processing process, and time information in the food material processing process.
2. The food material processing robot control method of claim 1, wherein the image of the food material to be processed is a single image, multiple images or a video consisting of multiple frames of images.
3. The food material processing robot control method of claim 1, wherein the food material processing robot is of a robot type having a manipulator structure.
4. The food material handling robot control method of claim 1, wherein the food material processing robot is of the type comprising a manipulator and an actuator.
5. A food material handling robot control apparatus, comprising:
the food processing device comprises an image acquisition module, a processing module and a processing module, wherein the image acquisition module is used for acquiring an image of food materials to be processed;
a control module, used for obtaining the food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; obtaining food material processing information of the food material to be processed through a neural network model according to the food material type information of the food material to be processed and the current state information of the food material to be processed, and outputting a corresponding food material processing instruction, wherein the food material processing instruction is generated according to the food material type information and the food material processing information, the food material processing information comprises a food material processing mode and food material processing parameters, the food material processing mode is the mode adopted to process the food material, the food material processing parameters are condition/environment parameters of the food material processing process and comprise at least one of a time parameter, a temperature parameter and a force parameter, the neural network model is trained on food material processing images and/or videos, and the food material processing instruction is used for instructing the food material processing robot to process the food material to be processed according to the food material processing information in the instruction; when the neural network model is trained, pose information of the food material processing robot at each image/time point in the food material processing images and/or videos is extracted and used as input of the neural network model to complete the training, and the training data further comprise at least one of force information in the food material processing process, temperature information in the food material processing process, and time information in the food material processing process;
The control module is also used for: acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of an actual processing time, an actual processing force and an actual processing temperature; determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information, wherein the food material processing result comprises whether the food material processing is finished; and, if the actual processing environment of the food material cannot meet the food material processing parameters in the food material processing information, re-acquiring the corresponding food material processing parameters through the neural network model and re-outputting the food material processing instruction.
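The completion check the control module performs — comparing the required processing parameters against the actual processing information — can be sketched as follows. This is an illustrative assumption only: the names (`ProcessingInfo`, `determine_result`) and the "actual meets or exceeds required" rule are not taken from the patent, which does not disclose a concrete comparison.

```python
# Hypothetical sketch of the claimed completion check; names and the
# comparison rule are assumptions for illustration, not the patent's
# actual implementation.
from dataclasses import dataclass


@dataclass
class ProcessingInfo:
    mode: str             # food material processing mode, e.g. "stir-fry"
    time_s: float         # time parameter (seconds)
    temperature_c: float  # temperature parameter (Celsius)
    force_n: float        # force parameter (newtons)


def determine_result(required: ProcessingInfo, actual_time_s: float,
                     actual_temp_c: float, actual_force_n: float) -> bool:
    """Return True when the actual processing information satisfies every
    required processing parameter, i.e. processing is finished."""
    return (actual_time_s >= required.time_s
            and actual_temp_c >= required.temperature_c
            and actual_force_n >= required.force_n)
```

In this sketch, a `False` result would correspond to the claim's fallback branch: re-acquiring processing parameters from the neural network model and re-outputting the instruction.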
6. The food material processing robot control device of claim 5, wherein the control module is further configured to:
acquire current state information of the food material to be processed, wherein the current state information comprises at least one of: current force information, current temperature information and current time information.
7. A food material processing system, comprising:
the control device is used for: acquiring an image of the food material to be processed; obtaining food material type information of the food material to be processed through a neural network model according to the image; obtaining food material processing information of the food material to be processed through the neural network model according to the food material type information and the current state information of the food material to be processed, and outputting a corresponding food material processing instruction, wherein the food material processing instruction is generated according to the food material type information and the food material processing information; the food material processing information comprises a food material processing mode and a food material processing parameter, the food material processing mode being the manner in which the food material is processed, and the food material processing parameter being a condition/environment parameter of the food material processing process, comprising at least one of a time parameter, a temperature parameter and a force parameter; the neural network model is trained on food material processing images and/or videos; when the neural network model is trained, pose information of the food material processing robot is extracted at each image/time point in the food material processing images and/or videos, and the training of the neural network model is completed by taking the pose information of the food material processing robot as an input of the model, the training data further comprising at least one of force information, temperature information and time information of the food material processing process;
The food material processing robot is used for processing the food material to be processed according to the food material processing information in the food material processing instruction;
the control device is also used for: acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of an actual processing time, an actual processing force and an actual processing temperature; determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information, wherein the food material processing result comprises whether the food material processing is finished; and, if the actual processing environment of the food material cannot meet the food material processing parameters in the food material processing information, re-acquiring the corresponding food material processing parameters through the neural network model and re-outputting the food material processing instruction.
8. The food material processing system of claim 7, further comprising at least one of:
the timer is used for acquiring the actual processing time of the food material to be processed after the control device outputs a food material processing instruction, and sending the actual processing time to the control device;
the force sensor is used for acquiring current force information of the food material to be processed before the control device outputs a food material processing instruction and sending the current force information to the control device, and for acquiring the actual processing force of the food material to be processed after the control device outputs the food material processing instruction and sending the actual processing force to the control device;
The temperature sensor is used for acquiring current temperature information of the food material to be processed before the control device outputs a food material processing instruction and sending the current temperature information to the control device, and for acquiring the actual processing temperature of the food material to be processed after the control device outputs the food material processing instruction and sending the actual processing temperature to the control device.
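The timer, force sensor and temperature sensor of claim 8 all follow the same reporting pattern: push a reading to the control device either before the instruction is output ("current" state) or after it ("actual" processing). A minimal sketch of that path, with all names (`ControlDevice`, `receive`, the phase strings) assumed for illustration rather than drawn from the patent:

```python
# Illustrative-only sketch of the sensor-to-control-device reporting
# path described in claim 8; names and phases are assumptions.
class ControlDevice:
    def __init__(self):
        self.current = {}  # state gathered before the instruction is output
        self.actual = {}   # measurements gathered during processing

    def receive(self, phase: str, name: str, value: float) -> None:
        """Store a sensor reading under the appropriate phase."""
        target = self.current if phase == "before" else self.actual
        target[name] = value


device = ControlDevice()
device.receive("before", "force_n", 0.0)          # force sensor, pre-instruction
device.receive("before", "temperature_c", 22.0)   # temperature sensor, pre-instruction
device.receive("after", "time_s", 120.0)          # timer, during processing
device.receive("after", "force_n", 5.2)           # force sensor, during processing
device.receive("after", "temperature_c", 182.0)   # temperature sensor, during processing
```

The "before" readings would correspond to the current state information fed into the neural network model; the "after" readings correspond to the actual processing information used to determine the processing result.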
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, performs the steps of:
acquiring an image of food material to be processed;
obtaining food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; obtaining food material processing information of the food material to be processed through the neural network model according to the food material type information and the current state information of the food material to be processed, wherein the food material processing information comprises a food material processing mode and food material processing parameters, the food material processing mode being the manner in which the food material is processed, and the food material processing parameters being condition/environment parameters of the food material processing process, comprising at least one of a time parameter, a temperature parameter and a force parameter, the neural network model being trained on food material processing images and/or videos;
Outputting a corresponding food material processing instruction, wherein the food material processing instruction is generated according to the food material type information and the food material processing information, and the food material processing instruction is used for instructing a food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction;
acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of an actual processing time, an actual processing force and an actual processing temperature; determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information, wherein the food material processing result comprises whether the food material processing is finished; if the actual processing environment of the food material cannot meet the food material processing parameters in the food material processing information, re-acquiring the corresponding food material processing parameters through the neural network model and re-outputting the food material processing instruction;
when the neural network model is trained, pose information of the food material processing robot at each image/time point in the food material processing images and/or videos is extracted, and the training of the neural network model is completed by taking the pose information of the food material processing robot as an input of the neural network model.
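The training step above takes the robot's pose at each image/time point as model input, optionally joined by force, temperature and time information. One way such a training input might be assembled is sketched below; the function name, the six-element pose convention (x, y, z, roll, pitch, yaw) and the flat-vector layout are assumptions for illustration, since the patent does not specify an encoding.

```python
# Hedged sketch of assembling one per-time-point training input as the
# claims describe: pose first, then any available process signals.
# The pose convention and vector layout are illustrative assumptions.
def build_training_input(pose, force_n=None, temperature_c=None, time_s=None):
    """Flatten a pose (e.g. x, y, z, roll, pitch, yaw) plus any optional
    force/temperature/time signals into a single feature vector."""
    features = [float(p) for p in pose]
    for extra in (force_n, temperature_c, time_s):
        if extra is not None:  # claims require at least one of these
            features.append(float(extra))
    return features
```

One such vector would be produced per image or time point in the food material processing images/videos, paired with whatever target the model is trained against.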
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of:
acquiring an image of food material to be processed;
obtaining food material type information of the food material to be processed through a neural network model according to the image of the food material to be processed; obtaining food material processing information of the food material to be processed through the neural network model according to the food material type information and the current state information of the food material to be processed, wherein the food material processing information comprises a food material processing mode and food material processing parameters, the food material processing mode being the manner in which the food material is processed, and the food material processing parameters being condition/environment parameters of the food material processing process, comprising at least one of a time parameter, a temperature parameter and a force parameter, the neural network model being trained on food material processing images and/or videos;
outputting a corresponding food material processing instruction, wherein the food material processing instruction is generated according to the food material type information and the food material processing information, and the food material processing instruction is used for instructing a food material processing robot to process the food material to be processed according to the food material processing information in the food material processing instruction;
Acquiring actual processing information of the food material to be processed, wherein the actual processing information comprises at least one of an actual processing time, an actual processing force and an actual processing temperature; determining a food material processing result of the food material to be processed according to the food material processing information and the actual processing information, wherein the food material processing result comprises whether the food material processing is finished; if the actual processing environment of the food material cannot meet the food material processing parameters in the food material processing information, re-acquiring the corresponding food material processing parameters through the neural network model and re-outputting a food material processing instruction;
when the neural network model is trained, pose information of the food material processing robot at each image/time point in the food material processing images and/or videos is extracted, and the training of the neural network model is completed by taking the pose information of the food material processing robot as an input of the neural network model.
CN201811081931.3A 2018-09-17 2018-09-17 Food material processing robot control method, device and system, storage medium and equipment Active CN109434844B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811081931.3A CN109434844B (en) 2018-09-17 2018-09-17 Food material processing robot control method, device and system, storage medium and equipment
PCT/CN2019/105912 WO2020057455A1 (en) 2018-09-17 2019-09-16 Food material processing robot control method, apparatus, system, storage medium and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811081931.3A CN109434844B (en) 2018-09-17 2018-09-17 Food material processing robot control method, device and system, storage medium and equipment

Publications (2)

Publication Number Publication Date
CN109434844A CN109434844A (en) 2019-03-08
CN109434844B true CN109434844B (en) 2022-06-28

Family

ID=65532869

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811081931.3A Active CN109434844B (en) 2018-09-17 2018-09-17 Food material processing robot control method, device and system, storage medium and equipment

Country Status (2)

Country Link
CN (1) CN109434844B (en)
WO (1) WO2020057455A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109434844B (en) * 2018-09-17 2022-06-28 鲁班嫡系机器人(深圳)有限公司 Food material processing robot control method, device and system, storage medium and equipment
CN109998360B (en) * 2019-04-11 2021-03-26 上海长膳智能科技有限公司 Method and device for automatically cooking food
CN111151368B (en) * 2020-01-09 2021-04-02 珠海格力电器股份有限公司 Garbage treatment method, system, storage medium and garbage treatment equipment
CN111814862A (en) * 2020-06-30 2020-10-23 平安国际智慧城市科技股份有限公司 Fruit and vegetable identification method and device
CN111914777B (en) * 2020-08-07 2021-07-06 广东工业大学 Method and system for identifying robot instruction in cross-mode manner

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3536551B2 (en) * 1996-10-03 2004-06-14 松下電器産業株式会社 Cooking device
US6650779B2 (en) * 1999-03-26 2003-11-18 Georgia Tech Research Corp. Method and apparatus for analyzing an image to detect and identify patterns
US6468069B2 (en) * 1999-10-25 2002-10-22 Jerome H. Lemelson Automatically optimized combustion control
CN102294695A (en) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 Robot calibration method and calibration system
CN104914720B (en) * 2015-04-16 2017-11-17 贵州省烟草公司遵义市公司 Electronic nose intelligence baking control system and control method with autolearn feature
JP6522488B2 (en) * 2015-07-31 2019-05-29 ファナック株式会社 Machine learning apparatus, robot system and machine learning method for learning work taking-out operation
CN105512676A (en) * 2015-11-30 2016-04-20 华南理工大学 Food recognition method at intelligent terminal
CN105444222B (en) * 2015-12-11 2017-11-14 美的集团股份有限公司 Cooking control method, system, Cloud Server and the micro-wave oven of micro-wave oven
CN105867257A (en) * 2016-05-30 2016-08-17 深圳市泰瑞达科技有限公司 Intelligent pot and control and compiling method thereof
CN106878697A (en) * 2016-06-29 2017-06-20 鲁班嫡系机器人 A kind of image pickup method and its imaging method, device and equipment
CN106774876B (en) * 2016-12-12 2020-07-28 快创科技(大连)有限公司 Cooking auxiliary system based on AR augmented reality technology and menu generation
CN106897661B (en) * 2017-01-05 2020-03-27 合肥美的智能科技有限公司 Intelligent identification method and system for food material image and household appliance
CN108197635B (en) * 2017-11-29 2020-05-29 珠海格力电器股份有限公司 Cooking mode display method and device and range hood
CN108098773A (en) * 2017-12-20 2018-06-01 芜湖哈特机器人产业技术研究院有限公司 The sorting control system and method for a kind of robot
CN108303920A (en) * 2018-01-18 2018-07-20 周晔 A kind of intelligent electric appliance integrated management approach and system based on food materials identification
CN108527388A (en) * 2018-04-20 2018-09-14 成都昂联科技有限公司 A kind of the kitchen intelligent robot and its operating method of Behavior-based control analysis
CN109434844B (en) * 2018-09-17 2022-06-28 鲁班嫡系机器人(深圳)有限公司 Food material processing robot control method, device and system, storage medium and equipment

Also Published As

Publication number Publication date
WO2020057455A1 (en) 2020-03-26
CN109434844A (en) 2019-03-08

Similar Documents

Publication Publication Date Title
CN109434844B (en) Food material processing robot control method, device and system, storage medium and equipment
EP3068271B1 (en) Smart cooking apparatus and method
Bollini et al. Interpreting and executing recipes with a cooking robot
CN110289079A (en) Wired home recipe supplying system, the method and device for pushing recipe
AU2015220546A1 (en) Methods and systems for food preparation in a robotic cooking kitchen
US20200000130A1 (en) Recipe-based cooking aid
Pei et al. Model-based on fishery management systems with selective harvest policies
US11926036B2 (en) Information processing device and scheduling method
Dongre et al. Applications of artificial neural networks for enhanced livestock productivity: A review
KR102540268B1 (en) A method and an apparatus for measuring the similarity of food for personalized food recommendation using a tree
Hilborn et al. Stock and recruitment
CN111096656B (en) Cooking assistance method, apparatus, computer device and storage medium
CN112107232A (en) Cooking control method, device and equipment and readable storage medium
CN109359624A (en) The prediction technique and prediction meanss of laying hen foraging behaviour neural network based
CN110794718B (en) Multi-food cooking method, terminal and storage medium
Derossi et al. Avenues for non-conventional robotics technology applications in the food industry
CN112906758A (en) Training method, recognition method and equipment of food material freshness recognition model
CN114206178A (en) Use of food processor
US20230092168A1 (en) Information processing device, information processing terminal, and information processing method
US20230128796A1 (en) Control apparatus and control method
Damayanti et al. Applying Hazard Analysis Critical Control Point (HACCP) for production process of oyster mushroom chips in small and medium enterprise (SME)
KR20240014911A (en) Frying robot controlling system for actively adjusting cooking conditions
Kukharev et al. RESEARCH RESULTS OF THE DEVICE FOR DRYING AGRICULTURAL CROPS
Demeure et al. What We Learn about Process Specification Languages, from Studying Recipes
CN115291782A (en) Method, system, computer readable medium and electronic device for automatically generating menu

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant