CN113485130B - Structured data generation and operation method, kitchen robot and storage medium - Google Patents

Info

Publication number
CN113485130B
CN113485130B (application CN202110678876.1A)
Authority
CN
China
Prior art keywords
user
job
steps
kitchen robot
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110678876.1A
Other languages
Chinese (zh)
Other versions
CN113485130A (en)
Inventor
周川艳
何吾佳
戴逸婷
丛漪
覃梦鸽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tineco Intelligent Technology Co Ltd
Original Assignee
Tineco Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tineco Intelligent Technology Co Ltd
Priority to CN202110678876.1A
Related application CN202310862225.7A published as CN117389162A
Publication of CN113485130A
Application granted
Publication of CN113485130B
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer electric
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J: KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J27/00: Cooking-vessels
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2643: Oven, cooking
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Food Science & Technology (AREA)
  • Manipulator (AREA)

Abstract

The embodiments of the present application provide a structured data generation and operation method, a kitchen robot, and a storage medium. In the embodiments, during the generation of structured data, room for user intervention is reserved at least for the class of operation steps that depend on a data object, and corresponding prompt information is set to guide the user's intervention, so that semi-automatic structured data, rather than fully automatic structured data, is ultimately generated. In this way, when any kitchen robot executes a job task according to the semi-automatic structured data and reaches an operation step provided with prompt information, it can output the prompt information to prompt the user to perform an auxiliary operation adapted to that operation step, thereby completing the job task in cooperation with the kitchen robot. Because the user is allowed to adjust and intervene in the process in which the kitchen robot executes a job task, the kitchen robot has greater flexibility in executing job tasks and can better meet the user's operation requirements.

Description

Structured data generation and operation method, kitchen robot and storage medium
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a structured data generation and operation method, a kitchen robot and a storage medium.
Background
With the rapid development of artificial intelligence, more and more intelligent machines are being applied in daily life, such as intelligent cooking machines. Using an intelligent cooking machine, a user can complete an automatic cooking process with only a few participation steps, which brings great convenience to cooking food.
In the prior art, some electronic recipes exist, and an intelligent cooking machine can automatically cook various dishes according to such electronic recipes. However, the intelligent cooking machine lacks flexibility when cooking food according to an existing electronic recipe, which may result in unsatisfactory quality of the cooked food and fail to meet the cooking demands of users.
Disclosure of Invention
Aspects of the present application provide a structured data generation and operation method, a kitchen robot, and a storage medium for improving flexibility in performing operation tasks.
The embodiment of the application provides a structured data generation method, which comprises the following steps: recording, in the process of the kitchen robot executing a job task, a plurality of operation steps executed by the kitchen robot and the execution sequence among the plurality of operation steps; generating prompt information corresponding to a first type of operation step among the plurality of operation steps, wherein the first type of operation step at least comprises operation steps needing to depend on a data object; and generating reference structured data according to the plurality of operation steps, the execution sequence among the plurality of operation steps, and the prompt information corresponding to the first type of operation steps. The prompt information is used for prompting a user, when any kitchen robot executes the job task according to the reference structured data and reaches the first type of operation step, to perform an auxiliary operation adapted to the first type of operation step, so as to complete the job task in cooperation with that kitchen robot.
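As an illustrative sketch only (not part of the patent; all class and field names are hypothetical), the reference structured data described above can be pictured as an ordered list of operation steps in which first-type steps additionally carry prompt information:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class JobStep:
    name: str                      # e.g. "preheat", "stir"
    params: dict = field(default_factory=dict)
    prompt: Optional[str] = None   # set only for first-type steps

@dataclass
class ReferenceStructuredData:
    task_name: str
    steps: list                    # list order encodes the execution sequence

recipe = ReferenceStructuredData(
    task_name="stir-fried vegetables",
    steps=[
        JobStep("preheat", {"temp_c": 180}),
        JobStep("add_ingredient", prompt="Please add the main ingredient"),
        JobStep("stir", {"seconds": 90}),
    ],
)

# first-type steps are exactly those carrying prompt information
first_type = [s for s in recipe.steps if s.prompt is not None]
```

The list order stands in for the recorded execution sequence; a real implementation might instead store explicit sequence numbers.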
The embodiment of the application also provides a structured data generation method, which comprises the following steps: recording, in the process of the kitchen robot executing a job task, a plurality of operation steps executed by the kitchen robot and the execution sequence among the plurality of operation steps; for a third type of operation step among the plurality of operation steps, responding to a configuration operation of a user and recording attribute information, configured by the user for the third type of operation step, of the data object on which the third type of operation step depends, wherein the third type of operation step is an operation step, among the plurality of operation steps, that needs to depend on a data object but for which no data object identification information is given; and generating follow-up structured data according to the plurality of operation steps, the execution sequence among the plurality of operation steps, and the attribute information of the data objects on which the third type of operation steps depend.
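A minimal sketch of this third-type configuration step, under the assumption that steps and attribute information are plain dictionaries (all field names here are hypothetical, not from the patent):

```python
def record_third_type_step(step, attrs):
    """Return a copy of the step with the user-configured attribute
    information of the data object it depends on attached."""
    configured = dict(step)
    configured["data_object"] = dict(attrs)
    return configured

# a recorded step that depends on a data object whose identity was not given
raw_step = {"name": "add_ingredient"}

# the user supplies the attributes through a configuration operation
configured = record_third_type_step(raw_step, {"id": "carrot", "weight_g": 200})

# follow-up structured data: steps in execution order, now fully specified
follow_up_data = {"task": "demo dish", "steps": [{"name": "preheat"}, configured]}
```

Because every dependent data object is fully specified, data shaped like this could drive the robot without further user input.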
The embodiment of the application also provides a job task execution method, which comprises the following steps: acquiring reference structured data, wherein the reference structured data comprises a plurality of operation steps of the job task executed by the kitchen robot, the execution sequence among the plurality of operation steps, and prompt information corresponding to a first type of operation step among the plurality of operation steps; executing the plurality of operation steps sequentially according to the execution sequence among them; and, when the first type of operation step is executed, outputting the prompt information to prompt a user to perform an auxiliary operation adapted to the first type of operation step, so as to complete the job task in cooperation with the kitchen robot.
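The execution method above can be sketched as a simple loop; this is an illustration only, with hypothetical function names standing in for the robot's actuator, output channel, and user-confirmation mechanism:

```python
def execute_job_task(steps, robot_execute, notify_user, wait_for_user):
    """Execute steps in order; for a first-type step (one carrying
    prompt information) output the prompt and wait for the user's
    auxiliary operation before moving on."""
    log = []
    for step in steps:
        if step.get("prompt"):            # first-type step
            notify_user(step["prompt"])   # e.g. on-screen text or speech
            wait_for_user(step)           # block until the user confirms
            log.append(("assisted", step["name"]))
        else:
            robot_execute(step)
            log.append(("auto", step["name"]))
    return log

steps = [
    {"name": "preheat"},
    {"name": "add_ingredient", "prompt": "Please add the food material"},
    {"name": "stir"},
]
log = execute_job_task(
    steps,
    robot_execute=lambda s: None,   # stand-in for the robot's actuator
    notify_user=lambda msg: None,   # stand-in for screen/voice output
    wait_for_user=lambda s: None,   # stand-in for user confirmation
)
```

The presence of prompt information is what distinguishes a semi-automatic (assisted) step from a fully automatic one in this sketch.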
The embodiment of the application also provides a structured data generation method, applicable to a terminal device, which comprises the following steps: displaying a job control interface, and controlling the kitchen robot to execute a job task based on controls on the job control interface; receiving and displaying, in the process of the kitchen robot executing the job task, the operation step currently being executed by the kitchen robot; when the currently executed operation step is a designated operation step, responding to a marking operation by which the user marks the currently executed operation step as a first type of operation step, and generating prompt information corresponding to the first type of operation step; and generating reference structured data according to the operation steps sequentially executed by the kitchen robot and the prompt information corresponding to the first type of operation steps. The prompt information is used for prompting a user, when any kitchen robot executes the job task according to the reference structured data and reaches the first type of operation step, to perform an auxiliary operation adapted to the first type of operation step, so as to complete the job task in cooperation with that kitchen robot.
The embodiment of the application also provides a structured data generation method, applicable to a terminal device, which comprises the following steps: displaying a job control interface, and controlling the kitchen robot to execute a job task based on controls on the job control interface; receiving and displaying, in the process of the kitchen robot executing the job task, the operation step currently being executed by the kitchen robot; when the currently executed operation step is a designated operation step, displaying an object configuration interface in response to a marking operation that marks the currently executed operation step as a third type of operation step, wherein the third type of operation step is an operation step that needs to depend on a data object; responding to the user's configuration operation on the object configuration interface, and recording the attribute information, configured by the user for the third type of operation step, of the data object on which that step depends; and generating follow-up structured data according to the operation steps sequentially executed by the kitchen robot and the attribute information of the data objects on which the third type of operation steps depend.
The embodiment of the application also provides a kitchen robot, which comprises: a memory and a processor; the memory for storing a computer program; the processor, coupled with the memory, for executing the computer program to: record, in the process of the kitchen robot executing a job task, a plurality of operation steps executed by the kitchen robot and the execution sequence among the plurality of operation steps; generate prompt information corresponding to a first type of operation step among the plurality of operation steps, wherein the first type of operation step at least comprises operation steps needing to depend on a data object; and generate reference structured data according to the plurality of operation steps, the execution sequence among the plurality of operation steps, and the prompt information corresponding to the first type of operation steps. The prompt information is used for prompting a user, when any kitchen robot executes the job task according to the reference structured data and reaches the first type of operation step, to perform an auxiliary operation adapted to the first type of operation step, so as to complete the job task in cooperation with that kitchen robot.
The embodiment of the application also provides a kitchen robot, which comprises: a memory and a processor; the memory for storing a computer program; the processor, coupled with the memory, for executing the computer program to: record, in the process of the kitchen robot executing a job task, a plurality of operation steps executed by the kitchen robot and the execution sequence among the plurality of operation steps; for a third type of operation step among the plurality of operation steps, respond to a configuration operation of a user and record attribute information, configured by the user for the third type of operation step, of the data object on which the third type of operation step depends, wherein the third type of operation step is an operation step that needs to depend on a data object but for which no data object identification information is given; and generate follow-up structured data according to the plurality of operation steps, the execution sequence among the plurality of operation steps, and the attribute information of the data objects on which the third type of operation steps depend.
The embodiment of the application also provides a kitchen robot, which comprises: a memory and a processor; the memory for storing a computer program; the processor, coupled with the memory, for executing the computer program to: acquire reference structured data, wherein the reference structured data comprises a plurality of operation steps of the job task executed by the kitchen robot, the execution sequence among the plurality of operation steps, and prompt information corresponding to a first type of operation step among the plurality of operation steps; execute the plurality of operation steps sequentially according to the execution sequence among them; and, when the first type of operation step is executed, output the prompt information to prompt a user to perform an auxiliary operation adapted to the first type of operation step, so as to complete the job task in cooperation with the kitchen robot.
The embodiment of the application also provides a terminal device, which comprises: a memory and a processor; the memory for storing a computer program; the processor, coupled with the memory, for executing the computer program to: display a job control interface and control the kitchen robot to execute a job task based on controls on the job control interface; receive and display, in the process of the kitchen robot executing the job task, the operation step currently being executed by the kitchen robot; when the currently executed operation step is a designated operation step, respond to a marking operation by which the user marks the currently executed operation step as a first type of operation step, and generate prompt information corresponding to the first type of operation step; and generate reference structured data according to the operation steps sequentially executed by the kitchen robot and the prompt information corresponding to the first type of operation steps. The prompt information is used for prompting a user, when any kitchen robot executes the job task according to the reference structured data and reaches the first type of operation step, to perform an auxiliary operation adapted to the first type of operation step, so as to complete the job task in cooperation with that kitchen robot.
The embodiment of the application also provides a terminal device, which comprises: a memory and a processor; the memory for storing a computer program; the processor, coupled with the memory, for executing the computer program to: display a job control interface and control the kitchen robot to execute a job task based on controls on the job control interface; receive and display, in the process of the kitchen robot executing the job task, the operation step currently being executed by the kitchen robot; when the currently executed operation step is a designated operation step, display an object configuration interface in response to a marking operation that marks the currently executed operation step as a third type of operation step, wherein the third type of operation step is an operation step that needs to depend on a data object; respond to the user's configuration operation on the object configuration interface and record the attribute information, configured by the user for the third type of operation step, of the data object on which that step depends; and generate follow-up structured data according to the operation steps sequentially executed by the kitchen robot and the attribute information of the data objects on which the third type of operation steps depend.
The embodiment of the application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the steps of the structured data generation methods and the job task execution method provided by the embodiments of the application.
In the embodiments of the present application, during the generation of structured data, room for user intervention is reserved at least for the class of operation steps that depend on a data object, and corresponding prompt information is set to guide the user's intervention, so that semi-automatic structured data, rather than fully automatic structured data, is ultimately generated. In this way, when any kitchen robot executes a job task according to the semi-automatic structured data and reaches an operation step provided with prompt information, it can output the prompt information to prompt the user to perform an auxiliary operation adapted to that operation step, thereby completing the job task in cooperation with the kitchen robot. Because the user is allowed to adjust and intervene in the process in which the kitchen robot executes a job task, the kitchen robot has greater flexibility in executing job tasks and can better meet the user's operation requirements.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1a is a flow chart of a method for generating structured data according to an exemplary embodiment of the present application;
FIG. 1b is a flowchart illustrating steps involved in a method for generating structured data prior to a kitchen robot performing a task, according to an exemplary embodiment of the present application;
FIG. 1c is a flow chart of another method for generating structured data according to an exemplary embodiment of the present application;
FIG. 1d is a flowchart illustrating a further method for generating structured data according to an exemplary embodiment of the present application;
FIG. 1e is a flow chart of yet another method for generating structured data according to an exemplary embodiment of the present application;
FIG. 2a is a flowchart of a task execution method according to an exemplary embodiment of the present application;
FIG. 2b is a schematic structural diagram of a voice module according to an exemplary embodiment of the present application;
FIG. 2c is a schematic diagram of a speech system according to an exemplary embodiment of the present application;
FIG. 3 is a schematic structural view of a cooking apparatus according to an exemplary embodiment of the present application;
FIG. 4a is a schematic diagram of an authoring mode display interface;
FIG. 4b is a schematic illustration of a pre-editing interface;
FIG. 4c is a schematic illustration of another pre-editing interface;
FIG. 4d is a schematic view of a cartridge weighing interface;
FIG. 4e is a schematic view of another cartridge weighing interface;
FIG. 4f is a schematic diagram of a cooking control interface;
FIG. 4g is a schematic diagram of another cooking control interface;
FIG. 4h is a schematic diagram of an authoring preview interface;
FIG. 5 is a schematic structural view of a kitchen robot according to an exemplary embodiment of the present application;
FIG. 6 is a schematic structural diagram of a terminal device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and the corresponding drawings. It will be apparent that the described embodiments are only some, rather than all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of protection of the application.
In the prior art, the user cannot intervene while food is cooked according to an existing electronic recipe: the intelligent cooking machine simply cooks according to the operation steps in the electronic recipe, which lacks flexibility, may lead to unsatisfactory quality of the cooked food, and cannot meet the user's cooking demands. In view of these problems, in the embodiments of the present application, during the generation of structured data, room for user intervention is reserved at least for the class of operation steps that depend on a data object, and corresponding prompt information is set to guide the user's intervention, so that semi-automatic structured data, rather than fully automatic structured data, is ultimately generated. In this way, when any kitchen robot executes a job task according to the semi-automatic structured data and reaches an operation step provided with prompt information, it can output the prompt information to prompt the user to perform an auxiliary operation adapted to that operation step, thereby completing the job task in cooperation with the kitchen robot. Because the user is allowed to adjust and intervene in the process in which the kitchen robot executes a job task, the kitchen robot has greater flexibility in executing job tasks and can better meet the user's operation requirements.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
Fig. 1a is a flow chart of a method for generating structured data according to an exemplary embodiment of the present application, as shown in fig. 1a, the method includes:
101a, recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
102a, generating prompt information corresponding to a first type of operation step in a plurality of operation steps, wherein the first type of operation step at least comprises operation steps needing to depend on a data object;
103a, generating reference structured data according to a plurality of operation steps, an execution sequence among the plurality of operation steps and prompt information corresponding to the first type of operation steps; the prompting information is used for prompting a user to execute auxiliary operation adapted to the first type of operation step when any kitchen robot executes the operation task according to the reference structured data and executes the first type of operation step, so as to complete the operation task in cooperation with any kitchen robot.
In this embodiment, the kitchen robot may be any electronic device capable of autonomously performing a job task in a kitchen environment based on structured data, such as, but not limited to: an intelligent frying machine, an intelligent electric cooker, an intelligent electric baking pan, an intelligent dishwasher, and the like. The job task performed may differ depending on the kind of kitchen robot; for example, an intelligent cooker may perform dish-cooking tasks, while an intelligent rice cooker may perform rice-cooking tasks.
In this embodiment, the kitchen robot may establish a binding relationship with a terminal device, and the terminal device may be any terminal device capable of providing an interface operation function, for example, a mobile phone, a tablet computer, a smart watch, a notebook computer, a desktop computer, or the like.
In this embodiment, the kitchen robot may perform a job task in, but not limited to, the following several implementations, which are described by way of example below.
Embodiment A1: the kitchen robot may receive a job instruction instructing it to execute a job task according to basic structured data. In this case, the kitchen robot may acquire the basic structured data, which includes the plurality of operation steps required for the kitchen robot to execute the job task and the execution sequence among the plurality of operation steps; further, the plurality of operation steps are executed sequentially, in the execution sequence contained in the basic structured data, to complete the job task.
Alternatively, the job instruction may be issued to the kitchen robot by the user through a display screen of the kitchen robot; or the user may issue the job instruction to the kitchen robot by voice; or the user may send the job instruction to the kitchen robot through an APP on the terminal device bound to the kitchen robot; or a server may issue the job instruction to the kitchen robot.
Alternatively, the basic structured data may be structured data preset in the kitchen robot, in which case it may be acquired locally. Alternatively, the basic structured data may be downloaded from a network or a server, the data having been produced and published to the network or server by other users.
Embodiment A2: the user may continuously send control instructions to the kitchen robot, controlling it to execute a plurality of operation steps in sequence so as to complete the job task. Optionally, the user may continuously send control instructions to the kitchen robot through a display screen on the terminal device or a display screen on the kitchen robot, or by voice, where each control instruction instructs the kitchen robot to perform one or more operation steps. The kitchen robot, in turn, continuously receives the control instructions sent by the user and executes the plurality of operation steps sequentially, in the order in which the control instructions were received, so as to complete the job task.
In embodiment A2, the order in which the user sends the control instructions to the kitchen robot reflects the execution sequence among the plurality of operation steps executed by the kitchen robot.
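A minimal sketch of this recording process (all names are hypothetical; a queue stands in for the stream of incoming control instructions), in which the arrival order of instructions becomes the recorded execution sequence:

```python
import queue

def record_job_steps(instruction_queue, stop_token="DONE"):
    """Record operation steps in the order the control instructions
    arrive; that arrival order becomes the execution sequence."""
    recorded = []
    while True:
        instr = instruction_queue.get()
        if instr == stop_token:
            break
        recorded.append({"order": len(recorded) + 1, "step": instr})
    return recorded

q = queue.Queue()
for instr in ["preheat", "add_oil", "stir_fry", "DONE"]:
    q.put(instr)
steps = record_job_steps(q)
```

The recorded list, together with any prompt information attached later, is what the structured data is assembled from.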
Whichever manner of executing the job task is adopted, the kitchen robot may, while executing the job task, record the operation steps it performs and the execution sequence among those steps so as to generate structured data. This embodiment is mainly concerned with generating semi-automatic structured data. Fully automatic structured data guides the kitchen robot to complete the job task automatically, without user participation; semi-automatic structured data, in contrast, guides the user to participate in the process in which the kitchen robot executes the job task, and the kitchen robot can complete the job task only with the user's cooperation. For ease of distinction and description, in the embodiments of the present application such semi-automatic structured data is referred to as reference structured data; correspondingly, fully automatic structured data is referred to as follow-up structured data.
In this embodiment, the process by which the kitchen robot generates the reference structured data is, in effect, a process of recording the operation steps executed during the job task and the execution sequence among them while identifying the operation steps suitable for user intervention, and adding to those steps the prompt information used to guide that intervention. An operation step suitable for user intervention among the operation steps executed by the kitchen robot is referred to as a first type of operation step, and for each first type of operation step, corresponding prompt information is generated.
In this embodiment, the execution of a job task by the kitchen robot may involve some data objects, where a data object refers to an object external to the kitchen robot; for example, in executing the job task, a data object may need to be provided to the kitchen robot, added, or removed. Since these data objects are external to the kitchen robot, the corresponding operations may be performed manually by the user rather than automatically by the kitchen robot. On this basis, the first type of operation step at least includes operation steps that need to depend on a data object. For example, if the kitchen robot is an intelligent cooker and the job task is cooking a dish, the data objects may be the food materials, seasonings, or other auxiliary tools required for cooking the dish.
In this embodiment, each first type of operation step corresponds to one piece of prompt information. After all first type operation steps have been identified and corresponding prompt information has been generated for each of them, the reference structured data may be generated according to the recorded plurality of operation steps executed by the kitchen robot, the execution sequence among the plurality of operation steps, and the prompt information corresponding to the first type of operation steps. The reference structured data at least comprises the plurality of operation steps required for the kitchen robot to execute the job task, the execution sequence among those steps, and the prompt information corresponding to the first type of operation steps among them. The reference structured data generated by this embodiment can be used to guide any kitchen robot in executing the job task; "any kitchen robot" here includes both the kitchen robot that generated the reference structured data and other kitchen robots. The kitchen robot that generated the reference structured data may publish it to the internet or to a server, from which other kitchen robots can download it. The prompt information corresponding to a first type of operation step is specifically used for prompting the user, when any kitchen robot executes the job task according to the reference structured data and reaches that first type of operation step, to perform an auxiliary operation adapted to the step, so as to complete the job task in cooperation with that kitchen robot.
In the embodiments of the present application, during the generation of the structured data, room for user intervention is reserved at least for the class of job steps that need to depend on data objects, and the user is guided to intervene through the corresponding prompt information, so that semi-automatic rather than fully automatic structured data is ultimately generated. In this way, when any kitchen robot executes the job task according to the semi-automatic structured data and reaches a job step provided with prompt information, the prompt information can be output to prompt the user to perform the auxiliary operation adapted to that step, so as to complete the job task in cooperation with the kitchen robot. Because the user is allowed to adjust and intervene in the kitchen robot's execution of the job task, the kitchen robot has greater flexibility in executing job tasks and can better satisfy the user's operational needs.
In an alternative embodiment, in the case where the first type of job steps includes job steps that depend on a data object, one implementation of generating the prompt information corresponding to a first-type job step is as follows: according to the dependency behavior of the first-type job step on the data object, prompt information is generated for prompting the user to operate on the data object on which that step depends. In this case, the prompt information is used to prompt the user to operate on the data object on which the first-type job step depends, and accordingly, the auxiliary operation performed by the user is to operate on that data object. Taking the kitchen robot being a cooking machine as an example, the reference structured data can be implemented as a reference electronic menu, and the prompt information corresponding to the data objects included in the reference electronic menu may be "please add xxx food materials", "add xxx seasonings", "take xxx out of the pot", and the like.
Further optionally, in the case where the first type of job steps includes job steps that depend on a data object, it may be determined, before the prompt information corresponding to the first-type job steps is generated, which job steps are first-type job steps that need to depend on a data object. For job steps that definitely need to depend on a data object, corresponding keywords can be pre-configured; based on this, during the kitchen robot's execution of the job task, the currently executed job step can be matched against the pre-configured keywords, and if a pre-configured keyword is matched, the currently executed job step is determined to be a first-type job step. Job steps that definitely do not need to depend on a data object will not match the pre-configured keywords and therefore will not be identified as first-type job steps.
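A minimal sketch of this keyword-matching idea, assuming an English keyword list (the actual pre-configured keywords are implementation-specific and not given in the embodiment):

```python
# Assumed keyword list; a real deployment would pre-configure its own.
DEPENDENCY_KEYWORDS = ("add", "take out", "turn over")

def is_first_type(step_description):
    """Match the currently executed step's description against the
    pre-configured keywords; a match marks it as a first-type step
    that depends on a data object."""
    text = step_description.lower()
    return any(kw in text for kw in DEPENDENCY_KEYWORDS)
```

Steps such as "preheat the pot", which never match a keyword, are never classified as first-type by this check.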
However, in practical applications, some job steps may or may not need to depend on a data object, and such job steps cannot be identified simply by matching preset keywords. In this embodiment, these job steps that may or may not depend on a data object are referred to as specified job steps; the specified job steps are pre-configured, and a marking option is set for them, so that the user can mark, as needed during the kitchen robot's actual execution of the job task, whether a specified job step needs to depend on a data object. If the user marks that the specified job step needs to depend on a data object, the specified job step is a first-type job step that depends on a data object; otherwise, it is not.
Based on the above, during the kitchen robot's execution of the job task, in addition to recording the plurality of job steps it executes, the currently executed job step may be displayed to the user, in the execution order among the plurality of job steps, through the display screen of the kitchen robot or the display screen of a terminal device bound to the kitchen robot. Besides the currently executed job step, the display screen may also show the job steps the kitchen robot has already executed or those yet to be executed. When the kitchen robot executes a specified job step, a marking option for that step may be displayed for the user to mark whether the step depends on a data object; through the marking option, the user can mark the specified job step either as depending on a data object or as independent of any data object. In the case where the user marks the specified job step as depending on a data object through the marking option, the step's dependency behavior on the data object can also be reflected in the marking information. For example, taking a cooking machine with the lid-opening step as the specified job step, the marking options may include, but are not limited to: adding/taking out food materials through the opened lid, adding/taking out seasonings through the opened lid, turning the food over through the opened lid, or opening the lid for cooking, where actions such as adding, taking out, or turning over are the dependency behaviors on the data objects.
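The marking flow for a specified step can be sketched as follows; the option strings echo the lid-opening example above and, like the field names, are illustrative assumptions only.

```python
# Marking options for the lid-opening example; strings are illustrative.
LID_OPEN_OPTIONS = [
    "add food materials", "take out food materials",
    "add seasonings", "take out seasonings",
    "turn food over",
]

def mark_specified_step(step, chosen_option=None):
    """Record the user's marking for a specified job step.

    chosen_option is one of LID_OPEN_OPTIONS when the step depends on a
    data object, or None when the user marks it as independent.
    """
    marked = dict(step)
    marked["depends_on_object"] = chosen_option is not None
    if chosen_option is not None:
        # The chosen option also captures the dependency behavior
        # (add, take out, turn over, ...).
        marked["dependency_behavior"] = chosen_option
    return marked

marked = mark_specified_step({"name": "open lid"}, "add seasonings")
```

If the user picks no option, the step is simply recorded as not depending on any data object.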
If the specified job step is marked as depending on a data object, the user may also configure, for that step, the identification information of the data object on which it depends, and the kitchen robot may accordingly acquire the identification information of the data object that the user configures for the specified job step.
In the present embodiment, the manner of acquiring the identification information of the dependent data object that the user configures for the specified job step is not limited; examples are given below.
Embodiment B1: before the kitchen robot executes the job task, the user pre-edits the data objects needed by the kitchen robot to execute the job task, obtaining a data object list required for the job task, where the list comprises identification information of at least one data object. Based on this, when the kitchen robot reaches a specified job step and the user marks the step as depending on a data object through the above marking option, the data object list required for the job task may be further presented; the user can select from the list the identification information of the data object on which the specified job step depends, and the kitchen robot may, in response to the user's selection operation on the list, acquire the identification information of the selected data object as the identification information of the data object on which the specified job step depends.
Further optionally, before the kitchen robot executes the job task, the process by which the user pre-edits the data objects needed for the job task includes: displaying a pre-editing interface, where the pre-editing interface comprises candidate data objects and/or editing controls, so that the user can edit the data object list required by the kitchen robot to execute the job task; and, in response to the user's pre-editing operation, generating the data object list required for the job task. The user can select the data objects required for the job task from the candidate data objects and add them to the data object list; alternatively, when a required data object is absent from the candidates, the user can add or edit its information through the editing control. Or, the editing control may be displayed directly on the pre-editing interface, and the user directly edits the information of the data objects required for the job task through the editing control and adds it to the data object list.
Embodiment B2: when the kitchen robot reaches a specified job step and the user marks the step as depending on a data object through the above marking option, a data object editing interface is further presented to the user. The data object editing interface includes an editing control, through which the user can edit the identification information of the data object on which the specified job step depends; the kitchen robot may acquire that identification information in response to the user's editing operation.
Embodiment B3: when the kitchen robot reaches a specified job step and the user marks the step as depending on a data object through the above marking option, the data object list required for the job task and an add control are further presented. If the list contains the identification information of the data object on which the specified job step depends, the user can select it directly from the list, and the kitchen robot may, in response to the user's selection operation, acquire the selected identification information as that of the dependent data object. If the list does not contain it, the user can click the add control to enter a data object editing interface, which is provided with an editing control through which the user can edit the identification information of the dependent data object; the kitchen robot may acquire that identification information in response to the user's editing operation.
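Embodiments B1 to B3 differ only in where the identifier comes from; Embodiment B3's "select if present, otherwise edit" logic might look like this sketch (function and variable names are assumptions):

```python
def resolve_object_id(object_list, selected=None, edited=None):
    """Embodiment B3 sketch: take the user's selection when it is in the
    pre-edited data object list; otherwise fall back to the identifier
    the user typed via the editing control behind the add control."""
    if selected is not None and selected in object_list:
        return selected
    return edited

# A hypothetical pre-edited data object list.
pantry = ["pork", "soy sauce", "ginger"]
```

Selecting from the list and typing a new identifier both end with the robot holding one identifier for the dependent data object.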
It should be noted that the first type of job steps in the embodiments of the present application includes not only job steps that depend on data objects, but also other job steps in which the user can participate, for example, job steps that depend on user-configurable job parameters. User-configurable job parameters are job parameters that the user is allowed to adjust or configure in real time during the kitchen robot's execution of the job, without affecting the execution of the current job task. Optionally, these user-configurable job parameters may be parameters that the user can easily configure, for example, but not limited to, job parameters for which the kitchen robot and/or its remote control provides configuration controls (which may be physical buttons or touch buttons on a touch panel). For example, the user may configure or adjust the user-configurable job parameters via touch buttons on the kitchen robot's touch panel, or via physical buttons on the kitchen robot's remote control. Where the kitchen robot is an intelligent cooking machine, the job parameters it allows the user to configure in real time during execution may be, but are not limited to, the heat level, the speed of the stirring blade, and the like, with a heat adjustment control and a stirring speed adjustment control provided on the cooking machine's touch panel. For job steps that depend on user-configurable job parameters, the adjustment or configuration of the parameters can involve the user and need not be performed automatically by the kitchen robot, so these job steps can also serve as first-type job steps in the embodiments of the present application.
In the case where the first type of job steps includes job steps that depend on user-configurable job parameters, generating the corresponding prompt information includes generating prompt information for prompting the user to configure the user-configurable job parameters, that is, prompt information for prompting the user to set or adjust the job parameters in the first-type job step; accordingly, under this prompt, the auxiliary operation performed by the user is to set or adjust the user-configurable job parameters. Taking the kitchen robot being a cooking machine as an example, the reference structured data can be implemented as a reference electronic menu, and the prompt information corresponding to the user-configurable job parameters included therein may be "please increase the heat", "adjust the stirring speed", "reduce the heat", and the like.
In some optional embodiments of the present application, the plurality of job steps performed by the kitchen robot may further include, in addition to the first type, a second type of job steps, namely the job steps other than the first type. It should be noted that, for the kitchen robot, the second type of job steps is optional; that is, the plurality of job steps performed by the kitchen robot may all be first-type job steps. Where second-type job steps are included, a second-type job step may be a job step using non-user-configurable job parameters, which are parameters already preset in the job step and not open to interactive adjustment or configuration by the user. Based on this, in the process of generating the reference structured data, no prompt information needs to be generated for the second-type job steps; instead, the non-user-configurable job parameters used by the kitchen robot when executing each second-type job step can be recorded, that is, the reference structured data includes the second-type job steps and their corresponding non-user-configurable job parameters.
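Recording a second-type step therefore stores its fixed parameters rather than a prompt. A sketch under the same assumed field names as before:

```python
def record_step(name, step_type, params=None, prompt=None):
    """Record one job step for the reference structured data.

    Second-type steps keep the non-user-configurable parameters that were
    in effect during execution (e.g. heat level, stir speed); first-type
    steps keep prompt text instead. Field names are illustrative
    assumptions, not the embodiment's actual format.
    """
    entry = {"name": name, "step_type": step_type}
    if step_type == "second":
        entry["params"] = dict(params or {})
    else:
        entry["prompt"] = prompt
    return entry

simmer = record_step("simmer", "second", params={"heat": 2, "stir_rpm": 0})
```

A robot replaying the data applies `params` automatically for second-type steps and only pauses to prompt at first-type steps.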
In this description, the user-configurable and non-user-configurable job parameters mentioned above refer to the job parameters that the kitchen robot needs to use when executing the corresponding job steps, which may be embodied as working states or state parameters of the kitchen robot.
In other alternative embodiments of the present application, the kitchen robot supports two authoring modes: a first authoring mode and a second authoring mode. The first authoring mode is the authoring mode that generates the reference structured data, and the second authoring mode is the authoring mode that generates the follow-up structured data. The follow-up structured data is structured data used to guide the kitchen robot in automatically executing the job task. In practical applications, the user can select one of the authoring modes according to the structured data to be authored, thereby instructing the kitchen robot to carry out the corresponding authoring process. Based on this, as shown in fig. 1b, before the kitchen robot executes the job task, i.e. before step 101a, the structured data generation method further comprises the following steps:
101b, displaying a mode setting interface, wherein the mode setting interface at least comprises a first authoring mode and a second authoring mode;
102b, responsive to a user selection operation, determining that the user selects the first authoring mode.
In this embodiment, if the user selects the first authoring mode, meaning that the kitchen robot is required to generate the reference structured data, then the kitchen robot performs the operations of steps 101a-103a to generate the reference structured data.
In an alternative embodiment, the user may also select the second authoring mode, and the kitchen robot may generate the follow-up structured data while performing the job task. Fig. 1c is a schematic flow chart of a method for generating structured data according to an embodiment of the present application, as shown in fig. 1c, where the method includes:
101c, recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
102c, responding to configuration operation of a user for a third type of operation step in the plurality of operation steps, and recording attribute information of a data object which is configured for the third type of operation step by the user and depends on the third type of operation step, wherein the third type of operation step is an operation step which needs to depend on the data object but does not give out data object identification information in the plurality of operation steps;
103c, generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps and the attribute information of the data object on which the third class of job steps depend.
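Steps 101c to 103c above can be sketched as follows, assuming the third-type attribute information is keyed by step position (all names are illustrative assumptions):

```python
def build_follow_up_data(task_name, recorded_steps, third_type_attrs):
    """Combine the recorded steps (execution order implied by list
    position) with the user-configured attribute information of the data
    objects that third-type steps depend on. third_type_attrs maps a
    step index to attributes such as the object's name, type, or amount."""
    steps = []
    for idx, step in enumerate(recorded_steps):
        entry = {"order": idx + 1, "name": step["name"]}
        if idx in third_type_attrs:
            entry["object_attrs"] = third_type_attrs[idx]
        steps.append(entry)
    return {"task": task_name, "mode": "follow_up", "steps": steps}

follow = build_follow_up_data(
    "fried rice",
    [{"name": "heat oil"}, {"name": "add ingredient"}],
    {1: {"name": "cooked rice", "amount": "300 g"}},
)
```

Unlike the reference structured data, this variant carries the concrete object attributes so the robot can replay the task automatically.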
For the implementation and detailed description of step 101c, reference may be made to the description of step 101a in the foregoing embodiment, which is not repeated here. In the present embodiment, the plurality of job steps performed by the kitchen robot includes a third type of job steps, namely job steps that need to depend on a data object but for which no data object identification information is given; in other words, it is known that such a job step needs to depend on a data object, but not which specific data object or objects it depends on. For a third-type job step, the user can configure the attribute information of the data object on which it depends, and the kitchen robot can, in response to the user's configuration operation, record the attribute information the user configures for the step; for example, the attribute information may be the identification, type, or usage of the data object, and the like. Then, the follow-up structured data is generated according to the plurality of job steps, the execution sequence among them, and the attribute information of the data objects on which the third-type job steps depend.
Further optionally, before the attribute information of the dependent data objects is configured for the third-type job steps, it may be determined which of the plurality of job steps are third-type job steps. For job steps that definitely need to depend on a data object, corresponding keywords can be pre-configured; based on this, during the kitchen robot's execution of the job task, the currently executed job step can be matched against the pre-configured keywords. If a pre-configured keyword is matched, it is further judged whether the currently executed job step contains attribute information of the data object; if it does not, the currently executed job step is determined to be a third-type job step, and otherwise it is not. Job steps that definitely do not need to depend on a data object will not match the pre-configured keywords and therefore will not be identified as third-type job steps.
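The two checks described here (keyword match, then presence of attribute information) can be combined in a short sketch; the keyword list is an assumption as before:

```python
# Assumed pre-configured keyword list.
KEYWORDS = ("add", "take out", "turn over")

def is_third_type(step_description, has_object_attrs):
    """A step is third-type when it matches a dependency keyword but its
    data-object attribute information has not yet been given."""
    matches = any(kw in step_description.lower() for kw in KEYWORDS)
    return matches and not has_object_attrs
```

A step that already names its object (and so carries attribute information) matches the keyword but is excluded by the second check.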
However, in practical applications, some job steps may or may not need to depend on a data object, and such job steps cannot be identified simply by matching preset keywords. In this embodiment, these job steps are likewise referred to as specified job steps; the specified job steps are pre-configured, and a marking option is set for them, so that the user can mark, as needed during the kitchen robot's actual execution of the job task, whether a specified job step needs to depend on a data object. Since it is not known in advance whether a specified job step needs to depend on a data object, the attribute information of any dependent data object is naturally unknown as well. If the user marks that the specified job step needs to depend on a data object, the specified job step is a third-type job step; otherwise, it is not.
Based on the above, during the kitchen robot's execution of the job task, in addition to recording the plurality of job steps it executes, the currently executed job step may be displayed to the user, in the execution order among the plurality of job steps, through the display screen of the kitchen robot or the display screen of a terminal device bound to the kitchen robot. Further, when the kitchen robot reaches a specified job step, a marking option for that step may be displayed for the user to mark whether the step depends on a data object. If the specified job step is marked as depending on a data object, then, because the attribute information of the dependent data object cannot be acquired in advance, the step is regarded as a third-type job step, and the user configures the attribute information of the data object on which it depends. Here, a specified job step is one that may need to depend on a data object. The manner in which the user configures the attribute information of the dependent data object for the specified job step is similar to Embodiments B1 to B3 above, and is not repeated here.
Further optionally, the kitchen robot may use certain job parameters when executing each job step; therefore, in addition to recording the plurality of job steps executed by the kitchen robot, the job parameters used in executing each step may be recorded for that step and stored in the follow-up structured data. That is, the follow-up structured data includes not only the plurality of job steps, the execution sequence among them, and the attribute information of the data objects on which the third-type job steps depend, but also the job parameters used when executing each job step.
In an alternative embodiment, after the kitchen robot generates the follow-up structured data or the reference structured data, a correction operation may further be performed on the generated data, where the correction operation includes at least one of: adding a job step, deleting a job step, modifying a job parameter, adding a video or picture corresponding to a job step, and modifying the name of the structured data. Based on this, if the kitchen robot generates follow-up structured data, the follow-up structured data is displayed on a first preview interface, where the user can perform correction operations on it; the kitchen robot can perform the correction operation on the follow-up structured data in response to the correction operation initiated by the user on the first preview interface. If the kitchen robot generates reference structured data, the reference structured data is displayed on a second preview interface, where the user can correct it; the correction operation is performed on the reference structured data in response to the correction operation initiated by the user on the second preview interface. The correction operation includes at least one of: adding or deleting a stirring step; adding or deleting a feeding step; increasing or decreasing the heating time; increasing or decreasing the heating power; increasing or decreasing the stirring speed; adding or deleting a video or picture guiding a job step; adding or deleting a video or picture showing the job process in real time; adding or deleting a lid-opening step at any step of the follow-up or reference structured data; and adding or deleting a lid-closing step at any step of the follow-up or reference structured data.
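The correction operations listed above can be modeled as transformations on the generated structured data; the sketch below supports a few of them under the same assumed field names (the operation names are illustrative, not the embodiment's API):

```python
def apply_correction(data, op, **kw):
    """Apply one correction operation and return the corrected data.

    Supported ops (a subset of those listed above): 'insert_step',
    'delete_step', 'modify_params', 'rename'. The input is left
    unmodified so the preview interface can offer undo.
    """
    corrected = {**data, "steps": [dict(s) for s in data["steps"]]}
    if op == "insert_step":
        corrected["steps"].insert(kw["index"], kw["step"])
    elif op == "delete_step":
        corrected["steps"].pop(kw["index"])
    elif op == "modify_params":
        corrected["steps"][kw["index"]].setdefault("params", {}).update(kw["params"])
    elif op == "rename":
        corrected["name"] = kw["name"]
    return corrected

base = {"name": "draft", "steps": [{"name": "heat"}, {"name": "stir"}]}
fixed = apply_correction(base, "insert_step", index=1,
                         step={"name": "add seasoning"})
```

Keeping corrections as pure transformations matches the preview-then-confirm flow described for the first and second preview interfaces.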
Further optionally, for safety reasons, a stirring-blade adjustment step may be added after the lid-closing step; to ensure the stability of the follow-up or reference structured data, a lid-opening step may be added at the last position of the follow-up or reference structured data, or a lid-closing step may be added at the last position.
In product implementation, the kitchen robot may support both authoring modes simultaneously, with the user choosing between them, so that the corresponding follow-up structured data or reference structured data is generated according to the authoring mode selected by the user. Alternatively, the kitchen robot may support only the first authoring mode or only the second authoring mode: if it supports only the first authoring mode, it may generate the reference structured data according to the flow shown in fig. 1a; if it supports only the second authoring mode, it may generate the follow-up structured data according to the flow shown in fig. 1c.
In addition, the executing entity of the method embodiments shown in figs. 1a-1c may be the kitchen robot, that is, all operations are executed on the kitchen robot; or the embodiments may be implemented by the kitchen robot in cooperation with a terminal device. In the cooperative case, the process of generating the structured data is mainly divided into three parts: (1) the user performs the pre-editing operation through the terminal device to determine the data objects required for executing the job task, and the terminal device sends the pre-edited data objects to the kitchen robot; (2) the user controls the kitchen robot to execute the job task and generate the reference or follow-up structured data, and controls the kitchen robot to provide the generated structured data to the terminal device; (3) the user performs correction operations on the reference or follow-up structured data through the terminal device to obtain the final structured data. The detailed description of each part can be found above and is not repeated here.
In this embodiment, the process of generating the structured data may be performed not only by the kitchen robot alone or by the kitchen robot in cooperation with the terminal device, but also by the terminal device alone. Fig. 1d is a schematic diagram of another method for generating structured data according to an exemplary embodiment of the present application; the method is applicable to a terminal device bound to a kitchen robot and, as shown in fig. 1d, includes:
101d, displaying a job control interface, and controlling the kitchen robot to execute a job task based on a control on the job control interface;
102d, receiving and displaying the currently executed operation steps of the kitchen robot in the process of executing the operation tasks by the kitchen robot;
103d, when the currently executed operation step is a designated operation step, responding to a marking operation of marking the currently executed operation step as a first type operation step by a user, and generating prompt information corresponding to the first type operation step;
104d, generating reference structured data according to the operation steps sequentially executed by the kitchen robot and the prompt information corresponding to the first type of operation steps; the prompting information is used for prompting a user to execute auxiliary operation adapted to the first type of operation step when any kitchen robot executes the operation task according to the reference structured data and executes the first type of operation step, so as to complete the operation task in cooperation with any kitchen robot.
In this embodiment, the controls on the job control interface can control the kitchen robot to perform actions. If the kitchen robot is a cooking machine, the job control interface includes, but is not limited to: a heat control, a stirring control, a lid open/close control, a pause control, and the like.
In the present embodiment, the marking operation refers to the process in which the user configures, for a specified job step, the identification information of the data object on which it depends. For this configuration process and the other content of this embodiment, reference can be made to the foregoing embodiments, which are not repeated here.
Fig. 1e is a schematic diagram of another method for generating structured data according to an exemplary embodiment of the present application, where the method is applicable to a terminal device bound to a kitchen robot, as shown in fig. 1e, and the method includes:
101e, displaying a job control interface, and controlling the kitchen robot to execute a job task based on a control on the job control interface;
102e, receiving and displaying the currently executed operation steps of the kitchen robot in the process of executing the operation tasks by the kitchen robot;
103e, when the currently executed operation step is a designated operation step, responding to a marking operation of marking the currently executed operation step as a third type operation step by a user, and displaying an object configuration interface, wherein the third type operation step is an operation step needing to depend on a data object;
104e, responding to the configuration operation of the user on the object configuration interface, and recording the attribute information of the data object on which the user depends, which is configured for the third class operation step;
105e, generating follow-up structured data according to the attribute information of the data objects on which the kitchen robot sequentially executes the working steps and the third class working steps depend.
In this embodiment, the object configuration interface allows the user to configure the attribute information of a data object. Through the object configuration interface, the user may enter the attribute information manually or by voice. Alternatively, the object configuration interface may display the data object list selected for the kitchen robot during the pre-editing operation, so that the user can select the data object on which the third-type job step depends, and then display an attribute information configuration interface for that data object, which includes configuration items such as the data object's type, quantity, and name, for the user to configure. For other matters related to this embodiment, see the foregoing embodiments, which are not repeated here.
In this embodiment, regardless of how the structured data is generated, once generated it may be used to instruct any kitchen robot to perform a job task. The procedure by which a kitchen robot executes a job task is described below, taking the case where the job task is executed based on the reference structured data as an example.
Fig. 2a is a flow chart of a job task execution method according to an exemplary embodiment of the present application, as shown in fig. 2a, the method includes:
201a, acquiring reference structured data, wherein the reference structured data comprises a plurality of operation steps performed by a kitchen robot in executing a job task, an execution sequence among the plurality of operation steps, and prompt information corresponding to a first-type operation step among the plurality of operation steps;
202a, sequentially executing a plurality of operation steps according to the execution sequence among the plurality of operation steps;
203a, outputting prompt information when the first type of operation step is executed, so as to prompt a user to execute auxiliary operation adapted to the first type of operation step, and to complete the operation task in cooperation with the kitchen robot.
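Steps 201a-203a reduce to a simple ordered loop, sketched below under stated assumptions: each step is a dict with hypothetical keys (`order`, `first_type`, `prompt`), and `execute` is a stub standing in for the device-side actions.

```python
# Minimal sketch of steps 201a-203a: run reference structured data in its
# execution order; for a first-type step, output the prompt so the user can
# perform the auxiliary operation. Field names are illustrative only.
def run_reference_recipe(steps, output=print):
    for step in sorted(steps, key=lambda s: s["order"]):
        if step.get("first_type"):
            output(step["prompt"])   # 203a: prompt the user's auxiliary op
        execute(step)                # 202a: device-side execution

def execute(step):
    pass  # placeholder for lid/heating/stirring/feeding control
```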
In the present embodiment, the manner in which the reference structured data is generated is not limited. For example, the reference structured data may be authored by the user based on an authoring mode of the kitchen robot, in which case it may be obtained directly from the kitchen robot's local storage. As another example, other users publish authored reference structured data to the internet or a server, and the kitchen robot acquires the reference structured data published there. As yet another example, a terminal device bound to the kitchen robot obtains reference structured data published by other users to the internet or a server and provides it to the kitchen robot.
In this embodiment, the reference structured data carries prompt information. Under this prompting, the user can adjust and intervene in the process in which the kitchen robot executes the job task, so that the kitchen robot executes the job task semi-automatically; execution thus gains a certain flexibility, which helps meet the user's requirements.
In an alternative embodiment, the first type of operation step includes an operation step that needs to rely on a data object, and the prompt information is used for prompting the user to operate on the data object on which the first type of operation step depends; or the first type of job step comprises a job step depending on the user-configurable job parameters, and the prompt information is used for prompting the user to configure or adjust the user-configurable job parameters. The details of the operation steps depending on the data object and the user configurable operation parameters can be found in the foregoing embodiments, and will not be described herein.
Further optionally, during execution of the job steps by the kitchen robot, if the first type of job step includes a job step that depends on user-configurable job parameters, the kitchen robot sends a prompt to the user, and the user configures or adjusts the user-configurable job parameters according to the prompt. Specifically, the user may send a parameter adjustment instruction to the kitchen robot, the instruction including the new user-configurable job parameters that the kitchen robot needs to use; in response to the user's parameter adjustment instruction, the kitchen robot executes the subsequent job steps according to the new user-configurable job parameters.
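The parameter adjustment described above can be sketched as a simple state update; the message shape (`{"new_params": ...}`) and class name are assumptions for illustration, not a defined protocol.

```python
# Sketch of handling a parameter adjustment instruction: the robot merges the
# new user-configurable job parameters into its current set and uses the
# result for subsequent job steps. Message format is hypothetical.
class KitchenRobot:
    def __init__(self, params):
        self.params = dict(params)   # e.g. {"fire": "medium", "stir": "low"}

    def on_parameter_adjustment(self, instruction):
        # instruction carries the new user-configurable job parameters
        self.params.update(instruction["new_params"])
        return self.params           # parameters used from now on
```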
In an alternative embodiment, after performing the auxiliary operation, the user may send confirmation information to the kitchen robot. For example, the confirmation information may be sent through a display screen of the kitchen robot or through the display screen of a terminal device that has established a binding relationship with the kitchen robot. As another example, it may be sent through a remote control of the kitchen robot. As yet another example, the kitchen robot includes a voice module, and the user can send the confirmation information by voice; the voice module recognizes the user's voice and obtains the confirmation information. After receiving the confirmation information sent by the user upon completion of the auxiliary operation, the kitchen robot judges whether the first-type operation step has a next operation step; if so, the reference structured data has not been fully executed, and the next operation step is executed; if not, the reference structured data has been fully executed, and the job task ends.
In addition, a countdown may be set so that the user does not need to send confirmation information after the kitchen robot sends the prompt information: when the countdown ends, it is assumed by default that the user has performed the auxiliary operation, and the kitchen robot executes the next operation step or ends the job task.
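The confirm-or-countdown behavior maps naturally onto a waitable event. The sketch below is a hypothetical illustration of that logic (class name `StepGate` is invented; real firmware timing would differ): proceed immediately on user confirmation, or after the countdown expires with the auxiliary operation assumed done.

```python
# Sketch of the confirm-or-countdown logic: wait() returns True if the user
# confirmed (via screen, remote, or voice) and False if the countdown elapsed;
# either way the robot may then proceed to the next operation step.
import threading

class StepGate:
    def __init__(self, countdown_s):
        self.countdown_s = countdown_s
        self._confirmed = threading.Event()

    def confirm(self):
        # called when the user's confirmation information arrives
        self._confirmed.set()

    def wait(self):
        # blocks until confirmation or until the countdown ends
        return self._confirmed.wait(timeout=self.countdown_s)
```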
Scene embodiment:
in this embodiment, the generation process and the use process of structured data are described taking as an example that the kitchen robot is a cooking apparatus, the reference structured data is a reference electronic menu, and the follow-up structured data is a follow-up electronic menu. The structure of the cooking apparatus is not limited; fig. 3 shows an exemplary structure. As shown in fig. 3, the cooking apparatus 300 includes: a cooking vessel 301, a lid 302, an MCU 303, a seasoning box 304 (including at least one cartridge, each holding at least one seasoning), a stirring shovel (not shown in fig. 3) located inside the cooking vessel 301, a heating base 305, a base 306 carrying the heating base 305, a stand 307, a display screen 308, and a measuring device 309. The MCU 303 serves as the brain of the cooking apparatus 300 and may control the cooking apparatus 300 to perform cooking actions; for example, the MCU 303 may control the lid 302 to open or close, control the rotation of the seasoning box 304 to dispense the seasoning it holds, control the rotational speed of the stirring shovel, or control the heating power of the heating base 305.
Further, the cooking apparatus includes a communication module 310, which may be implemented with a wireless communication technology such as Wi-Fi, Bluetooth, infrared, or radio frequency identification (RFID); the cooking apparatus may establish a wireless connection with the terminal device based on the communication module.
In this embodiment, an APP corresponding to the cooking apparatus is installed on a terminal device, and the user enters an authoring mode through the APP. The authoring mode includes: authoring a reference electronic menu and authoring a follow-up electronic menu. Fig. 4a shows an authoring mode selection interface: the upper part of the interface explains the distinction between the two modes to the user for easy understanding, and the lower part provides the mode selection, where the user may choose to author a reference electronic menu or a follow-up electronic menu. The user may click the first control in fig. 4a to author a follow-up electronic menu, or click the second control in fig. 4a to author a reference electronic menu.
If the user selects the first control to author a follow-up electronic menu, the user may perform a pre-editing operation on the terminal device to record the kinds of food materials and seasonings required for the cooking apparatus to execute the job task, as shown in figs. 4b and 4c. As shown in fig. 4b, the terminal device displays a pre-editing interface on which a number of common food materials are displayed, which may be, but are not limited to: streaky pork, pork ribs, fish, chicken, mushrooms, and the like. The user can select the food material types required for executing the job task; if a required food material type is not among the common food materials, the user can click the food-material-adding control on the pre-editing interface, and the terminal device responds to the click operation by displaying an information input interface for the user to input identification information of the food material to be added. As shown in fig. 4c, the type of seasoning to be added to each cartridge may be selected on the pre-editing interface; one or more seasonings may be added to each cartridge, without limitation. For each cartridge, the pre-editing interface displays common seasonings, which may be, but are not limited to: peanut oil, fine salt, light soy sauce, dark soy sauce, mature vinegar, rice vinegar, and the like.
The user can select, for a cartridge, the seasonings required for executing the job task; if a seasoning to be added is not among the common seasonings, the user can click the seasoning-adding control corresponding to that cartridge on the pre-editing interface, and the terminal device responds to the click operation by displaying an information input interface for the user to input identification information of the seasoning to be added. As shown in fig. 4c, the seasonings the user selects for cartridge 1 include: peanut oil, fine salt, light soy sauce, and mature vinegar, and the seasoning selected for cartridge 2 is fine salt. In addition, in fig. 4b, the user may also add a name for the data packet.
Then, after the user completes the pre-editing operation through the APP of the terminal device, the terminal device sends the user's pre-edited data to the cooking apparatus, and the cooking apparatus can assist the user in setting up the cartridges based on the pre-edited data, that is, assist the user in placing seasonings into the corresponding cartridges. As shown in fig. 4d, the seasonings to be added to cartridge 1 are peanut oil, fine salt, light soy sauce, and mature vinegar; in fig. 4d, only peanut oil and fine salt have been added, and the light soy sauce and mature vinegar are not shown. The user can weigh a seasoning to be added to cartridge 1 on the measuring device of the cooking apparatus, such as an electronic scale; after the measuring device obtains the amount of the seasoning, the cooking apparatus records that amount and displays it in a highlighted state. The user adds the weighed seasoning to cartridge 1 and can then, by the same method, successively weigh, record, and place the other seasonings for cartridge 1 or those to be added to other cartridges; when feeding is finished, cooking starts. During cartridge setup, a seasoning to be placed in a cartridge can also be added on the cartridge weighing interface and then weighed, recorded, and placed. As shown in fig. 4d, the cartridge weighing interface provides an add control, which the user can trigger to add a seasoning to be placed into the cartridge. The cooking apparatus can respond to the user's click operation by displaying a common-seasoning selection interface for the user to choose from; if the common seasonings cannot meet the user's requirement, the user can trigger the seasoning-adding control on that interface to input identification information of the seasoning to be added.
In this embodiment, besides adding seasonings for a cartridge, the cartridge weighing interface also allows the user to delete the identification information of a seasoning to be added to a cartridge. As shown in fig. 4e, the user can select the identification information of the seasoning to be deleted and slide it leftwards; the cooking apparatus responds to the sliding operation by displaying a seasoning-deletion control, and the user can click that control to delete the identification information of the seasoning.
In this embodiment, the cooking apparatus displays a cooking control interface at the pot end (cooking apparatus end) on its display screen, and the user generates the structured data while executing the job task based on the cooking control interface. Fig. 4f is a schematic diagram of the cooking control interface. The left side of the interface displays designated operation steps of the cooking apparatus, for example the currently executed operation step, or both the already executed operation steps and the currently executed operation step. The right side of the interface displays interaction controls of the cooking apparatus, including: a fire control, a stirring control, a feeding control, a lid-open control, a lid-close control, a pause control, and an end control. The fire control is used to adjust the fire intensity while the job task is executed; the fire intensity levels are: off, low, medium, and high, and fig. 4f illustrates the case where the target fire intensity is high. The stirring control is used to adjust the rotational speed of the stirring shovel, whose levels are: off, low, medium, and high. The feeding control is used to instruct a cartridge to rotate so as to dispense its seasoning.
The lid-open control and the lid-close control form a compound button: if the user clicks the lid-open control, the lid of the cooking apparatus opens and the compound control then displays the lid-close control; if the user clicks the lid-close control, the lid closes and the compound control then displays the lid-open control. The pause control is used to pause execution of the job task; at that moment the stirring shovel stops rotating, the fire is turned off, feeding also stops, and the lid remains in its current state. In the paused state, the user may click the pause control again, and the cooking apparatus resumes executing the job task. After the job task has been executed, the user can click the end control to finish the job task.
In this embodiment, for a designated operation step such as a lid-opening step, marking options for that step may be presented on the cooking control interface. As shown in fig. 4f, the marking options for the lid-opening action are: open lid to add food, open lid to cook, or open lid for other operations. Further, when the user selects the marking type of opening the lid to add food, the cooking apparatus may display a food selection interface to the user, as shown in fig. 4g, on which the identification information of the food materials pre-edited by the user is displayed; the user may select the identification information of the food added during the lid-opening operation, and if food other than the pre-edited materials is also added, the user may input its identification information. If the user selects the marking option of other lid-opening operations, the cooking apparatus can provide an editing control, based on which the user inputs the specific operation to be performed with the lid open. Providing a lid-opening marking function during execution of the job task has the following benefit: if the user marks the lid-opening behavior, then after the electronic menu is generated, the lid-opening behavior and the actions attached to it, such as adding food materials and seasonings, can be displayed directly in the structured data. This remedies the defect that the apparatus cannot perceive what the user adds with the lid open, which would otherwise leave the generated electronic menu without the actions corresponding to the lid-opening behavior; the marking options can thus greatly reduce the user's memory burden.
In this embodiment, the kitchen robot may automatically generate a follow-up electronic menu after the user controls the cooking apparatus to execute the job task. The follow-up electronic menu can be displayed through the display screen of the terminal device, and the user can perform correction operations on it. The authoring preview interface shown in fig. 4h displays a follow-up electronic menu for "Changzhou spicy beef powder" and includes an edit control; when the user clicks it, the terminal device responds by displaying an authoring editing interface, in which the user can modify job parameters in the follow-up electronic menu, for example adjust the fire intensity of certain operation steps, the speed of the stirring shovel, the cooking time, and the like. In addition, operation steps may be added or deleted. When operation steps are added, fault-tolerance control is applied to avoid data packet anomalies caused by step adjustment: for example, a lid-opening step is added at the final position of the electronic menu, and a lid-closing step is added at the final position of the electronic menu. Further, for safety reasons, a stirring step is added only after the lid-closing step, so that a rotating stirring shovel cannot harm the user's health while the lid is open. In addition, the user can add a video or picture corresponding to an operation step, modify the name of the structured data, or adjust the type or weight of a food material or seasoning.
In this embodiment, authoring a reference electronic menu is the same as or similar to authoring a follow-up electronic menu, with the following differences. When authoring a reference electronic menu, the user does not need to accurately weigh the seasonings or place them into the cartridges; instead, the user can select the types and amounts of seasonings according to personal preference while the lid is open. Accordingly, during execution of the job task the user needs to mark the job steps in which data objects (for example, seasonings and food materials) are added, and prompt information for those job steps is generated. The follow-up electronic menu thus includes a plurality of operation steps, the execution sequence among them, and the attribute information (such as the type, mark, or amount of a food material or seasoning) of the data objects in some operation steps, whereas the reference electronic menu includes the plurality of operation steps, the execution sequence among them, and the prompt information corresponding to some operation steps. Other content of the reference electronic menu is the same as or similar to that of the follow-up electronic menu; for details, refer to the description of the follow-up electronic menu.
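The contrast between the two kinds of structured data can be made concrete with a hypothetical sketch. The dict shapes below are illustrative assumptions, not the patent's data format: a follow-up electronic menu stores data-object attributes (type, name, amount) for some steps, while a reference electronic menu stores prompt information for the corresponding steps instead.

```python
# Illustrative contrast: follow-up structured data carries data-object
# attributes; reference structured data carries prompt information.
follow_up_recipe = [
    {"order": 1, "action": "open lid", "objects": [
        {"type": "seasoning", "name": "peanut oil", "amount": "10 g"}]},
    {"order": 2, "action": "stir", "objects": []},
]

reference_recipe = [
    {"order": 1, "action": "open lid",
     "prompt": "Please add cooking oil; amount as you prefer."},
    {"order": 2, "action": "stir"},
]
```

Executing the first list requires no user intervention because the amounts are recorded; executing the second emits the prompt and leaves type and amount to the user.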
In this embodiment, the user may execute the job task according to either the follow-up electronic menu or the reference electronic menu, with the following difference. When the cooking apparatus executes the job task using the follow-up electronic menu, it does so automatically, the user need not perform auxiliary operations, and the user therefore cannot adjust or intervene in the operation steps in this mode. When the job task is executed using the reference electronic menu, the user can adjust and intervene in the execution according to the prompt information, so the user can execute the job task according to personal preference; execution is more flexible, which helps meet the user's job requirements. At the same time, the prompt information can greatly reduce the user's memory burden and improve the user's experience.
The process by which the user performs a cooking task using the reference electronic menu is described below. The cooking apparatus supports a follow-up mode and a reference mode. In the follow-up mode, the user does not need to intervene in or adjust the cooking process, and the kitchen robot executes the job task according to the follow-up electronic menu. The reference mode is mainly aimed at users with some cooking skill who want to intervene in or adjust the cooking process; such a user can select the reference mode and execute cooking tasks based on the reference electronic menu.
The user pre-edits the types of food materials and seasonings on the terminal device; here the food material is tomato, and the seasonings are cooking oil, salt, and chicken essence. After the kinds of food materials and seasonings are determined, the terminal device provides the pre-edited content to the cooking apparatus. The cooking apparatus then executes the cooking process: it turns on high fire and automatically opens the lid, then reminds the user to add cooking oil; the user adds the oil into the pot according to the prompt, with the amount controlled by the user, and after a 5-second countdown the cooking apparatus assumes by default that the oil has been added. The cooking apparatus then reminds the user to add the cut tomatoes; after adding them, the user confirms through a control on the display screen that the tomatoes have been placed into the pot. After receiving the user's confirmation information, the cooking apparatus closes the lid, runs the stirring shovel at medium speed for a period of time, then stops the stirring shovel, opens the lid, and reminds the user to add salt and chicken essence. After the user finishes adding them and sends confirmation information to the cooking apparatus, the apparatus turns off the fire 5 seconds later, and the cooking task ends.
In the embodiment of the application, the kitchen robot has a voice function. Based on it, the user can interact with the kitchen robot by voice: the user can send control instructions to the kitchen robot by voice, and the kitchen robot can output various information to the user by voice. The voice function of the kitchen robot of this embodiment includes: a voice recognition function, a sound pickup function, and a voice broadcast function. The sound pickup function is mainly used to pick up voice signals sent by the user or other audio signals in the environment where the kitchen robot is located. The voice recognition function is mainly used to convert the picked-up voice signals or other audio signals into text and to recognize, from the text information, the control instructions for correspondingly controlling the kitchen robot. The voice broadcast function is used to convert information the kitchen robot needs to output into voice signals and broadcast them to the user.
For example, the user may issue a voice instruction such as "please start the job" or "execute the job task" to instruct the kitchen robot to start executing the job task, and may further issue voice instructions such as "please increase the power" or "please open the lid" to control the kitchen robot in real time while it executes the job task. The user can also issue voice instructions such as "current working state", "currently used power", or "current operation progress" to query the running state/working state of the kitchen robot, and the kitchen robot returns the corresponding state information according to the user's voice instruction. Moreover, when the kitchen robot picks up other audio signals in the working environment, for example the alarm sound of an oil-smoke sensor, the sound of breaking porcelain, an oil-burst sound, noise from food dry-burning in the pot, or noise produced by the range hood, it can identify those signals, and if a signal is a specific sound produced by some event, it can output reminding or alarm information to the user.
In this embodiment, the kitchen robot may include a voice module, which provides voice functions to the kitchen robot, and the voice module may be a hardware module, for example, but not limited to, a voice chip. Fig. 2b shows an implementation of a voice module of a kitchen robot. As shown in fig. 2b, the voice module 200b of the kitchen robot includes: the device comprises a sound receiving module 201b, a voice recognition module 202b, a main control module 203b and a broadcasting module 204b.
The sound pickup module 201b is configured to pick up voice signals sent by the user (hereinafter referred to as user voice) and/or other sound signals in the environment where the kitchen robot is located, and may be, but is not limited to: a microphone, a microphone array, a Bluetooth microphone, a recording pen, a recording stick, a recorder, a pickup, or the like. The broadcast module 204b is configured to output audio signals and may be, for example, a speaker, a loudspeaker, or the like.
The voice recognition module 202b is configured to analyze and process the sound signals (e.g., user voice or noise in the environment where the kitchen robot is located) collected by the sound pickup module 201b. Specifically, the voice recognition module 202b inputs the user voice picked up by the sound pickup module 201b into a deep neural network model for recognition, matches the recognized voice command against the pre-stored correspondence between voice commands and function codes, and, if a corresponding function code is matched successfully, sends the function code corresponding to the voice command to the main control module 203b, so that the main control module 203b controls the kitchen robot to execute the corresponding action according to the function code. A function code is machine language that the main control module 203b can recognize; according to it, the main control module 203b can control other modules of the kitchen robot (such as the lid drive module or the stirring shovel drive module) to execute corresponding actions.
A voice command may have a plurality of different spoken expressions; for example, when the voice command is lid opening, the voice signal recognized by the voice recognition module 202b may be, but is not limited to: "uncap", "please uncap", "open lid", and the like. In this embodiment, the voice recognition module 202b uses a deep neural network model to accurately recognize voice signals under different expressions. The deep neural network model can work offline: it is trained offline in advance for voice recognition and stored locally in the kitchen robot, so it does not depend on a network; its implementation cost is therefore low, and it can provide a more convenient and intelligent operating experience for users.
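The two-level matching described above (phrasing → voice command → function code) can be sketched as table lookups. The phrasings, command names, and numeric codes below are invented for illustration; the patent does not define concrete code values.

```python
# Hypothetical sketch of matching a recognized utterance to a function code
# the main control module understands: several phrasings map to one voice
# command, and each command maps to one function code.
PHRASE_TO_COMMAND = {
    "uncap": "open_lid", "please uncap": "open_lid", "open lid": "open_lid",
    "close lid": "close_lid",
}
COMMAND_TO_FUNCTION_CODE = {"open_lid": 0x01, "close_lid": 0x02}

def match_function_code(recognized_text):
    cmd = PHRASE_TO_COMMAND.get(recognized_text.strip().lower())
    if cmd is None:
        return None      # no match: the utterance is ignored
    return COMMAND_TO_FUNCTION_CODE[cmd]
```

In the described system the first mapping is handled by the deep neural network model rather than a literal table; the table here only stands in for its output.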
In this embodiment, the main control module 203b may control the kitchen robot to execute corresponding actions according to the function code sent by the voice recognition module 202b. Taking a cooking apparatus as the kitchen robot, its functional modules include, but are not limited to: a lid control module, a heating module, a cartridge control module, a stirring shovel control module, and a voice module; the functions of the cooking apparatus realized by the functional modules under the control of the main control module 203b are shown in Table 1 below.
TABLE 1

In this embodiment, in addition to the correspondence between voice commands and function codes, the kitchen robot stores the correspondence between voice commands and query codes. A query code is machine language that the main control module 203b can identify, used for querying the running states/working states of the various functional modules of the kitchen robot; the user can query these states by voice. The voice recognition module 202b recognizes user voice with a query intent, matches the recognized voice command against the pre-stored correspondence between voice commands and query codes, and, if a corresponding query code is matched successfully, sends the query code to the main control module 203b; the main control module 203b queries the running state of the corresponding functional module or sensor according to the query code and broadcasts the queried content through the broadcast module 204b. For example, when the user's voice is "what is the current fire", the voice is picked up by the sound pickup module 201b and recognized by the voice recognition module 202b to obtain the voice command "query fire"; the voice recognition module 202b sends the query code corresponding to this voice command to the main control module 203b, which queries the running state of the heating module and finds that the current fire is "medium fire"; the main control module 203b provides the query result "the current fire is medium fire" to the broadcast module 204b, and the broadcast module 204b broadcasts it.
In this embodiment, the voice recognition module 202b can recognize not only the user's voice but also noise in the kitchen robot's environment picked up by the sound pickup module 201b, and can issue reminding or alarm information through the broadcast module 204b when specific noise is recognized. The specific-noise recognition process is as follows: specific noises are collected in advance and input into a neural network model for training, forming a specific-noise set; the noise picked up in real time by the sound pickup module 201b in the kitchen robot's environment is compared with the specific noises, and if the similarity between the picked-up noise and a specific noise in the set exceeds a set similarity threshold, the picked-up noise is considered to be that specific noise.
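The threshold-based matching just described can be sketched with a toy similarity measure. Cosine similarity over feature vectors is an assumption standing in for whatever comparison the trained model actually performs; labels and the 0.9 threshold are likewise illustrative.

```python
# Sketch of specific-noise matching: compare a picked-up noise's features
# against a pre-built specific-noise set and return the best label only if
# its similarity exceeds the set threshold, otherwise None.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def classify_noise(features, noise_set, threshold=0.9):
    # noise_set: {label: feature vector} built from pre-collected noise
    best = max(noise_set, key=lambda k: cosine(features, noise_set[k]))
    return best if cosine(features, noise_set[best]) >= threshold else None
```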
Further, when a specific noise is recognized, the voice recognition module 202b sends the function code corresponding to that specific noise to the main control module 203b, and the main control module 203b performs the corresponding operation based on the function code. For example, when the voice recognition module 202b recognizes the sound of food material being dry-heated, it sends the function code corresponding to "dry heating of food material", i.e., an instruction to reduce or turn off the fire power, to the main control module 203b, and the main control module 203b reduces or turns off the fire power accordingly.
Alternatively, when a specific noise is recognized, the voice recognition module 202b sends notification information to the main control module 203b, and the main control module 203b sends reminder or alarm information to the user through the broadcasting module based on that notification. For example, when the voice recognition module 202b recognizes the sound of oil spattering, it sends a notification that the specific noise has been recognized to the main control module 203b. Based on the notification, the main control module 203b generates the reminder "please turn down the fire or turn off the heating" and provides it to the broadcasting module 204b, which broadcasts it; after hearing the broadcast, the user turns down the fire or turns off the heating.
Alternatively, the kitchen robot may further be provided with a smoke detector. When the smoke detector detects that the amount of smoke in the kitchen robot's current environment exceeds a set smoke threshold, it sends reminder information to the main control module 203b, prompting it to check whether the range hood is turned on. After the main control module 203b receives the reminder, the voice recognition module 202b recognizes in real time the noise picked up by the sound receiving module 201b in the kitchen robot's environment and judges whether the picked-up noise is noise generated by the range hood. If the similarity between the picked-up noise and the noise generated by the range hood exceeds the set similarity threshold, the picked-up noise is considered to be range hood noise and the range hood is in the on state; otherwise, the picked-up noise is considered not to be range hood noise and the range hood is not on. When the voice recognition module 202b determines that the range hood is not on, it sends notification information to the main control module 203b, which then sends reminder information to the user through the broadcasting module 204b, reminding the user to turn on the range hood.
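The smoke-detector-driven check above reduces to a small decision function. This is a sketch under stated assumptions: the smoke threshold value and the pre-computed hood-noise similarity are placeholders, not values from the patent.

```python
def hood_check(smoke_amount, hood_similarity,
               smoke_threshold=50, similarity_threshold=0.8):
    """Return the reminder text to broadcast, or None if no reminder is
    needed.

    smoke_amount: reading from the smoke detector (units assumed).
    hood_similarity: similarity between the picked-up noise and the
    reference noise generated by the range hood.
    """
    if smoke_amount <= smoke_threshold:
        return None  # smoke level normal, nothing to check
    if hood_similarity >= similarity_threshold:
        return None  # hood noise detected, hood already on
    return "Please turn on the range hood"
```

Only the combination of excessive smoke and absent hood noise triggers the broadcast, matching the two-stage check described in the text.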
In the present embodiment, the installation position of the sound receiving module 201b is not limited. In one alternative embodiment, the sound receiving module 201b is detachable and can be embedded at the bottom of the kitchen robot's display screen, facing the user; soft rubber isolates the module from its mounting position, which reduces vibration, enhances the sound pickup effect, and effectively reduces oil contamination. The sound receiving module 201b communicates with the other modules over a wired connection, which increases data transmission speed. In another alternative embodiment, the detachable sound receiving module 201b can be worn at the user's collar, with data transmission between it and the other modules realized by wireless communication technology, ensuring sound pickup quality. In yet another alternative embodiment, two sound receiving modules 201b may be used: one installed on the kitchen robot to collect noise in its environment, and one worn at the user's collar to ensure reliable pickup of voice instructions. Making the sound receiving module 201b detachable also makes it easy to clean, preventing oil stains from accumulating on its surface in a kitchen environment that goes uncleaned for long periods and degrading the sound pickup effect.
In an alternative embodiment, the kitchen robot may also perform voice conversion on the structured data or on the firmware file required for the kitchen robot to execute its job task, to obtain and output a corresponding voice signal. The firmware file contains prompt information such as novice guides, fault reminders, and operation reminders. For ease of distinction and description, the prompt information of the reference structured data in the embodiment shown in fig. 1a is referred to in this embodiment as first prompt information, and the prompt information of the firmware file in the embodiment shown in fig. 2c as second prompt information.
For the structured data, in the system shown in fig. 2c, the terminal device 23b or the server 22b may send the structured data to the kitchen robot 21b. The kitchen robot (main control module 203b) receives the structured data through the wireless communication module 210b (such as a Bluetooth module) and sends it to the voice module 200b, which parses it internally to obtain the name of the structured data, the execution flow information, the attribute information of the data objects, the execution status information during task execution by the kitchen robot 21b, and so on. The voice module generates a voice text from the parsed content and sends it to the TTS module in the voice recognition module 202b, which converts the received voice text into a voice signal. Alternatively, in the embodiment shown in fig. 1a, the first prompt information in the generated reference structured data may be rendered as a voice text and converted into a voice signal by the TTS module.
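The parse-then-verbalize step above can be illustrated with a minimal sketch. The field names (`name`, `steps`, `order`, `action`, `data_objects`) are assumptions for demonstration; the patent does not specify the structured data's actual schema.

```python
def structured_data_to_voice_text(data):
    """Build a broadcastable voice text from parsed structured data.
    Steps that depend on data objects get a 'please add ...' prompt."""
    lines = [f"Starting task: {data['name']}"]
    for step in data["steps"]:
        objects = ", ".join(step.get("data_objects", []))
        if objects:
            lines.append(f"Step {step['order']}: {step['action']}, "
                         f"please add {objects}")
        else:
            lines.append(f"Step {step['order']}: {step['action']}")
    return ". ".join(lines)
```

The resulting string is what would be handed to the TTS module for synthesis into a voice signal.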
For the firmware file, in the system shown in fig. 2c, the terminal device 23b provides the firmware file required by the kitchen robot 21b over the air (Over The Air, OTA). In this embodiment this file is referred to as the firmware OTA file; it includes second prompt information such as novice guides, fault reminders, and operation reminders, and it does not change across different job tasks. To modify the prompt information in the firmware OTA file, the device firmware must be updated via OTA. Further, the main control module 203b may parse the firmware OTA file, generate a voice text from the parsed content, and send the generated voice text to the TTS module in the voice recognition module 202b.
As shown in fig. 2c, in addition to the sound receiving module 201b, the voice recognition module 202b, the main control module 203b, and the broadcasting module 204b (the sound receiving module 201b is not shown in fig. 2c), the voice module 200b of the kitchen robot further includes a power amplifier module 205b and a storage module 206b. The main control module 203b may belong to the voice module 200b, or may be the main controller of the kitchen robot; the storage module 206b is configured to store structured data or firmware OTA files; the power amplifier module 205b is used to amplify voice signals.
In this embodiment, after the main control module 203b generates the voice text, it provides the text to the TTS module in the voice recognition module 202b through a Universal Asynchronous Receiver/Transmitter (UART) interface or a Serial Peripheral Interface (SPI). The TTS module synthesizes the voice text into a voice signal and outputs it to the power amplifier module 205b, which amplifies the signal and outputs it to the broadcasting module 204b; the broadcasting module 204b then plays the voice signal.
In an alternative embodiment, the user may send control instructions to the kitchen robot through the APP of the terminal device. There are mainly two types of control instruction. The first type controls the kitchen robot itself, for example its current state (such as the awake state or standby state) or its voice broadcasting function (such as volume adjustment). In the case of volume adjustment, the main control module 203b receives and parses the user's control instruction and adjusts the playing volume of the broadcasting module 204b accordingly. The second type queries the state of the kitchen robot, for example an instruction to obtain the current fire power. In that case, the main control module 203b obtains the current fire power according to the instruction and provides it in the form of a voice text to the TTS module in the voice recognition module 202b; the TTS module converts the text into a voice signal, which is then broadcast by the broadcasting module 204b via the power amplifier module 205b.
In this embodiment, the flow by which the main control module 203b controls voice broadcasting is not limited. In one alternative embodiment, since broadcast time varies with the length of the broadcast content, the main control module 203b uses a first-in-first-out (FIFO) queue to buffer the voice to be broadcast. In another alternative embodiment, voice broadcasting includes sequential broadcasting and plug-in broadcasting: for example, cooking steps are broadcast sequentially, while fault-type reminders (such as over-temperature or abnormal communication) are broadcast via plug-in. A FIFO API may be invoked for sequential broadcasting, and a last-in-first-out (LIFO) API may be invoked for plug-in broadcasting so that fault broadcasts take priority. In yet another alternative embodiment, after the kitchen robot is powered on it enters a novice-guide voice broadcast and the main control module 203b invokes the FIFO API; if the user chooses to skip the novice-guide broadcast, an overwrite/overlay API is invoked to skip it.
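The two broadcast modes above can be sketched with a single double-ended queue: sequential announcements enter at the back (FIFO), while fault reminders are pushed to the front (plug-in broadcast). The class and method names are illustrative, not the patent's actual APIs.

```python
from collections import deque

class BroadcastQueue:
    """Buffer for voice texts awaiting broadcast, supporting both
    sequential (FIFO) and plug-in (priority) insertion."""

    def __init__(self):
        self._queue = deque()

    def enqueue_sequential(self, text):
        """FIFO: cooking steps and other ordinary announcements."""
        self._queue.append(text)

    def enqueue_urgent(self, text):
        """Plug-in broadcast: fault reminders jump ahead of queued items."""
        self._queue.appendleft(text)

    def next_broadcast(self):
        """Pop the next text to play, or None if the queue is empty."""
        return self._queue.popleft() if self._queue else None
```

With this arrangement an over-temperature reminder enqueued mid-recipe plays before the remaining cooking-step announcements, matching the priority behavior described.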
In an alternative embodiment, the pronunciation of polyphonic words may be adjusted based on semantics, for example by replacing an error-prone word with a homophone. For instance, the broadcasting system may read the "gan" in "dry peppers" (gan lajiao) with the wrong tone; when generating the voice text, the main control module 203b may replace "dry peppers" with the phonetic form "Gan Lajiao" so that the broadcasting module pronounces it correctly.
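The homophone-substitution trick can be sketched as a pre-TTS string rewrite. The substitution table below is a hypothetical example; the actual error-prone words would depend on the TTS engine in use.

```python
# Hypothetical table of words the TTS engine tends to mispronounce,
# mapped to phonetic spellings that force the intended reading.
POLYPHONE_FIXES = {
    "dry pepper": "Gan Lajiao",
}

def fix_polyphones(voice_text):
    """Rewrite error-prone polyphonic words before sending the voice
    text to the TTS module."""
    for word, replacement in POLYPHONE_FIXES.items():
        voice_text = voice_text.replace(word, replacement)
    return voice_text
```

The rewrite happens in the main control module when the voice text is generated, so the TTS module only ever sees the corrected form.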
In this embodiment, no voice packets need to be produced: reminder information such as data objects, execution steps, temperature warnings, and timers during the execution of different job tasks can be broadcast individually by voice. In addition, adding, deleting, or changing voices no longer requires producing individual voice files for OTA delivery over a long period; flow-type and warning-type voices can be added, deleted, or modified through OTA device firmware; and the data objects and execution steps in the cooking process are broadcast through automatic recognition of the structured data, requiring no separate maintenance, with broadcast content modified simply by modifying the content of the structured data.
It should be noted that, the execution subjects of each step of the method provided in the above embodiment may be the same device, or the method may also be executed by different devices. For example, the execution subject of steps 101a to 103a may be the device a; for another example, the execution subject of steps 101a and 102a may be device a, and the execution subject of step 103a may be device B; etc.
In addition, some of the flows described in the above embodiments and drawings include a plurality of operations appearing in a specific order, but it should be clearly understood that these operations may be performed out of the order in which they appear herein or in parallel; sequence numbers such as 101a and 102a are merely used to distinguish the operations, and the numbers themselves do not represent any order of execution. The flows may also include more or fewer operations, which may be performed sequentially or in parallel. It should be noted that the descriptions "first", "second", etc. herein are used to distinguish different messages, devices, modules, and so on; they do not represent a sequence, nor do they require that the "first" and "second" be of different types.
Fig. 5 is a schematic structural view of a kitchen robot according to an exemplary embodiment of the present application. As shown in fig. 5, the kitchen robot includes: a memory 54 and a processor 55.
The memory 54 is used for storing a computer program and may be configured to store various other data to support operations on the kitchen robot. Examples of such data include instructions for any application or method operating on the kitchen robot.
A processor 55 coupled to the memory 54 for executing the computer program in the memory 54 for: recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation tasks by the kitchen robot; generating prompt information corresponding to a first type of operation step in the plurality of operation steps, wherein the first type of operation step at least comprises operation steps needing to depend on a data object; generating reference structured data according to the plurality of operation steps, the execution sequence among the plurality of operation steps and the prompt information corresponding to the first type of operation steps; the prompting information is used for prompting a user to execute auxiliary operation adapted to the first type of operation step when any kitchen robot executes the operation task according to the reference structured data and executes the first type of operation step, so as to complete the operation task in cooperation with any kitchen robot.
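The recording-and-generation flow the processor performs can be illustrated with a minimal sketch: steps are logged in execution order, and first-type steps (those depending on a data object) receive prompt information. The field names and prompt wording are assumptions for illustration, not the patent's actual data format.

```python
def build_reference_structured_data(task_name, recorded_steps):
    """Generate reference structured data from steps recorded during a
    job task.

    recorded_steps: list of dicts, each with an 'action' key and, for
    first-type steps, a 'data_object' key naming the depended-on object.
    """
    steps = []
    for order, step in enumerate(recorded_steps, start=1):
        entry = {"order": order, "action": step["action"]}
        if "data_object" in step:  # first-type job step
            entry["prompt"] = f"Please add {step['data_object']} now"
        steps.append(entry)
    return {"name": task_name, "steps": steps}
```

Any kitchen robot replaying this structure can then output the prompt when it reaches a first-type step, cueing the user's auxiliary operation.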
In an alternative embodiment, the processor 55 is specifically configured to, when generating, for a first type of job step of the plurality of job steps, a hint information corresponding to the first type of job step: if the first type of operation step is the operation step depending on the data object, generating prompt information for prompting the user to operate the data object depending on the first type of operation step according to the dependence behavior of the first type of operation step on the data object, wherein the auxiliary operation executed by the user is to operate the data object depending on the first type of operation step.
In an alternative embodiment, processor 55 is further configured to, prior to generating the hint information corresponding to the first type of job step: when the kitchen robot executes the specified operation step, displaying a marking option for the specified operation step so as to enable a user to mark whether the specified operation step depends on the data object; if the designated job step is marked as a dependent data object, taking the designated job step as a first type of job step, and acquiring identification information of the data object on which the designated job step depends, which is configured by a user for the designated job step.
In an alternative embodiment, the processor 55, when acquiring the identification information of the data object configured by the user for the specified job step, is specifically configured to: displaying a data object list required by the kitchen robot to execute the job task, wherein the data object list comprises identification information of at least one data object; in response to a user selection operation of the data object list, identification information of the data object selected by the user is acquired as identification information of the data object on which the specified job step depends.
In an alternative embodiment, the processor 55 is further configured to, prior to the kitchen robot performing the task: displaying a pre-editing interface, wherein the pre-editing interface comprises candidate data objects and/or editing controls so that a user can edit a data object list required by the kitchen robot to execute a job task; in response to a user's pre-editing operation, a list of data objects required for the kitchen robot to perform a job task is generated.
In an alternative embodiment, the first type of job step further includes a job step that depends on the user configurable job parameter, and the processor 55 is specifically configured to, when generating the prompt corresponding to the first type of job step for the first type of job steps of the plurality of job steps: if the first type of job step is a job step depending on the user-configurable job parameter, a prompt message for prompting the user to configure the user-configurable job parameter is generated, and the auxiliary operation executed by the user is to set or adjust the user-configurable job parameter.
In an alternative embodiment, processor 55 is further configured to: recording non-user-configurable job parameters used when the kitchen robot performs a second type of job step for the second type of job step of the plurality of job steps; the reference structured data further comprises non-user-configurable job parameters corresponding to the second class of job steps; the second type of job step is a job step other than the first type of job step.
In an alternative embodiment, the processor 55 is further configured to, prior to the kitchen robot performing the task: the mode setting interface is displayed and at least comprises a first authoring mode and a second authoring mode; responding to the selection operation of the user, and determining that the user selects a first creation mode; wherein the first authoring mode is for indicating generation of reference structured data and the second authoring mode is for indicating generation of follow-up structured data.
In an alternative embodiment, if the user selects the second authoring mode, the processor 55 is further configured to: recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation tasks by the kitchen robot; for a third type of operation step in the plurality of operation steps, responding to configuration operation of a user, and recording attribute information of a data object which is configured for the third type of operation step by the user and depends on the third type of operation step, wherein the third type of operation step is an operation step which needs to depend on the data object but does not give data object identification information in the plurality of operation steps; and generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps and the attribute information of the data object on which the third class of job steps depend.
In an alternative embodiment, processor 55 is further configured to: for each job step, record the job parameters used by the kitchen robot when performing that job step; the follow-up structured data then further includes the job parameters used when performing each job step.
In an alternative embodiment, processor 55 is also configured to perform at least one of the following: displaying the follow-up structured data on the first preview interface, and responding to the correction operation initiated by the user on the first preview interface to perform the correction operation on the follow-up structured data; displaying the reference structured data on the second preview interface, and responding to the correction operation initiated by the user on the second preview interface, and performing the correction operation on the reference structured data; the correction operation includes at least one of adding a job step, deleting a job step, modifying a job parameter, adding a video or a picture corresponding to the job step, and modifying a name of the structured data.
In an alternative embodiment, in the add-job-step correction operation, processor 55 performs at least one of the following: adding a stirring step after the cover-closing step; adding a cover-opening step at the final position of the follow-up or reference structured data; and adding a cover-closing step at the final position of the follow-up or reference structured data.
Further, as shown in fig. 5, the kitchen robot further includes: a communication component 56, a display 57, a power component 58, an audio component 59, and other components. Only some components are shown schematically in fig. 5, which does not mean that the kitchen robot includes only the components shown. It should be noted that the components within the dashed box in fig. 5 are optional rather than mandatory, depending on the product form of the kitchen robot.
Accordingly, embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, causes the processor to implement the steps of the methods shown in fig. 1a to 1c provided by the embodiments of the present application.
The embodiment of the application also provides a kitchen robot, the implementation structure of which is the same as or similar to that of the kitchen robot shown in fig. 5, and can be realized by referring to the structure of the kitchen robot shown in fig. 5. The difference between the kitchen robot provided in this embodiment and the kitchen robot in the embodiment shown in fig. 5 is that: the functions implemented by a processor executing a computer program stored in memory are different. For the kitchen robot provided in this embodiment, the processor executes the computer program stored in the memory, and may be used to: recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation tasks by the kitchen robot; for a third type of operation step in the plurality of operation steps, responding to configuration operation of a user, and recording attribute information of a data object which is configured for the third type of operation step by the user and depends on the third type of operation step, wherein the third type of operation step is an operation step which needs to depend on the data object but does not give data object identification information in the plurality of operation steps; and generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps and the attribute information of the data object on which the third class of job steps depend.
Accordingly, embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the steps of the method shown in fig. 1c provided by the embodiments of the present application.
The embodiment of the application also provides a kitchen robot, the implementation structure of which is the same as or similar to that of the kitchen robot shown in fig. 5, and can be realized by referring to the structure of the kitchen robot shown in fig. 5. The difference between the kitchen robot provided in this embodiment and the kitchen robot in the embodiment shown in fig. 5 is that: the functions implemented by a processor executing a computer program stored in memory are different. For the kitchen robot provided in this embodiment, the processor executes the computer program stored in the memory, and may be used to: acquiring reference structured data, wherein the reference structured data comprises a plurality of operation steps in the operation task executed by the kitchen robot, an execution sequence among the plurality of operation steps and prompt information corresponding to a first type of operation step in the plurality of operation steps; sequentially executing the plurality of operation steps according to the execution sequence among the plurality of operation steps; and when the first type of operation step is executed, outputting prompt information to prompt a user to execute auxiliary operation matched with the first type of operation step so as to complete the operation task by matching with the kitchen robot.
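The execution side described here can be sketched as a loop over the reference structured data: steps run in execution order, and when a first-type step is reached its prompt is output so the user can perform the matching auxiliary operation. The callback-based design and field names are illustrative assumptions.

```python
def execute_reference_structured_data(data, run_step, output_prompt):
    """Replay reference structured data on a kitchen robot.

    run_step(action): executes one job step on the hardware (stand-in).
    output_prompt(text): broadcasts the prompt for first-type steps.
    """
    for step in sorted(data["steps"], key=lambda s: s["order"]):
        if "prompt" in step:  # first-type job step: cue the user first
            output_prompt(step["prompt"])
        run_step(step["action"])
```

Sorting by the recorded order field enforces the execution sequence among the job steps, and prompts are emitted just before the step that depends on the user's cooperation.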
In an alternative embodiment, the first type of operation step includes an operation step that needs to rely on a data object, and the prompt information is used for prompting the user to operate on the data object on which the first type of operation step depends; or the first type of job step comprises a job step depending on the user-configurable job parameters, and the prompt information is used for prompting the user to configure or adjust the user-configurable job parameters.
In an alternative embodiment, during the execution of the working steps by the kitchen robot, the processor is further adapted to: receiving a parameter adjustment instruction sent by a user, wherein the parameter adjustment instruction comprises new user configurable operation parameters which are needed to be used by the kitchen robot; and controlling the kitchen robot to execute the subsequent operation steps according to the new user configurable operation parameters.
Accordingly, embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the steps of the method shown in fig. 2a provided by the embodiments of the present application.
Fig. 6 is a schematic structural diagram of a terminal device according to an exemplary embodiment of the present application. As shown in fig. 6, the terminal device includes: a memory 64 and a processor 65. Further, the terminal device includes: a display 67.
The memory 64 is used for storing a computer program and may be configured to store other various data to support operations on the terminal device. Examples of such data include instructions for any application or method operating on the terminal device.
A processor 65 coupled to the memory 64 for executing the computer program in the memory 64 for: displaying a job control interface through a display screen 67, and controlling the kitchen robot to execute a job task based on a control on the job control interface; receiving and displaying the currently executed operation steps of the kitchen robot in the process of executing the operation tasks by the kitchen robot; when the currently executed operation step is a designated operation step, responding to a marking operation of marking the currently executed operation step as a first type operation step by a user, and generating prompt information corresponding to the first type operation step; generating reference structured data according to the operation steps sequentially executed by the kitchen robot and prompt information corresponding to the first type of operation steps; the prompting information is used for prompting a user to execute auxiliary operation adapted to the first type of operation step when any kitchen robot executes the operation task according to the reference structured data and executes the first type of operation step, so as to complete the operation task in cooperation with any kitchen robot.
Further, as shown in fig. 6, the terminal device further includes: a communication component 66, a power component 68, an audio component 69, and other components. Only some components are shown schematically in fig. 6, which does not mean that the terminal device includes only the components shown. It should be noted that the components within the dashed box in fig. 6 are optional rather than mandatory, depending on the product form of the terminal device.
Accordingly, embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the steps of the method shown in fig. 1d provided by the embodiments of the present application.
The embodiment of the application also provides a terminal device, the implementation structure of which is the same as or similar to that of the terminal device shown in fig. 6, and can be realized by referring to the structure of the terminal device shown in fig. 6. The terminal device provided in this embodiment is different from the terminal device in the embodiment shown in fig. 6 mainly in that: the functions implemented by a processor executing a computer program stored in memory are different. The terminal device provided in this embodiment may have a processor configured to execute a computer program stored in a memory, and be configured to: displaying a job control interface, and controlling the kitchen robot to execute a job task based on a control on the job control interface; receiving and displaying the currently executed operation steps of the kitchen robot in the process of executing the operation tasks by the kitchen robot; when the currently executed operation step is a designated operation step, an object configuration interface is displayed in response to a marking operation of marking the currently executed operation step as a third type operation step, wherein the third type operation step is an operation step which needs to rely on a data object; responding to the configuration operation of the user on the object configuration interface, and recording attribute information of the data object on which the user depends, which is configured for the third class of operation steps; and generating follow-up structured data according to the operation steps sequentially executed by the kitchen robot and the attribute information of the data objects relied by the third type of operation steps.
Accordingly, embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the steps of the method shown in fig. 1e provided by the embodiments of the present application.
The communication component of the above embodiments is configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as a WiFi, 2G, 3G, 4G/LTE, or 5G mobile communication network, or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further comprises a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The memory in the above embodiments may be implemented by any type of volatile or nonvolatile memory device, or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
The display screen in the above embodiments includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The power component in the above embodiments provides power for the various components of the device in which the power component is located. The power component may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio component of the above embodiments may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive external audio signals when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signal may be further stored in the memory or transmitted via the communication component. In some embodiments, the audio component further comprises a speaker for outputting audio signals.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.
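As a further illustrative aside, the reference structured data described in the foregoing embodiments, which records the operation steps, their execution sequence, and the prompt information for first-type steps that require a user's auxiliary operation, might be serialized and consumed along the following lines. This is a hypothetical sketch; the function names, dictionary keys, and example prompts are illustrative and not taken from the application.

```python
def generate_reference_structured_data(steps, prompts):
    """steps: list of (order, name) pairs recorded from the robot.
    prompts: dict mapping a step name to the prompt text for first-type
    steps that need the user's auxiliary operation."""
    return {
        "type": "reference",
        "steps": [
            {"order": order, "name": name,
             # attach prompt information only to first-type steps
             **({"prompt": prompts[name]} if name in prompts else {})}
            for order, name in sorted(steps)
        ],
    }

def execute(data):
    """Walk the steps in sequence and collect the prompt a robot would
    output on reaching each first-type step."""
    emitted = []
    for step in data["steps"]:
        if "prompt" in step:
            emitted.append(step["prompt"])
    return emitted

data = generate_reference_structured_data(
    steps=[(2, "stir"), (1, "add oil"), (3, "heat")],
    prompts={"add oil": "Please add the oil now."},
)
print(execute(data))  # prints ['Please add the oil now.']
```

In this sketch any robot replaying the reference structured data outputs the stored prompt exactly when it reaches the first-type step, which is the cooperative behavior the embodiments describe.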

Claims (16)

1. A structured data generation method, comprising:
recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
generating, for a first type of operation step in the plurality of operation steps, prompt information corresponding to the first type of operation step, wherein the first type of operation step at least comprises an operation step that needs to depend on a data object;
generating reference structured data according to the plurality of operation steps, the execution sequence among the plurality of operation steps and the prompt information corresponding to the first type of operation steps;
wherein the prompt information is used for prompting a user, when any kitchen robot executes an operation task according to the reference structured data and executes the first type of operation step, to execute an auxiliary operation adapted to the first type of operation step, so as to complete the operation task in cooperation with that kitchen robot;
wherein before the kitchen robot executes the operation task, the method further comprises:
displaying a mode setting interface, wherein the mode setting interface at least comprises a first authoring mode and a second authoring mode; and determining, in response to a selection operation of a user, that the user selects the first authoring mode;
wherein the first authoring mode is used for indicating to generate reference structured data, and the second authoring mode is used for indicating to generate follow-up structured data;
if the user selects the second authoring mode, the method further comprises:
recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
for a third type of operation step in the plurality of operation steps, recording, in response to a configuration operation of a user, attribute information, configured by the user, of the data object on which the third type of operation step depends, wherein the third type of operation step is an operation step, among the plurality of operation steps, that needs to depend on a data object but for which no data object identification information is given;
and generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps, and the attribute information of the data object on which the third type of job steps depend.
2. The method of claim 1, wherein generating, for a first type of job step of the plurality of job steps, prompt information corresponding to the first type of job step comprises:
if the first type of operation step is an operation step depending on a data object, generating, according to the dependence behavior of the first type of operation step on the data object, prompt information for prompting a user to operate on the data object on which the first type of operation step depends, wherein the auxiliary operation executed by the user is operating on the data object on which the first type of operation step depends.
3. The method of claim 2, further comprising, prior to generating the prompt information corresponding to the first type of job step:
when the kitchen robot executes a designated job step, displaying a marking option for the designated job step so as to enable a user to mark whether the designated job step depends on a data object;
and if the designated job step is marked as depending on a data object, taking the designated job step as a first-type job step, and acquiring identification information, configured by the user for the designated job step, of the data object on which the designated job step depends.
4. The method according to claim 3, wherein acquiring the identification information of the data object configured by the user for the designated job step comprises:
displaying a data object list required by the kitchen robot to execute a job task, wherein the data object list comprises identification information of at least one data object;
and acquiring, in response to a selection operation of the user on the data object list, the identification information of the data object selected by the user as the identification information of the data object on which the designated job step depends.
5. The method of claim 4, further comprising, prior to the kitchen robot performing the task:
displaying a pre-editing interface, wherein the pre-editing interface comprises candidate data objects and/or editing controls for a user to edit a data object list required by the kitchen robot to execute a job task;
and generating, in response to a pre-editing operation of the user, the data object list required by the kitchen robot to execute the job task.
6. The method according to any one of claims 2-5, wherein the first type of job step further comprises a job step depending on user-configurable job parameters, and generating, for the first type of job step of the plurality of job steps, the prompt information corresponding to the first type of job step comprises:
if the first type of job step is a job step depending on user-configurable job parameters, generating prompt information for prompting a user to configure the user-configurable job parameters, wherein the auxiliary operation executed by the user is setting or adjusting the user-configurable job parameters.
7. The method as recited in claim 6, further comprising: for a second type of job step of the plurality of job steps, recording non-user-configurable job parameters used when the kitchen robot performs the second type of job step; wherein the reference structured data further comprises the non-user-configurable job parameters corresponding to the second type of job step, and the second type of job step is a job step other than the first type of job step.
8. The method of claim 1, further comprising any of the following operations:
displaying the follow-up structured data on a first preview interface, and performing, in response to a correction operation initiated by a user on the first preview interface, the correction operation on the follow-up structured data;
displaying the reference structured data on a second preview interface, and performing, in response to a correction operation initiated by a user on the second preview interface, the correction operation on the reference structured data;
wherein the correction operation comprises at least one of: adding a job step, deleting a job step, modifying a job parameter, adding a video or a picture corresponding to a job step, and modifying the name of the structured data.
9. The method of claim 8, wherein the modifying operation comprises at least one of:
adding or deleting a stirring step;
adding or deleting a feeding step;
increasing or decreasing the heating time;
increasing or decreasing the heating power;
increasing or decreasing the stirring speed;
adding or deleting a video or a picture guiding a job step;
adding or deleting a video or a picture showing the job process in real time;
adding or deleting a cover-opening step at any step of the follow-up or reference structured data;
adding or deleting a cover-closing step at any step of the follow-up or reference structured data.
10. A structured data generation method, comprising:
recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
for a third type of operation step in the plurality of operation steps, recording, in response to a configuration operation of a user, attribute information, configured by the user, of the data object on which the third type of operation step depends, wherein the third type of operation step is an operation step, among the plurality of operation steps, that needs to depend on a data object but for which no data object identification information is given;
and generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps, and the attribute information of the data object on which the third type of job steps depend.
11. A job task execution method, comprising:
acquiring reference structured data, wherein the reference structured data comprises a plurality of operation steps performed by a kitchen robot in executing an operation task, an execution sequence among the plurality of operation steps, and prompt information corresponding to a first type of operation step among the plurality of operation steps;
sequentially executing the plurality of job steps according to the execution sequence among the plurality of job steps; and
when the first type of operation step is executed, outputting the prompt information to prompt a user to execute an auxiliary operation matched with the first type of operation step, so as to complete the operation task in cooperation with the kitchen robot;
wherein before the kitchen robot executes the operation task, the method further comprises:
displaying a mode setting interface, wherein the mode setting interface at least comprises a first authoring mode and a second authoring mode; and determining, in response to a selection operation of a user, that the user selects the first authoring mode;
wherein the first authoring mode is used for indicating to generate reference structured data, and the second authoring mode is used for indicating to generate follow-up structured data;
if the user selects the second authoring mode, the method further comprises:
recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
for a third type of operation step in the plurality of operation steps, recording, in response to a configuration operation of a user, attribute information, configured by the user, of the data object on which the third type of operation step depends, wherein the third type of operation step is an operation step, among the plurality of operation steps, that needs to depend on a data object but for which no data object identification information is given;
and generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps and the attribute information of the data object on which the third class of job steps depend.
12. The method of claim 11, wherein the first type of job step includes a job step that needs to depend on a data object, and the prompt information is used to prompt a user to operate on the data object on which the first type of job step depends; or
the first type of job step includes a job step depending on user-configurable job parameters, and the prompt information is used to prompt a user to configure or adjust the user-configurable job parameters.
13. A structured data generation method, applicable to a terminal device, comprising:
displaying a job control interface, and controlling the kitchen robot to execute a job task based on a control on the job control interface;
receiving and displaying the currently executed operation steps of the kitchen robot in the process of executing the operation tasks by the kitchen robot;
when the currently executed operation step is a designated operation step, generating, in response to a marking operation of a user marking the currently executed operation step as a first-type operation step, prompt information corresponding to the first type of operation step;
generating reference structured data according to the operation steps sequentially executed by the kitchen robot and the prompt information corresponding to the first type of operation steps;
wherein the prompt information is used for prompting a user, when any kitchen robot executes an operation task according to the reference structured data and executes the first type of operation step, to execute an auxiliary operation adapted to the first type of operation step, so as to complete the operation task in cooperation with that kitchen robot;
wherein before the kitchen robot executes the operation task, the method further comprises:
displaying a mode setting interface, wherein the mode setting interface at least comprises a first authoring mode and a second authoring mode; and determining, in response to a selection operation of a user, that the user selects the first authoring mode;
wherein the first authoring mode is used for indicating to generate reference structured data, and the second authoring mode is used for indicating to generate follow-up structured data;
if the user selects the second authoring mode, the method further comprises:
recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
for a third type of operation step in the plurality of operation steps, recording, in response to a configuration operation of a user, attribute information, configured by the user, of the data object on which the third type of operation step depends, wherein the third type of operation step is an operation step, among the plurality of operation steps, that needs to depend on a data object but for which no data object identification information is given;
and generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps and the attribute information of the data object on which the third class of job steps depend.
14. A structured data generation method, applicable to a terminal device, comprising:
displaying a job control interface, and controlling the kitchen robot to execute a job task based on a control on the job control interface;
receiving and displaying the currently executed operation steps of the kitchen robot in the process of executing the operation tasks by the kitchen robot;
when the currently executed operation step is a designated operation step, displaying an object configuration interface in response to a marking operation of a user marking the currently executed operation step as a third-type operation step, wherein a third-type operation step is an operation step that needs to depend on a data object;
recording, in response to a configuration operation of the user on the object configuration interface, attribute information, configured by the user for the third type of operation step, of the data object on which that step depends;
and generating follow-up structured data according to the operation steps sequentially executed by the kitchen robot and the attribute information of the data object on which the third type of operation steps depend.
15. A kitchen robot, comprising: a memory and a processor;
the memory is used for storing a computer program;
the processor, coupled to the memory, is configured to execute the computer program for: recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of the kitchen robot executing an operation task; generating, for a first type of operation step in the plurality of operation steps, prompt information corresponding to the first type of operation step, wherein the first type of operation step at least comprises an operation step that needs to depend on a data object; and generating reference structured data according to the plurality of operation steps, the execution sequence among the plurality of operation steps, and the prompt information corresponding to the first type of operation step; wherein the prompt information is used for prompting a user, when any kitchen robot executes an operation task according to the reference structured data and executes the first type of operation step, to execute an auxiliary operation adapted to the first type of operation step, so as to complete the operation task in cooperation with that kitchen robot;
wherein before the kitchen robot executes the operation task, the processor is further configured to:
display a mode setting interface, wherein the mode setting interface at least comprises a first authoring mode and a second authoring mode; and determine, in response to a selection operation of a user, that the user selects the first authoring mode;
wherein the first authoring mode is used for indicating to generate reference structured data, and the second authoring mode is used for indicating to generate follow-up structured data;
if the user selects the second authoring mode, the processor is further configured to:
recording a plurality of operation steps executed by the kitchen robot and an execution sequence among the plurality of operation steps in the process of executing the operation task by the kitchen robot;
for a third type of operation step in the plurality of operation steps, recording, in response to a configuration operation of a user, attribute information, configured by the user, of the data object on which the third type of operation step depends, wherein the third type of operation step is an operation step, among the plurality of operation steps, that needs to depend on a data object but for which no data object identification information is given;
and generating follow-up structured data according to the plurality of job steps, the execution sequence among the plurality of job steps and the attribute information of the data object on which the third class of job steps depend.
16. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, causes the processor to carry out the steps of the method of any one of claims 1-14.
CN202110678876.1A 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium Active CN113485130B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110678876.1A CN113485130B (en) 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium
CN202310862225.7A CN117389162A (en) 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110678876.1A CN113485130B (en) 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310862225.7A Division CN117389162A (en) 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium

Publications (2)

Publication Number Publication Date
CN113485130A CN113485130A (en) 2021-10-08
CN113485130B true CN113485130B (en) 2023-08-11

Family

ID=77935617

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110678876.1A Active CN113485130B (en) 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium
CN202310862225.7A Pending CN117389162A (en) 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310862225.7A Pending CN117389162A (en) 2021-06-18 2021-06-18 Structured data generation and operation method, kitchen robot and storage medium

Country Status (1)

Country Link
CN (2) CN113485130B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104106973A (en) * 2013-09-18 2014-10-22 珠海优特电力科技股份有限公司 Intelligent menu
CN107561958A (en) * 2016-06-30 2018-01-09 阿里巴巴集团控股有限公司 A kind of data method, device and system
CN108552936A (en) * 2018-05-11 2018-09-21 九阳股份有限公司 The cooking methods of cooking machine and cooking machine
CN112383455A (en) * 2020-10-30 2021-02-19 添可智能科技有限公司 Data generation method and execution method and equipment
CN112493824A (en) * 2020-11-16 2021-03-16 刘娴 Intelligent cooking auxiliary robot and control method thereof


Also Published As

Publication number Publication date
CN113485130A (en) 2021-10-08
CN117389162A (en) 2024-01-12

Similar Documents

Publication Publication Date Title
CN112789561B (en) System and method for customizing a portable natural language processing interface for an appliance
CN112840345B (en) System and method for providing a portable natural language processing interface across appliances
CN104898613B (en) The control method and device of smart home device
US11365890B2 (en) Cooker hood, and method and system for controlling same
WO2016138828A1 (en) Automatic cooking method and system, smart cooking device and smart oven
CN105425643A (en) Cooking control method and device
CN112925802A (en) Structured data generation method, device and storage medium
US20140038489A1 (en) Interactive plush toy
CN107664959A (en) Intelligent cooking system and its menu generation, cooking methods
US20220240744A1 (en) Scheduling system for autonomous robots
CN105278986A (en) Control method and apparatus of electronic device
CN110021299B (en) Voice interaction method, device, system and storage medium
EP3125506B1 (en) Method and apparatus for adjusting mode
CN104751379A (en) Manufacture method of digital menu
CN107908144B (en) Method and device for controlling smoke extractor and storage medium
CN112204942B (en) Photographing method and terminal equipment
CN112383455B (en) Data generation method and execution method and equipment
CN105159185A (en) Pressure cooker intelligent control method and apparatus
CN107219799A (en) One kind culinary art data automatic control system and control method
CN105955608B (en) Shortcut control method and device and electronic equipment
CN113485130B (en) Structured data generation and operation method, kitchen robot and storage medium
CN112331195A (en) Voice interaction method, device and system
CN111781842A (en) Method for controlling household equipment to work, wearable terminal and storage device
CN113485753A (en) Job execution method, kitchen robot, system, and storage medium
CN113450894B (en) Structured data, electronic menu generation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant