WO2018058491A1 - Cooking System and Method - Google Patents

Cooking System and Method

Info

Publication number
WO2018058491A1
WO2018058491A1 (PCT/CN2016/100983)
Authority
WO
WIPO (PCT)
Prior art keywords
cooking
somatosensory
action
smart
spatula
Prior art date
Application number
PCT/CN2016/100983
Other languages
English (en)
French (fr)
Inventor
李万建
林华山
钟志威
刘凯霞
韩仲亮
李雯雯
陈阳
陈志远
Original Assignee
深圳市赛亿科技开发有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市赛亿科技开发有限公司
Priority to CN201680001087.1A (CN108348084A)
Priority to PCT/CN2016/100983 (WO2018058491A1)
Publication of WO2018058491A1

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J: KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J27/00: Cooking-vessels
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24: HEATING; RANGES; VENTILATING
    • F24C: DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C1/00: Stoves or ranges in which the fuel or energy supply is not restricted to solid fuel or to a type covered by a single one of the following groups F24C3/00 - F24C9/00; Stoves or ranges in which the type of fuel or energy supply is not specified

Definitions

  • The present disclosure relates to the field of somatosensory interaction technology, and in particular to a cooking system and method.
  • The cooking system comprises a display end, a somatosensory control end, and a cooking end.
  • The display end acquires action information corresponding to recipe information and displays guidance for the somatosensory action according to the action information.
  • The somatosensory control end serves as the manipulation end through which the user cooks: it senses motion data corresponding to the somatosensory action guidance performed by the display end, invokes a cooking manipulation instruction according to the motion data, and transmits the instruction to the cooking end.
  • The cooking end receives the cooking manipulation instruction. It includes a robot arm and a smart spatula connected to the robot arm; it drives its own robot arm according to the instruction, the movement of the arm drives the attached smart spatula, and the moving spatula performs a cooking process consistent with the somatosensory action.
  • The somatosensory control end includes an attitude sensor, a storage module, and a somatosensory controller module. The attitude sensor performs algorithmic processing on the somatosensory action and outputs the motion data; the storage module stores preset action messages and the cooking manipulation instructions corresponding to them; the somatosensory controller module, wirelessly connected to both, compares the stored action messages with the received motion data and, when they match, invokes the cooking manipulation instruction corresponding to the matching action message.
  • The cooking end further includes a microprocessor, a driving device, and a smart spice box. The microprocessor, electrically connected to the somatosensory controller module and the driving device, receives the cooking manipulation instruction and controls the driving device to drive the robot arm, which in turn moves the connected smart spatula. The smart spice box senses the smart spatula and opens its own bottom when the spatula is sensed.
  • The smart spatula is further configured to sense the weight it carries, judge whether that weight reaches a preset weight, and, if so, control itself to perform a dumping action.
  • The microprocessor includes a data storage unit that stores recipe information; an instruction conversion unit that acquires the stored recipe information, converts it into corresponding action information, and outputs the action information to the display end; and a control center that issues driving signals according to the cooking manipulation instruction to control the driving device. The control center is further configured to generate a weighing control signal according to the cooking manipulation instruction and send it to the smart spatula, which senses its carried weight according to that signal.
  • The display end is wirelessly connected to the instruction conversion unit; it acquires the action information corresponding to the recipe information output by the instruction conversion unit and displays the somatosensory action guidance accordingly.
  • The smart spice box includes an infrared tube, consisting of an infrared emitting tube and an infrared receiving tube arranged opposite each other, which obtains a sensing signal when the emitted infrared light path is blocked, and a control unit, electrically connected to the infrared tube, which controls the smart spice box to open its own bottom according to the sensing signal.
  • The cooking method comprises: acquiring the action information corresponding to the recipe information and displaying the somatosensory action guidance according to it; sensing the motion data and generating a cooking manipulation instruction according to the motion data; driving the robot arm according to the cooking manipulation instruction, thereby moving the connected smart spatula; and the moving smart spatula performing a cooking process consistent with the somatosensory action.
  • FIG. 1 shows the structure of a cooking system according to an exemplary embodiment; FIG. 2 its somatosensory control end; FIG. 3 its cooking end; FIG. 4 the microprocessor of FIG. 3; FIG. 5 a cooking system according to another exemplary embodiment; FIG. 6 the smart spice box of FIG. 3; and FIG. 7 is a flow chart of the cooking method according to an exemplary embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Food-Manufacturing Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A cooking system and method. The cooking system includes a display end (10), a somatosensory control end (20), and a cooking end (30); the cooking end (30) includes a robot arm and a smart spatula connected to the robot arm. The cooking method includes the following steps: the display end (10) acquires action information corresponding to recipe information and displays guidance for a somatosensory action according to the action information; the somatosensory control end (20) senses motion data of the somatosensory action and controls the cooking end (30) according to the motion data; the cooking end (30) drives its own robot arm according to a cooking manipulation instruction, the movement of the robot arm drives the smart spatula connected to it, and the moving smart spatula performs a cooking process consistent with the somatosensory action. With this cooking system and method, cooking manipulation instructions are acquired through somatosensory interaction and the smart spatula is driven to perform the cooking process according to those instructions, so that somatosensory interaction is used to drive a real physical object, the smart spatula, to carry out concrete operations.

Description

DESCRIPTION
Title of Invention: Cooking System and Method

Technical Field
[0001] The present disclosure relates to the field of somatosensory interaction technology, and in particular to a cooking system and method.

Background Art
[0002] With the rapid development of smart homes, kitchen cooking has evolved from fully manual operation to intelligent cooking robots that carry out an automated cooking process.
[0003] Such an intelligent cooking process requires no user participation at all. In other words, the existing intelligent cooking process cannot adapt dynamically to the user's cooking needs; it merely completes the cooking indiscriminately under the control of computer instructions.
[0004] The existing intelligent cooking process therefore cannot adapt to different users and different cooking scenarios, and this is a major factor limiting its wide application.

Technical Problem
[0005] To solve the technical problem in the related art that the intelligent cooking process cannot adapt dynamically to actual use, the present disclosure provides a cooking system and method.

Solution to Problem
[0006] A cooking system, comprising:
[0007] a display end, which acquires action information corresponding to recipe information and displays guidance for a somatosensory action according to the action information;
[0008] a somatosensory control end, which serves as the manipulation end through which the user cooks; the somatosensory control end is configured to sense motion data, invoke a cooking manipulation instruction according to the motion data, and transmit it to the cooking end, the motion data corresponding to the somatosensory action guidance performed by the display end;
[0009] a cooking end, which receives the cooking manipulation instruction; the cooking end includes a robot arm and a smart spatula connected to the robot arm, drives its own robot arm according to the cooking manipulation instruction, and through the movement of the robot arm drives the smart spatula connected to the robot arm;
[0010] the moving smart spatula performs a cooking process consistent with the somatosensory action.
[0011] In one exemplary embodiment, the somatosensory control end includes an attitude sensor, a storage module, and a somatosensory controller module;
[0012] the attitude sensor is configured to perform algorithmic processing on the somatosensory action and to obtain and output the motion data;
[0013] the storage module is configured to store preset action messages and the cooking manipulation instructions corresponding to the action messages;
[0014] the somatosensory controller module is wirelessly connected to the attitude sensor and the storage module respectively; the somatosensory controller module is configured to receive the action messages stored by the storage module, receive the motion data, compare the action messages with the motion data, and judge whether they match; if so, it invokes the cooking manipulation instruction corresponding to the action message that matches the motion data.
[0015] In one exemplary embodiment, the cooking end further includes a microprocessor, a driving device, and a smart spice box;
[0016] the microprocessor is electrically connected to the somatosensory controller module and the driving device; the microprocessor is configured to receive the cooking manipulation instruction and, according to it, control the driving device to drive the robot arm, the movement of the robot arm driving the smart spatula connected to it;
[0017] the smart spice box is configured to sense the smart spatula and to open its own bottom when the smart spatula is sensed.
[0018] In one exemplary embodiment, the smart spatula is further configured to sense the weight it carries and judge whether that weight reaches a preset weight; if so, it controls itself to perform a dumping action.
[0019] In one exemplary embodiment, the microprocessor includes:
[0020] a data storage unit configured to store recipe information;
[0021] an instruction conversion unit electrically connected to the data storage unit and configured to acquire the recipe information stored by the data storage unit, convert the recipe information into corresponding action information, and output the action information to the display end;
[0022] a control center electrically connected to the driving device; the control center issues a driving signal according to the cooking manipulation instruction and, according to the driving signal, controls the driving device to drive the robot arm so as to move the smart spatula.
[0023] In one exemplary embodiment, the control center is further configured to generate a weighing control signal according to the cooking manipulation instruction and send it to the smart spatula;
[0024] the smart spatula is further configured to receive the weighing control signal issued by the control center and sense its carried weight according to the weighing control signal.
[0025] In one exemplary embodiment, the display end is wirelessly connected to the instruction conversion unit; the display end acquires the action information corresponding to the recipe information output by the instruction conversion unit and displays guidance for the somatosensory action according to the action information.
[0026] In one exemplary embodiment, the smart spice box includes:
[0027] an infrared tube, including an infrared emitting tube and an infrared receiving tube arranged opposite each other, the infrared tube being configured to obtain a sensing signal indicating that the emitted infrared light path is blocked;
[0028] a control unit electrically connected to the infrared tube and configured to control the smart spice box to open its own bottom according to the sensing signal.
[0029] A cooking method, comprising:
[0030] acquiring action information corresponding to recipe information, and displaying guidance for a somatosensory action according to the action information;
[0031] sensing the motion data, and generating a cooking manipulation instruction according to the motion data;
[0032] driving the system's own robot arm according to the cooking manipulation instruction, thereby moving the smart spatula connected to the robot arm;
[0033] the moving smart spatula performing a cooking process consistent with the somatosensory action.
Advantageous Effects of Invention
[0034] The technical solutions provided by the embodiments of the present disclosure may include the following advantageous effects:
[0035] The cooking system includes a display end, a somatosensory control end, and a cooking end. The display end acquires action information corresponding to recipe information and guides the somatosensory action according to that information; the somatosensory control end, serving as the user's manipulation end for cooking, senses motion data, invokes a cooking manipulation instruction according to the motion data, and transmits it to the cooking end, the motion data corresponding to the somatosensory action guidance performed by the display end; the cooking end receives the cooking manipulation instruction, includes a robot arm and a smart spatula connected to the robot arm, drives its own robot arm according to the instruction, and through the movement of the arm drives the attached smart spatula, which performs a cooking process consistent with the somatosensory action. The cooking system and method acquire cooking manipulation instructions through somatosensory interaction and drive the smart spatula to perform the cooking process according to those instructions, achieving the technical effect of dynamically adapting the intelligent cooking process to actual use.
[0036] It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.

Brief Description of Drawings
[0037] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention and serve, together with the specification, to explain its principles.
[0038] FIG. 1 is a structural block diagram of a cooking system according to an exemplary embodiment;
[0039] FIG. 2 is a structural block diagram of the somatosensory control end of the embodiment corresponding to FIG. 1;
[0040] FIG. 3 is a structural block diagram of the cooking end of the embodiment corresponding to FIG. 1;
[0041] FIG. 4 is a structural block diagram of one embodiment of the microprocessor of the embodiment corresponding to FIG. 3;
[0042] FIG. 5 is a structural block diagram of a cooking system according to another exemplary embodiment;
[0043] FIG. 6 is a structural block diagram of one embodiment of the smart spice box of the embodiment corresponding to FIG. 3;
[0044] FIG. 7 is a flow chart of a cooking method according to an exemplary embodiment.
Embodiments of the Invention
[0045] Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; they are merely examples of apparatuses and methods consistent with some aspects of the invention as detailed in the appended claims.
[0046] FIG. 1 is a structural block diagram of a cooking system according to an exemplary embodiment. Referring to FIG. 1, the cooking system includes, but is not limited to, a display end 10, a somatosensory control end 20, and a cooking end 30.
[0047] The display end 10 acquires action information corresponding to recipe information and displays guidance for a somatosensory action according to the action information.
[0048] It should first be noted that the cooking system completes the cooking process on the basis of a somatosensory interaction system: the user performs the corresponding somatosensory action following the guidance displayed by the display end 10, the somatosensory control end 20 senses the motion data of that action and, according to the motion data, controls the cooking end 30 to complete a cooking process consistent with the somatosensory action. The somatosensory action is the action the user makes following the guidance of the display end 10.
[0049] The somatosensory control end 20 serves as the manipulation end through which the user cooks; it senses motion data, invokes a cooking manipulation instruction according to the motion data, and transmits it to the cooking end 30, the motion data corresponding to the somatosensory action guidance performed by the display end 10.
[0050] The somatosensory control end 20 includes an acceleration sensor and an angular velocity sensor. The motion data corresponds to the somatosensory action guidance performed by the display end 10 and describes the user's somatosensory action; for example, the motion data collected by the somatosensory control end 20 includes the acceleration information from the acceleration sensor and the angular velocity information from the angular velocity sensor. From the acceleration and angular velocity information of the user's somatosensory action, the corresponding motion data is obtained; according to the motion data, a cooking manipulation instruction stored in advance on the somatosensory control end can be invoked, the instruction containing what is needed to control the cooking end 30 to perform a cooking process consistent with the somatosensory action.
[0051] The user makes a somatosensory action following the display end 10's guidance; the somatosensory control end 20 obtains motion data from that action, invokes the cooking manipulation instruction according to the motion data, and transmits the instruction to the cooking end 30.
[0052] The cooking end 30 receives the cooking manipulation instruction. The cooking end 30 includes a robot arm and a smart spatula connected to the robot arm; it drives its own robot arm according to the cooking manipulation instruction, and through the movement of the robot arm drives the attached smart spatula, which performs a cooking process consistent with the somatosensory action.
[0053] The cooking end 30 itself is connected to the robot arm, and the smart spatula is connected to that robot arm. The robot arm, driven according to the cooking manipulation instruction, moves the attached smart spatula through its own motion; the smart spatula performs a cooking process consistent with the somatosensory action.
[0054] On receiving the cooking manipulation instruction from the somatosensory control end 20, the cooking end 30 begins cooking. The cooking manipulation instruction controls the cooking end 30 to perform a cooking process consistent with the somatosensory action: according to the instruction, the cooking end 30's own robot arm is set in motion, and the movement of the arm drives the attached smart spatula. The spatula's motion is the cooking process consistent with the somatosensory action, achieving the goal of performing a cooking process that matches the somatosensory action guided by the display end 10.
[0055] This embodiment applies a somatosensory interaction system to a cooking system: cooking manipulation instructions are acquired through somatosensory interaction, and the smart spatula is driven to perform the cooking process according to those instructions, so that somatosensory interaction is used to drive a real physical object, the smart spatula, to carry out concrete operations.
[0056] FIG. 2 is a structural block diagram of one embodiment of the somatosensory control end of the embodiment corresponding to FIG. 1. As shown in FIG. 2, the somatosensory control end 20 includes, but is not limited to, an attitude sensor 210, a storage module 230, and a somatosensory controller module 250.
[0057] The attitude sensor 210 is configured to perform algorithmic processing on the somatosensory action and to obtain and output the motion data.
[0058] As a component of the somatosensory control end 20, the attitude sensor 210 may, in one exemplary embodiment, include an acceleration sensor and/or an angular velocity sensor, for example a triaxial acceleration sensor and/or a triaxial angular velocity sensor.
[0059] Algorithmic processing means that the information sensed about the somatosensory action, such as the acceleration information and the angular velocity information, is computed by a relevant algorithm to obtain the posture corresponding to the somatosensory action, which is then marked as motion data.
[0060] The user makes a somatosensory action following the guidance displayed by the display end 10; the attitude sensor 210 senses the user's somatosensory action, obtains the acceleration and angular velocity information of that action through the triaxial acceleration sensor and the triaxial angular velocity sensor, processes this information algorithmically to obtain the motion data corresponding to the somatosensory action, and outputs the motion data to the somatosensory controller module 250.
[0061] The storage module 230 is configured to store preset action messages and the cooking manipulation instructions corresponding to the action messages.
[0062] An action message is a message capable of invoking a cooking manipulation instruction; the action messages and the cooking manipulation instructions they invoke are stored in the storage module in advance.
[0063] An action message stored by the storage module 230 can invoke the corresponding cooking manipulation instruction, and the invoked instruction controls the cooking end 30 to perform the corresponding cooking process.
[0064] The somatosensory controller module 250 is connected to the attitude sensor 210 and the storage module 230 respectively. The somatosensory controller module 250 receives the action messages stored by the storage module 230, receives the motion data, compares and analyzes the action messages and the motion data, and judges whether they match; if so, it invokes the cooking manipulation instruction corresponding to the action message that matches the motion data.
[0065] The somatosensory controller module 250 receives the action messages stored by the storage module 230 as well as the motion data obtained from the attitude sensor 210's algorithmic processing of the somatosensory action. It compares the received action messages with the motion data to judge whether they match, that is, whether the degree of similarity between an action message and the motion data reaches a preset standard. If so, it invokes the cooking manipulation instruction corresponding to the action message that matches the motion data.
[0066] This embodiment acquires the cooking manipulation instruction through somatosensory interaction and starts cooking with it, realizing the function of controlling the start of cooking through somatosensory interaction.
[0067] FIG. 3 is a structural block diagram of one embodiment of the cooking end of the embodiment corresponding to FIG. 1. As shown in FIG. 3, the cooking end 30 includes, but is not limited to, a microprocessor 310, a driving device 330, and a smart spice box 350.
[0068] The microprocessor 310 is electrically connected to the somatosensory controller module 250 and the driving device 330. The microprocessor 310 receives the cooking manipulation instruction and, according to it, controls the driving device 330 to drive the robot arm, the movement of the arm driving the connected smart spatula.
[0069] The driving device 330 includes a driving module, a stepper motor, and gears; the stepper motor is connected to the driving module and the gears.
[0070] When the microprocessor 310 makes the driving device 330 operate, the driving module of the driving device 330 drives the stepper motor, the running stepper motor turns the gears, and the rotation of the gears moves the robot arm, thereby driving the smart spatula connected to the arm.
[0071] The smart spice box 350 senses the smart spatula and opens its own bottom when the smart spatula is sensed.
[0072] The smart spice box releases the seasoning it holds to the smart spatula by opening its bottom; when the smart spice box senses the smart spatula, it opens its bottom, ready to release the seasoning.
[0073] In this embodiment, the microprocessor 310 receives the cooking manipulation instruction and, according to it, controls the driving device 330 to drive the robot arm; the movement of the arm moves the attached smart spatula, and when the smart spatula moves below the smart spice box 350, the box senses the spatula and opens its bottom, carrying out a cooking process consistent with the somatosensory action.
[0074] In one exemplary embodiment, building on the embodiment corresponding to FIG. 3, the smart spatula is further configured to sense the weight it carries and judge whether that weight reaches a preset weight; if so, it controls itself to perform a dumping action; if not, it continues to sense its carried weight until the preset weight is reached.
[0075] The preset weight varies with the cooking process, and the spatula's self-controlled dumping action constitutes one step of the cooking process.
[0076] In this embodiment, the smart spatula senses the weight it carries and, on judging that the weight has reached the preset value, intelligently controls itself to perform the dumping action.
[0077] FIG. 4 is a structural block diagram of one embodiment of the microprocessor of the embodiment corresponding to FIG. 3. As shown in FIG. 4, the microprocessor 310 includes, but is not limited to, a data storage unit 311, an instruction conversion unit 312, and a control center 313.
[0078] The data storage unit 311 stores recipe information.
[0079] The instruction conversion unit 312 is electrically connected to the data storage unit 311; it acquires the recipe information stored by the data storage unit 311, converts the recipe information into corresponding action information, and outputs the action information to the display end 10.
[0080] The instruction conversion unit 312 acquires the recipe information stored in the data storage unit 311, converts it into corresponding action information, and outputs that information to the display end 10, which thereby obtains the basis for the somatosensory action guidance it displays in the corresponding embodiment.
[0081] The control center 313 is electrically connected to the driving device 330. The control center 313 issues a driving signal according to the cooking manipulation instruction and, according to the driving signal, controls the driving device 330 to drive the robot arm so as to move the smart spatula.
[0082] The driving signal controls the driving device 330 to drive the robot arm. On receiving a cooking manipulation instruction, the control center must control the driving device 330 to drive the arm and move the smart spatula, performing a cooking process consistent with the somatosensory action. The control center 313 can issue the driving signal according to the cooking manipulation instruction; this signal controls the driving device 330, which drives the robot arm to move the smart spatula.
[0083] In this embodiment, the instruction conversion unit 312 converts the recipe information stored in the data storage unit 311 into action information and outputs it to the display end 10, realizing the display end 10's function of displaying somatosensory action guidance according to the action information; and the control center 313 issues driving signals according to the cooking manipulation instruction, realizing the function of controlling the driving device 330 to drive the robot arm and move the smart spatula.
[0084] In one exemplary embodiment, building on the embodiment corresponding to FIG. 4, the control center 313 is further configured to generate a weighing control signal according to the cooking manipulation instruction and send it to the smart spatula;
[0085] the smart spatula is further configured to receive the weighing control signal issued by the control center 313 and sense its carried weight according to the weighing control signal.
[0086] The weighing control signal makes the smart spatula sense the weight it carries; the control center 313 issues the weighing control signal according to the cooking manipulation instruction to direct the cooking step in which the spatula senses its own load.
[0087] In this embodiment, the control center 313 generates a weighing control signal according to the cooking manipulation instruction and sends it to the smart spatula; after receiving the signal, the smart spatula senses its carried weight, realizing the spatula's weighing function.
[0088] FIG. 5 is a structural block diagram of a cooking system according to another exemplary embodiment. As shown in FIG. 5, the display end 10 is wirelessly connected to the instruction conversion unit 312; the display end 10 acquires the action information corresponding to the recipe information output by the instruction conversion unit 312 and displays guidance for the somatosensory action according to the action information.
[0089] The display end 10 must perform the function of guiding the user's somatosensory action; the instruction conversion unit 312 outputs the action information corresponding to the recipe information to the display end 10 by wireless transmission, and the display end 10 displays guidance for the user's somatosensory action according to that information.
[0090] In this embodiment, the display end 10 acquires the action information corresponding to the recipe information output by the instruction conversion unit 312, realizing the function of displaying somatosensory action guidance to the user.
[0091] FIG. 6 is a structural block diagram of one embodiment of the smart spice box of the embodiment corresponding to FIG. 3. As shown in FIG. 6, the smart spice box 350 includes, but is not limited to, an infrared tube 351 and a control unit 352.
[0092] The infrared tube 351 includes an infrared emitting tube and an infrared receiving tube arranged opposite each other; the infrared tube 351 obtains a sensing signal indicating that the emitted infrared light path is blocked.
[0093] The control unit 352 is electrically connected to the infrared tube 351; the control unit 352 controls the smart spice box 350 to open its own bottom according to the sensing signal.
[0094] Through the opposed infrared emitting tube and infrared receiving tube, the infrared tube 351 senses whether the emitted infrared light path is blocked; when the light path is blocked, it sends the sensing signal to the control unit 352, and on receiving the sensing signal the control unit controls the smart spice box 350 to open its bottom.
[0095] In this embodiment, the infrared tube 351 sends the control unit 352 the sensing signal that the emitted infrared light path is blocked, and the control unit 352 intelligently controls the smart spice box 350 to open its own bottom according to the sensing signal.
[0096] FIG. 7 is a flow chart of a cooking method according to an exemplary embodiment. As shown in FIG. 7, the cooking method may include the following steps:
[0097] In step 710, the action information corresponding to the recipe information is acquired, and guidance for the somatosensory action is displayed according to the action information.
[0098] The cooking system acquires the action information corresponding to the recipe information and displays it, completing the display of the somatosensory action guidance.
[0099] In step 730, the motion data is sensed, and a cooking manipulation instruction is generated according to the motion data.
[0100] The cooking system senses the motion data of the somatosensory action the user makes under the displayed guidance, generates a cooking manipulation instruction according to the motion data, and begins cooking.
[0101] In step 750, the system drives its own robot arm according to the cooking manipulation instruction, moving the smart spatula connected to the robot arm.
[0102] According to the cooking manipulation instruction, the cooking system drives the robot arm connected to it and, through the movement of the arm, moves the attached smart spatula.
[0103] In step 770, the moving smart spatula performs a cooking process consistent with the somatosensory action.
[0104] Once cooking has begun, the moving smart spatula performs its actions, and those actions are the cooking process consistent with the somatosensory action.
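Taken together, steps 710 through 770 form a short sense, match, actuate pipeline. The self-contained sketch below strings the steps together; every helper is a hypothetical stub standing in for the components described above.

```python
def display_guidance(action_info):
    """Step 710: display somatosensory-action guidance for one recipe step."""
    print("please perform:", action_info["gesture"])

def sense_motion_data():
    """Step 730, first half: stand-in for the attitude sensor's output."""
    return [0.12, 0.38, 0.08, -0.42]

def match_instruction(motion_data):
    """Step 730, second half: stand-in for template matching; returns a
    cooking manipulation instruction or None (hypothetical)."""
    return "CMD_STIR"

def drive_arm(instruction):
    """Steps 750 and 770: stand-in for driving the robot arm so the smart
    spatula performs the cooking process matching the guided action."""
    print("executing", instruction)

def cook(recipe_actions):
    """One pass of the method of FIG. 7 over a guided recipe."""
    for action_info in recipe_actions:
        display_guidance(action_info)                         # step 710
        instruction = match_instruction(sense_motion_data())  # step 730
        if instruction is not None:
            drive_arm(instruction)                            # steps 750, 770

cook([{"gesture": "stir"}, {"gesture": "flip"}])
```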
[0105] It should be understood that the present invention is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims

CLAIMS
[Claim 1] A cooking system, characterized in that the cooking system comprises:
a display end, which acquires action information corresponding to recipe information and displays guidance for a somatosensory action according to the action information;
a somatosensory control end, which serves as the manipulation end through which a user cooks, the somatosensory control end being configured to sense motion data, invoke a cooking manipulation instruction according to the motion data, and transmit it to a cooking end, the motion data corresponding to the somatosensory action guidance performed by the display end;
a cooking end, which receives the cooking manipulation instruction, the cooking end comprising a robot arm and a smart spatula connected to the robot arm, driving its own robot arm according to the cooking manipulation instruction, and driving the smart spatula connected to the robot arm through the movement of the robot arm;
the moving smart spatula performing a cooking process consistent with the somatosensory action.
[Claim 2] The system according to claim 1, characterized in that the somatosensory control end comprises an attitude sensor, a storage module, and a somatosensory controller module;
the attitude sensor is configured to perform algorithmic processing according to the somatosensory action and to obtain and output the motion data;
the storage module is configured to store preset action messages and the cooking manipulation instructions corresponding to the action messages;
the somatosensory controller module is wirelessly connected to the attitude sensor and the storage module respectively, and is configured to receive the action messages stored by the storage module, receive the motion data, compare the action messages with the motion data, and judge whether they match; if so, to invoke the cooking manipulation instruction corresponding to the action message that matches the motion data.
[Claim 3] The system according to claim 1, characterized in that the cooking end further comprises a microprocessor, a driving device, and a smart spice box;
the microprocessor is electrically connected to the somatosensory controller module and the driving device, and is configured to receive the cooking manipulation instruction and, according to it, control the driving device to drive the robot arm, the movement of the robot arm driving the smart spatula connected to the robot arm;
the smart spice box is configured to sense the smart spatula and open its own bottom when the smart spatula is sensed.
[Claim 4] The system according to claim 3, characterized in that the smart spatula is further configured to sense the weight it carries and judge whether that weight reaches a preset weight; if so, to control itself to perform a dumping action.
[Claim 5] The system according to claim 3, characterized in that the microprocessor comprises:
a data storage unit configured to store recipe information;
an instruction conversion unit electrically connected to the data storage unit and configured to acquire the recipe information stored by the data storage unit, convert the recipe information into corresponding action information, and output the action information to the display end;
a control center electrically connected to the driving device, the control center issuing a driving signal according to the cooking manipulation instruction and, according to the driving signal, controlling the driving device to drive the robot arm so as to move the smart spatula.
[Claim 6] The system according to claim 5, characterized in that the control center is further configured to generate a weighing control signal according to the cooking manipulation instruction and send it to the smart spatula;
the smart spatula is further configured to receive the weighing control signal issued by the control center and sense its carried weight according to the weighing control signal.
[Claim 7] The system according to claim 5, characterized in that the display end is wirelessly connected to the instruction conversion unit, and the display end acquires the action information corresponding to the recipe information output by the instruction conversion unit and displays guidance for the somatosensory action according to the action information.
[Claim 8] The system according to claim 3, characterized in that the smart spice box comprises:
an infrared tube, including an infrared emitting tube and an infrared receiving tube arranged opposite each other, the infrared tube being configured to obtain a sensing signal indicating that the emitted infrared light path is blocked;
a control unit electrically connected to the infrared tube and configured to control the smart spice box to open its own bottom according to the sensing signal.
[Claim 9] A cooking method, characterized in that the method comprises:
acquiring action information corresponding to recipe information, and displaying guidance for a somatosensory action according to the action information;
sensing the motion data, and generating a cooking manipulation instruction according to the motion data;
driving the system's own robot arm according to the cooking manipulation instruction, thereby moving the smart spatula connected to the robot arm;
the moving smart spatula performing a cooking process consistent with the somatosensory action.
PCT/CN2016/100983 2016-09-30 2016-09-30 Cooking system and method WO2018058491A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201680001087.1A 2016-09-30 2016-09-30 Cooking system and method
PCT/CN2016/100983 WO2018058491A1 (zh) 2016-09-30 2016-09-30 Cooking system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/100983 WO2018058491A1 (zh) 2016-09-30 2016-09-30 Cooking system and method

Publications (1)

Publication Number Publication Date
WO2018058491A1 (zh)

Family

ID=61763614

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/100983 WO2018058491A1 (zh) 2016-09-30 2016-09-30 Cooking system and method

Country Status (2)

Country Link
CN (1) CN108348084A (zh)
WO (1) WO2018058491A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113974440A (zh) * 2021-10-29 2022-01-28 添可智能科技有限公司 Spatula recognition method, pot-lid assembly, and intelligent cooking device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117215302B (zh) * 2023-07-24 2024-06-21 北京小米机器人技术有限公司 Intelligent device control method and apparatus, intelligent device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2330277Y (zh) * 1998-04-22 1999-07-28 王祥虎 Automatic Chinese-dish cooking machine
CN1409212A (zh) * 2001-09-29 2003-04-09 张晓林 Automatic cooking method and system
CN1489927A (zh) * 2003-09-19 2004-04-21 张一平 Automatic cooking method for Chinese dishes and an intelligent automatic cooking appliance
CN104739223A (zh) * 2015-04-08 2015-07-01 湖州职业技术学院 Intelligent cooking utensil
US20160235239A1 (en) * 2013-10-07 2016-08-18 Bhagirath Ghanshyambhai PATADIA Portable fully automatic cooking system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI537767B (zh) * 2013-10-04 2016-06-11 財團法人工業技術研究院 Multi-user guidance system with adjustable somatosensory range and method thereof
CN103604272B (zh) * 2013-11-13 2016-01-13 四川长虹电器股份有限公司 Smart refrigerator system and method for gesture-controlled call answering
CN103829775B (zh) * 2013-11-21 2016-09-07 广东顺德高迅电子股份有限公司 Electric rice cooker control board operated by human-proximity sensing
CN105204606A (zh) * 2014-06-11 2015-12-30 阿里巴巴集团控股有限公司 Method, apparatus, and system for controlling a browser with a somatosensory remote-control device
CN104914805A (zh) * 2014-11-03 2015-09-16 江苏华音信息科技有限公司 Controller device for controlling automatic cooking by cooking equipment
CN205018805U (zh) * 2015-09-25 2016-02-10 佛山市顺德区美的电热电器制造有限公司 Electric cooker
CN105125057B (zh) * 2015-10-10 2018-07-03 杭州好菜网络科技有限公司 Intelligent cooking utensil with mobile-client interaction function
CN105373037A (zh) * 2015-11-02 2016-03-02 渤海大学 Range-hood control system based on gesture recognition


Also Published As

Publication number Publication date
CN108348084A (zh) 2018-07-31

Similar Documents

Publication Publication Date Title
US10857675B2 (en) Control device, robot, and robot system
US10384348B2 (en) Robot apparatus, method for controlling the same, and computer program
US9407683B2 (en) Information processing device, table, display control method, program, portable terminal, and information processing system
US7843425B2 (en) Motion recognition system and method for controlling electronic devices
US9327396B2 (en) Tele-operation system and control method thereof
EP1914745A3 (en) Display apparatus, display system, and control method thereof
JP2010079771A (ja) Input device
US9501810B2 (en) Creating a virtual environment for touchless interaction
WO2018058491A1 (zh) Cooking system and method
GB2593381A (en) Haptic feedback generation
TW201944185A (zh) Robot and control method thereof
US12042303B2 (en) System for facilitating speech-based communication for individuals unable to speak or write
JP2018123639A5 (zh)
WO2017185502A1 (zh) Terminal, and method and apparatus for implementing haptic feedback on a terminal
WO2017185512A1 (zh) Pressure-sensing-based control method and apparatus
US20240094888A1 (en) Method and apparatus for controlling devices
JP2013069341A (ja) Input method
JP2023181513A (ja) Teaching system, teaching method, and teaching program
KR20120042014A (ko) Remote terminal and robot teaching method using the same
US11698578B2 (en) Information processing apparatus, information processing method, and recording medium
KR101378305B1 (ko) Method and system for generating a robot map
TW201419051A (zh) Computer remote-control system and method
CN206906815U (zh) Massager control system using VR technology
CN111475019A (zh) Virtual-reality gesture interaction system and method
JP2015133637A (ja) Information processing device and operation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16917243

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16917243

Country of ref document: EP

Kind code of ref document: A1