WO2017000785A1 - System and method for training a robot - Google Patents
- Publication number
- WO2017000785A1 (PCT/CN2016/085910)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- instruction
- unit
- condition data
- robot
- operation instruction
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
Definitions
- the present invention relates to the field of robots, and more particularly to a system and method for training a robot through an image interface.
- existing training methods for robots are limited to modifying the robot's program by writing code.
- such training is difficult: it requires the user to have professional programming skills, good code-management literacy, and a deep understanding of the underlying system in order to find problems. Because the process of writing code is complex, training logic errors are easy to introduce, so current robot training methods cannot meet the needs of users who do not know a programming language.
- a system for training robots, including:
- a collecting unit configured to collect condition data;
- an obtaining unit configured to acquire an operation instruction sent by the user, each operation instruction corresponding to a respective operation, where the operation instruction includes a collection instruction and/or an editing instruction;
- a storage unit for storing the operation instruction
- An operation unit configured to execute the operation corresponding to the editing instruction
- a processing unit respectively connected to the collecting unit, the obtaining unit, the storage unit and the operation unit, for storing the operation instruction in the storage unit, and for controlling the operation unit and/or the collecting unit to perform the operation corresponding to the operation instruction;
- a display unit is connected to the processing unit, and the display unit includes a preset template, the preset template includes an area for displaying the condition data and an area for displaying an execution image of the operation.
- condition data comprises: a light value, and/or a humidity value, and/or a temperature value, and/or pyroelectric data, and/or a PM2.5 value, and/or a carbon dioxide content, and/or time, and/or user attributes.
- the user attributes include a manager and/or a visitor.
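The condition data listed above can be sketched as a simple record. This is an illustrative assumption: the patent only enumerates the data types, so all field names and units below are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record for the "condition data" enumerated above.
# Field names and units are assumptions; the patent names only the data types.
@dataclass
class ConditionData:
    light: Optional[float] = None         # light value
    humidity: Optional[float] = None      # humidity value (%)
    temperature: Optional[float] = None   # temperature value (degrees C)
    pyroelectric: Optional[bool] = None   # pyroelectric data: user nearby?
    pm25: Optional[float] = None          # PM2.5 value
    co2: Optional[float] = None           # carbon dioxide content
    timestamp: Optional[float] = None     # time of acquisition
    user_attribute: Optional[str] = None  # "manager" or "visitor"

# A sample reading such as the collecting unit might report.
sample = ConditionData(temperature=20.0, user_attribute="manager")
```

Every field is optional because the patent joins the data types with "and/or": any subset may be present in a given reading.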
- the obtaining unit comprises:
- the voice recognition module is configured to receive a voice operation instruction sent by the user, and convert the voice operation instruction into a text operation instruction.
- the processing unit comprises:
- a receiving module configured to receive the condition data and the operation instruction
- An identification module is connected to the receiving module, configured to identify whether the operation instruction is the collection instruction, and output a recognition result
- If the operation instruction is the collection instruction, controlling an operation state of the collection unit according to the collection instruction;
- an extraction module connected to the identification module; when the operation instruction is not the collection instruction, the operation instruction is the editing instruction, and the extraction module is configured to extract the operation keywords and reference condition data in the editing instruction, where each group of operation keywords corresponds to one operation, and to output the operation.
- the processing unit further includes:
- a matching module respectively connected to the receiving module and the extraction module, for matching the condition data with the reference condition data to obtain a matching result;
- when the matching result is a match, the operation unit performs the operation corresponding to the editing instruction.
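The processing-unit modules above form a small pipeline: receive, identify whether the instruction is a collection instruction, otherwise extract keywords and reference condition data, then match against the live condition data. A minimal sketch under assumed data shapes (all function names and dictionary keys here are hypothetical, not from the patent):

```python
# Hedged sketch of the receive -> identify -> extract -> match pipeline.
# Instruction and condition-data shapes are assumptions for illustration.

def identify(instruction: dict) -> bool:
    """Identification module: is this a collection instruction?"""
    return instruction.get("type") == "collect"

def extract(editing_instruction: dict):
    """Extraction module: pull operation keywords and reference condition data."""
    return editing_instruction["keywords"], editing_instruction["reference"]

def match(condition: dict, reference: dict) -> bool:
    """Matching module: every reference (key, predicate) must hold on the data."""
    return all(pred(condition.get(k)) for k, pred in reference.items())

def process(instruction, condition, collector, operator):
    if identify(instruction):                  # collection instruction:
        collector(instruction["action"])       # control the collecting unit's state
        return None
    keywords, reference = extract(instruction) # otherwise an editing instruction
    if match(condition, reference):
        return operator(keywords)              # operation unit runs the operation
    return None

# An editing instruction whose reference condition is "temperature below 16".
result = process(
    {"type": "edit",
     "keywords": ["tts"],
     "reference": {"temperature": lambda t: t is not None and t < 16}},
    {"temperature": 12.0},
    collector=lambda action: None,
    operator=lambda kw: "run:" + kw[0],
)
```

With a 12 °C reading the reference condition matches, so the operation unit is asked to run the "tts" operation.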
- a method of training a robot for controlling the operation of the robot includes the following steps:
- the step S4 includes:
- if the operation instruction is the collection instruction, stop collecting the condition data or start collecting the condition data according to the collection instruction;
- if the operation instruction is not the collection instruction, the operation instruction is the editing instruction;
- the operation keywords and the reference condition data in the editing instruction are extracted, each group of operation keywords corresponds to one operation, and the operation is output.
- the step S4 further includes:
- the operation instruction sent by the user is acquired by the obtaining unit, the operation instruction is stored in the storage unit by the processing unit, and the operation unit and/or the collecting unit is controlled to perform the operation corresponding to the operation instruction;
- the condition data and the execution image of the operation are displayed through the display unit, so the user can train the robot in an intuitive manner; the training process is simple, and users without a programming-development background can participate in the robot's behavior training.
- the robot can be controlled to perform an operation corresponding to the operation instruction, and display the condition data and the execution image of the operation in a preset template, the operation process is simple, and the user can intuitively understand the training process.
- FIG. 1 is a block diagram of an embodiment of a system for training a robot according to the present invention
- FIG. 2 is a flow chart of a method of an embodiment of a method for training a robot according to the present invention
- FIG. 3 is a schematic diagram of an embodiment of the display unit in a normal state according to the present invention.
- FIG. 4 is a schematic diagram of an embodiment of the display unit in an edit state according to the present invention.
- a system for training a robot includes:
- a collecting unit 3 configured to collect condition data;
- an obtaining unit 2 configured to acquire an operation instruction sent by the user, where each operation instruction corresponds to a respective operation, and the operation instruction includes a collection instruction and/or an editing instruction;
- a storage unit 5 for storing an operation instruction
- An operation unit 1 configured to perform an operation corresponding to the editing instruction
- a processing unit 4 respectively connected to the collecting unit 3, the obtaining unit 2, the storage unit 5 and the operation unit 1, for storing the operation instruction in the storage unit 5, and for controlling the operation unit 1 and/or the collecting unit 3 to perform the operation corresponding to the operation instruction;
- a display unit 6 is connected to the processing unit 4.
- the display unit 6 includes a preset template including an area for displaying condition data and an area for displaying an execution image of the operation.
- the collection instruction includes a stop-collection instruction and a start-collection instruction; the collection instruction is used to control the working state of the collecting unit 3.
- in practical application, the system of this embodiment is embedded in the robot system, and the robot can be trained when the robot system is in an integrated development environment.
- the collecting unit 3 collects the condition data at a predetermined time interval, and the display unit 6 displays the collected condition data in real time in the condition-data display area. The user can input the stop-collection instruction, specifically by triggering entry into the edit mode (such as clicking a "coagulation" button), which puts the robot into a frozen state. In this state none of the robot's collected condition data is updated; it stays fixed at the values of the triggering moment. While frozen, the user edits in the operation-execution-image display area of the display unit 6, and the operation image can be associated with the operation action corresponding to the editing instruction.
- the user can then input the start-collection instruction: by triggering the end of the edit mode (clicking a "release" button), the editing action is uploaded to the robot (operation unit 1) and run, and the data in the robot resumes updating in real time.
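The "coagulation"/"release" editing mode described above can be sketched as a small state machine: freezing stops condition-data updates so the user can edit against a stable snapshot, and releasing uploads the edits and resumes real-time collection. Class and method names below are assumptions; the patent describes the behavior, not an API.

```python
# Hypothetical sketch of the frozen-edit workflow ("coagulation"/"release").

class TrainingSession:
    def __init__(self):
        self.frozen = False
        self.displayed = {}       # condition data shown on the display unit
        self.pending_edits = []   # edits made while frozen
        self.uploaded = []        # edits uploaded to the operation unit

    def update(self, condition: dict):
        """Real-time update from the collecting unit; ignored while frozen."""
        if not self.frozen:
            self.displayed = dict(condition)

    def coagulate(self):
        """'Coagulation' button: stop-collection instruction, freeze the data."""
        self.frozen = True

    def edit(self, rule: str):
        """Editing happens in the frozen state, per the description above."""
        assert self.frozen, "editing requires the frozen state"
        self.pending_edits.append(rule)

    def release(self):
        """'Release' button: upload the edits to the robot and resume updates."""
        self.uploaded.extend(self.pending_edits)
        self.pending_edits = []
        self.frozen = False

s = TrainingSession()
s.update({"temperature": 20.0})
s.coagulate()
s.update({"temperature": 25.0})   # ignored: data is frozen at 20.0
s.edit("if temperature < 16: remind user to turn on the air conditioner")
s.release()
```

After `release()`, the edit is in `s.uploaded` and the session accepts live updates again, mirroring the upload-and-run step in the text.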
- the operation instruction sent by the user is acquired by the obtaining unit 2, the operation instruction is stored in the storage unit 5 by the processing unit 4, and the operation unit 1 and/or the collecting unit 3 are controlled to perform the operation corresponding to the operation instruction.
- the display unit 6 displays the condition data and the execution image of the operation, so the user can train the robot in an intuitive manner; the training process is simple, and users without a programming-development background can participate in the robot's behavioral training.
- condition data includes: light values, and/or humidity values, and/or temperature values, and/or pyroelectric data, and/or PM2.5 values, and/or carbon dioxide (CO2) content, and/or time, and/or user attributes.
- the collecting unit 3 may include: a light sensor for collecting light values; a humidity sensor for collecting humidity values; a pyroelectric infrared sensor for detecting whether there is a user near the robot; a PM2.5 collecting module for collecting PM2.5 values; a carbon dioxide sensor for collecting CO2 content; and an identity verification module for identifying user attributes.
- the user attributes include a manager and/or a visitor.
- the identity verification module can be used to identify whether the user is a manager or a visitor. A manager can train the robot directly; a visitor acquires permissions from his or her relationship with the manager, and the corresponding operations are performed on the robot according to those permissions.
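The manager/visitor authority check above can be illustrated with a small permission table. The specific permission sets are assumptions: the patent only says that a visitor's rights derive from what the manager grants.

```python
# Hypothetical permission model for the manager/visitor check described above.
# Which actions each attribute may perform is an assumption for illustration.
PERMISSIONS = {
    "manager": {"train", "edit", "operate"},  # managers train the robot directly
    "visitor": {"operate"},                   # visitors act within granted rights
}

def allowed(user_attribute: str, action: str) -> bool:
    """Return whether the identified user attribute permits the action."""
    return action in PERMISSIONS.get(user_attribute, set())

manager_can_train = allowed("manager", "train")
visitor_can_train = allowed("visitor", "train")
```

An unrecognized attribute falls through to the empty set, so unauthenticated users are denied by default.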
- the obtaining unit 2 may include:
- the voice recognition module 42 is configured to receive a voice operation instruction sent by the user, and convert the voice operation instruction into a text operation instruction.
- a natural language processing module of artificial intelligence may be further included for converting the text operation instruction into digital data for the robot to recognize.
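The obtaining unit's path above, from recognized text to data the robot can act on, can be sketched with a simple keyword parser. The keyword table and parsing rule are assumptions for illustration; the patent does not specify how the natural-language module encodes instructions.

```python
# Hypothetical sketch: a text operation instruction (from the voice
# recognition module) is parsed into a structured instruction.
# The keyword vocabulary below is invented for illustration.
KNOWN_OPERATIONS = {"call", "tts", "led_on", "led_off", "search"}

def parse_text_instruction(text: str) -> dict:
    """Convert a text operation instruction into structured data."""
    tokens = text.lower().split()
    keywords = [t for t in tokens if t in KNOWN_OPERATIONS]
    return {"type": "edit", "keywords": keywords, "raw": text}

parsed = parse_text_instruction("tts temperature warning")
```

Only tokens found in the known-operation vocabulary survive as operation keywords; the rest of the utterance is kept as raw text.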
- the processing unit 4 can include:
- a receiving module 41 configured to receive condition data and an operation instruction
- An identification module 42 is connected to the receiving module 41 for identifying whether the operation instruction is an acquisition instruction, and outputting the recognition result;
- If the operation instruction is an acquisition instruction, controlling the working state of the collection unit 3 according to the acquisition instruction;
- an extraction module 44 is connected to the identification module 42;
- when the operation instruction is not an acquisition instruction, the operation instruction is an editing instruction;
- the extraction module 44 is configured to extract the operation keywords and reference condition data in the editing instruction; each group of operation keywords corresponds to one operation, and the corresponding operation is output.
- the operation may be to turn on the phone function to call a contact pre-stored in the robot, or to activate the prompt-tone function, or to control an LED light on the robot to be turned on/off, or to enable the video player function, or the music player function, or to open the search engine function to search for keywords the user needs, or to turn on the TTS (Text To Speech) voice module function to convert text into speech.
- the processing unit 4 further includes:
- a matching module 43 is respectively connected to the receiving module 41 and the extracting module 44 for matching the condition data with the reference condition data to obtain a matching result;
- when the matching result is a match, the operation unit 1 performs the operation corresponding to the editing instruction.
- the operation keywords may be pre-stored in the storage unit 5; when the matching result is a match and the operation keyword in the received editing instruction is the same as a pre-stored operation keyword, the robot is controlled to perform the corresponding operation.
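The double check described above, condition match plus a keyword that equals one pre-stored in the storage unit, can be sketched as follows. The stored keyword table and return values are assumptions for illustration.

```python
# Hypothetical sketch: the operation runs only when the condition data
# matched AND the editing instruction's keyword is pre-stored.
STORED_KEYWORDS = {           # stand-in for keywords in the storage unit 5
    "tts": "speak",
    "led_on": "turn LED on",
}

def execute_if_matched(matched: bool, keyword: str):
    """Perform the stored operation only on a match with a known keyword."""
    if matched and keyword in STORED_KEYWORDS:
        return STORED_KEYWORDS[keyword]   # the robot performs this operation
    return None                           # otherwise nothing happens

out = execute_if_matched(True, "tts")
```

A matched condition with an unknown keyword, or a known keyword without a condition match, both fall through to `None`, so only explicitly trained behavior executes.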
- a method for training a robot for controlling the operation of the robot includes the following steps:
- the control robot performs an operation corresponding to the operation instruction, and displays the condition data and the execution image of the operation in a preset template.
- the acquisition instruction includes a stop-acquisition instruction and a start-acquisition instruction; the acquisition instruction is used to control whether the robot performs condition-data collection.
- the robot can be controlled to perform an operation corresponding to the operation instruction, and display the condition data and the execution image of the operation in a preset template, the operation process is simple, and the user can intuitively understand the training process.
- step S4 may include:
- if the operation instruction is an acquisition instruction, stop collecting the condition data or start collecting the condition data according to the acquisition instruction;
- if the operation instruction is not an acquisition instruction, the operation instruction is an editing instruction;
- the operation keywords and the reference condition data in the editing instruction are extracted, each group of operation keywords corresponds to one operation, and the operation is output.
- the operation may be to turn on the phone function to call a contact pre-stored in the robot, or to activate the prompt-tone function, or to control an LED light on the robot to be turned on/off, or to enable the video player function, or the music player function, or to open the search engine function to search for keywords the user needs, or to turn on the TTS (Text To Speech) voice module function to convert text into speech.
- step S4 further includes:
- when the matching result is a match, the robot is controlled to perform the operation corresponding to the editing instruction.
- in a specific example, the robot system is in an integrated development environment, and the robot's real-time condition state is displayed in the condition-data area of the preset template. If the temperature is 20 °C at this moment, clicking the "coagulation" button (the stop-acquisition instruction) freezes the real-time condition state, and 20 °C is displayed in the "Temperature" column (the condition-data display area of the preset template).
- the temperature range can then be manually set to 10 °C to 16 °C; the TTS voice module is activated in the operation display area of the preset template and the text "temperature below 16 degrees Celsius, please turn on the air conditioner" is entered (the editing instruction); clicking the "release" button (the start-acquisition instruction) uploads the automatically generated code to the robot, which updates and runs it.
- when the temperature collected by the robot is lower than 16 °C, it will automatically remind the user to turn on the air conditioner.
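The worked example above can be reduced to a single trained rule: a reference temperature range of 10 °C to 16 °C bound to a TTS reminder. The function names are assumptions; the patent describes the behavior, not the generated code.

```python
# Hypothetical sketch of the rule produced by the temperature example:
# a reference range plus the TTS text to speak when a reading falls inside it.

def make_rule(low: float, high: float, text: str):
    """Build a trained rule: speak `text` when the temperature is in range."""
    def rule(temperature: float):
        return text if low <= temperature <= high else None
    return rule

reminder = "temperature below 16 degrees Celsius, please turn on the air conditioner"
rule = make_rule(10.0, 16.0, reminder)

spoken = rule(12.5)   # a reading inside the 10-16 degC range triggers the TTS text
silent = rule(20.0)   # the original frozen 20 degC reading would not trigger it
```

After "release", a rule like this runs against each new reading from the collecting unit, which is why the reminder fires only once the measured temperature drops below 16 °C.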
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Electrically Operated Instructional Devices (AREA)
- Toys (AREA)
Abstract
The present invention relates to a system for training a robot, comprising: a collecting unit (3) for collecting condition data; an obtaining unit (2) for acquiring an operation instruction sent by a user; a storage unit (5) for storing the operation instruction; an operation unit (1) for performing an operation corresponding to an editing instruction; a processing unit (4) respectively connected to the collecting unit (3), the obtaining unit (2), the storage unit (5) and the operation unit (1), for storing the operation instruction in the storage unit (5) and controlling the operation unit (1) and/or the collecting unit (3) to perform an operation corresponding to the operation instruction; and a display unit (6) connected to the processing unit (4) and comprising a preset template. The system has a simple structure and is easy for the user to observe. A method for training the robot is also provided.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510385251.0A CN106313113B (zh) | 2015-06-30 | 2015-06-30 | 一种对机器人进行训练的系统及方法 |
CN201510385251.0 | 2015-06-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017000785A1 true WO2017000785A1 (fr) | 2017-01-05 |
Family
ID=57607861
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/085910 WO2017000785A1 (fr) | 2015-06-30 | 2016-06-15 | Système et procédé pour l'apprentissage de robot |
Country Status (4)
Country | Link |
---|---|
CN (1) | CN106313113B (fr) |
HK (1) | HK1231439A1 (fr) |
TW (1) | TWI594857B (fr) |
WO (1) | WO2017000785A1 (fr) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109101107A (zh) * | 2018-06-29 | 2018-12-28 | 温州大学 | 一种vr虚拟课堂对虚拟机器人训练的系统及方法 |
CN110026983B (zh) * | 2019-04-30 | 2020-06-23 | 南京云图机器人科技有限公司 | 一种机器人编程系统 |
CN110334140B (zh) * | 2019-05-24 | 2022-04-08 | 深圳绿米联创科技有限公司 | 处理设备上报数据的方法、装置以及服务器 |
CN111152228B (zh) * | 2020-01-22 | 2021-07-09 | 深圳国信泰富科技有限公司 | 一种机器人动作自规划系统 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1970247A (zh) * | 2006-12-01 | 2007-05-30 | 南开大学 | 嵌入式移动机器人核心控制器 |
US20070191969A1 (en) * | 2006-02-13 | 2007-08-16 | Jianying Shi | Automated state change notification |
US20070282858A1 (en) * | 2006-06-01 | 2007-12-06 | Michael Arner | System and method for playing rich internet applications in remote computing devices |
CN202079595U (zh) * | 2010-01-08 | 2011-12-21 | 哈尔滨理工大学 | 新型遥操作远端机器人控制平台 |
CN104102346A (zh) * | 2014-07-01 | 2014-10-15 | 华中科技大学 | 一种家用信息采集和用户情感识别设备及其工作方法 |
CN104688491A (zh) * | 2013-12-04 | 2015-06-10 | 中国科学院宁波材料技术与工程研究所 | 训练机器人及控制方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1283428C (zh) * | 2000-03-31 | 2006-11-08 | 索尼公司 | 机器人设备、控制机器人设备动作的方法 |
CN100562844C (zh) * | 2005-12-13 | 2009-11-25 | 台达电子工业股份有限公司 | 使用者自定声控功能快捷方式的模块及其方法 |
FR2963132A1 (fr) * | 2010-07-23 | 2012-01-27 | Aldebaran Robotics | Robot humanoide dote d'une interface de dialogue naturel, methode d'utilisation et de programmation de ladite interface |
CN201940040U (zh) * | 2010-09-27 | 2011-08-24 | 深圳市杰思谷科技有限公司 | 家用机器人 |
DE102011005985B4 (de) * | 2011-03-23 | 2019-01-24 | Kuka Roboter Gmbh | Roboter, Steuervorrictung für einen Roboter und Verfahren zum Betreiben eines Roboters |
KR101945185B1 (ko) * | 2012-01-12 | 2019-02-07 | 삼성전자주식회사 | 로봇 및 이상 상황 판단과 대응방법 |
CN103065629A (zh) * | 2012-11-20 | 2013-04-24 | 广东工业大学 | 一种仿人机器人的语音识别系统 |
CN104635651A (zh) * | 2013-11-13 | 2015-05-20 | 沈阳新松机器人自动化股份有限公司 | 一种多功能编程示教盒 |
CN103631221B (zh) * | 2013-11-20 | 2016-05-04 | 华南理工大学广州学院 | 一种遥操作服务机器人系统 |
CN104057458B (zh) * | 2014-06-16 | 2015-12-02 | 浙江大学 | 一种基于体感和触摸的多轴机械臂直观控制系统及方法 |
-
2015
- 2015-06-30 CN CN201510385251.0A patent/CN106313113B/zh active Active
-
2016
- 2016-06-15 WO PCT/CN2016/085910 patent/WO2017000785A1/fr active Application Filing
- 2016-06-29 TW TW105120436A patent/TWI594857B/zh not_active IP Right Cessation
-
2017
- 2017-05-19 HK HK17105089.2A patent/HK1231439A1/zh not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
HK1231439A1 (zh) | 2017-12-22 |
CN106313113B (zh) | 2019-06-07 |
CN106313113A (zh) | 2017-01-11 |
TW201700237A (zh) | 2017-01-01 |
TWI594857B (zh) | 2017-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102041063B1 (ko) | 정보 처리 장치, 정보 처리 방법 및 프로그램 | |
TWI594857B (zh) | 一種對機器人進行訓練的系統及方法 | |
US10146318B2 (en) | Techniques for using gesture recognition to effectuate character selection | |
US10909982B2 (en) | Electronic apparatus for processing user utterance and controlling method thereof | |
US10825453B2 (en) | Electronic device for providing speech recognition service and method thereof | |
RU2653283C2 (ru) | Способ диалога между машиной, такой как гуманоидный робот, и собеседником-человеком, компьютерный программный продукт и гуманоидный робот для осуществления такого способа | |
WO2017059815A1 (fr) | Procédé d'identification rapide et robot domestique intelligent | |
KR20190006403A (ko) | 음성 처리 방법 및 이를 지원하는 시스템 | |
US11120792B2 (en) | System for processing user utterance and controlling method thereof | |
KR102304701B1 (ko) | 사용자의 음성 입력에 대한 답변을 제공하는 방법 및 장치 | |
EP3396664B1 (fr) | Dispositif électronique pour fournir un service de reconnaissance vocale et procédé associé | |
TWI665658B (zh) | 智慧型機器人 | |
CN111197841A (zh) | 控制方法、装置、遥控终端、空调器、服务器及存储介质 | |
US20220237915A1 (en) | Electronic apparatus and controlling method thereof | |
CN112233665A (zh) | 模型训练的方法和装置、电子设备和存储介质 | |
US10067738B2 (en) | Device control based on its operational context | |
WO2017143951A1 (fr) | Procédé de rétroaction d'expression et robot intelligent | |
US20210272564A1 (en) | Voice processing device, voice processing method, and recording medium | |
CN110853430A (zh) | 基于智能家居的学习辅导方法、设备及存储介质 | |
WO2016045468A1 (fr) | Procédé et appareil de commande d'entrée vocale et terminal associé | |
KR102380717B1 (ko) | 사용자 발화를 처리하는 전자 장치 및 이 전자 장치의 제어 방법 | |
CN212461143U (zh) | 一种基于数据可视化展示技术的语音识别交互系统 | |
US10621888B2 (en) | Mobile device with local video files for location agnostic video playback | |
US11017772B1 (en) | Natural language programming | |
US20190267002A1 (en) | Intelligent system for creating and editing work instructions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16817149 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16817149 Country of ref document: EP Kind code of ref document: A1 |