WO2008001777A1 - Robot control system, robot control method, and robot - Google Patents

Robot control system, robot control method, and robot

Info

Publication number
WO2008001777A1
WO2008001777A1 PCT/JP2007/062811
Authority
WO
WIPO (PCT)
Prior art keywords
task
robot
control unit
control
control units
Prior art date
Application number
PCT/JP2007/062811
Other languages
English (en)
Japanese (ja)
Inventor
Yuichiro Nakajima
Original Assignee
Toyota Jidosha Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Jidosha Kabushiki Kaisha filed Critical Toyota Jidosha Kabushiki Kaisha
Publication of WO2008001777A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/06Programme-controlled manipulators characterised by multi-articulated arms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture

Definitions

  • Robot control system, robot control method, and robot
  • The present invention relates to a robot control system, a robot control method, and a robot, and more particularly to a robot control technique based on tasks.
  • Robots for various applications have been developed.
  • A robot generally controls the operation of each part of its multi-degree-of-freedom mechanism by means of a control unit implemented on a computer.
  • Patent Document 1 discloses a technique for performing distributed control of a robot.
  • The technology disclosed in Patent Document 1 gives each of a plurality of microrobots its own task command so that each performs its own mission.
  • However, distributed control is not applied to the processing of the individual parts within each robot; therefore, even the robot disclosed in Patent Document 1 is subject to the problems described above.
  • Patent Document 1 Japanese Patent Laid-Open No. 11-231910
  • The present invention has been made to solve the problems described above.
  • It is an object of the present invention to provide a robot control system, a robot control method, and a robot in which the model is easy to describe, the amount of calculation is small, and changes to tasks or to individual parts are easy to accommodate. Means for Solving the Problem
  • A robot control system according to the present invention includes a plurality of control units, each provided for a corresponding function of the robot and capable of controlling that function, and a task control unit that converts a requested main operation task into subdivided tasks corresponding to the plurality of control units and causes the plurality of control units to execute the subdivided tasks.
  • Preferably, the task control unit converts the requested main operation task into subdivided tasks set for each of a plurality of steps to be processed in time series, and causes the plurality of control units to execute the subdivided tasks step by step.
  • Preferably, the plurality of control units autonomously execute operations in response to the subdivided tasks.
  • The correspondence relationship between the functions and the control units may be changeable.
  • Preferably, the robot control system described above is mounted on a robot.
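The changeable correspondence between functions and control units described above can be pictured as a small registry that can be re-bound at runtime. The following is a minimal illustrative sketch, not code from the patent; every identifier is an assumption:

```python
# Hypothetical sketch: the function-to-control-unit correspondence is held
# in a registry and can be changed without touching the rest of the system.
class ControlUnitRegistry:
    def __init__(self):
        self._units = {}

    def bind(self, function_name, control_unit):
        """Assign (or re-assign) the control unit for a robot function."""
        self._units[function_name] = control_unit

    def dispatch(self, function_name, subdivided_task):
        """Send a subdivided task to whichever unit currently owns the function."""
        return self._units[function_name](subdivided_task)

registry = ControlUnitRegistry()
registry.bind("left_hand", lambda task: f"hand-unit:{task}")
assert registry.dispatch("left_hand", "grasp") == "hand-unit:grasp"

# Re-binding: treat the whole left hand-arm system as one part instead.
registry.bind("left_hand", lambda task: f"left-arm-system:{task}")
assert registry.dispatch("left_hand", "grasp") == "left-arm-system:grasp"
```

Because callers go through `dispatch`, regrouping parts under a different control unit requires only a new `bind`, which is one way to read the "changeable correspondence" above.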
  • A robot control method according to the present invention is a method for controlling a robot equipped with a plurality of control units, each provided for a corresponding function of the robot and capable of controlling that function. The method includes a step of converting a requested main operation task into subdivided tasks corresponding to the plurality of control units, and a step of causing the plurality of control units to execute the converted subdivided tasks.
  • According to the present invention, it is possible to provide a robot control system, a robot control method, and a robot in which the model is easy to describe, the amount of calculation is small, and changes to tasks or to individual parts are easy to accommodate.
  • FIG. 1 is an explanatory diagram for explaining a processing concept of a robot control system according to the present invention.
  • FIG. 2 is a block diagram showing the configuration of a robot control system according to the present invention.
  • FIG. 3A is a flowchart showing processing (up to S202) of the robot control system according to the present invention.
  • FIG. 3B is a flowchart showing processing (S203 and subsequent steps) of the robot control system according to the present invention.
  • FIG. 4A is a detailed block diagram of the head control unit of the robot control system according to the present invention.
  • FIG. 4B is a block diagram illustrating FIG. 4A in more detail.
  • The robot 1 in the example shown in FIG. 1 includes at least a head having an environment recognition unit and two arms, each with a hand for holding an object.
  • The robot 1 is instructed to perform the task of holding the object 2 with both hands and transporting it to a predetermined point (hereinafter, the "main operation task").
  • In the robot 1, the main operation task for realizing transportation is classified and set in advance as an ordered sequence of steps: viewing → grasping → carrying → placing.
  • In each step, a subdivided task is set for each part (function) of the robot 1.
  • For example, when the robot 1 executes the carrying step, the head is given the subdivided task of looking at the object 2, the right hand the subdivided task of grasping the object 2, the right arm the subdivided task of extending toward the target position, the left hand the subdivided task of grasping the object 2, and the left arm the subdivided task of following the external force.
  • Each function (part) autonomously executes an operation based on its subdivided task. Although each part operates autonomously on only the subdivided task assigned to it, the robot as a whole thereby realizes the top-level main operation task of "transport".
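The decomposition just described can be expressed as plain data: each step of the main operation task assigns every part its subdivided task, and the steps are processed in time series. A minimal sketch, assuming hypothetical task names (none of these identifiers appear in the patent):

```python
# Illustrative sketch of the "transport" task decomposition (all names assumed).
# Each step maps every part (function) of the robot to its subdivided task.
TRANSPORT_STEPS = [
    ("view",  {"head": "search_object",  "right_arm": "hold_posture",
               "left_arm": "hold_posture", "right_hand": "release",
               "left_hand": "release"}),
    ("grasp", {"head": "look_at_object", "right_arm": "extend_to_target",
               "left_arm": "extend_to_target", "right_hand": "grasp",
               "left_hand": "grasp"}),
    ("carry", {"head": "look_at_object", "right_arm": "extend_to_target",
               "left_arm": "follow_external_force", "right_hand": "grasp",
               "left_hand": "grasp"}),
    ("place", {"head": "look_at_target", "right_arm": "hold_posture",
               "left_arm": "follow_external_force", "right_hand": "release",
               "left_hand": "release"}),
]

def run_main_task(steps, execute):
    """Process steps in time series; within each step, every part is
    handed its subdivided task and executes it autonomously."""
    for step_name, subdivided_tasks in steps:
        for part, task in subdivided_tasks.items():
            execute(step_name, part, task)

# Example: record which subdivided task each part receives in each step.
log = []
run_main_task(TRANSPORT_STEPS, lambda s, p, t: log.append((s, p, t)))
```

The point of the table is that the "whole-body" behavior lives entirely in the data; the executor never reasons about "transport" as such, mirroring the autonomy described above.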
  • The left hand and the left arm may be given separate subdivided tasks as separate parts, or the entire left hand-arm system may be treated as one part and given a single subdivided task.
  • In the robot control system, detection signals from the sensors 10 provided at each part of the robot are input to the control unit 20.
  • The control unit 20 controls the actuators 30 according to the detection signals from the sensors 10 and the operation signals from the operator.
  • The sensors 10 include, for example, an image sensor (camera) for detecting the position and posture of an object, a motion sensor and a tactile sensor for detecting external force, an encoder and a tachometer for detecting joint angles, and a microphone for detecting sound.
  • The sensors 10 are connected to the control unit 20 by wire or wirelessly.
  • The control unit 20 includes a first task control unit 21, a second task control unit 22, a head control unit 23, a right arm control unit 24, a left arm control unit 25, a right hand control unit 26, and a left hand control unit 27.
  • How functions and parts are grouped into units, each provided with its own control unit, can be determined and changed as appropriate according to the required performance.
  • The example shown in Fig. 2 assumes a robot with only an upper body, as in Fig. 1; a robot that has legs and can walk autonomously would additionally have a right leg control unit and a left leg control unit.
  • The control unit 20 includes, as its hardware configuration, a CPU (Central Processing Unit) and memories such as a ROM (Read Only Memory) and a RAM (Random Access Memory). The control units 21 to 27 may all be implemented on a single CPU, each on its own CPU, or on one CPU per arbitrary group of control units.
  • The first task control unit 21 recognizes and processes the main operation task requested by an instruction from an operator or by the robot's own decision. For example, "transport a specific object from one place to another" may be requested as a main operation task. To enable such processing, a large number of main operation tasks are set in advance in the first task control unit 21 and stored in the storage unit.
  • The second task control unit 22 breaks the main operation task recognized by the first task control unit 21 down into a procedure for realizing it, and converts it into subdivided tasks for each step constituting the procedure. For example, the main operation task "transport a specific object from one place to another" is broken down into a viewing step, a grasping step, a carrying step, and a placing step. At each step, the head control unit 23, the right arm control unit 24, the left arm control unit 25, the right hand control unit 26, and the left hand control unit 27 are each made to execute a subdivided task, i.e. a command at the level of a unit operation. To enable such processing, the second task control unit 22 stores in its storage unit the association between each main operation task and the subdivided tasks for each of its steps.
  • The head control unit 23, the right arm control unit 24, the left arm control unit 25, the right hand control unit 26, and the left hand control unit 27 are provided corresponding to the respective functions (parts) of the robot 1 and can control the amount of movement of the corresponding function (part) in units of motions.
  • The head control unit 23 controls the operation of the head of the robot 1 and of a camera provided on the head.
  • The right arm control unit 24 mainly controls the movement of the right arm joints.
  • The left arm control unit 25 mainly controls the movement of the left arm joints.
  • The right hand control unit 26 mainly controls the movement of the right hand's finger joints.
  • The left hand control unit 27 mainly controls the movement of the left hand's finger joints.
  • The actuators 30 are motors and the like that are provided at each part of the robot and drive the arms, joints, and so on.
  • The actuators 30 are connected to the control unit 20 by wire or wirelessly.
  • When an instruction to transport the object is given, the first task control unit 21 recognizes that this instruction is a "transport" task and sets it (S100).
  • The second task control unit 22 acquires from the first task control unit 21 the information that the task is "transport", and determines, from data associated in advance, that in the "transport" task the subdivided tasks should be executed in time series in the order of the viewing step (S101), the grasping step (S102), the carrying step (S103), and the placing step (S104).
  • In the viewing step, the head control unit 23 executes a search task, the right arm control unit 24 a posture maintenance task, the left arm control unit 25 a posture maintenance task, and the right hand control unit 26 and the left hand control unit 27 release tasks.
  • Each of the control units 23 to 27 controls its function (part) of the robot 1 based on the detection signals input from the sensors 10.
  • Each of the control units 23 to 27 operates independently of (autonomously from) the other control units until it reaches the target value determined by the subdivided task assigned to it.
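The autonomy described here, where each control unit drives its part toward the target value set by its subdivided task without coordinating with the other units, can be sketched as one feedback loop per unit. The gain, tolerance, and names below are assumptions for illustration, not values from the patent:

```python
# Illustrative sketch: each control unit runs its own loop toward the
# target value set by its subdivided task, independent of other units.
def run_control_unit(current, target, gain=0.5, tolerance=1e-3, max_steps=1000):
    """Proportional update of one controlled quantity (e.g. a joint angle)
    until it is within `tolerance` of the target value."""
    for _ in range(max_steps):
        error = target - current
        if abs(error) < tolerance:
            return current
        current += gain * error  # command issued to the actuator
    return current

# Each unit converges on its own, with no coordination between them.
head_angle = run_control_unit(current=0.0, target=0.7)
arm_angle = run_control_unit(current=1.2, target=0.4)
assert abs(head_angle - 0.7) < 1e-3 and abs(arm_angle - 0.4) < 1e-3
```

Running the loops independently is what allows the parallel, distributed processing the document credits with shorter calculation time.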
  • The viewing step is divided into two sub-steps.
  • When the first sub-step is completed, the process proceeds to the subsequent sub-step (S202).
  • In the subsequent sub-step (S202), the head control unit 23 executes a seeing task, the right arm control unit 24 a posture maintenance task, the left arm control unit 25 a posture maintenance task, and the right hand control unit 26 and the left hand control unit 27 release tasks.
  • Then the grasping step (S203) is executed.
  • In the grasping step, the head control unit 23 performs a seeing task, the right arm control unit 24 and the left arm control unit 25 arm-extension tasks, and the right hand control unit 26 and the left hand control unit 27 their respective tasks.
  • The grasping step is divided into two sub-steps (S203, S204).
  • In the first sub-step (S203), when the robot 1 determines from the sensor 10 input that both of its hands have reached the gripping position of the object, the process proceeds to the subsequent sub-step (S204).
  • In the subsequent sub-step (S204), the head control unit 23 performs a seeing task, the right arm control unit 24 and the left arm control unit 25 posture maintenance tasks, and the right hand control unit 26 and the left hand control unit 27 their respective tasks.
  • Then a carrying step (S205) is executed.
  • In the carrying step, the head control unit 23 executes a seeing task, the right arm control unit 24 an arm-extension task, the left arm control unit 25 a force-following task, and the right hand control unit 26 and the left hand control unit 27 grasping tasks.
  • Thereafter, the subdivided tasks of the placing step (S104) are executed, and the "transport" task is completed.
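The flow above advances from one sub-step to the next when a completion condition judged from sensor input becomes true (for example, both hands reaching the gripping position). A hypothetical sketch of that sequencing logic follows; the predicates and names are invented for illustration:

```python
# Hypothetical sketch of time-series sequencing: each sub-step runs until
# its completion condition, judged from sensor input, becomes true.
def run_sequence(sub_steps, sensor_read, max_ticks=100):
    """sub_steps: list of (name, act, done), where act(sensors) drives the
    parts for one tick and done(sensors) reports step completion."""
    trace = []
    for name, act, done in sub_steps:
        for _ in range(max_ticks):
            sensors = sensor_read()
            act(sensors)
            if done(sensors):
                break
        trace.append(name)
    return trace

# Toy example: the hands "approach" the grip position over several ticks.
state = {"hand_gap": 5}

def close_hands(sensors):
    state["hand_gap"] = max(0, state["hand_gap"] - 1)

trace = run_sequence(
    [("grasp_approach", close_hands, lambda s: s["hand_gap"] == 0),
     ("grasp_close", lambda s: None, lambda s: True)],
    sensor_read=lambda: dict(state),
)
```

Only the completion predicates consult the sensors globally; within a tick, each part still acts on its own subdivided task.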
  • The head control unit 23 includes a processing block 231 for viewing and a processing block 232 for searching.
  • The sensors 10 connected to the head control unit 23 include, for example, an image sensor and an encoder.
  • A detection signal indicating the position of the object is input from the image sensor, and a detection signal indicating the joint angle is input from the encoder.
  • In the processing block 231 for viewing, processing for viewing is executed from the target position and the current angle; in the processing block 232 for searching, processing for searching is executed from the search target and the current angle; the result is output to the actuator 30.
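The two processing blocks of the head control unit, 231 for viewing and 232 for searching, each combine a target with the current angle and output a command to the actuator 30. The following sketch is one possible reading of that structure; the proportional form and all identifiers are assumptions, not the patent's implementation:

```python
# Illustrative sketch: the head control unit selects between a viewing
# block and a searching block depending on the active subdivided task.
class HeadControlUnit:
    def __init__(self):
        self.current_angle = 0.0

    def _view_block(self, target_position):
        # Processing block 231: track a known object position.
        return target_position - self.current_angle

    def _search_block(self, search_target_angle):
        # Processing block 232: sweep toward the next search angle.
        return search_target_angle - self.current_angle

    def step(self, task, sensor_value):
        """Pick the block for the active subdivided task and return the
        command that would be output to the actuator 30."""
        if task == "view":
            command = self._view_block(sensor_value)
        elif task == "search":
            command = self._search_block(sensor_value)
        else:
            raise ValueError(f"unknown subdivided task: {task}")
        self.current_angle += command  # the actuator moves the head
        return command

head = HeadControlUnit()
assert head.step("view", 0.5) == 0.5   # move toward the object
assert head.step("view", 0.5) == 0.0   # already on target
```

Keeping both blocks inside one control unit matches the figure's structure: the subdivided task only selects which block computes the actuator command.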
  • As described above, each control unit controls the robot in units of operations, so the description is easy and functions are easy to add and modify.
  • A change to one part does not affect the entire robot and can be carried out easily.
  • Since each function (part) autonomously operates on the command given to it as its task, cooperative operation can be realized simply by having every part operate in this manner.
  • Because the combination of parts is variable, the robot can be controlled as a single model assembled from the necessary parts, which expands the range of tasks that can be handled. Moreover, since a mechanism with many degrees of freedom is processed in parallel and in a distributed manner by dividing it into parts, the calculation time is shortened and the robot's reaction speed can be increased.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a robot control system, a robot control method, and a robot in which a model can be described easily, which require only a small amount of calculation, and which can easily cope with a change to a task or to a part of the robot. The robot control system comprises: control units (23 to 27) that are provided for the respective functions of the robot and can control the corresponding functions; and task control units (20, 21) that convert a requested main operation task into subdivided tasks corresponding to the control units (23 to 27) and cause the control units (23 to 27) to execute the subdivided tasks.
PCT/JP2007/062811 2006-06-27 2007-06-26 Robot control system, robot control method, and robot WO2008001777A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-177360 2006-06-27
JP2006177360A JP2008006518A (ja) 2006-06-27 2006-06-27 Robot control system, robot control method, and robot

Publications (1)

Publication Number Publication Date
WO2008001777A1 true WO2008001777A1 (fr) 2008-01-03

Family

ID=38845541

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/062811 WO2008001777A1 (fr) 2006-06-27 2007-06-26 Robot control system, robot control method, and robot

Country Status (2)

Country Link
JP (1) JP2008006518A (fr)
WO (1) WO2008001777A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230008603A1 (en) * 2009-03-31 2023-01-12 View, Inc. Counter electrode for electrochromic devices

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5929061B2 (ja) * 2011-09-15 2016-06-01 セイコーエプソン株式会社 Robot control device, robot system, and robot
JP6229324B2 (ja) 2013-06-14 2017-11-15 セイコーエプソン株式会社 Robot, robot control device, and robot control method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11156765A (ja) * 1997-11-30 1999-06-15 Sony Corp Robot apparatus
JP2005144624A (ja) * 2003-11-18 2005-06-09 Sony Corp Legged mobile robot

Also Published As

Publication number Publication date
JP2008006518A (ja) 2008-01-17

Similar Documents

Publication Publication Date Title
US20060195228A1 (en) Robot locus control method and apparatus and program of robot locus control method
JP7015068B2 (ja) Collision handling by a robot
CN106945043B (zh) Multi-arm cooperative control system for a master-slave teleoperated surgical robot
WO2019116891A1 (fr) Robot system and robot control method
US20150273689A1 (en) Robot control device, robot, robotic system, teaching method, and program
JP2006334774A (ja) Method for controlling the trajectory of an effector
JP2010069584A (ja) Manipulator control device and control method
JP6696465B2 (ja) Control system, controller, and control method
Liu et al. Function block-based multimodal control for symbiotic human–robot collaborative assembly
Marinho et al. Manipulator control based on the dual quaternion framework for intuitive teleoperation using kinect
US20220105625A1 (en) Device and method for controlling a robotic device
CN114516060A (zh) Device and method for controlling a robotic device
JP6322949B2 (ja) Robot control device, robot system, robot, robot control method, and robot control program
CN115351780A (zh) Method for controlling a robotic device
Si et al. Adaptive compliant skill learning for contact-rich manipulation with human in the loop
WO2008001777A1 (fr) Robot control system, robot control method, and robot
CN112894827A (zh) Robot arm motion control method, system, device, and readable storage medium
JP2016049607A (ja) Robot apparatus, robot apparatus control method, program, and recording medium
JP2015085499A (ja) Robot, robot system, control device, and control method
JP2021088042A (ja) Robot control device, gripping system, and robot hand control method
Top et al. How to increase crane control usability: An intuitive hmi for remotely operated cranes in industry and construction
WO2021250923A1 (fr) Robot system, control device, and control method
Bailey-Van Kuren Flexible robotic demanufacturing using real time tool path generation
JP2024517361A (ja) Object manipulation avoiding collisions using complementarity constraints
JP2006315128A (ja) Regrasping control method for a robot hand

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07767617

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 07767617

Country of ref document: EP

Kind code of ref document: A1