WO2008001777A1 - Robot control system, robot control method, and robot - Google Patents


Info

Publication number
WO2008001777A1
WO2008001777A1 (PCT/JP2007/062811)
Authority
WO
WIPO (PCT)
Prior art keywords
task
robot
control unit
control
control units
Prior art date
Application number
PCT/JP2007/062811
Other languages
French (fr)
Japanese (ja)
Inventor
Yuichiro Nakajima
Original Assignee
Toyota Jidosha Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Jidosha Kabushiki Kaisha
Publication of WO2008001777A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/06 Programme-controlled manipulators characterised by multi-articulated arms
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture

Definitions

  • Robot control system, robot control method, and robot
  • The present invention relates to a robot control system, a robot control method, and a robot, and more particularly to a robot control technique according to a task.
  • Robots for various applications have been developed.
  • A robot typically has a multi-degree-of-freedom mechanism, and the operation of each part constituting the robot is centrally controlled by a control unit implemented on a computer.
  • Patent Document 1, meanwhile, discloses a technique for performing distributed control of robots.
  • The technology disclosed in Patent Document 1, however, relates to a control technique in which each of a plurality of microrobots is given its own task command and carries out its own mission.
  • Distributed control is not applied to the processing of the individual parts of a single robot, so the above-described problems can arise even in the robot disclosed in Patent Document 1.
  • Patent Document 1: Japanese Patent Laid-Open No. 11-231910
  • The present invention has been made to solve such problems.
  • It is an object of the present invention to provide a robot control system, a robot control method, and a robot in which the model is easy to describe, the amount of calculation is small, and changes to the task or to some parts are easy to accommodate. Means for solving the problems
  • A robot control system according to the present invention includes a plurality of control units, each provided for a corresponding function of the robot and capable of controlling that function, and a task control unit that converts a requested main operation task into subdivided tasks corresponding to the plurality of control units and causes the plurality of control units to execute the subdivided tasks.
  • Preferably, the task control unit converts the requested main operation task into subdivided tasks set for each of a plurality of steps to be processed in time series, and causes the plurality of control units to execute the subdivided tasks step by step.
  • It is also desirable that the plurality of control units autonomously execute operations in response to their subdivided tasks.
  • Furthermore, the correspondence between functions and control units may be changeable.
  • In a preferred embodiment, the robot control system described above is mounted on a robot.
  • A robot control method according to the present invention is a method for controlling a robot equipped with a plurality of control units, each provided for a corresponding function of the robot and capable of controlling that function. The method includes a step of converting the requested main operation task into subdivided tasks corresponding to the plurality of control units, and a step of causing the plurality of control units to execute the converted subdivided tasks.
  • According to the present invention, it is possible to provide a robot control system, a robot control method, and a robot in which the model is easy to describe, the amount of calculation is small, and changes to the task or to some parts are easy to accommodate.
  • FIG. 1 is an explanatory diagram for explaining the processing concept of a robot control system according to the present invention.
  • FIG. 2 is a block diagram showing the configuration of a robot control system according to the present invention.
  • FIG. 3A is a flowchart showing the processing (up to S202) of a robot control system according to the present invention.
  • FIG. 3B is a flowchart showing the processing (S203 and subsequent steps) of a robot control system according to the present invention.
  • FIG. 4A is a detailed block diagram of the head control unit of a robot control system according to the present invention.
  • FIG. 4B is a block diagram illustrating FIG. 4A in more detail.
  • The robot 1 in the example shown in FIG. 1 includes at least a head having an environment recognition unit and dual arms, each having a hand for grasping an object.
  • In this example, the robot 1 is instructed to execute the task of holding the object 2 with both hands and carrying it to a predetermined point (hereinafter, the "main operation task").
  • The main operation task for realizing transport is classified and set in advance into the sequence: viewing step → grasping step → carrying step → placing step.
  • In the example shown in FIG. 1, the robot 1 is executing the carrying step.
  • In the carrying step, each part (function) of the robot 1 is instructed to execute a subdivided task.
  • For example, the head is given the subdivided task of watching the object 2, the right hand the subdivided task of grasping the object 2, the right arm the subdivided task of extending to the target position, the left hand the subdivided task of grasping the object 2, and the left arm the subdivided task of following the external force.
  • Each function (part) autonomously executes its operation based on its subdivided task. Although locally each part acts autonomously on only the subdivided task assigned to it, as a whole body this realizes the top-level main operation task of "transport".
  • In the example shown in FIG. 1, the left hand and the left arm are treated as separate parts and given different subdivided tasks, but the left-hand system may instead be treated as one part and given a single subdivided task.
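The decomposition sketched in FIG. 1 can be written down as a small table. The following is an illustrative sketch only; the step names, part names, and task names are mine, chosen to mirror the example above, and are not identifiers from the patent:

```python
# Hypothetical decomposition of the "transport" main operation task:
# time-series steps, each assigning one subdivided task per part.
TRANSPORT = [
    ("view",  {"head": "search_object", "right_arm": "hold_posture",
               "left_arm": "hold_posture", "right_hand": "release",
               "left_hand": "release"}),
    ("grasp", {"head": "watch_object", "right_arm": "extend_to_target",
               "left_arm": "extend_to_target", "right_hand": "grasp",
               "left_hand": "grasp"}),
    ("carry", {"head": "watch_object", "right_arm": "extend_to_target",
               "left_arm": "follow_external_force", "right_hand": "grasp",
               "left_hand": "grasp"}),
    ("place", {"head": "watch_object", "right_arm": "lower_to_target",
               "left_arm": "follow_external_force", "right_hand": "release",
               "left_hand": "release"}),
]

def subdivided_tasks(step_index):
    """Return the step name and per-part commands for one step."""
    name, tasks = TRANSPORT[step_index]
    return name, tasks

name, tasks = subdivided_tasks(2)
print(name, tasks["left_arm"])  # → carry follow_external_force
```

Locally each entry is just one part's command; the coordination lives entirely in the table, which is the point of the scheme.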
  • The robot control system inputs the detection signals from the sensors 10 provided at each part of the robot to the control unit 20.
  • The control unit 20 controls the actuators 30 according to the detection signals from the sensors 10 and operation signals from the operator.
  • The sensors 10 include, for example, an image sensor (camera) for detecting the position and posture of an object, force sensors and tactile sensors for detecting external forces, encoders and potentiometers for detecting joint angles, and a microphone for detecting sound.
  • The sensors 10 are connected to the control unit 20 by wire or wirelessly.
  • The control unit 20 includes a first task control unit 21, a second task control unit 22, a head control unit 23, a right arm control unit 24, a left arm control unit 25, a right hand control unit 26, and a left hand control unit 27.
  • The granularity at which functions and parts are divided into units, each with its own control unit, can be determined and changed as appropriate according to the required performance.
  • The example shown in FIG. 2 assumes an upper-body-only robot as shown in FIG. 1, but a robot that has legs and can walk autonomously would additionally include a right leg control unit, a left leg control unit, and so on.
  • The control unit 20 includes, as its hardware configuration, a CPU (Central Processing Unit) and memories such as ROM and RAM. The control units 21 to 27 may all be implemented on a single CPU, may each be given their own CPU, or may be grouped arbitrarily with one CPU per group.
  • CPU: Central Processing Unit
  • ROM: Read-Only Memory
  • RAM: Random Access Memory
  • The first task control unit 21 has the function of recognizing a main operation task requested by an instruction from an operator or by the robot's own judgment. For example, "transport a specific object from one place to another" may be requested as a main operation task. To enable this processing, a large number of main operation tasks are set in advance and stored in the storage unit of the first task control unit 21.
  • The second task control unit 22 breaks the main operation task recognized by the first task control unit 21 down into a procedure for realizing it, and converts each step of that procedure into subdivided tasks. For example, the main operation task "transport a specific object from one place to another" is broken down into a viewing step, a grasping step, a carrying step, and a placing step. At each step, the head control unit 23, the right arm control unit 24, the left arm control unit 25, the right hand control unit 26, and the left hand control unit 27 are each made to execute a subdivided task, a command at the level of a unit operation. To enable this processing, each main operation task is described in association with the subdivided tasks of each of its steps and stored in the storage unit of the second task control unit 22.
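A hedged sketch of this two-stage pipeline follows; the class names, the stored task repertoire, and the substring matching rule are all assumptions of mine, not the patent's implementation:

```python
class FirstTaskControl:
    """Recognizes which stored main operation task was requested."""
    KNOWN_TASKS = {"transport", "handover"}   # assumed repertoire

    def recognize(self, request):
        for task in self.KNOWN_TASKS:
            if task in request:
                return task
        raise ValueError("unknown main operation task: " + request)

class SecondTaskControl:
    """Looks up the pre-associated step list for a recognized main task."""
    PROCEDURES = {
        "transport": ["view", "grasp", "carry", "place"],
    }

    def decompose(self, main_task):
        return self.PROCEDURES[main_task]

main = FirstTaskControl().recognize("transport the cup to the table")
print(main, SecondTaskControl().decompose(main))
# → transport ['view', 'grasp', 'carry', 'place']
```

The split matters: the first unit only names the task, while all knowledge of how to realize it stays in the second unit's stored associations, so adding a main task means adding one table entry.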
  • The head control unit 23, the right arm control unit 24, the left arm control unit 25, the right hand control unit 26, and the left hand control unit 27 are provided corresponding to the respective functions (parts) of the robot 1 and can control the corresponding function (part) in units of operations.
  • The head control unit 23 controls the operation of the head of the robot 1 and the camera provided on the head.
  • The right arm control unit 24 mainly controls the motion of the right arm joints.
  • The left arm control unit 25 mainly controls the motion of the left arm joints.
  • The right hand control unit 26 mainly controls the motion of the finger joints of the right hand.
  • The left hand control unit 27 mainly controls the motion of the finger joints of the left hand.
  • The actuators 30 are motors and the like provided at each part of the robot to drive the arms, joints, and so on.
  • The actuators 30 are connected to the control unit 20 by wire or wirelessly.
  • When transport of a predetermined object is instructed by a command from the operator or by the robot's own judgment, the first task control unit 21 recognizes that the instruction is a "transport" task and sets it (S100).
  • Next, the second task control unit 22 obtains from the first task control unit 21 the information that the task is "transport", and determines from the pre-associated data that the subdivided tasks of the "transport" task should be executed in time series in the order: viewing step (S101), grasping step (S102), carrying step (S103), placing step (S104).
  • In the viewing step, the head control unit 23 executes a search task, the right arm control unit 24 a posture maintenance task, the left arm control unit 25 a posture maintenance task, the right hand control unit 26 a release task, and the left hand control unit 27 a release task.
  • While executing these tasks, each of the control units 23 to 27 controls its function (part) of the robot 1 based on the detection signals input from the sensors 10.
  • Each of the control units 23 to 27 operates independently of the other control units (autonomously) until it reaches the target value determined by the subdivided task assigned to it.
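A minimal sketch of this autonomous operation follows. The proportional update, the gain, and all names are illustrative assumptions of mine; the patent does not specify a control law. Each unit repeatedly compares its state with the target value fixed by its subdivided task and advances toward it without consulting the other units:

```python
class ControlUnit:
    """One per function (part); runs independently of the other units."""
    def __init__(self, name, initial, gain=0.5, tol=1e-3):
        self.name, self.value = name, initial
        self.gain, self.tol = gain, tol
        self.target = initial

    def assign(self, target):
        """Receive a subdivided task, expressed here as a target value."""
        self.target = target

    def step(self):
        """One control cycle: move toward the target (a stand-in for the
        real sensor/actuator loop)."""
        self.value += self.gain * (self.target - self.value)

    def done(self):
        return abs(self.target - self.value) < self.tol

units = [ControlUnit("right_arm", 0.0), ControlUnit("left_arm", 0.0)]
units[0].assign(1.0)    # e.g. "extend the arm to the target position"
units[1].assign(-0.5)   # e.g. "follow the external force" toward -0.5
while not all(u.done() for u in units):
    for u in units:     # each unit advances autonomously
        u.step()
print(round(units[0].value, 2), round(units[1].value, 2))  # → 1.0 -0.5
```

No unit ever reads another unit's state; the whole-body behavior emerges only from the targets the task control assigned.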
  • In this example, the viewing step is divided into two steps.
  • When the first step is complete, the process proceeds to the subsequent step (S202).
  • In the subsequent step (S202), the head control unit 23 executes the watching task, the right arm control unit 24 the posture maintenance task, the left arm control unit 25 the posture maintenance task, and the right hand control unit 26 and the left hand control unit 27 the release tasks.
  • Next, a grasping step (S203) is executed.
  • In the first grasping step (S203), the head control unit 23 executes the watching task, the right arm control unit 24 and the left arm control unit 25 the arm-extension tasks, and the right hand control unit 26 and the left hand control unit 27 their respective tasks.
  • The grasping step is divided into two steps (S203, S204).
  • In the first step (S203), when the robot 1 determines from the input from the sensors 10 that both hands have reached the grasping position of the object, the process proceeds to the subsequent step (S204).
  • In the subsequent step (S204), the head control unit 23 executes the watching task, the right arm control unit 24 and the left arm control unit 25 the posture maintenance tasks, and the right hand control unit 26 and the left hand control unit 27 the grasping tasks.
  • Next, a carrying step is executed (S205).
  • In the carrying step, the head control unit 23 executes the watching task, the right arm control unit 24 the arm-extension task, the left arm control unit 25 the force-following task, and the right hand control unit 26 and the left hand control unit 27 the grasping tasks.
  • Finally, the subdivided tasks of the placing step (S104) are executed, and the "transport" task is completed.
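The time-series flow from S100 through the placing step can be sketched as a simple sequencer. Everything below is an illustrative assumption; the completion predicates stand in for the decisions the robot derives from the sensor 10 inputs:

```python
def run_task(steps, state):
    """Execute steps in time series; move to the next step only when the
    current step's sensor-based completion condition holds."""
    executed = []
    for name, is_complete in steps:
        executed.append(name)
        # here this step's subdivided tasks would be dispatched
        # to the per-part control units (23 to 27)
        while not is_complete(state):
            sense(state)  # stand-in for reading the sensors 10
    return executed

def sense(state):
    # toy world model: every condition eventually becomes true
    for key in state:
        state[key] = True

steps = [
    ("view",  lambda s: s["object_found"]),
    ("grasp", lambda s: s["hands_at_grip"]),
    ("carry", lambda s: s["at_destination"]),
    ("place", lambda s: s["object_placed"]),
]
state = dict.fromkeys(
    ["object_found", "hands_at_grip", "at_destination", "object_placed"], False)
print(run_task(steps, state))  # → ['view', 'grasp', 'carry', 'place']
```

Note that the sequencer only decides when to advance; what each part does within a step is entirely the business of that part's control unit.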
  • As shown in FIG. 4A, the head control unit 23 includes a processing block 231 for watching and a processing block 232 for searching.
  • The sensors 10 feeding the head control unit 23 include, for example, an image sensor.
  • A detection signal indicating the position of the object is input from the image sensor, and a detection signal indicating the joint angle is input from an encoder.
  • In the processing block 231 for watching, processing for watching is executed from the target position and the current angle; in the processing block 232 for searching, processing for searching is executed from the search target and the current angle; and the result is output to the actuators 30.
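As an illustrative sketch of FIG. 4A: the block numbers 231 and 232 come from the text above, but the control laws, gains, and function signatures below are my own assumptions. The head control unit routes the current subdivided task to one of its two processing blocks, each of which turns a target and the current joint angle into an actuator command:

```python
import math

def watching_block_231(target_position, current_angle):
    """Processing for watching: turn the head toward the detected object."""
    return 0.8 * (target_position - current_angle)  # command to actuator 30

def searching_block_232(t, current_angle):
    """Processing for searching: sweep the head along a scan trajectory."""
    scan_target = 0.5 * math.sin(t)  # assumed scanning pattern
    return scan_target - current_angle

def head_control_23(task, current_angle, target_position=0.0, t=0.0):
    """Route the current subdivided task to the matching processing block."""
    if task == "watch":
        return watching_block_231(target_position, current_angle)
    if task == "search":
        return searching_block_232(t, current_angle)
    raise ValueError("unknown subdivided task: " + task)

print(round(head_control_23("watch", current_angle=0.2, target_position=1.0), 2))
# → 0.64
```

The same pattern would apply to the other control units: one small processing block per subdivided task, selected by the task name.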
  • Because each control unit controls the robot in units of operations, the description is simple, and adding and modifying functions is easy.
  • A change to one part does not affect the entire robot and can be carried out easily.
  • Cooperative operation can be realized simply by giving each function (part) its command as a task and having it operate autonomously.
  • Because the parts are modular, the necessary parts can be combined and controlled as a single model, expanding the range of tasks that can be handled. Moreover, since a mechanism with many degrees of freedom is processed in parallel and in a distributed manner by dividing it into parts, calculation time is shortened and the robot's reaction speed can be increased.

Abstract

It is possible to provide a robot control system, a robot control method, and a robot in which the model is easy to describe, the amount of calculation is small, and a modification of a task or of a part of the robot can easily be accommodated. The robot control system includes: control units (23 to 27) which are provided for the respective functions of the robot and can control the corresponding functions; and task control units (20, 21) which convert a requested main operation task into subdivided tasks corresponding to the control units (23 to 27) and cause the control units (23 to 27) to execute them.

Description

Specification

Robot control system, robot control method, and robot

Technical field
[0001] The present invention relates to a robot control system, a robot control method, and a robot, and more particularly to a robot control technique according to a task.
Background art
[0002] Robots for various applications have been developed. A robot typically has a multi-degree-of-freedom mechanism, and the operation of each part constituting the robot is centrally controlled by a control unit implemented on a computer.

[0003] Therefore, operating the whole body of the robot in a coordinated manner has required a complex body model. In particular, as the number of degrees of freedom grows with advances in robot technology, the robot's body model becomes still more complex, and the environment model including the robot's whole body inevitably becomes large and complex.

[0004] Moreover, when the robot is made to perform a task, the control unit must compute the motion so that the parts cooperate without interfering with one another, and the amount of calculation needed to execute even a single task has been enormous.

[0005] Furthermore, when a task is changed or added, the mutual influence of the parts must be taken into account, so the robot has to be taught with its whole body in mind. For example, for a robot with 20 axes, a command value must be given to each axis; if the target values of some axes are changed, the command values of all the other axes must be changed as well, which is an enormous amount of work.

[0006] In addition, when a part of the robot is modified, for example to extend its functions, the modification affects the robot's whole body, so the whole robot has to be set up again. Examples include changing a joint that bends up to 50 degrees into one that bends up to 70 degrees, or replacing a two-fingered hand with a five-fingered hand.

[0007] Patent Document 1, on the other hand, discloses a technique for performing distributed control of robots. However, the technique disclosed in Patent Document 1 concerns control in which each of a plurality of microrobots is given its own task command and carries out its own mission; it does not apply distributed control to the processing of the parts of a single robot. Therefore, the above problems can also arise in the robot disclosed in Patent Document 1.
Patent Document 1: Japanese Patent Laid-Open No. 11-231910
Disclosure of the invention

Problems to be solved by the invention

[0008] As described above, conventional robot control suffers from complex models and a large amount of calculation, as well as increased teaching work when tasks are changed and increased setup work when parts are changed.

[0009] The present invention has been made to solve such problems, and its object is to provide a robot control system, a robot control method, and a robot in which the model is easy to describe, the amount of calculation is small, and changes to the task or to some parts are easy to accommodate.

Means for solving the problems
[0010] A robot control system according to the present invention includes a plurality of control units, each provided for a corresponding function of the robot and capable of controlling that function, and a task control unit that converts a requested main operation task into subdivided tasks corresponding to the plurality of control units and causes the plurality of control units to execute the subdivided tasks.

[0011] Here, the task control unit preferably converts the requested main operation task into subdivided tasks set for each of a plurality of steps to be processed in time series, and causes the plurality of control units to execute the subdivided tasks step by step.

[0012] Further, it is desirable that the plurality of control units autonomously execute operations in response to their subdivided tasks.

[0013] Furthermore, the correspondence between functions and control units may be changeable. In a preferred embodiment, the robot control system described above is mounted on a robot.

[0014] A robot control method according to the present invention is a method for controlling a robot equipped with a plurality of control units, each provided for a corresponding function of the robot and capable of controlling that function. The method includes a step of converting a requested main operation task into subdivided tasks corresponding to the plurality of control units, and a step of causing the plurality of control units to execute the converted subdivided tasks.
Effects of the invention

[0015] According to the present invention, it is possible to provide a robot control system, a robot control method, and a robot in which the model is easy to describe, the amount of calculation is small, and changes to the task or to some parts are easy to accommodate.
Brief description of the drawings

[0016] FIG. 1 is an explanatory diagram for explaining the processing concept of a robot control system according to the present invention.

FIG. 2 is a block diagram showing the configuration of a robot control system according to the present invention.

FIG. 3A is a flowchart showing the processing (up to S202) of a robot control system according to the present invention.

FIG. 3B is a flowchart showing the processing (S203 and subsequent steps) of a robot control system according to the present invention.

FIG. 4A is a detailed block diagram of the head control unit of a robot control system according to the present invention.

FIG. 4B is a block diagram illustrating FIG. 4A in more detail.
Explanation of reference numerals

[0017]
1 Robot
2 Object
10 Sensor
20 Control unit
21 First task control unit
22 Second task control unit
23 Head control unit
24 Right arm control unit
25 Left arm control unit
26 Right hand control unit
27 Left hand control unit
30 Actuator
Best mode for carrying out the invention
[0018] First, a robot control system according to the present invention will be described with reference to the conceptual diagram shown in FIG. 1. The robot 1 in this example includes at least a head having an environment recognition unit and dual arms, each having a hand for grasping an object. In this example, the robot 1 is instructed to execute the task of holding the object 2 with both hands and carrying it to a predetermined point (hereinafter, the "main operation task"). The main operation task for realizing transport is classified and set in advance into the sequence: viewing step → grasping step → carrying step → placing step. In the example shown in FIG. 1, the robot 1 is executing the carrying step. In the carrying step, each part (function) of the robot 1 is further instructed to execute a subdivided task (hereinafter, "subdivided task"). For example, the head is given the subdivided task of watching the object 2, the right hand that of grasping the object 2, the right arm that of extending to the target position, the left hand that of grasping the object 2, and the left arm that of following the external force.

[0019] Each function (part) autonomously executes its operation based on its subdivided task. Although locally each part acts autonomously on only the subdivided task assigned to it, as a whole body this realizes the top-level main operation task of "transport".

[0020] In the example shown in FIG. 1, the left hand and the left arm are treated as separate parts and given different subdivided tasks, but the left-hand system may instead be treated as one part and given a single subdivided task.
[0021] Next, the configuration of a robot control system according to the present invention will be described with reference to FIG. 2. In this system, the detection signals from the sensors 10 provided at each part of the robot are input to the control unit 20. The control unit 20 controls the actuators 30 according to the detection signals from the sensors 10 and operation signals from the operator.

[0022] The sensors 10 include, for example, an image sensor (camera) for detecting the position and posture of an object, force sensors and tactile sensors for detecting external forces, encoders and potentiometers for detecting joint angles, and a microphone for detecting sound. The sensors 10 are connected to the control unit 20 by wire or wirelessly.

[0023] The control unit 20 includes a first task control unit 21, a second task control unit 22, a head control unit 23, a right arm control unit 24, a left arm control unit 25, a right hand control unit 26, and a left hand control unit 27. The granularity at which functions and parts are divided into units, each with its own control unit, can be determined and changed as appropriate according to the required performance. The example shown in FIG. 2 assumes an upper-body-only robot as shown in FIG. 1, but a robot that has legs and can walk autonomously would additionally include a right leg control unit, a left leg control unit, and so on.

[0024] The control unit 20 includes, as its hardware configuration, a CPU (Central Processing Unit) and memories such as ROM and RAM. The control units 21 to 27 may all be implemented on a single CPU, may each be given their own CPU, or may be grouped arbitrarily with one CPU per group.
[0025] The first task control unit 21 has the function of recognizing a main operation task requested by an instruction from an operator or by the robot's own judgment. For example, "transport a specific object from one place to another" may be requested as a main operation task. To enable this processing, a large number of main operation tasks are set in advance and stored in the storage unit of the first task control unit 21.

[0026] The second task control unit 22 breaks the main operation task recognized by the first task control unit 21 down into a procedure for realizing it, and converts each step of that procedure into subdivided tasks. For example, the main operation task "transport a specific object from one place to another" is broken down into a viewing step, a grasping step, a carrying step, and a placing step. At each step, the head control unit 23, the right arm control unit 24, the left arm control unit 25, the right hand control unit 26, and the left hand control unit 27 are each made to execute a subdivided task, a command at the level of a unit operation. To enable this processing, each main operation task is described in association with the subdivided tasks of each of its steps and stored in the storage unit of the second task control unit 22.

[0027] The head control unit 23, the right arm control unit 24, the left arm control unit 25, the right hand control unit 26, and the left hand control unit 27 are provided corresponding to the respective functions (parts) of the robot 1 and can control the corresponding function (part) in units of operations.
[0028] The head control unit 23 controls the motion of the head of the robot 1 and the camera provided on the head. The right arm control unit 24 mainly controls the motion of the right arm joints. The left arm control unit 25 mainly controls the motion of the left arm joints. The right hand control unit 26 mainly controls the motion of the finger joints of the right hand. The left hand control unit 27 mainly controls the motion of the finger joints of the left hand.
[0029] The actuators 30 are motors and the like that are provided at the respective parts of the robot and drive the arms, joints, and so on. The actuators 30 are connected to the control unit 20 by wire or wirelessly.
[0030] Next, the flow of control in the robot control system according to the present invention will be described with reference to the flowcharts of Figs. 3A and 3B. The example shown in Figs. 3A and 3B is a case in which the robot is instructed to grasp an object with both hands and carry it.
[0031] When transport of a given object is instructed by a command from the operator or by the robot's own judgment, the first task control unit 21 recognizes this instruction as a "transport" task and sets it (S100).
[0032] Next, the second task control unit 22 obtains from the first task control unit 21 the information that the task is "transport", and determines from the pre-associated data that, for the "transport" task, the subdivided tasks should be executed in time series in the order of the looking step (S101), the grasping step (S102), the carrying step (S103), and the placing step (S104).
[0033] First, in the looking step (S201), the head control unit 23 is made to execute a search task, the right arm control unit 24 a posture-holding task, the left arm control unit 25 a posture-holding task, the right hand control unit 26 a release task, and the left hand control unit 27 a release task. When executing its subdivided task, each of the control units 23 to 27 controls the corresponding function (part) of the robot 1 based on detection signals input from the sensors 10. Moreover, each of the control units 23 to 27 executes its assigned subdivided task independently of (that is, autonomously from) the other control units until the target value defined by that subdivided task is reached.

[0034] In this example, the looking step is divided into two stages. When the robot 1 determines in the first stage (S201), from the input from the sensors 10, that it has found the object, the process proceeds to the second stage (S202). In the second stage, the head control unit 23 is made to execute a look task, the right arm control unit 24 a posture-holding task, the left arm control unit 25 a posture-holding task, the right hand control unit 26 a release task, and the left hand control unit 27 a release task.
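The autonomous behavior described in paragraph [0033], in which each control unit drives its part toward the target value defined by its subdivided task using only local sensor feedback, might be sketched as a simple feedback loop. The proportional update rule, the function name, and the numeric tolerances below are assumptions for illustration only; the patent does not specify a control law.

```python
def run_subdivided_task(current, target, gain=0.5, tol=1e-3, max_iters=100):
    """Drive one part's state toward the target value defined by its
    subdivided task, with no coordination with any other control unit.
    Returns the final state and whether the target was reached."""
    for _ in range(max_iters):
        error = target - current
        if abs(error) < tol:        # target value reached: task complete
            return current, True
        current += gain * error     # local proportional correction
    return current, False           # gave up before convergence

# E.g. the head control unit converging a joint angle on its own.
angle, done = run_subdivided_task(current=0.0, target=1.0)
```

Because each unit runs a loop like this independently, coordination emerges only from the second task control unit having chosen compatible targets, which is the cooperation mechanism paragraph [0040] describes.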
[0035] Next, when it is determined that the robot 1 has brought the object into its field of view and is in an imaging state, the grasping step (S203) is executed. In the grasping step (S203), the head control unit 23 is made to execute a look task, the right arm control unit 24 an arm-extending task, the left arm control unit 25 an arm-extending task, the right hand control unit 26 a release task, and the left hand control unit 27 a release task.
[0036] In this example, the grasping step is divided into two stages (S203, S204). When the robot 1 determines in the first stage (S203), from the input from the sensors 10, that both hands of the robot 1 have reached the gripping position on the object, the process proceeds to the second stage (S204). In the second stage (S204), the head control unit 23 is made to execute a look task, the right arm control unit 24 a posture-holding task, the left arm control unit 25 a posture-holding task, the right hand control unit 26 a grasp task, and the left hand control unit 27 a grasp task.
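The two-stage structure of the looking and grasping steps, where execution advances to the second stage only when a sensor-derived condition holds, can be sketched as a small guard function. The predicate keys in the sensor dictionary are hypothetical names chosen here for illustration.

```python
def advance_stage(stage, sensor):
    """Advance a two-stage step based on sensor input, mirroring the
    transitions S201 -> S202 (object found) and S203 -> S204 (both
    hands at the gripping position)."""
    if stage == "S201" and sensor.get("object_found"):
        return "S202"
    if stage == "S203" and sensor.get("hands_at_grip_position"):
        return "S204"
    return stage  # condition not yet met: remain in the current stage

next_stage = advance_stage("S203", {"hands_at_grip_position": True})
```

Keeping the transition condition separate from the subdivided tasks themselves means each control unit's loop stays unchanged when a stage boundary is added or moved.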
[0037] Next, when it is determined that the robot 1 has completed grasping the object, the carrying step is executed (S205). In the carrying step (S205), the head control unit 23 is made to execute a search task, the right arm control unit 24 an arm-extending task, the left arm control unit 25 a force-following task, the right hand control unit 26 a grasp task, and the left hand control unit 27 a grasp task. Although not shown in Figs. 3A and 3B, the subdivided tasks of the placing step (S104) are then executed, completing the "transport" task.
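Putting the flow of Figs. 3A and 3B together, a sequencer would issue each step's subdivided tasks to the control units, wait for a sensor-based judgment that the step's goal has been reached, and then move to the next step. The sketch below assumes a trivially satisfiable completion check and hypothetical names; it is not the patent's code.

```python
def run_main_task(steps, step_done):
    """Execute the steps of a main operation task in time series.
    `steps` is a list of (name, {unit: subtask}); `step_done(name)` is a
    sensor-based judgment that the step's goal has been reached."""
    issued = []
    for name, assignment in steps:
        for unit, subtask in assignment.items():
            issued.append((name, unit, subtask))  # motion-unit command
        while not step_done(name):                # wait for the judgment
            pass
        # judgment passed -> proceed to the next step
    return issued

steps = [("look", {"head": "search"}), ("grasp", {"right_hand": "grasp"})]
log = run_main_task(steps, step_done=lambda name: True)
```

Note that the sequencer knows nothing about joints or sensors beyond the yes/no judgment; everything below the step boundary is delegated to the autonomous control units.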
[0038] Next, a configuration example of the head control unit 23, taken as representative of the control units 23 to 27, will be described with reference to Figs. 4A and 4B. As shown in Fig. 4A, the head control unit 23 includes a look processing block 231 and a search processing block 232.
[0039] As shown in Fig. 4B, detection signals are input from the sensors 10 to the head control unit 23: for example, a detection signal indicating the position of the object from an image sensor, and a detection signal indicating a joint angle from an encoder. The look processing block 231 executes look processing from the target position and the current angle, and the search processing block 232 executes search processing from a search target trajectory and the current angle; the results are output to the actuator 30.
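One way to picture the head control unit of Fig. 4B, with a look block combining the target position with the current angle and a search block combining a search trajectory with the current angle, is as two small functions whose output feeds the actuator. The proportional control rule and all names here are assumptions for illustration; the patent specifies only the inputs and outputs of the blocks.

```python
def look_block(target_position, current_angle, gain=0.8):
    """Look processing block 231: actuator command that turns the head
    toward the object position reported by the image sensor."""
    return gain * (target_position - current_angle)

def search_block(search_trajectory, current_angle, gain=0.8):
    """Search processing block 232: actuator command that sweeps the
    head along a search trajectory until the object is found."""
    return gain * (next(search_trajectory) - current_angle)

# Sensor inputs: object position from the image sensor, joint angle from
# the encoder. The selected block's output goes to the actuator 30.
sweep = iter([-0.5, 0.0, 0.5])  # assumed sweep waypoints
command = search_block(sweep, current_angle=0.0)
```

Packaging the two behaviors as separate blocks inside one control unit matches the description: the unit switches between its "look" and "search" subdivided tasks without any other unit being involved.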
[0040] As described above, in the robot control system according to the present invention, each control unit controls the robot in units of motion, so the descriptions are simple and functions are easy to add or modify. Moreover, since adding or modifying the function of an individual part does not affect the robot as a whole, such changes can be made easily. Furthermore, since the second task control unit issues motion-unit commands that take the role of each function (part) into account so that the task can be achieved, each function (part) only has to execute its command autonomously as a task, and coordinated operation is thereby realized.
[0041] In addition, by making the classification of parts variable, the required parts can be combined and controlled as a single model, which expands the range of tasks that can be handled. Moreover, since a mechanism that has many degrees of freedom as a whole can be processed in a parallel, distributed manner by classifying it into parts, computation time is shortened and the reaction speed of the robot can be increased.
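The parallel, distributed processing described in paragraph [0041], where each part's control is computed independently so the full many-degree-of-freedom mechanism never has to be solved at once, can be sketched with a thread pool, and the variable classification of parts as a simple regrouping of controller outputs. All names are hypothetical and the per-part computation is a stand-in.

```python
from concurrent.futures import ThreadPoolExecutor

def control_part(part, subtask):
    """Stand-in for one control unit's cycle: compute this part's
    command for its subdivided task, independently of the other parts."""
    return part, f"{subtask}-command"

step_assignment = {
    "head": "search", "right_arm": "hold", "left_arm": "hold",
    "right_hand": "release", "left_hand": "release",
}

# Each control unit's computation runs in parallel; no unit waits on
# another unit's result, so wall-clock time scales with one part, not five.
with ThreadPoolExecutor() as pool:
    commands = dict(pool.map(lambda kv: control_part(*kv),
                             step_assignment.items()))

# Variable classification: combine only the parts a task needs into one
# model (here, a hypothetical right-arm-and-hand subsystem).
arm_hand_model = {p: commands[p] for p in ("right_arm", "right_hand")}
```

The regrouping at the end is the point of claim 4: because the function-to-controller correspondence is just data, a different task can assemble a different subset of parts without touching the controllers themselves.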

Claims

[1] A robot control system comprising: a plurality of control units provided in correspondence with the respective functions of a robot, each capable of controlling the corresponding function; and a task control unit that converts a requested main operation task into subdivided tasks subdivided in correspondence with the plurality of control units, and causes the plurality of control units to execute the subdivided tasks.
[2] The robot control system according to claim 1, wherein the task control unit converts the requested main operation task into the subdivided tasks set for each of a plurality of steps whose processing is to be executed in time series, and causes the plurality of control units to execute the subdivided tasks for each of the steps.
[3] The robot control system according to claim 1 or 2, wherein the plurality of control units autonomously execute operations in accordance with the subdivided tasks.
[4] The robot control system according to claim 1, wherein the correspondence between the functions and the control units can be changed.
[5] A robot comprising the robot control system according to claim 1 or 2.
[6] A method of controlling a robot comprising a plurality of control units provided in correspondence with the respective functions of the robot, each capable of controlling the corresponding function, the method comprising:
converting a requested main operation task into subdivided tasks subdivided in correspondence with the plurality of control units; and
causing the plurality of control units to execute the converted subdivided tasks.
PCT/JP2007/062811 2006-06-27 2007-06-26 Robot control system, robot control method, and robot WO2008001777A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-177360 2006-06-27
JP2006177360A JP2008006518A (en) 2006-06-27 2006-06-27 Robot control system, robot control method and robot

Publications (1)

Publication Number Publication Date
WO2008001777A1 true WO2008001777A1 (en) 2008-01-03

Family

ID=38845541

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/062811 WO2008001777A1 (en) 2006-06-27 2007-06-26 Robot control system, robot control method, and robot

Country Status (2)

Country Link
JP (1) JP2008006518A (en)
WO (1) WO2008001777A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230008603A1 (en) * 2009-03-31 2023-01-12 View, Inc. Counter electrode for electrochromic devices

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5929061B2 (en) * 2011-09-15 2016-06-01 セイコーエプソン株式会社 Robot controller, robot system, robot
JP6229324B2 (en) 2013-06-14 2017-11-15 セイコーエプソン株式会社 ROBOT, ROBOT CONTROL DEVICE, AND ROBOT CONTROL METHOD

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11156765A (en) * 1997-11-30 1999-06-15 Sony Corp Robot device
JP2005144624A (en) * 2003-11-18 2005-06-09 Sony Corp Legged mobile robot

Also Published As

Publication number Publication date
JP2008006518A (en) 2008-01-17

Similar Documents

Publication Publication Date Title
Asfour et al. Armar-6: A high-performance humanoid for human-robot collaboration in real-world scenarios
Tsarouchi et al. High level robot programming using body and hand gestures
US20060195228A1 (en) Robot locus control method and apparatus and program of robot locus control method
JP7015068B2 (en) Collision processing by robot
WO2019116891A1 (en) Robot system and robot control method
JP2006334774A (en) Method for controlling track of effector
JP2010069584A (en) Device and method for controlling manipulator
JP6696465B2 (en) Control system, controller and control method
Liu et al. Function block-based multimodal control for symbiotic human–robot collaborative assembly
Marinho et al. Manipulator control based on the dual quaternion framework for intuitive teleoperation using kinect
US20220105625A1 (en) Device and method for controlling a robotic device
CN115351780A (en) Method for controlling a robotic device
CN114516060A (en) Apparatus and method for controlling a robotic device
Si et al. Adaptive compliant skill learning for contact-rich manipulation with human in the loop
WO2008001777A1 (en) Robot control system, robot control method, and robot
JP6322949B2 (en) Robot control apparatus, robot system, robot, robot control method, and robot control program
CN112894827A (en) Mechanical arm motion control method, system and device and readable storage medium
JP2016049607A (en) Robot device, method of controlling robot device, program, and recording medium
JP2015085499A (en) Robot, robot system, control device and control method
JP2021088042A (en) Robot control device, gripping system and robot hand control method
WO2021250923A1 (en) Robot system, control device, and control method
JP2016040067A (en) Robot device, method for controlling robot, program and recording medium
Bailey-Van Kuren Flexible robotic demanufacturing using real time tool path generation
JP2006315128A (en) Shifting from one hand to the other hand control method for robot hand
Li et al. A new teaching system for arc welding robots with auxiliary path point generation module

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07767617

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 07767617

Country of ref document: EP

Kind code of ref document: A1