WO2020115985A1 - Système de robot - Google Patents

Système de robot

Info

Publication number
WO2020115985A1
WO2020115985A1 (PCT/JP2019/036289)
Authority
WO
WIPO (PCT)
Prior art keywords
image
robot hand
robot system
robot
flexible body
Prior art date
Application number
PCT/JP2019/036289
Other languages
English (en)
Japanese (ja)
Inventor
雄司 山川
守仁 黄
村上 健一
正俊 石川
角 博文
和弘 柴橋
Original Assignee
国立大学法人 東京大学
株式会社柴橋商会
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 国立大学法人 東京大学, 株式会社柴橋商会 filed Critical 国立大学法人 東京大学
Publication of WO2020115985A1 publication Critical patent/WO2020115985A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the present invention relates to a robot system.
  • In the industrial and medical fields, robots may be used in place of humans; tasks such as transportation and pickup are performed using robots.
  • Patent Document 1 discloses a work transfer system that transfers a rigid work.
  • the present invention has been made in view of such circumstances, and an object of the present invention is to provide a robot system that can be used in a transfer line handling a flexible body while maintaining safety for humans.
  • According to one aspect of the present invention, there is provided a robot system comprising a robot hand, an imaging device, and an information processing device, wherein the robot hand is displaced within a predetermined range in spatial coordinates and is configured to be able to grip a flexible body held by a human, the imaging device is configured to be able to capture the flexible body and the human as an image, and the information processing device is configured to recognize the position of the flexible body and the position of the human based on the image and to control the position/posture of the robot hand so that the robot hand can receive the flexible body from the human without contacting the human.
  • In such a robot system, the imaging device captures the flexible body and the person as an image, and the information processing device recognizes the position of the predetermined part of the flexible body and the position of the person based on the image. Then, the information processing device controls the position and orientation of the robot hand so that the robot hand can receive the flexible body from the human without contacting the human.
  • FIG. 1 is a schematic configuration diagram of a robot system according to an embodiment.
  • FIG. 2 is a functional block diagram of the information processing device. FIG. 3 shows an example of the spectral characteristics of a material that functions as the predetermined portion of the flexible body. FIG. 4 shows an example of the spectral characteristics of a preferable imaging device corresponding to FIG. 3. FIG. 5 shows the operation flow of the robot system.
  • In this specification, a “unit” may include, for example, a combination of hardware resources implemented by circuits in a broad sense and the information processing of software that can be concretely realized by those hardware resources. Further, although various kinds of information are handled in the present embodiment, these pieces of information are represented by signal levels as groups of binary bits composed of 0s and 1s, and communication and computation on them can be executed on circuits in a broad sense.
  • A circuit in a broad sense is a circuit realized by at least appropriately combining circuits, circuitry, processors, memories, and the like. That is, it includes application-specific integrated circuits (ASIC), programmable logic devices (for example, simple programmable logic devices (SPLD), complex programmable logic devices (CPLD), and field-programmable gate arrays (FPGA)), and the like.
  • FIG. 1 is a diagram showing a schematic configuration of a robot system 1 according to this embodiment.
  • the robot system 1 is a system that includes an imaging device 2, an information processing device 3, a mechanism control device 4, and a robot hand 5, and these are electrically connected. Each component will be described in more detail below.
  • the image pickup device 2 is a so-called vision sensor (camera) configured to be able to acquire information of the outside world as an image, and it is particularly preferable to employ a high-speed vision device having a high frame rate.
  • The frame rate is, for example, 100 fps or higher, preferably 250 fps or higher, and more preferably 500 fps or even 1000 fps.
  • the image pickup device 2 includes a first image pickup device 21 and a second image pickup device 22.
  • The first imaging device 21 is provided on the robot hand 5. More specifically, the first imaging device 21 is provided such that its attitude relative to the pair of grips 521 and 522 of the robot hand 5 is constant regardless of the position of the robot hand 5.
  • Accordingly, the imaging range of the first imaging device 21 changes according to the displacement of the robot hand 5, and in particular the predetermined portion Fp of the flexible body F can be captured as the first image IM1. Even when the flexible body F is displaced or deformed, the position of the predetermined portion Fp within the first image IM1 is grasped sequentially with high time resolution, and the robot hand 5 is controlled so that the predetermined portion Fp comes to a defined position in the first image IM1.
  • the second imaging device 22 is provided at a position different from the robot hand 5. More specifically, the second image pickup device 22 is configured to pick up an image of a specified image pickup range regardless of the displacement of the robot hand 5.
  • Thereby, the user U working near the robot hand 5 can be captured as the second image IM2. The relative positions of the user U and the robot hand 5 are grasped sequentially with high time resolution based on the second image IM2, and control is performed so that the robot hand 5 does not contact the user U.
  • It is preferable that the frame rate of the first imaging device 21, which images the flexible body F whose shape changes from moment to moment, be particularly high. Therefore, if the frame rate of the first imaging device 21 is defined as f_1 and the frame rate of the second imaging device 22 as f_2, f_1 ≥ f_2 may be satisfied.
  • FIG. 2 is a functional block diagram showing an outline configuration of the information processing device 3.
  • the information processing device 3 includes a communication unit 31, a storage unit 32, and a control unit 33, and these components are electrically connected inside the information processing device 3 via a communication bus 30.
  • The communication unit 31 is preferably a wired communication means such as USB, IEEE 1394, Thunderbolt, or wired LAN network communication, but may also include wireless LAN network communication, mobile communication such as LTE/3G, Bluetooth (registered trademark) communication, and the like as necessary. That is, it is more preferable to implement the communication unit 31 as a set of these plural communication means.
  • It is preferable that communication with the imaging device 2 be possible in accordance with a predetermined high-speed communication standard. With such a configuration, the image IM (a generic term for the first image IM1 and the second image IM2) can be acquired from the imaging device 2, which has a high frame rate.
  • The communication unit 31 is configured to be able to transmit, to the mechanism control device 4, control information CI, which is information for controlling the angles of the plurality of rotating units 512 in the arm module 51 of the robot hand 5 (described later) to desired angles based on inverse kinematics. Further, it is more preferable that the current angle of each rotating unit 512 can be obtained as information (from an encoder). With such a configuration, control can be performed such that the pair of grips 521 and 522 of the robot hand 5 are displaced to desired positions in spatial coordinates.
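  • As an illustration of the inverse-kinematics step mentioned above: the publication does not specify the arm's geometry, so the following sketch assumes a hypothetical planar two-link arm (link lengths l1, l2, function name `two_link_ik` invented here) and solves its joint angles in closed form.

```python
import math

# Illustrative only: the patent does not specify the kinematics of the
# arm module 51. This is the textbook closed-form solution for a planar
# two-link arm reaching the point (x, y).
def two_link_ik(x, y, l1, l2):
    """Return joint angles (t1, t2) in radians, elbow-down solution."""
    d2 = x * x + y * y
    cos_t2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    t2 = math.acos(max(-1.0, min(1.0, cos_t2)))          # elbow angle
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

t1, t2 = two_link_ik(1.0, 1.0, 1.0, 1.0)
```

For the real arm module 51, a solver matching its actual joint layout would of course be required.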
  • The storage unit 32 stores the various information defined in the above description. It may be implemented, for example, as a storage device such as a solid state drive (SSD), as a memory such as a random access memory (RAM) that stores temporarily necessary information (arguments, arrays, etc.) related to program calculation, or as a combination of these.
  • the storage unit 32 stores the image IM captured by the image capturing apparatus 2 and received by the communication unit 31.
  • the image IM is array information including pixel information of 8 bits for each of RGB, for example.
  • The storage unit 32 also stores a predetermined image processing program executed on the image IM by the image processing unit 331 (described later) in the control unit 33, and in particular a tracking calculation program for the predetermined portion Fp of the flexible body F that is executed based on the first image IM1.
  • the control information CI for controlling the robot hand 5 is sequentially determined by the tracking calculation unit 332 in the control unit 33 by the tracking calculation program.
  • the storage unit 32 also stores various programs and the like related to the robot system 1 executed by the control unit 33.
  • the control unit 33 processes and controls the overall operation related to the information processing device 3.
  • the control unit 33 is, for example, a central processing unit (CPU) (not shown).
  • The control unit 33 realizes various functions of the information processing device 3 by reading out predetermined programs stored in the storage unit 32. Specifically, these correspond to an image processing function for the image IM and a calculation function for tracking the predetermined portion Fp of the flexible body F. That is, the information processing by software (stored in the storage unit 32) is concretely realized by hardware (the control unit 33), so that the control unit 33 can operate as the image processing unit 331 and the tracking calculation unit 332.
  • Although the control unit 33 is described here as a single unit, it is not limited to this in practice; a plurality of control units 33 may be provided for each function, or a combination of these may be used.
  • The image processing unit 331 and the tracking calculation unit 332 will now be described in more detail.
  • the image processing unit 331 is one in which information processing by software (stored in the storage unit 32) is specifically realized by hardware (control unit 33).
  • The image processing unit 331 is configured to perform predetermined image processing on the image IM transmitted from the imaging device 2 and received by the communication unit 31. More specifically, by performing predetermined image processing on the first image IM1 transmitted from the first imaging device 21 and received by the communication unit 31, the predetermined portion Fp of the flexible body F can be distinguished from the other, non-predetermined portion Fq. As a result, the information processing device 3 extracts and recognizes the predetermined portion Fp.
  • While the predetermined portion Fp is distinguishable from the non-predetermined portion Fq as seen from the first imaging device 21, it is preferable that the predetermined portion Fp and the non-predetermined portion Fq cannot be distinguished by the human eye.
  • That is, it is preferable that the flexible body F be implemented such that the predetermined portion Fp and the non-predetermined portion Fq appear to be substantially the same color to the human eye. This is because it is considered inappropriate to use a marker or the like that is visually recognizable as a foreign matter in towels and sheets, for which hygiene is important.
  • For example, the predetermined portion Fp is a marker sewn onto the towel.
  • FIG. 3 shows an example of spectral characteristics of a material that functions as a predetermined portion Fp of the flexible body F.
  • FIG. 4 shows an example of the spectral characteristics of a preferable imaging device 2 (especially the first imaging device 21) corresponding to FIG. 3.
  • The material shown in FIG. 3 has a high reflectance in the near-infrared region, which humans cannot perceive. Therefore, by adopting a first imaging device 21 capable of sensing such a near-infrared region, as shown in FIG. 4, the first imaging device 21 can image the predetermined portion Fp and the non-predetermined portion Fq of the flexible body F so as to be distinguishable, even though they appear to the human eye to be substantially the same color.
  • The filter may be configured to cut the visible light region, or may be configured to be able to acquire information on both the visible light region and the near-infrared region.
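  • The near-infrared discrimination described above can be sketched as a simple per-pixel threshold. The threshold value, the image representation, and the function name below are illustrative assumptions, not taken from the publication.

```python
# Hypothetical sketch: pixels with high near-infrared (NIR) reflectance
# are taken to belong to the marker portion Fp; the cutoff is assumed.
NIR_THRESHOLD = 0.6  # assumed reflectance cutoff

def extract_marker_mask(nir_image):
    """Binary mask: True where NIR reflectance suggests the marker Fp."""
    return [[p > NIR_THRESHOLD for p in row] for row in nir_image]

# Toy 3x4 NIR frame: the marker occupies the top-right corner.
frame = [
    [0.10, 0.20, 0.80, 0.90],
    [0.10, 0.10, 0.70, 0.85],
    [0.20, 0.10, 0.10, 0.20],
]
mask = extract_marker_mask(frame)
```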
  • Image processing may be performed only on a predetermined region of interest (ROI) forming part of the first image IM1.
  • That is, since tracking of the predetermined portion Fp is performed at a high control rate, the predetermined portion Fp stays in the vicinity of a fixed position (for example, the center) of the image IM; by taking the area near this fixed position as the predetermined region ROI, the number of pixels subjected to image processing can be reduced. As a result, the load of image processing calculation in the information processing device 3 is reduced, and tracking can be performed at a high control rate.
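  • The ROI idea above can be sketched as a simple crop around the fixed position; the window size and image representation are illustrative assumptions.

```python
def crop_roi(image, center_row, center_col, half):
    """Crop a square region of interest around a fixed pixel position."""
    rows = image[max(0, center_row - half):center_row + half + 1]
    return [r[max(0, center_col - half):center_col + half + 1] for r in rows]

full = [[0] * 100 for _ in range(100)]   # e.g. a 100x100 binarized frame
roi = crop_roi(full, 50, 50, 10)         # 21x21 window around the centre
```

Here the pixel count processed per frame drops from 10,000 to 441, which is the kind of reduction that makes a high control rate feasible.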
  • Further, the image processing unit 331 executes predetermined image processing on the second image IM2 transmitted from the second imaging device 22 and received by the communication unit 31 so that the user U and the robot hand 5 can be distinguished. Thereby, the information processing device 3 grasps the relative positional relationship between the user U and the robot hand 5 (particularly the pair of grips 521 and 522). This is performed, for example, by setting a threshold value for a predetermined parameter (brightness or the like) of the captured second image IM2 and binarizing it.
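  • The thresholding/binarization described above can be sketched as follows; the threshold value and the use of a centroid to locate each region are illustrative assumptions.

```python
def binarize(image, threshold):
    """Set pixels at or above the threshold to 1, others to 0."""
    return [[1 if p >= threshold else 0 for p in row] for row in image]

def centroid(mask):
    """Mean (row, col) of the nonzero pixels of a binary mask."""
    pts = [(r, c) for r, row in enumerate(mask)
                  for c, v in enumerate(row) if v]
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

gray = [
    [10, 10, 200],
    [10, 10, 210],
]
mask = binarize(gray, 128)
```

A centroid of each binarized region gives one possible position estimate from which the relative distance between the user U and the robot hand 5 could be computed.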
  • the tracking calculation unit 332 is one in which information processing by software (stored in the storage unit 32) is specifically realized by hardware (control unit 33).
  • the tracking calculation unit 332 executes a control calculation for aligning the predetermined portion Fp of the flexible body F extracted by the image processing by the image processing unit 331 with the pair of grips 521 and 522 of the robot hand 5.
  • As described above, the first imaging device 21 is provided on the robot hand 5 such that its posture relative to the pair of grips 521 and 522 is constant regardless of the position of the robot hand 5. Therefore, by displacing the first imaging device 21 so that the predetermined portion Fp in the first image IM1 is aligned with the target position, the pair of grips 521 and 522, whose positional relationship with the first imaging device 21 is fixed, can be aligned with the predetermined portion Fp.
  • The tracking calculation unit 332 calculates control information CI relating to these controls and transmits it to the mechanism control device 4 via the communication unit 31. Then, based on the control information CI, the mechanism control device 4 transmits to the robot hand 5 the control voltage V1 related to the angle control of each rotating unit 512 of the arm module 51 and the control voltage V2 related to the position/posture control of the grip module 52.
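  • A minimal sketch of one tracking step, assuming simple proportional (P) control on the pixel error between the predetermined portion Fp and its target position in the first image IM1; the gain value and the sign convention of the displacement command are illustrative assumptions.

```python
def tracking_command(marker_px, target_px, gain=0.5):
    """One proportional visual-servo step: a displacement command
    proportional to the pixel error between the marker Fp and its
    target position. Gain and sign convention are assumed."""
    return (gain * (target_px[0] - marker_px[0]),
            gain * (target_px[1] - marker_px[1]))
```

For example, `tracking_command((10, 20), (0, 0))` yields a command that moves the marker's image toward the target by half the observed error per control cycle.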
  • It is preferable to give priority to avoiding contact between the robot hand 5 and the user U by using the information on their relative positional relationship obtained from the second image IM2. For example, it is advisable to set a condition such that, when the distance between the user U and the robot hand 5 is equal to or less than a predetermined value, the alignment (tracking) between the predetermined portion Fp and the pair of grips 521 and 522 performed based on the first image IM1 is not executed. In other words, when the distance between the user U and the robot hand 5 in the second image IM2 is equal to or less than the specified value, the control for bringing the robot hand 5 closer to the flexible body F may be left unexecuted.
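  • The safety condition above can be sketched as a distance gate; the threshold value here is an assumption, since the publication leaves the predetermined value unspecified.

```python
import math

SAFETY_DISTANCE = 0.30  # metres; the actual threshold is not given in the text

def allow_tracking(user_pos, hand_pos, min_dist=SAFETY_DISTANCE):
    """Permit the approach (tracking) control only while the user is
    farther from the hand than the safety threshold."""
    return math.dist(user_pos, hand_pos) > min_dist
```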
  • It is preferable that the calculation rate of the tracking calculation unit 332 be as high as the frame rate of the first imaging device 21: for example, 100 Hz or more, preferably 250 Hz or more, and more preferably 500 Hz or even 1000 Hz.
  • Note that the lower of the frame rate of the first imaging device 21 and the calculation rate of the tracking calculation unit 332 functions as the effective control rate for tracking. In other words, by raising the frame rate and the calculation rate to comparably high levels, the predetermined portion Fp of the flexible body F can be tracked by feedback control alone, without using prediction.
  • the mechanism control device 4 includes a storage unit and a control unit (not shown).
  • The storage unit may be implemented, for example, as a storage device such as a solid state drive (SSD), as a memory such as a random access memory (RAM) that stores temporarily necessary information (arguments, arrays, etc.) related to program calculation, or as a combination of these.
  • the control unit may be implemented as, for example, a central processing unit (CPU). The control unit realizes various functions related to control of the robot hand 5 by reading out a predetermined program stored in the storage unit.
  • The mechanism control device 4 may be provided, as necessary, with wired communication means such as USB, IEEE 1394, Thunderbolt, or wired LAN network communication, or with wireless communication means such as wireless LAN network communication, mobile communication such as LTE/3G, or Bluetooth (registered trademark) communication, so that it can be connected to an external device. In particular, it is preferable that communication with the robot hand 5 be configured to be high-speed in accordance with a dedicated communication standard.
  • The mechanism control device 4 is configured such that its control unit (not shown) can control the angle of each rotating unit 512 of the arm module 51 constituting the robot hand 5 and the position/posture of the grip module 52. When the mechanism control device 4, which is connected to the information processing device 3, receives the control information CI transmitted from the communication unit 31 of the information processing device 3, it transmits the control voltages V1 and V2 to the robot hand 5 based on the control information CI.
  • As the control, for example, PD control, PI control, PID control, or the like can be appropriately adopted. Each coefficient related to the control may be included in the control information CI or may be determined by the mechanism control device 4 based on the control information CI; preferable values may be set as needed.
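  • A textbook discrete PID controller of the kind named above might look as follows; the gains and time step are illustrative, since the publication does not give coefficient values.

```python
class PID:
    """Textbook discrete PID; the patent names PD/PI/PID control but
    gives no gains, so kp, ki, kd, and dt here are illustrative."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = None

    def step(self, target, current):
        err = target - current
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

ctrl = PID(kp=2.0, ki=0.0, kd=0.0, dt=0.001)
u = ctrl.step(target=1.0, current=0.5)   # pure-P case: u = 2.0 * 0.5 = 1.0
```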
  • It is preferable that the control rate of the mechanism control device 4 (the drive rate of the robot hand 5) be as high as the frame rate of the first imaging device 21 and the calculation rate of the information processing device 3: for example, 100 Hz or more, preferably 250 Hz or more, and more preferably 500 Hz or even 1000 Hz.
  • The operating system used for the mechanism control device 4 is preferably a real-time operating system.
  • the robot hand 5 includes an arm module 51 and a grip module 52 attached to the tip of the arm module 51.
  • the shapes and structures of the arm module 51 and the grip module 52 are merely examples and are not particularly limited.
  • the arm module 51 includes a plurality of arm portions 511 and a plurality of rotating portions 512 that function as joints of the arm portions 511 or as connection points with the grip module 52.
  • Each rotating unit 512 is controlled to have a desired angle by the control voltage V1 transmitted from the mechanism control device 4. Further, as a result, the position/posture of a later-described grip module 52 connected to the arm module 51 is controlled.
  • the grip module 52 is connected to the arm module 51 via one rotating portion 512.
  • the grip module 52 has a pair of grips 521 and 522 at its tip.
  • the grip module 52 is configured to be able to control the relative positions of the pair of grip portions 521 and 522 within the grip module 52.
  • the pair of grips 521 and 522 are configured to grip a predetermined portion Fp of the flexible body F which the user U grips with his or her hand UH.
  • the grip module 52 is provided with the first imaging device 21. More specifically, the first imaging device 21 is provided so that the relative posture with respect to the grips 521 and 522 is constant regardless of the position of the robot hand 5.
  • the relative positions of the pair of grips 521 and 522 in the robot hand 5 are controlled as desired by the control voltage V2 transmitted by the mechanism control device 4.
  • While the movable stroke of the pair of grips 521 and 522 achieved by controlling the arm module 51 is wide, the movable stroke achieved by controlling the grip module 52 is preferably limited to a range narrower than that of the arm module 51.
  • On the other hand, the drive rate of the arm module 51 may be lower than the drive rate of the grip module 52. That is, in controlling the pair of grips 521 and 522 to desired positions, rough alignment is performed by control of the arm module 51 (low-frequency control), and further fine adjustment of the position is performed by control of the grip module 52 (high-frequency control).
  • Here, the drive rate of the arm module 51 is defined as f_3 and the drive rate of the grip module 52 as f_4; accordingly, f_3 ≤ f_4 may be satisfied.
  • the arm module 51 which is controlled at a low drive rate, may be controlled in the direction opposite to the direction in which the user U is located, if necessary.
  • FIG. 5 shows an operation flow of the robot system 1.
  • Step S1: The user U presents the flexible body F to the robot hand 5 while holding the non-predetermined portion Fq of the flexible body F with the hand UH (continue to step S2).
  • Step S2: The image processing unit 331 in the information processing device 3 sets a threshold value for a predetermined parameter (brightness or the like) of the first image IM1 captured by the first imaging device 21 and binarizes the first image IM1. As a result, the predetermined portion Fp of the flexible body F is extracted (continue to step S3).
  • Step S3: Based on the first image IM1 binarized in step S2, the tracking calculation unit 332 of the information processing device 3 calculates the distance between the predetermined portion Fp and the pair of grips 521 and 522, whose positional relationship with the first imaging device 21 is known. It then calculates control information CI that brings this distance toward 0 (continue to step S4).
  • Step S4: The mechanism control device 4 transmits the control voltages V1 and V2 to the robot hand 5 based on the control information CI calculated in step S3. As a result, the angles of the rotating units 512 in the arm module 51 and the position/posture of the grip module 52 are controlled as desired (continue to step S5).
  • Step S5: By repeating steps S2 to S4, the pair of grips 521 and 522 approach the flexible body F and receive the flexible body F from the user U, whereby the delivery task is completed.
  • Note that, as described above, when the distance between the user U and the robot hand 5 is equal to or less than the predetermined value, control for maintaining the distance between the user U and the robot hand 5 is preferentially executed instead of steps S2 to S4. After the delivery is completed, the robot system 1 conveys the flexible body F to an appropriate position, making it possible to assist a transport line for flexible bodies F that has conventionally been operated by humans.
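  • The repeated loop of steps S2 to S4 can be sketched as a simulated proportional-feedback iteration that drives the grip position toward the marker; the gain, tolerance, and 2-D position abstraction are illustrative assumptions.

```python
def delivery_loop(marker, hand, gain=0.5, tol=1e-3, max_steps=200):
    """Simulated steps S2-S4: the grip position is repeatedly driven
    toward the marker Fp by proportional feedback until the residual
    distance falls below a tolerance. All numbers are illustrative."""
    for _ in range(max_steps):
        ex, ey = marker[0] - hand[0], marker[1] - hand[1]
        if (ex * ex + ey * ey) ** 0.5 < tol:
            break
        hand = (hand[0] + gain * ex, hand[1] + gain * ey)
    return hand
```

With a gain of 0.5 the residual error halves each cycle, so convergence to a millimetre-scale tolerance takes on the order of a dozen iterations, which at a 500-1000 Hz control rate corresponds to a few tens of milliseconds.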
  • Section 3 describes modifications of the robot system 1 according to the present embodiment. That is, the robot system 1 may be further devised in the following manner.
  • In the present embodiment, the description assumes that the predetermined portion Fp of the flexible body F is extracted from the first image IM1 and that the relative positional relationship between the user U and the robot hand 5 is recognized from the second image IM2. However, the relative positional relationship between the user U and the robot hand 5 may instead be grasped from the first image IM1, and the predetermined portion Fp of the flexible body F may be extracted from the second image IM2 and used for position control of the pair of grips 521 and 522.
  • In the present embodiment, the first imaging device 21 and the second imaging device 22 are adopted as the imaging device 2, but only one of them may be adopted. Note that, in such a case, both the predetermined portion Fp of the flexible body F and the relative positional relationship between the user U and the robot hand 5 should be grasped and recognized from the image IM obtained by that single device.
  • In the present embodiment, the positions of the pair of grips 521 and 522 are treated as known; however, in order to further improve accuracy, a marker (not shown) may be provided at the tips of the pair of grips 521 and 522 so that their detailed positions can be extracted from the first image IM1.
  • the information processing device 3 and the mechanism control device 4 may be implemented as one information processing device without being divided.
  • a separate sensor may be added in addition to the imaging device 2.
  • a force sensor is considered to be effective.
  • For example, when a force sensor is introduced to acquire information such as a contact force, the tracking calculation unit 332 may calculate the control information CI in consideration of safety based on that information.
  • In summary, the robot system 1 includes the robot hand 5, the imaging device 2, and an information processing device (the information processing device 3 and the mechanism control device 4). The robot hand 5 is displaced within a predetermined range in spatial coordinates and is configured to be able to grip the flexible body F held by a human (user U). The imaging device 2 is configured to be able to capture the flexible body F and the human (user U) as an image IM. The information processing device (the information processing device 3 and the mechanism control device 4) recognizes the position of the flexible body F and the position of the human (user U) based on the image IM, and can control the position and orientation of the robot hand 5 so that the robot hand 5 can receive the flexible body F from the human (user U) without contacting the human (user U).
  • 1: Robot system 2: Imaging device 21: First imaging device 22: Second imaging device 3: Information processing device 30: Communication bus 31: Communication unit 32: Storage unit 33: Control unit 331: Image processing unit 332: Tracking calculation unit 4: Mechanism control device 5: Robot hand 51: Arm module 511: Arm unit 512: Rotating unit 52: Grip module 521: Grip unit 522: Grip unit CI: Control information F: Flexible body Fp: Predetermined portion Fq: Non-predetermined portion IM: Image IM1: First image IM2: Second image U: User UH: Hand V1: Control voltage V2: Control voltage

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention aims to provide a robot system that can be used in a transport line handling a flexible body while maintaining safety for humans. To this end, the invention relates to a robot system comprising a robot hand, an imaging device, and an information processing device. The robot hand is displaced within a predetermined range in spatial coordinates and is configured to be able to grip a flexible body held by a human. The imaging device is configured to be able to capture an image of the flexible body and the human. The information processing device recognizes the position of the flexible body and the position of the human based on the image, and is configured to be able to control the position/attitude of the robot hand such that the robot hand can receive the flexible body from the human without coming into contact with the human.
PCT/JP2019/036289 2018-12-05 2019-09-17 Système de robot WO2020115985A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018228386A JP2020089944A (ja) 2018-12-05 2018-12-05 ロボットシステム
JP2018-228386 2018-12-05

Publications (1)

Publication Number Publication Date
WO2020115985A1 true WO2020115985A1 (fr) 2020-06-11

Family

ID=70975053

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/036289 WO2020115985A1 (fr) 2018-12-05 2019-09-17 Système de robot

Country Status (2)

Country Link
JP (1) JP2020089944A (fr)
WO (1) WO2020115985A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006013829A1 (fr) * 2004-08-02 2006-02-09 Matsushita Electric Industrial Co., Ltd. Robot pour transporter des biens, systeme pour transporter des biens et methode pour transporter des biens
JP2007216381A (ja) * 2004-07-13 2007-08-30 Matsushita Electric Ind Co Ltd ロボット
JP2011042011A (ja) * 2009-08-21 2011-03-03 Tokyo Metropolitan Univ ロボット制御装置、ロボット制御方法、ロボット制御プログラム、及びロボット
JP2011115207A (ja) * 2009-11-30 2011-06-16 Ist Corp 布製品の折り畳みシステム
US20140277679A1 (en) * 2013-03-15 2014-09-18 Northeastern University Systems and Methods of using a Hieroglyphic Machine Interface Language for Communication with Auxiliary Robotics in Rapid Fabrication Environments


Also Published As

Publication number Publication date
JP2020089944A (ja) 2020-06-11

Similar Documents

Publication Publication Date Title
US11072068B2 (en) Robot apparatus and method of controlling robot apparatus
CN106945007B (zh) 机器人系统、机器人、以及机器人控制装置
JP4565229B2 (ja) ロボット
JP5953658B2 (ja) ロボット制御装置及びロボット装置の制御方法、コンピューター・プログラム、プログラム記憶媒体、並びにロボット装置
JP7111114B2 (ja) 情報処理装置、情報処理方法及び情報処理システム
US20170203434A1 (en) Robot and robot system
JP6659424B2 (ja) ロボットおよびその制御方法
CN106256512A (zh) 包括机器视觉的机器人装置
US20180281881A1 (en) Robot and control device of the robot
JP7044047B2 (ja) ロボット
CN110781714B (zh) 图像处理装置、图像处理方法
JP2015226965A (ja) ロボット、ロボットシステム、制御装置、及び制御方法
JP5223407B2 (ja) 冗長ロボットの教示方法
CN111085993A (zh) 与人进行协同作业的机器人系统以及机器人控制方法
US20210268660A1 (en) Robot Control Method And Robot System
US20180215044A1 (en) Image processing device, robot control device, and robot
US9833898B2 (en) Positioning control apparatus
WO2020115985A1 (fr) Système de robot
US20160306340A1 (en) Robot and control device
JP2006224291A (ja) ロボットシステム
JP2013180369A (ja) 適応性機械
JP4956964B2 (ja) ロボットハンドの把持制御装置
JP2015157343A (ja) ロボット、ロボットシステム、制御装置、および制御方法
Kiguchi et al. A study of a 4DOF upper-limb power-assist intelligent exoskeleton with visual information for perception-assist
JP6103112B2 (ja) 撮像装置及び撮像装置の制御方法、コンピューター・プログラム、並びにプログラム記憶媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19892964

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19892964

Country of ref document: EP

Kind code of ref document: A1