WO2019180916A1 - Robot control device - Google Patents

Robot control device

Info

Publication number
WO2019180916A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
gesture
robot control
control device
motion
Prior art date
Application number
PCT/JP2018/011704
Other languages
English (en)
Japanese (ja)
Inventor
堅太 藤本
奥田 晴久
文俊 松野
孝浩 遠藤
Original Assignee
三菱電機株式会社
国立大学法人京都大学
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社, 国立大学法人京都大学
Priority to PCT/JP2018/011704 priority Critical patent/WO2019180916A1/fr
Priority to JP2019510378A priority patent/JP6625266B1/ja
Publication of WO2019180916A1 publication Critical patent/WO2019180916A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices

Definitions

  • the present invention relates to a robot control apparatus that controls a robot.
  • In a conventional teaching process, the robot hand is moved to a desired position and posture by a device expert using a teaching-dedicated teaching box connected to the control device, and the control device may then calculate an operation instruction (operation command) from the taught position and posture.
  • Since such teaching is required for each shape of the object to be worked on, a plurality of teachings must be performed in order for the robot to perform work on objects of various shapes, as in high-mix, low-volume production.
  • In one prior technique, the robot control unit detects the operator's gesture from imaging information, specifies the robot control command associated with the gesture, and performs the robot control corresponding to the control command, while the display control unit displays the robot control command to the operator. According to such a configuration, the operator can confirm from the displayed robot control command whether the instruction has been correctly transmitted to the robot.
  • In another prior technique, in the teaching step, the robot is caused to perform an operation based on a three-dimensional virtual world image generated using sensory information acquired from the teaching target robot.
  • Motion information and sensory information such as vision, tactile sensation, and hearing are acquired from a tactile sensor or a camera.
  • The robot imitates a desired motion based on the motion information and the sensory information.
  • The operator can thus perform operations corresponding to various objects by acting with reference to sensory information such as vision, touch, and hearing.
  • JP 2014-104527 A; Japanese Patent No. 4463120
  • However, the above techniques have a problem in that it is difficult for a beginner who does not know the robot control commands, or who does not know the physicality of the robot such as the movable range of the actual robot, to operate the robot.
  • the present invention has been made in view of the above-described problems, and an object thereof is to provide a technique that allows an operator to easily operate a robot.
  • The robot control device according to the present invention is a robot control device that controls a robot, and includes: a work support device that recognizes a worker's gesture based on a result of detecting the motion of the worker with a sensor and, based on the gesture, determines an operation instruction for the robot to perform an operation; a robot controller that determines information for controlling the operation of the robot based on the operation instruction determined by the work support device and outputs the information to the servo motor; and a motion display device that displays, as the motion of the robot, a human animation that moves based on the operation instruction determined by the work support device.
  • Thus, the operation instruction is determined based on the gesture of the worker, the operation of the robot is controlled based on the determined operation instruction, and a moving human animation is displayed based on the determined operation instruction. According to such a configuration, even an operator who has no knowledge of the robot can easily operate the robot by viewing the animation display.
  • FIG. 1 is a block diagram illustrating a configuration of a robot control apparatus according to a first embodiment.
  • FIG. 2 is a perspective view showing a robot according to the first embodiment.
  • FIG. 3 is a block diagram showing components that perform initial processing among the components of the robot control apparatus.
  • FIG. 4 is a diagram showing a gesture / motion visualization table according to the first embodiment.
  • FIG. 5 is a diagram showing a motion visualization / animation table according to the first embodiment.
  • FIG. 6 is a flowchart illustrating a procedure of initial processing according to the first embodiment.
  • FIG. 7 is a diagram showing an example of a state transition diagram.
  • FIG. 8 is a diagram showing an example of the correspondence between movement locations of the robot and their position coordinates.
  • FIG. 9 is a block diagram showing components that perform work processing among the components of the robot control apparatus.
  • FIG. 10 is a block diagram showing a configuration of a robot control apparatus according to a second embodiment.
  • FIG. 11 is a diagram showing a motion visualization / animation table according to the second embodiment.
  • FIG. 12 is a block diagram showing a configuration of a robot control apparatus according to a third embodiment.
  • FIG. 13 is a diagram showing a gain table according to the third embodiment.
  • FIG. 14 is a diagram showing a motion visualization / animation table according to the third embodiment.
  • FIG. 1 is a block diagram showing a configuration of a robot control apparatus according to an embodiment of the present invention.
  • the robot control device of FIG. 1 includes a gesture detection sensor 1, a work support device 2, a programmable logic controller 3, a robot controller 4, and an operation display device 5.
  • the gesture detection sensor 1 and the work support device 2 are connected to the programmable logic controller 3, the robot controller 4, and the operation display device 5 through a network.
  • The robot control device is connected so as to be communicable with the robot 9 and controls the robot 9.
  • the robot 9 may be an arm type robot that can perform a plurality of operations by a plurality of servo motors, or may be another robot.
  • The gesture detection sensor 1 includes, for example, a non-contact distance sensor, and detects a plurality of coordinates based on the movement of the operator.
  • the work support device 2 recognizes the gesture of the worker based on the result (a plurality of coordinates) detected by the gesture detection sensor 1. Then, the work support device 2 determines an operation instruction for the robot 9 to perform an operation based on the recognized gesture.
  • the programmable logic controller 3 acquires the device state, which is the device state of the robot 9, from an area sensor or the like that detects the device state, and manages the execution of a program for controlling the operation of the robot 9.
  • the robot controller 4 determines information for controlling the operation of the robot 9 based on the operation instruction determined by the work support device 2 and outputs the information to a servo motor (not shown) of the robot 9.
  • Specifically, the robot controller 4 generates an operation command for operating the robot 9 (information for controlling the operation of the robot 9) on the basis of the operation instruction determined by the work support device 2, the device state acquired by the programmable logic controller 3, and the program managed by the programmable logic controller 3.
  • the robot controller 4 operates the robot 9 by passing the generated operation command to the robot 9.
  • the motion display device 5 displays a human animation that moves based on the motion instruction determined by the work support device 2 as the motion of the robot 9.
  • The human displayed by the animation includes, for example, a figure imitating a human, such as an avatar.
  • the animation displayed by the motion display device 5 is assumed to be an animation of an avatar moving, and this animation will be described as “avatar animation”.
  • the work support device 2 includes a human motion detection unit 2a, a gesture recognition unit 2b, a motion visualization mechanism unit 2c, a work instruction conversion unit 2d, and a motion display control unit 2e. Although not shown, the work support apparatus 2 also includes a storage unit that can store various tables.
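Purely as an illustration (the publication discloses no source code), the division of roles among the units 2a to 2d can be sketched as follows; every class, method, and value below is a hypothetical stand-in, not part of the patent.

```python
# Hypothetical sketch of how the roles of units 2a-2d could be wired together.
# None of these class or method names appear in the publication; they only
# mirror the division of responsibilities described above.

class HumanMotionDetection:              # unit 2a
    def extract_part(self, sensor_coords, part="hand"):
        # Pick the coordinates of a specific body part out of the point set
        # reported by the gesture detection sensor 1.
        return sensor_coords.get(part)

class GestureRecognition:                # unit 2b
    def recognize(self, prev_xyz, curr_xyz):
        # Classify the movement of the tracked part into a registered gesture
        # (threshold and gesture number are illustrative).
        return "g1" if curr_xyz[0] - prev_xyz[0] > 0.05 else None

class MotionVisualization:               # unit 2c
    def to_motion_number(self, gesture, state):
        # Map (transition state, gesture) to a motion visualization number.
        table = {("initial", "g1"): "a1"}
        return table.get((state, gesture))

class WorkInstructionConversion:         # unit 2d
    def to_coordinates(self, movement_location):
        # Resolve a movement location to position coordinates for the PLC 3.
        return {"1": (0.30, 0.10, 0.20)}.get(movement_location)
```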
  • the initial process is a process performed before the work starts, and is a process corresponding to, for example, a teaching process and a registration process.
  • the work process is a process in which work is performed by operating the robot 9 in accordance with the operation of the worker.
  • FIG. 3 is a block diagram showing components that perform initial processing among the components of the robot control apparatus. As shown in FIG. 3, the initial process is performed by the human motion detection unit 2a, the gesture recognition unit 2b, the motion visualization mechanism unit 2c, and the like shown in FIG. Hereinafter, components related to the initial process will be described with reference to FIG. 3 and the like.
  • The gesture detection sensor 1 detects coordinates on the worker from the worker's motion as needed (for example, periodically or irregularly).
  • The human motion detection unit 2a detects, as needed, the coordinates of a specific part (for example, a hand) of the worker's body based on the coordinates detected by the gesture detection sensor 1.
  • the gesture recognition unit 2b obtains the movement amount of the specific part of the worker based on the coordinates of the specific part detected at any time by the human motion detection unit 2a. During the initial processing, the gesture recognition unit 2b registers a combination of the specific part and the movement amount as a gesture. By using this registration, the gesture recognition unit 2b can recognize (specify) the gesture based on the coordinates detected by the gesture detection sensor 1 at any time. Different gesture numbers are assigned to different gestures.
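A minimal sketch of this register-then-recognize idea, assuming a gesture is characterized simply by a body part and an approximate displacement; the gesture numbers, displacement values, and tolerance below are illustrative assumptions, not taken from the publication.

```python
import math

# Registered gestures: gesture number -> (body part, reference displacement in metres).
# Values are purely illustrative.
registered = {
    "g1": ("hand", (0.20, 0.00, 0.00)),   # hand moves roughly 20 cm along +x
    "g2": ("hand", (0.00, 0.20, 0.00)),   # hand moves roughly 20 cm along +y
}

def recognize(part, prev_xyz, curr_xyz, tolerance=0.05):
    """Return the gesture number whose registered displacement best matches
    the observed movement of the tracked body part, or None if nothing matches."""
    observed = tuple(c - p for p, c in zip(prev_xyz, curr_xyz))
    best, best_err = None, tolerance
    for number, (reg_part, ref) in registered.items():
        if reg_part != part:
            continue
        err = math.dist(observed, ref)
        if err < best_err:
            best, best_err = number, err
    return best

print(recognize("hand", (0.0, 0.0, 0.0), (0.19, 0.01, 0.0)))  # -> g1
```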
  • FIG. 4 is a table showing an example of the result of association of the motion visualization mechanism unit 2c.
  • the motion visualization number is a number for specifying the motion of the robot 9 and corresponds to the motion instruction described above.
  • The motion visualization mechanism unit 2c registers the association result shown in FIG. 4 in the gesture / motion visualization table 2f.
  • the motion visualization mechanism unit 2c associates the motion visualization number, one of a plurality of avatar animations stored in the motion visualization library 2g, and the movement location.
  • For each motion visualization number, an avatar animation is associated that shows how an avatar having ordinary human physical characteristics behaves when normally performing the motion identified by that motion visualization number.
  • The movement location is a position related to the robot 9, such as the position of the tip of the hand attached to the robot 9, the position of the workpiece gripped by the hand attached to the robot 9, or the position of a joint of the robot 9.
  • FIG. 5 is a table showing an example of the result of association of the motion visualization mechanism unit 2c.
  • The motion visualization mechanism unit 2c registers the association result shown in FIG. 5 in the motion visualization / animation table 2h.
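As an illustration of the shape of these two tables (the column meanings follow FIGS. 4 and 5 as described above, while the concrete gesture numbers, animation names, and locations are invented):

```python
# Gesture / motion visualization table 2f:
# gesture number -> motion visualization number (illustrative entries).
gesture_to_motion = {"g1": "a1", "g2": "a11"}

# Motion visualization / animation table 2h:
# motion visualization number -> (avatar animation, movement location).
motion_to_animation = {
    "a1":  ("avatar_move_0_to_1.anim", "1"),
    "a11": ("avatar_move_0_to_2.anim", "2"),
}

def lookup(gesture):
    """Resolve a recognized gesture to (motion number, avatar animation, movement location)."""
    motion = gesture_to_motion[gesture]
    animation, location = motion_to_animation[motion]
    return motion, animation, location

print(lookup("g1"))  # -> ('a1', 'avatar_move_0_to_1.anim', '1')
```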
  • FIG. 6 is a flowchart showing a procedure of initial processing.
  • In step S1, a device expert such as a skilled technician creates a state transition diagram that defines the operation sequence of the robot 9, and registers it in the robot controller.
  • FIG. 7 is a diagram illustrating an example of a state transition diagram. In FIG. 7, a1 to a19 indicate motion visualization numbers. The transition state at the start of the robot 9 is an “initial state”, and the transition state of the robot 9 changes according to the gesture.
  • In step S2 of FIG. 6, the motion visualization mechanism unit 2c associates the gesture number of each gesture with a motion visualization number.
  • For example, a device expert performs an input operation to fill in the table of FIG. 4 based on the state transition diagram shown in FIG. 7, and the motion visualization mechanism unit 2c performs the association based on the input operation.
  • For example, if an input operation is performed so that the robot 9 in the initial state performs the motion of motion visualization number a1 when the gesture g1 is recognized, the gesture g1 is associated with the motion visualization number a1.
  • Similarly, if an input operation is performed so that the robot 9 in the initial state performs the motion of motion visualization number a11 when the gesture g2 is recognized, the gesture g2 is associated with the motion visualization number a11.
  • By this step S2, the association result shown in FIG. 4 is registered in the gesture / motion visualization table 2f.
  • In step S3 of FIG. 6, the motion visualization mechanism unit 2c associates the motion visualization number, the avatar animation, and the movement location indicating a position of the robot 9.
  • a device expert may perform an input operation for filling the table of FIG. 5 based on the state transition diagram as shown in FIG. 7, and the motion visualization mechanism unit 2c may perform the association based on the input operation.
  • Alternatively, the motion visualization mechanism unit 2c may perform the association by analyzing the state transition diagram shown in FIG. 7. By this step S3, the association result shown in FIG. 5 is registered in the motion visualization / animation table 2h.
  • In step S4 of FIG. 6, the device expert creates a position information management table in which each movement location in FIG. 5 is associated with the position coordinates of that movement location, and registers it in the robot controller.
  • FIG. 8 is a diagram showing an example of the location information management table.
  • For example, the places to which the robot 9 can move are scanned, and the position coordinates of the robot 9 at each place are acquired.
  • The acquired position coordinates and the movement locations in FIG. 5 are associated with each other and registered in the position information management table of FIG. 8.
  • The position coordinates of the robot 9 include, for example, the position coordinates of the tip of the hand attached to the robot 9, the position coordinates of the workpiece gripped by the hand attached to the robot 9, and the position coordinates of the joints of the robot 9.
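A sketch of the position information management table of FIG. 8, assuming it simply maps each movement location to Cartesian position coordinates; the coordinate values and location meanings below are illustrative assumptions.

```python
# Position information management table 2i (FIG. 8, illustrative values).
# movement location -> (x, y, z) position coordinates in the robot's frame.
position_table = {
    "0": (0.00, 0.00, 0.40),   # e.g. home position of the hand tip
    "1": (0.30, 0.10, 0.20),   # e.g. above the workpiece supply tray
    "2": (0.30, -0.10, 0.20),  # e.g. above the assembly jig
}

def coordinates_for(movement_location):
    """Return the position coordinates registered for a movement location."""
    return position_table[movement_location]

print(coordinates_for("1"))  # -> (0.3, 0.1, 0.2)
```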
  • FIG. 9 is a block diagram illustrating components that perform work processing among the components of the robot control device. As shown in FIG. 9, the work process is performed by all the components shown in FIG. Hereinafter, the components related to the work process will be described with reference to FIG.
  • Based on the coordinates of the person detected by the gesture detection sensor 1 as needed (for example, periodically or irregularly), the human motion detection unit 2a detects the coordinates of a specific part of the person's body as needed (for example, periodically or irregularly).
  • the gesture recognition unit 2b obtains the movement amount of the specific part of the worker based on the coordinates of the specific part detected at any time by the human motion detection unit 2a, and recognizes the gesture based on the obtained movement amount of the specific part.
  • Based on the gesture recognized by the gesture recognition unit 2b and the transition state of the robot 9, the motion visualization mechanism unit 2c identifies a single motion visualization number from the state transition diagram shown in FIG. 7 and the table shown in FIG. 4. Then, based on the identified motion visualization number, the motion visualization mechanism unit 2c identifies one avatar animation and one movement location from the table shown in FIG. 5.
  • For example, when the gesture g1 is recognized while the robot 9 is in the initial state, the motion visualization mechanism unit 2c identifies, from the table of FIG. 4, the motion visualization number a1 associated with the gesture g1.
  • Then, the motion visualization mechanism unit 2c identifies, from the table of FIG. 5, the avatar animation and the movement location "1" associated with the motion visualization number a1. Thereby, for example, an animation in which the hand of the avatar moves from the movement location "0" to the movement location "1" is identified as the avatar animation associated with the motion visualization number a1.
  • Similarly, when the gesture g5 is recognized, the motion visualization mechanism unit 2c identifies, from the table of FIG. 4, the motion visualization number a11 associated with the gesture g5.
  • Then, the motion visualization mechanism unit 2c identifies, from the table of FIG. 5, the avatar animation and the movement location "2" associated with the motion visualization number a11. Thereby, for example, an animation in which the hand of the avatar moves from the movement location "0" to the movement location "2" is identified as the avatar animation associated with the motion visualization number a11.
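A sketch of how the current transition state and the recognized gesture could together select a single motion visualization number, assuming the state transition diagram of FIG. 7 is held as a lookup keyed by (state, gesture); the state names are invented, and the subsequent animation/location lookup then proceeds as in the table sketch given earlier.

```python
# (transition state, gesture) -> (motion visualization number, next transition state).
# Only two illustrative branches are shown; the real diagram of FIG. 7 covers a1 to a19.
state_transitions = {
    ("initial", "g1"): ("a1", "state_after_a1"),
    ("initial", "g5"): ("a11", "state_after_a11"),
}

def resolve(state, gesture):
    """Identify the single motion visualization number for the current state and gesture,
    and return it together with the next transition state."""
    motion_number, next_state = state_transitions[(state, gesture)]
    return motion_number, next_state

print(resolve("initial", "g1"))   # -> ('a1', 'state_after_a1')
```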
  • The work instruction conversion unit 2d specifies the position coordinates of the movement location from the table of FIG. 8 registered in the position information management table 2i, based on the single movement location identified by the motion visualization mechanism unit 2c. Then, the work instruction conversion unit 2d passes the specified position coordinates to the programmable logic controller 3.
  • the programmable logic controller 3 includes a sensor input unit 3a and a program execution management unit 3b.
  • the sensor input unit 3a acquires a device state.
  • the device state is, for example, position information detected by an encoder of the robot 9, information on the arm position of the robot 9, information on the presence / absence of a workpiece, and the like.
  • The program execution management unit 3b manages a program for the robot controller 4 to control the operation of the robot 9.
  • the programmable logic controller 3 configured as described above has a sensor input function for acquiring a device state and a program execution management function of the robot controller 4. Further, when the programmable logic controller 3 receives the position coordinates from the work instruction conversion unit 2d, the programmable logic controller 3 notifies the robot controller 4 of the position coordinates and the execution start of the program.
  • the robot controller 4 includes a robot command generation unit 4a and a state acquisition unit 4b.
  • The robot command generation unit 4a generates an operation command for the robot 9 from the position coordinates received from the programmable logic controller 3 (substantially, the position coordinates from the work support device 2), so that the side supplying those coordinates does not need to be conscious of the physicality of the robot 9.
  • Specifically, the robot command generation unit 4a calculates the movement amount of the specific part of the robot 9 based on the position coordinates from the programmable logic controller 3 and the current position coordinates of the robot 9 managed by the robot command generation unit 4a. Then, the robot command generation unit 4a calculates the movement amount of each axis of the robot 9 based on the calculated movement amount, and passes an operation command including these movement amounts to the robot 9. Thereby, the specific part of the robot 9 moves to the target position.
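A minimal sketch of this command-generation step, assuming the Cartesian movement amount is converted into per-axis movement amounts by some robot-specific inverse-kinematics routine; that routine is only stubbed here, and all names are illustrative.

```python
from typing import List, Sequence

def cartesian_delta(target: Sequence[float], current: Sequence[float]) -> List[float]:
    """Movement amount of the specific part: target position minus current position."""
    return [t - c for t, c in zip(target, current)]

def inverse_kinematics_stub(delta: Sequence[float]) -> List[float]:
    """Placeholder for the robot-specific conversion of a Cartesian movement
    into per-axis movement amounts (e.g. joint-angle increments).
    A real implementation depends on the robot's kinematic model."""
    return [0.0] * 6

def make_operation_command(target: Sequence[float], current: Sequence[float]) -> List[float]:
    """Build the operation command handed to the robot: per-axis movement amounts."""
    return inverse_kinematics_stub(cartesian_delta(target, current))

cmd = make_operation_command(target=(0.30, 0.10, 0.20), current=(0.00, 0.00, 0.40))
```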
  • the state acquisition unit 4b acquires a camera image acquired by a sensor such as a vision sensor and an image sensor attached to the robot as a robot state, and notifies the robot command generation unit 4a of the robot state.
  • When an object is to be gripped, the robot command generation unit 4a controls the robot 9 so as to approach the object and grip it, based on the robot state such as the camera image notified from the state acquisition unit 4b. When the object is to be released, the robot command generation unit 4a controls the robot 9 so as to move to the target position and release the hand, based on the robot state such as the camera image notified from the state acquisition unit 4b.
  • the programmable logic controller 3 passes the device state acquired by the sensor input unit 3a to the robot command generation unit 4a.
  • The robot command generation unit 4a provides feedback based on the device state so as to correct the movement amount of each axis in the robot controller 4 and the robot 9.
  • When the programmable logic controller 3 detects, as device states, the completion of movement of the robot arm and the completion of movement of the workpiece, it notifies the work support device 2 of a program state indicating that execution of the program is complete.
  • When the motion visualization mechanism unit 2c receives, as the program state from the programmable logic controller 3, the notification that program execution is complete, it passes the identified avatar animation to the motion display control unit 2e.
  • the motion display control unit 2e causes the motion display device 5 to display the avatar animation.
  • As described above, in the first embodiment, the operation instruction is determined based on the gesture of the worker, and the operation of the robot 9 is controlled based on the determined operation instruction.
  • A moving human animation is also displayed based on the determined operation instruction.
  • In other words, the operation of the robot 9 is displayed as a human animation. For this reason, even an operator who has no knowledge of the robot 9, such as its control commands, movable range, or physicality, can easily operate the robot 9 by viewing the display of the human animation.
  • the work support apparatus 2 is connected to the programmable logic controller 3 via a network. Thereby, the operator can perform remote operation of the robot 9.
  • FIG. 10 is a block diagram showing the configuration of the robot control apparatus according to the second embodiment of the present invention.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • In the second embodiment, a work state conversion unit 2j is provided in the work support device 2.
  • the state acquisition unit 4b of the robot controller 4 notifies the robot command generation unit 4a and the work support device 2 of the robot state.
  • the work instruction conversion unit 2d generates a work state including the robot state from the state acquisition unit 4b and the program state from the programmable logic controller 3.
  • the work instruction conversion unit 2d passes the generated work state to the motion visualization mechanism unit 2c, and the motion visualization mechanism unit 2c passes the work state to the work state conversion unit 2j.
  • FIG. 11 is a diagram showing an operation visualization / animation table 2h according to the second embodiment.
  • In the motion visualization / animation table 2h of FIG. 11, columns for the sound number, vibration number, start sound, end sound, start vibration, and end vibration are added.
  • In the sound number column, a file name identifying a sound file is set (denoted 1 and 2 in FIG. 11 for convenience).
  • In the vibration number column, a vibration frequency is set (denoted 1 and 2 in FIG. 11 for convenience). The sound number and the vibration number are set, for example, by the operator during the gesture association processing of the initial processing.
  • When a sound is to be output at the start of a motion, the start sound flag of that motion visualization number in FIG. 11 is set to ON (in FIG. 11, ○ indicates ON).
  • When a sound is to be output at the end of a motion, the end sound flag of that motion visualization number in FIG. 11 is set to ON (in FIG. 11, ○ indicates ON).
  • When no sound is to be output, ON is set for neither the start sound nor the end sound of that motion visualization number in FIG. 11.
  • Likewise, when a vibration is to be output at the start of a motion, the start vibration flag of that motion visualization number in FIG. 11 is set to ON (in FIG. 11, ○ indicates ON).
  • When a vibration is to be output at the end of a motion, the end vibration flag of that motion visualization number in FIG. 11 is set to ON (in FIG. 11, ○ indicates ON).
  • When no vibration is to be output, ON is set for neither the start vibration nor the end vibration of that motion visualization number in FIG. 11.
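One way to picture a row of the extended table of FIG. 11 is as a record with optional sound/vibration fields and boolean start/end flags; the field names and example values below are assumptions, not taken from the publication.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MotionRow:
    """One row of the motion visualization / animation table 2h as extended in FIG. 11."""
    motion_number: str
    animation: str
    movement_location: str
    sound_file: Optional[str] = None      # sound number column (file name), if any
    vibration_hz: Optional[float] = None  # vibration number column (frequency), if any
    start_sound: bool = False             # ON when a sound is output at motion start
    end_sound: bool = False               # ON when a sound is output at motion end
    start_vibration: bool = False         # ON when a vibration is output at motion start
    end_vibration: bool = False           # ON when a vibration is output at motion end

row = MotionRow("a1", "avatar_move_0_to_1.anim", "1",
                sound_file="chime1.wav", end_sound=True)
```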
  • When the motion visualization mechanism unit 2c identifies one motion visualization number and a file name is set in the sound number corresponding to that motion visualization number, the sound file identified by the file name is passed to the work state conversion unit 2j. Similarly, when the motion visualization mechanism unit 2c identifies one motion visualization number and a vibration frequency is set in the vibration number corresponding to that motion visualization number, the vibration frequency is passed to the work state conversion unit 2j.
  • the output device 6 includes a speaker 6a that notifies (outputs) sound to the worker and a vibration device 6b that notifies (outputs) vibration to the worker.
  • the work state conversion unit 2j controls the output device 6 based on the work state, sound file, and vibration frequency from the motion visualization mechanism unit 2c.
  • When the work state including the program state from the programmable logic controller 3 indicates that execution of the program is complete and the end sound flag of the motion visualization number that has just been executed is ON, the work state conversion unit 2j passes the sound file specified by the motion visualization mechanism unit 2c to the speaker 6a. Similarly, when the work state including the program state from the programmable logic controller 3 indicates that execution of the program is complete and the end vibration flag of the motion visualization number that has just been executed is ON, the work state conversion unit 2j passes the vibration frequency specified by the motion visualization mechanism unit 2c to the vibration device 6b.
  • In other words, when the robot 9 completes an operation based on a specific motion visualization number (operation instruction), the work state conversion unit 2j configured as described above outputs to the speaker 6a the sound associated in advance with the completion of that motion in the table. Likewise, when the robot 9 completes the operation based on the specific motion visualization number (operation instruction), the work state conversion unit 2j outputs to the vibration device 6b the vibration associated in advance with the completion of that motion in the table.
  • As in the first embodiment, the motion visualization mechanism unit 2c identifies one motion visualization number from the state transition diagram shown in FIG. 7 and the association shown in FIG. 4.
  • The motion visualization mechanism unit 2c then identifies one avatar animation and one movement location from the table of FIG. 11 and, when a file name is set in the corresponding sound number, specifies the sound file of that name and passes it to the work state conversion unit 2j.
  • the work state conversion unit 2j passes the sound file specified by the motion visualization mechanism unit 2c to the speaker 6a.
  • Similarly, the motion visualization mechanism unit 2c identifies one avatar animation and one movement location from the table of FIG. 11 and, when a vibration frequency is set in the corresponding vibration number, specifies that frequency and passes it to the work state conversion unit 2j.
  • the work state conversion unit 2j passes the vibration frequency specified by the motion visualization mechanism unit 2c to the vibration device 6b.
  • In other words, when the robot 9 starts an operation based on a specific motion visualization number (operation instruction), the work state conversion unit 2j configured as described above outputs to the speaker 6a the sound associated in advance with the start of that motion in the table. Likewise, when the robot 9 starts an operation based on a specific motion visualization number (operation instruction), the work state conversion unit 2j outputs to the vibration device 6b the vibration associated in advance with the start of that motion in the table.
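The resulting dispatch in the work state conversion unit 2j could then be sketched as follows, where `speaker` and `vibrator` stand in for the speaker 6a and the vibration device 6b; their `play`/`vibrate` interfaces are assumptions made only for this sketch.

```python
def notify(row, event, speaker, vibrator):
    """Forward the sound / vibration registered for a motion start or completion event.

    `row` is a MotionRow-like record (see the previous sketch); `event` is
    "start" or "end"; `speaker` and `vibrator` stand in for devices 6a and 6b.
    """
    if event == "start":
        if row.start_sound and row.sound_file:
            speaker.play(row.sound_file)          # start sound flag is ON
        if row.start_vibration and row.vibration_hz:
            vibrator.vibrate(row.vibration_hz)    # start vibration flag is ON
    elif event == "end":
        if row.end_sound and row.sound_file:
            speaker.play(row.sound_file)          # end sound flag is ON
        if row.end_vibration and row.vibration_hz:
            vibrator.vibrate(row.vibration_hz)    # end vibration flag is ON
```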
  • FIG. 12 is a block diagram showing the configuration of the robot control apparatus according to Embodiment 3 of the present invention.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • In the third embodiment, the gesture recognition unit 2b recognizes not only a normal gesture, which is a specific first gesture used to determine a motion visualization number (operation instruction), but also a switching gesture, which is a specific second gesture different from the first gesture, and a manual gesture, which is a third gesture indicating movement of a specific part of the worker in a space of predetermined coordinates.
  • the predetermined coordinates are orthogonal coordinates defined by the x-axis, y-axis, and z-axis, and the specific part of the operator is a human hand.
  • the work support device 2 defines a normal mode that is the first mode and a manual mode that is the second mode. In the normal mode, the work support apparatus 2 determines an operation instruction based on the normal gesture as in the first and second embodiments. In the manual mode, the work support device 2 determines the movement amount for the robot 9 to perform movement based on the movement of the hand of the manual gesture as the operation instruction.
  • FIG. 13 is a diagram showing a gain table 2k according to the third embodiment.
  • gxa, gya, and gza are proportional gains of the x-axis, y-axis, and z-axis, respectively.
  • gxb, gyb, and gzb are integral gains of the x-axis, y-axis, and z-axis, respectively.
  • The gesture recognition unit 2b obtains the movement amount of the robot 9 using the following equation (1), based on the movement indicated by the manual gesture, a preset proportional gain, and a preset integral gain.
  • In equation (1), the previous x coordinate and the current x coordinate of the body part are the coordinates indicating the x-axis movement indicated by the manual gesture, and the hand coordinate x corresponds to the movement amount of the robot 9. The y coordinate and the z coordinate are handled in the same way as the x coordinate.
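Equation (1) itself is not reproduced in this excerpt. As a labeled assumption only, a PI-style reading of the description (a per-axis proportional gain applied to the latest displacement of the body part, plus an integral gain applied to the accumulated displacement) might look like the following sketch; the gain values are illustrative.

```python
# Illustrative PI-style reading of equation (1); the actual equation in the
# publication is not reproduced here, so treat this only as an assumption.
class ManualAxis:
    def __init__(self, proportional_gain, integral_gain):
        self.kp = proportional_gain   # e.g. gxa for the x axis (gain table 2k)
        self.ki = integral_gain       # e.g. gxb for the x axis (gain table 2k)
        self.accumulated = 0.0        # running sum of past displacements

    def movement(self, previous_coord, current_coord):
        """Movement amount of the robot along this axis for one manual-gesture step."""
        delta = current_coord - previous_coord
        self.accumulated += delta
        return self.kp * delta + self.ki * self.accumulated

x_axis = ManualAxis(proportional_gain=1.5, integral_gain=0.1)
print(x_axis.movement(previous_coord=0.00, current_coord=0.02))
```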
  • the gesture recognition unit 2b notifies the obtained movement amount to the work instruction conversion unit 2d.
  • the work instruction conversion unit 2d notifies the programmable logic controller 3 of the notified movement amount.
  • the programmable logic controller 3 notifies the robot controller 4 of the notified movement amount.
  • When the work support device 2 is in the manual mode, the robot controller 4 performs control for moving the robot 9 based on the notified movement amount.
  • FIG. 14 is a diagram showing an operation visualization / animation table 2h according to the third embodiment.
  • In FIG. 14, "manual mode" is set as the movement location of the motion visualization number "14", and "normal mode" is set as the movement location of the motion visualization number "16".
  • The gestures corresponding to the motion visualization numbers "1" to "13", "15", and "17" to "19" correspond to normal gestures, while the gestures corresponding to the motion visualization numbers "14" and "16" correspond to switching gestures.
  • When the motion visualization number "14" is identified based on the gesture recognized by the gesture recognition unit 2b and the transition state of the robot 9, the motion visualization mechanism unit 2c notifies the gesture recognition unit 2b of the "manual mode". On the other hand, when the motion visualization number "16" is identified based on the gesture recognized by the gesture recognition unit 2b and the transition state of the robot 9, the motion visualization mechanism unit 2c notifies the gesture recognition unit 2b of the "normal mode". In this way, when the work support device 2 recognizes the switching gesture, it switches from the normal mode to the manual mode or from the manual mode to the normal mode.
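A sketch of this mode switching, assuming the motion visualization numbers "14" and "16" are reserved for entering and leaving the manual mode as in FIG. 14; the function and constant names are illustrative.

```python
NORMAL, MANUAL = "normal", "manual"

def next_mode(current_mode, motion_number):
    """Return the work support device's mode after a motion visualization number is identified."""
    if motion_number == "14":   # switching gesture: enter the manual mode
        return MANUAL
    if motion_number == "16":   # switching gesture: return to the normal mode
        return NORMAL
    return current_mode         # normal gestures leave the mode unchanged

mode = next_mode(NORMAL, "14")   # -> "manual"
```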
  • As described above, in the third embodiment, the work support device 2 performs the manual mode instead of the normal mode when the switching gesture is recognized. According to such a configuration, the operator can switch, by performing a switching gesture, to the manual mode, which can deal with fine work.
  • the work movement amount of the robot 9 can be optimized.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The purpose of the present invention is to provide a technology with which an operator can easily operate a robot. This robot control device for controlling a robot (9) comprises: a work support device (2) that recognizes a gesture of an operator and, on the basis of said gesture, determines operation instructions for the robot (9) to perform an operation; a robot controller (4) that controls the operation of the robot (9) on the basis of the operation instructions determined by the work support device (2); and an operation display device (5) that displays an animation of a human that moves on the basis of the operation instructions determined by the work support device (2).
PCT/JP2018/011704 2018-03-23 2018-03-23 Dispositif de commande de robot WO2019180916A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/011704 WO2019180916A1 (fr) 2018-03-23 2018-03-23 Dispositif de commande de robot
JP2019510378A JP6625266B1 (ja) 2018-03-23 2018-03-23 ロボット制御装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/011704 WO2019180916A1 (fr) 2018-03-23 2018-03-23 Dispositif de commande de robot

Publications (1)

Publication Number Publication Date
WO2019180916A1 true WO2019180916A1 (fr) 2019-09-26

Family

ID=67987033

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/011704 WO2019180916A1 (fr) 2018-03-23 2018-03-23 Dispositif de commande de robot

Country Status (2)

Country Link
JP (1) JP6625266B1 (fr)
WO (1) WO2019180916A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022069924A (ja) * 2020-10-26 2022-05-12 三菱電機株式会社 遠隔操作システム
DE112021004343T5 (de) 2020-08-20 2023-05-25 Fanuc Corporation Robotersteuersystem

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4265377A4 (fr) 2021-06-28 2024-07-10 Samsung Electronics Co Ltd Robot et son procédé de commande

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0639754A (ja) * 1992-07-27 1994-02-15 Nippon Telegr & Teleph Corp <Ntt> ロボットハンド制御装置
JP4463120B2 (ja) * 2005-01-17 2010-05-12 独立行政法人理化学研究所 身まねロボットシステムとその身まね動作制御方法
JP2011110620A (ja) * 2009-11-24 2011-06-09 Toyota Industries Corp ロボットの動作を制御する方法およびロボットシステム
JP2014104527A (ja) * 2012-11-27 2014-06-09 Seiko Epson Corp ロボットシステム、プログラム、生産システム及びロボット

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112021004343T5 (de) 2020-08-20 2023-05-25 Fanuc Corporation Robotersteuersystem
JP2022069924A (ja) * 2020-10-26 2022-05-12 三菱電機株式会社 遠隔操作システム
JP7365991B2 (ja) 2020-10-26 2023-10-20 三菱電機株式会社 遠隔操作システム

Also Published As

Publication number Publication date
JPWO2019180916A1 (ja) 2020-04-30
JP6625266B1 (ja) 2019-12-25

Similar Documents

Publication Publication Date Title
JP5512048B2 (ja) ロボットアームの制御装置及び制御方法、ロボット、制御プログラム、並びに、集積電子回路
JP3529373B2 (ja) 作業機械のシミュレーション装置
US9387589B2 (en) Visual debugging of robotic tasks
KR100762380B1 (ko) 로봇 위치 교시를 위한 이동 제어 장치, 로봇의 위치 교시장치, 로봇 위치 교시를 위한 이동 제어 방법, 로봇의 위치교시 방법 및 로봇 위치 교시를 위한 이동 제어 프로그램
KR102042115B1 (ko) 로봇의 동작프로그램 생성방법 및 로봇의 동작프로그램 생성장치
JP6445092B2 (ja) ロボットの教示のための情報を表示するロボットシステム
WO2019180916A1 (fr) Dispositif de commande de robot
JP6863927B2 (ja) ロボットのシミュレーション装置
JP7117237B2 (ja) ロボット制御装置、ロボットシステム及びロボット制御方法
JP7049069B2 (ja) ロボットシステム及びロボットシステムの制御方法
JP2018167334A (ja) 教示装置および教示方法
KR20170016436A (ko) 작업 로봇의 교시 데이터 생성 장치 및 교시 데이터 생성 방법
JP2014065100A (ja) ロボットシステム、及びロボットのティーチング方法
JP2018015863A (ja) ロボットシステム、教示データ生成システム及び教示データ生成方法
Miądlicki et al. Real-time gesture control of a CNC machine tool with the use Microsoft Kinect sensor
JP2018069361A (ja) 力制御座標軸設定装置、ロボットおよび力制御座標軸設定方法
JP2023024890A (ja) 直接教示操作を受け付け可能な制御装置、教示装置、および制御装置のコンピュータプログラム
JPH06250730A (ja) 産業用ロボットの教示装置
JP2009196040A (ja) ロボットシステム
JP2022163836A (ja) ロボット画像の表示方法、コンピュータープログラム、及び、ロボット画像の表示システム
JPH1177568A (ja) 教示支援方法及び装置
US20220226982A1 (en) Method Of Creating Control Program For Robot, System Executing Processing Of Creating Control Program For Robot, And Non-Transitory Computer-Readable Storage Medium
WO2023073959A1 (fr) Dispositif d'aide au travail et procédé d'aide au travail
US20240066694A1 (en) Robot control system, robot control method, and robot control program
WO2022269927A1 (fr) Dispositif de création de programme

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019510378

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18911170

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18911170

Country of ref document: EP

Kind code of ref document: A1