US20230001567A1 - Teaching Support Device - Google Patents

Teaching Support Device

Info

Publication number
US20230001567A1
Authority
US
United States
Prior art keywords
teaching
force
polishing
section
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/855,944
Other languages
English (en)
Inventor
Yuma SHIMURA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seiko Epson Corp
Original Assignee
Seiko Epson Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seiko Epson Corp filed Critical Seiko Epson Corp
Assigned to SEIKO EPSON CORPORATION reassignment SEIKO EPSON CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHIMURA, YUMA
Publication of US20230001567A1 publication Critical patent/US20230001567A1/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0081Programme-controlled manipulators with master teach-in means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B24GRINDING; POLISHING
    • B24BMACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B51/00Arrangements for automatic control of a series of individual steps in grinding a workpiece
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/005Manipulators for mechanical processing tasks
    • B25J11/0065Polishing or grinding
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/1605Simulation of manipulator lay-out, design, modelling of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1633Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1689Teleoperation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36162Pendant control box
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39438Direct programming at the console
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39443Portable, adapted to handpalm, with joystick, function keys, display
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45096Polishing manipulator

Definitions

  • the present disclosure relates to a teaching support device.
  • a teaching support device configured to perform teaching to a robot which has a robot arm with a polishing tool attached to its tip, and which controls the robot arm with force control to perform a polishing task on an object
  • the teaching support device includes a teaching point acquisition section configured to obtain information related to a plurality of teaching points set on the object, a polishing parameter acquisition section configured to obtain information related to a polishing parameter of the polishing task at the plurality of teaching points obtained by the teaching point acquisition section, and a display control section configured to display a teaching point out of the plurality of teaching points in a color based on the polishing parameter obtained by the polishing parameter acquisition section so as to overlap the object.
  • FIG. 1 is a diagram showing an overall configuration of a robotic system equipped with a teaching support device according to the present disclosure.
  • FIG. 2 is a block diagram of the robotic system shown in FIG. 1 .
  • FIG. 3 is a diagram showing an example of a screen displayed on a display section provided to the robotic system shown in FIG. 1 .
  • FIG. 4 is a diagram showing an example of a screen displayed on the display section provided to the robotic system shown in FIG. 1 .
  • FIG. 5 is a diagram showing an example of a setup screen displayed on the display section provided to the robotic system shown in FIG. 1 .
  • FIG. 6 is a diagram showing an example of a screen displayed on the display section provided to the robotic system shown in FIG. 1 .
  • FIG. 7 is a diagram showing an example of a screen displayed on the display section provided to the robotic system shown in FIG. 1 .
  • FIG. 8 is a block diagram for explaining a robotic system with a focus on hardware.
  • FIG. 9 is a block diagram showing Modified Example 1 with a focus on hardware of a robotic system.
  • FIG. 10 is a block diagram showing Modified Example 2 with a focus on hardware of a robotic system.
  • in the following description, a +Z-axis direction, namely an upper side in FIG. 1 , is referred to as “upper,” and a −Z-axis direction, namely a lower side thereof, is referred to as “lower.”
  • regarding the robot arm, the platform 11 side in FIG. 1 is referred to as a “base end,” and an opposite side, namely an end effector side, is referred to as a “tip.”
  • the Z-axis direction, namely an up-down direction in FIG. 1 , is defined as a “vertical direction,” and the X-axis direction and the Y-axis direction, namely a right-left direction and a front-back direction, are defined as a “horizontal direction.”
  • the robotic system 100 is provided with a robot 1 , a control device 3 for controlling the robot 1 , and a teaching device 4 . Further, the teaching device 4 incorporates a teaching support device 10 A.
  • the robot 1 shown in FIG. 1 is a single-arm six-axis vertical articulated robot in the present embodiment, and has a platform 11 and a robot arm 10 . Further, it is possible to mount an end effector 20 on a tip portion of the robot arm 10 .
  • the end effector 20 may or may not be a constituent element of the robot 1 .
  • the robot 1 is not limited to the illustrated configuration, and can be, for example, a double-arm articulated robot. Further, the robot 1 can be a horizontal articulated robot.
  • the platform 11 is a support body for supporting the robot arm 10 from a lower side so as to be able to drive the robot arm 10 , and is fixed to, for example, a floor in a factory.
  • the platform 11 is electrically coupled to the control device 3 via a relay cable 18 .
  • the coupling between the robot 1 and the control device 3 is not limited to the coupling with wire as in the configuration shown in FIG. 1 , but can be, for example, coupling without wire, or can also be connection via a network such as the Internet.
  • the robot arm 10 has a first arm 12 , a second arm 13 , a third arm 14 , a fourth arm 15 , a fifth arm 16 , and a sixth arm 17 , wherein these arms are coupled to one another in this order from the platform 11 side.
  • the number of the arms provided to the robot arm 10 is not limited to six, and can be, for example, one, two, three, four, five, or seven or more. Further, a size such as a total length of each of the arms is not particularly limited, and can arbitrarily be set.
  • the platform 11 and the first arm 12 are coupled to each other via a joint 171 . Further, the first arm 12 is arranged to be able to rotate around a first rotational axis parallel to the vertical direction with respect to the platform 11 taking the first rotational axis as a rotational center. The first rotational axis coincides with a normal line of the floor to which the platform 11 is fixed.
  • the first arm 12 and the second arm 13 are coupled to each other via a joint 172 . Further, the second arm 13 is arranged to be able to rotate with respect to the first arm 12 taking a second rotational axis parallel to the horizontal direction as a rotational center.
  • the second rotational axis is parallel to an axis perpendicular to the first rotational axis.
  • the second arm 13 and the third arm 14 are coupled to each other via a joint 173 . Further, the third arm 14 is arranged to be able to rotate with respect to the second arm 13 taking a third rotational axis parallel to the horizontal direction as a rotational center. The third rotational axis is parallel to the second rotational axis.
  • the third arm 14 and the fourth arm 15 are coupled to each other via a joint 174 . Further, the fourth arm 15 is arranged to be able to rotate with respect to the third arm 14 taking a fourth rotational axis parallel to a central axis of the third arm 14 as a rotational center. The fourth rotational axis is perpendicular to the third rotational axis.
  • the fourth arm 15 and the fifth arm 16 are coupled to each other via a joint 175 . Further, the fifth arm 16 is arranged to be able to rotate with respect to the fourth arm 15 taking a fifth rotational axis as a rotational center. The fifth rotational axis is perpendicular to the fourth rotational axis.
  • the fifth arm 16 and the sixth arm 17 are coupled to each other via a joint 176 . Further, the sixth arm 17 is arranged to be able to rotate with respect to the fifth arm 16 taking a sixth rotational axis as a rotational center. The sixth rotational axis is perpendicular to the fifth rotational axis.
  • the sixth arm 17 forms a robot tip portion located farthest on the tip side of the robot arm 10 .
  • the sixth arm 17 can rotate together with the end effector 20 due to the drive of the robot arm 10 .
  • the robot 1 is provided with a motor M 1 , a motor M 2 , a motor M 3 , a motor M 4 , a motor M 5 , and a motor M 6 as a drive section, an encoder E 1 , an encoder E 2 , an encoder E 3 , an encoder E 4 , an encoder E 5 , and an encoder E 6 .
  • the motor M 1 is incorporated in the joint 171 , and rotates the platform 11 and the first arm 12 relatively to each other.
  • the motor M 2 is incorporated in the joint 172 , and rotates the first arm 12 and the second arm 13 relatively to each other.
  • the motor M 3 is incorporated in the joint 173 , and rotates the second arm 13 and the third arm 14 relatively to each other.
  • the motor M 4 is incorporated in the joint 174 , and rotates the third arm 14 and the fourth arm 15 relatively to each other.
  • the motor M 5 is incorporated in the joint 175 , and rotates the fourth arm 15 and the fifth arm 16 relatively to each other.
  • the motor M 6 is incorporated in the joint 176 , and rotates the fifth arm 16 and the sixth arm 17 relatively to each other.
  • the encoder E 1 is incorporated in the joint 171 , and detects a position of the motor M 1 .
  • the encoder E 2 is incorporated in the joint 172 , and detects a position of the motor M 2 .
  • the encoder E 3 is incorporated in the joint 173 , and detects a position of the motor M 3 .
  • the encoder E 4 is incorporated in the joint 174 , and detects a position of the motor M 4 .
  • the encoder E 5 is incorporated in the joint 175 , and detects a position of the motor M 5 .
  • the encoder E 6 is incorporated in the joint 176 , and detects a position of the motor M 6 .
  • the encoders E 1 through the encoder E 6 are electrically coupled to the control device 3 , and positional information, namely an amount of rotation, of each of the motor M 1 through the motor M 6 is transmitted to the control device 3 as an electric signal. Further, based on this information, the control device 3 drives the motor M 1 through the motor M 6 via a driver not shown. In other words, controlling the robot arm 10 is controlling the motor M 1 through the motor M 6 .
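  • purely for illustration (not part of the patent), the statement that controlling the robot arm 10 amounts to controlling the motor M 1 through the motor M 6 can be pictured as a per-joint position loop; the Encoder/Motor interfaces and the gain below are assumptions:

```python
# Illustrative sketch only: Encoder.read_angle() and Motor.set_velocity()
# are assumed interfaces, not APIs described in the patent.
def joint_position_step(encoders, motors, target_angles, kp=2.0):
    """One control cycle: drive each of M1..M6 toward its target joint angle
    using the position reported by the corresponding encoder E1..E6."""
    for encoder, motor, target in zip(encoders, motors, target_angles):
        error = target - encoder.read_angle()   # joint-space position error [rad]
        motor.set_velocity(kp * error)          # simple proportional command
```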
  • the control point CP means a point serving as a reference when performing the control of the robot arm 10 .
  • a position of the control point CP is figured out in a robotic coordinate system, and the robot arm 10 is driven so that the control point CP moves to a desired position.
  • the robot arm 10 is detachably provided with a force detection section 19 for detecting a force. Further, the robot arm 10 can be driven in a state in which the force detection section 19 is provided.
  • the force detection section 19 is a six-axis kinesthetic sensor in the present embodiment. The force detection section 19 detects magnitudes of forces on three detection axes perpendicular to each other, and magnitudes of torques around the respective three detection axes.
  • the force detection section 19 detects force components in the respective axial directions of an X axis, a Y axis, and a Z axis perpendicular to each other, a force component in a Tx direction around the X axis, a force component in a Ty direction around the Y axis, and a force component in a Tz direction around the Z axis.
  • the Z-axis direction corresponds to a vertical direction.
  • the force component in each of the axial directions can be referred to as a “translational force component,” and the force component around each of the axes can be referred to as a “rotational force component.”
  • the force detection section 19 is not limited to the six-axis kinesthetic sensor, and can be one having another configuration.
  • the force detection section 19 is provided to the sixth arm 17 .
  • the installation place of the force detection section 19 is not limited to the sixth arm 17 , namely the arm located at the farthest to the tip side, and can be, for example, another arm, an area between the arms adjacent to each other, or a place below the platform 11 , or it is possible to provide the force detection section 19 to each of the joints.
  • the end effector 20 is formed of a polishing tool for performing polishing.
  • the end effector 20 has a grinder at the tip, and polishes a work W 1 with the grinder making contact with the work W 1 while rotating.
  • although the grinder is used as the polishing tool in the present embodiment, this is not a limitation, and it is possible to adopt a sponge, or abrasive grains adhered to paper, cloth, or a film.
  • a tool center point TCP is set at an arbitrary position at the tip of the end effector 20 , preferably at a tip of the grinder.
  • the position of the control point CP is figured out in the robotic coordinate system, and the robot arm 10 is driven so that the control point CP moves to the desired position.
  • by grasping in advance a type, in particular a length, of the end effector 20 , it is possible to figure out an amount of offset between the tool center point TCP and the control point CP. Therefore, it is possible to figure out the position of the tool center point TCP in the robotic coordinate system, and to use the tool center point TCP as a reference of the control.
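  • as a hedged illustration of the offset calculation described above (the variable names and numbers are assumptions, not the patent's implementation), the position of the tool center point TCP can be obtained from the pose of the control point CP and the known tool offset:

```python
import numpy as np

def tool_center_point(p_cp, R_cp, tool_offset):
    """Position of the TCP in the robotic coordinate system.

    p_cp        : (3,) position of the control point CP in the robot frame
    R_cp        : (3, 3) rotation matrix giving the orientation of the CP frame
    tool_offset : (3,) offset from the CP to the TCP, expressed in the CP frame
                  (known once the type and length of the end effector 20 are known)
    """
    return np.asarray(p_cp) + np.asarray(R_cp) @ np.asarray(tool_offset)

# example with illustrative numbers: a 120 mm tool mounted along the flange z-axis
p_tcp = tool_center_point([0.4, 0.0, 0.3], np.eye(3), [0.0, 0.0, 0.12])
```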
  • the work W 1 is an object of the polishing by the end effector 20 .
  • an area to be polished corresponds to a polishing area.
  • as examples of the work W 1 , there can be cited a connector of electronic equipment, a plastic exterior component, a metal exterior component, and so on.
  • the control device 3 will now be described.
  • the control device 3 is arranged at a distance from the robot 1 , and can be constituted by a computer incorporating a CPU (Central Processing Unit) as an example of a processor, and so on.
  • the control device 3 can be incorporated in the platform 11 of the robot 1 .
  • the control device 3 is coupled to the robot 1 with the relay cable 18 so as to be able to communicate with each other. Further, the control device 3 is coupled to the teaching device 4 so as to be able to communicate with each other wirelessly or with a cable.
  • the teaching device 4 can be a dedicated computer, or can also be a general-purpose computer in which a program for teaching the robot 1 is installed. It is possible to use, for example, a teaching pendant as a dedicated device for teaching the robot 1 instead of the teaching device 4 . Further, it is possible for the control device 3 and the teaching device 4 to be provided with respective chassis separated from each other, or to be configured integrally with each other.
  • a program for generating an execution program which uses a target positional posture S t and a target force f St described later as parameters, and then loading the execution program to the control device 3 , can be installed in the teaching device 4 .
  • the teaching device 4 is provided with a display, a processor, a RAM, and a ROM, and these hardware resources generate the execution program in cooperation with the teaching program.
  • the control device 3 is a computer in which the control program for performing the control of the robot 1 is installed.
  • the control device 3 is provided with a processor, and a RAM and a ROM not shown, and these hardware resources cooperate with a program to thereby control the robot 1 .
  • the control device 3 has a target position setting section 3 A, a drive control section 3 B, and a storage section 3 C.
  • the storage section 3 C is constituted by, for example, a volatile memory such as a RAM (Random Access Memory), a nonvolatile memory such as a ROM (Read Only Memory), and a removable external storage device.
  • in the storage section 3 C, there are stored an operation program for making the robot 1 operate, and so on.
  • the target position setting section 3 A sets the target positional posture S t and an operation path for executing a predetermined operation on the work W 1 .
  • the target position setting section 3 A sets the target positional posture S t and the operation path based on the teaching information input from the teaching device 4 .
  • the drive control section 3 B is for controlling the drive of the robot arm 10 , and has a position control section 30 , a coordinate conversion section 31 , a coordinate conversion section 32 , a correction section 33 , a force control section 34 , and a command integration section 35 .
  • the position control section 30 generates a position command signal for controlling a position of the tool center point TCP of the robot 1 , namely a position command value, based on the target position designated using a command created in advance.
  • control device 3 is capable of controlling the operation of the robot 1 using force control and so on.
  • the “force control” means control of an operation of the robot 1 of changing a position of the end effector 20 , namely a position of the tool center point TCP, and postures of the first arm 12 through the sixth arm 17 based on the detection result of the force detection section 19 .
  • the force control includes, for example, force trigger control and impedance control.
  • in the force trigger control, the force detection is performed by the force detection section 19 , and the robot arm 10 is made to perform an operation such as a displacement or a change in posture until a predetermined force is detected by the force detection section 19 .
  • the impedance control includes imitation control.
  • in the impedance control, the operation of the robot arm 10 is controlled so as to keep the force applied to the tip portion of the robot arm 10 at a predetermined force as precisely as possible, namely so as to keep the force in a predetermined direction detected by the force detection section 19 at the target force f St as precisely as possible.
  • the robot arm 10 performs an operation imitating an external force applied from the object or an operator with respect to the predetermined direction.
  • the target force f St includes 0.
  • for example, as one of the settings when performing the imitation operation, it is possible to set the target value to “0.” It should be noted that it is possible to set the target force f St to a numerical value other than 0. It is possible for the operator to arbitrarily set the target force f St via, for example, the teaching device 4 . Further, it is possible to set the target force f St for each of the directions (X, Y, and Z) of the axes, and each of the directions (Tx, Ty, and Tz) around the respective axes.
  • the storage section 3 C stores a correspondence relationship between a combination of rotational angles of the motor M 1 through the motor M 6 , and a position of the tool center point TCP in the robotic coordinate system. Further, the control device 3 stores at least one of the target positional posture S t and the target force f St in the storage section 3 C based on a command in every step of an operation performed by the robot 1 . The command using the target positional posture S t and the target force f St as parameters is set for every step of the operation performed by the robot 1 .
  • the drive control section 3 B controls the first arm 12 through the sixth arm 17 so that the target positional posture S t and the target force f St thus set are achieved at the tool center point TCP.
  • the target force f St means a detected force and a torque of the force detection section 19 to be achieved by actions of the first arm 12 through the sixth arm 17 .
  • the coordinate conversion section 31 shown in FIG. 2 converts the rotational angles into the positional posture S at the tool center point TCP in the robotic coordinate system based on the correspondence relationship. Then, the coordinate conversion section 32 identifies an acting force f S actually acting on the force detection section 19 in the robotic coordinate system based on the positional posture S of the tool center point TCP and the detection value of the force detection section 19 .
  • An acting point of the acting force f S is defined as a force detection origin separately from the tool center point TCP.
  • the force detection origin corresponds to a point at which the force detection section 19 is detecting a force.
  • the control device 3 stores a correspondence relationship which defines a direction of a detection axis in a sensor coordinate system of the force detection section 19 for every positional posture S of the tool center point TCP in the robotic coordinate system. Therefore, it is possible for the control device 3 to identify the acting force f S in the robotic coordinate system based on the positional posture S of the tool center point TCP in the robotic coordinate system and the correspondence relationship.
  • the torque acting on the robot 1 can be calculated from the acting force f S and a distance from the contact point to the force detection section 19 , and is identified as a torque component. It should be noted that when the end effector 20 makes contact with the work W 1 to perform an operation, the contact point can be assumed as the tool center point TCP.
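  • a minimal sketch of the two identifications above, assuming the stored correspondence is available as a rotation matrix (the function and variable names are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def acting_force_in_robot_frame(R_sensor_to_robot, f_sensor):
    """Express the measured translational force in the robotic coordinate system."""
    return np.asarray(R_sensor_to_robot) @ np.asarray(f_sensor)

def torque_component(r_sensor_to_contact, f_acting):
    """Torque about the force detection origin caused by a force applied at the
    contact point (e.g. the tool center point TCP): tau = r x f."""
    return np.cross(np.asarray(r_sensor_to_contact), np.asarray(f_acting))
```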
  • the correction section 33 performs a gravity compensation on the acting force f S .
  • the gravity compensation means elimination of a component of a force or a torque caused by the gravity from the acting force f S .
  • the acting force f S on which the gravity compensation has been performed can be assumed as a force other than the gravity acting on the robot arm 10 or the end effector 20 .
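  • a minimal sketch of gravity compensation in the sense described above, assuming a known tool mass and center-of-mass offset (both are illustrative assumptions; the patent does not spell out the computation):

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # gravity in the robotic coordinate system [m/s^2]

def gravity_compensate(wrench_sensor, R_sensor_to_robot, tool_mass, com_in_sensor):
    """Remove the force/torque caused by the tool's own weight from a 6D reading.

    wrench_sensor : (6,) measured [Fx, Fy, Fz, Tx, Ty, Tz] in the sensor frame
    """
    R = np.asarray(R_sensor_to_robot)
    f_weight_sensor = R.T @ (tool_mass * GRAVITY)               # weight seen in the sensor frame
    t_weight_sensor = np.cross(com_in_sensor, f_weight_sensor)  # torque from the offset CoM
    return np.asarray(wrench_sensor) - np.concatenate([f_weight_sensor, t_weight_sensor])
```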
  • the correction section 33 performs an inertia compensation on the acting force f S .
  • the inertia compensation means elimination of a component of a force or a torque caused by an inertial force from the acting force f S .
  • the acting force f S on which the inertia compensation has been performed can be assumed as a force other than the inertial force acting on the robot arm 10 or the end effector 20 .
  • the force control section 34 performs the impedance control.
  • the impedance control is active impedance control which realizes an imaginary mechanical impedance with the motor M 1 through the motor M 6 .
  • the control device 3 performs such impedance control when performing direct teaching and a step in a contact state in which the end effector 20 receives a force from the object such as a fitting task, a screwing task, or a polishing task. It should be noted that besides such a step, by performing the impedance control when, for example, a human makes contact with the robot 1 , it is possible to enhance the safety.
  • the target force f St is substituted into a motion equation described later to derive the rotational angles of the motor M 1 through the motor M 6 .
  • Signals with which the control device 3 controls the motor M 1 through the motor M 6 are each a signal modulated with PWM (Pulse Width Modulation).
  • the control device 3 controls the motor M 1 through the motor M 6 with the rotational angles derived by a linear operation from the target positional posture S t .
  • a mode in which the motor M 1 through the motor M 6 are controlled with the rotational angles derived by the linear operation from the target positional posture S t is referred to as a position control mode.
  • the control device 3 substitutes the target force f St and the acting force f S into the motion equation of the impedance control to thereby identify a force-derived correction value ⁇ S.
  • the force-derived correction value ⁇ S means a magnitude of the positional posture S which the tool center point TCP should move for dissolving a force deviation ⁇ f S (t) from the target force f St when the tool center point TCP has received the mechanical impedance.
  • the following formula (1) is the motion equation of the impedance control.
  • the left-hand side of the formula (1) is constituted by a first term in which a second order differential value of the positional posture S of the tool center point TCP is multiplied by a virtual mass coefficient m (hereinafter referred to as a “mass coefficient m”), a second term in which a differential value of the positional posture S of the tool center point TCP is multiplied by a virtual viscosity coefficient d (hereinafter referred to as a “viscosity coefficient d”), and a third term in which the positional posture S of the tool center point TCP is multiplied by a virtual elastic coefficient k (hereinafter referred to as an “elastic coefficient k”).
  • the right-hand side of the formula (1) is formed of the force deviation ⁇ f S (t) obtained by subtracting the actual force f from the target force f St .
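  • the equation itself is not reproduced in this text; reconstructed from the two descriptions above (a reconstruction, not the original typography), formula (1) can be written as

    m\,\ddot{S}(t) + d\,\dot{S}(t) + k\,S(t) = \Delta f_S(t) = f_{St} - f    (1)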
  • the differential in formula (1) means the temporal differentiation. In the step performed by the robot, a constant value is set as the target force f St in some cases, and a function of time is set as the target force f St in some cases.
  • the mass coefficient m means a mass which the tool center point TCP virtually has
  • the viscosity coefficient d means a viscosity resistance which the tool center point TCP virtually receives
  • the elastic coefficient k means a spring constant of the elastic force which the tool center point TCP virtually receives.
  • as the value of the mass coefficient m increases, the acceleration of the action decreases, and as the value of the mass coefficient m decreases, the acceleration of the action increases.
  • as the value of the viscosity coefficient d increases, the speed of the action decreases, and as the value of the viscosity coefficient d decreases, the speed of the action increases.
  • as the value of the elastic coefficient k increases, the elasticity increases, and as the value of the elastic coefficient k decreases, the elasticity decreases.
  • the mass coefficient m, the viscosity coefficient d, and the elastic coefficient k can each be set to values different by direction, or can each also be set to a common value irrespective of the direction. Further, it is possible for the operator to arbitrarily set the mass coefficient m, the viscosity coefficient d, and the elastic coefficient k prior to the operation.
  • the input is performed by the operator using, for example, the teaching device 4 .
  • the mass coefficient m, the viscosity coefficient d, and the elastic coefficient k described above are each a force control parameter.
  • the force control parameter is a value set before the robot arm 10 actually performs an operation.
  • the force control parameters include the mass coefficient m, the viscosity coefficient d, the elastic coefficient k, and so on.
  • the correction value is obtained from the detection value of the force detection section 19 , the force control parameters set in advance, and the target force set in advance.
  • This correction value means the force-derived correction value ⁇ S described above, and means a difference between the position at which the external force is received, and a position to which the tool center point TCP should be moved.
  • the command integration section 35 combines the force-derived correction value ⁇ S with a position command value P generated by the position control section 30 . By performing the above as needed, the command integration section 35 obtains a new position command value P′ from the position command value P which has been used for the displacement to the position at which the external force is received.
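  • a hedged numerical sketch of how the force-derived correction value ΔS can be obtained from formula (1) and combined with the position command value P; the explicit Euler discretization and the variable names are assumptions made for illustration, not the patent's implementation:

```python
import numpy as np

def impedance_step(delta_s, delta_s_dot, f_target, f_acting, m, d, k, dt):
    """Integrate formula (1) over one control period.

    delta_s, delta_s_dot : current force-derived correction and its rate
    f_target, f_acting   : target force f_St and acting force f_S (same axis and units)
    m, d, k              : mass, viscosity, and elastic coefficients (force control parameters)
    """
    force_deviation = f_target - f_acting                      # right-hand side of formula (1)
    delta_s_ddot = (force_deviation - d * delta_s_dot - k * delta_s) / m
    delta_s_dot = delta_s_dot + delta_s_ddot * dt
    delta_s = delta_s + delta_s_dot * dt
    return delta_s, delta_s_dot

def integrate_command(position_command, delta_s):
    """Command integration: new position command value P' = P + force-derived correction."""
    return np.asarray(position_command) + np.asarray(delta_s)
```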
  • the teaching device 4 is a device for receiving a variety of setups, generating an operation program, and generating and then displaying such images as shown in FIG. 3 through FIG. 7 .
  • the teaching device 4 has a display section 40 , a control section 41 , a storage section 42 , and a communication section 43 .
  • the teaching device 4 is a laptop personal computer in the illustrated configuration, but is not particularly limited thereto in the present disclosure, and can be, for example, a desktop personal computer, a tablet, or a smartphone.
  • the display section 40 displays a setup screen 400 for the operator to input a variety of types of information, a simulation image for displaying information taught, and so on.
  • a setup screen 400 for the operator to input a variety of types of information
  • a simulation image for displaying information taught, and so on.
  • as the information taught, there can be cited information related to teaching points P 1 , information related to a polishing parameter of the polishing task, information related to the force control parameters described above, and so on.
  • the control section 41 has at least one processor.
  • as the processor, there can be cited, for example, a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
  • the control section 41 reads out a variety of programs and so on stored in the storage section 42 , and then executes the programs.
  • as the variety of programs, there can be cited, for example, an operation program and a teaching program of the robot arm 10 .
  • These programs can be those generated by the teaching device 4 , or can be those stored from an external recording medium such as a CD-ROM, or can be those stored via a network.
  • the teaching program generated by the control section 41 is transmitted to the control device 3 of the robot 1 via the communication section 43 .
  • thus, it is possible for the robot arm 10 to execute a predetermined task under a predetermined condition.
  • the control section 41 has a teaching point acquisition section 411 , a polishing parameter acquisition section 412 , a color information acquisition section 413 , and a display control section 414 .
  • the teaching point acquisition section 411 , the polishing parameter acquisition section 412 , the color information acquisition section 413 , and the display control section 414 constitute the teaching support device 10 A.
  • the teaching point acquisition section 411 obtains the information related to the teaching points P 1 input by the operator using the setup screen 400 shown in FIG. 5 .
  • the teaching points P 1 are points which are arranged on the surface of the work W 1 , and through which the tool center point TCP should pass.
  • the teaching points P 1 are arranged in a reticular pattern, and information related to a passing order is made to associate with each of the teaching points P 1 .
  • the position information of the teaching points P 1 with respect to the work W 1 is represented in, for example, the robotic coordinate system.
  • the polishing parameter acquisition section 412 obtains the information related to the polishing parameters input by the operator using the setup screen 400 shown in FIG. 5 .
  • as the polishing parameters, there can be cited, for example, information related to the polishing tool, information related to a material of the abrasive grain of the grinder, information related to a size of the abrasive grain, information related to rotational speed of the grinder, and information related to pressing force. In the illustrated configuration, the pressing force is displayed.
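  • for illustration, the polishing parameters enumerated above can be grouped into one record per teaching point; the field names below are assumptions made for this sketch:

```python
from dataclasses import dataclass

@dataclass
class PolishingParameters:
    tool: str                 # information related to the polishing tool
    grain_material: str       # material of the abrasive grain of the grinder
    grain_size: float         # size of the abrasive grain
    grinder_rpm: float        # rotational speed of the grinder
    pressing_force_n: float   # pressing force [N]
```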
  • the color information acquisition section 413 obtains color information input by the operator using the setup screen 400 shown in FIG. 5 .
  • the display control section 414 displays arbitrary teaching points P 1 out of the plurality of teaching points P 1 with a color based on the polishing parameters obtained by the polishing parameter acquisition section 412 so as to be superimposed on the work W 1 .
  • in other words, arbitrary teaching points P 1 out of the plurality of teaching points P 1 are displayed with colors based on the polishing parameters obtained. This point will be described later in detail.
  • the storage section 42 stores a variety of types of setup information, and a variety of programs which can be executed by the control section 41 .
  • the storage section 42 is constituted by, for example, a volatile memory such as a RAM (Random Access Memory), a nonvolatile memory such as a ROM (Read Only Memory), and a removable external storage device.
  • the communication section 43 performs transmission/reception of signals with the control device 3 using an external interface such as a wired LAN (Local Area Network), or a wireless LAN.
  • the display section 40 is formed of a variety of types of display having a display screen.
  • in the present embodiment, an input operation section such as a mouse or a keyboard is used in combination with the display section 40 .
  • this configuration is not a limitation, and it is possible to adopt a configuration in which the display section 40 is provided with a display function and an input operation function such as a touch panel. Further, it is possible to adopt a configuration in which the touch panel, the mouse, and the keyboard are used together with each other.
  • the display section 40 is not limited to the illustrated configuration, and it is possible to adopt one having a configuration of, for example, forming an image on the object or in the air.
  • the setup screen 400 shown in FIG. 5 is a screen to be displayed on the display section 40 , and it is possible for the operator to perform the teaching by inputting each item.
  • the setup screen 400 has an input section 401 , an input section 402 , an input section 403 , an input section 404 , an input section 405 , an input section 406 , and an input section 407 .
  • the input section 401 is a portion in which the numbers of the teaching points P 1 are input to group the teaching points P 1 into a plurality of ranges. Further, it is possible to set the items described below individually for each of the ranges using the input section 402 through the input section 407 .
  • the input section 402 is a portion in which the color to be displayed is set for each of the ranges of the teaching points P 1 designated in the input section 401 . Specifically, the input section 402 is a portion in which a color is selectively input at the right side of the range of the teaching points P 1 designated in the input section 401 .
  • the input section 403 is a portion in which a type of a coordinate system is input for each of the ranges of the teaching points P 1 designated in the input section 401 .
  • as the types of the coordinate system, there are cited a local coordinate system, an end coordinate system, a robotic coordinate system, and so on.
  • the input section 404 is a portion in which a direction and a magnitude of the pressing force of the grinder when performing the polishing task are input for each of the ranges of the teaching points P 1 designated in the input section 401 .
  • as the directions of the pressing force, there are cited Fx, Fy, and Fz, and the direction of the pressing force can be set by selecting one of these. Further, the magnitude of the pressing force can be set by inputting a numerical value.
  • the input section 405 is a portion in which the viscosity coefficient as one of the force control parameters is input for each of the ranges of the teaching points P 1 designated in the input section 401 . It should be noted that this configuration is not a limitation, and it is possible to adopt a configuration of, for example, inputting the mass coefficient, or the elastic coefficient, or two or more of these parameters.
  • the input section 406 is a portion in which the moving speed of the tool center point TCP is set for each of the ranges of the teaching points P 1 designated in the input section 401 . It is possible to set the moving speed by inputting a numerical value.
  • the input section 407 is a portion in which the rotational speed of the grinder of the polishing tool is set for each of the ranges of the teaching points P 1 designated in the input section 401 . It is possible to set the rotational speed of the grinder by inputting a numerical value.
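  • taken together, the input section 401 through the input section 407 can be thought of as one settings record per range of teaching points P 1 ; the sketch below is an assumption-based illustration of such a record, not the patent's data model:

```python
from dataclasses import dataclass

@dataclass
class TeachingPointRangeSettings:
    point_numbers: range          # input section 401: teaching points P1 belonging to the range
    display_color: str            # input section 402: color used to display the range
    coordinate_system: str        # input section 403: "local", "end", "robotic", ...
    force_direction: str          # input section 404: "Fx", "Fy", or "Fz"
    force_magnitude_n: float      # input section 404: magnitude of the pressing force [N]
    viscosity_coefficient: float  # input section 405: force control parameter d
    tcp_speed: float              # input section 406: moving speed of the tool center point TCP
    grinder_rpm: float            # input section 407: rotational speed of the grinder

# example with illustrative values: teaching points 1-10, pressed at 5 N in the Fz direction
range_a = TeachingPointRangeSettings(range(1, 11), "red", "robotic",
                                     "Fz", 5.0, 10.0, 20.0, 3000.0)
```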
  • when these inputs are completed, such a simulation image as shown in FIG. 6 is displayed.
  • in the simulation image shown in FIG. 6 , there are displayed the work W 1 and the teaching points P 1 set on the work W 1 .
  • the teaching points P 1 are displayed with colors based on the polishing parameters obtained by the polishing parameter acquisition section 412 so as to overlap the work W 1 .
  • thus, it is possible for the operator to figure out the contents of the teaching information input by him- or herself at a glance.
  • in other words, by displaying the teaching points P 1 with colors attached, what contents the teaching information has can clearly be figured out.
  • as described above, the teaching support device 10 A is a device for performing teaching to the robot 1 , which has the robot arm 10 with the end effector 20 as the polishing tool attached to its tip, and which controls the robot arm 10 with the force control to perform the polishing task on the work W 1 as the object. The teaching support device 10 A is provided with the teaching point acquisition section 411 for obtaining the information related to the plurality of teaching points P 1 set on the work W 1 , the polishing parameter acquisition section 412 for obtaining the information related to the polishing parameters of the polishing task at the plurality of teaching points P 1 obtained by the teaching point acquisition section 411 , and the display control section 414 for displaying the teaching points P 1 with the colors based on the polishing parameters obtained by the polishing parameter acquisition section 412 so as to overlap the work W 1 .
  • the display control section 414 displays the teaching point P 1 with a first color when the polishing parameter thus obtained is in a first range, for example, the pressing force is in a range no lower than 4 N and no higher than 6 N, and displays an arbitrary teaching point P 1 with a second color different from the first color when the polishing parameter thus received is in a second range different from the first range, for example, the pressing force is in a range no lower than 7 N and no higher than 10 N.
  • the display control section 414 displays a circle including the teaching point P 1 .
  • this configuration is not a limitation, and it is possible to adopt any configuration such as a triangular shape, a quadrangular shape, a polygonal shape having a larger number of vertexes, or a shape of a star.
  • the polishing parameters include the information of the size of the grinder provided to the polishing tool, and the display control section 414 displays a circle different in size depending on the information of the size of the grinder. Specifically, the larger the size of the grinder is, the larger the circle to be displayed is made, and the smaller the size of the grinder is, the smaller the circle to be displayed is made. Thus, it is possible to figure out the size of the grinder provided to the polishing tool at a glance. Further, it is possible to figure out the area to actually be polished.
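  • a hedged sketch of the display behavior described above, combining the range-based coloring (here the example ranges of 4-6 N and 7-10 N) with a circle whose size follows the grinder size; matplotlib, the thresholds, and the size scaling are illustrative choices, not part of the patent:

```python
import matplotlib.pyplot as plt

def color_for_pressing_force(force_n):
    """First color for the first range, second color for the second range (example ranges)."""
    if 4.0 <= force_n <= 6.0:
        return "tab:green"   # first color
    if 7.0 <= force_n <= 10.0:
        return "tab:red"     # second color, different from the first
    return "tab:gray"        # outside both example ranges

def show_teaching_points(points_xy, forces_n, grinder_diameters_mm):
    """Overlay the teaching points P1 on the work, colored by force, sized by grinder."""
    xs, ys = zip(*points_xy)
    colors = [color_for_pressing_force(f) for f in forces_n]
    sizes = [d * 10.0 for d in grinder_diameters_mm]   # larger grinder -> larger circle
    plt.scatter(xs, ys, c=colors, s=sizes, alpha=0.6)
    plt.gca().set_aspect("equal")
    plt.show()
```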
  • further, the teaching support device 10 A includes the color information acquisition section 413 for obtaining the information of setting the color.
  • the color information acquisition section 413 obtains the number of colors to be set and the ranges of the polishing parameters
  • the display control section 414 displays the colors based on the number of colors and the ranges of the polishing parameters obtained by the color information acquisition section 413 .
  • the display colors are automatically assigned so that an area high in pressing force becomes, for example, red, and an area low in pressing force becomes, for example, green in accordance with the magnitude of the pressing force thus input.
  • the difference in magnitude of the pressing force can also be figured out at a glance.
  • since the magnitude of the pressing force and the display color are associated with each other, it is possible to confirm a distribution of the pressing force.
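  • the automatic assignment described above (higher pressing force toward red, lower toward green) can be sketched as a simple normalization into a red-green gradient; the exact mapping below is an illustrative assumption:

```python
def force_to_color(force_n, f_min, f_max):
    """Map a pressing force to an (r, g, b) tuple: green at f_min, red at f_max."""
    if f_max <= f_min:
        return (0.0, 1.0, 0.0)
    t = min(max((force_n - f_min) / (f_max - f_min), 0.0), 1.0)
    return (t, 1.0 - t, 0.0)   # interpolate from green (low force) to red (high force)
```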
  • the display control section 414 displays the arrow including the teaching point P 1 .
  • the difference in magnitude of the pressing force can be figured out at a glance.
  • FIG. 8 is a block diagram for explaining a robotic system with a focus on hardware.
  • FIG. 8 shows an overall configuration of the robotic system 100 A having the robot 1 , a controller 61 , and a computer 62 coupled to each other.
  • the control of the robot 1 can be executed by reading out commands located in the memory with a processor located in the controller 61 , or can be executed via the controller 61 by reading out the commands located in the memory with a processor located in the computer 62 .
  • FIG. 9 is a block diagram showing Modified Example 1 with a focus on hardware of a robotic system.
  • FIG. 9 shows an overall configuration of the robotic system 100 B in which a computer 63 is directly coupled to the robot 1 .
  • the control of the robot 1 is directly executed by a processor located in the computer 63 reading out the commands located in the memory.
  • FIG. 10 is a block diagram showing Modified Example 2 with a focus on hardware of a robotic system.
  • FIG. 10 shows an overall configuration of the robotic system 100 C in which the robot 1 incorporating the controller 61 and a computer 66 are coupled to each other, and the computer 66 is connected to the cloud 64 via a network 65 such as a LAN.
  • the control of the robot 1 can be executed by reading out the commands located in the memory with a processor located in the computer 66 , or can be executed by reading out the commands located in the memory via the computer 66 with a processor located on the cloud 64 .
  • it is possible to define any one, two, or three of the controller 61 , the computer 66 , and the cloud 64 as the “control device.”
  • although the teaching support device according to the present disclosure is described with reference to the illustrated embodiment, the present disclosure is not limited to the illustrated embodiment. Further, the constituents of the teaching support device can be replaced with those capable of exerting substantially the same functions and having arbitrary configurations. Further, it is possible to add arbitrary constituents.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)
US17/855,944 2021-07-02 2022-07-01 Teaching Support Device Pending US20230001567A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-110837 2021-07-02
JP2021110837A JP2023007773A (ja) 2021-07-02 2021-07-02 教示支援装置

Publications (1)

Publication Number Publication Date
US20230001567A1 true US20230001567A1 (en) 2023-01-05

Family

ID=84785839

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/855,944 Pending US20230001567A1 (en) 2021-07-02 2022-07-01 Teaching Support Device

Country Status (3)

Country Link
US (1) US20230001567A1 (ja)
JP (1) JP2023007773A (ja)
CN (1) CN115625714A (ja)

Also Published As

Publication number Publication date
JP2023007773A (ja) 2023-01-19
CN115625714A (zh) 2023-01-20

Similar Documents

Publication Publication Date Title
US10173325B2 (en) Robot, robot control apparatus and robot system
US10112297B2 (en) Robot control apparatus, robot, and robot system
US10751874B2 (en) Method of teaching robot and robotic arm control device
CN106891321B (zh) 作业装置
US11370105B2 (en) Robot system and method for operating same
US20180154520A1 (en) Control device, robot, and robot system
JP7124440B2 (ja) ロボット制御装置およびロボットシステム
JP7187765B2 (ja) ロボット制御装置
JP2018118365A (ja) 制御装置およびロボットシステム
WO2018190345A1 (ja) ロボットシステム及びその運転方法
US20220250237A1 (en) Teaching device, teaching method, and recording medium
US20230001577A1 (en) Force Control Parameter Setup Support Method And Force Control Parameter Setup Support System
WO2015137162A1 (ja) 制御装置、ロボットシステム、および制御用データ生成方法
JP2021133470A (ja) ロボットの制御方法およびロボットシステム
JP2021030364A (ja) ロボット制御装置
US20230001567A1 (en) Teaching Support Device
JP2016221653A (ja) ロボット制御装置およびロボットシステム
US20220134571A1 (en) Display Control Method, Display Program, And Robot System
WO2022009765A1 (ja) ロボット制御装置
JP7423943B2 (ja) 制御方法およびロボットシステム
WO2021117701A1 (ja) マスタスレーブシステム及び制御方法
US11904479B2 (en) Method for controlling drives of a robot, and robot system
JP2021121451A (ja) 教示方法およびロボットシステム
US20240004364A1 (en) Display apparatus and display method
WO2021210514A1 (ja) ロボットの制御装置及び制御方法、ロボットシステム、ロボットの動作プログラムを生成する装置及び方法

Legal Events

Date Code Title Description
AS Assignment

Owner name: SEIKO EPSON CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHIMURA, YUMA;REEL/FRAME:060420/0304

Effective date: 20220408

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION