WO2012062374A1 - A control system and an operating device for controlling an industrial robot comprising a touch-screen - Google Patents

A control system and an operating device for controlling an industrial robot comprising a touch-screen

Info

Publication number
WO2012062374A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
contact
contact area
control
zero position
Prior art date
Application number
PCT/EP2010/067340
Other languages
French (fr)
Inventor
Olov Nylén
Ralph SJÖBERG
Original Assignee
Abb Technology Ag
Priority date
Filing date
Publication date
Application filed by Abb Technology Ag
Priority to PCT/EP2010/067340
Publication of WO2012062374A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/18 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 - Controls for manipulators
    • B25J13/06 - Control stands, e.g. consoles, switchboards
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1679 - Programme controls characterised by the tasks executed
    • B25J9/1689 - Teleoperation
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/36 - Nc in input of data, input key till input tape
    • G05B2219/36168 - Touchscreen
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/39 - Robotics, robotics to robotics hand
    • G05B2219/39438 - Direct programming at the console
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/40 - Robotics, robotics mapping to robotics vision
    • G05B2219/40195 - Tele-operation, computer assisted manual operation


Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The present invention relates to a control system for controlling an industrial robot, including an operating device (3) for manually controlling the movements of the robot. The device comprises a touch-sensitive contact area (5) and is adapted to generate control data for controlling the movements of the robot in response to a contact between an object and the contact area. The device is adapted to generate the control data in dependence on the position of a contact point (22, 24) between the object and the contact area, and the distance (d1, d2) between the contact point and a defined zero position (14) on the contact area. The system further includes a handheld operating device (3) provided with a touchscreen (5) including said touch-sensitive contact area.

Description

A CONTROL SYSTEM AND AN OPERATING DEVICE FOR CONTROLLING AN
INDUSTRIAL ROBOT COMPRISING A TOUCH-SCREEN
FIELD OF THE INVENTION AND PRIOR ART
The present invention relates to a control system and an operating device for controlling an industrial robot. The control system comprises a touch-sensitive contact area and is adapted to manually control the movements of the robot in response to a contact between an object and the contact area.
An industrial robot can be operated in two different modes: automatic and manual. In the automatic mode, the movements of the robot are controlled in accordance with the instructions of a control program. When the robot is operated in the manual mode, the movements of the robot are controlled by a portable operating device, generally denoted a Teach Pendant Unit. A robot operator uses the operating device for manually controlling the movements of the robot, for example, to teach or program the robot to follow an operating path. A Teach Pendant Unit normally comprises operator control means, for example a joystick, a ball, a set of buttons or any combination thereof, a visual display unit, and safety equipment for protecting a user against injury during manual control of the robot, such as an enabling device and an emergency stop button. The enabling device is, for example, a switch or a push button, which has to be pressed by the operator to enable manual control of the robot movements by the Teach Pendant Unit. Manually moving the robot is often called jogging the robot. When jogging the robot, it is common to use an operating device having a joystick with three or more axes for controlling the motions of the robot. A joystick is very intuitive and easy to learn and master. However, the joystick adds cost to the operating device and is a component that is affected by the surrounding environment, both by handling, chemicals and shocks. Further, the joystick is limited to its exact deflection directions.
The patent application US 2009/0292390 proposes to use a touch screen instead of a joystick in order to reduce the above mentioned problems. The operating device is provided with a touch sensitive contact area and is constructed to generate, in response to a contact by an element upon the contact area over a length, an output signal in correspondence to the length for controlling a movement of a machine element in a direction of an axis of the machine tool. A touch screen is insensitive to particles or dirt, liquids and gases which surround the operating device. Further, it is proposed to use a plurality of touch sensitive contact areas to generate, in response to a contact by the element upon one of the contact areas over a length, an output signal in correspondence to the length for controlling the movement of the machine element in a direction of the associated one of the axes of the machine tool. Further, it is proposed that the speed of the movement can be specified in the same way, i.e. the speed of the movement is proportional to the length of the contact by the element upon the contact area. A problem with this method is that it is difficult to change the speed of the robot.
OBJECTS AND SUMMARY OF THE INVENTION
The object of the present invention is to provide an improved operating device and control system for manually controlling the robot.
This object is achieved by a control system as defined in claim 1 and an operating device as defined in claim 15. According to the invention, the control system is adapted to control the movements of the robot in dependence on the position of a contact point between the object and the contact area, and the distance between the contact point and a defined zero position on the contact area. The object can, for example, be a finger of the user or a stylus. For example, the jogging speed is increased when the user moves the object away from the zero position on the screen, and the jogging speed is decreased when the user moves the object towards the zero position. The zero position can be anywhere in the contact area, but it is intuitive to have it in the middle. The location of the zero position can be static and predefined, or dynamic and e.g. defined by the user.
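As an illustration of this relation between distance and speed, the following minimal sketch maps the distance from the zero position to a jogging speed expressed as a fraction of the maximum allowed speed. All names, units and parameters are illustrative assumptions and are not taken from the application itself.

```python
import math

def jog_speed(contact, zero, max_distance, max_speed=1.0):
    """Map the distance between the contact point and the zero position
    to a jogging speed between 0 and max_speed.

    contact, zero -- (x, y) screen coordinates in pixels (assumed units)
    max_distance  -- distance at which the maximum speed is reached
    """
    distance = math.hypot(contact[0] - zero[0], contact[1] - zero[1])
    # The speed grows with the distance to the zero position and is clamped
    # to the maximum allowed speed; no contact would mean zero speed.
    return min(distance / max_distance, 1.0) * max_speed

# Moving the finger away from the zero position increases the speed,
# moving it back towards the zero position decreases it.
print(jog_speed((260, 140), (200, 150), max_distance=200))  # about 0.30
print(jog_speed((210, 150), (200, 150), max_distance=200))  # 0.05
```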
Thus, the movements of the robot depend on the position of the contact point with respect to a zero position and not on the length of the contact between the object and the contact area as in the prior art. The invention makes it easy to increase or decrease the speed of the robot by moving the contact point to a larger or smaller distance from the zero position, and also to provide stepwise movement of the robot. The user only has to point at the touch screen to order a movement instead of dragging the element over the screen. Another advantage is that greater freedom is allowed in determining the direction of the movement.
According to an embodiment of the invention, the system is adapted to control the direction of the robot movement in dependence on the position of the contact point and to control the speed of the robot movement in dependence on the distance between the contact point and the zero position, such that the speed is increasing upon an increasing distance between the contact point and the zero position, and the speed is decreasing upon a decreasing distance between the contact point and the zero position. For example, the direction of the movement can be made dependent on the position of the contact point relative to the zero position. In another example, the contact area can be divided into a plurality of subareas representing different Cartesian directions, and the system is adapted to control the direction of the robot movement in dependence on in which of the subareas the contact point is positioned. However, the speed of the robot movement depends on the distance between the contact point and the zero position. This embodiment makes it easy for the operator to control the direction as well as the speed of the robot. According to an embodiment of the invention, the system is adapted to control the direction of the robot movement in dependence on the position of the contact point in relation to the zero position. This provides a larger freedom to determine the direction of the movement since a movement is no longer restricted to being along a defined Cartesian axis.
According to an embodiment of the invention, system is adapted to display a view includes a coordinate system having two axes representing two orthogonal Cartesian directions, the zero posi- tion is defined as the intersection point between the axes, and the speed in the one of the directions is determined by the vertical distance between a contact point and the zero position, and the speed in the other direction is determined by the horizontal distance between the contact point and the zero position. This embodiment makes it possible to determine a direction of the movement in between two orthogonal directions.
According to another embodiment of the invention the contact area may include a plurality of subareas representing different joints of the robot, and the control system is adapted to control the movements of the joints of the robot in dependence on in which of the subareas the contact point is positioned. The speed of a selected joint is controlled by the position of the contact point in relation to the zero position. Preferably, each joint of the robot is represented by its own subarea. The movements of a selected joint are controlled by pointing in the subarea representing the selected joint, at a distance from the zero position representing the desired speed of the joint. This embodiment makes it possible to control the movements of individual joints of the robot.
Sometimes there is a desire to control individual joints of the robot instead of the Cartesian direction of a tool center point of the robot. According to an embodiment of the invention, at least some of the subareas represent different Cartesian directions when manual movement of the tool center point of the robot is selected and represent different joints of the robot when manual movement of the joints of the robot is selected. This embodiment makes it possible to switch between controlling the movements of the robot in Cartesian directions and controlling the movements of the joints of the robot.
According to an embodiment of the invention, the system is adapted to continue the movements of the robot as long as the object is in contact with the contact area, and to stop the movements of the robot when there is no contact between the object and the contact area. This embodiment makes it easy to move the robot stepwise by tapping on the contact area. From a safety point of view, it is advantageous that the movement of the robot is stopped when there is no longer any contact between the object and the contact area. This embodiment would make it possible to omit the enabling device on the teach pendant unit.
According to an embodiment of the invention, the system is adapted to allow the zero position to be defined by the user upon touching a desired position on the contact area. This makes it possible for a user to select where on the screen the zero position is to be located and to specify the movement in relation to the zero point, without having to look at the screen. Further, it makes it possible for a user to limit the speed of the movement by positioning the zero point close to an edge of the touch screen. Further, it is advantageous if the touch sensitive contact area is adapted to support multi-touch and accordingly to detect simultaneous contact points between two objects and the contact area. As the contact area supports multi-touch, it is possible to simultaneously detect the contact point and the zero position. This embodiment enables the user to define the zero position with one of his fingers and to select the contact point with another finger. This means that the speed of the robot is determined by the distance between two fingers of the user being in contact with the contact area, and the speed can be increased or decreased by simply reducing or increasing the distance between the two fingers of the user. This embodiment makes it possible for a user to change the speed of the robot without the need to look at the touch screen when jogging the robot. Further, the fact that the touch sensitive contact area supports multi-touch enables a user to simultaneously control two joints of the robot in dependence on the positions of two contact points and the distances between the contact points and the zero position.
According to another embodiment of the invention, the zero position is fixed and one or more location elements are provided at or in close vicinity of the contact area and in a defined relation to the zero position in order to facilitate for a user to localize the zero position. It is desirable to avoid the need for the user to look at the touch screen when jogging the robot. Providing location elements close to the contact area makes it easier for the user to localize the zero position. Preferably, at least one location element is arranged at the same vertical level as the zero position and at least one location element is arranged at the same horizontal level as the zero position. The location element is, for example, a protrusion or a recess. The location element is designed so that a user can sense it with his fingers. According to an embodiment of the invention, the system comprises a handheld operating device provided with a touch screen including said touch-sensitive contact area and adapted to generate control data for controlling the movements of the robot in response to a contact between the object and the contact area, and in dependence on the position of the contact point and the distance between the contact point and the zero position, and a control unit adapted to receive the control data and to control the movements of the robot based on the control data.
The invention also relates to an operating device for manually controlling the movements of an industrial robot, the device comprising a touch-sensitive contact area and being adapted to generate control data for controlling the movements of the robot in response to a contact between an object and the contact area. The device is adapted to generate said control data in dependence on the position of a contact point between the object and the contact area, and the distance between the contact point and a defined zero position on the screen.
According to an embodiment of the invention, the control data includes a speed vector, and the device is adapted to generate the speed vector in dependence on the position of the contact point and the distance between the contact point and the zero position.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be explained more closely by the description of different embodiments of the invention and with reference to the appended figures.
Fig. 1 shows an example of an industrial robot having a control system according to an embodiment of the invention.
Fig. 2 shows an example of an operating device according to the invention. Fig. 3a shows an example of a display view for controlling the robot in three Cartesian directions.
Fig. 3b shows an example of a display view for controlling a plurality of joints of the robot.
Figs. 4a-c show three examples of display views for controlling the robot in a plane defined by two Cartesian directions. Fig. 5 shows another example of a display view for controlling the robot in three Cartesian directions.
Fig. 6 shows an example of a display view on a multi-touch screen.
Fig. 7 shows another example of a display view for controlling a plurality of joints of the robot.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
Figure 1 shows an industrial robot comprising a manipulator 1 and a control system including a control unit 2 for controlling the movements of the manipulator in automatic mode and a portable operating device 3 for manually controlling the movements of the manipulator. The control unit 2 and the operating device 3 are connected to each other via a communication link, such as an Ethernet link. In the embodiment disclosed in figure 1, the operating device 3 is in communication with the control unit 2 via a cable. However, the communication can as well be wireless. The control unit 2 as well as the operating device 3 includes hardware such as processors, data storage, and communication components, as well as software for executing and handling data and robot programs. The operating device 3 comprises a touch screen 5, function keys 6, and an emergency stop button 8. The function keys permit an operator to select various states for the control system. In this embodiment, the operating device is not provided with an enabling device. A touch screen is an electronic visual display that can detect the presence and location of a touch within the display area. Touch screens can sense a finger, a hand or a passive object, such as a stylus, touching the display. The touch screen 5 includes a touch sensitive contact area. The operating device is capable of registering the position of one or more contact points on the contact area and is adapted to generate control data for controlling the movements of the manipulator in response to a contact between an object, such as a finger or a stylus, and the contact area. The operating device comprises a software module receiving information on the location of touches on the screen and converting this information to control data for controlling the movements of the manipulator. The control data is continuously sent to the control unit 2. The control unit 2 comprises a switch 10 for switching between automatic mode and manual mode. When the control system is in manual mode the movements of the manipulator 1 are controlled by the operating device 3, and when the control system is in automatic mode the movements of the manipulator are controlled by a robot program running on the control unit. The control unit 2 receives control data from the operating device 3. If the control system is in manual mode, the control unit moves the manipulator in accordance with the received control data. The manipulator 1 includes a plurality of arm parts and a hand, which are rotatable relative to each other about a plurality of joints. The robot hand supports a tool, in which an operating point, called the Tool Centre Point (TCP), is defined. The number of joints may vary between different types of manipulators. The manipulator 1 shown in figure 1 has six joints and includes six motors for driving the six joints. When the robot is in manual mode it is desirable to be able to move individual joints of the manipulator as well as to move the TCP in Cartesian directions x, y, z.
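As a sketch of this division of responsibility, the following hypothetical snippet shows a control unit gating received control data on the mode switch; the names and message shape are assumptions for illustration only, not the actual controller interface.

```python
from enum import Enum

class Mode(Enum):
    AUTOMATIC = "automatic"   # movements controlled by the robot program
    MANUAL = "manual"         # movements controlled by the operating device

def handle_control_data(mode, speed_vector, move_manipulator):
    """Apply control data received from the operating device only when the
    control system is in manual mode; in automatic mode it is ignored."""
    if mode is Mode.MANUAL:
        move_manipulator(speed_vector)

# Example: a speed vector given in percent of maximum allowed speed per direction.
handle_control_data(Mode.MANUAL, (20, 45, 0), lambda v: print("jogging", v))
handle_control_data(Mode.AUTOMATIC, (20, 45, 0), lambda v: print("jogging", v))  # no output
```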
Figure 2 shows the operating device 3 in an enlarged view. The touch screen 5 includes a touch sensitive contact area. In this example, the contact area is divided into a plurality of subareas 12. Each subarea 12 represents a Cartesian direction x, -x, y, -y, z or -z. A zero position 14 is marked in the middle of the contact area. The operating device is adapted to generate a continuous stream of control data, for example in the form of a speed vector, in dependence on the contact point between an object and the touch screen, and the distance between the contact point and the zero position 14 on the screen. The speed vector includes values for the speed in each direction, and the terms of the vector are, for example, defined as a percentage of the maximum allowed speed in a direction. If there is no contact between the object and the touch screen the speed vector is zero. When the speed vector is zero, the jogging of the robot is assumed to be finished and the stream of control data to the control unit is stopped. The control unit is adapted to stop the movement of the manipulator when the stream of control data from the operating unit is discontinued.
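A hedged sketch of how such a continuous stream of control data could be produced on the operating-device side, with the terms of the speed vector expressed as percentages of the maximum allowed speed. The transport, the message format and the addresses are assumptions, not taken from the application.

```python
import json
import socket
import time

def stream_speed_vectors(read_touch, host="192.168.125.1", port=12345, period=0.05):
    """Send a speed vector (x, y, z), in percent of the maximum allowed speed,
    to the control unit as long as the operator touches the contact area.

    read_touch() is assumed to return the current (x, y, z) speed vector,
    or None when there is no contact between the object and the screen.
    """
    with socket.create_connection((host, port)) as link:
        while True:
            vector = read_touch()
            if vector is None:
                # No contact: jogging is assumed to be finished, the stream is
                # stopped and the control unit then stops the manipulator.
                break
            link.sendall(json.dumps({"speed": vector}).encode() + b"\n")
            time.sleep(period)
```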
In the embodiment disclosed in figure 2 the position of the zero position 14 is fixed and the operating device is provided with four location elements 16a-d arranged close to the touch screen 5 in order to facilitate for a user to localize the zero position 14 without having to look at the touch screen. In this embodiment, two location elements 16a and 16c are arranged on the same horizontal level as the zero position 14, and two location elements 16b and 16d are arranged at the same vertical level as the zero position 14. The location element can be a protrusion or depression. Figure 3a shows an example of a display view for controlling the robot in three Cartesian directions. In this embodiment, the touch screen 5 is divided into six different subareas 20a-f representing six different Cartesian directions for the TCP of the robot. The subarea 20a represents a positive y direction, subarea 20b represents a negative y direction, subarea 20c represents a positive x direction, subarea 20d represents a negative x direction, subarea 20e represents a positive z direction, and subarea 20f represents a negative z direction.
In this example, the operator uses his finger to point at the touch screen 5. The position of the contact point 22, 24 between the finger and the screen determines the direction and the speed of the movement. If the operator points with his finger in the subarea 20b at contact point 22, as shown in the figure, the TCP of the manipulator is moved in the negative y direction and the speed of the movement is determined by the distance d1 between the contact point 22 and the zero position 14. The movement continues as long as the finger is in contact with the touch area. When the operator moves the finger away from the zero position 14 the jogging speed is increased, and when the operator moves the finger in a direction towards the zero position the jogging speed is decreased. If the operator instead points with his finger at the subarea 20e at contact point 24, as shown in the figure, the TCP is moved in the positive z direction, and the speed of the movement is determined by the vertical distance d2 between the contact point 24 and the zero position 14. For the positive and negative z-direction, the speed depends on the distance between the contact point and the horizontal level of the zero position. Thus, the jogging speed is increased when the operator moves the finger away from the horizontal level of the zero position, and the jogging speed is decreased when the operator moves the finger in a direction towards the horizontal level of the zero position.
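A minimal sketch of the subarea lookup described for figure 3a, with an assumed screen layout; the direction follows from the touched subarea and the speed from the distance to the zero position. The rectangle coordinates are purely illustrative.

```python
import math

# Assumed 320 x 240 pixel layout: each rectangle maps to one Cartesian direction.
DIRECTION_SUBAREAS = {
    "+y": (0, 0, 320, 80),
    "-y": (0, 160, 320, 240),
    "-x": (0, 80, 107, 160),
    "+z": (107, 80, 160, 160),
    "-z": (160, 80, 213, 160),
    "+x": (213, 80, 320, 160),
}
ZERO = (160, 120)

def cartesian_command(contact, max_distance=160):
    """Return (direction, speed_fraction) for a contact point, or None when
    the contact point does not fall inside any direction subarea."""
    for direction, (x0, y0, x1, y1) in DIRECTION_SUBAREAS.items():
        if x0 <= contact[0] < x1 and y0 <= contact[1] < y1:
            d = math.hypot(contact[0] - ZERO[0], contact[1] - ZERO[1])
            return direction, min(d / max_distance, 1.0)
    return None

print(cartesian_command((100, 200)))  # a point in the "-y" subarea -> ('-y', 0.625)
```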
Figure 3b shows an example of a display view for controlling a plurality of joints of the robot. The touch sensitive contact area is divided into a plurality of different subareas 25a-h representing different joints of the manipulator. Subareas 25a-b represent joint 1, subareas 25c-d represent joint 2, subareas 25e-f represent joint 3, and subareas 25g-h represent joint 4. A pair of subareas for a joint represent opposite directions of the movements of the joint. This embodiment makes it possible to control the movement of one joint at a time. For example, if joint 2 is to be moved in one direction, the operator points at subarea 25c, and if joint 2 is to be moved in the opposite direction the operator points at subarea 25d. To move joint 2, the operator can point anywhere in subarea 25c. However, the speed of joint 2 depends on the distance d1 between the contact point 27 and the zero position 14. In this example, the operator uses a pencil 30 to point at the contact area 5. If the speed of joint 2 is to be increased the operator moves the pencil away from the zero position 14, and if the speed is to be decreased the operator moves the pencil in a direction towards the zero position 14.
According to an embodiment of the invention, it is possible to switch between manual movements of the TCP of the manipulator and manual movements of the individual joints of the robot. In this embodiment the subareas represent different Cartesian directions, as shown in figure 3a, when manual movement of the TCP is selected, and the subareas represent different joints of the robot, as shown in figure 3b, when manual movement of the joints of the robot is selected. In the embodiments described with reference to figures 3a-b the direction of the robot movement depends on in which subarea the contact point is positioned, and the speed of the robot movement depends on the distance between the contact point and the zero position such that the speed is increasing upon an increasing distance between the contact point and the zero position and the speed is decreasing upon a decreasing distance between the contact point and the zero position. The movement of the manipulator is immediately stopped when the contact between the object and the contact area ceases. This feature makes it possible to omit the enabling device of the operating device, as the movement is enabled only when there is a contact between the object and the contact area.
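A small sketch, assuming a simple mode flag, of how the same subareas could be reinterpreted when the operator switches between TCP jogging and joint jogging; the subarea meanings listed here are assumptions for illustration.

```python
# Hypothetical meanings of the same six subareas in the two jogging modes.
SUBAREA_MEANING = {
    "tcp":    ["+y", "-y", "+x", "-x", "+z", "-z"],
    "joints": ["joint1+", "joint1-", "joint2+", "joint2-", "joint3+", "joint3-"],
}

def interpret(subarea_index, mode, speed_fraction):
    """Translate a touched subarea into a jogging command for the active mode."""
    target = SUBAREA_MEANING[mode][subarea_index]
    return {"target": target, "speed": speed_fraction}

print(interpret(1, "tcp", 0.4))     # {'target': '-y', 'speed': 0.4}
print(interpret(1, "joints", 0.4))  # {'target': 'joint1-', 'speed': 0.4}
```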
Figures 4a-c show three examples of display views for controlling the TCP of the manipulator in a plane defined by two Cartesian directions. Each of the views includes a coordinate system having two axes representing two orthogonal Cartesian directions, and the zero position is defined as the intersection point between the axes. The speed in one of the directions is determined by the vertical distance between a contact point and the zero position, and the speed in the other direction is determined by the horizontal distance between the contact point and the zero position. Figure 4a shows a view for controlling the movements of the TCP in an x-y plane. Figure 4b shows a view for controlling the movements of the TCP in an x-z plane. Figure 4c shows a view for controlling the movements of the TCP in a z-y plane. The three views shown in figures 4a-c can be shown simultaneously on the contact area, or it can be possible to toggle between the three views. As shown in figure 4a, the speed in the y direction is determined by the vertical distance dy between a contact point 30 and a zero position 32, and the speed in the x direction is determined by the horizontal distance dx between the contact point 30 and the zero position 32.
Figure 5 shows another example of a display view for controlling the TCP of the manipulator in three Cartesian directions. The contact area 35 includes two subareas 36a-b. The subarea 36a represents a plane defined by two Cartesian directions, for example the x-y plane, and the subarea 36b represents one Cartesian direction, for example z. A zero position 38 is defined in the middle of the subarea 36a. In this example, the speed in the x direction depends on the horizontal distance between a contact point 39 and the zero position 38, and the speed in the y direction depends on the vertical distance between the contact point 39 and the zero position 38. In the example shown in figure 5, the axes represent percentage of maximum allowed speed. The contact point 39 specifies that the speed in the x-direction is 20% of maximum allowed speed and the speed in the y-direction is 45% of maximum allowed speed. This means that the operating device generates control data including a speed vector (x, y, z) which is (20, 45, 0). When the contact point is located in the first quadrant of the displayed coordinate system, the movement of the TCP is in the positive x and y directions, and if the contact point is located in the third quadrant the movement of the TCP is in the negative x and y directions. If the movement of the manipulator is to be controlled in the z-direction, the contact point is moved to the subarea 36b. The speed in the z direction depends on the vertical distance between the contact point and the zero position 38. For example, if a contact point 40 is located as shown in figure 5, the TCP is moved in the positive direction of the z-axis and the speed along the z-axis is 30% of the maximum allowed speed for the z axis. In this case, the speed vector becomes (0, 0, 30).
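The numerical example can be reproduced with a short sketch of this two-subarea layout; the assumption here is that the view coordinates are expressed directly in percent relative to the zero position, with the vertical axis pointing upwards.

```python
def speed_vector(contact, in_z_subarea, zero=(0, 0)):
    """Return a speed vector (x, y, z) in percent of the maximum allowed speed
    for the two-subarea layout of figure 5.

    contact      -- position of the contact point relative to the view,
                    assumed to be given directly in percent units
    in_z_subarea -- True when the contact point lies in the z subarea 36b
    """
    dx = contact[0] - zero[0]
    dy = contact[1] - zero[1]
    if in_z_subarea:
        return (0, 0, dy)      # only the vertical distance to the zero level matters
    return (dx, dy, 0)         # x-y plane: horizontal and vertical distances

print(speed_vector((20, 45), in_z_subarea=False))  # -> (20, 45, 0)
print(speed_vector((0, 30), in_z_subarea=True))    # -> (0, 0, 30)
```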
It is also possible to allow the zero position to be defined by a user upon touching a position on the contact area. For example, the display screen can be provided with a menu bar providing an option to redefine the zero position. If the user selects this option, a new position for the zero position is defined when the user touches the contact area. Preferably, the user has to confirm the selected position for the zero position before the zero position is changed to the new position. The new zero position is shown on the contact area.
In an embodiment of the invention, the touch screen is a multi-touch screen, which makes it possible to detect two or more simultaneous contact points on the contact area. A multi-touch screen is capable of simultaneously registering two or more distinct positions of input touches. In this example, the touch screen is capable of registering the position of two or more contact points on the contact area. For example, in figure 5 it is possible for the operator to determine the speeds in the x and y directions by pointing with one of his fingers in the subarea 36a, and simultaneously determine the speed in the z direction by pointing with another finger in subarea 36b.
Figure 6 shows another example of a display view on a multi-touch screen. A zero position 45 is defined by one of the operator's fingers, and the contact point 46 is defined by another finger. The direction of the movement depends on the position of the contact point 46 relative to the axes of the coordinate system displayed in the view, and the speed of the movement depends on the distance between the two fingers. Figure 7 shows another example of a contact area 50. The contact area 50 is divided into six subareas representing six different joints of the robot. Each subarea is provided with a zero position 51a-f. The speed of the joint represented by a subarea is determined by the distance between a contact point located in the subarea and the zero position. If a multi-touch screen is used, several joints can be controlled simultaneously.
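For the figure 6 case, a sketch of the two-finger mapping is given below. The function name two_finger_jog, the parameters max_speed and max_separation, the clamping of the speed at the maximum, and the flipped screen y-axis are illustrative assumptions; only the rule that direction follows the relative position of the fingers and speed follows their separation comes from the description above.

import math

def two_finger_jog(zero_finger, moving_finger, max_speed, max_separation):
    """Figure 6 style control: one finger defines the zero position, the other
    the contact point. The direction follows the contact point's position
    relative to the zero finger; the speed grows with the finger separation."""
    dx = moving_finger[0] - zero_finger[0]
    dy = zero_finger[1] - moving_finger[1]        # flip screen y so "up" is positive
    distance = math.hypot(dx, dy)
    if distance == 0.0:
        return (0.0, 0.0), 0.0                    # fingers on top of each other: no movement
    direction = (dx / distance, dy / distance)    # unit vector in the displayed plane
    speed = max_speed * min(distance / max_separation, 1.0)
    return direction, speed


# Fingers 50 px apart, with full speed assumed to be reached at 200 px separation:
print(two_finger_jog(zero_finger=(100, 100), moving_finger=(130, 60),
                     max_speed=100, max_separation=200))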
The present invention is not limited to the embodiments disclosed but may be varied and modified within the scope of the following claims. For example, the display area can be designed in many different ways.

Claims

1. A control system for controlling an industrial robot, the system comprising a touch-sensitive contact area (5;35;50) and being adapted to manually control the movements of the robot in response to a contact between an object and the contact area, characterized in that the system is adapted to control the movements of the robot in dependence on the position of a contact point (22,24;27;30;39,40;46) between the object and the contact area, and the distance (d1, d2) between the contact point and a defined zero position (14;32;38;45;51a-f) on the contact area.
2. The control system according to claim 1, wherein the system is adapted to control the direction of the robot movement in dependence on the position of the contact point (22,24;27;30;39,40;46) and to control the speed of the robot movement in dependence on the distance between the contact point and the zero position such that the speed is increasing upon an increasing distance between the contact point and the zero position (14;32;38;45;51a-f) and the speed is decreasing upon a decreasing distance between the contact point and the zero position.
3. The control system according to any of the previous claims, wherein the operating device is adapted to continue the movement of the robot as long as the object is in contact with the contact area and to stop the movements when there is no contact between the object and the contact area.
4. The control system according to any of the previous claims, wherein the system is adapted to allow the zero position to be defined by a user upon touching a desired position on the contact area.
5. The control system according to any of the previous claims, wherein the touch-sensitive contact area is adapted to support multi-touch and accordingly to detect simultaneous contact points (45,46) between two objects and the contact area.
6. The control system according to claim 5, wherein the system is adapted to simultaneously control two joints of the robot in dependence on the positions of said two contact points and the distances between the contact points and the zero position.
7. The control system according to claim 5, wherein the position of the zero position is determined as the position of one of said contact points, and the system is adapted to control the movements of the robot in dependence on the distance between the contact points.
8. The control system according to any of the previous claims, wherein the contact area includes a plurality of subareas (20a-f) representing different Cartesian directions, and the system is adapted to control the direction of the robot movement in dependence on in which of the subareas the contact point (22,24) is positioned.
9. The control system according to any of the previous claims, wherein the contact area includes a plurality of subareas (25a-h) representing different joints of the robot, and the system is adapted to control the movements of the joints of the robot in dependence on in which of the subareas the contact point (27) is positioned.
10. The control system according to claims 8 and 9, wherein at least some of said subareas (20a-f; 25a-h) represent different Cartesian directions when manual movement of a Tool Centre Point of the robot is selected and represent different joints of the robot when manual movement of the joints of the robot is selected.
11. The control system according to any of the previous claims, wherein the system is adapted to control the direction of the robot movement in dependence on the position of the contact point in relation to the zero position.
12. The control system according to any of the previous claims, wherein the position of the zero position is fixed and one or more location elements (16a-d) are provided at or in the close vicinity of the contact area (5) in order to make it easier for a user to locate the zero position.
13. The control system according to claim 12, wherein at least one location element (16a, 16c) is arranged at the same vertical level as the zero position and at least one location element (16b, 16d) is arranged at the same horizontal level as the zero position.
14. The control system according to any of the previous claims, wherein the system comprises a handheld operating device (3) provided with a touch screen (5) including said touch-sensitive contact area, and adapted to generate control data for controlling the movements of the robot in response to a contact between the object and the contact area, and in dependence on the position of the contact point and the distance between the contact point and the zero position, and a control unit (2) adapted to receive the control data and to control the movements of the robot based on the control data.
15. An operating device for manually controlling the movements of an industrial robot, the device comprising a touch-sensitive contact area (5;35;50) and being adapted to generate control data for controlling the movements of the robot in response to a contact between an object and the contact area, characterized in that the device is adapted to generate said control data in dependence on the position of a contact point (22,24;27;30;39,40;46) between the object and the contact area, and the distance between the contact point and a defined zero position (14;32;38;45;51a-f) on the contact area.
PCT/EP2010/067340 2010-11-12 2010-11-12 A control system and an operating device for controlling an industrial robot comprising a touch -screen WO2012062374A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/067340 WO2012062374A1 (en) 2010-11-12 2010-11-12 A control system and an operating device for controlling an industrial robot comprising a touch -screen

Publications (1)

Publication Number Publication Date
WO2012062374A1 (en) 2012-05-18

Family

ID=44278753

Country Status (1)

Country Link
WO (1) WO2012062374A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6088628A (en) * 1996-07-24 2000-07-11 Fanuc, Ltd. Jog feeding method for robots
US20090292390A1 (en) 2008-05-21 2009-11-26 Siemens Aktiengesellschaft Operating device for operating a machine tool
CN101604153A (en) * 2009-07-06 2009-12-16 三一重工股份有限公司 Engineering vehicle arm rest controller, control system, engineering truck, and control method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SONICWAVEZOR: "Using iPod Touch to control NASA Robot over wifi", 24 March 2010 (2010-03-24), XP002653593, Retrieved from the Internet <URL:http://www.youtube.com/watch?v=4XQeZE4nh6M> [retrieved on 20110727] *
THEROBOTGEEKNET: "Multi-Touch_Robot_Control_Demo_01.wmv", 3 February 2010 (2010-02-03), XP002653715, Retrieved from the Internet <URL:http://www.youtube.com/watch?v=rQxMf1TV_jo> [retrieved on 20110727] *
TYPHOONC2T9: "Control the Robot Arm with Ipod Touch", 5 March 2010 (2010-03-05), XP002653533, Retrieved from the Internet <URL:http://www.youtube.com/watch?v=ttV-gXw3s3U> [retrieved on 20110726] *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10076839B2 (en) 2013-09-20 2018-09-18 Denso Wave Incorporated Robot operation apparatus, robot system, and robot operation program
JP2016059974A (en) * 2014-09-16 2016-04-25 株式会社デンソーウェーブ Robot operating device, robot system, and robot operating program
JP2016060019A (en) * 2014-09-19 2016-04-25 株式会社デンソーウェーブ Robot operating device, robot system, and robot operating program
JP2016068236A (en) * 2014-10-01 2016-05-09 株式会社デンソーウェーブ Robot operation device, robot system, and robot operation program
JP2016175176A (en) * 2015-03-19 2016-10-06 株式会社デンソーウェーブ Robot operation device and robot operation program
JP2016175177A (en) * 2015-03-19 2016-10-06 株式会社デンソーウェーブ Robot operation device and robot operation method
WO2016154995A1 (en) * 2015-04-02 2016-10-06 Abb Technology Ltd Method for industrial robot commissioning, industrial robot system and control system using the same
US10786904B2 (en) 2015-04-02 2020-09-29 Abb Schweiz Ag Method for industrial robot commissioning, industrial robot system and control system using the same
US11207781B2 (en) 2015-04-02 2021-12-28 Abb Schweiz Ag Method for industrial robot commissioning, industrial robot system and control system using the same
JP2017211956A (en) * 2016-05-27 2017-11-30 ファナック株式会社 Numerical control device allowing machine operation using multiple touch gesture
EP3309671A1 (en) * 2016-10-13 2018-04-18 Negri Bossi S.P.A. Touch screen panel with multi-axis manoeuvring of injection presses for plastic materials
IT201600102910A1 (en) * 2016-10-13 2018-04-13 Negri Bossi Spa TOUCH SCREEN PANEL WITH MULTI-AXIS HANDLING OF INJECTION PRESSES FOR PLASTIC MATERIALS
US10466893B2 (en) 2016-10-13 2019-11-05 Negri Bossi S.P.A. Touch screen panel with multi-axis manoeuvring of injection presses for plastic materials
US11077561B2 (en) 2017-02-24 2021-08-03 Abb Schweiz Ag Method and apparatus for selecting initial point for industrial robot commissioning
WO2018152779A1 (en) * 2017-02-24 2018-08-30 Abb Schweiz Ag Method and apparatus for selecting initial point for industrial robot commissioning
EP3590662A1 (en) * 2018-05-16 2020-01-08 Kabushiki Kaisha Yaskawa Denki Operation device, control system, control method, and program
CN110497382A (en) * 2018-05-16 2019-11-26 株式会社安川电机 Operate equipment, control system, control method and storage medium
US11426868B2 (en) 2018-05-16 2022-08-30 Kabushiki Kaisha Yaskawa Denki Operation device, control system, control method, and non-transitory computer-readable storage medium
CN110497382B (en) * 2018-05-16 2022-11-15 株式会社安川电机 Operation device, control system, control method, and storage medium
CN110293564A (en) * 2019-06-28 2019-10-01 北京猎户星空科技有限公司 A kind of Mechanical arm control method, equipment and system

Similar Documents

Publication Publication Date Title
WO2012062374A1 (en) A control system and an operating device for controlling an industrial robot comprising a touch -screen
US10166673B2 (en) Portable apparatus for controlling robot and method thereof
US11724388B2 (en) Robot controller and display device using augmented reality and mixed reality
JP6476662B2 (en) Robot operation device, robot system, and robot operation program
JP6497021B2 (en) Robot operation device, robot system, and robot operation program
KR101536106B1 (en) Method for operating an industrial robot
KR101706927B1 (en) Method for operating an industrial robot
JP6631279B2 (en) Robot operation device, robot operation program
EP2923806A1 (en) Robot control device, robot, robotic system, teaching method, and program
JP6690265B2 (en) Robot operating device, robot operating method
JPH10260776A (en) Contact type input equipment and position control method
KR20170024769A (en) Robot control apparatus
US9962835B2 (en) Device for dynamic switching of robot control points
CN108981567B (en) Method for operating a position measuring device
JP3675004B2 (en) Robot control device
CN111770815B (en) Object control method and object control device
JP6710919B2 (en) Robot operating device
JP6379902B2 (en) Robot operation device, robot system, and robot operation program
Araque et al. Augmented reality motion-based robotics off-line programming
KR20150044241A (en) Apparatus for teaching of robot pose Pendant Equipped Slide-out
JP4887109B2 (en) Information processing apparatus and display method thereof
EP3587049A1 (en) Control apparatus, robot, and robot system
JP6379921B2 (en) Robot operation device, robot system, and robot operation program
WO2023067659A1 (en) Control device
JP2005342891A (en) Hand held operation machine and robot control system for industrial robot

Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 10777005
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 10777005
Country of ref document: EP
Kind code of ref document: A1