WO2017032407A1 - An industrial robot system and a method for programming an industrial robot


Info

Publication number
WO2017032407A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
model
model structure
computer unit
point
Application number
PCT/EP2015/069424
Other languages
French (fr)
Inventor
Ivan Lundberg
Original Assignee
Abb Schweiz Ag
Application filed by Abb Schweiz Ag filed Critical Abb Schweiz Ag
Priority to PCT/EP2015/069424
Publication of WO2017032407A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems

Abstract

An industrial robot system (7) and a method for programming an industrial robot (1). The method includes creating a three dimensional, 3D, model structure (S) by selecting a 3D model structure type (2); using a point P of the robot (1) to define in the robot environment a first location (4) of a first structure (5); determining a relationship between the first location (4) and a robot coordinate system; and determining a geometric model (6) of the environment of the robot (1) including a representation of the first structure (5) based on the relationship and the 3D model structure (S).

Description

An industrial robot system and a method for programming an industrial robot
Technical field
The present disclosure relates to technology for industrial robots, and in particular methods for programming such industrial robots.
Background
Before an industrial robot is used for a working task, it is common to calibrate the robot to the workplace where the working task will take place. Traditionally this is done by relating an internal coordinate system of the robot to an object's coordinate system. Establishing this relationship requires a certain degree of expertise and demands a considerable amount of time. Further, to avoid collisions when performing the working task, the robot needs to learn its environment. The robot environment may be described by a geometric collision model. The geometric collision model may be made available by importing computer aided design (CAD) data or by using a three dimensional (3D) camera to scan the robot cell and its surroundings. While both these methods are theoretically possible, they have several challenges. For CAD data there is no guarantee that the data is correct and complete. The data also needs to be calibrated so that the robot is properly situated in the CAD environment. The data from a 3D camera scan is too dense to be used directly. It can also be incomplete and have "holes", e.g. due to line-of-sight issues, and it must be simplified algorithmically, which is typically an offline process that must be overseen by an engineer. Availability of CAD data is often limited, and 3D cameras are often an expensive add-on product.
From e.g. US5880956A it is known to have a pre-made workstation model and to manually lead the robot through a series of points to teach the robot a desired working path at the workstation. For each new workstation, or when something is changed at the present workstation, a new workstation model has to be created. Collision checking is made by using a collision program together with the workstation model data, robot model data and end effector model data. The above-described methods are difficult and tedious for the inexperienced user. Industrial robots are increasingly used, and there is a need to simplify the programming of industrial robots to make them more user friendly. From US2013/0231778A1 a user friendly method for programming a robot is known. A part P of the robot is used to define one or more geometric features relative to the surroundings of the robot, and the features are designated with names. The robot may thereafter be instructed to move in relation to these features.
Summary
It is an object of the disclosure to provide an enhanced, simplified method for programming a robot. It is a further object of the disclosure to provide a simplified method for teaching the robot its environment. It is a further object to reduce the costs and time for programming an industrial robot. These objects and others are at least partly achieved by the methods and the system according to the independent claims, and by the embodiments according to the dependent claims.
According to a first aspect, the disclosure relates to a method for programming an industrial robot. The method includes:
- creating a three dimensional, 3D, model structure S by selecting a 3D model structure type;
- using a point P of the robot to define in the robot environment a first location of a first structure;
- determining a relationship between the first location and a robot coordinate system;
- determining a geometric model of the robot environment including a representation of the first structure based on the relationship and the 3D model structure S.
The method provides the user with a very easy way of teaching the environment to the robot. The need for mathematical expertise is obviated, and the time and costs for programming the robot can be reduced. The geometric model may be represented with coordinates in the robot coordinate system, whereby the robot can relate its position directly to structures created in the geometric model.
Changes in the robot environment are easily added to the geometric model without any heavy and time-consuming calculations. The created geometric model may be a suitable model of the environment of the robot to be used e.g. for collision checking or path planning, and it avoids the complicated and sometimes slow procedure of using dense CAD or camera data of the robot environment. Position data originating from sensors of the robot itself can be used to draw up and define the geometric model, so external accessories such as cameras are not needed.
According to one embodiment, the method includes indicating the 3D model structure S. For example, the 3D model structure S may be indicated by being visualized on a display as a graphical element. The user will then know which 3D model structure type has been selected. Alternatively, the indication on the display may be textual.
According to one embodiment, the method includes determining a location of the 3D model structure S in the geometric model based on the first location. Thus, the geometric model may become a reproduction of the real robot environment.
According to one embodiment, the method includes receiving boundary data of the size of the first structure, and scaling the 3D model structure S in accordance with the size of the first structure, and including characteristics of the scaled 3D model structure S in the geometric model. According to one embodiment, the method includes determining boundary data of at least one boundary b of the first structure by pointing or moving the point P at or along the at least one boundary b of the first structure.
According to one embodiment, the method includes continuously visualizing a progress of the method on a display. A progress may here be e.g. an indication of the 3D model structure S, any step in the scaling of the 3D model structure S, a repositioning or a reorientation of the 3D model structure S.
According to one embodiment, the method includes defining by means of an orientation of the point P an allowed robot environment where the robot is allowed to be. An allowed robot environment is thus a space where the robot is allowed to perform work. The robot will thus not steer its end effector or tool to the space outside the allowed robot environment.
According to a second aspect, the disclosure relates to an industrial robot system including an industrial robot having an articulated arm with a point P. The industrial robot system further includes a computer unit with a display having a graphical user interface, GUI. The computer unit is configured to control the industrial robot and to display a representation of at least one three dimensional, 3D, model structure type. The computer unit is configured to receive a first input selecting a type of 3D model structure S from the at least one 3D model structure type. The computer unit is further configured to define in the robot environment a first location of a first structure defined by the point P, and to determine a relationship between the first location and a robot coordinate system. The computer unit is further configured to determine a geometric model of the robot environment including a representation of the first structure based on the relationship and the 3D model structure S.
The same advantages as for the method are accomplished also by the system. The geometric model of the robot environment can be saved in a robot control unit and used by the robot control unit for collision avoidance checking or collision-free path planning. The robot may then automatically perform collision avoidance checking or collision-free path planning, and the user is relieved of those tasks. According to one embodiment, the computer unit is configured to determine a location of the 3D model structure S in the geometric model based on the first location. According to one embodiment, the computer unit is configured to indicate the 3D model structure S on the display as a graphical 3D model structure. According to one embodiment, the computer unit is configured to be set in a scale mode, in which scale mode the computer unit is configured to receive boundary data of the size of the first structure, scale the 3D model structure S in accordance with the size of the first structure, and include characteristics of the scaled 3D model structure S in the geometric model. According to one embodiment, the computer unit is configured to receive boundary data of at least one boundary b of the first structure retrieved by pointing or moving the point P at or along the at least one boundary b of the first structure. To point the point P at the at least one boundary b may include touching the at least one boundary b with the point P. To move the point P along the at least one boundary b of the first structure may include touching the at least one boundary b with the point P along the whole length, or at least partly along the length, of the boundary b.
According to one embodiment, the computer unit is configured to continuously visualize a progress of the scaling in the scale mode on the display.
According to one embodiment, the computer unit is configured to define by means of an orientation of the point P an allowed robot environment where the robot is allowed to be.
Brief description of the drawings
Fig. 1 shows an industrial robot system according to one embodiment of the disclosure.
Fig. 2A shows a robot control unit according to one embodiment of the disclosure.
Fig. 2B shows a computer unit according to one embodiment of the disclosure.
Fig. 3 shows a flowchart of the method according to one embodiment of the disclosure.
Figs. 4A-4D illustrate how a point P of the robot is used to create a geometric model of the robot environment.
Fig. 5 illustrates how an allowed robot environment may be defined.
Detailed description
Fig. 1 shows an industrial robot system 7 including an industrial robot 1. The depicted industrial robot 1 may be referred to as a collaborative robot, as it is specially adapted to collaborate with humans. The industrial robot 1, hereafter referred to as the robot 1, has two articulated arms 8. Each articulated arm 8 has seven joints, each joint configured to be separately driven by a joint actuator, e.g. a motor. Each joint also has at least one built-in joint encoder (not shown). The joint encoders are arranged to measure the joint angles of the joints. A joint angle is the angle between an arm section of the robot 1 and the adjacent arm section at the joint in question. Each articulated arm 8 has an end effector 16 to which a tool (not shown) may be attached. The point where the tool should be attached is here marked out with a "P". Any point in the work space of the robot 1 may be pointed out by the "P" on any of the end effectors 16, and that point in the work space will be treated by the industrial robot system 7 as will be described in the following.
The robot 1 is further arranged on a platform 3, e.g. with bolts. On the platform 3 a cuboid physical structure 5 is located. The cuboid physical structure 5 will hereinafter be referred to as a "first structure". On the top middle of the first structure 5 a point 4 is denoted, indicating a "first location" of the first structure 5, which will be further explained in the following. The type of robot 1 in Fig. 1 is only shown for illustrative purposes, and it should be understood that other kinds of industrial robots may be used when implementing the invention. For example, a six degrees of freedom (DOF) robot with one articulated robot arm may be used. An industrial robot is here defined as a robot that can be automatically controlled, that is reprogrammable, that can adapt to a multitude of tasks and that has three or more axes.
The robot 1 may be controlled by means of a computer unit 10 and/or a robot control unit 9. In Fig. 1 the computer unit 10 and the robot control unit 9 are depicted as separate units, but the robot control unit 9, or the functions of the robot control unit 9, may instead be incorporated into the computer unit 10. The computer unit 10 is arranged with a display 11. As a separate unit, the robot control unit 9 may be a part of the robot 1. The robot control unit 9 is illustrated in greater detail in Fig. 2A and includes a processor unit 14A and a memory unit 15A. The processor unit 14A may include one or several central processing units (CPUs). The memory unit 15A may include one or several memory units. The memory unit 15A stores a kinematic model of the robot 1. The joint angles measured by the joint encoders can be translated into a pose of the robot 1 by means of the kinematic model. The pose describes the orientation and the position of the end effector 16, and thus of the point P, in an XYZ robot coordinate system. The robot 1 may be taught a certain path by manually leading the end effector 16 through a series of points. These points are then saved in the memory unit 15A as a series of poses. The robot 1 may then be programmed to move through these points by returning to these poses in a given order. The poses are again reached by means of the joint motors and joint encoders. The program the robot 1 should follow may be included in a robot control program C and saved in the memory unit 15A. When the robot control program C is run on the processor unit 14A, the robot 1 may perform actions as instructed by the robot control program C.
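The translation of joint encoder readings into a pose of the point P is a standard forward kinematics computation. Below is a minimal Python sketch of that idea for a simplified planar chain with made-up link lengths; the actual robot applies its full 3D kinematic model with seven joints per arm, so the names and numbers here are illustrative assumptions only.

```python
import math

# Hypothetical link lengths (metres) of a simplified planar arm; the real
# kinematic model of the robot would describe the full 3D geometry.
LINK_LENGTHS = [0.30, 0.25, 0.15]

def forward_kinematics(joint_angles):
    """Translate joint encoder readings (radians) into a pose of the point P.

    Returns (x, y, heading) for a planar chain: each joint angle is added to
    the accumulated orientation, and each link is walked along that heading.
    """
    x = y = 0.0
    heading = 0.0
    for angle, length in zip(joint_angles, LINK_LENGTHS):
        heading += angle
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y, heading

# Example: joint angles read from the encoders while the user leads the arm.
print(forward_kinematics([0.1, -0.4, 0.25]))
```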
The computer unit 10 includes a user interface, e.g. a graphical user interface (GUI), to which the user can make inputs. The computer unit 10 is shown in more detail in Fig. 2B and comprises a processor unit 14B and a memory unit 15B. The processor unit 14B may include one or several CPUs. The memory unit 15B may include one or several memory units. As previously explained, the computer unit 10 may include the functionality of the robot control unit 9. For example, the kinematic model of the robot 1 may be saved on the memory unit 15B, and the translation of joint encoder readings into poses of the robot 1 may be made by the computer unit 10. If the computer unit 10 and the robot control unit 9 are separate units, the computer unit 10 and the robot control unit 9 are configured to exchange data with each other. The robot control unit 9 and the computer unit 10 may be connected by wire (not shown), or may be arranged to exchange data by means of wireless communication, e.g. radio.
The industrial robot system 7 can be used to determine a geometric model of the robot environment. The robot environment may in this context be referred to as the working space of the robot 1. The geometric model may be used as a collision model to make sure the robot 1 avoids collisions with objects in the working space. The geometric model may also be used when programming the robot 1 to move from a certain point A to a point B in the working space. The robot 1 may then use the geometric model to find the fastest collision-free path between point A and point B. Via the computer unit 10, the industrial robot system 7 may be set in different modes. In a lead-through mode, the robot 1 may be manually led through a plurality of points pointed out by the point P of the robot 1, and these points will be remembered by the industrial robot system 7 as has been previously described. The industrial robot system 7 may further be set in a certain lead-through mode for drawing up and defining the robot environment, e.g. the working space or cell content. Via the user interface on the computer unit 10, e.g. the GUI, the user may in this mode interact with the robot 1 to create the geometric model.
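To illustrate how a primitive-based geometric model can serve as a collision model, the Python sketch below checks whether a candidate point lies inside any modelled structure, using hypothetical names and an axis-aligned simplification; a real planner would also check the full robot geometry and swept volumes along a path.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned cuboid in the robot coordinate system: min corner plus size."""
    x: float
    y: float
    z: float
    dx: float
    dy: float
    dz: float

    def contains(self, px, py, pz):
        return (self.x <= px <= self.x + self.dx
                and self.y <= py <= self.y + self.dy
                and self.z <= pz <= self.z + self.dz)

def in_collision(point, model):
    """True if the point lies inside any structure of the geometric model."""
    px, py, pz = point
    return any(box.contains(px, py, pz) for box in model)

# Hypothetical model: the platform and the first structure as two cuboids.
model = [Box(0.0, 0.0, 0.0, 1.0, 1.0, 0.1),   # platform
         Box(0.3, 0.3, 0.1, 0.2, 0.2, 0.2)]   # first structure on the platform
print(in_collision((0.4, 0.4, 0.2), model))   # True: inside the first structure
print(in_collision((0.8, 0.8, 0.5), model))   # False: free space
```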
The user may thus make an input to the computer unit 10 to reach the certain lead-through mode, here defined as a "Define geometric model mode", for drawing up and defining the environment of the robot 1. In response to such an input, the user may then move the end effector 16, and the gained position data of the point P is treated in a certain way by the computer unit 10. The computer unit 10 may be configured to display a plurality of different interaction options on the display 11. Depending on which interaction option the user selects, the gained position data of the point P may be treated in a certain way by the computer unit 10. The display 11 may be a touch sensitive display, and input to the computer unit 10 may then be made by touching a certain position of the display 11. Alternatively, input to the computer unit 10 may be made by means of an input device such as a computer mouse (not shown), keyboard (not shown), joystick (not shown) or any other input device.
In the following, methods for determining a geometric model of the environment of the robot 1 will be described with reference to the flowchart in Fig. 3 and the illustrative Figs. 4A-4D. One or several of the methods may partly be implemented as robot control programs C and saved in the memory unit 15B of the computer unit 10 (Fig. 2B). The methods may thus be implemented with the herein described hardware of the industrial robot system 7 (Fig. 1). It is to be understood that the industrial robot system 7 has initially been set in the "Define geometric model mode". With reference to the flowchart in Fig. 3 illustrating a general method, the general method includes creating a three dimensional, 3D, model structure S by selecting a 3D model structure type 2, see step A1. The selection may be made from at least one, e.g. a plurality of, 3D model structure types 2. In the right side of Fig. 4A, a graphical representation of a plurality of 3D model structure types 2 is arranged on the display 11 of the computer unit 10. The computer unit 10 is thus configured to display the representation, here graphical, of the plurality of 3D model structure types 2. The representation of the plurality of 3D model structure types 2 may be shown as a plurality of interaction options for the user. The 3D model structure types 2 are here a cuboid, a cylinder, a sphere and a cone, but should not be limited to these listed types. The representation may alternatively be textual. To the left in the display 11 is a show area 6 where a graphical representation of the progress of the determination of the geometric model will be drawn up. To the left in Fig. 4A, a part of a simplified end effector 16 of the robot 1 in Fig. 1 is shown, with the depicted point P. On the platform 3 the first structure 5 is located, which should now be defined in the geometric model. To define the first structure 5, the user makes a first input to the computer unit 10 and selects the cuboid 3D model structure type 2. The computer unit 10 may thus be configured to receive a first input selecting a 3D model structure type from the plurality of 3D model structure types 2. As a response to the selection, a 3D model structure S is created that may be directly visualized in the show area 6, in this case as a cuboid. To emphasize that the 3D model structure S is not yet positioned or scaled to a correct size, the lines of the 3D model structure S may be dashed, or the 3D model structure S may be transparent or shaded. The computer unit 10 may thus be configured to indicate the 3D model structure S on the display 11 as a graphical 3D model structure.
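Step A1 can be pictured in software as instantiating an unplaced, unscaled model structure S from a small catalogue of primitive types. The Python sketch below is only an illustration of that idea; its names, dictionary fields and defaults are assumptions, not a data structure prescribed by the disclosure.

```python
# The structure types offered as interaction options on the display.
STRUCTURE_TYPES = ("cuboid", "cylinder", "sphere", "cone")

def create_structure(kind):
    """Step A1: create an unplaced, unscaled 3D model structure S of the selected type."""
    if kind not in STRUCTURE_TYPES:
        raise ValueError(f"unknown 3D model structure type: {kind}")
    return {
        "kind": kind,
        "pose": None,          # set when the point P defines the first location
        "dimensions": None,    # set later, e.g. in the scale mode
        "placed": False,       # drawn with dashed lines until positioned and scaled
    }

s = create_structure("cuboid")   # visualized in the show area of the display
print(s)
```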
The point P is used to define in the robot environment a first location 4 of the first structure 5, see step A2 in Fig. 3. The first structure 5 may be of the same type as the 3D model structure S. The computer unit 10 is configured to define in the robot environment the first location 4 of the first structure 5 defined by the point P. The user takes the end effector 16 and places it on top of the first structure 5, such that the point P is close to or touches the first structure 5. The position of the point P is treated by the computer unit 10 as the first location 4 of the first structure 5. The 3D model structure S may also be oriented in accordance with the orientation of the point P. The user may make some kind of input to the computer unit 10 when the point P is correctly positioned in relation to the first structure 5, to indicate the correct position data to the computer unit 10. The method further includes determining a relationship between the first location 4 and a robot coordinate system of the robot 1, see step A3 in Fig. 3. The computer unit 10 is configured to determine the relationship between the first location 4 of the first structure 5 and a robot coordinate system of the robot 1. For example, the position of the point P may be expressed in the robot coordinate system in XYZ coordinates. It is here assumed that the robot 1 is located at the origin, thus the zero point, of the robot coordinate system. Position data of the point P of the end effector 16 may however include both coordinates in the robot coordinate system and an orientation of the point P. It is now known to the computer unit 10 where the first structure 5 is located in the environment of the robot 1. The method further includes determining a geometric model of the robot environment including a representation of the first structure 5 based on the relationship and the 3D model structure S, see step A4 in Fig. 3. The computer unit 10 is configured to determine the geometric model of the robot environment including the representation of the first structure 5 based on the relationship and the 3D model structure S. The appointed position and orientation of the point P now become the position and orientation of the 3D model structure S in the geometric model. If the 3D model structure S has the correct size, thus the same size as the first structure 5, the geometric model will include the 3D model structure S at the position and the orientation appointed by the point P. The 3D model structure S will thus be positioned and oriented with the same position and orientation as the first structure 5, but in the geometric model. The geometric model is visualized in the show area 6 of the display 11. When a new position and/or orientation of the point P has been appointed, the 3D model structure S is moved and/or oriented accordingly in the show area 6. The user may thus, during the whole process of creating the geometric model, visually make sure that the representation of the first structure 5 is correct and thus ensure that the geometric model becomes correct.
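Steps A2 to A4 then amount to stamping the selected structure into the geometric model at the pose reported for the point P, which is already expressed in the robot coordinate system. The Python sketch below continues the dictionary representation of the previous example but is self-contained; the pose format and function name are illustrative assumptions.

```python
def place_structure(structure, point_p_pose, model):
    """Steps A2-A4: anchor the 3D model structure S at the pose of the point P.

    point_p_pose is assumed to be (x, y, z, roll, pitch, yaw), already expressed
    in the robot coordinate system, so the relationship between the first
    location and the robot coordinate system is given directly by the pose.
    """
    structure["pose"] = point_p_pose   # first location 4 and orientation of the point P
    structure["placed"] = True         # may now be drawn with solid lines in the show area
    model.append(structure)            # the geometric model now represents the first structure
    return model

# Hypothetical pose of the point P while it rests on top of the first structure 5.
geometric_model = []
structure_s = {"kind": "cuboid", "pose": None, "dimensions": None, "placed": False}
place_structure(structure_s, (0.35, 0.40, 0.30, 0.0, 0.0, 0.0), geometric_model)
print(geometric_model)
```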
The user may make some kind of input to the computer unit 10, e.g. press an "OK" button, to indicate to the industrial robot system 7 that the 3D model structure S is positioned and oriented correctly in the geometric model. The geometric model will include a geometric representation of the robot environment where structures in the robot environment are represented. If there are no other structures in the environment of the robot 1, the geometric model is ready to be used by the robot 1.
If the size of the 3D model structure S is not in accordance with the first structure 5, the size may be scaled by using boundary data of the first structure 5. The method may then include receiving boundary data of the size of the first structure 5, scaling the 3D model structure S in accordance with the size of the first structure 5, and including characteristics of the scaled 3D model structure S in the geometric model. These characteristics may be the new size of the 3D model structure S, represented by XYZ coordinates in the robot coordinate system. The computer unit 10 may then be configured to be set in a scale mode, in which scale mode the computer unit 10 is configured to receive boundary data of the size of the first structure 5, scale the 3D model structure S in accordance with the size of the first structure 5, and include characteristics of the scaled 3D model structure S in the geometric model. When set in the scale mode, the computer unit 10 will recognize position data from the robot 1 as boundary data. For example, each side of a cuboid can be regarded as a boundary. The user may set the computer unit 10 in the scale mode by making an input to the computer unit 10, e.g. selecting a "Scale mode". In Fig. 4B an example of how boundary data may be retrieved is illustrated. The user touches the first structure 5 with the point P on the side that the user wants to scale down. The position and orientation of the point P are transferred as boundary data to the computer unit 10 and treated as a boundary of the 3D model structure S. The corresponding side of the 3D model structure S is scaled and oriented according to the boundary data, and the scaled 3D model structure S is illustrated in the show area 6 of the display 11. The scaled 3D model structure S is still illustrated with dashed lines. The user may now continue to scale the 3D model structure S, or stop the scaling, e.g. by pressing an "OK" button on the display 11.
According to another embodiment illustrated in Fig. 4C, the method includes determining boundary data of at least one boundary b of the first structure 5 by pointing or moving the point P at or along the at least one boundary b of the first structure 5. The point P of the end effector 16 is moved along an edge where two sides of the first structure 5 meet. The retrieved position data of the point P is treated as boundary data by the computer unit 10. The computer unit 10 is configured to receive the boundary data of at least one boundary b of the first structure 5, retrieved by pointing or moving the point P at or along the at least one boundary b of the first structure 5. This embodiment may be initiated by making an input to the computer unit 10, e.g. pressing an icon with a scissor 12 on the display 11. The point P on the end effector 16 will now act as a scissor on the 3D model structure S. When the point P is moved along the indicated edge, the 3D model structure S is cut as indicated in the show area 6 of the display 11.
Only the cut is here indicated by dashed lines, while the remaining lines of the 3D model structure S are filled. On the display 11 another icon, a trashcan 13, is shown. As illustrated in Fig. 4D, after the 3D model structure S has been cut, the part that is no longer wanted is thrown in the trashcan 13. The throw into the trashcan 13 can be accomplished e.g. by pointing at the part of the 3D model structure S on the display 11 that is not wanted, and dragging the unwanted part to the trashcan 13. When the unwanted part is dragged, the lines of the unwanted part may be dashed, or the unwanted part may be shaded. When the unwanted part is over the trashcan 13, the unwanted part is removed from the display 11 and from the geometric model. The remaining part of the 3D model structure S is redrawn on the display 11 as a new whole 3D model structure S. In this way the 3D model structure S may be scaled. The orientation of the end effector 16, and thus of the point P, may determine how the 3D model structure S is scaled. The user may be guided through the scaling of the 3D model structure S by one or several indications on the display 11. For example, an arrow may, as indicated in Fig. 4C, point out to the user which side of the 3D model structure S is next in turn to be scaled. Alternatively, the user may simply use the point P on the end effector 16 to cut the 3D model structure S in accordance with the orientation of the point P.
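For a cuboid, the scale mode can be pictured as moving the indicated face so that it passes through the plane touched by the point P; the part beyond the touched boundary corresponds to what is dragged to the trashcan. The Python sketch below assumes an axis-aligned box and an illustrative face encoding, and only spells out the x-direction.

```python
def scale_face_to_touch(box_dims, box_origin, face, touch_point):
    """Scale one face of an axis-aligned cuboid to a boundary touched by the point P.

    box_dims:    {"dx": ..., "dy": ..., "dz": ...} current size of the structure S
    box_origin:  (x, y, z) of the corner with minimal coordinates
    face:        which boundary was indicated, e.g. "+x" (the far face along x)
    touch_point: (x, y, z) of the point P when it touched the boundary
    """
    x0, y0, z0 = box_origin
    dims = dict(box_dims)
    if face == "+x":                       # move the far x-face to the touched plane
        dims["dx"] = max(touch_point[0] - x0, 0.0)
    elif face == "-x":                     # move the near x-face; the origin shifts too
        new_x0 = touch_point[0]
        dims["dx"] = max(x0 + dims["dx"] - new_x0, 0.0)
        x0 = new_x0
    # "+y", "-y", "+z" and "-z" would be handled analogously.
    return dims, (x0, y0, z0)

dims, origin = scale_face_to_touch({"dx": 0.5, "dy": 0.5, "dz": 0.5},
                                   (0.0, 0.0, 0.0), "+x", (0.2, 0.1, 0.1))
print(dims, origin)   # the cuboid is now 0.2 long along x
```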
As illustrated in the figures, the method may include continuously visualizing the progress of the method on the display 11. The computer unit 10 is then configured to continuously visualize the progress of the method, e.g. the scaling of the 3D model structure S in the scale mode, on the display 11.
Any real physical structure in the environment of the robot 1 may be represented in the geometric model as a 3D model structure S, scaled if needed. As soon as one first 3D model structure S has been defined in the geometric model, other new 3D model structures S may be defined in relation to this first 3D model structure S and to further 3D model structures S. For example, the platform 3 may first be defined and represented as a large cuboid structure (not shown) in the geometric model, and the smaller cuboid on the platform 3 may then be defined and represented as a smaller cuboid structure that can be directly limited by, and aligned with, the large cuboid structure in the geometric model. The geometric model may thus be a copy or representation of the real robot environment. The copy or representation may be a rough representation of the real robot environment, but it is sufficient for e.g. collision checking and path planning.
The method may also include defining, by means of the orientation of the point P, an allowed robot environment where the robot 1 is allowed to be. The computer unit 10 is thus configured to define the allowed robot environment by means of the orientation of the point P. The allowed robot environment may be defined as a space limited by a plane 17 defined by the point P, as illustrated in Fig. 5. The allowed environment is here the space above the plane 17, where the end effector 16 is located. The space below the plane 17, where the end effector 16 is not located, is a not allowed robot environment. Thus, by touching a surface of the first structure 5 with the correct orientation of the point P, the allowed robot environment may be defined. The robot 1 will thus know which side of a plane is allowed, as the robot 1 was on the allowed side with the point P during definition of the geometric model.
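As a final illustrative sketch, not part of the disclosure, the allowed robot environment of Fig. 5 can be thought of as a half-space: the touched position of the point P and its pointing direction define the plane 17, and the allowed side is the side where the end effector 16 stood when the plane was taught; the function names and values below are assumptions:

    # Illustrative sketch: the allowed environment as the half-space on the side
    # of the taught plane where the end effector was located during teaching.

    def teach_allowed_halfspace(p_position, p_direction):
        """p_direction points into the touched surface, so the allowed side is
        opposite to it; returns (point_on_plane, normal)."""
        normal = [-c for c in p_direction]
        return p_position, normal

    def is_allowed(halfspace, p):
        point_on_plane, normal = halfspace
        d = sum(normal[i] * (p[i] - point_on_plane[i]) for i in range(3))
        return d >= 0.0

    # The point P touches the surface at z = 0.1 m pointing straight down, so
    # the allowed robot environment is everything above the plane z = 0.1 m.
    halfspace = teach_allowed_halfspace([0.5, 0.3, 0.1], [0.0, 0.0, -1.0])
    print(is_allowed(halfspace, [0.5, 0.3, 0.4]))    # True: above the plane
    print(is_allowed(halfspace, [0.5, 0.3, -0.2]))   # False: below the plane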
The present invention is not limited to the above-described preferred embodiments. Various alternatives, modifications and equivalents may be used. Therefore, the above embodiments should not be taken as limiting the scope of the invention, which is defined by the appended claims.

Claims

1. A method for programming an industrial robot (1), the method including:
- creating a three dimensional, 3D, model structure (S) by selecting a 3D model structure type (2);
- using a point P of the robot (1) to define in the robot environment a first location (4) of a first structure (5);
- determining a relationship between the first location (4) and a robot coordinate system;
- determining a geometric model of the robot environment including a representation of the first structure (5) based on the relationship and the 3D model structure (S).
2. The method according to claim 1, including indicating the 3D model structure (S).
3. The method according to claim 1 or 2, including determining a location of the 3D model structure (S) in the geometric model based on the first location (4).
4. The method according to any of the preceding claims, including receiving boundary data of the size of the first structure (5), and scaling the 3D model structure (S) in accordance with the size of the first structure (5), and including characteristics of the scaled 3D model structure (S) in the geometric model (6).
5. The method according to claim 4, wherein the 3D model structure (S) is a graphical 3D model structure.
6. The method according to claim 4 or 5, including continuously visualizing the progress of the method on a display (11).
7. The method according to any of the claims 4 to 6, including determining boundary data of at least one boundary (b) of the first structure (5) by pointing or moving the point P at or along the at least one boundary (b) of the first structure (5).
8. The method according to any of the previous claims, including defining by means of an orientation of the point P an allowed robot environment where the robot (1) is allowed to be.
9. An industrial robot system (7) including
- an industrial robot (1) having an articulated arm (8) with a point P;
- a computer unit (10) with a display (11) having a graphical user interface, GUI, wherein the computer unit (10) is configured to control the industrial robot (1);
characterized in that
the computer unit (10) is configured to display a representation of at least one three dimensional, 3D, model structure type (2) and to receive a first input selecting a type of 3D model structure (S) from the at least one 3D model structure type (2); the computer unit (10) is further configured to define in the robot environment a first location (4) of a first structure (5) defined by the point P; the computer unit (10) is further configured to determine a relationship between the first location (4) and a robot coordinate system, and to determine a geometric model (6) of the robot environment including a representation of the first structure (5) based on the relationship and the 3D model structure (S).
10. The industrial robot system (7) according to claim 9, wherein the computer unit (10) is configured to determine a location of the 3D model structure (S) in the geometric model based on the first location (4).
11. The industrial robot system (7) according to claim 9 or 10, wherein the computer unit (10) is configured to indicate the 3D model structure (S) on the display (11) as a graphical 3D model structure.
12. The industrial robot system (7) according to claim 11, wherein the computer unit (10) is configured to be set in a scale mode, in which scale mode the computer unit (10) is configured to receive boundary data of the size of the first structure (5), and scale the 3D model structure (S) in accordance with the size of the first structure (5), and include characteristics of the scaled 3D model structure (S) in the geometric model (6).
13. The industrial robot system (7) according to claim 12, wherein the computer unit (10) is configured to continuously visualize the progress of the scaling in the scale mode on the display (11).
14. The industrial robot system (7) according to claim 12 or 13, wherein the computer unit (10) is configured to receive boundary data of at least one boundary (b) of the first structure (5) retrieved by pointing or moving the point P at or along the at least one boundary (b) of the first structure (5).
15. The industrial robot system (7) according to any of the claims 9 to 14, wherein the computer unit (10) is configured to define by means of an orientation of the point P an allowed robot environment where the robot (1) is allowed to be.
PCT/EP2015/069424 2015-08-25 2015-08-25 An industrial robot system and a method for programming an industrial robot WO2017032407A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2015/069424 WO2017032407A1 (en) 2015-08-25 2015-08-25 An industrial robot system and a method for programming an industrial robot

Publications (1)

Publication Number Publication Date
WO2017032407A1 true WO2017032407A1 (en) 2017-03-02

Family

ID=54007705

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/069424 WO2017032407A1 (en) 2015-08-25 2015-08-25 An industrial robot system and a method for programming an industrial robot

Country Status (1)

Country Link
WO (1) WO2017032407A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018144152A (en) * 2017-03-03 2018-09-20 株式会社キーエンス Robot simulation device, robot simulation method, robot simulation program, computer-readable recording medium and recording device
WO2023117115A1 (en) * 2021-12-23 2023-06-29 Abb Schweiz Ag Modeling a robot working environment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880956A (en) 1994-08-12 1999-03-09 Minnesota Mining And Manufacturing Company Lead-through robot programming system
EP1092513A2 (en) * 1999-10-12 2001-04-18 Fanuc Ltd Graphic display apparatus for robot system
US20130231778A1 (en) 2010-11-16 2013-09-05 Universal Robots Aps Method and Means for Controlling a Robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15754207

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15754207

Country of ref document: EP

Kind code of ref document: A1